Sunday, June 22, 2014

Learning basic MongoDB by installing and using CRUD

Today we are going to learn MongoDB: what it is, how to install it, and how to perform CRUD operations. We start with the basic question.

What is MongoDB?

MongoDB (from "humongous") is a cross-platform document-oriented database. Classified as a NoSQL database, MongoDB eschews the traditional table-based relational database structure in favor of JSON-like documents with dynamic schemas (MongoDB calls the format BSON), making the integration of data in certain types of applications easier and faster.
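Because the schema is dynamic, two documents in the same collection can carry different fields. The fragment below is an illustrative example, not taken from any real dataset:

```json
[
  { "_id": 1, "type": "book", "item": "journal", "qty": 25 },
  { "_id": 2, "type": "misc", "item": "card", "tags": ["gift", "paper"] }
]
```

Note the second document has a tags array the first one lacks; MongoDB accepts both without any schema migration.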

With that said, let's move on to installing MongoDB. There are many ways to install it, but for this article the one I have chosen is to install MongoDB using the deb packages built by MongoDB. Ubuntu does ship MongoDB, but the version in its repository is just too old: the Ubuntu repository currently carries version 1:2.4.9-1ubuntu2, while the official production release is 2.6.1.

The instructions below are from http://docs.mongodb.org/manual/tutorial/install-mongodb-on-ubuntu/ , summarized into a one-liner. It adds the official MongoDB repository and installs the latest version.
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 7F0CEB10 && echo 'deb http://downloads-distro.mongodb.org/repo/ubuntu-upstart dist 10gen' | sudo tee /etc/apt/sources.list.d/mongodb.list && sudo apt-get update && sudo apt-get install mongodb-org

If everything goes well, you should see installation output similar to the one below:
jason@localhost:~$ sudo apt-get install mongodb-org
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
jhead libcec2 libgdata-google1.2-1 libgdata1.2-1 libjdependency-java liblockdev1 libmaven-archiver-java libmaven-clean-plugin-java
libmaven-compiler-plugin-java libmaven-dependency-tree-java libmaven-filtering-java libmaven-install-plugin-java libmaven-jar-plugin-java
libmaven-resources-plugin-java libmaven-shade-plugin-java libphp-adodb libpigment0.3-11 libplexus-compiler-java libplexus-digest-java oxideqt-codecs-extra
php-auth-sasl php-cache php-date php-file php-http-request php-log php-mail php-mail-mime php-mdb2 php-mdb2-driver-mysql php-net-dime php-net-ftp
php-net-smtp php-net-socket php-net-url php-services-weather php-soap php-xml-parser php-xml-serializer printer-driver-c2esp printer-driver-min12xxw
printer-driver-pnm2ppa printer-driver-pxljr python-axiom python-coherence python-configobj python-epsilon python-gpod python-louie python-nevow python-pgm
python-pyasn1 python-storm python-tagpy python-twill python-twisted-conch python-twisted-web2 qtdeclarative5-window-plugin tinymce2 xbmc-pvr-argustv
xbmc-pvr-dvbviewer xbmc-pvr-mediaportal-tvserver xbmc-pvr-mythtv-cmyth xbmc-pvr-nextpvr xbmc-pvr-njoy xbmc-pvr-tvheadend-hts xbmc-pvr-vdr-vnsi
xbmc-pvr-vuplus xdg-user-dirs-gtk
Use 'apt-get autoremove' to remove them.
The following extra packages will be installed:
mongodb-org-mongos mongodb-org-server mongodb-org-shell mongodb-org-tools
The following NEW packages will be installed:
mongodb-org mongodb-org-mongos mongodb-org-server mongodb-org-shell mongodb-org-tools
0 upgraded, 5 newly installed, 0 to remove and 51 not upgraded.
Need to get 113 MB of archives.
After this operation, 284 MB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://downloads-distro.mongodb.org/repo/ubuntu-upstart/ dist/10gen mongodb-org-shell i386 2.6.1 [4,389 kB]
Get:2 http://downloads-distro.mongodb.org/repo/ubuntu-upstart/ dist/10gen mongodb-org-server i386 2.6.1 [9,308 kB]
Get:3 http://downloads-distro.mongodb.org/repo/ubuntu-upstart/ dist/10gen mongodb-org-mongos i386 2.6.1 [7,045 kB]
Get:4 http://downloads-distro.mongodb.org/repo/ubuntu-upstart/ dist/10gen mongodb-org-tools i386 2.6.1 [92.3 MB]
Get:5 http://downloads-distro.mongodb.org/repo/ubuntu-upstart/ dist/10gen mongodb-org i386 2.6.1 [3,652 B]
Fetched 113 MB in 3min 25s (549 kB/s)
Selecting previously unselected package mongodb-org-shell.
(Reading database ... 564794 files and directories currently installed.)
Preparing to unpack .../mongodb-org-shell_2.6.1_i386.deb ...
Unpacking mongodb-org-shell (2.6.1) ...
Selecting previously unselected package mongodb-org-server.
Preparing to unpack .../mongodb-org-server_2.6.1_i386.deb ...
Unpacking mongodb-org-server (2.6.1) ...
Selecting previously unselected package mongodb-org-mongos.
Preparing to unpack .../mongodb-org-mongos_2.6.1_i386.deb ...
Unpacking mongodb-org-mongos (2.6.1) ...
Selecting previously unselected package mongodb-org-tools.
Preparing to unpack .../mongodb-org-tools_2.6.1_i386.deb ...
Unpacking mongodb-org-tools (2.6.1) ...
Selecting previously unselected package mongodb-org.
Preparing to unpack .../mongodb-org_2.6.1_i386.deb ...
Unpacking mongodb-org (2.6.1) ...
Processing triggers for man-db (2.6.7.1-1) ...
Processing triggers for ureadahead (0.100.0-16) ...
Setting up mongodb-org-shell (2.6.1) ...
Setting up mongodb-org-server (2.6.1) ...
Adding system user `mongodb' (UID 143) ...
Adding new user `mongodb' (UID 143) with group `nogroup' ...
Not creating home directory `/home/mongodb'.
Adding group `mongodb' (GID 155) ...
Done.
Adding user `mongodb' to group `mongodb' ...
Adding user mongodb to group mongodb
Done.
mongod start/running, process 22386
Setting up mongodb-org-mongos (2.6.1) ...
Setting up mongodb-org-tools (2.6.1) ...
Processing triggers for ureadahead (0.100.0-16) ...
Setting up mongodb-org (2.6.1) ...

Looks like the installation completed fine, and the server has even been started already. So now let's play with the mongo command line.
jason@localhost:~$ mongo
MongoDB shell version: 2.6.1
connecting to: test
Welcome to the MongoDB shell.
For interactive help, type "help".
For more comprehensive documentation, see
http://docs.mongodb.org/
Questions? Try the support group
http://groups.google.com/group/mongodb-user
Server has startup warnings:
2014-06-02T22:29:43.933+0800 [initandlisten]
2014-06-02T22:29:43.933+0800 [initandlisten] ** NOTE: This is a 32 bit MongoDB binary.
2014-06-02T22:29:43.933+0800 [initandlisten] ** 32 bit builds are limited to less than 2GB of data (or less with --journal).
2014-06-02T22:29:43.933+0800 [initandlisten] ** Note that journaling defaults to off for 32 bit and is currently off.
2014-06-02T22:29:43.933+0800 [initandlisten] ** See http://dochub.mongodb.org/core/32bit
2014-06-02T22:29:43.934+0800 [initandlisten]
>

As you can see, I'm running a 32-bit CPU, but everything in the rest of this article should work the same on a 64-bit CPU. Everything has been smooth sailing so far, so we will now walk through the create, read, update and delete operations.

  • create

To create or insert a document, it is as easy as
db.inventory.insert( { _id: 10, type: "misc", item: "card", qty: 15 } )

Another way to insert is update with upsert : true, which inserts the document when no match is found:
db.inventory.update(
{ type: "book", item : "journal" },
{ $set : { qty: 10 } },
{ upsert : true }
)

Another interesting way to insert is save, which inserts a new document when no _id is given and otherwise updates the existing one:
db.inventory.save( { type: "book", item: "notebook", qty: 40 } )
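To make the upsert behaviour above concrete, here is a minimal in-memory sketch in Python (a hypothetical model of the semantics, not the MongoDB API or pymongo): an update with upsert inserts when nothing matches and modifies in place when something does.

```python
# In-memory sketch of MongoDB upsert semantics (hypothetical, not pymongo).
collection = []

def matches(doc, query):
    """True if every query field equals the corresponding document field."""
    return all(doc.get(k) == v for k, v in query.items())

def update(query, changes, upsert=False):
    """Apply $set-style changes to matching docs; insert when upsert and no match."""
    found = False
    for doc in collection:
        if matches(doc, query):
            doc.update(changes)
            found = True
    if not found and upsert:
        collection.append({**query, **changes})

update({"type": "book", "item": "journal"}, {"qty": 10}, upsert=True)
# No document matched, so the upsert inserted one.
update({"type": "book", "item": "journal"}, {"qty": 25}, upsert=True)
# Now the existing document is updated in place, not duplicated.
print(collection)  # [{'type': 'book', 'item': 'journal', 'qty': 25}]
```

Running the second update again would keep modifying the same single document, which is exactly why upsert is handy for idempotent writes.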

  • read

to read or query documents, it is as easy as
db.inventory.find( { type: "book" } )

to return every document in the collection, call find with no criteria:
db.inventory.find()

read more query examples in the MongoDB documentation at http://docs.mongodb.org/ .

  • update


see the upsert example under create above; an update with $set modifies fields of the matching documents.


  • delete

to remove all documents in a collection,
db.inventory.remove({})

to remove only the documents that match a condition, pass query criteria:
db.inventory.remove( { type: "misc" } )


That's it for this lengthy introduction.

Saturday, June 21, 2014

measure java object size using jamm

Often when you develop a java application, you want to measure how much heap an object occupies. There are many tools available, such as SizeOf.jar, but today we will take a look at jbellis/jamm. What is jamm? jamm provides MemoryMeter, a java agent to measure actual object memory use, including JVM overhead.

It is very easy to use: get the source and build the jar. Below is the output of building the jar.
jason@localhost:~/codes/jamm$ ant jar
Buildfile: /home/jason/codes/jamm/build.xml

ivy-download:
[echo] Downloading Ivy...
[mkdir] Created dir: /home/jason/codes/jamm/target
[get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
[get] To: /home/jason/codes/jamm/target/ivy-2.1.0.jar

ivy-init:
[mkdir] Created dir: /home/jason/codes/jamm/target/lib

ivy-retrieve-build:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: url = jar:file:/home/jason/codes/jamm/target/ivy-2.1.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: jamm#jamm;working@debby.e2e.serveftp.net
[ivy:retrieve] confs: [default]
[ivy:retrieve] found junit#junit;4.11 in public
[ivy:retrieve] found org.hamcrest#hamcrest-core;1.3 in public
[ivy:retrieve] downloading http://repo1.maven.org/maven2/junit/junit/4.11/junit-4.11-javadoc.jar ...
[ivy:retrieve] .................................................................................................................................................................................................................. (370kB)
[ivy:retrieve] .. (0kB)
[ivy:retrieve] [SUCCESSFUL ] junit#junit;4.11!junit.jar(javadoc) (2772ms)
[ivy:retrieve] downloading http://repo1.maven.org/maven2/junit/junit/4.11/junit-4.11.jar ...
[ivy:retrieve] ........................................................................................................................................... (239kB)
[ivy:retrieve] .. (0kB)
[ivy:retrieve] [SUCCESSFUL ] junit#junit;4.11!junit.jar (1725ms)
[ivy:retrieve] downloading http://repo1.maven.org/maven2/junit/junit/4.11/junit-4.11-sources.jar ...
[ivy:retrieve] ................................................................................................. (147kB)
[ivy:retrieve] .. (0kB)
[ivy:retrieve] [SUCCESSFUL ] junit#junit;4.11!junit.jar(source) (1403ms)
[ivy:retrieve] downloading http://repo1.maven.org/maven2/org/hamcrest/hamcrest-core/1.3/hamcrest-core-1.3.jar ...
[ivy:retrieve] ........................... (43kB)
[ivy:retrieve] .. (0kB)
[ivy:retrieve] [SUCCESSFUL ] org.hamcrest#hamcrest-core;1.3!hamcrest-core.jar (1363ms)
[ivy:retrieve] :: resolution report :: resolve 9107ms :: artifacts dl 7338ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 2 | 2 | 0 || 4 | 4 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: jamm#jamm [sync]
[ivy:retrieve] confs: [default]
[ivy:retrieve] 3 artifacts copied, 0 already retrieved (431kB/40ms)

init:
[mkdir] Created dir: /home/jason/codes/jamm/target/classes
[mkdir] Created dir: /home/jason/codes/jamm/target/test/classes

build:
[echo] jamm: /home/jason/codes/jamm/build.xml
[javac] Compiling 3 source files to /home/jason/codes/jamm/target/classes
[javac] Note: /home/jason/codes/jamm/src/org/github/jamm/AlwaysEmptySet.java uses unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

jar:
[jar] Building jar: /home/jason/codes/jamm/target/jamm-0.2.7-SNAPSHOT.jar

BUILD SUCCESSFUL
Total time: 26 seconds

The jar can be found in target/jamm-0.2.7-SNAPSHOT.jar . You can test the built jar using ant test. Below is the output.
jason@localhost:~/codes/jamm$ ant test
Buildfile: /home/jason/codes/jamm/build.xml

ivy-download:

ivy-init:

ivy-retrieve-build:
[ivy:retrieve] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:retrieve] :: loading settings :: url = jar:file:/home/jason/codes/jamm/target/ivy-2.1.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: jamm#jamm;working@debby.e2e.serveftp.net
[ivy:retrieve] confs: [default]
[ivy:retrieve] found junit#junit;4.11 in public
[ivy:retrieve] found org.hamcrest#hamcrest-core;1.3 in public
[ivy:retrieve] :: resolution report :: resolve 266ms :: artifacts dl 23ms
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 0 | 0 | 0 || 4 | 0 |
---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: jamm#jamm [sync]
[ivy:retrieve] confs: [default]
[ivy:retrieve] 0 artifacts copied, 3 already retrieved (0kB/18ms)

init:

build:
[echo] jamm: /home/jason/codes/jamm/build.xml

jar:

build-test:
[javac] Compiling 2 source files to /home/jason/codes/jamm/target/test/classes
[javac] Note: /home/jason/codes/jamm/test/org/github/jamm/MemoryMeterTest.java uses or overrides a deprecated API.
[javac] Note: Recompile with -Xlint:deprecation for details.
[javac] Note: /home/jason/codes/jamm/test/org/github/jamm/MemoryMeterTest.java uses unchecked or unsafe operations.
[javac] Note: Recompile with -Xlint:unchecked for details.

checkos:

test-mac:

test:
[echo] running tests
[mkdir] Created dir: /home/jason/codes/jamm/target/test/output
[echo] Testing with default Java
[junit] Testsuite: org.github.jamm.GuessTest
[junit] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.869 sec
[junit]
[junit] Testsuite: org.github.jamm.MemoryMeterTest
[junit] Tests run: 13, Failures: 0, Errors: 0, Skipped: 3, Time elapsed: 0.813 sec
[junit]
[junit] Testcase: testMacOSX_i386(org.github.jamm.MemoryMeterTest):SKIPPED: got: "Linux", expected: is "Mac OS X"
[junit] Testcase: testMacOSX_x86_64(org.github.jamm.MemoryMeterTest):SKIPPED: got: "Linux", expected: is "Mac OS X"
[junit] Testcase: testCollections(org.github.jamm.MemoryMeterTest):SKIPPED: These vary quite radically depending on the JVM.

BUILD SUCCESSFUL
Total time: 14 seconds

Very easy, and it works out of the box. To start using it, just import the class and measure the object you are interested in:
import org.github.jamm.MemoryMeter;

MemoryMeter meter = new MemoryMeter();
long bytes = meter.measure(object);

but remember that MemoryMeter relies on the instrumentation agent, so add "-javaagent:<path to>/jamm.jar" to your JVM arguments (not the classpath) when you run your app.

Friday, June 20, 2014

Test subversion project using jenkins

In our last article, we learned the basics of jenkins. If you do not know what jenkins is or how to install it, please read that article before continuing with this one. Today, we will learn by configuring a project to be tested by jenkins.

This article focuses on subversion. If you store your code in git, jenkins supports it too, but you need to install the git plugin for jenkins. The installation process is only a few clicks.

To add the project to jenkins, point your browser to the jenkins server, then click on 'New Item'. This article continues with the first option, 'Build a free-style software project'. As for the Item name, you can simply name it after your project, but if you have a specific scope of tests, you can also name it as such. Then click next and the browser is redirected to another page similar to the one below.

[screenshot: the jenkins project configuration page]

I will explain using the screenshot above. For obvious reasons, I have obfuscated certain parts of the image to protect the interested parties, but you should get the idea easily. In the Description field, you can fill in additional information. The first four options you can play around with, but for this simple project I don't see the need for them. The Advanced Project Options field you will most likely start to use once you get a better understanding of jenkins, so we leave those unticked as well.

Next is the Source Code Management field, where you need to select your code repository. As mentioned earlier, we will select the radio button for Subversion. The Repository URL field must be filled in, as it tells jenkins where to get your code. Most likely your code is access protected, so you should also provide credentials for jenkins to retrieve the project codebase. For the Check-out Strategy field, you can choose the strategy you like; I just go for Use 'svn update' as much as possible, because there is no point in checking everything out every time the project is built.

Next, specify how you want to trigger this project within jenkins by ticking one or more of the build-trigger options. I like to trigger it manually when I want to quickly test my project, but I have also set up a periodic build so that every Friday evening at 11pm a build is kickstarted automatically.

Normally the build target is test; this should be easy to understand if you have developed with ant before. This is the ant target that jenkins will execute, so open up your project's ant build file and check out the test target. I recommend clicking the Advanced... button to see additional configuration you might need to change. If your ant build file is in the same directory as you configured in Repository URL just now, then you will not need to modify anything. If you have special configuration which you need to feed into the ant build file during the jenkins build, specify it in the properties field.
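As a sketch, a minimal ant test target that jenkins could execute might look like the following (the target name, classpath reference and property names here are illustrative assumptions; adapt them to your own build.xml):

```xml
<target name="test" depends="compile">
    <!-- run the junit tests; the xml formatter output is what jenkins can parse -->
    <junit printsummary="yes" haltonfailure="no" fork="yes">
        <classpath refid="test.classpath"/>
        <formatter type="xml"/>
        <batchtest todir="${reports.dir}">
            <fileset dir="${test.src.dir}" includes="**/*Test.java"/>
        </batchtest>
    </junit>
</target>
```

Writing the results with the xml formatter also lets you add a "Publish JUnit test result report" post-build action later, so jenkins can chart test trends.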

The last step is Post-build Actions. Click on the drop-down button Add post-build action; you can add as many actions as you want, but as a starter a simple email notification would suffice. That's it, and remember to click the Save button to save all your configuration!

Go to the dashboard and you should now see your project configured. In the content page, click on the drop-down button for your project and select Build now; jenkins will check out your project and execute the test target. If you click on the project, you should be able to see the build history in the left menu. This should get you started, and by now you should have a feel for where to go further: in the left menu, click on Configure, alter the advanced configuration and see how it goes!

That's it for this article, I hope you like it.

Sunday, June 8, 2014

Initial study into apache hadoop single node cluster

If you have read any big data articles, you will definitely have encountered the term hadoop. Today, we are going to learn Apache Hadoop.

What is Apache Hadoop?

The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing.

The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high-availability, the library itself is designed to detect and handle failures at the application layer, so delivering a highly-available service on top of a cluster of computers, each of which may be prone to failures.

There are further links that explain what hadoop is and its components.

I must admit that quickly setting up and running a single node cluster is difficult, mainly because this is my first time learning hadoop and the official documentation is not for starters. So I googled around and found a few helpful links. The following setup is mainly for starters to get a feel for how it works; sort of like a hello world example of hadoop. As such, the goal is to keep things as simple as possible.

The setup is a single node cluster. It works with the current linux (debian) user environment, and we can easily remove the changes we've made after this tutorial. Note that the example below uses my own username (jason), but it should work with your own user ($HOME) environment too. User security is not a concern here, as the objective is to learn the basics of hadoop. A few system setup steps are needed, so we start by preparing the environment for hadoop.

Because hadoop is a java library, an installed JRE is required. This article assumes you have java installed and running; you can check as below. If you do not have java, google how to install a JRE.
jason@localhost:~$ java -version
java version "1.7.0_55"
Java(TM) SE Runtime Environment (build 1.7.0_55-b13)
Java HotSpot(TM) 64-Bit Server VM (build 24.55-b03, mixed mode)

An ssh daemon is required on your workstation. It is also recommended that openssh-client is installed, as we will generate a public and private key pair for automatic ssh login. Thus: apt-get install openssh-server openssh-client

Once both packages are installed, make sure the sshd daemon is running, then generate the public and private key pair.
ssh-keygen -t rsa -P '' -f id_rsa_hadoop

with the above command, we specified the key type as rsa with an empty passphrase, so ssh will not prompt for a passphrase, and the key filename as id_rsa_hadoop. It's okay if you do not specify the key filename, but because I have a few key files, this makes it easy to identify and remove the key later when this tutorial is done. The keys should be available in your current user's .ssh directory. To ensure ssh to localhost is automatic, append your public key to the authorized_keys file as a valid authorized key.
jason@localhost:~$ ls .ssh/
authorized_keys id_rsa id_rsa_hadoop id_rsa_hadoop.pub id_rsa.pub known_hosts

$ cat $HOME/.ssh/id_rsa_hadoop.pub >> $HOME/.ssh/authorized_keys

Right now, if you ssh to localhost, you should be logged in without ssh asking for a password in the terminal. That's it for the localhost setup. We will move on to the hadoop configuration.

Download a copy of hadoop. For this example, we are using hadoop version 2.4.0, which you can download from the Apache Hadoop site. Then extract it in the Desktop directory.
jason@localhost:~/Desktop$ tar -zxf hadoop-2.4.0.tar.gz
jason@localhost:~/Desktop$ cd hadoop-2.4.0
jason@localhost:~/Desktop/hadoop-2.4.0$ ls
bin etc include lib libexec LICENSE.txt logs NOTICE.txt README.txt sbin share

Then we will create directories for the namenode and datanode.
jason@localhost:~/Desktop/hadoop-2.4.0$ pwd
/home/jason/Desktop/hadoop-2.4.0
jason@localhost:~/Desktop/hadoop-2.4.0$ mkdir -p hadoop_store/hdfs/namenode hadoop_store/hdfs/datanode

Then there are a few environment variables that need to be set up. Assuming you are using bash, enter the following into your .bashrc
#HADOOP VARIABLES START
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_55
export HADOOP_INSTALL=/home/jason/Desktop/hadoop-2.4.0
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
#HADOOP VARIABLES END

The only variables you need to pay attention to are JAVA_HOME and HADOOP_INSTALL. Once this is done, source this file immediately in your terminal, as you will use the commands next.
jason@localhost:~/Desktop/hadoop-2.4.0$ source $HOME/.bashrc

We will now configure five configuration files for hadoop, namely

  1. etc/hadoop/hadoop-env.sh

  2. etc/hadoop/core-site.xml

  3. etc/hadoop/hdfs-site.xml

  4. etc/hadoop/yarn-site.xml

  5. etc/hadoop/mapred-site.xml


It is assumed you are still in the current working directory, as below, so you can easily edit the above files.
$ pwd
/home/jason/Desktop/hadoop-2.4.0

add the following content into etc/hadoop/hadoop-env.sh
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_55

add the following contents into etc/hadoop/core-site.xml, inside the <configuration> element
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>

add the following contents into etc/hadoop/hdfs-site.xml, inside the <configuration> element
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/home/jason/Desktop/hadoop-2.4.0/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/home/jason/Desktop/hadoop-2.4.0/hadoop_store/hdfs/datanode</value>
</property>

add the following into etc/hadoop/yarn-site.xml, inside the <configuration> element
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<property>
<name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
<value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>

for the file etc/hadoop/mapred-site.xml, you can start by copying from etc/hadoop/mapred-site.xml.template
jason@localhost:~/Desktop/hadoop-2.4.0$ cp etc/hadoop/mapred-site.xml.template etc/hadoop/mapred-site.xml

then add the following into etc/hadoop/mapred-site.xml, inside the <configuration> element
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>

Once that is done, that's it for the hadoop configuration. Now run the command hdfs namenode -format . Below is the output in my terminal.
jason@localhost:~/Desktop/hadoop-2.4.0$ hdfs namenode -format
14/05/30 16:00:55 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG: host = localhost/127.0.1.1
STARTUP_MSG: args = [-format]
STARTUP_MSG: version = 2.4.0
STARTUP_MSG: classpath = /home/jason/Desktop/hadoop-2.4.0/etc/hadoop:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/log4j-1.2.17.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/paranamer-2.3.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/avro-1.7.4.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jackson-xc-1.8.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jersey-core-1.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jersey-server-1.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/hadoop-annotations-2.4.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jackson-mapper-asl-1.8.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/httpcore-4.2.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jetty-6.1.26.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jackson-jaxrs-1.8.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/asm-3.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-cli-1.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/zookeeper-3.4.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/hadoop-auth-2.4.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-el-1.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/com
mon/lib/commons-math3-3.1.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jsp-api-2.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-lang-2.6.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jsch-0.1.42.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jettison-1.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/servlet-api-2.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/activation-1.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/xmlenc-0.52.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/guava-11.0.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-codec-1.4.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-digester-1.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/xz-1.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-net-3.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/jersey-json-1.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/jason/Desktop
/hadoop-2.4.0/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-io-2.4.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/junit-4.8.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/hadoop-common-2.4.0-tests.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/hadoop-nfs-2.4.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/common/hadoop-common-2.4.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jackson-mapper-asl-1.8.8.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/asm-3.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/jason/Desktop/hadoop-2.4.0/share/hadoop/hdfs/lib/
... (long classpath listing trimmed)
STARTUP_MSG: build = http://svn.apache.org/repos/asf/hadoop/common -r 1583262; compiled by 'jenkins' on 2014-03-31T08:29Z
STARTUP_MSG: java = 1.7.0_55
************************************************************/
14/05/30 16:00:55 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
14/05/30 16:00:55 INFO namenode.NameNode: createNameNode [-format]
14/05/30 16:00:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Formatting using clusterid: CID-a15244a5-fea6-42ad-ab38-92b9730521f5
14/05/30 16:00:58 INFO namenode.FSNamesystem: fsLock is fair:true
14/05/30 16:00:58 INFO namenode.HostFileManager: read includes:
HostSet(
)
14/05/30 16:00:58 INFO namenode.HostFileManager: read excludes:
HostSet(
)
14/05/30 16:00:58 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
14/05/30 16:00:58 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
14/05/30 16:00:58 INFO util.GSet: Computing capacity for map BlocksMap
14/05/30 16:00:58 INFO util.GSet: VM type = 64-bit
14/05/30 16:00:58 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
14/05/30 16:00:58 INFO util.GSet: capacity = 2^21 = 2097152 entries
14/05/30 16:00:58 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
14/05/30 16:00:58 INFO blockmanagement.BlockManager: defaultReplication = 1
14/05/30 16:00:58 INFO blockmanagement.BlockManager: maxReplication = 512
14/05/30 16:00:58 INFO blockmanagement.BlockManager: minReplication = 1
14/05/30 16:00:58 INFO blockmanagement.BlockManager: maxReplicationStreams = 2
14/05/30 16:00:58 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks = false
14/05/30 16:00:58 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
14/05/30 16:00:58 INFO blockmanagement.BlockManager: encryptDataTransfer = false
14/05/30 16:00:58 INFO blockmanagement.BlockManager: maxNumBlocksToLog = 1000
14/05/30 16:00:58 INFO namenode.FSNamesystem: fsOwner = jason (auth:SIMPLE)
14/05/30 16:00:58 INFO namenode.FSNamesystem: supergroup = supergroup
14/05/30 16:00:58 INFO namenode.FSNamesystem: isPermissionEnabled = true
14/05/30 16:00:58 INFO namenode.FSNamesystem: HA Enabled: false
14/05/30 16:00:58 INFO namenode.FSNamesystem: Append Enabled: true
14/05/30 16:00:59 INFO util.GSet: Computing capacity for map INodeMap
14/05/30 16:00:59 INFO util.GSet: VM type = 64-bit
14/05/30 16:00:59 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
14/05/30 16:00:59 INFO util.GSet: capacity = 2^20 = 1048576 entries
14/05/30 16:00:59 INFO namenode.NameNode: Caching file names occuring more than 10 times
14/05/30 16:00:59 INFO util.GSet: Computing capacity for map cachedBlocks
14/05/30 16:00:59 INFO util.GSet: VM type = 64-bit
14/05/30 16:00:59 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
14/05/30 16:00:59 INFO util.GSet: capacity = 2^18 = 262144 entries
14/05/30 16:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
14/05/30 16:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
14/05/30 16:00:59 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension = 30000
14/05/30 16:00:59 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
14/05/30 16:00:59 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
14/05/30 16:00:59 INFO util.GSet: Computing capacity for map NameNodeRetryCache
14/05/30 16:00:59 INFO util.GSet: VM type = 64-bit
14/05/30 16:00:59 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
14/05/30 16:00:59 INFO util.GSet: capacity = 2^15 = 32768 entries
14/05/30 16:00:59 INFO namenode.AclConfigFlag: ACLs enabled? false
14/05/30 16:01:00 INFO namenode.FSImage: Allocated new BlockPoolId: BP-908722954-127.0.1.1-1401436859922
14/05/30 16:01:00 INFO common.Storage: Storage directory /home/jason/Desktop/hadoop-2.4.0/hadoop_store/hdfs/namenode has been successfully formatted.
14/05/30 16:01:01 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
14/05/30 16:01:01 INFO util.ExitUtil: Exiting with status 0
14/05/30 16:01:01 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at localhost/127.0.1.1
************************************************************/

There should be no errors in this output. All good, so now start the engine:
jason@localhost:~/Desktop/hadoop-2.4.0$ start-dfs.sh && start-yarn.sh
14/05/30 16:04:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /home/jason/Desktop/hadoop-2.4.0/logs/hadoop-jason-namenode-localhost.out
localhost: starting datanode, logging to /home/jason/Desktop/hadoop-2.4.0/logs/hadoop-jason-datanode-localhost.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /home/jason/Desktop/hadoop-2.4.0/logs/hadoop-jason-secondarynamenode-localhost.out
14/05/30 16:05:09 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
starting yarn daemons
starting resourcemanager, logging to /home/jason/Desktop/hadoop-2.4.0/logs/yarn-jason-resourcemanager-localhost.out
localhost: starting nodemanager, logging to /home/jason/Desktop/hadoop-2.4.0/logs/yarn-jason-nodemanager-localhost.out
jason@localhost:~/Desktop/hadoop-2.4.0$

You can check with jps whether Hadoop is running. The expected Hadoop processes are ResourceManager, SecondaryNameNode, NameNode, NodeManager and DataNode.
jason@localhost:~$ jps
22701 ResourceManager
22512 SecondaryNameNode
22210 NameNode
22800 NodeManager
6728 org.eclipse.equinox.launcher_1.3.0.v20120522-1813.jar
22840 Jps
22326 DataNode
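If you want to script this check rather than eyeball the jps listing, the idea can be sketched with a small helper. The check_daemons function below is my own illustrative sketch, not part of Hadoop; it scans jps-style output for the five expected process names:

```shell
# check_daemons: given jps output in $1, report any missing Hadoop daemons.
# Illustrative helper, not part of Hadoop itself.
check_daemons() {
  local missing=0
  local d
  for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
    # -w matches the daemon name as a whole word in the jps listing
    if ! printf '%s\n' "$1" | grep -qw "$d"; then
      echo "missing: $d"
      missing=1
    fi
  done
  return $missing
}

# usage on a live cluster:
# check_daemons "$(jps)" && echo "all daemons up"
```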

You can access Apache Hadoop via its web interfaces:

Cluster status: http://localhost:8088
HDFS status: http://localhost:50070
Secondary NameNode status: http://localhost:50090
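If you prefer to probe these endpoints from a script instead of a browser, a small curl loop works. The check_url helper below is my own sketch; it assumes curl is installed and will report DOWN for any interface whose daemon is not running:

```shell
# check_url: print OK/DOWN depending on whether the URL answers over HTTP.
# Illustrative helper, not part of Hadoop.
check_url() {
  # -f: treat HTTP errors as failure, -s: silent, --max-time: give up after 5s
  if curl -fs -o /dev/null --max-time 5 "$1"; then
    echo "OK   $1"
  else
    echo "DOWN $1"
  fi
}

for url in http://localhost:8088 http://localhost:50070 http://localhost:50090; do
  check_url "$url"
done
```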

That looks good: everything is configured and running fine. We will continue by running a few examples, starting with the TestDFSIO write benchmark.
jason@localhost:~/Desktop/hadoop-2.4.0$ hadoop jar /home/jason/Desktop/hadoop-2.4.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.4.0-tests.jar TestDFSIO -write -nrFiles 20 -fileSize 10
14/05/30 16:10:54 INFO fs.TestDFSIO: TestDFSIO.1.7
14/05/30 16:10:54 INFO fs.TestDFSIO: nrFiles = 20
14/05/30 16:10:54 INFO fs.TestDFSIO: nrBytes (MB) = 10.0
14/05/30 16:10:54 INFO fs.TestDFSIO: bufferSize = 1000000
14/05/30 16:10:54 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
14/05/30 16:10:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/30 16:10:57 INFO fs.TestDFSIO: creating control file: 10485760 bytes, 20 files
14/05/30 16:11:01 INFO fs.TestDFSIO: created control files for: 20 files
14/05/30 16:11:01 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/05/30 16:11:01 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/05/30 16:11:04 INFO mapred.FileInputFormat: Total input paths to process : 20
14/05/30 16:11:04 INFO mapreduce.JobSubmitter: number of splits:20
14/05/30 16:11:05 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1401437120030_0001
14/05/30 16:11:06 INFO impl.YarnClientImpl: Submitted application application_1401437120030_0001
14/05/30 16:11:06 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1401437120030_0001/
14/05/30 16:11:06 INFO mapreduce.Job: Running job: job_1401437120030_0001
14/05/30 16:11:28 INFO mapreduce.Job: Job job_1401437120030_0001 running in uber mode : false
14/05/30 16:11:28 INFO mapreduce.Job: map 0% reduce 0%
14/05/30 16:12:30 INFO mapreduce.Job: map 7% reduce 0%
14/05/30 16:12:31 INFO mapreduce.Job: map 17% reduce 0%
14/05/30 16:12:34 INFO mapreduce.Job: map 23% reduce 0%
14/05/30 16:12:36 INFO mapreduce.Job: map 28% reduce 0%
14/05/30 16:12:37 INFO mapreduce.Job: map 30% reduce 0%
14/05/30 16:13:36 INFO mapreduce.Job: map 33% reduce 0%
14/05/30 16:13:39 INFO mapreduce.Job: map 40% reduce 0%
14/05/30 16:13:40 INFO mapreduce.Job: map 42% reduce 0%
14/05/30 16:13:42 INFO mapreduce.Job: map 52% reduce 0%
14/05/30 16:13:43 INFO mapreduce.Job: map 55% reduce 0%
14/05/30 16:13:44 INFO mapreduce.Job: map 58% reduce 0%
14/05/30 16:13:45 INFO mapreduce.Job: map 60% reduce 0%
14/05/30 16:14:47 INFO mapreduce.Job: map 67% reduce 2%
14/05/30 16:14:50 INFO mapreduce.Job: map 75% reduce 2%
14/05/30 16:14:51 INFO mapreduce.Job: map 78% reduce 22%
14/05/30 16:14:53 INFO mapreduce.Job: map 82% reduce 22%
14/05/30 16:14:54 INFO mapreduce.Job: map 85% reduce 22%
14/05/30 16:14:55 INFO mapreduce.Job: map 85% reduce 28%
14/05/30 16:15:37 INFO mapreduce.Job: map 88% reduce 28%
14/05/30 16:15:40 INFO mapreduce.Job: map 93% reduce 28%
14/05/30 16:15:42 INFO mapreduce.Job: map 95% reduce 32%
14/05/30 16:15:44 INFO mapreduce.Job: map 100% reduce 32%
14/05/30 16:15:45 INFO mapreduce.Job: map 100% reduce 67%
14/05/30 16:15:47 INFO mapreduce.Job: map 100% reduce 100%
14/05/30 16:15:49 INFO mapreduce.Job: Job job_1401437120030_0001 completed successfully
14/05/30 16:15:50 INFO mapreduce.Job: Counters: 50
File System Counters
FILE: Number of bytes read=1673
FILE: Number of bytes written=1965945
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=4720
HDFS: Number of bytes written=209715278
HDFS: Number of read operations=83
HDFS: Number of large read operations=0
HDFS: Number of write operations=22
Job Counters
Killed map tasks=3
Launched map tasks=23
Launched reduce tasks=1
Data-local map tasks=23
Total time spent by all maps in occupied slots (ms)=1319128
Total time spent by all reduces in occupied slots (ms)=124593
Total time spent by all map tasks (ms)=1319128
Total time spent by all reduce tasks (ms)=124593
Total vcore-seconds taken by all map tasks=1319128
Total vcore-seconds taken by all reduce tasks=124593
Total megabyte-seconds taken by all map tasks=1350787072
Total megabyte-seconds taken by all reduce tasks=127583232
Map-Reduce Framework
Map input records=20
Map output records=100
Map output bytes=1467
Map output materialized bytes=1787
Input split bytes=2470
Combine input records=0
Combine output records=0
Reduce input groups=5
Reduce shuffle bytes=1787
Reduce input records=100
Reduce output records=5
Spilled Records=200
Shuffled Maps =20
Failed Shuffles=0
Merged Map outputs=20
GC time elapsed (ms)=14063
CPU time spent (ms)=127640
Physical memory (bytes) snapshot=5418561536
Virtual memory (bytes) snapshot=14516457472
Total committed heap usage (bytes)=4196401152
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=2250
File Output Format Counters
Bytes Written=78
14/05/30 16:15:50 INFO fs.TestDFSIO: ----- TestDFSIO ----- : write
14/05/30 16:15:50 INFO fs.TestDFSIO: Date & time: Fri May 30 16:15:50 MYT 2014
14/05/30 16:15:50 INFO fs.TestDFSIO: Number of files: 20
14/05/30 16:15:50 INFO fs.TestDFSIO: Total MBytes processed: 200.0
14/05/30 16:15:50 INFO fs.TestDFSIO: Throughput mb/sec: 1.6888468553671554
14/05/30 16:15:50 INFO fs.TestDFSIO: Average IO rate mb/sec: 1.840719223022461
14/05/30 16:15:50 INFO fs.TestDFSIO: IO rate std deviation: 0.7043729046488437
14/05/30 16:15:50 INFO fs.TestDFSIO: Test exec time sec: 289.58
14/05/30 16:15:50 INFO fs.TestDFSIO:
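As a rough sanity check on the summary above: assuming TestDFSIO computes aggregate throughput as total megabytes divided by the summed per-file write time (my reading of the benchmark, not something stated in the log), the 200 MB at about 1.69 MB/s implies roughly 118 seconds of combined write time across the 20 files:

```shell
# Back out the combined write time implied by the TestDFSIO summary.
awk 'BEGIN {
  mb   = 200.0                  # Total MBytes processed
  tput = 1.6888468553671554     # Throughput mb/sec
  printf "%.1f\n", mb / tput    # implied combined write time in seconds
}'
```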

Clean up the benchmark files:
jason@localhost:~/Desktop/hadoop-2.4.0$ hadoop jar /home/jason/Desktop/hadoop-2.4.0/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.4.0-tests.jar TestDFSIO -clean
14/05/30 16:20:03 INFO fs.TestDFSIO: TestDFSIO.1.7
14/05/30 16:20:03 INFO fs.TestDFSIO: nrFiles = 1
14/05/30 16:20:03 INFO fs.TestDFSIO: nrBytes (MB) = 1.0
14/05/30 16:20:03 INFO fs.TestDFSIO: bufferSize = 1000000
14/05/30 16:20:03 INFO fs.TestDFSIO: baseDir = /benchmarks/TestDFSIO
14/05/30 16:20:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/30 16:20:06 INFO fs.TestDFSIO: Cleaning up test files

Another example job, this time estimating pi:
jason@localhost:~/Desktop/hadoop-2.4.0$ hadoop jar /home/jason/Desktop/hadoop-2.4.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.0.jar pi 2 5
Number of Maps = 2
Samples per Map = 5
14/05/30 16:21:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Wrote input for Map #0
Wrote input for Map #1
Starting Job
14/05/30 16:21:23 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/05/30 16:21:25 INFO input.FileInputFormat: Total input paths to process : 2
14/05/30 16:21:26 INFO mapreduce.JobSubmitter: number of splits:2
14/05/30 16:21:27 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1401437120030_0002
14/05/30 16:21:28 INFO impl.YarnClientImpl: Submitted application application_1401437120030_0002
14/05/30 16:21:28 INFO mapreduce.Job: The url to track the job: http://localhost:8088/proxy/application_1401437120030_0002/
14/05/30 16:21:28 INFO mapreduce.Job: Running job: job_1401437120030_0002
14/05/30 16:21:53 INFO mapreduce.Job: Job job_1401437120030_0002 running in uber mode : false
14/05/30 16:21:53 INFO mapreduce.Job: map 0% reduce 0%
14/05/30 16:22:18 INFO mapreduce.Job: map 100% reduce 0%
14/05/30 16:22:34 INFO mapreduce.Job: map 100% reduce 100%
14/05/30 16:22:35 INFO mapreduce.Job: Job job_1401437120030_0002 completed successfully
14/05/30 16:22:36 INFO mapreduce.Job: Counters: 49
File System Counters
FILE: Number of bytes read=50
FILE: Number of bytes written=280470
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=530
HDFS: Number of bytes written=215
HDFS: Number of read operations=11
HDFS: Number of large read operations=0
HDFS: Number of write operations=3
Job Counters
Launched map tasks=2
Launched reduce tasks=1
Data-local map tasks=2
Total time spent by all maps in occupied slots (ms)=46538
Total time spent by all reduces in occupied slots (ms)=13821
Total time spent by all map tasks (ms)=46538
Total time spent by all reduce tasks (ms)=13821
Total vcore-seconds taken by all map tasks=46538
Total vcore-seconds taken by all reduce tasks=13821
Total megabyte-seconds taken by all map tasks=47654912
Total megabyte-seconds taken by all reduce tasks=14152704
Map-Reduce Framework
Map input records=2
Map output records=4
Map output bytes=36
Map output materialized bytes=56
Input split bytes=294
Combine input records=0
Combine output records=0
Reduce input groups=2
Reduce shuffle bytes=56
Reduce input records=4
Reduce output records=0
Spilled Records=8
Shuffled Maps =2
Failed Shuffles=0
Merged Map outputs=2
GC time elapsed (ms)=631
CPU time spent (ms)=7890
Physical memory (bytes) snapshot=623665152
Virtual memory (bytes) snapshot=2097958912
Total committed heap usage (bytes)=559939584
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
File Input Format Counters
Bytes Read=236
File Output Format Counters
Bytes Written=97
Job Finished in 73.196 seconds
Estimated value of Pi is 3.60000000000000000000
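The estimate of 3.6 is poor because only 2 maps x 5 samples = 10 points were sampled. For intuition, the same dart-throwing idea can be sketched in plain awk (ordinary pseudo-random sampling here, whereas the Hadoop example is a quasi-Monte Carlo variant); more samples give a tighter estimate:

```shell
# Estimate pi by sampling points in the unit square and counting how many
# land inside the quarter circle: pi ~= 4 * hits / n. Purely illustrative.
awk 'BEGIN {
  srand(1); n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1) hits++
  }
  printf "%.2f\n", 4 * hits / n
}'
```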

You can also create files and save them on HDFS. Note that relative paths are resolved under your HDFS home directory, /user/<username>, which is why dummy.txt below ends up at /user/jason/dummy.txt. You can read more at http://hadoop.apache.org/docs/r2.4.0/hadoop-project-dist/hadoop-common/FileSystemShell.html
jason@localhost:~$ hadoop fs -mkdir -p /user/hduser
14/05/30 16:27:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
jason@localhost:~$ hadoop fs -copyFromLocal dummy.txt dummy.txt
14/05/30 16:27:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
jason@localhost:~$ hadoop fs -ls
14/05/30 16:28:10 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r-- 1 jason supergroup 13 2014-05-30 16:27 dummy.txt
jason@localhost:~$ hadoop fs -cat /user/hduser/dummy.txt
14/05/30 16:29:00 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
cat: `/user/hduser/dummy.txt': No such file or directory
jason@localhost:~$ hadoop fs -cat /user/jason/dummy.txt
14/05/30 16:29:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hello world.
jason@localhost:~$ hadoop fs -ls /
14/05/30 16:29:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 3 items
drwxr-xr-x - jason supergroup 0 2014-05-30 16:20 /benchmarks
drwx------ - jason supergroup 0 2014-05-30 16:11 /tmp
drwxr-xr-x - jason supergroup 0 2014-05-30 16:27 /user
jason@localhost:~$ hadoop fs -rm dummy.txt
14/05/30 16:29:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/05/30 16:29:54 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
Deleted dummy.txt
jason@localhost:~$ hadoop fs -ls
14/05/30 16:30:03 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
jason@localhost:~$

Once you are done with the Hadoop cluster, you can shut it down using stop-dfs.sh && stop-yarn.sh
jason@localhost:~/Desktop/hadoop-2.4.0$ stop-dfs.sh && stop-yarn.sh
14/05/30 17:51:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [localhost]
localhost: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
14/05/30 17:51:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
stopping yarn daemons
stopping resourcemanager
localhost: stopping nodemanager
no proxyserver to stop

When you are done, you can remove or revert the files changed during this tutorial:

/home/jason/Desktop/hadoop-2.4.0
/home/jason/.ssh/id_rsa_hadoop.pub
/home/jason/.ssh/id_rsa_hadoop
/home/jason/.ssh/authorized_keys
/home/jason/.bashrc
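A possible cleanup sequence for the list above. Note that the last two entries were edited rather than created, so for ~/.ssh/authorized_keys and ~/.bashrc you should remove only the lines this tutorial added instead of deleting the files:

```shell
# Remove the files created for this tutorial. authorized_keys and .bashrc
# are deliberately NOT deleted; edit out the added lines by hand.
rm -rf "$HOME/Desktop/hadoop-2.4.0"
rm -f "$HOME/.ssh/id_rsa_hadoop" "$HOME/.ssh/id_rsa_hadoop.pub"
```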

That's it for this lengthy article. I hope you liked it, and if you learned something, remember to donate to us too!

Saturday, June 7, 2014

Learning Apache Drill Basic

Today, we will take a look at Apache Drill. What is Apache Drill?

Apache Drill is an open-source software framework that supports data-intensive distributed applications for interactive analysis of large-scale datasets.

The instructions below for building from source are taken from the Apache Drill wiki page. You must have Java 7 and Maven 3 installed. There is one more dependency to install as well, the protocol buffers compiler; on Debian-based Linux you can install it via apt.
sudo apt-get install protobuf-compiler

To get the source, you can clone the repository at https://github.com/apache/incubator-drill.git
git clone https://github.com/apache/incubator-drill.git

Once cloned, we can start building the source. Below is my build log.
jason@localhost:~/drill/incubator-drill$ mvn clean install -DskipTests
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for org.apache.drill.exec:drill-java-exec:jar:1.0.0-m2-incubating-SNAPSHOT
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: pentaho:mondrian-data-foodmart-json:jar -> duplicate declaration of version 0.3.2 @ line 108, column 17
[WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.codehaus.janino:janino:jar -> version 2.7.3 vs 2.6.1 @ line 241, column 17
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Apache Drill Root POM
[INFO] Drill Protocol
[INFO] Common (Logical Plan, Base expressions)
[INFO] contrib/Parent Pom
[INFO] contrib/data/Parent Pom
[INFO] contrib/data/tpch-sample-data
[INFO] contrib/storage-hive
[INFO] exec/Parent Pom
[INFO] exec/Netty Little Endian Buffers
[INFO] exec/Java Execution Engine
[INFO] contrib/hbase-storage-plugin
[INFO] exec/JDBC Driver using dependencies
[INFO] contrib/sqlline
[INFO] Packaging and Distribution Assembly
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Drill Root POM 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ drill-root ---
[INFO] Deleting /home/jason/drill/incubator-drill/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.2:enforce (no_commons_logging) @ drill-root ---
[INFO]
[INFO] --- git-commit-id-plugin:2.1.9:revision (default) @ drill-root ---
[info] dotGitDirectory /home/jason/drill/incubator-drill/.git
[info] git.build.user.name Jason Wee
[info] git.build.user.email peichieh@gmail.com
[info] git.branch master
[info] --always = false
[info] --dirty = -dirty
[info] --abbrev = 7
[info] --long = %s true
[info] --match =
[info] Tag refs [ [Ref[refs/tags/drill-1.0.0-m1=04020a8fca8b287874528d86dc7b8be0269ad788], Ref[refs/tags/drill-root-1.0.0-m1=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc], Ref[refs/tags/oscon_workshop=eaf95ed3c30d7bb147afe337e0e0477be6518d90], Ref[refs/tags/pre_exec_merge=a97a22b0a9547f8639e92258c0a3475b01742f15]] ]
[info] Resolved tag [ drill-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Sep 6 13:05:42 2013 -0700] ], points at [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ]
[info] Resolved tag [ drill-root-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Wed Sep 4 04:23:47 2013 -0700] ], points at [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ]
[info] Resolved tag [ pre_exec_merge ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Jul 19 18:33:56 2013 -0700] ], points at [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ]
[info] key [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ], tags => [ [DatedRevTag{id=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc, tagName='drill-root-1.0.0-m1', date=September 4, 2013 7:23:47 PM MYT}] ]
[info] key [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ], tags => [ [DatedRevTag{id=a97a22b0a9547f8639e92258c0a3475b01742f15, tagName='pre_exec_merge', date=July 20, 2013 9:33:56 AM MYT}] ]
[info] key [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ], tags => [ [DatedRevTag{id=04020a8fca8b287874528d86dc7b8be0269ad788, tagName='drill-1.0.0-m1', date=September 7, 2013 4:05:42 AM MYT}] ]
[info] Created map: [ {commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------=[drill-root-1.0.0-m1], commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------=[pre_exec_merge], commit a0d3c6977820516983142c96d7f9374681529968 0 ------=[drill-1.0.0-m1]} ]
[info] HEAD is [ e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249 ]
[info] Repo is in dirty state [ false ]
[info] git.commit.id.describe drill-1.0.0-m1-398-ge1e5ea0
[info] git.commit.id e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249
[info] git.commit.id.abbrev e1e5ea0
[info] git.commit.user.name Parth Chandra
[info] git.commit.user.email pchandra@maprtech.com
[info] git.commit.message.full DRILL-423: C++ Client. Initial implementation (reviewed)

[info] git.commit.message.short DRILL-423: C++ Client. Initial implementation (reviewed)
[info] git.commit.time 30.05.2014 @ 06:32:29 MYT
[info] git.remote.origin.url https://github.com/apache/incubator-drill.git
[info] git.build.time 31.05.2014 @ 16:51:51 MYT
[info] found property git.commit.id.abbrev
[info] found property git.commit.user.email
[info] found property git.commit.message.full
[info] found property git.commit.id
[info] found property git.commit.message.short
[info] found property git.commit.user.name
[info] found property git.build.user.name
[info] found property git.commit.id.describe
[info] found property git.build.user.email
[info] found property git.branch
[info] found property git.commit.time
[info] found property git.build.time
[info] found property git.remote.origin.url
[info] Writing properties file to [ /home/jason/drill/incubator-drill/target/classes/git.properties ] (for module Apache Drill Root POM1 )...
[info] Apache Drill Root POM ] project Apache Drill Root POM
[info] Drill Protocol ] project Drill Protocol
[info] Common (Logical Plan, Base expressions) ] project Common (Logical Plan, Base expressions)
[info] contrib/Parent Pom ] project contrib/Parent Pom
[info] contrib/data/Parent Pom ] project contrib/data/Parent Pom
[info] contrib/data/tpch-sample-data ] project contrib/data/tpch-sample-data
[info] contrib/storage-hive ] project contrib/storage-hive
[info] exec/Parent Pom ] project exec/Parent Pom
[info] exec/Netty Little Endian Buffers ] project exec/Netty Little Endian Buffers
[info] exec/Java Execution Engine ] project exec/Java Execution Engine
[info] contrib/hbase-storage-plugin ] project contrib/hbase-storage-plugin
[info] exec/JDBC Driver using dependencies ] project exec/JDBC Driver using dependencies
[info] contrib/sqlline ] project contrib/sqlline
[info] Packaging and Distribution Assembly ] project Packaging and Distribution Assembly
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ drill-root ---
[INFO]
[INFO] --- apache-rat-plugin:0.10:check (rat-checks) @ drill-root ---
[INFO] 56 implicit excludes (use -debug for more details).
[INFO] Exclude: **/*.log
[INFO] Exclude: **/*.md
[INFO] Exclude: sandbox/**
[INFO] Exclude: **/*.json
[INFO] Exclude: **/*.sql
[INFO] Exclude: **/git.properties
[INFO] Exclude: **/*.csv
[INFO] Exclude: **/drill-*.conf
[INFO] Exclude: **/.buildpath
[INFO] Exclude: **/*.proto
[INFO] Exclude: **/*.fmpp
[INFO] Exclude: **/target/**
[INFO] Exclude: **/*.iml
[INFO] Exclude: **/*.tdd
[INFO] Exclude: **/*.project
[INFO] Exclude: .*/**
[INFO] Exclude: *.patch
[INFO] Exclude: **/*.pb.cc
[INFO] Exclude: **/*.pb.h
[INFO] 16 resources included (use -debug for more details)
Warning: org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
Compiler warnings:
WARNING: 'org.apache.xerces.jaxp.SAXParserImpl: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.'
Warning: org.apache.xerces.parsers.SAXParser: Feature 'http://javax.xml.XMLConstants/feature/secure-processing' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
[INFO] Rat check: Summary of files. Unapproved: 0 unknown: 0 generated: 0 approved: 4 licence.
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ drill-root ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ drill-root ---
[INFO] Installing /home/jason/drill/incubator-drill/pom.xml to /home/jason/.m2/repository/org/apache/drill/drill-root/1.0.0-m2-incubating-SNAPSHOT/drill-root-1.0.0-m2-incubating-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Drill Protocol 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ drill-protocol ---
[INFO] Deleting /home/jason/drill/incubator-drill/protocol/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.2:enforce (no_commons_logging) @ drill-protocol ---
[INFO]
[INFO] --- git-commit-id-plugin:2.1.9:revision (default) @ drill-protocol ---
[info] dotGitDirectory /home/jason/drill/incubator-drill/.git
[info] git.build.user.name Jason Wee
[info] git.build.user.email peichieh@gmail.com
[info] git.branch master
[info] --always = false
[info] --dirty = -dirty
[info] --abbrev = 7
[info] --long = %s true
[info] --match =
[info] Tag refs [ [Ref[refs/tags/drill-1.0.0-m1=04020a8fca8b287874528d86dc7b8be0269ad788], Ref[refs/tags/drill-root-1.0.0-m1=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc], Ref[refs/tags/oscon_workshop=eaf95ed3c30d7bb147afe337e0e0477be6518d90], Ref[refs/tags/pre_exec_merge=a97a22b0a9547f8639e92258c0a3475b01742f15]] ]
[info] Resolved tag [ drill-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Sep 6 13:05:42 2013 -0700] ], points at [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ]
[info] Resolved tag [ drill-root-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Wed Sep 4 04:23:47 2013 -0700] ], points at [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ]
[info] Resolved tag [ pre_exec_merge ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Jul 19 18:33:56 2013 -0700] ], points at [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ]
[info] key [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ], tags => [ [DatedRevTag{id=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc, tagName='drill-root-1.0.0-m1', date=September 4, 2013 7:23:47 PM MYT}] ]
[info] key [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ], tags => [ [DatedRevTag{id=a97a22b0a9547f8639e92258c0a3475b01742f15, tagName='pre_exec_merge', date=July 20, 2013 9:33:56 AM MYT}] ]
[info] key [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ], tags => [ [DatedRevTag{id=04020a8fca8b287874528d86dc7b8be0269ad788, tagName='drill-1.0.0-m1', date=September 7, 2013 4:05:42 AM MYT}] ]
[info] Created map: [ {commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------=[drill-root-1.0.0-m1], commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------=[pre_exec_merge], commit a0d3c6977820516983142c96d7f9374681529968 0 ------=[drill-1.0.0-m1]} ]
[info] HEAD is [ e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249 ]
[info] Repo is in dirty state [ false ]
[info] git.commit.id.describe drill-1.0.0-m1-398-ge1e5ea0
[info] git.commit.id e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249
[info] git.commit.id.abbrev e1e5ea0
[info] git.commit.user.name Parth Chandra
[info] git.commit.user.email pchandra@maprtech.com
[info] git.commit.message.full DRILL-423: C++ Client. Initial implementation (reviewed)

[info] git.commit.message.short DRILL-423: C++ Client. Initial implementation (reviewed)
[info] git.commit.time 30.05.2014 @ 06:32:29 MYT
[info] git.remote.origin.url https://github.com/apache/incubator-drill.git
[info] git.build.time 31.05.2014 @ 16:52:03 MYT
[info] found property git.commit.id.abbrev
[info] found property git.commit.user.email
[info] found property git.commit.message.full
[info] found property git.commit.id
[info] found property git.commit.message.short
[info] found property git.commit.user.name
[info] found property git.build.user.name
[info] found property git.commit.id.describe
[info] found property git.build.user.email
[info] found property git.branch
[info] found property git.commit.time
[info] found property git.build.time
[info] found property git.remote.origin.url
[info] Writing properties file to [ /home/jason/drill/incubator-drill/protocol/target/classes/git.properties ] (for module Drill Protocol )...
[info] Apache Drill Root POM ] project Apache Drill Root POM
[info] Drill Protocol ] project Drill Protocol
[info] Common (Logical Plan, Base expressions) ] project Common (Logical Plan, Base expressions)
[info] contrib/Parent Pom ] project contrib/Parent Pom
[info] contrib/data/Parent Pom ] project contrib/data/Parent Pom
[info] contrib/data/tpch-sample-data ] project contrib/data/tpch-sample-data
[info] contrib/storage-hive ] project contrib/storage-hive
[info] exec/Parent Pom ] project exec/Parent Pom
[info] exec/Netty Little Endian Buffers ] project exec/Netty Little Endian Buffers
[info] exec/Java Execution Engine ] project exec/Java Execution Engine
[info] contrib/hbase-storage-plugin ] project contrib/hbase-storage-plugin
[info] exec/JDBC Driver using dependencies ] project exec/JDBC Driver using dependencies
[info] contrib/sqlline ] project contrib/sqlline
[info] Packaging and Distribution Assembly ] project Packaging and Distribution Assembly
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ drill-protocol ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ drill-protocol ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jason/drill/incubator-drill/protocol/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ drill-protocol ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 9 source files to /home/jason/drill/incubator-drill/protocol/target/classes
[INFO]
[INFO] --- apache-rat-plugin:0.10:check (rat-checks) @ drill-protocol ---
[INFO] 51 implicit excludes (use -debug for more details).
[INFO] Exclude: **/*.log
[INFO] Exclude: **/*.md
[INFO] Exclude: sandbox/**
[INFO] Exclude: **/*.json
[INFO] Exclude: **/*.sql
[INFO] Exclude: **/git.properties
[INFO] Exclude: **/*.csv
[INFO] Exclude: **/drill-*.conf
[INFO] Exclude: **/.buildpath
[INFO] Exclude: **/*.proto
[INFO] Exclude: **/*.fmpp
[INFO] Exclude: **/target/**
[INFO] Exclude: **/*.iml
[INFO] Exclude: **/*.tdd
[INFO] Exclude: **/*.project
[INFO] Exclude: .*/**
[INFO] Exclude: *.patch
[INFO] Exclude: **/*.pb.cc
[INFO] Exclude: **/*.pb.h
[INFO] 11 resources included (use -debug for more details)
Warning: org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
Compiler warnings:
WARNING: 'org.apache.xerces.jaxp.SAXParserImpl: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.'
Warning: org.apache.xerces.parsers.SAXParser: Feature 'http://javax.xml.XMLConstants/feature/secure-processing' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
[INFO] Rat check: Summary of files. Unapproved: 0 unknown: 0 generated: 0 approved: 10 licence.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ drill-protocol ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/jason/drill/incubator-drill/protocol/src/test/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ drill-protocol ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ drill-protocol ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ drill-protocol ---
[INFO] Building jar: /home/jason/drill/incubator-drill/protocol/target/drill-protocol-1.0.0-m2-incubating-SNAPSHOT.jar
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ drill-protocol ---
[INFO]
[INFO] --- maven-shade-plugin:2.1:shade (default) @ drill-protocol ---
[INFO] Including com.google.protobuf:protobuf-java:jar:2.5.0 in the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding com.google.guava:guava:jar:14.0.1 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:jul-to-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:jcl-over-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:log4j-over-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Attaching shaded artifact.
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ drill-protocol ---
[INFO] Installing /home/jason/drill/incubator-drill/protocol/target/drill-protocol-1.0.0-m2-incubating-SNAPSHOT.jar to /home/jason/.m2/repository/org/apache/drill/drill-protocol/1.0.0-m2-incubating-SNAPSHOT/drill-protocol-1.0.0-m2-incubating-SNAPSHOT.jar
[INFO] Installing /home/jason/drill/incubator-drill/protocol/pom.xml to /home/jason/.m2/repository/org/apache/drill/drill-protocol/1.0.0-m2-incubating-SNAPSHOT/drill-protocol-1.0.0-m2-incubating-SNAPSHOT.pom
[INFO] Installing /home/jason/drill/incubator-drill/protocol/target/drill-protocol-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar to /home/jason/.m2/repository/org/apache/drill/drill-protocol/1.0.0-m2-incubating-SNAPSHOT/drill-protocol-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Common (Logical Plan, Base expressions) 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ drill-common ---
[INFO] Deleting /home/jason/drill/incubator-drill/common/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.2:enforce (no_commons_logging) @ drill-common ---
[INFO]
[INFO] --- git-commit-id-plugin:2.1.9:revision (default) @ drill-common ---
[info] dotGitDirectory /home/jason/drill/incubator-drill/.git
[info] git.build.user.name Jason Wee
[info] git.build.user.email peichieh@gmail.com
[info] git.branch master
[info] --always = false
[info] --dirty = -dirty
[info] --abbrev = 7
[info] --long = %s true
[info] --match =
[info] Tag refs [ [Ref[refs/tags/drill-1.0.0-m1=04020a8fca8b287874528d86dc7b8be0269ad788], Ref[refs/tags/drill-root-1.0.0-m1=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc], Ref[refs/tags/oscon_workshop=eaf95ed3c30d7bb147afe337e0e0477be6518d90], Ref[refs/tags/pre_exec_merge=a97a22b0a9547f8639e92258c0a3475b01742f15]] ]
[info] Resolved tag [ drill-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Sep 6 13:05:42 2013 -0700] ], points at [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ]
[info] Resolved tag [ drill-root-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Wed Sep 4 04:23:47 2013 -0700] ], points at [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ]
[info] Resolved tag [ pre_exec_merge ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Jul 19 18:33:56 2013 -0700] ], points at [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ]
[info] key [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ], tags => [ [DatedRevTag{id=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc, tagName='drill-root-1.0.0-m1', date=September 4, 2013 7:23:47 PM MYT}] ]
[info] key [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ], tags => [ [DatedRevTag{id=a97a22b0a9547f8639e92258c0a3475b01742f15, tagName='pre_exec_merge', date=July 20, 2013 9:33:56 AM MYT}] ]
[info] key [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ], tags => [ [DatedRevTag{id=04020a8fca8b287874528d86dc7b8be0269ad788, tagName='drill-1.0.0-m1', date=September 7, 2013 4:05:42 AM MYT}] ]
[info] Created map: [ {commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------=[drill-root-1.0.0-m1], commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------=[pre_exec_merge], commit a0d3c6977820516983142c96d7f9374681529968 0 ------=[drill-1.0.0-m1]} ]
[info] HEAD is [ e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249 ]
[info] Repo is in dirty state [ false ]
[info] git.commit.id.describe drill-1.0.0-m1-398-ge1e5ea0
[info] git.commit.id e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249
[info] git.commit.id.abbrev e1e5ea0
[info] git.commit.user.name Parth Chandra
[info] git.commit.user.email pchandra@maprtech.com
[info] git.commit.message.full DRILL-423: C++ Client. Initial implementation (reviewed)

[info] git.commit.message.short DRILL-423: C++ Client. Initial implementation (reviewed)
[info] git.commit.time 30.05.2014 @ 06:32:29 MYT
[info] git.remote.origin.url https://github.com/apache/incubator-drill.git
[info] git.build.time 31.05.2014 @ 16:52:24 MYT
[info] found property git.commit.id.abbrev
[info] found property git.commit.user.email
[info] found property git.commit.message.full
[info] found property git.commit.id
[info] found property git.commit.message.short
[info] found property git.commit.user.name
[info] found property git.build.user.name
[info] found property git.commit.id.describe
[info] found property git.build.user.email
[info] found property git.branch
[info] found property git.commit.time
[info] found property git.build.time
[info] found property git.remote.origin.url
[info] Writing properties file to [ /home/jason/drill/incubator-drill/common/target/classes/git.properties ] (for module Common (Logical Plan, Base expressions) )...
[info] Apache Drill Root POM ] project Apache Drill Root POM
[info] Drill Protocol ] project Drill Protocol
[info] Common (Logical Plan, Base expressions) ] project Common (Logical Plan, Base expressions)
[info] contrib/Parent Pom ] project contrib/Parent Pom
[info] contrib/data/Parent Pom ] project contrib/data/Parent Pom
[info] contrib/data/tpch-sample-data ] project contrib/data/tpch-sample-data
[info] contrib/storage-hive ] project contrib/storage-hive
[info] exec/Parent Pom ] project exec/Parent Pom
[info] exec/Netty Little Endian Buffers ] project exec/Netty Little Endian Buffers
[info] exec/Java Execution Engine ] project exec/Java Execution Engine
[info] contrib/hbase-storage-plugin ] project contrib/hbase-storage-plugin
[info] exec/JDBC Driver using dependencies ] project exec/JDBC Driver using dependencies
[info] contrib/sqlline ] project contrib/sqlline
[info] Packaging and Distribution Assembly ] project Packaging and Distribution Assembly
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ drill-common ---
[INFO] ANTLR: Processing source directory /home/jason/drill/incubator-drill/common/src/main/antlr3
ANTLR Parser Generator Version 3.4
org/apache/drill/common/expression/parser/ExprLexer.g
org/apache/drill/common/expression/parser/ExprParser.g
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ drill-common ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ drill-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ drill-common ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 109 source files to /home/jason/drill/incubator-drill/common/target/classes
[WARNING] /home/jason/drill/incubator-drill/common/src/main/java/org/apache/drill/common/config/DrillConfig.java:[51,90] VM is internal proprietary API and may be removed in a future release
[INFO]
[INFO] --- apache-rat-plugin:0.10:check (rat-checks) @ drill-common ---
[INFO] 51 implicit excludes (use -debug for more details).
[INFO] Exclude: **/*.log
[INFO] Exclude: **/*.md
[INFO] Exclude: sandbox/**
[INFO] Exclude: **/*.json
[INFO] Exclude: **/*.sql
[INFO] Exclude: **/git.properties
[INFO] Exclude: **/*.csv
[INFO] Exclude: **/drill-*.conf
[INFO] Exclude: **/.buildpath
[INFO] Exclude: **/*.proto
[INFO] Exclude: **/*.fmpp
[INFO] Exclude: **/target/**
[INFO] Exclude: **/*.iml
[INFO] Exclude: **/*.tdd
[INFO] Exclude: **/*.project
[INFO] Exclude: .*/**
[INFO] Exclude: *.patch
[INFO] Exclude: **/*.pb.cc
[INFO] Exclude: **/*.pb.h
[INFO] 116 resources included (use -debug for more details)
Warning: org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
Compiler warnings:
WARNING: 'org.apache.xerces.jaxp.SAXParserImpl: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.'
Warning: org.apache.xerces.parsers.SAXParser: Feature 'http://javax.xml.XMLConstants/feature/secure-processing' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
[INFO] Rat check: Summary of files. Unapproved: 0 unknown: 0 generated: 0 approved: 116 licence.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ drill-common ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 8 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ drill-common ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 3 source files to /home/jason/drill/incubator-drill/common/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.15:test (default-test) @ drill-common ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ drill-common ---
[INFO] Building jar: /home/jason/drill/incubator-drill/common/target/drill-common-1.0.0-m2-incubating-SNAPSHOT.jar
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ drill-common ---
[INFO]
[INFO] --- maven-jar-plugin:2.4:test-jar (test-jar) @ drill-common ---
[INFO] Building jar: /home/jason/drill/incubator-drill/common/target/drill-common-1.0.0-m2-incubating-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-shade-plugin:2.1:shade (default) @ drill-common ---
[INFO] Excluding org.apache.drill:drill-protocol:jar:1.0.0-m2-incubating-SNAPSHOT from the shaded jar.
[INFO] Including com.google.protobuf:protobuf-java:jar:2.5.0 in the shaded jar.
[INFO] Excluding junit:junit:jar:4.11 from the shaded jar.
[INFO] Excluding org.hamcrest:hamcrest-core:jar:1.3 from the shaded jar.
[INFO] Excluding net.hydromatic:optiq-core:jar:0.7-SNAPSHOT from the shaded jar.
[INFO] Excluding net.hydromatic:optiq-avatica:jar:0.7-SNAPSHOT from the shaded jar.
[INFO] Excluding eigenbase:eigenbase-properties:jar:1.1.4 from the shaded jar.
[INFO] Excluding net.hydromatic:linq4j:jar:0.2 from the shaded jar.
[INFO] Excluding org.codehaus.janino:janino:jar:2.7.3 from the shaded jar.
[INFO] Excluding org.codehaus.janino:commons-compiler:jar:2.7.3 from the shaded jar.
[INFO] Excluding commons-dbcp:commons-dbcp:jar:1.4 from the shaded jar.
[INFO] Excluding commons-pool:commons-pool:jar:1.5.4 from the shaded jar.
[INFO] Excluding com.typesafe:config:jar:1.0.0 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-lang3:jar:3.1 from the shaded jar.
[INFO] Excluding org.msgpack:msgpack:jar:0.6.6 from the shaded jar.
[INFO] Excluding com.googlecode.json-simple:json-simple:jar:1.1.1 from the shaded jar.
[INFO] Excluding org.javassist:javassist:jar:3.16.1-GA from the shaded jar.
[INFO] Excluding org.reflections:reflections:jar:0.9.8 from the shaded jar.
[INFO] Excluding javassist:javassist:jar:3.12.1.GA from the shaded jar.
[INFO] Excluding dom4j:dom4j:jar:1.6.1 from the shaded jar.
[INFO] Excluding xml-apis:xml-apis:jar:1.0.b2 from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-annotations:jar:2.2.0 from the shaded jar.
[INFO] Excluding org.hibernate:hibernate-validator:jar:4.3.1.Final from the shaded jar.
[INFO] Excluding javax.validation:validation-api:jar:1.0.0.GA from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-databind:jar:2.2.0 from the shaded jar.
[INFO] Excluding com.fasterxml.jackson.core:jackson-core:jar:2.2.0 from the shaded jar.
[INFO] Excluding org.antlr:antlr-runtime:jar:3.4 from the shaded jar.
[INFO] Excluding org.antlr:stringtemplate:jar:3.2.1 from the shaded jar.
[INFO] Excluding antlr:antlr:jar:2.7.7 from the shaded jar.
[INFO] Excluding joda-time:joda-time:jar:2.3 from the shaded jar.
[INFO] Excluding io.netty:netty-handler:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-buffer:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-common:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-transport:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding io.netty:netty-codec:jar:4.0.7.Final from the shaded jar.
[INFO] Excluding com.google.guava:guava:jar:14.0.1 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:jul-to-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:jcl-over-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Excluding org.slf4j:log4j-over-slf4j:jar:1.7.5 from the shaded jar.
[INFO] Attaching shaded artifact.
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ drill-common ---
[INFO] Installing /home/jason/drill/incubator-drill/common/target/drill-common-1.0.0-m2-incubating-SNAPSHOT.jar to /home/jason/.m2/repository/org/apache/drill/drill-common/1.0.0-m2-incubating-SNAPSHOT/drill-common-1.0.0-m2-incubating-SNAPSHOT.jar
[INFO] Installing /home/jason/drill/incubator-drill/common/pom.xml to /home/jason/.m2/repository/org/apache/drill/drill-common/1.0.0-m2-incubating-SNAPSHOT/drill-common-1.0.0-m2-incubating-SNAPSHOT.pom
[INFO] Installing /home/jason/drill/incubator-drill/common/target/drill-common-1.0.0-m2-incubating-SNAPSHOT-tests.jar to /home/jason/.m2/repository/org/apache/drill/drill-common/1.0.0-m2-incubating-SNAPSHOT/drill-common-1.0.0-m2-incubating-SNAPSHOT-tests.jar
[INFO] Installing /home/jason/drill/incubator-drill/common/target/drill-common-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar to /home/jason/.m2/repository/org/apache/drill/drill-common/1.0.0-m2-incubating-SNAPSHOT/drill-common-1.0.0-m2-incubating-SNAPSHOT-rebuffed.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building contrib/Parent Pom 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ drill-contrib-parent ---
[INFO] Deleting /home/jason/drill/incubator-drill/contrib/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.2:enforce (no_commons_logging) @ drill-contrib-parent ---
[INFO]
[INFO] --- git-commit-id-plugin:2.1.9:revision (default) @ drill-contrib-parent ---
[info] dotGitDirectory /home/jason/drill/incubator-drill/.git
[info] git.build.user.name Jason Wee
[info] git.build.user.email peichieh@gmail.com
[info] git.branch master
[info] --always = false
[info] --dirty = -dirty
[info] --abbrev = 7
[info] --long = %s true
[info] --match =
[info] Tag refs [ [Ref[refs/tags/drill-1.0.0-m1=04020a8fca8b287874528d86dc7b8be0269ad788], Ref[refs/tags/drill-root-1.0.0-m1=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc], Ref[refs/tags/oscon_workshop=eaf95ed3c30d7bb147afe337e0e0477be6518d90], Ref[refs/tags/pre_exec_merge=a97a22b0a9547f8639e92258c0a3475b01742f15]] ]
[info] Resolved tag [ drill-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Sep 6 13:05:42 2013 -0700] ], points at [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ]
[info] Resolved tag [ drill-root-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Wed Sep 4 04:23:47 2013 -0700] ], points at [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ]
[info] Resolved tag [ pre_exec_merge ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Jul 19 18:33:56 2013 -0700] ], points at [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ]
[info] key [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ], tags => [ [DatedRevTag{id=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc, tagName='drill-root-1.0.0-m1', date=September 4, 2013 7:23:47 PM MYT}] ]
[info] key [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ], tags => [ [DatedRevTag{id=a97a22b0a9547f8639e92258c0a3475b01742f15, tagName='pre_exec_merge', date=July 20, 2013 9:33:56 AM MYT}] ]
[info] key [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ], tags => [ [DatedRevTag{id=04020a8fca8b287874528d86dc7b8be0269ad788, tagName='drill-1.0.0-m1', date=September 7, 2013 4:05:42 AM MYT}] ]
[info] Created map: [ {commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------=[drill-root-1.0.0-m1], commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------=[pre_exec_merge], commit a0d3c6977820516983142c96d7f9374681529968 0 ------=[drill-1.0.0-m1]} ]
[info] HEAD is [ e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249 ]
[info] Repo is in dirty state [ false ]
[info] git.commit.id.describe drill-1.0.0-m1-398-ge1e5ea0
[info] git.commit.id e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249
[info] git.commit.id.abbrev e1e5ea0
[info] git.commit.user.name Parth Chandra
[info] git.commit.user.email pchandra@maprtech.com
[info] git.commit.message.full DRILL-423: C++ Client. Initial implementation (reviewed)

[info] git.commit.message.short DRILL-423: C++ Client. Initial implementation (reviewed)
[info] git.commit.time 30.05.2014 @ 06:32:29 MYT
[info] git.remote.origin.url https://github.com/apache/incubator-drill.git
[info] git.build.time 31.05.2014 @ 16:52:52 MYT
[info] found property git.commit.id.abbrev
[info] found property git.commit.user.email
[info] found property git.commit.message.full
[info] found property git.commit.id
[info] found property git.commit.message.short
[info] found property git.commit.user.name
[info] found property git.build.user.name
[info] found property git.commit.id.describe
[info] found property git.build.user.email
[info] found property git.branch
[info] found property git.commit.time
[info] found property git.build.time
[info] found property git.remote.origin.url
[info] Writing properties file to [ /home/jason/drill/incubator-drill/contrib/target/classes/git.properties ] (for module contrib/Parent Pom )...
[info] Apache Drill Root POM ] project Apache Drill Root POM
[info] Drill Protocol ] project Drill Protocol
[info] Common (Logical Plan, Base expressions) ] project Common (Logical Plan, Base expressions)
[info] contrib/Parent Pom ] project contrib/Parent Pom
[info] contrib/data/Parent Pom ] project contrib/data/Parent Pom
[info] contrib/data/tpch-sample-data ] project contrib/data/tpch-sample-data
[info] contrib/storage-hive ] project contrib/storage-hive
[info] exec/Parent Pom ] project exec/Parent Pom
[info] exec/Netty Little Endian Buffers ] project exec/Netty Little Endian Buffers
[info] exec/Java Execution Engine ] project exec/Java Execution Engine
[info] contrib/hbase-storage-plugin ] project contrib/hbase-storage-plugin
[info] exec/JDBC Driver using dependencies ] project exec/JDBC Driver using dependencies
[info] contrib/sqlline ] project contrib/sqlline
[info] Packaging and Distribution Assembly ] project Packaging and Distribution Assembly
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ drill-contrib-parent ---
[INFO]
[INFO] --- apache-rat-plugin:0.10:check (rat-checks) @ drill-contrib-parent ---
[INFO] 55 implicit excludes (use -debug for more details).
[INFO] Exclude: **/*.log
[INFO] Exclude: **/*.md
[INFO] Exclude: sandbox/**
[INFO] Exclude: **/*.json
[INFO] Exclude: **/*.sql
[INFO] Exclude: **/git.properties
[INFO] Exclude: **/*.csv
[INFO] Exclude: **/drill-*.conf
[INFO] Exclude: **/.buildpath
[INFO] Exclude: **/*.proto
[INFO] Exclude: **/*.fmpp
[INFO] Exclude: **/target/**
[INFO] Exclude: **/*.iml
[INFO] Exclude: **/*.tdd
[INFO] Exclude: **/*.project
[INFO] Exclude: .*/**
[INFO] Exclude: *.patch
[INFO] Exclude: **/*.pb.cc
[INFO] Exclude: **/*.pb.h
[INFO] 25 resources included (use -debug for more details)
Warning: org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
Compiler warnings:
WARNING: 'org.apache.xerces.jaxp.SAXParserImpl: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.'
Warning: org.apache.xerces.parsers.SAXParser: Feature 'http://javax.xml.XMLConstants/feature/secure-processing' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
[INFO] Rat check: Summary of files. Unapproved: 0 unknown: 0 generated: 0 approved: 25 licence.
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ drill-contrib-parent ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ drill-contrib-parent ---
[INFO] Installing /home/jason/drill/incubator-drill/contrib/pom.xml to /home/jason/.m2/repository/org/apache/drill/contrib/drill-contrib-parent/1.0.0-m2-incubating-SNAPSHOT/drill-contrib-parent-1.0.0-m2-incubating-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building contrib/data/Parent Pom 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ drill-contrib-data-parent ---
[INFO] Deleting /home/jason/drill/incubator-drill/contrib/data/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.2:enforce (no_commons_logging) @ drill-contrib-data-parent ---
[INFO]
[INFO] --- git-commit-id-plugin:2.1.9:revision (default) @ drill-contrib-data-parent ---
[info] dotGitDirectory /home/jason/drill/incubator-drill/.git
[info] git.build.user.name Jason Wee
[info] git.build.user.email peichieh@gmail.com
[info] git.branch master
[info] --always = false
[info] --dirty = -dirty
[info] --abbrev = 7
[info] --long = %s true
[info] --match =
[info] Tag refs [ [Ref[refs/tags/drill-1.0.0-m1=04020a8fca8b287874528d86dc7b8be0269ad788], Ref[refs/tags/drill-root-1.0.0-m1=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc], Ref[refs/tags/oscon_workshop=eaf95ed3c30d7bb147afe337e0e0477be6518d90], Ref[refs/tags/pre_exec_merge=a97a22b0a9547f8639e92258c0a3475b01742f15]] ]
[info] Resolved tag [ drill-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Sep 6 13:05:42 2013 -0700] ], points at [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ]
[info] Resolved tag [ drill-root-1.0.0-m1 ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Wed Sep 4 04:23:47 2013 -0700] ], points at [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ]
[info] Resolved tag [ pre_exec_merge ] [ PersonIdent[Jacques Nadeau, jacques@apache.org, Fri Jul 19 18:33:56 2013 -0700] ], points at [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ]
[info] key [ commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------ ], tags => [ [DatedRevTag{id=ad638d9e41aa9efdb1e877cfe7e0a4b910f539fc, tagName='drill-root-1.0.0-m1', date=September 4, 2013 7:23:47 PM MYT}] ]
[info] key [ commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------ ], tags => [ [DatedRevTag{id=a97a22b0a9547f8639e92258c0a3475b01742f15, tagName='pre_exec_merge', date=July 20, 2013 9:33:56 AM MYT}] ]
[info] key [ commit a0d3c6977820516983142c96d7f9374681529968 0 ------ ], tags => [ [DatedRevTag{id=04020a8fca8b287874528d86dc7b8be0269ad788, tagName='drill-1.0.0-m1', date=September 7, 2013 4:05:42 AM MYT}] ]
[info] Created map: [ {commit 41c18197e3b8ae3c42d55089d641e9a0b68c6f29 0 ------=[drill-root-1.0.0-m1], commit 5052b64d9953857575f8f40995b8da05160e5457 0 ------=[pre_exec_merge], commit a0d3c6977820516983142c96d7f9374681529968 0 ------=[drill-1.0.0-m1]} ]
[info] HEAD is [ e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249 ]
[info] Repo is in dirty state [ false ]
[info] git.commit.id.describe drill-1.0.0-m1-398-ge1e5ea0
[info] git.commit.id e1e5ea0eddd9199672ab01c5ae31f7a3c0a57249
[info] git.commit.id.abbrev e1e5ea0
[info] git.commit.user.name Parth Chandra
[info] git.commit.user.email pchandra@maprtech.com
[info] git.commit.message.full DRILL-423: C++ Client. Initial implementation (reviewed)

[info] git.commit.message.short DRILL-423: C++ Client. Initial implementation (reviewed)
[info] git.commit.time 30.05.2014 @ 06:32:29 MYT
[info] git.remote.origin.url https://github.com/apache/incubator-drill.git
[info] git.build.time 31.05.2014 @ 16:52:53 MYT
[info] found property git.commit.id.abbrev
[info] found property git.commit.user.email
[info] found property git.commit.message.full
[info] found property git.commit.id
[info] found property git.commit.message.short
[info] found property git.commit.user.name
[info] found property git.build.user.name
[info] found property git.commit.id.describe
[info] found property git.build.user.email
[info] found property git.branch
[info] found property git.commit.time
[info] found property git.build.time
[info] found property git.remote.origin.url
[info] Writing properties file to [ /home/jason/drill/incubator-drill/contrib/data/target/classes/git.properties ] (for module contrib/data/Parent Pom5 )...
[info] Apache Drill Root POM ] project Apache Drill Root POM
[info] Drill Protocol ] project Drill Protocol
[info] Common (Logical Plan, Base expressions) ] project Common (Logical Plan, Base expressions)
[info] contrib/Parent Pom ] project contrib/Parent Pom
[info] contrib/data/Parent Pom ] project contrib/data/Parent Pom
[info] contrib/data/tpch-sample-data ] project contrib/data/tpch-sample-data
[info] contrib/storage-hive ] project contrib/storage-hive
[info] exec/Parent Pom ] project exec/Parent Pom
[info] exec/Netty Little Endian Buffers ] project exec/Netty Little Endian Buffers
[info] exec/Java Execution Engine ] project exec/Java Execution Engine
[info] contrib/hbase-storage-plugin ] project contrib/hbase-storage-plugin
[info] exec/JDBC Driver using dependencies ] project exec/JDBC Driver using dependencies
[info] contrib/sqlline ] project contrib/sqlline
[info] Packaging and Distribution Assembly ] project Packaging and Distribution Assembly
[INFO]
[INFO] --- maven-remote-resources-plugin:1.4:process (default) @ drill-contrib-data-parent ---
[INFO]
[INFO] --- apache-rat-plugin:0.10:check (rat-checks) @ drill-contrib-data-parent ---
[INFO] 52 implicit excludes (use -debug for more details).
[INFO] Exclude: **/*.log
[INFO] Exclude: **/*.md
[INFO] Exclude: sandbox/**
[INFO] Exclude: **/*.json
[INFO] Exclude: **/*.sql
[INFO] Exclude: **/git.properties
[INFO] Exclude: **/*.csv
[INFO] Exclude: **/drill-*.conf
[INFO] Exclude: **/.buildpath
[INFO] Exclude: **/*.proto
[INFO] Exclude: **/*.fmpp
[INFO] Exclude: **/target/**
[INFO] Exclude: **/*.iml
[INFO] Exclude: **/*.tdd
[INFO] Exclude: **/*.project
[INFO] Exclude: .*/**
[INFO] Exclude: *.patch
[INFO] Exclude: **/*.pb.cc
[INFO] Exclude: **/*.pb.h
[INFO] 1 resources included (use -debug for more details)
Warning: org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
Compiler warnings:
WARNING: 'org.apache.xerces.jaxp.SAXParserImpl: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.'
Warning: org.apache.xerces.parsers.SAXParser: Feature 'http://javax.xml.XMLConstants/feature/secure-processing' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://javax.xml.XMLConstants/property/accessExternalDTD' is not recognized.
Warning: org.apache.xerces.parsers.SAXParser: Property 'http://www.oracle.com/xml/jaxp/properties/entityExpansionLimit' is not recognized.
[INFO] Rat check: Summary of files. Unapproved: 0 unknown: 0 generated: 0 approved: 1 licence.
[INFO]
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ drill-contrib-data-parent ---
[INFO]
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ drill-contrib-data-parent ---
[INFO] Installing /home/jason/drill/incubator-drill/contrib/data/pom.xml to /home/jason/.m2/repository/org/apache/drill/contrib/data/drill-contrib-data-parent/1.0.0-m2-incubating-SNAPSHOT/drill-contrib-data-parent-1.0.0-m2-incubating-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building contrib/data/tpch-sample-data 1.0.0-m2-incubating-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for com.googlecode.maven-download-plugin:download-maven-plugin:jar:1.2.0-SNAPSHOT is missing, no dependency information available
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Drill Root POM ............................. SUCCESS [21.251s]
[INFO] Drill Protocol .................................... SUCCESS [20.039s]
[INFO] Common (Logical Plan, Base expressions) ........... SUCCESS [27.872s]
[INFO] contrib/Parent Pom ................................ SUCCESS [2.687s]
[INFO] contrib/data/Parent Pom ........................... SUCCESS [1.698s]
[INFO] contrib/data/tpch-sample-data ..................... FAILURE [0.308s]
[INFO] contrib/storage-hive .............................. SKIPPED
[INFO] exec/Parent Pom ................................... SKIPPED
[INFO] exec/Netty Little Endian Buffers .................. SKIPPED
[INFO] exec/Java Execution Engine ........................ SKIPPED
[INFO] contrib/hbase-storage-plugin ...................... SKIPPED
[INFO] exec/JDBC Driver using dependencies ............... SKIPPED
[INFO] contrib/sqlline ................................... SKIPPED
[INFO] Packaging and Distribution Assembly ............... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:16.738s
[INFO] Finished at: Sat May 31 16:52:54 MYT 2014
[INFO] Final Memory: 39M/384M
[INFO] ------------------------------------------------------------------------
[ERROR] Plugin com.googlecode.maven-download-plugin:download-maven-plugin:1.2.0-SNAPSHOT or one of its dependencies could not be resolved: Failed to read artifact descriptor for com.googlecode.maven-download-plugin:download-maven-plugin:jar:1.2.0-SNAPSHOT: Failure to find com.googlecode.maven-download-plugin:download-maven-plugin:pom:1.2.0-SNAPSHOT in https://oss.sonatype.org/content/groups/public was cached in the local repository, resolution will not be reattempted until the update interval of sonatype-public-repository has elapsed or updates are forced -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
jason@localhost:~/drill/incubator-drill$
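The error message itself hints at a workaround: the failed lookup for the snapshot plugin "was cached in the local repository, resolution will not be reattempted until the update interval ... has elapsed or updates are forced". A hedged sketch of that workaround is to purge the stale cached entry from the local Maven repository:

```shell
# Maven cached the failed lookup for the snapshot plugin; remove the stale
# entry from the local repository so the next build attempt re-downloads it.
rm -rf "$HOME/.m2/repository/com/googlecode/maven-download-plugin"
```

After clearing the cached entry, re-running the build with `mvn -U clean install` forces Maven to re-check the remote repositories for snapshot updates. Whether the 1.2.0-SNAPSHOT artifact still exists upstream is another question, so this may not be enough on its own.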

I'm not sure why the build fails even though the dependencies are fulfilled as outlined in the wiki. Luckily there is a prebuilt binary that we can use instead. Below is the output.
jason@localhost:~/drill/apache-drill/apache-drill-1.0.0-m1$ ./bin/sqlline -u jdbc:drill:zk=local -n admin -p admin 

Invalid maximum direct memory size: -XX:MaxDirectMemorySize=8G
The specified size exceeds the maximum representable size.
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
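The JVM refuses to start because `-XX:MaxDirectMemorySize=8G` asks for more direct memory than this machine's JVM can represent. One plausible fix is lowering the memory settings in bin/drill-config.sh; the variable names below are assumptions for illustration, so match them against your copy of the script:

```shell
# Illustrative bin/drill-config.sh fragment: bring the direct memory ceiling
# down from 8G to something the JVM will accept. The variable names here are
# assumptions -- check the actual script; the key change is the value passed
# to -XX:MaxDirectMemorySize.
DRILL_MAX_DIRECT_MEMORY="2G"
DRILL_MAX_HEAP="1G"
export DRILL_JAVA_OPTS="-Xms512M -Xmx$DRILL_MAX_HEAP -XX:MaxDirectMemorySize=$DRILL_MAX_DIRECT_MEMORY"
```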

Edit the file bin/drill-config.sh to bring down the memory usage, then try starting it again:
jason@localhost:~/drill/apache-drill/apache-drill-1.0.0-m1$ ./bin/sqlline -u jdbc:drill:zk=local -n admin -p admin 

Loaded singnal handler: SunSignalHandler
/home/jason/.sqlline/sqlline.properties (No such file or directory)
scan complete in 84ms
20:26:26,643 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.groovy]
20:26:26,644 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml]
20:26:26,645 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Found resource [logback.xml] at [file:/home/jason/drill/apache-drill/apache-drill-1.0.0-m1/conf/logback.xml]
20:26:27,147 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - debug attribute not set
20:26:27,199 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [de.huxhorn.lilith.logback.appender.ClassicMultiplexSocketAppender]
20:26:27,252 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [SOCKET]
20:26:27,413 |-INFO in de.huxhorn.lilith.logback.appender.ClassicMultiplexSocketAppender[SOCKET] - Waiting 1s to establish connections.
20:26:28,414 |-INFO in de.huxhorn.lilith.logback.appender.ClassicMultiplexSocketAppender[SOCKET] - Started de.huxhorn.lilith.logback.appender.ClassicMultiplexSocketAppender[SOCKET]
20:26:28,414 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
20:26:28,421 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [STDOUT]
20:26:28,444 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
20:26:28,706 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - About to instantiate appender of type [ch.qos.logback.core.rolling.RollingFileAppender]
20:26:28,712 |-INFO in ch.qos.logback.core.joran.action.AppenderAction - Naming appender as [FILE]
20:26:28,748 |-INFO in ch.qos.logback.core.rolling.FixedWindowRollingPolicy@17019f7 - No compression will be used
20:26:28,765 |-INFO in ch.qos.logback.core.joran.action.NestedComplexPropertyIA - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property
20:26:28,767 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - Active log file name: /var/log/drill/sqlline.log
20:26:28,767 |-INFO in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - File property is set to [/var/log/drill/sqlline.log]
20:26:28,769 |-ERROR in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - Failed to create parent directories for [/var/log/drill/sqlline.log]
20:26:28,770 |-ERROR in ch.qos.logback.core.rolling.RollingFileAppender[FILE] - openFile(/var/log/drill/sqlline.log,true) call failed. java.io.FileNotFoundException: /var/log/drill/sqlline.log (No such file or directory)
at java.io.FileNotFoundException: /var/log/drill/sqlline.log (No such file or directory)
at at java.io.FileOutputStream.open(Native Method)
at at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
at at ch.qos.logback.core.recovery.ResilientFileOutputStream.<init>(ResilientFileOutputStream.java:28)
at at ch.qos.logback.core.FileAppender.openFile(FileAppender.java:149)
at at ch.qos.logback.core.FileAppender.start(FileAppender.java:108)
at at ch.qos.logback.core.rolling.RollingFileAppender.start(RollingFileAppender.java:86)
at at ch.qos.logback.core.joran.action.AppenderAction.end(AppenderAction.java:96)
at at ch.qos.logback.core.joran.spi.Interpreter.callEndAction(Interpreter.java:317)
at at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:196)
at at ch.qos.logback.core.joran.spi.Interpreter.endElement(Interpreter.java:182)
at at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:62)
at at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:149)
at at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:135)
at at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:99)
at at ch.qos.logback.core.joran.GenericConfigurator.doConfigure(GenericConfigurator.java:49)
at at ch.qos.logback.classic.util.ContextInitializer.configureByResource(ContextInitializer.java:75)
at at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:148)
at at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:85)
at at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:55)
at at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
at at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:107)
at at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:295)
at at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:269)
at at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:281)
at at org.apache.drill.jdbc.DrillHandler.<clinit>(DrillHandler.java:34)
at at org.apache.drill.jdbc.RefDriver.createHandler(RefDriver.java:65)
at at net.hydromatic.optiq.jdbc.UnregisteredDriver.<init>(UnregisteredDriver.java:52)
at at org.apache.drill.jdbc.RefDriver.<init>(RefDriver.java:32)
at at org.apache.drill.jdbc.RefDriver.<clinit>(RefDriver.java:38)
at at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at at java.lang.Class.newInstance(Class.java:374)
at at sqlline.SqlLine.scanDrivers(SqlLine.java:1763)
at at sqlline.SqlLine.scanForDriver(SqlLine.java:1687)
at at sqlline.SqlLine.access$2300(SqlLine.java:58)
at at sqlline.SqlLine$Commands.connect(SqlLine.java:4069)
at at sqlline.SqlLine$Commands.connect(SqlLine.java:4003)
at at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at at java.lang.reflect.Method.invoke(Method.java:606)
at at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2964)
at at sqlline.SqlLine.dispatch(SqlLine.java:878)
at at sqlline.SqlLine.initArgs(SqlLine.java:652)
at at sqlline.SqlLine.begin(SqlLine.java:699)
at at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460)
at at sqlline.SqlLine.main(SqlLine.java:443)
20:26:28,772 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting additivity of logger [org.apache.drill] to false
20:26:28,772 |-INFO in ch.qos.logback.classic.joran.action.LevelAction - org.apache.drill level set to INFO
20:26:28,772 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [FILE] to Logger[org.apache.drill]
20:26:28,774 |-INFO in ch.qos.logback.classic.joran.action.LoggerAction - Setting additivity of logger [org.apache.drill] to false
20:26:28,774 |-INFO in ch.qos.logback.classic.joran.action.LevelAction - org.apache.drill level set to DEBUG
20:26:28,774 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [SOCKET] to Logger[org.apache.drill]
20:26:28,774 |-INFO in ch.qos.logback.classic.joran.action.LevelAction - ROOT level set to ERROR
20:26:28,774 |-INFO in ch.qos.logback.core.joran.action.AppenderRefAction - Attaching appender named [STDOUT] to Logger[ROOT]
20:26:28,774 |-INFO in ch.qos.logback.classic.joran.action.ConfigurationAction - End of configuration.
20:26:28,776 |-INFO in ch.qos.logback.classic.joran.JoranConfigurator@3e459e - Registering current configuration as safe fallback point

scan complete in 7308ms
Connecting to jdbc:drill:zk=local
20:26:33.660 [main] ERROR c.n.c.f.imps.CuratorFrameworkImpl - Background exception was not retry-able or retry gave up
java.net.UnknownHostException: local: Name or service not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method) ~[na:1.7.0_55]
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:901) ~[na:1.7.0_55]
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1293) ~[na:1.7.0_55]
at java.net.InetAddress.getAllByName0(InetAddress.java:1246) ~[na:1.7.0_55]
at java.net.InetAddress.getAllByName(InetAddress.java:1162) ~[na:1.7.0_55]
at java.net.InetAddress.getAllByName(InetAddress.java:1098) ~[na:1.7.0_55]
at org.apache.zookeeper.client.StaticHostProvider.<init>(StaticHostProvider.java:60) ~[zookeeper-3.4.3.jar:3.4.3-1240972]
at org.apache.zookeeper.ZooKeeper.<init>(ZooKeeper.java:440) ~[zookeeper-3.4.3.jar:3.4.3-1240972]
at org.apache.zookeeper.ZooKeeper.<init>(ZooKeeper.java:375) ~[zookeeper-3.4.3.jar:3.4.3-1240972]
at com.netflix.curator.utils.DefaultZookeeperFactory.newZooKeeper(DefaultZookeeperFactory.java:11) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.HandleHolder$1.getZooKeeper(HandleHolder.java:91) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.HandleHolder.getZooKeeper(HandleHolder.java:52) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.ConnectionState.reset(ConnectionState.java:152) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.ConnectionState.start(ConnectionState.java:114) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.CuratorZookeeperClient.start(CuratorZookeeperClient.java:181) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.framework.imps.CuratorFrameworkImpl.start(CuratorFrameworkImpl.java:189) ~[curator-framework-1.1.9.jar:na]
at org.apache.drill.exec.coord.ZKClusterCoordinator.start(ZKClusterCoordinator.java:87) [java-exec-1.0.0-m1-rebuffed.jar:1.0.0-m1]
at org.apache.drill.jdbc.DrillHandler.onConnectionInit(DrillHandler.java:80) [sqlparser-1.0.0-m1.jar:1.0.0-m1]
at net.hydromatic.optiq.jdbc.UnregisteredDriver.connect(UnregisteredDriver.java:127) [optiq-0.4.10.jar:na]
at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4802) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4853) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$Commands.connect(SqlLine.java:4094) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$Commands.connect(SqlLine.java:4003) [sqlline-1.1.0.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_55]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_55]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_55]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_55]
at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2964) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.dispatch(SqlLine.java:878) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.initArgs(SqlLine.java:652) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.begin(SqlLine.java:699) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.main(SqlLine.java:443) [sqlline-1.1.0.jar:na]
20:26:38.720 [main] ERROR com.netflix.curator.ConnectionState - Connection timed out for connection string (local) and timeout (5000) / elapsed (5127)
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss
at com.netflix.curator.ConnectionState.getZooKeeper(ConnectionState.java:94) ~[curator-client-1.1.9.jar:na]
at com.netflix.curator.CuratorZookeeperClient.getZooKeeper(CuratorZookeeperClient.java:106) [curator-client-1.1.9.jar:na]
at com.netflix.curator.framework.imps.CuratorFrameworkImpl.getZooKeeper(CuratorFrameworkImpl.java:393) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.framework.imps.GetChildrenBuilderImpl$3.call(GetChildrenBuilderImpl.java:184) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.framework.imps.GetChildrenBuilderImpl$3.call(GetChildrenBuilderImpl.java:173) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.RetryLoop.callWithRetry(RetryLoop.java:85) [curator-client-1.1.9.jar:na]
at com.netflix.curator.framework.imps.GetChildrenBuilderImpl.pathInForeground(GetChildrenBuilderImpl.java:169) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.framework.imps.GetChildrenBuilderImpl.forPath(GetChildrenBuilderImpl.java:161) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.framework.imps.GetChildrenBuilderImpl.forPath(GetChildrenBuilderImpl.java:36) [curator-framework-1.1.9.jar:na]
at com.netflix.curator.x.discovery.details.ServiceDiscoveryImpl.getChildrenWatched(ServiceDiscoveryImpl.java:306) [curator-x-discovery-1.1.9.jar:na]
at com.netflix.curator.x.discovery.details.ServiceDiscoveryImpl.queryForInstances(ServiceDiscoveryImpl.java:276) [curator-x-discovery-1.1.9.jar:na]
at com.netflix.curator.x.discovery.details.ServiceCache.refresh(ServiceCache.java:193) [curator-x-discovery-1.1.9.jar:na]
at com.netflix.curator.x.discovery.details.ServiceCache.start(ServiceCache.java:116) [curator-x-discovery-1.1.9.jar:na]
at org.apache.drill.exec.coord.ZKClusterCoordinator.start(ZKClusterCoordinator.java:89) [java-exec-1.0.0-m1-rebuffed.jar:1.0.0-m1]
at org.apache.drill.jdbc.DrillHandler.onConnectionInit(DrillHandler.java:80) [sqlparser-1.0.0-m1.jar:1.0.0-m1]
at net.hydromatic.optiq.jdbc.UnregisteredDriver.connect(UnregisteredDriver.java:127) [optiq-0.4.10.jar:na]
at sqlline.SqlLine$DatabaseConnection.connect(SqlLine.java:4802) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$DatabaseConnection.getConnection(SqlLine.java:4853) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$Commands.connect(SqlLine.java:4094) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine$Commands.connect(SqlLine.java:4003) [sqlline-1.1.0.jar:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_55]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_55]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_55]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_55]
at sqlline.SqlLine$ReflectiveCommandHandler.execute(SqlLine.java:2964) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.dispatch(SqlLine.java:878) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.initArgs(SqlLine.java:652) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.begin(SqlLine.java:699) [sqlline-1.1.0.jar:na]
at sqlline.SqlLine.mainWithInputRedirection(SqlLine.java:460) [sqlline-1.1.0.jar:na]
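Two of the errors above look environmental rather than Drill bugs: logback cannot open /var/log/drill/sqlline.log because the directory is missing, and the UnknownHostException suggests this m1 build is resolving the "local" in zk=local as a ZooKeeper hostname instead of using an embedded mode. The missing log directory, at least, is easy to fix (path taken from the error message; creating it under /var/log typically requires root):

```shell
# Create the directory logback expects for sqlline.log (run as root or via
# sudo if /var/log is not writable, then grant your user write access).
mkdir -p /var/log/drill
```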

That is just too many errors and exceptions. As someone just starting to learn Apache Drill, I have decided to stop here for now, because all these exceptions make it prohibitively hard for a beginner. I think the developers should improve the documentation and provide a working example. We will revisit Apache Drill once it graduates from incubation.