Installing Hadoop 2.4 on Ubuntu 14.04


Posted on 15-Jul-2015


akshath@baabte.com
facebook.com/akshath.kumar180
twitter.com/akshath4u
in.linkedin.com/in/akshathkumar

HADOOP 2.4 INSTALLATION ON UBUNTU 14.04

Hadoop is a free, Java-based programming framework that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation.

In this presentation, we'll install a single-node Hadoop cluster backed by the Hadoop Distributed File System (HDFS) on 64-bit Ubuntu 14.04.

STEPS FOR INSTALLING HADOOP

INSTALLING JAVA
ADDING A DEDICATED HADOOP USER
INSTALLING SSH
CREATE AND SETUP SSH CERTIFICATES
INSTALL HADOOP
SETUP CONFIGURATION FILES
FORMAT THE NEW HADOOP FILESYSTEM
STARTING HADOOP
STOPPING HADOOP
HADOOP WEB INTERFACES
USING HADOOP

The Hadoop framework is written in Java. Open a terminal in Ubuntu 14.04 and follow the steps below.

INSTALLING JAVA

ADDING A DEDICATED HADOOP USER

INSTALLING SSH

SSH has two main components:

ssh : the command we use to connect to remote machines (the client).
sshd : the daemon that runs on the server and allows clients to connect to it.

The ssh client is pre-enabled on Linux, but in order to start the sshd daemon we need to install the server side first. Use this command to do that:

$ sudo apt-get install ssh

Hadoop requires SSH access to manage its nodes; we therefore need to configure SSH access to localhost.
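The "Installing Java", "Adding a dedicated Hadoop user" and SSH-certificate slides above list their steps, but the command text did not survive in this transcript. A typical sequence on Ubuntu 14.04 would be the following sketch; the OpenJDK package name, the hduser/hadoop names and the empty key passphrase are assumptions chosen to match the paths and user names used later in the deck, not commands recovered from the original.

```shell
# Install OpenJDK 7 (matches the /usr/lib/jvm/java-7-openjdk-amd64 path used later)
sudo apt-get update
sudo apt-get install -y openjdk-7-jdk

# Add a dedicated Hadoop group and user (hduser:hadoop, as used in later chown commands)
sudo addgroup hadoop
sudo adduser --ingroup hadoop hduser

# As hduser: create an RSA key pair with an empty passphrase, then append the
# public key to the authorized list so Hadoop can ssh without a password prompt
su - hduser
ssh-keygen -t rsa -P ""
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# Verify that passwordless ssh to localhost works
ssh localhost
```

These commands change system state and require root, so run them on the target machine rather than copy-pasting blindly.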
So we need to have SSH up and running on our machine, configured to allow SSH public-key authentication.

CREATE AND SETUP SSH CERTIFICATES

The second command adds the newly created key to the list of authorized keys, so that Hadoop can use ssh without prompting for a password. We can then check that ssh works.

INSTALL HADOOP

Type the following commands in the Ubuntu terminal. We download Hadoop and move the installation to the /usr/local/hadoop directory:

$ wget http://mirrors.sonic.net/apache/hadoop/common/hadoop-2.4.1/hadoop-2.4.1.tar.gz
$ tar xvzf hadoop-2.4.1.tar.gz
$ sudo mv hadoop-2.4.1 /usr/local/hadoop
$ sudo chown -R hduser:hadoop /usr/local/hadoop

$ pwd
/usr/local/hadoop
$ ls
bin etc include lib libexec LICENSE.txt NOTICE.txt README.txt sbin share

SETUP CONFIGURATION FILES

The following files will have to be modified to complete the Hadoop setup:

~/.bashrc
/usr/local/hadoop/etc/hadoop/hadoop-env.sh
/usr/local/hadoop/etc/hadoop/core-site.xml
/usr/local/hadoop/etc/hadoop/mapred-site.xml.template
/usr/local/hadoop/etc/hadoop/hdfs-site.xml

1. ~/.bashrc

First of all, we need to find the path where Java has been installed, in order to set the JAVA_HOME environment variable:

$ update-alternatives --config java
There is only one alternative in link group java (providing /usr/bin/java):
/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
Nothing to configure.

We can then append the Hadoop environment variables to the end of ~/.bashrc.

2. /usr/local/hadoop/etc/hadoop/hadoop-env.sh

We need to set JAVA_HOME by modifying the hadoop-env.sh file:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

Adding the above line to hadoop-env.sh ensures that the value of JAVA_HOME is available to Hadoop whenever it starts up.

3. /usr/local/hadoop/etc/hadoop/core-site.xml
The /usr/local/hadoop/etc/hadoop/core-site.xml file contains configuration properties that Hadoop uses when starting up. This file can be used to override the default settings that Hadoop starts with. First create a directory for Hadoop's temporary files and give hduser ownership of it:

$ sudo mkdir -p /app/hadoop/tmp
$ sudo chown hduser:hadoop /app/hadoop/tmp

Then open the file and enter the required properties between the <configuration> tags.

4. /usr/local/hadoop/etc/hadoop/mapred-site.xml

By default, the /usr/local/hadoop/etc/hadoop/ folder contains a mapred-site.xml.template file, which has to be copied to the name mapred-site.xml:

$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml

The mapred-site.xml file is used to specify which framework is being used for MapReduce. Enter the required property between the <configuration> tags.

5. /usr/local/hadoop/etc/hadoop/hdfs-site.xml

Before editing this file, we need to create two directories which will contain the namenode and the datanode for this Hadoop installation.

Open the file and enter the following content between the <configuration> tags:

<property>
  <name>dfs.replication</name>
  <value>1</value>
  <description>Default block replication. The actual number of replications can be specified when the file is created. The default is used if replication is not specified at create time.</description>
</property>
<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
</property>

FORMAT THE NEW HADOOP FILESYSTEM

Now the Hadoop filesystem needs to be formatted so that we can start to use it.
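The ~/.bashrc additions, the core-site.xml and mapred-site.xml property bodies, and the directory-creation commands for step 5 did not survive in this transcript. The sketch below reconstructs them with values commonly used for a single-node setup; the exact variable list, the 54310 port and the fs.default.name/mapreduce.framework.name choices are assumptions, not text recovered from the original deck.

```shell
# Step 1 (assumed): lines typically appended to ~/.bashrc
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_INSTALL=/usr/local/hadoop
export PATH=$PATH:$HADOOP_INSTALL/bin:$HADOOP_INSTALL/sbin

# Step 3 (assumed): properties entered between core-site.xml's <configuration> tags
#   <property>
#     <name>hadoop.tmp.dir</name>
#     <value>/app/hadoop/tmp</value>
#   </property>
#   <property>
#     <name>fs.default.name</name>
#     <value>hdfs://localhost:54310</value>
#   </property>

# Step 4 (assumed): property entered between mapred-site.xml's <configuration> tags
#   <property>
#     <name>mapreduce.framework.name</name>
#     <value>yarn</value>
#   </property>

# Step 5 (assumed): create the NameNode and DataNode directories that the
# hdfs-site.xml paths point to, and hand them to hduser
sudo mkdir -p /usr/local/hadoop_store/hdfs/namenode
sudo mkdir -p /usr/local/hadoop_store/hdfs/datanode
sudo chown -R hduser:hadoop /usr/local/hadoop_store
```

The directory paths match the file:/usr/local/hadoop_store/hdfs/... values shown in hdfs-site.xml above, so adjust both together if you change the layout.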
The format command should be issued with write permission, since it creates the current directory under the /usr/local/hadoop_store/hdfs/namenode folder:

hduser@k:~$ hadoop namenode -format
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
...
14/07/13 22:13:10 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at k/127.0.1.1
************************************************************/

The hadoop namenode -format command should be executed once, before we start using Hadoop. If it is executed again after Hadoop has been used, it will destroy all the data on the Hadoop file system.

STARTING HADOOP

Now it's time to start the newly installed single-node cluster. We can use start-all.sh, or start-dfs.sh and start-yarn.sh:

hduser@k:/home/k$ start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
14/07/13 23:36:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
...
localhost: starting nodemanager, logging to /usr/local/hadoop/logs/yarn-hduser-nodemanager-k.out

We can check whether it is really up and running:

hduser@k:/home/k$ jps

Another way to check is with netstat:

hduser@k:/home/k$ netstat -plten | grep java

STOPPING HADOOP

The stop scripts live in the sbin directory of the Hadoop installation:

$ pwd
/usr/local/hadoop/sbin
$ ls
distribute-exclude.sh httpfs.sh start-all.sh ... start-secure-dns.sh stop-balancer.sh stop-yarn.sh

We run stop-all.sh, or stop-dfs.sh and stop-yarn.sh, to stop all the daemons running on our machine:

$ /usr/local/hadoop/sbin/stop-all.sh

Thank You...

Looking to learn more about the above topic? Email info@baabtra.com or visit baabtra.com.