
Published: 2013-04-22 16:01:35    Author: rapoo

Hadoop 0.23.6 Installation in Practice, Part 1: Single-Node Development Setup
hadoop@ubuntu:/opt/hadoop/etc/hadoop$ vi core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:12200</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop/hadoop-root</value>
  </property>
  <property>
    <name>fs.arionfs.impl</name>
    <value>org.apache.hadoop.fs.pvfs2.Pvfs2FileSystem</value>
    <description>The FileSystem for arionfs.</description>
  </property>
</configuration>

hadoop@ubuntu:/opt/hadoop/etc/hadoop$ vi hdfs-site.xml

<configuration>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/opt/hadoop/data/dfs/name</value>
    <final>true</final>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/opt/hadoop/data/dfs/data</value>
    <final>true</final>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
  </property>
</configuration>

hadoop@ubuntu:/opt/hadoop/etc/hadoop$ vi mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.job.tracker</name>
    <value>hdfs://localhost:9001</value>
    <final>true</final>
  </property>
  <property>
    <name>mapreduce.map.memory.mb</name>
    <value>1536</value>
  </property>
  <property>
    <name>mapreduce.map.java.opts</name>
    <value>-Xmx1024M</value>
  </property>
  <property>
    <name>mapreduce.reduce.memory.mb</name>
    <value>3072</value>
  </property>
  <property>
    <name>mapreduce.reduce.java.opts</name>
    <value>-Xmx2560M</value>
  </property>
  <property>
    <name>mapreduce.task.io.sort.mb</name>
    <value>512</value>
  </property>
  <property>
    <name>mapreduce.task.io.sort.factor</name>
    <value>100</value>
  </property>
  <property>
    <name>mapreduce.reduce.shuffle.parallelcopies</name>
    <value>50</value>
  </property>
  <property>
    <name>mapreduce.system.dir</name>
    <value>file:/opt/hadoop/data/mapred/system</value>
  </property>
  <property>
    <name>mapreduce.local.dir</name>
    <value>file:/opt/hadoop/data/mapred/local</value>
    <final>true</final>
  </property>
</configuration>

hadoop@ubuntu:/opt/hadoop/etc/hadoop$ vi yarn-site.xml

<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce.shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>user.name</name>
    <value>hadoop</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>localhost:54311</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>localhost:54312</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>localhost:54313</value>
  </property>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>localhost:54314</value>
  </property>
  <property>
    <name>yarn.web-proxy.address</name>
    <value>localhost:54315</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost</value>
  </property>
</configuration>

This completes the configuration. The localhost entries above can be replaced with the machine's IP address or hostname; the meaning of the individual properties will be covered in follow-up articles.
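Before starting any daemons it is worth confirming that the four XML files are at least well-formed. A minimal sketch, assuming the xmllint tool from libxml2 is installed (it is not part of Hadoop):

hadoop@ubuntu:/opt/hadoop/etc/hadoop$ for f in core-site.xml hdfs-site.xml mapred-site.xml yarn-site.xml; do xmllint --noout "$f" && echo "$f: OK"; done   # xmllint prints a parse error and returns non-zero for malformed XML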

7. Start Hadoop and run the wordcount program

1. Set JAVA_HOME

hadoop@ubuntu:/opt/hadoop$ vi libexec/hadoop-config.sh

Add export JAVA_HOME=/usr/lib/jvm/java-7-sun before the following block:

if [[ -z $JAVA_HOME ]]; then
  # On OSX use java_home (or /Library for older versions)
  if [ "Darwin" == "$(uname -s)" ]; then
    if [ -x /usr/libexec/java_home ]; then
      export JAVA_HOME=($(/usr/libexec/java_home))
    else
      export JAVA_HOME=(/Library/Java/Home)
    fi
  fi

  # Bail if we did not detect it
  if [[ -z $JAVA_HOME ]]; then
    echo "Error: JAVA_HOME is not set and could not be found." 1>&2
    exit 1
  fi
fi
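If you are unsure where the JDK actually lives (the java-7-sun path above is just the path used in this tutorial), a quick way to check on Ubuntu before editing the script:

hadoop@ubuntu:/opt/hadoop$ readlink -f "$(which java)"   # resolves the java symlink, e.g. /usr/lib/jvm/java-7-sun/jre/bin/java
hadoop@ubuntu:/opt/hadoop$ ls /usr/lib/jvm/              # lists the installed JDK directories

JAVA_HOME should point at the JDK root, i.e. the directory above jre/bin/java in the resolved path.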
2. Format the NameNode


hadoop@ubuntu:/opt/hadoop$ hadoop namenode -format
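If the format succeeded, the name directory configured in hdfs-site.xml should now contain the initial metadata. A quick check (the path is the dfs.namenode.name.dir value set above):

hadoop@ubuntu:/opt/hadoop$ ls data/dfs/name/current   # expect files such as VERSION and an initial fsimage/edits pair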
3. Start the daemons


hadoop@ubuntu:/opt/hadoop/sbin$ ./start-dfs.sh
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/hadoop-0.23.6/logs/hadoop-hadoop-namenode-ubuntu.out
localhost: starting datanode, logging to /opt/hadoop-0.23.6/logs/hadoop-hadoop-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop-0.23.6/logs/hadoop-hadoop-secondarynamenode-ubuntu.out
hadoop@ubuntu:/opt/hadoop/sbin$ ./start-yarn.sh
starting yarn daemons
starting resourcemanager, logging to /opt/hadoop-0.23.6/logs/yarn-hadoop-resourcemanager-ubuntu.out
localhost: starting nodemanager, logging to /opt/hadoop-0.23.6/logs/yarn-hadoop-nodemanager-ubuntu.out
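The matching stop scripts live in the same sbin directory; to shut everything down later:

hadoop@ubuntu:/opt/hadoop/sbin$ ./stop-yarn.sh   # stops the ResourceManager and NodeManager
hadoop@ubuntu:/opt/hadoop/sbin$ ./stop-dfs.sh    # stops the NameNode, DataNode and SecondaryNameNode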
4. Check that the daemons started successfully


hadoop@ubuntu:/opt/hadoop/sbin$ jps
5036 DataNode
5246 SecondaryNameNode
5543 NodeManager
5369 ResourceManager
4852 NameNode
5816 Jps
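Besides jps, the web interfaces can be probed. The first address is the yarn.resourcemanager.webapp.address configured above; 50070 is assumed here to be the NameNode HTTP port, since dfs.namenode.http-address was left at its default:

hadoop@ubuntu:/opt/hadoop/sbin$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:54313/   # ResourceManager web UI
hadoop@ubuntu:/opt/hadoop/sbin$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:50070/   # NameNode web UI (assumed default port)

An HTTP 200 from each indicates the daemon is serving its UI.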
5. Run the wordcount example


1) Prepare the input data

Create a small text file:

hadoop@ubuntu:/opt/hadoop$ cat tmp/test.txt
a c b a b d f f e b a c c d g i s a b c d e a b f g e i k m m n a b d g h i j a k j e
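Should a larger input be wanted for later experiments, one quick way to generate it with standard shell tools (the file name, size, and character set here are arbitrary choices, not part of the original walkthrough):

hadoop@ubuntu:/opt/hadoop$ tr -dc 'a-n ' < /dev/urandom | head -c 100000 > tmp/big-test.txt   # ~100 KB of random space-separated letters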

2) Upload the file to HDFS

hadoop@ubuntu:/opt/hadoop$ hadoop fs -mkdir /test
hadoop@ubuntu:/opt/hadoop$ hadoop fs -copyFromLocal tmp/test.txt /test
hadoop@ubuntu:/opt/hadoop$ hadoop fs -ls /test
Found 1 items
-rw-r--r--   1 hadoop supergroup         86 2013-04-18 07:47 /test/test.txt
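To confirm the upload landed intact, cat the file back from HDFS and compare it with the local copy:

hadoop@ubuntu:/opt/hadoop$ hadoop fs -cat /test/test.txt   # should print the same line as tmp/test.txt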

3) Run the program

hadoop@ubuntu:/opt/hadoop$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.6.jar wordcount /test/test.txt /test/out    # /test/out is the output directory
13/04/18 22:41:11 INFO input.FileInputFormat: Total input paths to process : 1
13/04/18 22:41:11 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/04/18 22:41:11 WARN snappy.LoadSnappy: Snappy native library not loaded
13/04/18 22:41:12 INFO mapreduce.JobSubmitter: number of splits:1
13/04/18 22:41:12 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
13/04/18 22:41:12 WARN conf.Configuration: mapred.output.value.class is deprecated. Instead, use mapreduce.job.output.value.class
13/04/18 22:41:12 WARN conf.Configuration: mapreduce.combine.class is deprecated. Instead, use mapreduce.job.combine.class
13/04/18 22:41:12 WARN conf.Configuration: mapreduce.map.class is deprecated. Instead, use mapreduce.job.map.class
13/04/18 22:41:12 WARN conf.Configuration: mapred.job.name is deprecated. Instead, use mapreduce.job.name
13/04/18 22:41:12 WARN conf.Configuration: mapreduce.reduce.class is deprecated. Instead, use mapreduce.job.reduce.class
13/04/18 22:41:12 WARN conf.Configuration: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
13/04/18 22:41:12 WARN conf.Configuration: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
13/04/18 22:41:12 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
13/04/18 22:41:12 WARN conf.Configuration: mapred.output.key.class is deprecated. Instead, use mapreduce.job.output.key.class
13/04/18 22:41:12 WARN conf.Configuration: mapred.working.dir is deprecated. Instead, use mapreduce.job.working.dir
13/04/18 22:41:13 INFO mapred.ResourceMgrDelegate: Submitted application application_1366295287642_0001 to ResourceManager at localhost/127.0.0.1:54311
13/04/18 22:41:13 INFO mapreduce.Job: The url to track the job: http://localhost:54315/proxy/application_1366295287642_0001/
13/04/18 22:41:13 INFO mapreduce.Job: Running job: job_1366295287642_0001
13/04/18 22:41:21 INFO mapreduce.Job: Job job_1366295287642_0001 running in uber mode : false
13/04/18 22:41:21 INFO mapreduce.Job:  map 0% reduce 0%
13/04/18 22:41:36 INFO mapreduce.Job:  map 100% reduce 0%
13/04/18 22:41:36 INFO mapreduce.Job: Task Id : attempt_1366295287642_0001_m_000000_0, Status : FAILED
Killed by external signal
13/04/18 22:41:37 INFO mapreduce.Job:  map 0% reduce 0%
13/04/18 22:42:11 INFO mapreduce.Job:  map 100% reduce 0%
13/04/18 22:42:26 INFO mapreduce.Job:  map 100% reduce 100%
13/04/18 22:42:26 INFO mapreduce.Job: Job job_1366295287642_0001 completed successfully
13/04/18 22:42:27 INFO mapreduce.Job: Counters: 45
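Note that MapReduce will not overwrite an existing output directory, so re-running the example with the same /test/out path fails until it is removed, for instance (assuming the -rm -r form of the fs shell; older releases use -rmr):

hadoop@ubuntu:/opt/hadoop$ hadoop fs -rm -r /test/out   # delete the previous output directory before re-running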

4) View the results

hadoop@ubuntu:/opt/hadoop$ hadoop fs -ls /test
Found 2 items
drwxr-xr-x   - hadoop supergroup          0 2013-04-18 22:42 /test/out
-rw-r--r--   1 hadoop supergroup         86 2013-04-18 07:47 /test/test.txt
hadoop@ubuntu:/opt/hadoop$ hadoop fs -ls /test/out
Found 2 items
-rw-r--r--   1 hadoop supergroup          0 2013-04-18 22:42 /test/out/_SUCCESS
-rw-r--r--   1 hadoop supergroup         56 2013-04-18 22:42 /test/out/part-r-00000
hadoop@ubuntu:/opt/hadoop$ hadoop fs -cat /test/out/part-r-00000
13/04/18 22:45:25 INFO util.NativeCodeLoader: Loaded the native-hadoop library
a	7
b	6
c	4
d	4
e	4
f	3
g	3
h	1
i	3
j	2
k	2
m	2
n	1
s	1
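As a sanity check, the same frequencies can be recomputed locally from the input file with plain coreutils; the totals should match the part-r-00000 output, formatting aside:

hadoop@ubuntu:/opt/hadoop$ tr ' ' '\n' < tmp/test.txt | sort | uniq -c | sort -k2   # prints e.g. "7 a", "6 b", ...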
