<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <title>Hadoop on ZRJ | Study Notes</title>
        <link>https://blog.zrj.me/tags/hadoop/</link>
        <description>Recent content in Hadoop on ZRJ | Study Notes</description>
        <generator>Hugo -- gohugo.io</generator>
        <language>zh-CN</language>
        <lastBuildDate>Fri, 22 Apr 2016 00:13:58 +0800</lastBuildDate><atom:link href="https://blog.zrj.me/tags/hadoop/index.xml" rel="self" type="application/rss+xml" /><item>
        <title>Understanding the hadoop yarn hdfs concepts</title>
        <link>https://blog.zrj.me/posts/2016-04-22-hadoop-yarn-hdfs-%E6%A6%82%E5%BF%B5%E7%9A%84%E7%90%86%E8%A7%A3/</link>
        <pubDate>Fri, 22 Apr 2016 00:13:58 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2016-04-22-hadoop-yarn-hdfs-%E6%A6%82%E5%BF%B5%E7%9A%84%E7%90%86%E8%A7%A3/</guid>
        <description>&lt;p&gt;There are a few articles that explain this quite well. See &lt;a class=&#34;link&#34; href=&#34;https://www.ibm.com/developerworks/cn/opensource/os-cn-hadoop-yarn/&#34;  title=&#34;https://www.ibm.com/developerworks/cn/opensource/os-cn-hadoop-yarn/&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;https://www.ibm.com/developerworks/cn/opensource/os-cn-hadoop-yarn/&lt;/a&gt;, from IBM, which explains the origins and evolution of yarn, from theory through examples&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://www.lai18.com/content/1103036.html&#34;  title=&#34;http://www.lai18.com/content/1103036.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.lai18.com/content/1103036.html&lt;/a&gt;, which gives an overview of the historical evolution of the compute frameworks&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://www.cnblogs.com/LeftNotEasy/archive/2012/02/18/why-yarn.html&#34;  title=&#34;http://www.cnblogs.com/LeftNotEasy/archive/2012/02/18/why-yarn.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.cnblogs.com/LeftNotEasy/archive/2012/02/18/why-yarn.html&lt;/a&gt;, which helps explain some of the background that led to yarn&lt;/p&gt;
&lt;p&gt;=================&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://www.cnblogs.com/esingchan/p/3917094.html&#34;  title=&#34;http://www.cnblogs.com/esingchan/p/3917094.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.cnblogs.com/esingchan/p/3917094.html&lt;/a&gt;, which uses a wordcount example to walk through the old mapreduce workflow, explaining the map, sort, partition, combine, shuffle, merge, and reduce stages and concepts in detail&lt;/p&gt;
</description>
        </item>
        <item>
        <title>Hadoop HBase Miscellaneous Notes</title>
        <link>https://blog.zrj.me/posts/2016-03-25-hadoop-hbase-%E7%A2%8E%E7%A2%8E%E5%BF%B5/</link>
        <pubDate>Fri, 25 Mar 2016 22:08:12 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2016-03-25-hadoop-hbase-%E7%A2%8E%E7%A2%8E%E5%BF%B5/</guid>
        <description>&lt;p&gt;Small notes will keep being appended here&lt;/p&gt;
&lt;p&gt;2016-3-25 22:09:24&lt;/p&gt;
&lt;p&gt;Restart or add a node: $bin/hadoop-daemon.sh start datanode&lt;/p&gt;
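&lt;p&gt;As a quick sanity check (my addition, assuming the stock hdfs CLI is on the path), the node started above should show up in the datanode report:&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;$bin/hdfs dfsadmin -report
&lt;/code&gt;&lt;/pre&gt;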
&lt;p&gt;-&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;-&lt;/p&gt;
&lt;p&gt;2016-3-25 22:09:31&lt;/p&gt;
&lt;p&gt;Had the same problem with 2.6.0, and shamouda&amp;rsquo;s answer solved it (I was not using dfs.hosts at all, so that could not be the answer). I did add&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;&amp;lt;property&amp;gt;
  &amp;lt;name&amp;gt;dfs.namenode.datanode.registration.ip-hostname-check&amp;lt;/name&amp;gt;
  &amp;lt;value&amp;gt;false&amp;lt;/value&amp;gt;
&amp;lt;/property&amp;gt;
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;to hdfs-site.xml and that was enough to fix the issue.&lt;/p&gt;
&lt;p&gt;-&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;-&lt;/p&gt;
&lt;p&gt;2016-3-25 22:11:55&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;$bin/hbase-daemon.sh stop regionserver
$bin/hbase-daemon.sh start regionserver
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;-&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;ndash;&lt;/p&gt;
&lt;p&gt;2016-3-27 16:54:22&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;disable 'kline_minute'
describe 'kline_minute'
alter 'kline_minute', NAME=&amp;gt;'kline', COMPRESSION=&amp;gt;'gz'
describe 'kline_minute'
enable 'kline_minute'
major_compact 'kline_minute'
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Enabling compression&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://www.searchtb.com/2011/01/understanding-hbase.html&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.searchtb.com/2011/01/understanding-hbase.html&lt;/a&gt;, which explains how the files are stored; &lt;a class=&#34;link&#34; href=&#34;http://stackoverflow.com/questions/18656483/data-size-increases-in-hbase&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://stackoverflow.com/questions/18656483/data-size-increases-in-hbase&lt;/a&gt;, which has a worked example of the size calculation&lt;/p&gt;
&lt;p&gt;-&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;ndash;&lt;/p&gt;
&lt;p&gt;2016-3-27 16:56:14&lt;/p&gt;
&lt;p&gt;Restart a single regionserver&lt;/p&gt;
&lt;p&gt;bin/graceful_stop.sh --restart --reload --debug nodename&lt;/p&gt;
&lt;p&gt;This restarts the regionserver process gracefully, with no impact on the service: it first migrates every region on that regionserver to other servers, restarts the process, and finally migrates the original regions back. When we change a configuration, we can restart each machine this way. Note that this command disables the balancer, so at the end we must run balance_switch true in the hbase shell. To restart a regionserver, never kill the process directly, as that causes an interruption lasting up to zookeeper.session.timeout; and do not restart it via bin/hbase-daemon.sh stop regionserver either, because if you are unlucky and the -ROOT- or .META. table lives on that server, all requests will fail.&lt;/p&gt;
&lt;p&gt;Stop and decommission a regionserver&lt;/p&gt;
&lt;p&gt;bin/graceful_stop.sh --stop nodename&lt;/p&gt;
&lt;p&gt;As above, the system migrates all regions off the node before shutting it down, then stops the process; again we must manually run balance_switch true afterwards to re-enable the master's region balancing.&lt;/p&gt;
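&lt;p&gt;Putting the notes above together, a rolling restart of one regionserver looks roughly like this (nodename is a placeholder for the host; piping into the hbase shell is just one way to run balance_switch, not something from the original notes):&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;# graceful restart: regions are moved off and then moved back automatically
bin/graceful_stop.sh --restart --reload --debug nodename
# graceful_stop.sh turns the balancer off, so re-enable it afterwards
echo 'balance_switch true' | bin/hbase shell
&lt;/code&gt;&lt;/pre&gt;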
</description>
        </item>
        <item>
        <title>hadoop 2.5.2 Cluster Installation</title>
        <link>https://blog.zrj.me/posts/2016-03-18-hadoop-2-5-2-%E9%9B%86%E7%BE%A4%E5%AE%89%E8%A3%85/</link>
        <pubDate>Fri, 18 Mar 2016 16:40:52 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2016-03-18-hadoop-2-5-2-%E9%9B%86%E7%BE%A4%E5%AE%89%E8%A3%85/</guid>
        <description>&lt;p&gt;I wrote one of these before, &lt;a class=&#34;link&#34; href=&#34;http://zrj.me/archives/888&#34;  title=&#34;http://zrj.me/archives/888&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://zrj.me/archives/888&lt;/a&gt;, but this is now 2.5.2; things have changed, and the installation steps are somewhat different.&lt;/p&gt;
&lt;p&gt;In short, the main steps are:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Install the JDK&lt;/li&gt;
&lt;li&gt;Set up passwordless SSH&lt;/li&gt;
&lt;li&gt;Disable the firewall&lt;/li&gt;
&lt;li&gt;Configure core-site.xml, mapred-site.xml, and hdfs-site.xml; yarn-site.xml is newly added&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Following these articles: &lt;a class=&#34;link&#34; href=&#34;http://blog.csdn.net/tang9140/article/details/42869531&#34;  title=&#34;http://blog.csdn.net/tang9140/article/details/42869531&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://blog.csdn.net/tang9140/article/details/42869531&lt;/a&gt; and &lt;a class=&#34;link&#34; href=&#34;http://blog.csdn.net/greensurfer/article/details/39450369&#34;  title=&#34;http://blog.csdn.net/greensurfer/article/details/39450369&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://blog.csdn.net/greensurfer/article/details/39450369&lt;/a&gt;, the configuration files used are as follows:&lt;/p&gt;
&lt;p&gt;core-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;hadoop.tmp.dir&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;/home/hadoop/tmp&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;description&amp;gt;&lt;/span&gt;Abase for other temporary directories.&lt;span class=&#34;nt&#34;&gt;&amp;lt;/description&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;fs.defaultFS&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;hdfs://10.0.1.100:9000&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;io.file.buffer.size&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;4096&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;hdfs-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.nameservices&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;hadoop-cluster1&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.namenode.secondary.http-address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:50090&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.namenode.name.dir&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;/home/hadoop/dfs/name&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.datanode.data.dir&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;/home/hadoop/dfs/data&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.replication&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;2&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.webhdfs.enabled&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;true&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;mapred-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;mapreduce.framework.name&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;yarn&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;mapreduce.jobtracker.http.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:50030&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;mapreduce.jobhistory.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:10020&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;mapreduce.jobhistory.webapp.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:19888&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;yarn-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c&#34;&gt;&amp;lt;!-- Site specific YARN configuration properties --&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.nodemanager.aux-services&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;mapreduce_shuffle&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.resourcemanager.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:8032&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.resourcemanager.scheduler.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:8030&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.resourcemanager.resource-tracker.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:8031&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.resourcemanager.admin.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:8033&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;yarn.resourcemanager.webapp.address&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;        &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;10.0.1.100:8088&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;  
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Finally, configure the slaves file; there is no longer a masters file&lt;/p&gt;
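&lt;p&gt;For reference, the slaves file just lists one datanode host per line; with the master at 10.0.1.100 it might look like this (these two slave IPs are made-up examples for illustration):&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;10.0.1.101
10.0.1.102
&lt;/code&gt;&lt;/pre&gt;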
&lt;p&gt;Then format the namenode; the output is as follows&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;zrj@FWSVRLPT01:~/hadoop-2.5.2$ ./bin/hdfs namenode -format
16/03/18 16:22:31 INFO namenode.NameNode: STARTUP_MSG: 
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = FWSVRLPT01/10.16.18.206
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.5.2
STARTUP_MSG:   classpath = /home/zrj/hadoop-2.5.2/etc/hadoop:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-configuration-1.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jasper-compiler-5.5.23.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/activation-1.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jsp-api-2.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-io-2.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/paranamer-2.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/httpclient-4.2.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/log4j-1.2.17.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jets3t-0.9.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/zookeeper-3.4.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jsr305-1.3.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/hadoop-auth-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-el-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jettison-1.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jersey-server-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/avro-1.7.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-codec-1.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-cli-1.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-net-3.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jetty-util-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/netty-3.6.2.Final.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-digester-1.8.jar:/home/zrj/had
oop-2.5.2/share/hadoop/common/lib/hadoop-annotations-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/guava-11.0.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-compress-1.4.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jsch-0.1.42.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jersey-core-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/xz-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-httpclient-3.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/stax-api-1.0-2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/asm-3.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-logging-1.1.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jersey-json-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/slf4j-api-1.7.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-collections-3.2.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-math3-3.1.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jetty-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/hamcrest-core-1.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/commons-lang-2.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/junit-4.11.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/mockito-all-1.8.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/servlet-api-2.5.jar:/home/z
rj/hadoop-2.5.2/share/hadoop/common/lib/httpcore-4.2.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/xmlenc-0.52.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/lib/jasper-runtime-5.5.23.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/hadoop-common-2.5.2-tests.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/hadoop-nfs-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/common/hadoop-common-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jsp-api-2.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-io-2.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jsr305-1.3.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-el-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/guava-11.0.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/asm-3.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/l
ib/jasper-runtime-5.5.23.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/hadoop-hdfs-2.5.2-tests.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/hadoop-hdfs-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/hdfs/hadoop-hdfs-nfs-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jline-0.9.94.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/activation-1.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-io-2.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/log4j-1.2.17.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jsr305-1.3.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jettison-1.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jersey-server-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-codec-1.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-cli-1.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jersey-client-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/aopalliance-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/guava-11.0.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jersey-core-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/guice-3.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/xz-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-httpclient-3.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/asm-3.2.jar:/home/zrj/ha
doop-2.5.2/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jersey-json-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-collections-3.2.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jetty-6.1.26.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/commons-lang-2.6.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/servlet-api-2.5.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/lib/javax.inject-1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-tests-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-api-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-common-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-client-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/yarn/hadoop-yarn-common-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoo
p/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/hadoop-annotations-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/guice-3.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/xz-1.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/asm-3.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/junit-4.11.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/lib/javax.inject-1.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.5.2-tests.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-co
mmon-2.5.2.jar:/home/zrj/hadoop-2.5.2/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.5.2.jar:/home/zrj/hadoop-2.5.2/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r cc72e9b000545b86b75a61f4835eb86d57bfafc0; compiled by &amp;#39;jenkins&amp;#39; on 2014-11-14T23:45Z
STARTUP_MSG:   java = 1.7.0_72
************************************************************/
16/03/18 16:22:31 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
16/03/18 16:22:31 INFO namenode.NameNode: createNameNode [-format]
Formatting using clusterid: CID-98edc835-228b-479b-b912-be8db8cc32ad
16/03/18 16:22:31 INFO namenode.FSNamesystem: fsLock is fair:true
16/03/18 16:22:31 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
16/03/18 16:22:31 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
16/03/18 16:22:31 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
16/03/18 16:22:31 INFO blockmanagement.BlockManager: The block deletion will start around 2016 Mar 18 16:22:31
16/03/18 16:22:31 INFO util.GSet: Computing capacity for map BlocksMap
16/03/18 16:22:31 INFO util.GSet: VM type       = 64-bit
16/03/18 16:22:31 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
16/03/18 16:22:31 INFO util.GSet: capacity      = 2^21 = 2097152 entries
16/03/18 16:22:31 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
16/03/18 16:22:31 INFO blockmanagement.BlockManager: defaultReplication         = 2
16/03/18 16:22:31 INFO blockmanagement.BlockManager: maxReplication             = 512
16/03/18 16:22:31 INFO blockmanagement.BlockManager: minReplication             = 1
16/03/18 16:22:31 INFO blockmanagement.BlockManager: maxReplicationStreams      = 2
16/03/18 16:22:31 INFO blockmanagement.BlockManager: shouldCheckForEnoughRacks  = false
16/03/18 16:22:31 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
16/03/18 16:22:31 INFO blockmanagement.BlockManager: encryptDataTransfer        = false
16/03/18 16:22:31 INFO blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
16/03/18 16:22:31 INFO namenode.FSNamesystem: fsOwner             = zrj (auth:SIMPLE)
16/03/18 16:22:31 INFO namenode.FSNamesystem: supergroup          = supergroup
16/03/18 16:22:31 INFO namenode.FSNamesystem: isPermissionEnabled = true
16/03/18 16:22:31 INFO namenode.FSNamesystem: HA Enabled: false
16/03/18 16:22:31 INFO namenode.FSNamesystem: Append Enabled: true
16/03/18 16:22:31 INFO util.GSet: Computing capacity for map INodeMap
16/03/18 16:22:31 INFO util.GSet: VM type       = 64-bit
16/03/18 16:22:31 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
16/03/18 16:22:31 INFO util.GSet: capacity      = 2^20 = 1048576 entries
16/03/18 16:22:31 INFO namenode.NameNode: Caching file names occuring more than 10 times
16/03/18 16:22:31 INFO util.GSet: Computing capacity for map cachedBlocks
16/03/18 16:22:31 INFO util.GSet: VM type       = 64-bit
16/03/18 16:22:31 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
16/03/18 16:22:31 INFO util.GSet: capacity      = 2^18 = 262144 entries
16/03/18 16:22:31 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
16/03/18 16:22:31 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
16/03/18 16:22:31 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
16/03/18 16:22:31 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
16/03/18 16:22:31 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
16/03/18 16:22:31 INFO util.GSet: Computing capacity for map NameNodeRetryCache
16/03/18 16:22:31 INFO util.GSet: VM type       = 64-bit
16/03/18 16:22:31 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
16/03/18 16:22:31 INFO util.GSet: capacity      = 2^15 = 32768 entries
16/03/18 16:22:31 INFO namenode.NNConf: ACLs enabled? false
16/03/18 16:22:31 INFO namenode.NNConf: XAttrs enabled? true
16/03/18 16:22:31 INFO namenode.NNConf: Maximum size of an xattr: 16384
16/03/18 16:22:31 INFO namenode.FSImage: Allocated new BlockPoolId: BP-667685565-10.16.18.206-1458289351864
16/03/18 16:22:31 INFO common.Storage: Storage directory /home/zrj/hadoop-2.5.2/dfs/name has been successfully formatted.
16/03/18 16:22:32 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid &amp;gt;= 0
16/03/18 16:22:32 INFO util.ExitUtil: Exiting with status 0
16/03/18 16:22:32 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at FWSVRLPT01/10.16.18.206
************************************************************/
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Remember to add the JAVA_HOME setting to both hadoop-env.sh and yarn-env.sh&lt;/p&gt;
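As a concrete sketch of that step (the JDK path below is an assumption for illustration; substitute the location of your own installation), the line to add to both files looks like:

```shell
# Add to etc/hadoop/hadoop-env.sh AND etc/hadoop/yarn-env.sh.
# The path here is only an example -- point it at your actual JDK.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
```

Setting it in both files matters because the HDFS and YARN start scripts source their environment separately and do not always inherit JAVA_HOME from the login shell.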
&lt;p&gt;Start DFS&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;zrj@FWSVRLPT01:~/hadoop-2.5.2$ ./sbin/start-dfs.sh 
Starting namenodes on [FWSVRLPT01]
FWSVRLPT01: starting namenode, logging to /home/zrj/hadoop-2.5.2/logs/hadoop-zrj-namenode-FWSVRLPT01.out
FWSVRLPT02: starting datanode, logging to /home/zrj/hadoop-2.5.2/logs/hadoop-zrj-datanode-FWSVRLPT02.out
qtdata: starting datanode, logging to /home/zrj/hadoop-2.5.2/logs/hadoop-zrj-datanode-qtdata.out
Starting secondary namenodes [FWSVRLPT01]
FWSVRLPT01: starting secondarynamenode, logging to /home/zrj/hadoop-2.5.2/logs/hadoop-zrj-secondarynamenode-FWSVRLPT01.out
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Start YARN&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;zrj@FWSVRLPT01:~/hadoop-2.5.2$ ./sbin/start-yarn.sh 
starting yarn daemons
starting resourcemanager, logging to /home/zrj/hadoop-2.5.2/logs/yarn-zrj-resourcemanager-FWSVRLPT01.out
FWSVRLPT02: starting nodemanager, logging to /home/zrj/hadoop-2.5.2/logs/yarn-zrj-nodemanager-FWSVRLPT02.out
qtdata: starting nodemanager, logging to /home/zrj/hadoop-2.5.2/logs/yarn-zrj-nodemanager-qtdata.out
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Then open http://10.0.1.100:50070/ and http://10.0.1.100:8088/ in a browser&lt;/p&gt;
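Before reaching for the browser, a quick sanity check (a sketch; which processes appear depends on the node you run it on) is the JDK's jps tool:

```shell
# On the master node, jps should list NameNode, SecondaryNameNode and
# ResourceManager; on the worker nodes, DataNode and NodeManager.
jps
```

If any of these daemons is missing, the corresponding .out/.log file under logs/ (as printed by the start scripts above) is the place to look.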
</description>
        </item>
        <item>
        <title>WordCount - the HelloWorld of Hadoop</title>
        <link>https://blog.zrj.me/posts/2013-06-20-wordcount-hadoop-%E7%9A%84-helloworld/</link>
        <pubDate>Thu, 20 Jun 2013 18:14:54 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2013-06-20-wordcount-hadoop-%E7%9A%84-helloworld/</guid>
        <description>&lt;p&gt;Following the official tutorial at &lt;a class=&#34;link&#34; href=&#34;http://hadoop.apache.org/docs/stable/mapred_tutorial.html&#34;  title=&#34;http://hadoop.apache.org/docs/stable/mapred_tutorial.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://hadoop.apache.org/docs/stable/mapred_tutorial.html&lt;/a&gt;, I set out to write the HelloWorld program of Hadoop: WordCount&lt;/p&gt;
&lt;p&gt;First, prepare the code as follows:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-java&#34; data-lang=&#34;java&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;package&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.myorg&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;	
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;java.io.IOException&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;java.util.*&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.apache.hadoop.fs.Path&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.apache.hadoop.conf.*&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.apache.hadoop.io.*&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.apache.hadoop.mapred.*&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kn&#34;&gt;import&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nn&#34;&gt;org.apache.hadoop.util.*&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;class&lt;/span&gt; &lt;span class=&#34;nc&#34;&gt;WordCount&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;static&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;class&lt;/span&gt; &lt;span class=&#34;nc&#34;&gt;Map&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;extends&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;MapReduceBase&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;implements&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Mapper&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LongWritable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;private&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;final&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;static&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;one&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;private&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;word&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;();&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kt&#34;&gt;void&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;map&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;LongWritable&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;key&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;OutputCollector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;output&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Reporter&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;reporter&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;throws&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IOException&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;String&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;line&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;value&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;toString&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;();&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;StringTokenizer&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tokenizer&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;StringTokenizer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;line&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;while&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tokenizer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;hasMoreTokens&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;())&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;          &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;word&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;set&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;tokenizer&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;nextToken&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;());&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;          &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;output&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;collect&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;word&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;one&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;static&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;class&lt;/span&gt; &lt;span class=&#34;nc&#34;&gt;Reduce&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;extends&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;MapReduceBase&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;implements&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Reducer&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kt&#34;&gt;void&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;reduce&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;key&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Iterator&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;values&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;OutputCollector&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;lt;&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;&amp;gt;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;output&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Reporter&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;reporter&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;throws&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IOException&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; 
&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;kt&#34;&gt;int&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;sum&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;while&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;values&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;hasNext&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;())&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;          &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;sum&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;+=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;values&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;next&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;().&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;get&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;();&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;        &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;output&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;collect&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;key&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;sum&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;));&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;public&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;static&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kt&#34;&gt;void&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;nf&#34;&gt;main&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;String&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;[]&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;args&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;)&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;kd&#34;&gt;throws&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Exception&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;{&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;JobConf&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;JobConf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;WordCount&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setJobName&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;s&#34;&gt;&amp;#34;wordcount&amp;#34;&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setOutputKeyClass&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Text&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setOutputValueClass&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;IntWritable&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setMapperClass&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Map&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setCombinerClass&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Reduce&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setReducerClass&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Reduce&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setInputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;TextInputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setOutputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;TextOutputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;class&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;FileInputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setInputPaths&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Path&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;args&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;0&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;));&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;FileOutputFormat&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;setOutputPath&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;,&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;k&#34;&gt;new&lt;/span&gt;&lt;span class=&#34;w&#34;&gt; &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;Path&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;args&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;1&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;));&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;      &lt;/span&gt;&lt;span class=&#34;n&#34;&gt;JobClient&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;.&lt;/span&gt;&lt;span class=&#34;na&#34;&gt;runJob&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;(&lt;/span&gt;&lt;span class=&#34;n&#34;&gt;conf&lt;/span&gt;&lt;span class=&#34;p&#34;&gt;);&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;w&#34;&gt;    &lt;/span&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;p&#34;&gt;}&lt;/span&gt;&lt;span class=&#34;w&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then compile and package it&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ mkdir wordcount_classes 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ javac -classpath &lt;span class=&#34;si&#34;&gt;${&lt;/span&gt;&lt;span class=&#34;nv&#34;&gt;HADOOP_HOME&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;/hadoop-&lt;span class=&#34;si&#34;&gt;${&lt;/span&gt;&lt;span class=&#34;nv&#34;&gt;HADOOP_VERSION&lt;/span&gt;&lt;span class=&#34;si&#34;&gt;}&lt;/span&gt;-core.jar -d wordcount_classes WordCount.java 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ jar -cvf /usr/joe/wordcount.jar -C wordcount_classes/ .
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After compiling, you can see these three class files&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;root@cluster-1:~/workspace/hadoop-1.1.2/wordcount_classes/org/myorg# ll
total 20
drwxr-xr-x 2 root root 4096 2013-06-20 18:00 ./
drwxr-xr-x 3 root root 4096 2013-06-20 18:00 ../
-rw-r--r-- 1 root root 1546 2013-06-20 18:00 WordCount.class
-rw-r--r-- 1 root root 1938 2013-06-20 18:00 WordCount$Map.class
-rw-r--r-- 1 root root 1611 2013-06-20 18:00 WordCount$Reduce.class
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Packaging produces wordcount.jar&lt;/p&gt;
&lt;p&gt;If you previously ran the official example, clear out the old input and output directories first&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -rmr input
Deleted hdfs://cluster-1:9000/user/root/input
root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -rmr output
Deleted hdfs://cluster-1:9000/user/root/output
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;Then prepare the input files as described in the tutorial&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Sample text-files as input:&lt;/p&gt;
&lt;p&gt;$ bin/hadoop dfs -ls /usr/joe/wordcount/input/&lt;/p&gt;
&lt;p&gt;/usr/joe/wordcount/input/file01&lt;/p&gt;
&lt;p&gt;/usr/joe/wordcount/input/file02&lt;/p&gt;
&lt;p&gt;$ bin/hadoop dfs -cat /usr/joe/wordcount/input/file01&lt;/p&gt;
&lt;p&gt;Hello World Bye World&lt;/p&gt;
&lt;p&gt;$ bin/hadoop dfs -cat /usr/joe/wordcount/input/file02&lt;/p&gt;
&lt;p&gt;Hello Hadoop Goodbye Hadoop&lt;/p&gt;
&lt;/blockquote&gt;
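&lt;p&gt;With these two files as input, the expected counts can be sanity-checked locally before touching the cluster; word counting is equivalent to this small Unix pipeline (a sketch, not part of the original tutorial):&lt;/p&gt;

```shell
# Emulate WordCount on the sample inputs: split on spaces, count duplicates,
# and print in the same "word TAB count" format the Hadoop job produces.
printf 'Hello World Bye World\nHello Hadoop Goodbye Hadoop\n' \
  | tr ' ' '\n' | sort | uniq -c | awk '{print $2 "\t" $1}'
```

&lt;p&gt;This prints the same five lines (Bye 1, Goodbye 1, Hadoop 2, Hello 2, World 2) that the job is expected to produce.&lt;/p&gt;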
&lt;p&gt;Then run the job&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;$ bin/hadoop jar wordcount.jar org.myorg.WordCount /user/root/input /user/root/output
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;and check the result&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;bin/hadoop fs -cat output/*
Bye	1
Goodbye	1
Hadoop	2
Hello	2
World	2
&lt;/code&gt;&lt;/pre&gt;</description>
        </item>
        <item>
        <title>Hadoop 1.1.2 安装（集群版）</title>
        <link>https://blog.zrj.me/posts/2013-06-19-hadoop-1-1-2-%E5%AE%89%E8%A3%85%E9%9B%86%E7%BE%A4%E7%89%88/</link>
        <pubDate>Wed, 19 Jun 2013 11:10:29 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2013-06-19-hadoop-1-1-2-%E5%AE%89%E8%A3%85%E9%9B%86%E7%BE%A4%E7%89%88/</guid>
        <description>&lt;p&gt;First, install the required JDK; see &lt;a class=&#34;link&#34; href=&#34;http://os.51cto.com/art/201003/189114.htm&#34;  title=&#34;http://os.51cto.com/art/201003/189114.htm&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://os.51cto.com/art/201003/189114.htm&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;In a terminal, change to the directory where you saved jdk-6u12-linux-i586.bin; in my case that is /home/liangshihong&lt;/p&gt;
&lt;p&gt;$ sudo -s ./jdk-6u12-linux-i586.bin&lt;/p&gt;
&lt;p&gt;Press Enter through the prompts until asked whether to install, then type yes and press Enter&lt;/p&gt;
&lt;p&gt;Once installation finishes, configure the environment variables&lt;/p&gt;
&lt;p&gt;Configure the classpath by editing the environment variables for all users&lt;/p&gt;
&lt;p&gt;$ sudo gedit /etc/profile&lt;/p&gt;
&lt;p&gt;Append the following at the end of the file&lt;/p&gt;
&lt;p&gt;#set java environment&lt;/p&gt;
&lt;p&gt;JAVA_HOME=/home/liangshihong/jdk1.6.0_12&lt;/p&gt;
&lt;p&gt;export JRE_HOME=/home/liangshihong/jdk1.6.0_12/jre&lt;/p&gt;
&lt;p&gt;export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH&lt;/p&gt;
&lt;p&gt;export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH&lt;/p&gt;
&lt;p&gt;Restart the machine and check the JDK version with&lt;/p&gt;
&lt;p&gt;java -version&lt;/p&gt;
&lt;p&gt;If output like the following appears, the installation succeeded&lt;/p&gt;
&lt;p&gt;java version &amp;ldquo;1.6.0_12&amp;rdquo;&lt;/p&gt;
&lt;p&gt;Java(TM) SE Runtime Environment (build 1.6.0_12-b04)&lt;/p&gt;
&lt;p&gt;Java HotSpot(TM) Server VM (build 11.2-b01, mixed mode)&lt;/p&gt;
&lt;p&gt;liangshihong@liangshihong-Imagine:~$&lt;/p&gt;
&lt;/blockquote&gt;
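&lt;p&gt;The environment-variable lines quoted above can be collected into one snippet; a sketch (the JDK path is the quoted tutorial author's, substitute your own; note the quoted text sets JAVA_HOME without export, which is presumably an oversight, so all four variables are exported here):&lt;/p&gt;

```shell
# set java environment (append to /etc/profile, then run: source /etc/profile)
export JAVA_HOME=/home/liangshihong/jdk1.6.0_12
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
echo "$JAVA_HOME"
```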
&lt;p&gt;You also need to edit /etc/hosts&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;10.241.32.32 cluster-1
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;10.241.158.17 cluster-2
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;10.241.158.171 cluster-3
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Next, set up passwordless SSH within the cluster. Since only the master needs to SSH into each node, and the nodes do not need to reach one another, this requirement can be met as follows&lt;/p&gt;
&lt;p&gt;On the master, run&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ ssh-keygen -t dsa -P &lt;span class=&#34;s1&#34;&gt;&amp;#39;&amp;#39;&lt;/span&gt; -f ~/.ssh/id_dsa
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ cat ~/.ssh/id_dsa.pub &amp;gt;&amp;gt; ~/.ssh/authorized_keys
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;to generate a key pair; then copy the authorized_keys file to each node&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;scp /root/.ssh/authorized_keys cluster-2:~/.ssh/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;scp /root/.ssh/authorized_keys cluster-3:~/.ssh/
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;A quick tip: after editing /etc/profile, run source /etc/profile to apply the changes.&lt;/p&gt;
&lt;p&gt;Test it; passwordless access between the machines should now work. Next, configure Hadoop, mainly following this tutorial, &lt;a class=&#34;link&#34; href=&#34;http://blog.csdn.net/hguisu/article/details/7237395&#34;  title=&#34;http://blog.csdn.net/hguisu/article/details/7237395&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://blog.csdn.net/hguisu/article/details/7237395&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;3. Cluster configuration (identical on all nodes). 3.1 Configuration file: conf/core-site.xml&lt;/p&gt;
&lt;p&gt;fs.default.name = hdfs://node1:49000&lt;/p&gt;
&lt;p&gt;hadoop.tmp.dir = /home/hadoop/hadoop_home/var&lt;/p&gt;
&lt;p&gt;1) fs.default.name is the URI of the NameNode, in the form hdfs://hostname:port/. 2) hadoop.tmp.dir is the default temporary path of Hadoop; it is best to set it explicitly. If a DataNode mysteriously fails to start after you add nodes or in other situations, delete the tmp directory under this path. Note that if you delete this directory on the NameNode machine, you will have to re-run the NameNode format command. 3.2 Configuration file: conf/mapred-site.xml&lt;/p&gt;
&lt;p&gt;mapred.job.tracker = node1:49001&lt;/p&gt;
&lt;p&gt;mapred.local.dir = /home/hadoop/hadoop_home/var&lt;/p&gt;
&lt;p&gt;1) mapred.job.tracker is the host (or IP) and port of the JobTracker, in host:port form.&lt;/p&gt;
&lt;p&gt;3.3 Configuration file: conf/hdfs-site.xml&lt;/p&gt;
&lt;p&gt;dfs.name.dir = /home/hadoop/name1, /home/hadoop/name2 (the name directory paths)&lt;/p&gt;
&lt;p&gt;dfs.data.dir = /home/hadoop/data1, /home/hadoop/data2&lt;/p&gt;
&lt;p&gt;dfs.replication = 2&lt;/p&gt;
&lt;p&gt;1) dfs.name.dir is the local filesystem path where the NameNode persistently stores the namespace and transaction logs. If the value is a comma-separated list of directories, the name table is replicated to all of them for redundancy. 2) dfs.data.dir is a comma-separated list of local filesystem paths where a DataNode stores its blocks; data is spread across all listed directories, usually on different devices. 3) dfs.replication is the number of replicas to keep for each block; the default is 3, and a value larger than the number of machines in the cluster causes errors. Note: the name1, name2, data1, and data2 directories must not be created in advance; Hadoop creates them automatically during formatting, and pre-creating them actually causes problems.&lt;/p&gt;
&lt;p&gt;3.4 Configure the master and slave nodes via conf/masters and conf/slaves. It is best to use hostnames, make sure the machines can reach one another by hostname, and put one hostname per line.&lt;/p&gt;
&lt;p&gt;vi masters, and enter:&lt;/p&gt;
&lt;p&gt;node1&lt;/p&gt;
&lt;p&gt;vi slaves, and enter:&lt;/p&gt;
&lt;p&gt;node2 node3&lt;/p&gt;
&lt;p&gt;When configuration is done, copy the configured hadoop directory to the other machines in the cluster, making sure the settings above are correct for each machine; for example, if Java is installed at a different path on another machine, edit its conf/hadoop-env.sh accordingly&lt;/p&gt;
&lt;p&gt;$ scp -r /home/hadoop/hadoop-0.20.203 root@node2:/home/hadoop/&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;In my case, conf/core-site.xml looks like this&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml version=&amp;#34;1.0&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml-stylesheet type=&amp;#34;text/xsl&amp;#34; href=&amp;#34;configuration.xsl&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c&#34;&gt;&amp;lt;!-- Put site-specific property overrides in this file. --&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;fs.default.name&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;hdfs://cluster-1:9000&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;conf/mapred-site.xml is as follows&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml version=&amp;#34;1.0&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml-stylesheet type=&amp;#34;text/xsl&amp;#34; href=&amp;#34;configuration.xsl&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c&#34;&gt;&amp;lt;!-- Put site-specific property overrides in this file. --&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;mapred.job.tracker&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;cluster-1:9001&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;conf/hdfs-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml version=&amp;#34;1.0&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml-stylesheet type=&amp;#34;text/xsl&amp;#34; href=&amp;#34;configuration.xsl&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c&#34;&gt;&amp;lt;!-- Put site-specific property overrides in this file. --&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.replication&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;2&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Next, configure the master and slave nodes. Edit conf/masters&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-1
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;and conf/slaves&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-2
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-3
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then copy the configuration files to the other machines&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;scp conf/core-site.xml conf/mapred-site.xml conf/hdfs-site.xml conf/masters conf/slaves cluster-2:~/workspace/hadoop-1.1.2/conf
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;scp conf/core-site.xml conf/mapred-site.xml conf/hdfs-site.xml conf/masters conf/slaves cluster-3:~/workspace/hadoop-1.1.2/conf
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Format the namenode&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop namenode -format
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:55 INFO namenode.NameNode: STARTUP_MSG:
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;/************************************************************
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG: Starting NameNode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;host&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; cluster-1/10.241.32.32
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;args&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;-format&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;version&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 1.1.2
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;build&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt; compiled by &lt;span class=&#34;s1&#34;&gt;&amp;#39;hortonfo&amp;#39;&lt;/span&gt; on Thu Jan &lt;span class=&#34;m&#34;&gt;31&lt;/span&gt; 02:03:24 UTC &lt;span class=&#34;m&#34;&gt;2013&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;************************************************************/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO util.GSet: VM &lt;span class=&#34;nb&#34;&gt;type&lt;/span&gt;       &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 64-bit
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO util.GSet: 2% max &lt;span class=&#34;nv&#34;&gt;memory&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 19.33375 MB
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO util.GSet: &lt;span class=&#34;nv&#34;&gt;capacity&lt;/span&gt;      &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 2^21 &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;2097152&lt;/span&gt; entries
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO util.GSet: &lt;span class=&#34;nv&#34;&gt;recommended&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;2097152, &lt;span class=&#34;nv&#34;&gt;actual&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2097152&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;fsOwner&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;root
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;supergroup&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;supergroup
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;isPermissionEnabled&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;true&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.FSNamesystem: dfs.block.invalidate.limit&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;100&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;isAccessTokenEnabled&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;false&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;accessKeyUpdateInterval&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; min&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;s&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;, &lt;span class=&#34;nv&#34;&gt;accessTokenLifetime&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; min&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;s&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:56 INFO namenode.NameNode: Caching file names occuring more than &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;times&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:57 INFO common.Storage: Image file of size &lt;span class=&#34;m&#34;&gt;110&lt;/span&gt; saved in &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; seconds.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:57 INFO namenode.FSEditLog: closing edit log: &lt;span class=&#34;nv&#34;&gt;position&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;4, &lt;span class=&#34;nv&#34;&gt;editlog&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;/tmp/hadoop-root/dfs/name/current/edits
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:57 INFO namenode.FSEditLog: close success: truncate to 4, &lt;span class=&#34;nv&#34;&gt;editlog&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;/tmp/hadoop-root/dfs/name/current/edits
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:57 INFO common.Storage: Storage directory /tmp/hadoop-root/dfs/name has been successfully formatted.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 10:48:57 INFO namenode.NameNode: SHUTDOWN_MSG:
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;/************************************************************
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;SHUTDOWN_MSG: Shutting down NameNode at cluster-1/10.241.32.32
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;************************************************************/
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;If formatting fails, you can try this command to clear out the stale state first (note that it deletes all existing HDFS metadata and data under /tmp):&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# rm -rf /tmp/hadoop-root*
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After formatting, start the cluster. Since our namenode and jobtracker live on the same machine here, we can start everything with a single script:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/start-all.sh
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;starting namenode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-namenode-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-2: starting datanode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-datanode-cluster-2.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-3: starting datanode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-datanode-cluster-3.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-1: starting secondarynamenode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-secondarynamenode-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;starting jobtracker, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-jobtracker-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-2: starting tasktracker, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-tasktracker-cluster-2.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;cluster-3: starting tasktracker, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-tasktracker-cluster-3.out
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Because we did not set dfs.data.dir when configuring conf/hdfs-site.xml, each datanode stores its data under /tmp, as we can see on cluster-2 and cluster-3:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-2:/tmp# ll
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;total &lt;span class=&#34;m&#34;&gt;32&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxrwxrwt  &lt;span class=&#34;m&#34;&gt;6&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-19 10:57 ./
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxr-xr-x &lt;span class=&#34;m&#34;&gt;21&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-18 17:03 ../
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxr-xr-x  &lt;span class=&#34;m&#34;&gt;4&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-19 10:52 hadoop-root/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--  &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; root root    &lt;span class=&#34;m&#34;&gt;5&lt;/span&gt; 2013-06-19 10:57 hadoop-root-datanode.pid
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--  &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; root root    &lt;span class=&#34;m&#34;&gt;5&lt;/span&gt; 2013-06-19 10:57 hadoop-root-tasktracker.pid
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-3:/tmp# ll
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;total &lt;span class=&#34;m&#34;&gt;32&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxrwxrwt  &lt;span class=&#34;m&#34;&gt;6&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-19 10:57 ./
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxr-xr-x &lt;span class=&#34;m&#34;&gt;21&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-18 17:18 ../
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;drwxr-xr-x  &lt;span class=&#34;m&#34;&gt;4&lt;/span&gt; root root &lt;span class=&#34;m&#34;&gt;4096&lt;/span&gt; 2013-06-19 10:52 hadoop-root/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--  &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; root root    &lt;span class=&#34;m&#34;&gt;5&lt;/span&gt; 2013-06-19 10:57 hadoop-root-datanode.pid
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--  &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt; root root    &lt;span class=&#34;m&#34;&gt;5&lt;/span&gt; 2013-06-19 10:57 hadoop-root-tasktracker.pid
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;The state of the cluster can likewise be viewed through the web UIs (in Hadoop 1.x the NameNode listens on port 50070 and the JobTracker on port 50030 by default):&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://zrj.me/wp-content/uploads/2013/06/QQ%e6%88%aa%e5%9b%be20130619110443.png&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://blog.zrj.me/images/QQ%e6%88%aa%e5%9b%be20130619110443.png&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;QQ截图20130619110443&#34;
	
	
&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://zrj.me/wp-content/uploads/2013/06/QQ%e6%88%aa%e5%9b%be20130619110502.png&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://blog.zrj.me/images/QQ%e6%88%aa%e5%9b%be20130619110502.png&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;QQ截图20130619110502&#34;
	
	
&gt;&lt;/a&gt;&lt;/p&gt;
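&lt;p&gt;As noted above, leaving dfs.data.dir unset means each datanode keeps its blocks under /tmp, which does not survive a reboot. A minimal sketch of pinning the data directory to a persistent path in conf/hdfs-site.xml (the path /data/hadoop/dfs/data is only an illustrative example, not from this setup):&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&amp;lt;!-- goes inside the &amp;lt;configuration&amp;gt; element of conf/hdfs-site.xml --&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&amp;lt;property&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &amp;lt;name&amp;gt;dfs.data.dir&amp;lt;/name&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;  &amp;lt;value&amp;gt;/data/hadoop/dfs/data&amp;lt;/value&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&amp;lt;/property&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After changing this, the datanodes need a restart, and the directory must exist and be writable by the user running the daemons.&lt;/p&gt;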
&lt;p&gt;Next, run the example from the official tutorial, &lt;a class=&#34;link&#34; href=&#34;http://hadoop.apache.org/docs/stable/single_node_setup.html&#34;  title=&#34;http://hadoop.apache.org/docs/stable/single_node_setup.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://hadoop.apache.org/docs/stable/single_node_setup.html&lt;/a&gt;&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Copy the input files into the distributed filesystem: $ bin/hadoop fs -put conf input&lt;/p&gt;
&lt;p&gt;Run some of the examples provided: $ bin/hadoop jar hadoop-examples-*.jar grep input output &amp;lsquo;dfs[a-z.]+&amp;rsquo;&lt;/p&gt;
&lt;p&gt;Examine the output files:&lt;/p&gt;
&lt;p&gt;Copy the output files from the distributed filesystem to the local filesytem and examine them: $ bin/hadoop fs -get output output $ cat output/*&lt;/p&gt;
&lt;p&gt;or&lt;/p&gt;
&lt;p&gt;View the output files on the distributed filesystem: $ bin/hadoop fs -cat output/*&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;First, copy the input files into the distributed filesystem:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -put conf input
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -ls input
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Found &lt;span class=&#34;m&#34;&gt;16&lt;/span&gt; items
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;7457&lt;/span&gt; 2013-06-19 11:14 /user/root/input/capacity-scheduler.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;535&lt;/span&gt; 2013-06-19 11:14 /user/root/input/configuration.xsl
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;294&lt;/span&gt; 2013-06-19 11:14 /user/root/input/core-site.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;327&lt;/span&gt; 2013-06-19 11:14 /user/root/input/fair-scheduler.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;2240&lt;/span&gt; 2013-06-19 11:14 /user/root/input/hadoop-env.sh
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;1488&lt;/span&gt; 2013-06-19 11:14 /user/root/input/hadoop-metrics2.properties
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;4644&lt;/span&gt; 2013-06-19 11:14 /user/root/input/hadoop-policy.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;274&lt;/span&gt; 2013-06-19 11:14 /user/root/input/hdfs-site.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;4441&lt;/span&gt; 2013-06-19 11:14 /user/root/input/log4j.properties
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;2033&lt;/span&gt; 2013-06-19 11:14 /user/root/input/mapred-queue-acls.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;290&lt;/span&gt; 2013-06-19 11:14 /user/root/input/mapred-site.xml
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup         &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt; 2013-06-19 11:14 /user/root/input/masters
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup         &lt;span class=&#34;m&#34;&gt;20&lt;/span&gt; 2013-06-19 11:14 /user/root/input/slaves
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;1243&lt;/span&gt; 2013-06-19 11:14 /user/root/input/ssl-client.xml.example
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup       &lt;span class=&#34;m&#34;&gt;1195&lt;/span&gt; 2013-06-19 11:14 /user/root/input/ssl-server.xml.example
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;-rw-r--r--   &lt;span class=&#34;m&#34;&gt;2&lt;/span&gt; root supergroup        &lt;span class=&#34;m&#34;&gt;382&lt;/span&gt; 2013-06-19 11:14 /user/root/input/taskcontroller.cfg
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then run the example job:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop jar hadoop-examples-*.jar grep input output &lt;span class=&#34;s1&#34;&gt;&amp;#39;dfs[a-z.]+&amp;#39;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:14:49 INFO util.NativeCodeLoader: Loaded the native-hadoop library
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:14:49 WARN snappy.LoadSnappy: Snappy native library not loaded
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:14:49 INFO mapred.FileInputFormat: Total input paths to process : &lt;span class=&#34;m&#34;&gt;16&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:14:50 INFO mapred.JobClient: Running job: job_201306191057_0001
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:14:51 INFO mapred.JobClient:  map 0% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:01 INFO mapred.JobClient:  map 12% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:02 INFO mapred.JobClient:  map 25% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:08 INFO mapred.JobClient:  map 37% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:09 INFO mapred.JobClient:  map 50% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:14 INFO mapred.JobClient:  map 62% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:17 INFO mapred.JobClient:  map 75% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:19 INFO mapred.JobClient:  map 87% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:20 INFO mapred.JobClient:  map 87% reduce 20%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:26 INFO mapred.JobClient:  map 100% reduce 29%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:33 INFO mapred.JobClient:  map 100% reduce 100%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient: Job complete: job_201306191057_0001
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient: Counters: &lt;span class=&#34;m&#34;&gt;30&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:   Job Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Launched reduce &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_MAPS&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;110582&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all reduces waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all maps waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Launched map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;16&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Data-local map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;16&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_REDUCES&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;31899&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:   File Input Format Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Bytes &lt;span class=&#34;nv&#34;&gt;Read&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;26873&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:   File Output Format Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Bytes &lt;span class=&#34;nv&#34;&gt;Written&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;180&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:   FileSystemCounters
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;FILE_BYTES_READ&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;82&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;HDFS_BYTES_READ&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;28595&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;FILE_BYTES_WRITTEN&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;866947&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;HDFS_BYTES_WRITTEN&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;180&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:   Map-Reduce Framework
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Map output materialized &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;172&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Map input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;759&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Reduce shuffle &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;172&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Spilled &lt;span class=&#34;nv&#34;&gt;Records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;6&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Map output &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;70&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Total committed heap usage &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2703933440&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     CPU &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;5950&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Map input &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;26873&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SPLIT_RAW_BYTES&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1722&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Combine input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Reduce input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Reduce input &lt;span class=&#34;nv&#34;&gt;groups&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Combine output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Physical memory &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;snapshot&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2599989248&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Reduce output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Virtual memory &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;snapshot&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;7510061056&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient:     Map output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.FileInputFormat: Total input paths to process : &lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:36 INFO mapred.JobClient: Running job: job_201306191057_0002
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:37 INFO mapred.JobClient:  map 0% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:44 INFO mapred.JobClient:  map 100% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:52 INFO mapred.JobClient:  map 100% reduce 33%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:54 INFO mapred.JobClient:  map 100% reduce 100%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient: Job complete: job_201306191057_0002
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient: Counters: &lt;span class=&#34;m&#34;&gt;30&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:   Job Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Launched reduce &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_MAPS&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;7370&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all reduces waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all maps waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Launched map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Data-local map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_REDUCES&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;9828&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:   File Input Format Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Bytes &lt;span class=&#34;nv&#34;&gt;Read&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;180&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:   File Output Format Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Bytes &lt;span class=&#34;nv&#34;&gt;Written&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;52&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:   FileSystemCounters
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;FILE_BYTES_READ&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;82&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;HDFS_BYTES_READ&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;296&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;FILE_BYTES_WRITTEN&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;100437&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;HDFS_BYTES_WRITTEN&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;52&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:   Map-Reduce Framework
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Map output materialized &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;82&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Map input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Reduce shuffle &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;82&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Spilled &lt;span class=&#34;nv&#34;&gt;Records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;6&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Map output &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;70&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Total committed heap usage &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;210501632&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     CPU &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1090&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Map input &lt;span class=&#34;nv&#34;&gt;bytes&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;94&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SPLIT_RAW_BYTES&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;116&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Combine input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Reduce input &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Reduce input &lt;span class=&#34;nv&#34;&gt;groups&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Combine output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Physical memory &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;snapshot&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;227721216&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Reduce output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Virtual memory &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;bytes&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;snapshot&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;898035712&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 11:15:55 INFO mapred.JobClient:     Map output &lt;span class=&#34;nv&#34;&gt;records&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;3&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;查看结果&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -cat output/*
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.replication
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.server.namenode.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfsadmin
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;至此，Hadoop 的集群版本环境搭建完成&lt;/p&gt;
&lt;p&gt;附上一个配置文件说明的文档，&lt;a class=&#34;link&#34; href=&#34;http://www.cnblogs.com/serendipity/archive/2011/08/23/2151031.html&#34;  title=&#34;http://www.cnblogs.com/serendipity/archive/2011/08/23/2151031.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.cnblogs.com/serendipity/archive/2011/08/23/2151031.html&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;-&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;&amp;mdash;-&lt;/p&gt;
&lt;p&gt;2013-06-19 16:35:52 update 修改了配置文件 conf/hdfs-site.xml&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-xml&#34; data-lang=&#34;xml&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml version=&amp;#34;1.0&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;cp&#34;&gt;&amp;lt;?xml-stylesheet type=&amp;#34;text/xsl&amp;#34; href=&amp;#34;configuration.xsl&amp;#34;?&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c&#34;&gt;&amp;lt;!-- Put site-specific property overrides in this file. --&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.name.dir&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;/var/local/hadoop/name&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.data.dir&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;/var/local/hadoop/data&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;name&amp;gt;&lt;/span&gt;dfs.replication&lt;span class=&#34;nt&#34;&gt;&amp;lt;/name&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;         &lt;span class=&#34;nt&#34;&gt;&amp;lt;value&amp;gt;&lt;/span&gt;2&lt;span class=&#34;nt&#34;&gt;&amp;lt;/value&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;     &lt;span class=&#34;nt&#34;&gt;&amp;lt;/property&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nt&#34;&gt;&amp;lt;/configuration&amp;gt;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;然后重新格式化&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop namenode -format
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:38 INFO namenode.NameNode: STARTUP_MSG: 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;/************************************************************
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG: Starting NameNode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;host&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; cluster-1/10.241.32.32
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;args&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;-format&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;version&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 1.1.2
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;STARTUP_MSG:   &lt;span class=&#34;nv&#34;&gt;build&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.1 -r 1440782&lt;span class=&#34;p&#34;&gt;;&lt;/span&gt; compiled by &lt;span class=&#34;s1&#34;&gt;&amp;#39;hortonfo&amp;#39;&lt;/span&gt; on Thu Jan &lt;span class=&#34;m&#34;&gt;31&lt;/span&gt; 02:03:24 UTC &lt;span class=&#34;m&#34;&gt;2013&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;************************************************************/
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:38 INFO util.GSet: VM &lt;span class=&#34;nb&#34;&gt;type&lt;/span&gt;       &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 64-bit
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:38 INFO util.GSet: 2% max &lt;span class=&#34;nv&#34;&gt;memory&lt;/span&gt; &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 19.33375 MB
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:38 INFO util.GSet: &lt;span class=&#34;nv&#34;&gt;capacity&lt;/span&gt;      &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; 2^21 &lt;span class=&#34;o&#34;&gt;=&lt;/span&gt; &lt;span class=&#34;m&#34;&gt;2097152&lt;/span&gt; entries
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:38 INFO util.GSet: &lt;span class=&#34;nv&#34;&gt;recommended&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;2097152, &lt;span class=&#34;nv&#34;&gt;actual&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2097152&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;fsOwner&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;root
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;supergroup&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;supergroup
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;isPermissionEnabled&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;true&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.FSNamesystem: dfs.block.invalidate.limit&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;100&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.FSNamesystem: &lt;span class=&#34;nv&#34;&gt;isAccessTokenEnabled&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;nb&#34;&gt;false&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;accessKeyUpdateInterval&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; min&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;s&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;, &lt;span class=&#34;nv&#34;&gt;accessTokenLifetime&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; min&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;s&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO namenode.NameNode: Caching file names occuring more than &lt;span class=&#34;m&#34;&gt;10&lt;/span&gt; &lt;span class=&#34;nb&#34;&gt;times&lt;/span&gt; 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:39 INFO common.Storage: Image file of size &lt;span class=&#34;m&#34;&gt;110&lt;/span&gt; saved in &lt;span class=&#34;m&#34;&gt;0&lt;/span&gt; seconds.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:40 INFO namenode.FSEditLog: closing edit log: &lt;span class=&#34;nv&#34;&gt;position&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;4, &lt;span class=&#34;nv&#34;&gt;editlog&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;/var/local/hadoop/name/current/edits
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:40 INFO namenode.FSEditLog: close success: truncate to 4, &lt;span class=&#34;nv&#34;&gt;editlog&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;/var/local/hadoop/name/current/edits
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:40 INFO common.Storage: Storage directory /var/local/hadoop/name has been successfully formatted.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/19 16:33:40 INFO namenode.NameNode: SHUTDOWN_MSG: 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;/************************************************************
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;SHUTDOWN_MSG: Shutting down NameNode at cluster-1/10.241.32.32
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;************************************************************/
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;格式化之后，会在 master 的 /var/local 下面自动建立 hadoop/name&lt;/p&gt;
&lt;p&gt;然后启动集群，启动之后，会自动在各个 node 的 /var/local 下面建立 hadoop/data&lt;/p&gt;
&lt;p&gt;于是例程的输出也有了变化&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/hadoop fs -cat output/*
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.data.dir
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.name.dir
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.replication
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfs.server.namenode.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfsadmin
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;hr&gt;
&lt;h2 id=&#34;历史评论&#34;&gt;历史评论
&lt;/h2&gt;&lt;p&gt;&lt;strong&gt;ZRJ&lt;/strong&gt; (2013-08-21 17:12:10):&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;在namenode启动脚本%HADOOP_HOME%/bin/start-dfs.sh的时候发现datanode报错：
Error: JAVA_HOME is not set
原因是在%HADOOP_HOME%/conf/hadoop-env.sh内缺少JAVA_HOME的定义，只需要在hadoop-env.sh中增加：
JAVA_HOME=/your/jdk/root/path&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;strong&gt;ZRJ&lt;/strong&gt; (2013-08-21 20:41:24):&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;周一发现hadoop集群down掉了
发现由于磁盘已满100%
删除无用文件后重启集群，发现还是起不来，错误如下：
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = SFserver141.localdomain/192.168.15.141
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.20.3-SNAPSHOT
STARTUP_MSG:   build =  -r ; compiled by &amp;lsquo;root&amp;rsquo; on Wed Jun  8 12:43:33 CST 2011
************************************************************/
2012-10-22 08:50:42,096 INFO org.apache.hadoop.ipc.metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000
2012-10-22 08:50:42,104 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Namenode up at: SFserver141.localdomain/192.168.15.141:9000
2012-10-22 08:50:42,112 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null
2012-10-22 08:50:42,113 INFO org.apache.hadoop.hdfs.server.namenode.metrics.NameNodeMetrics: Initializing NameNodeMeterics using context object:org.apache.hadoop.metrics.spi.NullContext
2012-10-22 08:50:42,169 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner=root,root,bin,daemon,sys,adm,disk,wheel
2012-10-22 08:50:42,169 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroupsupergroup=supergroup
2012-10-22 08:50:42,169 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled=false
2012-10-22 08:50:42,187 INFO org.apache.hadoop.hdfs.server.namenode.metrics.FSNamesystemMetrics: Initializing FSNamesystemMetrics using context object:org.apache.hadoop.metrics.spi.NullContext
2012-10-22 08:50:42,188 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Registered FSNamesystemStatusMBean
2012-10-22 08:50:42,248 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files = 799968
2012-10-22 08:50:47,535 INFO org.apache.hadoop.hdfs.server.common.Storage: Number of files under construction = 13
2012-10-22 08:50:47,540 INFO org.apache.hadoop.hdfs.server.common.Storage: Image file of size 102734547 loaded in 5 seconds.
2012-10-22 08:50:48,131 INFO org.apache.hadoop.hdfs.server.common.Storage: Edits file /data/java/hadoop020/data/dfs.name.dir/current/edits of size 2749136 edits # 17772 loaded in 0 seconds.
2012-10-22 08:50:48,801 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.lang.NumberFormatException: For input string: &amp;quot;&amp;quot;
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
at java.lang.Integer.parseInt(Integer.java:470)
at java.lang.Short.parseShort(Short.java:120)
at java.lang.Short.parseShort(Short.java:78)
at org.apache.hadoop.hdfs.server.namenode.FSEditLog.readShort(FSEditLog.java:1311)
at org.apache.hadoop.hdfs.server.namenode.FSEditLog.loadFSEdits(FSEditLog.java:541)
at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSEdits(FSImage.java:1011)
at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:826)
at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:364)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:87)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:315)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.&lt;init&gt;(FSNamesystem.java:296)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:205)
at org.apache.hadoop.hdfs.server.namenode.NameNode.&lt;init&gt;(NameNode.java:283)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:986)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:995)&lt;/p&gt;
&lt;p&gt;2012-10-22 08:50:48,802 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:&lt;br&gt;
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at SFserver141.localdomain/192.168.15.141
************************************************************/
大致是因为edits这个文件出现问题；
上网查了不少文档，但由于没有设置secondarynamenode；所以没有edits的镜像文件
之后发现一篇文章写：
printf &amp;ldquo;\xff\xff\xff\xee\xff&amp;rdquo; &amp;gt; edits
把上面一段字符串写到edits文件中
重启正常
注：dfs.name.dir/current文件夹下还出现了edits.new的文件，我是删除的 不知道有没有影响
本文出自 “工作笔记” 博客，请务必保留此出处http://693340562.blog.51cto.com/1125757/1033582&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;strong&gt;hadoop 2.5.2 集群安装 | ZRJ&lt;/strong&gt; (2016-03-18 16:40:56):&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;[…] 之前写过一个，http://zrj.me/archives/888，不过现在是 2.5.2，东西不同了，安装方式有所变化。 […]&lt;/p&gt;
&lt;/blockquote&gt;
</description>
        </item>
        <item>
        <title>Hadoop 1.1.2 安装（单机版）</title>
        <link>https://blog.zrj.me/posts/2013-06-18-hadoop-1-1-2-%E5%AE%89%E8%A3%85%E5%8D%95%E6%9C%BA%E7%89%88/</link>
        <pubDate>Tue, 18 Jun 2013 16:56:27 +0800</pubDate>
        
        <guid>https://blog.zrj.me/posts/2013-06-18-hadoop-1-1-2-%E5%AE%89%E8%A3%85%E5%8D%95%E6%9C%BA%E7%89%88/</guid>
        <description>&lt;p&gt;操作的环境是阿里云的服务器，&lt;a class=&#34;link&#34; href=&#34;http://www.aliyun.com/&#34;  title=&#34;http://www.aliyun.com/&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.aliyun.com/&lt;/a&gt;，还不错，按时间收费，一个小时 0.28 块钱，操作系统是 Ubuntu 10.10 64 位，单核 CPU，内存 512&lt;/p&gt;
&lt;p&gt;首先上来先修改主机名&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;hostname cluster-1
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;然后开始按照这里，&lt;a class=&#34;link&#34; href=&#34;http://hadoop.apache.org/docs/stable/single_node_setup.html&#34;  title=&#34;http://hadoop.apache.org/docs/stable/single_node_setup.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://hadoop.apache.org/docs/stable/single_node_setup.html&lt;/a&gt;，的教程安装 Hadoop 1.1.2&lt;/p&gt;
&lt;p&gt;首先看到 Prerequisites，需要 java 的环境，试一下 java 的命令&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~# java
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;The program &lt;span class=&#34;s1&#34;&gt;&amp;#39;java&amp;#39;&lt;/span&gt; can be found in the following packages:
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt; * gcj-4.4-jre-headless
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt; * gcj-4.5-jre-headless
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt; * openjdk-6-jre-headless
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Try: apt-get install &amp;lt;selected package&amp;gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;那么首先安装 java，看到这里的教程，&lt;a class=&#34;link&#34; href=&#34;http://www.itkee.com/developer/detail-edd.html&#34;  title=&#34;http://www.itkee.com/developer/detail-edd.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.itkee.com/developer/detail-edd.html&lt;/a&gt;，首先补上源，然后安装&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;我的虚拟机装的是Ubuntu 10.04，安装的是Sun JDK6，因为Ubuntu 10.04后官方剔除了Sun JDK的源，提倡用OpenJDK代替（我估计是因为Sun被Oracle收购了，Oracle做了一些伤害开源组织的事，出于报复，故而提倡OpenJDK！），所以首先得加入SunJDK的源，&lt;/p&gt;
&lt;p&gt;sudo gedit /etc/apt/sources.list编辑源列表，在文件末尾加上一句：&lt;/p&gt;
&lt;p&gt;deb &lt;a class=&#34;link&#34; href=&#34;http://archive.canonical.com/&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://archive.canonical.com/&lt;/a&gt; lucid partner&lt;/p&gt;
&lt;p&gt;然后sudo apt-get update运行更新，&lt;/p&gt;
&lt;p&gt;再输入sudo apt-get install sun-java6-jdk，提示您要不要下载软件包，输入y继续。&lt;/p&gt;
&lt;p&gt;安装好JDK后，然后就是设置环境变量&lt;/p&gt;
&lt;p&gt;sudo gedit /etc/profile编辑文件，在文件末尾加上&lt;/p&gt;
&lt;p&gt;export JAVA_HOME=/usr/lib/jvm/java-6-sun-1.6.0.24&lt;/p&gt;
&lt;p&gt;export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH&lt;/p&gt;
&lt;p&gt;export CLASSPATH=$CLASSPATH:.:$JAVA_HOME/lib:$JAVA_HOME/jre/lib&lt;/p&gt;
&lt;p&gt;java路径是在命令行下安装时默认的路径，如果你的路径不同，只需修改第一行即可。&lt;/p&gt;
&lt;p&gt;使配置生效，输入“source /etc/profile”&lt;/p&gt;
&lt;p&gt;使用“java -version”查看是否配置成功，我的是：&lt;/p&gt;
&lt;p&gt;bwk@ubuntu:~$ java -version&lt;/p&gt;
&lt;p&gt;java version &amp;ldquo;1.6.0_24&amp;rdquo;&lt;/p&gt;
&lt;p&gt;Java(TM) SE Runtime Environment (build 1.6.0_24-b07)&lt;/p&gt;
&lt;p&gt;Java HotSpot(TM) Client VM (build 19.1-b02, mixed mode, sharing)&lt;/p&gt;
&lt;p&gt;证明配置成功，好了，大功告成！&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;但是在 apt-get update 的时候出现问题，没有办法 update 成功，于是想到自己手工去 java 官网下载安装程序来安装，看到这篇文章，&lt;a class=&#34;link&#34; href=&#34;http://os.51cto.com/art/201003/189114.htm&#34;  title=&#34;http://os.51cto.com/art/201003/189114.htm&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://os.51cto.com/art/201003/189114.htm&lt;/a&gt;，去到这个网址，&lt;a class=&#34;link&#34; href=&#34;http://www.oracle.com/technetwork/java/javase/downloads/jdk6downloads-1902814.html&#34;  title=&#34;http://www.oracle.com/technetwork/java/javase/downloads/jdk6downloads-1902814.html&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.oracle.com/technetwork/java/javase/downloads/jdk6downloads-1902814.html&lt;/a&gt;，下载了 jdk 6 的 bin 文件，上传到 vps&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~# java -version
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;java version &lt;span class=&#34;s2&#34;&gt;&amp;#34;1.6.0_45&amp;#34;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Java&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;TM&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; SE Runtime Environment &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;build 1.6.0_45-b06&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Java HotSpot&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;TM&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt; 64-Bit Server VM &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;build 20.45-b01, mixed mode&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;然后从 &lt;a class=&#34;link&#34; href=&#34;http://www.apache.org/dyn/closer.cgi/hadoop/common/&#34;  title=&#34;http://www.apache.org/dyn/closer.cgi/hadoop/common/&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://www.apache.org/dyn/closer.cgi/hadoop/common/&lt;/a&gt; 下载 1.1.2 的 Hadoop&lt;/p&gt;
&lt;pre tabindex=&#34;0&#34;&gt;&lt;code&gt;hadoop-1.1.2.tar.gz                                31-Jan-2013 22:42            61927560
&lt;/code&gt;&lt;/pre&gt;&lt;p&gt;解压缩完了，按照要求，edit the file conf/hadoop-env.sh to define at least JAVA_HOME to be the root of your Java installation.&lt;/p&gt;
&lt;p&gt;修改完了，按照教程上的示例，跑了一下这个代码&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ mkdir input 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ cp conf/*.xml input 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ bin/hadoop jar hadoop-examples-*.jar grep input output &lt;span class=&#34;s1&#34;&gt;&amp;#39;dfs[a-z.]+&amp;#39;&lt;/span&gt; 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ cat output/*
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;The output was&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/hadoop-1.1.2# cat output/*
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;1	dfsadmin
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;This was in standalone mode; the trouble is that the tutorial gives no reference answer, so there is no way to tell whether this result is correct&lt;/p&gt;
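The example job's behaviour can be approximated with ordinary shell tools, which gives a way to sanity-check the output; a sketch on a hypothetical sample file (the real input is conf/*.xml):

```shell
# Mimic what the Hadoop "grep" example computes: the number of
# occurrences of each match of the regex across the input files.
mkdir -p input
printf 'dfsadmin and mradmin commands\n' > input/sample.xml
grep -ohE 'dfs[a-z.]+' input/* | sort | uniq -c | sort -rn
```

Each output line pairs a count with a matched string, the same shape as the job's "1 dfsadmin" result.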
&lt;p&gt;Still, the following command makes it reasonably clear that the example jar mimics grep, so our result should indeed be correct&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/hadoop-1.1.2# cat input/* &lt;span class=&#34;p&#34;&gt;|&lt;/span&gt; egrep dfs&lt;span class=&#34;o&#34;&gt;[&lt;/span&gt;a-z.&lt;span class=&#34;o&#34;&gt;]&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;    dfsadmin and mradmin commands to refresh the security policy in-effect.
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Next, following the tutorial, I tried the Pseudo-Distributed mode. After modifying the configuration files as instructed, ssh must allow passwordless login&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ ssh-keygen -t dsa -P &lt;span class=&#34;s1&#34;&gt;&amp;#39;&amp;#39;&lt;/span&gt; -f ~/.ssh/id_dsa 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;$ cat ~/.ssh/id_dsa.pub &amp;gt;&amp;gt; ~/.ssh/authorized_keys
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;After running the start-all script as the tutorial describes, visiting the namenode web console shows a page like this&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://zrj.me/wp-content/uploads/2013/06/QQ%e6%88%aa%e5%9b%be20130618144948.png&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://blog.zrj.me/images/QQ%e6%88%aa%e5%9b%be20130618144948.png&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;QQ截图20130618144948&#34;
	
	
&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;And this is the jobtracker console&lt;/p&gt;
&lt;p&gt;&lt;a class=&#34;link&#34; href=&#34;http://zrj.me/wp-content/uploads/2013/06/QQ%e6%88%aa%e5%9b%be20130618145148.png&#34;  target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;&lt;img src=&#34;https://blog.zrj.me/images/QQ%e6%88%aa%e5%9b%be20130618145148.png&#34;
	
	
	
	loading=&#34;lazy&#34;
	
		alt=&#34;QQ截图20130618145148&#34;
	
	
&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;But running the following command per the tutorial produced an error&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;bin/hadoop jar hadoop-examples-*.jar grep input output &lt;span class=&#34;s1&#34;&gt;&amp;#39;dfs[a-z.]+&amp;#39;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;The error was&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:57:52 INFO util.NativeCodeLoader: Loaded the native-hadoop library
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:57:52 WARN snappy.LoadSnappy: Snappy native library not loaded
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:57:52 INFO mapred.FileInputFormat: Total input paths to process : &lt;span class=&#34;m&#34;&gt;16&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:57:52 INFO mapred.JobClient: Running job: job_201306181446_0001
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:57:54 INFO mapred.JobClient:  map 0% reduce 0%
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:09 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000001_0, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:09 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000000_0, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:19 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000000_1, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:19 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000001_1, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:29 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000000_2, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:29 INFO mapred.JobClient: Task Id : attempt_201306181446_0001_m_000001_2, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient: Job complete: job_201306181446_0001
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient: Counters: &lt;span class=&#34;m&#34;&gt;7&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:   Job Counters 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_MAPS&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;42937&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all reduces waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     Total &lt;span class=&#34;nb&#34;&gt;time&lt;/span&gt; spent by all maps waiting after reserving slots &lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ms&lt;span class=&#34;o&#34;&gt;)=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     Launched map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;8&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     Data-local map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;8&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     &lt;span class=&#34;nv&#34;&gt;SLOTS_MILLIS_REDUCES&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;0&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient:     Failed map &lt;span class=&#34;nv&#34;&gt;tasks&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;1&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 14:58:42 INFO mapred.JobClient: Job Failed: &lt;span class=&#34;c1&#34;&gt;# of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201306181446_0001_m_000000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;java.io.IOException: Job failed!
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.mapred.JobClient.runJob&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;JobClient.java:1327&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.Grep.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Grep.java:69&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ToolRunner.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ToolRunner.java:65&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.Grep.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Grep.java:93&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ProgramDriver&lt;span class=&#34;nv&#34;&gt;$ProgramDescription&lt;/span&gt;.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ProgramDriver.java:68&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ProgramDriver.driver&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ProgramDriver.java:139&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.ExampleDriver.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ExampleDriver.java:64&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.RunJar.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RunJar.java:156&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;A quick Google search turned up &lt;a class=&#34;link&#34; href=&#34;http://stackoverflow.com/questions/15609909/error-java-heap-space&#34;  title=&#34;http://stackoverflow.com/questions/15609909/error-java-heap-space&#34;
     target=&#34;_blank&#34; rel=&#34;noopener&#34;
    &gt;http://stackoverflow.com/questions/15609909/error-java-heap-space&lt;/a&gt;: the Java heap size needs to be increased&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;Clearly you have run out of the heap size allotted to Java. So you shall try to increase that.&lt;/p&gt;
&lt;p&gt;For that you may execute the following before executing hadoop command:&lt;/p&gt;
&lt;p&gt;&lt;code&gt;export HADOOP_OPTS=&amp;quot;-Xmx4096m&amp;quot;&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;Alternatively, you can achieve the same thing by adding the following permanent setting in your mapred-site.xml file, this file lies in HADOOP_HOME/conf/ :&lt;/p&gt;
&lt;p&gt;&lt;code&gt;mapred.child.java.opts&lt;/code&gt; &lt;code&gt;-Xmx4096m&lt;/code&gt;&lt;/p&gt;
&lt;p&gt;This would set your java heap space to 4096 MB (4GB), you may even try it with a lower value first if that works. If that too doesn&amp;rsquo;t work out then increase it more if your machine supports it, if not then move to a machine having more memory and try there. As heap space simply means you don&amp;rsquo;t have enough RAM available for Java.&lt;/p&gt;
&lt;/blockquote&gt;
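Spelled out, the permanent setting that answer describes would look roughly like this in conf/mapred-site.xml (a sketch for Hadoop 1.x; the property name is standard, while 4096m simply mirrors the quoted answer, so a smaller value may be enough):

```xml
<?xml version="1.0"?>
<configuration>
  <!-- JVM options passed to each map/reduce child task -->
  <property>
    <name>mapred.child.java.opts</name>
    <value>-Xmx4096m</value>
  </property>
</configuration>
```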
&lt;p&gt;Then restart all of Hadoop&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/stop-all.sh 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;stopping jobtracker
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: stopping tasktracker
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;stopping namenode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: stopping datanode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: stopping secondarynamenode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;root@cluster-1:~/workspace/hadoop-1.1.2# bin/start-all.sh 
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;starting namenode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-namenode-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: starting datanode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-datanode-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: starting secondarynamenode, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-secondarynamenode-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;starting jobtracker, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-jobtracker-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;localhost: starting tasktracker, logging to /root/workspace/hadoop-1.1.2/libexec/../logs/hadoop-root-tasktracker-cluster-1.out
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Retrying the previously failed job hit the same error. Recalling the conf/hadoop-env.sh file from earlier, I took the line&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;nb&#34;&gt;export&lt;/span&gt; &lt;span class=&#34;nv&#34;&gt;HADOOP_HEAPSIZE&lt;/span&gt;&lt;span class=&#34;o&#34;&gt;=&lt;/span&gt;&lt;span class=&#34;m&#34;&gt;2000&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;and uncommented it, then restarted Hadoop and ran the job again; this time it crashed outright&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 15:16:58 ERROR security.UserGroupInformation: PriviledgedActionException as:root cause:org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.mapred.SafeModeException: JobTracker is in safe mode
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.mapred.JobTracker.checkSafeMode&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;JobTracker.java:5270&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.mapred.JobTracker.getStagingAreaDir&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;JobTracker.java:3797&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.RPC&lt;span class=&#34;nv&#34;&gt;$Server&lt;/span&gt;.call&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RPC.java:578&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler$1&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1393&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler$1&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1389&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.security.AccessController.doPrivileged&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at javax.security.auth.Subject.doAs&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Subject.java:396&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.security.UserGroupInformation.doAs&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;UserGroupInformation.java:1149&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1387&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /user/root/grep-temp-1080766159. Name node is in safe mode.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;The reported blocks &lt;span class=&#34;m&#34;&gt;17&lt;/span&gt; has reached the threshold 0.9990 of total blocks 17. Safe mode will be turned off automatically in &lt;span class=&#34;m&#34;&gt;19&lt;/span&gt; seconds.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;FSNamesystem.java:2111&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;FSNamesystem.java:2088&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.hdfs.server.namenode.NameNode.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NameNode.java:832&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.RPC&lt;span class=&#34;nv&#34;&gt;$Server&lt;/span&gt;.call&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RPC.java:578&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler$1&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1393&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler$1&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1389&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.security.AccessController.doPrivileged&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at javax.security.auth.Subject.doAs&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Subject.java:396&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.security.UserGroupInformation.doAs&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;UserGroupInformation.java:1149&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Server&lt;span class=&#34;nv&#34;&gt;$Handler&lt;/span&gt;.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Server.java:1387&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.Client.call&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Client.java:1107&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.ipc.RPC&lt;span class=&#34;nv&#34;&gt;$Invoker&lt;/span&gt;.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RPC.java:229&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at com.sun.proxy.&lt;span class=&#34;nv&#34;&gt;$Proxy1&lt;/span&gt;.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Unknown Source&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RetryInvocationHandler.java:85&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RetryInvocationHandler.java:62&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at com.sun.proxy.&lt;span class=&#34;nv&#34;&gt;$Proxy1&lt;/span&gt;.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Unknown Source&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.hdfs.DFSClient.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DFSClient.java:981&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.hdfs.DistributedFileSystem.delete&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DistributedFileSystem.java:245&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.Grep.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Grep.java:87&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ToolRunner.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ToolRunner.java:65&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.Grep.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Grep.java:93&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ProgramDriver&lt;span class=&#34;nv&#34;&gt;$ProgramDescription&lt;/span&gt;.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ProgramDriver.java:68&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.ProgramDriver.driver&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ProgramDriver.java:139&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.examples.ExampleDriver.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;ExampleDriver.java:64&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke0&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Native Method&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.NativeMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;NativeMethodAccessorImpl.java:39&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at sun.reflect.DelegatingMethodAccessorImpl.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;DelegatingMethodAccessorImpl.java:25&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at java.lang.reflect.Method.invoke&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;Method.java:597&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.util.RunJar.main&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;RunJar.java:156&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Reverting the earlier changes to mapred-site.xml and trying again made no difference: the job still crashed outright.&lt;/p&gt;
&lt;p&gt;After a good deal of searching, I traced the problem to the HADOOP_HEAPSIZE variable in conf/hadoop-env.sh. My procedure for changing it was as follows. First, stop Hadoop:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/stop-all.sh&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then delete logs and tmp:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# rm -rf logs/*&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# rm -rf /tmp/hadoop-root*&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then edit the configuration file:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# vi conf/hadoop-env.sh&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;And reformat the namenode:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/hadoop namenode -format&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then restart Hadoop:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/start-all.sh&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;And copy the input files back in:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/hadoop fs -mkdir input&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/hadoop fs -put conf/*.xml input&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Then run the job:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;&lt;span class=&#34;c1&#34;&gt;# bin/hadoop jar hadoop-examples-*.jar grep input output &amp;#39;dfs[a-z.]+&amp;#39;&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Unfortunately, whenever I set the variable to 5000 or higher, it complained that there was not enough memory to initialize the Java virtual machine:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 16:28:14 INFO mapred.JobClient: Task Id : attempt_201306181619_0001_m_000007_1, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;java.lang.Throwable: Child Error
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.mapred.TaskRunner.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;TaskRunner.java:271&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Caused by: java.io.IOException: Task process &lt;span class=&#34;nb&#34;&gt;exit&lt;/span&gt; with nonzero status of 1.
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;	at org.apache.hadoop.mapred.TaskRunner.run&lt;span class=&#34;o&#34;&gt;(&lt;/span&gt;TaskRunner.java:258&lt;span class=&#34;o&#34;&gt;)&lt;/span&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;attempt_201306181619_0001_m_000007_1: Error occurred during initialization of VM
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;attempt_201306181619_0001_m_000007_1: Could not reserve enough space &lt;span class=&#34;k&#34;&gt;for&lt;/span&gt; object heap
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;While at 4000 or below, it reported insufficient heap space:&lt;/p&gt;
&lt;div class=&#34;highlight&#34;&gt;&lt;pre tabindex=&#34;0&#34; class=&#34;chroma&#34;&gt;&lt;code class=&#34;language-bash&#34; data-lang=&#34;bash&#34;&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;13/06/18 16:46:13 INFO mapred.JobClient: Task Id : attempt_201306181644_0001_m_000001_1, Status : FAILED
&lt;/span&gt;&lt;/span&gt;&lt;span class=&#34;line&#34;&gt;&lt;span class=&#34;cl&#34;&gt;Error: Java heap space
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;&lt;p&gt;Since this VPS has only 512 MB of RAM, I figured there was no way to simulate Pseudo-Distributed mode on a single machine, so I decided to skip it and try Fully-Distributed instead.&lt;/p&gt;
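For reference, the change to conf/hadoop-env.sh is a single line. The value below is only an illustrative sketch, not something from my notes: HADOOP_HEAPSIZE is the maximum heap, in MB, given to each Hadoop daemon, and the Hadoop 1.x default of 1000 already oversubscribes a 512 MB machine.

```shell
# Sketch only: conf/hadoop-env.sh fragment; the value 256 is an assumption
# chosen for a low-memory box, not a tested recommendation.
# HADOOP_HEAPSIZE is the maximum heap, in MB, for each Hadoop daemon
# (the Hadoop 1.x default is 1000).
export HADOOP_HEAPSIZE=256
```

Note this caps every daemon (namenode, datanode, jobtracker, tasktracker) at the same value, which is why a single small number matters so much when several daemons share one small machine.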
</description>
        </item>
        
    </channel>
</rss>
