0. Preface
The previous article, "A First Taste of Hadoop: Quickly Setting Up a Pseudo-Distributed Hadoop Environment", walked through building a Hadoop environment. This article uses Hadoop's bundled wordcount program to run a word-count example on that environment.
1. Word Counting with the Bundled Example Program
(1) The wordcount program
The wordcount program ships in Hadoop's share directory:
[root@leaf mapreduce]# pwd
/usr/local/hadoop/share/hadoop/mapreduce
[root@leaf mapreduce]# ls
hadoop-mapreduce-client-app-2.6.5.jar         hadoop-mapreduce-client-jobclient-2.6.5-tests.jar
hadoop-mapreduce-client-common-2.6.5.jar      hadoop-mapreduce-client-shuffle-2.6.5.jar
hadoop-mapreduce-client-core-2.6.5.jar        hadoop-mapreduce-examples-2.6.5.jar
hadoop-mapreduce-client-hs-2.6.5.jar          lib
hadoop-mapreduce-client-hs-plugins-2.6.5.jar  lib-examples
hadoop-mapreduce-client-jobclient-2.6.5.jar   sources
The one we want is hadoop-mapreduce-examples-2.6.5.jar.
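Before running it, it helps to know what wordcount actually does: the mapper emits a (word, 1) pair for every token on every input line, and the reducer sums those pairs per word. The sketch below is a minimal local Python re-creation of that logic, not the Java source inside the examples jar.

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every whitespace-separated token
    for line in lines:
        for word in line.split():
            yield word, 1

def reduce_phase(pairs):
    # Reducer: sum the 1s for each distinct word
    counts = defaultdict(int)
    for word, one in pairs:
        counts[word] += one
    return dict(counts)

sample = ["leaf yyh", "yyh xpleaf"]
print(reduce_phase(map_phase(sample)))  # {'leaf': 1, 'yyh': 2, 'xpleaf': 1}
```

In the real job, Hadoop additionally sorts the map output by key and partitions it across reducers; this sketch collapses that shuffle into a single dictionary.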
(2) Create the HDFS data directories
Create a directory to hold the MapReduce job's input files:
[root@leaf ~]# hadoop fs -mkdir -p /data/wordcount
Create a directory to hold the MapReduce job's output files:
[root@leaf ~]# hadoop fs -mkdir /output
List the two directories we just created:
[root@leaf ~]# hadoop fs -ls /
drwxr-xr-x   - root supergroup          0 2017-09-01 20:34 /data
drwxr-xr-x   - root supergroup          0 2017-09-01 20:35 /output
(3) Create a word file and upload it to HDFS
The word file looks like this:
[root@leaf ~]# cat myword.txt
leaf yyh
yyh xpleaf
katy ling
yeyonghao leaf
xpleaf katy
Upload the file to HDFS:
[root@leaf ~]# hadoop fs -put myword.txt /data/wordcount
Check the uploaded file and its contents in HDFS:
[root@leaf ~]# hadoop fs -ls /data/wordcount
-rw-r--r--   1 root supergroup         57 2017-09-01 20:40 /data/wordcount/myword.txt
[root@leaf ~]# hadoop fs -cat /data/wordcount/myword.txt
leaf yyh
yyh xpleaf
katy ling
yeyonghao leaf
xpleaf katy
(4) Run the wordcount program
Execute the following command:
[root@leaf ~]# hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar wordcount /data/wordcount /output/wordcount
...
17/09/01 20:48:14 INFO mapreduce.Job: Job job_local1719603087_0001 completed successfully
17/09/01 20:48:14 INFO mapreduce.Job: Counters: 38
	File System Counters
		FILE: Number of bytes read=585940
		FILE: Number of bytes written=1099502
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=114
		HDFS: Number of bytes written=48
		HDFS: Number of read operations=15
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=4
	Map-Reduce Framework
		Map input records=5
		Map output records=10
		Map output bytes=97
		Map output materialized bytes=78
		Input split bytes=112
		Combine input records=10
		Combine output records=6
		Reduce input groups=6
		Reduce shuffle bytes=78
		Reduce input records=6
		Reduce output records=6
		Spilled Records=12
		Shuffled Maps =1
		Failed Shuffles=0
		Merged Map outputs=1
		GC time elapsed (ms)=92
		CPU time spent (ms)=0
		Physical memory (bytes) snapshot=0
		Virtual memory (bytes) snapshot=0
		Total committed heap usage (bytes)=241049600
	Shuffle Errors
		BAD_ID=0
		CONNECTION=0
		IO_ERROR=0
		WRONG_LENGTH=0
		WRONG_MAP=0
		WRONG_REDUCE=0
	File Input Format Counters
		Bytes Read=57
	File Output Format Counters
		Bytes Written=48
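The framework counters above can be reconciled against the input file by hand: 5 input lines produce 10 map output records (one per word), which the combiner collapses to 6 records (one per distinct word), matching the reducer's input and output counts. A small Python check, assuming the 5-line myword.txt shown earlier:

```python
from collections import Counter

# The five lines of myword.txt from the upload step
lines = [
    "leaf yyh",
    "yyh xpleaf",
    "katy ling",
    "yeyonghao leaf",
    "xpleaf katy",
]

map_input_records = len(lines)                                  # one record per line
map_output_records = sum(len(l.split()) for l in lines)         # one (word, 1) pair per token
combine_output_records = len(Counter(w for l in lines
                                     for w in l.split()))       # one record per distinct word

print(map_input_records, map_output_records, combine_output_records)  # 5 10 6
```

These match "Map input records=5", "Map output records=10", and "Combine output records=6" in the job log; the combiner is what shrinks the shuffle from 10 records to 6 before the reduce phase.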
(5) View the word counts
The results are as follows:
[root@leaf ~]# hadoop fs -cat /output/wordcount/part-r-00000
katy	2
leaf	2
ling	1
xpleaf	2
yeyonghao	1
yyh	2
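Note the shape of part-r-00000: each line is a key, a tab, and a count, with keys in sorted order, because the shuffle sorts keys before they reach the reducer and the default text output writes them tab-separated. The same output can be re-created locally as a sanity check:

```python
from collections import Counter

words = "leaf yyh yyh xpleaf katy ling yeyonghao leaf xpleaf katy".split()

# Sorted, tab-separated lines, mirroring the reducer's text output
rows = [f"{word}\t{count}" for word, count in sorted(Counter(words).items())]
for row in rows:
    print(row)
```

This prints the same six lines as the hadoop fs -cat above.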
2. References
http://www.aboutyun.com/thread-7713-1-1.html