This article explains how to use Hadoop's KeyValueTextInputFormat in Java, walking through a small MapReduce example step by step.
A KeyValueTextInputFormat usage example
1. Requirement
Count the number of lines in the input file that begin with the same first word.
(1) Input data

hadoop ni hao
xiaoming hive helloworld
hadoop ni hao
xiaoming hive helloworld

(2) Expected output

hadoop	2
xiaoming	2
2. Analysis

KeyValueTextInputFormat splits each line at the first occurrence of the configured separator: everything before it becomes the key, and everything after it becomes the value. With the separator set to a single space, the key of every line is its first word, so the mapper only needs to emit (key, 1) and the reducer sums the counts per key.
3. Writing the code
(1) The Mapper class
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class KVTextMapper extends Mapper<Text, Text, Text, LongWritable> {

    // 1. The output value is always 1: each input line counts once
    private final LongWritable v = new LongWritable(1);

    @Override
    protected void map(Text key, Text value, Context context)
            throws IOException, InterruptedException {
        // 2. The key is already the first word of the line; emit (word, 1)
        context.write(key, v);
    }
}
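The mapper's input key is already the first word of each line because KeyValueTextInputFormat splits every line at the first occurrence of the separator. That split rule can be sketched in plain Java; this is a simplified stand-in for what the record reader does, not Hadoop's actual implementation, and the class and method names are illustrative:

```java
public class KeyValueSplitDemo {

    // Split a line at the FIRST occurrence of the separator: text before
    // it is the key, the rest is the value. A line that does not contain
    // the separator becomes (line, "").
    public static String[] splitAtFirst(String line, String separator) {
        int pos = line.indexOf(separator);
        if (pos < 0) {
            return new String[] { line, "" };
        }
        return new String[] {
            line.substring(0, pos),
            line.substring(pos + separator.length())
        };
    }

    public static void main(String[] args) {
        String[] kv = splitAtFirst("hadoop ni hao", " ");
        System.out.println("key=" + kv[0] + ", value=" + kv[1]);
        // key=hadoop, value=ni hao
    }
}
```

Note that only the first separator matters: the value "ni hao" keeps its internal spaces.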
(2) The Reducer class
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class KVTextReducer extends Reducer<Text, LongWritable, Text, LongWritable> {

    private final LongWritable v = new LongWritable();

    @Override
    protected void reduce(Text key, Iterable<LongWritable> values, Context context)
            throws IOException, InterruptedException {
        long sum = 0L;
        // 1. Sum the counts for this key
        for (LongWritable value : values) {
            sum += value.get();
        }
        v.set(sum);
        // 2. Emit (word, total number of lines starting with it)
        context.write(key, v);
    }
}
(3) The Driver class
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.KeyValueLineRecordReader;
import org.apache.hadoop.mapreduce.lib.input.KeyValueTextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class KVTextDriver {

    public static void main(String[] args)
            throws IOException, ClassNotFoundException, InterruptedException {
        Configuration conf = new Configuration();
        // Set the key/value separator (the default is a tab character)
        conf.set(KeyValueLineRecordReader.KEY_VALUE_SEPERATOR, " ");
        // 1. Get the job object
        Job job = Job.getInstance(conf);
        // 2. Set the jar location and associate the mapper and reducer
        job.setJarByClass(KVTextDriver.class);
        job.setMapperClass(KVTextMapper.class);
        job.setReducerClass(KVTextReducer.class);
        // 3. Set the map output key/value types
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(LongWritable.class);
        // 4. Set the final output key/value types
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        // 5. Set the input path
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        // Use KeyValueTextInputFormat instead of the default TextInputFormat
        job.setInputFormatClass(KeyValueTextInputFormat.class);
        // 6. Set the output path
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // 7. Submit the job
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
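To sanity-check the expected output without a cluster, the map and reduce steps above can be simulated with plain Java collections. This is only a local approximation of the job's logic (the class and method names are illustrative), not a substitute for actually running it on Hadoop:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class KVTextLocalCheck {

    // Count how many lines share the same first word, mirroring the
    // mapper (emit (firstWord, 1)) and the reducer (sum per key).
    public static Map<String, Long> countFirstWords(String[] lines) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String line : lines) {
            int pos = line.indexOf(' ');
            String key = pos < 0 ? line : line.substring(0, pos);
            counts.merge(key, 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] input = {
            "hadoop ni hao",
            "xiaoming hive helloworld",
            "hadoop ni hao",
            "xiaoming hive helloworld"
        };
        System.out.println(countFirstWords(input));
        // {hadoop=2, xiaoming=2}
    }
}
```

Running it on the sample input from section 1 reproduces the expected counts of 2 for each first word.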