
Hadoop MapReduce Java Example

WordCount workflow:
input -> split -> map -> shuffle -> reduce -> output
hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar wordcount 10803060234.txt /output
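
After the job completes, the counts can be read straight from HDFS. A quick check, assuming the /output directory used above (the word counts shown are purely illustrative):

hdfs dfs -cat /output/part-r-00000
# hadoop     3
# mapreduce  2
# wordcount  5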


package wordcount;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Test {

public static void main(String[] args) throws Exception {

    Configuration conf = new Configuration();
    // Point the client at the HDFS NameNode and submit through YARN.
    conf.set("fs.defaultFS", "hdfs://172.26.19.40:9000");
    // Jar that carries the mapper/reducer classes to the cluster nodes.
    conf.set("mapreduce.job.jar", "target/wc.jar");
    conf.set("mapreduce.framework.name", "yarn");
    conf.set("yarn.resourcemanager.hostname", "hmaster");
    // Required when submitting from a Windows client to a Linux cluster.
    conf.set("mapreduce.app-submission.cross-platform", "true");

    Job job = Job.getInstance(conf);
    job.setMapperClass(WordMapper.class);
    job.setReducerClass(WordReducer.class);

    // The map output value type (IntWritable) differs from the final
    // output value type (LongWritable), so both pairs must be declared.
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(IntWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);

    // Fill in the HDFS input and output paths before running.
    FileInputFormat.setInputPaths(job, "");
    FileOutputFormat.setOutputPath(job, new Path(""));

    System.exit(job.waitForCompletion(true) ? 0 : 1);
}

}
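
The listing above leaves both paths empty. A common variant, sketched here under the assumption that the paths arrive as command-line arguments, reads them from args instead (note that the output directory must not already exist in HDFS):

    // e.g. hadoop jar wc.jar wordcount.Test /input/10803060234.txt /output
    FileInputFormat.setInputPaths(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));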

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class WordMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

private static final IntWritable ONE = new IntWritable(1);
private final Text word = new Text();

@Override
protected void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
    // The key is the byte offset of the line; only the line text matters here.
    // Split on runs of whitespace so repeated spaces yield no empty tokens.
    String[] words = value.toString().trim().split("\\s+");
    for (String w : words) {
        if (w.isEmpty()) {
            continue; // skip the single empty token produced by a blank line
        }
        word.set(w);
        // Emit (word, 1) for every token; counting happens in the reducer.
        context.write(word, ONE);
    }
}

}
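
To make the shuffle step concrete: for an input line "hello world hello", the mapper emits (hello, 1), (world, 1), (hello, 1); the framework then groups the pairs by key, so the reducer receives (hello, [1, 1]) and (world, [1]) and only has to sum each list.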

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class WordReducer extends Reducer<Text, IntWritable, Text, LongWritable> {

@Override
protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
    // Sum the per-word counts from the mappers; a primitive long avoids
    // autoboxing and matches the LongWritable output type.
    long count = 0L;
    for (IntWritable value : values) {
        count += value.get();
    }
    context.write(key, new LongWritable(count));

}

}
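
A combiner usually cuts shuffle traffic sharply for word counting, but WordReducer cannot serve as one here: a combiner must preserve the map output types, and WordReducer turns IntWritable values into LongWritable. A minimal sketch of a compatible combiner (WordCombiner is a hypothetical class, not part of the original listing):

package wordcount;

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Pre-aggregates (word, 1) pairs on the map side; input and output types
// must both match the map output types (Text, IntWritable).
public class WordCombiner extends Reducer<Text, IntWritable, Text, IntWritable> {

@Override
protected void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
    int sum = 0;
    for (IntWritable value : values) {
        sum += value.get();
    }
    context.write(key, new IntWritable(sum));
}

}

It would be registered in the driver with job.setCombinerClass(WordCombiner.class);.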

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.skcc</groupId>
<artifactId>wordcount</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>wordcount</name>
<description>count the word</description>

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <hadoop.version>2.7.3</hadoop.version>
</properties>
<dependencies>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>${hadoop.version}</version>
    </dependency>
</dependencies>

</project>
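
The driver expects the job jar at target/wc.jar (via mapreduce.job.jar), but this POM builds target/wordcount-0.0.1-SNAPSHOT.jar by default. One way to reconcile the two, assuming the standard jar packaging, is to fix the artifact name in the build section:

<build>
    <finalName>wc</finalName>
</build>

Then mvn clean package produces target/wc.jar, which matches the path hard-coded in the driver.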
