DataStream Transformations
Basic Operators
// input records look like: order zhangsan TV,GAME
val env = StreamExecutionEnvironment.createLocalEnvironment()
val props = new Properties()
props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG,
  "CentOS:9092,CentOS:9093,CentOS:9094")
props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g1")
env.addSource(new FlinkKafkaConsumer[String]("topic01", new SimpleStringSchema(), props))
  .filter(line => line.startsWith("order"))        // keep only order records
  .map(line => line.replace("order", "").trim)     // strip the "order" prefix
  .flatMap(user => for (i <- user.split(" ")(1).split(",")) yield (user.split(" ")(0), i))
  .print()
env.execute("word counts")
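The filter/map/flatMap chain above can be traced on a plain in-memory list, with no Kafka or Flink runtime required. The sample lines below are hypothetical, mirroring the input format in the comment at the top of the snippet:

```scala
// Simulate the filter/map/flatMap chain on an in-memory list.
// Assumed input format: "order <user> <item1>,<item2>,..."
val lines = List("order zhangsan TV,GAME", "pay lisi 100")

val pairs = lines
  .filter(_.startsWith("order"))            // keep only order records
  .map(_.replace("order", "").trim)         // -> "zhangsan TV,GAME"
  .flatMap { user =>
    val tokens = user.split(" ")
    tokens(1).split(",").map(item => (tokens(0), item)) // one (user, item) pair per item
  }

println(pairs) // List((zhangsan,TV), (zhangsan,GAME))
```

The record that does not start with "order" is dropped, and each purchased item becomes its own (user, item) pair, exactly as the streaming version emits one element per item.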
Keyed Operators
keyBy logically partitions a stream into disjoint partitions: all records with the same key are assigned to the same partition. Internally, keyBy() is implemented with hash partitioning.
val env = StreamExecutionEnvironment.createLocalEnvironment()
val props = new Properties()
props.setProperty(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "CentOS:9092,CentOS:9093,CentOS:9094")
props.setProperty(ConsumerConfig.GROUP_ID_CONFIG, "g1")
// sample records: id user item price quantity date
//001 zhansan apple 4.5 2 2018-10-01
//003 lisi keyboard 800 1 2018-01-23
//002 zhansan orange 2.5 2 2018-11-22
env.addSource(new FlinkKafkaConsumer[String]("topic01", new SimpleStringSchema(), props))
  .map(line => {
    val tokens = line.split(" ")
    val user = tokens(1)
    val cost = tokens(3).toDouble * tokens(4).toInt   // price * quantity
    (user, cost)
  })
  .keyBy(0)
  .reduce((item1, item2) => (item1._1, item1._2 + item2._2)) // running cost per user
  .print()
env.execute("order counts")
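The final per-user totals that the keyBy/reduce pair converges to can be sketched with plain Scala collections, using the sample orders from the comments above (groupBy plays the role of keyBy, and the reduce function is identical):

```scala
// (user, cost) pairs derived from the sample records: price * quantity
val orders = List(("zhansan", 4.5 * 2), ("lisi", 800.0 * 1), ("zhansan", 2.5 * 2))

val totals = orders
  .groupBy(_._1)                                  // analogous to keyBy(0)
  .map { case (_, items) =>
    items.reduce((a, b) => (a._1, a._2 + b._2))   // same reduce function as the stream job
  }
  .toList
  .sortBy(_._1)

println(totals) // List((lisi,800.0), (zhansan,14.0))
```

Note one difference: the streaming reduce emits an updated total after every element, while this batch sketch only shows the final state per key.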
Aggregation Operators
Reduce
A "rolling" reduce on a keyed data stream: it combines the current element with the last reduced value and emits the new value.
val env = StreamExecutionEnvironment.createLocalEnvironment()
env.socketTextStream("localhost", 9999)
  .flatMap(_.split("\\W+"))
  .map((_, 1))
  .keyBy(0)
  .reduce((v1, v2) => (v1._1, v1._2 + v2._2))
  .print()
env.execute("reduce test")
Fold
A "rolling" fold on a keyed data stream with an initial value: it combines the current element with the last folded value and emits the new value. This method is deprecated and will be removed in a future version.
val env = StreamExecutionEnvironment.createLocalEnvironment()
env.socketTextStream("localhost", 9999)
  .flatMap(_.split("\\W+"))
  .map((_, 1))
  .keyBy(0)
  .fold(("", 0))((v1, v2) => (v2._1, v1._2 + v2._2))
  .print()
env.execute("fold test")
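Per key, the fold above behaves like an ordinary foldLeft with an initial value. A plain-Scala sketch of the word-count fold, on hypothetical input words:

```scala
// The keyed fold(("",0))((acc, v) => (v._1, acc._2 + v._2)) is, per key,
// a foldLeft over that key's elements starting from the initial value.
val words = List(("hello", 1), ("hello", 1), ("flink", 1))

val counts = words
  .groupBy(_._1)                                          // analogous to keyBy(0)
  .map { case (_, vs) =>
    vs.foldLeft(("", 0))((acc, v) => (v._1, acc._2 + v._2))
  }
  .toList
  .sortBy(_._1)

println(counts) // List((flink,1), (hello,2))
```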
Aggregations
Rolling aggregations on a keyed data stream. The difference between min and minBy is that min returns the minimum value, while minBy returns the element that holds the minimum value in the given field (the same holds for max and maxBy).
val env = StreamExecutionEnvironment.createLocalEnvironment()
env.socketTextStream("localhost", 9999)
  .flatMap(_.split("\\W+"))
  .map((_, 1))
  .keyBy(0)
  .sum(1)
  .print()
env.execute("aggregate test")
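The min-versus-minBy distinction can be illustrated with plain Scala collections. This is only an analogy (Flink's min additionally keeps the non-aggregated fields of an earlier element), but it captures the core difference: a value versus a whole element. The Order type and data here are made up for illustration:

```scala
// min-style: only the minimum of the chosen field.
// minBy-style: the entire element that holds the minimum.
case class Order(user: String, cost: Double)
val orders = List(Order("a", 9.0), Order("b", 5.0), Order("c", 7.0))

val minCost  = orders.map(_.cost).min   // just the value
val minOrder = orders.minBy(_.cost)     // the whole winning element

println(minCost)  // 5.0
println(minOrder) // Order(b,5.0)
```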
Merging and Splitting Operators
Union
Merges two or more streams into one; all unioned streams must have the same element type.
val env = StreamExecutionEnvironment.createLocalEnvironment()
val stream1: DataStream[String] = env.fromElements("a","b","c")
val stream2: DataStream[String] = env.fromElements("b","c","d")
stream1.union(stream2)
.print()
env.execute("union test")
Connect
Connects two streams. This is similar to union, but the two streams' element types may differ. When implementing CoMapFunction[IN1, IN2, OUT], however, the OUT type must be the same for both map functions: after passing through the function, both streams are converted to a common form.
val env = StreamExecutionEnvironment.createLocalEnvironment()
val s1: DataStream[String] = env.socketTextStream("CentOS", 9999)
val s2: DataStream[String] = env.socketTextStream("CentOS", 8888)
s1.connect(s2)
  .map(new CoMapFunction[String, String, String] {
    // s1 records are space-separated: "user cost"
    override def map1(value: String) = {
      value.split(" ")(0) + "," + value.split(" ")(1)
    }
    // s2 records are already comma-separated: "user,cost"
    override def map2(value: String) = {
      value.split(",")(0) + "," + value.split(",")(1)
    }
  })
  .map(line => (line.split(",")(0), line.split(",")(1).toDouble))
  .keyBy(_._1)
  .sum(1)
  .print()
env.execute("connect demo")
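The normalization that map1/map2 perform can be sketched without Flink: two sources with different record formats are each mapped to the same "user,cost" shape, after which a single downstream pipeline handles both. The port names and sample records are assumptions for illustration:

```scala
// map1 handles space-separated records, map2 comma-separated ones;
// both emit the common "user,cost" form required by the OUT type.
def map1(value: String): String = {
  val t = value.split(" ")
  t(0) + "," + t(1)
}
def map2(value: String): String = {
  val t = value.split(",")
  t(0) + "," + t(1)
}

val fromPort9999 = List("zhangsan 4.5").map(map1)   // space-separated source
val fromPort8888 = List("lisi,2.5").map(map2)       // comma-separated source

val merged = (fromPort9999 ++ fromPort8888)
  .map(line => { val t = line.split(","); (t(0), t(1).toDouble) })

println(merged) // List((zhangsan,4.5), (lisi,2.5))
```

Because both branches converge to one type, the shared map/keyBy/sum tail of the job never needs to know which source a record came from.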
Split/Select
Use the split operator to route each element of a stream to one or more named streams; downstream, the select operator then picks out a named stream for individual processing. (In newer Flink versions, split/select is deprecated in favor of side outputs.)
val env = StreamExecutionEnvironment.createLocalEnvironment()
val split = env.socketTextStream("CentOS", 9999)
  .split(new OutputSelector[String] {
    override def select(value: String): lang.Iterable[String] = {
      val list = new util.ArrayList[String]()
      if (value.contains("error")) {
        list.add("error")
      } else {
        list.add("info")
      }
      list
    }
  })
// select the named branch streams
split.select("error").map(t => "ERROR " + t).print()
split.select("info").map(t => "INFO " + t).print()
env.execute("split demo")
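The routing logic of split/select can be sketched on a plain list: partition by the same predicate the OutputSelector uses, then give each named branch its own processing. The sample log lines are made up for illustration:

```scala
// Route each line to a named branch, then process each branch separately.
val lines = List("error disk full", "info started", "error oom")

val branches: Map[String, List[String]] =
  lines.groupBy(l => if (l.contains("error")) "error" else "info")

// Per-branch processing, mirroring split.select("error") / split.select("info")
val errors = branches.getOrElse("error", Nil).map("ERROR " + _)
val infos  = branches.getOrElse("info", Nil).map("INFO " + _)

println(errors) // List(ERROR error disk full, ERROR error oom)
println(infos)  // List(INFO info started)
```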