Integrating Kafka with Flume

1. Environment Preparation

1. Prepare a Kafka cluster environment and start it.

Kafka 3.6.1 Cluster Installation and Deployment

2. Install Flume on any node of the Kafka cluster.

Flume 1.11 Deployment

2. Flume as Producer

1. Configure Flume

cd /usr/flume/apache-flume-1.11.0-bin/
mkdir jobs
mkdir /mnt/applog
vi jobs/file_to_kafka.conf
# 1 Define the components
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# 2 Configure the source
a1.sources.r1.type = TAILDIR
a1.sources.r1.filegroups = f1
a1.sources.r1.filegroups.f1 = /mnt/applog/app.*
a1.sources.r1.positionFile = /usr/flume/apache-flume-1.11.0-bin/taildir_position.json

# 3 Configure the channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# 4 Configure the sink
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = 192.168.58.130:9092,192.168.58.131:9092,192.168.58.132:9092
a1.sinks.k1.kafka.topic = first
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1

# 5 Wire the components together
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
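The memory channel above keeps in-flight events only in RAM, so anything still buffered is lost if the agent process dies. Where durability matters more than throughput, Flume's file channel can be swapped in for c1. A minimal sketch; the checkpoint and data directories are assumptions, not paths from this guide:

```properties
# Durable alternative to the memory channel above
a1.channels.c1.type = file
a1.channels.c1.checkpointDir = /usr/flume/apache-flume-1.11.0-bin/checkpoint
a1.channels.c1.dataDirs = /usr/flume/apache-flume-1.11.0-bin/data
```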

2. Start Flume

bin/flume-ng agent -c conf/ -n a1 -f jobs/file_to_kafka.conf &

3. Create the first topic

/usr/kafka/kafka_2.13-3.6.1/bin/kafka-topics.sh --bootstrap-server 192.168.58.130:9092 --create --partitions 1 --replication-factor 3 --topic first

4. Start a Kafka console consumer

/usr/kafka/kafka_2.13-3.6.1/bin/kafka-console-consumer.sh --bootstrap-server 192.168.58.130:9092 --topic first

5. Append data to the monitored file

echo coreqi >> /mnt/applog/app.log
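To push more than a single line through the pipeline, you can simulate an application writing logs in a loop. A sketch, assuming the /mnt/applog directory from the source configuration above (override LOG_DIR if your path differs):

```shell
# Append a few numbered lines so the TAILDIR source has a stream to tail.
# LOG_DIR defaults to the path used in this guide.
LOG_DIR=${LOG_DIR:-/mnt/applog}
mkdir -p "$LOG_DIR"
for i in 1 2 3 4 5; do
  echo "test-message-$i" >> "$LOG_DIR/app.log"
done
# Show how many lines the file now holds
wc -l < "$LOG_DIR/app.log"
```

Each appended line becomes one Flume event and should show up in the console consumer started in the previous step.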

6. Watch the Kafka console consumer: the lines appended to the file should appear in its output.

3. Flume as Consumer

1. Configure Flume

vi /usr/flume/apache-flume-1.11.0-bin/jobs/kafka_to_file.conf
# 1 Define the components
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# 2 Configure the source
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.batchSize = 50
a1.sources.r1.batchDurationMillis = 200
a1.sources.r1.kafka.bootstrap.servers = 192.168.58.130:9092
a1.sources.r1.kafka.topics = first
a1.sources.r1.kafka.consumer.group.id = custom.g.id

# 3 Configure the channel
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# 4 Configure the sink
a1.sinks.k1.type = logger

# 5 Wire the components together
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
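Note that despite the file name kafka_to_file.conf, the k1 sink above is a logger sink, which prints events to the Flume log/console rather than writing them to a data file. To actually land the consumed events on local disk, Flume's file_roll sink is one option. A sketch; the output directory is an assumption:

```properties
# Alternative sink: roll consumed events into files on local disk
a1.sinks.k1.type = file_roll
a1.sinks.k1.sink.directory = /mnt/flumeout
# Start a new file every 30 seconds (0 disables rolling)
a1.sinks.k1.sink.rollInterval = 30
```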

2. Start Flume

cd /usr/flume/apache-flume-1.11.0-bin/
bin/flume-ng agent -c conf/ -n a1 -f jobs/kafka_to_file.conf -Dflume.root.logger=INFO,console

3. Start a Kafka console producer

/usr/kafka/kafka_2.13-3.6.1/bin/kafka-console-producer.sh --bootstrap-server 192.168.58.130:9092 --topic first

Type some data, for example: hello world

4. Watch the events logged to the console
