Installing Kafka and kafka-manager, with a Python client

The docker-compose file:

version: '3'
services:
  zookeeper:
    image: wurstmeister/zookeeper
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka
    depends_on: [ zookeeper ]
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: 192.168.1.88
      KAFKA_CREATE_TOPICS: "test:1:1"
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    volumes:
      - /data/product/zj_bigdata/data/kafka/docker.sock:/var/run/docker.sock
  kafka-manager:
    image: sheepkiller/kafka-manager:latest
    ports:
      - "9000:9000"
    links:
      - zookeeper
      - kafka
    environment:
      ZK_HOSTS: zookeeper:2181
      APPLICATION_SECRET: letmein
      KM_ARGS: -Djava.net.preferIPv4Stack=true
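After `docker-compose up -d`, it can be worth confirming that the three mapped ports are actually reachable before pointing any clients at them. A minimal stdlib sketch (the host `192.168.1.88` and the ports are taken from the compose file above; adjust for your environment):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports mapped in the compose file: zookeeper, kafka, kafka-manager
for port in (2181, 9092, 9000):
    print(port, "open" if port_open("192.168.1.88", port) else "closed")
```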

kafka-python 1.4.7
GitHub: https://github.com/dpkp/kafka-python
Docs: https://kafka-python.readthedocs.io/en/master/apidoc/modules.html
Because the messages are compressed with snappy, python-snappy 0.5.4 is also required.
Installing python-snappy needs gcc and the snappy development headers:

ubuntu: sudo apt-get install libsnappy-dev
centos: yum install snappy-devel

Consumer client code:

from kafka import KafkaConsumer
import os

from etc.config import Config


class KafkaConsumerM(object):

    def __init__(self, host, port, topic, group_id):
        # Start from the latest offset, auto-commit every 2 s, and stop
        # iterating if no message arrives within 10 minutes.
        self.consumer = KafkaConsumer(topic,
                                      group_id=group_id,
                                      bootstrap_servers=[f'{host}:{port}'],
                                      auto_offset_reset='latest',
                                      enable_auto_commit=True,
                                      auto_commit_interval_ms=2000,
                                      consumer_timeout_ms=10 * 60 * 1000)


try:
    # The topic name doubles as the consumer group id here
    MyKafkaConsumerM = KafkaConsumerM(Config.KAFKA_HOST, Config.KAFKA_PORT,
                                      Config.KAFKA_TOPIC, Config.KAFKA_TOPIC)
except Exception as e:
    print(os.getcwd(), ": connect error:", e)
    os._exit(1)
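A side note on reading messages: kafka-python yields `ConsumerRecord` namedtuples, in which the payload sits at index 6, so `msg[6]` and `msg.value` refer to the same field; the attribute form is clearer. A stand-alone illustration using a stdlib namedtuple that mirrors the real field order of the 1.4.x `ConsumerRecord`:

```python
from collections import namedtuple

# Field order mirrors kafka-python's ConsumerRecord (1.4.x): value is index 6
Record = namedtuple("Record", ["topic", "partition", "offset", "timestamp",
                               "timestamp_type", "key", "value"])

msg = Record("test", 0, 42, 1570000000000, 0, None, b'{"status": 200}')

assert msg[6] is msg.value           # same field, two spellings
print(msg.value.decode("utf-8"))     # → {"status": 200}
```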
The consumer loop:
from dbsource.mongoHelper import ServerMongoStore
from dbsource.kafkaHelper import MyKafkaConsumerM
import json
import os
import time
from etc.config import Config
from comm.funcs import Funcs
from comm.log import logger

logs = []
cachelen = Config.CACHE_LEN


def saveToMongo():
    logger.info("saveToMongo task started")
    # Persist the records that flume pushed into kafka to mongo
    while True:
        try:
            for msg in MyKafkaConsumerM.consumer:
                try:
                    # msg is a ConsumerRecord; .value holds the raw payload
                    log = json.loads(msg.value.decode("utf-8"))
                    try:
                        log["status"] = log["response"]["status"]
                        log["time"] = int(time.time())
                        # Drop the bulky fields we do not need to store
                        del log["latencies"]
                        del log["service"]
                        del log["route"]
                        del log["response"]
                        del log["tries"]
                    except Exception as e:
                        print(os.getcwd(), ", delete keys error:", e)
                    logs.append(log)
                except Exception as e:
                    print(os.getcwd(), ", json decode error:", e)
                    continue
                try:
                    # Flush to mongo in batches of cachelen
                    if len(logs) >= cachelen:
                        ServerMongoStore.insertMany(logs)
                        logs.clear()
                except Exception as e:
                    print(os.getcwd(), ", mongo insert error:", e)
                time.sleep(1)
        except Exception as e:
            print(os.getcwd(), ", error:", e)
            continue
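The per-message key-stripping step is easy to unit-test once pulled out into a pure function. A sketch (field names are taken from the loop above; the `now` parameter is a hypothetical addition that makes the timestamp injectable for testing):

```python
import time

DROP_KEYS = ("latencies", "service", "route", "response", "tries")

def trim_log(log, now=None):
    """Flatten response.status, stamp the time, and drop bulky fields,
    mirroring the per-message processing in saveToMongo."""
    log = dict(log)  # do not mutate the caller's dict
    log["status"] = log["response"]["status"]
    log["time"] = int(now if now is not None else time.time())
    for key in DROP_KEYS:
        log.pop(key, None)  # tolerate missing keys instead of raising
    return log

trimmed = trim_log({"response": {"status": 200}, "latencies": {"kong": 1},
                    "route": {}, "service": {}, "tries": [],
                    "client_ip": "1.2.3.4"},
                   now=1570000000)
print(trimmed)  # → {'client_ip': '1.2.3.4', 'status': 200, 'time': 1570000000}
```

Using `dict.pop(key, None)` instead of `del` also removes the need for the broad `except` around the deletions, since a missing key is no longer an error.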