Log monitoring: the application writes its logs to a designated directory, Filebeat ships the local log files to the target Kafka cluster, and log analysis is then performed downstream.
1. Install Filebeat on the server (e.g. filebeat-7.0.1-x86_64.rpm);
2. Edit /etc/filebeat/filebeat.yml. Example filebeat.yml:
#=========================== Filebeat inputs =============================
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /home/app/logs/data.log   # change to your application's log file path
  multiline.pattern: '^20'
  multiline.negate: true
  multiline.match: after
  tail_files: true
#============================= Filebeat modules ===============================
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 3
#================================ Outputs =====================================
#-------------------------- kafka output ------------------------------
output.kafka:
  enabled: true
  hosts: ["10.3.7.34:9092","10.3.7.35:9092","10.3.7.36:9092"]   # change to your Kafka broker addresses
  topic: "data"   # change to the target Kafka topic
#================================ Processors =====================================
processors:
  - drop_fields:
      fields: ["@timestamp","beat","input","offset","source","@metadata","host","prospector"]
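Steps 1 and 2 might look like the following on the shell; the rpm filename follows the example above, and `filebeat test config` / `filebeat test output` are the 7.x built-in checks for the config file and the Kafka connection:

```shell
# Step 1: install the downloaded rpm (filename from the example above)
sudo rpm -ivh filebeat-7.0.1-x86_64.rpm

# Step 2: after editing /etc/filebeat/filebeat.yml, validate it,
# check connectivity to the Kafka brokers, then start the service
sudo filebeat test config -c /etc/filebeat/filebeat.yml
sudo filebeat test output -c /etc/filebeat/filebeat.yml
sudo systemctl enable --now filebeat
```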
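The three multiline settings above merge stack traces into a single event: with `negate: true` and `match: after`, any line that does not match `'^20'` (i.e. does not start with a "20xx-…" timestamp) is appended to the previous event. A simplified Python sketch of that grouping logic (not Filebeat's actual implementation):

```python
import re

def group_multiline(lines, pattern=r'^20'):
    """Mimic multiline.pattern='^20', negate=true, match=after:
    a line that does NOT start a new timestamped record is appended
    to the previous event (e.g. a Java stack trace)."""
    events = []
    for line in lines:
        if re.match(pattern, line) or not events:
            events.append(line)            # starts a new event
        else:
            events[-1] += "\n" + line      # continuation line
    return events

log_lines = [
    "2023-05-01 12:00:00 INFO start",
    "2023-05-01 12:00:01 ERROR boom",
    "java.lang.NullPointerException",
    "    at com.example.App.main(App.java:10)",
]
# two events: the INFO line, and the ERROR line with its stack trace attached
print(group_multiline(log_lines))
```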