There are many log-collection tools: Logagent, Flume, Logstash, Beats, and so on. So why use Filebeat in the first place? Logstash runs on the JVM and is resource-hungry: starting a single Logstash instance consumes around 500 MB of memory, while Filebeat gets by on roughly 10 MB. In a typical ELK pipeline, Filebeat on each node ships log lines to a Kafka message queue; a Logstash cluster reads from the queue and filters events according to its config files; the filtered events are then written to Elasticsearch and visualized in Kibana.
1. Download the package
https://www.elastic.co/cn/downloads/beats/filebeat
2. Install
Extract: tar -zxvf filebeat-5.5.2-linux-x86_64.tar.gz
Create a symlink: ln -s filebeat-5.5.2-linux-x86_64 filebeat
3. Configuration file (outputs to Elasticsearch, Logstash, and Kafka at the same time)
vi filebeat.yml
filebeat.prospectors:
- input_type: log
  paths:
    - /home/pgxl/elk/a.txt

#----------------- output.elasticsearch -----------------------
output.elasticsearch:
  hosts: ["localhost:9200"]
  protocol: "http"
  index: "stat_filebeat"
  # template.name: "stat_filebeat"
  # template.path: "filebeat.template.json"
  # template.overwrite: false

#----------------- output.logstash ----------------------------
output.logstash:
  hosts: ["localhost:5044"]

#----------------- output.kafka -------------------------------
output.kafka:
  enabled: true
  hosts: ["192.168.10.1:9092", "192.168.10.2:9092", "192.168.10.3:9092"]
  topic: "liuzc_test"
  partition.round_robin:
    reachable_only: true
  worker: 2
  required_acks: 1
  compression: gzip
  max_message_bytes: 10000000
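Before starting, it is worth validating the file. Assuming the Filebeat 5.x binary, it ships a config-test mode (a quick syntax sanity check only; it does not verify that the outputs are reachable):

```shell
# Validate filebeat.yml without shipping any data (Filebeat 5.x flag)
./filebeat -configtest -c filebeat.yml
```

Note that running several outputs at once, as above, works on the 5.x line; from Filebeat 6.0 onward only a single output may be enabled at a time.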
4. Start Filebeat
./filebeat -e -c filebeat.yml
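The -e flag logs to stderr and keeps Filebeat in the foreground, which is handy while debugging. For anything longer-lived you would typically detach it; a minimal sketch (the log and PID file paths here are my own assumptions, not from the original setup):

```shell
# Run Filebeat in the background, redirecting its output to a file
nohup ./filebeat -c filebeat.yml > /home/pgxl/elk/filebeat.out 2>&1 &
echo $! > /home/pgxl/elk/filebeat.pid   # remember the PID for a later kill
```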
5. Logstash config for listening to Filebeat (it only prints the received data to the console; nothing is written to any other component)
input {
  beats {
    port => 5044
  }
}
output {
  stdout { codec => rubydebug }
}
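Assuming the config above is saved as beats.conf in the Logstash directory, it can be syntax-checked and then started like this:

```shell
# Check the config syntax first, then start listening on port 5044
bin/logstash -f beats.conf --configtest
bin/logstash -f beats.conf
```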
6. View the results in Kibana
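Independently of Kibana, you can confirm that events reached the stat_filebeat index directly over the Elasticsearch REST API:

```shell
# Count documents in the index, then fetch one sample event
curl -s 'localhost:9200/stat_filebeat/_count?pretty'
curl -s 'localhost:9200/stat_filebeat/_search?size=1&pretty'
```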
7. Pitfalls encountered
Starting Logstash with the config above unexpectedly failed.
The error reported is:
Couldn't find any input plugin named 'beats'. Are you sure this is correct? Trying to load the beats input plugin resulted in this error: no such file to load -- logstash/inputs/beats
In other words, the logstash-input-beats plugin is missing, so let's list the installed plugins:
[pgxl@lx33 logstash]$ bin/plugin list
logstash-codec-collectd
logstash-codec-dots
logstash-codec-edn
logstash-codec-edn_lines
logstash-codec-es_bulk
logstash-codec-fluent
logstash-codec-graphite
logstash-codec-json
logstash-codec-json_lines
logstash-codec-line
logstash-codec-msgpack
logstash-codec-multiline
logstash-codec-netflow
logstash-codec-oldlogstashjson
logstash-codec-plain
logstash-codec-rubydebug
logstash-filter-anonymize
logstash-filter-checksum
logstash-filter-clone
logstash-filter-csv
logstash-filter-date
logstash-filter-dns
logstash-filter-drop
logstash-filter-fingerprint
logstash-filter-geoip
logstash-filter-greenline
logstash-filter-grok
logstash-filter-json
logstash-filter-kv
logstash-filter-metrics
logstash-filter-multiline
logstash-filter-mutate
logstash-filter-ruby
logstash-filter-sleep
logstash-filter-split
logstash-filter-syslog_pri
logstash-filter-throttle
logstash-filter-urldecode
logstash-filter-useragent
logstash-filter-uuid
logstash-filter-xml
logstash-input-couchdb_changes
logstash-input-elasticsearch
logstash-input-eventlog
logstash-input-exec
logstash-input-file
logstash-input-ganglia
logstash-input-gelf
logstash-input-generator
logstash-input-graphite
logstash-input-heartbeat
logstash-input-http
logstash-input-imap
logstash-input-irc
logstash-input-kafka
logstash-input-log4j
logstash-input-lumberjack
logstash-input-pipe
logstash-input-rabbitmq
logstash-input-redis
logstash-input-s3
logstash-input-snmptrap
logstash-input-sqs
logstash-input-stdin
logstash-input-syslog
logstash-input-tcp
logstash-input-twitter
logstash-input-udp
logstash-input-unix
logstash-input-xmpp
logstash-input-zeromq
logstash-output-cloudwatch
logstash-output-csv
logstash-output-elasticsearch
logstash-output-elasticsearch_http
logstash-output-email
logstash-output-exec
logstash-output-file
logstash-output-ganglia
logstash-output-gelf
logstash-output-graphite
logstash-output-hipchat
logstash-output-http
logstash-output-irc
logstash-output-juggernaut
logstash-output-kafka
logstash-output-lumberjack
logstash-output-nagios
logstash-output-nagios_nsca
logstash-output-null
logstash-output-opentsdb
logstash-output-pagerduty
logstash-output-pipe
logstash-output-rabbitmq
logstash-output-redis
logstash-output-s3
logstash-output-sns
logstash-output-sqs
logstash-output-statsd
logstash-output-stdout
logstash-output-tcp
logstash-output-udp
logstash-output-xmpp
logstash-output-zeromq
logstash-patterns-core
Sure enough, logstash-input-beats is not there, so let's install it ourselves:
bin/plugin install logstash-input-beats
Validating logstash-input-beats
Unable to download data from https://rubygems.org/ - Received fatal alert: protocol_version (https://api.rubygems.org/latest_specs.4.8.gz)
ERROR: Installation aborted, verification failed for logstash-input-beats
Verification fails, so let's skip it by adding the --no-verify flag:
bin/plugin install --no-verify logstash-input-beats
Yet the installation still fails, and at that point I couldn't figure out how to solve it. Since our production Logstash is version 1.5.3, I suspected the version was simply too old, so I downloaded a somewhat newer Logstash, 2.3.0. With the exact same config file, it started without any problem.
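A quick way to confirm this on the new install is to check the version and whether the beats input is bundled (it appears to be, given that the same config started cleanly; note the plugin tool is still bin/plugin on the 2.x line and was renamed to bin/logstash-plugin in 5.x):

```shell
bin/logstash --version        # should report logstash 2.3.0
bin/plugin list | grep beats  # logstash-input-beats should now be listed
```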
Result:
The data Filebeat ships to Logstash is collected as expected; the pipeline checks out.
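The Kafka leg can be verified the same way, by attaching a console consumer to the liuzc_test topic on one of the brokers (the flag below assumes a Kafka version new enough to take --bootstrap-server; older releases use --zookeeper instead):

```shell
# Tail the topic Filebeat writes to; each line is one JSON event
bin/kafka-console-consumer.sh --bootstrap-server 192.168.10.1:9092 \
  --topic liuzc_test --from-beginning
```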