About ELK:
Elasticsearch is an open-source distributed search engine. Its features include: distribution, zero configuration, automatic discovery, automatic index sharding, an index replica mechanism, a RESTful interface, multiple data sources, and automatic search load balancing.
Logstash is a fully open-source tool that collects and parses your logs and stores them for later use (for example, searching).
Kibana is also an open-source, free tool. It provides a friendly web interface for the log analysis that Logstash and Elasticsearch enable, helping you aggregate, analyze, and search important log data.
Functions:
1. Convenient log querying and statistics for troubleshooting
2. Report display, with no need to log in to each server to view its logs
Components:
Logstash: the logstash server side collects the logs;
Elasticsearch: stores all kinds of logs;
Kibana: a web interface for searching and visualizing logs;
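Since Elasticsearch exposes a RESTful interface, logs stored by Logstash can also be queried directly over HTTP, which is what Kibana does under the hood. A minimal sketch of building such a search request follows; the host and port match the Logstash outputs in the config later in this post, while the concrete date in the index name and the `sshd` program name are made-up illustrative values:

```python
import json

host = "127.0.0.1:9200"        # the same ES host the Logstash outputs point at
index = "syslog-2017.01.01"    # daily index pattern from the config; date is illustrative

# Search the day's syslog index for entries from one program in the last hour
query = {
    "query": {
        "bool": {
            "must": [
                {"match": {"syslog_program": "sshd"}},
                {"range": {"@timestamp": {"gte": "now-1h"}}},
            ]
        }
    },
    "size": 10,
}

url = "http://%s/%s/_search" % (host, index)
body = json.dumps(query)
print(url)
print(body)
```

POSTing `body` to `url` (for example with curl) would return matching documents; the field name `syslog_program` comes from the grok filter shown later.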
Setup and deployment (omitted)
Applications: collecting syslog, nginx access/error logs, MongoDB logs, and application logs. Details as follows:
nginx access log: since the nginx access log format is customizable, it is defined here as JSON, which makes it easier for ES to store and index.
The format is defined as follows:
log_format main_json '{ "timestamp": "$time_local", '
                     '"remote_addr": "$remote_addr", '
                     '"remote_user": "$remote_user", '
                     '"body_bytes_sent": "$body_bytes_sent", '
                     '"request_time": "$request_time", '
                     '"status": "$status", '
                     '"domain": "$host", '
                     '"request": "$request", '
                     '"request_method": "$request_method", '
                     '"http_referrer": "$http_referer", '
                     '"http_x_forwarded_for": "$http_x_forwarded_for", '
                     '"http_user_agent": "$http_user_agent" }';
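A line produced by this format is plain JSON, so downstream tools can parse it without any grok pattern. A quick illustration (the log line and all its values below are made up):

```python
import json

# A hypothetical access-log line emitted by the main_json format above
line = ('{ "timestamp": "07/Jan/2017:12:00:01 +0800", '
        '"remote_addr": "10.0.0.1", '
        '"status": "200", '
        '"request_time": "0.052", '
        '"domain": "example.com", '
        '"request": "GET /index.html HTTP/1.1" }')

entry = json.loads(line)  # valid JSON, parsed directly into fields
print(entry["status"], entry["domain"])
```

One caveat: if a logged variable (e.g. the user agent) contains a double quote, the line is no longer valid JSON; nginx 1.11.8+ supports `log_format ... escape=json` to handle this.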
Collection of the other logs is handled in the configuration file, explained below:
input {
  file {
    path => [ "/var/log/syslog" ]    # log path
    type => "syslog"
    start_position => "beginning"
    ignore_older => 0
  }
  file {
    path => "/var/log/nginx/*access.log"
    codec => json
    start_position => "beginning"
    type => "nginx-acc"
  }
  file {
    path => "/var/log/nginx/*error.log"
    start_position => "beginning"
    type => "nginx-error"
    ignore_older => 0
  }
  file {
    path => [ "/data/mongo/mongo.log" ]
    type => "mongo"
    start_position => "beginning"
    #ignore_older => 0
  }
}

filter {
  if [type] == "syslog" {
    grok {
      # grok parses the raw string into named fields, making the logs searchable
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
  if [type] == "mongo" {
    grok {
      # MongoDB 3.x log format
      match => [ "message", "%{TIMESTAMP_ISO8601:timestamp}\s+%{MONGO3_SEVERITY:severity}\s+%{MONGO3_COMPONENT:component}\s+(?:\[%{DATA:context}\])?\s+%{GREEDYDATA:body}" ]
      # MongoDB 2.x log format
      match => [ "message", "%{SYSLOGTIMESTAMP:timestamp} \[%{WORD:component}\] %{GREEDYDATA:body}" ]
    }
    if [body] =~ "ms$" {
      # slow queries end with their duration in ms; extract db, collection and time
      grok {
        match => [ "body", "query\s+%{WORD:db_name}\.%{WORD:collection_name}.*}.*\}(\s+%{NUMBER:spend_time:int}ms$)?" ]
      }
    }
    date {
      match => [ "timestamp", "UNIX", "YYYY-MM-dd HH:mm:ss", "ISO8601" ]
      remove_field => [ "timestamp" ]
    }
  }
  if [type] == "nginx-error" {
    grok {
      match => { "message" => "(?<timestamp>%{YEAR}[./-]%{MONTHNUM}[./-]%{MONTHDAY}[- ]%{TIME}) \[%{LOGLEVEL:severity}\] %{POSINT:pid}#%{NUMBER}: %{GREEDYDATA:errormessage}(?:, client: (?<client>%{IP}|%{HOSTNAME}))(?:, server: %{IPORHOST:server})(?:, request: %{QS:request})?(?:, upstream: \"%{URI:upstream}\")?(?:, host: %{QS:host})?(?: referrer: \"%{URI:referrer}|-\")?" }
      overwrite => [ "message" ]
    }
    date {
      # the grok above captures the event time into "timestamp"
      match => [ "timestamp", "yyyy/MM/dd HH:mm:ss" ]
      remove_field => [ "timestamp" ]
    }
  }
}

output {
  if [type] == "nginx-acc" {
    elasticsearch {
      # storage
      hosts => ["127.0.0.1:9200"]
      index => "nginx_access-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "nginx-error" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "nginx_error-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "syslog" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
  if [type] == "mongo" {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "mongo-%{+YYYY.MM.dd}"
    }
  }
}
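To make the syslog grok pattern above more concrete, here is a rough Python regex equivalent showing which fields it extracts. The sample log line is made up, and the real grok building blocks (SYSLOGTIMESTAMP, SYSLOGHOST, etc.) are more permissive than these simplified sub-patterns:

```python
import re

# Simplified equivalent of:
# %{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname}
# %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>[^\[:]+)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

line = "Jan  7 12:00:01 web01 sshd[1234]: Accepted publickey for root"
m = SYSLOG_RE.match(line)
print(m.group("syslog_program"), m.group("syslog_pid"))
```

Each named group becomes a separate field in the Elasticsearch document, which is what makes per-field searches like `syslog_program: sshd` possible in Kibana.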
The collected data is displayed as follows: (screenshot omitted)
References:
https://github.com/logstash-plugins/logstash-patterns-core/tree/master/patterns
https://grokdebug.herokuapp.com/
http://logz.io/blog/nginx-log-analysis/