Spring Boot 2.x log collection with ELK (7.6.2) + RabbitMQ (3.7.16) + Erlang (21.0.1)
Project repository: sb-elk-rabbitmq

Components:
- rabbitmq-3.7.6 — download the matching RabbitMQ and Erlang versions from the official site (see the reference link)
- elasticsearch-7.6.2 — download and unzip; cluster setup is optional reading
- elasticsearch-head — download and unzip into the appropriate directory (explained below)
- kibana-7.6.2 — download and unzip
- logstash-7.6.2 — download and unzip

Place the components above in an elk folder on your system.
1. Configure Elasticsearch
Open elasticsearch.yml under the Elasticsearch config directory and add the following settings:
# ----------------------------------- Paths ------------------------------------
#
# Path to directory where to store the data (separate multiple locations by comma):
# Elasticsearch data directory
path.data: D:\dev\devsoft\elk\data
#
# Path to log files:
# Elasticsearch log directory
path.logs: D:\dev\devsoft\elk\logs
# ---------------------------------- Network -----------------------------------
#
# Set the bind address to a specific IP (IPv4 or IPv6):
#
#network.host: 192.168.0.1
#
# Set a custom port for HTTP:
#
#http.port: 9200
# The following two settings enable CORS so the elasticsearch-head plugin can reach Elasticsearch
http.cors.enabled: true
http.cors.allow-origin: "*"
Double-click elasticsearch.bat in the Elasticsearch bin directory to start it.
Once it is up, visit: http://localhost:9200
2. Configure the elasticsearch-head plugin
Open a command window in the elasticsearch-head directory and run the following commands (requires Node.js/npm):
> npm install
# start elasticsearch-head
> npm run start
Then visit: http://localhost:9100
3. Configure Kibana
Open kibana.yml under Kibana's config directory and add:
# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601
# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: "localhost"
# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://localhost:9200"]
# Specifies locale to be used for all localizable strings, dates and number formats.
# Supported languages are the following: English - en , by default , Chinese - zh-CN .
#i18n.locale: "en"
i18n.locale: "zh-CN"
Double-click kibana.bat in the bin directory to start Kibana.
Then visit: http://localhost:5601
4. Connect Logstash to RabbitMQ
Official documentation: see the Logstash rabbitmq input plugin reference.
Create a file rabbitmq-log.conf in Logstash's bin directory with the configuration below; all available options are described in the official documentation above.
input {
  # input section; here we read from RabbitMQ
  rabbitmq {
    # rabbitmq host
    host => "127.0.0.1"
    # rabbitmq port
    port => 5672
    # codec of the incoming data
    codec => "json"
    # rabbitmq exchange
    exchange => "ex_es_logstash"
    # queue to consume from; when set, key is optional
    queue => "es-log-queue"
    # routing key to listen on
    key => "elk-es-log"
    # whether the queue is durable
    durable => true
    # type is free-form and can be used as a tag
    type => "es"
  }
}
filter {
  # filtering, optional
  # since the input codec is json, no grok pattern matching is needed
}
output {
  # outside of strings, reference event fields with brackets, e.g. [type]
  if [type] == "es" {
    elasticsearch {
      # elasticsearch address
      hosts => ["127.0.0.1:9200"]
      # build the index name from the event's type field and date
      # inside strings, reference fields with %{}, e.g. %{type}
      index => "es-%{type}_log-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["127.0.0.1:9200"]
      index => "es-%{type}-log-%{+YYYY.MM.dd}"
    }
  }
}
Open a command window in the bin directory and run:
> logstash.bat -f rabbitmq-log.conf
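As a side note on the output section: the index name `es-%{type}_log-%{+YYYY.MM.dd}` is expanded per event, with `%{type}` replaced by the event's type field and `%{+YYYY.MM.dd}` by the event's timestamp, so logs land in one index per day. A minimal sketch of that expansion (the helper class below is illustrative, not part of Logstash):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class IndexNameSketch {
    // Mirrors the Logstash pattern "es-%{type}_log-%{+YYYY.MM.dd}".
    // Logstash's +YYYY.MM.dd date syntax corresponds to java.time's
    // "yyyy.MM.dd" (calendar year, zero-padded month and day).
    static String indexFor(String type, LocalDate date) {
        return "es-" + type + "_log-"
                + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        // Events tagged type => "es" on 2020-05-01 go to a daily index:
        System.out.println(indexFor("es", LocalDate.of(2020, 5, 1))); // es-es_log-2020.05.01
    }
}
```

This is why searching in Kibana typically uses an index pattern such as `es-*`, which matches every daily index.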
5. Ship application logs to RabbitMQ
Add the RabbitMQ dependency:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-amqp</artifactId>
    <version>2.2.1.RELEASE</version>
</dependency>
Add the following to the project's log configuration file, logback-spring.xml:
<appender name="AMQP" class="org.springframework.amqp.rabbit.logback.AmqpAppender">
    <layout>
        <pattern>
            {
            "time": "%date{ISO8601}",
            "thread": "%thread",
            "level": "%level",
            "class": "%logger{60}",
            "message": "%msg"
            }
        </pattern>
    </layout>
    <!-- rabbitmq host -->
    <host>127.0.0.1</host>
    <!-- port -->
    <port>5672</port>
    <!-- username -->
    <username>admin</username>
    <!-- password -->
    <password>root</password>
    <!-- application name -->
    <applicationId>byterun-es-service</applicationId>
    <!-- routing key; must match the key in logstash's rabbitmq-log.conf -->
    <routingKeyPattern>elk-es-log</routingKeyPattern>
    <declareExchange>true</declareExchange>
    <exchangeType>direct</exchangeType>
    <!-- exchange name; must match the exchange in logstash's rabbitmq-log.conf -->
    <exchangeName>ex_es_logstash</exchangeName>
    <generateId>true</generateId>
    <charset>UTF-8</charset>
    <durable>true</durable>
    <deliveryMode>PERSISTENT</deliveryMode>
</appender>
<!-- ship INFO logs to rabbitmq -->
<root level="INFO">
    <appender-ref ref="AMQP"/>
</root>
<!-- ship the project's DEBUG logs to rabbitmq -->
<logger name="com.xxx" level="DEBUG" additivity="false">
    <appender-ref ref="AMQP"/>
</logger>
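The layout pattern above serializes each log event into a JSON string before it is published, which is why the Logstash input can use `codec => "json"` with no grok filter. A rough sketch of what one message body on the queue looks like (the field values below are illustrative, not from a real event):

```java
public class AmqpPayloadSketch {
    // Approximates the JSON body the AmqpAppender layout produces for a
    // single log event; real messages substitute the event's actual
    // timestamp, thread, level, logger name, and message text.
    static String payload(String time, String thread, String level,
                          String logger, String message) {
        return String.format(
                "{\"time\": \"%s\", \"thread\": \"%s\", \"level\": \"%s\", "
                        + "\"class\": \"%s\", \"message\": \"%s\"}",
                time, thread, level, logger, message);
    }

    public static void main(String[] args) {
        System.out.println(payload("2020-05-01 12:00:00,000", "main", "INFO",
                "com.es.DemoService", "service started"));
    }
}
```

Logstash parses these keys into top-level event fields, so they show up as searchable columns (time, thread, level, class, message) in Kibana.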
For the full set of supported settings, see the source of org.springframework.amqp.rabbit.logback.AmqpAppender.
Full configuration:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- console output -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <Pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</Pattern>
            <charset>UTF-8</charset>
        </encoder>
    </appender>
    <!-- ch.qos.logback.core.rolling.RollingFileAppender: rolling file output -->
    <appender name="FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/byterun-es-service.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/byterun-es-service.%d{yyyy-MM-dd}-%i.log</fileNamePattern>
            <maxHistory>10</maxHistory>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <MaxFileSize>30MB</MaxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="FILE-ERROR" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/byterun-es-service.err</file>
        <filter class="ch.qos.logback.classic.filter.LevelFilter">
            <level>ERROR</level>
            <onMatch>ACCEPT</onMatch>
            <onMismatch>DENY</onMismatch>
        </filter>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/byterun-es-service.%d{yyyy-MM-dd}-%i.err</fileNamePattern>
            <maxHistory>10</maxHistory>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <MaxFileSize>30MB</MaxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>
    <appender name="AMQP" class="org.springframework.amqp.rabbit.logback.AmqpAppender">
        <layout>
            <pattern>
                {
                "time": "%date{ISO8601}",
                "thread": "%thread",
                "level": "%level",
                "class": "%logger{60}",
                "message": "%msg"
                }
            </pattern>
        </layout>
        <host>127.0.0.1</host>
        <port>5672</port>
        <username>admin</username>
        <password>root</password>
        <applicationId>byterun-es-service</applicationId>
        <routingKeyPattern>elk-es-log</routingKeyPattern>
        <declareExchange>true</declareExchange>
        <exchangeType>direct</exchangeType>
        <exchangeName>ex_es_logstash</exchangeName>
        <generateId>true</generateId>
        <charset>UTF-8</charset>
        <durable>true</durable>
        <deliveryMode>PERSISTENT</deliveryMode>
    </appender>
    <root level="INFO">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="FILE"/>
        <appender-ref ref="FILE-ERROR"/>
        <appender-ref ref="AMQP"/>
    </root>
    <logger name="com.es" level="DEBUG" additivity="false">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="FILE"/>
        <appender-ref ref="FILE-ERROR"/>
        <appender-ref ref="AMQP"/>
    </logger>
    <appender name="accessLog" class="ch.qos.logback.core.FileAppender">
        <file>logs/access_log.log</file>
        <encoder>
            <pattern>%msg%n</pattern>
        </encoder>
    </appender>
    <appender name="async" class="ch.qos.logback.classic.AsyncAppender">
        <appender-ref ref="accessLog"/>
    </appender>
    <logger name="org.springframework.jdbc.core" level="DEBUG" additivity="false">
        <appender-ref ref="STDOUT"/>
        <appender-ref ref="FILE"/>
    </logger>
    <appender name="eventTrackLog" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/byterun-es-service-event-track.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>logs/byterun-es-service-event-track.%d{yyyy-MM-dd}-%i.log</fileNamePattern>
            <maxHistory>10</maxHistory>
            <timeBasedFileNamingAndTriggeringPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <MaxFileSize>30MB</MaxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} %msg%n</pattern>
        </encoder>
    </appender>
    <logger name="service-event-track" level="INFO" additivity="false">
        <appender-ref ref="eventTrackLog"/>
    </logger>
</configuration>
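Note how the appender and the Logstash input line up: the appender declares a direct exchange `ex_es_logstash` and publishes with routing key `elk-es-log`, while Logstash binds `es-log-queue` to that same exchange with that same key. With a direct exchange, a message reaches a queue only when its routing key exactly matches a binding key. A toy model of that rule (queue and key names come from the configs above; the class itself is illustrative, not RabbitMQ API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DirectExchangeSketch {
    // Toy model of RabbitMQ direct-exchange routing: bindingKey -> queues,
    // with delivery only on an exact routing-key match.
    private final Map<String, List<String>> bindings = new HashMap<>();

    void bind(String queue, String bindingKey) {
        bindings.computeIfAbsent(bindingKey, k -> new ArrayList<>()).add(queue);
    }

    List<String> route(String routingKey) {
        return bindings.getOrDefault(routingKey, List.of());
    }

    public static void main(String[] args) {
        DirectExchangeSketch exchange = new DirectExchangeSketch();
        // Logstash's rabbitmq input binds its queue with key => "elk-es-log"
        exchange.bind("es-log-queue", "elk-es-log");

        // The appender's <routingKeyPattern>elk-es-log</routingKeyPattern> matches...
        System.out.println(exchange.route("elk-es-log")); // [es-log-queue]
        // ...while any other routing key is dropped by the exchange
        System.out.println(exchange.route("other-key"));  // []
    }
}
```

If the two keys or the exchange names drift apart, logs are published but silently never reach Logstash, which is the most common failure mode in this setup.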
Start-up order: elasticsearch => elasticsearch-head => kibana => logstash => application.
Hit the application, and its logs will appear in Kibana.