Logstash-logback-input

ELK is arguably one of the best systems available today for aggregating, analyzing, searching, and reporting on logs from a distributed server cluster. Spring Boot, as a framework built for microservices, inevitably has to deal with distributed logging, and shipping logs through an ELK stack is a natural fit. In this stack, E is Elasticsearch, which stores the logs; L is Logstash, which collects logs and writes them into Elasticsearch; and K is Kibana, which visualizes, analyzes, and searches the log data held in Elasticsearch. Integrating Spring Boot with ELK therefore largely comes down to integrating Spring Boot with Logstash. So how do we wire Spring Boot up to Logstash?

In Spring Boot, logback is the default logging implementation. Like other logging frameworks such as log4j, logback can ship log data to a remote host over a socket, given an IP address and port. In Spring Boot, logback is usually configured through a logback-spring.xml file.

To integrate logback with Logstash, you must add the logstash-logback-encoder dependency:

<dependency>
      <groupId>net.logstash.logback</groupId>
      <artifactId>logstash-logback-encoder</artifactId>
      <version>5.3</version>
</dependency>

With the dependency in place, configure logback. logback-spring.xml looks like this:

<?xml version="1.0" encoding="utf-8" ?>
<!-- This configuration writes log entries of different levels to different destinations -->
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <springProperty scope="context" name="springAppName" source="spring.application.name"/>

    <!-- Output location of log files within the project -->
    <property name="LOG_FILE" value="${BUILD_FOLDER:-build}/${springAppName}"/>

    <!-- Console log output pattern -->
    <property name="CONSOLE_LOG_PATTERN"
              value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"/>

    <!-- Console output -->
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
        <!-- Log output encoding -->
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

    <!-- Appender that ships logs to Logstash -->
    <appender name="logstash" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>192.168.1.111:8081</destination>
        <encoder charset="UTF-8" class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <!-- Log output level -->
    <root level="INFO">
        <appender-ref ref="console"/>
        <appender-ref ref="logstash"/>
    </root>
</configuration>

I won't dwell on basic logback configuration here, since it is well documented elsewhere; we only discuss the parts related to Logstash, namely the logstash appender in the XML above. destination is the address logs are sent to; when configuring Logstash, simply listen on this address to collect them. Here I deployed Logstash on a local virtual machine (at 192.168.1.111) on port 8081; you will of course need to change the address to your own. The encoder element below it is mandatory. The console appender is kept so that Spring Boot's original log output is not lost. That completes the logback configuration.

On the Logstash side, create a folder named conf under the installation directory, and inside it a file logstash.conf with the following contents:

# Logstash configuration
# TCP -> Logstash -> Elasticsearch pipeline.

input {
  tcp {
    mode => "server"
    host => "192.168.1.111"  # prefer an IP address here
    port => 8081             # read logs from port 8081
    codec => json_lines      # requires the logstash-codec-json_lines plugin
  }
}

output {
  elasticsearch {
    hosts => ["http://192.168.1.111:9200"]  # ship to Elasticsearch
    index => "logstash-%{+YYYY.MM.dd}"
  }
  stdout {                                  # remove this block if you don't need console output
    codec => rubydebug
  }
}
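Note that the index option creates one Elasticsearch index per day, which keeps indices small and easy to expire. As a rough illustration of the naming convention (this is not Logstash code, just the same date pattern expressed with java.time):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class LogstashIndexName {
    // java.time equivalent of Logstash's "logstash-%{+YYYY.MM.dd}" daily index name
    static String indexFor(LocalDate date) {
        return "logstash-" + date.format(DateTimeFormatter.ofPattern("yyyy.MM.dd"));
    }

    public static void main(String[] args) {
        System.out.println(indexFor(LocalDate.of(2019, 3, 6))); // logstash-2019.03.06
    }
}
```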

If your Logstash installation does not have the logstash-codec-json_lines plugin, install it with the following commands:

[root@ecs-55e5 ~]# cd /usr/share/logstash/
[root@ecs-55e5 logstash]# ls
bin  CONTRIBUTORS  data  Gemfile  Gemfile.lock  lib  LICENSE.txt  logstash-core  logstash-core-plugin-api  modules  NOTICE.TXT  tools  vendor  x-pack
[root@ecs-55e5 logstash]# cd bin
[root@ecs-55e5 bin]# ./logstash-plugin install logstash-codec-json_lines
Validating logstash-codec-json_lines
Installing logstash-codec-json_lines
Installation successful
[root@ecs-55e5 bin]# 

Start Logstash so that it listens on port 8081 for incoming logs:

[root@ecs-55e5 logstash]# logstash -f logstash.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2019-03-06 14:38:50.990 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2019-03-06 14:38:51.007 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.5.4"}
[INFO ] 2019-03-06 14:38:54.639 [Converge PipelineAction::Create<main>] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2019-03-06 14:38:55.095 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://192.168.1.111:9200/]}}
[WARN ] 2019-03-06 14:38:55.284 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://192.168.1.111:9200/"}
[INFO ] 2019-03-06 14:38:55.549 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2019-03-06 14:38:55.553 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2019-03-06 14:38:55.577 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://192.168.1.111:9200"]}
[INFO ] 2019-03-06 14:38:55.591 [Ruby-0-Thread-5: :1] elasticsearch - Using mapping template from {:path=>nil}
[INFO ] 2019-03-06 14:38:55.608 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[INFO ] 2019-03-06 14:38:55.644 [[main]>worker7] tcp - Starting tcp input listener {:address=>"localhost:8081", :ssl_enable=>"false"}
[INFO ] 2019-03-06 14:38:55.917 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x67d08165 run>"}
[INFO ] 2019-03-06 14:38:55.952 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2019-03-06 14:38:56.152 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

Done! The end result looks like this:

{
    "logger_name" => "com.amt.hibei.sysframework.config.SysConfig",
    "thread_name" => "main",
     "@timestamp" => 2019-03-06T07:27:26.348Z,
    "level_value" => 20000,
           "host" => "182.148.112.187",
           "port" => 58138,
          "level" => "INFO",
       "@version" => "1",
        "message" => "============系統參數加載完成!=============="
}
{
    "logger_name" => "com.amt.hibei.client.HibeiGameClientHiApplication",
    "thread_name" => "main",
     "@timestamp" => 2019-03-06T07:27:26.259Z,
    "level_value" => 20000,
           "host" => "182.148.112.187",
           "port" => 58138,
          "level" => "INFO",
       "@version" => "1",
        "message" => "Starting HibeiGameClientHiApplication on Amt-PC with PID 4256 (E:\\IdeWorkspace\\hibeigame\\hibeigame-client-HI\\target\\classes started by amt in E:\\IdeWorkspace\\hibeigame)"
}
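Because the tcp input speaks plain json_lines, you can also sanity-check the pipeline without Spring Boot by opening a socket and writing one JSON object per line. A minimal sketch (the host, port, and field names here are placeholders for your own setup, not anything mandated by Logstash):

```java
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class LogstashProbe {
    // One event in json_lines framing: a single JSON object terminated by '\n'
    static String buildEvent(String level, String message) {
        return "{\"level\":\"" + level + "\",\"message\":\"" + message + "\"}\n";
    }

    public static void main(String[] args) {
        // Replace with your own Logstash address (the one in <destination>)
        try (Socket socket = new Socket("192.168.1.111", 8081);
             Writer out = new OutputStreamWriter(socket.getOutputStream(), StandardCharsets.UTF_8)) {
            out.write(buildEvent("INFO", "hello from probe"));
            out.flush();
        } catch (IOException e) {
            System.out.println("Logstash not reachable: " + e.getMessage());
        }
    }
}
```

If the pipeline is healthy, the event shows up in the rubydebug output on the Logstash console and in the day's Elasticsearch index.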

As a further example, the following configuration (log-bak.xml) uses LoggingEventCompositeJsonEncoder instead of the plain LogstashEncoder to ship a customized set of JSON fields:

<?xml version="1.0" encoding="UTF-8"?>
<!-- This configuration writes log entries of different levels to different destinations -->
<configuration>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <springProperty scope="context" name="springAppName"
                    source="spring.application.name"/>

    <!-- Output location of log files within the project -->
    <property name="LOG_FILE" value="${BUILD_FOLDER:-build}/${springAppName}"/>

    <!-- Console log output pattern -->
    <property name="CONSOLE_LOG_PATTERN"
              value="%clr(%d{yyyy-MM-dd HH:mm:ss.SSS}){faint} %clr(${LOG_LEVEL_PATTERN:-%5p}) %clr(${PID:- }){magenta} %clr(---){faint} %clr([%15.15t]){faint} %m%n${LOG_EXCEPTION_CONVERSION_WORD:-%wEx}"/>

    <!-- Console output -->
    <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
            <level>INFO</level>
        </filter>
        <!-- Log output encoding -->
        <encoder>
            <pattern>${CONSOLE_LOG_PATTERN}</pattern>
            <charset>utf8</charset>
        </encoder>
    </appender>

    <!-- Appender that ships JSON-formatted logs to Logstash -->
    <appender name="logstash"
              class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>192.168.11.86:9250</destination>
        <!-- Log output encoding: stack_trace carries the exception, line the source line number,
             serviceName the application name taken from the Spring configuration -->
        <encoder
                class="net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder">
            <providers>
                <timestamp>
                    <timeZone>UTC</timeZone>
                </timestamp>
                <pattern>
                    <pattern>
                        {
                        "severity": "%level",
                        "service": "${springAppName:-}",
                        "trace": "%X{X-B3-TraceId:-}",
                        "span": "%X{X-B3-SpanId:-}",
                        "exportable": "%X{X-Span-Export:-}",
                        "pid": "${PID:-}",
                        "thread": "%thread",
                        "class": "%logger{40}",
                        "message": "%message",
                        "timeDate": "%d{yyyy-MM-dd HH:mm:ss.SSS}",
                        "stack_trace": "%exception{5}",
                        "line": "%line",
                        "logLevel": "%level",
                        "serviceName": "${springAppName:-}"
                        }
                    </pattern>
                </pattern>
            </providers>
        </encoder>
    </appender>

    <!-- Log output level -->
    <root level="INFO">
        <appender-ref ref="console"/>
        <appender-ref ref="logstash"/>
    </root>
</configuration>
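The %X{X-B3-TraceId:-} style conversions in the pattern above read from logback's MDC (Mapped Diagnostic Context), a per-thread key/value map that Spring Cloud Sleuth populates with trace identifiers. Conceptually it behaves like the sketch below (a plain ThreadLocal map standing in for org.slf4j.MDC; the key/value data is illustrative only):

```java
import java.util.HashMap;
import java.util.Map;

public class MdcSketch {
    // Stand-in for org.slf4j.MDC: a per-thread map read by %X{key:-default}
    static final ThreadLocal<Map<String, String>> MDC =
            ThreadLocal.withInitial(HashMap::new);

    // Mimics how logback resolves %X{key:-default} in the JSON pattern
    static String resolve(String key, String defaultValue) {
        return MDC.get().getOrDefault(key, defaultValue);
    }

    public static void main(String[] args) {
        MDC.get().put("X-B3-TraceId", "abc123"); // set by Sleuth in a real application
        System.out.println(resolve("X-B3-TraceId", ""));
        System.out.println(resolve("X-B3-SpanId", ""));
    }
}
```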

 
