Symptom:
{"index"=>{"_index"=>"product", "_type"=>"_doc", "_id"=>"146", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [create_time] of type [date] in document with id '146'. Preview of field's value: '2022-01-25T09:17:01.000Z'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-01-25T09:17:01.000Z] with format [yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}
Data could not be shipped into ES because of a date-format mismatch: the time fields in the ES index mapping were created with an explicit format, as shown below:
PUT /product
{
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0
  },
  "mappings": {
    "properties": {
      "create_time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      },
      "update_time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}
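The error message shows ES actually trying the format list `yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis`, none of which matches the ISO-8601 value `2022-01-25T09:17:01.000Z`. An alternative fix (a sketch only, not the route taken below) is to widen the mapping's `format` list so ISO-8601 values are accepted too; since `format` cannot be changed on an existing date field, this means recreating the index:

```json
PUT /product
{
  "mappings": {
    "properties": {
      "create_time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||strict_date_optional_time||epoch_millis"
      },
      "update_time": {
        "type": "date",
        "format": "yyyy-MM-dd HH:mm:ss||strict_date_optional_time||epoch_millis"
      }
    }
  }
}
```

`strict_date_optional_time` is Elasticsearch's built-in ISO-8601 format, which accepts values like `2022-01-25T09:17:01.000Z`.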
Root cause:
In the MySQL (jdbc) section of the Logstash pipeline configuration, the original query was:
# SQL to execute
statement => "select * from product where update_time > date_add(:sql_last_value, INTERVAL 8 HOUR)"
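For context, this `statement` lives inside the jdbc input plugin; a minimal sketch (the connection details are placeholders, not from the original setup) looks like:

```
input {
  jdbc {
    # hypothetical connection details
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    # run the query once a minute
    schedule => "* * * * *"
    # :sql_last_value is maintained by the jdbc input between runs
    statement => "select * from product where update_time > date_add(:sql_last_value, INTERVAL 8 HOUR)"
  }
}
```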
The fix was to convert the time fields explicitly in the SQL:
# SQL to execute
statement => "select date_format(create_time,'%Y-%m-%d %H:%i:%s') as create_time, date_format(update_time,'%Y-%m-%d %H:%i:%s') as update_time
from product where update_time > date_add(:sql_last_value, INTERVAL 8 HOUR)"
With `date_format` returning the columns as plain `YYYY-MM-DD HH:MM:SS` strings, the documents now match the mapping's format and are indexed into ES successfully.
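The mismatch can be reproduced outside the stack. A minimal Python sketch (illustrative only, using `strptime` patterns analogous to the Java/MySQL formats) shows that the ISO-8601 value does not match the mapping's `yyyy-MM-dd HH:mm:ss` pattern, and what shape `date_format(..., '%Y-%m-%d %H:%i:%s')` produces:

```python
from datetime import datetime

iso_value = "2022-01-25T09:17:01.000Z"  # what Logstash was sending to ES

# The mapping's pattern (Java "yyyy-MM-dd HH:mm:ss" ~ strptime "%Y-%m-%d %H:%M:%S")
# rejects the ISO-8601 string, just as ES did in the error above:
try:
    datetime.strptime(iso_value, "%Y-%m-%d %H:%M:%S")
except ValueError:
    print("parse failed, as in the ES error")

# MySQL's date_format(create_time, '%Y-%m-%d %H:%i:%s') yields this shape,
# which the mapping accepts:
dt = datetime.strptime(iso_value, "%Y-%m-%dT%H:%M:%S.%fZ")
formatted = dt.strftime("%Y-%m-%d %H:%M:%S")
print(formatted)  # 2022-01-25 09:17:01
```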