Filebeat => Logstash => Elasticsearch and working modules

Setting up Filebeat modules to work when you are using Logstash to send logs over to Elasticsearch.

So I started setting up Filebeat to ship my mysql-slow.log and planned to use the Filebeat MySQL module.

The logs started flowing, and after some time I got them into the correct index. But to my surprise, the logs were not correctly parsed.

The problem is that Filebeat wants to connect directly to Elasticsearch and install an ingest pipeline (a grok parser that runs inside Elasticsearch).

And my setup was

Filebeat -> Logstash -> Elasticsearch
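
For reference, the Filebeat side of that setup might look roughly like this. This is a minimal sketch, assuming the standard modules.d layout and Logstash listening on the default Beats port 5044; the slow log path is an assumption, adjust it to your environment.

        # modules.d/mysql.yml - enable the slowlog fileset
        - module: mysql
          slowlog:
            enabled: true
            var.paths: ["/var/log/mysql/mysql-slow.log"]  # assumed location of the slow log

        # filebeat.yml - ship to Logstash instead of directly to Elasticsearch
        output.logstash:
          hosts: ["localhost:5044"]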

1. Set up Logstash to pick out the Filebeat logs, route them to their own output, and use the ingest pipeline

Here I pick out the Filebeat slow logs and give them the field log_type = filebeat (a sketch of the surrounding config follows the snippet):

        else if [fileset][name] == "slowlog" {
                mutate {
                        add_field => { "log_type" => "filebeat" }
                }
        }
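
For context, this sits in the filter section of a Logstash pipeline that receives events from Filebeat over the Beats input. The else if is there because my real config has other branches before it; as a standalone sketch, assuming the default Beats port, it would look like:

        input {
                beats {
                        port => 5044
                }
        }

        filter {
                if [fileset][name] == "slowlog" {
                        mutate {
                                add_field => { "log_type" => "filebeat" }
                        }
                }
        }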

Then I use that field in my output to send the data to the correct index in Elasticsearch, and also tell Elasticsearch to use the ingest pipeline to parse the incoming data (it runs grok). Filebeat modules set [@metadata][pipeline] on each event, so the sprintf reference below resolves to the right pipeline name.

        else if [log_type] == "filebeat" {
                elasticsearch {
                        hosts    => ["http://localhost:9200"]
                        index    => "filebeat-%{+YYYY.MM.dd}"
                        pipeline => "%{[@metadata][pipeline]}"
                }
        }
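
One caveat: if an event reaches this output without [@metadata][pipeline] set, the sprintf does not resolve, and Elasticsearch rejects the document because no pipeline with that literal name exists. A defensive variant of the output above, just a sketch, checks for the field first:

        else if [log_type] == "filebeat" {
                if [@metadata][pipeline] {
                        elasticsearch {
                                hosts    => ["http://localhost:9200"]
                                index    => "filebeat-%{+YYYY.MM.dd}"
                                pipeline => "%{[@metadata][pipeline]}"
                        }
                } else {
                        # No pipeline metadata: index the event unparsed rather than lose it
                        elasticsearch {
                                hosts => ["http://localhost:9200"]
                                index => "filebeat-%{+YYYY.MM.dd}"
                        }
                }
        }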

Right, so that pipeline script: where is it, and how do I upload it to Elasticsearch?
The file ships with Filebeat, in
/usr/share/filebeat/module/mysql/slowlog/ingest/pipeline.json

So run this curl to load it into Elasticsearch. The pipeline name must match what Filebeat puts in [@metadata][pipeline], which includes the Filebeat version (here 7.6.1):

  curl -X PUT "localhost:9200/_ingest/pipeline/filebeat-7.6.1-mysql-slowlog-pipeline?pretty" \
    -H 'Content-Type: application/json' \
    -d @/usr/share/filebeat/module/mysql/slowlog/ingest/pipeline.json
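
Alternatively, if the machine running Filebeat can reach Elasticsearch directly, even just once for setup, Filebeat can load all module pipelines itself. A sketch, assuming the mysql module is enabled and overriding the outputs on the command line:

  filebeat setup --pipelines --modules mysql \
    -E output.logstash.enabled=false \
    -E 'output.elasticsearch.hosts=["localhost:9200"]'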
  

You can check that the pipeline made it in by listing all ingest pipelines:

  curl -X GET "localhost:9200/_ingest/pipeline/?pretty"

So now restart Filebeat and Logstash, and we have nicely parsed logs from Filebeat flowing through Logstash 🙂
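
On a systemd host (an assumption, adjust to however you run the services), that restart is just:

  sudo systemctl restart filebeat
  sudo systemctl restart logstash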