Configuring Logstash with Filebeat

In the post Configuring ELK stack to analyse Apache Tomcat logs we configured Logstash to pull data from a directory, whereas in this post we will configure Filebeat to push data to Logstash. Before configuring, let's briefly look at why we need Filebeat.

Why Filebeat?
Filebeat helps decouple the servers where logs are generated from the server where logs are processed, spreading the load that would otherwise fall on a single machine.

Now, let's start with our configuration, following the steps below:

Step 1: Download and extract Filebeat into any directory; for me it's filebeat under the directory /Users/ArpitAggarwal/, as follows:

$ mkdir filebeat
$ cd filebeat
$ wget https://download.elastic.co/beats/filebeat/filebeat-1.0.0-darwin.tgz
$ tar -xvzf filebeat-1.0.0-darwin.tgz

Step 2: Replace the content of filebeat.yml inside the directory /Users/ArpitAggarwal/filebeat/filebeat-1.0.0-darwin/ with the content below:

filebeat:
  prospectors:
    -
      paths:
        - /Users/ArpitAggarwal/tomcat/logs/*.log*
      input_type: log
      document_type: my_log
output:
  logstash:
    hosts: ["localhost:5000"]
  console:
    pretty: true
shipper:
logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

The paths setting specified above is the location from which data is pulled.
The document_type setting specified above is published in the 'type' field of each event, which the Logstash configuration matches on.
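
Since prospectors is a list, Filebeat can also ship logs from more than one application at once, each with its own document_type. A sketch extending the configuration above (the second path and document_type here are hypothetical):

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /Users/ArpitAggarwal/tomcat/logs/*.log*
      input_type: log
      document_type: my_log
    -
      # Hypothetical second prospector for another application's logs
      paths:
        - /var/log/myapp/*.log
      input_type: log
      document_type: myapp_log
```

Each event then carries the document_type of the prospector that read it, so the Logstash filter can branch on [type] per application.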

Step 3: Start Filebeat as a background process, as follows:

$ cd filebeat/filebeat-1.0.0-darwin
$ ./filebeat -c filebeat.yml &
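
The trailing & puts the command in the background; the shell variable $! then holds its process ID, which is handy for checking on or stopping Filebeat later. A minimal sketch of that pattern, using sleep as a stand-in for ./filebeat -c filebeat.yml:

```shell
# 'sleep 300' stands in here for './filebeat -c filebeat.yml'
sleep 300 &
FILEBEAT_PID=$!

# $! holds the PID of the most recent background job; use it to check the process...
ps -p "$FILEBEAT_PID" > /dev/null && echo "running as PID $FILEBEAT_PID"

# ...and to stop it cleanly when you are done
kill "$FILEBEAT_PID"
```

The same check works on both macOS and Linux, since ps -p is POSIX.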

Step 4: Configure Logstash to receive data from Filebeat and output it to Elasticsearch running on localhost. To do so, create a directory for our Logstash configuration file; for me it's logstash, created under the directory /Users/ArpitAggarwal/, as follows:

$ cd /Users/ArpitAggarwal/
$ mkdir logstash patterns
$ cd logstash
$ touch logstash.conf
$ cd ../patterns
$ touch grok-patterns.txt

Copy the content below to logstash.conf:

input {
  beats {
    type => "beats"
    port => 5000
  }
}

filter {
  multiline {
    patterns_dir => "/Users/ArpitAggarwal/logstash/patterns"
    pattern => "\[%{TOMCAT_DATESTAMP}"
    what => "previous"
  }

  if [type] == "my_log" and "com.test.controller.log.LogController" in [message] {
    mutate {
      add_tag => [ "MY_LOG" ]
    }
    if "_grokparsefailure" in [tags] {
      drop { }
    }
    date {
      match => [ "timestamp", "UNIX_MS" ]
      target => "@timestamp"
    }
  } else {
    drop { }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  if [type] == "my_log" {
    elasticsearch {
      manage_template => false
      hosts => ["localhost:9201"]
    }
  }
}
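
One caveat: the filter above drops events tagged _grokparsefailure, but that tag is only ever added by a grok filter, which this configuration does not include. If you also want to parse fields out of the Tomcat log lines, a grok filter would typically sit between the multiline and mutate filters; the sketch below is illustrative only, and the pattern should be adjusted to your actual log format:

```
filter {
  # Illustrative only: extract timestamp, level and message from a Tomcat-style line
  grok {
    patterns_dir => "/Users/ArpitAggarwal/logstash/patterns"
    match => { "message" => "\[%{TOMCAT_DATESTAMP:timestamp}\] %{LOGLEVEL:level} %{GREEDYDATA:logmessage}" }
  }
}
```

With a grok filter in place, the _grokparsefailure check then discards any lines that did not match the pattern.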

Next, copy the contents of the file https://github.com/elastic/logstash/blob/v1.2.2/patterns/grok-patterns into patterns/grok-patterns.txt.
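
Note that the multiline filter in logstash.conf references a TOMCAT_DATESTAMP pattern, which the stock grok-patterns file does not necessarily define. If Logstash complains about an undefined pattern, append a definition to patterns/grok-patterns.txt; the one below is an illustrative sketch matching Tomcat's default yyyy-MM-dd HH:mm:ss,SSS timestamps, so adjust it to your own log format:

```
# Illustrative: matches timestamps like 2015-12-01 10:15:30,123
TOMCAT_DATESTAMP %{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}
```

The built-in SECOND pattern already allows a comma-separated fractional part, so it covers the ,SSS milliseconds.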

Step 5: Download and extract Logstash into any directory; for me it's logstash-installation under the directory /Users/ArpitAggarwal/, as follows:

$ cd /Users/ArpitAggarwal/
$ mkdir logstash-installation
$ cd logstash-installation
$ wget https://download.elastic.co/logstash/logstash/logstash-2.1.0.zip
$ unzip logstash-2.1.0.zip

Step 6: Validate the Logstash configuration file using the command below:

$ cd /Users/ArpitAggarwal/logstash-installation/logstash-2.1.0/bin
$ ./logstash -f /Users/ArpitAggarwal/logstash/logstash.conf --configtest --verbose --debug

Step 7: Install the logstash-input-beats plugin and start Logstash as a background process to push the data received from Filebeat to Elasticsearch, as follows:

$ cd /Users/ArpitAggarwal/logstash-installation/logstash-2.1.0/bin
$ ./plugin install logstash-input-beats
$ ./logstash -f /Users/ArpitAggarwal/logstash/logstash.conf &

6 thoughts on “Configuring Logstash with Filebeat”

  1. Landed on this page searching for any existing filebeat implementations with weblogic.
    I’m still new to the ELK stack so this question might be naive.
    I believe it is possible to send weblogic logs messages via filebeat directly to elasticsearch, but is that a good approach?
    Logstash seems to have plugins for weblogic which can help in log enrichment, field creation, tagging etc
    so what in your opinion is a better idea?
    filebeat -> logstash -> (optional redis)-> elasticsearch -> kibana?

    or

    filebeat -> elasticsearch -> kibana ?


  2. filebeat -> logstash -> (optional redis) -> elasticsearch -> kibana is a better option, I believe, than sending logs directly from Filebeat to Elasticsearch: Logstash in between acts as an ETL layer, giving you the advantage of receiving data from multiple input sources, applying filter operations to the input data, and outputting the processed data to multiple streams.


  3. Do you know how I can send a custom message from Filebeat to Logstash if I don’t receive any input in Filebeat for 5 minutes?


  4. Hi,

    I followed the same steps to configure this on Windows, but the logs are still not being parsed into Elasticsearch (on another server). Please find below both the Logstash and Filebeat configurations I have set up.

    Filebeat:

    filebeat:
      prospectors:
        -
          paths:
            - C:/inetpub/logs/LogFiles/*/*
          input_type: log
          document_type: my_log
    output:
      logstash:
        hosts: ["localhost:5044"]
      console:
        pretty: true
    shipper:
    logging:
      files:
        rotateeverybytes: 10485760 # = 10MB

    Logstash:

    input {
      beats {
        type => "beats"
        port => 5044
      }
    }
    output {
      stdout {
        codec => rubydebug
      }
      if [type] == "my_log" {
        elasticsearch {
          manage_template => false
          hosts => ["xxx.xx.xx.xxx:9200"]
        }
      }
    }

    Please let me know the issue

    Thanks.


  5. Hi Mr Arpit,
    Thank you for this great tutorial,
    Can you please help me with this http://stackoverflow.com/questions/38150042/parsing-xml-data-from-filebeat-using-logstash

    I am trying to parse and store XML documents in a Windows environment.
    First I used Filebeat to parse XML documents and send them to Logstash for further parsing using XPATH filter and sending them later to Elasticsearch as JSON documents.

    Apparently, XPATH Logstash filter failed to parse XML received from Filebeat.

