Running ELK-stack on FreeBSD

This article describes how to install and run the ELK-stack (Elasticsearch, Logstash and Kibana) on FreeBSD.

Background

The ELK-stack (now called the Elastic Stack) is a powerful software stack consisting of Elasticsearch, Logstash and Kibana that can be used to store and search data (Elasticsearch), harvest log files and other metrics (Logstash) and visualise the data (Kibana). The stack is optimized for running on Linux, but ports to FreeBSD have existed for a long time. This article describes the basic steps to get the ELK-stack running on FreeBSD.

Install and configure Elasticsearch

Elasticsearch is a distributed search and analytics engine that stores all the actual data in the ELK-stack. The most basic configuration is very simple; we just need to install the package:

pkg install elasticsearch7

and configure which port to listen on and where to store the data, in /usr/local/etc/elasticsearch/elasticsearch.yml:

cluster.name: generic
node.name: node1
path.data: /usr/elasticsearch
path.logs: /var/log/elasticsearch
http.port: 9200
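
Once the service is running (see “Running your ELK-stack” below) you can verify that Elasticsearch answers on the configured port. The _cluster/health endpoint is part of the standard REST API:

curl 'http://localhost:9200/_cluster/health?pretty'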

Install and configure Logstash

Logstash will be doing the “heavy lifting” in our setup. Logstash is used to parse all our logs and feed them into Elasticsearch in a searchable format. You can think of every record in Elasticsearch as a set of key/value pairs, and Logstash is used to extract the keys and values from plain text logs (this is of course much easier if your application already logs in JSON format, for example) or other input data. The basic configuration is simple, just install logstash:

pkg install logstash7

and configure where to find your pipeline configuration, in /usr/local/etc/logstash/logstash.yml:

path.config: /usr/local/etc/logstash/conf.d/
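
Before starting the service it can be handy to syntax-check your pipelines. Assuming the FreeBSD port installs the Logstash binary under /usr/local/logstash/bin (adjust the path to your installation), something like this should work:

/usr/local/logstash/bin/logstash --config.test_and_exit -f /usr/local/etc/logstash/conf.d/
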
Basic pipeline

This is an example of a very basic pipeline that reads a log file and outputs the data to Elasticsearch:

input {
    # Tail the system log; new lines are picked up as they are written
    file {
        path => "/var/log/messages"
    }
}

output {
    # Send every event to the local Elasticsearch instance
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}
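
While developing a pipeline it is often useful to see the parsed events directly instead of searching for them in Elasticsearch. A common trick is to temporarily replace (or complement) the elasticsearch output with the stdout output and the rubydebug codec:

output {
    # Print every event to the console in a readable format
    stdout { codec => rubydebug }
}
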
Minimal actual pipeline

This example will parse actual logs from the pkg(8) tool in FreeBSD. There are plenty of resources online on how to parse other types of logs.

input {
    file {
        path => "/var/log/messages"
    }
}
filter {
    # Parse standard syslog format. Match timestamp and remove it from message
    grok {
        match => { "message" => "%{SYSLOGBASE} %{GREEDYDATA:message}" }
        overwrite => [ "message" ]
    }
    # Parse standard syslog date
    date {
        match => [ "timestamp","MMM  d HH:mm:ss", "MMM dd HH:mm:ss", "ISO8601" ]
        remove_field => [ "timestamp" ]
    }
    
    # If the basic syslog parser identified the logging program as "pkg", parse out the package name and action
    # Example: Mar 16 20:58:17 torus pkg[37129]: kibana7-7.6.1 installed
    if [program] == "pkg" {
        grok {
            match => { "message" => "%{NOTSPACE:pkg_package} %{WORD:pkg_action}" }
        }
    }
}
output {
    elasticsearch {
        hosts => [ "localhost:9200" ]
    }
}
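
Once this pipeline has run for a while you can query the indexed pkg events directly from the command line. This assumes Logstash writes to its default, date-stamped logstash-* indices:

curl 'http://localhost:9200/logstash-*/_search?q=program:pkg&pretty'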

Install and configure Kibana

Kibana is the visualisation tool bundled in the ELK-stack. With Kibana you can build visualisations and dashboards for your data, making it easier to search and understand. Install kibana:

pkg install kibana7

and configure it in /usr/local/etc/kibana/kibana.yml:

server.host: "localhost"
server.port: 5601
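
Kibana talks to Elasticsearch over its REST API and defaults to http://localhost:9200, which matches the setup above. If your Elasticsearch node listens somewhere else, you can point Kibana at it explicitly in the same file:

elasticsearch.hosts: ["http://localhost:9200"]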

Running your ELK-stack

When all components are configured you can enable and start them:

# sysrc elasticsearch_enable=YES
# sysrc logstash_enable=YES
# sysrc kibana_enable=YES
# service elasticsearch start
# service logstash start
# service kibana start

Now you should have a Logstash instance running that reads /var/log/messages and sends each log line as a record to Elasticsearch for indexing. You can then view the data using Kibana by visiting http://127.0.0.1:5601.
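
To confirm that records are actually arriving, you can list the indices Elasticsearch has created; with the default output settings you should see one or more logstash-prefixed indices:

curl 'http://localhost:9200/_cat/indices?v'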

Please note that you will need to configure index patterns in Kibana before you can actually search and visualise data. This is outside the scope of this article, but there are plenty of resources online covering it.

Quick note on Beats

When you need to ship data or logs from one machine to another, the state-of-the-art way to do this is to use the Filebeat component of Beats, which is now included in the Elastic Stack.

Beats can also be used to collect other types of metrics, such as network performance data and NetFlow, and it can parse many different log file types out of the box. This makes it a very powerful tool for collecting logs and/or metrics and shipping them to Elasticsearch.
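
As a rough sketch, a minimal filebeat.yml that tails /var/log/messages and ships it to a remote Logstash instance could look like the example below. The hostname loghost.example.com is a placeholder, and the receiving Logstash needs a beats input listening on the matching port:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/messages

# Placeholder host; point this at your Logstash machine,
# which needs a beats { port => 5044 } input configured
output.logstash:
  hosts: ["loghost.example.com:5044"]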
