David Vassallo's Blog

If at first you don't succeed; call it version 1.0

ELK: exporting to CSV

Note: the following requires the "jq" JSON parser, available from http://stedolan.github.io/jq/

1. Run the desired query through the Kibana WebUI

2. Expand the additional options pane by clicking the arrow underneath the graph, as shown in the screenshot below:

[Screenshot: the additional options pane expanded below the Kibana graph]

3. Select “Request” and copy the request displayed:

[Screenshot: the Elasticsearch request shown under the "Request" tab]

4. Open a Linux terminal and run the following command, pasting the copied request where indicated and changing the template name as appropriate:

curl -XGET 'http://192.168.12.68:9200/template_name-*/_search?pretty' -d '-----PASTE HERE-----' | jq -r '.hits.hits[]._source | del(.tags) | [to_entries[]] | map(.value) | @csv' > /tmp/output.csv

Note: “size” in the copy/pasted text needs to be modified according to how many records are to be exported.
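For reference, the pasted request is just an Elasticsearch query body. The example below is purely illustrative (the actual query and field names come from whatever was entered in Kibana); the point is that "size" controls how many records come back:

   {
     "query": {
       "query_string": {
         "query": "type:apache AND response:500"
       }
     },
     "size": 10000
   }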

The CSV output will be stored in the /tmp/output.csv file, which can be downloaded via SFTP and manipulated as necessary.

Note: since ELK deals with unstructured data, the CSV file may not have the same number of columns for each entry, especially if different types of records are queried. This is expected and by design. To get a list of column names for each row, the following command can be run:

curl -XGET 'http://192.168.12.68:9200/template_name-*/_search?pretty' -d '-----PASTE HERE-----' | jq -r '.hits.hits[]._source | del(.tags) | [to_entries[]] | map(.key) | @csv' > /tmp/output.csv

The above command also generates /tmp/output.csv, but each row will contain the column headings rather than the actual data.
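If it is useful to see headings and data side by side, the two jq expressions can be combined into a single pass. The following sketch prints, for every record, a header line followed immediately by its value line:

   curl -XGET 'http://192.168.12.68:9200/template_name-*/_search?pretty' -d '-----PASTE HERE-----' | jq -r '.hits.hits[]._source | del(.tags) | to_entries | (map(.key) | @csv), (map(.value) | @csv)' > /tmp/output.csv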

AlienVault ELK Integration

In the last couple of blog posts [1][2] we’ve been exploring how to use the ELK stack as a forensic logging platform. We also had a couple of posts on deploying some AlienVault features [3][4]. In this post we explore a quick and easy way to integrate the two systems. Apart from the flexible querying that ELK brings, it is also extremely easy to scale and to replicate data across a distributed cluster of machines. In addition, Kibana allows you to have auto-updating visualizations, such as trend analysis of events per second from a particular data source (which, by the way, is a much more elegant and simple solution to the problem originally presented in [4]).

[Screenshot: source trend analysis (with a very limited dataset, unfortunately)]

AlienVault’s best feature, in my opinion, is OSSIM (https://www.alienvault.com/products/ossim), an open source SIEM with a very flexible rule and correlation engine that works very well! Less of a joy to use is the AlienVault Logger (https://www.alienvault.com/docs/data-sheets/AlienVault-Logger.pdf), which, while it does what it is advertised to do, is nowhere near as flexible or as polished as ELK. After all, ELK is built from the ground up for searching and scalability. The objective for me was to get the AlienVault OSSIM logs into ELK, one way or another. We came up with two ways of doing this:

1. Streaming logs from the AlienVault OSSIM servers to ELK in a “live” fashion. This is the preferred method, but it does involve installing the NXLog [5] agent on the AlienVault systems.

2. Loading the OSSIM logs into ELK manually, “on-demand”, in a bulk fashion. This is the best option for those deployments (maybe in highly sensitive or contractually bound environments) where the AlienVault systems cannot be touched directly but logs still need to be shipped to ELK in some way.

– Streaming Logs

This is the preferred method because it “just works”. Simply build and install the NXLog agent on the Alienvault OSSIM server (there’s a guide on how to do this here [1]) and use something similar to the following for nxlog.conf:
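The original nxlog.conf is not reproduced here, so the snippet below is only a minimal sketch of what it could look like. The module names (im_file, om_tcp) are standard NXLog modules, while the instance and route names are arbitrary; adjust paths, host and port to your environment (the values below match those used in this post):

   <Input ossim_logs>
       Module        im_file
       # watch all OSSIM text logs; with Recursive TRUE the whole directory tree is searched
       File          "/var/ossim/logs/*.log"
       Recursive     TRUE
       SavePos       TRUE
   </Input>

   <Output elk>
       Module        om_tcp
       # ship each log line to the Logstash TCP input on the ELK server
       Host          192.168.12.68
       Port          5142
   </Output>

   <Route ossim_to_elk>
       Path          ossim_logs => elk
   </Route>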

We simply monitor the text logs being produced under “/var/ossim/logs/*.log”, and that’s it. NXLog is intelligent enough to recursively search through the directory tree under /var/ossim/logs/ and pick up any files ending in “.log”. In this case they get sent to host 192.168.12.68 on port 5142. Fire up NXLog and you’re done…

– Manual, on-demand logs.

In those environments where we cannot simply install NXLog on alienvault systems, it’s still possible to ship logs semi-manually into ELK. To do this we install NXLog directly onto the ELK server (or any other staging server) and use the following nxlog.conf:
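Again, the original config is not shown here; the sketch below illustrates the idea under the assumptions made in this post: copied files land in /elk/historic_data/ossim/, NXLog runs on the ELK (or staging) server itself, and it forwards to the local Logstash TCP input on port 5142 (instance names are arbitrary):

   <Input ossim_historic>
       Module        im_file
       # pick up whatever OSSIM log files are copied into the staging directory
       File          "/elk/historic_data/ossim/*.log"
       # read copied files from the beginning, not just lines appended after NXLog starts
       ReadFromLast  FALSE
       SavePos       FALSE
   </Input>

   <Output elk>
       Module        om_tcp
       # NXLog is running on the ELK/staging server, so forward to localhost
       Host          127.0.0.1
       Port          5142
   </Output>

   <Route historic_to_elk>
       Path          ossim_historic => elk
   </Route>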

The interesting part of the above config file is where we set “ReadFromLast FALSE”. This is necessary so that we can load past log files by simply copying OSSIM log files into the directory specified (“/elk/historic_data/ossim/*.log” in this case). We then send this data on to port 5142. So shipping any log data we need to ELK is as simple as using SFTP to log in to the AlienVault servers, locating the OSSIM logs of interest under /var/ossim/logs, and copying those files into the /elk/historic_data/ossim directory on our staging server.
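As an illustration, the copy itself is a one-liner (the hostname is a placeholder, and the exact files to pull depend on which OSSIM logs are of interest):

   # copy the OSSIM logs of interest from the AlienVault server into the staging directory
   scp "root@alienvault-server:/var/ossim/logs/*.log" /elk/historic_data/ossim/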

Logstash configuration

In either case, we need to properly configure Logstash to receive the logs. Here’s the configuration file used:
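The configuration file itself is not reproduced here, so the following is a minimal sketch of a Logstash config that ties the pieces in this post together: a TCP input listening on port 5142, the kv filter discussed below, and an Elasticsearch output (the index name is illustrative, and the exact output options vary slightly between Logstash versions):

   input {
       tcp {
           port => 5142
           type => "ossim"
       }
   }

   filter {
       kv {
           value_split => "='"
           field_split => "' "
       }
   }

   output {
       elasticsearch {
           # recent Logstash versions use "hosts"; 1.x used "host" / "protocol"
           hosts => ["localhost:9200"]
           index => "ossim-%{+YYYY.MM.dd}"
       }
   }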

The most important section in the config file above is the part where we use the kv filter to parse the received log files:

         kv {
            value_split => "='"
            field_split => "' "
         }

This uses the “key-value” Logstash filter [6] to parse the log files automatically without needing to define any complicated grok filters. The KV filter turned out to be incredibly useful because the OSSIM logs differ slightly according to which AlienVault plugin produced the log, but thankfully all OSSIM logs keep the same format of key-value pairs separated by an equals (=) sign (trust me, writing the grok filters manually can get hairy… this is what it was looking like):
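To make the format concrete, an OSSIM log line in this deployment looks roughly like the purely illustrative line below (field names and values invented for illustration); with the value_split/field_split settings shown above, the kv filter turns every quoted pair into its own searchable field:

   type='detector' date='2015-07-01 10:15:00' plugin_id='1001' src_ip='192.168.1.50' dst_ip='192.168.1.10'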

[Screenshot: OSSIM logs produced in key-value format]

The KV filter neatly abstracts all this and caters for any changes in the key fields, leaving you with easily queried and searchable logs in ELK:

[Screenshot: OSSIM logs in ELK]

:)

References

[1] Building a Logging Forensics Platform using ELK (Elasticsearch, Logstash, Kibana), http://blog.davidvassallo.me/2015/04/21/building-a-logging-forensics-platform-using-elk-elasticsearch-logstash-kibana/

[2] Beyond the basics : Logging Forensics with ELK (Elasticsearch, Logstash, Kibana), http://blog.davidvassallo.me/2015/06/25/beyond-the-basics-logging-forensics-with-elk-elasticsearch-logstash-kibana/

[3] AlienVault: Adding a logger to a distributed deployment, http://blog.davidvassallo.me/2015/06/03/alienvault-adding-a-logger-to-a-distributed-deployment/

[4] AlienVault: Monitoring individual sensor Events Per Second [EPS], http://blog.davidvassallo.me/2015/02/03/alienvault-monitoring-individual-sensor-events-per-second-eps/

[5] NXLog, http://nxlog.org/

[6] Logstash KV filter, https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html
