David Vassallo's Blog

If at first you don't succeed; call it version 1.0


Lessons learned: Gephi not starting up

Scenario: when using Ubuntu Trusty (14.04.3 LTS), Gephi 0.8.2 starts to initialize but gets stuck on “Loading Cached Objects”.


Solution: the default Gephi startup bash script does not correctly initialize the Java path (it relies on $jdkhome being set). The solution is simply to add the following to the beginning of the startup script (~/gephi/bin/gephi):

export JAVA_HOME=/usr/lib/jvm/default-java

The full script is here:
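In context, the top of the modified launcher ends up looking something like this (a sketch only; /usr/lib/jvm/default-java is the symlink provided by Ubuntu's default-jdk/default-jre packages, and everything after the export line is the stock script, unchanged):

    #!/bin/sh
    # Give the launcher a valid Java path up front, since it otherwise
    # relies on $jdkhome being set correctly
    export JAVA_HOME=/usr/lib/jvm/default-java

    # ... remainder of the stock Gephi launcher, unchanged ...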

AlienVault ELK Integration

In the last couple of blog posts [1][2] we’ve been exploring how to use the ELK stack as a forensic logging platform. We also had a couple of posts on deploying some AlienVault features [3][4]. In this post we explore a quick and easy way to integrate the two systems. Apart from the flexible querying that ELK brings, it is also extremely easy to scale and to replicate data across a distributed cluster of machines. In addition, Kibana gives you auto-updating visualizations, such as trend analysis of the events per second from a particular data source (which, by the way, is a much simpler and more elegant solution to the problem originally presented in [4]).

Source trend analysis (with a very limited dataset unfortunately)

AlienVault’s best feature, in my opinion, is OSSIM (https://www.alienvault.com/products/ossim), an open source SIEM with a very flexible rule and correlation engine that works very well. Less of a joy to use is the AlienVault Logger (https://www.alienvault.com/docs/data-sheets/AlienVault-Logger.pdf), which, while it does what it’s advertised to do, is nowhere near as flexible or as polished as ELK. After all, ELK is built from the ground up for search and scalability. The objective for me was to get the AlienVault OSSIM logs into ELK, one way or another. We came up with two ways of doing this:

1. Streaming logs from the AlienVault OSSIM servers to ELK in a “live” fashion. This is the preferred method, but it does involve installing the NXLog [5] agent on the AlienVault systems.

2. Loading the OSSIM logs into ELK manually, “on-demand”, in a bulk fashion. This is the best option for those deployments (for example, in highly sensitive or contractually bound environments) where the AlienVault systems cannot be touched directly but logs still need to be shipped to ELK in some way.

– Streaming Logs

This is the preferred method because it “just works”. Simply build and install the NXLog agent on the AlienVault OSSIM server (there’s a guide on how to do this here [1]) and use something similar to the following for nxlog.conf:
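A minimal sketch of such a config (the Host value is a placeholder for your ELK/Logstash server; the directives are standard NXLog im_file/om_tcp options):

    <Input ossim_logs>
        Module    im_file
        # Recursively pick up any file ending in .log under /var/ossim/logs/
        File      "/var/ossim/logs/*.log"
        Recursive TRUE
        SavePos   TRUE
    </Input>

    <Output elk>
        Module    om_tcp
        Host      10.0.0.50    # placeholder: your ELK/Logstash host
        Port      5142
    </Output>

    <Route ossim_to_elk>
        Path      ossim_logs => elk
    </Route>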

We simply monitor the text logs being produced under “/var/ossim/logs/*.log”. NXLog is intelligent enough to recursively search the directory tree under /var/ossim/logs/ and pick up any files ending in “.log”; in this case, they get sent to the ELK host on port 5142. Fire up NXLog and that’s it…

– Manual, On-Demand Logs

In those environments where we cannot simply install NXLog on the AlienVault systems themselves, it’s still possible to ship logs semi-manually into ELK. To do this, we install NXLog directly onto the ELK server (or any other staging server) and use the following nxlog.conf:
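A sketch of that config (localhost assumes NXLog is staged on the ELK server itself):

    <Input historic_ossim>
        Module       im_file
        File         "/elk/historic_data/ossim/*.log"
        # Read each copied file from the beginning rather than
        # only tailing new lines
        ReadFromLast FALSE
        SavePos      FALSE
    </Input>

    <Output elk>
        Module       om_tcp
        Host         localhost   # assumption: NXLog runs on the ELK server
        Port         5142
    </Output>

    <Route historic_to_elk>
        Path         historic_ossim => elk
    </Route>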

The interesting part of the above config is the “ReadFromLast FALSE” directive. This is necessary so that past log files can be loaded simply by copying OSSIM log files into the watched directory (“/elk/historic_data/ossim/*.log” in this case); the data is then sent on to port 5142. Shipping any log data we need into ELK thus boils down to using SFTP to log in to the AlienVault servers, locating the OSSIM logs of interest under /var/ossim/logs, and copying those files into the /elk/historic_data/ossim directory on our staging server, as sketched below.
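The copy step can be done with any SFTP-capable client; for example, with scp (the host name and log path below are placeholders, pick the files you need from under /var/ossim/logs):

    # Pull the OSSIM logs of interest off the AlienVault server and drop
    # them into the directory NXLog is watching on the staging box
    scp 'root@alienvault-server:/var/ossim/logs/<path-to-logs>/*.log' /elk/historic_data/ossim/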

Logstash configuration

In either case, we need to properly configure Logstash to receive the logs. Here’s the configuration file used:
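The pipeline boils down to something like the following (a sketch: the tcp port matches the NXLog output above, while the elasticsearch output host is an assumption and its options vary across Logstash versions):

    input {
      tcp {
        port => 5142
      }
    }

    filter {
      kv {
        value_split => "='"
        field_split => "' "
      }
    }

    output {
      elasticsearch {
        host => "localhost"   # Logstash 1.x option; newer releases use hosts => [...]
      }
    }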

The most important section in the config above is the part where we use the kv filter to parse the received logs:

         kv {
            value_split => "='"
            field_split => "' "
         }

This uses the “key-value” Logstash filter [6] to parse the log files automatically, without needing to define any complicated grok filters. The KV filter turned out to be incredibly useful because the OSSIM logs differ slightly depending on which AlienVault plugin produced them, but thankfully all OSSIM logs keep the same format of key-value pairs separated by an equals (=) sign (trust me, writing the grok filters manually can get hairy… this is what it was starting to look like):

OSSIM logs produced in key value format

The KV filter neatly abstracts all this and caters for any changes in the key fields, leaving you with easily queried and searchable logs in ELK:

OSSIM logs in ELK
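To make the format concrete, here is a hypothetical log line in the key='value' style described above (values kept space-free for simplicity):

    type='detector' plugin_id='1505' src_ip='192.168.1.50' dst_ip='8.8.8.8' risk='3'

With the value_split and field_split settings shown earlier, the KV filter turns this into fields named type, plugin_id, src_ip, dst_ip and risk, each of which becomes independently searchable in Kibana.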



[1] Building a Logging Forensics Platform using ELK (Elasticsearch, Logstash, Kibana), http://blog.davidvassallo.me/2015/04/21/building-a-logging-forensics-platform-using-elk-elasticsearch-logstash-kibana/

[2] Beyond the basics : Logging Forensics with ELK (Elasticsearch, Logstash, Kibana), http://blog.davidvassallo.me/2015/06/25/beyond-the-basics-logging-forensics-with-elk-elasticsearch-logstash-kibana/

[3] AlienVault: Adding a logger to a distributed deployment, http://blog.davidvassallo.me/2015/06/03/alienvault-adding-a-logger-to-a-distributed-deployment/

[4] AlienVault: Monitoring individual sensor Events Per Second [EPS], http://blog.davidvassallo.me/2015/02/03/alienvault-monitoring-individual-sensor-events-per-second-eps/

[5] NXLog, http://nxlog.org/

[6] Logstash KV filter, https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html

