AlienVault ELK Integration

In the last couple of blog posts [1][2] we’ve been exploring how to use the ELK stack as a forensic logging platform, and we also had a couple of posts on deploying some AlienVault features [3][4]. In this post we explore a quick and easy way to integrate the two systems. Apart from the flexible querying that ELK brings, ELK is also extremely easy to scale, replicating data across a distributed cluster of machines. On top of that, Kibana gives you auto-updating visualizations, such as trend analysis of the events per second from a particular data source (which, incidentally, is a much more elegant and simple solution to the problem originally presented in [4]).

Source trend analysis (with a very limited dataset, unfortunately)

AlienVault’s best feature, in my opinion, is OSSIM (https://www.alienvault.com/products/ossim), an open source SIEM with a very flexible rule and correlation engine that works very well. Less of a joy to use is the AlienVault Logger (https://www.alienvault.com/docs/data-sheets/AlienVault-Logger.pdf): while it does what it’s advertised to do, it is nowhere near as flexible or as polished as ELK, which is built from the ground up for search and scalability. The objective for me was to get the AlienVault OSSIM logs into ELK, one way or another. We came up with two ways of doing this:

1. Streaming logs from the AlienVault OSSIM servers to ELK in a “live” fashion. This is the preferred method, but it involves installing the NXLog agent [5] on the AlienVault systems.

2. Loading the OSSIM logs into ELK manually, “on demand”, in bulk. This is the best option for deployments (for example, highly sensitive or contractually constrained environments) where the AlienVault systems cannot be touched directly but logs still need to be shipped to ELK in some way.

– Streaming Logs

This is the preferred method because it “just works”. Simply build and install the NXLog agent on the AlienVault OSSIM server (there’s a guide on how to do this in [1]) and use something similar to the following nxlog.conf:


define ROOT /nxlog
Moduledir /usr/local/libexec/nxlog/modules
CacheDir %ROOT%/data
Pidfile %ROOT%/data/nxlog.pid
SpoolDir %ROOT%/data
LogFile %ROOT%/data/nxlog.log

<Extension _syslog>
    Module xm_syslog
</Extension>

<Extension json>
    Module xm_json
</Extension>

<Input in_ossim>
    Module im_file
    File '/var/ossim/logs/*.log'
    SavePos TRUE
    ReadFromLast TRUE
    PollInterval 1
    Exec $Message = $raw_event;
</Input>

<Output out_ossim>
    Module om_tcp
    Port 5142
    Host 192.168.12.68
</Output>

<Route 1>
    Path in_ossim => out_ossim
</Route>


We simply monitor the text logs produced under “/var/ossim/logs/*.log”. NXLog is intelligent enough to recursively search the directory tree under /var/ossim/logs/ and pick up any files ending in “.log”. In this case, they get sent to host 192.168.12.68 on port 5142. Fire up NXLog and that’s it.
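As a rough mental model of what the im_file input is doing each PollInterval, here is an illustrative Python sketch (this is not how NXLog is implemented, and unlike NXLog it neither recurses into subdirectories nor forwards anything over TCP):

```python
import glob
import os

def poll_once(pattern, positions, read_from_last=True):
    """One polling pass, roughly what NXLog's im_file module does every
    PollInterval: find files matching `pattern`, seek to the saved
    per-file offset (the SavePos cache), and collect any new lines."""
    new_lines = []
    for path in glob.glob(pattern):
        if path not in positions:
            # ReadFromLast TRUE: the first time a file is seen, start at
            # end-of-file and only pick up lines written afterwards
            positions[path] = os.path.getsize(path) if read_from_last else 0
        with open(path, "rb") as f:
            f.seek(positions[path])
            new_lines.extend(f.readlines())
            positions[path] = f.tell()
    return new_lines
```

In the streaming setup, each batch of returned lines would go out over the om_tcp output to 192.168.12.68:5142; NXLog additionally persists the position cache across restarts, which is what SavePos TRUE buys you.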

– Manual, on-demand logs.

In those environments where we cannot simply install NXLog on the AlienVault systems, it’s still possible to ship logs semi-manually into ELK. To do this we install NXLog directly onto the ELK server (or any other staging server) and use the following nxlog.conf:


########################################
# Global directives                    #
########################################
User nxlog
Group nxlog
LogFile /var/log/nxlog/nxlog.log
LogLevel INFO

########################################
# Modules                              #
########################################
<Extension _syslog>
    Module xm_syslog
</Extension>

<Extension json>
    Module xm_json
</Extension>

<Input in_ossim>
    Module im_file
    File '/elk/historic_data/ossim/*.log'
    SavePos TRUE
    ReadFromLast FALSE
    PollInterval 1
    Exec $Message = $raw_event;
</Input>

<Output out_ossim>
    Module om_tcp
    Port 5142
    Host 127.0.0.1
</Output>

<Route 1>
    Path in_ossim => out_ossim
</Route>


The interesting part of the above config file is the “ReadFromLast FALSE” directive. This is necessary so that we can load past log files simply by copying OSSIM log files into the directory specified (“/elk/historic_data/ossim/*.log” in this case); the data is then sent on to port 5142. Shipping any log data we need into ELK thus becomes as simple as using SFTP to log in to the AlienVault servers, locating the OSSIM logs of interest under /var/ossim/logs, and copying those files into the /elk/historic_data/ossim directory on our staging server.
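If copying files into a watched directory is inconvenient, the same bulk load can be scripted directly. The sketch below (illustrative Python, not part of the NXLog or ELK tooling) reads a batch of historic log files and pushes them line by line at the Logstash TCP input on port 5142, skipping the staging NXLog instance entirely:

```python
import glob
import socket

def ship_logs(pattern, host="127.0.0.1", port=5142):
    """Push a batch of historic OSSIM log files at a TCP listener,
    line by line. Returns the number of lines shipped."""
    shipped = 0
    with socket.create_connection((host, port)) as sock:
        for path in sorted(glob.glob(pattern)):
            with open(path, "rb") as f:
                for line in f:
                    sock.sendall(line)
                    shipped += 1
    return shipped

# e.g. ship_logs('/elk/historic_data/ossim/*.log')
```

Note that, unlike NXLog, this keeps no position cache, so running it twice ships every line twice.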

Logstash configuration

In either case, we need to properly configure Logstash to receive the logs. Here’s the configuration file used:


input {
    tcp {
        port => 5142
        type => "ossim-events"
        codec => json {
            charset => "CP1252"
        }
    }
    syslog {
        type => "syslog"
    }
}

filter {
    mutate {
        add_field => { "Agent_IP" => "%{host}" }
    }

    ######## ALIENVAULT OSSIM Logs ########
    if [type] == "ossim-events" {
        kv {
            value_split => "='"
            field_split => "' "
        }
    }
}

output {
    stdout { }
    elasticsearch {
        host => "localhost"
        template => "/elk/logstash/templates/elasticsearch-template.json"
        template_overwrite => true
    }
}


The most important section in the config above is the part where we use the kv filter to parse the received log lines:

         kv {
            value_split => "='"
            field_split => "' "
         }

This uses the “key-value” Logstash filter [6] to parse the log files automatically, without needing to define any complicated grok filters. The KV filter turned out to be incredibly useful because the OSSIM logs differ slightly according to which AlienVault plugin produced the log, but thankfully all OSSIM logs keep the same format of key-value pairs separated by an equals (=) sign (trust me, going after the grok filters manually can get hairy… this is what it was looking like):

OSSIM logs produced in key-value format

The KV filter neatly abstracts all this and caters for any changes in the key fields, leaving you with easily queried and searchable logs in ELK:

OSSIM logs in ELK
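To make the kv mechanics concrete, here’s a small Python sketch that mimics the filter’s split logic (value_split “='”, field_split “' ”) on an OSSIM-shaped line. The field names in the sample are illustrative, not taken from a specific AlienVault plugin:

```python
def parse_kv(line, value_split="='", field_split="' "):
    """Rough Python equivalent of what the Logstash kv filter does with
    value_split => "='" and field_split => "' ": split the event into
    key='value' tokens and strip the trailing quote from the last value."""
    fields = {}
    for pair in line.strip().split(field_split):
        if value_split not in pair:
            continue  # not a key='value' token; kv skips these too
        key, _, value = pair.partition(value_split)
        fields[key] = value.rstrip("'")
    return fields

# parse_kv("plugin_id='1505' src_ip='10.0.0.5'")
# → {'plugin_id': '1505', 'src_ip': '10.0.0.5'}
```

Because the splitting is driven entirely by the delimiters, any plugin-specific keys simply show up as extra fields, which is exactly why no per-plugin grok patterns are needed.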

🙂

References

[1] Building a Logging Forensics Platform using ELK (Elasticsearch, Logstash, Kibana), https://blog.davidvassallo.me/2015/04/21/building-a-logging-forensics-platform-using-elk-elasticsearch-logstash-kibana/

[2] Beyond the basics : Logging Forensics with ELK (Elasticsearch, Logstash, Kibana), https://blog.davidvassallo.me/2015/06/25/beyond-the-basics-logging-forensics-with-elk-elasticsearch-logstash-kibana/

[3] AlienVault: Adding a logger to a distributed deployment, https://blog.davidvassallo.me/2015/06/03/alienvault-adding-a-logger-to-a-distributed-deployment/

[4] AlienVault: Monitoring individual sensor Events Per Second [EPS], https://blog.davidvassallo.me/2015/02/03/alienvault-monitoring-individual-sensor-events-per-second-eps/

[5] NXLog, http://nxlog.org/

[6] Logstash KV filter, https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html
