Create actionable data from your Vulnerability Scans

VulnWhisperer is a vulnerability data and report aggregator. It pulls all the reports and creates a file with a unique filename, which is then fed into Logstash. Logstash extracts data from the filename and tags all of the information inside the report (see the logstash_vulnwhisp.conf file). The data is then shipped to Elasticsearch to be indexed.
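The authoritative extraction patterns ship in logstash_vulnwhisp.conf; purely as an illustrative sketch (the filename format below is a hypothetical example, not the project's real convention), a grok filter pulling fields out of the file path could look like:

```
filter {
  # Hypothetical filename: <scanner>_<scan_name>_<scan_id>.csv
  # See logstash_vulnwhisp.conf for the real patterns.
  grok {
    match => { "path" => "%{WORD:scanner}_%{DATA:scan_name}_%{INT:scan_id}\.(csv|json)" }
  }
}
```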

Currently Supports

Vulnerability Frameworks

- Nessus
- Qualys

Reporting Frameworks

- Elasticsearch (with Logstash and Kibana)

Getting Started

1) Follow the install requirements
2) Fill out the section you want to process in the example.ini file
3) Modify the IP settings in the logstash files to accommodate your environment and import them to your logstash conf directory (default is /etc/logstash/conf.d/)
4) Import the kibana visualizations
5) Run VulnWhisperer

Need assistance or just want to chat? Join our slack channel



Install Requirements - VulnWhisperer (may require sudo)

First, install the dependencies:

sudo apt-get install  zlib1g-dev libxml2-dev libxslt1-dev 

Then install the Python requirements:

pip install -r /path/to/VulnWhisperer/requirements.txt
cd /path/to/VulnWhisperer
python setup.py install

Now you’re ready to pull down scans. (see run section)

Install Requirements - ELK Node *SAMPLE*

The following instructions are a sample guide for use in the absence of an existing ELK cluster/node. They cover a Debian install of a stand-alone node running Elasticsearch & Kibana.

While Logstash is included in this install guide, it is recommended to run Logstash on a separate host that pulls the VulnWhisperer data and ships it to the Elasticsearch node.

Please note there is a docker-compose.yml available as well.

Debian:

sudo apt-get install -y default-jre
wget -qO - | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update && sudo apt-get install elasticsearch kibana logstash
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
sudo /bin/systemctl enable kibana.service
sudo /bin/systemctl enable logstash.service

Elasticsearch & Kibana Sample Config Notes

Using your favorite text editor, adjust /etc/elasticsearch/elasticsearch.yml and /etc/kibana/kibana.yml for your environment.

Start Elasticsearch and Kibana and validate that they are running and communicating with one another:

sudo service elasticsearch start
sudo service kibana start

or, with systemd:

sudo systemctl start elasticsearch.service
sudo systemctl start kibana.service

Logstash Sample Config Notes

Once configured, run Logstash. Running Logstash as a service will pick up all the files in /etc/logstash/conf.d/; if you would like to run only one logstash file, use the command referenced below.

Logstash as a service:

sudo service logstash start

or, with systemd:

sudo systemctl start logstash.service

Single Logstash file:

sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/1000_nessus_process_file.conf


There are a few configuration steps to set up VulnWhisperer:

frameworks_example.ini file


To run, fill out the configuration file with your vulnerability scanner settings. Then execute from the command line:

vuln_whisperer -c configs/frameworks_example.ini -s nessus
vuln_whisperer -c configs/frameworks_example.ini -s qualys

If no section is specified (e.g. -s nessus), VulnWhisperer will check the config file for the modules that have the property enabled=true and run them sequentially.
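For example, enabling just the Nessus module could look like the following minimal sketch (the section name matches the scanner; any other keys shown in frameworks_example.ini still need to be filled in):

```
; minimal sketch - see configs/frameworks_example.ini for the full set of keys
[nessus]
enabled=true
```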

Next you'll need to import the visualizations into Kibana and set up your logstash config. A more thorough README is underway with setup instructions.


The docker-compose file has been tested on an Ubuntu 18.04 environment with docker-ce v18.06. Its purpose is to store the scanner data locally, letting VulnWhisperer update the records and Logstash feed them to Elasticsearch, so it requires a local storage folder.

To launch docker-compose, do:

docker-compose -f docker-compose.yml up

Running Nightly

If you're running Linux, be sure to set up a cron job to remove old files that get stored in the database. Be sure to change .csv if you're using json.

Set up crontab -e with the following config (modify for your environment); this will run VulnWhisperer each night at 01:30:

00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.csv' -ctime +3 -exec rm {} \;

30 1 * * * /usr/local/bin/vuln_whisperer -c /opt/vulnwhisp/configs/example.ini
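If your scans are stored as JSON rather than CSV, the cleanup entry is the same with the extension swapped:

```
00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.json' -ctime +3 -exec rm {} \;
```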

For Windows, you may need to type the full path of the vuln_whisperer binary located in the bin directory.

Elastic presentation on VulnWhisperer