hasecuritysolutions / vulnwhisperer

Create actionable data from your Vulnerability Scans

Home Page: https://twitter.com/VulnWhisperer

License: Apache License 2.0


vulnwhisperer's Introduction

Create actionable data from your vulnerability scans

VulnWhisperer is a vulnerability management tool and report aggregator. VulnWhisperer pulls all the reports from the different vulnerability scanners and creates a file with a unique filename for each one, later using that data to sync with Jira and feed Logstash. Jira performs a full, closed-cycle sync with the data provided by the scanners, while Logstash indexes and tags all of the information inside the reports (see the Logstash files at /resources/elk6/pipeline/). Data is then shipped to ElasticSearch to be indexed, and ends up in a visual and searchable format in Kibana, with dashboards already defined.

VulnWhisperer is an open-source, community-funded project. VulnWhisperer currently works but is due for a documentation overhaul and code review. This is on the roadmap for the next month or two (February or March of 2022 - hopefully). Please note that crowdfunding is an option. If you would like help getting VulnWhisperer up and running, are interested in new features, or are looking for paid support (for those of you who require commercial support contracts to implement open-source solutions), please reach out to [email protected].


Currently Supports

Vulnerability Frameworks

  • Nessus
  • Qualys Web Applications
  • OpenVAS

Reporting Frameworks

  • Elastic Stack (Logstash, ElasticSearch, Kibana 6.6)
  • Jira

Getting Started

  1. Follow the install requirements
  2. Fill out the section you want to process in the frameworks_example.ini file
  3. [JIRA] If using Jira, fill in the Jira config in the config file mentioned above
  4. [ELK] Modify the IP settings in the Logstash files to accommodate your environment and import them into your Logstash conf directory (default is /etc/logstash/conf.d/)
  5. [ELK] Import the Kibana visualizations
  6. Run VulnWhisperer

Need assistance or just want to chat? Join our Slack channel.

Requirements

  • Python 2.7
  • Vulnerability Scanner
  • Reporting System: Jira / ElasticStack 6.6

Install the OS package dependencies (Debian-based distros only; CentOS does not need them)

sudo apt-get install  zlib1g-dev libxml2-dev libxslt1-dev 

(Optional) Use a Python virtualenv to avoid touching the host Python libraries

virtualenv venv              # creates the Python 2.7 virtualenv
source venv/bin/activate     # activates it; pip now installs libraries inside the virtualenv without sudo

deactivate                   # exits the virtualenv once you are done

Install the Python library requirements

pip install -r /path/to/VulnWhisperer/requirements.txt
cd /path/to/VulnWhisperer
python setup.py install

(Optional) If using a proxy, export the proxy URL as environment variables

export HTTP_PROXY=http://example.com:8080
export HTTPS_PROXY=http://example.com:8080

Now you're ready to pull down scans. (see run section)

Configuration

There are a few configuration steps to set up VulnWhisperer:

  • Configure Ini file
  • Setup Logstash File
  • Import ElasticSearch Templates
  • Import Kibana Dashboards

frameworks_example.ini file

To run, fill out the configuration file with your vulnerability scanner settings. Then you can execute from the command line.

(optional flag: -F -> provides "Fancy" log colouring, good for comprehension when manually executing VulnWhisperer)
vuln_whisperer -c configs/frameworks_example.ini -s nessus 
or
vuln_whisperer -c configs/frameworks_example.ini -s qualys

If no section is specified (e.g. -s nessus), VulnWhisperer will check the config file for modules that have the property enabled=true and run them sequentially.
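
For example, with nessus and qualys both set to enabled=true in the config, a single invocation processes them one after the other (a minimal sketch; the section names are whatever you enabled in your own ini):

# processes every section that has enabled=true, sequentially
vuln_whisperer -c configs/frameworks_example.ini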

Next you'll need to import the visualizations into Kibana and set up your Logstash config. You can either follow the sample setup instructions in the wiki (https://github.com/HASecuritySolutions/VulnWhisperer/wiki/Sample-Guide-ELK-Deployment) or go for the docker-compose solution we offer.
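
If you go the manual route, the steps look roughly like the sketch below. The paths and file names are assumptions based on this README and may differ in your checkout; adjust them to your environment:

# copy the Logstash pipeline configs shipped with VulnWhisperer (edit the IPs inside them first)
sudo cp /path/to/VulnWhisperer/resources/elk6/pipeline/*.conf /etc/logstash/conf.d/
sudo systemctl restart logstash

# load the ElasticSearch index template (template file name/location is an assumption)
curl -XPUT 'http://localhost:9200/_template/logstash-vulnwhisperer-template' \
     -H 'Content-Type: application/json' \
     -d @/path/to/VulnWhisperer/resources/elk6/logstash-vulnwhisperer-template.json

# import the Kibana visualizations/dashboards via Management > Saved Objects > Import,
# or let the docker-compose setup below do it for you automatically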

Docker-compose

ELK is a whole world by itself, and for newcomers to the platform it requires basic Linux skills and usually a bit of troubleshooting until it is deployed and working as expected. As we are not able to provide support for each user's ELK problems, we put together a docker-compose setup which includes:

  • VulnWhisperer
  • Logstash 6.6
  • ElasticSearch 6.6
  • Kibana 6.6

The docker-compose setup just requires specifying the paths where the VulnWhisperer data will be saved and where the config files reside. If run directly after git clone, with just the scanner config added to the VulnWhisperer config file (/resources/elk6/vulnwhisperer.ini), it will work out of the box.

It also loads the Kibana dashboards and visualizations automatically through the API, which otherwise needs to be done manually after Kibana starts up.

For more info about the docker-compose setup, check the docker-compose wiki or the FAQ.
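
A possible sequence, assuming the compose file sits at the root of the repository as docker-compose.yml (check your checkout for the exact name and location):

git clone https://github.com/HASecuritySolutions/VulnWhisperer.git
cd VulnWhisperer
# add your scanner settings to the VulnWhisperer config used by the containers
vi resources/elk6/vulnwhisperer.ini
# bring up VulnWhisperer plus Logstash, ElasticSearch and Kibana 6.6
docker-compose up -d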

Roadmap

Our current Roadmap is as follows:

  • Create a Vulnerability Standard
  • Map every scanner's results to the standard
  • Create Scanner module guidelines for easy integration of new scanners (consistency will allow #14)
  • Refactor the code to reuse functions and enable full compatibility among modules
  • Change Nessus CSV to JSON (Consistency and Fix #82)
  • Adapt single Logstash to standard and Kibana Dashboards
  • Implement Detectify Scanner
  • Implement Splunk Reporting/Dashboards

On top of this, we try to fix bugs as soon as possible, which might delay feature development. PRs are also very welcome, and once the new standard is implemented, it will be very easy to add compatibility with new scanners.

The Vulnerability Standard will initially be a simple, single-level JSON schema in which the fields shared across the different scanners have standardized variable names, while the remaining variables are kept as they are. In the future, once everything is implemented, we will evaluate moving to an existing standard like ECS or the AWS Vulnerability Schema; we prioritize functionality over perfection.
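
Purely as an illustration of the idea (assuming jq is installed; the field names are hypothetical and only loosely based on the fields the current Kibana dashboards reference), a normalized finding could look something like this:

# hypothetical normalized finding - illustrative only, not the actual standard
jq -n '{asset: "192.168.1.50", scan_name: "weekly-internal", plugin_name: "Example vulnerability", risk: "High", cvss: 7.5, scanner: "nessus"}'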

Video Walkthrough -- Featured on ElasticWebinar

Elastic presentation on VulnWhisperer

Authors

Contributors

AS SEEN ON TV

vulnwhisperer's People

Contributors

alias454, andrew-bailey, anhlqn, austin-taylor, cybergoof, dependabot[bot], detvan, harvii, kraigu, pemontto, qmontal, rogierm, scottmcgowan, smapper, spasaintk, tcstool, uovobw, wyv3rnsec, yashvendra


vulnwhisperer's Issues

Request for clearer installation instructions

Hi,

I've downloaded the tool and I'm currently in the process of setting it up with Nessus. However, I was running into some issues with Vulnwhisperer not picking up the report_tracker.db file. As it turns out, Nessus needs to create this database file and then the path must be supplied in the Vulnwhisperer configuration file. Could you please update the configuration screenshot to maybe say something like "write_path=/path/to/nessus/report_tracker.db" or something like that? As a new user to Nessus and Vulnwhisperer this took me a while to understand on my own.

I'm also curious what the next steps are with a brand new installation? Once I point Vulnwhisperer to the right database, what comes next? My understanding is that the ELK stack should be started?

Kibana unable to fetch mapping

I have placed the filebeat.yml, the filebeat input config, and the nessus logstash config in the correct locations. Logstash doesn't seem to be creating the index. I noticed that there is a logstash template, but it isn't clear whether it needs to be imported or not.

Help: To Use VulnWhisper

Hi all,

I need help. I have configured VulnWhisperer on my Windows 7 64-bit machine and ran it as per the Git readme.
Now I have Nessus data in CSV.
Can you guide me on what to do with the CSV so I can get the complete picture on one page,
with graphs and top-10 data as shown in the screenshot?

Thanks in advance.

Unable to create qualys reports - invalid template id

I am unable to get past the report generation stage - it seems an invalid template ID is being used to generate the Qualys reports.

Processing 1/399
[ACTION] - Generating report for 10794322
[FAIL] Could not process report ID: <?xml version="1.0" encoding="UTF-8"?>
<ServiceResponse xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="https://qualysapi.qg2.apps.qualys.com/qps/xsd/3.0/was/report.xsd">
  <responseCode>INVALID_REQUEST</responseCode>
  <responseErrorDetails>
    <errorMessage>Template with id 126024 was not found. Provide a different template id.</errorMessage>
  </responseErrorDetails>
</ServiceResponse>

I am unclear where the template id of 126024 is coming from. Anyone else having a similar issue?

Deprecated Index Mapping

Since the given template uses deprecated commands for Elasticsearch 6.0+, has someone created a new mapping?

[ELK6] Kibana Visualisations

Importing the vulnWhispererBaseVisuals.json adds VulnWhisperer - Risk: Critical (Also info, low, medium, high and total) as a 'goal' type visualization which without alteration prevents the visualization from working.
Changing the type to 'gauge' manually in the JSON file seems to resolve this.

Questions/Request: Integration with SecurityCenter

We have Tenable SecurityCenter and 5 nessus scanners. I'm not sure if the integrations with nessus work when they are managed by securitycenter. For example I know you can no longer login to the nessus boxes and view scan history or policies; since all of that is done at the securitycenter level and pushed out to all scanners.

This is what I get when I run vulnwhisp against securitycenter

[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[FAIL] Could not login to Nessus
[FAIL] Could not properly load your config!
Reason: [FAIL] Could not connect to nessus -- Please verify your settings in <vulnwhisp.base.config.vwConfig object at 0x7f8b8b68ea50> are correct and try again.
Reason: [FAIL] Could not login to Nessus

SecurityCenter's API can be found at https://docs.tenable.com/sccv/api/index.html
I would be willing to offer assistance to get this working.

Nessus CSV files timestamped 5 hours in future.

I'm using the standard Nessus conf file to pull my scans through, but all scans are timestamped 5 hours ahead of their last modified time.

So if I executed a scan at 12pm, the CSV file would have a Unix timestamp of 5pm on the same day.

Has anyone come across a similar issue?

No output when running for openvas

I've updated vulnwhisp.py, added the openvas section to the config file, and added openvas.py to the frameworks folder. When I run vuln_whisperer -c configs/frameworks_example.ini -s openvas there is no output and I am immediately returned to the command prompt.

OpenVAS result not processed

Hi,

The json result file from openvas scan is not processed by logstash:

[2018-03-05T12:36:52,893][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:53,894][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:54,896][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:55,897][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634
[2018-03-05T12:36:56,899][DEBUG][logstash.inputs.file ] each: file grew: /opt/vulnwhisp/openvas/openvas_scan_e05c3ff470a1439d9525df35ac064be6_1520247708.json: old size 0, new size 2634

This is due to the file not having an end-of-line character at the end. If I append a blank line to the JSON file, Logstash processes it fine.
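
A possible way to patch this up from the shell until the file is written with a trailing newline (the path pattern comes from the log above):

# append a newline to any OpenVAS result file that doesn't already end with one
for f in /opt/vulnwhisp/openvas/openvas_scan_*.json; do
    [ -n "$(tail -c 1 "$f")" ] && echo "" >> "$f"
done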

Thanks.
Best regards.

support for OWASP ZAP

OWASP ZAP is one of the most popular open source web application vulnerability scanners.
It would be really cool to support it; I notice it isn't on the list of supported scanners. Otherwise, how could VulnWhisperer be extended to support it?

['[FAIL] ERROR:'] Nessus: Download Fails with Larger Scan

An authenticated scan of 1000 hosts generated a file around 500 MB in size when exported as a csv manually. This tool fails to download this scan, but after I removed that scan it succeeded at downloading a much smaller scan that targeted only 1 host.

Client: Archlinux 4.16.10-1-ARCH
VulnWhisperer: 1.5 and master
Python: 2.7.15

Server: Centos 7.5.1804
Nessus: Professional 7.1.0

(root@endor-vm) vulnwhisperer # python2 bin/vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Creating directory /opt/vulnwhisperer/database
[INFO] Creating file /opt/vulnwhisperer/database/report_tracker.db
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on sherlock.strsoh.org:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 2 scans to be processed
Download for file id 1720763221.
..............................
...............
[FAIL] ERROR:

(root@endor-vm) vulnwhisperer # python2 bin/vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on sherlock:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 1 scans to be processed
Download for file id 1331664840.
.
Processing 1/1 for scan: easton-mbx1
[INFO] 522 records written to easton-mbx1_55_56_1527889658.csv

Can't get it to go beyond converting the Nessus file to CSV

Hi,
Still can't get it to do anything beyond converting the Nessus file to CSV. Both the Logstash and Kibana services show as running OK. Not sure what I am doing wrong, but it doesn't generate a report and/or output the visualization to Kibana. I have followed the video sample step by step, but it isn't getting me anywhere.

Please advise

below is the output and attached are the logstash yml, logstash conf and .ini file
logstash_conf.txt
logstash_yml.txt
ini file.txt

OUTPUT
test@test-virtual-machine:~/Desktop/VulnWhisperer-master$ sudo vuln_whisperer -c configs/frameworks_example.ini -s nessus
[INFO] Connected to database at /opt/vulnwhisp/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on localhost:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 9 scans to be processed
[INFO] Directory already exist for /opt/vulnwhisp/nessus/Trash - Skipping creation
[INFO] Directory already exist for /opt/vulnwhisp/nessus/My Scans - Skipping creation
Download for file id 1168345030.
.
Processing 1/9 for scan: test
[INFO] 78 records written to test_25_26_1521411373.csv
Download for file id 819727134.
.
Processing 2/9 for scan: scan
[INFO] 86 records written to scan_22_23_1520385708.csv
Download for file id 1969498333.
.
Processing 3/9 for scan: attemptthird
[INFO] 79 records written to attemptthird_19_20_1519954223.csv
Download for file id 79218840.
.
Processing 5/9 for scan: attempt 1
[INFO] 79 records written to attempt_1_14_15_1519952202.csv
Download for file id 950718717.
.
Processing 8/9 for scan: test
[INFO] 79 records written to test_7_8_1519946993.csv

Does it support multiple scanners?

Is VulnWhisperer designed to support multiple Nessus scanners?
I've read another comment about Tenable SecurityCenter saying that it only works when you address a scanner directly. That would not be a big issue if I am able to address all of them.

Will not pull scans from Nessus

root@scanbox:/opt/vulnwhisperer/configs# cat example.ini
[nessus]
enabled=true
hostname=192.168.1.54
port=8834
username=scanner
password=!myP@SSWORDisPASSW0RD
write_path=/opt/vulnwhisperer/nessus/
db_path=/opt/vulnwhisperer/database
trash=false
verbose=true

root@sanbox:/bin# vuln_whisperer -c /opt/vulnwhisperer/configs/example.ini -s nessus
[INFO] Creating file /opt/vulnwhisperer/database/report_tracker.db
[INFO] Connected to database at /opt/vulnwhisperer/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on 192.168.1.54:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 0 scans to be processed
[INFO] No new scans to process. Exiting...

I have over 200 scans on this server.

Could not locate that index-pattern-field

In the "VulnWhisperer - Reporting" and "VulnWhisperer - Reporting Qualys Scoring" dashboards, I have the following errors:

  • Could not locate that index-pattern-field (id: asset.keyword)
  • Could not locate that index-pattern-field (id: plugin_name.keyword)

In the "VulnWhisperer - Risk Mitigation" dashboard, I have the following errors:

  • Could not locate that index-pattern-field (id: @timestamp)
  • Could not locate that index-pattern-field (id: risk_score)
  • Could not locate that index-pattern-field (id: cvss)
  • Could not locate that index-pattern-field (id: plugin_name.keyword)
  • Could not locate that index-pattern-field (id: scan_name.keyword)
  • Could not locate that index-pattern-field (id: asset.keyword)

In the "VulnWhisperer - Risk Mitigation Qualys Web Scoring" dashboard, I have the following errors:

  • Could not locate that index-pattern-field (id: @timestamp)
  • Could not locate that index-pattern-field (id: risk_score)
  • Could not locate that index-pattern-field (id: cvss)
  • Could not locate that index-pattern-field (id: plugin_name.keyword)
  • Could not locate that index-pattern-field (id: scan_name.keyword)
  • Could not locate that index-pattern-field (id: asset.keyword)
  • Could not locate that index-pattern-field (id: operating_system.keyword)
  • Could not locate that index-pattern-field (id: owner.keyword)

Fields in the logstash-vulnwhisperer-* index are:

  • _id
  • _index
  • _score 
  • _source
  • _type

Index was loaded:

curl -XPUT 'http://192.168.160.5:9200/logstash-vulnwhisperer-template' -d@/home/ansible/VulnWhisperer/elasticsearch/logstash-vulnwhisperer-template.json

I'm not sure if this is a bug, an incorrectly loaded index or data that needs to be configured in OpenVAS?

This is with ELK version 5.6.9.
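
For reference, on ES 5.x/6.x an index template is normally loaded through the _template endpoint rather than created as an index; one way to load it would be along these lines (same host and file path as above; verify against your setup):

curl -XPUT 'http://192.168.160.5:9200/_template/logstash-vulnwhisperer-template' \
     -H 'Content-Type: application/json' \
     -d @/home/ansible/VulnWhisperer/elasticsearch/logstash-vulnwhisperer-template.json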

Remove InsecureRequestWarning workaround

The latest versions of requests and urllib3 no longer support the workaround being used to suppress the InsecureRequestWarning. Execution will fail with the following:

...(snip)...
from requests.packages.urllib3.exceptions import InsecureRequestWarning
ImportError: cannot import name InsecureRequestWarning

This is fixed by removing the imports. I can push a fix, or if you guys just want to fix it, that's cool too. Thanks for a great tool!

Kibana crashes when opening dashboards

When I open one of your dashboard templates my kibana crashes with the following messages:

Version: 6.2.2
Build: 16588
Error: Uncaught TypeError: gaugeTypes[_this.gaugeConfig.type] is not a constructor (http://172.18.48.110/bundles/vendors.bundle.js?v=16588:12)
    at window.onerror (http://172.18.48.110/bundles/commons.bundle.js?v=16588:21:468094)

Does not do anything after "Identified scans to be processed"

Hi,
I have managed to get the command working with "touch report_tracker.db"; however, the script acknowledges the Nessus scan but doesn't go any further than that. Not sure what is going on. Can anyone help?

->/opt/VulnWhisperer$ vuln_whisperer -c configs/nessus.ini -s nessus
[INFO] Connected to database at /opt/VulnWhisperer/vulnwhisp/database/report_tracker.db
[INFO] Attempting to connect to nessus...
[SUCCESS] Connected to nessus on localhost:8834
[INFO] Gathering all scan data... this may take a while...
[INFO] Identified 1 scans to be processed

Improved Readme for importing nessus csv into Logstash

I have a fresh install of Elasticsearch, Logstash, Kibana, and Nessus in default locations on the same Ubuntu 16.04 server VM. I managed to import the Nessus scans and got the CSV files in the '/opt/vulnwhisp/nessus/My Scans' folder. I need some guidance on importing the CSV files into Logstash or Elasticsearch as a new index. Sorry, I am new to the Elastic Stack; I'm trying to get VulnWhisperer operational and will be happy to contribute instructions on how to do the entire setup.

Thanks in advance for your help.
Singh

allow error logs

Maybe find a way, on both Windows and Linux, to get an error log created so that failures can be tracked.

Fails to Download Scans

I have set up VulnWhisperer as instructed, but when I run it, it authenticates and then fails to download the 3 scans identified.

Is there any way to see more verbose logging of why it's failing?

I thought it could be a file permissions issue, but it turns out it wasn't.

Open to ideas.


Nessus 7 CSV Parsing Error

All piped reports are getting the same parsing error:

Error parsing csv {:field=>"message", :source=>"  Recommended upgrade : Server 2008 R2 Service Pack 1 
\" \"79638\",\"CVE-2014-6321\",\"10.0\",\"Critical\",\"x.x.x.x\",\"tcp\",\"3389\",
\"MS14-066: Vulnerability in Schannel Could ", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>}```

Docker support

It would be nice if this project supported Docker deployment.

Tks

Qualys VM setup

Hello,
I'm hoping this is a simple question. But for the moment I am unable to resolve.
I have lots of data in Qualys VM and would like to try VulnWhisperer. However, it appears the default Qualys pull request is for the WAS application. How/where do I set up VulnWhisperer for Qualys VM?
Thanks!
Richard

enable selection based on date

Allow the ability to pass a date, or the last x days, so that it will only pull Nessus scans from that time period up to the current date.

For instance, be able to tell VulnWhisperer to only pull the last 3 days.

Empty json reports - qualys - unexpected keyword argument 'lines'

Hello. I am unable to get the json output for my qualys scans.

[ACTION] - Generating report for 12206112
[INFO] - Successfully generated report! ID: 12206112
[INFO] - New Report ID: 813288
[ACTION] - File written to /opt/vulnwhisp/qualys/813288.csv
[ACTION] - Downloading file ID: 813288
[FAIL] - Could not process 12206112 - to_json() got an unexpected keyword argument 'lines'

The CSV reports generate just fine, but not the JSON - the JSON files are empty due to the above error. As far as I can tell, the logstash filters for qualys are set up for ingesting the JSON, not the CSV, correct? Can I do anything with the CSV reports that are generated, or do they exist only as a step in creating the final JSON report format?

Implement logging

Logging should be implemented for nightly runs to show the status of jobs.

Check if config file exists, and provide error message

Create a different error message if the config ini file does not exist. The current error says "Could not properly load your config! Reason: No Section: 'nessus'".

Stating that the file does not exist would help with troubleshooting.

allow all ini arguments to be passed in

Allowing arguments instead of an ini file will help reduce file dependence when running in Docker containers.

All ini values could then be passed on the docker run command line, or in a docker-compose file.

in addition to scan results, store and ship scan metadata about scans

Use this to track when scans are running, when they are supposed to run, and their latest results. This should be a separate scan index.

It should include all the data about a scan result, without the history information, including:

status, control, uuid, name, read, enabled, owner, creation_date, user_permissions, folder_id, starttime, timezone, last_modification_date, shared, type, id, rrules

Missing dependencies from documentation

On Debian 8, the following additional packages must be installed and aren't getting picked up anywhere else:
zlib1g-dev
libxml2-dev
libxslt1-dev

Update the docs to reflect this. Again, I can do it and push if you want. Thanks!

Importing Supplied ES template into ES 5.5 results in errors

#! Deprecation: match_mapping_type [float] is invalid and will be ignored: No field type matched on [float], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [byte] is invalid and will be ignored: No field type matched on [byte], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [short] is invalid and will be ignored: No field type matched on [short], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [integer] is invalid and will be ignored: No field type matched on [integer], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: match_mapping_type [geo_point] is invalid and will be ignored: No field type matched on [geo_point], possible values are [object, string, long, double, boolean, date, binary]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [source]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [synopsis]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [see_also]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [cve]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [solution]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [@Version]
#! Deprecation: The [string] field is deprecated, please use [text] or [keyword] instead on [risk]
#! Deprecation: [omit_norms] is deprecated, please use [norms] instead with the opposite boolean value
{
"acknowledged": true
}

Current Versions:
elasticsearch v5.5.1
kibana v5.5.1
logstash v5.5.1

Token error with Nessus

I am able to connect to Nessus per the documentation. However, after my scans are identified, the next line says

[ERROR] 'token'

local variable 'token_id' referenced before assignment.

I am trying to debug as well.
