How to add a Nessus CSV report to Logstash

Hi guys

I am kind of new to ELK. I am trying to add a CSV file, which is a Nessus report, to Logstash in order to analyze it.
What I have done is create a .conf file for Logstash, but when I run `logstash -f configfile.conf`, it gives me tons of errors saying that Logstash cannot parse the CSV file correctly.

```
input {
  file {
    path => "/root/csvs/*.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["@timestamp", "interface", "bytes in", "bytes out"]
    separator => ","
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
  # stdout {
  #   codec => rubydebug
  # }
}
```

Can somebody please tell me how to fix this?

Thank you,


Hi Abe,

Do you have a small sample of the CSV file? If not, that's OK; I'll generate a sample one myself.

Hi Peter,

These are the first couple of lines of my CSV file:

```
Plugin ID,CVE,CVSS,Risk,Host,Protocol,Port,Name,Synopsis,Description,Solution,See Also,Plugin Output
"10107","","","None","1.1.1.1","tcp","80","HTTP Server Type and Version","A web server is running on the remote host.","This plugin attempts to determine the type and the version of the
remote web server.","n/a","","The remote web server type is :

Apache/2.2.1 (Red Hat)

You can set the directive 'ServerTokens Prod' to limit the information
emanating from the server in its response headers."
"10107","","","None","1.1.1.1","tcp","443","HTTP Server Type and Version","A web server is running on the remote host.","This plugin attempts to determine the type and the version of the
remote web server.","n/a","","The remote web server type is :
```

Thank you,
Abe


I changed my config file to this, but nothing changed :pensive:

```
input {
  file {
    path => "/root/csvs/*.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => [
      "Plugin ID",
      "CVE",
      "CVSS",
      "Risk",
      "Host",
      "Protocol",
      "Port",
      "Name",
      "Synopsis",
      "Description",
      "Solution",
      "See Also",
      "Plugin Output"
    ]
    separator => ","
    remove_field => ["message"]
  }
  date {
    match => ["time", "ISO8601"]
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => "localhost"
    index => "logstash-%{+YYYY.MM.dd}"
    workers => 1
  }
  # stdout {
  #   codec => rubydebug
  # }
}
```


When you post source data or code in this app, surround the text with triple backticks.

About your problem:
Do you have newlines in your field data?
If so, I don't think the CSV library that we use can cope with newlines inside field data, as it assumes that the text before a newline is a full CSV line with the correct number of commas, etc.
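To illustrate the point: embedded newlines inside quoted fields are perfectly valid CSV, which is exactly why a record-aware parser and a line-oriented reader disagree about them. Here is a minimal sketch using Python's `csv` module purely as a stand-in for a record-aware parser (the sample record below is a made-up Nessus-style row, not real data):

```python
import csv
import io

# A Nessus-style record whose quoted last field spans three physical
# lines -- valid CSV, but three separate "events" to a line-oriented
# reader such as Logstash's file input.
sample = (
    '"10107","","","None","1.1.1.1","tcp","80","HTTP Server Type",'
    '"A web server is running.","line one\nline two\nline three"\n'
)

# A record-aware CSV parser returns a single row of 10 fields.
rows = list(csv.reader(io.StringIO(sample)))
print(len(rows))       # 1 logical record
print(len(rows[0]))    # 10 fields
print(rows[0][-1])     # the multiline field, newlines intact

# A naive line-by-line split sees three fragments instead.
print(len(sample.splitlines()))  # 3
```

So the same bytes are one record to a CSV parser but three broken lines to anything that splits on newlines first, which would explain the parse errors above.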


That newline point is a good one. Can you confirm whether any newline characters exist inside the fields of your CSV?

Also, do you have examples of the errors it was throwing?

Hi,

I am also facing the same problem with Nessus data. Were you able to parse it correctly?

Thanks

Hi guys,

I just want to check whether anybody has an answer to my question.
I need to import a Nessus report log (CSV file) into ELK somehow. I'll leave a couple of lines of the CSV file here; they might be helpful.

```
89082,CVE-2016-0799,9.3,High,1.1.1.1,tcp,443,OpenSSL 1.0.2 < 1.0.2g Multiple Vulnerabilities (DROWN),The remote service is affected by multiple vulnerabilities.,"According to its banner, the remote host is running a version of
OpenSSL 1.0.2 prior to 1.0.2g. It is, therefore, affected by the
following vulnerabilities :

  • A key disclosure vulnerability exists due to improper
    handling of cache-bank conflicts on the Intel
    Sandy-bridge microarchitecture. An attacker can exploit
    this to gain access to RSA key information.
    (CVE-2016-0702)

  • A double-free error exists due to improper validation of
    user-supplied input when parsing malformed DSA private
    keys. A remote attacker can exploit this to corrupt
    memory, resulting in a denial of service condition or
    the execution of arbitrary code. (CVE-2016-0705)

  • A NULL pointer dereference flaw exists in the
    BN_hex2bn() and BN_dec2bn() functions. A remote attacker
    can exploit this to trigger a heap corruption, resulting
    in the execution of arbitrary code. (CVE-2016-0797)

  • A denial of service vulnerability exists due to improper
    handling of invalid usernames. A remote attacker can
    exploit this, via a specially crafted username, to leak
    300 bytes of memory per connection, exhausting available
    memory resources. (CVE-2016-0798)

  • Multiple memory corruption issues exist that allow a
    remote attacker to cause a denial of service condition
    or the execution of arbitrary code. (CVE-2016-0799)

  • A flaw exists that allows a cross-protocol
    Bleichenbacher padding oracle attack known as DROWN
    (Decrypting RSA with Obsolete and Weakened eNcryption).
    This vulnerability exists due to a flaw in the Secure
    Sockets Layer Version 2 (SSLv2) implementation, and it
    allows captured TLS traffic to be decrypted. A
    man-in-the-middle attacker can exploit this to decrypt
    the TSL connection by utilizing previously captured
    traffic and weak cryptography along with a series of
    specially crafted connections to an SSLv2 server that
    uses the same private key. (CVE-2016-0800)",Upgrade to OpenSSL version 1.0.2g or later.,"https://www.openssl.org/news/secadv/20160301.txt
    https://www.openssl.org/news/cl102.txt
    https://drownattack.com/
    https://www.drownattack.com/drown-attack-paper.pdf","
    Banner : Apache/2.4.17 (Win32) OpenSSL/1.0.2d PHP/5.6.14
    Reported version : 1.0.2d
    Fixed version : 1.0.2g
    "
```
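One workaround, given the multiline-field issue raised above, is to pre-process the report so every record becomes a single physical line before Logstash reads it. Here is a minimal sketch (the sample snippet is made up, and collapsing newlines to spaces is my own assumption about what's acceptable for your data; with a real report you would read and write files instead of in-memory strings):

```python
import csv
import io

# A made-up Nessus-style snippet whose quoted Description field spans
# two physical lines (hypothetical sample data, not a real report).
raw = ('Plugin ID,CVE,Description\n'
       '"10107","","This plugin attempts to determine the type of the\n'
       'remote web server."\n')

# Record-aware read: the csv module keeps the quoted newline inside one
# field, so each iteration yields one complete logical record.
flat_lines = []
for row in csv.reader(io.StringIO(raw)):
    # Collapse embedded newlines so each record fits on one physical
    # line, which Logstash's line-oriented file input can then handle.
    cleaned = [field.replace("\r", " ").replace("\n", " ") for field in row]
    buf = io.StringIO()
    csv.writer(buf).writerow(cleaned)
    flat_lines.append(buf.getvalue())

flat = "".join(flat_lines)
print(flat)  # two physical lines, one per record
```

After flattening, the `columns` list from the csv filter config above should line up with exactly one event per record.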