Simple mutate copy of IP to string doesn't work

What am I missing here?

Suricata logs store the source and destination fields as type IP, which is fine, but I also need that information in string format for another purpose. So I just want to copy it to another field, but it won't do it. No errors are given; it just doesn't happen.

# this doesn't work
filter {
  if "suricata" in [event][module] {
    mutate {
      copy => {
        "[source][ip]"      => "[source][address]"
        "[destination][ip]" => "[destination][address]"
      }
    }
  }
}

# this doesn't work either
filter {
  if "suricata" in [service][type] {
    mutate {
      # copy => { "ip_address" => "source_address" }
      add_field => {
        "source_address"      => "%{[source][ip]}"
        "destination_address" => "%{[destination][ip]}"
      }
    } # end mutate
  } # end if
} # end filter

Any help would be appreciated; this should be simple.
Thanks

What does an event look like if you use

output { stdout { codec => rubydebug } }

I get no errors; it just passes the data through, but it doesn't add or copy the field. I've tried every combination I can think of:
filter {
  mutate {
    # approach 1
    add_field => { "source_address2" => "%{[source][ip]}" }
    # approach 2
    copy => { "[source][ip]" => "source_address" }
  }
}

I get no errors:
[2019-07-02T01:29:33,109][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-07-02T01:29:33,121][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-02T01:29:33,206][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-02T01:29:33,210][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-07-02T01:29:33,468][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
    "service" => {
        "type" => "suricata"
    },
    ...

In the log:
"source_address2" => "%{[source][ip]}",

source_address doesn't show up at all.

I'm at a loss; any suggestions would be appreciated.
Thank you, Badger, for all your help.

Darrell

This is how the fields show up in Kibana, with an IP address attached:
[screenshots of the fields in Kibana]

A little more explanation of what I'm trying to do (it seems simple):

I have a translation table of devices on my network, and I'd like to be able to translate source.ip and destination.ip to a device name.

Example:

source.ip: 192.168.1.105

translates to "Building 4 Rm 2 Panasonic Projector"

This works fine with Zeek logs, because Zeek has a field called source.address that holds the source IP in text/string form.

Suricata stores source.ip and destination.ip in IP form.
Translation tables, IF statements, and anything else I try don't seem to work on type IP.
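
For reference, here is roughly the kind of lookup I'm after: a minimal sketch using the translate filter, where the dictionary path, the target field name, and the YAML contents are just placeholders for illustration:

filter {
  translate {
    # whichever field actually holds the source IP as a string
    field => "[source][ip]"
    # hypothetical target field for the looked-up device name
    destination => "[source][device_name]"
    # hypothetical YAML lookup table, with entries like:
    #   "192.168.1.105": "Building 4 Rm 2 Panasonic Projector"
    dictionary_path => "/etc/logstash/devices.yml"
    # value to use when the IP is not in the table
    fallback => "unknown device"
  }
}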

Logstash does not have an IP data type. Elasticsearch and Kibana do, but Logstash does not. That's why I am asking you to show us what your data looks like in rubydebug.

Thank you for your help, Badger.

Here is the filter I'm running:
filter {
  mutate {
    add_field => { "source_address2" => "%{[source][ip]}" }
    copy => { "[source][ip]" => "source_address" }
  }
}

source_address doesn't even appear, and source_address2 just contains the literal string "%{[source][ip]}". I've tried every combination I can think of.

Here is some of the output from rubydebug:

Thread.exclusive is deprecated, use Thread::Mutex
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2019-07-03T00:28:36,212][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-07-03T00:28:42,582][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2019-07-03T00:28:42,586][INFO ][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>750, :thread=>"#<Thread:0x124f3f1e run>"}
[2019-07-03T00:28:42,869][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-07-03T00:28:42,883][INFO ][logstash.javapipeline    ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-03T00:28:42,974][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-03T00:28:42,981][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-07-03T00:28:43,201][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
{
          "input" => {
    "type" => "log"
},
"source_address2" => "%{[source][ip]}",
            "ecs" => {
    "version" => "1.0.0"
},
          "event" => {
    "dataset" => "suricata.eve",
     "module" => "suricata"
},
        "service" => {
    "type" => "suricata"
},
        "fileset" => {
    "name" => "eve"
},
           "host" => {
             "name" => "elk72",
     "architecture" => "x86_64",
               "id" => "0e26c676bb604f558630d782e5dae2ef",
    "containerized" => false,
               "os" => {
        "codename" => "bionic",
         "version" => "18.04.2 LTS (Bionic Beaver)",
        "platform" => "ubuntu",
          "family" => "debian",
            "name" => "Ubuntu",
          "kernel" => "4.15.0-54-generic"
    },
         "hostname" => "elk72"
},
       "@version" => "1",
            "log" => {
      "file" => {
        "path" => "/var/log/suricata/eve.json"
    },
    "offset" => 10440991
},
     "@timestamp" => 2019-07-02T01:32:03.134Z,
           "tags" => [
    [0] "es72",
    [1] "filebeat",
    [2] "suricata",
    [3] "beats_input_raw_event"
],
 ...

Generally, if the source field of a filter is nil, the filter does nothing, which may be why your mutate+copy does nothing.

The rubydebug output that you show does not include a [source] object containing an [ip] field, so I think what you are seeing is expected.
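
Also note that an unresolved %{...} sprintf reference in add_field is left in the field as literal text, which is exactly what you are seeing in source_address2. If the field may be absent on some events, you can guard the copy with a conditional. A minimal sketch (the tag name is just an example):

filter {
  if [source][ip] {
    mutate {
      copy => { "[source][ip]" => "source_address" }
    }
  } else {
    # tag the event so you can find the ones missing the field
    mutate {
      add_tag => [ "missing_source_ip" ]
    }
  }
}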

Badger, thank you for talking this through with me. I got it working, but I'm still confused as to why.

This works:
filter {
  mutate {
    copy => {
      "[json][src_ip]"  => "source_address"
      "[json][dest_ip]" => "destination_address"
    }
  }
}
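
In case it helps anyone else, here is a guarded version of the same filter, a sketch that reuses the [event][module] check from earlier and assumes the [json] fields may be absent on some events:

filter {
  if [event][module] == "suricata" and [json][src_ip] {
    mutate {
      copy => {
        "[json][src_ip]"  => "source_address"
        "[json][dest_ip]" => "destination_address"
      }
    }
  }
}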

The fields show up in rubydebug nested under [json];
in Kibana they are displayed under [suricata][eve].

example:
[suricata][eve][source][ip]
[suricata][eve][destination][ip]

But when you do a search, or type in the query field, they come up as:
suricata.eve.src_ip
suricata.eve.dest_ip

Is this some kind of alias?

Are you using the Filebeat suricata module? That renames fields and makes other changes during ingestion (after the events have gone through Logstash).

Generally, modules are intended to get you up and running and looking at data in Kibana as quickly as possible. If you need something that the module does not do, you may be better off starting over from scratch.

Yes, I'm using the suricata module. My workflow is:
Suricata -> Filebeat (suricata module) -> Logstash -> RabbitMQ -> Logstash (a lot of enrichment) -> Elasticsearch

Thank you for your help, Badger.
I really do appreciate it.
