Inject data from a log file

Hi everyone. I have a .log file in the format shown below. What Logstash configuration do I need to ingest it and index it into Elasticsearch?
File.log

MODSEN=000&I=216F6F057C105410&TS=Sun, 17 /9/10, 17:30:0&BAT=91&TCA=17.23&HUMA=93&PA=100.1&ANE=3.1&PLUV=0.00&DIR=140&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 18:0:1&BAT=91&TCA=17.73&HUMA=93&PA=99.9&ANE=5.1&PLUV=0.00&DIR=130&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 20:30:1&BAT=91&TCA=17.32&HUMA=93&PA=99.8&ANE=3.1&PLUV=0.00&DIR=50&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 21:0:0&BAT=91&TCA=17.32&HUMA=93&PA=99.8&ANE=3.1&PLUV=0.00&DIR=50&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 21:30:0&BAT=91&TCA=17.66&HUMA=93&PA=99.7&ANE=2.6&PLUV=0.00&DIR=50&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 22:0:0&BAT=91&TCA=17.66&HUMA=93&PA=99.7&ANE=2.6&PLUV=0.00&DIR=50&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 22:30:0&BAT=91&TCA=17.61&HUMA=93&PA=99.7&ANE=5.1&PLUV=0.00&DIR=360&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 23:0:0&BAT=91&TCA=17.33&HUMA=93&PA=99.7&ANE=5.1&PLUV=0.00&DIR=360&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Sun, 17/9/10, 23:30:0&BAT=91&TCA=15.00&HUMA=93&PA=99.7&ANE=9.8&PLUV=0.00&DIR=340&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Mon, 17/9/11, 0:0:1&BAT=91&TCA=14.37&HUMA=87&PA=99.9&ANE=8.2&PLUV=0.00&DIR=340&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Mon, 17/9/11, 0:30:1&BAT=91&TCA=14.38&HUMA=87&PA=99.9&ANE=8.2&PLUV=0.00&DIR=340&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Mon, 17/9/11, 1:0:1&BAT=91&TCA=13.30&HUMA=87&PA=99.9&ANE=9.8&PLUV=0.00&DIR=340&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Mon, 17/9/11, 1:30:0&BAT=91&TCA=11.32&HUMA=87&PA=99.9&ANE=10.3&PLUV=0.00&DIR=330&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Thu, 17/9/14, 19:57:1&BAT=91&TCA=17.21&HUMA=42&PA=101.7&ANE=3.1&PLUV=0.00&DIR=190&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Thu, 17/9/14, 20:0:2&BAT=91&TCA=17.21&HUMA=42&PA=101.7&ANE=3.1&PLUV=0.00&DIR=190&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158
MODSEN=000&I=216F6F057C105410&TS=Thu, 17/9/14, 20:5:1&BAT=91&TCA=17.21&HUMA=42&PA=101.7&ANE=3.1&PLUV=0.00&DIR=190&US=9.99&GPS_LAT=-34.638112&GPS_LNG=-58.397158

Hi,

You can build and test the filter yourself at https://grokdebug.herokuapp.com/

Looks like you should be able to parse this with a kv filter, then apply a date filter to the TS field.
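A minimal sketch of that approach (field names are taken from the sample lines above; the date pattern is a guess based on the TS values, e.g. "Sun, 17/9/10, 17:30:0" reads as day-of-week, yy/M/d, H:m:s with single-digit fields, so it may need adjusting):

```
filter {
  kv {
    # Split "key=value" pairs on "&"
    field_split => "&"
    value_split => "="
  }
  date {
    # Joda single-letter patterns (M, d, H, m, s) accept one- or two-digit values
    match => ["TS", "EEE, yy/M/d, H:m:s"]
    target => "@timestamp"
  }
}
```

If the timestamps are not in the server's local timezone, you would also want a timezone option on the date filter.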

Hi, this is my config, but I don't see any data in Kibana:

input {
  file {
    path => "\C:\ELK\logstash-6.0.0\data\node-red.txt"
  }
}

filter {
  kv {
    field_split => "&?"
  }
}

output {
  elasticsearch {
    index => ["log-%{+yyyy}"]
    hosts => ["10.10.0.113:9200"]
  }
  stdout { codec => rubydebug }
}

Output:

C:\ELK\logstash-6.0.0\bin>logstash -f C:\ELK\logstash-6.0.0\data\Tcp.bat
Sending Logstash's logs to C:/ELK/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-28T10:15:06,567][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-11-28T10:15:06,598][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-28T10:15:07,036][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-11-28T10:15:09,004][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-28T10:15:14,198][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.10.0.113:9200/]}}
[2017-11-28T10:15:14,213][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.10.0.113:9200/, :path=>"/"}
[2017-11-28T10:15:14,370][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.10.0.113:9200/"}
[2017-11-28T10:15:14,432][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-28T10:15:14,448][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-28T10:15:14,463][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.10.0.113:9200"]}
[2017-11-28T10:15:14,495][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x2e4902d5@C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-11-28T10:15:15,089][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2017-11-28T10:15:15,214][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}

While you are creating the configuration and verifying that you can parse the data correctly, I recommend using only a stdout output with a rubydebug codec instead of sending the data to Elasticsearch. This speeds up troubleshooting and lets you focus on one thing at a time. Once you are happy with the results, you can start sending data to Elasticsearch and make sure the mappings are correct, etc.
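A minimal debugging pipeline along these lines should be enough (I've dropped the stray leading "\" from your path and switched to forward slashes, which the file input handles fine on Windows; sincedb_path => "NUL" is a Windows-specific trick to force a fresh read of the file on every run while testing):

```
input {
  file {
    # Forward slashes work on Windows; note there is no leading "\" before C:
    path => "C:/ELK/logstash-6.0.0/data/node-red.txt"
    start_position => "beginning"
    # Windows has no /dev/null, so point sincedb at NUL to re-read the file each run
    sincedb_path => "NUL"
  }
}

filter {
  kv {
    field_split => "&"
  }
}

output {
  stdout { codec => rubydebug }
}
```

If events show up on stdout with the expected fields, you can then add the elasticsearch output back.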

Here are my new config and output; it still does not work:

input {
  file {
    path => "\C:\ELK\logstash-6.0.0\data\node-red.txt"
    start_position => "beginning"
    type => "log"
    ignore_older => 0
  }
}

filter {
  kv {
    field_split => "&"
    value_split => "="
  }

  mutate {
    convert => ["MODSEN", "integer"]
    convert => ["I", "integer"]
    convert => ["TS", "float"]
    convert => ["BAT", "integer"]
    convert => ["TCA", "float"]
    convert => ["HUMA", "float"]
    convert => ["PA", "float"]
    convert => ["ANE", "float"]
    convert => ["PLUV", "float"]
    convert => ["DIR", "float"]
    convert => ["US", "float"]
    convert => ["GPS_LAT", "float"]
    convert => ["GPS_LNG", "float"]
  }
}

date {
  match => ["TS", "EEE, yy/MM/dd, HH:mm:ss"]
  target => "@timestamp"
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => ["log-%{+yyyy}"]
    hosts => ["10.10.0.113:9200"]
  }
}


C:\ELK\logstash-6.0.0\bin>logstash -f C:\ELK\logstash-6.0.0\data\Tcp.bat   
Sending Logstash's logs to C:/ELK/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-28T11:24:01,258][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-11-28T11:24:01,258][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-28T11:24:01,367][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-11-28T11:24:02,492][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-28T11:24:02,601][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 34, column 6 (byte 781) after ", :backtrace=>["C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:42:in `compile_ast'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:50:in `compile_imperative'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:54:in `compile_graph'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:107:in `compile_lir'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", 
"C:/ELK/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

As far as I can see there are several issues in your config:

- The date filter sits outside the filter block, which is exactly what the "Expected one of #, input, filter, output at line 34" error is complaining about. Move it inside filter { }.
- The file path starts with a stray "\" before "C:", so the file input will never find the file.
- You convert TS to a float before the date filter runs, which destroys the string the date pattern needs to match.
- The I field is a hex ID, so converting it to an integer will mangle it.

Look at the examples in the documentation and format the configuration parameters correctly.
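A corrected version of your config along these lines should compile and index data (assumptions: the date pattern is inferred from the sample TS values; sincedb_path => "NUL" is a Windows testing convenience; I've removed ignore_older => 0, which tells the file input to skip files older than 0 seconds, i.e. everything):

```
input {
  file {
    path => "C:/ELK/logstash-6.0.0/data/node-red.txt"
    start_position => "beginning"
    type => "log"
    # Re-read the file on every run while testing (Windows equivalent of /dev/null)
    sincedb_path => "NUL"
  }
}

filter {
  kv {
    field_split => "&"
    value_split => "="
  }

  mutate {
    convert => ["MODSEN", "integer"]
    # I (a hex ID) and TS (needed as a string by the date filter) are left alone
    convert => ["BAT", "integer"]
    convert => ["TCA", "float"]
    convert => ["HUMA", "float"]
    convert => ["PA", "float"]
    convert => ["ANE", "float"]
    convert => ["PLUV", "float"]
    convert => ["DIR", "float"]
    convert => ["US", "float"]
    convert => ["GPS_LAT", "float"]
    convert => ["GPS_LNG", "float"]
  }

  # Inside the filter block, and with single-letter patterns so that
  # one-digit day/hour/minute/second values (e.g. "17:30:0") still match
  date {
    match => ["TS", "EEE, yy/M/d, H:m:s"]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    # index takes a string, not an array
    index => "log-%{+yyyy}"
    hosts => ["10.10.0.113:9200"]
  }
}
```

Verify the parsed events on stdout first; once TS resolves into @timestamp correctly, the documents in Elasticsearch will land in the right time-based index.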

I still do not understand what the errors are.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.