File input in Logstash

Hi, this is my Logstash config file, and below I paste the output. I cannot figure out the errors I am getting. Any help is appreciated.

This is the file I want to parse:

MODSEN:000#I:216F6F057C105410#TS:Tue, 17/11/28, 16:18:0#BAT:91#TCA:19.99#HUMA:59#PA:1020#ANE:6.7#PLUV:0.00#DIR:100#US:9.99#GPS_LAT:-34.638112#GPS_LNG:-58.397158
MODSEN:000#I:216F6F057C105410#TS:Tue, 17/11/28, 16:20:1#BAT:91#TCA:19.99#HUMA:59#PA:1020#ANE:6.7#PLUV:0.00#DIR:100#US:9.99#GPS_LAT:-34.638112#GPS_LNG:-58.397158
MODSEN:000#I:216F6F057C105410#TS:Tue, 17/11/28, 16:22:0#BAT:91#TCA:19.99#HUMA:59#PA:1020#ANE:6.7#PLUV:0.00#DIR:100#US:9.99#GPS_LAT:-34.638112#GPS_LNG:-58.397158

  input {
   file {
path => "\C:\ELK\logstash-6.0.0\data\lognode.txt"
    start_position => "beginning"
    type => "log"
    sincedb_path => "NUL"
    ignore_older => "0"
}
}

filter {

 kv { 
field_split => "#"
value_split => ":"
 }

mutate {
        convert => ["MODSEN","integer"]
        convert => ["I","integer"]
        convert => ["BAT","integer"]
        convert => ["TCA","float"]
        convert => ["HUMA","float"]
        convert => ["PA","float"]
        convert => ["ANE","float"]
        convert => ["PLUV","float"]
        convert => ["DIR","float"]
        convert => ["US","float"]
        convert => ["GPS_LAT","float"]
        convert => ["GPS_LNG","float"]

   }
}
    date{
        match => ["TS", "EEE, yy/MM/dd, HH:mm:ss"]
        target => "@timestamp"
   }
output{
    stdout { codec => rubydebug } 
      elasticsearch {
            index => ["log"]
            hosts => ["10.10.0.113:9200"]
            }
     
}

C:\ELK\logstash-6.0.0\bin>logstash -f C:\ELK\logstash-6.0.0\logs\logstash-plain.log
Sending Logstash's logs to C:/ELK/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-28T17:19:42,755][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-11-28T17:19:42,763][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-28T17:19:42,885][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-11-28T17:19:43,812][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-28T17:19:43,795][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 34, column 6 (byte 775) after ", :backtrace=>["C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:42:in `compile_ast'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:50:in `compile_imperative'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:54:in `compile_graph'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:107:in `compile_lir'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "C:/ELK/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}

Hi @Diego_Gomez,

The error is right there in the middle of that pile of output.

Your filter block is closed too early, just before the date filter, so at line 34 Logstash finds date where only input, filter, or output is allowed. The date filter has to go inside filter; the config should look like this:

input {
  file {
      path => "\C:\ELK\logstash-6.0.0\data\lognode.txt"
      start_position => "beginning"
      type => "log"
      sincedb_path => "NUL"
      ignore_older => "0"
  }
}

filter {

  kv {
    field_split => "#"
    value_split => ":"
  }

  mutate {
    convert => ["MODSEN","integer"]
    convert => ["I","integer"]
    convert => ["BAT","integer"]
    convert => ["TCA","float"]
    convert => ["HUMA","float"]
    convert => ["PA","float"]
    convert => ["ANE","float"]
    convert => ["PLUV","float"]
    convert => ["DIR","float"]
    convert => ["US","float"]
    convert => ["GPS_LAT","float"]
    convert => ["GPS_LNG","float"]
  }

  date {
    match => ["TS", "EEE, yy/MM/dd, HH:mm:ss"]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "log"
    hosts => ["10.10.0.113:9200"]
  }
}

Regards!

Hi Gabriel, I no longer get that error, but I cannot get the data to arrive in Kibana. Here is the output Logstash gives me, in case you can spot the reason. Thanks for the help.

C:\ELK\logstash-6.0.0\bin>logstash -f C:\ELK\logstash-6.0.0\data\Tcp.bat
Sending Logstash's logs to C:/ELK/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-11-29T11:02:42,927][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/ELK/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-11-29T11:02:42,974][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/ELK/logstash-6.0.0/modules/netflow/configuration"}
[2017-11-29T11:02:43,162][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-11-29T11:02:44,177][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-29T11:02:53,417][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://10.10.0.113:9200/]}}
[2017-11-29T11:02:53,417][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://10.10.0.113:9200/, :path=>"/"}
[2017-11-29T11:02:53,651][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://10.10.0.113:9200/"}
[2017-11-29T11:02:53,776][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-11-29T11:02:53,792][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-11-29T11:02:53,854][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//10.10.0.113:9200"]}
[2017-11-29T11:02:53,870][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x262dec83@C:/ELK/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-11-29T11:02:54,667][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2017-11-29T11:02:54,776][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2017-11-29T11:02:54,995][WARN ][logstash.inputs.file     ] failed to open \C:\ELK\logstash-6.0.0\data\lognode.txt: Illegal char <:> at index 2: \C:\ELK\logstash-6.0.0\data\lognode.txt
[2017-11-29T11:07:55,262][WARN ][logstash.inputs.file     ] failed to open \C:\ELK\logstash-6.0.0\data\lognode.txt: Illegal char <:> at index 2: \C:\ELK\logstash-6.0.0\data\lognode.txt

Hi Diego,

The file path is incorrect. That is why you get the following error:

[2017-11-29T11:02:54,995][WARN ][logstash.inputs.file     ] failed to open \C:\ELK\logstash-6.0.0\data\lognode.txt: Illegal char <:> at index 2: \C:\ELK\logstash-6.0.0\data\lognode.txt
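
The leading backslash in front of C: is the culprit. A minimal sketch of the corrected input, assuming the file really lives under C:\ELK\logstash-6.0.0\data (forward slashes are the safest choice for Windows paths in Logstash):

input {
  file {
    path => "C:/ELK/logstash-6.0.0/data/lognode.txt"   # no leading backslash
    start_position => "beginning"
    type => "log"
    sincedb_path => "NUL"
    ignore_older => "0"
  }
}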

Regards!

--Gabriel

Gabriel, I now get correct output from Logstash, but not in Kibana: the fields show up but with no data. Here is what I see.

Logstash

"HUMA" => 59.0,
    "MODSEN" => 0,
         "I" => 216,
   "message" => "MODSEN:000#I:216F6F057C105410#TS:Tue, 17/11/28, 16:18:0#BAT:91#TCA:19.99#HUMA:59#PA:1020#ANE:6.7#PLUV:0.00#DIR:100#US:9.99#GPS_LAT:-34.638112#GPS_LNG:-58.397158\r",
      "type" => "log",
       "DIR" => 100.0,
   "GPS_LNG" => -58.397158,
      "path" => "C:\\ELK\\logstash-6.0.0\\data\\lognode.txt",
        "PA" => 1020.0,
"@timestamp" => 2017-11-28T19:18:00.000Z,
       "TCA" => 19.99,
       "BAT" => 91,
   "GPS_LAT" => -34.638112,
  "@version" => "1",
      "host" => "DESKTOP-OPPINET",
       "ANE" => 6.7,
      "PLUV" => 0.0,
  "location" => {
    "GPS_LNG" => "GPS_LNG",
    "GPS_LAT" => "GPS_LAT"
},
        "US" => 9.99,
        "TS" => "Tue, 17/11/28, 16:18:0"

}

In Kibana's Discover I see nothing, not a single document. What follows is what I see under Management:

name                 type      format
@timestamp           date
@version             string
@version.keyword     string
ANE                  number
BAT                  number
DIR                  number
GPS_LAT              number    Number
GPS_LNG              number
HUMA                 number
I                    number
MODSEN               number
PA                   number
PLUV                 number
TCA                  number
TS                   string
TS.keyword           string
US                   number
_id                  string
_index               string
_score               number
_source              _source
_type                string
created_at           string
created_at.keyword   string
display_text_range

This is my Logstash config:

input {
  file {
    path => "C:\ELK\logstash-6.0.0\data\lognode.txt"
    start_position => "beginning"
    type => "log"
    sincedb_path => "NUL"
    ignore_older => "0"
  }
}

filter {

  kv {
    field_split => "#"
    value_split => ":"
  }

  mutate {
    convert => ["MODSEN","integer"]
    convert => ["I","integer"]
    convert => ["BAT","integer"]
    convert => ["TCA","float"]
    convert => ["HUMA","float"]
    convert => ["PA","float"]
    convert => ["ANE","float"]
    convert => ["PLUV","float"]
    convert => ["DIR","float"]
    convert => ["US","float"]
    convert => ["GPS_LAT","float"]
    convert => ["GPS_LNG","float"]
  }

  mutate {
    add_field => {
      "[location][GPS_LAT]" => "GPS_LAT"
      "[location][GPS_LNG]" => "GPS_LNG"
    }
  }

  date {
    match => ["TS", "EEE, yy/MM/dd, HH:mm:ss"]
    target => "@timestamp"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "log"
    hosts => ["10.10.0.113:9200"]
  }
}

Hi Diego!

You have to create the index pattern, matching log. Otherwise Kibana won't know what to show you.
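
If Discover still shows nothing after that, it is worth confirming from Dev Tools that the index exists and actually has documents (assuming the output index is really called log):

GET _cat/indices/log?v

The docs.count column tells you whether anything has been indexed at all.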

Regards!
--Gabriel

Hi Gabriel, yes, I did create it, and that is what shows me the data I sent you when I created the index pattern, but in Discover I see no data or fields at all. With a twitter index, for example, I do see data and fields in Discover, but here I see nothing.

I recommend using Dev Tools to check whether the index is actually being filled:

GET log/_search

That way you can see what is going on. If the data is in Elasticsearch, it has to be available in Kibana. Also check the time picker in Discover: your events carry @timestamp values from 2017-11-28, so a short range like the default "Last 15 minutes" will hide them.
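
One more thing stands out in your rubydebug output: location contains the literal strings "GPS_LAT" and "GPS_LNG". add_field copies the value verbatim; to copy another field's value you need the %{} sprintf syntax. A sketch of what that second mutate was probably meant to do (your own field names, only the %{} references added):

mutate {
  add_field => {
    "[location][GPS_LAT]" => "%{GPS_LAT}"   # %{...} inserts the value of the GPS_LAT field
    "[location][GPS_LNG]" => "%{GPS_LNG}"
  }
}

Note that add_field always produces strings, so convert the copies afterwards if you need them as numbers.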

Regards!
