Display a text file in Kibana with Filebeat and Logstash

Hi guys,

I run the ELK Stack (one machine) with Filebeat (a second machine) as the log shipper.

I've got a .txt file with 4228 rows, where each row is one log in the following form:

Jul 4 13:56:17 vMMR mmr-core[29839]: GtsAwegAOMTbez_1562241377271986.mt npdbProfiling-end: pid[29839] table[npdbcz] operation[SELECT] duration[6.27 ms] error sql[SELECT carrier,validity,now()>validity as now_valid FROM npdbcz WHERE range IN ('606339842','60633984','6063398','606339','60633','6063') ORDER BY now_valid DESC,validity DESC]

How can I send the .txt file with these logs to Elasticsearch and see them in Kibana? Which configuration files should I edit, and how? Is it possible to send each row in the file as one log message in Kibana?

Any help is really appreciated!!

Thank you :blush:

I'm trying to use Logstash to load the .txt file into Elasticsearch. Here is my config file, loadfile.conf:

input {
  file {
    path => "/home/vladuser/logs.txt"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "ssh_auth-2019.07"
  }
}

I tried changing hosts to localhost:9200, but it doesn't work either.

Unfortunately, when I try to run sudo bin/logstash -f /etc/logstash/conf.d/loadfile.conf, I end up with this message:

[INFO ] 2019-07-10 12:49:43.061 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9601}

Try adding

sincedb_path => "/dev/null"

to your file input and restarting Logstash.
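
For context: the file input records how far it has read each file in a "sincedb" file, so by default it will not re-read a file it has already processed. Pointing sincedb_path at /dev/null discards that bookkeeping, so Logstash reads the file from the beginning on every run. Applied to the input from your config above, it looks like this:

input {
  file {
    path => "/home/vladuser/logs.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}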

Thank you for your help!

Now it gets further, but for each row in the .txt file it shows this WARN:

[WARN ] 2019-07-10 15:55:52.020 [[main]>worker0] elasticsearch - Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"ssh_auth-2019.07", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x64e46377>], :response=>{"index"=>{"_index"=>"ssh_auth-2019.07", "_type"=>"_doc", "_id"=>"aYKZ3GsBKh5qbaH2ItLV", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

The index I want to send to, "ssh_auth-2019.07", is already created and connected with Filebeat. Could the output part of the Logstash config file create a new index for the .txt file?

Does this help?

I added this filter

filter {
  mutate {
rename {
  "[host]" => "[host][name]"
}
  }
}

And when I run sudo bin/logstash -f /etc/logstash/conf.d/loadfile.conf, it returns another ERROR:

[ERROR] 2019-07-10 17:01:17.788 [Converge PipelineAction::Create] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 14, column 12 (byte 156) after filter {\n mutate {\n rename ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:41:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:49:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `block in compile_sources'", "org/jruby/RubyArray.java:2577:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:10:in `compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:24:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}

I have no idea what it means, especially why it would expect # after filter ???

That should be

rename => {

The error message is telling you that it expects to find either a comment or => after

filter {
  mutate {
     rename
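
With the => added, the whole filter from your post becomes:

filter {
  mutate {
    rename => { "[host]" => "[host][name]" }
  }
}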

OK, I don't know if it is the right solution, but I somehow created a new index (my_index), which I found in Index Patterns in Kibana. Then, without any filters, I changed the output index to my_index.
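
The output section of loadfile.conf is the same as before, just pointing at the new index:

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "my_index"
  }
}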

After I ran sudo bin/logstash -f /etc/logstash/conf.d/loadfile.conf, I ended up with the same output as at the beginning:

[INFO ] 2019-07-10 12:49:43.061 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9601}

Although this output showed up, I can see the rows from my file as logs in the Kibana interface, on the Discover page.

@Badger Could you please help :pray:

I need to parse my data from the .txt file so I can get the duration of the translation, which I can query later. As you can see, my file contains rows like this:

> Jul 4 13:56:17 vMMR mmr-core[29839]: GtsAwegAOMTbez_1562241377271986.mt npdbProfiling-end: pid[29839] table[npdbcz] operation[SELECT] duration[6.27 ms] error [] sql[SELECT carrier,validity,now()>validity as now_valid FROM npdbcz WHERE `range` IN ('606339842','60633984','6063398','606339','60633','6063') ORDER BY now_valid DESC,validity DESC]

I'd like to separate it into fields: pid, table, operation, duration ... the most important for me is duration.

Do you have any idea how to manage it? I'm a beginner and don't understand grok very well, but if you have any hints I'd be really thankful!!

I've got this in my grok debugger so far:

%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:hostname} %{DATA:type} %{SPACE} %{DATA:file_id} %{DATA:file_name} %{DATA:syslog_pid} .*table\s*\[%{WORD:table}\] .*operation\s*\[%{WORD:operation}\] %{GREEDYDATA:rest}



{
  "rest": "duration[6.27 ms] error [] sql[SELECT carrier,validity,now()>validity as now_valid FROM npdbcz WHERE `range` IN ('606339842','60633984','6063398','606339','60633','6063') ORDER BY now_valid DESC,validity",
  "syslog_pid": "pid[29839]",
  "file_name": "npdbProfiling-end:",
  "type": "mmr-core[29839]:",
  "hostname": "vMMR",
  "syslog_timestamp": "Jul  4 13:56:17",
  "file_id": "GtsAwegAOMTbez_1562241377271986.mt",
  "operation": "SELECT",
  "table": "npdbcz"
}

I have a problem getting duration parsed :sweat_smile: When I try to skip over the word duration as I did before with operation or table, grok shows me an error that it doesn't match :frowning:

[ and ] have to be escaped because they are used to delimit character groups in a regexp. Try

grok { match => { "rest" => "^duration\[%{NUMBER:d:float} ms\]" } }
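
If you would rather capture it in a single pass, the same escaping can be added to the pattern you already have; replacing the %{GREEDYDATA:rest} tail with something like this (an untested sketch) should give you duration as a float:

%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:hostname} %{DATA:type} %{SPACE} %{DATA:file_id} %{DATA:file_name} %{DATA:syslog_pid} .*table\s*\[%{WORD:table}\] .*operation\s*\[%{WORD:operation}\] .*duration\s*\[%{NUMBER:duration:float} ms\] %{GREEDYDATA:rest}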

Thank you very much! It works nicely :slight_smile: