Replacing the default @timestamp with a "ts" field from the JSON

Hi,

Each item in my json file has an attribute called "ts".
{"name" : "David", "ts" : 2018-2-15 13:20:00 +0000", .....}

And if I directly import this json into ES, the "@timestamp" will be the exact time when I import the data, not the time when this specific event happened ("ts" in the json). So I am wondering if there is some way to replace the default "@timestamp" in ES with the date attribute that is already in the json file ("ts")?

Thanks.

Yes, use a date filter.

Could you explain a little bit more about how to do that?

Is that the date filter inside of the json filter? And how does that work?

You posted this in the logstash forum. Are you using logstash to push the json into elasticsearch?

Yes, I am using logstash,

and this is my current config:

input {
	file {
		path => "/Users/apple/Desktop/Logstash_trial_3/event.log.20180214"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => json

	}
}

filter {
	json{
		source => "message"
		#target =>"doc"
		remove_field => ["message"]
	}
	mutate{
		rename => ["_id" , "ID"]
	}

}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "realrollingdata"
		document_type => "eventdata"
	}
	stdout {}
	
}

Thanks for posting that. I would suggest you get rid of the elasticsearch output until you get the parsing working, and modify the stdout output to use rubydebug

output { stdout { codec => rubydebug } }

You should either use a json codec on the input or a json filter to do the parsing. You cannot do both. If your input is valid json (i.e. the timestamp starts with a quote, missing in your initial post) then I would use a codec. In that case your events will look like this

    "@timestamp" => 2018-02-15T21:39:39.855Z,
            "ts" => "2018-2-15 13:20:00 +0000",
          "name" => "David",

So remove the json filter, because it is failing every time (there is no message field when you use a json codec on the input). Then add a date filter. Something like

date { match => [ "ts", "yyyy-M-d HH:mm:ss Z" ] }

Then your events will look more like

            "ts" => "2018-2-15 13:21:00 +0000",
    "@timestamp" => 2018-02-15T13:21:00.000Z,
          "name" => "Goliath",

That "yyyy-M-d HH:mm:ss Z" pattern may be wrong. For example, you may need dd rather than d. Read the doc and look at your data to find out.

This is so helpful, I am gonna try it. Thank you so much.

This is my updated config,

input {
	file {
		path => "/Users/apple/Desktop/Logstash_trial/Logstash_trial_3/event.log.20180214"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => json

	}
}

filter {

	mutate{
		rename => ["_id" , "ID"]
	}
	date{
		match => ["ts", "yyyy-MM-dd HH:mm:ss.SSS ZZ"]
	}

}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "realrollingdata"
		document_type => "eventdata"
	}
	stdout { codec => rubydebug } }
	
}

But when I ran logstash to import the data, it threw an error that I had not seen before.

And if I comment out "stdout{ codec => rubydebug } }", logstash gets stuck at "Pipelines running".

That has an imbalance of }, which is what the original error was complaining about.
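
For reference, a balanced output section would look something like this (hosts and index copied from your config):

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "realrollingdata"
		document_type => "eventdata"
	}
	stdout { codec => rubydebug }
}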

If you comment it out then logstash will sit there and tail all the files that match your path, waiting for new log entries to be written to them, so that it can process them and write them to your elasticsearch output. Uncomment the stdout and remove the extra } and you will see it doing this. You can even append an extra line to the file whilst it is sitting there with pipelines running and see it wake up and process it.

Also the example date you gave ("2018-2-15 13:21:00 +0000") does not have milliseconds in it. So I would expect a _dateparsefailure if you keep the .SSS in the pattern.

Also, please do not post images of text, they are a pain to read. Just post the text.

Thank you for explaining it patiently.

Here is my latest config

input {
	file {
		path => ""C:\Users\User\Desktop\rollingdata\event.log.20180214""
		start_position => "beginning"
		sincedb_path => "/dev/null"
		codec => json

	}
}

filter {

	mutate{
		rename => ["_id" , "ID"]
	}
	date{
		match => ["ts", "yyyy-MM-dd HH:mm:ss ZZ"]
	}

}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "realrollingdata"
		document_type => "eventdata"
	}
	stdout { codec => rubydebug } 
	
}

And it threw an error like this.

[2018-02-17T02:29:46,489][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 3, column 13 (byte 31) after input {\r\n\tfile {\r\n\t\tpath => \"\"", :backtrace=>["/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:42:in compile_imperative'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:50:in compile_graph'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:12:in block in compile_sources'", "org/jruby/RubyArray.java:2486:in map'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/compiler.rb:11:in compile_sources'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/pipeline.rb:51:in initialize'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/pipeline.rb:169:in initialize'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/pipeline_action/create.rb:40:in execute'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:315:in block in converge_state'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:141:in with_pipelines'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:312:in block in converge_state'", "org/jruby/RubyArray.java:1734:in each'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:299:in converge_state'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:166:in block in converge_state_and_update'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:141:in with_pipelines'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:164:in converge_state_and_update'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/agent.rb:90:in execute'", "/Users/HanBit/Desktop/logstash-6.2.1/logstash-core/lib/logstash/runner.rb:348:in block in execute'", "/Users/HanBit/Desktop/logstash-6.2.1/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in block in initialize'"]}

Meanwhile, if I remove the content of stdout, it works fine.

Btw, do I need to terminate logstash myself once it completes importing all the files? For now, logstash is still running and waiting for new files. I thought it would terminate by itself once it had finished importing.

Thanks!!

That seems reasonable. You have two sets of double quotes around the path. You only need one.
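
That is, the path line should be something like

path => "C:\Users\User\Desktop\rollingdata\event.log.20180214"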

I have fixed the quote issues.

And I just found some problems during the process of ingesting the data files.

Normally, it shows the status like this.

2018-02-14T14:51:44.021Z Hans-MacBook-Pro.local %{message}

But I got some abnormal output during ingesting.

2018-02-14T14:51:44.021Z Hans-MacBook-Pro.local %{message}
2018-02-14T14:54:13.295Z Hans-MacBook-Pro.local %{message}
2018-02-17T07:59:49.059Z Hans-MacBook-Pro.local "result.statusMessage":"Success",
2018-02-13T22:24:40.685Z Hans-MacBook-Pro.local %{message}
2018-02-14T14:53:37.384Z Hans-MacBook-Pro.local %{message}
2018-02-17T07:59:49.019Z Hans-MacBook-Pro.local {"jurHash":"112256955",
2018-02-17T07:59:49.060Z Hans-MacBook-Pro.local "durationMillis":3,"requestId":"1518411024","extUserId":"mocktest3","result.status":"0000","wsdlVersion":"1_8","operation":"createUser","_id":"car4be-w2-tc.1518411025323.3904474"}
2018-02-13T22:24:50.373Z Hans-MacBook-Pro.local %{message}
2018-02-14T01:05:26.438Z Hans-MacBook-Pro.local %{message}
2018-02-14T01:05:27.258Z Hans-MacBook-Pro.local %{message}
2018-02-14T14:42:56.489Z Hans-MacBook-Pro.local %{message}
2018-02-14T14:45:48.741Z Hans-MacBook-Pro.local %{message}
2018-02-17T07:59:49.055Z Hans-MacBook-Pro.local "txnId":"vipusDEE529597C6FD8AF",
2018-02-12T04:50:33.424Z Hans-MacBook-Pro.local %{message}
2018-02-13T22:24:50.453Z Hans-MacBook-Pro.local %{message}

And before ingesting, the error info is

[2018-02-17T02:59:49,008][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected end-of-input within/between Object entries
 at [Source: (String)"{"jurHash":"112256955","; line: 1, column: 47]>, :data=>"{\"jurHash\":\"112256955\","}
[2018-02-17T02:59:49,054][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"\"txnId\":\"vipusDEE529597C6FD8AF\","}
[2018-02-17T02:59:49,057][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"\"ts\":\"2018-02-12 04:50:25.323 +0000\","}
[2018-02-17T02:59:49,058][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: incompatible json object type=java.lang.String , only hash map or arrays are supported>, :data=>"\"result.statusMessage\":\"Success\","}

With a json codec there is no event field called message. It is going to be easier to tell what is happening if you remove the codec on the input, and go back to using

json { source => "message" }

(Do not delete the message field.) The output you have is stdout with a plain codec. Does rubydebug work for you now?

I tried removing the codec on the input and using json { source => "message" },
but it caused the same problems as before.

Yes, rubydebug works for me now.

input {
	file {
		path => "C:\Users\User\Desktop\rollingdata\event.log.20180214"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		#codec => json

	}
}

filter {
	json{
		source => "message"
	}

	mutate{
		rename => ["_id" , "ID"]
	}
	date{
		match => ["ts", "yyyy-MM-dd HH:mm:ss ZZ"]
	}

}

output {
	elasticsearch {
		hosts => "localhost:9200"
		index => "realrollingdata"
		document_type => "eventdata"
	}
	stdout { codec => rubydebug } 
	
}

It seems the whole message for one item is accidentally split by logstash.

The logs suggest that is a complete line, but it is not valid JSON, so both the codec and the filter will fail to parse it.
