How can I add a field to the original JSON message that was sent to Logstash?

I have a log message that is JSON.
I would like to add a field to that original message within the JSON structure.
I've tried multiple approaches and none of them get the job done.
I'm also having trouble with the Elastic documentation.

Let's say my original message input was {"hello": "world"}.
I want my output to be {"hello": "world", "logstash_timestamp": "2018-05-02T13:20:51.179Z"}.
I don't care about the metadata as it is intentionally lost in SNS.

Below is a single message entering my logstash configuration (the names have been changed to protect the innocent)...


{"log_timestamp":"2016-11-04T07:30:31.944Z","my_message_bytes":"040848036966B1","log_type":"my_json_log","source_topicArn":"arn:aws:sns:us-east-1:000000000000:my-feeder-devel","my_site":"my_devel"}


My attempt at the logstash-sns.conf file is shown below...


input {
  file {
    path => "/usr/share/logstash/logs/*.json"
    start_position => "beginning"
    codec => plain
  }
}

filter { 
  mutate {
    add_field => { "logstash_timestamp" => "%{@timestamp}" }
  }
}

output {
  stdout { 
    # codec => rubydebug
    codec => line {
      format => "%{message} %{logstash_timestamp}"
    }
  }

  sns {
    region => "us-east-1"
    arn => "arn:aws:sns:us-east-1:000000000000:my_messages"
    access_key_id => "blahblah"
    secret_access_key => "blahblah"
    codec => line {
      format => "%{message} %{logstash_timestamp}"
    }
  }
}

The output of this is below. (Notice how the timestamp was added outside the JSON structure; I would like it on the inside.)


"Message" : "{\"log_timestamp\":\"2016-11-04T07:30:31.944Z\",\"my_message_bytes\":\"040848036966B1\",\"log_type\":\"my_json_log\",\"source_topicArn\":\"arn:aws:sns:us-east-1:000000000000:my-feeder-devel\",\"my_site\":\"my_devel\"}\r 2018-05-02T14:37:42.772Z\n"

I stripped down the metadata from the output and only am showing the "Message" field which is all I am concerned with.

Change codec => plain to codec => json_lines so Logstash deserializes the JSON in each line. You'll probably want to change the codec for the sns output in the same way.

Unfortunately that did not work. I edited the post to add the input message (above the configuration file).

Please always state what happens rather than "it doesn't work" or similar.

So the JSON objects you want to parse and adorn with additional fields are broken across multiple lines? If yes, the file input is line-oriented, so you need to use a multiline codec to join the lines of the file into a single event, then use a json filter to deserialize that field in the events. Examples of this have been posted in the past.
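For example, something along these lines (a sketch only; the pattern assumes each JSON object begins with a line starting with {, so adjust it to match your actual data):

input {
  file {
    path => "/usr/share/logstash/logs/*.json"
    start_position => "beginning"
    # Join lines that do NOT start with "{" onto the previous line
    codec => multiline {
      pattern => "^{"
      negate => true
      what => "previous"
    }
  }
}

filter {
  # Deserialize the joined JSON string into event fields
  json {
    source => "message"
  }
}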

You got exactly what you asked for. You told the codec that you wanted the output to be the input message with the timestamp appended to it.

Adding spaces and CRLF actually makes it much harder to understand what you are doing. Please do not do it.

The "Message" field that I am trying to achieve is the following:


"Message" : "{\"log_timestamp\":\"2016-11-04T07:30:31.944Z\",\"my_message_bytes\":\"040848036966B1\",\"log_type\":\"my_json_log\",\"source_topicArn\":\"arn:aws:sns:us-east-1:000000000000:my-feeder-devel\",\"my_site\":\"my_devel\",\"logstash_output\":\"2018-05-02T14:37:42.772Z\"}\r\n"

@Badger I have updated the input and output in my original post to reflect the actual input and actual output, respectively.
@magnusbaeck sorry for just saying "that didn't work". What I meant was that I got the identical output that I had originally gotten.

Again, don't use the line codec. If you want to add fields to the input message you need to deserialize the input JSON into Logstash fields, otherwise the input is just a string stored in the message field. The fields should then be serialized back into a suitable format, e.g. JSON, using a json or json_lines codec.
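Concretely, that would look something like this (a sketch; the path is taken from your post and the credentials are placeholders):

input {
  file {
    path => "/usr/share/logstash/logs/*.json"
    start_position => "beginning"
    # Deserialize each line's JSON into Logstash fields
    codec => json
  }
}

filter {
  mutate {
    add_field => { "logstash_timestamp" => "%{@timestamp}" }
  }
}

output {
  stdout {
    # Serialize all fields back into one JSON object per line
    codec => json_lines
  }
}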

I find it very hard to believe that you're getting the exact same results when using a json_lines codec in both the input and output plugin compared to when you were using a plain and a line codec.

So, please switch both input and outputs to one of the mentioned codecs and report the full output of the stdout plugin.

If you really want what you say you want (which strikes me as unlikely, but I have never used SNS, so maybe), then this would do it:

mutate { add_field => { 'Message' => "%{message}" } }
mutate { gsub => [ "Message", "}$", ', "logstash_timestamp": "%{@timestamp}" }' ] }

and remove the codec from the sns output.

@magnusbaeck the config file doesn't run when I change both the input and the output codecs to json_lines as you suggested. However, if I use codec => json on the input side and codec => json_lines on the output side, it does run, but the output is not what I am looking for. Please see below:


{"logstash_timestamp":"2018-05-02T19:09:57.178Z","path":"/usr/share/logstash/logs/log_logstash_singleline_input.json","@version":"1","log_type":"my_json_log","@timestamp":"2018-05-02T19:09:57.178Z","host":"d39db92db8c6","log_timestamp":"2016-11-04T07:30:31.944Z","my_site":"my_devel","source_topicArn":"arn:aws:sns:us-east-1:000000000000:my-feeder-devel","my_message_bytes":"040848036966B1"}

The problem with this is that I want the same order the message was in before I parsed it, and I do not want all the other fields that are now in there. I can't move the fields around because there are too many variables as to what the fields will be. I simply want to add a single <key, value> pair inside the JSON structure that was input to Logstash and then send that out.

the config file doesn't run when i change both the input and the output codecs to json_lines as you suggested.

Well, there's no way to debug that without details.

The problem with this is that I want the same order the message was in before I parsed it

What on earth for? Nobody should care about the order of fields in a JSON object. Logstash doesn't support maintaining the original order.

and I do not want all the other fields that are now in there.

You can use the prune filter to wildcard-delete fields you don't want, except @timestamp which you're stuck with.
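For example (the field names here are the usual additions from the file input; blacklist_names entries are treated as regular expressions, so anchor them if you need exact matches):

filter {
  prune {
    # Drop the bookkeeping fields Logstash added to the event
    blacklist_names => [ "^@version$", "^host$", "^path$" ]
  }
}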

I simply want to add a single <key, value> pair inside the json structure that was input to logstash and then send that out.

Well. You can use a mutate filter's gsub option to replace the closing } in the JSON string with , "foo": "bar"} but that's really ugly.


@Badger This is the output I get with your idea.


{"log_timestamp":"2016-11-04T07:30:31.944Z","my_message_bytes":"040848036966B1","log_type":"my_json_log","source_topicArn":"arn:aws:sns:us-east-1:000000000000:my-feeder-devel","my_site":"my_devel"} %{logstash_timestamp}

It still places the timestamp outside the json structure.

That's because you have a codec that appends that string to the field called message. You asked for a field called Message, which is a different field. Change the stdout codec to rubydebug and you will see what I mean.


@Badger I'm sorry for all this confusion. I am looking at a few outputs at different points in the message's life. I was looking at the output from SQS, which is after SNS, when I gave you that "Message" field. I am getting confused and confusing everyone else. I don't need an extra field; I just want the output of Logstash, which is "message", to have the logstash_timestamp field within it.

like so...

{"blah_key":"blah_value", ... , "logstash_timestamp":"2018-05-02T14:37:42.772Z"}

Should do that.


I tried both methods proposed in the comments. I prefer the approach of pulling apart the JSON, blacklisting the extra fields added by Logstash, and finally packing it back into JSON in the output.

Below is the working code I am using:


input {
  file {
    path => "/usr/share/logstash/logs/*.json"
    start_position => "beginning"
    codec => json
  }
}

filter { 
  prune {
    blacklist_names => [ "@version", "host", "path" ]
  }
}

output {
  stdout { 
    codec => json_lines
  }
  sns {
    region => "us-east-1"
    arn => "arn:aws:sns:us-east-1:000000000000:my_messages"
    access_key_id => "blahblah"
    secret_access_key => "blahblah"
    codec => json_lines
  }
}
