Creating subfields in Logstash

Dear Experts,

I am trying to parse some logs with Logstash 2.3.3 into Elasticsearch. The logs share the same format up to the log message, say: time_stamp, node, field1, field2, log_type, log_message.
Until now I was parsing all logs with a single pattern that treats the log message as one text field, and everything was fine.
The next step is to drill down into the log message and define new fields within the message itself. However, I would like to keep the whole log message as one field and, at the same time, extract the information in it into additional fields (which exist only for some log messages). I did that by defining three patterns in the grok filter: two for specific log types and one generic fallback, which is the old pattern that treats the whole message as a string, i.e. no subfields. My grok filter looks like this:
grok {
  patterns_dir => ["patterns_path"]
  match => { "message" => ["%{TIME:time} %{SPECIALTYPE1:node} %{INT:field1} %{SPECIALTYPE2:field2} %{SPECIALTYPE3:log_type} %{MESSAGE1:log_message}",
                           "%{TIME:time} %{SPECIALTYPE1:node} %{INT:field1} %{SPECIALTYPE2:field2} %{SPECIALTYPE3:log_type} %{MESSAGE2:log_message}",
                           "%{TIME:time} %{SPECIALTYPE1:node} %{INT:field1} %{SPECIALTYPE2:field2} %{SPECIALTYPE3:log_type} %{GENERICMESSAGE:log_message}"] }
}
MESSAGE1 and MESSAGE2 are defined in the patterns file and contain the subfields. GENERICMESSAGE is also defined there as anything except a newline, with no subfields.
The problem with this approach is that when I check the subfields in Kibana I can't visualize them, because they are not indexed.
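To illustrate, the patterns file looks roughly like this (the subfield names and message contents below are invented for illustration; only the MESSAGE1/MESSAGE2/GENERICMESSAGE names come from my actual config). The nested %{...:name} captures inside MESSAGE1 and MESSAGE2 produce the subfields, while %{MESSAGE1:log_message} still captures the entire matched text:

```
# Hypothetical patterns file entries — subfield names are made-up examples.
# IP, INT and USERNAME are standard grok base patterns.
MESSAGE1 connection from %{IP:src_ip} dropped after %{INT:duration} seconds
MESSAGE2 user %{USERNAME:user} failed login from %{IP:src_ip}
# Everything up to the end of the line, no subfields:
GENERICMESSAGE [^\r\n]*
```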
Afterwards I tried to define mappings for these fields before uploading the logs into Elasticsearch, by sending a PUT request that sets mappings → default → properties with the "index" option set to "not_analyzed" or "analyzed" depending on the field type (so that the fields would be indexed). However, this did not change anything and the fields remained not indexed.
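Roughly, the request I sent looked like this (index and field names are placeholders, not my real ones):

```
curl -XPUT 'localhost:9200/my-index' -d '
{
  "mappings": {
    "_default_": {
      "properties": {
        "subfield1": { "type": "string", "index": "not_analyzed" }
      }
    }
  }
}'
```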
My questions are:

  • Is there any way in ES to change "not indexed" fields to "indexed"?
  • If not, is there any other way to keep my log_message field while defining sub-fields in it at the same time?
  • I am aware of the mutate filter, but the problem is that I am not defining fields for all of the message content (e.g. if the message contains 100 characters, then 20 are saved in subfield1, 20 in subfield2, and 60 are not saved in any subfield; yet all 100 must still exist in the log_message field).

Is there any way in ES to change "not indexed" fields to "indexed"?

By default all fields are indexed so it seems you have default settings that are getting in the way. What are the mappings for the index in question?
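You can check them with something like this (replace the index name with yours):

```
curl -XGET 'localhost:9200/logstash-*/_mapping?pretty'
```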

Hello Magnus,

Thanks for the quick response!

I did not have any extra settings, but I figured out the problem: I needed to delete the index pattern in Kibana before uploading the data again. Deleting the data from ES and re-uploading it was not enough, since Kibana was still using its old field list, which did not contain the new fields (the subfields).

It sounds like you just didn't refresh the field list in Kibana. Recreating the index isn't necessary for ES to pick up new fields.


Yes. Beginner's mistake!