I'm trying to get xewriter to deliver MSSQL log files to my Elasticsearch.

In my Logstash pipeline I have this output to handle different log files with different index names:

output {
  elasticsearch {
    hosts => "https://elasticsearch:9200"
    index => "%{[fields][logtype]}-%{[@metadata][version]}-%{+YYYY.MM}"
    document_type => "%{[@metadata][type]}"
    cacert => "/usr/share/logstash/config/certs/ca/ca.crt"
    user => "user"            # logstash_internal
    password => "password"
  }
}

This works for all sorts of Beats inputs, etc.
xewriter uses a TCP input, and I've tried getting that to name the indices properly as well, but so far without luck.
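
For reference, the input is a plain TCP input along these lines (a sketch; the port is illustrative):

input {
  tcp {
    port => 5045            # illustrative; whatever port xewriter sends to
    type => "tcp-input"     # tagged so the events can be matched in filters
  }
}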

The index name is not filled in; instead the index is created with the literal, unresolved placeholders in its name.

How do I get this to name them correctly?
As this is the only TCP input I have, I even tried using mutate to set the fields:

filter {
  if "tcp-input" in [type] {
    mutate {
      add_field => {
        "fields.logtype" => "sqllogs"
      }
    }
  }
}

And that created the 'fields.logtype' field, but it still didn't give the index a proper name.
Do I need to manipulate @metadata and version as well?

It seems your fields are empty. Have you parsed the message as JSON?
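
If not, a minimal sketch of such a filter (assuming the raw JSON payload arrives in the [message] field):

filter {
  json {
    source => "message"   # parse the JSON payload so references like %{[fields][logtype]} can resolve
  }
}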

You have a wrongly written nested field. It should be:

    "[fields][logtype]" => "sqllogs"

Which version of ELK are you using? The default value of document_type depends on the Elasticsearch version:

  • for elasticsearch clusters 8.x: no value will be used;
  • for elasticsearch clusters 7.x: the value of _doc will be used;
  • for elasticsearch clusters 6.x: the value of doc will be used;

Hi @Rios
Thank you for helping out.
No, I hadn't set up the JSON filter; I remembered it last night, and after adding it, bam, everything works :slight_smile:
I'm using 7.17.6 as far as I remember.
So you were absolutely right!

So have you solved it or not? If you still have problems, please provide the full message as text.

It's solved; it was the JSON filter that fixed it.
