Parsing problem when streaming a log file

So did you actually delete the data stream and all before you restarted...???

Stack Management -> Data Streams, etc...

I am going through the example right now

I don't think I did,

All I did was stop the service and start it up again

While I am going through the rest of this... try this... note the corrected syntax on the dataset


It changed!

The entire log is still in the message field though :confused:

Yeah because it is still writing to the old index...

I got it working... you were close...

Steps:

Stop the Agent

Go to Kibana -> Stack Management -> Index Management -> Data Streams and delete the logs-example-default and logs-generic-default data streams
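If you'd rather not click through the UI, the data stream delete API does the same thing from Dev Tools (assuming those are the exact data stream names on your cluster):

DELETE _data_stream/logs-example-default

DELETE _data_stream/logs-generic-default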

Then PUT these (note the names: above you were using generic, etc.)

PUT _index_template/logs-example-default-template
{
  "index_patterns": [ "logs-example-*" ],
  "data_stream": { },
  "priority": 500,
  "template": {
    "settings": {
      "index.default_pipeline":"logs-example-default"
    }
  },
  "composed_of": [
    "logs-mappings",
    "logs-settings",
    "logs@custom",
    "ecs@dynamic_templates"
  ],
  "ignore_missing_component_templates": ["logs@custom"]
}
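If you want to double-check the template before restarting anything, you can fetch it back or ask Elasticsearch which template would win for that data stream name (both requests just reuse the names above):

GET _index_template/logs-example-default-template

POST _index_template/_simulate_index/logs-example-default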

PUT _ingest/pipeline/logs-example-default
{
  "description": "Extracts the timestamp and log level",
  "processors": [
    {
      "json": {
        "field": "message",
        "target_field": "message_details"
      }
    }
  ]
}
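You can also sanity-check the pipeline with the simulate API before the agent writes anything. The sample line below is just made up to show the shape; your ndjson lines will have their own fields:

POST _ingest/pipeline/logs-example-default/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "{\"log.level\":\"info\",\"@timestamp\":\"2023-05-01T12:00:00.000Z\",\"message\":\"agent started\"}"
      }
    }
  ]
}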

Agent File

outputs:
  default:
    type: elasticsearch
    hosts: ['<host:port>']
    api_key: '<apikey>'
inputs:
  - id: your-log-id
    type: filestream
    streams:
      - id: your-log-stream-id
        data_stream:
          dataset: example
        paths:
          # - /var/log/*.log
          - /opt/Elastic/Agent/elastic-agent-*.ndjson

Start the Agent
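Once it's running, you can confirm documents are landing in the new data stream with the parsed fields (field names here assume the pipeline above):

GET logs-example-default/_search
{
  "size": 1,
  "sort": [ { "@timestamp": "desc" } ],
  "_source": [ "message", "message_details" ]
}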

I can't thank you enough, Stephen

I really appreciate the time you took and the patience you had with me.

Thank you again!


No it was fine... 1 little glitch in the docs!
Thanks for hanging in!

