Moving my first logs to ECS

Hi friends.
After successfully submitting my custom events into ELK and visualizing them in Kibana, I would like to improve my event logging format.
After googling how to create a custom ECS index and index documents into it, I did not succeed.
First, I tried to manually create an example index:

PUT /ecs_sample
{
    "mappings": {
      "timestamp": {"type": "date"},
      "log.level": {"type": "keyword"},
      "message": {"type": "keyword"},
      "service": {
          "name": {"type": "keyword"}
      },
      "event": {
          "severity": {"type": "short"},
          "timezone": "Hora ARG",
          "created": {"type": "date"},
          "category": {"type": "keyword"}
      },
          "ecs": {
          "version": {"type": "keyword"}
      }
    }  
}

but Elasticsearch yields:

"root_cause" : [
  {
    "type" : "mapper_parsing_exception",
    "reason" : "Root mapping definition has unsupported parameters:  [ecs : {version={type=keyword}}] [service : {name={type=keyword}}] [log.level : {type=keyword}] [message : {type=keyword}] [event : {severity={type=short}, timezone=Hora ARG, created={type=date}, category={type=keyword}}] [timestamp : {type=date}]"
  }

So ... I continued reading and installed the ECS tooling from:
https://github.com/elastic/ecs/blob/master/USAGE.md#setup-and-install

but ... after installing it I cannot make it run:

[root@devel ecs]# python  scripts/generator.py
Traceback (most recent call last):
  File "scripts/generator.py", line 7, in <module>
    from generators import csv_generator
  File "/opt/ecs/scripts/generators/csv_generator.py", line 5, in <module>
    from generator import ecs_helpers
  File "/opt/ecs/scripts/generator.py", line 7, in <module>
    from generators import csv_generator
ImportError: cannot import name csv_generator

I have no idea what to do; I am not familiar with Python.
As a last attempt, I tried to copy the mapping from an existing index.
I enabled the Filebeat system module and I can see data, but when I try to analyze its mapping with:

GET filebeat-7.9.0/_mapping

I don't understand what is in the response ... it is a very large document.

So:
If you can help me accomplish this, that would be great.
Do I need to ingest my logs with Logstash or Filebeat, or can I continue using the API?
Can I copy the index mapping from an existing index / index template and then add my fields?
Is there an existing PHP / Perl library to log data in ECS format?
Is there some practical, not gigantic doc to learn this?

Any idea would be welcome,
Leandro.

The request is malformed... I will put the corrected version at the bottom, but overall:
you should probably read about mappings and index templates, which are how you apply a mapping to a pattern of indices.

But let's focus on these:

Do I need to ingest my logs with Logstash or Filebeat, or can I continue using the API?

If your app is writing logs to a file (especially ECS logs, see below), I would start with Filebeat; you will get a lot of the mapping etc. for free.

Can I copy the index mapping from an existing index / index template and then add my fields?

Yes, but you need to be careful and format it correctly, which you did not. If you learn a bit and name your fields correctly, the built-in dynamic templates can take care of most of it for you... just using the base Filebeat index.

Is there an existing PHP / Perl library to log data in ECS format?
Yes, right here.

Is there some practical, not gigantic doc to learn this?

Perhaps start with the many free webinars, like this one.

If it were me and I was learning....

  1. I would watch a couple of the Logging / Observability webinars and / or take our excellent free training, like this short quick start on logging.
  2. I would use the PHP ECS logger and write to log files.
  3. Use Filebeat to send the data to Elasticsearch.
  4. Learn and adjust from there.
  5. If you need more specifics, add an ingest pipeline to do additional parsing and put the fields into the existing ECS fields, or name them properly so the dynamic template takes care of it for you... or do what you are trying to do with a custom template, for which I gave you a sample below.
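As a sketch of step 5, an ingest pipeline can rename your custom fields onto their ECS equivalents at index time. The pipeline name and the source field here are just examples for illustration, not something from your logs:

PUT /_ingest/pipeline/my-app-logs
{
  "description": "Example: move a custom field onto its ECS equivalent",
  "processors": [
    {
      "rename": {
        "field": "severity",
        "target_field": "event.severity",
        "ignore_missing": true
      }
    },
    {
      "set": {
        "field": "ecs.version",
        "value": "1.6.0",
        "override": false
      }
    }
  ]
}

You would then index with ?pipeline=my-app-logs, or set index.default_pipeline on the index, so the processors run on every document.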

BTW, here is the correct mapping you were trying to create... I am not sure what you were trying to accomplish with the timezone field, so I took it out; otherwise I would make it a keyword type.

You can also create index templates via Kibana under Stack Management

You should really create this as an index template:

PUT /_index_template/ecs_sample
{
  "index_patterns": [
    "ecs-sample-*"
  ],
  "template": {
    "settings": {
      "number_of_shards": 1
    },
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "log": {
          "properties": {
            "level": {
              "type": "keyword"
            }
          }
        },
        "message": {
          "type": "keyword"
        },
        "service": {
          "properties": {
            "name": {
              "type": "keyword"
            }
          }
        },
        "event": {
          "properties": {
            "severity": {
              "type": "short"
            },
            "created": {
              "type": "date"
            },
            "category": {
              "type": "keyword"
            }
          }
        },
        "ecs": {
          "properties": {
            "version": {
              "type": "keyword"
            }
          }
        }
      }
    }
  }
}
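To check that the template is applied, you could index a sample document into an index matching the pattern (the index name and the values below are just examples):

POST /ecs-sample-000001/_doc
{
  "@timestamp": "2020-09-01T12:00:00Z",
  "log": { "level": "info" },
  "message": "user login ok",
  "service": { "name": "my-php-app" },
  "event": { "severity": 6, "created": "2020-09-01T12:00:00Z", "category": "authentication" },
  "ecs": { "version": "1.6.0" }
}

Then GET /ecs-sample-000001/_mapping should show the fields typed exactly as in the template, instead of whatever dynamic mapping would have guessed.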

Thanks!!! You provided a lot of info...
I need to study.

I don't understand this:
do you mean I should use Filebeat on my app side to export logs, or use Filebeat on the ELK side to receive logs?
Just this.
Thanks again.

App writes logs to a file using the PHP ECS logger.

Use Filebeat (on the app side, next to the log files) to read the logs and send them to Elasticsearch.
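A minimal filebeat.yml for that setup might look like this; the paths and hosts are assumptions you would adjust to your environment:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/my-app/*.json
    json.keys_under_root: true
    json.overwrite_keys: true

output.elasticsearch:
  hosts: ["http://localhost:9200"]

The json.* options tell Filebeat to parse each log line as JSON (which the ECS loggers emit) and lift the parsed fields to the top level of the event.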

I don't want to misdirect this discussion, but @leostereo, do feel free to open a separate topic if you need any help getting started with the ECS tooling.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.