We’re working on ingesting logs from network devices (e.g., Cisco IOS) that send syslog directly to Logstash, which then forwards the data to Elasticsearch. We manage all components through custom automation (e.g., Ansible), without using Elastic Agent or Fleet Server.
For operational and security reasons, we want to avoid using Fleet and Elastic Agent, mainly due to concerns about vendor lock-in.
We would like to make use of the official Elastic integrations from GitHub, in particular:
https://github.com/elastic/integrations/tree/main/packages/cisco_ios
And extract from them:
• index mappings (without requiring Fleet or Elastic Agent)
What we’ve tried:
We took the fields.yml from the integration: https://github.com/elastic/integrations/blob/main/packages/cisco_ios/data_stream/log/fields/fields.yml
Then attempted to generate ECS-compliant mappings using the ECS generator: https://github.com/elastic/ecs
We placed the `fields.yml` into `usage-example/fields/custom/`, created a custom `subset.yml`, and manually added missing attributes such as `level` and `description`. We ran the generator with:
```shell
python3 scripts/generator.py \
  --ref v8.0.0 \
  --include usage-example/fields/custom/ \
  --subset usage-example/fields/subset.yml \
  --out output2/ \
  --template-settings-legacy usage-example/fields/template-settings-legacy.json \
  --template-settings usage-example/fields/template-settings.json \
  --mapping-settings usage-example/fields/mapping-settings.json \
  --semconv-version v1.23.0
```
However, we ran into multiple validation errors like:

```
ValueError: Field is missing the following mandatory attributes: description.
```

Even after adding `level`, `description`, etc., the generator continued to fail. It seems that the `fields.yml` from the integrations repository is not directly compatible with the ECS schema generator.
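Since the generator insists on ECS-only attributes like `description` and `level`, one workaround we sketched is to skip the ECS generator entirely and convert the integration's field definitions into mapping `properties` ourselves. A minimal sketch (the sample entries and the `ignore_above` default are our own assumptions, not taken from the package; in practice the list would come from parsing the package's `fields.yml`, e.g. with PyYAML):

```python
# Sketch: turn (already-parsed) integration fields.yml entries into an
# Elasticsearch "properties" mapping, bypassing the ECS generator.

def fields_to_properties(fields):
    """Convert a flat list of {name, type} field definitions into a
    nested Elasticsearch mapping 'properties' dict."""
    properties = {}
    for field in fields:
        parts = field["name"].split(".")
        node = properties
        # Walk/create the intermediate object fields for dotted names.
        for part in parts[:-1]:
            node = node.setdefault(part, {}).setdefault("properties", {})
        leaf = {"type": field.get("type", "keyword")}
        if leaf["type"] == "keyword":
            # Assumed default, mirroring what integration mappings usually set.
            leaf["ignore_above"] = 1024
        node[parts[-1]] = leaf
    return properties

# Hypothetical stand-ins for entries parsed from the cisco_ios fields.yml.
sample_fields = [
    {"name": "cisco.ios.facility", "type": "keyword"},
    {"name": "cisco.ios.access_list", "type": "keyword"},
    {"name": "event.sequence", "type": "long"},
]

mapping = {"mappings": {"properties": fields_to_properties(sample_fields)}}
```

The resulting `mapping` dict can then be embedded into an index template body and applied via the API from our automation.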
Questions:
- Is it officially supported or recommended to extract the `index_template/default.yml` and `ingest_pipeline/default.yml` from an integration package and apply them via the API (`PUT _index_template`, `PUT _ingest/pipeline`)?
- Is there an official tool or supported method to extract only the ingest pipeline and mappings from an Elastic integration without using Fleet?
- If we don’t want to use Elasticsearch ingest nodes, but instead process everything via Logstash, is there any way to obtain or convert the integration’s ingest pipeline into Logstash pipeline format (e.g., `filter { ... }`)?
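To make the first question concrete, this is roughly what we have in mind: take the package's index template and ingest pipeline definitions, convert the YAML to JSON, and apply them directly. All names below are our own, and the mapping/processor bodies are placeholders rather than the real package content:

```
PUT _index_template/logs-cisco_ios.log
{
  "index_patterns": ["logs-cisco_ios.log-*"],
  "data_stream": {},
  "template": {
    "settings": {
      "index.default_pipeline": "logs-cisco_ios.log-default"
    },
    "mappings": {
      "properties": {
        "cisco": {
          "properties": {
            "ios": {
              "properties": {
                "facility": { "type": "keyword", "ignore_above": 1024 }
              }
            }
          }
        }
      }
    }
  }
}

PUT _ingest/pipeline/logs-cisco_ios.log-default
{
  "description": "Converted from the package's ingest_pipeline/default.yml (placeholder processor)",
  "processors": [
    { "set": { "field": "event.kind", "value": "event" } }
  ]
}
```

In our automation we would issue these requests with Ansible's built-in `uri` module.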
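For the third question, this is the kind of Logstash filter we would expect to end up with. The sketch below is entirely hand-written for illustration: the grok pattern and the target field names are our guesses at the Cisco IOS syslog line format, not anything extracted from the integration's ingest pipeline:

```
filter {
  grok {
    # Hand-written approximation of a Cisco IOS syslog payload, e.g.
    # "123: *Mar  1 18:46:11.012: %SYS-5-CONFIG_I: Configured from console"
    match => {
      "message" => "(?:%{NUMBER:[event][sequence]:int}: )?[*.]?%{CISCOTIMESTAMP:[_tmp][timestamp]}: %%{DATA:[cisco][ios][facility]}-%{INT:[log][syslog][severity][code]:int}-%{DATA:[event][code]}: %{GREEDYDATA:[_tmp][message]}"
    }
  }
  date {
    # Cisco timestamps carry no year; the date filter assumes the current one.
    match => [ "[_tmp][timestamp]",
               "MMM dd HH:mm:ss.SSS", "MMM  d HH:mm:ss.SSS",
               "MMM dd HH:mm:ss",     "MMM  d HH:mm:ss" ]
  }
  mutate { remove_field => [ "_tmp" ] }
}
```

If something like this had to be written for every message variant the ingest pipeline handles, that would be exactly the manual rewriting effort we are hoping to avoid.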
Our goals are:
• No Fleet or Elastic Agent
• Full control via automation (e.g., Ansible)
• Manual setup of index templates
• Prefer using Logstash instead of ingest nodes, ideally with ready-to-use pipeline definitions (e.g., grok, date, etc.) — without manually extracting or rewriting them from ingest pipelines
Thanks in advance for your help or guidance!
— Václav Šulc