Is there a way for Logstash to ingest the whole document into one index and only specific fields into another index using the Elasticsearch output?
Example: if I have 10 fields coming in from the input, can I write all the fields to one index and some specific fields to another index?
input {
  kafka {
  }
}
filter {
}
output {
  elasticsearch {
    hosts => []
    index => # index for all fields from the incoming request
  }
  elasticsearch {
    hosts => []
    index => # index for specific fields from the document
  }
}
Outputs are applied after filters, so to do this you could write to elasticsearch in one pipeline and add a second output that forwards events to a second pipeline. In the second pipeline you can strip off the unwanted fields and write to the second elasticsearch index.
You can use a tcp output/input bound to localhost to communicate between the pipelines (or maybe even the new beta inter-pipeline communication feature, if you are feeling brave).
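A minimal sketch of the two-pipeline approach using pipeline-to-pipeline communication (the pipeline IDs, file paths, index names, and field names here are made up for illustration; the `pipeline` input/output plugins and the `prune` filter are the real building blocks):

```
# pipelines.yml
- pipeline.id: main
  path.config: "/etc/logstash/main.conf"
- pipeline.id: pruned
  path.config: "/etc/logstash/pruned.conf"

# main.conf -- index every field, and forward a copy to the second pipeline
input {
  kafka {
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "all-fields"
  }
  pipeline {
    send_to => ["pruned"]
  }
}

# pruned.conf -- keep only the wanted fields, then index them separately
input {
  pipeline {
    address => "pruned"
  }
}
filter {
  prune {
    whitelist_names => ["field1", "field2"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "specific-fields"
  }
}
```

The same shape works with a tcp output in main.conf pointed at a tcp input in pruned.conf on localhost, if you would rather avoid the beta feature.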