Setting Up Logstash in Docker Compose for Bulk Ingest of CSV Files on a Local Machine

@Ethan777100

Let's take a step back.

We have spent more than 50 responses showing you how to debug / set up the Docker containers etc. JUST to get it to work...

Not necessarily how you wanted to use all this in the long term. And yes, I understood WHY you wanted to leave the files where they are, but we were debugging just to get it to work for you... as there were challenges understanding paths etc.

You never communicated the data volumes etc... which, for a laptop, are quite significant.

I repeatedly reminded you to try to get it working with 1 file... 1 file.

AFTER you did that and knew how everything worked... you could then have configured your paths, both the sources and the Elasticsearch data, wherever you wanted.

What I would suggest is to clean up all the volumes...

Set up the Elasticsearch data volumes where you have space.
You will probably need to read more about volume mounts vs bind mounts.

- esdata01:/usr/share/elasticsearch/data

Perhaps you should read this...

You can also then leave your sources where they are and set up the paths as you have learned...
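
For example, a minimal sketch of what that could look like in docker-compose.yml (the host path for the CSV sources is an assumption here, adjust it to wherever your files actually live; esdata01 matches the volume name above):

services:
  es01:
    volumes:
      # named volume: Docker manages where the Elasticsearch data actually lives
      - esdata01:/usr/share/elasticsearch/data
  logstash:
    volumes:
      # bind mount: leave the CSV sources where they are on the host (read-only)
      - /path/to/your/csv_files:/usr/share/logstash/csv_files:ro

volumes:
  esdata01: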

So, from now on, you should learn about mappings (schema).

And replicas: you have 1, which I find odd... did you go back to 3 nodes? I would have expected 1 primary and 0 replicas if you were running 1 node.

You don't need 1 replica, as it doubles the space...

Also, you did not create a mapping, so you got the default one, which takes up more space...

It seems that you are trying to do the minimum without some of the basic understanding of Elasticsearch, which is fine... but then you should not be surprised.

We have been helping you with Docker and Logstash and all these things... these are just about setting up the data ingest... but you have not spent any time learning about Elasticsearch itself...

What I would do.....

Clean up everything, including the volumes (that will free up your file space again).

Now set up where you want to read the sources from.

I would only run 1 node... I'm still confused whether you went back to 3... you should not have a replica with 1 node.

Determine whether you want to use a volume mount or a bind mount for the Elasticsearch data (a volume mount would probably be good).

I would upload 1 CSV file via the File Upload; you look at the types, and it will upload the file and create an index and mapping etc...

Then I would set everything up and upload 1 file via Logstash etc... to the SAME index that you uploaded to, and see if you like the way it works.
Make sure everything is OK.

Then I would probably divide your sources into subdirectories, or load some subset of them at a time with a * glob:

events2022-01*.csv
events2022-02*.csv
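
In the Logstash file input, that could look something like this (just a sketch; the container path is an assumption):

input {
  file {
    # narrow the glob to load one month's worth of files per run
    path => "/usr/share/logstash/csv_files/events2022-01*.csv"
    start_position => "beginning"
  }
}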

In short ... be methodical ... Plan, Code, Test, Run, Fix... repeat THEN go for volume.

I did not go back to 3 nodes. I've always stuck with one.

So you would like me to sweep everything clean (ES/Kibana and Logstash) with docker-compose down

And docker-compose up -d again?

OK good... sorry, so it is asking for 1 primary + 1 replica, but it looks like you only have 1 primary allocated... which is why it is Yellow.

So that is not causing the double space

But you are using the default schema, which makes 2 fields for every field (a keyword and a text), more than you probably need...

Space can be compacted when you are done...

So again... you probably should have loaded 10 or 20 files and looked at the space first...

Sorry, but we cannot do this all for you... my suggestion is to slow down... and iterate...

So when I look at your Docs vs Space

I see 27 GB for 51,159,726 docs = 27*1024^3 / 51,159,726 ≈ 567 bytes per document for the source and all the data structures to search, filter, aggregate... that does not seem too bad.

With a proper mapping (schema) I'm sure the space could be reduced...

Why all your filesystem is being taken up, I'm not sure...

@Ethan777100 Let me be clear: I have been very happy to help, but I cannot do this project for you... Slow down and consider your steps.

Me, I would not just leave the system ingesting for hours on its own and walk away; I would plan, segment, test, run, test, etc...

You need to figure out what works for you, understand the components and the process, and proceed in an orderly fashion.

late here... good luck


Regarding mapping / schema

{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "@version": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "alarm": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "alarmvalue": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "column19": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "description": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "equipment": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "event": {
        "properties": {
          "original": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      },
      "eventtype": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "graphicelement": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "host": {
        "properties": {
          "name": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      },
      "id": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "location": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "log": {
        "properties": {
          "file": {
            "properties": {
              "path": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
          }
        }
      },
      "message": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "mmsstate": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "operator": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "severity": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "sourcetime": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "state": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "subsystem": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "system": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "uniqueid": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "value": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "zone": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      }
    }
  }
}

This is what I retrieved, and yeah, I know it's a JSON file. 19+ columns, and I don't really see anything weird about this?

It was a 1-to-1 mapping of column to field (that's how I understand it).

But you are using the default schema, which makes 2 fields for every field (a keyword and a text), more than you probably need...

More than I need?

I just read up about mappings inside Mapping | Elasticsearch Guide [8.11] | Elastic

I don't think it looks that straightforward to edit the fields directly.

I consciously realised quite a lot of my fields have been read as text and not numbers.

Could this help me reduce file size?

Meanwhile: Docker Hard Disk Image File Being Too Large - Elastic Stack / Logstash - Discuss the Elastic Stack

Yes, this is the default mapping I was speaking of. I suspect many of these fields are exact matches or need to be aggregated, not free-text searched. So instead, the mapping should look something like this for most of your fields:

"alarm": {
        "type": "keyword",
         "ignore_above": 256
        
      },

Yes, also set the fields as numbers for the ones you want.
You simply have to create the mapping through Dev Tools ahead of time and then write to that index. It's very simple and straightforward. I would suggest trying it with one file and seeing what the results are.

Also, for the time fields you should probably make them a date type... but you need to pay attention to the format.
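
For example, a minimal sketch of creating such a mapping up front in Dev Tools (the index name and the handful of fields are only illustrative, taken from the fields discussed in this thread):

PUT discuss-test
{
  "mappings": {
    "properties": {
      "sourcetime":  { "type": "date", "format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss" },
      "alarm":       { "type": "keyword", "ignore_above": 256 },
      "severity":    { "type": "long" },
      "description": { "type": "text" }
    }
  }
}

Anything you then index into discuss-test uses these explicit types instead of the dynamic text + keyword default.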

This is interesting. So I managed to retrieve the mapping and pipeline.json from the Import File approach.

I uploaded 1 CSV file under Integrations > Upload File.

mapping.json

{
  "properties": {
    "@timestamp": {
      "type": "date"
    },
    "alarm": {
      "type": "keyword"
    },
    "alarmvalue": {
      "type": "long"
    },
    "description": {
      "type": "text"
    },
    "equipment": {
      "type": "keyword"
    },
    "eventtype": {
      "type": "keyword"
    },
    "graphicelement": {
      "type": "long"
    },
    "id": {
      "type": "long"
    },
    "location": {
      "type": "keyword"
    },
    "mmsstate": {
      "type": "long"
    },
    "operator": {
      "type": "keyword"
    },
    "severity": {
      "type": "long"
    },
    "sourcetime": {
      "type": "date",
      "format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss.SS||yyyy-MM-dd HH:mm:ss||yyyy-MM-dd HH:mm:ss.S"
    },
    "state": {
      "type": "long"
    },
    "subsystem": {
      "type": "keyword"
    },
    "system": {
      "type": "keyword"
    },
    "uniqueid": {
      "type": "keyword"
    },
    "value": {
      "type": "keyword"
    },
    "zone": {
      "type": "long"
    }
  }
}

vs

{
  "mappings": {
    "properties": {
      "@timestamp": {
        "type": "date"
      },
      "@version": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "alarm": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "alarmvalue": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "column19": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "description": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "equipment": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "event": {
        "properties": {
          "original": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      },
      "eventtype": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "graphicelement": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "host": {
        "properties": {
          "name": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      },
      "id": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "location": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "log": {
        "properties": {
          "file": {
            "properties": {
              "path": {
                "type": "text",
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
          }
        }
      },
      "message": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "mmsstate": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "operator": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "severity": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "sourcetime": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "state": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "subsystem": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "system": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "uniqueid": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "value": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      },
      "zone": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword",
            "ignore_above": 256
          }
        }
      }
    }
  }
}

There are more fields mapped as long when I upload 1 CSV.

When I ingest via the Logstash pipeline, everything is a keyword ("type": "keyword"), even if the entire column is a number.

ingest-pipeline
Not sure what I should look out for in the ingest pipeline.

{
  "description": "Ingest pipeline created by text structure finder",
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": [
          "id",
          "uniqueid",
          "alarm",
          "eventtype",
          "system",
          "subsystem",
          "sourcetime",
          "operator",
          "alarmvalue",
          "value",
          "equipment",
          "location",
          "severity",
          "description",
          "state",
          "mmsstate",
          "zone",
          "graphicelement"
        ],
        "ignore_missing": false
      }
    },
    {
      "date": {
        "field": "sourcetime",
        "timezone": "{{ event.timezone }}",
        "formats": [
          "yyyy-MM-dd HH:mm:ss.SSS",
          "yyyy-MM-dd HH:mm:ss.SS",
          "yyyy-MM-dd HH:mm:ss",
          "yyyy-MM-dd HH:mm:ss.S"
        ]
      }
    },
    {
      "convert": {
        "field": "alarmvalue",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "graphicelement",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "id",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "mmsstate",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "severity",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "state",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "convert": {
        "field": "zone",
        "type": "long",
        "ignore_missing": true
      }
    },
    {
      "remove": {
        "field": "message"
      }
    }
  ]
}

The top mapping.json is much closer to what you want....

You might need to fix a few.... just change the mapping... to long etc.

And you could just use Logstash to read the file, and then set the ingest pipeline on the Logstash output so that the ingest pipeline runs when the data comes in.

pipeline

  • Value type is string
  • There is no default value.

Set which ingest pipeline you wish to execute for an event. You can also use event dependent configuration here like pipeline => "%{[@metadata][pipeline]}". The pipeline parameter won’t be set if the value resolves to empty string ("").

input { 
    file { 
        path => "/usr/share/logstash/csv_files/test.csv"
        start_position => "beginning" 
        sincedb_path => "/dev/null"
    } 
}

output { 
    elasticsearch {
        index => "discuss-test"                # <--- my test index
        hosts => ["https://es01:9200"]
        user => "elastic"
        password => "mypassword"
        ssl_verification_mode => "none"
        pipeline => "discuss-test-pipeline"    # <--- this is the pipeline created by the File Upload
    }
    stdout { codec => "rubydebug" } 
}

Note the above is with my test data ... but I think you see the pattern...

You can see the ingest pipeline and index in Stack Management ...




Cool. These are really the small hacks not mentioned in the documentation.

This is essentially for people like me who don't know how to create or modify a mapping or pipeline from scratch.

So the hack is to upload a CSV using the provided UI and get the mapping and pipeline created inside Elastic.

Then use the logstash container to ingest the rest of the files.

PS: I'm closely tracking my ext4.vhdx file size.
10.5 GB -> 14.46 GB at 1.2 million rows.
For 1 month of logs, I'm looking at ~4.122 million rows (assuming 137.4k rows per CSV file).

Guess it now begs the question: why was the default schema that I used earlier so "inflated" and "bulky", when the current mapping and pipeline are honestly more streamlined?

The ingestion stopped for an unknown reason.

The logs ingested were from 01-June-2023 08:00:00 to 12-June-2023 02:55:00.

My full range of logs is 01-June-23 00:00:00 to 30-June-23 23:59:59.
I'm not sure what's going on. My Logstash container is still running, no errors raised.

Nothing raised in the ES and Kibana containers either.

Oh no. My disk is still growing despite not ingesting anything.

Restarted Logstash several times. The logs are now completely empty and something is fishy. The pipeline doesn't even start upon container startup.

I also learnt over the past hour that when Logstash is restarted, the pipeline re-reads everything into my index.

Now, this cycle of parsing has caused duplicate rows for the data from 01-June-2023 08:00:00 to 12-June-2023 02:55:00.

This time round, Logstash continued ingesting files beyond 12-June-2023 02:55:00 (the smooth-sailing case).

How can I delete the first section of logs?

We discussed this

Take that out... sincedb_path => "/dev/null" is what resets Logstash's "memory" of what it has already read...
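
In other words, in the file input you can either drop the sincedb_path setting entirely (Logstash then keeps its position file under its data directory) or point it at a file that survives restarts. A sketch, assuming the file input from earlier in this thread (the exact path is an assumption, any persisted writable location works):

input {
  file {
    path => "/usr/share/logstash/csv_files/*.csv"
    start_position => "beginning"
    # persist the read positions so a container restart does not re-ingest every file
    sincedb_path => "/usr/share/logstash/data/sincedb_csv"
  }
}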

I think this is where you start to read the docs...

But me... I would be loading the data in "batches" and putting them in indices like:

ats-logs-2022-01
ats-logs-2022-02

Then create a data view, ats-logs-*. That way I would have more control... and could delete and reload "sections" of the data more easily than putting all the data into a single index.
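
As a sketch, Logstash can also route events to monthly indices based on each event's @timestamp (the other output settings mirror the earlier test example; the index prefix is just the name above):

output {
  elasticsearch {
    # one index per month, resolved from the event's @timestamp
    index => "ats-logs-%{+yyyy-MM}"
    hosts => ["https://es01:9200"]
    user => "elastic"
    password => "mypassword"
    ssl_verification_mode => "none"
  }
}

Or, if you load one month of files per run, simply hard-code the index for that batch (ats-logs-2022-01, load it, then change it for the next batch).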

2023-11-21 13:36:50 [2023-11-21T05:36:50,943][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2023-11-21 13:36:52 [2023-11-21T05:36:52,189][INFO ][org.reflections.Reflections] Reflections took 252 ms to scan 1 urls, producing 132 keys and 464 values
2023-11-21 13:36:52 [2023-11-21T05:36:52,727][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:36:52 [2023-11-21T05:36:52,743][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:452)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:92)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:452)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:92)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:516)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:293)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:36:56 [2023-11-21T05:36:56,136][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:36:56 [2023-11-21T05:36:56,143][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:36:59 [2023-11-21T05:36:59,065][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:36:59 [2023-11-21T05:36:59,068][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:02 [2023-11-21T05:37:02,163][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:02 [2023-11-21T05:37:02,166][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:04 [2023-11-21T05:37:04,996][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:04 [2023-11-21T05:37:04,998][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:07 [2023-11-21T05:37:07,967][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:07 [2023-11-21T05:37:07,969][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:11 [2023-11-21T05:37:11,000][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:11 [2023-11-21T05:37:11,004][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:13 [2023-11-21T05:37:13,995][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:13 [2023-11-21T05:37:13,996][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}
2023-11-21 13:37:16 [2023-11-21T05:37:16,946][ERROR][logstash.outputs.elasticsearch] Unknown setting 'mapping' for elasticsearch
2023-11-21 13:37:16 [2023-11-21T05:37:16,948][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ConfigurationError) Something is wrong with your configuration.", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.AbstractPipelineExt.initialize(AbstractPipelineExt.java:186)", "org.logstash.execution.AbstractPipelineExt$INVOKER$i$initialize.call(AbstractPipelineExt$INVOKER$i$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:847)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1318)", "org.jruby.ir.instructions.InstanceSuperInstr.interpret(InstanceSuperInstr.java:139)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:128)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:115)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.RubyClass.newInstance(RubyClass.java:931)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:90)", "org.jruby.ir.instructions.CallBase.interpret(CallBase.java:561)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:367)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:88)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:238)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:225)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:228)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:291)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:328)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:116)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:143)", "org.jruby.RubyProc.call(RubyProc.java:352)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:110)", "java.base/java.lang.Thread.run(Thread.java:840)"]}

Decided to add a pipeline and mapping directory for the JSON files I obtained from the mapping and ingest pipeline.
image

Not sure if this is the right format. logstash.conf:

output { 
    elasticsearch { 
        index => "ats-mainline-logs-2023-06" 
        hosts => ["https://es01:9200"]
        manage_template => false
        user => "elastic"
        password => "elastic123"
        ssl_verification_mode => "none"
        pipeline => "./ats-logs-mainline/ats-logs-ingest pipeline.json"
        mapping => "./ats-logs-mainline/ats-logs-mapping.json"
    }
}

Regarding

Delete by query API | Elasticsearch Guide [8.11] | Elastic

Is there a way to delete rows based on a range of timestamps? I tried to do that, but the format isn't right.

The format is 1/6/2023 00:00:00 - but as you can see, the display format is just different.
image
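
For reference, a range-based delete in Dev Tools could look something like this (a sketch only; the index and field names are the ones from this thread, and the format parameter has to describe the strings you pass in the query, not the display format Kibana shows):

POST ats-mainline-logs-2023-06/_delete_by_query
{
  "query": {
    "range": {
      "sourcetime": {
        "gte": "2023-06-01 00:00:00",
        "lt":  "2023-06-12 02:56:00",
        "format": "yyyy-MM-dd HH:mm:ss"
      }
    }
  }
}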

That is not correct; that is not how it works... You can't just point to a JSON file for the pipeline and the mapping.

In Dev Tools, you need to PUT the ingest pipeline, like on this page, and then use its name in the logstash conf.

For the mapping, you need to PUT an index template.
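
As a sketch of the shape of those two calls in Dev Tools (the names here are made up for illustration, and the bodies are trimmed; in practice you would paste in the full processors array and properties from the pipeline.json and mapping.json you generated):

PUT _ingest/pipeline/ats-logs-mainline-pipeline
{
  "description": "CSV ingest pipeline from File Upload",
  "processors": [
    { "csv": { "field": "message", "target_fields": ["id", "uniqueid", "alarm"], "ignore_missing": false } },
    { "convert": { "field": "id", "type": "long", "ignore_missing": true } },
    { "remove": { "field": "message" } }
  ]
}

PUT _index_template/ats-logs-mainline-template
{
  "index_patterns": ["ats-mainline-logs-*"],
  "template": {
    "mappings": {
      "properties": {
        "sourcetime": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss" },
        "alarm":      { "type": "keyword" },
        "id":         { "type": "long" }
      }
    }
  }
}

Then in the Logstash output you keep only pipeline => "ats-logs-mainline-pipeline" (the pipeline's name, not a file path) and drop the mapping setting entirely; the template is applied automatically when the index is created, because its name matches ats-mainline-logs-*.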


Meanwhile, I'm having this weird behaviour where:

My CSV files are being ingested as per usual.

Yet the index is not growing with the files.

This was an index I created using File Upload; I then used the Console API to delete the 137,000+ rows of data, to prevent duplicate entries for 1 June 2023.

Now I start the Logstash container to ingest the data as per the pipeline and mapping JSON that were generated.

I double- and triple-checked that all my pointers and names were correct before starting the Logstash container.

I don't know where my data is being piped to now.

At the same time,

I'm getting quite a few bad rows that cannot be ingested into Elastic. Which is weird, because I thought we got the mapping and pipeline configs right?

The log output is flashing by too fast to copy in time.

Go to Dev Tools

GET _cat/indices?v

Pasting images of text... sigh.

But the error says it cannot index because of a date time format exception.

health status index                                                          uuid                   pri rep docs.count docs.deleted store.size pri.store.size dataset.size
green  open   .internal.alerts-observability.logs.alerts-default-000001      O54Ee8Z_TaeI4TSNYbPXwQ   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-observability.uptime.alerts-default-000001    yg2Q9o8KR_a03cKGBqx6Dw   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-ml.anomaly-detection.alerts-default-000001    voKYtP5LRC23_X-skrAOfg   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-observability.slo.alerts-default-000001       eHQPEP6QTx6O1f1rPVko4A   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-observability.apm.alerts-default-000001       AgnDZUQ9RPS2csA4MUkL7g   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-observability.metrics.alerts-default-000001   WKqpRWVtRtitYRdyFolR8g   1   0          0            0       249b           249b         249b
green  open   .kibana-observability-ai-assistant-conversations-000001        3MzK0r5rT72EKYePE2meOw   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-observability.threshold.alerts-default-000001 TBSeIwJ3RFCGvKjYhgJ34Q   1   0          0            0       249b           249b         249b
yellow open   ats-mainline-logs-2023-06                                      UnuGNRFKTsGI85YILs-scA   1   1          0            0      2.2mb          2.2mb        2.2mb
green  open   .kibana-observability-ai-assistant-kb-000001                   TJIEehIuSJebmgUrV9VUzA   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-security.alerts-default-000001                7nSFEQuwTwqlh6OXmg-b8Q   1   0          0            0       249b           249b         249b
green  open   .internal.alerts-stack.alerts-default-000001                   R854JGu-TsSo4ivd04w4zw   1   0          0            0       249b           249b         249b

The pipeline file:

    {
      "date": {
        "field": "sourcetime",
        "timezone": "Asia/Singapore",
        "formats": [
          "yyyy-MM-dd HH:mm:ss.SSS",
          "yyyy-MM-dd HH:mm:ss.SS",
          "yyyy-MM-dd HH:mm:ss",
          "yyyy-MM-dd HH:mm:ss.S"
        ]
      }
    },

And the mapping file:

     "sourcetime": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd HH:mm:ss.SS||yyyy-MM-dd HH:mm:ss||yyyy-MM-dd HH:mm:ss.S"

So many formats were already accounted for and detected by File Upload, though.

It's failing and not ingesting the data

I can't really see / read the error because you pasted an image... but the error tells you what the issue is...

Bad timezone or something.

However, some rows appear to be passing through.

Yeah, I noticed this this morning. Was wondering if this could explain why my data only ingested from 08:00:00 instead of 00:00:00?