Currently we are using Splunk and planning to migrate to Elastic. We are in the process of deciding how data should be ingested into the Elastic cluster.
In this topic I will provide:
- Sample log data.
- The current Splunk query we use on that data.
- How this data is indexed into the Elastic cluster.
- My question: given how the data is currently indexed in the Elastic cluster, is it possible to write Query DSL queries that return the same information as the Splunk query, or do we need to change the way we index the documents?
Sample data:
2019-03-25 23:09:59 (973) worker.3 worker.3 txid=f0fe292fdb50 Completed: ASYNC: Discovery - Sensors in 0:03:31.246, next occurrence is null
Splunk query to extract the jobname and form a table:
index=abc timeformat="%Y-%m-%d %H:%M:%S" earliest="2019-03-25 23:00:00" searchtimespanminutes=10 instance=* node=* Completed: worker
| rex field=_raw "Completed: (?<jobname>.*) in"
| table _time, node, jobname
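(On the sample line above, this rex would extract jobname = "ASYNC: Discovery - Sensors".)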
When we ingested the data into the Elastic cluster, we used a regex that converted the raw data into the fields below:
@timestamp: 2019-03-25 23:09:59
Threadname: worker.3
message: txid=f0fe292fdb50 Completed: ASYNC: Discovery - Sensors in 0:03:31.246, next occurrence is null
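For reference, after ingestion each event is stored as a JSON document shaped like the one below. The field types are our assumption: @timestamp as date, Threadname as keyword, and message as text with a message.keyword sub-field.

```json
{
  "@timestamp": "2019-03-25 23:09:59",
  "Threadname": "worker.3",
  "message": "txid=f0fe292fdb50 Completed: ASYNC: Discovery - Sensors in 0:03:31.246, next occurrence is null"
}
```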
My question is: can we extract jobname from message in Elastic at query time?
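To make this concrete, the sketch below shows roughly what I have in mind: a runtime field defined in the search request that pulls jobname out of message with grok. This is only a sketch under assumptions I have not verified on our cluster: that we are on Elasticsearch 7.11+ (where runtime_mappings and the Painless grok helper are available), that message has a message.keyword sub-field with doc values, and that logs-* is a placeholder index pattern.

```json
GET logs-*/_search
{
  "runtime_mappings": {
    "jobname": {
      "type": "keyword",
      "script": {
        "source": """
          // Extract everything between "Completed: " and the following " in"
          if (doc['message.keyword'].size() > 0) {
            String j = grok('Completed: %{DATA:jobname} in').extract(doc['message.keyword'].value)?.jobname;
            if (j != null) emit(j);
          }
        """
      }
    }
  },
  "query": { "match": { "message": "Completed" } },
  "fields": ["@timestamp", "Threadname", "jobname"]
}
```

If something like this is valid, I assume the same field could also be defined once under runtime in the index mapping, so each query would not need to repeat the script.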
Also, in this example jobname is a string, but in other cases the extracted value might be an integer. Can we extract values of different types in Elastic at query time and apply the required functions (for example, aggregations) on top of the extracted values?
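As an illustration of the integer case, suppose a hypothetical log line ends with "processed 1234 records" (this pattern is made up for the example). What I am picturing, under the same unverified 7.11+ and message.keyword assumptions as above, is a numeric runtime field that aggregations can then use like any indexed field:

```json
GET logs-*/_search
{
  "size": 0,
  "runtime_mappings": {
    "record_count": {
      "type": "long",
      "script": {
        "source": """
          // Parse the number after "processed" into a long so numeric aggs apply
          if (doc['message.keyword'].size() > 0) {
            String c = grok('processed %{NUMBER:count} records').extract(doc['message.keyword'].value)?.count;
            if (c != null) emit(Long.parseLong(c));
          }
        """
      }
    }
  },
  "aggs": {
    "avg_records": { "avg": { "field": "record_count" } }
  }
}
```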
Please suggest.
If any further information is required, please let me know.
Thanks,
Ravi