Hi all, I'm a complete newbie to ELK. I signed up for a two-week Elastic Cloud trial and onboarded some sample logs, but I've gotten stuck on a line-break issue.
(I checked the logs and searched on YouTube, but no luck.)
The sample log is very simple: about 10 lines (no timestamps, actually), and every two lines should form one event.
So I inserted empty lines after every two lines of the log.
Now, when I uploaded this log, I got errors. I edited the settings on the upload page (unchecked the timestamp, etc.) to make each two lines a single event, but I see no option for that. Please suggest, thanks.
Hi @inventsekar,
Welcome to the community! How are you uploading your data into Elasticsearch? Can you share the error you are seeing when trying to ingest the log?
Thanks Carly..
Actually I am using Elastic Cloud, and I just want to upload one simple log file, once (I don't need to monitor the file or anything), so I am not using Filebeat, etc.
I used the data upload option on Elastic Cloud. It worked fine, but the line breaking is giving me a difficult time.
Thanks
Sekar
So to confirm you're not receiving an import error, but are struggling with line breaks in your logs? Can you give an example of the input you have and the result you are trying to achieve?
Yep, the import initially gave some errors.
I unchecked the timestamp (as the logs have no timestamp),
and I selected that the file is semi-structured and delimited by spaces. Then it imported fine.
But the line breaking is the important thing.
The sample log is very simple. Let's assume I have a notepad file with 100 lines of structured data (a CSV file, we could say).
I just want to break it into events at every two lines.
I did this project using the Splunk tool, and now I would like to achieve the same through ELK.
(I am unable to add a direct YouTube link, so here it is: https // youtu.be / htm6l_PzWhw )
Please check this video; it's just 4 minutes.
I much appreciate your response.
Thanks.
Sekar
Thanks for sharing the video @inventsekar. Am I right in assuming that being able to handle the line breaks is a requirement for you, rather than manipulating the file so each event is on a distinct line?
For an approach similar to the pattern matching in your Splunk video, I would recommend pre-processing your data first. One way to do this is to use Logstash with the multiline codec plugin and your regex pattern, so the data is ingested the way you want.
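As a rough sketch of what that could look like: the file path, the Cloud credentials, and the `^EVENT_START` pattern below are all placeholders — you would replace the pattern with a regex that matches the first line of each two-line event in your data.

```conf
# Hypothetical Logstash pipeline for a one-off import of a multiline log.
# Path, pattern, and credentials are placeholders, not values from this thread.
input {
  file {
    path => "/path/to/sample.log"
    start_position => "beginning"   # read the whole file, not just new lines
    sincedb_path => "/dev/null"     # don't track progress; re-read on each run
    codec => multiline {
      pattern => "^EVENT_START"     # regex matching the FIRST line of each event
      negate => true                # lines that do NOT match the pattern...
      what => "previous"            # ...are appended to the previous event
    }
  }
}
output {
  elasticsearch {
    cloud_id => "YOUR_CLOUD_ID"
    api_key  => "YOUR_API_KEY"
  }
}
```

Note the multiline codec joins lines based on a pattern, not a line count, so this works when each event's first line is distinguishable (as in your Splunk regex). If the two lines really have no distinguishing marker, a small pre-processing script to join every pair of lines before upload may be simpler.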
Hope that helps! Looking forward to seeing the next video.
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.