I want to import a 3GB system file to ELK in one go

But it is showing a disk error. I installed Graylog now, but it is also not working. Will anyone be able to tell me how to read a 3 GB file in ELK, or suggest any other free, open-source software like ELK? Kindly suggest on an urgent basis.

If anyone knows an answer related to this, kindly comment.

Can you provide more details on the file format?

Hey, thanks for the reply. I have a system log file of 3 GB with data like this:

172.16.2.22	Jun 22 10:49:18	date=2020-06-22	local7	notice		time=10:49:17 devname="FW_1" devid="FG2TK19907000" logid="000020" type="traffic" subtype="forward" level="notice" vd="root" eventtime=1592803157 srcip=172.16.16.21 srcport=57945 srcintf="VLAN-2" srcintfrole="lan" dstip=52.179.24.121 dstport=443 dstintf="wan2" dstintfrole="wan" poluuid="7b6d31e94-51ea-a540-5761e9fa764f" sessionid=38967804 proto=6 action="accept" policyid=10 policytype="policy" service="HTTPS" dstcountry="United States" srccountry="Reserved" trandisp="snat" transip=49.206.32.147 transport=57945 appid=41469 app="Microsoft.Portal" appcat="Collaboration" apprisk="elevated" applist="default" duration=104665 sentbyte=271556 rcvdbyte=381246 sentpkt=3499 rcvdpkt=1767 sentdelta=310 rcvddelta=432 

I tried to import it through ELK, but it is giving me disk errors after some time. If not ELK, can you suggest any other free and open-source software that can read and apply filters to a 3 GB file?

@Deepika_Rawat
What's the exception? Is it on the Logstash side or the ES side?

On the ES side the error was "could not index to Elasticsearch", but when I split the file into 15 MB chunks it does not show errors.

If you have access to ES, can you post stack traces from the ES log?
It could be the batch size and the number of concurrent batches, but that is hard to figure out from just a one-liner. Also, the message in your last post does not say anything about disk. How did you conclude it's a disk error?

Any additional info on your pipeline / hardware setup would be useful for debugging.
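
If batching does turn out to be the culprit, the relevant knobs live in logstash.yml. A sketch with illustrative values (125 is the stock batch size; the worker count here is made up, not a recommendation for your setup):

    # logstash.yml
    pipeline.workers: 2        # how many batches are processed concurrently
    pipeline.batch.size: 125   # events collected per batch before the outputs run

Smaller batches and fewer workers make each bulk request lighter on Elasticsearch.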

Okay, I will send the errors when I execute it again, but it was a cluster block exception or something. But just tell me: is it possible to import a 3 GB file into ELK in one go?

I would recommend using Logstash to read it line by line, as it allows you to define filters to parse the information in the log records.

I tried using Logstash, but after some time it gives errors with error code 403.

Then try to resolve that problem. Sharing the logs, error message and configuration would allow us to help.
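
One common cause worth ruling out (an assumption on my part, since the full stack trace hasn't been shared): a 403 combined with a cluster block exception is often Elasticsearch's flood-stage disk watermark switching indices to read-only, which would also explain the "disk error" you mentioned. You can check disk usage per node and, once space has been freed, clear the block (adjust the host if ES is not on localhost):

    # Check disk usage per data node
    curl -s 'http://localhost:9200/_cat/allocation?v'

    # After freeing disk space, remove the read-only block from all indices
    curl -X PUT 'http://localhost:9200/_all/_settings' \
         -H 'Content-Type: application/json' \
         -d '{"index.blocks.read_only_allow_delete": null}'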

Okay, just tell me: what is the maximum file size that can be imported into ELK in one go?

You insert line by line into Elasticsearch, not full files.
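
To illustrate (a generic sketch; my-index and the documents are placeholders): each line becomes its own document, and documents travel to Elasticsearch in batches via the bulk API, so there is no fixed per-file limit, only per-request and per-cluster limits.

    # batch.ndjson: one action line per document (the file must end with a newline)
    {"index":{}}
    {"message":"first log line"}
    {"index":{}}
    {"message":"second log line"}

    # Send the batch (my-index is a placeholder index name)
    curl -X POST 'http://localhost:9200/my-index/_bulk' \
         -H 'Content-Type: application/x-ndjson' \
         --data-binary @batch.ndjson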

I want to import a 3GB system file to ELK in one go... Here are the logs and the configuration file. I have used the KV filter, so they are showing errors.

I can see a sample log line, but no config or error messages.

input {
  file {
    path => "/root/og00"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  kv {
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "log00_210270"
  }
  stdout { }
}

As the first few fields are not in KV format, you cannot just use a KV filter. Instead, you need to first parse out the fields that are not in KV format and store the long KV string in a separate field. You can then apply the KV filter to that field. Have a look at this blog post for a guide on how to parse data and work with Logstash.

Hey, thanks for the reply. I am new to ELK; will you be able to write a filter section? Actually, I don't know how to start. I would be grateful.

Have a look at the tutorial I linked to. I think that is a good place to start.
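
As a rough starting point, something like this might do it (a sketch based on your sample line, assuming the leading fields are tab-separated; the field names relay_ip, log_timestamp and kv_message are illustrative, not required):

    filter {
      # Turn tabs into spaces so a single separator is used throughout
      # (the \t is interpreted by the regex engine, not the config parser)
      mutate {
        gsub => ["message", "\t", " "]
      }

      # The leading fields (relay IP, syslog timestamp) are not key=value,
      # so capture them with grok and keep the remainder, from "date=..."
      # onward, in a separate field
      grok {
        match => {
          "message" => "^%{IP:relay_ip}\s+%{SYSLOGTIMESTAMP:log_timestamp}\s+%{GREEDYDATA:kv_message}"
        }
      }

      # Apply the KV filter only to the key=value part of the line; bare
      # tokens such as "local7" and "notice" contain no "=" and are skipped
      kv {
        source => "kv_message"
      }

      # Optionally drop the intermediate field once it has been expanded
      mutate {
        remove_field => ["kv_message"]
      }
    }

The gsub step is only there because the kv filter splits fields on spaces by default, and your sample line mixes tabs and spaces.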

What do you mean by "the long KV string in one field" in this line? I don't want to display the first few fields.