What steps are required to analyze a small PCAP file in Elastic Machine Learning Anomaly Detection? When I saved the PCAP file as a CSV, the data was not suitable for analysis with anomaly detection, even though it is only 95 KB

This is synthetic data.

Hello,

For us, CSV import issues in Elastic almost always arise from EU vs. US decimal and delimiter differences (comma vs. dot). Elastic defaults to the US format, which causes parsing errors. One workaround is to pre-process the CSV files, but in my opinion that is cumbersome and should be fixed.

Of course, your issue could be related to something else entirely.
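If the delimiter/decimal mismatch turns out to be the culprit, a minimal pre-processing sketch might look like this. It assumes the export uses ';' as the delimiter and a comma as the decimal separator; the helper name and the numeric-field heuristic are my own, not anything Elastic provides:

```python
import csv
import io

def eu_to_us_csv(eu_text):
    """Convert a ';'-delimited CSV with comma decimals into a
    ','-delimited CSV with dot decimals, which Elastic expects."""
    reader = csv.reader(io.StringIO(eu_text), delimiter=";")
    out = io.StringIO()
    writer = csv.writer(out)
    for row in reader:
        # Swap the decimal comma for a dot, but only in fields that
        # look numeric, so text fields are left untouched.
        writer.writerow(
            [f.replace(",", ".")
             if f.replace(",", "").replace(".", "").isdigit() else f
             for f in row]
        )
    return out.getvalue()

eu = "time;length\n0,123;60\n0,456;1500\n"
print(eu_to_us_csv(eu))
```

This reads the whole file into memory, which is fine for small captures like the 95 KB one in question; larger files would need a streaming approach.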

Willem

Thank you, Willem, for sharing your experience! I will take that into consideration. Somebody told me that my problem most likely arises because I do not have a timestamp column. I will fix that and run the experiment again. I will keep you posted! Thank you for everything! P.S. Machine Learning using Dataframe Analytics worked.

Using Wireshark, I added two timestamp columns: one with seconds since 1970 (or something similar) and another with the month, day, year, hours, minutes, and seconds. I am not 100% sure which one was the most important; my guess is the second new column. However, after that, the data was recognized inside Elastic Cloud. Of course, I saved the PCAP data as a CSV file and imported it into Elastic Cloud.

Now I am sure: the key was to add a column in Wireshark, which I called Epoch Time, containing seconds since 1970, or something like that. After that I exported the capture as CSV and checked that the file size was under 500 MB. Then I imported the file, and Elastic Cloud let me run machine learning jobs on it.
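For anyone hitting the same wall without Wireshark handy, here is a minimal sketch of the same idea done on the exported CSV itself: deriving an epoch-seconds column from a human-readable time column. The column names, the date format, and the UTC assumption are all illustrative, not something Wireshark guarantees:

```python
import csv
import io
from datetime import datetime, timezone

# Hypothetical Wireshark-style CSV export with a human-readable "Time"
# column; your real export's column name and format may differ.
raw = """No.,Time,Source,Destination,Length
1,2024-01-15 10:30:00,10.0.0.1,10.0.0.2,60
2,2024-01-15 10:30:01,10.0.0.2,10.0.0.1,1500
"""

reader = csv.DictReader(io.StringIO(raw))
rows = list(reader)
for row in rows:
    dt = datetime.strptime(row["Time"], "%Y-%m-%d %H:%M:%S")
    # Treat the capture time as UTC; adjust if your capture is local time.
    row["Epoch Time"] = dt.replace(tzinfo=timezone.utc).timestamp()

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

With an epoch (or otherwise parseable) time column present, the Elastic file upload should detect a time field and let anomaly detection jobs use the index.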
