I tried to upload a log file and a CSV file through the Kibana UI upload file option, but in both cases I get a "File structure cannot be determined" error

I am new to Elastic.
I have set up ELK on an Ubuntu machine.
I tried to upload a log file and a CSV file through the Kibana UI upload file option, but in both cases I get a "File structure cannot be determined" error.
I have tried overriding the settings to sample only 10 lines, but I still get the same issue.
Please help.

Error snapshot

My CSV file looks like this - just a simple CSV file

My log file looks like this

Hi @cadrija. Can you share some more information?

  • What version of Kibana are you on?
  • How large are your CSV and log files?
  • Do you see any relevant errors in the Kibana server logs (likely located at /var/log/kibana)?
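In case it is useful, a quick sketch of how to check the last point (assuming a package install of Kibana on Ubuntu; paths can differ depending on how you installed):

# if Kibana writes log files under /var/log/kibana:
grep -i error /var/log/kibana/kibana.log

# if Kibana runs under systemd, the logs may be in the journal instead:
journalctl -u kibana.service --since "1 hour ago"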

Hi @nickpeihl
Please see my answers below.

  • What version of Kibana are you on? - 7.17.6
  • How large are your CSV and log files? - Log file is 4.76 MB and CSV file is 2.38 MB
  • Do you see any relevant errors in the Kibana server logs (likely located at /var/log/kibana)? - Okay, I will check this.
    Update: I checked the logs; no error stack trace was found.

Hi @cadrija. What version of Kibana are you running? I wonder if your issue is related to this bug?

Same version as Elasticsearch, i.e. 7.17.6.

I checked with a different CSV file; it seems that if the file has a timestamp that matches one of the listed time formats, it gets imported.
However, all of my log files throw the above-mentioned error, even though they have timestamps and their time format matches the listed options.

Hi @cadrija, welcome to the community!

Apologies that you are having issues. Can you provide a couple of lines of the raw CSV as text (not a screenshot)?

Raw lines exactly as they are in the CSV; you can anonymize anything you need to, we just want to try with your data.

I suspect there is still a format issue. You can change the format / data type under the advanced settings; give us a couple of lines and we may be able to help.
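If you want to dig in yourself in the meantime, here is a sketch of how to ask Elasticsearch directly why detection fails (assuming the _text_structure/find_structure API, which backs the upload UI in 7.x, and a default localhost:9200 setup); explain=true returns the detection reasoning, and lines_to_sample mirrors the 10-line override you tried:

# your-file.csv is a placeholder for your actual CSV or log file
curl -s -H "Content-Type: application/json" \
  -XPOST "http://localhost:9200/_text_structure/find_structure?lines_to_sample=10&explain=true&pretty" \
  -T your-file.csv

The explanation in the response should show which step of the structure detection gave up.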

Hello @stephenb, thank you!

Sample logs 1

2022-09-21T04:39:42,473 [main] ERROR: Test log: Inside LoanIQLoggingManager
2022-09-21T04:39:44,685 [main] INFO: Started loading datatables
2022-09-21T04:39:44,701 [main] INFO: Found resource DataTable.Deal.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.Deal.xml
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractsgchangetransaction.xml, value:{DataTable.AbstractSGChangeTransaction.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractSGChangeTransaction.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractschedule.xml, value:{DataTable.AbstractSchedule.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractSchedule.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstractscheduleitem.xml, value:{DataTable.AbstractScheduleItem.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractScheduleItem.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.abstracttemplatefield.xml, value:{DataTable.AbstractTemplateField.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AbstractTemplateField.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accrualcyclepayment.xml, value:{DataTable.AccrualCyclePayment.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualCyclePayment.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accruallineitem.xml, value:{DataTable.AccrualLineItem.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualLineItem.xml}
2022-09-21T04:39:44,732 [main] INFO: adding key:datatable.accrualmatchfundedcostoffundspayableaggregation.xml, value:{DataTable.AccrualMatchFundedCostOfFundsPayableAggregation.xml, jar:file:/C:/LoanIQ/Server/lib/liq_datatables-7.6.0.0.jar!/DataTable.AccrualMatchFundedCostOfFundsPayableAggregation.xml}

Sample logs 2

2022-09-01 09:08:05.369  INFO 119618 --- [           main] c.c.c.ConfigServicePropertySourceLocator : Fetching config from server at : http://localhost:8887
2022-09-01 09:08:05.966  INFO 119618 --- [           main] c.c.c.ConfigServicePropertySourceLocator : Located environment: name=userManagement-service, profiles=[development], label=null, version=null, state=null
2022-09-01 09:08:05.968  INFO 119618 --- [           main] b.c.PropertySourceBootstrapConfiguration : Located property source: [BootstrapPropertySource {name='bootstrapProperties-file:./userManagement-service.yaml'}, BootstrapPropertySource {name='bootstrapProperties-classpath:/config/userManagement-service.yaml'}]
2022-09-01 09:08:05.975  INFO 119618 --- [           main] c.u.microservice.LaunchUserApplication   : The following profiles are active: development
2022-09-01 09:08:08.025  INFO 119618 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data JPA repositories in DEFERRED mode.
2022-09-01 09:08:08.637  INFO 119618 --- [           main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 597ms. Found 34 JPA repository interfaces.
2022-09-01 09:08:08.818  WARN 119618 --- [           main] o.s.boot.actuate.endpoint.EndpointId     : Endpoint ID 'service-registry' contains invalid characters, please migrate to a valid format.
2022-09-01 09:08:08.842  WARN 119618 --- [           main] o.s.boot.actuate.endpoint.EndpointId     : Endpoint ID 'hystrix.stream' contains invalid characters, please migrate to a valid format.
2022-09-01 09:08:09.138  INFO 119618 --- [           main] o.s.cloud.context.scope.GenericScope     : BeanFactory id=267123e5-afdb-3749-aa42-6d221c5a80ab
2022-09-01 09:08:09.730  INFO 119618 --- [           main] trationDelegate$BeanPostProcessorChecker : Bean 'zuulConfiguration' of type [com.usermanagement.microservice.ZuulConfiguration$$EnhancerBySpringCGLIB$$82804667] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)

Sample CSV

carat cut color clarity depth table price x y z
0.23 Ideal E SI2 61.5 55 326 3.95 3.98 2.43
0.21 Premium E SI1 59.8 61 326 3.89 3.84 2.31
0.23 Good E VS1 56.9 65 327 4.05 4.07 2.31
0.29 Premium I VS2 62.4 58 334 4.2 4.23 2.63
0.31 Good J SI2 63.3 58 335 4.34 4.35 2.75
0.24 Very Good J VVS2 62.8 57 336 3.94 3.96 2.48
0.24 Very Good I VVS1 62.3 57 336 3.95 3.98 2.47
0.26 Very Good H SI1 61.9 55 337 4.07 4.11 2.53
0.22 Fair E VS2 65.1 61 337 3.87 3.78 2.49
0.23 Very Good H VS1 59.4 61 338 4 4.05 2.39

Please let me know if this helps. I would be very grateful if any of the log files could be imported.
Thanks in advance!

Hi @cadrija

I am using 7.17.6, Basic License

In short, the samples you provided just loaded up for me, no extra work etc. I simply used all the defaults and clicked Import.

I wonder if you have some corrupt characters, or something else in the files... these samples loaded fine for me.

Here is a little of what I noticed.

First, your Sample 1 and Sample 2 only partially meet the criteria:

  • Delimited text files, such as CSV and TSV
  • Newline-delimited JSON
  • Log files with a common format for the timestamp

Sample 1 and Sample 2 have a common time format, but they are not CSV/TSV delimited or NDJSON, so the automatic parsing will be minimal; in other words, it will not parse the "message" portion.

Both of these imported the first time for me...

I just clicked through, and the log was parsed and loaded.

BUT it is not going to parse the whole log message: because it is not CSV, TSV, or NDJSON, it is unstructured text.

You will need to build your own custom parsing if you want these logs parsed further.
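As a rough sketch of what that custom parsing could look like (assuming you go the ingest pipeline route; the grok pattern and the field names thread and log_message are just illustrative, not something the importer generates), you can test a pattern against one of your Sample 1 lines in Dev Tools:

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        // pull the timestamp, thread name, level, and the remainder out of the raw line
        "grok": {
          "field": "message",
          "patterns": [
            "%{TIMESTAMP_ISO8601:timestamp} \\[%{DATA:thread}\\] %{LOGLEVEL:loglevel}: %{GREEDYDATA:log_message}"
          ]
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2022-09-21T04:39:42,473 [main] ERROR: Test log: Inside LoanIQLoggingManager"
      }
    }
  ]
}

If the simulated doc comes back with the fields you expect, the same grok pattern should be usable in the ingest pipeline that the importer lets you edit under the advanced settings.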

The results look like this:

GET discuss-sample-1/_search
...
     {
        "_index" : "discuss-sample-1",
        "_type" : "_doc",
        "_id" : "xQ6JhIMB2qNNMYeXRN1I",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2022-09-21T04:39:42.473-07:00",
          "loglevel" : "ERROR",
          "message" : "2022-09-21T04:39:42,473 [main] ERROR: Test log: Inside LoanIQLoggingManager"
        }
      },
      {
        "_index" : "discuss-sample-1",
        "_type" : "_doc",
        "_id" : "xg6JhIMB2qNNMYeXRN1I",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2022-09-21T04:39:44.685-07:00",
          "loglevel" : "INFO",
          "message" : "2022-09-21T04:39:44,685 [main] INFO: Started loading datatables"
        }
      },

In Discover

This is Sample 2 in Discover:

For Sample 3 you did not provide the actual CSV text ... I would expect that to load as well.

@stephenb Thanks a lot!
I tried with a partial file; it imports the first 1000 lines only. Let me analyze what is wrong with the whole file.
Could you kindly help me with another query?
What do I need to do to pull these logs live from another machine?
My ELK is on the x.x.x.x host and the application log file is on the y.y.y.y host.
I am learning about the integrations, but I am confused about which one to use in this case.

Please open a new thread for the new question, with all the details.

You'll need to use Filebeat or the Elastic Agent. I would suggest that you read about those; that's a pretty common way to ship logs.

If you're very new to all these concepts, I might just start with Filebeat.
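For orientation, a minimal sketch of what filebeat.yml on the y.y.y.y host could look like (assuming Filebeat 7.x with the log input; the path is a placeholder and x.x.x.x is your ELK host from above):

filebeat.inputs:
  - type: log
    paths:
      # placeholder path; point this at your application's log files
      - /path/to/application/logs/*.log

output.elasticsearch:
  # Elasticsearch on your ELK host must be reachable from y.y.y.y
  hosts: ["http://x.x.x.x:9200"]

The idea is that Filebeat runs on the host where the logs live and ships them over the network to Elasticsearch.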

Thank you so much @stephenb

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.