Some Artifactory logs parsed by Logstash grok, others indexed but fields not visible in Kibana

Hello Team,

I am ingesting multiple Artifactory log types through Filebeat → Logstash → Elasticsearch, but I am facing an issue where only some log lines are parsed correctly, while others are indexed without parsed fields showing in Kibana.

artifactory-conan-v2-migration.log


2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [ccc90c936ec12132278c73df89ca5e64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1417] - [ACCEPTED GET] /provider0c4a78ba7137a4fb8cf2c9beb99db1c9c01c5f45 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:55.482Z [jfrt ] [INFO ] [d836cef1bb8567afe96bac99b10a20a5] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1432] - [ACCEPTED GET] /providerf293c7995c949374e8876dcf41e5b16c75ade3b0 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:55.514Z [jfrt ] [INFO ] [89a234d14a910923e5943dc9c3a4e9e2] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1392] - [ACCEPTED GET] /provider0c4a78ba7137a4fb8cf2c9beb99db1c9c01c5f45 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:55.818Z [jfrt ] [INFO ] [b594156afa61d77d1a9d4b02f1e4025a] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1430] - [ACCEPTED GET] /provider4b80fc6456965df75762afa0ffae77fbefec60e4 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:56.017Z [jfrt ] [INFO ] [c40b21e90e7ee4f2ce917b7749ee7f64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1434] - [ACCEPTED GET] /provider4b80fc6456965df75762afa0ffae77fbefec60e4 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:56.144Z [jfrt ] [INFO ] [c40b21e90e7ee4f2ce917b7749ee7f64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1433] - [ACCEPTED GET] /providerb96c4ec635a9161e73157d799fb32718e42cc292 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:56.276Z [jfrt ] [INFO ] [ea7c346d3ee7e1260a748f75edb1634c] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1428] - [ACCEPTED GET] /providerb96c4ec635a9161e73157d799fb32718e42cc292 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:56.946Z [jfrt ] [INFO ] [d836cef1bb8567afe96bac99b10a20a5] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1320] - [ACCEPTED GET] /provider4b80fc6456965df75762afa0ffae77fbefec60e4 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:16:57.445Z [jfrt ] [INFO ] [c40b21e90e7ee4f2ce917b7749ee7f64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1421] - [ACCEPTED GET] /provider4b80fc6456965df75762afa0ffae77fbefec60e4 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:18:12.785Z [jfrt ] [INFO ] [b594156afa61d77d1a9d4b02f1e4025a] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1427] - [ACCEPTED GET] /provider4be9102a906eadb06c53816c485c15d975f3adb6 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:18:12.885Z [jfrt ] [INFO ] [ccc90c936ec12132278c73df89ca5e64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1426] - [ACCEPTED GET] /provider4be9102a906eadb06c53816c485c15d975f3adb6 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:18:12.895Z [jfrt ] [INFO ] [2b88b683c65b4b5f643ed7dc53569d6f] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1431] - [ACCEPTED GET] /provider4be9102a906eadb06c53816c485c15d975f3adb6 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:18:14.603Z [jfrt ] [INFO ] [48145b43c245223cd77a8eadfeb94952] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1417] - [ACCEPTED GET] /provider4be9102a906eadb06c53816c485c15d975f3adb6 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:19:11.303Z [jfrt ] [INFO ] [b594156afa61d77d1a9d4b02f1e4025a] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1379] - [ACCEPTED GET] /provideraff7dc652045b326170b778dd90f5530d613b617 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:19:11.321Z [jfrt ] [INFO ] [86de111b44414007ff612671e965a810] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1435] - [ACCEPTED GET] /provideraff7dc652045b326170b778dd90f5530d613b617 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:19:11.408Z [jfrt ] [INFO ] [332b2e073addf65ff2130da94d2cc9a7] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1414] - [ACCEPTED GET] /provideraff7dc652045b326170b778dd90f5530d613b617 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
2025-12-11T19:19:12.913Z [jfrt ] [INFO ] [332b2e073addf65ff2130da94d2cc9a7] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1428] - [ACCEPTED GET] /provideraff7dc652045b326170b778dd90f5530d613b617 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.
[root@ca1vmartifactory log]# cat artifactory-build-info-migration.log
[root@ca1vmartifactory log]# cat artifactory-conan-v2-migration.log
2024-05-18T04:39:53.802Z [jfrt ] [INFO ] [5508a90bf42e2230] [.a.c.m.ConanV2MigrationJob:203] [f42e2230|art-exec-18] - No Conan v1 path found in db - migration will not run, attempting finalization.
2024-05-18T04:39:53.814Z [jfrt ] [INFO ] [5508a90bf42e2230] [.a.c.m.ConanV2MigrationJob:410] [f42e2230|art-exec-18] - Successfully updated DB configs table for migration finalized.
2024-05-18T04:39:53.815Z [jfrt ] [INFO ] [5508a90bf42e2230] [.a.c.m.ConanV2MigrationJob:395] [f42e2230|art-exec-18] - artifactory.ConanV2MigrationJob#bf6720b7-19c8-46b0-9ffb-e3fc4b8df3d2: all nodes reached minimal version '6.9.0-m001', continuing execution


[root@ca1vmartifactory log]#

artifactory-consumption-usage.log	

jfrt_package_event_count{action_type="download",package_type="maven"} 328 1711963224554
jfrt_package_event_count{action_type="download",package_type="nuget"} 10 1711963224554
jfrt_package_event_count{action_type="download",package_type="npm"} 1122 1711963224554
jfrt_package_event_count{action_type="download",package_type="debian"} 317 1711963224554
jfrt_package_event_count{action_type="download",package_type="pypi"} 584 1711963224554
jfrt_package_event_count{action_type="download",package_type="docker"} 4 1711963224554
jfrt_package_event_count{action_type="download",package_type="generic"} 54 1711963224554
jfrt_package_event_count{action_type="download",package_type="alpine"} 1 1711963224554
jfrt_package_event_count{action_type="upload",package_type="docker"} 1 1711963224554
jfrt_package_event_count{action_type="redirect",package_type="pypi"} 4 1711963224554
jfrt_package_event_count{action_type="head",package_type="docker"} 4 1711963224554

jfrt_package_event_count{action_type="download",package_type="nuget"} 1 1711964124554
jfrt_package_event_count{action_type="download",package_type="generic"} 19 1711964124554
jfrt_package_event_count{action_type="upload",package_type="nuget"} 1 1711964124554

jfrt_package_event_count{action_type="download",package_type="nuget"} 13 1711965024555
jfrt_package_event_count{action_type="download",package_type="debian"} 13 1711965024555
jfrt_package_event_count{action_type="download",package_type="generic"} 40 1711965024555
jfrt_package_event_count{action_type="upload",package_type="nuget"} 4 1711965024555
jfrt_package_event_count{action_type="upload",package_type="docker"} 2 1711965024555
jfrt_package_event_count{action_type="head",package_type="maven"} 2059 1711965024555

jfrt_package_event_count{action_type="download",package_type="nuget"} 18 1711965924555
jfrt_package_event_count{action_type="download",package_type="generic"} 108 1711965924555
jfrt_package_event_count{action_type="upload",package_type="nuget"} 3 1711965924555
jfrt_package_event_count{action_type="upload",package_type="generic"} 1 1711965924555
jfrt_package_event_count{action_type="head",package_type="maven"} 1747 1711965924555

jfrt_package_event_count{action_type="download",package_type="nuget"} 34 1711966824555
jfrt_package_event_count{action_type="download",package_type="generic"} 5 1711966824555
jfrt_package_event_count{action_type="upload",package_type="nuget"} 3 1711966824555

jfrt_package_event_count{action_type="download",package_type="nuget"} 16 1711967724556
jfrt_package_event_count{action_type="download",package_type="generic"} 19 1711967724556
jfrt_package_event_count{action_type="upload",package_type="nuget"} 9 1711967724556
jfrt_package_event_count{action_type="head",package_type="generic"} 1 1711967724556

[root@ca1vmartifactory log]#

Hi @Pratiksha_Desai, Welcome to the community.

If you would like help, you need to post more than just the logs... there is not enough information here to help.

You will need to provide:

The versions of your components

The Filebeat and Logstash configuration files

Exact samples of the logs that are not being processed

The error logs for the logs that are not being processed

Also, please format your logs / code with 3 backticks ``` before and after the code / logs.

Curious why you are using Logstash?

Provide these things and perhaps someone can help.


The dissect plugin can be suitable for parsing these lines in artifactory-conan-v2-migration.log:

input {
  beats {
    port => 5044
    include_codec_tag => false
  }
}
filter {
  dissect {
    mapping => {
      "message" => "%{timestamp} [%{servicetype}] [%{level}] [%{traceid}] [%{class}] [%{thread}] - %{msg}"
    }
  }
  date {
    match => [ "timestamp", "ISO8601" ]
    target => "timestamp" # or use only @timestamp
  }
  mutate {
    strip => ["level", "servicetype"]
    remove_field => [ "event", "host", "log" ]
  }
}
output {
  stdout { codec => rubydebug }
}
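For quick local testing you can swap the beats input for stdin and paste a sample line into the console. A sketch (the file name conan.conf is just an example):

input {
  stdin { }
}

Then run bin/logstash -f conan.conf and paste one of the migration lines; the event printed by rubydebug should look like the output below.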

Output:

{
       "@version" => "1",
          "class" => ".a.c.m.ConanV2MigrationJob:395",
          "level" => "INFO",
     "@timestamp" => 2026-01-02T19:44:42.782Z,
    "servicetype" => "jfrt",
      "timestamp" => 2024-05-18T04:39:53.815Z,
        "traceid" => "5508a90bf42e2230",
            "msg" => "artifactory.ConanV2MigrationJob#bf6720b7-19c8-46b0-9ffb-e3fc4b8df3d2: all nodes reached minimal version '6.9.0-m001', continuing execution",
        "message" => "2024-05-18T04:39:53.815Z [jfrt ] [INFO ] [5508a90bf42e2230] [.a.c.m.ConanV2MigrationJob:395] [f42e2230|art-exec-18] - artifactory.ConanV2MigrationJob#bf6720b7-19c8-46b0-9ffb-e3fc4b8df3d2: all nodes reached minimal version '6.9.0-m001', continuing execution",
         "thread" => "f42e2230|art-exec-18"
}

It's possible to further split the thread and class fields. For artifactory-consumption-usage.log you can use the same approach with a modified mapping; a sketch of both is below.
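A sketch, assuming the metric lines always match the sample format (field names like metric_name and class_name are illustrative, not from the original config):

filter {
  # metrics lines have no timestamp/bracket structure, so guard with a conditional
  # to avoid a _dissectfailure tag on the other log types
  if [message] =~ /^jfrt_package_event_count/ {
    dissect {
      mapping => {
        "message" => '%{metric_name}{action_type="%{action_type}",package_type="%{package_type}"} %{event_count} %{epoch_ms}'
      }
    }
  } else {
    dissect {
      mapping => {
        "class"  => "%{class_name}:%{class_line}"      # e.g. .a.c.m.ConanV2MigrationJob:395
        "thread" => "%{thread_prefix}|%{thread_name}"  # e.g. f42e2230|art-exec-18 (migration-style threads)
      }
    }
  }
}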

Version of components

All components are running on Elastic Stack 8.15.5

Elasticsearch : 8.15.5
Logstash      : 8.15.5
Filebeat      : 8.15.5
Kibana        : 8.15.5


Filebeat configuration (filebeat.yml)

filebeat.inputs:
- type: log
  id: my-filestream-id
  enabled: true
  paths:
    - D:\ELK Soft\whatsapplog\twogrok*

output.logstash:
  hosts: ["localhost:5044"]

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~

setup.kibana:
  host: "localhost:5601"

Filebeat is reading logs from:

D:\ELK Soft\whatsapplog\twogrok*

and forwarding them to Logstash on port 5044.

Logstash configuration

input {
  beats {
    port => 5044
  }
}

filter {

  if [log][original] {
    mutate {
      copy => { "[log][original]" => "message" }
    }
  }

  grok {
    break_on_match => true
    tag_on_failure => ["_grok_failed"]

    match => {
      "message" => [

        # 1️⃣ Metrics logs
        "^jfrt_package_event_count\{action_type=\"%{WORD:action_type}\",package_type=\"%{WORD:package_type}\"\} %{NUMBER:event_count:int} %{NUMBER:epoch_ms:long}$",

        # 2️⃣ BinaryStoreAccessLogger logs
        "%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\s*\] \[%{LOGLEVEL:log_level}\s*\] \[%{DATA:request_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - \[%{WORD:status} %{WORD:http_method}\] %{DATA:artifact_path} for %{DATA:user}@%{DATA:repo}/%{IP:client_ip}\.?",

        # 3️⃣ Conan v2 migration logs
        "%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\] \[%{LOGLEVEL:log_level}\] \[%{DATA:migration_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - %{GREEDYDATA:migration_message}"
      ]
    }
  }

  date {
    match => ["epoch_ms", "UNIX_MS"]
    target => "@timestamp"
    tag_on_failure => []
  }

  date {
    match => ["log_time", "ISO8601"]
    target => "@timestamp"
    tag_on_failure => []
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    user => "elastic"
    password => "********"
    index => "twogork"
  }
}


Sample logs being processed

Metrics log

jfrt_package_event_count{action_type="download",package_type="maven"} 328 1711963224554

BinaryStoreAccessLogger log

2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [req123] [BinaryStoreAccessLogger:128] [http-nio-8081-exec-4] - [OK GET] libs-release-local/com/test/app.jar for admin@repo/10.62.49.84

Conan v2 migration log

2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [r0c4a78ba7137a4f] [ConanMigration:221] [migration-thread] - Migration completed successfully


Why Logstash is being used

Logstash is used because:

  • Multiple different log formats exist in the same file

  • Complex grok parsing is required

  • Timestamp normalization using the date filter

  • Future plans include enrichment and conditional parsing

Direct Filebeat → Elasticsearch is not sufficient for this use case.

Current issue

  • Logs are reaching Elasticsearch

  • Some logs are not parsed correctly

  • Fields are missing in Kibana for those events

Which logs? Grok is more suitable than dissect when fields are optional.

It's not clear which logs/fields are causing problems.

I have 3 different log types in the same file.
All logs are indexed, but only the metrics log shows parsed fields in Kibana.
The BinaryStoreAccessLogger and Conan migration logs are indexed but usually show only the message field (parsed fields missing).
So ingestion works, but grok parsing is inconsistent across log types.

For the log:
2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [ccc90c936ec12132278c73df89ca5e64] [s.s.BinaryStoreAccessLogger:27] [p-nio-8081-exec-1417] - [ACCEPTED GET] /provider0c4a78ba7137a4fb8cf2c9beb99db1c9c01c5f45 for jfrt@01dda1bjranfcp0c11a8ns1d12/10.62.49.84.

You should use grok pattern like this:
%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\s*\] \[%{LOGLEVEL:log_level}\s*\] \[%{DATA:request_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - \[%{WORD:status} %{WORD:http_method}\] %{DATA:artifact_path} for %{DATA:user}@%{DATA:repo}\/%{IP:client_ip}%{GREEDYDATA}

For the log:
2024-05-18T04:39:53.802Z [jfrt ] [INFO ] [5508a90bf42e2230] [.a.c.m.ConanV2MigrationJob:203] [f42e2230|art-exec-18] - No Conan v1 path found in db - migration will not run, attempting finalization.

You should use grok pattern like this:
%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\s*\] \[%{LOGLEVEL:log_level}\s*\] \[%{DATA:migration_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - %{GREEDYDATA:migration_message}

or in grok:

  grok {
    break_on_match => true
    tag_on_failure => ["_grok_failed"]

    match => {
      "message" => [

        # 1️⃣ Metrics logs
        "^jfrt_package_event_count\{action_type=\"%{WORD:action_type}\",package_type=\"%{WORD:package_type}\"\} %{NUMBER:event_count:int} %{NUMBER:epoch_ms:long}$",

        # 2️⃣ BinaryStoreAccessLogger logs
        "%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\s*\] \[%{LOGLEVEL:log_level}\s*\] \[%{DATA:request_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - \[%{WORD:status} %{WORD:http_method}\] %{DATA:artifact_path} for %{DATA:user}@%{DATA:repo}\/%{IP:client_ip}%{GREEDYDATA}",

        # 3️⃣ Conan v2 migration logs
        "%{TIMESTAMP_ISO8601:log_time} \[%{DATA:service}\s*\] \[%{LOGLEVEL:log_level}\s*\] \[%{DATA:migration_id}\] \[%{DATA:class}:%{NUMBER:line}\] \[%{DATA:thread}\] - %{GREEDYDATA:migration_message}"
      ]
    }
  }

I have tested, and these grok patterns are working. The differences from your patterns are small (for example the \s* that absorbs the trailing space inside the bracketed fields, and the tail after the client IP), but they were enough to break parsing.
Since you are using v8.15, the data view should be refreshed automatically.
You can add error handling in the output:

output {
  if ( ("_grokparsefailure" in [tags]) or ("_groktimeout" in [tags]) ) {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      user => "elastic"
      password => "********"
      index => "errors_%{+YYYY.MM}"
    }
  }
  else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      user => "elastic"
      password => "********"
      index => "twogork"
    }
  }
}
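You can then check which events carry the failure tag. A query sketch, assuming the tag name from your config (_grok_failed) and the default dynamic mapping, which adds a keyword subfield on tags:

GET twogork/_search
{
  "size": 5,
  "query": {
    "term": { "tags.keyword": "_grok_failed" }
  }
}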

Thanks @Rios

And BTW @Pratiksha_Desai

This could all be done with an ingest pipeline in Elasticsearch.

Filebeat -> Elasticsearch, with no need for Logstash if you do not otherwise need it (Logstash is perfectly fine / valid too).

Something like

PUT _ingest/pipeline/artifactory_pipeline
{
  "description": "Converted from Logstash filter to ingest pipeline",
  "processors": [
    {
      "set": {
        "if": "ctx.log != null && ctx.log.original != null",
        "field": "message",
        "value": "{{log.original}}"
      }
    },
    {
      "grok": {
        "field": "message",
        "patterns": [
          "^jfrt_package_event_count\\{action_type=\"%{WORD:action_type}\",package_type=\"%{WORD:package_type}\"\\} %{NUMBER:event_count:int} %{NUMBER:epoch_ms:long}$",
          "%{TIMESTAMP_ISO8601:log_time} \\[%{DATA:service}\\s*\\] \\[%{LOGLEVEL:log_level}\\s*\\] \\[%{DATA:request_id}\\] \\[%{DATA:class}:%{NUMBER:line}\\] \\[%{DATA:thread}\\] - \\[%{WORD:status} %{WORD:http_method}\\] %{DATA:artifact_path} for %{DATA:user}@%{DATA:repo}/%{IP:client_ip}%{GREEDYDATA}",
          "%{TIMESTAMP_ISO8601:log_time} \\[%{DATA:service}\\s*\\] \\[%{LOGLEVEL:log_level}\\s*\\] \\[%{DATA:migration_id}\\] \\[%{DATA:class}:%{NUMBER:line}\\] \\[%{DATA:thread}\\] - %{GREEDYDATA:migration_message}"
        ],
        "ignore_failure": true,
        "tag": "_grok_failed"
      }
    },
    {
      "date": {
        "if": "ctx.epoch_ms != null",
        "field": "epoch_ms",
        "target_field": "@timestamp",
        "formats": ["UNIX_MS"],
        "ignore_failure": true
      }
    },
    {
      "date": {
        "if": "ctx.log_time != null",
        "field": "log_time",
        "target_field": "@timestamp",
        "formats": ["ISO8601"],
        "ignore_failure": true
      }
    }
  ]
}

POST _ingest/pipeline/artifactory_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": """jfrt_package_event_count{action_type="download",package_type="maven"} 328 1711963224554
"""
      }
    },
        {
      "_source": {
        "message": """2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [req123] [BinaryStoreAccessLogger:128] [http-nio-8081-exec-4] - [OK GET] libs-release-local/com/test/app.jar for admin@repo/10.62.49.84
"""
      }
    },
        {
      "_source": {
        "message": """2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [r0c4a78ba7137a4f] [ConanMigration:221] [migration-thread] - Migration completed successfully
"""
      }
    }
  ]
}

# Results
{
  "docs": [
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "epoch_ms": 1711963224554,
          "@timestamp": "2024-04-01T09:20:24.554Z",
          "action_type": "download",
          "event_count": 328,
          "message": """jfrt_package_event_count{action_type="download",package_type="maven"} 328 1711963224554
""",
          "package_type": "maven"
        },
        "_ingest": {
          "timestamp": "2026-01-03T16:05:23.213780192Z"
        }
      }
    },
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "line": "128",
          "repo": "repo",
          "log_level": "INFO",
          "thread": "http-nio-8081-exec-4",
          "message": """2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [req123] [BinaryStoreAccessLogger:128] [http-nio-8081-exec-4] - [OK GET] libs-release-local/com/test/app.jar for admin@repo/10.62.49.84
""",
          "log_time": "2025-12-11T19:16:55.448Z",
          "http_method": "GET",
          "artifact_path": "libs-release-local/com/test/app.jar",
          "@timestamp": "2025-12-11T19:16:55.448Z",
          "service": "jfrt",
          "client_ip": "10.62.49.84",
          "request_id": "req123",
          "class": "BinaryStoreAccessLogger",
          "user": "admin",
          "status": "OK"
        },
        "_ingest": {
          "timestamp": "2026-01-03T16:05:23.213811879Z"
        }
      }
    },
    {
      "doc": {
        "_index": "_index",
        "_version": "-3",
        "_id": "_id",
        "_source": {
          "@timestamp": "2025-12-11T19:16:55.448Z",
          "migration_id": "r0c4a78ba7137a4f",
          "service": "jfrt",
          "line": "221",
          "log_level": "INFO",
          "migration_message": "Migration completed successfully",
          "thread": "migration-thread",
          "message": """2025-12-11T19:16:55.448Z [jfrt ] [INFO ] [r0c4a78ba7137a4f] [ConanMigration:221] [migration-thread] - Migration completed successfully
""",
          "class": "ConanMigration",
          "log_time": "2025-12-11T19:16:55.448Z"
        },
        "_ingest": {
          "timestamp": "2026-01-03T16:05:23.213816291Z"
        }
      }
    }
  ]
}
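If you go this route, point Filebeat straight at Elasticsearch and reference the pipeline. A minimal sketch, reusing the hosts and credentials from your configs (this replaces output.logstash, since Filebeat allows only one active output):

output.elasticsearch:
  hosts: ["http://localhost:9200"]
  username: "elastic"
  password: "********"
  pipeline: "artifactory_pipeline"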

Awesome job Stephen. :+1:

There are two solutions; it's up to Pratiksha to choose Logstash or Elasticsearch with an ingest pipeline.

If you use Logstash with tag_on_failure, change my if to:
if ( ("_grokparsefailure" in [tags]) or ("_groktimeout" in [tags]) or ("_grok_failed" in [tags]) ) {

Thanks, it works! @Rios, @stephenb
