Having a hard time parsing artifactory logs using 7.3.0

Having a hard time parsing Artifactory logs -- I've used the Grok Debugger in Kibana and referenced JFrog's page on parsing logs, but the fields still aren't being extracted. I've put GREEDYDATA in a lot of the fields because that was the only way to avoid errors. Any idea what I am doing wrong?

04-jfrog-artifactory-input.conf:

input {
  file {
    tags => "artifactorylog"
    path => ["/var/opt/jfrog/artifactory/logs/artifactory.log"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
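(With start_position => "beginning" and sincedb_path => "/dev/null", the file input doesn't persist its read position, so the whole file is re-read on every restart, which is convenient while iterating on a filter.)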

12-logstash-jfrog-artifactory.conf:

filter {
  if "artifactorylog" in [tags] {
    grok {
      match => { "message" => "^%{TIMESTAMP_ISO8601:artifactory_timestamp} %{DATA:artifactory_thread_name} %{GREEDYDATA:loglevel} (%{GREEDYDATA:artifactory_event_type}) - %{GREEDYDATA:artifactory_message}$" }
    }
  }
  date {
    match => [ "artifactory_timestamp", "YYYY-MM-dd HH:mm:ss,SSS" ]
  }
}
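(Note that the date filter sits outside the if block, so it runs against every event in the pipeline, not just the tagged Artifactory ones.)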

Help?

I'm also hoping to solve a very similar issue. If I figure it out before you get a response, I'll post what worked for me.

What do the logs look like?

I'm not seeing anything remarkable from the last time I restarted it:

[2019-08-13T16:48:04,804][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.3.0"}
[2019-08-13T16:48:08,786][INFO ][org.reflections.Reflections] Reflections took 63 ms to scan 1 urls, producing 19 keys and 39 values
[2019-08-13T16:48:10,692][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://logstash_internal:xxxxxx@hostname.domain.com:9200/]}}
[2019-08-13T16:48:11,035][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@hostname.domain.com:9200/"}
[2019-08-13T16:48:11,108][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-08-13T16:48:11,113][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2019-08-13T16:48:11,157][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//hostname.domain.com:9200"]}
[2019-08-13T16:48:11,484][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.1-java/vendor/GeoLite2-City.mmdb"}
[2019-08-13T16:48:11,697][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-08-13T16:48:11,702][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>750, :thread=>"#<Thread:0x2ebccf43 run>"}
[2019-08-13T16:48:12,239][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_036fb49111d9a1546bb11bef32468499", :path=>["/var/opt/jfrog/artifactory/logs/request.log"]}
[2019-08-13T16:48:12,646][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-08-13T16:48:12,676][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_1420aae557ebe80d785cfe7e5d478cb8", :path=>["/var/opt/jfrog/artifactory/logs/access.log"]}
[2019-08-13T16:48:12,707][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-08-13T16:48:12,755][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-13T16:48:12,767][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-13T16:48:12,770][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-13T16:48:12,784][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-13T16:48:12,826][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-08-13T16:48:12,976][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-08-13T16:48:13,112][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-08-13T16:48:13,508][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-08-13T16:48:16,201][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[http://logstash_internal:xxxxxx@hostname.domain.com:9200/], :added=>[http://logstash_internal:xxxxxx@xx.xx.xx.xxx:9200/]}}
[2019-08-13T16:48:16,216][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://logstash_internal:xxxxxx@xx.xx.xx.xxx:9200/"}

Sorry, I meant what do the artifactory logs you are trying to parse look like?

Here's a redacted example:

2019-08-14 14:59:43,446 [hel-loo-6019--blah-123] [WARN ] (o.a.r.ArtifactoryResponseBase:107) - Sending HTTP error code 403: Download request for blah:blah 'stuff-stuff:lo/surf/unt/morestuff/morestuff-blah-blah/2.5.7/morestuff-blah-blah-2.5.7/stuff.exe' is forbidden for user 'blah_blah'.
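Since grok patterns are regular expressions, the literal brackets and parentheses in that line have to be escaped, which the original pattern didn't do. A minimal sketch against the sample above, using the stock TIMESTAMP_ISO8601, LOGLEVEL, and SPACE patterns (field names here are illustrative):

    grok {
        # \[ \] and \( \) match the literal delimiters around the thread name and event type
        match => { "message" => "^%{TIMESTAMP_ISO8601:artifactory_timestamp} \[%{DATA:thread_name}\] \[%{LOGLEVEL:loglevel}%{SPACE}\] \(%{DATA:event_type}\) - %{GREEDYDATA:artifactory_message}$" }
    }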

I would use dissect in preference to grok for that.

    dissect { mapping => { "message" => "%{[@metadata][timestamp]} %{+[@metadata][timestamp]} [%{thread_name}] [%{loglevel}] (%{event_type}) - %{restOfLine}" } }
    date { match => [ "[@metadata][timestamp]", "YYYY-MM-dd HH:mm:ss,SSS" ] }
    mutate { gsub => [ "loglevel", " ", "" ] }
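(The two %{[@metadata][timestamp]} pieces reassemble the date and time into a single field for the date filter to parse, and the gsub strips the padding space Artifactory leaves inside [WARN ].)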

Sadly, no joy from that :confused:

I don't think the conditional is matching correctly (which doesn't make sense to me), because the events seem to be hitting my 13- filter instead: they come out with a geo failure tag, and that's the only one of my filters that adds one.

The input config is unchanged from above, but I changed the filter conf:

12-logstash-jfrog-artifactory.conf:

filter {
  if "artifactorylog" in [tags] {
    dissect { mapping => { "message" => "%{[@metadata][timestamp]} %{+[@metadata][timestamp]} [%{thread_name}] [%{loglevel}] (%{event_type}) - %{restOfLine}" } }
    date { match => [ "[@metadata][timestamp]", "YYYY-MM-dd HH:mm:ss,SSS" ] }
    mutate { gsub => [ "loglevel", " ", "" ] }
  }
}
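If the conditional still seems to misfire, one way to see exactly which tags an event actually carries is a temporary stdout output with the rubydebug codec, e.g.:

    output { stdout { codec => rubydebug } }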

Ah, I've got it working! Is there an easy way to parse the user out of that? (If I were grokking, I'd probably use something like %{GREEDYDATA}%{USER:username}.)
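A minimal sketch of one way to do that, assuming the message always ends with "for user '<name>'" as in the sample above: run a second grok against the dissected restOfLine field (grok matches are unanchored, so no GREEDYDATA prefix is needed):

    grok {
        # pull the quoted account name out of the already-dissected remainder
        match => { "restOfLine" => "for user '%{USERNAME:username}'" }
    }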

