ramesh87
(RAMESH)
August 31, 2018, 2:23pm
1
Hi,
I am using Logstash version 6.3.
I am trying to parse my log files using a combination of the json and kv filters in my Logstash configuration.
Logstash Configuration:
input {
beats {
port => 5990
}
}
filter {
if [fields][log_type] == "te-test" {
grok {
match => { message => "%{TIMESTAMP_ISO8601:timestamp} [%{DATA:thread}] %{LOGLEVEL:log-level} %{DATA:class} %{NUMBER:RunId} %{GREEDYDATA:kv}"}
}
json {
source => "message"
target => "document"
add_tag => [ "kv_parsed" ]
}
kv {
source => "document"
remove_field => "document"
field_split => " ,"
value_split => "=:"
include_brackets => "false"
remove_char_key => "{]"
recursive => "true"
}
}
}
output {
if [fields][log_type] == "te-test" {
elasticsearch {
hosts => ["192.168.0.159:9200"]
manage_template => false
index => "%{[fields][log_type]}"
}
}
}
When I try to run the above configuration, the json filter does not work and the logs are not parsed.
Can you suggest whether there are any configuration errors in the above?
BR,
Ram
How could we possibly help out without knowing what the input logs look like? What does an event produced by Logstash look like?
ramesh87
(RAMESH)
August 31, 2018, 8:35pm
3
Hi ,
I am very new to ELK. Sorry, I missed my input logs. Here you can find the input log that needs to be parsed by the configuration posted above.
Input Log:
2018-08-22 08:39:54,173 [aaa-4-bbbb-6] INFO com.test.xxxx.configuration.sessioninformation- RamId = 4127185 Desiredinformation = Capabilities {RANKID: 4127185, accepCerts: true, browserName: chrome, commandTimeout: 300, custom-data: {TestEntityName: Commands Testing}, goog:chromeOptions: {args: [--lang=en], extensions: []}, Timeout: 960, maxTime: 10800, name: testbox-97TJ63A-4127185, platform: WIN10, recordMp3: true, screen-resolution: 1024x768, yyyProviderId: 2, yyyVersion: 3.10.0, tags: [testName: testbox, projectName: test project, testType: TestDefinition, TestEnvironment: DEV], unexpectedBehaviour: ignore, unhandledBehavior: ignore, version: 59.0}
The json filter does not run after grok completes. My goal is to parse each field in the above log message.
If you require any more information, please let me know.
BR,
Ram.
Hello,
Instead of the pattern [%{DATA:thread}], use the one below:
\[%{DATA:thread}\]
To my knowledge, the json filter only works on logs that are in JSON format. It will not work on the log format you shared.
ramesh87:
2018-08-22 08:39:54,173 [aaa-4-bbbb-6] INFO com.test.xxxx.configuration.sessioninformation- RamId = 4127185 Desiredinformation = Capabilities {RANKID: 4127185, accepCerts: true, browserName: chrome, commandTimeout: 300, custom-data: {TestEntityName: Commands Testing}, goog:chromeOptions: {args: [--lang=en], extensions: }, Timeout: 960, maxTime: 10800, name: testbox-97TJ63A-4127185, platform: WIN10, recordMp3: true, screen-resolution: 1024x768, yyyProviderId: 2, yyyVersion: 3.10.0, tags: [testName: testbox, projectName: test project, testType: TestDefinition, TestEnvironment: DEV], unexpectedBehaviour: ignore, unhandledBehavior: ignore, version: 59.0}
[2018-09-01T02:46:52,795][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"2018-08-22 08:39:54,173 [aaa-4-bbbb-6] INFO com.test.xxxx.configuration.sessioninformation- RamId = 4127185 Desiredinformation = Capabilities {RANKID: 4127185, accepCerts: true, browserName: chrome, commandTimeout: 300, custom-data: {TestEntityName: Commands Testing}, goog:chromeOptions: {args: [--lang=en], extensions: }, Timeout: 960, maxTime: 10800, name: testbox-97TJ63A-4127185, platform: WIN10, recordMp3: true, screen-resolution: 1024x768, yyyProviderId: 2, yyyVersion: 3.10.0, tags: [testName: testbox, projectName: test project, testType: TestDefinition, TestEnvironment: DEV], unexpectedBehaviour: ignore, unhandledBehavior: ignore, version: 58.0}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('-' (code 45)): Expected space separating root-level values
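Putting both points together, a minimal sketch of the filter section without the json filter (field names taken from the original grok pattern; the kv split characters are an assumption and will likely need tuning for this bracketed log format) might look like:

```
filter {
  grok {
    # Escape the literal square brackets around the thread name
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:log-level} %{DATA:class} %{GREEDYDATA:kv}" }
  }
  kv {
    # Parse only the captured key/value tail, not the whole message
    source => "kv"
    remove_field => [ "kv" ]
    field_split => ","
    value_split => ":"
    trim_key => " "
    trim_value => " "
  }
}
```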
ramesh87
(RAMESH)
September 1, 2018, 3:14am
5
Hi Harshad/Magnusbaeck,
Thanks for your response. But if I use only the grok and kv filter combination, my input log is not fully parsed.
Without adding the json filter, how could I parse my input log using the combination of the grok and kv filters with the configuration below? Please take a look at the configuration and the input log.
Input Log:
2018-08-22 08:39:54,173 [aaa-4-bbbb-6] INFO com.test.xxxx.configuration.sessioninformation- RamId = 4127185 Desiredinformation = Capabilities {RANKID: 4127185, accepCerts: true, browserName: chrome, commandTimeout: 300, custom-data: {TestEntityName: Commands Testing}, goog:chromeOptions: {args: [--lang=en], extensions: []}, Timeout: 960, maxTime: 10800, name: testbox-97TJ63A-4127185, platform: WIN10, recordMp3: true, screen-resolution: 1024x768, yyyProviderId: 2, yyyVersion: 3.10.0, tags: [testName: testbox, projectName: test project, testType: TestDefinition, TestEnvironment: DEV], unexpectedBehaviour: ignore, unhandledBehavior: ignore, version: 59.0}
Logstash Configuration:
input {
beats {
port => 5990
}
}
filter {
if [fields][log_type] == "te-test" {
grok {
match =>
{ message => "%{TIMESTAMP_ISO8601:timestamp} [%{DATA:thread}] %{LOGLEVEL:log-level} %{DATA:class} %{NUMBER:RunId} %{GREEDYDATA:kv}"}
}
kv {
source => "kv"
remove_field => "kv"
field_split => " ,"
value_split => "=:"
include_brackets => "false"
remove_char_key => "{]"
recursive => "true"
}
}
}
output {
if [fields][log_type] == "te-test" {
elasticsearch {
hosts => ["192.168.0.159:9200"]
manage_template => false
index => "%{[fields][log_type]}"
}
}
}
So what do your current filters produce, i.e. in what way are your current filters not working? Copy/paste from Kibana's JSON tab or use a stdout { codec => rubydebug }
output.
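For reference, a debugging output section of that shape (used alongside or in place of the elasticsearch output while testing) would look like:

```
output {
  # Print each event to the console in a readable Ruby-hash form,
  # showing exactly which fields grok and kv produced
  stdout { codec => rubydebug }
}
```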
system
(system)
Closed
October 1, 2018, 6:06am
7
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.