ELK Logstash API

Hi,
I want to display my .log file in a Kibana dashboard through the Logstash API (Java). I am able to achieve this through a Logstash config file, but I am not able to achieve the same through the Logstash API. I need a JSON file to push my log file (for example D:/Test.log) to the Kibana dashboard.

Thanks,
Rajapandian.B

Which API are you specifically referring to?

I am using the Elasticsearch Java REST Client API. I referred to the link below.

Through this code I am able to create an index, but I am not able to push my external log file into the dashboard.

Thanks,
Rajapandian.B

I don't know the details of this client, but if you can provide more information on what isn't working and what you are trying, someone should be able to help.

My Logstash config file:

input {
  file {
    type => "java"
    path => "D:/elk-example-spring-boot/elk-example.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If a log line contains a tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }
  grok {
    match => [ "message",
      "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
      "message",
      "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
    ]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

output {
  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Through this config file I am able to display my logs in the Kibana dashboard. But my requirement is: instead of a Logstash conf, I have to use the Logstash API to display the logs. Is it possible to configure a JSON input file for the same?

Just to be clear, you want to use the Elasticsearch Java client to send logs to Elasticsearch?

Yes, Mark. I am trying something with the input below, but it didn't work.

{
  "description": "Pipeline to parse Apache logs",
  "message": {
    "type1": {
      "_source": {
        "path": "C:/Users/Test/Desktop/apache_logs.log"
      },
      "processors": [{
        "grok": {
          "field": "message",
          "pattern": "%{COMBINEDAPACHELOG}"
        },
        "date": {
          "match_field": "timestamp",
          "target_field": "timestamp",
          "match_formats": ["dd/MMM/YYYY:HH:mm:ss Z"]
        },
        "convert": {
          "field": "response",
          "type": "integer"
        }
      }]
    }
  }
}

That is not the Elasticsearch Java client, that is Logstash.

I'm sorry, it's just not clear what you are asking.

How do I send my logs into Elasticsearch through the Logstash REST API?

Logstash does not have a REST API for ingestion.

OK, thanks Mark. Then we need to run the Logstash command to send logs to Elasticsearch: logstash -f myconfigfile

Our developers have a JSON log shipper configured in their Java applications that sends JSON to Logstash.
The Logstash input is configured like:

input {
  udp {
    port => 12346
    queue_size => 10000
    codec => "json"
  }
}
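For reference, the shipper side of this setup can be sketched in plain Java with a DatagramSocket. This is a minimal sketch, not the actual shipper from the thread: the class name UdpJsonShipper, the buildEvent helper, and the localhost target are illustrative assumptions, and a real application would use a JSON library and a logging framework rather than string concatenation.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.time.Instant;

// Minimal sketch of a UDP JSON log shipper matching the udp/json input above.
public class UdpJsonShipper {

    // Build a small JSON event; hypothetical helper, a real app would use
    // a JSON library to handle escaping properly.
    static String buildEvent(String message) {
        return "{\"log\":\"" + message.replace("\"", "\\\"")
                + "\",\"@timestamp\":\"" + Instant.now() + "\"}";
    }

    // Fire-and-forget UDP send, as in the bash test script further down.
    static void send(String host, int port, String json) throws Exception {
        byte[] bytes = json.getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(bytes, bytes.length,
                    InetAddress.getByName(host), port));
        }
    }

    public static void main(String[] args) throws Exception {
        // localhost stands in for the logs.example.com host used below.
        send("localhost", 12346, buildEvent("Testing JSON logs"));
    }
}
```

Note that UDP is fire-and-forget: the sender gets no confirmation that Logstash received the event.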

Test bash script

#!/bin/bash

BLAH=$1
DATE=$(date)

echo "{ \"log\": \"Testing JSON logs $BLAH - $DATE\"}" | nc -u -w2 logs.example.com 12346
exit

In case that helps 🙂

-AB

Thanks. Is it possible to define my external file in the JSON file? Or do I need to put my log file in: echo "{ "log": "Testing JSON logs $BLAH - $DATE"}" | nc -u -w2 D:/Test.log 12346

Is this correct?

In my example, your Java application would send JSON data to logs.example.com on UDP port 12346 directly.

If your application writes to a log file on disk and that should be shipped to Logstash, then I would recommend using Filebeat for that.

Filebeat can deal with multiline stacktraces etc.
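A minimal filebeat.yml along these lines might look as follows. This is a sketch, not a tested config: the path, the multiline pattern (intended to match the timestamped log format from the Logstash config earlier in the thread), and the Logstash port 5044 are illustrative assumptions, and the exact section names vary between Filebeat versions.

```yaml
# Sketch of a filebeat.yml shipping a log file to Logstash.
filebeat.inputs:
  - type: log
    paths:
      - D:/elk-example-spring-boot/elk-example.log
    # Lines NOT starting with a "YYYY-MM-DD " timestamp (e.g. stacktrace
    # lines) are appended to the preceding event, mirroring the multiline
    # codec in the Logstash config above.
    multiline:
      pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2} '
      negate: true
      match: after

output.logstash:
  hosts: ["localhost:5044"]
```

On the Logstash side this would pair with a beats input listening on port 5044 instead of the file input shown earlier.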

-AB

Thanks. I am new to this. Does Filebeat also have APIs?

I don't think Filebeat has an API...

You specify a file to read similarly to a Logstash file input.

OK, thanks A_B.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.