ELK Logstash API

(Raja Pandian) #1

I want to display my .log file in the Kibana dashboard through the Logstash API (Java). I am able to achieve this through a Logstash config file, but I am not able to achieve the same through the Logstash API. I need a JSON file to push my log file (for example, D:/Test.log) to the Kibana dashboard.


(Mark Walkom) #2

Which API are you specifically referring to?

(Raja Pandian) #3

I am using the Elasticsearch Java REST Client API. I referred to the link below.

Through this code I am able to create an index, but I am not able to push my external log file into the dashboard.
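For reference, indexing a single log line through Elasticsearch's plain HTTP document API can be sketched with only the JDK's built-in HttpClient, independent of any particular client library; the index name `app-logs` and the one-field document shape are illustrative assumptions, not something confirmed in this thread:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LogIndexer {
    // Wrap one raw log line in a minimal JSON document body
    static String toJsonDoc(String line) {
        String escaped = line.replace("\\", "\\\\").replace("\"", "\\\"");
        return "{\"message\": \"" + escaped + "\"}";
    }

    // POST the document to the (hypothetical) app-logs index on a local node
    static void indexLine(HttpClient client, String line) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/app-logs/_doc"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(toJsonDoc(line)))
                .build();
        // Elasticsearch answers with the created document's metadata as JSON
        client.send(request, HttpResponse.BodyHandlers.ofString());
    }
}
```

Each line of a file such as D:/Test.log could then be read and passed to indexLine; the Java REST client libraries ultimately issue the same kind of HTTP request.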


(Mark Walkom) #4

I don't know the details of this client, but if you can provide more information on what isn't working and what you are trying, someone should be able to help.

(Raja Pandian) #5

My Logstash config file:

input {
  file {
    type => "java"
    path => "D:/elk-example-spring-boot/elk-example.log"
    codec => multiline {
      pattern => "^%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}.*"
      negate => "true"
      what => "previous"
    }
  }
}

filter {
  # If log line contains tab character followed by 'at' then we will tag that entry as stacktrace
  if [message] =~ "\tat" {
    grok {
      match => ["message", "^(\tat)"]
      add_tag => ["stacktrace"]
    }
  }

  grok {
    match => [ "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME}) %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)" ]
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

output {
  stdout {
    codec => rubydebug
  }

  # Sending properly parsed log events to Elasticsearch
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Through this config file I am able to display my logs in the Kibana dashboard. But my requirement is: instead of a Logstash config file, I have to use a Logstash API to display the logs. Is it possible to configure a JSON input file for the same?
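The multiline codec in the config above joins every line that does not start with a timestamp onto the previous event. As a rough illustration of that grouping logic in plain Java (the timestamp regex is an assumption matching the yyyy-MM-dd HH:mm:ss format used by the date filter):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Pattern;

public class MultilineGrouper {
    // Lines starting with "yyyy-MM-dd HH:mm:ss" begin a new event; all others
    // continue the previous one (negate => true, what => previous)
    private static final Pattern EVENT_START =
            Pattern.compile("^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}.*");

    static List<String> group(List<String> lines) {
        List<String> events = new ArrayList<>();
        StringBuilder current = null;
        for (String line : lines) {
            if (EVENT_START.matcher(line).matches() || current == null) {
                // Start of a new event: flush the previous one
                if (current != null) events.add(current.toString());
                current = new StringBuilder(line);
            } else {
                // Continuation line (e.g. a stacktrace frame): append to previous event
                current.append("\n").append(line);
            }
        }
        if (current != null) events.add(current.toString());
        return events;
    }
}
```

A stacktrace frame such as "\tat com.example.Foo.bar(...)" therefore stays attached to the log event that preceded it, which is what the multiline codec achieves before the grok filter runs.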

(Mark Walkom) #6

Just to be clear, you want to use the Elasticsearch Java client to send logs to Elasticsearch?

(Raja Pandian) #7

Yes, Mark. I am trying something like the input below, but it didn't work.

"description": "Pipeline to parse Apache logs",
"message": {
"type1": {
"_source": {
"path": "C:/Users/Test/Desktop/apache_logs.log"
"processors": [{
"grok": {
"field": "message",
"date": {
"match_field": "timestamp",
"target_field": "timestamp",
"match_formats": ["dd/MMM/YYYY:HH:mm:ss Z"]
"convert": {
"field": "response",
"type": "integer"

(Mark Walkom) #8

That is not the Elasticsearch java client, that is Logstash.

I'm sorry, it's just not clear what you are asking.

(Raja Pandian) #9

How do I send my logs into Elasticsearch through a Logstash REST API?

(Mark Walkom) #10

Logstash does not have a REST API for ingestion.

(Raja Pandian) #11

OK, thanks Mark. Then we need to run the Logstash command to send logs to Elasticsearch:

logstash -f myconfigfile


(A_B) #12

Our developers have a JSON log shipper configured in their Java applications that sends JSON to Logstash.
The Logstash input is configured like this:

input {
  udp {
    port => 12346
    queue_size => 10000
    codec => "json"
  }
}

Test bash script:

   echo "{ \"log\": \"Testing JSON logs $BLAH - $DATE\"}" | nc -u -w2 logs.example.com 12346

In case that helps 🙂
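For the Java side, the same kind of payload can be shipped with a plain DatagramSocket; the host and port here follow the udp input above, and the message text is just an example:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class UdpJsonShipper {
    // Build the same one-field JSON payload the bash test script sends
    static String payload(String text) {
        return "{ \"log\": \"" + text + "\" }";
    }

    // Fire-and-forget: UDP gives no delivery guarantee, matching the script's nc -u
    static void send(String host, int port, String text) throws Exception {
        byte[] data = payload(text).getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName(host), port));
        }
    }

    public static void main(String[] args) throws Exception {
        send("localhost", 12346, "Testing JSON logs from Java");
    }
}
```

In a real application a JSON library would build the payload rather than string concatenation, but the wire format Logstash's json codec sees is the same.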


(Raja Pandian) #13

Thanks. Is it possible to define my external file in the JSON file? Or do I need to put my log file in:

echo "{ \"log\": \"Testing JSON logs $BLAH - $DATE\"}" | nc -u -w2 D:/Test.log 12346

Is this correct?


(A_B) #14

In my example your Java application would send JSON data to logs.example.com on UDP port 12346 directly.

If your application writes to a log file on disk and that file should be shipped to Logstash, then I would recommend using Filebeat for that.

Filebeat can deal with multiline stacktraces etc.
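A minimal Filebeat configuration for shipping such a file might look like the sketch below; the path and multiline pattern are assumptions taken from the Logstash config earlier in the thread, the Logstash port 5044 is the conventional beats default, and option names follow recent Filebeat versions:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - D:/elk-example-spring-boot/elk-example.log
    # Treat lines that do not start with a timestamp as part of the previous event
    multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]
```

On the Logstash side this pairs with a beats input listening on that port instead of the file input shown earlier.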


(Raja Pandian) #15

Thanks. I am new to this. Does Filebeat also have APIs?


(A_B) #16

I don't think Filebeat has an API...

You specify a file to read, similarly to a Logstash file input.

(Raja Pandian) #17

OK, thanks A_B.

(system) #18

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.