Index not created correctly


(madalin) #1

Hi guys,

I'm new to ELK and I need your help.
I want to see my log4j2 logs in Kibana.
To do that, I created an index in ES, trying to map the fields from my log:
curl -XPUT localhost:9200/log_index -d '
{
  "settings" : {
    "index" : {
      "number_of_shards" : 5,
      "number_of_replicas" : 0
    }
  }
}
'

curl -XPUT localhost:9200/log_index/_mapping/profile -d '
{
  "profile" : {
    "properties" : {
      "level"    : {"type": "keyword", "store": true},
      "log_date" : {"type": "date",    "store": true},
      "thread"   : {"type": "keyword", "store": true},
      "class"    : {"type": "keyword", "store": true},
      "message"  : {"type": "keyword", "store": true}
    }
  }
}
'
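To confirm that the index and mapping were actually created, they can be read back with curl (a quick check, assuming Elasticsearch is reachable at localhost:9200):

```shell
# List all indices; log_index should appear in the output
curl 'localhost:9200/_cat/indices?v'

# Show the mapping that was applied to the profile type
curl 'localhost:9200/log_index/_mapping/profile?pretty'
```

If either call returns an error, the PUT requests above did not apply.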
For some reason I cannot see my app logs in Kibana, and my thought was that the problem comes from the index.
Please help.


(Mujtaba Hussain) #2

What happens when you try and ingest the logs? What does the ES log file say?


(poojagupta) #3

Hi,

Which plugin are you using to fetch the logs that should be input to Elasticsearch?
Is there an index in Elasticsearch?
Check that the index was pushed into Elasticsearch via the curl API:
curl http://<es-host>:9200/_cat/indices?v
If the index was successfully created in Elasticsearch but cannot be fetched in Kibana, then there is some issue in Kibana. If the index was not created in ES, then there is a communication issue between Elasticsearch and the plugin you are pushing the data with.
In that case, please share the Elasticsearch logs.


(madalin) #4

In my Java app I'm using a log4j2 config file with a Socket appender:

<Socket name="Socket" host="host" port="5010">
    <JsonLayout properties="true"/>
</Socket>

In the Logstash config file:

input {
  tcp {
    mode => "server"
    host => "0.0.0.0"
    port => 5010
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    document_id => "%{logstash_checksum}"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
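A common way to check whether events are reaching Logstash at all, independent of Elasticsearch, is to add a temporary stdout output next to the elasticsearch one (a debugging sketch of the same output section, not a final config):

```
output {
  # Temporary: print every incoming event to the Logstash console
  stdout { codec => rubydebug }

  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    # Note: if the event carries no logstash_checksum field, this sprintf
    # stays the literal string "%{logstash_checksum}", so every document
    # gets the same _id and overwrites the previous one.
    document_id => "%{logstash_checksum}"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
```

If events show up on stdout but not in Elasticsearch, the problem is on the output side; if nothing is printed at all, the TCP input is not receiving anything.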

The index is created:
health status index uuid pri rep docs.count docs.deleted store.size pri.store.size
yellow open .kibana Bqd3zVbDQk2TQ2LSyR_gxw 1 1 2 0 9.2kb 9.2kb
yellow open logstash-2017.09.12 lTgMTJsZSDiENOz1IEnxNw 5 1 1 35 8.4kb 8.4kb

P.S. That AWS EC2 instance was blocking, so I created a new one with a fresh install, and instead of creating my initial index I used the dynamic one with a timestamp ("logstash-*"), but the problem is the same.

As for plugins, from what I know Logstash doesn't have a plugin for Log4j2 yet.
If I don't have any plugins installed, can it not work?


(poojagupta) #5

As for plugins, from what I know Logstash doesn't have a plugin for Log4j2 yet.
If I don't have any plugins installed, can it not work?

Logstash has a log4j plugin. For details on installation and usage, refer to the link below:
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-log4j.html

After that, fetch the logs using the input plugin:
log4j {
  mode => "server"
  host => "127.0.0.1"
  port => 9500
  type => "log4j"
}
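If the log4j input plugin is not bundled with your Logstash distribution, it can usually be added with the plugin manager (a command sketch; the exact path depends on where Logstash is installed):

```shell
# Run from the Logstash installation directory
bin/logstash-plugin install logstash-input-log4j
```

Note that, as far as I know, this input plugin speaks the log4j 1.x SocketAppender wire format, so it may not understand events sent by a log4j2 Socket appender with JsonLayout.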


(poojagupta) #6

@madalin

If I don't have any plugins installed, can it not work?

You can also use the grok filter plugin to parse log4j log files, and no additional installation is required for it; it is installed with Logstash by default.
For more details refer to this:
https://www.elastic.co/guide/en/logstash/5.3/plugins-filters-grok.html
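As a sketch of that approach, a grok filter for a typical log4j line such as `2017-09-12 10:15:30,123 INFO [main] com.example.App - started` could look like this (the field names `log_date`, `level`, `thread`, `class`, `message` mirror the mapping from the first post; the pattern is an assumption and must be adapted to the actual PatternLayout in use):

```
filter {
  grok {
    # Assumed layout: "<timestamp> <level> [<thread>] <logger> - <message>"
    match => {
      "message" => "%{TIMESTAMP_ISO8601:log_date} %{LOGLEVEL:level} \[%{DATA:thread}\] %{JAVACLASS:class} - %{GREEDYDATA:message}"
    }
    # Replace the original raw line with just the extracted message text
    overwrite => [ "message" ]
  }
}
```

TIMESTAMP_ISO8601, LOGLEVEL, DATA, JAVACLASS and GREEDYDATA are all patterns shipped with the grok plugin.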


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.