Unable to see indices, loaded too much data in Elasticsearch

Hi there!

I was trying something out and loaded 10 log files of 100 MB each. Now I'm not able to see the indices in Elasticsearch (checking with Postman), and no data is displayed in Kibana.

Can someone please help?

Thanks!

Exactly how did you load the data? What does the output from the cat indices API show?
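
For reference, you can list the indices in Postman (or with curl) with a request along these lines, assuming Elasticsearch is running on the default localhost:9200 (adjust the host and port to your setup):

GET http://localhost:9200/_cat/indices?v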

Hi,

It's working fine now.
To load data into Elasticsearch I used: bin/logstash -f sample.conf < bin/dma.log

Yesterday it didn't show anything, and I don't know what happened, but today I simply loaded the log file into Elasticsearch again and now it's working fine and showing me the indices. Can you tell me why it didn't show the indices yesterday?

Thanks!

Without logs it is impossible to tell what happened yesterday. Why are you running it manually instead of having Logstash monitor the directory where the log file resides and process it automatically?
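
As a rough sketch (the path below is only an example; point it at the directory where dma.log is actually written), the stdin input could be replaced with a file input so Logstash picks up new log lines on its own:

input {
  file {
    # Example path only; adjust to the real log location.
    path => "/path/to/logs/dma.log"
    start_position => "beginning"
  }
}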

Hi Christian,

I just want to make sure it's working fine locally; then I will try to automate it.
The log file looks like this:

2016-12-16 21:28:05,668 ERROR [int-http-28] [nbiws::::] c.t.d.s.impl.DiagnosticServiceImpl - Error running a diagnostic workflow : 9003: Invalid arguments
com.twowire.dmc.listener.DeviceInteractionException: 9003: Invalid arguments
at com.twowire.dmc.listener.DeviceInteractionTemplate.execute(DeviceInteractionTemplate.java:102) ~[cms-core-4.2.8.9.jar:4.2.8.9]
at com.twowire.dmc.listener.DeviceInteractionTemplate.execute(DeviceInteractionTemplate.java:59) ~[cms-core-4.2.8.9.jar:4.2.8.9]
at com.twowire.dmc.listener.DeviceInteractionTemplate.execute(DeviceInteractionTemplate.java:48) ~[cms-core-4.2.8.9.jar:4.2.8.9]

Logstash conf file:

input {
  stdin {
    codec => multiline {
      pattern => "(^%{TIMESTAMP_ISO8601} )"
      negate => true
      what => "previous"
      # Record that this is an "exception" event.
      multiline_tag => "exception"
    }
  }
}

filter {
  mutate { add_field => { "Source" => "SKY" } }

  environment {
    # add_metadata_from_env => { "region" => "HADOOP_HOME" }
    add_field => ["my_environment", "Hello World, from %{host}"]
  }

  if "exception" not in [tags] {

    # example log line:
    # 2016-12-16 20:43:20,535 DEBUG [CWMP-processor-6] [00D09E-0000000001:1002:C5852D7218635D7B09FE0DDE0FBE75F5:0:] c.twowire.dmc.service.PolicySvcImpl - Device matched policy 1001
    # encoder pattern (dmc/conf/logback.xml):
    # %date{ISO8601} %-5level [%thread] [%X{username}:%X{deviceId}:%X{sessionId}:%X{userInteraction}:%X{workflowName}] %logger{35} - %msg%n

    grok {
      match => {
        "message" => "%{DATESTAMP:timestamp} %{LOGLEVEL:level}( +)\[%{DATA:thread}\] \[%{DATA:mdc}\] %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"
      }
      # Record that this is a "log" event (add_tag only runs when the grok match succeeds).
      add_tag => ["log"]
    }

    if "log" in [tags] {

        grok {
            match => {
                mdc => "%{DATA:username}:%{DATA:deviceId:int}:%{DATA:sessionId}:%{DATA:userInteraction:int}:%{GREEDYDATA:workflowName}"
            }
        }

        

        date {
            timezone => GMT
            match => [
                           # "16-12-16 21:58:20,606"
                "timestamp", "yy-MM-dd HH:mm:ss,SSS"
            ]
        }
    }

}
if [level] in ["ERROR", "error"] or [level] in ["FATAL", "fatal"]{
    mutate {
        add_tag => ["alert"]
    }
}

}

output {
  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
  if "log" in [tags] {
    elasticsearch { hosts => ["zero.auslab.2wire.com"] }
  }
  if "exception" in [tags] {
    elasticsearch { hosts => ["zero.auslab.2wire.com"] }
  }
}

Can you please take a look and tell me where I'm going wrong?
Thanks for your response!
