Logstash seems to run successfully, but nothing is found in Elasticsearch

Hi,

I have /etc/logstash/conf.d/logstash.conf:
input {
  file {
    path => "/var/lib/collectd/csv/Gateway/cpu-0/cpu-idle-2017-04-24"
    start_position => beginning
    sincedb_path => "/dev/null"
    ignore_older => 0
  }
}

filter {
  csv {
    separator => ","
    # epoch,value
    columns => ["EPOCH","VALUE"]
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    action => "index"
    index => "cpu0_utilization"
  }
  stdout { codec => rubydebug }
}

sudo service logstash restart
sudo service elasticsearch restart

Running /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf -v displays something similar to:

{
       "message" => "1493090576.255,93.400394",
      "@version" => "1",
    "@timestamp" => "2017-04-25T03:22:56.811Z",
          "path" => "/var/lib/collectd/csv/Gateway/cpu-0/cpu-idle-2017-04-24",
          "host" => "raghu-OptiPlex-9020",
         "EPOCH" => "1493090576.255",
         "VALUE" => "93.400394"
}

For Elasticsearch:
curl localhost:9200
{
  "name" : "Wiz Kid",
  "cluster_name" : "Gateway",
  "cluster_uuid" : "6iIoakzRRr6VM5Zd8QM6xw",
  "version" : {
    "number" : "2.4.4",
    "build_hash" : "fcbb46dfd45562a9cf00c604b30849a6dec6b017",
    "build_timestamp" : "2017-01-03T11:33:16Z",
    "build_snapshot" : false,
    "lucene_version" : "5.5.2"
  },
  "tagline" : "You Know, for Search"
}

Nothing is seen in Elasticsearch:
curl -X GET localhost:9200/cpu0_utilization
{
  "cpu0_utilization" : {
    "aliases" : { },
    "mappings" : {
      "logs" : {
        "properties" : {
          "@timestamp" : { "type" : "date", "format" : "strict_date_optional_time||epoch_millis" },
          "@version" : { "type" : "string" },
          "EPOCH" : { "type" : "string" },
          "VALUE" : { "type" : "string" },
          "host" : { "type" : "string" },
          "message" : { "type" : "string" },
          "path" : { "type" : "string" }
        }
      }
    },
    "settings" : {
      "index" : {
        "creation_date" : "1493090457803",
        "uuid" : "Av7GmdHlSD-QVWFfB_aGyw",
        "number_of_replicas" : "0",
        "number_of_shards" : "1",
        "version" : { "created" : "2040499" }
      }
    },
    "warmers" : { }
  }
}

I created the index using: curl -X POST 'localhost:9200/cpu0_utilization'
I have the following doubts:

  1. Can't Logstash create the index based on the options provided in logstash.conf?

  2. Is it necessary to always run: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf -v

  3. Is there an option to configure Logstash to pick up the .csv from a different machine? My CSV file is on a different machine.

  4. How do I verify the contents of Elasticsearch?

  5. Kibana says: log [20:20:24.592] [info][status][plugin:elasticsearch] Status changed from yellow to green - Kibana index ready
    whereas Elasticsearch says:
    [2017-04-24 20:20:21,887][INFO ][cluster.routing.allocation] [Wiz Kid] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[.kibana][0]] ...]).
    [2017-04-24 20:20:57,816][INFO ][cluster.metadata ] [Wiz Kid] [cpu0_utilization] creating index, cause [api], templates [], shards [1]/[0], mappings []
    [2017-04-24 20:20:58,054][INFO ][cluster.routing.allocation] [Wiz Kid] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[cpu0_utilization][0]] ...]).
    [2017-04-24 20:21:06,509][INFO ][cluster.metadata ] [Wiz Kid] [cpu0_utilization] create_mapping [logs]

Can the two of them be in different states, yellow and green?

Can't Logstash create the index based on the options provided in logstash.conf?

Sure it can.
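As a quick check (assuming the default localhost:9200 endpoint from the config above), the cat indices API lists every index, including ones the elasticsearch output creates automatically:

```shell
# List all indices with document counts; an index created by the
# elasticsearch output appears here once the first event is flushed
curl 'localhost:9200/_cat/indices?v'
```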

Is it necessary to always run: /opt/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf -v

What do you mean? With the exception of the -v option, that's a normal Logstash startup command, although one usually passes the path to the conf.d directory so that Logstash reads all the files in it.
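For example, passing the directory instead of a single file (a sketch, using the paths from the question) makes Logstash load every config file in conf.d:

```shell
# Load every file in conf.d rather than one named config file
/opt/logstash/bin/logstash -f /etc/logstash/conf.d/
```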

Is there an option to configure Logstash to pick up the .csv from a different machine? My CSV file is on a different machine.

You need to use NFS or a similar network file system to mount the directory from the other machine; Logstash can't magically read files from other computers.
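A minimal sketch, assuming the remote machine is reachable as remote-host and exports the collectd directory over NFS (both the hostname and the mount point are placeholders):

```shell
# Mount the remote collectd directory locally; remote-host and
# /mnt/collectd-csv are hypothetical and must match your environment
sudo mkdir -p /mnt/collectd-csv
sudo mount -t nfs remote-host:/var/lib/collectd/csv /mnt/collectd-csv
```

After mounting, point the file input's path option at /mnt/collectd-csv/... instead of the local path.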

How do I verify the contents of Elasticsearch?

With Kibana, for example, or with one of the many REST APIs that Elasticsearch provides.
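For instance, with the index name from the question (and assuming Elasticsearch is listening on localhost:9200), the count and search APIs show whether any documents were actually indexed:

```shell
# How many documents does the index hold?
curl 'localhost:9200/cpu0_utilization/_count?pretty'

# Fetch a few documents to inspect their fields
curl 'localhost:9200/cpu0_utilization/_search?pretty&size=5'
```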

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.