Index not being created

After configuring filter.conf and the Logstash input, the index is not showing in Kibana. I tested the Logstash conf files successfully. No errors.

Below are the conf files:

Filter.conf

filter {
  if [type] == "signinattempts" {
    json {
      source => "message"
    }
  }
}

Logstash input & output

input {
  file {
    type => "signinattempts"
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

output {
  if [type] == "signinattempts" {
    elasticsearch {
      ssl => true
      ssl_certificate_verification => false
      user =>
      password =>
      action => "index"
      hosts => ["https://localhost:9200"]
      index => "test-events-%{+yyyy.MM.dd}"
    }
  }
}

Could somebody verify the configuration?

Does it work if you remove all of the conditional if statements?

No. It is not working even with the if statements removed.

Does it work if you just output to the screen?

output { stdout {} }
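If plain stdout {} prints nothing useful, the rubydebug codec (a standard option of the stdout plugin) prints each event with all of its fields, which makes it easier to see whether the json filter populated anything:

output {
  stdout {
    codec => rubydebug   # prints the full event structure, field by field
  }
}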

What does your logstash log say? Any errors or warnings?

No errors or warnings. I can see logs from the other if conditions. For example, the same Logstash instance fetches syslogs, and that works with the Beats input configuration.

You said it happened after you added the filter. Are you saying if you remove the filter the data will flow properly?

Can you post a single message from the log file that you expect to work for this pipeline?

What I'm saying is that Logstash has other input configurations which are working fine, even after adding this one. But I'm not able to get this file's data into ELK. My previous configurations keep fetching logs without any issues after adding this one. I ran a config test and the result was a success, but indexing is not working. Is there any way to check whether Logstash is reading data from this file and sending it to Elasticsearch?

Yes. Change your output to this. You are looking at the screen to see whether messages are getting through Logstash or not.

You will need to start Logstash in the foreground, not as a service, in order to see the output.

output {
  if [type] == "signinattempts" {
    stdout { }
  }
}

Okay, I will change it.
FYI: I'm running Logstash in Docker.

Then you probably need to output to a file. I am not that familiar with Docker.
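A minimal file-output sketch for that test; the path below is an assumption and needs to be writable by the Logstash user inside the container:

output {
  file {
    # hypothetical path -- any location writable inside the container works
    path => "/usr/share/logstash/debug-events.log"
  }
}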

This also did not work. :frowning_face:

I think in order to help further I would need to see your logstash logs when you try to process this file.

Here is the log when I start the Logstash container:

 logstash.javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/config/conf.d/1password-filter.conf", "/usr/share/logstash/config/conf.d/filebeat-filter.conf", "/usr/share/logstash/config/conf.d/logstash.conf", "/usr/share/logstash/config/conf.d/pfsense-filter.conf", "/usr/share/logstash/config/conf.d/webhook-filter.conf"], :thread=>"#<Thread:0x575d69ef run>"}
04:51:44.933 [[main]-pipeline-manager] INFO  logstash.javapipeline - Pipeline Java execution initialization time {"seconds"=>1.68}
04:51:44.951 [[main]-pipeline-manager] INFO  logstash.inputs.beats - Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
04:51:45.387 [[main]-pipeline-manager] INFO  logstash.javapipeline - Pipeline started {"pipeline.id"=>"main"}
04:51:45.400 [[main]<http] INFO  logstash.inputs.http - Starting http input listener {:address=>"0.0.0.0:3233", :ssl=>"false"}
04:51:45.505 [[main]<beats] INFO  org.logstash.beats.Server - Starting server on port: 5044
04:51:45.530 [[main]<udp] INFO  logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5044"}
04:51:45.551 [[main]<file] INFO  filewatch.observingtail - START, creating Discoverer, Watch with file and sincedb collections
04:51:45.554 [Agent thread] INFO  logstash.agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
04:51:45.585 [[main]<udp] INFO  logstash.inputs.udp - UDP listener started {:address=>"0.0.0.0:5044", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
04:51:45.775 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}

Everything looks normal. It appears Logstash is watching for files at /path/*.json but might not be seeing them. Can you verify the path is correct?
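One way to confirm whether the file input is discovering files is to raise the filewatch loggers to trace. The logger name below matches the filewatch.observingtail logger already visible in your startup log; this is a sketch of what you would append to Logstash's log4j2.properties:

# Append to config/log4j2.properties, then restart Logstash
logger.observingtail.name = filewatch.observingtail
logger.observingtail.level = trace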

Yes, the path is correct. I re-verified it.

Hi,

Please change the index name to lowercase letters; Elasticsearch will not be able to understand 'MM'. Replace it with 'mmm' and then check in Index Management.

I modified it, but it didn't work.
Note: I'm using the same 'MM' format for other syslogs and it is working.

  1. Verify file path and file contents.
  2. Verify you can do a straight input to stdout or file output with no filters.

What I am reading is that you are failing step 2. If that's the case then this is probably a Docker question, which I can't answer. Maybe something to do with mounting or volumes. But you said you are already doing something similar, so I don't think you would miss that part.
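For completeness, a sketch of the mounting this would need if Docker Compose is in use; the service name and host directory here are assumptions:

services:
  logstash:
    volumes:
      # host directory containing the *.json files, mounted at the
      # exact path the file input reads from
      - ./json-logs:/path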

Below is what I would use to test.

input {
  file {
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout {}
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.