Index not creating

Following up on this topic: Index not creating - #17 by jubin03
I have set up a new ELK stack and fetched data into it; it works fine and I am able to create an index pattern. But the existing one is not working. Does anybody have any idea?

Did you follow up on Index not creating - #18 by aaron-nimocks?

Yes, I have done the same and no error was found while running Logstash, but no index pattern was created. Please find the logs below:

[[main]-pipeline-manager] INFO  logstash.javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/config/conf.d/1password-filter.conf", "/usr/share/logstash/config/conf.d/filebeat-filter.conf", "/usr/share/logstash/config/conf.d/logstash.conf", "/usr/share/logstash/config/conf.d/pfsense-filter.conf", "/usr/share/logstash/config/conf.d/webhook-filter.conf"], :thread=>"#<Thread:0x69d20ebd run>"}
18:21:28.920 [[main]-pipeline-manager] INFO  logstash.javapipeline - Pipeline Java execution initialization time {"seconds"=>1.71}
18:21:29.047 [[main]-pipeline-manager] INFO  logstash.inputs.beats - Starting input listener {:address=>"0.0.0.0:5044"}
18:21:29.105 [[main]-pipeline-manager] INFO  logstash.javapipeline - Pipeline started {"pipeline.id"=>"main"}
18:21:29.125 [[main]<http] INFO  logstash.inputs.http - Starting http input listener {:address=>"0.0.0.0:3233", :ssl=>"false"}
18:21:29.159 [Agent thread] INFO  logstash.agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
18:21:29.235 [[main]<udp] INFO  logstash.inputs.udp - Starting UDP listener {:address=>"0.0.0.0:5044"}
18:21:29.236 [[main]<beats] INFO  org.logstash.beats.Server - Starting server on port: 5044
18:21:29.251 [[main]<file] INFO  filewatch.observingtail - START, creating Discoverer, Watch with file and sincedb collections
18:21:29.318 [[main]<udp] INFO  logstash.inputs.udp - UDP listener started {:address=>"0.0.0.0:5044", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}

The idea behind changing your output from Elasticsearch to stdout is to test whether your data is actually flowing.

When you make the change below and run it, you should see records being processed on the screen. If you do not, then your issue is in your filter or input.

What results do you see when you run it?

output {
  stdout {}
}
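
If events are flowing, each record should print roughly like this (an illustration only; stdout defaults to the rubydebug codec, and every value here is assumed):

{
       "message" => "<one raw line from your input>",
      "@version" => "1",
    "@timestamp" => 2021-09-29T11:48:21.644Z,
          "host" => "your-hostname"
}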

Yes, I have enabled it as you said, and I can see data flowing from the other filters, but not from the JSON file.

Can you post your entire conf? If you ran the below and data didn't flow, then the problem most likely isn't Logstash but your source files not containing the data you expect them to.

input {
  file {
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout {}
}

Please find the configurations below.

Logstash conf

# Filebeat input
input {
  beats {
    port => 5044
  }
}

# File input
input {
  file {
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

# webhook 
input {
  http {
    port => port number
    type => "webhook"
  }
}

# UDP syslog stream via 5044
input {
  udp {
    type => "syslog"
    port => 5044
  }
}

output {
  if [type] == "syslog" {
    elasticsearch {
      ssl => true
      ssl_certificate_verification => false
      user => username
      password => "password"
      action => "index"
      hosts => ["https://elk.com:9200"]
      index => "logstash-pfsense-%{+YYYY.MM.dd}"
    }
  }

  else if [type] == "signinattempts" {
    elasticsearch {
      ssl => true
      ssl_certificate_verification => false
      user => username
      password => "password"
      action => "index"
      hosts => ["https://elk.com:9200"]
      index => "events-%{+YYYY.MM.dd}"
    }
  }

  else if [type] == "webhook" {
    elasticsearch {
      ssl => true
      ssl_certification_verification => false
      user => username
      password => "password"
      action => "index"
      hosts => ["https://elk.com:9200"]
      index => "webhook-%{+YYYY.MM.dd}"
    }
  }

  else {
    elasticsearch {
      ssl => true
      ssl_certificate_verification => false
      user => username
      password => "password"
      action => "index"
      hosts => ["https://elk.com:9200"]
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    }
  }

  stdout {
    codec => rubydebug
  }
}


Filter.conf

filter {
  if [type] == "signinattempts" {
    json {
      source => "message"
    }
  }
}

None of your inputs set type to signinattempts. So type will never == signinattempts.
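
For example, a minimal sketch of your file input with an explicit type added, so the signinattempts conditionals in your filter and output can match it:

input {
  file {
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "signinattempts"
  }
}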

I haven't added a type to the input; the webhook filter is set up the same way as this one and it is working fine. Could you please suggest what changes are required in the conf?

I still don't understand the problem. If you are expecting data from all JSON files in that path to be processed by Logstash, but running this exact config does not produce records, then I would look at your JSON files first and not your conf.

input {
  file {
    path => "/path/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout {}
}

I have created a script that fetches some events data into a single JSON file via the API; the auth has also been checked. Due to security concerns, I can't share the JSON file. It seems like Logstash is not reading the data inside the file.

I would verify that the file contains valid JSON. Also run Logstash in --debug mode to see if it gives you any other hints.

Are the JSON records 1 per line or multiline? Can you give a sample of how it's structured without using the actual data?

It should look like this with how your input is set up.

{"message": "value", "another": "field"}
{"message": "value", "another": "field"}
{"message": "value", "another": "field"}
{"message": "value", "another": "field"}
{"message": "value", "another": "field"}

This is a sample of the file content. I have only removed the data values.

{"uuid":"","session_uuid":"","timestamp":"2021-07-02T03:55:35Z","country":"US","category":"success","type":"mfa_ok","details":null,"target_user":{"uuid":"","name":"","email":""},"client":{"app_name":" Browser Extension","app_version":"20166","platform_name":"Firefox","platform_version":"68.0","os_name":"Windows","os_version":"10.0","ip_address":""}}

Taking your sample data and running the conf below produces the output below, so it works. The next thing to verify would be the path to your JSON files.

Conf

input {
  file {
    path => "/Applications/elastic/logstash/config/data1.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => json
  }
}
output {
    stdout { codec => json_lines }
}

Output

{
    "path": "/Applications/elastic/logstash/config/data1.json",
    "target_user": {
        "uuid": "",
        "email": "",
        "name": ""
    },
    "@version": "1",
    "@timestamp": "2021-09-29T11:48:21.644Z",
    "details": null,
    "uuid": "",
    "timestamp": "2021-07-02T03:55:35Z",
    "client": {
        "app_version": "20166",
        "app_name": " Browser Extension",
        "platform_name": "Firefox",
        "ip_address": "",
        "os_name": "Windows",
        "platform_version": "68.0",
        "os_version": "10.0"
    },
    "host": "Aarons-MBP",
    "category": "success",
    "type": "mfa_ok",
    "country": "US",
    "session_uuid": ""
}
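
Note the codec => json on the input here; it parses each line as it is read, so no separate json filter is needed. An equivalent sketch, closer to the filter.conf you posted, that parses the message field in a filter instead:

input {
  file {
    path => "/Applications/elastic/logstash/config/data1.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
    stdout { codec => json_lines }
}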

I have added this file to a path, and its permissions are set with chmod 777.

I would run just the below using the --debug option and post the results.

input {
  file {
    path => "/Applications/elastic/logstash/config/data1.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => json
  }
}
output {
    stdout { codec => json_lines }
}

Now I'm able to create an index, but in Kibana the index shows the data from Beats, not from the JSON file. I have used the same data that I mentioned earlier in this thread.

Conf file

#events
input {
  file {
    path => "/path/events.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => json
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    ssl => true
    ssl_certificate_verification => false
    user =>
    password =>
    action => "index"
    hosts => ["https://url:9200"]
    index => "events-%{+YYYY.MM.dd}"
  }
}

FYI: it is a dockerized ELK stack. No errors were found in the docker logs.
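
One thing worth checking, going by the pipeline.sources list in the startup log: Logstash concatenates every file in conf.d into a single pipeline, so an unconditional elasticsearch output like the one above receives events from every input, including beats. Also, with codec => json on the input there is no message field left to parse, so the json filter above is redundant. A sketch of guarding the output with the same type conditional used elsewhere in the conf (assuming the file input is tagged with type => "signinattempts"):

input {
  file {
    path => "/path/events.json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    codec => json
    type => "signinattempts"
  }
}

output {
  if [type] == "signinattempts" {
    elasticsearch {
      ssl => true
      ssl_certificate_verification => false
      user =>
      password =>
      action => "index"
      hosts => ["https://url:9200"]
      index => "events-%{+YYYY.MM.dd}"
    }
  }
}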

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.