Logstash unable to read custom logstash.conf file

Logstash is unable to read my custom logstash.conf file and appears to be pointing to the default file path, i.e. /usr/share/logstash/pipeline/logstash.conf.
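To see which pipeline file the container actually loads, the mounted directory can be inspected from the host (a quick check; the service name `logstash` matches the compose file below):

    # list the bind-mounted pipeline directory inside the container
    docker-compose exec logstash ls -l /usr/share/logstash/pipeline
    # print the pipeline config the container actually sees
    docker-compose exec logstash cat /usr/share/logstash/pipeline/logstash.conf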

docker-compose.yml

version: '3.2'

services:
  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./elasticsearch/config/elasticsearch.yml
        target: /usr/share/elasticsearch/config/elasticsearch.yml
        read_only: true
      - type: volume
        source: elasticsearch
        target: /usr/share/elasticsearch/data
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: changeme
      discovery.type: single-node
    networks:
      - elg

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: $ELK_VERSION
    volumes:
      - type: bind
        source: ./logstash/config/logstash.yml
        target: /usr/share/logstash/config/logstash.yml
        read_only: true
      - type: bind
        source: ./logstash/pipeline
        target: /usr/share/logstash/pipeline
        read_only: true
    ports:
      - "5044:5044"
      - "5000:5000/tcp"
      - "5000:5000/udp"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    user: root
    networks:
      - elg
    depends_on:
      - elasticsearch

  grafana:
    container_name: grafana
    image: grafana/grafana
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - 3000:3000
    networks:
      - elg
    depends_on:
      - elasticsearch

networks:
  elg:
    driver: bridge

volumes:
  elasticsearch:
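Since mis-indented YAML can silently attach a service at the wrong level (the grafana block must sit under services:), the resolved configuration can be checked before starting the stack:

    # print the fully resolved compose file; mis-indented or misplaced
    # keys are easy to spot in the rendered output
    docker-compose config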

Logstash logs

logstash_1       | OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
logstash_1       | WARNING: An illegal reflective access operation has occurred
logstash_1       | WARNING: Illegal reflective access by org.jruby.ext.openssl.SecurityHelper (file:/tmp/jruby-1/jruby10716120224889564442jopenssl.jar) to field java.security.MessageDigest.provider
logstash_1       | WARNING: Please consider reporting this to the maintainers of org.jruby.ext.openssl.SecurityHelper
logstash_1       | WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
logstash_1       | WARNING: All illegal access operations will be denied in a future release
logstash_1       | Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
logstash_1       | [2020-11-24T05:49:43,986][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.9.3", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10-LTS on 11.0.8+10-LTS +indy +jit [linux-x86_64]"}
logstash_1       | [2020-11-24T05:49:44,221][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
logstash_1       | [2020-11-24T05:49:44,285][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
logstash_1       | [2020-11-24T05:49:45,736][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"e872d89c-cddf-44a5-a443-61e2edf1caaf", :path=>"/usr/share/logstash/data/uuid"}
logstash_1       | [2020-11-24T05:50:38,349][INFO ][org.reflections.Reflections] Reflections took 131 ms to scan 1 urls, producing 22 keys and 45 values 
logstash_1       | [2020-11-24T05:50:46,581][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"testelastic", id=>"f841bc62af6ca263e40598f94cc11be3076687f132add7d6ced03ade62725a0e", hosts=>[http://elasticsearch:9200], document_type=>"sch", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_1ca96686-2ee1-47c2-957c-3176134771c7", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", ecs_compatibility=>:disabled, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
logstash_1       | [2020-11-24T05:50:52,037][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
logstash_1       | [2020-11-24T05:50:53,779][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
logstash_1       | [2020-11-24T05:50:54,314][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
logstash_1       | [2020-11-24T05:50:54,351][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
logstash_1       | [2020-11-24T05:50:55,016][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
logstash_1       | [2020-11-24T05:50:55,455][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
logstash_1       | [2020-11-24T05:50:55,870][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
logstash_1       | [2020-11-24T05:50:56,549][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x70b47679 run>"}
logstash_1       | [2020-11-24T05:51:17,292][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>20.71}
logstash_1       | [2020-11-24T05:51:18,867][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
logstash_1       | [2020-11-24T05:51:21,458][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
logstash_1       | [2020-11-24T05:51:21,696][INFO ][filewatch.observingtail  ][main][32ee3be21399aaffc75fd42d8cc7785bcc622bf9c7289de83c2e830d7db357a7] START, creating Discoverer, Watch with file and sincedb collections
logstash_1       | [2020-11-24T05:51:23,333][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
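The pipeline.sources entry in the startup log above shows which file the pipeline was built from: /usr/share/logstash/pipeline/logstash.conf, which is the target of the bind mount rather than a file baked into the image. To pin the config path explicitly, path.config can also be set in the mounted logstash.yml (a minimal sketch; the http.host line is a typical default from the docker-elk setup and is an assumption here):

    # logstash/config/logstash.yml
    http.host: "0.0.0.0"
    path.config: /usr/share/logstash/pipeline/logstash.conf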

logstash.conf (some columns removed due to sensitivity)

input {
    file {
        path => "/home/practical-devsecops/Downloads/input.csv"
        type => "ppp"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    if [type] == "ppp" {
        csv {
            columns => [ "Year" ]
        }
        mutate {
            replace => { "reportMonth" => "%{Month}-%{Year}" }
        }
        date {
            match => [ "reportMonth", "MMM-YYYY", "ISO8601" ]
            target => "reportMonth"
        }
    }
}

output {
    if [type] == "ppp" {
        elasticsearch {
            hosts => "http://elasticsearch:9200"
            index => "testelastic"
            document_type => "sch"
        }
        stdout { codec => rubydebug }
    }
}
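One thing to note about the file input above: path is resolved inside the Logstash container, so /home/practical-devsecops/Downloads/input.csv is only readable if that host file is bind-mounted into the container. A minimal sketch of such a mount under the logstash service's volumes: (the in-container target /usr/share/logstash/input/input.csv is an arbitrary choice for illustration):

    # added to the logstash service's volumes in docker-compose.yml
    - type: bind
      source: /home/practical-devsecops/Downloads/input.csv
      target: /usr/share/logstash/input/input.csv
      read_only: true

The file input's path would then point at the in-container target rather than the host path.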
