Indexing emails into Elasticsearch

Please, I need your help: I can't index my emails into Elasticsearch.

C:\elk\logstash\bin>logstash -f parsemail.conf --config.reload.automatic
Sending Logstash's logs to C:/elk/logstash/logs which is now configured via log4j2.properties
[2018-04-18T10:23:48,465][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/elk/logstash/modules/fb_apache/configuration"}
[2018-04-18T10:23:48,576][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/elk/logstash/modules/netflow/configuration"}
[2018-04-18T10:23:49,686][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-04-18T10:23:52,239][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-04-18T10:23:54,130][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-18T10:24:15,084][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"localemails", document_type=>"email", hosts=>[//localhost:9200], id=>"6eb862d539531d5f6d7e9d09096a9571a75bbe4a56db027a6b1456db90d866b1", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_43a18763-56a4-4391-b37b-845acf62b921", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-04-18T10:24:15,357][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-04-18T10:24:23,661][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-04-18T10:24:23,687][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-04-18T10:24:24,415][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-04-18T10:24:24,963][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-18T10:24:25,097][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-04-18T10:24:25,176][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-18T10:24:25,324][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-18T10:24:25,844][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-04-18T10:24:34,931][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x63741106 run>"}
[2018-04-18T10:24:35,365][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

What's in parsemail.conf? If you're using the file input, what's in the input files?


Thank you very much, Magnus. Here is the config file:

input {
  imap {
    host => "imap.gmail.com"
    password => "xxxxxxxxxx"
    user => "rouchad767@gmail.com"
    port => 993
    check_interval => 10
    folder => "Inbox"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "emails"
    document_type => "email"
    hosts => "localhost:9200"
  }
}
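As an aside, the document_type => "email" setting is what triggers the deprecation warning in the startup log above; against an Elasticsearch 6.x cluster the output works without it. A minimal output block could look roughly like this (a sketch only, keeping the index name and host from the config above):

output {
  stdout { codec => rubydebug }
  elasticsearch {
    # document_type omitted; custom document types are deprecated in Elasticsearch 6.x
    index => "emails"
    hosts => ["localhost:9200"]
  }
}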

Not sure what's going on. Do you get any additional clues by increasing Logstash's log level?
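For example, restarting Logstash with the log level raised should show more detail about what the imap input and the elasticsearch output are doing. Something along these lines (a sketch; --log.level is the standard Logstash 6.x flag, and parsemail.conf is the config file from the earlier run):

C:\elk\logstash\bin>logstash -f parsemail.conf --log.level debug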


First of all, thank you very much. I did not understand your question, but when I tested the program again I got this result:
C:\Users\goldenBoy>cd C:\elk\logstash\bin

C:\elk\logstash\bin>logstash -f gmail.conf
Sending Logstash's logs to C:/elk/logstash/logs which is now configured via log4j2.properties
[2018-04-19T18:22:41,010][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/elk/logstash/modules/fb_apache/configuration"}
[2018-04-19T18:22:41,127][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/elk/logstash/modules/netflow/configuration"}
[2018-04-19T18:22:43,293][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-04-19T18:22:47,591][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-04-19T18:22:50,258][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-19T18:23:09,186][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"emails", document_type=>"email", hosts=>[//localhost:9200], id=>"91ce52bea64c35175945f266a70e13faf50f1b7bbe8fd235f467850ac2249d65", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_aba4e9a3-5c9f-40e7-b2e3-c88d1b1f6dac", enable_metric=>true, charset=>"UTF-8">, workers=>1, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-04-19T18:23:09,697][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-04-19T18:23:12,255][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-04-19T18:23:12,370][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-04-19T18:23:15,123][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-04-19T18:23:16,194][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-19T18:23:16,375][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-04-19T18:23:16,449][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-19T18:23:16,631][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-19T18:23:19,598][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-04-19T18:23:29,843][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6dcc139f run>"}
[2018-04-19T18:23:30,380][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
{
    "subject" => "Goodreads Newsletter: April 19, 2018",
    "x-amazon-metadata" => "CA=C36S8VKT7PAUOV-CU=A2IGUBJZJ7TEQ0-RI=A3UQAZE4JJEIRW",
    "x-original-messageid" => "urn.rtn.msg.20180419154508144ddd9046334001a839dbf6cd60p0na@1524152708467.rtn-na-mktng-m41b-a797d6c6.us-east-1.amazon.com",
    "x-google-smtp-source" => "AB8JxZrI7tkWOaysAIwZbCPG7h3IVI6FX3ASeARxHOF8p64JAAgxfThI2ba77dQfWOtUOsgMvlEK",
    "from" => "Goodreads no-reply@mail.goodreads.com",
    "message-id" => "01000162de94ad91-8f0b0d72-30f1-4bcc-942e-4fc3f9418e55-000000@email.amazonses.com",
    "received" => [
        [0] "by 10.223.139.202 with SMTP id w10csp823166wra; Thu, 19 Apr 2018 08:45:10 -0700",
        [1] "from a15-138.smtp-out.amazonses.com (a15-138.smtp-out.amazonses.com. [54.240.15.138]) by mx.google.com with ESMTPS id p43-v6si1584826qtg.155.2018.04.19.08.45.09 for rouchad767@gmail.com (version=TLS1 cipher=ECDHE-RSA-AES128-SHA bits=128/128); Thu, 19 Apr 2018 08:45:09 -0700"
    ],
    "delivered-to" => "rouchad767@gmail.com",
    "@version" => "1",
    "x-ses-outgoing" => "2018.04.19-54.240.15.138",
    "arc-seal" => "i=1; a=rsa-sha256; t=1524152709; cv=none; d=google.com; s=arc-20160816; b=Eprqz1o1kS72s9iKMN8vPvay5etnSkpxDQ39uW8iR//o0w2k/8lA5rv0KSjsoY5dtz
    "content-type" => "multipart/alternative; boundary="----=_Part_1557594_2123001854.1524152708461"",
[2018-04-19T18:40:44,429][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"cluster_block_exception", "reason"=>"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];"})
[2018-04-19T18:40:44,501][INFO ][logstash.outputs.elasticsearch] Retrying individual bulk actions that failed or were rejected by the previous bulk request. {:count=>1}

Good, so it's able to fetch a message from Gmail. Here's why it doesn't show up in ES:

[2018-04-19T18:40:44,429][INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 403 ({"type"=>"cluster_block_exception", "reason"=>"blocked by: [FORBIDDEN/12/index read-only / allow delete (api)];"})

This exact error message came up here less than a week ago. The folks in the Elasticsearch category can help debug the problem.
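For reference, Elasticsearch sets this read-only/allow-delete block on indices when the node's disk usage crosses the flood-stage watermark (95% by default in 6.x), which is the usual cause of that 403. After freeing disk space, the block can be cleared with a settings request along these lines (a sketch only: it targets all indices via _all and assumes the local cluster at http://localhost:9200; paste it into Kibana Dev Tools or send it with curl):

PUT _all/_settings
{
  "index.blocks.read_only_allow_delete": null
}

Until the block is cleared, Logstash keeps retrying the failed bulk action, which is what the last two log lines above show.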

Thank you very much, Mr. MagnusBaeck.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.