Help with Logstash file input

I am not receiving the contents of fortune.txt for my ELK implementation. This is the input section:

file {    
    path => "/etc/elasticsearch/scripts/otherScripts/fortune.txt"
    sincedb_path => "/dev/null"  
    start_position => "beginning"
    type => "fortune"
}

This is the output:

if [type] == "fortune" {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "linux-%{+yyyy.MM.dd}"
    }
    stdout { codec => rubydebug }
}

It should send the contents of the file every time I change fortune.txt, but it isn't. Can you help? Also, the permissions on fortune.txt give read access to the logstash user. I'm pretty stumped.

Changing permissions on just the fortune.txt file will not work if the logstash user does not have access to the rest of the path, which it will not have by default, since /etc/elasticsearch is owned root:elasticsearch out of the box.

I would not recommend changing any permissions on /etc/elasticsearch. You should also not put user files inside the /etc directory; it is meant for configuration files, not user files.

Move the fortune.txt file to another path where the logstash user has permission to read it and test again to see if it works.

For example, create /opt/logstash, change its ownership to the logstash user, put the fortune.txt file in that path, and see if it works.
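The steps above would look something like this (a sketch, assuming the logstash user and group already exist from the package install, and that the file currently sits at the path from the original input config):

```shell
# Create a directory the logstash user can read
sudo mkdir -p /opt/logstash
sudo chown logstash:logstash /opt/logstash
sudo chmod 750 /opt/logstash

# Move the file out of /etc/elasticsearch
sudo mv /etc/elasticsearch/scripts/otherScripts/fortune.txt /opt/logstash/

# Verify the logstash user can actually read it
sudo -u logstash cat /opt/logstash/fortune.txt
```

If that last command fails, the problem is still permissions somewhere along the path, not Logstash itself.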


Thanks for the response. I was trying the /opt/logstash and playing around with permissions, but could not get it to work. I am now just putting the fortune.txt in the main root (/) directory (so it isn't in any sub-directories) but it doesn't seem to be working either. These are the permissions/ownership on the file:

-rw-rw----. 1 root logstash 37 Oct 20 09:21 fortune.txt

This is the new input section in my Logstash config file:

file {
    path => "/fortune.txt"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "fortune"
  }

Any idea what the potential solution to this is?

If the logstash user has permission it should work.

What is the content of the file? Do you have multiple lines or just one line?

Do you have any error in the logs?

This is the content of the file (I just changed it):

Genius doesn't work on an assembly line basis.  You can't simply say,
"Today I will be brilliant."
                -- Kirk, "The Ultimate Computer", stardate 4731.3

No errors in logstash-plain.log

Yeah, it should work.

How are you running Logstash? As a service? Please restart the service and share what logstash logs in the file.

Also, do you have Kibana? If yes, what is the result of running GET linux-*/_search on Dev Tools?

It outputs info regarding the indices in linux-*.

Actually, I found entries in the error log that may point us in the right direction:

[root@ELK-Stack logstash]# cat logstash-plain.log | grep "fortune"

[2023-10-20T12:09:20,115][WARN ][logstash.outputs.elasticsearch][main][6af69ef4d05a6aa7d99c4eec8c84eb9e9b6643d49c93b6e0e40033e115e5496e] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"linux-2023.10.20", :routing=>nil}, {"message"=>"The sooner you fall behind, the more time you have to catch up.", "@version"=>"1", "@timestamp"=>2023-10-20T22:09:04.282Z, "type"=>"fortune", "path"=>"/fortune.txt", "host"=>"ELK-Stack.uhtasi.local"}], :response=>{"index"=>{"_index"=>"linux-2023.10.20", "_type"=>"_doc", "_id"=>"FF8iT4sBzjRrQHEkwpHm", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

[2023-10-20T12:09:20,407][WARN ][logstash.outputs.elasticsearch][main][6af69ef4d05a6aa7d99c4eec8c84eb9e9b6643d49c93b6e0e40033e115e5496e] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"linux-2023.10.20", :routing=>nil}, {"message"=>"e's methods.", "@version"=>"1", "@timestamp"=>2023-10-20T22:09:17.761Z, "type"=>"fortune", "path"=>"/fortune.txt", "host"=>"ELK-Stack.uhtasi.local"}], :response=>{"index"=>{"_index"=>"linux-2023.10.20", "_type"=>"_doc", "_id"=>"FV8iT4sBzjRrQHEkxJEQ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

So the issue seems to be:

"reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"

Can you help?

Yeah, this is a mapping error.

It means that your events from Logstash have a host field that is a string, like { "host": "somevalue" }, but your destination index expects it to be an object, like { "host": { "field": "somevalue" } }.

This can happen in two cases: you have an index template that maps the host field as an object, or you have no index template but previously indexed something into that same index that had the host field as an object.
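One way to check which case applies is to inspect the live mapping and any installed templates from Kibana Dev Tools (these are standard Elasticsearch APIs; the index pattern is taken from your output config):

```
GET linux-*/_mapping/field/host
GET _index_template
```

If the first request shows host mapped as an object with sub-fields, you have your answer.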

You can solve this by adding the following to your Logstash configuration after your input block:

filter {
    mutate {
        remove_field => ["host"]
    }
}
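If you would rather keep the hostname than drop it, and the index mapping follows the ECS convention where host is an object containing a name field (an assumption; check your mapping first), you can move the value instead of removing it:

```
filter {
    mutate {
        rename => { "host" => "[host][name]" }
    }
}
```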

But I recommend that you read the documentation about how mapping works.


Thanks, that filter fixed the indexing issue. However, I now have another issue to solve: each line in fortune.txt is being sent as a separate log, when the content of the file is actually this:

[root@ELK-Stack /]# cat fortune.txt
        Now she speaks rapidly.  "Do you know *why* you want to program?"
        He shakes his head.  He hasn't the faintest idea.
        "For the sheer *joy* of programming!" she cries triumphantly.
"The joy of the parent, the artist, the craftsman.  "You take a program,
born weak and impotent as a dimly-realized solution.  You nurture the
program and guide it down the right path, building, watching it grow ever
stronger.  Sometimes you paint with tiny strokes, a keystroke added here,
a keystroke changed there."  She sweeps her arm in a wide arc.  "And other
times you savage whole *blocks* of code, ripping out the program's very
*essence*, then beginning anew.  But always building, creating, filling the
program with your own personal stamp, your own quirks and nuances.  Watching
the program grow stronger, patching it when it crashes, until finally it can
stand alone -- proud, powerful, and perfect.  This is the programmer's finest
hour!"  Softly at first, then louder, he hears the strains of a Sousa march.
"This ... this is your canvas! your clay!  Go forth and create a masterwork!"

I want it all to be in the message field of one single log.

Adding a multiline codec to the Logstash input causes it not to send to Kibana/Elasticsearch

This is my input for the file:

file {
    path => "/opt/logstash/fortune.txt"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "fortune"
    codec => multiline {
      pattern => "^$" 
      negate => true 
      what => "previous"
    }
  }

Before adding the codec it was sending the text in fortune.txt every time the file changed; now nothing appears in Kibana at all.

Can anyone fix this?

You could try setting `auto_flush_interval` on the multiline codec.
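Without it, the codec holds the last event open waiting for a line that signals the start of the next one, which never arrives for the final block of the file. A sketch of the codec with the option added (the 2-second value is an assumption; tune it to your needs):

```
codec => multiline {
    pattern => "^$"
    negate => true
    what => "previous"
    auto_flush_interval => 2
}
```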

It's working better now, but in some cases it still doesn't fully work. For example, this is the full content of fortune.txt:

"Ever free-climbed a thousand foot vertical cliff with 60 pounds of gear
strapped to your butt?"
   "No."
"'Course you haven't, you fruit-loop little geek."
-- The Mountain Man, one of Dana Carvey's SNL characters
[ditto]

but this is what I received:

pounds of gear
strapped to your butt?"
   "No."
"'Course you haven't, you fruit-loop little geek."
-- The Mountain Man, one of Dana Carvey's SNL characters
[ditto]

Not exactly sure why it cut off after the "60".

Another case:

Actual file content:

I think irc isn't going to work though---we're running out of topic space!
        -- Joseph Carter

What I received:

space!
        -- Joseph Carter
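The cut-off text is consistent with the file input resuming from its previously recorded byte offset when fortune.txt is overwritten in place: `sincedb_path => "/dev/null"` only discards the offset across restarts, not while the pipeline is running, so a rewrite that changes the file's length can leave Logstash reading from the middle of the new content. One way around this (a sketch, assuming you can write each new fortune to a fresh file instead of overwriting the same one) is to glob a directory, so every new file is read from offset 0:

```
file {
    path => "/opt/logstash/fortunes/*.txt"
    sincedb_path => "/dev/null"
    start_position => "beginning"
    type => "fortune"
    codec => multiline {
        pattern => "^$"
        negate => true
        what => "previous"
        auto_flush_interval => 2
    }
}
```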
