Logstash ERROR filewatch.tailmode.handlers.grow

Hi, I set up reading MySQL logs from NFS using the multiline codec and a grok filter.
The first problem is that Logstash does not read the file after rotation; restarting does not help, it still does not seem to see the file. There was also an error about the heap size, but that no longer seems to appear. JVM config:

-Xms8g
-Xmx8g

File input config:

input {
  file {
    path => "/home/dir/logs/mysql-master_slow.log"
    codec => multiline {
      pattern => "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601:time}"
      negate => true
      what => "previous"
      auto_flush_interval => 60
    }
  }
}
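(For context: with negate => true and what => "previous", every line that does not match the # Time: header is appended to the previous event, so each slow-log entry becomes one multiline event.)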

The second problem appeared today:

[2022-09-30T09:05:23,550][ERROR][filewatch.tailmode.handlers.grow][slowlog][05f681df29f8ebd8d5df33285b91122f77a58bce1285e132e6aa879d8fa95b63] controlled_read general error reading {:path=>"/home/dir/logs/mysql-master_slow.log", :exception=>Java::JavaLang::NegativeArraySizeException, :message=>"-1796390041"}


There are a lot of these errors. Can anyone help?

Try with this:

    file {
        path => "/home/dir/logs/mysql-master_slow.log"
        codec => multiline {
            pattern => "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601}"
            negate => true
            what => "previous"
            auto_flush_interval => 60
        }
        start_position => "beginning"
        sincedb_path => "/path/file-sincedb.db"
        mode => "tail"
    }
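An explicit sincedb_path gives the file input a persistent record of how far each file has been read, which helps it resume correctly after restarts and rotation. Note that start_position => "beginning" only applies to files Logstash has not seen before, and that sincedb_path takes a plain string, not an array.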

NegativeArraySizeException is a Java runtime exception thrown when an application attempts to create an array with a negative size. Try the pattern without the :time capture, as in the example above.

If it still doesn't work, paste a few lines from mysql-master_slow.log.

Hello, I added this config:

   start_position => "beginning"
   sincedb_path => "/var/lib/logstash/plugins/inputs/file/.sincedb_ec67afd1dee253d7ccda120a4fa59e62"
   mode => "tail"

I still get this error:

[FATAL][org.logstash.Logstash    ]
java.lang.OutOfMemoryError: Java heap space

After that, Logstash restarts.
Here are a few lines from mysql-master_slow.log:

# Time: 2022-10-01T11:10:17.781340Z
# User@Host: user[user] @  [192.168.100.1]  Id: 325513
# Query_time: 0.000017  Lock_time: 0.000000 Rows_sent: 0  Rows_examined: 0
SET timestamp=1664742234;
SHOW WARNINGS;

Logstash reads lines like these from a file that is typically around 200 GB in size each day.
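One setting that may be worth checking at this volume (an assumption on my part, not something confirmed in this thread): the multiline codec buffers lines in memory until the next # Time: header arrives, and its max_lines (default 500) and max_bytes (default "10 MiB") options exist precisely to bound that buffer; oversized events are truncated and tagged. A sketch with the caps written out explicitly:

codec => multiline {
  pattern => "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601}"
  negate => true
  what => "previous"
  auto_flush_interval => 60
  # Bound a single buffered event; when exceeded, the event is truncated
  # and tagged (e.g. multiline_codec_max_lines_reached).
  max_lines => 500
  max_bytes => "10 MiB"
}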

In case someone needs it... To read the multiline mysql-master_slow.log file, use this filter:

filter {
  # remove the multiline tag added by the codec
  mutate {
    remove_field => ["tags"]
  }

  # split out the header fields, skipping the Query_time line
  grok {
    break_on_match => false
    match => { "message" => "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601:time}%{GREEDYDATA}User@Host:%{SPACE}%{USERNAME:f_name}\[%{DATA:s_name}\]%{SPACE}@%{SPACE}%{DATA:hostname}%{SPACE}\[(%{DATA:ip})?\]%{SPACE}Id:%{SPACE}%{NUMBER:user_id}%{GREEDYDATA}Rows_examined:%{SPACE}%{NUMBER}(\r\n|\r|\n)%{GREEDYDATA:query}" }
  }

  # convert to date and save in @timestamp
  date {
    match => ["time", "ISO8601"]
    remove_field => ["time"]
    timezone => "Europe/Berlin"
  }

  # extract fields from queries
  grok {
    break_on_match => false
    tag_on_failure => "_grokqueryfailure"
    match => {
      "query" => [
        "ua2_.phone='%{DATA:ua2_phone}'",
        "ua2_.email='%{DATA:ua2_email}'",
        "%{SPACE}mobile_number='%{DATA:mobile_number}'",
        "%{SPACE}email='%{DATA:email}'",
        "%{DATA}"
      ]
    }
  }
}
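A quick way to test this filter locally (a sketch: stdin/stdout and the rubydebug codec are standard Logstash plugins, and the short auto_flush_interval here is only for interactive testing): paste a slow-log entry into stdin and inspect the parsed fields.

input {
  stdin {
    codec => multiline {
      pattern => "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
      # short flush interval so the last pasted entry is emitted quickly
      auto_flush_interval => 5
    }
  }
}

# ... the filter block from above goes here ...

output {
  # print each parsed event with all extracted fields
  stdout { codec => rubydebug }
}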

It is also possible to read the log line by line:

grok {
  break_on_match => false
  match => {
    "message" => [
      "^#%{SPACE}Time:%{SPACE}%{TIMESTAMP_ISO8601:time}",
      "^#%{SPACE}User@Host:%{SPACE}%{USERNAME:f_name}\[%{DATA:s_name}\]%{SPACE}@%{SPACE}%{DATA:hostname}%{SPACE}\[(%{DATA:ip})?\]%{SPACE}Id:%{SPACE}%{NUMBER:user_id}",
      "^#%{SPACE}Query_time:%{SPACE}%{BASE16FLOAT:query_time}%{SPACE}Lock_time:%{SPACE}%{BASE16FLOAT:lock_time}%{SPACE}Rows_sent:%{SPACE}%{NUMBER:rows_sent}%{SPACE}Rows_examined:%{SPACE}%{NUMBER:rows_examined}",
      "^(?!#)%{GREEDYDATA:query}"
    ]
  }
}
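If the numeric fields will be used for sorting or aggregations downstream, they can be cast with a mutate filter (a sketch; the field names follow the grok captures above, which otherwise remain strings):

mutate {
  convert => {
    # cast grok string captures to numeric types
    "query_time"    => "float"
    "lock_time"     => "float"
    "rows_sent"     => "integer"
    "rows_examined" => "integer"
  }
}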
