Parsing logback log files with filebeat and sending them to Elasticsearch

Hi, I have a Java application that uses logback for its log configuration, and I want to parse my application's log files so they become more useful, then send them to ES. The log files are stored in a directory called logs, and this is the log entry format used with logback:

%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n

I need to know the simplest way to do this (including the configuration file to use). I thought I could do it with Filebeat, but Filebeat cannot parse log entries.

Thank you.

You can use Filebeat to ship the data to Logstash where you can apply a grok filter to parse the log line. Then ship the data to Elasticsearch from Logstash. The Beats documentation has an example of how to configure Filebeat to send to Logstash.
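As a rough sketch of what that Logstash pipeline could look like for the logback pattern above (the field names `thread`, `level`, `logger`, and `msg` are my own choices, and the beats port 5044 is just the conventional default, not anything from this thread):

```conf
input {
  beats { port => 5044 }
}
filter {
  grok {
    # Matches e.g.: 2016-05-06 14:47:34.126 [main] INFO  com.datcom.fouras.FourasApplication - message
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:level}\s+%{DATA:logger} - %{GREEDYDATA:msg}" }
  }
  date {
    # Use the parsed timestamp as the event's @timestamp
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```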

In Elasticsearch version 5 (there's an alpha release now) there is a new feature called Ingest Node that enables you to grok the message without the use of Logstash. See https://www.elastic.co/guide/en/elasticsearch/reference/master/grok-processor.html
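For reference, an Ingest Node pipeline for the same pattern might look roughly like this (a sketch against the 5.0 alpha API; the pipeline name and field names are my own, so check the grok-processor docs linked above for the exact syntax in your release):

```json
PUT _ingest/pipeline/fouras-logs
{
  "description": "parse logback log lines",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} \\[%{DATA:thread}\\] %{LOGLEVEL:level}\\s+%{DATA:logger} - %{GREEDYDATA:msg}"]
      }
    }
  ]
}
```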

I tried this configuration file with filebeat:

filebeat:
	prospectors:
		-
		 paths:
         	- /home/fathi/IdeaProjects/fouras/logs*.log
         input_type: log
         fields:
        	context: fouras
        	version: 0
         document_type: fouras_logs
         scan_frequency: 120s
         multiline:
         	pattern: ^\[
    registry_file: /var/lib/filebeat/registry
    output:
    	elasticsearch:
    		hosts: ["localhost:9200"]
    		index: "fouras_logs"
  shipper:
  logging:
    files:
      path: /var/log/mybeat
      rotateeverybytes: 10485760 # = 10MB

But it's not working.

Your indentation is wrong, and because it's YAML, that matters. Also, tabs are not allowed in YAML files.

Try this:

filebeat:
  registry_file: /var/lib/filebeat/registry
  prospectors:
    - paths:
        - /home/fathi/IdeaProjects/fouras/logs*.log
      input_type: log
      fields:
        context: fouras
        version: 0
      document_type: fouras_logs
      scan_frequency: 120s
      multiline:
        pattern: ^\[

output:
  elasticsearch:
    hosts: ["localhost:9200"]
    index: "fouras_logs"

shipper:

logging:
  to_files: true
  files:
    path: /var/log/filebeat
    rotateeverybytes: 10485760 # = 10MB

Sorry, the indentation only looks wrong because I copied it from /etc/filebeat/filebeat.yml; the original file is correctly indented:

http://www.mediafire.com/download/hporwhls4av6ppw/filebeat.yml

Actually, I modified that existing configuration file /etc/filebeat/filebeat.yml and restarted the filebeat service. That is how to configure Filebeat, right?

Can you elaborate on the problem? Are there errors?

The config file you uploaded looks OK.

I modified the configuration file and started the filebeat service using sudo service filebeat start, and I got * filebeat is running

But it seems like filebeat is not picking up the configuration file; even the log files under /var/log/mybeat were not created.

Increase the logging level. By default it's set to error, and the file only gets created when something is logged.

logging:
  level: debug     # or info if you want less verbose output
  to_files: true
  files:
    path: /var/log/filebeat
    rotateeverybytes: 10485760 # = 10MB

I got this:

* Restarting Sends log files to Logstash or directly to Elasticsearch. filebeat                                                                                2016/05/06 13:24:14.630402 beat.go:135: DBG  Initializing output plugins
2016/05/06 13:24:14.630485 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/05/06 13:24:14.630603 client.go:265: DBG  ES Ping(url=http://localhost:9200, timeout=1m30s)
2016/05/06 13:24:14.646024 client.go:274: DBG  Ping status code: 200
2016/05/06 13:24:14.646119 outputs.go:119: INFO Activated elasticsearch as output plugin.
2016/05/06 13:24:14.646172 publish.go:232: DBG  Create output worker
2016/05/06 13:24:14.646296 publish.go:274: DBG  No output is defined to store the topology. The server fields might not be filled.
2016/05/06 13:24:14.646416 publish.go:288: INFO Publisher name: fathi-HP-Pavilion-g6-Notebook-PC
2016/05/06 13:24:14.647094 async.go:78: INFO Flush Interval set to: 1s
2016/05/06 13:24:14.647162 async.go:84: INFO Max Bulk Size set to: 50
2016/05/06 13:24:14.647213 async.go:92: DBG  create bulk processing worker (interval=1s, bulk size=50)
2016/05/06 13:24:14.647400 beat.go:147: INFO Init Beat: filebeat; Version: 1.1.2
                                                                         [ OK ]

And nothing on the Elasticsearch side.

Is there anything in your registry file at /var/lib/filebeat/registry? You should probably delete it before each test to ensure that filebeat re-ships logs that it has already read.

I expect a lot more output in the log file if you are running with level: debug. Is the above output from the log?

/var/lib/filebeat/registry contains {}

The log file content is just: filebeat

The only error is this:

ERR Stop Harvesting. Unexpected encoding line reader error: unknown matcher type:

A few lines of my log files to be processed using Filebeat:

[2016-05-06 14:47:33.808 [main] DEBUG o.s.w.c.s.StandardServletEnvironment - Adding [servletConfigInitParams] PropertySource with lowest search precedence]
[2016-05-06 14:47:33.837 [main] DEBUG o.s.w.c.s.StandardServletEnvironment - Adding [servletContextInitParams] PropertySource with lowest search precedence]
[2016-05-06 14:47:33.837 [main] DEBUG o.s.w.c.s.StandardServletEnvironment - Adding [systemProperties] PropertySource with lowest search precedence]
[2016-05-06 14:47:33.837 [main] DEBUG o.s.w.c.s.StandardServletEnvironment - Adding [systemEnvironment] PropertySource with lowest search precedence]
[2016-05-06 14:47:33.837 [main] DEBUG o.s.w.c.s.StandardServletEnvironment - Initialized StandardServletEnvironment with PropertySources [servletConfigInitParams,servletContextInitParams,systemProperties,systemEnvironment]]
[2016-05-06 14:47:34.126 [main] INFO  com.datcom.fouras.FourasApplication - Starting FourasApplication on fathi-HP-Pavilion-g6-Notebook-PC with PID 13441 (/home/fathi/IdeaProjects/fouras/target/classes started by fathi in /home/fathi/IdeaProjects/fouras)]
[2016-05-06 14:47:34.127 [main] INFO  com.datcom.fouras.FourasApplication - No active profile set, falling back to default profiles: default]
[2016-05-06 14:47:35.620 [main] INFO  o.s.b.c.e.AnnotationConfigEmbeddedWebApplicationContext - Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@1e730495: startup date [Fri May 06 14:47:35 CET 2016]; root of context hierarchy]
[2016-05-06 14:47:39.468 [main] INFO  o.s.b.f.s.DefaultListableBeanFactory - Overriding bean definition for bean 'beanNameViewResolver' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=org.springframework.boot.autoconfigure.web.ErrorMvcAutoConfiguration$WhitelabelErrorViewConfiguration; factoryMethodName=beanNameViewResolver; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/boot/autoconfigure/web/ErrorMvcAutoConfiguration$WhitelabelErrorViewConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=org.springframework.boot.autoconfigure.web.WebMvcAutoConfiguration$WebMvcAutoConfigurationAdapter; factoryMethodName=beanNameViewResolver; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/boot/autoconfigure/web/WebMvcAutoConfiguration$WebMvcAutoConfigurationAdapter.class]]]

You are missing the match key in your multiline config. For example:

multiline:
    pattern: ^\[
    negate: true
    match: after

Reference: https://www.elastic.co/guide/en/beats/filebeat/current/configuration-filebeat-options.html#multiline
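To see why match and negate matter, here is a small Python sketch (my own illustration, not Filebeat code) of what pattern: ^\[ with negate: true and match: after does: any line that does not start with [ is appended after the previous event instead of starting a new one.

```python
import re

# Rough emulation of Filebeat's multiline merging with:
#   pattern: ^\[
#   negate: true
#   match: after
# A line that does NOT match the pattern is a continuation and is
# appended to the previous event.
pattern = re.compile(r"^\[")

def merge_multiline(lines):
    events = []
    for line in lines:
        if pattern.match(line) or not events:
            events.append(line)           # line matches -> new event
        else:
            events[-1] += "\n" + line     # no match -> continuation
    return events

# Hypothetical input: a log entry followed by a stack trace, then a new entry.
lines = [
    "[2016-05-06 14:47:39.468 [main] ERROR com.datcom.fouras.FourasApplication - boom]",
    "java.lang.RuntimeException: boom",
    "\tat com.datcom.fouras.FourasApplication.main(FourasApplication.java:42)",
    "[2016-05-06 14:47:40.000 [main] INFO  com.datcom.fouras.FourasApplication - recovered]",
]

events = merge_multiline(lines)
print(len(events))  # the stack trace folds into the first event -> 2
```

Without negate and match, Filebeat doesn't know which side of the matching line the continuation belongs to, which is why the harvester stops with the "unknown matcher type" error above.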

In the next major version we are working on improving configuration validation so that we can catch these errors with -configtest.


It's working! I had forgotten the negate and match keys. I now have:

multiline:
  pattern: ^\[
  negate: true
  match: after

instead of:

multiline:
  pattern: ^\[
Thank you very much, I appreciate it!

Hi Andrew and Jemli,

I am a newbie to Filebeat and the Elastic Stack.
I am also trying to do something similar to you, Jemli.

Where did you specify the grok expression for parsing the Java log entries?
Likewise, don't we need to specify the transformation to JSON?
I assume that we need something like:

output:
  elasticsearch:
    # Array of hosts to connect to.
    hosts: ["localhost:9200"]
    template.name: "filebeat"
    template.path: "filebeat.template.json"