Monitor/Log search queries in elasticsearch/kibana

Hello,

I am trying to build a pipeline that monitors user-executed search queries in Elasticsearch.

I followed this tutorial

I successfully installed Logstash, Packetbeat and Kibana. All of them are configured locally for now and I am able to access each one. I verified that a new index exists in Elasticsearch with the name
logstash-2020.04.06-000001

However, I cannot create an index pattern in Kibana to track user queries. I can see that the index exists both via the REST API and in Kibana Index Management, but it is empty. I have already tried restarting all of the services (via systemd) after configuring them.
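For reference, this is how I am checking that the index is empty (index name taken from my setup; adjust if yours differs):

```
# Ask Elasticsearch how many documents the index holds
curl -s 'http://127.0.0.1:9200/logstash-2020.04.06-000001/_count?pretty'
```

This consistently returns a count of 0, which is why Kibana refuses to create the index pattern with time-based fields.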

I am using Elasticsearch 7.6.2.

How can I fix this? Or is there another solution? (I played around with Filebeat, but nothing came of it.) I don't necessarily need the Kibana visualization, but I do need to see user queries so I can tune relevance scoring for the results.

This is my packetbeat console output:

user@host:/usr/share/packetbeat$ sudo ./bin/packetbeat -e -c /etc/packetbeat/packetbeat.yml -d "publish"
2020-04-07T15:59:50.855+0200	INFO	instance/beat.go:622	Home path: [/usr/share/packetbeat/bin] Config path: [/usr/share/packetbeat/bin] Data path: [/usr/share/packetbeat/bin/data] Logs path: [/usr/share/packetbeat/bin/logs]
2020-04-07T15:59:50.873+0200	INFO	instance/beat.go:630	Beat ID: 25a0570e-3395-4a8b-8dcf-38c19560eb44
2020-04-07T15:59:50.897+0200	INFO	[api]	api/server.go:62	Starting stats endpoint
2020-04-07T15:59:50.906+0200	INFO	[api]	api/server.go:64	Metrics endpoint listening on: 127.0.0.1:5066 (configured: localhost)
2020-04-07T15:59:50.907+0200	INFO	[seccomp]	seccomp/seccomp.go:124	Syscall filter successfully installed
2020-04-07T15:59:50.907+0200	INFO	[beat]	instance/beat.go:958	Beat info	{"system_info": {"beat": {"path": {"config": "/usr/share/packetbeat/bin", "data": "/usr/share/packetbeat/bin/data", "home": "/usr/share/packetbeat/bin", "logs": "/usr/share/packetbeat/bin/logs"}, "type": "packetbeat", "uuid": "25a0570e-3395-4a8b-8dcf-38c19560eb44"}}}
2020-04-07T15:59:50.907+0200	INFO	[beat]	instance/beat.go:967	Build info	{"system_info": {"build": {"commit": "d57bcf8684602e15000d65b75afcd110e2b12b59", "libbeat": "7.6.2", "time": "2020-03-26T05:09:32.000Z", "version": "7.6.2"}}}
2020-04-07T15:59:50.907+0200	INFO	[beat]	instance/beat.go:970	Go runtime info	{"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":8,"version":"go1.13.8"}}}
2020-04-07T15:59:50.908+0200	INFO	[beat]	instance/beat.go:974	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-04-07T11:24:50+02:00","containerized":false,"name":"host","ip":["127.0.0.1/8","::1/128","192.168.1.11/24","fe80::dacb:8aff:fe80:d3f5/64","172.17.0.1/16"],"kernel_version":"4.9.0-12-amd64","mac":["XXXXXXXXXXXXXXX"],"os":{"family":"debian","platform":"debian","name":"Debian GNU/Linux","version":"9 (stretch)","major":9,"minor":0,"patch":0,"codename":"stretch"},"timezone":"CEST","timezone_offset_sec":7200,"id":"414bf25d70c54332b8cf4d2a82ee0108"}}}
2020-04-07T15:59:50.908+0200	INFO	[beat]	instance/beat.go:1003	Process info	{"system_info": {"process": {"capabilities": {"inheritable":null,"permitted":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read"],"effective":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read"],"bounding":["chown","dac_override","dac_read_search","fowner","fsetid","kill","setgid","setuid","setpcap","linux_immutable","net_bind_service","net_broadcast","net_admin","net_raw","ipc_lock","ipc_owner","sys_module","sys_rawio","sys_chroot","sys_ptrace","sys_pacct","sys_admin","sys_boot","sys_nice","sys_resource","sys_time","sys_tty_config","mknod","lease","audit_write","audit_control","setfcap","mac_override","mac_admin","syslog","wake_alarm","block_suspend","audit_read"],"ambient":null}, "cwd": "/usr/share/packetbeat", "exe": "/usr/share/packetbeat/bin/packetbeat", "name": "packetbeat", "pid": 9679, "ppid": 9678, "seccomp": {"mode":"filter"}, "start_time": "2020-04-07T15:59:50.259+0200"}}}
2020-04-07T15:59:50.908+0200	INFO	instance/beat.go:298	Setup Beat: packetbeat; Version: 7.6.2
2020-04-07T15:59:50.908+0200	INFO	[publisher]	pipeline/module.go:110	Beat name: host
2020-04-07T15:59:50.908+0200	INFO	procs/procs.go:105	Process watcher disabled
2020-04-07T15:59:50.924+0200	INFO	[monitoring]	log/log.go:118	Starting metrics logging every 30s
2020-04-07T15:59:50.924+0200	INFO	instance/beat.go:439	packetbeat start running.
2020-04-07T16:00:20.926+0200	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":40,"time":{"ms":44}},"total":{"ticks":160,"time":{"ms":172},"value":160},"user":{"ticks":120,"time":{"ms":128}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":8},"info":{"ephemeral_id":"0634aeb9-6358-4947-a908-7043973876dc","uptime":{"ms":30132}},"memstats":{"gc_next":39455344,"memory_alloc":21608032,"memory_total":26051720,"rss":66273280},"runtime":{"goroutines":14}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"logstash"},"pipeline":{"clients":0,"events":{"active":0}}},"system":{"cpu":{"cores":8},"load":{"1":0.5,"15":0.64,"5":0.75,"norm":{"1":0.0625,"15":0.08,"5":0.0938}}}}}}
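For completeness, my packetbeat.yml follows the shape the tutorial suggests; sketched here from memory, so the interface name and exact options may differ from my actual file:

```yaml
# packetbeat.yml (sketch) -- interface and ports are assumptions
packetbeat.interfaces.device: any   # "any" (or "lo") so loopback traffic to the local ES is captured

packetbeat.protocols:
  - type: http
    ports: [9200]                   # sniff traffic going to Elasticsearch itself
    send_request: true              # include the request body, i.e. the query DSL
    send_response: false

output.logstash:
  hosts: ["127.0.0.1:5044"]
```

One thing I am unsure about: since Elasticsearch, Kibana and Packetbeat are all on the same host, the queries travel over the loopback interface, and I do not know whether the device I configured actually sees that traffic.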

This is my Logstash console output:

user@host:/usr/share/logstash$ sudo ./bin/logstash --path.settings /etc/logstash -f es-first-config.conf
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-04-07T15:58:33,546][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-04-07T15:58:33,731][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.2"}
[2020-04-07T15:58:35,916][INFO ][org.reflections.Reflections] Reflections took 33 ms to scan 1 urls, producing 20 keys and 40 values 
[2020-04-07T15:58:37,443][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2020-04-07T15:58:37,630][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2020-04-07T15:58:37,688][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-04-07T15:58:37,694][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-04-07T15:58:37,789][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2020-04-07T15:58:37,850][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-04-07T15:58:37,935][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-04-07T15:58:38,108][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-04-07T15:58:38,114][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/usr/share/logstash/es-first-config.conf"], :thread=>"#<Thread:0x7437b24 run>"}
[2020-04-07T15:58:39,246][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-04-07T15:58:39,314][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-04-07T15:58:39,388][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-04-07T15:58:39,480][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2020-04-07T15:58:39,762][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
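And es-first-config.conf is essentially the minimal Beats-to-Elasticsearch pipeline from the tutorial (again a sketch, details may differ from my actual file):

```
# es-first-config.conf (sketch)
input {
  beats {
    port => 5044            # matches the listener Logstash reports above
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]   # writes to the logstash-* rollover alias by default on 7.x
  }
}
```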
