Error in template


(Aadarsh Kumayan) #1

Hi, I have a problem. I am trying to run Logstash and it generates the following error:

:message=>"Error: Expected one of #, input, filter, output at line 34, column 1 (byte 855) after ", :level=>:error}

I downloaded the default JSON template but I am not able to identify the error.
My full configuration file:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
{
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": true,
        "norms": {
          "enabled": false
        }
      },
      "dynamic_templates": [
        {
          "template1": {
            "mapping": {
              "doc_values": true,
              "ignore_above": 1024,
              "index": "not_analyzed",
              "type": "{dynamic_type}"
            },
            "match": "*"
          }
        }
      ],
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "message": {
          "type": "string",
          "index": "analyzed"
        },
        "offset": {
          "type": "long",
          "doc_values": "true"
        },
        "geoip": {
          "type": "object",
          "dynamic": true,
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  },
  "settings": {
    "index.refresh_interval": "5s"
  },
  "template": "filebeat-*"
}
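For reference, Logstash can syntax-check a config directory before starting, which reproduces exactly this kind of parse error. A sketch, assuming a Logstash 2.x package install where the binary lives at /opt/logstash/bin/logstash and the flag is --configtest (newer versions use --config.test_and_exit):

```
# Parse everything Logstash would load from conf.d without starting the pipeline
/opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/
```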

(Christian Dahlqvist) #2

Why do you have mappings in your config file? If you want to specify mappings and have Logstash manage them, you should place the mappings in a separate file and instruct the elasticsearch output plugin to use it.
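For example (a sketch, assuming the template file is moved to /etc/logstash/filebeat-index-template.json; template, template_name, and manage_template are options of the elasticsearch output plugin):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    manage_template => true
    template => "/etc/logstash/filebeat-index-template.json"
    template_name => "filebeat"
  }
}
```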


(Aadarsh Kumayan) #3

I followed the installation steps on DigitalOcean. I ran the following commands:

 curl -O https://gist.githubusercontent.com/thisismitch/3429023e8438cc25b86c/raw/d8c479e2a1adcea8b1fe86570e42abab0f10f364/filebeat-index-template.json
 curl -XPUT 'http://localhost:9200/_template/filebeat?pretty' -d@filebeat-index-template.json

I got this output:

 {
  "acknowledged" : true
 }

And the JSON template was generated.

I have specified the mappings in a different file. I concatenated all four files under conf.d to post them here.
When Logstash runs, it combines all the files in your config directory into one file.

02-beats-input.conf
10-syslog-filter.conf
30-elasticsearch-output.conf
filebeat-index-template.json


(Christian Dahlqvist) #4

Move the template file out of the config directory, as it cannot be parsed as config by Logstash.
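A quick way to spot a stray template in the config directory is to check which files parse as JSON, since valid Logstash pipeline config never does. A small sketch (the helper is mine, not part of Logstash):

```python
import json
from pathlib import Path

def find_json_files(conf_dir):
    """Return files in conf_dir that parse as JSON.

    Logstash concatenates everything in its config directory, so a
    stray index template there produces exactly the kind of parse
    error seen above. Any file that loads cleanly as JSON is almost
    certainly a template, not pipeline config.
    """
    stray = []
    for path in sorted(Path(conf_dir).iterdir()):
        if not path.is_file():
            continue
        try:
            json.loads(path.read_text())
        except (ValueError, UnicodeDecodeError):
            continue  # does not parse as JSON: normal Logstash config
        stray.append(path)
    return stray
```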


(Aadarsh Kumayan) #5

After removing the template:
On your ELK Server, verify that Elasticsearch is indeed receiving the data by querying for the Filebeat index with this command:

curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty'

I am receiving this output:

{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 0,
    "successful" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 0,
    "max_score" : 0.0,
    "hits" : [ ]
  }
}

(Christian Dahlqvist) #6

Once you have uploaded the template to the cluster, which you did with your second curl request, it will apply to all matching indices.
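You can confirm the template is installed on the cluster (rather than sitting in Logstash's config) with a plain GET against the template API:

```
curl -XGET 'http://localhost:9200/_template/filebeat?pretty'
```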


(Aadarsh Kumayan) #7

(Same query and same zero-hit output as in my previous post.)

When the template was present, I was receiving data.


(Aadarsh Kumayan) #8

Should the mappings file contain a header in its configuration like the other three files?
The beats file has input as its header,
the syslog file has filter,
and the elasticsearch file has output,
but the mappings file has no header.


(Abinstephen1989) #9

The error you mentioned
one of #, input, filter, output at line 34, column 1 (byte 855) after ", :level=>:error}

is it resolved?


(Aadarsh Kumayan) #10

nope


(Abinstephen1989) #11

"mappings": {
"default": {
"_all": {
"enabled": true,
"norms": {
"enabled": false
}
}

Can you try removing the space at the start of _all?


(Aadarsh Kumayan) #12

I mistakenly put the extra space in while formatting on the website; there is no extra space in my configuration file.

Still not working, need help guys.


(Magnus Bäck) #13

This is a confusing thread. What does your configuration file look like now?


(Aadarsh Kumayan) #14

Thanks, man, it worked. The mapping file should not be present in the Logstash configuration folder.
Also, the Ubuntu logs had rotated to auth.log.1, but I had configured it for auth.log.
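A glob in the Filebeat prospector paths covers rotated files like auth.log.1 as well. A sketch, assuming the Filebeat 1.x YAML layout used in that guide:

```
filebeat:
  prospectors:
    -
      paths:
        - /var/log/auth.log*   # matches auth.log, auth.log.1, ...
        - /var/log/syslog
      input_type: log
      document_type: syslog
```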


(system) #15

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.