Failed to create index


(Gobinda Nandi) #1

Using Ubuntu 16.04.

rev1.conf file

input {
  file {
    path            => "/home/skills34/es/templetes/test1.txt"
    start_position  => "beginning"
    sincedb_path    => "/dev/null"
    codec  => "json"        
  }
}
filter { }

output {
  elasticsearch {
    hosts => ["http://localhost:9200/"]
    template => "/home/skills34/es/templetes/datamobi.json"
    template_name => "datamobi"
    template_overwrite => true
    index => "datamobi_in_11_2"
  }
}
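Before chasing the missing index, it can help to confirm that the pipeline file itself parses. A small sketch (not something done in this thread) using Logstash's standard config-test flag, with the paths used above:

```shell
# Validate the pipeline definition without actually starting it.
# --config.test_and_exit (short form: -t) is a standard Logstash flag.
sudo /usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/rev1.conf
```

Logstash prints "Configuration OK" on success and a parse error otherwise.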

datamobi.json file

{
  "index_patterns": [
    "datamobi_in_11*"
  ],
  "settings": {
    "index": {
      "number_of_shards": "2",
      "number_of_replicas": "1"
    }
  },
  "mappings": {
    "doc" : {
      "dynamic": "false",
      "properties" : {
        "reviewId": { "type": "text" },
        "displayName": { "type": "text" },
        "profilePhotoUrl": { "type": "text" },
        "comment": { "type": "text" },
        "name": { "type": "text" },
        "createTime": { "type": "date" },
        "updateTime": { "type": "date" }
      }
    }
  },
  "aliases": {}
}

test1.txt file

{"reviewId":"test","displayName":"Afrid Khan","profilePhotoUrl":"dfdf","comment":"Good working environment","createTime":"2018-11-21T11:38:54.366207Z","updateTime":"2018-11-21T11:39:11.527644Z","name" : "dnfd"}

On the terminal I ran this command:

sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/rev1.conf

response:

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2018-12-06 12:15:29.138 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2018-12-06 12:15:29.153 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.5.1"}
[INFO ] 2018-12-06 12:15:31.528 [Converge PipelineAction::Create] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2018-12-06 12:15:31.991 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[INFO ] 2018-12-06 12:15:31.999 [[main]-pipeline-manager] elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[WARN ] 2018-12-06 12:15:32.194 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://localhost:9200/"}
[INFO ] 2018-12-06 12:15:32.371 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2018-12-06 12:15:32.375 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2018-12-06 12:15:32.398 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200/"]}
[INFO ] 2018-12-06 12:15:32.419 [Ruby-0-Thread-5: :1] elasticsearch - Using mapping template from {:path=>"/home/skills34/es/templetes/datamobi.json"}
[INFO ] 2018-12-06 12:15:32.428 [Ruby-0-Thread-5: :1] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>["datamobi_in_11*"], "settings"=>{"index"=>{"number_of_shards"=>"2", "number_of_replicas"=>"1"}}, "mappings"=>{"doc"=>{"dynamic"=>"false", "properties"=>{"reviewId"=>{"type"=>"text"}, "displayName"=>{"type"=>"text"}, "profilePhotoUrl"=>{"type"=>"text"}, "comment"=>{"type"=>"text"}, "name"=>{"type"=>"text"}, "createTime"=>{"type"=>"date"}, "updateTime"=>{"type"=>"date"}}}}, "aliases"=>{}}}
[INFO ] 2018-12-06 12:15:32.588 [Ruby-0-Thread-5: :1] elasticsearch - Installing elasticsearch template to _template/datamobi
[INFO ] 2018-12-06 12:15:32.759 [Converge PipelineAction::Create] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x605bc3d4 sleep>"}
[INFO ] 2018-12-06 12:15:32.814 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2018-12-06 12:15:32.820 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2018-12-06 12:15:33.176 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9601}

Kibana response

GET datamobi_in_11_2/_search

{
  "error" : {
    "root_cause" : [
      {
        "type" : "index_not_found_exception",
        "reason" : "no such index",
        "resource.type" : "index_or_alias",
        "resource.id" : "datamobi_in_11_2",
        "index_uuid" : "_na_",
        "index" : "datamobi_in_11_2"
      }
    ],
    "type" : "index_not_found_exception",
    "reason" : "no such index",
    "resource.type" : "index_or_alias",
    "resource.id" : "datamobi_in_11_2",
    "index_uuid" : "_na_",
    "index" : "datamobi_in_11_2"
  },
  "status" : 404
}
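An `index_not_found_exception` like this means nothing was ever written to `datamobi_in_11_2`. Two quick checks against the cluster (a sketch assuming Elasticsearch on localhost:9200, as in the rest of the thread) show whether any index or the template actually exists:

```shell
# List every index: did Logstash create datamobi_in_11_2 at all?
curl -s 'http://localhost:9200/_cat/indices?v'

# Confirm the template is installed (the Logstash log above claims it is).
curl -s 'http://localhost:9200/_template/datamobi?pretty'
```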

That's all. Please help. It was creating the index initially, then it suddenly stopped and I don't know what happened.


(Harsh Bajaj) #2

Hi @inandi

If you analyze the logs you posted, they clearly mention that Logstash is not able to find the YML file and the log4j file.

If you installed Logstash through RPM or YUM, then start it with systemctl, or please let me know if you used another approach to install Logstash.

Thanks,
Harsh Bajaj


(Gobinda Nandi) #3

Thanks for replying.

How do I get rid of it? Maybe it's some config issue.

For the installation of Logstash I used only these three commands:

1. sudo apt-get update
2. sudo apt-get install logstash
3. sudo systemctl restart logstash

Is there anything I am missing? I am new to this platform.


(Harsh Bajaj) #5

Hi @inandi,

Sorry for my previous reply. I observed that Logstash started properly.

Please try with the Elasticsearch URL as below and restart the Logstash service.


(Gobinda Nandi) #6

Thanks for replying.

That's my confusion: how do I do that? Is it in the /etc/logstash/logstash.yml file?

logstash.yml file

# Settings file in YAML
#
# Settings can be specified either in hierarchical form, e.g.:
#
#   pipeline:
#     batch:
#       size: 125
#       delay: 5
#
# Or as flat keys:
#
#   pipeline.batch.size: 125
#   pipeline.batch.delay: 5
#
# ------------  Node identity ------------
#
# Use a descriptive name for the node:
#
# node.name: test
#
# If omitted the node name will default to the machine's host name
#
# ------------ Data path ------------------
#
# Which directory should be used by logstash and its plugins
# for any persistent needs. Defaults to LOGSTASH_HOME/data
#
path.data: /var/lib/logstash
path.setting: /etc/logstash
#
# ------------ Pipeline Settings --------------
#
# The ID of the pipeline.
#...#############################.COMMENTED DEFAULT CODE.........................................
#   * info (default)
#   * debug
#   * trace
#
# log.level: info
path.logs: /var/log/logstash
#
# ------------ Other Settings --------------
#
# Where to find custom plugins
# path.plugins: []
#
# ------------ X-Pack Settings (not applicable for OSS build)--------------
#
# X-Pack Monitoring
# https://www.elastic.co/guide/en/logstash/current/monitoring-logstash.html
#xpack.monitoring.enabled: false
#xpack.monitoring.elasticsearch.username: logstash_system
#xpack.monitoring.elasticsearch.password: password
#xpack.monitoring.elasticsearch.url: ["https://es1:9200", "https://es2:9200"]
#xpack.monitoring.elasticsearch.ssl.ca: [ "/path/to/ca.crt" ]
#xpack.monitoring.elasticsearch.ssl.truststore.path: path/to/file
#xpack.monitoring.elasticsearch.ssl.truststore.password: password
#xpack.monitoring.elasticsearch.ssl.keystore.path: /path/to/file
#xpack.monitoring.elasticsearch.ssl.keystore.password: password
#xpack.monitoring.elasticsearch.ssl.verification_mode: certificate
#xpack.monitoring.elasticsearch.sniffing: false
#xpack.monitoring.collection.interval: 10s
#xpack.monitoring.collection.pipeline.details.enabled: true
#
# X-Pack Management
# https://www.elastic.co/guide/en/logstash/current/logstash-centralized-pipeline-management.html
#xpack.management.enabled: false
#xpack.management.pipeline.id: ["main", "apache_logs"]
#xpack.management.elasticsearch.username: logstash_admin_user
#xpack.management.elasticsearch.password: password
#xpack.management.elasticsearch.url: ["https://es1:9200", "https://es2:9200"]
#xpack.management.elasticsearch.ssl.ca: [ "/path/to/ca.crt" ]
#xpack.management.elasticsearch.ssl.truststore.path: /path/to/file
#xpack.management.elasticsearch.ssl.truststore.password: password
#xpack.management.elasticsearch.ssl.keystore.path: /path/to/file
#xpack.management.elasticsearch.ssl.keystore.password: password
#xpack.management.elasticsearch.ssl.verification_mode: certificate
#xpack.management.elasticsearch.sniffing: false
#xpack.management.logstash.poll_interval: 5s

(Harsh Bajaj) #7

Please ignore this one and check my latest reply.


(Gobinda Nandi) #8

You mean:

hosts => ["http://127.0.0.1:9200"]


(Harsh Bajaj) #9

Yes, please try it and confirm.


(Gobinda Nandi) #10

Thanks for replying.

It is working with both localhost and the IP, but I think I have an issue with my JSON data.

If I update my test1.txt file with this data:

{"timestamp":"1524635328682","bidRequest":{"id":"550D288074CBCB09A01C59D3473DDB6D","imp":[{"id":"1","banner":{"w":320,"h":50,"id":"1","pos":"UNKNOWN","btype":["IFRAME"],"battr":["POP","ANNOYING"],"topframe":false,"wmax":320,"hmax":50},"displaymanager":"third_party_sdk","displaymanagerver":"3.0","instl":false,"bidfloor":0.637338,"bidfloorcur":"USD","secure":false}],"app":{"id":"246228_4-4793","name":"Women Saree Photo Making","domain":"silvermob.com","cat":["IAB1"],"bundle":"com.formationapps.womensaree","publisher":{"id":"48757"},"storeurl":"https://play.google.com/store/apps/details?id=com.formationapps.womensaree"},"device":{"dnt":false,"ua":"Mozilla/5.0 (Linux; Android 6.0.1; SM-T580 Build/MMB29K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/66.0.3359.106 Safari/537.36","ip":"88.179.216.6","geo":{"lat":13.88273873222566,"lon":76.59120195170897,"country":"UK","type":"GPS_LOCATION"},"dpidsha1":"DA38637008C08832341EC8DED77CA2AAE9715276","dpidmd5":"577DBDDED8C396BE3E0E130C477621C6","carrier":"WIFI","make":"Samsung","model":"SM-T580","os":"Android","osv":"6.0","js":true,"connectiontype":"WIFI","devicetype":"TABLET","ifa":"CA42A60D-558A-4443-8B88-57D1C3C3BD0D"},"user":{"id":"CA42A60D-558A-4443-8B88-57D1C3C3BD0D"},"at":"SECOND_PRICE","tmax":300,"cur":["USD"],"bcat":["IAB24","IAB25","IAB26"]},"http":{"requestUrl":"http://mobfox-display.rtb.adx1.com/display","headers":{"host":"mobfox-display.rtb.adx1.com","xRealIp":"54.209.102.201","contentType":"application/json","xOpenrtbVersion":"2.3","accept":"/","acceptEncoding":"gzip"}},"organizationId":3,"user":{"platormId":"device.ifa:ca42a60d-558a-4443-8b88-57d1c3c3bd0d","externalId":"ca42a60d-558a-4443-8b88-57d1c3c3bd0d"},"sspId":320,"enrichment":{"geo":{"country":"UK","region":"78","city":"Poigny-la-Foret","cityID":0,"timeZoneOffset":-6.72549154E8,"locationCombinations":["UK","UK|78"]},"os":"Android 6","browser":"Chrome 66","carrier":"","device":"Tablet Samsung SM-T580","cellular":false},"domain":"com.formationapps.womensaree"}

it still shows the previous data, which was:

{"reviewId":"test","displayName":"Afrid Khan","profilePhotoUrl":"dfdf","comment":"Good working environment","createTime":"2018-11-21T11:38:54.366207Z","updateTime":"2018-11-21T11:39:11.527644Z","name" : "dnfd"}

I don't know what's happening. Is there any particular format required for JSON data to be pushed to ES?
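With `codec => "json"` on a file input, each line of the file is parsed as one standalone JSON document (newline-delimited JSON), so every line has to be valid JSON on its own. A quick sketch for checking that, using only the shell and Python's stdlib (the path is the one from this thread):

```shell
# With codec => "json", every line must parse as standalone JSON.
# python3 -m json.tool exits non-zero on invalid input.
while IFS= read -r line; do
  if printf '%s' "$line" | python3 -m json.tool > /dev/null 2>&1; then
    echo "valid:   $line"
  else
    echo "INVALID: $line"
  fi
done < /home/skills34/es/templetes/test1.txt
```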


(Harsh Bajaj) #11

Hi @inandi,

could you please explain a bit more what you are trying to do.

Regards,
Harsh Bajaj


(Gobinda Nandi) #12

Hi,

I am trying to save data from a txt file (test1.txt) to ES using Logstash. This is the sample data:

{"reviewId":"gobinda-ji--fhbrgfhbrjhfbrhjfbjrb-jiw","displayName":"Afrid Khan","profilePhotoUrl":"https://lh3.googleusercontent.com/-10fmENVabqs/AAAAAAAAAAI/AAAAAAAAAAA/9KS-xHs0S8s/photo.jpg","comment":"Good working environment","createTime":"2018-11-21T11:38:54.366207Z","updateTime":"2018-11-21T11:39:11.527644Z","name":"accounts/116351880697164468264/locations/9166534068115432035/reviews/AIe9_BEV1VP1woYvuPRixSBHlWg4qMiG68WXeAKdH2aw2RQXAHj7rvrnSHc5BlTIv4Ubkmang6deLon2HuhM4KPfSPz7WsjH-lXIyBHLAeFAUfWYOwx37Zs"}

But now I am able to push data to ES using a supernatural trick: I just add an "enter" (a new line) after the data and it works. Remove the new line and it stops working.

{"reviewId":"gobinda-ji--fhbrgfhbrjhfbrhjfbjrb-jiw","displayName":"Afrid Khan","profilePhotoUrl":"https://lh3.googleusercontent.com/-10fmENVabqs/AAAAAAAAAAI/AAAAAAAAAAA/9KS-xHs0S8s/photo.jpg","comment":"Good working environment","createTime":"2018-11-21T11:38:54.366207Z","updateTime":"2018-11-21T11:39:11.527644Z","name":"accounts/116351880697164468264/locations/9166534068115432035/reviews/AIe9_BEV1VP1woYvuPRixSBHlWg4qMiG68WXeAKdH2aw2RQXAHj7rvrnSHc5BlTIv4Ubkmang6deLon2HuhM4KPfSPz7WsjH-lXIyBHLAeFAUfWYOwx37Zs"}
entered new line here

spooky day !!! :sleepy:
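The "enter" trick is expected behavior rather than supernatural: the file input tails the file line by line and only emits an event once it sees a line terminator, so a JSON document without a trailing newline just sits in the buffer waiting for more bytes. A minimal sketch of appending an event the way the file input expects it (the /tmp path is a stand-in, not the real one from this thread):

```shell
# printf '%s\n' guarantees a trailing newline, so the file input sees a
# complete line and can emit the event immediately.
EVENT='{"reviewId":"test","comment":"Good working environment"}'
printf '%s\n' "$EVENT" >> /tmp/test1.txt

# Sanity check: `tail -c 1 | wc -l` prints 1 when the file's last byte
# is a newline, 0 when it is not.
tail -c 1 /tmp/test1.txt | wc -l
```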


(Gobinda Nandi) #13

One last thing: how do I resolve these warnings?

Thank you very much for listening to me and replying. God bless @harshbajaj16
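Regarding the two startup warnings: they typically appear because the binary is being run directly, so it does not know where the settings directory is. The standard `--path.settings` flag (named in the warning itself) points Logstash at /etc/logstash so it can find logstash.yml and log4j2.properties:

```shell
# Point Logstash at its settings directory so it picks up logstash.yml
# and log4j2.properties instead of falling back to defaults.
sudo /usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash \
  -f /etc/logstash/conf.d/rev1.conf
```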