Create index with the same name as request path value, using ElasticSearch output

This is my logstash.conf:

input {
	http {
		host => "127.0.0.1"
		port => 31311 
	}
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]    	
  }
  stdout {
    codec => "rubydebug"
  }
}

As a test, I ran this command in PowerShell:

C:\Users\Me\Downloads\curl-7.64.1-win64-mingw\bin> .\curl.exe -XPUT 'http://127.0.0.1:31311/twitter'

The following output was displayed inside my Logstash terminal:

{
    "@timestamp" => 2019-04-09T08:32:09.250Z,
       "message" => "",
      "@version" => "1",
       "headers" => {
           "request_path" => "/twitter",
           "http_version" => "HTTP/1.1",
        "http_user_agent" => "curl/7.64.1",
         "request_method" => "PUT",
            "http_accept" => "*/*",
         "content_length" => "0",
              "http_host" => "127.0.0.1:31311"
    },
          "host" => "127.0.0.1"
}

When I then ran

C:\Users\Me\Downloads\curl-7.64.1-win64-mingw\bin> .\curl.exe -XGET "http://127.0.0.1:9200/_cat/indices"

inside PowerShell, I saw

yellow open logstash-2019.04.09 1THStdPfQySWl1WPNeiwPQ 5 1 0 0 401b 401b

An index named logstash-2019.04.09 was created in response to my PUT request, following Logstash's default index naming convention (logstash-%{+YYYY.MM.dd}).

My question is: if I want the index to take its name from the {index_name} parameter I pass in the command .\curl.exe -XPUT 'http://127.0.0.1:31311/{index_name}', how should I configure the elasticsearch output inside my logstash.conf file?

If you take a look at the output, you can see that the [headers][request_path] field contains the path.

You can use that value to set the index name via a sprintf field reference; see https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html
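One way to do this (a sketch, not tested against your setup: the gsub strips the leading "/" from the path, since "/" is not allowed in Elasticsearch index names, and the sprintf reference in the output then uses the cleaned field):

```
filter {
  mutate {
    # "/twitter" -> "twitter": remove the leading slash so the value
    # is a valid Elasticsearch index name
    gsub => [ "[headers][request_path]", "^/", "" ]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # sprintf field reference: each event supplies its own index name
    index => "%{[headers][request_path]}"
  }
}
```

Keep in mind that Elasticsearch index names must also be lowercase, so this only works as-is if the paths you send are lowercase.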

Note: I also moved this into the logstash category.

Thank you for your reply. Here is my updated logstash.conf file:

input {
	http {
		host => "127.0.0.1"
		port => 31311 
	}
}

filter {
	mutate {
		split => ["%{headers.request_path}", "/"]
		add_field => { "index_id" => "%{headers.request_path[0]}" }
		add_field => { "document_id" => "%{headers.request_path[1]}" }
	}
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
	index => "%{index_id}"
	document_id => "%{document_id}"
  }
  stdout {
    codec => "rubydebug"
  }
}

However, when I submit a PUT request, Logstash crashes with this error message:

Invalid FieldReference: headers.request_path[0]

How can I fix this?

The same error occurs when I change the filter segment to the following:

filter {
	mutate {
		split => ["%{[headers][request_path]}", "/"]
		add_field => { "index_id" => "%{[headers][request_path][0]}" }
		add_field => { "document_id" => "%{[headers][request_path][1]}" }
	}
}

That should be

split => { "[headers][request_path]" => "/"}

Also note that with %{[headers][request_path][0]}, index_id will get set to "", since nothing precedes the first / in /twitter; the index name is at position 1.
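A quick Ruby sketch of why the leading slash shifts everything by one position (Logstash is Ruby under the hood, and the mutate split behaves like String#split; the path /twitter/1 here is just an example value for [headers][request_path]):

```ruby
path  = "/twitter/1"        # example value of [headers][request_path]
parts = path.split("/")     # => ["", "twitter", "1"]

# The leading "/" produces an empty first element, so the index name
# sits at position 1 and the document id at position 2.
index_id    = parts[1]      # "twitter"
document_id = parts[2]      # "1"
```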

Thank you for your reply. This is my updated logstash.conf file:

input {
	http {
		host => "127.0.0.1"
		port => 31311 
	}
}

filter {
	mutate {
		split => { "[headers][request_path]" => "/"}
		add_field => { "index_id" => "%{headers.request_path[1]}" }
	}
}
	
output {
  elasticsearch {
    hosts => "http://localhost:9200"
	index => "%{index_id}"
  }
  stdout {
    codec => "rubydebug"
  }
}

However, I still see an Invalid FieldReference error message:

Invalid FieldReference: headers.request_path[1]

Hello Badger, never mind, I noticed that I made an error. After changing add_field => { "index_id" => "%{headers.request_path[1]}" } to add_field => { "index_id" => "%{[headers][request_path][1]}" }, things are working now. Thank you!
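For anyone landing here later, the complete working logstash.conf after that change would presumably look like this (same host and ports as above):

```
input {
  http {
    host => "127.0.0.1"
    port => 31311
  }
}

filter {
  mutate {
    # "/twitter" -> ["", "twitter"]; position 1 holds the index name
    split => { "[headers][request_path]" => "/" }
    add_field => { "index_id" => "%{[headers][request_path][1]}" }
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "%{index_id}"
  }
  stdout {
    codec => "rubydebug"
  }
}
```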

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.