Hi,
I am using the ELK stack for log centralisation. I have written an Ansible role (logstash-forwarder) which helps my applications send logs to ELK using a certificate. My ELK currently supports lumberjack. One of my applications is hosted on Iron.io, from where I want to send logs to ELK. I have updated both Elasticsearch and Logstash to version 2.0, so now I want to configure my input file to accept http and tcp input. My objective: my Iron.io app will send logs to my ELK using http/tcp/udp, Logstash will convert the data, and the logs will be shown in Kibana.
My input file:
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  http {
    port => 8080
    user => "myuser"
    password => "******"
  }
}
Do I need to make any changes to my output file?
My Output file:
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
Please also let me know how I can conduct a test (sending a log from my local machine).
I'm not sure what you're trying to accomplish. There are multiple problems here:
Your curl command looks like an attempt to post directly to Elasticsearch, but you're actually posting to Logstash. What's the purpose of your http input?
The payload you post isn't valid JSON (because of the =>).
You're specifying the username and password as part of the payload; they need to be part of the HTTP header.
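To illustrate the two fixes, here's a short Python sketch (the myuser/secret credentials are placeholders): it builds a valid JSON body and shows the Authorization header that curl's -u myuser:secret option generates.

```python
import base64
import json

# A valid JSON payload: "=>" is Logstash config syntax, not JSON.
payload = json.dumps({
    "post_date": "2015-10-29T15:20:34",
    "message": "test event",
})

# Credentials belong in the HTTP Authorization header, not the body.
# This is exactly what curl's -u myuser:secret option produces:
auth_header = "Basic " + base64.b64encode(b"myuser:secret").decode("ascii")

print(payload)      # {"post_date": "2015-10-29T15:20:34", "message": "test event"}
print(auth_header)  # Basic bXl1c2VyOnNlY3JldA==
```

With those two fixes, the curl invocation would pass the credentials via `-u myuser:secret` and send a plain JSON object as the body.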
I have an Ubuntu server where I am running the ELK stack (I have installed Elasticsearch, Logstash & Kibana on a single server). I am using logstash-forwarder in some of my applications. They are properly sending logs to Logstash and I can see the logs in Kibana.
My problem is that one of my applications runs on the Iron.io infrastructure. It can send logs using http, so I enabled the http plugin on my ELK server. I want the logs sent from the Iron.io application over http to show up in Kibana.
This is my input config file:
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  http {
    port => 5555
    user => "myuser"
    password => "******"
  }
}
Please provide me a valid request format so that I can send logs to my ELK server using curl. Port 5555 is open to all on my ELK server, so please at least let me know how I can send a curl request from my local machine that will show up in Kibana. If I can send from local, then I will be able to configure it in the Iron.io application.
Objective:
I will send a log to ELK using http from my local or another machine -> Logstash will process the http request -> it will be shown in Kibana.
I want to keep the lumberjack plugin in my input file, as it is being used by the servers that send logs via logstash-forwarder. Besides that, I want to add http functionality.
Skip the authentication requirements (i.e. drop the user and password options for the http input) until you've verified that things work. Then authenticate properly (pass -u myuser:***** to curl).
I suggest you leave Elasticsearch out of the equation for now and use a stdout { codec => rubydebug } output to see what the events look like. It's quite possible that you want to process them in some manner before passing them off to ES.
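For reference, a minimal test configuration along those lines might look like this (a sketch, assuming the http input listens on port 5555 as in your config):

```
input {
  http {
    port => 5555
  }
}
output {
  stdout { codec => rubydebug }
}
```

Run it with something like `bin/logstash -f test.conf` and watch the parsed events appear on stdout as you curl the port.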
I followed the process you told me to do. I have generated a certificate on my ELK server that I am using to verify the clients sending logs via logstash-forwarder. But for the http plugin, in the input section, I have not provided any authentication.
My current Input file:
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  http {
    port => 5555
  }
}
and I am sending a request using the command below:
curl -H "content-type: application/json" -XPUT 'http://elk.startjobs.net:5555' -d '{
"post_date" : "2015-10-29T15:20:34",
"message" : "Nowshad Amin" }' -v
I tried it from the ELK server using the command:
curl -H "content-type: application/json" -XPUT 'http://elk.startjobs.net:5555' -d '{
"post_date" : "2015-10-29T15:20:34","message" : "Nowshad_amin" }' -v
Response:
* Connected to elk.startjobs.net (172.32.0.157) port 5555 (#0)
> PUT / HTTP/1.1
> User-Agent: curl/7.35.0
> Host: elk.example.net:5555
> Accept: */*
> content-type: application/json
> Content-Length: 66
>
* upload completely sent off: 66 out of 66 bytes
< HTTP/1.1 401 Unauthorized
< Content-Type: text/plain
< WWW-Authenticate: Basic realm=""
< Content-Length: 0
<
* Connection #0 to host elk.example.net left intact
Please take into consideration that for lumberjack I am using keys, but for http I am not using any keys and have no authentication in my input file. I can send logs using lumberjack but get a 401 response when sending logs using http.
input {
  lumberjack {
    port => 5000
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  http {
    port => 5555
  }
}
Can I use multiple input plugins in a single input file? Since I have enabled a certificate for my lumberjack input, I am assuming it is creating a problem for tcp, though I have not enabled any authentication for http or tcp. Every time I try to curl to ELK from the ELK server or an external server using http or tcp, I get 401 Unauthorized. Please help me with your expert opinion.
Can I use multiple input plugins in a single input file?
Logstash handles multiple inputs just fine. You can have more than one input per configuration file and you can have multiple configuration files. The inputs can be of the same type or of different types. The one thing you should be careful about is having two identically configured inputs, i.e. two inputs listening on the same port or reading from the same file.
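For example, one file can declare both inputs side by side, as long as the ports differ (a sketch mirroring your setup):

```
input {
  lumberjack {
    port => 5000
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
  http {
    port => 5555
  }
}
```

The TLS settings on the lumberjack input apply only to that input; they have no effect on what the http input on port 5555 accepts.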
Since I have enabled a certificate for my lumberjack input, I am assuming it is creating a problem for tcp, though I have not enabled any authentication for http or tcp.
As mentioned before, make sure you're actually connecting to Logstash in your curl commands. For example, if you shut down Logstash, are subsequent connections refused? Could there be some kind of proxy that intercepts the connections?
It's obviously very easy to check the hypothesis of the lumberjack input disturbing the http input; just start a Logstash instance with only an http input (and a stdout { codec => rubydebug } output) and see how that behaves. You can also try different port numbers to see if they behave differently.
I have tried a different port for each input and got an error every time. My ELK server is elk.startjobs.net, which is running fine and collecting logs from the servers that use logstash-forwarder.
I have set up Elasticsearch, Logstash and Kibana on the same Ubuntu server.
For lumberjack we provide ssl_certificate and ssl_key; can we somehow provide certificates and keys for http as well?
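As far as I recall, the http input plugin of that era can serve TLS too, but it takes a Java keystore rather than the PEM ssl_certificate/ssl_key pair the lumberjack input uses; option names vary between plugin versions, so check the documentation of your installed logstash-input-http. A sketch (the keystore path and password are placeholders):

```
input {
  http {
    port => 5555
    ssl => true
    keystore => "/etc/pki/tls/private/logstash-http.jks"
    keystore_password => "changeit"
  }
}
```

Note that this encrypts the transport; it is separate from the user/password options, which control HTTP basic authentication.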