Creating Logstash Configuration

Hi,

I am new here and want to know how to separate fields and save them to Elasticsearch.

Below is one example line where the values are separated with ",".

Can anyone help me understand how I can take specific values out of this one line? For example, I want to pull the "AD_FOUND" value and the "Price Comparison" value out of the line below.

"2015-05-04 11:36:35.530,10603148080_150420070534_ADS_000000093,10603148080_150420070534_ADS_000000093,13727C2C35_101109132033_GET_000000010,GET_AND_COMMIT_AD,LAYER_8,TEXT,669f33b0dd53272139ed79ee9339a79a1c2c7892,669f33b0dd53272139ed79ee9339a79a1c2c7892,http://www.amazon.com,@KEY_WORDS@,AD_FOUND,,,adSelector,1;81,,1;21,,,1,2,TEXT,"aud;81,_sg;21,_ser;_pb;_ap;",,,,,,,,,,,,2,PC_Firefox,PC,10.60.3.14:8080,,,TB-Area,Price Comparison,Price Comparison,16,4,4,669f33b0dd53272139ed79ee9339a79a1c2c7892,,,,"

The csv filter will be helpful here.

filter {
  csv {
    columns => ["timestamp", "fieldname1", "fieldname2", ...]
  }
}
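
As a sketch for your sample line (the column names here are my assumptions): the csv filter auto-names any columns you leave out as column13, column14, and so on, so naming just the first twelve is enough to land "AD_FOUND" in a "success" field, and "Price Comparison" stays in one of the auto-named trailing columns until you extend the list.

filter {
  csv {
    columns => ["timestamp", "context_id", "opportunity_id",
                "injector_context_id", "action", "channel",
                "allowed_ad_types", "sender", "receiver", "url",
                "keywords", "success"]
  }
}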

Note that all fields will be extracted as strings. For numerical data you'll want to convert the values before passing them to Elasticsearch. Use the mutate filter for that.

filter {
  mutate {
    convert => ["name_of_integer_field", "integer"]
  }
}
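
You may also want to map the timestamp column onto the event's @timestamp with a date filter. A sketch, assuming the column is named "timestamp" as above and follows the format in your sample line:

filter {
  date {
    match => ["timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
  }
}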

Hi Magnus Bäck,

Thanks, thanks a lot. Let me try this.

Thanks and Regards
Subrata

Hi Magnus,

I have created one Logstash conf file where I am loading a CSV file, and it works properly. But the moment I add a condition like type == "xyz" to the input and filter blocks, I get issues.
Below is the conf snippet.

input {
  file {
    type => "SDR"
    path => "/tmp/AdSelectorSDR.csv"
  }

  file {
    type => "UA"
    path => "/tmp/UA.txt"
  }
}

filter {

  if [type] == "SDR" {
    csv {
      columns => ["Date","Context ID","Opportunity ID","Injector Context Id","Action","Channel","Allowed Ad Types","Sender","Receiver","URL","Keywords","Success","Message","Number of Ads","Ad Server","Audiences","Services","Site Groups","Preferences","Gender","Age In Years","Ad ID 1","Ad type 1","Opportunity data 1","Ad ID 2","Ad type 2","Opportunity data 2","Ad ID 3","Ad type 3","Opportunity data 3","STL ID","STL origin","Is Subscribed","Position Name","Application Name","CampaignID","Device","DeviceGroup","Requestor ID","Service Names","Site Group Names","Area","Campaign Name","Creative Name","Overall Duration","Fetch Adit Sub Dur","Persist Adit Sub Dur","MSISDN","Ad IT CouponID","CouponSentDate","ExternalCouponId","CouponSentStatus"]

      remove_field => ["Context ID","Opportunity ID","Injector Context Id","Action","Sender","Receiver","Keywords","Number of Ads","Ad Server","Audiences","Services","Site Groups","Preferences","Gender","Age In Years","Ad ID 1","Ad type 1","Opportunity data 1","Ad ID 2","Ad type 2","Opportunity data 2","Ad ID 3","Ad type 3","Opportunity data 3","STL ID","STL origin","Is Subscribed","Position Name","Application Name","Requestor ID","Service Names","Site Group Names","Area","Overall Duration","Fetch Adit Sub Dur","Persist Adit Sub Dur","Ad IT CouponID","CouponSentDate","ExternalCouponId","CouponSentStatus"]
    }
  }

  if [type] == "UA" {
    grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
    date { match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ] }
    geoip { source => "clientip" }
    useragent {
      source => "agent"
      target => "useragent"
    }
  }

}

Below is the exception:

===============================
May 07, 2015 2:38:35 AM org.apache.http.impl.execchain.RetryExec execute
INFO: I/O exception (org.apache.http.NoHttpResponseException) caught when processing request to {}->http://awse-2100401872.us-west-2.elb.amazonaws.com:9200: The target server failed to respond
May 07, 2015 2:38:35 AM org.apache.http.impl.execchain.RetryExec execute
INFO: Retrying request to {}->http://awse-2100401872.us-west-2.elb.amazonaws.com:9200
May 07, 2015 7:49:52 AM org.apache.http.impl.execchain.RetryExec execute
INFO: I/O exception (org.apache.http.NoHttpResponseException) caught when processing request to {}->http://awse-2100401872.us-west-2.elb.amazonaws.com:9200: The target server failed to respond
May 07, 2015 7:49:52 AM org.apache.http.impl.execchain.RetryExec execute
INFO: Retrying request to {}->http://awse-2100401872.us-west-2.elb.amazonaws.com:9200
{:timestamp=>"2015-05-08T06:36:53.731000-0400", :message=>"A plugin had an unrecoverable error. Will restart this plugin.\n Plugin: <LogStash::Inputs::Stdin >\n Error: Unknown error - Bad file descriptor", :level=>:error}

Can you please suggest what I should do?

May 07, 2015 2:38:35 AM org.apache.http.impl.execchain.RetryExec execute
INFO: I/O exception (org.apache.http.NoHttpResponseException) caught when processing request to {}->http://awse-2100401872.us-west-2.elb.amazonaws.com:9200: The target server failed to respond

Which logfile is this from? It doesn't look like a Logstash log. Either way I doubt this has anything to do with the Logstash filter configuration change. Isolate things and don't bother with Elasticsearch until your filters work. Use a stdout output for now.
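
For example, something like this in place of your elasticsearch output will print every processed event to the console:

output {
  stdout { codec => rubydebug }
}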

Thanks Magnus, I made one mistake; it's working fine now. Thanks, thanks a lot :smile:

I have one file where we write lines like "Loading Headers Done. Request info: [MSISDN: null] [User-Agent: Mozilla/5.0 (iPad; CPU OS 8_3 like Mac OS X) AppleWebKit/600.1.4 (KHTML like Gecko) Version/8.0 Mobile/12F69 Safari/600.1.4] [Source-IP: 10.186.97.207] [Domain: http://mesgmy.ebay.it/ws/eBayISAPI.dll?ViewMyMessages&&ssPageName=STRK:MEMM:LNLK&FolderId=0&CurrentPage=MyeBayMyMessages&_trksid=p3984.m2295.l3928]"

If we want to separate out values like the IP, the user agent, the site, etc., what exactly do I need to write in the filter?

I have written the code below for this. Can you please point out where I went wrong?

grok { match => { "message" => "%{COMBINEDAPACHELOG}" } }
date { match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ] }
geoip { source => "clientip" }
useragent {
  source => "agent"
  target => "useragent"
}

Hi Magnus,

If possible, kindly advise on my query above; I know it's annoying for you.

Please help me.

Thanks and Regards
Subrata

You're trying to use COMBINEDAPACHELOG but your log has a completely different format that I don't recognize. Something like this (untested) should at least not be too far off the mark:

^Loading Headers Done\. Request info: \[MSISDN: null\] \[User-Agent: (?<agent>[^\]]+)\] \[Source-IP: %{IP:sourceip}\] \[Domain: %{GREEDYDATA:url}\]$

Hi Magnus,

Thanks for your response, but I didn't have any success parsing this value.

Below is the snippet from Kibana:

_source: {"message":"","@version":"1","@timestamp":"2015-05-29T11:08:28.600Z","type":"UA","host":"0.0.0.0","path":"/tmp/UA.txt","tags":["_grokparsefailure"]}

Here it shows a _grokparsefailure tag.

Below is the configuration snippet based on what you suggested.

grok {
  match => { "message" => "^Loading Headers Done. Request info: [MSISDN: null] [User-Agent: (?<agent>[^]]+)] [Source-IP: %{IP:sourceip}] [Domain: %{GREEDYDATA:url}]$" }
}

Hi Magnus,

I think I am able to resolve the issue; I will let you know by Monday. Thanks, thanks a lot for all your advice.

Regards
Subrata

Below is the configuration snippet based on what you suggested.

grok {
  match => { "message" => "^Loading Headers Done. Request info: [MSISDN: null] [User-Agent: (?<agent>[^]]+)] [Source-IP: %{IP:sourceip}] [Domain: %{GREEDYDATA:url}]$" }
}

This isn't quite what I suggested. You have omitted the backslashes that preceded the square brackets. [ and ] really need to be \[ and \].
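
With the backslashes restored, the filter would look like this (still untested against your actual log):

filter {
  grok {
    match => { "message" => "^Loading Headers Done\. Request info: \[MSISDN: null\] \[User-Agent: (?<agent>[^\]]+)\] \[Source-IP: %{IP:sourceip}\] \[Domain: %{GREEDYDATA:url}\]$" }
  }
}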