Indexing problem when using Logstash

I am using the following config file:
filter {
  grok {
    match => [
      "message",
      "(?:?|&)C=%{DATA:kw}&%{DATA}\sT\s%{DATA:town}\sS\s%{WORD:state}\s%{DATA}%{IP:ip}"
    ]
  }
  grok {
    match => [
      "message",
      "(?:?|&)SRC=%{DATA:src}(?:&|$)"
    ]
  }
}

output {
  elasticsearch {
    host => localhost
  }
  stdout { codec => rubydebug }
}
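
To make the expectation concrete, here is a made-up line of the shape the patterns above are meant to match, together with the fields I would expect grok to extract from it (hypothetical values, not my real data):

  # hypothetical sample message:
  ?C=shoes&x=1 T Boston S MA foo192.168.1.10&SRC=google

  # fields I expect on the resulting event:
  kw    => "shoes"
  town  => "Boston"
  state => "MA"
  ip    => "192.168.1.10"
  src   => "google"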
I expected "kw", "town", "state", etc. to show up as fields in Elasticsearch. But when I try

http://localhost:9200/_search?q="town:* AND state:*"
I get

{"took":5,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":0,"max_score":null,"hits":[]}}
