Logstash Grok pattern

My grok pattern isn't working.
Previously the grok pattern created the fields that I added to the pattern, but now the fields are not being created.
Please help me out with this.

Here is my grok pattern:

grok {
  match => ["message", "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}"]
}

Here is my log sample:

2020-10-08 10:57:14.8837 [12964] ERROR DeveloperExceptionPageMiddleware bhavin 192.168.43.244 http://bhavin/favicon.ico GET An unhandled exception has occurred while executing the request.System.InvalidOperationException: No authenticationScheme was specified, and there was no DefaultChallengeScheme found. The default schemes can be set using either AddAuthentication(string defaultScheme) or AddAuthentication(Action<AuthenticationOptions> configureOptions).

Hi,

I just tried your Grok pattern in the Kibana Grok debugger and it works:

{
  "method": "GET",
  "level": "ERROR",
  "logger": "DeveloperExceptionPageMiddleware",
  "message": "An unhandled exception has occurred while executing the request.System.InvalidOperationException: No authenticationScheme was specified, and there was no DefaultChallengeScheme found. The default schemes can be set using either AddAuthentication(string defaultScheme) or AddAuthentication(Action<AuthenticationOptions> configureOptions).",
  "processId": "12964",
  "requestUrl": "http://bhavin/favicon.ico",
  "clientIp": "192.168.43.244",
  "user": "bhavin",
  "timestamp": "20-10-08 10:57:14.8837"
}
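
If you want to double-check outside the debugger, here is a minimal throwaway pipeline (just a sketch, with your pattern copied in) that reads a line from stdin and prints the parsed event:

input { stdin {} }

filter {
  grok {
    # Same pattern as above; paste the sample log line on stdin to test it
    match => ["message", "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}"]
  }
}

output { stdout { codec => rubydebug } }

You can run it with bin/logstash -f test.conf (the file name is just a placeholder) and paste the sample line; the parsed fields, or a _grokparsefailure tag if the match fails, show up immediately.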

Maybe the grok pattern is not the problem? You said that the fields are not created - but are the messages stored in Elasticsearch at all?
Can you post the complete Logstash pipeline?
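
One more thing to check: when a grok match fails, Logstash adds a _grokparsefailure tag to the event by default. As a sketch, you could temporarily route only those events to the console to see whether that is what happens:

output {
  # Sketch: print only events whose grok match failed
  if "_grokparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
}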

Best regards
Wolfram

Hi, here is my pipeline.conf:

input {
  beats {
    type => "mytest"
    port => 5044
  }
}

filter {
  if [fields][log_type] == "gbase" {
    if [level] in [ "Error", "Fatal" ] {
      grok {
        match => ["message", "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}"]
      }
    } else {
      grok {
        match => ["message", "%{DATESTAMP:timestamp}  \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{GREEDYDATA:message}"]
      }
    }
  }
  if [fields][log_type] == "finance" {
    if [level] in [ "Error", "Fatal" ] {
      grok {
        match => ["message", "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}"]
      }
    } else {
      grok {
        match => ["message", "%{DATESTAMP:timestamp}  \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{GREEDYDATA:message}"]
      }
    }
    date {
      match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
      target => "@timestamp"
    }
  }
}

output {
  if [fields][log_type] == "finance" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "finance-%{+YYYY.MM.dd}"
      user => "something"
      password => "something"
    }
  }
  if [fields][log_type] == "gbase" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "gbase-%{+YYYY.MM.dd}"
      user => "something"
      password => "something"
    }
  }
  stdout { codec => rubydebug }
}

Are you sure that the conditions are correct? Where do the fields level and fields.log_type come from?
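
One thing I noticed: unless Filebeat itself ships a level field with the event, the condition if [level] in [ "Error", "Fatal" ] is evaluated before the grok filter that would create level, so it can never be true. (Also note that the sample line contains ERROR in upper case, while the condition tests for "Error".) A minimal sketch (only one way to restructure it, with your two patterns copied in) that lets grok try both patterns in order instead of branching on level:

filter {
  if [fields][log_type] == "gbase" {
    grok {
      # grok tries the patterns in order and stops at the first match
      # (break_on_match defaults to true), so the more specific error
      # pattern with requestUrl and method comes first.
      match => { "message" => [
        "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{URI:requestUrl} %{USER:method} %{GREEDYDATA:message}",
        "%{DATESTAMP:timestamp} \[%{WORD:processId}\] %{LOGLEVEL:level} %{USERNAME:logger} %{USER:user} %{IPV4:clientIp} %{GREEDYDATA:message}"
      ] }
    }
  }
}

The same would apply to the finance branch.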

Yeah, I'm sure. After all that, the index was created and the grok pattern was picked based on the level.