MSSQL datetime column parsing: correct mapping format and date filter?

I am importing a MS SQL table with a datetime column.

The stored values look like this:
2008-02-18 00:00:00.000
2009-04-28 00:00:00.000
2000-05-03 00:00:00.000
...

By default the jdbc input plugin does not detect this column type as a date but as text, so I am trying to use a date filter to convert the values into dates.

I created this mapping in Elasticsearch:

    PUT /testgandalf
    {
         "mappings": {
          "indexname": {
            "properties": {
             "date_visa_demande": {
                "type": "date",
                "format": "yyyy-MM-dd HH:mm:ss.SSS"
              }
            }
          }
         }
    }

And I wrote this filter in my conf file:

filter {
  date {
    match => [ "date_visa_demande", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

When I launch my import I get error messages of these types:
..."error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date_visa_demande]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2008-04-08 00:00:00 +0200\" is malformed at \" +0200\""}}}}}...
...
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date_visa_demande]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2009-01-02 00:00:00 +0100\" is malformed at \" +0100\""}}}}}
...

Do you know how I should set the mapping format and the date filter correctly?

  • You don't need any special mapping. Elasticsearch will detect the timestamp produced by the date filter as a timestamp.
  • If you want the date filter to store the parsed timestamp in the date_demande field you need to say so by adding target => "date_demande" (see the sketch after this list).
  • What's the difference between the date_demande and date_visa_demande fields? Should you keep the latter field? Or convert it too?
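
For example, a minimal sketch (the field name is taken from your mapping above, and it assumes the field really arrives as a string in that exact format):

filter {
  date {
    # Parse the MSSQL-style string and store the result back into the
    # same field instead of the default @timestamp.
    match  => [ "date_visa_demande", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "date_visa_demande"
  }
}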

First, thank you for your answer.
Sorry for the column name, it was a mistake; I have fixed it.

So what should I write in the filter to target a MSSQL datetime column called "date_visa_demande"?
I am a beginner and I don't really see how it works.

I'm not sure what you're asking. Please post your current configuration and what you're getting in the Logstash logs.

OK, I am starting from the beginning:

I have a Microsoft SQL table with a column of type datetime.

By default, when I run a Logstash import, Logstash detects Microsoft SQL columns as the text type.

This is my Logstash file:

input {
  jdbc {
    jdbc_connection_string => "jdbc:sqlserver://***.***.***.***;databaseName=myDatabaseName;"
    jdbc_user => "user"
    jdbc_password => "pass"
    jdbc_driver_library => "C:\Program Files (x86)\sqljdbc_6.0\enu\sqljdbc42.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    statement => "SELECT DATE_VISA_DEMANDE FROM FILM WHERE DATE_VISA_DEMANDE IS NOT NULL"
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test_view"
    document_type => "film_infos"
  }
}

Logstash output in the console:

[[main]<jdbc] INFO logstash.inputs.jdbc - (0.109000s) SELECT
DATE_VISA_DEMANDE
FROM FILM WHERE DATE_VISA_DEMANDE IS NOT NULL
16:48:04.817 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
16:48:04.818 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x5bab2d11 URL:http://localhost:9200>, :healthcheck_path=>"/"}
16:48:06.145 [[main]-pipeline-manager] WARN logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x5bab2d11 URL:http://localhost:9200>}
16:48:06.146 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Using mapping template from {:path=>nil}
16:48:06.867 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001,
"settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false},
"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}},
{"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}],
"properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true,
"properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
16:48:06.980 [[main]-pipeline-manager] INFO logstash.outputs.elasticsearch - New Elasticsearch output
{:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["localhost:9200"]}
16:48:06.986 [[main]-pipeline-manager] INFO logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
16:48:06.988 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
16:48:07.378 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
16:48:10.005 [LogStash::Runner] WARN logstash.agent - stopping pipeline {:id=>"main"}

This is the resulting mapping:

   "test_view": {
      "mappings": {
         "film_infos": {
            "properties": {
               "@timestamp": {
                  "type": "date"
               },
               "@version": {
                  "type": "text",
                  "fields": {
                     "keyword": {
                        "type": "keyword",
                        "ignore_above": 256
                     }
                  }
               },
               "date_visa_demande": {
                  "type": "text",
                  "fields": {
                     "keyword": {
                        "type": "keyword",
                        "ignore_above": 256
                     }
                  }
               }
            }
         }
      }
   }

Do you know what to do to get a date type in Elasticsearch, please?

Use a date filter to parse the date_visa_demande field. If you don't want the resulting timestamp stored in the @timestamp field, use the date filter's target option.

Use a stdout { codec => rubydebug } output to verify that this is working for you (see the sketch below). Then you can start worrying about the ES mappings. ES should map the field in question as a timestamp, but keep in mind that existing mappings are never changed. You probably have to reindex.
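
A minimal sketch of such an output section (you can keep it alongside your elasticsearch output while debugging):

output {
  # Print every event to the console so you can see exactly which
  # fields Logstash produces before they reach Elasticsearch.
  stdout { codec => rubydebug }
}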

Sorry, but I am not sure I understand how to use the filter.

As you can see above, in my initial question I wrote:

filter {
  date {
    match => [ "date_visa_demande", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

And it failed.
What should I write to make it work?

The error you got earlier is from ES, not the date filter. Try that configuration and use a stdout output to inspect the results.

I wrote:

filter {
  date {
    target => "date_visa_demande"
  }
}

output {
  stdout { codec => rubydebug }
}

And I get this error output:

20:10:27.975 [LogStash::Runner] ERROR logstash.agent - fetched an invalid config {:config=>"input {\n jdbc {\n\t\tjdbc_connection_string => "jdbc:sqlserver://...;databaseName=TEST;"\n jdbc_user => "user"\n\t\t jdbc_password => "mypass"\n jdbc_driver_library => "C:\Program Files (x86)\sqljdbc_6.0\enu\sqljdbc42.jar"\n jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"\n statement => " SELECT \n\t\t date_visa_demande \n\t\tFROM FILM WHERE date_visa_demande IS NOT NULL "\n }\n}\n\nfilter\n{\n\t date {\n\t\ttarget => "date_visa_demande"\n\t}\n}\n\noutput {\n stdout { codec => rubydebug }\n}\n", :reason=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register"}

I am not sure I understand what is happening.

Why did you leave out the match option in your date filter?

Sorry.

So I wrote:

filter {
  date {
    target => "date_visa_demande"
    match  => [ "date_visa_demande", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}

And it now outputs:

17:34:03.012 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
{
    "date_visa_demande" => #<Java::JavaSql::Timestamp:0x60512ac0>,
           "@timestamp" => 2017-01-18T16:34:03.112Z,
             "@version" => "1",
                 "tags" => [
        [0] "_dateparsefailure"
    ]
}
{
    "date_visa_demande" => #<Java::JavaSql::Timestamp:0x86f0865>,
           "@timestamp" => 2017-01-18T16:34:03.112Z,
             "@version" => "1",
                 "tags" => [
        [0] "_dateparsefailure"
    ]
}
{
    "date_visa_demande" => #<Java::JavaSql::Timestamp:0x2981e344>,
           "@timestamp" => 2017-01-18T16:34:03.112Z,
             "@version" => "1",
                 "tags" => [
        [0] "_dateparsefailure"
    ]
}

...
It seems Logstash fails to parse the date_visa_demande column.

Please edit your post and make sure the log snippet is formatted as preformatted text. Use the toolbar button. Why? Because I'm pretty sure Logstash didn't say

"date_visa_demande" => #,

but rather

"date_visa_demande" => #<something something>,

and I want to see what's inside the angle brackets.

Always post configuration and logs as preformatted text.

ok done

Okay, so the field contains a java.sql.Timestamp object. Then you can use a ruby filter to convert it, e.g. to an epoch value that the date filter understands. This probably works:

ruby {
  # getTime returns the timestamp as milliseconds since the epoch.
  code => "event.set('[@metadata][epoch]', event.get('date_visa_demande').getTime)"
}
date {
  # Parse the epoch milliseconds and store the result back into the
  # original field instead of @timestamp.
  match => ["[@metadata][epoch]", "UNIX_MS"]
  target => "date_visa_demande"
}

So, we convert the field into milliseconds since the epoch, store that value in the [@metadata][epoch] field, then parse that field with the date filter and store the result back into date_visa_demande.
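
Put together, the filter section of the conf file would look something like this (a sketch, assuming date_visa_demande is never null, which the WHERE clause in your query guarantees):

filter {
  ruby {
    # Convert the java.sql.Timestamp to epoch milliseconds.
    code => "event.set('[@metadata][epoch]', event.get('date_visa_demande').getTime)"
  }
  date {
    match  => ["[@metadata][epoch]", "UNIX_MS"]
    target => "date_visa_demande"
  }
}

A nice side effect of using [@metadata] is that metadata fields are never sent to the outputs, so the intermediate epoch value won't end up in Elasticsearch.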
