How to push a timestamp field (e.g. 01AUG2014:19:03:00) and a location field (e.g. 21700000000000) to Kibana 4 and make ES map them accordingly?

I want to push data I have in my Hadoop cluster to ES and then visualize
the whole thing in Kibana.

This is what I've done:

CREATE TABLE xx(traffic_type_id INT, caller INT, time STRING,
tranche_horaire INT, called INT, call_duration INT, code_type_trafic
STRING, code_destination_trafic STRING, location_number STRING, id_offre
INT, id_service INT)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

LOAD DATA INPATH '/user/hive/outt.csv' OVERWRITE INTO TABLE xx;


CREATE EXTERNAL TABLE esxx (caller INT, time STRING, tranche INT,
called_number INT, duration INT, code_type STRING, code_destination STRING,
location STRING, offre INT, service INT)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'xx/xx',
'es.nodes' = '192.168.238.130:9200',
'es.mapping.names' = 'time:@timestamp');

INSERT OVERWRITE TABLE esxx SELECT s.caller, s.time, s.tranche_horaire,
s.called, s.call_duration, s.code_type_trafic, s.code_destination_trafic,
s.location_number, s.id_offre, s.id_service FROM xx s;
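
One idea I am considering (a sketch only; it assumes Hive's built-in
unix_timestamp(string, pattern) can parse the English month abbreviation
'AUG') is to convert the time string to ISO 8601 during the INSERT, since
Elasticsearch's default date detection (dateOptionalTime) should recognize
that form:

INSERT OVERWRITE TABLE esxx
SELECT s.caller,
       -- parse '01AUG2014:19:03:00' and re-emit it as ISO 8601,
       -- e.g. '2014-08-01T19:03:00'
       from_unixtime(unix_timestamp(s.time, 'ddMMMyyyy:HH:mm:ss'),
                     "yyyy-MM-dd'T'HH:mm:ss") AS time,
       s.tranche_horaire,
       s.called,
       s.call_duration,
       s.code_type_trafic,
       s.code_destination_trafic,
       s.location_number,
       s.id_offre,
       s.id_service
FROM xx s;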


I also tried declaring time as TIMESTAMP in the external table:

CREATE EXTERNAL TABLE xx (
caller INT,
time TIMESTAMP,
tranche INT,
called_number INT,
duration INT,
code_type STRING,
code_destination STRING,
location STRING,
offre INT,
service INT)
STORED BY 'org.elasticsearch.hadoop.hive.EsStorageHandler'
TBLPROPERTIES('es.resource' = 'xx/xx',
'es.nodes' = '192.168.238.130:9200',
'es.mapping.names' = 'time:@timestamp');

But Kibana doesn't seem to recognize my timestamp "time"; ES keeps
mapping it as a string (the time field in my CSV file looks like this:
01AUG2014:19:03:00).
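
For reference, I can inspect the mapping ES generated with:

curl -XGET 'http://192.168.238.130:9200/xx/_mapping'

and that is where time shows up typed as string.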

And the location field looks like this: 21700000000000.

What should I change so that ES applies the appropriate mapping and
recognizes my timestamp and geo_location?

Should I change the field in the CSV file, or is another solution
available?
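
One option I could try, if changing the CSV is not required, is to create
the index with an explicit mapping before running the INSERT, so ES never
has to guess. A sketch, assuming Elasticsearch 1.x, the index/type xx/xx
from above, and that the Joda-style date format accepts the uppercase
month abbreviation; note also that geo_point expects a "lat,lon" pair, so
my location values would first have to be converted into real coordinates:

# hypothetical: create the index with an explicit mapping up front
curl -XPUT 'http://192.168.238.130:9200/xx' -d '{
  "mappings": {
    "xx": {
      "properties": {
        "@timestamp": { "type": "date", "format": "ddMMMyyyy:HH:mm:ss" },
        "location":   { "type": "geo_point" }
      }
    }
  }
}'

With a mapping like that in place, the Hive INSERT would only have to
deliver values that actually match those types.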

Best regards,
Omar


I have managed to change my time field to a recognizable format using
Talend.
