I have a set of documents created by filebeat -> logstash and pushed to Elasticsearch, and they look like this...
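(This is the result of a plain document GET, something along the lines of the command below; the endpoint host is redacted the same way as in the update calls further down.)

curl -X GET "https://search-BLBLLALLA.us-east-1.es.amazonaws.com/sub_myapp_prod-filebeat-7.17.7-2023.04/_doc/IPLahocBBfkGcvN800_A?pretty"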
{
"_index": "sub_myapp_prod-filebeat-7.17.7-2023.04",
"_type": "_doc",
"_id": "IPLahocBBfkGcvN800_A",
"_version": 1,
"_seq_no": 2057636,
"_primary_term": 1,
"found": true,
"_source": {
"logger": "myapp-sub",
"host": {
"name": "fdc3fd32fc5f"
},
"mymessage": "[application] GMBHelper::fetchReviews response has no totalReviewCount. ...",
"stream": "stderr",
"message": "NOTICE: PHP message: [2023-04-15 22:38:18] myapp-sub.ERROR: [application] GMBHelper::fetchReviews response has no totalReviewCount. ...",
"@timestamp": "2023-04-15T21:38:18.907Z",
"agent": {
"id": "f84f19b3-6802-44bd-a096-0fd5d67ae3de",
"hostname": "fdc3fd32fc5f",
"name": "fdc3fd32fc5f",
"type": "filebeat",
"ephemeral_id": "f4081a93-5739-4bda-9270-87e93e7c22b5",
"version": "7.17.7"
},
"timestamp": "2023-04-15 22:38:18",
"ecs": {
"version": "1.12.0"
},
"tags": [
"beats_input_codec_plain_applied"
],
"type": "monolog",
"input": {
"type": "container"
},
"@version": "1",
"log": {
"offset": 33227987478,
"file": {
"path": "/var/lib/docker/containers/0bd7aaa3ae0a585bbce5307579272c3654e50b590b3cac74eed902f4c80a4055/0bd7aaa3ae0a585bbce5307579272c3654e50b590b3cac74eed902f4c80a4055-json.log"
}
},
"docker": {
"container": {
"labels": {
"io_rancher_environment_uuid": "1bad0e96-1ad3-4e6b-9522-3971998e163f",
"co_elastic_logs/enabled": "true",
"io_rancher_container_ip": "10.42.47.100/16",
"io_rancher_stack_service_name": "submyappprod/php",
"io_rancher_container_name": "submyappprod-php-1",
"io_rancher_container_mac_address": "02:7e:2e:57:19:30",
"io_rancher_service_hash": "c01a241978e86b870e84e99a2d301e88aa1bdf6d",
"io_rancher_cni_wait": "true",
"io_rancher_cni_network": "ipsec",
"io_rancher_project_name": "submyappprod",
"io_rancher_project_service_name": "submyappprod/php",
"io_rancher_container_uuid": "0ea6a19a-c899-4d4d-82e2-404674310d89",
"io_rancher_stack_name": "submyappprod",
"io_rancher_service_deployment_unit": "66aa975f-9827-4076-a943-eaec10ec5e39",
"io_rancher_service_launch_config": "io.rancher.service.primary.launch.config"
}
}
},
"level": "ERROR",
"container": {
"id": "0bd7aaa3ae0a585bbce5307579272c3654e50b590b3cac74eed902f4c80a4055",
"name": "r-submyappprod-php-1-0ea6a19a",
"image": {
"name": "registry.host.com/user/sub-myapp/php:BLABLA"
}
}
}
}
The important fields are in the _source section. Now when I try to send an update, I try a few different things. First I remove all the underscore-prefixed meta fields (_index, _id, _version, etc.). I try both keeping the structure inside _source and, for instance, just changing the _source -> message content, and I get this result:
$ curl -X POST https://search-BLBLLALLA.us-east-1.es.amazonaws.com/sub_myapp_prod-filebeat-7.17.7-2023.04/_doc/IPLahocBBfkGcvN800_A/_update -H "Content-Type: application/json" -d "$JSON_CNT"
{"error":{"root_cause":[{"type":"parsing_exception","reason":"Unknown key for a VALUE_STRING in [logger].","line":3,"col":15}],"type":"x_content_parse_exception","reason":"[3:15] [UpdateRequest] failed to parse field [_source]","caused_by":{"type":"parsing_exception","reason":"Unknown key for a VALUE_STRING in [logger].","line":3,"col":15}},"status":400}
I also try specifying just the logger field, i.e. I send only this in the body:
{
"logger": "anotherValueForLogger"
}
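(In case the shell quoting matters: $JSON_CNT is just a variable holding that body, set along these lines before the call.)

JSON_CNT='{
  "logger": "anotherValueForLogger"
}'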
However, when I run the curl request I get:
curl -X POST https://search-BLBLLBLB.us-east-1.es.amazonaws.com/sub_myapp_prod-filebeat-7.17.7-2023.04/_doc/IPLahocBBfkGcvN800_A/_update -H "Content-Type: application/json" -d "$JSON_CNT"
{"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"[2:3] [UpdateRequest] unknown field [logger]"}],"type":"x_content_parse_exception","reason":"[2:3] [UpdateRequest] unknown field [logger]"},"status":400}
What am I missing? How can I update this document when Kibana shows it all well and good, i.e. all the fields are displayed in Kibana (outside _source) when viewing in table format? I am lost, as I don't get why Elasticsearch does not like the way the update is done.