Issues during indexing where a field is null after upgrading to 7.x

In previous versions of ES I was able to send data in from SQL Server with fields that have null values. I know I needed to make some changes to the mapping to deal with the move from 5.x to 6.x to 7.x, but I never had an issue sending in data with null-valued fields until 7.x.
The error I am getting is:

Sep 05, 2019 08:09:32 AM: Sep 05, 2019 08:09:32 AM> Batch #1 Error, throwing exception...
Sep 05, 2019 08:09:32 AM: Failed to create entry in eventlog: Log entry string is too long. A     string written to the event log cannot exceed 32766 characters.>   ERROR[sp_vDash_NewClientsPatients]: System.Exception: Invalid NEST response built from a successful (200) low level call on POST: /_bulk
# Invalid Bulk items:
operation[0]: index returned 400 _index: logstash-hvms-ncp-kema-2019 _type: _doc _id: 709de0c7-df76-436a-86a2-33849020f443 _version: 0 error: Type: mapper_parsing_exception Reason: "failed to parse field [patientprofession] of type [text] in document with id '709de0c7-df76-436a-86a2-33849020f443'. Preview of field's value: '{}'" CausedBy: "Type: illegal_state_exception Reason: "Can't get text on a START_OBJECT at 1:243""
operation[1]: index returned 400 _index: logstash-hvms-ncp-kema-2019 _type: _doc _id: 378e55bd-6861-4e63-9e37-36903ab000ca _version: 0 error: Type: mapper_parsing_exception Reason: "failed to parse field [patientprofession] of type [text] in document with id '378e55bd-6861-4e63-9e37-36903ab000ca'. Preview of field's value: '{}'" CausedBy: "Type: illegal_state_exception Reason: "Can't get text on a START_OBJECT at 1:227""

I know because my current indices that were created in 6.x have null for fields:
"status": "Active",
"patientprofession": null,
"email": null,

I know there is a null_value option to replace a null with a string, and I have tried it, but it didn't do any replacing of those nulls in the source and it still failed to index with the same error. I would still really like my data to continue to be accepted with nulls since the update to 7.x.
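For what it's worth, as far as I can tell a literal JSON null still indexes fine on a text field in 7.x; the preview in the error ('{}') is an empty object, which is what trips the parser. A minimal bulk sketch of the difference (index name is just an example):

```json
POST /_bulk
{ "index": { "_index": "test-nulls", "_id": "1" } }
{ "patientprofession": null }
{ "index": { "_index": "test-nulls", "_id": "2" } }
{ "patientprofession": {} }
```

I'd expect the first document to index without complaint (null is simply ignored for a text field), while the second fails with the same mapper_parsing_exception / "Can't get text on a START_OBJECT" error shown above.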

My mapping template
PUT _template/hvmsncp
{
  "index_patterns": "logstash-hvms-ncp-*",
  "version": 60001,
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "dynamic_templates": [
      {
        "message_field": {
          "path_match": "message",
          "match_mapping_type": "string",
          "mapping": { "norms": false, "type": "text" }
        }
      },
      {
        "string_fields": {
          "match": "*",
          "match_mapping_type": "string",
          "mapping": {
            "fields": { "keyword": { "type": "keyword" } },
            "norms": false,
            "type": "text"
          }
        }
      }
    ],
    "properties": {
      "@timestamp": { "type": "date" },
      ....
      "patientprofession": { "type": "text", "fields": { "keyword": { "type": "keyword" } } },
      ....
    }
  }
}

Thank you for your help.

Anyone have a mapping solution for dealing with null values?

Have you tried using null_value in your properties?

"patientprofession": {
   "type": "text",
   "null_value": "NULL"
}

Is that not working? https://www.elastic.co/guide/en/elasticsearch/reference/current/null-value.html

Are you using Elasticsearch .NET to send your data into ES? I have tried "null_value": "NULL" in my mappings with no success; I still get the errors from Elasticsearch .NET when trying to send in the data.

Also, that only works for strings, and I believe it indexes the string "NULL" instead of null; I need this to work for numeric and date values as well.
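One way to see the distinction the parser is complaining about: in JSON, {} and null are different values, and a client that serializes a SQL NULL (DBNull) as an empty object will produce the former. A quick check, in plain Python just to illustrate the JSON semantics:

```python
import json

# A SQL NULL serialized as JSON null vs. as an empty object
as_null = json.loads('{"patientprofession": null}')
as_object = json.loads('{"patientprofession": {}}')

print(as_null["patientprofession"])    # None -> ES sees JSON null and ignores the field
print(as_object["patientprofession"])  # {}   -> ES sees START_OBJECT and fails on a text field
```

So it may be worth checking how the layer between SQL Server and the bulk request renders NULL columns before it hits ES.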

No, I'm not using the .NET client. The null_value has to be of the same type as the original field, and yes, it just replaces null with a string; for numbers you could use -1 or 0 or whatever, but it has to be the same type as the original field.

The errors do seem wrong, though; maybe there's an issue with it in that version.
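One caveat: as far as I know, null_value isn't accepted on text fields at all, only on exact-value types like keyword, numeric, and date (and for dates it has to match the field's format). So a sketch along these lines (field names are made up) would be the shape to aim for:

```json
"properties": {
  "patientprofession": { "type": "keyword", "null_value": "NULL" },
  "visitcount":        { "type": "integer", "null_value": -1 },
  "lastvisit":         { "type": "date", "null_value": "1970-01-01" }
}
```

Also note that null_value only affects what gets indexed; the _source document still contains the original null, which is why you won't see any replacement when you look at the stored documents.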