I was trying to create a small working sample for you, and in the process I have become a bit more lost than before. Essentially, if I create a fresh index then things actually work, but after any later change the fields I expect to be dates start coming in as strings. I am at a loss for words.
A small CSV file called tester.csv as the source of data:
12225,2015-10-15T11:07:39.776Z
33342,2016-12-11T11:01:22.454Z
A simple pipeline called demo_pipeline to break up the CSV:
PUT _ingest/pipeline/demo_pipeline
{
  "description": "demo pipeline",
  "processors": [
    {
      "split": {
        "field": "message",
        "separator": ",",
        "target_field": "splitdata"
      }
    },
    {
      "script": {
        "lang": "painless",
        "source": """
                  ctx.ID = ctx.splitdata[0];
                  ctx.PC_Local_Time_2 = ctx.splitdata[1];
                  """
      }
    },
    {
      "date": {
        "field": "PC_Local_Time_2",
        "target_field": "PC_Local_Time_2", 
        "formats": ["yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"]
      }
    },
    {
      "remove": {
        "ignore_missing": true, 
        "field": [
          "splitdata"
        ]
      }
    }
  ]
}
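For reference, the pipeline as actually stored in the cluster can be pulled back out with a plain read-only GET, which makes it easy to compare against the PUT above:
GET _ingest/pipeline/demo_pipeline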
And the filebeat_demo.yml:
setup.template.append_fields:
- name: PC_Local_Time_2
  type: date
filebeat.inputs:
- type: log
  paths:
    - C:\Data\tester.csv
output.elasticsearch:
  hosts: ["http://localhost:9200"]
  pipeline: demo_pipeline
  
setup.ilm.enabled: auto
setup.ilm.rollover_alias: "try"
setup.ilm.pattern: "{now/d}-000001"
  
logging.level: info
logging.to_files: true
logging.files:
  path: C:\filebeatStuff\logs
  name: filebeat
  keepfiles: 7
  permissions: 0644
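Once Filebeat has loaded its template, it should be possible to confirm that the appended field is really in it (assuming the template is still named filebeat-7.2.0, the same name used in the DELETE further down), by fetching the template and looking for PC_Local_Time_2 with type date under the mappings:
GET _template/filebeat-7.2.0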
Just to test out the pipeline in the console:
GET _ingest/pipeline/demo_pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "12,2019-10-17T11:07:39.776Z"
      }
    }
  ]
}
Result:
{
  "docs" : [
    {
      "doc" : {
        "_index" : "_index",
        "_type" : "_doc",
        "_id" : "_id",
        "_source" : {
          "PC_Local_Time_2" : "2019-10-17T11:07:39.776Z",
          "ID" : "12",
          "message" : "12,2019-10-17T11:07:39.776Z"
        },
        "_ingest" : {
          "timestamp" : "2019-11-05T11:26:22.910Z"
        }
      }
    }
  ]
}
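For a closer look at what each processor does, the same simulate call can be repeated with the verbose flag, which returns the document after every processor (split, script, date, remove) instead of only the final result, using the same docs body as above:
GET _ingest/pipeline/demo_pipeline/_simulate?verbose=true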
Then the actual run, and it works.
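At this stage the live mapping should confirm it; assuming the index name is the one deleted further down, a field-level mapping check like
GET try-2019.11.05-000001/_mapping/field/PC_Local_Time_2
should report the field as type date.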
Then I make a small change in the pipeline: the date processor now writes to a different target field. This is without deleting the Filebeat template.
    {
      "date": {
        "field": "PC_Local_Time_2",
        "target_field": "PC_Local_Time_conv",
        "formats": ["yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"]
      }
    },
In the yml, I change the append_fields entry to match:
setup.template.append_fields:
- name: PC_Local_Time_conv
  type: date
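To see whether this changed append_fields entry actually reaches the cluster, the template could be fetched again after restarting Filebeat and checked for PC_Local_Time_conv:
GET _template/filebeat-7.2.0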
And now I run the same thing again, this time deleting the index and the template before trying.
This succeeds:
DELETE try-2019.11.05-000001
This fails, as I had already deleted it during earlier experimentation:
DELETE /_template/filebeat-7.2.0
And the result is the same: the new field comes in as a string.
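The quickest way to show the symptom is probably the same field-level mapping check against whatever the current write index is (wildcarded here, since the old index was deleted); presumably it reports text/keyword instead of date:
GET try-*/_mapping/field/PC_Local_Time_conv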
Not sure how helpful the details have been.