\u0000\u0000\u0000\u0017�\bS1T00S1T0_Snowflake_Ingestion\\S1T0_Snowflake_Ingestion_job_TACC_TYPE_BALANCEbS1T0_Snowflake_Ingestion_65a7c7b767404898408eb87b\u00004IDP successfully ran a job22024-01-17T12:27:35+00:00\u0002�\u0002/dev/02570/app/DQO0/data/jobprofile/tsz/S1T0_Snowflake_Ingestion/64ed110644ae0370f6a3778b/S1T0_Snowflake_Ingestion_job_TACC_TYPE_BALANCE/Final\u0000\u0000\u0002\u0001\u0000�\u0001https://api.idp-dev.devfg.rbc.com/jobs/history/by-id?id=65a7c7b767404898408eb87b
With my grok filter applied to the message above (which contains these mutated fields), I am getting a grok parse failure on Logstash 7.17.10.
I finally found the reason for the special characters in the message: the producer serializes with Avro, and since we are not using an Avro deserializer/codec in the Logstash config, the raw serialized bytes show up as special characters.
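The leading bytes in the sample look like a Schema Registry framing header rather than plain Avro. This is an assumption, but if the producer uses Confluent's wire format (one magic byte `0x00` followed by a 4-byte big-endian schema ID, then the Avro payload), the header can be split off like this sketch; the message bytes and schema ID below are hypothetical:

```python
import struct

def parse_confluent_header(raw: bytes):
    """Split a Confluent-framed Kafka message into (schema_id, avro_payload).

    Assumes the Confluent Schema Registry wire format:
    byte 0   -> magic byte, always 0x00
    bytes 1-4 -> schema ID, big-endian unsigned int
    bytes 5+  -> the actual Avro-encoded payload
    """
    if len(raw) < 5 or raw[0] != 0:
        raise ValueError("not a Confluent-framed message")
    schema_id = struct.unpack(">I", raw[1:5])[0]
    return schema_id, raw[5:]

# Hypothetical message: magic byte + schema ID 23 + payload bytes
schema_id, payload = parse_confluent_header(b"\x00\x00\x00\x00\x17hello")
print(schema_id, payload)  # 23 b'hello'
```

If the bytes do carry this header, the stock Logstash Avro codec (which expects a bare Avro payload) will not decode them on its own; the framing would need to be handled separately.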
In our organisation, the Kafka data is sent with an Avro schema. I am loading it into Logstash using the Kafka input, but without the Avro codec the messages arrive garbled, as in the sample above.
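For reference, once the plugin is installed, the codec is attached to the Kafka input roughly like this. This is a minimal sketch, not a verified config: the broker address, topic name, and schema file path are placeholders, and `schema_uri` must point to a copy of the producer's `.avsc` schema that is readable by Logstash:

```
input {
  kafka {
    bootstrap_servers => "broker1:9092"               # placeholder
    topics            => ["S1T0_Snowflake_Ingestion"] # placeholder topic
    codec => avro {
      schema_uri => "/etc/logstash/schemas/job_event.avsc"  # hypothetical local schema file
    }
  }
}
```

Note that `logstash-codec-avro` expects a bare Avro payload; if the producer frames messages with a Schema Registry header, decoding may still fail even with the codec in place.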
I am assuming that once I add the Avro codec, it should solve the problem. However, on the organisation's network I am unable to install the codec because we cannot connect to the internet. Is there a way to package the avro-codec plugin so that I can send it via FTP to the network machine?
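Logstash ships an offline plugin workflow for exactly this air-gapped case. A sketch of the two steps, assuming you have access to another machine with internet and the same Logstash version (7.17.10), and that the zip path is a placeholder:

```sh
# On an internet-connected machine running the SAME Logstash version:
bin/logstash-plugin prepare-offline-pack \
  --output /tmp/logstash-offline-plugins.zip \
  logstash-codec-avro

# Transfer the zip to the air-gapped machine (e.g. via FTP), then install it:
bin/logstash-plugin install file:///tmp/logstash-offline-plugins.zip
```

The version match matters: an offline pack built against a different Logstash version may fail dependency resolution on install.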