Parse out #012 from data

I have some data I'm sending via syslog to Logstash (LS). It is JSON when it is collected:

{
    "updated_date": null,
    "status": null,
    "name": null,
    "dnssec": null,
    "city": null,
    "expiration_date": null,
    "address": null,
    "zipcode": null,
    "creation_date": null,
    "domain_name": null,
    "whois_server": null,
    "state": null,
    "registrar": null,
    "referral_url": null,
    "cached_on": "2019-01-24 08:08:52.560499",
    "name_servers": null,
    "org": null,
    "country": null,
    "emails": null,
    "cache_question": "ex.com"

and when it arrives at LS it has #012 where the line breaks were:

{#012    "updated_date": null, #012    "status": null, #012    "name": null, #012    "dnssec": null, #012    "city": null, #012    "expiration_date": null, #012    "address": null, #012    "zipcode": null, #012    "creation_date": null, #012    "domain_name": null, #012    "whois_server": null, #012    "state": null, #012    "registrar": null, #012    "referral_url": null, #012    "cached_on": "2019-01-24 08:08:52.560499", #012    "name_servers": null, #012    "org": null, #012    "country": null, #012    "emails": null, #012    "cache_question": "ex.com"#012}}
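For anyone hitting the same thing: #012 is the octal escape for the line feed character (0x0A, octal 012), which rsyslog substitutes for control characters in message bodies. A minimal Python sketch (the sample string here is a shortened, hypothetical version of the message above) showing that stripping the escape leaves parseable JSON, since the newlines were only inter-token whitespace:

```python
import json

# Octal 012 is the line feed character -- that's why "#012"
# appears wherever the original JSON had a newline.
assert chr(0o12) == "\n"

# Hypothetical shortened sample of the escaped message:
escaped = '{#012    "cached_on": "2019-01-24 08:08:52.560499",#012    "cache_question": "ex.com"#012}'

# Removing the escape (what mutate/gsub does in Logstash)
# restores a string the json parser accepts.
cleaned = escaped.replace("#012", "")
doc = json.loads(cleaned)
print(doc["cache_question"])  # ex.com
```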

When I try to parse this with the json filter, I get a _jsonparsefailure. I tried mutate/gsub to remove it, but it does not seem to work:

mutate {
   gsub => ["message","#012     ",""]
}

I found some other discussions that talked about removing FF or LF, but they had \n and \r in their text. My searches so far have not turned up anyone having this problem.

Has anyone seen this or seen other forum discussions that solved this?

Thanks

Aaaannnddddd..... five seconds later I figured it out.

Sorry for the wasted bits.

mutate {
   gsub => ["message","#012",""]
}

was the right syntax. (My earlier pattern included trailing spaces after #012, so it never matched.)
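For completeness, here is a sketch of the full filter block, assuming the escaped JSON is in the `message` field (field name is an assumption; adjust to your pipeline). Stripping #012 first lets the json filter parse the result:

```
filter {
  # Remove rsyslog's octal newline escapes before parsing.
  mutate {
    gsub => ["message", "#012", ""]
  }
  # Now the message is valid JSON again.
  json {
    source => "message"
  }
}
```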
