Unable to drop lines using drop filter of logstash

Sample Log file:

Started by user e[8mha:////4JmSuRY+CFCKIu6RwHXiSXZDY4Huygv7JVrvzkfRI8zdAAAAmR+LCAAAAAAAAP9b85aBtbiIQTGjNKU4P08vOT+vOD8nVc83PyU1x6OyILUoJzMv2y+/JJUBAhiZGBgqihhk0NSjKDWzXb3RdlLBUSYGJk8GtpzUvPSSDB8G5tKinBIGIZ+sxLJE/ZzEvHT94JKizLx0a6BxUmjGOUNodHsLgAzWEgZe/dLi1CL9rNS87My8YgABJh63wgAAAA==e[0mSuchi
Running in Durability level: MAX_SURVIVABILITY
e[8mha:////4BtSc1ywGlugseGokkLauhiOfjz5VzoAnCDAh7oY3f0qAAAAoh+LCAAAAAAAAP9tjTEOwjAQBM8BClpKHuFItIiK1krDC0x8GCfWnbEdkooX8TX+gCESFVvtrLSa5wtWKcKBo5UdUu8otU4GP9jS5Mixv3geZcdn2TIl9igbHBs2eJyx4YwwR1SwULBGaj0nRzbDRnX6rmuvydanHMu2V1A5c4MHCFXMWcf8hSnC9jqYxPTz/BXAFEIGsfuclm8zQVqFvQAAAA==e[0m[Pipeline] Start of Pipeline
e[8mha:////4EVEmQlRJqNbotxxxv72biHyUzGmchavU3VpVGARx2w/AAAApR+LCAAAAAAAAP9tjTEOwjAUQ3+KOrAycohUghExsUZZOEFIQkgb/d8mKe3EibgadyBQiQlLlmxL1nu+oE4RjhQdby12HpP2vA+jK4lPFLtroIm3dOGaMFGwXNpJkrGnpUrKFhaxClYC1hZ1oOTRZdiIVt1VExS65pxj2Q4CKm8GeAAThZxVzN8yR9jeRpMIf5y/AJj7DGxXvP/86jduZBmjwAAAAA==e[0m[Pipeline] node
Running on e[8mha:////4PphPkWMGe0awTradjzsNI3THZhwqsf2lED4f9BhZa4DAAAAnh+LCAAAAAAAAP9b85aBtbiIQTGjNKU4P08vOT+vOD8nVc83PyU1x6OyILUoJzMv2y+/JJUBAhiZGBgqihhk0NSjKDWzXb3RdlLBUSYGJk8GtpzUvPSSDB8G5tKinBIGIZ+sxLJE/ZzEvHT94JKizLx0a6BxUmjGOUNodHsLgAz2EgZh/eT83ILSktQifY3cxGIgrakPAHib2iPIAAAAe[0mJenkins in /var/jenkins_home/workspace/LogTestWithFileBeatAndKafka
e[8mha:////4NjG4QEBFZlqQL+OL0I3Wsf98Q0zVAOeQqy2bhO691fQAAAApR+LCAAAAAAAAP9tjTEOwjAUQ3+KOrAycoh0gA0xsUZZOEFIQkgb/d8mKe3EibgadyBQiQlLlmxL1nu+oE4RjhQdby12HpP2vA+jK4lPFLtroIm3dOGaMFGwXNpJkrGnpUrKFhaxClYC1hZ1oOTRZdiIVt1VExS65pxj2Q4CKm8GeAAThZxVzN8yR9jeRpMIf5y/AJj7DGxXvP/86jfoP95RwAAAAA==e[0m[Pipeline] {
e[8mha:////4DIjrki99nl3MxdRCD5wb+MVl5WhzRshvb/bH/G1C7t/AAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbaGWsbAmNJ0AWEZb8zwLrbuWJvJp3kLiJlZNMMm+a93rDOic4UbLcG+wdZu14DKOti0+U+lugiXu6ck2YKRguzSSpM+cFJRUDS1gDKwEbgzpQdmgLbIVXD9UGhba9lFS/o4DGdQM8gYlqLiqVL8wJdvexy4Q/z18BzLEA29ce4gdpL1fxvAAAAA==e[0m[Pipeline] echo
Hello World
e[8mha:////4OHZRjXLolcyJYEcJUEdi9YMWaiWqqj6+zfgxwcaqoLaAAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbGBtjZUtoPAGyiLDkfxZYdytP5NW8g8RNrJxkknnTvNcb1jnBiZLl3mDvMGvHYxhtXXyi1N8CTdzTlWvCTMFwaSZJnTkvKKkYWMIaWAnYGNSBskNbYCu8eqg2KLTtpaT6HQU0rhvgCUxUc1GpfGFOsLuPXSb8ef4KYI4F2L72ED+qfHravAAAAA==e[0m[Pipeline] git
using credential github-credential
 > git rev-parse --is-inside-work-tree # timeout=10
Fetching changes from the remote Git repository
 > git config remote.origin.url https://github.com/Suchismita1tutun/calander-control.git # timeout=10
Fetching upstream changes from https://github.com/username/repo-name.git
 > git --version # timeout=10
using GIT_ASKPASS to set credentials github credential
 > git fetch --tags --progress https://github.com/username/repo-name.git.git +refs/heads/*:refs/remotes/origin/*
 > git rev-parse refs/remotes/origin/master^{commit} # timeout=10
 > git rev-parse refs/remotes/origin/origin/master^{commit} # timeout=10
Checking out Revision 90fc07c147f226cca23f5b3fc3279233709a20e3 (refs/remotes/origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 90fc07c147f226cca23f5b3fc3279233709a20e3
 > git branch -a -v --no-abbrev # timeout=10
 > git branch -D master # timeout=10
 > git checkout -b master 90fc07c147f226cca23f5b3fc3279233709a20e3
Commit message: "calander control added"
 > git rev-list --no-walk 90fc07c147f226cca23f5b3fc3279233709a20e3 # timeout=10
e[8mha:////4AKBxv2h+cZaWjt5g5KcXPzQIG/629SUoOGy4Qc1NpHvAAAAox+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbmFgYK1tC4wmQRYQl/7PAult5Iq/mHSRuYuUkk8yb5r3esM4JTpQs9wZ7h1k7HsNo6+ITpf4WaOKerlwTZgqGSzNJ6sx5QUnFwBLWwErAxqAOlB3aAlvh1UO1QaFtLyXV7yigcd0AT2CimotK5Qtzgt197DLhz/NXAHOMBdihdv8BvnU2RrwAAAA=e[0m[Pipeline] }
e[8mha:////4BB9EOaTxXgEjDkBMyeLKMQIZ1ZyVrcViJM/K97NaUI9AAAAox+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQRb2JgYK1tC4wmQRYQl/7PAult5Iq/mHSRuYuUkk8yb5r3esM4JTpQs9wZ7h1k7HsNo6+ITpf4WaOKerlwTZgqGSzNJ6sx5QUnFwBLWwErAxqAOlB3aAlvh1UO1QaFtLyXV7yigcd0AT2CimotK5Qtzgt197DLhz/NXAHOMBdihdv8BzHcCWrwAAAA=e[0m[Pipeline] // node
e[8mha:////4HfYwm5UTGfspQHu62u6ou8JBs9vMESu8TI0oCv/vlo/AAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQSb2BljZUtoPAGyiLDkfxZYdytP5NW8g8RNrJxkknnTvNcb1jnBiZLl3mDvMGvHYxhtXXyi1N8CTdzTlWvCTMFwaSZJnTkvKKkYWMIaWAnYGNSBskNbYCu8eqg2KLTtpaT6HQU0rhvgCUxUc1GpfGFOsLuPXSb8ef4KYI6xADvU7j/qbmUQvAAAAA==e[0m[Pipeline] End of Pipeline
Finished: SUCCESS

Logstash Conf:

filter {
  if [message] =~ /^\e[8mha:/ {
    drop { }
  }
}

I wanted to drop all the lines that start with \e[8mha: but this did not work. Can anyone help me resolve this?

Hi @Suchismita_Goswami,

how is that message sent to Logstash? And what does it look like after Logstash (in Elasticsearch, or wherever you send it)?

I would expect that, as that is a multi-line message, it is already encoded as a single line before it gets to Logstash, and that the whole log message is in the message field.

What I would try, without knowing anything more, would be to suppress those lines before they get to Logstash, either in whatever creates the logs or in whatever sends the message to Logstash. If you use Filebeat, you might be able to use exclude_lines to achieve that.
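For reference, a minimal Filebeat prospector sketch with exclude_lines might look like the following. The paths value and the exact prospector layout are illustrative assumptions, not taken from the original post:

```yaml
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/jenkins_home/jobs/*/builds/*/log   # hypothetical path
    # Drop Jenkins console-note lines before shipping; \x1b is the ESC
    # character that shows up as "e[8m..." in the pasted log.
    exclude_lines: ['^\x1b\[8mha:']
```

Note, though, that when multiline is also configured, Filebeat applies exclude_lines only after the lines have been combined, which comes up later in this thread.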

I think you want

if [message] =~ /^e\[8mha:/ { drop { } }

@A_B Thanks a lot for your quick reply. Since it's a multiline event, the entire message is sent to Logstash in the message field. I am using Filebeat to send the logs to Logstash. I want to suppress these lines at the Logstash end because there are many different sources from which Filebeat will send data to Logstash, so based on the source_type I need to skip lines from the logs. Waiting for your reply!

Hi @Badger, I tried the same before, but this didn't work in my case.

Could you post a sample message after it has gone through Logstash, please?

I'm sure it would be technically possible, but I can't see an easy way to drop certain lines within a multiline field in Logstash. I think you would have to somehow completely deconstruct the field, remove the lines, and then reconstruct the field again.

You have to configure Filebeat for the multiline logs anyway, so why not add the exclude_lines at the same time? I see your logic in wanting to remove the lines centrally in Logstash, but unfortunately I do not have a solution for that.

Sample message after passing through Logstash:
{
"_index": "jenkins-log",
"_type": "doc",
"_id": "VbrejGoBACho-A-fWXCK",
"_version": 1,
"_score": 0,
"_source": {
"@version": "1",
"message": "Started by user \u001b[8mha:////4JmSuRY+CFCKIu6RwHXiSXZDY4Huygv7JVrvzkfRI8zdAAAAmR+LCAAAAAAAAP9b85aBtbiIQTGjNKU4P08vOT+vOD8nVc83PyU1x6OyILUoJzMv2y+/JJUBAhiZGBgqihhk0NSjKDWzXb3RdlLBUSYGJk8GtpzUvPSSDB8G5tKinBIGIZ+sxLJE/ZzEvHT94JKizLx0a6BxUmjGOUNodHsLgAzWEgZe/dLi1CL9rNS87My8YgABJh63wgAAAA==\u001b[0mSuchiRunning in Durability level: MAX_SURVIVABILITY\u001b[8mha:////4BtSc1ywGlugseGokkLauhiOfjz5VzoAnCDAh7oY3f0qAAAAoh+LCAAAAAAAAP9tjTEOwjAQBM8BClpKHuFItIiK1krDC0x8GCfWnbEdkooX8TX+gCESFVvtrLSa5wtWKcKBo5UdUu8otU4GP9jS5Mixv3geZcdn2TIl9igbHBs2eJyx4YwwR1SwULBGaj0nRzbDRnX6rmuvydanHMu2V1A5c4MHCFXMWcf8hSnC9jqYxPTz/BXAFEIGsfuclm8zQVqFvQAAAA==\u001b[0m[Pipeline] Start of Pipeline\u001b[8mha:////4EVEmQlRJqNbotxxxv72biHyUzGmchavU3VpVGARx2w/AAAApR+LCAAAAAAAAP9tjTEOwjAUQ3+KOrAycohUghExsUZZOEFIQkgb/d8mKe3EibgadyBQiQlLlmxL1nu+oE4RjhQdby12HpP2vA+jK4lPFLtroIm3dOGaMFGwXNpJkrGnpUrKFhaxClYC1hZ1oOTRZdiIVt1VExS65pxj2Q4CKm8GeAAThZxVzN8yR9jeRpMIf5y/AJj7DGxXvP/86jduZBmjwAAAAA==\u001b[0m[Pipeline] nodeRunning on \u001b[8mha:////4PphPkWMGe0awTradjzsNI3THZhwqsf2lED4f9BhZa4DAAAAnh+LCAAAAAAAAP9b85aBtbiIQTGjNKU4P08vOT+vOD8nVc83PyU1x6OyILUoJzMv2y+/JJUBAhiZGBgqihhk0NSjKDWzXb3RdlLBUSYGJk8GtpzUvPSSDB8G5tKinBIGIZ+sxLJE/ZzEvHT94JKizLx0a6BxUmjGOUNodHsLgAz2EgZh/eT83ILSktQifY3cxGIgrakPAHib2iPIAAAA\u001b[0mJenkins in /var/jenkins_home/workspace/LogTestWithFileBeatAndKafka\u001b[8mha:////4NjG4QEBFZlqQL+OL0I3Wsf98Q0zVAOeQqy2bhO691fQAAAApR+LCAAAAAAAAP9tjTEOwjAUQ3+KOrAycoh0gA0xsUZZOEFIQkgb/d8mKe3EibgadyBQiQlLlmxL1nu+oE4RjhQdby12HpP2vA+jK4lPFLtroIm3dOGaMFGwXNpJkrGnpUrKFhaxClYC1hZ1oOTRZdiIVt1VExS65pxj2Q4CKm8GeAAThZxVzN8yR9jeRpMIf5y/AJj7DGxXvP/86jfoP95RwAAAAA==\u001b[0m[Pipeline] {\u001b[8mha:////4DIjrki99nl3MxdRCD5wb+MVl5WhzRshvb/bH/G1C7t/AAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbaGWsbAmNJ0AWEZb8zwLrbuWJvJp3kLiJlZNMMm+a93rDOic4UbLcG+wdZu14DKOti0+U+lugiXu6ck2YKRguzSSpM+cFJRUDS1gDKwEbgzpQdmgLbIVXD9UGhba9lFS/o4DGdQM8gYlqLiqVL8wJdvexy4Q/z18BzLEA29ce4gdpL1fxvAAAAA==\u001b[0m[Pipeline] echoHello 
World\u001b[8mha:////4OHZRjXLolcyJYEcJUEdi9YMWaiWqqj6+zfgxwcaqoLaAAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbGBtjZUtoPAGyiLDkfxZYdytP5NW8g8RNrJxkknnTvNcb1jnBiZLl3mDvMGvHYxhtXXyi1N8CTdzTlWvCTMFwaSZJnTkvKKkYWMIaWAnYGNSBskNbYCu8eqg2KLTtpaT6HQU0rhvgCUxUc1GpfGFOsLuPXSb8ef4KYI4F2L72ED+qfHravAAAAA==\u001b[0m[Pipeline] gitusing credential github-credential > git rev-parse --is-inside-work-tree # timeout=10Fetching changes from the remote Git repository > git config remote.origin.url https://github.com/Suchismita1tutun/calander-control.git # timeout=10Fetching upstream changes from https://github.com/username/repo.git > git --version # timeout=10using GIT_ASKPASS to set credentials github credential > git fetch --tags --progress https://github.com/username/repo.git +refs/heads/:refs/remotes/origin/ > git rev-parse refs/remotes/origin/master^{commit} # timeout=10 > git rev-parse refs/remotes/origin/origin/master^{commit} # timeout=10Checking out Revision 90fc07c147f226cca23f5b3fc3279233709a20e3 (refs/remotes/origin/master) > git config core.sparsecheckout # timeout=10 > git checkout -f 90fc07c147f226cca23f5b3fc3279233709a20e3 > git branch -a -v --no-abbrev # timeout=10 > git branch -D master # timeout=10 > git checkout -b master 90fc07c147f226cca23f5b3fc3279233709a20e3Commit message: "calander control added" > git rev-list --no-walk 90fc07c147f226cca23f5b3fc3279233709a20e3 # timeout=10\u001b[8mha:////4AKBxv2h+cZaWjt5g5KcXPzQIG/629SUoOGy4Qc1NpHvAAAAox+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQTbmFgYK1tC4wmQRYQl/7PAult5Iq/mHSRuYuUkk8yb5r3esM4JTpQs9wZ7h1k7HsNo6+ITpf4WaOKerlwTZgqGSzNJ6sx5QUnFwBLWwErAxqAOlB3aAlvh1UO1QaFtLyXV7yigcd0AT2CimotK5Qtzgt197DLhz/NXAHOMBdihdv8BvnU2RrwAAAA=\u001b[0m[Pipeline] 
}\u001b[8mha:////4BB9EOaTxXgEjDkBMyeLKMQIZ1ZyVrcViJM/K97NaUI9AAAAox+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQRb2JgYK1tC4wmQRYQl/7PAult5Iq/mHSRuYuUkk8yb5r3esM4JTpQs9wZ7h1k7HsNo6+ITpf4WaOKerlwTZgqGSzNJ6sx5QUnFwBLWwErAxqAOlB3aAlvh1UO1QaFtLyXV7yigcd0AT2CimotK5Qtzgt197DLhz/NXAHOMBdihdv8BzHcCWrwAAAA=\u001b[0m[Pipeline] // node\u001b[8mha:////4HfYwm5UTGfspQHu62u6ou8JBs9vMESu8TI0oCv/vlo/AAAAoh+LCAAAAAAAAP9tjTEOAiEURD9rLGwtPQSb2BljZUtoPAGyiLDkfxZYdytP5NW8g8RNrJxkknnTvNcb1jnBiZLl3mDvMGvHYxhtXXyi1N8CTdzTlWvCTMFwaSZJnTkvKKkYWMIaWAnYGNSBskNbYCu8eqg2KLTtpaT6HQU0rhvgCUxUc1GpfGFOsLuPXSb8ef4KYI6xADvU7j/qbmUQvAAAAA==\u001b[0m[Pipeline] End of PipelineFinished: SUCCESS",
"@timestamp": "2019-05-06T11:19:02.776Z"
},
"fields": {
"@timestamp": [
"2019-05-06T11:19:02.776Z"
]
}
}

Okay, I can give it a try by adding exclude_lines to the Filebeat config. But it would be better if we could do it at the Logstash end; otherwise I need to configure different Filebeat prospectors for different sources.
Waiting for your reply!

Is the drop filter not working because of multi line logs?

First, I don't think the drop filter works the way you imagine it to work. From the documentation:

Drops everything that gets to this filter.

So drop will drop the entire "event" or "document" (whatever the correct term is). I don't think you can use it to "drop" specific parts of a document or field.

Second, you are trying to match the start of a line. The message field only has one line start. But even if you did get a match, the whole document would be dropped anyway.

This is as far as I know. I'm definitely not an expert on the drop filter...

That's not correct. A multiline message has multiple line starts. \A matches the beginning of a string, and there is only one beginning of string in a string.
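A quick sketch of that distinction in plain Ruby (Logstash conditionals use Ruby regexes): ^ anchors at the start of every line, while \A anchors only at the start of the whole string.

```ruby
# Two lines in one string: ^ finds both line starts, \A finds only one.
s = "first\nsecond\n"
line_starts  = s.scan(/^\w+/)   # matches at each line start
string_start = s.scan(/\A\w+/)  # matches only at the start of the string
```

So a conditional using ^ can match a line in the middle of a multiline message field, but drop would still discard the whole event.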

If the event were generated by a multiline codec you could remove lines using a gsub filter: use an anchor, then a character group that excludes newline, followed by a newline. Something like

mutate { gsub => [ "message", "^\e\[8mha:[^\n]*\n", "" ] }

Not sure if that applies to events generated by filebeat.
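To illustrate the gsub idea in plain Ruby (Logstash's gsub compiles the pattern as a Ruby regex), here is a hypothetical stand-in for a multiline message field; \e is the ESC character that appears as "e[8m..." in the pasted log:

```ruby
# Sample multiline "message" with one Jenkins console-note line in it.
msg = "Started by user Suchi\n" \
      "\e[8mha:////abc==\e[0m[Pipeline] echo\n" \
      "Hello World\n" \
      "Finished: SUCCESS\n"

# Remove every line that starts with the ESC [8mha: marker, including
# its trailing newline.
cleaned = msg.gsub(/^\e\[8mha:[^\n]*\n/, "")
```

Note that this removes the whole line, so any text after the console note on the same line (like "[Pipeline] echo" above) is dropped with it.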

There you go :) You learn something new every day!

Thanks for the info @Badger

I found this in the documentation: Filebeat Prospectors | Filebeat Reference [5.5] | Elastic

If multiline is also specified, each multiline message is combined into a single line before the lines are filtered by exclude_lines .

This is an open issue, please take a look: Filebeat exclude_lines prior to multiline · Issue #1940 · elastic/beats · GitHub

I did not know about that... And that is an old issue. I wonder if it is on the roadmap at all...

Did you try Badger's suggestion that would let you remove the lines in Logstash using the gsub filter?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.