Here is my scenario:
Logstash running on an internally hosted Windows server
Filebeat running on an internally hosted Linux (Ubuntu) box (the two can communicate with each other internally without issue)
Docker containers running on the Linux box via Docker Compose
Issue: the logs are being split into two events (which is fine), but the continuation lines are not being included in either event.
Example of the output produced from the data below:
Event 1 message =
***** Error: Castle.MicroKernel.ComponentNotFoundException: Requested component named 'string' was not found in the container. Did you forget to register it?
Event 2 message =
some message initializing ..
So it is ignoring the multiline data:
There are 2 other components supporting requested service 'Example.IFreightRateRepo'. Were you looking for any of them?
and the rest of the stack trace in the logs below...
Example of the logs I am trying to consume with Filebeat:
{"log":"2022-01-20 15:00:27,187 [30] ERROR example TEST Example.Api 513547be9c9f HostMachine1 - ***** Error: Castle.MicroKernel.ComponentNotFoundException: Requested component named 'string' was not found in the container. Did you forget to register it?\n","stream":"stdout","time":"2022-01-20T15:00:27.187743102Z"}
{"log":"There are 2 other components supporting requested service 'Example.IFreightRateRepo'. Were you looking for any of them?\n","stream":"stdout","time":"2022-01-20T15:00:27.187756342Z"}
{"log":" at Castle.MicroKernel.DefaultKernel.Castle.MicroKernel.IKernelInternal.Resolve(String key, Type service, Arguments arguments, IReleasePolicy policy)\n","stream":"stdout","time":"2022-01-20T15:00:27.187760792Z"}
{"log":" at Castle.Facilities.TypedFactory.TypedFactoryComponentResolver.Resolve(IKernelInternal kernel, IReleasePolicy scope)\n","stream":"stdout","time":"2022-01-20T15:00:27.187764841Z"}
{"log":" at Castle.Facilities.TypedFactory.Internal.TypedFactoryInterceptor.Resolve(IInvocation invocation)\n","stream":"stdout","time":"2022-01-20T15:00:27.187768609Z"}
{"log":" at Castle.Facilities.TypedFactory.Internal.TypedFactoryInterceptor.Intercept(IInvocation invocation)\n","stream":"stdout","time":"2022-01-20T15:00:27.187772283Z"}
{"log":" at Castle.DynamicProxy.AbstractInvocation.Proceed()\n","stream":"stdout","time":"2022-01-20T15:00:27.187791185Z"}
{"log":" at Castle.Proxies.ICarrierFactoryProxy.GetByCarrier(String carrierValue)\n","stream":"stdout","time":"2022-01-20T15:00:27.187795032Z"}
{"log":" at Example.Repo.CommonFreightRepo.getRatesAsync(RateRequest request) in /src/ExampleApp/Common/Repo/CommonFreightRepo.cs:line 43\n","stream":"stdout","time":"2022-01-20T15:00:27.18779829Z"}
{"log":" at Pivot.FreightRepos.Common.Repo.CommonFreightRepo.GetRatesUsingAsync(RateRequest request) in /src/ExampleApp/Common/Repo/CommonFreightRepo.cs:line 34\n","stream":"stdout","time":"2022-01-20T15:00:27.187801923Z"}
{"log":"2022-01-19 16:32:45,335 [1] INFO example TEST Example.Api2 c0acc6320d25 HostMachine1 - some message initializing ..\n","stream":"stdout","time":"2022-01-19T16:32:45.362836985Z"}
Logstash config example:
# contents of logstash\bin\logstash.config
input {
  beats {
    port => 5044
  }
}
filter {
  if [app] in "OnPremMS" {
    mutate {
      strip => ["message"]
    }
    if !("_grokparsefailure" in [tags]) and !([app] in "MicroIntegrations") {
      grok {
        match => { "message" => "(?m)%{TIMESTAMP_ISO8601:sourceTimestamp} \[%{DATA:thread}\] %{LOGLEVEL:loglevel}\s{0,1} %{DATA:logger} %{DATA:env} %{DATA:applicationName} %{DATA:containerName} %{DATA:dockerHostName} - %{GREEDYDATA:tempMessage}" }
      }
      mutate {
        replace => [ "message", "%{tempMessage}" ]
        remove_field => [ "tempMessage" ]
      }
    }
    date {
      match => [ "sourceTimestamp", "YYYY-MM-dd HH:mm:ssZZ", "YYYY-MM-dd HH:mm:ssZ", "YYYY-MM-dd HH:mm:ss", "YYYY-MM-dd HH:mm:ss,SSSZZ", "YYYY-MM-dd HH:mm:ss,SSSZ", "YYYY-MM-dd HH:mm:ss,SSS", "YYYY-MM-dd HH:mm:ss:SSSZZ", "YYYY-MM-dd HH:mm:ss:SSSZ", "YYYY-MM-dd HH:mm:ss:SSS", "YYYY-MM-dd'T'HH:mm:ss,SSS", "ISO8601" ]
      # remove sourceTimestamp here, only after the date filter has used it
      remove_field => [ "sourceTimestamp" ]
    }
  }
}
output {
  if !("_grokparsefailure" in [tags]) and [app] in "OnPremMS" {
    elasticsearch {
      hosts => "https://cloudInstanceId.us-west-1.aws.found.io:9243"
      user => "exampleUser"
      password => "examplePw"
      index => "indexName-%{+YYYY.MM}"
    }
  }
}
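For context, here is roughly what the grok pattern in the filter above extracts from the first sample line. The character classes below are simplified stand-ins for grok's TIMESTAMP_ISO8601/DATA/LOGLEVEL definitions, so treat this as a sketch of the intent, not exact grok semantics:

```python
import re

# Simplified Python approximation of the grok pattern used in the filter.
pattern = re.compile(
    r"^(?P<sourceTimestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"\[(?P<thread>[^\]]*)\] (?P<loglevel>[A-Z]+)\s? (?P<logger>\S+) "
    r"(?P<env>\S+) (?P<applicationName>\S+) (?P<containerName>\S+) "
    r"(?P<dockerHostName>\S+) - (?P<tempMessage>.*)",
    re.DOTALL,  # analogue of grok's (?m): lets the trailing message span lines
)

line = ("2022-01-20 15:00:27,187 [30] ERROR example TEST Example.Api "
        "513547be9c9f HostMachine1 - ***** Error: "
        "Castle.MicroKernel.ComponentNotFoundException: Requested component "
        "named 'string' was not found in the container. Did you forget to register it?")

m = pattern.match(line)
print(m.group("loglevel"), m.group("applicationName"))  # ERROR Example.Api
```

The multiline question matters precisely because of that `(?m)`/DOTALL behaviour: the continuation lines can only end up in `tempMessage` if they arrive in the same event as the timestamped first line.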
Filebeat config example on the Linux box:
filebeat.inputs:
- type: filestream
  enabled: true
  paths:
    - /var/lib/docker/containers/**/*.log
  parsers:
    - container:
        stream: stdout
  fields:
    app: OnPremMS
    entity: MicroservicesLogsFieldValue
    service: AllMicroservices
  fields_under_root: true
  multiline.pattern: '^{"log":"[0-9]{4}-[0-9]{2}-[0-9]{2}'
  multiline.negate: true
  multiline.match: after
  # Immediately removes the state for files which can no longer be found on disk
  clean_removed: true
# ============================== Filebeat modules ==============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
# ======================= Elasticsearch template setting =======================
setup.template.settings:
  index.number_of_shards: 1
# ================================== General ===================================
# The tags of the shipper are included in their own field with each
# transaction published.
tags: ["applicationlogs"]
# Optional fields that you can specify to add additional information to the
# output.
fields:
  environment: TEST
fields_under_root: true
# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["WindowsServerName:5044"]
# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~
  - add_docker_metadata: ~
  - add_kubernetes_metadata: ~
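One thing I have been trying to sanity-check locally: the multiline pattern above is anchored on the raw JSON wrapper (`^{"log":"`), but my understanding of the `filestream` parser chain is that the `container` parser strips that wrapper before any multiline logic sees the line, in which case the pattern would have to match the decoded line instead. A quick sketch comparing the two forms (the decoded-timestamp variant is my hypothetical alternative, not something from the docs):

```python
import re

# Pattern from the Filebeat config above (anchored on the raw JSON wrapper)...
raw_pattern = re.compile(r'^{"log":"[0-9]{4}-[0-9]{2}-[0-9]{2}')
# ...and a hypothetical variant anchored on the decoded timestamp instead.
decoded_pattern = re.compile(r'^[0-9]{4}-[0-9]{2}-[0-9]{2}')

raw_line = ('{"log":"2022-01-20 15:00:27,187 [30] ERROR example TEST Example.Api '
            '513547be9c9f HostMachine1 - some text\\n","stream":"stdout",'
            '"time":"2022-01-20T15:00:27.187743102Z"}')
decoded_line = ("2022-01-20 15:00:27,187 [30] ERROR example TEST Example.Api "
                "513547be9c9f HostMachine1 - some text")

print(bool(raw_pattern.match(raw_line)))       # True: matches the raw file line
print(bool(raw_pattern.match(decoded_line)))   # False: never matches decoded lines
print(bool(decoded_pattern.match(decoded_line)))  # True
```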
Thanks