.NET Core Agent not showing transactions

Kibana version:
6.8.0

Hi,

I am trying to instrument a .NET Core web application with Elastic and am running into some issues.
I am able to see metric data (CPU/memory) but am unable to see any transactions. I'm running a .NET Core 2.1 application on Ubuntu 18.04.

These are my settings:

"ElasticAPM": {
    "ServerUrls": "https://apm-stage.****.com:8200",
    "ServiceName": "dd_dev",
    "MetricsInterval": "1s",
    "TransactionSampleRate": 1.0,
    "Environment": "staging"
}

This is what I see when logging is turned on:

{ "time": "2019-09-11 14:28:33.1157", "level": "DEBUG", "nested": { "message": "init main" } }
{ "time": "2019-09-11 14:28:34.7749", "level": "WARN", "nested": { "message": "{MicrosoftExtensionsConfig} Service name provided in configuration is dd_dev" }, "{Scope}": "MicrosoftExtensionsConfig", "ServiceName": "dd_dev" }
{ "time": "2019-09-11 14:28:34.7851", "level": "INFO", "nested": { "message": "{MicrosoftExtensionsConfig} The agent was started without a service version. The service version will be automatically discovered." }, "{Scope}": "MicrosoftExtensionsConfig" }
{ "time": "2019-09-11 14:28:34.7851", "level": "INFO", "nested": { "message": "{MicrosoftExtensionsConfig} The agent was started without a service version. The automatically discovered service version is 1.0.0" }, "{Scope}": "MicrosoftExtensionsConfig", "ServiceVersion": "1.0.0" }
{ "time": "2019-09-11 14:28:34.7910", "level": "DEBUG", "nested": { "message": "{Service} Environment.Version (4.0.30319.42000') and version (4.6.27317.03') from RuntimeInformation.FrameworkDescription (.NET Core 4.6.27317.03') don't refer to the same version" }, "{Scope}": "Service", "DotNetFrameworkRuntimeVersion": "4.0.30319.42000", "DotNetFrameworkRuntimeDescription": ".NET Core 4.6.27317.03" }
{ "time": "2019-09-11 14:28:35.1437", "level": "DEBUG", "nested": { "message": "{Service} Falling back on using System assembly file location (/home/deepak/dotnet/shared/Microsoft.NETCore.App/2.1.8/System.Private.CoreLib.dll') - returning (2.1.8')" }, "{Scope}": "Service", "SystemAssemblyFileLocation": "\/home\/deepak\/dotnet\/shared\/Microsoft.NETCore.App\/2.1.8\/System.Private.CoreLib.dll", "DotNetFrameworkRuntimeVersion": "2.1.8" }
{ "time": "2019-09-11 14:28:35.1517", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 10:memory:\/user.slice" }, "{Scope}": "SystemInfoHelper", "line": "10:memory:\/user.slice" }
{ "time": "2019-09-11 14:28:35.1517", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 9:devices:\/user.slice" }, "{Scope}": "SystemInfoHelper", "line": "9:devices:\/user.slice" }
{ "time": "2019-09-11 14:28:35.1538", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 7:pids:\/user.slice\/user-1000.slice\/session-2.scope" }, "{Scope}": "SystemInfoHelper", "line": "7:pids:\/user.slice\/user-1000.slice\/session-2.scope" }
{ "time": "2019-09-11 14:28:35.1538", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 4:cpu,cpuacct:\/user.slice" }, "{Scope}": "SystemInfoHelper", "line": "4:cpu,cpuacct:\/user.slice" }
{ "time": "2019-09-11 14:28:35.1538", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 2:blkio:\/user.slice" }, "{Scope}": "SystemInfoHelper", "line": "2:blkio:\/user.slice" }
{ "time": "2019-09-11 14:28:35.1538", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 1:name=systemd:\/user.slice\/user-1000.slice\/session-2.scope" }, "{Scope}": "SystemInfoHelper", "line": "1:name=systemd:\/user.slice\/user-1000.slice\/session-2.scope" }
{ "time": "2019-09-11 14:28:35.1538", "level": "DEBUG", "nested": { "message": "{SystemInfoHelper} Could not parse container ID from '\/proc\/self\/cgroup' line: 0::\/user.slice\/user-1000.slice\/session-2.scope" }, "{Scope}": "SystemInfoHelper", "line": "0::\/user.slice\/user-1000.slice\/session-2.scope" }
{ "time": "2019-09-11 14:28:35.1538", "level": "ERROR", "nested": { "message": "{SystemInfoHelper} Failed parsing container id" }, "{Scope}": "SystemInfoHelper" }
{ "time": "2019-09-11 14:28:35.2948", "level": "INFO", "nested": { "message": "{MetricsCollector} Collecting metrics in 1000 milliseconds interval" }, "{Scope}": "MetricsCollector", "interval": 1000 }
{ "time": "2019-09-11 14:28:35.2986", "level": "DEBUG", "nested": { "message": "{MicrosoftExtensionsConfig} Using provided transaction sample rate1' parsed as 1" }, "{Scope}": "MicrosoftExtensionsConfig", "ProvidedTransactionSampleRate": "1" }
{ "time": "2019-09-11 14:28:35.4525", "level": "DEBUG", "nested": { "message": "{DiagnosticInitializer} Subscribed Elastic.Apm.AspNetCore.DiagnosticListener.AspNetCoreDiagnosticListener to `Microsoft.AspNetCore' events source" }, "{Scope}": "DiagnosticInitializer", "DiagnosticListenerType": "Elastic.Apm.AspNetCore.DiagnosticListener.AspNetCoreDiagnosticListener", "DiagnosticListenerName": "Microsoft.AspNetCore" }
{ "time": "2019-09-11 14:28:36.3283", "level": "DEBUG", "nested": { "message": "{PayloadSenderV2} MetricSet added to the queue, Elastic.Apm.Metrics.MetricSet" }, "{Scope}": "PayloadSenderV2", "MetricSet": "Elastic.Apm.Metrics.MetricSet" }
{ "time": "2019-09-11 14:28:36.3311", "level": "DEBUG", "nested": { "message": "{MetricsCollector} Metrics collected: MetricSample{system.memory.total: 33573797888}, MetricSample{system.memory.actual.free: 19184427008}, MetricSample{system.process.memory.size: 23259795456}, MetricSample{system.process.memory.rss.bytes: 90841088}, MetricSample{system.cpu.total.norm.pct: 0.149880095923261}, MetricSample{system.process.cpu.total.norm.pct: 0.0290689674819026}" }, "{Scope}": "MetricsCollector", "data": "MetricSample{system.memory.total: 33573797888}, MetricSample{system.memory.actual.free: 19184427008}, MetricSample{system.process.memory.size: 23259795456}, MetricSample{system.process.memory.rss.bytes: 90841088}, MetricSample{system.cpu.total.norm.pct: 0.149880095923261}, MetricSample{system.process.cpu.total.norm.pct: 0.0290689674819026}" }
{ "time": "2019-09-11 14:28:37.3013", "level": "DEBUG", "nested": { "message": "{PayloadSenderV2} MetricSet added to the queue, Elastic.Apm.Metrics.MetricSet" }, "{Scope}": "PayloadSenderV2", "MetricSet": "Elastic.Apm.Metrics.MetricSet" }

Can you tell me why I'm unable to see transactions?

Thanks.

Hi @apm_user, welcome to discuss!

Based on the logs you pasted here, I don't see any transaction captured by the agent, so either something goes wrong very early or transaction capturing is not turned on.

How did you activate the agent? Do you use the UseAllElasticApm() method as described here?
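For reference, activating the agent with the Elastic.Apm.NetCoreAll package typically looks like the sketch below (class and middleware names other than UseAllElasticApm are illustrative; a .NET Core 2.1-style Startup is assumed):

```csharp
using Elastic.Apm.NetCoreAll;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;

public class Startup
{
    private readonly IConfiguration _configuration;

    public Startup(IConfiguration configuration) => _configuration = configuration;

    public void Configure(IApplicationBuilder app)
    {
        // Register the agent before any other middleware so it can
        // observe every incoming HTTP request from the very start.
        app.UseAllElasticApm(_configuration);

        app.UseMvc();
    }
}
```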

I am using Elastic.Apm.AspNetCore and this is the code in Startup()

app.UseElasticApm(config);


That seems OK, assuming it's the first line of the Configure method.

One suggestion: use UseAllElasticApm from the Elastic.Apm.NetCoreAll package, which turns on every agent component, although UseElasticApm should also capture incoming HTTP requests as transactions.

Just to make sure we are on the same page: with that, the agent will capture incoming HTTP requests to your ASP.NET Core app as transactions; for manually creating transactions you need to use the public Agent API.
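A minimal sketch of creating a transaction manually via the public Agent API could look like this (the transaction name and the work inside the block are placeholders):

```csharp
using System;
using Elastic.Apm;
using Elastic.Apm.Api;

// Start a transaction by hand, capture failures on it, and always end it
// so the agent can send it to APM Server.
ITransaction transaction = Agent.Tracer.StartTransaction("MyBackgroundJob", ApiConstants.TypeRequest);
try
{
    // ... the work you want to trace ...
}
catch (Exception e)
{
    transaction.CaptureException(e);
    throw;
}
finally
{
    transaction.End();
}
```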

Could you maybe also paste a log here where you turn both the ASP.NET Core log level and the Elastic APM log level up to Trace, and make sure there are some HTTP requests within the timeframe of the logs?
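In appsettings.json that could look roughly like this (a sketch: the "ElasticApm" LogLevel key controls the agent's own log verbosity, and your existing Logging section may be structured differently):

```json
{
  "Logging": {
    "LogLevel": {
      "Default": "Trace"
    }
  },
  "ElasticApm": {
    "LogLevel": "Trace"
  }
}
```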

@GregKalapos ,

I am able to see transactions after moving UseAllElasticApm up to the first line. Is there a reason it needs to be set before everything else?


Glad to hear that! We recommend putting UseAllElasticApm on the first line because we want to make sure it's the first middleware that gets registered; otherwise the duration of the transaction can be inaccurate. On the other hand, even when it's not on the first line, the agent should still record transactions, so that's strange. I opened a GitHub issue to investigate this further. If you have a small reproducer, that would help a lot.
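To illustrate why ordering matters, here is a sketch of a Configure method (the other middleware calls are just examples): anything registered after the agent runs inside the transaction and is included in its measured duration, while middleware registered before it would execute outside the agent's view.

```csharp
public void Configure(IApplicationBuilder app)
{
    // First: the agent middleware starts the transaction timer before
    // any other component touches the request.
    app.UseAllElasticApm(Configuration);

    // These now run "inside" the transaction, so their time is
    // attributed to it correctly.
    app.UseStaticFiles();
    app.UseMvc();
}
```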

But overall I guess the problem is solved, right?

@GregKalapos ,

Yep, the problem is solved. Thanks for your help! Unfortunately, I don't have a small reproducer, but I can post part of the code if that helps.

Great!

Yes, if you could post part of the code that'd help. Either directly on the GitHub issue or here.