Agents are running properly but no data in Data Stream

Using Elastic Cloud, I'm not seeing any data in the Data Streams section even though a test "Rails Prod" server is showing "Online".

It looks like you created your own policy... did you include / assign any integrations in the policy?

Back in Fleet, under Choose an agent policy, notice that the default policy is selected. The default policy includes a System integration for collecting logs and metrics from the host system.

Hi Stephen,

Thank you for taking the time to respond.

Yes, the policy has one integration, the default one, which should include general metrics about the Linux system.

Thank you!
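As a cross-check from the cluster side, the data streams an integration writes to can be listed directly via the Elasticsearch API (a sketch; the endpoint and credentials below are placeholders for your own deployment):

```shell
# List all data streams on the cluster; an empty list means the agent's
# integration output never reached Elasticsearch. Endpoint and credentials
# are placeholders -- substitute your Elastic Cloud values.
curl -s --max-time 5 -u elastic:changeme \
  'https://my-deployment.es.us-east-1.aws.found.io:9243/_data_stream?pretty' \
  || echo "request failed"
```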

Hmm, interesting. Could you try the default policy and see what happens?

There is no data in Data Streams but there is an error now:

Hmm, kinda seems like more than one agent is running, perhaps... did you also try to install one of the original Beats, like Filebeat or Metricbeat? They do not co-exist as far as I know.

Yes, it was originally installed but it is now uninstalled.

- sudo apt remove filebeat
- manually deleted the directory
- the service is now gone
- which filebeat returns nothing

Not sure what the issue is...
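For reference, `apt remove` leaves config files behind; a fuller cleanup looks something like this (a sketch, run as root; the paths are the Beats package defaults):

```shell
# Purge removes the package *and* its config files (apt remove keeps them).
apt-get purge -y filebeat || true          # ignore error if already removed
# Leftover config/state directories from the default package layout:
rm -rf /etc/filebeat /var/lib/filebeat
# Verify nothing remains:
command -v filebeat >/dev/null || echo "filebeat binary gone"
```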

Are Beats processes still running, by chance? That error seems to indicate more than one agent / Beat is running.

I would make sure both Metricbeat and Filebeat are fully removed and that there are no zombie processes.

Then I would uninstall the agent, reinstall it, and try the default policy.
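A quick way to check for leftover processes (a sketch; assumes a Linux host with pgrep available):

```shell
# List any Beats processes still alive; the [f]/[m] bracket trick keeps the
# pattern from matching its own command line. Prints a note if nothing is found.
pgrep -a -f '[f]ilebeat|[m]etricbeat' || echo "no stray beat processes"
# If anything shows up, stop the services and kill leftovers, e.g.:
#   systemctl stop filebeat metricbeat
#   pkill -f '[f]ilebeat|[m]etricbeat'
```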

I made sure filebeat was gone, removed the Elastic Agent, reinstalled the Elastic Agent, rebooted, and still no data, and no errors either:

It would be better if you could paste in that service status instead of a screenshot... I can not see the whole message.

(BTW, in general, screenshots of text / logs / status are discouraged: they can not be searched by others looking for the same errors / questions and are therefore less useful, never mind that we can not see the entire text.)

I am going to install and run on an Ubuntu server and see what I get... It seems odd that there are 2 instances each of filebeat and metricbeat, but I can not see the rest of the status messages.

Is the Elasticsearch cluster self-managed or hosted?

Can you curl the elasticsearch host from that ubuntu host?

Have you looked at the actual agent logs?
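For reference, a couple of checks along those lines (a sketch; the URL and credentials are placeholders, and the install path is an assumption for a 7.10-era Linux `elastic-agent install`):

```shell
# 1) Can the host reach the cluster at all? (placeholder URL/credentials)
curl -s --max-time 5 -u elastic:changeme \
  'https://my-deployment.es.example.com:9243/' \
  || echo "could not reach Elasticsearch"
# 2) Agent logs: on Linux, `elastic-agent install` puts the agent under
#    /opt/Elastic/Agent; logs live inside its data directory.
ls /opt/Elastic/Agent/data/*/logs/ 2>/dev/null || echo "no agent logs found"
```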


Thanks. I should have pasted the code you're right.

Unfortunately, I deleted the hosted Elasticsearch deployment and then made a new one, so I can't grab any new logs.

Good news, I connected the test Rails Prod server to the new deployment and it's working. Something was up with the other deployment it seems.

I'm using the hosted offering, by the way.

Thank you.


Awesome, thanks for letting me know... and thanks for using the hosted solution. I am sorry you had challenges setting it up.

I am going to test anyway... see if I run into anything.

Hi all,

I'm also testing out the Fleet component (on a self-hosted 7.10.1 stack), and I have three agents reporting back as online and zero data streams. I used the setup flow in Kibana to download the tar, then ran the generated command to include the enrollment token.

I'm not seeing anything that looks to stand out, but on the elasticsearch cluster nodes I do see a message like this:

[o.e.h.AbstractHttpServerTransport] [es2] caught exception while handling client http traffic, closing connection Netty4HttpChannel{localAddress=/, remoteAddress=/}
Caused by: Received fatal alert: bad_certificate

The remote address is one of my test clients with the Fleet agent. I also have some Metricbeat data coming in from other hosts, where I did need to specify ssl.certificate_authorities, ssl.certificate, and ssl.key.

Is the same also needed in the fleet config to trust and use the self-created CA in some way?
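Something equivalent at enrollment time would look roughly like this (a sketch; the flag name is from memory for 7.10-era agents, and the token and CA path are placeholders, so confirm against `elastic-agent install --help` on your build):

```shell
# Sketch: pointing the agent at a self-created CA during install/enrollment.
# <your-token> and the CA path are placeholders; verify the flag name with
# `elastic-agent install --help` before relying on it.
sudo ./elastic-agent install -f \
  --kibana-url=https://dw-vmkibana-00:5601 \
  --enrollment-token=<your-token> \
  --certificate-authorities=/path/to/ca.crt
```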


I think you are going to need to use the --insecure setting on the command line; see here.

Thanks for the suggestion. I looked over my past installs and they all had the --insecure flag configured when using the elastic-agent install option.

I'm also experiencing this issue.

I created a bunch of policies that send data to a namespace (guessing I create the namespace when defining it during policy creation?). I've assigned policies to users and added custom integrations like Elastic Security and Windows, but when I search for data under Data Streams it's empty, and I can't select any namespaces in the filter dropdown.


Hi All,

I have the same problem here.
I installed the agent with the following command:

./elastic-agent-7.10.1-linux-x86_64/elastic-agent install -f --kibana-url=http://dw-vmkibana-00:5601 --enrollment-token=WFJzS0IzY0JzTDc1NndqNk4xaDM6eFpkVno0OHlTb0N0T1JBd2Mxd0VuZw== --insecure

I've installed the agent on four different systems. The behavior is always the same.
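When the agent is installed as a service, its journal can show why data isn't flowing (a sketch; assumes a systemd host and the default service name `elastic-agent`):

```shell
# Tail the agent's service logs for errors (assumes systemd; the service
# name elastic-agent is the default for `elastic-agent install`).
journalctl -u elastic-agent --no-pager -n 50 2>/dev/null \
  || echo "journalctl unavailable"
```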


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.