Newbie here: automated daily reports from searches as PDF needed

alerting

#1

Hello there!

As you can read in the topic title, I'm a newbie to the whole Elasticsearch stuff, including Kibana and X-Pack. I've been working with an ELK stack for a couple of days now and I'm evaluating all the neat things one can do with it.

We want to use Elasticsearch/Kibana/X-Pack (5.0.x on a Windows platform with 2 servers and 12 clients, no Internet connection) to replace an old COTS tool that's currently used to log Windows events. I'm quite satisfied with the results I get using Elasticsearch, Kibana and Winlogbeat. The logging works and we even have a better performance now as well.

After having spent some time converting the filters from our old tool into Kibana searches, I'm having a problem getting one last feature to work. We need to create scheduled PDF reports for our system. The creation of said reports works using the reporting plugin from X-Pack. However, I am having trouble setting up the Watcher plugin the way we need it.

I have the generation URLs for the reports we want created automatically, and I know the syntax I have to use for Watcher after reading the "Getting Started with Watcher" pages. But I seem to be stuck. The example given on the "Automating Report Generation" pages did not really help me. I was not able to rework it in a way that only generates reports, without any further actions. I would very much appreciate some help from the experienced users of this community.

Once the problem of the automated reports is solved, there is another issue. Currently, the old tool puts all generated reports into an existing folder structure, which is backed up regularly. Is it possible to have the generated PDFs put into a specific folder via Watcher? The defined actions one can use with Watcher do not seem to support this directly.

Thanks to all of you in advance!

Best regards!


(Mark Walkom) #2

Can you share what you have so far?


#3

Certainly! Here's what I have: Elasticsearch is installed on Windows 10 together with Kibana, the Beats, Logstash and X-Pack (v 5.0.x). The installation itself is pretty standard (no changes to the ports, just the names have been altered; all installation paths are without whitespace to avoid any trouble there). The Java version is 1.8.0_101 (32 bit). To rule out firewall issues, the firewall was deactivated on all clients (which are not connected to the Internet).

For testing the automated reporting, I used the development console of Kibana and entered the following command:

[CODE]
PUT _xpack/watcher/watch/report
{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "actions": {
    "webhook_test": {
      "webhook": {
        "method": "POST",
        "url": "http://192.168.178.1:5601/api/reporting/generate/search/Security-events-on-all-clients?_g=(time:(from:now-24h,mode:quick,to:now))&sync"
      }
    }
  }
}
[/CODE]

Then, by searching for the report with:

GET .watches/_search

I get the following response:

[CODE]
{
  "took": 52,
  "timed_out": false,
  "_shards": { "total": 1, "successful": 1, "failed": 0 },
  "hits": {
    "total": 1,
    "max_score": 1,
    "hits": [
      {
        "_index": ".watches",
        "_type": "watch",
        "_id": "report",
        "_score": 1,
        "_source": {
          "trigger": { "schedule": { "interval": "1m" } },
          "input": { "none": {} },
          "condition": { "always": {} },
          "actions": {
            "webhook_test": {
              "webhook": {
                "scheme": "http",
                "host": "192.168.178.1",
                "port": 5601,
                "method": "post",
                "path": "/api/reporting/generate/search/Security-events-on-all-clients",
                "params": {
                  "_g": "(time:(from:now-24h,mode:quick,to:now))",
                  "sync": ""
                },
                "headers": {}
              }
            }
          },
          "_status": {
            "state": { "active": true, "timestamp": "2016-12-08T07:25:33.918Z" },
            "actions": {
              "webhook_test": {
                "ack": {
                  "timestamp": "2016-12-08T07:25:33.918Z",
                  "state": "awaits_successful_execution"
                },
                "last_execution": {
                  "reason": "ConnectException[Connection refused: connect]",
                  "timestamp": "2016-12-08T07:39:34.509Z",
                  "successful": false
                }
              }
            },
            "last_checked": "2016-12-08T07:39:34.509Z",
            "last_met_condition": "2016-12-08T07:39:34.509Z"
          }
        }
      }
    ]
  }
}
[/CODE]

As I said, I'm quite new to the whole Elasticsearch stuff. And I'm certainly no Java expert. That's why I'm a little puzzled at the moment... :flushed:

And like I said in the first post, I have not found a way to put the PDFs automatically in a folder. The above example is just for the scheduled generation of the reports.

Thanks again for your help!

Best regards!


#4

As it didn't fit in the above post, here's the Elasticsearch console output:

[CODE]
[2016-12-08T08:59:35,387][ERROR][o.e.x.w.a.w.ExecutableWebhookAction] [PFMW01] failed to execute action [report/webhook_test]
java.net.ConnectException: Connection refused: connect
    at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method) ~[?:1.8.0_101]
    at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85) ~[?:1.8.0_101]
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350) ~[?:1.8.0_101]
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206) ~[?:1.8.0_101]
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188) ~[?:1.8.0_101]
    at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172) ~[?:1.8.0_101]
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[?:1.8.0_101]
    at java.net.Socket.connect(Socket.java:589) ~[?:1.8.0_101]
    at sun.net.NetworkClient.doConnect(NetworkClient.java:175) ~[?:?]
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:432) ~[?:?]
    at sun.net.www.http.HttpClient.openServer(HttpClient.java:527) ~[?:?]
    at sun.net.www.http.HttpClient.<init>(HttpClient.java:211) ~[?:?]
    at sun.net.www.http.HttpClient.New(HttpClient.java:308) ~[?:?]
    at sun.net.www.http.HttpClient.New(HttpClient.java:326) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1169) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1148) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:999) ~[?:?]
    at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:933) ~[?:?]
    at org.elasticsearch.xpack.common.http.HttpClient.doExecute(HttpClient.java:170) ~[x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.common.http.HttpClient.execute(HttpClient.java:90) ~[x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.watcher.actions.webhook.ExecutableWebhookAction.execute(ExecutableWebhookAction.java:57) ~[x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.watcher.actions.ActionWrapper.execute(ActionWrapper.java:161) [x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.watcher.execution.ExecutionService.executeInner(ExecutionService.java:415) [x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.watcher.execution.ExecutionService.execute(ExecutionService.java:275) [x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.xpack.watcher.execution.ExecutionService$WatchExecutionTask.run(ExecutionService.java:496) [x-pack-5.0.0.jar:5.0.0]
    at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:444) [elasticsearch-5.0.0.jar:5.0.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_101]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_101]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_101]
[/CODE]


#5

Hello again,

I changed the server to "localhost" now and added "headers": { "kbn-xsrf": "reporting" }:

[CODE]
PUT _xpack/watcher/watch/report
{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "actions": {
    "webhook_test": {
      "webhook": {
        "method": "POST",
        "headers": {
          "kbn-xsrf": "reporting"
        },
        "url": "http://localhost:5601/api/reporting/generate/search/Security-events-on-all-clients?_g=(time:(from:now-24h,mode:quick,to:now))&sync"
      }
    }
  }
}
[/CODE]

Now the response is:

... "last_execution": { "reason": "ElasticsearchTimeoutException[failed to execute http request. timeout expired]; nested: SocketTimeoutException[Read timed out]; ", "timestamp": "2016-12-08T11:58:11.512Z", "successful": false } ...

Still not getting results... :frowning:


#6

OK,

I found the solution to the first problem myself! Yay!
I had to add the following to the webhook definition:

"read_timeout": "300s",

Now I get my reports as PDFs!
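For reference, here is roughly what the complete watch looks like with all three fixes from this thread combined (localhost as the host, the kbn-xsrf header, and the longer read timeout); the watch name, search and interval are just the ones from my earlier posts:

[CODE]
PUT _xpack/watcher/watch/report
{
  "trigger": {
    "schedule": {
      "interval": "1m"
    }
  },
  "actions": {
    "webhook_test": {
      "webhook": {
        "method": "POST",
        "read_timeout": "300s",
        "headers": {
          "kbn-xsrf": "reporting"
        },
        "url": "http://localhost:5601/api/reporting/generate/search/Security-events-on-all-clients?_g=(time:(from:now-24h,mode:quick,to:now))&sync"
      }
    }
  }
}
[/CODE]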

Anyway, that's only the first step. Now I need to get the PDFs into a specific folder on the hard drive!
Any suggestions?

Best regards!


(Alexander Reelsen) #7

Hey,

so what you are doing with this watch is basically triggering the execution of a PDF report generation, but not doing anything with the result. You should take a look at the email action and its different attachment types. The http one can be used to download an attachment from Kibana and send it via email. This is the way to go in your case.

See https://www.elastic.co/guide/en/x-pack/5.0/actions-email.html#configuring-email-attachments
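A minimal sketch of such a watch, assuming an email account has already been configured in elasticsearch.yml; the recipient address, attachment name and daily interval are placeholders, and the reporting URL is the one from the earlier posts:

[CODE]
PUT _xpack/watcher/watch/email_report
{
  "trigger": {
    "schedule": { "interval": "1d" }
  },
  "actions": {
    "email_report": {
      "email": {
        "to": "reports@example.org",
        "subject": "Daily security report",
        "attachments": {
          "security_report.pdf": {
            "http": {
              "content_type": "application/pdf",
              "request": {
                "url": "http://localhost:5601/api/reporting/generate/search/Security-events-on-all-clients?_g=(time:(from:now-24h,mode:quick,to:now))&sync",
                "headers": { "kbn-xsrf": "reporting" },
                "read_timeout": "300s"
              }
            }
          }
        }
      }
    }
  }
}
[/CODE]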

--Alex


#8

Hey Alex,

Thanks for your reply. I already read through the e-mail action documentation. Unfortunately, there is no e-mail service running on our isolated system, and installing one is not planned either, as our customer requirements do not allow it. It would of course be convenient to have one running, so I could store the reports in some mailbox (which is easily accessible and could be part of a backup as well).

Nevertheless, I need to find a way to get the reports stored automatically in some dedicated folder on the hard drive. I guess there is no way to somehow pull these PDF reports manually out of the indices folders (via a command or PowerShell script)?

Best regards!


(Alexander Reelsen) #9

Hey,

no, this is not possible. The main reason is that you would never know on which node your download was stored: Watcher runs on the current master node, so you would have to check every master-eligible node for where such a PDF might have been stored.

Also note that the PDF is never stored anywhere on disk if you just download it via the HTTP action, which means there is also no way to extract it.

--Alex


#10

Hello again,

That's unfortunate. So the bottom line is that there's no way (it doesn't have to be an easy one) to get the automatically generated PDFs as files onto a hard drive? Maybe there's some workaround via the URL that I use for the action in the Watcher plugin...

Best regards!

Edit: I just found "skedler", which brings the functionality I would need: "save generated reports in custom server folders", so it is possible. :slight_smile:


(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.