Elastic-agent "Process another repeated request" in loop indefinitely

Hello,

I've got an Elastic Agent with no agent monitoring settings (meaning no agent logs/metrics collection) and only one integration policy.

So my elastic-agent.yml is pretty small:

id: 949c1a80-c8a9-11ed-8539-532f4758083a
revision: 2
outputs:
  default:
    type: elasticsearch
    hosts:
      - 'http://localhost:9200'
    username: 'elastic'
    password: 'changeme'
output_permissions:
  default:
    _elastic_agent_monitoring:
      indices: []
    _elastic_agent_checks:
      cluster:
        - monitor
    c4e5502e-d2ac-428d-bd13-85afdcc371e3:
      indices:
        - names:
            - logs-ti_misp.threat-default
          privileges:
            - auto_configure
            - create_doc
agent:
  download:
    sourceURI: 'https://artifacts.elastic.co/downloads/'
  monitoring:
    enabled: false
    logs: false
    metrics: false
inputs:
  - id: httpjson-ti_misp-c4e5502e-d2ac-428d-bd13-85afdcc371e3
    name: ti_misp-1
    revision: 1
    type: httpjson
    use_output: default
    meta:
      package:
        name: ti_misp
        version: 1.10.1
    data_stream:
      namespace: default
    package_policy_id: c4e5502e-d2ac-428d-bd13-85afdcc371e3
    streams:
      - id: httpjson-ti_misp.threat-c4e5502e-d2ac-428d-bd13-85afdcc371e3
        data_stream:
          dataset: ti_misp.threat
          type: logs
        config_version: '2'
        interval: 3m
        request.method: POST
        request.url: 'https://localhost/events/restSearch'
        request.ssl:
          verification_mode: none
        request.timeout: 30s
        request.body: null
        request.transforms:
          - set:
              target: header.Authorization
              value: BEpdSXuPb2lRyhVjNy9nHiA7EApYdD9ajMRafBZQ
          - set:
              target: body.page
              value: 1
          - set:
              target: body.limit
              value: 5
          - set:
              target: body.returnFormat
              value: json
          - set:
              target: body.timestamp
              value: '[[.cursor.timestamp]]'
              default: '[[ formatDate (now (parseDuration "-6000")) "UnixDate" ]]'
        response.split:
          target: body.response
          split:
            target: body.Event.Attribute
            ignore_empty_value: true
            keep_parent: true
            split:
              target: body.Event.Object
              keep_parent: true
              split:
                target: body.Event.Object.Attribute
                keep_parent: true
        response.request_body_on_pagination: true
        response.pagination:
          - set:
              target: body.page
              value: >-
                [[if (ne (len .last_response.body.response) 0)]][[add
                .last_response.page 1]][[end]]
              fail_on_template_error: true
        cursor:
          timestamp:
            value: '[[.last_event.Event.timestamp]]'
        tags:
          - preserve_original_event
          - forwarded
          - misp-threat
        publisher_pipeline.disable_host: true

I get the following info logs from the elastic_agent.filebeat dataset every minute, indefinitely:

{"log.level":"info","@timestamp":"2023-03-22T15:01:57.095+0100","message":"Process another repeated request.","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"httpjson-default","type":"httpjson"},"log":{"source":"httpjson-default"},"log.origin":{"file.line":132,"file.name":"httpjson/input.go"},"log.logger":"input.httpjson-cursor","service.name":"filebeat","id":"httpjson-ti_misp.threat-c4e5502e-d2ac-428d-bd13-85afdcc371e3","input_source":"https://localhost/events/restSearch","input_url":"https://localhost/events/restSearch","ecs.version":"1.6.0","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2023-03-22T15:01:57.178+0100","message":"request finished: 1 events published","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"httpjson-default","type":"httpjson"},"log":{"source":"httpjson-default"},"log.origin":{"file.line":445,"file.name":"httpjson/request.go"},"service.name":"filebeat","input_source":"https://localhost/events/restSearch","log.logger":"input.httpjson-cursor","id":"httpjson-ti_misp.threat-c4e5502e-d2ac-428d-bd13-85afdcc371e3","input_url":"https://localhost/events/restSearch","ecs.version":"1.6.0","ecs.version":"1.6.0"}

It seems that only one event was sent, but there is still no index in Elasticsearch.

I've changed the interval configuration from ten minutes to 1 minute, and set the timestamp so that it collects all events with a timestamp greater than or equal to (now - 6000h).

I don't know what "Process another repeated request" means.

The log entry you are seeing is just informational: it means that the integration communicated with MISP and found at least 1 event to send to ES (though if there is no response, that might also count as 1).

If these are logged every configured interval, then from an integration perspective it is working just fine, and either the ingestion of data is also working, or we are simply not getting any data from MISP itself.

Your API request returns no information at all; even without any integrations there should be plenty of indices, so the way you are checking whether the data stream exists is most likely not the best one.

Can you check in Kibana? Or maybe in the Stack Management UI? Or even just on the Discover page?
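
For example, something like this (using the localhost output and the elastic/changeme credentials from your config above) should show whether the data stream and its backing indices exist at all:

curl -u elastic:changeme 'http://localhost:9200/_data_stream/logs-ti_misp.threat-default?pretty'
curl -u elastic:changeme 'http://localhost:9200/_cat/indices/.ds-logs-ti_misp.threat-*?v'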

Hello Marius,

Thank you for this information and for your response.

I'm not sure whether you expected a response in this post; let me know if not.

You're right, I didn't get any information from MISP, due to this mistake in the config file:

- set:
    target: body.timestamp
    value: '[[.cursor.timestamp]]'
    default: '[[ formatDate (now (parseDuration "-6000")) "UnixDate" ]]'

When I changed 6000 to 6000h, I received some data.
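
For reference, the corrected transform (only the duration string changed, so parseDuration now gets a valid value):

- set:
    target: body.timestamp
    value: '[[.cursor.timestamp]]'
    default: '[[ formatDate (now (parseDuration "-6000h")) "UnixDate" ]]'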

Then another error came up: "Document contains at least one immense term in field="event.original"" (I resolved it by removing the keep-original-event setting in the config file).
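
If it helps anyone else: what I removed boils down to the preserve_original_event tag in the stream configuration (assuming that setting maps to this tag, as shown in the config above), so the tags section now looks roughly like this:

tags:
  - forwarded
  - misp-threat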

Despite the error I received some data: a new index appeared, with the MISP information stored in it as documents (request: /.ds-logs-ti_misp.threat-default-2023.03.23-000001/_search):
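
For reference, the equivalent search via curl, using the credentials from the output configuration above, would be something like:

curl -u elastic:changeme 'http://localhost:9200/.ds-logs-ti_misp.threat-default-2023.03.23-000001/_search?pretty'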

But there is still nothing in the Kibana Discover Logs view (for both logs and metrics).

Then I deleted the Elastic Agent and restarted it with the config that removes the original event. I then directly got a bulk index error (before the connection to Elasticsearch was made): "[parent] Data too large, data for [<http_request>] would be [2081656578/1.9gb], which is larger than the limit of [2040109465/1.8gb], real usage: [1945182800/1.8gb], new bytes reserved: ...":

{"log.level":"error","@timestamp":"2023-03-23T09:33:53.652+0100","message":"failed to perform any bulk index operations: 429 Too Many Requests: {\"error\":{\"root_cause\":[{\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [2081656578/1.9gb], which is larger than the limit of [2040109465/1.8gb], real usage: [1945182800/1.8gb], new bytes reserved: [136473778/130.1mb], usages [model_inference=0/0b, eql_sequence=0/0b, fielddata=0/0b, request=0/0b, inflight_requests=136473778/130.1mb]\",\"bytes_wanted\":2081656578,\"bytes_limit\":2040109465,\"durability\":\"TRANSIENT\"}],\"type\":\"circuit_breaking_exception\",\"reason\":\"[parent] Data too large, data for [<http_request>] would be [2081656578/1.9gb], which is larger than the limit of [2040109465/1.8gb], real usage: [1945182800/1.8gb], new bytes reserved: [136473778/130.1mb], usages [model_inference=0/0b, eql_sequence=0/0b, fielddata=0/0b, request=0/0b, inflight_requests=136473778/130.1mb]\",\"bytes_wanted\":2081656578,\"bytes_limit\":2040109465,\"durability\":\"TRANSIENT\"},\"status\":429}","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"httpjson-default","type":"httpjson"},"log":{"source":"httpjson-default"},"log.logger":"elasticsearch","log.origin":{"file.line":241,"file.name":"elasticsearch/client.go"},"service.name":"filebeat","ecs.version":"1.6.0","ecs.version":"1.6.0"}

Should I increase the size? Or decrease the limit of events fetched? Or decrease the interval at which I fetch the events?
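
In case it is relevant, a quick way to see how close the Elasticsearch node is to its heap limit would be something like this (same host and credentials as in the output configuration):

curl -u elastic:changeme 'http://localhost:9200/_cat/nodes?v&h=name,heap.percent,heap.max'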

---- Post Update ---- (5 minutes after the above)

I don't receive this error anymore; it apparently resolved itself.
Instead, I've got the "Process another repeated request." info message.
I also have the information "message":"request finished: 15331 events published":

{"log.level":"info","@timestamp":"2023-03-23T10:32:51.105+0100","message":"Process another repeated request.","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"httpjson-default","type":"httpjson"},"log":{"source":"httpjson-default"},"service.name":"filebeat","id":"httpjson-ti_misp.threat-c4e5502e-d2ac-428d-bd13-85afdcc371e3","input_source":"https://localhost/events/restSearch","input_url":"https://localhost/events/restSearch","ecs.version":"1.6.0","log.origin":{"file.line":132,"file.name":"httpjson/input.go"},"log.logger":"input.httpjson-cursor","ecs.version":"1.6.0"}
{"log.level":"info","@timestamp":"2023-03-23T10:32:59.237+0100","message":"request finished: 15331 events published","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"httpjson-default","type":"httpjson"},"log":{"source":"httpjson-default"},"id":"httpjson-ti_misp.threat-c4e5502e-d2ac-428d-bd13-85afdcc371e3","input_url":"https://localhost/events/restSearch","ecs.version":"1.6.0","log.logger":"input.httpjson-cursor","log.origin":{"file.line":445,"file.name":"httpjson/request.go"},"service.name":"filebeat","input_source":"https://localhost/events/restSearch","ecs.version":"1.6.0"}

But there is still nothing in the Kibana Analytics Discover view (for both metrics and logs).

However, in the data streams view I do see something:
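
For what it's worth, the document and storage counts for the data stream can also be checked outside Kibana with something like (same host and credentials as above):

curl -u elastic:changeme 'http://localhost:9200/_data_stream/logs-ti_misp.threat-default/_stats?human&pretty'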

I think you should start by setting a much lower value instead of 6000h. I'm a bit unsure why this was manually changed by you, as it is configurable?

You might need to re-add the integration or remove the state file for it to pick up your change from 6000h.
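
For example, the default of the body.timestamp transform could use a much smaller lookback; the -72h below is only an illustrative value, not a recommendation for your data set:

- set:
    target: body.timestamp
    value: '[[.cursor.timestamp]]'
    default: '[[ formatDate (now (parseDuration "-72h")) "UnixDate" ]]'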

We are working on adding support for another API in the MISP integration that is a bit less heavy than this one.


Yes, this was manually changed by me, but the change applies only to the elastic-agent.yml file. We select the values from the Kibana Fleet settings in the integration policy and it generates a YAML file; instead of copying and pasting the whole file, I just changed the right value.

Usually, to change the configuration, I just uninstall the agent with /opt/Elastic/Agent/elastic-agent uninstall, change the config in elastic-agent.yml in the ~/elastic-agent-8.6.2-linux-x86_64/ directory, and then reinstall it with sudo ~/elastic-agent-8.6.2-linux-x86_64/elastic-agent install, so a new installation ends up in /opt/Elastic/Agent/.
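
Roughly the sequence I run (the editor is of course whatever you prefer):

/opt/Elastic/Agent/elastic-agent uninstall
# edit the generated policy in the extracted package
vi ~/elastic-agent-8.6.2-linux-x86_64/elastic-agent.yml
sudo ~/elastic-agent-8.6.2-linux-x86_64/elastic-agent install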

Is that enough for my change to be taken into account?

But do you know why I still don't have any information on the Kibana dashboard dedicated to MISP, and nothing in Logs Analytics?

I received some logs inside Kibana Logs Analytics.

But there is something strange about the data collected:
I've created 11 "Dummy event" events for testing purposes. These events have the particularity of having no attributes attached to them. That's something we don't see in real data; no other event in MISP lacks attributes.

And all of the logs received in Kibana Analytics concern those "Dummy event" events.


Another strange thing is that I created only 11 events, but I receive far more than that in Kibana.
