Logstash: indexing data between transaction start and end

I need to index the data between a transaction start and a transaction end. I have tried lots of combinations, but none of them worked.

In my logs, I want Logstash to parse the text between TRANSACTION START and TRANSACTION END. Is that possible?

2018-07-25 13:17:31,262 DEBUG [DSLog] (default task-38-T230584) RefDataCollectionImpl:getCollection() type=config-var
2018-07-25 13:17:31,262 DEBUG [DSLog] (default task-38-T230584) DSFactoryImpl.getConfigVarValue::configName->ENABLE_IDMS_SPECIAL_LOGGING, val=false
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION START [23234]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) =====start com.sap.ist.ds2.pub.person.PersonRequest======
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonFetchPrefs=
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonSearchCriteria==
2018-07-25 13:17:31,519 DEBUG [DSLog] (default task-38-T230584) DsBaseServiceWP::forwardToServiceBean() end
2018-07-25 13:17:31,519 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION END [23234]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION START [230584]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) =====start com.sap.ist.ds2.pub.person.PersonRequest======
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonFetchPrefs=
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonSearchCriteria==
2018-07-25 13:17:31,519 DEBUG [DSLog] (default task-38-T230584) DsBaseServiceWP::forwardToServiceBean() end
2018-07-25 13:17:31,519 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION END [230584]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION START [230584]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) =====start com.sap.ist.ds2.pub.person.PersonRequest======
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonFetchPrefs=
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonSearchCriteria==
2018-07-25 13:17:31,519 DEBUG [DSLog] (default task-38-T230584) DsBaseServiceWP::forwardToServiceBean() end
2018-07-25 13:17:31,519 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION END [230584]-----------

Provided the lines for different transactions are never interleaved, you can do this using an aggregate filter and a class variable. Note that you must set --pipeline.workers 1 to use an aggregate. Your use case matches example 1 in the aggregate filter documentation, except that some of your lines are missing the task id, so use a class variable to add it.

    if [message] =~ /TRANSACTION START/ {
        # Capture the transaction id from the START line
        grok { match => { "message" => "TRANSACTION START \[%{NUMBER:taskid}\]" } }
        # Stash it in a class variable for subsequent lines
        ruby { code => '@@taskid = event.get("taskid")' }
    } else {
        # Tag every other line with the most recently seen transaction id
        ruby { code => 'event.set("taskid", @@taskid)' }
    }

Then do the aggregate using task_id => "%{taskid}"
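
For example, a minimal sketch of that aggregate, assuming you just want to collect each transaction's raw lines into an array (logEntries is an illustrative field name):

    # Append every line to the per-transaction array in the map
    aggregate {
        task_id => "%{taskid}"
        code => "map['logEntries'] ||= []; map['logEntries'] << event.get('message')"
        map_action => "create_or_update"
    }
    # On the END line, copy the collected array onto the event and close the task
    if [message] =~ /TRANSACTION END/ {
        aggregate {
            task_id => "%{taskid}"
            code => "event.set('logEntries', map['logEntries'])"
            map_action => "update"
            end_of_task => true
            timeout => 120
        }
    }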

Thanks for the answer.
I tried it as shown in Example 1, but I am not getting the data between the transactions; instead, each line is output as a single document.
Here is my config file:

input {
  file {
    path => "/Users/nitesh.agrawal/Desktop/logstash/Aggregation/elklog.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      #pattern => "<test-suite>|</<test-suite>"
      pattern => "[\^\-]+"
      negate => true
      what => "previous"
      auto_flush_interval => 1
    }
  }
}


filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{WORD:Info}\s+(\[%{WORD:loglevel}\]\s+)?%{GREEDYDATA:logger}" ]
  }

  if [logger] == "(default task-38-T230584) ----------- TRANSACTION START [230584]-----------" {
    aggregate {
      task_id => "%{loglevel}"
      code => "map['sql_duration'] = 0"
      map_action => "create"
    }
  }

  if [logger] == "(default task-38-T230584)" {
    aggregate {
      task_id => "%{loglevel}"
      code => "map['sql_duration'] += event.get('duration')"
      map_action => "update"
    }
  }

  if [logger] == "(default task-38-T230584) ----------- TRANSACTION END [230584]-----------" {
    aggregate {
      task_id => "%{loglevel}"
      code => "event.set('sql_duration', map['sql_duration'])"
      map_action => "update"
      end_of_task => true
      timeout => 120
    }
  }
}

Let's start over. I was assuming that you wanted in some way to associate these lines, which are all related to transaction 23234.

2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION START [23234]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) =====start com.sap.ist.ds2.pub.person.PersonRequest======
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonFetchPrefs=
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonSearchCriteria==
2018-07-25 13:17:31,519 DEBUG [DSLog] (default task-38-T230584) DsBaseServiceWP::forwardToServiceBean() end
2018-07-25 13:17:31,519 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION END [23234]-----------

and similarly to associate these lines, which are all related to transaction 230584.

2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION START [230584]-----------
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) =====start com.sap.ist.ds2.pub.person.PersonRequest======
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonFetchPrefs=
2018-07-25 13:17:31,262 INFO [DSLog] (default task-38-T230584) person.PersonRequest.PersonSearchCriteria==
2018-07-25 13:17:31,519 DEBUG [DSLog] (default task-38-T230584) DsBaseServiceWP::forwardToServiceBean() end
2018-07-25 13:17:31,519 INFO [DSLog] (default task-38-T230584) ----------- TRANSACTION END [230584]-----------

Is that correct? If so, do you just want a single event for each transaction? If so, what do you want that event to contain?

There are multiple transactions, each with START and END tags and the associated details between the tags, in one log file. I want each transaction to become a single event containing its corresponding log lines.

Hi,

Any help please!
I need to create an ES document (through Logstash) which will contain the log lines between a transaction start and a transaction end. I need to copy those lines and create a record. There will be multiple such transactions, and I need to create a record for each of them.
The Logstash config that I wrote clubs together the lines from all the transactions, whereas I want them segregated by transaction.
Any help would be appreciated.

If all you want to do is aggregate the log entries, then this would do it:

    if [message] =~ /TRANSACTION START/ {
        # Extract the transaction id from the START line and remember it
        # in a class variable so the following lines can be tagged with it
        grok { match => { "message" => "TRANSACTION START \[%{NUMBER:taskid}\]" } }
        ruby { code => '@@taskid = event.get("taskid")' }
    } else {
        ruby { code => 'event.set("taskid", @@taskid)' }
    }

    if [message] =~ /TRANSACTION START/ {
        aggregate {
            task_id => "%{taskid}"
            # Start a new array of log lines for this transaction
            code => "map['logEntries'] = []; map['logEntries'] << event.get('message')"
            map_action => "create"
        }
        # Drop the individual line; only the aggregated event survives
        drop {}
    } else if [message] =~ /TRANSACTION END/ {
        aggregate {
            task_id => "%{taskid}"
            # Append the END line, then copy the collected array onto this event
            code => "map['logEntries'] << event.get('message'); event.set('logEntries', map['logEntries'])"
            map_action => "update"
            end_of_task => true
            timeout => 120
        }
        mutate { remove_field => [ "message" ] }
    } else {
        aggregate {
            task_id => "%{taskid}"
            # Append intermediate lines to the transaction's array
            code => "map['logEntries'] << event.get('message')"
            map_action => "update"
        }
        drop {}
    }

You must set --pipeline.workers 1, and you must not set --experimental-java-execution, since that breaks this.
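
For example (the config file name here is illustrative):

    bin/logstash -f transactions.conf --pipeline.workers 1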

Hi Badger,
The solution you provided worked for me. Thank you.
