Ingesting RdpCoreTS Windows Event Logs into Elasticsearch

Hi everyone.

First off, I would just like to thank you for reading this and giving me a hand.

We have a small Elastic Stack built from the guides by Cyb3rWard0g (Roberto Rodriguez). Below is the Winlogbeat configuration currently deployed on our endpoints:
winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h
  - name: Security
  - name: System
  - name: Microsoft-Windows-Sysmon/Operational
  - name: Microsoft-Windows-RemoteDesktopServices-RdpCoreTS/Operational
  - name: Microsoft-Windows-RemoteApp and Desktop Connections/Admin
  - name: Microsoft-Windows-RemoteDesktopServices-RdpCoreTS/Admin
  - name: Microsoft-Windows-TerminalServices-ClientUSBDevices/Admin
  - name: Microsoft-Windows-TerminalServices-LocalSessionManager/Admin
  - name: Microsoft-Windows-TerminalServices-PnPDevices/Admin
  - name: Microsoft-Windows-TerminalServices-Printers/Admin
  - name: Microsoft-Windows-TerminalServices-RemoteConnectionManager/Admin
  - name: Microsoft-Windows-TerminalServices-ServerUSBDevices/Admin
  - name: Microsoft-Windows-TerminalServices-SessionBroker-Client/Admin
  - name: Microsoft-Windows-VerifyHardwareSecurity/Admin

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
name: winlogbeat

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

#================================ Outputs =====================================

# Configure what outputs to use when sending the data collected by the beat.
# Multiple outputs may be used.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["REDACTED:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: critical, error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

What we are trying to do is properly ingest Terminal Services/RDP logs. We can see the logs arriving in Logstash, but they are not making it to Elasticsearch. Stranger still, some events do get through (event ID 102, for example), while event ID 140, the one we are really after, never shows up anywhere. A quick look at the Logstash error logs reveals the following:

[ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'A': was expecting ('true', 'false' or 'null')
 at [Source: (String)"A connection from the client computer with an IP address of REDACTED failed because the user name or password is not correct."; line: 1, column: 2]>, :data=>"A connection from the client computer with an IP address of REDACTED failed because the user name or password is not correct."}
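
On the collection side, one check we are considering is temporarily narrowing the RdpCoreTS channel to event 140 with Winlogbeat's event_id option, just to confirm the endpoint is picking the event up at all. This is an untested sketch, not part of our deployed config:

winlogbeat.event_logs:
  - name: Microsoft-Windows-RemoteDesktopServices-RdpCoreTS/Operational
    # Collect only event 140 while testing; remove this line afterwards.
    event_id: 140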

I am not very experienced with ELK stack configuration beyond the basics, so I am confused about why this isn't working. Our Beats input and output Logstash configurations are below, in case they help:

Beats Input:
# HELK Beats input conf file
# HELK build version: 0.9 (Alpha)
# Author: Roberto Rodriguez (@Cyb3rWard0g)
# License: BSD 3-Clause

input {
  beats {
    port => 5044
    add_field => { "[@metadata][source]" => "beats" }
    codec => "json"
    ssl => false
  }
}

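Looking at that input, one thing we suspect (but have not confirmed) is the codec => "json" line: Winlogbeat already ships structured events over the Beats protocol, so the codec appears to be re-parsing the plain-text Message of event 140 as JSON, which would match the ParserError above. A variant we are considering drops the codec and only decodes payloads that actually look like JSON. This is an untested sketch, and the filter is our own addition, not part of HELK:

input {
  beats {
    port => 5044
    add_field => { "[@metadata][source]" => "beats" }
    ssl => false
  }
}

filter {
  # Hypothetical replacement for the input codec: only attempt JSON decoding
  # when the message starts with '{'; plain-text messages such as event 140's
  # pass through unchanged instead of raising a parse error.
  if [message] =~ /^\{/ {
    json {
      source => "message"
      skip_on_invalid_json => true
    }
  }
}
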
Beats Output:
# HELK Beats output conf file
# HELK build version: 0.9 (BETA)
# Author: Roberto Rodriguez (@Cyb3rWard0g)
# License: BSD 3-Clause

output {
  if [@metadata][source] == "beats" {
    elasticsearch {
      hosts => ["REDACTED:9200"]
      index => "logs-endpoint-beats-%{+YYYY.MM}"
    }
  }
}
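
To troubleshoot, we may also temporarily add a stdout output alongside the elasticsearch one, just to confirm whether event 140 reaches the output stage at all (debug-only sketch):

output {
  if [@metadata][source] == "beats" {
    elasticsearch {
      hosts => ["REDACTED:9200"]
      index => "logs-endpoint-beats-%{+YYYY.MM}"
    }
    # Debug only: echo every event to the Logstash log/console so we can
    # see whether event 140 survives the filter stage. Remove when done.
    stdout { codec => rubydebug }
  }
}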

Any idea how we can get this data into Elasticsearch so we can begin querying it with Kibana? Thank you!

