I am facing an encoding issue. When I run Logstash, the fields are displayed in an unreadable (garbled) format.
Logstash.conf
input {
  file {
    path => "C:/db2mon_v105/v105/db2mon_reports.csv"
    start_position => "beginning"
    sincedb_path => "null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["TIME_STAMP","TS_DELTA","MEMBER","ACT_PER_S","CMT_PER_S","RB_PER_S","DDLCK_PER_S","SEL_P_S","UID_P_S","ROWS_INS_P_S","ROWS_UPD_P_S","ROWS_RET_P_S","ROWS_MOD_P_S","PKG_CACHE_INS_P_S","P_RD_PER_S"]
  }
}
output {
  elasticsearch {
    hosts => "http://172.31.55.33:9200"
    user => "yyyyyy"
    password => "xxxxxxx"
    index => "icd"
  }
  stdout {
  }
}
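Since Logstash here reads the CSV straight from disk (Filebeat is not part of this pipeline), I was wondering whether I should set a charset on the file input's codec instead. Something like the snippet below is what I had in mind; the UTF-16LE value is only my guess at the file's real encoding, I have not confirmed it:

input {
  file {
    path => "C:/db2mon_v105/v105/db2mon_reports.csv"
    start_position => "beginning"
    sincedb_path => "null"
    # guessed charset; the file may actually be UTF-16BE, Windows-1252, or something else
    codec => plain { charset => "UTF-16LE" }
  }
}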
Please find my filebeat.yml file below:
###################### Filebeat Configuration Example #logstash
#=========================== Filebeat inputs =============================
filebeat.inputs:
filebeat.prospectors:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  encoding: utf-16
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*
    #- C:\Program Files\IBM\SQLLIB\TPAEAutomation\DB2_10\DB2_install.log
    #- C:\db2mon_v105\v105\db2mon_report.txt
    - C:\db2mon_v105\v105\db2mon_reports.csv
#============================= Filebeat modules ===============================
filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false
  # Period on which files under path should be checked for changes
  #reload.period: 10s
#==================== Elasticsearch template setting ==========================
setup.template.settings:
  index.number_of_shards: 5
  #index.codec: best_compression
  #_source.enabled: false
#================================ General =====================================
# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:
# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]
# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging
#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here, or by using the `-setup` CLI flag or the `setup` command.
#setup.dashboards.enabled: false
# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:
#============================== Kibana =====================================
# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:
  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "http://172.31.55.33:5601"
  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:
#============================= Elastic Cloud ==================================
# These settings simplify using filebeat with the Elastic Cloud (https://cloud.elastic.co/).
# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:
# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:
#================================ Outputs =====================================
# Configure what output to use when sending the data collected by the beat.
#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["172.31.55.33:9200"]
  # Enable ilm (beta) to use index lifecycle management instead of daily indices.
  #ilm.enabled: false
  # Optional protocol and basic auth credentials.
  #protocol: "https"
  username: "vvvvv"
  password: "pppppp"
#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["172.31.55.33:5044"]
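Also, as far as I can tell, Filebeat is currently shipping directly to Elasticsearch while Logstash reads the same CSV file from disk on its own, so the encoding: utf-16 setting in Filebeat never applies to the Logstash pipeline at all. If the correct approach is to ship through Filebeat into Logstash instead, I assume the Logstash side would need a beats input, roughly like the sketch below (the port is only taken from the commented-out output.logstash example above):

input {
  beats {
    # assumed listener port, matching the commented-out hosts: ["172.31.55.33:5044"] above
    port => 5044
  }
}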
This is what I see when I run Logstash:

Kindly let me know what needs to be done to rectify this issue.
Thanks in advance.