I have a multi-line Java exception log that I'm sending via rsyslog from a remote server to my ELK stack, like the one below (sorry for the verbosity). To me it looks like a valid JSON object.
{ "type":"pe-log", "host":"ip-10-53-234-234", "timestamp":"2017-02-24T20:06:08.036699+00:00", "@version":"1", "customer":"customer", "role":"app2", "sourcefile":"/tmp/error.log", "message":"2017-02-08 21:59:51,727 ERROR :localhost-startStop-1 [jdbc.sqlonly] 1. PreparedStatement.executeBatch() batching 1 statements:\\n1: insert into CR_CLUSTER_REGISTRY (Cluster_Name, Url, Update_Dttm, Node_Id) values ('customer', 'rmi:\/\/ip-10-53-254.254.eu-west-1.compute.internal:1199\/2', '02\/08\/2017 21:59:51.639', '2')\\n\\njava.sql.BatchUpdateException: [Teradata JDBC Driver] [TeraJDBC 15.00.00.35] [Error 1338] [SQLState HY000] A failure occurred while executing a PreparedStatement batch request. Details of the failure can be found in the exception chain that is accessible with getNextException.\\n at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:148)\\n at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeBatchUpdateException(ErrorFactory.java:137)\\n at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:272)\\n at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatch(TDPreparedStatement.java:2584)\\n at com.teradata.tal.qes.StatementProxy.executeBatch(StatementProxy.java:186)\\n at net.sf.log4jdbc.StatementSpy.executeBatch(StatementSpy.java:539)\\n at org.hibernate.jdbc.BatchingBatcher.doExecuteBatch(BatchingBatcher.java:70)\\n at org.hibernate.jdbc.AbstractBatcher.executeBatch(AbstractBatcher.java:268)\\n at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:266)\\n at org.hibernate.engine.ActionQueue.executeActions(ActionQueue.java:167)\\n at org.hibernate.event.def.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:321)\\n at org.hibernate.event.def.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:50)\\n at org.hibernate.impl.SessionImpl.flush(SessionImpl.java:1028)\\n at 
com.teradata.tal.common.persistence.dao.SessionWrapper.flush(SessionWrapper.java:920)\\n at com.teradata.trm.common.persistence.dao.DaoImpl.save(DaoImpl.java:263)\\n at com.teradata.trm.common.service.AbstractService.save(AbstractService.java:509)\\n at com.teradata.trm.common.cluster.Cluster.init(Cluster.java:413)\\n at com.teradata.trm.common.cluster.NodeConfiguration.initialize(NodeConfiguration.java:182)\\n at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:73)\\n at com.teradata.trm.common.context.Initializer.onApplicationEvent(Initializer.java:30)\\n at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:97)\\n at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:324)\\n at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:929)\\n at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:467)\\n at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:385)\\n at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:284)\\n at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:111)\\n at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4973)\\n at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5467)\\n at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)\\n at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901)\\n at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877)\\n at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:632)\\n at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1247)\\n at 
org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1898)\\n at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)\\n at java.util.concurrent.FutureTask.run(FutureTask.java:262)\\n at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)\\n at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)\\n at java.lang.Thread.run(Thread.java:745)\\nCaused by: java.sql.SQLException: [Teradata Database] [TeraJDBC 15.00.00.35] [Error -2801] [SQLState 23000] Duplicate unique prime key error in CIM_META.CR_CLUSTER_REGISTRY.\\n at com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:301)\\n at com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:114)\\n at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:311)\\n at com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:200)\\n at com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:137)\\n at com.teradata.jdbc.jdbc_4.statemachine.PreparedBatchStatementController.run(PreparedBatchStatementController.java:58)\\n at com.teradata.jdbc.jdbc_4.TDStatement.executeStatement(TDStatement.java:387)\\n at com.teradata.jdbc.jdbc_4.TDPreparedStatement.executeBatchDMLArray(TDPreparedStatement.java:252)\\n ... 37 more\\n"}
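As a first sanity check, the payload can be fed through a strict JSON parser outside of Logstash; a parse error there reports the exact character offset where decoding fails. This is a minimal sketch in Python, using a shortened stand-in string for the real event (the escaped `\n`/`\t` sequences inside the "message" field are legal JSON):

```python
import json

# Shortened stand-in for the rsyslog payload; the real event embeds the
# whole Java stack trace in the "message" field with \n escapes.
payload = ('{"type":"pe-log","host":"ip-10-53-234-234",'
           '"message":"2017-02-08 21:59:51,727 ERROR ...\\n'
           'java.sql.BatchUpdateException: ...\\n\\tat com.teradata..."}')

try:
    event = json.loads(payload)
    print("valid JSON; keys:", sorted(event))
except json.JSONDecodeError as e:
    # e.pos is the character offset where decoding failed -- usually the
    # same spot a json codec chokes on.
    print(f"invalid JSON at char {e.pos}: {e.msg}")
```

If the string saved from rsyslog parses cleanly here but still fails in Logstash, the payload is probably being altered in transit (for example truncated or split across UDP datagrams) before the codec sees it.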
I have the following Logstash conf file:
input {
  udp {
    port => 55xx
    codec => json
  }
}
Logstash throws a JSON parse error when it receives this log. Any clues? How can I debug what Logstash is complaining about? I see nothing in either the logstash.log or logstash.err files.
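One way to make the failure visible, assuming an otherwise standard setup: take the datagram in as a plain string and parse it in a json filter instead of the input codec. On a parse failure the json filter keeps the event and tags it `_jsonparsefailure`, so the raw payload stays inspectable in a rubydebug stdout output. A hedged sketch (same hypothetical port as above):

```
input {
  udp {
    port => 55xx
    codec => plain   # accept the raw datagram as-is
  }
}

filter {
  json {
    source => "message"
    # on failure the event is tagged _jsonparsefailure rather than dropped,
    # so the unparsed payload can be inspected in the output below
  }
}

output {
  stdout { codec => rubydebug }
}
```

Comparing the raw `message` field printed on failed events against the original log should show whether the JSON is arriving truncated or mangled.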