Failed to execute action: message=>"Unable to configure plugins"

Hello everyone! I am new to Logstash and I have been trying for a long time to import a dataset through it, but I keep getting an error. This is my most recent attempt. Without superuser it does not give me permission, even after using the chmod command. Can anyone please help me? I am determined to get this done and need some guidance.


root@ubuntu:/home/esupxi# /usr/share/logstash/bin/logstash -f /etc/logstash/logstash-tm.conf
Using bundled JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2022-08-15 06:11:29.133 [main] runner - NOTICE: Running Logstash as superuser is not recommended and won't be allowed in the future. Set 'allow_superuser' to 'false' to avoid startup errors in future releases.
[INFO ] 2022-08-15 06:11:29.228 [main] runner - Starting Logstash {"logstash.version"=>"8.3.3", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.15+10 on 11.0.15+10 +indy +jit [linux-x86_64]"}
[INFO ] 2022-08-15 06:11:29.236 [main] runner - JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[WARN ] 2022-08-15 06:11:30.877 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2022-08-15 06:11:38.238 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[INFO ] 2022-08-15 06:11:43.171 [Converge PipelineAction::Create<main>] Reflections - Reflections took 601 ms to scan 1 urls, producing 124 keys and 408 values 
[ERROR] 2022-08-15 06:11:45.849 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::JavaLang::IllegalStateException", :message=>"Unable to configure plugins: (ArgumentError) URI is not valid - host is not specified", :backtrace=>["org.logstash.config.ir.CompiledPipeline.<init>(CompiledPipeline.java:120)", "org.logstash.execution.JavaBasePipelineExt.initialize(JavaBasePipelineExt.java:85)", "org.logstash.execution.JavaBasePipelineExt$INVOKER$i$1$0$initialize.call(JavaBasePipelineExt$INVOKER$i$1$0$initialize.gen)", "org.jruby.internal.runtime.methods.JavaMethod$JavaMethodN.call(JavaMethod.java:837)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuper(IRRuntimeHelpers.java:1169)", "org.jruby.ir.runtime.IRRuntimeHelpers.instanceSuperSplatArgs(IRRuntimeHelpers.java:1156)", "org.jruby.ir.targets.InstanceSuperInvokeSite.invoke(InstanceSuperInvokeSite.java:39)", "usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.RUBY$method$initialize$0(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:48)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:333)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:87)", "org.jruby.RubyClass.newInstance(RubyClass.java:939)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:50)", 
"usr.share.logstash.logstash_minus_core.lib.logstash.pipeline_action.create.RUBY$method$execute$0$__VARARGS__(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:49)", "org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:80)", "org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:70)", "org.jruby.ir.targets.InvokeSite.invoke(InvokeSite.java:207)", "usr.share.logstash.logstash_minus_core.lib.logstash.agent.RUBY$block$converge_state$2(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:381)", "org.jruby.runtime.CompiledIRBlockBody.callDirect(CompiledIRBlockBody.java:138)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:52)", "org.jruby.runtime.Block.call(Block.java:139)", "org.jruby.RubyProc.call(RubyProc.java:318)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:105)", "java.base/java.lang.Thread.run(Thread.java:829)"]}
[INFO ] 2022-08-15 06:11:46.650 [LogStash::Runner] runner - Logstash shut down.

Also, this is the content of my config file:

input{
    file{
        path => "/home/esupxi/Desktop/datasets/train_mosaic.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter{
    csv{
        separator => ","
        columns => [
            "Destination_Port", " Flow_Duration", " Total_Fwd_Packets", " Total_Backward_Packets",
            " Total_Length_of_Fwd_Packets", " Total_Length_of_Bwd_Packets", " Fwd_Packet_Length_Max",
            " Fwd_Packet_Length_Min", " Fwd_Packet_Length_Mean", " Fwd_Packet_Length_Std",
            " Bwd_Packet_Length_Max", " Bwd_Packet_Length_Min", " Bwd_Packet_Length_Mean",
            " Bwd_Packet_Length_Std", " Flow_Bytes_Sec", " Flow_Packets_Sec", " Flow_IAT_Mean",
            " Flow_IAT_Std", " Flow_IAT_Max", " Flow_IAT_Min", " Fwd_IAT_Total", " Fwd_IAT_Mean",
            " Fwd_IAT_Std", " Fwd_IAT_Max", " Fwd_IAT_Min", " Bwd_IAT_Total", " Bwd_IAT_Mean",
            " Bwd_IAT_Std", " Bwd_IAT_Max", " Bwd_IAT_Min", " Fwd_PSH_Flags", " Bwd_PSH_Flags",
            " Fwd_URG_Flags", " Bwd_URG_Flags", " Fwd_Header_Length", " Bwd_Header_Length",
            " Fwd_Packets_Sec", " Bwd_Packets_Sec", " Min_Packet_Length", " Max_Packet_Length",
            " Packet_Length_Mean", " Packet_Length_Std", " Packet_Length_Variance", " FIN_Flag_Count",
            " SYN_Flag_Count", " RST_Flag_Count", " PSH_Flag_Count", " ACK_Flag_Count",
            " URG_Flag_Count", " CWE_Flag_Count", " ECE_Flag_Count", " Down_Up_Ratio",
            " Average_Packet_Size", " Avg_Fwd_Segment_Size", " Avg_Bwd_Segment_Size",
            " Fwd_Avg_Bytes_Bulk", " Fwd_Avg_Packets_Bulk", " Fwd_Avg_Bulk_Rate",
            " Bwd_Avg_Bytes_Bulk", " Bwd_Avg_Packets_Bulk", " Bwd_Avg_Bulk_Rate",
            " Subflow_Fwd_Packets", " Subflow_Fwd_Bytes", " Subflow_Bwd_Packets", " Subflow_Bwd_Bytes",
            " Init_Win_bytes_forward", " Init_Win_bytes_backward", " act_data_pkt_fwd",
            " min_seg_size_forward", " Active_Mean", " Active_Std", " Active_Max", " Active_Min",
            " Idle_Mean", " Idle_Std", " Idle_Max", " Idle_Min", " Label"
        ]
    }
}
output{
    elasticsearch{
        hosts => "http://localhost:127.0.0.1"
        index => "tm_ds"
    }
    stdout {}
}
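A side note on the column list: all the names after the first are quoted with a leading space, presumably matching the dataset's header row. The csv filter will create fields with those exact names, so any later lookups (for example in a mutate filter) must include the space. A minimal Python sketch of how that mapping behaves (this is not Logstash itself; the sample row values are hypothetical, and the list is shortened to three columns):

```python
import csv
import io

# Column names as configured in the csv filter above; note the leading
# space on all but the first, copied verbatim from the config.
columns = ["Destination_Port", " Flow_Duration", " Total_Fwd_Packets"]

# One hypothetical data row from the CSV file.
row = "80,12345,3"

# Parse the row the way a separator => "," csv filter would.
values = next(csv.reader(io.StringIO(row)))

# Like Logstash's csv filter, pair configured column names with parsed values.
event = dict(zip(columns, values))

print(event[" Flow_Duration"])  # field lookup must include the leading space
```

In other words, the resulting documents will contain fields named " Flow_Duration" rather than "Flow_Duration", which is easy to trip over later in Kibana or in further filters.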

This is wrong; it is not a valid URI, which is exactly what your error message says:

:message=>"Unable to configure plugins: (ArgumentError) URI is not valid - host is not specified"

The hosts option should be in the format http://host:port; try using http://localhost:9200.
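For reference, the corrected output section could look like this (assuming Elasticsearch is running locally on its default port 9200, without TLS or authentication):

```
output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "tm_ds"
    }
    stdout {}
}
```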

Hello, I restarted Elasticsearch and apparently it is no longer giving me that error; I have also written "http://localhost:9200" in the hosts option. But now it has been stuck at the following statement. Does it take long to load?

[INFO ] 2022-08-15 20:53:03.794 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

This is my updated config file:

input{
    file{
        path => "/home/esupxi/Desktop/datasets/train_mosaic.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter{
    csv{
        separator => ","
        columns => [
            "Destination_Port", " Flow_Duration", " Total_Fwd_Packets", " Total_Backward_Packets",
            " Total_Length_of_Fwd_Packets", " Total_Length_of_Bwd_Packets", " Fwd_Packet_Length_Max",
            " Fwd_Packet_Length_Min", " Fwd_Packet_Length_Mean", " Fwd_Packet_Length_Std",
            " Bwd_Packet_Length_Max", " Bwd_Packet_Length_Min", " Bwd_Packet_Length_Mean",
            " Bwd_Packet_Length_Std", " Flow_Bytes_Sec", " Flow_Packets_Sec", " Flow_IAT_Mean",
            " Flow_IAT_Std", " Flow_IAT_Max", " Flow_IAT_Min", " Fwd_IAT_Total", " Fwd_IAT_Mean",
            " Fwd_IAT_Std", " Fwd_IAT_Max", " Fwd_IAT_Min", " Bwd_IAT_Total", " Bwd_IAT_Mean",
            " Bwd_IAT_Std", " Bwd_IAT_Max", " Bwd_IAT_Min", " Fwd_PSH_Flags", " Bwd_PSH_Flags",
            " Fwd_URG_Flags", " Bwd_URG_Flags", " Fwd_Header_Length", " Bwd_Header_Length",
            " Fwd_Packets_Sec", " Bwd_Packets_Sec", " Min_Packet_Length", " Max_Packet_Length",
            " Packet_Length_Mean", " Packet_Length_Std", " Packet_Length_Variance", " FIN_Flag_Count",
            " SYN_Flag_Count", " RST_Flag_Count", " PSH_Flag_Count", " ACK_Flag_Count",
            " URG_Flag_Count", " CWE_Flag_Count", " ECE_Flag_Count", " Down_Up_Ratio",
            " Average_Packet_Size", " Avg_Fwd_Segment_Size", " Avg_Bwd_Segment_Size",
            " Fwd_Avg_Bytes_Bulk", " Fwd_Avg_Packets_Bulk", " Fwd_Avg_Bulk_Rate",
            " Bwd_Avg_Bytes_Bulk", " Bwd_Avg_Packets_Bulk", " Bwd_Avg_Bulk_Rate",
            " Subflow_Fwd_Packets", " Subflow_Fwd_Bytes", " Subflow_Bwd_Packets", " Subflow_Bwd_Bytes",
            " Init_Win_bytes_forward", " Init_Win_bytes_backward", " act_data_pkt_fwd",
            " min_seg_size_forward", " Active_Mean", " Active_Std", " Active_Max", " Active_Min",
            " Idle_Mean", " Idle_Std", " Idle_Max", " Idle_Min", " Label"
        ]
    }
}
output{
    elasticsearch{
        hosts => "http://localhost:9200"
        index => "tm_ds"
    }
    stdout {}
}
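The "Pipelines running" message is normal: with a file input, Logstash keeps running and tailing the file after reading it, so the process will not exit on its own. One way to check whether documents are actually arriving is to query the index directly (assuming Elasticsearch on localhost:9200 and the tm_ds index name from the config above):

```
# Count the documents written by the pipeline so far
curl -s "http://localhost:9200/tm_ds/_count?pretty"

# Or inspect the index status and document count
curl -s "http://localhost:9200/_cat/indices/tm_ds?v"
```

If the count keeps growing between calls, ingestion is working and it is just a matter of waiting for the file to finish.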

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.