Logstash not listening on port

I noticed that nothing is listening on port 5044 on my ELK server. Logstash is running. Is there some configuration I missed somewhere to have it running/listening on that port? Here are the last few lines from my log file:

[2020-11-13T14:36:17,578][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.10.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.8+10 on 11.0.8+10 +indy +jit [linux-x86_64]"}
[2020-11-13T14:36:19,947][INFO ][logstash.config.source.local.configpathloader] No config files found in path {:path=>"/etc/logstash/conf.d/*.conf"}
[2020-11-13T14:36:19,996][ERROR][logstash.config.sourceloader] No configuration found in the configured sources.
[2020-11-13T14:36:20,316][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Right now Logstash is looking for configuration files matching /etc/logstash/conf.d/*.conf, and it looks like you don't have any in that folder. So create a file called name.conf in /etc/logstash/conf.d/ with the below suggestions, then restart Logstash so it picks the file up.

You need to create a pipeline configuration using this structure.

Most likely you will want a TCP or UDP input if you are listening on a port.

No filter is needed unless you want to manipulate the data.

Then an elasticsearch output.
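A minimal sketch of that structure (the port and Elasticsearch address are placeholders — use whatever your setup actually needs):

```conf
input {
  # Listen for incoming connections on a port (5044 is just an example)
  tcp {
    port => 5044
  }
}

# No filter block unless you want to manipulate the data

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
}
```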

Thank you for your response and help.
Here is what I am seeing from the client-side for Logstash:

Failed to publish events caused by: read tcp 192.168.0.100:46456->192.168.0.103:5044: i/o timeout
isher]        pipeline/retry.go:223          done
stash]        logstash/async.go:280        Failed to publish events caused by: write tcp 192.168.0.100:46456->192.168.0.103:5044: use of closed network connection

On the server, 5044 is now listening:

tcp6       0      0 :::5044                 :::*                    LISTEN      7744/java           

In the logstash folder on the server, I created log4j2.xml:

<Configuration>
  <Appenders>
     <Socket name="Socket" host="192.168.0.103" port="5044">
       <JsonLayout compact="true" eventEol="true" />
    </Socket>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="Socket"/>
    </Root>
  </Loggers>
</Configuration>

In my name.conf file:

input {
  tcp {
    port => 5044
    codec => json
  }
}

filter {
  date {
    match => [ "timeMillis", "UNIX_MS" ]
  }
}

output {
  elasticsearch {
    index => "%{[@metadata][beat]}"
  }
}

Also, is TCP or UDP recommended?

TCP is recommended over UDP, since TCP is less likely than UDP to lose messages.
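For comparison, a UDP input only differs in the plugin name (same example port as above):

```conf
input {
  udp {
    port => 5044  # datagrams are unacknowledged, so drops go unnoticed
  }
}
```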

Ok, and what about the error message that I am getting? How would I resolve that?

Sorry, I didn't see the other question. Can you change your output to the below and let me know what it prints?

output {
  stdout { }
}
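If you want the full event structure pretty-printed while debugging, the rubydebug codec is a common choice:

```conf
output {
  stdout { codec => rubydebug }  # prints each event as a formatted hash
}
```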

I am not sure why, but I am having a hard time stopping/starting/restarting the logstash service. Here is the error that I am seeing:

[2020-11-17T15:40:19,620][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error. {:url=>"http://127.0.0.1:9200/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :error=>"Elasticsearch Unreachable: [http://127.0.0.1:9200/][Manticore::SocketException] Connection refused (Connection refused)"}

I see elasticsearch is listening on 127.0.0.1 instead of the 192 IP. But when I view the elasticsearch conf file, I see the network.host is set to this 192 address. Is there another place this needs to be defined, that it's being pointed to?

What do you have in your Logstash elasticsearch output for hosts? The default is 127.0.0.1, so if you don't set it to the right IP, Logstash won't try to connect there.

Here is what it looks like now:

output {
  elasticsearch {
    index => "%{[@metadata][beat]}"
    hosts => "192.168.0.103"
  }
}

Here is what I am seeing now though, on logstash:

 [2020-11-18T13:08:12,824][WARN ][logstash.codecs.jsonlines][main][26da92079e525d4bfdac5a892ff28079c6695bd768a516e8a992f0d588033c05] Received an event that has a different character encoding than you configured. {:text=>"\\u000E\\x97P]...
 [2020-11-18T13:08:12,826][WARN ][logstash.codecs.jsonlines][main][26da92079e525d4bfdac5a892ff28079c6695bd768a516e8a992f0d588033c05] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'z': was expecting ('true', 'false' or 'null')
 at [Source: (String)"z -9\x92\u0001~\u0000/\f\x960l...

EDIT:
I added

codec => plain {
  charset => "ISO-8859-1"
}

But am getting similar error messages:

JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character...
Received an event that has a different character encoding than you configured. {:text=>"\\xB6\\xA6}e#\\x
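For context, the full input block now looks like this (just the earlier tcp input with the codec swapped in):

```conf
input {
  tcp {
    port => 5044
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
}
```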

bump

Since this is a new issue, you might want to create a new thread about message parsing.