Help with Logstash and XML

I've been searching the internet for the last 6 hours trying to find a guide, tutorial, hint, or any kind of help for loading the content of some .xml files into ES using Logstash.

For the life of me I haven't been able to do so or find anything to shed some light on the matter, so I come here... asking for anything: a link to some forgotten "how-to for dummies" guide, some help... provided that my code is at all "right"... anything really.

I have an XML file called log.xml; it looks something like this:

It is saved inside: c:\Users\my.user\Desktop\TestLogsFolder\

I run Logstash from cmd with:

logstash -f pathtoconfigfile/configfile.conf

The configuration file (configfile.conf) has the following text inside it:

input {
  file {
    type => "file"
    path => ["c:\Users\my.user\Desktop\TestLogsFolder\log.xml"]
    start_position => beginning
  }
}

filter {
  xml {
    source => "message"
    target => "message_parsed"
    add_tag => ["xml_parsed"]
    remove_namespaces => true
    xpath => [
      "/testsuites/testsuite/testcase[1]", "testcase1names"
    ]
  }
}

output {
  stdout { codec => "rubydebug" }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "QAlogs-dc-%{+YYYY.MM.dd.hh.mm}"
    #protocol => "http"
    document_type => "QAlogs"
    #document_id => "%{[id]}"
  }
}

I do this while ES is running; Logstash gives me the message "Pipeline main started" and then... nothing. The ES database doesn't receive any new data/indices, etc.

I've added data to that database with stdin{} and it worked just fine.

If anyone could help me figure out how the hell I should go about this, I would be beyond grateful.

It's likely to be a sincedb thing: find the existing file and then remove it.

It's better to either use stdin, or point sincedb at a null file, if you are testing.
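For reference, a minimal sketch of that suggestion (the path here is an assumption based on the config above; on Windows the null device is NUL, so pointing sincedb_path at it makes Logstash forget its read position between runs):

```
input {
  file {
    # Forward slashes are generally safer than backslashes in Windows paths
    path => ["c:/Users/my.user/Desktop/TestLogsFolder/log.xml"]
    start_position => "beginning"
    # Discard read-position state so the whole file is re-read on every run
    sincedb_path => "NUL"
  }
}
```

With this in place you can restart Logstash repeatedly while testing and it will always re-read the file from the top.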

Ok, sorry for the late reply; I was away this whole weekend.
I tried deleting all the .sincedb* files I had (I tried reading multiple files).

Right now I have an xml filter and an output that work perfectly as long as I paste XMLs into the console and press Enter, using the stdin{} input.

I've tried various ways to "fix" the file input and I'm not sure what to do

Basically what I have right now (I've tried to keep it as bare bones as possible) is:

input {
  file {
    path => ["c:\Users\my.user\Desktop\qalog\log2.xml"]
    start_position => beginning
  }
}

I've tried reading .txt files instead... nothing (curiously enough, it worked when I tried the example from the website with an Apache server log).

... yet with the above config a .sincedb* file is generated but never updated; it stays at 0 0 0 0.

I've tried putting " " around beginning, I've tried changing it to "end", and I've tried appending data to the file while the pipeline is running; nothing happens, my console just stays at:

Settings: Default pipeline workers: 4
Pipeline main started

and chills there for an indefinite amount of time.

I'm quite sure I have committed a very obvious newbie mistake, but for the life of me I can't figure out what it is.

Are the input files older than 24 hours? See the file input's ignore_older option.
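To illustrate that option: by default the file input skips files whose last modification is older than a cutoff (historically 86400 seconds, i.e. 24 hours). A sketch of raising it (the path and the one-week value are assumptions for illustration):

```
input {
  file {
    path => ["c:/Users/my.user/Desktop/TestLogsFolder/log.xml"]
    start_position => "beginning"
    # Accept files last modified up to one week ago (value is in seconds)
    ignore_older => 604800
  }
}
```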

I've tried updating the files with more info, creating new files, and setting ignore_older => 0, but it didn't seem to affect anything.