There's nothing in stock Logstash to merge multiple input files in the way you describe. What might work is reading them independently and updating the ES index twice. You'd have to pick a well-defined document id (perhaps the Session_ID field?). The first time a given session id is seen it'll create the document and the next time it'll update it with the additional fields. However, I don't think Logstash's elasticsearch output does partial document updates so you'd have to use an elasticsearch filter to fetch the missing fields. Yuck. I think this is something you'll want to do outside of Logstash.
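The "well-defined document id" approach above could be sketched roughly like this in the elasticsearch output — a minimal, hypothetical fragment assuming the field is really named `Session_ID` and the index is called `sessions` (adjust to your setup):

```
output {
  elasticsearch {
    hosts       => ["localhost:9200"]
    # Deterministic id: both pipelines write to the same document
    # for a given session. Field name is an assumption from the question.
    document_id => "%{Session_ID}"
    index       => "sessions"
  }
}
```

Note that with a plain index operation the second write will overwrite the first document rather than merge into it, which is exactly the limitation described above.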
I'm checking whether Elasticsearch supports this kind of scripting or not.
And I responded to your question. If some part of the answer is unclear, please ask a specific question about that part. Don't post the same question all over again.