How to pull data from 2 Kafka topics using Logstash and index the data into two separate indices in Elasticsearch

Put both topics in the topics setting and enable decorate_events so the topic name is available in the event metadata.

input {
  kafka {
    bootstrap_servers => "servera:9092"
    topics => ["first_topic", "second_topic"]
    decorate_events => true
  }
}

With decorate_events enabled, the input does this internally:

event.set("[@metadata][kafka][topic]", record.topic)

That means you can then use conditional blocks to add a field holding the index name.

filter {
  if [@metadata][kafka][topic] == "first_topic" {
    mutate {
      add_field => { "[es_index]" => "logstash-nco" }
    }
  } else if [@metadata][kafka][topic] == "second_topic" {
    mutate {
      add_field => { "[es_index]" => "logstash-tiv" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticA:9200"]
    index => "%{es_index}"
  }
}
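
A side effect of this approach is that es_index is stored in each indexed document. A common variant (a sketch, not part of the original answer) is to keep the value under @metadata, which Logstash never sends to the output, so it can select the index without being indexed itself:

filter {
  if [@metadata][kafka][topic] == "first_topic" {
    mutate {
      # @metadata is available to sprintf references but is not shipped to outputs
      add_field => { "[@metadata][es_index]" => "logstash-nco" }
    }
  } else if [@metadata][kafka][topic] == "second_topic" {
    mutate {
      add_field => { "[@metadata][es_index]" => "logstash-tiv" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticA:9200"]
    index => "%{[@metadata][es_index]}"
  }
}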

However, if your topics are "nco" and "tiv" then it's much simpler: no conditionals and no filter at all, because the elasticsearch output's index option can reference the Kafka metadata directly.

input {
  kafka {
    bootstrap_servers => "servera:9092"
    topics => ["nco", "tiv"]
    decorate_events => true
  }
}
output {
  elasticsearch {
    hosts => ["elasticA:9200"]
    index => "logstash-%{[@metadata][kafka][topic]}"
  }
}
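
To verify that the topic name is actually landing in the metadata, you can temporarily add a stdout output using the rubydebug codec with its metadata option enabled, which prints the @metadata fields alongside the rest of the event:

output {
  # debug only: prints events including [@metadata][kafka][topic]
  stdout { codec => rubydebug { metadata => true } }
}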