_cat/pending_tasks

GET _cat/pending_tasks

754684456 42.9s URGENT delete-index [.node_monitor_jiesi-7-2016.07.05]                                       
754684458 14.4s URGENT create-index [writer_mq_test], cause [api]                                            
754681510  1.2m NORMAL indices_store ([[pop-afs-comp-v1][4]] active fully on other nodes)                    
754681514  1.2m NORMAL indices_store ([[data_worldwide_weekly3][9]] active fully on other nodes)             
754681513  1.2m NORMAL indices_store ([[data_worldwide_weekly3][17]] active fully on other nodes)            
754681511  1.2m NORMAL indices_store ([[pop-afs-comp-v1][7]] active fully on other nodes)                    
754681519  1.2m NORMAL indices_store ([[data_worldwide_weekly3][15]] active fully on other nodes)            
754681523  1.2m NORMAL indices_store ([[pop-afs-comp-v1][2]] active fully on other nodes)                    
754681515  1.2m NORMAL indices_store ([[pop-afs-comp-v1][0]] active fully on other nodes)                    
754681517  1.2m NORMAL indices_store ([[data_worldwide_weekly3][2]] active fully on other nodes)             
754681516  1.2m NORMAL indices_store ([[ege][0]] active fully on other nodes)                                
754681512  1.2m NORMAL indices_store ([[pop-afs-comp-v1][1]] active fully on other nodes)                    
754681522  1.2m NORMAL indices_store ([[data_worldwide_weekly3][14]] active fully on other nodes)            
754681521  1.2m NORMAL indices_store ([[data_worldwide_weekly3][16]] active fully on other nodes)            
754681625  1.2m NORMAL indices_store ([[procurement_v2][0]] active fully on other nodes)                     
754681527  1.2m NORMAL indices_store ([[data_worldwide_weekly3][7]] active fully on other nodes)             
754681524  1.2m NORMAL indices_store ([[data_worldwide_weekly3][1]] active fully on other nodes)             
754681558  1.2m NORMAL indices_store ([[data_worldwide_weekly3][0]] active fully on other nodes)             
...

There are a lot of indices_store tasks queued. How can I resolve this? The cluster is responding slowly now.
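
If the JSON form is useful, the same queue can also be pulled from the cluster-level pending tasks API (the standard endpoint, shown here only for reference):

GET _cluster/pending_tasks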

When I try to delete an index, I sometimes get this response:

{
  "error": {
    "root_cause": [
      {
        "type": "process_cluster_event_timeout_exception",
        "reason": "failed to process cluster event (delete-index [.node_monitor_jiesi-2-2016.06.14]) within 30s"
      }
    ],
    "type": "process_cluster_event_timeout_exception",
    "reason": "failed to process cluster event (delete-index [.node_monitor_jiesi-2-2016.06.14]) within 30s"
  },
  "status": 503
}
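
For reference, the delete call is just a plain delete-index request with no extra parameters (reconstructed here from the index name in the error above):

DELETE .node_monitor_jiesi-2-2016.06.14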

How can I clear these pending tasks? Responses are still slow.