How to check the result of the tokenizer after inserting one document?

For my requirements, I built a custom analyzer. I know how to check the result of this analyzer directly; it looks like this:
curl -XGET "192.168.0.1:9200/test/_analyze?pretty=1&analyzer=pattern_analyzer" -d '2016/08/05 09:57:23 login.go:442: Info: needAAS = true'

And below is my setting:
curl -XPUT "192.168.0.1:9200/wangxiang_test_5?pretty=1" -d '
{
  "settings": {
    "analysis": {
      "analyzer": {
        "pattern_analyzer": {
          "type": "custom",
          "tokenizer": "pattern_tokenizer"
        }
      },
      "tokenizer": {
        "pattern_tokenizer": {
          "type": "pattern",
          "pattern": "(\d+:\d+:\d+)|(\d+/\d+/\d+)|\w+",
          "group": "0"
        }
      }
    },
    "mappings": {
      "logs": {
        "properties": {
          "message": {
            "type": "string",
            "analyzer": "pattern_analyzer"
          }
        }
      }
    }
  }
}'

And below is how I insert data:

curl -XPOST "192.168.0.1:9200/test/logs" -d '{"message":"2016/08/05 09:57:23 login.go:442: Info: needAAS = true"}'

But it seems that my custom analyzer doesn't work when I insert a document into this index. I want to check the token result when inserting a document into this index. Is there any API that can do this? Or is there anything wrong with my settings?

Hi,
You could try and have a look at the term vector for that particular test document using the _termvectors API.
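For example, if your test document were indexed with id 1 (the id here is an assumption; use whatever id your document actually got), something like this would show how the message field was tokenized at index time:

```shell
# Show the terms stored for the "message" field of document 1.
# Host, index, type, and id are taken from the posts above / assumed.
# If term vectors are not stored in the mapping, Elasticsearch generates
# them on the fly from the _source, so no mapping change is needed.
curl -XGET "192.168.0.1:9200/wangxiang_test_5/logs/1/_termvectors?pretty=1&fields=message"
```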

You set your custom analyzer on the wangxiang_test_5 index; however, you post the data to the test index.
You should either apply the analyzer settings to the test index or post the data to wangxiang_test_5.

I am sorry, I forgot to modify this. In fact, I use the same index; I just pasted the wrong one.

Could you show the mappings and settings of your test index?
I copied and pasted your settings into Sense, and found two errors in your request.
One is the nesting level of mappings in your request.
The other is the pattern setting.

Here is the request I posted to Elasticsearch 2.3.4:

PUT /wangxiang_test_5?pretty=1
{
  "settings": {
    "analysis": {
      "analyzer": {
        "pattern_analyzer": {
          "type": "custom",
          "tokenizer": "pattern_tokenizer"
        }
      },
      "tokenizer": {
        "pattern_tokenizer": {
          "type": "pattern",
          "pattern": "(\\d+:\\d+:\\d+)|(\\d+\\/\\d+\\/\\d+)|\\w+",
          "group": "0"
        }
      }
    }
  },
  "mappings": {
    "logs": {
      "properties": {
        "message": {
          "type": "string",
          "analyzer": "pattern_analyzer"
        }
      }
    }
  }
}

GET /wangxiang_test_5/_settings

GET /wangxiang_test_5/_analyze?pretty=1&analyzer=pattern_analyzer
{
  "analyzer" : "pattern_analyzer",
  "text" : "2016/08/05 09:57:23 login.go:442: Info: needAAS = true"
}

And the response is:

{
  "tokens": [
    {
      "token": "2016/08/05",
      "start_offset": 0,
      "end_offset": 10,
      "type": "word",
      "position": 0
    },
    {
      "token": "09:57:23",
      "start_offset": 11,
      "end_offset": 19,
      "type": "word",
      "position": 1
    },
    {
      "token": "login",
      "start_offset": 20,
      "end_offset": 25,
      "type": "word",
      "position": 2
    },
    {
      "token": "go",
      "start_offset": 26,
      "end_offset": 28,
      "type": "word",
      "position": 3
    },
    {
      "token": "442",
      "start_offset": 29,
      "end_offset": 32,
      "type": "word",
      "position": 4
    },
    {
      "token": "Info",
      "start_offset": 34,
      "end_offset": 38,
      "type": "word",
      "position": 5
    },
    {
      "token": "needAAS",
      "start_offset": 40,
      "end_offset": 47,
      "type": "word",
      "position": 6
    },
    {
      "token": "true",
      "start_offset": 50,
      "end_offset": 54,
      "type": "word",
      "position": 7
    }
  ]
}
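The pattern fix comes down to JSON escaping: the regex travels inside a JSON body, and JSON has its own backslash escape layer, so each backslash in the regex must be written twice. A quick way to check what Elasticsearch actually receives (python3 is used here only as a convenient JSON decoder):

```shell
# JSON "\\d" decodes to the regex escape \d; a lone "\d" is not a valid
# JSON escape sequence, which is why the single-backslash pattern failed.
printf '%s' '{"pattern": "(\\d+:\\d+:\\d+)|(\\d+/\\d+/\\d+)|\\w+"}' \
  | python3 -c 'import sys, json; print(json.load(sys.stdin)["pattern"])'
# prints: (\d+:\d+:\d+)|(\d+/\d+/\d+)|\w+
```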

Yeah, you found my mistakes. First, I got the wrong nesting level for mappings; and second, the pattern: the backslashes should be doubled ("\\"), but I used only one ("\").

Thanks for your reply.