Client error response [status code] 400

Hi,
I have a MySQL field named 'islem'. Its default value is 'Y'. Example values:

islem=Y
islem=YD
islem=YDI
islem=YDDD
islem=YI
islem=YDS
islem=YS
islem=YDDDD, etc.

I created a nested mapping for this field in the Sense editor, because the field has to be an array. Then I indexed the data with PHP, but I got this error:

Fatal error: Uncaught exception 'Guzzle\Http\Exception\ClientErrorResponseException' with message 'Client error response
[status code] 400
[reason phrase] Bad Request
[url] http://localhost:9200/ihaleler/ihale/1030235' in /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Exception/BadResponseException.php:43
Stack trace:
#0 /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Message/Request.php(145): Guzzle\Http\Exception\BadResponseException::factory(Object(Guzzle\Http\Message\EntityEnclosingRequest), Object(Guzzle\Http\Message\Response))
#1 [internal function]: Guzzle\Http\Message\Request::onRequestError(Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#2 /home/admin/web/server.com/public_html/vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php(164): call_user_func(Array, Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#3 /home/admin/web/ in /home/admin/web/server.com/public_html/vendor/elasticsearch/elasticsearch/src/Elasticsearch/Connections/GuzzleConnection.php on line 266

Please help me.
Thank you.

Have you looked in the ES log for additional clues about what was wrong with the request?

My log file content:

   [2016-03-02 12:27:59,827][WARN ][bootstrap                ] unable to install syscall filter: seccomp unavailable: CONFIG_SECCOMP not compiled into kernel, CONFIG_SECCOMP and CONFIG_SECCOMP_FILTER are needed
    [2016-03-02 12:28:00,237][INFO ][node                     ] [Thena] version[2.1.1], pid[15994], build[40e2c53/2015-12-15T13:05:55Z]
    [2016-03-02 12:28:00,237][INFO ][node                     ] [Thena] initializing ...
    [2016-03-02 12:28:00,683][INFO ][plugins                  ] [Thena] loaded [license, marvel-agent], sites []
    [2016-03-02 12:28:00,705][INFO ][env                      ] [Thena] using [1] data paths, mounts [[/ (/dev/mapper/vg_server-lv_root)]], net usable_space [8.8gb], net total_space [17.1gb], spins? [possibly], types [ext4]
    [2016-03-02 12:28:04,721][INFO ][node                     ] [Thena] initialized
    [2016-03-02 12:28:04,721][INFO ][node                     ] [Thena] starting ...
    [2016-03-02 12:28:04,851][INFO ][transport                ] [Thena] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::2]:9300}, {[::1]:9300}
    [2016-03-02 12:28:04,863][INFO ][discovery                ] [Thena] elasticsearch/tU9myl0CS22cfwqS50cZTQ
    [2016-03-02 12:28:07,976][INFO ][cluster.service          ] [Thena] new_master {Thena}{tU9myl0CS22cfwqS50cZTQ}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
    [2016-03-02 12:28:08,023][INFO ][http                     ] [Thena] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::2]:9200}, {[::1]:9200}
    [2016-03-02 12:28:08,023][INFO ][node                     ] [Thena] started
    [2016-03-02 12:28:08,805][INFO ][license.plugin.core      ] [Thena] license [f6ce067d-afaf-49b1-ae49-33947f348a2d] - expired

The log file continues:

[2016-03-02 12:28:08,813][ERROR][license.plugin.core      ] [Thena] 
        #
        # LICENSE EXPIRED ON [Sunday, February 14, 2016]. IF YOU HAVE A NEW LICENSE, PLEASE
        # UPDATE IT. OTHERWISE, PLEASE REACH OUT TO YOUR SUPPORT CONTACT.
        # 
        # COMMERCIAL PLUGINS OPERATING WITH REDUCED FUNCTIONALITY
        # - marvel
        #  - The agent will stop collecting cluster and indices metrics
        [2016-03-02 12:28:08,931][INFO ][gateway                  ] [Thena] recovered [7] indices into cluster_state
        [2016-03-02 12:30:01,487][INFO ][cluster.metadata         ] [Thena] [ihaleler] deleting index
        [2016-03-02 12:30:04,243][INFO ][cluster.metadata         ] [Thena] [ihaleler] creating index, cause [api], templates [], shards [5]/[1], mappings []
        [2016-03-02 12:30:09,088][INFO ][cluster.metadata         ] [Thena] [ihaleler] create_mapping [ihale]

The log file continues:

 [2016-03-02 12:30:12,988][DEBUG][action.index             ] [Thena] [ihaleler][2], node[tU9myl0CS22cfwqS50cZTQ], [P], v[2], s[STARTED], a[id=Z2v3myV3RJyusjjYHei1Kg]: Failed to execute [index {[ihaleler][ihale][1030235], source[{"id":1030235,"usul":"A","tip":["H"],"ihaleno":"2015\/166464","durum":true,"islem":"Y","adi":"\u00d6ZEL G\u00dcVENL\u0130K H\u0130ZMET\u0130 ALINACAKTIR","ihale_il":50,"isin_il":50,"il":[50],"cetvel_durumu":"","kurum_enust_id":13,"kurum_ust_id":165,"kurum_id":8361,"ihale_tarihi":"2015-12-22 16:00:00","sektor":["75"],"maliyet":"X","ekap_ihale_id":"0c355bad26e17642500e4c5ff5bab48bc40eb80021dd941dd94c68eac4f30985","ekap_idare_id":"d29521afe06ec1e24efb581bf890a85370ec2e094026d2d4648a843cc205d623","ekleme":"2015-11-27 10:15:42","guncelleme":"0000-00-00 00:00:00"}]}]
                MapperParsingException[object mapping for [islem] tried to parse field [islem] as object, but found a concrete value]
                    at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:218)
                    at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:311)
                    at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:441)
                    at

The log file continues:

org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:267)
                    at org.elasticsearch.index.mapper.DocumentParser.innerParseDocument(DocumentParser.java:127)
                    at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:79)
                    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:304)
                    at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:551)
                    at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:542)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction.prepareIndexOperationOnPrimary(TransportReplicationAction.java:1049)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction.executeIndexRequestOnPrimary(TransportReplicationAction.java:1060)
                    at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:170)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase.performOnPrimary(TransportReplicationAction.java:579)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase$1.doRun(TransportReplicationAction.java:452)
                    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
                    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
                    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
                    at java.lang.Thread.run(Thread.java:745)
                [2016-03-02 12:30:12,991][INFO ][rest.suppressed          ] /ihaleler/ihale/1030235 Params: {index=ihaleler, id=1030235, type=ihale}
                MapperParsingException[object mapping for [islem] tried to parse field [islem] as object, but found a concrete value]
                    at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:218)
                    at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:311)
                    at org.elasticsearch.index.mapper.DocumentParser.parseValue(DocumentParser.java:441)
                    at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:267)
                    at org.elasticsearch.index.mapper.DocumentParser.innerParseDocument(DocumentParser.java:127)
                    at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:79)
                    at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:304)
                    at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:551)
                    at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:542)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction.prepareIndexOperationOnPrimary(TransportReplicationAction.java:1049)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction.executeIndexRequestOnPrimary(TransportReplicationAction.java:1060)
                    at org.elasticsearch.action.index.TransportIndexAction.shardOperationOnPrimary(TransportIndexAction.java:170)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase.performOnPrimary(TransportReplicationAction.java:579)
                    at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryPhase$1.doRun(TransportReplicationAction.java:452)
                    at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37)
                    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
                    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
                    at java.lang.Thread.run(Thread.java:745)

I think the error is related to the mapping, based on what you posted earlier in this thread.

The field islem is mapped as an object in the index but the document you're trying to index has that field as a string or a number (probably). Mappings can't be changed without reindexing, but if you're just trying things out maybe you can just delete the current index and start over.

DELETE _all
PUT ihaleler

PUT ihaleler/ihale/_mapping
{
  "properties": {
    "islem": {
      "type": "nested"
    }
  }
}

result:

{
  "acknowledged": true
}
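As an aside, the same reset can also be scripted with the elasticsearch-php client used elsewhere in this thread. This is only a sketch: it assumes a reachable local cluster and the 1.x/2.x client API from the stack traces above, and it deletes only the `ihaleler` index rather than everything.

```php
<?php
// Sketch: drop and recreate the index, then install the nested mapping,
// via the elasticsearch-php client. Run against test data only.
require 'vendor/autoload.php';

$client = new Elasticsearch\Client();

// Delete just this index (safer than DELETE _all).
$client->indices()->delete(array('index' => 'ihaleler'));
$client->indices()->create(array('index' => 'ihaleler'));

// Same mapping as the Sense request above.
$client->indices()->putMapping(array(
    'index' => 'ihaleler',
    'type'  => 'ihale',
    'body'  => array(
        'ihale' => array(
            'properties' => array(
                'islem' => array('type' => 'nested'),
            ),
        ),
    ),
));
```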

After that I ran:

http://10.0.2.15:8080/indexleme.php?p=index

result:

Fatal error: Uncaught exception 'Guzzle\Http\Exception\ClientErrorResponseException' with message 'Client error response
[status code] 400
[reason phrase] Bad Request
[url] http://localhost:9200/ihaleler/ihale/1030235' in /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Exception/BadResponseException.php:43
Stack trace:
#0 /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Message/Request.php(145): Guzzle\Http\Exception\BadResponseException::factory(Object(Guzzle\Http\Message\EntityEnclosingRequest), Object(Guzzle\Http\Message\Response))
#1 [internal function]: Guzzle\Http\Message\Request::onRequestError(Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#2 /home/admin/web/server.com/public_html/vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php(164): call_user_func(Array, Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#3 /home/admin/web/ in /home/admin/web/server.com/public_html/vendor/elasticsearch/elasticsearch/src/Elasticsearch/Connections/GuzzleConnection.php on line 266

My index.php code:

if($p=='index'){

    $db->orderBy('id','desc');
    $ihaleler=$db->get('tbl_ihale');

    foreach($ihaleler as $ihale){
        $id=$ihale['id'];
        if($id>0){
            $params=array();

            $islem=$ihale['islem'];
            $islem=trim($islem, ',');
            $islem_array=explode(',', $islem);
            $islem_array=array_unique($islem_array);

            $dizi=array();
            $dizi['islem']=$ihale['islem'];
            $params['body']=$dizi;
            $params['index']=$index_name;
            $params['type']=$type_name;
            $params['id']=$id;

            $ret=$client->index($params);
        }}
    exit('indexed'); }    ?>

It looks like $params['body'] is an array. I don't think that's supported; the top-level entity in the indexed JSON document must be an object. And even if it is supported, the islem field looks like a string, but according to the mapping you posted it's supposed to be an object.

(This PHP code is unnecessarily hard to read. You should have a space on each side of the assignment operator (=) and you should align the closing braces better.)
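For illustration, one way to make the document match a `nested` mapping is to send islem as an array of objects. This is only a sketch: the subfield name `value` is made up here (the mapping above declares no properties), and it assumes the MySQL value is comma-separated, as the explode(',', ...) in the code above suggests. Note that array_unique() keeps the original keys, so without array_values() the gapped keys would make json_encode() emit a JSON object instead of an array.

```php
<?php
// Split the comma-separated MySQL value and wrap each code in an object so
// the document matches a `nested` mapping. The subfield name "value" is an
// assumption; any name works as long as queries use the same one.
function islemToNested($islem) {
    // array_values() renumbers the keys left gapped by array_unique(),
    // so the JSON encoder sees a list, not a map.
    $parts = array_values(array_unique(explode(',', trim($islem, ','))));
    $out = array();
    foreach ($parts as $p) {
        $out[] = array('value' => $p);
    }
    return $out;
}

echo json_encode(array('islem' => islemToNested('Y,D,D,D')));
// {"islem":[{"value":"Y"},{"value":"D"}]}
```

If you really just want an array of plain strings, a `string` mapping would do instead, since Elasticsearch accepts an array of scalars in any field without special mapping.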


Ok. I did.

    if($p=='index'){
        $db->orderBy('id','desc');
        $ihaleler=$db->get('tbl_ihale');

        foreach($ihaleler as $ihale){
            $id=$ihale['id'];
            if($id>0){
                $params=array();
               
                $islem      =$ihale['islem'];
                $islem      =trim($islem, ',');
                $islem_array=explode(',', $islem);
                $islem_array=array_unique($islem_array);

                $dizi   =array();
                $object = new stdClass();
                $object = (object) $dizi;

                $dizi['islem']=$ihale['islem'];

                $params['body'] =$object;
                $params['index']=$index_name;
                $params['type'] =$type_name;
                $params['id']   =$id;
                $ret            =$client->index($params);
            }}
        exit('indexed'); }    ?>

result:

It works now; the data was indexed. But filter.php still doesn't work.

After that, I ran this query:

http://10.0.2.15:8080/filter.php?islem=YDDD

result:

Fatal error: Uncaught exception 'Guzzle\Http\Exception\ClientErrorResponseException' with message 'Client error response
[status code] 400
[reason phrase] Bad Request
[url] http://localhost:9200/ihaleler/ihale/_search?from=0&size=1000' in /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Exception/BadResponseException.php:43
Stack trace:
#0 /home/admin/web/server.com/public_html/vendor/guzzle/http/Guzzle/Http/Message/Request.php(145): Guzzle\Http\Exception\BadResponseException::factory(Object(Guzzle\Http\Message\EntityEnclosingRequest), Object(Guzzle\Http\Message\Response))
#1 [internal function]: Guzzle\Http\Message\Request::onRequestError(Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#2 /home/admin/web/server.com/public_html/vendor/symfony/event-dispatcher/Symfony/Component/EventDispatcher/EventDispatcher.php(164): call_user_func(Array, Object(Guzzle\Common\Event), 'request.error', Object(Symfony\Component\EventDispatcher\EventDispatcher))
#3 in /home/admin/web/server.com/public_html/vendor/elasticsearch/elasticsearch/src/Elasticsearch/Connections/GuzzleConnection.php on line 266

My filter.php code:

    <?
    header('Content-Type: text/html; charset=utf-8');
    require('conn.php');
    require 'vendor/autoload.php';

    $client     = new Elasticsearch\Client();
    $index_name = 'ihaleler';
    $type_name  = 'ihale';

    $searchParams['index'] = $index_name;
    $searchParams['type']  = $type_name;

    $searchParams['from'] = 0;
    $searchParams['size'] = 1000;

    unset($filter_islem);
    if (isset($_GET['islem'])) {
        $islemler       = $_GET['islem'];
        $islemler       = trim($islemler, ',');
        $islemler_array = explode(',', $islemler);
        $islemler_array = array_unique($islemler_array);
        $filter_islem['match_phrase']['islem'] = $_GET['islem'];
    }
    if (isset($filter_islem) && is_array($filter_islem)) {
        $searchParams['body']['query']['bool']['should'] = $filter_islem;
    }
    $searchParams['body']['sort']['id']['order'] = 'asc';

    pa($searchParams);

    $queryResponse = $client->search($searchParams);

    $kayitlar   = $queryResponse['hits'];
    echo $total = $kayitlar['total'];
    $kayitlar   = $kayitlar['hits'];

    foreach ($kayitlar as $kayit) {
        $_source = $kayit['_source'];
        echo $_source['islem'];
    }
    exit('the end');
    ?>

I won't debug your PHP code for you, but if you show us more details from the ES log (or if you can dig up more information from the response that PHP gets) I might be able to help.

None of this is related to your failing request. Perhaps you can get PHP to say more about the failing request? As a last resort, sniff the traffic to see exactly what request is being sent and what ES has to say about it.
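To get PHP to say more, one option is to catch the Guzzle exception and print the HTTP response body, which contains Elasticsearch's own explanation of the 400. A sketch against the Guzzle 3 stack shown in the traces above; the helper name `esErrorReason` is made up here, and the `$client`/`$searchParams` variables are assumed to come from filter.php.

```php
<?php
// Sketch: pull Elasticsearch's explanation of a 400 out of the HTTP
// response body instead of letting the Guzzle exception kill the script.

// ES 2.x returns {"error": {"type": ..., "reason": ...}, "status": 400};
// ES 1.x returns {"error": "...", "status": 400}. Handle both shapes.
function esErrorReason($body) {
    $decoded = json_decode($body, true);
    if (is_array($decoded) && isset($decoded['error'])) {
        return is_array($decoded['error'])
            ? $decoded['error']['reason']
            : $decoded['error'];
    }
    return $body; // not JSON; return the raw body
}

if (isset($client, $searchParams)) { // guard so the sketch runs standalone
    try {
        $queryResponse = $client->search($searchParams);
    } catch (Guzzle\Http\Exception\ClientErrorResponseException $e) {
        // getBody(true) returns the response body as a string in Guzzle 3.
        echo esErrorReason($e->getResponse()->getBody(true));
        exit;
    }
}
```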

I am sorry, the log excerpts I sent were missing some lines. Can you look again?

 [2016-03-03 14:42:11,007][WARN ][bootstrap                ] unable to install syscall filter: seccomp unavailable: CONFIG_SECCOMP not compiled into kernel, CONFIG_SECCOMP and CONFIG_SECCOMP_FILTER are needed
    [2016-03-03 14:42:14,677][INFO ][node                     ] [Sangre] version[2.1.1], pid[22364], build[40e2c53/2015-12-15T13:05:55Z]
    [2016-03-03 14:42:14,678][INFO ][node                     ] [Sangre] initializing ...
    [2016-03-03 14:42:15,747][INFO ][plugins                  ] [Sangre] loaded [license, marvel-agent], sites []
    [2016-03-03 14:42:15,843][INFO ][env                      ] [Sangre] using [1] data paths, mounts [[/ (/dev/mapper/vg_server-lv_root)]], net usable_space [9.1gb], net total_space [17.1gb], spins? [possibly], types [ext4]
    [2016-03-03 14:42:21,683][INFO ][node                     ] [Sangre] initialized
    [2016-03-03 14:42:21,683][INFO ][node                     ] [Sangre] starting ...
    [2016-03-03 14:42:21,932][INFO ][transport                ] [Sangre] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}, {[::2]:9300}, {[::1]:9300}
    [2016-03-03 14:42:21,971][INFO ][discovery                ] [Sangre] elasticsearch/RrHlBU_pQvaecziVLgRrDw
    [2016-03-03 14:42:25,063][INFO ][cluster.service          ] [Sangre] new_master {Sangre}{RrHlBU_pQvaecziVLgRrDw}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-join(elected_as_master, [0] joins received)
    [2016-03-03 14:42:25,077][INFO ][http                     ] [Sangre] publish_address {127.0.0.1:9200}, bound_addresses {127.0.0.1:9200}, {[::2]:9200}, {[::1]:9200}
    [2016-03-03 14:42:25,077][INFO ][node                     ] [Sangre] started
    [2016-03-03 14:42:26,050][INFO ][license.plugin.core      ] [Sangre] license [f6ce067d-afaf-49b1-ae49-33947f348a2d] - expired
    [2016-03-03 14:42:26,059][ERROR][license.plugin.core      ] [Sangre] 
    #
    # LICENSE EXPIRED ON [Sunday, February 14, 2016]. IF YOU HAVE A NEW LICENSE, PLEASE
    # UPDATE IT. OTHERWISE, PLEASE REACH OUT TO YOUR SUPPORT CONTACT.
    # 
    # COMMERCIAL PLUGINS OPERATING WITH REDUCED FUNCTIONALITY
    # - marvel
    #  - The agent will stop collecting cluster and indices metrics
    [2016-03-03 14:42:26,139][INFO ][gateway                  ] [Sangre] recovered [3] indices into cluster_state
    [2016-03-03 14:52:26,061][ERROR][license.plugin.core      ] [Sangre] 
    #
    # LICENSE EXPIRED ON [Sunday, February 14, 2016]. IF YOU HAVE A NEW LICENSE, PLEASE
    # UPDATE IT. OTHERWISE, PLEASE REACH OUT TO YOUR SUPPORT CONTACT.
    # 
    # COMMERCIAL PLUGINS OPERATING WITH REDUCED FUNCTIONALITY
    # - marvel
    #  - The agent will stop collecting cluster and indices metrics
    [2016-03-03 15:02:26,066][ERROR][license.plugin.core      ] [Sangre] 
    #
    # LICENSE EXPIRED ON [Sunday, February 14, 2016]. IF YOU HAVE A NEW LICENSE, PLEASE
    # UPDATE IT. OTHERWISE, PLEASE REACH OUT TO YOUR SUPPORT CONTACT.
    # 
    # COMMERCIAL PLUGINS OPERATING WITH REDUCED FUNCTIONALITY
    # - marvel
    #  - The agent will stop collecting cluster and indices metrics

There's still nothing of interest in that log.