Why can't Elasticsearch find the data that Logstash synced over?

Environment: Windows Server 2008 R2
logstash: v6.2.1
elasticsearch: v6.2.1
I want to build a search service with Logstash + Elasticsearch: data is synced from a MySQL database to Elasticsearch through Logstash. My logstash.conf is as follows:

input {
  jdbc { 
    jdbc_connection_string => "jdbc:mysql://192.168.1.184:3306/niucai1"
    jdbc_user => "root"
    jdbc_password => "huqian123"
    # The path to our downloaded jdbc driver
    jdbc_driver_library => "E:\els\mysql-connector-java-5.1.45\mysql-connector-java-5.1.45-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    # run the query every minute
    schedule => "* * * * *"
    # our query
    statement => "select a.ProductId,a.ProductNo,a.ProductName,b.Width,b.Length,b.Thickness,d.ListItemName EnvClassName,b.PackageCount,b.Density,c.BoardTypeName from product a left outer join productboard b on a.ProductId = b.ProductId left outer join boardtype c on b.BoardTypeId = c.BoardTypeId left outer join listitem d on b.EnvClassId = d.ListItemId and d.ListId = 8 where a.IsView = 1 and a.ProductTypeId = 1 and a.ProductId > :sql_last_value order by a.ProductId"
    use_column_value => true
    # the jdbc input lowercases column names by default
    # (lowercase_column_names => true), so track the lowercased name
    tracking_column => "productid"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    hosts => "192.168.1.184:9200"
    index => "niucaiproduct"
    document_type => "product"
    # column names arrive lowercased from the jdbc input
    document_id => "%{productid}"
    manage_template => false
  }
}

The config above sets the document ID for indexing and configures incremental sync (the tracking column feeds :sql_last_value, so each scheduled run only pulls rows with a higher ProductId).
I then added a mapping template through the NEST API. The template uses the elasticsearch-analysis-ik analysis plugin, and calling the plugin through the API tokenizes text correctly:

var settings = new ConnectionSettings(new Uri("http://192.168.1.184:9200"))
                .DefaultMappingFor<Product>(t => t.IndexName("niucaiproduct"));
var client = new ElasticClient(settings);
var analyzeResult = client.Analyze(a => a.Analyzer("ik_max_word").Text("临沂宗利12mm阻燃密度板C级"));

The mapping code for the template:

[ElasticsearchType(Name = "product", IdProperty = "productid")]
public class Product
{
    [Number(NumberType.Long)]
    public long ProductId { get; set; }
    [Text]
    public string ProductNo { get; set; }
    [Text(Analyzer = "ik_max_word", Index = true, SearchAnalyzer = "ik_max_word")]
    public string ProductName { get; set; }
    [Number(NumberType.Float)]
    public float Width { get; set; }
    [Number(NumberType.Float)]
    public float Length { get; set; }
    [Number(NumberType.Float)]
    public float Thickness { get; set; }
    [Text]
    public string EnvClassName { get; set; }
    [Number(NumberType.Integer)]
    public int PackageCount { get; set; }
    [Number(NumberType.Float)]
    public float Density { get; set; }
    [Text]
    public string BoardTypeName { get; set; }
}
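Worth noting: by default NEST camelCases C# property names when inferring field names, so ProductName is mapped as the field "productName". The resolved name can be checked through the client's inferrer (a minimal sketch):

// NEST's default field inference camelCases property names,
// so this prints "productName".
var resolvedName = client.Infer.Field(Nest.Infer.Field<Product>(p => p.ProductName));
Console.WriteLine(resolvedName);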

The code that adds the index template:

var indexTemplate = client.PutIndexTemplate("product", t => t
    .IndexPatterns("niucaiproduct")
    .Mappings(m => m.Map<Product>(tmd => tmd
        .Dynamic(false)
        .AutoMap())));
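To double-check what was actually stored, the template can be read back; a sketch, assuming NEST 6.x (every NEST response also exposes DebugInformation with details of the call):

// Read the template back and inspect the generated mapping,
// including the exact field names NEST produced.
var template = client.GetIndexTemplate(t => t.Name("product"));
Console.WriteLine(template.DebugInformation);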

Then I started Logstash. The Elasticsearch log shows that the index was created and that my template was applied:
(screenshot: Elasticsearch log confirming the index was created with the template applied)
The data also syncs to Elasticsearch correctly; the following call verifies it can be fetched:

var testSearch = client.Get<Product>(1482);

The problem is that I can't find the data via search. The code:

var searchResponse = client.Search<Product>(s => s
    .From(0)
    .Size(9999)
    .Query(q => q
        .Match(m => m
            .Field(f => f.ProductName)
            .Query("临沂"))));
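If the field name in the index does not match what the lambda resolves to, a match query silently finds nothing. As a hedged check (a hypothetical diagnostic, not from the original post), the field can also be addressed with a literal string, bypassing NEST's inference:

// The Logstash jdbc input lowercases column names by default, so the
// synced documents may carry "productname" rather than the mapped
// "productName"; this query probes for the lowercased variant.
var altSearch = client.Search<Product>(s => s
    .Query(q => q
        .Match(m => m
            .Field("productname")
            .Query("临沂"))));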

Then I manually added one document through the API:

var indexDoc = client.IndexDocument(new Product
{
    ProductId = 1,
    ProductName = "宁国绿源22mm家具密度板罗定绿源2.7mmT薄板密度板",
    ProductNo = "58115082200011201",
    BoardTypeName = "密度板",
    EnvClassName = "定制"
});
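As an aside, newly indexed documents only become visible to search after the index refreshes (about once a second by default); in quick manual tests an explicit refresh avoids that race:

// Make the just-indexed document visible to search immediately.
client.Refresh("niucaiproduct");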

Searching the same way again finds only the document I added by hand. The data synced over by Logstash never shows up in search results and can only be fetched by ID. Why is that? Thanks.

It looks like Logstash isn't indexing correctly. Could you hit the ES API directly to confirm? Also, in the Logstash output section add

 stdout {}

and check whether any messages are printed.
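For example, the raw document can be fetched with the low-level client (a sketch, assuming NEST/Elasticsearch.Net 6.x and the ID 1482 used earlier); comparing the _source field names against the mapped ones is telling, since the jdbc input lowercases column names while NEST camelCases property names:

// GET niucaiproduct/product/1482 — _source shows the field names exactly
// as Logstash indexed them (e.g. "productname" rather than "productName").
var raw = client.LowLevel.Get<StringResponse>("niucaiproduct", "product", "1482");
Console.WriteLine(raw.Body);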
