Custom Analyzers
Reference (Elasticsearch Definitive Guide, custom analyzers): https://www.elastic.co/guide/cn/elasticsearch/guide/cn/custom-analyzers.html
PUT /my_index
{
  "settings": {
    "analysis": {
      "char_filter": { … custom character filters … },  // character filters
      "tokenizer": { … custom tokenizers … },           // tokenizers
      "filter": { … custom token filters … },           // token filters
      "analyzer": { … custom analyzers … }
    }
  }
}
============================Example===========================
PUT /my_index
{
  "settings": {
    "analysis": {
      "char_filter": {
        "&_to_and": {
          "type": "mapping",
          "mappings": [ "&=> and" ]
        }},
      "filter": {
        "my_stopwords": {
          "type": "stop",
          "stopwords": [ "the", "a" ]
        }},
      "analyzer": {
        "my_analyzer": {
          "type": "custom",
          "char_filter": [ "html_strip", "&_to_and" ],
          "tokenizer": "standard",
          "filter": [ "lowercase", "my_stopwords" ]
        }}
    }}}
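Before wiring the analyzer to any field, you can exercise it directly with the _analyze API by name. Walking the pipeline above: html_strip removes markup, the &_to_and char filter rewrites "&" to "and", the standard tokenizer splits on whitespace/punctuation, lowercase lowercases, and my_stopwords drops "the" and "a" — so the tokens should come back as quick, and, brown, fox:
GET /my_index/_analyze
{
  "analyzer": "my_analyzer",
  "text": "The quick & brown fox"
}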
============================Example===========================
For example, given a custom analyzer named my_analyzer, apply it to a newly added field in this index:
PUT /my_index/_mapping
{
  "properties": {
    "username": {
      "type": "text",
      "analyzer": "my_analyzer"
    },
    "password": {
      "type": "text"
    }
  }
}
=================Insert data====================
PUT /my_index/_doc/1
{
  "username": "The quick & brown fox",
  "password": "The quick & brown fox"
}
==== username uses the custom analyzer my_analyzer; password uses the default standard analyzer ==
=== Verification
GET /my_index/_analyze
{
  "field": "username",
  "text": "The quick & brown fox"
}
GET /my_index/_analyze
{
  "field": "password",
  "text": "The quick & brown fox"
}
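Assuming the index and mapping were created as above, the two calls should return different token streams, which confirms the per-field analyzer assignment. For username, my_analyzer maps "&" to "and" and strips the stopword "the"; for password, the default standard analyzer lowercases, discards "&" as punctuation, and keeps "the":
field "username" → tokens: quick, and, brown, fox
field "password" → tokens: the, quick, brown, fox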