How to create an index in Elasticsearch with an analyzer and shard defined?

I am trying to create an index with a mapping of text and keyword fields and a defined analyzer. Here is what I have tried so far:


{
  "settings": {
    "number_of_shards": 2,
    "number_of_replicas": 1
  },
  "analysis": {
    "normalizer": {
      "my_normalizer": {
        "type": "custom",
        "char_filter": [],
        "filter": ["lowercase", "asciifolding"]
      }
    }
  },
  "mappings": {
    "properties": {
      "question": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          },
          "normalize": {
            "type": "keyword",
            "normalizer": "my_normalizer"
          }
        }
      }
    }
  }
}

I have tried this, but I get the following error:

"error": {
    "root_cause": [
        {
            "type": "parse_exception",
            "reason": "unknown key [analysis] for create index"
        }
    ],
    "type": "parse_exception",
    "reason": "unknown key [analysis] for create index"
},
"status": 400

}

The problem is that I still need to add the fields for this mapping. I am trying this on the AWS Elasticsearch Service.

Good start, you're almost there!

The analysis section needs to go inside the top-level settings section, like this:

{
  "settings": {
    "index": {
      "number_of_shards": 2,
      "number_of_replicas": 1
    },
    "analysis": {
      "normalizer": {
        "my_normalizer": {
          "type": "custom",
          "char_filter": [],
          "filter": [
            "lowercase",
            "asciifolding"
          ]
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "question": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          },
          "normalize": {
            "type": "keyword",
            "normalizer": "my_normalizer"
          }
        }
      },
      "answer": {
        "type": "text",
        "fields": {
          "keyword": {
            "type": "keyword"
          },
          "normalize": {
            "type": "keyword",
            "normalizer": "my_normalizer"
          }
        }
      }
    }
  }
}
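
Once the index is created with that body, you can sanity-check the normalizer with the _analyze API. A minimal sketch, assuming the index is named questions and you are talking to a local cluster (on AWS ES you would send the same requests to your domain endpoint, with whatever auth/signing your domain requires; the file name index-body.json is just illustrative):

# Create the index using the settings and mappings shown above
curl -X PUT "localhost:9200/questions" -H 'Content-Type: application/json' -d @index-body.json

# Verify the normalizer: it should return a single lowercased, ASCII-folded token,
# e.g. "Héllo World" -> "hello world"
curl -X POST "localhost:9200/questions/_analyze" -H 'Content-Type: application/json' -d '
{
  "normalizer": "my_normalizer",
  "text": "Héllo World"
}'

After that, term-level queries against question.normalize (or answer.normalize) will match case- and accent-insensitively, while question.keyword keeps the original, unmodified value.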