After installing Elasticsearch, the default analyzer is `standard`, which handles Chinese poorly: it splits Chinese text into individual characters rather than words. To tokenize Chinese properly, we need to install the IK plugin.
First, enter the Elasticsearch container and install the plugin:
```
[root@iZbp1a1vzoew7k9hkqdryrZ ~]# docker exec -it elasticsearch /bin/bash
[root@0299080834c2 elasticsearch]# pwd
/usr/share/elasticsearch
[root@0299080834c2 elasticsearch]# elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.2.0/elasticsearch-analysis-ik-7.2.0.zip
```
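Once the install finishes, it is worth confirming the plugin registered. A minimal sketch, assuming the container is still named `elasticsearch` as above; the command is echoed here rather than executed, since it needs a running Docker container:

```shell
# Command to list installed plugins from the host; when run against the
# container, its output should include a line containing "analysis-ik".
CMD='docker exec elasticsearch elasticsearch-plugin list'
# Uncomment to run for real (requires Docker and the container):
# $CMD
echo "$CMD"
```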
The plugin can also be installed offline:
```
# Copy the archive from the host into the container
docker cp /tmp/elasticsearch-analysis-ik-7.2.0.zip elasticsearch:/usr/share/elasticsearch/plugins
# Enter the container
docker exec -it elasticsearch /bin/bash
# Create a directory for the plugin
mkdir /usr/share/elasticsearch/plugins/ik
# Move the archive into the ik directory
mv /usr/share/elasticsearch/plugins/elasticsearch-analysis-ik-7.2.0.zip /usr/share/elasticsearch/plugins/ik
# Change into the directory
cd /usr/share/elasticsearch/plugins/ik
# Unpack the archive
unzip elasticsearch-analysis-ik-7.2.0.zip
# Remove the archive
rm -f elasticsearch-analysis-ik-7.2.0.zip
```
Finally, exit the container and restart it so the plugin takes effect:
```
exit
docker restart elasticsearch
```
Then test whether the tokenizer works:
![](https://static.yuanchengzhushou.cn/image/8236.jpg)
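The screenshot above calls Elasticsearch's `_analyze` API with the IK analyzer. A minimal sketch of that request, assuming the cluster listens on `localhost:9200` (the sample sentence is illustrative); the `curl` call is left commented out since it needs a running cluster:

```shell
# Request body for the _analyze API: tokenize a sample sentence with ik_smart.
BODY='{"analyzer":"ik_smart","text":"中华人民共和国"}'
# Send it to the cluster (uncomment when Elasticsearch is reachable):
# curl -s -H 'Content-Type: application/json' \
#      -X POST 'http://localhost:9200/_analyze' -d "$BODY"
echo "$BODY"
```

A successful response contains a `tokens` array of whole words rather than single characters.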
You can also change `ik_smart` to `ik_max_word`: `ik_smart` produces the coarsest split (fewest, longest tokens), while `ik_max_word` emits every word the dictionary can find, including overlapping ones.
![](https://static.yuanchengzhushou.cn/image/ik_max_word.jpg)