Steps for Using the IK Analyzer with Elasticsearch in an ELK/EFK Stack
1. Install ES together with the IK plugin
Download the IK analyzer in advance (its version must exactly match the ES version):
https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v6.8.0/elasticsearch-analysis-ik-6.8.0.zip
After downloading, create an ik folder under the plugins directory and unzip the ZIP into it.
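The download-and-unzip step can be sketched on the host like this (a minimal sketch; the ./efk/es/plugins path is an assumption matching the volume mount in the compose file below, and the ik folder name is arbitrary):

```shell
# Prepare the plugin directory that will be mounted into the container.
# Path is an assumption matching the compose volume ./efk/es/plugins.
mkdir -p ./efk/es/plugins/ik
curl -L -o ik.zip \
  "https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v6.8.0/elasticsearch-analysis-ik-6.8.0.zip"
unzip ik.zip -d ./efk/es/plugins/ik
rm ik.zip
```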
Start ES with Docker Compose (note: in list-form environment entries, quotes become part of the value, so write ELASTIC_PASSWORD=xxxx without quotes):

es:
  container_name: es
  image: docker.elastic.co/elasticsearch/elasticsearch:6.8.0
  privileged: true
  ports:
    - "9200:9200"
  volumes:
    - ./efk/es/data:/usr/share/elasticsearch/data
    - ./efk/es/plugins:/usr/share/elasticsearch/plugins
  environment:
    - node.name=es
    - http.host=0.0.0.0
    - transport.host=127.0.0.1
    - "ES_JAVA_OPTS=-Xms64m -Xmx256m -Xmn128m"
    - bootstrap.memory_lock=true
    - discovery.type=single-node
    - xpack.security.enabled=true
    - xpack.security.http.ssl.enabled=false
    - xpack.security.transport.ssl.enabled=false
    - ELASTIC_PASSWORD=xxxx
  user: "1000:1000"
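Because the container runs as user 1000:1000, the mounted directories must be writable by that uid, or ES will fail to start. A hedged sketch of the host-side preparation and a sanity check (xxxx stands for the ELASTIC_PASSWORD placeholder above):

```shell
mkdir -p ./efk/es/data ./efk/es/plugins
chown -R 1000:1000 ./efk/es      # match the compose "user: 1000:1000"
docker-compose up -d es
# Once ES is up, the plugin list should include analysis-ik:
curl -u elastic:xxxx http://localhost:9200/_cat/plugins
```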
2. After ES is up, set the account passwords:
Enter the container: docker exec -it es bash
Run the interactive setup: ./bin/elasticsearch-setup-passwords interactive and set each password (XXXX).
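If you prefer not to use the interactive tool, the built-in users can also be set over the 6.x security API (a sketch; assumes the bootstrap password from ELASTIC_PASSWORD still works and uses the kibana built-in user as an example):

```shell
curl -u elastic:xxxx -X PUT "http://localhost:9200/_xpack/security/user/kibana/_password" \
  -H 'Content-Type: application/json' \
  -d '{"password": "XXXXXX"}'
```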
3. Start Kibana (note: the password value should not carry nested quotes — 'XXXXXX', not '"XXXXXX"'):

kibana:
  image: docker.elastic.co/kibana/kibana:6.8.0
  container_name: kibana
  ports:
    - "5601:5601"
  depends_on:
    - es
  volumes:
    - ./efk/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml
  environment:
    SERVER_NAME: kibana
    SERVER_HOST: '0.0.0.0'
    I18N_LOCALE: zh-CN
    #ELASTICSEARCH_URL: "http://es:9200"
    ELASTICSEARCH_HOSTS: 'http://es:9200'
    ELASTICSEARCH_USERNAME: 'elastic'
    ELASTICSEARCH_PASSWORD: 'XXXXXX'
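The compose entry mounts ./efk/kibana/kibana.yml but the post does not show that file; a minimal sketch consistent with the environment variables above (which take precedence over the file) might look like:

```yaml
# ./efk/kibana/kibana.yml — hedged sketch; the docker-compose env vars override these.
server.name: kibana
server.host: "0.0.0.0"
i18n.locale: "zh-CN"
elasticsearch.hosts: ["http://es:9200"]
elasticsearch.username: "elastic"
elasticsearch.password: "XXXXXX"
```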
4. After Kibana starts, log in at localhost:5601, open Dev Tools, and create an index template:
Here filebeat-6.8.0 is the template name; it will be needed when creating the index pattern.
PUT /_template/filebeat-6.8.0
{
  "index_patterns": ["filebeat-6.8.0-*"],
  "settings": {
    "index": {
      "number_of_shards": 1,
      "number_of_replicas": 1,
      "analysis": {
        "analyzer": {
          "default": {
            "type": "ik_max_word"
          }
        }
      }
    }
  }
}
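Before shipping any logs you can confirm the plugin is loaded from the same Dev Tools console; ik_max_word should split a Chinese phrase into overlapping terms:

```
GET /_analyze
{
  "analyzer": "ik_max_word",
  "text": "中华人民共和国"
}
```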
5. Install Filebeat

filebeat:
  image: docker.elastic.co/beats/filebeat:6.8.0
  container_name: filebeat
  restart: always
  privileged: true
  user: root
  environment:
    - setup.kibana.host=kibana:5601
    - output.elasticsearch.hosts=["es:9200"]
  volumes:
    - /var/lib/docker/containers:/var/lib/docker/containers:ro
    - ./efk/filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml
    - /var/run/docker.sock:/var/run/docker.sock:ro
  links: ['es']
  depends_on: ['es']
  deploy:
    resources:
      limits:
        memory: 1000m
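The mounted ./efk/filebeat/filebeat.yml is not shown in the post; a minimal sketch for Filebeat 6.8 under the assumptions above (Filebeat's default index name, filebeat-6.8.0-yyyy.MM.dd, matches the template's index_patterns, so the IK settings apply automatically):

```yaml
# ./efk/filebeat/filebeat.yml — hedged sketch; adjust inputs to your containers.
filebeat.inputs:
  - type: docker
    containers.ids:
      - '*'        # read logs of all containers via the mounted containers path
output.elasticsearch:
  hosts: ["es:9200"]
  username: "elastic"
  password: "XXXXXX"
setup.kibana:
  host: "kibana:5601"
```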
6. In Kibana, create an index pattern:
Note: the custom index pattern ID must be the name created in Dev Tools in step 4.

Once it is created, you can search with IK tokenization in Discover; Chinese text is supported, and so is splitting on decimal points.
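As a final check that IK is in effect, a match query in Dev Tools should hit documents containing any IK-produced sub-term of the phrase (message is Filebeat's default log field):

```
GET /filebeat-6.8.0-*/_search
{
  "query": {
    "match": { "message": "错误日志" }
  }
}
```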