You could just go into the elasticsearch data directory and delete the files by hand, but shouldn't it be done with a bit more finesse? A look at the documentation on the website shows it's actually quite simple: a single command does it.
# curl -XDELETE 'localhost:9200/*' cleared out all of the March index files; I found that deleting with curl is a lot faster than deleting with rm
#curl -XDELETE 'localhost:9200/*2016.*'
#curl -XDELETE 'localhost:9200/*2017.01.*'
#curl -XDELETE 'localhost:9200/*2017.02.*'
#curl -XDELETE 'localhost:9200/*2017.03.*'
#curl -XDELETE 'localhost:9200/ctivityescort-2016.*'
#curl -XDELETE 'localhost:9200/ctivityescort-2017.02.*'
#curl -XDELETE 'localhost:9200/ctivityescort-2017.03.*'
#curl -XDELETE 'localhost:9200/ctivityserver-2017.03.*'
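Before firing off wildcard deletes like the ones above, it's worth checking which indices the pattern would actually hit. A quick preview, assuming Elasticsearch is listening on localhost:9200 (adjust the host to your setup):

# list all indices with headers (name, size, doc count)
curl 'localhost:9200/_cat/indices?v'
# _cat/indices also accepts an index pattern, e.g. preview the March 2017 indices
curl 'localhost:9200/_cat/indices/*2017.03.*?v'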
delete_es_by_day.sh
#!/bin/sh
# example: sh delete_es_by_day.sh logstash-kettle-log logsdate 30
index_name=$1    # index name
daycolumn=$2     # date field name
savedays=$3      # number of days of data to keep
format_day=$4    # date format, optional
if [ ! -n "$savedays" ]; then
    echo "the arguments are not right, please try again..."
    exit 1
fi
if [ ! -n "$format_day" ]; then
    format_day='%Y%m%d'
fi
# cutoff date: everything on or before this day gets deleted
sevendayago=`date -d "-${savedays} day " +${format_day}`
curl -XDELETE "10.130.3.102:9200/${index_name}/_query?pretty" -d "
{
"query": {
"filtered": {
"filter": {
"bool": {
"must": {
"range": {
"${daycolumn}": {
"from": null,
"to": ${sevendayago},
"include_lower": true,
"include_upper": true
}
}
}
}
}
}
}
}"
echo "ok"
Note on the script arguments: 1. index name; 2. date field name; 3. number of days of data to keep; 4. date format, optional (defaults to the form 20160101).
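A version caveat: the /_query endpoint used above is the delete-by-query feature of Elasticsearch 1.x; on 2.x it was moved into a plugin (bin/plugin install delete-by-query), and from 5.x onward the built-in _delete_by_query API replaces it. A rough 5.x+ equivalent, reusing the example index, field, and host from the script above:

# ES 5.x+: delete documents whose date field is on or before the cutoff
# (logstash-kettle-log, logsdate and the host are the script's own examples)
curl -XPOST "10.130.3.102:9200/logstash-kettle-log/_delete_by_query?pretty" \
     -H 'Content-Type: application/json' -d '
{
  "query": {
    "range": {
      "logsdate": { "lte": "20160101" }
    }
  }
}'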
#!/bin/bash
# hexm@2016.10.18
# keep only the last week of es logs
logName=(
51-nginxaccesslog
51-nginxerrorlog
51-phperrorlog
)
# es config file
config=/usr/local/app/elasticsearch-2.3.4/config/elasticsearch.yml
# date suffix of the index from 7 days ago
time=`date -d "7 day ago" +-%Y.%m.%d`
ip=`grep "network.host" ${config} | awk '{print$2}'`
port=`grep "http.port" ${config} | awk '{print$2}'`
# ip and port es listens on
ipPort=${ip}:${port}
# loop over the log names and delete
for ((i=0;i<${#logName[*]};i++))
do
name=${logName[$i]}${time}
curl -XDELETE "{ipPort}/${name}" done
#!/usr/bin/python
# -*- coding:utf-8 -*-
# hexm@2016.10.18
# keep only the last week of es logs
import commands
from datetime import datetime, timedelta
config = "/usr/local/app/elasticsearch-2.3.4/config/elasticsearch.yml"
logName = ('51-nginxaccesslog', '51-nginxerrorlog', '51-phperrorlog')
ip = commands.getoutput(""" grep "network.host" %s | awk '{print$2}' """ % config)
port = commands.getoutput(""" grep "http.port" %s | awk '{print$2}' """ % config)
tm = datetime.now() + timedelta(days=-7)
tm = tm.strftime("%Y.%m.%d")
for name in logName:
    url = "http://" + str(ip) + ":" + str(port) + "/" + name + "-" + tm
    print url
    # issue the delete for the expired index
    print commands.getoutput('curl -s -XDELETE "%s"' % url)
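Note that the commands module this script relies on is Python 2 only (Python 3 replaces it with subprocess.getoutput). After a run you can confirm the old indices are gone; a quick check, assuming your es host is localhost:9200:

# should print nothing once the index from 7 days ago has been deleted
curl -s 'localhost:9200/_cat/indices' | grep "`date -d '7 day ago' +%Y.%m.%d`"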
#!/bin/bash
# author: Wang XiaoQiang
# crontab -e
# 0 0 * * * /root/script/del_esindex.sh
# auto delete 7 day ago elasticsearch index
dtime=`date -d "7 day ago" +%Y-%m-%d`
dtime_stamp=`date -d "$dtime" +%s`
indexs=`curl -s 'localhost:9200/_cat/indices' | awk '$3~/^logstash/{print $3}'`
for line in $indexs;do
    index=$line
    # index names look like logstash-<type>-YYYY.MM.DD; pull out the date part
    itime=`echo $line | awk -F - '{print $3}' | tr '.' '-'`
    itime_stamp=`date -d "$itime" +%s`
    if [ $itime_stamp -lt $dtime_stamp ];then
        curl -X DELETE "localhost:9200/$index" > /dev/null 2>&1
    fi
done
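If you want to see what the script would remove before letting cron loose on it, a dry-run variant of the same logic (echo instead of DELETE, host assumed to be localhost:9200) looks like this:

#!/bin/bash
# dry run: print the logstash indices older than 7 days instead of deleting them
dtime_stamp=`date -d "7 day ago" +%s`
for index in `curl -s 'localhost:9200/_cat/indices' | awk '$3~/^logstash/{print $3}'`;do
    itime=`echo $index | awk -F - '{print $3}' | tr '.' '-'`
    itime_stamp=`date -d "$itime" +%s`
    if [ $itime_stamp -lt $dtime_stamp ];then
        echo "would delete: $index"
    fi
done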