Category: Mysql/postgreSQL

2011-08-13 09:56:52

There have been a lot of slow queries recently, so I worked out a way to compile statistics from the slow log automatically.

The generated report looks like this:
---------------Daily Slow Query Report created at 110812 18:16:09-----------------

FROM 8:25:17    To 8:25:57
COUNT   SQL_TEXT
----------------
    356 select * from (select a.colum1 ,c.column2 ,c.column3,c.column5
     45 select a.*,c.column3, c.column4 from table_namea a l
     36 select count(*) from table_namea a left join table_namec

FROM 8:25:57    To 8:26:00
COUNT   SQL_TEXT
----------------
     18 select * from (select a.colum1 ,c.column2 ,c.column3,c.column5
      5 select a.*,c.column3, c.column4 from table_namea a l

FROM 8:26:00    To 8:26:07
COUNT   SQL_TEXT
----------------
     68 select * from (select a.colum1 ,c.column2 ,c.column3,c.column5
     15 select a.*,c.column3, c.column4 from table_namea a l
     10 select count(*) from table_namea a left join table_namec

------------------------------------------------------------------------------------------- 
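The script below assumes the slow query log is already being written to /var/mysqldatadir/<hostname>-slow.log. As a minimal sketch, the log could be enabled at runtime with the stock server variables (MySQL 5.1+; the credentials here are placeholders):

mysql -uroot -p -e "SET GLOBAL slow_query_log      = ON;
                    SET GLOBAL slow_query_log_file = '/var/mysqldatadir/$(hostname -s)-slow.log';
                    SET GLOBAL long_query_time     = 1;"
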
The implementation script:
#!/bin/bash
date_mark=`date +%y%m%d`
date_now=`date "+%y%m%d %H:%M:%S"`
name=`hostname |cut -d'.' -f1`
dirpath="/var/mysqldatadir/"
file_name="$name-slow.log"

# Split the log into time segments, grouping entries that fall within the same minute
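#   The parsing below assumes the standard mysqld slow-log header format, e.g.
#       # Time: 110812 18:16:09
#   cut -c1-21 truncates each header at minute granularity, awk '{print $4}'
#   keeps just the (truncated) timestamp, and uniq collapses all headers from
#   the same minute into a single segment marker.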
sed -n "/Time: $date_mark/,$ p" $dirpath/$file_name > /tmp/$file_name-$date_mark
grep "\# Time: $date_mark" /tmp/$file_name-$date_mark |cut  -c1-21 |awk '{print $4}' |uniq > /tmp/$file_name-$date_mark-split.tmp1
echo "" > /tmp/$file_name-$date_mark-split
for tmp in $(cat /tmp/$file_name-$date_mark-split.tmp1)
do
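 # For each minute marker, keep only the first and the last full timestamp seen
 # in that minute; these two lines become the boundaries of one report segment.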
 grep '\# Time: ' /tmp/$file_name-$date_mark|grep "$tmp" |awk '{print $4}'|xargs |awk '{print $1"\n"$NF}' >> /tmp/$file_name-$date_mark-split
done
sed -i -e  '/^$/d' -e '$d' /tmp/$file_name-$date_mark-split
echo -ne "\n---------------Daily Slow Query Report created at $date_now-----------------\n"

steps_cnt=`wc -l /tmp/$file_name-$date_mark-split|awk '{print $1}'`

# Summarize the queries in each time segment
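# Consecutive lines of the split file serve as segment boundaries (FROM / To).
# Within each segment the non-header lines (the SQL text) are truncated to their
# first 60 characters, counted with uniq -c and listed most-frequent first; the
# last segment runs from its start time to the final "# Time:" header in the log.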
LIMIT="$steps_cnt"
for ((cnt=1; cnt <= LIMIT ; cnt++))
do
start_time=`sed -n "$cnt p" /tmp/$file_name-$date_mark-split`
indirect_cnt=`expr $cnt + 1`
end_time=`sed -n "$indirect_cnt p" /tmp/$file_name-$date_mark-split`
last_time=`grep "\# Time: $date_mark" /tmp/$file_name-$date_mark|tail -1`
if [ "$indirect_cnt" -gt "$steps_cnt" ]
then
  strings=`echo "FROM $start_time \tTo $last_time"|sed -e "s%# Time: %%g"`
  echo -ne "\n$strings\n"
  sed -n "/$start_time/,$ p"  /tmp/$file_name-$date_mark|grep -v "^#" |sort -nr |cut -c1-60 |uniq -c|sort -nr|awk 'BEGIN {print "COUNT   SQL_TEXT\n----------------"}{print $0}'
  rm -f /tmp/$file_name-$date_mark-split \
        /tmp/$file_name-$date_mark-split.tmp1

  sleep 5
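  # /tmp/email_text is the file this script's stdout is redirected to by the
  # daily invocation below; the sleep presumably gives the report a moment to
  # be written out completely before it is mailed.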
  e_address="monitor@mydomain.com"
  mail -s "$name slow query report at $date_now" $e_address < /tmp/email_text
  exit 0
fi
  strings=`echo "FROM $start_time \tTo $end_time"|sed -e "s%# Time: %%g"`
  echo -ne "\n$strings\n"
  sed -n "/$start_time/,/$end_time/ p"  /tmp/$file_name-$date_mark|grep -v "^#" |sort -nr |cut -c1-60 |uniq -c |sort -nr|awk 'BEGIN {print "COUNT   SQL_TEXT\n----------------"}{print $0}'
done

Invoke the script daily:
report_daily_slowlog.sh > /tmp/email_text
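
For the daily run, a crontab entry along these lines would do (the path and schedule are hypothetical; the script mails the report itself, so cron only needs to redirect stdout to /tmp/email_text):

# hypothetical crontab entry: build and mail the report every day at 18:15
15 18 * * * /root/scripts/report_daily_slowlog.sh > /tmp/email_text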