The client-side Logstash ships logs into Redis on the server; the server-side Logstash then pulls the logs out of Redis and writes them into Elasticsearch.
1: The client configuration is as follows
input {
  file {
    path => "/data/nginx/nginx/logs/nginx_logstash.log"
    type => "nginx_access"
    start_position => "beginning"
    codec => "json"
  }
  file {
    path => "/data/mysql_db/db_inst1/slow.log"
    type => "slow-mysql"    # tag the log type so several logs can be collected in one config file and told apart at output time
    start_position => "beginning"
    codec => multiline {
      # lines that do not start with "# User@Host" are appended to the previous line,
      # so each multi-line slow-query entry becomes a single event
      pattern => "^# User@Host"
      negate => true
      what => "previous"
    }
  }
}
output {
  if [type] == "nginx_access" {
    redis {
      host => "192.168.0.147"
      key => "nginx_access"
      data_type => "list"
      port => "6379"
      # type => "redis_nginx-input"
    }
  }
  if [type] == "slow-mysql" {
    redis {
      host => "192.168.0.147"
      key => "slow-mysql"
      data_type => "list"
      port => "6379"
      # type => "redis_slow-input"
    }
  }
}
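A quick way to confirm that the client side is actually pushing events (a sketch, assuming redis-cli is available and the Redis instance has no password) is to check the length of the two lists on the Redis host:
redis-cli -h 192.168.0.147 -p 6379 LLEN nginx_access
redis-cli -h 192.168.0.147 -p 6379 LLEN slow-mysql
Note that once the server-side Logstash starts consuming, the lists are drained continuously, so the counts will normally stay close to zero.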
The server configuration is as follows:
input {
  redis {
    host => "192.168.0.147"
    port => "6379"
    key => "nginx_access"
    type => "nginx_access"
    data_type => "list"
  }
  redis {
    host => "192.168.0.147"
    port => "6379"
    key => "slow-mysql"
    type => "slow-mysql"
    data_type => "list"
  }
}
output {
  if [type] == "nginx_access" {
    elasticsearch {
      host => ["192.168.0.147:9200","192.168.0.148:9200"]
      protocol => "http"
      index => "nginx-access-%{+YYYY.MM}"
    }
  }
  if [type] == "slow-mysql" {
    elasticsearch {
      host => ["192.168.0.147:9200","192.168.0.148:9200"]
      protocol => "http"
      index => "slow-mysql-%{+YYYY.MM}"
    }
  }
}
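To see what a queued event looks like before it is consumed (again a sketch with redis-cli; stop the server-side Logstash first, otherwise the list is emptied as fast as it is filled), peek at the head of one of the lists:
redis-cli -h 192.168.0.147 -p 6379 LRANGE nginx_access 0 0
Each entry is the JSON-serialized event produced by the client, including the type field that the server-side output uses for routing.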
Then start the corresponding services:
Client: /opt/logstash/bin/logstash -f ./nginx-access_mysql-slow.conf
Server: /opt/logstash/bin/logstash -f ./logstash_redis-ela.conf
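Both configuration files can also be syntax-checked before (re)starting (a sketch; on this Logstash 1.x-style install the flag is --configtest, newer releases use -t / --config.test_and_exit):
/opt/logstash/bin/logstash -f ./nginx-access_mysql-slow.conf --configtest
/opt/logstash/bin/logstash -f ./logstash_redis-ela.conf --configtest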
Then open 192.168.0.147:9200/_plugin/head to inspect the data; the logs have already arrived in Elasticsearch.
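If the head plugin is not installed, the same check can be done with curl against the _cat API (a sketch, assuming the 9200 HTTP port from the output configuration above):
curl 'http://192.168.0.147:9200/_cat/indices?v'
curl 'http://192.168.0.147:9200/nginx-access-*/_search?size=1&pretty'
The first command lists the monthly indices (nginx-access-YYYY.MM and slow-mysql-YYYY.MM); the second returns one sample document so the parsed fields can be inspected.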
Then log in to Kibana and configure an index pattern for each of the log types:
1: [screenshot]
2: [screenshot]
3: Check that the log entries are coming in [screenshot]