Category: HADOOP

2015-11-03 10:36:51


Write a Logstash-to-Kafka interface; the configuration is shown below.
[root@rac01 ~]# cat kafka.conf 
input {
    stdin {
        add_field => {"key" => "value"}
        codec => "plain"
        tags => ["add"]
        type => "std"
    }
}
output {
    stdout { codec => rubydebug }
    kafka {
        broker_list => "192.168.56.104:9092"
        topic_id => "test"
        compression_codec => "snappy"
    }
}
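
If the test topic does not exist yet, it can be created first from the Kafka bin directory. This is just a sketch assuming the Kafka 0.8-style scripts that ship with the broker and a local ZooKeeper; adjust the addresses, partition count, and replication factor for your environment.
[root@rac01 bin]# ./kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test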

Start Logstash and type a line of input:
[root@rac01 ~]# /usr/local/logstash-1.5.0/bin/logstash agent -f kafka.conf 
Logstash startup completed
this is kafka test
{
       "message" => "this is kafka test",
      "@version" => "1",
    "@timestamp" => "2015-11-03T02:35:18.006Z",
          "type" => "std",
          "tags" => [
        [0] "add"
    ],
           "key" => "value",
          "host" => "rac01"
}

Check the Kafka consumer side:
[root@rac01 bin]# ./kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning
{"message":"this is kafka test","@version":"1","@timestamp":"2015-11-03T02:35:18.006Z","type":"std","tags":["add"],"key":"value","host":"rac01"}

Of course, Logstash can also read data from a file; its conf file is:
[root@rac01 ~]# cat logstash-kafka.conf 
input {
    file {
        type => "syslog"
        path => ["/home/a.log"]
        start_position => "beginning"
    }
}
output {
    stdout { codec => rubydebug }
    kafka {
        broker_list => "192.168.56.104:9092"
        topic_id => "test"
        compression_codec => "snappy"
    }
}
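To try the file input, start Logstash with this conf and append a test line to /home/a.log from another terminal. This is a minimal sketch; the test message is a placeholder, and note that start_position => "beginning" only takes effect the first time Logstash sees the file, before a sincedb entry exists.
[root@rac01 ~]# /usr/local/logstash-1.5.0/bin/logstash agent -f logstash-kafka.conf
[root@rac01 ~]# echo "this is a file-to-kafka test" >> /home/a.log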