Collecting the Contents of a Folder into HDFS (spooldir-memory-hdfs)

Published 2022-11-26 08:47
  # Name the components on this agent
  a1.sources = r1
  a1.sinks = k1
  a1.channels = c1

  # Describe/configure the source
  a1.sources.r1.type = spooldir
  a1.sources.r1.spoolDir = /bigdata/testdata/spool_dir

  # Describe the sink
  a1.sinks.k1.type = hdfs
  a1.sinks.k1.hdfs.path = hdfs://localhost:8020/flume/spooling/
  a1.sinks.k1.hdfs.fileType = CompressedStream
  a1.sinks.k1.hdfs.codeC = org.apache.hadoop.io.compress.GzipCodec
  a1.sinks.k1.hdfs.filePrefix = events-
  a1.sinks.k1.hdfs.rollSize = 0
  a1.sinks.k1.hdfs.rollCount = 1000000
  a1.sinks.k1.hdfs.rollInterval = 30

  # Use a channel which buffers events in memory
  a1.channels.c1.type = memory
  a1.channels.c1.capacity = 1000
  a1.channels.c1.transactionCapacity = 100

  # Bind the source and sink to the channel
  a1.sources.r1.channels = c1
  a1.sinks.k1.channel = c1
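To exercise the configuration, create the spool directory, drop a finished file into it, and start the agent. Note that the spooldir source requires files to be immutable once they land in the directory, so write to a hidden temp name first and then rename. The demo paths below (`/tmp/spool_dir_demo`, `spooldir-memory-hdfs.conf`) are stand-ins for illustration; substitute your own `spoolDir` and config file name.

```shell
# Create a spool directory to watch (stand-in for /bigdata/testdata/spool_dir).
mkdir -p /tmp/spool_dir_demo

# The spooldir source expects files to be complete and immutable when they
# appear: write under a hidden temp name, then atomically rename into place.
echo "hello flume" > /tmp/spool_dir_demo/.events.log.tmp
mv /tmp/spool_dir_demo/.events.log.tmp /tmp/spool_dir_demo/events.log

# Start the agent (assumes flume-ng is on PATH and the config above is
# saved as spooldir-memory-hdfs.conf; agent name must match "a1"):
# flume-ng agent --conf conf --conf-file spooldir-memory-hdfs.conf \
#     --name a1 -Dflume.root.logger=INFO,console
```

After the agent ingests a file, it renames it with a `.COMPLETED` suffix in the spool directory, and the gzip-compressed output appears under `hdfs://localhost:8020/flume/spooling/` with the `events-` prefix, rolling every 30 seconds or 1,000,000 events per the sink settings above.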
