omhdfs: Hadoop Filesystem Output Module

Module Name: omhdfs

Author: Rainer Gerhards <rgerhards@adiscon.com>
Purpose
This module supports writing messages to files on Hadoop's HDFS file system.
Configuration Parameters
Note
Parameter names are case-insensitive.
Directives

| type | default | mandatory | obsolete legacy directive |
|---|---|---|---|
| word | none | no | $OMHDFSFileName |

The name of the file to which the output data shall be written.
| type | default | mandatory | obsolete legacy directive |
|---|---|---|---|
| word | default | no | $OMHDFSHost |

Name or IP address of the HDFS host to connect to.
| type | default | mandatory | obsolete legacy directive |
|---|---|---|---|
| integer | 0 | no | $OMHDFSPort |

Port on which to connect to the HDFS host.
| type | default | mandatory | obsolete legacy directive |
|---|---|---|---|
| word | RSYSLOG_FileFormat | no | $OMHDFSDefaultTemplate |

Default template to be used when none is specified. This saves the work of specifying the same template over and over again. Of course, the default template can be overridden via the usual method.
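The following is a minimal sketch of how these directives fit together in a legacy configuration. The host name, port, and file path are hypothetical placeholders for illustration only, not defaults shipped with the module:

```
# Minimal sketch: host, port and file path below are hypothetical
# placeholders and must be adapted to the local HDFS setup.
$ModLoad omhdfs
$OMHDFSHost namenode.example.com
$OMHDFSPort 9000
$OMHDFSDefaultTemplate RSYSLOG_FileFormat
$OMHDFSFileName /rsyslog/logfile
# write all messages via the omhdfs action
*.* :omhdfs:
```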
Caveats/Known Bugs
Building omhdfs is a challenge because we have not yet found out how to integrate Java properly into the autotools build process. The issue is that HDFS is written in Java and libhdfs uses JNI to talk to it. That requires that various system-specific environment options and paths be set correctly. At this point, we leave this to the user. If someone knows how to do it better, please drop us a line!
- In order to build, you need to set these environment variables BEFORE running ./configure (a combined sketch follows this list):
  - JAVA_INCLUDES - must have all include paths that are needed to build JNI C programs, including the -I options necessary for gcc. An example is `export JAVA_INCLUDES="-I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include/linux"`
  - JAVA_LIBS - must have all library paths that are needed to build JNI C programs, including the -l/-L options necessary for gcc. An example is `export JAVA_LIBS="-L/usr/java/jdk1.6.0_21/jre/lib/amd64 -L/usr/java/jdk1.6.0_21/jre/lib/amd64/server -ljava -ljvm -lverify"`
- Due to the HDFS architecture, you must make sure that all relevant environment variables (the usual Java stuff and HADOOP's home directory) are properly set.
- As it looks, libhdfs makes Java throw exceptions to stdout. There is no known work-around for this (and it usually should not cause any trouble).
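As a rough illustration of the steps above, the build environment might be prepared like this before running ./configure. All paths and the HADOOP_HOME variable are assumptions taken from the examples above and must be adapted to the local JDK and Hadoop installation; check `./configure --help` for the exact switch that enables omhdfs on your rsyslog version:

```
# Sketch only: the JDK and Hadoop paths below are assumptions and will differ per system.
export JAVA_INCLUDES="-I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include -I/usr/lib/jvm/java-1.6.0-openjdk-1.6.0.0.x86_64/include/linux"
export JAVA_LIBS="-L/usr/java/jdk1.6.0_21/jre/lib/amd64 -L/usr/java/jdk1.6.0_21/jre/lib/amd64/server -ljava -ljvm -lverify"
# Hadoop's home directory must also be visible to the build (variable name and path assumed)
export HADOOP_HOME=/usr/lib/hadoop
./configure --enable-omhdfs
```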
Examples

Example 1
```
$ModLoad omhdfs
$OMHDFSFileName /var/log/logfile
*.* :omhdfs:
```