I am trying to stream and retrieve Twitter data using Flume but am unable to do so because of some sort of error. When I try executing it using the command:

    flume-ng agent -n TwitterAgent -c conf -f /home/hadoop/Flume/conf/nf

I get the following:

    Info: Including Hadoop libraries found via (/home/hadoop/hadoop-2.10.1/bin/hadoop) for HDFS access
    Info: Including HBASE libraries found via (/home/hadoop/hbase-2.2.5/bin/hbase) for HBASE access
    Info: Including Hive libraries found via (/home/hadoop/apache-hive-2.3.7-bin) for Hive access

I hope I am clear with the question. Please help me sort it out.

Where Documents/flume is a folder in /home/cloudera/; this folder contains the following file. And I'm getting the following error:

    log4j:ERROR setFile(null,true) call failed.
    java.io.FileNotFoundException: /var/log/flume-ng/flume.log (Permission denied)
        at java.io.FileOutputStream.openAppend(Native Method)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:192)
        at java.io.FileOutputStream.<init>(FileOutputStream.java:116)
        at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
        at org.apache.log4j.RollingFileAppender.setFile(RollingFileAppender.java:207)
        at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
        at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
        at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
        at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
        at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
        at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
        at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
        at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
        at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
        at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:73)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:242)
        at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:254)
        at org.apache.flume.node.Application.<clinit>(Application.java:58)
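The FileNotFoundException above is a plain permission problem: the user running flume-ng cannot append to /var/log/flume-ng/flume.log, which is where Flume's bundled log4j.properties points by default. One common workaround, sketched below, is to redirect logging to a directory the current user owns via the flume.log.dir system property (referenced as ${flume.log.dir} in the default log4j.properties); the directory name here is only an example.

```shell
# Workaround sketch: log to a user-owned directory instead of the
# root-owned /var/log/flume-ng.
LOGDIR="$HOME/flume-logs"      # hypothetical location; any writable dir works
mkdir -p "$LOGDIR"
touch "$LOGDIR/flume.log"      # confirm the file can be opened for append

# Then start the agent, overriding the log directory:
#   flume-ng agent -n TwitterAgent -c conf -f /home/hadoop/Flume/conf/nf \
#       -Dflume.log.dir="$LOGDIR"
```

Alternatively, keep the default location but grant ownership of /var/log/flume-ng to the user that launches the agent (e.g. with chown), which is what packaged installs normally do for the flume service user.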
I am trying to read a log file from /home/cloudera/Documents/flume/ and write it to HDFS using Apache Flume. I used the following commands to create the flumeLogTest folder in HDFS and change its owner:

    sudo -u hdfs hadoop fs -mkdir flumeLogTest
    sudo -u hdfs hadoop fs -chown flume:flume flumeLogTest

My configuration contains these settings (the property prefixes did not survive the paste):

    … = tail -f Documents/flume/vmware-0.log
    ….path = hdfs://localhost.localdomain:8020/user/hdfs/flumeLogTest
    ….fileType = DataStream

And I started the Flume agent with the following command:

    /usr/bin/flume-ng agent -conf Documents/flume -conf-file Documents/flume/nf -name agent
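For reference, a complete agent definition implied by the fragments above would look roughly like this. The agent name `agent` matches the `-name agent` flag in the start command; the source, channel, and sink names (`tail`, `mem`, `hdfsOut`) are placeholders I have assumed, since the real prefixes were lost.

```properties
# Minimal exec-source -> memory-channel -> HDFS-sink sketch (names assumed)
agent.sources = tail
agent.channels = mem
agent.sinks = hdfsOut

agent.sources.tail.type = exec
agent.sources.tail.command = tail -F /home/cloudera/Documents/flume/vmware-0.log
agent.sources.tail.channels = mem

agent.channels.mem.type = memory

agent.sinks.hdfsOut.type = hdfs
agent.sinks.hdfsOut.hdfs.path = hdfs://localhost.localdomain:8020/user/hdfs/flumeLogTest
agent.sinks.hdfsOut.hdfs.fileType = DataStream
agent.sinks.hdfsOut.channel = mem
```

Note that `tail -F` (capital F) is usually preferable to `tail -f` for log files, since it survives log rotation.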