I am running my Hadoop job and it is failing with a ClassNotFoundException. There are 4 Java files in total:

logProcessor.java logMapper.java logReducer.java logParser.java

Everything is in a "com" folder on Unix, and "package com;" is the first line of every class, so running head -5 *.java shows package com; in all 4 files. logProcessor is the driver class.

ls -ltr com/
logProcessor.java logMapper.java logReducer.java logParser.java

I compiled the Java files, made a jar out of them, and ran:

hadoop jar /var/lib/hadoop-hdfs/xxxx/jarFiles/LogParser.jar com.LogProcessor /user/hdfs/flume/2015-03-30/03 /user/xxxx/output_xxx
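For reference, the compile-and-package step (not shown above) would typically look something like the following; `hadoop classpath` supplies the Hadoop jars for compilation, and the jar must be built from the directory that contains the com folder so the package structure is preserved:

```shell
# Compile all four sources against the Hadoop client jars;
# the .class files land next to the sources inside com/.
javac -cp "$(hadoop classpath)" com/*.java

# Package from the directory *containing* com/ so the jar
# entries read com/logProcessor.class etc., matching "package com;".
jar cf LogParser.jar com/
```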
It gives me the error below:
Exception in thread "main" java.lang.ClassNotFoundException: com.RFCLogProcessor
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:201)

Answer1:
First, the directory containing your jar needs to be on the classpath; you can export the directory path of your jar file in your .bashrc file.
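One way to follow that advice is a line like the following in ~/.bashrc (the directory path is taken from the question's command; HADOOP_CLASSPATH is the environment variable the hadoop launcher script reads for extra classpath entries):

```shell
# Appended to ~/.bashrc: make jars in this directory visible
# to the hadoop command via its auxiliary classpath variable.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:/var/lib/hadoop-hdfs/xxxx/jarFiles/*"
```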
Then you can run the command below:
hadoop jar directorypath/yourjarname.jar packagename.mainclassname outputpath

Answer2:
Everything I was doing was right, except that the class files that got created didn't have execute permission on them. So I ran

chmod -R 777 com/

then jar'd it again, ran it with the same command as above, and it executed.
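Whether the cause is permissions or packaging, a quick way to confirm the driver class actually ended up in the jar under the expected package path is to list the jar's contents (jar name taken from the question; case-insensitive grep since the question mixes logProcessor and LogProcessor):

```shell
# The driver must appear as com/LogProcessor.class (or com/logProcessor.class);
# a bare LogProcessor.class at the jar root would explain the
# ClassNotFoundException for com.LogProcessor.
jar tf LogParser.jar | grep -i 'logProcessor'
```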