My problem is as follows:
$ bin/hadoop jar /home/hadoop/hadoop/lib/hadoop-lzo-0.4.15.jar com.hadoop.compression.lzo.LzoIndexer README.txt.lzo
12/07/24 17:16:52 INFO lzo.GPLNativeCodeLoader: Loaded native gpl library
12/07/24 17:16:52 INFO lzo.LzoCodec: Successfully loaded & initialized native-lzo library [hadoop-lzo rev 6bb1b7f]
12/07/24 17:16:53 INFO lzo.LzoIndexer: [INDEX] LZO Indexing file README.txt.lzo, size 0.00 GB...
Exception in thread "main" java.lang.IllegalArgumentException: Compression codec
com.hadoop.compression.lzo.LzoCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:96)
at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:134)
at com.hadoop.compression.lzo.LzoIndex.createIndex(LzoIndex.java:209)
at com.hadoop.compression.lzo.LzoIndexer.indexSingleFile(LzoIndexer.java:117)
at com.hadoop.compression.lzo.LzoIndexer.indexInternal(LzoIndexer.java:98)
at com.hadoop.compression.lzo.LzoIndexer.index(LzoIndexer.java:52)
at com.hadoop.compression.lzo.LzoIndexer.main(LzoIndexer.java:137)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException:
com.hadoop.compression.lzo.LzoCodec
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:819)
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:89)
... 11 more
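Reading the stack trace, CompressionCodecFactory.getCodecClasses takes the class names listed in the io.compression.codecs property and loads each one via Configuration.getClassByName, so the ClassNotFoundException seems to mean the codec name is registered in my configuration but the class itself cannot be loaded from the classpath that Configuration sees. To double-check the registration side, a quick look at core-site.xml (the conf path comes from my HADOOP_CONF_DIR below; the expected value in the comment is what the hadoop-lzo README gives):

# Show which codecs core-site.xml registers:
grep -B 1 -A 2 'io.compression' /home/hadoop/hadoop/conf/core-site.xml
# Per the hadoop-lzo README, the io.compression.codecs value should include:
#   com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec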
Hadoop version: hadoop-0.20.203.0
Java version: jdk-6u31-linux-i586.bin
I have installed lzop-1.03 and lzo-2.06, and kevinweil-hadoop-lzo-6bb1b7f.zip built successfully. I copied hadoop-lzo-0.4.15.jar to /home/hadoop/hadoop/lib/lzo (my Hadoop home is /home/hadoop/hadoop) and copied lib/native/Linux-i386-32 to /home/hadoop/hadoop/lib/native/Linux-i386-32 (my system is 32-bit). My /etc/profile is:
export JAVA_HOME=/usr/lib/jvm/java-1.6.0_31-sun
export CHUKWA_HOME=/home/hadoop/chukwa-0.4.0
export PATH=$PATH:$JAVA_HOME/bin
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_CONF_DIR=/home/hadoop/hadoop/conf
export ANT_HOME=/home/hadoop/apache-ant-1.8.4
export PATH=$PATH:$HADOOP_HOME/bin:$ANT_HOME/bin
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$HADOOP_HOME/hadoop-core-0.20.203.0.jar:$CHUKWA_HOME/lib/:$HADOOP_HOME/lib/hadoop-lzo-0.4.15.jar
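One thing I am not sure about: I copied the jar into lib/lzo, but the CLASSPATH entry above points at lib/ directly. A sanity check I can run (the paths are just the ones from my setup):

# Is the codec class really inside the jar?
jar tf /home/hadoop/hadoop/lib/hadoop-lzo-0.4.15.jar | grep LzoCodec
# Does the jar sit at the path CLASSPATH names, or only under lib/lzo/ where I copied it?
ls -l /home/hadoop/hadoop/lib/hadoop-lzo-0.4.15.jar /home/hadoop/hadoop/lib/lzo/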
My hadoop-env.sh is:
# Set Hadoop-specific environment variables here.
# The only required environment variable is JAVA_HOME. All others are
# optional. When running a distributed configuration it is best to
# set JAVA_HOME in this file, so that it is correctly defined on
# remote nodes.
# The java implementation to use. Required.
export JAVA_HOME=/usr/lib/jvm/java-1.6.0_31-sun
export HADOOP_OPTS=-Djava.net.preferIPv4Stack=true
# Extra Java CLASSPATH elements. Optional.
export HADOOP_HOME=/home/hadoop/hadoop
export HADOOP_CLASSPATH=/home/hadoop/hadoop/lib/hadoop-lzo-0.4.15.jar
export LD_LIBRARY_PATH=/home/hadoop/hadoop/lib/native
export LIBRARY_PATH=/home/hadoop/hadoop/lib/native/Linux-i386-32/lib:/home/hadoop/hadoop/lib/native:/home/hadoop/hadoop/lib
export JAVA_LIBRARY_PATH=/home/hadoop/hadoop/lib/native:/home/hadoop/hadoop/lib/native/Linux-i386-32/lib
export HADOOP_LIBRARY_PATH=/home/hadoop/hadoop/lib/native/Linux-i386-32/lib:/home/hadoop/hadoop/lib/native:/home/hadoop/hadoop/lib
# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=2000
# Extra Java runtime options. Empty by default.
export HADOOP_OPTS=-server
# Command specific options appended to HADOOP_OPTS when specified
export HADOOP_NAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_NAMENODE_OPTS"
export HADOOP_SECONDARYNAMENODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_SECONDARYNAMENODE_OPTS"
export HADOOP_DATANODE_OPTS="-Dcom.sun.management.jmxremote $HADOOP_DATANODE_OPTS"
export HADOOP_BALANCER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_BALANCER_OPTS"
export HADOOP_JOBTRACKER_OPTS="-Dcom.sun.management.jmxremote $HADOOP_JOBTRACKER_OPTS"
export HADOOP_TASKTRACKER_OPTS=
# The following applies to multiple commands (fs, dfs, fsck, distcp etc)
export HADOOP_CLIENT_OPTS
# Extra ssh options. Empty by default.
export HADOOP_SSH_OPTS="-o ConnectTimeout=1 -o SendEnv=HADOOP_CONF_DIR"
# Where log files are stored. $HADOOP_HOME/logs by default.
export HADOOP_LOG_DIR=${HADOOP_HOME}/logs
# File naming remote slave hosts. $HADOOP_HOME/conf/slaves by default.
export HADOOP_SLAVES=${HADOOP_HOME}/conf/slaves
# host:path where hadoop code should be rsync'd from. Unset by default.
export HADOOP_MASTER=master:/home/$USER/src/hadoop
# Seconds to sleep between slave commands. Unset by default. This
# can be useful in large clusters, where, e.g., slave rsyncs can
# otherwise arrive faster than the master can service them.
export HADOOP_SLAVE_SLEEP=0.1
# The directory where pid files are stored. /tmp by default.
export HADOOP_PID_DIR=/var/hadoop/pids
# A string representing this instance of hadoop. $USER by default.
export HADOOP_IDENT_STRING=$USER
# The scheduling priority for daemon processes. See 'man nice'.
export HADOOP_NICENESS=10
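As one more experiment, I thought of bypassing "hadoop jar" and starting the indexer through bin/hadoop's generic class runner, so that only HADOOP_CLASSPATH has to be right; this is just a sketch with the same paths as above, not something I have verified to isolate the problem:

# Run LzoIndexer as a plain class, forcing the jar onto HADOOP_CLASSPATH
# for this one invocation:
HADOOP_CLASSPATH=/home/hadoop/hadoop/lib/hadoop-lzo-0.4.15.jar \
  bin/hadoop com.hadoop.compression.lzo.LzoIndexer README.txt.lzo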
No matter whether I try it on the cluster or on a single node, the problem always exists. I don't know why; please help me!