When you execute a job and it is unable to read or write data using HDFS, check the following on the machine where the Data Services Job Server configured for Hadoop is installed:

Ensure that the 64-bit libhdfs.so library is included in the LD_LIBRARY_PATH environment variable. If it is not, add the directory containing this library to the environment variable and restart the Data Services Job Server.
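As a quick sanity check, the following sketch scans each directory on LD_LIBRARY_PATH for libhdfs.so (find_libhdfs is an illustrative helper, not part of Data Services):

```shell
#!/bin/sh
# Scan a colon-separated directory list for libhdfs.so.
# find_libhdfs is an illustrative helper, not a Data Services tool.
find_libhdfs() {
  old_ifs="$IFS"; IFS=':'
  for d in $1; do
    if [ -f "$d/libhdfs.so" ]; then
      IFS="$old_ifs"; echo "$d"; return 0
    fi
  done
  IFS="$old_ifs"; return 1
}

if dir=$(find_libhdfs "${LD_LIBRARY_PATH:-}"); then
  echo "libhdfs.so found in $dir"
else
  echo "libhdfs.so not on LD_LIBRARY_PATH; add its directory and restart the Job Server"
fi
```

If the library is not found, export the directory that contains it and restart the Job Server so the new value takes effect.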

If you are using a Cloudera 3 or Intel Hadoop distribution installed on the machine, ensure that your $LINK_DIR/hadoop/bin/hadoop_env.sh script contains the following elements:

  • LD_LIBRARY_PATH=/usr/lib64:$HADOOP_HOME/c++/Linux-amd64-64/lib:$LINK_DIR/ext/jre/lib/amd64/server:$LD_LIBRARY_PATH
  • classes=`ls $HADOOP_HOME/lib/guava*.jar $HADOOP_HOME/lib/commons*.jar $HADOOP_HOME/hadoop-core-*.jar`

If you are using a Cloudera 4 distribution installed on the machine, ensure that your $LINK_DIR/hadoop/bin/hadoop_env.sh script contains the following element:

  • LD_LIBRARY_PATH=/usr/lib64:$HADOOP_HOME/c++/Linux-amd64-64/lib:$LINK_DIR/ext/jre/lib/amd64/server:$LD_LIBRARY_PATH
  • classes=`ls $HADOOP_HOME/client-0.20/*.jar`

If you are using a Hortonworks 1.2 distribution installed on the machine, ensure that your $LINK_DIR/hadoop/bin/hadoop_env.sh script contains the following element:

  • classes=`ls $HADOOP_HOME/lib/commons*.jar $HADOOP_HOME/hadoop-core-*.jar $HADOOP_HOME/lib/log4j-*.jar`
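Each classes line above depends on its shell globs matching real jar files under $HADOOP_HOME. The sketch below verifies this before you source the script (check_globs is an illustrative helper; the example patterns are taken from the sections above):

```shell
#!/bin/sh
# Report any jar glob that matches no files. check_globs is an
# illustrative helper; pass it the patterns from your classes line.
check_globs() {
  missing=0
  for pattern in "$@"; do
    set -- $pattern            # let the shell expand the glob
    [ -e "$1" ] || { echo "no match for: $pattern"; missing=1; }
  done
  return $missing
}

# Example (Hortonworks 1.2 patterns; requires HADOOP_HOME to be set):
# check_globs "$HADOOP_HOME/lib/commons*.jar" \
#   "$HADOOP_HOME/hadoop-core-*.jar" "$HADOOP_HOME/lib/log4j-*.jar"
```

A pattern with no match usually means HADOOP_HOME points at the wrong directory or the distribution lays out its jars differently.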

If any of these elements are missing, add them to the script, execute the command source ./hadoop_env.sh -e from the $LINK_DIR/hadoop/bin directory, and restart the Data Services Job Server.
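The edit-and-reload step can be sketched as follows; append_once is a hypothetical helper that appends a line only if a marker string is absent, and the appended line in the usage comment is the Cloudera 4 example from above:

```shell
#!/bin/sh
# append_once: add a line to a file only if a marker string is absent,
# so re-running the fix does not duplicate entries.
# This is an illustrative helper, not part of Data Services.
append_once() {
  file="$1"; marker="$2"; line="$3"
  grep -q "$marker" "$file" 2>/dev/null || printf '%s\n' "$line" >> "$file"
}

# Example (Cloudera 4 classes line; run on the Job Server machine):
# append_once "$LINK_DIR/hadoop/bin/hadoop_env.sh" 'client-0.20' \
#   'classes=`ls $HADOOP_HOME/client-0.20/*.jar`'
# cd "$LINK_DIR/hadoop/bin" && . ./hadoop_env.sh -e
# ...then restart the Data Services Job Server.
```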