Configuring LZO for Spark on CDH4
The Hadoop version here is CDH 4.2.1 with Spark 1.1; both the Hadoop native libraries and the LZO native libraries need to be configured.

Add the following to spark-env.sh:

HADOOP_CONF_DIR=/etc/hadoop/conf
SPARK_SUBMIT_CLASSPATH=$SPARK_SUBMIT_CLASSPATH:/etc/hive/conf:/opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/hadoop-lzo.jar
#SPARK_SUBMIT_LIBRARY_PATH=$SPARK_SUBMIT_LIBRARY_PATH:/opt/cloudera/parcels/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/native
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/cloudera/parcels/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/native
I originally added SPARK_SUBMIT_LIBRARY_PATH and found that it does end up in java.library.path, but it replaces the default value instead of extending it. So I commented it out and appended the native-library directories to LD_LIBRARY_PATH instead.
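Before launching Spark, it is worth confirming that the native library is actually present in the directories LD_LIBRARY_PATH points at; the hadoop-lzo native library is named libgplcompression.so, and the directories checked below are the parcel paths assumed above (adjust them if your parcels live elsewhere). A small sketch:

```shell
# check_native_dir DIR - report whether the hadoop-lzo native library
# (libgplcompression.so) is present in DIR.
check_native_dir() {
  if [ -e "$1/libgplcompression.so" ]; then
    echo "ok: $1"
  else
    echo "missing libgplcompression.so in $1"
  fi
}

# Parcel paths assumed from the spark-env.sh fragment above.
check_native_dir /opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/native
check_native_dir /opt/cloudera/parcels/CDH/lib/hadoop/lib/native
```

Newer Hadoop releases also ship a `hadoop checknative` subcommand that reports whether lzo is loadable, but CDH4's Hadoop may not include it, so the file check above is the safer test.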
Configuration in spark-defaults.conf:
spark.ui.port 8810
spark.executor.extraLibraryPath /opt/cloudera/parcels/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/native
You may also need to set spark.executor.extraClassPath, but I am only testing on a single machine for now and it ran fine without it.
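For a real multi-node cluster, the executor-side settings might look like the sketch below; note that the spark.executor.extraClassPath line is my assumption (reusing the hadoop-lzo.jar parcel path from spark-env.sh above), not something I have verified:

```
spark.ui.port                    8810
spark.executor.extraLibraryPath  /opt/cloudera/parcels/CDH/lib/hadoop/lib/native:/opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/native
# assumption: executors may also need the LZO codec classes on their classpath
spark.executor.extraClassPath    /opt/cloudera/parcels/HADOOP_LZO/lib/hadoop/lib/hadoop-lzo.jar
```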
Launch spark-sql, then open the monitoring page at http://xxxx:8810/environment/ and check that the settings above show up correctly.
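Beyond eyeballing the environment page, a quick functional check is to read an LZO-compressed file directly. The HDFS path below is made up, so substitute one of your own; assuming the LZO codec is registered in core-site.xml, a sketch in spark-shell:

```shell
# hypothetical smoke test; hdfs:///tmp/sample.log.lzo is a made-up path
spark-shell <<'EOF'
sc.textFile("hdfs:///tmp/sample.log.lzo").count()
EOF
```

If the native libraries are wired up correctly the count succeeds; if not, you typically see a "native-lzo library not available" style error on the executors.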
References:
http://hsiamin.com/posts/2014/05/03/enable-lzo-compression-on-hadoop-pig-and-spark/ (The setup described there did not work for me; the environment variable names may have changed.)
http://lotso.blog.51cto.com/3681673/1441737
https://spark.apache.org/docs/latest/configuration.html