This article explains how to configure and enable LZO compression in Spark. The steps are described in detail; readers who are interested can use it as a reference, and hopefully it will be helpful.
The steps to configure and enable LZO compression in Spark are as follows:
1. Configure spark-env.sh
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_LIBRARY_PATH=$SPARK_LIBRARY_PATH:/app/hadoop-2.6.0-cdh6.7.0/lib/native
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/yarn/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/hdfs/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/mapreduce/lib/*:/app/hadoop-2.6.0-cdh6.7.0/share/hadoop/tools/lib/*:/app/spark-2.2.0-bin-2.6.0-cdh6.7.0/jars/*
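The three exports above put Hadoop's native libraries (which include the native LZO bindings) on the loader path and the Hadoop jars on Spark's classpath. A quick way to check that the codec is picked up is to read an existing LZO-compressed file from spark-shell. The sketch below assumes such a file already exists; hdfs:///tmp/sample.lzo is a placeholder path, not part of the original setup.
// Run in spark-shell. Reading a .lzo file goes through TextInputFormat,
// which asks CompressionCodecFactory to resolve the codec by file extension.
val lines = sc.textFile("hdfs:///tmp/sample.lzo")  // placeholder path
println(lines.count())  // forces the read; fails fast if the codec cannot be loaded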
2. LzopCodec class not found
2.1 Error message:
Caused by: java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzopCodec not found.
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>
at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
Caused by: java.lang.ClassNotFoundException: Class com.hadoop.compression.lzo.LzopCodec not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1980)
at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
2.2 Solution: edit spark-defaults.conf in Spark's conf directory and add the following:
spark.driver.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
spark.executor.extraClassPath /app/hadoop-2.6.0-cdh6.7.0/share/hadoop/common/hadoop-lzo-0.4.19.jar
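These two properties put the hadoop-lzo jar on both the driver and executor classpaths, so com.hadoop.compression.lzo.LzopCodec can be loaded at runtime. As a rough sketch of how the codec is used once the classpath is fixed, the snippet below writes LZO-compressed output from spark-shell; the output path is a placeholder, and the jar version (0.4.19 here) should match whatever actually sits under share/hadoop/common.
// Run in spark-shell after restarting with the new spark-defaults.conf.
import com.hadoop.compression.lzo.LzopCodec

val data = sc.parallelize(Seq("a", "b", "c"))
// saveAsTextFile accepts a Hadoop CompressionCodec class; LzopCodec produces .lzo files.
data.saveAsTextFile("hdfs:///tmp/lzo_out", classOf[LzopCodec])  // placeholder output path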
That covers how to configure and enable LZO compression in Spark. Hopefully the content above is useful and helps you learn something new. If you found the article worthwhile, feel free to share it so more people can see it.