Hive error : Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Iterable

I'm running Hive on Spark and hit an error when querying a table in Hive. For example, when I run:

select count(*) from ma_table;

I get:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Iterable
    at org.apache.hadoop.hive.ql.parse.spark.GenSparkProcContext.<init>(GenSparkProcContext.java:163)
    at org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.generateTaskTree(SparkCompiler.java:195)
    at org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:267)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10947)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:246)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:250)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:477)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1242)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1384)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1171)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1161)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:232)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:183)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:399)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:776)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: scala.collection.Iterable
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 23 more

After some research, I tried adding this to my bashrc (and sourcing it ;)):

for f in ${HIVE_LIB}/*.jar; do
    CLASSPATH=${CLASSPATH}:$f
done

for f in ${SPARK_HOME}/jars/*.jar; do
    CLASSPATH=${CLASSPATH}:$f
done

for f in ${SCALA_HOME}/lib/*.jar; do
    CLASSPATH=${CLASSPATH}:$f
done

I checked, and all of these jars are on the CLASSPATH, but I still get the error. I'm using Hive 2.1, Hadoop 2.8, and Spark 2.1. Any ideas? Thanks a lot!
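
As a quick sanity check (plain shell only, using the variables from the snippet above), this is one way to confirm the Scala runtime jar actually shows up on the CLASSPATH:

# Print CLASSPATH entries one per line and keep only the Scala-related jars.
# scala-library*.jar is the one that provides scala.collection.Iterable.
echo "${CLASSPATH}" | tr ':' '\n' | grep -i 'scala.*\.jar'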

Coming back to this: it turned out the Scala jar files were not in Hive's lib directory. If you run into a similar situation, have a look at: https://www.linkedin.com/pulse/hive-spark-configuration-common-issues-mohamed-k
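
A minimal sketch of the fix, assuming the Scala runtime jars ship under ${SPARK_HOME}/jars and that Hive picks up everything placed in ${HIVE_HOME}/lib (the exact set of jars to copy can vary with the Hive/Spark versions):

# Copy the Scala runtime jar (provides scala.collection.Iterable) into Hive's lib
# so it ends up on Hive's classpath; depending on your versions you may also need
# other jars from Spark (e.g. spark-core, spark-network-common).
cp ${SPARK_HOME}/jars/scala-library-*.jar ${HIVE_HOME}/lib/

# Restart the Hive CLI / HiveServer2 afterwards so the new jars get loaded.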