Flink 1.12.3 upgrade triggers `NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps`

When I upgraded my Flink Java application from 1.12.2 to 1.12.3, I hit a new runtime error. I can strip my Flink application down to this two-liner:

package simple;

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableEnvOnly {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(streamEnv);
    }
}

This works with Flink 1.12.2 and triggers no errors. When I upgrade the Flink dependencies to 1.12.3, the same simple application throws:

Exception in thread "main" java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps(java.lang.Object[])'
        at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:118)
        at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:47)
        at org.apache.flink.table.planner.delegation.BlinkPlannerFactory.create(BlinkPlannerFactory.java:48)
        at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:143)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:113)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:85)
        at simple.TableEnvOnly.main(TableEnvOnly.java:12)

FYI, I'm not using Scala directly. My Gradle dependencies are:

    implementation("org.apache.flink:flink-table-planner-blink_2.12:1.12.3")
    implementation("org.apache.flink:flink-clients_2.12:1.12.3")
    implementation("org.apache.flink:flink-connector-kafka_2.12:1.12.3")
    implementation("org.apache.flink:flink-connector-jdbc_2.12:1.12.3")

TL;DR: After upgrading to Flink 1.12.4, the problem magically disappeared.
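
For a Gradle setup like the one in the question, that just means bumping the version of the same artifacts:

    implementation("org.apache.flink:flink-table-planner-blink_2.12:1.12.4")
    implementation("org.apache.flink:flink-clients_2.12:1.12.4")
    implementation("org.apache.flink:flink-connector-kafka_2.12:1.12.4")
    implementation("org.apache.flink:flink-connector-jdbc_2.12:1.12.4")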

Details

After upgrading from Flink 1.12.2 to Flink 1.12.3, the following code stopped compiling:

import org.apache.flink.streaming.api.scala._ // provides the implicit TypeInformation factory
import scala.collection.JavaConverters._

// `env` is a Java StreamExecutionEnvironment; its stream is wrapped in the Scala DataStream API
val input = new DataStream[String](env.fromCollection(Seq("a", "b", "c").asJava))
val res = input.map(_.toUpperCase)

The Scala compiler reports the error:

could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[String]

The scala-compiler and scala-library versions are 2.12.7, exactly the ones Flink uses.
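
A quick, Flink-independent sanity check is to print the version of the scala-library that is actually loaded at runtime, since it can differ from the scala-compiler version used for the build:

    // Prints the version of the scala-library jar on the runtime classpath,
    // e.g. "2.12.7"; a different number would point to a resolution conflict.
    println(scala.util.Properties.versionNumberString)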

To get past the compilation problem, we supplied an implicit TypeInformation instance ourselves:

import org.apache.flink.api.common.typeinfo.TypeInformation
implicit val typeInfo: TypeInformation[String] = TypeInformation.of(classOf[String])

With that, the code compiles. Nevertheless, we still hit the runtime failure quoted above:

  java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
  at org.apache.flink.api.scala.ClosureCleaner$.getSerializedLambda(ClosureCleaner.scala:184)
  at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$clean(ClosureCleaner.scala:257)
  at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:168)
  at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:859)
  at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:1189)
  at org.apache.flink.streaming.api.scala.DataStream.map(DataStream.scala:623)

As mentioned, upgrading to Flink 1.12.4 helped: both the compile-time and the runtime failures disappeared.

My guess is that some Flink 1.12.3 jars were accidentally built against the wrong Scala version, and the follow-up 1.12.4 release was built with the correct one.
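
To verify a mismatch like this, one Flink-independent check is to list the refArrayOps overloads that the runtime scala-library actually provides and compare them with the signature named in the NoSuchMethodError:

    // Prints the full signatures (including return types) of Predef.refArrayOps.
    // The failing Flink classes were compiled against the overload returning
    // scala.collection.mutable.ArrayOps, so exactly that variant must be present.
    scala.Predef.getClass.getMethods
      .filter(_.getName == "refArrayOps")
      .foreach(println)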