Custom log4j appender in spark executor
I'm trying to use a custom log4j appender inside the Spark executors in order to forward all logs to Apache Kafka.
The problem is that log4j is initialized before the classloader with my fat jar containing the appender is registered, so I end up with the following:
log4j:ERROR Could not instantiate class [kafka.producer.KafkaLog4jAppender].
java.lang.ClassNotFoundException: kafka.producer.KafkaLog4jAppender
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:260)
at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
at org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
at org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
at org.apache.spark.Logging$class.initializeLogging(Logging.scala:122)
at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:107)
at org.apache.spark.Logging$class.log(Logging.scala:51)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.log(CoarseGrainedExecutorBackend.scala:126)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:137)
at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:235)
at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
log4j:ERROR Could not instantiate appender named "KAFKA".
2015-09-29 13:10:43 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO akka.event.slf4j.Slf4jLogger: Slf4jLogger started
2015-09-29 13:10:43 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO Remoting: Starting remoting
2015-09-29 13:10:43 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@gin3.dev:36918]
2015-09-29 13:10:43 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO Remoting: Remoting now listens on addresses: [akka.tcp://driverPropsFetcher@gin3.dev:36918]
2015-09-29 13:10:44 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
2015-09-29 13:10:44 [driverPropsFetcher-akka.actor.default-dispatcher-4] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
2015-09-29 13:10:44 [sparkExecutor-akka.actor.default-dispatcher-3] INFO akka.event.slf4j.Slf4jLogger: Slf4jLogger started
2015-09-29 13:10:44 [sparkExecutor-akka.actor.default-dispatcher-2] INFO Remoting: Starting remoting
2015-09-29 13:10:44 [sparkExecutor-akka.actor.default-dispatcher-2] INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkExecutor@gin3.dev:40067]
2015-09-29 13:10:44 [sparkExecutor-akka.actor.default-dispatcher-2] INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkExecutor@gin3.dev:40067]
2015-09-29 13:10:44 [driverPropsFetcher-akka.actor.default-dispatcher-5] INFO Remoting: Remoting shut down
....
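For completeness, the KAFKA appender section of my log4j.properties looks roughly like the sketch below. The broker list and topic are illustrative placeholders, not my real values, and the property names are those of the Kafka 0.8.x appender, so verify them against the version you deploy:
# root logger writes to console and to the Kafka appender
log4j.rootLogger=INFO, stdout, KAFKA
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} [%t] %p %c: %m%n
# Kafka 0.8.x appender class; broker and topic below are placeholders
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.brokerList=broker1:9092
log4j.appender.KAFKA.topic=spark-logs
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} [%t] %p %c: %m%n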
Is there any easy way to work around this? We're currently using Spark 1.3.x.
Thanks,
David
kafka.producer.KafkaLog4jAppender is in Kafka's hadoop-producer, so you can add that dependency to fix it:
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>hadoop-producer</artifactId>
    <version>0.8.0</version>
</dependency>
I ended up submitting an extra jar with the logging dependencies and loading it ahead of the user classpath:
LOG_JAR="${THISDIR}/../lib/logging.jar"
spark-submit ...... \
--files "${LOG4J_CONF},${LOG_JAR}" \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=`basename ${LOG4J_CONF}`" \
--conf "spark.driver.extraClassPath=`basename ${LOG_JAR}`" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=`basename ${LOG4J_CONF}`" \
--conf "spark.executor.extraClassPath=`basename ${LOG_JAR}`" \
...
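For reference, on YARN the --files option ships both the log4j config and the logging jar into each container's working directory, which is why the basename paths work for the executor-side settings. LOG4J_CONF is assumed to be set earlier in the script to the local path of the custom log4j.properties.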
Faced the same problem, so I'll post what worked for me. It turns out the KafkaLog4jAppender class package name changed in Kafka 0.9. Here's what I did: added the following dependency in the pom
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-log4j-appender</artifactId>
    <version>0.9.0.0</version>
</dependency>
and changed my log4j.properties from
log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
to
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
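For completeness, the surrounding appender configuration then looks roughly like this sketch. The broker and topic values are placeholders, and the property names are those of the 0.9 appender, so double-check them against your version:
log4j.rootLogger=INFO, KAFKA
# Kafka 0.9 appender class from the kafka-log4j-appender artifact
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.brokerList=broker1:9092
log4j.appender.KAFKA.topic=spark-logs
# send asynchronously so logging doesn't block on the broker
log4j.appender.KAFKA.syncSend=false
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%d [%t] %p %c - %m%n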