Can't override Typesafe configuration on the command line in Spark
I have a Typesafe Config file application.conf in the src/main/resources folder, which is loaded by default.
A single value can be overridden by specifying:
--conf spark.driver.extraJavaOptions=-DsomeValue="foo"
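For context, this is roughly how such a value is read on the driver; a minimal sketch, assuming a key named someValue exists in application.conf (the object name is illustrative):

import com.typesafe.config.{Config, ConfigFactory}

object ConfigDemo {
  def main(args: Array[String]): Unit = {
    // ConfigFactory.load() layers JVM system properties on top of
    // application.conf from the classpath, which is why a -DsomeValue
    // passed via spark.driver.extraJavaOptions wins over the file.
    val config: Config = ConfigFactory.load()
    println(config.getString("someValue"))
  }
}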
However, specifying a completely new file, i.e. one that replaces application.conf, such as:
spark-submit \
--class my.Class \
--master "local[2]" \
--files foo.conf \
--conf spark.driver.extraClassPath="-Dconfig.file=file:foo.conf" \
--conf spark.driver.extraJavaOptions=-Dvalue="abcd" \
job.jar
will not load foo.conf. Instead, the original file from the resources folder is loaded.
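A likely cause, consistent with the fix further below, is that spark.driver.extraClassPath only adds classpath entries; a -D flag placed there is never passed to the driver JVM as a system property. One way to verify which file was actually loaded is to print the config's origin by adding this inside the job (a debugging sketch; import com.typesafe.config.ConfigFactory at the top):

// Prints where the root of the loaded config came from, e.g. a path
// inside job.jar, confirming that the bundled application.conf won out.
println(ConfigFactory.load().root().origin().description())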
Trying the following tricks did not help either.
Edit
Overriding multiple config values in Typesafe config when using an uberjar to deploy seems to be the answer for plain (non-Spark) programs. The question remains how to bring it to Spark.
Also, passing:
--conf spark.driver.extraClassPath="-Dconfig.resource=file:foo.conf"
--conf spark.driver.extraClassPath="-Dconfig.resource=foo.conf"
fails to load my configuration from the command line.
However, according to the documentation:
https://github.com/lightbend/config
For applications using
application.{conf,json,properties}, system properties can be used to
force a different config source (e.g. from command line
-Dconfig.file=path/to/config-file):
- config.resource specifies a resource name - not a basename, i.e. application.conf not application
- config.file specifies a filesystem path, again it should include the extension, not be a basename
- config.url specifies a URL
These system properties specify a replacement for
application.{conf,json,properties}, not an addition. They only affect
apps using the default ConfigFactory.load() configuration. In the
replacement config file, you can use include "application" to include
the original default config file; after the include statement you
could go on to override certain settings.
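To illustrate the replacement-plus-include pattern described above, foo.conf could look like this (the contents are hypothetical):

# foo.conf replaces application.conf entirely when passed via
# -Dconfig.file. The include pulls the bundled defaults back in;
# everything after it overrides individual keys.
include "application"

value = "overridden-from-foo"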
So these parameters should work:
spark-submit \
--class my.Class \
--master "local[2]" \
--files foo.conf \
--conf spark.driver.extraJavaOptions="-Dvalue='abcd' -Dconfig.file=foo.conf" \
target/scala-2.11/jar-0.1-SNAPSHOT.jar
Changing from spark.driver.extraClassPath to spark.driver.extraJavaOptions did the trick.
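As a sanity check, the job itself can confirm both overrides took effect; a minimal sketch, assuming the hypothetical foo.conf above (import com.typesafe.config.ConfigFactory at the top):

val config = ConfigFactory.load()
// "value" comes from -Dvalue='abcd', since system properties take
// precedence over files; a key defined only in foo.conf would prove
// that -Dconfig.file took effect as well.
println(config.getString("value"))
println(config.root().origin().description())

Note that these are driver-side settings; if the executors also call ConfigFactory.load(), the same flags would presumably have to be passed via spark.executor.extraJavaOptions, with --files shipping foo.conf into the executor working directories.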