Passing a filename argument into Dataproc Hadoop job

I have a Hadoop job that I'm trying to run on a GCP Dataproc cluster. The job takes one argument, a filename, which is used to configure certain aspects of the job. I can't seem to find a way to do this successfully with the gcloud CLI (although I can run the job successfully from Airflow). I've tried a local file, a file in Google Storage, and a file on the cluster itself.

Here is the working code that reads the file:

Properties appProps = new Properties();
File inFile = new File(args[0]);

try (FileInputStream inFileStream = new FileInputStream(inFile)) {
    appProps.load(inFileStream);
} catch (IOException e) {
    log.error("Unable to read file");
}

Here are my attempts at the run command (where props_test is the file):

# local file
$ gcloud dataproc jobs submit hadoop --project my-project --region global --cluster my-cluster --jar gs://path/to/hadoop_job.jar -- props_test
Job [00653af3afaf40ea8227d5a56e3de458] submitted.
Waiting for job output...
20/01/17 19:29:55 INFO job.JobDriver: props_test
20/01/17 19:29:55 ERROR job.JobDriver: Unable to read file
Exception in thread "main" java.lang.NullPointerException
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)
    at com.google.common.base.Splitter.split(Splitter.java:371)
    at job.JobDriver.parseHadoopConfig(JobDriver.java:52)
    at job.JobDriver.run(JobDriver.java:101)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at job.JobDriver.main(JobDriver.java:45)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
    at com.google.cloud.hadoop.services.agent.job.shim.HadoopRunJarShim.main(HadoopRunJarShim.java:12)
ERROR: (gcloud.dataproc.jobs.submit.hadoop) Job [00653af3afaf40ea8227d5a56e3de458] failed with error:
Job failed with message [Exception in thread "main" java.lang.NullPointerException]. Additional details can be found at '. . .'.

# Google Storage file
$ gcloud dataproc jobs submit hadoop --project my-project --region global --cluster my-cluster --jar gs://path/to/hadoop_job.jar -- gs://path/to/props_test
Job [8ec4a775a0fb4e31b77967d99280fd6c] submitted.
Waiting for job output...
20/01/17 19:30:15 INFO job.JobDriver: gs://path/to/props_test
20/01/17 19:30:15 ERROR job.JobDriver: Unable to read file
Exception in thread "main" java.lang.NullPointerException
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)
    at com.google.common.base.Splitter.split(Splitter.java:371)
    at job.JobDriver.parseHadoopConfig(JobDriver.java:52)
    at job.JobDriver.run(JobDriver.java:101)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at job.JobDriver.main(JobDriver.java:45)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
    at com.google.cloud.hadoop.services.agent.job.shim.HadoopRunJarShim.main(HadoopRunJarShim.java:12)
ERROR: (gcloud.dataproc.jobs.submit.hadoop) Job [8ec4a775a0fb4e31b77967d99280fd6c] failed with error:
Job failed with message [Exception in thread "main" java.lang.NullPointerException]. Additional details can be found at '. . .'.

# File on Dataproc cluster
$ gcloud dataproc jobs submit hadoop --project my-project --region global --cluster my-cluster --jar gs://path/to/hadoop_job.jar -- file:///path/to/props_test
Job [fda035c5159642c294a47385d5ebb85f] submitted.
Waiting for job output...
20/01/17 19:33:23 INFO job.JobDriver: props_test
20/01/17 19:33:23 ERROR job.JobDriver: Unable to read file
Exception in thread "main" java.lang.NullPointerException
    at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:187)
    at com.google.common.base.Splitter.split(Splitter.java:371)
    at job.JobDriver.parseHadoopConfig(JobDriver.java:52)
    at job.JobDriver.run(JobDriver.java:101)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at job.JobDriver.main(JobDriver.java:45)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:244)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:158)
    at com.google.cloud.hadoop.services.agent.job.shim.HadoopRunJarShim.main(HadoopRunJarShim.java:12)
ERROR: (gcloud.dataproc.jobs.submit.hadoop) Job [fda035c5159642c294a47385d5ebb85f] failed with error:
Job failed with message [Exception in thread "main" java.lang.NullPointerException]. Additional details can be found at '. . .'.
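A note on the stack trace: in every attempt the driver logs the argument, then "Unable to read file", then an NPE inside parseHadoopConfig. That pattern suggests the catch block swallows the IOException, the Properties object stays empty, a later getProperty returns null, and Splitter.split throws on it. A minimal sketch of that failure mode (class and key names here are hypothetical, not from the actual job):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

public class PropsFailureSketch {
    // Mirrors the question's read path: a failed load is logged but not
    // rethrown, so the Properties object simply stays empty.
    static String loadSetting(String path, String key) {
        Properties appProps = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            appProps.load(in);
        } catch (IOException e) {
            System.err.println("Unable to read file");
        }
        // With an empty Properties this returns null; a subsequent
        // Splitter.split(null) would produce the observed NullPointerException.
        return appProps.getProperty(key);
    }

    public static void main(String[] args) {
        System.out.println(loadSetting("does_not_exist.properties", "some.key"));
    }
}
```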

Try staging the file in GCS first, passing it by its fully-qualified name via the --files arg, and then referring to it by just the bare filename in the job args:

gcloud dataproc jobs submit hadoop ... \
    --files gs://path/to/file.ini \
    -- arg1 file.ini arg3
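When the file is staged with --files, it is copied into the job's working directory on the cluster, so the driver can open it by its relative name with the same code as in the question. A quick local sanity check of that read path (plain Java, no Dataproc; the filename and contents are made up for illustration):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileWriter;
import java.io.IOException;
import java.util.Properties;

public class StagedFileDemo {
    // Simulates what --files arranges: "file.ini" sits in the working
    // directory and is opened by relative name, as the driver would.
    static String readMode() throws IOException {
        File f = new File("file.ini");
        try (FileWriter w = new FileWriter(f)) {
            w.write("job.mode=batch\n");   // stand-in contents
        }
        Properties appProps = new Properties();
        try (FileInputStream in = new FileInputStream(f)) {
            appProps.load(in);
        }
        f.delete();
        return appProps.getProperty("job.mode");
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readMode());   // prints "batch"
    }
}
```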