Google Dataflow: PCollection&lt;String&gt; to PCollection&lt;TableRow&gt; for BigQuery insertion
I'm new to Google Cloud Platform, and this is my first attempt at Google Dataflow, for my graduate project. What I want to do is write an automated load job that loads files from a certain bucket in my Cloud Storage and inserts the data from them into a BigQuery table.

I get the data as a PCollection&lt;String&gt;, but for insertion into BigQuery I apparently need to transform it into a PCollection&lt;TableRow&gt;. So far I haven't found a solid answer on how to do this.

Here's my code:
public static void main(String[] args) {
    // Define the schema of the BigQuery table
    List<TableFieldSchema> fields = new ArrayList<>();
    fields.add(new TableFieldSchema().setName("Datetime").setType("TIMESTAMP"));
    fields.add(new TableFieldSchema().setName("Consumption").setType("FLOAT"));
    fields.add(new TableFieldSchema().setName("MeterID").setType("STRING"));
    TableSchema schema = new TableSchema().setFields(fields);

    // Create the pipeline
    PipelineOptions options = PipelineOptionsFactory.fromArgs(args).withValidation().create();
    Pipeline p = Pipeline.create(options);

    // Get the data from Cloud Storage
    PCollection<String> lines = p.apply(TextIO.Read.named("ReadCSVFromCloudStorage")
            .from("gs://mybucket/myfolder/certainCSVfile.csv"));

    // Probably need to do some transform here ...

    // Insert data into BigQuery (note the table spec format is "project:dataset.table")
    lines.apply(BigQueryIO.Write
            .named("WriteToBigQuery")
            .to("projectID:datasetID.tableID")
            .withSchema(schema)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));

    // Run the pipeline (was missing: without this the job never executes)
    p.run();
}
I'm probably just forgetting something basic here, so I hope you can help me out with this...
BigQueryIO.Write operates on a PCollection&lt;TableRow&gt;, as described in Writing to BigQuery. You'll need to apply a transform to convert the PCollection&lt;String&gt; into a PCollection&lt;TableRow&gt;. For an example, take a look at StringToRowConverter:
static class StringToRowConverter extends DoFn<String, TableRow> {
    /**
     * In this example, put the whole string into a single BigQuery field.
     */
    @Override
    public void processElement(ProcessContext c) {
        c.output(new TableRow().set("string_field", c.element()));
    }
    ...
}
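Since the question's schema has three fields (Datetime, Consumption, MeterID), the DoFn would split each CSV line and set those fields individually rather than dumping everything into a single string_field. Below is a minimal sketch of that parsing step, written as a plain helper so it runs without the Dataflow SDK on the classpath: the parseLine name and the Map it returns are illustrative stand-ins for the new TableRow().set(...) calls you would make inside processElement, and it assumes the CSV columns arrive in schema order with no quoted commas.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvRowConverter {

    // Hypothetical helper: splits one CSV line of the form
    // "timestamp,consumption,meterID" into the three fields of the
    // BigQuery schema defined in the question. Inside a real
    // DoFn<String, TableRow>, the same keys would go into
    // new TableRow().set("Datetime", ...).set("Consumption", ...)...
    public static Map<String, Object> parseLine(String line) {
        String[] parts = line.split(",", -1);
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("Datetime", parts[0].trim());                       // TIMESTAMP
        row.put("Consumption", Double.parseDouble(parts[1].trim())); // FLOAT
        row.put("MeterID", parts[2].trim());                        // STRING
        return row;
    }

    public static void main(String[] args) {
        System.out.println(parseLine("2016-01-01 00:00:00,12.5,METER-42"));
    }
}
```

Wired into the pipeline, this conversion sits between the read and the write, e.g. lines.apply(ParDo.of(new StringToRowConverter())), which produces the PCollection&lt;TableRow&gt; that BigQueryIO.Write expects.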