How To update google-cloud-dataflow running in app engine without clearing bigquery tables

I have a google-cloud-dataflow process running on App Engine. It listens to messages sent via Pub/Sub and streams them into BigQuery.

I updated my code and I am trying to rerun the app.

But I get this error:

Exception in thread "main" java.lang.IllegalArgumentException: BigQuery table is not empty

Is there a way to update the Dataflow job without deleting the table? Since my code is likely to change quite often, I don't want to delete the data in the table.

Here is my code:

import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

import org.json.JSONObject;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.io.BigQueryIO;
import com.google.cloud.dataflow.sdk.io.PubsubIO;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.values.PCollection;

public class MyPipeline {
    private static final Logger LOG = LoggerFactory.getLogger(MyPipeline.class);
    private static String name;

    public static void main(String[] args) {

        // Schema of the destination BigQuery table
        List<TableFieldSchema> fields = new ArrayList<>();
        fields.add(new TableFieldSchema().setName("a").setType("string"));
        fields.add(new TableFieldSchema().setName("b").setType("string"));
        fields.add(new TableFieldSchema().setName("c").setType("string"));
        TableSchema tableSchema = new TableSchema().setFields(fields);

        DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setRunner(BlockingDataflowPipelineRunner.class);
        options.setProject("my-data-analysis");
        options.setStagingLocation("gs://my-bucket/dataflow-jars");
        options.setStreaming(true);

        Pipeline pipeline = Pipeline.create(options);

        // Read the incoming JSON messages from the Pub/Sub subscription
        PCollection<String> input = pipeline
                .apply(PubsubIO.Read.subscription(
                        "projects/my-data-analysis/subscriptions/myDataflowSub"));

        // Side branch that only logs each incoming element
        input.apply(ParDo.of(new DoFn<String, Void>() {

            @Override
            public void processElement(DoFn<String, Void>.ProcessContext c) throws Exception {
                LOG.info("json" + c.element());
            }

        }));
        String fileName = UUID.randomUUID().toString().replaceAll("-", "");


        // Main branch: append "1000" to field "a", convert the JSON to a TableRow and stream it to BigQuery
        input.apply(ParDo.of(new DoFn<String, String>() {
            @Override
            public void processElement(DoFn<String, String>.ProcessContext c) throws Exception {
                JSONObject firstJSONObject = new JSONObject(c.element());
                firstJSONObject.put("a", firstJSONObject.get("a").toString()+ "1000");
                c.output(firstJSONObject.toString());

            }

        }).named("update json")).apply(ParDo.of(new DoFn<String, TableRow>() {

            @Override
            public void processElement(DoFn<String, TableRow>.ProcessContext c) throws Exception {
                JSONObject json = new JSONObject(c.element());
                TableRow row = new TableRow().set("a", json.get("a")).set("b", json.get("b")).set("c", json.get("c"));
                c.output(row);
            }

        }).named("convert json to table row"))
                .apply(BigQueryIO.Write.to("my-data-analysis:mydataset.mytable").withSchema(tableSchema)
        );

        pipeline.run();
    }
}

You need to specify withWriteDisposition on your BigQueryIO.Write - see the documentation of the method and of its argument. Depending on your requirements, you want either WRITE_TRUNCATE or WRITE_APPEND.
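For reference, the default write disposition is WRITE_EMPTY, which validates that the destination table has no rows - that check is what raises the "BigQuery table is not empty" error above. Below is a minimal sketch of the changed write step against the Dataflow 1.x SDK used in the question; everything else in the pipeline stays as it is, and WRITE_APPEND is chosen here because you want to keep the data already in the table:

        .apply(BigQueryIO.Write
                .to("my-data-analysis:mydataset.mytable")
                .withSchema(tableSchema)
                // keep the rows already in the table and append the newly streamed ones;
                // use WriteDisposition.WRITE_TRUNCATE instead to replace the table contents
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

With WRITE_APPEND the job no longer checks that the table is empty, so you can redeploy the updated pipeline without touching the data already stored in mydataset.mytable.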