Upload to BigQuery from Python

I have a Python script that downloads data from Firebase, processes it, and dumps it into a JSON file. I can upload that file to BigQuery from the command line, but now I want to add some code to the Python script so it does everything in one go.

Here is my code so far.

import json
from firebase import firebase

firebase = firebase.FirebaseApplication('<redacted>')
result = firebase.get('/connection_info', None)
id_keys = map(str, result.keys())

# with open('result.json', 'r') as w:
#     connection = json.load(w)

# Write one record per line (newline-delimited JSON).
with open("w.json", "w") as outfile:
    for id in id_keys:
        json.dump(result[id], outfile, indent=None)
        outfile.write("\n")
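
For reference, the command-line step this replaces is typically a bq load call along these lines (the dataset and table names here are placeholders):

bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON mydataset.mytable w.json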

To load a JSON file with the google-cloud-bigquery Python library, use the Client.load_table_from_file() method.

from google.cloud import bigquery

bigquery_client = bigquery.Client()
table_id = 'myproject.mydataset.mytable'
source_file_name = 'w.json'  # the newline-delimited JSON file written above

# This example uses JSON, but you can use other formats.
# See https://cloud.google.com/bigquery/loading-data
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON
)

with open(source_file_name, 'rb') as source_file:
    job = bigquery_client.load_table_from_file(
        source_file, table_id, job_config=job_config
    )

job.result()  # Waits for the job to complete.

From the code sample: https://github.com/googleapis/python-bigquery/blob/9d43d2073dc88140ae69e6778551d140430e410d/samples/load_table_file.py#L19-L41
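
To confirm the load succeeded, you can fetch the destination table and check its row count; a short follow-up sketch, reusing bigquery_client and table_id from above:

table = bigquery_client.get_table(table_id)  # API request
print("Table {} now has {} rows.".format(table_id, table.num_rows))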

Edit: The way you upload to a table changed in version 0.28.0 of the Python library. Here is how to do it in 0.27 and earlier.

To load a JSON file with the google-cloud-bigquery Python library, use the Table.upload_from_file() method.

from google.cloud import bigquery

bigquery_client = bigquery.Client()
dataset = bigquery_client.dataset('mydataset')
table = dataset.table('mytable')

# Reload the table to get the schema.
table.reload()

with open(source_file_name, 'rb') as source_file:
    # This example uses JSON, but you can use other formats.
    # See https://cloud.google.com/bigquery/loading-data
    job = table.upload_from_file(
        source_file, source_format='NEWLINE_DELIMITED_JSON')

From the code sample: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/4de1ac3971d3a94060a1af7f478330b9c40cfb09/bigquery/cloud-client/load_data_from_file.py#L34-L50
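
If you are not sure which of the two APIs applies, you can check the installed library version first; a minimal sketch using pkg_resources (part of setuptools):

import pkg_resources

# Prints the installed google-cloud-bigquery version, e.g. '0.27.0'.
print(pkg_resources.get_distribution('google-cloud-bigquery').version)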

Update, November 2019

Found updated documentation for uploading JSON to Google BigQuery using Python.

Here is my working solution:

from google.cloud import bigquery
from dotenv import load_dotenv

# Load .env so the client can find its credentials,
# e.g. GOOGLE_APPLICATION_CREDENTIALS.
load_dotenv()

client = bigquery.Client()
filename = '/path/to/file/in/nd-format.json'
dataset_id = 'DatasetName'
table_id = 'TableName'

dataset_ref = client.dataset(dataset_id)
table_ref = dataset_ref.table(table_id)
job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.NEWLINE_DELIMITED_JSON
job_config.autodetect = True  # Infer the schema from the data.

with open(filename, "rb") as source_file:
    job = client.load_table_from_file(
        source_file,
        table_ref,
        location="europe-west1",  # Must match the destination dataset location.
        job_config=job_config,
    )  # API request

job.result()  # Waits for table load to complete.

print("Loaded {} rows into {}:{}.".format(job.output_rows, dataset_id, table_id))