Cloud SQL import permissions issues for Cloud Storage bucket

I'm writing a Cloud Function that copies our production Cloud SQL database into a staging instance: it exports the source database to Cloud Storage, then drops, recreates and re-imports the target database.

Note: I want this to run on its own every night to copy the production database to the staging environment, so I plan to trigger it with Cloud Scheduler.
If you have a better/easier way to do this in GCP, I'm all ears :)
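
In case it helps, here is a minimal sketch of how I plan to wire up the nightly trigger with the Cloud Scheduler Python client, assuming the function is deployed with an HTTP trigger (the region, schedule and URL below are placeholders, not my real values, and auth is left out for brevity):

from google.cloud import scheduler_v1

def create_nightly_trigger(project: str, region: str = "us-central1"):
    # Sketch only: create a Cloud Scheduler job that calls the function's
    # HTTP endpoint every night. The URL is a placeholder.
    # NOTE: a real job would also need an OIDC token if the function
    # requires authentication.
    client = scheduler_v1.CloudSchedulerClient()
    parent = client.common_location_path(project, region)

    job = scheduler_v1.Job(
        name=f"{parent}/jobs/clone-db-nightly",
        schedule="0 3 * * *",  # every night at 03:00
        time_zone="Etc/UTC",
        http_target=scheduler_v1.HttpTarget(
            uri=f"https://{region}-{project}.cloudfunctions.net/clone_db",
            http_method=scheduler_v1.HttpMethod.POST,
        ),
    )
    return client.create_job(parent=parent, job=job)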

Here's my code (the actual entry point is clone_db, at the bottom of the file):

from os import getenv
from datetime import datetime
from time import sleep

from googleapiclient import discovery
from googleapiclient.errors import HttpError
from oauth2client.client import GoogleCredentials
from google.cloud import storage

GS_BUCKET = getenv("GS_BUCKET")
GS_FOLDER = "sql-exports"
GS_EXPORT_PATH = f"gs://{GS_BUCKET}/{GS_FOLDER}"


def __sql_file_name(db: str, timestamp: datetime):
    return f"{db}-{timestamp.strftime('%Y-%m-%d')}.sql.gz"


def __sql_file_uri(db: str, timestamp: datetime):
    return f"{GS_EXPORT_PATH}/{__sql_file_name(db, timestamp)}"


def __export_source_db(service, project: str, timestamp: datetime, instance: str, db: str):
    context = {
        "exportContext": {
            "kind": "sql#exportContext",
            "fileType": "SQL",
            "uri": __sql_file_uri(db, timestamp),
            "databases": [db],
        }
    }

    return service.instances().export(project=project, instance=instance, body=context).execute()


def __import_target_db(service, project: str, timestamp: datetime, instance: str, db: str):
    context = {
        "importContext": {
            "kind": "sql#importContext",
            "fileType": "SQL",
            "uri": __sql_file_uri(db, timestamp),
            "database": db,
        }
    }

    return service.instances().import_(project=project, instance=instance, body=context).execute()


def __drop_db(service, project: str, instance: str, db: str):
    try:
        return service.databases().delete(project=project, instance=instance, database=db).execute()
    except HttpError as e:
        if e.resp.status == 404:
            return {"status": "DONE"}
        else:
            raise e


def __create_db(service, project: str, instance: str, db: str):
    database = {
        "name": db,
        "project": project,
        "instance": instance,
    }

    return service.databases().insert(project=project, instance=instance, body=database).execute()


def __update_export_permissions(file_name: str):
    client = storage.Client()
    file = client.get_bucket(GS_BUCKET).get_blob(f"{GS_FOLDER}/{file_name}")
    file.acl.user(getenv("TARGET_DB_SERVICE_ACCOUNT")).grant_read()
    file.acl.save()


def __delete_sql_file(file_name: str):
    client = storage.Client()
    bucket = client.get_bucket(GS_BUCKET)
    bucket.delete_blob(f"{GS_FOLDER}/{file_name}")


def __wait_for(operation_type, operation, service, project):
    if operation["status"] in ("PENDING", "RUNNING", "UNKNOWN"):
        print(f"{operation_type} operation in {operation['status']} status. Waiting for completion...")

        while operation['status'] != "DONE":
            sleep(1)
            operation = service.operations().get(project=project, operation=operation['name']).execute()

    print(f"{operation_type} operation completed!")


def clone_db(_):
    credentials = GoogleCredentials.get_application_default()
    service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)

    # Project ID of the project that contains the instance to be exported.
    project = getenv('PROJECT_ID')

    # Cloud SQL instance ID. This does not include the project ID.
    source = {
        "instance": getenv("SOURCE_INSTANCE_ID"),
        "db": getenv("SOURCE_DB_NAME")
    }

    timestamp = datetime.utcnow()

    print(f"Exporting database {source['instance']}:{source['db']} to Cloud Storage...")
    operation = __export_source_db(service, project, timestamp, **source)

    __wait_for("Export", operation, service, project)

    print("Updating exported file permissions...")
    __update_export_permissions(__sql_file_name(source["db"], timestamp))
    print("Done.")

    target = {
        "instance": getenv("TARGET_INSTANCE_ID"),
        "db": getenv("TARGET_DB_NAME")
    }

    print(f"Dropping target database {target['instance']}:{target['db']}")
    operation = __drop_db(service, project, **target)
    __wait_for("Drop", operation, service, project)

    print(f"Creating database {target['instance']}:{target['db']}...")
    operation = __create_db(service, project, **target)
    __wait_for("Creation", operation, service, project)

    print(f"Importing data into {target['instance']}:{target['db']}...")
    operation = __import_target_db(service, project, timestamp, **target)
    __wait_for("Import", operation, service, project)

    print("Deleting exported SQL file")
    __delete_sql_file(__sql_file_name(source["db"], timestamp))
    print("Done.")

Everything works fine until I try to import the exported data into my target instance.

When it calls import_, the function fails with the following error:

Error: function crashed. Details:
<HttpError 403 when requesting https://www.googleapis.com/sql/v1beta4/projects/<project_id>/instances/<instance_id>/import?alt=json returned "The service account does not have the required permissions for the bucket.">

I've read about this error here and in many other Q&As around the web, but I still can't figure out how to fix it.
Here is what I've observed:

Interestingly, when I manually import the same file into the same instance from the GCP Cloud SQL console, everything works fine.
Once that completes, I can see that the permissions on my exported file have been updated to include the instance's service account as a Reader, which is exactly the behaviour I tried to reproduce in my code with __update_export_permissions.
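
One quick way to verify that programmatically (a sketch using the same storage client the function already uses; file_name would be the exported file's name):

from google.cloud import storage

def print_export_acl(file_name: str):
    # List the ACL entries on the exported object to see which service
    # accounts the console import added as readers.
    client = storage.Client()
    blob = client.get_bucket(GS_BUCKET).get_blob(f"{GS_FOLDER}/{file_name}")
    for entry in blob.acl:  # each entry is a dict: {"entity": ..., "role": ...}
        print(entry)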

So what am I missing here?
Which permissions do I need to set, and for which service account, for this to work?

Cloud SQL instances run under a Google-managed service account that does not belong to your project.

You need to find your instance's service account: Cloud SQL -> instance name -> Service account.

Then take that service account and grant it read/write permissions on the bucket in question.
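
For example, a rough sketch with the Python storage client (the role here is an assumption; roles/storage.objectAdmin covers read and write, but a narrower role may be enough for your case):

from google.cloud import storage

def grant_bucket_access(bucket_name: str, instance_service_account: str):
    # Add the Cloud SQL instance's service account to the bucket's IAM
    # policy. The role is an assumption; adjust it to your needs.
    client = storage.Client()
    bucket = client.bucket(bucket_name)

    policy = bucket.get_iam_policy(requested_policy_version=3)
    policy.bindings.append({
        "role": "roles/storage.objectAdmin",
        "members": {f"serviceAccount:{instance_service_account}"},
    })
    bucket.set_iam_policy(policy)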

I ran into the same problem and tried many different things. Even after granting the DB service account Owner permissions on the project, the bucket and the SQL file, it did not work, while importing/exporting other files to/from the bucket always worked.

So I ended up renaming my import file, and surprisingly that worked (the previous file name was long and contained underscores, as in your example). However, I could not find anything in the documentation about such naming restrictions, and at this point I cannot even tell whether the issue was the file name length or the underscores. It may be worth a try, though.
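
If you want to test the rename without re-exporting, a small sketch with the storage client (the paths below are placeholders):

from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("yourBucket")
blob = bucket.get_blob("sql-exports/your_export_file.sql.gz")  # placeholder path
bucket.rename_blob(blob, new_name="sql-exports/test.sql.gz")   # simpler name to try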

The problem is in your code, not in Cloud SQL.

When the __import_target_db function is called, it looks for a file that does not exist in your Cloud Storage bucket.

In detail:

You exported the database to a file named:

gs://yourBucket/sql-exports/exportedDatabaseName-yyyy-mm-dd.sql.gz

However, when you try to import it, the import function looks for a file named:

gs://yourBucket/sql-exports/importDatabaseName-yyyy-mm-dd.sql.gz

That file does not exist in your bucket, and for security reasons a 403 Forbidden error is returned instead of a 404.
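
The fix is to build the import URI from the file the export step actually wrote, i.e. from the source database name. A sketch of one way to do that (source_db is a new parameter I am introducing here):

def __import_target_db(service, project: str, timestamp: datetime, instance: str, db: str, source_db: str):
    context = {
        "importContext": {
            "kind": "sql#importContext",
            "fileType": "SQL",
            # Use the *source* db name: that is the file the export created.
            "uri": __sql_file_uri(source_db, timestamp),
            # ...but still import into the target database.
            "database": db,
        }
    }

    return service.instances().import_(project=project, instance=instance, body=context).execute()

and in clone_db, pass the source name along:

operation = __import_target_db(service, project, timestamp, source_db=source["db"], **target)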