Copy files from GCS into a Cloud Run docker container during build
I'm trying to use gsutil to copy a file from GCS into a Cloud Run container during the build step.
The steps I tried:
RUN pip install gsutil
RUN gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts
The error:
ServiceException: 401 Anonymous caller does not have storage.objects.get access to the Google Cloud Storage object.
CommandException: 1 file/object could not be transferred.
The command '/bin/sh -c gsutil -m cp -r gs://BUCKET_NAME $APP_HOME/artefacts' returned a non-zero code: 1
ERROR
ERROR: build step 0 "gcr.io/cloud-builders/docker" failed: step exited with non-zero status: 1
The service accounts (default compute and cloudbuild) do have access to GCS, and I've also tried gsutil config -a
and various other flags, without success!
I'm not sure how I'm supposed to authenticate here to get access to the bucket.
I see you tagged this with Cloud Build,
so you can use a step like this:
steps:
- name: gcr.io/cloud-builders/gsutil
args: ['cp', 'gs://mybucket/results.zip', 'previous_results.zip']
# operations that use previous_results.zip and produce new_results.zip
- name: gcr.io/cloud-builders/gsutil
args: ['cp', 'new_results.zip', 'gs://mybucket/results.zip']
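For the case in the question (getting the GCS file into the image while Cloud Build runs docker build), a minimal cloudbuild.yaml sketch could look like this; the bucket, object path, and image name are placeholders. The gsutil step runs with the Cloud Build service account (so no anonymous-caller 401) and downloads into /workspace, which later steps share, so the file ends up inside the docker build context:
steps:
# Download the file into /workspace, which is shared by every build step
- name: gcr.io/cloud-builders/gsutil
  args: ['cp', 'gs://BUCKET_NAME/path/to/file', '.']
# The file now sits in the build context, so the Dockerfile can simply COPY it
- name: gcr.io/cloud-builders/docker
  args: ['build', '-t', 'gcr.io/$PROJECT_ID/my-image', '.']
images:
- 'gcr.io/$PROJECT_ID/my-image'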
Here is my GitHub Actions workflow that works:
jobs:
build:
name: Build image
runs-on: ubuntu-latest
env:
BRANCH: ${GITHUB_REF##*/}
SERVICE_NAME: ${{ secrets.SERVICE_NAME }}
PROJECT_ID: ${{ secrets.PROJECT_ID }}
steps:
- name: Checkout
uses: actions/checkout@v2
# Setup gcloud CLI
- uses: google-github-actions/setup-gcloud@master
with:
service_account_key: ${{ secrets.SERVICE_ACCOUNT_KEY }}
project_id: ${{ secrets.PROJECT_ID }}
export_default_credentials: true
# Download the file locally
- name: Get_file
run: |-
gsutil cp gs://BUCKET_NAME/path/to/file .
# Build docker image
- name: Image_build
run: |-
docker build -t gcr.io/$PROJECT_ID/$SERVICE_NAME .
# Configure docker to use the gcloud command-line tool as a credential helper
- run: |
gcloud auth configure-docker -q
# Push image to Google Container Registry
- name: Image_push
run: |-
docker push gcr.io/$PROJECT_ID/$SERVICE_NAME
You have to set 3 secrets:
- SERVICE_ACCOUNT_KEY: the content of your service account key file
- SERVICE_NAME: the name of your container
- PROJECT_ID: the project where the image is deployed
Because you download the file locally, it is present locally when the Docker build runs. You then only have to COPY it in your Dockerfile and do whatever you want with it.
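A minimal sketch of that Dockerfile side, where the base image and the artefacts path are only placeholders to adapt to your app:
FROM python:3.9-slim
ENV APP_HOME /app
WORKDIR $APP_HOME
# The file was already downloaded by the Get_file step of the workflow,
# so it is in the build context and a plain COPY is enough.
COPY file $APP_HOME/artefacts/file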
UPDATE
If you want to do this inside Docker, you can achieve it like this.
Dockerfile
FROM google/cloud-sdk:alpine as gcloud
WORKDIR /app
ARG KEY_FILE_CONTENT
RUN echo $KEY_FILE_CONTENT | gcloud auth activate-service-account --key-file=- \
&& gsutil cp gs://BUCKET_NAME/path/to/file .
....
FROM <FINAL LAYER>
COPY --from=gcloud /app/<myFile> .
....
Docker build command
docker build --build-arg KEY_FILE_CONTENT="YOUR_KEY_FILE_CONTENT" \
-t gcr.io/$PROJECT_ID/$SERVICE_NAME .
YOUR_KEY_FILE_CONTENT depends on your environment. Here are some ways to inject it:
- In GitHub Actions:
${{ secrets.SERVICE_ACCOUNT_KEY }}
- In your local environment:
$(cat my_key.json)
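For example, in the GitHub Actions workflow above, the Image_build step could forward the secret as the build argument. This is only a sketch; passing the key through an env variable keeps the JSON intact through the shell:
- name: Image_build
  env:
    KEY_FILE_CONTENT: ${{ secrets.SERVICE_ACCOUNT_KEY }}
  run: |-
    docker build \
      --build-arg KEY_FILE_CONTENT="$KEY_FILE_CONTENT" \
      -t gcr.io/$PROJECT_ID/$SERVICE_NAME .
Because the key is only consumed in the gcloud stage of the multi-stage build, it does not end up in the layers of the final image.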