Is it possible to create log based alert for Constraints in GCP using python at organization level?
Is it possible to create a log-based (Stackdriver) alert for GCP Constraints at the organization level? Creating an alert at the project level is possible, but I'd like to know whether we can create one at the organization level, and if so, how.
Currently, log-based alerts can only be created at the project level.
I think this could be a workaround for your case:
1. Create an aggregated sink to route the audit log entries of your Google Cloud organization.
To create a sink, use the logging sinks create command. Supply the sink name, sink destination, filter, and the ID of the folder or organization from which you're routing logs:

    gcloud logging sinks create SINK_NAME SINK_DESTINATION --include-children \
        --folder=FOLDER_ID --log-filter="FILTER"
For example, if you're creating an aggregated sink at the folder level whose destination is a BigQuery dataset, your command might look like the following:

    gcloud logging sinks create SINK_NAME \
        bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID --include-children \
        --folder=FOLDER_ID --log-filter="logName:activity"
Notes:
- To create a sink at the organization level, replace --folder=FOLDER_ID with --organization=ORGANIZATION_ID.
- For the sink to include all resources within the organization, the --include-children flag must be set, even when the --organization flag is passed. When set to false (the default), the sink only routes logs from the host resource.
- Retrieve the sink's service account name (the writer identity) from the command output.
- Give that service account permission to write to your sink destination.
- If you don't have permission to make that change to the sink destination, then send the service account name to someone who can make that change for you.

For more information about granting service accounts permissions for resources, see the set destination permissions section.
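Since you asked about Python: below is a minimal sketch of the same organization-level sink creation using the google-cloud-logging client library. The sink name org-audit-sink and the ORGANIZATION_ID, PROJECT_ID, and TOPIC_ID placeholders are assumptions you must replace with your own values.

    # Minimal sketch: create an organization-level aggregated sink with the
    # google-cloud-logging client library (pip install google-cloud-logging).
    from google.cloud.logging_v2.services.config_service_v2 import ConfigServiceV2Client
    from google.cloud.logging_v2.types import LogSink

    client = ConfigServiceV2Client()

    # Placeholder names; a Pub/Sub destination is used here to feed step 2 below.
    sink = LogSink(
        name="org-audit-sink",
        destination="pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID",
        filter="logName:activity",     # same filter as the gcloud example
        include_children=True,         # equivalent of --include-children
    )

    # "organizations/ORGANIZATION_ID" plays the role of --organization=ORGANIZATION_ID.
    created = client.create_sink(
        parent="organizations/ORGANIZATION_ID",
        sink=sink,
    )

    # writer_identity is the service account that must be granted permission
    # to publish to the destination topic.
    print(created.writer_identity)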
2. You can set the sink's destination to Pub/Sub so that an email alert can be triggered when the sink routes the audit logs.
Logs routed to Pub/Sub are generally available within seconds, with
99% of logs available in less than 60 seconds.
To view your routed logs as they are streamed through Pub/Sub, do the
following:
- Go to the Pub/Sub page in the Cloud Console.
- Find or create a subscription to the topic used in the log sink, and pull a log entry from it. You might have to wait for a new log
entry to be published.
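Again in Python, here is a minimal sketch of pulling the routed log entries from a subscription on the sink's topic, using the google-cloud-pubsub client library; PROJECT_ID and SUBSCRIPTION_ID are placeholders you must replace.

    # Minimal sketch: synchronously pull routed log entries from Pub/Sub
    # (pip install google-cloud-pubsub).
    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("PROJECT_ID", "SUBSCRIPTION_ID")

    # Pull up to 10 messages; each message payload is a JSON-serialized
    # LogEntry that the sink routed to the topic.
    response = subscriber.pull(
        request={"subscription": subscription_path, "max_messages": 10}
    )

    for received in response.received_messages:
        print(received.message.data.decode("utf-8"))

    # Acknowledge what we pulled so the messages are not redelivered.
    if response.received_messages:
        subscriber.acknowledge(
            request={
                "subscription": subscription_path,
                "ack_ids": [m.ack_id for m in response.received_messages],
            }
        )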