Terraform glue job doesn't create properly

I'm working with Terraform, but I can't get the right arguments to create my Glue job. Since I'm not a Terraform pro (I'm just getting started), I'd like to understand how this works.

https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/glue_job#glue_version

My glue job resource doesn't update properly with these arguments:

resource "aws_glue_job" "job_name" {
  name     = "job_name"
  description  = "job-desc"
  role_arn = "${aws_iam_role.service-name.arn}"
  max_capacity  = 2
  max_retries  = 1
  timeout      = 60

  command {
    script_location = "s3://my_bucket"
    python_version  = "3"
  }

  default_arguments = {    
    "--job-language"    = "python"
    "--ENV"             = "env"
    "--spark-event-logs-path" = "s3://my_bucket"
    "--job-bookmark-option" = "job-bookmark-enable"
    "--glue_version" = "2.0"
    "--worker_type" = "G.1X"
    "--enable-spark-ui" = "true"
  }

  execution_property {
    max_concurrent_runs = 1
  }
}

I don't know where or how these arguments should go. Can you help me?

    "--glue_version" = "2.0"
    "--worker_type" = "G.1X"

Regards.

The glue_version and worker_type arguments go at the same level as the default_arguments block, not inside it.

With them moved out, your resource block would look like this:

resource "aws_glue_job" "job_name" {
  name         = "job_name"
  description  = "job-desc"
  role_arn     = "${aws_iam_role.service-name.arn}"
  max_capacity = 2
  max_retries  = 1
  timeout      = 60
  glue_version = "2.0"
  worker_type  = "G.1X"

  command {
    script_location = "s3://my_bucket"
    python_version  = "3"
  }

  default_arguments = {
    "--job-language"          = "python"
    "--ENV"                   = "env"
    "--spark-event-logs-path" = "s3://my_bucket"
    "--job-bookmark-option"   = "job-bookmark-enable"
    "--enable-spark-ui"       = "true"
  }

  execution_property {
    max_concurrent_runs = 1
  }
}

EDIT

The provider version you are using, 2.30.0, doesn't support these arguments on the aws_glue_job resource.

The glue_version argument wasn't added until version 2.34.0 of the AWS provider.

The worker_type argument wasn't added until version 2.39.0.

You'll need to upgrade the provider to use these arguments.
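For example, on a Terraform 0.12-era configuration you could pin the provider with a version constraint in the provider block (the region value here is just a placeholder):

```hcl
provider "aws" {
  # Require at least 2.39.0 so both glue_version (added in 2.34.0)
  # and worker_type (added in 2.39.0) are supported.
  version = ">= 2.39.0"
  region  = "eu-west-1" # placeholder region
}
```

On Terraform 0.13 and later, the same constraint belongs in a required_providers block inside terraform {} instead. After changing the constraint, run `terraform init -upgrade` to download the newer provider.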