How to trigger Terraform to upload new Lambda code
I deployed a Lambda with Terraform as shown below, but I have two problems:
1) I want null_resource.lambda to run every time, or at least whenever stop_ec2.py changes, so that stop_ec2_upload.zip never goes stale. What should I write inside triggers {}?
2) How can aws_lambda_function.stop_ec2 upload the new stop_ec2_upload.zip to the cloud whenever the zip changes?
Right now I have to destroy aws_lambda_function.stop_ec2 and then recreate it. Is there anything I can write in the code so that 1) and 2) happen automatically when I run terraform apply?
resource "null_resource" "lambda" {
triggers {
#what should I write here?
}
provisioner "local-exec" {
command = "mkdir -p lambda_func && cd lambda_py && zip
../lambda_func/stop_ec2_upload.zip stop_ec2.py && cd .."
}
}
resource "aws_lambda_function" "stop_ec2" {
depends_on = ["null_resource.lambda"]
function_name = "stopEC2"
handler = "stop_ec2.handler"
runtime = "python3.6"
filename = "lambda_func/stop_ec2_upload.zip"
source_code_hash =
"${base64sha256(file("lambda_func/stop_ec2_upload.zip"))}"
role = "..."
}
These might help:
triggers = {
  main         = "${base64sha256(file("source/main.py"))}"
  requirements = "${base64sha256(file("source/requirements.txt"))}"
}

triggers = {
  source_file = "${sha1Folder("${path.module}/source")}"
}
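Putting the first of those together with the original null_resource, a minimal sketch of an answer to question 1 (assuming the source file lives at lambda_py/stop_ec2.py, as the zip command suggests) might look like this, so the provisioner re-runs whenever the file's hash changes:

resource "null_resource" "lambda" {
  # Re-run the local-exec provisioner whenever stop_ec2.py's hash changes.
  triggers = {
    source_hash = "${base64sha256(file("lambda_py/stop_ec2.py"))}"
  }

  provisioner "local-exec" {
    command = "mkdir -p lambda_func && cd lambda_py && zip ../lambda_func/stop_ec2_upload.zip stop_ec2.py && cd .."
  }
}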
I read the link Chandan provided and figured it out. Here is my code, and it works perfectly.
It turns out that with "archive_file" and source_code_hash, I don't need triggers at all. Whenever I create a new stop_ec2.py or modify it, running terraform apply re-zips the file and uploads it to the cloud.
data "archive_file" "stop_ec2" {
type = "zip"
source_file = "src_dir/stop_ec2.py"
output_path = "dest_dir/stop_ec2_upload.zip"
}
resource "aws_lambda_function" "stop_ec2" {
function_name = "stopEC2"
handler = "stop_ec2.handler"
runtime = "python3.6"
filename = "dest_dir/stop_ec2_upload.zip"
source_code_hash =
"${data.archive_file.stop_ec2.output_base64sha256}"
role = "..."
}
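As a side note, if the function ever grows beyond a single file, the same pattern should work with the archive_file data source's source_dir argument in place of source_file. A sketch, assuming all the sources live under src_dir/:

data "archive_file" "stop_ec2" {
  type        = "zip"
  source_dir  = "src_dir"   # zip the whole directory instead of one file
  output_path = "dest_dir/stop_ec2_upload.zip"
}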