This is how I am defining the archive_file data source in Terraform:
locals {
  tmp_azure_function_zip       = "../code.zip"
  azure_function_local_basedir = "../my_code/"
}

data "archive_file" "functionzip" {
  type        = "zip"
  output_path = local.tmp_azure_function_zip
  excludes    = ["__pycache__"]

  source {
    content  = file("${local.azure_function_local_basedir}/info/__init__.py")
    filename = "info/__init__.py"
  }

  # ... same logic to source the other files ...
}
My Python code for the Azure Function resides in the my_code directory, and I am reading the file contents with the file function.
Later, I deploy the Azure Function from the output zip file using the zip_deploy_file setting:

zip_deploy_file = data.archive_file.functionzip.output_path
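For context, a minimal sketch of how that wiring might look on my side; the resource names (example, rg, plan, sa) are placeholders for my actual resources:

resource "azurerm_linux_function_app" "example" {
  name                       = "my-function-app"
  resource_group_name        = azurerm_resource_group.rg.name
  location                   = azurerm_resource_group.rg.location
  service_plan_id            = azurerm_service_plan.plan.id
  storage_account_name       = azurerm_storage_account.sa.name
  storage_account_access_key = azurerm_storage_account.sa.primary_access_key

  site_config {}

  # Deploy the zip produced by the archive_file data source.
  zip_deploy_file = data.archive_file.functionzip.output_path
}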
Now, the problem: the first-time creation works fine, but when I update the code of the __init__.py file, Terraform does not pick up the change because, technically, nothing has changed in the Terraform configuration.
With aws_lambda_function there is a source_code_hash property for exactly this purpose; however, azurerm_linux_function_app has no such option.
How can I trigger Terraform to re-read the source files when they are updated? Is there a way I can use hashing functions like filesha1 etc.?
Regards, Yuvraj
I was expecting the file function I used inside the source block to pick up changes in the source file, but it looks like it does not work that way.
You can use a null_resource with a trigger to achieve this.
replace_triggered_by only works with resources; luckily, null_resource has a triggers argument that accepts a map of strings.
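A minimal sketch of that approach, reusing the locals from your question (the names code_hash, code_watcher, and example are hypothetical): put a hash of the source files into the null_resource's triggers, then reference that resource from the function app's replace_triggered_by lifecycle argument so a content change forces a redeploy.

locals {
  # Combined hash over every file under the source directory, so any
  # content change (not just __init__.py) changes the trigger value.
  code_hash = sha1(join("", [
    for f in fileset(local.azure_function_local_basedir, "**") :
    filesha1("${local.azure_function_local_basedir}/${f}")
  ]))
}

resource "null_resource" "code_watcher" {
  # A new hash replaces this resource, which in turn triggers
  # replacement of anything that lists it in replace_triggered_by.
  triggers = {
    code_hash = local.code_hash
  }
}

resource "azurerm_linux_function_app" "example" {
  # ... your existing arguments, including:
  zip_deploy_file = data.archive_file.functionzip.output_path

  lifecycle {
    replace_triggered_by = [null_resource.code_watcher]
  }
}

Note that replace_triggered_by forces a full replacement of the function app rather than an in-place update, so expect the app to be destroyed and recreated whenever the code changes.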