Terraform: source block of archive_file data source does not recognize changes in source file contents


This is how I am defining the archive_file data source in Terraform:

locals {
  tmp_azure_function_zip       = "../code.zip"
  azure_function_local_basedir = "../my_code/"
}

data "archive_file" "functionzip" {
  type        = "zip"
  output_path = local.tmp_azure_function_zip
  excludes    = ["__pycache__"]

  source {
    content  = file("${local.azure_function_local_basedir}/info/__init__.py")
    filename = "info/__init__.py"
  }

  # ... same logic to source other files ...
}

My Python code for the Azure Function resides in the my_code directory, and I am reading the file contents using the file function.

Later, I use the output zip file to deploy the Azure Function via the zip_deploy_file setting:

 zip_deploy_file      = data.archive_file.functionzip.output_path
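For context, here is roughly where that setting sits in my config (resource names and the surrounding arguments are simplified, not my exact setup):

```terraform
resource "azurerm_linux_function_app" "example" {
  name                = "example-func"
  resource_group_name = azurerm_resource_group.example.name
  location            = azurerm_resource_group.example.location

  storage_account_name       = azurerm_storage_account.example.name
  storage_account_access_key = azurerm_storage_account.example.primary_access_key
  service_plan_id            = azurerm_service_plan.example.id

  site_config {}

  # Path to the zip built by the archive_file data source
  zip_deploy_file = data.archive_file.functionzip.output_path
}
```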

Now, the problem is that the first creation works fine, but when I "update" the code in __init__.py, Terraform does not pick up the change because, technically, nothing changed in the Terraform configuration.

In the case of aws_lambda_function, there is a source_code_hash property for this; however, azurerm_linux_function_app has no such option.

How can I trigger Terraform to read the source files again when they are updated? Is there a way I can use hashing functions like filesha1?

Regards, Yuvraj

I was expecting the file function used inside the source block to pick up changes in the source file, but it looks like it does not work that way.

There are 2 best solutions below


You can use a null_resource with a trigger to achieve this.

# Replaced when the file is modified
resource "null_resource" "trigger" {
  triggers = {
    file = filesha256("some_file")
  }
}

# Replaced whenever null_resource.trigger is replaced
resource "null_resource" "mock" {
  lifecycle {
    replace_triggered_by = [null_resource.trigger]
  }
}

replace_triggered_by only accepts references to managed resources; luckily, null_resource has a triggers argument that accepts strings, so a file hash can be turned into a resource change.
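The same pattern can also be wired directly to the resource that consumes the zip. A sketch (the function app's required arguments are omitted here); note that replace_triggered_by forces a full replacement of the resource it is placed on, which may be heavier than you want for a function app:

```terraform
resource "azurerm_linux_function_app" "example" {
  # ... required arguments ...

  zip_deploy_file = data.archive_file.functionzip.output_path

  lifecycle {
    # Replace the app whenever the source hash stored in
    # null_resource.trigger changes
    replace_triggered_by = [null_resource.trigger]
  }
}
```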


Thanks for the input, Harambo!

I fixed it a bit differently though.

I followed the hash-based suggestion mentioned on this page: Can Terraform watch a directory for changes?

Instead of adding a trigger, I simply append the calculated source hash to the output zip file name. The config has become far more complex, but it works as expected :)

data "archive_file" "functionzip" {
  for_each    = var.az_func
  type        = "zip"
  output_path = "${each.value.api_name}-${substr(sha1(join("", [for f in fileset(each.value.api_name, "**") : filesha1("${each.value.api_name}/${f}")])), 0, 10)}.zip"
  excludes    = ["__pycache__"]

  # ... source blocks as before ...
}

Now, when I generate the plan after changing the source content, the hash updates and Terraform picks it up as a change. This thread can be closed now.

 ~ zip_deploy_file = "api-test1-bfee5c673.zip" -> "api-test1-eb9f524223.zip"
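For readability, the dense output_path expression can be unpacked into a local. A sketch assuming a single source directory instead of the for_each setup above (directory name is illustrative):

```terraform
locals {
  src_dir = "../my_code"

  # SHA-1 of the concatenated per-file hashes: changes whenever
  # any file under src_dir changes
  source_hash = sha1(join("", [
    for f in fileset(local.src_dir, "**") : filesha1("${local.src_dir}/${f}")
  ]))
}

# Then, in the archive_file data source:
#   output_path = "code-${substr(local.source_hash, 0, 10)}.zip"
```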