Cloud Function deployment does not respect symbolic links


As the design specification makes clear, the source directory of a Google Cloud Function is expected to include main.py, requirements.txt, and __init__.py. Additional local dependencies (i.e., code) may be included so long as their imports resolve within the source directory, as described here. This precludes importing from sibling or parent directories.

base/
   __init__.py
   code_sibling.py
functions/
   f/
      __init__.py
      main.py
      requirements.txt
      internal/
         __init__.py
         code_internal.py

In this directory setup, main.py can import internal.code_internal and, if base/ has been added to the PYTHONPATH, can also import base.code_sibling. The limitations (by design) of Cloud Functions do not allow this latter import, as only the contents of functions/f/ are deployed to its servers. My question concerns workarounds, and the use of symbolic links in particular.

In conventional Python, a symbolic link can be added to functions/f/ which points to base/, making the contents of base/ importable as though they lived directly in the functions/f/ directory: in other words, as though the file functions/f/base/code_sibling.py existed. However, this approach does not change the deployment behavior of Cloud Functions: the symbolic link is seemingly ignored by gcloud functions deploy. Instead, I find myself copying the base/ directory into functions/f/, deploying the Cloud Function, and then deleting the copied files under functions/f/base/.
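For reference, the local behavior described above can be sketched end-to-end. The directory names mirror the question's layout; the `VALUE` constant is a placeholder introduced for the demonstration:

```python
import os
import sys
import tempfile

# Sketch: conventional Python happily follows a symbolic link to a
# sibling package. Directory names mirror the question's layout;
# VALUE is a placeholder constant.
root = tempfile.mkdtemp()
base = os.path.join(root, "base")
func = os.path.join(root, "functions", "f")
os.makedirs(base)
os.makedirs(func)

# A sibling module living outside the function's source directory.
with open(os.path.join(base, "__init__.py"), "w") as fh:
    fh.write("")
with open(os.path.join(base, "code_sibling.py"), "w") as fh:
    fh.write("VALUE = 42\n")

# functions/f/base -> base/, as described above.
os.symlink(base, os.path.join(func, "base"))

# With functions/f on the path, base.code_sibling imports as though
# functions/f/base/code_sibling.py existed.
sys.path.insert(0, func)
from base.code_sibling import VALUE
print(VALUE)  # 42
```

The same import fails once deployed, because only the (unresolved) link is present in the uploaded source.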

Has anyone been able to make symbolic links work, or are there other workarounds that better address this situation? Thank you.

Cross-posted to functions-framework-python with this ticket.

1 answer


Symbolic links are not followed when gcloud functions deploy packages your source directory for upload, so anything a link points to outside that directory never reaches Cloud Functions' servers. You can get around this by copying the files you need directly into the Cloud Function source directory before deploying. This is one option you have.

Note that gcloud functions deploy offers no flag for extending the Python import path at deploy time; only the contents of the source directory (the current directory, or whatever is passed via --source) are uploaded:

gcloud functions deploy my-function --runtime python37 --source functions/f

So the practical alternative is to automate the copy, deploy, clean-up cycle in a small deploy script, so the copied base/ directory never lingers in your source tree. Which solution to choose depends on how many files you need to import and how often they change.

For more reference:

An object in Google Cloud Storage which acts as a "redirect" or "symlink"

gcloud functions runtimes list