How to use PyPI package in a Docker image in GitLab CI pipeline?


Currently only one package (but in the future more) is being built and pushed to my GitLab project's PyPI package registry with the following two jobs:

build-base-package:
  stage: build
  image: python:slim
  artifacts:
    untracked: true
    paths:
    - base/build
  script:
    - cd base
    - apt-get update && apt-get -y upgrade && pip install --upgrade pip && pip install -r requirements.txt
    - pip install wheel build
    - python -m build --wheel
    - ls -R
  tags:
    - myrunner
  interruptible: true 


deploy-base-package-to-registry:
  stage: build
  image: python:slim
  tags:
    - myrunner
  dependencies:
    - build-base-package
  script:
    - apt-get update && apt-get -y upgrade && pip install --upgrade pip
    - pip install twine
    - TWINE_PASSWORD=${CI_JOB_TOKEN} TWINE_USERNAME=gitlab-ci-token python -m twine upload --verbose --skip-existing --repository-url ${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi base/dist/*

In the same repository I have a Dockerfile and a requirements.txt that I would like to build my image from. The image has to include base-package (built by the CI jobs above). However, I do not even know how to add it to my requirements.txt so that pip can resolve it.
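From the GitLab docs, the closest thing I have found is pointing pip at the project's package registry as an extra index. A minimal sketch of what I imagine the requirements.txt would have to look like (assuming the package is published as base-package, that gitlab.example.com stands in for my instance, and relying on pip's ${VAR} environment-variable expansion in requirements files):

# hypothetical requirements.txt for the image
--extra-index-url https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/api/v4/projects/<project_id>/packages/pypi/simple
base-package

Alternatively the file could contain just base-package and the index URL could be supplied from outside via the PIP_EXTRA_INDEX_URL environment variable, which pip also honours. Either way, though, the token has to reach pip somehow, which is exactly the problem described next.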

The main (only?) issue I have so far is that an access token (a PAT or CI_JOB_TOKEN) is required to access the PyPI registry. I don't see how that can work for a Docker image whose build depends on such a registry for its packages.
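To make that concrete: as far as I understand, the token (or a full index URL containing it) would have to be injected into the docker build, e.g. as a build argument that pip then picks up through PIP_EXTRA_INDEX_URL. A sketch of what I imagine, assuming requirements.txt stays plain and assuming a runner that can run docker builds (the job name, image tag and docker:dind setup below are placeholders, not something I have working):

build-image:
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    # CI_JOB_TOKEN with user gitlab-ci-token authenticates against this project's PyPI registry
    - docker build --build-arg PIP_EXTRA_INDEX_URL="https://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}/api/v4/projects/${CI_PROJECT_ID}/packages/pypi/simple" -t base-image .
  tags:
    - myrunner

with a matching ARG PIP_EXTRA_INDEX_URL declared in the Dockerfile before the pip install step. Even if that works, my understanding is that build arguments remain visible in docker history, so baking a job token into the image build this way feels wrong, hence the question.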


UPDATE: In case someone suggests it, I would prefer not to consume the build job's artifact directly (here a wheel file), which as far as I know is only possible within the same pipeline anyway. Knowing how to install from the registry matters because it also applies to other use cases such as inter-project dependencies, where a pipeline in one project uses a package built by a pipeline in another.
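For that inter-project case I assume the same index-URL mechanism applies, only with credentials that are valid outside the producing pipeline, e.g. a deploy token with the read_package_registry scope instead of CI_JOB_TOKEN. Something along these lines is what I have in mind (the token username/value and project ID are placeholders):

pip install --extra-index-url https://<deploy_token_username>:<deploy_token>@gitlab.example.com/api/v4/projects/<other_project_id>/packages/pypi/simple base-package

But I have not verified this, so pointers on the intended setup are welcome.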
