Python imports and Celery worker


I have the following dir structure:

.
└── package
    ├── foo
    │   ├── __init__.py
    │   └── foo.py
    ├── run.py
    └── tasks.py

foo/foo.py

class Foo:
    @staticmethod
    def double(x):
        return 2 * x

run.py

from package.foo.foo import Foo

def do_bar(x):
    return x + 10

if __name__ == '__main__':
    foo = Foo()
    r1 = foo.double(10)
    r2 = do_bar(5)
    print(r1 + r2)

tasks.py

from celery import Celery

from package.foo.foo import Foo
from package.run import do_bar

celapp = Celery('foo', broker='amqp://guest@localhost//')


@celapp.task
def run():
    foo = Foo()
    return foo.double(10) + do_bar(5)

When I run the worker with $ celery -A tasks worker, it gives me:

Error: 
Unable to load celery application.
The module package was not found.

If I instead use relative imports in tasks.py:

from .foo.foo import Foo
from .run import do_bar

I get ImportError: attempted relative import with no known parent package

I think my imports are wrong, but I can't get my head around how Python resolves them.
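To illustrate the resolution rule (a minimal sketch using a throwaway copy of the layout, not the original project): an absolute import like package.foo.foo is looked up against the entries of sys.path. When a command is started from inside package/, the directory placed on sys.path is package/ itself, so there is no top-level name package to import. Putting the parent directory of package/ on sys.path makes the same import succeed:

```python
import os
import sys
import tempfile

# Build a throwaway copy of the layout: <parent>/package/foo/foo.py
parent = tempfile.mkdtemp()
os.makedirs(os.path.join(parent, "package", "foo"))
with open(os.path.join(parent, "package", "foo", "foo.py"), "w") as f:
    f.write(
        "class Foo:\n"
        "    @staticmethod\n"
        "    def double(x):\n"
        "        return 2 * x\n"
    )

# With the *parent* of package/ on sys.path, the absolute import resolves.
sys.path.insert(0, parent)
from package.foo.foo import Foo

print(Foo.double(10))  # -> 20
```

Running the worker from the parent directory has the same effect: the current directory goes on sys.path, so package becomes importable as a top-level name.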

1 Answer
As @Mr_and_Mrs_D said in their comment:

Try running $ celery -A package.tasks worker from the parent directory of package; Python scripts are not meant to be run from inside the package.

Worked wonders for me!
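The same rule can be checked end to end without Celery (a sketch with a throwaway tree; the temporary paths and the python3 command are assumptions): running run.py from inside package/ fails on the absolute import, while python3 -m package.run from the parent directory works.

```shell
# Recreate the layout in a temporary directory (throwaway copy).
tmp=$(mktemp -d)
mkdir -p "$tmp/package/foo"
printf 'class Foo:\n    @staticmethod\n    def double(x):\n        return 2 * x\n' \
    > "$tmp/package/foo/foo.py"
printf 'from package.foo.foo import Foo\nprint(Foo.double(10))\n' \
    > "$tmp/package/run.py"

# From inside package/: "package" is not importable, so this fails.
(cd "$tmp/package" && python3 run.py) 2>/dev/null || echo "fails inside package/"

# From the parent directory: the absolute import resolves, and this prints 20.
(cd "$tmp" && python3 -m package.run)
```

By the same logic, celery -A package.tasks worker only loads the app when started from the directory that contains package/.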