I have a project directory that looks like this
/work
    /venv
    /project1
        app.py
        /package1
        /package2
        etc
where the script app.py runs the project I'm working on. When I run
python -m site from the /project1 directory I get, as expected
sys.path = [
    '/work/project1',
plus the relevant /lib/pythonX.Y paths. However, when running app.py (from within project1), this import
from package1 import ...
fails with a ModuleNotFoundError, whereas from .package1 import ... works fine, since it tells Python to search the directory that app.py is in. So, I added this to the beginning of app.py
import sys
print(sys.path)
and the result is
sys.path = [
    '/work',
Instead of /work/project1, the directory being searched when importing in app.py is /work. What is the cause of this discrepancy between python -m site and print(sys.path), and what is the best way to make sure that /work/project1 is always part of sys.path?
I would like to avoid using something like site.addsitedir() and perhaps use a .pth file instead. From what I've read, though, a .pth file belongs in /lib/pythonX.Y/site-packages, but I also need this solution to be system independent, so that a collaborator who clones /project1 from GitHub won't have to add their own .pth file.
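For reference, my understanding is that a .pth file is just a plain-text file (any name ending in .pth) whose non-comment lines are directories that the site module appends to sys.path at startup. A purely illustrative sketch of checking where such a file would have to live; the file name project1.pth is an arbitrary choice:

import site

# Print this interpreter's site-packages directories -- this is where a
# .pth file is picked up automatically at startup.
print(site.getsitepackages())

# A file such as site-packages/project1.pth would then contain one
# directory per line, e.g. the single line:
#     /work/project1
# Any existing directory listed this way is appended to sys.path.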
I don't know if it applies to you, but I've helped a lot of others by pointing out what our shop does. We don't worry about the path upon execution at all. All we rely on is that, from the initial script's location, we know where all the dependencies are. We then add directories to sys.path by computing their positions relative to the directory containing the initial script, which is always available.
We put this logic in a file named init.py in the same dir as the main script. Here's what one looks like:
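(A minimal sketch, assuming the shared modules live next to or one level above the script's directory; the utils and common directory names are purely illustrative.)

# init.py -- importing this module fixes up sys.path for the calling script.
import os
import sys

# Directory containing this file (and therefore the main script next to it).
_here = os.path.dirname(os.path.abspath(__file__))

# Directories to make importable, computed relative to the script's location.
# _here covers packages that sit next to the main script; the other entries
# are illustrative examples of shared-utility locations elsewhere in the tree.
for _dir in (_here,
             os.path.normpath(os.path.join(_here, "..", "utils")),
             os.path.normpath(os.path.join(_here, "..", "common"))):
    if os.path.isdir(_dir) and _dir not in sys.path:
        sys.path.insert(0, _dir)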
Then, in the main script, just do import init. We have directories with lots of scripts in them, and they all just do import init to have their sys.path fixed up to find all of our utility modules and such.