I am looking for a solution to the following problem. I have a package that includes some build tools (for use with SCons, though that is just incidental to the question). I want the package to be able to use these tools itself straight from source before it is installed, so I do the following in the main build file (SConstruct file):
import sys
sys.path.insert(0, "path_to_my_package")
import my_package
Then, great, my tools can be used in the build/install steps of my_package itself (putting it at the front of sys.path so that it is preferentially imported over any other version of my_package that might happen to be installed in the environment).
The problem is that one of the tools does a "local install" of the package, basically running conda develop my_package. Unfortunately, if some version of my_package is already installed in the environment, it will be imported preferentially to the local copy "installed" via conda develop. So I want to detect whether my_package is installed and tell the user to uninstall it before doing the "local install".
However, when "local install"ing my_package itself, I run into the problem that the build scripts have already added the package to the path and imported it, so all the normal ways of detecting whether a package is installed will report that it IS installed, because within the current Python session it kind of is. E.g. if I run something like importlib.util.find_spec("my_package"), it will find the copy I am using directly from source, which is not what I care about. I want to check whether any other version of that package is already installed in my environment.
I could do something like this:
import importlib
sys.path = sys.path[1:]  # remove the local source from the path
importlib.reload(my_package)  # make Python look again for the module
and check for import errors, which does work, but I am a little worried about the side effects it will have on the version of my_package I already loaded the first time. I'd rather do something like find_spec, where I can look for packages without actually loading them or messing with the existing sys.modules. But I don't know any way to get find_spec to properly search the path for my_package again. It just always returns the package that is already imported, regardless of what I do to sys.path. I guess it looks in sys.modules (or some cache) first.
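That matches what I can reproduce: importlib.util.find_spec short-circuits through sys.modules for anything already imported, so changing sys.path has no effect. A minimal demonstration, using a stdlib module as a stand-in for my_package:

```python
import importlib.util
import json  # stand-in for my_package: already imported in this session
import sys

saved = sys.path
sys.path = []  # nothing left to search on disk
try:
    spec = importlib.util.find_spec("json")
finally:
    sys.path = saved

# find_spec still succeeds: it returned json.__spec__ straight
# from sys.modules without consulting sys.path at all.
print(spec is not None)  # True
```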
So, is there some way to search the system for modules totally from scratch, ignoring everything that is currently in the Python session?
I guess I could do something kind of drastic like spawning a fresh Python session as a subprocess via a system call and doing the find_spec check there. I suppose there is nothing terribly wrong with that; it just feels like there should be a better way to do this...
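For concreteness, the subprocess check I have in mind is something like this (just a sketch; a hardened version would probably also need to scrub the working directory and PYTHONPATH so the local source checkout can't leak into the child's search path):

```python
import subprocess
import sys

def installed_elsewhere(pkg: str) -> bool:
    """Ask a fresh interpreter whether pkg is importable, so this
    session's sys.path insert and sys.modules entries can't interfere.
    Sketch only: does not scrub cwd or PYTHONPATH."""
    code = (
        "import importlib.util, sys; "
        f"sys.exit(0 if importlib.util.find_spec({pkg!r}) else 1)"
    )
    # Exit code 0 means the fresh interpreter could locate the package.
    return subprocess.run([sys.executable, "-c", code]).returncode == 0
```

Then the "local install" tool could call installed_elsewhere("my_package") and bail out with an uninstall message if it returns True.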
My other crazy idea was to copy my_package to a temporary directory under a different package name, say my_package_2, and import it from there. It's a simple package, so this should be fine. Then when I later search for "properly installed" versions of my_package, I can do a "normal" search without worrying about collisions. The subprocess approach might be simpler, though...
So actually, maybe my idea to use symlinks to import the local version under a different name isn't so crazy after all: then I can do whatever I want inside my_package.tools, including searching for the my_package module, and it will work in a straightforward way without interference due to my_package_local.
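A sketch of that symlink trick (the name my_package_local and the directory layout are just illustrative; here the "local source checkout" is faked on the fly so the snippet is self-contained):

```python
import importlib.util
import os
import sys
import tempfile

# Stand-in for the real source checkout at path_to_my_package/my_package.
src_root = tempfile.mkdtemp()
pkg_dir = os.path.join(src_root, "my_package")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write("TOOLS = 'local build tools'\n")

# Expose the local source under a non-colliding alias via a symlink
# (on Windows, creating symlinks may need privileges; a copy works too).
alias_root = tempfile.mkdtemp()
os.symlink(pkg_dir, os.path.join(alias_root, "my_package_local"))

sys.path.insert(0, alias_root)
import my_package_local  # the local source, under its alias

# "my_package" itself was never imported, so find_spec now reports
# only copies that are genuinely installed in the environment.
spec = importlib.util.find_spec("my_package")
print(spec is None)  # True unless some my_package really is installed
```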