OK, I have an old Debian VM. Package managers are useless. No, I'm not going to update the OS.
I have the bzip2 lib and development headers installed correctly on my system (those actually came from a package).
I start with absolutely NO Python on the system; I removed everything manually. I downloaded the Python 2.7.5 source and configured it with `./configure --prefix=/usr`. It configures fine. I run `make`, and it compiles fine. I try `./python -c "import bz2; print bz2.__doc__"`, and it works, and says:
The python bz2 module provides a comprehensive interface for the bz2 compression library. It implements a complete file interface, one shot (de)compression functions, and types for sequential (de)compression.
I then run `make test`, and the whole test suite progresses fine; notably, the "test_bz2" test passes.
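For anyone chasing the same symptom, a quick diagnostic is to ask each binary where it thinks it lives and where it expects its standard library (a sketch in modern print-function syntax; the build in question was 2.7, but this runs anywhere):

```python
import os
import sys

# Which binary is running, and under which prefix does it expect its stdlib?
print(sys.executable)  # path of the interpreter binary
print(sys.prefix)      # root directory Python searches for its library

# C extension modules such as bz2.so live in lib-dynload under that prefix
dynload = os.path.join(
    sys.prefix, "lib",
    "python%d.%d" % sys.version_info[:2],
    "lib-dynload",
)
print(dynload, os.path.isdir(dynload))
```

Running this with both `./python` and `/usr/bin/python` would show whether the two binaries agree on the prefix.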
I then run `make install`, which installs my new Python binary into /usr/bin/ like I wanted.
I try `/usr/bin/python -c "import bz2; print bz2.__doc__"`, and it fails with:

```
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ImportError: No module named bz2
```
I've tried a bunch of different things, including building Python with `--enable-shared` and without it, no luck. I've tried at least 10 times (each time totally cleaning everything out, running `make distclean`, etc.). No luck.
I tried `PYTHONPATH="/usr/lib/python2.7"; export PYTHONPATH`. Still no luck.
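One reason the PYTHONPATH attempt doesn't help: its entries are simply prepended to sys.path, while the lib-dynload directory is derived from the interpreter's prefix at startup. A small demonstration (the /tmp/somewhere directory is a made-up placeholder):

```python
import os
import subprocess
import sys

# PYTHONPATH entries are prepended to sys.path; they do not change how
# the interpreter computes its prefix (and hence its lib-dynload path).
env = dict(os.environ, PYTHONPATH="/tmp/somewhere")  # hypothetical directory
out = subprocess.check_output(
    [sys.executable, "-c", "import sys; print(sys.path[1]); print(sys.prefix)"],
    env=env,
    text=True,
).splitlines()
print(out[0])  # the PYTHONPATH entry, right after sys.path[0]
print(out[1])  # the prefix, unaffected by PYTHONPATH
```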
HOWEVER, if I delete the symlink that `make install` creates at /usr/bin/python, and instead do `ln -s /path/to/my/python/compile/python python`, NOW it magically works.
So, what the heck? Why can this freshly built Python binary only find its modules when it lives in the compile directory, and not when it's installed into the normal production location? What am I missing?
I am root during the entire process, from `configure` to `make` to `make install` to testing the Python import call.
I have started from scratch again (this time compiling with `--enable-shared`, btw) and verified that not only is `build/lib.linux-x86_64-2.7/bz2.so` present in the compile directory, but once I run `make install`, that file is put into `/usr/lib/python2.7/lib-dynload/bz2.so`.
I've tried to do some reading on lib-dynload, but I haven't been able to determine whether a Python program (the default CLI configuration or whatever) needs something extra to tell it to pull module imports from lib-dynload, or whether there's some other place or option to tell `make install` where it should put the file instead of lib-dynload.
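As far as I can tell, nothing user-editable configures lib-dynload: the interpreter derives that directory from its prefix at startup and appends it to sys.path on its own. You can see this from inside a working interpreter (a sketch; in 2.7 bz2 is itself a C extension, while in 3.x it's a pure-Python wrapper around the _bz2 extension):

```python
import sys
import bz2  # resolved through sys.path like any other module

# The interpreter adds lib-dynload to sys.path automatically, based on
# the prefix it computed at startup -- no extra configuration involved.
dynload_entries = [p for p in sys.path if p.endswith("lib-dynload")]
print(dynload_entries)
print(bz2.__file__)  # where the module was actually loaded from
```

So if the prefix computation goes wrong, lib-dynload silently drops off sys.path, which matches the symptom here.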
Still, I have no explanation for why the `/path/to/compilation/python` binary can find and load `bz2.so` fine, but the `/usr/bin/python` binary can't find (or load) `/usr/lib/python2.7/lib-dynload/bz2.so`.
I thought maybe it had something to do with the fact that the installation doesn't create a `/usr/lib/python` symlink pointing at the `/usr/lib/python2.7` directory. But I created the symlink, and still no go.
I am still lost here.
It would appear that a sort of non-answer answer was arrived at accidentally via a long string of Twitter conversation(s).
I've filed another Stack Overflow question here to ask WHY what we found was the solution to this problem: https://stackoverflow.com/questions/17662091/python-installation-prefix-not-being-persisted-in-config
For posterity's sake, right now the solution is that I have to set the `PYTHONHOME` environment variable to `/usr`, and then everything starts working. The puzzling part is that the documentation says `PYTHONHOME` should default to `{prefix}`, which I clearly set to `/usr` during configure. So why should I have to set it manually?
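PYTHONHOME short-circuits the interpreter's usual startup search for the standard library, which is presumably why setting it papers over the problem. The flip side is easy to demonstrate: point PYTHONHOME at a directory with no stdlib (the /nonexistent path is just a placeholder) and the interpreter can't even start:

```python
import os
import subprocess
import sys

# With PYTHONHOME pointing somewhere without a stdlib, startup fails,
# showing that PYTHONHOME fully overrides the normal prefix search.
env = dict(os.environ, PYTHONHOME="/nonexistent")  # placeholder bogus prefix
proc = subprocess.run(
    [sys.executable, "-c", "print('ok')"],
    env=env,
    capture_output=True,
    text=True,
)
print(proc.returncode)  # nonzero: the interpreter could not find its stdlib
```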
Running `python-config --prefix` reveals that the `{prefix}` default is in fact `/usr/bin`, NOT `/usr` like I specified, which leads to me needing to override the default back to the default, bizarrely.
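The same build-time value can also be inspected from inside the interpreter via the stdlib sysconfig module, which is handy for comparing what configure recorded against what's in effect at runtime (a sketch):

```python
import sys
import sysconfig

# "prefix" here is the configure-time value baked into the build,
# the same setting python-config --prefix reports.
print(sysconfig.get_config_var("prefix"))
print(sys.prefix)  # the prefix actually in effect for this process
```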