CentOS 5 uses python 2.4, and replacing it is not really an option since yum and other core packages depend on it. My solution is to compile 2.6 and use /opt or /usr/local for the prefix. I also create a virtualenv with the new python executable, so when I’m in the environment 2.6 becomes the default python. It also isolates all my python libraries for a given project. Most of this article is actually distribution agnostic, and the yum build requirement install will likely work on other versions of CentOS and other RedHat derivatives like Fedora.
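As a quick sanity check of the end result (paths assume the /opt prefix used throughout this article), the two interpreters should coexist without touching each other:

```shell
# System python stays untouched, so yum keeps working
/usr/bin/python -V      # reports Python 2.4.x on stock CentOS 5
# The new interpreter lives under the /opt prefix after the build below
/opt/bin/python2.6 -V   # reports Python 2.6.x
```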
Install Build Requirements
Run the following command as root (or with sudo) to install gcc and the development libraries used by python:
yum install gcc gdbm-devel readline-devel ncurses-devel zlib-devel \
    bzip2-devel sqlite-devel db4-devel openssl-devel tk-devel bluez-libs-devel
Compile Python
Download and unpack the source tarball for your version of choice from python.org. For example:
VERSION=2.6.1
mkdir ~/src
cd ~/src
wget http://www.python.org/ftp/python/$VERSION/Python-$VERSION.tar.bz2
tar xjf Python-$VERSION.tar.bz2
rm Python-$VERSION.tar.bz2
cd Python-$VERSION
./configure --enable-ipv6 --prefix=/opt
make
sudo make install
Run ./configure --help for more configure options.
If you plan to install multiple versions of python under the same prefix, use “sudo make altinstall” instead of “sudo make install” for the versions you want installed as python2.6 etc., and “sudo make install” for the default version you want installed as simply python. See the README for more details.
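For instance, installing 2.5 and 2.6 side by side under /opt might look like this (the 2.5.4 version number is illustrative):

```shell
# Install 2.5 as python2.5 only -- altinstall skips the "python" name
cd ~/src/Python-2.5.4
./configure --enable-ipv6 --prefix=/opt
make
sudo make altinstall

# Install 2.6 as both python2.6 and the default "python"
cd ~/src/Python-2.6.1
./configure --enable-ipv6 --prefix=/opt
make
sudo make install
```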
After running the make command you will see a list of modules that were not built. If you installed all of the devel libraries listed above, the only missing modules should be bsddb185 and sunaudiodev. You probably don’t need these – bsddb185 is the old version of the Berkeley DB module, and sunaudiodev is for Solaris. On x86_64, db and imageop may also be in the list, but judging from setup.py this is normal.
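A quick way to confirm the extension modules you care about actually built (the module list here is an example, not exhaustive):

```shell
# Each import fails loudly if the corresponding devel library was missing at build time
/opt/bin/python2.6 -c "import zlib, bz2, sqlite3, readline, ssl; print 'extensions OK'"
```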
Install setuptools
Download setuptools from PyPI – get the egg if it’s available for your version of python, otherwise get the tarball. In either case prepend PREFIX/bin to your path, where PREFIX is what you passed to configure when building python (/opt in my example). Then either run the egg as a shell script or unpack the tarball and run “python setup.py install”. Here’s an example for setuptools 0.6c9 and python 2.6:
wget http://pypi.python.org/packages/2.6/s/setuptools/setuptools-0.6c9-py2.6.egg
sudo PATH=/opt/bin:$PATH sh setuptools-0.6c9-py2.6.egg
rm setuptools-0.6c9-py2.6.egg
Install virtualenv and virtualenvwrapper
sudo /opt/bin/easy_install virtualenv
sudo /opt/bin/easy_install virtualenvwrapper
Install More Packages (Optional)
If you plan on using other modules in all your projects, and don’t need different versions, you can install them once in the main site-packages dir for your new python install. Just make sure to use the full path to easy_install (or prepend /opt/bin to the PATH when running eggs or python setup.py), otherwise they will be installed for the system python (2.4 in CentOS 5).
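For example (simplejson is just a placeholder package), the full path decides which python the module is installed for:

```shell
# Correct: installs into /opt/lib/python2.6/site-packages
sudo /opt/bin/easy_install simplejson

# Wrong: a bare easy_install would resolve to the system 2.4's
# site-packages and the new python would never see the module
```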
Setup virtualenvwrapper
UPDATED 4-9: Thanks Brian for the feedback. Hopefully this is more clear.
Create a directory to store your virtual environments:
mkdir ~/.virtualenvs
Add the following lines to your bashrc (~/.bashrc):
export PATH=/opt/bin:$PATH
export WORKON_HOME=$HOME/.virtualenvs
source /opt/bin/virtualenvwrapper_bashrc
If you used a prefix other than /opt (e.g. /usr/local) when installing python, then change the PATH line and the source line accordingly. This also assumes you are using bash. I’m not sure how to set up virtualenvwrapper for other shells.
The PATH line will actually make python 2.6 your default interpreter for login sessions. If this is undesirable you may be able to copy virtualenv to some other location already in your path, and it should still use the appropriate version of python. Be careful if you have virtualenv installed under your main system python – virtual environments created with that version will be tied to that version of python.
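One way to check which interpreter and virtualenv a shell will actually pick up (paths assume the /opt layout above):

```shell
which python       # should print /opt/bin/python once /opt/bin is first in PATH
python -V          # should report 2.6.x, not the system 2.4
which virtualenv   # confirm it is the /opt copy, not one tied to the system python
```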
New login sessions will automatically source ~/.bashrc and have support for virtualenvwrapper – to update your currently running shell, source it explicitly:
source ~/.bashrc
IMPORTANT: you should use “#!/usr/bin/env python” for your scripts instead of “#!/path/to/python” to make them more portable, so they work across different virtualenvs.
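A minimal script illustrating why the env shebang matters (myscript.py is a hypothetical name):

```shell
cat > myscript.py <<'EOF'
#!/usr/bin/env python
# env resolves to whichever python is first in PATH, so this same
# script runs under 2.6 and under any activated virtualenv unchanged
import sys
print sys.executable
EOF
chmod +x myscript.py
./myscript.py   # prints a different interpreter path depending on the active environment
```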
More documentation: the virtualenv and virtualenvwrapper docs, and Doug Hellmann’s original virtualenvwrapper post.
Create a virtualenv (using virtualenvwrapper)
Choose a name for your new environment. I will use NewEnv as a placeholder. You can create new environments with mkvirtualenv:
mkvirtualenv NewEnv
This command will also automatically activate the new environment. To activate NewEnv in another shell, use the “workon” command:
workon NewEnv
Running workon without any arguments will list all the virtualenvs you’ve created. Here are some common packages you may want to easy_install into your new virtualenv. Note that you don’t need to use sudo to install stuff into your virtualenv:
workon NewEnv
easy_install MySQL-python
easy_install twisted
easy_install egenix-mx-base
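To confirm a package landed in the virtualenv rather than the global site-packages (MySQL-python installs the MySQLdb module; the NewEnv path assumes the WORKON_HOME above):

```shell
workon NewEnv
# The printed path should be under ~/.virtualenvs/NewEnv/lib/python2.6/site-packages
python -c "import MySQLdb; print MySQLdb.__file__"
```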
Install PostgreSQL and psycopg2
CentOS 5 has postgres 8.1. To install 8.3, set up the PostgreSQL yum repository, remove any old versions (but not -libs), and yum install postgresql-devel. Note that you may need to remove nfs-utils to avoid a libevent conflict (or maybe find an updated nfs-utils if you really need it).
sudo yum erase nfs-utils
sudo yum erase postgresql postgresql-devel
sudo rpm -ivh
sudo yum update
sudo yum install postgresql-devel
workon NewEnv
easy_install psycopg2
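A quick check that psycopg2 built against the new libraries and is importable inside the virtualenv:

```shell
workon NewEnv
# Import fails if the postgresql-devel headers were missing during the build
python -c "import psycopg2; print psycopg2.__version__"
```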
Following the instructions I was able to get virtualenvwrapper up and running. I had a few rough spots setting up my bashrc, so here is my specific information:
# User specific aliases and functions
export PATH=/opt/bin:$PATH
export WORKON_HOME=$HOME/.virtualenvs
source /opt/bin/virtualenvwrapper_bashrc
Looking at the last line, where virtualenvwrapper_bashrc is sourced, an error is going to occur each time this is run. Neither the guide nor the above instructions mention creating the .virtualenvs directory. So in your home directory:
mkdir .virtualenvs
Once workon is run and mkvirtualenv temp completes, subsequent bashrc starts will have the temp virtual environment available.
Otherwise great instructions and very helpful for creating a controlled python environment. This is a good way of keeping a team working together by using the same code levels.