On Wed, at 2:20 AM, Tres Seaver wrote:

>> Many of us using setuptools extensively tend to adopt an "isolated
>> environment" strategy (e.g., pip, virtualenv, zc.buildout). We don't
>> install the packages used by different applications into shared
>> directories at all. Instead, each environment uses a restricted
>> subset of packages known to work together.

> Is that a working solution when you want to enable easy installation
> on a large number of "customers"? In those discussions, I often see
> different solutions depending on the kind of projects people do. I
> don't know anything about Plone, but I can imagine the deployment
> issues are quite different from the projects I am involved in (numpy
> and co).

Plone is downloaded and installed on many, many systems, across all the
major platforms. In each case (since Plone 3.2), the installer is based
on (and includes) zc.buildout, and documents how to add new bits to the
installed Plone by modifying the buildout.cfg file.

> Every time I tried to understand what buildout was about, I was not
> even sure it could help for my own problems at all. It seems very
> specific to web development - I may completely miss the point?

I think so: it is largely a way to get repeatable / scripted
deployment. It uses setuptools to install Python package
distributions, but can also use other means (e.g., configure / make /
make install to install a C library such as libxml2). Scripts in the
'bin' directory are configured to have the specific Python packages
(and versions) they require on the PYTHONPATH. By convention, released
package distributions are installed into the 'eggs' subdirectory,
which is *not* on the PYTHONPATH, nor is it a 'site' directory. Other
bits are typically in their own subdirectories, often under 'parts'.

> virtualenv, pip, yolk -- those are useful tools for
> development/testing, but I don't see how they can help me to make the
> installation of a numpy environment easier on many different kinds of
> platforms.

When not doing Plone / Zope-specific work (where zc.buildout is a de
facto standard), I use 'virtualenv' to create isolated environments
into which I install the libraries for a given application. If your
application ships as Python package distributions, then documenting
the use of 'virtualenv' as a "supported" way to install it might
reduce your support burden. You can even ship a virtualenv-derived
script which pre-installs your own packages into such an environment,
isolated from the other packages installed on the machine.

> Well, I may not have been clear: I meant that in my experience,
> deploying something with several dependencies was easier with
> bundling than with a mechanism a la setuptools with *system-wide*
> installation of multiple versions of the same library. It is wildly
> successful, even on platforms such as Windows, when you abandon the
> notion that separate applications should be sharing the libraries
> they need. But for deployment on end-user machines, the whole thing
> is a failure IMO. What kind of nightmare would it be if programs
> developed in C required a C library which is 6 months old? That's
> exactly what multiple-versions installations inflict on us. If it is
> too much of a problem because the application depends on billions of
> libraries which are 6 months old, the problem is to allow such a
> dependency in the first place. If the problem is to get a recent
> enough version of the library, then the library would better be
> installed "locally", for the application.

So I think we agree here:

> Depending on something stable (python stdlib + a few well-known
> things) system-wide is OK; for anything else, not sharing is easier
> and more robust in the current state of things, at least when one
> needs to stay cross-platform.

You can think of zc.buildout or the virtualenv-based script as a form
of bundling, which bootstraps from another already-installed Python,
but …
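The buildout described in the thread (setuptools for Python distributions, configure / make / make install for C libraries, generated scripts in bin/) corresponds roughly to a buildout.cfg like the following sketch; the part names, egg list, and URL placeholder are hypothetical:

```ini
[buildout]
; hypothetical sketch; parts are built in the order listed
parts =
    libxml2
    app

[libxml2]
; zc.recipe.cmmi drives configure / make / make install
recipe = zc.recipe.cmmi
url = <libxml2 source tarball URL>

[app]
; zc.recipe.egg installs eggs into eggs/ and generates scripts in
; bin/ with the exact egg paths pinned on sys.path
recipe = zc.recipe.egg
eggs =
    myapp
    lxml
```

Running bin/buildout against such a file is what makes the deployment repeatable: the same configuration produces the same eggs/, parts/, and bin/ contents on each machine.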
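The "isolated environment" strategy discussed above can be sketched with the stdlib venv module (a modern stand-in for the virtualenv tool of the thread); the environment name "myapp-env" is hypothetical:

```python
# Minimal sketch of an isolated environment, assuming a POSIX layout
# (bin/ rather than Scripts/) and using venv in place of virtualenv.
import os
import subprocess
import sys
import tempfile

env_dir = os.path.join(tempfile.mkdtemp(), "myapp-env")
subprocess.check_call(
    [sys.executable, "-m", "venv", "--without-pip", env_dir])

# The environment's interpreter reports its own prefix, not the
# system one: packages installed into it are invisible to other
# applications on the machine, and vice versa.
env_python = os.path.join(env_dir, "bin", "python")
prefix = subprocess.check_output(
    [env_python, "-c", "import sys; print(sys.prefix)"],
    text=True).strip()
print(prefix == env_dir)
```

Anything installed with the environment's own pip then lands inside `myapp-env` only, which is what keeps one application's dependencies from colliding with another's.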
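The claim that scripts in bin/ "have the specific Python packages (and versions) they require on the PYTHONPATH" can be illustrated by what a buildout-generated console script essentially does; the egg paths below are hypothetical:

```python
# Sketch of the technique used by generated bin/ scripts: pin exact
# egg paths at the front of sys.path before importing the
# application, instead of relying on whatever happens to be
# installed system-wide.
import sys

pinned_eggs = [
    "/opt/app/eggs/myapp-1.0-py2.6.egg",  # hypothetical egg paths
    "/opt/app/eggs/lxml-2.2-py2.6.egg",
]
sys.path[0:0] = pinned_eggs  # prepend so the pinned versions win

print(sys.path[:2] == pinned_eggs)
```

Because the paths are baked into the script at build time, two applications on the same machine can each run against their own, different versions of the same library.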