MESSAGE
DATE: 2001-10-26
FROM: David Sugar
SUBJECT: Re: [hangout] Statically linked Redhat
Yes, it checks and records dependencies when it is BUILT. This is good if you deploy 50 servers that are identical, and just want to create a package for those servers that are then correctly installed. For corporate uses such rollouts are common and I think it is a good solution.
The problem, of course, comes from packages built by vendors; they are specific and register the dependencies that vendor has in their libraries. This makes it harder to take a binary package built for, say, SuSE and then install it on RedHat, or even on a system that has not been "updated" when the dependency has since received an update.
Of course, having source packages means one could change the system and rebuild an installation binary with the right compile-time configure flags and changed dependencies, and so I think RPM is a fairly reasonable system in those circumstances, if not as convenient or powerful as apt-get in this regard. What I find unreasonable is when software exists in binary form alone. Then the notion, as you note, starts to break down.
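To make the build-time vs. install-time distinction concrete, here is a minimal sketch (plain Python, not RPM's actual code; the function names, library names, and version numbers are invented for illustration). A package records the library versions present on the host where it is BUILT; at install time those recorded requirements are checked against the target's package database, which is how a vendor-built binary package can fail on a differently-updated system:

```python
# Illustrative model only -- not how RPM is implemented internally.

def build_package(name, build_host_libs):
    """At BUILD time, record the build host's library versions
    as the package's requirements."""
    return {"name": name, "requires": dict(build_host_libs)}

def install(package, target_db):
    """At install time, check the recorded requirements against the
    target system's package database (not the live filesystem)."""
    missing = [f"{lib} >= {ver}"
               for lib, ver in package["requires"].items()
               if target_db.get(lib, "0") < ver]
    return (False, missing) if missing else (True, [])

# A package built on one vendor's host records that host's versions...
pkg = build_package("foo", {"glibc": "2.2.4", "libstdc++": "2.96"})

# ...and refuses to install on a target whose database has older ones,
# even though the binary might actually run fine there.
ok, missing = install(pkg, {"glibc": "2.2.2", "libstdc++": "2.96"})
print(ok, missing)  # False ['glibc >= 2.2.4']
```

Rebuilding from a source package on the target host, as described above, amounts to re-running `build_package` with the target's own libraries, which is why the recorded dependencies then come out right.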
On Fri, 26 Oct 2001, Ruben Safir wrote:
> If I'm understanding this correctly, the rpm, when it is BUILT, is looking
> for dependencies and creating the information in the package UPON BUILDING.
>
> That is the problem. The user gets an RPM package. Are you saying that the
> RPM searches the ACTUAL system for the shared libraries and dependencies
> when it is being installed?
>
> I don't think so. I think it's checking that RPM database....and even
> there, if things are in the wrong place, or a place it doesn't anticipate,
> you're screwed.
>
> Ruben
>
> > <> dynamic link library references. This is done by a special script in
> > /usr/lib/RPM, and is one of the reasons that the final step in RPM package
> > generation takes so long.
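The build-time scan the quote refers to can be sketched as follows (a conceptual Python sketch, not the real script in /usr/lib/RPM; the file paths and library names are made up). The idea is that the packaging step walks the package's binaries, collects each one's shared-library references, and records the de-duplicated set as the package's requirements:

```python
# Hedged sketch: in real RPM packaging, the per-file library references
# come from inspecting the actual ELF binaries (e.g. their dynamic
# sections); here that mapping is simply supplied as input.

def find_requires(files_to_needed):
    """Collect the unique shared-library references across all files
    in the package, as recorded at build time."""
    requires = set()
    for path, needed in files_to_needed.items():
        requires.update(needed)
    return sorted(requires)

# Hypothetical scan results for a two-file package:
scanned = {
    "/usr/bin/foo":       ["libc.so.6", "libssl.so.0"],
    "/usr/lib/libfoo.so": ["libc.so.6"],
}
print(find_requires(scanned))  # ['libc.so.6', 'libssl.so.0']
```

Scanning every file in a large package this way is also why, as the quote notes, the final step of package generation takes so long.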
____________________________
New Yorker Linux Users Scene
Fair Use - because it's either fair use or useless....