Date:    2017-02-17
From:    Rick Moen
Subject: Re: [Hangout-NYLXS] aptget: it was not wasted !
Quoting Mancini, Sabin (DFS) (Sabin.Mancini-at-dfs.ny.gov):
> RM: Even though I don't anticipate encountering this issue, I read
> it carefully and tried to absorb and comprehend it. I will also put
> it in my Linux reference binder. So it was not wasted !
Having a reference binder's a good idea. I may start suggesting to
newcomers serious about learning Linux that they start keeping a Mancini
file (like the Foley Files described by Heinlein in _Double Star_,
except about technology rather than contacts).
Third-party .deb or .rpm repos (and Ubuntu's cute little walled garden
of PPAs at ppa.launchpad.net) are oft-mentioned candidate solutions to a
common and more _general_ problem: 'My Linux distribution doesn't seem
to have a package for $FOO.'
Sometimes, $FOO _is_ distro-packaged, but the user wants (or _thinks_ he/she
wants) a much more recent version that is being dangled to the public by
the upstream developer. Other times, $FOO just isn't available packaged
for that distro, as with Firefox being unpackaged for Trisquel.
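(For what it's worth, the first thing to do when you hit that problem
is to check what your distribution actually offers, before assuming it
offers nothing. A quick sketch on a Debian-family system, with
'somepackage' standing in for whatever $FOO really is:

  $ apt-cache search somepackage   # is anything by that name packaged?
  $ apt-cache policy somepackage   # what version does my release carry,
                                   # from which repo, and what's installed?

Rough RPM-family equivalents are 'dnf search' and 'dnf info', or the
'yum' counterparts on older systems.)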
Newcomers grappling with this problem are prone to trying very strange
things while floundering around looking for a solution. Often, they end
up breaking their Linux installations in various gruesome ways.
And they (often) do that _because_ they don't understand the role of
package maintainers and of distro software policies.
http://www.advogato.org/article/169.html

And that is why I see far too many people, including Linux veterans who
ought to know better, saying 'Oh, I'll just compile an upstream source
tarball in /usr/local .'
It even came up repeatedly in otherwise good articles for _Linux
Gazette_ when I was one of the magazine's editors. Quoting one of my
editorial footnotes:
http://linuxmafia.com/~rick/weatherwax.html#1
[1] Rick Moen comments: While it's useful and worthwhile to know about
a program's "upstream" development site, where (among other things) the
author's latest source code can be downloaded, there are a few
disadvantages that should be noted (and some alternative locations that
should usually be preferred, instead, if such are findable):
1. Absent extraordinary measures on your part, your Linux distribution's
package-tracking system won't know about the program's presence on your
system. Therefore, it won't know to avoid installing conflicting
programs, removing libraries it depends on, etc.
2. You won't get any tweaks and enhancements that may be normal (or
necessary!) for applications on your Linux distribution — unless you
yourself implement them. You won't get security patches, either, except
those written by the upstream author.
3. Along those same lines, the desirable version to compile and run may
well not be the author's latest release: Sometimes, authors are trying
out new concepts, and improvements & old bugs fixed are outweighed by
misfeatures & new bugs introduced.
4. As a person downloading the upstream author's source code directly,
you have to personally assume the burden of verifying that the tarball
really is the author's work, and not that of (e.g.) a network intruder
who cracked the download FTP site and substituted a trojaned version.
Although this concern applies mostly to software designed to run with
elevated privilege, it's not a strictly academic risk: Linux-relevant
codebases that have been (briefly) trojaned in this fashion, in recent
years, on the upstream author's download sites, include Wietse Venema's
TCP Wrappers (tcpd/libwrap), the util-linux package, sendmail, OpenSSH,
and the Linux kernel (CVS gateway's archive, only). Unless you are
prepared to meaningfully verify the author's cryptographic signature —
if any — on that tarball, you risk sabotaging your system's security.
(None of those upstream trojanings made it into Linux distributions,
thanks to distribution packagers' vigilance. Make sure you can be as
good a gatekeeper, or rely on those who already do the job well; a
sketch of the verification step follows this list.)
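(To make point 4 concrete: when an author publishes a detached GnuPG
signature alongside the tarball, verification looks roughly like this.
The key ID, filenames, and version below are placeholders, and the
genuinely hard part is confirming out-of-band that the key really does
belong to the author:

  $ gpg --recv-keys 0xDEADBEEF      # author's key ID, learned via a
                                    # trusted channel, not the same site
  $ gpg --verify foo-1.2.tar.gz.asc foo-1.2.tar.gz

A matching checksum file hosted on the same compromised server proves
nothing; the signature, and the key's provenance, are the whole game.)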
All of the above are problems normally addressed (and the burden of
solving them, shouldered) by Linux distributions' package maintainers,
so that you won't have to. It's to your advantage to take advantage of
that effort, if feasible. The memory of a thousand Linux sysadmins,
circa 1993, each doing all of that work 999 times redundantly is still
fresh to us old-timers: We call those the Bad Old Days, given that
today one expert package maintainer can instead do that task for a
thousand sysadmins. And yes, sometimes there's nothing like such a
package available, and you have no reasonable alternative but to grab
upstream source tarballs — but the disadvantages justify some pains to
search for suitable packages, instead.
Depending on your distribution, you may find that there are update
packages available directly from the distribution's package updating
utilities, or from ancillary, semi-official package archives (e.g., the
Fedora Extras and "dag" repositories for Fedora/RH and similar
distributions), or, failing that, third-party packages maintained by
reputable outside parties, e.g., some of the Debian-and-compatible
repositories registered at the apt-get.org and backports.org sites.
Although those are certainly not unfailingly better than tarballs, I
would say they're generally so.
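(As a sketch of the Debian flavor of that, pulling a newer package from
the official backports archive rather than grabbing an upstream
tarball; substitute your actual release codename and package name for
'jessie' and 'somepackage':

  (line added to /etc/apt/sources.list, or a file under sources.list.d/)
  deb http://ftp.debian.org/debian jessie-backports main

  $ sudo apt-get update
  $ sudo apt-get -t jessie-backports install somepackage

The backported package stays under the package manager's control and
keeps getting fixes from the backport maintainer, which is most of the
point.)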
The smaller, less popular, and less dependency-ridden a package is, the
more you might be tempted to use an upstream source tarball. For
example, I use locally compiled versions of the Leafnode pre-2.0 betas
to run my server's local NNTP newsgroups, because release-version
packages simply lack that functionality altogether. On the other hand,
that package's one notable dependency, the PCRE (Perl-Compatible
Regular Expressions) library, I satisfy from my distribution's official
packages, for all the reasons stated above.
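(For the curious, the shape of that arrangement. The tarball name is
illustrative, and 'libpcre3-dev' is the Debian name for PCRE's
development files; other distros name it differently:

  $ sudo apt-get install libpcre3-dev     # the dependency, from distro packages
  $ tar xjf leafnode-2.0.0.alpha.tar.bz2  # upstream beta tarball
  $ cd leafnode-2.0.0.alpha
  $ ./configure --prefix=/usr/local
  $ make && sudo make install

Using something like 'checkinstall' in place of bare 'make install'
wraps the result in a rough local package, so dpkg at least knows the
files exist.)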
Among the new Linux user's first and most-important jobs is to learn
the chosen Linux distribution's design, toolset, and maintenance regimen.
Is it a release-oriented distribution, or a rolling distribution? How
are official packages added and removed? In so doing, how is security
maintained and software integrity vetted? What, if any, are the major
outside repositories of distro-unofficial software for this
distribution (e.g., for A/V, crypto, etc.), how are they best used, what
are the cautions and gotchas, and how is code-signing handled? How do
you make sure that outside packaged software is compatible with the
development branch or release of the Linux distribution you're running,
so that you don't break things?
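(On a Debian-family system, a few commands answer several of those
questions at a glance; RPM-family systems have close analogues:

  $ lsb_release -sc     # which release codename am I actually running?
  $ apt-cache policy    # every enabled repository and its pin priority
  $ apt-key list        # which archive signing keys apt currently trusts

Reading 'apt-cache policy' output before and after adding any outside
repository is a cheap way to see exactly what you've just agreed to
trust, and at what priority.)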
Where are the online communities of clueful users from whom you can
learn the distro's methods and ways of coping / philosophical approach?
Web search done well and _skeptically_ is good; Web search done naively
leads instantly to confirmation bias. See Danah Boyd's perceptive essay
on the main learning malady of our times:
http://www.zephoria.org/thoughts/archives/2017/01/09/did-media-literacy-backfire.html
    As I detailed in my book It’s Complicated: The Social Lives of
    Networked Teens, too many students I met were being told that
    Wikipedia was untrustworthy and were, instead, being encouraged
    to do research. As a result, the message that many had taken
    home was to turn to Google and use whatever came up first. They
    heard that Google was trustworthy and Wikipedia was not.
    Understanding what sources to trust is a basic tenet of media
    literacy education. [...]
Our society is plagued with this problem: a generation of people has
been taught to 'research' every knowledge issue (in every area of life)
without being _taught how_, with the result that they blithely disregard
expert resources because they've found some blog or Web-forum post and
credulously believed it. I find Ubuntu kiddies all the time who've gone
through gyrations to install proprietary hardware drivers rather than
using better open source drivers, resulting in their systems becoming
brittle and difficult to maintain over the long term. Why? Because
they Google-searched on their hardware items + 'Ubuntu', found some
really bad advice on ubuntuforums.org, and followed it.
So, it's really important for new Linux users to, first, understand how
their distributions' package-maintenance systems work, including how to
find out what's available from official sources. They should make sure
they know what the _focus_ of their distro is, e.g., not install
Debian-stable and then be all surprised about it lacking bleeding edge
software versions. They need to know what a package repo is,
more-or-less how it works, and what third-party ones exist and what
they're good for. Last, they should get around to understanding how to
build (or tweak and rebuild) a local package -- as an almost-always
superior alternative to just compiling upstream tarballs in /usr/local .[1]
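(A minimal sketch of that last skill on Debian and friends, assuming
deb-src lines are enabled in sources.list; 'somepackage' and its
version are placeholders:

  $ apt-get source somepackage          # fetch packaging plus upstream source
  $ sudo apt-get build-dep somepackage  # install what's needed to build it
  $ cd somepackage-1.2.3
    (tweak debian/, patch the source, or import a newer upstream release)
  $ dpkg-buildpackage -us -uc           # build an unsigned local .deb
  $ sudo dpkg -i ../somepackage_1.2.3-1_amd64.deb

The result is something dpkg tracks, resolves conflicts against, and
can cleanly remove, which a pile of stray files in /usr/local is not.)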
It pains me when I see twenty-year Linux veterans resorting immediately
to the latter when they're running a Linux distribution with good
alternatives, because it suggests to me they've never really bothered to
learn the fundamentals and are still acting like it's 1993.
[1] I also sometimes hear people say 'I don't need to make local distro
packages because I manage everything using Chef (or Puppet).'
Configuration management is highly recommended, but IMO complements
rather than replaces software packaging.
--
Cheers, Homo in Domu Alba, qui est iratus et habet in
Rick Moen artificialibus capillum: Quod homo non sit
rick-at-linuxmafia.com honesta, et est perniciosa in rei publicae.
McQ! (4x80)
_______________________________________________
hangout mailing list
hangout-at-nylxs.com
http://www.nylxs.com/