[vox] Stuff you really need to run a GNU/Linux network
Michael Cheselka
cheselka at gmail.com
Tue Sep 14 17:08:43 PDT 2010
Hello,
When I last checked, Google was using Red Hat on servers and Goobuntu
on desktops.
They had at one time experimented with custom kernels and software
systems, but found that being different from everyone else was so much
trouble that the efficiency gains weren't worth it: they kept hitting
bugs that nobody else was seeing.
Goobuntu is Google's desktop. It's a slightly modified Ubuntu that
hooks readily into Google's infrastructure but otherwise isn't much
different from stock Ubuntu.
It cannot be emphasized enough that mirroring is for high availability,
not for backups. If there is corruption, a backup saves you, but a
mirror merely copies the corruption over every good copy.
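To make the difference concrete, here's a rough Python sketch (the
paths are made up, and real tooling such as rsync with dated snapshots
does this far better): a mirror overwrites the destination with
whatever the source contains, while a dated backup keeps each run's
copy, so yesterday's good data survives today's corruption.

    import shutil
    from datetime import date
    from pathlib import Path

    SOURCE = Path("/srv/data")         # hypothetical source directory
    MIRROR = Path("/mnt/mirror/data")  # hypothetical mirror target
    BACKUPS = Path("/mnt/backups")     # hypothetical backup target

    def mirror():
        # A mirror blindly replaces the destination with the source.
        # If the source is corrupted, the last good copy is gone.
        if MIRROR.exists():
            shutil.rmtree(MIRROR)
        shutil.copytree(SOURCE, MIRROR)

    def backup():
        # A backup keeps a separate dated copy for each run, so an
        # older, uncorrupted copy is still there to restore from.
        target = BACKUPS / date.today().isoformat()
        shutil.copytree(SOURCE, target)
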
Regards,
Michael Cheselka
650-488-4820
On Tue, Sep 14, 2010 at 13:59, Don Werve <don.werve at gmail.com> wrote:
> On Sep 14, 2010, at 1:27 PM, Michael Wenk wrote:
>
>> Because this is what DNS was designed for. I've yet to work in a place that was completely homogeneous; usually you have at least a few non-UNIX/Linux machines, and DNS is a standard. I ran DNS locally on my home network for this very reason. Just about anything that can do TCP/IP networking will talk it. And I'm not talking solely about PCs/workstations here. You have other non-PC equipment in your network.
>
> I heartily concur. Windows boxes, networked printers, routers, and IP-addressable toilet seats will all speak DNS.
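
True enough -- even from a script, name resolution is a couple of lines
against whatever resolver the box already has. The hostname below is
just a placeholder:

    import socket

    # Ask the system resolver, the same path any TCP/IP device uses.
    # "printer1.mycorp.com" is only a placeholder name.
    for family, _, _, _, sockaddr in socket.getaddrinfo("printer1.mycorp.com", None):
        print(family, sockaddr)
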
>
>> I disagree. I can, with just a simple hostname command, determine exactly what the node does. It's names like "fiddle" or "d0r3k9s2" that make no sense. And having to query LDAP is IMO annoying as hell.
>
> Agreed. Put CNAMEs or TXT records in for task-specific names (ldap.mycorp.com, mail.mycorp.com, etc.), but give the machine a name related to its primary task (app1, app2, app3, db1, db2, etc.)
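
And if you ever want to check which task-named box a service name
currently points at, a few lines will do it. This assumes the
third-party dnspython package (2.x spells the call resolve(); older
releases use query()), and all the names are examples:

    import dns.resolver  # third-party: dnspython

    # Follow each service CNAME to the task-named host behind it,
    # e.g. ldap.mycorp.com -> db1.mycorp.com. All names are examples.
    for service in ("ldap.mycorp.com", "mail.mycorp.com"):
        for record in dns.resolver.resolve(service, "CNAME"):
            print(service, "->", record.target)
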
>
>> I'm kinda confused, I thought a good portion of web servers out there run on LAMP. If you want a specific example, I believe Google uses a Linux variant as their main OS. I would hardly call them a single-server hacker, though their "manage by install" setup is one on steroids, or at least that's what I have been given to believe by reading.
>
> Very true, and this is increasingly common in smaller shops, such as any shop I run. Installation and configuration are completely automated, and the only reason to SSH into a machine is to troubleshoot problems, or to test new configs in the staging area before bringing them into Git.
>
>> That's great in some setups, until you lose the hot-failover server. Sure, you get redundancy, but there's nothing like the safety of a non-volatile backup. Of course this is highly dependent on what data we're talking about. In many cases, rsync'ing to another server, or hell, just tar/encrypt/uuencoding and gmailing the data is also fine. But sometimes you want the safety of tape/optical/etc.
>
> Not only that, but you lose incremental changes. What if you need a file that was deleted a week ago, and has been similarly nuked from the hot spare?
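
This is where dated copies like the ones in the sketch above pay off:
walk the snapshots newest-first and pull the file back from the most
recent one that still has it. The paths are made up again:

    import shutil
    from pathlib import Path

    BACKUPS = Path("/mnt/backups")  # hypothetical backup target from above

    def restore(relative_path, destination):
        # Walk the dated snapshots newest-first and copy the file back
        # from the most recent snapshot that still contains it.
        for snapshot in sorted(BACKUPS.iterdir(), reverse=True):
            candidate = snapshot / relative_path
            if candidate.exists():
                shutil.copy2(candidate, destination)
                return snapshot.name
        raise FileNotFoundError(relative_path)

    # e.g. restore("etc/apache2/apache2.conf", "/tmp/recovered.conf")
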
>
>> My advice is that there is no single backup strategy that works for everything. Know the data you're backing up and how important it is, and tailor your backup strategy based on that. And then test. If you don't test, you don't know that it works.
>
> Let me know when you're looking for a sysadmin job, Michael. :) This is such an important attitude, and one that's missing in a lot of the candidates I've interviewed over the years.
>
> A personal favorite, once backups are in place, is to nuke a production webserver from orbit -- just 'rm -fr /' the machine. Then I time how long it takes to bring it back online.
>
> If you've got a good backup and rollout strategy, this is easy.
>
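
The timing part of that drill is easy to script as well; the restore
command below is only a stand-in for whatever your rollout tooling
actually runs:

    import subprocess
    import time

    # Stand-in for the real rebuild: a config-management run plus a data
    # restore, whatever your setup uses. The path is hypothetical.
    RESTORE_COMMAND = ["/usr/local/sbin/rebuild-webserver"]

    start = time.monotonic()
    subprocess.run(RESTORE_COMMAND, check=True)
    print("Back online in %.1f minutes" % ((time.monotonic() - start) / 60))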