[vox-tech] Utility to image a hard drive
Bill Broadley
bill at cse.ucdavis.edu
Wed Apr 8 23:35:40 PDT 2009
Thomas Johnston wrote:
> I installed Debian (lenny) on my laptop last week and it is my first
> exposure to Linux. I was wondering if anyone had a recommendation for
> a utility to "image" my laptop's hard drive to an external drive. I
> would like to perform some kind of complete system backup now that I
> have the absolute essentials working. There are lots of tweaks that I
> would like to try (nvidia graphics driver, sleep/resume functionality,
> etc, etc.), but I am certain that I will wind up royally screwing
> things up. If possible, I would like to avoid spending another week
> just getting back to where I am now.
Heh, this ends up being quite a large, open-ended question. First of all I'd
recommend taking notes, so that the week (which was mostly learning) turns into
an hour or two if you ever rebuild from them. Somewhere else reliable I'd keep
track of what packages you installed, and maybe a copy of the config files you
edited. Maybe you or a friend has a subversion/cvs repository, or even just a
remote disk, that would survive a local failure. You could keep those notes on
your system and then just rsync that directory remotely as needed.
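For example, something like this would capture the package list and push the
notes somewhere safe (the hostname and paths are just placeholders):

# record which packages are installed
dpkg --get-selections > ~/notes/packages.txt
# mirror the notes directory to a machine that would survive a local failure
rsync -av ~/notes/ user@remote.example.com:backups/laptop-notes/

A rebuild then starts with dpkg --set-selections < packages.txt followed by
apt-get dselect-upgrade, instead of a week of re-discovery.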
In general I consider Norton Ghost and related image-based software rather
primitive. Any mentions of unicasting or multicasting are mostly marketing from
folks who haven't figured out that BitTorrent works.
BTW, I recommend installing everything from the CD/DVD on a single partition.
A separate /boot isn't justified unless you have very old hardware. /home is
nice to have on a different partition, so that if you decide to reinstall,
switch distributions, or have some kind of problem, you can preserve /home.
Then put everything you install outside of apt-get/aptitude/synaptic in /opt
(like matlab), again so that it survives a reinstall. Future backups can then
handle each partition as needed; /opt and / typically change relatively rarely
compared to /home. I don't usually back up things like matlab, since by the
time the disk dies I usually want a newer version anyway. The license file, on
the other hand, is quite valuable, and I try to keep at least 2 copies around,
at least one of them remote.
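As an illustration only, an /etc/fstab for that layout might look something
like this (device names and filesystems are assumptions, not a prescription):

/dev/sda1  /      ext3  defaults  0  1
/dev/sda2  none   swap  sw        0  0
/dev/sda3  /home  ext3  defaults  0  2
/dev/sda4  /opt   ext3  defaults  0  2

A reinstall then only has to reformat /, leaving /home and /opt alone.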
I find this quite useful because eventually you are likely to want to install
a second box, reinstall from scratch after a hardware failure or once the next
version of Debian comes out (not that upgrades don't work), or maybe switch to
Ubuntu or Fedora.
If you really want an image you can copy a whole disk to another disk (of the
same size or larger) with:
dd if=/dev/sda of=/dev/sdb
That will copy the partition table, the contents, swap, everything. It's
unlikely that you want that, since the drives are not likely to be the same
size, and if the second is bigger you just end up wasting the extra space. So
instead you could image a single partition to a file:
dd if=/dev/sda1 of=/mnt/my_external_disk/lenny_image_apr-08-2009.img
Assuming of course that /dev/sda1 is your / and not your swap or boot
partition. To restore, boot a recovery CD (many install CDs/DVDs have a
recovery option); if you don't have one, try Knoppix, which is relatively
common. Then just mount the external drive and:
dd if=/mnt/my_external_disk/lenny_image_apr-08-2009.img of=/dev/sda1
This assumes of course that the partition is the same size, so the output of
fdisk -l might not be a bad thing to add to your notes or subversion
repository. You will also have to reinstall grub (or whatever bootloader you
use) if you only restore /dev/sda1 onto a new disk.
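Something like the following covers both, assuming /dev/sda is the disk and
you're in a rescue shell for the second half (device names are illustrative):

# on the running system: keep the partition layout with your notes
fdisk -l /dev/sda > ~/notes/partition-table.txt
# from the rescue CD, after restoring the image onto the new disk:
mount /dev/sda1 /mnt
grub-install --root-directory=/mnt /dev/sda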
To be more space efficient you could compress the image:
dd if=/dev/sda1 | bzip2 > /mnt/my_external_disk/lenny_apr-08-2009.img.bz2
This will tend to be slower; how much slower depends on how fast your CPU and
external disk are.
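Restoring the compressed image is just the reverse pipeline:

bzcat /mnt/my_external_disk/lenny_apr-08-2009.img.bz2 | dd of=/dev/sda1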
But it's still not going to be very flexible. There are various backup
programs around, but many aren't worth it if you have a single machine. I'd
take a look at tar or rsync if you want to make a backup that is file based
instead of block based. That allows you to restore to a different-sized
partition as well as do partial restores, which is particularly nice if you
just want to poke around /etc to enable similar functionality on a new
machine. Rsync has the added advantage that if you change 1% of a filesystem,
the next rsync will only copy that 1%.
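A minimal sketch, assuming your external disk is mounted where the earlier
examples put it:

# -a preserves permissions/ownership/times, -x stays on one filesystem,
# --delete drops files from the backup that are gone from the source
rsync -ax --delete / /mnt/my_external_disk/lenny_backup/

Run it again after your tweaks and only the changed files move.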
One of the other big advantages of a file-based instead of block-based backup
is that you can then write deltas. You do a level-0 backup of /dev/sda1 that
copies every file to the external drive; then, with tar, rsync, or dump, you
can ask for only the files that have changed. So while the first backup might
be 4GB, the delta after 2 weeks might be just 250MB.
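GNU tar can do this with a snapshot file; a sketch with assumed paths:

# level-0: the snapshot file doesn't exist yet, so everything gets archived
tar --listed-incremental=/mnt/my_external_disk/snapshot \
    -czf /mnt/my_external_disk/level0.tar.gz /etc /home
# later runs archive only files changed since the snapshot was last updated
tar --listed-incremental=/mnt/my_external_disk/snapshot \
    -czf /mnt/my_external_disk/level1.tar.gz /etc /home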
Once you start having more important files on your system I'd strongly suggest
backing them up offsite as well. After all, theft, flood, electrical surges,
leaks, folks waiting for an xhost +, etc. might blow away your machine and the
external disk together. If you don't have servers/desktops elsewhere (a friend
or family member might be willing to accept a disk from you and host it on
their network), you could use Amazon S3 or other related services. Amazon
charges $0.15 or so per GB per month. It takes a rather long time to upload a
GB over a typical connection, especially if it's compressed (a small change to
a compressed archive forces re-uploading the whole thing), so various tools
will let you back up only the differences.
I'd poke around Google with search terms like linux backup, rsync, rdiff,
rdiff-backup, duplicity, and see what looks like a good fit.
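As one example, duplicity can push encrypted incrementals straight to Amazon
S3 (this assumes its S3 backend and the python boto library; the bucket name
is a placeholder):

export AWS_ACCESS_KEY_ID=...      # your Amazon credentials
export AWS_SECRET_ACCESS_KEY=...
# the first run is a full backup, later runs upload only the differences
duplicity /home s3+http://my-backup-bucket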
Good luck, you look like you are well along the right path already. Many
folks don't realize they need backups until it's too late.