[vox-tech] burn directories to CDs

Alex Mandel tech_dev at wildintellect.com
Sat Nov 1 12:54:10 PDT 2008


Brian Lavender wrote:
> On Tue, Oct 28, 2008 at 07:22:49PM -0700, Alex Mandel wrote:
>> Jeff Newmiller wrote:
>>> harke wrote:
>>>> On Tuesday 28 October 2008 08:49, Tim Riley wrote:
>>>>> On Mon, 2008-10-27 at 18:56 -0700, Jeff Newmiller wrote:
>>>>>> harke wrote:
>>>>> <snip>
>>>>>
>>>>>>> You could use cpio with the pass-through option. This does
>>>>>>> not use or create an archive. You'll probably need some other
>>>>>>> options, like --make-directories.
>>>>>> I am mystified why (or how) one would use cpio to copy files to a cdrom.
>>>>>> Can you elaborate?
>>>>> $ find . -print | cpio -p /dev/cdrom ? ;-)
>>>>>
>>>> You'll first need a file system on the cd
>>>> so you could do
>>>>      mkfs -t ext2 /dev/cdrom
>>>>
>>>> Notice that it is perfectly feasible to put an ext2 file system
>>>> on a CD. Of course, certain other operating systems will not be
>>>> able to read it.
>>>>
>>>> If you prefer to stick to an iso file system, just use the usual tools.
>>> I suppose if you want to be obscure, dumping data to /dev/cdrom is
>>> one way... I prefer making my backups as self-documenting and simple
>>> as possible.
>>>
>>> I also recognize that it is feasible to put alternate filesystems on a CD-R,
>>> but the above mkfs command won't work, since any data written to a CD-R
>>> must be written in one pass with no modifications, while mkfs lays out
>>> data structures throughout the device file in random-access fashion,
>>> expecting that data and directory entries will be modified later.
>>>
>>> I think Brian's requirement to support multi-disk backups in standard
>>> directory layout is a tall order... though there might be a tool out there
>>> that supports this.  Seems like it would be hard to allocate disk usage
>>> among small and large files in arbitrary directories on multiple volumes.
>>> Read-only LVM? (very obscure... why bother with the directory structure?)
>>>
>> This reminded me of back in the day, when you could span zip files at
>> 1.44 MB in order to put them across multiple floppies.
>>
>> And hence a solution...
>> tar them with a max-size option; the archive will be split at the given
>> size, starting a new file for each piece.
>> http://www.base64.co.uk/splitting-large-files/
>>
>> Maybe not exactly browsable, but probably close enough from a graphical
>> archive tool, except for the one file split between each pair of spans.
> 
> No, no, no. This is a very simple problem. No splitting of files is
> allowed. These are all files that will fit within one CD.
> 
> https://answers.launchpad.net/ubuntu/+source/k3b/+question/11427
> 
> K3b doesn't seem to work, and multicd seems to still use commands
> from 2001. It refers to the CD burner as a SCSI device. But I can
> probably hack multicd. I just want a CD I can put in the CD drive and
> browse the files so that they are in the same directory structure as the
> original.
> 

I'm a little confused. I thought the goal was to burn more than would fit
on one CD while preserving the directory structure. I see how the multicd
concept is a better solution, but a spanned tar would also achieve similar
results, with the exception of one extra abstraction layer: the tar itself.
Using a graphical tool, though, the tar would appear as a browsable
filesystem (Nautilus/File Roller) without any need for extraction.
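
In case it's useful, the sort of command I had in mind looks like this (just
a rough sketch assuming GNU tar; the -L value and the vol*.tar names are
placeholders for roughly one CD's worth of data per piece):

  # multi-volume archive; -L counts units of 1024 bytes, so this
  # switches to the next output file at roughly 665 MB
  $ tar -cM -L 650000 -f vol1.tar -f vol2.tar -f vol3.tar /path/to/dir

Each vol*.tar then gets burned to its own disc, and the whole set comes back
out with tar -xM -f vol1.tar -f vol2.tar -f vol3.tar.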

Now, if we're talking about one CD, I'm lost, because k3b or any data
disc burner seems to do what you want, although maintaining the full
directory path requires that you either build the folder structure by hand
or add the whole thing and delete the parts you don't want.
Maybe I'm missing some key goal you have in mind, or we're visualizing
the data differently.
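
For what it's worth, the command-line route to a plain browsable data disc is
something like the following (a sketch assuming the genisoimage/wodim tools
shipped with current Ubuntu; the dev= name is a guess, check wodim --devices
for your burner):

  # build an ISO image preserving the directory tree (Rock Ridge + Joliet names)
  $ genisoimage -r -J -o backup.iso /path/to/files
  # burn the image
  $ wodim -v dev=/dev/cdrw backup.iso

That gives a disc any OS can mount, with the files laid out just as they were
under /path/to/files.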

Oh well, all that matters is that you found a solution,
Alex

