Question: How can I tar up about 12GB of files on a Linux box and then split the archive into 700MB chunks?
Answer: You can use either of the following two methods on any box with GNU tools, or under the Cygwin environment on Windows.
Create an uncompressed Tar Archive of the files you want
tar -cvf output.tar /path/to/file
Split it into Chunks
split -d -b 734003200 output.tar your_prefix
Where:
“-d” appends a decimal suffix (00, 01, 02, etc.) to each chunk
“-b” sets the chunk size in bytes (multiply megabytes by 1048576 to convert):
650 megabytes = 681574400 bytes
700 megabytes = 734003200 bytes
“your_prefix” is whatever you want the chunk filenames to begin with
This creates a series of files: your_prefix00, your_prefix01, etc.
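The create-and-split steps above can be sketched end to end on a small throwaway directory (the paths and the 4 KB chunk size here are illustrative; on real data you would use 734003200):

```shell
set -e
workdir=$(mktemp -d)            # scratch area so nothing real is touched
mkdir -p "$workdir/data"
head -c 5000 /dev/zero > "$workdir/data/file1"     # ~5 KB of sample data
tar -cf "$workdir/output.tar" -C "$workdir" data   # uncompressed archive
# -d: decimal suffixes 00, 01, ...; -b: bytes per chunk
split -d -b 4096 "$workdir/output.tar" "$workdir/your_prefix"
ls "$workdir"/your_prefix*
```

Because tar pads its output, even this tiny archive yields several numbered chunks, and concatenating them in suffix order reproduces the original archive exactly.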
Gzip the Chunks (for greater compression, use bzip2 instead)
gzip your_prefix*
Copy the gzipped chunks to CD
Put all the chunks in one directory
Unzip all the gzipped chunks
gunzip your_prefix*
Join the chunks
cat your_prefix* > mynewoutput.tar
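Before burning or deleting anything, it is worth confirming the rejoined archive is byte-identical to the original. A minimal round trip of the split / gzip / gunzip / join steps (file names and the 1 KB chunk size are illustrative):

```shell
set -e
tmp=$(mktemp -d)
head -c 3000 /dev/urandom > "$tmp/output.tar"      # stand-in for the real archive
split -d -b 1024 "$tmp/output.tar" "$tmp/your_prefix"
gzip "$tmp"/your_prefix*                           # compress each chunk
gunzip "$tmp"/your_prefix*                         # ...and restore them
cat "$tmp"/your_prefix* > "$tmp/mynewoutput.tar"   # join in suffix order
cmp "$tmp/output.tar" "$tmp/mynewoutput.tar" && echo "archives match"
```

The shell expands your_prefix* in lexical order, which matches split's decimal suffix order, so no explicit sorting is needed when joining.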
Untar to a new location
tar -xvf mynewoutput.tar -C /path/to/new/location
Alternatively, zip up the files into a single zip archive
zip -r zipfile.zip /path/to/files
Use zipsplit to create chunks
zipsplit -n 12000000 zipfile.zip
Where “-n” is the maximum size of each chunk in bytes
Copy to CD
Copy the zip chunks to the same directory
Change directory to the root of the new location
Unzip all the chunks individually
for i in /path/to/zipfil[0-9][0-9].zip ; do unzip "$i" ; done
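If Info-ZIP's zip, zipsplit, and unzip are all installed, the second method can be exercised end to end on sample data; this sketch skips itself when the tools are missing, and the file sizes and the 90 KB chunk size are illustrative only:

```shell
set -e
tmp=$(mktemp -d)
status=skipped
if command -v zip >/dev/null && command -v zipsplit >/dev/null && command -v unzip >/dev/null; then
  mkdir -p "$tmp/files" "$tmp/restore"
  for n in 1 2 3 4; do head -c 40000 /dev/urandom > "$tmp/files/part$n"; done
  ( cd "$tmp" && zip -qr zipfile.zip files )       # one big archive
  ( cd "$tmp" && zipsplit -n 90000 zipfile.zip )   # chunks of at most ~90 KB
  # unzip each chunk individually into the restore directory
  ( cd "$tmp/restore" && for i in "$tmp"/*[0-9].zip ; do unzip -q "$i" ; done )
  cmp "$tmp/files/part1" "$tmp/restore/files/part1"
  status=ok
fi
echo "$status"
```

Note that zipsplit cannot split a single compressed entry across chunks, so -n must be larger than the biggest entry in the archive.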