Koozali.org: home of the SME Server
Legacy Forums => Experienced User Forum => Topic started by: Roger Magoon on February 28, 2002, 05:15:24 PM
-
I have searched through the forums and not found any discussion of this, but if I have missed it please let me know.
I am running SMEServer-5.0_Update3-05 on an admittedly slow machine (133MHz Pentium) and I have a lot of data (about 7GB). However, I am basically unable to back up to disk. I left the backup running (saving to a Windows PC on the network) for over 24 hours and it still did not finish. Is this normal? Any suggestions?
Thanks. Roger.
-
When the desktop backup runs, it compresses the files into a gzipped tar archive. The equivalent would be trying to create a zip file on a Windows box from 7GB of source files. On a P133 that would take forever even if nothing else ran, and your server is still busy doing other things. Not that it helps much, but it sounds like it's time for a hardware upgrade.
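For reference, a rough sketch of what that kind of backup does under the hood. The paths below are made up for illustration, not the actual SME Server backup locations:

```shell
# Create a small sample tree to stand in for the server's data
mkdir -p /tmp/demo-src
echo "sample data" > /tmp/demo-src/file1.txt

# Archive and gzip-compress it in one step; the long form of the same
# operation would be:  tar cf - -C /tmp demo-src | gzip > /tmp/demo-backup.tgz
tar czf /tmp/demo-backup.tgz -C /tmp demo-src

# gzip -l shows compressed vs. uncompressed sizes for the archive
gzip -l /tmp/demo-backup.tgz
```

The gzip compression step is what pins the CPU; on a P133 it is the bottleneck, not the disk or the network.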
-
Roger
Be aware that you will probably run into the Windows/DOS file size limit of 2GB. The final backup file size depends on how much data you have on the server, but I'm guessing that 7GB of data will create a backup file larger than 2GB, so your backup to desktop will be unsuccessful.
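If you do want to keep backing up to a disk with a 2GB-per-file limit, one workaround is to split the archive into pieces under the limit and reassemble them at restore time. An untested sketch; all filenames and sizes here are illustrative only:

```shell
# Stand-in for a large backup archive
mkdir -p /tmp/split-demo
echo "pretend this is a huge archive" > /tmp/split-demo/smeserver.tgz

# Split into fixed-size chunks (1k here for demonstration; in practice
# something like 1900m would keep each piece under a 2GB limit).
# Pieces are named smeserver.tgz.partaa, .partab, ...
split -b 1k /tmp/split-demo/smeserver.tgz /tmp/split-demo/smeserver.tgz.part

# Restore side reassembles the stream in order:
#   cat smeserver.tgz.part* | tar xzf -
ls /tmp/split-demo/
```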
Time to install a tape drive or use some other method.
Regards
Ray Mitchell
-
Ray Mitchell wrote:
> Be aware that you will probably run into the problem of
> Windows/DOS file size limit of 2GB. It just depends how much
> data you have on the server as to what the final backup file
> size will be, but I'm guessing that 7GB of data will create a
> backup file larger than 2Gb, therefore your backup to desktop
> will be unsuccessful.
It's worse than that, Ray. The backup might succeed (depending on the desktop OS and file system), but the restore would then fail.
> Time to install a tape drive or use some other method.
No arguments from me there!
Charlie
-
Dear All
Yes, as Charlie says, the backup appears to complete OK with a file size of approx. 2GB, but when the option to verify the backup file is selected, the file cannot be read and Internet Explorer gives an error message.
Lesson learned, ouch!!
Ray Mitchell
-
Thanks for all the replies folks...I guess it is tape time.
Roger.
-
Just been following this thread.
I have a 1GHz machine with an internal IDE tape drive. Any idea of typical data rates whilst the machine is backing up?
I am using flexbackup, which seems to work fine, but it does seem to take quite a long time even for a small amount of data.
Any comments appreciated.
B. Rgds
John
-
I have a similar machine (900MHz Athlon, 256MB RAM). I see this:
DUMP: Date of this level 0 dump: Thu Feb 28 21:03:34 2002
DUMP: Date of last level 0 dump: the epoch
DUMP: Dumping /dev/hda6 (/) to standard output
DUMP: Label: none
DUMP: mapping (Pass I) [regular files]
DUMP: mapping (Pass II) [directories]
DUMP: estimated 375349 tape blocks.
DUMP: Volume 1 started at: Thu Feb 28 21:05:46 2002
DUMP: dumping (Pass III) [directories]
DUMP: dumping (Pass IV) [regular files]
DUMP: 87.58% done, finished in 0:00
DUMP: Volume 1 completed at: Thu Feb 28 21:11:38 2002
DUMP: Volume 1 took 0:05:52
DUMP: Volume 1 transfer rate: 1135 KB/s
DUMP: 399844 tape blocks (390.47MB)
DUMP: finished in 352 seconds, throughput 1135 KBytes/sec
DUMP: Date of this level 0 dump: Thu Feb 28 21:03:34 2002
DUMP: Date this dump completed: Thu Feb 28 21:11:38 2002
DUMP: Average transfer rate: 1135 KB/s
DUMP: DUMP IS DONE
Kilobytes Out 111140
-
Roger:
I don't know about your needs or your budget, but tapes are becoming a *BAD* idea in many instances, especially if you have a lot of data to back up.
If you only have a few GB of data, tape is OK, but be sure to buy a good drive (meaning one in the $200-$300-plus range). If your data grows larger over time, your tape drive will become obsolete (it won't hold a backup on one tape) and then you'll be replacing it, or avoiding backups.
As prices come down, I'm becoming a fan of using spare hard drives for data backups. You can buy 5 of them for about the price of a DAT tape drive, and they are faster on backup and much faster on restore -- especially selective restore.
There are some technical issues, but there's lots of good advice available on this forum to help.
Regards,
Tom
Roger Magoon wrote:
>
> Thanks for all the replies folks...I guess it is tape time.
> Roger.
-
Thanks Tom,
I have more than enough disk space to back up what I want, and I agree with your comment about tape. But I have two issues/problems:
1) I am learning on the fly about Linux/e-smith, and the comments in this thread led me to believe the 2GB limit was a hard problem (because of Windows). Is there a way around that using disks?
2) I use the e-smith server for a number of applications, one of which is general file storage for my network. Since a lot of that is just applications downloaded from the web, I thought I would transfer it to a local disk, get under the 2GB limit, and then do the backup. But at the moment I cannot find where space is being taken up under Linux: I have copied off the downloaded files but still have about 6GB of space in use somewhere. In actual fact I am really only concerned with backing up the system configuration, files, and web pages. If I could selectively back those up I would be happy.
Any suggestions would be appreciated.
Roger.
-
Tom Keiser wrote:
> I don't know about your needs or your budget, but tapes
> are becoming a *BAD* idea in many instances, especially if
> you have a lot of data to back up.
Snip
> As prices come down, I'm becoming a fan of using spare hard
> drives for data backups. You can buy 5 of them for about the
> price of a DAT tape drive, and they are faster on backup and
> much faster on restore -- especially selective restore.
You have to be very careful using ordinary disks for backup. In my experience they get broken when they are carried off site and back again.
Ed Form
-
Dear Roger & others
This HOWTO on using flexbackup to backup to disk may be of interest
http://www.e-smith.org/docs/howto/contrib/flexbackup-to-disk-howto.htm
Haven't tried it myself but it looks interesting.
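For the selective backup Roger asked about, a plain tar of just the trees you care about is another simple option. A sketch; the directories below are only examples created for the demonstration, not verified e-smith paths:

```shell
# Stand-in directories for "configuration" and "web pages"
mkdir -p /tmp/sel-demo/etc /tmp/sel-demo/home/httpd/html
echo "config" > /tmp/sel-demo/etc/smeserver.conf
echo "<html></html>" > /tmp/sel-demo/home/httpd/html/index.html

# Archive only the named trees instead of the whole disk; -C changes
# into the base directory so the archive stores relative paths
tar czf /tmp/sel-backup.tgz -C /tmp/sel-demo etc home/httpd/html

# List the archive contents to confirm what was captured
tar tzf /tmp/sel-backup.tgz
```

Keeping the archive to just those trees should also keep it comfortably under the 2GB limit.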
Regards
Ray Mitchell
-
Try the "du" (disk usage) command. Run it from / with "du -m" to get per-directory totals in megabytes; it will tell you how much data each folder contains.
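For example (the sample tree here is created just for demonstration; on the server you would point du at / or /home to find where the 6GB is hiding):

```shell
# Build a small tree with one deliberately large file (~2MB of zeros)
mkdir -p /tmp/du-demo/a /tmp/du-demo/b
dd if=/dev/zero of=/tmp/du-demo/a/big bs=1024 count=2048 2>/dev/null

# -s: one summary line per argument; -m: report in megabytes.
# Sorting numerically puts the biggest space consumers last.
du -sm /tmp/du-demo/* | sort -n
```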