My upgrade to Karmic went well, and I even got a message from the utility Palimpsest saying that one of the drives in my RAID1 had many bad sectors. I purchased a drive of the same size from Newegg and replaced the one that was failing. I used Palimpsest to add the new drive to the RAID1; it took quite a while, and afterwards it said everything was fine.
sudo mdadm --misc -D /dev/md0 also said that both drives in the array were "active sync", so I felt pretty confident that I had successfully rebuilt the RAID. But when I looked at the drives with Gparted, the first drive looked normal, while the new drive (the one supposedly added to the array successfully) showed a status of "not mounted". So what more do I need to do to return this RAID to normal operation, or is it already there?
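For reference, these are the checks I ran to convince myself the array was healthy (a sketch; the device names /dev/md0 and /dev/sdb1 are assumptions — substitute your own):

```shell
# Quick health check of a software RAID1
cat /proc/mdstat                 # "[UU]" means both members are up; "[_U]" means one is missing
sudo mdadm --detail /dev/md0     # look for "State : clean" and two "active sync" devices
sudo mdadm --examine /dev/sdb1   # verify the new member actually has an md superblock
```

Note that Gparted reporting the member as "not mounted" can be normal: the individual RAID members are never mounted directly, only the assembled /dev/md0 is.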
Any help greatly appreciated.
EDIT: Tried to reboot with the new drive only and it crashed big time, so it isn't working; now I'm just not sure how to fix it from here.
EDIT 2.0: Trying to rebuild with terminal commands after manually setting up the drive with Gparted. It says it should take about an hour and a half. Fingers crossed.
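In case it helps anyone else, the terminal rebuild boils down to something like this (a minimal sketch, assuming the surviving member is /dev/sda, the replacement is /dev/sdb, and the array is /dev/md0 — adjust the names to match your system):

```shell
# Copy the partition layout from the surviving disk to the replacement
sudo sfdisk -d /dev/sda | sudo sfdisk /dev/sdb

# Add the new partition to the degraded array; the resync starts automatically
sudo mdadm --manage /dev/md0 --add /dev/sdb1

# Watch the resync progress until it finishes
watch cat /proc/mdstat
```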
Comments
I just want to remind you that a RAID 1 cannot be your only backup. RAID 1 saves you when one of your drives crashes, in that you can still get some work done while you wait for a new drive to show up. The odds of both going down together should be pretty low.
What it does not save you from is, say, accidentally deleting a file or getting a virus. That file will be instantly deleted on both drives. You still need some sort of archive backup, like an external hard drive, jungledisk, carbonite, etc.
Here is a simple article on creating a RAID1 on FreeBSD. If you are using Linux, your commands should be similar.
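On Linux the tool is mdadm rather than FreeBSD's gmirror; a hedged sketch of creating a two-disk RAID1 from scratch (not a transcript of the article — /dev/sda1 and /dev/sdb1 are assumed device names, and the mdadm.conf path is the Debian/Ubuntu location):

```shell
# Build the mirror from two partitions of the same size
sudo mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1

# Put a filesystem on the assembled array (never on the members directly)
sudo mkfs.ext4 /dev/md0

# Persist the array definition so it assembles at boot
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm/mdadm.conf
```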