
Assistance needed for HW Raid1


dude67

I don't have Mandriva available to me currently, so could someone verify whether this Promise card is supported by kernel version 2.6.24.x:

 

Promise FastTrack TX2300 (2-port SATA RAID adapter)

 

This seems like the card that would work, so I'm trying to find it in the local stores. But if someone could check whether it's already supported by the kernel, I'd appreciate it. :)


This is from the www.promise.com site:

Linux Kernel 2.6 Partial Source Code, 359.02 KB, v2.6.0.0331

2007/3/27

Description

NOTE: This source code is for the FastTrak TX2300/TX2200 RAID controllers using Linux Kernel 2.6.

 

This Linux source code should only be used by experienced Linux users. This code is designed to be used with Linux kernel 2.6. This code is provided as-is and Promise Technology does not provide technical support for the code.

 

Please read the README file contained in the download package. If you have any Linux Source Code comments, please email: linuxcomments@promise.com and linux@promise.nl

What would that mean in practice? What do I do with that? :unsure:

Do I need to build something into the kernel or what...?
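If I understand it right, "source code" like this usually means building an out-of-tree kernel module against your installed kernel headers. Something roughly like this, I think (just my guess at the general procedure; the module name ft3xx is only a placeholder for whatever the Promise package actually builds - their README would say):

<code># install the kernel headers/sources for the running kernel first,
# then, from inside the unpacked Promise source directory:
make -C /lib/modules/$(uname -r)/build M=$(pwd) modules

# load the freshly built module (the name here is hypothetical, see the README)
insmod ./ft3xx.ko

# to have it available at boot, copy it under /lib/modules/$(uname -r)/
# and run: depmod -a</code>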


I would say so, yes. If you want to install an OS using the card, then it's best to find one supported in the kernel, as compiling a kernel during the install isn't really an option. Unfortunately, it's not so easy without knowing which chipsets are supported - like the LSI Logic/Symbios Logic Fusion MPT that I mentioned before, for example.

 

It may take a bit of looking around to find the one you need, and the chipset info, so that you can see if it's going to work by default.
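If you already have the card in a machine, something along these lines should show whether a stock kernel claims it (sata_promise is my guess for the driver covering Promise SATA chipsets - verify it against the PCI ID that lspci reports for your card):

<code># show the card's chipset name and PCI vendor:device ID
lspci -nn | grep -i promise

# see whether the running kernel ships the libata driver
modinfo sata_promise

# list the PCI IDs that driver claims to support
modinfo -F alias sata_promise</code>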


Just as an aside: I could use CentOS 5.0 with my Adaptec 1430 using the correct driver and a RAID 10 setup done by the card. The problem was, I couldn't upgrade to any newer kernels to get Xen installed, or anything else to get the full potential of my machine. I had connected the four 250GB disks to the motherboard, removing the Adaptec card entirely, but found that I then had a mix under CentOS of two disks being hda/hdc and the other two being sda/sdb. I don't know why there was a mix like this, but with a later kernel under Gentoo they all showed up as sda/sdb/sdc/sdd. Anyway, they were slow too.

 

So I put the Adaptec back in, deleted the array configuration so that there is no RAID config, and now I just use it as a plain disk controller instead of a RAID controller. I see four disks instead of one big one, but it is also much faster. I now have sda configured for my swap, / and /home partitions, and I'm using LVM across the sdb, sdc and sdd disks :)
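In case it's useful to anyone, spanning LVM across the three disks is only a handful of commands - roughly this, with the volume group and logical volume names just being examples:

<code># mark the disks as LVM physical volumes
pvcreate /dev/sdb /dev/sdc /dev/sdd

# group them into one volume group (the name is arbitrary)
vgcreate vg_data /dev/sdb /dev/sdc /dev/sdd

# carve out a logical volume using all the free space
lvcreate -l 100%FREE -n lv_data vg_data

# put a filesystem on it and mount it
mkfs.ext3 /dev/vg_data/lv_data
mkdir -p /mnt/data
mount /dev/vg_data/lv_data /mnt/data</code>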

 

Oh, and it's just compiling my install on Gentoo now - well, the last remaining parts - which is nice on this Quad Core with 8GB of RAM :)


You can only use dynamic partitions for the data disk, so Windows on C: is not using software RAID, as you can see from the article. The rest looks exactly like how you'd do software RAID, as I wrote in my article.


OK, if I understood correctly, I can pretty much do software RAID with my dual-boot installation, everything except the Windows C:\ part? So I will install the C:\ partition for Windows, but keep it fairly small, and make a D:\ partition where I keep all the data I can. Windows is mainly for gaming and some video editing, so I can easily save all the files onto the D:\ partition.

 

But if I set it up like this, then if the primary HDD breaks, I would need to reinstall the whole of WinXP onto the first partition. Am I correct? In that case I would need to install all the apps again, even though the data (and perhaps the apps themselves) would be on the D:\ partition. I think the registry would be lost, since it's stored on the C:\ partition...

 

I don't know - this seems a much harder nut to crack than I originally thought... I'll give it a try anyway.


Yes, you'd need to reinstall Windows again unfortunately. But everything on D: would be intact. This is why for Windows, hardware raid would be better.

 

Incidentally, I just decided to add my two old hard disks that were using software RAID in Linux, and I was able to generate the RAID config file, then simply assemble and scan the arrays to activate them, and copy the data from the old disks to the new ones. This would be so much harder, if not impossible, with a hardware RAID controller. Just copying data to the new disks now, using Mandriva 2008.1 :)
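For reference, the assemble-and-scan step was roughly this with mdadm (the config file path is the usual one on Mandriva-era systems, but check your distro):

<code># record the existing arrays found on the old disks
mdadm --examine --scan >> /etc/mdadm.conf

# assemble every array listed in the config file
mdadm --assemble --scan

# check that they came up
cat /proc/mdstat</code>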

 

Yep, not VMware this time. Unfortunately, I'm still struggling to find a distro that has a working Xen installation, and only CentOS 5 seems to have it so far. Unfortunately, my onboard LAN doesn't work with it and I can't be bothered to get it working. I'll figure out later why Xen doesn't work in Mandriva, and decide between Xen, VMware Server 2 RC1 and VirtualBox.


It's not as simple as hw raid vs software raid, because many "hardware raid" cards are significantly dependent on software and are not true hardware raid.

 

I just posted about this in Ian's new computer thread. A lot of Linux kernel developers advocate full Linux software raid, unless you know that you've actually got real hardware raid -- and those are expensive.

 

After all, what's the point in wasting money on a fakeraid card if it's just doing it all in proprietary software and using your CPU anyway? You might as well use regular Linux software raid that's open, well supported, and easier to deal with when something goes wrong.

 

https://mandrivausers.org/index.php?showtop...st&p=498284

 

Though... you're using windows too, and that complicates things a bit. Linux software raid probably won't work with Windows.

 

The Linux ATA site has a great deal of information about what is and isn't supported:

http://linux-ata.org/driver-status.html

http://linuxmafia.com/faq/Hardware/sata.html


Thanks to all of you. I didn't know (before this, that is) that PCI controllers usually come with SW RAID (at least in my price range) instead of HW RAID.

 

I bought the card (Promise TX2300 RAID Controller) in vain, but let this be a lesson to all you other newbies... I've paid for my lesson - you don't have to!

 

So my advice is to try to get the software RAID to work - it's fairly simple during the Mandriva installation. I can write a short how-to here, but I need to sort a few things out first. I disconnected the sda HDD and booted up the system, but was dropped to a command line - with only "/" operating. If you have any idea what to do or what went wrong, I'd appreciate it.

 

I have two 750GB HDDs and installed Windows first (some 20GB for the C: drive and 80GB for the D: drive). Then I installed Mandy on the remaining 600GB.

 

I partitioned sda so that the first Mandy partition (to become "/") is 19GB, then there is swap (around 1GB), and two other partitions (100GB for "/home" and 480GB for "/storage"). All of these (except the swap) I formatted as Linux RAID. I then added new RAID devices: the first 19GB became md0, the 100GB became md1 and the last 480GB partition became md2.

 

In the RAID page I then formatted the md0 as "/", md1 as "/home" and md2 as "/storage".

 

The second 750GB drive, sdc, is configured the same way, except that the Windows part is left as an empty partition and the rest are exactly as on sda (all formatted as Linux RAID as well). Finally, I added these partitions to the same RAID devices (/ = md0, /home = md1 and /storage = md2).
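From what I've read, what the installer builds here corresponds roughly to the following mdadm commands (just for illustration - I let the installer do it, and the partition numbers are the ones from the layout above):

<code># mirror each pair of Linux RAID partitions
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda6 /dev/sdc5
mdadm --create /dev/md1 --level=1 --raid-devices=2 /dev/sda8 /dev/sdc7
mdadm --create /dev/md2 --level=1 --raid-devices=2 /dev/sda9 /dev/sdc8

# then format and mount the md devices, not the raw partitions
mkfs.ext3 /dev/md0   # "/"
mkfs.ext3 /dev/md1   # "/home"
mkfs.ext3 /dev/md2   # "/storage"</code>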

 

Then I could see that these were operating fine (at least reported to be fine):

<code>[dude67@localhost ~]$ cat /proc/mdstat
Personalities : [raid1] [raid6] [raid5] [raid4]
md2 : active raid1 sda9[0] sdc8[1]
      506272256 blocks [2/2] [UU]

md1 : active raid1 sdc7[1] sda8[0]
      102398208 blocks [2/2] [UU]

md0 : active raid1 sda6[0] sdc5[1]
      20474688 blocks [2/2] [UU]

unused devices: <none>
[dude67@localhost ~]{:content:}lt;/code>

I added the bootloader (LILO) to md0 when installing Mandriva (OK, I added it only to sda first, but then added it to md0 in MCC). Then I tested the redundancy by disconnecting the SATA cable from the sda drive. I could only get a working "/" partition - no other partition would work. This is what it looks like on the command line:

<code>md2 : inactive sda9[0](S)
      506272256 blocks
md1 : inactive sda8[0](S)
      102398208 blocks
md0 : active raid1 sda6[0]
      20474688 blocks [2/1] [U_]

unused devices: <none></code>

 

So any suggestions as to what went wrong? Anything to do with the fact that the partition numbers differ from one HDD to the other like this:

md0 ... "/" ... sda6 ... sdc5

md1 ... "/home" ... sda8 ... sdc7

md2 ... "/storage" ... sda9 ... sdc8

 

This is due to the fact that on sda Windows occupies two partitions, while on sdc there is only one empty partition.
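I haven't tried this yet, but my guess is that when the arrays come up "inactive" with only one member like that, the thing to try is to inspect the remaining members and force-assemble them in degraded mode:

<code># see what the kernel knows about the remaining members
mdadm --examine /dev/sda8 /dev/sda9

# stop the half-assembled arrays and retry, allowing a degraded start
mdadm --stop /dev/md1
mdadm --assemble --run /dev/md1 /dev/sda8

mdadm --stop /dev/md2
mdadm --assemble --run /dev/md2 /dev/sda9</code>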


  • 2 weeks later...

According to here:

 

https://qa.mandriva.com/show_bug.cgi?id=43785

 

it will be fixed for the LiveCD shortly, but the normal install sets still work as they should, and grub should be selectable. If it's not, a bug would need to be raised to address it. Maybe later I will check and test this again when the full release of Mandriva 2009 is out, just to make sure :)

