92TB FlexRAID Server
#31
(2018-10-06, 20:22)Harro Wrote: You are using 1 parity drive for the 74 TB? What speeds are you getting on a parity check and how long does it take?
With how FlexRAID works, it only needs to generate new parity for new or altered files.  So the weekly parity update can take an hour to a few hours, or 6-12 hours if I've made HUGE changes (usually a very large ingestion, like nearly a TB in a week, but that's rare).

However, building all-new parity data from SCRATCH takes multiple days.

Also, since this is a file-based, snapshot parity system, even if one storage drive and one parity drive blow up, all the other drives still just have NTFS file systems with all their files, plus some folders of special metadata.  Any surviving drive can be pulled out and read normally on any system that can read NTFS.
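For anyone wondering how one parity drive can cover a pile of data drives while every surviving drive stays independently readable, here's a rough Python sketch of the underlying XOR idea. To be clear, this is not FlexRAID's actual code, and the file names are made up; real snapshot-parity tools work per file and track extra metadata, but the principle is the same:

```python
# Purely illustrative sketch of single-drive XOR parity -- NOT FlexRAID's code.
# One parity volume can rebuild any one lost data volume, while each data
# volume remains an ordinary filesystem that can be read on its own.

BLOCK = 1 << 20  # 1 MiB chunks; the size is an arbitrary choice for this example


def xor_files(paths: list[str], out_path: str) -> None:
    """XOR the given files together chunk by chunk and write the result."""
    handles = [open(p, "rb") for p in paths]
    try:
        with open(out_path, "wb") as out:
            while True:
                chunks = [h.read(BLOCK) for h in handles]
                if not any(chunks):
                    break
                acc = bytearray(max(len(c) for c in chunks))
                for c in chunks:
                    for i, b in enumerate(c):
                        acc[i] ^= b
                out.write(acc)
    finally:
        for h in handles:
            h.close()


# Build parity over three hypothetical data-drive images:
#   xor_files(["disk1.img", "disk2.img", "disk3.img"], "parity.img")
# If disk2.img dies, rebuild it by XORing the parity with the survivors
# (the result is zero-padded to the longest input, which is fine for a sketch):
#   xor_files(["disk1.img", "disk3.img", "parity.img"], "disk2_rebuilt.img")
```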
Reply
#32
Thanks for the info. I have wanted to dabble with FlexRAID, but Unraid just seems to get better and better. Have you ever considered trying Unraid? I have been running Unraid for many years and I like it a lot. My sig has my build, but since then I have updated the PSU and CPU, along with all the hard drives. I'm currently at 104TB of storage with 2 parity drives. My monthly parity check takes about 18 hours. I upgraded the CPU so I could run a Plex server and stream to more clients, and the PSU was upgraded for the higher power draw of the 7200 RPM drives. I idle at about 100 watts and have the server connected to a battery backup in case of power failure.

It is fun to have storage for whatever needs you want.
Reply
#33
(2018-10-07, 20:49)Harro Wrote: Thanks for the info. I have wanted to dabble with FlexRAID, but Unraid just seems to get better and better. Have you ever considered trying Unraid? I have been running Unraid for many years and I like it a lot. My sig has my build, but since then I have updated the PSU and CPU, along with all the hard drives. I'm currently at 104TB of storage with 2 parity drives. My monthly parity check takes about 18 hours. I upgraded the CPU so I could run a Plex server and stream to more clients, and the PSU was upgraded for the higher power draw of the 7200 RPM drives. I idle at about 100 watts and have the server connected to a battery backup in case of power failure.

It is fun to have storage for whatever needs you want.
Unraid is pretty attractive, and I didn't really know about it when I went with FlexRAID; I was mostly looking for a replacement for DriveBender at the time.  I'll eventually deploy a SECOND server to supplement this one once its hard drive capacity is maxed out, and I'll seriously look at UnRAID in that situation.
Reply
#34
A few more upgrades!  eBay had a sale on 8TB drives, so I added another; we're up to 78TB of storage now.  I added the last SATA card it'll need, and then I installed a hot-swap bay that lets me fit 4x 3.5" drives in the 3x 5.25" bays.  With this, the server will now hold 16x 3.5" drives, plus 4x 2.5" drives in the rear trays.  Once this is full, it'll be time to build another server. Smile  The next one will be a 4U rack unit, use LSI controllers, and otherwise have a LOT more forward thought.

Image

Image

Image
Reply
#35
Gosh, I have some updates here I guess. Smile  First, it's up to 104TB now. Also, FlexRAID is dead, and I need to migrate away from it while my activation still works.  I also traded out the Syba SATA controller mess for a single 16-port LSI SAS controller, and I swapped in 4x 8GB DIMMs instead of the SODIMMs in adapters.

...Oh, and I upgraded from a P67 board with an i5-2300 to an X79 board and an Intel Xeon E5-2697 v2. Smile

Image

Image

Image

Image
Reply
#36
A word of warning: do not use SATA cards, especially not $35 ones.

The reason is that I did the same years ago. I bought, for example, eight 3TB drives; I had four on my mobo SATA ports and four on my SATA card.

In the space of a year, all four that were on the SATA card failed, while the four on the mobo SATA ports have lasted to this day.

They were all the same drives. A SATA card, especially a cheap one, is not safe.

You can either listen to my advice or end up crying, because you are going to lose a lot of hard drives and also what is on those hard drives.

I now buy second-hand mobos with as many ports as I can find and use those. It is better to have multiple PCs than to put all your eggs in one basket and see the lot go down.
Reply
#37
(2019-11-09, 12:55)Video Titles Wrote: A word of warning: do not use SATA cards, especially not $35 ones.

The reason is that I did the same years ago. I bought, for example, eight 3TB drives; I had four on my mobo SATA ports and four on my SATA card.

In the space of a year, all four that were on the SATA card failed, while the four on the mobo SATA ports have lasted to this day.

They were all the same drives. A SATA card, especially a cheap one, is not safe.

You can either listen to my advice or end up crying, because you are going to lose a lot of hard drives and also what is on those hard drives.

I now buy second-hand mobos with as many ports as I can find and use those. It is better to have multiple PCs than to put all your eggs in one basket and see the lot go down.

Look at my last post in this thread, where I posted an update.  It's the only post in this thread in over a year; it's the post that bumped the thread.
Quote: I also traded out the Syba SATA controller mess for a single 16-port LSI SAS controller

Now look at the most recent case photo, the one where you can clearly see an LSI SAS 9201-16i in the single slot where the 4x Syba controllers once sat.

Cool, thanks, good reading.

Also, no, a SATA controller can't KILL drives.
Reply
#38
(2019-11-09, 16:04)DJ_Izumi Wrote:
(2019-11-09, 12:55)Video Titles Wrote: A word of warning: do not use SATA cards, especially not $35 ones.

The reason is that I did the same years ago. I bought, for example, eight 3TB drives; I had four on my mobo SATA ports and four on my SATA card.

In the space of a year, all four that were on the SATA card failed, while the four on the mobo SATA ports have lasted to this day.

They were all the same drives. A SATA card, especially a cheap one, is not safe.

You can either listen to my advice or end up crying, because you are going to lose a lot of hard drives and also what is on those hard drives.

I now buy second-hand mobos with as many ports as I can find and use those. It is better to have multiple PCs than to put all your eggs in one basket and see the lot go down.

Look at my last post in this thread, where I posted an update.  It's the only post in this thread in over a year; it's the post that bumped the thread.
Quote: I also traded out the Syba SATA controller mess for a single 16-port LSI SAS controller

Now look at the most recent case photo, the one where you can clearly see an LSI SAS 9201-16i in the single slot where the 4x Syba controllers once sat.

Cool, thanks, good reading.

Also, no, a SATA controller can't KILL drives. 

lol, so it was a fluke that all of the drives I had on a SATA card died within about a year, and yet the ones on the mobo are still working four or more years later? As I say, I have warned you. If you have since upgraded to a better, more expensive card, that may be more stable, but I can only speak from experience: cheap SATA cards can KILL drives.
Reply
#39
Hi DJ,

I have a quick question for you about Flexraid.

I have followed a fairly similar path to yours with my FlexRAID server, starting with 3x 3TB drives and now having a mix of 9 drives totalling 38TB, plus a 14TB parity drive and a system SSD. They're all in an old tower case with an ancient Q6600 CPU.

I want to upgrade the CPU, motherboard, and memory, but as you know, FlexRAID is not supposed to be transferable between systems. I was wondering how you managed the upgrade from the 2500 to the Xeon chip?

Thanks very much
Reply
#40
(2021-01-24, 19:40)kofsw4 Wrote: Hi DJ,

I have a quick question for you about Flexraid.

I have followed a fairly similar path to yours with my FlexRAID server, starting with 3x 3TB drives and now having a mix of 9 drives totalling 38TB, plus a 14TB parity drive and a system SSD. They're all in an old tower case with an ancient Q6600 CPU.

I want to upgrade the CPU, motherboard, and memory, but as you know, FlexRAID is not supposed to be transferable between systems. I was wondering how you managed the upgrade from the 2500 to the Xeon chip?

Thanks very much

If I recall, the license kept working, but I'm not 100% sure; it was some time ago.  Moreover, FlexRAID has been abandoned: its developer killed the site and ghosted users.  I strongly suggest you abandon FlexRAID for something else. I'm all UnRAID now.
Reply
#41
Thanks for the reply. I know deep down that you're right and I should upgrade the software before the hardware, but FlexRAID is just so, well, flexible for my purposes.

Maybe I'll prepare myself for a potential DrivePool/SnapRAID migration in case FlexRAID fails to relicense after the hardware change.
Reply
#42
I use Linux rather than Windows. In my case I use mergerfs, which I think is along the same lines as DrivePool, and I've had great success combining that with SnapRAID. SnapRAID is easy to use; I just did my first disk replacement the other day. It took a while, since my parity drives are USB and therefore limited in I/O, but it worked out just fine. I highly recommend SnapRAID any time. As I understand it, it's being put into the next Debian release, so it will be around a long time, and it's open source as well, so you can depend on it being around without worrying about licensing.
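For anyone weighing the same route, here's roughly what a minimal mergerfs + SnapRAID layout looks like. The paths and disk names below are made up for illustration, and I'd double-check the SnapRAID and mergerfs documentation rather than copying this verbatim:

```
# /etc/snapraid.conf -- example layout (paths and disk names are hypothetical)
parity /mnt/parity1/snapraid.parity
content /var/snapraid/snapraid.content
content /mnt/disk1/.snapraid.content
data d1 /mnt/disk1
data d2 /mnt/disk2
data d3 /mnt/disk3
exclude *.tmp

# /etc/fstab -- pool the data disks into one mount point with mergerfs
# /mnt/disk1:/mnt/disk2:/mnt/disk3  /mnt/storage  fuse.mergerfs  defaults,allow_other  0 0

# Day-to-day commands:
#   snapraid sync       # update parity after adding or changing files
#   snapraid scrub      # periodically verify part of the array
#   snapraid -d d2 fix  # rebuild a replaced disk (here d2) from parity
```

Disk replacement is basically: swap the drive, mount it at the same data path, run the fix for that disk, and then do a sync afterwards.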
Reply
#43
Nice setup Smile
But you really need to do some cable management Big Grin
Kodi 21.0α | Ubuntu 22.04.3 | Kernel 6.4.x | intel i5-12600K | Gigabyte Z690 Gaming X DDR4 | Corsair 2x8192MB (DDR4-3200) | HDPlex H5v2 | HDPlex 400W HiFi DC-ATX | Pioneer VSX-934 | LG 65B7D
Reply
