Top critical review
1.0 out of 5 stars
Horrible reliability
Reviewed in the United States on December 14, 2018
I built a RAID 5 for a home security system with 5 of these. One drive failed at 5 weeks. (Five is the magic number in this review: 5 of these drives, in a RAID 5, with a failure at week 5.) Rather than repair the array, I decided it simply wasn't worth rebuilding around a drive that failed that quickly.
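For perspective, a back-of-the-envelope check shows just how far out of spec that early failure is. Here is a minimal Python sketch assuming an exponential failure model; the 1,000,000-hour MTBF is a hypothetical placeholder, not a figure quoted from these drives' spec sheet.

    # Back-of-the-envelope check: probability of seeing at least one
    # failure in a 5-drive array within 5 weeks, assuming independent
    # drives with exponentially distributed lifetimes.
    # NOTE: the rated MTBF below is a hypothetical placeholder, not a
    # spec quoted for these drives.
    import math

    RATED_MTBF_HOURS = 1_000_000   # hypothetical per-drive rating
    N_DRIVES = 5
    WEEKS = 5
    hours = WEEKS * 7 * 24

    # P(a single drive fails within t) = 1 - exp(-t / MTBF)
    p_one = 1 - math.exp(-hours / RATED_MTBF_HOURS)
    # P(at least one of N independent drives fails within t)
    p_any = 1 - (1 - p_one) ** N_DRIVES

    print(f"P(one drive fails in {WEEKS} weeks)  = {p_one:.4%}")
    print(f"P(any of {N_DRIVES} fails in {WEEKS} weeks) = {p_any:.4%}")

Under that assumption, a failure that early across 5 drives is roughly a 1-in-240 event -- so either I was very unlucky, or the drives don't live up to any reasonable rating.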
I also built a RAID 0 in a Linux home rig for high-performance computing, supporting my work toward a PhD in the field of computational electromagnetics. One drive failed in merely the 4th week. Instead of rebuilding -- again, why rebuild the same configuration if its reliability was that low? -- I bought an LSI (now Broadcom) NVMe host bus adapter and one of the crazy fast and crazy reliable Samsung 970 Pro NVMe SSDs (M.2 form factor, with a SAS-to-NVMe cable and an M.2 housing for mounting in a drive bay). The build with the LSI HBA and the 970 Pro has been AMAZING -- extremely high performance and flawless reliability so far, though granted I am only 4 months into this configuration. To be fair, I think a comment on reliability should be made only after at least a year of service, preferably 4 or 5 years.
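The same exponential assumption also explains why I refused to rebuild the striped array: with no redundancy, any single drive failure kills the whole set, so a RAID 0's MTBF is roughly the per-drive MTBF divided by the drive count. A quick sketch (the 600,000-hour figure is a hypothetical stand-in for a low-cost drive; 1,500,000 hours is the 860 Pro rating mentioned below):

    # Rough RAID 0 reliability scaling under the usual assumptions
    # (independent drives, exponential lifetimes): any single drive
    # failure loses the whole stripe set, so array MTBF ~ drive MTBF / N.
    def raid0_mtbf(drive_mtbf_hours: float, n_drives: int) -> float:
        """Approximate array MTBF for an N-drive RAID 0."""
        return drive_mtbf_hours / n_drives

    # Hypothetical numbers for illustration only:
    for mtbf in (600_000, 1_500_000):
        print(f"drive MTBF {mtbf:>9,} h -> "
              f"5-drive RAID 0 MTBF {raid0_mtbf(mtbf, 5):>9,.0f} h")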
As a closing note, one may wonder whether I had a good power supply -- meaning anything from enough power capacity to clean enough power for the 5-disk RAID. Well, a month after removing the 5-disk RAID 5 and installing standalone SATA III SSDs for each of my security cameras, I wanted better performance from this build. The system is maxed out at 32 GB of DDR3. The cameras are a mix of older H.264 and newer H.265. I record everything at 24-bit color depth, 2 MP and 20 fps, and in real time I transcode the two older H.264 streams to H.265. I did have a P1000 and the CUDA libraries installed in this rig before commissioning the P1000 for my Linux HPC; now I simply transcode the H.264 to H.265 in (as close to) real time (as possible) with Intel QuickSync on an i5-4430.

If both of the two older cameras are triggered at once, my 32 GB of memory can fill rather quickly. I am running Windows 10, and I also immediately mirror the files to a remote server for improved security. When the OS started paging frequently, the video files were getting corrupted, as this 4-core QuickSync rig simply couldn't keep up in real time while multiplexing processor time to page memory contents. I likely could have tweaked things to unload the OS and unneeded processes from memory. Instead, I took a huge chance: even after my two lousy recent experiences with failing RAIDs, I took 5 of the Samsung 860 Pro SATA SSDs and built a RAID 0 as the PRIMARY (SYSTEM) drive -- a very scary thing, and highly unadvisable in general.

However, my experience with Samsung's Pro series (even before knowing or reading the specs) has been rock solid. I used a few 840 Pros to build a RAID 10 at the office several years ago, and I have had ZERO trouble from it. With the new 860 Pro boasting a 1.5-million-hour Mean Time Between Failures (MTBF), a 300 TB written (TBW) warranty and a 512 MB cache, I was willing to take my chances. I also used a slightly larger than default block size in hopes of speeding paging operations. That was 3 months ago, and I have not had a single corrupt video file from my security system since. It wasn't that the video files themselves were being paged; it was that the processors had to take time out to orchestrate a great amount of paging, and the OS could only page so fast. This RAID consumes a peak power of 16.5 W and a standby power of 0.25 W, and my power supply has had no difficulty meeting those requirements. These drives, as one would certainly expect, support TRIM as well as AES-256 and IEEE 1667 encryption.

The drawback: the Samsung Pro series SATA III SSDs are considerably more expensive than many low-cost SOHO SATA III SSDs -- $88 vs $40. For a 5-disk RAID that is a price difference of $240, or 120% more costly, but my experience has been that low-cost SOHO SSDs simply cannot be used in RAID configurations.
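Two quick numbers behind that gamble, using the figures above (the 50 GB/day write volume is a hypothetical workload for illustration, not a measurement from my rig):

    # Two quick numbers behind the 860 Pro decision.
    # TBW headroom: how long 300 TB of rated writes lasts at a given
    # daily write volume (50 GB/day is a hypothetical workload).
    TBW_TB = 300
    daily_writes_gb = 50
    years = (TBW_TB * 1000) / daily_writes_gb / 365
    print(f"{TBW_TB} TBW at {daily_writes_gb} GB/day ~ {years:.1f} years")

    # Price delta for a 5-drive build: Samsung Pro vs. low-cost SOHO SSD.
    pro, cheap, n = 88, 40, 5
    delta = (pro - cheap) * n
    print(f"extra cost: ${delta} ({delta / (cheap * n):.0%} more than ${cheap * n})")

At a sustained 50 GB/day, the 300 TBW rating wouldn't run out for over 16 years -- which is why the extra $240 was an easy decision for a system drive.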
When a user has plenty of PCIe lanes, I highly recommend NVMe. Reliability has been fantastic, and performance is considerably better than RAID. I benchmark and also experiment with real-world scenarios of NVMe vs SATA RAID vs SAS RAID, and I have yet to see even a 6-disk SAS RAID come close to the performance of a single NVMe drive, in both personal benchmarks and personal real-world experience.

For now, Intel doesn't offer enough lanes to satisfy my insatiable appetite for data. I have an i9 rig with 44 lanes for an extreme HPC server. I'd like to have an x16 display, my x16 Tesla, my x4 NVMe OS drive, my x4 10 GbE NIC, an x4 NVMe data drive, an x4 mirror for the system drive and an x4 mirror for the data drive. That is 52 lanes -- the quick tally below spells out the shortfall. AMD offers processors with more than 52 lanes, but I had already committed my money to the i9, and at the time AMD only supported DDR3; now they support DDR4. AMD's slow adoption of DDR4 really hurt them, even though they have managed to stay ahead of Intel on PCIe lane count, on core counts and on clock rates, and sometimes ahead on cache -- though traditionally with poorer cache and a poorer Platform Controller Hub. AMD seems to be gaining momentum; however, motherboard manufacturers seem slow to embrace AMD's recent advances.

I also am a big fan of QuickSync. I find it fairly comparable to NVidia's GPUs when scaled to a 64-bit bus, though the ever-widening bus is where NVidia is able to load and unload gobs of data between on-board memory and compute cores extremely quickly -- very nice for iterative solutions. NVidia cannot yet fit 128 GB the way QuickSync can, since QuickSync draws on system memory. While sharing on-board and off-board ("off-core") memory is possible with OpenMP and even many message-passing interfaces by treating the GPU as a node and the host CPU as the parent node, efficiency takes a huge hit, negating the value. Intel needs to offer more than 44 PCIe lanes, and NVidia needs to push its on-board memory beyond 48 GB to 64 GB, and hopefully very soon to 128 GB -- granted, NVLink bridges are available for many HPC cards, but I find only a limited few.
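Here is that lane tally spelled out (the device labels are just my own shorthand for the build described above):

    # My PCIe lane budget on the 44-lane i9: summing the devices listed.
    devices = {
        "display GPU":        16,
        "Tesla compute card": 16,
        "NVMe OS drive":       4,
        "10 GbE NIC":          4,
        "NVMe data drive":     4,
        "NVMe OS mirror":      4,
        "NVMe data mirror":    4,
    }

    wanted = sum(devices.values())   # 52 lanes
    available = 44                   # lanes on my i9
    print(f"lanes wanted: {wanted}, available: {available}, "
          f"short by {wanted - available}")

Eight lanes short -- which is exactly why the lane count, not the core count, is what pushes me toward higher-lane platforms.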
In summary, these drives are cheap -- meaning both affordable and very low quality. The net effect is close to placing your money directly in a paper shredder. Do not waste your precious time, money or data. Do not be swayed by Sirens sweetly singing the low price of these SSDs.