Review: Kingston Hyper-X 3K 240GB SSD
Kingston targets fans... and mostly doesn't disappoint
Real world use cases
I put the Hyper-X SSDs through a variety of real-world use cases; they have proven to be impressive devices for nearly all workloads. Compared with other SSDs (or spinning-rust storage arrays behind a controller card), the Hyper-X is only middling at sequential access, but it truly shines at high-queue-depth random access.
If I'm laying down large sequential files I would honestly much rather have Seagate 7200.14s; they are nearly as fast as the Hyper-X for this use case, and you get 3TB instead of 240GB. This makes the Hyper-X of questionable value in a gaming rig, for example. Most of my video games read texture files sequentially during game initialisation and then run entirely out of RAM. Similarly, the Hyper-X is absolutely pointless for storing bulk media - there is no noticeable difference between the 7200.14 and the Hyper-X when playing video.
Outside of these sequential use cases, however, the Hyper-X is not only a step up from any of the spinning rust I have to play with, it wipes the floor with my Intel 510 SSDs and any of the OCZ stuff I still have lying around. Without question, Windows loads faster. Most applications see a significant improvement, and I'm finally able to do video editing in real time. Web browser launch times are completely unaffected by the Hyper-X, but loading and navigating rich websites and HTML5 applications sees a marked improvement.
The most noticeable improvement was as a datastore for virtual machines. VMware Workstation 9 is something I use heavily. Running 8 VMs at a time could get pretty laggy on the Intel 510, but the Hyper-X soaked up the I/O like a champ.
The final destination for my Hyper-X SSDs is not a notebook or a desktop, but a server. These are consumer devices and not designed for enterprise use, but for a test lab environment they've been brilliant. For years Kingston has been a brand I have trusted without reservation for my server RAM. It was based on the strength of that reputation that I chose them for my test lab SSDs.
I managed to score eight of these drives on sale from Newegg for $159 apiece; they are regularly $199. A quick calculation shows that to read at the theoretical limit of 10Gbit Ethernet – 1280MB/sec – I would need at least seven drives. Controller cards come with eight ports, so eight drives it is. I picked up a Supermicro AOC-SAS2LP-MV8. With the SSDs, the test lab's high-speed array comes to a little over $1,400.
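The back-of-the-envelope sizing above can be sketched as a few lines of Python. The per-drive sustained throughput figure is my assumption (not stated in the review), chosen as roughly what each drive contributes in an array; the link speed and prices are the review's own numbers.

```python
import math

LINK_MBPS = 1280         # 10Gbit Ethernet as MB/sec (10 * 1024 / 8)
PER_DRIVE_MBPS = 185     # ASSUMED sustained per-drive throughput in the array
DRIVE_SALE_PRICE = 159   # Newegg sale price per drive, USD
CONTROLLER_PORTS = 8     # ports on the Supermicro AOC-SAS2LP-MV8

# Minimum drives to saturate the link, rounded up to whole drives
drives_needed = math.ceil(LINK_MBPS / PER_DRIVE_MBPS)

# Round up again to fill the controller card
drives_bought = CONTROLLER_PORTS

drive_cost = drives_bought * DRIVE_SALE_PRICE
print(drives_needed, drives_bought, drive_cost)  # 7 8 1272
```

At $1,272 for the drives, the controller card brings the total to "a little over $1,400", matching the review's figure.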
That's a lot of cash out of pocket - it will take months to pay this array off. Having run the tests and given buyer's remorse enough time to take hold, I still think I bought the right disks. $1,400 gives me an array that can sustain over 1000MB/sec reading and writing simultaneously. Even using Windows RAID, I can hammer the array with random I/O from a three-node VMware cluster and maintain those numbers. ®