Home Lab Rebuild 2018

About two years ago I decided that the pieced-together homelab was not working any longer. I had VMs sitting on multiple pieces of storage. Some of it was RAID 5 presented back to the same hosts over again through FreeNAS, Openfiler, or the HP VSA.

To get a complete solution that was scalable, resilient to failure, had 10Gb networking, and was built from enterprise hardware, I chose the HPE c3000 BladeSystem. I picked up two 10Gb VirtualConnect Ethernet modules, three blades with Westmere CPUs plus 96GB of RAM, storage blades, SAS HDDs, and some SSDs. I decided to go with vSAN on vSphere 6.0 to provide fault-tolerant shared storage to all hosts.

Gone baby gone

The system worked well. Eventually I swapped the consumer SSDs for enterprise SAS versions, but as soon as vSphere 6.0 Update 2 was released, I knew I was in trouble. The Westmere L5630 CPUs were not long for this world and would be dropped from support within the next few releases. That release also introduced deduplication and compression for all-flash configurations, and 6.5 brought even more new features that weren't going to work on my hardware. I was also looking to do some GPU pass-through work, which wasn't possible on the blades while keeping the current configuration. I knew they would have to be replaced.

I began my search by outlining the must-have features.

  1. Sandy Bridge or newer CPUs
  2. vSAN VCG supported hardware end to end
  3. DDR3 RAM so it would still be cheap
  4. Two PCIe slots at a minimum
  5. On-board 10Gb SFP+ to avoid wasting a PCIe slot
  6. Less than 200W power consumption without a GPU

It seemed like my choices were an HPE DL360p Gen8 or a Dell R620. After comparing costs, I picked the DL360p because it came out cheaper.

My configuration was as follows:

  1. Dual E5-2670 8-Core CPUs
  2. 192GB of RAM after recovering some memory from my blades
  3. Dual Port SFP+ on board
  4. 1RU form factor
  5. Dual 450W Platinum PSUs

Next in line was the actual storage. I reached out to someone I had bought enterprise SSDs from in the past and purchased six 1.6TB SanDisk Optimus Ascend drives, brand new off the shelf. These are great because they are enterprise grade and supported as all-flash capacity drives for vSAN 6.6! For the cache tier, I found a few eBay sellers with SanDisk Fusion-io ioDrive2 365GB PCIe cards showing 99%+ life left!
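As a rough sanity check on what that buys me, here is a back-of-the-envelope sketch of the usable capacity. The assumptions are mine, not from the build itself: the six capacity drives split two per host across the three hosts, the default FTT=1 mirror policy (so every object is written twice), and the ioDrive2 cards serving as cache only, which in an all-flash vSAN adds nothing to usable capacity.

```python
# Back-of-the-envelope vSAN capacity estimate for this build.
# Assumptions (mine, not stated in the post): 3 hosts, 2 capacity SSDs per host,
# default storage policy of FTT=1 with RAID-1 mirroring (2x space consumption).
# In an all-flash vSAN the cache-tier devices (the ioDrive2 cards) act as a
# write buffer only and contribute no usable capacity.

CAPACITY_DRIVES = 6          # SanDisk Optimus Ascend capacity drives
DRIVE_SIZE_TB = 1.6          # raw size per drive, in TB
FTT1_MIRROR_OVERHEAD = 2     # RAID-1 mirror stores every object twice

raw_tb = CAPACITY_DRIVES * DRIVE_SIZE_TB
usable_tb = raw_tb / FTT1_MIRROR_OVERHEAD

print(f"Raw capacity:           {raw_tb:.1f} TB")
print(f"Usable at FTT=1 mirror: {usable_tb:.1f} TB (before dedup/compression and slack space)")
```

So roughly 9.6TB raw and about 4.8TB usable under mirroring. Deduplication and compression should claw some of that overhead back, but how much depends entirely on the workload.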

A Quanta LB6M switch from UnixPlus provides the 10Gb networking for my new core. A bunch of SFP+ LC transceivers and some OM3 LC fibre cables later, and my parts list was done.

With all of that together, I now have a pretty sweet lab. I should be satisfied for at least 3 or 4 weeks.

Stay tuned for a follow-up on the build and performance testing.