Home Lab Generation 8 Parts List (Part 2)

Posted on Updated on

Today I’m releasing the parts that I plan to use for my Generation 8 Home Lab! I’ll be using three Dell T7820 Workstations for my ESXi hosts. In this video I go through the BOM (Bill of Materials), the parts I selected to build out my systems.

NOTE: Though I’ve installed ESXi 8 on these hosts, this is in no way my complete solution. For now ESXi 8 seems to run without issue, but I have not tested it with vSAN ESA yet. I’ll be releasing more videos around settings and networking shortly.

UPDATE: 03/11/23 All parts seem to be working well, though I’ve had a string of bad QLogic QL45212HLCU-DE 2-port 25GbE cards from eBay. After receiving 2 bad cards, I decided to buy a new card from Server Supply. I’ve installed this card and it’s working fine. I recommend buying this card new vs. used. https://www.serversupply.com/NETWORKING/NETWORK%20ADAPTER/25%20GIGABIT/DELL/0NJFX_310672.htm

More and updated information can be found here – https://vmexplorer.com/home-lab-bom/ #intelxeon #optane #vmware #vexpert #cloud

Home Lab Generation 7: Archive Page


Today I’m retiring my Generation 7 Home lab as my Generation 8 lab will be posted in my Home Lab BOM section very soon. My Gen 7 Home Lab was a vSphere and vSAN 7 All Flash cluster and below was its final configuration. Some of the items below I’ll be reusing in Generation 8, and some of the items I will be selling off.


My Gen 7 Home Lab is based on vSphere 7 (VCSA, ESXi, and vSAN), NSX-T, and vRealize Suite. Additionally, I use it for vSphere nesting and testing. It contains 3 x ESXi Hosts, 1 x Windows 10 Workstation, 4 x Cisco Switches, 2 x MikroTik 10GbE Switches, 3 x APC UPS units, 1 x Synology 1621+, and 1 x Asustor Lockerstor 10. Read further below for an itemized list.

NOTE: My Home Lab Generation 7 design will not fully work with vSphere 8. Though I could force the install of vSphere 8, I have decided to upgrade my home lab instead. Please see my Home Lab BOM for more information.


ESXi Hosts: (Total cost for one host ~$1325 | Unless noted, the items in this area are per host)

NOTE: This Home Lab Generation 7 has worked well for me with vSphere 7, BUT it will most likely need to be updated for vSphere 8 due to pCPU discontinuation. Please see this KB for further information: https://kb.vmware.com/s/article/82794

  • Case:
  • Motherboard:
  • CPU: (These CPUs work on vSphere 7, but chances are they will NOT work with vSphere 8)
    • 2 x Xeon E5-2640 v2 8 Cores / 16 HT (Ebay $30 each)
  • CPU Cooler or Heatsink: (Bracket required to fit narrow 2011 cooler mounts)
  • RAM:
    • 256GB DDR3 ECC RAM (Mostly from Ebay | each host has a different brand)
      • 16 x Host1: ELPIDA 16GB 2Rx4 PC3L-10600R-09-11-E2 Registered ECC Memory DDR3
      • 16 x Host 2: SAMSUNG M393B2K70CM0-YF8 16gb Pc3-8500r 1066mhz 1.35v Quad Rank X4 Ecc Registered Cl7 Ddr3 Sdram 240-pin Rdimm Memory Module For Server
      • 16 x Host 3: HYNIX HMT42GR7BMR4C-G7 16gb Pc3-8500r 1066mhz Cl7 Quad Rank X4 Registered Ecc Ddr3 Sdram 240-pin Rdimm Memory Module For Server
  • Disks:
    • 1 x Dual M.2 PCIe Adapter Card for NVMe/SATA SSD (Amazon $15)
      • 1 x 512GB NVMe SSD | vSAN Cache | Sabrent Rocket 512 (B&H $69)
      • 1 x 240GB M.2 SSD | Boot Disk | Kingston A400 (Amazon $35)
    • 2 x Supermicro AOC-SLG3-2M2 PCIe NVMe (Amazon $55)
      • 2 x 2TB NVMe SSD | vSAN Capacity | Sabrent Rocket 2TB (Amazon $199)
    • 1 x 2TB SATA | Extra Space | Hitachi (Ebay $50)
  • Network:
    • Motherboard Integrated i350 1gbe 4 Port
    • 1 x Mellanox ConnectX-3 Dual Port (HP INFINIBAND 4X DDR PCI-E HCA CARD 452372-001)
  • Power Supply:
    • Antec Earthwatts 500-700 Watts (Adapters needed to support case and motherboard connections)
      • Adapter: Dual 8(4+4) Pin Male for Motherboard Power Adapter Cable (Amazon $11)
      • Adapter: LP4 Molex Male to ATX 4 pin Male Auxiliary (Amazon $11)
      • Power Supply Extension Cable: StarTech.com 8in 24 Pin ATX 2.01 Power Extension Cable (Amazon $9)
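The per-host prices listed above can be tallied with a quick sketch. Note this only sums the items with prices shown (RAM, case, motherboard, and cooler prices are not listed), which accounts for the gap to the ~$1325 per-host estimate:

```python
# Tally the per-host component prices listed above (USD).
# Items without a listed price (RAM, case, motherboard, cooler) are omitted,
# which is why this subtotal falls short of the ~$1325 per-host total.
parts = {
    "2 x Xeon E5-2640 v2":          2 * 30,
    "Dual M.2 PCIe adapter":        15,
    "512GB NVMe (vSAN cache)":      69,
    "240GB M.2 (boot)":             35,
    "2 x Supermicro AOC-SLG3-2M2":  2 * 55,
    "2 x 2TB NVMe (vSAN capacity)": 2 * 199,
    "2TB SATA (extra space)":       50,
    "Dual 8-pin power adapter":     11,
    "LP4-to-ATX adapter":           11,
    "24-pin ATX extension":         9,
}
subtotal = sum(parts.values())
print(f"Listed-parts subtotal: ${subtotal}")  # $768
```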

Storage (NAS):


Battery Backup UPS:

Announcing my Generation 8 Super VMware Workstation!


I’ve been using VMware Workstation for many years, and it’s been a great tool to have in my home lab plan. Over the last couple of years, I had been using multiple computers to do various tasks (VMware Workstation, PC, PLEX, video editing, etc.) and they seemed to work okay. Most of these were older PCs I had around that I repurposed. However, a follower of my blog made a very generous donation of several key components. It was enough to really get me thinking about creating a SUPER Workstation box. I wanted to make an all-in-one workstation that could handle everything I currently supported, plus more. I started to evaluate what I wanted to accomplish and created a simple goal list.


  • Support for 3 different Networks:
    • I wanted to have 1GbE and 10GbE connections into my home lab
    • The other connection would be used for basic networking
  • Run VMware Workstation:
    • Support several PC VMs
    • Support nested ESXi/vSAN 8 VMs, without having to force the install
    • Have enough resources to run everything from one box
  • Run PLEX Server
    • I wanted to make sure I had enough HDD space to run my PLEX media
  • Video Editing
    • Support x16 PCIe Video Card
    • From time to time I create videos, and I wanted a system that could be performant for this work.

From there I inventoried the parts I already had, chose new components, and created my Generation 8 VMware Workstation.


To better clarify some of my component choices I created this video and announced my DREAM #vmware #workstation

I’m sure it will be updated as time goes on, and my most recent Bill of Materials (BOM) can be found here: https://vmexplorer.com/home-lab-bom/

Working with Dell T7820 Disk trays


As part of my Generation 8 ESXi/vSAN ESA Home Lab build, in this video I go over how to work with the T7820 disk trays. They can be a bit tricky to work with, especially when adding or removing the 2.5″ carrier.

First Look GEN8 ESXi/vSAN ESA 8 Home Lab (Part 1)


I’m kicking off my next generation home lab with this first look into my choice for an ESXi/vSAN 8 host. There will be more videos to come as this series evolves!

Dell T7820: CPU Upgrade Issues *Solved*


Quick Read: Did you buy a used barebones Dell T7820, or upgrade your CPU from Xeon Silver to Gold, and now the T7820 won’t boot? Are you getting the Memory/RAM failure error code (2 amber blinks followed by a short pause, 4 white blinks, long pause, then repeats)?

Solution: Ensure your BIOS is at version 2.6.3 or later.

More Details:

I recently purchased 3 used Dell Precision T7820 Workstations from eBay. They will be the replacements for my next generation Home Lab with vSphere 8. These used T7820s are barebones systems that didn’t come with a CPU or RAM. Other than an Nvidia video card, they were pretty much empty. After inserting a Xeon 6252 and RAM modules, they all powered on but only one would POST. The other two simply gave me a blink code (2 amber blinks followed by a short pause, 4 white blinks, long pause, then repeats). It was an odd error, as their manuals stated they supported the Xeon Gold 6252 and 2933MHz RAM.

Looking at the Dell 7820 manual, I soon found a Dell code description that matched the blink code: Memory/RAM failure. I swapped the known-working RAM and CPU between the systems, but still the errors persisted. So at this point I knew my RAM and CPU were compatible with the T7820, but why would only one of the three work?

I then put my focus on the one system that worked. I first checked its BIOS level: it was running Dell BIOS 2.6.3, and funnily enough the BIOS release notes specifically called out fixing a memory speed issue. It read: ‘Supports memory speed of 2933 MHz with two memory modules per channel’

It appears that when the T7820 POSTs, it queries the CPU for its max memory speed rating. In my case, the Xeon 6252’s max memory speed is 2933MHz. So even if I put in 2666MHz RAM, it still will not work, because the BIOS is asking the CPU for its max speed rating. Dell BIOS 2.6.3 fixes this issue, allowing faster memory speeds to be supported.
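The behavior described above can be sketched as a small compatibility check. This is purely illustrative and not Dell’s actual POST logic; the BIOS versions and speed figures are the ones from this post:

```python
# Illustrative model of the POST behavior described above: the firmware
# queries the CPU's max rated memory speed, and older BIOS revisions fail
# on speeds they don't support -- regardless of which DIMMs are installed.
def post_memory_check(bios_version: tuple, cpu_max_mem_mhz: int) -> bool:
    # Dell BIOS 2.6.3 added 2933 MHz support (per its release notes);
    # assume earlier revisions top out at 2666 MHz.
    supported_mhz = 2933 if bios_version >= (2, 6, 3) else 2666
    return cpu_max_mem_mhz <= supported_mhz

# Xeon Gold 6252 reports 2933 MHz: fails on BIOS 1.7.1, passes on 2.6.3+.
print(post_memory_check((1, 7, 1), 2933))   # False -> blink code
print(post_memory_check((2, 6, 3), 2933))   # True
# Xeon Silver CPU reporting 2666 MHz: POSTs even on the old BIOS.
print(post_memory_check((1, 7, 1), 2666))   # True
```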

Knowing this, the fix should be simple: all I need to do is update my BIOS to 2.6.3 or later. However, the issue is I can’t update or even check the BIOS version on my 2 non-working systems. To do this I’m going to need a Xeon CPU whose max RAM frequency is slower than 2933MHz, which is something I don’t have. Enter my eBay seller oztech llc. to the rescue!

I’ve heard of so many bad experiences when working with eBay sellers, but working with the eBay vendor oztech llc has been an absolute pleasure, and I would highly recommend them. They are very responsive, helpful, and knowledgeable about the products they sell. They had not heard of this issue before but were willing to help, and after a short call explaining my issue, they promptly shipped out a Xeon Silver 4114 CPU that supported a max RAM speed of 2666MHz, along with matching RAM. This should allow me to power on my T7820s and update to the latest Dell BIOS (2.29.0).

When the Xeon Silver CPU arrived I did the following:

  • Prepare the T7820
    • Removed the existing Xeon Gold CPU and 2933 Mhz RAM
    • Plugged in the power to the system and then unplugged it (don’t skip this step)
    • Cleared the CMOS
    • Installed the Xeon Silver CPU and 2666Mhz RAM in Slot one
    • Plugged in the T7820
    • It power-cycled about 4 times as it adjusted the system settings, and then it booted!
  • Confirm Current BIOS Level
    • During boot I pushed F12 and went into setup
    • Confirmed its BIOS level to be 1.7.1
  • Updating the BIOS to 2.29.0
    • The T7820 BIOS update is designed to work with Windows 10.
    • I booted to Windows 10, and ran the 2.29.0 BIOS update
    • The update confirmed it was currently at 1.7.1
    • I ran the BIOS update
    • Once it completed, I checked the T7820 BIOS, but it came up as 2.6.3
    • I ran the BIOS update again, it confirmed 2.6.3 was current and it would update the T7820 to 2.29.0
    • After a reboot I confirmed the BIOS to be 2.29.0
  • Finally, installing the Xeon Gold CPU
    • Powered down the T7820, removed the Xeon Silver CPU and RAM
    • Plugged in the power then removed, and cleared the CMOS
    • Inserted the Xeon Gold CPU / 2933MHz RAM, powered on, and allowed it to adjust for the new components (it reboots multiple times)
    • Went into setup and confirmed it saw the Xeon Gold CPU
  • I repeated this process on my other T7820, except its BIOS was at 1.4.1.
  • I had no issues with either system after I completed this process.

Why this odd BIOS update behavior? Though I could not find any information in the 2.29.0 release notes, I can only assume that it requires systems to be on 2.6.3 before they can update. This would explain why it updated the system to 2.6.3 first and then updated it to 2.29.0.
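If that guess is right, the update path can be modeled as a simple two-hop chain. The 2.6.3 "gate" release is my assumption from the observed behavior, not anything documented by Dell:

```python
# Hypothetical model of the staged update observed above: the 2.29.0
# installer appears to require 2.6.3 as an intermediate step, so systems
# below that version take two update runs.
def bios_update_path(current, target=(2, 29, 0), gate=(2, 6, 3)):
    """Return the sequence of versions the updater steps through."""
    path = []
    if current < gate:
        path.append(gate)    # first run only brings the system to the gate
    if current < target:
        path.append(target)  # second (or only) run reaches the target
    return path

print(bios_update_path((1, 7, 1)))  # [(2, 6, 3), (2, 29, 0)] -- two runs
print(bios_update_path((2, 6, 3)))  # [(2, 29, 0)] -- single run
```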

It took a bit of trial and error to figure out this issue, but with the great help of oztech and their all-out willingness to ensure my satisfaction, I now have 3 working T7820s. Next I’m off to install ESXi 8 and update my Home Lab, but that, my readers, will be a different blog post.

Installing Intel U.2 Optane Disk using an adapter


In this quick video I install an Intel U.2 Optane disk into my Windows system using the StarTech U.2 to PCIe Adapter, and then speed test it with ATTO. I’ll be using these drives and adapters in my Generation 8 Super VMware Workstation that I am currently working on. Note that the drive I show being installed onto the StarTech card is different from the one I ran ATTO with.

** Please do keep in mind that the average PCIe slot has a 25 Watt Power Rating. Some systems have a higher wattage rating and some could be lower. The P5800X draws about 30 Watts and the P4800X about 23 Watts. Take this into account when using this adapter and plan accordingly. **
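The warning above boils down to simple arithmetic; a quick budget check (using the 25 W slot figure and drive wattages quoted in this post) makes the point:

```python
# Quick power-budget check for U.2 Optane drives on a PCIe slot adapter.
# The 25 W slot budget and approximate drive draws are the figures quoted
# above; check your own board's slot rating before relying on this.
SLOT_BUDGET_W = 25

drives = {"P5800X": 30, "P4800X": 23}  # approximate draw in watts

for name, watts in drives.items():
    verdict = "within" if watts <= SLOT_BUDGET_W else "EXCEEDS"
    print(f"{name}: {watts} W -> {verdict} the {SLOT_BUDGET_W} W slot budget")
```

So the P4800X fits a typical slot's rating, while the P5800X is over budget and needs a slot (or adapter) rated for more power.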

Products shown in this video:

Drive that was tested with ATTO: https://ark.intel.com/content/www/us/en/ark/products/201861/intel-optane-ssd-dc-p5800x-series-400gb-2-5in-pcie-x4-3d-xpoint.html

Drive that was shown being installed on the Adapter: https://ark.intel.com/content/www/us/en/ark/products/129968/intel-optane-ssd-dc-d4800x-series-750gb-2-5in-pcie-2x2-3d-xpoint.html

StarTech Adapter: https://www.startech.com/en-us/hdd/pex4sff8639

Super VMware Workstation: Install tips for the Supermicro X11SPL-F


In this quick video I go over a few of the installation tips when mounting this motherboard into a Phanteks Enthoo Pro case. I’m looking at this motherboard as a possible candidate for my next Super VMware Workstation / PLEX Server.

Motherboard: https://www.supermicro.com/en/products/motherboard/x11spl-f

Case: https://www.phanteks.com/Enthoo-Pro.html

Super VMware Workstation: Supermicro X11SPL-F First Look and Basic Overview


In this video I give a first look at the motherboard I’m considering for my new Super VMware Workstation. I’ve got a lot of great plans for this server, and it all starts here.

Supporting 18TB drives with Intel Virtual RAID on CPU (VROC)


To complement my vSphere Home Lab, I use VMware Workstation all the time. It’s great for quickly spinning up VMs and running nested ESXi. My current workstation (see specs here) was starting to show its age and wasn’t keeping up with my needs. To replace it, I recently bought an ASRock Rack EPC621D8A motherboard in hopes of pairing it with an Intel Xeon 6252 CPU, 256GB RAM, a GTX 1650 Super video card, a Noctua NH-D9 DX-3647 cooler, and 5 x 18TB drives. My hope was to use this mobo as the foundation for my new SuperWorkstation, Plex server, and video processing box.

Looking for a mobo that will fit an LGA 3647 CPU is pretty hard. The lowest-end mobo will set you back at least $430 USD, and your choices are very limited. I chose the ASRock Rack as it checked all the boxes: lowest cost, PCIe x16 support, quad Intel NICs, onboard sound, IPMI, dual NVMe, and many other options, plus it supported the CPU and RAM I already had.

As a test, I did install ESXi 8 on it, and it worked well. However, when I started my build with Windows 11/PLEX and hooked up the 5 x 18TB HDDs, only one disk would be seen by VROC, though in Windows 11 all the disks were present. This meant I could not create a RAID 5 group with VROC.
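For context on what that RAID 5 group would have provided, the usable capacity works out as follows (standard RAID 5 math, not anything VROC-specific):

```python
# Usable-capacity sketch for a RAID 5 group: with n drives, one drive's
# worth of space goes to parity, so usable space is (n - 1) * drive size.
def raid5_usable_tb(drive_count: int, drive_tb: int) -> int:
    if drive_count < 3:
        raise ValueError("RAID 5 needs at least 3 drives")
    return (drive_count - 1) * drive_tb

# 5 x 18TB drives -> 72 TB usable, 18 TB consumed by parity.
print(raid5_usable_tb(5, 18))  # 72
```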

I sent my findings to ASRock support looking for help. Soon after, the board’s video output stopped working, so I had to replace the mobo. However, the new ASRock replacement board had that same issue with the 18TB drives, so at that point I followed up with ASRock support on the issue.

In the meantime, I decided to start working with the Supermicro X11SPL-F. This mobo has a similar price point but is lacking in features when compared to the ASRock Rack: it only supports dual Intel NICs, its slots look like true x16 PCIe but are only a disappointing x8, there is no onboard audio, and it has just a single NVMe slot. The good news was that a current BIOS update stated it supported my larger 18TB drives. I hooked up the Supermicro mobo, updated its BIOS, and sure enough all the 18TB drives showed up. Next, I installed Windows 11, and with some quick Intel INF updates everything was accounted for and working.

Soon after, I updated ASRock support with my findings. Both the Supermicro and the ASRock Rack use the Intel C621 with VROC, and without the VROC update to the ASRock mobo it just won’t support those larger drives. The last ASRock mobo BIOS update was from 2019, and its VROC was clearly out of date. I’m hoping this information helps them reconsider updating their BIOS, which should allow for larger HDD support. I got a response back very quickly, and they plan to look into it. Additionally, throughout this issue the ASRock support person was very responsive, friendly, and communicated well through email. Kudos to their support team.

*Update Jan-01-2023 – I got word from ASRock support that they are looking into updating their BIOS and VROC. They asked me to test it out and I will be doing that shortly.*

*Update Jan-29-2023 – Over the past few weeks I’ve been testing the ASRock beta BIOS, and it seems to be working perfectly with my 18TB drives. Last I checked their product page, they had not released an update yet.*

Thanks for reading, and please do post up a comment!