
vSphere Lab Build Out – The Domain Controller Deployment

When building out a lab, the first thing I do is stand up a Domain Controller and DNS server.  I can then use AD for credential management, and the DNS functionality is helpful as well.  I also use that server to host an iSCSI target for my hosts.

1. Virtual Environment

The first step is to have your virtualization environment ready to go.  It’s easy enough to next-next-finish your way through the VMware Workstation install, so I won’t detail out those steps.

2. Download Windows ISOs

You can download the Server 2019 ISO here: https://www.microsoft.com/en-us/evalcenter/evaluate-windows-server-2019

Select ISO, fill out the required info, and then hit Continue.  Select your language, and then start the download.
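
Before moving on it’s worth making sure the ISO came down intact.  Here’s a minimal Python sketch that computes the file’s SHA-256 hash so you can compare it against a known-good checksum (or at least re-check the file after copying it around); the file path is an assumption, so point it at wherever you saved the ISO.

    import hashlib

    # Assumed path -- change this to wherever you saved the Server 2019 ISO.
    iso_path = r"C:\ISOs\WindowsServer2019-Eval.iso"

    sha256 = hashlib.sha256()
    with open(iso_path, "rb") as f:
        # Hash in 1MB chunks so the multi-GB ISO never has to fit in memory.
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            sha256.update(chunk)

    print(sha256.hexdigest())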

3. Create the Lab Domain Controller VM

  1. In VMware Workstation press Ctrl+N to open the New Virtual Machine Wizard, make sure Typical is selected, and then click Next
  2. Select the option for Installer Disc Image File, browse to the location where you downloaded the Server 2019 ISO, and then click Next
  3. Since this will be using the evaluation license, leave the product key blank, enter a name and password, and then click Next.
  4. Accept the prompt about not having a product key
  5. Enter the name and location for the VM, and click Next again
  6. Use the default hard drive size of 60GB (another drive will be added later for the iSCSI target storage), and click Next
  7. Click Customize Hardware…
  8. Set the VM hardware
    1. Set the CPU and RAM to what you’d like.  I used 2 vCPUs and 8GB RAM on my VM.
    2. Change the Network Adapter to Bridged
    3. Click Close
  9. Uncheck the box for Power on this virtual machine after creation and click Finish.
  10. Now to add the hard drive for the iSCSI target and remove the floppy drive.  In the Library view, right-click the VM and click Settings
    1. Find the Floppy drive and click Remove (NOTE: If you don’t remove the floppy drive the OS install will encounter an error and fail), then click Add
      1. Select Hard Drive and click Next
      2. Leave the default disk type (mine happens to be NVMe) and click Next
      3. Leave the default option to create a new drive and click Next
      4. Enter the size for the drive (I used 750GB) and click Next
      5. Leave the default file name and click Finish
    2. Click OK to finish the hardware changes
  11. Power on the VM
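
If you’d rather script steps 8 through 11 than click through them, the VM’s .vmx file is just a text file of key = "value" pairs, and the vmrun utility that ships with Workstation can handle power operations.  Below is a rough Python sketch; the paths are assumptions for my layout, and the .vmx key names reflect my understanding of the Workstation config format, so sanity-check them against a VM you’ve already created before trusting the script.

    import subprocess

    # Assumed paths -- adjust for your Workstation install and VM folder.
    VMRUN = r"C:\Program Files (x86)\VMware\VMware Workstation\vmrun.exe"
    VMX = r"C:\VMs\LabDC\LabDC.vmx"

    # Assumed .vmx key names: numvcpus/memsize cover the CPU and RAM from
    # step 8.1, ethernet0.connectionType covers the bridged NIC from 8.2,
    # and floppy0.present = "FALSE" mirrors the floppy removal in step 10.1.
    settings = {
        "numvcpus": "2",
        "memsize": "8192",  # MB
        "ethernet0.connectionType": "bridged",
        "floppy0.present": "FALSE",
    }

    # Rewrite the .vmx, dropping any existing lines for the keys above.
    with open(VMX) as f:
        kept = [line for line in f if line.split("=")[0].strip() not in settings]
    kept += [f'{key} = "{value}"\n' for key, value in settings.items()]
    with open(VMX, "w") as f:
        f.writelines(kept)

    # Step 11: power the VM on ("-T ws" tells vmrun to target Workstation).
    subprocess.run([VMRUN, "-T", "ws", "start", VMX], check=True)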

4. Install the OS to the Lab DC

NOTE: While in the VM you will need to press Ctrl+Alt to release the cursor to get to your desktop
  1. While the VM is booting you might see a prompt to press a key to boot from CD.  If that happens click into the window and press a key.
  2. Select the language and keyboard settings
  3. Click Install Now
  4. When prompted to select the OS choose Windows Server 2019 Datacenter Evaluation (Desktop Experience) because we like graphical interfaces, and click Next
  5. Read through all of the license terms, and if you accept them, check the box and click Next
  6. Select the Custom install option
  7. Select Drive 0 (this should be the 60GB drive) and click Next
  8. Wait for the install to complete.  This might take some time.
  9. When the install is complete it will prompt for a password.  Set that and click Finish.
  10. The last thing to do for the VM deployment is to install VMware Tools.
    1. Log into the VM using the password set previously
    2. Right-click the VM in the Library and select Install VMware Tools
    3. Navigate to the D: drive and double-click it.  That should kick off the Autorun for the installer.
    4. Follow the defaults for the install.  Next > Next > Install > Finish and then click Yes when prompted for a reboot.
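
As a side note, vmrun can also mount the Tools installer in the guest with its installTools command, which is the scripted equivalent of the right-click in step 10.2.  A quick sketch, reusing the assumed paths from the earlier example; you still run the installer from inside the VM (or let Autorun kick it off).

    import subprocess

    # Same assumed paths as the earlier sketch.
    VMRUN = r"C:\Program Files (x86)\VMware\VMware Workstation\vmrun.exe"
    VMX = r"C:\VMs\LabDC\LabDC.vmx"

    # Mounts the VMware Tools installer ISO in the running guest.
    subprocess.run([VMRUN, "-T", "ws", "installTools", VMX], check=True)
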
The DC configuration will be detailed out in another posting in this series.

VMware lab design

I am going to be building out a lab to test out some automation tools in VMware, so I decided I’d write up a few posts detailing the process.  I’m calling this part Phase 1, where the goal will be to get two ESXi hosts, vCenter, and vRealize up and working.  After that, I need to decide whether I’ll go SRM, NSX, Horizon, or start playing with the Tanzu stuff.  For now though, vSphere and vRealize.

I put together a high-level design of what I will be building out:

(Shameless plug for Draw.io.  It’s an awesome tool for creating diagrams!)

I am running VMware Workstation on my desktop, so I’ll be running the entire lab within Workstation.  I’ll also point out that there are free eval copies of everything except vRealize Automation.  You can also register for the VMware User Group’s VMUG Advantage program and get access to 365-day trial licenses for everything except vRealize Automation.  More info on the VMUG Advantage program can be found here: https://www.vmug.com/membership/vmug-advantage-membership

My Home Lab

I recently decided to build up a PC for my home lab environment.  I know a lot of people find old rack mount servers that they use as a lab, but I didn’t want to deal with the space, power, or noise of a bunch of old servers.  Instead, I decided to build a desktop PC that could run everything I wanted.  

Here’s a list of my build, and why I selected the parts that I did.  I will point out that pricing and part availability have changed, so your mileage may vary.

1. The CPU 

AMD Ryzen Threadripper 3960X 24-Core ($1,349.99)

I chose this CPU for a few reasons.  First, the new Threadrippers can use up to 256GB RAM, so there’s plenty of room there.  Second, 24 cores.  This thing is fast!!  And third, when comparing against the other Threadripper CPUs this one was the cheapest.  I debated going with the 3970X, but I couldn’t justify the extra cost for it.

2. Motherboard

MSI TRX40 PRO WiFi Motherboard ($389.99)

When I started this build it was near the beginning of the COVID-19 pandemic, so some parts were in short supply.  One of the primary advantages of this board was that it was in stock.  Also, it has 2x PCIe 4.0 M.2 slots, it supports up to 256GB RAM, and it has PCIe 4.0 slots.

In hindsight, I wish I’d spent the extra $50 and gone with the MSI TRX40 PRO 10G Motherboard.

That’s effectively the same board, but it swaps the built-in WiFi for a 10GbE NIC.  Since a WiFi 6 adapter (if needed) can be picked up for under $50 and a 10GbE NIC is nearly $100, it’s cheaper to just go with the 10G board.  Granted, adding a WiFi 6 adapter to the 10G board would consume an extra PCIe slot.

3. CPU Cooler

Corsair H115i RGB Platinum AIO Liquid CPU Cooler ($169.99)

The Threadripper requires a water cooling solution, and since I didn’t want to mess with building a water cooling rig I went with an all-in-one (AIO) cooler.  First off, it’s important to be aware that there’s an H115i Pro and an H115i Platinum.  For the sTRX4 socket you need the Platinum version; the copper base on the Pro series is too small for the sTRX4 CPUs.

This cooler has two 140mm fans and a 280mm radiator, which is what fit best in the case I selected.  One important thing to be aware of when pairing this cooler with the MSI motherboard is that the USB power connection covers one of the RAM slots when it’s installed.  However, there’s an easy fix for this.

I got a Cerrxian 9Inch Micro USB Cable, which has a low-profile 90-degree micro USB connector, and now the cooler is connected without blocking the RAM slot.  Additionally, I used a CY 50cm 10Pin Motherboard Female Header to Dual USB 2.0 Adapter Cable to connect to the motherboard header.

I can say that this cooler is amazing!  I can run Folding@home and get the CPU up over 90°C and when I stop folding the temp is down to 50°C in seconds.

4. RAM

OLOy DDR4 RAM 128GB (4x32GB) 3000 MHz ($529.99)

The most important thing for me when looking at RAM was getting 32GB DIMMs.  That way I’d be able to get to the full 256GB the CPU and motherboard support.  I ended up with this OLOy RAM because it was cost effective.  There are options with higher clock speeds, but I’m more concerned with memory capacity than speed.

5. Storage

Seagate Firecuda 520 2TB Performance Internal Solid State Drive SSD PCIe Gen4 X4 NVMe ($397.99 for 2TB, and $252.63 for 1TB)

I ended up going with two of these.  One 2TB drive, and a 1TB drive.  I have my OS and applications on the 1TB drive, and my VMs on the 2TB drive.  These drives are PCIe Gen4 drives, so they are stupidly fast.

6. GPU

XFX Rx 5700 XT Raw II ($379.99)

The GPU market is rapidly changing, but at the time of this build this card was one of the few PCIe 4.0 cards available.  I’m not a big 3D gamer, so I didn’t need the greatest GPU on the market.  This card seemed to be a good balance between cost and performance.

7. Case

CORSAIR CARBIDE SPEC-05 Mid-Tower ($66.23)

I didn’t want to spend a huge amount on one of the fancy RGB cases.  This one has enough room for the water cooler radiator, and room for three 120mm exhaust fans (two top, and one rear).  Coming from a full ATX case I like the smaller size, but I found it a tight fit between the exhaust fans and some of the motherboard connections.

8. Power Supply

EVGA 850 GQ, 80+ GOLD 850W ($169.99)

It’s an 850 Watt modular power supply.  It has two 8-pin CPU connectors.  All in all,  it fits what I needed.

9. Exhaust Fans

When I built this I ended up using a 3-pack of Thermaltake Pure Plus 12 RGB TT Premium Edition 120mm fans.  They work well enough, but since they use the Thermaltake RGB software and the water cooler uses the Corsair software, I wish I’d gone with the Corsair ML120 PRO 120mm fans.  Then they’d all be controlled by the same software.

Accessories:

There were three additional things that I added when I completed this build.  The first was a UPS.  I went with the APC Sine Wave UPS Battery Backup & Surge Protector (BR1500MS).  This unit can support the 850W PSU (and a few other devices), and it has a USB port to trigger a shutdown in the event of a power loss.  I’ve had issues in the past with brownouts, and in some cases I’ve had components damaged by power fluctuations, so I’m happy with this.

The second item was an external storage array.  I used to run internal RAID sets, but when a drive failed it was always a pain to figure out which specific drive had failed, remove it, and RMA it.  To solve that problem I added a 4-bay NAS and loaded it with some old drives from my previous PC.  I selected a QNAP TS-453Be-2G-US 4-Bay Professional NAS because it has front-accessible hot-swappable drives, it’s expandable, and QNAP has a number of apps that can run natively on the appliance.

One of the apps that I can run on the QNAP is Plex.  Since Windows 10 dropped Media Center, I needed to find a new way to get my over-the-air TV recordings (Skol Vikings!).  I decided to go with a SiliconDust HDHomeRun HDHR5-2US Connect Duo Dual Tuner and tie that in with Plex on the QNAP.

In some upcoming posts I’ll detail out what I’m running in the lab, and how I deployed the different environments.