jibbajabba wrote: » I always had fully blown servers which were doing my head in - you could usually hear them buzzing in my garage from down the street. I wanted a setup with multiple servers that doesn't cost £$£$ in electricity. First of all, I know nested setups work, but I don't like 'em. Over the last year I "collected" a few Microservers. I love these little things, so I used them as the sole hardware for my lab. Here's the lab from "left to right" (currently vSphere until I've got my DCA / IAAS, then changing to Citrix):

1. N40L - running vSphere 5.1 (standalone)
   - Boot from USB
   - 16GB of RAM (KVR1333D3E9SK2/16G)
   - Adaptec 3805
   - 4x 2TB SATA 3.5" (3x 2TB RAID 5 + hot spare)
   - In the 3.5" optical bay, via a 4x 2.5" enclosure: 3x 300GB 10k SAS 2.5" (2x 300GB RAID 1 + hot spare) and 1x 750GB SATA/SSD hybrid 2.5" (standalone, for host cache, ISOs, patches)
   - Hosting the VMs required to run the whole lot:
     1. DC (2008R2 Core)
     2. SQL (2008R2 + SQL 2008R2 Standard)
     3. FreeNAS (presenting local SATA storage as NFS / iSCSI - see the mount sketch after this post)
     4. WEB (CentOS 6 - dev web)
     5. Veeam (2008R2 + Veeam B&R Free)
2. N40L - running Server 2008R2 with the vCenter Server 5.1 suite + PowerCLI etc.
   - Boot from SSD (128GB)
   - 8GB of RAM
3. N40L - running vSphere 5.1, HA / DRS cluster (shared storage coming from FreeNAS)
   - Boot from USB
   - 16GB of RAM (KVR1333D3E9SK2/16G)
4. N40L - running vSphere 5.1, HA / DRS cluster (shared storage coming from FreeNAS)
   - Boot from USB
   - 16GB of RAM (KVR1333D3E9SK2/16G)

I've got an N36L as well, but that is (probably) dead; if fixed, it becomes a dedicated storage server.

As for performance: good enough. Memory is always the bottleneck, the CPU never really is. The standalone box usually sits at around 50% when running updates of some sort, but apart from that it is a lot lower - the maximum I have seen is 90% when I am "doing stuff". As for memory, those four essential VMs total up to 14GB, but the Veeam and web servers are only on when needed, and I am sure I could even lower the SQL server's RAM (currently 6GB).
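For anyone wanting to copy the FreeNAS part of that setup, here is a minimal PowerCLI sketch of mounting the FreeNAS NFS export as a shared datastore on the two cluster hosts. The vCenter address, host names, FreeNAS IP and export path are all placeholders I've assumed, not details from the post above.

# Mount a FreeNAS NFS export on both cluster hosts as a shared datastore.
Connect-VIServer -Server vcenter.lab.local          # placeholder vCenter

$nfsHost = "192.168.1.50"                           # FreeNAS box (assumption)
$nfsPath = "/mnt/tank/vmstore"                      # NFS export (assumption)

foreach ($esx in Get-VMHost -Name "esx3.lab.local","esx4.lab.local") {
    New-Datastore -VMHost $esx -Name "freenas-nfs01" -Nfs `
        -NfsHost $nfsHost -Path $nfsPath
}

Mounting the same export under the same datastore name on every host is what lets HA / DRS treat it as shared storage.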
Essendon wrote: » I have an HP DL380 G5 workhorse with 32GB RAM, dual quad-core Xeons and about 1.5TB of SAS storage. I have been running a nested ESXi lab on it for about a year and a half now. I only use one power supply. Works like a treat, no performance problems ever. Best thing about this setup? It cost me just $150.
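If you want to run nested ESXi like this on vSphere 5.1, the guest VM needs hardware-assisted virtualization exposed to it. A minimal PowerCLI sketch follows; the VM name is a placeholder, and this is just one commonly used way to set the flag (on 5.0 the host-wide switch was vhv.allow = "TRUE" in /etc/vmware/config instead).

# Expose hardware-assisted virtualization to a nested ESXi VM (vSphere 5.1).
Connect-VIServer -Server vcenter.lab.local          # placeholder vCenter

$vm = Get-VM -Name "nested-esxi-01"                 # placeholder VM name
# Adds vhv.enable = "TRUE" to the VM's .vmx; takes effect on the next power-on.
New-AdvancedSetting -Entity $vm -Name "vhv.enable" -Value "TRUE" -Confirm:$false

The nested ESXi VM also needs virtual hardware version 9, and the outer vSwitch has to allow promiscuous mode if the VMs running inside it are to reach the network.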
JBrown wrote: » Essendon - just to be clear, is that what you used for your VCAP5-DCA / DCD labs? I think kriscamaro68's setup is quite nice. I am actually considering the following for the VCAP-DCA / VMware View exams: 3x Dell PowerEdge C1100 1U, 2x Xeon QC L5520 2.26GHz, no HDD, 72GB DDR3, tested (eBay), running vSphere 5.1 booted from USB flash, plus 1x Synology 413 with 4x Samsung 840 250GB. I hope it can handle the VCAP and View stuff.
MrAgent wrote: » My VMware lab machine is actually much different from what I posted. I have an Intel i7 970, 32GB RAM, 4x 1TB storage and a Fusion-io ioDrive2 Duo 1.2TB. I don't remember the motherboard, but it does PCI passthrough, which is what I really needed when doing my labs.
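Since PCI passthrough (DirectPath I/O) was the deciding factor there, here is a rough PowerCLI sketch of handing a card like the ioDrive2 straight to a VM. Host, VM and device names are assumptions; the device first has to be marked for passthrough on the host (and the host rebooted), and the VM needs a full memory reservation for passthrough to work.

# Attach a passthrough-enabled PCI device to a lab VM.
Connect-VIServer -Server vcenter.lab.local          # placeholder vCenter

$esx = Get-VMHost -Name "esx1.lab.local"            # placeholder host
$vm  = Get-VM -Name "lab-vm-01"                     # placeholder VM

# Pick the Fusion-io card from the PCI devices already enabled for passthrough.
$dev = Get-PassthroughDevice -VMHost $esx -Type Pci |
       Where-Object { $_.Name -match "ioDrive" }

Add-PassthroughDevice -VM $vm -PassthroughDevice $dev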
JBrown wrote: » An ioDrive2 is like $15K - how did you manage to get your hands on a beauty like that one?
santaowns wrote: » HP DL380 G4
MrAgent wrote: » For all those that are building labs and wondering what would make a good one, I found this the other day and thought I would share it: Another VMware home lab | Neil Koch
datgirl wrote: » I like his mention of nested virtualization, and the link to Building the Ultimate vSphere Lab – Part 1: The Story.
QHalo wrote: » That motherboard has a NIC that's not on the HCL and requires a custom driver injected into the ESXi installation:
[H]ard|Forum - View Single Post - Intel 82579LM and 82579V
Intel 82579LM and 82579V - Page 10 - [H]ard|Forum
Enabling Intel NIC (82579LM) on Intel S1200BT under ESXi 5.1 | Virtual Drive
I'm sure you're aware, but if not, now you are.
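For reference, injecting a driver like that is usually done with the PowerCLI Image Builder cmdlets. A minimal sketch, assuming the stock 5.1 offline bundle and a community driver package; the depot paths, profile name and VIB name below are placeholders I've assumed, not taken from the links above.

# Clone the stock 5.1 image profile, inject a custom NIC driver, export an ISO.
Add-EsxSoftwareDepot "C:\depot\VMware-ESXi-5.1.0-799733-depot.zip"   # stock offline bundle (assumption)
Add-EsxSoftwareDepot "C:\depot\net-e1000e-custom.zip"                # custom 82579 driver package (assumption)

$profile = New-EsxImageProfile -CloneProfile "ESXi-5.1.0-799733-standard" `
    -Name "ESXi-5.1.0-custom-nic" -Vendor "homelab"

# Community-built VIBs usually need the acceptance level lowered first.
Set-EsxImageProfile -ImageProfile $profile -AcceptanceLevel CommunitySupported

Add-EsxSoftwarePackage -ImageProfile $profile -SoftwarePackage "net-e1000e"   # VIB name (assumption)

Export-EsxImageProfile -ImageProfile $profile -ExportToIso `
    -FilePath "C:\depot\ESXi-5.1.0-custom-nic.iso"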