Finally finished my low-power(ed) vSphere Lab
jibbajabba
Member Posts: 4,317
I've always had full-blown servers, which were doing my head in. You could usually hear them buzzing in my garage from down the street. I was looking for a setup with multiple servers that wouldn't cost me £$£$ in electricity.
First of all: I know nested setups work, but I don't like 'em.
Over the last year I "collected" a few MicroServers. I love these little things, so I used them as the sole hardware for my lab.
Here's the lab from "left to right" (currently vSphere until I get my DCA / IAAS, then I'll change over to Citrix):
1. N40L - Running vSphere 5.1 (Standalone)
Boot from USB
16GB of Ram (KVR1333D3E9SK2/16G)
Adaptec 3805
4x 2TB SATA 3.5" (3x2TB Raid 5 + Hotswap)
In 3.5" Optical Bay using 4x2.5" enclosure
3x 300GB 10k SAS 2.5" (2x300GB Raid 1 + Hotswap)
1x 750GB SATA / SSD Hybrid 2.5" (Standalone for Host Cache, isos, patches)
Hosts the VMs required to run the whole lot:
1. DC (2008R2 core)
2. SQL (2008R2 + SQL 2008R2 Standard)
3. FreeNAS (Presenting local SATA storage as NFS / iSCSI)
4. WEB (CentOS 6 - Dev Web)
5. Veeam (2008R2 + Veeam B&R Free)
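The FreeNAS VM above is what provides the shared NFS / iSCSI storage the cluster hosts use. For anyone wondering how an NFS export gets presented to ESXi 5.1, a minimal sketch looks like the below - note the IP address (192.168.0.50), export path (/mnt/tank/nfs-ds) and datastore name (freenas-nfs) are made-up placeholders, not my actual config:

```shell
# Run on each ESXi 5.1 host. The FreeNAS IP, export path and datastore
# name below are illustrative placeholders - substitute your own.

# Mount the FreeNAS NFS export as a datastore:
esxcli storage nfs add --host=192.168.0.50 --share=/mnt/tank/nfs-ds --volume-name=freenas-nfs

# Verify the datastore is mounted:
esxcli storage nfs list
```

The same thing can be done from the vSphere Client or PowerCLI; esxcli is just the quickest way when you're already on the host shell.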
2. N40L - Running Server 2008 R2 with the vCenter Server 5.1 suite + PowerCLI etc.
Booting from SSD (128GB)
8GB of RAM
3. N40L - Running vSphere 5.1
HA / DRS Cluster (shared Storage coming from FreeNAS)
Boot from USB
16GB of Ram (KVR1333D3E9SK2/16G)
4. N40L - Running vSphere 5.1
HA / DRS Cluster (shared Storage coming from FreeNAS)
Boot from USB
16GB of Ram (KVR1333D3E9SK2/16G)
I've got an N36L as well - that one is (probably) dead, but if I get it fixed it will become a dedicated storage server.
As for performance: good enough. Memory is always the bottleneck - CPU never really is. The standalone box usually sits at around 50% CPU when running updates of some sort, but apart from that it is a lot lower - the maximum I have seen is 90% when I am "doing stuff".
As for memory, though - those 4 essential VMs total up to 14GB, but the Veeam and web servers are only powered on when needed, and I am sure I could lower the SQL server's RAM (currently 6GB) even further.
My own knowledge base made public: http://open902.com