How to reorganize my VMware lab?
Asif Dasl
Member Posts: 2,116 ■■■■■■■■□□
Hello all!
First off, sorry for the wall of text, but I want to reorganize my VMware lab... I think I've got a good lab at the moment, but the problem is that some parts are left over from previous labs, from back when it was more difficult to build a whitebox VMware server. This is my current setup:
Asus Z87-PRO, Intel i7-4770T, 32 GB DDR3-1600, 500GB SSD, 9 NICs = Left-vSphere
Asus Z87-PRO, Intel i7-4770T, 32 GB DDR3-1600, 500GB SSD, 9 NICs = Right-vSphere
Asus M4A785-M, AMD Phenom II X6 1075T, 8GB DDR2-1066, 1TB HDD, 500GB SSD, 5 NICs = vCenter Server/iSCSI target
Intel Atom 300, Low Powered, 2GB DDR2-800, 500GB HDD = Domain Controller
Netgear ReadyNAS PRO 4 - iSCSI/NFS/CIFS/FTP, Dual NICs, 4 x 4TB
HP v1910-48G (JE009A), layer 3 routing capable, 48 port GigE switch
Belkin Omniview SOHO 4-Port dual monitor KVM (F1DH104LEA)
But... I think it could be better organised and use less space and electricity, so I'm thinking of selling parts off and moving from a physical lab to a nested lab, utilising my current Corsair 600T case and water-cooling setup. This is what I'm thinking of moving to:
1 x SuperMicro MBD-X10SRL-F single processor LGA-2011-3 motherboard
1 x Intel Xeon E5-2630 v3 CPU - LGA-2011-3 8-core hyperthreaded
6 or 8 x SK-Hynix 16GB HMA42GR7MFR4N-TF = 96GB or 128GB of DDR4 RAM
3 x Existing 500GB SSDs as datastores
Not sure how many NICs to give it??
1 x QNAP TS-670 + 4 x 4TB + SSD caching
1 x Used Cisco 3750-G 24 port switch
I reckon that if I sell all of my current hardware, I can build a single great nested VMware server. I was going to "downgrade" the switch to Cisco (fewer ports, but better functionality?) and upgrade the NAS to a QNAP for VAAI support and all of the other features that NAS has too...
Money-wise I think it'll work out roughly neutral - trading four machines for one great machine. I'll use my laptop to connect to the lab remotely. I'm just wondering if anyone can see any flaws in my plan?
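On the electricity point, a rough back-of-the-envelope comparison suggests the consolidation should pay off - the idle wattages below are my assumptions for illustration, not measured figures:

```python
# Rough idle-power comparison; all wattages are assumed, not measured.
old_hosts = 60 + 60 + 80 + 25   # two i7 boxes, the AMD box, the Atom DC (watts)
new_host = 120                  # single Xeon E5-2630 v3 nested host (watts)

kwh_saved_per_year = (old_hosts - new_host) * 24 * 365 / 1000
print(round(kwh_saved_per_year))  # → 920 kWh/year under these assumptions
```

At typical domestic rates that's a noticeable saving each year, even before counting the NAS swap.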
How many NICs would you give to the nested machine? Is it a bad idea to change the HP switch for a used Cisco switch? Would the Cisco 3750-G be better suited for learning NSX? Should I wait until vSphere 6 is released for compatibility with the new motherboard?
Any thoughts? I'm in two minds about whether to change the current setup, but if I do it, it has to be now, before Christmas...
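For reference, the nested side itself doesn't need anything exotic - it mostly comes down to exposing hardware-assisted virtualization to the guest ESXi VMs. A minimal sketch of the two .vmx entries usually involved, assuming vSphere 5.1 or later: `vhv.enable` passes Intel VT-x/EPT through to the guest, and `guestOS` marks the guest as ESXi 5.x.

```
vhv.enable = "TRUE"
guestOS = "vmkernel5"
```

The same passthrough can also be ticked in the Web Client as "Expose hardware assisted virtualization to the guest OS"; either way the physical CPU needs EPT support, which the E5-2630 v3 has.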
Comments
Architect192 Member Posts: 157 ■■■□□□□□□□
I just did a similar cleanup. I sold my two hosts (desktops with 16GB RAM) and replaced them with a Supermicro board with 64GB of RAM for nesting my lab (working on VCAP-DCA). Value-wise, HP switches are great (I just bought a 1910-24G), but short of a 3750G there's not much I would change it for. I would LOVE to have a Cisco at home for actual "production" usage (I had a huge Cisco lab of older gear, but I'm not interested in running at 100Mbps).
Look into Synology for your storage as well. I moved from QNAP to Synology and I'm very happy with the device. QNAP was great, but the software is nicer on the Syno IMHO.
Wait for vSphere 6? That's an option... but then you'll just be waiting for the next thing, so plan ahead, do your homework, and buy for your current needs.
Current: VCAP-DCA/DCD, VCP-DCV2/3/4/5, VCP-NV 6 - CCNP, CCNA Security - MCSE: Server Infrastructure 2012 - ITIL v3 - A+ - Security+
Working on: CCNA Datacenter (2nd exam), Renewing VMware certs...
Asif Dasl Member Posts: 2,116 ■■■■■■■■□□
Thanks for the reply - interesting that you did the same thing for VCAP-DCA.
I had a look at the Synology live demo and the QNAP live demo, and I agree the Synology has nicer software. But I think I'm going to stick with the QNAP, as it has double the write speed of a DS1513+ - that, and it has one more drive bay, though it's much more expensive.
I was thinking of going Cisco because I've been meaning to work through the CCNP SWITCH material for a long time (maybe learn the material but not take the exam - it seems pointless to take one CCNP exam). I also have 2 x 3550s, so if I got the 3750-G, and maybe picked up a 24 port 3750-G at some point in the future, I would have a great CCNP SWITCH lab as well.
I think I'll do this in stages - upgrade the hosts, then the NAS, then the switch. That way I'm less likely to mess something up, and if I scale back, it's done in order of priority.
jibbajabba Member Posts: 4,317 ■■■■■■■■□□
The 3750G is a nice switch, just pricey... I'm not sure what the requirements are from an NSX perspective, but since you're supposed to do everything in NSX and just need hardware as the upstream, why not go Cisco, but affordable and fanless? SG300.
My own knowledge base made public: http://open902.com
Asif Dasl Member Posts: 2,116 ■■■■■■■■□□
Well, this is the thing: I think I could get at least €350 for the v1910-48G, a Cisco SG300-28 costs about €415, and a Cisco WS-C3750G-24TS-S costs about €450 used on eBay. I know NSX uses VXLANs, which the 3750-G does support, but the SG300 doesn't, nor does the v1910.
I have no idea if I'll ever "play" with this, but for the money, as long as the used switch doesn't fail on me, maybe it's the better option. Cisco rates the MTBF on the 3750-G at 180,000 hours, which is 7,500 days, or about 20 years - nice over-engineering there!! It could be loud, which is the only problem I foresee - well, that, and maybe that I don't know what I'm doing!! haha
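The MTBF arithmetic checks out, for what it's worth - a quick sanity check on Cisco's quoted 180,000-hour figure:

```python
# Sanity-check the quoted MTBF figure for the 3750-G.
mtbf_hours = 180_000

days = mtbf_hours / 24        # 7,500 days
years = days / 365.25         # about 20.5 years

print(int(days), round(years, 1))  # → 7500 20.5
```

Of course MTBF is a statistical fleet average, not a promise about any one used eBay switch, so a spare PSU plan wouldn't hurt.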