
VMware Study Lab

SilverGenius Member Posts: 56 ■■□□□□□□□□
Ok, so those of you that use VMware as a lab setup, I have some questions. I will mostly be focusing on a Windows domain and the different technologies involved, so I need to set up a few servers and workstations, etc.

Do you have a desktop or laptop and use Workstation and/or VirtualBox, or do you have a dedicated ESXi box? Why have you gone with one setup over the other?

My work currently provides me with a laptop that I have Workstation installed on. It's a 2.0GHz Core2Duo with 3GB of RAM. I installed Server 2008 in a VM the other day, and with just Outlook open, browsing the web, and doing basic work, my laptop was so slow. I do have a newer laptop coming soon, but I think I would have to do some out-of-pocket upgrades to prevent bottlenecks.

1. Do I upgrade the new laptop's memory to 8GB and get an SSD, and hope that it runs Workstation better?

Or

2. Do I find or build a server and put ESXi on it? I could then use my laptop and connect to the different machines, theoretically from anywhere if I have a VPN set up, correct? If I go this route, then I can probably also put a media server on it and use it to back up some of my home stuff.

What do you guys think?

Comments

  • Akaricloud Member Posts: 938
I think this mostly depends on what you're wanting to use it for. If you're just looking to run one VM, or maybe two at a time, then upgrading your laptop might make the most sense.

On the other hand, if you're looking to run a lot of VMs at once, then build something dedicated. Working remotely with this might not be the best experience, depending on your internet connection at both ends, but internally it should be great.

Personally, I'm going to be building a dedicated box just for labbing and playing around with. Something that I'm not going to quickly outgrow and will actually be able to use for my household server needs.
  • goll Member Posts: 9 ■□□□□□□□□□
    1. Do I upgrade the new laptop's memory to 8GB and get an SSD, and hope that it runs Workstation better?

    Just out of curiosity, what model is this laptop?
  • estamand Member Posts: 13 ■□□□□□□□□□
    I have access to laptops, so I have a XenServer setup on an i7 laptop with 8GB of RAM. Works for my needs.

    I will be setting up ESXi 5 on a 2nd laptop to get familiar with different environments.
  • SilverGenius Member Posts: 56 ■■□□□□□□□□
    goll wrote: »
    Just out of curiosity, what model is this laptop?
    I think they are like HP EliteBook 8440s or something like that.
  • SilverGenius Member Posts: 56 ■■□□□□□□□□
    Akaricloud wrote: »
    I think this mostly depends on what you're wanting to use it for. If you're just looking to run one VM, or maybe two at a time, then upgrading your laptop might make the most sense.

    On the other hand, if you're looking to run a lot of VMs at once, then build something dedicated. Working remotely with this might not be the best experience, depending on your internet connection at both ends, but internally it should be great.

    Personally, I'm going to be building a dedicated box just for labbing and playing around with. Something that I'm not going to quickly outgrow and will actually be able to use for my household server needs.

    The more I think about it, the more I would like to build a dedicated box; I think I would have too much stuff for the laptop to handle.
  • Akaricloud Member Posts: 938
    The more I think about it, the more I would like to build a dedicated box; I think I would have too much stuff for the laptop to handle.

    Yeah, in my opinion it's better to start fresh and leave room for expansion rather than spend money trying to max out what you already have. In the future, if you want to run another couple of VMs, you'll be happy that you chose that route.
  • jmritenour Member Posts: 565
    For what it's worth, I found that VirtualBox handled multiple VMs at once way better than VMware Workstation. I found the bottleneck for VMware was disk I/O, so I'd think an SSD would probably make the most impact, but that's just my personal opinion based on one particular piece of hardware I was using (XPS M1530, 4GB of RAM, Core 2 Duo, running around 5-10 VMs). YMMV.

    That said, at this point I'm putting together a dedicated hypervisor at home for all my labbing needs. I work with ESXi & vCenter day in and day out, so I'm currently using Hyper-V - undecided if I'm going to stick with that or try Xen. It's a three-fold thing: a hypervisor will definitely handle multiple VMs better, I can connect to it from any system in my house, and most importantly, I get more hands-on time with another virtualization platform. My eventual goal is to learn Xen & Hyper-V as well as I know VMware. If I can learn them while I'm labbing other stuff, then I'm getting the most out of my time.
    "Start by doing what is necessary, then do what is possible; suddenly, you are doing the impossible." - St. Francis of Assisi
  • Everyone Member Posts: 1,661
    I think you know my setup, SilverGenius. ;) RAM is really the key; the more, the better. When creating VMs, create them with the bare minimum requirements. Your disks only really become a noticeable bottleneck when you don't have enough RAM left for everything. Your VMs will start swapping to the hard drive when they can't use RAM, and your host OS will start using its swap space too, which makes things crawl.

    The same thing happens when you try to give a single VM more RAM than your host physically has.

    CPU comes after. I prefer Intel CPUs with VT-x on a motherboard with a chipset that supports VT-d. I like hyperthreading too. In my experience, that gives the best performance for VMs.

    I really like quad core CPUs for doing this too. I had a dual core AMD system at my old job with 8 GB of RAM, running Linux as the host, and I couldn't get more than 2 VMs before the CPU started to choke. 3 was pushing it, and 4 would kill it. Plenty of RAM left, but CPU usage was pegged at 100%.

    My Core i7 860 with 8 GB of RAM runs 8 VMs, 3 of them nested, without breaking a sweat. I still have enough resources left over to play games.

    I am starting to get to the limit of my RAM though. I could upgrade to 16 GB, and probably run all the VMs I could ever want off this 1 computer, but I now have 2 VMs that I need to run 24/7, and have a couple more that I'd like to finish building that will need to run 24/7 as well, so it is time to move to a dedicated ESXi box for me.
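    The RAM-budgeting advice above can be sketched as a toy check (a rough illustration only; the host OS reserve and VM sizes below are made-up example numbers, not recommendations):

```python
# Rough sketch of the point above: the host OS plus every running VM must
# fit in physical RAM, or the guests (and the host) start swapping to disk.
# All numbers here are hypothetical examples.

def fits_in_ram(host_ram_gb, host_os_reserve_gb, vm_allocations_gb):
    """Return (fits, headroom_gb) for a planned set of VM RAM allocations."""
    used = host_os_reserve_gb + sum(vm_allocations_gb)
    return used <= host_ram_gb, host_ram_gb - used

# An 8 GB laptop reserving 2 GB for the host OS:
print(fits_in_ram(8, 2, [1, 1, 2]))      # (True, 2) -> fits, 2 GB to spare
print(fits_in_ram(8, 2, [2, 2, 2, 2]))   # (False, -2) -> 2 GB over, swapping
```

    Creating VMs with bare-minimum RAM, as suggested above, is just maximizing how many allocations fit before the headroom goes negative.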
  • jibbajabba Member Posts: 4,317 ■■■■■■■■□□
    I have a physical 2U system in the datacenter (2x E5520 / 96GB RAM) which I am using for a nested vSphere cluster ... If I need physical hosts to play with, then I throw some spare chassis into the rack ...
    My own knowledge base made public: http://open902.com :p
  • MentholMoose Member Posts: 1,525 ■■■■■■■■□□
    Everyone wrote: »
    I really like quad core CPUs for doing this too. I had a dual core AMD system at my old job with 8 GB of RAM, running Linux as the host, and I couldn't get more than 2 VMs before the CPU started to choke. 3 was pushing it, and 4 would kill it. Plenty of RAM left, but CPU usage was pegged at 100%.
    Did you check why the CPUs were pegged? It's possible the cause was actually the disk. In top this will show as 100% CPU, but there will be a high percentage for IO wait (the "wa" column). I'm not saying the dual-core CPU definitely wasn't a limitation, but in machines with standard hard drives I rarely see a CPU bottleneck. It's usually the hard drive slowing things down. The lab machine I used for a few MCITPs had a Core2 Quad Q9600 with 8GB RAM and even with a decent hard drive array (4x SATA drives in RAID 10 on an LSI controller), some tasks would really kill the disks and grind the system to a halt.
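    To illustrate the "wa" column point: on Linux, top derives its per-category CPU percentages from the aggregate "cpu" line in /proc/stat. A minimal sketch (the sample line is made up, and real tools diff two samples over an interval rather than reading lifetime totals once):

```python
# Minimal sketch of where top's "wa" (IO wait) figure comes from on Linux:
# the aggregate "cpu" line in /proc/stat. Fields, per proc(5):
# user nice system idle iowait irq softirq steal ...
def cpu_percentages(stat_line):
    """Split a /proc/stat 'cpu' line into a percentage per category."""
    names = ["user", "nice", "system", "idle", "iowait", "irq", "softirq", "steal"]
    ticks = [int(v) for v in stat_line.split()[1:1 + len(names)]]
    total = sum(ticks)
    return {n: 100.0 * t / total for n, t in zip(names, ticks)}

# Hypothetical line: a box that looks "100% busy" but is mostly waiting on disk.
pct = cpu_percentages("cpu 100 0 100 0 750 0 50 0")
print(round(pct["iowait"]))  # 75 -> the disk, not the CPU, is the bottleneck
```

    A high iowait share like this is the signature of the disk bottleneck described above, even though overall CPU usage appears pegged.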

    I've been putting SSDs in all my machines lately so nowadays I'm more likely to see actual CPU bottlenecks. The other day I booted 10 sysprep'd Windows 7 images on my SSD-equipped laptop in VirtualBox, and with all of them going through mini-setup it was killing the CPU (Core i5... only dual core though it does have hyperthreading) while the SSD was unaffected. I could boot 5 with no problem. It's really a balancing act so I wouldn't say there is one component that matters the most, but in my experience an SSD will go a long way.
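    Booting a batch of cloned VMs like that can be scripted with `VBoxManage startvm <name> --type headless`, which is VirtualBox's real CLI for starting a VM without a GUI window. The sketch below only builds the command lines (the VM names are hypothetical, and actually running the commands requires VirtualBox and VMs with those names):

```python
# Sketch of batch-booting VirtualBox VMs headless, as in the test above.
# "VBoxManage startvm <name> --type headless" is VirtualBox's CLI; the VM
# names here are hypothetical. This only *builds* the argv lists; pass each
# one to subprocess.run() once the VMs actually exist.
def startvm_commands(vm_names):
    return [["VBoxManage", "startvm", name, "--type", "headless"]
            for name in vm_names]

cmds = startvm_commands([f"win7-clone{i}" for i in range(1, 11)])
print(len(cmds))          # 10
print(" ".join(cmds[0]))  # VBoxManage startvm win7-clone1 --type headless
```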

    I've done some more extreme testing to see how far an SSD would go. I put a complete vSphere environment (three VMs: a DC, one for vCenter, and one for SQL Server) on a desktop with ESXi 4.1 and a single-core AMD64 (circa 2006), 2GB RAM, and an old/cheap (Kingston-branded Intel G1) 40GB SSD. Even with less than 1.5GB for VMs, resulting in 100% memory over-subscription (each VM had been assigned 1GB RAM), it worked fine, even with the extreme disk swapping due to heavily over-provisioned RAM. Unfortunately with only 40GB I ran out of space for additional VMs and couldn't see how far I could take it. I have a 160GB SSD sitting around waiting to be installed somewhere so maybe I'll dust off that old desktop again to see what it is capable of. :)
    MentholMoose
    MCSA 2003, LFCS, LFCE (expired), VCP6-DCV
  • SilverGenius Member Posts: 56 ■■□□□□□□□□
    Thanks for the feedback guys!
    I think I am going to get an SSD & RAM for the laptop and see what I can run. I will remove the optical drive, put the regular HDD in its place, and use it for storage space. When I get to the point of needing an ESXi box, I can build one. The SSD and extra RAM will not be wasted because they will make my everyday work experience better :)