Does this seem reasonable for a home lab for ESXi?

gbdavidx Member Posts: 840
I want to set up my own home lab and eventually use it for backups with RAID. I probably won't need as much storage space at first and will upgrade as I go, but does this look like a reasonable setup I should shoot for?

ESXi 5 Home Lab Specs | Home Server Blog

Comments

  • Deathmage Banned Posts: 2,496
    gbdavidx wrote: »
    I want to set up my own home lab and eventually use it for backups with RAID. I probably won't need as much storage space at first and will upgrade as I go, but does this look like a reasonable setup I should shoot for?

    ESXi 5 Home Lab Specs | Home Server Blog

    This is a perfect build:

    2 x Dell R610 1U Server Dual Quad Core E5520 2.26GHz 16GB RAM PERC 6/i RAID No HDD 884116025870 | eBay (these suckers can take 192 GB of RAM)

    2 x 16 GB flash drives for internal boot on the R610's. I just install ESXi to the flash drive and run the VMs from storage on the NAS (see the pyVmomi sketch at the end of this post for the iSCSI/VLAN setup). You can plug the flash drive into a slot right behind the CPU when you open the case up, very handy!

    1 x Amazon.com: QNAP TS-420 4-bay Personal Cloud NAS, DLNA, Mobile App, iSCSI Supported: Computers & Accessories

    3 x http://www.amazon.com/WD-Red-NAS-Hard-Drive/dp/B008JJLZ7G/ref=sr_1_2?ie=UTF8&qid=1418308749&sr=8-2&keywords=2+TB+drives (RAID 5)

    1 x HP ProCurve 2848 J4904A 48 Port Switch Used 0890552632848 | eBay (make VLANs to separate your iSCSI storage ('jumbo frames'), vMotion, and production traffic; the sketch at the end of this post shows the VLAN/port-group side). This switch is about 7 years old, but it's 10/100/1000, supports jumbo frames, and they're cheap now.

    1 x Dantrak Net - New and Used Networking Equipment and Peripherals. Skeletek Racks and Accessories. (pick your 4-post rack)

    You will need about 10 or so 10 ft Cat5e cables, of course, and if you want more than 4 NICs, the expansion cards for the R610's; they're about $50 each. For a lab you can just run VLANs over a few NICs, but I personally have 8 NICs per R610, two for everything.

    This would be one of many proper solutions.
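
    If you'd rather script the network and storage prep than click through the vSphere Client, here is a minimal pyVmomi (VMware's Python SDK) sketch of the two pieces above: a VLAN-tagged port group for iSCSI and the software iSCSI initiator pointed at the NAS. Treat it as a sketch only; the host address, credentials, vSwitch name, VLAN ID, and NAS IP are placeholders you'd swap for your own.

    # Minimal pyVmomi sketch (pip install pyvmomi). Host IP, credentials,
    # vSwitch name, VLAN ID, and NAS IP below are lab placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()  # lab host with a self-signed cert
    si = SmartConnect(host="192.168.1.10", user="root", pwd="password", sslContext=ctx)
    content = si.RetrieveContent()
    host = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True).view[0]

    # 1) A dedicated port group for iSCSI traffic on its own VLAN (VLAN 20 here)
    pg_spec = vim.host.PortGroup.Specification(
        name="iSCSI",
        vlanId=20,
        vswitchName="vSwitch0",
        policy=vim.host.NetworkPolicy())
    host.configManager.networkSystem.AddPortGroup(portgrp=pg_spec)

    # 2) Enable the software iSCSI initiator and point it at the NAS target
    storage = host.configManager.storageSystem
    storage.UpdateSoftwareInternetScsiEnabled(True)
    # the software vmhba may take a moment to appear after enabling
    hba = next(a for a in host.config.storageDevice.hostBusAdapter
               if isinstance(a, vim.host.InternetScsiHba))
    storage.AddInternetScsiSendTargets(
        iScsiHbaDevice=hba.device,
        targets=[vim.host.InternetScsiHba.SendTarget(address="192.168.20.5", port=3260)])
    storage.RescanAllHba()

    Disconnect(si)

    Run it once per host; after the rescan the NAS LUN should show up under the host's storage adapters and can be formatted as a VMFS datastore.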
  • gbdavidx Member Posts: 840
    And do those consume a lot of power? The ones I was looking at were Opterons, which are usually lower-end CPUs that don't draw a lot of wattage.
  • gbdavidx Member Posts: 840
    Those are only enough for the host; I need at least one SAN server for my storage.
  • Deathmage Banned Posts: 2,496
    gbdavidx wrote: »
    Those are only enough for the host; I need at least one SAN server for my storage.

    Why would you need a SAN if you have a 6 TB NAS?

    As for power consumption, they're not the best, but my lab is personally as close to the real deal as possible. Using a NAS with iSCSI is really no different than using a SAN with iSCSI; the principle of the storage fabric is pretty much the same whether it's FCoE, iSCSI, or Fibre Channel.
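
    To the host they really do look the same: once the initiator logs in, you just get SCSI LUNs and VMFS datastores, whether the target is a QNAP or an enterprise array. Here is a quick pyVmomi sketch (placeholder host address and credentials) that prints exactly what the host sees:

    # Sketch: list the SCSI LUNs and datastores the host sees. An iSCSI LUN from
    # a NAS shows up the same way as one from a SAN. Connection details are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="192.168.1.10", user="root", pwd="password", sslContext=ctx)
    content = si.RetrieveContent()
    host = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True).view[0]

    for lun in host.config.storageDevice.scsiLun:
        print(lun.canonicalName, lun.vendor, lun.model, lun.lunType)

    for ds in host.datastore:
        s = ds.summary
        print(s.name, s.type, s.capacity // 2**30, "GB")

    Disconnect(si)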
  • gbdavidx Member Posts: 840
    Well, it doesn't matter which protocol I use; I just need a third server for the storage, utilizing some sort of RAID.
  • Deathmage Banned Posts: 2,496
    gbdavidx wrote: »
    Well, it doesn't matter which protocol I use; I just need a third server for the storage, utilizing some sort of RAID.

    You should have your local storage in, at minimum, a mirrored array. This gets into DCD territory, but you can also use a local RAID for swap.

    I mean, unless you're going for the VSAN approach, I can't see using a server as a storage platform as a good move; I hope it's at least going to be a pure storage platform with redundant power supplies.

    Lastly, what's the difference, I guess I'm trying to see, between a glorified server running storage and a NAS running storage? ... What kind of 'storage' are you using it for?
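
    On the 'local RAID for swap' point, one way to do it is to set a VM's swapfile placement to hostLocal so the .vswp lands on the host's local array instead of the shared datastore. A minimal pyVmomi sketch, assuming a VM named test-vm-01 and placeholder connection details:

    # Sketch: put a VM's swapfile on host-local storage ("hostLocal" placement)
    # instead of the shared datastore. VM name and connection details are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVim.task import WaitForTask
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="192.168.1.10", user="root", pwd="password", sslContext=ctx)
    content = si.RetrieveContent()
    vms = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True).view
    vm = next(v for v in vms if v.name == "test-vm-01")

    spec = vim.vm.ConfigSpec(swapPlacement="hostLocal")  # other values: "vmDirectory", "inherit"
    WaitForTask(vm.ReconfigVM_Task(spec=spec))

    Disconnect(si)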
  • SimonD. Member Posts: 111
    Whilst I love rack-based servers, I tend to find they don't score much in the way of WAF (Wife Acceptance Factor): they are large, noisy and, let's be fair, great for the enterprise, not so much for the home office.

    I went down the route of this - Home Lab Environment « Everything Virtual

    My shuttles cost me about £550 each (I have 3) all in, and that includes the 4-port Intel NIC as well.

    There are better shuttles out now but obviously they will be more expensive.

    I currently run vSphere 5.5 on them without too much issue (I am pretty sure I documented the build out as there was an issue with the move to 5.5 and not having the right drivers installed out of the box).

    As far as storage is concerned, I would have a look at the Synology range of NAS devices; the DS1513+ certainly has VAAI built in, and from a performance standpoint they are better than any of the homebuilt devices I have used in the past (OpenFiler, FreeNAS, Open-E DSS etc.). You can check whether the host actually sees the acceleration with the snippet at the end of this post.

    For on-the-go labs I stick with a heavily gaming-spec'd laptop (MSI GT70) with 32GB of RAM fitted as well as SSDs for storage; it's pretty good but obviously quite heavy (fine if I am driving somewhere, not so much if I am flying).

    I would also suggest looking at the recently updated AutoLab2 deployment tool.
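
    If you want to confirm the host actually offloads to the array, here is a small pyVmomi sketch (placeholder connection details again) that prints the VAAI/vStorage support flag for each block device the host sees:

    # Sketch: print each storage device's VAAI (vStorage) hardware-acceleration
    # status as reported by the host. Connection details are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    ctx = ssl._create_unverified_context()
    si = SmartConnect(host="192.168.1.10", user="root", pwd="password", sslContext=ctx)
    content = si.RetrieveContent()
    host = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True).view[0]

    for lun in host.config.storageDevice.scsiLun:
        # "vStorageSupported" means the VAAI primitives are offloaded for this device
        print(lun.canonicalName, lun.model, getattr(lun, "vStorageSupport", "n/a"))

    Disconnect(si)
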
    My Blog - http://www.everything-virtual.com
    vExpert 2012\2013\2014\2015
  • darkerosxx Banned Posts: 1,343
    Servers work great as storage devices, given the right hardware. I mean, that's all SAN/NAS are...