
Terminal server testbed hardware suggestions for $5000

3Piece Member Posts: 4 ■□□□□□□□□□
Hi all,

I'm looking to build a terminal server testbed for under $5,000 (hardware budget). I'll have another $10K next year to add to it, and more in the following years, so it needs to be scalable. We want to run 4-8 test clients over RDP, and 3 of the thin clients must be included in the price (I'm not sure what is required for a thin client over RDP).

If we are using XenServer, it seems we must use a SAN and not a NAS. Is this correct?

We are currently looking at using either Xen or Sun's VirtualBox RDP system (VRDP).

The system will be used by 3 heavy users:

One 3D artist using Maya and 3D Studio with mental ray, running heavy renders. We therefore want the server configured so that all excess processing power acts as a render farm; I know this will have a large impact on the hardware we should use.

One video editor who shoots in HD and edits footage in Adobe After Effects; I know this can be memory intensive. They have also asked whether they can use Apple's video editing suite.

The last of our heavy users is an audio editor who uses various programs, possibly Ableton Live.

We will have other users who won't be doing much beyond watching movies from the drive, listening to music and surfing the web.

It's a fairly lively test environment that we think is as close as we can get to the system we will need to implement next year.

To start the system off, we have found a 4TB iSCSI rackmount unit for $1,500; this would drop our available budget for the server down to $3,500.

We know the system needs to be 64-bit, and we would like to put it on a blade if the budget allows. When we start going through processors (AMD-V vs. Intel VT-x, i7 vs. Xeon), we get a little lost.
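
For what it's worth, we can at least confirm whether a given box exposes the hardware virtualization extensions before going further; this is just a generic Linux sketch reading /proc/cpuinfo (Intel VT-x shows up as "vmx", AMD-V as "svm"), not anything vendor-specific:

```python
# Minimal sketch: report hardware virtualization support on a Linux host.
# Intel VT-x shows up as the "vmx" CPU flag, AMD-V as "svm".

def virtualization_flags(cpuinfo_path="/proc/cpuinfo"):
    flags = set()
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags.update(line.split(":", 1)[1].split())
    return {"intel_vt_x": "vmx" in flags, "amd_v": "svm" in flags}

if __name__ == "__main__":
    print(virtualization_flags())
```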

Also, is all graphics processing done on the server's processor? What if one of the clients were to play the latest game; would they hog the system if they cranked up the graphics?

Appreciate as much help as we can get on this one.

Comments

  • tiersten Member Posts: 4,505
    Uh. Apart from the users that are just browsing the web, the tasks you've listed aren't suited for RDP.

    The 3D artist will be hogging the CPUs all the time with rendering and should get their own render farm.

    The video editor will also want loads of CPU, RAM and IO.

    The delay in RDP audio will probably cause the audio editor to go crazy.

    Those 3 alone will want dedicated workstations.

    Don't even try playing the latest/greatest 3D game over RDP :P
  • 3Piece Member Posts: 4 ■□□□□□□□□□
    Thanks Tiersten,

    I was curious about latency issues. I have seen that there is now hardware designed to get around this; Teradici is the main company, and they have developed a chip for PCoIP (PC over IP). They claim that you can perform 3D work on a thin client with the host server on the other side of the world.

    So latency is a real issue without special hardware; what other hardware solutions are there?

    Is it not possible to prioritize use of the server, so that basic apps get first priority, the video editor gets second, and any queued renders get last priority? Then, during times when not much is happening, say overnight, the renders could use whatever processing power is available.
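
    As a rough sketch of what I mean (this assumes a Unix-style host and plain process priorities rather than any particular hypervisor feature; the job list is a placeholder):

    ```python
    # Sketch only: run queued renders at the lowest CPU priority so interactive
    # users always win, and only start them during an assumed overnight window.
    import datetime
    import subprocess

    RENDER_NICENESS = 19  # lowest scheduling priority on Linux

    def run_render(render_cmd):
        # "nice -n 19" keeps the render from competing with interactive sessions
        subprocess.run(["nice", "-n", str(RENDER_NICENESS), *render_cmd], check=True)

    def overnight(now=None):
        hour = (now or datetime.datetime.now()).hour
        return hour >= 22 or hour < 6  # assumed quiet window: 10pm-6am

    queued_renders = [["echo", "rendering scene_01"]]  # placeholder jobs

    if overnight():
        for job in queued_renders:
            run_render(job)
    ```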
  • dynamik Banned Posts: 12,312 ■■■■■■■■■□
    It sounds like each one of those guys should get a $3000-5000 workstation. As Tiersten mentioned, you're going to be spreading your resources insanely thin, and a lot of that work is going to suck to do remotely.

    Also, what sort of iSCSI unit did you get for $1500? I'd be concerned about the performance on something that cheap. Did you just get one of those Sans Digital units or something similar? I doubt that would be significantly better than just running Openfiler on a spare PC. Why do you need to use iSCSI if you only have the single server? Why don't you just use internal storage?
  • kalebksp Member Posts: 1,033 ■■■■■□□□□□
    It sounds like you're doing desktop virtualization rather than terminal services. If that's the case, you should look at Microsoft's licensing for virtualized desktops. The short of it is that if you're using thin clients, it's $110/device/year (if you run it off a regular desktop covered under SA, then it's $23/computer/year). It doesn't matter if you're using Sun, Citrix, VMware, or Microsoft; that's how much it costs just to license Windows. I dropped my desktop virtualization research after I saw that, as it's just not cost effective for my environment.
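
    To put that in perspective for the 4-8 clients mentioned above, here's the rough arithmetic (using the rates I quoted; verify them against current Microsoft licensing before budgeting):

    ```python
    # Back-of-the-envelope VDI licensing cost using the rates quoted above.
    THIN_CLIENT_RATE = 110  # USD per device per year (thin client)
    SA_DESKTOP_RATE = 23    # USD per device per year (SA-covered desktop)

    for devices in (4, 8):
        print(f"{devices} thin clients: ${devices * THIN_CLIENT_RATE}/year "
              f"(vs ${devices * SA_DESKTOP_RATE}/year from SA-covered desktops)")
    ```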

    From your description I don't think virtualization or terminal services would be a good fit for your application.
  • tiersten Member Posts: 4,505
    3Piece wrote: »
    I was curious about latency issues. I have seen that there is now hardware designed to get around this; Teradici is the main company, and they have developed a chip for PCoIP (PC over IP). They claim that you can perform 3D work on a thin client with the host server on the other side of the world.
    Their product helps you run RDP-style sessions over a WAN link. Your proposed system, I assume, would all be running on the local LAN, where available bandwidth will be high.

    The PCoIP chip is used to allow remote access to an existing PC's graphics card output. The actual protocol it uses can be implemented in software; it isn't doing anything particularly special in terms of remote desktop protocols. They all work by compressing the current display, transmitting it over the network and then decompressing it at the other end, plus audio and a channel for user input going back.

    Even with their PCoIP chip, you're going to incur some latency due to the need for compression, transmission over the network and then decompression. It won't be as fast as if you were sitting physically in front of the PC. Whilst you may be able to spin a 3D model around fairly well as shown in their video demo, you'll notice a difference between remote and local.
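
    To give a feel for why the compression step is unavoidable and where the delay comes from, here's a rough back-of-the-envelope; every figure below is an assumption for illustration, not a measurement of PCoIP or any other product:

    ```python
    # Rough numbers behind the compress -> transmit -> decompress pipeline.
    width, height, bits_per_pixel, fps = 1920, 1080, 24, 30

    raw_mbit_s = width * height * bits_per_pixel * fps / 1e6
    print(f"Uncompressed display stream: ~{raw_mbit_s:.0f} Mbit/s")  # ~1493 Mbit/s

    compression_ratio = 50  # assumed; real codecs vary hugely with screen content
    print(f"Compressed stream: ~{raw_mbit_s / compression_ratio:.0f} Mbit/s")

    compress_ms, network_ms, decompress_ms = 5, 1, 5  # assumed per-frame costs
    print(f"Added latency per frame: ~{compress_ms + network_ms + decompress_ms} ms")
    ```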

    Audio, even uncompressed, will have the network delay added to it. Audio editors strive for low-latency audio inputs and outputs with precise timing. RDP would drive them crazy.
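
    Some rough numbers to show why (the remote-side figures are assumptions, not measured RDP behaviour):

    ```python
    # Local audio interface buffer vs the buffering a remote audio channel needs.
    sample_rate = 48_000        # Hz
    local_buffer_samples = 128  # typical low-latency audio interface setting
    local_ms = local_buffer_samples / sample_rate * 1000
    print(f"Local interface buffer: ~{local_ms:.1f} ms")  # ~2.7 ms

    remote_buffer_ms = 100      # assumed jitter/playout buffer for remote audio
    network_rtt_ms = 2          # assumed LAN round trip
    print(f"Remote audio path: ~{remote_buffer_ms + network_rtt_ms} ms or more")
    ```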

    All the video editors I've met are fanatical about their equipment. They all worked in darkened rooms with very carefully colour calibrated monitors. They'd be horrified if you moved their workstation to the server room and forced them to use remote desktop with compression and network latency.

    I'm uncertain about the level of work that your clients would be doing. They may not mind these limitations. The editors I've talked to were from a visual effects company which has worked on a large number of mainstream films. I used to work in a company next door to them and it was always interesting to see what they were working on. A few times they've done motion capture or video work in the floor lobby. A few of the large crowd scenes in films don't actually have that many people in them. It's just a bunch of people from the company dressed up and pasted a few thousand times in the background ;)
  • 3Piece Member Posts: 4 ■□□□□□□□□□
    OK, I was only looking at drives; I haven't actually bought one yet, but I saw this one: Amazon.com: Terastation Iscsi 4TB 2URM 4X1000GB Sata Raid 5 3YR Warr: Electronics

    I'm very interested in RDP and would like to pursue this, especially the software route. The 3 users will have their own workstations to use while testing the server, and they are all willing to try it.

    If all 3 power users were physically located next to the server and connected directly to it, would there still be any latency issues using RDP? (I know, at that point they're right there.) And if not, how about running a short distance over a gigabit fibre optic network?

    From what I noticed with Teradici's solution, all the systems at the data centre were individual machines mounted in the racks. I was hoping to have all the users sharing the processors and RAM, so that when demand on the server is not so high, the other users have more power at their fingertips. Is this possible to achieve?


    I really appreciate your ideas so far.
  • dynamik Banned Posts: 12,312 ■■■■■■■■■□
    Meh, it's a Buffalo; same type of deal. You never answered my question, though: why are you using iSCSI over internal storage?

    Don't you have any machines you can test this out with prior to making the purchase? What are they using now? Have them RDP to those machines and see how they like working like that.
  • astorrs Member Posts: 3,139 ■■■■■■□□□□
    As others have mentioned (or hinted at), what you are trying to accomplish for $5k is unrealistic. RDP is a bad protocol for those 3 users. While Teradici makes a great product, and it would be my choice in a LAN environment like this, to architect a solution that gives acceptable performance to those users, along with a reasonable number of standard office users, you're going to want to multiply that budget by at least 4x-5x.

    Server-based computing and VDI can be great enablers when they're used to solve a problem appropriately. You've told us some of the specific users/challenges/etc., but if you could clarify the "why" (as in: what business/technical problems does the client want to solve?), we'll be in a better position to help you develop a solution.
  • 3Piece Member Posts: 4 ■□□□□□□□□□
    OK, brilliant.

    First, to answer dynamik's question: the reason I was looking at SAN disks is that, from what I read, I thought Xen would only work with a SAN. If it works with other types of storage, then great!

    I have been tasked with setting up a solid network for a company next year. The network, and the users on it, will be isolated for months on end with no IT-savvy person amongst them. The only link they will have to the outside world is a satellite internet connection, and judging by the state of the connection this year, it's slow at best and down half the time. They will be recording 8 video feeds simultaneously with audio. Some of the video will be reviewed and some edited. One thing in our favour is that it doesn't matter if the audio is out of sync by up to 1 second, as it's only commentary with no head shots of people. There will be around 10-20 clients, and there will also be 12 monitors that need to show video on demand. There is one user who uses After Effects, 3D Studio and Maya, but they currently work on an outdated laptop, so anything is going to please them. I'm to come up with a quotation for the system at some stage in the next couple of months.

    My plan was to have the video feeds record straight into the server, have all the processing power at the server and only run network cables to thin clients.
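
    To sanity-check that plan, here's a rough sizing sketch; the bitrate and recording hours are my guesses, not camera specs, so I'd plug in real figures before buying anything:

    ```python
    # Rough sizing for recording 8 video feeds straight to the server.
    feeds = 8
    bitrate_mbit_s = 25   # assumed per-feed HD capture bitrate
    hours_per_day = 10    # assumed recording window

    write_mbyte_s = feeds * bitrate_mbit_s / 8
    per_day_gb = write_mbyte_s * 3600 * hours_per_day / 1000
    print(f"Sustained write load: ~{write_mbyte_s:.0f} MB/s")  # ~25 MB/s
    print(f"Storage consumed: ~{per_day_gb:.0f} GB/day")       # ~900 GB/day
    ```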

    This is only our first project incorporating virtualization but if it's successful we'll have many more on a larger scale with more demanding applications.

    To learn about RDP, we have several months of free time to play with a server setup and some spare cash, so I wanted to build a test server where I live. I share the building with a professional film maker who shoots in HD and currently edits on an Apple workstation (he has done documentaries for Nat Geo). We also have a musician who is well into his techno music creation, and a web/media creator who works in 3D Studio and Adobe After Effects.

    All PCs are production PCs currently in use, so there are none spare to play with RDP. That's why I wanted to start with a small server and scale up. Next year I'll have more cash to add to the server; at the moment I'm being conservative :)
  • tiersten Member Posts: 4,505
    The fact that you want to make it a blade means you've blown your budget already. Blade enclosures aren't particularly cheap, and you'll be paying a premium for each blade you put in.

    The Buffalo you've linked to is aimed more at the SOHO or very small office market. The performance will be inadequate for the uses you've listed.

    There would still be latency even if you're on a thin client right next to the server. It isn't the length of cabling that's causing the latency but the protocol itself.
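
    A quick back-of-the-envelope makes the point; the protocol figure is an assumption, but the orders of magnitude are what matter:

    ```python
    # Cable length is irrelevant: propagation over the LAN is microseconds,
    # while capture/encode/decode in any remote display protocol is milliseconds.
    cable_m = 100
    speed_in_fibre_m_per_s = 2e8  # roughly 2/3 the speed of light in glass
    propagation_us = cable_m / speed_in_fibre_m_per_s * 1e6
    print(f"Propagation over {cable_m} m of fibre: ~{propagation_us:.1f} us")  # ~0.5 us

    protocol_overhead_ms = 10  # assumed capture + encode + decode cost
    print(f"Protocol overhead: ~{protocol_overhead_ms} ms, thousands of times larger")
    ```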
  • dynamik Banned Posts: 12,312 ■■■■■■■■■□
    3Piece wrote: »
    All PCs are production PCs currently in use, so there are none spare to play with RDP.

    You can RDP to any machine you have, which is why it's an easy way to test how things are going to function.
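
    If you want a quick way to see which of your existing machines are already listening for RDP before you sit down at each one, a few lines like this will do; the hostnames are placeholders:

    ```python
    # Sketch: check which machines answer on the standard RDP port (3389).
    import socket

    def rdp_reachable(host, port=3389, timeout=2.0):
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for host in ["editor-pc", "artist-pc", "audio-pc"]:  # placeholder hostnames
        status = "listening" if rdp_reachable(host) else "no response"
        print(f"{host}: RDP {status}")
    ```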