
Question for the web guys

brad- Member Posts: 1,218
I don't think we have many web developers out there, but I'll ask anyway.

We're about to deploy a website/web application that probably needs to handle about 20k concurrent users. My question is: where is the bottleneck in terms of how many users can connect? IIS? SQL? The server itself?

We have IIS7 on W2k8. The database is SQL2k8 Standard on W2k8. I did some preliminary research and read that a connection is really only held while the page is being fetched and is then closed, so a limit of 32k connections would actually support quite a few more users than that. Any thoughts?

Comments

  • Pash Member Posts: 1,600 ■■■■■□□□□□
    This isn't really a web dev question, though. Usually for these high numbers of concurrent connections you would have an infrastructure guy with solid SQL knowledge. I think RK would be able to answer something like this.

    Personally, I've never worked on a web server with these kinds of requirements.
    DevOps Engineer and Security Champion. https://blog.pash.by - I am trying to find my writing style, so please bear with me.
  • RobertKaucher Member Posts: 4,299 ■■■■■■■■■■
    brad- wrote: »
    I don't think we have many web developers out there, but I'll ask anyway.

    We're about to deploy a website/web application that probably needs to handle about 20k concurrent users. My question is: where is the bottleneck in terms of how many users can connect? IIS? SQL? The server itself?

    We have IIS7 on W2k8. The database is SQL2k8 Standard on W2k8. I did some preliminary research and read that a connection is really only held while the page is being fetched and is then closed, so a limit of 32k connections would actually support quite a few more users than that. Any thoughts?
    The answer is "it depends." There is no hard limit on the number of connections SQL Server and IIS7 can support short of the limits imposed by TCP itself, and you are going to hit hardware limitations long before that. I am not very involved in scaling when it comes to IIS, so I cannot give you much direction there, but you need to look into things like load balancing and failover clustering. Also, keep in mind that HTTP is not a persistent session: once the data is transferred, the connection should be closed. And if your base is 20K users, there is no way your server will be hit by 20K GETs at the same instant. What is going to be key here is hardware capacity planning and server load balancing.
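    To put rough numbers on that, here is a quick Little's Law sketch (Python, purely illustrative; the 20K active users, 30-second think time, and 200 ms response time are assumed figures, not measurements from your app):

        # Back-of-envelope: how many connections are actually open at one instant?
        # Little's Law: connections_in_flight = request_rate * avg_response_time
        active_users = 20_000      # assumed concurrently active users
        think_time_s = 30.0        # assumed seconds a user spends between page requests
        response_time_s = 0.2      # assumed average time the server holds a connection

        request_rate = active_users / think_time_s          # ~667 requests/second
        open_connections = request_rate * response_time_s   # ~133 connections in flight

        print(f"{request_rate:.0f} req/s, ~{open_connections:.0f} connections open at any instant")

    In other words, 20K "concurrent users" is a throughput problem long before it is a connection-count problem, which is why capacity planning and load balancing matter more than the raw TCP limit.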
  • it_consultant Member Posts: 1,903
    If you are supporting 20K users you should have some sort of high availability in place for 100% uptime. Do you have an HA solution? Many times HA solutions can (and should, IMHO) include some sort of load/link balancing.
  • brad- Member Posts: 1,218
    it_consultant wrote: »
    If you are supporting 20K users you should have some sort of high availability in place for 100% uptime. Do you have an HA solution? Many times HA solutions can (and should, IMHO) include some sort of load/link balancing.
    It's actually many more, but we're planning on that many simultaneous users. It's a government site that takes payments for sales tax and some other things. We have a contractor developing it, but we're just trying to think it all out ahead of time so we order the right stuff, if we need to order anything at all.

    The two servers that IIS and SQL sit on now are beefy. SQL is a ProLiant DL370 with 24GB of RAM and four quad-core CPUs, running W2k8/SQL2k8. The web server is W2k8 with two quad-core CPUs and 8GB of RAM.
  • gorebrush Member Posts: 2,743 ■■■■■■■□□□
    How fast is the internet feed to your planned architecture?
  • it_consultant Member Posts: 1,903
    brad- wrote: »
    It's actually many more, but we're planning on that many simultaneous users. It's a government site that takes payments for sales tax and some other things. We have a contractor developing it, but we're just trying to think it all out ahead of time so we order the right stuff, if we need to order anything at all.

    The two servers that IIS and SQL sit on now are beefy. SQL is a ProLiant DL370 with 24GB of RAM and four quad-core CPUs, running W2k8/SQL2k8. The web server is W2k8 with two quad-core CPUs and 8GB of RAM.

    There is always that argument between big iron and a cluster of smaller servers. In my experience it is important to have high availability for these things, if only because you can do maintenance without incurring downtime.

    BTW, that's a great piece of hardware. I have several DL370 G6 servers under my care. I will never go back to a server that does not have dual-port hard drive cabling.
  • eMeS Member Posts: 1,875 ■■■■■■■■■□
    brad- wrote: »
    We're about to deploy a website/web application that probably needs to handle about 20k concurrent users. My question is, where is the bottleneck in terms of how many users can connect? IIS? SQL? Server itself?

    I'd agree with RK that the answer to this question is "it depends". Much of the "it depends" relates to how the application was built and how it's configured in your environment.

    For something this significant, one thing that you might do is invest in testing and validation software that can simulate loads on your system. The two big names in that game are HP (Mercury) and IBM (Rational).

    You won't truly know how things will perform or where the bottlenecks are until you flip the switch, but you can get a better idea before go-live by doing effective load testing. You can also do this at interim stages as the application is being built and whenever significant changes are made.

    The point is, it's really not a good idea to build such a significant application and guess at what the bottlenecks might be. You need empirical evidence that shows where the likely bottlenecks are, and that's what effective load and simulation testing will give you.
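    Commercial tools like those are the right answer at this scale, but even a throwaway script will give you a first feel for it. A minimal sketch (Python, standard library only; the URL, request count, and concurrency level are placeholders, and this is in no way a substitute for a proper load-testing suite):

        # Fire a fixed number of GETs through a bounded worker pool and report timings.
        import time
        import urllib.request
        from concurrent.futures import ThreadPoolExecutor

        URL = "http://test-server/payments/"   # placeholder target, not a real endpoint
        REQUESTS = 2000                        # assumed total number of requests
        CONCURRENCY = 100                      # assumed number of simultaneous workers

        def timed_get(_):
            start = time.time()
            with urllib.request.urlopen(URL, timeout=30) as resp:
                resp.read()                    # drain the body, like a real client would
            return time.time() - start

        with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
            timings = sorted(pool.map(timed_get, range(REQUESTS)))

        print(f"median {timings[len(timings) // 2]:.3f}s, "
              f"95th percentile {timings[int(len(timings) * 0.95)]:.3f}s")

    Run it from a machine that is not the web server, ramp the concurrency up in stages, and watch PerfMon on the server side while it runs.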

    MS
  • Forsaken_GA Member Posts: 4,024
    I've had to deal with the server side of large concurrent user loads before, and it's not that big of a deal if you prepare for it right. For example, a while back, Hustler did some Sarah Palin parodies, and they were mentioned on Leno. We'd gotten the heads-up about six hours before the show, so we had a little bit of time to prepare, but not terribly much.

    We'd designed their setup to be able to take hits like this with a little tuning here and there. We went with the many-small-clustered-servers approach: several front-end web servers with multiple database servers on the back end. The only real hitch we saw didn't have anything to do with concurrent connections; they were spread across enough machines that it didn't really matter. The major bottleneck was I/O on the database servers, since they were getting hammered repeatedly. After about 15 minutes of hacking and doing some voodoo with memcache, that was no longer a problem either.
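    For anyone wondering what that voodoo boils down to, the core trick is cache-aside: check memcached before touching the database, so repeat reads never hit disk. A rough sketch of the pattern (Python with the pymemcache client; the host, key, query, and TTL are assumptions for illustration, not what we actually ran):

        # Cache-aside: serve repeated reads from memcached, fall back to the DB on a miss.
        import json
        from pymemcache.client.base import Client

        cache = Client(("127.0.0.1", 11211))   # assumed memcached host/port

        def get_top_stories(db):
            key = "top_stories"                 # hypothetical cache key
            cached = cache.get(key)
            if cached is not None:
                return json.loads(cached)       # cache hit: zero database I/O

            # Cache miss: run the (hypothetical) query once, then cache it briefly.
            rows = db.execute("SELECT id, title FROM stories ORDER BY views DESC").fetchall()
            result = [{"id": r[0], "title": r[1]} for r in rows]
            cache.set(key, json.dumps(result), expire=60)   # assumed 60-second TTL
            return result

    Even a short TTL like that turns thousands of identical queries per minute into one, which is exactly the pressure relief the database I/O needed.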
  • brad- Member Posts: 1,218
    eMeS wrote: »
    For something this significant, one thing that you might do is invest in testing and validation software that can simulate loads on your system. The two big names in that game are HP (Mercury) and IBM (Rational).
    I'll take a look at them and see if they're in our budget. Thanks very much, good post.
  • RobertKaucher Member Posts: 4,299 ■■■■■■■■■■
    eMeS wrote: »
    I'd agree with RK that the answer to this question is "it depends". Much of the "it depends" relates to how the application was built and how it's configured in your environment.

    For something this significant, one thing that you might do is invest in testing and validation software that can simulate loads on your system. The two big names in that game are HP (Mercury) and IBM (Rational).

    You won't truly know how things will perform or where the bottlenecks are until you flip the switch, but you can get a better idea before go-live by doing effective load testing. You can also do this at interim stages as the application is being built and whenever significant changes are made.

    The point is, it's really not a good idea to build such a significant application and guess at what the bottlenecks might be. You need empirical evidence that shows where the likely bottlenecks are, and that's what effective load and simulation testing will give you.

    MS

    Yes, that sort of testing is really the foundation of proper capacity planning. If you are lucky, brad, you'll get the budget. *sarcasm* I was fortunate enough to get to write our in-house load testing software */sarcasm*, and with it, plus a combination of SQLIO/SQLIOSim and PerfMon with a dash of Excel, I came to the conclusion that yes, our hardware might, maybe, be able to handle the expected load.
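    If you end up doing the same on a budget, the PerfMon-and-Excel step is easy to script as well. A small sketch (Python; the file name and counter path are assumptions, since your own counter log will have different machine and counter names):

        # Summarize one counter column from a PerfMon counter log exported to CSV.
        import csv

        LOG = "webserver_counters.csv"                                 # assumed CSV export
        COUNTER = r"\\WEB01\Web Service(_Total)\Current Connections"   # assumed counter path

        with open(LOG, newline="") as f:
            rows = csv.DictReader(f)
            values = sorted(float(r[COUNTER]) for r in rows if r[COUNTER].strip())

        print(f"samples={len(values)}  avg={sum(values) / len(values):.1f}  "
              f"95th pct={values[int(len(values) * 0.95)]:.1f}  max={values[-1]:.1f}")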
  • Pash Member Posts: 1,600 ■■■■■□□□□□
    Well, maybe it's worth mentioning, if you haven't used it already: ApacheBench is a great tool for testing the web server element of this. It works with IIS too.
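    For example, something along the lines of ab -n 10000 -c 200 http://yourserver/somepage.aspx (the URL and the numbers are just placeholders) fires 10,000 requests at a concurrency of 200 and reports requests per second plus a table of response-time percentiles, which is a quick way to sanity-check the IIS side on its own.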
    DevOps Engineer and Security Champion. https://blog.pash.by - I am trying to find my writing style, so please bear with me.