10-50k users online
#11

[eluser]Gewa[/eluser]
thanks a lot
#12

[eluser]1stInter[/eluser]
Quote:It is gonna be a social network like classmates.com; we await a huge number of users online. I hope that at first we will be able to handle 500-1000 users online until we make a new engine with load balancing…

Hi,

Although I'm new to CodeIgniter, I work a fair bit with optimising servers for high-capacity serving.

If you're looking to have 50,000 users online you need a strong architecture behind it, for instance:

* Multiple web servers (Apache / lighttpd)
* Multiple SQL server back ends (MySQL / Postgres / Oracle)
* Load Balancing (LVS with keepalived / heartbeat / mon / Pound)
- Web servers AND SQL servers
* Caching Reverse Proxy (Squid)
* Finely tuned servers (files open, connections etc)

Of course, there are many different solutions I haven't mentioned at all.

Running a popular site you also have to put in excellent DoS (Denial of Service) detection and facilities to block IPs in case of attacks. You have to fine-tune your kernel to ensure that hundreds of simultaneous connections don't bring your server to a crawling halt.
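To make the kernel-tuning point concrete, here is a sketch of the kind of /etc/sysctl.conf entries involved. The keys are real Linux sysctls, but the values are placeholders and must be sized for your own workload and kernel version:

```
# Illustrative starting points only - tune for your own workload.
fs.file-max = 200000                  # system-wide open file limit
net.core.somaxconn = 1024             # accept() backlog ceiling
net.ipv4.tcp_max_syn_backlog = 4096   # queue for half-open connections
net.ipv4.tcp_fin_timeout = 15         # reclaim FIN-WAIT sockets sooner
```

You would also raise the per-process file descriptor limit (ulimit -n) for the web server user, since the system-wide limit alone isn't enough.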

You'll need to look into whether caching technologies such as memcached are right for your solution (to cache MySQL query results - forget storing results in file caches with these kinds of user figures; it will bring your servers to a standstill).
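The usual pattern here is cache-aside: check the cache first, and only hit the database on a miss. This is a minimal sketch in Python; a plain dict with expiry timestamps stands in for a real memcached client (with a real client you'd swap the dict operations for its get/set calls), and `fake_query` is a hypothetical stand-in for the expensive SQL query:

```python
import time

# Cache-aside sketch. The dict is a stand-in for a memcached client.
cache = {}
TTL = 60  # seconds before a cached row goes stale

def fetch_user(user_id, run_query):
    """Return a cached row if still fresh, else query the DB and cache it."""
    key = f"user:{user_id}"
    hit = cache.get(key)
    if hit and hit[0] > time.time():
        return hit[1]                      # served from cache
    row = run_query(user_id)               # the expensive SQL query
    cache[key] = (time.time() + TTL, row)
    return row

calls = []
def fake_query(uid):                       # hypothetical DB call, for illustration
    calls.append(uid)
    return {"id": uid, "name": "gewa"}

fetch_user(7, fake_query)
fetch_user(7, fake_query)
print(len(calls))   # 1 -- the second call never touched the "database"
```

The win is that under load most page views become cache hits, so the database sees a small fraction of the raw request rate.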

Just figuring out what technologies you should use, configuring and testing them with simulated loads will take weeks, if you do it right. Especially if you haven't used any of these before.

I have one comment though: 50,000 users online is a MASSIVE number of users; it means you'll have a registered userbase at least 10 times as large, so 500,000 registered users. And unless you're planning an application like Facebook you're probably looking at 1,000,000 users or more before you have 50,000 online at any one time.

Even Second Life doesn't normally have 50,000 users online at the same time, as far as I know. See HERE.

However, if you're talking about having 50,000 registered users then you're more likely to see up to 5,000 users online. And it's more likely to be 5,000 logged in during a 24 hour period rather than simultaneously, based on my experience as well as research. Still very significant but much less architecture required.

If you are looking for 50,000 users online I would suggest putting together a team of at least 3-5 people just to do the initial planning, and ensure it includes people with the right kind of experience. You'll need people with real data centre experience, and developers who have built massively scalable solutions in the past.

I don't want to expose who my current client is, but they have fewer than 50,000 simultaneous users and we manage: 5 web servers for two different websites, 4 load balancers, 2 Squid reverse proxy servers, 4 client-facing application servers, and another 7 back-end servers. They also have an identical staging environment for new releases, and an integration environment where the various developers ensure the systems work together (their 'playground').

We also use VMware to host a number of servers for actual development and more.

Virtually all of these servers are Dell 2950s with quad or 8-core CPUs; one server has 24 CPU cores and 16GB RAM.

And they still have MASSIVE performance problems. Why? Because developers have frequently made the WRONG decisions about how to implement a feature or technology.

Simple example: the 24-core server runs software that can only utilise 4 cores. When more cores are used the results are "indeterminate", according to the third-party software developers. This means random errors are introduced and there is no way of detecting them automatically.

Another example: this application is incredibly disk intensive, often reading 10GB files and producing a 50MB file as a result. The developers believed they could run 24 processes, one on each core, process 24 files simultaneously, and have it take the same time as a single file. WRONG! This server cannot cope with more than 2 or 3 simultaneous processes before it slows down so much that it takes 10 times longer than running a single process - the hard disk system simply cannot read the data quickly enough, and the simultaneous processes make it do endless seeks in between reading the data, so 95% of the time is spent with the hard disk seeking rather than reading. An easy mistake to make if you don't consider EVERY aspect of your solution.

What's my point here?

Well, I cannot emphasise enough the importance of the WHOLE solution being planned for heavy loads from the very start, and that every single decision you make must be thought about in the context of: "How will this work under heavy load?"

You must consider every detail: where do you write your SQL data? Where do you read your SQL data? How do you partition the data? How do you replicate the data? How do you deal with JOIN queries? How do you cope with servers being unavailable? If you have a single SQL write master, having to reboot that server can be catastrophic for your application and the service you provide. And if your service is offline for a few minutes, imagine people retrying their page loads again and again; the normal traffic pattern of 50,000 users can suddenly look like 75,000 for a couple of minutes.
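The "where do you write, where do you read" question is usually answered with read/write splitting at the application layer. This is a toy sketch, not any framework's API: strings stand in for real connection handles to one write master and several read replicas, and the SQL-verb check is deliberately naive:

```python
import itertools

# Toy read/write splitter. In production, master/replicas would be
# real DB connection handles, not strings.
class Router:
    def __init__(self, master, replicas):
        self.master = master
        self._replicas = itertools.cycle(replicas)  # round-robin the reads

    def conn_for(self, sql):
        """Pick a connection: reads go to replicas, writes to the master."""
        verb = sql.lstrip().split(None, 1)[0].upper()
        if verb in ("SELECT", "SHOW"):
            return next(self._replicas)
        return self.master   # INSERT / UPDATE / DELETE / DDL

r = Router("master", ["replica1", "replica2"])
print(r.conn_for("SELECT * FROM users"))        # replica1
print(r.conn_for("UPDATE users SET name='x'"))  # master
print(r.conn_for("select 1"))                   # replica2
```

A real implementation also has to handle replication lag (a row you just wrote may not be on the replica yet), which is exactly the kind of decision that must be made with "how will this work under heavy load?" in mind.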

So, sorry for the lengthy post, but having worked in this field for a while from an IT operational point of view, I know that you can only scale to 50,000 simultaneous users if you plan this very, very carefully.

On the positive side, if the business model is right, 50,000 simultaneous users should be worth a lot of money so your client should be prepared to spend a lot of money too. And you should be able to charge a LOT of money to make it work for them.

Best of luck!

Steve
#13

[eluser]Teks[/eluser]
@1stinter: thank you for the very interesting, and most informative post.
#14

[eluser]Gewa[/eluser]
Steve you are super THANK THANK and 100000000 thanks to you!
#15

[eluser]kevinprince[/eluser]
Don't forget your front-end optimization!

Reduce HTTP requests to a minimum by compacting CSS and JS files, and make sure images are compressed correctly.

http://developer.yahoo.com/performance is a great reference for front end performance.
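As a rough illustration of the "compacting" step: real projects would use a proper minifier, but even a naive pass that concatenates stylesheets, strips comments, and collapses whitespace captures much of the saving (and, more importantly, turns several HTTP requests into one). A sketch:

```python
import re

# Naive CSS bundling/compacting pass - illustrative, not a real minifier.
def compact_css(*sources):
    css = "\n".join(sources)                          # concatenate the files
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # tighten punctuation
    return css.strip()

a = "body { color: red; }  /* main */"
b = "h1 {\n  font-size: 2em;\n}"
print(compact_css(a, b))   # body{color:red;}h1{font-size:2em;}
```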
#16

[eluser]Gewa[/eluser]
Anyway, I didn't get an answer to my question: what is the maximum limit on a good big server for CI... will 800 users online simultaneously be OK?
#17

[eluser]1stInter[/eluser]
Hi Gewa,

Nobody can give you the answer - it depends 100% on the application you write.

CI will add some overhead to your code, but the answer depends on many factors:

- How many 'assets' are downloaded (images, javascript files etc)
- How many SQL queries you do in the script
- How you cache your information
- How much logic and processing your code does
- What OpCode cache you are using
- The usage profile (i.e. how much time the user spends on each page - this is 100% application specific)
- etc etc etc

The ONLY way you'll find out is to write your code (or write up a module which is "representative" of the finished product) and do load testing.

There is NO other way - you have to test!
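A toy illustration of what a load test measures (a real load test would drive a live HTTP server with a tool like ApacheBench or JMeter; the handler here is just a stub standing in for template rendering and queries):

```python
import time

def handler():
    """Stub request handler - stands in for rendering a page."""
    sum(range(1000))

def pages_per_second(fn, duration=0.2):
    """Call fn in a tight loop for `duration` seconds, report the rate."""
    n, deadline = 0, time.perf_counter() + duration
    while time.perf_counter() < deadline:
        fn()
        n += 1
    return n / duration

rate = pages_per_second(handler)
print(rate > 0)   # the absolute number only means something on YOUR hardware
```

The caveat in the code comment is the whole point: the number you get is only meaningful for your application on your hardware, which is why nobody can answer "will 800 users be OK?" in the abstract.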

For instance, on one server (which isn't particularly powerful) I can serve around 45 basic pages (a Hello World application) per second; on a different server (a dual-core 2.2GHz processor with 3GB memory) I get around 80 pages per second. This test, by the way, uses Smarty templates to serve the actual page. Also, this server has a complicated setup and isn't tuned for performance - it's my internal development server running Ubuntu.

But that's the actual logic code; static assets (images, javascript etc) can be served at 600 pages per second on the same server.

I haven't tested CI on a Dell 2950 yet, but instead of 600 requests per second for static content a Dell 2950 can serve over 8,500 requests per second, over 10 times faster (a better-tuned server, running Red Hat RHEL 5). So CI itself should be pretty fast too.

So if you have virtually no logic at all a single server can cope; if your application is very heavy then you'll need more servers.
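As a back-of-envelope illustration of how pages-per-second relates to users online (the think-time figure is an assumption, not a measurement): each user requests roughly one page every "think time" interval, so a server's sustainable page rate times the average think time gives the concurrent users it can carry:

```python
# Capacity estimate: illustrative arithmetic only.
def concurrent_users(pages_per_sec, think_time_sec):
    # Each user generates ~1/think_time_sec requests per second,
    # so capacity = pages_per_sec / (1 / think_time_sec).
    return pages_per_sec * think_time_sec

# 80 dynamic pages/sec, users averaging 10s between clicks:
print(concurrent_users(80, 10))   # 800 -- the figure Gewa asked about
```

Which is why the honest answer remains "measure it": the 80 pages/sec and the 10-second think time both have to come from testing your actual application.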

Best of luck!

Steve
#18

[eluser]onblur[/eluser]
@1stInter, thanks for the very informative posts re: performance - I'm sure many, like me, find this very useful.
#19

[eluser]RS71[/eluser]
Hello all,

I have a question. I don't know much about running sites on multiple servers (MySQL clusters, separate DB and web servers, multiple DB and web servers, and so on), but would you need any special coding for that to work? If I code an application that works on a simple shared server, can I take it and use it on a multiple-DB-and-web-server configuration?

Also, is there a general maximum number of queries per page I should stay under?

Thanks in advance.




Theme © iAndrew 2016 - Forum software by © MyBB