Working with multiple databases - "reader" database
#1

Hello guys, 

I am starting a new project that is expected to be under very heavy load on the database side.

Our infrastructure will use a DB cluster with one main instance acting as the "writer": it will handle all writes and replicate the data to the other nodes in the cluster. The other nodes are read-only; only the writer instance can change them, not any other user.

The idea is to route all SELECT queries to the reader nodes (they all share a single endpoint, i.e. I can only configure them as a single "reader" database in CI), while all INSERTs and UPDATEs are sent to the writer node.

What would be the best practice for doing this?

If I have, say, a News model, should I instantiate two models in my controller (one for reading and one for writing)?

It's the first time I am dealing with this kind of setup and I have no idea how to structure it in a CI environment.

All help is much appreciated :)
Reply
#2

Can you define "very heavy loads"? How many hits per second are we talking about?

Do you require people to log in, or can guests view your pages?

You should use Redis/Memcached to cache objects (for logged-in users) and Varnish/Nginx to cache complete pages (for anonymous users).
That will give far more performance than hitting your database every time.

You don't need multiple models; you do, however, need to load different connection configurations depending on the action.
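To illustrate: in CodeIgniter 4 (assuming that is the version in use here), separate connections are defined as database groups in app/Config/Database.php. A sketch of a "reader" group alongside the default writer group might look like the following; the hostnames and credentials are hypothetical placeholders, and most optional settings are omitted for brevity.

PHP Code:
<?php namespace Config;

use CodeIgniter\Database\Config;

class Database extends Config
{
    // Writer: the "default" group, used for everything unless
    // another group is requested explicitly.
    public $default = [
        'hostname' => 'writer.db.example.com', // hypothetical endpoint
        'username' => 'app',
        'password' => 'secret',
        'database' => 'shop',
        'DBDriver' => 'MySQLi',
    ];

    // Reader: points at the shared endpoint of the read-only replicas.
    public $reader = [
        'hostname' => 'reader.db.example.com', // hypothetical endpoint
        'username' => 'app_ro',
        'password' => 'secret',
        'database' => 'shop',
        'DBDriver' => 'MySQLi',
    ];
}

Any group defined this way can then be requested by name via \Config\Database::connect('reader').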
Reply
#3

(This post was last modified: 04-22-2020, 02:54 PM by korgoth.)

Actual numbers are not important at the moment; "high" depends on many factors and could be the result of a traffic spike.

Caches will be put in place, of course, but since the infrastructure is designed with many nodes and the "reader" nodes sit in an autoscaling group, I would like to take advantage of that. There are certain things I won't be able to cache, so calls to the database will be necessary.

Here is a very basic example of the situation with, say, an e-commerce site:
- In the backend I open an order to view/edit. I should also get "connected/similar" orders, which in some cases can be several hundred, plus some external details for those orders and some warehouse information about stocks and packages for each one.
- Now imagine 150 people doing the same thing on a database with several million records in the order tables alone, while the frontend traffic continues: several thousand people making new orders, updating old ones, tracking packages and so on.
- I would like to be able to direct all those read-only queries to the "reader" nodes, as they need to be accurate.

Assuming the question is not about caching, query optimisation or whether or not I would have user logins, what's the best way to achieve this?

Can I switch a model's database on-the-fly by only changing $DBGroup?

Say in my controller's constructor I instantiate a model like this:

PHP Code:
<?php namespace App\Controllers;

use App\Models\OrdersModel;

class Test extends BaseController
{
    protected $model;

    public function __construct()
    {
        $this->model = new OrdersModel();
    }
}
This will create an instance using the default database (which will be the writer one).

Now, if I have a function that makes many SELECT queries on that model and I would like to direct them to the reader nodes, how should I do that?

If I write a function on my model that switches $DBGroup, will that work to change the database on-the-fly?
If it's possible at all, how do I return to my "default" (writer) DB after that? Do another switch?
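For what it's worth, a common pattern in CodeIgniter 4 is to avoid mutating $DBGroup at runtime and instead pass an explicit connection into the model's constructor, which CI4's Model accepts as its first argument. A sketch, assuming a "reader" database group has been defined in app/Config/Database.php (the group name is an assumption, and OrdersModel is the model from this thread):

PHP Code:
<?php namespace App\Controllers;

use App\Models\OrdersModel;
use Config\Database;

class Test extends BaseController
{
    protected $writeModel;
    protected $readModel;

    public function __construct()
    {
        // Writer: no connection passed, so the model falls back to
        // its $DBGroup / the "default" group.
        $this->writeModel = new OrdersModel();

        // Reader: this instance is bound to the "reader" group for
        // its whole lifetime, so every query it runs hits the replicas.
        $this->readModel = new OrdersModel(Database::connect('reader'));
    }
}

Each model instance keeps the connection it was constructed with, so rather than switching a single instance back and forth, keeping two instances (one per group, much as suggested in the first post) avoids the "how do I switch back" problem entirely.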
Reply



