Having lots of inserts into database for thread analytics
#1

(This post was last modified: 10-09-2016, 01:01 AM by wolfgang1983.)

I have a thread analytics table where each time a new user visits a page it inserts the IP address, a view count of "1", and the thread ID.

If there are over 1,000 rows for each thread, does it slow down the page? Is it OK to create a new row for each new visit? I am just worried about how many rows I can have in my table.


PHP Code:
<?php

class Thread_model extends CI_Model {

    public function thread_analytics($thread_id) {
        $this->db->where('thread_id', $thread_id);
        $this->db->where('ip_address', $this->input->ip_address());
        $query = $this->db->get($this->db->dbprefix . 'thread_analytics');

        if ($query->num_rows() == 1) {

            return false;

        } else {

            $data = array(
                'thread_id' => $thread_id,
                'ip_address' => $this->input->ip_address(),
                'views' => '1',
                'date_created' => date('Y-m-d H:i:s')
            );

            $this->db->insert($this->db->dbprefix . 'thread_analytics', $data);
        }
    }

    public function total_thread_views($thread_id) {
        $this->db->where('thread_id', $thread_id);
        $this->db->where('views', '1');
        return $this->db->count_all_results('thread_analytics');
    }
}

#2

Hi,

I think you are worrying unnecessarily. An insert is not slowed down by the size of the table, and counting results is pretty quick too. DBs were built to do these tasks efficiently; in fact, it never ceases to amaze me how well they perform.
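One thing worth checking, though: make sure the columns you filter on are indexed, or those lookups will eventually turn into full table scans as the table grows. A minimal sketch, assuming MySQL (the index name idx_thread_ip is just an example):

PHP Code:
// Run once, e.g. from a migration: adds a composite index
// covering the two columns the model filters on.
$this->db->query(
    'ALTER TABLE ' . $this->db->dbprefix . 'thread_analytics'
    . ' ADD INDEX idx_thread_ip (thread_id, ip_address)'
);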

You should be wary of table size when you start reaching the limits of your DB and environment, usually (or so I have read, but never experienced) when you get into millions of records. Even then the problem is often just the indexes no longer fitting into RAM, and although I have no experience of doing this myself, very big data sets can be managed in a variety of ways that have much more to do with how you host and configure the database than with any actual DB limitation.

However, you should probably also be doing some clean-up on those tables, for instance summarizing historical data into a summary table and clearing the old detail rows out of your thread table. I suppose it depends on your usage of the data.
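Just to sketch what I mean (assuming MySQL; the thread_view_totals table and the 30-day cutoff are made-up examples, not something from your code):

PHP Code:
// Hypothetical clean-up: fold per-visit rows older than 30 days
// into a per-thread summary, then delete the detail rows.
public function archive_old_views() {
    $cutoff = date('Y-m-d H:i:s', strtotime('-30 days'));

    // thread_view_totals (thread_id PK, views INT) is a made-up
    // summary table; ON DUPLICATE KEY UPDATE is MySQL-specific.
    $this->db->query(
        'INSERT INTO ' . $this->db->dbprefix . 'thread_view_totals (thread_id, views)'
        . ' SELECT thread_id, COUNT(*) FROM ' . $this->db->dbprefix . 'thread_analytics'
        . ' WHERE date_created < ? GROUP BY thread_id'
        . ' ON DUPLICATE KEY UPDATE views = views + VALUES(views)',
        array($cutoff)
    );

    // Remove the detail rows that were just summarized.
    $this->db->where('date_created <', $cutoff);
    $this->db->delete('thread_analytics');
}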

My biggest tables were for a site where I was checking which items a user had viewed. It never seemed right logging potentially 'every resource' times 'every user' just for a viewed change of style, but I never found a cleverer way to do it. The other one was a history table of user actions that became enormous, easily racking up 200 rows per user per visit. That one was more easily solved with a clean-up and archive routine, though.

There are some really interesting reads about all this, although for me I quickly get lost when they start talking about stacks and server environments.
#3

I agree with PaulD. That being said, don't forget that if you want to see the impact of your database calls on a specific page, you can always turn on the Profiler (http://www.codeigniter.com/user_guide/ge...iling.html) for that particular controller/page.
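Enabling it is a one-liner in the controller method you want to profile:

PHP Code:
// Shows queries, execution time, memory usage, etc. at the
// bottom of the rendered page.
$this->output->enable_profiler(TRUE);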



