[eluser]little brittle[/eluser]
I have my CI site using DB sessions with 'sess_use_database' = TRUE, and a cookie expiration of 2 weeks.
The problem is, I currently have 60,000 rows in my ci_sessions table, and I'm not getting a ton of traffic. Based on the user agent, many of those sessions belong to Googlebot or other bots, sometimes with the same IP address. None of these bots need a session to retrieve the data they're after. My concern is that when my site gets a thousand times more traffic, I'll have millions of unnecessary database rows, making it more expensive to find and retrieve valid session data.
Is there a way to prevent bots from creating sessions? Is it a good idea? Has anyone encountered something like this?
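To make the question concrete, here's the kind of thing I'm imagining: a user-agent check so the session library only gets loaded for real visitors. The bot pattern list is just an assumption based on what I see in my table, and `is_bot()` is a hypothetical helper, not anything built into CI:

```php
<?php
// Hypothetical helper: decide whether a request comes from a known crawler.
// The pattern list is illustrative only; extend it to match the user agents
// actually showing up in your ci_sessions table.
function is_bot($user_agent)
{
    return (bool) preg_match(
        '/bot|crawl|spider|slurp|mediapartners/i',
        (string) $user_agent
    );
}

// Then, instead of autoloading the session library, load it conditionally
// in a base controller (e.g. application/core/MY_Controller.php):
//
//   if ( ! is_bot($this->input->user_agent())) {
//       $this->load->library('session');
//   }
```

Is that roughly the right approach, or does skipping the library for some requests break anything else in CI?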