[eluser]Jondolar[/eluser]
[quote author="hikari" date="1258731462"]Is the cache rewriting a new page on every visit or what ?? 7000 files should slow down the system when ci wants to access a cached file.[/quote]
Having the OS find your page in a directory of 7,000 files is still much more efficient than having PHP re-render the page. Do you know how many pages your site will have? Windows will bog down once a directory holds roughly 20,000 to 30,000 files unless you format the disk with special parameters for the master file table (MFT). Under Unix, from what I've read, you probably won't see issues until around 200,000 files.
If your site has been indexed by Google several times, then the cache has probably already reached its full set of files. You can check your logs to see when Google last visited, then compare that against the modification dates of your cache files to determine whether the files are being recreated on every visit.
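That timestamp check is easy to script. Here's a minimal sketch in Python; the function name is made up for illustration, and you'd point it at your actual cache directory (usually system/cache/ in CodeIgniter):

```python
import os
import time

def cache_mtime_summary(cache_dir):
    """Return (file_count, newest_age_s, oldest_age_s) for files in cache_dir.

    If the newest age is always near zero right after a crawler visit,
    the cache files are likely being rewritten on every request.
    """
    now = time.time()
    ages = []
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        if os.path.isfile(path):
            ages.append(now - os.path.getmtime(path))
    if not ages:
        return (0, None, None)
    return (len(ages), min(ages), max(ages))
```

If the oldest file is weeks old while the newest is seconds old, only a few pages are being refreshed; if everything is seconds old, the whole cache is being regenerated constantly.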