[eluser]BrianDHall[/eluser]
In a word - yes, it would slow things down to do it that way. I believe the only efficient way would be something like an htaccess solution, and even that is questionable.
First, this sort of thing is only good advice for really high-traffic sites - like Google. Yahoo's YSlow utility is similar: it sometimes gives bad advice for regular websites, specifically regarding caching.
Yahoo, for instance, sets most of its pages and content to be cached forever. If they want to change or update something, they change the URL. Facebook handles caching of included CSS and JS in their applications the same way - when your app submits a CSS or JS file they rip it out, do strange magical things to it, and cache it with the instruction that a browser should NEVER check for a new version. If you want a new version served, you have to change the name of the file you submit to Facebook.
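Here's a minimal sketch of that change-the-URL idea in PHP (the helper name and asset path are mine, just for illustration): version the URL with something like the file's modification time, so the browser sees a brand-new URL whenever the content actually changes, and "cache forever" headers become safe to use:

[code]
<?php
// Hypothetical helper: builds an asset URL that changes whenever
// the file's contents do, busting any "cache forever" directive.
function versioned_asset($path)
{
    // filemtime() changes when the file is edited, so the query
    // string (and thus the URL the browser caches) changes too.
    $version = file_exists($path) ? filemtime($path) : 0;
    return '/' . $path . '?v=' . $version;
}

// Usage: the ?v= value only changes when css/styles.css is modified.
echo '<link rel="stylesheet" href="' . versioned_asset('css/styles.css') . '" />';
[/code]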
These are all great ideas for massive sites. Let's say a page pulls in 20 files (a few JS and CSS files, various little images, etc.): browsers occasionally ping your server to check whether they have the newest version of each of those files.
You can use Live HTTP Headers in Firefox to see this traffic and when it is generated. If you set cache directives, some browsers will follow them to the letter - they won't even ask your server whether a newer version is available, they'll just assume it isn't until the cache expires.
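For example, the htaccess route might look something like this - a sketch only, assuming Apache with mod_expires enabled on your server:

[code]
# Requires mod_expires. Tells browsers these file types are good
# for a year, so they stop sending re-check requests to the server.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
</IfModule>
[/code]

If you do this, you pretty much have to pair it with the change-the-URL trick above, or users may never see your updated files.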
This saves your server a few requests and a few hits to the file system. For most applications it just adds bugs and maintenance difficulty for no real benefit, but if you need to squeeze every last possible request out of a server that is squealing for dear life - then aggressive caching is one thing to seriously look at.