Free to rheme
There are a lot of discussions, examples, plug-ins, libraries, and suchlike on the topic of user authentication. The problem I have is that many of them boil down to two rules: [0] prohibit users who are not signed in; [1] give an authenticated user a blank cheque.
Yes, they do limit/control users at a reasonably granular level: is the user an admin, or just a random user from the wild who has quickly created an account? However, once a registered/authenticated user is signed in, they essentially have unfettered access to ALL the material on the site accessible at that user level.

So, a simplistic example to demonstrate my problem / labour the point (and images are not specifically the problem: it's easy enough to serve them from a controller that generates a unique hash so they all have hard-to-guess URIs; the problem applies to other resources/pages that are harder to give a unique URI each time they are accessed/displayed within a page):

<img src="/Image/serve/image00001.jpg" />

Here a user could simply foreach() through every image on the site... or do it manually if the resources were rewarding enough... think Zooey Deschanel ⋂ pornRus.com...

Another example: changing the URL in my browser to https://forum.codeigniter.com/thread-78050.html will serve that particular page on this forum... and simply typing a different number at the end, say 78032.html, will then provide access to another (unrelated/unlinked) page. Of course, this could also be done by the user editing links in their browser (if other material needs POSTing with the new link to make it work).

Yes, I understand that ALL of this material IS accessible already... but suppose one doesn't want to make it so easy for a user to flip through/leech one's whole website using something as simple as a foreach()? In other words, what if we specifically want to limit a user to accessing only legitimately presented, searched-for, or connected links, and the subsequently presented links?
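For what it's worth, the "hard-to-guess image URI" part I mentioned can be sketched in a few lines of plain PHP with an HMAC (the secret, route, and function names below are my own illustrative assumptions, not an existing CodeIgniter API): the serving controller only returns a file when the signature in the query string matches, so foreach()-ing over neighbouring filenames yields nothing.

```php
<?php
// Illustrative sketch: sign each image URL with an HMAC so a user cannot
// enumerate /Image/serve/image00001.jpg, image00002.jpg, ...
// URL_SECRET is an assumed server-side secret; keep it out of the repo.

const URL_SECRET = 'replace-with-a-long-random-server-side-secret';

// Build a tamper-evident URL for an image the user is legitimately shown.
function signedImageUrl(string $filename): string
{
    $sig = hash_hmac('sha256', $filename, URL_SECRET);
    return '/Image/serve/' . rawurlencode($filename) . '?sig=' . $sig;
}

// In the serving controller: reject any request whose signature does not
// match, so guessing a neighbouring filename gets a 404/403, not the file.
function verifyImageRequest(string $filename, string $sig): bool
{
    $expected = hash_hmac('sha256', $filename, URL_SECRET);
    return hash_equals($expected, $sig); // constant-time comparison
}
```

Note that this only stops enumeration; a signed URL can still be shared, so it complements (rather than replaces) the per-user access check.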
Effectively, the functionality I'm looking for would be like a constantly updated CI filter that only permits "expected" links to be processed (digesting something like $request->getServer(['SERVER_PROTOCOL', 'REQUEST_URI']) and comparing it against the links on the page the request was sent from), so that only links actually presented on a page will be considered for processing / replied to. Needless to say, this will likely get pretty messy very quickly if there are parameters POSTed, and/or when JavaScript is thrown into the mix (where it generates some of the posted material). Does anyone have a clever way to do this, or some insightful advice that would point me in a sensible direction? Thanks.
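To make the "only expected links" idea concrete, here is a minimal sketch assuming a plain PHP session is available to remember what the last rendered page contained (the function names and session key are illustrative assumptions, not CodeIgniter's filter API):

```php
<?php
// Sketch of a per-session link whitelist. While rendering a page, register
// every link you emit; in a front-controller filter, refuse any URI that
// was not registered. Names here are assumptions, not a CI4 API.

session_start();

// Call once per link emitted while rendering a page.
function registerExpectedLink(string $uri): void
{
    $_SESSION['expected_links'][$uri] = true;
}

// Call at the top of the filter; deny the request when this returns false.
function isExpectedLink(string $requestUri): bool
{
    return isset($_SESSION['expected_links'][$requestUri]);
}

// Usage while rendering: the page links to thread 78050, so only that
// thread is accepted on the next request; a guessed 78032 is not.
registerExpectedLink('/thread-78050.html');
```

A real version would need to expire old entries and cope with multiple open tabs, which is exactly where the "messy" part the post anticipates comes in.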
Did you check out this library?
PHP URL Hash Value HMAC: generate hashes for URLs to prevent tampering.
I have had a look at what you suggested, thanks InsiteFX... it is an implementation of a similar concept to what I've used for the images on my site.
My headache is that my site is largely JavaScript-driven, so although hash-protecting links is no bad thing, the issue I'm facing is figuring out a way to obfuscate the more critical $_POSTed content in some sensible (and implementable) way. Hashes are only useful for confirming what is already known, which means I'd have to keep a 'copy' of what I was expecting as possible replies (which, for the purpose at hand, I could essentially do without hashing the data). Perhaps the most sensible approach would be some simple/fast encryption scheme (with a bit of random mixed in) that decouples the page presentation from the POSTed replies(?).
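That last idea might be sketched like this, assuming the openssl extension is available (all names are illustrative, not an existing API): the server embeds an opaque encrypted token in the rendered form that lists the expected field names plus a random IV, and on submission it decrypts the token and rejects any POST whose fields don't match what that page actually offered.

```php
<?php
// Sketch: decouple a rendered form from its POSTed reply via an opaque
// token. FORM_KEY is an assumed 32-byte server secret (aes-256).

const FORM_KEY = '0123456789abcdef0123456789abcdef';

// While rendering: encrypt the list of expected field names with a random
// IV (the "bit of random"), and emit the result as a hidden form field.
function issueFormToken(array $expectedFields): string
{
    $iv     = random_bytes(12); // fresh per render, so tokens never repeat
    $cipher = openssl_encrypt(json_encode($expectedFields), 'aes-256-gcm',
                              FORM_KEY, OPENSSL_RAW_DATA, $iv, $tag);
    return base64_encode($iv . $tag . $cipher);
}

// On POST: decrypt the token; GCM authentication makes tampering fail.
function expectedFieldsFromToken(string $token): ?array
{
    $raw = base64_decode($token, true);
    if ($raw === false || strlen($raw) < 28) {
        return null; // malformed or tampered
    }
    $iv      = substr($raw, 0, 12);
    $tag     = substr($raw, 12, 16);
    $payload = openssl_decrypt(substr($raw, 28), 'aes-256-gcm',
                               FORM_KEY, OPENSSL_RAW_DATA, $iv, $tag);
    return $payload === false ? null : json_decode($payload, true);
}

// Reject any submission containing fields this page never presented.
function postIsLegitimate(array $post): bool
{
    $expected = expectedFieldsFromToken($post['form_token'] ?? '');
    if ($expected === null) {
        return false;
    }
    $fields = array_diff(array_keys($post), ['form_token']);
    return array_diff($fields, $expected) === [];
}
```

This checks field *names* only; a fuller version could also bind the token to the session ID and an expiry time so it can't be replayed from another page or another user.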