Stopping people from brute forcing URL / UUIDs
#1

[eluser]RS71[/eluser]
Hey

I have URL segments that correspond to IDs, and I'm trying to decide on the best way to slow down, and if possible stop, people from looping through URLs and extracting the page contents.

I don't necessarily want to limit them to one certain page, just make it hard for them to wander off into other ones by guessing the URL.

The only solution I see as of right now would be to switch my IDs to UUIDs to try to slow down automated extraction.

Could anybody give me some advice?

Also, I've come across the uniqid() function in PHP, but is it reliable, or should I use one of the many other UUID functions out there? (Why would there be so many if PHP has one?) I have performance in mind as well; would UUIDs slow me down greatly?

Thanks in advance, I appreciate it.
#2

[eluser]TheFuzzy0ne[/eluser]
Could you please explain what this page is for?
#3

[eluser]RS71[/eluser]
It's for a listings page/section, and there's another similar page used for displaying users' information, addresses, etc.
#4

[eluser]TheFuzzy0ne[/eluser]
I think as a quick fix, I would encrypt the ID with your site key. That would surely put anyone off the idea of brute force. You could then implement a system that bans someone by IP for a limited time if they try using a URL that contains an invalid ID more than, say, 10 times.

Should your users' addresses be available so readily? Perhaps you could implement a system that only allows a user to see their own address (assuming that's the goal).

EDIT: if the site ID is only a few characters in length, you can pad it out a bit by doing something like this:

Code:
$id = $this->encrypt->encode($id . ':' . md5(time()));

That will make the IDs uglier, but virtually impossible to break. You'd just need to decode the string, and split it to obtain the ID. If you want to pass base64 URLs via the URI, you might want to check out my [url="http://ellislab.com/forums/viewthread/109429/"]URI-Safe Encrypt library extension[/url].
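To make the round trip concrete, here's a plain-PHP sketch of the encode/pad/decode idea above, using openssl_encrypt in place of CodeIgniter's Encrypt class (the function names and $site_key are illustrative, not part of any framework):

```php
<?php
// Sketch of the encode/pad/decode round trip suggested above, using plain
// openssl in place of CodeIgniter's Encrypt class. $site_key stands in for
// your site's encryption key.

function encode_id(int $id, string $site_key): string
{
    $plain = $id . ':' . md5((string) time()); // pad the short id, as suggested
    $iv    = random_bytes(16);
    $ct    = openssl_encrypt($plain, 'aes-256-cbc', $site_key, OPENSSL_RAW_DATA, $iv);
    // URL-safe base64 so the token can live in a URI segment
    return rtrim(strtr(base64_encode($iv . $ct), '+/', '-_'), '=');
}

function decode_id(string $token, string $site_key): ?int
{
    $raw = base64_decode(strtr($token, '-_', '+/'));
    if ($raw === false || strlen($raw) < 17) {
        return null; // malformed token: treat as an invalid-ID attempt
    }
    $iv    = substr($raw, 0, 16);
    $ct    = substr($raw, 16);
    $plain = openssl_decrypt($ct, 'aes-256-cbc', $site_key, OPENSSL_RAW_DATA, $iv);
    if ($plain === false) {
        return null;
    }
    // split off the padding to recover the original id
    [$id] = explode(':', $plain, 2);
    return ctype_digit($id) ? (int) $id : null;
}
```

A failed decode (null) is exactly the "invalid ID" signal the IP-ban counter would hook into.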
#5

[eluser]RS71[/eluser]
I hadn't thought of the IP ban on invalid IDs - very nice. I'd need non-sequential IDs for that to work nicely. Wouldn't encrypting the IDs take a lot of processing power, since I'd need to encrypt and decrypt an ID for every item, for every user viewing the site? I'm not too familiar with using base64 URLs - what would be the benefit?

What do you think of UUIDs?
#6

[eluser]TheFuzzy0ne[/eluser]
Sure. That would work well too. In fact, it would require less overhead than my suggestion, so it would probably be much better.

One final suggestion - I don't know how often a user will/should need to access these pages, but to stop brute force attacks, you could add a CAPTCHA. Obviously this will piss your users off if they have to enter 100 CAPTCHAs in an hour, but only you will know if that's applicable.
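The "ban after N invalid IDs" idea discussed above can be sketched in a few lines. The storage here is an in-memory array purely for illustration; a real app would persist the counters in a database table or cache keyed by IP, and the thresholds are assumptions:

```php
<?php
// Sketch of the temporary-IP-ban counter. In-memory storage and thresholds
// are illustrative only; persist per-IP counters in a DB or cache for real.

const MAX_INVALID = 10;   // invalid lookups allowed before a ban
const BAN_SECONDS = 3600; // length of the temporary ban

function is_banned(array &$attempts, string $ip, int $now): bool
{
    return isset($attempts[$ip]['banned_until'])
        && $attempts[$ip]['banned_until'] > $now;
}

function record_invalid_id(array &$attempts, string $ip, int $now): void
{
    $attempts[$ip]['count'] = ($attempts[$ip]['count'] ?? 0) + 1;
    if ($attempts[$ip]['count'] >= MAX_INVALID) {
        $attempts[$ip]['banned_until'] = $now + BAN_SECONDS;
    }
}
```

The ban expiring on its own (the `> $now` check) keeps it a slowdown rather than a permanent block, which matches the goal of deterring scrapers without locking out legitimate users forever.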
#7

[eluser]jedd[/eluser]
[quote author="RS71" date="1237787495"]I hadn't thought of the IP ban on invalid IDs, very nice. I'd need non-sequential IDs for that to work nicely.[/quote]

You could have honeypot IDs. Though this would be ugly - it's getting very complicated for something that should really be handled by a proper ACL/permissions system.

Quote:Wouldn't encrypting the IDs need lots of processing power since for every item for every user viewing the site

I'd do it at ID creation - probably using md5 to create a second column (right next to ID) and using that new md5 field for all public references (i.e. URL components). You'd want to salt it if you did it that way, of course, and not just base it on md5('1') (etc.).

The base64 stuff was, I think, to ensure no nasties in the URL. MD5 offers the same feature, of course.


But remember, at the end of the day, we are not enamoured with security by obscurity.
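A minimal sketch of the salted-column idea above, in plain PHP - the public id is generated once at record creation and stored next to the real auto-increment id ($salt, the function name, and the column names are all illustrative):

```php
<?php
// Sketch of generating a salted public identifier at record-creation time.
// Fresh randomness is mixed in so the hash can't be precomputed from
// md5('1'), md5('2'), ... as warned above.

function make_public_id(int $id, string $salt): string
{
    return md5($salt . $id . bin2hex(random_bytes(8)));
}

// At creation time (pseudo-SQL for illustration):
//   INSERT INTO listings (id, public_id, ...) VALUES (?, ?, ...)
// At lookup time, URLs carry only the hash:
//   SELECT * FROM listings WHERE public_id = ?
```

Because the hash is stored, lookups are a plain indexed column match - no per-request encryption or decryption, which addresses the processing-power worry raised earlier in the thread.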
#8

[eluser]RS71[/eluser]
[quote author="jedd" date="1237789708"][quote author="RS71" date="1237787495"]I hadn't thought of the IP ban on invalid IDs, very nice. I'd need non-sequential IDs for that to work nicely.[/quote]

You could have honeypot IDs. Though this would be ugly - it's getting very complicated for something that should really be handled by a proper ACL/permissions system.

Quote:Wouldn't encrypting the IDs need lots of processing power since for every item for every user viewing the site

I'd do it at ID creation - probably using md5 to create a second column (right next to ID) and using that new md5 field for all public references (i.e. URL components). You'd want to salt it if you did it that way, of course, and not just base it on md5('1') (etc.).

The base64 stuff was, I think, to ensure no nasties in the URL. MD5 offers the same feature, of course.


But remember, at the end of the day, we are not enamoured with security by obscurity.[/quote]

Thank you for your reply, jedd.

Could you please elaborate on the honeypot IDs? I currently have a login/permissions system set up, but I don't necessarily want to prevent users from viewing those IDs - I'm just trying to make it harder for them to automatically extract information (running a script to loop through all the IDs and scrape the contents). In the end, these IDs will all be available as search results, so I'm still vulnerable, but I guess it'd make things slightly more complicated for them.

Do you have any tips you could give me?

I like the md5 idea, I might implement either that or UUIDs.
#9

[eluser]bretticus[/eluser]
Well, forgive me if I don't understand your problem in its entirety. However, if we're thinking about the same problem: I've become accustomed to having a sequential auto-incremented id for my database records (the primary key) and, in addition, another random key (a UUID of sorts). When I pass a URL out for public consumption, I simply use the random key to look up the record. On the managerial side of the website, I'll use the sequential id. In CI, such a key is easy to generate with the string helper - I use the random_string() function. Works rather nicely.
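A plain-PHP stand-in for the two-key scheme described above - a random alphanumeric key generated once and stored alongside the auto-increment id (the length and alphabet are assumptions):

```php
<?php
// Plain-PHP sketch of a random public key to store next to the sequential
// primary key. Public URLs use this key; admin pages use the real id.

function random_public_key(int $len = 16): string
{
    $pool = 'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
    $key  = '';
    for ($i = 0; $i < $len; $i++) {
        // random_int() is cryptographically secure, unlike rand()/uniqid()
        $key .= $pool[random_int(0, strlen($pool) - 1)];
    }
    return $key;
}
```

Inside CodeIgniter itself, loading the string helper and calling random_string('alnum', 16) does the same job, as the post describes.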
#10

[eluser]jedd[/eluser]
[quote author="RS71" date="1237797879"]
Could you please elaborate on the honeypot IDs?[/quote]

Note - this becomes theoretically moot if you have sufficiently long and random strings to publicly identify your records.

With smaller, incremental ID numbers, though, you would periodically reserve one of these - say, for every 27th normal addition you'd create an additional dummy record with some unique identifying data. Your code would never refer to this record - that is, your site would never generate a URL pointing to one of these records. If your controller ever received a request for one, therefore, it would know that the kind of scraping you're concerned about was occurring. At which point you... do whatever.

Drop a route, ban an IP address, ban that user, etc.

None of these is very effective, of course: the first doesn't stop the user from moving to a different machine, the second can annoy anyone else on the same ISP, and the third assumes an ACL system (or a business policy) capable of managing these rights, which is evidently not present.
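The honeypot check itself is tiny - a sketch, where the reserved ids and the response to a hit are illustrative:

```php
<?php
// Sketch of the honeypot-ID check: a set of reserved dummy ids that no
// generated URL ever points to. A request for one means the visitor is
// enumerating ids by hand or script.

function is_honeypot_hit(int $id, array $honeypot_ids): bool
{
    return in_array($id, $honeypot_ids, true);
}

// In the controller (illustrative):
// $honeypot_ids = [27, 54, 81]; // e.g. every 27th record reserved as a dummy
// if (is_honeypot_hit($requested_id, $honeypot_ids)) {
//     // log it, ban the IP, flag the account... whatever the policy is
// }
```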

Quote:Do you have any tips you could give me?

My big tip is: if you really do have an ACL system in place, you should extend it so that it can handle the permission model you sound like you actually want to use. Or let it go.

I guess I just can't imagine a situation where I would want 'slightly more complicated' as my security mechanism, though perhaps I'm just used to black and white problems.



