BugMeNot
Jun. 13, 2004, 04:55 PM
Post: #1
 
Hi, i'm cross-posting this here and at Y!G prox-list:

There is a new site on the block, which those who like news sites or messing with cookies may find interesting.
Bypass Compulsory Web Registration: http://bugmenot.com/

So i've looked at a lot of news pages lately, and here is a list of over 100 sites that can be accessed (or made less nasty) with a few methods - none of which is new. It's written as an update to my config, but i added some example filters that show how to do the same thing in any config.

I mostly used the methods in this priority: fake referrer, block cookie, block scripts, fake user-agent, fake cookie.
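For anyone unfamiliar with those terms, here is a rough Python sketch of what the header-based methods amount to. The URL and header values are placeholders of my own, not entries from the config:

```python
import urllib.request

# Sketch of the "fake referrer" / "fake user-agent" methods:
# the site inspects these request headers, so we send friendly values.
req = urllib.request.Request(
    "http://example.com/article",  # placeholder URL
    headers={
        # Pretend we arrived from Google.
        "Referer": "http://www.google.com/",
        # Pretend to be Googlebot, which many news sites let in unregistered.
        "User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)",
    },
)
# No Cookie header is sent at all -- that is the "block cookie" method.
# html = urllib.request.urlopen(req).read()
```

Proxomitron does the same thing transparently for the browser, per-site, via the config's filter rules.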

There are very few sites on the list that would normally require you to pay, but since access to them is possible by blocking cookies or scripts, and since it's my business what i allow on my machine and what not, i feel it's ethically correct.
However, i'm not sure. So if the general opinion goes in the other direction, i'll remove them again.

NewsSites.zip (10kb)

Have fun,
sidki
Jun. 14, 2004, 07:50 AM
Post: #2
 
That bugmenot link takes one to a blank-looking page. It has code on it, but nothing visible. What is it supposed to do?
Jun. 14, 2004, 09:41 AM
Post: #3
 
If it's blank then you have issues with your browser or Proxomitron filters.
It gives you login accounts for sites that require you to register.

Try:
http://bugmenot.com/view.php?url=www.nytimes.com
http://bugmenot.com/faq.php
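The lookup URL follows a simple pattern, so you can build it for any site. A small Python sketch; the helper name is mine, and only the `view.php?url=` pattern comes from the links above:

```python
from urllib.parse import urlencode

def bugmenot_view_url(site: str) -> str:
    """Return the BugMeNot lookup URL for a given site hostname."""
    # Pattern taken from the example links; "url" is the query parameter
    # BugMeNot's view page expects.
    return "http://bugmenot.com/view.php?" + urlencode({"url": site})

print(bugmenot_view_url("www.nytimes.com"))
# -> http://bugmenot.com/view.php?url=www.nytimes.com
```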
Jun. 14, 2004, 09:26 PM
Post: #4
 
i have modified the Sydney Morning Herald entry like so:
Code:
[^.]+.smh.com.au/(^rugbyheaven)   $SET(keyword=.gbot.)
www.rugbyheaven.smh.com.au/       $SET(keyword=.gbot.)
i assume this is correct?
it seems to work.

registration at this site is free so i think your method is ethical.
thanks for the new file.

edit:
i think the one line

[^.]+.smh.com.au/(^rugbyheaven) $SET(keyword=.gbot.)

will suffice since only one keyword is being used.
Jun. 15, 2004, 08:38 AM
Post: #5
 
If it should also match 2-level subdomains for that site, you could use "([^/]++.|)smh.com.au/".
However, as far as i can see those URLs are just redirectors
( http://www.rugbyheaven.smh.com.au/ -> http://rugbyheaven.smh.com.au/ ),
so i would use "([^.]+.|)", which is faster.
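Proxomitron's matching language isn't quite regex, but the one-label vs. any-depth distinction discussed above can be illustrated with ordinary Python regexes. These are rough analogues of my own, not drop-in equivalents of the Proxomitron patterns:

```python
import re

# Analogue of "([^.]+.|)smh.com.au/" -- at most one label before the domain.
one_label = re.compile(r"^(?:[^.]+\.)?smh\.com\.au/")

# Analogue of "([^/]++.|)smh.com.au/" -- any number of labels before it.
any_depth = re.compile(r"^(?:[^/]+\.)?smh\.com\.au/")

for host in ("smh.com.au/",
             "rugbyheaven.smh.com.au/",
             "www.rugbyheaven.smh.com.au/"):
    print(host, bool(one_label.match(host)), bool(any_depth.match(host)))
# smh.com.au/                  True  True
# rugbyheaven.smh.com.au/      True  True
# www.rugbyheaven.smh.com.au/  False True
```

Since the www. form only redirects to the one-label form, the cheaper one-label pattern is enough here.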

I listed SMH twice in IncludeExclude, d'oh -- one more time under "fake referrer" which doesn't work any more.
So remove the old lines and add:
Code:
([^.]+.|)smh.com.au/        $SET(keyword=.gbot.)

Thanks,
sidki
Jul. 16, 2004, 01:36 PM
Post: #6
 
Cool!!! I must have scanned this thread too quickly upon my initial visitation...
Jul. 16, 2004, 06:23 PM
Post: #7
 
Nice one, Sidki!

I'm heartily sick of supplying endless registration details and remembering passwords. I only supply bogus details and throwaway spam addresses, so what's the point? Why do these sites bother with such inconvenient measures?

They'll try to dump a tracking cookie on you anyway, which to my mind makes it no more than a concerted effort to mine data, with or without your consent.

_J_G_
Jul. 20, 2004, 07:27 PM
Post: #8
 
Here is a BugMeNot plugin for Mozilla/Firefox:

http://bugmenot.mozdev.org/