
Squid Access List


r_balest

Hmm, okay, I'm not sure what my mistakes are.

So here goes:

 

1. I want my clients to be unable to access websites whose addresses contain words like "sex" or "porn"...

2. I want my clients to be unable to access the EXACT site www.xxx.com.

 

How do I apply that in Squid?

 

I used url_regex, but I can still access what I denied. Strange.

 

Here's my configuration:

1. acl denied_phrase url_regex "/etc/squid/denied_phrase"

I created a file in /etc/squid called denied_phrase

vi /etc/squid/denied_phrase

I typed: sex

porn

bla bla bla

service squid restart...

When I tried to access some sites containing those words, I could still access them... why is that?

2. For the exact site, I don't know what to use... help...
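I'm guessing I might also need an http_access rule for the acl, and maybe a separate acl type for the exact site. Something along these lines, perhaps (just my guess, I haven't tried it yet):

# match URLs containing any word listed in the file (-i makes it case-insensitive)
acl denied_phrase url_regex -i "/etc/squid/denied_phrase"
# match the exact site by its hostname
acl denied_site dstdomain www.xxx.com

http_access deny denied_phrase
http_access deny denied_site
# ...and only after these, my existing allow rules

Does that look right, or am I missing something?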

 

 

Thanks a lot


I did this before using the blacklists from squidGuard's website. Check it out; it also gives some ideas on how to implement it.

 

You can even install squidGuard from Mandriva's repositories if you decide to use it. I did it without squidGuard, but used the blacklists.
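If I remember right, installing it from the repos is just something like this (the exact package name may differ):

urpmi squidguard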


That's all I know of. I'll hopefully check it later this week, as I'm out of the office the next couple of days. If not, I'll build a Squid box on Monday with the relevant config and post it here.


Yep, the site is down, but you can get the blacklists here:

 

http://squidguard.shalla.de/

 

Alternatively, squidGuard's website is archived here:

 

http://web.archive.org/web/20060824110625/...squidguard.org/

 

if you prefer getting it from the original source. I will try building the proxy either today or tomorrow if I get a chance, but it's unlikely to be until next week.


ian, do you have YM or something like that?

I want to ask more about my Linux setup ;))

 

I mostly use ICQ, but yeah, I've got Yahoo and MSN as well; it depends on whether you're lucky enough to catch me online :P Send me a PM if you'd like to ask some more about your system.


Sorry for the delay, I haven't had time until now. OK, here's how I did it:

 

First, download this file:

 

http://ftp.teledanmark.no/pub/www/proxy/sq...acklists.tar.gz

 

This gives you the blacklist files. I don't know how up to date they are, but they're working quite well on my system. Now, you need to create some ACLs in the squid.conf file that look like this:

 

acl ads url_regex "/etc/squid/blacklists/ads/domains" "/etc/squid/blacklists/ads/urls"
acl aggressive url_regex "/etc/squid/blacklists/aggressive/domains" "/etc/squid/blacklists/aggressive/urls"
acl drugs url_regex "/etc/squid/blacklists/drugs/domains" "/etc/squid/blacklists/drugs/urls"
acl gambling url_regex "/etc/squid/blacklists/gambling/domains" "/etc/squid/blacklists/gambling/urls"
acl hacking url_regex "/etc/squid/blacklists/hacking/domains" "/etc/squid/blacklists/hacking/urls"
acl porn url_regex "/etc/squid/blacklists/porn/domains" "/etc/squid/blacklists/porn/urls"
acl proxy url_regex "/etc/squid/blacklists/proxy/domains" "/etc/squid/blacklists/proxy/urls"
acl violence url_regex "/etc/squid/blacklists/violence/domains" "/etc/squid/blacklists/violence/urls"
acl warez url_regex "/etc/squid/blacklists/warez/domains" "/etc/squid/blacklists/warez/urls"

 

As you can see, these are the access control lists for blocking based on the relevant category. The directory path is where I extracted the files, so you may need to change it depending on where your Squid lives, or where you extracted them.
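For example, assuming you saved the download as blacklists.tar.gz, something like this should put the files where the acls above expect them (adjust the paths to your own layout):

cd /etc/squid
tar xzf /path/to/blacklists.tar.gz
# you should end up with /etc/squid/blacklists/ads/domains, /etc/squid/blacklists/porn/urls, and so on;
# if the archive unpacks under a different directory name, just rename or move it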

 

Next up, you need to deny access, so you'll need some of these:

 

http_access deny ads
http_access deny aggressive
http_access deny drugs
http_access deny gambling
http_access deny hacking
http_access deny porn
http_access deny proxy
http_access deny violence
http_access deny warez

 

The order is important: rules are read from top to bottom, so to block these categories the deny rules should be near the top of the http_access rules. Otherwise, as soon as an earlier rule's condition is met, like:

 

http_access allow our_networks

 

for example, nothing gets blocked, because the request has already matched the IP addresses allowed access by that ACL.
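So, roughly speaking, the top of your http_access section wants to look like this (our_networks here is just an example name for your local-clients acl):

http_access deny ads
http_access deny porn
# ...and the rest of the deny rules...
http_access allow our_networks
http_access deny all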

 

There's also this link; it might work better for you: http://squidguard.shalla.de/index.html

 

This gives some config for actually using squidGuard. I'm not using squidGuard, just Squid on its own. The lists might take a while to process, and can therefore cause delays when visiting websites. Using squidGuard might actually be faster.
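If you do decide to try squidGuard, the rough idea is: Squid hands each URL to squidGuard as a redirector, and squidGuard checks it against the blacklists. A minimal sketch, where the paths and the blocked-page URL are just placeholders for your own:

# in squid.conf (on newer Squids the directive is called url_rewrite_program)
redirect_program /usr/bin/squidGuard -c /etc/squid/squidGuard.conf
redirect_children 5

# and then /etc/squid/squidGuard.conf, in squidGuard's own format:
dbhome /etc/squid/blacklists
logdir /var/log/squidGuard

dest porn {
    domainlist porn/domains
    urllist    porn/urls
}

acl {
    default {
        pass !porn all
        redirect http://your.server/blocked.html
    }
}

squidGuard reads those domains/urls files (or compiled .db versions of them), so it copes with big lists better than plain url_regex acls.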


Wow, ian... thanks for the explanation.

I will try them around Tuesday... I will post my results here, OK?

Thanks a lot

 

Meanwhile, I'm having trouble accessing websites through the Squid proxy. It has suddenly become slower and slower, but if I point my clients at WinGate, it runs flawlessly...

I actually haven't changed anything in Squid, so how come it has become slower?

Any idea, ian?


Squid needs a fast machine, in particular one that can handle a lot of I/O requests to the hard disk, plus a good network card. That's the most important aspect of it.

 

Also, make sure it can resolve hostnames easily and gets DNS responses quickly. Otherwise, check the default amount of disk space allocated to the cache, in case it is too small.
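The squid.conf settings worth a look are cache_dir (the default cache size is quite small) and dns_nameservers if the system DNS is slow. The numbers and paths below are only illustrative:

# cache_dir <type> <path> <size-in-MB> <L1-dirs> <L2-dirs>
cache_dir ufs /var/spool/squid 2000 16 256

# point Squid at a fast/close resolver if the system one is sluggish
dns_nameservers 192.168.1.1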


I have read that DansGuardian is faster at filtering than squidGuard and Squid; it seems to be better at managing large lists. It can also do content filtering (blocking pages based on content, rather than only on address or name). I run DG on an Athlon XP 2800+ for single (desktop) use, with lots of lists, and never see any delay. The largest list has over half a million entries... Note that DG only filters; it still relies on a proxy for some other tasks (which I do not quite understand). I run my setup with Squid.

 

The default blacklists that come with squidGuard are old. You can find more up-to-date lists (using the same taxonomy) at the website of the University of Toulouse and at a US website (an education board or something?) called MESD, which also has some good background on squidGuard. MESD also allows you to update your blacklists via rsync, which typically reduces the download size by about 85% (I have a script to do this weekly). Note that not every list type that Toulouse has is also in MESD; I update the lists for ads and phishing from Toulouse. Another huge repository of blacklists is ISAK, which is based on Mozilla's Open Directory; I do recall that their lists need some editing before they are workable for DG, though (but this may not be the case for squidGuard).
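The weekly update itself is basically one rsync command in a cron script, roughly along these lines (the rsync source is a placeholder here; use the rsync URL that MESD actually publishes, and point the destination at wherever your lists live):

#!/bin/sh
# pull down any changed blacklist files, deleting ones removed upstream
rsync -av --delete rsync://REPLACE-WITH-MESD-HOST/blacklists/ /etc/squid/blacklists/
# then reload whatever reads the lists; if Squid itself loads them, this does it:
squid -k reconfigure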

