I have restricted access to specific sites before, but I'm at a loss as to why it is failing now.
I updated the Squid templates as follows:
acl BadWords url_regex rotten
http_access deny BadWords
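As a sanity check on the pattern, I try fetching a matching URL through the proxy and expect Squid's "Access Denied" error page rather than the real site (a minimal sketch; the proxy address 192.168.1.1:3128 is a placeholder for my actual host and port):

# Request a URL that should match the BadWords url_regex; if the
# deny rule fires, Squid returns its 403 error page, not the site.
curl -x http://192.168.1.1:3128 http://www.rotten.com/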
I expanded the template and now have the following /etc/squid/squid.conf:
acl all src 0.0.0.0/0.0.0.0
acl BadWords url_regex rotten
acl manager proto cache_object
acl localsrc src 127.0.0.1 192.168.1.0/255.255.255.0
acl localdst dst 127.0.0.1 192.168.1.0/255.255.255.0
acl SSL_ports port 443 563
acl Safe_ports port 80 21 443 563 70 210 1025-65535 980
acl CONNECT method CONNECT
acl webdav method PROPFIND TRACE PURGE PROPPATCH MKCOL COPY MOVE LOCK UNLOCK
cache_mgr admin@davesshop.net
ftp_user nobody@davesshop.net
http_access allow manager localsrc
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localsrc
http_access deny BadWords
http_access deny all
httpd_accel_host virtual
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
icp_access allow all
miss_access allow all
store_avg_object_size 3 KB
always_direct allow webdav
always_direct allow all
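After editing, I check the syntax and reload the running Squid like this (a sketch, assuming squid is on the PATH and supports the standard -k signals):

# Validate squid.conf for syntax errors, then tell the running
# daemon to re-read the config without a full restart.
squid -k parse
squid -k reconfigure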
Yet I can still access www.rotten.com. Please help me.