[TAG] Network Traffic Review/Filtering
Kapil Hari Paranjape
kapil at imsc.res.in
Fri Apr 7 05:33:22 MSD 2006
On Thu, 06 Apr 2006, sloopy wrote:
> I run a 8-10 node network at home through a router (a VIA C3 mobo with
> fedora core) and would like to have a way of setting up a web page on it
> that would list URL's being retrieved from the inet, and a nice side option
> of being able to block certain content for some nodes on the network. would
> i need to run a proxy (i.e. squid or similar) for this? or would this be
> over the capabilities of the router machine?
As Suramya pointed out anecdotally, in any (re-)configuration of
routers/firewalls, make sure you understand and can handle the
consequences.
As Francis Daly said, you have three solutions. I'll add a
glimpse of the politics associated with each.
a. Force all nodes to use a web proxy by blocking other nodes
from accessing the web directly (using firewall rules).
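A minimal sketch of the blocking side, assuming iptables on the router
with "eth1" as the LAN interface and "eth0" facing the internet (both
names are assumptions):

```shell
# Hypothetical sketch: refuse direct web access from the LAN, so nodes
# must go through the proxy running on the router itself.
iptables -A FORWARD -i eth1 -o eth0 -p tcp --dport 80 -j REJECT
# The router's own squid process is unaffected: locally generated
# traffic goes through the OUTPUT chain, not FORWARD.
```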
Any web proxy combined with a log analyzer (analog?) can do what
you want here.
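Even without a dedicated analyzer, a quick URL report can be pulled
straight from squid's log. A sketch, assuming squid's "native" log
format (URL in the seventh field); the real input would be
/var/log/squid/access.log, sample lines are used here:

```shell
# Hypothetical sketch: count fetched URLs from squid "native" format
# log lines, where field 7 is the requested URL. The sample entries
# below stand in for /var/log/squid/access.log (an assumed path).
report=$(awk '{ hits[$7]++ } END { for (u in hits) print hits[u], u }' <<'EOF'
1144350802.123  250 192.168.1.10 TCP_MISS/200 4512 GET http://example.com/ - DIRECT/1.2.3.4 text/html
1144350803.456  120 192.168.1.11 TCP_HIT/200  1024 GET http://example.com/ - NONE/- text/html
1144350804.789  300 192.168.1.12 TCP_MISS/200 2048 GET http://example.org/page - DIRECT/1.2.3.4 text/html
EOF
)
echo "$report" | sort -rn
```

Feeding this to a web page on the router is then just a matter of a
cron job writing the report into the document root.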
Provide a ".pac" file (for automatic proxy configuration) for user
This way everyone using the nodes knows what you are doing.
b. Automatically redirect web connections from the nodes to
the web proxy by firewall rules. You need a web proxy (like squid)
that can handle "transparent proxying".
The users need not be told anything but they'll probably find out!
"Transparent" proxying is generally not quite transparent and in my
experience does break a few (very few) sites. Note that web proxies
*are* acounted for by the RFC for HTTP but transparent proxies are
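A sketch of the redirection side, assuming "eth1" as the LAN interface
and squid listening on its default port 3128; the squid directives
shown are the squid-2.5-era transparent settings:

```shell
# Hypothetical sketch: divert LAN web traffic to the local squid.
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 \
         -j REDIRECT --to-port 3128

# squid.conf additions for transparent mode (squid 2.5 era):
#   httpd_accel_host virtual
#   httpd_accel_port 80
#   httpd_accel_with_proxy on
#   httpd_accel_uses_host_header on
```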
c. Use firewall rules to send a copy of all web traffic through a sniffer
which can extract the URL's. You can insert firewall rules to
block/allow specific IP addresses.
Again the users need not be told anything.
You will not be breaking any network protocols by doing this.
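A sketch of the extraction step. In practice the text would come from a
live capture, e.g. tcpdump or dsniff's urlsnarf; here a made-up sample
request stream is parsed instead:

```shell
# Hypothetical sketch: reassemble request URLs from HTTP request lines
# and Host headers. A live capture would be piped in from, e.g.:
#   tcpdump -l -s0 -A -i eth1 'tcp dst port 80'
urls=$(awk '
  $1 == "GET" || $1 == "POST" { path = $2 }
  $1 == "Host:"               { print "http://" $2 path }
' <<'EOF'
GET /index.html HTTP/1.1
Host: example.com

POST /submit HTTP/1.1
Host: example.org
EOF
)
echo "$urls"
```

The resulting list can be fed to the same web page as in option (a),
and the offending IP addresses dropped with ordinary firewall rules.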
Hope this helps,