
At the community centre I have a Debian server that routes and filters all network traffic. Is there any easy way to produce graphs of daily internet use by machine, by site visited, etc.? I've tried to set up a few different packages but couldn't figure any of them out - what's the easiest option? Also, at home I have one computer that uses between 2 GB and 3 GB every day. Is there any easy way to limit it to 200 MB a day and then just cut off its internet for the next 24 hours?

On Fri, 20 Jul 2012 09:30:07 +1200, Bruce Kingsbury wrote:
At the community centre I have a Debian server that routes and filters all network traffic. Is there any easy way to produce graphs of daily internet use by machine, by site visited, etc.? I've tried to set up a few different packages but couldn't figure any of them out - what's the easiest option?
Where I work, I've set up a proxy using squid and sarg to monitor internet usage by machine.

http://www.squid-cache.org/ - the proxy
http://sarg.sourceforge.net/ - generates the usage reports
http://squidguardmgr.darold.net/ - configures access and lets sites be blocked if considered necessary
http://dansguardian.org/

IIRC I had to compile squidguardmgr for the block lists, but I understand DansGuardian might be another option. By the time I'd found that one, I was happily running squidGuard.

If you set up the proxy and block internet access for anything not going through it, you should start to get a good idea of the traffic.
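If it helps, the "block anything that doesn't go through the proxy" part is only a couple of iptables rules on the router. This is a rough sketch, not your actual setup - the LAN interface, subnet and squid port are all assumptions:

  # Assumptions: eth1 is the LAN-facing interface, 192.168.1.0/24 is the LAN,
  # and squid listens on port 3128 on this box. Pick one of the two options.
  LAN_IF=eth1
  LAN_NET=192.168.1.0/24

  # Option A, transparent: quietly push plain HTTP from the LAN into squid,
  # so every request ends up in squid's access.log for sarg to report on.
  iptables -t nat -A PREROUTING -i $LAN_IF -p tcp --dport 80 -j REDIRECT --to-ports 3128

  # Option B, explicit proxy: refuse to forward web traffic that tries to go
  # around squid, so browsers have to be pointed at the proxy.
  iptables -A FORWARD -i $LAN_IF -s $LAN_NET -p tcp -m multiport --dports 80,443 -j REJECT

HTTPS can't be transparently intercepted this simply, which is why the explicit-proxy route tends to be tidier if you want everything accounted for.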
Also, at home I have one computer that uses between 2 GB and 3 GB every day. Is there any easy way to limit it to 200 MB a day and then just cut off its internet for the next 24 hours?
Kids? 3 GB down to 200 MB will be a big drop - good luck. DansGuardian again?
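That said, as far as I know DansGuardian is a content filter rather than a per-machine byte counter, so a hard daily cap probably has to live on the router anyway. An untested sketch using the iptables quota match - the client address and the midnight reset (rather than a strict rolling 24 hours) are just assumptions:

  # Count all forwarded traffic to/from the capped machine (placeholder
  # address 192.168.1.50) through one chain with a ~200 MB quota.
  iptables -N DAILYCAP
  iptables -A FORWARD -s 192.168.1.50 -j DAILYCAP
  iptables -A FORWARD -d 192.168.1.50 -j DAILYCAP
  # While under quota the traffic is returned to FORWARD as normal; once the
  # 209715200-byte counter is spent this rule stops matching and the DROP
  # below takes over until the chain is rebuilt.
  iptables -A DAILYCAP -m quota --quota 209715200 -j RETURN
  iptables -A DAILYCAP -j DROP

and a tiny script, run from root's crontab just after midnight (0 0 * * * /usr/local/sbin/reset-dailycap), to start a fresh counter each day:

  #!/bin/sh
  # reset-dailycap: flush the chain and re-add the rules, which zeroes
  # the quota counter for the new day.
  iptables -F DAILYCAP
  iptables -A DAILYCAP -m quota --quota 209715200 -j RETURN
  iptables -A DAILYCAP -j DROP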

I already have squid and DansGuardian in transparent mode for filtering pr0n etc. You think sarg is the easiest option for logging? I can't recall what I've already looked at, but I think I tried sarg once. Might have to take a deeper look at it.

--
Sent from my Ideos X5

On Fri, 20 Jul 2012 10:43:48 +1200, Bruce Kingsbury wrote:
I already have squid and DansGuardian in transparent mode for filtering pr0n etc. You think sarg is the easiest option for logging? I can't recall what I've already looked at, but I think I tried sarg once. Might have to take a deeper look at it.

Just set up a web server and post the sarg reports.
I set this all up on a TurnKey virtual machine. The Webmin interface makes it all pretty easy to set up. Another thread here might be of interest: http://www.turnkeylinux.org/forum/general/20100920/tklpatch-web-filter-proxy
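The report generation itself can just be a nightly cron job that drops the HTML somewhere the web server already serves. A rough sketch, with common Debian paths assumed rather than taken from any particular box:

  #!/bin/sh
  # e.g. /etc/cron.daily/sarg-report (name and paths are only examples):
  # regenerate the squid usage reports and publish them under the web root.
  # Debian's squid3 package may log to /var/log/squid3/access.log instead.
  sarg -l /var/log/squid/access.log -o /var/www/squid-reports

The reports then show up at http://yourserver/squid-reports/ or wherever that directory ends up being served from.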

It's a bit of work, but http://www.pmacct.net/ worked for me in the past - you can account for all traffic, and the level of detail is up to you.

Greg
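For the record, a minimal pmacctd setup is only a few lines. This is an illustrative sketch only - the LAN interface name and per-host aggregation are assumptions, not a recommendation for the setup above:

  # Write a tiny pmacctd config and start the daemon; eth1 (the LAN side)
  # and counting bytes per source host are just example choices.
  cat > /etc/pmacct/pmacctd.conf <<'EOF'
  daemonize: true
  interface: eth1
  aggregate: src_host
  plugins: memory
  EOF
  pmacctd -f /etc/pmacct/pmacctd.conf

  # Read the in-memory counters (packets/bytes per LAN host):
  pmacct -s

From there you can move to the print or SQL plugins if you want history you can graph.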
participants (3):
- Bruce Kingsbury
- Gregory Machin
- mailinglist