
Folks,

Does anybody have any recommendations for free wiki space? I am thinking of maybe wikispaces.org, as it looks OK and the content is under Creative Commons.

I am looking at this for a todo list for DCCP kernel development, and it is a bit too detailed, dynamic and hard-core for the WLUG website, in my opinion...

Ian

Ian McDonald wrote:
> Does anybody have any recommendations for free wiki space? I am thinking of maybe wikispaces.org, as it looks OK and the content is under Creative Commons.

You are a paid-up WLUG member. You could simply install a wiki in your shell account on Hoiho?

> I am looking at this for a todo list for DCCP kernel development, and it is a bit too detailed, dynamic and hard-core for the WLUG website, in my opinion...

In Perry's opinion, it will no doubt be perfect for the WLUG wiki! ;)

Greig.
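For what it's worth, a minimal sketch of the install-it-in-your-shell-account route, assuming a host that serves ~/public_html with PHP available and using PmWiki as one example of a flat-file wiki (the download URL, version glob and paths are assumptions, not confirmed details of Hoiho):

    # Unpack a flat-file wiki into the web-visible part of a shell account.
    cd ~/public_html
    wget http://www.pmwiki.org/pub/pmwiki/pmwiki-latest.tgz
    tar xzf pmwiki-latest.tgz
    mv pmwiki-2.* wiki
    # PmWiki keeps pages as flat files under wiki.d/, which must be
    # writable by the web server process.
    mkdir wiki/wiki.d
    chmod 2777 wiki/wiki.d

A flat-file wiki avoids needing a database on the shared host, which keeps the backup problem down to copying one directory.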

One question then on this: if I use the WLUG wiki, how often is a backup made, and is it shipped offsite? I would hate to see work lost....

On 12/08/05, Greig McGill <greig(a)hamiltron.net> wrote:
> Ian McDonald wrote:
> > Does anybody have any recommendations for free wiki space? I am thinking of maybe wikispaces.org, as it looks OK and the content is under Creative Commons.
>
> You are a paid-up WLUG member. You could simply install a wiki in your shell account on Hoiho?
>
> > I am looking at this for a todo list for DCCP kernel development, and it is a bit too detailed, dynamic and hard-core for the WLUG website, in my opinion...
>
> In Perry's opinion, it will no doubt be perfect for the WLUG wiki! ;)
>
> Greig.

Ian McDonald wrote:
> One question then on this: if I use the WLUG wiki, how often is a backup made, and is it shipped offsite? I would hate to see work lost....

If you do use the WLUG wiki, we do make backups, though I'm not sure of the frequency. We are not responsible for any losses, etc., though. If you use your own wiki on hoiho, you will be responsible for your own backups. Ditto the disclaimer! ;)

G.
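Being responsible for your own backups on hoiho could be as small as one cron job; a sketch, assuming a flat-file wiki whose pages live under ~/public_html/wiki/wiki.d (the paths and schedule are illustrative):

    # crontab entry: nightly tarball of the wiki's page store at 03:15.
    # Assumes ~/backups already exists; note the escaped % signs,
    # since % is special on crontab command lines.
    15 3 * * * tar czf $HOME/backups/wiki-`date +\%Y\%m\%d`.tar.gz -C $HOME/public_html/wiki wiki.d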

On Fri, Aug 12, 2005 at 11:08:57AM +1200, Ian McDonald wrote:
> One question then on this: if I use the WLUG wiki, how often is a backup made, and is it shipped offsite? I would hate to see work lost....

The db is backed up daily, and offsite backups are much more irregular - about once every 3 or 4 weeks, done manually - although this is obviously something we need to look at at some stage. (The db is readable by anyone at the moment, so if you had concerns just after doing a large amount of updates, you could make your own copy somewhere as well.)

John
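If you do grab your own copy, a quick integrity check after the download is cheap insurance that the copy is actually usable (the filename here is illustrative):

    # gunzip -t decompresses to nowhere and reports any corruption.
    gunzip -t wiki-20050812.sql.gz && echo "dump OK" || echo "dump corrupt"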

On 12/08/05, John R. McPherson <jrm21(a)cs.waikato.ac.nz> wrote:
> The db is backed up daily, and offsite backups are much more irregular - about once every 3 or 4 weeks, done manually - although this is obviously something we need to look at at some stage.
>
> (The db is readable by anyone at the moment, so if you had concerns just after doing a large amount of updates, you could make your own copy somewhere as well.)

Cool. This is good enough for me, and I can always take a copy off as you say. Well, my stuff is about to start shifting now...

Ian

Ian McDonald wrote:
> One question then on this: if I use the WLUG wiki, how often is a backup made, and is it shipped offsite? I would hate to see work lost....

How often are you cronning this?

    wget http://www.wlug.org.nz/archive/wiki/ -O - | sed -e 's/.*<a href="\(.*\)".*/\1/' | grep sql.gz | tail -n 1 | xargs -ixxx echo wget http://www.wlug.org.nz/archive/wiki/xxx
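If anyone does end up cronning it, pipes get awkward on a crontab line, so a sketch as a small script plus a crontab entry (the paths, schedule and backups directory are assumptions):

    #!/bin/sh
    # fetch-wlug-dump.sh - fetch the newest published wiki SQL dump.
    BASE=http://www.wlug.org.nz/archive/wiki/
    LATEST=`wget -q "$BASE" -O - \
        | sed -e 's/.*<a href="\([^"]*\)".*/\1/' \
        | grep 'sql\.gz' | tail -n 1`
    # -N only re-downloads when the remote file is newer than our copy.
    wget -q -N "$BASE$LATEST" -P "$HOME/backups/wlug-wiki"

    # crontab entry to run it daily at 04:30:
    30 4 * * * $HOME/bin/fetch-wlug-dump.sh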

* Perry Lorier <perry(a)coders.net> [2005-08-12 02:00]:
> wget http://www.wlug.org.nz/archive/wiki/ -O - | sed -e 's/.*<a href="\(.*\)".*/\1/' | grep sql.gz | tail -n 1 | xargs -ixxx echo wget http://www.wlug.org.nz/archive/wiki/xxx

    wget http://www.wlug.org.nz/archive/wiki/ -O - | sed -e 's|.*<a href="\(.*\)".*|http://www.wlug.org.nz/archive/wiki/\1|' | grep sql.gz | tail -n 1 | wget -i -

Or install the HTML::Parser module (`sudo cpan HTML::Parser`), then download <http://plasmasturm.org/code/linkextor/linkextor>, and then it becomes simply

    linkextor -f 'a::sql\.gz$' http://www.wlug.org.nz/archive/wiki/ | tail -1 | wget -i -

If I may say so myself, being the author of the beast, linkextor is terrifically useful when you need to scrape pages to run automated downloads.

Regards,
--
Aristotle Pagaltzis // <http://plasmasturm.org/>
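For completeness, the other half of any backup scheme is proving the dump restores. A sketch, assuming the dump is a MySQL dump (which the .sql.gz naming suggests); the database name, credentials and filename are illustrative:

    # Load a fetched dump into a local scratch database.
    mysqladmin -u root -p create wlug_wiki_copy
    gunzip -c wiki-20050812.sql.gz | mysql -u root -p wlug_wiki_copy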

> > Does anybody have any recommendations for free wiki space? I am thinking of maybe wikispaces.org, as it looks OK and the content is under Creative Commons.
>
> You are a paid-up WLUG member. You could simply install a wiki in your shell account on Hoiho?

DCCP development is something that would be neat for the lug to support in some way :)

> > I am looking at this for a todo list for DCCP kernel development, and it is a bit too detailed, dynamic and hard-core for the WLUG website, in my opinion...
>
> In Perry's opinion, it will no doubt be perfect for the WLUG wiki! ;)

Of course! We need more networking/programming content in the wiki!
participants (5):
- A. Pagaltzis
- Greig McGill
- Ian McDonald
- John R. McPherson
- Perry Lorier