From 2f3b2d3684068987691242da4dbda2f09828a56c Mon Sep 17 00:00:00 2001
From: Silvio Rhatto
Date: Sun, 25 Aug 2013 21:52:43 -0300
Subject: Usage and cleaner httrack options

---
 README.mdwn | 8 ++++++++
 1 file changed, 8 insertions(+)

diff --git a/README.mdwn b/README.mdwn
index e9c9d72..3309f60 100644
--- a/README.mdwn
+++ b/README.mdwn
@@ -4,8 +4,16 @@ Feed Crawler
 Download all links from a feed using httrack. This is the engine behind the
 "Cache" feature used by https://links.sarava.org Semantic Scuttle instance.
 
+Usage
+-----
+
+Place this script somewhere and setup a cronjob like this:
+
+`*/5 * * * * /var/sites/arquivo/httracker/httracker &> /dev/null`
+
 TODO
 ----
 
 - Include all sites already donwloaded by scuttler.
 - Support for other fetchers like youtube-dl.
+- Lockfile support.
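The cron entry added above runs httracker every five minutes with no guard against overlapping runs, which the "Lockfile support" TODO item would address. Until that lands, a minimal sketch is to wrap the invocation in `flock(1)`; the lock path below is an assumption for illustration, not part of the commit:

```
*/5 * * * * flock -n /var/lock/httracker.lock /var/sites/arquivo/httracker/httracker &> /dev/null
```

With `-n`, `flock` exits immediately when the lock is already held, so a slow crawl simply causes the next cron invocation to skip rather than pile up.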