Greg Jones wrote:
> Hi Don,
>
> I used sitesucker on Mac to do it, but the magic is that you need to grab all
of the subdomains to get everything. When you browse your local copies, you'll
need to know which subdomain you want to look at unless you spend some time
fixing the local html.
>
> wkfinetools.com
> contrib1.wkfinetools.com
> contrib2.wkfinetools.com
> huk1.wkfinetools.com
> hus-boringt.wkfinetools.com
> hus-saws1.wkfinetools.com
> library.wkfinetools.com
> otools1.wkfinetools.com
> tmaking.wkfinetools.com
> trestore.wkfinetools.com
> wworking.wkfinetools.com
On Linux, wget can retrieve from multiple domains AND
update the links for you:
https://www.gnu.org/software/wget/manual/wget.html
https://www.gnu.org/software/wget/manual/wget.html#Spanning-Hosts
https://www.gnu.org/software/wget/manual/wget.html#Recursive-Retrieval-Options
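For the subdomain list above, a single wget invocation along these lines should work (a sketch, untested against the actual site; wget's `-D` does suffix matching, so `-D wkfinetools.com` accepts all of the subdomains):

```shell
# Mirror the site and all its subdomains into the current directory.
wget --mirror \
     --span-hosts \                     # -H: allow leaving the start host
     --domains=wkfinetools.com \        # -D: but only within *.wkfinetools.com
     --page-requisites \                # fetch images/CSS needed to render pages
     --convert-links \                  # rewrite links for local browsing
     http://wkfinetools.com/
```

`--convert-links` is what saves you from fixing the local HTML by hand: after the download finishes, cross-subdomain links are rewritten to point at the local copies.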
BugBear