Keywords: Download, Save, Website, Suck, Local Copy
- Crawl a website and save all its files locally.
- Very simple interface: enter your page URL and press Start. When the crawl finishes, you'll see a save dialog.
- Powerful crawl settings allow rate limiting, black/white listing, setting the user-agent string (spoofing), and more.
- Runs locally, not as a cloud service. Own your own data.
- Option to preserve all files exactly as they were fetched, under their original filenames, or to process them so the local copy can be browsed.
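To make the crawl settings above concrete, here is a minimal sketch of how rate limiting, black/white listing, and a custom user-agent string might be modelled. All class and parameter names are illustrative assumptions, not the app's actual internals.

```python
import time

class CrawlSettings:
    """Illustrative crawl settings: user-agent, delay, black/white lists."""
    def __init__(self, user_agent="Mozilla/5.0 (compatible; ExampleCrawler)",
                 delay_seconds=1.0, blacklist=(), whitelist=()):
        self.user_agent = user_agent        # sent with every request (spoofable)
        self.delay_seconds = delay_seconds  # minimum gap between requests
        self.blacklist = blacklist          # URL substrings to skip
        self.whitelist = whitelist          # if non-empty, only matching URLs

    def allows(self, url):
        """Apply blacklist first, then whitelist (if one is set)."""
        if any(pattern in url for pattern in self.blacklist):
            return False
        if self.whitelist and not any(p in url for p in self.whitelist):
            return False
        return True

class RateLimiter:
    """Enforce a minimum delay between consecutive requests."""
    def __init__(self, delay_seconds):
        self.delay_seconds = delay_seconds
        self._last = 0.0

    def wait(self):
        remaining = self._last + self.delay_seconds - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)
        self._last = time.monotonic()
```

A crawler loop would call `settings.allows(url)` before fetching and `limiter.wait()` between requests; the real app exposes equivalent controls through its settings UI.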
- Some improvements that help avoid missing resources
- Now built to run natively on Intel and Apple Silicon Macs
- Inherits recent improvements to the Integrity crawling engine
22.12.2020
0.4.0
Major Update
- Improvements to the engine mean that certain sites will display properly locally after being saved with the 'process' option
- Updates the selectable user-agent strings and adds more
- Changes the default setting for treating http:// links on the same domain (when starting with an https:// URL)
23.08.2020
0.3.0
Major Update
- Adds 'single page' option
- Adds option to archive all files from a website
- Improvements to the crawling engine; it now finds and processes image URLs within inline styles
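The last item above, finding image URLs inside inline styles, can be sketched as follows. The regex and function name are assumptions for illustration; the app's actual parser is not documented here.

```python
import re

# Matches url(...) values inside CSS, e.g. background-image: url('img/bg.png').
URL_IN_STYLE = re.compile(r"""url\(\s*['"]?([^'")\s]+)['"]?\s*\)""")

def extract_style_urls(html):
    """Return URLs referenced by url(...) inside style="..." attributes."""
    urls = []
    for style in re.findall(r'style="([^"]*)"', html):
        urls.extend(URL_IN_STYLE.findall(style))
    return urls
```

Each extracted URL would then be resolved against the page URL and queued for download like any other resource.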