I’m trying to archive all the images from a website, this one specifically: https://stevegallacci.com/archive/edf
However, when I use a tool like DownThemAll, it just pulls the thumbnails that link to the full images, not the full images themselves. Dunno if I'm using the software wrong or if that's just a limitation of DownThemAll. Is there any way to bulk download the full images without doing it manually?
It looks like DownThemAll doesn't do recursive search: it doesn't look at any deeper levels of the site, so if you run it on the page you linked, it'll only grab the thumbnails, because it never navigates to the subpage where each full image is stored.
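In other words, a bulk grab has to walk the site by hand, something like this (a rough shell sketch; the href/src patterns below are guesses, so check the actual page source and adjust):

    # 1) fetch the archive page, 2) pull out the root-relative subpage links,
    # 3) fetch each subpage and download the full image it embeds
    curl -s https://stevegallacci.com/archive/edf \
      | grep -oE 'href="/[^"]+"' | cut -d'"' -f2 \
      | while read -r page; do
          curl -s "https://stevegallacci.com$page" \
            | grep -oE 'src="/[^"]+\.(jpg|jpeg|png|gif)"' | cut -d'"' -f2 \
            | while read -r img; do
                wget -nc "https://stevegallacci.com$img"  # -nc: skip files already saved
              done
        done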
A quick search turned up HTTrack, which might do what you're looking for.
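It also has a command-line mode if you'd rather script it; a rough sketch (the depth and filters here are guesses, so tweak them to match the site):

    # Mirror the archive 3 links deep, keeping only image files.
    httrack "https://stevegallacci.com/archive/edf" -O ./edf-mirror \
        "+*.jpg" "+*.jpeg" "+*.png" "+*.gif" -r3

The GUI exposes the same options under its scan rules, if you'd rather point and click.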
You could try wget’s recursive mode:
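Something along these lines (untested against that site, so adjust the depth and extensions as needed):

    # Follow links 2 levels deep (archive page -> image subpage),
    # keep only image files, and wait a second between requests.
    wget --recursive --level=2 --no-parent --no-directories \
         --accept jpg,jpeg,png,gif --wait=1 \
         https://stevegallacci.com/archive/edf

If the full-size images turn out to live on a different host, you'd also need --span-hosts together with --domains to keep the crawl from wandering off.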
DownThemAll lets you set some filters, namely a minimum file size. If you set it high enough, at least the thumbnails and other small stuff won't be downloaded.