About 10 years ago I used to download entire sites for offline use with the HTTrack Website Copier. You could try it on your own website; it will give you a good hierarchy of your static web pages. If all of your pages are reachable through links from the main page, links in the menu, and so on, it should be able to download most of the site. More generally, you can just Google for web crawlers / offline browsers / website downloaders and run one of them to do the job. A rough sketch of the idea is below.
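Here is a minimal Python sketch of that "follow the links from the main page" approach, just to illustrate what those tools do; it assumes a hypothetical site at https://example.com and that the requests and beautifulsoup4 packages are installed:

    # Breadth-first crawl that collects every same-host page reachable by links.
    from collections import deque
    from urllib.parse import urljoin, urlparse

    import requests
    from bs4 import BeautifulSoup

    START_URL = "https://example.com/"          # assumed starting page
    ALLOWED_HOST = urlparse(START_URL).netloc   # stay on the same host

    def crawl(start_url, max_pages=200):
        seen = {start_url}
        queue = deque([start_url])
        pages = []
        while queue and len(pages) < max_pages:
            url = queue.popleft()
            try:
                response = requests.get(url, timeout=10)
            except requests.RequestException:
                continue                          # skip pages that fail to load
            if "text/html" not in response.headers.get("Content-Type", ""):
                continue                          # only parse HTML pages
            pages.append(url)
            soup = BeautifulSoup(response.text, "html.parser")
            for link in soup.find_all("a", href=True):
                target = urljoin(url, link["href"]).split("#")[0]
                if urlparse(target).netloc == ALLOWED_HOST and target not in seen:
                    seen.add(target)
                    queue.append(target)
        return pages

    if __name__ == "__main__":
        for page in crawl(START_URL):
            print(page)

Anything not linked from a crawled page (orphan pages, pages behind forms) won't be found this way, which is the same limitation HTTrack and similar tools have.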
Alternatively, if you know the URL pattern, you can generate the list of URLs and feed it to a download manager to fetch them. I'm not sure whether that works for your site, but I sometimes do it that way (see the sketch below).
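A minimal sketch of that "generate URLs from a known pattern and download them" idea, assuming a hypothetical pattern like https://example.com/articles/page-1.html through page-50.html:

    # Download a numbered sequence of pages into a local folder.
    import os
    import urllib.request

    URL_PATTERN = "https://example.com/articles/page-{n}.html"  # assumed pattern
    OUTPUT_DIR = "offline_copy"

    def download_by_pattern(first=1, last=50):
        os.makedirs(OUTPUT_DIR, exist_ok=True)
        for n in range(first, last + 1):
            url = URL_PATTERN.format(n=n)
            destination = os.path.join(OUTPUT_DIR, f"page-{n}.html")
            try:
                urllib.request.urlretrieve(url, destination)
                print("saved", url, "->", destination)
            except OSError:
                print("skipped", url)  # missing pages are simply skipped

    if __name__ == "__main__":
        download_by_pattern()

The same generated URL list can of course be pasted into a GUI download manager instead.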
Hope this helps.