Backups

MediaWiki Scraper
You can generate a database dump and file dump of any public wiki using the MediaWiki Client Tools' MediaWiki Scraper, a Python 3 dumpgenerator script (full instructions are at that link).

Example usage
The result will include an XML dump with full page history, a dump of all images and files along with their associated descriptions, and a siteinfo.json file containing information about the wiki's features, such as its installed extensions and skins.
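A typical invocation looks like the following. This is a sketch only: the URL is a placeholder, and the exact command name and flag syntax can vary between dumpgenerator versions, so check the full instructions linked above for your installed version.

```shell
# Sketch of a wikiteam-style dumpgenerator run (verify flags against your version):
#   --xml    request the XML dump with full page history
#   --images request all images and files plus their description pages
# https://wiki.example.org is a placeholder for the target wiki's URL.
dumpgenerator --xml --images https://wiki.example.org
```

The dump is written to a local directory, so make sure you have enough free disk space for the wiki's full history and media before starting.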

Private wikis
To dump a private wiki, it will have to be temporarily switched from private to public.