Clone an abandoned MediaWiki site
Is there any way to clone a MediaWiki site that's been abandoned by the owner and all admins? None of the admins have been seen in six months, all attempts to contact them over the past 3-4 months have failed, and the community is worried about the future of the wiki. We have all put countless hours into it, and losing it now would be beyond devastating.
What would be the simplest way to go about this?
Thanks.
You can back up a wiki without server shell access. The tool below requires Python 2 (Python 3 didn't work yet the last time I did this).
From the command line, run the WikiTeam Python script dumpgenerator.py to get an XML dump, including full edit histories, with all images and their descriptions:
python dumpgenerator.py --api=http://www.abandoned.wiki/w/api.php --xml --images
Note that this XML dump is not a complete backup of the wiki database: it doesn't contain user accounts, extensions and their configuration aren't backed up, and file types other than images aren't saved. But it does save enough to recreate the wiki on another server.
Full instructions are at the WikiTeam tutorial.
For restoring the wiki from the XML dump, see the MediaWiki page Manual:Importing XML dumps.
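Before importing, it can help to sanity-check the dump you downloaded. Here's a minimal sketch that counts pages and lists their titles with Python 3's standard library (the inline SAMPLE_DUMP and the export-0.10 schema URI are illustrative; substitute your real dump file and its declared namespace):

```python
# Sanity-check a MediaWiki XML dump: count <page> elements and read titles.
import xml.etree.ElementTree as ET

# Illustrative stand-in for the XML dump produced by dumpgenerator.py.
SAMPLE_DUMP = """<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page><title>Main Page</title><ns>0</ns></page>
  <page><title>Help:Contents</title><ns>12</ns></page>
</mediawiki>"""

def page_titles(xml_text):
    """Return the titles of all <page> elements, namespace-aware."""
    ns = {"mw": "http://www.mediawiki.org/xml/export-0.10/"}
    root = ET.fromstring(xml_text)
    return [p.findtext("mw:title", namespaces=ns) for p in root.findall("mw:page", ns)]

titles = page_titles(SAMPLE_DUMP)
print(len(titles), titles)  # → 2 ['Main Page', 'Help:Contents']
```

For a real multi-gigabyte dump you'd stream with ET.iterparse() instead of loading the whole file, but the idea is the same: if the page count looks plausible, the dump is worth importing.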
You can use the API to export all the text content, with something like action=query&generator=allpages&export. Files you'll have to scrape with a script such as Pywikibot. You can see which extensions are installed via Special:Version if you want to set up an identical wiki; some of the configuration settings are available via the siteinfo API, but most you'll have to guess. There is no way to bulk-clone user accounts, but you can use the MediaWikiAuth extension to transfer them as users log in.
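The API call above can be sketched as a URL builder. This is an assumption-laden example, not a tested client: the endpoint is a placeholder, and note that parameters passed to a generator get a g prefix (gapfrom, gaplimit):

```python
# Build a MediaWiki API request that exports pages in bulk:
# action=query with generator=allpages and the export flag.
from urllib.parse import urlencode

API = "http://www.abandoned.wiki/w/api.php"  # placeholder endpoint

def export_url(apfrom="", aplimit=50):
    """URL returning up to `aplimit` pages as export XML, starting at `apfrom`."""
    params = {
        "action": "query",
        "generator": "allpages",
        "gapfrom": apfrom,    # generator params take a "g" prefix
        "gaplimit": aplimit,  # server-side caps still apply
        "export": 1,          # include export XML in the response
        "format": "json",
    }
    return API + "?" + urlencode(params)

print(export_url("Main_Page"))
```

To walk the whole wiki you would loop, feeding each response's continue value back in as the next gapfrom.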
MediaWiki pages can be exported in a special XML format and then imported into another MediaWiki installation.
You can use Special:Export, which is available on most standard MediaWiki installations.
At the least you can get all pages of each namespace.
IMHO this depends on the size: it works well for small wikis, but I've never tried to get an XML dump of a huge wiki (like Wikipedia ;)
But it's worth trying.
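The Special:Export route can also be scripted: the page accepts a POST with a newline-separated list of titles. A hedged sketch, assuming the common form field names (pages, curonly, wpDownload) and a placeholder wiki URL:

```python
# Prepare a POST to Special:Export for a batch of page titles.
from urllib.parse import urlencode

EXPORT = "http://www.abandoned.wiki/wiki/Special:Export"  # placeholder URL

def export_request(titles, full_history=False):
    """Return (url, urlencoded_body) for a POST to Special:Export."""
    data = {
        "pages": "\n".join(titles),  # one title per line
        "wpDownload": "1",           # ask for a downloadable file
    }
    if not full_history:
        data["curonly"] = "1"        # current revisions only
    return EXPORT, urlencode(data)

url, body = export_request(["Main Page", "Help:Contents"])
print(url)
print(body)
```

You would send the body with any HTTP client and save the XML response, then import it on the new server the same way as any other dump.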