How to get a list of all page names (with namespaces) of a MediaWiki installation?
I want to export all pages (in all namespaces, and with all revisions!) of a public MediaWiki installation. I’m just a visitor, so I have no admin access.
The export is possible with the Special:Export page. It takes a list of page names and exports all given pages in a single XML file. Great!
However, how could I get a list of all pages?
There is Special:AllPages. Problems:
The results are paginated.
The page names (anchor texts) don’t include the namespace prefix.
Is there a better way?
(I know that there may be various special extensions that allow visitors to export pages or download a backup, however, assume that the MediaWiki installation in question doesn’t use any extensions, it’s just a default installation.)
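For a default installation, the MediaWiki Action API (exposed at `api.php`) offers `list=allpages`, which handles both problems: it can be queried per namespace (so you can prepend or rely on the namespace prefix in the returned titles) and it paginates with a `continue` token you can follow programmatically. Below is a minimal sketch; the wiki URL is a placeholder, and the `fetch_json` parameter is an injection point I added so the pagination logic can be exercised without a live wiki.

```python
# Sketch: collect all page titles (namespace prefixes included) via the
# MediaWiki Action API's list=allpages, following "apcontinue" pagination.
import json
from urllib.request import urlopen
from urllib.parse import urlencode

def all_page_titles(api_url, namespaces=(0,), fetch_json=None, limit=500):
    """Yield every page title in the given namespaces.

    fetch_json(params) -> dict is pluggable for testing; by default it
    performs a real HTTP GET against api_url.
    """
    if fetch_json is None:
        def fetch_json(params):
            with urlopen(api_url + "?" + urlencode(params)) as resp:
                return json.load(resp)

    for ns in namespaces:
        params = {
            "action": "query",
            "list": "allpages",
            "apnamespace": ns,
            "aplimit": limit,   # 500 is the usual maximum for anonymous users
            "format": "json",
        }
        while True:
            data = fetch_json(params)
            for page in data["query"]["allpages"]:
                # Titles outside the main namespace carry their prefix,
                # e.g. "Help:Contents".
                yield page["title"]
            cont = data.get("continue")
            if not cont:
                break
            params["apcontinue"] = cont["apcontinue"]
```

The resulting titles, joined with newlines, can be pasted directly into the Special:Export form (or POSTed to it) to get the XML dump.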
If you have no shell access, use the WikiTeam DumpGenerator.py script. It can make an XML dump of the current pages, or of all pages with their full revision history, and it also downloads all available images along with their descriptions.
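The script runs on your own machine and only needs the wiki's public API endpoint. A sketch of a typical invocation, assuming the classic WikiTeam repository layout; the wiki URL is a placeholder, and flag names should be double-checked against `dumpgenerator.py --help` for your version:

```shell
# Fetch the WikiTeam tools
git clone https://github.com/WikiTeam/wikiteam.git
cd wikiteam

# Full-history XML dump plus all images (URL is a placeholder)
python dumpgenerator.py --api=https://wiki.example.org/w/api.php --xml --images

# Current revisions only:
# python dumpgenerator.py --api=https://wiki.example.org/w/api.php --xml --curonly
```

If the dump is interrupted, the script can usually resume from the partially written dump directory rather than starting over.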