MediaWiki sites have all of their content in a relational database (RDBMS). The code for generating a sitemap basically just runs a SQL SELECT query to pull up the necessary information for every page; it's probably doable in a single query that returns one row per page. The code for that is fairly simple, really.
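As a minimal sketch of what that could look like, assuming a MySQL backend and the pymysql driver (the host, credentials, and example.org domain are placeholders; the page table with its page_title, page_touched, page_namespace and page_is_redirect columns is part of MediaWiki's standard schema):

```python
# Sketch: pull one row per content page from MediaWiki's `page` table.
# Host, user, password and database name are placeholders.
import pymysql

conn = pymysql.connect(host="localhost", user="wikiuser",
                       password="secret", database="wikidb")

# Namespace 0 is regular articles; skip redirects.
# page_touched is a timestamp of the last change/touch to the page.
query = """
    SELECT page_title, page_touched
    FROM page
    WHERE page_namespace = 0 AND page_is_redirect = 0
"""

with conn.cursor() as cur:
    cur.execute(query)
    for title, touched in cur.fetchall():
        # page_title is stored as binary, so decode it if needed.
        name = title.decode("utf-8") if isinstance(title, bytes) else title
        # Article URLs are normally https://<wiki>/wiki/<title>
        print(f"https://example.org/wiki/{name}  (last touched {touched})")

conn.close()
```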
Any large site that uses a content management system (CMS) will have an equally easy time generating a sitemap, even if there are a million pages: query the database, then format the results into the sitemap XML format. It's pretty much the same kind of code as a search, just with one less WHERE clause (so it returns everything) and no pagination. The database type and schema affect how easy this is, but in general a CMS will store the page name, the URL (or rather the fields needed to generate a URL), the modification date and so on as columns in the database.
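The formatting step is just a loop over the query results. A rough illustration, not tied to any particular CMS, assuming each row carries a URL and a modification date (the row data here is made up):

```python
# Sketch: turn (url, last_modified) rows from a CMS database into
# sitemap XML per the sitemaps.org protocol.
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(rows):
    """rows: iterable of (url, last_modified_date) tuples."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url, lastmod in rows:
        parts.append("  <url>")
        parts.append(f"    <loc>{escape(url)}</loc>")
        parts.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")
        parts.append("  </url>")
    parts.append("</urlset>")
    return "\n".join(parts)

# Example with stand-in data; in practice `rows` would come from the
# SELECT described above.
print(build_sitemap([
    ("https://example.org/wiki/Main_Page", date(2024, 1, 15)),
    ("https://example.org/wiki/Help:Contents", date(2023, 11, 2)),
]))
```

One practical wrinkle for a million-page site: the sitemap protocol caps each file at 50,000 URLs, so the same loop just gets split across multiple sitemap files that are then listed in a sitemap index.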
This question and your other two make it seem like you don't really understand that MediaWiki sites use a relational database, not a bunch of directories full of files.
Do you have a large site you're trying to generate sitemaps for? How is the data stored? Plain old-fashioned files on a filesystem?