Module Version: 0.1.8 (Latest Release: MediaWiki-DumpFile-0.2.2)


NAME

MediaWiki::DumpFile - Process various dump files from a MediaWiki instance


SYNOPSIS

  use MediaWiki::DumpFile;

  $mw = MediaWiki::DumpFile->new;
  $sql = $mw->sql($filename);
  $sql = $mw->sql(\*FH);
  $pages = $mw->pages($filename);
  $pages = $mw->pages(\*FH);
  $fastpages = $mw->fastpages($filename);
  $fastpages = $mw->fastpages(\*FH);
  use MediaWiki::DumpFile::Compat;
  $pmwd = Parse::MediaWikiDump->new;


DESCRIPTION

This module is used to parse various dump files from a MediaWiki instance. The most likely use case is parsing the content provided by the Wikimedia Foundation, which includes the English-language and all other language Wikipedias.

This module is the successor to Parse::MediaWikiDump, acting as a full replacement in feature set and providing a backwards-compatible API that is faster than Parse::MediaWikiDump itself (see MediaWiki::DumpFile::Compat).
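As a minimal sketch of the compatibility layer: loading MediaWiki::DumpFile::Compat makes the Parse::MediaWikiDump package available, as shown in the synopsis above. The dump filename here is hypothetical, and the `pages`, `next`, and `title` accessors follow the Parse::MediaWikiDump API:

```perl
#!/usr/bin/env perl
use strict;
use warnings;

# Provides the Parse::MediaWikiDump package without installing
# the original distribution.
use MediaWiki::DumpFile::Compat;

my $pmwd  = Parse::MediaWikiDump->new;
my $pages = $pmwd->pages('pages-articles.xml');  # hypothetical dump file

# Iterate every page in the dump, printing its title.
while (defined(my $page = $pages->next)) {
    print $page->title, "\n";
}
```

Existing Parse::MediaWikiDump scripts should run unmodified this way; only the `use` line changes.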


STATUS

This software is maturing into a stable and well-tested state with known users; the API is stable and will not change.



METHODS

sql

Returns an instance of MediaWiki::DumpFile::SQL. This object can parse any arbitrary SQL dump file used to recreate a single table in a MediaWiki instance.
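A sketch of iterating an SQL table dump. The filename and column names (`page_id`, `page_title`, from MediaWiki's standard `page` table) are assumptions; the `table_name` and `next` accessors are taken from the MediaWiki::DumpFile::SQL documentation:

```perl
use strict;
use warnings;
use MediaWiki::DumpFile;

my $mw  = MediaWiki::DumpFile->new;
my $sql = $mw->sql('simplewiki-page.sql');  # hypothetical filename

print 'table: ', $sql->table_name, "\n";

# Each call to next() returns one row as a hash reference keyed by
# column name, or undef when the dump is exhausted.
while (defined(my $row = $sql->next)) {
    print "$row->{page_id}\t$row->{page_title}\n";
}
```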


pages

Returns an instance of MediaWiki::DumpFile::Pages. This object parses the contents of the page dump file and supports both single and multiple revisions per article, as well as the associated metadata.
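A sketch of walking a page dump, assuming a hypothetical dump file name; the `sitename`, `next`, `title`, `revision`, and `text` accessors follow the MediaWiki::DumpFile::Pages documentation:

```perl
use strict;
use warnings;
use MediaWiki::DumpFile;

my $mw    = MediaWiki::DumpFile->new;
my $pages = $mw->pages('pages-articles.xml');  # hypothetical filename

# Dump-wide metadata is available on the Pages object itself.
print 'site: ', $pages->sitename, "\n";

while (defined(my $page = $pages->next)) {
    # In scalar context, revision() returns the first revision object
    # in the dump for this page.
    my $rev = $page->revision;
    printf "%s (%d bytes)\n", $page->title, length $rev->text;
}
```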


fastpages

Returns an instance of MediaWiki::DumpFile::FastPages. This object also parses the page dump file, but it supports fetching only the article titles and text, and if the dump includes multiple revisions it returns the text of the first revision only. The trade-off for the reduced feature set is drastically increased processing speed.
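A sketch of the fast path, printing every title and text (the same task as the benchmarks below). The filename is an assumption; `next` returning a (title, text) pair follows the MediaWiki::DumpFile::FastPages synopsis:

```perl
use strict;
use warnings;
use MediaWiki::DumpFile;

my $mw   = MediaWiki::DumpFile->new;
my $fast = $mw->fastpages('pages-articles.xml');  # hypothetical filename

# next() returns a (title, text) pair for each article, or an
# empty list at the end of the dump.
while (my ($title, $text) = $fast->next) {
    print "Title: $title\n$text\n";
}
```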


BENCHMARKS

These benchmarks give a rough idea of how fast you can expect the XML dump files to be processed. The benchmark task is to print all of the article titles and text to STDOUT; it was executed on a 2.66 GHz Intel Core Duo Macintosh running Snow Leopard. The test data is a dump file of the Simple English Wikipedia from October 21, 2009.

MediaWiki-DumpFile-FastPages: 26.16 MiB/sec
MediaWiki-DumpFile-Pages: 8.32 MiB/sec
MediaWiki-DumpFile-Compat: 7.26 MiB/sec
Parse-MediaWikiDump: 3.2 MiB/sec


AUTHOR

Tyler Riddle, <triddle at>


BUGS

Please report any bugs or feature requests to bug-mediawiki-dumpfile at, or through the web interface at. I will be notified, and then you'll automatically be notified of progress on your bug as I make changes.


SUPPORT

You can find documentation for this module with the perldoc command.

    perldoc MediaWiki::DumpFile



ACKNOWLEDGEMENTS

All of the people who reported bugs or feature requests for Parse::MediaWikiDump.


COPYRIGHT & LICENSE

Copyright 2009 Tyler Riddle.

This program is free software; you can redistribute it and/or modify it under the terms of either: the GNU General Public License as published by the Free Software Foundation; or the Artistic License.

See for more information.
