Manual:dumpBackup.php
MediaWiki file: dumpBackup.php
Location: maintenance/
Source code: master • 1.41.1 • 1.40.3 • 1.39.7
Classes: DumpBackup
Details
The dumpBackup.php file creates an XML dump for export or backup. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data. dumpBackup.php does not create a full backup of the wiki database: the dump does not contain user accounts, images, edit logs, deleted revisions, etc.[1] Once the dump is completed, you can import the XML dump.
Examples
General examples
You must choose a name for the data dump. In this example the user saves all of the revision history (--full) into a file named dump.xml:
php dumpBackup.php --full --quiet > dump.xml
- For more details on this dump.xml example, see Detailed example below.
You can restrict the data dump to one namespace. This example dumps only templates, each with its current revision:
php dumpBackup.php --current --quiet --filter=namespace:10 > templates.xml
or templates with all of their revisions:
php dumpBackup.php --full --quiet --filter=namespace:10 > templates.xml
To include multiple namespaces:
php dumpBackup.php --current --quiet --filter=namespace:10,11 > templates_plus_template_talk.xml
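Since dumps of busy wikis can get large, the namespace filter combines naturally with a compressed output stream via --output (documented under Options below). A minimal sketch; the output path is an assumption:

```shell
# Dump the Template and Template talk namespaces, gzip-compressed
# (templates.xml.gz is an assumed output filename).
php dumpBackup.php \
    --current \
    --quiet \
    --filter=namespace:10,11 \
    --output=gzip:templates.xml.gz
```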
Example usage of a plugin:
php dumpBackup.php \
    --plugin=AbstractFilter:extensions/ActiveAbstract/AbstractFilter.php \
    --current \
    --output=gzip:/dumps/abstract.xml.gz \
    --filter=namespace:NS_MAIN \
    --filter=noredirect \
    --filter=abstract \
    --quiet
or
php dumpBackup.php \
    --plugin=MathMLFilter:../extensions/MathSearch/maintenance/MathMLFilter.php \
    --current \
    --filter=namespace:NS_MAIN \
    --filter=mathml \
    --quiet
The --stub option can be used with dumpTextPass.php .
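The two-pass workflow writes a stub dump first (page and revision metadata without the text), then dumpTextPass.php fills in the revision text. A sketch of the two passes; the filenames are assumptions:

```shell
# Pass 1: stub dump, metadata only (stub.xml.gz is an assumed filename)
php dumpBackup.php --full --stub --quiet --output=gzip:stub.xml.gz

# Pass 2: read the stub and fill in the revision text
php dumpTextPass.php --stub=gzip:stub.xml.gz --quiet --output=gzip:pages-full.xml.gz
```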
To include files in the dump with --include-files :
php dumpBackup.php \
    --full \
    --include-files \
    --uploads \
    --output=gzip:/dumps/abstract.xml.gz \
    --quiet
The --uploads option must also be given; otherwise no files will be included.
Detailed example
In this example, the commands you type are shown together with the script's output.
- Change to the maintenance folder using the cd command. The location of your maintenance folder will differ from this example.
- Type php dumpBackup.php --full > dump.xml and press Enter. A long list of progress reports is printed, similar to the example below.
root@356:/# cd /home/trav/public_html/finddcjobs.com/public/w/maintenance
root@356:/home/trav/public_html/finddcjobs.com/public/w/maintenance# php dumpBackup.php --full > dump.xml
2014-08-15 09:54:08: my_wiki-finddcjobs (ID 23578) 0 pages (0.0|0.0/sec all|curr), 100 revs (404.7|404.7/sec all|curr), ETA 2014-08-15 09:54:11 [max 1143]
2014-08-15 09:54:08: my_wiki-finddcjobs (ID 23578) 0 pages (0.0|0.0/sec all|curr), 200 revs (499.7|652.8/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:08: my_wiki-finddcjobs (ID 23578) 10 pages (19.2|83.8/sec all|curr), 300 revs (577.4|838.3/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:08: my_wiki-finddcjobs (ID 23578) 17 pages (24.1|91.4/sec all|curr), 400 revs (567.0|537.9/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:09: my_wiki-finddcjobs (ID 23578) 18 pages (15.6|40.2/sec all|curr), 500 revs (433.4|223.1/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:09: my_wiki-finddcjobs (ID 23578) 23 pages (15.4|66.8/sec all|curr), 600 revs (400.6|290.5/sec all|curr), ETA 2014-08-15 09:54:11 [max 1143]
2014-08-15 09:54:09: my_wiki-finddcjobs (ID 23578) 59 pages (36.0|412.4/sec all|curr), 700 revs (426.6|699.0/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:09: my_wiki-finddcjobs (ID 23578) 62 pages (36.2|856.3/sec all|curr), 800 revs (466.9|1381.2/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:10: my_wiki-finddcjobs (ID 23578) 89 pages (48.8|798.3/sec all|curr), 900 revs (493.2|896.9/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:10: my_wiki-finddcjobs (ID 23578) 120 pages (62.4|1224.2/sec all|curr), 1000 revs (520.1|1020.2/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
2014-08-15 09:54:10: my_wiki-finddcjobs (ID 23578) 124 pages (59.0|697.5/sec all|curr), 1100 revs (523.7|562.5/sec all|curr), ETA 2014-08-15 09:54:10 [max 1143]
root@356:/home/trav/public_html/finddcjobs.com/public/w/maintenance#
The new XML file will be created in the maintenance folder (you may need to refresh your SCP client to see it).
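A quick sanity check on the finished dump is to count the exported <page> elements. A minimal sketch; a tiny stand-in file is created here so the check can be demonstrated end to end, but against a real dump you would run only the grep line:

```shell
# Stand-in dump so the check can be demonstrated; with a real dump,
# run only the grep line against your dump.xml.
cat > dump.xml <<'XML'
<mediawiki>
  <page><title>Main Page</title></page>
  <page><title>Help:Contents</title></page>
</mediawiki>
XML

# Count the pages in the dump (here each <page> is on its own line)
grep -c '<page>' dump.xml   # prints 2
```

The count should match the number of pages you expected to export (after any namespace filters).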
Options
From MediaWiki r105912:
This script dumps the wiki page or logging database into an
XML interchange wrapper format for export or backup.

XML output is sent to stdout; progress reports are sent to stderr.

Usage: php dumpBackup.php <action> [<options>]

Actions:
  --full              Dump all revisions of every page.
  --current           Dump only the latest revision of every page.
  --logs              Dump all log events.
  --stable            Stable versions of pages?
  --pagelist=<file>   Where <file> is a list of page titles to be dumped
  --revrange          Dump specified range of revisions, requires
                      revstart and revend options.
Options:
  --quiet             Don't dump status reports to stderr.
  --report=n          Report position and speed after every n pages processed.
                      (Default: 100)
  --server=h          Force reading from MySQL server h
  --start=n           Start from page_id or log_id n
  --end=n             Stop before page_id or log_id n (exclusive)
  --revstart=n        Start from rev_id n
  --revend=n          Stop before rev_id n (exclusive)
  --skip-header       Don't output the <mediawiki> header
  --skip-footer       Don't output the </mediawiki> footer
  --stub              Don't perform old_text lookups; for 2-pass dump
  --uploads           Include upload records without files
  --include-files     Include files within the XML stream
  --conf=<file>       Use the specified configuration file (LocalSettings.php)
  --wiki=<wiki>       Only back up the specified <wiki>
Fancy stuff: (Works? Add examples please.)
  --plugin=<class>[:<file>]   Load a dump plugin class
  --output=<type>:<file>      Begin a filtered output stream;
                              <type>s: file, gzip, bzip2, 7zip
  --filter=<type>[:<options>] Add a filter on an output branch
  --7ziplevel=<0-10>          Level of 7zip compression
                              (0 - no compression is default)
This script connects to the database using the username and password defined by $wgDBadminuser and $wgDBadminpassword , which are normally set in LocalSettings.php .
Usually $wgDBadminuser is a user with more privileges than $wgDBuser , but running dumpBackup.php requires no extra privileges, so the usernames and passwords may be the same. If the variables are not set, dumpBackup.php will fail when trying to connect to the database:
$ php dumpBackup.php --full
DB connection error: Unknown error
Notes
The XML file is sent to standard output (stdout); progress reports are sent to standard error (stderr). When you call the script from a shell, both streams go to the terminal by default, so you see informational output and errors directly on screen.
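Because the two streams are separate, they can be redirected independently. A minimal illustration with a stand-in command; the same redirections apply to dumpBackup.php:

```shell
# A stand-in command that writes one line to each stream:
# stdout is captured in out.xml, stderr in progress.log.
sh -c 'echo DATA; echo PROGRESS >&2' > out.xml 2> progress.log

cat out.xml       # prints DATA      (what stdout captured)
cat progress.log  # prints PROGRESS  (what stderr captured)
```

With dumpBackup.php, `> dump.xml` captures the XML while progress reports stay on screen; append `2> progress.log` to capture those as well.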
Error messages
If you are not in the correct folder, you will receive this message:
No input file specified.
The dumpBackup script prints "Warning: Division by zero in [DIRECTORY]/maintenance/backup.inc" when the value given to --report evaluates as 0 or is not a number. The fix is to run dumpBackup without the --report option; dumpBackup will then print a status line every 100 pages processed.
Recommended configuration settings
$wgRevisionCacheExpiry should be set to 0 to avoid inserting all revisions into the object cache. Most of them won't benefit from being cached, because readers rarely view random old revisions of pages on your wiki.
See also
- Manual:Backing up a wiki
- Manual:Importing XML dumps
- Manual:Parameters to Special:Export
- wikitech:Dumps/Software dependencies
- Examples and wrapper scripts
- xmldumps-backup contains the scripts used by the Wikimedia Foundation; the Python scripts have some documentation. See also wikitech:Category:Dumps (some examples at wikitech:Dumps/Rerunning_a_job#Other_stuff for now).
- Wikia's WikiFactory/Dumps
References
- ↑ DumpBackup.php creates an XML interchange wrapper.