** [[Memcached/MediaWiki|Memcached]] - An alternative to the default APCu PHP caching system, designed to significantly lighten the query load on the database. The [[mediawikiwiki:Extension:OAuth|OAuth]] extension also requires Memcached.
** [[Redis/MediaWiki|Redis]] - Redis is now used by the Wikimedia Foundation instead of Memcached, since it can also handle the job queue.
* [[MediaWiki/Moderation|Moderation]] - How to moderate a MediaWiki installation, which can otherwise get covered in spam.
== Extensions ==
These XML dumps can then be imported through [[mediawikiwiki:Manual:Importing_XML_dumps|these procedures]].
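For example, an XML dump can be imported from the command line with the {{ic|importDump.php}} maintenance script, run from the wiki's installation directory ({{ic|dump.xml}} below is a placeholder file name):

```shell
# Import a previously exported XML dump, then rebuild the recent changes
# tables as the import manual recommends. dump.xml is a placeholder name.
php maintenance/importDump.php dump.xml
php maintenance/rebuildrecentchanges.php
```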
==== Automysqlbackup ====
This script makes setting up a cron job to back up all MySQL databases much easier. You will still have to upload the backups elsewhere with another cron script, though.
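As a sketch of what that second cron script could look like (the paths below are placeholder assumptions, and plain {{ic|cp}} stands in for whatever transfer tool you actually use, such as rsync or scp):

```shell
#!/bin/bash
# upload-backups.sh - sketch of a follow-up cron script that copies the
# automysqlbackup output to another location. All paths are placeholders.
upload_backups() {
    local src=$1 dest=$2
    mkdir -p "$dest"
    # Replace this cp with rsync/scp for a real offsite destination.
    cp -a "$src/." "$dest/"
}

# e.g. called from cron:
# upload_backups /var/lib/automysqlbackup /mnt/offsite/mysql
```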
Note that you should exclude the {{ic|performance_schema}} database from the backup.
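With automysqlbackup this is typically done through its exclusion list; for example (the exact variable name and configuration file path depend on the automysqlbackup version packaged for your system):

```shell
# /etc/default/automysqlbackup (automysqlbackup 2.x style configuration)
# Skip schemas that mysqldump cannot usefully back up.
DBEXCLUDE="performance_schema"
```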
https://www.linux.com/learn/how-do-painless-mysql-server-backups-automysqlbackup
=== Image Backup ===
Use [[mediawikiwiki:Manual:ImportImages.php|ImportImages.php]] to dump them to a folder, then 7zip them up into the WikiTeam format along with the XML dump.
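A minimal sketch of that packaging step (the archive and file names below are placeholders; check the WikiTeam documentation for their exact naming convention):

```shell
# Bundle the dumped images directory and the XML dump into one 7z archive.
# wiki-dump.7z, images/ and wiki-history.xml are placeholder names.
7z a wiki-dump.7z images/ wiki-history.xml
```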
=== Automated Site Backup ===
Since we have a unique configuration, it can be difficult to reconstruct if it is lost. This script backs up the site configuration and all images.
Monthly full backup:
{{bc|<nowiki>
55 11 1 * * /usr/local/bin/fullmwbackup.sh
</nowiki>}}
{{bc|<nowiki>
#!/bin/bash
#
# fullsitebackup.sh V1.2
#
# Full backup of website files and database content.
#
# A number of variables defining file location and database connection
# information must be set before this script will run.
# Files are tar'ed from the root directory of the website. All files are
# saved. The MySQL database tables are dumped without a database name
# and with the option to drop and recreate the tables.
#
# ----------------------
# 05-Jul-2007 - Quick adaptation for MediaWiki (currently testing)
# ----------------------
# March 2007 Updates - Version for Drupal
# - Updated script to resolve minor path bug
# - Added mysql password variable (caution - this script file is now a security risk - protect it)
# - Generates temp log file
# - Updated backup and restore scripts have been tested on Ubuntu Edgy server w/Drupal 5.1
#
# - Enjoy! BristolGuy
#-----------------------
#
## Parameters:
#   tar_file_name (optional)
#
#
# Configuration
#

# Database connection information
#dbname="wikidb"     # (e.g.: dbname=wikidb)
#dbhost="localhost"
#dbuser=""           # (e.g.: dbuser=wikiuser)
#dbpw=""             # (e.g.: dbuser password)

# Website files
webrootdir="/var/www/mediawiki"   # (e.g.: webrootdir=/home/user/public_html)

#
# Variables
#

# Default TAR output file base name
tarnamebase=sitebackup-
datestamp=$(date +'%m-%d-%Y')

# Execution directory (script start point)
#startdir=$(pwd)
startdir=/tmp
logfile="$startdir/fullsite.log"   # file path and name of log file to use

# Where backups should be placed
enddir=/var/backup/mediawiki

# Temporary directory
tempdir=$datestamp

#
# Input parameter check
#
if [ -z "$1" ]
then
    tarname=$tarnamebase$datestamp.tgz
else
    tarname=$1
fi

#
# Begin logging
#
echo "Beginning mediawiki site backup using fullsitebackup.sh ..." > "$logfile"

#
# Create temporary working directory
#
echo " Creating temp working dir ..." >> "$logfile"
cd "$startdir"
mkdir "$tempdir"

#
# TAR website files and /etc/mediawiki/LocalSettings.php
#
echo " TARing website files in $webrootdir ..." >> "$logfile"
cd "$webrootdir"
tar czf "$enddir/$tarname" /etc/mediawiki/LocalSettings.php .
#tar cf "$startdir/$tempdir/filecontent.tar" .

#
# sqldump database information
#
#echo " Dumping mediawiki database, using ..." >> "$logfile"
#echo " user:$dbuser; database:$dbname; host:$dbhost " >> "$logfile"
#cd "$startdir/$tempdir"
#mysqldump --user=$dbuser --password=$dbpw --add-drop-table $dbname > dbcontent.sql

#
# Create final backup file
#
#echo " Creating final compressed (tgz) TAR file: $tarname ..." >> "$logfile"
#tar czf "$enddir/$tarname" filecontent.tar dbcontent.sql

#
# Cleanup
#
echo " Removing temp dir $tempdir ..." >> "$logfile"
cd "$startdir"
rm -r "$tempdir"

#
# Exit banner
#
endtime=$(date)
echo "Backup completed $endtime, TAR file at $enddir/$tarname." >> "$logfile"
</nowiki>}}