The main purpose of the home setup is testing our MediaWiki configuration. Our old sites run outdated versions of MediaWiki on PHP 5.6. Our custom extensions need to be updated to work with the latest MediaWiki and PHP 7, both to satisfy IT security and on general principles.
Documentation on MediaWiki.org has some tips for running on Debian or Ubuntu. The install itself is mostly not done through the apt package manager, although some prerequisites will be installed with apt. In particular, it looks like I need:
- elasticsearch (the Ubuntu default package is 1.7.3, so this will need to be done differently later)
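The prerequisite install might look something like the following; the exact package names are my assumption and vary by Ubuntu release, so treat this as a sketch:

```shell
# Hypothetical prerequisite list for a MediaWiki test server on Ubuntu.
# php-apcu covers the object-cache requirement noted below.
sudo apt update
sudo apt install apache2 mysql-server php php-mysql php-xml php-mbstring php-apcu
# Elasticsearch is deliberately left out: the version in the Ubuntu archive
# is too old, so it will be installed from Elastic's own repository later.
```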
I downloaded the latest release (1.30.0) to my home directory, extracted it with tar -xzf, and then used cp -R to make separate wikis inside the /var/www/html directory.
- wiki – for general testing
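The download-and-copy steps above can be sketched as follows; the tarball URL follows the usual releases.wikimedia.org pattern, and the target directory name matches the wiki listed above:

```shell
# Fetch and unpack the 1.30.0 release tarball in the home directory
cd ~
wget https://releases.wikimedia.org/mediawiki/1.30/mediawiki-1.30.0.tar.gz
tar -xzf mediawiki-1.30.0.tar.gz

# One copy per test wiki under the Apache docroot
sudo cp -R mediawiki-1.30.0 /var/www/html/wiki
```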
Set these up as blank wikis first, so that I get a clean copy of LocalSettings.php. The installer complained about not finding APCu, XCache, or WinCache; APCu appears to be the preferred choice, so I added it to the dependency list above. To get the installer to see it, Apache had to be restarted. This is different from the MacPorts setup on a Mac, of course.
sudo service apache2 restart
I now get only this warning, which I will ignore for the moment:
Warning: The intl PECL extension is not available to handle Unicode normalization, falling back to slow pure-PHP implementation.
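If the slow fallback ever becomes a problem, my understanding is that installing the PHP intl extension should silence this warning; on Ubuntu that would be something like:

```shell
# Install the intl extension and restart Apache so PHP picks it up
sudo apt install php-intl
sudo service apache2 restart

# Confirm the extension is loaded
php -m | grep -i intl
```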
Did the generic test wiki first.
- Kept the default db name my_wiki for this one.
- Storage engine InnoDB (default)
- Database character set UTF-8 (the default is binary, but binary makes it hard to read things in phpMyAdmin)
- Pre-installed extensions
- Enable image uploads
- PHP object caching. Our current wikis use memcached, which is a different option here. That was set up a long time ago, and caching definitely affects performance, but APCu was not compared against it at the time.
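For reference, the caching choice ends up as a single line in the generated LocalSettings.php; the snippet below is a sketch (the path is the test wiki assumed above), appended from the shell:

```shell
# Hypothetical: ensure the test wiki uses accelerator (APCu) object caching.
# CACHE_ACCEL is MediaWiki's constant for APCu-style in-process caches;
# the installer normally writes this line itself when APCu is selected.
cat >> /var/www/html/wiki/LocalSettings.php <<'EOF'
$wgMainCacheType = CACHE_ACCEL;
EOF
```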
Using the web-based installer, I had to download LocalSettings.php to my laptop and then scp it over to the server. This isn't a big deal, but I suspect there's a way to do the whole thing from the terminal.
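There is indeed a terminal-only route: MediaWiki ships a command-line installer at maintenance/install.php that writes LocalSettings.php in place. The names and passwords below are placeholders, and the option set is a sketch from memory:

```shell
# Run the bundled CLI installer instead of the web installer.
# --pass sets the admin account password; the two positional arguments
# are the wiki name and the admin user name.
cd /var/www/html/wiki
php maintenance/install.php \
    --dbname=my_wiki --dbuser=wikiuser --dbpass=secret \
    --server="http://localhost" --scriptpath=/wiki \
    --pass=adminpass "Test Wiki" "Admin"
```

This avoids the download/scp round-trip entirely, since LocalSettings.php is created directly on the server.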
This seems to work. The next step, however, is to figure out how to do it all again using Docker, so I can have multiple containers running different wikis.
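As a starting point for that, there is an official mediawiki image on Docker Hub; whether a 1.30 tag exists is my assumption, but the general shape would be one container per wiki, each mapped to its own host port:

```shell
# One test wiki per container, distinguished by host port
docker run -d --name wiki-test -p 8080:80 mediawiki:1.30
docker run -d --name wiki-other -p 8081:80 mediawiki:1.30
```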