MediaWiki Documentation Day 2017

It’s MediaWiki Documentation Day 2017!

So I’ve been documenting a couple of things, and I’ve added a bit to the XTools manual.

The latter is actually really useful, not so much from the end user’s point of view (I dare say they’ll never read it) as for me, because I always like writing documentation before coding. It makes the goal much clearer in my mind, and then the coding is much easier. With agreed-upon documentation, writing tests is easier; with tests written, writing the code is easier.

Time for a beer — and I’ll drink to DFD (document first development)! Oh, and semantic linebreaks are great.

Editing MediaWiki pages in an external editor

I’ve been working on a MediaWiki gadget lately, for editing Wikisource authors’ metadata without leaving the author page. It’s fun working with and learning more about OOjs UI, but it’s also a pain, because gadget code is kept in JavaScript pages in the MediaWiki namespace, and so every single time you want to change something you have to save the whole page, click ‘edit’ again, and scroll back down to find the spot you were at. The other end of things, the reloading of whatever test page is running the gadget, is annoying and slow enough without having to do much the same thing at the source end too.
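
To give a concrete picture of the sort of code involved, an OOjs UI gadget like this boils down to a process dialog plus an API save. This is a minimal sketch only: the field, the save behaviour, and the dialog name are invented for illustration, not taken from the real gadget.

```javascript
// Minimal sketch of an OOjs UI edit dialog for author metadata (illustrative only).
mw.loader.using( [ 'oojs-ui-core', 'oojs-ui-windows', 'mediawiki.api' ] ).then( function () {

	function AuthorMetadataDialog( config ) {
		AuthorMetadataDialog.super.call( this, config );
	}
	OO.inheritClass( AuthorMetadataDialog, OO.ui.ProcessDialog );
	AuthorMetadataDialog.static.name = 'authorMetadataDialog';
	AuthorMetadataDialog.static.title = 'Edit author metadata';
	AuthorMetadataDialog.static.actions = [
		{ action: 'save', label: 'Save', flags: [ 'primary', 'progressive' ] },
		{ label: 'Cancel', flags: 'safe' }
	];

	AuthorMetadataDialog.prototype.initialize = function () {
		AuthorMetadataDialog.super.prototype.initialize.apply( this, arguments );
		this.birthYear = new OO.ui.TextInputWidget( { placeholder: 'Year of birth' } );
		this.panel = new OO.ui.PanelLayout( { padded: true, expanded: false } );
		this.panel.$element.append( this.birthYear.$element );
		this.$body.append( this.panel.$element );
	};

	AuthorMetadataDialog.prototype.getActionProcess = function ( action ) {
		var dialog = this;
		if ( action === 'save' ) {
			return new OO.ui.Process( function () {
				// The real gadget writes structured metadata back to the Author: page;
				// appending a comment is just a stand-in here.
				return new mw.Api().postWithEditToken( {
					action: 'edit',
					title: mw.config.get( 'wgPageName' ),
					appendtext: '\n<!-- birth year: ' + dialog.birthYear.getValue() + ' -->',
					summary: 'Update author metadata'
				} ).then( function () {
					dialog.close( { action: action } );
				} );
			} );
		}
		return AuthorMetadataDialog.super.prototype.getActionProcess.call( this, action );
	};

	var windowManager = new OO.ui.WindowManager();
	$( document.body ).append( windowManager.$element );
	var dialog = new AuthorMetadataDialog();
	windowManager.addWindows( [ dialog ] );
	windowManager.openWindow( dialog );
} );
```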

So I’ve added a feature to the ExternalArticles extension that allows a whole directory full of text files to be imported at once (namespaces are handled as subdirectories). More importantly, it also ‘watches’ the directories, and every time a file is updated (e.g. with Ctrl-S in a text editor or IDE) it is re-imported. This means I can have MediaWiki:Gadget-Author.js and MediaWiki:Gadget-Author.css open in PhpStorm, and just edit from there. I even have these files open inside a MediaWiki project, so autocompletion and documentation look-up work as usual for all the library code. It’s even quite a speedy set-up, luckily: I haven’t yet noticed having to wait at any time between saving some code, alt-tabbing to the browser, and hitting F5.
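
The actual implementation lives in the extension (in PHP), but the watch-and-reimport idea amounts to something like the following rough Node.js sketch, which shells out to MediaWiki’s importTextFiles.php maintenance script. The paths here are assumptions, and the script’s options may differ between MediaWiki versions.

```javascript
// Rough sketch of watch-and-reimport (not the extension's actual code).
// Assumes pages live as text files under rootDir, with namespaces as subdirectories,
// e.g. rootDir/MediaWiki/Gadget-Author.js for the page MediaWiki:Gadget-Author.js.
const fs = require( 'fs' );
const path = require( 'path' );
const { execFile } = require( 'child_process' );

const rootDir = '/home/me/wiki-pages';      // assumption: where the text files live
const mediawikiDir = '/var/www/mediawiki';  // assumption: local MediaWiki install

// Note: recursive watching is not supported on every platform.
fs.watch( rootDir, { recursive: true }, function ( eventType, filename ) {
	if ( !filename ) {
		return;
	}
	const file = path.join( rootDir, filename );
	if ( !fs.existsSync( file ) || !fs.statSync( file ).isFile() ) {
		return;
	}
	// Re-import the saved file. A real script would also map the subdirectory
	// to the right namespace prefix when building the page title.
	execFile( 'php', [
		path.join( mediawikiDir, 'maintenance/importTextFiles.php' ),
		'--overwrite',
		'--summary', 'Re-imported from external editor',
		file
	], function ( error, stdout, stderr ) {
		console.log( error ? stderr : 'Re-imported ' + filename );
	} );
} );
```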

I dare say my bodged-together script has many flaws, but it’s working for me for now!

What goes Where on the Web

Every now and then I recap where and what I store online. Today I do so again, while rather feeling that there should be discrete and specific tools for each of these things.

Firstly there are the self-hosted items:

  1. WordPress for blogging (where photo and file attachments should be customized to the exact use in question, not linked from external sites). It’s also my OpenID provider.
  2. Piwigo as the primary location for all photographs.
  3. MoonMoon for feed reading (and, hopefully one day, archiving).
  4. MediaWiki for family history sites that are closed-access.
  5. My personal DokuWiki for things that need to be collaboratively edited.

Then the third-party hosts:

  1. OpenStreetMap for map data (GPX traces) and blogging about map-making.
  2. Wikimedia Commons for media of general interest.
  3. The NLA’s Trove for correcting newspaper texts.
  4. Wikisource as a library.
  5. Twitter (although I’m not really sure why I list this here at all).

Finally, I’m still trying to figure out the best system for:

  1. Public family history research. There’s some discussion about this on Meta.

What web-based software to use

My current favourites, based on the idea that software should be constructed with a specific use in mind (obviously, wikis are an odd exception to this):

Glossaries in Semantic MediaWiki

A simple glossary system for Semantic MediaWiki that lets you define key terms for use in technical documentation etc.

A term can be referenced from anywhere in the wiki with {{defined term inline|term}}. This results in the term being displayed in a distinct style (green for instance) and linked to the term’s wikipage. When a user hovers over the link, a tooltip is displayed that contains the term’s definition.

Software required: MediaWiki, ParserFunctions, SemanticMediaWiki, SemanticForms.

Pages required:

  1. Defined terms
  2. Template:Defined term
  3. Template:Defined term inline
  4. Form:Defined term
  5. Property:Definition
  6. MediaWiki:Common.css (to change the style of the inline terms)
  7. MediaWiki:Common.js (to fix the tooltip display; see the sketch after this list)
  8. Category:Defined terms (no content actually required, but probably should at least be categorized)
  9. Data pages
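
To give an idea of the Common.js part, the tooltip fix could be as small as the sketch below. The class and attribute names are invented here and would need to match whatever markup Template:Defined term inline actually produces.

```javascript
// Sketch for MediaWiki:Common.js: show each defined term's definition on hover.
// Assumes the inline template outputs something like
// <span class="defined-term-inline" data-definition="...">term</span> (illustrative only).
$( function () {
	$( '.defined-term-inline' ).each( function () {
		var $term = $( this );
		// The native browser tooltip is enough for a first version.
		$term.attr( 'title', $term.data( 'definition' ) );
	} );
} );
```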

ReqWiki semantic software requirements management

ReqWiki is a Semantic MediaWiki-based system for software requirements management. It looks very interesting.

We know that requirements engineering has a large impact on the success of a project, yet sophisticated tool support, especially for small to mid-size enterprises, is still lacking. We present ReqWiki, a novel open source web-based approach based on a semantic wiki that includes natural language processing (NLP) assistants, which work collaboratively with humans on the requirements specification documents.

ReqWiki features three top-level wiki pages that are considered the entry points to the SRS documentation platform. They have been designed to guide stakeholders through the RE process (Fig. 3).

The top-level pages are as follows:

  • A Vision page to define the product position, stakeholders, assumptions, dependencies, needs, and features;
  • a Use Case page to define actors, goals, and use cases;
  • a Supplementary Specification page to define functional and non-functional requirements, standards, legal notes, test cases and traceability links.

The wiki pages are available on GitHub.

(Here’s a local copy of the PDF, in case the link above goes away one day.)

Printable WeRelate

Last year I wrote a little script for producing GraphViz graphs, and LaTeX books, from werelate.org family history data. I’ve been tweaking it a bit now and then, and using it for my mum’s genealogical research. It works, but the more I want to do with it the more I think it needs a good ground-up refactoring. So I’ve set to work turning it into a MediaWiki extension, so that I can use an installation of MediaWiki as the cache (instead of text files), and update that installation in a separate operation from the tree generation. (I found that I was playing around with regenerating things more often than I wanted to be waiting for downloads of modified data, and it was set to check for modifications if it had been longer than a day since the last check…) The other big advantage of syncing into a local MediaWiki is that I’ll have a complete, working backup of all our data.

The basic idea is that the ancestor and descendant lists, which define the starting points for the tree traversal, will be defined in normal wiki pages, and both the synchronisation and the tree-generation processes will read these and do what they need to do.
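
As a sketch of that first step, reading the starting-point pages might be little more than pulling their wikitext through the API and keeping the bulleted lines. The page title, list format, and API URL here are assumptions.

```javascript
// Sketch: fetch a wiki page that lists the tree's starting people, one '* Name' per
// line, and return the names. Page title and list format are assumptions.
async function getStartingPoints( apiUrl, pageTitle ) {
	const url = apiUrl + '?action=parse&prop=wikitext&format=json&formatversion=2'
		+ '&page=' + encodeURIComponent( pageTitle );
	const response = await fetch( url );
	const data = await response.json();
	return data.parse.wikitext
		.split( '\n' )
		.filter( function ( line ) { return line.startsWith( '* ' ); } )
		.map( function ( line ) { return line.slice( 2 ).trim(); } );
}

// For example (assuming an 'Ancestors' page on the test wiki mentioned below):
// getStartingPoints( 'http://test.archives.org.au/api.php', 'Ancestors' )
//     .then( function ( names ) { console.log( names ); } );
```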

I’m setting up a test wiki at http://test.archives.org.au, if anyone’s interested.

MediaWiki_TwentyTen now has nicer ToCs

I’ve finally gotten around to updating my MediaWiki port of the TwentyTen WordPress theme. There’s more to be done, but this morning I moved pages’ tables of contents to the sidebar (issue #2). It’s nice to be forced into learning more about the internals (especially caching) of MediaWiki!

WordPress TwentyTen theme ported to MediaWiki

Last week I needed a simple, reader-focused skin for a MediaWiki install, and I figured WordPress’ TwentyTen theme would be suitable. So I ported it to MediaWiki.

The skin can be downloaded from GitHub, and I’ve also added it to the MediaWiki gallery of user styles.