Exactitude in wiki data modelling

There’s a new page on MediaWiki.org called Managing data in MediaWiki. It compares the Wikibase, Semantic MediaWiki, and Cargo extensions. All three of these extensions are about managing the metadata of wiki pages or of concepts represented by those pages.

The thing I love about MediaWiki is that one can give in to the impulse to move gradually towards exactness in the representation of data. Each of the above extensions helps massively with this, but it’s still pretty easy to get confused about how to do it.

3 legs

One of the four legs of the metalworking bench was recently sacrificed to make a post for the new letterbox, and so the bench is rather on hold for now. Which is a total relief! Because those bits of jarrah were really a bit rough anyway for the bench I’ve got in mind, with too many holes and rotten bits. So I can now move down the list and get working on the jarrah work stand (or one of… I’ve only got enough timber for one, which will be supremely useless without its mate, but I’m sure I can find more wood for the 2nd stand).


I’ve had to give up on my camera because its battery has died. So I’m stuck with the phone. Although there’s something good, too, about being able to describe photos as soon as I take them, rather than waiting and maybe forgetting why I took them.

Like this strange rusty buoy thing on the beach, although it’s probably a bad photo and shouldn’t be used.

Rusty buoy on the beach near the Army Jetty (2019-09-24)

Army Jetty changes

Heading back to the Settlement now, and I’ve stopped at the Army Jetty to have a look at it now that the unsafe concrete decking has been demolished. It’s a bit boring now, just a groyne, but good that it’s not going to squash anyone I guess!

Looking towards the shore on the Army Jetty, with new limestone visible in the foreground where the concrete decking was removed in 2019.

Army Jetty with new part.jpg (taken on 24 September 2019) by Sam Wilson (taken with LG G6), CC-BY-SA-3.0.

Backing up (my) Commons files

I’m experimenting with an idea of treating the Commons copies of my photos as the ‘master’ copy, and not keeping them online anywhere else (e.g. Flickr). This involves uploading to Commons and then keeping a local copy in sync — because I don’t want to lose any photos if they get deleted from Commons.

I’m using Digikam locally, and have two collections configured: one scratch-pad one, for sorting out photos that are just off the camera; and one backup one, which lets me browse photos I’ve got on Commons.

I download from Commons with the following backup.sh script, which goes through all of my contributions and exports XML for every page I’ve worked on, and every file for which I’m the first author (i.e. I uploaded it).

#!/bin/bash
# Resolve the directory this script lives in, so it can be run from anywhere.
BACKUP_DIR=$(cd "$(dirname "$0")"; pwd -P)

# Export all of my Commons contributions: page XML, plus every file I uploaded.
mwcli export:contribs \
	--config="$BACKUP_DIR/config.yml" \
	--wiki=commons \
	--user=samwilson \
	--dest="$BACKUP_DIR"

The mwcli script is at github.com/samwilson/mwcli.
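The “keeping in sync” part matters because the local copy exists precisely to catch deletions. As a sketch of one way to do that check (my own assumption, not part of mwcli), you could compare the backup directory against a list of file titles from the latest export; the function name, file names, and layout here are all hypothetical.

```shell
#!/bin/bash
# check_backup: report any file in the local backup directory that no longer
# appears in a list of titles exported from Commons, i.e. a possible deletion.
# Both arguments are hypothetical examples of how such a check could be wired up.
check_backup() {
    local backup_dir=$1 titles_file=$2
    local f name
    for f in "$backup_dir"/*; do
        name=$(basename "$f")
        # grep -qxF: quiet, whole-line, fixed-string match against the title list.
        if ! grep -qxF "$name" "$titles_file"; then
            echo "Missing on Commons: $name"
        fi
    done
}
```

Run against a backup directory and a titles list, anything it prints is a file that may have been deleted (or renamed) on Commons and so is worth investigating before the next sync.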

The reason I want the Commons copy to be canonical is that it makes for centralised metadata, a single place to edit and add links to related material. It’s annoying to have to keep metadata in sync between Commons, Flickr, and possibly a local copy of things too.