South Beach
I'm trying to make UnlinkedWikibase a bit more performant for pages that include data from tens or more entities. The basic idea is that the parser function, which currently causes a remote request to be made, will still do that, but it'll set the cache TTL of that request to 'indefinite'. That way, subsequent parser function calls will use the cached data, forever.
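Roughly something like this, using core's WANObjectCache (a sketch only; the function name, key layout, and Wikidata URL are placeholders, not the extension's actual code):

```php
use MediaWiki\MediaWikiServices;

/**
 * Sketch: fetch an entity's data, caching it indefinitely.
 */
function getEntityData( string $entityId ) {
	$services = MediaWikiServices::getInstance();
	$cache = $services->getMainWANObjectCache();

	return $cache->getWithSetCallback(
		$cache->makeKey( 'unlinkedwikibase', 'entity', $entityId ),
		// 'Indefinite' TTL: keep the value until it's explicitly deleted or evicted.
		$cache::TTL_INDEFINITE,
		static function () use ( $services, $entityId ) {
			// The slow part: one remote request per uncached entity.
			// Placeholder URL; the extension targets whatever Wikibase it's configured for.
			$url = "https://www.wikidata.org/wiki/Special:EntityData/$entityId.json";
			$json = $services->getHttpRequestFactory()->get( $url, [], __METHOD__ );
			return $json === null ? false : json_decode( $json, true );
		}
	);
}
```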
That of course is silly on its own, so the next part of the plan is a maintenance script that will loop through all entities in use and refresh the cache for each of them (sketched below).
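Something along these lines, where the class name, the delete-then-refetch approach, and especially how the list of entities in use is found are all guesses at this stage:

```php
use MediaWiki\MediaWikiServices;

$IP = getenv( 'MW_INSTALL_PATH' );
if ( $IP === false ) {
	$IP = __DIR__ . '/../../..';
}
require_once "$IP/maintenance/Maintenance.php";

class RefreshUnlinkedWikibaseCache extends Maintenance {
	public function __construct() {
		parent::__construct();
		$this->requireExtension( 'UnlinkedWikibase' );
		$this->addDescription( 'Re-fetch every remote entity in use, so readers always hit the cache.' );
	}

	public function execute() {
		$cache = MediaWikiServices::getInstance()->getMainWANObjectCache();
		foreach ( $this->getEntityIdsInUse() as $entityId ) {
			// Drop the indefinitely-cached value, then re-populate it so
			// that no reader ever has to wait on the remote request.
			$cache->delete( $cache->makeKey( 'unlinkedwikibase', 'entity', $entityId ) );
			getEntityData( $entityId ); // the cached fetch sketched above
			$this->output( "Refreshed $entityId\n" );
		}
	}

	/** Placeholder: however the extension ends up tracking which entities pages use. */
	private function getEntityIdsInUse(): array {
		return [];
	}
}

$maintClass = RefreshUnlinkedWikibaseCache::class;
require_once RUN_MAINTENANCE_IF_MAIN;
```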
Hopefully this means that the editing user experiences the slowness of the remote fetching, but no one else needs to. The trade-off is that the data might be a bit out of date, but the only times I've found that to be a problem are during development of modules, or when demonstrating to someone how edits made on Wikidata propagate to the local wiki. Neither is insurmountable, and if they're too annoying then perhaps an on-demand cache refresh could be added somewhere.
Anyway, the current issue seems to be my misunderstanding of how to refresh a cached item that has an infinite TTL. And the main issue there is that I'm now sat under a tree in the shade on a hot Sunday afternoon, and am giving up on thinking for the day.
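For future-me: from a quick read of the core WANObjectCache docs (untested in the extension yet), these look like the two candidate ways to force a refresh of an indefinitely-cached key:

```php
use MediaWiki\MediaWikiServices;

$cache = MediaWikiServices::getInstance()->getMainWANObjectCache();
$key = $cache->makeKey( 'unlinkedwikibase', 'entity', 'Q42' );

// Option 1: tombstone the key; the next getWithSetCallback() call recomputes it.
$cache->delete( $key );

// Option 2: a shared 'check' key. Touching it marks as stale every value that
// was cached with [ 'checkKeys' => [ $checkKey ] ] in its options, without
// having to delete each entity key individually.
$checkKey = $cache->makeKey( 'unlinkedwikibase', 'entities' );
$cache->touchCheckKey( $checkKey );
```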