It’s been a slower couple of weeks. M is back into her studies, so my own days are following a bit more of a pattern. I still have regular events keeping me up until the wee hours with people back in Canada on Tuesday and Thursday nights, but I’m trying to get in and out of bed at reasonable times otherwise. I went through a few days where I was pretty into Slime Rancher, and now Newto is pulling me back into TIS-100 and Infinifactory. The end of the summer is visible. Days are slowly getting shorter; it’s dark before 22:00.
Cottage cleaning continued
With the cottage empty for a few days and a bit of better weather, I undertook to power wash the paving tiles around the cottage.
Satisfying results, for sure. Need to do around the house next.
Bubble waffles
A café in Aberlour has started serving bubble waffles, so of course we had to take a walk down to try them. Hard to measure up to Moo Shu, but it was pretty good. In my excitement, I neglected to take any photos; but I did get a couple on the walk back up to the house.
Employment and writing
It’s time to start thinking about the fall. M is planning on moving to Strathkinness for the winter, and I need to start looking for something to do – something which will look good on a rental application. “Occasional open source contributor” probably isn’t enough. And I’m struggling to push ahead on that work, anyway, without anyone else interested in me doing it. Guess it’s time to start looking for real work. I started poking around for contract opportunities on Upwork. Lots of people looking for lots of things I can’t do. Imposter syndrome hits hard and fast. It’s the sort of thing to make one dream of switching careers. Too bad my obvious backup, live sound, is still in such an overwhelming state of COVID-induced decline.
I wrote an exploratory article on the performance of a particular design decision I made in passing on a project years ago. I don’t really remember what I was doing which brought it to mind, but I’m glad it did. It was fun to write up, especially because I hadn’t put much thought into the performance of the choice at the time: it was originally stylistically motivated. While writing the code samples for the post, I found that the amount of material I wanted to provide was unpleasantly long to read casually. Rather than provide a condensed version of the code in the post and link to a longer form, I annotated the unimportant sections and wrote a plugin to control visibility. I’m still struggling with the language to use to describe this feature: is it expansion? Compression? “Squeezing” is the term I used in the plugin name, but that was mostly for the pun value. And what are the bits that get hidden? Layers? Blocks? Expansions? I’d like to get back to it, give it a proper readme and a configuration interface, and release it to the WordPress Plugin Directory.
Cullen and the Knock
We spent most of yesterday afternoon in Cullen, sitting on a bench overlooking the harbour. We had lunch, and M read while I watched the goings-on and enjoyed the day. People watching isn’t usually my thing, but people don’t usually move about that much, either.
In the evening, we climbed the Knock and had a picnic supper. That’s Ben Rinnes in the background.
It’s been a beautiful couple of days here. Clear, bright and sunny, just enough of a breeze.
Next week
I’d like to land at least one contract on Upwork, even if just something small.
I’d like to beat (or at least match) Newto’s cycle count on Interrupt Handler.
I’d like to write two more sections for the library logging analysis project/post I started on months ago.
I’d like to at least look into building a universal macOS binary for NRJS to support both x86-64 and ARM64.
I’d like to go at least one whole day without having an existential crisis.
Rebooting Seefilms by way of DCGSFT, and on the use of Syncplay
This is the eleventh summer some friends and I have run the DCGSFT drama group. Due to COVID-19 inhibiting its usual format, the group isn’t doing a performance this year; instead, we’re taking the opportunity (and advantage of everyone’s newly-developed familiarity with video conference calls) to do group readings of plays. It’s not very formal; whoever shows up on the call gets a part, and parts are assigned randomly (and can rotate from scene to scene). So far, we’ve read through Dear Brutus by J. M. Barrie, and Saint Joan by George Bernard Shaw.
This week, we took a break from reading plays, and watched a film adaptation of one instead: the 2002 adaptation of The Importance of Being Earnest by Oscar Wilde. We performed this play last summer, so it was pretty familiar to most; beyond that, I’d already re-read the script and watched this cinematic adaptation a month or two ago with KJJ’s Windstone book club. It was neat to rewatch it with a different crew and see different things pulled out of it.
Outside of the actual watching, there was a little technical effort involved on my part to get things set up for synchronous watching. I wanted to do something a little more structured than someone in the call doing a countdown and having everyone try to hit “play” at the same time. And because some people watching (self included, but also our fearless director) are doing so from fairly crappy Internet connections, I wanted a system where some people could stream the movie directly, while other people could download a copy ahead of time. The obvious solution is Syncplay: it’s free (in both of the usual senses), it’s cross-platform, and it supports a number of media players.
Syncplay, however, is only one piece of the puzzle (the other being a media player). Setup requires a little more attention than I (pessimistically, and perhaps uncharitably) expected all participants to be able to manage by themselves. I spent several hours trying to package up a turnkey distribution of Syncplay, mpv, and a configuration file for Syncplay which would automatically log into the right server and room with a preconfigured username. The intent was to throw together a web interface which would ask for a username, then patch the configuration file, and deliver a customized archive to the user. Unfortunately, the Syncplay configuration dialog doesn’t appear to be skippable. When loading the configuration file, Syncplay substitutes default values for unspecified configuration options, so I could generate a file specifying only the things I care about (username, server, room, etc.) – but the configuration dialog would always pop up anyway when launching Syncplay. Rightly or wrongly, I felt that if I couldn’t totally achieve my objective of a one-click launch then it wasn’t worth trying to build my own package at all. In the end, I wrote some instructions for installing the individual bits and pieces, and it worked out okay. We had a few issues with the stream pausing and spuriously rewinding/skipping, and I’m not sure it was really worth the effort to get everyone to use it; but it did function in the end.
Cottage cleanup
Most of the big effort of the week went into cleaning up around the cottage in anticipation of the arrival of the first Airbnb guests of the season. This involved the usual mundane deep-clean things we hadn’t done earlier in the year, like windows and the fridge, along with extra, pandemic-related disinfecting.
Earlier in the year, M and I cut down a whole bunch of Scotch broom near the cottage:
Before.
After.
We had piled it all in the cottage driveway, because we thought that would be the easiest place from which to load it all into the trailer and take it to the dump. Unfortunately, the dump has stopped taking trailer-loads for now, so there the pile sat. We ended up hand-carrying it down the hill and re-piling it in the main parking lot next to the drive shed.
The trouble with cleaning things is that you keep finding more things to clean. I took a brush to the garden bench thinking I could quickly knock some of the moss off it; a couple hours (and one serious application of wire brush and elbow grease) later, the bench is much cleaner, but really ought to be properly sanded and stained. To be done some time when there’s no one in the cottage to miss the bench – maybe in the fall.
M spent a lot of her time trying to beat the overgrown cottage garden into submission. In particular, one large rosebush had pulled itself off the wall and needed reanchoring. A barrel planter needed replanting, and the interior of the cottage is much improved by the addition of several little plants. It’s not a great time to be cutting things back, horticulturally, but you’ve gotta do what you’ve gotta do. My thumbs are pretty brown, so I keep being surprised by what a few green things can accomplish.
It does feel really good to get this stuff off of the list. Some of it has been on there since we got here.
Next week
I’d still like to do the things I said last week that I wanted to do this week. This week has been pretty physically demanding, so it was hard to muster the interest in expending much mental effort even when I did get the time. Hopefully, with the cottage pressure off, I’ll feel a bit more like tackling some deeper technical challenges in the coming week.
I made steaks (and frites). First time trying a reverse sear. The ribeyes we got were a bit thinner than ideal for taking full advantage of the technique, but it was a good first stab at it. Even if they did come out closer to medium-well, they tasted phenomenal, and stayed wonderfully juicy. I’ll definitely be doing that again. I was a bit too ginger with the temperature for the fries, and while they were well-cooked and seasoned, they were a bit on the limp side. Would’ve been a good first pass for double-fried fries, but I was trying to do it in one. Live and learn.
Berries
M found what she believed to be an expansive patch of wild blueberries in the woods, and, after confirming that they were indeed wild blueberries, we collected probably half a cup of them. A bit bitter by themselves, but fantastic with (somewhat more than half a cup of) ice cream.
We repeated the experiment with raspberries from the garden, and appreciated the results similarly (although not quite as much – the blueberries really were great).
KCR
I’ve been mulling over whether and how best to get involved with the local community radio station. I mocked up a new website layout, but it’s not done yet and I ran into a few snags, so I put it back on the shelf. I did build prototype apps for both Android and iOS:
They might not look like much, but they don’t do much, either.
I tried to build something functionally equivalent to this many years ago for Axe Radio, but never got it finished. This seemed like a good opportunity to prove to myself that I could do it, given the time.
I ran into frustrations building both. The Android player was built over parts of three days. The first day was a false start based on the template Android Studio gives you if you start a new app and accept all of its default proposals. I ended up with Kotlin, which sounded fine (I was looking forward to playing with it), and the AppCompat/AndroidX support libraries. Combined, though, those options carry over 2 MB of overhead with them in your published app package. To me, that’s unacceptable for an app which should be able to get by on media-playback functionality that has been built into Android for years. I spent the rest of the day in a new, Java-based project trying to pare back the Gradle build definition to a point where I could understand everything it was doing.
On day two, I got down to business wiring things up. On Android, you need to manage your own background task for long-living audio playback. It’s not hard, but it is a little tedious to get set up properly. Beyond that, the built-in programmatic media player, the aptly-named MediaPlayer, doesn’t give very good error messages. I stopped in frustration when I got to the point where older versions of Android were working fine, but Android 9+ failed with an obscure error code (1, meaning MEDIA_ERROR_UNKNOWN, with extra information -2147483648, meaning MEDIA_ERROR_SYSTEM – helpful and specific, I’m sure you’ll agree).
On day three, I proved that my implementation worked on some level when I succeeded in playing back media embedded in the app. After much fumbling and floundering, I tried side-stepping MediaPlayer’s built-in HTTP retrieval mechanism: I made my own HTTP request for a fixed-length chunk of the audio stream, and fed the bytes into MediaPlayer. And… it didn’t even make it to MediaPlayer, because I got an error making the HTTP request. As of Android 9, all HTTP traffic must be secure (HTTPS), or you need to configure your app to allow plaintext traffic. And yet, I had already configured that back on day two. Turns out that redeploying the app through the Android Studio debugger may not cause the device to pick up on changes to the app manifest. After I uninstalled the app from the emulated device and reinstalled it, it worked perfectly. Aaugh.
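For the record, once the manifest and service plumbing are sorted, the happy path really is small. Here’s a minimal sketch of the sort of MediaPlayer wiring described above. The class name and stream URL are hypothetical, the real work of a proper app lives inside a background service (not shown), and on Android 9+ the manifest also needs android:usesCleartextTraffic="true" (or a network security config) if the stream is plain HTTP:

```java
import android.media.AudioAttributes;
import android.media.MediaPlayer;

import java.io.IOException;

// A minimal sketch, not the finished app: stream playback via the built-in
// MediaPlayer, skipping all of the background-service bookkeeping.
public class StreamPlayer {
    private MediaPlayer player;

    public void play(String streamUrl) {
        player = new MediaPlayer();
        player.setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
                .build());
        try {
            player.setDataSource(streamUrl); // e.g. "http://example.com/stream" (hypothetical)
            player.setOnPreparedListener(MediaPlayer::start);
            player.setOnErrorListener((mp, what, extra) -> {
                // "what" and "extra" are the opaque codes complained about above
                // (e.g. MEDIA_ERROR_UNKNOWN with MEDIA_ERROR_SYSTEM).
                return true; // swallow the error; a real app would surface it
            });
            player.prepareAsync(); // network I/O happens off the UI thread
        } catch (IOException e) {
            // setDataSource throws if the URL is malformed or unreadable
        }
    }

    public void stop() {
        if (player != null) {
            player.release();
            player = null;
        }
    }
}
```

All of the tedium mentioned above lives in what this sketch leaves out: the foreground service, the notification, and keeping the UI in sync with playback state.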
As for iOS, I ran into slightly similar backwards-compatibility issues trying to start the project. The default Xcode template for a new Swift app relies on SwiftUI, which is only available as of iOS 13. I wanted to target something at least a version or two older than that. Even after starting afresh with a Storyboard-based app, I still had to rip out a lot of references to scenes before it would compile. After that, it was mostly straightforward to get something working. iOS has similar restrictions on HTTP content in recent versions, but it gives helpful error messages about it. It’s also much more restrictive than Android about what can run in the background, but the trade-off is that the process for doing that background work is much simpler: tick off the “Audio, AirPlay, and Picture in Picture” background mode permission, and your AVPlayer will do the right thing. No coordinating state between your app’s UI and some background task/service, because it’s managed for you. Much less flexible, but much easier to do – as long as what you want to do is on the path well trod. Then again, not in all cases: error handling for AVPlayer is a mess, and the iOS version of the app mostly doesn’t do it right now. HTTP errors from the source (e.g., 404 if the stream is offline) are handled separately from playback/stream errors (e.g., bad decoding), and are accessed through clumsy, Objective-C-style key-value observers. Even knowing when the player has started actually playing back media versus just starting to buffer it is tedious.
On one hand, it’s nice to have knocked out proofs-of-concept for both of these. On the other hand, neither experience was welcoming. Both Android Studio and Xcode were massive downloads on my wee, limited Internet connection here. Android Studio started off better, with a “mere” 850 MB installer, but immediately after install it kicked off downloading a few hundred more megabytes of updates and plugins. Then, to emulate a device, you need to download an image for each Android version you want to run, each of which is 700 MB to 1.1 GB. Total weight, by the time all was said and done, was probably in the neighbourhood of 5.5 GB. Xcode, on the other hand, is a single massive 7.8 GB package, but it does have all of the bits in the box, so to speak.
I don’t like using either IDE much. IntelliJ (even for more traditional Java projects) always feels like death by a thousand cuts, with every little thing being just a little different than I’m used to. And without getting into its performance issues, Xcode continues to make worse and worse use of screen real estate. I feel like I could see more code on the screen at once in the QBASIC editor on a CGA monitor than in a contemporary Xcode session. This whole experience has been a good reminder of why I never finished doing this for Axe Radio: too much tooth-pulling to go through voluntarily, unless you’re feeling particularly stubborn.
I also sent an e-mail to KCR asking if and how I could get involved. Haven’t heard back yet, but maybe something beyond my own edification will come of this, eventually.
Webcam
I added a “The webcam is located at…” note to the webcam page. I realized that I’ve been sending this link to friends who don’t really have a precise idea of where I am, so this answers the question nicely.
I’m hoping to get a timelapse interface built soon. I have weeks of images now at five-minute intervals, and it would be fun to be able to browse through them. In addition to the obvious daily timelapse (animation of all of the images from the last 24 hours in sequence), I think it would be neat to do a time-by-day timelapse, showing images from (around) the same time across multiple days. It wouldn’t look as fluid (the clouds would jump around a lot), but it would be an easy way to compare day-by-day weather.
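As a very rough sketch of the time-by-day idea (everything here is hypothetical, including the assumption that the capture job writes files named like webcam-20200809-1405.jpg every five minutes), picking the frames would just be a matter of grouping by date and keeping the frame closest to the target time:

```java
import java.io.File;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// A rough sketch, not the real capture setup: assumes filenames like
// "webcam-20200809-1405.jpg" and returns one frame per day, the one
// captured closest to the target time of day.
public class TimeByDay {
    private static final DateTimeFormatter STAMP =
            DateTimeFormatter.ofPattern("yyyyMMdd-HHmm");

    static List<File> framesAt(File dir, LocalTime target) {
        Map<String, File> best = new TreeMap<>();      // date -> closest frame so far
        Map<String, Long> bestDelta = new TreeMap<>(); // date -> seconds off target
        File[] frames = dir.listFiles((d, name) -> name.endsWith(".jpg"));
        if (frames == null) {
            return new ArrayList<>();
        }
        for (File f : frames) {
            String stamp = f.getName().replace("webcam-", "").replace(".jpg", "");
            LocalDateTime when = LocalDateTime.parse(stamp, STAMP);
            long delta = Math.abs(when.toLocalTime().toSecondOfDay() - target.toSecondOfDay());
            String day = when.toLocalDate().toString();
            if (!best.containsKey(day) || delta < bestDelta.get(day)) {
                best.put(day, f);
                bestDelta.put(day, delta);
            }
        }
        return new ArrayList<>(best.values()); // TreeMap keeps the days in order
    }
}
```

The plain daily timelapse is the easy case by comparison: skip the filtering and just play back everything from the last 24 hours in order.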
I’m pretty frustrated with the reflections on the inside of the window. I’d really like to put the camera outside the house. The webcam I’m using (Logitech C920) is not weather-protected in the slightest, but maybe I can build a box to put it in. Or maybe I can replace the plastic housing with aluminum. Then I could put a nicer lens on it, too. But that’s a fairly expensive option, as cool as the result would be. I’ve also thought about replacing the camera entirely with a security camera. Some of those are pretty cheap. But they can also be pretty sketchy, with unknown data leakage, and sometimes the image can only be accessed through some cloud service, not directly. A CUBE would be perfect, only it isn’t out yet. I could even resort to a Raspberry Pi High Quality Camera + a lens + a PoE HAT + some waterproof case, and have a self-contained unit. But that’s probably the most expensive solution of the lot.
Ben Rinnes
M and I climbed up Ben Rinnes.
Ben Rinnes, seen from the webcam at around the time we were at its peak.
A difference of 525 m in altitude from base to peak. It took just minutes shy of three hours to make the round trip, with about an hour and three-quarters of that being the trek up, and maybe ten or fifteen minutes spent at the top.
We were pretty spent afterwards, but after staring out the window at it for a few months, it felt pretty good to climb it and look down from the opposite perspective.
Next week
I’m hoping to get back to some research/analysis I was doing for NRJavaSerial. It would also be nice to finish up error handling for the iOS version of the KCR app, and maybe start on design work for both the Android and iOS versions. Displaying some information on the currently-playing show and music would be nice, too.
1 and a bit cups of farfalle or maybe rotini? Idk I’m not Italian
Heineken
Get home a bit late from work. Decide you want pasta. Realize that if you wanted pasta, you should’ve probably gotten something out of the freezer this morning or you’re going to have to suck it up and go to the grocery store. Do neither and submerge the freezer-bag-ensconced ground beef in tepid water with the hope that it’ll defrost quickly.
Begin veg prep. Do something somewhat like mincing to the mushrooms; that’s cool, right? Crush the garlic with the flat of the knife because it’s satisfying then question how much that really affects anything in the overall swing of things while continuing to mangle it finer with the actual, sharp, you’re-supposed-to-use-this-part of the blade.
Get bored waiting for the beef to thaw. You’re going to cook it pretty much well-done anyway, right, so why not just let it continue to thaw in the pan? Toss it in at medium-high heat.
Ah, that’s why not. Go throw in some olive oil much too late in the process (“It’s medium so it’s got lots of fat in it, right?”) and get the batteries out of the fire alarms (“Oh, that’s where that one is!”). Do your best to tear apart the burnt chunks while continuing to break up the larger, still-mostly-frozen pieces.
Add the garlic in here somewhere.
Once the beef is thoroughly cooked and not a moment before (because that would be the right time to do it), add the can of soup. Spread it around for a few seconds while trying to decide what other liquid would make this less of a balled-up disaster, then slosh in a cup of milk from whatever measuring-ish implement you can find. Panic in the face of your decisions: it will truly be a miracle if this results in something that is in any way edible.
Make a huge mess getting all the various bits in the pan melded together. Realize you should’ve used a larger pan. Reduce heat to medium. Reduce the sauce, too. Taste the sauce. Bad call on the cream of celery there, mate. Remember you forgot the mushrooms. Add the mushrooms. Add black pepper.
Around the time the mixture has been reduced to a more normal sauce-like consistency and you’ve been reduced nearly to a nervous breakdown, remember the pasta. Get the water on to boil. Decide the reduction has gone on for long enough and bring the heat down as low as it goes.
Do some dishes to kill time. Nervously eye the quantity of pasta you’ve deemed appropriate. Will it be enough? Will it be far, far, far, far too much? Literally no one is capable of knowing and anyone who says otherwise is a damned liar.
When the water boils, add the pasta. Shortly thereafter, remember the red pepper. Add the red pepper to the sauce. Good thing you don’t like it too cooked. Add some more black pepper while you’re at it too, ‘cause why not? Remember you forgot to set a timer when you added the pasta. Make a note of whatever the clock on the stove reads and decide against setting a timer because it’s too late for that now.
Chop some parsley that you’ve had in the crisper drawer for six weeks which is miraculously still green because that’s a thing people put on top of pasta sauce, right?
Do more dishes.
Oh look, the pasta’s done. Sweet. Drain the pasta.
Start writing blog post.
Remember the pasta. Plate pasta, add sauce, top with parsley. Take photo for Instagram. Realize photo looks like dogfood. Decide to use photo for blog post instead.