
Update, weeks of 2020-07-06 and 2020-07-13

I am a great eater of beef

I made steaks (and frites). First time trying a reverse sear. The ribeyes we got were a bit thinner than necessary to take full advantage of the technique, but it was a good first stab at it. Even if they did come out closer to medium-well, they tasted phenomenal, and stayed wonderfully juicy. I’ll definitely be doing that again. I was a bit too ginger with the temperature for the fries, and while they were well-cooked and seasoned, they were a bit on the limp side. Would’ve been a good first pass for double-fried fries, but I was trying to do it in one. Live and learn.

Berries

M found what she believed to be an expansive patch of wild blueberries in the woods, and, after confirming that they were indeed wild blueberries, we collected probably half a cup of them. A bit bitter by themselves, but fantastic with (somewhat more than half a cup of) ice cream.

We repeated the experiment with raspberries from the garden, and appreciated the results similarly (although not quite as much – the blueberries really were great).

KCR

I’ve been mulling over whether and how best to get involved with the local community radio station. I mocked up a new website layout, but it’s not done yet and I ran into a few snags, so I put it back on the shelf. I did build prototype apps for both Android and iOS.

I tried to build something functionally equivalent to this many years ago for Axe Radio, but never got it finished. This seemed like a good opportunity to prove to myself that I could do it, given the time.

I ran into frustrations building both. The Android player was built over parts of three days. The first day was a false start based on the template Android Studio gives you if you start a new app and accept all of its default proposals. I ended up with Kotlin and the AppCompat/AndroidX support libraries. That all sounded well and good (I was looking forward to playing with Kotlin), but combined, those options carry over 2 MB of overhead into your published app package. To me, for an app which should be able to get by on media playback functionality built into Android for years now, that’s unacceptable. I spent the rest of the day in a new, Java-based project trying to pare back the Gradle build definition to a point where I could understand everything it was doing.

On day two, I got down to business wiring things up. On Android, you need to manage your own background task for long-running audio playback. It’s not hard, but it is a little tedious to get set up properly. Beyond that, the built-in programmatic media player, the aptly-named MediaPlayer, doesn’t give very good error messages. I stopped in frustration when I got to the point where older versions of Android were working fine, but Android 9+ failed with an obscure error code (1, meaning MEDIA_ERROR_UNKNOWN, with extra information -2147483648, meaning MEDIA_ERROR_SYSTEM – helpful and specific, I’m sure you’ll agree).
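
For the curious, here’s roughly what listening for those errors looks like. This is a minimal sketch rather than the app’s actual code (the PlayerErrors class and the "KcrPlayer" log tag are invented), but it at least translates the numeric codes into something greppable:

    import android.media.MediaPlayer;
    import android.util.Log;

    final class PlayerErrors {
        // Attach a listener that turns MediaPlayer's opaque (what, extra)
        // code pairs into readable log lines.
        static void logErrorsFrom(MediaPlayer player) {
            player.setOnErrorListener((mp, what, extra) -> {
                String whatName = (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED)
                        ? "MEDIA_ERROR_SERVER_DIED"
                        : "MEDIA_ERROR_UNKNOWN";
                String extraName;
                switch (extra) {
                    case MediaPlayer.MEDIA_ERROR_IO:          extraName = "MEDIA_ERROR_IO"; break;
                    case MediaPlayer.MEDIA_ERROR_MALFORMED:   extraName = "MEDIA_ERROR_MALFORMED"; break;
                    case MediaPlayer.MEDIA_ERROR_UNSUPPORTED: extraName = "MEDIA_ERROR_UNSUPPORTED"; break;
                    case MediaPlayer.MEDIA_ERROR_TIMED_OUT:   extraName = "MEDIA_ERROR_TIMED_OUT"; break;
                    // MEDIA_ERROR_SYSTEM is hidden from the public API;
                    // its value is -2147483648 (Integer.MIN_VALUE).
                    case Integer.MIN_VALUE:                   extraName = "MEDIA_ERROR_SYSTEM"; break;
                    default:                                  extraName = String.valueOf(extra);
                }
                Log.e("KcrPlayer", "Playback error: " + whatName + " (" + extraName + ")");
                return true; // Handled; don't also fire the completion listener.
            });
        }
    }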

On day three, I proved that my implementation worked on some level when I succeeded in playing back media embedded in the app. After much fumbling and floundering, I tried side-stepping MediaPlayer’s built-in HTTP retrieval mechanism: I made my own HTTP request for a fixed-length chunk of the audio stream, and fed the bytes into MediaPlayer. And… it didn’t even make it to MediaPlayer, because I got an error making the HTTP request. As of Android 9, all HTTP traffic must be secure (HTTPS), or you need to configure your app to allow plaintext traffic. And yet, I had already configured that back on day two. Turns out that redeploying the app through the Android Studio debugger may not cause the device to pick up on changes to the app manifest. After I uninstalled the app from the emulated device and reinstalled it, it worked perfectly. Aaugh.
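
For reference, the byte-feeding experiment was shaped roughly like this: fetch a fixed-length chunk of the stream over HTTP, wrap the bytes in a MediaDataSource (available since API 23), and hand that to MediaPlayer. A simplified sketch, not the real code (the ChunkSource name is invented, and the stream URL and chunk size are placeholders), and the fetch has to happen off the main thread:

    import android.media.MediaDataSource;
    import android.media.MediaPlayer;

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Serves an in-memory chunk of audio to MediaPlayer, bypassing its
    // built-in HTTP retrieval.
    final class ChunkSource extends MediaDataSource {
        private final byte[] chunk;

        private ChunkSource(byte[] chunk) {
            this.chunk = chunk;
        }

        // Download roughly maxBytes of the stream. Must run off the main
        // thread, or Android throws NetworkOnMainThreadException.
        static ChunkSource fetch(String url, int maxBytes) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            try (InputStream in = conn.getInputStream()) {
                ByteArrayOutputStream out = new ByteArrayOutputStream();
                byte[] buffer = new byte[8192];
                int read;
                while (out.size() < maxBytes && (read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
                return new ChunkSource(out.toByteArray());
            } finally {
                conn.disconnect();
            }
        }

        @Override
        public int readAt(long position, byte[] buffer, int offset, int size) {
            if (position >= chunk.length) {
                return -1; // End of data.
            }
            int count = (int) Math.min(size, chunk.length - position);
            System.arraycopy(chunk, (int) position, buffer, offset, count);
            return count;
        }

        @Override
        public long getSize() {
            return chunk.length;
        }

        @Override
        public void close() {
            // Nothing to release; the chunk lives entirely in memory.
        }
    }

Playing it back is then the usual MediaPlayer dance: setDataSource(ChunkSource.fetch(streamUrl, 256 * 1024)), prepare(), start().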

As for iOS, I ran into similar backwards-compatibility issues trying to start the project. The default Xcode template for a new Swift app relies on SwiftUI, which is only available as of iOS 13. I wanted to target something at least a version or maybe two older than that. Even after starting afresh with a Storyboard-based app, I still had to rip out a lot of references to scenes before it would compile. After that, it was mostly straightforward to get something working. iOS has similar restrictions on HTTP content in recent versions, but it gives helpful error messages about it. It’s also much more restrictive than Android about what can run in the background, but the trade-off is that the process for doing that background work is much simpler: tick off the “Audio, AirPlay, and Picture in Picture” background mode permission, and your AVPlayer will do the right thing. No coordinating state between your app’s UI and some background task/service, because it’s managed for you. Much less flexible, but much easier to do – as long as what you want to do is on the path well trod. It isn’t always, though: error handling for AVPlayer is a mess, and the iOS version of the app mostly doesn’t do it right now. HTTP errors from the source (e.g., 404 if the stream is offline) are handled separately from playback/stream errors (e.g., bad decoding), and are accessed through clumsy, Objective-C-style key-value observers. Even knowing when the player has started actually playing back media versus just starting to buffer it is tedious.

On one hand, it’s nice to have knocked out proofs-of-concept for both of these. On the other hand, neither experience was welcoming. Both Android Studio and Xcode were massive downloads on my wee, limited Internet connection here. Android Studio started off better, with a “mere” 850 MB installer, but immediately after installation it kicked off downloads of a few hundred more megs of updates and plugins. Then, to emulate a device, you need to download an image for each Android version you want to run, each of which is 700 MB to 1.1 GB. Total weight, by the time all was said and done, was probably in the neighbourhood of 5.5 GB. Xcode, on the other hand, is a single massive 7.8 GB package, but it does have all of the bits in the box, so to speak.

I don’t like using either IDE much. IntelliJ (even for more traditional Java projects) always feels like death by a thousand cuts, with every little thing being just a little different than I’m used to. And without getting into its performance issues, Xcode continues to make worse and worse use of screen real estate. I feel like I could see more code on the screen at once in the QBASIC editor on a CGA monitor than in a contemporary Xcode session. This whole experience has been a good reminder of why I never finished doing this for Axe Radio: too much tooth-pulling to go through voluntarily, unless you’re feeling particularly stubborn.

I also sent an e-mail to KCR asking if and how I could get involved. Haven’t heard back yet, but maybe something beyond my own edification will come of this, eventually.

Webcam

I added a “The webcam is located at…” note to the webcam page. I realized that I’ve been sending this link to friends who don’t really have a precise idea of where I am, so this answers the question nicely.

I’m hoping to get a timelapse interface built soon. I have weeks of images now at five-minute intervals, and it would be fun to be able to browse through them. In addition to the obvious daily timelapse (animation of all of the images from the last 24 hours in sequence), I think it would be neat to do a time-by-day timelapse, showing images from (around) the same time across multiple days. It wouldn’t look as fluid (the clouds would jump around a lot), but it would be an easy way to compare day-by-day weather.
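
The frame selection for the time-by-day version is simple enough in principle: for each day, keep the capture closest to a target time. A rough sketch of that, assuming the images carry their timestamps in their filenames (the webcam-2020-07-13T14-05.jpg naming here is invented, as is everything else about it):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.time.Duration;
    import java.time.LocalDate;
    import java.time.LocalDateTime;
    import java.time.LocalTime;
    import java.time.format.DateTimeFormatter;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;
    import java.util.stream.Stream;

    public class TimeByDay {
        // Matches names like "webcam-2020-07-13T14-05.jpg".
        private static final DateTimeFormatter STAMP =
                DateTimeFormatter.ofPattern("'webcam-'yyyy-MM-dd'T'HH-mm'.jpg'");

        // One frame per day, each as close as possible to the target time.
        // Assumes every file in the directory matches the naming scheme.
        public static List<Path> framesAt(Path dir, LocalTime target) throws IOException {
            Map<LocalDate, Path> best = new TreeMap<>(); // Sorted by day.
            try (Stream<Path> files = Files.list(dir)) {
                files.forEach(p -> best.merge(timestampOf(p).toLocalDate(), p,
                        (a, b) -> distance(a, target) <= distance(b, target) ? a : b));
            }
            return new ArrayList<>(best.values());
        }

        private static LocalDateTime timestampOf(Path p) {
            return LocalDateTime.parse(p.getFileName().toString(), STAMP);
        }

        private static long distance(Path p, LocalTime target) {
            return Math.abs(Duration.between(target, timestampOf(p).toLocalTime()).toMinutes());
        }
    }

From there, the winning frames could be stitched into an animation with something like ffmpeg.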

I’m pretty frustrated with the reflections on the inside of the window. I’d really like to put the camera outside the house. The webcam I’m using (a Logitech C920) is not weather-protected in the slightest, but maybe I can build a box to put it in. Or maybe I can replace the plastic housing with aluminum. Then I could put a nicer lens on it, too. But that’s a fairly expensive option, as cool as the result would be. I’ve also thought about replacing the camera entirely with a security camera. Some of those are pretty cheap. But they can also be pretty sketchy, with unknown data leakage, and sometimes the image can only be accessed through some cloud service, not directly. A CUBE would be perfect, except that it isn’t out yet. I could even resort to a Raspberry Pi High Quality Camera + a lens + a PoE HAT + some waterproof case, and have a self-contained unit. But that’s probably the most expensive solution of the lot.

Ben Rinnes

M and I climbed up Ben Rinnes.

Ben Rinnes, seen from the webcam at around the time we were at its peak.

It’s a difference of 525 m in altitude from base to peak. The round trip took minutes shy of three hours, with about an hour and three-quarters of that being the trek up, and maybe ten or fifteen minutes spent at the top.

We were pretty spent afterwards, but after staring out the window at it for a few months, it felt pretty good to climb it and look down from the opposite perspective.

Next week

I’m hoping to get back to some research/analysis I was doing for NRJavaSerial. It would also be nice to finish up error handling for the iOS version of the KCR app, and maybe start on design work for both the Android and iOS versions. Displaying some information on the currently-playing show and music would be nice, too.
