Return to the Stage

Sometimes life has a habit of getting in the way of our plans—but still we press on. That's been the theme of 2021 for many people, and I'm no exception.

Shortly after I posted the last update here, I started on peritoneal dialysis, and it's astonishing how much time and energy it takes up. I had hoped that the time I spend tethered to a machine in my bedroom would be an opportunity to focus more on projects like Lapis, but in practice it's just rarely productive time. And adjusting to the chronic fatigue that goes with it has been a major project in and of itself. Plus there's been no shortage of additional ways life has gotten in the way this year, including the sudden and unexpected death of my brother in September.

It's just been … a year.

At the same time, progress is being made on the pandemic front (or at least it was, until omicron arrived). I'm vaccinated and boosted, and I've had several opportunities to be back on the scoring stage over the course of 2021. It didn't take as long as I'd feared to get back, but it was long enough that I'd forgotten both how much work session prep is, and how immensely rewarding the payoff is.

Working with this year's cohort of (incredibly skillful) USC students spurred me to get back to work on the next to-do item in Lapis, which I'd been approaching sporadically over the past year-and-a-bit. It's been a major project, and a difficult one to hold in my head in all its complexity while I worked on it, but I'm pleased to say that I've reached the milestone of being able to check off the next box on the Road Map!

✔︎ Frame rendering

This innocuous-looking line item is, in fact, one of the biggest steps toward a working prototype of Lapis. What it means is that video frames—which had previously been decoded, but left sitting in memory—are now making their way onto the screen. It turned out there was some complexity I hadn't quite handled around buffering frames properly on the playback side, plus a lot of moving parts involved in converting frames to Metal textures and getting them to render. Then there's everything around managing a video window for playback, and a lot of attention to structuring things correctly under the hood, rather than slapping together something inflexible that I'd end up having to rework later on.
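For the curious, the buffering problem boils down to keeping a small lookahead of decoded frames and dropping anything playback has already passed. Here's an illustrative sketch of that idea (in Python, with hypothetical names—Lapis is a native Mac app and this is not its actual code):

```python
from collections import deque

class FrameBuffer:
    """Small lookahead buffer of decoded frames, keyed by frame index.

    The decode side pushes frames in order; on each tick, the playback
    side asks for the frame matching the current timecode position, and
    anything older is stale and gets dropped.
    """

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.frames = deque()  # (frame_index, frame_data), ascending index

    def push(self, index, frame):
        if len(self.frames) >= self.capacity:
            self.frames.popleft()  # drop the oldest to make room
        self.frames.append((index, frame))

    def frame_for(self, index):
        # Discard frames that playback has already passed.
        while self.frames and self.frames[0][0] < index:
            self.frames.popleft()
        if self.frames and self.frames[0][0] == index:
            return self.frames[0][1]
        return None  # not decoded yet: caller keeps showing the last frame

buf = FrameBuffer()
for i in range(4):
    buf.push(i, f"frame-{i}")
buf.frame_for(2)  # returns "frame-2"; frames 0 and 1 are dropped as stale
```

The subtle part in practice is the `None` case: when the decoder hasn't caught up yet, the player has to hold the previous frame rather than stall.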

It's also a major step because, for the first time, I can see things happening. I can start generating MTC in QLab, and watch Lapis follow along, playing video in perfect sync. It's really satisfying.
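The core of following MTC is mapping the incoming SMPTE-style position (hours:minutes:seconds:frames) to an absolute frame index in the video. Ignoring drop-frame timecode and MTC's quarter-frame message details, the arithmetic is just this (an illustrative sketch, not Lapis's code):

```python
def smpte_to_frame(hours, minutes, seconds, frames, fps=30):
    """Map an MTC/SMPTE position (hh:mm:ss:ff) to an absolute frame
    index, assuming non-drop-frame timecode at an integer frame rate."""
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

smpte_to_frame(0, 1, 0, 0)   # 1800 at 30 fps
smpte_to_frame(0, 0, 2, 15)  # 75
```

From there, "playing in sync" means continuously decoding toward whatever frame index the incoming timecode currently demands.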

And, it's fast! My main point of reference is Streamers, which takes a noticeable amount of time to catch up when MTC starts. That's in part because Streamers takes a QuickTime movie player object and attempts to carefully control its timing, which incurs a lot of overhead and is rather like steering the Titanic. Lapis, on the other hand, decodes the file a frame at a time under much tighter control, which both avoids that overhead and improves the timing accuracy. On an M1 MacBook Air, Lapis can skip into the middle of a 720p H.264 video and only miss about 5 frames before it starts playing back in sync.

I've also made progress on rendering visual overlays: streamers, punches, and the bar/beat counter are all up and running! This constitutes a huge amount of the work toward being able to use Lapis for live playback on the stage.
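Of those overlays, the bar/beat counter is the easiest to sketch: it maps elapsed time to a musical position. A minimal illustration, assuming a single fixed tempo and meter (a real cue follows a tempo map, and these names are hypothetical, not Lapis's):

```python
def bar_beat(seconds, bpm=120.0, beats_per_bar=4):
    """Map elapsed time to a 1-based (bar, beat) position, assuming one
    fixed tempo and meter for the whole cue."""
    beats = seconds * bpm / 60.0
    bar = int(beats // beats_per_bar) + 1
    beat = int(beats % beats_per_bar) + 1
    return bar, beat

bar_beat(0.0)  # (1, 1)
bar_beat(3.5)  # (2, 4): seven beats in at 120 BPM
```

Streamers and punches work the same way in spirit—each is anchored to a time position derived from the cue—but they render as animated overlays rather than a counter.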

What's next

The next steps on the Road Map are:

Designing and building the UI is the most daunting item left on the list by far. I have thoughts about what I'd like to do—which ideas to carry over from Streamers, and what new ideas can make it faster and more user-friendly—but it's likely going to take a lot of building and tearing down and iterating to get something that feels right.

Still, even without a real UI in place, to be able to say that Lapis is functioning as a timecode-driven video player (albeit not a very user-friendly one yet) is quite an exciting way to round out 2021.