SwiftUI and Environment

Sometime, somewhere, I stumbled across a hint about a SwiftUI behavior that had stumped me. Consider this example:
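Something simple, say a font modifier applied directly to a single view (the view and string here are my own placeholders):

```swift
import SwiftUI

// Placeholder view; the point is one modifier applied to one view.
struct GreetingView: View {
    var body: some View {
        Text("Hello, world!")
            .font(.largeTitle)
    }
}
```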

This is great, and very easy to figure out. But how the heck does this work:
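A single modifier call on a Group that somehow styles every view inside it. A sketch, again with placeholder names:

```swift
import SwiftUI

// One .font call on the Group applies to every Text inside it.
struct GreetingGroup: View {
    var body: some View {
        Group {
            Text("Hello")
            Text("World")
        }
        .font(.largeTitle)
    }
}
```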

My naive approach involved creating a modifier that applied a method to each view type I wanted to modify. That seemed like the right thing to do, but it caused problems when I tried to apply it to *all* view types. It is possible to extend the View protocol with a default implementation of a custom modifier method, but in the end this made all my views run that default implementation instead of any custom implementation a conforming type might provide. And further, how would each view do the kind of introspection necessary for a Group to apply the font modifier to the views inside it?

It turns out that SwiftUI uses an environment value to store the default font for every view in a given hierarchy.

https://developer.apple.com/documentation/swiftui/environmentvalues/font
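This means a custom view can read the inherited font the same way Text does. A minimal sketch (the view name is mine):

```swift
import SwiftUI

struct InheritedFontLabel: View {
    // Reads whatever font an ancestor set via .font(); nil if none did.
    @Environment(\.font) private var font: Font?

    var body: some View {
        Text("Styled by an ancestor")
            .font(font ?? .body)
    }
}
```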

Adopting this pattern made it easier to send information downward through the view hierarchy in my framework MusicStaffView. For example:
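The property names below are hypothetical, not MusicStaffView's actual API, but the pattern is the standard one: a custom EnvironmentKey plus an accessor on EnvironmentValues, which any descendant view can then read.

```swift
import SwiftUI

// Hypothetical key; MusicStaffView's real property names may differ.
private struct StaffSpaceWidthKey: EnvironmentKey {
    static let defaultValue: CGFloat = 10
}

extension EnvironmentValues {
    var staffSpaceWidth: CGFloat {
        get { self[StaffSpaceWidthKey.self] }
        set { self[StaffSpaceWidthKey.self] = newValue }
    }
}

// A note view can then size itself from the environment,
// without the parent passing the value through every initializer.
struct NoteHeadView: View {
    @Environment(\.staffSpaceWidth) private var spaceWidth

    var body: some View {
        Ellipse()
            .frame(width: spaceWidth * 1.5, height: spaceWidth)
    }
}
```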

Elements in the ZStack (including the StaffShapeView, which draws the musical staff lines) depend heavily on the vertical space between the staff lines. This is how I achieved better resolution independence when the framework moved from rendering images to drawing paths, and it remains a core feature of MusicStaffView today. But it requires that the width of the spaces be computed before any drawing takes place. In UIKit and Cocoa this was easy, because the rendering happened in the draw() method of a custom subclass of UIView or NSView.

In reality, SwiftUI calls similar methods to draw the shapes of the notes and other elements, but now it relies on the environment to describe how big to make each element, as well as what its offset is (e.g., a given note is offset from the center of the staff by a distance that depends on which note is being represented). In this way, the sizing can be derived from the GeometryProxy in MusicStaffView itself and propagated to the rest of the views in the staff, allowing them to draw at the correct size.
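As a sketch of that flow, with hypothetical names that are not MusicStaffView's real API: the container measures itself once with a GeometryReader, computes the gap between the five staff lines, and injects it into the environment so every child draws at the correct scale.

```swift
import SwiftUI

// Hypothetical environment key for this sketch.
private struct SpaceWidthKey: EnvironmentKey {
    static let defaultValue: CGFloat = 10
}

extension EnvironmentValues {
    var spaceWidth: CGFloat {
        get { self[SpaceWidthKey.self] }
        set { self[SpaceWidthKey.self] = newValue }
    }
}

struct StaffContainer<Content: View>: View {
    private let lineCount: CGFloat = 5
    private let content: Content

    init(@ViewBuilder content: () -> Content) {
        self.content = content()
    }

    var body: some View {
        GeometryReader { proxy in
            // Compute the gap between staff lines once, at the top,
            // then let every child read it from the environment.
            content
                .environment(\.spaceWidth,
                             proxy.size.height / (lineCount - 1))
        }
    }
}
```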

M2

I bought a new computer. Well, an old computer. A newer computer from the refurbished store. I appreciate that there are discounts on Apple products, because they tend to be fairly expensive, and their add-ons are overpriced because Apple has a captive audience thanks to the soldered-on nature of the components.

Using the iOS devices without being able to upgrade them hasn’t really been an issue for me. It has trained me to appreciate the contained, singular nature of the devices, and I find that I don’t really mind that I can no longer upgrade the RAM or put in a new HDD to prolong the life of the machine. At some point it just becomes necessary to move to a new model.

I didn’t expect to do this so soon, but I had a couple of friends who bought new M2 Macs, and I found myself with the itch to do something about my own setup. So far, it seems to be worth it. The performance gains with each M-series generation are particularly pronounced, and upgrading from the base M1 (which felt magical) to the M2 Pro seems to be another level altogether. I didn’t think it would be like this, but that’s because the Intel era we just came out of was so incremental.

These computers have gotten a bit more expensive. Gone are the days of a $1,200 entry-level MacBook Pro. But the cost is becoming easier to justify. I bought refurbished so that I could spend the savings on extra RAM (or unified memory, as we’re calling it now) and a bigger SSD, because I found myself constantly at the limit of 512GB. 1TB feels like a luxury, but I’m sure it will be a constraint in a couple of years.

Still, I find it hard to spend money on luxuries. But this seems to be worth it so far, and I think we are in a place where we won’t feel like the money went missing.