[WIP] SwiftUI and Metal experiment
This is a prototype that renders data with SwiftUI and Metal.
Having used WebGL, WebGL2, and some WebGPU before, I was interested in Swift, SwiftUI, and Metal. Luckily, WebGPU, and some WebGL2 advances such as uniform buffer objects (UBOs), are patterned after Apple's Metal graphics API. To propagate values among UI elements such as sliders, Combine is used. It follows the reactive model, inspired by Functional Reactive Programming, a proven substrate for data visualization (including sonification) and interactive data analysis. As MTKView is a subclass of UIView on mobile, the resulting UIKit component is wrapped for use by SwiftUI; non-Metal components are kept in SwiftUI.
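The wrapping step can be sketched roughly as follows, assuming a `UIViewRepresentable` conformance; `MetalViewRepresentable` and its `Coordinator` are hypothetical names, not the project's actual types:

```swift
import SwiftUI
import MetalKit

// A minimal sketch of wrapping MTKView for SwiftUI on iOS.
struct MetalViewRepresentable: UIViewRepresentable {
    func makeCoordinator() -> Coordinator { Coordinator() }

    func makeUIView(context: Context) -> MTKView {
        let view = MTKView()
        view.device = MTLCreateSystemDefaultDevice()
        view.preferredFramesPerSecond = 120   // matches the 120 FPS animation target
        view.delegate = context.coordinator   // Coordinator implements MTKViewDelegate
        return view
    }

    func updateUIView(_ uiView: MTKView, context: Context) {
        // Push SwiftUI state changes (e.g. slider values) into the renderer here.
    }

    class Coordinator: NSObject, MTKViewDelegate {
        func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
        func draw(in view: MTKView) {
            // Encode Metal render commands each frame.
        }
    }
}
```

The `Coordinator` bridges MTKView's delegate callbacks back into the SwiftUI world, which is the standard pattern for wrapped UIKit views.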
The initial experiments use public raster data for California's average annual precipitation between 1981 and 2010.
These first results use SwiftUI, Combine, MetalKit, shader functions, Core Graphics, 120 FPS animation, and naturally, Xcode and the Simulator.
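The Combine-based value propagation mentioned above might look like this sketch; `RenderSettings` and `contourLevels` are illustrative names, not taken from the project:

```swift
import SwiftUI
import Combine

// A hedged sketch of propagating a slider value reactively with Combine.
final class RenderSettings: ObservableObject {
    @Published var contourLevels: Double = 8   // driven by a SwiftUI Slider

    private var cancellables = Set<AnyCancellable>()

    init() {
        // React to changes, e.g. regenerate contours once the slider settles.
        $contourLevels
            .debounce(for: .milliseconds(100), scheduler: RunLoop.main)
            .sink { levels in
                print("regenerate contours with \(Int(levels)) levels")
            }
            .store(in: &cancellables)
    }
}

struct SettingsView: View {
    @ObservedObject var settings: RenderSettings

    var body: some View {
        Slider(value: $settings.contourLevels, in: 2...16, step: 1)
    }
}
```

`@Published` exposes a Combine publisher, so downstream consumers (such as the Metal renderer) can subscribe without the view and renderer knowing about each other.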
The cross-section terminal circles are draggable via DragGesture. Thanks to multitouch, the user can drag both circles simultaneously.
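A minimal sketch of one such draggable endpoint, assuming each circle carries its own gesture (names here are illustrative):

```swift
import SwiftUI

// One draggable cross-section endpoint; two instances, each with its own
// DragGesture, can be dragged at the same time via multitouch.
struct EndpointCircle: View {
    @Binding var position: CGPoint

    var body: some View {
        Circle()
            .frame(width: 24, height: 24)
            .position(position)
            .gesture(
                DragGesture()
                    .onChanged { value in
                        position = value.location   // follow the finger
                    }
            )
    }
}
```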
Operations include loading a data file, discretizing precipitation levels, generating contours, and mapping them to color palettes such as those provided by ColorBrewer2.
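The discretization and palette-mapping step can be sketched with equal-width bins; the sample values and the 5-class Blues palette below are illustrative, not the project's actual data:

```swift
import Foundation

// Illustrative 5-class ColorBrewer "Blues" palette (hex strings).
let palette = ["#f7fbff", "#c6dbef", "#6baed6", "#2171b5", "#08306b"]

/// Maps a precipitation value (mm) to a palette index using equal-width bins.
func binIndex(_ value: Double, min: Double, max: Double, bins: Int) -> Int {
    guard max > min else { return 0 }
    let t = (value - min) / (max - min)                    // normalize to [0, 1]
    return Swift.min(bins - 1, Swift.max(0, Int(t * Double(bins))))
}

let samples = [12.0, 250.0, 600.0, 1400.0, 2800.0]        // illustrative annual averages
let colors = samples.map { palette[binIndex($0, min: 0, max: 3000, bins: palette.count)] }
// → ["#f7fbff", "#f7fbff", "#c6dbef", "#6baed6", "#08306b"]
```

Quantile-based bins (rather than equal-width ones) are often a better fit for skewed precipitation data, but the mapping skeleton stays the same.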
Having used TypeScript for years, I found Swift and its UI libraries a productive environment: enough of the earlier skills transfer to hit the ground running with Swift.
Caveat: this is not a good example of information design, UI design, or data visualization practice. It is only for catching up on Swift, SwiftUI, and Metal.