While the Apple Pencil is designed as a sketching tool for creative professionals, MacRumors reader Simon Gladman has created three Swift demo apps that put the accessory to unconventional uses — as a weight scale, a synthesizer controller, and a 3D controller for image editing.
PencilScale, based on Gladman’s Plum-O-Meter, is an experimental app that uses a homemade harness to turn the Apple Pencil into an electronic scale that is highly sensitive, but not especially accurate.
The experiment works by subtracting a base weight from the touch’s force — the base is “set as the current touch force when the ‘zero’ button is pressed” — and multiplying the result by 140 for a very rough weight in grams.
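Under the stated assumptions (a user-set zero offset and a linear scaling factor of 140), the conversion can be sketched as a pure function. The function name and the clamping of negative readings to zero are illustrative, not taken from Gladman’s code:

```swift
import Foundation

/// Rough conversion from Apple Pencil touch force to grams.
/// `force` is the current UITouch.force value; `zeroForce` is the
/// force captured when the "zero" button was pressed.
/// The factor of 140 comes from the article; names are hypothetical.
func estimatedGrams(force: Double, zeroForce: Double) -> Double {
    // Subtract the zeroed base force, then scale to grams.
    let grams = (force - zeroForce) * 140
    // A reading below the baseline (e.g. lifting off) clamps to zero.
    return max(0, grams)
}
```

With this sketch, a touch force 0.5 above the zeroed baseline would read as roughly 70 grams.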
The second demo, a Pencil-controlled FM synthesizer, maps the accessory’s touch data to four synthesis parameters:

- Apple Pencil’s horizontal position on the screen controls frequency
- Apple Pencil’s vertical position on the screen controls the modulating multiplier
- Apple Pencil’s altitude angle controls the carrier multiplier
- Apple Pencil’s azimuth angle controls the modulation index
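The four mappings above can be sketched as a pure function from the Pencil’s touch geometry to synth parameters. The struct, parameter ranges, and scaling below are assumptions for illustration, not Gladman’s actual code; on iOS the inputs would come from a `UITouch`’s location, `altitudeAngle`, and `azimuthAngle(in:)`:

```swift
import Foundation

/// Hypothetical FM-synth parameter set; names mirror the article's list.
struct SynthParameters {
    var frequency: Double          // Hz
    var modulatingMultiplier: Double
    var carrierMultiplier: Double
    var modulationIndex: Double
}

/// Map normalized Pencil state (0...1 for screen position, radians for
/// angles) to synth parameters. Output ranges are illustrative assumptions.
func synthParameters(x: Double, y: Double,
                     altitude: Double, azimuth: Double) -> SynthParameters {
    SynthParameters(
        frequency: 110 + x * 880,                          // horizontal position → frequency
        modulatingMultiplier: 1 + y * 7,                   // vertical position → modulating multiplier
        carrierMultiplier: 1 + (altitude / (.pi / 2)) * 3, // altitude angle → carrier multiplier
        modulationIndex: (azimuth + .pi) / (2 * .pi) * 10  // azimuth angle → modulation index
    )
}
```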
PencilController is an experimental image processing app that uses the Apple Pencil as a controller for fine-tuning the parameters of Core Image filters.
The demo has three image filtering modes:
- Hue/Saturation – Apple Pencil’s azimuth angle controls hue and its altitude angle controls the saturation
- Brightness/Contrast – Apple Pencil’s altitude angle along North/South controls contrast and the angle along West/East controls brightness
- Gamma/Exposure – Apple Pencil’s altitude angle along North/South controls exposure and the angle along West/East controls gamma
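One way to read the “altitude along North/South” and “altitude along West/East” controls above is to project the Pencil’s tilt onto the two screen axes using the azimuth angle. This decomposition is an assumption about how the demo works, not code from the repository:

```swift
import Foundation

/// Decompose the Pencil's tilt into two axis-aligned components.
/// `altitude` is the altitude angle (π/2 when vertical, 0 when flat);
/// `azimuth` is the direction of the tilt in screen space, in radians.
/// Returns values in -1...1 for each axis (an illustrative convention).
func tiltComponents(altitude: Double, azimuth: Double) -> (northSouth: Double, westEast: Double) {
    // Amount of tilt: 0 when the Pencil is vertical, 1 when it lies flat.
    let tilt = 1 - altitude / (.pi / 2)
    // Project the tilt onto the screen axes via the azimuth direction.
    return (northSouth: tilt * sin(azimuth), westEast: tilt * cos(azimuth))
}
```

Each component could then be scaled into a filter parameter’s range — for example, contrast from the North/South component and brightness from the West/East component in the Brightness/Contrast mode.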
Gladman explains that “the app uses a spring loaded pattern, so the user needs to hold down one of the mode keys in the bottom left of the screen to stay in the filtering mode.”
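The spring-loaded behaviour Gladman describes can be modelled as a tiny state machine that enters a filtering mode on touch-down and snaps back to a neutral mode on touch-up. The enum cases and type names here are assumptions, not identifiers from PencilController:

```swift
/// Hypothetical filtering modes matching the three listed in the article.
enum FilterMode {
    case none
    case hueSaturation
    case brightnessContrast
    case gammaExposure
}

/// Spring-loaded mode selector: a mode is active only while its key is held.
struct ModeSelector {
    private(set) var mode: FilterMode = .none

    /// Called on touch-down of a mode key.
    mutating func press(_ newMode: FilterMode) { mode = newMode }

    /// Called on touch-up: spring back to no filtering.
    mutating func release() { mode = .none }
}
```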
The source code for all three projects is available on GitHub.