PencilController: Using the Apple Pencil as a 3D Controller for Image Editing

Despite Jony Ive describing the Pencil in Wallpaper* as being designed for mark-making rather than as a stylus replacement, I decided to explore some less conventional uses for mine. Yesterday I looked at a slightly wonky Pencil-based electronic scale, and today I'm using it as a sort of joystick for controlling image filter parameters.

My PencilController project is a Swift app for the iPad Pro that applies two Core Image filters to an image: a hue adjustment and a colour controls filter, which I use to adjust the saturation.

The Pencil’s orientation in space is described by the Horizontal Coordinate System with azimuth and altitude angles.
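
Both angles come straight from a UITouch. As a quick reference, here's a minimal sketch using the Swift 2 era API that the rest of this post uses (touch is assumed to be a Pencil touch received in one of the touch handlers):

    // altitudeAngle is 0 when the Pencil lies flat on the screen and π/2
    // when it stands vertically; azimuthAngleInView(_:) is the angle of
    // the Pencil's projection onto the screen plane, in radians.
    let altitude = touch.altitudeAngle
    let azimuth = touch.azimuthAngleInView(view)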

The hue filter’s value is controlled by the azimuth angle and the saturation by the altitude angle: when the Pencil is vertical, the saturation is zero, and when it’s horizontal, the saturation is eight (although when the Pencil is totally horizontal its tip isn’t actually touching the screen, so the highest saturation the app can set is about six and three quarters).
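
That altitude-to-saturation mapping is a simple linear rescale; as a standalone sketch (the function name here is mine, not something from the project):

    // Maps altitudeAngle ∈ [0, π/2] to saturation ∈ [8, 0]: a vertical
    // Pencil gives zero saturation, a flat one would give eight.
    func saturationForAltitude(altitudeAngle: CGFloat) -> CGFloat
    {
        let halfPi = CGFloat(M_PI_2)
        return 8 * ((halfPi - altitudeAngle) / halfPi)
    }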

To jazz up the user interface, I’ve also added a rounded cylinder using SceneKit which mirrors the Pencil’s position and orientation.

Controlling Core Image Filter Parameters with Pencil

Setting the values for the two Core Image filters is pretty simple stuff.  Both filters are declared as constants at the top of my view controller along with a Core Image context (without colour management for performance) and a Core Image image:

    // Both filters are created once and reused for every touch event.
    let hueAdjust = CIFilter(name: "CIHueAdjust")!
    let colorControls = CIFilter(name: "CIColorControls")!

    // A GPU based Core Image context; passing NSNull for the working
    // colour space disables colour management for better performance.
    let ciContext = CIContext(EAGLContext: EAGLContext(API: EAGLRenderingAPI.OpenGLES2),
        options: [kCIContextWorkingColorSpace: NSNull()])

    let coreImage = CIImage(image: UIImage(named: "DSCF0786.jpg")!)!

When a touch either starts or changes, I want to ensure it originates from a Pencil by checking its type before invoking applyFilter() via the pencilTouchHandler() method:

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?)
    {
        guard let touch = touches.first where
            touch.type == UITouchType.Stylus else { return }

        pencilTouchHandler(touch)
    }

pencilTouchHandler() extracts the azimuth and altitude angles from the UITouch, does some simple arithmetic and passes those values to applyFilter():

    // pi and halfPi are constants for π and π/2.
    applyFilter(hueAngle: pi + touch.azimuthAngleInView(view),
        saturation: 8 * ((halfPi - touch.altitudeAngle) / halfPi))

It’s applyFilter() that uses those two values to set the parameters on the filters and display the output in a UIImageView:

    func applyFilter(hueAngle hueAngle: CGFloat, saturation: CGFloat)
    {
        // The source image feeds the hue adjustment filter...
        hueAdjust.setValue(coreImage, forKey: kCIInputImageKey)
        hueAdjust.setValue(hueAngle, forKey: kCIInputAngleKey)

        // ...and its output feeds the colour controls filter.
        colorControls.setValue(hueAdjust.valueForKey(kCIOutputImageKey) as! CIImage,
            forKey: kCIInputImageKey)
        colorControls.setValue(saturation, forKey: kCIInputSaturationKey)

        let cgImage = ciContext.createCGImage(colorControls.valueForKey(kCIOutputImageKey) as! CIImage,
            fromRect: coreImage.extent)

        imageView.image = UIImage(CGImage: cgImage)

        label.text = String(format: "Hue: %.2f°", hueAngle * 180 / pi) + "      " +
            String(format: "Saturation: %.2f", saturation)
    }

On my iPad Pro this filtering is fast enough on a near full screen image that I don’t have to worry about doing this work in a background thread.
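
If the image were heavier, the rendering could move off the main thread; here's a rough Swift 2 style sketch of how applyFilter()'s final steps might change (this isn't part of the project, just an illustration):

    // Hypothetical variant: create the bitmap on a background queue,
    // then hop back to the main queue for the UIKit work.
    let outputImage = colorControls.valueForKey(kCIOutputImageKey) as! CIImage

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0))
    {
        let cgImage = self.ciContext.createCGImage(outputImage, fromRect: self.coreImage.extent)

        dispatch_async(dispatch_get_main_queue())
        {
            self.imageView.image = UIImage(CGImage: cgImage)
        }
    }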

Controlling SceneKit Geometry with Pencil

The next piece of work is to orient and position the “virtual pencil” so it mirrors the real one. I’ve overlaid an SCNView above the UIImageView and added a capsule geometry (which is a cylinder with rounded ends, not unlike a Pencil). Importantly, I’ve also added a flat plane which is used to capture the Pencil’s location in SceneKit’s 3D space:

    let sceneKitView = SCNView()
    let scene = SCNScene()
    let cylinderNode = SCNNode(geometry: SCNCapsule(capRadius: 0.05, height: 1))
    let plane = SCNNode(geometry: SCNPlane(width: 20, height: 20))

    // in init(): attach the scene and add both nodes to it
    sceneKitView.scene = scene
    scene.rootNode.addChildNode(cylinderNode)
    scene.rootNode.addChildNode(plane)

Inside pencilTouchHandler(), I use the SceneKit view’s hitTest() method to convert the Pencil’s touch location on screen into a position on the plane in SceneKit’s 3D space:

    func pencilTouchHandler(touch: UITouch)
    {
        guard let hitTestResult = sceneKitView.hitTest(touch.locationInView(view), options: nil)
            .filter({ $0.node == plane })
            .first else { return }

…and with the results of that hit test, I can position the cylinder underneath the Pencil’s touch location:

    cylinderNode.position = SCNVector3(hitTestResult.localCoordinates.x,
        hitTestResult.localCoordinates.y,
        0)   // y and z reconstructed here; z = 0 keeps the capsule on the plane

Finally, with the altitude and azimuth angles of the touch, I can set the Euler angles of the cylinder to match the Pencil:

    // x (pitch) comes from the altitude angle and z (roll) from the
    // azimuth; the y component is assumed to be 0 here.
    cylinderNode.eulerAngles = SCNVector3(touch.altitudeAngle,
        0,
        0 - touch.azimuthAngleInView(view) - halfPi)

I’ve made the SceneKit camera orthographic, since a perspective camera adds unwanted rotation to the “virtual pencil” as it moves across the screen.
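
The camera set-up isn't shown above; a minimal sketch might look like this (the node name and position are my assumptions rather than the project's values):

    // An orthographic projection keeps the capsule's apparent orientation
    // independent of where it sits on screen.
    let cameraNode = SCNNode()
    cameraNode.camera = SCNCamera()
    cameraNode.camera?.usesOrthographicProjection = true
    cameraNode.position = SCNVector3(0, 0, 10)
    scene.rootNode.addChildNode(cameraNode)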


Despite what Jony Ive may say, the Pencil offers some user interaction patterns that are impossible with a simple touch screen, and I hope other developers start exploring new ideas. In addition to the two angles, the Pencil also reports its x and y coordinates and its force, so that’s five different values that could potentially be used for controlling anything from image filters to an audio synthesiser!
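
All five of those values can be read from a single UITouch; a quick sketch (Swift 2 era API again, with touch assumed to come from a touch handler):

    let location = touch.locationInView(view)     // x and y coordinates
    let azimuth = touch.azimuthAngleInView(view)  // orientation around the touch point
    let altitude = touch.altitudeAngle            // tilt relative to the screen
    let force = touch.force                       // pressure on the tip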

As always, the source code for this project is available in my GitHub repository. Enjoy!

I’m also working on a Pencil-controlled synthesiser app using the AudioKit libraries; here’s a preview of the prototype in action:

Addendum: I’ve updated the app to allow the user to select between three different image editing modes! See the video above.