Demolition I

The little house is a bit peculiar. Not only is there an outside toilet that goes unused, there is also an inside one with no water supply at all. A hidden room without any access, which once housed an oil tank, sits wedged between the bathroom and the outer wall.

In any case, Z. starts out by removing the indoor toilet bunker.

A one-finger rotation gesture recognizer

Last year we developed the Raumfeld iPhone app. The goal was to build something that could completely replace our native controller. The main design feature of the native controller is its massive volume knob. Naturally, we wanted something similar in our app, so the designer created the volume screen with a big knob, as shown in the picture on the right.

To create a volume control, we needed a way to detect whether the user performs a rotation gesture on the knob image. There is no default UIGestureRecognizer that detects such a gesture, so we had to implement a custom one. It was surprisingly easy; to be honest, the hardest part was getting the math right.

To track the finger movement, we need to check, for every touch event, whether it lies within the defined area. The gesture is a rotation, so there is a center point m, a radius a that defines the minimum distance from m, and a radius b that defines the maximum distance from m. Finally, we need to calculate the angle φ between the start point and the current finger position (relative to m).

For the first check, we calculate the distance d from the center and make sure that a < d < b holds.
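In code, the check could look like this. This is a sketch; the names isInsideRing, minRadius (a) and maxRadius (b) are assumptions:

```swift
import CoreGraphics

/// Returns true if `point` lies inside the ring around `center`
/// bounded by `minRadius` (a) and `maxRadius` (b).
func isInsideRing(_ point: CGPoint, center: CGPoint,
                  minRadius: CGFloat, maxRadius: CGFloat) -> Bool {
    // Distance from the center: d = sqrt(dx² + dy²)
    let dx = point.x - center.x
    let dy = point.y - center.y
    let d = sqrt(dx * dx + dy * dy)
    // The touch is only valid if a < d < b
    return d > minRadius && d < maxRadius
}
```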

The rotation angle φ is the angle between the line from m to the start point and the line from m to the current finger position. The arc tangent function is the key:
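A sketch of the angle calculation. Here atan2() is used rather than plain atan(), because it handles all four quadrants and does not divide by zero when the touch is straight above or below the center; the function names are assumptions:

```swift
import CoreGraphics

/// Angle (in radians) of the line from `center` to `point`,
/// measured against the positive x-axis.
func angle(of point: CGPoint, around center: CGPoint) -> CGFloat {
    // atan2 takes dy and dx separately, so it works in all
    // four quadrants and copes with dx == 0.
    return atan2(point.y - center.y, point.x - center.x)
}

/// Rotation φ between the start point and the current finger position.
func rotationAngle(from start: CGPoint, to current: CGPoint,
                   around center: CGPoint) -> CGFloat {
    return angle(of: current, around: center) - angle(of: start, around: center)
}
```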

OK, that was the hard part 🙂 To implement a custom gesture recognizer, let's have a look at the UIGestureRecognizer API docs, especially the subclassing notes. We just need to override five methods:
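The skeleton of such a subclass might look like this; the class name OneFingerRotationGestureRecognizer is an assumption. Note the extra import, which exposes the setter for `state` to subclasses:

```swift
import UIKit
import UIKit.UIGestureRecognizerSubclass  // makes `state` writable in subclasses

class OneFingerRotationGestureRecognizer: UIGestureRecognizer {

    override func reset() {
        super.reset()
        // Clear the stored start point and angle here.
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesBegan(touches, with: event)
        // Remember the start point; fail for multi-finger touches.
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesMoved(touches, with: event)
        // Validate the touch position and compute the rotation angle.
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesEnded(touches, with: event)
        state = .ended
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        super.touchesCancelled(touches, with: event)
        state = .cancelled
    }
}
```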

The key is touchesMoved:withEvent:. This method checks the distance of the touch event from the center point. If the touch is within the valid area, the angle between the start point and the current touch position is calculated. The result is sent to the delegate object of our class.
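A sketch of that method inside a UIGestureRecognizer subclass. The stored properties center, minRadius, maxRadius, startPoint and the rotationDelegate are assumptions:

```swift
override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
    super.touchesMoved(touches, with: event)
    guard let touch = touches.first else { return }
    let point = touch.location(in: view)

    // Distance d of the touch from the center point m
    let dx = point.x - center.x
    let dy = point.y - center.y
    let distance = sqrt(dx * dx + dy * dy)

    // Ignore touches outside the valid ring a < d < b
    guard distance > minRadius && distance < maxRadius else { return }

    // Angle between the start point and the current position
    let currentAngle = atan2(dy, dx)
    let startAngle = atan2(startPoint.y - center.y, startPoint.x - center.x)

    // Notify the delegate and keep the gesture alive
    rotationDelegate?.rotation(angle: currentAngle - startAngle)
    state = .changed
}
```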

The target needs to implement some kind of protocol so that the gesture recognizer can notify it of movements:
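Such a protocol can be as small as a single method; the names here are assumptions:

```swift
import CoreGraphics

protocol OneFingerRotationGestureRecognizerDelegate: AnyObject {
    /// Called whenever the finger moves inside the valid area,
    /// with the angle (in radians) relative to the start point.
    func rotation(angle: CGFloat)
}
```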

And that’s the whole magic. To use the gesture recognizer, create an image of a rotary control and add the gesture recognizer to your view:
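Wiring it up in a view controller might look like this. The class name OneFingerRotationGestureRecognizer, its rotationDelegate property and the "knob" image name are assumptions:

```swift
import UIKit

class VolumeViewController: UIViewController {

    let knobImageView = UIImageView(image: UIImage(named: "knob"))

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(knobImageView)

        // Image views ignore touches by default, so enable them first.
        knobImageView.isUserInteractionEnabled = true

        // Attach the custom recognizer to the knob image.
        let recognizer = OneFingerRotationGestureRecognizer(target: self, action: nil)
        recognizer.rotationDelegate = self
        knobImageView.addGestureRecognizer(recognizer)
    }
}
```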

As soon as a gesture is detected, the delegate method is called and you can rotate the image according to the angle:
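The delegate method then just applies the angle as a rotation transform. This sketch assumes the recognizer reports the total angle relative to the start of the gesture, so an absolute transform is set:

```swift
// Delegate callback: rotate the knob image by the reported angle.
func rotation(angle: CGFloat) {
    knobImageView.transform = CGAffineTransform(rotationAngle: angle)
}
```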

I’ve created a sample project on GitHub; feel free to play with it. Also make sure to read Ole Begemann’s article about the UX details of gesture recognition.