A one-finger rotation gesture recognizer

Last year we developed the Raumfeld iPhone app. The goal was to build something that could completely replace our native controller. The main design feature of the native controller is its massive volume knob. Of course we wanted something similar in our app, so the designer created the volume screen with a big knob, as shown in the picture on the right.

To create a volume control, we needed a way to detect whether the user performs a rotation gesture on the knob image. There is no default UIGestureRecognizer that detects such a gesture, so we had to implement a custom one. It was surprisingly easy; to be honest, the hardest part was getting the math right.

To track the finger movement, we need to check for every touch event whether it is within the defined area. The gesture is a rotation, so there is a center point m, a radius a which defines the minimum distance from m, and a radius b which defines the maximum distance from m. Finally, we need to calculate the angle α between the start point and the current finger position (relative to m).

For the first check, we calculate the distance d from the center and make sure that a < d < b holds.

/** Calculates the distance between point1 and point2. */
CGFloat distanceBetweenPoints(CGPoint point1, CGPoint point2)
{
    CGFloat dx = point1.x - point2.x;
    CGFloat dy = point1.y - point2.y;
    return sqrt(dx*dx + dy*dy);
}

The rotation angle is the angle between the two lines a and b. The arc tangent function atan2() is the key:

/** The method is a bit too generic - in our case both lines share the same start point. */
CGFloat angleBetweenLinesInDegrees(CGPoint beginLineA,
                                   CGPoint endLineA,
                                   CGPoint beginLineB,
                                   CGPoint endLineB)
{
    CGFloat a = endLineA.x - beginLineA.x;
    CGFloat b = endLineA.y - beginLineA.y;
    CGFloat c = endLineB.x - beginLineB.x;
    CGFloat d = endLineB.y - beginLineB.y;
    CGFloat atanA = atan2(a, b);
    CGFloat atanB = atan2(c, d);
    // convert radians to degrees
    return (atanA - atanB) * 180 / M_PI;
}

Ok, that was the hard part 🙂 To implement a custom gesture recognizer, let's have a look at the UIGestureRecognizer API docs, especially the subclassing notes. We just need to override five methods:

- (void)reset;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event;
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event;

The key is touchesMoved:withEvent:. This method checks the distance of the touch event from the center point. If the touch is within the valid area, the angle between the start point and the current touch position is calculated. The result is sent to the delegate object of our class.

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    if (self.state == UIGestureRecognizerStateFailed) return;
    CGPoint nowPoint  = [[touches anyObject] locationInView: self.view];
    CGPoint prevPoint = [[touches anyObject] previousLocationInView: self.view];
    // make sure the new point is within the area
    CGFloat distance = distanceBetweenPoints(midPoint, nowPoint);
    if (   innerRadius <= distance
        && distance    <= outerRadius)
    {
        // calculate rotation angle between two points
        CGFloat angle = angleBetweenLinesInDegrees(midPoint, prevPoint, midPoint, nowPoint);
        // fix value, if the 12 o'clock position is between prevPoint and nowPoint
        if (angle > 180)
            angle -= 360;
        else if (angle < -180)
            angle += 360;
        // sum up single steps
        cumulatedAngle += angle;
        // call delegate
        if ([target respondsToSelector: @selector(rotation:)])
            [target rotation:angle];
    }
    else
    {
        // finger moved outside the area
        self.state = UIGestureRecognizerStateFailed;
    }
}

target needs to implement some kind of protocol so that the gesture recognizer can notify it of movements:

@protocol OneFingerRotationGestureRecognizerDelegate <NSObject>
/** A rotation gesture is in progress; the first argument is the rotation angle in degrees. */
- (void) rotation: (CGFloat) angle;
/** The gesture is finished; the first argument is the total rotation angle. */
- (void) finalAngle: (CGFloat) angle;
@end

And that’s the whole magic. To use the gesture recognizer, create an image of a rotating control and add a gesture recognizer to your view:

// calculate center and radius of the control
CGPoint midPoint = CGPointMake(image.frame.origin.x + image.frame.size.width / 2,
                               image.frame.origin.y + image.frame.size.height / 2);
CGFloat outRadius = image.frame.size.width / 2;
// outRadius / 3 is arbitrary, just choose something >> 0 to avoid strange
// effects when touching the control near its center
gestureRecognizer = [[OneFingerRotationGestureRecognizer alloc] initWithMidPoint: midPoint
                                                            innerRadius: outRadius / 3 
                                                            outerRadius: outRadius
                                                                 target: self];
[someView addGestureRecognizer: gestureRecognizer];

As soon as a gesture is detected, the delegate method is called and you can rotate the image according to the angle:

- (void) rotation: (CGFloat) angle
{
    // calculate rotation angle
    imageAngle += angle;
    if (imageAngle > 360)
        imageAngle -= 360;
    else if (imageAngle < -360)
        imageAngle += 360;
    // rotate image and update text field
    image.transform = CGAffineTransformMakeRotation(imageAngle * M_PI / 180);
    textDisplay.text = [NSString stringWithFormat: @"\u03b1 = %.2f", imageAngle];
}

I’ve created a sample project on GitHub, feel free to play with it. Also make sure to read Ole Begemann’s article about the UX details of gesture recognition.

29 replies to “A one-finger rotation gesture recognizer”

  1. Hey, if I had known that you work at Raumfeld, I would of course have gone with it. Unfortunately it ended up being a Sonos system instead.
    But I’m very happy with it 😉

  2. Thanks, this is a very helpful blog post.
    Now I know what to get.
    It looks very good, too.

  3. Hi, could you explain how I could edit your code so that when a finger rotation stops, the knob returns to its original start position? I’m not sure how I would make it move to that location at an even speed.

  4. Great solution for one finger rotation.
    Thank you very much for sharing.
    I’m a newbie and I need to access the angleInRadians value, calculate a new value from this, and use that new value to move a UIImageView across the screen.
    The problem is that this is not in the ViewController and I don’t know how to get this value back to the ViewController.
    If you could point me in the right direction, that would be great.

    Thank you in advance.

    Best Regards,

  5. @Mike: Either skip the conversion in angleBetweenLinesInDegrees, but make sure that you adapt touchesMoved:withEvent: to calculate with radian values. Or convert the value back to radians in the rotation: method.

    Your ViewController should implement the protocol to receive the rotation: calls. See the example source code on GitHub; the view controller there implements the protocol and updates the text field.

  6. Hi, I want to change the angle range from 360° to 270° and stop it when it reaches 270 or 0 while scrolling. How can I do this?

    Best Regards,

  7. Thanks!
    That’s exactly what I was looking for.
    As my “wheel” view was nested in another view, I passed the parent view in the init and replaced all references to “self.view” in touchesMoved with it. Works like a charm.

  8. @Arthur: I guess you should change “touchesMoved:” and replace:

    // fix value, if the 12 o’clock position is between prevPoint and nowPoint
    if (angle > 180)
        angle -= 360;
    else if (angle < -180)
        angle += 360;

    with:

    if (angle > 270)
        angle = 270;
    else if (angle < 0)
        angle = 0;

  9. Hi. What if I have the image in a frame, so that the image location is relative to the frame, not to the whole window?

  10. Hi, I have three UIButtons on this view. They only work when I swipe on them. How can I use buttons together with this gesture recognizer? Where should I make changes?

    Thanks for the answer,

  11. Hi once again. In my previous comment I meant to ask how you got the “close-x” button to work as a normal button. My buttons on this controller only work when I swipe on them.

  12. OK, while I’m relatively new to Objective-C, I am versed enough in programming to say that this is an absolutely brilliant piece of code. If you could point me in the right direction, however: how can I use this class across a collection of knobs? Say, for example, that I’ve got 3 UIView knobs and 3 text fields in a single view controller; how can I get the class to respond to each of the knobs independently? I thought perhaps using a switch statement in the

    if ([target respondsToSelector: @selector(rotation:)])
    [target rotation:angle];

    section of your implementation file, but I can’t seem to figure out how to return which knob is being dragged. Any help would be greatly appreciated.

  13. OK, I figured out how to use the delegate across a multitude of knobs. Now, can anyone give me a clue how to change the values so that 0 isn’t at 12 o’clock? I’d like to get 0 at 7 o’clock and the high end at about 4 o’clock, similar to a volume knob on a radio.

  14. Hi,
    great tutorial about rotating an object with a single finger. I was trying to find exactly such a tutorial. Thanks!

  15. I have used this control for 8 knobs in a row, and found an error there.

    I solved this by replacing, in - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event:

    CGPoint nowPoint = [[touches anyObject] locationInView: self.view];
    CGPoint prevPoint = [[touches anyObject] previousLocationInView: self.view];

    with:

    CGPoint nowPoint = [[touches anyObject] locationInView: self.view.superview];
    CGPoint prevPoint = [[touches anyObject] previousLocationInView: self.view.superview];

    Otherwise, the touch locations were wrong.

  16. It does not work on iOS 7. I tried implementing your code from scratch, but the image always fluctuates while rotating.

  17. Great article! I made a VERY simple little app driving a “dial”, but purely based on swipe gesture speeds (I split the wheel into four gesture zones and combined x and y speeds). The issue was to bind that to the actual “dot” on the dial. This solution actually looks at the touch position, so it’s way, way better and still very short. Thanks for sharing!! 😀

  18. In Xcode 6, where the new template has LaunchScreen.xib, this doesn’t work. The problem is in the file OneFingerRotationGestureRecognizer.m, at the line: CGPoint prevPoint = [[touches anyObject] previousLocationInView: self.view];
    There are 2 errors showing that I can’t avoid.
    Interestingly, everything still works in old projects made with previous versions of Xcode, even if I take an old project and include OneFingerRotationGestureRecognizer in it.
    The error messages are:
    1) No known instance method for selector ‘previousLocationInView:’
    2) Initializing ‘CGPoint’ (aka ‘struct CGPoint’) with an expression of incompatible type ‘id’
    Can somebody tell me how to avoid this?

Comments are closed.