UIGestureRecognizer wrapper

Authored by rcameron on Aug 11 2016, 2:27 PM.



first take at basic gesture recognizer wrapper

Diff Detail

rMDMDIRECTMANIPULATIONSWIFT material-motion/direct-manipulation-swift
Automatic diff as part of commit; lint not applicable.
Automatic diff as part of commit; unit tests not applicable.
rcameron retitled this revision to "UIGestureRecognizer wrapper". Aug 11 2016, 2:27 PM
rcameron updated this object.
rcameron edited the test plan for this revision. (Show Details)
Restricted Application added a project: Material Motion.
featherless requested changes to this revision. Aug 11 2016, 5:57 PM
featherless added a reviewer: featherless.
featherless added a subscriber: featherless.
featherless added inline comments.

Replace with https://github.com/material-motion/... so that travis will be able to check out the dependency.


I see the train of thought that led to this particular solution. I imagine you also felt that this approach is a lot of heavy lifting for not much gain.

One of the central goals of Material Motion is to allow application developers to think in more declarative terms. For example, one might simply describe a view as being "draggable"; the specific logic implementing this behavior would not have to be written by the application developer.

As the code is presently written, the application developer is both describing and implementing the plan of "draggable", which is somewhat of an anti-goal for Material Motion.

In essence, the "plan" in this implementation is actually "add a gesture recognizer to a target", not "draggable" as it's been described here.

So the central question to answer is what would a performer that fulfills a "draggable" contract look like?

Thinking declaratively

If we revisit what the contract of "draggable" is, we might expect the performer to do the following for us:

  1. Receive gesture events.
  2. Commit position delta changes to the target.

#1 is solvable by adding our performer as a target for the gesture recognizer. This requires that we provide a gesture recognizer to the plan, which will then be made available to the performer:

let draggable = Draggable(withGestureRecognizer: pan)
transaction.add(plan: draggable, to: draggableView)
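For illustration, the Draggable plan itself might be sketched as follows. This is only a sketch: the exact shape of the `Plan` protocol (assumed here to be a single `performerClass()` requirement) and the `DraggablePerformer` name are assumptions, not part of this diff.

```swift
import UIKit

// Sketch of a Draggable plan. The Plan protocol shape and the
// DraggablePerformer name are assumptions for illustration.
class Draggable: NSObject, Plan {
  let gesture: UIPanGestureRecognizer

  init(withGestureRecognizer gesture: UIPanGestureRecognizer) {
    self.gesture = gesture
    super.init()
  }

  // The runtime asks the plan which performer type fulfills it.
  func performerClass() -> AnyClass {
    return DraggablePerformer.self
  }
}
```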

To solve #2, the Draggable plan might be performed by a DraggablePerformer. This performer's addPlan method might look like so:

func addPlan(_ plan: Plan) {
  let draggable = plan as! Draggable
  draggable.gesture.addTarget(self, action: #selector(gestureDidChange(_:)))
}

The performer would then be responsible for implementing the gestureDidChange: method:

@objc func gestureDidChange(_ gesture: UIPanGestureRecognizer) {
  // Commit the delta to the target
}
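Filled out, the delta-committing step might look like this sketch. It assumes the performer holds a reference to its `target` view; how the target is actually handed to a performer depends on the runtime's Performer API, so treat the initializer here as a placeholder.

```swift
import UIKit

// Sketch of a performer committing pan deltas to its target view.
// The initializer is a placeholder for the runtime's Performer API.
class DraggablePerformer: NSObject {
  let target: UIView

  init(target: UIView) {
    self.target = target
    super.init()
  }

  @objc func gestureDidChange(_ gesture: UIPanGestureRecognizer) {
    // Read the translation accumulated since the last event...
    let translation = gesture.translation(in: target.superview)

    // ...commit it to the target...
    target.center.x += translation.x
    target.center.y += translation.y

    // ...then reset it so the next event delivers a fresh delta
    // rather than an absolute offset from the gesture's start.
    gesture.setTranslation(.zero, in: target.superview)
  }
}
```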

Generalizing the pattern

In the above example we explored a "Draggable" plan that was fulfilled by a "DraggablePerformer". What if we want to support "Pinchable" and "Rotatable" as well?

One approach would be to build a new performer type for each of the plans.

Once we do the above, we might find a common pattern across all three performers. At that point we might choose to build a single performer, a GesturePerformer, that can handle many gesture plans simultaneously for a given target.

In the latter approach, our GesturePerformer would be able to do smart things like adjusting the anchor point of a view when the first gesture recognizer begins. This would allow you to implement "direct manipulation" interactions on an arbitrary view:

transaction.add(plan: Draggable(pan), to: draggableView)
transaction.add(plan: Pinchable(pinch), to: draggableView)
transaction.add(plan: Rotatable(rotation), to: draggableView)
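A single GesturePerformer along these lines might look like the sketch below. It builds on the plan types named above (`Draggable`, `Pinchable`, `Rotatable`), all of which are assumed to expose their gesture recognizer via a `gesture` property; every name here is illustrative rather than an existing API.

```swift
import UIKit

// Hypothetical GesturePerformer: one performer per target that accepts
// many gesture plans and applies each gesture's delta to the target.
class GesturePerformer: NSObject {
  let target: UIView

  init(target: UIView) {
    self.target = target
    super.init()
  }

  func addPlan(_ plan: Plan) {
    // Route each plan's recognizer to the matching handler. Each plan
    // type is assumed to expose its recognizer as `gesture`.
    switch plan {
    case let plan as Draggable:
      plan.gesture.addTarget(self, action: #selector(panDidChange(_:)))
    case let plan as Pinchable:
      plan.gesture.addTarget(self, action: #selector(pinchDidChange(_:)))
    case let plan as Rotatable:
      plan.gesture.addTarget(self, action: #selector(rotationDidChange(_:)))
    default:
      assertionFailure("Unsupported plan type")
    }
  }

  // Each handler follows the same read–commit–reset pattern so that
  // every event delivers a fresh delta.
  @objc func panDidChange(_ gesture: UIPanGestureRecognizer) {
    let translation = gesture.translation(in: target.superview)
    target.center.x += translation.x
    target.center.y += translation.y
    gesture.setTranslation(.zero, in: target.superview)
  }

  @objc func pinchDidChange(_ gesture: UIPinchGestureRecognizer) {
    target.transform = target.transform.scaledBy(x: gesture.scale, y: gesture.scale)
    gesture.scale = 1
  }

  @objc func rotationDidChange(_ gesture: UIRotationGestureRecognizer) {
    target.transform = target.transform.rotated(by: gesture.rotation)
    gesture.rotation = 0
  }
}
```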

If you're using our new composable plan feature, you might capture all of these sub-plans into a single plan called "DirectlyManipulable" whose performer registers each of the above plans in turn:

transaction.add(plan: DirectlyManipulable(pan, pinch, rotate), to: draggableView)
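The composed performer could then be little more than a re-emitter of the three sub-plans. In this sketch, the `emit` hook standing in for the runtime's composition API is hypothetical, as are the plan types carried over from the earlier sketches.

```swift
import UIKit

// Sketch only: `emit` is a hypothetical stand-in for the runtime's
// plan-composition API, and the plan types come from earlier sketches.
class DirectlyManipulablePerformer: NSObject {
  let target: UIView

  // Assumed to be provided by the runtime so that re-emitted plans
  // are registered against the same target.
  var emit: ((Plan) -> Void)!

  init(target: UIView) {
    self.target = target
    super.init()
  }

  func addPlan(_ plan: Plan) {
    guard let dm = plan as? DirectlyManipulable else { return }
    // Re-emit each sub-plan; the runtime then routes them to the
    // GesturePerformer exactly as in the earlier example.
    emit(Draggable(withGestureRecognizer: dm.pan))
    emit(Pinchable(dm.pinch))
    emit(Rotatable(dm.rotation))
  }
}
```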

Let me know if any of the above isn't clear or if you have any questions!

This revision now requires changes to proceed. Aug 11 2016, 5:57 PM
rcameron added inline comments. Aug 11 2016, 6:53 PM

This is amazingly helpful. Thanks!

I initially started down a similar path, but changed direction after looking at the CoreAnimation project again.

rcameron updated this revision to Diff 6225. Aug 11 2016, 7:47 PM

Moved Plan conformance to Gesturable, away from UIGestureRecognizer

rcameron updated this revision to Diff 6226. Aug 11 2016, 8:00 PM

update MaterialMotionRuntime pod path

rcameron marked an inline comment as done. Aug 11 2016, 8:00 PM

BTW: I'm working on refactoring my work from this week into a more composable format just like this, together with featherless.

featherless requested changes to this revision. Aug 12 2016, 2:13 PM
featherless added inline comments.

Please add the material motion header.

This revision now requires changes to proceed. Aug 12 2016, 2:13 PM
rcameron updated this revision to Diff 6237. Aug 12 2016, 2:47 PM
  • add material motion header
rcameron marked an inline comment as done. Aug 12 2016, 2:53 PM
featherless accepted this revision. Aug 12 2016, 3:52 PM
This revision is now accepted and ready to land. Aug 12 2016, 3:52 PM
rcameron updated this revision to Diff 6248. Aug 12 2016, 7:12 PM
  • Merged project file
This revision was automatically updated to reflect the committed changes.