Microinteractions: Prototyping gestural interactions
How can gestural microinteractions enrich our digital experiences? In a week-long exercise, we created a batch of Framer.js sandbox prototypes to explore the tiniest details of video preview functionality.
A picture is worth a thousand words; a video is worth a thousand pictures. But sometimes we have no time to replay and watch an entire recording: when we are seeking a certain shot (e.g. in a video library), or when we just want the gist of the content (e.g. Facebook ad videos). Current platforms offer some great examples of video preview functionality; some even use the touchscreen's gestural interface to adjust the video playback rate.
Team members: Aylin Alpay
Tutors: Jorge Furuya & Paulo Coelho (Google Android)
When designing the micro-level details of an interaction, the underlying code reveals the different parameters that define a gestural input. A finger swipe can be characterized by its velocity, distance, direction, starting point, and endpoint. The output parameters are the video's playback rate and duration.
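These input parameters can all be derived from two touch samples. As a minimal sketch (the `{x, y, t}` event shape is a simplified assumption for illustration, not the API of any framework):

```javascript
// Derive swipe parameters from a start and end touch sample.
// Each sample is an assumed plain object: {x, y, t} with t in milliseconds.
function describeSwipe(start, end) {
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  const distance = Math.hypot(dx, dy);          // pixels travelled
  const duration = end.t - start.t;             // milliseconds elapsed
  const velocity = duration > 0 ? distance / duration : 0; // px/ms
  // Dominant axis decides the direction label.
  const direction = Math.abs(dx) >= Math.abs(dy)
    ? (dx >= 0 ? "right" : "left")
    : (dy >= 0 ? "down" : "up");
  return { distance, duration, velocity, direction, start, end };
}
```

Any one of these derived values can then be chosen as the driver for the output parameters.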
Our question was: which gestural input parameter best represents the video playback rate? In existing applications, such as the Android video player on Samsung phones (GIF above), the swipe position adjusts the playback rate. This is great for scrubbing and seeking, but not so much for previewing videos quickly.
We therefore implemented swipe velocity (purple prototype) and swipe distance (teal prototype) in the following mockups, to test which would be easier and more comfortable for quick previews on an overview page.
By implementing and testing both options, we learned that on an overview page with many videos (e.g. a Vimeo or Facebook feed), it is best to map the playback rate to the velocity of the swipe. Velocity makes the user's intent to view the video slowly or quickly easy to read, without demanding absolute accuracy.
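The velocity mapping can be sketched as a single clamped function. The gain and ceiling below are illustrative assumptions, not the values used in the prototypes:

```javascript
// Map swipe velocity (px/ms) to a video playback rate for quick previews.
// `gain` and `maxRate` are hypothetical tuning constants.
function rateFromVelocity(velocityPxPerMs, { gain = 4, maxRate = 16 } = {}) {
  // A faster flick previews proportionally faster, clamped to a ceiling
  // so an aggressive swipe never makes the preview unreadable.
  const rate = Math.abs(velocityPxPerMs) * gain;
  return Math.min(rate, maxRate);
}
```

Because only the magnitude of the gesture matters, the user never needs to land a finger on a precise spot, which suits a dense overview grid.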
Then, once on a detail page with a single video, a vertical swipe can be used for precise scrubbing, with the playback rate mapped to distance (brown prototype). A notable failed attempt was implementing the same vertical interaction on the overview page (blue prototype): it is an overly complicated interaction for such a cluttered space.
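Distance-based scrubbing is essentially a linear map from swipe offset to playback position. A minimal sketch, assuming a full-height swipe traverses the whole video (the function name and ranges are illustrative):

```javascript
// Map a vertical swipe offset to a playback position for precise scrubbing.
// A full-height swipe covers the whole video; the result is clamped so
// overshooting the track edge pins playback to the start or end.
function scrubPosition(offsetY, trackHeight, videoDuration) {
  const fraction = Math.min(Math.max(offsetY / trackHeight, 0), 1);
  return fraction * videoDuration; // seconds into the video
}
```

Unlike the velocity mapping, this gives the user an absolute, repeatable handle on position, which is what precise seeking needs.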
All sandboxes were created in Framer Studio and written in CoffeeScript. For simplicity, we used moving balls to represent a video being played. The code behind the two prototypes differed in only one key variable: whether the ball's motion (i.e. the playback rate) was driven by the swipe's velocity (draggable.velocity) or by the distance swiped (draggable.constraintsOffset.x).
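That one-variable difference can be sketched in plain JavaScript. The property names mirror the Framer ones mentioned above; the `ballSpeed` helper and its `mode` switch are illustrative, not the prototypes' actual code:

```javascript
// Sketch of the single difference between the purple and teal prototypes:
// the same update loop reads a different property off the draggable layer.
function ballSpeed(draggable, mode) {
  if (mode === "velocity") {
    // Purple prototype: motion tracks how fast the finger is moving.
    return Math.abs(draggable.velocity.x);
  }
  // Teal prototype: motion tracks how far the finger has been dragged.
  return Math.abs(draggable.constraintsOffset.x);
}
```

Keeping everything else identical is what made the comparison fair: both prototypes shared the same scene, and only the input-to-rate mapping changed.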
To best feel the difference between swipe velocity and swipe distance, open the prototypes on a smartphone (iOS or Android).