Systems and methods for displaying representative images

Information

  • Patent Grant
  • Patent Number
    10,810,781
  • Date Filed
    Friday, September 28, 2018
  • Date Issued
    Tuesday, October 20, 2020
Abstract
A system, method, and computer program product for displaying representative images within a collection viewer are disclosed. The method comprises receiving an indication of a new orientation for the collection viewer, displaying a sequence of animation frames that depict an in-place rotation animation for two or more displayed representative images, generating a rotation angle in a sequence of rotation angles, and displaying a rendered representative image for each of the two or more representative images, wherein each rendered representative image is rotated according to the rotation angle.
Description
FIELD OF THE INVENTION

Embodiments of the present invention relate generally to user interface design, and more specifically to systems and methods for displaying representative images.


BACKGROUND

A typical mobile computing device, or simply “mobile device,” such as a smartphone or tablet computer, includes a computation subsystem and a display screen configured to display a user interface (UI) comprising elements such as control widgets and representative images of files accessible through the UI. A representative image may comprise a thumbnail image or file icon associated with an application, script, or data file residing within a file system or file database. Representative images are typically presented to a user within a collection viewer that is configured to allow the user to browse, select, view, execute, and otherwise interact with corresponding objects. One example of a collection viewer is a file browser, which may be configured to show users a list of files within a file system depicted as icons. Another example of a collection viewer is an image browser, configured to show users a list of images within a file system or image database depicted as thumbnails.


A UI for a mobile device typically includes a collection viewer for files and may include a collection viewer for digital images. A collection viewer for digital images presents thumbnails associated with digital images residing within an image database or within a file system folder of digital images. The collection viewer enables the user to browse the thumbnails, and to open a digital image by performing a touch gesture on a corresponding thumbnail.


Mobile devices may include an arbitrarily large number of files and corresponding icons that need to be presented within a collection viewer for files, and similarly may include an arbitrarily large number of digital images and corresponding thumbnails that need to be presented within a collection viewer for digital images. Typical collection viewers enable users to view a collection of representative images as a two-dimensional grid. The representative images are conventionally positioned within the grid according to a specific sequence, such as a file sequence number, a sort sequence number, or an image sequence number. The grid is populated with representative images and frequently forms a tall, narrow layout regardless of device orientation. The width of the grid is tied to the horizontal screen width, which differs between landscape and portrait orientations. As a consequence, the physical location of a specific representative image may change when device orientation changes, because the collection viewer typically needs to alter the grid layout and the sequence positions of the representative images.


In conventional operation of a collection viewer, the user may locate a desired representative image by scrolling the grid vertically into an appropriate screen position. However, if the user then rotates their mobile device, the screen position of the desired representative image typically changes in response to the change in grid width, forcing the user to once again locate the desired representative image within the grid. This second search after rotation introduces inefficiency and confusion into the user experience.


As the foregoing illustrates, there is a need for addressing this and/or other related issues associated with the prior art.


SUMMARY

A system, method, and computer program product for displaying representative images within a collection viewer are disclosed. The method comprises receiving an indication of a new orientation for the collection viewer, displaying a sequence of animation frames that depict an in-place rotation animation for two or more displayed representative images, generating a rotation angle in a sequence of rotation angles, and displaying a rendered representative image for each of the two or more displayed representative images, wherein each rendered representative image is rotated according to the rotation angle.





BRIEF DESCRIPTION OF THE DRAWINGS

So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.



FIG. 1 illustrates a flow chart of a method for displaying a collection of representative images, according to one embodiment of the present invention;



FIG. 2A illustrates a front view of a mobile device comprising a display unit, according to one embodiment of the present invention;



FIG. 2B illustrates a block diagram of a mobile device comprising a display unit, according to one embodiment of the present invention;



FIG. 3A illustrates a collection viewer configured in a portrait orientation, according to one embodiment of the present invention;



FIG. 3B illustrates a collection viewer configured in a landscape orientation, according to one embodiment of the present invention;



FIG. 3C illustrates one representative image in two different orientations as viewed by a user with respect to a physical up direction, according to one embodiment of the present invention;



FIG. 3D illustrates one representative image in two different orientations with respect to a physical display origin, according to one embodiment of the present invention; and



FIG. 3E depicts an animation sequence of frames for one representative image transitioning between two different orientations, according to one embodiment of the present invention.





DETAILED DESCRIPTION

Embodiments of the present invention enable a mobile device to present representative images at consistent physical locations within a user interface (UI), regardless of device orientation. A collection viewer refers to a collection of software modules that generate a depiction of data objects within a UI. Displayed UI elements generated by the software modules may also be referred to generally as a collection viewer. A collection viewer is configured to present representative images of the data objects, such as file icons and image thumbnails, to a user. A collection viewer may comprise a file browser, an image browser, or any other type of data object browser configured to depict data objects as representative images. When the user rotates the mobile device, the collection viewer generates an in-place rotation animation for each representative image. Keeping each representative image in substantially the same physical screen location regardless of device orientation allows the user to visually track a given representative image through device rotation, thereby providing a more efficient and intuitive user experience. Animating rotation of the representative images to maintain proper viewing orientation provides an intuitive visual cue, further improving the user experience.



FIG. 1 illustrates a flow chart of a method 100 for displaying a collection of representative images, according to one embodiment of the present invention. Although method 100 is described in conjunction with the systems of FIGS. 2A-2B, persons of ordinary skill in the art will understand that any system that performs method 100 is within the scope and spirit of embodiments of the present invention. In one embodiment, a mobile device, such as mobile device 270 of FIGS. 2A-2B, is configured to perform method 100 by executing a collection viewer, implemented as a software module within mobile device 270.


Method 100 begins in step 110, where the collection viewer receives a notification indicating that the mobile device has been repositioned into a new orientation. In one embodiment, the notification indicates one of four orthogonal orientations, where a first orientation is associated with a generally upright portrait orientation, a second orientation is associated with a clockwise rotation from the first orientation to a landscape orientation, a third orientation is associated with an upside down portrait orientation, and a fourth orientation is associated with a counterclockwise rotation from the first orientation to a landscape orientation. Each of the four orientations represents an approximation of a physical orientation of the mobile device to the nearest ninety degrees. In other embodiments, the approximation of a physical orientation may comprise angular increments of less than ninety degrees to provide a finer approximation granularity. In certain embodiments, angular increments are substantially uniform, while in other embodiments, angular increments are non-uniform.


In one embodiment, physical orientation is generated from measurements performed by a sensor device, such as one of sensor devices 242. For example, an accelerometer included in sensor devices 242 may provide a physical measurement of a force vector corresponding to physical forces on the mobile device. When the mobile device is held generally still, such as when a user is holding the mobile device, this measured force vector is generally aligned with a gravity force vector. When the measured force vector is aligned vertically and pointing from the top of the mobile device to the bottom of the mobile device, the mobile device is likely being held in an upright portrait orientation. When the measured force vector is rotated by approximately ninety degrees about a normal vector to display unit 212, the device is likely being held in a landscape orientation, and so forth. While approximating orientation is described herein based on a measured force vector, other techniques for approximating orientation may be performed without departing from the scope and spirit of embodiments of the present invention.
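
As an illustration of this approximation, the following minimal sketch quantizes an accelerometer force vector to the nearest ninety degrees, assuming an iOS-style CoreMotion sensor interface. The ApproximateOrientation type and the quantization arithmetic are hypothetical, and the exact mapping of angles to named orientations depends on sensor conventions.

import CoreMotion
import Foundation

// Hypothetical four-orientation approximation, one per ninety degrees.
enum ApproximateOrientation {
    case portrait, landscapeLeft, portraitUpsideDown, landscapeRight
}

let motionManager = CMMotionManager()

func startOrientationUpdates(onChange: @escaping (ApproximateOrientation) -> Void) {
    motionManager.accelerometerUpdateInterval = 0.1
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // When the device is held generally still, (a.x, a.y) is dominated by
        // gravity; atan2 recovers the rotation about the display normal.
        let angle = atan2(a.x, -a.y)                       // ~0 when upright portrait
        let quadrant = Int((angle / (.pi / 2)).rounded())  // nearest multiple of 90°
        switch (quadrant + 4) % 4 {
        case 0: onChange(.portrait)
        case 1: onChange(.landscapeLeft)
        case 2: onChange(.portraitUpsideDown)
        default: onChange(.landscapeRight)
        }
    }
}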


Any technically feasible technique may be implemented for sending a notification to the collection viewer. For example, the notification may comprise a message in an object message passing system. In this example, an instance of the collection viewer is configured to receive a new orientation message when the mobile device changes to a new orientation. A software module, such as a system service module, is configured to approximate device orientation, for example, by monitoring an accelerometer within sensor devices 242. The new orientation message may specify a new orientation, or the new orientation message may specify that the device is in a new orientation and trigger the collection viewer to determine the new orientation. The new orientation message may indicate that the device has changed orientation beyond a specified angular threshold, enabling the collection viewer to determine an orientation for display, such as in embodiments that implement finer rotational granularity than ninety degree granularity. The system service module may include a kernel process configured to monitor hardware circuits comprising the sensor devices 242, an application programming interface (API) configured to respond to the kernel process, a process executing in application space that is configured to monitor sensor devices 242 and generate messages based on specified criteria, or any other technically feasible mechanism for providing orientation notifications to the collection viewer.
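
By way of example, the sketch below shows one concrete form such a notification may take, using UIKit's built-in device orientation notifications as the system service module. The UIKit calls themselves are standard; CollectionViewerController is a hypothetical stand-in for the collection viewer.

import UIKit

final class CollectionViewerController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Ask the system service to publish orientation notifications, then
        // subscribe the collection viewer to the message.
        UIDevice.current.beginGeneratingDeviceOrientationNotifications()
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(orientationDidChange),
            name: UIDevice.orientationDidChangeNotification,
            object: nil)
    }

    @objc private func orientationDidChange(_ note: Notification) {
        // The message only says the device is in a new orientation; the
        // viewer queries the current value and decides how to animate.
        let newOrientation = UIDevice.current.orientation
        print("new orientation: \(newOrientation.rawValue)")
    }
}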


In one embodiment, hysteresis is applied to the orientation approximation, so that a change in physical orientation needs to surpass a certain rotational threshold to trigger generation of the notification of a new orientation. In such an embodiment, the system service module applies hysteresis to physical orientation measurements so that a notification is generated only after a specified orientation threshold is exceeded. In other embodiments, the collection viewer is configured to apply hysteresis to notifications, such as notifications for changes of less than ninety degrees or less than an orientation change threshold.
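
A minimal sketch of such a hysteresis function follows. The OrientationHysteresis type, its sixty-degree threshold, and the snapping arithmetic are illustrative assumptions rather than a prescribed implementation.

// Report a new snapped orientation only after the measured angle moves well
// past the nearest forty-five degree crossover point.
struct OrientationHysteresis {
    var currentSnapDegrees: Double = 0
    let thresholdDegrees: Double = 60   // must rotate well past 45° to flip

    // Returns the new snapped angle, or nil when the change falls inside the
    // hysteresis band and no notification should be generated.
    mutating func update(measuredDegrees: Double) -> Double? {
        var delta = (measuredDegrees - currentSnapDegrees)
            .truncatingRemainder(dividingBy: 360)
        if delta > 180 { delta -= 360 }
        if delta < -180 { delta += 360 }
        guard abs(delta) >= thresholdDegrees else { return nil }
        let snapped = (measuredDegrees / 90).rounded() * 90
        currentSnapDegrees = snapped.truncatingRemainder(dividingBy: 360)
        return currentSnapDegrees
    }
}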


In step 120, the collection viewer, or a helper function to the collection viewer, computes a current animation state for a current animation sequence. A current animation sequence may include a scroll animation, a rotation animation, or a combination thereof triggered by a physical change in device orientation. A given current animation sequence is initiated in response to receiving the notification. In one embodiment, the current animation sequence defines a sequence of frames, as discussed in greater detail below in FIG. 3E. In one embodiment, a given animation sequence is completed before a subsequent animation sequence is initiated. The current animation state may define a current scroll position and a current rotation angle for a collection of representative images being animated in the current animation sequence. The current animation state may determine which representative images are visible within view panel 311 of FIG. 3A.


In step 130, the collection viewer, or a helper function to the collection viewer, identifies representative images to render in a current animation frame. Any technically feasible technique may be used to identify representative images to render. In one embodiment, only visible representative images are identified to render in the current frame. In one embodiment, a visible representative image includes geometry that intersects at least one pixel within view panel 311. In another embodiment, visible representative images and at least one non-visible representative image are identified to render in a current animation frame. In one embodiment, when a representative image is newly visible in a current animation frame, that representative image is instantiated for display and the instantiated representative image instance is sent a message to render an associated representative image according to the current animation state. Other representative image instances may be sent a substantially similar message to render an associated representative image according to the current animation state. In certain embodiments, a representative image instance that is no longer visible is de-allocated at the completion of the animation sequence.
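
For illustration, one way to identify the visible representative images is sketched below, assuming UIKit views. Here viewPanel and imageViews are hypothetical stand-ins for view panel 311 and the instantiated representative images.

import UIKit

// Keep only the images whose geometry intersects at least one pixel of the
// view panel, per the visibility test described above.
func visibleRepresentativeImages(in viewPanel: UIView,
                                 from imageViews: [UIImageView]) -> [UIImageView] {
    imageViews.filter { imageView in
        // Convert each image's bounds into the panel's coordinate space and
        // test for a non-empty intersection.
        let frameInPanel = imageView.convert(imageView.bounds, to: viewPanel)
        return viewPanel.bounds.intersects(frameInPanel)
    }
}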


In step 140, the collection viewer, or a helper function to the collection viewer, generates an animation frame comprising rendered representative images. In one embodiment, the animation frame includes one rendered representative image. In another embodiment, the animation frame includes two or more rendered representative images. Each rendered representative image includes a representative image that has been translated, rotated, or both translated and rotated in accordance with the current animation state. At least a portion of each rendered representative image may be presented within view panel 311. In one embodiment, the generated animation frame is displayed on display unit 212 of FIG. 2A. In one embodiment, each animation frame is rendered by a graphics processing unit (GPU) within processor complex 210 of FIG. 2B.


If, in step 150, the current animation frame is the last animation frame, then the method terminates. Otherwise, the method proceeds back to step 120.
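
The sketch below ties steps 120 through 150 together as a per-frame loop, assuming a UIKit display-link driver. The RotationAnimator type, the fixed duration, and the linear angle interpolation are illustrative assumptions, not the prescribed behavior of method 100.

import UIKit

final class RotationAnimator {
    private var displayLink: CADisplayLink?
    private var startTime: CFTimeInterval = 0
    private let duration: CFTimeInterval = 0.3
    private let targetAngle: CGFloat
    private let imageViews: [UIImageView]

    init(imageViews: [UIImageView], targetAngle: CGFloat) {
        self.imageViews = imageViews
        self.targetAngle = targetAngle
    }

    // Triggered in response to the orientation notification (step 110).
    func start() {
        startTime = CACurrentMediaTime()
        displayLink = CADisplayLink(target: self, selector: #selector(step))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func step(_ link: CADisplayLink) {
        // Step 120: compute the current animation state (progress in 0...1).
        let progress = min((link.timestamp - startTime) / duration, 1)
        // Steps 130-140: rotate each visible representative image in place.
        let angle = targetAngle * CGFloat(progress)
        for view in imageViews {
            view.transform = CGAffineTransform(rotationAngle: angle)
        }
        // Step 150: terminate after the last frame; otherwise run again.
        if progress >= 1 { link.invalidate(); displayLink = nil }
    }
}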


System Overview


FIG. 2A illustrates a front view of a mobile device 270 comprising a display unit 212, according to one embodiment of the present invention. Display unit 212 is configured to display user interface (UI) elements associated with software applications configured to execute on mobile device 270. The UI elements may include representative images, such as file icons and image thumbnails.



FIG. 2B illustrates a block diagram of mobile device 270, according to one embodiment of the present invention. Mobile device 270 includes a processor complex 210 coupled to display unit 212. Mobile device 270 may also include, without limitation, a digital camera 230, a strobe unit 236, a set of input/output devices 214, non-volatile memory 216, volatile memory 218, a wireless unit 240, and sensor devices 242, each coupled to processor complex 210. In one embodiment, a power management subsystem 220 is configured to generate appropriate power supply voltages for each electrical load element within mobile device 270, and a battery 222 is configured to supply electrical energy to power management subsystem 220. Battery 222 may implement any technically feasible battery, including primary or rechargeable battery technologies. Alternatively, battery 222 may be implemented as a fuel cell or a high-capacity electrical capacitor.


In one usage scenario, strobe illumination 237 comprises at least a portion of overall illumination in a scene being photographed by digital camera 230. Optical scene information 239, which may include strobe illumination 237 reflected from objects in the scene, is focused onto an image sensor 232 as an optical image. Image sensor 232, within digital camera 230, generates an electronic representation of the optical image. The electronic representation comprises spatial color intensity information, which may include different color intensity samples for red, green, and blue light.


Display unit 212 is configured to display a two-dimensional array of pixels to form a digital image for display. Display unit 212 may comprise a liquid-crystal display, an organic LED display, or any other technically feasible type of display. Input/output devices 214 may include, without limitation, a capacitive touch input surface, a resistive tablet input surface, buttons, knobs, or any other technically feasible device for receiving user input and converting the input to electrical signals. In one embodiment, display unit 212 and a capacitive touch input surface comprise a touch entry display system, and input/output devices 214 comprise a speaker and microphone.


Non-volatile (NV) memory 216 is configured to store data when power is interrupted. The NV memory 216 therefore implements a non-transitory computer-readable medium. In one embodiment, NV memory 216 comprises one or more flash memory devices. NV memory 216 may be configured to include programming instructions for execution by one or more processing units within processor complex 210. The programming instructions may include, without limitation, an operating system (OS), user interface (UI) modules, image processing and storage modules, and modules implementing one or more embodiments of techniques taught herein. In particular, the NV memory 216 may be configured to store instructions that implement method 100 of FIG. 1. The instructions, when executed by processing units within processor complex 210, cause the processing units to perform method 100. One or more memory devices comprising NV memory 216 may be packaged as a module that can be installed or removed by a user. In one embodiment, volatile memory 218 comprises dynamic random access memory (DRAM) configured to temporarily store programming instructions, image data, and the like needed during the course of normal operation of mobile device 270. Sensor devices 242 include sensors configured to detect at least device orientation of the mobile device 270. For example, sensor devices 242 may include an accelerometer to detect motion and orientation, an electronic gyroscope to detect motion and orientation, or a combination thereof. Sensor devices 242 may also include, without limitation, a magnetic flux detector to detect orientation, a global positioning system (GPS) module to detect geographic position, or any combination thereof.


Wireless unit 240 may include one or more digital radios configured to send and receive digital data. In particular, wireless unit 240 may implement wireless standards known in the art as “WiFi” based on Institute of Electrical and Electronics Engineers (IEEE) standard 802.11, and may implement digital cellular telephony standards for data communication such as the well-known “3G” and long term evolution (“LTE”), or “4G” suites of standards. In one embodiment, mobile device 270 is configured to transmit one or more digital photographs residing within either NV memory 216 or volatile memory 218 to an online photographic media service via wireless unit 240. In such an embodiment, a user may possess credentials to access the online photographic media service and to transmit the one or more digital photographs for storage and presentation by the online photographic media service. The credentials may be stored or generated within mobile device 270 prior to transmission of the digital photographs. The online photographic media service may comprise a social networking service, photograph sharing service, or any other web-based service that provides storage and download of digital photographs. In certain embodiments, mobile device 270 is configured to receive one or more incoming digital photographs via wireless unit 240, and store the incoming digital photographs in the NV memory 216, or the volatile memory 218, or a combination thereof.


Collection Viewer


FIG. 3A illustrates a collection viewer 310 configured in a portrait orientation, according to one embodiment of the present invention. As shown, a physical display origin 312 is disposed in an upper left corner, and a scroll axis 314 is aligned vertically. That is, the scroll axis 314 is generally aligned with respect to a physical “up” direction 316.


In one embodiment, collection viewer 310 allows a user to scroll a collection of representative images 320 along scroll axis 314 in response to an input scroll command. The collection of representative images 320 may be organized in a grid, with a portion of the representative images 320 visible within a view panel 311. A swipe gesture performed on a capacitive input device within display unit 212 may serve as the input scroll command. In one embodiment, view panel 311 is configured to have a rectangular form, including a larger dimension and a smaller dimension. In such an embodiment, the term “portrait orientation” refers to an orientation for view panel 311 with the larger dimension generally oriented along the up direction 316. The term “landscape orientation” refers to an orientation for view panel 311 with the smaller dimension generally oriented along the up direction 316. In other embodiments, view panel 311 may be square. In such embodiments, “portrait orientation” and “landscape orientation” comprise arbitrary but orthogonal orientations of view panel 311.


While collection viewer 310 is illustrated here as a UI element having a view panel 311, the term “collection viewer” is defined broadly herein to include a software module configured to generate the UI element and display representative images 320 within view panel 311.


When a user rotates mobile device 270 into a new position, the collection viewer may reconfigure presentation of representative images 320 by causing the representative images 320 to rotate to an angle consistent with the new position. For example, the user may initially hold mobile device 270 in a portrait orientation. The user may then rotate the device into a landscape orientation. In this example, mobile device 270 may detect a sufficient change in orientation and cause collection viewer 310 to transition from a portrait orientation to a landscape orientation, illustrated below in FIG. 3B. In other embodiments, different orientations may be implemented according to arbitrary angles having finer granularity than orthogonal angles. Detecting a sufficient change may include applying a hysteresis function to device orientation.



FIG. 3B illustrates collection viewer 310 configured in a landscape orientation, according to one embodiment of the present invention. As shown, the physical display origin 312 is disposed in a lower left corner, and a scroll axis 314 is aligned horizontally.


In a typical usage scenario, a user holds their head in an upright position, and therefore prefers to view representative images 320 rendered according to the physical up direction 316. As shown, representative images 320 are rotated to be viewed in an orientation consistent with the up direction 316. In one embodiment, representative images 320 are rotated in place. In one embodiment, rotation in place comprises performing an animation, such as a rotation animation, fade animation, or other transition animation, for each representative image 320. In one embodiment, animation for all representative images 320 is performed substantially synchronously, so that all displayed representative images 320 appear to move together. By rotating representative images 320 in place, collection viewer 310 is able to present a physical metaphor of the representative images 320 that is consistent with a physical device rotation. By contrast, prior art systems typically rearrange thumbnails, leading to user confusion and breaking any perceived consistency with physical device rotation.
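
As an illustration of substantially synchronous rotation in place, the sketch below rotates every visible thumbnail inside a single animation block, assuming UIKit; the function and argument names are hypothetical.

import UIKit

func rotateThumbnailsInPlace(_ visibleThumbnails: [UIImageView],
                             by angle: CGFloat) {
    // One shared animation block keeps all thumbnails moving together.
    UIView.animate(withDuration: 0.3) {
        for thumbnail in visibleThumbnails {
            // Rotating the transform leaves each view's center, and hence its
            // physical screen location, unchanged: rotation in place.
            thumbnail.transform = thumbnail.transform.rotated(by: angle)
        }
    }
}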



FIG. 3C illustrates representative image 320(0,0) of FIG. 3A in two different orientations as viewed by a user, according to one embodiment of the present invention. A portrait to landscape transform 330 is implemented to animate a clockwise rotation of representative image 320(0,0) from a portrait orientation to a landscape orientation. Additional transforms may be similarly implemented to animate transitions between each different discrete rotation position. Here, representative image 320(0,0) is rotated ninety degrees in a clockwise direction to compensate for a ninety degree counter-clockwise rotation of physical display origin 312. As shown, representative image 320(0,0) is rotated to be viewable in a generally upright orientation regardless of orientation of physical display origin 312. In other embodiments, finer-grain discrete rotation positions may be similarly implemented.
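
For example, portrait to landscape transform 330 may be expressed as a single ninety-degree rotation transform, as in the sketch below; in UIKit's flipped coordinate system a positive rotation angle appears clockwise on screen. The thumbnail variable is hypothetical.

import UIKit

// Ninety degrees clockwise, compensating for the counter-clockwise rotation
// of physical display origin 312.
let portraitToLandscape330 = CGAffineTransform(rotationAngle: .pi / 2)

// thumbnail.transform = portraitToLandscape330            // portrait-to-landscape
// thumbnail.transform = portraitToLandscape330.inverted() // landscape-to-portrait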



FIG. 3D illustrates representative image 320(0,0) in two different orientations with respect to the physical display origin 312 of FIG. 3A, according to one embodiment of the present invention. As described above, portrait to landscape transform 330 implements a rotation of representative image 320(0,0). As shown, representative image 320(0,0) is rotated relative to physical display origin 312.



FIG. 3E depicts an animation sequence 340 of frames 360 for one representative image transitioning between two different orientations, according to one embodiment of the present invention. As shown, a representative image 342 is rotated in sequential frames 360(N) to 360(N+4) to generate a rotation animation of representative image 342. The rotation animation depicts a rotational movement of the representative image 342 from an initial position at time T0, to a new position at time T4. In this example, a user rotates mobile device 270 counter-clockwise from a portrait orientation to a landscape orientation between time T0 and time T1, thereby triggering a clockwise animation of representative image 342. In the process, a new physical up direction is established. A new up direction 352 consequently replaces an old up direction 350. The animation sequence depicts rotational movement of representative image 342 to generally negate the physical rotation of mobile device 270. In one embodiment, the animation sequence is timed independently of physical rotation once a rotation event is detected. In other embodiments, the animation sequence is timed to substantially track physical rotation once a rotation event is detected.


Representative image 342 may be rendered in each rotational position associated with each incremental frame 360. Although three intermediate frames 360(N+1), 360(N+2), and 360(N+3) are shown, animation sequence 340 may implement an arbitrary number of intermediate frames. In one embodiment, animation sequence 340 is initiated and completed during a time span of less than one second, but more than ten milliseconds. In certain implementations, duration of animation sequence 340 may be measured as an integral multiple of a number of frame times needed to display intermediate frames as refresh frames on display unit 212. In one embodiment, each representative image being displayed within view panel 311 is animated substantially synchronously, so that each animation step for each representative image is completed together. For example, animation frame 360(N+1) is rendered and displayed at or before time T2 for each representative image 320 of FIG. 3A. In other embodiments, looser synchronization may be implemented, so that each representative image 320 completes a respective animation sequence within a specified maximum number of frame times, such as less than five frame times, or less than sixty frame times. In certain embodiments, an animation sequence models certain physical behaviors or properties, such as momentum, oscillation, friction, and the like. For example, an animation sequence may depict the representative images overshooting their rotation and springing back into proper position. An arbitrary rotation function may be applied with respect to time to provide such effects.
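
As an illustration of the overshoot-and-spring-back effect, the sketch below uses a spring animation whose damping ratio below 1.0 lets the rotation overshoot its target and settle back; the duration, damping, and velocity values are arbitrary assumptions.

import UIKit

func rotateWithSpring(_ thumbnails: [UIImageView], to angle: CGFloat) {
    UIView.animate(withDuration: 0.5,
                   delay: 0,
                   usingSpringWithDamping: 0.6,   // < 1.0 permits overshoot
                   initialSpringVelocity: 0.5,
                   options: [],
                   animations: {
        // All thumbnails spring past the target angle and oscillate back.
        for thumbnail in thumbnails {
            thumbnail.transform = CGAffineTransform(rotationAngle: angle)
        }
    })
}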


In alternative embodiments, transition effects other than a rotation animation may be implemented. For example, one alternative transition effect to a rotation animation is an alpha fade animation between representative image 342 depicted in frame 360(N) and representative image 342 depicted in frame 360(N+4). Another alternative transition effect animates representative image 342 depicted in frame 360(N) collapsing to a dot and re-emerging as representative image 342 depicted in frame 360(N+4). These and other in-place transition effects may be implemented without departing from the scope and spirit of the present invention.
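
For example, the alpha fade alternative may be sketched as a cross-dissolve transition, assuming UIKit; the function name and duration are hypothetical.

import UIKit

func crossFadeToNewOrientation(_ thumbnail: UIImageView, angle: CGFloat) {
    // A cross-dissolve fades the thumbnail from its old orientation into the
    // new one instead of depicting intermediate rotation angles.
    UIView.transition(with: thumbnail,
                      duration: 0.3,
                      options: .transitionCrossDissolve,
                      animations: {
        thumbnail.transform = CGAffineTransform(rotationAngle: angle)
    })
}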


In one embodiment, frames 360 are rendered by a graphics processing unit (GPU) within processor complex 210 of FIG. 2B.


While the techniques disclosed herein are described in conjunction with a mobile device, persons skilled in the art will recognize that any compute platform may be configured to perform these techniques.


While the foregoing is directed to embodiments of the present invention, other and further embodiments of the invention may be devised without departing from the basic scope thereof, and the scope thereof is determined by the claims that follow.

Claims
  • 1. A method, comprising: displaying, on a display unit, two or more representative images within a collection viewer, wherein each one of the two or more representative images is displayed at a respective location relative to a physical origin for the display unit; determining a physical rotation of the display unit; when the physical rotation surpasses a predetermined angular threshold, receiving an indication of a new orientation for the collection viewer that corresponds to a finer rotation granularity than orthogonal angles of the respective location relative to the physical origin, and when the physical rotation does not surpass the predetermined angular threshold, applying a hysteresis to the indication of the new orientation; and in response to the indication of the new orientation, displaying a sequence of animation frames that depict an in-place rotation animation for each of the two or more representative images, wherein each in-place rotation animation is displayed at the respective location relative to the physical origin for the display unit and each in-place rotation animation is timed independently of the physical rotation of the display unit, and wherein each animation frame in the sequence of animation frames depicts a different rotation angle in a sequence of rotation angles for the two or more representative images by: generating the different rotation angle for each animation frame in the sequence of animation frames; and displaying a rendered representative image for each of the two or more representative images at the respective location relative to the physical origin for the display unit, wherein each rendered representative image is rotated according to the different rotation angle, wherein a first timing is associated with the in-place rotation animation and a second timing is associated with a physical rotation timing, and the first timing commences after the second timing commences.
  • 2. The method of claim 1, wherein the indication comprises a message in an object-based message passing system.
  • 3. The method of claim 1, wherein the predetermined angular threshold is based on a physical force associated with a mobile computing device.
  • 4. The method of claim 1, wherein displaying the two or more representative images comprises identifying representative images that are visible within the collection viewer.
  • 5. The method of claim 1, wherein the respective location for each of the two or more representative images is determined in one or more animation frames of the sequence of animation frames according to a current scroll position.
  • 6. The method of claim 1, wherein displaying the sequence of rotation angles is performed by a graphics processing unit within a mobile computing device.
  • 7. The method of claim 1, wherein the first timing associated with each in-place rotation animation is based, at least in part, on tracking a physical rotation associated with the new orientation for the collection viewer.
  • 8. The method of claim 1, wherein each in-place rotation animation starts with a first frame of the sequence of animation frames and ends with a last frame of the sequence of animation frames, wherein the last frame corresponds to the new orientation for the collection viewer.
  • 9. The method of claim 1, wherein the in-place rotation animation for each of the two or more representative images completes the respective rotation angle in the sequence of rotation angles within a specified maximum number of frame times.
  • 10. The method of claim 1, wherein the in-place rotation animation for each of the two or more representative images occurs synchronously with respect to the in-place rotation animations of other representative images of the two or more representative images.
  • 11. The method of claim 1, wherein a placement of the rendered representative image for each of the two or more representative images is consistent with a placement of the two or more representative images.
  • 12. The method of claim 1, wherein the new orientation is based on rotating the display unit from at least one of a landscape orientation to a portrait orientation, the portrait orientation to the landscape orientation, or a combination of the landscape orientation to the portrait orientation and the portrait orientation to the landscape orientation.
  • 13. The method of claim 1, wherein a third timing of the sequence of animation frames is based on a timing of a physical rotation.
  • 14. A computer program product embodied in a non-transitory computer-readable medium that, when executed by a processor, causes the processor to perform a method comprising: displaying, on a display unit, two or more representative images within a collection viewer, wherein each one of the two or more representative images is displayed at a respective location relative to a physical origin for the display unit; determining a physical rotation of the display unit; when the physical rotation surpasses a predetermined angular threshold, receiving an indication of a new orientation for the collection viewer that corresponds to a finer rotation granularity than orthogonal angles of the respective location relative to the physical origin, and when the physical rotation does not surpass the predetermined angular threshold, applying a hysteresis to the indication of the new orientation; and in response to the indication of the new orientation, displaying a sequence of animation frames that depict an in-place rotation animation for each of the two or more representative images, wherein each in-place rotation animation is displayed at the respective location relative to the physical origin for the display unit and each in-place rotation animation is timed independently of the physical rotation of the display unit, and wherein each animation frame in the sequence of animation frames depicts a different rotation angle in a sequence of rotation angles for the two or more representative images by: generating the different rotation angle for each animation frame in the sequence of animation frames; and displaying a rendered representative image for each of the two or more representative images at the respective location relative to the physical origin for the display unit, wherein each rendered representative image is rotated according to the different rotation angle; wherein a first timing is associated with the in-place rotation animation and a second timing is associated with a physical rotation timing, and the first timing commences after the second timing commences.
  • 15. The computer program product of claim 14, wherein the indication comprises a message in an object-based message passing system.
  • 16. The computer program product of claim 14, wherein the indication is generated in response to a measurement of a physical force associated with a mobile computing device.
  • 17. The computer program product of claim 14, wherein displaying the two or more representative images comprises identifying representative images that are visible within the collection viewer.
  • 18. The computer program product of claim 14, wherein the respective location for each of the two or more representative images is determined in one or more animation frames of the sequence of animation frames according to a current scroll position.
  • 19. The computer program product of claim 14, wherein displaying the sequence of rotation angles is performed by a graphics processing unit within a mobile computing device.
  • 20. A mobile computing device comprising: a display unit configured to display a collection viewer; and a processing unit in communication with the display unit and configured to: display two or more representative images within the collection viewer, wherein each one of the two or more representative images is displayed at a respective location relative to a physical origin for the display unit; determine a physical rotation of the display unit; when the physical rotation surpasses a predetermined angular threshold, receive an indication of a new orientation for the collection viewer that corresponds to a finer rotation granularity than orthogonal angles of the respective location relative to the physical origin, and when the physical rotation does not surpass the predetermined angular threshold, apply a hysteresis to the indication of the new orientation; and in response to the indication of the new orientation, display a sequence of animation frames that depict an in-place rotation animation for each of the two or more representative images, wherein each in-place rotation animation is displayed at the respective location relative to the physical origin for the display unit and each in-place rotation animation is timed independently of the physical rotation of the display unit, and wherein each animation frame in the sequence of animation frames depicts a different rotation angle in a sequence of rotation angles for the two or more representative images by: generating the different rotation angle for each animation frame in the sequence of animation frames; and displaying a rendered representative image for each of the two or more representative images at the respective location relative to the physical origin for the display unit, wherein each rendered representative image is rotated according to the different rotation angle; wherein a first timing is associated with the in-place rotation animation and a second timing is associated with a physical rotation timing, and the first timing commences after the second timing commences.
CROSS-REFERENCES TO RELATED APPLICATIONS

The present application is a continuation of, and claims priority to, U.S. patent application Ser. No. 15/622,520, filed Jun. 14, 2017, entitled “SYSTEMS AND METHODS FOR DISPLAYING REPRESENTATIVE IMAGES,” which is a continuation of, and claims priority to, U.S. patent application Ser. No. 14/340,557, filed Jul. 24, 2014, entitled “SYSTEMS AND METHODS FOR DISPLAYING REPRESENTATIVE IMAGES,” now U.S. Pat. No. 9,741,150, which in turn claims priority to U.S. Provisional Application No. 61/958,324, entitled “Systems and methods for digital photography,” filed Jul. 25, 2013. The foregoing applications and/or patents are herein incorporated by reference in their entirety for all purposes.

US Referenced Citations (278)
Number Name Date Kind
5734760 Yoshida Mar 1998 A
5835639 Honsinger et al. Nov 1998 A
5900909 Parulski et al. May 1999 A
5986668 Szeliski et al. Nov 1999 A
5987164 Szeliski et al. Nov 1999 A
6055326 Chang Apr 2000 A
6061696 Lee et al. May 2000 A
6115025 Buxton et al. Sep 2000 A
6137468 Martinez Oct 2000 A
6326978 Robbins Dec 2001 B1
6704007 Clapper Mar 2004 B1
6842265 Votipka et al. Jan 2005 B1
7027054 Cheiky et al. Apr 2006 B1
7030868 Clapper Apr 2006 B2
7030912 Honma Apr 2006 B1
7085590 Kennedy et al. Aug 2006 B2
7352361 Yi Apr 2008 B2
7626598 Manchester Dec 2009 B2
7646417 Goto et al. Jan 2010 B2
7730422 Russo Jun 2010 B2
7903115 Platzer et al. Mar 2011 B2
7978182 Ording Jul 2011 B2
8068121 Williamson et al. Nov 2011 B2
8120625 Hinckley Feb 2012 B2
8125499 Yamada Feb 2012 B2
8217964 Laine Jul 2012 B2
8233003 Obinata Jul 2012 B2
8314817 Williamson et al. Nov 2012 B2
8363145 Iwamoto Jan 2013 B2
8412277 Fujiwara Apr 2013 B2
8451296 Ono May 2013 B2
8531465 Platzer et al. Sep 2013 B2
8543946 Kethireddy Sep 2013 B2
8581935 Handa Nov 2013 B2
8610724 Garg Dec 2013 B2
8692851 Ording et al. Apr 2014 B2
8717293 Wong et al. May 2014 B2
8817048 Kerr et al. Aug 2014 B2
8830177 Woo Sep 2014 B2
8830261 Asai Sep 2014 B2
8854325 Byrd Oct 2014 B2
8872855 Doll Oct 2014 B2
8890897 Homma Nov 2014 B2
8896632 MacDougall et al. Nov 2014 B2
8915437 Hoshino et al. Dec 2014 B2
8933960 Lindahl et al. Jan 2015 B2
8937735 Mori Jan 2015 B2
8947382 Winkler et al. Feb 2015 B2
8988349 Alberth et al. Mar 2015 B2
9015640 de Leon Apr 2015 B2
9070229 Williamson et al. Jun 2015 B2
9098069 Dickinson et al. Aug 2015 B2
9129550 Doll Sep 2015 B2
9144714 Hollinger Sep 2015 B2
9158492 Miyata Oct 2015 B2
9165533 Paulson Oct 2015 B2
9177362 Restrepo Nov 2015 B2
9189069 Hinckley Nov 2015 B2
9196076 MacLeod Nov 2015 B1
9215405 Atkinson Dec 2015 B2
9232124 Song Jan 2016 B2
9256974 Hines Feb 2016 B1
9261909 Lam Feb 2016 B2
9298745 Lee et al. Mar 2016 B2
9342138 Ding May 2016 B2
9383202 Zhou et al. Jul 2016 B2
9417836 Postal Aug 2016 B2
9424798 Park Aug 2016 B2
9459781 Wilson et al. Oct 2016 B2
9478012 Uratani Oct 2016 B2
9489927 Aizawa Nov 2016 B2
9495025 Ishikawa Nov 2016 B2
9507379 Kamei Nov 2016 B2
9507445 Sutton et al. Nov 2016 B2
9552076 Homma Jan 2017 B2
9560269 Baldwin Jan 2017 B2
9591225 Jung et al. Mar 2017 B2
9628647 Tomono et al. Apr 2017 B2
9646576 Masuko May 2017 B2
9684434 Lewin et al. Jun 2017 B2
9721375 Rivard Aug 2017 B1
9741150 Feder Aug 2017 B2
9761033 Flider Sep 2017 B2
9779481 Yuasa Oct 2017 B2
9798395 Ye Oct 2017 B2
9858648 Li Jan 2018 B2
9886192 Masuko Feb 2018 B2
9942464 Voss Apr 2018 B2
9953454 Rivard Apr 2018 B1
10088866 Braun Oct 2018 B2
10102829 Paulson Oct 2018 B2
10109098 Feder Oct 2018 B2
10186019 Homma Jan 2019 B2
10365820 Lee Jul 2019 B2
10366526 Rivard Jul 2019 B2
10410605 Gardenfors Sep 2019 B2
10552016 Cherna Feb 2020 B2
10552946 Furukawa Feb 2020 B2
10627854 Gurr Apr 2020 B2
20020063714 Haas et al. May 2002 A1
20040150622 Bohn Aug 2004 A1
20040184115 Suzuki Sep 2004 A1
20050022131 Saint-Hilaire Jan 2005 A1
20050104848 Yamaguchi May 2005 A1
20050143124 Kennedy Jun 2005 A1
20050154798 Nurmi Jul 2005 A1
20050237587 Nakamura Oct 2005 A1
20050237588 Gohara Oct 2005 A1
20060026535 Hotelling et al. Feb 2006 A1
20060029292 Hagiwara Feb 2006 A1
20060033760 Koh Feb 2006 A1
20060039630 Kotani Feb 2006 A1
20060133695 Obinata Jun 2006 A1
20070004451 Anderson Jan 2007 A1
20070045433 Chapman et al. Mar 2007 A1
20070136208 Hamashima et al. Jun 2007 A1
20070236515 Montague Oct 2007 A1
20070236709 Mitani Oct 2007 A1
20070296820 Lonn Dec 2007 A1
20080001945 Kashito et al. Jan 2008 A1
20080001950 Lin Jan 2008 A1
20080043032 Mamona et al. Feb 2008 A1
20080076481 Iwasaki et al. Mar 2008 A1
20080122796 Jobs et al. May 2008 A1
20080165144 Forstall et al. Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080266326 Porwal Oct 2008 A1
20080307363 Jalon et al. Dec 2008 A1
20090002335 Chaudhri Jan 2009 A1
20090002391 Williamson Jan 2009 A1
20090002395 Yamada Jan 2009 A1
20090058882 Adachi et al. Mar 2009 A1
20090237420 Lawrenz Sep 2009 A1
20090262074 Nasiri et al. Oct 2009 A1
20100007603 Kirkup Jan 2010 A1
20100066763 MacDougall Mar 2010 A1
20100079494 Sung Apr 2010 A1
20100118115 Takahashi May 2010 A1
20100123737 Williamson May 2010 A1
20100123929 Yoshimoto May 2010 A1
20100146446 Ahn Jun 2010 A1
20100149377 Shintani et al. Jun 2010 A1
20100218113 White Aug 2010 A1
20100302278 Shaffer et al. Dec 2010 A1
20100302408 Ito Dec 2010 A1
20100315656 Agata Dec 2010 A1
20100317332 Bathiche et al. Dec 2010 A1
20100333044 Kethireddy Dec 2010 A1
20110012914 Nakamura Jan 2011 A1
20110037712 Kim Feb 2011 A1
20110037777 Lindahl et al. Feb 2011 A1
20110057880 Kasahara Mar 2011 A1
20110074973 Hayashi Mar 2011 A1
20110090256 Manchester Apr 2011 A1
20110158473 Sun et al. Jun 2011 A1
20110167382 van Os Jul 2011 A1
20110187749 Dehmann Aug 2011 A1
20110193982 Kook et al. Aug 2011 A1
20110261075 Tanaka Oct 2011 A1
20110298982 Kobayashi Dec 2011 A1
20110310094 Park Dec 2011 A1
20120001943 Ishidera Jan 2012 A1
20120033262 Sakurai Feb 2012 A1
20120044266 Mori Feb 2012 A1
20120056889 Carter et al. Mar 2012 A1
20120057064 Gardiner Mar 2012 A1
20120081382 Lindahl et al. Apr 2012 A1
20120139904 Lee et al. Jun 2012 A1
20120154276 Shin et al. Jun 2012 A1
20120162251 Minamino Jun 2012 A1
20120162263 Griffin Jun 2012 A1
20120176413 Kulik Jul 2012 A1
20120206488 Wong Aug 2012 A1
20120229370 Stroffolino et al. Sep 2012 A1
20120242683 Asai Sep 2012 A1
20120250082 Mori Oct 2012 A1
20120256959 Ye Oct 2012 A1
20120294533 Ikenoue Nov 2012 A1
20120299964 Homma Nov 2012 A1
20120324400 Caliendo, Jr. Dec 2012 A1
20130016122 Bhatt et al. Jan 2013 A1
20130038634 Yamada Feb 2013 A1
20130069988 Kamei Mar 2013 A1
20130069989 Nagata et al. Mar 2013 A1
20130120256 Ishidera May 2013 A1
20130141464 Hunt et al. Jun 2013 A1
20130162542 Badali Jun 2013 A1
20130176222 Tanaka Jul 2013 A1
20130205244 Decker et al. Aug 2013 A1
20130222231 Gardenfors et al. Aug 2013 A1
20130222516 Do et al. Aug 2013 A1
20130222646 Tsubota et al. Aug 2013 A1
20130235071 Ubillos et al. Sep 2013 A1
20130262486 O'Dell et al. Oct 2013 A1
20130293502 Kitatani Nov 2013 A1
20130328935 Tu Dec 2013 A1
20130335317 Liu et al. Dec 2013 A1
20140055494 Mikawa Feb 2014 A1
20140063611 Raymond et al. Mar 2014 A1
20140075286 Harada Mar 2014 A1
20140075372 Wu et al. Mar 2014 A1
20140078171 Miyatake et al. Mar 2014 A1
20140085339 Brady Mar 2014 A1
20140085430 Komori Mar 2014 A1
20140096064 Suzuki Apr 2014 A1
20140111548 Shin Apr 2014 A1
20140118256 Sonoda et al. May 2014 A1
20140168271 Yu Jun 2014 A1
20140177008 Raymond et al. Jun 2014 A1
20140210754 Ryu Jul 2014 A1
20140215365 Hiraga Jul 2014 A1
20140240453 Kim et al. Aug 2014 A1
20140240543 Kim Aug 2014 A1
20140258674 Kim Sep 2014 A1
20140307001 Aizawa Oct 2014 A1
20140340428 Shibayama Nov 2014 A1
20140359517 Elings et al. Dec 2014 A1
20140362117 Paulson Dec 2014 A1
20140365977 Elyada Dec 2014 A1
20140372914 Byrd Dec 2014 A1
20150029226 Feder Jan 2015 A1
20150035991 Sachs Feb 2015 A1
20150042669 Van Nostrand Feb 2015 A1
20150049119 Homma Feb 2015 A1
20150070458 Kim Mar 2015 A1
20150091945 Uratani Apr 2015 A1
20150095775 Lewis Apr 2015 A1
20150113368 Flider Apr 2015 A1
20150113370 Flider Apr 2015 A1
20150113371 Flider Apr 2015 A1
20150169166 Kim Jun 2015 A1
20150193912 Yuasa Jul 2015 A1
20150213784 Jafarzadeh Jul 2015 A1
20150215526 Jafarzadeh et al. Jul 2015 A1
20150215532 Jafarzadeh et al. Jul 2015 A1
20150278853 McLaughlin et al. Oct 2015 A1
20150278999 Summers Oct 2015 A1
20150287189 Hirai Oct 2015 A1
20150302587 Hirano et al. Oct 2015 A1
20150339002 Arnold et al. Nov 2015 A1
20150339006 Chaland et al. Nov 2015 A1
20150341536 Huang et al. Nov 2015 A1
20160026658 Krishnaraj et al. Jan 2016 A1
20160027150 Lee Jan 2016 A1
20160034166 Wilson et al. Feb 2016 A1
20160034167 Wilson et al. Feb 2016 A1
20160062645 Masuko Mar 2016 A1
20160148551 Jian May 2016 A1
20160148648 Dimson et al. May 2016 A1
20160163289 Masuko Jun 2016 A1
20160170608 Zambetti et al. Jun 2016 A1
20160173782 Dimson et al. Jun 2016 A1
20160202866 Zambetti et al. Jul 2016 A1
20160202872 Jang et al. Jul 2016 A1
20160240168 Keal Aug 2016 A1
20160248968 Baldwin Aug 2016 A1
20160275650 Case Sep 2016 A1
20160313781 Jeon Oct 2016 A1
20160330383 Oyama Nov 2016 A1
20160344927 Brasket et al. Nov 2016 A1
20160357420 Wilson et al. Dec 2016 A1
20170048442 Cote et al. Feb 2017 A1
20170061669 Hirano Mar 2017 A1
20170236253 Restrepo et al. Aug 2017 A1
20170278292 Feder Sep 2017 A1
20170285743 Yu et al. Oct 2017 A1
20170323149 Harary et al. Nov 2017 A1
20180061126 Huang et al. Mar 2018 A1
20180067633 Wilson et al. Mar 2018 A1
20180088775 Ye et al. Mar 2018 A1
20180114351 Rivard Apr 2018 A1
20180114352 Rivard Apr 2018 A1
20180115702 Brauer Apr 2018 A1
20180130182 Bhatt et al. May 2018 A1
20180341383 Sully Nov 2018 A1
20190035135 Feder Jan 2019 A1
20190347843 Rivard Nov 2019 A1
20200106956 Kimball Apr 2020 A1
Non-Patent Literature Citations (17)
Entry
Rivard et al., U.S. Appl. No. 16/518,786, filed Jul. 22, 2019.
Non-Final Office Action from U.S. Appl. No. 14/340,557, dated Jan. 21, 2016.
Final Office Action from U.S. Appl. No. 14/340,557, dated Sep. 16, 2016.
Easy Flex, Two Examples of Layout Animations, Apr. 11, 2010, pp. 1-11, http://evtimmy.com/2010/04/two-examples-of-layout-animations/.
Tidwell, J., “Animated Transition,” from Designing Interfaces, O'Reilly Media, Inc., Dec. 2010, pp. 127-129.
Notice of Allowance from U.S. Appl. No. 15/331,733, dated Dec. 7, 2016.
Notice of Allowance from U.S. Appl. No. 14/340,557, dated Mar. 3, 2017.
Notice of Allowance from U.S. Appl. No. 15/331,733, dated Apr. 17, 2017.
Non-Final Office Action from U.S. Appl. No. 15/452,639, dated May 11, 2017.
International Search Report and Written Opinion from PCT Application No. PCT/US17/57704, dated Nov. 16, 2017.
Notice of Allowance from U.S. Appl. No. 15/452,639, dated Nov. 30, 2017.
Corrected Notice of Allowance from U.S. Appl. No. 14/340,557, dated Jul. 27, 2017.
Non-Final Office Action from U.S. Appl. No. 15/622,520, dated Jan. 10, 2018.
Notice of Allowance from U.S. Appl. No. 15/622,520, dated Jul. 18, 2018.
Non-Final Office Action from U.S. Appl. No. 15/642,074, dated Oct. 19, 2018.
Notice of Allowance from U.S. Appl. No. 15/642,074, dated Apr. 10, 2019.
Non-Final Office Action from U.S. Appl. No. 16/518,786, dated Jun. 29, 2020.
Related Publications (1)
Number Date Country
20190035135 A1 Jan 2019 US
Provisional Applications (1)
Number Date Country
61958324 Jul 2013 US
Continuations (2)
Number Date Country
Parent 15622520 Jun 2017 US
Child 16147206 US
Parent 14340557 Jul 2014 US
Child 15622520 US