Advances in technology have added an ever-increasing array of features and capabilities to telecommunication devices and other portable computing devices. For example, telecommunication devices may include features such as touch screens, video and still cameras, web browsing capabilities, telephony capabilities, email sending and receiving capabilities, music storing and playback capabilities, calendar and contact managing capabilities, GPS (global positioning system) location and navigation capabilities, game playing capabilities, and television capabilities, to name a few. Many of these features and capabilities are provided through specialized applications resident on the telecommunication devices. For example, many telecommunication devices allow the user to further customize the device through custom configuration options or by adding third-party software. Thus, a variety of applications, such as dedicated computer programs or software, applets, or the like, can be loaded on a telecommunication device by the consumer, the network service provider, or by the telecommunication device manufacturer. Consequently, a typical telecommunication device can maintain a large variety of applications, content items, and the like.
Further, user-friendly graphic user interfaces (GUIs) that are available on many telecommunication devices enable users to perform a wide variety of tasks, such as initiating or receiving phone calls, writing emails or text messages, browsing the Internet, managing device settings and contact lists, viewing media content, and using the large assortment of applications mentioned above. GUIs may also be specific to particular applications, such as applications developed by third-party developers. However, because the number of applications and other items present on a telecommunication device may be quite large, only a portion of the applications and other items available can typically be displayed on the GUI at any one time. For example, the GUI of a typical telecommunication device often requires horizontal or vertical scrolling through a number of pages or views to locate a desired application.
The detailed description is set forth with reference to the accompanying drawing figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items or features.
The technologies described herein are generally directed towards user interfaces for telecommunication devices, touch screen devices, tablet computing devices, and other portable computing devices. Some implementations provide a user interface having an interactive z-axis component. For example, some implementations provide a stack of items that are scrollable in a direction of a z-axis either toward or away from a plane of a display screen of the device. Further, implementations include a method of detecting interaction with a three-dimensional user interface having an interactive z-axis dimension based on a user's finger position relative to the device. In some implementations, layers of applications or other items are presented and are scrollable in the z-axis direction. For example, a user may avoid having to move the user interface desktop left/right/up/down to locate an application, and is instead able to scroll through multiple applications or other items in the z-axis direction. The movement through the scrollable items in the z-axis direction may be activated by various controls or inputs, such as by a physical or virtual slider, a touch-free finger position sensing component, and so forth.
According to some implementations, a user interface architecture includes a set of columns or stacks of items displayed and browsable forward or backward along the z-axis direction. Each stack may have a representation on the x-axis or y-axis, such as a name or data type of the stack. For example, the name in the x-axis could be “photos” and the items contained in the stack associated with that name could be representations of albums of photos, individual photos, and so forth. The user interface architecture may also be hierarchical. For example, an item in one stack can represent a folder that includes a number of subfolders. Selection of the item can result in the display of a new set of stacks of items in the user interface in which each of the subfolders is represented along the x-axis as a stack and the items in the subfolders are represented along the z-axis as the items in the stacks.
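The hierarchical stack architecture described above can be sketched as a simple data model. This is a minimal, illustrative Python sketch; the class and function names (`Stack`, `Item`, `expand`) are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    name: str
    # Sub-stacks, if this item represents a folder of subfolders.
    children: List["Stack"] = field(default_factory=list)

@dataclass
class Stack:
    name: str          # label shown along the x-axis, e.g. "photos"
    items: List[Item]  # items browsable along the z-axis

def expand(item: Item) -> List["Stack"]:
    # Selecting a folder item yields a new set of stacks,
    # one stack per subfolder, replacing the current view.
    return item.children

# Example: a "photos" stack whose first item is an album folder.
photos = Stack("photos", [
    Item("album1", [Stack("album1", [Item("img1"), Item("img2")])]),
])
sub_stacks = expand(photos.items[0])
```

Selecting `album1` thus produces a new set of stacks whose items are again browsable along the z-axis, and the process can repeat at each level of the hierarchy.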
The multiple stacks may be arranged as an upper level navigation interface in which each stack has a different centricity for enabling navigation among applications, media content, and other items and features on the device. For example, the upper level navigation interface may include an applications stack, a calendar stack, a people stack, a device management stack, and a media stack. Each upper level stack may have a different centricity from the other upper level stacks. Each upper level stack may be navigated along the z-axis direction to view items contained therein, and the upper level navigation interface may be navigated along the x-axis direction to view and access other stacks of the multiple stacks in the upper level navigation flow. Further, each stack in the upper level navigation flow may be expanded to provide one or more additional multiple stack interfaces corresponding to the centricity of the particular upper level stack that was expanded. Navigation properties between adjacent stacks in the lower level flows may vary depending on the centricity of the particular lower level flow. For example, in some implementations, navigation from a current stack to an adjacent stack may result in presentation of an item in the adjacent stack at an analogous level of depth in the stack, while in other implementations, navigation to an adjacent stack results in presentation of a first or front item in the adjacent stack.
In some implementations, z-axis browsing is responsive to a detected position of a user's finger in relation to the device rendering the user interface. For example, the device may include one or more sensors for detecting a position of a user's fingertip at a spatially separated distance from the display screen of the device. Movement of the user's fingertip toward or away from the display screen of the device is detected and is interpreted into movement of the user interface along the z-axis direction. Furthermore, lateral translation of the user's finger in the left or right direction relative to the display screen can be interpreted as a panning movement of the user interface in the x-axis direction, while translation of the user's finger in the up or down direction relative to the display screen can be interpreted to pan the user interface in the y-axis direction. Accordingly, implementations herein provide for interaction with a user interface having three dimensions of movement based on a finger-position of the user.
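The mapping from fingertip displacement to three-dimensional interface motion can be summarized as follows. This is a hedged sketch only: the function name, sign convention (positive `delta_z` meaning motion toward the screen), and `gain` parameter are assumptions for illustration, not details from the disclosure.

```python
def interpret_fingertip(delta_x: float, delta_y: float,
                        delta_z: float, gain: float = 1.0) -> dict:
    """Map fingertip displacement, measured relative to an initial
    position, onto user-interface motion: lateral translation pans the
    interface in x/y, while motion toward or away from the display
    screen scrolls along the z-axis."""
    return {
        "pan_x": gain * delta_x,     # left/right finger motion
        "pan_y": gain * delta_y,     # up/down finger motion
        "scroll_z": gain * delta_z,  # toward/away from the screen
    }

motion = interpret_fingertip(2.0, -1.0, 3.0)
```

Each sensed displacement component thus drives exactly one of the three interface axes, giving the three dimensions of movement described above.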
Furthermore, in some implementations, a slider may be provided for the user to scroll in the z-axis direction. For example, in the case of a device having a touch screen display, the slider may be a virtual slider located in a portion of the touchscreen. Alternatively, a mechanical slider or similar mechanism may be provided as part of the device. Employing the slider, the user is able to flip forward and backward through layers of applications or other items displayed in the z-axis direction. In other implementations, tilting of the device is used to control interaction in the z-axis direction. For example, tilt-detection can be activated when the device is in a first position, and the tilting of the device toward or away from the user causes movement of the interface along the z-axis direction.
According to some implementations, multiple columns or stacks of multiple elements are arranged in a grid in which each column or stack represents multiple elements of a similar type. Moving the stacks horizontally or vertically, such as by using swipes, dragging or panning the stacks, moves a focus of the user interface from one element type to another, while navigation in the z-axis direction allows the user to move between individual elements of a particular type. Further, through the use of perspective when displaying the stacks of items in the user interface, a user is able to visually determine the amount of content in a stack by the size of the stack. Thus, the items represented in the user interface can be quickly browsed across several different types of data. Some implementations herein may be employed for rapidly scanning through large groups of brief content, such as contacts, social status updates, Really Simple Syndication (RSS) blurbs, or the like. Further, implementations enable a large number of applications or items to be viewed on a single desktop without necessitating panning or scrolling in the x or y direction through multiple page views. Accordingly, the implementations of the user interface herein provide a scrollable representation of applications or items along a direction of a z-axis to compactly represent, on a single user interface view, a plurality of applications or items associated with multiple desktop user interface views.
A user is able to interact with the items 106 to cause the items to move forward or backward along the z-axis, as indicated by arrow 118, so that each of the items 106 may be viewed by the user. For example, the entire stack 108 can be made to appear to move forward and outward of the display 104, so that as each item 106 reaches a certain point it will fade or disappear. The item immediately behind then becomes visible for viewing. Consequently, a user can scroll through and view a large number of items 106 in a relatively short time.
The stack 108 may be arranged with a perspective viewpoint so that as items 106 are placed toward the rear, each item 106 appears smaller and closer together with the next item than with the item in front of it until a horizon 110 is reached where the items appear to blur together. Alternatively, in other implementations, the items 106 may continue to be shown increasingly smaller to a perspective vanishing point. Thus, the stack 108 can provide a user with an indication of a number of items in the stack 108. For example, if only five items 106 are in the stack 108, then all five items can be visible. If a very large number of items are in the stack 108, then the stack may appear to extend far into the screen.
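The perspective sizing just described can be modeled with a simple depth-based scale function. This is a hypothetical sketch: the geometric `falloff` factor and the `horizon_scale` cutoff below which items merge into the horizon 110 are illustrative parameter choices, not values from the disclosure.

```python
def item_scale(depth: int, falloff: float = 0.8,
               horizon_scale: float = 0.05):
    """Return the display scale for the item at the given depth in the
    stack, with each successive item drawn smaller than the one in
    front of it. Returns None when the item is so deep that it would
    blur into the horizon."""
    scale = falloff ** depth
    return scale if scale > horizon_scale else None
```

With these parameters, a stack of five items renders all five at visibly distinct sizes, while a very large stack simply extends until its rearmost items return `None` and merge into the horizon, giving the user an at-a-glance indication of the stack's size.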
Device 100 may include various controls for controlling the user interface 102. In the illustrated example, device 100 includes one or more finger position sensors 120 and one or more squeeze or grip sensors 122, the use of which will be described additionally below. Alternatively or in addition, a slider (not shown in
At block 402, multiple items are presented in a stack that is scrollable in the z-axis direction. For example, applications, content items, or the like may be presented in a stack to a user in the user interface 102, and the user is able to scroll forwards or backwards through the stack on the z-axis to locate and select a desired item.
At block 404, the user interface receives a selection of one of the items in the stack. For example, when a user reaches a desired item, the user may stop scrolling and select the item, such as by using a designated control or, in the case of a touchscreen, tapping on the item itself, or the like.
At block 406, the user interface presents a new set of items corresponding to the selected item in a new stack. The user is able to scroll through the new stack to locate a new item to be selected. Consequently, blocks 404 and 406 may be repeated a number of times depending on the depth of the hierarchy.
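The loop formed by blocks 404 and 406 can be sketched as a drill-down over the hierarchy. This is an illustrative Python sketch; the dictionary keys (`"items"`, `"substack"`, `"name"`) and the function name are hypothetical conventions, not part of the disclosure.

```python
def drill_down(stack: dict, choices: list):
    """Repeat blocks 404/406: at each level, select the chosen item;
    if it has a sub-stack, present that stack and continue, otherwise
    the selected leaf item is the result."""
    for index in choices:
        item = stack["items"][index]       # block 404: selection
        substack = item.get("substack")    # block 406: new stack, if any
        if substack is None:
            return item                    # leaf item reached
        stack = substack
    return stack

# Example two-level hierarchy: a "media" folder containing one song.
hierarchy = {"items": [{"name": "media",
                        "substack": {"items": [{"name": "song"}]}}]}
leaf = drill_down(hierarchy, [0, 0])
```

Each pass through the loop replaces the presented stack with the sub-stack of the selected item, terminating when a leaf item with no sub-stack is selected.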
Additionally, device 100 may include one or more squeeze or grip sensors 122 as a user-activatable input mechanism located on the sides of the device 100 or in another suitable location. For instance, grip sensors 122 may be pressure sensitive sensors or switches that are activated when a sufficient predetermined pressure is applied. Grip sensors 122 are able to be grasped by a user of the device 100 and squeezed for executing certain functions in the user interface 102. For example, one use of grip sensors 122 may be to select an item currently viewed in the user interface 102, although numerous other functions may also be implemented. Further, in some implementations, grip sensors 122 may also be touch sensitive, having a touch-sensitive surface 506 that can detect, for example, the sliding of a user's finger along the surface. Consequently, in some implementations, grip sensors 122 can be used as a scroller or slider for controlling interaction with the user interface in the z-axis direction. Alternatively, in other implementations, grip sensors 122 may be employed as a user-activated input mechanism used in conjunction with other inputs, such as finger position, for controlling interaction in the z-axis direction. Additionally, while grip sensors are shown on the sides of device 100 in some implementations herein, in other implementations, such as in the case in which device 100 is larger than a palm-sized unit, as in the case of a tablet device, one or more grip sensors may be located elsewhere on the device, such as near one or more corners of the device (e.g., the corner of a touch-sensitive screen), on the back of the device, or in another convenient location for gripping the device.
As an example, an initial fingertip position of the finger may be established near the device 100 by squeezing and holding the grip sensors 122 while positioning the fingertip 502 within proximity to the finger position sensor 120. When the initial fingertip position has been established, all movements may be tracked relative to that point by the finger position sensor 120. For example, movement of the fingertip 502 laterally in a plane parallel to the screen 104 of the device may be interpreted as a real-time panning motion on the user interface in the direction of finger movement.
Further, as illustrated in
At block 612, an initial position of the fingertip of a user is detected by the device 100. For example, an initial fingertip position may be established near the device 100 by squeezing and holding the grip sensors 122 while positioning the fingertip within proximity to the finger position sensor 120.
At block 614, movement of the finger is detected in the direction of the z-axis relative to the initial position. For example, the finger position sensor 120 may detect that the finger has moved toward or away from the initial position.
At block 616, in response to the detected movement of the finger, the user interface 102 scrolls in the direction of the z-axis by moving one or more items in the stack of items presented in the user interface as described above with respect to
While the finger positioning system 500 has been described in use with the user interfaces described herein, the finger positioning system 500 can also be used with other types of user interfaces for carrying out panning and zooming operations. For example, when viewing a map, the fingertip positioning system 500 in conjunction with the grip sensors 122 can be used to pan and zoom over portions of the map, and can even carry out panning and zooming in a single motion. Further, in other implementations, the finger positioning system 500 may be used for manipulating 3-D objects, 3-D spatial navigation, game control, or the like. Other uses and functions will also be apparent to those of skill in the art in light of the disclosure herein.
Additionally, or alternatively, as illustrated in
Other variations may also be used. For example, a first squeeze of the grip sensors 122 may turn on the tilt-responsive interaction with the z-axis, while a second squeeze of grip sensors 122 turns off the tilt-responsive interaction. Further, rather than using grip sensors 122, other activation mechanisms may be used, such as touching one of control buttons 124. Additionally, tilting the device to the left or right, rather than forward or backward, can be used for scrolling in the x-axis direction. As another example, touching a location on screen 104 when screen 104 is touch-sensitive may also serve as an activation mechanism for using tilting of the device for interaction with the interface in the z-axis direction.
At block 612, an initial position or attitude of the device is detected. For example, an initial position of the device may be established when a user squeezes and holds the grip sensors 122. Other activation mechanisms may also be used to implement the tilting control, as discussed above.
At block 614, tilting of the device is detected relative to the initial position. For example, one or more accelerometers or other motion sensors may be used to detect tilting of the device from the initial position, such as tilting the device forward or backward around the x-axis direction, e.g., rotating part of the device toward or away from the user.
At block 616, in response to the detected tilting of the device, the user interface 102 scrolls in the direction of the z-axis by moving one or more items in the stack 108 of items 106 presented in the user interface, as described above with respect to
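The tilt-responsive flow of blocks 612 through 616 can be sketched as a mapping from tilt angle to scroll velocity. This is a hedged illustration: the deadband (to ignore hand tremor) and the linear gain are assumed design choices, not parameters stated in the disclosure.

```python
def tilt_to_scroll(initial_pitch_deg: float, current_pitch_deg: float,
                   deadband_deg: float = 2.0, gain: float = 0.5) -> float:
    """Translate tilt relative to the captured initial attitude
    (block 612) into a z-axis scroll velocity (block 616). Tilting
    forward scrolls one way, tilting backward the other; small tilts
    inside the deadband produce no scrolling."""
    delta = current_pitch_deg - initial_pitch_deg   # block 614
    if abs(delta) <= deadband_deg:
        return 0.0
    # Subtract the deadband so scrolling starts smoothly from zero.
    return gain * (delta - deadband_deg if delta > 0
                   else delta + deadband_deg)
```

The initial attitude captured when the user squeezes the grip sensors 122 serves as the zero point, so the same physical gesture works regardless of how the device was being held when tilt control was activated.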
Further, as mentioned above with reference to
In the illustrated example, the user interface 800 includes multiple stacks 802, 804, 806, in which each stack is made up of multiple items. For example, stack 802 is made up of items 808-1, 808-2, . . . , 808-n; stack 804 is made up of items 810-1, 810-2, . . . , 810-n; and stack 806 is made up of items 812-1, 812-2, . . . , 812-n. The view or focus of the user interface 800 is sized so that a single stack 802 is viewable and large enough to present meaningful information, while portions of the adjacent stacks 804, 806 are shown to the right and left, respectively, to enable intuitive navigation to the adjacent stacks. Similar to the implementations described above, the items in each stack are displayed and browsable forward or backward along the z-axis direction, as indicated by arrow 814.
Each stack 802-806 may have a representation on the x-axis, such as a name or data type 816 of items in the stack, and may also include an indication 818 of the number of items in the stack. In some implementations, the different stacks may represent different data types or information. For example, one stack may be for contacts, one stack for e-mail, one stack for a calendar, etc. Furthermore, the focus of the user interface may be switched from one stack to an adjacent stack by scrolling or dragging of the stacks in the direction of the x-axis, as indicated by arrow 820. For example, stack 802 may be moved to the left into the position currently occupied by stack 806, which would put stack 804 in the focus of the user interface. This left/right panning or scrolling may be conducted at any location in the stack, thereby switching between data types at the current level of depth, as will be described additionally below.
Consequently, a user is able to successively view each item 808 contained in the stack 802. Furthermore, in some implementations, when the end of the stack 802 is reached, the stack 802 may loop back so that the first item 808-1 is presented again to the viewer, thereby restarting the stack 802. Additionally, the user is able to reverse the direction of scrolling at any point in time so that the stack 802 and the stacks 804 and 806 appear to move inward along the z-axis, away from the user, rather than outward from the user interface. Further, rather than employing the fade effect described above, each item 808 may simply disappear or appear at a predetermined point, such as when the item 808 reaches a size too large to fit within the view of the user interface 800. Other variations will also be apparent to those of skill in the art in light of the disclosure herein.
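The looping, reversible z-axis traversal described above reduces to simple index arithmetic over the stack. This is a minimal illustrative sketch; the function name and the clamping behavior of the non-looping variant are assumptions for illustration.

```python
def advance(index: int, step: int, count: int, loop: bool = True) -> int:
    """Move through a stack of `count` items along the z-axis.
    `step` is +1 to scroll outward or -1 to scroll inward (reversed
    direction). When `loop` is True, passing the last item restarts
    at the first item, and vice versa; otherwise the index is clamped
    at the ends of the stack."""
    if loop:
        return (index + step) % count
    return max(0, min(count - 1, index + step))
```

For a five-item stack, advancing past item 808-5 returns the interface to item 808-1, while reversing from the first item wraps around to the last.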
Additionally, as depicted in
As illustrated in
Additionally, in some implementations, rather than adding or removing entire data types to the flow 1000, a user may add one or more items of a particular data type. For example, if the user has received updates from a social networking site, the user can add one or more updates of interest to the flow 1000 for subsequent review, while leaving other updates out of the flow 1000. For example, if the user has a stack for the social network, the selected one or more items are added to the stack, or the selected items may merely be added to the flow 1000 separate from any other stack.
In another variation, rather than having the user add stacks to the flow 1000, one or more stacks may be automatically added, such as when one or more relevant updates are received for a particular data type. For example, suppose that the user receives a new text message. Upon receipt of the text message, the SMS/MMS stack 1012 may be automatically added to the flow 1000. After the user has viewed the text message, the SMS/MMS stack 1012 may then automatically be removed from the flow 1000. When another new text message is received, the SMS/MMS stack 1012 is again added back to the flow 1000. This automatic addition and removal of stacks can be extended to include updates to any of the different data types. Further, rather than adding an entire stack that includes both new updates and items already viewed, the items or stacks added to the flow 1000 may be just the newly received or updated items. As another example, one of the stacks in the flow 1000 may be designated as containing newly-received updates of various different data types. Thus, the user can then just scroll through this one stack to view updates to various different data types, e.g., new text messages, new emails, new social networking updates, or the like. These updates can also be added to their corresponding data type stack as well, to provide the user with the option to view updates according to data type.
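The automatic addition and removal of stacks can be sketched as a reconciliation between the flow and each data type's unread count. This is a hypothetical sketch: representing stacks by name strings and driving membership from an `unread_counts` mapping are illustrative assumptions, not details from the disclosure.

```python
def refresh_flow(flow: list, unread_counts: dict) -> list:
    """Keep a stack in the flow only while it has unread items: a stack
    with new updates is added automatically, and a stack whose updates
    have all been viewed is removed, matching the SMS/MMS example."""
    present = set(flow)
    for stack, unread in unread_counts.items():
        if unread > 0 and stack not in present:
            flow.append(stack)   # new update arrived: add the stack
        elif unread == 0 and stack in present:
            flow.remove(stack)   # all updates viewed: remove the stack
    return flow
```

Running this whenever an update arrives or is viewed reproduces the described behavior: the SMS/MMS stack appears when a text message is received and disappears once it has been read.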
In some implementations, the flow 1000 may be configured to automatically scroll across the view in the x-axis direction and momentarily pause on each stack before moving to a subsequent adjacent stack. The flow 1000 may loop to create a continuous experience. The flow direction and speed may be adjusted by the user, and when the user wishes to view a particular stack's content, the user can stop the flow such as with a finger tap and scroll along the z-axis to view the content of the particular stack. Furthermore, in addition to including the name of the data type 816 described above, the stacks may be visually distinct from each other in other ways, such as being color-coded, having distinct icon shapes, or the like. Additionally, the number of items in each stack may be visually indicated by depth of the stack, as discussed above, and/or the numerical indicator 818 may indicate the number of items in each stack. Furthermore, while the flow 1000 has been described in some implementations as displaying recent updates and information, in other implementations, the flow 1000 may be populated with stacks of other types. For example, the user may populate the flow 1000 with applications that the user frequently uses, or the like. Further, while some implementations provide constant movement of the flow 1000, in other implementations the movement is only initiated by the user, such as by swiping on a touch screen, or by activation of a control, slider, or the like.
Further, as mentioned above, the user may move to an adjacent stack at any point during navigation of the z-axis. For example, suppose that the today stack 1110 contains items representing one hour time periods for creating appointments, and the user has navigated along the z-axis of the today stack 1110 to determine whether an appointment is already scheduled for 3:00 pm. If so, the user may swipe or otherwise activate a control to move the plurality of stacks 1108 to the left so that the 3:00 pm time slot of the tomorrow stack 1112 is immediately presented in the focus or viewable area 1102, rather than the first item in the tomorrow stack 1112. The user can then determine whether the 3:00 pm time period is available tomorrow. If not, the user may move on to the next adjacent stack 1114 (i.e., the day after tomorrow) and be immediately presented with the 3:00 pm time period for that day, and so forth. Other navigation variations are also possible, as described additionally below.
As discussed above, as indicated by arrow 1216, the user may move a desired stack 1202-1210 into the focus 1214 by swiping or dragging in the case of a touch screen, by using mechanical controls, or other suitable control mechanism. Further, in some implementations, any of the multiple stack interfaces herein, including the upper level interface 1200, may be configured as a flow to automatically alternate between sequential presentation of each of the stacks 1202-1210, such as by continually scrolling each of the stacks 1202-1210 through the focus 1214, and optionally stopping for a brief period of time before presenting the next stack in the sequence.
Each stack 1202-1210 may be expanded into a plurality of additional stacks of a lower hierarchical level and having configurations based on the centricity of the corresponding upper level stack 1202-1210. Further, each set of lower level stacks may have different navigation properties based on the centricity of the particular upper level stack 1202-1210 from which they originate. For example, in some implementations, navigation from a first stack to an adjacent stack may result in direct display of an item that is analogous to an item that was displayed in the first stack. In some implementations, an analogous item might simply be an item at the same level of depth in the stacks along the z-axis direction, while in other implementations an analogous item might be an item directly related to a current item, e.g., related to the same person, same subject, same time period, or the like. Further, in other implementations, navigation to an adjacent stack results in display of a beginning of the adjacent stack.
In some implementations, an expansion control 1218, such as a virtual control, may be displayed in the focus 1214 in conjunction with a stack 1202-1210. For example, the expansion control 1218 may be touched or tapped on by the user to expand the selected upper level stack 1202-1210 into a plurality of corresponding lower level stacks based on the centricity of the selected upper level stack. Additionally, a collapse control 1220 may also be provided to enable the user to move back up a hierarchy of stacks from a lower level to a higher level. For example, pressing the collapse control 1220 once may result in display of a next higher level hierarchy, while pressing and holding the collapse control 1220 for a predetermined period of time or pressing multiple times may result in navigation to a highest level of the hierarchy. Further, while the examples herein discuss a virtual expansion control 1218 and collapse control 1220 displayed in conjunction with a touch screen, other types of controls may also be used for expansion and collapsing, such as double tapping on a selected stack, certain gestures or actions on a touch screen, mechanical controls provided on the device, or the like.
The galleries stack 1302 may contain galleries of photographs or videos arranged according to people shown in the images, such as may be identified through tagging of the images, or the like. A user may navigate through the galleries in the z-axis direction to locate a gallery of a particular person. The social network stack 1304 may contain social network pages of social network friends arranged alphabetically, by frequency of contact, or the like. A user may scroll through the social network stack 1304 in the z-axis direction to locate a social network page of a particular person. Similarly, the user may navigate through the contacts stack 1306 to locate a contact page for a particular person. The microblog stack 1308 may include a plurality of microblog pages that the user follows, and the user may navigate along the z-axis to locate a particular page for a particular person. Further, the relationship manager stack 1310 may correspond to a relationship management application that enables users to maintain connectivity with selected individuals. For example, the relationship manager may determine a length of time since a last communication with the selected individual and provide reminders to the user to maintain contact when the length of time exceeds a predetermined threshold.
In some implementations, as the user navigates along the z-axis in any one of the stacks 1302-1310, the other stacks 1302-1310 also scroll to the same depth level, and the user is able to peripherally witness this scrolling of adjacent stacks by movement of the items of the adjacent stacks partially visible within the focus 1214. However, in other implementations, the scrolling effect of the adjacent stacks is not necessarily provided. In any event, when the user has navigated along the z-axis to an item relating to Jon, subsequent lateral navigation in the x-axis direction to any of the stacks may result in direct presentation of a corresponding item relating to Jon from that stack. In the illustrated example, the user navigates along the z-axis direction to item 1316 containing Jon's contact information in the contacts stack 1306. The user then can navigate in the x-axis direction to the social network stack 1304 and be presented with item 1318 representing Jon's social network page. The user may continue navigation in the x-axis direction to the galleries stack 1302 and be presented with an item 1320 representing Jon's gallery (e.g., a gallery of images containing or related to Jon). Similarly, navigation in the opposite direction along the x-axis (or continued navigation in the same direction along the x-axis) brings the microblog stack 1308 into the focus 1214, and immediately presents an item 1322 displaying Jon's microblog page, while navigation of the relationship manager stack 1310 into the focus 1214 presents an item 1324 displaying Jon's relationship manager information.
As a further example, suppose that a second person, for example Josh, immediately follows alphabetically behind Jon among the people that the user interacts with in at least one of the stacks 1302-1310. When the user navigates along the z-axis direction from, for example, item 1318 displaying Jon's social network page to the next item 1326 displaying Josh's social network page, Josh's social network page is presented in the focus 1214. Subsequent navigation in the x-axis direction will present an item 1328 displaying Josh's gallery, an item 1330 displaying Josh's contact information, an item 1332 displaying Josh's micro-blog page, and an item 1334 displaying Josh's relationship manager information. Consequently, in these implementations, navigation in the x-axis direction results in presentation of items in adjacent stacks that are analogous or at a same level of depth as the current stack, i.e., items corresponding to the same person.
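The people-centric lateral navigation just described can be sketched as a lookup of the analogous item in the adjacent stack. This is an illustrative Python sketch; the `"person"` and `"placeholder"` keys and the function name are hypothetical, and the placeholder result anticipates the missing-page behavior described below.

```python
def lateral_navigate(stacks: dict, person: str, target_stack: str) -> dict:
    """People-centric policy: moving to an adjacent stack presents the
    item for the same person the user was viewing, rather than the
    front of the stack. If no item exists for that person, return a
    placeholder inviting the user to add the missing information."""
    for item in stacks[target_stack]:
        if item["person"] == person:
            return item
    return {"person": person, "placeholder": True}

# Example people-centric stacks, each ordered alphabetically by person.
stacks = {
    "contacts":   [{"person": "Jon"}, {"person": "Josh"}],
    "microblogs": [{"person": "Jon"}, {"person": "Josh"}],
}
result = lateral_navigate(stacks, "Josh", "microblogs")
</n```

Navigating from Josh's contact item to the microblogs stack thus presents Josh's microblog item directly, preserving the "same person" level of depth across stacks.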
Furthermore, suppose that Jon does not have, for example, a social network page. In this case, the user may be presented with an item that indicates that Jon is not currently associated with a social network page and that provides the user with an invitation to locate or provide information to link Jon to a social network page. This concept can be extended to the other stacks 1302-1310 in the people-centric interface 1300, such that whenever a page or information is missing for a particular person in one or more of the stacks 1302-1310, the user may be presented with an opportunity to add information for the particular person to that stack, rather than being presented with a blank item or the like. For example, suppose that the user has just added a new friend on the social network, and the user navigates in the direction of the z-axis to the new friend's page in the social network stack 1304. If the user then navigates laterally to the contacts stack 1306, the interface may automatically create a contact item, add the new friend's name to the contact item, and present the contact item along with an invitation for the user to fill in other contact information for the new friend. If the user then navigates to the microblog stack 1308, the user may be presented with an item inviting the user to add the new friend's microblog information, and so forth. Additionally, while lateral navigation has been described as occurring at the same level of depth throughout the people-centric stacks 1302-1310, in other implementations, the user may be provided with the opportunity to change the default navigation so as to automatically relocate the focus to the beginning item of an adjacent stack, or other such variations. Further, should the user desire to navigate back to the upper level interface 1200, the user may simply press the collapse control 1220 to close the people-centric interface 1300 and be presented with the upper level interface 1200.
The application store stack 1402 may include items that represent one or more application stores that a user may access to obtain additional applications. The communication applications stack 1404 may include a plurality of items representing communication applications, such as arranged in alphabetical order or in order of most frequent use. Similarly, the games stack 1406 may include a plurality of items representing different games that the user can access, the media applications stack 1408 may include a plurality of items representing media applications that the user can access, and the productivity applications stack 1410 may include a plurality of items representing productivity applications that the user can access. Further, when the user reaches the end of any of the application stacks 1404-1410, the user may be presented with an item that invites the user to connect directly to the application store to add more applications, or the like.
Navigation within the application-centric interface 1400 can be configured to take place differently from the navigation described above for the people-centric interface 1300. For instance, there is typically little correspondence or relationship between the applications in one stack 1404-1410 and the applications in an adjacent stack 1404-1410. Therefore, according to some implementations, navigation to an adjacent stack along the x-axis, as indicated by arrow 1412, can result in the user being presented with the first or beginning item in the adjacent stack regardless of the level of depth to which the user has navigated in the previous stack. For example, suppose that the user navigates along the z-axis in the games stack 1406, as indicated by arrow 1416, to a particular game near the middle of the games stack 1406. Should the user then navigate laterally to the left to an adjacent stack, such as the communication applications stack 1404, the user may be presented with the first item at the beginning of the communication applications stack 1404, rather than an item at the same level of depth. Other navigation variations will also be apparent to those of skill in the art in light of the disclosure herein.
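As a rough illustration of the two lateral-navigation rules described for the people-centric and application-centric interfaces, the following sketch chooses a target index based on whether the adjacent stack's items align one-to-one with the current stack's items. The `aligned` flag is an assumed stand-in for whatever type or centricity check an implementation might perform:

```python
# Hedged sketch of the lateral-navigation rule: aligned stacks (e.g.
# people-centric columns whose items correspond to the same people)
# preserve the current depth, while unaligned stacks (e.g. unrelated
# application stacks) reset the focus to the first item.

def lateral_target(current_depth, adjacent_stack_len, aligned):
    """Return the item index to focus in the adjacent stack."""
    if aligned and current_depth < adjacent_stack_len:
        return current_depth      # same level of depth (analogous item)
    return 0                      # beginning of the adjacent stack
```

For instance, `lateral_target(3, 10, True)` would land on the fourth item of an aligned adjacent stack, while the same move into an unaligned stack returns the beginning item.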
Navigation among the stacks 1502-1510 in the device-management-centric interface 1500 may be similar to that described above with respect to the application-centric interface 1400. For example, as there is typically little correspondence or relationship between items in one stack 1502-1510 and items in another stack 1502-1510, navigation along the x-axis direction from a current stack to an adjacent stack may typically result in navigation to the first or beginning item in the adjacent stack, regardless of the depth level of navigation in the current stack.
Additionally, the media item stacks 1602-1610 may be further expanded by selection of the expansion control 1218, such as to create a photo-centric interface 1612 or a music-centric interface 1614. For example, the photo-centric interface 1612 may include a plurality of stacks related to different photograph storage categories based on how the photographs are stored or classified, such as a date stack 1616, a location stack 1618, a name stack 1620, an event stack 1622, and a tagged stack 1624. The date stack 1616 may include a plurality of items representing photographs arranged according to the date on which the photographs were taken. The location stack 1618 may contain a plurality of items representing photographs arranged according to the location at which the photographs were taken. For example, the location may be automatically recorded by a camera using a GPS, or the like, when the photo is taken. Alternatively, the user may tag the photos or otherwise identify the location of the photos. The name stack 1620 may include a plurality of items representing photographs arranged according to the names of the people in the photographs. The event stack 1622 may contain photographs arranged according to particular events, such as holidays, birthdays, etc. The tagged stack 1624 may include a plurality of items representing photographs that have been tagged by the user or by others, and arranged according to the tags. Because there is typically little correspondence between adjacent items in the stacks 1616-1624, navigation in the x-axis direction from a current stack to an adjacent stack of the photo-centric interface 1612 may be configured to present the first or beginning item in the adjacent stack, rather than an item at an analogous level of depth.
The music-centric interface 1614 may have a plurality of stacks based on different music storage categories, such as an artists stack 1626, an albums stack 1628, a song titles stack 1630, a playlists stack 1632, and a genre stack 1634. The artists stack 1626 may contain a plurality of items representing songs listed according to artist, such as in alphabetical order or other suitable order. The albums stack 1628 may include a plurality of items representing albums, such as in alphabetical order or other suitable order. The song titles stack 1630 may include a plurality of items representing songs according to title, such as in alphabetical order or other suitable order. The playlists stack may include a plurality of items representing playlists, with each playlist containing a number of songs. The playlists may be created by the user or created automatically by an application on the device 100. The genre stack 1634 may include a plurality of items representing songs categorized according to various genres such as hip-hop, rock, classical, blues, country, etc.
Navigation laterally among the multiple stacks in the music-centric interface 1614 may be a combination of navigation through an analogous level of depth and navigation to the front of a stack. Thus, the user interface may determine an appropriate navigation property based on the type of the adjacent stack being navigated to. For example, suppose that the user navigates along the z-axis direction in the song titles stack 1630, and arrives at a song entitled “Poker Face” by an artist named “Lady Gaga.” If the user then navigates along the x-axis direction to the albums stack 1628, the user may then be immediately presented with an analogous item representing an album entitled “The Fame” having the song “Poker Face” as one of the tracks. If the user continues to navigate to the next adjacent stack, the artists stack 1626, the user may be immediately presented with an item representing a list of songs by Lady Gaga, including “Poker Face.” If the user navigates to the genre stack 1634, the user may be immediately presented with an item representing the pop genre that includes the song “Poker Face.” Further, if the user navigates to the playlists stack 1632, the user may be presented with an item representing a playlist that includes the song “Poker Face.” However, if there is no playlist that includes the song “Poker Face,” the user may instead be presented with the first item in the playlists stack 1632. The user may then scroll through the playlists along the z-axis direction to locate a playlist to which to add “Poker Face,” etc. Consequently, depending on the point at which navigation in the x-axis direction begins, navigation may either move to an analogous depth level in an adjacent stack, or may move back to the beginning of a stack. For example, suppose that the user is navigating along the z-axis direction through the playlists stack 1632, and arrives at a particular playlist.
Navigation to an adjacent stack, such as the song titles stack 1630, may result in the user being presented with the first or beginning item in the song titles stack 1630, as there typically would not be a single item analogous to a particular playlist. On the other hand, if the user navigates to a particular playlist and selects a particular song in the particular playlist, and then navigates in the x-axis direction to an adjacent stack, such as the song titles stack 1630, the navigation may result in the immediate presentation of the particular song according to title. Other variations will also be apparent in view of the disclosure herein.
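The try-analogous-then-fall-back behavior just described can be modeled by the following sketch, under the assumption of a simple list-of-pairs data model; the function name and sample data are illustrative only:

```python
# Sketch of the mixed navigation behavior: when moving laterally from a
# song, look for an item in the adjacent stack that contains that song
# (an analogous item); if none exists, fall back to the stack's first item.

def navigate_from_song(song, adjacent_stack):
    """adjacent_stack: list of (label, songs) pairs, e.g. albums or playlists.
    Return the index of the item to present in the adjacent stack."""
    for index, (label, songs) in enumerate(adjacent_stack):
        if song in songs:
            return index          # immediate presentation of the analogous item
    return 0                      # no match: present the beginning item

albums = [("Other Album", ["Track A"]),
          ("The Fame", ["Just Dance", "Poker Face"])]
playlists = [("Workout", ["Track A"]), ("Chill", ["Track B"])]

print(navigate_from_song("Poker Face", albums))     # matching album found
print(navigate_from_song("Poker Face", playlists))  # falls back to first playlist
```

In the album case an analogous item exists and is presented directly; in the playlist case no playlist contains the song, so navigation lands at the front of the stack.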
Additionally, some of the stacks in the photo-centric interface 1612 and the music-centric interface 1614 may be further expanded to create additional multiple-stack interfaces of even lower hierarchies. For example, in the photo-centric interface 1612, the event stack 1622 may be expanded to generate an interface of multiple stacks representing particular events, such as holidays, birthdays, etc. Similarly, the genre stack 1634 in the music-centric interface 1614 may be expanded to create an interface of a plurality of stacks, with each stack representing a different genre. Furthermore, the movies stack 1602, videos stack 1604, and television programs stack 1610 of the media-centric interface 1600 may each be similarly expanded to create additional multiple-stack interfaces of lower-level hierarchies similar to the photo-centric interface 1612 and the music-centric interface 1614. Additional variations will also be apparent to those of skill in the art in light of the disclosure herein, with the foregoing being mere non-limiting examples presented for discussion purposes.
In the example of
Further, the user may navigate through the days of the week by navigating along the z-axis direction. For example, suppose that the current day is Wednesday. The user activates the week expansion control 1704, and is presented with the stack for the current week 1708, with a first item 1716 representing Wednesday being displayed at the front of the current week stack 1708, such as for displaying any appointments scheduled for that day. The other days of the current week are available for navigation behind the first item 1716, namely a second item 1718 representing Thursday, a third item 1720 representing Friday, a fourth item 1722 representing Saturday, a fifth item 1724 representing Sunday, a sixth item 1726 representing Monday, and a seventh item 1728 representing Tuesday. Thus, the user may navigate forward or backward in the z-axis direction, as indicated by the arrow 1730, to view appointments scheduled for any day of the week. Further, should the user navigate to the left or right in the x-axis direction, the user may be presented with an item at the analogous level of depth. For example, suppose the user would like to schedule an appointment on a Thursday afternoon, and has navigated in the z-axis direction to the second item 1718 representing Thursday. If no appointment times are available for this Thursday, the user may swipe the current week stack 1708 sideways to navigate in the x-axis direction and be immediately presented with item 1732 representing Thursday of next week in the next week stack 1710. Thus, in some implementations, navigation in the x-axis direction from one stack 1708-1712 to another stack 1708-1712 takes place at the same level of depth, i.e., to the same day of the week. Alternatively, in other implementations, the default navigation may be configured to start at the beginning of the adjacent week stack, such as by displaying Monday as the first item in an adjacent stack.
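The depth-preserving week navigation amounts to offsetting the focused date by seven days, which a minimal sketch (assuming standard calendar arithmetic, not any particular device implementation) makes concrete:

```python
# Illustrative sketch of depth-preserving calendar navigation: swiping
# from one week stack to the next lands on the same weekday, i.e. the
# analogous level of depth, by offsetting the focused date by seven days.

from datetime import date, timedelta

def same_weekday_next_week(day):
    """Given the focused day, return the analogous day in the next week stack."""
    return day + timedelta(days=7)

thursday = date(2010, 5, 27)                 # a Thursday
print(same_weekday_next_week(thursday))      # the following Thursday, 2010-06-03
```

The beginning-of-stack alternative mentioned above would instead compute the Monday of the adjacent week, ignoring the current depth.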
Further, in some implementations, rather than displaying a seven-day week, the interface 1700 may be configured to display only a five-day week, such as Monday-Friday.
Further, navigation in the x-axis direction to an adjacent stack, as indicated by arrow 1750, locates a next month stack 1752 or a last month stack 1754 within the focus 1214, depending on the direction of navigation. When navigating from a currently presented item in a current stack to an adjacent stack, in some implementations, the user is presented with the first day of the month represented by the adjacent stack, such as day one 1756 of the next month stack 1752, or day one 1758 of the last month stack 1754. Alternatively, the interface may be configured to immediately present the same day of the adjacent month as the day of the current month that the user was viewing. For example, the user may be provided with options for setting the default navigation scheme. Further, while examples of a calendar-centric interface have been provided herein, other variations will be apparent to those of skill in the art in light of the disclosure herein.
Interface 1800 may also include a stack 1810 of items adjacent to the list 1802 of categories. For example, stack 1810 may include items related to the categories in the list 1802. The related items may be displayed concurrently with the selection of a category, or with the passage of a corresponding category through the focus 1808 during scrolling of the list 1802. In some implementations, when a particular category is selected or located in the focus 1808, a related item 1812 is displayed at the front of the stack 1810. In the illustrated example, “music” is the currently selected category, and related item 1812 may be related to music. For example, related item 1812 may be a representation of a particular song or album, may be a graphic representing music in general, may be a music-related advertisement, or the like. Additionally, as a user scrolls other categories in list 1802 through the focus 1808 and/or selects other categories in list 1802, the stack 1810 can automatically scroll in the z-axis direction, as indicated by arrow 1814, in a contemporaneous manner. For example, a related item 1816 located immediately behind related item 1812 may be related to applications, i.e., the next category in list 1802, while a related item 1818 located behind related item 1816 may be related to games, and so forth. Additionally, as a next category in list 1802 enters the focus 1808 during scrolling of the list 1802, in some implementations, the currently displayed related item may appear to fly out toward the user so that the next related item in the stack 1810 is displayed as the top or front item in stack 1810. Similarly, when the list 1802 is scrolled in the opposite direction, related items of stack 1810 may appear to fly inward in the z-axis direction, onto the front of stack 1810.
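The lockstep relationship between the category list 1802 and the related-item stack 1810 can be sketched as an index mapping, assuming a simple in-memory model with hypothetical names and sample data:

```python
# Sketch of the contemporaneous scrolling described above: the related-item
# stack is kept in lockstep with the category list by using the focused
# category's index to select the front item of the stack.

categories = ["music", "applications", "games"]
related = {
    "music": "album art",
    "applications": "app store ad",
    "games": "new game promo",
}

def front_related_item(focused_index):
    """Return the related item shown at the front of the stack when the
    category at `focused_index` is in the focus."""
    return related[categories[focused_index]]

print(front_related_item(0))   # related item while "music" is in the focus
```

Scrolling the list forward by one category would simply advance `focused_index`, causing the next related item to become the front of the stack.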
Further, a plurality of related representations 1820 may be displayed in another area of the interface 1800, such as below stack 1810 and list 1802. For example, related representations 1820 may be movable or scrollable in the x-axis direction, as indicated by arrow 1822. In some implementations, representations 1820 may be individual items, while in other implementations, representations 1820 may be stacks of items. For example, when the music category is selected, in some implementations, representations 1820 may be individual songs or albums, while in other implementations, representations 1820 may be a flow or group of stacks, such as stacks 1626-1634 in the music-centric interface 1614 described above with reference to
In various implementations, memory 1902 generally includes both volatile memory and non-volatile memory (e.g., RAM, ROM, Flash Memory, miniature hard drive, memory card, or the like). Additionally, in some implementations, memory 1902 includes a SIM (subscriber identity module) card, which is a removable memory card used to identify a user of the device 100 to a telecommunication service provider.
In some implementations, the user interface component 1904 implements the user interfaces described above, including the user interface 102 and the user interface 800. The user interface component 1904, including the z-axis component 1906, the flow component 1908, the finger position component 1910 and the slider control component 1912, may comprise a plurality of executable instructions, which may be organized as a single module of instructions or divided into any number of modules of instructions.
In various implementations, the APIs 1914 provide a set of interfaces allowing application providers to create user interfaces that provide for the z-axis scrolling and x-axis translation of sets of z-axis-scrollable items, as described herein. The interfaces of the APIs 1914 may in turn correspond to a set of functions, such as a function for generating a user interface or a function for enabling control of a user interface with a finger position control system or a slider. Such functions may take as parameters a set of parameter and user-interface-element pairs, as well as an identifier of the application, OS, platform, or device to which the user interface elements belong.
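Because no concrete signatures are specified for the APIs 1914, the following is only a hypothetical sketch of what such a function might look like; the function name, parameters, and return structure are all assumptions for illustration:

```python
# Hypothetical sketch of an interface in the APIs 1914: a function that
# builds a description of a z-axis-scrollable user interface from
# parameter / UI-element pairs plus an identifier of the owning application.

def create_scrollable_ui(owner_id, element_pairs):
    """Return a minimal UI description for `owner_id`.

    element_pairs: iterable of (parameters, ui_element) pairs, as suggested
    by the description of the APIs 1914.
    """
    return {
        "owner": owner_id,
        "elements": [{"params": params, "element": elem}
                     for params, elem in element_pairs],
    }

ui = create_scrollable_ui("com.example.music",
                          [({"axis": "z"}, "song stack"),
                           ({"axis": "x"}, "stack flow")])
```

An application provider would call such a function once per interface, then hand the resulting description to the user interface component for rendering.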
In various implementations, the applications 1916 and the OS and other modules 1918 comprise any instructions executing on the device 100. Such instructions include, for example, an OS of the device 100, drivers for hardware components of the device 100, applications providing interfaces to settings or personalization of the device 100, applications made specifically for the device 100, and third party applications of application providers. Collectively, these applications/processes are hereinafter referred to as applications 1916 and OS and other modules 1918, which may be entirely or partially implemented on the device 100. In some implementations, the applications 1916 and OS and other modules 1918 are implemented partially on another device or server.
In some implementations, the processor 1920 is a central processing unit (CPU), a graphics processing unit (GPU), or both CPU and GPU, or other processing unit or component known in the art. Among other capabilities, the processor 1920 can be configured to fetch and execute computer-readable instructions or processor-accessible instructions stored in the memory 1902, machine readable medium 1930, or other computer-readable storage media.
In various implementations, the display 1922 is a liquid crystal display or any other type of display commonly used in devices, such as telecommunication devices. For example, display 1922 may be a touch-sensitive touch screen, and can then also act as an input device or keypad, such as for providing a soft-key keyboard, navigation buttons, or the like.
In some implementations, the transceiver(s) 1924 includes any sort of transceivers known in the art. For example, transceiver(s) 1924 may include a radio transceiver and interface that performs the function of transmitting and receiving radio frequency communications via an antenna. The transceiver(s) 1924 may facilitate wireless connectivity between the device 100 and various cell towers, base stations and/or access points.
Transceiver(s) 1924 may also include a near field interface that performs a function of transmitting and receiving near field radio communications via a near field antenna. For example, the near field interface may be used for functions known in the art, such as communicating directly with nearby devices that are, for instance, Bluetooth® or RFID enabled. A reader/interrogator may also be incorporated into device 100.
Additionally, transceiver(s) 1924 may include a wireless LAN interface that performs the function of transmitting and receiving wireless communications using, for example, the IEEE 802.11, 802.16 and/or 802.20 standards. For example, the device 100 can use a Wi-Fi interface to communicate directly with a nearby wireless access point such as for accessing the Internet directly without having to perform the access through a telecommunication service provider's network.
In some implementations, the output device(s) 1926 include any sort of output devices known in the art, such as a display (already described as display 1922), speakers, a vibrating mechanism, tactile feedback mechanisms, and the like. Output device(s) 1926 may also include ports for one or more peripheral devices, such as headphones, peripheral speakers, or a peripheral display.
The machine readable storage medium 1930 stores one or more sets of instructions (e.g., software) embodying any one or more of the methodologies or functions described herein. The instructions may also reside, completely or at least partially, within the memory 1902 and within the processor 1920 during execution thereof by the device 100. The memory 1902 and the processor 1920 also may constitute machine readable medium 1930. The term “module,” “mechanism” or “component” as used herein generally represents software, hardware, or a combination of software and hardware that can be configured to implement prescribed functions. For instance, in the case of a software implementation, the term “module,” “mechanism” or “component” can represent program code (and/or declarative-type instructions) that performs specified tasks or operations when executed on a processing device or devices (e.g., CPUs or processors). The program code can be stored in one or more computer-readable memory devices or other computer-readable storage devices, such as memory 1902. Thus, the processes, components and modules described herein may be implemented by a computer program product.
In some implementations, fingertip sensor 1936 includes an imaging device or other component to recognize and track a position of a finger. Further, other input devices 1938 include any sort of input devices known in the art. For example, input device(s) 1938 may include a microphone, a keyboard/keypad, or a touch-sensitive display (such as the touch-sensitive touch screen described above). A keyboard/keypad may be a push button numeric dialing pad (such as on a typical telecommunication device), a multi-key keyboard (such as a conventional QWERTY keyboard), or one or more other types of keys or buttons, and may also include a joystick-like controller and/or designated navigation buttons, or the like.
Additionally, while an example device configuration and architecture has been described, other implementations are not limited to the particular configuration and architecture described herein. Thus, this disclosure can extend to other implementations, as would be known or as would become known to those skilled in the art. Reference in the specification to “one implementation,” “this implementation,” “these implementations” or “some implementations” means that a particular feature, structure, or characteristic described is included in at least one implementation, and the appearances of these phrases in various places in the specification are not necessarily all referring to the same implementation.
At block 2002, multiple stacks of multiple items scrollable in the z-axis direction are presented in the user interface 800. For example, each of the stacks may be of a different data type, a different application, or the like. The items in each stack may be presented and viewed by scrolling along the z-axis.
At block 2004, input is received to scroll in the direction of the z-axis. For example, input may be received from a finger position control system, from a slider, or from another input mechanism.
At block 2006, the user interface scrolls through one or more of the items in the stack that is currently in the focus of the user interface.
At block 2008, input is received to move the focus of the user interface laterally. For example, a user may swipe the representation of the currently presented item to the left or right to move in the direction of the x-axis. Other controls may also be used.
At block 2010, the user interface moves the focus to an item in the adjacent stack. For example, in some implementations, the focus may move to an analogous item or an item at the same depth as the item in the previous stack. In other implementations, the user interface may move the focus to the first or beginning item in the adjacent stack. For example, when the user interface receives an input to move an adjacent stack into the viewable area of the display, the user interface may determine a type or centricity of the adjacent stack for determining whether to present an analogous item of the adjacent stack or the beginning item of the adjacent stack in the user interface.
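The flow of blocks 2002-2010 can be sketched as a small input dispatcher: z-axis input scrolls within the focused stack, while x-axis input moves the focus to an adjacent stack, choosing analogous-depth or beginning-item placement based on whether the adjacent stack aligns with the current one. This is an illustrative model with assumed names, not the described device's implementation:

```python
# Minimal model of blocks 2002-2010: a focus state and an input handler.

def handle_input(state, event):
    """state: {'stack': index, 'depth': index}.
    event: ('z', delta) to scroll in depth, or ('x', delta, aligned) to
    move laterally, where `aligned` stands in for the type/centricity check."""
    if event[0] == "z":                      # blocks 2004-2006: scroll in depth
        state["depth"] = max(0, state["depth"] + event[1])
    elif event[0] == "x":                    # blocks 2008-2010: move laterally
        state["stack"] += event[1]
        if not event[2]:                     # unaligned stack: start at beginning
            state["depth"] = 0
    return state

s = {"stack": 0, "depth": 0}
handle_input(s, ("z", 3))            # scroll three items deeper in the stack
handle_input(s, ("x", 1, True))      # aligned adjacent stack keeps depth 3
handle_input(s, ("x", 1, False))     # unaligned stack resets depth to 0
```

After the three inputs, the focus sits on the beginning item of the third stack, mirroring the centricity-dependent behavior described at block 2010.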
Although the subject matter has been described in language specific to structural features and/or methodological acts, the subject matter defined in the appended claims is not limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claims. This disclosure is intended to cover any and all adaptations or variations of the disclosed implementations, and the following claims should not be construed to be limited to the specific implementations disclosed in the specification. Instead, the scope of this document is to be determined entirely by the following claims, along with the full range of equivalents to which such claims are entitled.
This application is a continuation-in-part of, and claims priority to, U.S. patent application Ser. No. 12/788,145, filed May 26, 2010, the entire disclosure of which is incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | 12788145 | May 2010 | US
Child | 12852086 |  | US