This relates to systems and methods for improving the scrolling of user interfaces of electronic devices.
There is a need for improving the scrolling of user interfaces of various electronic devices. Specifically, there is a need for improving the ease and speed with which users may scroll through information using user interfaces of various electronic devices.
Some known electronic devices (e.g., personal computers and portable telephones) include a user interface that manipulates data transmitted via an output component based on instructions received from a user input component. Some known input components are conventional keyboards, mice, and the like that allow a user to move a selector and/or information displayed on a visual output component, such as a video monitor, for scrolling through a set of data.
However, the amount of data to be scrolled through is typically extensive as compared to the amount of data able to be easily displayed on the output component at any given time. Accordingly, what is needed are systems and methods for improving the ease and speed with which users may scroll through data using user interfaces of various electronic devices.
Systems and methods for improving the scrolling of user interfaces of electronic devices are provided.
In some embodiments, a system for controlling the scrolling of information includes a first output component configured to display at least a portion of a list of listings, wherein the list of listings includes a plurality of sublists of listings. The system also includes an input component configured to receive a user movement, and an interface coupled to the input component and the first output component, wherein the interface is configured to monitor a first attribute of the user movement, and wherein the interface is configured to scroll through the list of listings on the first output component from an originating listing to a first scroll listing when the first attribute is below a first threshold, wherein the originating listing is contained within an originating sublist of the plurality of sublists, and wherein the first scroll listing is consecutive with the originating listing in the list. The interface is also configured to scroll through the list of listings on the first output component from the originating listing to a second scroll listing when the first attribute is above the first threshold, wherein the second scroll listing is the initial listing in a second scroll sublist of the plurality of sublists, and wherein the second scroll sublist is one of the sublists from the following group of sublists: (1) the originating sublist and (2) a sublist consecutive with the originating sublist in the list of listings.
In some embodiments, a method for controlling the scrolling of a list of listings on a first output component with an input component, wherein the list of listings includes a plurality of sublists of listings, includes monitoring a first attribute of a user movement of the input component. The method also includes scrolling on the first output component from an originating listing to a first scroll listing when the monitored first attribute is below a first threshold, wherein the originating listing is contained within an originating sublist of the plurality of sublists, and wherein the first scroll listing is consecutive with the originating listing in the list. The method also includes scrolling on the first output component from the originating listing to a second scroll listing when the monitored first attribute is above the first threshold, wherein the second scroll listing is the initial listing in a second scroll sublist of the plurality of sublists, and wherein the second scroll sublist is one of the sublists from the following group of sublists: (1) the originating sublist and (2) a sublist consecutive with the originating sublist in the list.
In some embodiments, a method includes detecting an accelerated navigation through a listing of assets and, while the accelerated navigation is detected, providing an asset list identifier along with the listing of assets to indicate where within the listing of assets a user is currently navigating, wherein the providing an asset list identifier comprises generating a first audible signal associated with the accelerated navigation.
In some embodiments, a method includes detecting an accelerated navigation through a listing of assets and, while the accelerated navigation is detected, providing an asset list identifier along with the listing of assets to indicate where within the listing of assets a user is currently navigating, wherein each asset in the listing of assets is related to an image file.
In some embodiments, a method includes detecting an accelerated navigation through a listing of assets and, while the accelerated navigation is detected, providing an asset list identifier along with the listing of assets to indicate where within the listing of assets a user is currently navigating, wherein each asset in the listing of assets is related to a geographic location file.
In some embodiments, a system for controlling the navigation of assets includes a first output component configured to provide a list of assets, a second output component, and an input component coupled to the first and second output components, wherein the input component is configured to detect an accelerated navigation through the list of assets, and wherein the second output component is configured to provide an audible asset list identifier to indicate where within the list of assets provided by the first output component a user is currently navigating when the accelerated navigation is detected.
In some embodiments, a system for controlling the navigation of assets includes an output component configured to provide a list of assets and an input component coupled to the output component, wherein the input component is configured to detect an accelerated navigation through the list of assets, wherein the output component is configured to provide an asset list identifier to indicate where within the list of assets a user is currently navigating when the accelerated navigation is detected, and wherein each asset in the list of assets is related to an image file.
In some embodiments, a system for controlling the navigation of assets includes an output component configured to provide a list of assets and an input component coupled to the output component, wherein the input component is configured to detect an accelerated navigation through the list of assets, wherein the output component is configured to provide an asset list identifier to indicate where within the list of assets a user is currently navigating when the accelerated navigation is detected, and wherein each asset in the list of assets is related to a geographic location file.
The above and other features of the present invention, its nature and various advantages will become more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
Systems and methods for improving the scrolling of user interfaces of electronic devices are provided and described with reference to
Moreover, in some cases, these electronic devices may be any portable, mobile, hand-held, or miniature electronic device having a user interface constructed according to an embodiment of the invention that allows a user to use the device wherever the user travels. Miniature electronic devices may have a form factor that is smaller than that of hand-held electronic devices, such as an iPod™ available from Apple Inc. of Cupertino, Calif. Illustrative miniature electronic devices can be integrated into various objects that include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, and combinations thereof. Alternatively, electronic devices that incorporate a user interface of the invention may not be portable at all, but may instead be generally stationary, such as a desktop computer or television.
As shown in
One or more input components 110 may be provided to permit a user to interact or interface with device 100. For example, input component 110 can take a variety of forms, including, but not limited to, an electronic device pad, dial, click wheel, scroll wheel, touch screen, one or more buttons (e.g., a keyboard), mouse, joy stick, track ball, and combinations thereof. Input component 110 may include a multi-touch screen, such as that described in U.S. Pat. No. 6,323,846, which is incorporated by reference herein in its entirety. Input component 110 may emulate a rotary phone or a multi-button electronic device pad, which may be implemented on a touch screen or the combination of a click wheel or other user input device and a screen. A more detailed discussion of such a rotary phone interface may be found, for example, in U.S. patent application publication No. 2007/0152983, filed Nov. 1, 2006, entitled “Touch Pad With Symbols Based On Mode,” which is incorporated by reference herein in its entirety. Each input component 110 can be configured to provide one or more dedicated control functions for making selections or issuing commands associated with operating device 100.
One or more output components 120 can be provided to present information (e.g., textual, graphical, audible, and/or tactile information) to a user of device 100. Output component 120 may take various forms, including, but not limited to, audio speakers, headphones, audio line-outs, visual displays, antennas, infrared ports, rumblers, vibrators, or combinations thereof.
It should be noted that one or more input components and one or more output components may sometimes be referred to collectively herein as an I/O interface (e.g., input component 110 and output component 120 as I/O interface 180). It should also be noted that input component 110 and output component 120 may sometimes be a single I/O component, such as a touch screen that may receive input information through a user's touch of a display screen and that may also provide visual information to a user via that same display screen.
Communications circuitry 108 may be provided to allow device 100 to communicate with one or more other electronic devices using any suitable communications protocol. For example, communications circuitry 108 may support Wi-Fi (e.g., an 802.11 protocol), Ethernet, Bluetooth™, high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), infrared, TCP/IP (e.g., any of the protocols used in each of the TCP/IP layers), HTTP, BitTorrent, FTP, RTP, RTSP, SSH, any other communications protocol, or any combination thereof. Communications circuitry 108 can also include circuitry that enables device 100 to be electrically coupled to another device (e.g., a computer or an accessory device) and communicate with that other device.
Memory 104 may include one or more storage mediums, including for example, a hard-drive, flash memory, permanent memory such as ROM, semi-permanent memory such as RAM, any other suitable type of storage component, or any combination thereof. Memory 104 may include cache memory, which may be one or more different types of memory used for temporarily storing data for electronic device applications. Memory 104 may store media data (e.g., music, image, and video files), software (e.g., for implementing functions on device 100), firmware, preference information (e.g., media playback preferences), lifestyle information (e.g., food preferences), exercise information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), wireless connection information (e.g., information that may enable device 100 to establish a wireless connection), subscription information (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information (e.g., telephone numbers and email addresses), calendar information, any other suitable data, or any combination thereof.
Power supply 106 may provide power to the components of device 100. In some embodiments, power supply 106 can be coupled to a power grid (e.g., when device 100 is not a portable device, such as a desktop computer). In some embodiments, power supply 106 can include one or more batteries for providing power (e.g., when device 100 is a portable device, such as a cellular telephone). As another example, power supply 106 can be configured to generate power from a natural source (e.g., solar power using solar cells).
Housing 101 may at least partially enclose one or more of the components of device 100 for protecting them from debris and other degrading forces external to the device. In some embodiments, one or more of the components may be provided within its own housing (e.g., input component 110 may be an independent keyboard or mouse within its own housing that may wirelessly or through a wire communicate with processor 102, which may be provided within its own housing).
Processor 102 of device 100 may control the operation of many functions and other circuitry included in the device. For example, processor 102 can receive input signals from input component 110 and/or drive output signals through output component 120. Processor 102 may load a user interface program (e.g., a program stored in memory 104 or another device or server) to determine how instructions received via an input component 110 may manipulate the way in which information (e.g., information stored in memory 104 or another device or server) is provided to the user via an output component 120.
As described above, a disadvantage of conventional electronic device user interfaces is that the amount of data that may be manipulated by a user via an input component is typically quite large as compared to the amount of data that may be easily provided to the user via an output component at any one point in time. Therefore, according to embodiments of the invention, systems and methods are provided for improving the ease and speed with which users may scroll through a large amount of data using user interfaces of various electronic devices.
In accordance with one embodiment of the invention, device 200 can permit a user to load and browse through one or more large libraries of media or data. Each library may be stored in a memory component of the device (e.g., memory 104 of
For example, a particular piece of metadata 350 that may be associated with an audio recording file 340 of a particular song entry 326 in library 300 is textual information metadata. Such textual information may be a string of one or more alphanumeric characters representative or descriptive of the title of the song (e.g., song title metadata 351), the length of the song (e.g., song length metadata 352), the name of the song's artist (e.g., song artist metadata 353), the name of the album on which the song originally appears (e.g., song album metadata 354), or any other facet of the song, such as the lyrics of the song, for example. As shown, song title metadata 351 for each entry 326 may be a string of one or more alphanumeric characters representative or descriptive of the title of the song (e.g., as shown in
Similarly, song length metadata 352 for each entry 326 may be a string of one or more alphanumeric characters representative or descriptive of the length of the song (e.g., as shown in
Another particular piece of metadata 350 that may be associated with an audio recording file 340 of a particular song entry 326 in library 300 is graphical information. Such graphical information may be an image or video file depicting or descriptive of the cover art of the album on which the song originally appears (e.g., cover art metadata 355) or any other facet of the song, such as a picture of the song's artist, for example. As shown, cover art metadata 355 for each entry 326 may be an image file representing or descriptive of the cover art of the song's album (e.g., as shown in
Yet another particular piece of metadata 350 that may be associated with an audio recording file 340 of a particular song entry 326 in library 300 is additional audio information. Such additional audio information may be an additional audio file representative of at least a portion of the associated payload audio recording file 340. For example, the additional audio information may be a condensed or smaller or shorter version of the payload audio recording file, such as a thirty-second clip of a much longer payload audio recording file, or a short recording enunciating the name of the song's title or first alphanumeric character of the song's album (e.g., audio clip metadata 356). As shown, audio clip metadata 356 for each entry 326 may be an audio file representative of a short portion of the associated payload audio recording file 340 (e.g., as shown in
There are many other types of metadata 350 that can be associated with a particular payload audio file 340 of a particular song entry 326 in library 300. For example, such a particular piece of metadata may include preference information (e.g., media playback preferences), exercise information (e.g., information obtained by exercise monitoring equipment), transaction information (e.g., information such as credit card information), subscription information (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information (e.g., telephone numbers and email addresses), calendar information, or any other suitable type of information that a user or other entity may wish to associate with a particular payload audio file of a particular song entry in a library of songs (e.g., miscellaneous metadata 357). As shown, miscellaneous metadata 357 for each entry 326 may be any type of file or alphanumeric string representative of any facet of the associated payload audio recording file 340 (e.g., as shown in
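The entry structure described above (a payload audio file with associated metadata pieces 350-357) can be sketched as a simple data model. This is a minimal illustration only; the class and field names are assumptions, not part of any embodiment.

```python
from dataclasses import dataclass, field
from typing import Optional

# Sketch of one song entry 326: a payload audio recording file 340 plus
# associated metadata 350-357. All names here are illustrative assumptions.
@dataclass
class SongEntry:
    audio_file: str                           # payload audio recording file 340
    title: str = ""                           # song title metadata 351
    length: str = ""                          # song length metadata 352
    artist: str = ""                          # song artist metadata 353
    album: str = ""                           # song album metadata 354
    cover_art: Optional[str] = None           # cover art image metadata 355
    audio_clip: Optional[str] = None          # short audio clip metadata 356
    misc: dict = field(default_factory=dict)  # miscellaneous metadata 357

entry = SongEntry(audio_file="a_bad_one.mp3", title="A BAD ONE")
```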
As mentioned, each library (e.g., library 300) or any particular portions of a library (e.g., cover art metadata 355) may be stored in any memory component of device 200 (e.g., memory 104 of
According to an embodiment of the invention, device 200 may include a user interface that allows a user to quickly and easily alternate between two or more modes of scrolling through a list of library entries. For example, the user interface of device 200 may quickly and easily switch between first and second scrolling modes in response to a particular type of user command generated by input component 210. This can improve the speed and ease with which a user may search for a particular entry within an extensive library of entries.
For example, in the embodiment where a user is accessing library 300 of song entries 326 (see, e.g.,
User interface 222 may also include a highlighter or selector indicator 228 that can differentiate one or more specific descriptive entry listings 226 from the other listings 226 displayed on output component 220 at any given time (e.g., listing 226c in user interface 222A of
User interface 222 may also include a status portion 224 that can describe the status of device 200. For example, as shown in
Rotational input component 210 may include a selector 212 surrounded by a curved track 214, as shown in
For example, a user may gesture or impart a movement in the direction of arrow R along track 214 in such a way that user interface 222 scrolls downwardly through one additional listing 226. For example, user interface 222 may monitor an attribute of the user movement and update user interface 222A of
As well as handling various gesture types (e.g., user movement in the direction of arrows L and R), input component 210 may generate different instructions to the user interface of device 200 based on various attributes of a particular gesture type. The user interface may monitor at least a first attribute of a particular type of user movement received by the input component and thereby vary the way in which listings are scrolled. For example, user interface 222 may monitor an attribute of a user's movement on input component 210, such as the speed, length, or frequency of a user's movement in the direction of arrow R along track 214, and may vary the way in which listings 226 are scrolled with respect to indicator 228 based on that monitored attribute. In one embodiment, the listings 226 may be scrolled upwardly or downwardly one listing at a time when a monitored attribute of a user movement is below a certain threshold (e.g., when the speed of the movement is below a certain threshold velocity) and may be scrolled differently than one listing at a time when the monitored attribute of the user movement is above a certain threshold (e.g., when the speed of the movement is above a certain threshold velocity).
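The threshold comparison described above can be sketched as a small mode-selection function. This is a hedged illustration only: the function name, mode labels, and threshold value are assumptions, not part of any claimed embodiment.

```python
ELEMENTAL, QUICK = "elemental-scroll", "quick-scroll"

def scroll_mode(attribute: float, threshold: float = 1.0) -> str:
    """Pick a scrolling mode from one monitored attribute of a user movement
    (e.g., gesture speed along track 214); values here are assumed."""
    return QUICK if attribute > threshold else ELEMENTAL
```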
As described, a user may gesture in the direction of arrow R along track 214 in such a way that user interface 222 is updated with a single new listing 226 displayed at the bottom of the list (e.g., as shown in the update of user interface 222 between interface 222A of
There are various ways in which a user interface may scroll through a list of listings other than one listing at a time. For example, rather than simply scrolling from an original listing to a new listing that is consecutive with the original listing in the list, the list may be broken down into several sublists and a user interface may scroll from an original listing in a first sublist of the list to a new listing that is either the initial listing in the first sublist or the initial listing in a sublist that is consecutive with the first sublist in the list.
In one embodiment, as shown in
Each listing 226 in the list of listings on user interface 222 may be included in one of the plurality of sublists of listings 226 based on a first characteristic of this first piece of metadata. For example, each listing 226 in the list of listings on user interface 222 may be included in one of a plurality of sublists of listings 226 based on a first characteristic of the song title metadata 351 associated with that listing. Song title metadata 351 may be a string of one or more alphanumeric characters (e.g., “A BAD ONE” or “ACCENT” or “BALLOON” or “CLAM BAKE”). Therefore, each listing 226 in the list of listings on user interface 222 may be included in one of a plurality of sublists of listings 226 based on a first characteristic of the alphanumeric string, such as the first alphanumeric character of the string (e.g., “A” for “A BAD ONE”, or “A” for “ACCENT”, or “B” for “BALLOON”, or “C” for “CLAM BAKE”). As shown in
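The sublist grouping described above (keying each listing by the first alphanumeric character of its song title metadata 351) can be sketched as follows, using the example titles from the text; the variable names are assumptions.

```python
from itertools import groupby

# Group listings into sublists by the first character of the title string,
# as described for song title metadata 351.
titles = ["A BAD ONE", "ACCENT", "BALLOON", "CLAM BAKE"]

def first_char(title: str) -> str:
    return title[0].upper()

sublists = {key: list(group)
            for key, group in groupby(sorted(titles, key=first_char), key=first_char)}
# sublists → {"A": ["A BAD ONE", "ACCENT"], "B": ["BALLOON"], "C": ["CLAM BAKE"]}
```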
The listings 226 in each one of the plurality of sublists may be ordered within a particular sublist based on a second characteristic of the first piece of metadata. For example, each one of listings 226a-g in the first sublist on user interface 222 may be ordered within that sublist based on a second characteristic of the song title metadata 351. Song title metadata 351 may be a string of one or more alphanumeric characters. Therefore, each one of listings 226a-g in the first sublist on user interface 222 may be ordered within that sublist based on a second characteristic of the alphanumeric string, such as the alphanumerical order of the string. For example, each one of listings 226a-g in the first sublist on user interface 222 may therefore be ordered within that sublist as shown in
Finally, the plurality of sublists of listings 226 may be ordered within the list of listings 226 provided by user interface 222 based on the first characteristic of the first piece of metadata. For example, the first sublist containing listings 226a-g and the second sublist containing listings 226h-j may be ordered within the list of listings 226 provided by user interface 222 based on the first characteristic of the first piece of metadata (e.g., based on the alphanumerical order of the first alphanumeric character of the song title metadata 351). For example, the first sublist containing listings 226a-g and the second sublist containing listings 226h-j may be ordered within the list of listings 226 provided by user interface 222 as shown in
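The two orderings just described, sublists ordered by the first character of the title and listings ordered alphanumerically within each sublist, amount to a single compound sort key, which can be sketched as follows (variable names assumed):

```python
# Order the whole list: sublists by first character, and listings within
# each sublist by alphanumerical order of the full title string.
titles = ["CLAM BAKE", "ACCENT", "BALLOON", "A BAD ONE"]
ordered = sorted(titles, key=lambda t: (t[0].upper(), t.upper()))
# ordered → ["A BAD ONE", "ACCENT", "BALLOON", "CLAM BAKE"]
```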
A list of listings that is broken down into several sublists of listings, such as listings 226 of
Alternatively, user interface 222 may scroll from an original listing in a first sublist of the list to a new listing that is either (1) the initial listing in a second sublist that is consecutive with the first sublist in the list or (2) the initial listing in the first sublist (i.e., "quick-scroll"). For example, user interface 222 may scroll downwardly from an original listing 226d in a first sublist containing listings 226a-g, as shown in user interface 222B of
Likewise, user interface 222 may scroll upwardly from an original listing 226l in a first sublist containing listings 226k and 226l, as shown in user interface 222E of
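The quick-scroll target rule described in the preceding paragraphs can be sketched as one function: scrolling downwardly jumps to the initial listing of the next sublist, while scrolling upwardly jumps to the initial listing of the current sublist, or of the previous sublist when the originating listing is already that initial listing. All names are assumptions.

```python
def quick_scroll_target(sublists, s, i, down):
    """sublists: ordered sublists of listings; s: index of the originating
    sublist; i: index of the originating listing within it; down: direction."""
    if down:
        if s + 1 < len(sublists):
            return sublists[s + 1][0]   # initial listing of next sublist
        return sublists[s][i]           # already in the last sublist
    if i > 0:
        return sublists[s][0]           # initial listing of this sublist
    if s > 0:
        return sublists[s - 1][0]       # initial listing of previous sublist
    return sublists[s][i]               # already at the top of the list

groups = [["A BAD ONE", "ACCENT"], ["BALLOON"], ["CLAM BAKE"]]
```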
The thresholds for the various attributes of user movements that may be monitored by user interface 222 to determine whether to "elementally-scroll" or "quickly-scroll" through the listings 226 provided on output component 220 may be determined by a user of device 200 or the manufacturer of device 200. For example, a user may select a threshold based on how many entries are in the library through which he or she wishes to scroll. Alternatively, a user may select a threshold based on his or her dexterity using the input component. These thresholds may be stored locally on the device (e.g., memory 104 of
Therefore, according to an embodiment of the invention, user interface 222 of device 200 may quickly and easily switch between a first “elemental-scrolling” mode and a second “quick-scrolling” mode for updating the displayed portion of a list of descriptive entry listings 226 on output component 220 in response to a monitored attribute of a particular type of user movement of input component 210. This can improve the speed and ease with which a user may search for a particular entry within an extensive library of entries. The user interface of the invention may provide more than two modes of scrolling by monitoring an attribute of a user movement with respect to more than one threshold or by monitoring more than one attribute of a user movement. Moreover, a quick-scrolling mode of the user interface may scroll through a list of listings in various other ways, such as immediately to the end of the list, or immediately to the middle of the list, for example.
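The more-than-two-modes variant mentioned above, monitoring an attribute against more than one threshold, can be sketched as follows; the mode names and threshold values are assumptions.

```python
def scroll_mode_multi(speed, thresholds=(1.0, 3.0)):
    """Select among three scrolling modes by comparing one monitored
    attribute (e.g., gesture speed) against two thresholds (values assumed)."""
    if speed > thresholds[1]:
        return "quick-scroll-to-end"   # e.g., immediately to the end of the list
    if speed > thresholds[0]:
        return "quick-scroll"
    return "elemental-scroll"
```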
In addition to changing the way in which descriptive entry listings 226 are scrolled on output component 220 in response to a particular type of user movement of input component 210, user interface 222 may also change the type of information transmitted to the user in response to a particular type of user movement of input component 210. For example, when user interface 222 quick-scrolls downwardly from an original listing 226d in a first sublist to a new listing 226h that is the initial listing in a downwardly consecutive second sublist, user interface 222 may also enhance itself by providing a visual enhancer 230 along with the updated set of listings 226 (e.g., as shown in the update from user interface 222B of
Visual enhancer 230 may be any additional information, such as an icon or image or string of one or more alphanumeric characters, that is descriptive of or related to at least one characteristic of the new listing or the second sublist (i.e., the sublist that contains the new listing). For example, as shown in
In one embodiment, user interface 222 may continuously show visual enhancer 230 as long as the user interface continues to quick-scroll through the listings. For example, if user interface 222 continues to quick-scroll downwardly from listing 226h in a first sublist to a new listing 226k that is the initial listing in a downwardly consecutive second sublist, as shown in the update from user interface 222C of
When user interface 222 terminates quick-scrolling and begins elemental-scrolling, for example, visual enhancer 230 may also be terminated. For example, if user interface 222 stops quick-scrolling but continues to update the listings 226 displayed on output component 220 by elementally-scrolling downwardly from listing 226k to downwardly consecutive listing 226l, as shown in the update from user interface 222D of
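The basic visual-enhancer lifecycle described above, shown continuously while quick-scrolling and removed when elemental scrolling resumes, can be sketched as follows (function and label names are assumptions; other embodiments, as noted below, may show the enhancer at other times as well):

```python
def enhancer_for(mode, current_sublist_key):
    """Return the visual-enhancer text (e.g., the first character shared by
    the current sublist's titles) while quick-scrolling, or None to hide it."""
    return current_sublist_key if mode == "quick-scroll" else None
```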
It is to be noted, however, that in accordance with an embodiment of the invention, user interface 222 may provide visual enhancer 230 along with an updated set of listings 226 even when it is not quick-scrolling. For example, user interface 222 may once again provide visual enhancer 230 if the interface elementally-scrolls upwardly through the listings from listing 226l to listing 226k, as shown in the update from user interface 222E of
The situations in which user interface 222 may provide a visual enhancer, such as visual enhancer 230 of
As an alternative or in addition to visually enhancing an updated set of listings 226 with a visual enhancer 230, user interface 222 may enhance itself aurally. As shown in
For example, when user interface 222 elementally-scrolls downwardly from an original listing 226c to downwardly consecutive listing 226d in the list of listings, user interface 222 may enhance itself aurally by transmitting a first sound 241 via output component 240 while also updating the set of listings 226 on output component 220 (e.g., as shown in the update from user interface 222A of
First sound 241 and second sound 242 may each be a single tone or a much more complex sound, such as a song. In one embodiment, first sound 241 may be a single short “clicking” sound indicative of the short scrolling between consecutive listings 226c and 226d (e.g., as shown in the update from user interface 222A of
For example, when user interface 222 continues to quick-scroll downwardly from an original listing 226h to a new listing 226k that is the initial listing in a downwardly consecutive sublist, user interface 222 may enhance itself aurally by once again transmitting second sound 242 via output component 240 while also updating the set of listings 226 on output component 220 (e.g., as shown in the update from user interface 222C of
However, there are various other ways in which user interface 222 can transmit different sounds via output component 240 for increasing the ease and speed with which a user may scroll through a list of listings 226. For example, in another embodiment, the sound transmitted by user interface 222 via output component 240 may be specifically associated with the listing being highlighted by indicator 228. For example, when user interface 222 scrolls to a new listing 226m (e.g., by elementally scrolling downwardly from an original listing 226l), the user interface may enhance itself aurally by transmitting via output component 240 a third sound 243 that is in some way related to new listing 226m (e.g., as shown in the update from user interface 222E of
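The sound-selection behavior described above can be sketched as one lookup: a fixed first sound for elemental scrolling, a fixed second sound for quick-scrolling, and an optional listing-specific third sound for the highlighted listing. The file names and function name are illustrative assumptions.

```python
CLICK, WHOOSH = "click.wav", "whoosh.wav"   # assumed first and second sounds

def scroll_sound(mode, listing_sounds, listing):
    """Pick the sound to transmit: a listing-specific sound if one is
    associated with the highlighted listing, else a per-mode sound."""
    if listing in listing_sounds:           # listing-specific third sound
        return listing_sounds[listing]
    return WHOOSH if mode == "quick-scroll" else CLICK
```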
The situations in which user interface 222 may provide an aural enhancement via output component 240, may be determined by a user of device 200 or the manufacturer of device 200. For example, a user may wish to be provided with aural enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with aural enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
As an alternative or in addition to visually enhancing an updated set of listings 226 with a visual enhancer 230 and/or aurally enhancing an updated set of listings 226 with sounds via an audio output component 240, user interface 222 may enhance itself haptically or tactilely. As shown in
For example, when user interface 222 elementally-scrolls downwardly from an original listing 226c to downwardly consecutive listing 226d, user interface 222 may enhance itself haptically by transmitting a first haptic signal 251 via output component 250 while also updating the set of listings 226 on output component 220 (e.g., as shown in the update from user interface 222A of
First haptic signal 251 and second haptic signal 252 may each be a single force or a much more complex motion, such as a steady beat. In one embodiment, first haptic signal 251 may provide a single short vibrating sensation to the user that is indicative of the short scrolling between consecutive listings 226c and 226d, while second haptic signal 252 may provide a longer and more powerful vibrating sensation to the user that is indicative of the quick-scrolling between listings 226d and 226h of different sublists. The same first haptic signal 251 may be transmitted by user interface 222 every time it elementally-scrolls between two listings and the same second haptic signal 252 may be transmitted by user interface 222 every time it quickly-scrolls between two listings. This may help a user to more quickly and more easily realize how he or she is scrolling through the listings.
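The mapping described above can be sketched as a simple lookup from scroll type to feedback signal. This is a minimal illustrative sketch, not the specification's implementation; the identifiers are hypothetical stand-ins for first haptic signal 251 and second haptic signal 252.

```python
# Hypothetical sketch: one haptic signal for elemental scrolling between
# consecutive listings, a different one for quick-scrolling between
# sublists. Signal names are illustrative placeholders.

FIRST_SIGNAL = "short_pulse"   # stands in for haptic signal 251
SECOND_SIGNAL = "long_pulse"   # stands in for haptic signal 252

def haptic_for(scroll_mode):
    """Return the haptic signal to transmit for a given scroll mode."""
    return SECOND_SIGNAL if scroll_mode == "quick" else FIRST_SIGNAL
```

Because the same signal is reused for every scroll of a given type, the user can learn to distinguish the two scrolling modes by feel alone.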
For example, when user interface 222 continues to quick-scroll downwardly from an original listing 226h to a new listing 226k that is the initial listing in a downwardly consecutive sublist, user interface 222 may enhance itself haptically by once again transmitting second haptic signal 252 via output component 250 while also updating the set of listings 226 on output component 220 (e.g., as shown in the update from user interface 222C of
However, there are various other ways in which user interface 222 can transmit different haptic signals via output component 250 for increasing the ease and speed with which a user may scroll through a list of listings 226. For example, in another embodiment, the haptic signal transmitted by user interface 222 via output component 250 may be specifically associated with the listing being highlighted by indicator 228. For example, when user interface 222 scrolls to a new listing 226m (e.g., by elementally scrolling downwardly from an original listing 226l), the user interface may enhance itself haptically by transmitting via output component 250 a third haptic signal 253 that is in some way related to new listing 226m (e.g., as shown in the update from user interface 222E of
The situations in which user interface 222 may provide haptic or tactile enhancement via output component 250 may be determined by a user of device 200 or the manufacturer of device 200. For example, a user may wish to be provided with haptic enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with haptic enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
There are various other ways in which descriptive entry listings may be provided on a user interface to allow a user to scroll through library 300 of song entries 326 than as shown in
As shown in
Moreover, user interface 422 of device 400 may also include a highlighter or selector indicator 428 that can differentiate one or more specific descriptive entry listings 426 from the other listings 426 displayed on output component 420 at any given time (e.g., listing 426c in user interface 422A of
Device 400 may include an input component 410 with which a user may interact to send various types of input signals to the user interface, similarly to input component 210 of
As shown in
Visual enhancement portion 429′ may include one or more visual enhancers 430′. Each of the one or more visual enhancers 430′ may be any additional information, such as an icon or image or string of one or more alphanumerical characters, that is descriptive of or related to at least one characteristic of a listing 426′ or its sublist. For example, as shown in
Middle visual enhancer 430_2′ may be any additional information that is descriptive of or related to at least one characteristic of the highlighted listing or the sublist containing that highlighted listing (e.g., listing 426h′ of
In one embodiment, top visual enhancer 430_1′ may be any additional information that is descriptive of or related to at least one characteristic of the sublist upwardly consecutive from the sublist containing the highlighted listing. For example, as shown in
Device 400′ may include an input component 410′ with which a user may interact to send various types of input signals to the user interface, similarly to input component 210 of
In one embodiment, if the user desires to select the library entry associated with any of the descriptive entry listings 426′ of user interface 422′ displayed on output component 420′, he or she may simply tap that portion of the interface 422′. However, if the user desires to select a descriptive entry listing 426′ other than the ones currently displayed on output component 420′, he or she may impart either an upward flicking motion on the display in the direction of arrow FU (e.g., for scrolling from the listings of user interface 422C′ of
As shown in
Moreover, user interface 422″ of device 400″ may not include a highlighter or selector indicator, such as indicator 228 of
Device 400″ may include an input component 410″ with which a user may interact in order to send various types of input signals to the user interface, similarly to input component 210 of
In accordance with one embodiment of the invention, device 500 can permit a user to load and browse through one or more large libraries of media or data. Each library may be stored in a memory component of the device (e.g., memory 104 of
For example, a particular piece of metadata 650 that may be associated with a picture file 640 of a particular picture entry 626 in library 600 is textual information metadata. Such textual information may be a string of one or more alphanumeric characters representative or descriptive of the picture (e.g., picture description metadata 651), the date and time at which the picture was captured (e.g., timestamp metadata 652), the name of the photo album to which the picture belongs (e.g., photo album metadata 654), or any other facet of the picture, such as a journal entry describing any events surrounding the picture, for example. As shown, picture description metadata 651 for each entry 626 may be a string of one or more alphanumeric characters representative or descriptive of the picture (e.g., as shown in
Similarly, timestamp metadata 652 for each entry 626 may be a string of one or more alphanumeric characters representative or descriptive of the date and time at which the picture was captured (e.g., as shown in
Another particular piece of metadata 650 that may be associated with an image file 640 of a particular picture entry 626 in library 600 is additional graphical information. Such graphical information may be a thumbnail (i.e., compressed) version of the image file (e.g., thumbnail metadata 655) or may be related to any other facet of the picture, such as a picture of the photographer, for example. As shown, thumbnail metadata 655 for each entry 626 may be a thumbnail of the picture (e.g., as shown in
Yet another particular piece of metadata 650 that may be associated with an image file 640 of a particular picture entry 626 in library 600 is audio information. Such audio information may be an audio file related to the associated payload image file 640, such as a recorded account of the events surrounding the picture (e.g., audio clip metadata 656). As shown, audio clip metadata 656 for each entry 626 may be an audio file related to the associated payload picture file 640 (e.g., as shown in
As described above with respect to song library 300 of
As mentioned, each library (e.g., library 600) or any particular portions of a library (e.g., thumbnail metadata 655) may be stored in any memory component of device 500 (e.g., memory 104 of
According to an embodiment of the invention, device 500 may include a user interface that allows a user to quickly and easily alternate between two or more modes of scrolling through a list of library entries. For example, like electronic device 200 of
As shown in
User interface 522 may also include a highlighter or selector indicator 528 that can differentiate one or more specific descriptive entry listings 526 from the other listings 526 displayed on output component 520 at any given time (e.g., listing 526a in user interface 522A of
User interface 522 may also include a status portion 524 that can describe the status of device 500. For example, as shown in
As described above with respect to rotational input component 210 of
For example, a user may gesture or impart a movement in the direction of arrow R along track 514 in such a way that user interface 522 scrolls forward through one additional listing 526 in the grid of listings 526. For example, user interface 522 may monitor an attribute of the user movement and update user interface 522A of
As well as handling various gesture types (e.g., user movement in the direction of arrows L and R), input component 510 may generate different instructions to the user interface of device 500 based on various attributes of a particular gesture type, similarly to device 200. The user interface may monitor at least a first attribute of a particular type of user movement received by the input component and thereby vary the way in which listings are scrolled. For example, user interface 522 may monitor an attribute of a user's movement on input component 510, such as the speed, length, or frequency of a user's movement in the direction of arrow R along track 514, and may vary the way in which listings 526 are scrolled with respect to indicator 528 based on that monitored attribute. In one embodiment, the listings 526 may be scrolled forwards or backwards one listing at a time (e.g., “elementally”) when a monitored attribute of a user movement is below a certain threshold (e.g., the speed of the movement is below a certain velocity) and may be scrolled differently than one listing at a time when the monitored attribute of the user movement is above a certain threshold.
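The threshold test described above can be sketched in a few lines. This is a hypothetical illustration under the assumption that the monitored attribute is a numeric speed; the function name, the mode labels, and the threshold value are illustrative, not from the specification.

```python
# Hypothetical sketch: the interface monitors one attribute of a user
# movement (here, its speed) and selects a scrolling mode by comparing
# it against a threshold.

ELEMENTAL = "elemental"   # advance one listing at a time
QUICK = "quick"           # jump by sublist

def choose_scroll_mode(speed, threshold=5.0):
    """Return the scrolling mode for a movement with the given speed."""
    return QUICK if speed >= threshold else ELEMENTAL
```

The same comparison could be applied to any other monitored attribute, such as the length or frequency of the movement.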
As described, a user may gesture in the direction of arrow R along track 514 in such a way that user interface 522 is updated with indicator 528 highlighting the forwardly consecutive listing 526 (e.g., as shown in the update of user interface 522 between interface 522A of
There are various ways in which a user interface may scroll through a list of listings other than one listing at a time. For example, rather than simply scrolling from an original listing to a new listing that is consecutive with the original listing in the list, the list may be broken down into several sublists and a user interface may scroll from an original listing in a first sublist of the list to a new listing that is either the initial listing in the first sublist or the initial listing in a sublist that is consecutive with the first sublist in the list.
In one embodiment, as shown in
Each listing 526 in the list of listings on user interface 522 may be included in one of the plurality of sublists of listings 526 based on a first characteristic of this first piece of metadata. For example, each listing 526 in the list of listings on user interface 522 may be included in one of a plurality of sublists of listings 526 based on a first characteristic of the photo album title metadata 654 associated with that listing. Photo album title metadata 654 may be a string of one or more alphanumeric characters (e.g., “ALBUM_1” or “ALBUM_2” or “ALBUM_3” or “ALBUM_4”). Therefore, each listing 526 in the list of listings on user interface 522 may be included in one of a plurality of sublists of listings 526 based on a first characteristic of the alphanumeric string, such as the entire string itself. As may be seen in
The listings 526 in each one of the plurality of sublists may be ordered within that sublist based on a first characteristic of a second piece of metadata. For example, each one of listings 526a-526g in the first sublist on user interface 522 may be ordered within that sublist based on a first characteristic of timestamp metadata 652. Timestamp metadata 652 may be a string of one or more alphanumeric characters. Therefore, each one of listings 526a-526g in the first sublist on user interface 522 may be ordered within that sublist based on a first characteristic of the alphanumeric string, such as the alphanumerical order of the string. For example, each one of listings 526a-526g in the first sublist on user interface 522 may therefore be ordered within that sublist as shown in
Finally, the plurality of sublists of listings 526 may be ordered within the list of listings 526 provided by user interface 522 based on the first characteristic of the first piece of metadata. For example, the first sublist containing listings 526a-526g and the second sublist containing listings 526h-526j may be ordered within the list of listings 526 provided by user interface 522 based on the first characteristic of the first piece of metadata (e.g., based on the alphanumerical order of the entire alphanumeric string of the photo album title metadata 654). For example, the first sublist containing listings 526a-526g and the second sublist containing listings 526h-526j may be ordered within the list of listings 526 provided by user interface 522 as shown in
A list of listings, such as listings 526 of
Alternatively, user interface 522 may scroll from an original listing in a first sublist of the list to a new listing that is either (1) the initial listing in a second sublist that is consecutive with the first sublist in the list or (2) the initial listing in the first sublist (i.e., “quickly-scroll”). For example, user interface 522 may scroll forwardly from an original listing 526b in a first sublist containing listings 526a-526g, as shown in user interface 522B of
Likewise, user interface 522 may scroll backwardly from an original listing 526l in a first sublist containing listings 526k and 526l, as shown in user interface 522E of
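The quick-scroll targets described above can be sketched as index arithmetic over the initial listings of each sublist. This is a hypothetical illustration: `starts` (the index of each sublist's initial listing, in list order) and the function names are assumptions, not from the specification.

```python
# Hypothetical sketch of quick-scroll target selection. `starts` holds the
# index of the initial listing of each sublist, in ascending list order.

def quick_scroll_forward(index, starts):
    """Jump to the initial listing of the next sublist, if any."""
    for s in starts:
        if s > index:
            return s
    return index  # already in the last sublist; nowhere further to jump

def quick_scroll_backward(index, starts):
    """Jump to the initial listing of the current sublist, or of the
    previous sublist when already at an initial listing."""
    current = max(s for s in starts if s <= index)
    if index > current:
        return current  # mid-sublist: jump to this sublist's initial listing
    older = [s for s in starts if s < current]
    return older[-1] if older else current
```

Forward quick-scrolling always crosses into the next sublist, while backward quick-scrolling first snaps to the current sublist's initial listing before crossing into the previous one.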
These thresholds of various attributes of various user movements that may be monitored by user interface 522 to determine whether to “elementally-scroll” or “quickly-scroll” through the listings 526 provided on output component 520 may be determined by a user of device 500 or the manufacturer of device 500. For example, a user may select a threshold based on how many entries are in the library through which he or she wishes to scroll. Alternatively, a user may select a threshold based on his or her dexterity using the input component. These thresholds may be stored locally on the device (e.g., memory 104 of
Therefore, according to an embodiment of the invention, user interface 522 of device 500 may quickly and easily switch between a first “elemental-scrolling” mode and a second “quick-scrolling” mode for updating the displayed portion of a list of descriptive entry listings 526 on output component 520 in response to a particular type of user movement of input component 510. This can improve the speed and ease with which a user may search for a particular entry within an extensive library of entries.
In addition to changing the way in which descriptive entry listings 526 are scrolled on output component 520 in response to a particular type of user movement of input component 510, user interface 522 may also change the type of information transmitted to the user in response to a particular type of user movement of input component 510, similarly to user interface 222 of
Visual enhancer 530 may be any additional information, such as an icon or image or string of one or more alphanumerical characters, that is descriptive of or related to at least one characteristic of the new listing or the second sublist (i.e., the sublist that contains the new listing). For example, as shown in
In one embodiment, user interface 522 may continuously show visual enhancer 530 as long as the user interface continues to quick-scroll through the listings. For example, if user interface 522 continues to quick-scroll forwardly from listing 526h in a first sublist to a new listing 526k that is the initial listing in a forwardly consecutive second sublist, as shown in the update from user interface 522C of
When user interface 522 terminates quick-scrolling and begins elemental-scrolling, for example, visual enhancer 530 may also be terminated. For example, if user interface 522 stops quick-scrolling but continues to update the listings 526 displayed on output component 520 by elementally-scrolling forwardly from listing 526k to downwardly consecutive listing 526l, as shown in the update from user interface 522D of
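The enhancer lifecycle described above (shown while quick-scrolling continues, removed once elemental scrolling resumes) can be sketched as a small piece of state. The class and attribute names are hypothetical illustrations.

```python
# Hypothetical sketch: a visual enhancer that stays visible as long as the
# interface keeps quick-scrolling and is removed when it returns to
# elemental scrolling.

class VisualEnhancer:
    def __init__(self):
        self.visible = False

    def on_scroll(self, mode):
        # Show during quick-scrolling; hide again on elemental scrolling.
        self.visible = (mode == "quick")
```

Each scroll event simply re-evaluates visibility, so consecutive quick-scrolls keep the enhancer continuously on screen.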
It is to be noted, however, that in accordance with an embodiment of the invention, user interface 522 may provide visual enhancer 530 along with an updated set of listings 526 even when it is not quick-scrolling. For example, user interface 522 may once again provide visual enhancer 530 if the interface elementally-scrolls forwardly through the listings from listing 526l to listing 526k, as shown in the update from user interface 522E of
The situations in which user interface 522 may provide a visual enhancer, such as visual enhancer 530 of
As an alternative or in addition to visually enhancing an updated set of listings 526 with a visual enhancer 530, user interface 522 may enhance itself aurally. As shown in
For example, when user interface 522 elementally-scrolls forwardly from an original listing 526a to forwardly consecutive listing 526b, user interface 522 may enhance itself aurally by transmitting a first sound 541 via output component 540 while also updating the set of listings 526 on output component 520 (e.g., as shown in the update from user interface 522A of
First sound 541 and second sound 542 may each be a single tone or a much more complex sound, such as a song. In one embodiment, first sound 541 may be a single short clicking sound indicative of the short scrolling between consecutive listings 526a and 526b, while second sound 542 may be a longer clunking sound indicative of the quick-scrolling between listings 526b and 526h of different sublists. The same first sound 541 may be transmitted by user interface 522 every time it elementally-scrolls between two listings and the same second sound 542 may be transmitted by user interface 522 every time it quickly-scrolls between two listings. This may help a user to more quickly and more easily realize how he or she is scrolling through the listings.
For example, when user interface 522 continues to quick-scroll forwardly from an original listing 526h to a new listing 526k that is the initial listing in a forwardly consecutive sublist, user interface 522 may enhance itself aurally by once again transmitting second sound 542 via output component 540 while also updating the set of listings 526 on output component 520 (e.g., as shown in the update from user interface 522C of
However, there are various other ways in which user interface 522 can transmit different sounds via output component 540 for increasing the ease and speed with which a user may scroll through a list of listings 526. For example, in another embodiment, the sound transmitted by user interface 522 via output component 540 may be specifically associated with the listing being highlighted by indicator 528. For example, when user interface 522 scrolls to a new listing 526m (e.g., by elementally scrolling forwardly from an original listing 526l), the user interface may enhance itself aurally by transmitting via output component 540 a third sound 543 that is in some way related to new listing 526m (e.g., as shown in the update from user interface 522E of
The situations in which user interface 522 may provide an aural enhancement via output component 540 may be determined by a user of device 500 or the manufacturer of device 500. For example, a user may wish to be provided with aural enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with aural enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
As an alternative or in addition to visually enhancing an updated set of listings 526 with a visual enhancer 530 and/or aurally enhancing an updated set of listings 526 with sounds via an audio output component 540, user interface 522 may enhance itself haptically or tactilely. As shown in
For example, when user interface 522 elementally-scrolls forwardly from an original listing 526a to forwardly consecutive listing 526b, user interface 522 may enhance itself haptically by transmitting a first haptic signal 551 via output component 550 while also updating the set of listings 526 on output component 520 (e.g., as shown in the update from user interface 522A of
First haptic signal 551 and second haptic signal 552 may each be a single force or a much more complex motion, such as a steady beat. In one embodiment, first haptic signal 551 may provide a single short vibrating sensation to the user that is indicative of the short scrolling between consecutive listings 526a and 526b, while second haptic signal 552 may provide a longer and more powerful vibrating sensation to the user that is indicative of the quick-scrolling between listings 526b and 526h of different sublists. The same first haptic signal 551 may be transmitted by user interface 522 every time it elementally-scrolls between two listings and the same second haptic signal 552 may be transmitted by user interface 522 every time it quickly-scrolls between two listings. This may help a user to more quickly and more easily realize how he or she is scrolling through the listings.
For example, when user interface 522 continues to quick-scroll forwardly from an original listing 526h to a new listing 526k that is the initial listing in a forwardly consecutive sublist, user interface 522 may enhance itself haptically by once again transmitting second haptic signal 552 via output component 550 while also updating the set of listings 526 on output component 520 (e.g., as shown in the update from user interface 522C of
However, there are various other ways in which user interface 522 can transmit different haptic signals via output component 550 for increasing the ease and speed with which a user may scroll through a list of listings 526. For example, in another embodiment, the haptic signal transmitted by user interface 522 via output component 550 may be specifically associated with the listing being highlighted by indicator 528. For example, when user interface 522 scrolls to a new listing 526m (e.g., by elementally scrolling forwardly from an original listing 526l), the user interface may enhance itself haptically by transmitting via output component 550 a third haptic signal 553 that is in some way related to new listing 526m (e.g., as shown in the update from user interface 522E of
The situations in which user interface 522 may provide haptic or tactile enhancement via output component 550 may be determined by a user of device 500 or the manufacturer of device 500. For example, a user may wish to be provided with haptic enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with haptic enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
In accordance with one embodiment of the invention, device 700 can permit a user to load and browse through one or more large libraries of media or data. Each library may be stored in a memory component of the device (e.g., memory 104 of
For example, a particular piece of metadata 850 that may be associated with a geographic file 840 of a particular geographic entry 826 in library 800 is textual information metadata. Such textual information may be a string of one or more alphanumeric characters representative or descriptive of the geographic location (e.g., geographic description metadata 851), the latitudinal and longitudinal coordinates of the geographic location (e.g., coordinate metadata 852), the local standard time of the geographic location (e.g., time zone metadata 854), or any other facet of the location, such as the name of the dominant language spoken in that location, for example. As shown, geographic description metadata 851 for each entry 826 may be a string of one or more alphanumeric characters representative or descriptive of the geographic location (e.g., as shown in
Similarly, coordinate metadata 852 for each entry 826 may be a string of one or more alphanumeric characters representative or descriptive of the latitudinal and longitudinal coordinates of the geographic location (e.g., as shown in
Another particular piece of metadata 850 that may be associated with a geographic location file 840 of a particular geographic entry 826 in library 800 is graphical information. Such graphical information may be a small image file (e.g., thumbnail metadata 855) related to any facet of the location, such as a picture of that location, for example. As shown, thumbnail metadata 855 for each entry 826 may be a thumbnail image file (e.g., as shown in
Yet another particular piece of metadata 850 that may be associated with a geographic location file 840 of a particular geographic entry 826 in library 800 is audio information. Such audio information may be an audio file related to the associated payload geographic location file 840, such as a recorded description of the location or a pronunciation of the location in its native tongue (e.g., audio clip metadata 856). As shown, audio clip metadata 856 for each entry 826 may be an audio file related to the associated payload geographic location file 840 (e.g., as shown in
As described above with respect to picture library 600 of
As mentioned, each library (e.g., library 800) or any particular portions of a library (e.g., metadata 855) may be stored in any memory component of device 700 (e.g., memory 104 of
According to an embodiment of the invention, device 700 may include a user interface that allows a user to quickly and easily alternate between two or more modes of scrolling through a list of library entries. For example, like electronic device 500 of
Particularly, in the embodiment of
User interface 722 may also include a highlighter or selector indicator 728 that can differentiate one or more specific descriptive entry listings 726 from the other listings 726 displayed on output component 720 at any given time (e.g., listing 726a in user interface 722A of
User interface 722 may also include a status portion 724 that can describe the status of device 700. For example, as shown in
As described above with respect to rotational input component 510 of
For example, a user may gesture or impart a movement in the direction of arrow R along track 714 in such a way that user interface 722 scrolls downward through one additional listing 726 of the location-based map of listings 726. For example, user interface 722 may monitor an attribute of the user movement and update user interface 722A of
As well as handling various gesture types (e.g., user movement in the direction of arrows L and R), input component 710 may generate different instructions to the user interface of device 700 based on various attributes of a particular gesture type, similarly to device 500. The user interface may monitor at least a first attribute of a particular type of user movement received by the input component and thereby vary the way in which listings are scrolled. For example, user interface 722 may monitor an attribute of a user's movement on input component 710, such as the speed, length, or frequency of a user's movement in the direction of arrow R along track 714, and may vary the way in which listings 726 are scrolled with respect to indicator 728 based on that monitored attribute. In one embodiment, listings 726 may be scrolled downwards or upwards one listing at a time (e.g., “elementally”) when a monitored attribute of a user movement is below a certain threshold (e.g., the speed of the movement is below a certain velocity) and may be scrolled differently than one listing at a time when the monitored attribute of the user movement is above a certain threshold.
As described, a user may gesture in the direction of arrow R along track 714 in such a way that user interface 722 is updated with indicator 728 highlighting the downwardly consecutive listing 726 (e.g., as shown in the update of user interface 722 between interface 722A of
There are various ways in which a user interface may scroll through a list of listings other than one listing at a time. For example, rather than simply scrolling from an original listing to a new listing that is consecutive with the original listing in the list, the list may be broken down into several sublists and a user interface may scroll from an original listing in a first sublist of the list to a new listing that is either the initial listing in the first sublist or the initial listing in a sublist that is consecutive with the first sublist in the list.
In one embodiment, as shown in
Each listing 726 in the list of listings on user interface 722 may be included in one of the plurality of sublists of listings 726 based on a first characteristic of this first piece of metadata. For example, each listing 726 in the list of listings on user interface 722 may be included in one of a plurality of sublists of listings 726 based on a first characteristic of the time zone metadata 854 associated with that listing. Time zone metadata 854 may be a string of one or more alphanumeric characters (e.g., “−08:00 GMT (PST)” or “−07:00 GMT (MST)” or “−06:00 GMT (CST)” or “−05:00 GMT (EST)”). Therefore, each listing 726 in the list of listings on user interface 722 may be included in one of a plurality of sublists of listings 726 based on a first characteristic of the alphanumeric string, such as the entire string itself. As may be seen in
Listings 726 in each one of the plurality of sublists may be ordered within that sublist based on a first characteristic of a second piece of metadata. For example, each one of listings 726a-726g in the first sublist on user interface 722 may be ordered within that sublist based on a first characteristic of coordinates metadata 852. Coordinates metadata 852 may be a string of one or more alphanumeric characters. Therefore, each one of listings 726a-726g in the first sublist on user interface 722A may be ordered within that sublist based on a first characteristic of the alphanumeric string, such as the alphanumerical order of the string. For example, each one of listings 726a-726g in the first sublist on user interface 722A may therefore be ordered within that sublist as shown in
Finally, the plurality of sublists of listings 726 may be ordered within the list of listings 726 provided by user interface 722 based on the first characteristic of the first piece of metadata. For example, the first sublist containing listings 726a-726g and the second sublist containing listings 726h-726j may be ordered within the location-based map of listings 726 provided by user interface 722 based on the first characteristic of the first piece of metadata (e.g., based on the alphanumerical order of the entire alphanumeric string of the time zone metadata 854). For example, the first sublist containing listings 726a-726g and the second sublist containing listings 726h-726j may be ordered within the location-based map of listings 726 provided by user interface 722 as shown in
As shown, this location-based map of listings 726 provided by user interface 722 may display each sublist within a separate substantially vertical column on map 727 (e.g., one of columns 727_1, 727_2, 727_3, and 727_4 of
A list of listings that is broken down into several sublists of listings, such as listings 726 of
Alternatively, user interface 722 may scroll from an original listing in a first sublist of the list to a new listing that is either (1) the initial listing in a second sublist that is consecutive with the first sublist in the list or (2) the initial listing in the first sublist (i.e., “quickly-scroll”). For example, user interface 722 may scroll forwardly from an original listing 726b in a first sublist containing listings 726a-726g, as shown in user interface 722B of
Likewise, user interface 722 may scroll backwardly from an original listing 726l in a first sublist containing listings 726k and 726l, as shown in user interface 722E of
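The quick-scrolling behavior of the two preceding paragraphs can be sketched as a target-selection routine (a sketch only; the function name and list-of-lists representation are illustrative assumptions):

```python
def quick_scroll(sublists, current, forward=True):
    """Return the listing reached by one quick-scroll from `current`.

    Forwardly: jump to the initial listing of the forwardly consecutive
    sublist. Backwardly: jump to the initial listing of the current
    sublist, or of the backwardly consecutive sublist if `current` is
    already its sublist's initial listing.
    `sublists` is a list of sublists of listings (illustrative structure).
    """
    for i, sub in enumerate(sublists):
        if current in sub:
            if forward:
                # At the end of the list, stay on the current listing.
                return sublists[i + 1][0] if i + 1 < len(sublists) else current
            if current != sub[0]:
                return sub[0]
            return sublists[i - 1][0] if i > 0 else current
    raise ValueError("listing not found in any sublist")
```

For example, quick-scrolling forwardly from the second listing of one sublist lands on the initial listing of the next sublist, while quick-scrolling backwardly from it lands on the initial listing of its own sublist.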
These thresholds of various attributes of various user movements that may be monitored by user interface 722 to determine whether to “elementally-scroll” or “quickly-scroll” through the listings 726 provided on output component 720 may be determined by a user of device 700 or the manufacturer of device 700. For example, a user may select a threshold based on how many entries are in the library through which he or she wishes to scroll. Alternatively, a user may select a threshold based on his or her dexterity using the input component. These thresholds may be stored locally on the device (e.g., memory 104 of
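The threshold comparison itself may be as simple as the following (names and the default value are illustrative assumptions, not specified by the description):

```python
DEFAULT_THRESHOLD = 5.0  # illustrative user- or manufacturer-selected value

def choose_scroll_mode(movement_attribute, threshold=DEFAULT_THRESHOLD):
    """Select a scrolling mode from a monitored attribute of a user
    movement (e.g., its speed or length). Above the threshold the
    interface quick-scrolls between sublists; otherwise it
    elementally-scrolls between consecutive listings.
    """
    return "quick-scroll" if movement_attribute > threshold else "elemental-scroll"
```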
Therefore, according to an embodiment of the invention, user interface 722 of device 700 may quickly and easily switch between a first “elemental-scrolling” mode and a second “quick-scrolling” mode for updating the displayed portion of a list of descriptive entry listings 726 on output component 720 in response to a particular type of user movement of input component 710. This can improve the speed and ease with which a user may search for a particular entry within an extensive library of entries.
In addition to changing the way in which descriptive entry listings 726 are scrolled on output component 720 in response to a particular type of user movement of input component 710, user interface 722 may also change the type of information transmitted to the user in response to a particular type of user movement of input component 710, similarly to user interface 222 of
Visual enhancer 730 may be any additional or updated information, such as an icon or image or string of one or more alphanumeric characters, that is descriptive of or related to at least one characteristic of the new listing or the second sublist (i.e., the sublist that contains the new listing). For example, as shown in
In one embodiment, user interface 722 may continuously show visual enhancer 730 as long as the user interface continues to quick-scroll through the listings. For example, if user interface 722 continues to quick-scroll forwardly from listing 726h in a first sublist to a new listing 726k that is the initial listing in a forwardly consecutive second sublist, as shown in the update from user interface 722C of
When user interface 722 terminates quick-scrolling and begins elemental-scrolling, for example, visual enhancer 730 may also be terminated. For example, if user interface 722 stops quick-scrolling but continues to update the listings 726 displayed on output component 720 by elementally-scrolling forwardly from listing 726k to downwardly consecutive listing 726l, as shown in the update from user interface 722D of
It is to be noted, however, that in accordance with an embodiment of the invention, user interface 722 may provide visual enhancer 730 along with an updated set of listings 726 even when it is not quick-scrolling. For example, user interface 722 may once again provide visual enhancer 730 if the interface elementally-scrolls backwardly through the listings from listing 726l to listing 726k, as shown in the update from user interface 722E of
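The visual enhancer lifecycle described above can be sketched as a single decision function (a sketch under illustrative assumptions; the parameter names and the configuration flag are hypothetical):

```python
def show_visual_enhancer(mode, new_listing, show_on_sublist_initial=False, sublists=None):
    """Decide whether a visual enhancer should be displayed after a
    scroll step. It is shown continuously while quick-scrolling and
    terminated when elemental-scrolling resumes, unless the interface is
    configured to also show it when an elemental scroll lands on the
    initial listing of a sublist.
    """
    if mode == "quick-scroll":
        return True
    if show_on_sublist_initial and sublists is not None:
        # e.g., elementally-scrolling backwardly onto a sublist's initial listing
        return any(sub and sub[0] == new_listing for sub in sublists)
    return False
```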
The situations in which user interface 722 may provide a visual enhancer, such as visual enhancer 730 of
As an alternative or in addition to visually enhancing an updated set of listings 726 with a visual enhancer 730, user interface 722 may enhance itself aurally. As shown in
For example, when user interface 722 elementally-scrolls forwardly from an original listing 726a to forwardly consecutive listing 726b, user interface 722 may enhance itself aurally by transmitting a first sound 741 via output component 740 while also updating the set of listings 726 on output component 720 (e.g., as shown in the update from user interface 722A of
First sound 741 and second sound 742 may each be a single tone or a much more complex sound, such as a song. In one embodiment, first sound 741 may be a single short clicking sound indicative of the short scrolling between consecutive listings 726a and 726b, while second sound 742 may be a longer clunking sound indicative of the quick-scrolling between listings 726b and 726h of different sublists. The same first sound 741 may be transmitted by user interface 722 every time it elementally-scrolls between two listings and the same second sound 742 may be transmitted by user interface 722 every time it quickly-scrolls between two listings. This may help a user to more quickly and more easily realize how he or she is scrolling through the listings.
For example, when user interface 722 continues to quick-scroll downwardly from an original listing 726h to a new listing 726k that is the initial listing in a downwardly consecutive sublist, user interface 722 may enhance itself aurally by once again transmitting second sound 742 via output component 740 while also updating the set of listings 726 on output component 720 (e.g., as shown in the update from user interface 722C of
However, there are various other ways in which user interface 722 can transmit different sounds via output component 740 for increasing the ease and speed with which a user may scroll through a list of listings 726. For example, in another embodiment, the sound transmitted by user interface 722 via output component 740 may be specifically associated with the listing being highlighted by indicator 728. For example, when user interface 722 scrolls to a new listing 726m (e.g., by elementally scrolling forwardly from an original listing 726l), the user interface may enhance itself aurally by transmitting via output component 740 a third sound 743 that is in some way related to new listing 726m (e.g., as shown in the update from user interface 722E of
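The sound-selection behavior of the preceding paragraphs can be sketched as follows (asset names and the mapping structure are illustrative assumptions; the haptic signals described below could be dispatched the same way):

```python
ELEMENTAL_SOUND = "short_click.wav"  # illustrative first sound
QUICK_SOUND = "long_clunk.wav"       # illustrative second sound

def select_feedback_sound(mode, listing_sounds=None, new_listing=None):
    """Pick the sound to transmit alongside a scroll update.

    The same first sound accompanies every elemental scroll and the same
    second sound every quick-scroll; alternatively, a sound specifically
    associated with the newly highlighted listing takes precedence.
    """
    if listing_sounds and new_listing in listing_sounds:
        return listing_sounds[new_listing]
    return QUICK_SOUND if mode == "quick-scroll" else ELEMENTAL_SOUND
```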
The situations in which user interface 722 may provide an aural enhancement via output component 740 may be determined by a user of device 700 or the manufacturer of device 700. For example, a user may wish to be provided with aural enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with aural enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
As an alternative or in addition to visually enhancing an updated set of listings 726 with a visual enhancer 730 and/or aurally enhancing an updated set of listings 726 with sounds via an audio output component 740, user interface 722 may enhance itself haptically or tactilely. As shown in
For example, when user interface 722 elementally-scrolls forwardly from an original listing 726a to forwardly consecutive listing 726b, user interface 722 may enhance itself haptically by transmitting a first haptic signal 751 via output component 750 while also updating the set of listings 726 on output component 720 (e.g., as shown in the update from user interface 722A of
First haptic signal 751 and second haptic signal 752 may each be a single force or a much more complex motion, such as a steady beat. In one embodiment, first haptic signal 751 may provide a single short vibrating sensation to the user that is indicative of the short scrolling between consecutive listings 726a and 726b, while second haptic signal 752 may provide a longer and more powerful vibrating sensation to the user that is indicative of the quick-scrolling between listings 726b and 726h of different sublists. The same first haptic signal 751 may be transmitted by user interface 722 every time it elementally-scrolls between two listings and the same second haptic signal 752 may be transmitted by user interface 722 every time it quickly-scrolls between two listings. This may help a user to more quickly and more easily realize how he or she is scrolling through the listings.
For example, when user interface 722 continues to quick-scroll downwardly from an original listing 726h to a new listing 726k that is the initial listing in a forwardly consecutive sublist, user interface 722 may enhance itself haptically by once again transmitting second haptic signal 752 via output component 750 while also updating the set of listings 726 on output component 720 (e.g., as shown in the update from user interface 722C of
However, there are various other ways in which user interface 722 can transmit different haptic signals via output component 750 for increasing the ease and speed with which a user may scroll through a list of listings 726. For example, in another embodiment, the haptic signal transmitted by user interface 722 via output component 750 may be specifically associated with the listing being highlighted by indicator 728. For example, when user interface 722 scrolls to a new listing 726m (e.g., by elementally scrolling forwardly from an original listing 726l), the user interface may enhance itself haptically by transmitting via output component 750 a third haptic signal 753 that is in some way related to new listing 726m (e.g., as shown in the update from user interface 722E of
The situations in which user interface 722 may provide haptic or tactile enhancement via output component 750 may be determined by a user of device 700 or the manufacturer of device 700. For example, a user may wish to be provided with haptic enhancement only when he or she is quick-scrolling. Alternatively, a user may wish to be provided with haptic enhancement whenever he or she scrolls to a listing that is an initial listing in a sublist of the list of listings. These preferences may be fully customizable and may be stored locally on the device (e.g., memory 104 of
While there have been described systems and methods for improving the scrolling of user interfaces of electronic devices, it is to be understood that many changes may be made therein without departing from the spirit and scope of the present invention. For example, many other types of payload data may be scrolled according to the invention, such as video files, contact information, word processing documents, and the like. It will also be understood that various directional and orientational terms such as “up” and “down,” “left” and “right,” “top” and “bottom,” “side” and “edge” and “corner,” “height” and “width” and “depth,” “horizontal” and “vertical,” and the like are used herein only for convenience, and that no fixed or absolute directional or orientational limitations are intended by the use of these words. For example, the devices of this invention can have any desired orientation. If reoriented, different directional or orientational terms may need to be used in their description, but that will not alter their fundamental nature as within the scope and spirit of this invention. Those skilled in the art will appreciate that the invention can be practiced by other than the described embodiments, which are presented for purposes of illustration rather than of limitation, and the invention is limited only by the claims which follow.
This application claims the benefit of U.S. Provisional Patent Application No. 60/967,457, filed Sep. 4, 2007, which is hereby incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
1061578 | Wischhusen et al. | May 1913 | A |
2063276 | Thomas | Dec 1936 | A |
2798907 | Schneider | Jul 1957 | A |
2903229 | Landge | Sep 1959 | A |
2945111 | McCormick | Jul 1960 | A |
3005055 | Mattke | Oct 1961 | A |
3965399 | Walker et al. | Jun 1976 | A |
3996441 | Ohashi | Dec 1976 | A |
4029915 | Ojima | Jun 1977 | A |
4103252 | Bobick | Jul 1978 | A |
4110749 | Janko et al. | Aug 1978 | A |
4115670 | Chandler | Sep 1978 | A |
4121204 | Welch et al. | Oct 1978 | A |
4129747 | Pepper, Jr. | Dec 1978 | A |
4158216 | Bigelow | Jun 1979 | A |
4242676 | Piguet et al. | Dec 1980 | A |
4246452 | Chandler | Jan 1981 | A |
4264903 | Bigelow | Apr 1981 | A |
4266144 | Bristol | May 1981 | A |
4293734 | Pepper, Jr. | Oct 1981 | A |
D264969 | McGourty | Jun 1982 | S |
4338502 | Hashimoto et al. | Jul 1982 | A |
4380007 | Steinegger | Apr 1983 | A |
4380040 | Posset | Apr 1983 | A |
4394649 | Suchoff et al. | Jul 1983 | A |
4475008 | Doi et al. | Oct 1984 | A |
4570149 | Thornburg et al. | Feb 1986 | A |
4583161 | Gunderson et al. | Apr 1986 | A |
4587378 | Moore | May 1986 | A |
4604786 | Howie, Jr. | Aug 1986 | A |
4613736 | Shichijo et al. | Sep 1986 | A |
4644100 | Brenner et al. | Feb 1987 | A |
4719524 | Morishima et al. | Jan 1988 | A |
4734034 | Maness et al. | Mar 1988 | A |
4736191 | Matzke et al. | Apr 1988 | A |
4739191 | Puar | Apr 1988 | A |
4739299 | Eventoff et al. | Apr 1988 | A |
4752655 | Tajiri et al. | Jun 1988 | A |
4755765 | Ferland | Jul 1988 | A |
4764717 | Tucker et al. | Aug 1988 | A |
4771139 | DeSmet | Sep 1988 | A |
4798919 | Miessler et al. | Jan 1989 | A |
4810992 | Eventoff | Mar 1989 | A |
4822957 | Talmage, Jr. et al. | Apr 1989 | A |
4831359 | Newell | May 1989 | A |
4849852 | Mullins | Jul 1989 | A |
4856993 | Maness et al. | Aug 1989 | A |
4860768 | Hon et al. | Aug 1989 | A |
4866602 | Hall | Sep 1989 | A |
4876524 | Jenkins | Oct 1989 | A |
4897511 | Itaya et al. | Jan 1990 | A |
4914624 | Dunthorn | Apr 1990 | A |
4917516 | Retter | Apr 1990 | A |
4943889 | Ohmatoi | Jul 1990 | A |
4951036 | Grueter et al. | Aug 1990 | A |
4954823 | Binstead | Sep 1990 | A |
4976435 | Shatford et al. | Dec 1990 | A |
4990900 | Kikuchi | Feb 1991 | A |
5008497 | Asher | Apr 1991 | A |
5036321 | Leach et al. | Jul 1991 | A |
5053757 | Meadows | Oct 1991 | A |
5086870 | Bolduc | Feb 1992 | A |
5125077 | Hall | Jun 1992 | A |
5159159 | Asher | Oct 1992 | A |
5179648 | Hauck | Jan 1993 | A |
5186646 | Pederson | Feb 1993 | A |
5192082 | Inoue et al. | Mar 1993 | A |
5193669 | Demeo et al. | Mar 1993 | A |
5231326 | Echols | Jul 1993 | A |
5237311 | Mailey et al. | Aug 1993 | A |
5278362 | Ohashi | Jan 1994 | A |
5305017 | Gerpheide | Apr 1994 | A |
5313027 | Inoue et al. | May 1994 | A |
D349280 | Kaneko | Aug 1994 | S |
5339213 | Ocallaghan | Aug 1994 | A |
5367199 | Lefkowitz et al. | Nov 1994 | A |
5374787 | Miller et al. | Dec 1994 | A |
5379057 | Clough et al. | Jan 1995 | A |
5404152 | Nagai | Apr 1995 | A |
5408621 | Ben-Arie | Apr 1995 | A |
5414445 | Kaneko et al. | May 1995 | A |
5416498 | Grant | May 1995 | A |
5424756 | Ho et al. | Jun 1995 | A |
5432531 | Calder et al. | Jul 1995 | A |
5438331 | Gilligan et al. | Aug 1995 | A |
D362431 | Kaneko et al. | Sep 1995 | S |
5450075 | Waddington | Sep 1995 | A |
5453761 | Tanaka | Sep 1995 | A |
5473343 | Kimmich et al. | Dec 1995 | A |
5473344 | Bacon et al. | Dec 1995 | A |
5479192 | Carroll, Jr. et al. | Dec 1995 | A |
5494157 | Golenz et al. | Feb 1996 | A |
5495566 | Kwatinetz | Feb 1996 | A |
5508703 | Okamura et al. | Apr 1996 | A |
5508717 | Miller | Apr 1996 | A |
5543588 | Bisset et al. | Aug 1996 | A |
5543591 | Gillespie et al. | Aug 1996 | A |
5555004 | Ono et al. | Sep 1996 | A |
5559301 | Bryan, Jr. et al. | Sep 1996 | A |
5559943 | Cyr et al. | Sep 1996 | A |
5561445 | Miwa et al. | Oct 1996 | A |
5564112 | Hayes et al. | Oct 1996 | A |
5565887 | McCambridge et al. | Oct 1996 | A |
5578817 | Bidiville et al. | Nov 1996 | A |
5581670 | Bier et al. | Dec 1996 | A |
5585823 | Duchon et al. | Dec 1996 | A |
5589856 | Stein et al. | Dec 1996 | A |
5589893 | Gaughan et al. | Dec 1996 | A |
5596347 | Robertson et al. | Jan 1997 | A |
5596697 | Foster et al. | Jan 1997 | A |
5598183 | Robertson et al. | Jan 1997 | A |
5611040 | Brewer et al. | Mar 1997 | A |
5611060 | Belfiore et al. | Mar 1997 | A |
5613137 | Bertram et al. | Mar 1997 | A |
5617114 | Bier et al. | Apr 1997 | A |
5627531 | Posso et al. | May 1997 | A |
5632679 | Tremmel | May 1997 | A |
5640258 | Kurashima et al. | Jun 1997 | A |
5648642 | Miller et al. | Jul 1997 | A |
D382550 | Kaneko et al. | Aug 1997 | S |
5657012 | Tait | Aug 1997 | A |
5661632 | Register | Aug 1997 | A |
D385542 | Kaneko et al. | Oct 1997 | S |
5675362 | Clough et al. | Oct 1997 | A |
5689285 | Asher | Nov 1997 | A |
5721849 | Amro | Feb 1998 | A |
5726687 | Belfiore et al. | Mar 1998 | A |
5729219 | Armstrong et al. | Mar 1998 | A |
5730165 | Philipp | Mar 1998 | A |
5748185 | Stephan et al. | May 1998 | A |
5751274 | Davis | May 1998 | A |
5754890 | Holmdahl et al. | May 1998 | A |
5764066 | Novak et al. | Jun 1998 | A |
5777605 | Yoshinobu et al. | Jul 1998 | A |
5786818 | Brewer et al. | Jul 1998 | A |
5790769 | Buxton et al. | Aug 1998 | A |
5798752 | Buxton et al. | Aug 1998 | A |
5805144 | Scholder et al. | Sep 1998 | A |
5808602 | Sellers | Sep 1998 | A |
5812239 | Eger | Sep 1998 | A |
5812498 | Teres | Sep 1998 | A |
5815141 | Phares | Sep 1998 | A |
5825351 | Tam | Oct 1998 | A |
5825352 | Bisset et al. | Oct 1998 | A |
5825353 | Will | Oct 1998 | A |
5828364 | Siddiqui | Oct 1998 | A |
5838304 | Hall | Nov 1998 | A |
5841078 | Miller et al. | Nov 1998 | A |
5841423 | Carroll, Jr. et al. | Nov 1998 | A |
D402281 | Ledbetter et al. | Dec 1998 | S |
5850213 | Imai et al. | Dec 1998 | A |
5856645 | Norton | Jan 1999 | A |
5856822 | Du et al. | Jan 1999 | A |
5859629 | Tognazzini | Jan 1999 | A |
5861875 | Gerpheide | Jan 1999 | A |
5869791 | Young | Feb 1999 | A |
5875311 | Bertram et al. | Feb 1999 | A |
5883619 | Ho et al. | Mar 1999 | A |
5889236 | Gillespie et al. | Mar 1999 | A |
5889511 | Ong et al. | Mar 1999 | A |
5894117 | Kamishima | Apr 1999 | A |
5903229 | Kishi | May 1999 | A |
5907152 | Dandliker et al. | May 1999 | A |
5907318 | Medina | May 1999 | A |
5909211 | Combs et al. | Jun 1999 | A |
5910802 | Shields et al. | Jun 1999 | A |
5914706 | Kono | Jun 1999 | A |
5923388 | Kurashima et al. | Jul 1999 | A |
D412940 | Kato et al. | Aug 1999 | S |
5933102 | Miller et al. | Aug 1999 | A |
5933141 | Smith | Aug 1999 | A |
5936619 | Nagasaki et al. | Aug 1999 | A |
5943044 | Martinelli et al. | Aug 1999 | A |
5953000 | Weirich | Sep 1999 | A |
5956019 | Bang et al. | Sep 1999 | A |
5959610 | Silfvast | Sep 1999 | A |
5959611 | Smailagic et al. | Sep 1999 | A |
5964661 | Dodge | Oct 1999 | A |
5973668 | Watanabe | Oct 1999 | A |
6000000 | Hawkins et al. | Dec 1999 | A |
6002093 | Hrehor et al. | Dec 1999 | A |
6002389 | Kasser | Dec 1999 | A |
6005299 | Hengst | Dec 1999 | A |
6025832 | Sudo et al. | Feb 2000 | A |
6031518 | Adams et al. | Feb 2000 | A |
6034672 | Gaultier et al. | Mar 2000 | A |
6057829 | Silfvast | May 2000 | A |
6075533 | Chang | Jun 2000 | A |
6084574 | Bidiville | Jul 2000 | A |
D430169 | Scibora | Aug 2000 | S |
6097372 | Suzuki | Aug 2000 | A |
6104790 | Narayanaswami | Aug 2000 | A |
6122526 | Parulski et al. | Sep 2000 | A |
6124587 | Bidiville | Sep 2000 | A |
6128006 | Rosenberg et al. | Oct 2000 | A |
6131048 | Sudo et al. | Oct 2000 | A |
6141068 | Iijima | Oct 2000 | A |
6147856 | Karidis | Nov 2000 | A |
6163312 | Furuya | Dec 2000 | A |
6166721 | Kuroiwa et al. | Dec 2000 | A |
6179496 | Chou | Jan 2001 | B1 |
6181322 | Nanavati | Jan 2001 | B1 |
D437860 | Suzuki et al. | Feb 2001 | S |
6188391 | Seely et al. | Feb 2001 | B1 |
6188393 | Shu | Feb 2001 | B1 |
6191774 | Schena et al. | Feb 2001 | B1 |
6198054 | Janniere | Mar 2001 | B1 |
6198473 | Armstrong | Mar 2001 | B1 |
6211861 | Rosenberg et al. | Apr 2001 | B1 |
6219038 | Cho | Apr 2001 | B1 |
6222528 | Gerpheide et al. | Apr 2001 | B1 |
D442592 | Ledbetter et al. | May 2001 | S |
6225976 | Yates et al. | May 2001 | B1 |
6225980 | Weiss et al. | May 2001 | B1 |
6226534 | Aizawa | May 2001 | B1 |
6227966 | Yokoi | May 2001 | B1 |
6229456 | Engholm et al. | May 2001 | B1 |
D443616 | Fisher et al. | Jun 2001 | S |
6243078 | Rosenberg | Jun 2001 | B1 |
6243080 | Molne | Jun 2001 | B1 |
6243646 | Ozaki et al. | Jun 2001 | B1 |
6248017 | Roach | Jun 2001 | B1 |
6254477 | Sasaki et al. | Jul 2001 | B1 |
6256011 | Culver | Jul 2001 | B1 |
6259491 | Ekedahl et al. | Jul 2001 | B1 |
6262717 | Donohue et al. | Jul 2001 | B1 |
6262785 | Kim | Jul 2001 | B1 |
6266050 | Oh et al. | Jul 2001 | B1 |
6285211 | Sample et al. | Sep 2001 | B1 |
D448810 | Goto | Oct 2001 | S |
6297795 | Kato et al. | Oct 2001 | B1 |
6297811 | Kent et al. | Oct 2001 | B1 |
6300946 | Lincke et al. | Oct 2001 | B1 |
6307539 | Suzuki | Oct 2001 | B2 |
D450713 | Masamitsu et al. | Nov 2001 | S |
6314483 | Goto et al. | Nov 2001 | B1 |
6321441 | Davidson et al. | Nov 2001 | B1 |
6323845 | Robbins | Nov 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
D452250 | Chan | Dec 2001 | S |
6340800 | Zhai et al. | Jan 2002 | B1 |
6347290 | Bartlet | Feb 2002 | B1 |
D454568 | Andre et al. | Mar 2002 | S |
6357887 | Novak | Mar 2002 | B1 |
D455793 | Lin | Apr 2002 | S |
6373265 | Morimoto et al. | Apr 2002 | B1 |
6373470 | Andre et al. | Apr 2002 | B1 |
6377530 | Burrows | Apr 2002 | B1 |
6396523 | Segal et al. | May 2002 | B1 |
6424338 | Anderson | Jul 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6429852 | Adams et al. | Aug 2002 | B1 |
6452514 | Philipp | Sep 2002 | B1 |
6465271 | Ko et al. | Oct 2002 | B1 |
6473069 | Gerpheide | Oct 2002 | B1 |
6492602 | Asai et al. | Dec 2002 | B2 |
6492979 | Kent et al. | Dec 2002 | B1 |
6496181 | Bomer et al. | Dec 2002 | B1 |
6497412 | Bramm | Dec 2002 | B1 |
D468365 | Bransky et al. | Jan 2003 | S |
D469109 | Andre et al. | Jan 2003 | S |
D472245 | Andre et al. | Mar 2003 | S |
6546231 | Someya et al. | Apr 2003 | B1 |
6563487 | Martin et al. | May 2003 | B2 |
6587091 | Serpa | Jul 2003 | B2 |
6606244 | Liu et al. | Aug 2003 | B1 |
6618909 | Yang | Sep 2003 | B1 |
6636197 | Goldenberg et al. | Oct 2003 | B1 |
6639584 | Li | Oct 2003 | B1 |
6640250 | Chang et al. | Oct 2003 | B1 |
6650975 | Ruffner | Nov 2003 | B2 |
D483809 | Lim | Dec 2003 | S |
6658773 | Rohne et al. | Dec 2003 | B2 |
6664951 | Fujii et al. | Dec 2003 | B1 |
6677927 | Bruck et al. | Jan 2004 | B1 |
6678891 | Wilcox et al. | Jan 2004 | B1 |
6686904 | Sherman et al. | Feb 2004 | B1 |
6686906 | Salminen et al. | Feb 2004 | B2 |
6703550 | Chu | Mar 2004 | B2 |
6724817 | Simpson et al. | Apr 2004 | B1 |
6727889 | Shaw | Apr 2004 | B2 |
D489731 | Huang | May 2004 | S |
6734883 | Wynn et al. | May 2004 | B1 |
6738045 | Hinckley et al. | May 2004 | B2 |
6750803 | Yates et al. | Jun 2004 | B2 |
6781576 | Tamura | Aug 2004 | B2 |
6784384 | Park et al. | Aug 2004 | B2 |
6788288 | Ano | Sep 2004 | B2 |
6791533 | Su | Sep 2004 | B2 |
6795057 | Gordon | Sep 2004 | B2 |
D497618 | Andre et al. | Oct 2004 | S |
6810271 | Wood et al. | Oct 2004 | B1 |
6822640 | Derocher | Nov 2004 | B2 |
6834975 | Chu-Chia et al. | Dec 2004 | B2 |
6844872 | Farag et al. | Jan 2005 | B1 |
6847351 | Noguera | Jan 2005 | B2 |
6855899 | Sotome | Feb 2005 | B2 |
6865718 | Levi Montalcini | Mar 2005 | B2 |
6886842 | Vey et al. | May 2005 | B2 |
6894916 | Reohr et al. | May 2005 | B2 |
D506476 | Andre et al. | Jun 2005 | S |
6922189 | Fujiyoshi | Jul 2005 | B2 |
6930494 | Tesdahl et al. | Aug 2005 | B2 |
6958614 | Morimoto et al. | Oct 2005 | B2 |
6977808 | Lam et al. | Dec 2005 | B2 |
6978127 | Bulthuis et al. | Dec 2005 | B1 |
6985137 | Kaikuranta | Jan 2006 | B2 |
7006077 | Uusimaki | Feb 2006 | B1 |
7019225 | Matsumoto et al. | Mar 2006 | B2 |
7046230 | Zadesky et al. | May 2006 | B2 |
7050292 | Shimura et al. | May 2006 | B2 |
7069044 | Okada et al. | Jun 2006 | B2 |
7078633 | Ihalainen | Jul 2006 | B2 |
7084856 | Huppi | Aug 2006 | B2 |
7113196 | Kerr | Sep 2006 | B2 |
7117136 | Rosedale | Oct 2006 | B1 |
7119792 | Andre et al. | Oct 2006 | B1 |
7215319 | Kamijo et al. | May 2007 | B2 |
7233318 | Farag et al. | Jun 2007 | B1 |
7236154 | Kerr et al. | Jun 2007 | B1 |
7236159 | Siversson | Jun 2007 | B1 |
7253643 | Seguine | Aug 2007 | B1 |
7279647 | Philipp | Oct 2007 | B2 |
7288732 | Hashida | Oct 2007 | B2 |
7297883 | Rochon et al. | Nov 2007 | B2 |
7310089 | Baker et al. | Dec 2007 | B2 |
7312785 | Tsuk et al. | Dec 2007 | B2 |
7321103 | Nakanishi et al. | Jan 2008 | B2 |
7325195 | Arant | Jan 2008 | B1 |
7333092 | Zadesky et al. | Feb 2008 | B2 |
7348898 | Ono | Mar 2008 | B2 |
7365737 | Marvit et al. | Apr 2008 | B2 |
7382139 | Mackey | Jun 2008 | B2 |
7394038 | Chang | Jul 2008 | B2 |
7395081 | Bonnelykke Kristensen et al. | Jul 2008 | B2 |
7397467 | Park et al. | Jul 2008 | B2 |
7439963 | Geaghan et al. | Oct 2008 | B2 |
7466307 | Trent et al. | Dec 2008 | B2 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7486323 | Lee et al. | Feb 2009 | B2 |
7502016 | Trent, Jr. et al. | Mar 2009 | B2 |
7503193 | Schoene et al. | Mar 2009 | B2 |
7593782 | Jobs et al. | Sep 2009 | B2 |
7645955 | Huang et al. | Jan 2010 | B2 |
7671837 | Forsblad et al. | Mar 2010 | B2 |
7689466 | Benbrahim et al. | Mar 2010 | B1 |
7708051 | Katsumi et al. | May 2010 | B2 |
7710393 | Tsuk et al. | May 2010 | B2 |
7716582 | Mueller | May 2010 | B2 |
7769794 | Moore et al. | Aug 2010 | B2 |
7772507 | Orr et al. | Aug 2010 | B2 |
20010000537 | Inala et al. | Apr 2001 | A1 |
20010011991 | Wang et al. | Aug 2001 | A1 |
20010011993 | Saarinen | Aug 2001 | A1 |
20010033270 | Osawa et al. | Oct 2001 | A1 |
20010043545 | Aratani | Nov 2001 | A1 |
20010050673 | Davenport | Dec 2001 | A1 |
20010051046 | Watanabe et al. | Dec 2001 | A1 |
20020000978 | Gerpheide | Jan 2002 | A1 |
20020011993 | Lui et al. | Jan 2002 | A1 |
20020027547 | Kamijo | Mar 2002 | A1 |
20020030665 | Ano | Mar 2002 | A1 |
20020033848 | Sciammarella et al. | Mar 2002 | A1 |
20020039493 | Tanaka | Apr 2002 | A1 |
20020045960 | Phillips et al. | Apr 2002 | A1 |
20020059584 | Ferman et al. | May 2002 | A1 |
20020071550 | Pletikosa | Jun 2002 | A1 |
20020089545 | Levi Montalcini | Jul 2002 | A1 |
20020103796 | Hartley | Aug 2002 | A1 |
20020118131 | Yates et al. | Aug 2002 | A1 |
20020118169 | Hinckley et al. | Aug 2002 | A1 |
20020145594 | Derocher | Oct 2002 | A1 |
20020154090 | Lin | Oct 2002 | A1 |
20020158844 | McLoone et al. | Oct 2002 | A1 |
20020164156 | Bilbrey | Nov 2002 | A1 |
20020168947 | Lemley | Nov 2002 | A1 |
20020180701 | Hayama et al. | Dec 2002 | A1 |
20020196239 | Lee | Dec 2002 | A1 |
20030002246 | Kerr | Jan 2003 | A1 |
20030025679 | Taylor et al. | Feb 2003 | A1 |
20030028346 | Sinclair et al. | Feb 2003 | A1 |
20030043121 | Chen | Mar 2003 | A1 |
20030043174 | Hinckley et al. | Mar 2003 | A1 |
20030050092 | Yun | Mar 2003 | A1 |
20030076301 | Tsuk et al. | Apr 2003 | A1 |
20030076303 | Huppi | Apr 2003 | A1 |
20030076306 | Zadesky et al. | Apr 2003 | A1 |
20030091377 | Hsu et al. | May 2003 | A1 |
20030095095 | Pihlaja | May 2003 | A1 |
20030095096 | Robbin et al. | May 2003 | A1 |
20030098851 | Brink | May 2003 | A1 |
20030103043 | Mulligan et al. | Jun 2003 | A1 |
20030122792 | Yamamoto et al. | Jul 2003 | A1 |
20030135292 | Husgafvel et al. | Jul 2003 | A1 |
20030142081 | Iizuka et al. | Jul 2003 | A1 |
20030184517 | Senzui et al. | Oct 2003 | A1 |
20030197740 | Reponen | Oct 2003 | A1 |
20030206202 | Moriya | Nov 2003 | A1 |
20030210537 | Engelmann | Nov 2003 | A1 |
20030224831 | Engstrom et al. | Dec 2003 | A1 |
20040027341 | Derocher | Feb 2004 | A1 |
20040074756 | Kawakami et al. | Apr 2004 | A1 |
20040080682 | Dalton | Apr 2004 | A1 |
20040109028 | Stern et al. | Jun 2004 | A1 |
20040109357 | Cernea et al. | Jun 2004 | A1 |
20040145613 | Stavely et al. | Jul 2004 | A1 |
20040150619 | Baudisch et al. | Aug 2004 | A1 |
20040156192 | Kerr et al. | Aug 2004 | A1 |
20040178997 | Gillespie et al. | Sep 2004 | A1 |
20040200699 | Matsumoto et al. | Oct 2004 | A1 |
20040215986 | Shakkarwar | Oct 2004 | A1 |
20040224638 | Fadell et al. | Nov 2004 | A1 |
20040239622 | Proctor et al. | Dec 2004 | A1 |
20040252109 | Trent, Jr. et al. | Dec 2004 | A1 |
20040252867 | Lan et al. | Dec 2004 | A1 |
20040253989 | Tupler et al. | Dec 2004 | A1 |
20040263388 | Krumm et al. | Dec 2004 | A1 |
20040267874 | Westberg et al. | Dec 2004 | A1 |
20050012644 | Hurst et al. | Jan 2005 | A1 |
20050017957 | Yi | Jan 2005 | A1 |
20050024341 | Gillespie et al. | Feb 2005 | A1 |
20050030048 | Bolender et al. | Feb 2005 | A1 |
20050052425 | Zadesky et al. | Mar 2005 | A1 |
20050052426 | Hagermoser et al. | Mar 2005 | A1 |
20050052429 | Philipp | Mar 2005 | A1 |
20050068304 | Lewis et al. | Mar 2005 | A1 |
20050083299 | Nagasaka | Apr 2005 | A1 |
20050083307 | Aufderheide | Apr 2005 | A1 |
20050090288 | Stohr et al. | Apr 2005 | A1 |
20050104867 | Westerman et al. | May 2005 | A1 |
20050110768 | Marriott et al. | May 2005 | A1 |
20050125147 | Mueller | Jun 2005 | A1 |
20050129199 | Abe | Jun 2005 | A1 |
20050139460 | Hosaka | Jun 2005 | A1 |
20050140657 | Park et al. | Jun 2005 | A1 |
20050143124 | Kennedy et al. | Jun 2005 | A1 |
20050156881 | Trent et al. | Jul 2005 | A1 |
20050162402 | Watanachote | Jul 2005 | A1 |
20050204309 | Szeto | Sep 2005 | A1 |
20050212760 | Marvit et al. | Sep 2005 | A1 |
20050237308 | Autio et al. | Oct 2005 | A1 |
20060017692 | Wehrenberg et al. | Jan 2006 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060032680 | Elias et al. | Feb 2006 | A1 |
20060038791 | Mackey | Feb 2006 | A1 |
20060095848 | Naik | May 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060131156 | Voelckers | Jun 2006 | A1 |
20060143574 | Ito et al. | Jun 2006 | A1 |
20060174568 | Kinoshita et al. | Aug 2006 | A1 |
20060181517 | Zadesky et al. | Aug 2006 | A1 |
20060197750 | Kerr et al. | Sep 2006 | A1 |
20060232557 | Fallot-Burghardt | Oct 2006 | A1 |
20060236262 | Bathiche et al. | Oct 2006 | A1 |
20060250377 | Zadesky et al. | Nov 2006 | A1 |
20060274042 | Krah et al. | Dec 2006 | A1 |
20060274905 | Lindahl et al. | Dec 2006 | A1 |
20060279896 | Bruwer | Dec 2006 | A1 |
20060284836 | Philipp | Dec 2006 | A1 |
20070013671 | Zadesky et al. | Jan 2007 | A1 |
20070018970 | Tabasso et al. | Jan 2007 | A1 |
20070044036 | Ishimura et al. | Feb 2007 | A1 |
20070052044 | Forsblad et al. | Mar 2007 | A1 |
20070052691 | Zadesky et al. | Mar 2007 | A1 |
20070080936 | Tsuk et al. | Apr 2007 | A1 |
20070080938 | Robbin et al. | Apr 2007 | A1 |
20070080952 | Lynch et al. | Apr 2007 | A1 |
20070083822 | Robbin et al. | Apr 2007 | A1 |
20070085841 | Tsuk et al. | Apr 2007 | A1 |
20070097086 | Battles et al. | May 2007 | A1 |
20070120834 | Boillot | May 2007 | A1 |
20070125852 | Rosenberg | Jun 2007 | A1 |
20070126696 | Boillot | Jun 2007 | A1 |
20070152975 | Ogihara | Jul 2007 | A1 |
20070152977 | Ng et al. | Jul 2007 | A1 |
20070152983 | McKillop et al. | Jul 2007 | A1 |
20070155434 | Jobs et al. | Jul 2007 | A1 |
20070157089 | Van Os et al. | Jul 2007 | A1 |
20070180409 | Sohn et al. | Aug 2007 | A1 |
20070242057 | Zadesky et al. | Oct 2007 | A1 |
20070247421 | Orsley et al. | Oct 2007 | A1 |
20070247443 | Philipp | Oct 2007 | A1 |
20070271516 | Carmichael | Nov 2007 | A1 |
20070273671 | Zadesky et al. | Nov 2007 | A1 |
20070276525 | Zadesky et al. | Nov 2007 | A1 |
20070279394 | Lampell | Dec 2007 | A1 |
20070285404 | Rimon et al. | Dec 2007 | A1 |
20070290990 | Robbin et al. | Dec 2007 | A1 |
20070291016 | Philipp | Dec 2007 | A1 |
20070296709 | GuangHai | Dec 2007 | A1 |
20080001770 | Ito et al. | Jan 2008 | A1 |
20080006453 | Hotelling et al. | Jan 2008 | A1 |
20080006454 | Hotelling | Jan 2008 | A1 |
20080007533 | Hotelling et al. | Jan 2008 | A1 |
20080007539 | Hotelling et al. | Jan 2008 | A1 |
20080012837 | Marriott et al. | Jan 2008 | A1 |
20080018615 | Zadesky et al. | Jan 2008 | A1 |
20080018616 | Lampell et al. | Jan 2008 | A1 |
20080018617 | Ng et al. | Jan 2008 | A1 |
20080036473 | Jansson | Feb 2008 | A1 |
20080036734 | Forsblad et al. | Feb 2008 | A1 |
20080060925 | Weber et al. | Mar 2008 | A1 |
20080062141 | Chandhri | Mar 2008 | A1 |
20080066016 | Dowdy et al. | Mar 2008 | A1 |
20080069412 | Champagne et al. | Mar 2008 | A1 |
20080071810 | Casto et al. | Mar 2008 | A1 |
20080079699 | Mackey | Apr 2008 | A1 |
20080087476 | Prest | Apr 2008 | A1 |
20080088582 | Prest | Apr 2008 | A1 |
20080088596 | Prest | Apr 2008 | A1 |
20080088597 | Prest | Apr 2008 | A1 |
20080088600 | Prest | Apr 2008 | A1 |
20080094352 | Tsuk et al. | Apr 2008 | A1 |
20080098330 | Tsuk et al. | Apr 2008 | A1 |
20080110739 | Peng et al. | May 2008 | A1 |
20080111795 | Bollinger | May 2008 | A1 |
20080143681 | XiaoPing | Jun 2008 | A1 |
20080165144 | Forstall et al. | Jul 2008 | A1 |
20080165158 | Hotelling et al. | Jul 2008 | A1 |
20080166968 | Tang et al. | Jul 2008 | A1 |
20080196945 | Konstas | Aug 2008 | A1 |
20080201751 | Ahmed et al. | Aug 2008 | A1 |
20080202824 | Philipp et al. | Aug 2008 | A1 |
20080209442 | Setlur et al. | Aug 2008 | A1 |
20080264767 | Chen et al. | Oct 2008 | A1 |
20080280651 | Duarte | Nov 2008 | A1 |
20080284742 | Prest | Nov 2008 | A1 |
20080293274 | Milan | Nov 2008 | A1 |
20080300055 | Lutnick et al. | Dec 2008 | A1 |
20090021267 | Golovchenko et al. | Jan 2009 | A1 |
20090026558 | Bauer et al. | Jan 2009 | A1 |
20090033635 | Wai | Feb 2009 | A1 |
20090036176 | Ure | Feb 2009 | A1 |
20090058687 | Rothkopf et al. | Mar 2009 | A1 |
20090058801 | Bull | Mar 2009 | A1 |
20090058802 | Orsley et al. | Mar 2009 | A1 |
20090073130 | Weber et al. | Mar 2009 | A1 |
20090078551 | Kang | Mar 2009 | A1 |
20090109181 | Hui et al. | Apr 2009 | A1 |
20090141046 | Rathnam et al. | Jun 2009 | A1 |
20090160771 | Hinckley et al. | Jun 2009 | A1 |
20090166098 | Sunder | Jul 2009 | A1 |
20090167508 | Fadell et al. | Jul 2009 | A1 |
20090167542 | Culbert et al. | Jul 2009 | A1 |
20090167704 | Terlizzi et al. | Jul 2009 | A1 |
20090170532 | Lee et al. | Jul 2009 | A1 |
20090179854 | Weber et al. | Jul 2009 | A1 |
20090197059 | Weber et al. | Aug 2009 | A1 |
20090229892 | Fisher et al. | Sep 2009 | A1 |
20090273573 | Hotelling | Nov 2009 | A1 |
20090303204 | Nasiri et al. | Dec 2009 | A1 |
20090307633 | Haughay et al. | Dec 2009 | A1 |
20100045705 | Vertegaal et al. | Feb 2010 | A1 |
20100058251 | Rottler et al. | Mar 2010 | A1 |
20100060568 | Fisher et al. | Mar 2010 | A1 |
20100073319 | Lyon et al. | Mar 2010 | A1 |
20100149127 | Fisher et al. | Jun 2010 | A1 |
20100214216 | Nasiri et al. | Aug 2010 | A1 |
20100289759 | Fisher et al. | Nov 2010 | A1 |
20100313409 | Weber et al. | Dec 2010 | A1 |
20110005845 | Hotelling et al. | Jan 2011 | A1 |
Number | Date | Country |
---|---|---|
1139235 | Jan 1997 | CN |
1455615 | Nov 2003 | CN |
1499356 | May 2004 | CN |
1659506 | Aug 2005 | CN |
3615742 | Nov 1987 | DE |
19722636 | Dec 1998 | DE |
10022537 | Nov 2000 | DE |
20019074 | Feb 2001 | DE |
102004043663 | Apr 2006 | DE |
0178157 | Apr 1986 | EP |
0419145 | Mar 1991 | EP |
0498540 | Aug 1992 | EP |
0521683 | Jan 1993 | EP |
0674288 | Sep 1995 | EP |
0731407 | Sep 1996 | EP |
0551778 | Jan 1997 | EP |
0880091 | Nov 1998 | EP |
1026713 | Aug 2000 | EP |
1081922 | Mar 2001 | EP |
1098241 | May 2001 | EP |
1133057 | Sep 2001 | EP |
1162826 | Dec 2001 | EP |
1168396 | Jan 2002 | EP |
1205836 | May 2002 | EP |
1244053 | Sep 2002 | EP |
1251455 | Oct 2002 | EP |
1263193 | Dec 2002 | EP |
1347481 | Sep 2003 | EP |
1376326 | Jan 2004 | EP |
1467392 | Oct 2004 | EP |
1482401 | Dec 2004 | EP |
1496467 | Jan 2005 | EP |
1517228 | Mar 2005 | EP |
1542437 | Jun 2005 | EP |
1589407 | Oct 2005 | EP |
1784058 | May 2007 | EP |
1841188 | Oct 2007 | EP |
1850218 | Oct 2007 | EP |
1876711 | Jan 2008 | EP |
2686440 | Jul 1993 | FR |
2015167 | Sep 1979 | GB |
2072389 | Sep 1981 | GB |
2315186 | Jan 1998 | GB |
2333215 | Jul 1999 | GB |
2391060 | Jan 2004 | GB |
2402105 | Dec 2004 | GB |
5795722 | Jun 1982 | JP |
57097626 | Jun 1982 | JP |
05233141 | Sep 1983 | JP |
61117619 | Jun 1986 | JP |
61124009 | Jun 1986 | JP |
63020411 | Jan 1988 | JP |
10063467 | Mar 1988 | JP |
63106826 | May 1988 | JP |
63181022 | Jul 1988 | JP |
63298518 | Dec 1988 | JP |
0357617 | Jun 1991 | JP |
03192418 | Aug 1991 | JP |
0432920 | Feb 1992 | JP |
4205408 | Jul 1992 | JP |
05041135 | Feb 1993 | JP |
05080938 | Apr 1993 | JP |
05101741 | Apr 1993 | JP |
0536623 | May 1993 | JP |
05189110 | Jul 1993 | JP |
05205565 | Aug 1993 | JP |
05211021 | Aug 1993 | JP |
05217464 | Aug 1993 | JP |
05262276 | Oct 1993 | JP |
05265656 | Oct 1993 | JP |
05274956 | Oct 1993 | JP |
05289811 | Nov 1993 | JP |
05298955 | Nov 1993 | JP |
05325723 | Dec 1993 | JP |
0620570 | Jan 1994 | JP |
06208433 | Feb 1994 | JP |
06084428 | Mar 1994 | JP |
06089636 | Mar 1994 | JP |
06096639 | Apr 1994 | JP |
06111685 | Apr 1994 | JP |
06111695 | Apr 1994 | JP |
06139879 | May 1994 | JP |
06187078 | Jul 1994 | JP |
06267382 | Sep 1994 | JP |
06283993 | Oct 1994 | JP |
06333459 | Dec 1994 | JP |
07107574 | Apr 1995 | JP |
0741882 | Jul 1995 | JP |
07201249 | Aug 1995 | JP |
07201256 | Aug 1995 | JP |
07253838 | Oct 1995 | JP |
07261899 | Oct 1995 | JP |
07261922 | Oct 1995 | JP |
07296670 | Nov 1995 | JP |
07319001 | Dec 1995 | JP |
08016292 | Jan 1996 | JP |
08115158 | May 1996 | JP |
08203387 | Aug 1996 | JP |
08293226 | Nov 1996 | JP |
08298045 | Nov 1996 | JP |
08299541 | Nov 1996 | JP |
08316664 | Nov 1996 | JP |
09044289 | Feb 1997 | JP |
09069023 | Mar 1997 | JP |
09128148 | May 1997 | JP |
09134248 | May 1997 | JP |
09218747 | Aug 1997 | JP |
09230993 | Sep 1997 | JP |
09231858 | Sep 1997 | JP |
09233161 | Sep 1997 | JP |
09251347 | Sep 1997 | JP |
09258895 | Oct 1997 | JP |
09288926 | Nov 1997 | JP |
09512979 | Dec 1997 | JP |
10074127 | Mar 1998 | JP |
10074429 | Mar 1998 | JP |
10198507 | Jul 1998 | JP |
10227878 | Aug 1998 | JP |
10240693 | Sep 1998 | JP |
10320322 | Dec 1998 | JP |
10326149 | Dec 1998 | JP |
1124834 | Jan 1999 | JP |
1124835 | Jan 1999 | JP |
1168685 | Mar 1999 | JP |
11184607 | Jul 1999 | JP |
11194863 | Jul 1999 | JP |
11194872 | Jul 1999 | JP |
11194882 | Jul 1999 | JP |
11194883 | Jul 1999 | JP |
11194891 | Jul 1999 | JP |
11195353 | Jul 1999 | JP |
11203045 | Jul 1999 | JP |
11212725 | Aug 1999 | JP |
11272378 | Oct 1999 | JP |
11338628 | Dec 1999 | JP |
2000200147 | Jul 2000 | JP |
2000215549 | Aug 2000 | JP |
2000267777 | Sep 2000 | JP |
2000267786 | Sep 2000 | JP |
2000267797 | Sep 2000 | JP |
2000353045 | Dec 2000 | JP |
2001011769 | Jan 2001 | JP |
2001022508 | Jan 2001 | JP |
2001160850 | Jun 2001 | JP |
2001184158 | Jul 2001 | JP |
3085481 | Feb 2002 | JP |
2002215311 | Aug 2002 | JP |
2003015796 | Jan 2003 | JP |
2003060754 | Feb 2003 | JP |
2003099198 | Apr 2003 | JP |
2003150303 | May 2003 | JP |
2003517674 | May 2003 | JP |
2003280799 | Oct 2003 | JP |
2003280807 | Oct 2003 | JP |
2004362097 | Dec 2004 | JP |
2005251218 | Sep 2005 | JP |
2005293606 | Oct 2005 | JP |
2006004453 | Jan 2006 | JP |
2006178962 | Jul 2006 | JP |
3852854 | Dec 2006 | JP |
2007123473 | May 2007 | JP |
19980071394 | Oct 1998 | KR |
19990050198 | Jul 1999 | KR |
20000008579 | Feb 2000 | KR |
20010052016 | Jun 2001 | KR |
20010108361 | Dec 2001 | KR |
20020065059 | Aug 2002 | KR |
20060021678 | Mar 2006 | KR |
431607 | Apr 2001 | TW |
00470193 | Dec 2001 | TW |
547716 | Aug 2003 | TW |
I220491 | Aug 2004 | TW |
9814863 | Apr 1998 | WO |
9417494 | Aug 1994 | WO |
9500897 | Jan 1995 | WO |
9627968 | Sep 1996 | WO |
9949443 | Sep 1999 | WO |
0079772 | Dec 2000 | WO |
0102949 | Jan 2001 | WO |
0144912 | Jun 2001 | WO |
0208881 | Jan 2002 | WO |
03036457 | May 2003 | WO |
03044645 | May 2003 | WO |
03044956 | May 2003 | WO |
03025960 | Sep 2003 | WO |
03088176 | Oct 2003 | WO |
03090008 | Oct 2003 | WO |
2004001573 | Dec 2003 | WO |
2004040606 | May 2004 | WO |
2004091956 | Oct 2004 | WO |
2005055620 | Jun 2005 | WO |
2005076117 | Aug 2005 | WO |
2005114369 | Dec 2005 | WO |
2005124526 | Dec 2005 | WO |
2006020305 | Feb 2006 | WO |
2006021211 | Mar 2006 | WO |
2006037545 | Apr 2006 | WO |
2006104745 | Oct 2006 | WO |
2006135127 | Dec 2006 | WO |
2007025858 | Mar 2007 | WO |
2007078477 | Jul 2007 | WO |
2007084467 | Jul 2007 | WO |
2007089766 | Aug 2007 | WO |
2008007372 | Jan 2008 | WO |
2008045414 | Apr 2008 | WO |
2008045833 | Apr 2008 | WO |
Entry |
---|
“About Quicktip®” www.logicad3d.com/docs/qt.html, downloaded Apr. 8, 2002. |
“Alps Electric introduces the GlidePoint Wave Keyboard; combines a gently curved design with Alps' advanced GlidePoint Technology”, Business Wire, Oct. 21, 1996. |
“Alps Electric Ships GlidePoint Keyboard for the Macintosh; Includes a GlidePoint Touchpad, Erase-Eaze Backspace Key and Contoured Wrist Rest”, Business Wire, Jul. 1, 1996. |
“APS Show Guide to Exhibitors”, Physics Today, vol. 49, No. 3, Mar. 1996. |
“Atari VCS/2600 Peripherals”, www.classicgaming.com/gamingmuseum/2600p.html, downloaded Feb. 28, 2007, pp. 1-15. |
“Der Klangmeister,” Connect Magazine, Aug. 1998. |
“Design News: Literature Plus”, Design News, vol. 51, No. 24, Dec. 18, 1995. |
“Diamond Multimedia Announces Rio PMP300 Portable MP3 Music Player,” located at http://news.harmony-central.com/Newp/1998/Rio-PMP300.html visited on May 5, 2008, 4 pages. |
“Manufactures”, Laser Focus World, Buyers Guide '96, vol. 31, No. 12, Dec. 1995. |
“National Design Engineering Show”, Design News, vol. 52, No. 5, Mar. 4, 1996. |
“Neuros MP3 Digital Audio Computer”, www.neurosaudio.com., downloaded Apr. 9, 2003. |
“OEM Touchpad Modules” website www.glidepoint.com/sales/modules.index.shtml, downloaded Feb. 13, 2002. |
“Preview of exhibitor booths at the Philadelphia show”, The News Air Conditioning Heating & Refrigeration, vol. 200, No. 2, Jan. 13, 1997. |
“Product news”, Design News, vol. 53, No. 11, Jun. 9, 1997. |
“Product news”, Design News, vol. 53, No. 9, May 5, 1997. |
“Product Overview—ErgoCommander®”, www.logicad3d.com/products/ErgoCommander.htm, downloaded Apr. 8, 2002. |
“Product Overview—SpaceMouse® Classic”, www.logicad3d.com/products/Classic.htm, downloaded Apr. 8, 2002. |
“System Service and Troubleshooting Manual,” www.dsplib.com/intv/Master, downloaded Dec. 11, 2002. |
“Triax Custom Controllers due; Video Game Controllers”, HFD—The Weekly Home Furnishing Newspaper, vol. 67, No. 1, Jan. 4, 1993. |
Acer Information Co. Ltd., “Touchpad,” Notebook PC Manual, Feb. 16, 2005, 3 pages. |
Ahl, David H., “Controller Update”, Creative Computing, vol. 9, No. 12, Dec. 1983. |
Ahmad, Subutai, “A Usable Real-Time 3D Hand Tracker,” Proceedings of the 28th Asilomar Conference on Signals, Systems and Computers—Part 2 (of 2) vol. 2, Oct. 1994. |
Apple Inc., “Apple Presents iPod: Ultra-Portable MP3 Music Player Puts 1,000 Songs in Your Pocket”, Press Release, Oct. 23, 2001, 3 pages. |
Apple Inc., “Apple Unveils Optical Mouse and New Pro Keyboard,” Press Release, Jul. 19, 2000. |
Baig, E.C., “Your PC Just Might Need a Mouse”, U.S. News & World Report, vol. 108, No. 22, Jun. 4, 1990. |
Bang & Olufsen Telecom, “BeoCom 6000 User Guide 2000”. |
Bang & Olufsen Telecom, “BeoCom 6000, Sales Training Brochure”, date unknown. |
Bartimo, Jim, “The Portables: Traveling Quickly”, Computerworld, Nov. 14, 1983. |
Boling, Douglas, “Programming Microsoft Windows CE.NET,” Microsoft, Third Edition, 2003, p. 109. |
Bray, Kevin, “Phosphors help switch on xenon,” Physics in Action, pp. 1-3, Apr. 1999. |
Brink et al., “Pumped-up portables”, U.S. News & World Report, vol. 116, No. 21, May 30, 1994. |
Brown, Ed et al., “Windows on Tablets as a Means of Achieving Virtual Input Devices”, Human-Computer Interaction—Interact '90, 1990. |
Buxton, William et al., “Issues and Techniques in Touch-Sensitive Tablet Input”, Computer Graphics, 19(3), Proceedings of SIGGRAPH '85, 1985. |
Chapweske, Adam, “PS/2 Mouse/Keyboard Protocol”, 1999, http://panda.cs.ndsu.nodak.edu/.about.achapwes/PICmicro/PS2/ps2.htm. |
Chen, Michael et al., “A Study in Interactive 3-D Rotation Using 2-D Control Devices”, Computer Graphics, vol. 22, No. 4, Aug. 1988. |
De Meyer, Kevin, “Crystal Optical Mouse,” Feb. 14, 2002, Heatseekerz, Web-Article 19. |
EBV Elektronik, “TSOP6238 IR Receiver Modules for Infrared Remote Control Systems”, Jan. 2004, 1 page. |
Evans, Kenneth et al., “Tablet-based Valuators that Provide One, Two, or Three Degrees of Freedom”, Computer Graphics, vol. 15, No. 3, Aug. 1981. |
Fiore, Andrew, “Zen Touchpad”, Cornell University, May 2000. |
Gadgetboy, “Point and click with the latest mice”, CNETAsia Product Review, www.asia.cnet.com/reviews...are/gadgetboy/0,39001770,38023590,00.- htm, downloaded Dec. 5, 2001. |
Gfroerer, Thomas H., “Photoluminescence in Analysis of Surfaces and Interfaces,” Encyclopedia of Analytical Chemistry, 2000, pp. 1-23. |
Interlink Electronics, VersaPad: Integration Guide, 1998, pp. 1-35. |
Jesitus, John, “Broken promises?”, Industry Week/IW, vol. 246, No. 20, Nov. 3, 1997. |
Kobayashi, Minoru et al., “Dynamic Soundscape: Mapping Time to Space for Audio Browsing,” Computer Human Interaction, 1997, pp. 194-201. |
Kobayashi, Minoru, “Design of Dynamic Soundscape: Mapping Time to Space for Audio Browsing with Simultaneous Listening”, thesis submitted to Program in Media Arts and Sciences at the Massachusetts Institute of Technology, 1996, 58 pages. |
Kobayashi, Shinji et al. “Development of the Touch Switches with the Click Response,” Koukuu Denshi Gihou, Japan Aviation Electronics Industry, Ltd., No. 17, 1994, pp. 44-48 (Translation of Summary). |
Letter re: Bang & Olufsen A/S, by David Safran, Nixon Peabody, LLP, May 21, 2004. |
Luna Technologies International, Inc., LUNA Photoluminescent Safety Products, “Photoluminescence—What is Photoluminescence?” Dec. 27, 2005, available at http://www.lunaplast.com/photoluminescence.com. |
Mims, Forrest M., III, “A Few Quick Pointers; Mouses, Touch Screens, Touch Pads, Light Pads, and the Like Can Make Your System Easier to Use”, Computers & Electronics, vol. 22, May 1984. |
Nass, Richard, “Touchpad input device goes digital to give portable systems a desktop “mouse-like” feel”, Electronic Design, vol. 44, No. 18, Sep. 3, 1996. |
Perenson, Melissa, “New & Improved: Touchpad Redux”, PC Magazine, September 10, 1996. |
Petersen, Marty, “Koala Pad Touch Tablet & Micro Illustrator Software”, InfoWorld, Oct. 10, 1983. |
Petruzzellis, T.L., “Force-Sensing Resistors”, Electronics Now, vol. 64, No. 3, Mar. 1993. |
Photographs of Innovations 2000 Best of Show award presented at the 2000 International CES Innovations 2000 Design & Engineering Showcase, 1 pg. |
Robbin, U.S. Appl. No. 60/346,237 entitled, “Method and System for List Scrolling,” filed Oct. 22, 2001; 12 pages. |
SanDisk Sansa Connect User Guide, 2007, 29 pages. |
Schramm, Mike, “Playing with the iPhone's accelerometer”, The Unofficial Apple Weblog, Aug. 29, 2007, 5 pages. Available at http://www.tuaw.com/2007/08/29/playing-with-the-iphones-accelerometer/. |
Soderholm, Lars G., “Sensing Systems for ‘Touch and Feel’”, Design News, May 8, 1989, pp. 72-76. |
Sony, “Sony presents ‘Choice Without Compromise’”, IBC '97 M2 Presswire, Jul. 24, 1997. |
Spiwak, Marc, “A Great New Wireless Keyboard”, Popular Electronics, vol. 14, No. 12, Dec. 1997. |
Spiwak, Marc, “A Pair of Unusual Controllers”, Popular Electronics, vol. 14, No. 4, Apr. 1997. |
Suzuki, K., “Full Loading of Usable Online Software! Strongest Palm Series Packs 1000”, Ascii Co., Ltd., pp. 126-129. |
Sylvania, “Intellivision™ Intelligent Television Master Component Service Manual”, pp. 1, 2 and 8, 1979. |
Synaptics, Inc., “Synaptics TouchPad Interfacing Guide” Second Edition, Mar. 25, 1998, San Jose, CA, pp. 1 to 90. |
Tessler, Franklin et al., “Touchpads: Three new input devices”, website www.macworld.com/1996/02/review/1806.html, downloaded Feb. 13, 2002. |
Tessler, Franklin, “Point Pad”, Macworld, vol. 12, No. 10, Oct. 1995. |
Tessler, Franklin, “Smart Input: How to Choose from the New Generation of Innovative Input Devices”, Macworld, vol. 13, No. 5, May 1996. |
Tessler, Franklin, “Touchpads”, Macworld, vol. 13, No. 2, Feb. 1996. |
Translation of Trekstor's Defense Statement to the District Court Mannheim of May 23, 2008; 37 pages. |
Number | Date | Country | |
---|---|---|---|
20090064031 A1 | Mar 2009 | US |
Number | Date | Country | |
---|---|---|---|
60967457 | Sep 2007 | US |