The present invention can relate to variable input/output interfaces for electronic devices.
Currently available portable media devices may provide limited ways by which users can interact with the devices. For example, currently available portable media devices may show images on their displays in only one orientation with respect to the housing of the media device. When a user places the portable media device in a non-standard orientation, the user may have to angle his head in order to properly view the displayed images. Furthermore, when the media device is playing a media file, the media device may not permit the user to adjust parameters associated with the media file. In order to adjust any of the parameters of the media file, the media device may require the user first to stop playback of the media file.
The present invention can include electronic devices that have variable input/output (I/O) interfaces. The variable I/O interfaces can allow a user to interact with the devices in a more ergonomic manner and with greater efficiency.
An electronic device of the present invention can display one or more software icons associated with user-programmable parameters of a media file during playback of the media file. The electronic device can permit a user to select a user-programmable parameter of the media file by selecting a corresponding icon. While the selected icon is visually distinguished, the electronic device can permit the user to adjust the selected user-programmable parameter. The electronic device can change the software icons it displays to reflect different user-programmable parameters associated with different modules of the media file or with different media files.
An electronic device of the present invention also can automatically re-orient an image shown on its display and/or re-configure user input components based on the orientation of the electronic device. The user input components can include hardware or software input components.
The above and other advantages of the present invention will be apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
User input component 16 can include a clickwheel similar to that incorporated in the iPod™. The clickwheel can include one or more buttons and a touchpad. The touchpad may permit a user to scroll by running the user's finger around the track of the clickwheel. In alternative embodiments, user input component 16 can include, for example, one or more buttons, a touchpad, a touchscreen display, electronics for accepting voice commands, antennas for accepting signals from other electronic devices, infrared ports for accepting signals from other electronic devices, or any combination thereof. Controller 18 can include one or more processors, ASICs, circuits, or any combination thereof. Memory 20 can include read only memory, random access memory, solid-state memory, buffer memory, hard drive memory, any other memory known in the art or otherwise, or any combination thereof.
In accordance with one aspect of the present invention, a user can use portable media device 10, local server 24 (e.g., a user's personal computer), or central server 26 (e.g., a server that hosts a website) to design or compile a program of activities for the user to perform or for another user to perform. The program of activities can include instructions and other audio and/or visual information. The instructions and other audio/visual information can be stored in a media file, which the user can play back on, e.g., electronic device 10. For example, the incorporated patent documents describe in greater detail programs of activities (e.g., compilations of fitness activities, nutritional activities, and/or medical activities) that a user can compile, e.g., on a local or central server and play back on electronic device 10.
If the user compiles the program of activities on local server 24 or central server 26, the local or central server can transmit the media file to portable media device 10 using a cable or a wireless communication protocol known in the art or otherwise. The portable media device can store the media file in memory 20. When the user wishes to perform activities in accordance with the compiled program, controller 18 can play back the media file. Electronic device 10 can output audio data stored in the media file using speakers (not shown) disposed within portable media device 10 or an accessory coupled thereto. Portable media device 10 can output visual data stored in the media file using display 14. Visual data can include text images, still images, and/or video images.
In one aspect of the present invention, portable electronic device 10 can accept data from sensors S that can capture information about the user's activities. Sensors S can include sensors used in, for example, pedometers, accelerometers, heart rate monitors, oximeters, GPS tracking devices, devices that measure temperature, devices that measure heat flux, electrocardiogram devices, devices having activity tracking sensors similar to those described in the incorporated INTEGRATED SENSORS document, devices having activity tracking sensors similar to those described in the other incorporated patent documents, or any combination thereof.
A user can wear sensors S in the user's clothing or accessories. Alternatively, sensors S can be disposed within electronic device 10 or another electronic device utilized by the user. While the user is performing the activities programmed into the media file, sensors S can track the user's performance and electronic device 10 can log data from sensors S in memory 20. Once the user has completed the scheduled activities, the user can upload the data that electronic device 10 has collected about the user's performance into local server 24 and/or central server 26. Local server 24 and/or central server 26 then can analyze that data to adapt the user's goals or otherwise track a user's progress, for example, in ways similar to those described in the incorporated patent documents.
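Purely as an illustrative sketch of the logging step (and not of any particular device firmware), the collection and export of sensor samples could be organized as follows in Python, where `sensor.name`, `sensor.read_sample()`, and the output path are hypothetical placeholders:

```python
import json
import time

def log_sensor_data(sensors, duration_s, period_s=1.0):
    """Collect timestamped samples from each sensor while an activity runs."""
    log = []
    start = time.time()
    while time.time() - start < duration_s:
        sample = {"t": round(time.time() - start, 2)}
        for sensor in sensors:
            # read_sample() stands in for whatever interface a real pedometer,
            # heart rate monitor, or other sensor S would expose.
            sample[sensor.name] = sensor.read_sample()
        log.append(sample)
        time.sleep(period_s)
    return log

def export_for_upload(log, path="performance_log.json"):
    """Persist the collected samples so they can later be uploaded to a server."""
    with open(path, "w") as f:
        json.dump(log, f)
```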
When the user is ready to perform the activities associated with media file 30, the user can initiate playback of media file 30 on electronic device 10. Electronic device 10 then can begin playing module 1 by outputting the audio and/or visual information associated therewith. Upon the completion of module 1, the electronic device 10 can automatically begin playing module 2 by outputting the audio and/or visual information associated therewith. This process can continue until electronic device 10 has played all of the modules.
In accordance with one aspect of the present invention, controller 18 can be configured to permit a user to adjust global and/or module-specific user-programmable parameters of media file 30 during playback of the media file. Controller 18 can display one or more icons on display 14 that correspond to the user-programmable parameters of media file 30. Using user input component 16, the user may select the icon corresponding to the user-programmable parameter he wishes to modify. Once controller 18 has visually distinguished the selected icon, the user can modify the user-programmable parameter by manipulating user input component 16.
In the illustrative embodiment shown in
In step 46, controller 18 can begin playing the first module of the media file. This can include outputting audio and/or visual information associated with that module, displaying icon(s) 38 on display 14 corresponding to media file-independent parameters, and logging information collected by sensors S. Visual information associated with a module can include icons 34-36 that identify user-programmable parameters that a user can adjust during playback of that module, the amount of time elapsed, visual cues for the proper way to perform the activity associated with the current module playing, visual cues of the activity associated with the next module to be played from the media file, etc.
In step 48, controller 18 can check whether the end of the module has been reached. If not, in step 50, controller 18 can wait to accept signals from user input component 16 that indicate the user wishes to adjust either a user-programmable parameter associated with the media file or a media file-independent parameter. Once the controller receives one or more signals from user input component 16, controller 18 can permit the user to adjust a user-programmable parameter or one of the media file-independent parameters in step 52. Thereafter, controller 18 can reiterate steps 46-52 to continue playing the first module of the loaded media file in accordance with the adjustment made to a user-programmable parameter in step 52.
Once controller 18 detects that it has reached the end of the first media module in step 48, controller 18 can save any adjustments made to user-programmable parameters associated with the first media module in step 54. Alternatively, the controller can save adjustments to the user-programmable parameters immediately after the adjustments are made.
Thereafter, controller 18 can automatically load and begin playback of a second media module of the loaded media file. Because the second module can request that the user perform an activity different than that of the first module, the controller may output different audio and/or visual information than that output for the first module and/or log different information collected by sensors S. Controller 18 can continuously reiterate steps 42-54 until the controller detects in step 42 that it has just played the last module of the loaded media file. In step 56, controller 18 can conclude playback of the media file loaded in step 40.
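A minimal sketch of this playback loop (steps 40-56) is shown below in Python; `media_file.modules`, `module.finished()`, `ui.poll_input()`, and the other names are hypothetical stand-ins for whatever interfaces controller 18 actually exposes:

```python
def play_media_file(media_file, ui, sensors):
    """Illustrative control flow for steps 40-56 of the playback process."""
    for module in media_file.modules:                    # step 42: more modules?
        while not module.finished():                     # step 48: end of module?
            module.render_frame(ui, sensors)             # step 46: output A/V, log sensors
            ui.show_icons(module.adjustable_parameters)  # display icons 34-38
            event = ui.poll_input()                      # step 50: wait for user input
            if event is not None:
                module.adjust(event.parameter, event.value)  # step 52: adjust parameter
        module.save_adjustments()                        # step 54: persist adjustments
    ui.conclude_playback(media_file)                     # step 56: conclude playback
```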
Once the user has completed the scheduled activities, the user can upload the adjustments made to the user-programmable parameters of the media file into local server 24 and/or central server 26 along with other data that electronic device 10 collected about the user's performance. Local server 24 and/or central server 26 then can analyze the uploaded data to adapt the user's goals or otherwise track a user's progress.
Adjustments in global or module-specific user-programmable parameters during step 64 may require that controller 18 adjust the playback of the remainder of the media file. In step 66, controller 18 can make adjustments in other user-programmable and non-user-programmable parameters of the loaded media file in response to adjustments made in step 64. For example, if the user adjusts the number of repetitions from 15 to 25 as shown in
While the media file described above can store both audio and visual data, media files of the present invention also may be configured to store only audio data or only visual data. Furthermore, an electronic device of the present invention can permit a user to adjust user-programmable parameters associated with any type of file, not just a media file.
Image I can include text images, still images, and/or video images. For example, image I can include icons 34-38 as discussed above with respect to
While
Although
Orientation transducers incorporated in the electronic devices of the present invention can include a single multi-dimensional motion sensor or an assembly of sensors that can detect motion of the electronic device, including position, orientation, and/or movement. For example, an orientation transducer can include one or more multi-dimensional accelerometers, GPS systems, gyroscopes, magnetic sensors, mercury switches, or any combination thereof. The orientation transducers also can include a receiver that can triangulate the position, orientation, and/or movement of the electronic device based on signals received from multiple transmitters disposed near the receiver.
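For illustration only, a coarse rotation estimate of the kind an accelerometer-based orientation transducer could supply may be derived by projecting gravity onto the display plane and quantizing the angle to the nearest 90 degrees; the sketch below assumes readings `ax` and `ay` in units of g and a particular axis convention:

```python
import math

def screen_rotation_from_accel(ax, ay):
    """Estimate how far the device is rotated about the display normal,
    quantized to 0, 90, 180, or 270 degrees."""
    angle = math.degrees(math.atan2(ax, ay))    # 0 when gravity projects onto +Y
    return (int(round(angle / 90.0)) % 4) * 90
```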
Advantageously, an electronic device that can dynamically adjust the orientation of image I in the manner described may be useful for a user who may position the electronic device in a non-standard orientation. For example, an athlete may have an iPod™ strapped to his forearm. When the athlete has his arm extended in front of him (e.g., when he is stretching before a jog), he may wish to view image I on a display of the iPod™ in the orientation shown in
When electronic device 70 is oriented in a reference orientation (e.g., the positive Y direction shown in
Controller 78 can re-configure the buttons of user input 76 when electronic device 70 is re-oriented by, e.g., using one or more hardware switches or multiplexers. Alternatively, controller 78 can incorporate software that distributes signals received from the buttons of user input 76 to the appropriate function based on signals from orientation transducer 82.
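One possible software remapping of the kind just described is sketched below; the four button positions and the function names are assumed purely for illustration:

```python
# Functions assigned to four buttons in the reference orientation,
# listed clockwise starting from the button nearest the top of the display.
REFERENCE_LAYOUT = ["menu", "fast_forward", "play_pause", "rewind"]

def remap_buttons(rotation_degrees):
    """Return the function each physical button position should trigger
    after the device is rotated by 0, 90, 180, or 270 degrees."""
    shift = (rotation_degrees // 90) % 4
    return {pos: REFERENCE_LAYOUT[(pos + shift) % 4] for pos in range(4)}
```

A controller following this approach would consult the map each time the orientation transducer reports a new orientation and route button signals accordingly.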
Electronic device 70 also can provide tactile feedback to the user during manual re-configuration. For example, there can be mechanical stops that prevent the user from rotating user input component 77 past certain angles. Also, user input component 77 can incorporate mechanical protuberances that are configured to engage depressions (or vice versa). When the user manually re-configures user input component 77, electronic device 70 can provide tactile feedback to the user when the protuberances engage the depressions.
As illustrated in
As shown in
For example, as illustrated in
Electronic device 90 also can include hardware user input component 96, memory 100, and orientation transducer 102. Memory 100 can store, for example, media files with which image I is associated, data from which controller 98 can generate icons 104, and program code. The program code can include, for example, instructions for displaying, re-orienting, and re-configuring image I, icons 104, software input components, and hardware input components 105. Hardware user input component 96 can be, for example, a single-function or multi-function button. Multi-function buttons can signal controller 98 to perform one function when a user depresses the button for a predetermined amount of time and to perform a second function when the user depresses the button for a shorter amount of time.
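As a sketch of the press-duration distinction only, with `button.is_down()` and the 0.8-second threshold assumed for illustration:

```python
import time

LONG_PRESS_S = 0.8  # assumed threshold; the text says only "a predetermined amount of time"

def classify_press(button):
    """Block until the button is released and report a short or long press."""
    pressed_at = time.monotonic()
    while button.is_down():      # is_down() stands in for real button hardware I/O
        time.sleep(0.01)
    held = time.monotonic() - pressed_at
    return "long_press" if held >= LONG_PRESS_S else "short_press"
```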
The controller may permit a manufacturer or a user to set the reference orientation. For example, a user may set the reference orientation of the electronic device to the positive X direction, rather than, e.g., the positive Y direction shown in
In step 118, the controller can generate signals to re-orient an image in the reference orientation. Depending on the determination made in step 116, this can include rotating the image into the reference orientation and performing any other operations necessary to properly display the image in the new orientation, e.g., scaling.
To rotate the image, the controller can use transformation matrices or select from versions of the image stored in memory that already are disposed in predetermined orientations. For example, each time the controller determines that the reference orientation is different than the real-time orientation of the electronic device, the controller can determine a transformation matrix based on the reference orientation and the real-time orientation of the electronic device. The transformation matrix also can incorporate factors needed to properly display the image in the new orientation (e.g., scaling factors). The controller then can use the transformation matrix to re-orient the image. Advantageously, this technique may be useful for re-orienting images that are constantly changing, e.g., video images.
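For example, a rotation-plus-scaling transformation matrix of the general kind mentioned above can be built and applied to source-image coordinates as in the following sketch, which omits the translation and resampling steps a real display pipeline would also require:

```python
import math

def reorientation_matrix(rotation_degrees, scale=1.0):
    """Build a 2x2 transform that rotates image coordinates into the
    reference orientation and applies an optional scale factor."""
    r = math.radians(rotation_degrees)
    return [[scale * math.cos(r), -scale * math.sin(r)],
            [scale * math.sin(r),  scale * math.cos(r)]]

def transform_point(matrix, x, y):
    """Apply the transform to one (x, y) coordinate of the source image."""
    return (matrix[0][0] * x + matrix[0][1] * y,
            matrix[1][0] * x + matrix[1][1] * y)
```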
The electronic device also can store multiple versions of each image disposed in multiple orientations. The controller then can select and display the version of the image having the appropriate orientation based on the determination made in step 116. This may be useful for re-orienting still images that are repeatedly displayed in certain predetermined orientations.
Alternatively, rather than storing multiple versions of each image, the electronic device can store predetermined transformation matrices that can re-orient any image to be displayed by the electronic device. Again, the controller can incorporate, e.g., scaling factors into the stored transformation matrices. The controller then can select the appropriate transformation matrix based on the determination made in step 116 and apply the transformation matrix to re-orient the image. Again, this technique may be useful for re-orienting images that are constantly changing.
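A sketch of this lookup-table variant, with four supported orientations and pure rotation matrices assumed for illustration (any scaling factors would be folded into the stored entries ahead of time):

```python
PREDETERMINED_MATRICES = {
    0:   [[1, 0], [0, 1]],
    90:  [[0, -1], [1, 0]],
    180: [[-1, 0], [0, -1]],
    270: [[0, 1], [-1, 0]],
}

def matrix_for(rotation_degrees):
    """Select the stored transform matching the detected rotation."""
    return PREDETERMINED_MATRICES[rotation_degrees % 360]
```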
The controller also can use other techniques known in the art or otherwise to re-orient an image shown on a display of the electronic device.
In step 120, the controller can generate signals to re-configure one or more user input components of the electronic device. This can include re-assigning functions to one or more user input components, re-positioning one or more user input components, and/or re-positioning one or more icons that designate user input component(s).
In accordance with another aspect of the present invention, a controller can permit the user to reversibly deactivate certain orientations in which the controller is capable of displaying an image. For example, the controller can permit a user to limit an image to be displayed only in the following two orientations: the orientation of
Similarly, the controller can permit the user to reversibly deactivate certain configurations in which the controller is capable of re-configuring user input components. The controller also can permit the user to reversibly deactivate the automatic re-configuration feature of the electronic device. Again, the controller can instead require that the user toggle between configurations of the user input component by manually signaling the controller.
Although particular embodiments of the present invention have been described above in detail, it will be understood that this description is merely for purposes of illustration. Alternative embodiments of those described hereinabove also are within the scope of the present invention.
Electronic devices of the present invention can combine features described above with respect to
Electronic devices of the present invention also can include other components that are not illustrated in
Electronic devices of the present invention can be any electronic device that executes files having user-programmable parameters and/or any electronic device that a user may orient in a non-standard orientation. For example, the electronic device can be any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistants, programmable remote controls, pagers, laptop computers, printers, or any combination thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or any combination thereof.
The above described embodiments of the present invention are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims which follow.
This application is a continuation of U.S. patent application Ser. No. 11/729,291, filed on Mar. 27, 2007, which claims priority to U.S. Provisional Patent Application No. 60/846,414, filed on Sep. 21, 2006 (referred to below as “the incorporated provisional patent application”). These earlier applications are incorporated herein by reference. This application also is related to: U.S. Publication No. 2008/0086318, entitled “LIFESTYLE COMPANION SYSTEM,” (referred to herein as “the incorporated LIFESTYLE COMPANION document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,001,472, entitled “SYSTEMS AND METHODS FOR PROVIDING AUDIO AND VISUAL CUES VIA A PORTABLE ELECTRONIC DEVICE,” (referred to herein as “the incorporated AUDIO AND VISUAL CUES document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,235,724, entitled “DYNAMICALLY ADAPTIVE SCHEDULING SYSTEM,” (referred to herein as “the incorporated ADAPTIVE SCHEDULING SYSTEM document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,429,223, entitled “SYSTEMS AND METHODS FOR FACILITATING GROUP ACTIVITIES,” (referred to herein as “the incorporated GROUP ACTIVITIES document”), the entirety of which is incorporated herein by reference; U.S. Publication No. 2008/0077489, entitled “REWARDS SYSTEMS,” (referred to herein as “the incorporated REWARDS SYSTEMS document”), the entirety of which is incorporated herein by reference; U.S. Publication No. 2008/0076972, entitled “INTEGRATED SENSORS FOR TRACKING PERFORMANCE METRICS,” (referred to herein as “the incorporated INTEGRATED SENSORS document”), the entirety of which is incorporated herein by reference; and U.S. patent application Ser. No. 11/426,078, to King et al., filed on Jun. 23, 2006 (Publication No. 2006/0238517, published on Oct. 26, 2006), entitled “Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control” (referred to herein as “the incorporated King document”), the entirety of which is incorporated herein by reference. The incorporated provisional patent application, LIFESTYLE COMPANION document, AUDIO AND VISUAL CUES document, ADAPTIVE SCHEDULING SYSTEM document, GROUP ACTIVITIES document, REWARDS SYSTEMS document, INTEGRATED SENSORS document, and King document collectively are referred to herein as “the incorporated patent documents.”
Number | Name | Date | Kind |
---|---|---|---|
3675640 | Gatts | Jul 1972 | A |
4649552 | Yukawa | Mar 1987 | A |
4907795 | Shaw et al. | Mar 1990 | A |
5379057 | Clough et al. | Jan 1995 | A |
5412564 | Ecer | May 1995 | A |
5434913 | Tung et al. | Jul 1995 | A |
5452435 | Malouf et al. | Sep 1995 | A |
5471405 | Marsh | Nov 1995 | A |
5489249 | Brewer et al. | Feb 1996 | A |
5490247 | Tung et al. | Feb 1996 | A |
5645509 | Brewer et al. | Jul 1997 | A |
5673691 | Abrams et al. | Oct 1997 | A |
5675362 | Clough et al. | Oct 1997 | A |
5794018 | Vrvilo et al. | Aug 1998 | A |
5819735 | Mansfield | Oct 1998 | A |
5857939 | Kaufman | Jan 1999 | A |
5859979 | Tung et al. | Jan 1999 | A |
5890995 | Bobick et al. | Apr 1999 | A |
5890997 | Roth | Apr 1999 | A |
5913062 | Vrvilo et al. | Jun 1999 | A |
5954640 | Szabo | Sep 1999 | A |
5976083 | Richardson | Nov 1999 | A |
6013007 | Root et al. | Jan 2000 | A |
6032108 | Seiple et al. | Feb 2000 | A |
6039688 | Douglas et al. | Mar 2000 | A |
6077193 | Buhler et al. | Jun 2000 | A |
6135951 | Richardson et al. | Oct 2000 | A |
6159131 | Pfeffer | Dec 2000 | A |
6357147 | Darley et al. | Mar 2002 | B1 |
6447424 | Ashby et al. | Sep 2002 | B1 |
6463385 | Fry | Oct 2002 | B1 |
6527674 | Clem | Mar 2003 | B1 |
6539336 | Vock et al. | Mar 2003 | B1 |
6553037 | Pivowar et al. | Apr 2003 | B1 |
6560903 | Darley | May 2003 | B1 |
6582342 | Kaufman | Jun 2003 | B2 |
6585622 | Shum et al. | Jul 2003 | B1 |
6587127 | Leeke et al. | Jul 2003 | B1 |
6619835 | Kita | Sep 2003 | B2 |
6623427 | Mandigo | Sep 2003 | B2 |
6702719 | Brown et al. | Mar 2004 | B1 |
6716139 | Hosseinzadeh-Dolkhani et al. | Apr 2004 | B1 |
6725281 | Zintel et al. | Apr 2004 | B1 |
6735568 | Buckwalter et al. | May 2004 | B1 |
6749537 | Hickman | Jun 2004 | B1 |
6790178 | Mault et al. | Sep 2004 | B1 |
6793607 | Neil | Sep 2004 | B2 |
6808473 | Hisano et al. | Oct 2004 | B2 |
6898550 | Blackadar et al. | May 2005 | B1 |
6910068 | Zintel et al. | Jun 2005 | B2 |
6921351 | Hickman et al. | Jul 2005 | B1 |
6945911 | Jackowski | Sep 2005 | B2 |
7030735 | Chen | Apr 2006 | B2 |
7062225 | White | Jun 2006 | B2 |
7069308 | Abrams | Jun 2006 | B2 |
7070539 | Brown et al. | Jul 2006 | B2 |
7085590 | Kennedy et al. | Aug 2006 | B2 |
7171331 | Vock et al. | Jan 2007 | B2 |
7174227 | Kobayashi et al. | Feb 2007 | B2 |
7192387 | Mendel | Mar 2007 | B2 |
7200517 | Darley et al. | Apr 2007 | B2 |
7228168 | Dardik et al. | Jun 2007 | B2 |
7251454 | White | Jul 2007 | B2 |
7254516 | Case, Jr. et al. | Aug 2007 | B2 |
7261690 | Teller et al. | Aug 2007 | B2 |
7277726 | Ahya et al. | Oct 2007 | B2 |
7278966 | Hjelt et al. | Oct 2007 | B2 |
7292867 | Werner et al. | Nov 2007 | B2 |
7328239 | Berberian et al. | Feb 2008 | B1 |
7353139 | Burrell et al. | Apr 2008 | B1 |
7424718 | Dutton | Sep 2008 | B2 |
7454002 | Gardner et al. | Nov 2008 | B1 |
7496277 | Ackley et al. | Feb 2009 | B2 |
7519327 | White | Apr 2009 | B2 |
7523040 | Kirchhoff et al. | Apr 2009 | B2 |
7526524 | White | Apr 2009 | B2 |
7591760 | Gordon et al. | Sep 2009 | B2 |
7603255 | Case, Jr. et al. | Oct 2009 | B2 |
7618345 | Corbalis et al. | Nov 2009 | B2 |
7636754 | Zhu et al. | Dec 2009 | B2 |
7656824 | Wang et al. | Feb 2010 | B2 |
7670263 | Ellis et al. | Mar 2010 | B2 |
7683252 | Oliver et al. | Mar 2010 | B2 |
7753825 | Jaquish et al. | Jul 2010 | B2 |
7765245 | Nichols | Jul 2010 | B2 |
7827039 | Butcher et al. | Nov 2010 | B2 |
7841967 | Kahn et al. | Nov 2010 | B1 |
7946959 | Shum et al. | May 2011 | B2 |
7973231 | Bowen | Jul 2011 | B2 |
8001472 | Gilley et al. | Aug 2011 | B2 |
8066514 | Clarke | Nov 2011 | B2 |
8095120 | Blair et al. | Jan 2012 | B1 |
8529409 | Lesea-Ames | Sep 2013 | B1 |
20010054180 | Atkinson | Dec 2001 | A1 |
20020007313 | Mai et al. | Jan 2002 | A1 |
20020022551 | Watterson et al. | Feb 2002 | A1 |
20020022774 | Karnieli | Feb 2002 | A1 |
20020027164 | Mault et al. | Mar 2002 | A1 |
20020033753 | Imbo | Mar 2002 | A1 |
20020072932 | Swamy | Jun 2002 | A1 |
20020077784 | Vock et al. | Jun 2002 | A1 |
20020095460 | Benson | Jul 2002 | A1 |
20020107824 | Ahmed | Aug 2002 | A1 |
20030017914 | Jackowski | Jan 2003 | A1 |
20030028116 | Bimbaum | Feb 2003 | A1 |
20030059747 | Yoshida et al. | Mar 2003 | A1 |
20030097878 | Farringdon et al. | May 2003 | A1 |
20030175666 | Tanabe et al. | Sep 2003 | A1 |
20030204412 | Brier | Oct 2003 | A1 |
20030208113 | Mault et al. | Nov 2003 | A1 |
20030220971 | Kressin | Nov 2003 | A1 |
20030224337 | Shum et al. | Dec 2003 | A1 |
20030229900 | Reisman | Dec 2003 | A1 |
20040002041 | Peplinski et al. | Jan 2004 | A1 |
20040029684 | Zarif | Feb 2004 | A1 |
20040091843 | Albro et al. | May 2004 | A1 |
20040102931 | Ellis et al. | May 2004 | A1 |
20040106449 | Walker et al. | Jun 2004 | A1 |
20040107116 | Brown | Jun 2004 | A1 |
20040143673 | Kristjansson | Jul 2004 | A1 |
20040198555 | Anderson et al. | Oct 2004 | A1 |
20040201595 | Manchester | Oct 2004 | A1 |
20040220017 | Gordon et al. | Nov 2004 | A1 |
20040229729 | Albert et al. | Nov 2004 | A1 |
20050008993 | Bergh et al. | Jan 2005 | A1 |
20050008994 | Bisogno | Jan 2005 | A1 |
20050010638 | Richardson et al. | Jan 2005 | A1 |
20050014113 | Fleck et al. | Jan 2005 | A1 |
20050042582 | Graves | Feb 2005 | A1 |
20050044503 | Richardson et al. | Feb 2005 | A1 |
20050058970 | Perlman et al. | Mar 2005 | A1 |
20050060368 | Wang et al. | Mar 2005 | A1 |
20050070809 | Acres | Mar 2005 | A1 |
20050101314 | Levi | May 2005 | A1 |
20050107116 | Yamaguchi | May 2005 | A1 |
20050107216 | Lee et al. | May 2005 | A1 |
20050113649 | Bergantino | May 2005 | A1 |
20050125221 | Brown et al. | Jun 2005 | A1 |
20050125222 | Brown et al. | Jun 2005 | A1 |
20050125302 | Brown et al. | Jun 2005 | A1 |
20050164833 | Florio | Jul 2005 | A1 |
20050172311 | Hjelt et al. | Aug 2005 | A1 |
20050176461 | Bozzone et al. | Aug 2005 | A1 |
20050180341 | Nelson et al. | Aug 2005 | A1 |
20050202934 | Olrik et al. | Sep 2005 | A1 |
20050209050 | Bartels | Sep 2005 | A1 |
20050226172 | Richardson et al. | Oct 2005 | A1 |
20050227811 | Shum et al. | Oct 2005 | A1 |
20050240705 | Novotney et al. | Oct 2005 | A1 |
20050266385 | Bisogno | Dec 2005 | A1 |
20050287499 | Yeager | Dec 2005 | A1 |
20060004862 | Fisher et al. | Jan 2006 | A1 |
20060025282 | Redmann | Feb 2006 | A1 |
20060026052 | Klett et al. | Feb 2006 | A1 |
20060035200 | Pittman | Feb 2006 | A1 |
20060040244 | Kain | Feb 2006 | A1 |
20060047208 | Yoon | Mar 2006 | A1 |
20060063980 | Hwang et al. | Mar 2006 | A1 |
20060085272 | Case et al. | Apr 2006 | A1 |
20060107822 | Bowen | May 2006 | A1 |
20060173972 | Jung et al. | Aug 2006 | A1 |
20060197670 | Breibart | Sep 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060199155 | Mosher | Sep 2006 | A1 |
20060205564 | Peterson | Sep 2006 | A1 |
20060238517 | King et al. | Oct 2006 | A1 |
20060250524 | Roche | Nov 2006 | A1 |
20060253874 | Stark et al. | Nov 2006 | A1 |
20060256130 | Gonzalez | Nov 2006 | A1 |
20060263750 | Gordon | Nov 2006 | A1 |
20070026999 | Merolle et al. | Feb 2007 | A1 |
20070032345 | Padmanabhan et al. | Feb 2007 | A1 |
20070033068 | Rao et al. | Feb 2007 | A1 |
20070033069 | Rao et al. | Feb 2007 | A1 |
20070059672 | Shaw | Mar 2007 | A1 |
20070074619 | Vergo | Apr 2007 | A1 |
20070087686 | Holm et al. | Apr 2007 | A1 |
20070100595 | Earles et al. | May 2007 | A1 |
20070110074 | Bradley et al. | May 2007 | A1 |
20070113726 | Oliver et al. | May 2007 | A1 |
20070130476 | Mohanty | Jun 2007 | A1 |
20070135264 | Rosenberg | Jun 2007 | A1 |
20070136093 | Rankin et al. | Jun 2007 | A1 |
20070141540 | Borg | Jun 2007 | A1 |
20070166683 | Chang et al. | Jul 2007 | A1 |
20070192106 | Zilca | Aug 2007 | A1 |
20070219059 | Schwartz et al. | Sep 2007 | A1 |
20070265138 | Ashby | Nov 2007 | A1 |
20070287596 | Case et al. | Dec 2007 | A1 |
20070287597 | Cameron | Dec 2007 | A1 |
20080033827 | Kuang et al. | Feb 2008 | A1 |
20080096726 | Riley et al. | Apr 2008 | A1 |
20080155470 | Khedouri et al. | Jun 2008 | A1 |
20080177860 | Khedouri et al. | Jul 2008 | A1 |
20080195594 | Gerjets et al. | Aug 2008 | A1 |
20080195997 | Herberger | Aug 2008 | A1 |
20080215968 | Bekerman | Sep 2008 | A1 |
20080242521 | Einav | Oct 2008 | A1 |
20090047645 | Dibenedetto et al. | Feb 2009 | A1 |
20090087819 | Adachi et al. | Apr 2009 | A1 |
20090169171 | Massey et al. | Jul 2009 | A1 |
20090312105 | Koplar | Dec 2009 | A1 |
20090327894 | Rakib et al. | Dec 2009 | A1 |
20100081116 | Barasch et al. | Apr 2010 | A1 |
20120185905 | Kelley | Jul 2012 | A1 |
20120198317 | Eppolito et al. | Aug 2012 | A1 |
20120230510 | Dinescu et al. | Sep 2012 | A1 |
20130024880 | Moloney-Egnatios et al. | Jan 2013 | A1 |
20140122601 | Poston et al. | May 2014 | A1 |
20140285312 | Laaksonen et al. | Sep 2014 | A1 |
20140328571 | Roberts, Jr. et al. | Nov 2014 | A1 |
20150086174 | Abecassis et al. | Mar 2015 | A1 |
20150341410 | Schrempp et al. | Nov 2015 | A1 |
Number | Date | Country |
---|---|---|
1462979 | Sep 2004 | EP |
1512370 | Mar 2005 | EP |
1585014 | Oct 2005 | EP |
2253706 | Sep 1992 | GB |
2284060 | May 1995 | GB |
2409040 | Jun 2005 | GB |
2007-013228 | Jan 2007 | JP |
1999-0073234 | Oct 1999 | KR |
9714357 | Apr 1997 | WO |
0052604 | Sep 2000 | WO |
0116855 | Mar 2001 | WO |
0165460 | Sep 2001 | WO |
0215986 | Feb 2002 | WO |
02062425 | Aug 2002 | WO |
0293272 | Nov 2002 | WO |
2005032363 | Apr 2005 | WO |
2005036918 | Apr 2005 | WO |
2005082472 | Sep 2005 | WO |
2005087323 | Sep 2005 | WO |
2005093633 | Oct 2005 | WO |
2006042415 | Apr 2006 | WO |
2006079942 | Aug 2006 | WO |
2007099206 | Sep 2007 | WO |
Entry |
---|
Menta, “1200 Song MP3 Portable is a Milestone Player.” http://www.mp3newswire.net/stories/personaljuke.html (retrieved Jul. 17, 2010). |
Oliver et al. “Enhancing Exercise Performance through Real-time Physiological Monitoring and Music: A User Study.” Pervasive Health Conference and Workshops, pp. 1-10 (2007). |
Pike, Weight Watchers On-the-Go, Apr. 12, 2005, PC Magazine, vol. 24, Iss.6; p. 149. |
Creative NOMAD® Digital Audio Player User Guide, Jun. 1999. |
Creative NOMAD® Digital Getting Started Guide, Jan. 2000. |
Ericsson Inc. “Cellular Phone With Integrated MP3 Player.” Research Disclosure Journal No. 41815, Research Disclosure Database No. 418015 (Feb. 1999). |
Microsoft Zune Impressions- Part 1, DigitalArts Online Magazine, Dec. 4, 2006: <http://www.digitalartsonline.co.uk/blogs/index.cfm?entryid=184&blogid=2>. |
Podfitness Delivers on myMedia Promise, Utah Tech Jobs.com, Nov. 13, 2006: <http://utahtechjobs.com/index.php/2006/11/13/podfitness-delivers-on-mymedia-promise/>. |
Reinventing the Scroll Wheel, CNET news.com, Aug. 22, 2006: <http://www.news.com/2300-1041_3-6107951-1.html?tag=ne.gall.pg; http://www.news.com/2300-1041_3-6107951-2.html?tag=ne.gall.pg; http://www.news.com/2300-1041_3-6107951-3.html?tag=ne.gall.pg; http://www.news.com/2300-1041_3-6107951-4.html?tag=ne.gall.pg>. |
Rio 500 Getting Started Guide, 1999. |
Rio PMP300 User's Guide (1998). |
Sensei.com, Feb. 10, 2008: <http://www.sensei.com/senseipublic/Inner.aspx>. |
“Notice from the European Patent Office dated Oct. 1, 2007 concerning business methods.” Official Journal of the European Patent Office, vol. 30, No. 7, Nov. 1, 2007. |
“Statement in Accordance With the Notice From the European Patent Office dated Oct. 1, 2007 Concerning Business Methods.” Official Journal of the European Patent Office, Nov. 1, 2007. |
Number | Date | Country | |
---|---|---|---|
20140250380 A1 | Sep 2014 | US |
Number | Date | Country | |
---|---|---|---|
60846414 | Sep 2006 | US |
Relation | Number | Date | Country
---|---|---|---|
Parent | 11729291 | Mar 2007 | US |
Child | 14274940 | | US |