Variable I/O interface for portable media device

Abstract
The present invention can include electronic devices having variable input/output interfaces that can allow a user to interact with the devices with greater efficiency and in a more ergonomic manner. An electronic device of the present invention can display icons associated with user-programmable parameters of a media file. By interacting with the icons, a user can change the user-programmable parameters during playback of the media file. Changes to the user-programmable parameters can affect playback of the remainder of the media file. An electronic device of the present invention also can automatically re-orient images shown on a display and re-configure user input components based on the orientation of the electronic device.
Description
FIELD OF THE INVENTION

The present invention can relate to variable input/output interfaces for electronic devices.


BACKGROUND OF THE INVENTION

Currently available portable media devices may provide limited ways by which users can interact with the devices. For example, a currently available portable media device may show images on its display in only one orientation with respect to the housing of the device. When a user places the portable media device in a non-standard orientation, the user may have to angle his head in order to properly view the displayed images. Furthermore, when the media device is playing a media file, the media device may not permit the user to adjust parameters associated with the media file. In order to adjust any of the parameters of the media file, the media device may require the user first to stop playback of the media file.


SUMMARY OF THE INVENTION

The present invention can include electronic devices that have variable input/output (I/O) interfaces. The variable I/O interfaces can allow a user to interact with the devices in a more ergonomic manner and with greater efficiency.


An electronic device of the present invention can display one or more software icons associated with user-programmable parameters of a media file during playback of the media file. The electronic device can permit a user to select a user-programmable parameter of the media file by selecting a corresponding icon. While the selected icon is visually distinguished, the electronic device can permit the user to adjust the selected user-programmable parameter. The electronic device can change the software icons it displays to reflect different user-programmable parameters associated with different modules of the media file or with different media files.


An electronic device of the present invention also can automatically re-orient an image shown on its display and/or re-configure user input components based on the orientation of the electronic device. The user input components can include hardware or software input components.





BRIEF DESCRIPTION OF THE DRAWINGS

The above and other advantages of the present invention will be apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like reference characters refer to like parts throughout, and in which:



FIG. 1 illustrates one embodiment of an electronic device of the present invention in communication with a network of servers;



FIGS. 2A-2B illustrate a media file in accordance with one embodiment of the present invention;



FIGS. 3A-3B illustrate an electronic device that permits a user to adjust user-programmable parameters of a media file during playback of the media file in accordance with one embodiment of the present invention;



FIGS. 4A-4B illustrate a process for use by the electronic device of FIGS. 3A-3B in accordance with one embodiment of the present invention;



FIGS. 5A-5I illustrate electronic devices that can automatically re-orient an image shown on a display based on the orientation of the device in accordance with one embodiment of the present invention;



FIGS. 6A-6B illustrate an electronic device that can automatically re-orient an image shown on its display and re-configure user input components based on the orientation of the device in accordance with one embodiment of the present invention;



FIGS. 6C-6D illustrate an electronic device that can automatically re-orient an image shown on its display based on the orientation of the device and permit a user to manually re-configure a user input component in accordance with one embodiment of the present invention;



FIGS. 7A-7E illustrate electronic devices that can automatically re-orient an image shown on a display and re-configure user input components based on the orientation of the device in accordance with one embodiment of the present invention; and



FIG. 8 illustrates a process for use by an electronic device of one embodiment of the present invention to re-orient an image shown on a display and re-configure a user input component based on the orientation of the device.





DETAILED DESCRIPTION OF THE INVENTION


FIG. 1 illustrates one embodiment of an electronic device of the present invention in communication with a network of servers. Electronic device 10 can include housing 12, display 14, user input component 16, controller 18, memory 20, and orientation transducer 22. In one embodiment of the present invention, electronic device 10 can be a portable media device similar to that sold under the trademark iPod™ by Apple Computer, Inc. of Cupertino, Calif.


User input component 16 can include a clickwheel similar to that incorporated in the iPod™. The clickwheel can include one or more buttons and a touchpad. The touchpad may permit a user to scroll by running the user's finger around the track of the clickwheel. In alternative embodiments, user input component 16 can include, for example, one or more buttons, a touchpad, a touchscreen display, electronics for accepting voice commands, antennas for accepting signals from other electronic devices, infrared ports for accepting signals from other electronic devices, or any combination thereof. Controller 18 can include one or more processors, ASICs, circuits, or any combination thereof. Memory 20 can include read only memory, random access memory, solid-state memory, buffer memory, hard drive memory, any other memory known in the art or otherwise, or any combination thereof.


In accordance with one aspect of the present invention, a user can use portable media device 10, local server 24 (e.g., a user's personal computer), or central server 26 (e.g., a server that hosts a website) to design or compile a program of activities for the user to perform or for another user to perform. The program of activities can include instructions and other audio and/or visual information. The instructions and other audio/visual information can be stored in a media file, which the user can play back on, e.g., electronic device 10. For example, the incorporated patent documents describe in greater detail programs of activities (e.g., compilations of fitness activities, nutritional activities, and/or medical activities) that a user can compile, e.g., on a local or central server and playback on electronic device 10.


If the user compiles the program of activities on local server 24 or central server 26, the local or central server can transmit the media file to portable media device 10 using a cable or a wireless communication protocol known in the art or otherwise. The portable media device can store the media file in memory 20. When the user wishes to perform activities in accordance with the compiled program, controller 18 can play back the media file. Electronic device 10 can output audio data stored in the media file using speakers (not shown) disposed within portable media device 10 or an accessory coupled thereto. Portable media device 10 can output visual data stored in the media file using display 14. Visual data can include text images, still images, and/or video images.


In one aspect of the present invention, portable electronic device 10 can accept data from sensors S that can capture information about the user's activities. Sensors S can include sensors used in, for example, pedometers, accelerometers, heart rate monitors, oximeters, GPS tracking devices, devices that measure temperature, devices that measure heat flux, electrocardiogram devices, devices having activity tracking sensors similar to those described in the incorporated INTEGRATED SENSORS document, devices having activity tracking sensors similar to those described in the other incorporated patent documents, or any combination thereof.


A user can wear sensors S in the user's clothing or accessories. Alternatively, sensors S can be disposed within electronic device 10 or another electronic device utilized by the user. While the user is performing the activities programmed into the media file, sensors S can track the user's performance and electronic device 10 can log data from sensors S in memory 20. Once the user has completed the scheduled activities, the user can upload the data that electronic device 10 has collected about the user's performance into local server 24 and/or central server 26. Local server 24 and/or central server 26 then can analyze that data to adapt the user's goals or otherwise track a user's progress, for example, in ways similar to those described in the incorporated patent documents.



FIGS. 2A-2B illustrate a media file in accordance with one embodiment of the present invention. Media file 30 can be a compilation of one or more activity modules. Each module can store audio and/or visual information for one type of activity. When the user is compiling the program of activities stored in media file 30, the user may choose the modules the user wants to include in media file 30 and set global user-programmable parameters associated with media file 30. For example, a global user-programmable parameter can include the total amount of time a user has allocated to perform all the activities associated with media file 30. The user also may set module-specific user-programmable parameters associated with each module the user includes in the media file. For example, module 1 can be a fitness module instructing the user to lift weights using barbells. User-programmable parameters associated with module 1 can include the song a user wants to hear when exercising to this module, the number of pounds the user wants to lift per barbell, and the number of repetitions a user wants to perform.
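By way of illustration only, and not by way of limitation, the following sketch shows one way the module and parameter structure described above could be organized. All names and values are hypothetical and do not reflect any particular implementation of media file 30.

```python
# Hypothetical sketch of media file 30 of FIGS. 2A-2B: a compilation of
# activity modules, each with module-specific user-programmable parameters,
# plus global parameters that apply to the whole file.
from dataclasses import dataclass, field

@dataclass
class ActivityModule:
    name: str                                     # e.g., "barbell curls"
    params: dict = field(default_factory=dict)    # module-specific parameters

@dataclass
class MediaFile:
    global_params: dict = field(default_factory=dict)  # e.g., total allotted time
    modules: list = field(default_factory=list)

workout = MediaFile(
    global_params={"total_time_min": 45},
    modules=[
        ActivityModule("barbell curls",
                       params={"song": "track01.mp3", "pounds": 10, "reps": 15}),
        ActivityModule("jogging", params={"time_min": 20}),
    ],
)
```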


When the user is ready to perform the activities associated with media file 30, the user can initiate playback of media file 30 on electronic device 10. Electronic device 10 then can begin playing module 1 by outputting the audio and/or visual information associated therewith. Upon the completion of module 1, electronic device 10 can automatically begin playing module 2 by outputting the audio and/or visual information associated therewith. This process can continue until electronic device 10 has played all of the modules.


In accordance with one aspect of the present invention, controller 18 can be configured to permit a user to adjust global and/or module-specific user-programmable parameters of media file 30 during playback of the media file. Controller 18 can display one or more icons on display 14 that correspond to the user-programmable parameters of media file 30. Using user input component 16, the user may select the icon corresponding to the user-programmable parameter he wishes to modify. Once controller 18 has visually distinguished the selected icon, the user can modify the user-programmable parameter by manipulating user input component 16.



FIGS. 3A-3B illustrate an electronic device that permits a user to adjust user-programmable parameters of a media file during playback of the media file in accordance with one embodiment of the present invention. Controller 18 of electronic device 10 can display multiple icons 34-38 on display 14 of electronic device 10. Icons 34-36 can correspond to global and/or module-specific user-programmable parameters of media file 30. Icon 38 can correspond to media file-independent parameters of the electronic device associated with playback of the media file (e.g., volume and/or an equalizer). Controller 18 can permit a user to tab through icons 34-38 using user input component 16 until a desired icon is visually distinguished from the other icons (e.g., highlighted or bolded). Once the desired icon is selected, the controller may permit the user to modify parameters associated with the selected icon by manipulating user input 16.


In the illustrative embodiment shown in FIGS. 3A-3B, electronic device 10 may be playing a media file that walks a user through a fitness routine similar to that discussed in the incorporated patent documents. The user may have programmed the fitness routine with a fitness module that instructs the user to lift weights using barbells. As shown in FIG. 3A, the user may have initially programmed the module to instruct the user to perform 15 repetitions of bicep curls using 10 pound barbells. However, during the fitness routine, the user may decide he wants to increase the number of repetitions to 25. Electronic device 10 can permit a user to indicate this adjustment by toggling through icons 34-38 by manipulating user input 16 (e.g., the buttons of a clickwheel) until the user reaches desired icon 36. Icon 36 can correspond to the user-programmable parameter associated with the number of repetitions saved in the media file. As shown in FIG. 3B, once desired icon 36 is visually distinguished from the other icons (e.g., highlighted), electronic device 10 can permit the user to adjust the number of repetitions (e.g., from 15 to 25) by manipulating user input 16 (e.g., by scrolling through values using a clickwheel similar to that incorporated in the iPod™). Electronic device 10 then can display adjustments to the selected user-programmable parameter in the corresponding icon, as shown in FIG. 3B.



FIGS. 4A-4B illustrate a process for use by electronic device 10 of FIGS. 3A-3B in accordance with one embodiment of the present invention. In step 40, controller 18 of electronic device 10 can accept signals from user input component 16 to load a desired media file for playback. Once the media file is loaded, controller 18 can check in step 42 whether the controller has played the last module of the media file. Since the controller has just loaded the user-selected media file, controller 18 can proceed to step 44, in which it can load the first module in the media file.


In step 46, controller 18 can begin playing the first module of the media file. This can include outputting audio and/or visual information associated with that module, displaying icon(s) 38 on display 14 corresponding to media file-independent parameters, and logging information collected by sensors S. Visual information associated with a module can include icons 34-36 that identify user-programmable parameters that a user can adjust during playback of that module, the amount of time elapsed, visual cues for the proper way to perform the activity associated with the current module playing, visual cues of the activity associated with the next module to be played from the media file, etc.


In step 48, controller 18 can check whether the end of the module has been reached. If not, in step 50, controller 18 can wait to accept signals from user input component 16 that indicate the user wishes to adjust either a user-programmable parameter associated with the media file or a media file-independent parameter. Once the controller receives one or more signals from user input component 16, controller 18 can permit the user to adjust a user-programmable parameter or one of the media file-independent parameters in step 52. Thereafter, controller 18 can reiterate steps 46-52 to continue playing the first module of the loaded media file in accordance with the adjustment made to a user-programmable parameter in step 52.


Once controller 18 detects that it has reached the end of the first media module in step 48, controller 18 can save any adjustments made to user-programmable parameters associated with the first media module in step 54. Alternatively, the controller can save adjustments to the user-programmable parameters immediately after the adjustments are made.


Thereafter, controller 18 can automatically load and begin playback of a second media module of the loaded media file. Because the second module can request that the user perform an activity different from that of the first module, the controller may output different audio and/or visual information than that output for the first module and/or log different information collected by sensors S. Controller 18 can reiterate steps 42-54 until the controller detects in step 42 that it has just played the last module of the loaded media file. In step 56, controller 18 can conclude playback of the media file loaded in step 40.
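The overall control flow of steps 40-56 can be sketched, purely for purposes of illustration, as follows. Each helper callable is a hypothetical stand-in for device-specific behavior and is not part of the described embodiment.

```python
# A minimal sketch of the playback loop of FIG. 4A. The helpers passed in
# (play_step, poll_input, apply_adjustment, save_module) are hypothetical.
def play_media_file(modules, play_step, poll_input, apply_adjustment, save_module):
    """Play each module in turn, honoring adjustments made mid-playback."""
    for module in modules:                        # steps 42-44: load next module
        while play_step(module):                  # steps 46, 48: output A/V data
            event = poll_input()                  # step 50: check for user signals
            if event is not None:
                apply_adjustment(module, event)   # step 52: adjust a parameter
        save_module(module)                       # step 54: persist adjustments
    # step 56: playback of the loaded media file concludes
```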


Once the user has completed the scheduled activities, the user can upload the adjustments made to the user-programmable parameters of the media file into local server 24 and/or central server 26 along with other data that electronic device 10 collected about the user's performance. Local server 24 and/or central server 26 then can analyze the uploaded data to adapt the user's goals or otherwise track a user's progress.



FIG. 4B illustrates a process for use by electronic device 10 in step 52 of FIG. 4A in accordance with one embodiment of the present invention. In response to receipt of signals from user input component 16 in step 50, controller 18 can send signals to display 14 to visually distinguish a selected icon 34-38 in step 60. In step 62, controller 18 can accept signals from user input component 16 that indicate the user is requesting to adjust the user-programmable parameter or media file-independent parameter corresponding to the selected icon. In step 64, controller 18 can adjust the selected user-programmable or media file-independent parameter.


Adjustments in global or module-specific user-programmable parameters during step 64 may require that controller 18 adjust the playback of the remainder of the media file. In step 66, controller 18 can make adjustments in other user-programmable and non-user-programmable parameters of the loaded media file in response to adjustments made in step 64. For example, if the user adjusts the number of repetitions from 15 to 25 as shown in FIGS. 3A-3B, controller 18 may need to allocate a greater amount of time for the user to perform, and for the controller to play back, the entire media file. Thus, controller 18 may need to make a corresponding adjustment to a global user-programmable parameter of the loaded media file, e.g., the total amount of time a user has allocated to perform all the activities of media file 30. Alternatively, rather than changing a global parameter of the media file, controller 18 can instead adjust a user-programmable parameter of a media module that is scheduled to be played after the current module. For example, the controller can instead shorten the time allocated for the user to perform another fitness module, e.g., a jogging module.
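A worked example of this cascade follows, with pacing numbers assumed purely for illustration.

```python
# Hypothetical example: raising repetitions from 15 to 25 lengthens the
# current module, so the controller either grows the global time budget or
# shortens a later module (e.g., the jogging module) to compensate.
SECONDS_PER_REP = 6  # assumed pacing, for illustration only

def apply_rep_change(global_params, old_reps, new_reps, later_module=None):
    extra_min = (new_reps - old_reps) * SECONDS_PER_REP / 60.0
    if later_module is None:
        global_params["total_time_min"] += extra_min   # adjust a global parameter
    else:
        later_module["time_min"] -= extra_min          # shorten a later module

globals_ = {"total_time_min": 45.0}
jogging = {"time_min": 20.0}
apply_rep_change(globals_, 15, 25, later_module=jogging)
print(jogging["time_min"])   # 19.0: one minute shifted from jogging to curls
```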


While the media file described above can store both audio and visual data, media files of the present invention also may be configured to store only audio data or only visual data. Furthermore, an electronic device of the present invention can permit a user to adjust user-programmable parameters associated with any type of file, not just a media file.



FIGS. 5A-5D illustrate an electronic device of the present invention that can automatically re-orient image I shown on its display based on the orientation of the device. For example, assume that electronic device 10 and image I are disposed in a reference orientation in FIG. 5A. When electronic device 10 is rotated from its reference orientation, controller 18 can detect the rotation based on signals from orientation transducer 22. Responsive thereto, controller 18 can rotate image I on display 14 so that it remains in the reference orientation (e.g., oriented upright in the positive Y direction). For example, when electronic device 10 is rotated by 90 degrees so that it is positioned parallel to the X axis (the orientation shown in FIG. 5B or 5C), controller 18 can maintain image I on display 14 in the reference orientation. Similarly, when electronic device 10 is rotated by an additional 90 degrees as shown in FIG. 5D, controller 18 also can maintain image I on display 14 in the reference orientation.
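One possible way, presented for illustration only, to derive the compensating rotation is to snap the gravity direction reported by a two-axis accelerometer to the nearest quadrant. The function name and axis conventions below are assumptions, not features of the described embodiment.

```python
# Hypothetical sketch: map accelerometer readings (ax, ay) in the device's
# X-Y plane to a compensating rotation of 0, 90, 180, or 270 degrees so that
# image I stays in the reference orientation, as in FIGS. 5A-5D.
import math

def display_rotation_degrees(ax, ay):
    tilt = math.degrees(math.atan2(ax, ay))   # device tilt away from the +Y axis
    return (round(tilt / 90.0) * 90) % 360    # snap to the nearest 90 degrees

print(display_rotation_degrees(0.0, 1.0))    # device upright -> rotate image 0
print(display_rotation_degrees(1.0, 0.0))    # device turned 90 -> rotate image 90
```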


Image I can include text images, still images, and/or video images. For example, image I can include icons 34-38 as discussed above with respect to FIGS. 1-4.


While FIGS. 5A-5D illustrate image I oriented in either a portrait or landscape orientation based on the orientation of electronic device 10, controller 18 also can be configured to orient image I in intermediate orientations.



FIGS. 5E-5I illustrate an electronic device of the present invention that can re-orient image I in one of a plurality of orientations ranging from the portrait orientation of FIG. 5E to the landscape orientation of FIG. 5I, inclusive. Thus, as shown in FIGS. 5F-5H, the orientation of image I can be rotated by less than 90 degrees with respect to either the orientation of FIG. 5E or 5I. In one embodiment of the present invention, each of the orientations shown in FIGS. 5E-5I can be a discrete orientation in which image I can be disposed. Alternatively, the orientations shown in FIGS. 5E-5I can represent a non-discrete set of orientations in which image I can be disposed. In the latter case, electronic device 10 can be configured to re-orient image I so that image I continuously tracks the orientation of electronic device 10. Thus, each successive re-orientation of image I can merge smoothly with the next so that changes in the orientation of image I are not disjointed.
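The non-discrete case can be sketched, again purely for illustration, by having the image angle continuously track the device tilt through a low-pass filter; the smoothing factor alpha below is an assumed tuning value.

```python
# Hypothetical sketch of continuous tracking: the image angle follows the
# measured device tilt, filtered so successive re-orientations merge smoothly
# rather than appearing disjointed.
import math

class ContinuousOrienter:
    def __init__(self, alpha=0.2):
        self.alpha = alpha        # assumed smoothing factor, 0 < alpha <= 1
        self.angle = 0.0          # current image rotation, in degrees

    def update(self, ax, ay):
        target = math.degrees(math.atan2(ax, ay))
        # Move along the shortest arc toward the measured tilt.
        delta = (target - self.angle + 180.0) % 360.0 - 180.0
        self.angle += self.alpha * delta
        return self.angle
```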


Although FIGS. 5A-5I illustrate different orientations of electronic device 10 in two dimensions, electronic device 10 also can be oriented in three dimensions. Accordingly, controller 18 can be configured to re-orient image I based on the orientation of electronic device 10 in three dimensions. For example, controller 18 can re-orient image I based on the orientation of device 10 along the X-, Y-, and Z-axes. When electronic device 10 is tilted out of the X-Y plane, controller 18 can detect the rotation based on signals from orientation transducer 22. Responsive thereto, controller 18 can re-orient image I so that it remains in a manufacturer- or user-defined reference orientation.


Orientation transducers incorporated in the electronic devices of the present invention can include a single multi-dimensional motion sensor or an assembly of sensors that can detect motion of the electronic device, including position, orientation, and/or movement. For example, an orientation transducer can include one or more multi-dimensional accelerometers, GPS systems, gyroscopes, magnetic sensors, mercury switches, or any combination thereof. The orientation transducers also can include a receiver that can triangulate the position, orientation, and/or movement of the electronic device based on signals received from multiple transmitters disposed near the receiver.


Advantageously, an electronic device that can dynamically adjust the orientation of image I in the manner described may be useful for a user who may position the electronic device in a non-standard orientation. For example, an athlete may have an iPod™ strapped to his forearm. When the athlete has his arm extended in front of him (e.g., when he is stretching before a jog), he may wish to view image I on a display of the iPod™ in the orientation shown in FIG. 5A. However, when the athlete's forearm is positioned close to his chest (for example, when the athlete wants to check the elapsed time or the song that is currently playing on his iPod™), it may be more convenient for the athlete to view image I in the orientation of FIG. 5B or 5D. Electronic device 10 may re-orient image I on its display as the athlete moves his forearm closer to his chest.



FIGS. 6A-6B illustrate one embodiment of an electronic device that can automatically re-configure a hardware user input component based on the orientation of the device. Electronic device 70 can include housing 72, display 74, hardware user input component 76, controller 78, memory 80, and orientation transducer 82. Hardware user input component 76 can include, e.g., multiple buttons associated with functions A-E.


When electronic device 70 is oriented in a reference orientation (e.g., the positive Y direction shown in FIG. 6A), buttons associated with functions A-E also can be configured in a reference orientation (e.g., with the button associated with function A disposed closest to display 74). When electronic device 70 rotates from its reference orientation (e.g., into the orientation shown in FIG. 6B), controller 78 can detect the rotation based on signals from orientation transducer 82. Responsive thereto, controller 78 can re-configure user input component 76 so that the user does not perceive a change in the configuration of the user input component despite re-orientation of electronic device 70. For example, controller 78 can re-configure user input component 76 by re-assigning functions A-D so that, relative to the button assigned function E: function A always is assigned to the button disposed in the positive Y direction; function B always is assigned to the button disposed in the negative X direction; function C always is assigned to the button disposed in the negative Y direction; and function D always is assigned to the button disposed in the positive X direction.
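This re-assignment rule can be sketched, for illustration only, as a simple lookup. The clockwise button ordering and quarter-turn granularity below are assumptions.

```python
# Hypothetical sketch of the FIGS. 6A-6B re-assignment: functions A-D rotate
# around center button E so that each stays in a fixed direction relative to
# the reference orientation regardless of how the housing is turned.
RING = ["top", "right", "bottom", "left"]   # housing button positions, clockwise
WORLD_FUNCS = ["A", "D", "C", "B"]          # functions by world direction: +Y, +X, -Y, -X

def button_map(rotation_deg):
    """Return {housing position: function} for a device rotated clockwise."""
    shift = (rotation_deg // 90) % 4
    return {RING[(w - shift) % 4]: WORLD_FUNCS[w] for w in range(4)}

print(button_map(0))    # {'top': 'A', 'right': 'D', 'bottom': 'C', 'left': 'B'}
print(button_map(90))   # the housing's left button, now facing up, takes A
```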


Controller 78 can re-configure the buttons of user input component 76 upon re-orientation of electronic device 70 by using, e.g., one or more hardware switches or multiplexers. Alternatively, controller 78 can incorporate software that distributes signals received from the buttons of user input component 76 to the appropriate function based on signals from orientation transducer 82.



FIGS. 6C-6D illustrate electronic device 70 with an alternative user input component that can be re-configured with re-orientation of the device. User input component 77 can be similar to the clickwheel incorporated in some models of iPods™. User input component 77 can incorporate touchpad TP and a plurality of buttons assigned with functions A-E. In one embodiment of the present invention, user input component 77 can be manually re-configured by a user when the user changes the orientation of device 70. For example, user input component 77 can be configured to be manually rotated clockwise or counterclockwise. Thus, when a user re-orients electronic device 70 from the orientation of FIG. 6C to that of FIG. 6D, the user also can rotate user input component 77 by 90 degrees to dispose the user input component in a more ergonomic configuration.


Electronic device 70 also can provide tactile feedback to the user during manual re-configuration. For example, there can be mechanical stops that prevent the user from rotating user input 77 past certain angles. Also, user input component 77 can incorporate mechanical protuberances that are configured to engage depressions (or vice versa). When the user manually re-configures user input component 77, electronic device 70 can provide tactile feedback to the user when the protuberances engage the depressions.



FIGS. 7A-7E illustrate an electronic device that can automatically re-configure user input components based on the orientation of the device. Electronic device 90 can include housing 92, display 94, and controller 98. Controller 98 can overlay icons 104 onto image I shown on display 94 or incorporate icons 104 into image I. Icons 104 can be disposed on display 94 in locations on or proximate to software or hardware user input components. The icons can have shapes, text, or images that indicate the functions of the input components designated by the icons. For example, if display 94 is a touchscreen display, icons 104 can indicate the locations of software input components, e.g., software buttons assigned with functions A-F (see FIGS. 7A-7C). A user then may actuate any of the software buttons by contacting or otherwise interacting with display 94 at the locations at which icons 104 are disposed. In addition to buttons, software user input components also can include linear or circular scrolls, sliders, dials, any other user interfaces that can be emulated on display 94, or any combination thereof. Touchscreen displays can include any touch or proximity sensitive display known in the art or otherwise that can simultaneously (1) display image I and icons 104, and (2) detect when a user wants to actuate a software user input component.
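Detecting that a user wants to actuate a software button can be sketched, for illustration only, as a hit test of the touch location against each icon's bounding box; the coordinates below are hypothetical.

```python
# Hypothetical sketch: on a touchscreen display 94, each icon 104 doubles as
# a software button; a touch actuates whichever icon's bounding box contains
# the contact point.
ICONS = {
    "A": (0, 0, 40, 40),       # (x0, y0, x1, y1) in display coordinates
    "B": (0, 50, 40, 90),
    "C": (0, 100, 40, 140),
}

def hit_test(x, y):
    """Return the function of the software button under the touch, if any."""
    for function, (x0, y0, x1, y1) in ICONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return function
    return None

print(hit_test(20, 60))   # -> 'B'
```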


As illustrated in FIGS. 7D-7E, icons 104 also can be disposed proximate to hardware input components 105. Hardware input components 105 can include hardware buttons assigned with functions G-J, a linear or circular scroll, a slider, a dial, one of the user input components described in the incorporated King document, any other type of hardware that can be used for user input, or any combination thereof.


As shown in FIGS. 7A and 7D, controller 98 can display image I and icons 104 on display 94 in a reference orientation. When electronic device 90 changes orientation, e.g., rotates counter-clockwise by 90 degrees as shown in FIGS. 7B, 7C, and 7E, controller 98 can detect the rotation based on signals from orientation transducer 102. Responsive thereto, controller 98 can re-orient image I and re-configure the user input components designated by icons 104.


For example, as illustrated in FIG. 7B, controller 98 can re-position the software input components and icons 104 so that the composite image formed by image I and icons 104 stays the same despite the re-configuration. Alternatively, as illustrated in FIG. 7C, controller 98 can re-position the software input components and icons 104 so that the composite image formed by image I and icons 104 changes as a result of the re-configuration. In the embodiment of FIG. 7E, controller 98 can re-assign functions to one or more of components 105 and re-position icons 104. Again, functions G-J can be re-assigned so that the composite image formed by image I and icons 104 changes as a result of the re-configuration. An electronic device of the present invention may change the composite image formed by image I and icons 104, for example, to better distribute icons 104 within a display (as shown in FIG. 7C) and/or to consolidate all active user input components along a single side of the display that is more accessible to a user (as shown in FIG. 7E).
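The FIG. 7B behavior, in which the composite image appears unchanged, amounts to mapping each icon rectangle through the display's rotation. The following sketch assumes a 90-degree counter-clockwise turn and an illustrative display width.

```python
# Hypothetical sketch: map each icon rectangle into the rotated display frame
# so the composite of image I and icons 104 appears unchanged (FIG. 7B).
W = 320   # assumed display width in the original orientation

def rotate_rect_ccw(rect, width=W):
    """Map (x0, y0, x1, y1) through a 90-degree counter-clockwise screen turn,
    under which a point (x, y) maps to (y, width - x)."""
    x0, y0, x1, y1 = rect
    return (y0, width - x1, y1, width - x0)

print(rotate_rect_ccw((0, 0, 40, 40)))   # -> (0, 280, 40, 320)
```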


Electronic device 90 also can include hardware user input component 96, memory 100, and orientation transducer 102. Memory 100 can store, for example, media files with which image I is associated, data from which controller 98 can generate icons 104, and program code. The program code can include, for example, instructions for displaying, re-orienting, and re-configuring image I, icons 104, software input components, and hardware input components 105. Hardware user input component 96 can be, for example, a single or multi-functional button. Multi-function buttons can signal controller 98 to perform one function when a user depresses the button for a predetermined amount of time and signal controller 98 to perform a second function when a user depresses the button for a shorter amount of time.



FIG. 8 illustrates one embodiment of a process for use by an electronic device of the present invention to re-orient an image on a display and re-configure one or more user input components of the device based on the orientation of the electronic device. In step 112, a controller can accept signals from an orientation transducer disposed within the electronic device. In step 114, the controller can analyze the signals from the orientation transducer to determine the orientation of the electronic device. In step 116, the controller can determine the difference between the orientation of the electronic device and a reference orientation.


The controller may permit a manufacturer or a user to set the reference orientation. For example, a user may set the reference orientation of the electronic device in the positive X direction, rather than, e.g., the positive Y direction shown in FIG. 5A. Thereafter, the controller can maintain images displayed by the device oriented in the positive X direction regardless of the orientation in which the electronic device is disposed.


In step 118, the controller can generate signals to re-orient an image in the reference orientation. Depending on the determination made in step 116, this can include rotating the image into the reference orientation and performing any other operations necessary to properly display the image in the new orientation, e.g., scaling.


To rotate the image, the controller can use transformation matrices or select from versions of the image stored in memory that already are disposed in predetermined orientations. For example, each time the controller determines that the reference orientation is different than the real-time orientation of the electronic device, the controller can determine a transformation matrix based on the reference orientation and the real-time orientation of the electronic device. The transformation matrix also can incorporate factors needed to properly display the image in the new orientation (e.g., scaling factors). The controller then can use the transformation matrix to re-orient the image. Advantageously, this technique may be useful for re-orienting images that are constantly changing, e.g., video images.
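In two dimensions, such a transformation can take the standard rotation-plus-scaling form shown below. This is the general formula, not a formula specific to the described embodiment; theta and s are the rotation angle and scaling factor, respectively.

```latex
% Standard 2-D rotation with uniform scaling: \theta is the angle between the
% device's real-time orientation and the reference orientation, and s is a
% scaling factor chosen to fit the rotated image to the display.
\begin{pmatrix} x' \\ y' \end{pmatrix}
  = s \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
    \begin{pmatrix} x \\ y \end{pmatrix}
```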


The electronic device also can store multiple versions of each image disposed in multiple orientations. The controller then can select and display the version of the image having the appropriate orientation based on the determination made in step 116. This may be useful for re-orienting still images that are repeatedly displayed in certain predetermined orientations.


Alternatively, rather than storing multiple versions of each image, the electronic device can store predetermined transformation matrices that can re-orient any image to be displayed by the electronic device. Again, the controller can incorporate, e.g., scaling factors into the stored transformation matrices. The controller then can select the appropriate transformation matrix based on the determination made in step 116 and apply the transformation matrix to re-orient the image. Again, this technique may be useful for re-orienting images that are constantly changing.


The controller also can use other techniques known in the art or otherwise to re-orient an image shown on a display of the electronic device.


In step 120, the controller can generate signals to re-configure one or more user input components of the electronic device. This can include re-assigning functions to one or more user input components, re-positioning one or more user input components, and/or re-positioning one or more icons that designate user input component(s).


In accordance with another aspect of the present invention, a controller can permit the user to reversibly deactivate certain orientations in which the controller is capable of displaying an image. For example, the controller can permit a user to limit an image to be displayed only in the following two orientations: the orientation of FIG. 5A or the orientation of FIG. 5B. The controller also may permit the user to reversibly deactivate the automatic re-orientation feature of the electronic device. Instead, the controller can require that the user toggle images on the display among different orientations by manually signaling the controller using a user input component.


Similarly, the controller can permit the user to reversibly deactivate certain configurations in which the controller is capable of re-configuring user input components. The controller also can permit the user to reversibly deactivate the automatic re-configuration feature of the electronic device. Again, the controller can instead require that the user toggle between configurations of the user input component by manually signaling the controller.


Although particular embodiments of the present invention have been described above in detail, it will be understood that this description is merely for purposes of illustration. Alternative embodiments of those described hereinabove also are within the scope of the present invention.


Electronic devices of the present invention can combine features described above with respect to FIGS. 1-8. For example, an electronic device of the present invention can permit a user to change user-programmable parameters of a media file, automatically re-orient images based on the orientation of the electronic device, and automatically re-configure user input components based on the orientation of the electronic device.


Electronic devices of the present invention also can include other components that are not illustrated in FIGS. 1-8 for the sake of simplicity, e.g., a receiver to receive data from sensors S, additional input components, and/or additional output components.


Electronic devices of the present invention can include any electronic device that executes files having user-programmable parameters and/or any electronic device that a user may orient in a non-standard orientation. For example, the electronic device can be any portable, mobile, hand-held, or miniature consumer electronic device. Illustrative electronic devices can include, but are not limited to, music players, video players, still image players, game players, other media players, music recorders, video recorders, cameras, other media recorders, radios, medical equipment, calculators, cellular phones, other wireless communication devices, personal digital assistants, programmable remote controls, pagers, laptop computers, printers, or any combination thereof. Miniature electronic devices may have a form factor that is smaller than that of hand-held devices. Illustrative miniature electronic devices can include, but are not limited to, watches, rings, necklaces, belts, accessories for belts, headsets, accessories for shoes, virtual reality devices, other wearable electronics, accessories for sporting equipment, accessories for fitness equipment, key chains, or any combination thereof.


The above described embodiments of the present invention are presented for purposes of illustration and not of limitation, and the present invention is limited only by the claims which follow.

Claims
  • 1. A system comprising: a user input component; a memory component; a sensor; and a processor that: plays back a media file from the memory component for providing media information to a user about a plurality of activities of the media file; receives sensor data from the sensor indicative of the user's performance of at least one activity of the plurality of activities during the playback of the media file; receives a user input command from the user input component during the playback of the media file; and based on the received user input command, adjusts a user-programmable parameter of the media file during the playback of the media file, wherein: the parameter adjustment adjusts the media file; the media file adjustment adjusts the playback of the remainder of the media file after the parameter adjustment; the playback adjustment affects the media information provided to the user during the playback of the remainder of the media file after the parameter adjustment; the processor plays back the media file by: playing back a first media module of the media file that provides first media information about a first activity of the plurality of activities; and after playing back the first media module, playing back a second media module of the media file that provides second media information about a second activity of the plurality of activities; the processor receives the user input command during the playback of the first media module; and the parameter adjustment affects the playback of the second media module.
  • 2. The system of claim 1, wherein the user input component, memory component, and processor are provided by a portable electronic device.
  • 3. The system of claim 1, wherein the parameter adjustment is stored in the memory component.
  • 4. The system of claim 1, wherein the parameter comprises a global parameter associated with the total time allocated to perform every activity of the plurality of activities of the media file.
  • 5. The system of claim 1, wherein: the parameter comprises a first parameter that is associated with the first media module.
  • 6. A system comprising: a user input component; a memory component; a sensor; and a processor that: plays back a media file from the memory component for providing media information to a user about a plurality of activities of the media file; receives sensor data from the sensor indicative of the user's performance of at least one activity of the plurality of activities during the playback of the media file; receives a user input command from the user input component during the playback of the media file; and based on the received user input command, adjusts a user-programmable parameter of the media file during the playback of the media file, wherein: the parameter adjustment adjusts the media file; the media file adjustment affects the media information provided to the user during the playback of the remainder of the media file after the media file adjustment; the processor plays back the media file by: playing back a first media module of the media file that provides first media information about a first activity of the plurality of activities; and after playing back the first media module, playing back a second media module of the media file that provides second media information about a second activity of the plurality of activities; the processor receives the user input command during the playback of the first media module; the parameter comprises a first parameter that is associated with the first module; and based on the adjustment to the first parameter, the processor adjusts a second parameter of the media file during playback of the media file.
  • 7. The system of claim 6, wherein the second parameter comprises a global parameter of the media file.
  • 8. The system of claim 6, wherein the second parameter comprises a module-specific parameter.
  • 9. The system of claim 8, wherein the second parameter is associated with the first module.
  • 10. The system of claim 8, wherein the second parameter is associated with the second module.
  • 11. A method comprising: playing back a media file for providing media information to a user about a plurality of activities of the media file; receiving sensor data indicative of the user's performance of at least one activity of the plurality of activities during the playing back of the media file; receiving a user input command during the playing back of the media file; and based on the received user input command, adjusting a user-programmable parameter of the media file during the playing back of the media file, wherein: the adjusting the parameter comprises adjusting the media file; the adjusting the media file affects the media information provided to the user during the playing back of the remainder of the media file after the adjusting the media file; the playing back comprises: playing back a first media module of the media file that provides first media information about a first activity of the plurality of activities; and after the playing back of the first media module, playing back a second media module of the media file that provides second media information about a second activity of the plurality of activities; the receiving the user input command occurs during the playing back of the first media module; and the adjusting the media file affects the playing back of the second media module.
  • 12. The method of claim 11, wherein the playing back, the receiving the sensor data, the receiving the user input command, and the adjusting the parameter are accomplished by a portable electronic device.
  • 13. The method of claim 11, further comprising storing the parameter adjustment in a memory component.
  • 14. The method of claim 11, wherein the parameter comprises a global parameter associated with the total time allocated to perform every activity of the plurality of activities of the media file.
  • 15. The method of claim 11, wherein: the parameter comprises a first parameter that is associated with the first media module.
  • 16. The method of claim 11, wherein: the parameter comprises a first parameter that is associated with the first module; andbased on the adjusting the first parameter, the method further comprises adjusting a second parameter of the media file during the playing back of the media file.
  • 17. The method of claim 16, wherein the second parameter comprises a global parameter of the media file.
  • 18. The method of claim 16, wherein the second parameter comprises a module-specific parameter.
  • 19. The method of claim 18, wherein the second parameter is associated with the first module.
  • 20. The method of claim 18, wherein the second parameter is associated with the second module.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 11/729,291, filed on Mar. 27, 2007, which claims priority to U.S. Provisional Patent Application No. 60/846,414, filed on Sep. 21, 2006 (referred to below as “the incorporated provisional patent application”). These earlier applications are incorporated herein by reference. This application also is related to: U.S. Publication No. 2008/0086318, entitled “LIFESTYLE COMPANION SYSTEM,” (referred to herein as “the incorporated LIFESTYLE COMPANION document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,001,472, entitled “SYSTEMS AND METHODS FOR PROVIDING AUDIO AND VISUAL CUES VIA A PORTABLE ELECTRONIC DEVICE,” (referred to herein as “the incorporated AUDIO AND VISUAL CUES document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,235,724, entitled “DYNAMICALLY ADAPTIVE SCHEDULING SYSTEM,” (referred to herein as “the incorporated ADAPTIVE SCHEDULING SYSTEM document”), the entirety of which is incorporated herein by reference; U.S. Pat. No. 8,429,223, entitled “SYSTEMS AND METHODS FOR FACILITATING GROUP ACTIVITIES,” (referred to herein as “the incorporated GROUP ACTIVITIES document”), the entirety of which is incorporated herein by reference; U.S. Publication No. 2008/0077489, entitled “REWARDS SYSTEMS,” (referred to herein as “the incorporated REWARDS SYSTEMS document”), the entirety of which is incorporated herein by reference; U.S. Publication No. 2008/0076972, entitled “INTEGRATED SENSORS FOR TRACKING PERFORMANCE METRICS,” (referred to herein as “the incorporated INTEGRATED SENSORS document”), the entirety of which is incorporated herein by reference; and U.S. patent application Ser. No. 11/426,078, to King et al., filed on Jun. 23, 2006 (Publication No. 2006/0238517, published on Oct. 26, 2006), entitled “Electronic Device Having Display and Surrounding Touch Sensitive Bezel for User Interface and Control” (referred to herein as “the incorporated King document”), the entirety of which is incorporated herein by reference. The incorporated provisional patent application, LIFESTYLE COMPANION document, AUDIO AND VISUAL CUES document, ADAPTIVE SCHEDULING SYSTEM document, GROUP ACTIVITIES document, REWARDS SYSTEMS document, INTEGRATED SENSORS document, and King document collectively are referred to herein as “the incorporated patent documents.”

US Referenced Citations (213)
Number Name Date Kind
3675640 Gatts Jul 1972 A
4649552 Yukawa Mar 1987 A
4907795 Shaw et al. Mar 1990 A
5379057 Clough et al. Jan 1995 A
5412564 Ecer May 1995 A
5434913 Tung et al. Jul 1995 A
5452435 Malouf et al. Sep 1995 A
5471405 Marsh Nov 1995 A
5489249 Brewer et al. Feb 1996 A
5490247 Tung et al. Feb 1996 A
5645509 Brewer et al. Jul 1997 A
5673691 Abrams et al. Oct 1997 A
5675362 Clough et al. Oct 1997 A
5794018 Vrvilo et al. Aug 1998 A
5819735 Mansfield Oct 1998 A
5857939 Kaufman Jan 1999 A
5859979 Tung et al. Jan 1999 A
5890995 Bobick et al. Apr 1999 A
5890997 Roth Apr 1999 A
5913062 Vrvilo et al. Jun 1999 A
5954640 Szabo Sep 1999 A
5976083 Richardson Nov 1999 A
6013007 Root et al. Jan 2000 A
6032108 Seiple et al. Feb 2000 A
6039688 Douglas et al. Mar 2000 A
6077193 Buhler et al. Jun 2000 A
6135951 Richardson et al. Oct 2000 A
6159131 Pfeffer Dec 2000 A
6357147 Darley et al. Mar 2002 B1
6447424 Ashby et al. Sep 2002 B1
6463385 Fry Oct 2002 B1
6527674 Clem Mar 2003 B1
6539336 Vock et al. Mar 2003 B1
6553037 Pivowar et al. Apr 2003 B1
6560903 Darley May 2003 B1
6582342 Kaufman Jun 2003 B2
6585622 Shum et al. Jul 2003 B1
6587127 Leeke et al. Jul 2003 B1
6619835 Kita Sep 2003 B2
6623427 Mandigo Sep 2003 B2
6702719 Brown et al. Mar 2004 B1
6716139 Hosseinzadeh-Dolkhani et al. Apr 2004 B1
6725281 Zintel et al. Apr 2004 B1
6735568 Buckwalter et al. May 2004 B1
6749537 Hickman Jun 2004 B1
6790178 Mault et al. Sep 2004 B1
6793607 Neil Sep 2004 B2
6808473 Hisano et al. Oct 2004 B2
6898550 Blackadar et al. May 2005 B1
6910068 Zintel et al. Jun 2005 B2
6921351 Hickman et al. Jul 2005 B1
6945911 Jackowski Sep 2005 B2
7030735 Chen Apr 2006 B2
7062225 White Jun 2006 B2
7069308 Abrams Jun 2006 B2
7070539 Brown et al. Jul 2006 B2
7085590 Kennedy et al. Aug 2006 B2
7171331 Vock et al. Jan 2007 B2
7174227 Kobayashi et al. Feb 2007 B2
7192387 Mendel Mar 2007 B2
7200517 Darley et al. Apr 2007 B2
7228168 Dardik et al. Jun 2007 B2
7251454 White Jul 2007 B2
7254516 Case, Jr. et al. Aug 2007 B2
7261690 Teller et al. Aug 2007 B2
7277726 Ahya et al. Oct 2007 B2
7278966 Hjelt et al. Oct 2007 B2
7292867 Werner et al. Nov 2007 B2
7328239 Berberian et al. Feb 2008 B1
7353139 Burrell et al. Apr 2008 B1
7424718 Dutton Sep 2008 B2
7454002 Gardner et al. Nov 2008 B1
7496277 Ackley et al. Feb 2009 B2
7519327 White Apr 2009 B2
7523040 Kirchhoff et al. Apr 2009 B2
7526524 White Apr 2009 B2
7591760 Gordon et al. Sep 2009 B2
7603255 Case, Jr. et al. Oct 2009 B2
7618345 Corbalis et al. Nov 2009 B2
7636754 Zhu et al. Dec 2009 B2
7656824 Wang et al. Feb 2010 B2
7670263 Ellis et al. Mar 2010 B2
7683252 Oliver et al. Mar 2010 B2
7753825 Jaquish et al. Jul 2010 B2
7765245 Nichols Jul 2010 B2
7827039 Butcher et al. Nov 2010 B2
7841967 Kahn et al. Nov 2010 B1
7946959 Shum et al. May 2011 B2
7973231 Bowen Jul 2011 B2
8001472 Gilley et al. Aug 2011 B2
8066514 Clarke Nov 2011 B2
8095120 Blair et al. Jan 2012 B1
8529409 Lesea-Ames Sep 2013 B1
20010054180 Atkinson Dec 2001 A1
20020007313 Mai et al. Jan 2002 A1
20020022551 Watterson et al. Feb 2002 A1
20020022774 Karnieli Feb 2002 A1
20020027164 Mault et al. Mar 2002 A1
20020033753 Imbo Mar 2002 A1
20020072932 Swamy Jun 2002 A1
20020077784 Vock et al. Jun 2002 A1
20020095460 Benson Jul 2002 A1
20020107824 Ahmed Aug 2002 A1
20030017914 Jackowski Jan 2003 A1
20030028116 Bimbaum Feb 2003 A1
20030059747 Yoshida et al. Mar 2003 A1
20030097878 Farringdon et al. May 2003 A1
20030175666 Tanabe et al. Sep 2003 A1
20030204412 Brier Oct 2003 A1
20030208113 Mault et al. Nov 2003 A1
20030220971 Kressin Nov 2003 A1
20030224337 Shum et al. Dec 2003 A1
20030229900 Reisman Dec 2003 A1
20040002041 Peplinski et al. Jan 2004 A1
20040029684 Zarif Feb 2004 A1
20040091843 Albro et al. May 2004 A1
20040102931 Ellis et al. May 2004 A1
20040106449 Walker et al. Jun 2004 A1
20040107116 Brown Jun 2004 A1
20040143673 Kristjansson Jul 2004 A1
20040198555 Anderson et al. Oct 2004 A1
20040201595 Manchester Oct 2004 A1
20040220017 Gordon et al. Nov 2004 A1
20040229729 Albert et al. Nov 2004 A1
20050008993 Bergh et al. Jan 2005 A1
20050008994 Bisogno Jan 2005 A1
20050010638 Richardson et al. Jan 2005 A1
20050014113 Fleck et al. Jan 2005 A1
20050042582 Graves Feb 2005 A1
20050044503 Richardson et al. Feb 2005 A1
20050058970 Perlman et al. Mar 2005 A1
20050060368 Wang et al. Mar 2005 A1
20050070809 Acres Mar 2005 A1
20050101314 Levi May 2005 A1
20050107116 Yamaguchi May 2005 A1
20050107216 Lee et al. May 2005 A1
20050113649 Bergantino May 2005 A1
20050125221 Brown et al. Jun 2005 A1
20050125222 Brown et al. Jun 2005 A1
20050125302 Brown et al. Jun 2005 A1
20050164833 Florio Jul 2005 A1
20050172311 Hjelt et al. Aug 2005 A1
20050176461 Bozzone et al. Aug 2005 A1
20050180341 Nelson et al. Aug 2005 A1
20050202934 Olrik et al. Sep 2005 A1
20050209050 Bartels Sep 2005 A1
20050226172 Richardson et al. Oct 2005 A1
20050227811 Shum et al. Oct 2005 A1
20050240705 Novotney et al. Oct 2005 A1
20050266385 Bisogno Dec 2005 A1
20050287499 Yeager Dec 2005 A1
20060004862 Fisher et al. Jan 2006 A1
20060025282 Redmann Feb 2006 A1
20060026052 Klett et al. Feb 2006 A1
20060035200 Pittman Feb 2006 A1
20060040244 Kain Feb 2006 A1
20060047208 Yoon Mar 2006 A1
20060063980 Hwang et al. Mar 2006 A1
20060085272 Case et al. Apr 2006 A1
20060107822 Bowen May 2006 A1
20060173972 Jung et al. Aug 2006 A1
20060197670 Breibart Sep 2006 A1
20060197753 Hotelling Sep 2006 A1
20060199155 Mosher Sep 2006 A1
20060205564 Peterson Sep 2006 A1
20060238517 King et al. Oct 2006 A1
20060250524 Roche Nov 2006 A1
20060253874 Stark et al. Nov 2006 A1
20060256130 Gonzalez Nov 2006 A1
20060263750 Gordon Nov 2006 A1
20070026999 Merolle et al. Feb 2007 A1
20070032345 Padmanabhan et al. Feb 2007 A1
20070033068 Rao et al. Feb 2007 A1
20070033069 Rao et al. Feb 2007 A1
20070059672 Shaw Mar 2007 A1
20070074619 Vergo Apr 2007 A1
20070087686 Holm et al. Apr 2007 A1
20070100595 Earles et al. May 2007 A1
20070110074 Bradley et al. May 2007 A1
20070113726 Oliver et al. May 2007 A1
20070130476 Mohanty Jun 2007 A1
20070135264 Rosenberg Jun 2007 A1
20070136093 Rankin et al. Jun 2007 A1
20070141540 Borg Jun 2007 A1
20070166683 Chang et al. Jul 2007 A1
20070192106 Zilca Aug 2007 A1
20070219059 Schwartz et al. Sep 2007 A1
20070265138 Ashby Nov 2007 A1
20070287596 Case et al. Dec 2007 A1
20070287597 Cameron Dec 2007 A1
20080033827 Kuang et al. Feb 2008 A1
20080096726 Riley et al. Apr 2008 A1
20080155470 Khedouri et al. Jun 2008 A1
20080177860 Khedouri et al. Jul 2008 A1
20080195594 Gerjets et al. Aug 2008 A1
20080195997 Herberger Aug 2008 A1
20080215968 Bekerman Sep 2008 A1
20080242521 Einav Oct 2008 A1
20090047645 Dibenedetto et al. Feb 2009 A1
20090087819 Adachi et al. Apr 2009 A1
20090169171 Massey et al. Jul 2009 A1
20090312105 Koplar Dec 2009 A1
20090327894 Rakib et al. Dec 2009 A1
20100081116 Barasch et al. Apr 2010 A1
20120185905 Kelley Jul 2012 A1
20120198317 Eppolito et al. Aug 2012 A1
20120230510 Dinescu et al. Sep 2012 A1
20130024880 Moloney-Egnatios et al. Jan 2013 A1
20140122601 Poston et al. May 2014 A1
20140285312 Laaksonen et al. Sep 2014 A1
20140328571 Roberts, Jr. et al. Nov 2014 A1
20150086174 Abecassis et al. Mar 2015 A1
20150341410 Schrempp et al. Nov 2015 A1
Foreign Referenced Citations (23)
Number Date Country
1462979 Sep 2004 EP
1512370 Mar 2005 EP
1585014 Oct 2005 EP
2253706 Sep 1992 GB
2284060 May 1995 GB
2409040 Jun 2005 GB
2007-013228 Jan 2007 JP
1999-0073234 Oct 1999 KR
9714357 Apr 1997 WO
0052604 Sep 2000 WO
0116855 Mar 2001 WO
0165460 Sep 2001 WO
0215986 Feb 2002 WO
02062425 Aug 2002 WO
0293272 Nov 2002 WO
2005032363 Apr 2005 WO
2005036918 Apr 2005 WO
2005082472 Sep 2005 WO
2005087323 Sep 2005 WO
2005093633 Oct 2005 WO
2006042415 Apr 2006 WO
2006079942 Aug 2006 WO
2007099206 Sep 2007 WO
Non-Patent Literature Citations (14)
Entry
Menta, “1200 Song MP3 Portable is a Milestone Player.” http://www.mp3newswire.net/stories/personaljuke.html (retrieved Jul. 17, 2010).
Oliver et al. “Enhancing Exercise Performance through Real-time Physiological Monitoring and Music: A User Study.” Pervasive Health Conference and Workshops, pp. 1-10 (2007).
Pike, Weight Watchers On-the-Go, Apr. 12, 2005, PC Magazine, vol. 24, Iss.6; p. 149.
Creative NOMAD® Digital Audio Player User Guide, Jun. 1999.
Creative NOMAD® Digital Getting Started Guide, Jan. 2000.
Ericsson Inc. “Cellular Phone With Integrated MP3 Player.” Research Disclosure Journal No. 41815, Research Disclosure Database No. 418015 (Feb. 1999).
Microsoft Zune Impressions - Part 1, DigitalArts Online Magazine, Dec. 4, 2006: <http://www.digitalartsonline.co.uk/blogs/index.cfm?entryid=184&blogid=2>.
Podfitness Delivers on myMedia Promise, Utah Tech Jobs.com, Nov. 13, 2006: <http://utahtechjobs.com/index.php/2006/11/13/podfitness-delivers-on-mymedia-promise/>.
Reinventing the Scroll Wheel, CNET news.com, Aug. 22, 2006: <http://www.news.com/2300-1041_3-6107951-1.html?tag=ne.gall.pg>; <http://www.news.com/2300-1041_3-6107951-2.html?tag=ne.gall.pg>; <http://www.news.com/2300-1041_3-6107951-3.html?tag=ne.gall.pg>; <http://www.news.com/2300-1041_3-6107951-4.html?tag=ne.gall.pg>.
Rio 500 Getting Started Guide, 1999.
Rio PMP300 User's Guide (1998).
Sensei.com, Feb. 10, 2008: <http://www.sensei.com/senseipublic/Inner.aspx>.
“Notice from the European Patent Office dated Oct. 1, 2007 concerning business methods.” Official Journal of the European Patent Office, vol. 30, No. 7, Nov. 1, 2007.
“Statement in Accordance With the Notice From the European Patent Office dated Oct. 1, 2007 Concerning Business Methods.” Official Journal of the European Patent Office, Nov. 1, 2007.
Related Publications (1)
Number Date Country
20140250380 A1 Sep 2014 US
Provisional Applications (1)
Number Date Country
60846414 Sep 2006 US
Continuations (1)
Number Date Country
Parent 11729291 Mar 2007 US
Child 14274940 US