This relates to portable electronic devices and more particularly to systems and methods for saving and restoring scenes in a multimedia system. As used herein, a scene includes the states of one or more components in the multimedia system. States of components that can be captured in a scene can include, for example, the states of lights (e.g., whether they are turned on or off and their brightness levels), the states of speakers (e.g., volume level and balance), media that is playing on a media playback application, metadata associated with the media that is playing, or any other suitable states.
Remote controls, such as dedicated remote controls for controlling devices in a multimedia system, are known. Devices in the multimedia system can include consumer electronic devices, such as televisions, set-top boxes, stereo receivers, and computers, and appliances, such as lights, window shades, and fireplaces. In addition, universal remote controls have been designed to interact with a wide variety of devices, so that a single universal remote control can be used to control more than one device. For example, a universal remote control can control both a television and set-top box. Most of these remote controls communicate with the devices using infrared (IR) signals.
However, in order to configure a conventional remote control to work with multiple devices, a user must first specify the devices that are in the user's multimedia system. For example, the user may specify a TV, a stereo receiver, and a DVD player. Furthermore, although conventional remote controls can save and restore different pre-defined scenes (e.g., “watch TV,” “watch DVD,” and “play game”), a user has to spend time manually defining the scene (e.g., by specifying which device inputs to use, etc.).
Furthermore, in most conventional remote controls, a user must use a computer to define a scene. In some cases, however, a user may want to be able to define a scene without being restricted to using a computer. For example, the user may be watching a horror movie and may adjust the lighting and sound of the scene accordingly. The user may want to save the states of the devices so that the scene can be restored when, for example, the user watches another horror movie a few days later. The user may want to do this save instantaneously without having to use a computer.
In addition, these conventional remote controls may wait for a user to activate a certain scene instead of recommending scenes based on past activities of the user or external parameters (e.g., genre of media that is playing, time of the day, and special occasions). For example, a conventional remote control may not be able to detect that a user is viewing a horror movie, and, as a result, suggest a saved scene that has been configured for horror movies.
In view of the foregoing, it would be desirable to provide systems and methods that allow a user to save and restore scenes in a multimedia system with minimal configuration. It would also be desirable to provide suggestions of suitable recommended scenes based on states of devices in the scenes and usage patterns.
In accordance with the disclosure, a portable electronic device and method of operation are provided in which the device can save and restore scenes in a multimedia system with minimal configuration. The portable electronic device can additionally provide suggestions of suitable recommended scenes to the user.
In some embodiments, the portable electronic device can discover one or more components in a multimedia system that are ready to be controlled. For example, servers on each component can broadcast one or more identifiers to the portable electronic device. In response to receiving the identifiers, the portable electronic device can establish a communications link with the components.
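The discovery step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the disclosed implementation: the class names, the identifier format, and the in-memory "link" registry are all assumptions made for clarity.

```python
# Hypothetical sketch of component discovery: each component's server
# broadcasts an identifier, and the controller records the component and
# establishes a (here, simulated) communications link.

class Controller:
    def __init__(self):
        self.linked = {}  # identifier -> component (the established links)

    def on_broadcast(self, identifier, component):
        """Called when a component broadcasts its identifier."""
        if identifier not in self.linked:
            self.linked[identifier] = component  # establish the link

class Component:
    def __init__(self, identifier, kind):
        self.identifier = identifier
        self.kind = kind

    def broadcast(self, controller):
        # In practice this would be a network broadcast; simulated here.
        controller.on_broadcast(self.identifier, self)

controller = Controller()
for c in (Component("tv-1", "television"), Component("avr-1", "receiver")):
    c.broadcast(controller)

print(sorted(controller.linked))  # ['avr-1', 'tv-1']
```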
In some embodiments, after establishing communications with the components, the portable electronic device can receive states from the components. States can include software and/or hardware settings of the components, such as, for example, the current volume setting, the current playback speed of a media item, or the current media that is playing. In response to receiving the states, the portable electronic device can provide controls to a user. In response to the user controlling the state of a component using the controls, the portable electronic device can transmit the remote control commands to the component.
In some embodiments, the portable electronic device (e.g., a controller device) can manage a plurality of saved scenes, where the plurality of saved scenes include metadata that identifies respective usage patterns of the plurality of saved scenes. Additionally, the controller device can identify a current state and/or identify one or more recommended scenes from the plurality of saved scenes based at least in part on comparing the current state with the respective usage patterns. In some cases, the controller device can present the one or more recommended scenes in a user interface of the controller device. The controller device can also receive a selection to activate a selected scene of the one or more recommended scenes. In some examples, the controller device can also activate the selected scene in accordance with a determination that media content is playing in a current scene. Further, in some cases, the controller device can transmit instructions for switching from playing the media content of the current scene to playing other media content of the selected scene.
In some embodiments, the current state comprises a current time of day, and the respective usage patterns identify respective times of day during which the plurality of saved scenes were activated. Additionally, the current time of day comprises a window of time during the day. The controller device can also determine that the other media content of the selected scene is being played in the selected scene when the selection to activate the selected scene is received. In some instances, activating the selected scene comprises not playing the other media content of the selected scene. The controller device may also present, in the user interface of the controller device, an option for switching from playing the media content of the current scene to playing other media content of the selected scene. In some cases, the instructions for switching from playing the media content of the current scene to playing other media content of the selected scene are transmitted based at least in part on receiving, in the user interface of the controller device, a selection for switching.
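The time-of-day matching just described can be sketched as follows. This is a hedged illustration, assuming scene records that carry usage windows as (start, end) time pairs; the record format and function names are not from the disclosure.

```python
from datetime import time

# Illustrative sketch: recommend saved scenes whose recorded usage
# windows (times of day during which the scene was activated) contain
# the current time of day.

saved_scenes = [
    {"name": "Movie Night", "usage_windows": [(time(20, 0), time(23, 59))]},
    {"name": "Morning News", "usage_windows": [(time(6, 0), time(9, 0))]},
]

def recommend(scenes, now):
    hits = []
    for scene in scenes:
        for start, end in scene["usage_windows"]:
            if start <= now <= end:
                hits.append(scene["name"])
                break  # one matching window is enough for this scene
    return hits

print(recommend(saved_scenes, time(21, 30)))  # ['Movie Night']
```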
Therefore, in accordance with the present disclosure, there are provided systems, methods, and computer-readable media for allowing a user to save and restore scenes in a multimedia system. These systems, methods, and computer-readable media can include a portable electronic device (e.g., a controller device) that can automatically capture states of components in the multimedia system.
There is also provided a controller device for providing a user with recommended scenes based on states of components and usage patterns.
The above and other features of the present disclosure, its nature and various advantages will be more apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which:
Display component 104 is illustrated in
User input component 106 is illustrated in
In some embodiments, portable electronic device 300 can be a portable computing device with remote control capabilities. For example, device 300 can be a media player with wireless communications capabilities (e.g., MP3 player), such as an iPod available from Apple Inc., of Cupertino, Calif., a cellular telephone, a personal e-mail or messaging device (e.g., a Blackberry® or a Sidekick®), a game player, a portable communication device, a pocket-sized personal computer such as an iPAQ Pocket PC available from Hewlett-Packard Inc., of Palo Alto, Calif., a personal digital assistant (PDA), a laptop computer, a remote ordering interface, a dedicated remote control device, a global positioning system (GPS) device, or any other suitable personal device capable of communicating wirelessly (with or without the aid of a wireless enabling accessory system). In some embodiments, portable electronic device 300 can be a portable device that combines multi-media processing and telephone functionality in a single integrated unit, such as an iPhone available from Apple Inc. A more detailed discussion of a portable electronic device with remote control capabilities may be found, for example, in Cannistraro et al., U.S. patent application Ser. No. 11/955,385, filed Dec. 12, 2007, entitled “Handheld Electronic Devices with Bimodal Remote Control Functionality,” which is incorporated by reference in its entirety.
Portable electronic device 300 can be battery-operated and highly portable so as to allow a user to listen to music, play games or videos, record video or take pictures, place and take telephone calls, communicate with other people or devices, control other devices, and any combination thereof. In addition, portable electronic device 300 can be sized such that it fits relatively easily into a pocket or hand of the user. By being handheld, portable electronic device 300 is relatively small and easily handled and utilized by its user and thus may be taken practically anywhere the user travels.
Portable electronic device 300 can include processor 302, memory 304, user interface 306, display 308, input/output circuitry 310, communications circuitry 312, and bus 314. In some embodiments, portable electronic device 300 can include more than one of each component or circuitry, but for the sake of illustration, only one of each is shown in
Processor 302 can include, for example, circuitry that can be configured to perform any function. Processor 302 can be used to run operating system applications, media playback applications, media editing applications, any other applications, or any combination thereof. Processor 302 can drive display 308 and can receive user inputs from user interface 306.
Memory 304 can include one or more different types of memory or storage mediums which can be used for performing device functions. For example, memory 304 can include cache, Flash, one or more different types of memory used for temporarily storing data, a hard-drive, tape drive, optical drive, permanent memory such as ROM (read-only memory), semi-permanent memory such as RAM (random access memory), any other suitable type of memory component, or any combination thereof. Memory 304 can be specifically dedicated to storing firmware. For example, memory 304 can be provided for storing firmware for device applications (e.g., operating system, user interface functions, and processor functions). Memory 304 can be used to store any suitable user-specific or global information that can be used by portable electronic device 300. Memory 304 can store, for example, media data (e.g., music and video files), application data (e.g., for implementing functions on device 300), preference information data (e.g., media playback preferences), lifestyle information data (e.g., food preferences), exercise information data (e.g., information obtained by exercise monitoring equipment), transaction information data (e.g., information such as credit card information), wireless connection information data (e.g., information that may enable device 300 to establish a wireless connection), subscription information data (e.g., information that keeps track of podcasts or television shows or other media a user subscribes to), contact information data (e.g., telephone numbers and e-mail addresses), calendar information data, any other suitable data, or any combination thereof. Memory 304 can store programs or applications that can be run on processor 302, can maintain files formatted to be read and edited by one or more of the applications, and can store any additional files that may aid the operation of one or more applications (e.g., files with metadata).
User interface 306 can allow a user to interact with portable electronic device 300. For example, the device for user interface 306 may take a variety of forms, such as at least one of a button, a keyboard/keypad, a dial, a click wheel/scroll wheel, a touch screen, or any combination thereof. User interface 306 can also include an audio input device (e.g., a microphone) or a video input device (e.g., a camera or a web cam) for recording. A more detailed discussion of user interaction with a portable electronic device may be found, for example, in Cannistraro et al., U.S. patent application Ser. No. 11/955,382, filed Dec. 12, 2007, entitled “Handheld Electronic Devices with Remote Control Functionality and Gesture Recognition,” which is incorporated by reference in its entirety.
Display 308 can accept and/or generate signals for presenting media information (textual and/or graphic) on a display screen, such as those discussed above. For example, display 308 may include a coder/decoder (CODEC) to convert digital media data into analog signals. Display 308 also can include display driver circuitry and/or circuitry for driving display drivers. The display signals can be generated by processor 302 or display 308. The display signals can provide media information related to media data received from communications circuitry 312, any other component of portable electronic device 300, or any combination thereof. In some embodiments, display 308, like any other component discussed herein, may be integrated with and/or externally coupled to portable electronic device 300.
Input/output circuitry 310 can convert (and encode/decode, if necessary) data, analog signals and other signals (e.g., physical contact inputs, physical movements, analog audio signals, etc.) into digital data, and vice-versa. The digital data can be provided to and received from processor 302, memory 304, or any other component of portable electronic device 300. Although input/output circuitry 310 is illustrated in
Communications circuitry 312 can permit portable electronic device 300 to communicate with one or more servers, components, or other devices using any suitable communications protocol. For example, communications circuitry 312 can support Wi-Fi (e.g., an 802.11 protocol), Ethernet, protocols for other short-range wireless communications links such as Bluetooth™ (which is a trademark owned by Bluetooth SIG, Inc.), high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communications systems), TCP/IP (e.g., any of the protocols used in each of the TCP/IP layers), HTTP, BitTorrent, FTP, RTP, RTSP, SSH, cellular telephone communications protocols, infrared, any other communications protocol, or any combination thereof. In some embodiments, communications circuitry 312 may include a wired or wireless network interface card (“NIC”) configured to connect to the Internet or any other public or private network. Communications circuitry 312 can be used to initiate and conduct communications with other devices or media players or systems within a communications network. Communications networks will be discussed in more detail in connection with
Bus 314 can provide a data transfer path for transferring data to, from, or between processor 302, memory 304, user interface 306, display 308, input/output circuitry 310, and communications circuitry 312. Persons skilled in the art will appreciate that components do not have to be directly connected to each other via bus 314.
In some embodiments, device 400 can host media and computing applications. For example, device 400 can host e-mail applications, media applications (e.g., applications that supply video, music, slideshows, and photos), media recording software, web browsers, media sharing applications, programming guide applications, software update applications, game networks, or any other suitable type of applications.
Device 400 can include processor 402, memory 404, communications circuitry 406, and bus 408. In some embodiments, device 400 can include more than one of each component or circuitry, but for the sake of illustration, only one of each is shown in
In some embodiments, device 400 can include processor 402, which can include circuitry used to control the operation of device 400. For example, processor 402 can be implemented using microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, and any other suitable integrated circuits.
Memory 404 can include one or more different types of storage. For example, memory 404 can include hard disk drive storage, nonvolatile memory (e.g., flash memory or any other electrically-programmable ROM), volatile memory (e.g., battery-based static or dynamic RAM), or any other suitable type of memory. In some embodiments, processor 402 and memory 404 can be used to run software on device 400. For example, suitable software can include remote control applications, media playback applications, operating system functions, and presentation programs.
Communications circuitry 406 can allow device 400 to communicate with one or more external devices (e.g., portable electronic device 300 of
Bus 408 can provide a data transfer path for transferring data to, from, or between processor 402, memory 404, and communications circuitry 406. Persons skilled in the art will appreciate that components do not have to be directly connected to each other via bus 408.
In some embodiments, users may interact with a variety of different media types using the components in the multimedia system. For example, multimedia system 500 can be used by a user to view media. Multimedia system 500 can be used to play compact disks, video disks, tapes, and hard-drive-based or flash-disk-based media files. The songs, videos, and other content may be presented to the user using speakers and display screens. As another example, visual content such as a television program that is received from a cable provider can be displayed on a television. As yet another example, audio content such as a song can be streamed from an on-line source or can be played back from a local hard-drive.
Multimedia system 500 can include one or more components such as component 502, component 504, and component 506. Persons skilled in the art will appreciate that the components provided are merely illustrative and that any suitable components capable of communicating with the portable electronic device, which are not shown in
Component 502 can include any suitable type of media display device, such as, for example, a cathode-ray tube (CRT) television, a high-definition television, plasma screen, liquid crystal display (LCD), organic light emitting diode (OLED) display, or any other suitable display device. In some embodiments, component 502 can include a television tuner. In some embodiments, a user can control the state of component 502 using a remote control. For example, the user may use a remote control to change the current television channel on the television tuner or adjust the volume produced by speakers in component 502. In some embodiments, component 502 can send audio and video signals to component 506 while simultaneously sending audio to component 504 for playback over speakers 508.
Component 504 can include any suitable type of audio-video receiving device. In some embodiments, component 504 can be an audio-video receiver that can switch among various video and audio inputs. Component 504 can be used to amplify audio signals for playback over speakers 508. In some embodiments, audio (e.g., digital or analog audio) from component 502 and component 506 can be transmitted to component 504 for amplification.
Component 506 can include a set-top box. For example, component 506 can be a cable receiver, computer-based set-top box, network-connected media playback equipment, personal video recorder, digital video recorder, digital video disc (DVD) player, or any other suitable set-top box. In some embodiments, component 506 can receive television signals from a television provider on a television signal input line. In some embodiments, a tuner in component 506 can be used to tune to a desired television channel. A video and audio signal corresponding to this channel can be provided to components 502 and 504. In some embodiments, component 506 can provide recorded content (e.g., content that has been recorded on a hard-drive) and downloaded content (e.g., video and audio files that have been downloaded from the Internet) to components 502 and 504.
In some embodiments, components 502, 504, and 506 can be connected via links 510. Links 510 can provide communication for multimedia system 500 using any suitable type of wired or wireless communication technology. For example, component 504 can receive audio signals from component 502 and component 506 via links 510. In some embodiments, links 510 can allow components in multimedia system 500 to connect to a portable electronic device (e.g., portable electronic device 300 of
Components 502, 504, and 506 can include communications circuitry 512. Communications circuitry 512 can be used to implement one or more communication protocols and can be the same as or similar to communications circuitry 406 of
Any suitable circuitry, device, system or combination of these (e.g., a wireless communications infrastructure including communications towers and telecommunications servers) operative to create a communications network can be used to create communications network 610. Communications network 610 can be capable of providing wireless communications using any suitable short-range or long-range communications protocol. Communications network 610 can support, for example, satellite technology, television broadcast technology, wireless universal serial bus technology, Internet, Ethernet, Wi-Fi (e.g., an 802.11 protocol), Bluetooth (registered trademark), high frequency systems (e.g., 900 MHz, 2.4 GHz, and 5.6 GHz communication systems), light-based wireless systems, sound-based wireless systems, any other relatively localized wireless communication protocol, or any combination thereof.
In some embodiments, communications network 610 can support protocols used by wireless and cellular phones and personal email devices (e.g., a Blackberry®). Such protocols can include, for example, GSM, GSM plus EDGE, CDMA, quadband, and other cellular protocols. As another example, long range communications protocols can be used, such as Wi-Fi and protocols for placing or receiving calls using VOIP or LAN. Portable electronic device 602 and component 604, when located within communications network 610, can wirelessly communicate over a local wireless communication link such as link 640.
Portable electronic device 602 can be coupled with component 606 over link 642. Link 642 can be a wired link that is coupled to both portable electronic device 602 and component 606.
The communications sent and received between portable electronic device 602 and components 604 and 606 can include any suitable type of one-way or two-way communications. In some embodiments, to control the operation of components 604 and 606, portable electronic device 602 can transmit remote control signals to components 604 and 606. In addition, components 604 and 606 can broadcast data signals that can correspond to the states of the components to portable electronic device 602. Suitable states can include software and/or hardware settings that can be controlled by portable electronic device 602, such as, for example, the current volume setting and/or the current playback speed of a media item.
In some embodiments, states of components can include the locations of the components. For example, lights and a TV located in the living room can broadcast to portable electronic device 602 that they are located in the living room. In some embodiments, the components can determine their locations based on GPS signals. In response to receiving the location of each component, portable electronic device 602 can build an internal map of the components and store the internal map in memory (e.g., memory 304 of
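The internal map described above can be sketched as follows. This is an illustrative assumption about the map's shape, not the disclosed data structure: each broadcast is taken to carry a component identifier and a reported location, and the map simply groups identifiers by location.

```python
from collections import defaultdict

# Sketch: building an internal map of components keyed by the location
# each component reports in its state broadcast.

def build_map(broadcasts):
    room_map = defaultdict(list)
    for b in broadcasts:
        room_map[b["location"]].append(b["id"])
    return dict(room_map)

broadcasts = [
    {"id": "lamp-1", "location": "living room"},
    {"id": "tv-1", "location": "living room"},
    {"id": "lamp-2", "location": "bedroom"},
]
print(build_map(broadcasts))
# {'living room': ['lamp-1', 'tv-1'], 'bedroom': ['lamp-2']}
```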
In some embodiments, one or more servers on components 604 and 606 can broadcast the components' identifiers and states to portable electronic device 602. For example, components 604 and 606 can use servers 612 and 614 to broadcast media content to portable electronic device 602 (e.g., as downloaded files or streaming media). Servers 612 and 614 can use any suitable wired or wireless communications link, such as links 640 and 642, to communicate with portable electronic device 602.
The states of components 604 and 606 can be broadcast at specified times. For example, the states of components 604 and 606 can be broadcast whenever the states change. As another example, the states of the components can be broadcast at preset time intervals.
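The broadcast-on-change policy mentioned above can be sketched as follows. The class and its bookkeeping are assumptions for illustration: a component re-announces its state only when the state differs from what it last sent.

```python
# Illustrative sketch of broadcasting state only when it changes.

class StateBroadcaster:
    def __init__(self):
        self._last_sent = None
        self.sent = []  # record of broadcasts actually emitted

    def maybe_broadcast(self, state):
        if state != self._last_sent:
            self.sent.append(dict(state))
            self._last_sent = dict(state)

b = StateBroadcaster()
b.maybe_broadcast({"volume": 40})
b.maybe_broadcast({"volume": 40})  # unchanged: suppressed
b.maybe_broadcast({"volume": 55})
print(len(b.sent))  # 2
```

A periodic-interval policy would instead call `maybe_broadcast` unconditionally on a timer; the two policies can also be combined.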
In some embodiments, portable electronic device 602 can include remote client 616, which can provide a communications interface between portable electronic device 602 and components in the multimedia system (e.g., components 604 and 606). For example, remote client 616 can transmit remote control commands to the components. As another example, remote client 616 can allow servers 612 and 614 to communicate with a remote control application on portable electronic device 602. The remote control application can be implemented using software stored in memory (e.g., memory 304 of
As discussed previously, some components in the multimedia system (e.g., devices) can host one or more applications. For example, component 604 can include application 618 and plug-in 620.
In some embodiments, plug-in 620 can provide application 618 with remote control functionality. For example, plug-in 620 can extract the state of application 618 and can provide the state to server 612. The state of application 618 can include both passive elements and active elements. Passive elements can include, for example, images (e.g., album covers), videos, title names, artist names, album names, or any other suitable passive elements. Active elements can include functions of an application that can be remotely controlled. For example, active elements can include a volume setting, a highlight region in a list of media items (e.g., a list of media items that a media player application can access), playback controls (e.g., play, pause, rewind, fast-forward), contrast settings, equalizer settings, and any other suitable active elements.
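The passive/active split described above can be sketched as follows. The field names and record shapes are assumptions made for illustration; the point is that the extracted state separates display-only metadata from remotely controllable settings.

```python
# Sketch of a plug-in extracting an application's state, split into
# passive elements (display-only) and active elements (controllable).

def extract_state(app):
    return {
        "passive": {
            "title": app["title"],
            "artist": app["artist"],
            "album_art": app["album_art"],
        },
        "active": {
            "volume": app["volume"],
            "playback": app["playback"],
        },
    }

app = {"title": "Song A", "artist": "Artist B", "album_art": "cover.png",
       "volume": 30, "playback": "play"}
state = extract_state(app)
print(sorted(state["active"]))  # ['playback', 'volume']
```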
In some embodiments, plug-in 620 can receive a remote control command from server 612 and can perform the desired action for application 618. For example, when a remote control command from portable electronic device 602 indicates that the volume of a media playback operation in application 618 should be raised, plug-in 620 can adjust the volume accordingly. As another example, when a remote control command indicates that a user has selected a media item for playback, plug-in 620 can direct application 618 to initiate media playback of the media item.
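The command handling just described can be sketched as a dispatch function. All names (command strings, state fields) are illustrative assumptions, not from the disclosure.

```python
# Sketch of a plug-in dispatching remote control commands received from
# the server to actions on the hosted application's state.

def make_plugin(app_state):
    def handle(command, *args):
        if command == "volume_up":
            app_state["volume"] = min(100, app_state["volume"] + 5)
        elif command == "play_item":
            app_state["now_playing"] = args[0]
        else:
            raise ValueError(f"unknown command: {command}")
    return handle

state = {"volume": 50, "now_playing": None}
plugin = make_plugin(state)
plugin("volume_up")
plugin("play_item", "track-42")
print(state)  # {'volume': 55, 'now_playing': 'track-42'}
```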
In some embodiments, if a device hosts one or more applications that are operative to be controlled, one or more servers (e.g., server 612) on the device can broadcast the availability of the applications to portable electronic device 602. For example, server 612 can broadcast that component 604 hosts application 618, which has plug-in 620 providing remote control functionality. In some embodiments, remote client 616 can respond to the broadcast information. For example, remote client 616 can request to activate the remote control functionality. As a result, server 612 can receive remote control commands from remote client 616 and can forward the remote control commands to plug-in 620. In addition, the state of application 618 can be broadcast to remote client 616 at specified times (e.g., when the state changes or at preset time intervals).
The portable electronic device can launch a remote control application in one or more ways. For example, the portable electronic device can display an option for the remote control application (e.g., highlight region 706 in
In some embodiments, servers on each device (e.g., servers 612 and 614 of
Once the portable electronic device discovers the devices, the portable electronic device can receive the states of the software and hardware running on the devices. As a result, the states are “device pushed” to the portable electronic device. Indicators 804 can specify the activity level of a device. For example, a media device can be considered active if it is currently performing a media playback operation. As another example, an appliance (e.g., fireplace or lamp) can be considered active if it is turned on. A user can select a listing 802 by placing highlight region 806 over the listing. In some embodiments, in response to receiving a user selection of a listing, the portable electronic device can display a graphic image based, at least in part, on the controls of a device.
As still yet another example, display screen 900 can include selectable arrows 912, which can allow the user to navigate menu items on the device. Display screen 900 can also include options 914, which can provide the user with access to more detailed controls for the device (e.g., a media guide, menu, and recorded programs).
Referring back to
In some embodiments, display screen 1000 can include indicators 1006, which can specify the activity level of an application. For example, the “MUSIC” application can be considered active if it is playing a song. As another example, the “SOFTWARE FOR LIGHTS” application can be considered active if it is currently controlling the lights in a room.
Referring back to
In some embodiments, in response to the user selecting to save the state, the portable electronic device can automatically save the states of any suitable component in the scene. For example, the portable electronic device can save the states of all components in the scene. As another example, the portable electronic device can save the states of components in a particular location.
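The automatic scene save described above can be sketched as a snapshot of component states, optionally filtered by location. The component records are assumptions for illustration.

```python
import copy

# Sketch: snapshot the current state of every component, or only those
# in a particular location, to form a saved scene.

def save_scene(components, location=None):
    return {
        c["id"]: copy.deepcopy(c["state"])  # snapshot, not a live reference
        for c in components
        if location is None or c["location"] == location
    }

components = [
    {"id": "lamp-1", "location": "living room",
     "state": {"on": True, "brightness": 30}},
    {"id": "tv-1", "location": "living room", "state": {"input": "HDMI1"}},
    {"id": "lamp-2", "location": "bedroom", "state": {"on": False}},
]
scene = save_scene(components, location="living room")
print(sorted(scene))  # ['lamp-1', 'tv-1']
```

Deep-copying the states keeps the saved scene stable even if a component's live state changes afterward.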
The portable electronic device can determine the locations of components in one or more ways. For example, the portable electronic device can determine the locations based on the states broadcast from the components. As another example, the portable electronic device can determine the locations based on the internal map of the components that is stored in memory (e.g., memory 304 of
In some embodiments, in response to the user selecting “SAVE” option 810 or pressing a button, the portable electronic device can provide the user with an option to save the states of a subset of the components (e.g., devices and applications) in the multimedia system.
The devices in listings 1102 can be organized and displayed based on one or more criteria obtained from any suitable source. For example, the criteria can be obtained from information broadcast by servers on the devices (e.g., while discovering the devices or from the states of the devices). As another example, the criteria can be obtained from information that has been specified by the user. The user may, for example, specify that a particular device is located in the living room.
Persons skilled in the art will appreciate that any suitable criteria can be used to organize the devices. For example, as shown in listings 1102, the devices are organized based on the category of each device. Suitable categories can include televisions, sound devices, computers, electronics accessories, gaming systems, appliances, miscellaneous devices, or any other suitable categories. As another example, the devices can be organized based on the devices' names (e.g., the devices can be displayed alphabetically). As yet another example, the devices can be organized based on their locations.
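The organization criteria above (category, name, location) can be sketched as follows. The device records and criterion names are assumptions for illustration.

```python
# Sketch: organize discovered devices by one of several criteria.

def organize(devices, by="category"):
    if by == "name":
        return sorted(d["name"] for d in devices)  # alphabetical listing
    groups = {}
    for d in devices:
        groups.setdefault(d[by], []).append(d["name"])
    return groups

devices = [
    {"name": "TV1", "category": "televisions", "location": "living room"},
    {"name": "Speakers", "category": "sound devices", "location": "living room"},
    {"name": "Lamp", "category": "appliances", "location": "bedroom"},
]
print(organize(devices, by="location")["living room"])  # ['TV1', 'Speakers']
```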
In some embodiments, the user may select to view devices organized based on a different criterion. For example, in response to the user selecting “VIEW” option 1106, the portable electronic device can display the devices based on their locations.
As discussed previously, locations can be obtained from any suitable source. For example, locations can be broadcast from the components (e.g., locations can be included in the states of the components). As another example, the user may specify the location of a component. As yet another example, the portable electronic device can determine the locations based on the strength of the signals broadcast from the components.
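One way to infer a location from signal strength is sketched below. This is a hypothetical heuristic, not the claimed method: it assumes RSSI readings (in dBm) for each broadcasting component and assigns an unlocated component the room of the known-location component whose signal strength most closely matches its own.

```python
# Hypothetical RSSI readings (dBm) from components broadcasting their
# states; higher (less negative) values indicate a closer component.
readings = {"TV1": -42, "TV2": -71, "Fireplace": -85}

# Known locations, e.g., previously specified by the user.
known_locations = {"TV1": "Living Room", "TV2": "Bedroom"}

def infer_location(component, readings, known_locations):
    """Guess an unlocated component's room as that of the known-location
    component whose signal strength is closest to its own."""
    target = readings[component]
    nearest = min(
        (name for name in readings if name in known_locations),
        key=lambda name: abs(readings[name] - target),
    )
    return known_locations[nearest]

print(infer_location("Fireplace", readings, known_locations))
```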
In some embodiments, in response to the user selecting “VIEW” option 1204, the portable electronic device can return to displaying the devices based on their categories (e.g.,
Referring back to
In some embodiments, in response to the user selecting “DELETE” option 1110 for a device (e.g., “TV1”), the portable electronic device can remove the device from the scene and display the devices that are still included in the scene.
In some embodiments, display screen 1300 can include “ADD DEVICE” option 1304, which can provide the user with components that have not been included in the scene but can be controlled by the portable electronic device. For example, the components may have previously been deleted from the scene, or the components may have recently been added to the multimedia system. In response to the user selecting “ADD DEVICE” option 1304, the portable electronic device can display those components that have not been included in the scene.
Referring back to
In some embodiments, display screen 1500 can include “ALL DEVICES” option 1510, which can allow the user to view all the devices in the multimedia system that can be controlled. In some embodiments, in response to the user selecting “ALL DEVICES” option 1510, display screen 800 (
In some embodiments, the user may save updates to the scene by selecting “SAVE” option 1512. In response to the user selecting “SAVE” option 1512, the portable electronic device can save any changes to the states of the components since the scene was last saved. For example, the user may have changed the volume setting on the speakers or the application that is running on the computer.
In some embodiments, the user may change the components that are included in the scene by selecting “EDIT” option 1514. In response to the user selecting “EDIT” option 1514, the portable electronic device can provide a display screen similar to display screen 1900 (
In some embodiments, the portable electronic device can allow the user to set up a scene in which the states of components change with time. For example, for a “DINNER” scene, the user may specify that the music gets softer as it gets later in the evening. As another example, for a “BABYSITTING” scene, the user may specify that the volume on the television decreases as it gets later in the evening. At a given time of night, the television can turn off.
In some embodiments, the user may set up the scene so that the scene is automatically restored at a certain time of the day. A scene can be linked to an alarm on the portable electronic device or on any suitable component in the multimedia system. For example, a “WAKE-UP” scene can be set up to turn on the lights, the media player, and the coffee pot at 8 am.
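A time-varying state such as the “BABYSITTING” example can be expressed as a simple schedule function. The sketch below is one possible implementation under assumed parameters (the start hour, turn-off hour, and starting volume are hypothetical, not values from the disclosure).

```python
def babysitting_volume(hour, start_hour=18, off_hour=22, start_volume=40):
    """Hypothetical schedule: television volume decreases linearly as the
    evening progresses; the television turns off (None) at off_hour."""
    if hour >= off_hour:
        return None  # television turned off
    if hour < start_hour:
        return start_volume
    # Linear ramp from start_volume down toward zero at off_hour.
    remaining = (off_hour - hour) / (off_hour - start_hour)
    return round(start_volume * remaining)

print(babysitting_volume(18))  # full volume at the start of the evening
print(babysitting_volume(20))  # half volume mid-evening
print(babysitting_volume(22))  # off
```

A scheduled scene such as “WAKE-UP” could reuse the same idea, with the portable electronic device (or an alarm on another component) invoking the restore at the specified hour.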
XML file 1600 can include scene tag 1602 and corresponding close scene tag 1604, which can define the beginning and end of a scene. Label tag 1606 can define a label for the scene. For example, label tag 1606 can be used by the portable electronic device to display onscreen text that identifies a scene as “ROMANCE MOVIE NIGHT.”
Component tags 1608 and 1610 can be used to define the beginning and end of the saved state of component 1612 (“COMPONENT1”). Identifier tags 1614 and 1616 and ID 1618 can be used by the portable electronic device to associate other information (e.g., information stored in memory 304 of
Screen elements tag 1620 and corresponding close screen elements tag 1622 can define the beginning and end of a screen elements section for component 1612. A screen element can be passive (e.g., an element that can provide information about the current state of the component) or active (e.g., an element that can display information and/or can be controlled based on user input). Label tag 1624 can define a label for the component. For example, the portable electronic device can use label tag 1624 to display on-screen text that identifies a component as “TV1.”
Slider tags 1626 and 1628 can define the beginning and end of slider element 1630. Slider element 1630 can be a passive or active screen element and can represent a volume slider such as, for example, volume slider 906 of
Min tag 1634 can define the lowest value for slider element 1630. Similarly, max tag 1636 can define the highest value for slider element 1630. Current value tag 1638 can define the current value of slider element 1630. In addition, slider element 1630 can be displayed in various ways depending on the display capabilities of the portable electronic device.
Component tags 1640 and 1642 can be used to define the beginning and end of the saved state of component 1644 (“COMPONENT2”). Component 1644 can be another component (e.g., A/V receiver) that is included in the scene.
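The structure described above can be parsed with a standard XML parser. The sketch below assumes illustrative tag names (`scene`, `label`, `component`, `identifier`, `screenElements`, `slider`, `min`, `max`, `currentValue`) matching the description; the exact names in XML file 1600 may differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical markup following the structure described above.
SCENE_XML = """
<scene>
  <label>ROMANCE MOVIE NIGHT</label>
  <component>
    <identifier>COMPONENT1</identifier>
    <screenElements>
      <label>TV1</label>
      <slider>
        <min>0</min>
        <max>100</max>
        <currentValue>35</currentValue>
      </slider>
    </screenElements>
  </component>
</scene>
"""

def parse_scene(xml_text):
    """Parse a scene file into a dictionary of components and their
    screen elements (here, just sliders)."""
    root = ET.fromstring(xml_text)
    scene = {"label": root.findtext("label"), "components": []}
    for comp in root.findall("component"):
        sliders = [
            {
                "min": int(s.findtext("min")),
                "max": int(s.findtext("max")),
                "value": int(s.findtext("currentValue")),
            }
            for s in comp.iter("slider")
        ]
        scene["components"].append({
            "id": comp.findtext("identifier"),
            "label": comp.findtext("screenElements/label"),
            "sliders": sliders,
        })
    return scene

print(parse_scene(SCENE_XML)["label"])
```

Because the min/max/current values travel with the state, a device with different display capabilities can still render the slider appropriately, as noted above.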
In some embodiments, the portable electronic device can keep track of usage patterns of the saved scenes (e.g., the times and frequencies that the saved scenes are accessed by the user). This information can be stored in memory (e.g., memory 304 of
Display screen 1700 can include “EDIT” option 1704, which can allow the user to edit listings 1702. In some embodiments, in response to the user selecting “EDIT” option 1704, the portable electronic device can allow the user to delete one or more saved scenes.
In some embodiments, the portable electronic device can provide an option to the user to restore a saved scene. For example, the user may select to restore a saved scene by placing highlight region 1706 over a listing in listings 1702. In response to the user selecting to restore a scene, the portable electronic device can inspect the preference file of the scene, and can find the components that are listed in the preference file. The portable electronic device can then attempt to contact these components and verify whether the components are currently available. For example, the portable electronic device can verify if the servers on the components are broadcasting identifiers.
If the portable electronic device discovers that a component is available, the portable electronic device can obtain the saved state of the component from the preference file, and transmit the saved state to the component (e.g., via remote client 616 of
If the portable electronic device determines that a component's current state is the same as the saved state, the portable electronic device may not transmit the saved state to the component. As a result, the restoring process can be faster and more efficient. After restoring the scene, the portable electronic device can provide a summary of the restored scene.
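The restore logic described in the last few paragraphs — contact each component, skip those whose current state already matches, transmit the saved state otherwise, and summarize the result — can be sketched as follows. The `get_current_state` and `transmit` callables stand in for the remote-client communication (e.g., remote client 616); they are placeholders, not APIs from the disclosure.

```python
def restore_scene(saved_states, get_current_state, transmit):
    """Restore a scene, skipping components whose current state already
    matches the saved state, and report a per-component summary."""
    summary = {}
    for component, saved in saved_states.items():
        current = get_current_state(component)
        if current is None:
            summary[component] = "unavailable"   # could not be contacted
        elif current == saved:
            summary[component] = "unchanged"     # nothing to transmit
        else:
            transmit(component, saved)
            summary[component] = "restored"
    return summary

# Usage with stubbed-out component states:
saved = {"TV1": {"power": "on", "volume": 20}, "Lights": {"level": 30}}
current = {"TV1": {"power": "on", "volume": 20}, "Lights": {"level": 80}}
sent = []
summary = restore_scene(saved, current.get, lambda c, s: sent.append(c))
print(summary)  # {'TV1': 'unchanged', 'Lights': 'restored'}
```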
If a particular component of the restored scene is not available, the portable electronic device can display message alert 1806 that indicates that the component cannot be contacted. For example, if the portable electronic device is unable to establish communications with the fireplace, a “cannot be contacted” message can be displayed. The user may close message alert 1806 by selecting “OK” option 1808.
In some embodiments, the portable electronic device can recommend alternative components to substitute for the component that is no longer available. For example, the portable electronic device can recommend another state for an existing component in the scene. As another example, the portable electronic device can recommend a new component that is not included in the scene. In addition, the portable electronic device can recommend a state for the new component that closely matches the component that is no longer available.
In some embodiments, display screen 1800 can include “EDIT” option 1810, which can allow the user to edit the restored scene. For example, in response to the user selecting “EDIT” option 1810, the portable electronic device can allow the user to add other available components to the scene, delete components from the scene, or any combination thereof.
In some embodiments, the user may add other available devices to listings 1902 by selecting “ADD DEVICE” option 1908. In response to the user selecting “ADD DEVICE” option 1908, the portable electronic device can search for other devices that can be controlled but are not included in listings 1902. The portable electronic device can present a display screen similar to display screen 1400 of
In some embodiments, the portable electronic device can provide an option for the user to create a new scene from the restored scene. For example, the user may assign a new label to the restored scene by entering a new label in text box 1912. In response to the user selecting to create a new scene (e.g., by selecting “CREATE NEW SCENE” option 1914), the portable electronic device can create a copy of the restored scene, and associate the copied scene with the new label.
Referring back to
In some embodiments, the portable electronic device can search for metadata associated with the media that is playing on a component in the current scene. For example, the user may currently be watching a movie on “TV1.” The portable electronic device can determine the genre of the movie (e.g., horror movie) and can search for one or more saved scenes in which “TV1” is playing another movie from the same genre. As another example, the portable electronic device can determine the artist of a song that is playing on a “MUSIC” application, and can search for one or more saved scenes in which the “MUSIC” application is playing another song by the same artist. As yet another example, the portable electronic device can determine that the user is watching an episode from a television show on “TV2” and can search for a saved scene in which “TV2” is playing another episode from the same television show.
In some embodiments, the portable electronic device can determine that the user is watching an episode from a television show on “TV1” and can search for a saved scene in which “TV2” is playing an episode from the same television show. In response to finding a saved scene in which “TV2” is playing the same television show, the portable electronic device can identify the locations of “TV1” and “TV2” (e.g., by using an internal map). For example, the portable electronic device may find that “TV1” is located in the bedroom and “TV2” is located in the living room. When the user selects to restore the scene, the portable electronic device can attempt to apply the states of devices in the scene to similar devices around “TV1”. For example, the television volume and lights in the bedroom can be adjusted to a similar state to the television volume and lights in the living room.
In some embodiments, the user may select to restore a recommended scene in listings 2002 by placing highlight region 2006 over a listing. In response to the user selecting to restore a recommended scene, the portable electronic device can restore the states of components from the recommended scene.
In some embodiments, the portable electronic device can provide one or more options that allow the user to select to continue playing the current media or switch to playing media from the recommended scene. For example, the portable electronic device can display “YES” option 2010 and “NO” option 2012. In response to the user selecting “YES” option 2010, the portable electronic device can begin to play media from the recommended scene. In response to the user selecting “NO” option 2012, the portable electronic device can continue to play the current media so that the user's viewing experience is not disrupted.
Since the user may not want to be bothered while watching media, the portable electronic device can display “YES” option 2010 and “NO” option 2012 at any suitable time. For example, if the user is watching a television program, the portable electronic device can wait for a commercial break before displaying the options (e.g., the commercial break can be identified using media cues). As another example, if the user is listening to music, the portable electronic device can wait until there is a break in the music (e.g., the time between songs) before displaying the options. As yet another example, the portable electronic device can immediately display the options after the user selects to restore a recommended scene. After displaying the options, the portable electronic device can wait for a user selection. The user may not enter a selection if he prefers to continue watching the current media. Therefore, if no user selection is received after a period of time, the portable electronic device can remove the options from display screen 2000 and continue to play the current media. As still yet another example, the user may set up the portable electronic device so that a recommended scene is automatically restored when the user selects the recommended scene. During setup, the user may also specify whether to switch the current media to the media in the recommended scene.
As discussed previously, the portable electronic device can keep track of usage patterns of the saved scenes. In some embodiments, the portable electronic device can recommend scenes based on the usage patterns. For example, the portable electronic device can determine that the current date is February 14th. Based on this information, the portable electronic device can search for a saved scene that was saved and/or restored on the same day for the past several years. As another example, the portable electronic device can determine that the current day of the week is Saturday, and can search for a saved scene that is accessed the most on Saturdays. As yet another example, the portable electronic device can use day parting to recommend a saved scene. Any suitable way of dividing the day can be used such as, for example, dividing the day into 8 am-10 am, 10 am-4 pm, 4 pm-6 pm, and 6 pm-10 pm time slots. As a result, if the portable electronic device determines that the current time is 9 am, the portable electronic device can search for a saved scene that is accessed the most between 8 am and 10 am. Persons skilled in the art will appreciate that any suitable criteria and/or combinations of criteria can be used to determine recommended scenes.
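The day-parting example can be sketched as follows, assuming one possible set of day parts and a hypothetical access log of (scene, hour) pairs.

```python
from collections import Counter

# One possible set of day parts (hour ranges, end exclusive).
DAY_PARTS = [(8, 10), (10, 16), (16, 18), (18, 22)]

def day_part(hour):
    """Return the day part containing the given hour, or None."""
    for start, end in DAY_PARTS:
        if start <= hour < end:
            return (start, end)
    return None

# Hypothetical access log: (scene name, hour of access).
access_log = [
    ("MORNING NEWS", 8), ("MORNING NEWS", 9), ("DINNER", 19),
    ("MORNING NEWS", 8), ("DINNER", 18), ("MOVIE NIGHT", 21),
]

def recommend_for_hour(hour, log):
    """Recommend the scene accessed most often during the current day part."""
    part = day_part(hour)
    counts = Counter(scene for scene, h in log if day_part(h) == part)
    return counts.most_common(1)[0][0] if counts else None

print(recommend_for_hour(9, access_log))   # most accessed 8 am-10 am
print(recommend_for_hour(20, access_log))  # most accessed 6 pm-10 pm
```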
In some embodiments, the user may access a listing of favorite scenes.
Display screen 2100 can include listings 2102 of saved scenes that have been designated as favorite scenes. In some embodiments, “FAVORITES” option 2104 can be shaded to indicate that listings 2102 include favorite scenes. In some embodiments, the user may designate one or more saved scenes as favorite scenes. The user may, for example, select “ADD” option 2106 to add scenes to listings 2102. In response to receiving a selection of “ADD” option 2106, the portable electronic device can display saved scenes that are not currently designated as favorite scenes. The user may select one or more of these saved scenes to add to listings 2102. In some embodiments, the user may select “EDIT” option 2108 to delete one or more scenes from listings 2102. In some embodiments, the portable electronic device can automatically add the most frequently restored scenes to listings 2102.
In some embodiments, the user may select a scene in listings 2102 by placing highlight region 2110 over a listing. In response to the user selecting a scene, the portable electronic device can restore the scene.
At step 2206, the portable electronic device can receive user instructions. At step 2208, the portable electronic device can determine whether an instruction to access a component in the multimedia system has been received. For example, the portable electronic device can determine whether the user has selected a component in listings 802 (
At step 2212, the portable electronic device can provide the selected component. For example, the portable electronic device can provide display screen 900 (
At step 2210, the portable electronic device can determine whether an instruction to save the scene has been received. For example, the portable electronic device can determine whether the user has selected “SAVE” option 810 (
If, at step 2210, the portable electronic device instead determines that an instruction to save the scene has been received, process 2200 moves to step 2216. At step 2216, the portable electronic device can save the scene. For example, the portable electronic device can store the states of the components included in the scene in memory (e.g., memory 304 of
At step 2214, the portable electronic device can determine whether an instruction to restore a scene has been received. For example, the portable electronic device can determine whether the user has selected “VIEW SAVED” option 812 (
If, at step 2214, the portable electronic device instead determines that an instruction to restore a scene has been received, process 2200 moves to step 2220. At step 2220, the portable electronic device can restore the scene. After restoring the scene, process 2200 moves to step 2218.
At step 2218, the portable electronic device can determine whether an instruction to exit has been received. For example, the user may select to exit process 2200 by selecting button 704 (
Process 2300 starts at step 2302. At step 2304, the portable electronic device can receive remote control commands for a component. For example, remote control commands can be associated with control options 902 or options 914 of
At step 2306, the portable electronic device can transmit the remote control commands to the component. For example, a remote client (e.g., remote client 616 of
At step 2308, the portable electronic device can receive a new state from the component. For example, a server (e.g., server 612 or server 614) on a component can broadcast the new state to the portable electronic device. After receiving the new state, process 2300 moves to step 2310.
At step 2310, the portable electronic device can verify if the new state has been correctly adjusted based on the remote control commands. If, at step 2310, the portable electronic device determines that the new state has not been correctly adjusted, process 2300 moves back to step 2306. At step 2306, the portable electronic device can re-transmit the remote control commands to the component.
If, at step 2310, the portable electronic device instead determines that the new state has been correctly adjusted, process 2300 ends at step 2312. Persons skilled in the art will appreciate that process 2300 can be repeated for all components in a scene.
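The transmit-verify-retransmit loop of process 2300 can be sketched as follows. The `transmit` and `receive_state` callables stand in for the remote client and server communication; the retry cap is an added safeguard not recited in the process.

```python
def send_with_verification(component, command, transmit, receive_state,
                           expected_state, max_attempts=3):
    """Transmit a remote control command, read the state the component
    broadcasts back, and retransmit until the state matches what the
    command should have produced (or attempts run out)."""
    for _ in range(max_attempts):
        transmit(component, command)
        new_state = receive_state(component)
        if new_state == expected_state:
            return True
    return False

# Usage with a stubbed component that drops the first transmission:
state = {"volume": 10}
attempts = []

def transmit(component, command):
    attempts.append(command)
    if len(attempts) >= 2:          # simulate a dropped first transmission
        state.update(command)

def receive_state(component):
    return dict(state)

ok = send_with_verification("TV1", {"volume": 25}, transmit, receive_state,
                            {"volume": 25})
print(ok, len(attempts))  # True 2
```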
Process 2400 starts at step 2402. At step 2404, the portable electronic device can receive user instructions. At step 2406, the portable electronic device can determine whether an instruction to view components in an alternative organization has been received. For example, the portable electronic device can provide display screen 1100 (
At step 2410, the portable electronic device can display the components in an alternative organization. For example, the portable electronic device can provide display screen 1200 (
At step 2408, the portable electronic device can determine whether an instruction to delete one or more components from the scene has been received. For example, the portable electronic device can determine whether the user has selected to delete a component (e.g., the user has selected option 1108 and “DELETE” option 1110 of
At step 2414, the portable electronic device can delete the one or more components selected by the user. For example, in response to receiving an instruction to delete “TV1”, the portable electronic device can delete “TV1” from listings 1102 (
At step 2412, the portable electronic device can determine whether an instruction to add one or more components to the scene has been received. For example, the portable electronic device can determine whether the user has selected to add a component (e.g., the user has selected “ADD DEVICE” option 1304 of
At step 2418, the portable electronic device can add the components that were selected by the user. For example, in response to the user selecting “ADD DEVICE” option 1304, the portable electronic device can provide display screen 1400 (
At step 2416, the portable electronic device can determine whether an instruction to save a scene has been received. For example, the portable electronic device can determine whether the user has selected to save a scene by selecting a “SAVE” option (e.g., “SAVE” options 1112, 1206, and 1308 in
If, at step 2416, the portable electronic device determines that an instruction to save a scene has been received, process 2400 moves to step 2420. At step 2420, the portable electronic device can save the scene. For example, in response to the user selecting a “SAVE” option, the portable electronic device can save the scene in a preference file that can be stored in memory (e.g., memory 304 of
If, at step 2416, the portable electronic device instead determines that an instruction to save the scene has not been received, process 2400 ends at step 2422. For example, the user may have selected a “BACK” option (e.g., “BACK” options 1116, 1210, and 1310 in
Process 2500 starts at step 2502. At step 2504, the portable electronic device can receive user instructions. At step 2506, the portable electronic device can determine whether an instruction to display all saved scenes has been received. For example, the portable electronic device can determine whether the user has selected an “ALL” option (e.g., “ALL” options 1710, 2008, and 2112 in
At step 2510, the portable electronic device can display all saved scenes to the user. For example, the portable electronic device can provide the user with display screen 1700 (
At step 2508, the portable electronic device can determine whether an instruction to display recommended scenes has been received. For example, the portable electronic device can determine whether the user has selected a “RECOMMENDED” option (e.g., “RECOMMENDED” options 1712, 2004, and 2114 in
At step 2514, the portable electronic device can display recommended scenes to the user. In some embodiments, the portable electronic device can determine recommended scenes based on one or more parameters (e.g., states of components and/or usage patterns). In some embodiments, the portable electronic device can provide the user with display screen 2000 (
At step 2512, the portable electronic device can determine whether an instruction to display favorite scenes has been received. For example, the portable electronic device can determine whether the user has selected a “FAVORITES” option (e.g., “FAVORITES” options 1714, 2014, and 2104 in
At step 2518, the portable electronic device can display favorite scenes to the user. In some embodiments, the user may designate one or more saved scenes as favorite scenes. In some embodiments, the portable electronic device can provide the user with display screen 2100 (
At step 2516, the portable electronic device can determine whether an instruction to edit saved scenes has been received. For example, the portable electronic device can determine whether the user has selected “EDIT” option 1704. If, at step 2516, the portable electronic device determines that an instruction to edit saved scenes has not been received, process 2500 moves to step 2520. If, at step 2516, the portable electronic device instead determines that an instruction to edit saved scenes has been received, process 2500 moves to step 2522.
At step 2522, the portable electronic device can allow the user to edit the saved scenes. For example, the portable electronic device can allow the user to delete one or more saved scenes. Process 2500 then moves to step 2520.
At step 2520, the portable electronic device can determine whether an instruction to restore a scene has been received. For example, the portable electronic device can determine whether the user has selected a scene by placing highlight region 1706 over a listing in listings 1702 (
If, at step 2520, the portable electronic device instead determines that an instruction to restore a scene has been received, process 2500 moves to step 2526.
At step 2526, the portable electronic device can locate components in the scene. For example, the portable electronic device can inspect the preference file of the scene, and find the components that are listed in the preference file. After locating the components, process 2500 moves to step 2528.
At step 2528, the portable electronic device can attempt to contact each of the components in the scene. For example, the portable electronic device can verify if the servers on the components are broadcasting identifiers. After attempting to contact each of the components, process 2500 moves to step 2530.
At step 2530, the portable electronic device can transmit the states to the components. For example, if a component is available, the portable electronic device can obtain the saved state of the component from the preference file, and transmit the saved state to the component (e.g., via remote client 616 of
At step 2532, the portable electronic device can provide a summary of the scene that has been restored. For example, the portable electronic device can provide the user with display screen 1800 (
Process 2600 starts at step 2602. At step 2604, the portable electronic device can identify recommended scenes. In some embodiments, the portable electronic device can search for recommended scenes based on states of components in the current scene. For example, the portable electronic device can determine metadata (e.g., genre) associated with the media that is currently playing. The portable electronic device can then search for components in the saved scenes that are also playing media similar to the media that is currently playing (e.g., same genre or episodes of the same television show). In some embodiments, the portable electronic device can search for recommended scenes based on usage patterns. For example, the portable electronic device can determine the current date and time (e.g., day parting). Based on this information, the portable electronic device can search for saved scenes that were accessed the most for that particular date or time (or period of time). After the portable electronic device has finished searching for recommended scenes, process 2600 moves to step 2606.
At step 2606, the portable electronic device can provide the recommended scenes to the user. For example, the portable electronic device can provide the user with display screen 2000 (
At step 2608, the portable electronic device can determine whether an instruction to restore a recommended scene has been received. For example, the user may select a recommended scene by placing highlight region 2006 over a listing in listings 2002. If, at step 2608, the portable electronic device determines that an instruction to restore a recommended scene has not been received, process 2600 ends at step 2610.
If, at step 2608, the portable electronic device instead determines that an instruction to restore a recommended scene has been received, process 2600 moves to step 2612.
At step 2612, the portable electronic device can determine if media is playing in the current scene and in the recommended scene. If, at step 2612, the portable electronic device determines that media is not playing in both scenes, process 2600 moves to step 2614.
At step 2614, the portable electronic device can restore the recommended scene. After restoring the recommended scene, process 2600 ends at step 2610.
If, at step 2612, the portable electronic device instead determines that media is playing in both scenes, process 2600 moves to step 2616. At step 2616, the portable electronic device can restore the states of the components in the recommended scene except for the media.
At step 2618, the portable electronic device can provide options for switching the current media to media in the recommended scene. For example, the portable electronic device can display “YES” option 2010 and “NO” option 2012 (
At step 2620, the portable electronic device can determine whether an instruction to switch the media has been received. For example, the user may select “YES” option 2010 to switch to the media in the recommended scene, and “NO” option 2012 to continue playing the current media. If, at step 2620, the portable electronic device determines that an instruction to switch the media has not been received, process 2600 ends at step 2610.
If, at step 2620, the portable electronic device instead determines that an instruction to switch the media has been received, process 2600 moves to step 2622. At step 2622, the portable electronic device can switch to the media in the recommended scene. After switching the media, process 2600 ends at step 2610.
Thus it is seen that systems and methods are provided for saving and restoring scenes in a multimedia system with minimal configuration. It is also seen that systems and methods are provided for providing suggestions of suitable recommended scenes based on states of components and usage patterns. Persons skilled in the art will appreciate that the disclosure can be practiced by other than the described embodiments, which are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims which follow.
As described above, one aspect of the present technology is the gathering and use of data available from various sources to improve the ability of network-connected devices to communicate with one another and/or with users of the devices. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, twitter ID's, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to communicate relevant information between devices for security (e.g., for authentication and/or authorization purposes). Accordingly, use of such personal information data enables users to appropriately register and/or control the devices. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
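The de-identification techniques described above can be sketched in code. The following is an illustrative sketch only, not part of the disclosed embodiments; all record fields, identifier names, and function names are hypothetical. It shows the three named methods: removing specific identifiers, coarsening location data to a city level, and aggregating values across users.

```python
from collections import defaultdict

# Hypothetical set of direct identifiers to strip from each record.
DIRECT_IDENTIFIERS = {"name", "date_of_birth", "email"}

def deidentify(record):
    """Remove direct identifiers and coarsen location to city level."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "address" in cleaned:
        # Keep only the city; drop the street-level address.
        cleaned["city"] = cleaned.pop("address").get("city")
    return cleaned

def aggregate(records, key, value):
    """Average a numeric field across users grouped by `key`, so no
    individual user's value is retained."""
    totals, counts = defaultdict(float), defaultdict(int)
    for r in records:
        totals[r[key]] += r[value]
        counts[r[key]] += 1
    return {k: totals[k] / counts[k] for k in totals}

records = [
    {"name": "A", "date_of_birth": "1990-01-01",
     "address": {"street": "1 Main St", "city": "Cupertino"}, "volume": 40},
    {"name": "B", "date_of_birth": "1985-05-05",
     "address": {"street": "2 Oak Ave", "city": "Cupertino"}, "volume": 60},
]

cleaned = [deidentify(r) for r in records]
by_city = aggregate(cleaned, "city", "volume")  # {"Cupertino": 50.0}
```

After de-identification, each record retains only city-level location and no direct identifiers, and the aggregated result stores a per-city average rather than any single user's value.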
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, content can be selected and delivered to users by inferring preferences based on non-personal information data or a bare minimum amount of personal information, such as the content being requested by the device associated with a user, other non-personal information available to device connection services, or publicly available information.
This application is a continuation of U.S. patent application Ser. No. 17/349,849, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Jun. 16, 2021, set to issue Mar. 15, 2022 as U.S. Pat. No. 11,277,654, which is a continuation of U.S. patent application Ser. No. 16/890,967, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Jun. 2, 2020, now U.S. Pat. No. 11,044,511, issued Jun. 22, 2021, which is a continuation of U.S. patent application Ser. No. 16/028,315, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Jul. 5, 2018, now U.S. Pat. No. 10,681,298, issued Jun. 9, 2020, which is a continuation of U.S. patent application Ser. No. 15/705,065, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Sep. 14, 2017, now U.S. Pat. No. 10,021,337, issued Jul. 10, 2018, which is a continuation of U.S. patent application Ser. No. 14/970,292, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Dec. 15, 2015, now U.S. Pat. No. 9,794,505, issued Oct. 17, 2017, which is a continuation of U.S. patent application Ser. No. 13/954,787, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Jul. 30, 2013, now U.S. Pat. No. 9,288,422, issued Mar. 15, 2016, which is a continuation of U.S. patent application Ser. No. 12/202,796, entitled “SYSTEMS AND METHODS FOR SAVING AND RESTORING SCENES IN A MULTIMEDIA SYSTEM,” filed Sep. 2, 2008, now U.S. Pat. No. 8,519,820, issued Aug. 27, 2013. The disclosures of these applications are incorporated by reference herein in their entirety. This disclosure is also related to U.S. application Ser. No. 15/275,342, filed Sep. 24, 2016, published as US2018/0091381A1 on Mar. 29, 2018, the disclosure of which is incorporated by reference herein in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5909183 | Borgstahl et al. | Jun 1999 | A |
5924486 | Ehlers et al. | Jul 1999 | A |
6323846 | Westerman et al. | Nov 2001 | B1 |
6604023 | Brown et al. | Aug 2003 | B1 |
7136709 | Arling et al. | Nov 2006 | B2 |
7187279 | Chung | Mar 2007 | B2 |
8281342 | Hiraoka | Oct 2012 | B2 |
8519820 | Cannistraro | Aug 2013 | B2 |
9172937 | Roberts et al. | Oct 2015 | B2 |
9288422 | Cannistraro | Mar 2016 | B2 |
9794505 | Cannistraro | Oct 2017 | B2 |
10021337 | Cannistraro | Jul 2018 | B2 |
20020116471 | Shteyn | Aug 2002 | A1 |
20030009567 | Farouk | Jan 2003 | A1 |
20030023435 | Josephson | Jan 2003 | A1 |
20030074088 | Gonzales et al. | Apr 2003 | A1 |
20040215694 | Podolsky | Oct 2004 | A1 |
20040217859 | Pucci | Nov 2004 | A1 |
20050035846 | Zigmond et al. | Feb 2005 | A1 |
20050097618 | Arling et al. | May 2005 | A1 |
20050231134 | Sid | Oct 2005 | A1 |
20050254505 | Chang et al. | Nov 2005 | A1 |
20060004680 | Robarts et al. | Jan 2006 | A1 |
20060045280 | Boss et al. | Mar 2006 | A1 |
20060221260 | Fujine et al. | Oct 2006 | A1 |
20060288300 | Chambers et al. | Dec 2006 | A1 |
20070047481 | Fair et al. | Mar 2007 | A1 |
20070152983 | McKillop et al. | Jul 2007 | A1 |
20070237492 | Roberts et al. | Oct 2007 | A1 |
20080129491 | Ruperto | Jun 2008 | A1 |
20080141293 | Blanchard | Jun 2008 | A1 |
20080183651 | Hunter | Jul 2008 | A1 |
20080235031 | Yamamoto | Sep 2008 | A1 |
20090023395 | Chang et al. | Jan 2009 | A1 |
20090055742 | Nordhagen | Feb 2009 | A1 |
20090070363 | Bull et al. | Mar 2009 | A1 |
20090102617 | Thommes | Apr 2009 | A1 |
20090152286 | Wilson | Jun 2009 | A1 |
20090153288 | Hope et al. | Jun 2009 | A1 |
20090153289 | Hope et al. | Jun 2009 | A1 |
20090156251 | Cannistraro et al. | Jun 2009 | A1 |
20090202250 | Dizechi et al. | Aug 2009 | A1 |
20090237487 | Santini | Sep 2009 | A1 |
20090284476 | Bull et al. | Nov 2009 | A1 |
20100052843 | Cannistraro | Mar 2010 | A1 |
20100058240 | Bull et al. | Mar 2010 | A1 |
20120124474 | Suh et al. | May 2012 | A1 |
20140009676 | Cannistraro | Jan 2014 | A1 |
20140213205 | Kennedy | Jul 2014 | A1 |
20140351374 | Canoy | Nov 2014 | A1 |
20150222517 | McLaughlin et al. | Aug 2015 | A1 |
20150350031 | Burks et al. | Dec 2015 | A1 |
20150351145 | Burks et al. | Dec 2015 | A1 |
20160055422 | Li | Feb 2016 | A1 |
20160091871 | Marti et al. | Mar 2016 | A1 |
20160173806 | Cannistraro | Jun 2016 | A1 |
20170140285 | Dotan-Cohen et al. | May 2017 | A1 |
20180070044 | Cannistraro | Mar 2018 | A1 |
20180091381 | McLaughlin et al. | Mar 2018 | A1 |
20190037162 | Cannistraro | Jan 2019 | A1 |
20210385519 | Cannistraro | Dec 2021 | A1 |
Entry |
---|
Cannistraro et al., “Remote Control Protocol for Media Systems Controlled by Portable Devices”, U.S. Appl. No. 11/955,383, filed Dec. 12, 2007. |
Hope et al., “Handheld Electronic Devices With Bimodal Remote Control Functionality”, U.S. Appl. No. 11/955,385, filed Dec. 12, 2007. |
Hope et al., “Handheld Electronic Devices With Remote Control Functionality and Gesture Recognition”, U.S. Appl. No. 11/955,382, filed Dec. 12, 2007. |
Wilson, “Drainable Cap for Invertible Containers”, U.S. Appl. No. 11/955,392, filed Dec. 13, 2007. |
Coombs, James H.; Renear, Allen H.; DeRose, Steven J. (Nov. 1987). “Markup Systems and the Future of Scholarly Text Processing”. Communications of the ACM (ACM) 30 (11): 933-947. http://xml.coverpages.org/coombs.html. Accessed Jul. 18, 2011. Entire document pertinent. |
“Markup” Free Online Dictionary of Computing, http://foldoc.org/markup. Mar. 30, 1995, 1 page. |
Non-Final OA dated May 15, 2015 in U.S. Appl. No. 13/954,787. 37 pages. |
Notice of Allowance dated Nov. 4, 2015 in U.S. Appl. No. 13/954,787. 8 pages. |
U.S. Appl. No. 12/202,796, Final Office Action dated Feb. 17, 2012, 13 pages. |
U.S. Appl. No. 12/202,796, Non-Final Office Action dated Jul. 22, 2011, 16 pages. |
U.S. Appl. No. 12/202,796, Non-Final Office Action dated Dec. 12, 2012, 8 pages. |
U.S. Appl. No. 12/202,796, Notice of Allowance dated Apr. 26, 2013, 8 pages. |
Notice of Allowance dated Jun. 16, 2017 in U.S. Appl. No. 14/970,292. 9 pages. |
Cannistraro et al., U.S. Appl. No. 11/955,383, filed Dec. 12, 2007. |
Hope et al., U.S. Appl. No. 11/955,385, filed Dec. 12, 2007. |
Kelce S. Wilson, U.S. Appl. No. 11/955,392, filed Dec. 13, 2007. |
Notice of Allowance dated Mar. 14, 2018 in U.S. Appl. No. 15/705,065. 11 pages. |
Number | Date | Country | |
---|---|---|---|
20220377401 A1 | Nov 2022 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17349849 | Jun 2021 | US |
Child | 17652678 | US | |
Parent | 16890967 | Jun 2020 | US |
Child | 17349849 | US | |
Parent | 16028315 | Jul 2018 | US |
Child | 16890967 | US | |
Parent | 15705065 | Sep 2017 | US |
Child | 16028315 | US | |
Parent | 14970292 | Dec 2015 | US |
Child | 15705065 | US | |
Parent | 13954787 | Jul 2013 | US |
Child | 14970292 | US | |
Parent | 12202796 | Sep 2008 | US |
Child | 13954787 | US |