Embodiments of the present invention relate generally to mobile terminal interoperability with a remote environment or remote client, and, more particularly, relate to a method and apparatus for associating content components with different hardware interfaces to facilitate exchange between a mobile terminal and the remote environment.
Mobile computing devices continue to evolve such that the computing devices are capable of supporting new and powerful applications. Examples include location and mapping technologies (e.g., via Global Positioning System (GPS)), media player technologies (e.g., audio and video), web browsing technologies, gaming technologies, and the like. Mobile computing devices or mobile terminals, such as mobile phones, smart phones and personal digital assistants, are evolving into personal media and entertainment centers in the sense that the devices are able to store and present a considerable amount of multimedia content. Moreover, many mobile computing devices support rich interactive games including those with three dimensional graphics.
However, due to the inherently small screen sizes and form factors of mobile computing devices, the user experience can be compromised when using rich applications. As such, solutions have been developed for interfacing a mobile computing device with a remote environment, such as a vehicle head unit, a meeting room, a home living room, etc., that, for example, includes a larger display or a more convenient user interface. As a consequence, the features and capabilities of the mobile computing device may be projected into the remote environment and appear as inherent capabilities of the remote environment. The interfacing between the mobile computing device and the remote environment may occur upon entry of the mobile computing device into the remote environment. However, connecting a mobile computing device to a remote environment often introduces latency to the user experience or otherwise diminishes the quality of service (QoS). These issues may be exacerbated in instances in which a number of different types of content are provided by the mobile computing device to the remote environment since the different types of content may have different network resource requirements, e.g., bandwidth, latency, QoS, etc.
Example methods, apparatus and computer program products are therefore described that facilitate mobile device interoperability with a remote environment. In this regard, the method, apparatus and computer program product of example embodiments facilitate the provision of different types of content to the remote environment in a manner that satisfies their different network resource requirements. Thus, the method, apparatus and computer program product of example embodiments permit the user experience to be replicated in the remote environment by accommodating the different network resource requirements of the different types of content.
In one embodiment, a method is provided that determines, for each of a plurality of content components of a user interface, a respective hardware interface via which to transmit the content component. The method may also generate meta-information associated with at least one of the content components to facilitate recomposition of the content component following transmission. Further, the method may cause the plurality of content components and the meta-information to be transmitted via the respective hardware interfaces. In this regard, at least two of the content components may be transmitted via different hardware interfaces.
The method of one embodiment may also include splitting a unified user interface into a plurality of content components based upon content type, prior to determining the respective hardware interfaces via which to transmit the content components. For example, the splitting of the unified user interface may include the separation of a control stream from a user interface stream. If desired, the user interface stream may, in turn, be further split into its constituent content components, e.g., RGB, Video, OpenGL commands, etc. The method of one embodiment may embed the meta-information in a common stream with the respective content component. In one embodiment, the determination of the respective hardware interfaces may include determining the respective hardware interfaces based upon one or more network resource requirements of the respective content components. Additionally, the determination of the respective hardware interfaces may include the determination of the respective hardware interfaces based upon the quality of service of the respective hardware interfaces.
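By way of illustration only, the splitting described above may be sketched in the following manner. The function name, the dictionary-based representation of the unified user interface, and the component types are hypothetical conventions adopted for this sketch, not part of any embodiment:

```python
# Hypothetical sketch: separate the control stream from the user interface
# stream, then split the UI stream into its constituent content components
# by type (e.g., rgb, video, opengl). The dict representation is
# illustrative only.

def split_unified_interface(unified):
    """Return (control_stream, ui_components) for a unified user
    interface represented as a mapping from content type to payload."""
    control_stream = unified.pop("control", None)
    # Every remaining entry is a user interface content component.
    ui_components = [{"type": t, "payload": p} for t, p in unified.items()]
    return control_stream, ui_components
```

Under these assumptions, splitting a unified interface containing control, RGB and video entries yields the control stream plus two user interface content components, each of which may then be considered separately when the respective hardware interfaces are determined.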
In another embodiment, an apparatus is provided that includes at least one processor and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the processor, cause the apparatus to determine, for each of a plurality of content components of a user interface, a respective hardware interface via which to transmit the content component. The at least one memory and the computer program code may also be configured to, with the processor, cause the apparatus to generate meta-information associated with at least one of the content components to facilitate recomposition of the content components following transmission. Further, the at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to cause the plurality of content components and the meta-information to be transmitted via the respective hardware interfaces. In this regard, at least two of the content components may be caused to be transmitted via different hardware interfaces.
The memory and the computer program code may be further configured to, with the processor, cause the apparatus to split a unified user interface into a plurality of content components based upon content type, prior to determining the respective hardware interfaces via which to transmit the content components. For example, the unified user interface may be split by separating a control stream from a user interface stream. The memory and the computer program code may be further configured to, with the processor, cause the apparatus to embed the meta-information in a common stream with the respective content component. The memory and the computer program code may be further configured to, with the processor, cause the apparatus to determine the respective hardware interfaces by determining the respective hardware interfaces based upon one or more network resource requirements of the respective content components. Additionally, the determination of the respective hardware interfaces may be based upon the quality of service of the respective hardware interfaces.
A computer program product is provided in accordance with one embodiment that includes at least one computer-readable storage medium having computer-executable program code portions stored therein. The computer-executable program code portions may include program code instructions for determining, for each of a plurality of content components of a user interface, a respective hardware interface via which to transmit the content component. The computer-executable program code portions may also include program code instructions for generating meta-information associated with at least one of the content components to facilitate recomposition of the content components following transmission and program code instructions for causing the plurality of content components and the meta-information to be transmitted via the respective hardware interfaces. For example, at least two of the content components may be caused to be transmitted via different hardware interfaces.
The computer-executable program code portions may also include program code instructions for splitting the unified user interface into a plurality of content components based upon content type prior to determining the respective hardware interfaces via which to transmit the content components. For example, the unified user interface may be split by separating a control stream from a user interface stream. The computer-executable program code portions may include program code instructions for embedding the meta-information in a common stream with the respective content component. The program code instructions for determining the respective hardware interfaces may include program code instructions for determining the respective hardware interfaces based upon one or more network resource requirements of the respective content components. Additionally, the respective hardware interfaces may be determined based upon the quality of service of the respective hardware interfaces.
An apparatus is also provided in accordance with one embodiment that includes means for determining, for each of a plurality of content components of a user interface, a respective hardware interface via which to transmit the content component. The apparatus of this embodiment may also include means for generating meta-information associated with at least one of the content components to facilitate recomposition of the content components following transmission and means for causing the plurality of content components and the meta-information to be transmitted via the respective hardware interfaces. For example, at least two of the content components may be transmitted via different hardware interfaces.
The apparatus may also include means for splitting a unified user interface into a plurality of content components based upon content type prior to determining the respective hardware interfaces via which to transmit the content components. In this regard, the means for splitting the unified user interface may include means for separating a control stream from a user interface stream. The apparatus of one embodiment may also include means for embedding the meta-information in a common stream with the respective content component. The means for determining the respective hardware interfaces may include means for determining the respective hardware interfaces based upon one or more network resource requirements of the respective content components. For example, the means for determining the respective hardware interfaces may be based upon the quality of service of the respective hardware interfaces.
In another embodiment, a method is provided that receives a plurality of streams of content components and meta-information via different respective hardware interfaces, recomposes the content components in accordance with the meta-information to form a unified user interface and causes a display to be presented in accordance with the unified user interface. For example, receiving the plurality of streams may include receiving a control stream and a user interface stream via different hardware interfaces. In addition, the method of one embodiment may include causing feedback to be provided regarding the quality of service of the respective hardware interfaces.
An apparatus is provided in accordance with another embodiment that includes at least one processor and at least one memory including computer program code. In this embodiment, the at least one memory and the computer program code are configured to, with the processor, cause the apparatus to at least receive the plurality of streams of content components and meta-information via different respective hardware interfaces, recompose the content components in accordance with the meta-information to form a unified user interface and cause a display to be presented in accordance with the unified user interface. With regard to receiving the plurality of streams, the memory and the computer program code may be further configured to, with the processor, cause the apparatus to receive a control stream and a user interface stream via different hardware interfaces. The memory and the computer program code of one embodiment may be further configured to, with the processor, cause the apparatus to cause feedback to be provided regarding the quality of service of the respective hardware interfaces.
A computer program product is provided in accordance with another embodiment that includes at least one computer-readable storage medium having computer-executable program code portions stored therein. The computer-executable program code portions include program code instructions for receiving a plurality of streams of content components and meta-information via different respective hardware interfaces, recomposing the content components in accordance with the meta-information to form a unified user interface and causing a display to be presented in accordance with the unified user interface. The program code instructions for receiving the plurality of streams may include program code instructions for receiving a control stream and a user interface stream via different hardware interfaces. The computer-executable program code portions may also include program code instructions for causing feedback to be provided regarding the quality of service of the respective hardware interfaces.
An apparatus may be provided in accordance with one embodiment that includes means for receiving a plurality of streams of content components and meta-information via different respective hardware interfaces, means for recomposing the content components in accordance with the meta-information to form a unified user interface and means for causing a display to be presented in accordance with the unified user interface. The means for receiving a plurality of streams may include means for receiving a control stream and a user interface stream via different hardware interfaces. The apparatus of one embodiment may also include means for causing feedback to be provided regarding the quality of service of the respective hardware interfaces.
By separately determining the hardware interface via which to transmit each content component, the content components may be associated with different hardware interfaces based upon, for example, the network resource requirements of the respective content components. By matching or otherwise correlating the network resource requirements of the respective content components with the performance offered by the respective hardware interfaces, the content components may be transmitted and then recomposed in an efficient and effective way such that the user experience in the remote environment may be improved.
Having thus described embodiments of the present invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
Additionally, as used herein, the term ‘circuitry’ refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
As defined herein a “computer-readable storage medium,” which refers to a non-transitory, physical storage medium (e.g., volatile or non-volatile memory device), can be differentiated from a “computer-readable transmission medium,” which refers to an electromagnetic signal.
Some embodiments of the present invention may provide a mechanism by which improvements may be experienced in relation to the interoperability of mobile terminals with remote environments. In this regard, for example, a mobile terminal may be placed in communication with a remote device or environment and the mobile terminal and the remote environment may exchange information that directs the remote environment to display the same or at least a portion of the same user interface as that provided or generated by the mobile terminal. In this regard,
The remote environment 10 may include any type of computing device configured to display an image. According to some example embodiments, the remote environment may include user interface components and functionality, such as a screen on which to display the image. In this regard, keypad 16 may be an optional user input device, although other types of user input devices may be employed. For example, the screen may be a touch screen display that is configured to receive input from a user via touch events with the display. Further, the remote environment may include gaming controllers, speakers, a microphone, and the like. According to some example embodiments, the remote environment may be a system of devices that define an intelligent space. The system of devices may be configured to cooperate to perform various functionalities. For example, a remote environment implemented in a meeting room, a home living room, etc. may include a large screen monitor, a wired telephone device, a computer, and the like. The remote environment may also include a communications interface for communicating with the mobile terminal 12 via the communications link 14. By way of another example, the remote environment may include an in-car navigation system, a vehicle entertainment system, a vehicle head unit or any of a number of other remote environments with which the mobile terminal may communicate.
The communications link 14 may be any type of communications link capable of supporting communications between the remote environment 10 and the mobile terminal 12. According to some example embodiments, the communications link is a wireless link, such as a wireless local area network (WLAN) link, a Bluetooth link, a WiFi link, an infrared link or the like. While the communications link is depicted as a wireless link, it is contemplated that the communications link may be a wired link, such as a Universal Serial Bus (USB) link or a High-Definition Multimedia Interface (HDMI) link. As described below, the communications link generally includes a plurality of different types of links. Consequently, the mobile terminal and the remote environment may each include a plurality of hardware interfaces, one of which is associated with and adapted for each of the communications links.
The mobile terminal 12 may be any type of mobile computing and communications device. According to various example embodiments, the mobile terminal is any type of user equipment, such as, for example, a personal digital assistant (PDA), wireless telephone, mobile computing device, camera, video recorder, audio/video player, positioning device (e.g., a GPS device), game device, television device, radio device, or various other like devices or combinations thereof. The mobile terminal may be configured to communicate with the remote environment 10 via the communications link 14. The mobile terminal may also be configured to execute and implement applications via a processor and memory included within the mobile terminal, as described below.
The interaction between the mobile terminal 12 and the remote environment 10 provides an example of mobile device interoperability, which may also be referred to as a smart space, remote environment, or remote client. In this regard, features and capabilities of the mobile terminal may be projected onto an external environment (e.g., the remote environment), and the external environment may appear as if the features and capabilities are inherent to the external environment such that the dependency on the mobile terminal is not apparent to a user. According to various example embodiments, the mobile terminal may seamlessly become a part of the remote environment, whenever the person carrying the mobile device physically enters into the intelligent space (e.g., living room, meeting room, vehicle, or the like). Projecting the mobile terminal's features and capabilities may involve exporting the User Interface (UI) images of the mobile terminal, as well as command and control signals, to the external environment, whereby the user may comfortably interact with the external environment in lieu of the mobile terminal.
According to some example embodiments, the mobile terminal 12 may be configured to, via the communications link 14, direct the remote environment 10 to project a user interface image originating with the mobile terminal and receive user input provided via the remote environment. The image presented by the remote environment may be the same image or a portion of the same image that is being presented on a display of the mobile terminal, or an image that would have been presented had the display of the mobile terminal been activated. In some example embodiments, the image projected by the remote environment may be a modified image, relative to the image that would have been provided on the display of the mobile terminal. For example, consider an example scenario where the remote environment is installed in a vehicle as a vehicle head unit. The driver of the vehicle may wish to use the remote environment as an interface to the mobile terminal due, for example, to the convenient location of the remote environment within the vehicle and the size of the display screen provided by the remote environment. The mobile terminal may be configured to link with the remote environment, and direct the remote environment to present user interface images.
In an example embodiment, the remote environment 10 and the mobile terminal 12 may provide for virtual network computing (VNC) operation. As such, for example, the mobile terminal may serve as a VNC server configured to provide content originally executed or accessed by the mobile terminal to the remote environment acting as a VNC client (or vice versa). A VNC protocol such as RFB (remote frame buffer) or another protocol for enabling remote access to a graphical user interface may be utilized to provide communication between the mobile terminal and the remote environment.
Referring now to
The processor 70 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, processing circuitry, or the like. In an exemplary embodiment, the processor may be configured to execute instructions stored in the memory device 76 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (e.g., the mobile terminal 12 or the remote environment 10) adapted for employing embodiments of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. 
By executing the instructions or programming provided thereto or associated with the configuration of the processor, the processor may cause corresponding functionality to be performed. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
Meanwhile, the communication interface 74 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus. In this regard, the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. In some environments, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms. As described below, the communication interface may include a plurality of hardware interfaces for facilitating communication via different respective communication links 14. For example, the communication interface may include a WLAN interface, a Bluetooth interface, an infrared interface, a USB interface, an HDMI interface, etc.
The user interface 72 may be in communication with the processor 70 to receive an indication of a user input at the user interface and/or to provide an audible, visual, mechanical or other output to the user. As such, the user interface may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen, soft keys, a microphone, a speaker, or other input/output mechanisms. In this regard, for example, the processor may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as, for example, a speaker, ringer, microphone, display, and/or the like. The processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor (e.g., memory device 76, and/or the like).
In an example embodiment depicted in
In the embodiment of
The content that is to be provided from the mobile terminal 12 to the remote environment 10 may include a plurality of different components with each content component being of a different type. For example, the content may represent a unified user interface that includes an audio component, a video component, control signals and the like. Depending upon the type of content, each content component may have different network resource requirements that are necessary or desired to support the efficient transmission of the content component from the mobile terminal to the remote environment. While various network resource requirements may be defined, bandwidth, quality of service, latency and the like are examples of network resource requirements that may differ depending upon the type of content. Moreover, the mobile terminal and the remote environment may each include a variety of different hardware interfaces that support different types of communication links between the mobile terminal and the remote environment. As described above, for example, the mobile terminal and the remote environment may each include a WLAN interface, a Bluetooth interface, a WiFi interface, a USB interface, an HDMI interface and the like. Each hardware interface may also be configured to provide different levels of service or to otherwise provide access to different network resources, such as by providing different bandwidth, quality of service, latency, etc.
In order to facilitate the transfer of content, such as a unified user interface, between the mobile terminal 12 and the remote environment 10, the content, e.g., the entire user interface including video, audio, etc., may be split into different content components based upon the type of the content. The content components may then be assigned to or associated with different ones of the hardware interfaces based upon the network resource requirements of the content components and the network resources that may be provided by the different hardware interfaces. For example, the content components may be matched with respective hardware interfaces that satisfy the network resource requirements of the content components such that the content components may thereafter be transferred from the mobile terminal to the remote environment via the respective hardware interfaces in an efficient manner. By way of example, the entire unified user interface may be split into a user interface stream (which may, in turn, be further split into its constituent content components, e.g., video, audio, OpenGL commands, etc.) and a control stream, with the user interface stream being transferred via the HDMI interface or an AV out interface, while the control stream is transferred via a Bluetooth stream or a WLAN stream. As a further example, the entire unified user interface may be split into a RGB user interface (UI) component, a video component and an audio component. Once encoded, these components may be transferred to the remote environment via different hardware interfaces, such as a USB interface for the RGB UI component, an HDMI interface for the video component and a Bluetooth interface for the audio component, as shown, for example, in
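The matching of content components with respective hardware interfaces may, purely for illustration, be sketched as a first-fit assignment. The requirement and capacity figures below are hypothetical values chosen only so that the sketch reproduces the USB, HDMI and Bluetooth example above; they are not drawn from any interface specification:

```python
# Hypothetical per-content-type network resource requirements:
# minimum bandwidth (Mbit/s) and maximum tolerable latency (ms).
REQUIREMENTS = {
    "control": {"bandwidth": 0.1, "latency": 50},
    "rgb":     {"bandwidth": 20,  "latency": 100},
    "video":   {"bandwidth": 100, "latency": 100},
    "audio":   {"bandwidth": 1,   "latency": 30},
}

def assign_interfaces(component_types, interfaces):
    """Match each content component to the first hardware interface
    whose offered resources satisfy the component's requirements."""
    assignment = {}
    for ctype in component_types:
        req = REQUIREMENTS[ctype]
        for name, offer in interfaces.items():
            if (offer["bandwidth"] >= req["bandwidth"]
                    and offer["latency"] <= req["latency"]):
                assignment[ctype] = name
                break
        else:
            raise RuntimeError(f"no interface satisfies {ctype!r}")
    return assignment
```

With illustrative interface capacities such as `{"bluetooth": {"bandwidth": 2, "latency": 20}, "usb": {"bandwidth": 60, "latency": 5}, "hdmi": {"bandwidth": 500, "latency": 5}}`, the sketch assigns the control and audio components to the Bluetooth interface, the RGB UI component to the USB interface and the video component to the HDMI interface, consistent with the example above.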
The remote environment 10 may, in turn, provide content to the mobile terminal 12. For example, the remote environment may receive user input, such as via a user input device, and may provide the user input, such as via a control stream, to the mobile terminal. As before, the remote environment may determine the appropriate hardware interface via which to transfer the control stream based upon the network resource requirements of the control stream and the network resources provided by the respective hardware interfaces. In the embodiment of
By way of further example, the operations associated with the interoperability of a mobile terminal 12 and a remote environment 10 in one embodiment are described in further detail below in conjunction with
The apparatus 50 of the server device may also include means, such as the processor 70, for determining, for each of the plurality of the content components, a respective hardware interface via which to transmit the content component. See block 102 of
In at least some embodiments, the server device, such as the mobile terminal 12, may also take into account feedback from the client device, such as the remote environment 10, in conjunction with the determination of the respective hardware interfaces via which to transmit the content components. As such, the apparatus 50 of the server device may also include means, such as the processor 70, for determining the respective hardware interfaces via which the content components are to be transmitted based upon the feedback, e.g., the quality of service of the respective hardware interfaces. Thus, the assignment process by which content components are assigned to respective hardware interfaces may evolve in accordance with the behavior of the network.
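One way such feedback could influence subsequent assignments is to blend each interface's nominal bandwidth with the throughput actually observed by the client, so later assignments reflect network behavior. This is a minimal sketch under that assumption; the smoothing factor and interface figures are hypothetical:

```python
# Hypothetical nominal capabilities, as advertised by the interfaces.
NOMINAL = {
    "hdmi": {"bandwidth_mbps": 10200.0},
    "wlan": {"bandwidth_mbps": 54.0},
}

def update_capacity(interfaces, feedback, alpha=0.5):
    """Blend nominal bandwidth with client-reported throughput using an
    exponential moving average; interfaces without feedback are unchanged."""
    updated = {}
    for name, cap in interfaces.items():
        observed = feedback.get(name)
        if observed is None:
            updated[name] = dict(cap)
        else:
            blended = (1 - alpha) * cap["bandwidth_mbps"] + alpha * observed
            updated[name] = {**cap, "bandwidth_mbps": blended}
    return updated
```

The updated capacities would then be fed back into the assignment step, so a congested interface gradually loses components to better-performing ones.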
In addition to determining the hardware interface via which to transmit each content component, the apparatus 50 of the server device may include means, such as the processor 70, for generating meta-information associated with at least one of the content components. See block 104 of
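The meta-information might, for instance, identify each component, the interface carrying it, its encoding, and ordering data the client can use for recomposition. The following record layout and field names are purely illustrative assumptions:

```python
import time

def make_meta_info(component_id, interface, codec, sequence, timestamp=None):
    """Build a hypothetical meta-information record accompanying a content
    component, letting the client identify, order and synchronize streams."""
    return {
        "component_id": component_id,  # e.g. "video", "audio", "rgb_ui"
        "interface": interface,        # hardware interface carrying the stream
        "codec": codec,                # how the component was encoded
        "sequence": sequence,          # ordering within the stream
        "timestamp": timestamp if timestamp is not None else time.time(),
    }
```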
The apparatus 50 of the server device may then also include means, such as the processor 70, the communication interface 74 or the like, for causing the plurality of content components and the meta-information to be transmitted via respective hardware interfaces. See block 108 of
Following transmission of the content components, the client device, such as the remote environment 10, may include an apparatus 50 having means, such as the processor 70, the communication interface 74 or the like, for receiving the plurality of streams of content components and meta-information via the different respective hardware interfaces. See block 110 of
The apparatus 50 of the client device may also include means, such as the processor 70, for implementing content composition by recomposing the content components into the original content, such as a unified user interface. See block 112 of
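Recomposition on the client side can be sketched as ordering the received packets by their meta-information and regrouping them per component. The packet fields here mirror the hypothetical meta-information record above and are assumptions, not part of the specification:

```python
def recompose(packets):
    """Reassemble content components received over different hardware
    interfaces into presentation order, keyed by timestamp then sequence."""
    ordered = sorted(packets, key=lambda p: (p["timestamp"], p["sequence"]))
    composition = {}
    for p in ordered:
        composition.setdefault(p["component_id"], []).append(p["payload"])
    return composition
```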
In one embodiment, the apparatus 50 of the client device may also include means, such as the processor 70, the communication interface 74 or the like, for causing feedback, such as feedback regarding the quality of service of the respective hardware interfaces, to be provided to the server device. In this regard, the processor may determine a measure of the quality of service associated with the transfer of each content component via the respective hardware interface and may then provide such information to the server device for use, for example, in conjunction with the subsequent assignment of content components to the different hardware interfaces. The client device may also provide other signals, in addition to or instead of the feedback, to the server device. As shown in
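A simple quality-of-service measure the client might report is per-interface throughput and packet loss, derived from sequence numbers in the received streams. This sketch and its field names are illustrative assumptions:

```python
def measure_qos(log):
    """Derive a per-interface QoS report (bytes received, packet loss)
    from a log of received packets carrying sequence numbers."""
    report = {}
    for entry in log:
        stats = report.setdefault(entry["interface"],
                                  {"bytes": 0, "received": 0, "expected": 0})
        stats["bytes"] += entry["bytes"]
        stats["received"] += 1
        # Highest sequence number seen implies how many packets were sent.
        stats["expected"] = max(stats["expected"], entry["sequence"] + 1)
    for stats in report.values():
        stats["loss"] = 1 - stats["received"] / stats["expected"]
    return report
```

The resulting report could be sent back over a low-bandwidth control channel, e.g., the Bluetooth or WLAN stream mentioned above, for use in subsequent assignments.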
While primarily described above in conjunction with an embodiment in which the mobile terminal 12 functions as the server device and the remote environment 10 serves as the client device, the roles may be reversed in other embodiments with the remote environment functioning as the server device and the mobile terminal serving as the client device. In still further embodiments, other types of devices or terminals may serve as the server device and/or the client device.
As described above,
Accordingly, blocks of the flowchart support combinations of means for performing the specified functions, combinations of operations for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
In an example embodiment, an apparatus for performing the methods of
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
This application claims priority to U.S. Application No. 61/329681 filed Apr. 30, 2010, which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
61329861 | Apr 2010 | US