Example embodiments of the present invention relate generally to displays and user interfaces of mobile devices and, in particular, to controlling the level of information detail displayed on the display of a device when used in a multi-device environment.
The modern communications era has brought about a tremendous expansion of wireline and wireless networks. Computer networks, television networks, and telephone networks are experiencing an unprecedented technological expansion, fueled by consumer demand. Wireless and mobile networking technologies have addressed consumer demands while providing more flexibility and immediacy of information transfer.
Mobile devices, such as cellular telephones, have become smaller and lighter while also becoming more capable of performing tasks that far exceed a traditional voice call. Mobile devices are increasingly becoming small, portable computing devices that are capable of running a variety of applications and providing a user with a display on which the user may watch video, view web pages, play interactive games, or read text. Devices are often small enough to fit into a pocket, achieving the desired portability; however, as the capabilities of the devices increase, the displays of such devices are used to present large amounts of information and to view objects that have traditionally been displayed on larger, less portable displays. It may be desirable to provide a method of enhancing the displayed information of a single device in a multi-device environment in response to a user input.
In general, exemplary embodiments of the present invention provide an improved method of enhancing a user interface with a mobile device by joining the displays of multiple devices to function together with one another and by controlling information detail in a multi-device environment. In particular, the method of example embodiments provides for directing, by a processor, a presentation of a first image on a display of a device configured to operate in a multi-device environment, detecting a motion of the device, and directing a change of an image presented on the display of the device from the first image to a second image in response to detecting the motion of the device, where the first image presented on the device is related to images presented on other devices in the multi-device environment. The second image may be a scaled version of the first image and the method may further include scaling the second image based on at least one property of the motion. Each device in the multi-device environment may be directed to present a portion of a complete image, and the first image may be a portion of the complete image. The method may further entail directing at least one other device in the multi-device environment to change an image presented on the display of the at least one other device in response to the detected motion of the device. The motion of the device may include moving the device from a first location, and the method may further include again directing presentation of the first image on the device in response to detection that the device has been returned to the first location. The second image may be an expanded view of the first image including information not present in the first image.
According to another embodiment of the present invention, an apparatus is provided. The apparatus may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured to, with the at least one processor, cause the apparatus to at least direct presentation of a first image on a display of a device configured to operate in a multi-device environment, detect a motion of the device, and direct a change of an image presented on the display of the device from the first image to a second image in response to detecting the motion of the device. The first image presented on the device may be related to images presented on other devices in the multi-device environment. The second image may be a scaled version of the first image and the computer program code may be further configured to cause the apparatus to scale the second image based on at least one property of the motion. The at least one memory and the computer program code may be configured to, with the at least one processor, cause the apparatus to present a portion of a complete image, where the first image is a portion of the complete image. The at least one memory and the computer program code may be configured to, with the at least one processor, cause the apparatus to direct at least one other device in the multi-device environment to change an image presented on the display of the at least one other device in response to the detected motion of the device. The motion of the device may include moving the device from a first location and the apparatus may be further caused to again direct presentation of the first image on the device in response to detection that the device has returned to the first location. The second image may be an expanded view of the first image presenting information not present in the first image.
A further embodiment of the invention may include a computer program product including at least one computer-readable storage medium having computer-executable program code instructions stored therein, where the computer-executable program code instructions may include program code instructions for directing the presentation of a first image on a display of a device configured to operate in a multi-device environment, program code instructions for detecting a motion of the device, and program code instructions for directing a change of an image presented on the display of the device from the first image to a second image in response to detecting the motion of the device. The first image presented on the device may be related to images presented on other devices in the multi-device environment. The second image may be a scaled version of the first image and the computer program product may further include program code instructions for scaling the second image based on at least one property of the motion. The computer program product may further include program code instructions to cause each device in the multi-device environment to present a portion of a complete image, and the first image may be a portion of the complete image. The computer program product may further include program code instructions for causing at least one other device in the multi-device environment to change an image presented on the display of said at least one other device in response to the detected motion of the device. The motion of the device may include moving the device from a first location and the computer program product may further include program code instructions for again directing presentation of the first image on the device in response to the device being returned to the first location.
Another example embodiment of the present invention may provide an apparatus including means for directing presentation of a first image on a display of a device configured to operate in a multi-device environment, means for detecting a motion of the device, and means for directing a change of the image presented on the display of the device from the first image to a second image in response to detecting the motion of the device. The first image presented on the device may be related to images presented on other devices in the multi-device environment. The second image may be a scaled version of the first image and the apparatus may include means for scaling the second image based on at least one property of the motion. The apparatus may further include means for presenting a portion of a complete image, where the first image is a portion of the complete image. The apparatus may include means for directing at least one other device in the multi-device environment to change an image presented on the display of the at least one other device in response to the detected motion of the device. The motion of the device may include moving the device from a first location and the apparatus may include means for again directing presentation of the first image on the device in response to detection that the device has returned to the first location. The second image may be an expanded view of the first image presenting information not present in the first image.
In general, further example embodiments of the present invention may provide a simple and intuitive method for combining the displays of multiple devices in a multi-device environment and for indicating the spatial arrangement of the devices relative to one another. The method may include detecting a touch, receiving an indication of a touch on another device in a multi-device environment, obtaining an order of devices in the multi-device environment, and providing for operation according to the order of devices. The method may further include obtaining a location relative to another device in the multi-device environment in response to receiving an indication of a touch on said device. The method may also include providing for display of a portion of an image based upon the location relative to another device. Receiving an indication of a touch on another device in a multi-device environment may include receiving a request to join said device in the multi-device environment.
According to another embodiment of the present invention, an apparatus is provided. The apparatus may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured to, with the at least one processor, cause the apparatus to at least detect a touch, receive an indication of a touch on another device in a multi-device environment, obtain an order of devices in the multi-device environment, and provide for operation according to the order of devices. The apparatus may further be caused to obtain a location relative to another device in the multi-device environment in response to receiving an indication of a touch on said device and to provide for display of a portion of an image based upon the location relative to another device. Receiving an indication of a touch on another device in the multi-device environment may include receiving a request to join the device in the multi-device environment.
A further embodiment of the invention may include a computer program product including at least one computer-readable storage medium having computer-executable program code instructions stored therein, where the computer-executable program code instructions may include program code instructions for detecting a touch, program code instructions for receiving an indication of a touch on another device in a multi-device environment, program code instructions for obtaining an order of devices in the multi-device environment, and program code instructions for providing for operation according to the order of devices. The computer program product may further include program code instructions for obtaining a location relative to another device in the multi-device environment in response to receiving an indication of a touch on the device and program code instructions for providing for display of a portion of an image based upon the location relative to another device. The program code instructions for receiving an indication of a touch on another device in a multi-device environment may include program code instructions for receiving a request to join the device in the multi-device environment.
Another example embodiment of the present invention may provide an apparatus including means for detecting a touch, means for receiving an indication of a touch on another device in a multi-device environment, means for obtaining an order of devices in the multi-device environment, and means for providing for operation according to the order of devices. The apparatus may further include means for obtaining a location relative to another device in the multi-device environment in response to receiving an indication of a touch on the device and means for providing for display of a portion of an image based upon the location relative to another device. Receiving an indication of a touch on another device in a multi-device environment may include receiving a request to join the device in the multi-device environment.
Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Some example embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the example embodiments set forth herein; rather, these example embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention.
Additionally, as used herein, the term ‘circuitry’ refers to (a) hardware-only circuit implementations (e.g., implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
A session may be supported by a network 30 as shown in
One or more communication terminals such as the mobile terminal 10 and the second mobile terminal 20 may be in communication with each other via the network 30 and each may include an antenna or antennas for transmitting signals to and for receiving signals from a base site, which could be, for example a base station that is part of one or more cellular or mobile networks or an access point that may be coupled to a data network, such as a local area network (LAN), a metropolitan area network (MAN), and/or a wide area network (WAN), such as the Internet. In turn, other devices (e.g., personal computers, server computers or the like) may be coupled to the mobile terminal 10 and the second mobile terminal 20 via the network 30. By directly or indirectly connecting the mobile terminal 10 and the second mobile terminal 20 and other devices to the network 30, the mobile terminal 10 and the second mobile terminal 20 may be enabled to communicate with the other devices or each other, for example, according to numerous communication protocols including Hypertext Transfer Protocol (HTTP) and/or the like, to thereby carry out various communication or other functions of the mobile terminal 10 and the second mobile terminal 20, respectively.
In example embodiments, either of the mobile terminals may be mobile or fixed communication devices. Thus, for example, the mobile terminal 10 and the second mobile terminal 20 could be, or be substituted by, any of personal computers (PCs), personal digital assistants (PDAs), wireless telephones, desktop computers, laptop computers, mobile computers, cameras, video recorders, audio/video players, positioning devices, game devices, television devices, radio devices, or various other devices or combinations thereof.
Although the mobile terminal 10 may be configured in various manners, one example of a mobile terminal that could benefit from embodiments of the invention is depicted in the block diagram of
The mobile terminal (e.g., mobile terminal 10) may, in some embodiments, be a computing device configured to employ an example embodiment of the present invention. However, in some embodiments, the mobile terminal may be embodied as a chip or chip set. In other words, the mobile terminal may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The mobile terminal may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip.” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
The mobile terminal 10 illustrated in
It is understood that the apparatus, such as the processor 40, may include circuitry implementing, among others, audio and logic functions of the mobile terminal 10. The processor may be embodied in a number of different ways. For example, the processor may be embodied as various processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, and/or the like.
In an example embodiment, the processor 40 may be configured to execute instructions stored in the memory device 60 or otherwise accessible to the processor 40. Alternatively or additionally, the processor 40 may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor 40 may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor 40 is embodied as an ASIC, FPGA or the like, the processor 40 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor 40 is embodied as an executor of software instructions, the instructions may specifically configure the processor 40 to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor 40 may be a processor of a specific device (e.g., a mobile terminal or network device) adapted for employing an embodiment of the present invention by further configuration of the processor 40 by instructions for performing the algorithms and/or operations described herein. The processor 40 may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor 40.
The mobile terminal 10 may also comprise a user interface including an output device such as an earphone or speaker 44, a ringer 42, a microphone 46, a display 48, and a user input interface, which may be coupled to the processor 40. The user input interface, which allows the mobile terminal to receive data, may include any of a number of devices, such as a keypad 50, a touch sensitive display (not shown), or other input device. In embodiments including the keypad, the keypad may include numeric (0-9) and related keys (#, *), and other hard and soft keys used for operating the mobile terminal 10. Alternatively, the keypad may include a conventional QWERTY keypad arrangement. The keypad may also include various soft keys with associated functions. In addition, or alternatively, the mobile terminal may include an interface device such as a joystick or other user input interface. The mobile terminal may further include a battery 54, such as a vibrating battery pack, for powering various circuits that are used to operate the mobile terminal, as well as optionally providing mechanical vibration as a detectable output. The mobile terminal 10 may also include a sensor 49, such as an accelerometer, motion sensor/detector, temperature sensor, or other environmental sensor to provide input to the processor indicative of a condition or stimulus of the mobile terminal 10.
The mobile terminal 10 may further include a user identity module (UIM) 58, which may generically be referred to as a smart card. The UIM may be a memory device having a processor built in. The UIM may include, for example, a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), or any other smart card. The UIM may store information elements related to a mobile subscriber. In addition to the UIM, the mobile terminal may be equipped with memory. For example, the mobile terminal may include volatile memory 60, such as volatile Random Access Memory (RAM) including a cache area for the temporary storage of data. The mobile terminal may also include other non-volatile memory 62, which may be embedded and/or may be removable. The non-volatile memory may additionally or alternatively comprise an electrically erasable programmable read only memory (EEPROM), flash memory or the like. The memories may store any of a number of pieces of information and data used by the mobile terminal to implement the functions of the mobile terminal. For example, the memories may include an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying the mobile terminal. Furthermore, the memories may store instructions for determining cell id information. Specifically, the memories may store an application program for execution by the processor 40, which determines an identity of the current cell, i.e., cell id identity or cell id information, with which the mobile terminal is in communication.
In general, example embodiments of the present invention provide a method for controlling information detail depicted on the display of a device, such as a mobile terminal 10. In particular, embodiments may control information detail depicted on the display of a mobile terminal relative to at least one other mobile terminal when the mobile terminal is operating in a multi-device environment. For example a first mobile terminal may be operating in a near-field network with at least one other mobile terminal, through a protocol such as Bluetooth™, and the mobile terminals may be operating in a symbiotic manner in which the displays of the mobile terminals are joined together to create a larger display capable of presenting a greater amount of detail of an image, document, or other object presented across the displays of the mobile terminals. While the term “image” is used herein to describe what is presented on the display of a mobile terminal, it is to be understood that the term image is not limited to media files or images in the conventional sense, but rather the presentation of any object of data, media, or otherwise which may be presented on the display of a mobile terminal.
An example application for which embodiments of the present invention may be implemented includes a virtual mind map as presented on a first mobile terminal placed, for example, on a table top surface. A second mobile terminal may be placed adjacent to the first mobile terminal and a join event may occur to join the two devices in a multi-device environment. The join event may include a touch gesture between the two mobile terminals or a menu-driven pairing operation operable on either or both mobile terminals. Mobile terminals that have previously been joined in a multi-device environment may need only be placed directly adjacent to one another to initiate the join event. The user(s) may indicate through a gesture or a menu prompt on either terminal that a join event is to occur, or may simply confirm the join event. Once joined, the two mobile terminals may function cooperatively (or independently) depending on the application executed on one or both of the mobile terminals. For example, in the case of a virtual mind map, the second terminal may present a portion of the virtual mind map that was previously off-screen of the first mobile terminal, as the second mobile terminal may function to expand the display area of the first mobile terminal.
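By way of a non-limiting illustration of the display-expansion behavior described above, joining terminals may be thought of as partitioning a complete image into one region per device. The following Python sketch uses hypothetical names and a simple equal-strip layout; it is an assumption for illustration, not an implementation drawn from the description:

```python
def partition_image(width, height, num_devices):
    """Divide a complete image of width x height pixels into equal
    horizontal strips, one per joined device, ordered left to right.
    Each region is a (left, top, right, bottom) tuple."""
    strip = width // num_devices
    regions = []
    for i in range(num_devices):
        # The last device absorbs any remainder so the full width is covered.
        right = width if i == num_devices - 1 else (i + 1) * strip
        regions.append((i * strip, 0, right, height))
    return regions
```

Under this sketch, a second terminal joined to the right of the first would be assigned the second strip, thereby presenting content that was previously off-screen of the first terminal.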
A multi-device near-field network may provide a multi-device environment in which multiple mobile terminals may be used cooperatively to enhance a user experience. Mobile terminals may be “joined” to the network in a number of possible manners, such as through motion gestures of adjacent mobile terminals, or through a manual connection procedure in which a user synchronizes or pairs a mobile terminal with another mobile terminal. The motion gesture for joining the devices may consist of a sequence of primitive discrete gestures, such as taps on each device, or it may be a continuous gesture (e.g., of circular shape) that spans the displays of the devices. In one embodiment of the present invention, the order of the devices in the group may be defined by the order in which each device is joined to the group through the motion gesture. For example, the device that is tapped first, or that is the starting point for a continuous joining gesture, becomes the first or “dominant” device in the group. In yet another embodiment, in which the devices are able to track each other's relative position (e.g., the devices form a circle), the joining gesture may be started by a user (e.g., by tapping on three adjacent devices in a clockwise direction) and the rest of the devices and their order in the group may be determined automatically (e.g., by adding each adjacent device to the group following the clockwise order). Once joined, two or more mobile terminals may cooperatively perform actions or execute programs to enhance a user experience. The methods of cooperation may differ depending upon the application or functions being performed by the mobile terminals. The applications may utilize the order of the devices in the group to determine which information to present or to relay between the users.
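As a non-limiting sketch of the tap-ordering behavior described above (all names are hypothetical), the order of the devices in the group, and thus the dominant device, may be derived from the timestamps of the tap events making up the joining gesture:

```python
def device_order(tap_events):
    """Given (timestamp, device_id) tap events from a joining gesture,
    return device ids ordered by tap time; the first entry corresponds
    to the first or 'dominant' device in the group."""
    return [device_id for _, device_id in sorted(tap_events)]
```

For example, taps recorded as `[(2.1, "B"), (0.4, "A"), (3.0, "C")]` would yield the order `["A", "B", "C"]`, with device "A" dominant.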
Such applications that consider the order of the devices in a group include various games, educational applications, expert review systems like medical applications, enterprise applications like auditing, and so forth.
One example of cooperation may include a media viewing application in which the displays of at least two mobile terminals are virtually joined to create a larger display as illustrated in
While
Example embodiments of the present invention are described herein with reference to a mobile terminal comprising a touch-sensitive display (e.g., a touchscreen); however, embodiments of the present invention may be configured to be operable on various types of mobile terminals with single or multi-touch displays, displays with separate touch-pad user-interfaces, or other display types.
Embodiments of the present invention may comprise at least two fundamental operations. A first operation includes a mobile terminal being joined with at least one other mobile terminal to form a multi-device environment. The multi-device environment may be supported, for example, by a near-field communications protocol such as Bluetooth™. Once joined, the mobile terminals of the multi-device environment may be configured to control the level of information detail depicted on each of the mobile terminals in the multi-device environment. The second operation includes enabling functionality of at least one of the mobile terminals to control the information detail of at least one of the mobile terminals in the multi-device environment. A first mobile terminal of the mobile terminals of the multi-device environment may control the information detail level for the first mobile terminal and the first mobile terminal may also control the information detail level of each of the remaining mobile terminals in the multi-device environment.
An example embodiment of the present invention is illustrated in
Another example embodiment of the present invention is illustrated in
Example embodiments of the present invention may include a dominant mobile terminal which controls the images presented on each of the mobile terminals in the multi-device environment. The dominant mobile terminal may be determined at the time the multi-device environment is created. For example, when the multi-device environment is created through contact of the mobile terminals or through the pairing of mobile terminals, the first mobile terminal to initiate a join event with another mobile terminal may be considered the “dominant” mobile terminal and may then be the mobile terminal used to control the information detail depicted on the displays of each of the other mobile terminals. Alternatively, the dominant mobile terminal may be whichever mobile terminal in a multi-device environment experiences a stimulus that causes a change in the images presented on the displays of the other mobile terminals, such as any mobile terminal which is moved from its location within the multi-device environment. In an example embodiment where more than one mobile terminal is moved relative to the other mobile terminals in a multi-device environment, the first mobile terminal moved may remain the dominant mobile terminal or, optionally, the most recently moved mobile terminal may become the dominant mobile terminal. Each of these methods for determining the dominant mobile terminal in a multi-device environment may be user configurable by the mobile terminals in such a multi-device environment or the mobile terminals within a multi-device environment may be governed by a set of rules generated for a multi-device environment based upon the application used in the multi-device environment. 
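The alternative policies described above for determining the dominant mobile terminal may be sketched, purely for illustration (policy names and the event format are hypothetical assumptions), as a selection over motion-detection events:

```python
def select_dominant(move_events, policy="first_moved"):
    """Select the dominant device from (timestamp, device_id) motion
    events under one of two hypothetical policies: the first terminal
    moved remains dominant, or the most recently moved becomes dominant."""
    if not move_events:
        return None
    ordered = sorted(move_events)
    if policy == "first_moved":
        return ordered[0][1]
    if policy == "last_moved":
        return ordered[-1][1]
    raise ValueError("unknown policy: %s" % policy)
```

Which policy applies could itself be user-configurable or dictated by application-specific rules, as noted above.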
For example, an image display application, when used in a multi-device environment, may include few, simple rules for determining the dominant mobile terminal, while a multi-device environment operating a spreadsheet program may have more complex rules requiring a single dominant mobile terminal to properly perform the spreadsheet application in the multi-device environment.
The joining of mobile terminals in a multi-device environment can be accomplished in a number of possible ways. Example embodiments of joining devices may include embodiments in which mobile terminals are physically “bumped” together, where the “bump” is detected by, for example, microphones or accelerometers. Other methods for joining mobile terminals may include a pinch gesture across the displays of multiple mobile terminals. Further example embodiments may detect mobile terminals to be joined by RFID readers and tags, or by infrared transmitters and receivers attached to the edges of a mobile terminal, for example. Optionally, more generic position tracking technologies may be used, such as, for example, ultrasound or radio technologies.
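As an illustrative sketch of the accelerometer-based variant (the threshold value and names are hypothetical assumptions, not taken from the description), a “bump” may be detected as a spike in accelerometer magnitude exceeding a threshold:

```python
def detect_bump(accel_magnitudes, threshold=15.0):
    """Return the index of the first accelerometer magnitude sample
    (in m/s^2) exceeding a hypothetical threshold, treated as a crude
    'bump' join trigger; return None if no bump is detected."""
    for i, magnitude in enumerate(accel_magnitudes):
        if abs(magnitude) > threshold:
            return i
    return None
```

In practice the two terminals would also correlate the timing of their respective spikes before initiating a join event; that correlation step is omitted here for brevity.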
Determining the spatial arrangement of multiple mobile terminals in a multi-device environment may be accomplished via interpretation of a gesture or a touch of the display of a mobile terminal. For example, a continuous circle gesture performed across the displays of multiple mobile terminals may indicate the physical arrangement of the mobile terminals relative to one another and may further indicate the “dominant” mobile terminal based upon the starting location of the gesture. The motion of the gesture may connect the displays of the mobile terminal in the multi-device environment, set the physical arrangement of the mobile terminals relative to one another, and set the order of the mobile terminals in applications requiring turn-based access to content items (e.g., providing a hierarchy).
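The continuous-gesture behavior described above may be sketched as follows (hypothetical names; this assumes each display reports its device identifier as the gesture crosses it): the order in which distinct devices are first touched yields both the spatial arrangement and the hierarchy, with the first device being dominant:

```python
def arrangement_from_gesture(touch_trace):
    """touch_trace: sequence of device ids reported as a continuous
    gesture crosses display boundaries, e.g. ["A", "A", "B", "C"].
    Returns devices in the order first touched; index 0 is the
    'dominant' device, and the sequence sets turn-based order."""
    order = []
    for device_id in touch_trace:
        if device_id not in order:
            order.append(device_id)
    return order
```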
While the above example embodiments have been described with respect to a multi-device environment, further example embodiments of the present invention may be used with a single mobile terminal. For example, a mobile terminal may be on a surface or held by a user while presenting an image on the display of the mobile terminal. In response to the mobile terminal being moved, for example, in an upward direction, the image presented on the mobile terminal may become zoomed-in. Further, the panning operation described above with respect to
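The single-terminal zoom behavior may be sketched as a mapping from the estimated upward displacement of the terminal to a display scale factor (the gain and limit values below are hypothetical assumptions for illustration): lifting the terminal zooms in, and returning it to the first location (zero displacement) restores the first image at its original scale.

```python
def zoom_for_lift(delta_height_cm, base_scale=1.0, gain=0.25, max_scale=4.0):
    """Map upward motion of the terminal (cm of displacement, e.g. from
    an accelerometer-integrated estimate) to a display scale factor.
    Zero or downward displacement restores the base scale; the scale
    is clamped so the zoomed image remains usable."""
    scale = base_scale + gain * max(delta_height_cm, 0.0)
    return min(scale, max_scale)
```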
Accordingly, blocks of the flowchart support combinations of means for performing the specified functions, combinations of operations for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowcharts, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
An example embodiment of a method of the present invention in which a device may control information detail in a multi-device environment is depicted in the flowchart of
Another example embodiment of the present invention, providing a simple and intuitive method for combining the displays of multiple mobile terminals in a multi-device environment and for indicating the spatial arrangement of the mobile terminals relative to one another, is depicted in the flowchart of
In an example embodiment, an apparatus for performing the methods of
As described above and as will be appreciated by one skilled in the art, embodiments of the present invention may be configured as a system, method or electronic device. Accordingly, embodiments of the present invention may be comprised of various means including entirely hardware or any combination of software and hardware. Furthermore, embodiments of the present invention may take the form of a computer program product on a computer-readable storage medium having computer-readable program instructions (e.g., computer software) embodied in the storage medium. Any suitable computer-readable storage medium may be utilized including hard disks, CD-ROMs, optical storage devices, or magnetic storage devices.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.