This application was originally filed as PCT Application No. PCT/US2012/071628, filed Dec. 26, 2012, and claims priority to U.S. application Ser. No. 13/445,467, filed Apr. 12, 2012, which in turn claims priority to International Patent Application No. PCT/EP2011/074182, filed Dec. 28, 2011.
The present disclosure relates to the field of user interfaces, associated methods, computer programs and apparatus. Certain disclosed aspects/embodiments relate to portable electronic devices, for example so-called hand-portable electronic devices which may be hand-held in use (although they may be placed in a cradle in use) and which may include cameras. Such hand-portable electronic devices include so-called Personal Digital Assistants (PDAs), cameras, mobile phones, smartphones and tablet PCs.
Photography has long been popular, but has become particularly so now that digital cameras permit images to be captured and viewed without the time and cost needed to provide and process celluloid film.
Digital cameras can be manufactured cheaply and made small in size, and they are now included in many diverse types of electronic equipment, for example in mobile telephones, PDAs, personal computers, television sets (e.g. so-called ‘smart TVs’) and children's toys. However, not all electronic devices contain a camera, and even when a camera is present the specifications and therefore the images that can be captured vary hugely. For example, a high-end Digital Single Lens Reflex (DSLR) camera is typically very large, very heavy, and very expensive, whilst the small camera embedded in a mobile telephone is typically very small, very light and comparatively cheap to produce. However, the physical and functional limitations of the phone camera commonly result in it producing images that are highly inferior in quality to similar scenes captured by the DSLR camera.
In a first example there is provided an apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to perform at least the following: receive a user input; and based on the user input, provide an open instance of a first camera control application on a first apparatus as a corresponding open instance of an at least related camera control application on a second apparatus.
In a second example there is provided a method comprising: receiving a user input; and based on the user input, providing an open instance of a first camera control application on a first apparatus as a corresponding open instance of an at least related camera control application on a second apparatus.
In a third example there is provided a computer program configured to provide computer program code for at least the following: receiving a user input; and based on the user input, providing an open instance of a first camera control application on a first apparatus as a corresponding open instance of an at least related camera control application on a second apparatus.
In a fourth example there is provided an apparatus, the apparatus comprising: means for receiving a user input; and means for providing, based on the user input, an open instance of a first camera control application on a first apparatus as a corresponding open instance of an at least related camera control application on a second apparatus.
The above summary is intended to be merely exemplary and non-limiting.
A description is now given, by way of example only, with reference to the accompanying drawings, in which:
Electronic apparatuses/devices are often configured to run one or more applications. The same application may be run on different apparatus, possibly at the same time. For example, an office may contain several desktop computers which may all run the same word processing application at the same time. The application may be stored on a remote server and accessed by the different apparatus/devices, or the application may be locally stored for use on each individual desktop computer.
In another case, there may be equivalent applications which have substantially the same functionality, and which may also run on different apparatus/devices. Such equivalent applications may, for example, be the word processing applications Microsoft Word, WordPerfect, and Open Office Writer. These three applications can be considered to be equivalent in the sense that they provide a user with substantially the same functionality, but they are in fact different word processing applications.
Further, other applications may share some common functionality while each also has functionality not in common with the other. For example, a spreadsheet application and a data plotting application may be considered to share common functionality (for example, being able to organise data in a tabular/cellular format) while each also has its own functionality which is not shared. For example, the spreadsheet application may allow more advanced text-based reports to be generated, which the data plotting application may not allow, whereas the data plotting application may allow a user to plot data in a more advanced way, for example using mathematical functions and statistical analysis, which the spreadsheet application may not allow.
One example of an application that may be run on an apparatus/device is a camera application that is used to control a physical camera. Such control may include causing the camera to capture an image, or changing the camera's settings (for example its focus, aperture, ISO (i.e. film speed), etc.). Where the camera is movable (for example where it is mounted on servos), the control may include moving the camera. The camera control application may also, or alternatively, cause information to be presented to the user concerning the camera's current state: for example a viewfinder view, or one or more current settings of the camera.
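The control functions described above can be illustrated with a minimal sketch. This is purely an illustrative assumption for exposition: the class and method names (`CameraController`, `set_setting`, `capture_image`) are not defined anywhere in this disclosure.

```python
# Hypothetical sketch of a camera control interface; all names here are
# illustrative assumptions, not an API defined in this disclosure.

class CameraController:
    """Controls a physical camera: capture and settings changes."""

    def __init__(self):
        # Current camera settings, e.g. focus, aperture, ISO (film speed).
        self.settings = {"focus": "auto", "aperture": 2.8, "iso": 200}
        self.captured = []

    def set_setting(self, name, value):
        """Change one of the camera's settings."""
        if name not in self.settings:
            raise KeyError("unknown setting: %s" % name)
        self.settings[name] = value

    def capture_image(self):
        # On a real device this would trigger the sensor; here we simply
        # record the settings in force at capture time.
        self.captured.append(dict(self.settings))
        return len(self.captured) - 1  # index of the captured image

cam = CameraController()
cam.set_setting("iso", 400)
idx = cam.capture_image()
```

A movable camera could expose an analogous method (e.g. a pan/tilt command) alongside these, and a viewfinder view would be read back through a similar query interface.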
In the case where the application is a camera control application, there may be circumstances in which it is desirable that it is run on a particular apparatus. For example, it may be beneficial to run the application on an apparatus that comprises the camera itself. Not only does this simplify the relationship between the application and the camera in the user's mind (he is controlling the camera locally), but since the image captured by the camera will normally be dependent upon its position (and therefore the position, in this example, of the apparatus comprising the camera), the user can easily access a user interface presented by the camera control application whilst he is holding the camera in order to appropriately position it.
Conversely, it may sometimes be beneficial that the camera control application runs on a device other than the camera.
An example of a use case in which it may be desirable to run the camera control application on a device other than the camera being controlled is when the user is for some reason unable to immediately access the camera. For example, the user may be composing a photograph in which he is the subject (or one of the subjects)—in this event it would be beneficial for the user to control the camera whilst he is in the correct position for the photograph to be composed (e.g. when he is some distance in front of the camera and probably unable to reach the camera or to easily see its user interface). Alternatively, the user may not be a subject of the composition, but for other reasons may not be able to reach his camera. Examples of this latter scenario would be where the camera has been out of reach of the user in order to capture a better view of a scene (for example when it is mounted on a camera rig and raised too high for the user to reach), or where the camera has been introduced into an environment that is unsafe or undesirable for the user to enter (e.g. underwater, in proximity to a dangerous or easily scared animal, or close to a poisonous or otherwise dangerous object).
Another example use case in which it may be desirable to run the camera control application on a device other than the camera is when the user may have ready access to the camera but prefer not to directly interact with it. For example, the user may wish to avoid disturbing the camera by touching it to instruct it to capture an image (e.g. by depressing a shutter button) because the vibration that he introduces to the camera when touching it may adversely affect the quality of the captured image.
As another example, the user may simply prefer the user interface offered by a device other than the camera. For example, the user may prefer to view the camera's viewfinder view on a device that has a larger or otherwise preferable display than that of the camera, or he may prefer to interact with the camera using a device that has controls that are more accurate or more ergonomic than those provided by the camera, or that are simply not provided by the camera. As an example of the latter case, where the camera is configured to be movable the user may wish to pan the camera back and forth smoothly—a device providing an analogue joystick or other control well suited for this purpose may enable the user to achieve a smoother pan than the controls present on the camera itself (digital push buttons, for example).
It may also provide a benefit to a user of multiple devices if there was a relatively easy and intuitive way to transfer an open instance of an application (which can perform at least one task) such as a camera application from one device/application to another device/application. For instance, closing the first application on the first device, and then manually opening a second, equivalent, application on a second device, may be considered to be cumbersome for a user. If the user were able to, for example, perform a gesture on the touch sensitive screen of one device to transfer the open instance of the application to the second device and then recommence working on the same task/application in the second application/device, then the user may experience a smoother and simpler way of transferring tasks performed using applications between devices.
The input I allows for receipt of signalling to the apparatus 100 from further components. The output O allows for onward provision of signalling from the apparatus 100 to further components. In this example embodiment the input I and output O may be part of a connection bus that allows for connection of the apparatus 100 to further components. The processor 110 may be a general purpose processor dedicated to executing/processing information received via the input I in accordance with instructions stored in the form of computer program code on the memory 120. The output signalling generated by such operations from the processor 110 is provided onwards to further components via the output O.
The memory 120 (not necessarily a single memory unit) is a computer readable medium (such as solid state memory, a hard drive, ROM, RAM, Flash or other memory) that stores computer program code. This computer program code comprises instructions that are executable by the processor 110 when the program code is run on the processor 110. The internal connections between the memory 120 and the processor 110 can be understood to provide active coupling between the processor 110 and the memory 120 to allow the processor 110 to access the computer program code stored on the memory 120.
In this example embodiment the input I, output O, processor 110 and memory 120 are electrically connected internally to allow for communication between the respective components I, O, 110, 120, which in this example are located proximate to one another as an ASIC. In this way the components I, O, 110, 120 may be integrated in a single chip/circuit for installation in an electronic device. In other example embodiments, one or more or all of the components may be located separately (for example, throughout a portable electronic device such as devices 200, 300, or through a “cloud”), and/or may provide/support other functionality.
One or more examples of the apparatus 100 can be used as a component for another apparatus as in
The example apparatus/device 200 comprises a display 240 such as a Liquid Crystal Display (LCD), e-Ink display, or touch-screen user interface. The device 200 is configured such that it may receive, include, and/or otherwise access data. For example, device 200 comprises a communications unit 250 (such as a receiver, transmitter, and/or transceiver), in communication with an antenna 260 for connection to a wireless network and/or a port (not shown). Device 200 comprises a memory 220 for storing data, which may be received via the antenna 260 or the user interface 230. The processor 210 may receive data from the user interface 230, from the memory 220, or from the communication unit 250. Data may be output to a user of device 200 via the display device 240, and/or any other output devices provided with the apparatus. The processor 210 may also store the data for later use in the memory 220. The device contains components connected via a communications bus 280.
The communications unit 250 can be, for example, a receiver, transmitter, and/or transceiver, that is in communication with an antenna 260 for connecting to a wireless network and/or a port (not shown) for accepting a physical connection to a network, such that data may be received via one or more types of network. The communications (or data) bus 280 may provide active coupling between the processor 210 and the memory (or storage medium) 220 to allow the processor 210 to access the computer program code stored on the memory 220.
The memory 220 comprises computer program code in the same way as the memory 120 of apparatus 100, but may also comprise other data. The processor 210 may receive data from the user interface 230, from the memory 220, or from the communication unit 250. Regardless of the origin of the data, these data may be output to a user of device 200 via the display device 240, and/or any other output devices provided with the apparatus. The processor 210 may also store the data for later use in the memory 220.
Device/apparatus 300 shown in
The apparatus 100 in
The storage medium 390 is configured to store computer code configured to perform, control or enable the operation of the apparatus 100. The storage medium 390 may be configured to store settings for the other device components. The processor 385 may access the storage medium 390 to retrieve the component settings in order to manage the operation of the other device components. The storage medium 390 may be a temporary storage medium such as a volatile random access memory. The storage medium 390 may also be a permanent storage medium such as a hard disk drive, a flash memory, or a non-volatile random access memory. The storage medium 390 could be composed of different combinations of the same or different memory types.
Each of the apparatuses 400, 410, 420 is capable of running at least one application that controls the camera 420. These applications may be identical copies of the same application, or they may be different applications that have at least some camera control functionality in common. Each of the apparatuses 400, 410, 420 may be running an operating system that is the same as, or different from, the operating systems of the other apparatuses, and an equivalent application could be run on each device in either case.
Camera control applications may be considered “at least related” to each other on the basis that they are either equivalent (i.e. they provide substantially similar functionality), or identical. The two applications may be identical in the sense that they are different instances of the same application, or may be identical in the sense that they are the same instance of an application, with that single instance being transferable between apparatuses.
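The "at least related" test described above can be sketched as a comparison of the function sets the two applications provide. This is a hedged illustration only: the function name `at_least_related` and the choice of which functions count as the required common core are assumptions, not part of the disclosure.

```python
# Illustrative sketch (assumed names): two camera control applications are
# treated as "at least related" if they are identical, or if they share a
# common core of functionality (here assumed to be viewfinder + capture).

REQUIRED_CORE = frozenset({"viewfinder", "capture"})  # assumed core

def at_least_related(funcs_a, funcs_b, required=REQUIRED_CORE):
    """funcs_a / funcs_b: sets naming the functions each application provides."""
    if funcs_a == funcs_b:          # identical feature sets
        return True
    common = funcs_a & funcs_b      # shared functionality
    return required <= common       # equivalent if the core is shared

app_400 = {"viewfinder", "capture", "flash_toggle"}
app_410 = {"viewfinder", "capture"}
related = at_least_related(app_400, app_410)  # core functions are shared
```

Under this sketch, the applications of the later example (one with a flash toggle, one without) still count as equivalent, because both provide the assumed core.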
Consider a further example. In this example apparatuses 400 and 410 can each access and run a different camera control application. The camera control application runnable by apparatus 400 includes functions for displaying a viewfinder view of the camera, causing the camera 420 to capture an image, and switching the camera flash on and off. The camera control application runnable by apparatus 410 includes equivalent functions for displaying a viewfinder view and causing the camera 420 to capture an image, but it does not include a function for switching the flash on and off. Apparatus 400 is in this case capable of running an application with a common level of functionality to that provided by the application runnable on the apparatus 410. The common level of functionality in this example is that a viewfinder view may be presented and the camera may be caused to capture an image. The common functionality may also be considered to include menu options 408, 468 that provide the same functions. For example, the application 462 may also allow for the functionality to change camera settings. The user interface may be different for the two applications without affecting the fact that they provide a common level of underlying camera control functionality; they therefore provide substantially similar functionality, making them equivalent. It will be appreciated that there are many other ways in which the appearance may differ between two open instances of two applications, such as different colour schemes, different fonts, different display sizes and form factors, different styles of buttons/icons/menu systems, and many other possible factors.
The apparatuses 400 and 410 in
Where a remote server is present, related camera control applications may be provided for use on various different apparatuses by communication with the remote server. That is, a camera control application may be available from the server, and different versions of this application may be available, each version being more suited for use on a particular apparatus. For example, a version with a relatively large functionality may be suitable for use on a laptop or desktop computer, a version with reduced functionality may be suitable for use on a mobile phone or smartphone, and a version with particular functionality suitable for use on a digital camera may also be available.
As another example, the same camera control application may be provided for use on various different apparatuses by communication with the remote server. Thus, applications are not necessarily located locally on the apparatuses that will run them; an apparatus may need to communicate with the remote server in order to run the application. In certain embodiments, the application can be considered to be distributed between the apparatus and the server, with the parts of the application on the respective device/server needing to be run together to perform the tasks provided by the application.
The server may provide the same, or related, applications for use by allowing the application to be downloaded from the server, so that after the download the apparatus can run the stored application as a stand-alone device, without further communication with the server. The apparatus may also be able to communicate further with the server, for example to download and store an updated version of the application. ‘Provided for use’ may also be taken to mean that the application is stored and running on the remote server, and that the apparatuses provided with that application for use are using that application which is running on the server. In this case, a user of the device may not be able to use the application if communication with the server is interrupted or terminated. In other examples, there may be more than one remote server, with which one or more devices is/are in communication. To say that an application is “running” on an apparatus may therefore mean that it is running on a remote server but accessible to the apparatus.
In this example, all three apparatuses 600, 610, 620 are in communication with one another (for example they are all attached to a common network). Not all three apparatuses need be present or even exist, for example only apparatuses 600 and 620 might be present in the absence of apparatus 610. Similarly, other apparatus (not illustrated) may be present and also in communication with the apparatuses shown.
In
There may be transient data associated with the camera control application running on the camera 620: for example, the current zoom level, the current flash settings, or the current position/status of navigation through a menu system provided by the camera control application. Were the camera control application on the camera 620 to be closed (e.g. by turning the camera off, or by otherwise exiting the application) then such data would not be preserved until the application was next loaded, and it can therefore be considered to be transient content. Transient content may be thought of as data which is not normally saved upon an application being closed; that is, transient content would not normally persist between instances of an application being opened. In contrast, the data that would persist (e.g. images that have been captured and stored to the camera's memory) can be considered to be data content (as opposed to transient content).
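The transient/persistent distinction described above can be sketched as follows. The split into two dictionaries and the class name are assumptions made for illustration; the disclosure does not prescribe any particular data layout.

```python
# Minimal sketch (assumed names) of transient content versus data content:
# closing the application discards the former and preserves the latter.

class CameraAppState:
    def __init__(self):
        # Transient content: lost when the application is closed.
        self.transient = {"zoom": 1.0, "flash": "auto", "menu_position": 0}
        # Data content: persists between instances of the application.
        self.persistent = {"stored_images": []}

    def close(self):
        """Closing the application discards transient content only."""
        self.transient = {}
        return self.persistent

state = CameraAppState()
state.transient["zoom"] = 3.0                        # e.g. user zooms in
state.persistent["stored_images"].append("img_0001.jpg")  # captured image
kept = state.close()   # only the stored image survives the close
```

On the next open, the fresh instance would start from default transient values but still present the persisted images, matching the behaviour the passage describes.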
In the example shown in
The interface displayed by the camera's 620 camera control application may also show other features such as previously captured images. These images form data content. Data content may be considered persistent, in that it will always be available to a user upon the user opening the camera control application. The camera control application itself may also display elements such as menus and buttons/icons and such elements would not be considered to be data content. Each time the camera control application is opened, the images, menus, buttons and text will be presented and/or available to the user regardless of any transient content such as the current view through the viewfinder or non-persistent settings. Also, different camera control applications may have different menus and/or buttons/icons and may still allow the user to view the same data content, e.g. the same previously captured images.
Other examples of data content include text (such as text saved in a word processing document, e-mail, database, text-based message or contacts list); numbers (such as entries made in a spreadsheet, database or data package, or contacts list); fields (such as entries made in a database, or text entered into a field in a website); cells (such as spreadsheet entries); image content (such as photographs, maps, drawings, presentation slide content, and images in games); audio content (such as music, songs, voice recordings and game soundtracks); video content (such as video cuts in games, and movies); and webpage content (text, images and hyperlinks). It will be appreciated that this list of examples is not exhaustive. It will also be appreciated that, upon opening a previously closed application, such data content is usually presented and/or available to the user.
In the example shown in
At this point the tablet 600 and mobile phone 610 do not necessarily show any open applications on their respective displays 605, 615. It may be that there is no camera control application currently running on these devices. Alternatively, there may be a camera control application running, but it is ‘minimised’ or hidden so that although it is running, it is not currently displayed. Each of the displays 605, 615 may be deactivated, or may be displaying content such as, for example, a screensaver or homepage. However, each of the apparatuses 600 and 610 has access to and can run camera control applications that are at least related to the camera control application of apparatus 620, in that they may be the same application (for example, all three apparatuses 600, 610, 620 may have the same camera control application) or they may have camera control applications with a common level of functionality (for example, the camera control application of apparatus 600 may be one particular camera control application, whereas device 610 may have a different camera control application and apparatus 620 may have a plurality of other camera control applications, these applications all having a common level of functionality).
In this example, the camera control applications are provided for use on apparatuses 600, 610, 620 by accessing respective memories located with each apparatus. For example, a first camera control application is located in a memory (not shown) of the tablet 600, a second camera control application is located in a memory (not shown) of the phone 610, and a third camera control application is located in a memory (not shown) of the camera 620.
In
Based on this user input, as shown in
In this example shown in
It can also be seen that the difference in form factor is accounted for in this example. The form factors of the displays of the camera 620 and tablet 600 are different, and so the form factor of the displayed content is altered accordingly so that the open instance of the camera control application displayed on the camera 620 displays the same viewfinder view as the open instance of the camera control application displayed on the tablet 600. Note also that in this example the tablet lacks a physical shutter key such as the shutter key 626 present on the camera 620. The UI of the camera control application on the tablet 600 therefore includes a virtual shutter button 606 that was not present in the UI of the camera control application on the camera 620.
It will be appreciated that in other examples, the form factor of the displayed open application may not be preserved upon displaying another open instance of a user application, and the form factor may be chosen to best suit the data being displayed. For example, video shown in widescreen display on a landscape-oriented monitor of a laptop may maintain the form factor of the widescreen if subsequently opened on a mobile telephone with a portrait-oriented screen; the other regions of the mobile phone screen not showing the video may be black or contain other options or text. Rotating the mobile phone so that the display is landscape oriented may cause the movie to be rotated correspondingly and displayed in landscape on the landscape oriented display of the mobile phone.
The two apparatuses 600 and 620 not only have different form factors, but also have different display sizes. The open instance of the second application on the second apparatus is resized in comparison with the open instance of the first application on the first apparatus, to accommodate the larger screen of the second apparatus with respect to the first apparatus. The two apparatuses 600, 620 may also have different display resolutions and/or different display colour schemes (for example, one apparatus may have a black and white screen, and another apparatus may have a colour screen).
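One way the resizing between displays of different sizes and form factors could be handled is to scale the source view to fit the target display while preserving the source aspect ratio, leaving any remaining screen area for other content. This is only a sketch under that assumption; the function name and pixel dimensions are illustrative.

```python
# Sketch (assumed names and dimensions): scale a view from a source
# display to a target display, preserving the source aspect ratio.

def fit_to_display(src_w, src_h, dst_w, dst_h):
    """Return (width, height) of the source view scaled to fit the target."""
    # Use the smaller scale factor so the whole view fits on the target.
    scale = min(dst_w / src_w, dst_h / src_h)
    return int(src_w * scale), int(src_h * scale)

# e.g. a 320x240 camera viewfinder shown on a 1280x800 tablet display:
# the height limits the scale, so the view becomes 1066x800.
fit_to_display(320, 240, 1280, 800)
```

The later passage about video on a portrait phone screen corresponds to the same calculation with the unused screen area blacked out or filled with other options or text.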
The apparent transition of the camera control application being progressively removed from display on the camera 620 and progressively displayed on the tablet 600 may be dependent on a speed-dependent characteristic of the user input. It may be envisaged that if the user makes a faster slide gesture, the apparent transition would be relatively faster. Similarly, if the user were to make a relatively slow slide gesture, then the apparent transition may be relatively slower. Also, for example, the user may be able to begin to make a slide gesture, then (say, if they changed their mind about wanting to make the transition) they could reverse their input and move the open instance of the application back on to the original apparatus.
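A speed-dependent, reversible transition of this kind can be sketched by having the transition fraction track the slide gesture directly, so that a faster slide advances it faster and a reversed slide undoes it. The function name and the pixel-based convention are assumptions for illustration.

```python
# Sketch (assumed names): the transition fraction [0, 1] tracks the slide
# gesture, so gesture speed sets transition speed and a reversed slide
# moves the open instance back toward the original apparatus.

def transition_progress(progress, slide_delta_px, display_width_px):
    """Update the transition fraction from a slide movement in pixels."""
    progress += slide_delta_px / display_width_px  # negative delta reverses
    return max(0.0, min(1.0, progress))            # clamp to [0, 1]

p = 0.0
p = transition_progress(p, 240, 480)   # slide halfway across the display
p = transition_progress(p, -240, 480)  # change of mind: reverse the slide
```

At a fraction of 1.0 the provision to the second apparatus would be completed; at 0.0 the application remains wholly on the first apparatus.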
The provision of open instances of user applications is not limited to being between two apparatuses only.
In the example embodiment shown in
For example, it may be envisaged that while the user can make inputs and manipulate the open camera control application on the phone 610, he or she may also be able to make inputs and manipulate the open application on the camera 620. Just because the open instance has been provided on a further apparatus does not necessarily exclude the ability of the open application to still be running on the initial apparatus. Any inputs made on the old apparatus may be received by the open instance of the camera control application running there; however, on-going synchronisation of the transient content of the applications on the apparatuses may mean that the effect of such inputs at the old apparatus is propagated into the application running on the new apparatus. In such cases one of the apparatuses, or a remote server, can act as a controller to control the synchronisation between the multiple apparatuses, or more than one of the apparatuses can act in cooperation to control the synchronisation.
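The controller-based synchronisation described above can be sketched as one party (an apparatus or a remote server) fanning out each transient-content change to every participating apparatus. All of the names and the dictionary-based state model here are assumptions for illustration.

```python
# Sketch (assumed names): a controller propagates transient-content
# updates so that an input made on either apparatus is reflected in the
# open instances on all apparatuses.

class SyncController:
    def __init__(self):
        self.replicas = {}  # apparatus id -> transient state dict

    def register(self, apparatus_id, state):
        self.replicas[apparatus_id] = state

    def update(self, source_id, key, value):
        """An input on any apparatus is propagated to every replica."""
        for state in self.replicas.values():
            state[key] = value

camera_state = {"zoom": 1.0}
phone_state = {"zoom": 1.0}
ctrl = SyncController()
ctrl.register("camera_620", camera_state)
ctrl.register("phone_610", phone_state)
ctrl.update("camera_620", "zoom", 2.0)  # input at the old apparatus
# on-going synchronisation: the new apparatus now also shows zoom 2.0
```

Where more than one apparatus cooperates to control the synchronisation, each could run such a controller and exchange updates, with some policy (not shown) resolving conflicting simultaneous inputs.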
In the above example, the gestures can be considered to be push gestures, in that the corresponding open instance is seen to be pushed onto a new apparatus from the apparatus which receives the gesture input. Of course, in other examples, the (gesture) input could be a pull gesture input from a new apparatus to cause the corresponding open instance to be provided to the new apparatus from an old apparatus on which it is currently running.
It may be the case that two apparatuses are configured to form a (e.g. private) sharing network to allow for the provision of corresponding open instances of an at least related application as described herein.
As previously described in relation to
The tablet 700 has access to and can run a camera control application that is at least related to that running on the camera 720, in that they may be the same application (for example, both apparatuses 700, 720 may have access to the same camera control application) or they may have camera control applications with a common level of functionality (as described previously). It may be imagined that an apparatus such as tablet 700 may have more than one suitable camera control application available for use. All these applications will have a common level of functionality in that they can be used for controlling camera 720. In this example, the camera control applications are provided for use on apparatuses 700, 720 by accessing the respective memories located in each apparatus. It may be in other examples that one or more apparatus accesses the required software from a remote server or cloud as has previously been described.
In
Based on this user input, as shown in
In this example shown in
In the example illustrated in
In the examples presented so far, the instance of the camera control application has been transferred from a camera to a non-camera apparatus. However, this is purely by way of example. The control application may be instead (or subsequently) transferred between apparatuses that are both cameras, or of which neither are cameras.
The effects provided due to the user slide inputs 856, 814 made in
In other examples, the user may be required to press and hold for a predetermined period of time before sliding to indicate to the apparatus what the intended effect of the input is (to provide an open instance of an application on a second apparatus). Other possible inputs include, for example, a tap-and-slide, press and slide, a flick, press and flick, a multiple tap and slide or a multi finger slide. Another example may be that an apparatus 800 comprising an accelerometer and/or gyroscope may allow the user to make such an input by tilting the apparatus from left to right (or in another direction). This tilting action may be performed after a prior input priming the apparatus that the user wishes to open a corresponding application on another apparatus. Other inputs are possible as will be appreciated.
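The press-and-hold-then-slide discrimination mentioned above can be sketched as a simple classifier over the hold duration and slide distance. The threshold value and all names are assumptions; the disclosure does not fix any predetermined period.

```python
# Sketch (assumed names and threshold): distinguish a transfer gesture
# (press and hold for a predetermined period, then slide) from an
# ordinary slide or tap.

HOLD_THRESHOLD_S = 0.5  # assumed predetermined hold period, in seconds

def classify_input(hold_duration_s, slide_distance_px):
    """Return 'transfer', 'slide' or 'tap' for a touch input."""
    if slide_distance_px > 0:
        if hold_duration_s >= HOLD_THRESHOLD_S:
            return "transfer"  # provide open instance on second apparatus
        return "slide"         # ordinary in-application slide
    return "tap"

classify_input(0.7, 120)  # held then slid: treated as a transfer gesture
```

The other inputs listed (tap-and-slide, flick, multi-finger slide, tilt after a priming input) could be added as further branches of the same classifier.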
It will be appreciated that some form of (e.g. private) sharing network needs to be available between the apparatuses. This can be done prior to receipt of the user input to provide the corresponding open instance, or can be initiated upon the determination that such an input has been received, e.g. after detecting the particular user input, the apparatus which has received the user input may look for apparatuses with which it can form a network to allow for the provision of the corresponding open instance. Such networks may be preconfigured, or the apparatus may ask the user to identify/confirm the apparatus which will provide the corresponding open instance of the at least related application.
The source (first) apparatus and/or the recipient (second) apparatus may be identified by the push/pull gesture input made by a user. Other ways of identifying the apparatuses involved in providing open instances of applications include that the identity of each apparatus has been previously defined, whether by a user changing a setting in a menu or by the manufacturer pre-defining the identity. As another example, nearby apparatuses may be able to determine their relative distances from each other, and the apparatus determined to be the closest to the apparatus being interacted with by a user is the one which is involved in providing a (corresponding) open instance of an application. As a further example, nearby apparatuses may be able to determine their relative locations from each other, and the direction of the user input made may indicate the other apparatus to be used (e.g. a swipe made in the direction of the recipient apparatus). As a further example, in the case of a pull (slide) user input, all the available apparatuses except for the source apparatus (on which the input is being made) may be in a standby state, waiting for a user input instructing them to open a corresponding instance of an at least related application (by an open application being pulled towards that apparatus).
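The proximity and direction rules above may be sketched as follows, assuming each nearby apparatus reports an (x, y) position relative to the source apparatus; the function and peer names are illustrative assumptions, and the direction rule is implemented here as a cosine-similarity match between the swipe vector and each peer's bearing:

```python
import math

def pick_recipient(peers, swipe_vector=None):
    """Identify the recipient apparatus from relative positions.

    peers: maps an apparatus id to its (x, y) position relative to the
    source apparatus. With no swipe direction, the closest peer is chosen;
    with a swipe, the peer whose bearing best matches the swipe direction
    is chosen. Positions would come from the inter-apparatus ranging
    described above; this sketch only shows the selection rule.
    """
    if not peers:
        return None
    if swipe_vector is None:
        # Proximity rule: the closest apparatus provides the open instance.
        return min(peers, key=lambda p: math.hypot(*peers[p]))
    sx, sy = swipe_vector
    norm = math.hypot(sx, sy)

    def alignment(peer):
        x, y = peers[peer]
        d = math.hypot(x, y)
        if d == 0 or norm == 0:
            return -1.0
        return (x * sx + y * sy) / (d * norm)  # cosine of angle to swipe

    return max(peers, key=alignment)
```

For example, with a television straight ahead and a tablet to the right, a forward swipe selects the television even though the tablet is nearer.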
In the examples where an open instance of a first application is provided as a corresponding open instance of an at least related application by the first application being progressively removed while the at least related application is progressively displayed, the progressions (which give the impression of a transition from one apparatus to the other) of the two applications may be matched. By being matched, it is meant that if, for example, one quarter of the first application is removed by a user input, then correspondingly one quarter of the at least related application is displayed on the second apparatus; similarly, as one third of the first application is removed, one third of the at least related application is displayed on the second apparatus. For the apparent transition to be matched in time as well, the speed of progressive removal of the open application on the first apparatus is matched by the speed of progressive display of the at least related application on the second apparatus. This progressive apparent transition can take into account any differences in form factor or size of the different displays, as previously described.
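A minimal sketch of the matching rule described above: the shared quantity is the removed/displayed fraction, which each apparatus maps onto its own display geometry, so differences in form factor are absorbed automatically. The display widths and field names are assumptions for illustration:

```python
def matched_transition(fraction_removed, src_width_px, dst_width_px):
    """Matched progression for the apparent transition between apparatuses.

    The fraction removed on the first apparatus equals the fraction
    displayed on the second; each apparatus then scales that fraction to
    its own display width. Illustrative sketch only.
    """
    f = max(0.0, min(1.0, fraction_removed))  # clamp to [0, 1]
    return {
        "fraction_displayed": f,
        "src_pixels_removed": round(f * src_width_px),
        "dst_pixels_displayed": round(f * dst_width_px),
    }
```

With, say, an 800-pixel-wide source display and a 1200-pixel-wide recipient display, a quarter removed corresponds to 200 pixels hidden on the source and 300 pixels revealed on the recipient: the same fraction, different pixel counts.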
The apparatus shown in the above embodiments may be a portable electronic apparatus, a laptop computer, a mobile phone, a Smartphone, a tablet computer, a personal digital assistant, a digital camera, a watch, a non-portable electronic apparatus, a desktop computer, a monitor, a server, or a module/circuitry for one or more of the same.
The portable electronic apparatus according to one or more disclosed aspects/embodiments may provide one or more audio/text/video communication functions (e.g. tele-communication, video-communication, and/or text transmission (Short Message Service (SMS)/Multimedia Message Service (MMS)/emailing) functions), interactive/non-interactive viewing functions (e.g. web-browsing, navigation, TV/program viewing functions), music recording/playing functions (e.g. MP3 or other format and/or (FM/AM) radio broadcast recording/playing), downloading/sending of data functions, image capture functions (e.g. using a (e.g. in-built) digital camera), and gaming functions.
Any mentioned device/apparatus/server and/or other features of a particular mentioned device/apparatus/server may be provided by apparatus arranged such that they become configured to carry out the desired operations only when enabled, e.g. switched on, or the like. In such cases, they may not necessarily have the appropriate software loaded into the active memory in the non-enabled (e.g. switched-off) state, and may only load the appropriate software in the enabled (e.g. switched-on) state. The apparatus may comprise hardware circuitry and/or firmware. The apparatus may comprise software loaded onto memory. Such software/computer programs may be recorded on the same memory/processor/functional units and/or on one or more memories/processors/functional units.
In some embodiments, a particular mentioned device/apparatus/server may be pre-programmed with the appropriate software to carry out desired operations, wherein the appropriate software can be enabled for use by a user downloading a “key”, for example, to unlock/enable the software and its associated functionality. Advantages associated with such embodiments can include a reduced requirement to download data when further functionality is required for an apparatus, and this can be useful in examples where an apparatus is perceived to have sufficient capacity to store such pre-programmed software for functionality that may not be enabled by a user.
Any mentioned apparatus/circuitry/elements/processor may have other functions in addition to the mentioned functions, and these functions may be performed by the same apparatus/circuitry/elements/processor. One or more disclosed aspects may encompass the electronic distribution of associated computer programs and of computer programs (which may be source/transport encoded) recorded on an appropriate carrier (e.g. memory, signal).
Any “computer” described herein can comprise a collection of one or more individual processors/processing elements that may or may not be located on the same circuit board, or the same region/position of a circuit board or even the same apparatus. In some embodiments one or more of any mentioned processors may be distributed over a plurality of apparatuses. The same or different processor/processing elements may perform one or more functions described herein.
With reference to any discussion of any mentioned computer and/or processor and memory (e.g. including ROM, CD-ROM, etc.), these may comprise a computer processor, Application Specific Integrated Circuit (ASIC), field-programmable gate array (FPGA), and/or other hardware components that have been programmed in such a way as to carry out the inventive function.
The applicant hereby discloses in isolation each individual feature described herein and any combination of two or more such features, to the extent that such features or combinations are capable of being carried out based on the present specification as a whole, in the light of the common general knowledge of a person skilled in the art, irrespective of whether such features or combinations of features solve any problems disclosed herein, and without limitation to the scope of the claims. The applicant indicates that the disclosed aspects/embodiments may consist of any such individual feature or combination of features. In view of the foregoing description it will be evident to a person skilled in the art that various modifications may be made within the scope of the disclosure.
While there have been shown and described and pointed out fundamental novel features as applied to preferred embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the apparatuses and methods described may be made by those skilled in the art without departing from the spirit of the disclosure. For example, it is expressly intended that all combinations of those elements and/or method steps which perform substantially the same function in substantially the same way to achieve the same results are within the scope of the disclosure. Moreover, it should be recognized that structures and/or elements and/or method steps shown and/or described in connection with any disclosed form or embodiments may be incorporated in any other disclosed or described or suggested form or embodiment as a general matter of design choice. Furthermore, in the claims means-plus-function clauses are intended to cover the structures described herein as performing the recited function and not only structural equivalents, but also equivalent structures.
Number | Date | Country | Kind |
---|---|---|---|
PCT/EP2011/074182 | Dec 2011 | WO | international |
1204857.5 | Mar 2012 | GB | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/US2012/071628 | 12/26/2012 | WO | 00 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2013/101813 | 7/4/2013 | WO | A |
Number | Date | Country | |
---|---|---|---|
20140375834 A1 | Dec 2014 | US |