Method, an apparatus and a computer program for controlling an output from a display of an apparatus

Information

  • Patent Grant
  • Patent Number
    9,983,729
  • Date Filed
    Monday, March 27, 2017
  • Date Issued
    Tuesday, May 29, 2018
Abstract
A method including: displaying information corresponding to a first output state; temporarily displaying information corresponding to a second output state while a user actuation is occurring; and displaying information corresponding to the first output state when the user actuation is no longer occurring.
Description
TECHNOLOGICAL FIELD

Embodiments of the present invention relate to a method, an apparatus and a computer program. In particular, they relate to a method, an apparatus and a computer program for controlling an output from a display of an apparatus.


BACKGROUND

Electronic apparatus now often have displays. However, it is not always possible for such a display to present all the information that a user may wish to view. In such circumstances, it may be necessary to define different output states that have different corresponding information and to provide the user with a way of navigating from one output state to another.


For example, in Microsoft Windows, running applications have an icon in the Windows taskbar. Selecting the icon for an application makes that application the current active application. The output state changes and a screen for the selected application is displayed in front of the screens for the other applications.


BRIEF SUMMARY

According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: displaying information corresponding to a first output state; temporarily displaying information corresponding to a second output state while a user actuation is occurring; and displaying information corresponding to the first output state when the user actuation is no longer occurring.


According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a sensor configured to respond to a user actuation by generating a sensor signal; at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: controlling a display to display information corresponding to a first output state; when detecting the sensor signal from the sensor responsive to a user actuation, temporarily controlling the display to display information corresponding to a second output state; and when no longer detecting the sensor signal from the sensor, automatically controlling the display to display again information corresponding to the first output state.


According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: sensor means for responding to a user actuation by generating a sensor signal; means for controlling a display to display information corresponding to a first output state; means for temporarily controlling the display to display information corresponding to a second output state while the sensor signal is being generated; and means for controlling the display to display information corresponding to the first output state when the sensor signal is no longer being generated.


According to various, but not necessarily all, embodiments of the invention there is provided a computer program which when loaded into a processor enables the processor to: enable displaying information corresponding to a first output state; enable temporarily displaying information corresponding to a second output state while a user actuation is occurring; and enable displaying information corresponding to the first output state when the user actuation is no longer occurring.





BRIEF DESCRIPTION

For a better understanding of various examples of embodiments of the present invention reference will now be made by way of example only to the accompanying drawings in which:



FIG. 1 illustrates a method;



FIG. 2 schematically illustrates a suitable method for controlling the output of an apparatus;



FIGS. 3A to 3C, 4A to 4C, 5A to 5C and 6A to 6D schematically illustrate different embodiments of an apparatus in which a user actuation causes a temporary physical deformation of the apparatus and the apparatus temporarily displays information corresponding to a second output state until the user deformation of the apparatus is interrupted;



FIGS. 7A and 7B schematically illustrate an embodiment of the apparatus in which the display output of the apparatus is controlled by bending the apparatus from a non-deformed configuration (FIG. 7A) to a deformed configuration (FIG. 7B);



FIGS. 8A and 8B, FIGS. 9A and 9B, FIGS. 10A, 10B and 10C, FIGS. 11A and 11B schematically illustrate different examples of suitable pairings of first output states and second output states;



FIG. 12 schematically illustrates an example of the apparatus 30; and



FIG. 13 schematically illustrates a record carrier for a computer program.





DETAILED DESCRIPTION


FIG. 1 illustrates a method 10 comprising: at block 12, displaying information corresponding to a first output state; at block 14, temporarily displaying information corresponding to a second output state while a user actuation is occurring; and at block 16, displaying information corresponding to the first output state when the user actuation is no longer occurring.
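The three blocks of method 10 amount to selecting which state's information is shown as a function of whether the actuation is active. The following is a minimal sketch in Python; the function and state names are illustrative assumptions, not part of the patent.

```python
# Sketch of method 10 (FIG. 1): the displayed state tracks the user
# actuation, reverting automatically when the actuation ends.
# All names here are hypothetical.

def select_displayed_state(first_state, second_state, actuation_active):
    """Return the output state whose information should be displayed."""
    # Block 14: while the actuation occurs, show the second state.
    # Blocks 12/16: otherwise show the first state.
    return second_state if actuation_active else first_state

# Squeeze -> peek at the second state; release -> back to the first.
shown = [select_displayed_state("clock", "inbox", active)
         for active in (False, True, False)]
print(shown)  # ['clock', 'inbox', 'clock']
```

Releasing the actuation requires no separate "go back" command; the reversion at block 16 is automatic because the displayed state is a pure function of whether the actuation is occurring.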


The method 10 may be performed at an apparatus 30.



FIGS. 3A to 3C, 4A to 4C, 5A to 5C and 6A to 6D schematically illustrate different embodiments of an apparatus 30 in which a user actuation causes a temporary physical deformation of the apparatus 30. While the physical deformation is occurring, the apparatus 30 temporarily displays information corresponding to a second output state until the user deformation of the apparatus is interrupted and then the apparatus 30 again displays information corresponding to the first output state.


In FIG. 3A, the apparatus 30 is in a non-deformed configuration. The apparatus 30 is displaying information corresponding to a first output state in a display 45 positioned in a front face 31 of the apparatus 30.


In FIG. 3B, the apparatus 30 is temporarily physically deformed by a user action. In this embodiment, a body 36 of the apparatus is squeezed. A front face 31 of the apparatus 30 is deformed inwardly and also a back face of the apparatus 30 is deformed inwardly. A sensor 34 is configured to detect the deformation of the apparatus 30. While the apparatus 30 is deformed, the sensor 34 generates a sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.


In FIG. 3C, the apparatus 30 is no longer deformed by the user and the apparatus has returned to the non-deformed configuration. The sensor 34 is no longer generating the sensor signal. The apparatus 30 is again displaying information corresponding to the first output state in the display 45.


In FIG. 4A, the apparatus 30 is in a non-deformed configuration. The apparatus 30 is displaying information corresponding to a first output state in a display 45 positioned in a front face 31 of the apparatus 30.


In FIG. 4B, the apparatus 30 is temporarily physically deformed by a user action. In this embodiment, the front face 31 of the apparatus 30 is inwardly deformed or bent. A sensor 34 is configured to detect the deformation of the front face 31 of the apparatus 30. While the apparatus 30 is deformed, the sensor 34 generates a sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.


In FIG. 4C, the apparatus 30 is no longer deformed by the user and the apparatus 30 has returned to the non-deformed configuration. The sensor 34 is no longer generating the sensor signal. The apparatus 30 is again displaying information corresponding to the first output state in the display 45.


In FIG. 5A, the apparatus 30 is in a non-deformed configuration. The apparatus 30 is displaying information corresponding to a first output state in a display 45 positioned in a front face 31 of the apparatus 30.


In FIG. 5B, the apparatus 30 is temporarily physically deformed by a user action. In this embodiment, a body 36 of the apparatus is bent. A front face 31 of the apparatus 30 is deformed to form a convex surface and also a back face 33 of the apparatus 30 is deformed to form a concave face. The deformation of the front face 31 extends it whereas the deformation of the back face 33 compresses it.


A sensor 34 is configured to detect the deformation of the apparatus 30. The sensor may, for example, be a compression sensor positioned in association with the back face 33. While the apparatus 30 is deformed, the sensor 34 generates a sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.


In FIG. 5C, the apparatus 30 is no longer deformed by the user and the apparatus has returned to the non-deformed configuration. The sensor 34 is no longer generating the sensor signal. The apparatus 30 is again displaying information corresponding to the first output state in the display 45.


In FIG. 6A, the apparatus 30 is in a non-deformed configuration. The apparatus 30 is displaying information corresponding to a first output state in a display 45 positioned in a front face 31 of the apparatus 30.


In FIG. 6B, the apparatus 30 is temporarily physically deformed by a user action to a first extent. In this embodiment, a body 36 of the apparatus is bent. A front face 31 of the apparatus 30 is deformed to form a convex surface and also a back face 33 of the apparatus 30 is deformed to form a concave face. The deformation of the front face 31 extends it whereas the deformation of the back face 33 compresses it.


A sensor 34 is configured to detect the deformation of the apparatus 30. The sensor may, for example, be a compression sensor positioned in association with the back face 33. While the apparatus 30 is deformed beyond the first extent, the sensor 34 generates a first sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.


If the user now released the deformation of the apparatus 30 so that it returned to the non-deformed configuration, then the apparatus 30 would again display information corresponding to the first output state in the display 45.


However, in FIG. 6C, instead the apparatus 30 is temporarily physically deformed further in the same sense by the user action to a second extent. In this embodiment, a body 36 of the apparatus is bent further. A front face 31 of the apparatus 30 is deformed to form a more convex surface and also a back face 33 of the apparatus 30 is deformed to form a more concave face. The deformation of the front face 31 further extends it whereas the deformation of the back face 33 further compresses it.


When the apparatus 30 is deformed beyond the first extent to the second extent exceeding a deflection threshold, the sensor 34 generates a second sensor signal. The apparatus 30 now displays information corresponding to the second output state in the display 45 even if the user releases the deformation of the apparatus 30.


In FIG. 6D, the apparatus 30 is no longer deformed by the user and the apparatus has returned to the non-deformed configuration. The sensor 34 is no longer generating the sensor signal. The apparatus 30 is now displaying information corresponding to the second output state in the display 45 rather than the first output state.


Consequently, by slightly bending the apparatus 30 the user is able to reversibly view the information corresponding to the second output state when the first output state is the current active state. Releasing the bend returns the display to displaying information corresponding to the first output state. However, further bending the apparatus 30 switches the current active state from the first output state to the second output state.


It should be appreciated that the embodiments illustrated in FIGS. 3A to 3C and 4A to 4C may also enable switching of a current output state from the first output state to the second output state by further deforming the apparatus 30 beyond the initial deformation required to display temporarily the information corresponding to the second output state.



FIG. 2 schematically illustrates one example of a suitable method 10 for controlling the output of the apparatus 30.


At block 20, the first output state is set as a current active output state and a second output state is set as a non-current output state.


Next at block 21, information corresponding to the current output state is displayed in display 45.


Next at block 22, it is detected when a user actuation exceeds a first threshold. For example, it may be detected when a deformation of the apparatus 30 exceeds a first deformation threshold by determining when a sensor signal exceeds a first deformation signal threshold.


When it is detected that a user actuation exceeds a first threshold, the method moves to block 23.


Next at block 23, information corresponding to the non-current output state is displayed in the display 45.


Next at block 24, it is detected when a user actuation falls beneath the first threshold because the user has released the actuation. For example, it may be detected when a deformation of the apparatus 30 is less than the first deformation threshold by determining when a sensor signal is less than the first deformation signal threshold.


If it is detected that a user actuation has fallen beneath the first threshold then the method returns to block 21. Otherwise the method proceeds to block 25.


Next at block 25, it is detected when a user actuation exceeds a second threshold. For example, it may be detected when a deformation of the apparatus 30 exceeds a second deformation threshold by determining when a sensor signal exceeds a second deformation signal threshold.


If it is detected that a user actuation has exceeded the second threshold then the method proceeds to block 26. Otherwise the method returns to block 23.


Next at block 26, the second output state is set as a current active output state and the first output state is set as a non-current output state. The method then returns to block 21.


At block 21, information corresponding to the current output state (second output state) is displayed in display 45.


Next at block 22, it is detected when a user actuation exceeds a threshold that may be the same as or different from the first threshold. When it is detected that a user actuation exceeds the threshold, the method moves to block 23.


Next at block 23, information corresponding to the non-current output state is displayed in the display 45. The non-current output state may be the first output state or a different output state.


Next at block 24, it is detected when a user actuation falls beneath the threshold because the user has released the actuation. If it is detected that a user actuation has fallen beneath the threshold then the method returns to block 21. Otherwise the method proceeds to block 25.


Next at block 25, it is detected when a user actuation exceeds a further greater threshold. If it is detected that a user actuation has exceeded the further threshold then the method proceeds to block 26. Otherwise the method returns to block 23.


Next at block 26, a non-current output state and the current output state are swapped. The current output state becomes a non-current output state and a different non-current output state becomes the current output state. For example, the first output state may be set as the current active output state and the second output state may be set as a non-current output state. The method then returns to block 21.


In this example it is therefore possible to temporarily toggle between displaying information corresponding to the first and second output states by, for example, performing a first deformation of the apparatus 30 and to toggle back by releasing the first deformation. It is also possible to permanently toggle the current output state between the first and second output states by, for example, performing a second deformation of the apparatus 30 (releasing this second deformation does not cause a toggle) and to toggle back by performing a third deformation of the apparatus 30 to a greater extent or in a different way (releasing this third deformation does not cause a toggle).
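The two-threshold behaviour of FIG. 2 can be sketched as a small controller: sensor signals between the first and second thresholds give a temporary peek at the non-current state, while a signal beyond the second threshold swaps which state is current. This is a Python sketch under assumed names and threshold values, not the patented implementation.

```python
# Sketch of the FIG. 2 flow (blocks 20-26) with two thresholds.
# threshold_1 gates the temporary peek; threshold_2 gates the swap.
# All names and values are hypothetical.

class OutputStateController:
    def __init__(self, current, non_current, threshold_1, threshold_2):
        self.current = current          # block 20: current active state
        self.non_current = non_current  # block 20: non-current state
        self.threshold_1 = threshold_1
        self.threshold_2 = threshold_2
        self._latched = False  # set once threshold_2 has been exceeded

    def displayed(self, sensor_signal):
        """Return the state to display for the given sensor signal level."""
        if sensor_signal >= self.threshold_2:
            if not self._latched:
                # Blocks 25/26: swap current and non-current permanently.
                self.current, self.non_current = self.non_current, self.current
                self._latched = True
            return self.current
        if sensor_signal < self.threshold_1:
            self._latched = False  # block 24: actuation released; re-arm
            return self.current    # block 21: display the current state
        # Between thresholds: temporary peek (block 23) unless the swap
        # already happened during this actuation, in which case the new
        # current state stays on screen as the user releases.
        return self.current if self._latched else self.non_current

ctrl = OutputStateController("browser", "messages",
                             threshold_1=0.2, threshold_2=0.8)
print([ctrl.displayed(s) for s in (0.0, 0.5, 0.0, 0.9, 0.0, 0.5, 0.0)])
# ['browser', 'messages', 'browser', 'messages', 'messages', 'browser', 'messages']
```

The latch prevents the swap from repeating while the deformation is held beyond the second threshold, and it mirrors the described hysteresis: after a deep bend the second state remains displayed even once the bend is released, and a later moderate bend then peeks back at the first state.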


The second deformation may, for example, be similar to the first deformation but to a greater extent. The third deformation may, for example, be similar to but separate from the second deformation or it may be in an opposite sense to the second deformation.


The user can provide input commands to an application corresponding to the current output state but cannot provide input commands to the application(s) corresponding to the non-current output state(s).



FIGS. 7A and 7B schematically illustrate an embodiment of the apparatus 30 in which the display output of the apparatus 30 is controlled by bending the apparatus 30, for example, as previously described with reference to FIGS. 5A to 5C and 6A to 6D.


In FIG. 7A, the apparatus 30 is in a non-deformed configuration.


In FIG. 7B, the apparatus 30 is in a deformed configuration.


In FIGS. 7A and 7B, the apparatus 30 comprises an internal supporting structure 40. The support 40 operates as a rigid skeleton.


A first part 44A of the supporting structure 40 is a rigid limb of the skeleton. It extends, in the non-deformed configuration (FIG. 7A), substantially parallel to the front face 31.


A second part 44B of the supporting structure 40 is a rigid limb of the skeleton. It extends, in the non-deformed configuration (FIG. 7A), substantially parallel to the front face 31.


A hinge 42 forms a joint of the skeleton positioned between the first part 44A and the second part 44B. The hinge 42 has an axis that extends substantially parallel to the front face 31. The hinge 42 enables the first part 44A and the second part 44B to rotate about the axis when the apparatus 30 is bent (FIG. 7B).


The first part 44A provides a rigid support for first functional circuitry 48A and the second part 44B provides a rigid support for second functional circuitry 48B. The first functional circuitry 48A and the second functional circuitry 48B are electrically interconnected via an interconnecting flex 46. The combination of the first functional circuitry 48A and the second functional circuitry 48B provides the components that, in combination, enable the apparatus 30 to operate. They may, for example, include a controller. Implementation of the controller can be in hardware alone (a circuit, a processor, etc.), can have certain aspects in software (including firmware) alone, or can be a combination of hardware and software (including firmware). The controller may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on a computer readable storage medium (disk, memory, etc.) to be executed by such a processor.


The apparatus 30 comprises a housing 43 that has plastic sidewalls, a thin plastic window 41 at the front face 31 of the apparatus overlying the display 45 and soft plastic material 47 at the back face 33 of the apparatus 30.


The deformation sensor 34 is integrated into the back face 33. In this example, it is positioned underneath the hinge 42.


A temporary physical deformation by the user bends the supporting structure 40 at the hinge 42. This is detected by the deformation sensor 34, which is temporarily deformed.



FIGS. 8A and 8B schematically illustrate one example of a suitable first output state (FIG. 8A) and second output state (FIG. 8B).


In FIG. 8A, the first output state is an inactive state (e.g. idle state or sleep state) and the information displayed is sleep state information. In this example, the display 45 is switched off and no information is displayed.


In FIG. 8B, the second output state is a clock state and the information corresponding to the second output state that is displayed is a time. Thus the time can be displayed without exiting the low energy sleep state.



FIGS. 9A and 9B schematically illustrate one example of a suitable first output state (FIG. 9A) and second output state (FIG. 9B).


In FIG. 9A, there are a large number of running applications including a web browser, a messaging application, a calendar application, a missed calls application and a Facebook (Trade Mark) application. The current active application is the web browser application. The first output state is the output screen from the web-browser.


In FIG. 9B, the second output state displays information relating to the other applications (the messaging application, the calendar application, the missed calls application and the Facebook (Trade Mark) application).



FIGS. 10A and 10B schematically illustrate one example of a suitable first output state (FIG. 10A) and second output state (FIG. 10B).


In FIG. 10A, there is a web browser running and a messaging application running. The current active application is the web browser application. The first output state is the output screen from the web-browser.


In FIG. 10B, after an initial deformation of the apparatus 30, the second output state displays information relating to the messaging application. The whole output screen of the messaging application is visible. The web browser application is still the current active application.


In FIG. 10C, after a further deformation of the apparatus 30, the messaging application becomes the current active application. The whole output screen of the messaging application is now displayed as a default.



FIGS. 11A and 11B schematically illustrate one example of a suitable first output state (FIG. 11A) and second output state (FIG. 11B).


In FIG. 11A, there is a web browser running and a messaging application running. The current active application is the web browser application. The first output state is the output screen from the web-browser.


In FIG. 11B, after an initial deformation of the apparatus 30, the second output state displays some information relating to the messaging application. Part of the output screen of the messaging application is visible but at least a part of the web browser screen is also visible.



FIG. 12 schematically illustrates an example of the apparatus 30. The apparatus may, for example, be a hand-portable electronic apparatus that is sized to fit into an inside breast pocket of a jacket and/or to be held in the palm of a human hand. The hand-portable electronic device may be operable as a mobile cellular telephone, a personal media player (music, video, and/or books), a personal digital assistant and/or a personal computer.


The apparatus 30 comprises a sensor 34 configured to respond to a user actuation by generating a sensor signal 35. The sensor 34 may be a deformation sensor configured to detect deformation of the apparatus 30 and configured to generate a sensor signal 35 in response to physical deformation of the apparatus 30.


The sensor 34 may, for example, be positioned at a surface of the apparatus and may be configured to generate a sensor signal 35 in response to physical deformation of the surface of the apparatus 30.


The sensor 34 may, for example, be positioned at an interior of the apparatus 30 and may be configured to generate a sensor signal 35 in response to physical deformation of a supporting structure of the apparatus 30.


The apparatus 30 also comprises a display 45, at least one processor 50; and at least one memory 52 including computer program code 54.


The at least one memory 52 and the computer program code 54 are configured to, with the at least one processor 50, cause the apparatus 30 at least to perform:


controlling a display 45 to display information corresponding to a first output state,


when detecting the sensor signal 35 from the sensor 34 responsive to a user actuation, temporarily controlling the display 45 to display information corresponding to a second output state, and


when no longer detecting the sensor signal 35 from the sensor 34, automatically controlling the display 45 to display again information corresponding to the first output state.


The processor 50 is configured to read from and write to the memory 52. The processor 50 may also comprise an output interface via which data and/or commands are output by the processor and an input interface via which data and/or commands are input to the processor 50.


The memory 52 stores a computer program 54 comprising computer program instructions that control the operation of the apparatus 30 when loaded into the processor 50. The computer program instructions 54 provide the logic and routines that enable the apparatus to perform the methods illustrated in the Figs. The processor 50, by reading the memory 52, is able to load and execute the computer program 54.


The computer program may arrive at the apparatus 30 via any suitable delivery mechanism 56 (FIG. 13). The delivery mechanism 56 may be, for example, a computer-readable storage medium, a computer program product, a memory device, a record medium such as a CD-ROM or DVD, or an article of manufacture that tangibly embodies the computer program 54. The delivery mechanism may be a signal configured to reliably transfer the computer program 54.


The apparatus 30 may propagate or transmit the computer program 54 as a computer data signal.


Although the memory 52 is illustrated as a single component it may be implemented as one or more separate components some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.


References to ‘computer-readable storage medium’, ‘computer program product’, ‘tangibly embodied computer program’ etc. or a ‘controller’, ‘computer’, ‘processor’ etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.


As used in this application, the term ‘circuitry’ refers to all of the following:


(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and


(b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions, and


(c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.


This definition of ‘circuitry’ applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term “circuitry” would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.


The blocks illustrated in FIGS. 1 and 2 may represent operations in a method and/or sections of code in the computer program 54. The illustration of a particular order to the blocks does not necessarily imply that there is a required or preferred order for the blocks, and the order and arrangement of the blocks may be varied. Furthermore, it may be possible for some blocks to be omitted.


Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed.


Features described in the preceding description may be used in combinations other than the combinations explicitly described.


Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.


Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.


Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.

Claims
  • 1. A method comprising: displaying first information corresponding to a first output state; while a user actuation is occurring, temporarily displaying information corresponding to a second output state; and when the user actuation is released and is no longer occurring, returning to displaying the first information corresponding to the first output state, wherein the user actuation comprises temporarily inwardly deforming or bending only a front face of an apparatus, and not deforming or bending a back face of the apparatus.
  • 2. A method as claimed in claim 1, comprising: temporarily displaying information corresponding to a second output state while a user actuation is occurring until the user actuation is interrupted and then displaying the first information corresponding to the first output state.
  • 3. A method as claimed in claim 1, wherein the user actuation comprises a temporary physical deformation of an apparatus.
  • 4. A method as claimed in claim 1, wherein the user actuation comprises a temporary physical deformation of a surface of an apparatus.
  • 5. A method as claimed in claim 1, wherein the user actuation comprises a temporary physical deformation of an internal supporting structure of an apparatus.
  • 6. A method as claimed in claim 1, wherein the user actuation comprises a temporary bending of a body of an apparatus.
  • 7. A method as claimed in claim 6, wherein the apparatus has a front face comprising a display for displaying information and a back face, wherein bending the body of the apparatus compresses one of the faces and extends the other of the faces.
  • 8. A method as claimed in claim 7, wherein the sensor is a compression sensor positioned in association with the back face.
  • 9. A method as claimed in claim 1, comprising: detecting when the occurring user actuation exceeds a threshold; displaying information corresponding to the second output state; and displaying information corresponding to the second output state when the user actuation is no longer occurring.
  • 10. A method as claimed in claim 9, wherein the user actuation exceeds a threshold when a temporary physical deformation of an actuator exceeds a deformation threshold.
  • 11. A method as claimed in claim 9, wherein the user actuation exceeds a threshold when a temporary bending of an apparatus exceeds a deflection threshold.
  • 12. A method as claimed in claim 1, wherein a pairing of the first output state and the second output state is selected from the group comprising: the first information corresponding to the first output state is inactive state information and information corresponding to the second output state comprises additional or different information; the first information corresponding to the first output state is a content of a page of a document and information corresponding to the second output state is a content of later pages of the document; the first information corresponding to the first output state is a screen for a currently active one of multiple running applications each having a respective screen and information corresponding to the second state is at least part of a screen for one of the multiple running applications that is not active; the first information corresponding to the first output state is a screen for a currently active one of multiple running applications each having a respective screen and information corresponding to the second state is a whole of a screen for one of the multiple running applications that is not active.
  • 13. An apparatus comprising: a sensor configured to respond to a user actuation by generating a sensor signal; at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: controlling a display to display first information corresponding to a first output state; while detecting the sensor signal from the sensor responsive to a user actuation, temporarily controlling the display to display information corresponding to a second output state, and when no longer detecting the sensor signal from the sensor because the user actuation is released and is no longer occurring, automatically controlling the display to return to displaying the first information corresponding to the first output state, wherein the user actuation comprises temporarily inwardly deforming or bending only a front face of the apparatus, and not deforming or bending a back face of the apparatus.
  • 14. An apparatus as claimed in claim 13, wherein the sensor is configured to generate a sensor signal in response to physical deformation of the sensor.
  • 15. An apparatus as claimed in claim 13, wherein the sensor is positioned at a surface of the apparatus and is configured to generate a sensor signal in response to physical deformation of the surface of the apparatus.
  • 16. An apparatus as claimed in claim 13, wherein the sensor is positioned at an interior of the apparatus and is configured to generate a sensor signal in response to physical deformation of a supporting structure of the apparatus.
  • 17. An apparatus as claimed in claim 13, wherein the sensor is configured to generate a sensor signal in response to bending of the apparatus.
  • 18. An apparatus as claimed in claim 17, wherein the apparatus has a front face comprising a display for displaying information and a back face, wherein bending the apparatus compresses one of the faces and extends the other of the faces.
  • 19. An apparatus as claimed in claim 18, wherein the apparatus has a front face comprising a display for displaying information and a back face, wherein bending the apparatus compresses the back face and extends the other of the faces and wherein the sensor is a compression sensor positioned in association with the back face.
  • 20. An apparatus as claimed in claim 13, wherein the sensor is configured to respond to a developing user actuation by generating a developing sensor signal; detecting when the sensor signal exceeds a threshold; displaying information corresponding to the second output state; and displaying information corresponding to the second output state when the user actuation is no longer occurring.
  • 21. An apparatus as claimed in claim 20, wherein the sensor is configured to generate a sensor signal in excess of the threshold in response to physical deformation of the apparatus beyond a deformation threshold.
  • 22. An apparatus as claimed in claim 20, wherein the sensor is configured to generate a sensor signal in excess of the threshold in response to bending of the apparatus beyond a deflection threshold.
  • 23. An apparatus comprising: sensor means for responding to a user actuation by generating a sensor signal; means for controlling a display to display first information corresponding to a first output state; means for temporarily controlling the display to display information corresponding to a second output state while the sensor signal is being generated; and means for controlling the display to display the first information corresponding to the first output state when the sensor signal is no longer being generated because the user actuation is released and is no longer occurring, wherein the user actuation comprises temporarily inwardly deforming or bending only a front face of the apparatus, and not deforming or bending a back face of the apparatus.
  • 24. An apparatus comprising means for performing the method of claim 1.
  • 25. A computer program stored on a non-transitory computer readable medium which when loaded into a processor enables the processor to: enable displaying first information corresponding to a first output state; while a user actuation is occurring, enable temporarily displaying information corresponding to a second output state; and when the user actuation is released and is no longer occurring, enable returning to displaying the first information corresponding to the first output state, wherein the user actuation comprises temporarily inwardly deforming or bending only a front face of the apparatus, and not deforming or bending a back face of the apparatus.
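The behavior claimed above (claims 1, 9 and 13) — showing second-state information only while a bend actuation persists, with an optional threshold before the state switch, and automatically returning to the first state on release — can be sketched in a minimal, purely illustrative form. This sketch is not part of the patent; all names and the threshold value are hypothetical assumptions.

```python
# Illustrative sketch of the claimed display-state logic.
# DisplayController and BEND_THRESHOLD are hypothetical names, not from the patent.

BEND_THRESHOLD = 5.0  # assumed deflection threshold, arbitrary units (cf. claims 10-11)


class DisplayController:
    """Selects which information a display shows, driven by a bend/deformation sensor."""

    def __init__(self, first_info, second_info):
        self.first_info = first_info    # information for the first output state
        self.second_info = second_info  # information for the second output state

    def current_info(self, deflection):
        """Return the information to display for a given sensor deflection.

        While the deflection exceeds the threshold (the user is bending the
        front face), the second output state is shown; once the actuation is
        released, the display automatically returns to the first state.
        """
        if deflection > BEND_THRESHOLD:
            return self.second_info
        return self.first_info


controller = DisplayController("page 1", "preview of later pages")
print(controller.current_info(0.0))   # no actuation: first state
print(controller.current_info(8.0))   # bend beyond threshold: second state
print(controller.current_info(0.0))   # actuation released: back to first state
```

A latching variant, as in claim 9, would instead keep returning `second_info` after the actuation ends once the threshold has been exceeded, rather than reverting on release.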
CROSS REFERENCE TO RELATED APPLICATION

This is a continuation application of copending application Ser. No. 13/697,579 filed Jan. 28, 2013, which is a national stage application of International Application No. PCT/IB2010/052273 filed May 21, 2010 which are hereby incorporated by reference in their entireties.

US Referenced Citations (128)
Number Name Date Kind
1410366 Buchman Mar 1922 A
1619502 Fox Mar 1927 A
2311470 Ritter Feb 1943 A
3148724 Chieger Sep 1964 A
3297077 Garbus Jan 1967 A
3324930 Colombo Jun 1967 A
3363383 La Barge Jan 1968 A
3570579 Matsushima Mar 1971 A
3880500 Kojabashian Apr 1975 A
4344475 Frey Aug 1982 A
4438605 DeLucia Mar 1984 A
4483020 Dunn Nov 1984 A
4716698 Wilson Jan 1988 A
4762020 Schmidberger Aug 1988 A
4785565 Kuffner Nov 1988 A
5007108 Laberge et al. Apr 1991 A
5133108 Esnault Jul 1992 A
5148850 Urbanick Sep 1992 A
5176463 Kraus Jan 1993 A
5214623 Seager May 1993 A
5488982 Rejc Feb 1996 A
5588167 Pahno et al. Dec 1996 A
5613541 Bradbury Mar 1997 A
5706026 Kent et al. Jan 1998 A
5771489 Snedeker Jun 1998 A
5795430 Beeteson et al. Aug 1998 A
5923318 Zhai et al. Jul 1999 A
6016176 Kim et al. Jan 2000 A
6160540 Fishkin et al. Dec 2000 A
6378172 Schrage Apr 2002 B1
6441809 Kent et al. Aug 2002 B2
6556189 Takahata et al. Apr 2003 B1
6557177 Hochmuth May 2003 B2
7075527 Takagi et al. Jul 2006 B2
7443380 Nozawa Oct 2008 B2
7446757 Mochizuki Nov 2008 B2
7456823 Poupyrev et al. Nov 2008 B2
8194399 Ashcraft et al. Jun 2012 B2
8380327 Park Feb 2013 B2
8581859 Okumura et al. Nov 2013 B2
8619021 Hayton Dec 2013 B2
8780540 Whitt et al. Jul 2014 B2
8780541 Whitt et al. Jul 2014 B2
8804324 Bohn Aug 2014 B2
8929085 Franklin et al. Jan 2015 B2
8999474 Casteras Apr 2015 B2
20010033275 Kent et al. Oct 2001 A1
20020033798 Nakamura et al. Mar 2002 A1
20020167495 Quinn et al. Nov 2002 A1
20030043087 Kim Mar 2003 A1
20030060269 Paulsen et al. Mar 2003 A1
20030144034 Hack et al. Jul 2003 A1
20030147205 Murphy et al. Aug 2003 A1
20030210801 Naksen et al. Nov 2003 A1
20030214485 Roberts Nov 2003 A1
20030227441 Hioki et al. Dec 2003 A1
20040008191 Poupyrev et al. Jan 2004 A1
20040017355 Shim Jan 2004 A1
20040035994 Cho et al. Feb 2004 A1
20040046739 Gettemy Mar 2004 A1
20040212588 Moriyama Oct 2004 A1
20040239631 Gresham Dec 2004 A1
20050051693 Chu Mar 2005 A1
20050057527 Takenaka et al. Mar 2005 A1
20050140646 Nozawa Jun 2005 A1
20050162389 Obermeyer et al. Jul 2005 A1
20050237308 Autio et al. Oct 2005 A1
20060007151 Ram Jan 2006 A1
20060077672 Schaak Apr 2006 A1
20060199999 Ikeda et al. Sep 2006 A1
20060238494 Narayanaswami et al. Oct 2006 A1
20070040810 Dowe et al. Feb 2007 A1
20070097014 Solomon May 2007 A1
20070154254 Bevirt Jul 2007 A1
20070205997 Lieshout et al. Sep 2007 A1
20070242033 Cradick et al. Oct 2007 A1
20070247422 Vertegaal et al. Oct 2007 A1
20080018631 Hioki et al. Jan 2008 A1
20080042940 Hasegawa Feb 2008 A1
20080251662 Desorbo et al. Oct 2008 A1
20090058828 Jiang et al. Mar 2009 A1
20090085866 Sugahara Apr 2009 A1
20090088204 Culbert et al. Apr 2009 A1
20090115734 Frederiksson et al. May 2009 A1
20090184921 Scott et al. Jul 2009 A1
20090219247 Watanabe et al. Sep 2009 A1
20090237374 Li Sep 2009 A1
20090237872 Bemelmans et al. Sep 2009 A1
20090244013 Eldershaw Oct 2009 A1
20090326833 Ryhanen et al. Dec 2009 A1
20100011291 Nurmi Jan 2010 A1
20100013939 Ohno et al. Jan 2010 A1
20100056223 Choi et al. Mar 2010 A1
20100060548 Choi et al. Mar 2010 A1
20100108828 Yu et al. May 2010 A1
20100120470 Kim et al. May 2010 A1
20100134428 Oh Jun 2010 A1
20100141605 Kang Jun 2010 A1
20100164888 Okumura Jul 2010 A1
20100228295 Whitefield Sep 2010 A1
20100238612 Hsiao et al. Sep 2010 A1
20100263245 Bowser Oct 2010 A1
20110007000 Lim Jan 2011 A1
20110057873 Geissler et al. Mar 2011 A1
20110062703 Lopez et al. Mar 2011 A1
20110080155 Aldridge Apr 2011 A1
20110095999 Hayton Apr 2011 A1
20110141053 Bulea et al. Jun 2011 A1
20110141069 Hirakata et al. Jun 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20110181494 Wong et al. Jul 2011 A1
20110193771 Chronqvist Aug 2011 A1
20110227822 Shai Sep 2011 A1
20110241822 Opran et al. Oct 2011 A1
20110298786 Cho et al. Dec 2011 A1
20120044620 Song Feb 2012 A1
20120110784 Hsu May 2012 A1
20120162876 Kim Jun 2012 A1
20120206375 Fyke et al. Aug 2012 A1
20130083496 Franklin Apr 2013 A1
20130120912 Ladouceur May 2013 A1
20130178344 Walsh et al. Jul 2013 A1
20130187864 Paasovaara et al. Jul 2013 A1
20130194207 Andrew et al. Aug 2013 A1
20130197819 Vanska et al. Aug 2013 A1
20130286553 Vanska et al. Oct 2013 A1
20130333592 Cavallaro Dec 2013 A1
20140003006 Ahn Jan 2014 A1
Foreign Referenced Citations (35)
Number Date Country
1598870 Mar 2005 CN
1617614 May 2005 CN
101430601 May 2009 CN
201758267 Mar 2011 CN
1 657 965 May 2006 EP
1770965 Apr 2007 EP
1 829 023 Sep 2007 EP
1830336 Sep 2007 EP
1 970 886 Sep 2008 EP
2166443 Mar 2010 EP
2202624 Jun 2010 EP
2315186 Apr 2011 EP
2508960 Oct 2012 EP
2456512 Jul 2009 GB
2002278515 Sep 2002 JP
2003015795 Jan 2003 JP
2004046792 Feb 2004 JP
2004192241 Jul 2004 JP
2008152426 Jul 2008 JP
20060134130 Dec 2006 KR
20090006718 Jan 2009 KR
20090006807 Jan 2009 KR
2009001161 Feb 2009 KR
200404248 Mar 2004 TW
WO-0060438 Oct 2000 WO
WO-2005093548 Oct 2005 WO
WO-2005093548 Oct 2005 WO
WO-2006014230 Feb 2006 WO
WO-2008150600 Dec 2008 WO
WO-2009050107 Apr 2009 WO
WO-2010004080 Jan 2010 WO
WO-2010041227 Apr 2010 WO
WO-2011117681 Sep 2011 WO
WO-2011144972 Nov 2011 WO
WO-2013160737 Oct 2013 WO
Non-Patent Literature Citations (10)
Entry
“Nokia patent application points to flexible phone displays”; Donald Melanson, Publication date: Jan. 19, 2010.
Intuitive Page-Turning Interface of E-Books on Flexible E-Paper Based on User Studies; Taichi Tajika, Tomoko Yonezawa, Noriaki Mitsunaga; on pp. 793-796; Publication date: 2008.
Lahey, Byron et al.; “PaperPhone: Understanding the Use of Bend Gestures in Mobile Devices with Flexible Electronic Paper Displays”; CHI 2011-Session: Flexible Grips & Gestures; May 7-12, 2011; pp. 1303-1312.
Lee, Sang-Su et al; “How Users Manipulate Deformable Displays as Input Devices”; Apr. 10-15, 2010; pp. 1647-1656.
Poupyrev, Ivan; “Gummi: A bendable computer”; http://ivanpoupyrev.com/projects/gummi.php; 1994-2012; whole document (7 pages).
Honig, Zach; “Murata Tactile Controller TV remote hands-on (video)”; http://www.engadget.com/2011/10/05/murata-tactile-controller-tv-remote-hands-on-video; 2012; whole document (8 pages).
“Press release: revolutionary new paper computer shows flexible future for smartphones and tablets”; http://www.hml.queensu.ca/paperphone; 2012; whole document (2 pages).
Mina; “Samsung Unveils Flexible Android Smartphone”; http://www.androidauthority.com/samsung-unveils-flexible-android-smartphone-24933/; Sep. 21, 2011; whole document (8 pages).
Smith, Matt; “Nokia's kinetic future: flexible screens and a twisted interface”; http://www.engadget.com/2011/10/26/nokias-kinetic-future-flexible-screens-and-a-twisted-interface/; Oct. 26, 2012; whole document (4 pages).
Watanabe, Jun-ichiro, et al., “Booksheet: Bendable Device for Browsing Content Using the Metaphor of Leafing Through the Pages”, Sep. 21-24, 2008, pp. 360-369.
Related Publications (1)
Number Date Country
20170199623 A1 Jul 2017 US
Continuations (1)
Number Date Country
Parent 13697579 US
Child 15470059 US