Embodiments of the present invention relate to a method, an apparatus and a computer program. In particular, they relate to a method, an apparatus and a computer program for controlling an output from a display of an apparatus.
Electronic apparatuses now often have displays. However, it is not always possible to present in such a display all the information that a user may wish to view. In such circumstances, it may be necessary to define different output states, each with its own corresponding information, and to provide the user with a way of navigating from one output state to another.
For example, in Microsoft Windows, running applications have an icon in the Windows taskbar. Selecting the icon for an application makes that application the current active application. The output state changes and a screen for the selected application is displayed in front of the screens for the other applications.
According to various, but not necessarily all, embodiments of the invention there is provided a method comprising: displaying information corresponding to a first output state, temporarily displaying information corresponding to a second output state while a user actuation is occurring; and displaying information corresponding to the first output state when the user actuation is no longer occurring.
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: a sensor configured to respond to a user actuation by generating a sensor signal; at least one processor; and at least one memory comprising computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus at least to perform: controlling a display to display information corresponding to a first output state, when detecting the sensor signal from the sensor responsive to a user actuation, temporarily controlling the display to display information corresponding to a second output state, and when no longer detecting the sensor signal from the sensor, automatically controlling the display to display again information corresponding to the first output state.
According to various, but not necessarily all, embodiments of the invention there is provided an apparatus comprising: sensor means for responding to a user actuation by generating a sensor signal; means for controlling a display to display information corresponding to a first output state, means for temporarily controlling the display to display information corresponding to a second output state while the sensor signal is being generated; and means for controlling the display to display information corresponding to the first output state when the sensor signal is no longer being generated.
According to various, but not necessarily all, embodiments of the invention there is provided a computer program which when loaded into a processor enables the processor to: enable displaying information corresponding to a first output state, enable temporarily displaying information corresponding to a second output state while a user actuation is occurring; and enable displaying information corresponding to the first output state when the user actuation is no longer occurring.
For a better understanding of various examples of embodiments of the present invention reference will now be made by way of example only to the accompanying drawings in which:
The method 10 may be performed at an apparatus 30.
A sensor 34 is configured to detect the deformation of the apparatus 30. The sensor may, for example, be a compression sensor positioned in association with the back face 33. While the apparatus 30 is deformed, the sensor 34 generates a sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.
A sensor 34 is configured to detect the deformation of the apparatus 30. The sensor may, for example, be a compression sensor positioned in association with the back face 33. While the apparatus 30 is deformed beyond the first extent, the sensor 34 generates a first sensor signal and the apparatus 30 temporarily displays information corresponding to a second output state in the display 45.
If the user now released the deformation of the apparatus 30 so that it returned to the non-deformed configuration, then the apparatus 30 would again display information corresponding to the first output state in the display 45.
However, in
When the apparatus 30 is deformed beyond the first extent to the second extent exceeding a deflection threshold, the sensor 34 generates a second sensor signal. The apparatus 30 now displays information corresponding to the second output state in the display 45 even if the user releases the deformation of the apparatus 30.
Consequently, by slightly bending the apparatus 30 the user is able to reversibly view the information corresponding to the second output state while the first output state is the current active state. Releasing the bend returns the display to displaying information corresponding to the first output state. However, bending the apparatus 30 further switches the current active state from the first output state to the second output state.
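The two-extent bend behaviour described above can be expressed as a short sketch. This is illustrative only: the threshold values and function names are assumptions for the example, not values taken from the specification.

```python
# Hypothetical sketch of the "peek vs. switch" bend behaviour described above.
PEEK_THRESHOLD = 0.2    # first extent: temporarily view the second output state
SWITCH_THRESHOLD = 0.6  # second extent: make the second output state current

def interpret_bend(bend_extent, current_state, other_state):
    """Return (state_to_display, new_current_state) for one bend measurement."""
    if bend_extent >= SWITCH_THRESHOLD:
        # Deep bend: the other output state becomes the current active state,
        # so it remains displayed even after the bend is released.
        return other_state, other_state
    if bend_extent >= PEEK_THRESHOLD:
        # Slight bend: temporarily display the other state ("peek");
        # the current active state is unchanged.
        return other_state, current_state
    # No (or released) bend: display the current active state.
    return current_state, current_state
```

Releasing a slight bend therefore restores the current state automatically, while a deeper bend commits the change.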
It should be appreciated that the embodiments illustrated in
At block 20, the first output state is set as a current active output state and a second output state is set as a non-current output state.
Next at block 21, information corresponding to the current output state is displayed in display 45.
Next at block 22, it is detected when a user actuation exceeds a first threshold. For example, it may be detected when a deformation of the apparatus 30 exceeds a first deformation threshold by determining when a sensor signal exceeds a first deformation signal threshold.
When it is detected that a user actuation exceeds a first threshold, the method moves to block 23.
Next at block 23, information corresponding to the non-current output state is displayed in the display 45.
Next at block 24, it is detected when a user actuation falls beneath the first threshold because the user has released the actuation. For example, it may be detected when a deformation of the apparatus 30 is less than the first deformation threshold by determining when a sensor signal is less than the first deformation signal threshold.
If it is detected that a user actuation has fallen beneath the first threshold then the method returns to block 21. Otherwise the method proceeds to block 25.
Next at block 25, it is detected when a user actuation exceeds a second threshold. For example, it may be detected when a deformation of the apparatus 30 exceeds a second deformation threshold by determining when a sensor signal exceeds a second deformation signal threshold.
If it is detected that a user actuation has exceeded the second threshold then the method proceeds to block 26. Otherwise the method returns to block 23.
Next at block 26, the second output state is set as a current active output state and the first output state is set as a non-current output state. The method then returns to block 21.
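The flow of blocks 20 to 26 above can be sketched as a simple state machine. The numeric thresholds and the generator interface are illustrative assumptions; the specification does not fix concrete values.

```python
# Hypothetical sketch of blocks 20-26: a stream of sensor readings drives
# which output state's information is displayed.
FIRST_THRESHOLD = 0.2    # first deformation signal threshold
SECOND_THRESHOLD = 0.6   # second deformation signal threshold

def run(sensor_readings, current="first", non_current="second"):
    """Yield the output state displayed for each sensor reading."""
    peeking = False   # True while block 23 (non-current state) is displayed
    armed = True      # re-arm only once the actuation has been released
    for signal in sensor_readings:
        if peeking:
            if signal < FIRST_THRESHOLD:                 # block 24: released
                peeking = False                          # back to block 21
            elif signal > SECOND_THRESHOLD:              # block 25
                current, non_current = non_current, current   # block 26: swap
                peeking = False
                armed = False                            # wait for release
        else:
            if not armed:
                armed = signal < FIRST_THRESHOLD
            elif signal > FIRST_THRESHOLD:               # block 22
                peeking = True                           # move to block 23
        yield non_current if peeking else current        # blocks 21 / 23
```

For example, a slight bend that is then released peeks at the second state and returns, whereas holding the bend past the second threshold commits the swap.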
At block 21, information corresponding to the current output state (second output state) is displayed in display 45.
Next at block 22, it is detected when a user actuation exceeds a threshold that may be the same as or different from the first threshold. When it is detected that a user actuation exceeds the threshold, the method moves to block 23.
Next at block 23, information corresponding to the non-current output state is displayed in the display 45. The non-current output state may be the first output state or a different output state.
Next at block 24, it is detected when a user actuation falls beneath the threshold because the user has released the actuation. If it is detected that a user actuation has fallen beneath the threshold then the method returns to block 21. Otherwise the method proceeds to block 25.
Next at block 25, it is detected when a user actuation exceeds a further greater threshold. If it is detected that a user actuation has exceeded the further threshold then the method proceeds to block 26. Otherwise the method returns to block 23.
Next at block 26, a non-current output state and the current output state are swapped. The current output state becomes a non-current output state and a different non-current output state becomes the current output state. For example, the first output state may be set as the current active output state and the second output state may be set as a non-current output state. The method then returns to block 21.
In this example it is therefore possible to temporarily toggle between displaying information corresponding to the first and second output states by, for example, performing a first deformation of the apparatus 30 and to toggle back by releasing the first deformation. It is also possible to permanently toggle the current output state between the first and second output states by, for example, performing a second deformation of the apparatus 30 (releasing this second deformation does not cause a toggle) and to toggle back by performing a third deformation of the apparatus 30 to a greater extent or in a different way (releasing this third deformation does not cause a toggle).
The second deformation may, for example, be similar to the first deformation but to a greater extent. The third deformation may, for example, be similar to but separate from the second deformation, or it may be in an opposite sense to the second deformation.
The user can provide input commands to an application corresponding to the current output state but cannot provide input commands to the application(s) corresponding to the non-current output state(s).
A first part 44A of the supporting structure 40 is a rigid limb of the skeleton. It extends, in the non-deformed configuration (
A second part 44B of the supporting structure 40 is a rigid limb of the skeleton. It extends, in the non-deformed configuration (
A hinge 42 forms a joint of the skeleton positioned between the first part 44A and the second part 44B. The hinge 42 has an axis that extends substantially parallel to the front face 31. The hinge 42 enables the first part 44A and the second part 44B to rotate about the axis when the apparatus 30 is bent (
The first part 44A provides a rigid support for first functional circuitry 48A and the second part 44B provides a rigid support for second functional circuitry 48B. The first functional circuitry 48A and the second functional circuitry 48B are electrically interconnected via an interconnecting flex 46. The combination of the first functional circuitry 48A and the second functional circuitry 48B provide the components that, in combination, enable the apparatus 30 to operate. They may, for example, include a controller. Implementation of the controller can be in hardware alone (a circuit, a processor etc), have certain aspects in software including firmware alone or can be a combination of hardware and software (including firmware). The controller may be implemented using instructions that enable hardware functionality, for example, by using executable computer program instructions in a general-purpose or special-purpose processor that may be stored on a computer readable storage medium (disk, memory etc) to be executed by such a processor.
The apparatus 30 comprises a housing 43 that has plastic sidewalls, a thin plastic window 41 at the front face 31 of the apparatus overlying the display 45 and soft plastic material 47 at the back face 33 of the apparatus 30.
The deformation sensor 34 is integrated into the back face 33. In this example, it is positioned underneath the hinge 42.
A temporary physical deformation by the user bends the supporting structure 40 at the hinge 42. This is detected by the deformation sensor 34, which is temporarily deformed.
The apparatus 30 comprises a sensor 34 configured to respond to a user actuation by generating a sensor signal 35. The sensor 34 may be a deformation sensor configured to detect deformation of the apparatus 30 and configured to generate a sensor signal 35 in response to physical deformation of the apparatus 30.
The sensor 34 may, for example, be positioned at a surface of the apparatus and may be configured to generate a sensor signal 35 in response to physical deformation of the surface of the apparatus 30.
The sensor 34 may, for example, be positioned at an interior of the apparatus 30 and may be configured to generate a sensor signal 35 in response to physical deformation of a supporting structure of the apparatus 30.
The apparatus 30 also comprises a display 45; at least one processor 50; and at least one memory 52 including computer program code 54.
The at least one memory 52 and the computer program code 54 are configured to, with the at least one processor 50, cause the apparatus 30 at least to perform:
controlling a display 45 to display information corresponding to a first output state,
when detecting the sensor signal 35 from the sensor 34 responsive to a user actuation, temporarily controlling the display 45 to display information corresponding to a second output state, and
when no longer detecting the sensor signal 35 from the sensor 34, automatically controlling the display 45 to display again information corresponding to the first output state.
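A minimal sketch of this controller behaviour follows. The `Controller` and `FakeDisplay` classes and their method names are hypothetical stand-ins for illustration, not APIs from the specification.

```python
# Hypothetical sketch: while the sensor signal 35 is detected, the display
# shows the second output state's information; when the signal ceases, the
# first output state's information is automatically restored.
class Controller:
    def __init__(self, display, first_state_info, second_state_info):
        self.display = display
        self.first = first_state_info
        self.second = second_state_info

    def on_sensor(self, signal_present):
        """Called each time the sensor 34 is sampled."""
        if signal_present:
            # Sensor signal detected: temporarily show the second output state.
            self.display.show(self.second)
        else:
            # Signal no longer detected: automatically restore the first state.
            self.display.show(self.first)

class FakeDisplay:
    """Illustrative display stand-in recording what is currently shown."""
    def __init__(self):
        self.shown = None
    def show(self, info):
        self.shown = info
```

The restoration requires no further user action: the absence of the sensor signal alone returns the display to the first output state.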
The processor 50 is configured to read from and write to the memory 52. The processor 50 may also comprise an output interface via which data and/or commands are output by the processor and an input interface via which data and/or commands are input to the processor 50.
The memory 52 stores a computer program 54 comprising computer program instructions that control the operation of the apparatus 30 when loaded into the processor 50. The computer program instructions 54 provide the logic and routines that enable the apparatus to perform the methods illustrated in the Figs. The processor 50, by reading the memory 52, is able to load and execute the computer program 54.
The computer program may arrive at the apparatus 30 via any suitable delivery mechanism 56 (
The apparatus 30 may propagate or transmit the computer program 54 as a computer data signal.
Although the memory 52 is illustrated as a single component it may be implemented as one or more separate components, some or all of which may be integrated/removable and/or may provide permanent/semi-permanent/dynamic/cached storage.
References to ‘computer-readable storage medium’, ‘computer program product’, ‘tangibly embodied computer program’ etc. or a ‘controller’, ‘computer’, ‘processor’ etc. should be understood to encompass not only computers having different architectures such as single/multi-processor architectures and sequential (Von Neumann)/parallel architectures but also specialized circuits such as field-programmable gate arrays (FPGA), application specific circuits (ASIC), signal processing devices and other processing circuitry. References to computer program, instructions, code etc. should be understood to encompass software for a programmable processor or firmware such as, for example, the programmable content of a hardware device whether instructions for a processor, or configuration settings for a fixed-function device, gate array or programmable logic device etc.
As used in this application, the term ‘circuitry’ refers to all of the following:
(a) hardware-only circuit implementations (such as implementations in only analog and/or digital circuitry) and
(b) to combinations of circuits and software (and/or firmware), such as (as applicable): (i) to a combination of processor(s) or (ii) to portions of processor(s)/software (including digital signal processor(s)), software, and memory(ies) that work together to cause an apparatus, such as a mobile phone or server, to perform various functions; and
(c) to circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
This definition of ‘circuitry’ applies to all uses of this term in this application, including in any claims. As a further example, as used in this application, the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) or portion of a processor and its (or their) accompanying software and/or firmware. The term “circuitry” would also cover, for example and if applicable to the particular claim element, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, or other network device.
The blocks illustrated in the
Although embodiments of the present invention have been described in the preceding paragraphs with reference to various examples, it should be appreciated that modifications to the examples given can be made without departing from the scope of the invention as claimed.
Features described in the preceding description may be used in combinations other than the combinations explicitly described.
Although functions have been described with reference to certain features, those functions may be performable by other features whether described or not.
Although features have been described with reference to certain embodiments, those features may also be present in other embodiments whether described or not.
Whilst endeavoring in the foregoing specification to draw attention to those features of the invention believed to be of particular importance it should be understood that the Applicant claims protection in respect of any patentable feature or combination of features hereinbefore referred to and/or shown in the drawings whether or not particular emphasis has been placed thereon.
This is a continuation application of copending application Ser. No. 13/697,579 filed Jan. 28, 2013, which is a national stage application of International Application No. PCT/IB2010/052273 filed May 21, 2010 which are hereby incorporated by reference in their entireties.
Publication: US 2017/0199623 A1, Jul. 2017 (US).

Related applications: parent Ser. No. 13/697,579 (US); child Ser. No. 15/470,059 (US).