The invention relates to electronic devices and, in particular, to controlling a user interface in a portable system.
Wrist devices such as electronic training computers or, in general, electronic wrist computers comprise a user interface. The user interface is typically limited by the small size of a display and input devices. Therefore, it is beneficial to consider novel technologies to improve the user experience associated with usage of wrist devices.
According to an aspect, there is provided a portable system comprising a physical activity monitoring device comprising: a wireless proximity detection module configured to detect a proximity of an input control entity with respect to the physical activity monitoring device and output a control signal as a response to the detection, wherein the proximity is a non-zero distance between the input control entity and the physical activity monitoring device; and a user interface controller configured to generate, as a response to the control signal from the wireless proximity detection module, at least one of an audio control function and a display control function.
In the following, the invention will be described in greater detail by means of preferred embodiments with reference to the accompanying drawings.
The following embodiments are exemplary. Although the specification may refer to “an”, “one”, or “some” embodiment(s) in several locations, this does not necessarily mean that each such reference is to the same embodiment(s), or that the feature only applies to a single embodiment. Single features of different embodiments may also be combined to provide other embodiments. Furthermore, the words “comprising” and “including” should be understood as not limiting the described embodiments to consist of only those features that have been mentioned, and such embodiments may also contain features/structures that have not been specifically mentioned.
The user interface may comprise an audio interface comprising a loudspeaker and/or an earpiece speaker. The user interface may comprise a visual interface comprising a display screen 14. The physical activity monitoring device 12, 16 may comprise an audiovisual interface comprising both the audio interface and the visual interface. The physical activity monitoring device 12, 16 may comprise an audio playback module configured to play audio tracks stored in a memory of the physical activity monitoring device 12, 16.
The system may further comprise at least one sensor device 10 configured to measure training measurement data. The sensor device 10 may comprise at least one of the following: a heart activity sensor, a motion sensor, a force sensor, a cadence sensor, and a location tracking sensor.
The physical activity monitoring device may further comprise at least one processor 60 and at least one memory 70 storing a computer program code 76. The computer program code may comprise program instructions configuring the at least one processor 60 to carry out at least one computer process. In an embodiment, the computer process comprises controlling audio output or visual output according to a control signal received from the proximity detection module 52. In an embodiment, the at least one processor 60 comprises a user interface controller 62 configured to generate, as a response to the control signal from the proximity detection module 52, at least one of an audio control function and a display control function.
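As an informal sketch only, the dispatch performed by the user interface controller 62 could be modelled as follows; the class and signal names are invented for illustration and do not appear in the specification:

```python
# Illustrative sketch: signal names and step sizes are assumptions,
# not taken from the specification.
from dataclasses import dataclass
from enum import Enum, auto

class ControlSignal(Enum):
    VOLUME_UP = auto()
    VOLUME_DOWN = auto()
    NEXT_TRACK = auto()
    BRIGHTNESS_UP = auto()

@dataclass
class UserInterfaceController:
    """Maps control signals received from the proximity detection
    module to audio or display control functions."""
    volume: int = 50       # percent
    brightness: int = 50   # percent
    track: int = 0

    def on_control_signal(self, signal: ControlSignal) -> None:
        # Dispatch the signal to the corresponding control function.
        if signal is ControlSignal.VOLUME_UP:
            self.volume = min(100, self.volume + 10)
        elif signal is ControlSignal.VOLUME_DOWN:
            self.volume = max(0, self.volume - 10)
        elif signal is ControlSignal.NEXT_TRACK:
            self.track += 1
        elif signal is ControlSignal.BRIGHTNESS_UP:
            self.brightness = min(100, self.brightness + 10)
```

In a real device the dispatch would drive the display controller 64 and audio controller 66 rather than plain fields; the sketch only shows the signal-to-function routing.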
The audio control function and/or the display control function may control output of a user interface 20 of the physical activity monitoring device or a user interface with which the physical activity monitoring device communicates over a wireless or a wired connection. In an embodiment where the physical activity monitoring device is the wrist device 12 and the user 11 additionally uses a user interface apparatus 16 such as a portable media player apparatus, the wrist device 12 may control audio and/or display output of the user interface apparatus 16 as a response to the control signal from the proximity detection module 52. The user interface controller 62 may comprise a display controller 64 configured to execute the display control function and/or an audio controller 66 configured to execute the audio control function.
The memory 70 may further store a detection database 72 storing mapping information linking the detected proximity to the audio control function and/or display control function. In an embodiment where the proximity detection module 52 utilizes the detection database 72, the detection database 72 may comprise mapping information mapping the detected proximities to control signals, and the proximity detection module may measure the proximity of the input control entity 24 and map the measured proximity to a corresponding control signal according to the mapping information retrieved from the database 72. In another embodiment where the user interface controller utilizes the detection database 72, the detection database 72 may comprise mapping information mapping the control signals to the control functions of the user interface controller 62. The user interface controller may then receive the control signal from the proximity detection module 52, link the received control signal to an audio control function and/or the display control function according to the mapping information, and instruct the display controller 64 and/or the audio controller 66 to perform a function associated with the received control signal.
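A minimal sketch of the two mapping stages described above; the distance ranges, signal names, and function names are all illustrative assumptions:

```python
# Hypothetical two-stage mapping sketching how a detection database
# might link measured proximities to control signals (stage 1) and
# control signals to user-interface functions (stage 2).
from typing import Optional

# Stage 1 (used by the proximity detection module):
# proximity ranges in centimetres mapped to control signals.
PROXIMITY_TO_SIGNAL = [
    ((0.0, 5.0), "SIGNAL_A"),
    ((5.0, 15.0), "SIGNAL_B"),
    ((15.0, 30.0), "SIGNAL_C"),
]

def proximity_to_signal(distance_cm: float) -> Optional[str]:
    for (low, high), signal in PROXIMITY_TO_SIGNAL:
        if low <= distance_cm < high:
            return signal
    return None  # outside the detection range -> no control signal

# Stage 2 (used by the user interface controller):
# control signals mapped to audio/display control functions.
SIGNAL_TO_FUNCTION = {
    "SIGNAL_A": "pause_track",
    "SIGNAL_B": "next_track",
    "SIGNAL_C": "adjust_volume",
}
```

Depending on the embodiment, only one of the two tables would reside in the detection database 72; the other mapping would be fixed in the module that consumes it.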
It should be appreciated that the shorter distance may be associated with a lower audio volume and the longer distance with a higher audio volume in the embodiment of
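Assuming an illustrative detection range, the distance-to-volume association could be sketched as a simple linear mapping:

```python
def distance_to_volume(distance_cm: float,
                       min_cm: float = 2.0, max_cm: float = 30.0) -> int:
    """Linearly map a measured hand distance to an audio volume (0-100).
    A shorter distance gives a lower volume and a longer distance a
    higher volume. The range limits are assumptions for illustration."""
    clamped = max(min_cm, min(max_cm, distance_cm))
    return round(100 * (clamped - min_cm) / (max_cm - min_cm))
```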
In some embodiments, a communication interval between the sensor device 50 and the physical activity monitoring device is long, e.g. one message at one-second intervals. Accordingly, the distance may be measured with substantially low periodicity. In an embodiment of
In other embodiments where the communication interval is shorter, e.g. less than 0.5 seconds, less than 0.3 seconds, or less than 0.1 seconds, the notifications at the measurement timings are not necessarily needed. For example, the embodiment of
In an embodiment of
In an embodiment, the audio control function comprises at least one of the following: adjusting an audio volume, changing an audio track, starting an audio track, stopping an audio track, pausing an audio track, recording an audio track, outputting an exercise guidance audio signal, selecting a sound profile, and selecting a playback device.
In an embodiment, the display control function comprises at least one of the following: switching from an audio player display mode to an exercise display mode, changing the brightness of a display light, zooming a display view in/out, changing one exercise display mode to another exercise display mode, accepting an incoming call, and dismissing the incoming call. With respect to said accepting the incoming call, the display control function may comprise switching a display mode from a notification of a pending incoming call to a display mode indicating that the call is connected and the voice connection is on. With respect to said dismissing the incoming call, the display control function may comprise switching a display mode from a notification of a pending incoming call to a display mode indicating that the incoming call has been dismissed.
The zooming and the changes of the audio volume and the brightness are substantially linear, so they may be adjusted by using the embodiment of
In an embodiment, the input control entity is a human hand, a part of a human hand such as a finger, or another pointer causing changes in an electric field around the proximity detection module when the input control entity is moved with respect to the proximity detection module.
In the embodiment of
In an embodiment, the antennas are micro-strip antennas integrated into a circuit board.
In an embodiment, the proximity detection module 52 further comprises an impedance conversion circuitry configured to convert a detected change in the sensed electromagnetic field into said control signal output to the user interface controller 62. In the embodiment of
With the embodiment of
In an embodiment, the proximity detection module 52 may activate the impedance conversion circuitry upon receiving an activation signal through a user interface of the training computer, e.g. user operation of a physical button or selection of a determined operating mode. As a response to the activation, the impedance conversion circuitry may start the sensing of the gestures from the impedance of the antennas 700, 702. The impedance conversion circuitry may be deactivated upon receiving a deactivation signal through the user interface and/or upon detecting from the antenna impedance(s) that the hand is no longer within the proximity of the antennas.
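The activation and deactivation behaviour described above can be sketched as a small state machine; the class and method names are illustrative assumptions, not taken from the specification:

```python
class ImpedanceConversionCircuitry:
    """Minimal state sketch: sensing starts on an activation signal and
    stops on an explicit deactivation signal, or autonomously when the
    hand is no longer detected within the antennas' proximity."""

    def __init__(self) -> None:
        self.sensing = False

    def activate(self) -> None:
        # e.g. user operation of a physical button or selection of a
        # determined operating mode.
        self.sensing = True

    def deactivate(self) -> None:
        # Explicit deactivation signal through the user interface.
        self.sensing = False

    def on_impedance_sample(self, hand_in_proximity: bool) -> None:
        # Autonomous deactivation derived from the antenna impedance(s).
        if self.sensing and not hand_in_proximity:
            self.sensing = False
```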
In another embodiment, the impedance conversion circuitry may operate autonomously and start the sensing of the gestures upon detecting the proximity of the hand with respect to the antennas.
In an embodiment, the input control entity comprises an interaction device configured to communicate wirelessly with the training computer.
The proximity detection module 52 may comprise an energizing circuitry 92 configured to wirelessly energize the interaction device, read data from the interaction device as a result of the energization, and output the control signal as a response to the read data. The data read from the interaction device may comprise a device address of the interaction device, and the user interface controller 62 may be configured to modify the at least one of the audio control function and the display control function according to the device address. Accordingly, the detection database 72 may provide mapping between different device addresses and corresponding control functions.
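A minimal sketch, assuming invented tag addresses and function names, of how the detection database 72 could map a device address read from the interaction device to a control function:

```python
# Hypothetical address-to-function table; the addresses and function
# names are invented for illustration only.
from typing import Optional

DEVICE_ADDRESS_TO_FUNCTION = {
    "tag-01": "start_track",
    "tag-02": "stop_track",
    "tag-03": "volume_up",
    "tag-04": "volume_down",
}

def control_function_for_tag(device_address: str) -> Optional[str]:
    """Return the control function mapped to the device address read by
    the energizing circuitry, or None for an unknown tag."""
    return DEVICE_ADDRESS_TO_FUNCTION.get(device_address)
```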
Referring to
In an embodiment, the interaction device comprises an apparel comprising the at least one tag 1, 2, 3, 4, 5, 6. The tag may be sewn into the apparel and a marking indicating a function associated with each tag may be disposed in connection with the tag in the apparel.
With respect to the embodiment of
In the embodiment where the apparel is the glove, the user 11 may trigger the selected audio or display control function by bringing the finger or any other portion of the glove where the tag is disposed within the proximity of the training computer. Accordingly, the desired functions may be executed without taking the gloves off of the hand. In an embodiment where the apparel is a coat, a jacket, or trousers, the user 11 may trigger the selected audio or display control function by bringing the physical activity monitoring device close to the portion of the apparel where the tag 90 is disposed.
In an embodiment, the physical activity monitoring device is configured to establish a wireless connection with an external media player, and the user interface controller is configured to send a signal of the audio control function and/or the display control function to the external media player through the wireless connection. For example, the wrist device 12 may control the audio or display output of the portable media player 16.
In another embodiment, the physical activity monitoring device further comprises an integrated audio device and/or a display, and the user interface controller is configured to control the integrated audio device with the audio control function and the display with the display control function.
Referring to
In block 1004, the processor of the server computer causes transmission of the mapping information created in block 1002 to the training computer. The mapping information is transferred from the server computer to the physical activity monitoring device in step 1006 over one or more wired or wireless connections.
In block 1008, the physical activity monitoring device creates a playlist comprising a plurality of audio tracks. In block 1010, the physical activity monitoring device retrieves the mapping information from a memory, e.g. as a response to playback of the playlist. The physical activity monitoring device may retrieve a portion of the mapping information, e.g. the identifiers of the training events the physical activity monitoring device is configured to detect. In block 1012, the physical activity monitoring device analyses training measurement data received from at least one sensor device and scans for the training events. Upon detecting one or more of the training events, the physical activity monitoring device is configured to cause playback of the portion mapped to the detected one or more training events.
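The event scan could be sketched as follows; the event names and thresholds are assumptions for illustration only:

```python
# Illustrative event scan: the device analyses sensor samples and
# reports which configured training events they trigger. Event names
# and threshold values are invented for this sketch.
TRAINING_EVENTS = {
    "heart_rate_above_160": lambda s: s.get("heart_rate", 0) > 160,
    "cadence_below_70": lambda s: 0 < s.get("cadence", 0) < 70,
}

def scan_for_events(sample: dict) -> list:
    """Return the identifiers of the training events triggered by one
    sample of training measurement data."""
    return [name for name, test in TRAINING_EVENTS.items() if test(sample)]
```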
The physical activity monitoring device may cause immediate playback of the portion regardless of whether or not an audio track is currently played. In another embodiment, the physical activity monitoring device is configured to cause the playback of the portion when the currently played track ends. Accordingly, the playback of the portions may be scheduled to the next idle time interval between two consecutive audio tracks.
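The two playback policies above (immediate playback versus deferral to the next idle interval between tracks) can be sketched as follows, with all names assumed for illustration:

```python
# Sketch of the two playback policies: play a guidance portion
# immediately, or queue it until the current track ends so it lands in
# the idle interval between two consecutive audio tracks.
from collections import deque

class GuidancePlayback:
    def __init__(self, immediate: bool = False) -> None:
        self.immediate = immediate
        self.pending = deque()   # portions waiting for the next idle interval
        self.played = []         # portions already played, in order

    def on_training_event(self, portion: str, track_playing: bool) -> None:
        if self.immediate or not track_playing:
            self.played.append(portion)      # play right away
        else:
            self.pending.append(portion)     # defer to next idle interval

    def on_track_end(self) -> None:
        # The idle interval between two consecutive tracks: flush queue.
        while self.pending:
            self.played.append(self.pending.popleft())
```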
In an embodiment, a further categorization of the portions may be created in block 1002 by mapping the portion to one or more sports types. Accordingly, the portion may be used only when the physical exercise belongs to the sports type mapped to the portion.
In an embodiment, the physical activity monitoring device is configured to select a portion that is already included in the playlist or a portion that is comprised in an audio track of an artist comprised in the playlist.
As used in this application, the term ‘circuitry’ refers to all of the following: (a) hardware-only circuit implementations such as implementations in only analog and/or digital circuitry; (b) combinations of circuits and software and/or firmware, such as (as applicable): (i) a combination of processor(s) or processor cores; or (ii) portions of processor(s)/software including digital signal processor(s), software, and at least one memory that work together to cause an apparatus to perform specific functions; and (c) circuits, such as a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation, even if the software or firmware is not physically present.
This definition of ‘circuitry’ applies to all uses of this term in this application. As a further example, as used in this application, the term “circuitry” would also cover an implementation of merely a processor (or multiple processors) or portion of a processor, e.g. one core of a multi-core processor, and its (or their) accompanying software and/or firmware. The term “circuitry” would also cover, for example and if applicable to the particular element, a baseband integrated circuit, an application-specific integrated circuit (ASIC), and/or a field-programmable gate array (FPGA) circuit for the apparatus according to an embodiment of the invention.
The processes or methods described above in connection with
The present invention is applicable to the portable systems defined above but also to other suitable systems. The development of such systems may require additional changes to the described embodiments. Therefore, all words and expressions should be interpreted broadly, and they are intended to illustrate, not to restrict, the embodiments. It will be obvious to a person skilled in the art that, as technology advances, the inventive concept can be implemented in various ways. The invention and its embodiments are not limited to the examples described above but may vary within the scope of the claims.
This application is a Divisional of U.S. application Ser. No. 14/075,736, filed on Nov. 8, 2013, which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
3758117 | Harrison | Sep 1973 | A |
5099227 | Geiszler | Mar 1992 | A |
5174577 | Warde | Dec 1992 | A |
5694939 | Cowings | Dec 1997 | A |
6115636 | Ryan | Sep 2000 | A |
6261102 | Dugan | Jul 2001 | B1 |
6430997 | French | Aug 2002 | B1 |
6554706 | Kim | Apr 2003 | B2 |
7113087 | Casebolt | Sep 2006 | B1 |
7789800 | Watterson et al. | Sep 2010 | B1 |
7841967 | Kahn | Nov 2010 | B1 |
8088042 | Limma | Jan 2012 | B2 |
8140339 | Hernandez-Rebollar | Mar 2012 | B2 |
8152695 | Riley et al. | Apr 2012 | B2 |
8237041 | McCauley | Aug 2012 | B1 |
8241184 | DiBenedetto | Aug 2012 | B2 |
8287434 | Zavadsky | Oct 2012 | B2 |
8337335 | Dugan | Dec 2012 | B2 |
8344998 | Fitzgerald | Jan 2013 | B2 |
8348672 | Saunders | Jan 2013 | B2 |
8396452 | Matsuoka | Mar 2013 | B1 |
8418085 | Snook | Apr 2013 | B2 |
8471868 | Wilson | Jun 2013 | B1 |
8506457 | Baudhuin | Aug 2013 | B2 |
8562487 | Berggren et al. | Oct 2013 | B2 |
8610582 | Jeon et al. | Dec 2013 | B2 |
8622795 | Edis | Jan 2014 | B2 |
8747282 | Lannon | Jun 2014 | B2 |
8764651 | Tran | Jul 2014 | B2 |
8810249 | Cehelnik | Aug 2014 | B2 |
8929809 | Dobyns | Jan 2015 | B2 |
9092123 | Kahn | Jul 2015 | B1 |
9147343 | Johnson et al. | Sep 2015 | B2 |
9173086 | Yoon | Oct 2015 | B2 |
9191829 | Maguire | Nov 2015 | B2 |
9201548 | Leek | Dec 2015 | B2 |
9236860 | Unterreitmayer | Jan 2016 | B2 |
9400985 | Dobyns | Jul 2016 | B2 |
9420841 | Anderson | Aug 2016 | B2 |
9459697 | Bedikian | Oct 2016 | B2 |
9460700 | Smith | Oct 2016 | B2 |
9477313 | Mistry | Oct 2016 | B2 |
20060007124 | Dehlin | Jan 2006 | A1 |
20060044112 | Bridgelall | Mar 2006 | A1 |
20060224048 | Devaul et al. | Oct 2006 | A1 |
20060282873 | Zalewski | Dec 2006 | A1 |
20070049836 | Chen | Mar 2007 | A1 |
20070075965 | Huppi et al. | Apr 2007 | A1 |
20070100666 | Stivoric et al. | May 2007 | A1 |
20070194878 | Touge | Aug 2007 | A1 |
20070219059 | Schwartz et al. | Sep 2007 | A1 |
20070275826 | Niemimaki | Nov 2007 | A1 |
20080139975 | Einav et al. | Jun 2008 | A1 |
20080268931 | Alderucci | Oct 2008 | A1 |
20080300055 | Lutnick | Dec 2008 | A1 |
20090153369 | Baier et al. | Jun 2009 | A1 |
20090262074 | Nasiri | Oct 2009 | A1 |
20090303204 | Nasiri | Dec 2009 | A1 |
20100160115 | Morris et al. | Jun 2010 | A1 |
20100216600 | Noffsinger | Aug 2010 | A1 |
20100265189 | Rofougaran | Oct 2010 | A1 |
20100292050 | DiBenedetto | Nov 2010 | A1 |
20100306712 | Snook | Dec 2010 | A1 |
20110009713 | Feinberg | Jan 2011 | A1 |
20110025345 | Unterreitmayer | Feb 2011 | A1 |
20110105854 | Kiani et al. | May 2011 | A1 |
20110112771 | French | May 2011 | A1 |
20110154258 | Hope et al. | Jun 2011 | A1 |
20110251021 | Zavadsky | Oct 2011 | A1 |
20110306297 | Chang | Dec 2011 | A1 |
20120050181 | King | Mar 2012 | A1 |
20120319846 | Rogers | Dec 2012 | A1 |
20130024018 | Chang et al. | Jan 2013 | A1 |
20130040271 | Rytky | Feb 2013 | A1 |
20130089845 | Hutchison | Apr 2013 | A1 |
20130169420 | Blount, Jr. | Jul 2013 | A1 |
20130207889 | Chang | Aug 2013 | A1 |
20130271342 | Shen | Oct 2013 | A1 |
20130324036 | Hillan et al. | Dec 2013 | A1 |
20140070957 | Longinotti-Buitoni | Mar 2014 | A1 |
20140107493 | Yuen | Apr 2014 | A1 |
20140147820 | Snow | May 2014 | A1 |
20140201666 | Bedikian | Jul 2014 | A1 |
20140240103 | Lake | Aug 2014 | A1 |
20140240214 | Liu et al. | Aug 2014 | A1 |
20140267024 | Keller | Sep 2014 | A1 |
20140267148 | Luna et al. | Sep 2014 | A1 |
20140280156 | Maser et al. | Sep 2014 | A1 |
20150017965 | Lim | Jan 2015 | A1 |
20150082408 | Yeh et al. | Mar 2015 | A1 |
20150117161 | Nichol | Apr 2015 | A1 |
20150133206 | Sarrafzadeh | May 2015 | A1 |
20150341074 | Saukko | Nov 2015 | A1 |
20160240100 | Rauhala | Aug 2016 | A1 |
20160243403 | Oleson | Aug 2016 | A1 |
20170080320 | Smith | Mar 2017 | A1 |
Number | Date | Country |
---|---|---|
103713746 | Apr 2014 | CN |
2239023 | Oct 2010 | EP |
2012044334 | Apr 2012 | WO |
Entry |
---|
Sidhant Gupta, Ke-Yu Chen, Matthew S. Reynolds, Shwetak N. Patel, LightWave: Using Compact Fluorescent Lights as Sensors, Sep. 17-21, 2011, 10 pages. |
Rachel M. Bainbridge, HCI Gesture Tracking Using Wearable Passive Tags, 2009, 80 pages. |
Sami Myllymaki, Capacitive Antenna Sensor for User Proximity Recognition, 2012, 60 pages. |
Thomas G. Zimmerman, Joshua R. Smith, Joseph A. Paradiso, David Allport, Neil Gershenfeld, Applying Electric Field Sensing to Human-Computer Interfaces, 1995, 8 pages. |
Anonymous, “Wired Glove”, Wikipedia, 3 pages, Sep. 27, 2013. |
European Search Report, EP 14191396, 3 pages, dated Jun. 3, 2015. |
Pu, et al., Whole-Home Gesture Recognition Using Wireless Signals (Demo), Aug. 12-16, 2013, 2 pages. |
Ma, Michelle, Wi-Fi signals enable gesture recognition throughout entire home, Jun. 4, 2013, 11 pages. |
Pu, et al., Whole-Home Gesture Recognition Using Wireless Signals, Sep. 30 to Oct. 4, 2013, 12 pages. |
Number | Date | Country | |
---|---|---|---|
20170068327 A1 | Mar 2017 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14075736 | Nov 2013 | US |
Child | 15357425 | US |