The present disclosure relates generally to a touchscreen device, and in particular, a touchscreen device with dynamic button functions based on tilt angle.
Handheld devices may include a number of physical buttons which may be located on both left and right-hand sides of the devices. When a user launches an application, such as by touching an icon, the function associated with a physical button may change based on the launched application. However, within a given application or “app”, and when the user has not touched an icon to load a new app, functions assigned to these physical buttons are typically static and do not change based on different contexts for which the devices may be used or even different orientations of the device.
Some embodiments include a system, method, computer program product, and/or combination(s) or sub-combination(s) thereof, for a handheld device with programmable physical buttons in addition to a touchscreen display. Some embodiments include a handheld device with physical buttons located outside of the touchscreen, for example on left and right sides of the wireless handheld device. Some embodiments include a controller coupled to the handheld device, configured to dynamically configure functions performed by the physical buttons based on different or changing contexts of the handheld device. Non-limiting examples of contexts include orientation states and application states, which may represent a current state of the handheld device or a transition from one state to a different state.
Further embodiments, features, and advantages of the present disclosure, as well as the structure and operation of the various embodiments of the present disclosure, are described in detail below with reference to the accompanying drawings.
The accompanying drawings, which are incorporated herein and form part of the specification, illustrate the present disclosure and, together with the description, further serve to explain the principles of the disclosure and to enable a person skilled in the relevant art(s) to make and use the disclosure.
The present disclosure will now be described with reference to the accompanying drawings. In the drawings, generally, like reference numbers indicate identical or functionally similar elements. Additionally, generally, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears.
The following Detailed Description of the present disclosure refers to the accompanying drawings that illustrate exemplary embodiments consistent with this disclosure. The exemplary embodiments will so fully reveal the general nature of the disclosure that others can, by applying knowledge of those skilled in relevant art(s), readily modify and/or adapt such exemplary embodiments for various applications, without undue experimentation, without departing from the spirit and scope of the disclosure. Therefore, such adaptations and modifications are intended to be within the meaning and range of equivalents of the exemplary embodiments based upon the teaching and guidance presented herein. It is to be understood that the phraseology or terminology herein is for the purpose of description and not of limitation, such that the terminology or phraseology of the present specification is to be interpreted by those skilled in relevant art(s) in light of the teachings herein. Therefore, the detailed description is not meant to limit the present disclosure.
The embodiment(s) described, and references in the specification to “one embodiment”, “an embodiment”, “an example embodiment”, etc., indicate that the embodiment(s) described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is understood that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
A handheld device may include one or more physical buttons. The handheld device may be implemented as a wireless handheld device that is capable of wireless communications. Typically, one button may be configured as a power button for turning the wireless handheld device on or off. Other examples of physical buttons may include a button or buttons for controlling the volume of the device. These functions are typically static in nature regardless of how the device is currently oriented, or the device orientation. For example, a left physical button (i.e., a physical button located on a left side of the device) may be hardcoded to increase the volume of the device while a right physical button (i.e., a physical button located on an opposing right side of the device) may be hardcoded to decrease the volume of the device. In a conventional implementation of a wireless handheld device, these volume functions are static and do not change even if the device orientation (e.g., tilt angle) changes. In a conventional implementation, alternate functions of physical buttons of the wireless handheld device may be activated based on whichever application is currently loaded and being executed (such as by the user loading the application with the touch of an icon). But such activations are typically static (i.e., the function of the button remains the same for the application) and also require an explicit user interaction with software on the wireless handheld device. In contrast, in some embodiments, changes to physical button function may be triggered by a change in hardware, such as a device orientation. In some embodiments, the trigger may be a change in a combination of hardware and software.
In some embodiments, device orientation may be used for determining an orientation state of the wireless handheld device. Device orientation may refer to quantitative measurements that reflect the positioning of the wireless handheld device in a three-dimensional space. Device orientation may refer to a continuous range of values (e.g., angles) representing positioning of the wireless handheld device. In some embodiments, this positioning may be based on a relationship to a reference vector, such as gravity. The wireless handheld device may include internal components for determining the quantitative measurements, which may be implemented as, for example, angles calculated between the direction of gravity and the X, Y, and Z axes of the wireless handheld device.
Some embodiments include mapping the quantitative measurements, or device orientation, to orientation states. Orientation states may be a discrete set of predefined values that are associated with a certain range or set of values represented by the device orientation. Non-limiting examples of orientation states include vertical orientation, horizontal orientation, vertical orientation with a left tilt, and vertical orientation with a right tilt. Orientation states may further be mapped to different functions to be performed by programmable physical buttons of the wireless handheld device. A controller of the wireless handheld device may dynamically update the functions of the physical buttons responsive to determining the current orientation state of the wireless handheld device.
Some embodiments include using the quantitative measurements to determine transitions between different orientation states, such as from a horizontal position to a vertical position. Detecting the extent of transitions, such as changes in angles along the X, Y, or Z axes, allows a controller of the wireless handheld device to prevent functions of the programmable physical buttons from being changed due to small shifts in orientation. Detecting transition angles allows the controller to provide a buffer, or range, for deciding when to stay within an orientation state and when to transition to a different orientation state.
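The buffering described above can be sketched as simple hysteresis logic: a wider angle threshold is required to enter a state than to remain in it, so small shifts in orientation do not toggle button functions. The following is a minimal illustrative sketch only, not the claimed implementation; the state names and threshold values are hypothetical.

```python
# Hypothetical hysteresis thresholds (degrees); entering the left-tilt
# state requires a larger tilt than is required to stay in it, which
# creates the buffer zone between 20 and 30 degrees.
ENTER_LEFT_TILT_DEG = 30
EXIT_LEFT_TILT_DEG = 20

def update_state(current_state, tilt_deg):
    """Return the next orientation state given the measured tilt angle."""
    if current_state != "left_tilt" and tilt_deg > ENTER_LEFT_TILT_DEG:
        return "left_tilt"
    if current_state == "left_tilt" and tilt_deg < EXIT_LEFT_TILT_DEG:
        return "neutral"
    # Within the buffer zone (or no threshold crossed): keep the state.
    return current_state
```

A tilt of 25° leaves either state unchanged, so jitter near the threshold does not cause repeated reconfiguration of the buttons.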
Embodiments described herein may allow for updating functions of programmable physical buttons. A benefit of this approach is to allow for fewer buttons to be implemented on the wireless handheld device since functions of a button may be updated dynamically as needed based on the orientation of the device. Fewer buttons implemented on the sides of the device reduce the cost of the device and simplify antenna design because there is less mechanical interference for an antenna signal.
Each user may want to adjust the device speaker volume per their personal preferences. It may be desirable for the adjustment mechanism to be intuitive whether the user listens with their right ear or left ear, e.g. by mapping left and right programmable buttons to volume up and volume down as discussed in connection with
As shown in
An example of configuring functions of physical buttons is shown in example implementation 100. In one orientation state with a first tilt, a button located on a left-hand side of the wireless handheld device may be configured to perform action A when the button is pressed while a button located on an opposing right-hand side may be configured to perform action B when the button is pressed. Conversely, when in a second orientation state, the left-hand side button may be reconfigured to perform action B when pressed and the right-hand side button may be reconfigured to perform action A when pressed. Actions A and B may control software functions associated with applications installed on the wireless handheld device. The reconfiguration is triggered by a change of hardware, such as a device orientation, and requires no explicit interaction with software such as by touching an icon.
In one embodiment, the software function may be a volume control and actions may be to increase or decrease the volume. Actions may be mapped to buttons based on the application. For example, during a phone call, it may be intuited that the wireless handheld device is pressed against a left ear when the device is in a vertical orientation but tilted to the left. Actions of the left and right buttons may be assigned intuitively to match the user's expectations, such as assigning the volume down action to the left button (because it is facing downward) and the volume up action to the right button (because it is facing upward). Similarly, it may be intuited that the wireless handheld device is pressed against a right ear when the device is in a vertical orientation but tilted to the right. Actions performed by the buttons may be reconfigured accordingly. With a tilt to the right, the volume up action may now be assigned to the left button (because it is facing upward) and the volume down action to the right side button (because it is facing downward). In this manner, depending on the tilt, actions performed by the side buttons may vary so that button actions remain intuitive regardless of whether the user holds the handheld device to the left ear or the right ear.
Wireless handheld device 200 may be implemented as any device that uses physical buttons to control actions of software installed on or hardware of wireless handheld device 200. Non-limiting examples of installed software may include a camera application, a media application (for playing music, video, or other media), and a phone application. Non-limiting examples of actions of software may include any functionality associated with applications installed on wireless handheld device 200 such as volume control for a media application or a phone application, a media control for a media application, and a shutter control for a camera application. For example, wireless handheld device 200 may be implemented as a mobile phone (e.g., a smart phone), a computing device (e.g., a tablet), or a similar device.
In some embodiments, camera 202 may be implemented as a rear-facing camera and a forward-facing camera. In some embodiments, camera 202 refers to either a rear-facing camera or a forward-facing camera. Camera 202 may be associated with a software installed on wireless handheld device 200 that is controlled via at least one of a right programmable button 204 or left programmable button 206. For example, right programmable button 204 may be configured to operate a shutter of camera 202 and left programmable button 206 may be configured to operate other options (e.g., flash setting, camera modes) of camera 202.
While only two buttons are illustrated in
In some embodiments, wireless handheld device 200 may include touch/display area 208 for displaying applications and receiving input from a user for launching and changing applications. Wireless handheld device 200 may include orientation sensor 210. Orientation sensor 210 may measure device orientation of the wireless handheld device with respect to a reference vector and coordinate axes, X, Y, and Z in relation to the device.
Wireless handheld device 200 may further include scanner 212. In some embodiments, scanner 212 may be a barcode scanner that emits a beam such as a laser for reading barcodes. Scanner 212 may be associated with a scanner application that is controlled via at least one of right programmable button 204 and left programmable button 206. For example, right programmable button 204 may be configured to activate the beam of scanner 212 and left programmable button 206 may be configured to operate other options of scanner 212. In another embodiment, both right and left programmable buttons 204 and 206 may be configured to activate the beam of scanner 212.
Device orientation of wireless handheld device 300 may be defined with respect to reference vector 314, and may be quantitatively specified based on angles α 316a, β 318a, and γ 320a between reference vector 314 and the X-axis 310, Y-axis 308, and Z-axis 312, respectively. The device orientation of wireless handheld device 300 may also be specified with the cosines of these angles, or cos(α), cos(β), and cos(γ) which may be interpreted as components of a vector of unit length. As a non-limiting example, if the variable “G” represents the unit vector in the direction of reference vector 314, then, with respect to the device's coordinate system, its X, Y and Z components may be given by the following expression:
G=(GX, GY, GZ)=(cos(α), cos(β), cos(γ))
Angles α 316a, β 318a, and γ 320a may represent a device orientation and may be used to define device orientation states. As non-limiting examples, when γ 320a=180° and G=(0, 0, −1), the device orientation state may be described as being in a face-up (user facing) horizontal orientation and when β 318a=180° and G=(0, −1, 0), the device orientation state may be described as being in a vertical position. An additional non-limiting example may include when γ 320a=90° and α 316a is greater than 90°, G=(GX<0, GY, 0), the device orientation state may be described as being vertical with a left tilt which may correspond to wireless handheld device 300 being held by a user with the left hand next to the left ear such as during a phone conversation. An additional non-limiting example may include when γ 320a=90° and α 316a<90°, G=(GX>0, GY, 0), the device orientation state may be described as being vertical with a right tilt which may correspond to wireless handheld device 300 being held by a user with the right hand next to the right ear during a phone conversation. These descriptions of orientation states are merely exemplary and are not intended to limit the scope of how orientation states may be associated with the measured device orientations. In addition to the orientation of wireless handheld device 300, the orientation state may be impacted by other factors such as the purpose of the software currently running, the history of past orientations, and orientation states.
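The relationship between the measured angles and the gravity unit vector G can be illustrated with a short sketch. This is an illustrative computation only; the function name is hypothetical, and the example angle values correspond to the face-up and vertical states described above.

```python
import math

def gravity_unit_vector(alpha_deg, beta_deg, gamma_deg):
    """Direction cosines of the gravity reference vector expressed in
    the device's X/Y/Z frame: G = (cos(alpha), cos(beta), cos(gamma))."""
    return (math.cos(math.radians(alpha_deg)),
            math.cos(math.radians(beta_deg)),
            math.cos(math.radians(gamma_deg)))

# Face-up horizontal: gamma = 180 degrees, so G is approximately (0, 0, -1).
face_up = gravity_unit_vector(90, 90, 180)

# Vertical: beta = 180 degrees, so G is approximately (0, -1, 0).
vertical = gravity_unit_vector(90, 180, 90)
```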
With reference to
A second application mode may refer to another capability of the software application such as a camera capability (camera mode), email capability (email mode), inventory management capability (inventory management mode), music capability (music mode). In this embodiment, the software application is described as having two different modes. But in other embodiments, the software application may have any number of modes including one (i.e., only a single mode such as a phone mode) or three or more as just described. Each of these modes may be mapped to different sets of actions. For example, in a music mode, actions may control functions for changing songs, in a camera mode, actions may control taking pictures. These may be called “function maps” and are discussed further with respect to
With reference to
Device orientation states may further be associated with the different application modes. For example, a horizontal orientation state may trigger a different application mode such as the chat mode, inventory management mode, or email mode. When the controller of wireless handheld device 300 detects a transition from the horizontal orientation state to a vertical orientation state, the controller may cause the application software to return to the first application mode (e.g., phone mode), in which first and second sets of actions may be mapped to right button 304 and left button 306 in a way that depends on device orientation. For example, in the embodiment shown in
As a non-limiting example, when wireless handheld device 300 is in a vertical position with left tilt orientation state (such as shown in
As another non-limiting example, when wireless handheld device 300 is in a vertical position with right tilt orientation state (such as shown in
This discussion of the types of actions, software functions, application modes, and orientation states is merely exemplary. Other types and combinations of functions, application mode, and orientation states are considered to be within the scope of this disclosure.
In an embodiment of
GBEFORE·GAFTER=(GXBEFORE*GXAFTER+GYBEFORE*GYAFTER+GZBEFORE*GZAFTER)≈0
In this equation, GBEFORE may represent the gravity unit vector during the horizontal device orientation state before wireless handheld device 400 is rotated. GAFTER may represent the gravity unit vector in the vertical device orientation state after wireless handheld device 400 is rotated. GX may represent the component of the gravity unit vector along the X axis, GY the component along the Y axis, and GZ the component along the Z axis. That is, GX, GY, and GZ may be the components of the gravity unit vector in the frame of reference of the wireless handheld device.
As a non-limiting example, the orientation state transition criteria might be represented as follows:
−½<GBEFORE·GAFTER<+½
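The dot-product criterion above can be illustrated with a short sketch: the dot product of the before and after gravity unit vectors is near zero only when the rotation is large (roughly 60° to 120°), so small shifts in orientation do not satisfy the criterion. The function names and threshold default are illustrative only.

```python
def dot(g_before, g_after):
    """Dot product of two 3-component gravity unit vectors."""
    return sum(b * a for b, a in zip(g_before, g_after))

def transition_criterion_met(g_before, g_after, threshold=0.5):
    """True when -1/2 < G_BEFORE . G_AFTER < +1/2, i.e. the angle
    between the two vectors is roughly between 60 and 120 degrees."""
    return -threshold < dot(g_before, g_after) < threshold

# Face-up horizontal (0, 0, -1) to vertical (0, -1, 0): a ~90 degree
# rotation, so the criterion is satisfied. A small tilt from face-up,
# e.g. to (0, -0.1, -0.995), leaves the dot product near 1 and the
# criterion unsatisfied, so button functions are not reconfigured.
```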
In some embodiments, it may be advantageous to condition transitions to one or more axes. For example, a rotation (e.g., 90°) may trigger a transition only if the rotation is about a specified axis such as the X axis. Determining that a rotation of wireless handheld device 400 is about its X axis, rather than some other axis, may be mathematically expressed using the vector cross product:
GBEFORE×GAFTER=(GYBEFORE*GZAFTER−GZBEFORE*GYAFTER, GZBEFORE*GXAFTER−GXBEFORE*GZAFTER, GXBEFORE*GYAFTER−GYBEFORE*GXAFTER)≈(±1, 0, 0)
As a non-limiting example, in addition to the orientation state transition criteria (−½<GBEFORE·GAFTER<½) discussed above, additional criteria corresponding to a requirement that the rotation is approximately about the X axis may further be specified to calculate the transition of wireless handheld device 400:
|GYBEFORE*GZAFTER−GZBEFORE*GYAFTER|>½
|GZBEFORE*GXAFTER−GXBEFORE*GZAFTER|<½
|GXBEFORE*GYAFTER−GYBEFORE*GXAFTER|<½
Similar principles may apply in embodiments in which the desired criteria include rotation about the Y or Z axis. For example, the above equations may be adapted to condition transitions on rotation about the Y axis by replacing the subscript X with Y, Y with Z, and Z with X. If A is a unit vector along any axis of interest, a 90° rotation about that axis may be recognized by the criterion |(GBEFORE×GAFTER)·A|≈1.
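The cross-product criterion can likewise be sketched in a few lines: the cross product of the before and after gravity unit vectors points along the rotation axis, so projecting it onto a candidate axis A identifies the axis of rotation. The function names are illustrative only.

```python
def cross(g_before, g_after):
    """Cross product G_BEFORE x G_AFTER of two 3-component vectors."""
    bx, by, bz = g_before
    ax, ay, az = g_after
    return (by * az - bz * ay,
            bz * ax - bx * az,
            bx * ay - by * ax)

def rotation_alignment(g_before, g_after, axis_unit):
    """|(G_BEFORE x G_AFTER) . A|: close to 1 for a ~90 degree rotation
    about the axis with unit vector A, close to 0 otherwise."""
    c = cross(g_before, g_after)
    return abs(sum(ci * ai for ci, ai in zip(c, axis_unit)))

# Face-up (0, 0, -1) to vertical (0, -1, 0) is a rotation about the
# X axis: alignment with (1, 0, 0) is ~1, with (0, 1, 0) it is ~0.
```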
In some embodiments, a return from camera functionality to barcode scanning functionality within the inventory management mode may be triggered by rotating wireless handheld device 400 from a vertical (approximately 90°) device orientation back to an approximately horizontal device orientation. The foregoing equations discuss determining transitions to different assignments of button functions based not only on the current device orientation state of wireless handheld device 400, but also on previous device orientation states. In these embodiments, device orientation states alone may determine the mapping of button actions to each physical button and the selection of application states. In other words, device orientation state is the only orientation condition for determining application mode and button action mapping.
In some embodiments, software actions are mapped to different device orientation states and the current application state of handheld device. In other words, both device orientation states and current application mode represent the orientation condition for determining the button action mapping.
In an embodiment, tables may be associated with a single application mode or multiple application modes. For example, table 500a, as shown, includes two application modes, namely a phone mode 516a and an inventory management mode 516b. In other embodiments, the software application may support only one application mode. For example, a software application may have only a phone mode; for example, the software application may be a phone application for wireless handheld device 200. Different sets of actions may be mapped to the different orientation states for the single phone mode, which would result in modification of table 500a (or a different table altogether). For example, there may be no actions mapped to left button action 504 and right button action 506 when the device orientation state is horizontal face up 510c. As another example, a different set of actions such as selecting contacts or initiating phone calls may be mapped to left button action 504 and right button action 506. As another example, table 500a may be modified so that it is associated with an application having only a single application mode (inventory management mode) in which the programmable buttons either trigger a scan or do nothing depending on whether or not the device is approximately horizontal. While table 500a reflects a table for an application having a phone mode and an inventory management mode, one of ordinary skill in the art would understand that wireless handheld device 200 may include one or more tables associated with different applications installed on wireless handheld device 200. For example, there may be one or more tables associated with a phone software application, one or more tables associated with a media software application, one or more tables associated with a camera application, etc. Each of these tables may have different function maps that map device orientation states to different actions and software functions.
For example, a table may be associated with a phone software application (e.g., the phone “app” installed with wireless handheld device 200, a phone application that has chat capability such as WhatsApp or Skype) that maps actions to one or more application modes of the phone software application. A phone application that has chat capability may have a first set of actions mapped to the phone mode of the phone software application and another set of actions mapped to the chat mode of the phone software application.
In an embodiment where two or more modes are supported within one software application, transitions between modes, and transitions of button function assignments within each mode, may be determined by device orientation and do not require the user to otherwise interact with software on wireless handheld device 200 such as by touching an icon on touch/display 208 to launch a new software application.
Returning to
Referring to table 500a, mapped actions for some embodiments are now discussed. Function map 518a may map actions for a software function to device orientation state 502 and current application state 508. In other words, device orientation state 502 and current application state 508 may represent orientation conditions for triggering the configuration of right button 204 and left button 206 to right button action 506 and left button action 504, respectively. In this embodiment of function map 518a, the software function is volume control, device orientation state 502 is vertical with left tilt 510a, and current application mode 508 is in phone application state 516a. Accordingly, in function map 518a, volume down 512a may be mapped to left button action 504 and volume up 514a may be mapped to right button action 506.
Referring again to table 500a, function map 518b may map actions for a software function to device orientation state 502 and current application mode 508. In this embodiment, the software function is volume control, device orientation state 502 is vertical with right tilt 510b, and current application mode 508 is in phone application mode 516a. Accordingly, in function map 518b, volume up 512b may be mapped to left button action 504 and volume down 514b may be mapped to right button action 506.
Function map 518c may map actions for a software function to device orientation state 502 and current application mode 508. In this embodiment, the software function is a barcode scan control, device orientation state 502 is horizontal face up 510c, and current application mode 508 is inventory management state 516b. Accordingly, in function map 518c, both left button action 504 and right button action 506 may be mapped to scan barcode 512c.
Function map 518d may map actions for a software function to device orientation state 502 and current application mode 508. In this embodiment, the software function is a camera shutter control, device orientation state 502 is vertical at approximately 90° 510d, and current application mode 508 is inventory management state 516b. Accordingly, in function map 518d, both left button action 504 and right button action 506 may be mapped to camera shutter control (take picture) 512d.
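The function maps discussed above amount to a lookup keyed on orientation state and application mode. The following is an illustrative sketch only; the dictionary-based structure, state names, and action names are hypothetical stand-ins for the table entries, not the claimed implementation.

```python
# Hypothetical function map mirroring the mappings discussed above:
# (orientation state, application mode) -> (left action, right action).
FUNCTION_MAP = {
    ("vertical_left_tilt",  "phone"):                ("volume_down", "volume_up"),
    ("vertical_right_tilt", "phone"):                ("volume_up",   "volume_down"),
    ("horizontal_face_up",  "inventory_management"): ("scan_barcode", "scan_barcode"),
    ("vertical_90",         "inventory_management"): ("take_picture", "take_picture"),
}

def button_actions(orientation_state, application_mode):
    """Return (left, right) button actions; (None, None) when no
    software function is mapped, e.g. the horizontal face-down state."""
    return FUNCTION_MAP.get((orientation_state, application_mode),
                            (None, None))
```

A controller could consult such a lookup each time the orientation state or application mode changes and reprogram the physical buttons accordingly.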
Referring again to table 500a, function map 518e may map actions for a software function to device orientation state 502 and current application state 508. In this embodiment, there is no software function when device orientation state 502 is horizontal face down 510d and no active application mode for current application mode 508. In other words, when wireless handheld device 200 is face down, such as on a table, there are no actions mapped to left button action 504 and right button action 506. This may be advantageous as a user picking up a handheld device that is face down on a table generally does not yet seek to activate any button, and yet is at risk of unintentionally pressing a button while picking up the device.
An exemplary use case is discussed. First, the store employee may use wireless handheld device 200 as a phone to communicate with fellow employees. In an embodiment, this may be done with the aid of a software application that includes a phone mode. The employee may launch the software application, such as by touching the appropriate icon, or the desired software application may be initiated as a default during device power up. Current application mode 508 is in phone mode 516a in order for wireless handheld device 200 to operate as a phone. The software application may always be in phone mode (e.g. if the application has no other modes such as inventory management mode), or may enter phone mode for certain orientation states. For example, table 500a may map the vertical with left tilt 510a orientation state to the phone mode so that the phone mode is launched when wireless handheld device 200 is placed in that device orientation state. Accordingly, when held against the store employee's ear, the handheld device is in the appropriate device orientation state and functions as a phone via the phone mode. When the device is in the appropriate device orientation state—vertical with left tilt 510a or vertical with right tilt 510b—actions of left button 206 and right button 204 are mapped to control volume, allowing the store employee to adjust the speaker volume during the phone conversation.
Continuing the example above, the store employee may finish the phone conversation, and then use wireless handheld device 200 to scan barcodes of shelved items. To maximize the efficient use and user friendliness of the active software application, it is advantageous for the application to change application modes without requiring the employee to find and select an icon presented on the touch/display area 208 in order to activate an inventory management mode that includes a software function for barcode scanning. In an embodiment, mode transitions may be triggered by a change in orientation state. For example, right button 204 and/or left button 206 may be reprogrammed so that barcode scans can be easily activated by a thumb on one of those side buttons after having entered a mode supporting a barcode scanning function, making the experience more ergonomic for the store employee.
While scanning barcodes, the store employee may hold wireless handheld device 200 at approximately horizontal device orientation so that scanner 212 is aimed at barcodes to be scanned and with touch/display area 208 facing up so that the display can be viewed by the store employee. Depending on whether the device is being held with a right hand or a left hand, use of either right button 204 or left button 206 to control the barcode software function will be more ergonomic. A barcode scan action may be assigned to both left and right side buttons when device orientation state 502 is horizontal face up 510c. Conversely, if device orientation state 502 is horizontal face down 510d, it may be assumed that the wireless handheld device is not being used as either a barcode scanner or a phone; it may be appropriate to disable the side buttons, assigning them no function to prevent accidental button touches.
While scanning barcodes of a number of identical items on a shelf, the number of items may be relevant to the inventory management application. There may be a need for the store employee to take a picture of the inventory, such as when there are multiple identical items on a shelf, so that the store employee does not need to individually scan or count all identical items. The inventory management application may further be configured to perform optical character recognition on the picture to automate counting of the items on the shelf. For this purpose, as is illustrated in
Mapped actions for this embodiment are now discussed. Function map 518e may map actions for a software function to device orientation state 502. In other words, device orientation state 502 may represent an orientation condition for triggering the configuration of right button 204 and left button 206 to right button action 506 and left button action 504, respectively. Device orientation state 502 may also trigger configuration of mode 520, which in this embodiment, relates to a phone mode of wireless handheld device 200 that not only influences the functions of programmable hardware buttons, but also affects speaker volume settings. In this embodiment of function map 518e, the software function includes volume control, and device orientation state 502 is vertical with left tilt 510a. Accordingly, in function map 518e, volume down 512a may be mapped to left button action 504, volume up 514a may be mapped to right button action 506, and mode 520 is configured to phone mode 522a.
Function map 518f may map actions for a software function to device orientation state 502. In this embodiment of function map 518f, the software function includes volume control, and device orientation state 502 is vertical with right tilt 510b. Accordingly, in function map 518f, volume down 514b may be mapped to left button action 504, volume up 512b may be mapped to right button action 506, and mode 520 is configured to phone mode 522a.
Function map 518g may map actions for a software function to device orientation state 502. In this embodiment of function map 518g, the software function is a barcode scan control and device orientation state 502 is horizontal face up 510c. Accordingly, in function map 518g, both left button action 504 and right button action 506 may be mapped to scan barcode 512c and mode 520 is configured to combined phone and inventory management mode 524a which may also include volume control functions.
Function map 518h may map actions for a software function to device orientation state 502. In this embodiment, there is no software function when device orientation state 502 is horizontal face down 510d. In other words, when wireless handheld device 200 is face down, such as on a table, there are no actions mapped to left button action 504 and right button action 506 and no mode is configured for mode 520.
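The four function maps above can be sketched as a lookup table. This is a minimal illustrative sketch only; the state names, action names, and function names below are hypothetical stand-ins for the reference numerals in the figures, not an implementation from the disclosure.

```python
# Hypothetical sketch of function maps 518e-518h: each device
# orientation state selects actions for the left and right buttons
# and an application mode. All names here are illustrative.
FUNCTION_MAPS = {
    "vertical_left_tilt": {
        "left": "volume_down", "right": "volume_up", "mode": "phone"},
    "vertical_right_tilt": {
        "left": "volume_up", "right": "volume_down", "mode": "phone"},
    "horizontal_face_up": {
        "left": "scan_barcode", "right": "scan_barcode",
        "mode": "phone_and_inventory"},
    "horizontal_face_down": {
        "left": None, "right": None, "mode": None},  # no actions mapped
}

def configure_buttons(orientation_state):
    """Return the (left action, right action, mode) for a state."""
    m = FUNCTION_MAPS[orientation_state]
    return m["left"], m["right"], m["mode"]
```

In this sketch, a change of device orientation state would simply re-run the lookup, so the same physical buttons take on different actions as the device is tilted.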
Mapped actions for this embodiment are now discussed. The values of device orientation (angle) 526 and device orientation (unit vector) 528 are merely exemplary. Table 500c may be configured to map any device orientation values to device orientation state 502 as needed based on the particular software application, particular type of wireless handheld device 200, or any other factors.
Orientation map 530a may map device orientation values to device orientation state 502. When the controller of wireless handheld device 200 detects a device orientation (angle) 526 of α>90° & 45°<γ<135°, or the mathematically equivalent device orientation (unit vector) 528 of Gx<0 & −0.707<Gz<+0.707, or any other equivalent mathematical relationship, the controller may configure device orientation state 502 to vertical with left tilt 510a.
Orientation map 530b may map device orientation values to device orientation state 502. When the controller of wireless handheld device 200 detects a device orientation (angle) 526 of α<90° & 45°<γ<135°, or equivalently a device orientation (unit vector) 528 of Gx>0 & −0.707<Gz<+0.707, the controller may configure device orientation state 502 to vertical with right tilt 510b.
Orientation map 530c may map device orientation values to device orientation state 502. When the controller of wireless handheld device 200 detects a device orientation (angle) 526 of γ>135° and/or a device orientation (unit vector) 528 of Gz<−0.707, the controller may configure device orientation state 502 to horizontal face up 510c.
Orientation map 530d may map device orientation values to device orientation state 502. When the controller of wireless handheld device 200 detects a device orientation (angle) 526 of γ<45° and/or a device orientation (unit vector) 528 of Gz>+0.707, the controller may configure device orientation state 502 to horizontal face down 510d.
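The unit-vector criteria of orientation maps 530a through 530d can be sketched as a single classification function. This is an illustrative sketch under the unit-vector thresholds stated above; the function and state names are hypothetical, and ±0.707 is simply cos(45°), i.e., the unit-vector equivalent of the 45° and 135° angle thresholds.

```python
def orientation_state(gx, gz):
    """Map a gravity unit vector (Gx, Gz) to an orientation state,
    following the unit-vector criteria of orientation maps 530a-530d.
    The +/-0.707 thresholds correspond to gamma = 45 and 135 degrees."""
    if gz < -0.707:
        return "horizontal_face_up"      # map 530c
    if gz > 0.707:
        return "horizontal_face_down"    # map 530d
    # Device is roughly vertical; the sign of Gx gives the tilt.
    if gx < 0:
        return "vertical_left_tilt"      # map 530a
    return "vertical_right_tilt"         # map 530b
```

In practice the raw accelerometer reading would be normalized to a unit vector before this classification, and the face-up/face-down tests are checked first because the tilt sign is meaningless when the device is nearly flat.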
Table 500d illustrates an example of device-orientation to orientation-state mapping with buffering. Device orientation transition angles 532 provide criteria to transition from one device orientation state to another device orientation state. Device orientation hold angles 534 provide criteria for remaining in a device orientation state. In other words, for a given orientation state, the transition angle thresholds for transitioning into the orientation state may differ from the hold angle thresholds for transitioning out of the orientation state. Table 500d may be regarded as a buffered version of table 500c. Table 500d, which maps orientations to orientation states, may work in conjunction with table 500a or table 500b, which map orientation states to button actions and software application modes. The values of device orientation transition angles 532 and device orientation hold angles 534 are merely exemplary. Table 500d may be configured to map any device orientation values to device orientation state 502 as needed based on the particular software application, particular type of wireless handheld device 200, or any other factors.
Orientation map 530e may specify values for device orientation transition angles 532 for wireless handheld device 200 to transition into the orientation state vertical with left tilt 510a and also values for device orientation hold angles 534 for wireless handheld device 200 to begin transitioning out of vertical with left tilt 510a. As a non-limiting example, if device orientation state 502 is initially vertical with right tilt 510b, and the angle α increases to satisfy α>100°, then wireless handheld device 200 may transition into orientation state vertical with left tilt 510a. Similarly, if device orientation state 502 is initially horizontal face up 510c or horizontal face down 510d, and the angle γ changes to satisfy 55°<γ<125°, then wireless handheld device 200 may transition into the orientation state vertical with left tilt 510a or the orientation state vertical with right tilt 510b, depending on whether the angle α is greater than or less than 90°. In table 500d, these transition angle criteria are represented in abbreviated form by "α>100° OR 55°<γ<125°". Once device orientation state 502 is vertical with left tilt 510a, device orientation state 502 of wireless handheld device 200 remains held in vertical with left tilt 510a as long as device orientation angles α and γ continue to satisfy both the condition α>80° and the condition 35°<γ<145°. When these conditions are no longer satisfied, then wireless handheld device 200 may initiate a transition out of vertical with left tilt 510a. In particular, when α>80° is no longer true, wireless handheld device 200 may have a right tilt and device orientation state 502 may transition to vertical with right tilt 510b; and when the condition 35°<γ<145° is no longer true (device orientation has become more horizontal than vertical), device orientation state 502 may transition to horizontal face up 510c or horizontal face down 510d.
In table 500d, these hold angle criteria are represented in abbreviated form by “α>80° AND 35°<γ<145°”. In some embodiments, there may be multiple criteria based on the current device orientation. For example, the criteria for transitioning into the vertical with left tilt 510a is α>100° if the transition is from the vertical with right tilt 510b; if the transition is from a horizontal orientation state (such as horizontal face up 510c) to a vertical state (such as vertical with left tilt 510a), then the transition criteria is 55°<γ<125°.
Orientation map 530f may specify values for device orientation transition angles 532 for wireless handheld device 200 to transition into vertical with right tilt 510b and values for device orientation hold angles 534 for wireless handheld device 200 to begin transitioning out of vertical with right tilt 510b. As a non-limiting example, if device orientation is detected to fall within α<80° OR 55°<γ<125°, then device orientation state 502 of wireless handheld device 200 may transition into vertical with right tilt 510b and may remain in this orientation state as long as the device orientation is within α<100° AND 35°<γ<145°. When these conditions are no longer true, wireless handheld device 200 may initiate a transition out of vertical with right tilt 510b. In some embodiments, there may be multiple criteria based on the current device orientation. For example, the criterion for transitioning into vertical with right tilt 510b is α<80° if the transition is from vertical with left tilt 510a; if the transition is from a horizontal orientation state (such as horizontal face up 510c) to a vertical state (such as vertical with right tilt 510b), then the transition criterion is 55°<γ<125°.
Orientation map 530g may specify values for device orientation transition angles 532 for when device orientation state 502 of wireless handheld device 200 is to transition into horizontal face up 510c and values for device orientation hold angles 534 for wireless handheld device 200 to begin transitioning out of horizontal face up 510c. As a non-limiting example, if device orientation is detected to fall within γ>145°, then wireless handheld device 200 transitions into horizontal face up 510c and remains held in that orientation state as long as the device orientation continues to meet the criterion γ>125°. When the condition γ>125° is no longer satisfied, then wireless handheld device 200 may transition out of the orientation state horizontal face up 510c. In other words, wireless handheld device 200 will not transition to the horizontal face-up orientation state until the angle γ exceeds 145°. But once in the horizontal face-up orientation state, there will be no return to a vertical orientation state until the angle γ drops below 125°.
Orientation map 530h may specify values for device orientation transition angles 532 for wireless handheld device 200 to transition into horizontal face down 510d and values for device orientation hold angles 534 for wireless handheld device 200 to begin transitioning out of horizontal face down 510d. As a non-limiting example, if device orientation is detected to fall within γ<35°, then wireless handheld device 200 transitions into horizontal face down 510d and remains in that orientation state as long as the device orientation is within γ<55°. When the condition γ<55° is no longer true, the wireless handheld device 200 may initiate a transition out of horizontal face down 510d.
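The buffered behavior of orientation maps 530e through 530h can be sketched as a small state machine in which each state's hold criteria are looser than the transition criteria for entering a neighboring state. This is an illustrative sketch using the exemplary angle values from table 500d; the function and state names are hypothetical.

```python
# Hold criteria from orientation maps 530e-530h (angles in degrees).
HOLD = {
    "vertical_left_tilt":   lambda a, g: a > 80 and 35 < g < 145,
    "vertical_right_tilt":  lambda a, g: a < 100 and 35 < g < 145,
    "horizontal_face_up":   lambda a, g: g > 125,
    "horizontal_face_down": lambda a, g: g < 55,
}

def next_state(state, alpha, gamma):
    """Buffered orientation-state update sketched after table 500d.
    Because hold thresholds are looser than transition thresholds,
    small jitter near a boundary does not flip the state repeatedly."""
    if HOLD[state](alpha, gamma):
        return state  # hold criteria still satisfied: no transition
    # Hold criteria failed: apply the transition criteria.
    if gamma > 145:
        return "horizontal_face_up"
    if gamma < 35:
        return "horizontal_face_down"
    if 55 < gamma < 125:
        # Vertical band: the angle alpha selects left vs right tilt.
        return "vertical_left_tilt" if alpha > 90 else "vertical_right_tilt"
    return state  # inside the buffer zone: keep the current state
```

For example, a device held vertical with left tilt at α=85° stays in that state (the hold criterion α>80° is met), even though α=85° would not satisfy the α>100° transition criterion for entering it; only when α falls to 80° or below does the state change.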
As a non-limiting example, an embodiment may include a handheld device in the shape of a cube with a physical programmable button on each of its six sides. The handheld device may further include an orientation sensor. The six sides of the handheld device may be numbered 1 through 6. The device may have six orientation states corresponding to which of the six sides is facing most upward relative to the orientation of the other sides. For example, orientation state 1 may correspond to side 1 being uppermost and orientation state 2 corresponds to side 2 being uppermost. The device may be programmed so that in orientation state N (where N is 1, 2, 3, 4, 5 or 6), only the physical button on side N is active while all other buttons are inactive. The device may implement a counter in which a count is incremented each time the active physical button (the one on top) is depressed. Such a counting device could be used by a tour guide counting the number of tourists returning to a tour bus by depressing the top button each time a tourist enters a bus. In this implementation, it would not matter which side of the device is facing upward because each of the physical buttons may be implemented to take the same action (taking a count) and any physical button on the uppermost side will be active.
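The cube-shaped counting device described above can be sketched in a few lines. This is an illustrative sketch only; the class and method names are hypothetical.

```python
class CubeCounter:
    """Sketch of the six-sided counting device: only the button on the
    uppermost side is active, and any active-button press increments
    a single shared count."""
    def __init__(self):
        self.count = 0
        self.up_side = 1  # side currently facing upward (1-6)

    def on_orientation_change(self, up_side):
        # Orientation state N means side N is uppermost.
        self.up_side = up_side

    def press(self, side):
        """Register a press on the button of the given side; presses
        on sides other than the uppermost one are ignored."""
        if side == self.up_side:
            self.count += 1
```

As the text notes, the user need not care which side faces up: whichever button is on top is the active one, and all six buttons map to the same counting action.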
Accordingly, the handheld device of
Moreover, although
Referring to
At step 604, the current device orientation may be mapped to a current orientation state, which may be one of a discrete set of possible orientation states such as those described above with respect to
At step 606, actions may be mapped to the physical buttons of wireless handheld device based on the current orientation state. Examples of these mappings are described above with respect to
It is to be understood that method 600a may include additional optional steps not explicitly shown in
Referring to
At step 614, the controller may recall a prior orientation state. Prior orientation states may be stored in a memory of wireless handheld device 200. A prior orientation state may refer to any prior orientation states of wireless handheld device 200 including the orientation state immediately preceding the current orientation state. For example, if a user shifts wireless handheld device 200 from a horizontal face down orientation state to a vertical with a right tilt orientation state, then the prior orientation state is the horizontal face down orientation state and the current orientation state is the vertical with a right tilt orientation state.
In an embodiment, the device orientation state is not stored (or determined) unless wireless handheld device 200 remains in that orientation for a threshold amount of time. This condition may be useful to prevent storing every orientation state as wireless handheld device 200 transitions from one orientation to another (i.e., there may be any number of intervening orientations or orientation states). In the example discussed above, as wireless handheld device 200 transitions from horizontal face down orientation state to vertical with a right tilt orientation state, it may first transition to a horizontal face up orientation state (i.e., user turns wireless handheld device 200 over), and then to vertical at approximately 90° (i.e., user begins to bring the phone up), and then finally to the vertical with a right tilt orientation state. In an embodiment, only the beginning and end orientation states are stored; this may be determined by a threshold period of time that the device remains in the orientation state. In another embodiment, all orientation states are stored along with the period of time that wireless handheld device remained in that orientation state.
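The time-threshold condition described above can be sketched as a small recorder that commits an orientation state to the history only after the device has remained in it long enough. This is an illustrative sketch; the class name, the 0.5-second threshold, and the injected clock are all hypothetical choices, not values from the disclosure.

```python
import time

class OrientationHistory:
    """Sketch: store an orientation state only after the device has
    remained in it for a threshold time, so brief intervening states
    during a transition are not recorded."""
    def __init__(self, threshold_s=0.5, clock=time.monotonic):
        self.threshold_s = threshold_s
        self.clock = clock        # injectable for testing
        self.stored = []          # committed orientation states
        self._pending = None      # state the device is currently in
        self._since = None        # time the pending state was entered

    def update(self, state):
        """Call on every orientation-state reading."""
        now = self.clock()
        if state != self._pending:
            # New state observed: restart the dwell timer.
            self._pending, self._since = state, now
        elif now - self._since >= self.threshold_s and (
                not self.stored or self.stored[-1] != state):
            self.stored.append(state)  # held long enough: commit once
```

In the example from the text, the brief pass through horizontal face up and the intermediate vertical orientation would restart the dwell timer each time and never be committed, leaving only the beginning (horizontal face down) and end (vertical with right tilt) states in the history.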
At step 616, the controller may determine the current orientation state based on both the current device orientation and the previously determined orientation state. Once the current orientation state has been determined, at step 618, the controller may provide the mapping of button actions to the physical buttons (e.g., right button 204, left button 206) based on the current orientation state as discussed above with respect to step 606.
Referring to
At step 622, the controller of wireless handheld device 200 may determine the current device orientation. At step 624, the controller of wireless handheld device 200 may recall one or more previously determined device orientations. Prior device orientations may be stored in a memory of wireless handheld device 200. A prior device orientation may refer to any prior device orientation of wireless handheld device 200, including the device orientation immediately preceding the current device orientation. Device orientations may be stored in memory in a manner similar to that discussed with respect to orientation states above.
At step 626, controller may determine a current orientation state based on both prior and current device orientations. Once the current orientation state has been determined, at step 628, the controller may provide a mapping of button actions based on the current orientation state.
Referring to
Referring to
At step 642, the controller of wireless handheld device 200 may determine a current device orientation (similar to step 631 of
Various embodiments can be implemented, for example, using one or more well-known computer systems, such as computer system 700 shown in
Computer system 700 includes one or more processors (also called central processing units, or CPUs), such as a processor 704. Processor 704 is connected to communication infrastructure 706 (e.g., a bus). One or more processors 704 may each be a graphics processing unit (GPU). In an embodiment, a GPU is a processor that is a specialized electronic circuit designed to process mathematically intensive applications. The GPU may have a parallel structure that is efficient for parallel processing of large blocks of data, such as mathematically intensive data common to computer graphics applications, images, videos, etc. Computer system 700 also includes user input/output device(s) such as monitors, keyboards, pointing devices, etc., that communicate with communication infrastructure 706 through user input/output interface(s) 702.
Computer system 700 also includes a main or primary memory 708, such as random access memory (RAM). Main memory 708 may include one or more levels of cache. Main memory 708 has stored therein control logic (i.e., computer software) and/or data. Computer system 700 may also include one or more secondary storage devices or memory 710. Secondary memory 710 may include, for example, a hard disk drive 712 and/or a removable storage device or drive 714. Removable storage drive 714 may be a floppy disk drive, a magnetic tape drive, a compact disk drive, an optical storage device, tape backup device, and/or any other storage device/drive.
Removable storage drive 714 may interact with a removable storage unit 718. Removable storage unit 718 includes a computer usable or readable storage device having stored thereon computer software (control logic) and/or data. Removable storage unit 718 may be a floppy disk, magnetic tape, compact disk, DVD, optical storage disk, and/or any other computer data storage device. Removable storage drive 714 reads from and/or writes to removable storage unit 718 in a well-known manner.
According to an exemplary embodiment, secondary memory 710 may include other means, instrumentalities or other approaches for allowing computer programs and/or other instructions and/or data to be accessed by computer system 700. Such means, instrumentalities or other approaches may include, for example, a removable storage unit 722 and an interface 720. Examples of the removable storage unit 722 and the interface 720 may include a program cartridge and cartridge interface (such as that found in video game devices), a removable memory chip (such as an EPROM or PROM) and associated socket, a memory stick and USB port, a memory card and associated memory card slot, and/or any other removable storage unit and associated interface.
Computer system 700 may further include a communication or network interface 724. Communication interface 724 enables computer system 700 to communicate and interact with any combination of remote devices, remote networks, remote entities, etc. (individually and collectively referenced by reference number 728). For example, communication interface 724 may allow computer system 700 to communicate with remote devices 728 over communications path 726, which may be wired, and/or wireless, and which may include any combination of LANs, WANs, the Internet, etc. Control logic and/or data may be transmitted to and from computer system 700 via communication path 726.
In an embodiment, a tangible, non-transitory apparatus or article of manufacture comprising a tangible, non-transitory computer useable or readable medium having control logic (software) stored thereon is also referred to herein as a computer program product or program storage device. This includes, but is not limited to, computer system 700, main memory 708, secondary memory 710, and removable storage units 718 and 722, as well as tangible articles of manufacture embodying any combination of the foregoing. Such control logic, when executed by one or more data processing devices (such as computer system 700), causes such data processing devices to operate as described herein.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the disclosure. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the disclosure. Thus, the foregoing descriptions of specific embodiments of the disclosure are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the disclosure to the precise forms disclosed; obviously, many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to best utilize the disclosure and various embodiments with various modifications as are suited to the particular use contemplated. It is intended that the following claims and their equivalents define the scope of the disclosure.
Based on the teachings contained in this disclosure, it will be apparent to persons skilled in the relevant art(s) how to make and use embodiments of the disclosure using data processing devices, computer systems and/or computer architectures other than that shown in
It is to be appreciated that the Detailed Description section, and not the Abstract section, is intended to be used to interpret the claims. The Abstract section may set forth one or more, but not all, exemplary embodiments of the disclosure, and thus is not intended to limit the disclosure and the appended claims in any way.
The disclosure has been described above with the aid of functional building blocks illustrating the implementation of specified functions and relationships thereof. The boundaries of these functional building blocks have been arbitrarily defined herein for the convenience of the description. Alternate boundaries may be defined so long as the specified functions and relationships thereof are appropriately performed.
It will be apparent to those skilled in the relevant art(s) that various changes in form and detail can be made therein without departing from the spirit and scope of the disclosure. Thus the disclosure should not be limited by any of the above-described exemplary embodiments. Further, the claims should be defined only in accordance with their recitations and their equivalents.