The present disclosure relates generally to a multi-functioning pointing device. More specifically, the present disclosure describes a circular, hand-held stress mouse that may be used as a computing input device or as a stress ball.
A computing mouse (“mouse”) is an input device used to control a pointer in order to manipulate visible data on a display screen. The pointer is a small symbol that acts as an interface between the mouse and the display screen and is displayed on the display screen to simulate the movements and actions of the mouse. A user typically operates a mouse by rolling it along a hard, flat surface. Once the mouse detects such movements, it sends a signal to the computing device to display the movement of the pointer on the display screen. However, repeated use of the mouse may contribute to computing-related health issues affecting the hands, wrists, arms, and neck, among other parts of the body.
Certain examples are described in the following detailed description and in reference to the drawings, in which:
The same numbers are used throughout the disclosure and the figures to reference like components and features. Numbers in the 100 series refer to features originally found in FIG. 1, numbers in the 200 series refer to features originally found in FIG. 2, and so on.
One of the main goals of a computing mouse is to translate the motion of a human user's hand into command signals. Specifically, the mouse transmits the command signals to a computing device to control the movement of a pointer on a display screen. The mouse may be used to manipulate data on the display screen through functions such as pointing, selecting, searching, dragging, and highlighting, among others. For example, the mouse may enable a user to switch between computing applications, select options and buttons displayed on the display screen, move between and select links on a website, and carry out many other tasks that may be difficult to perform using a keyboard or other types of pointing devices.
However, use of the mouse may induce ergonomics-related injuries due to excessive and repetitive hand movements. As an example, a Repetitive Stress Injury (“RSI”) is a common type of ergonomics-related injury associated with use of a conventional computing mouse. RSI often occurs due to the repeated movement, lack of movement, or both, of the muscles and joints in the hand, wrist, arm, neck, back, and shoulder. Embodiments described herein enable a circular, hand-held stress mouse that can be used in numerous locations and positions and subjected to frequent changes in hand movements. Additionally, the mouse may be disabled and used as a stress ball to eliminate or reduce body stress and tension.
The system 100 may include various sensors to detect the hand movements. In particular, the system 100 may include pressure sensors 102, motion sensors 104, temperature sensors 106, accelerometers 108, and compass sensors 110, among other electrical sensors. Each sensor may aid in detecting hand movements that act upon the mouse 101. The hand movements that may act upon the mouse 101 may include tapping, rolling, bouncing, swiping, elevating, and squeezing, among others. In some examples, the hand movements may be combined with algorithms, such as behavioral algorithms, to increase detection and conversion accuracy of the movements. Moreover, the algorithms may embody the rules of logic for controlling actions of the mouse 101. In this manner, the mouse may be programmed to meet the needs of the user across different form factors.
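As a non-limiting illustration, the combination of sensor readings and behavioral rules may be pictured as in the following sketch; the event fields, thresholds, and gesture names are assumptions made for illustration rather than features of any particular implementation.

    # Illustrative sketch only: field names and thresholds are assumptions.
    from dataclasses import dataclass

    @dataclass
    class SensorEvent:
        pressure: float       # normalized squeeze pressure, 0.0-1.0
        motion: float         # magnitude of detected motion
        temperature_c: float  # surface temperature in degrees Celsius
        accel_g: float        # acceleration magnitude in g

    def classify_gesture(event: SensorEvent) -> str:
        """Map one combined sensor reading to a coarse hand movement."""
        if event.pressure > 0.8:
            return "squeeze"
        if event.accel_g > 1.5 and event.motion > 0.5:
            return "bounce"
        if event.motion > 0.5:
            return "roll"
        if event.pressure > 0.2:
            return "tap"
        return "idle"

    print(classify_gesture(SensorEvent(pressure=0.9, motion=0.1,
                                       temperature_c=33.0, accel_g=1.0)))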
The pressure sensors 102 and the motion sensors 104 may monitor the mouse 101 to detect the hand movements, including finger movements, and hand gestures. The acceleration sensors, such as an accelerometer 108, may measure and identify the orientation of the mouse 101, such as an upward orientation or a downward orientation, and the acceleration of the hand movements. In other configurations, the mouse 101 may include a compass sensor 110 to increase the accuracy of the accelerometer 108 by detecting additional directional movements.
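As one hedged sketch, an upward or downward orientation may be inferred from the vertical axis of the accelerometer 108, with the compass sensor 110 refining the direction of travel; the axis convention and thresholds below are assumptions.

    # Sketch: infer orientation and heading; the axis convention is an assumption.
    def orientation_from_accel(accel_z_g: float) -> str:
        """Positive z roughly aligned with gravity means the mouse faces upward."""
        if accel_z_g > 0.5:
            return "upward"
        if accel_z_g < -0.5:
            return "downward"
        return "sideways"

    def heading_label(compass_degrees: float) -> str:
        """Convert a compass reading into a coarse direction of movement."""
        directions = ["north", "east", "south", "west"]
        return directions[int(((compass_degrees % 360) + 45) // 90) % 4]

    print(orientation_from_accel(0.98), heading_label(172.0))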
In order to distinguish between a touch by a human and an inanimate object, the mouse 101 may use a temperature sensor 106 to measure the temperature of the object that touches the mouse 101. The temperature sensor 106 may be programmed to detect a human touch based on a specified temperature range, for example, a range of normal body temperatures of a human. Additionally, when the user is subjected to cold temperatures, the temperature sensor 106 may be calibrated to account for variations associated with normal body temperatures and for ambient temperature conditions. The various sensors may be configured in sequence to detect a human touch. For example, the sequence of the sensors may include detection by a motion sensor, detection by a temperature sensor, and detection by a pressure sensor, among other combinations. The sequence of sensors may further distinguish between a human touch and an inanimate object.
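The sequential check described above (motion, then temperature, then pressure) may be sketched as follows; the temperature band and pressure threshold are assumed values for illustration only.

    # Sketch of a motion -> temperature -> pressure sequence; limits are assumptions.
    HUMAN_TEMP_RANGE_C = (30.0, 40.0)   # assumed band for skin contact
    MIN_TOUCH_PRESSURE = 0.05           # assumed minimum pressure for a touch

    def is_human_touch(motion_detected: bool, temperature_c: float,
                       pressure: float,
                       ambient_offset_c: float = 0.0) -> bool:
        """Run the sensor checks in sequence and require all of them to pass."""
        if not motion_detected:
            return False
        low, high = HUMAN_TEMP_RANGE_C
        # Calibration for cold surroundings shifts the acceptable band downward.
        if not (low - ambient_offset_c) <= temperature_c <= high:
            return False
        return pressure >= MIN_TOUCH_PRESSURE

    print(is_human_touch(True, 31.5, 0.2))                          # True
    print(is_human_touch(True, 21.0, 0.2))                          # False: too cold for skin
    print(is_human_touch(True, 27.5, 0.2, ambient_offset_c=5.0))    # True after calibration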
Each sensor may transmit data related to the hand movements to a device controller unit 112 electrically connected to each sensor. The device controller unit 112 may translate the hand movement into a mouse action. The mouse action may include a point action, a move action, a select action, an access-menu action, or a zoom action, among others. Each type of action may correspond with a particular hand movement. Further, each action may describe the visual changes associated with the pointer, as displayed on the display screen. In examples, the manufacturer or end-user may implement additional hand movements and mouse actions, at their discretion.
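One way to picture the translation performed by the device controller unit 112 is as a lookup from a detected hand movement to a mouse action, as in the sketch below; the particular pairings are illustrative examples and not a fixed assignment.

    # Sketch of the controller's movement-to-action translation; pairings are examples.
    MOVEMENT_TO_ACTION = {
        "tap": "point",
        "tap+roll": "move",
        "double-tap": "select",
        "two-finger double-tap": "access-menu",
        "squeeze": "zoom",
    }

    def translate(hand_movement: str) -> str:
        """Translate a hand movement into a mouse action, defaulting to no action."""
        return MOVEMENT_TO_ACTION.get(hand_movement, "none")

    print(translate("tap"))        # point
    print(translate("tap+roll"))   # move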
The device controller unit 112 may transmit the mouse actions to a central processing unit (CPU) 114 that may be adapted to execute the mouse actions. The CPU 114 can be a single core processor, a multi-core processor, a computing cluster, or any number of other configurations. The device controller unit 112 and the CPU 114 may be electrically coupled via a system bus 116 to control the transmission of the mouse actions from the device controller unit 112 to the CPU 114. The CPU 114 may execute the mouse actions to determine the type of mouse action. For instance, the user may initiate a hand movement, such as tapping on the mouse, which is sensed by the sensors. The device controller unit 112 may receive the data from the sensors and may translate it to a particular type of mouse action, for example a point mouse action. In this way, the point mouse action may correspond with the hand movement of tapping on the mouse 101. Accordingly, the hand movement of tapping on the mouse 101 with a finger may be used to carry out the mouse action of pointing to an item on a display screen.
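Execution of a translated mouse action may then amount to dispatching the action to a corresponding handler, as in the hedged sketch below; the handler names and payload fields are placeholders.

    # Sketch of executing a translated mouse action; handlers are placeholders.
    def handle_point(payload):
        return f"pointer placed at {payload.get('coords')}"

    def handle_move(payload):
        return f"item dragged by {payload.get('delta')}"

    ACTION_HANDLERS = {
        "point": handle_point,
        "move": handle_move,
    }

    def execute_action(action: str, payload: dict) -> str:
        handler = ACTION_HANDLERS.get(action)
        return handler(payload) if handler else "unsupported action"

    print(execute_action("point", {"coords": (120, 45)}))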
The system bus 116 may couple the CPU 114 to a memory device 118, which may also store mouse actions that are executable by the CPU 114. The memory device 118 can include random access memory (RAM), read only memory (ROM), flash memory, or any other suitable memory systems. Further, the mouse 101 may include a power source 120 to power the mouse 101. For example, the power source 120 may include a rechargeable source, such as a rechargeable battery located within the mouse 101.
The mouse may communicate with a computing device 122 having a display screen 124 using various technologies. For instance, the system 100 may include Bluetooth technology, in which case the system 100 may include a Bluetooth antenna 126 to pair the mouse 101 with the computing device 122. The system 100 may also include WiFi capabilities and thus may include a wireless antenna 128 built into the mouse 101. The Bluetooth antenna 126, the wireless antenna 128, or any other type of wireless transmission technology may transmit the mouse actions to the computing device 122. The computing device 122 may include a laptop computer, a desktop computer, or a tablet computer, among others. The display screen 124 may be integrated into the computing device or may be an external display, projector, television, and the like.
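Regardless of how the radio link is realized (Bluetooth pairing or WiFi), the payload handed to the link may be pictured as a small serialized message, as in the sketch below; the message layout and the send callback are assumptions rather than a defined protocol.

    # Sketch of packaging a mouse action for wireless transmission.
    # The message layout is an assumption; the transport (Bluetooth or WiFi)
    # is abstracted behind the send_bytes callable.
    import json
    import time

    def encode_action(action: str, payload: dict) -> bytes:
        message = {"action": action, "payload": payload, "timestamp": time.time()}
        return json.dumps(message).encode("utf-8")

    def transmit(action: str, payload: dict, send_bytes) -> None:
        send_bytes(encode_action(action, payload))

    # Example with a stand-in transport that simply prints the frame.
    transmit("point", {"coords": [120, 45]}, lambda frame: print(frame))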
The computing device 122 with the display screen 124 may include a graphics processing unit (GPU) 130. The GPU 130 may be configured to perform any number of graphics operations related to the mouse 101 by analyzing the mouse actions. Following the previous example, the CPU 114 may transmit the point mouse action to the computing device 122, in particular, to the GPU 130. The GPU 130 may use the data related to the point mouse action to render a graphic image based on that data.
A transmitter 132, located in the computing device 122, may encode and transmit the graphic image rendered by the GPU 130 to a display interface 134. The display interface 134 may enable signals related to the graphic image to be rendered to the display screen 124. In particular, the display interface 134 may use the graphic image rendered by the GPU 130 to move a pointer on the display screen 124. The movement of the pointer on the display screen 124 may be based on the coordinates of a specific type of mouse action. Using the previous example, if the graphic image is based on a point mouse action, the pointer may be displayed on the display screen 124 as carrying out actions, such as pointing to and selecting an item on the screen 124.
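On the receiving side, updating the pointer from a move action may reduce to applying the reported displacement and clamping the result to the bounds of the display screen 124; the sketch below assumes a simple delta-based coordinate scheme and an assumed screen resolution.

    # Sketch: apply a move action's displacement to the pointer; bounds are assumed.
    SCREEN_WIDTH, SCREEN_HEIGHT = 1920, 1080

    def move_pointer(x: int, y: int, dx: int, dy: int):
        """Return the new pointer position, clamped to the display screen."""
        new_x = min(max(x + dx, 0), SCREEN_WIDTH - 1)
        new_y = min(max(y + dy, 0), SCREEN_HEIGHT - 1)
        return new_x, new_y

    print(move_pointer(100, 200, 35, -20))   # (135, 180)
    print(move_pointer(10, 10, -50, -50))    # clamped to (0, 0)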
The mouse 200 may contain numerous electronic components, such as the sensors 102, 104, 106, 108, 110 and the power source 120, among others. The various sensors may be placed within the pliable material 203 to adequately detect human user contact, such as hand movements and finger movements. In some embodiments, the various sensors may be flexible, thin, and customizable so as to be substantially embedded within the pliable material 203.
The CPU 114 and the memory 118, among other electronic components of the mouse 200, may be mounted on a circuit board 204. In some embodiments, the circuit board 204 may be flexible, thin, and customizable so as to be substantially embedded within the pliable material 203.
The electronic components may be located directly beneath an external surface 206 of the mouse 200 so as not to be detectable either visibly or physically. For example, the electronic components, as shown in
The pliable material 203, or any other type of soft, pliable material, may be used to cover the external surface 206 of the mouse 200 to provide an external covering. In other implementations, the electronic components may be mounted within an internal frame structure that may be embedded within the pliable material 203, as will be further described.
The mouse 200 may be disabled and thus, rendered inoperable as a computing mouse. In the present examples, a pressure detected upon the mouse 200 beyond a predetermined pressure may signal that the user desires to disable the computing mouse functions. In other examples, inactivity of the mouse 200 for a predetermined time may power down and disable the mouse 200 from further use as a computing mouse.
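The two disabling triggers described above may be sketched as a check run on each sensor update, as shown below; the pressure limit and inactivity timeout are assumed values.

    # Sketch of the two disable triggers; threshold and timeout are assumptions.
    import time
    from typing import Optional

    DISABLE_PRESSURE = 0.9        # assumed pressure beyond normal use
    INACTIVITY_TIMEOUT_S = 300    # assumed five-minute idle window

    def should_disable(pressure: float, last_activity_s: float,
                       now_s: Optional[float] = None) -> bool:
        """Disable mouse mode on a hard squeeze or after prolonged inactivity."""
        now_s = time.monotonic() if now_s is None else now_s
        if pressure > DISABLE_PRESSURE:
            return True
        return (now_s - last_activity_s) > INACTIVITY_TIMEOUT_S

    print(should_disable(pressure=0.95, last_activity_s=0.0, now_s=1.0))    # True
    print(should_disable(pressure=0.1, last_activity_s=0.0, now_s=400.0))   # True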
Once the mouse 200 is disabled, it may be used as a stress ball. The stress ball may be squeezed by a hand or manipulated by fingers to either relieve stress and muscle tension or to exercise the muscles of the hand, among other benefits. In some embodiments, the use of the mouse 200 can be restricted within a predetermined distance away from the computing device 122. An alert may trigger when the mouse 200 moves beyond the predetermined distance.
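The distance restriction may be enforced with a check such as the sketch below; how the distance is estimated (for example, from radio signal strength) is left open, and the limit is an assumed value.

    # Sketch of the out-of-range alert; the distance estimate and limit are assumptions.
    MAX_DISTANCE_M = 10.0

    def check_range(estimated_distance_m: float, alert) -> bool:
        """Trigger the alert callback when the mouse strays beyond the limit."""
        if estimated_distance_m > MAX_DISTANCE_M:
            alert(f"mouse is {estimated_distance_m:.1f} m from the computing device")
            return False
        return True

    check_range(12.5, alert=print)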
The internal frame structure 302 may be made of a plastic material or any other type of material that may be flexible and bendable, yet substantially durable, so as to withstand repeated stresses exerted by a user. The electronic components, including the sensors 102, 104, 106, 108, 110, the power source 120, and the circuit board 204, may be housed within the inner surface 304 of the internal frame structure 302. As previously discussed, the sensors and the circuit board 204 may be flexible.
The internal frame structure 302, along with the components, may be embedded within the pliable material 203 so as to substantially cover the structure 302 and to surround the components. The pliable material 203 may cover the outer surface 306 of the structure 302 to provide an external covering. The pliable material 203 may be substantially strong and durable to withstand repeated stresses during normal usage.
The mouse of the present disclosure may be formed into any shape that can roll on a surface. The surface may include either a hard surface or a soft surface, such as a desk or the hand of a user. The mouse of the present disclosure, regardless of the shape, may include the internal frame structure 302, based on design specifications.
As shown, Table I provides a list of the various mouse actions, a corresponding hand movement for each action, and a description of the functions that each mouse action may provide. Each mouse action is further described in detail.
Point and Select Action
By carrying out a point and select hand movement upon the mouse, a user may point to an item displayed on a display screen by moving a pointer to a certain location on the screen and then selecting the item. The mouse action of pointing and selecting may include the user tapping on the stress mouse with a finger or bouncing the mouse on any surface to select an item or choose a command on the display screen. In this way, the point and select action may alert the computing device that the user is making a selection of an item on the display screen. The point and select action may highlight an item in a window, activate buttons in a dialog box, or produce a menu on the display screen, among other actions.
Move Action
A user may move an item displayed on a display screen with the mouse by dragging the item across the screen to a desired location. The user of the mouse may use several hand movements to carry out the move action. For example, the user may tap on the mouse with a finger to select, then roll the mouse on any surface. Likewise, the user could bounce the mouse on any surface, then roll the mouse. The surface can include a hard surface, such as the top of a desk or the arm of a chair, or a soft surface, such as the user's hand. By performing the move action on the mouse, the user can, for example, move files and folders to different windows and move icons around on the screen.
Select Action
Using a hand movement corresponding with the select action, the user may access an item on the display screen. Moreover, the same hand movement may open or close a window on the display screen without accessing a menu. The mouse action of selecting may include the user tapping twice in rapid succession on the mouse with a finger. Additionally, the user may tap the mouse twice, on any surface, in rapid succession. In some examples, the user may use the select action to open a new window in a word processing application without initially accessing and opening a menu and thereafter selecting the “open new window” option.
Access Menu Action
A user may access a drop-down menu to choose a command using a hand movement corresponding with the access-menu action. The mouse action of accessing a menu may include the user tapping twice with two fingers on the mouse while simultaneously resting the mouse on a surface. Additionally, the user may tap twice with two fingers on the mouse while simultaneously holding the mouse in a hand. A user may use the access-menu action to activate a menu displayed on the screen and select a command from the menu. For example, if a user desires to change the font size of text, the user can carry out the access-menu action to access and open the font menu and thereafter select the desired font size.
Zoom Action
By carrying out a hand movement upon the stress mouse, a user may zoom in and out of a display screen. Specifically, the user may provide a single-squeeze, press the stress mouse on any surface, or swipe the fingers together over the surface of the mouse. Such hand movements may zoom out of the display screen to reduce the viewing size of the display. The mouse action of zoom-in may include a double-squeeze or a swipe of the surface of the mouse with the fingers apart. For example, the user may perform the zoom-in hand movement to see a more detailed, enlarged view of the content on the display screen.
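Taken together, the mouse actions described above amount to a mapping such as the sketch below; the gesture strings are shorthand for the hand movements in this section and are not a fixed vocabulary.

    # Sketch consolidating the hand movements and mouse actions described above.
    GESTURE_TABLE = {
        "tap with a finger":                                "point and select",
        "bounce on a surface":                              "point and select",
        "tap then roll on a surface":                       "move",
        "bounce then roll on a surface":                    "move",
        "double-tap with a finger":                         "select",
        "double-tap on a surface":                          "select",
        "two-finger double-tap, mouse on a surface":        "access menu",
        "two-finger double-tap, mouse in the hand":         "access menu",
        "single-squeeze, press, or fingers-together swipe": "zoom out",
        "double-squeeze or fingers-apart swipe":            "zoom in",
    }

    for gesture, action in GESTURE_TABLE.items():
        print(f"{gesture:50} -> {action}")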
At block 804, at least one sensor may be positioned within the mouse housing to detect the hand movements that may act upon the mouse. For example, a number of pressure sensors may be located in various areas of the mouse to sense pressure from a human hand or fingers. In particular, the pressure sensors may be embedded in the pliable material. At block 806, at least one controller may be positioned within the mouse housing to interpret and translate the hand movement into a corresponding mouse action. The at least one controller may be embedded in the pliable material. As an example of a hand movement, a user may tap on the mouse followed by rolling the mouse. The controller can translate the hand movement to determine the mouse action that corresponds with the movement. Following the previous example, the hand movement may correspond with the move mouse action. In some examples, additional hand movements may be added or the existing hand movements reconfigured to provide other types of mouse actions.
A processor may execute the mouse action and transmit information about the mouse action to a transmitter. At block 808, a transmitter may be positioned within the mouse housing to transmit the mouse actions to a computing device with a display screen. Based on the information received by the computing device, the display screen may display a pointer that imitates the mouse actions.
Various software components may be stored on one or more computer-readable media 902, as shown in FIG. 9.
Examples may include subject matter such as systems to use a computing mouse in numerous locations and with various hand movements that provide an interactive approach when operating the mouse. The hand movements may change frequently with minimal to no limitations on the user. The computing mouse may be disabled and further used as a stress ball to reduce or alleviate tension.
Example 1 is a computing mouse. The computing mouse includes one or more sensors configured to detect at least one hand movement acting upon the mouse. The computing mouse includes one or more controllers configured to translate the at least one hand movement into a mouse action. The computing mouse may include a transmitter to transmit the mouse action to a computing device. The computing mouse may further include a mouse housing comprising a soft, pliable material, where the sensors, the controllers, and the transmitter are embedded within the soft, pliable material. The computing mouse may include an external covering to cover the mouse housing, where the external covering comprises a soft, pliable material.
The at least one hand movement may include tapping, rolling, bouncing, swiping, or squeezing, in any combination thereof. The at least one hand movement may include manipulation of the mouse with a human hand, human fingers, or both, in any combination thereof.
The mouse action may include a point action, a move action, a select action, an access-menu action, or a zoom action. The sensors include pressure sensors and motion sensors configured to detect the at least one hand movement. The sensors include temperature sensors configured to distinguish between a human touch and an inanimate object. The sensors may include acceleration sensors and compass sensors configured to detect a position of the mouse.
The computing mouse may include a shape of the mouse that is configured to be spherical, oval, elliptical, or cylindrical, and where the mouse is configured to roll on a surface. The computing mouse may be configured to roll on a hard surface, a soft surface, or both.
The computing mouse may be configured to be used as a stress-ball when detection of the at least one hand movement is disabled. The computing mouse may be configured to operate as a wireless computing mouse.
The computing mouse may include a mouse housing that includes an internal frame structure made of prism segments.
Example 2 is a system including a pointing device and a computing device. The pointing device may include one or more sensors to detect at least one hand movement acting upon the pointing device. The pointing device may include one or more controllers configured to translate the at least one hand movement into a pointing device action. The pointing device may include a transmitter to transmit the pointing device action. The pointing device may include a housing comprising a soft, pliable material, where the sensors, the controllers, and the transmitter are embedded within the housing.
The computing device may be configured to receive the pointing device action via a wireless technology from the transmitter.
The system may include the wireless technology that may include Bluetooth technology or computer networking (Wi-Fi) technology.
The system may include the pointing device action displayed on a display screen that may correspond to the at least one hand movement acting upon the pointing device.
The system may include the pointing device that may be a computing mouse. The system may include the pointing device that may be a stress ball when the pointing device is electrically disabled.
Example 3 is a method for manufacturing a hand-held stress mouse. The method may include configuring a pliable material into a circular shape to form a housing. The method may include positioning at least one sensor within the housing to detect at least one hand movement. The method may include positioning at least one controller within the housing to translate the at least one hand movement into a mouse action. The method may further include positioning at least one transmitter within the housing to transmit the mouse action, via a wireless technology, to a computing device, where the computing device displays the mouse action on a display screen.
The method may include a tap on the mouse housing with a finger or a bounce of the mouse housing on a surface that may be configured to simulate a point-and-select mouse action.
The method may include a tap on the mouse housing with a finger followed by a roll of the mouse housing on a surface that may be configured to simulate a move mouse action.
The method may include a bounce of the mouse housing on a surface followed by a roll of the mouse housing on a surface that may be configured to simulate a move mouse action.
The method may include a double-tap on the mouse housing or a double-tap of the mouse housing on a surface that may be configured to simulate a select mouse action.
The method may include a tap on the mouse housing with multiple fingers that may be configured to simulate an access-menu mouse action, where the mouse housing simultaneously rests on a surface.
The method may include a tap on the mouse housing with multiple fingers that may be configured to simulate an access-menu mouse action, where the mouse housing is simultaneously hand-held.
The method may include a single-squeeze of the mouse housing, a swipe of a surface of the mouse housing with fingers together, or a press of the mouse housing that may be configured to simulate a zoom-out mouse action.
The method may include a double-squeeze of the mouse housing or a swipe of a surface of the mouse housing with fingers apart that may be configured to simulate a zoom-in mouse action.
The method may include a mouse housing that may be configured to operate in a plurality of locations and positions.
The method may include restricting the use of the mouse housing within a pre-determined distance away from the computing device.
Example 4 is a tangible, machine-readable medium that may include code that, when executed, causes a processor to detect a hand movement on a pointing device, translate the hand movement into a corresponding mouse action, execute the mouse action, transmit the mouse action to a computing device, and display the mouse action on a display screen of the computing device.
The tangible, machine-readable medium may include code to disable detection of the hand movement, where the disabling may configure the pointing device to be used as a stress ball.
The tangible, machine-readable medium where the pointing device may be configured to be a circular, hand-held computing mouse.
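A compact sketch of the processing chain recited in Example 4 (detect, translate, execute, transmit, and display) is given below; each function is a placeholder standing in for the stages described earlier in the disclosure, not a prescribed interface.

    # Sketch of the detect -> translate -> execute -> transmit -> display chain.
    # Each stage is a placeholder for the components described earlier.
    def detect_hand_movement(raw_event: dict) -> str:
        return "tap" if raw_event.get("pressure", 0.0) > 0.2 else "idle"

    def translate_to_action(movement: str) -> str:
        return {"tap": "point"}.get(movement, "none")

    def execute_and_transmit(action: str, send) -> None:
        if action != "none":
            send(action)

    def display(action: str) -> None:
        print(f"display screen shows pointer performing: {action}")

    execute_and_transmit(
        translate_to_action(detect_hand_movement({"pressure": 0.4})),
        send=display,
    )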
In the foregoing description and claims, the terms “coupled” and “connected,” along with their derivatives, can be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular examples, “connected” can be used to indicate that two or more elements are in direct physical or electrical contact with each other. “Coupled” can mean that two or more elements are in direct physical or electrical contact. However, “coupled” can also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
Some examples can be implemented in one or a combination of hardware, firmware, and software. Some examples can also be implemented as instructions stored on a machine-readable medium, which can be read and executed by a computing platform to perform the operations described herein. A machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine, e.g., a computer. For example, a machine-readable medium can include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; or electrical, optical, acoustical or other form of propagated signals, e.g., carrier waves, infrared signals, digital signals, or the interfaces that transmit and/or receive signals, among others.
Not all components, features, structures, characteristics, etc. described and illustrated herein need be included in a particular example or examples. If the specification states a component, feature, structure, or characteristic “can”, “might”, “may”, or “could” be included, for example, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
It is to be noted that, although some examples have been described in reference to particular implementations, other implementations are possible according to some examples. Additionally, the arrangement and/or order of circuit elements or other features illustrated in the drawings and/or described herein need not be arranged in the particular way illustrated and described. Many other arrangements are possible according to some examples.
In each system shown in a figure, the elements in some cases can each have a same reference number or a different reference number to suggest that the elements represented could be different and/or similar. However, an element can be flexible enough to have different implementations and work with some or all of the systems shown or described herein. The various elements shown in the figures can be the same or different. Which one is referred to as a first element and which is called a second element is arbitrary.
In the preceding description, various aspects of the disclosed subject matter have been described. For purposes of explanation, specific numbers, systems and configurations were set forth in order to provide a thorough understanding of the subject matter. However, it is apparent to one skilled in the art having the benefit of this disclosure that the subject matter can be practiced without the specific details. In other instances, well-known features, components, or modules were omitted, simplified, combined, or split in order not to obscure the disclosed subject matter.
While the disclosed subject matter has been described with reference to illustrative examples, this description is not intended to be construed in a limiting sense. Various modifications of the illustrative examples, as well as other examples of the subject matter, which are apparent to persons skilled in the art to which the disclosed subject matter pertains are deemed to lie within the scope of the disclosed subject matter.
While the present techniques can be susceptible to various modifications and alternative forms, the examples discussed above have been shown only by way of example. It is to be understood that the technique is not intended to be limited to the particular examples disclosed herein. Indeed, the present techniques include all alternatives, modifications, and equivalents falling within the true spirit and scope of the appended claims.