This disclosure generally relates to computing devices. More particularly, the disclosure relates to a configuration for providing a user experience based on communication with a footwear apparatus.
Recent developments in technology have led to various activity tracking devices (e.g., smartwatches) that may be worn by a user during a physical activity, such as running. The activity tracking device is typically worn on the wrist, and may have one or more sensors that attempt to determine activity based on periodic bursts of motion. For example, a conventional activity tracking device may have an accelerometer integrated therein. To track steps taken by a user, a conventional activity tracking device will typically count the number of times a user moves his or her wrist—presuming that each motion of the wrist corresponds to a step taken in the natural walking/running stride of a user.
Yet, such presumptions may often lead to inaccurate activity tracking measurements. For example, a user's hands may be preoccupied during a physical activity (e.g., pushing a cart, holding a smartphone, etc.). In other words, the feet of the user may be moving while the hands of the user are relatively stationary, thereby leading to uncounted steps by the activity tracking device. Alternatively, the user's hands may be moving while the user is relatively stationary (e.g., sitting while taking a break from the physical activity), which could result in steps being added even though no steps were actually taken.
As a result, conventional activity tracking devices placed on the wrist of a user do not accurately measure physical activities of a user.
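By way of a hedged illustration, the accuracy problem described above comes down to how steps are inferred from accelerometer data. The following Python sketch shows a minimal threshold-based step counter of the general kind such devices may employ; the threshold and minimum-gap values are hypothetical and chosen only for illustration, not taken from any particular device:

```python
import math

def count_steps(samples, threshold=11.0, min_gap=5):
    """Count steps from accelerometer samples (ax, ay, az) in m/s^2.

    A step is registered when the acceleration magnitude crosses the
    threshold, with at least `min_gap` samples between successive steps
    to avoid double-counting a single impact.
    """
    steps = 0
    last_step = -min_gap
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps

# Simulated walk: a spike in magnitude every 10 samples represents a footfall.
rest = (0.0, 0.0, 9.8)       # gravity only
impact = (0.0, 0.0, 15.0)    # heel-strike spike
samples = ([rest] * 9 + [impact]) * 4
print(count_steps(samples))  # 4 footfalls detected
```

When the sensor is worn on the wrist, the spikes reflect arm motion rather than footfalls, which is precisely the mismatch the passage above describes; a foot-mounted sensor feeds the same logic with signals that actually correspond to steps.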
In one embodiment, a computer program product comprises a non-transitory computer useable storage device that has a computer readable program. When executed on a computer, the computer readable program causes the computer to receive, from a footwear apparatus, motion data corresponding to a foot movement of a user wearing footwear operably attached to the footwear apparatus. The motion data is measured by one or more sensors operably attached to the footwear. Further, the computer is caused to provide, with a processor positioned within a mobile computing device, a user experience based on the motion data. Additionally, the computer is caused to display, via a display device in operable communication with the mobile computing device, the user experience.
In another embodiment, a different computer is caused to sense, with one or more sensors operably attached to a footwear apparatus, a foot movement of a user wearing footwear operably attached to the footwear apparatus. Further, the computer is caused to send, from the footwear apparatus to a mobile computing device, motion data corresponding to the foot movement of the user wearing the footwear such that the mobile computing device provides a user experience based on the motion data.
In yet another embodiment, a footwear apparatus has footwear in which a foot of a user is positioned. Further, the footwear apparatus has a sensor that senses a foot movement of a user wearing the footwear. The sensor is operably attached to the footwear. Moreover, the footwear apparatus has a transmitter that sends, from the footwear apparatus to a mobile computing device, motion data corresponding to the foot movement of the user wearing the footwear such that the mobile computing device provides a user experience based on the motion data. The transmitter is operably attached to the footwear.
The above-mentioned features of the present disclosure will become more apparent with reference to the following description taken in conjunction with the accompanying drawings wherein like reference numerals denote like elements and in which:
A user experience configuration is provided herein to render a user experience for a user of a mobile computing device (e.g., smartphone, smart glasses, etc.) based on communication with a footwear apparatus. In contrast with previous configurations, the user experience configuration receives data from one or more sensors positioned within various forms of footwear (e.g., shoe, sneaker, boot, slipper, sandal, etc.). By placing such sensors within footwear, rather than on the wrist of a user, the user experience configuration is able to more accurately determine the position of the foot of a user, thereby more accurately tracking physical activity corresponding to the user. For example, a user may hold a smartphone in a steady position (e.g., to talk, send text messages, listen to music, etc.) while walking, and have a number of steps accurately counted. Further, the user may sit and freely move his or her hands, without having an impact on the number of steps accurately counted. Such placement of the sensors also allows for generating a user experience based on physical interaction of one or more feet with one or more virtual objects in the user experience.
Various types of user experiences may be generated as a result of communication with the footwear apparatus. Firstly, an activity tracking software application may be used to track fitness activity of a user based on data received from the footwear apparatus. Secondly, a game software application may be used to provide a gaming experience to the user based on the data received from the footwear apparatus. Thirdly, an augmented reality (“AR”) or virtual reality (“VR”) software application may be used to provide an AR/VR experience to the user based on the data received from the footwear apparatus. The foregoing user experiences are provided only as examples, and are provided herein only for illustrative purposes. (Other possible user experiences may be generated as a result of communication with the footwear apparatus.)
For instance, the footwear apparatus 101 may have various componentry (processors, processing boards, circuitry, sensors, etc.) that is integrated within, or operably attached to, footwear (e.g., shoe, sneaker, boot, slipper, sandal, etc.). For example, the footwear apparatus 101 may have a processor 114 that coordinates various operations (e.g., capturing sensed data, performing calculations on the sensed data, performing communication operations between internal and/or external devices with respect to the footwear, etc.). The footwear apparatus 101 may also have various sensors, such as a motion sensor 103 (e.g., accelerometer, gyroscope, magnetometer, etc.) that detects motion of the footwear. In other words, the footwear apparatus 101 directly detects motion of a foot of the user, rather than indirectly via a hand of the user, thereby more accurately determining the foot motion of the user. Further, the footwear apparatus 101 may have a transmitter 104 that is used to transmit the sensed motion data from the footwear apparatus 101. For example, the transmitter 104 may transmit the sensed motion data, via a wireless network 105, to the mobile computing device 102. (Alternatively, the transmitter 104 may transmit the sensed motion data via a wired connection, such as a USB cable, to the mobile computing device 102.)
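As a non-limiting sketch of how the transmitter 104 might frame a sensed reading for transmission to the mobile computing device 102, the following Python example packs a timestamped accelerometer sample into a compact binary frame. The frame layout (a little-endian double timestamp followed by three float axes) is an assumption made solely for illustration, not a specified protocol:

```python
import struct
import time

def encode_motion_packet(ax, ay, az, timestamp=None):
    """Pack one accelerometer reading (m/s^2) into a binary frame.

    Layout: little-endian 8-byte double timestamp followed by three
    4-byte floats, for a fixed 20-byte frame per reading.
    """
    if timestamp is None:
        timestamp = time.time()
    return struct.pack("<dfff", timestamp, ax, ay, az)

def decode_motion_packet(frame):
    """Unpack a frame produced by encode_motion_packet."""
    timestamp, ax, ay, az = struct.unpack("<dfff", frame)
    return {"t": timestamp, "ax": ax, "ay": ay, "az": az}

frame = encode_motion_packet(0.1, -0.2, 9.8, timestamp=1000.0)
print(len(frame))                        # 20 bytes per reading
print(decode_motion_packet(frame)["t"])  # 1000.0
```

A compact fixed-width frame of this sort would suit a low-power wireless link between the footwear apparatus 101 and the mobile computing device 102, though the disclosure does not prescribe any particular encoding.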
Upon receiving the sensed motion data, the mobile computing device 102 (e.g., smartphone, tablet device, smart glasses, etc.) uses various componentry to generate a user experience. For example, the mobile computing device 102 may have a processor 106 that coordinates the operations of various componentry within the mobile computing device 102. The processor 106 may perform operations by executing code stored in a memory device 109. As an example, the mobile computing device 102 may have a storage device 110 that stores user experience code 111, which may be used to provide a user experience based on the sensed motion data. The mobile computing device 102 may also have a transceiver 107, or a stand-alone receiver, which receives the sensed motion data via the wireless network 105 from the transmitter 104 of the footwear apparatus 101. In one embodiment, the user experience is generated via a cloud configuration. For example, the transceiver 107 may send the sensed motion data via a network 113 (computerized, telecommunications, etc.) to a remotely located server 112, which may generate a user experience from the sensed motion data. The server 112 may then send the user experience via the network 113 to the mobile computing device 102 to render the user experience at the mobile computing device 102. In other words, the server 112, which may have more computational capacity than the mobile computing device 102, may perform computationally intensive calculations (e.g., for an AR experience) so that the mobile computing device 102 does not have to perform such calculations, thereby improving computational efficiency. In an alternative embodiment, the processor 106 in the mobile computing device 102 may directly perform the calculations to generate the user experience.
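The routing decision described above, between cloud-side generation on the server 112 and device-side generation on the processor 106, may be sketched as a simple dispatch rule. The experience-type labels and the policy itself are illustrative assumptions:

```python
def choose_renderer(experience_type, server_available):
    """Route user-experience generation.

    Computationally heavy AR/VR work goes to the remote server when it
    is reachable; simpler experiences, or any work when the server is
    unreachable, run on the device's own processor.
    """
    heavy = {"ar", "vr"}
    if experience_type in heavy and server_available:
        return "server"
    return "device"

print(choose_renderer("ar", server_available=True))        # server
print(choose_renderer("activity", server_available=True))  # device
print(choose_renderer("ar", server_available=False))       # device
```

The fallback branch mirrors the alternative embodiment in which the processor 106 performs the calculations directly.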
Upon the user experience being generated, either directly or indirectly, the processor 106 renders the user experience on a display unit 108 (e.g., display screen of a smartphone, glass portion of smart glasses, etc.). Optionally, the mobile computing device 102 may also have various audio devices (e.g., audio speakers) that may be used to enhance the visual aspects of the user experience.
As an alternative to the mobile computing device 102, a computing device such as a desktop computer or a kiosk may be used herein.
Finally, in one embodiment, the footwear apparatus 101 may optionally have a location sensor 115 (e.g., GPS device) that determines the real-world coordinates corresponding to the physical location of the footwear apparatus 101; such location data may be used by the processor 114 of the footwear apparatus 101 and/or the processor 106 of the mobile computing device 102 to generate the user experience in conjunction with the sensed motion data. For example, the user experience may be an AR experience that combines the sensed motion data with graphical imagery generated based upon a particular location corresponding to GPS coordinates of the footwear apparatus 101. In another embodiment, the location sensor 115 may be positioned within the mobile computing device 102.
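As a hedged illustration of combining the location data with the user experience, the following sketch selects location-specific AR imagery by great-circle distance to known landmarks. The landmark table, theme names, and radius are hypothetical values introduced only for this example:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pick_imagery(lat, lon, landmarks, radius_m=100.0):
    """Return the imagery theme of the nearest landmark within radius_m,
    falling back to a generic theme otherwise."""
    best = min(landmarks, key=lambda lm: haversine_m(lat, lon, lm["lat"], lm["lon"]))
    if haversine_m(lat, lon, best["lat"], best["lon"]) <= radius_m:
        return best["theme"]
    return "generic"

landmarks = [{"lat": 48.8584, "lon": 2.2945, "theme": "eiffel_tower"}]
print(pick_imagery(48.8584, 2.2945, landmarks))    # eiffel_tower
print(pick_imagery(40.7128, -74.0060, landmarks))  # generic
```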
In one embodiment, the footwear apparatus 101 may be partially integrated within a sole portion 203 of footwear 202 and partially integrated within a top portion 204 of the footwear 202. For example, the processing componentry (e.g., processor 114) may be positioned within the sole portion 203, whereas the sensing componentry (e.g., motion sensor 103) may be positioned at the top portion 204. The various componentry of the footwear apparatus 101 may communicate via various types of connectivity (e.g., wireless or wired). By having the motion sensor 103 positioned at the top portion 204 of the footwear 202, the processor 114 is able to generate virtual objects at positions that coincide with real-world positioning of the foot of the user 201. Although the motion sensor 103 is illustrated as being positioned at the top portion 204, the motion sensor 103 may, alternatively, be positioned at other portions (e.g., front, side, rear, bottom, or a combination thereof) of the footwear 202.
In another embodiment, the footwear apparatus 101 may be fully integrated within portions of the footwear 202 other than the sole portion 203. For example, the motion sensor 103 may be positioned at the top portion 204, the processor 114 may be positioned at a rear portion of the footwear 202, and the transmitter 104 may be positioned on a side portion of the footwear 202. (The foregoing example is provided solely for illustrative purposes; the various componentry of the footwear apparatus 101 may be positioned individually, or in combination, at, or within, various portions of the footwear 202.)
In yet another embodiment, the footwear apparatus 101 may be fully integrated within the sole portion 203 of the footwear 202.
Although the footwear apparatus 101 is illustrated in
Alternatively, the avatar 311 may be displayed without any avatar manipulation during a physical activity of the user 201. For example, the avatar 311 may be rendered to display which benchmark indicia 312 have been earned by the user 201. The benchmark indicia 312 (e.g., virtual medals, virtual clothing, virtual shoes, virtual hats, virtual headphones, etc.) may be displayed in the profile screen 310 based upon the completion of various tasks. For example, the user 201 may win a gold medal for walking ten thousand steps or a silver medal for walking five thousand steps, as determined by the processor 114 (
In one embodiment, the benchmark indicia 312 are rendered in a portion of the profile screen 310 in an area other than that which displays the avatar 311. Accordingly, the user 201 may swipe/scroll through the various earned benchmark indicia 312. In another embodiment, the benchmark indicia 312 may be positioned directly on the avatar 311. For example, a gold medal may be positioned on the shirt/jacket of the avatar 311. As another example, the user 201 may select which benchmark indicia he or she wants to be worn by the avatar 311 at a given moment (e.g., switch between different hats that have been won as a result of reaching various activity-based and/or event-based goals).
Earning a benchmark indicium 312 may be associated with a particular reward. For example, a gold medal may result in a free meal at a particular restaurant, free concert tickets, etc. Alternatively, the benchmark indicium 312 may not be associated with a reward other than achieving a particular goal of the user 201. For instance, a benchmark indicium 312 may be customized by the user 201 (e.g., reaching ten thousand steps in one week), rather than being pre-generated.
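The benchmark logic described above, pre-generated medals alongside a user-customized goal, may be sketched as follows; the thresholds and indicium names are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical pre-generated medal thresholds (steps, indicium name).
MEDAL_THRESHOLDS = [
    (10_000, "gold"),
    (5_000, "silver"),
]

def earned_medals(step_count, custom_goal=None):
    """Return the benchmark indicia earned for a step count.

    A user-defined goal (e.g., a weekly step target) earns a custom
    indicium alongside any pre-generated medals.
    """
    medals = [name for threshold, name in MEDAL_THRESHOLDS
              if step_count >= threshold]
    if custom_goal is not None and step_count >= custom_goal:
        medals.append("custom_goal")
    return medals

print(earned_medals(12_000))                    # ['gold', 'silver']
print(earned_medals(6_000, custom_goal=5_500))  # ['silver', 'custom_goal']
```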
Further,
Moreover,
Finally,
In the alternative, or in addition, to the activity tracking functionality described with respect
As an example,
Accordingly, the user 201 may use the footwear apparatus 101 to track the motion of one or both feet of the user 201, via the motion sensor 103. The mobile computing device 102 may then receive the sensed motion data, and generate an AR soccer experience/game for the user 201. In other words, the glass portion of the AR glasses displays various virtual objects (e.g., virtual soccer ball 402) within the context of the real-world environment 401 based on the sensed motion data determined by the motion sensor 103. As a result, the AR glasses 102 are able to render a real-time, or substantially real-time, depiction of the virtual soccer ball 402 with respect to the real-world placement of one or both feet of the user 201.
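A minimal sketch of the foot-to-virtual-object interaction described above: if the tracked foot position comes within a contact radius of the virtual soccer ball, the ball takes on the foot's velocity. The contact radius and the velocity-transfer rule are simplifying assumptions for illustration, not the disclosed implementation:

```python
def kick_if_contact(foot_pos, foot_vel, ball_pos, contact_radius=0.12):
    """Resolve a kick in the shared AR coordinate frame.

    If the tracked foot is within `contact_radius` meters of the virtual
    ball, the ball takes on the foot's velocity at impact; otherwise the
    ball stays at rest. Positions/velocities are (x, y, z) tuples.
    """
    dist = sum((f - b) ** 2 for f, b in zip(foot_pos, ball_pos)) ** 0.5
    if dist <= contact_radius:
        return foot_vel        # ball launched with the foot's velocity
    return (0.0, 0.0, 0.0)     # no contact: ball remains at rest

print(kick_if_contact((0.0, 0.0, 0.0), (2.0, 0.0, 1.0), (0.1, 0.0, 0.0)))
# contact: ball launched at (2.0, 0.0, 1.0)
print(kick_if_contact((0.0, 0.0, 0.0), (2.0, 0.0, 1.0), (1.0, 0.0, 0.0)))
# no contact: (0.0, 0.0, 0.0)
```

This is where the sensing accuracy matters: an erroneous foot position on the order of the contact radius is the difference between a registered kick and a miss.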
Further,
In one embodiment, the processor 114 of the footwear apparatus 101 may use the location sensor 115, illustrated in
Although the processing for rendering, and/or calculating coordinates for, the virtual objects may be performed entirely by the processor 106 of the AR glasses 102, such processing may be performed partially by the processor 106 and partially by another processor of an additional computing device (e.g., processor 114 of the footwear apparatus 101, server 112, smartphone, smartwatch, etc.) that may be in operable communication with the AR glasses 102. Alternatively, such other processor may perform such processing in its entirety, rather than in conjunction with the processor 106.
Accordingly, the footwear apparatus 101 may be used to customize virtual imagery of an AR experience based on location data corresponding to a real-world location of the user 201, and sensed motion data corresponding to motion of the foot of the user 201.
Moreover,
Further,
Finally,
As another example,
As another example,
Accordingly, the user 201 may use the footwear apparatus 101 to track the motion of one or both feet of the user 201, via the motion sensor 103. The AR glasses 102 may then receive the sensed motion data, and generate an AR basketball experience/game for the user 201. In other words, the glass portion of the AR glasses 102 displays various virtual objects (e.g., virtual basketball 602, virtual basketball net 603, etc.) within the context of the real-world environment 601 based on the sensed motion data determined by the motion sensor 103. In particular, the AR glasses 102 determine whether or not the user 201 is performing a jumping motion, and how that jumping motion is positioned with respect to the virtual basketball net 603. The AR glasses 102 may then calculate and render a trajectory of the virtual basketball 602 with respect to the virtual basketball net 603. Further,
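The jump-to-trajectory computation described above may be sketched with elementary ballistics: a launch speed derived from the sensed jumping motion determines whether the virtual basketball reaches the virtual net. All parameter values below (release height, tolerance, distances) are hypothetical, introduced only for this sketch:

```python
import math

def shot_lands_in_net(jump_speed, launch_angle_deg, net_distance, net_height,
                      release_height=2.0, g=9.81, tolerance=0.2):
    """Evaluate the ballistic height of the virtual ball at the net's
    horizontal distance, and test whether it passes within `tolerance`
    meters of the net height (distances in meters, speeds in m/s)."""
    angle = math.radians(launch_angle_deg)
    vx = jump_speed * math.cos(angle)
    vy = jump_speed * math.sin(angle)
    t = net_distance / vx                           # time to reach the net plane
    height = release_height + vy * t - 0.5 * g * t * t
    return abs(height - net_height) <= tolerance

# A regulation-height net (3.05 m) four meters away:
print(shot_lands_in_net(7.2, 45, 4.0, 3.05))   # True  (shot scores)
print(shot_lands_in_net(5.0, 45, 4.0, 3.05))   # False (shot falls short)
```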
The foregoing user experiences (e.g., AR soccer, AR track and field, and AR basketball) are just examples of the possible AR applications of the footwear apparatus 101 in conjunction with the mobile computing device 102, illustrated in
Additionally, the user experience may optionally be a game that has one or more rewards corresponding to benchmarks particular to that game. For example, the user 201 may win a particular medal for a certain number of soccer goals made, hurdles jumped, or basketball shots successfully made; such medal may correspond to a particular reward. For example, the reward may be specific to the particular AR experience and/or physical location of the user (e.g., a discount on basketball sneakers at a local sneaker store for certain number of basketball shots made while playing the AR basketball game).
Further,
With the positioning of the motion sensor 103 as provided for herein, the footwear apparatus 101 more accurately measures the foot positioning and foot activity of the user 201 than conventional configurations, thereby allowing for viable virtual-based user experiences that rely, at least in part, on foot positioning of the user 201. For example, an inaccurate determination of the foot positioning of the user 201, as could easily occur with a wrist-based fitness tracking device, could lead to the virtual soccer ball 402 (
Although a motion sensor 103 is provided for herein, other types of sensors may be used to provide other types of data that may be used for the AR experience. For example, a sensor may be positioned within the footwear apparatus 101 to measure foot weight, foot pressure, pressure points, etc.
The processes described herein may be implemented in a specialized, multi-purpose, or single purpose processor. Such a processor will execute instructions, either at the assembly, compiled, or machine-level, to perform the processes. A computer readable medium may be any medium capable of carrying those instructions, and may include a CD-ROM, DVD, magnetic or other optical disc, tape, or silicon memory (e.g., removable, non-removable, volatile, or non-volatile), carrying packetized or non-packetized data through wireline or wireless transmissions locally or remotely through a network.
It is understood that the processes, systems, apparatuses, and computer program products described herein may also be applied in other types of processes, systems, apparatuses, and computer program products. Those skilled in the art will appreciate that the various adaptations and modifications of the embodiments of the processes, systems, apparatuses, and computer program products described herein may be configured without departing from the scope and spirit of the present processes and systems. Therefore, it is to be understood that, within the scope of the appended claims, the present processes, systems, apparatuses, and computer program products may be practiced other than as specifically described herein.