This relates to an exercise apparatus and, more specifically, to an exercise machine with an integrated holographic display.
Current stationary exercise equipment like treadmills, ellipticals, rowing machines, stationary bikes and weight machines can be monotonous to use. While some come with integrated displays for entertainment while exercising, these displays fail to provide rich sensory immersion and the sense of motion that make outdoor exercise like running, rowing or cycling engaging to athletes.
Some existing stationary exercise equipment with 2-dimensional (2D) displays attempts to mitigate the monotony of stationary exercise with on-screen media, but it does not provide the stereoscopic visuals and head-motion parallax that are required to simulate depth perception and convincing motion through 3-dimensional (3D) space, making it less immersive and less engaging than non-stationary outdoor course or indoor track exercise. Stationary exercise is advantageous for several reasons, including that it can be done from the comfort of one's home or gym, regardless of the weather conditions outside. However, current stationary exercise equipment does not offer the immersive visuals and perception of motion that can increase enjoyment and engagement and enhance therapeutic mental health benefits during exercise. Other systems use virtual reality and augmented reality headsets or eyewear connected to exercise equipment to simulate motion, but such headsets are cumbersome to wear and prone to causing nausea or motion sickness.
There is no existing exercise equipment that can provide an immersive experience without requiring the user to wear a headset.
Embodiments of the disclosure use an integrated and responsive 3D or holographic display attached to and/or integrated with stationary exercise equipment to create a more immersive, engaging and enjoyable stationary exercise experience. The 3D or holographic display provides a more stimulating sensory experience and can better simulate the perception of motion through a 3D virtual environment.
The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. These drawings depict only several embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope. The disclosure will be described with additional specificity and detail through use of the accompanying drawings.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be utilized, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the Figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.
The terms “exercise apparatus,” “exercise equipment,” and “exercise machine” are used interchangeably in this document.
The terms “holographic,” “3D,” “spatial,” and “volumetric” are used interchangeably in reference to display technology. They describe various systems that enable on-screen visuals to appear different to each of the user's eyes (i.e. stereoscopic visuals) and to change based on the position of the user's head (i.e. head-motion parallax), which together serve to accurately simulate depth and motion through 3D space. These systems can include lenticular displays, retinal projectors, and/or light-field displays, as well as face-tracking cameras to optimize imagery based on the position of the user's eyes and head. These methods of display do not require headwear or eyewear, as in other virtual or augmented reality systems.
As stated above, current stationary exercise equipment like treadmills, ellipticals and stationary bikes can be monotonous to use. While some come with integrated displays for entertainment while exercising, these displays fail to provide rich sensory immersion and the sense of motion that make outdoor exercise like running or cycling engaging to athletes. Embodiments of the disclosure solve this problem.
To make stationary exercise machines more immersive and engaging, as well as to enhance their physical and mental health benefits, embodiments of the disclosure integrate a holographic display that responds when, for example, force is exerted on the stationary exercise machine (e.g. when the user pedals a stationary bicycle), changing the imagery on screen to simulate motion through a virtual environment.
In some embodiments, the 3D display can also be utilized more generally to make on-screen content more engaging and immersive (e.g. displaying 3D movies as the user exercises). To further increase the immersiveness of the exercise experience, the spatial display on the exercise equipment can integrate with spatial speakers to simulate sound in virtual space.
Embodiments of the disclosure differ from what currently exists. The embodiments use a holographic display attached to and/or integrated with stationary exercise equipment to provide stereoscopic visuals and head-motion parallax that enhance the exercise experience. Additionally, a holographic display that is integrated with the exercise equipment is less nausea-inducing and less cumbersome than a virtual or augmented reality head-mounted display.
Stationary exercise equipment, even equipment with displays, provides a less immersive and engaging experience than outdoor or track exercise. Because of their 2D displays, these machines can only convey depth and motion in a limited capacity through monoscopic perspective (e.g. using monoscopic video that displays distant virtual objects smaller than nearby virtual objects). These systems are unable to provide the stereoscopic visuals or head-motion parallax required for rich sensory stimulation and a convincing perception of motion.
Embodiments of the disclosure use an integrated and responsive 3D or holographic display attached to and/or integrated with stationary exercise equipment to create a more immersive, engaging and healthy stationary exercise experience. The holographic display provides a more stimulating sensory experience and can better simulate the perception of motion through a 3D virtual environment.
Also, embodiments of the disclosure function as a software content delivery system. The software content (e.g. software applications) delivered through exercise equipment can utilize the sensors, displays, cameras and exercise equipment in different ways to create different exercise experiences. For example, one software application might virtually simulate cycling the Tour de France with other live participants connected via the internet in an immersive 3D scene generated by photogrammetry depth capture. Another application might simulate running through a virtual forest realistically rendered from a game engine. Yet another might simulate a fitness class with a personal trainer appearing as an animated avatar in a virtual gym. Other content might include holographic music visualization or computational art. Any application can leverage the array of display(s), speaker(s), sensor(s), and exercise equipment in different ways to create unique 3D and holographic experiences.
The exercise equipment according to the disclosed embodiments can be stationary exercise equipment (e.g. stationary bike, treadmill, elliptical, rowing machine, etc.) with electronic sensors to detect force exerted by the user and/or capture other information relating to the user's movement when using the exercise equipment.
The exercise equipment disclosed herein can include a 3D or holographic display (e.g., one utilizing lenticular display technology, light-field display technology, or projector-based display technology) and a positional-tracking camera system (for face-tracking) to record the position of the user's eyes so that the imagery from the 3D or holographic display can adapt and better simulate depth perception through a 3D virtual environment.
The exercise equipment disclosed herein can additionally include computer hardware/software such as one or more processors to interpret sensor and camera data and output graphical content through the display.
Embodiments of the exercise equipment can display virtual content through the 3D or holographic display.
In another aspect of the disclosure, a method of connecting an exercise apparatus to the internet to stream updates or access content is disclosed. The connection can be mediated through a connection to a mobile phone or can be made directly to WiFi or cellular networks.
The stationary exercise bike 100 can incorporate one or more sensors 101 to detect force exerted by the user and capture other data on how the user is using the stationary exercise bike 100. The sensors for detecting force and other data vary depending on the type of exercise equipment and may include potentiometers, gyroscopes and accelerometers as well as optical sensors like cameras. In some embodiments, the one or more sensors 101 can also include pressure sensors, rotation sensors, position sensors, cadence sensors, vibration sensors, etc.
In the embodiment illustrated in
For example, force sensors 101 in the pedals can detect the amount of force exerted by the user when pedaling. Alternatively or additionally, a cadence sensor 107 can be attached to the stationary bike's crank arm to measure the real time cadence when the stationary bike 100 is in use. Alternatively or additionally, one or more vibration sensors and/or accelerometers 109 can be attached or embedded in the frames of the stationary bike 100 to detect the vibration and/or tilting of the stationary bike 100 when in use. It should be understood that the exemplary sensors 101, 107, 109 shown in
Additionally, one or more sensors 110 may be integrated with the display 102 to detect the position and orientation of the display 102. These may be used to adapt on-screen content, for example, so that the horizon line displayed on-screen matches the real-world horizon line regardless of the angle and orientation of the display. Sensor 110 can be an accelerometer or a position sensor.
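For illustration only, the horizon-matching behavior described above can be sketched as a small computation; the axis convention (x along the screen's horizontal edge, y along its vertical edge) and the use of gravity components from an accelerometer are assumptions, not specifics from the disclosure.

```python
import math

def horizon_correction_deg(accel_x: float, accel_y: float) -> float:
    """Infer the display's roll angle (degrees) from the gravity vector
    measured by an accelerometer such as sensor 110, assuming x lies along
    the screen's horizontal edge and y along its vertical edge. The
    on-screen horizon can be counter-rotated by this angle so it stays
    level with the real-world horizon regardless of display tilt."""
    return -math.degrees(math.atan2(accel_x, accel_y))
```

With the display upright, gravity falls entirely on the y axis and no correction is applied; a display rolled 90 degrees yields a -90 degree counter-rotation.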
The processor 104 can receive signal(s) from the one or more sensors 101 and determine from the signal(s), for example, the real-time speed at which the user is pedaling the bike. The processor 104 can then adjust the 3D or holographic environment being displayed on the display 102 accordingly to properly simulate the user biking through the simulated environment.
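A minimal sketch of this adjustment follows; the gear ratio and wheel circumference are illustrative assumed constants (the disclosure does not specify how cadence is converted to virtual speed).

```python
def advance_virtual_position(position_m, cadence_rpm, dt_s,
                             wheel_circumference_m=2.1, gear_ratio=2.5):
    """Advance the rider's position along the virtual course from the
    cadence reported by a sensor such as cadence sensor 107. Returns the
    new position (meters) and the simulated speed (m/s). The gear ratio
    and wheel circumference are assumed, illustrative values."""
    speed_m_s = (cadence_rpm / 60.0) * gear_ratio * wheel_circumference_m
    return position_m + speed_m_s * dt_s, speed_m_s
```

The processor would call this once per frame, feeding the resulting position to the renderer so the virtual environment scrolls past at the rider's pace.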
The stationary bike of
The camera 103 can send in real time a signal embedding the information it captured to the processor 104. The processor can then process the signal from the camera to determine imagery output through the 3D or holographic display 102 that can adapt and accurately simulate depth-perception and motion through virtual space. As a result, while the user exerts force on the stationary exercise bike, the imagery displayed through the 3-dimensional or holographic display can respond to simulate motion through virtual space.
The 3D or holographic display 102 can be any existing display capable of providing content in a way that provides the user an immersive experience while using the stationary bike 100 without requiring the user to wear any virtual reality (VR) headset. For example, the 3-dimensional or holographic display 102 may utilize a lenticular display paired with the face-tracking camera 103 to detect the user's eye position. The lenticular display can rapidly output image frames directed alternately at the user's right and left eye to simulate stereoscopic depth perception. The imagery adapts to the position of the user's eyes to simulate visual parallax. Alternatively, the 3-dimensional or holographic display 102 can be a light-field display, which directs photons along the proper vectors to simulate their trajectories from a virtual scene and create the perception of depth. Lastly, holographic projectors provide another alternate method of simulating depth perception by tracking the user's eye position with the camera 103 and then projecting different images into each of the user's eyes to create a stereoscopic view of a virtual scene. The embodiments of the disclosure can use any of these methods or a combination of them.
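The head-tracked stereoscopic rendering described above can be sketched as follows; the average interpupillary distance and the coordinate convention are assumptions for illustration, not values from the disclosure.

```python
def eye_positions(head_x, head_y, head_z, ipd_m=0.063):
    """Compute virtual camera positions for the left and right eye from
    the head position reported by a face-tracking camera such as camera
    103, offset horizontally by half the interpupillary distance (the
    0.063 m default is an assumed adult average). Rendering the scene
    once per eye and steering each image to the matching eye via the
    display optics yields stereo depth plus head-motion parallax."""
    half = ipd_m / 2.0
    left = (head_x - half, head_y, head_z)
    right = (head_x + half, head_y, head_z)
    return left, right
```

As the tracked head moves, both virtual cameras move with it, which is what produces the parallax effect without any headset.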
The virtual content output through the 3-dimensional or holographic display 102 can respond to the sensor data input from the one or more sensors 101 of the stationary exercise bike 100. Conversely, the stationary exercise bike can adapt to virtual content. For example, to simulate a virtual hill, the resistance of the pedals of a stationary bicycle might increase to give the user a sense of pedaling up an actual hill. Additionally, the virtual content can adapt and respond to the positional-tracking cameras 103 to better simulate depth-perception. As well, the face-tracking cameras 103 might be used to better understand how the user is perceptually, emotionally, physiologically and psychologically experiencing their exercise. The virtual content can adapt to this information to optimize and customize the exercise experience for each individual user. For example, the camera(s) 103 might detect infrared light output from the user's body to infer heart-rate and blood-flow and adjust the exercise intensity to maintain a constant, optimum heart-rate. Additionally or alternatively, the exercise equipment might use a combination of cameras 103 with one or more electrodermal sensors 112 (positioned in the handlebars, for example) to detect perspiration and infer hydration levels to then prompt the user to drink liquid when needed. An array of sensors including cameras (e.g., camera 103), cadence and resistance sensors and electrodermal sensors (e.g., sensor 112) might track and interpret subtle variations in perspiration, heart-rate, exertion, eye-movement, facial expression, exercise technique, etc. to infer when the user is experiencing a peak rush of euphoria (known as “runner's high”) while exercising and synchronize visual and audio content to enhance euphoria, for example, by displaying more exciting imagery and louder music that match the rhythm of the user's heart-rate or exercise cadence. 
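The heart-rate-driven intensity adjustment mentioned above can be sketched as a simple proportional controller; the target heart rate, gain, and resistance range are illustrative assumptions, and the heart rate itself would come from camera-based inference as described.

```python
def adjust_resistance(current_resistance, heart_rate_bpm,
                      target_bpm=140.0, gain=0.05,
                      min_resistance=1.0, max_resistance=20.0):
    """Nudge pedal resistance toward a target heart rate. If the user's
    inferred heart rate is below target, resistance increases; above
    target, it decreases. The gain, target, and resistance bounds are
    assumed, illustrative values, not specifics from the disclosure."""
    error = target_bpm - heart_rate_bpm
    new = current_resistance + gain * error
    return max(min_resistance, min(max_resistance, new))
```

Calling this periodically with the latest camera-inferred heart rate would hold exercise intensity near the target without any wearable sensor.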
The audio content can be stored locally in a storage of the stationary exercise bike 100 or streamed from a remote source (e.g., a cloud server). The audio content can be synchronized with the visual content by the processor 104 in response to data captured by the one or more sensors 101, 107, 109, 110, 112, and camera(s) 103. One or more speakers 120 located at different locations on the stationary exercise bike 100 can output the audio content with the intended effects and/or volume.
Although the 3-dimensional or holographic display 102 is shown to have a flat display surface, it should be understood that the display 102 can have a surface of any type and any curvature. It should also be understood that the 3-dimensional or holographic display 102 can include multiple screens that combine to create the immersive visual experience to the user. It should also be understood that the display 102 can be of any size and shape.
The computer processor 104 can be any computer processor capable of processing signals from the camera 103 and the one or more sensors 101, 107, 109 of the stationary bike 100. Although the processor 104 is illustrated to be located in a housing behind the 3-dimensional or holographic display 102 in
Content to be shown on the 3D or holographic display 102 and software or firmware updates can be streamed or downloaded over the internet. For example, the user can select a stage of the Tour de France to be rendered by the 3D or holographic display 102 while using the bike 100 to simulate competing in the race. Specifically, the 3D or holographic display 102 can display pre-existing (e.g., downloaded) continuous footage of a Tour de France race captured using an omnidirectional, stereoscopic camera on a vehicle (e.g., a bicycle or a car). The footage can be shown at a pace that corresponds to the user's pace on the stationary bike 100 and from different angles that change in response to the user's eye movements captured by the face tracking camera 103.
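The pacing behavior described above can be sketched by mapping the rider's simulated speed to a playback-rate multiplier for the pre-recorded footage; the clamping range and the assumption that the capture vehicle's speed is known are illustrative, not specifics from the disclosure.

```python
def playback_rate(user_speed_m_s, capture_speed_m_s):
    """Playback-rate multiplier for pre-recorded omnidirectional race
    footage: the ratio of the rider's simulated speed to the (assumed
    known) speed of the camera vehicle during capture, so on-screen
    motion matches the rider's effort. The 0-4x clamp is an assumed
    sanity bound."""
    if capture_speed_m_s <= 0:
        return 0.0
    return max(0.0, min(4.0, user_speed_m_s / capture_speed_m_s))
```

Riding at exactly the capture speed plays the footage at normal speed; riding at half that speed plays it at half speed, preserving the illusion that the scenery passes at the rider's own pace.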
Sensor 202 can be a sensor placed under the belt of the treadmill to detect the force, timing, and/or location of the contact made by the user's feet. Sensor 202 can be multiple sensors placed at different locations under the belt. Additionally or alternatively, sensors 204 can be placed on areas on the top handlebars of the treadmill to detect any force from the user gripping the handle bars. Data detected by the sensors 202, 204 can be transmitted to the computer processor 210 of the treadmill 200.
The 3-dimensional or holographic display 206 and the face-tracking camera 208 of the treadmill 200 can be similar to the 3-dimensional or holographic display 102 and face-tracking camera 103 of the stationary bike 100 of
The processor 210 can process data received from the sensors 202, 204 and the camera 208 and the settings (e.g., degree of incline, speed setting) of the treadmill to determine the user's pace, lateral movement, head/eye movement, etc. when the user is using the treadmill 200. The processor 210 can then display on the display 206 3-dimensional or holographic imagery that simulates an immersive visual experience (e.g., running through a forest or on a race track against other runners) for the user. The imagery can be a video being streamed in real time or content pre-downloaded from a remote server such as a cloud server.
The treadmill 200 can optionally include additional sensors not shown in
It should be understood that both the stationary exercise bike 100 of
Processor 604 may include any appropriate type of general-purpose or special-purpose microprocessor, digital signal processor, or microcontroller. Processor 604 may be configured to receive data and/or signals from sensors, cameras, other types of user interfaces such as a keypad or a touch screen, and/or other devices on the network, and process the user input and received data and/or signals to determine the settings of the exercise apparatus, including what content is to be provided via the 3-dimensional or holographic display and how the content is provided.
Processor 604 may execute computer instructions (program codes) stored in memory 602 and/or storage 606, and may perform functions in accordance with exemplary techniques described in this disclosure. Memory 602 and storage 606 may include any appropriate type of mass storage provided to store any type of information that processor 604 may need to operate. Memory 602 and storage 606 may be a volatile or non-volatile, magnetic, semiconductor, tape, optical, removable, non-removable, or other type of storage device or tangible (i.e., non-transitory) computer-readable medium including, but not limited to, a ROM, a flash memory, a dynamic RAM, and a static RAM. Memory 602 and/or storage 606 may be configured to store one or more computer programs that may be executed by processor 604 to perform exemplary functions disclosed in this disclosure including generating content for displaying on the 3-dimensional or holographic display of the exercise apparatus. For example, memory 602 and/or storage 606 may be configured to store program(s) that may be executed by processor 604 to determine the speed at which the content is being played on the display based on the pace of the user running on the treadmill. The program(s) may also be executed by processor 604 to provide an interface for interacting with a user.
Memory 602 and/or storage 606 may be further configured to store information and data used by processor 604. Memory 602 and/or storage 606 may be configured to store real-time streaming or pre-downloaded video content and/or software updates to the exercise machine.
Referring again to
I/O interface 608 can allow the exercise apparatus 600 to interact with a user. For example, the I/O interface 608 can be a touch screen that displays an interactive screen for the programs (or apps) running on the exercise apparatus 600. The touch screen can also receive touch or gesture input from a user. Any other conventional I/O interface can also be incorporated into the apparatus.
In another aspect of the disclosure, a method of providing an immersive experience to a user of an exercise machine is provided. The method can be performed by software, firmware, hardware, and/or a combination thereof. The software and firmware can be stored in a local storage or hosted on a remote server connected to the exercise equipment.
Next, the processor initiates the exercise machine based on the user inputs (step 402). As the user starts exercising on the exercise machine, the processor can receive real-time data from the one or more sensors capturing the movement, force, and other data associated with the user's action (step 403). Similarly, the processor can also receive information from the face-tracking camera that tracks, for example, the user's eye movement (step 404). The processor can analyze the sensor data in combination with the camera data to display or simulate a virtual environment via the 3D or holographic display of the exercise machine for the user (step 405). The processor can optionally continue to make real-time adjustment to the virtual environment based on the data received from the sensors and the camera (step 406). Additionally or alternatively, the processor can generate feedback (e.g., changing the incline on the treadmill) by adjusting the settings of the exercise equipment based on the virtual environment being shown to the user (step 407). When the user finishes his/her exercise, the processor can turn off the display (step 408).
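The loop formed by steps 402-408 above can be sketched as follows; the `machine`, `display`, `sensors`, `camera`, and `renderer` objects are hypothetical interfaces standing in for the hardware and software components described in the disclosure.

```python
def run_session(machine, display, sensors, camera, renderer, done):
    """Control loop mirroring steps 402-408. All parameters are
    hypothetical interfaces, assumed for illustration only."""
    machine.start()                           # step 402: initiate machine
    while not done():
        motion = sensors.read()               # step 403: force/movement data
        gaze = camera.track_eyes()            # step 404: eye-tracking data
        scene = renderer.update(motion, gaze)  # steps 405-406: build/adjust
        display.show(scene)                   #   the virtual environment
        machine.apply_feedback(scene)         # step 407: e.g. change incline
    display.off()                             # step 408: end of session
```

Each iteration fuses the latest sensor and camera data into the rendered scene, and feedback from the scene (such as a virtual hill) flows back into the machine's settings.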
In yet another aspect of the disclosure, embodiments of exercise equipment can also function as a software content delivery system. The software content (i.e. software applications) delivered through the exercise equipment can utilize the sensors, displays, cameras and exercise equipment in different ways to create different exercise experiences. For example, one software application might virtually simulate cycling the Tour de France with other live participants connected via the internet. Another application might simulate running through the woods. Any application can leverage the array of display(s), sensors, and exercise equipment in different ways to create unique experiences for the user.
In another aspect of the disclosure, an exercise apparatus with an integrated performance verification system is disclosed.
Current athletic esports platforms such as virtual cycling and rowing races are vulnerable to cheating, especially when participants connect remotely, and it is not possible to verify their physical performance through direct human observation. Dishonest participants can cheat using methods such as interfering with exertion sensors in exercise equipment, attaching external motors that turn gears of exercise equipment to simulate athletic exertion, and hacking various layers of the software systems that esports platforms rely on. High numbers of remote participants in massively multiplayer environments can further exacerbate the challenge of verifying individual performance. As well, some exercise equipment categories like treadmills do not have integrated mechanisms that respond to physical exertion, such as cadence and wattage sensors common in cycling and rowing, instead estimating athletic output from the speed of the motorized tread, presenting additional challenges with performance verification. As a result, much performance data goes unverified or not thoroughly verified, particularly in casual multiplayer settings, allowing dishonest participants to outpace other players, score points, earn rewards and even qualify for competitive events.
Embodiments of the disclosure utilize computer vision systems integrated with exercise equipment alongside common exercise exertion sensors like cadence and wattage gauges to verify that the recorded physical exertion output aligns with biometric signals simultaneously registered by the computer vision system. The camera sensors and computer vision systems used to measure biometric signals for the purposes of performance verification and cheating prevention can be the same as or used in concert with those used in a stereoscopic display attached to the exercise equipment such as the stationary bike 100 of
For example, one approach to verify whether a person is actually exerting the effort that the ergometer is reading would be to analyze camera footage of a person while they use exercise equipment like a rowing machine and determine whether their body motion during each rowing stroke aligns with the expected body motion required to exert the force reported by the ergometer during the rowing stroke. The expected body motion in this case would be predetermined by analyzing a large sampling of camera footage of people performing similar exercise. Another approach to measure other biometric signals such as heart-rate, circulation and blood-oxygen saturation involves optical sensors (i.e. cameras) and illuminators using photoplethysmography. These optical sensors can be placed on the exercise apparatus so as to allow for close skin contact, such as with handlebars. Alternatively, the camera sensor used can be the same as the one used to detect body motion and would be placed further from the user for non-contact photoplethysmography.
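The per-stroke plausibility check described above can be sketched as follows; the quadratic force-versus-speed model and its coefficient are illustrative assumptions standing in for a relation that would, per the disclosure, be learned from a large sample of rowing footage.

```python
def strokes_consistent(observed_speeds, reported_forces,
                       k=25.0, tolerance=0.3):
    """For each stroke, compare the ergometer-reported peak force against
    an expected force modeled from the peak handle speed seen on camera
    (expected = k * v^2; k and the model form are assumed, illustrative
    stand-ins for a learned relation). A stroke passes when the reported
    force is within a fractional `tolerance` of the expectation."""
    results = []
    for v, f in zip(observed_speeds, reported_forces):
        expected = k * v * v
        ok = expected > 0 and abs(f - expected) / expected <= tolerance
        results.append(ok)
    return results
```

A reported force far above what the observed body motion could plausibly generate would fail this check, suggesting sensor tampering or an external motor.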
The exercise apparatus 500 can optionally include additional sensors such as exercise exertion sensors like cadence sensor 506 and wattage gauge 508. Other types of sensors such as pressure sensors, rotation sensors, position sensors, vibration sensors (not shown in
The exercise apparatus can also include a processor 510 in communication with the computer vision system 504 and the optional additional sensors 506, 508. The processor 510 can receive biometric signals and other data from the computer vision system 504. The processor 510 can also receive signals from the additional sensors 506, 508. The processor 510 can verify whether the physical performance level indicated by one or more of the biometric signals aligns with the physical performance level simultaneously recorded by the other sensors (e.g., exertion sensors 506, 508) of the exercise apparatus 500.
For example, the processor can determine whether the body motion patterns and the amount of perspiration detected are consistent with the same performance level (e.g., pedaling/running speed). If both the body motion patterns captured by the computer vision system 504 and the signals received from the exercise exertion sensors 506, 508 indicate that the user is pedaling or running at about the same speed, the user's performance is verified. In contrast, if the camera data (e.g., body motion patterns) shows that the user is running at a slower pace than the data from the exercise exertion sensors 506, 508 indicates, the processor 510 can determine that the data from at least some of the exercise exertion sensors 506, 508 may not accurately represent the user's actual performance on the exercise apparatus. This can provide a mechanism to verify user performance in an esport competition taking place remotely on participants' own exercise apparatus.
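This cross-check can be sketched as a simple tolerance comparison; the fractional tolerance value is an assumed threshold, not one specified in the disclosure.

```python
def verify_performance(vision_speed_m_s, sensor_speed_m_s, tolerance=0.15):
    """Compare the pace estimated by the computer vision system against
    the pace reported by the exertion sensors. Returns True when the
    fractional disagreement is within `tolerance` (an assumed threshold);
    a larger disagreement flags the sensor data as suspect."""
    reference = max(vision_speed_m_s, 1e-9)
    mismatch = abs(sensor_speed_m_s - vision_speed_m_s) / reference
    return mismatch <= tolerance
```

A failed check need not be treated as proof of cheating; as described below for races, it can simply raise an alert for human review.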
In some embodiments, the processor 510 may not be on the exercise apparatus 500 but instead be on a remote computer (e.g., a cloud processor) that is connected to the exercise apparatus via a network such as the Internet. Sensor data from the computer vision system 504 and the other sensors 506, 508 can be transmitted via the network to the remote computer for processing.
In some embodiments, when measured together by the computer vision system 504, the processor can cross-check multiple biometric signals with each other as well as the output of the exertion sensor(s), creating a stronger verification system that is more difficult to dupe. For example, in the case of an activity like rowing where stroke form contributes to overall output, analyzing body motion can help verify that the user's stroke rate and technique match the exertion readout from the ergometer. However, body motion analysis alone may not be sufficient to prevent cheating. Analyzing heart-rate alongside body motion can further verify that the user's cadence output aligns with the ergometer readout, helping to ensure they are not artificially amplifying their stroke force. These separate methods together can magnify the cumulative effectiveness more than the sum of their parts as it becomes much more difficult to dupe multiple systems simultaneously.
In some embodiments, it is also possible for the processor 510 to bypass the process of inferring biometric measures like heart-rate altogether and instead only analyze variations in the raw biometric signals emitted by the user as detected by the camera sensors. For example, using this method to correlate exertion with skin light absorption is more direct than correlating exertion with the inferred heart-rate, which is an extrapolation and abstraction of the observed biometric signal of skin light absorption. Such systems can be facilitated by neural network models trained on data sets collected from numerous users in a variety of settings and conditions.
In some contexts, such as casual non-competitive exercise, simpler computer vision systems that track fewer biometric signals might be sufficient to verify performance whereas other more stringent use-cases such as esports competitions might require more robust verification that relies on a variety of biometric signals and computer vision systems. In some embodiments, the exercise apparatus can have different performance verification settings that can be set either by the user or by a remote computer based on the context of an exercise session. For example, for a formal esports competition, each of the participating exercise apparatus can be connected to a central computer over a network. The central computer can set the performance verification setting on each exercise apparatus to the highest level, enabling all verification mechanisms available on the apparatus. The central computer can also lock the setting so the user cannot override it when in competition. In contrast, for a casual training session, the performance verification setting can be turned off.
In some embodiments, analyzing data of simultaneous output from the exertion sensor and optical sensors (e.g., cameras) from numerous different users and their exercise sessions can establish a baseline correlation that can then be used to verify an individual user's physical performance. The aggregate session data can also be compared to details about a particular user such as height, weight, and age, as well as data collected from previous exercise sessions, to establish an individual baseline specific to that user.
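A minimal sketch of such a baseline follows, using an ordinary least-squares line as a stand-in for the aggregate correlation described above; the residual threshold is an assumed value, and a deployed system might instead use the neural network models mentioned earlier.

```python
def fit_baseline(exertions, optical_signals):
    """Least-squares line relating exertion-sensor output to a
    camera-derived biometric signal, pooled across many sessions.
    Returns (slope, intercept). A simple, assumed stand-in for the
    aggregate baseline correlation described in the disclosure."""
    n = len(exertions)
    mx = sum(exertions) / n
    my = sum(optical_signals) / n
    sxx = sum((x - mx) ** 2 for x in exertions)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(exertions, optical_signals))
    slope = sxy / sxx
    return slope, my - slope * mx

def deviates_from_baseline(exertion, optical, slope, intercept,
                           max_residual=10.0):
    """Flag a sample whose optical signal sits farther from the baseline
    prediction than `max_residual` (an assumed threshold)."""
    return abs(optical - (slope * exertion + intercept)) > max_residual
```

An individual baseline could be built the same way from a single user's prior sessions, tightening the threshold around that user's own physiology.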
Referring again to
Data from one or more other sensors (e.g., exercise exertion sensors) on the exercise apparatus can be processed by a processor to determine the veracity of a second set of user performance data. (Step 705) The first and second sets of performance data are synchronized and compared to determine if there are significant inconsistencies between the data set. (Step 706) Any significant inconsistencies can trigger an alert about that user's performance (Step 707) that can be handled by the system differently depending on the context and programming. For example, in the context of a competitive race, the alert can be redirected to a supervising authority such as a human referee that can review and determine if there has been a rule violation based on the evidence provided by the system. It should be understood that some of the steps illustrated in
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed systems and methods. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the systems and methods. It is intended that the specification and examples be considered as exemplary only, with a true scope being indicated by the following claims and their equivalents.
Number | Date | Country
--- | --- | ---
63118149 | Nov 2020 | US
Number | Date | Country
--- | --- | ---
Parent 17334441 | May 2021 | US
Child 18587333 | | US