The present disclosure relates to the field of digitalization of training for sports involving a ball. More specifically, the present disclosure relates to a method and smart ball for generating an audio feedback for a user interacting with the smart ball.
With the advances in miniaturization and processing capabilities of electronic components, it is now possible to design a smart ball comprising embedded electronic components. In particular, the smart ball comprises sensors adapted for measuring parameters representative of interactions of a user with the smart ball. For example, a soccer ball is equipped with sensors adapted for measuring a speed of the ball, an acceleration of the ball, a spin of the ball, etc. The data collected by the sensors of the smart soccer ball are used for monitoring and analyzing practice or training exercises performed by the user of the smart soccer ball. The smart ball is digitally integrated with other devices to implement a smart ball training system, aimed at improving various aspects of soccer ball practice (e.g. accuracy, speed, impact, etc.). The smart ball training system generally includes at least one device providing a training user interface for interacting with the user of the system.
Up to now, the usage of smart balls and smart ball training systems has been reserved for professional usage. For example, expensive analysis and coaching tools based on smart balls have been developed for improving the training and practice of professional soccer players. By contrast, a very large number of amateur players do not have access to affordable digitally based training solutions. Furthermore, the needs and expectations of amateur players are different from those of professional players. For instance, the digital training user interface needs to be adapted to amateur players. In particular, a training user interface providing functionalities commonly used in the gaming industry or the social media industry would be particularly well suited to attracting amateur players. For example, the training user interface would be designed to build positive training habits, to develop and maintain engagement of the user in the training activities, etc. The training user interface should also allow interactions with the user during a training exercise (for example to provide feedback on the execution of the training exercise), without disrupting the physical and mental engagement of the user in the training exercise.
Therefore, there is a need for a new method and smart ball for generating an audio feedback for a user interacting with the smart ball.
According to a first aspect, the present disclosure relates to a method for generating an audio feedback for a user interacting with a smart ball. The method comprises receiving ball data from the smart ball by a processing unit of a computing device via a wireless communication interface of the computing device. The ball data are generated by an embedded monitoring platform comprised in the smart ball, the ball data being representative of an interaction session of the user with the smart ball. The method comprises processing the ball data by the processing unit of the computing device to generate indicators. The method comprises determining an audio feedback for the user based on at least one of the indicators, the audio feedback providing a feedback to the user on the execution of the interaction session by the user. The method comprises transmitting the audio feedback to an audio device by the processing unit of the computing device.
According to a second aspect, the present disclosure relates to a non-transitory computer-readable medium comprising instructions executable by a processing unit of a computing device, the execution of the instructions by the processing unit of the computing device providing for implementing the aforementioned method.
According to a particular aspect, a plurality of consecutive audio feedbacks are determined during the execution of the interaction session by the user.
According to another particular aspect, the audio feedback is determined at the end of the interaction session.
According to still another particular aspect, the interaction session is a training exercise performed by the user with the smart ball.
According to yet another particular aspect, the ball data comprise at least one of the following: timestamps, spin values, acceleration values, orientation values, speed values, pressure values, temperature values and force values.
According to another particular aspect, the ball data comprise at least some of three gyrometer values in three coordinate axes, three accelerometer values in three coordinate axes and three magnetometer values in three coordinate axes.
According to still another particular aspect, the indicators comprise ball indicators generated by the processing of the ball data.
According to yet another particular aspect, the ball data are processed in near real-time to generate the ball indicators during the execution of the interaction session by the user.
According to yet another particular aspect, the ball indicators comprise at least one of the following: a number of touch-events, a timing for each touch-event, a speed of the ball, a spin of the ball, a height of the ball, and a trajectory of the ball.
According to still another particular aspect, a given type of touch-event is identified through the processing of the ball data; and for each occurrence of the given type of touch-event, the number of touch-events is updated, the timing of occurrence of the touch-event is determined, and at least one of the following is determined: the speed of the ball, the spin of the ball, the height of the ball and the trajectory of the ball.
According to yet another particular aspect, the indicators comprise at least one performance indicator, each performance indicator being generated by the processing of at least one ball indicator.
According to another particular aspect, the performance indicator consists of a score or a performance statistic.
According to still another particular aspect, the generation of the performance indicator further takes into consideration a goal defined for the at least one ball indicator.
According to yet another particular aspect, an audio sequence is played during the interaction session, and the performance indicator is representative of a synchronization of the execution of the interaction session by the user with the audio sequence.
According to another particular aspect, touch-events consisting of a foot-contact are detected based on the ball data, a timing of each touch-event consisting of a foot-contact is determined, an average contact frequency with the ball is determined based on the determined timings, and the performance indicator is based on the average contact frequency with the ball.
According to still another particular aspect, the audio device is integrated to the computing device or the audio device is a standalone audio device, the audio device playing the audio feedback.
According to yet another particular aspect, the smart ball is a soccer ball, a basketball ball, a volleyball ball, an American football ball, a tennis ball, a table tennis ball or a golf ball.
According to another particular aspect, the method further comprises generating sensor data by at least one sensor of the embedded monitoring platform, the sensor data being representative of the interaction session of the user with the smart ball; generating the ball data by a processing unit of the embedded monitoring platform based on the sensor data generated by the at least one sensor; and transmitting the ball data to the computing device by the processing unit of the embedded monitoring platform via a wireless communication interface of the embedded monitoring platform.
According to a third aspect, the present disclosure relates to a smart ball comprising an embedded monitoring platform. The embedded monitoring platform comprises a wireless communication interface. The embedded monitoring platform comprises at least one sensor adapted to generate sensor data, the sensor data being representative of an interaction session of a user with the smart ball. The embedded monitoring platform comprises a processing unit configured to: generate ball data based on the sensor data generated by the at least one sensor, and transmit the ball data to a computing device via the wireless communication interface.
According to a particular aspect, the ball data comprise at least one of the following: timestamps, spin values, acceleration values, orientation values, speed values, pressure values, temperature values and force values.
According to another particular aspect, the at least one sensor comprises at least one of the following: an inertial sensor, a gyroscope, an accelerometer, a pressure sensor, a temperature sensor, and a force sensor.
According to still another particular aspect, the at least one sensor comprises an inertial measurement unit (IMU) sensor adapted for generating three gyrometer values in three coordinate axes, three accelerometer values in three coordinate axes and three magnetometer values in three coordinate axes, the ball data comprising at least some of the three gyrometer values, the three accelerometer values and the three magnetometer values.
According to yet another particular aspect, the at least one sensor comprises a high range accelerometer adapted for generating three high range accelerometer values in three coordinate axes, the ball data comprising the three high range accelerometer values.
According to another particular aspect, the smart ball is a soccer ball, a basketball ball, a volleyball ball, an American football ball, a tennis ball, a table tennis ball or a golf ball.
According to still another particular aspect, the embedded monitoring platform further comprises a battery and a receiver coil adapted for wireless electrical charging of the battery.
Embodiments of the disclosure will be described by way of example only with reference to the accompanying drawings, in which:
The foregoing and other features will become more apparent upon reading of the following non-restrictive description of illustrative embodiments thereof, given by way of example only with reference to the accompanying drawings.
Various aspects of the present disclosure generally address one or more of the problems related to the digitalization of the user experience of a user interacting with a ball (e.g. practicing a training exercise with the ball). A smart ball is equipped with electronic components, including sensors adapted to collect sensor data related to the ball. Ball data are generated based on the sensor data collected by the sensors. The ball data are wirelessly transmitted from the smart ball to a computing device, where the ball data are further processed to generate indicators. The indicators are used to determine audio feedback(s) for the user (e.g. guiding/encouraging the user in the practice of a training exercise with the smart ball).
The following terminology is used throughout the present disclosure:
Interaction session: the notion of interaction session shall be interpreted broadly as a period of time during which a user has one or more interactions with a ball. For example, the interaction session is a training session during which various types of interactions with the ball may be performed by the user, to perform a particular type of training exercise (e.g. juggling with one foot, juggling with both feet, juggling with the head, kicking the ball, etc.). The training exercise may include a single interaction with the ball, a sequence of different interactions with the ball, a repetition of the same interaction with the ball, etc.
Sensor data: data generated by a sensor and transmitted to another component for further processing. The sensor comprises a sensing unit generating raw sensor data (e.g. measurements). The sensor data transmitted by the sensor to the other component (e.g. a processing unit) are either directly the raw sensor data or data generated by an internal processing by the sensor of the raw sensor data.
Ball data: data generated by a processing unit of a smart ball when performing a processing of sensor data generated by embedded sensor(s) of the ball. More details about the ball data will be provided in the description.
Indicators: data generated through processing of the ball data by a computing device (e.g. a smartphone), upon reception of the ball data transmitted by the smart ball. More details about the indicators will be provided later in the description.
Reference is now made concurrently to
The terminology smart ball is used for the ball 200, to indicate that the embedded monitoring platform 300 is integrated to the ball 200. The embedded monitoring platform 300 provides the capability to monitor (using one or more sensors) an interaction session of a user with the ball 200; and to further transmit ball data, representative of the interaction session of the user with the smart ball, to the computing device 100.
In the rest of the description, the functionalities of the smart ball system will be detailed with respect to a smart ball 200 consisting of a soccer ball. However, a person skilled in the art would readily adapt the functionalities of the smart ball system to other types of balls 200 (e.g. a basketball ball, a volleyball ball, an American football ball, a tennis ball, a table tennis ball, a golf ball, etc.).
The computing device 100 receives the ball data from the embedded monitoring platform 300. The ball data are processed by the computing device 100 to generate indicators. In the rest of the description, we will refer to an application 112 implementing the processing of the ball data to generate the indicators.
A person skilled in the art will readily understand that the present disclosure is not limited to the application 112 being a training application; but can be generalized to any application in charge of managing interaction sessions of the user with the smart ball 200. More specifically, the interaction session is not limited to a training session. For example, the interaction session is a real-time (or non real-time) competitive session between a plurality of users of smart balls 200.
In the rest of the description, the functionalities of the smart ball system will be detailed with respect to a computing device 100 consisting of a smartphone. However, a person skilled in the art would readily adapt at least some of the functionalities of the smart ball system to other types of computing devices 100, including any type of mobile computing device (e.g. a tablet, a computer, a smart watch, another type of wearable computing device, etc.).
The application 112 provides the additional functionality of determining an audio feedback for the user based on at least one of the indicators. The audio feedback provides a feedback to the user on the execution of the interaction session by the user. The audio feedback is transmitted to an audio device 400 and played by the audio device 400. Examples of audio devices include wearable audio devices, such as a headset, a pair of in-ear headphones, a pair of earbuds, etc. Other examples of audio devices include non-wearable audio devices, such as a wireless speaker, etc. Alternatively, the audio device 400 is integrated to the computing device 100 (this configuration is not represented in the Figures for simplification purposes).
Following is an exemplary use case for the smart ball system. A user performs a training exercise with a soccer smart ball 200. During the training exercise, ball data are generated by the embedded monitoring platform 300 of the soccer smart ball 200 and wirelessly transmitted to a smartphone 100 of the user. The ball data are processed in (near) real-time by the application 112 executed by the smartphone 100 (to generate the indicators). During the training exercise, the smartphone 100 may or may not be carried by the user. It may be more practical for practicing the training exercise to position the smartphone 100 in a location close enough to the smart soccer ball 200 to receive the ball data, but far enough away to avoid potential damage to the smartphone 100 by the smart soccer ball 200.
The indicators generated by the application 112 comprise ball indicators. The ball indicators are calculated during the execution of the training exercise, based on a (near) real-time processing of the ball data received from the smart ball 200. The ball indicators focus on the ball and are generally not directly related to a particular training exercise. Examples of ball indicators include a speed of the ball, a height of the ball, a spin of the ball, etc. More details related to the ball indicators will be provided later in the description.
The indicators generated by the application 112 also comprise performance indicators. The performance indicators are generally directly related to a particular training exercise and are indicative of how well the user performed with respect to the objectives of the training exercise. Examples of performance indicators include scores, performance statistics, etc. A performance indicator is generated based on one or more ball indicators. Additional information is generally also used for the generation of the performance indicators, such as for example expected values (also referred to as goals in the rest of the description) of the ball indicators. The expected values of the ball indicators depend on the training exercise, on the user, etc. A performance indicator is calculated once at the end of the training exercise. Alternatively, a performance indicator is updated during the execution of the training exercise. More details related to the performance indicators will be provided later in the description.
The application 112 further determines the audio feedback(s) to be used. The determination of each audio feedback is based on ball indicator(s) solely, performance indicator(s) solely, or a combination of ball indicator(s) and performance indicator(s). Furthermore, the determination of the audio feedback(s) is performed during the execution of the exercise, at the completion of the training exercise, or a combination thereof. Each audio feedback is transmitted by the smartphone 100 to a pair of earbuds carried by the user, the audio feedback being played to the user by the pair of earbuds. This determination of audio feedback(s) will be detailed later in the description.
In an exemplary implementation, the application 112 is further adapted to use the indicators (ball indicators and/or performance indicators) in the context of a gamified training experience. This functionality will be detailed later in the description.
Referring more specifically to
The computing device 100 comprises a processing unit 110, memory 120, at least one communication interface 130 of the wireless type, a user interface 140, and a display 150. The computing device 100 may comprise additional components not represented in
The processing unit 110 comprises one or more processors (not represented in
The memory 120 stores instructions of computer program(s) executed by the processing unit 110, data generated by the execution of the computer program(s), data received via the wireless communication interface 130 (e.g. the ball data received from the embedded monitoring platform 300), etc. Only a single memory 120 is represented in
The wireless communication interface 130 allows the computing device 100 to exchange data with other devices (e.g. the embedded monitoring platform 300, the audio device 400, a remote server not represented in
In the case of a computing device 100 consisting of a smartphone, the following exemplary configuration may be implemented. A first wireless communication interface 130 is used for exchanging data with the embedded monitoring platform 300 of the smart ball 200 via a Wi-Fi network, while a second wireless communication interface 130 is used for exchanging data with the audio device 400 via a Bluetooth network. A third communication interface provides communication capabilities over a cellular network. The application 112 exchanges data with other devices (e.g. a remote server or a remote computing device not represented in
The user interface 140 comprises at least one of the following: a touch screen user interface integrated to the display 150, a keyboard, a mouse, at least one button, etc.
The display 150 is either a regular display or a touchscreen display. The display 150 has a size adapted to the form factor of the computing device 100.
The various types of audio devices 400 which can be used to implement the smart ball system will not be described in detail, since they are well known in the art.
Referring more specifically to
The embedded monitoring platform 300 comprises a processing unit 310, at least one sensor 320, a wireless communication interface 330, a battery 340, and a receiver coil 350. As mentioned previously, the embedded monitoring platform 300 may comprise additional components not represented in
Several types of sensors 320 may be used depending on the type of sensor data that need to be collected by the embedded monitoring platform 300. Examples of sensors 320 include (without limitations) an inertial sensor, a gyroscope, an accelerometer, a pressure sensor, a temperature sensor, a force sensor, etc. Several sensors of the same type may be used, for example several inertial sensors, several pressure sensors, several force sensors, etc. Any combination of sensors 320 capable of generating the type of sensor data that need to be collected is relevant to the present disclosure.
In an exemplary implementation, one of the sensor(s) 320 comprises an inertial measurement unit (IMU). The IMU sensor 320 generates 3 gyrometer values, 3 accelerometer values and 3 magnetometer values.
The 3 gyrometer values represent the current rotation or spin of the ball 200 in 3 coordinate axes. Thus, the 3 gyrometer values correspond to the amount of rotation, as well as the direction of rotation, measured by the IMU sensor 320.
The 3 accelerometer values represent the current acceleration of the ball 200 in 3 coordinate axes. Whenever the ball 200 is being kicked, these values reach large numbers. When the ball is stationary on the ground, these values reflect the gravity, i.e. an acceleration of about 1 g = 9.81 m/s².
The 3 magnetometer values represent the relative orientation of the IMU sensor 320 compared to the Earth's magnetic field in 3 coordinate axes. For a fixed outdoor location, relative changes in the orientation of the ball 200 can be deduced from these values.
Optionally, a high range accelerometer is used in combination with the IMU sensor (or a standard accelerometer). The high range accelerometer generates high range accelerometer values (e.g. three high range accelerometer values in three coordinate axes), which are noisier but provide a broader range of values (for example, to measure the acceleration for very strong kicks of the ball 200). The IMU sensor (or standard accelerometer) generates acceleration values that are capped to a maximum value, and is therefore not adapted for measuring the acceleration for very strong kicks of the ball 200.
The processing unit 310 has a form factor adapted to be embedded in the ball 200. Exemplary implementations of the processing unit 310 include a microcontroller (comprising one or more processing cores capable of executing instructions of a computer program), one or more FPGAs, one or more ASICs, etc.
The memory 311 stores instructions of computer program(s) executed by the processing unit 310, data generated by the execution of the computer program(s), sensor data generated and transmitted by the sensors 320, data received via the wireless communication interface 330, etc. Only a single memory 311 is represented in
The wireless communication interface 330 allows the embedded monitoring platform 300 to exchange data with other devices (e.g. transmission of the ball data to the computing device 100, reception of configuration data (e.g. to configure the sensors 320) from the computing device 100 or another device not represented in the Figures, etc.). Examples of implementations of the wireless communication interface 330 include a Wi-Fi communication interface, a Bluetooth communication interface, a BLE communication interface, another type of short range wireless communication technology, etc. The wireless communication interface 330 usually comprises a combination of hardware and software executed by the hardware, for implementing the communication functionalities of the wireless communication interface 330.
The battery 340 stores energy for electrically powering the electronic components of the embedded monitoring platform 300. For example, as illustrated in
The receiver coil 350 is adapted for wireless electrical charging of the battery 340. The implementation of the receiver coil 350 is well known in the art. Alternatively, another type of component adapted for performing the wireless electrical charging of the battery 340 is used in place of (or complementarily to) the receiver coil 350. The battery 340 may also be adapted to be charged via electrical charging cables passing through an external skin of the ball 200, although this implementation is more intrusive and may modify the dynamic properties of the ball 200.
The processing unit 310 receives the sensor data generated and transmitted by the sensor(s) 320. The processing unit 310 generates the ball data based on the received sensor data. The ball data are transmitted to the computing device 100 via the wireless communication interface 330.
For a given type of sensor data, the corresponding ball data consist directly of the sensor data (the generation of the ball data is limited to copying the sensor data). For another type of sensor data, the generation of the corresponding ball data includes an actual processing of the sensor data. Examples of processing may include sampling, detection of incoherent values, averaging over a period of time, selection of a maximum or minimum value over a period of time, etc. Furthermore, a given type of ball data may be generated through the processing of several types of sensor data (originating from the same or different sensors 320).
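To make this two-path generation concrete, the following is a minimal sketch, not taken from the disclosure, of how a short window of raw sensor samples could be turned into one set of ball data. The window handling, the plausibility threshold used to reject incoherent values, and the field names are illustrative assumptions.

```python
from statistics import mean

# Hypothetical sketch of ball data generation from raw sensor samples.
# The plausibility threshold, window handling and field names are illustrative
# assumptions, not values defined in the disclosure.

MAX_PLAUSIBLE_ACC_G = 200.0  # reject obviously incoherent accelerometer samples

def generate_ball_data(window):
    """Aggregate a short window of raw sensor samples into one set of ball data.

    `window` is a list of dicts such as
    {"timestamp_us": 1712345678, "acc_norm_g": 1.02, "temp_c": 21.5}.
    """
    valid = [s for s in window if s["acc_norm_g"] <= MAX_PLAUSIBLE_ACC_G]
    if not valid:
        return None
    return {
        "timestamp_us": valid[-1]["timestamp_us"],              # time of the latest sample
        "acc_norm_g_max": max(s["acc_norm_g"] for s in valid),  # maximum over the window
        "temp_c_avg": mean(s["temp_c"] for s in valid),         # average over the window
    }
```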
The ball data generally also comprise timestamps. For example, the timestamp precisely identifies the time at which a set of ball data is transmitted to the computing device 100 (the precision is for example in the range of microseconds). The timestamp is a close approximation of the time at which the sensors 320 generated the sensor data used for generating the corresponding set of ball data.
Examples of ball data transmitted to the computing device 100 include (without limitations) at least some of the following: timestamps, spin values, acceleration values, orientation values, speed values, pressure values, temperature values, force values, etc.
In the case where the sensor(s) 320 comprise the aforementioned IMU sensor, the ball data comprise at least some of the 3 gyrometer values, 3 accelerometer values and 3 magnetometer values generated by the IMU sensor.
The embedded monitoring platform 300 also receives (via the wireless communication interface 330) commands from the computing device 100. The commands are processed by the processing unit 310, to enforce corresponding actions on components of the embedded monitoring platform 300. Examples of commands include a command (with calibration data) for calibrating/recalibrating the sensors 320, a command (with configuration data) for configuring the sensors 320, a command for activating (e.g. when an interaction session is started) or de-activating (e.g. when an interaction session is finished) the sensors 320, a command for determining the types of ball data that need to be transmitted to the computing device 100 (based for example on a given training exercise currently selected via the training application 112), etc.
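As an illustration of this command handling, the following sketch lists the command types mentioned above as a simple enumeration and dispatches them on the embedded side. The command names, payloads and dispatch mechanism are assumptions made for illustration; the disclosure does not define a specific command format.

```python
from enum import Enum, auto

# Hypothetical command set mirroring the examples above; names, payloads and the
# dispatch mechanism are illustrative assumptions, not a defined protocol.

class BallCommand(Enum):
    CALIBRATE_SENSORS = auto()       # carries calibration data
    CONFIGURE_SENSORS = auto()       # carries configuration data
    ACTIVATE_SENSORS = auto()        # interaction session started
    DEACTIVATE_SENSORS = auto()      # interaction session finished
    SELECT_BALL_DATA_TYPES = auto()  # which ball data to stream for the current exercise

def handle_command(command, payload=None):
    """Sketch of command enforcement by the processing unit of the platform."""
    if command is BallCommand.SELECT_BALL_DATA_TYPES:
        print(f"streaming only: {payload}")
    elif command is BallCommand.ACTIVATE_SENSORS:
        print("sensors activated")
    # ... remaining commands would be handled similarly

handle_command(BallCommand.SELECT_BALL_DATA_TYPES, ["acceleration", "spin"])
```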
Referring to
Referring back concurrently to
As mentioned previously, the application 112 generates the ball indicators. Examples of ball indicators include a number of touch-events, and for each touch-event: the timing of the touch-event, the speed of the ball, the spin of the ball and the height of the ball when the touch-event occurred.
A touch-event is defined as whenever the ball experiences a strong impact or bounce. An algorithm is used for detecting a touch-event based on the ball data. For example, the algorithm calculates the norm of the acceleration, and determines that a touch-event has occurred when the norm of the acceleration surpasses a threshold of 9.0*g for a short amount of time, where g is the gravitational acceleration of 9.81 m/s². The threshold of 9.0*g is for illustration purposes and could be modified (or adapted to the user, e.g. child versus adult). The norm (norm) of the acceleration (acc) is defined as: norm(acc) = square root of (acc_x² + acc_y² + acc_z²), where acc_x, acc_y and acc_z are components of the acceleration in 3 axes. For example, acc_x, acc_y and acc_z are the 3 previously described accelerometer values included in the ball data.
Optionally, two or more categories of touches are defined, depending on the strength of the touch. The previous algorithm is adapted to detect the different categories of touches. For example, a soft-touch and a standard-touch are defined. The algorithm is adapted as follows: any touch-event with an acceleration norm of more than 5.0*g but less than 9.0*g is a soft-touch, while any touch-event with an acceleration norm of more than 9.0*g is a standard-touch. This provides flexibility for the definition of training exercises. For a first type of exercise, the ball indicators comprise the number of soft-touches; while for another type of exercise, the ball indicators comprise the number of standard-touches.
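The following is a minimal sketch of the touch detection and categorization described above. It simplifies the algorithm by classifying a single acceleration sample (the "short amount of time" condition and any debouncing between consecutive samples are omitted); the thresholds are the illustrative 5.0*g and 9.0*g values from the text.

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def acc_norm(acc_x, acc_y, acc_z):
    """Norm of the acceleration vector, as defined above."""
    return math.sqrt(acc_x**2 + acc_y**2 + acc_z**2)

def classify_touch(acc_x, acc_y, acc_z, soft_threshold=5.0 * G, standard_threshold=9.0 * G):
    """Return 'standard-touch', 'soft-touch' or None for one acceleration sample.

    The thresholds are illustrative and could be adapted to the user
    (e.g. child versus adult).
    """
    norm = acc_norm(acc_x, acc_y, acc_z)
    if norm > standard_threshold:
        return "standard-touch"
    if norm > soft_threshold:
        return "soft-touch"
    return None

# A stationary ball measures roughly 1 g and produces no touch-event.
print(classify_touch(0.0, 0.0, 9.81))    # None
print(classify_touch(40.0, 55.0, 70.0))  # 'standard-touch' (norm ~ 97.6 m/s^2 > 9.0*g)
```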
Furthermore, different types of touch-events may occur, such as a contact of the ball with a foot, a contact of the ball with the ground, a contact of the ball with a wall, etc. The ball indicators may be updated only for one of these types of touch-events. For example, the number of touches, and all the additional ball indicators associated to each touch-event, are only updated/calculated for a touch-event consisting of a contact of the ball with a foot. The other types of touch-events (e.g. a contact of the ball with the ground) are ignored with respect to the update/calculation of the ball indicators.
A machine learning algorithm (e.g. a neural network) can be used to classify a touch-event, based on at least some of the ball data. For example, the inputs of the machine learning algorithm include a series of consecutive values of some of the ball data. For instance, the series of consecutive values includes a series of N (e.g. N=100) values for the previously described 3 gyrometer values and 3 accelerometer values.
The machine learning algorithm is trained to classify the following types of touch-events (the classification is the output of the machine learning algorithm): foot-contact, ground-contact, etc. The model can be refined to classify additional touch-events like thigh-contact, head-contact, wall-contact, etc. Furthermore, the output of the machine learning algorithm may include a probability of accuracy of the result of the classification.
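The following sketch illustrates, under stated assumptions, how the input window could be assembled and passed to such a classifier. The window length N=100 and the six channels (3 gyrometer and 3 accelerometer values) follow the example above; the classifier itself is only a stub assumed to expose a predict_proba-style call, and the class list is illustrative.

```python
import numpy as np

# Hypothetical preparation of a classification window; the model is a stub and
# the class list is illustrative.

N = 100
CLASSES = ["foot-contact", "ground-contact", "other"]

def build_window(samples):
    """Stack the last N samples into a (N, 6) array: [gyr_x, gyr_y, gyr_z, acc_x, acc_y, acc_z]."""
    window = np.array(samples[-N:], dtype=np.float32)
    assert window.shape == (N, 6)
    return window

def classify_touch_event(window, model):
    """Return (label, probability) from any model exposing a predict_proba-like call."""
    features = window.reshape(1, -1)          # flatten to one feature vector per window
    probs = model.predict_proba(features)[0]  # one probability per class
    best = int(np.argmax(probs))
    return CLASSES[best], float(probs[best])
```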
Following is an exemplary implementation of the generation of the ball indicators by the application 112. A flow of ball data is received in (near) real-time from the ball 200 and processed in (near) real-time by the application 112. A decision of which ball indicators need to be calculated is made based on the ball data received over an interval of time (e.g. the last 100 or 200 milliseconds). For example, as mentioned previously, if a touch-event consisting of a foot-contact is detected based on the ball data received within the interval of time, the corresponding ball indicators are updated/determined/calculated for this touch-event: timing of the touch-event, number of touch-events, speed of the ball, spin of the ball, height of the ball, etc. Since some of the audio feedbacks may be based on the ball indicators, the ball indicators are only updated at a frequency adapted to be intelligible to the user (e.g. every few seconds or a few times per second).
Additional ball indicators may be defined, which provide additional physical insights about what happens to the ball 200. For example, the trajectory of the ball 200 (e.g. a flight curve of the ball 200) is calculated based on the received ball data. For instance, the trajectory is calculated each time a touch-event is detected (e.g. only for touch-events consisting of foot-contacts). The following parameters of the trajectory are calculated: relative position of the ball 200 and speed of the ball 200. The calculation of the trajectory generally uses the previously described 3 accelerometer values and 3 gyrometer values (more generally, based on three-dimensional acceleration and spin measurements included in the ball data).
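One possible way to obtain such a trajectory, shown below as a simplified sketch rather than as the method of the disclosure, is to numerically integrate the acceleration between two touch-events. The sketch assumes the acceleration samples have already been rotated into a world frame (which is where the gyrometer values would intervene) and ignores sensor drift.

```python
import numpy as np

# Simplified dead-reckoning sketch of a flight curve between two touch-events.
# Frame rotation (using the gyrometer values) and drift correction are omitted.

def integrate_trajectory(acc_world, dt, v0=np.zeros(3)):
    """Integrate world-frame acceleration samples (shape (T, 3), in m/s^2) taken every dt seconds.

    Returns (velocities, relative_positions), each of shape (T, 3).
    """
    gravity = np.array([0.0, 0.0, -9.81])
    # add gravity to convert the measured (proper) acceleration into linear acceleration
    linear_acc = np.asarray(acc_world) + gravity
    velocities = v0 + np.cumsum(linear_acc * dt, axis=0)
    positions = np.cumsum(velocities * dt, axis=0)  # position relative to the touch-event
    return velocities, positions

# The maximum of positions[:, 2] then provides an estimate of the "height of the ball" indicator.
```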
As mentioned previously, the application 112 also generates the performance indicators. Examples of performance indicators include scores and performance statistics, but are not limited to these two types of performance indicators. For a given type of interaction session, one or more corresponding performance indicators are defined. For example, in the case of training exercises, for each type of training exercise, one or more performance indicators adapted to the specificities of the training exercise are defined.
Each performance indicator is generated by taking into consideration the value of one or more corresponding ball indicators. A given performance indicator is updated during an interaction session, for example every time one of the corresponding ball indicator(s) is calculated, determined or updated. Alternatively, a given performance indicator is updated at the end of the interaction session, taking into consideration the final value of the corresponding ball indicator(s).
In addition to the corresponding ball indicator(s), the determination of a given performance indicator takes into consideration at least some of the following additional parameters: goal(s) defined for at least some of the corresponding ball indicator(s), characteristics of the user (e.g. age, weight, height, gender, level of the user in the context of a training exercise, etc.), additional contextual information of the interaction session (different from the aforementioned goals). Additional parameters may include user history data, average performance indicators across a group of several users (the application 112 synchronizes such data via a backend when connected to the Internet).
In the case of a training exercise, the performance indicators determine how good the performance of a user was when performing the training exercise. Optionally, the performance is divided into sub-categories (e.g. which parts of the performance were good, which ones were not, ideally with insights about what the user needs to do to improve).
The usage of the performance indicators provides the capability to quantify the performance of the user. For example, if the goal is to reach a speed of 30.0 km/h when kicking the ball during an exercise, a user who kicks the ball at 31.3 km/h does a good job, but the notion of doing a good job is relative. The quantification makes it possible to track progress and to compare the user with other users.
A goal may be defined with a single value or a range of values. Examples of goals include the ball indicator being equal to a pre-defined value, above the pre-defined value, below the pre-defined value, within a range of pre-defined values, outside the range of pre-defined values, etc.
Following is an example of calculation of a performance indicator (e.g. a score) in the context of a juggling exercise (the user is supposed to keep the ball bouncing in the air with his feet). After each touch-event identified (classified) as a foot-contact, the ball indicators “height of the ball” and “spin of the ball” are calculated. We assume that the current score is 78.3 points. We also assume that the currently identified foot-contact is associated with the values height=0.87 m and spin=121 rpm. A scoring algorithm determines that this foot-contact scores 5.6 points and the current score is updated to 78.3+5.6=83.9 points. At the end of the juggling exercise, a final score is allocated to the user performing the juggling exercise.
Following is another example of calculation of a performance indicator (e.g. a score) in the context of a free kick exercise (the user is supposed to kick the ball hard enough for the ball to reach a pre-defined very high speed, for example 100 km/h). After each touch-event identified (classified) as a foot-contact, the ball indicator “speed of the ball” is calculated. We assume that the current score is 51.5 points. We also assume that the currently identified foot-contact is associated with the value speed=83 km/h. A scoring algorithm determines that this foot-contact scores 8.3 points (based on a linear calculation of the points allocating 10 points for a speed of 100 km/h or more) and the current score is updated to 51.5+8.3=59.8 points. At the end of the free kick exercise, a final score is allocated to the user performing the free kick exercise.
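The following minimal sketch reproduces the arithmetic of the free kick example: a linear scale allocating 10 points for a speed of 100 km/h or more, with the running score updated after each foot-contact. The function name and structure are illustrative; the juggling example above would use a different scoring function combining height and spin.

```python
# Hypothetical scoring sketch matching the free kick example: a linear scale
# allocating 10 points at 100 km/h or more, proportionally fewer below.

TARGET_SPEED_KMH = 100.0
MAX_POINTS = 10.0

def free_kick_points(speed_kmh):
    """Points contributed by one foot-contact, capped at MAX_POINTS."""
    return min(speed_kmh, TARGET_SPEED_KMH) / TARGET_SPEED_KMH * MAX_POINTS

score = 51.5
score += free_kick_points(83.0)  # 8.3 points for a kick measured at 83 km/h
print(round(score, 1))           # 59.8
```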
The calculation of a performance indicator may also be based on an average value of a ball indicator, or on a maximum (alternatively minimum) value of a ball indicator. This can be applied to a calculation of the performance indicator during the interaction session or to a calculation of the performance indicator at the end of the interaction session.
Following are additional examples of training exercises with a soccer ball consisting of juggling with the soccer ball: juggling with one foot, juggling with one thigh, and juggling with the head. Exemplary goals for these training exercises include performing a given number of juggles within a given period of time, performing as many juggles as possible within a given period of time, performing a given number of juggles as fast as possible, etc. The calculation of a corresponding performance indicator (e.g., a score) takes into consideration at least some of the following ball indicators: number of touch-events identified (classified) as a foot-contact, number of touch-events identified (classified) as a thigh-contact, number of touch-events identified (classified) as a head contact, timing of each touch-event. Following are examples of feedbacks determined during the juggling exercise, based on the evolution of the performance indicator and/or at least some of the ball indicators during the execution of the juggling exercise: an audio feedback indicative that the user should accelerate the tempo of the juggles, an audio feedback indicative that the user should decelerate the tempo of the juggles, an audio feedback indicative that the user should maintain the tempo of the juggles, etc.
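As an illustration of how such tempo feedback could be selected, the following sketch derives an average contact frequency from the timings of the touch-events and compares it to a target tempo. The target tempo and the tolerance are illustrative assumptions; in practice they would depend on the exercise and its goals.

```python
# Hypothetical selection of the tempo audio feedback for a juggling exercise.
# The target tempo and the 10% tolerance are illustrative assumptions.

def tempo_feedback(touch_timings_s, target_hz, tolerance=0.10):
    """Return the audio feedback to play, given the timings (seconds) of recent touch-events."""
    if len(touch_timings_s) < 2:
        return None
    intervals = [b - a for a, b in zip(touch_timings_s, touch_timings_s[1:])]
    measured_hz = 1.0 / (sum(intervals) / len(intervals))
    if measured_hz < target_hz * (1.0 - tolerance):
        return "Accelerate the tempo of the juggles"
    if measured_hz > target_hz * (1.0 + tolerance):
        return "Decelerate the tempo of the juggles"
    return "Maintain the tempo of the juggles"

# One juggle every 0.9 s is about 1.1 Hz, below a 1.5 Hz target: the user should accelerate.
print(tempo_feedback([0.0, 0.9, 1.8, 2.7], target_hz=1.5))
```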
The performance indicators are used to provide feedback to the users. For a given interaction session (e.g. a given training exercise), the one or more performance indicators generated in the context of the given interaction session are used to generate acoustic and/or visual feedback. A first example is a visual feedback, which includes displaying one or more diagrams on the display 150, based on the value of the performance indicator(s).
As mentioned previously, consecutive audio feedbacks can be provided during the interaction session and/or at least one audio feedback can be provided at the end of the interaction session. When the user is not performing well during the interaction session, the audio feedback indicates that the user is not performing well; and may also provide a recommendation of how to improve the performance. When the user is performing well during the interaction session, the audio feedback indicates that the user is performing well; and may also provide encouragements to the user. Alternatively, if the user is performing well, no audio feedback is provided (only negative audio feedback is provided when needed, it being assumed that no audio feedback means that everything is going well in terms of performance). For an audio feedback provided at the end of the interaction session, if the user has not performed well, the audio feedback may also provide a recommendation of how to improve the performance at the next interaction session.
Following are examples of recommendations which can be provided during or at the end of the interaction session: "Try to add a small amount of backspin to keep more control during the juggling", "Try to hit the ball lower". Following is an example of a recommendation which can be provided at the end of the interaction session: "Follow the linked tutorial to learn how to kick with more power".
Other types of audio feedback may also be supported by the application 112, which are not directly related to the performance of the user. One example of such additional audio feedback consists of navigation audio feedback, which helps the user navigate through the application 112. For instance, reading out loud the name of the currently selected menu or view (e.g. "Dashboard"), indicating to the user what to select next in a current menu (e.g. "Choose your favorite exercise"), greeting the user with a welcoming message when opening the application (e.g. "Great to have you here again"), informing the user about certain events (e.g. "Your smart ball is connected now"), reminding everyone whose turn it is in a multiplayer exercise (e.g. "Next player"), etc. This type of audio feedback increases the convenience for the user, since the user does not need to look at the display 150 during the execution of an exercise (e.g. in multiplayer mode), but can instead have fun with the ball and only engage with the computing device 100 afterwards to see the results on the display 150.
Following is an exemplary use case (referred to as beat attack), where an audio sequence (e.g. a song) is played while the user is performing an interaction session (e.g. a training exercise with the smart ball referred to as a beat attack exercise).
Upon selection of the beat attack exercise by the user via the application 112, a corresponding audio sequence is determined. The audio sequence may include instrumentals only, lyrics only, or a combination of instrumentals and lyrics (e.g. a song). The audio sequence is selected by the user among a series of audio sequences associated with the beat attack exercise. Alternatively, a pre-defined audio sequence is associated to the beat attack exercise.
The audio sequence is played during the execution of the beat attack exercise, and one of the objective(s) of the exercise for the user is to synchronize actions performed during the exercise with the beat of the audio sequence. For example, foot touches with the ball need to be synchronized with the beat of the audio sequence.
As described previously, touch-events consisting of a foot-contact are detected based on the ball data, and the corresponding ball indicator (the timing of the touch-event) is determined. The consecutive timings of the touch-events are used to calculate a corresponding ball indicator consisting of an average contact frequency with the ball. The average contact frequency is compared to the beat of the audio sequence, to generate a corresponding performance indicator. The performance indicator is representative of the synchronization of the average contact frequency with the beat of the audio sequence. Following is an example of audio feedback determined based on the performance indicator: "on average, you hit the beat of the song only 30 ms too early - good rhythm, nice fluency in your skills".
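The following sketch illustrates one possible way to quantify this synchronization. Instead of comparing only the average contact frequency, it measures the signed offset between each foot-contact and the nearest beat and averages those offsets, which is consistent with the "30 ms too early" feedback above. The tempo value and the message wording are illustrative assumptions.

```python
# Hypothetical beat attack performance indicator: average signed offset (in ms)
# between foot-contacts and the nearest beat of the audio sequence. The tempo
# and the feedback wording are illustrative assumptions.

def average_beat_offset_ms(touch_timings_s, bpm, start_s=0.0):
    """Average signed offset in milliseconds; negative means "too early"."""
    beat_period = 60.0 / bpm
    offsets_ms = []
    for t in touch_timings_s:
        phase = (t - start_s) % beat_period
        offset = phase if phase <= beat_period / 2 else phase - beat_period
        offsets_ms.append(offset * 1000.0)
    return sum(offsets_ms) / len(offsets_ms)

offset = average_beat_offset_ms([0.47, 0.98, 1.46, 1.97], bpm=120)
print(f"on average, you hit the beat {abs(offset):.0f} ms "
      f"{'too early' if offset < 0 else 'too late'}")  # 30 ms too early
```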
Reference is now made concurrently to
The first ball indicator is a spin value. The calculated value of the first ball indicator is 42 and the target value (goal) of the first ball indicator is 42. The second ball indicator is a number of touches. The calculated value of the second ball indicator is 126 and there is no target value for the second ball indicator. The third ball indicator is a combined height of the trajectories of the ball during the training exercise. The calculated value of the third ball indicator is 640 meters and the target value (goal) of the third ball indicator is 645 meters.
The performance indicator is a score which is calculated by taking into consideration the values of each of the three ball indicators with respect to the corresponding target values if any (the score may not take into consideration the second ball indicator, since there is no target value for the second ball indicator). The calculated value of the score is 76, and the target value of the score is 70, which is representative of an outperforming execution of the training exercise.
Following is a description of an exemplary implementation of the functionality of the training application 112 consisting of providing a gamified user experience.
The training exercises are presented as games, with challenges to be taken up, coaching presented in an entertaining form, etc. Furthermore, the training application 112 may be integrated with social networks, to share the progress of the user, to receive feedback (e.g. encouragements, advice, comments, etc.) from other users or provide feedback to other users, to participate in competitions and challenges with other users in relation to training exercises via the social networks, etc.
For example, the training application 112 uses the previously mentioned performance indicators in the context of a gamified training experience. The gamified training experience includes one or more global performance objectives defined for the training exercise. Each time the training exercise is performed by the user, the one or more performance indicators generated in relation to the currently executed instance of the training exercise are processed to determine their contribution to the one or more global performance objectives defined for the training exercise. The training application 112 is capable of managing several training exercises, each training exercise having its own global performance objective(s) and performance indicator(s).
Furthermore, to render the training experience more enjoyable, the progress of the user with respect to his global performance objectives can be shared with other users of the training application 112, via a social network. In this case, the training application 112 allows various users to compete with each other, to be the first to reach similar global performance objectives.
In the context of the gamified user experience, information related to a user (e.g. history of the values of the performance indicators calculated for the completed training exercises, advancement of the global performance objectives, etc.) may be transmitted by the training application 112 to a remote server implementing a cloud storage of the information related to multiple users of the training application 112.
Referring now concurrently to
The processing unit 310 of the embedded monitoring platform 300 executes an application 312, which implements at least some of the functionalities of the application 112 executed on the computing device 100. For this purpose, the application 312 uses at least some of the previously described ball data (generated by the processing unit 310 based on the sensor data generated by the sensor(s) 320).
For example, the application 312 is adapted to generate at least some of the ball indicators previously described in relation to the application 112.
Optionally, the application 312 is also adapted to generate at least some of the performance indicators previously described in relation to the application 112.
Based on the capabilities of the application 312, the data wirelessly transmitted by the embedded monitoring platform 300 to the computing device 100 include at least some of the previously described ball data, and at least some of the previously described indicators (including one or more ball indicators and optionally one or more performance indicators).
Alternatively, the application 312 has the capability to generate all the previously described ball indicators and optionally one or more of the previously described performance indicators. In this case, the data wirelessly transmitted by the embedded monitoring platform 300 to the computing device 100 may not include the previously described ball data, but include the ball indicators and optionally the one or more performance indicators.
The application 312 may also have the capability to determine one or more of the previously described audio feedbacks, based on indicators generated locally by the application 312. An audio feedback determined by the application 312 is directly transmitted to the audio device 400 via the wireless communication interface 330. In an alternative implementation not represented in
In order to implement the functionalities of the training application 312, configuration data need to be transmitted by the application 112 to the application 312. For example, the user of the computing device 100 selects a training exercise via the application 112, and configuration data related to the selected training exercise are transmitted by the application 112 to the application 312. The configuration data are used for generating the indicators supported by the application 312, and optionally for determining the audio feedbacks supported by the application 312.
Reference is now made concurrently to
Furthermore, a first dedicated computer program has instructions for implementing the steps of the method 500 performed by the processing unit 110 of the computing device 100. The instructions are comprised in a non-transitory computer-readable medium (e.g. in the memory 120) of the computing device 100. The instructions, when executed by the processing unit 110, implement the steps of the method 500 performed by the processing unit 110 of the computing device 100. The instructions are deliverable to the computing device 100 via an electronically-readable medium such as a storage medium (e.g. any internally or externally attached storage device connected via USB, Firewire, SATA, etc.), or via communication links (e.g. via a communication network through the wireless communication interface 130).
Furthermore, a second dedicated computer program has instructions for implementing the steps of the method 500 performed by the processing unit 310 of the embedded monitoring platform 300. The instructions are comprised in a non-transitory computer-readable medium (e.g. in the memory 311) of the embedded monitoring platform 300. The instructions, when executed by the processing unit 310, implement the steps of the method 500 performed by the processing unit 310 of the embedded monitoring platform 300. The instructions are deliverable to the embedded monitoring platform 300 via communication links (e.g. via a communication network through the wireless communication interface 330).
The method 500 comprises the step 505 of generating sensor data by at least one sensor 320 of the embedded monitoring platform 300. The sensor data are representative of the interaction session of a user with the smart ball 200. Step 505 is executed by the processing unit 310 of the embedded monitoring platform 300. The sensor data have been described previously.
The method 500 comprises the step 510 of generating ball data based on the sensor data generated at step 505. Step 510 is executed by the processing unit 310 of the embedded monitoring platform 300. The ball data have been described previously.
The method 500 comprises the step 515 of transmitting the ball data to the computing device 100 via the wireless communication interface 330 of the embedded monitoring platform 300. Step 515 is executed by the processing unit 310 of the embedded monitoring platform 300.
The method 500 comprises the step 520 of receiving the ball data via the wireless communication interface 130 of the computing device 100. Step 520 is executed by the processing unit 110 of the computing device 100.
The method 500 comprises the step 525 of processing the ball data to generate indicators. Step 525 is executed by the processing unit 110 of the computing device 100. The indicators have been described previously.
The method 500 comprises the step 530 of determining an audio feedback for the user based on at least one of the indicators. The audio feedback provides a feedback to the user on the execution of the interaction session by the user. Step 530 is executed by the processing unit 110 of the computing device 100. The audio feedback has been described previously.
The method 500 comprises the step 535 of transmitting the audio feedback to an audio device 400. Step 535 is executed by the processing unit 110 of the computing device 100. As mentioned previously, the audio feedback is then played by the audio device 400 (the audio device 400 being integrated to the computing device 100 or being a remote audio device (e.g. a pair of earbuds) as illustrated in the Figures).
The various steps of the method 500 have not been described in detail, since such a detailed description has already been provided previously and applies to the steps of the method 500.
The method 500 can be generalized to a sport device comprising embedded sensor(s) 320 capable of generating sensor data representative of interactions of a user with the sport device during a practice of a corresponding sport. The sport device also comprises components (e.g. the processing unit 310 and the wireless communication interface 330) capable of generating and wirelessly transmitting device data based on the sensor data. Examples of sport devices different from the previously described smart ball 200 include a racket (e.g. for practicing tennis or table tennis), a golf club, a baseball bat, etc.
Reference is now made concurrently to
The following exemplary design is adapted to a soccer ball comprising an internal bladder and an external skin, as is well known in the art. This design can be adapted to any kind of ball comprising a bladder and a skin (e.g. a basketball ball, a volleyball ball, an American football ball, etc.).
The flexible PCB 300, with all its components, is integrated between the bladder and the skin of the ball 200. All (electric/electronic) components and materials are integrated in such a way that, from the outside, the ball 200 appears as a normal ball without digital elements, to ensure that the user is not influenced in his interactions with the ball 200.
In a first embodiment, all the components of the flexible PCB are grouped and assembled to form a compact flexible PCB. In an exemplary implementation, the compact flexible PCB is located near a valve of the ball 200.
In a second embodiment, the components of the flexible PCB are spread around the bladder and assembled to form an elongated flexible PCB. The shape of the PCB is adapted to accommodate the spreading of the components. In particular, in an exemplary implementation, the sensors 320 are spread at a distance from one another around the bladder.
Although the present disclosure has been described hereinabove by way of non-restrictive, illustrative embodiments thereof, these embodiments may be modified at will within the scope of the appended claims without departing from the spirit and nature of the present disclosure.
Number | Date | Country
63424684 | Nov 2022 | US