The disclosure relates to a tactile rendering system and a tactile rendering method.
With the rapid development of science and technology, various interactive technologies have been developed. For example, interactive display technology could be applied to various fields such as education, gaming, sports, and automobiles. Because current interactive display technology provides only visual feedback, interactive modalities such as gestures, stylus pens, and hand-held joysticks offer no realistic tactile feedback, which greatly reduces the user experience.
The disclosure is directed to a tactile rendering system and a tactile rendering method, which apply a texture imaging procedure to compare generative texture images, apply a tactile rendering procedure to generate a tactile rendering signal, and control a tactile feedback module to produce a realistic tactile sensation.
According to one embodiment, a tactile rendering system is provided. The tactile rendering system includes a texture image processing subsystem, a generative texture image comparison subsystem and a tactile rendering subsystem. The texture image processing subsystem includes an image capture module and a texture image processing module. The image capture module is used to obtain a real material surface image. The texture image processing module is used to obtain a real texture image according to the real material surface image. The generative texture image comparison subsystem includes a texture image feature factor capturing module, a tactile feature database and a tactile feature database factor search module. The texture image feature factor capturing module is used to analyze at least one real texture image feature factor according to the real texture image. The tactile feature database factor search module is used to search the tactile feature database to obtain at least one tactile data according to the at least one real texture image feature factor. The tactile rendering subsystem includes a tactile human rendering generation module. The tactile human rendering generation module is used to generate at least one tactile rendering signal according to the at least one tactile data.
According to another embodiment, a tactile rendering method is provided. The tactile rendering method includes obtaining a real material surface image; obtaining a real texture image according to the real material surface image; obtaining at least one real texture image feature factor according to the real texture image; searching a tactile feature database according to the at least one real texture image feature factor, to obtain at least one tactile data; and generating at least one tactile rendering signal according to the at least one tactile data.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
The technical terms used in this specification have their customary meanings in this technical field. If this specification explains or defines some of these terms, those explanations or definitions shall prevail. Each embodiment of the present disclosure has one or more technical features. To the extent possible, a person with ordinary skill in the art may selectively implement some or all of the technical features in any embodiment, or selectively combine some or all of the technical features in these embodiments.
Please refer to
In an application scenario, the user could wear a virtual display 800 (such as, but not limited to, a head-mounted display or VR glasses), or use a flat-panel display, to view a virtual object. The virtual object is displayed in front of the user. The user could interact with the virtual object while wearing the tactile feedback module 900 (such as, but not limited to, a glove that actuates a vibration device and an air bag). Once the tactile feedback module 900 moves to the position of the virtual object, the corresponding tactile sensation would be generated according to the texture of the virtual object. In this way, users could obtain a realistic tactile experience when interacting with the virtual object.
Please refer to
In step S1, the image capture module 111 obtains a real material surface image IM1 and digitizes the image. The real material surface image IM1 is, for example, but not limited to, a surface image of leather, rubber, cotton, metal, glass, or another material.
In step S2, the texture image processing module 112 obtains a real texture image IM2 according to the real material surface image IM1. The texture image processing module 112 obtains texture information according to, for example, but not limited to, color changes or grayscale changes on the real material surface image IM1, and obtains the real texture image IM2 accordingly.
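For illustration only, the grayscale-change approach of step S2 could be realized as a gradient-magnitude texture map. The following is a minimal sketch under that assumption; the function name, the normalization, and the use of NumPy are illustrative and not part of the disclosure.

```python
# Minimal sketch of step S2: deriving a texture image from grayscale
# changes on the captured surface image (illustrative, not the
# disclosed implementation).
import numpy as np

def extract_texture_image(surface_image: np.ndarray) -> np.ndarray:
    """Return a texture map emphasizing local grayscale variation."""
    gray = surface_image.mean(axis=2) if surface_image.ndim == 3 else surface_image
    # Local gradients capture the grayscale changes described above.
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    # Normalize to [0, 1] so later feature extraction is scale-independent.
    return magnitude / (magnitude.max() + 1e-12)
```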
The generative texture image comparison subsystem 120 includes a texture image feature factor capturing module 121, a tactile feature database 122 and a tactile feature database factor search module 123. The texture image feature factor capturing module 121 and/or the tactile feature database factor search module 123 is, for example, but not limited to, a circuit, a circuit board, a storage device storing program code, or a chip. The tactile feature database 122 is, for example, but not limited to, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk drive (HDD), solid state drive (SSD) or similar component or a combination of the above components.
In step S3, the texture image feature factor capturing module 121 obtains at least one real texture image feature factor FT1 according to the real texture image IM2. The real texture image feature factor FT1 is, for example, but not limited to, shape, area, gradient of shape, period of spatial variation of a rough surface, depth of spatial variation, etc. The texture image feature factor capturing module 121 could also determine whether the degree of change meets a threshold, and these values and evaluation results could be used as the real texture image feature factor FT1.
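To make the feature factors concrete, the sketch below estimates a dominant spatial period (via the power spectrum of a scan line), a depth of spatial variation, and a mean gradient from a texture image. The pixel pitch and all names are assumptions for illustration.

```python
# Illustrative extraction of feature factors named above: spatial
# period, depth of variation, and gradient of a rough surface.
import numpy as np

def texture_feature_factors(texture: np.ndarray, pixel_pitch_mm: float = 0.1) -> dict:
    profile = texture[texture.shape[0] // 2, :]        # one scan line
    spectrum = np.abs(np.fft.rfft(profile - profile.mean()))
    freqs = np.fft.rfftfreq(profile.size, d=pixel_pitch_mm)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]      # skip the DC term
    return {
        "spatial_period_mm": 1.0 / dominant if dominant > 0 else float("inf"),
        "variation_depth": float(profile.max() - profile.min()),
        "mean_gradient": float(np.mean(np.abs(np.gradient(profile)))),
    }
```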
In step S4, the tactile feature database factor search module 123 searches the tactile feature database 122 according to the real texture image feature factor FT1 to obtain at least one tactile data TH1. The tactile data TH1 is, for example, but not limited to, information such as surface fluctuations and roughness changes. The tactile data TH1 could be a quantification result of surface topography, such as local gradient, period of roughness variation, height of roughness change, etc.; or a quantification result of an image pattern, such as gray-level co-occurrence matrix energy value, entropy value, contrast value, contrast difference value, correlation value, etc.; or the curve of shape change after signal processing, such as principal components or intrinsic mode functions, etc. The tactile feature database 122 stores multiple sets of tactile data TH1. According to the similarity between each set of the tactile data TH1 and the real texture image feature factor FT1, the tactile feature database factor search module 123 first selects the set with the highest similarity.
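As one possible reading of step S4, the gray-level co-occurrence matrix (GLCM) statistics mentioned above could be computed with scikit-image and matched against database entries by a similarity metric. The database layout, the cosine metric, and the manual entropy computation are assumptions for illustration; the disclosure does not fix a particular metric.

```python
# Hedged sketch of step S4: GLCM quantification plus a highest-
# similarity lookup in a tactile feature database.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_factors(texture: np.ndarray) -> np.ndarray:
    img = (texture * 255).astype(np.uint8)
    glcm = graycomatrix(img, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    energy = graycoprops(glcm, "energy")[0, 0]
    contrast = graycoprops(glcm, "contrast")[0, 0]
    correlation = graycoprops(glcm, "correlation")[0, 0]
    p = glcm[:, :, 0, 0]
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))   # computed manually
    return np.array([energy, contrast, correlation, entropy])

def most_similar(factors, database):
    # database: list of (feature_vector, tactile_data) pairs (assumed layout).
    sims = [np.dot(factors, f) / (np.linalg.norm(factors) * np.linalg.norm(f))
            for f, _ in database]
    return database[int(np.argmax(sims))][1]          # the tactile data TH1
```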
The tactile rendering subsystem 130 includes a tactile human rendering generation module 131. The tactile human rendering generation module 131 is, for example, but not limited to, a circuit, a circuit board, a storage device storing program code, or a chip.
In step S5, the tactile human rendering generation module 131 generates at least one tactile rendering signal RD according to the tactile data TH1. The tactile rendering signal RD is, for example, but not limited to, information such as surface fluctuation waveforms that change over time, changes of roughness, or changes of friction. With the tactile rendering signal RD, the tactile feedback module 900 could be controlled accordingly to produce a realistic tactile feeling.
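One simple way to picture step S5 is to convert a roughness period and height into a time-varying fluctuation waveform, assuming a constant fingertip scan speed. All parameters below (scan speed, duration, sample rate) are hypothetical.

```python
# Illustrative step S5: tactile data -> surface-fluctuation waveform.
import numpy as np

def render_waveform(period_mm: float, height: float,
                    scan_speed_mm_s: float = 50.0,
                    duration_s: float = 0.5,
                    sample_rate_hz: int = 2000) -> np.ndarray:
    t = np.arange(0, duration_s, 1.0 / sample_rate_hz)
    # Spatial roughness becomes a temporal frequency: speed / period.
    freq_hz = scan_speed_mm_s / period_mm
    return height * np.sin(2 * np.pi * freq_hz * t)
```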
Please refer to
In step S6, the new texture image generation module 124 generates a new texture image IM3 according to the real texture image feature factor FT1. The new texture image IM3 is, for example, different from the real texture image IM2. Alternatively, the new texture image IM3 may be the same as the real texture image IM2. The new texture image IM3 may be generated solely from the real texture image feature factor FT1, so it need not reflect extraneous information.
In step S7, the tactile parameter input module 113 inputs at least one material surface property parameter PM1. The material surface property parameter PM1 is, for example, but not limited to, friction coefficient, hardness, temperature and other parameters.
In step S8, the tactile feature database factor search module 123 obtains at least one reference texture image feature factor FT2 according to the material surface property parameter PM1 and the real texture image feature factor FT1. The reference texture image feature factor FT2 is, for example, but not limited to, shape, area, gradient of shape, period of spatial variation of a rough surface, depth of spatial variation, etc. The tactile feature database 122 stores multiple sets of tactile data TH1 and the corresponding reference texture image feature factors FT2. According to the similarity between each set of tactile data TH1 and the real texture image feature factor FT1, the tactile feature database factor search module 123 first selects the reference texture image feature factor FT2 corresponding to the highest similarity.
In step S9, the reference texture image generation module 125 generates a reference texture image IM4 according to the reference texture image feature factor FT2.
In step S10, the texture image similarity comparison module 126 compares the new texture image IM3 and the reference texture image IM4. If the similarity between the new texture image IM3 and the reference texture image IM4 is greater than a predetermined value (for example, but not limited to, 90%), or a predetermined time (for example, but not limited to, 30 ms) has elapsed, then the tactile data TH1 corresponding to the reference texture image feature factor FT2 is outputted. On the contrary, if the similarity between the new texture image IM3 and the reference texture image IM4 is not greater than the predetermined value and the predetermined time has not elapsed, then in step S10′, the tactile feature database factor search module 123 repeats step S8 to search for the next set of reference texture image feature factors FT2. If the search cannot be completed within the predetermined time (such as, but not limited to, 30 ms), the tactile data TH1 with the highest similarity found so far is outputted in step S10.
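Read procedurally, steps S8 to S10 form a search loop with a similarity threshold and a time budget. The sketch below mirrors that loop; the helper callables stand in for the modules described above and are not the disclosed implementation.

```python
# Steps S8-S10 as a timed search loop: early exit on a good match,
# fall back to the best match seen so far when the budget runs out.
import time

def search_with_budget(next_candidate, generate_image, similarity,
                       new_image, threshold=0.9, budget_s=0.030):
    best_data, best_sim = None, -1.0
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        candidate = next_candidate()           # next FT2 with its TH1
        if candidate is None:
            break                              # database exhausted
        ref_image = generate_image(candidate)  # reference texture image IM4
        sim = similarity(new_image, ref_image)
        if sim > best_sim:
            best_data, best_sim = candidate, sim
        if sim > threshold:
            return candidate                   # similarity above threshold
    return best_data                           # highest similarity so far
```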
As shown in
There could be a plurality of actuation signals AT. The actuation signals AT are, for example, vibration control signals, inflation control signals, compression control signals, voltage control signals, temperature control signals, etc. In one embodiment, the actuation signals AT could control the tactile feedback modules 900 in a time-division multiple access manner, a frequency-division multiple access manner, or a code-division multiple access manner. That is to say, different actuation signals AT are activated in turn.
In another embodiment, the actuation signals AT could control at least one tactile feedback module 900 in a spatial separation manner. That is to say, the actuation signals AT could be activated at different locations at the same time.
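The two driving manners could be pictured as follows: a time-division schedule staggers the signals so they activate in turn, while spatial separation assigns signals to distinct actuators that start together. The actuator interface below is assumed purely for illustration.

```python
# Illustrative scheduling of actuation signals AT.
def schedule_time_division(signals, slot_s=0.010):
    """Yield (start_time, signal) pairs so the signals activate in turn."""
    for i, sig in enumerate(signals):
        yield (i * slot_s, sig)

def schedule_spatial(signals, actuator_ids):
    """Pair each signal with a distinct actuator; all start at t = 0."""
    return [(0.0, actuator, sig) for actuator, sig in zip(actuator_ids, signals)]
```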
According to the above embodiment in
Please refer to
In step S12, the user posture tracking module 133 obtains a user posture PT (such as, but not limited to, the direction, speed, acceleration and other data of the user's hand).
In step S13, the displacement information analysis module 134 obtains a displacement information MV according to the user posture PT.
In step S5, the tactile human rendering generation module 131 could generate the tactile rendering signal RD according to the displacement information MV.
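As a sketch of how steps S12, S13, and S5 could fit together, the hand speed derived from the displacement information could scale the temporal frequency of the rendered waveform, so faster strokes feel like faster roughness fluctuations. The names and the constant-speed assumption are illustrative.

```python
# Illustrative motion-dependent rendering (steps S12/S13/S5).
import numpy as np

def displacement_speed(positions_mm: np.ndarray, dt_s: float) -> float:
    """Mean hand speed (mm/s) from a short window of tracked positions."""
    steps = np.linalg.norm(np.diff(positions_mm, axis=0), axis=1)
    return float(steps.mean() / dt_s)

def render_with_motion(period_mm, height, positions_mm, dt_s,
                       duration_s=0.1, sample_rate_hz=2000):
    speed = displacement_speed(positions_mm, dt_s)
    t = np.arange(0, duration_s, 1.0 / sample_rate_hz)
    # Waveform frequency follows the user's current scanning speed.
    return height * np.sin(2 * np.pi * (speed / period_mm) * t)
```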
According to the embodiment of
Please refer to
In step S14, the user human factor input module 135 inputs a user human factor HM. The user human factor HM is, for example, but not limited to, the user's just-noticeable difference (JND), age, gender, skin condition, etc.
In step S5, the tactile human rendering generation module 131 could optimize the tactile rendering signal RD according to the user human factor HM.
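One conceivable optimization using the JND is to suppress amplitude changes the user cannot perceive. The fixed-fraction JND model below is a deliberately simple assumption, not a disclosed psychophysical model.

```python
# Hedged sketch of a JND-based optimization of the rendering signal.
import numpy as np

def apply_jnd(signal: np.ndarray, jnd_fraction: float = 0.1) -> np.ndarray:
    out = signal.copy()
    last = out[0]
    for i in range(1, out.size):
        # Hold the last perceived level when the change is below the JND;
        # pass larger changes through unchanged.
        if abs(out[i] - last) < jnd_fraction * max(abs(last), 1e-9):
            out[i] = last
        else:
            last = out[i]
    return out
```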
According to the embodiment in
Please refer to
In step S15, the virtual object selection module 114 provides at least one virtual object VO. The virtual object VO is, for example, but not limited to, the virtual display 800 (shown in
In step S2, the texture image processing module 112 could obtain the real texture image IM2 according to the real material surface image IM1 and the virtual object VO. Because the virtual object VO is further considered, the accuracy of the real texture image IM2 could be improved, and the comparison of the new texture image IM3 and the reference texture image IM4 could be skipped. When the deviation of the image feature is less than the threshold (for example, but not limited to, 10%), the tactile data TH1 could be identified by the real texture image feature factor FT1.
According to the above embodiment in
Please refer to
In step S16, the previous image comparison module 115 compares the real material surface image IM1 with a previous real material surface image IM1′.
In step S17, if the difference between the real material surface image IM1 and the previous real material surface image IM1′ is lower than a predetermined value (for example, but not limited to, 20%), the previous tactile signal module 136 provides the previous tactile rendering signal RD′, such that the tactile human rendering generation module 131 adjusts the previous tactile rendering signal RD′ according to the difference between the real material surface image IM1 and the previous real material surface image IM1′ to obtain the current tactile rendering signal RD2.
Because the previous real material surface image IM1′ is further considered, the subsequent tactile rendering signal RD could be obtained directly by adjusting the previous tactile rendering signal RD′, for example by adjusting the actuation control signal intensity.
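A plausible reading of steps S16 and S17 is sketched below: when the mean image difference stays under the 20% threshold, the previous signal is mildly rescaled instead of re-running the full pipeline. The difference measure and the scaling rule are assumptions, and 8-bit images are assumed.

```python
# Illustrative steps S16/S17: reuse and adjust the previous signal RD'.
import numpy as np

def adjust_previous_signal(prev_signal, image, prev_image, threshold=0.20):
    diff = np.mean(np.abs(image.astype(float) - prev_image.astype(float))) / 255.0
    if diff >= threshold:
        return None                        # fall back to the full pipeline
    return prev_signal * (1.0 + diff)      # mild intensity adjustment
```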
According to the above embodiment in
Please refer to
In step S18, the first data transmission module 127 transmits the real texture image feature factor FT1 to a remote site, and the second data transmission module 128 receives the real texture image feature factor FT1 at the remote site. The tactile feature database 122 and the tactile feature database factor search module 123 are located at the remote site, so that the computing resources at the remote site could be used to perform fast searches.
According to the embodiment of
Please refer to
In step S20, the inertial data capturing module 116 captures inertia data IT. The inertia data IT is, for example, but not limited to, an acceleration or an azimuth angle.
In step S3, the texture image feature factor capturing module 121 obtains the real texture image feature factor FT1 according to the real texture image IM2 and the inertia data IT.
According to the embodiment in
Please refer to
In step S21, the adaptive noise suppression module 129 filters out the noise of the real texture image IM2. The adaptive noise suppression module 129 uses, for example, but not limited to, minimum mean-square error filtering or an autoregressive model to filter out the noise of the real texture image IM2.
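SciPy's local-statistics Wiener filter is one standard realization of minimum mean-square error filtering and could serve as a stand-in for the adaptive noise suppression module; the window size is an assumed parameter.

```python
# Illustrative step S21: adaptive (Wiener / MMSE) noise suppression.
import numpy as np
from scipy.signal import wiener

def denoise_texture(texture: np.ndarray, window: int = 5) -> np.ndarray:
    # Local-statistics Wiener filtering suppresses noise while keeping
    # the texture's larger-scale fluctuations.
    return wiener(texture.astype(np.float64), mysize=window)
```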
According to the above embodiment in
Please refer to
In step S22, the previous image comparison module 115 compares the real material surface image IM1 with a previous real material surface image IM1′.
In step S17′, if the similarity between the real material surface image IM1 and the previous real material surface image IM1′ is higher than a predetermined value (such as, but not limited to, 90%), the previous tactile signal module 136 provides the previous tactile rendering signal RD′.
Then, in step S11′, the tactile signal conversion module 132 converts the tactile rendering signal RD or the previous tactile rendering signal RD′ into at least one actuation signal AT. That is to say, when the similarity between the real material surface image IM1 and the previous real material surface image IM1′ is higher than the predetermined value, extraction of the real texture image feature factor FT1, search for the reference texture image feature factor FT2, generation and comparison of the new texture image IM3 and the reference texture image IM4, search for the tactile data TH1, and generation of the tactile rendering signal RD are not required.
According to the above embodiment in
The above disclosure provides various features for implementing some implementations or examples of the present disclosure. Specific examples of components and configurations (such as the numerical values or names mentioned) are described above to simplify and illustrate some implementations of the present disclosure. Additionally, some embodiments of the present disclosure may repeat reference symbols and/or letters in various instances. This repetition is for simplicity and clarity and does not inherently indicate a relationship between the various embodiments and/or configurations discussed.
It will be apparent to those skilled in the art that various modifications and variations could be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplars only, with a true scope of the disclosure being indicated by the following claims and their equivalents.
Number | Date | Country | Kind
--- | --- | --- | ---
113128332 | Jul 2024 | TW | national
This application claims the benefit of U.S. Provisional application Ser. No. 63/534,370, filed Aug. 24, 2023, and Taiwan application Serial No. 113128332, filed Jul. 30, 2024, the disclosure of which is incorporated by reference herein in its entirety.
Number | Date | Country
--- | --- | ---
63534370 | Aug 2023 | US