The invention relates to a force sensing scheme, and more particularly to a force sensor device and a method of operating the force sensor device.
Generally speaking, a conventional force sensor usually uses a strain gauge/force-sensitive resistor or a Micro-Electro-Mechanical Systems (MEMS) sensor. However, such conventional force sensors have shortcomings for industrial use. For example, a conventional force sensor may be heavy or fragile, and may require a complicated system design to implement.
Therefore, one of the objectives of the invention is to provide a force sensor device and a corresponding method to solve the above-mentioned problems.
According to embodiments of the invention, a force sensor device is disclosed. The force sensor device comprises a first structure component, an optical sensor, and a flexible structure component. The optical sensor has pixel units and is disposed on the first structure component. The flexible structure component has a convex portion and is assembled with the first structure component to form a chamber in which the optical sensor is disposed. The optical sensor is arranged for sensing a light ray transmitted from the flexible structure component to at least one pixel unit to generate at least one differential image, and then detecting a user's control force applied to the flexible structure component according to the at least one differential image. The at least one differential image is at least one temporal differential image, generated from successive pixel values of a single pixel unit, or at least one spatial differential image, generated based on temporal differential images of at least two neighboring pixel units.
According to the embodiments, a method of a force sensor device is disclosed. The method comprises: providing a first structure component; providing an optical sensor having pixel units and disposed on the first structure component; using a flexible structure component having a convex portion, the flexible structure component being assembled with the first structure component to form a chamber in which the optical sensor is disposed; sensing a light ray transmitted from the flexible structure component to at least one pixel unit to generate at least one differential image; and detecting a user's control force applied to the flexible structure component according to the at least one differential image. The at least one differential image is at least one temporal differential image, generated from successive pixel values of a single pixel unit, or at least one spatial differential image, generated based on temporal differential images of at least two neighboring pixel units.
According to the embodiments, the circuit cost can be significantly reduced, to achieve a low-cost force sensor system. The response time is fast, and fewer computing resources are consumed.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
The invention aims at providing a technical solution: a force sensor device capable of using an optical sensing scheme to detect image patterns corresponding to the shape deformation caused by a user's different control behaviors, such as different control forces, stresses, and/or directions, so as to accurately estimate and detect those control behaviors. The circuit cost can be significantly reduced, to achieve a low-cost force sensor system. The response time is fast, and fewer computing resources are consumed.
The optical sensor 110 is disposed on the top surface of the bottom structure component 105, which is, for example, a flexible printed circuit (FPC) or a printed circuit board (PCB); this is not intended to be a limitation. The flexible top structure component 115 is assembled with the bottom structure component 105 to form a chamber space 120 (such as an empty space, or a space filled with other transparent or translucent material(s) to improve the user's feel of control) in which the optical sensor 110 is disposed, as shown in
The optical sensor 110 comprises a processor 1105 and a sensor array 1110 having multiple pixel units, such as pixels or sub-pixels, and each pixel unit is used for sensing the light ray(s) transmitted from the flexible top structure component 115 so as to generate one or more pixel values.
In this embodiment, the force sensor device 100 may be disposed in a still system (but not limited thereto) with no motion or almost no motion; that is, the force sensor device 100 is not moved or shifted. At least one portion of the flexible top structure component 115 may be implemented using translucent or transparent material(s), and the light ray(s) transmitted from the flexible top structure component 115 may be associated with such translucent/transparent material(s). For example, the external ambient light ray(s) may penetrate through a translucent/transparent material, i.e. may be transmitted from an outer surface of the flexible top structure component 115 to an inner surface of the flexible top structure component 115, and then from such inner surface into the optical sensor 110. The optical sensor 110 can use the sensor array 1110 to sense the penetrated light ray(s) to generate one or more optical images.
In this embodiment, a pixel unit included in the sensor array 1110 is, for example, equivalent to an artificial intelligence (AI) pixel unit, which can be arranged to sense the penetrated light ray to generate pixel values at different timings, i.e. successive pixel values, and to compare its successive pixel values to determine whether a pixel value changes, so as to determine whether to generate and output a differential image, such as pixel-level difference information. The circuit cost can be significantly reduced, and this achieves a low-cost force sensor system. Also, the AI pixel unit's response time is fast, and it consumes fewer computing resources.
In the still system, for example, the condition of the external ambient light ray(s) changes infrequently or hardly at all, and thus the condition of the penetrated light ray(s) also changes infrequently or hardly at all. In this situation, when a difference between two successive pixel values (e.g. a current pixel value and a previous pixel value) generated by a pixel unit, or a difference between the current pixel value and a reference pixel value of the pixel unit, is smaller than a specific threshold, the pixel unit is arranged to determine that the pixel value does not change, and does not generate and output a differential image. A differential image can be a set of the difference(s) of pixel values. Alternatively, when the difference is larger than the specific threshold, the pixel unit is arranged to determine that the pixel value does change, and the sensor array 1110 is arranged to generate and output a differential image to the processor 1105. Thus, the sensor array 1110 can output difference/differential image information in response to an event in which a pixel unit detects that its pixel value changes.
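The threshold-based decision above can be pictured with a minimal sketch. This is an illustrative software model only, not the actual pixel-unit circuit; the function name, the array representation, and the event flag are assumptions made for illustration:

```python
import numpy as np

def temporal_differential(current, previous, threshold):
    """Compare each pixel unit's current value against its previous (or
    reference) value; report a difference only where its magnitude meets
    the threshold, so unchanged pixels produce no output."""
    diff = current.astype(np.int32) - previous.astype(np.int32)
    changed = np.abs(diff) >= threshold
    out = np.zeros_like(diff)
    out[changed] = diff[changed]
    # Second return value models the "event" that triggers readout.
    return out, bool(changed.any())
```

When no pixel value changes, no differential image is produced, which models the skipped readout described above.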
In practice, for example, a differential image generated and outputted by the sensor array 1110 can be a temporal differential image, i.e. a difference image obtained by differencing pixel values of the same pixel captured at successive capture times, and such temporal differential image with temporal intensity can be outputted from the sensor array 1110 to the processor 1105 of the optical sensor 110. Alternatively, in another embodiment, a differential image generated and outputted by the sensor array 1110 can be a spatial differential image, which may be a difference image obtained by differencing pixel values captured by two neighboring or adjacent pixel units, or a difference image obtained by differencing the temporal differential pixel values of neighboring or adjacent pixel units. In the embodiments, a difference between a pixel unit's pixel value and a neighboring or adjacent pixel unit's pixel value can be used as a spatial differential image of the sensor array 1110. It should be noted that the sensor array 1110 in another embodiment may output both the temporal differential image and the spatial differential image associated with the current optical image to the processor 1105. Thus, the optical sensor 110 can sense light ray(s) transmitted from the flexible top structure component 115 to the sensor array 1110 to generate at least one differential image, such as temporal differential image(s) and/or spatial differential image(s). It should also be noted that the circuit design of the sensor array 1110 applied in the force sensor device 100 gives the force sensor device 100 a fast response time, since only partial pixel values need to be read out and transmitted to the processor 1105, and also saves power, since the readout of an image with no pixel changes and the readout of a row with no pixel changes can be skipped.
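As a rough illustration of the spatial variant, the following sketch derives a spatial differential image by differencing the temporal differential values of horizontally adjacent pixel units. The choice of right-hand neighbors and the function name are assumptions; any neighbor direction would serve the same purpose:

```python
import numpy as np

def spatial_differential(temporal_diff):
    """Difference each pixel unit's temporal differential value against
    that of its right-hand neighbor (one possible neighbor choice).
    The result has one fewer column than the input."""
    t = temporal_diff.astype(np.int32)
    return t[:, :-1] - t[:, 1:]
```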
In this embodiment, the flexible top structure component 115 has the translucent/transparent material portion. A user's different control behaviors/forces/directions, applied to the flexible top structure component 115, may block the external ambient light ray(s) at different angles/points/amounts and/or change the penetrated light ray(s) in different ways, and the optical sensor 110 can sense and detect the different changes of the penetrated light ray(s) to generate different image patterns corresponding to different differential image(s), so as to detect the user's different control behaviors/forces/directions. The processor 1105 of the optical sensor 110 can be arranged to detect a user's control force, applied to the flexible top structure component 115, based on the at least one differential image of the sensor array 1110.
In the first example of
In the second example of
In practice, the displacement, size, and/or shape of the differential image generated by the changed pixel values, e.g. the image o2, can be detected and calculated by the processor 1105 to estimate the user's control behavior. For instance, the image o2 may be a shadow image or an image pattern having smooth circular edge(s) (but not limited thereto); the image pattern may have a circular shape or any oval shape. However, this is not intended to be a limitation: the image o2 may also have a triangular shape, a trapezoidal shape, or other shapes. The processor 1105 can detect and calculate the displacement, size, and/or shape of the image o2 in the whole monitoring image f2. For example, the processor 1105 can calculate the X-axis pixel distance X2, the Y-axis pixel distance Y2, the pixel distance D1, and the pixel distance D2 of the image o2, to precisely generate, calculate, and estimate the coordinate point (θ2, φ2) at the outer surface of the flexible top structure component 115 touched by the user, the magnitude F2 of the force applied by the user, and the force direction N2 applied by the user. Thus, the optical sensor 110 (or force sensor device 100) can precisely detect and estimate the user's control operation and then perform a corresponding response operation.
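One way to picture this geometry calculation is the sketch below, which locates the changed-pixel blob in a differential image, measures its centroid and X/Y extent, and maps the centroid to angular touch coordinates on the dome. The specific mapping, the names, and the use of blob area as a force-magnitude proxy are illustrative assumptions, not the patented calculation:

```python
import numpy as np

def estimate_contact(diff_image, dome_radius_px):
    """Hypothetical estimator for a changed-pixel blob: centroid
    (touch location), bounding extent (cf. X2/Y2), and area (a crude
    proxy for how hard the dome is pressed)."""
    ys, xs = np.nonzero(diff_image)
    if xs.size == 0:
        return None  # no pixel changed, so no contact to estimate
    h, w = diff_image.shape
    cx, cy = xs.mean(), ys.mean()        # blob centroid in pixels
    x_extent = xs.max() - xs.min() + 1   # X-axis pixel distance
    y_extent = ys.max() - ys.min() + 1   # Y-axis pixel distance
    area = xs.size                       # larger contact patch -> larger force (assumed)
    # Map the centroid offset from the image center to angular coordinates.
    theta = np.arctan2(cy - h / 2, cx - w / 2)               # azimuth
    phi = np.hypot(cx - w / 2, cy - h / 2) / dome_radius_px  # radial angle
    return {"theta": theta, "phi": phi,
            "area": area, "extent": (x_extent, y_extent)}
```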
The displacement, size, and/or shape of the differential image generated by the changed pixel values, e.g. the image o2, changes in response to a different control behavior of the user. For example, in the third example of
Similarly, the displacement, size, and/or shape of the image o2′ can be detected and calculated by the processor 1105 to estimate the user's second control behavior. For instance, the processor 1105 can calculate the X-axis pixel distance X2′, the Y-axis pixel distance Y2′, the pixel distance D1′, and the pixel distance D2′ of the image o2′, to precisely generate, calculate, and estimate the coordinate point (θ2′, φ2′) at the outer surface of the flexible top structure component 115 touched by the user, the magnitude F2′ of the force applied by the user, and the force direction N2′ applied by the user. Thus, the optical sensor 110 (or force sensor device 100) can precisely detect and estimate the user's control operation and then perform a corresponding response operation.
In addition, for example, if the two monitoring images f2 and f3 are successive monitoring images generated by the processor 1105, the processor 1105 can further estimate successive user control behaviors based on the image change between the images o2 and o2′, so as to determine that the user may be performing a particular control operation or a particular sequence of control operations.
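The successive-image estimation can be sketched as tracking the blob centroid across monitoring images and reporting its displacement, which a controller could then match against a set of control operations. The function name and the centroid-displacement representation are assumptions for illustration:

```python
import numpy as np

def track_contact(frames):
    """Sketch: follow the changed-pixel blob across successive monitoring
    images and return the centroid displacement (dx, dy) between each pair
    of consecutive detections."""
    moves = []
    prev = None
    for frame in frames:
        ys, xs = np.nonzero(frame)
        if xs.size == 0:
            prev = None  # contact lifted; restart tracking
            continue
        centroid = (xs.mean(), ys.mean())
        if prev is not None:
            moves.append((centroid[0] - prev[0], centroid[1] - prev[1]))
        prev = centroid
    return moves
```

A sequence of displacements such as a steady rightward drift could then be interpreted as one particular control operation (e.g. a swipe), while a sequence of appear/disappear events could be interpreted as repeated presses.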
Further, it should be noted that the light source unit 305 can be disposed at any other position in the chamber space 120; this is not intended to be a limitation. In another embodiment, the light source unit 305 may be comprised within the optical sensor 110. In addition, the chamber space 120 in
In the first example of
In the second example of
Similarly, in practice, the processor 1105 can sense and calculate the displacement, size, and/or shape of the image o3 to estimate the user's control behavior. For instance, the image o3 may be an image pattern having smooth circular edge(s) (but not limited thereto); the image pattern may have a circular shape or any oval shape, or may have a triangular shape, a trapezoidal shape, or other shapes. The processor 1105 can detect and calculate the displacement, size, and/or shape of the image o3 in the whole image f5 to calculate the X-axis pixel distance X3, the Y-axis pixel distance Y3, the pixel distance D3, and the pixel distance D4 of the image o3, to precisely calculate and estimate the coordinate point (θ3, φ3) at the outer surface of the flexible top structure component 115 touched by the user, the magnitude F3 of the force applied by the user, and the force direction N3 applied by the user.
The displacement, size, and/or shape of the image o3 changes in response to a different control behavior of the user. For example, in the third example of
For instance, the processor 1105 can calculate the X-axis pixel distance X3′, the Y-axis pixel distance Y3′, the pixel distance D3′, and the pixel distance D4′ of the image o3′, to precisely generate, calculate, and estimate the coordinate point (θ3′, φ3′) at the outer surface of the flexible top structure component 115 touched by the user, the magnitude F3′ of the force applied by the user, and the force direction N3′ applied by the user. Thus, the optical sensor 110 (or force sensor device 300) can precisely detect and estimate the user's control operation and then perform a corresponding response operation.
In addition, if the two monitoring images f5 and f6 are successive images generated by the processor 1105, the processor 1105 can further estimate successive user control behaviors based on the image change between the image patterns o3 and o3′, so as to determine that the user may be performing a particular control operation or a particular sequence of control operations.
Further, in other embodiments of
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention.
Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
This application is a continuation-in-part application of U.S. patent application Ser. No. 17/191,572, filed on Mar. 3, 2021, which is a continuation application of U.S. patent application Ser. No. 16/395,226, filed on Apr. 25, 2019, which is a continuation-in-part of U.S. application Ser. No. 15/681,415, filed on Aug. 20, 2017.
| Number | Date | Country |
|---|---|---|
| Parent 16395226 | Apr 2019 | US |
| Child 17191572 | | US |

| Number | Date | Country |
|---|---|---|
| Parent 17191572 | Mar 2021 | US |
| Child 17577005 | | US |
| Parent 15681415 | Aug 2017 | US |
| Child 16395226 | | US |