The disclosure relates to a method and an apparatus for proximity direction estimation for robot collision avoidance.
As robots work in dynamic environments, unexpected collisions with people, objects, and obstacles must be avoided. A robot colliding with the environment can damage itself or its surroundings, and can harm humans in the workspace. Collision avoidance systems enable the robot to detect approaching obstacles before collision, and take measures to avoid or mitigate impact. Such systems may be particularly necessary for robotic manipulators such as robot arms to safely operate in uncertain and dynamic environments. As such, there has been extensive research on collision avoidance systems for robotic manipulators. Avoiding collisions is also important for mobile robots. Examples include robot vacuum cleaners, robot floor cleaners, and outdoor self-navigating robots such as lawn mowers and trash collectors.
There are many scenarios in which collision avoidance depends on accurate short-range sensing. Many existing collision avoidance methods use cameras and computer vision-based object recognition or three-dimensional (3D) shape reconstruction to detect and react to obstacles. However, these approaches have several limitations. Their performance suffers when faced with visual occlusions, poor light conditions, and transparent or mirrored objects that are difficult to detect visually. Further, camera-based approaches are typically not accurate over very short ranges (less than 10 cm) depending on camera focal length, and any single camera has a limited field of view.
To address this need for short-range detection, proximity sensors such as ultrasonic proximity sensors, millimeter wave radar, infrared proximity sensors, and short-range light detecting and ranging (LiDAR) have been proposed for robot collision avoidance. These methods also have limitations. For example, LiDAR and millimeter wave radar are expensive, and also emanate from a point source and thus have blind spots. Effective coverage may require a large number of sensors distributed throughout the robot, and blind spots can be difficult to eliminate entirely. This complicates robotic system design and adds a significant amount of extra cost and sensor management overhead.
In accordance with an aspect of the disclosure, there is provided an apparatus for estimating a proximity direction of an obstacle, including an acoustic transmitter attached to a surface of the apparatus; a first acoustic receiver spaced apart from the surface; a second acoustic receiver spaced apart from the surface, wherein a position of the second acoustic receiver is different from a position of the first acoustic receiver with respect to the apparatus; a memory configured to store instructions; and at least one processor configured to execute the instructions to: control the acoustic transmitter to generate an acoustic wave along the surface; obtain a first proximity direction signal based on a first acoustic wave signal received via the first acoustic receiver and corresponding to the generated acoustic wave; obtain a second proximity direction signal based on a second acoustic wave signal received via the second acoustic receiver and corresponding to the generated acoustic wave; and estimate a proximity direction of an obstacle with respect to the apparatus based on the first proximity direction signal and the second proximity direction signal.
In accordance with an aspect of the disclosure, there is provided a method for estimating a proximity direction of an obstacle, the method being executed by at least one processor and including controlling an acoustic transmitter attached to a surface of an electronic device to generate an acoustic wave along the surface; obtaining a first proximity direction signal based on a first acoustic wave signal received via a first acoustic receiver spaced apart from the surface of the electronic device, wherein the first acoustic wave signal corresponds to the generated acoustic wave; obtaining a second proximity direction signal based on a second acoustic wave signal received via a second acoustic receiver spaced apart from the surface of the electronic device, wherein a position of the second acoustic receiver is different from a position of the first acoustic receiver with respect to the electronic device, and wherein the second acoustic wave signal corresponds to the generated acoustic wave; and estimating the proximity direction of the obstacle with respect to the electronic device based on the first proximity direction signal and the second proximity direction signal.
In accordance with an aspect of the disclosure, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by at least one processor of an electronic device for estimating a proximity direction of an obstacle, cause the at least one processor to: control an acoustic transmitter attached to a surface of the electronic device to generate an acoustic wave along the surface; obtain a first proximity direction signal based on a first acoustic wave signal received via a first acoustic receiver spaced apart from the surface of the electronic device, wherein the first acoustic wave signal corresponds to the generated acoustic wave; obtain a second proximity direction signal based on a second acoustic wave signal received via a second acoustic receiver spaced apart from the surface of the electronic device, wherein a position of the second acoustic receiver is different from a position of the first acoustic receiver with respect to the electronic device, and wherein the second acoustic wave signal corresponds to the generated acoustic wave; and estimate the proximity direction of the obstacle with respect to the electronic device based on the first proximity direction signal and the second proximity direction signal.
The above and other aspects, features, and advantages of embodiments of the disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Example embodiments are described in greater detail below with reference to the accompanying drawings.
In the following description, like drawing reference numerals are used for like elements, even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of the example embodiments. However, it is apparent that the example embodiments can be practiced without those specifically defined matters. Also, well-known functions or constructions are not described in detail since they would obscure the description with unnecessary detail.
Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list. For example, the expression, “at least one of a, b, and c,” should be understood as including only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or any variations of the aforementioned examples.
While such terms as “first,” “second,” etc., may be used to describe various elements, such elements must not be limited to the above terms. The above terms may be used only to distinguish one element from another.
Embodiments described herein relate to a sensing modality which may enable short-range proximity detection for objects such as robot arms. A proximity detection and proximity direction estimation system using this principle may be lightweight and inexpensive, and may be attached to an off-the-shelf robotic manipulator with minimal modifications, and provide proximity detection of all objects with sufficient cross-sectional area across an entire surface of a robot. In embodiments, the system can perform full surface and omnidirectional proximity detection and proximity direction estimation using, for example, only a single acoustic transmitter and one or more acoustic receivers, for example a pair of acoustic receivers.
In embodiments, a proximity detection and proximity direction estimation system may use an acoustic transmitter and a pair of acoustic receivers attached to a robot arm. In embodiments, the acoustic transmitter may be, for example, a piezoelectric transmitter or transducer, and the acoustic receivers may be, for example, piezoelectric receivers or transducers. In embodiments, the acoustic transmitter may transmit excitation signals through the robot arm to the one or more acoustic receivers. This acoustic energy may transfer through a whole surface of the robot arm, which may in turn couple with surrounding air and emanate an acoustic signal. This emanated signal may decay in the air, forming an “aura” surrounding the robot surface.
An approaching obstacle that enters this aura will establish a standing wave pattern between the obstacle and the robot surface, changing an acoustic impedance of the system. In embodiments, the term “obstacle” may refer to an object which may present a potential collision which is to be avoided, however embodiments are not limited thereto. For example, in embodiments the term “obstacle” may refer to an object which is a target of investigation, or a target of a potential interaction, for example an object which is to be moved, touched, pressed, grasped, etc. This change can be measured by the pair of acoustic receivers attached to the arm at a point far from the obstacle, allowing the system to perform proximity detection. In embodiments, the term “attached” may mean directly attached, however embodiments are not limited thereto. For example, in embodiments the term “attached” may mean indirectly attached, and the acoustic receivers may be for example attached to one or more intervening elements which may be directly attached to the arm. In embodiments, the term “attached” may also mean, for example, directly or indirectly disposed, fastened, affixed, coupled, connected, secured, linked, joined, etc. The proximity detection and proximity direction estimation system according to one or more embodiments may be implemented using other sound producers, such as speakers and microphones, without using piezoelectric elements.
When an acoustic receiver is attached directly to the surface of a robot, a major component of the received signal is conducted through the surface itself rather than transmitted over the air. However, only the over-the-air component may contain information useful for proximity detection. Further, a robot arm itself introduces both mechanical and electrical noise that can be received by an attached acoustic receiver.
Therefore, according to embodiments, the acoustic receivers may be mechanically decoupled from the surface, for example by suspending the acoustic receivers in the air just above the surface, in order to minimize the component of the signal received from the surface of the robot. In addition, according to embodiments, the pair of acoustic receivers may be located at two different locations, and differences between the signals received at the two different locations may be used to provide proximity direction estimation corresponding to the approaching obstacle. In embodiments, the proximity direction estimation may be performed by or include comparing the received signals to one or more threshold values, or for example one or more reference signals, in order to estimate a relative direction of the approaching obstacle with respect to the robot arm. However, embodiments are not limited thereto, and the proximity direction estimation may be performed by comparing the received signals with each other, for example by comparing at least one of the received signals with at least another of the received signals.
The apparatus 100 and any portion of the apparatus 100 may be included or implemented in a robot and/or an electronic device. Although apparatus 100 is illustrated as a robotic arm in
As shown in
The acoustic transmitter 110 and the acoustic receivers 120 are disposed adjacent to a surface 105 of the apparatus 100, which may be for example a robot and/or an electronic device. For example, the acoustic transmitter 110 may be coupled to, disposed on, or embedded within the surface 105 of a robot arm, and the acoustic receivers 120 may be suspended above the surface 105 of the robot arm. In embodiments, the acoustic receivers 120 may be suspended by corresponding connection structures 122. For example, first acoustic receiver 120a may be coupled to first connection structure 122a, which may be coupled to, disposed on, or embedded within the surface 105, and which may suspend the first acoustic receiver 120a at a certain height above the surface 105. Similarly, second acoustic receiver 120b may be coupled to second connection structure 122b, which may be coupled to, disposed on, or embedded within the surface 105, and which may suspend the second acoustic receiver 120b at a certain height above the surface 105. In embodiments, the first acoustic receiver 120a and the second acoustic receiver 120b may be suspended at the same height above the surface 105, however embodiments are not limited thereto, and the first acoustic receiver 120a and the second acoustic receiver 120b may be suspended at different heights above the surface 105.
As shown in
If the apparatus 100 is made out of elastic materials, such as plastic or metal, the surface 105 of the apparatus 100 will vibrate and couple with the air, and the entire surface 105 of the apparatus 100 functions as an acoustic transducer. However, embodiments are not limited thereto, and in embodiments only a portion of the surface 105 of the apparatus 100 may vibrate. In embodiments, the acoustic transmitter 110 couples with the surface 105 instead of air, and could even be embedded within the apparatus 100. Then, the at least one processor receives, via the acoustic receivers 120, a plurality of acoustic wave signals corresponding to the generated acoustic wave 130. Based on an obstacle being near the apparatus 100, the generated acoustic wave 130 becomes a deformed acoustic wave 140 (as shown for example in
As further shown in
In embodiments, signal processing may be performed on the data collected in this signal window. For example, the at least one processor may filter the received acoustic wave signals or the received deformed acoustic wave signals, using a low-pass filter for reducing noise of the received acoustic wave signals or the received deformed acoustic wave signals. As another example, the at least one processor may apply a fast Fourier transform (FFT) to the received acoustic wave signals or the received deformed acoustic wave signals, in order to determine signal powers of the received acoustic wave signals or the received deformed acoustic wave signals.
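For illustration, a minimal sketch of this per-window processing is shown below, combining the low-pass filtering and FFT steps described here with the mixing step described later for process 1300. The 48 kHz sampling rate, 19 kHz excitation tone (a frequency mentioned later in the disclosure), and 100 Hz band of interest are assumptions for the example rather than required values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 48_000        # assumed sampling rate (Hz)
F_TX = 19_000      # assumed excitation tone (Hz); the disclosure mentions about 19 kHz
F_CUTOFF = 100     # assumed envelope band of interest (Hz), e.g. fmax = 100 Hz

def band_power(rx_window: np.ndarray, fs: int = FS) -> float:
    """Mix the received window with a copy of the transmitted tone, low-pass filter
    the result to obtain an envelope, and sum the FFT power below the cutoff."""
    t = np.arange(len(rx_window)) / fs
    tx_copy = np.sin(2 * np.pi * F_TX * t)            # copy of the transmitted signal
    mixed = rx_window * tx_copy                        # mixing (demodulation)
    b, a = butter(4, F_CUTOFF, btype="low", fs=fs)     # low-pass filter for noise reduction
    envelope = filtfilt(b, a, mixed)
    spectrum = np.abs(np.fft.rfft(envelope)) ** 2      # FFT -> power spectrum
    freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
    return float(spectrum[freqs <= F_CUTOFF].sum())    # signal power in the band
```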
As further shown in
As further shown in
Although
Based on the apparatus 100 being the robot, and based on the obstacle being determined to be proximate to the surface 105 of the apparatus 100, the at least one processor may control the apparatus 100 to avoid collision with the obstacle. In embodiments, the at least one processor may use the estimated proximity direction in order to avoid the collision.
An obstacle 200 close to the surface of the object 300 will establish a standing wave pattern 335 or interference pattern between the obstacle 200 and the object surface, which perturbs the acoustic pressure field and results in an acoustic impedance change across the entire surface. These changes can be detected by a piezoelectric receiver 320, which may be located on the surface of the object 300. As the acoustic wave 305 propagates through the object 300, obstacles close to any point on the object surface will cause distortions that can be measured at other points on or within the object 300, allowing for a single transmitter/receiver pair of piezoelectric elements to detect the obstacles close to any part of the coupled object 300.
Therefore, embodiments provide an apparatus 100 in which the acoustic receivers 120 are suspended in the air just above the surface, in order to allow the acoustic receivers 120 to directly sense the component of the acoustic wave signal which corresponds to the emanated acoustic wave 315, while sensing relatively less of the component of the acoustic wave signal that is transmitted mechanically through the surface.
As shown in
The first and second acoustic receivers 120a and 120b may receive acoustic wave signals corresponding to the acoustic wave 130. Because the first and second acoustic receivers 120a and 120b are suspended above the surface 105 by the first and second connection structures 122a and 122b, respectively, the received acoustic wave signals may be influenced relatively more by the second portion 130b of the acoustic wave 130, and may be influenced relatively less by the first portion 130a of the acoustic wave 130. Therefore, when an obstacle 200 approaches the apparatus 100 and changes some or all of the second portion 130b of the acoustic wave 130 into a deformed acoustic wave 140, as shown in
An example of an acoustic wave signal which may be generated by one of the first and second acoustic receivers 120a and 120b is shown in
As shown in
In embodiments, if the connection structure 122 is too tall, then it may be more prone to swaying and hitting obstacles, which may be undesirable. Further, if the connection structure 122 is too rigid then it may conduct mechanical vibrations well, and this may also be undesirable. Accordingly, in embodiments, the connection structure 122 may be a relatively thin, lightweight structure that keeps the acoustic receiver 120 sufficiently close to the surface.
In embodiments, the connection structure 122 may be constructed such that vibrations of the surface 105 which travel up the connection structure 122 and are transmitted to the acoustic receiver 120, such as the first portion 130a of the acoustic wave 130, may be reduced. In embodiments, the connection structure 122 may include one or more of plastic, foam, wood, or any other material that may reduce vibrations. For example, in embodiments, the connection structure may include a cylinder which is coupled to the surface 105. In embodiments, the cylinder may be hollow in order to reduce a weight of the cylinder. In addition, connection structure 122 may include sound absorbing material such as soundproofing or sound absorbing foam, which may be used to separate the cylinder from the acoustic receiver 120, in order to absorb at least some of the vibrations.
If the acoustic receiver 120 is mounted directly on the surface 105, then a maximum amount of noisy vibrations from the surface 105 is received by the acoustic receiver 120, which is undesirable. If the acoustic receiver 120 is lifted, it can detect changes in the emanated acoustic wave and may be decoupled from the noisy surface vibrations. The distance H may be selected to be sufficiently close to the surface that the acoustic receiver 120 is able to detect the interference pattern set up by the apparatus 100 as an obstacle approaches the surface 105. However, the distance H can be selected to be as far away from the surface 105 as desired, as long as the acoustic receiver 120 is still able to detect the emanated acoustic wave. A value for the distance H may be selected based on design and deployment requirements. In embodiments, the distance H may be, for example, in the range from 5 millimeters to several centimeters, however embodiments are not limited thereto.
Table 1 below shows signal-to-noise ratios (SNRs) provided by example connection structures 122. In particular, Table 1 shows SNRs corresponding to connection structures 122 constructed of wood and polylactic acid (PLA) which place the acoustic receivers 120 including a piezoelectric receiver at heights of 10 mm and 15 mm above the surface 500.
In embodiments, the aura provided by the emanated acoustic wave may extend only a short distance from the surface 105, for example, within 3-7 wavelengths depending on the amplitude of the input signal. In embodiments, if the acoustic wave signal transmitted by the acoustic transmitter 110 has a frequency of about 19 kHz, the aura provided by the emanated acoustic wave may extend in the range of about 5.5 cm to about 14 cm above the surface 105. In embodiments, the distance H may be selected to be within about 1 wavelength from the surface 105 in order to ensure that the emanated acoustic wave can be properly detected. In embodiments, a node or peak may be present about a half wavelength above the surface 105, so the distance H may be selected to be within about a half wavelength from the surface 105 in order to maximize the signal provided by the emanated acoustic wave. In addition, in embodiments the size of the acoustic receiver 120 may be selected to be about 1 wavelength in diameter. In embodiments, this may mean that the acoustic receiver may be about 20 mm in diameter, and the distance H may be about 10 mm-20 mm, however embodiments are not limited thereto.
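As a worked check of these figures (assuming a speed of sound in air of roughly 343 m/s, which is an assumption rather than a value stated in the disclosure):

```latex
\lambda = \frac{c}{f} \approx \frac{343~\text{m/s}}{19\,000~\text{Hz}} \approx 1.8~\text{cm},
\qquad 3\lambda \approx 5.4~\text{cm},
\qquad 7\lambda \approx 12.6~\text{cm},
\qquad \tfrac{1}{2}\lambda \approx 0.9~\text{cm}.
```

These values are roughly consistent with the stated aura extent of about 5.5 cm to 14 cm and with a distance H of about 10 mm to 20 mm; the exact numbers depend on air temperature and the precise excitation frequency.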
According to embodiments, the acoustic receivers 120 may be lifted from the surface in a variety of different ways. For example, as shown in
As another example, as shown in
As another example, as shown in
As yet another example, as shown in
As shown in
In operation 1302, the process 1300 includes mixing the acquired signal with a copy of the transmitted signal and applying a low pass filter in order to obtain an envelope of the acquired signal. In embodiments, operation 1302 may be performed by the signal processing elements 1100 discussed above.
In operation 1303, the process 1300 includes performing a fast Fourier transform on the envelope in order to calculate signal power of the signal in a predetermined band. In embodiments, the calculated signal power may be a sum of signal powers corresponding to frequencies below a cutoff frequency.
In operation 1304, the process 1300 includes determining whether the calculated power is greater than a first threshold value thr1.
Based on determining that the calculated power is greater than the first threshold value thr1 (YES at operation 1304), the process 1300 may proceed to operation 1305, in which a proximity event is determined to occur. For example, based on the calculated power being greater than the first threshold value thr1, the at least one processor of the apparatus 100 may determine that an obstacle 200 is proximate to the apparatus 100.
Based on determining that the calculated power is not greater than the first threshold value thr1 (NO at operation 1304), the process 1300 may proceed to operation 1306, in which a proximity event is determined not to occur. In embodiments, the process 1300 may then proceed to operation 1301, and the process 1300 may be performed again based on a signal acquired in a next signal window.
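A compact sketch of the decision steps of operations 1304 to 1306 is shown below, assuming that each window's in-band signal power has already been computed (for example as in the earlier processing sketch); the example power values and threshold are hypothetical.

```python
def proximity_event(window_power: float, thr1: float) -> bool:
    """Operation 1304: a proximity event occurs when the calculated power exceeds thr1."""
    return window_power > thr1

# Hypothetical per-window powers; when no event is detected the process returns to
# acquiring a signal in the next signal window (operation 1306 -> operation 1301).
for power in (0.2, 0.3, 1.7):
    if proximity_event(power, thr1=1.0):
        print("proximity event detected (operation 1305)")
    else:
        print("no proximity event; acquire next signal window (operation 1306)")
```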
In embodiments, the process 1300 and the proximity detection algorithm discussed above may provide robust results based on a simple threshold value, for example first threshold value thr1. In some embodiments, the first threshold value thr1 may be set without needing to perform training, for example without using a machine learning approach which requires specific training for different objects, obstacles, and robot motion paths in order to work robustly. In embodiments, the first threshold value thr1 may be calculated for a specific design, for example a specific design of a robot including the apparatus 100, and can then be used for all robots having the same design.
As another example, in order to obtain a first threshold value thr1 to be used in proximity detection, a power spectrum up to a particular cutoff frequency fmax (for example 100 Hz) may be first observed when the robot is stationary and moving without any object in proximity. Then, a threshold value calculation algorithm, for example the threshold value calculation algorithm of
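One possible calibration sketch consistent with this paragraph is shown below: in-band powers are recorded while the robot is stationary and while it is moving with no object in proximity, and thr1 is placed above that baseline. The mean-plus-k-standard-deviations margin is an illustrative assumption, not the specific threshold value calculation algorithm referenced above.

```python
import numpy as np

def calibrate_thr1(baseline_powers, k: float = 3.0) -> float:
    """baseline_powers: in-band powers (below fmax, e.g. 100 Hz) recorded with no
    obstacle in proximity, covering both stationary and moving robot conditions."""
    baseline = np.asarray(baseline_powers, dtype=float)
    return float(baseline.mean() + k * baseline.std())   # assumed margin rule

# Example: thr1 computed from hypothetical baseline recordings.
thr1 = calibrate_thr1([0.21, 0.25, 0.19, 0.30, 0.27])
```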
In embodiments, in addition to providing proximity detection indicating whether an obstacle is proximate to the apparatus 100, the apparatus 100 may also be used to estimate a proximity direction of the obstacle with respect to the apparatus 100. For example, the first acoustic receiver 120a and the second acoustic receiver 120b may be deployed at different positions on the apparatus 100, for example on opposite sides of the apparatus 100. Then by comparing the signal strength of acoustic wave signals received by the first and second acoustic receivers 120a and 120b, the proximity direction can be estimated, for example by dividing the area surrounding the apparatus 100 into quadrants and indicating which quadrant the obstacle is present in. In embodiments, this estimated proximity direction can be used for collision avoidance, for example by providing a visual or audible signal, or by providing information about the estimated proximity direction to a controller such as a robot controller so that the robot controller can control the robot to avoid a collision with the obstacle.
As shown in
In operation 1802, the process 1800 includes acquiring a signal Rx2. In embodiments, the signal Rx2 may correspond to a second acoustic wave signal acquired using the second acoustic receiver 120b.
In operation 1803, the process 1800 includes mixing the signal Rx1 with a copy of the transmitted signal and applying a low pass filter in order to obtain an envelope of the signal Rx1. In embodiments, operation 1803 may be performed by the signal processing elements 1100 discussed above.
In operation 1804, the process 1800 includes mixing the signal Rx2 with a copy of the transmitted signal and applying a low pass filter in order to obtain an envelope of the signal Rx2. In embodiments, operation 1804 may be performed by the signal processing elements 1100 discussed above.
In operation 1805, the process 1800 includes performing a fast Fourier transform on the envelope corresponding to the signal Rx1 in order to calculate a sum S1 of the signal powers of the signal Rx1 in a predetermined band.
In operation 1806, the process 1800 includes performing a fast Fourier transform on the envelope corresponding to the signal Rx2 in order to calculate a sum S2 of the signal powers of the signal Rx2 in a predetermined band.
In operation 1807, the process 1800 includes determining a relative proximity direction of the obstacle based on the sum S1, the sum S2, the first threshold value thr1, the second threshold value thr2, and the third threshold value thr3. In embodiments, based on the sum S1 being greater than the first threshold value thr1 and the sum S2 being less than the first threshold value thr1, the obstacle can be determined to be in quadrant 1. Based on the sum S1 being less than the first threshold value thr1 and the sum S2 being greater than the first threshold value thr1, the obstacle can be determined to be in quadrant 2. Based on both of the sum S1 and the sum S2 being greater than the second threshold value thr2, the obstacle can be determined to be in quadrant 3. Based on both of the sum S1 and the sum S2 being greater than the third threshold value thr3, the obstacle can be determined to be in quadrant 4.
In operation 1808, the process 1800 includes outputting the determined quadrant as the estimated proximity direction.
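The quadrant decision of operation 1807 (and, equivalently, of operations 2422 to 2430 described later) can be sketched as follows; the threshold values themselves would be obtained by calibration for a specific design, and the example values in the usage line are hypothetical.

```python
from typing import Optional

def estimate_quadrant(s1: float, s2: float,
                      thr1: float, thr2: float, thr3: float) -> Optional[int]:
    """Return the estimated quadrant of the obstacle, or None if no proximity
    direction is estimated, following the rules stated for operation 1807."""
    if s1 > thr1 and s2 < thr1:
        return 1
    if s1 < thr1 and s2 > thr1:
        return 2
    if s1 > thr2 and s2 > thr2:
        return 3
    if s1 > thr3 and s2 > thr3:
        return 4
    return None

# Hypothetical sums and thresholds: S1 exceeds thr1 while S2 does not -> quadrant 1.
print(estimate_quadrant(1.4, 0.2, thr1=1.0, thr2=2.5, thr3=1.8))
```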
Although embodiments described above relate to proximity direction estimation based on thresholds, embodiments are not limited thereto. For example, in embodiments the proximity direction estimation may be performed by comparing received signals with each other, for example by comparing the signal Rx1 with the signal Rx2, or by comparing the sum S1 with the sum S2. In embodiments, comparing the received signals with each other instead of with a reference signal or a threshold may provide improvements in one or more of cost, complexity, and accuracy.
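A minimal sketch of such a relative comparison, using only the ratio between the two receiver powers, is shown below; the margin value and the mapping from the stronger response to a side are assumptions for illustration.

```python
def estimate_side(s1: float, s2: float, margin: float = 1.2) -> str:
    """Compare the two receiver powers directly, without an absolute threshold."""
    if s1 > margin * s2:
        return "obstacle toward the first acoustic receiver"
    if s2 > margin * s1:
        return "obstacle toward the second acoustic receiver"
    return "ambiguous / between the receivers"
```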
In addition, although embodiments described above relate to proximity direction estimation having four quadrants, embodiments are not limited thereto. In embodiments, additional acoustic receivers 120 may be used in addition to the first and second acoustic receivers 120a and 120b. For example, two more acoustic receivers 120 may be added to the apparatus 100 and offset at 45 degrees on axis. In embodiments, different frequencies may be transmitted by the acoustic transmitter 110 and received by the acoustic receivers 120, which may provide a different interference pattern or a different ratio between receivers. In embodiments, a chirp signal may be used, and a signal reflected from the surroundings of the apparatus 100 may be analyzed, for example with a trained classifier, in order to determine a distance and location of an obstacle with respect to the apparatus 100. In embodiments, the classifier may be a neural network trained based on a dataset corresponding to the apparatus 100 and various obstacles.
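For the chirp-based variant, one hypothetical realization is sketched below: a chirp excitation is transmitted, coarse magnitude-spectrum features are taken from each received window, and a small neural network classifier is trained on labeled recordings. The chirp band, feature choice, network size, and labels are assumptions for illustration; the actual dataset would be collected for the specific apparatus and obstacles.

```python
import numpy as np
from scipy.signal import chirp
from sklearn.neural_network import MLPClassifier

FS = 48_000  # assumed sampling rate (Hz)

def tx_chirp(duration_s: float = 0.05, f0: float = 15_000, f1: float = 21_000) -> np.ndarray:
    """Linear chirp excitation over an assumed near-ultrasonic band."""
    t = np.arange(int(duration_s * FS)) / FS
    return chirp(t, f0=f0, t1=duration_s, f1=f1)

def spectral_features(rx_window: np.ndarray, n_bins: int = 64) -> np.ndarray:
    """Coarse magnitude-spectrum features of a received window."""
    mag = np.abs(np.fft.rfft(rx_window))
    return np.array([b.mean() for b in np.array_split(mag, n_bins)])

# X: stacked feature vectors from recorded windows; y: labels such as quadrant and/or
# coarse distance class for the training obstacles (hypothetical dataset).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
# clf.fit(X, y)
# clf.predict(spectral_features(new_window).reshape(1, -1))
```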
For example, the graph of
However, these are only examples and embodiments are not limited to the ranges and proximity direction illustrated in
A localization unit 2306 may receive wheel odometry information from the wheel encoder 2303 and a heading angle from the IMU 2304, and may provide location information to a map building unit 2305.
The map building unit 2305 may receive images from the camera 2301, a point cloud from the LIDAR unit 2302, and the location information, and may provide map information to the localization unit 2306 and a global path planning unit 2307.
The global path planning unit 2307 may receive the map information and user commands from user interface 2308, and may provide way points to a local path planning unit 2309.
The local path planning unit 2309 may receive a bump signal from the bump sensor 2310, a floor distance from the cliff sensor 2311, and the way points, and may provide a desired moving direction and speed to a motor control unit 2312, which may control a motion of, for example the robotic vacuum cleaner.
Accordingly, the robot control system 2300a may control a robotic vacuum cleaner. However, because proximity detection may be primarily provided by the bump sensor 2310, and because the bump sensor may not provide detailed information about a detection of an obstacle within a certain proximity, or a proximity direction of a detected obstacle, the robotic vacuum cleaner may become easily stuck.
Because the robot control system 2300b includes the ambi-sense unit 2314, when a bump occurs between the robotic vacuum cleaner and an obstacle, the robot control system 2300b may control the robotic vacuum cleaner to turn toward free space. In addition, in the case of moving obstacles, the robot control system 2300b may determine where the obstacle is, or where it is coming from, so that the robot control system 2300b can ignore the obstacle if it is moving away, detour around the obstacle if it is in the way and there is an alternate path to the goal, slow down the speed of the robotic vacuum cleaner if the obstacle is too close to the desired path, or stop the robotic vacuum cleaner if the obstacle is on the desired path and there is no way to detour. In addition, the proximity direction information may be used to more accurately build a map.
Because the robot control system 2300c includes the ambi-sense unit 2314, in the case of an unexpected bump or dynamic scene, the robot control system 2300c may determine where the obstacle is, or where it is coming from, so that the robot control system 2300c can ignore the obstacle if it is moving away, detour around the obstacle if it is in the way and there is an alternate path to the goal, slow down the speed of the robotic arm if the obstacle is too close to the desired path, or stop the robotic arm if the obstacle is on the desired path and there is no way to detour. In addition, the proximity direction information may be used to more accurately build a map.
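The avoidance behavior described in the two paragraphs above can be sketched as a simple decision rule; the predicate names are hypothetical and would be derived from the estimated proximity direction, the planned path, and successive proximity measurements.

```python
def avoidance_action(moving_away: bool, on_desired_path: bool,
                     alternate_path_exists: bool, near_desired_path: bool) -> str:
    """Decision rule sketched from the described behavior of robot control
    systems 2300b/2300c using the ambi-sense unit 2314."""
    if moving_away:
        return "ignore"                      # obstacle is moving away
    if on_desired_path:
        return "detour" if alternate_path_exists else "stop"
    if near_desired_path:
        return "slow_down"                   # obstacle too close to the desired path
    return "continue"
```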
As shown in
In operation 2412, the process 2400A includes receiving, via a first acoustic receiver, a first acoustic wave signal corresponding to the generated acoustic wave. In embodiments, the first acoustic receiver may be spaced apart from the surface. In embodiments, the first acoustic receiver may correspond to the first acoustic receiver 120a.
In operation 2413, the process 2400A includes receiving, via a second acoustic receiver, a second acoustic wave signal corresponding to the generated acoustic wave. In embodiments, the second acoustic receiver may be spaced apart from the surface, and a position of the second acoustic receiver may be different from a position of the first acoustic receiver with respect to the apparatus. In embodiments, the second acoustic receiver may correspond to the second acoustic receiver 120b.
In operation 2414, the process 2400A includes estimating a proximity direction of an obstacle with respect to the apparatus based on the first proximity direction signal and the second proximity direction signal.
In embodiments, the first acoustic receiver and the second acoustic receiver may be attached to the surface at two opposing positions on the surface, and may be spaced apart from the surface by a same distance.
In embodiments, the acoustic transmitter may include a piezoelectric transmitter, the first acoustic receiver may include a first piezoelectric receiver, and the second acoustic receiver may include a second piezoelectric receiver.
In embodiments, the first proximity direction signal may be obtained by applying signal processing to the first acoustic wave signal, the second proximity direction signal may be obtained by applying signal processing to the second acoustic wave signal, and the applying of the signal processing may include applying at least one from among a low-pass filter (LPF) and a fast Fourier transform (FFT).
In embodiments, the apparatus may include a first connection structure and a second connection structure, a first end of the first connection structure may be connected to the first acoustic receiver, a second end of the first connection structure may be connected to the surface, such that the first acoustic receiver is spaced apart from the surface by a first distance, and a first end of the second connection structure may be connected to the second acoustic receiver, and a second end of the second connection structure is connected to the surface, such that the second acoustic receiver is spaced apart from the surface by the first distance.
In embodiments, the first connection structure may be configured to reduce an effect of vibrations transmitted mechanically through the surface on the first acoustic wave signal received by the first acoustic receiver, and the second connection structure may be configured to reduce an effect of the vibrations on the second acoustic wave signal received by the second acoustic receiver.
In embodiments, the acoustic wave generated by the acoustic transmitter may include a chirp signal, and estimating of the proximity direction may further include providing the first proximity direction signal and the second proximity direction signal to a neural network which is trained based on a dataset corresponding to the electronic device and a plurality of obstacles.
As shown in
In operation 2422, the process 2400B includes determining whether the first proximity detection signal is greater than the first threshold and the second proximity detection signal is less than the first threshold.
Based on the first proximity detection signal being greater than the first threshold and the second proximity detection signal being less than the first threshold (YES at operation 2422), the process 2400B may proceed to operation 2423, in which the estimated proximity direction is determined to be a first direction. Otherwise (NO at operation 2422), the process 2400B may proceed to operation 2424.
In operation 2424, the process 2400B includes determining whether the first proximity detection signal is less than the first threshold and the second proximity detection signal is greater than the first threshold.
Based on the first proximity detection signal being less than the first threshold and the second proximity detection signal being greater than the first threshold (YES at operation 2424), the process 2400B may proceed to operation 2425, in which the estimated proximity direction is determined to be a second direction. Otherwise (NO at operation 2424), the process 2400B may proceed to operation 2426.
In operation 2426, the process 2400B includes determining whether the first and second proximity detection signals are greater than the second threshold.
Based on the first and second proximity detection signals being greater than the second threshold (YES at operation 2426), the process 2400B may proceed to operation 2427, in which the estimated proximity direction is determined to be a third direction. Otherwise (NO at operation 2426), the process 2400B may proceed to operation 2428.
In operation 2428, the process 2400B includes determining whether the first and second proximity detection signals are greater than the third threshold.
Based on the first and second proximity detection signals being greater than the third threshold (YES at operation 2428), the process 2400B may proceed to operation 2429, in which the estimated proximity direction is determined to be a fourth direction. Otherwise (NO at operation 2428), the process 2400B may proceed to operation 2430, in which no estimated proximity direction is determined.
The electronic device 2500 includes a bus 2510, a processor 2520, a memory 2530, an interface 2540, and a display 2550.
The bus 2510 includes a circuit for connecting the components 2520 to 2550 with one another. The bus 2510 functions as a communication system for transferring data between the components 2520 to 2550 or between electronic devices.
The processor 2520 includes one or more of a central processing unit (CPU), a graphics processor unit (GPU), an accelerated processing unit (APU), a many integrated core (MIC), a field-programmable gate array (FPGA), or a digital signal processor (DSP). The processor 2520 is able to perform control of any one or any combination of the other components of the electronic device 2500, and/or perform an operation or data processing relating to communication. The processor 2520 executes one or more programs stored in the memory 2530.
The memory 2530 may include a volatile and/or non-volatile memory. The memory 2530 stores information, such as one or more of commands, data, programs (one or more instructions), applications 2534, etc., which are related to at least one other component of the electronic device 2500 and for driving and controlling the electronic device 2500. For example, commands and/or data may formulate an operating system (OS) 2532. Information stored in the memory 2530 may be executed by the processor 2520.
The applications 2534 include the above-discussed embodiments. These functions can be performed by a single application or by multiple applications that each carry out one or more of these functions.
The display 2550 includes, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a quantum-dot light emitting diode (QLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 2550 can also be a depth-aware display, such as a multi-focal display. The display 2550 is able to present, for example, various contents, such as text, images, videos, icons, and symbols.
The interface 2540 includes input/output (I/O) interface 2542, communication interface 2544, and/or one or more sensors 2546. The I/O interface 2542 serves as an interface that can, for example, transfer commands and/or data between a user and/or other external devices and other component(s) of the electronic device 2500.
The sensor(s) 2546 can meter a physical quantity or detect an activation state of the electronic device 2500 and convert metered or detected information into an electrical signal. For example, the sensor(s) 2546 can include one or more cameras or other imaging sensors for capturing images of scenes. The sensor(s) 2546 can also include any one or any combination of a microphone, a keyboard, a mouse, one or more buttons for touch input, a gyroscope or gyro sensor, an air pressure sensor, a magnetic sensor or magnetometer, an acceleration sensor or accelerometer, a grip sensor, a proximity sensor, a color sensor (such as a red green blue (RGB) sensor), a bio-physical sensor, a temperature sensor, a humidity sensor, an illumination sensor, an ultraviolet (UV) sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an ultrasound sensor, an iris sensor, and a fingerprint sensor. The sensor(s) 2546 can further include an inertial measurement unit. In addition, the sensor(s) 2546 can include a control circuit for controlling at least one of the sensors included herein. Any of these sensor(s) 2546 can be located within or coupled to the electronic device 2500. The sensors 2546 may be used to detect touch input, gesture input, and hovering input, using an electronic pen or a body portion of a user, etc.
The communication interface 2544, for example, is able to set up communication between the electronic device 2500 and an external electronic device. The communication interface 2544 can be a wired or wireless transceiver or any other component for transmitting and receiving signals.
The embodiments of the disclosure described above may be written as computer executable programs or instructions that may be stored in a medium.
The medium may continuously store the computer-executable programs or instructions, or temporarily store the computer-executable programs or instructions for execution or downloading. Also, the medium may be any one of various recording media or storage media in which a single piece or plurality of pieces of hardware are combined, and the medium is not limited to a medium directly connected to the electronic device 2200, but may be distributed on a network. Examples of the medium include magnetic media, such as a hard disk, a floppy disk, and a magnetic tape, optical recording media, such as CD-ROM and DVD, magneto-optical media such as a floptical disk, and ROM, RAM, and a flash memory, which are configured to store program instructions. Other examples of the medium include recording media and storage media managed by application stores distributing applications or by websites, servers, and the like supplying or distributing other various types of software.
The above described method may be provided in a form of downloadable software. A computer program product may include a product (for example, a downloadable application) in a form of a software program electronically distributed through a manufacturer or an electronic market. For electronic distribution, at least a part of the software program may be stored in a storage medium or may be temporarily generated. In this case, the storage medium may be a server or a storage medium of the server.
A model related to the CNN described above may be implemented via a software module. When the CNN model is implemented via a software module (for example, a program module including instructions), the CNN model may be stored in a computer-readable recording medium.
Also, the CNN model may be a part of the apparatus 100 described above by being integrated in a form of a hardware chip. For example, the CNN model may be manufactured in a form of a dedicated hardware chip for artificial intelligence, or may be manufactured as a part of an existing general-purpose processor (for example, a CPU or application processor) or a graphic-dedicated processor (for example a GPU).
Also, the CNN model may be provided in a form of downloadable software. A computer program product may include a product (for example, a downloadable application) in a form of a software program electronically distributed through a manufacturer or an electronic market. For electronic distribution, at least a part of the software program may be stored in a storage medium or may be temporarily generated. In this case, the storage medium may be a server of the manufacturer or electronic market, or a storage medium of a relay server.
Accordingly, embodiments may relate to a novel sensing architecture that may enable robots and other objects to sense proximity and proximity direction of surrounding obstacles. Embodiments may use low-cost piezoelectric transducers that produce and receive emanated surface waves as a sensory signal. Embodiments may provide a novel receiver structure that provides clean and sensitive signals, along with a new signal processing pipeline and several simple but elegant detection algorithms. As a result, embodiments may enable responsive human robot interaction, provide robots with collision detection and avoidance capabilities when obstacles are in proximity, make robots that move more aware of surroundings, enable path planning for robots in a dynamic environment, and allow robots and humans to share environments more freely.
While the embodiments of the disclosure have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
This application is based on and claims priority under 35 U.S.C. § 119 from U.S. Provisional Application No. 63/330,989 filed on Apr. 14, 2022, in the U.S. Patent & Trademark Office, the disclosure of which is incorporated by reference herein in its entirety.