A portion of the disclosure of this patent document contains material that is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
The present application relates generally to robotics, and more specifically to systems and apparatuses for a protective module or a device for robotic sensors.
The foregoing needs are satisfied by the present disclosure, which provides for, inter alia, systems and apparatuses for a protective module or device for robotic sensors.
Exemplary embodiments described herein have innovative features, no single one of which is indispensable or solely responsible for their desirable attributes. Without limiting the scope of the claims, some of the advantageous features will now be summarized. One skilled in the art would appreciate that as used herein, the term robot may generally refer to an autonomous vehicle or object that travels a route, executes a task, or otherwise moves automatically upon executing or processing computer readable instructions.
According to at least one non-limiting exemplary embodiment, a module for a robot to prevent collisions with hazards is disclosed. The module comprises a top portion comprising a first toothed edge comprising one or more top teeth; a bottom portion comprising a second toothed edge comprising one or more bottom teeth; wherein, the top and bottom portions surround a sensor of the robot; and the top teeth and the bottom teeth define a gap comprising a size wide enough to permit transmission and receipt of signals from the sensor to one or more objects of interest while being no larger than the hazard.
According to at least one non-limiting exemplary embodiment, the sensor is a light detection and ranging sensor configured to measure along a plane, and the teeth, at least in part, occlude a portion of a receiver element of the light detection and ranging sensor.
According to at least one non-limiting exemplary embodiment, the one or more teeth form spacings; and the teeth of the top portion protrude into the spacings of the bottom portion; and the teeth of the bottom portion protrude into the spacings of the top portion.
According to at least one non-limiting exemplary embodiment, the teeth of the top portion and bottom portion are of a size and shape configured to not occlude more than a threshold value of either (i) a reflected signal to the sensor from the object of interest, or (ii) a subtended area of a detector of the light detection and ranging sensor from the perspective of the object of interest.
According to at least one non-limiting exemplary embodiment, the threshold is no larger than 20%.
According to at least one non-limiting exemplary embodiment, the occlusion caused by the teeth is uniform across the field of view of the light detection and ranging sensor. The uniform nature may be characterized by a uniform solid angle of the detector occluded by the teeth when the detector is viewed from an orthographic perspective. The uniform nature may also be characterized by a uniform return signal strength as a function of angle across the field of view.
According to at least one non-limiting exemplary embodiment, the protective module further comprises one or more attachment mechanisms within the connection interface configured to allow the module to be coupled to a robot without manipulation of the sensor.
According to at least one non-limiting exemplary embodiment, the protective module further comprises a connection interface capable of mechanically coupling the module to the robot.
These and other objects, features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the disclosure. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
The disclosed aspects will hereinafter be described in conjunction with the appended drawings, provided to illustrate and not to limit the disclosed aspects, wherein like designations denote like elements.
All Figures disclosed herein are © Copyright 2023 Brain Corporation. All rights reserved.
Currently, many robots utilize sensors, such as light detection and ranging (“LiDAR”) sensors, to sense various objects within their environment. Typically, these sensors operate on a planar measurement surface, such as spinning planar LiDAR sensors. In some instances, the objects the sensors are designed to detect may also pose a hazard to the sensors themselves. Namely, for LiDAR sensors, scratches in the lens may distort measurements and/or render the sensor inoperable. For example, a robot may utilize a planar LiDAR to sense objects at a small height (e.g., 5-20 inches) above a floor to detect, e.g., human legs, table legs, bottoms of a shelf/wall/tall object, shopping carts (e.g., via detecting a lower rack), and other small objects that the robot should avoid. On occasion, and primarily due to human actions, these small objects may collide with the sensor. For example, an autonomous robot may collide with a shopping cart pushed by a human. As another example, the robot may be controlled manually (e.g., in a manual mode), wherein the operator of the robot may collide with an object (e.g., a driver may use a large robot such as the one depicted in
Various aspects of the novel systems, apparatuses, and methods disclosed herein are described more fully hereinafter with reference to the accompanying drawings. This disclosure can, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein, one skilled in the art would appreciate that the scope of the disclosure is intended to cover any aspect of the novel systems, apparatuses, and methods disclosed herein, whether implemented independently of, or combined with, any other aspect of the disclosure. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method that is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect disclosed herein may be implemented by one or more elements of a claim.
Although particular aspects are described herein, many variations and permutations of these aspects fall within the scope of the disclosure. Although some benefits and advantages of the preferred aspects are mentioned, the scope of the disclosure is not intended to be limited to particular benefits, uses, and/or objectives. The detailed description and drawings are merely illustrative of the disclosure rather than limiting, the scope of the disclosure being defined by the appended claims and equivalents thereof.
The present disclosure provides for systems and apparatuses for a protective module for robotic sensors. As used herein, a robot may include mechanical and/or virtual entities configured to carry out a complex series of tasks or actions autonomously. In some exemplary embodiments, robots may be machines that are guided and/or instructed by computer programs and/or electronic circuitry. In some exemplary embodiments, robots may include electro-mechanical components that are configured for navigation, where the robot may move from one location to another. Such robots may include autonomous and/or semi-autonomous cars, floor cleaners, rovers, drones, planes, boats, carts, trams, wheelchairs, industrial equipment, stocking machines, mobile platforms, personal transportation devices (e.g., hover boards, SEGWAY® vehicles, etc.), trailer movers, vehicles, and the like. Robots may also include any autonomous and/or semi-autonomous machine for transporting items, people, animals, cargo, freight, objects, luggage, and/or anything desirable from one location to another.
As used herein, network interfaces may include any signal, data, or software interface with a component, network, or process including, without limitation, those of the FireWire (e.g., FW400, FW800, FWS800T, FWS1600, FWS3200, etc.), universal serial bus (“USB”) (e.g., USB 1.X, USB 2.0, USB 3.0, USB Type-C, etc.), Ethernet (e.g., 10/100, 10/100/1000 (Gigabit Ethernet), 10-Gig-E, etc.), multimedia over coax alliance technology (“MoCA”), Coaxsys (e.g., TVNET™), radio frequency tuner (e.g., in-band or OOB, cable modem, etc.), Wi-Fi (802.11), WiMAX (e.g., WiMAX (802.16)), PAN (e.g., PAN/802.15), cellular (e.g., 3G, 4G, or 5G including LTE/LTE-A/TD-LTE, GSM, etc., and variants thereof), IrDA families, etc. As used herein, Wi-Fi may include one or more of IEEE-Std. 802.11, variants of IEEE-Std. 802.11, standards related to IEEE-Std. 802.11 (e.g., 802.11 a/b/g/n/ac/ad/af/ah/ai/aj/aq/ax/ay), and/or other wireless standards.
As used herein, processor, microprocessor, and/or digital processor may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”) processors, microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic devices (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, and application-specific integrated circuits (“ASICs”). Such digital processors may be contained on a single unitary integrated circuit die or distributed across multiple components.
As used herein, computer program and/or software may include any sequence of human or machine cognizable steps that perform a function. Such computer program and/or software may be rendered in any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, GO, RUST, SCALA, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (“CORBA”), JAVA™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., “BREW”), and the like.
As used herein, connection, link, and/or wireless link may include a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
As used herein, computer and/or computing device may include, but are not limited to, personal computers (“PCs”) and minicomputers, whether desktop, laptop, or otherwise, mainframe computers, workstations, servers, personal digital assistants (“PDAs”), handheld computers, embedded computers, programmable logic devices, personal communicators, tablet computers, mobile devices, portable navigation aids, J2ME equipped devices, cellular telephones, smart phones, personal integrated communication or entertainment devices, and/or any other device capable of executing a set of instructions and processing an incoming data signal.
Detailed descriptions of the various embodiments of the system and methods of the disclosure are now provided. While many examples discussed herein may refer to specific exemplary embodiments, it will be appreciated that the described systems and methods contained herein are applicable to any kind of robot. Myriad other embodiments or uses for the technology described herein would be readily envisaged by those having ordinary skill in the art, given the contents of the present disclosure.
Advantageously, the systems and methods of this disclosure at least: (i) reduce the number of unexpected damages to robots caused by manual use (e.g., a robot experiencing damage in a manual mode may not be reported as inoperable until requested to operate autonomously); (ii) reduce operator time servicing robots by reducing the maintenance required to manage, repair or replace damaged sensors; and/or (iii) enable existing robots to quickly be fitted with protective coverings for their sensors that do not hinder performance. Other advantages are readily discernable by one having ordinary skill in the art given the contents of the present disclosure.
Controller 118 may control the various operations performed by robot 102. Controller 118 may include and/or comprise one or more processors or processing devices (e.g., microprocessors) and other peripherals. As previously mentioned and used herein, processor, processing device, microprocessor, and/or digital processing device may include any type of digital processing device such as, without limitation, digital signal processors (“DSPs”), reduced instruction set computers (“RISC”), complex instruction set computers (“CISC”), microprocessors, gate arrays (e.g., field programmable gate arrays (“FPGAs”)), programmable logic devices (“PLDs”), reconfigurable computer fabrics (“RCFs”), array processors, secure microprocessors, and application-specific integrated circuits (“ASICs”). Peripherals may include hardware accelerators configured to perform a specific function using hardware elements such as, without limitation, encryption/decryption hardware, algebraic processors (e.g., tensor processing units, quadratic problem solvers, multipliers, etc.), data compressors, encoders, arithmetic logic units (“ALU”), and the like. Such digital processors may be contained on a single unitary integrated circuit die, or distributed across multiple components.
Controller 118 may be operatively and/or communicatively coupled to memory 120. Memory 120 may include any type of integrated circuit or other storage device configured to store digital data including, without limitation, read-only memory (“ROM”), random access memory (“RAM”), non-volatile random access memory (“NVRAM”), programmable read-only memory (“PROM”), electrically erasable programmable read-only memory (“EEPROM”), dynamic random-access memory (“DRAM”), mobile DRAM, synchronous DRAM (“SDRAM”), double data rate SDRAM (“DDR/2 SDRAM”), extended data output (“EDO”) RAM, fast page mode RAM (“FPM”), reduced latency DRAM (“RLDRAM”), static RAM (“SRAM”), flash memory (e.g., NAND/NOR), memristor memory, pseudostatic RAM (“PSRAM”), etc. Memory 120 may provide computer-readable instructions and data to controller 118. For example, memory 120 may be a non-transitory, computer-readable storage apparatus and/or medium having a plurality of instructions stored thereon, the instructions being executable by a processing apparatus (e.g., controller 118) to operate robot 102. In some cases, the computer-readable instructions may be configured to, when executed by the processing apparatus, cause the processing apparatus to perform the various methods, features, and/or functionality described in this disclosure. Accordingly, controller 118 may perform logical and/or arithmetic operations based on program instructions stored within memory 120. In some cases, the instructions and/or data of memory 120 may be stored in a combination of hardware, some located locally within robot 102, and some located remote from robot 102 (e.g., in a cloud, server, network, etc.).
It should be readily apparent to one of ordinary skill in the art that a processor may be internal to or on board robot 102 and/or may be external to robot 102 and be communicatively coupled to controller 118 of robot 102 utilizing communication units 116 wherein the external processor may receive data from robot 102, process the data, and transmit computer-readable instructions back to controller 118. In at least one non-limiting exemplary embodiment, the processor may be on a remote server (not shown).
In some exemplary embodiments, memory 120, shown in
Still referring to
Returning to
In exemplary embodiments, navigation units 106 may include systems and methods that may computationally construct and update a map of an environment, localize robot 102 (e.g., find its position) in a map, and navigate robot 102 to/from destinations. The mapping may be performed by imposing data obtained in part by sensor units 114 into a computer-readable map representative at least in part of the environment. In exemplary embodiments, a map of an environment may be uploaded to robot 102 through user interface units 112, uploaded wirelessly or through wired connection, or taught to robot 102 by a user.
In exemplary embodiments, navigation units 106 may include components and/or software configured to provide directional instructions for robot 102 to navigate. Navigation units 106 may process maps, routes, and localization information generated by mapping and localization units, data from sensor units 114, and/or other operative units 104.
Still referring to
Actuator unit 108 may also include any system used for actuating and, in some cases, for actuating task units to perform tasks. For example, actuator unit 108 may include driven magnet systems, motors/engines (e.g., electric motors, combustion engines, steam engines, and/or any type of motor/engine known in the art), solenoid/ratchet system, piezoelectric system (e.g., an inchworm motor), magnetostrictive elements, gesticulation, and/or any actuator known in the art.
According to exemplary embodiments, sensor units 114 may comprise systems and/or methods that may detect characteristics within and/or around robot 102. Sensor units 114 may comprise a plurality and/or a combination of sensors. Sensor units 114 may include sensors that are internal to robot 102 or external, and/or have components that are partially internal and/or partially external. In some cases, sensor units 114 may include one or more exteroceptive sensors, such as sonars, light detection and ranging (“LiDAR”) sensors, radars, lasers, cameras (including video cameras (e.g., red-green-blue (“RGB”) cameras, infrared cameras, three-dimensional (“3D”) cameras, thermal cameras, etc.), time of flight (“ToF”) cameras, structured light cameras, etc.), antennas, motion detectors, microphones, and/or any other sensor known in the art. According to some exemplary embodiments, sensor units 114 may collect raw measurements (e.g., currents, voltages, resistances, gate logic, etc.) and/or transformed measurements (e.g., distances, angles, detected points in obstacles, etc.). In some cases, measurements may be aggregated and/or summarized. Sensor units 114 may generate data based at least in part on distance or height measurements. Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc.
According to exemplary embodiments, sensor units 114 may include sensors that may measure internal characteristics of robot 102. For example, sensor units 114 may measure temperature, power levels, statuses, and/or any characteristic of robot 102. In some cases, sensor units 114 may be configured to determine the odometry of robot 102. For example, sensor units 114 may include proprioceptive sensors, which may comprise sensors such as accelerometers, inertial measurement units (“IMU”), odometers, gyroscopes, speedometers, cameras (e.g. using visual odometry), clock/timer, and the like. Odometry may facilitate autonomous navigation and/or autonomous actions of robot 102. This odometry may include robot 102's position (e.g., where position may include robot's location, displacement and/or orientation, and may sometimes be interchangeable with the term pose as used herein) relative to the initial location. Such data may be stored in data structures, such as matrices, arrays, queues, lists, stacks, bags, etc. According to exemplary embodiments, the data structure of the sensor data may be called an image.
According to exemplary embodiments, sensor units 114 may be in part external to the robot 102 and coupled to communications units 116. For example, a security camera within an environment of a robot 102 may provide a controller 118 of the robot 102 with a video feed via wired or wireless communication channel(s). In some instances, sensor units 114 may include sensors configured to detect a presence of an object at a location. For example, without limitation, a pressure or motion sensor may be disposed at a shopping cart storage location of a grocery store, wherein the controller 118 of the robot 102 may utilize data from the pressure or motion sensor to determine if the robot 102 should retrieve more shopping carts for customers.
According to exemplary embodiments, user interface units 112 may be configured to enable a user to interact with robot 102 (for example, exchange information with the robot). For example, user interface units 112 may include touch panels, buttons, keypads/keyboards, ports (e.g., universal serial bus (“USB”), digital visual interface (“DVI”), Display Port, E-Sata, FireWire, PS/2, Serial, VGA, SCSI, audio port, high-definition multimedia interface (“HDMI”), personal computer memory card international association (“PCMCIA”) ports, memory card ports (e.g., secure digital (“SD”) and miniSD), and/or ports for computer-readable medium), mice, rollerballs, consoles, vibrators, audio transducers, and/or any interface for a user to input and/or receive data and/or commands, whether coupled wirelessly or through wires. Users may interact through voice commands or gestures. User interface units 112 may include a display, such as, without limitation, liquid crystal displays (“LCDs”), light-emitting diode (“LED”) displays, LED LCD displays, in-plane-switching (“IPS”) displays, cathode ray tubes, plasma displays, high definition (“HD”) panels, 4K displays, retina displays, organic LED displays, touchscreens, surfaces, canvases, and/or any displays, televisions, monitors, panels, and/or devices known in the art for visual presentation. According to exemplary embodiments, user interface units 112 may be positioned on the body of robot 102. According to exemplary embodiments, user interface units 112 may be positioned away from the body of robot 102 but may be communicatively coupled to robot 102 (e.g., via communication units including transmitters, receivers, and/or transceivers) directly or indirectly (e.g., through a network, server, and/or a cloud). According to exemplary embodiments, user interface units 112 may include one or more projections of images on a surface (e.g., the floor) proximally located to the robot, e.g., to provide information to the occupant or to people around the robot. The information could be the direction of future movement of the robot, such as an indication of moving forward, left, right, back, at an angle, and/or any other direction. In some cases, such information may utilize arrows, colors, symbols, etc.
According to exemplary embodiments, communications unit 116 may include one or more receivers, transmitters, and/or transceivers. Communications unit 116 may be configured to send/receive a transmission protocol, such as BLUETOOTH®, ZIGBEE®, Wi-Fi, induction wireless data transmission, radio frequencies, radio transmission, radio-frequency identification (“RFID”), near-field communication (“NFC”), infrared, network interfaces, cellular technologies such as 3G (3.5G, 3.75G, 3GPP/3GPP2/HSPA+), 4G (4GPP/4GPP2/LTE/LTE-TDD/LTE-FDD), 5G (5GPP/5GPP2), or 5G LTE (long-term evolution, and variants thereof including LTE-A, LTE-U, LTE-A Pro, etc.), high-speed downlink packet access (“HSDPA”), high-speed uplink packet access (“HSUPA”), time division multiple access (“TDMA”), code division multiple access (“CDMA”) (e.g., IS-95A, wideband code division multiple access (“WCDMA”), etc.), frequency hopping spread spectrum (“FHSS”), direct sequence spread spectrum (“DSSS”), global system for mobile communication (“GSM”), Personal Area Network (“PAN”) (e.g., PAN/802.15), worldwide interoperability for microwave access (“WiMAX”), 802.20, long term evolution (“LTE”) (e.g., LTE/LTE-A), time division LTE (“TD-LTE”), narrowband/frequency-division multiple access (“FDMA”), orthogonal frequency-division multiplexing (“OFDM”), analog cellular, cellular digital packet data (“CDPD”), satellite systems, millimeter wave or microwave systems, acoustic, infrared (e.g., infrared data association (“IrDA”)), and/or any other form of wireless data transmission.
Communications unit 116 may also be configured to send/receive signals utilizing a transmission protocol over wired connections, such as any cable that has a signal line and ground. For example, such cables may include Ethernet cables, coaxial cables, Universal Serial Bus (“USB”), FireWire, and/or any connection known in the art. Such protocols may be used by communications unit 116 to communicate to external systems, such as computers, smart phones, tablets, data capture systems, mobile telecommunications networks, clouds, servers, or the like. Communications unit 116 may be configured to send and receive signals comprising numbers, letters, alphanumeric characters, and/or symbols. In some cases, signals may be encrypted, using algorithms such as 128-bit or 256-bit keys and/or other encryption algorithms complying with standards such as the Advanced Encryption Standard (“AES”), RSA, Data Encryption Standard (“DES”), Triple DES, and the like. Communications unit 116 may be configured to send and receive statuses, commands, and other data/information. For example, communications unit 116 may communicate with a user operator to allow the user to control robot 102. Communications unit 116 may communicate with a server/network (e.g., a network) in order to allow robot 102 to send data, statuses, commands, and other communications to the server. The server may also be communicatively coupled to computer(s) and/or device(s) that may be used to monitor and/or control robot 102 remotely. Communications unit 116 may also receive updates (e.g., firmware or data updates), data, statuses, commands, and other communications from a server for robot 102.
In exemplary embodiments, operating system 110 may be configured to manage memory 120, controller 118, power supply 122, modules in operative units 104, and/or any software, hardware, and/or features of robot 102. For example, and without limitation, operating system 110 may include device drivers to manage hardware resources for robot 102.
In exemplary embodiments, power supply 122 may include one or more batteries, including, without limitation, lithium, lithium ion, nickel-cadmium, nickel-metal hydride, nickel-hydrogen, carbon-zinc, silver-oxide, zinc-carbon, zinc-air, mercury oxide, alkaline, or any other type of battery known in the art. Certain batteries may be rechargeable, such as wirelessly (e.g., by resonant circuit and/or a resonant tank circuit) and/or plugging into an external power source. Power supply 122 may also be any supplier of energy, including wall sockets and electronic devices that convert solar, wind, water, nuclear, hydrogen, gasoline, natural gas, fossil fuels, mechanical energy, steam, and/or any power source into electricity, including converting alternating current (AC) power to direct current (DC) power.
One or more of the units described with respect to
As used herein, a robot 102, a controller 118, or any other controller, processing device, or robot performing a task, operation or transformation illustrated in the figures below comprises a controller executing computer readable instructions stored on a non-transitory computer readable storage apparatus, such as memory 120, as would be appreciated by one skilled in the art.
Next referring to
One of ordinary skill in the art would appreciate that the architecture illustrated in
One of ordinary skill in the art would appreciate that a controller 118 of a robot 102 may include one or more processing devices 138 and may further include other peripheral devices used for processing information, such as ASICs, DSPs, proportional-integral-derivative (“PID”) controllers, hardware accelerators (e.g., encryption/decryption hardware), and/or other peripherals (e.g., analog to digital converters) described above in
Individual beams 208 of photons may localize respective points 204 of the wall 206 in a point cloud, the point cloud comprising a plurality of points 204 localized in 2D or 3D space as illustrated in
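For illustrative purposes only, the following non-limiting sketch (in Python) shows one way such range/angle measurements could be converted into a two-dimensional point cloud; the names used (e.g., planar_lidar_to_points, ranges_m) are illustrative placeholders and are not part of any specific implementation disclosed herein.

```python
import math

def planar_lidar_to_points(ranges_m, angles_rad):
    """Convert planar LiDAR (range, angle) pairs into 2D (x, y) points.

    ranges_m:   list of measured distances, one per beam
    angles_rad: list of beam angles within the sensor's planar field of view
    Returns a list of (x, y) tuples in the sensor frame; beams with no return
    (None) are skipped.
    """
    points = []
    for r, theta in zip(ranges_m, angles_rad):
        if r is None:  # beam returned no measurement
            continue
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Example: three beams sweeping a wall roughly 2 m away
print(planar_lidar_to_points([2.0, 2.1, None], [-0.1, 0.0, 0.1]))
```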
According to at least one non-limiting exemplary embodiment, sensor 202 may be illustrative of a depth camera or other ToF sensor configurable to measure distance, wherein the sensor 202 being a planar LiDAR sensor is not intended to be limiting. Depth cameras may operate similar to planar LiDAR sensors (i.e., measure distance based on a ToF of beams 208); however, depth cameras may emit beams 208 using a single pulse or flash of electromagnetic energy, rather than sweeping a laser beam across a field of view. Depth cameras may additionally comprise a two-dimensional field of view rather than a one-dimensional, planar field of view.
According to at least one non-limiting exemplary embodiment, sensor 202 may be illustrative of a structured light LiDAR sensor configurable to sense distance and shape of an object by projecting a structured pattern onto the object and observing deformations of the pattern. For example, the size of the projected pattern may represent distance to the object and distortions in the pattern may provide information of the shape of the surface of the object. Structured light sensors may emit beams 208 along a plane as illustrated or in a predetermined pattern (e.g., a circle or series of separated parallel lines).
Absent noise, atmospheric diffusion, imperfections in mirror 214, and/or other distortions, the beam 208 will be emitted perfectly along the x-y plane as defined by coordinates 218. However, in practice, LiDAR sensors 202 are often subject to noise, imperfections, an atmosphere/transmission medium, and other perturbations that may cause the beam 208 to be emitted at an angle away from the x-y plane. The typical variance of the beam 208 is shown via boundary lines 216 (
Next,
One skilled in the art will appreciate that angle α is the result of noise and atmospheric diffusion, which may vary over time, whereas angle β is determined by the sampling area of the LiDAR sensor and can be presumed herein to be a fixed parameter.
First, in
Currently, the only protection to the LiDAR 302 includes a front cover 304 and a brush 306, wherein neither of these components is designed specifically for protection of the LiDAR 302, but both may provide marginal protection against large objects entering between them and colliding with the LiDAR 302 lens. In one exemplary scenario, the robot 102 may operate in a retail environment comprising a plurality of humans and shopping carts, wherein the lower racks of shopping carts are approximately the same height as the LiDAR 302 and thin enough to pass between front cover 304 and brush 306, and therefore pose a great risk of collision with the sensor 302. In another exemplary scenario, protrusions of shelves, displays, or other (typically thin or highly reflective) objects may pose a risk of collision with the LiDAR sensor 302. Accordingly, a protective module 400 as disclosed herein will be coupled to this lower LiDAR sensor 302.
As shown, the LiDAR sensor 302 lens is now entirely exposed. One may appreciate that thin (e.g., about 3-inch) objects, such as protrusions from shelves, lower shopping cart racks, etc., may easily collide with the lens of the LiDAR 302, even if the lower brush 306 is coupled to the robot 102. A metallic support 314 is included in this embodiment of robot 102; however, this lower support 314 does not provide any added protection to the LiDAR 302 lens, as shown by the lack of occlusion of the lens from this front-facing perspective. Support 314 is utilized, in this embodiment, to perform small calibration adjustments to the sensor 302. While support 314 may protect the LiDAR 302 lens from thick objects, it does not protect it from thin ones.
The LiDAR 302 is coupled to the robot 102 via a frame 308 which couples to the robot 102 chassis via bolts 310. The frame 308 may further include threads or holes 312 which couple the lower part of the front panel 304 (not shown) to the robot 102 chassis. In some embodiments, these threads 312 may be utilized to couple the protective module disclosed herein to the robot 102. In other embodiments, the entire frame 308 may be decoupled from the robot 102 and replaced with a new frame which includes the protective module disclosed herein, which is depicted next in
In this exemplary embodiment, an existing robot 102 is being reconfigured with the protective module 400 disclosed herein. It is appreciated that removal of the prior frame 308 is not necessary to utilize the protective module 400, wherein the module 400 could be installed during production of the robot 102. Further, replacement of the prior frame 308 with the new frame 414 may not always be necessary, wherein the frame 414 substantially resembles the old frame 308, but replacing the entire frame 308 could be beneficial for quicker retrofitting of robots 102 operating in the field. In some instances, it may be necessary if one or more mechanical/electrical couplers (e.g., threads 312, cable interfaces, etc.) in the new frame 414 are not present in the old frame 308.
The teeth are positioned in front of the sensor 202 lens, thereby blocking some of the outgoing and incoming signals. The outgoing beams 208 include a variance of angle α from an x-y plane due to atmospheric diffusion and noise, as described in
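Consistent with this geometry, and assuming the beam diverges symmetrically by α/2 above and below the measurement plane, the lower bound referenced herein as equation 1 may be expressed as:

l ≥ 2·w·tan(α/2)   (Eqn. 1)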
wherein l is the gap size, w is the distance from the emission/focal point of the beams to the tooth gap, and α/2 represents half the illustrated angle α. It can be appreciated, however, that this lower bound is substantially smaller than most hazards, e.g., shopping cart bars, and is only concerned with emission of the signal from the sensor 202 without disruption/reflections off teeth 408. Further, this bound does not account for a return signal, which must be maintained at a threshold power and/or signal to noise ratio (“SNR”), which may require the gap to be larger than this lower bound, as discussed below. The value of l also has a maximum bound, wherein the value of l plus the size of the teeth 408, ht, should be equal to or smaller than the size of the hazard of concern. Due to the alternating spacing, the maximum gap size which prevents the hazard from colliding with the sensor 202 is l+ht rather than l, which aids in ensuring the returning signals are detected, as discussed next in
The module 400 further includes a connection interface or frame 414, which includes one or more structural components configured to couple the top and bottom portions 410, 412 to each other and to a robot 102. The frame 414 may include one or more holes 406 (shown in
According to at least one non-limiting exemplary embodiment, the holes 406 of the connection interface may not align with any pre-drilled holes within a robot 102 body/chassis. Accordingly, in some instances, a skilled technician may be required to drill new holes in the robot 102 chassis at locations that align with the holes 406 bored in the structural components 414 of the module 400 in order to couple the module 400 to the robot 102. Alternatively, the entire frame 414 could replace a previous frame, e.g., 308 shown in
Receivers 504 are characterized by a surface area which is configured to collect a reflected signal. Collection of more reflected signal power (i.e., light) corresponds to a stronger signal with less noise, yielding more robust and reliable measurements. As mentioned previously, the gap l is configured such that no outgoing light is incident upon the teeth 408. However, in considering reflected light, the SNR must be above a threshold level to obtain a measurement. Further, in some instances, the power of the reflected signal must also be above a threshold value. Lastly, the power of the returning signal is dependent on the distance traveled by the signal, wherein distance to objects of interest must also be considered. For example, in some embodiments, a robot 102 may not need to know the locations of objects beyond 20 meters from itself. One skilled in the art would appreciate that the teeth 408 may occlude a portion of the detector 504 surface area, as seen by a target at a distance from the sensor 202, thereby weakening the return signal.
Stated another way, the perspective view shown in
Light emitted from the sensor 202 may be highly directional (e.g., a spinning laser), but it is appreciated that the returning light is not directional and is a result of diffuse reflection off a surface. Accordingly, any given beam 208 emitted by the sensor 202 may be reflected back to the detector element 504 and be incident on the detector 504 at many locations, wherein the return signal power is integrated over the area of the detector 504.
The SNR required to obtain a measurement may be a fixed parameter intrinsic to the sensor 202 used. Often such a parameter is stated by manufacturers of the sensors 202 and can be treated as a fixed, known value. To obtain sufficient SNR to operate the sensor 202, the parameters of the teeth 408 may be adjusted for a target at a certain distance. Ideally, the certain distance should correspond to the maximum distance at which the robot 102 must consider nearby objects to operate, but could be extended for safety concerns. The parameters of the teeth include the tooth width dt, the tooth gap dg, the tooth 408 height ht, and the dimensions h and l. The tooth width dt, tooth height ht, and tooth gap dg may all be adjusted without substantial constraint. For example, it may be determined that the teeth 408 must not occlude more than 10% of the detector element 504 area for a given target at a certain range, wherein there exists an infinite number of values of dt, ht, and dg which satisfy this constraint.
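For illustrative purposes only, a minimal, non-limiting sketch (in Python) of how such an occlusion constraint could be checked under a simplified orthographic model is provided below; the rectangular-tooth assumption, the parameter names, and the example dimensions are illustrative only and do not correspond to any particular sensor 202 or module 400.

```python
def occluded_fraction(tooth_width, tooth_height, tooth_spacing, aperture_height):
    """Approximate fraction of the detector aperture occluded by two offset rows of
    rectangular teeth, viewed orthographically from a distant target.

    Over one horizontal period (tooth_width + tooth_spacing), the top row and the
    offset bottom row each contribute one tooth of area tooth_width * tooth_height.
    """
    period = tooth_width + tooth_spacing
    occluded_area_per_period = 2.0 * tooth_width * tooth_height
    total_area_per_period = period * aperture_height
    return occluded_area_per_period / total_area_per_period

# Example (illustrative dimensions in meters): 5 mm wide, 8 mm tall teeth spaced
# 15 mm apart in front of a 60 mm tall aperture
frac = occluded_fraction(0.005, 0.008, 0.015, 0.060)
print(f"occluded fraction: {frac:.1%}")  # compare against, e.g., a 10% or 20% limit
assert frac <= 0.20, "teeth occlude more of the detector than the chosen threshold"
```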
However, some additional constraints are considered. Namely, the value l must be less than the size of the hazard the module 400 is intended to prevent from colliding with the lens 502, which in turn may constrain w, or the distance between the teeth 408 and the focal point of the sensor 202 (shown in
It is appreciated that there is no one solution, for any given set of sensor intrinsic parameters, for the values of dt, dg, and ht which allow sufficient returning light to be received by the detector 504. Many solutions are contemplated without limitation. For example, the width dt of the teeth 408 may be increased while the spacing ds decreases without impacting (i) the safety of the lens 502 from collision with objects, or (ii) the occlusion of the detector element 504.
For the purpose of explanation, the size and shape of the teeth 408T, 408B are shown as uniform. However, one skilled in the art may appreciate that non-uniform teeth 408 may also be used on either the top, bottom or both portions of the module 400. Preferably, however, the teeth 408 should still be offset and extend into spacings formed by the teeth of the opposing side for the reasons discussed above. The size and shape of the teeth 408 shown herein and in other figures are configured such that the area of the detector 504 occluded is substantially uniform. The size and shape of the teeth 408 and their spacing dg shown herein and in other figures (e.g.,
One skilled in the art may appreciate that LiDAR lenses are rounded, wherein the view shown in
To illustrate the constraints of the various parameters discussed in
According to at least one non-limiting exemplary embodiment, the emissions from the LiDAR 202 may not follow the depicted model, wherein beams 208 deviate from the focal point or mirror 214. Rather, the beam 208 variance may be modeled as deviating from a focal point located on the outer lens of the sensor 202. Accordingly, in some instances, w may be measured from the outer lens of the sensor 202 rather than the center point or at the mirror 214.
Stated another way, α, the SNR required to receive a measurement, and the size of the hazards are fixed parameters. Parameters l, w, h, ht, ds, dt, and dg are all tunable parameters. The lower bound of l is based on the beam divergence (α) and its distance w from the focal point, following equation 1. The upper bound of l, wherein h=l+ht, is dependent on the width of the hazards, wherein a larger h or l may cause such hazards to contact the sensor. Ideally, l is maximized to provide maximum SNR to the receiver 504, wherein the constraints on l may dictate the size, shape, and spacing of the teeth 408B, 408T. One skilled in the art may further appreciate that any of the listed tunable parameters may be constrained by other factors such as the body of the robot 102, attachment mechanisms therein, and other form specific factors. A process flow diagram 700 illustrating these design considerations is provided below in
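Restated compactly (a non-limiting summary, using the relation h = l + ht described above), these bounds may be written as:

2·w·tan(α/2) ≤ l, and h = l + ht ≤ (width of the narrowest hazard to be blocked),

wherein the first inequality follows equation 1 and the second ensures the effective opening remains smaller than the hazard.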
Advantageously, use of a module 400 to protect a LiDAR 202 from hazards the LiDAR 202 is designed to detect enables protection of the lens 502 from contact with objects (e.g., resulting in scratches and cracks) without inhibiting the ability of the sensor 202 to measure the environment. The use of teeth 408 provides multiple readily tunable parameters (e.g., dt, dg, and ht) to enable one skilled in the art to design a protective module 400 to meet the intrinsic parameters of any sensor 202 without substantially occluding the detector element 504. Further, ensuring the teeth 408 do not extend into the angle α ensures no distance measurements from the sensor 202 to the teeth 408 are taken. Lastly, by alternating the teeth 408 in the top and bottom portions 410, 412, the gap h may be maximized to be just smaller than a hazard from which the module 400 is designed to protect the sensor 202, yielding a stronger and more uniform (i.e., non-directional) return signal than designs with aligned teeth 408.
Block 602 includes the operator powering off the robot 102.
Block 604 includes the operator detaching any protective coverings already in place. With reference to
One optional, recommended step is to cover the LiDAR sensor 202 with a temporary protective covering, such as a plastic film, to avoid scratches when performing the later steps of method 600. Often small scratches may impede the performance of the sensor 202 without being readily visible to the human eye.
Block 606 includes the operator placing a module 400 over the sensor 202. In some embodiments, the sensor 202 may remain coupled to the robot 102 (e.g., via mechanical attachment mechanisms and electronically via cables) as the module 400 is affixed to the robot 102 to ensure the sensor 202 position is not changed. The top and bottom portions 410, 412 of the module 400 form a U-shape, as can be seen in
In other embodiments, the sensor 202 may be coupled to portions of the robot 102 which need to be removed in block 602 in order to access the sensor 202. For instance, with reference to
Block 608 includes the operator securing a connection interface of the module 400 to the robot 102. The connection interface may include one or more mechanical attachment mechanisms, such as bolts 310 and/or 316 shown in
Block 610 includes the operator reattaching any protective coverings removed in block 604, which can be placed back onto the robot 102. For example, the front panel 304 and/or brush 306 may be reaffixed to the robot 102. It is appreciated that the parts reaffixed or removed from the robot 102 may be specific to the particular make or model of robot; however, a protective module 400 may be used on any robot 102 regardless of its make or model, provided the connection interface is configured for the particular make or model. Accordingly, the teeth 408 of module 400 may be configured in a manner suitable for a particular sensor, whereas the connection interface may be configured to enable the protective module 400 to be used by any robot 102 comprising the particular sensor, thereby improving adaptability of the protective module 400 for use in a wide variety of robots 102 and sensors 202. For example, a module 400 may comprise portions 410 and 412 configured for a specific sensor, and a connection interface configured for a specific robot.
Block 702 includes the designer determining a maximum operable range for the LiDAR sensor. The maximum operable range may be different from the maximum range of the sensor, wherein the maximum operable range refers to the range at which the robot 102 must consider nearby objects when operating autonomously. For instance, a robot 102 may not need to concern itself with objects beyond 20 meters from itself in its normal operations in, e.g., path planning. Thus, despite the LiDAR sensor being potentially capable of measuring ranges beyond 20 meters, the module 400 is only required to facilitate measurements at 20 meters or less.
Block 704 includes the designer determining the beam variance (α). The beam variance is an intrinsic parameter of the sensor and may be provided by a manufacturer of the sensor. Alternatively, it may be measured via a detector placed at a known distance and measuring the area upon which the beams 208 are incident.
Block 706 includes the designer placing a target at the maximum operable range for the sensor within the FOV of the sensor. The target may comprise any non-specular surface, preferably one of moderate reflectivity to simulate typical objects, which may comprise varying reflectivity. For instance, a broadband (in near infrared or infrared) reflective target would reflect additional light back to the sensor due to its high reflectivity; however, in normal operation of the robot 102 such highly reflective objects may not always be present. If desired for safety concerns, the designer may utilize a poorly reflective surface, such as a black object, to simulate a worst case scenario.
Block 708 includes the designer determining a minimum gap l based on a distance w from the focal point of the sensor and the beam variance (α) in accordance with equation 1 above. The distance w may be additionally constrained by the size, shape and layout of the robot 102 chassis. For instance, with reference to
Block 710 includes the designer configuring two rows of misaligned teeth 408 on top 408T and bottom 408B of the sensor at the distance w from the focal point. The teeth comprise an area as viewed by the target (e.g., as shown in
Block 712 includes the designer determining if the sensor can detect the target object at the maximum operable range. The designer may, for instance, activate the sensor and read distance measurements therefrom, wherein the distances should be approximately the maximum operable range. In some instances, the outgoing beam may reflect off the teeth 408 if l is not configured properly, yielding very short distance measurements.
If the sensor is unable to measure a distance to the target, the designer may move to block 714. If the sensor can measure a distance to the target, the designer may move to block 716. It is appreciated that a LiDAR 202 sensor may, even without any occlusion, fail to return range measurements. Accordingly, the designer may determine that the sensor 202 is able to measure the target if a sufficient number or percentage (e.g., 90% or more) of emitted beams 208 return a distance measurement.
Block 714 includes decreasing the area of the teeth or increasing l, if l is not currently maximized to be equal to the size of the hazards. Either or both options effectively reduce the occlusion of the detector 504, thereby improving SNR of the returning signal. Improving the SNR of the return signal may effectively increase the operable range of the sensor with the module 400 coupled thereto. Upon increasing l and/or decreasing the areas occluded by the teeth 408B and/or 408T, the designer may subsequently return to block 712 to verify if the sensor is able to measure the target with the modifications applied. In some instances, the spacing ds between any two teeth 408 may be increased to improve the SNR.
Block 716 includes the designer determining if the hazard can pass through the teeth 408T and 408B. The hazards against which the module 400 is designed to protect may be case specific, based on the environments within which the robots 102 operate and where the sensor 202 is disposed on the robot. For instance, a robot 102 operating in a supermarket using a LiDAR 302 as shown in
If the hazard can pass through the teeth 408, the designer may move to block 718 to decrease l and/or increase the area occluded by the teeth 408. Following such change, the designer returns to block 712 to ensure the sensor is still able to measure the target. In some cases, the spacing between the teeth 408 could be reduced (and the number of teeth 408 increased) if the hazard is small enough to fit in between the spaces of the teeth 408.
If the hazard is not able to pass through the teeth, the designer moves to block 720 and the module 400 is configured correctly.
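For illustrative purposes only, the iterative selection described in blocks 702-720 may be summarized by the following non-limiting sketch (in Python); the callables sensor_detects_target and hazard_passes_through stand in for the physical measurements of blocks 712 and 716, and all names, step sizes, and starting values are illustrative placeholders rather than part of any disclosed implementation.

```python
import math

def design_tooth_geometry(w, alpha, hazard_width, sensor_detects_target,
                          hazard_passes_through, step=0.001, max_iterations=100):
    """Sketch of the iterative design flow of blocks 702-720.

    w:                     distance from the sensor focal point to the tooth gap
    alpha:                 beam variance (full divergence angle), in radians
    hazard_width:          width of the smallest hazard the module should block
    sensor_detects_target: callable(l, tooth_area) -> bool (the check of block 712)
    hazard_passes_through: callable(l, tooth_area) -> bool (the check of block 716)
    """
    l = 2.0 * w * math.tan(alpha / 2.0)  # minimum gap per equation 1 (block 708)
    tooth_area = 0.010                   # illustrative starting tooth area (block 710)

    for _ in range(max_iterations):
        if not sensor_detects_target(l, tooth_area):   # block 712 fails -> block 714
            tooth_area = max(tooth_area - step, 0.0)   # shrink teeth to reduce occlusion
            l = min(l + step, hazard_width)            # and/or widen the gap, up to the hazard size
            continue
        if hazard_passes_through(l, tooth_area):       # block 716 fails -> block 718
            l = max(l - step, 0.0)                     # tighten the gap
            tooth_area += step                         # and/or enlarge the teeth
            continue
        return l, tooth_area                           # block 720: configured correctly
    raise RuntimeError("no satisfying geometry found within the iteration budget")
```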
Advantageously, method 700 enables designers to configure a protective module 400 to protect LiDAR sensors on a wide variety of robots 102 and/or LiDARs. Further, the protective module 400 is designed with modularity as an essential aspect, as it may often be necessary to add additional protection to existing robots 102. It is appreciated that robots 102 operate in a wide variety of environments, each containing different hazards. Often specific hazards are anticipated, like shopping carts in a supermarket; however, on occasion some hazards are not, such as a specific shelf or object unique to the environment which protrudes at a height just above the LiDAR and is undetectable by the LiDAR. By designing a modular protective covering 400 with a plurality of adjustable parameters, designers are capable of protecting LiDAR sensors on a wide variety of robots 102 operating in a wide variety of environments, as well as responding to new hazards as they are identified.
According to at least one non-limiting exemplary embodiment, the teeth 408 of the module 400 could be utilized as a calibration reference point. With reference to
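For illustrative purposes only, one possible (non-limiting) reading of this use is sketched below in Python: because the gap l is sized so that a properly aligned beam never strikes the teeth 408, returns measured at approximately the known tooth distance may indicate that the sensor has shifted or tilted. The function and parameter names are illustrative placeholders rather than part of any disclosed implementation.

```python
def detect_possible_misalignment(scan, tooth_distance, range_tol=0.01):
    """Flag possible sensor misalignment using the teeth as a fixed reference.

    scan:           iterable of (angle_rad, range_m) measurements from one sweep
    tooth_distance: known distance (m) from the sensor focal point to the teeth
    range_tol:      tolerance (m) around the tooth distance

    Returns (misalignment_suspected, offending_returns). In a well-aligned setup
    no beam should strike the teeth, so any return near tooth_distance is suspect.
    """
    offending = [(a, r) for a, r in scan if abs(r - tooth_distance) <= range_tol]
    return len(offending) > 0, offending
```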
It will be recognized that while certain aspects of the disclosure are described in terms of a specific sequence of steps of a method, these descriptions are only illustrative of the broader methods of the disclosure, and may be modified as required by the particular application. Certain steps may be rendered unnecessary or optional under certain circumstances. Additionally, certain steps or functionality may be added to the disclosed embodiments, or the order of performance of two or more steps permuted. All such variations are considered to be encompassed within the disclosure disclosed and claimed herein.
While the above detailed description has shown, described, and pointed out novel features of the disclosure as applied to various exemplary embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the disclosure. The foregoing description is of the best mode presently contemplated of carrying out the disclosure. This description is in no way meant to be limiting, but rather should be taken as illustrative of the general principles of the disclosure. The scope of the disclosure should be determined with reference to the claims.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive. The disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments and/or implementations may be understood and effected by those skilled in the art in practicing the claimed disclosure from a study of the drawings, the disclosure and the appended claims.
It should be noted that the use of particular terminology when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being re-defined herein to be restricted to include any specific characteristics of the features or aspects of the disclosure with which that terminology is associated. Terms and phrases used in this application, and variations thereof, especially in the appended claims, unless otherwise expressly stated, should be construed as open-ended as opposed to limiting. As examples of the foregoing, the term “including” should be read to mean “including, without limitation,” “including but not limited to,” or the like; the term “comprising” as used herein is synonymous with “including,” “containing,” or “characterized by,” and is inclusive or open-ended and does not exclude additional, unrecited elements or method steps; the term “having” should be interpreted as “having at least;” the term “such as” should be interpreted as “such as, without limitation:” the term “includes” should be interpreted as “includes but is not limited to;” the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof, and should be interpreted as “example, but without limitation;” adjectives such as “known,” “normal,” “standard,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass known, normal, or standard technologies that may be available or known now or at any time in the future; and use of terms like “preferably,” “preferred,” “desired,” or “desirable,” and words of similar meaning should not be understood as implying that certain features are critical, essential, or even important to the structure or function of the present disclosure, but instead as merely intended to highlight alternative or additional features that may or may not be utilized in a particular embodiment. Likewise, a group of items linked with the conjunction “and” should not be read as requiring that each and every one of those items be present in the grouping, but rather should be read as “and/or” unless expressly stated otherwise. Similarly, a group of items linked with the conjunction “or” should not be read as requiring mutual exclusivity among that group, but rather should be read as “and/or” unless expressly stated otherwise. The terms “about” or “approximate” and the like are synonymous and are used to indicate that the value modified by the term has an understood range associated with it, where the range may be ±20%, ±15%, ±10%, ±5%, or ±1%. The term “substantially” is used to indicate that a result (e.g., measurement value) is close to a targeted value, where close may mean, for example, the result is within 80% of the value, within 90% of the value, within 95% of the value, or within 99% of the value. Also, as used herein “defined” or “determined” may include “predefined” or “predetermined” and/or otherwise determined values, conditions, thresholds, measurements, and the like.
This application is a continuation of International Patent Application No. PCT/US23/17043 filed on Mar. 31, 2023 and claims the benefit of U.S. Provisional Patent Application Ser. No. 63/326,202 filed on Mar. 31, 2022 under 35 U.S.C. § 119, the entire disclosure of which is incorporated herein by reference.
Number | Date | Country
---|---|---
63/326,202 | Mar. 2022 | US
Relationship | Number | Date | Country
---|---|---|---
Parent | PCT/US23/17043 | Mar. 2023 | WO
Child | 18/893,360 | | US