This non-provisional patent application claims priority under 35 U.S.C. § 119 from Chinese Patent Application No. 202010972378.3 filed on Sep. 16, 2020, the entire content of which is incorporated herein by reference.
The disclosure relates to the technical field of autonomous driving, and particularly to an autonomous driving vehicle and a dynamic planning method of a drivable area.
With the rapid development of autonomous driving vehicles, autonomous driving technology is an inevitable trend for vehicles. Autonomous driving technology fits a concept of friendliness, meets requirements of social development such as high efficiency and low cost, and makes people's work and life more convenient. Autonomous driving technology includes four modules: a positioning module, a perception module, a decision-making module, and a control module. The positioning module obtains an accurate location of the vehicle on a specific map; the perception module dynamically collects and perceives information about the surrounding environment; the decision-making module processes the collected location and perception information and determines the drivable area; and the control module controls the vehicle to move laterally or longitudinally according to the drivable area from the decision-making module.
Planning a drivable area is a core technology in the field of autonomous driving. Planning a drivable area refers to planning an area that does not collide with obstacles and that satisfies the kinematic constraints, environment constraints, and time constraints of the vehicle, given an initial state, a target state, and the obstacle distribution in the environment of the vehicle. It is urgent to develop strategies to avoid obstacles, research suitable control methods, and plan different drivable areas.
The disclosure provides a dynamic planning method of a drivable area for an autonomous driving vehicle to solve the above problems.
In a first aspect, a dynamic planning method of drivable area is provided. The dynamic planning method of drivable area comprises steps of: obtaining a location of the autonomous driving vehicle at a current time; perceiving environment data about the environment around the autonomous driving vehicle; extracting lane information about lanes from the environment data, the lane information comprising locations of lane lines of the lanes; obtaining a first drivable area of the autonomous driving vehicle according to the location of the autonomous driving vehicle at the current time, a high-definition map, and the lane information, the first drivable area comprising lane areas located between two edge lines of each lane, and a shoulder located between each edge line of the lane and a curb adjacent to that edge line; extracting static information about static objects from the environment data, the static information containing locations of the static objects and regions of the static objects; extracting dynamic information about dynamic objects from the environment data, and predicting trajectories of the dynamic objects according to the dynamic information; and planning a second drivable area according to the first drivable area, the static information, the trajectories of the dynamic objects, and the lane information.
In a second aspect, an autonomous driving vehicle is provided. The autonomous driving vehicle comprises: a memory configured to store program instructions; and one or more processors configured to execute the program instructions to perform a dynamic planning method of drivable area, the dynamic planning method of drivable area for an autonomous driving vehicle comprising: obtaining a location of the autonomous driving vehicle at a current time; perceiving environment data about the environment around the autonomous driving vehicle; extracting lane information about lanes from the environment data, the lane information comprising locations of lane lines of the lanes; obtaining a first drivable area of the autonomous driving vehicle according to the location of the autonomous driving vehicle at the current time, a high-definition map, and the lane information, the first drivable area comprising lane areas located between two edge lines of each lane, and a shoulder located between each edge line of the lane and a curb adjacent to that edge line; extracting static information about static objects from the environment data, the static information containing locations of the static objects and regions of the static objects; extracting dynamic information about dynamic objects from the environment data, and predicting trajectories of the dynamic objects according to the dynamic information; and planning a second drivable area according to the first drivable area, the static information, the trajectories of the dynamic objects, and the lane information.
In a third aspect, a medium is provided. The medium comprises a plurality of program instructions, the program instructions being executable by one or more processors to perform a dynamic planning method of drivable area, the dynamic planning method of drivable area for an autonomous driving vehicle comprising: obtaining a location of the autonomous driving vehicle at a current time; perceiving environment data about the environment around the autonomous driving vehicle; extracting lane information about lanes from the environment data, the lane information comprising locations of lane lines of the lanes; obtaining a first drivable area of the autonomous driving vehicle according to the location of the autonomous driving vehicle at the current time, a high-definition map, and the lane information, the first drivable area comprising lane areas located between two edge lines of each lane, and a shoulder located between each edge line of the lane and a curb adjacent to that edge line; extracting static information about static objects from the environment data, the static information containing locations of the static objects and regions of the static objects; extracting dynamic information about dynamic objects from the environment data, and predicting trajectories of the dynamic objects according to the dynamic information; and planning a second drivable area according to the first drivable area, the static information, the trajectories of the dynamic objects, and the lane information.
As described above, the dynamic planning method can plan the drivable area for the autonomous driving vehicle based on the environment around the autonomous driving vehicle at the current moment and on analysis of the lane information, the static information, and the dynamic information in the surrounding environment, thereby obtaining a drivable area large enough for the autonomous driving vehicle to continue driving when obstacles are present.
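Although the method is described here in prose, its two-stage area construction can be illustrated compactly in code. The following is a minimal sketch, assuming the shapely geometry library and toy rectangular lanes; the helper names and coordinates are hypothetical illustrations, not the claimed method itself.

```python
"""Minimal sketch of the two-stage drivable-area construction.
All helper names and toy coordinates are illustrative assumptions."""
from shapely.geometry import Polygon
from shapely.ops import unary_union

def first_drivable_area(lane_polys, shoulder_polys):
    # First drivable area: lane areas between edge lines, plus the
    # shoulders between each edge line and the adjacent curb.
    return unary_union(lane_polys + shoulder_polys)

def second_drivable_area(first_area, static_regions, swept_regions):
    # Second drivable area: remove regions occupied by static objects
    # and by the predicted trajectories of dynamic objects.
    blocked = unary_union(static_regions + swept_regions)
    return first_area.difference(blocked)

# Toy inputs: one lane, one shoulder, one parked obstacle (metres).
lane = Polygon([(0, 0), (50, 0), (50, 3.5), (0, 3.5)])
shoulder = Polygon([(0, 3.5), (50, 3.5), (50, 5.0), (0, 5.0)])
obstacle = Polygon([(20, 0.5), (24, 0.5), (24, 2.5), (20, 2.5)])

area = second_drivable_area(first_drivable_area([lane], [shoulder]), [obstacle], [])
print(round(area.area, 1))  # remaining drivable surface in m^2
```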
In order to illustrate the technical solutions in the embodiments of the disclosure or in the prior art more clearly, a brief description of the drawings required for the embodiments or the prior art is given below. Obviously, the drawings described below show only some of the embodiments of the disclosure. Those of ordinary skill in the art can obtain other drawings according to the structures shown in these drawings without creative effort.
In order to make the purpose, technical solutions, and advantages of the disclosure clearer, the disclosure is further described in detail below in combination with the drawings and embodiments. It should be understood that the specific embodiments described herein are used only to explain the disclosure and are not used to limit it. All other embodiments obtained by those of ordinary skill in the art based on the embodiments in the disclosure without creative effort fall within the protection scope of the disclosure.
The terms “first”, “second”, “third”, and “fourth”, if any, in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence of priority. It should be understood that terms used in this way are interchangeable where appropriate; in other words, the embodiments described can be implemented in an order other than what is illustrated or described here. In addition, the terms “include” and “have”, and any variations of them, are intended to cover a non-exclusive inclusion. For example, processes, methods, systems, products, or equipment that comprise a series of steps or units need not be limited to the steps or units clearly listed, but may include other steps or units that are not clearly listed or that are inherent to these processes, methods, systems, products, or equipment.
It is to be noted that the references to “first”, “second”, etc. in the disclosure are for descriptive purposes only and shall neither be construed as indicating relative importance nor as implying the number of technical features indicated. Thus, a feature defined as “first” or “second” can explicitly or implicitly include one or more such features. In addition, technical solutions of different embodiments may be combined, but only on the basis that they can be implemented by those of ordinary skill in the art. When a combination of technical solutions is contradictory or impossible to realize, such combination shall be deemed non-existent and not within the protection scope of the disclosure.
Referring to
In the step S102, the autonomous driving vehicle 30 obtains a location of the autonomous driving vehicle at a current time. In detail, the current location of the autonomous driving vehicle 30 is obtained through the location module 31 arranged on the autonomous driving vehicle 30. The location module 31 includes, but is not limited to, the Global Positioning System, the BeiDou Navigation Satellite System, an inertial measurement unit, etc.
In the step S104, environment data about the environment around the autonomous driving vehicle is perceived. In detail, in this embodiment, the step S104 is performed as follows: first, the environment around the autonomous driving vehicle 30 is detected through the sensing device 32 arranged in the autonomous driving vehicle 30 to obtain sensing data; then the sensing data is processed to generate the environment data according to a pre-fusion sensing algorithm or a post-fusion sensing algorithm. In this embodiment, the sensing device 32 is a sensor device in an integrated flat shape, and the sensing device 32 is arranged in a middle of a top side of the autonomous driving vehicle 30. In some embodiments, the sensing device 32 may be, but is not limited to, a convex sensor device or separated sensor devices, and the sensing device 32 may be installed at other positions of the autonomous driving vehicle 30 rather than the middle of the top side. The sensing device 32 includes, but is not limited to, radars, lidars, thermal image sensors, image sensors, infrared instruments, ultrasonic sensors, and other sensors with a sensing function. In this embodiment, the sensing device 32 obtains the sensing data around the autonomous driving vehicle 30 via the various sensors. The sensing data includes, but is not limited to, radar detection data, lidar detection data, thermal imager detection data, image sensor detection data, infrared detector detection data, ultrasonic sensor detection data, etc. When the sensing data is processed by the pre-fusion sensing algorithm, the sensing data detected by the various sensors is first synchronized, and the synchronized data is then perceived to generate the environmental data. When the sensing data is processed by the post-fusion sensing algorithm, the sensing data detected by each of the various sensors is first perceived to generate target data, and the target data is then fused to generate the environmental data. In some embodiments, the sensing data can also be processed via a hybrid fusion sensing algorithm, or a combination of multiple fusion sensing algorithms. When the sensing data is processed by the hybrid fusion sensing algorithm, a part of the sensing data is processed by the pre-fusion sensing algorithm, another part of the sensing data is processed by the post-fusion sensing algorithm, and the processed data are then mixed to generate the environmental data. When the sensing data is processed by a combination of multiple fusion sensing algorithms, the sensing data is processed via fusion sensing algorithms performed in parallel to generate the environmental data, where the fusion sensing algorithms include the post-fusion sensing algorithm, the pre-fusion sensing algorithm, the hybrid fusion sensing algorithm, or fusion sensing algorithms constructed by combining them according to a predetermined rule. How the environmental data is generated will be described in conjunction with
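As a rough illustration of the difference between the pre-fusion and post-fusion strategies, consider the following minimal sketch. The averaging "perception" stand-in and the toy lidar/radar returns are illustrative assumptions only, not the fusion algorithms of the disclosure.

```python
"""Hypothetical sketch contrasting pre-fusion and post-fusion; the
centroid-based 'perception' and toy data are illustrative stand-ins."""
import numpy as np

def detect(points):
    # Stand-in "perception": the centroid of the points as a target position.
    return points.mean(axis=0)

def pre_fusion(samples):
    # Pre-fusion: synchronize/combine the raw measurements first, then
    # perceive once on the combined data (here, a naive average).
    synced = np.mean(np.stack(samples), axis=0)
    return detect(synced)

def post_fusion(samples):
    # Post-fusion: perceive each sensor's data separately to get target
    # data, then fuse the per-sensor targets (here, by averaging).
    targets = [detect(s) for s in samples]
    return np.mean(np.stack(targets), axis=0)

lidar = np.array([[10.1, 2.0], [10.3, 2.1]])  # toy lidar returns (m)
radar = np.array([[10.2, 1.9], [10.4, 2.2]])  # toy radar returns (m)
print(pre_fusion([lidar, radar]), post_fusion([lidar, radar]))
```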
In the step S106, lane information about lanes is extracted from the environment data. As shown in
In the step S108, a first drivable area of the autonomous driving vehicle is obtained according to the location of the autonomous driving vehicle at the current time, a high-definition map, and the lane information. As shown in
In the step S110, static information about static objects is extracted from the environment data. As shown in
In the step S112, dynamic information about dynamic objects is extracted from the environment data, and trajectories of the dynamic objects are predicted according to the dynamic information. As shown in
In the step S114, a second drivable area is planned according to the first drivable area, the static information, the trajectories of the dynamic objects, and the lane information. As shown in
In this embodiment, the environment around the autonomous driving vehicle 30 at the current moment is sensed and the environment data is obtained. The lane information about lanes, the static information about static objects, and the dynamic information about dynamic objects are extracted from the environment data synchronously or asynchronously. The first drivable area of the autonomous driving vehicle is obtained according to the current location of the autonomous driving vehicle, the high-definition map, and the lane information. In addition to the drivable lanes, the first drivable area also includes the shoulder, so the autonomous driving vehicle is capable of driving beyond the lane. The motion trajectory of the autonomous driving vehicle is then dynamically planned according to the first drivable area, the static information, the dynamic information, and the lane information.
Referring to
In this embodiment, the first drivable area also includes the bicycle lanes on both sides of the lane, the roadside parking area, and so on; the drivable area for the autonomous driving vehicle is thus further enlarged, and the area that can be used to dynamically plan the motion trajectories becomes larger.
Referring to
In the step S1102, it is determined whether the static objects are unrecognizable objects. In detail, the current surrounding environment of the autonomous driving vehicle 30 may include objects that cannot be recognized by the autonomous driving vehicle 30, so other methods need to be used for recognition.
In the step S1104, a grid map is constructed based on the sensing data when the static objects are unrecognizable objects. In detail, the grid map is an occupancy grid map. The occupancy grid map is constructed based on the lidar detection data obtained by the lidars of the sensing device 32. In detail, the current surrounding environment is divided into grids to form the grid map, and each grid of the grid map has a state of either a free state or an occupied state.
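A minimal sketch of such an occupancy grid, assuming 2-D lidar returns in vehicle coordinates, is given below; the cell size, extents, and data are illustrative choices, and the occupied cells it reports are what the next step converts into static regions.

```python
"""Minimal occupancy-grid sketch, assuming 2-D lidar hits in vehicle
coordinates; cell size, extent, and data are illustrative assumptions."""
import numpy as np

CELL = 0.2     # grid resolution in metres (assumed)
EXTENT = 40.0  # grid covers [-EXTENT/2, EXTENT/2] in x and y (assumed)

def build_grid(hits):
    n = int(EXTENT / CELL)
    grid = np.zeros((n, n), dtype=bool)          # False = free, True = occupied
    idx = ((hits + EXTENT / 2) / CELL).astype(int)
    inside = ((idx >= 0) & (idx < n)).all(axis=1)
    grid[idx[inside, 1], idx[inside, 0]] = True  # mark cells hit by returns
    return grid

def occupied_regions(grid):
    # Static regions of unrecognizable objects: the occupied cells,
    # returned as (row, col) indices that map back to areas on the ground.
    return np.argwhere(grid)

hits = np.array([[5.0, 1.0], [5.2, 1.0], [5.2, 1.2]])  # toy lidar returns
print(len(occupied_regions(build_grid(hits))), "occupied cells")
```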
In the step S1106, the static regions occupied by the static objects are obtained based on the grid map. In detail, the static regions are determined by the states of the grids of the grid map. When the state of a grid is occupied, the area corresponding to the grid is occupied by the unrecognized objects. When the state of a grid is free, the area corresponding to the grid is not occupied by the unrecognized objects. For example, as shown in
Referring to
In the step S1101, it is determined whether one or more of the static objects are dynamic objects in a static state. In detail, a dynamic object in a static state is an object which is in a static state at the current moment but turns to a dynamic state at the next moment. The dynamic objects in a static state may be, but are not limited to, still vehicles, still pedestrians, and still animals. For example, a still pedestrian may extend his or her arms or legs at the next moment, or walk in a certain direction at the next moment. For another example, other objects may extend out of a still vehicle at the next moment. In detail, the still vehicle may open a door at the next moment, or a wheelchair for the disabled may stretch out of a door of a bus stopping at a bus stop at the next moment, or goods may be moved out of a door of a container of a truck stopping at the roadside at the next moment. It is understood that it is necessary to prevent the dynamic objects in a static state from hindering the driving of the vehicle when the states of the dynamic objects change.
In the step S1103, when the one or more static objects are the dynamic objects in a static state, the external contour lines of the one or more static objects are expanded outward by a predetermined distance to form expansion areas. In this embodiment, the outer contour of the static object is extracted from the static information, and the outer contour is extended outward by the predetermined distance. In this embodiment, the predetermined distance is 1 meter. In some other embodiments, the predetermined distance can be other suitable lengths. As shown in
In the step S1105, the static regions occupied by the one or more static objects are obtained based on the expansion areas. In this embodiment, the static region occupied by the bus E is the expansion area M of the bus E. In some other embodiments, the static region occupied by the bus E can include the area occupied by the bus E and the expansion area M near the bus stop D.
In some embodiments, when a still pedestrian stands next to a still vehicle, the contour lines of the still pedestrian and the still vehicle are each extended outward by the predetermined distance to form a pedestrian expansion area and a vehicle expansion area; the pedestrian expansion area may include the area between the pedestrian and the vehicle, and the vehicle expansion area may also include the area between the pedestrian and the vehicle.
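The expansion areas of steps S1103 and S1105, including the merging of overlapping expansions in the pedestrian-beside-vehicle case above, can be sketched with polygon buffering. The footprints below are toy values, and the buffer-based construction is an assumed illustration rather than the disclosed implementation.

```python
"""Sketch of step S1103: expand a still object's outer contour outward by
a predetermined distance (1 m here, per the embodiment). Toy footprints."""
from shapely.geometry import Polygon
from shapely.ops import unary_union

EXPANSION = 1.0  # predetermined distance in metres

bus = Polygon([(0, 0), (12, 0), (12, 2.5), (0, 2.5)])             # still bus E
pedestrian = Polygon([(13, 0.5), (13.4, 0.5), (13.4, 1.0), (13, 1.0)])

# Expansion areas: each contour extended outward by the predetermined distance.
bus_expanded = bus.buffer(EXPANSION)
ped_expanded = pedestrian.buffer(EXPANSION)

# When the expansions overlap (pedestrian beside the bus), the occupied
# static region can be taken as their union, covering the gap between them.
static_region = unary_union([bus_expanded, ped_expanded])
print(round(static_region.area, 1))
```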
In the above-mentioned embodiment, the outer contour lines of the dynamic objects in the static state are extended outward by the predetermined distance to form the expansion areas, which prevents a dynamic object that is in the static state at the current moment and becomes dynamic at the next moment from affecting the motion trajectory of the autonomous driving vehicle 30 being planned, and makes the driving of the autonomous driving vehicle safer.
When the static object is another recognizable object or an immovable object, the static information about the static object is extracted from the environment data, and the static information includes, but is not limited to, the static region occupied by the static object obtained directly from the environment data. As shown in
Referring to
In the step S1142, the static regions and the dynamic region occupied by the trajectories of the dynamic objects are removed from the first drivable area to generate a third drivable area. In detail, the planning module 37 obtains the dynamic region P occupied by the trajectories of the dynamic objects according to the trajectories, and removes the static regions and the dynamic region P from the first drivable area Q1; that is, the static regions and the dynamic region P are deleted from the first drivable area Q1 to form the third drivable area Q3.
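A minimal sketch of step S1142 follows, assuming the shapely library: a predicted trajectory is widened into the dynamic region P, and P together with a static region is subtracted from the first drivable area Q1 to leave the third drivable area Q3. All shapes and the 1 m half-width are toy assumptions.

```python
"""Sketch of step S1142: subtract static regions and the region P swept
by a predicted trajectory from Q1. Shapes and widths are toy values."""
from shapely.geometry import LineString, Polygon
from shapely.ops import unary_union

q1 = Polygon([(0, 0), (60, 0), (60, 7), (0, 7)])        # first drivable area Q1
static = Polygon([(20, 0), (25, 0), (25, 3), (20, 3)])  # e.g., protective wall

# Dynamic region P: the predicted path of a dynamic object, widened into a
# corridor by half the object's width (1 m here, an assumed value).
trajectory = LineString([(60, 5.5), (30, 5.5)])
p = trajectory.buffer(1.0)

q3 = q1.difference(unary_union([static, p]))            # third drivable area Q3
print(round(q3.area, 1))
```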
In some embodiments, the static information may also include slit areas between static objects, slit areas between static objects and the curbs J, and so on. The slit areas are not large enough for the autonomous driving vehicle 30 to drive through. In detail, when a distance between two static regions, or a distance between a static object and the curb J, is smaller than a preset distance and is therefore not large enough for the autonomous driving vehicle 30, the area between the two static regions, or the area between the static object and the curb J, is determined as a slit area. As shown in
In the above-mentioned embodiment, the slit areas between the static regions, and between the static regions and the curbs, through which the autonomous driving vehicle cannot drive, are deleted from the first drivable area Q1, so that the planning of the trajectory of the autonomous driving vehicle is more in line with reality.
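The slit-area test described above reduces to a clearance check between regions. The following sketch assumes a preset distance derived from the vehicle width plus a safety margin; the rule and the values are illustrative assumptions, not those of the disclosure.

```python
"""Sketch of the slit-area test: gaps narrower than the preset distance
are non-drivable. Widths, margins, and shapes are toy assumptions."""
from shapely.geometry import Polygon

VEHICLE_WIDTH = 2.0
MARGIN = 0.5
PRESET = VEHICLE_WIDTH + 2 * MARGIN  # minimum passable gap (assumed rule)

def is_slit(region_a, region_b):
    # A gap counts as a slit when the clearance between two static
    # regions (or a region and the curb) is below the preset distance.
    return region_a.distance(region_b) < PRESET

cone = Polygon([(10, 1.0), (10.4, 1.0), (10.4, 1.4), (10, 1.4)])
wall = Polygon([(10, 3.0), (14, 3.0), (14, 3.4), (10, 3.4)])
print(is_slit(cone, wall))  # True: the 1.6 m gap is too narrow to pass
```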
In the step S1144, the second drivable area is planned according to the third drivable area and the lane information. In detail, in the current surrounding environment, according to the lane information, the lanes suitable for the autonomous driving vehicle 30 to drive on are the lane K1 and the lane K2. However, the temporary construction protective wall B and the traffic cone C occupy a part of the lane K2, so the autonomous driving vehicle 30 needs to cross the part of the lane L and drive on a part of the lane K3 to leave the current area. When the autonomous driving vehicle 30 occupies the lane K3, the vehicle F2 will not affect the autonomous driving vehicle 30 according to the analysis of the dynamic region P of the vehicle F2 in the lane K3. Therefore, the second drivable area Q2 includes the shoulder between the right edge line of the lane L1 and the right curb J, an unoccupied part of the lane K1, an unoccupied part of the lane K2, and a part of the lane K3.
Referring to
In the step S116, the second drivable area is divided into a plurality of drivable routes, and the plurality of drivable routes are arranged in order according to a preset rule. In detail, the planning module 37 analyzes the second drivable area Q2 and divides the second drivable area Q2 into the plurality of drivable routes according to a size of the autonomous driving vehicle 30. The preset rule is to arrange the plurality of drivable routes according to a driving distance of the drivable routes. In some other embodiments, the preset rule is to arrange the plurality of drivable routes according to a quantity of turns of the drivable routes. As shown in
In the step S118, an optimal driving route is selected from the plurality of drivable routes to drive on. For example, the execution module 38 arranged on the autonomous driving vehicle 30 selects the optimal driving route from the drivable routes H1 and H2. The distance of the drivable route H1 is shorter than that of the drivable route H2, so the drivable route H1 may be selected as the optimal drivable route. Moreover, the drivable route H1 is far away from the traffic cone C and the temporary construction protective wall B, so the overall driving speed can be fast and stable, while the drivable route H2 is near the traffic cone C and the temporary construction protective wall B, so the autonomous driving vehicle 30 needs to slow down when getting closer to the traffic cone C and the temporary construction protective wall B, and can accelerate when getting farther from them. Therefore, the drivable route H1 will not always be satisfactory; each of the drivable routes H1 and H2 has its own advantages and disadvantages. The autonomous driving vehicle 30 can choose the route that suits a user as the optimal driving route according to the user's habits or the type of the user.
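The ordering and selection of steps S116 and S118 can be sketched as follows; the route attributes, the scoring rules, and the user-preference mapping are hypothetical illustrations of the preset rules described above.

```python
"""Sketch of steps S116/S118: arrange candidate routes by a preset rule
and pick one. Attributes and rules are illustrative assumptions."""

routes = {
    "H1": {"length_m": 180.0, "turns": 1, "min_clearance_m": 4.0},
    "H2": {"length_m": 195.0, "turns": 2, "min_clearance_m": 1.2},
}

# Preset rule (a): arrange by driving distance.
by_distance = sorted(routes, key=lambda r: routes[r]["length_m"])

# Preset rule (b): arrange by the quantity of turns.
by_turns = sorted(routes, key=lambda r: routes[r]["turns"])

def pick(preference):
    # A user preferring speed takes the shortest route; a cautious user
    # takes the route with the largest clearance from obstacles.
    if preference == "fast":
        return by_distance[0]
    return max(routes, key=lambda r: routes[r]["min_clearance_m"])

print(pick("fast"), pick("cautious"))
```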
As described above, dynamic trajectories can be planned by perceiving the environment around the autonomous driving vehicle at the current moment and analyzing the lane information, the static information, and the dynamic information in the surrounding environment. In this embodiment, the drivable area for the autonomous driving vehicle includes the shoulder, the retrograde lane, the bicycle lane, and the roadside parking area, so that the autonomous driving vehicle can change lanes smoothly. Furthermore, the drivable area not only allows the autonomous driving vehicle to stop at the roadside in an emergency, but also allows the autonomous driving vehicle to occupy the retrograde lane, realizing intelligent planning of the movement trajectory, which breaks the restriction of the lanes on the autonomous driving vehicle and expands the planning ability of the autonomous driving vehicle.
Referring to
The sensing device 12 is configured to sense the environmental data of the surrounding environment of the autonomous driving vehicle. In detail, the sensing device 12 detects the environment around the autonomous driving vehicle 100 to obtain the sensing data, and then processes the sensing data based on the pre-fusion sensing algorithm or the post-fusion sensing algorithm to obtain the environment data. The sensing device 12 can be an integrated flat sensor device, a convex sensor device, or a split sensor device. The sensing device 12 includes, but is not limited to, radars, lidars, thermal imagers, image sensors, infrared instruments, ultrasonic sensors, and other sensors with a sensing function. The sensing data around the autonomous driving vehicle 100 is obtained via the various sensors. The sensing data includes, but is not limited to, radar detection data, lidar detection data, thermal imager detection data, image sensor detection data, infrared detector detection data, ultrasonic sensor detection data, and so on.
The first extraction module 13 is configured to extract the lane information about lanes from the environment data. The lane information includes the locations of the lane lines.
The acquisition module 14 is configured to acquire a first drivable area of the autonomous driving vehicle according to the location of the autonomous driving vehicle at the current time, the high-definition map, and the lane information. The first drivable area includes a lane area between two edge lines of each lane and a shoulder between each edge line of the lane and the adjacent curb. In some embodiments, the first drivable area also includes a bicycle lane, a roadside parking area, and the like for the autonomous driving vehicle 100.
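As one way to picture how the acquisition module could assemble the first drivable area from lane edge lines and curbs, consider this sketch; the polyline-to-polygon construction and the coordinates are assumptions for illustration.

```python
"""Sketch of assembling the first drivable area from lane edge lines and
curbs. The construction and all coordinates are illustrative assumptions."""
from shapely.geometry import Polygon

def strip(left_line, right_line):
    # Area between two polylines: left line forward, right line reversed.
    return Polygon(left_line + right_line[::-1])

left_edge = [(0, 3.5), (50, 3.5)]   # lane edge lines (map frame, metres)
right_edge = [(0, 0.0), (50, 0.0)]
curb = [(0, -1.5), (50, -1.5)]      # curb adjacent to the right edge line

lane_area = strip(left_edge, right_edge)  # between the two edge lines
shoulder = strip(right_edge, curb)        # between the edge line and the curb
first_area = lane_area.union(shoulder)
print(round(first_area.area, 1))          # 50 m of lane plus shoulder
```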
The second extraction module 15 is configured to extract static information about static objects from the environment data. The static objects include, but are not limited to, static vehicles, traffic cones, construction road signs, temporary construction protective walls, and so on. The static information includes the locations of the static objects and the static regions occupied by the static objects.
The third extraction module 16 is configured to extract the dynamic information about dynamic objects from the environment data, and to predict motion trajectories of the dynamic objects according to the dynamic information. The dynamic objects include, but are not limited to, vehicles in the lane, pedestrians walking on the sidewalk, pedestrians crossing the road, and so on. The dynamic information includes, but is not limited to, the locations of the dynamic objects, the movement directions of the dynamic objects, the speeds of the dynamic objects, and so on.
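Trajectory prediction from the dynamic information (position, movement direction, speed) can be sketched with a constant-velocity model; this model, the horizon, and the sample data are assumptions for illustration, not the disclosed predictor.

```python
"""Sketch of trajectory prediction from dynamic information, assuming a
constant-velocity model over a short horizon. Values are toy data."""
import math

def predict_trajectory(x, y, heading_rad, speed_mps, horizon_s=3.0, dt=0.5):
    # Roll the object's state forward under constant velocity; each entry
    # is a predicted (x, y) position at one future time step.
    vx = speed_mps * math.cos(heading_rad)
    vy = speed_mps * math.sin(heading_rad)
    steps = int(horizon_s / dt)
    return [(x + vx * dt * k, y + vy * dt * k) for k in range(1, steps + 1)]

# A vehicle 30 m ahead, driving toward us at 8 m/s in the adjacent lane.
print(predict_trajectory(30.0, 5.5, math.pi, 8.0))
```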
The planning module 17 is configured to plan the second drivable area according to the first drivable area, the static information, the motion trajectories, and the lane information. In detail, the planning module 17 is configured to remove the static regions occupied by the static objects and the dynamic region occupied by the trajectories from the first drivable area, and to plan the second drivable area according to the lane information.
The planning module 17 is also configured to divide the second drivable area into a plurality of drivable routes and arrange the drivable routes in order according to a preset rule. In detail, the planning module 17 analyzes the second drivable area and divides the second drivable area into the plurality of drivable routes according to the size of the autonomous driving vehicle 100. For example, the preset rule is to arrange the plurality of drivable routes according to a driving distance of the drivable routes. For another example, the preset rule is to arrange the plurality of drivable routes according to a quantity of turns of the drivable routes.
The dynamic planning system 10 further includes an execution module 18. The execution module 18 is configured to select an optimal drivable route from the plurality of drivable routes to drive on.
Referring to
The memory 232 is configured to store program instructions.
The processor 231 is configured to execute the program instructions to enable the autonomous driving vehicle 30 to perform the dynamic planning method of drivable area described above.
In some embodiments, the processor 231 can be a central processing unit, a controller, a microcontroller, a microprocessor, or another data-processing chip, configured to run the dynamic planning program instructions, stored in the memory 232, for the motion trajectory of the autonomous driving vehicle.
The memory 232 includes at least one type of readable storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), magnetic memory, a magnetic disk, an optical disk, etc. The memory 232 may in some embodiments be an internal storage unit of a computer device, such as a hard disk of the computer device. In some other embodiments, the memory 232 may also be an external storage device of the computer device, such as a plug-in hard disk, an intelligent memory card, a secure digital card, or a flash card provided on the computer device. Further, the memory 232 may include both an internal storage unit and an external storage device of the computer device. The memory 232 can not only be used to store the application software installed on the computer equipment and various kinds of data, such as the code of the dynamic planning method for the motion trajectory of the autonomous driving vehicle, but also be used to temporarily store data that has been output or will be output.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they can be implemented in whole or in part as a computer program product.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, a process or function according to the embodiments of the disclosure is generated in whole or in part. The computer device may be a general-purpose computer, a dedicated computer, a computer network, or another programmable device. The computer instructions can be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions can be transmitted from a web site, computer, server, or data center to another web site, computer, server, or data center through a wired connection (such as a coaxial cable, an optical fiber, or a digital subscriber line) or a wireless connection (such as infrared, radio, or microwave). The computer-readable storage medium can be any available medium that a computer can access, or a data storage device, such as a server or data center, that integrates one or more available media. The available media can be magnetic (e.g., a floppy disk, a hard disk, a tape), optical (e.g., a DVD), or semiconductor (e.g., a solid state disk), etc.
Those skilled in the art can clearly understand that, for convenience and simplicity of description, the specific working processes of the systems, devices, and units described above may refer to the corresponding processes in the method embodiments described above, and will not be repeated here.
In the several embodiments provided in this disclosure, it should be understood that the systems, devices, and methods disclosed may be implemented in other ways. For example, the device embodiments described above are only schematic. For example, the division of the units is merely a logical functional division; in actual implementation, there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in each embodiment of this disclosure may be integrated in a single processing unit, each unit may exist separately physically, or two or more units may be integrated in a single unit. The integrated units mentioned above can be realized in the form of hardware or in the form of software functional units.
The integrated units, if implemented as software functional units and sold or used as an independent product, can be stored in a computer-readable storage medium. Based on this understanding, the technical solution of this disclosure, in essence, or the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product. The computer software product is stored on a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of this disclosure. The storage medium mentioned above includes a USB flash disk, a removable hard disk, a ROM (Read-Only Memory), a RAM (Random Access Memory), a floppy disk, an optical disc, and other media that can store program codes.
It should be noted that the numbering of the above embodiments of this disclosure is for description only and does not represent the advantages or disadvantages of the embodiments. And in this disclosure, the terms “including”, “include”, or any other variants are intended to cover a non-exclusive inclusion, so that a process, device, item, or method that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to the process, device, item, or method. In the absence of further limitation, an element defined by the phrase “including a . . . ” does not preclude the existence of other identical elements in the process, device, item, or method that includes the element.
The above are only the preferred embodiments of this disclosure and do not therefore limit the patent scope of this disclosure. Any equivalent structure or equivalent process transformation made using the contents of the specification and drawings of this disclosure, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of this disclosure.