The present invention relates to a technique of determining a transmission order of pieces of information on objects present on a road.
Conventionally, a method has been known in which image information on a blind spot range, which is a blind spot of a host vehicle, is received from another vehicle by using inter-vehicle communication (Patent Literature 1). The reception of the image information on the blind spot range from the other vehicle can provide, to an occupant of the host vehicle, information on the blind spot range which is invisible from the host vehicle.
However, even if the pieces of information on the blind spot range are transmitted, when the pieces of information are transmitted in a random order, there is a risk that a receiver vehicle does not receive all necessary pieces of information by the time at which the receiver vehicle desires to use them.
The present invention is made in view of the above described problems, and an object of the present invention is to transmit pieces of object information in the order in which the object information is necessary for a vehicle.
An information processing device according to a first aspect of the present invention calculates a collision risk between a vehicle and each of a plurality of objects present in a traveling direction of the vehicle, and transmits, to the vehicle, pieces of information on the objects in a transmission order determined based on the collision risk.
According to the present invention, pieces of object information can be transmitted in an order of object information necessary for a vehicle.
An information processing device 10 according to a first embodiment will be described with reference to the drawings.
The information processing device 10 receives, from a vehicle A, a current position of the vehicle A and sensor data obtained by sensing space around the vehicle A, and receives, from a vehicle B, a current position of the vehicle B. The information processing device 10 detects objects that have risks of colliding with the vehicle B based on the sensor data, and transmits, to the vehicle B, pieces of information on the objects in descending order of the degree of collision risk. Accordingly, the vehicle B can start, well in advance, generating a travel plan along a track that avoids the objects on the road by using pieces of information on objects observable from another position, such as the position of the vehicle A, in addition to information on objects detectable from the vehicle B. The information processing device 10 may receive the sensor data or the like not only from the vehicle A but also from other vehicles or from sensors installed around the road.
The vehicles A and B may be vehicles with or without an automatic driving function. The vehicles A and B may be vehicles capable of switching between automatic driving and manual driving.
The information processing device 10 shown in the drawings includes an object detection unit 11, a collision risk calculation unit 12, and an object selection unit 13.
The object detection unit 11 receives, from the vehicle A, position information on the vehicle A and the sensor data obtained by sensing space around the vehicle A. The object detection unit 11 outputs information on an object that is present in a traveling direction of the vehicle B based on the position information and the sensor data of the vehicle A. The object information includes at least an object position and a detection time of the object, and may further include a speed, a state, a type, and the like of the object. A coordinate system representing the object position is expressed by a travel distance from a reference point of a world geodetic system or a high-precision map by using the position information on the vehicle A received from the vehicle A. The state of the object is, for example, whether the object is stationary, whether the object is about to start, direction indicator information detected from a direction indicator, and the like. The type of the object is, for example, a kind of the object, such as whether the object is a vehicle, a pedestrian, a bicycle, an obstacle, or the like. Because the object information includes the type of the object, the vehicle B can take an appropriate action depending on the type of the object.
The collision risk calculation unit 12 receives, from the vehicle B, position information, a speed, and the like. The collision risk calculation unit 12 calculates a risk that each object collides with the vehicle B by using the position information and the speed of the vehicle B and the object information output by the object detection unit 11. The collision risk is a numerical value of the possibility that the vehicle B collides with each object. The collision risk calculation unit 12 calculates the collision risk based on, for example, the relationship between a lane on which the vehicle B travels, and a lane on which each object is present.
The object selection unit 13 selects pieces of object information to be transmitted to the vehicle B based on the collision risk calculated by the collision risk calculation unit 12, and determines the transmission order of the individual pieces of object information. The object selection unit 13 transmits the pieces of object information to the vehicle B in the determined transmission order. The transmission order is determined, for example, based on a margin time until the vehicle B collides with each object. The margin time is determined by dividing a relative distance by a relative speed.
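As a minimal illustrative sketch (in Python; the patent does not specify any implementation, and all identifiers below are assumptions), the margin time and the resulting transmission order could be computed as follows:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    object_id: str            # hypothetical identifier
    relative_distance: float  # m, distance from the vehicle B to the object
    relative_speed: float     # m/s, closing speed (positive means approaching)

def margin_time(obj: DetectedObject) -> float:
    """Margin time until collision: relative distance divided by relative speed."""
    if obj.relative_speed <= 0.0:
        return float("inf")   # the object is not closing in; no finite margin
    return obj.relative_distance / obj.relative_speed

def transmission_order(objects: list[DetectedObject]) -> list[DetectedObject]:
    """Objects with the shortest margin time are transmitted first."""
    return sorted(objects, key=margin_time)
```

For example, an object 30 m ahead closing at 10 m/s has a margin time of 3 s and would be transmitted before an object whose margin time is 5 s.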
The vehicle A includes a self-position measuring unit 21 and a sensor 22.
The self-position measuring unit 21 measures and outputs the position information on the vehicle A. Specifically, the self-position measuring unit 21 receives Global Navigation Satellite System (GNSS) signals to measure a current time and a self-position of the vehicle A. The self-position measuring unit 21 may measure the position information on the vehicle A based on other methods. The position information includes, for example, information on a position and an attitude of the vehicle A.
The sensor 22 senses objects present around the vehicle A. For example, a laser range finder can be used as the sensor 22. The laser range finder senses 360-degree space around the vehicle A within a viewable range of about 150 m, and outputs a sensing result as point cloud format data. A visible light camera can also be used as the sensor 22. The visible light camera photographs the space around the vehicle A, and outputs the photographed image data. The visible light camera is installed so as to photograph, for example, each of space in a forward direction of the vehicle A, spaces in both side directions of the vehicle A, and space in a backward direction of the vehicle A. The sensor 22 transmits, to the information processing device 10, the point cloud format data and the image data as sensor data. Other types of sensors may be used.
The vehicle B includes a self-position measuring unit 21 and an object information collecting unit 23.
The self-position measuring unit 21 measures and outputs position information on the vehicle B in the same manner as the self-position measuring unit 21 of the vehicle A.
The object information collecting unit 23 receives object information from the information processing device 10 and collects the information. The vehicle B can generate a traveling track plan of the vehicle B by using the object information collected by the object information collecting unit 23. The traveling track plan is, for example, a track along which the vehicle travels so that the vehicle can take safety actions.
Like the vehicle A, the vehicle B may include a sensor 22 to sense objects present around the vehicle B.
With reference to the drawings, an operation of the information processing device 10 according to the first embodiment will be described.
In steps S11 and S12, the object detection unit 11 receives, from the vehicle A, the sensor data and the position information. Table 1 shows an example of data structures of the sensor data and the position information transmitted from the vehicle A to the information processing device 10.
The data structure of Table 1 is configured and transmitted as one data stream, for example. The data stream includes a header part and a content data part. The header part stores an identification code of a transmitter vehicle (the vehicle A) which transmits the data stream and a basic message of the transmitter vehicle. The basic message of the transmitter vehicle includes, for example, various pieces of information on the vehicle, a date and a time at which the data was created, a geographical location, a traveling direction, and a speed of the vehicle, and a past road travel route and a future travel plan route of the vehicle. Information to be transmitted as the basic message may be in accordance with SAE J2945/1 ESN, or the like.
The content data part stores one or more pieces of object information. The object information includes an identification code of an object, a basic message of the vehicle at the time of object detection, sensor information, and detailed information on the object. The basic message of the vehicle at the time of object detection includes, for example, a date and a time at which the object is detected, and a geographical location, a traveling direction, and a speed of the vehicle. The sensor information is information on a sensor which detects the object. Described as the sensor information are an identification code, a type, and a sensing cycle of the sensor, a frame number of an image in which the object is detected, the number of frames of images to be transmitted, a visual axis and a view angle of a camera, and the identification accuracy of the object.
The detailed information on the object includes a geographical location of the object, a date and a time at which the object is detected, a traveling direction and a speed of the object, a stationary duration of the object, a type of the object, a size of the object, detailed information on a road structure, still image data, video data, and point cloud format data. The geographical location of the object is expressed by, for example, a position specified by latitude and longitude, a position specified by a predetermined parameter (a node or a link) of a road map, or a position relative to a sensor or the like which detects the object. The type of the object is a piece of information indicating, for example, a person, a vehicle (a standard-sized vehicle, a large-sized vehicle, a two-wheel vehicle, or the like), a bicycle, a road structure, a road obstacle, and the like. The detailed information on the road structure includes pieces of information on a road, such as a road width, a lane width, the number of lanes, and a road alignment, as well as regulation information and regulation vehicle speed information. The still image data, the video data, and the point cloud format data are pieces of sensing data that include the detected objects.
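For illustration only, the data stream described for Table 1 could be sketched as nested records; the type and field names below are assumptions rather than the patent's normative format:

```python
from dataclasses import dataclass, field

@dataclass
class BasicMessage:
    created_at: str                # date and time at which the data was created
    position: tuple[float, float]  # geographical location (latitude, longitude)
    heading_deg: float             # traveling direction
    speed_mps: float               # vehicle speed
    past_route: list[str] = field(default_factory=list)     # past road travel route
    planned_route: list[str] = field(default_factory=list)  # future travel plan route

@dataclass
class ObjectRecord:
    object_id: str                   # identification code of the object
    detection_message: BasicMessage  # vehicle state at the time of object detection
    sensor_info: dict                # sensor id/type/cycle, frames, view angle, accuracy
    details: dict                    # location, type, size, road structure, raw data

@dataclass
class SensorDataStream:
    transmitter_id: str                # identification code of the vehicle A (header part)
    transmitter_message: BasicMessage  # basic message of the transmitter vehicle
    objects: list[ObjectRecord] = field(default_factory=list)  # content data part
```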
In step S13, the object detection unit 11 detects objects present around the vehicle A based on the sensor data and the position information on the vehicle A, and then, outputs, to the collision risk calculation unit 12, pieces of information on the detected objects and the information on the vehicle A.
In step S14, the collision risk calculation unit 12 receives the current position information on the vehicle B and information on a planned position where the vehicle B will travel in the future. These pieces of information on the vehicle B can be obtained by, for example, the information processing device 10 receiving the same data as in Table 1 from the vehicle B to obtain, from the basic message of the vehicle, a geographical location, a traveling direction, a speed of the vehicle B at a predetermined time, a past road travel route, and a future travel plan route. The processes of receiving the signals in steps S11, S12, and S14 may be performed at any time in a random order.
In step S15, the collision risk calculation unit 12 calculates a risk that each object collides with the vehicle B based on the current position information on the vehicle B, the information on the planned position where the vehicle B will travel in the future, the information on the vehicle A, and the information on the object detected by the vehicle A.
In step S16, the object selection unit 13 transmits, to the vehicle B, the pieces of object information in the order from an object having a high collision risk. The vehicle B receives the pieces of object information, and starts performing processes by using the received pieces of object information after all necessary pieces of object information are received.
The calculation of the collision risk and the determination of the transmission order will be described with reference to the drawings.
In the situation shown in the drawings, objects C to G are present in the traveling direction of the vehicle B.
The collision risk calculation unit 12 calculates the collision risk based on the relationship between the lane on which the vehicle B travels and the lanes on which the objects C to G are present. In the present embodiment, this relationship is classified into the same lane, an adjacent lane, an opposite lane, an intersecting road, and lane position uncertainty. The intersecting road is a road that intersects the road on which the vehicle B travels. The lane position uncertainty applies to, for example, an object that is present outside the road or an object whose position information is unclear. The collision risk calculation unit 12 sets the collision risks in descending order of the same lane, the adjacent lane, the opposite lane, the intersecting road, and the lane position uncertainty.
The object selection unit 13 transmits the pieces of object information in descending order of the degree of collision risk and, within the same degree of collision risk, in ascending order of the margin time to the collision. Time Headway (THW: headway) is used, for example, as the margin time to the collision for the same lane and the adjacent lane. If the vehicle B travels by following an object, the THW may be included in the information on the object. For the opposite lane, Time to Collision (TTC: collision time) is used. Since the object selection unit 13 determines the transmission order of the pieces of object information based on the margin time to the collision, the vehicle B can process the pieces of object information in the order in which they are received when making a plan to take safety actions.
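A minimal sketch of this ordering, assuming hypothetical dictionary fields and sample values (the patent defines neither), might look like this in Python:

```python
import math

# Lane relationships in descending order of collision risk (first embodiment).
LANE_RISK_ORDER = ["same_lane", "adjacent_lane", "opposite_lane",
                   "intersecting_road", "lane_position_uncertainty"]

def margin_time(obj: dict, vehicle_b_speed_mps: float) -> float:
    """THW for the same/adjacent lane, TTC for the other relationships."""
    if obj["lane"] in ("same_lane", "adjacent_lane"):
        # Time Headway: distance over the speed of the following vehicle B.
        return obj["distance_m"] / vehicle_b_speed_mps if vehicle_b_speed_mps > 0 else math.inf
    # Time to Collision: distance over the closing (relative) speed.
    return obj["distance_m"] / obj["closing_speed_mps"] if obj["closing_speed_mps"] > 0 else math.inf

def sort_key(obj: dict, vehicle_b_speed_mps: float) -> tuple:
    """Rank by lane relationship first, then by margin time (shortest first)."""
    return (LANE_RISK_ORDER.index(obj["lane"]), margin_time(obj, vehicle_b_speed_mps))

objects = [
    {"id": "C", "lane": "same_lane", "distance_m": 40.0, "closing_speed_mps": 5.0},
    {"id": "D", "lane": "opposite_lane", "distance_m": 80.0, "closing_speed_mps": 25.0},
    {"id": "E", "lane": "adjacent_lane", "distance_m": 20.0, "closing_speed_mps": 3.0},
]
order = sorted(objects, key=lambda o: sort_key(o, vehicle_b_speed_mps=15.0))
print([o["id"] for o in order])  # ['C', 'E', 'D']: same lane first, then by margin time
```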
Table 2 below shows an example of a data structure of the object information transmitted from the information processing device 10 to the vehicle B.
The object information in Table 2 is configured and transmitted as one data stream, for example. The data stream includes a header part and a content data part. The header part stores an identification code of an information processing device as a data creation subject, and an index of the object information to be transmitted in the content data part. The index of the object information includes an identification code of a transmission destination vehicle (the vehicle B), information showing a geographical area where the object information to be transmitted is collected, a flag showing the transmission order of the pieces of object information, the total number of the pieces of object information included in the content data part, the total number of pieces of information on objects that have high collision risks, and an identification code of an object that has a high collision risk. The geographical area is information for specifying an area. The geographical area is described by a position or a range specified by latitude and longitude, a position or a range specified by a predetermined parameter (a node or a link) of a road map, a position or a range relative to a sensor or the like which detects an object, an area size, a link ID, a group of node IDs for each link, a node ID, node position information (GNSS coordinates), an adjacent area ID, a road ID and a lane ID in the area, a map ID, and version information. The flag showing the transmission order is, for example, a flag showing that the transmission order is determined in accordance with the collision risk. Information on an object that has a high collision risk is, for example, object information in which the TTC is smaller than a predetermined value. A plurality of identification codes of pieces of information on objects that have high collision risks may be described.
The content data part stores one or more pieces of object information in a descending order of the degree of collision risk with respect to the transmission destination vehicle (the vehicle B). The object information includes an identification code of an object, information on the transmission order of objects, information on the collision risk, information on a device which detects the object, and detailed information on an object.
The information on the transmission order of the objects in the object information of the content data part is, for example, a number set in accordance with the descending order of the degree of collision risk. The information on the collision risk includes, for example, a collision risk ranking, the TTC, the THW, and a lane type. The collision risk ranking is a numerical value obtained by ranking the objects detected by the vehicle A in descending order of the degree of collision risk with respect to the vehicle B, and assigning a smaller number as the collision risk becomes higher. The lane type is information for identifying a lane on which an object is present, and may be, for example, an identification code of a lane identified on a road map, or may store information indicating that the lane on which an object travels is the same as the lane on which the vehicle B travels, or information indicating that the lane on which an object travels is opposite to the lane on which the vehicle B travels. The information on the device which detects the object is information on a vehicle or a device such as a roadside unit which detects the object, and includes an identification code of the device, a basic message of the device, and sensor information. The basic message and the sensor information are similar to the basic message of the vehicle at the time of object detection and the sensor information shown in Table 1. The detailed information on the object is similar to the detailed information on the object shown in Table 1.
The vehicle B that receives the data stream having the data structure shown in Table 2 can receive the pieces of object information in descending order of the degree of collision risk, and accordingly can process information on an object that has a higher collision risk earlier than when the pieces of object information are received irrespective of the degree of collision risk.
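A hedged sketch of packing such a stream might look as follows; the `RankedObject` and `ObjectInfoStream` names, and the 4-second TTC threshold, are assumptions, since the patent only speaks of a "predetermined value":

```python
from dataclasses import dataclass, field

@dataclass
class RankedObject:
    object_id: str
    collision_risk_rank: int   # 1 = highest collision risk, transmitted first
    ttc_s: float               # Time to Collision
    thw_s: float               # Time Headway
    lane_type: str             # e.g. same lane as the vehicle B, opposite lane
    details: dict = field(default_factory=dict)

@dataclass
class ObjectInfoStream:
    device_id: str              # identification code of the information processing device
    destination_id: str         # identification code of the vehicle B
    order_flag: str             # flag: order determined in accordance with collision risk
    high_risk_ids: list[str]    # objects whose TTC is below a threshold
    objects: list[RankedObject]

def build_stream(device_id: str, destination_id: str,
                 ranked: list[RankedObject],
                 ttc_threshold_s: float = 4.0) -> ObjectInfoStream:
    """Pack objects already sorted by collision risk and flag high-risk ones."""
    high_risk = [o.object_id for o in ranked if o.ttc_s < ttc_threshold_s]
    return ObjectInfoStream(device_id, destination_id,
                            order_flag="by_collision_risk",
                            high_risk_ids=high_risk, objects=ranked)
```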
The information processing device 10 may be mounted to the vehicle A.
As described above, according to the present embodiment, the collision risk calculation unit 12 calculates the collision risk between the vehicle B and each of the objects C to G that are present in the traveling direction of the vehicle B based on the relationship between the lane on which the vehicle B travels and the lanes on which the objects C to G are present. The object selection unit 13 determines the transmission order of pieces of information on the individual objects C to G based on the collision risk, and transmits the pieces of object information to the vehicle B based on the transmission order. This causes the pieces of object information to be transmitted in the order according to the collision risk, and thus, the vehicle B can make a plan to take safety actions well in advance in the order of the received object information.
The information processing device 10 according to a second embodiment will be described with reference to the drawings.
The information processing device 10 shown in the drawings further includes a collision risk correction unit 14 and a map 15 in addition to the configuration of the first embodiment.
The collision risk correction unit 14 corrects the collision risk depending on a condition of an object, that is, an environmental factor surrounding the object. When correcting the collision risk, the collision risk correction unit 14 may refer to the map 15, and correct the collision risk based on whether a median strip is present, a condition of the road such as being a priority road, and traffic rules. Examples of conditions for correcting the collision risk are as follows. The collision risk is set to be high if an object (a pedestrian) stopping at a place outside a road is about to start. The collision risk is set to be high if an object (an oncoming vehicle) which is stopped to wait for a right turn is about to start. The collision risk is set to be high for an object (an intersecting vehicle) that is present on an intersecting road which has priority over the road on which the vehicle B travels. The collision risk is set to be low if a median strip is present between the lane on which the vehicle B travels and the traveling lane of the object (the oncoming vehicle).
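These correction conditions could be expressed, purely as an illustrative sketch with assumed field names, as a small rule function:

```python
def environment_risk(obj: dict, road: dict) -> str:
    """Return 'high', 'normal', or 'no_risk' following the correction
    conditions listed above; all field names are illustrative assumptions."""
    if obj.get("type") == "pedestrian" and obj.get("off_road") and obj.get("starting_action"):
        return "high"      # pedestrian outside the road about to start moving
    if obj.get("waiting_for_right_turn") and obj.get("starting_action"):
        return "high"      # oncoming vehicle stopped for a right turn about to start
    if obj.get("on_intersecting_road") and road.get("intersecting_road_has_priority"):
        return "high"      # intersecting road has priority over vehicle B's road
    if obj.get("on_opposite_lane") and road.get("median_strip_present"):
        return "no_risk"   # median strip between vehicle B's lane and the object's lane
    return "normal"
```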
As the map 15, map information acquired via a network may be used, or, if the information processing device 10 is mounted to the vehicle A, a map held in the vehicle A may be used.
With reference to the flowchart in the drawings, the collision risk calculation process performed by the information processing device 10 according to the second embodiment will be described.
In step S151, the collision risk calculation unit 12 calculates the TTC based on the distance and the relative speed between the vehicle B and the object, and determines whether the TTC between the vehicle B and the object can be calculated. An object whose TTC is not able to be calculated is a stationary object. If the TTC is not able to be calculated, the collision risk correction unit 14 advances the process to step S154.
If the TTC can be calculated, in step S152, the collision risk calculation unit 12 calculates the THW of an object followed by the vehicle B, and if the THW cannot be calculated, the process is advanced to step S155. An object that is not followed by the vehicle B is an oncoming vehicle that travels on an opposite lane, or an intersecting vehicle that travels on an intersecting road.
After calculating the THW, in step S153, the collision risk calculation unit 12 sets the highest collision risk for the object having the shortest TTC and the shortest THW among the objects followed by the vehicle B.
In steps S154 to S157, the collision risk correction unit 14 determines a collision environment risk depending on a condition of each object. The collision environment risk is information for correcting the collision risk depending on the condition of the object. In the present embodiment, any one of “high,” “normal,” and “no risk” is set for the collision environment risk.
In step S154, the collision risk correction unit 14 detects whether a starting action is made by a stationary object, and if the starting action is made by the object, the collision risk correction unit 14 determines that the collision environment risk of the object is high.
In step S155, the collision risk correction unit 14 determines whether an object is an oncoming vehicle.
If the object is the oncoming vehicle, in step S156, the collision risk correction unit 14 determines whether the median strip is present between the lane on which the vehicle B travels and a lane on which the object travels. If the median strip is present, the object is determined to have no collision environment risk, and alternatively, if the median strip is absent, the object is determined to have a high collision environment risk.
If the object is an intersecting vehicle, in step S157, the collision risk correction unit 14 determines whether the road on which the intersecting vehicle travels has priority over the road on which the vehicle B travels. If the road of the object does not have priority, the object is determined to have a normal collision environment risk, and alternatively, if the road of the object has priority, the object is determined to have a high collision environment risk.
After the collision environment risk is determined, in step S158, the collision risk calculation unit 12 sets the collision risk in the order of TTC and in the order of distance for the objects which are determined to have a high collision environment risk in the processes of steps S154 to S157.
In step S159, the collision risk calculation unit 12 sets the collision risks for the remaining objects. For the objects having the normal collision environment risk, the collision risk calculation unit 12 sets the collision risk in the order based on the positional relationship between the lane on which the vehicle B travels and the lane on which the objects are present, as in the first embodiment.
By performing the above described processes, the collision risk is set for each object. Thereafter, the object selection unit 13 transmits the pieces of object information to the vehicle B in the order from an object having a high collision risk.
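The overall flow of steps S151 to S159 could be sketched as follows; the dictionary layout and the placement of "no risk" objects at the end of the order are assumptions, since the patent leaves both open:

```python
import math

def order_by_risk(objects: list[dict]) -> list[dict]:
    """Sketch of steps S151 to S159. Each object dict is assumed to carry
    'ttc' and 'thw' in seconds (math.inf when not computable), 'env_risk'
    ('high', 'normal', or 'no_risk') set by the collision risk correction
    unit 14, 'distance_m', and 'lane_rank' (0 = same lane, ...,
    4 = lane position uncertainty)."""
    followed, high_env, normal, no_risk = [], [], [], []
    for o in objects:
        if math.isfinite(o["ttc"]) and math.isfinite(o["thw"]):
            followed.append(o)   # S152/S153: object followed by the vehicle B
        elif o["env_risk"] == "high":
            high_env.append(o)   # S154 to S157: risk corrected upward
        elif o["env_risk"] == "no_risk":
            no_risk.append(o)    # e.g. separated from the vehicle B by a median strip
        else:
            normal.append(o)
    followed.sort(key=lambda o: (o["ttc"], o["thw"]))         # S153
    high_env.sort(key=lambda o: (o["ttc"], o["distance_m"]))  # S158
    normal.sort(key=lambda o: o["lane_rank"])                 # S159
    return followed + high_env + normal + no_risk
```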
As described above, according to the present embodiment, the collision risk correction unit 14 sets the collision environment risk depending on the conditions of the objects H to O, and corrects the collision risk according to the collision environment risk. As a result, even for objects for which the TTC and the THW cannot be calculated, the transmission order of the pieces of information on the objects H to O is corrected based on, for example, the display of a direction indicator of the oncoming vehicle N, whether the oncoming vehicle N makes a starting action, whether the crossing pedestrian H makes a starting action, and traffic rules such as priority roads, and accordingly, the vehicle B can quickly respond to situations depending on the conditions of the objects H to O.
The information processing device 10 according to a third embodiment will be described with reference to the drawings. The information processing device 10 of the third embodiment further includes a sensor recognition area calculation unit 16, and the vehicle B further includes an object information requesting unit 24.
The information processing device 10 receives, from the vehicle B, a transmission request for requesting the transmission of the object information, and starts transmitting the object information to the vehicle B in response to the transmission request. The transmission request may include information on a distribution range in which the vehicle B desires that the object information is transmitted. In the first and second embodiments also, the transmission of the object information to the vehicle B may be started in response to the reception of the transmission request. An example of the data structure of the transmission request transmitted from the vehicle B is shown in Table 3 below.
The transmission request in Table 3 is, for example, configured and transmitted as one data stream. The data stream includes a header part and a content data part. The header part stores information on the vehicle that transmits the request, and request information. The information on the vehicle includes an identification code of the vehicle and a basic message of the vehicle. The basic message contains content which is similar to that of the basic message of Table 1.
The request information includes a flag indicating the request content, an identification code of the request, a type of the requested object, a time limit, a maximum data size, and a data type. The flag indicating the request content is a flag indicating that the transmission of the object information is requested. The type of the requested object is, for example, a vehicle, a pedestrian, a bicycle, or an obstacle, and is expressed by an identification code indicating the type. The time limit is a time limit for receiving the object information and is expressed by a date and a time. The maximum data size indicates the maximum size of data that can be received. The data type indicates a type of receivable data such as, for example, text data, still image data, or video data. The data type may include a file type such as MPEG or AVI.
The content data part stores one or more pieces of request area information. The request area information includes an identification code of a request area, and request area data. The request area data is information for specifying an area where the transmission of the object information is requested. The request area data is described by a position or a range specified by latitude and longitude, a position or a range specified by a predetermined parameter (a node or a link) of a road map, a position or a range relative to a sensor or the like which detects an object, an area size, a link ID, a group of node IDs for each link, a node ID, node position information (GNSS coordinate), an adjacent area ID, a road ID and a lane ID on the area, a map ID and version information.
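As an illustrative sketch only (field names and example values are assumptions), the transmission request of Table 3 could be modeled as:

```python
from dataclasses import dataclass, field

@dataclass
class RequestArea:
    area_id: str     # identification code of the request area
    area_data: dict  # lat/lon range, map nodes/links, area size, IDs, map version

@dataclass
class TransmissionRequest:
    vehicle_id: str           # identification code of the vehicle B
    basic_message: dict       # same content as the basic message of Table 1
    request_flag: str         # flag indicating that object information is requested
    object_types: list[str]   # e.g. vehicle, pedestrian, bicycle, obstacle
    time_limit: str           # date and time by which the information is needed
    max_data_size_bytes: int  # maximum receivable data size
    data_types: list[str]     # e.g. text, still image, video (MPEG, AVI, ...)
    areas: list[RequestArea] = field(default_factory=list)  # content data part

request = TransmissionRequest(
    vehicle_id="vehicle_B", basic_message={}, request_flag="object_info",
    object_types=["vehicle", "pedestrian"], time_limit="2021-01-21T10:00:00",
    max_data_size_bytes=1_000_000, data_types=["text", "still_image"],
    areas=[RequestArea("area_1", {"lat_lon_range": "..."})],
)
```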
The sensor recognition area calculation unit 16 receives, from the vehicle A, information on a sensing range of the sensor 22 of the vehicle A, specifies a detection range in which objects are detected by the vehicle A, and transmits the detection range to the vehicle B. The information processing device 10 transmits, to the vehicle B, information on the objects which are detected within both the distribution range and the detection range.
The vehicle B has the sensor 22, as does the vehicle A, and senses the space around the vehicle B. The object information requesting unit 24 may transmit, to the information processing device 10, the transmission request in which a blind spot area that is not able to be sensed by the vehicle B by using the sensor 22 is set as the distribution range. The information processing device 10 transmits the object information to the vehicle B in response to the transmission request. The vehicle B integrates sensing results obtained by using the sensor 22 of the vehicle B and the object information received from the information processing device 10 to perform processes such as planning to take a safety action.
With reference to the drawings, an operation of the information processing device 10 according to the third embodiment will be described.
In step S20, the information processing device 10 receives, from the vehicle B, the transmission request including the distribution range.
The vehicle B may include a travel route plan in the transmission request. The travel route plan indicates a route along which the vehicle B will travel in the future, and means, for example, a route to a destination which is set in advance. The information processing device 10 may set the route along which the vehicle B is planned to travel as the distribution range based on the travel route plan, and may perform the processes at or after step S11 to transmit the object information. For example, if the vehicle B is planned to make a left turn at an intersection, the information processing device 10 sets, as the distribution range, an intersecting road that will appear in front of the vehicle B after the vehicle B makes the left turn at the intersection, and performs the processes at or after step S11.
In steps S11 and S12, the object detection unit 11 receives, from the vehicle A, the sensor data, the position information, and a range sensed by the sensor 22.
In step S13, the object detection unit 11 detects objects present around the vehicle A based on the sensor data and the position information on the vehicle A, and outputs pieces of information on the objects to the collision risk calculation unit 12.
In step S14, the collision risk calculation unit 12 receives the position information on the vehicle B. Processes of receiving signals in steps S11, S12, and S14 may be performed at any time in a random order.
In step S15, the collision risk calculation unit 12 calculates a risk that the vehicle B collides with each object based on the position information on the vehicle B and the object information. The information processing device 10 may perform the process of correcting the collision risk according to the second embodiment.
In step S21, the sensor recognition area calculation unit 16 calculates the detection range of the object based on the distribution range and a range sensed by the sensor 22 in the vehicle A. The details of processes performed by the sensor recognition area calculation unit 16 will be described later.
In step S16, the object selection unit 13 transmits, to the vehicle B, the detection range calculated in step S21, and transmits the pieces of object information in the order from the object having the highest collision risk.
With reference to the drawings, the process in which the sensor recognition area calculation unit 16 calculates the detection range in step S21 will be described.
In step S211, the sensor recognition area calculation unit 16 calculates a recognition range of the vehicle A from a position and an attitude of the vehicle A, and the sensing range of the sensor 22 of the vehicle A.
In step S212, the sensor recognition area calculation unit 16 determines the detection range based on the recognition range, the distribution range desired by the vehicle B, and boundary lines of a road. Specifically, the sensor recognition area calculation unit 16 sets an area inside the boundary lines of the road that satisfies the recognition range and the distribution range as the detection range.
In step S213, the sensor recognition area calculation unit 16 excludes, from the detection range 510, an area that may not be visible (sensed) by the vehicle A (hereinafter referred to as “shielded area”).
Information on an object that is present outside the detection range 520 is not transmitted.
In step S214, the sensor recognition area calculation unit 16 expresses the detection range 520 by links, each of which is represented by a connection between nodes of the road or the lane.
By performing the above described processes, the detection range 520 of the vehicle A is calculated.
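As a hypothetical sketch of steps S212 to S214 (the patent names no library; here the shapely geometry package stands in for whatever geometric processing is actually used, and the link-selection rule is an assumption):

```python
from shapely.geometry import Point, Polygon

def detection_range(recognition: Polygon, distribution: Polygon,
                    road: Polygon, shielded: Polygon) -> Polygon:
    """S212: the area inside the road boundary that satisfies both the
    recognition range of the vehicle A and the distribution range requested
    by the vehicle B; S213: the shielded area is then excluded."""
    return (recognition.intersection(distribution)
                       .intersection(road)
                       .difference(shielded))

def range_as_links(area: Polygon, links: dict[str, tuple[float, float]]) -> list[str]:
    """S214: express the range as road/lane links; here a link is kept when
    its representative point lies inside the area (an illustrative rule)."""
    return [link_id for link_id, (x, y) in links.items()
            if area.contains(Point(x, y))]
```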
The drawings show examples of the detection range 520 calculated in various situations.
As described above, according to the present embodiment, the vehicle B transmits, to the information processing device 10, a transmission request including a distribution range in which an area which is not able to be sensed by the sensor 22 of the vehicle B is set as an area where the vehicle B requests the transmission of object information. Then, the information processing device 10 selects pieces of object information to be transmitted based on the distribution range and the recognition range of the sensor 22 of the vehicle A. This enables the vehicle B to receive only pieces of information on objects that are present in the area which is not able to be sensed by its own sensor 22, and thus, the vehicle B can integrate results obtained by the sensor 22 of the vehicle B and the received pieces of object information to promptly perform processes such as planning to take a safety action. The transmission of the object information at an appropriate timing becomes possible, because the information processing device 10 transmits the object information in response to the transmission request received from the vehicle B.
According to the present embodiment, the sensor recognition area calculation unit 16 specifies the detection range in which the objects are sensed, and transmits the detection range to the vehicle B. This enables the vehicle B to specify an area that can be covered by the object information obtained from the information processing device 10, among the area which is not able to be sensed by the sensor 22 of the vehicle B, and thus, an area which will continue to be a blind spot area can be easily specified. By the sensor recognition area calculation unit 16 expressing the detection range based on a link represented by a connection between nodes of the road or the lane, a communication volume at the time of transmitting the detection range can be reduced.
According to the present embodiment, the sensor recognition area calculation unit 16 excludes, from the detection range, the shielded area that is not able to be sensed by the sensor 22 of the vehicle A based on the information obtained from the map 15. This can suppress the transmission of unnecessary data and can reduce the communication volume.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/IB2019/000700 | 7/12/2019 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2021/009531 | 1/21/2021 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
9767693 | Lee et al. | Sep 2017 | B2 |
10115312 | Lee et al. | Oct 2018 | B2 |
11024164 | Lin et al. | Jun 2021 | B2 |
20130018572 | Jang | Jan 2013 | A1 |
20140019005 | Lee et al. | Jan 2014 | A1 |
20160054441 | Kuo et al. | Feb 2016 | A1 |
20170004366 | Nakata | Jan 2017 | A1 |
20170352277 | Lee et al. | Dec 2017 | A1 |
20180151070 | Katou et al. | May 2018 | A1 |
20180151077 | Lee | May 2018 | A1 |
20180327029 | Oooka | Nov 2018 | A1 |
20190061750 | Tamura | Feb 2019 | A1 |
20200043339 | Kozaki | Feb 2020 | A1 |
20200086855 | Packer | Mar 2020 | A1 |
20200209871 | Xiong et al. | Jul 2020 | A1 |
20200219387 | Lin et al. | Jul 2020 | A1 |
20210264224 | Tamaoki | Aug 2021 | A1 |
Number | Date | Country |
---|---|---|
107564306 | Jan 2018 | CN |
107749193 | Mar 2018 | CN |
102011078615 | Jan 2013 | DE |
2004054369 | Feb 2004 | JP |
2004-077281 | Mar 2004 | JP |
2005-062912 | Mar 2005 | JP |
2008011343 | Jan 2008 | JP |
2008071062 | Jan 2008 | JP |
2008299676 | Dec 2008 | JP |
2009298193 | Dec 2009 | JP |
2010073026 | Apr 2010 | JP |
2012093883 | May 2012 | JP |
2017016318 | Jan 2017 | JP |
2017182563 | Oct 2017 | JP |
2017182570 | Oct 2017 | JP |
10-2014-0007709 | Jan 2014 | KR |
10-2018-0023328 | Mar 2018 | KR |
2018140191 | Aug 2018 | WO |
Number | Date | Country | |
---|---|---|---|
20220319327 A1 | Oct 2022 | US |