The present disclosure relates to a vehicle driving controller, a system including the same, and a method thereof, and more particularly, relates to technologies about a control authority transition demand according to a driving environment and/or a user state during autonomous driving.
The statements in this section merely provide background information related to the present disclosure and may not constitute prior art.
There is a driver assist system which assists a driver with driving and performs partial autonomous driving. For example, the driver assist system may include adaptive cruise control (ACC), a lane departure warning system (LDWS), a lane keeping system (LKS), and the like.
Such a driver assist system enhances driving convenience by automatically performing a portion of the driver's longitudinal or lateral control. On the other hand, such a driver assist system (an autonomous system) is constrained to remain ready for the driver to intervene. Thus, in a situation where the driver is drowsy or is unable to drive his or her vehicle due to health problems, a conventional driver assist system does not assist the driver.
When there is a situation to which it is difficult for the driver assist system to respond, it is important to notify the user of the situation, depending on the driving situation or the user state, and to safely hand over control authority to the user. However, in the related art, there is no technology for warning the user and handing over control authority to the user depending on the driving situation and the user situation.
An aspect of the present disclosure provides a vehicle driving controller for differently applying a control authority transition demand method depending on a driving environment and/or a user state in an autonomous driving environment such that a user safely responds to a risk situation, a system including the same, and a method thereof.
The technical problems to be solved by the present disclosure are not limited to the aforementioned problems, and any other technical problems not mentioned herein will be clearly understood from the following description by those skilled in the art to which the present disclosure pertains.
According to an aspect of the present disclosure, a vehicle driving controller may include: a processor configured to differently apply a warning level depending on at least one of a driving situation and a user state during autonomous driving and to provide a notification of a control authority transition demand to a user; and a storage storing information associated with the driving situation and the user state and information associated with the warning level determined by the processor.
The driving situation may include whether there is collision risk within a predetermined time or whether the vehicle departs from an operational design domain (ODD) in which the autonomous driving can be performed.
The processor may be configured to track a static object and a dynamic object, generate a static grid map and a dynamic object list, and extract map information.
The processor may be configured to generate a driving trajectory of a host vehicle and to determine whether the driving trajectory of the host vehicle intersects the static grid map or whether the driving trajectory of the host vehicle intersects a trajectory of a dynamic object on the dynamic object list, to determine whether there is a collision.
The processor may be configured to determine, based on the map information, a time at which the vehicle departs from the ODD and a type of the departure from the ODD.
The type of the departure from the ODD may include a case where a host vehicle is able to travel for a predetermined time after departing from the ODD and a case where the host vehicle is unable to travel after departing from the ODD.
The processor may be configured to determine at least one of whether the user state is a “hands-off” state, whether the user state is an “eyes-off” state, and whether the user state is a “mind-off” state.
The processor may be configured to change and apply a notification reference time, which determines the timing of the notification, depending on the user state.
The processor may be configured to set the notification reference time to be progressively shorter in the order of the “hands-off” state, the “eyes-off” state, and the “mind-off” state.
The processor may be configured to determine the warning level to be progressively higher in the order of the “hands-off” state, the “eyes-off” state, and the “mind-off” state.
The processor may be configured to determine whether the user state is the “eyes-off” state or the “mind-off” state when there is no driving risk in the driving situation to determine a warning level, to provide the notification of the control authority transition demand depending on the determined warning level, and to increase the warning level when a predetermined time elapses.
The processor may be configured to differently apply the warning level depending on an expected time at which collision risk will occur or an expected time at which the vehicle will depart from an ODD.
The warning level may include level 1 for providing a visual warning; level 2 for providing a visual warning and an audible warning; level 3 for providing a visual warning and an audible warning, the visual warning being provided in a color different from the color used in levels 1 and 2 and the audible warning being output at a volume higher than that of level 2; and level 4 for providing a visual warning in a color different from the colors used in levels 1 to 3, providing an audible warning at the highest volume, and performing an emergency call mode operation.
The processor may be configured to determine a minimum risk maneuver (MRM) depending on the warning level when control authority is not handed over to the user after providing the notification of the control authority transition demand depending on the warning level, and to perform vehicle control depending on the MRM.
The MRM may include at least one of constant-speed driving control in which the vehicle decelerates and then travels at a constant speed, stop control, and shoulder stop control.
According to another aspect of the present disclosure, a vehicle system may include: a vehicle driving controller configured to differently apply a warning level depending on at least one of a driving situation and a user state during autonomous driving and to provide a notification of a control authority transition demand to a user; and a warning device configured to provide a warning according to the warning level.
The warning device may be configured to provide at least one of a visual warning, an audible warning, and an emergency call mode.
According to another aspect of the present disclosure, a vehicle driving control method may include: determining at least one of a driving situation and a user state during autonomous driving; and differently applying a warning level depending on at least one of the driving situation and the user state and providing a notification of a control authority transition demand to a user.
The method may further include: determining an MRM depending on the warning level when control authority is not handed over to the user after providing the notification of the control authority transition demand depending on the warning level; and performing vehicle control depending on the MRM.
The driving situation may include whether there is collision risk within a predetermined time or whether the vehicle departs from an ODD in which the autonomous driving can be performed.
Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
In order that the disclosure may be well understood, there will now be described various forms thereof, given by way of example, reference being made to the accompanying drawings, in which:
The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses. It should be understood that throughout the drawings, corresponding reference numerals indicate like or corresponding parts and features.
Hereinafter, some forms of the present disclosure will be described in detail with reference to the accompanying drawings. In adding reference denotations to elements of each drawing, it should be noted that the same elements are denoted by the same reference numerals even when they are displayed on different drawings. In addition, in describing some forms of the present disclosure, if it is determined that a detailed description of related well-known configurations or functions obscures the gist of some forms of the present disclosure, it will be omitted.
In describing elements of some forms of the present disclosure, the terms 1st, 2nd, first, second, A, B, (a), (b), and the like may be used herein. These terms are only used to distinguish one element from another element, but do not limit the corresponding elements irrespective of the nature, sequence, or order of the corresponding elements. Unless otherwise defined, all terms used herein, including technical or scientific terms, have the same meanings as those generally understood by those skilled in the art to which the present disclosure pertains. Such terms as those defined in a generally used dictionary are to be interpreted as having meanings equal to the contextual meanings in the relevant field of art, and are not to be interpreted as having ideal or excessively formal meanings unless clearly defined as having such in the present application.
Some forms of the present disclosure may provide technology for differently applying a warning level depending on a driving environment and/or a user state during autonomous driving, providing a notification of a control authority transition demand to a user, and performing vehicle control depending on a minimum risk maneuver (MRM) when control authority is not handed over to the user, and may be applied to autonomous driving technology.
Hereinafter, a description will be given in detail of some forms of the present disclosure with reference to the accompanying drawings.
Referring to the accompanying drawings, a vehicle system in some forms of the present disclosure may include a vehicle driving controller 100, a sensor module 200, a GPS receiver 300, a map DB 400, a display device 500, a warning device 600, and an actuator 700.
The vehicle driving controller 100 may differently apply a warning level depending on at least one or more of a driving situation and/or a user state during autonomous driving and may provide a notification of a control authority transition demand to a user. Furthermore, when the user does not take over control authority after receiving the notification of the control authority transition demand, the vehicle driving controller 100 may determine a minimum risk maneuver (MRM) depending on a warning level and may perform autonomous driving control of a vehicle depending on the determined MRM.
The vehicle driving controller 100 may include a communication device 110, a storage 120, and a processor 130.
The communication device 110 may be a hardware device implemented with various electronic circuits to transmit and receive a signal over a wireless or wired connection. In some forms of the present disclosure, the communication device 110 may perform in-vehicle communication through controller area network (CAN) communication, local interconnect network (LIN) communication, or the like and may communicate with the sensor module 200, the GPS receiver 300, the map DB 400, the display device 500, the warning device 600, the actuator 700, and the like.
The storage 120 may store information associated with a sensing result of the sensor module 200 and information associated with a driving situation, a user state, a warning level, or the like obtained by the processor 130. The storage 120 may include at least one type of storage medium, such as a flash memory type memory, a hard disk type memory, a micro type memory, a card type memory (e.g., a secure digital (SD) card or an extreme digital (XD) card), a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), a programmable ROM (PROM), an electrically erasable PROM (EEPROM), a magnetic RAM (MRAM), a magnetic disk, and an optical disk.
The processor 130 may be electrically connected with the communication device 110, the storage 120, or the like and may electrically control the respective components. The processor 130 may be an electrical circuit which executes instructions of software and may perform a variety of data processing and calculation described below.
The processor 130 may differently apply a warning level depending on at least one or more of a driving situation and a user state during autonomous driving to provide a notification of a control authority transition demand. When the user does not take over control authority after receiving the notification of the control authority transition demand, the processor 130 may determine an MRM according to the warning level and may perform autonomous driving control of the vehicle, thus avoiding a risk situation.
In this case, the driving situation may include whether there is collision risk within a predetermined time or whether the vehicle departs from an operational design domain (ODD) in which autonomous driving can be performed. In other words, the driving situation may refer to situation information associated with whether there is a probability of collision within a predetermined time or whether the vehicle departs from the ODD.
The ODD may be a range set to perform a function of an autonomous system and may be set in consideration of conditions such as a geographic condition, a road condition, an environmental condition, and a driving condition. In other words, for safe driving, the system is able to operate for autonomous driving on a highway, a limited-access road, or the like, but is unable to operate for autonomous driving on a general road, a junction/interchange (JC/IC), a tollgate, or the like.
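By way of illustration only, the following minimal Python sketch shows one way such an ODD membership check could be expressed, assuming road-type information extracted from the map DB; the road-type set, flag names, and function are illustrative assumptions, not part of the disclosure.

```python
# Illustrative ODD membership check based on road type. The allowed set
# mirrors the examples above (highway, limited-access road); tollgates and
# junctions/interchanges (JC/IC) are treated as outside the ODD.
# All names and condition flags are assumptions for illustration.

ODD_ROAD_TYPES = {"highway", "limited_access_road"}

def within_odd(road_type, at_tollgate=False, at_junction=False):
    return road_type in ODD_ROAD_TYPES and not (at_tollgate or at_junction)
```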
The processor 130 may track a static object and a dynamic object based on information received from the sensor module 200 and the GPS receiver 300 to generate a static grid map and a dynamic object list and may extract map information from the map DB 400. In this case, the dynamic object may be listed on the dynamic object list, and the dynamic object list may include information such as a type or trajectory of the dynamic object.
The processor 130 may generate a driving trajectory of a host vehicle based on information received from the sensor module 200 and the GPS receiver 300 or a navigation device (not shown) and may determine whether the driving trajectory of the host vehicle and the static grid map intersect each other or whether the driving trajectory of the host vehicle and a trajectory of a dynamic object included in the dynamic object list intersect each other, to determine whether there is collision. In other words, when there is a point (a point of intersection) where the driving trajectory of the host vehicle meets the static grid map or where the driving trajectory of the host vehicle meets the trajectory of the dynamic object, it is determined that a collision will occur.
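A minimal sketch of this intersection test is shown below, assuming the static grid map is a set of occupied cells and trajectories are lists of time-aligned (x, y) points; the data layouts, names, and the proximity radius are illustrative assumptions.

```python
# Sketch of the collision test described above: a collision is expected when
# the host trajectory crosses an occupied cell of the static grid map or
# comes within a small radius of a dynamic object's predicted trajectory.

def hits_static_grid(host_traj, occupied_cells, cell_size):
    """host_traj: list of (x, y) in meters; occupied_cells: set of (row, col)."""
    for x, y in host_traj:
        if (int(y // cell_size), int(x // cell_size)) in occupied_cells:
            return True
    return False

def hits_dynamic_object(host_traj, object_traj, radius=1.5):
    """Proximity test between time-aligned trajectory samples (radius assumed)."""
    for (hx, hy), (ox, oy) in zip(host_traj, object_traj):
        if (hx - ox) ** 2 + (hy - oy) ** 2 <= radius ** 2:
            return True
    return False

def collision_expected(host_traj, occupied_cells, cell_size, dynamic_object_list):
    return hits_static_grid(host_traj, occupied_cells, cell_size) or any(
        hits_dynamic_object(host_traj, obj["trajectory"]) for obj in dynamic_object_list
    )
```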
Furthermore, the processor 130 may determine, based on the map information, a time at which the vehicle departs from the ODD and a type of the departure from the ODD. The information about the departure time may indicate the time at which the vehicle will depart from the ODD. The information about the departure type may include a type in which the host vehicle is able to travel for a predetermined time after departing from the ODD and a type in which the host vehicle is unable to travel after departing from the ODD.
The processor 130 may determine at least one of whether the user state is a “hands-off” state, whether the user state is an “eyes-off” state, and whether the user state is a “mind-off” state. Determining the user state may include determining whether the user is not focused on driving and is in an inattentive state. In other words, the “hands-off” state may be a state where the user does not grip a steering wheel. The “eyes-off” state may be an inattentive state, for example, a state where the user watches a movie without keeping his or her eyes on the road. The “mind-off” state may be a state where the user is unable to drive the vehicle, for example, a state where the user is dozing or drunk.
When a state in which a value measured by a steering wheel torque sensor is less than a predetermined value and in which there is no change in the steering angle persists for longer than a predetermined time, the processor 130 may determine that the user does not grip the steering wheel and may determine the user state as the “hands-off” state. In this case, the user should be in an “eyes-on” state and a “mind-on” state. The processor 130 may determine the user state as the “eyes-off” state when the user does not keep his or her eyes on the road for longer than a predetermined time, based on information about the line of sight of the user, information about a face direction of the user, or the like from image data. In this case, the user should be in the “mind-on” state. The processor 130 may extract information associated with blinking of the user's eyes, a yawn of the user, or the like based on facial information from image data and may determine drowsy driving of the user. Alternatively, when the user is unable to drive the vehicle, the processor 130 may determine the user state as the “mind-off” state. The processor 130 may generate one of a “hands-off” flag, an “eyes-off” flag, or a “mind-off” flag.
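The user-state decision above can be sketched as follows; the sensor inputs, thresholds, and durations are illustrative assumptions (the disclosure specifies the signals used, not their numeric limits), and the checks are ordered from most to least severe.

```python
# Illustrative user-state classification from the signals described above:
# steering wheel torque and steering-angle change (hands-off), gaze/face
# direction (eyes-off), and blinking/yawn analysis (mind-off).
# Thresholds and durations are assumptions, not values from the disclosure.

def classify_user_state(torque_nm, steering_angle_change_deg,
                        hands_off_duration_s, eyes_off_duration_s,
                        drowsy_detected):
    if drowsy_detected:                        # blinking/yawn analysis flagged drowsiness
        return "mind-off"
    if eyes_off_duration_s > 3.0:              # eyes off the road too long
        return "eyes-off"
    if (torque_nm < 0.5 and steering_angle_change_deg < 0.1
            and hands_off_duration_s > 10.0):  # no grip detected on the wheel
        return "hands-off"
    return "attentive"
```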
When determining a warning level, the processor 130 may determine the warning level to be higher in the order of the “hands-off” state, the “eyes-off” state, and the “mind-off” state and may set a notification reference time (a notification period) to be shorter in the same order. For example, the processor 130 may set the notification period to 100 meters when the user state is the “hands-off” state and may set the notification period to 10 meters when the user state is the “mind-off” state.
In other words, the processor 130 may change and apply the notification reference time depending on the user state. The processor 130 may determine that risk increases in the order of the “hands-off” state, the “eyes-off” state, and the “mind-off” state. In the case of the “hands-off” state, the processor 130 may set the notification reference time to be longest. In the case of the “mind-off” state, the processor 130 may set the notification reference time to be shortest. For example, when collision is expected at a 100-meter point, the processor 130 may provide a notification of a control authority transition demand from a point 50 meters before the vehicle arrives at the collision point when the user is in the “hands-off” state, or may provide the notification of the control authority transition demand from a point 10 meters before the vehicle arrives at the collision point when the user is in the “mind-off” state, and may increase a warning level.
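Using the example values above (a 100-meter period for the “hands-off” state and a 10-meter period for the “mind-off” state), the state-dependent notification period could be encoded as below; the “eyes-off” value is an assumed intermediate, not a value from the disclosure.

```python
# Notification period per user state; 100 m ("hands-off") and 10 m
# ("mind-off") follow the example above, while 50 m ("eyes-off") is an
# assumed intermediate value.

NOTIFICATION_PERIOD_M = {
    "hands-off": 100.0,  # lowest risk: least frequent notification
    "eyes-off": 50.0,    # assumed intermediate value
    "mind-off": 10.0,    # highest risk: most frequent notification
}

def notification_period_m(user_state, default_m=100.0):
    return NOTIFICATION_PERIOD_M.get(user_state, default_m)
```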
When there is no driving risk in a driving situation, the processor 130 may determine whether the user state is the “eyes-off” state or the “mind-off” state to determine a warning level and may provide a notification of a control authority transition demand depending on the determined warning level. When a predetermined time elapses, the processor 130 may increase the warning level.
Moreover, the processor 130 may differently apply a warning level depending on an expected time at which collision risk will occur or an expected time at which the vehicle will depart from an ODD. In other words, the processor 130 may provide a visual warning in warning level 1 and may provide a visual warning and an audible warning together in warning level 2. The processor 130 may provide a visual warning and an audible warning in warning level 3, the visual warning being provided in a color different from the color used in warning levels 1 and 2 and the audible warning being output at a volume higher than that of warning level 2. In warning level 4, the processor 130 may provide a visual warning in a color different from the colors used in warning levels 1 to 3, may provide an audible warning at the highest volume, and may perform an emergency call mode operation.
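The four warning levels described above can be summarized in code as follows; the colors, tone labels, and the warning-device interface are placeholders assumed for illustration, since the disclosure specifies only that the color and volume differ between levels.

```python
# Encoding of the four warning levels: level 1 is visual only, level 2 adds
# an audible warning, level 3 changes the color and raises the volume, and
# level 4 uses a distinct color, the highest volume, and an emergency call.
# Color/volume values and device methods are illustrative assumptions.

WARNING_LEVELS = {
    1: {"color": "yellow", "volume": None,      "emergency_call": False},
    2: {"color": "yellow", "volume": "low",     "emergency_call": False},
    3: {"color": "orange", "volume": "high",    "emergency_call": False},
    4: {"color": "red",    "volume": "highest", "emergency_call": True},
}

def issue_warning(level, warning_device):
    cfg = WARNING_LEVELS[level]
    warning_device.show_visual(cfg["color"])       # hypothetical device API
    if cfg["volume"] is not None:
        warning_device.play_beep(cfg["volume"])    # hypothetical device API
    if cfg["emergency_call"]:
        warning_device.start_emergency_call()      # hypothetical device API
```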
After providing a notification of a control authority transition demand depending on a warning level, when control authority is not handed over to the user, the processor 130 may determine an MRM depending on the warning level and may perform vehicle control depending on the MRM. The MRM may include at least one of constant-speed driving control in which the vehicle decelerates and then travels at a constant speed, stop control, and shoulder stop control. For example, the processor 130 may perform shoulder stop control in warning level 4 or may perform constant-speed driving control in warning level 1.
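A simple level-keyed selection consistent with the examples above (constant-speed driving control in warning level 1, shoulder stop control in warning level 4) might look as follows; the mapping for the intermediate levels is an assumption, and the detailed flow later in the disclosure also conditions the MRM on the user state.

```python
# Level-keyed MRM selection; level 1 -> constant-speed driving and
# level 4 -> shoulder stop follow the example above, while the mapping for
# levels 2 and 3 is an assumption consistent with the later flow.

def select_mrm(warning_level):
    if warning_level >= 4:
        return "shoulder_stop"
    if warning_level >= 2:
        return "stop_in_lane"
    return "constant_speed_after_deceleration"
```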
The sensor module 200 may include a plurality of sensors for sensing an object outside the vehicle and may obtain information associated with a location of the object, a speed of the object, a movement direction of the object, and/or a type (e.g., a vehicle, a pedestrian, a bicycle, a motorcycle, or the like) of the object. To this end, the sensor module 200 may include an ultrasonic sensor, a radar sensor, a light detection and ranging (LiDAR) sensor, a camera, a laser scanner and/or a corner radar, an acceleration sensor, a yaw rate sensor, a torque sensor and/or a wheel speed sensor, a steering angle sensor, a steering wheel torque sensor, or the like.
In some forms of the present disclosure, the sensor module 200 may sense a dynamic object and a static object around the vehicle by means of the ultrasonic sensor, the radar sensor, the LiDAR sensor, the camera, or the like and may provide the sensed information to the processor 130. Furthermore, the sensor module 200 may obtain information about a line of sight of the user, information about a face direction of the user, or facial information of the user from image data of the camera and may obtain information about a steering angle from the steering angle sensor, thus providing the obtained information to the processor 130.
The GPS receiver 300 may receive a GPS signal transmitted from a GPS satellite and may provide the received GPS signal to the processor 130. The GPS signal may be used to ascertain a current location of the vehicle.
The map DB 400 may store map information for controlling autonomous driving of the vehicle and may provide the map information to the processor 130. In this case, the map information may include information associated with a type (e.g., a highway, a limited-access road, a general road, or the like) of a road on which the vehicle is currently traveling, whether there is a tollgate, or whether there is a JC/IC.
The display device 500 may be controlled by the vehicle driving controller 100 to display a visual warning for a notification of a control authority transition demand and may display the visual warning in a different color according to a warning level. Furthermore, the display device 500 may display vehicle control information during autonomous driving as well as the control authority transition demand to provide the displayed information to the user.
The display device 500 may be implemented as a head-up display (HUD), a cluster, an audio video navigation (AVN), or the like. Furthermore, the display device 500 may receive a color input or the like directly from the user using a user setting menu (USM) of the cluster. Moreover, the display device 500 may include at least one of a liquid crystal display (LCD), a thin film transistor-LCD (TFT-LCD), a light emitting diode (LED) display, an organic LED (OLED) display, an active matrix OLED (AMOLED) display, a flexible display, a bended display, and a three-dimensional (3D) display. Some of these may be implemented as transparent displays configured as a transparent type or a semi-transparent type such that the outside is visible. Moreover, the display device 500 may be implemented as a touchscreen including a touch panel to be used as an input device in addition to an output device.
The warning device 600 may output a visual warning or a sound notification (a beep sound) for a notification of a control authority transition demand or may perform an emergency call mode or the like. The warning device 600 may change a color of the visual warning depending on a warning level determined by the processor 130 and output the changed visual warning, or may change the sound strength and output the changed sound.
Furthermore, the warning device 600 may provide a warning such as a notification of a driving risk situation during autonomous driving. The warning device 600 may have a configuration for providing visual, audible, and tactile warnings and may be implemented with a HUD, a cluster, an AVN, a pop-up speaker, or the like.
The actuator 700 may be controlled by the vehicle driving controller 100 to control a steering angle, acceleration, braking, engine driving, or the like of the vehicle and may include a steering wheel, an actuator interlocked with the steering wheel, a controller for controlling the actuator, a controller for controlling a brake, a controller for controlling a speed of the vehicle, or a controller for controlling a transmission (e.g., a gear, a clutch, or the like) of the vehicle.
As such, some forms of the present disclosure may determine a driving situation and/or a user state, may determine a warning level and a notification period, and may thus provide a notification of a control authority transition demand depending on the warning level and the notification period. When the user does not take over control authority, some forms of the present disclosure may continue increasing the warning level or may continue reducing the notification period to provide the notification. Moreover, when the user does not take over control authority, some forms of the present disclosure may determine an MRM according to the warning level and may perform autonomous driving control of the vehicle, thus minimizing risk situations during autonomous driving of the vehicle and enabling safe driving.
Hereinafter, a description will be given in detail of an example of determining a warning level upon driving risk with reference to the accompanying drawings.
Notification start condition when there is no driving risk
1. When there is no driving risk, but an “eyes-off” state continues for a predetermined time, a vehicle driving controller 100 may enter warning level 2 and may increase a warning level over time. The vehicle driving controller 100 may enter warning level 4 and may provide a notification, but when control authority transition is not performed, the vehicle driving controller 100 may perform stop control.
2. When there is no driving risk, but a “mind-off” state continues for a predetermined time, the vehicle driving controller 100 may enter warning level 3 and may increase a warning level over time. The vehicle driving controller 100 may enter warning level 4 and may provide a notification, but when control authority transition is not performed, the vehicle driving controller 100 may perform shoulder stop control.
Notification start condition when there is driving risk
1. State where collision risk is expected.
(1) Collision occurs after a first reference time tth1,collision
Hereinafter, a description will be given in detail of a vehicle driving control method in some forms of the present disclosure with reference to the accompanying drawings.
Hereinafter, it is assumed that the vehicle driving controller 100 described above performs the following process.
Referring to the accompanying drawings, in operation S110, the vehicle driving controller 100 may recognize a surrounding environment based on information received from the sensor module 200 and the GPS receiver 300, may generate a static grid map and a dynamic object list, and may extract map information from the map DB 400.
In operation S120, the vehicle driving controller 100 may determine driving risk based on the recognized information. Based on the static grid map and the dynamic object list, the vehicle driving controller 100 may determine whether the driving trajectory of the host vehicle and the static grid map intersect each other and whether the driving trajectory of the host vehicle and a trajectory of a dynamic object intersect each other, to determine whether there is collision and a time to collision (TTC). Furthermore, the vehicle driving controller 100 may determine whether the vehicle departs from an ODD based on the map information. In other words, the vehicle driving controller 100 may output information about the type of the departure from the ODD and information about the time of the departure from the ODD, based on the map information.
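As a minimal sketch of the TTC computation implied here, assuming the distance to the point of intersection and a constant closing speed are already known (both simplifying assumptions):

```python
# Time-to-collision (TTC) under a constant closing-speed assumption.

def time_to_collision(distance_to_intersection_m, closing_speed_mps):
    if closing_speed_mps <= 0.0:
        return float("inf")  # the host is not closing on the intersection point
    return distance_to_intersection_m / closing_speed_mps
```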
In operation S130, the vehicle driving controller 100 may determine a user state based on information received from the sensor module 200. In this case, the user state may include at least one of a “hands-off” state, an “eyes-off” state, and a “mind-off” state.
In operation S140, the vehicle driving controller 100 may determine a warning level based on the driving risk and the user state and may output a warning (a control authority transition demand).
When the user takes over control authority in operation S150 after outputting the warning, in operation S160, the user may directly drive the host vehicle. When the user does not take over the control authority in operation S150, in operation S170, the vehicle driving controller 100 may determine an MRM based on the driving risk and the user state and may perform vehicle control according to the MRM.
Referring to the accompanying drawings, the operation of recognizing the surrounding environment is described in more detail below.
In this case, the vehicle driving controller 100 may represent and output a static object of a surrounding environment on a grid map and may predict a location of a dynamic object, a speed of the dynamic object, an acceleration of the dynamic object, a current driving lane, and a driving trajectory to generate and output a dynamic object list. Furthermore, the vehicle driving controller 100 may search for a section where the host vehicle is currently traveling, based on a current location and a map DB and may transmit geographic/road information about the section where the host vehicle is currently traveling. In this case, the geographic/road information may include information associated with a main line of a highway, a JC/IC, a tollgate, a limited-access road, a general road, or the like.
An example of this output is shown by reference numeral 401 of the accompanying drawings.
Referring to the accompanying drawings, the operation of determining the driving risk is as follows.
In operation S122, the vehicle driving controller 100 may generate a driving trajectory of a host vehicle based on a current driving lane. In operation S123, the vehicle driving controller 100 may determine whether there is a point of intersection between the driving trajectory of the host vehicle and a grid map. In operation S124, the vehicle driving controller 100 may determine whether there is a point of intersection between the driving trajectory of the host vehicle and a trajectory of a dynamic object. For example, the vehicle driving controller 100 may determine whether there is a point 207 of intersection between a static object 205, represented as points on a generated grid map 402 shown in the accompanying drawings, and the driving trajectory of the host vehicle.
In operation S127, the vehicle driving controller 100 may calculate the remaining distance and time to an event from the map information based on location information of the host vehicle. In operation S128, the vehicle driving controller 100 may determine an ODD. In this case, the ODD may refer to a domain in which autonomous driving can be performed, and the event may refer to a point at which autonomous driving is no longer able to be performed. For example, when the host vehicle travels on a highway which belongs to the range of the ODD and then travels on a general road which is not within the range of the ODD, the vehicle driving controller 100 may regard the domain before the host vehicle enters the general road (the event) as the ODD and may determine the time when the host vehicle enters the general road as the point of departure from the ODD.
The type of departure from the ODD may be classified into i) a case where the host vehicle is able to travel after departing from the ODD and ii) a case where the host vehicle is unable to travel after departing from the ODD. The case where the host vehicle is able to travel after departing from the ODD may include a case where the host vehicle exits from a highway and then enters a general road or a case where the host vehicle is affected by weather such as rain. The case where the host vehicle is unable to travel after departing from the ODD may include a case where a lane line disappears or where there is no lane line.
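Operations S127 and S128 can be sketched as below, assuming the event (the point where autonomous driving becomes unavailable) is expressed as an along-route position; the route representation and names are illustrative assumptions.

```python
# Remaining distance and time to the next ODD-exit event, using along-route
# positions in meters and the current speed in meters per second.

def distance_and_time_to_odd_exit(current_pos_m, event_pos_m, speed_mps):
    remaining_m = max(0.0, event_pos_m - current_pos_m)
    remaining_s = remaining_m / speed_mps if speed_mps > 0.0 else float("inf")
    return remaining_m, remaining_s
```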
Referring to the accompanying drawings, the operation of determining the user state is as follows.
In operation S131, the vehicle driving controller 100 may determine whether the user state is the “hands-off” state based on information about a steering wheel torque and information about a steering angle. For example, when a state in which a value measured by a steering wheel torque sensor is less than a predetermined value and in which there is no change in the steering angle persists for longer than a predetermined time, the vehicle driving controller 100 may determine that the user does not grip the steering wheel and may generate a “hands-off” flag. In this case, the user should be in the “eyes-on” state and the “mind-on” state.
In operation S132, the vehicle driving controller 100 may determine whether the user state is the “eyes-off” state based on information about a line of sight of the user and information about a face direction of the user. For example, the vehicle driving controller 100 may track the line of sight of the user and the face direction of the user. When the user does not keep his or her eyes on the road for longer than a predetermined time, the vehicle driving controller 100 may generate an “eyes-off” flag. In this case, the user should be in the “mind-on” state.
In operation S133, the vehicle driving controller 100 may determine whether the user state is the “mind-off” state using facial information of the user from image data of a camera. For example, the vehicle driving controller 100 may determine, through blinking of the user's eyes, a yawn of the user, or the like, a state where the user is drowsy or a state where the user is unable to currently drive the vehicle and may generate a “mind-off” flag.
Referring to the accompanying drawings, the vehicle driving controller 100 may provide a warning according to each warning level as follows.
When a “hands-off” state occurs, the vehicle driving controller 100 may provide a notification of a control authority transition demand in warning level 1. In this case, the vehicle driving controller 100 may provide a visual warning.
When the “hands-off” state or an “eyes-off” state occurs, the vehicle driving controller 100 may provide a notification of a control authority transition demand in warning level 2. In this case, the vehicle driving controller 100 may provide a visual warning and a sound notification (a beep sound) in warning level 2.
When the “hands-off” state or the “eyes-off” state occurs, the vehicle driving controller 100 may provide a notification of a control authority transition demand in warning level 3. In this case, the vehicle driving controller 100 may provide a visual warning and a sound notification (a beep sound) in warning level 3, and may output the visual warning in a color different from that of warning level 2 or may output the notification sound at a higher level.
When the “hands-off” state or the “eyes-off” state occurs, the vehicle driving controller 100 may provide a notification of a control authority transition demand in warning level 4. In this case, the vehicle driving controller 100 may provide a visual warning, a sound notification (a beep sound), and an emergency call mode in warning level 4. When providing the visual warning and the sound notification, the vehicle driving controller 100 may output the visual warning in a color different from the colors of warning levels 2 and 3 and may output a sound stronger than that of warning level 3.
Referring to the accompanying drawings, the MRM may include constant-speed driving control, stop control, and shoulder stop control.
The vehicle driving controller 100 may determine, as the MRM, one of constant-speed driving control for performing constant-speed driving after decelerating, stop control for stopping in the current lane, and shoulder stop control for changing lanes to a shoulder and stopping on the shoulder.
The vehicle driving controller 100 may determine the MRM depending on the driving risk and the user state. When the notification of the control authority transition demand is started, the vehicle driving controller 100 may gradually increase the warning level over time and may provide the notification of the control authority transition demand up to the highest warning level. When the user still does not take over control authority, the vehicle driving controller 100 may determine an MRM and may perform vehicle control.
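The gradual escalation described here could be sketched as a simple time-stepped increase toward the highest level; the escalation interval is an assumption, as the disclosure states only that the level increases over time up to warning level 4.

```python
# Time-based escalation: raise the warning level one step per elapsed
# interval, capped at level 4. The interval value is an assumption.

def escalate(initial_level, elapsed_s, interval_s=5.0, max_level=4):
    steps = int(elapsed_s // interval_s)
    return min(max_level, initial_level + steps)
```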
Hereinafter, a description will be given in detail of a method for determining a warning level according to a driving environment and determining an MRM according to a warning level in some forms of the present disclosure.
Hereinafter, it is assumed that the vehicle driving controller 100 described above performs the following process.
Referring to the accompanying drawings, in operation S201, the vehicle driving controller 100 may determine whether there is driving risk. When there is no driving risk, in operation S202, the vehicle driving controller 100 may determine the user state.
When the user state is the “eyes-off” state in operation S202, in operation S203, the vehicle driving controller 100 may provide a notification of a control authority transition demand in warning level 2, may increase a warning level over time, and may perform stop control when a user does not take over control authority after entering warning level 4.
When the user state is the “mind-off” state in operation S202, in operation S204, the vehicle driving controller 100 may provide the notification of the control authority transition demand in warning level 3, may increase the warning level over time, and may perform shoulder stop control when the user does not take over control authority after entering warning level 4.
Meanwhile, when there is the driving risk in operation S201, in operation S205, the vehicle driving controller 100 may determine whether the driving risk is collision risk or a departure from an ODD.
When there is the collision risk in operation S205, in operation S206, the vehicle driving controller 100 may determine whether the collision risk occurs after a predetermined time.
When the collision risk occurs after the predetermined time, the vehicle driving controller 100 may determine a warning level depending on a user state.
For example, when the collision risk occurs after the predetermined time, in operation S207, the vehicle driving controller 100 may provide the notification of the control authority transition demand in warning level 2 when the user state is the “hands-off” state, may provide the notification of the control authority transition demand in warning level 2 when the user state is the “eyes-off” state, and may provide the notification of the control authority transition demand in warning level 3 when the user state is the “mind-off” state.
When the user does not take over the control authority, in operation S208, the vehicle driving controller 100 may perform stop control when providing the notification of the control authority transition demand in “eyes-off” warning level 2 and may perform shoulder stop control in “mind-off” warning level 3.
Meanwhile, when the collision is expected earlier than the predetermined time, in operation S209, the vehicle driving controller 100 may provide the notification of the control authority transition demand in warning level 3 when the user state is the “hands-off” state, may provide the notification of the control authority transition demand in warning level 4 when the user state is the “eyes-off” state, and may provide the notification of the control authority transition demand in warning level 4 when the user state is the “mind-off” state.
After providing the notification of the control authority transition demand, when the user does not take over the control authority, in operation S210, the vehicle driving controller 100 may perform stop control upon “hands-off” warning level 3, may perform stop control upon “eyes-off” warning level 4, and may perform shoulder stop control upon “mind-off” warning level 4.
When it is determined in operation S205 that the vehicle departs from the ODD, in operation S211, the vehicle driving controller 100 may determine whether the departure from the ODD occurs after a predetermined time.
When the departure from the ODD occurs after the predetermined time, in operation S212, the vehicle driving controller 100 may provide the notification of the control authority transition demand in warning level 1 when the user state is the “hands-off” state, may provide the notification of the control authority transition demand in warning level 2 when the user state is the “eyes-off” state, and may provide the notification of the control authority transition demand in warning level 3 when the user state is the “mind-off” state.
After providing the notification of the control authority transition demand, when the user does not take over the control authority, in operation S213, the vehicle driving controller 100 may perform constant-speed driving control upon “hands-off” warning level 1, may perform stop control upon “eyes-off” warning level 2, and may perform shoulder stop control upon “mind-off” warning level 3.
Meanwhile, when it is determined in operation S211 that the departure from the ODD occurs earlier than the predetermined time, in operation S214, the vehicle driving controller 100 may provide the notification of the control authority transition demand in warning level 2 when the user state is the “hands-off” state, may provide the notification of the control authority transition demand in warning level 3 when the user state is the “eyes-off” state, and may provide the notification of the control authority transition demand in warning level 4 when the user state is the “mind-off” state.
After providing the notification of the control authority transition demand, when the user does not take over the control authority, in operation S215, the vehicle driving controller 100 may perform constant-speed driving control upon “hands-off” warning level 2, may perform stop control upon “eyes-off” warning level 3, and may perform shoulder stop control upon “mind-off” warning level 4.
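The flow of operations S201 to S215 can be consolidated into a single lookup table, shown below as a sketch; the keys and string labels are illustrative, and the one entry whose MRM the flow leaves unstated is marked as an assumption.

```python
# Consolidated decision table for operations S201-S215.
# Key: (risk type, risk expected earlier than the predetermined time, user state).
# Value: (initial warning level, MRM performed if control is never taken over).

DECISION_TABLE = {
    ("collision", False, "hands-off"): (2, "stop_in_lane"),  # MRM assumed; not stated in the flow
    ("collision", False, "eyes-off"):  (2, "stop_in_lane"),
    ("collision", False, "mind-off"):  (3, "shoulder_stop"),
    ("collision", True,  "hands-off"): (3, "stop_in_lane"),
    ("collision", True,  "eyes-off"):  (4, "stop_in_lane"),
    ("collision", True,  "mind-off"):  (4, "shoulder_stop"),
    ("odd_exit",  False, "hands-off"): (1, "constant_speed_driving"),
    ("odd_exit",  False, "eyes-off"):  (2, "stop_in_lane"),
    ("odd_exit",  False, "mind-off"):  (3, "shoulder_stop"),
    ("odd_exit",  True,  "hands-off"): (2, "constant_speed_driving"),
    ("odd_exit",  True,  "eyes-off"):  (3, "stop_in_lane"),
    ("odd_exit",  True,  "mind-off"):  (4, "shoulder_stop"),
    ("none",      False, "eyes-off"):  (2, "stop_in_lane"),
    ("none",      False, "mind-off"):  (3, "shoulder_stop"),
}

def plan_response(risk_type, imminent, user_state):
    return DECISION_TABLE.get((risk_type, imminent, user_state))
```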
Referring to the accompanying drawings, a computing system may include at least one processor 1100, a memory 1300, and a storage 1600 which are connected with each other through a bus.
The processor 1100 may be a central processing unit (CPU) or a semiconductor device for processing instructions stored in the memory 1300 and/or the storage 1600. Each of the memory 1300 and the storage 1600 may include various types of volatile or non-volatile storage media. For example, the memory 1300 may include a read only memory (ROM) and a random access memory (RAM).
Thus, the operations of the methods or algorithms described in connection with some forms of the present disclosure disclosed in the specification may be directly implemented with a hardware module, a software module, or combinations thereof, executed by the processor 1100. The software module may reside on a storage medium (e.g., the memory 1300 and/or the storage 1600) such as a RAM, a flash memory, a ROM, an erasable and programmable ROM (EPROM), an electrically EPROM (EEPROM), a register, a hard disc, a removable disc, or a compact disc-ROM (CD-ROM).
An exemplary storage medium may be coupled to the processor 1100. The processor 1100 may read out information from the storage medium and may write information to the storage medium. Alternatively, the storage medium may be integrated with the processor 1100. The processor and the storage medium may reside in an application specific integrated circuit (ASIC). The ASIC may reside in a user terminal. Alternatively, the processor and the storage medium may reside as separate components of the user terminal.
The technology of the present disclosure may differently apply a control authority transition demand method depending on a driving environment and/or a user state in an autonomous driving environment such that the user safely copes with a risk situation.
In addition, various effects directly or indirectly ascertained through the present disclosure may be provided.
The description of the disclosure is merely exemplary in nature and, thus, variations that do not depart from the substance of the disclosure are intended to be within the scope of the disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the disclosure.
The present application claims priority to and the benefit of Korean Patent Application No. 10-2018-0154724, filed in the Korean Intellectual Property Office on Dec. 4, 2018 and U.S. Patent Application No. 62/655,831, filed in the US Patent and Trademark Office on Apr. 11, 2018, the entire contents of which are incorporated herein by reference.