The present invention relates to a display control apparatus that displays an image on a display device viewed by a passenger of an own vehicle, and to a vehicle control apparatus that controls the own vehicle.
As such a display control apparatus, there is known an apparatus that recognizes white lines as the boundaries of a traveling lane and displays an image indicating the recognition state of the white lines, as described in Patent Literature 1, for example.
[Patent Literature 1] Japanese Patent No. 5316713
For the above display control apparatus, there is a demand that a passenger be able to recognize many things at a glance from the displayed image.
An embodiment of the present invention provides a display control apparatus that displays an image on a display device viewed by a passenger of an own vehicle and that can display more items.
A display control apparatus of the embodiment is installed in an own vehicle to display an image on a display device viewed by a passenger of the own vehicle. The display control apparatus includes a boundary acquisition section that acquires positions of boundary portions defining both width-wise ends of a traveling lane in which the own vehicle travels, and an object acquisition section that acquires a position of an object around the traveling lane. The apparatus generates a position image, which is an image representing the positions of the boundary portions and the position of the object, and displays the position image on the display device.
Embodiments of the present invention will be described below with reference to the drawings.
A deviation avoidance system 2 to which the present invention is applied is installed in a vehicle such as a passenger automobile and has a function of suppressing a deviation of the vehicle from a traveling lane in which the vehicle travels. It is noted that the traveling lane refers to an area closer to the own vehicle than boundary portions that define the right and left ends of an area in which the own vehicle is supposed to travel.
The deviation avoidance system 2 of the present embodiment is configured to display more items on a display 40 to improve convenience. It is noted that, in the present embodiment, “suppressing a deviation” is also expressed as “avoiding a deviation”.
As shown in the accompanying drawing, the deviation avoidance system 2 includes a deviation avoidance apparatus 10, a traveling control apparatus 30, a steering motor 32, a display 40, a deviation avoidance activation switch 50, a camera 54, an acceleration sensor 56, a yaw rate sensor 58, a steering angle sensor 60, a vehicle speed sensor 62, and a torque sensor 64.
The deviation avoidance apparatus 10 is a well-known computer that includes a CPU and memories such as a RAM and a ROM. The deviation avoidance apparatus 10 performs a deviation avoidance process, described later, by executing a program stored in the memory. Executing this program causes a method corresponding to the program to be performed. The deviation avoidance apparatus 10 may be configured by one or more microcomputers.
In the following description, a vehicle equipped with the deviation avoidance apparatus 10 will be referred to as an own vehicle. It is noted that the memory stores in advance a plurality of kinds of icons. The icons are simple, symbolized pictures. Specifically, the icons include images of a white line as a boundary, a pedestrian, a vehicle, a guard rail, suitability boundaries described later, and the like. These elements of the deviation avoidance apparatus 10 need not necessarily be implemented by software; some or all of the elements may be implemented by hardware combining logic circuits and analog circuits.
The deviation avoidance apparatus 10 functionally includes a boundary detection section 12, a deviation prediction section 14, an object detection section 16, a command value adjustment section 18, an object parameter recognition section 20, a generation control section 22, and a deviation avoidance section 24. The functions of the sections of the deviation avoidance apparatus 10 will be described later.
The traveling control apparatus 30 acquires steering torque generated by the driver's operation of the steering wheel from the torque sensor 64 and acquires the vehicle speed of an own vehicle 100 from the vehicle speed sensor 62. The traveling control apparatus 30 then calculates, based on the steering torque and the vehicle speed, assist torque to be output from the steering motor 32 to assist the steering operation of the driver. The traveling control apparatus 30 controls the power distribution to the steering motor 32 in accordance with the calculation result, thereby controlling the amount of assist applied when the driver turns the steering wheel.
To avoid the deviation of the own vehicle from the traveling lane in which the own vehicle is traveling, the traveling control apparatus 30 controls the amount of power distribution to the steering motor 32 by a command issued from the deviation avoidance apparatus 10 to control the traveling state of the own vehicle. The steering motor 32 corresponds to a steering actuator that drives a steering mechanism to change the traveling direction of the own vehicle.
The traveling control apparatus 30 controls not only the power distribution to the steering motor 32 but also a brake system and a power train system, which are not shown, to control the traveling state of the own vehicle. The traveling state of the own vehicle includes longitudinal and lateral vehicle speeds of the own vehicle, a lateral position of the own vehicle in the traveling lane, and longitudinal and lateral accelerations of the own vehicle.
The deviation avoidance activation switch 50 is provided on a front panel, for example. When the deviation avoidance activation switch 50 is turned on, the deviation avoidance apparatus 10 starts the deviation avoidance process. At this time, the performance of the deviation avoidance assist is indicated on the display 40. It is noted that the display 40 may be a display of a navigation system, which is not shown, or may be a display dedicated to the deviation avoidance process.
The camera 54 images an area ahead of the own vehicle 100. The deviation avoidance apparatus 10 analyzes image data of the image captured by the camera 54. The acceleration sensor 56 detects longitudinal and lateral accelerations of the own vehicle 100. The yaw rate sensor 58 detects a turning angular velocity of the own vehicle 100.
The steering angle sensor 60 detects a steering angle of a steering wheel (not shown). The vehicle speed sensor 62 detects a current vehicle speed of the own vehicle 100. The torque sensor 64 detects torque generated by steering operation of the driver.
The deviation avoidance process performed by the deviation avoidance apparatus 10 will be described. The deviation avoidance process is performed at predetermined time intervals when the deviation avoidance activation switch 50 is turned on.
In the deviation avoidance process, as described in the flowchart, first, in S10, the deviation avoidance apparatus 10 acquires various parameters based on the image data captured by the camera 54 and the detection results of the sensors.
For example, the object detection section 16 detects the distance between the own vehicle 100 and an object based on the position of the lower end of the object in the image captured by the camera 54. The distance between the own vehicle 100 and the object is determined to be longer as the lower end of the object is positioned higher in the captured image. In addition, the object detection section 16 determines the kind of the object by, for example, pattern matching using a dictionary of object models stored in advance.
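As an illustration of this relationship, the following sketch estimates the distance from the image row of the object's lower end under a flat-road, pinhole-camera assumption. The camera height, focal length, and horizon row are hypothetical values chosen for the example, not parameters taken from the embodiment.

```python
def distance_from_lower_end(row_px, horizon_row=240.0,
                            focal_px=800.0, cam_height_m=1.2):
    """Estimate the ground distance to an object from the pixel row of its
    lower end, assuming a flat road and a forward-looking pinhole camera.

    row_px      : image row of the object's lower end (larger = lower in image)
    horizon_row : row where the road surface meets the horizon
    focal_px    : focal length expressed in pixels
    cam_height_m: mounting height of the camera above the road
    """
    dv = row_px - horizon_row
    if dv <= 0:
        return float("inf")  # at or above the horizon: treat as very far
    # Similar triangles: distance is inversely proportional to (row - horizon).
    return focal_px * cam_height_m / dv

# The lower the object's bottom edge appears in the image, the shorter the distance.
print(distance_from_lower_end(400))  # closer object (about 6 m here)
print(distance_from_lower_end(260))  # farther object (about 48 m here)
```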
In addition, the object parameter recognition section 20 keeps track of the position and type of the object in a time-series manner to recognize a relative movement vector of the object with respect to the own vehicle. The object parameter recognition section 20 also recognizes the distance between the object and the boundary of the traveling lane, that is, to what degree the object is separated outward from the boundary. In S10, the deviation avoidance apparatus 10 acquires, as the various parameters, the positions of the boundaries, the position and type of the object, the relative movement vector, the distance between the object and the boundary of the traveling lane, and the like.
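The relative movement vector can be pictured as the frame-to-frame change of the object's position expressed relative to the own vehicle. The sketch below illustrates that idea; the class and field names are placeholders, not names used by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str   # e.g. "pedestrian", "vehicle", "guard_rail"
    x: float    # lateral position relative to the own vehicle [m]
    y: float    # longitudinal position relative to the own vehicle [m]

def relative_movement_vector(prev: TrackedObject, curr: TrackedObject, dt: float):
    """Relative velocity of the object with respect to the own vehicle,
    computed from two time-series observations taken dt seconds apart."""
    return ((curr.x - prev.x) / dt, (curr.y - prev.y) / dt)

prev = TrackedObject("vehicle", x=1.8, y=20.0)
curr = TrackedObject("vehicle", x=1.7, y=18.5)
print(relative_movement_vector(prev, curr, dt=0.1))  # (-1.0, -15.0) m/s: closing in
```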
Then, in S20, the boundary detection section 12 determines whether the boundaries of the traveling lane 200 in which the own vehicle 100 is traveling have been successfully detected. The boundaries of the traveling lane 200 define both ends in the width direction of the traveling lane 200.
Referring to the drawings, when white lines exist on the both ends in the width direction of the traveling lane 200, the boundaries of the traveling lane 200 are defined by the inner ends of the white lines, such as the inner end 210a of the white line 210.
In the example shown in the drawings, the center line 214 exists on one side of the traveling lane 200, and no white line exists on the other side. When the own vehicle 100 travels on the right side of the road in this example, the boundary on the left side of the own vehicle 100 is the suitability boundary 222 between the paved surface and the unsuitable section 220 for traveling.
The suitability boundary 222 between the paved surface and the unsuitable section 220 for traveling is recognized, for example, based on the analysis of the image data by the boundary detection section 12 or the object detection section 16. Of the both ends in the width direction of the traveling lane 200, the boundary on the right side of the own vehicle 100 is defined by the inner end 214a of the center line 214.
In this manner, when no white line exists on at least one of the both ends in the width direction of the traveling lane 200, the boundary between the section suitable for traveling of the own vehicle 100 and the unsuitable section 220 for traveling of the own vehicle 100 on that end is set as the suitability boundary 222, that is, a boundary of the traveling lane 200 defined by suitability for traveling.
The section suitable for traveling of the own vehicle 100 refers to a paved surface or a road surface that is not paved but is leveled to a degree that allows the own vehicle 100 to travel. The unsuitable section 220 for traveling of the own vehicle 100 refers to a section where the own vehicle 100 cannot travel or has difficulty traveling because of a structure such as a wall, a building, a guard rail, lane-defining poles, a groove, a step, a cliff, or a sandy area.
The boundary detection section 12 detects the width of the traveling lane 200 as well as the boundaries of the traveling lane 200. The boundary detection section 12 further detects the coordinates of the boundaries of the traveling lane 200 within the range of the image captured by the camera 54. The boundary detection section 12 then calculates a curvature of the traveling lane 200 based on the coordinates of the boundaries. The boundary detection section 12 may acquire a curvature of the traveling lane 200 based on map information of a navigation system, which is not shown.
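One common way to obtain a curvature from detected boundary coordinates is to fit a low-order polynomial to the boundary points and evaluate its curvature at the vehicle position. The following NumPy sketch shows that approach as an assumed illustration, not the specific calculation performed by the boundary detection section 12.

```python
import numpy as np

def lane_curvature(xs, ys):
    """Fit x = a*y^2 + b*y + c to boundary points given in road coordinates
    (y: distance ahead [m], x: lateral offset [m]) and return the curvature
    1/R evaluated at y = 0 (the own-vehicle position)."""
    a, b, _ = np.polyfit(ys, xs, 2)
    return abs(2 * a) / (1 + b ** 2) ** 1.5

# Boundary points of a gently curving lane edge.
ys = np.array([0.0, 10.0, 20.0, 30.0, 40.0])
xs = 0.001 * ys ** 2           # corresponds to a curvature of about 0.002 1/m
print(lane_curvature(xs, ys))  # ~0.002
```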
The boundary detection section 12 further detects, for example, a lateral position of the own vehicle 100 with respect to the boundaries or center line of the traveling lane 200 as a reference point of the traveling lane 200, based on the image data.
In S20, when the boundary detection section 12 cannot detect the boundaries of the traveling lane 200, the present process proceeds to S230. In S230, the deviation avoidance section 24 instructs the traveling control apparatus 30 to stop the deviation avoidance control for avoiding the deviation of the own vehicle 100 to the outside of the traveling lane 200, and then the present process is terminated. Instructing the traveling control apparatus 30 to stop the deviation avoidance control includes the case of simply causing the traveling control apparatus 30 to continue the current traveling control when the deviation avoidance control is not being performed.
For example, on a traveling lane where a white line is discontinued or not present, when the boundary between the paved surface and the unpaved surface cannot be detected either, the boundary detection section 12 determines that the boundary of the traveling lane cannot be detected.
In S20, when the boundaries of the traveling lane 200 can be detected, the present process proceeds to S30. In S30, the generation control section 22 generates an image representing the recognition state of white lines, which are one form of the boundaries, and displays the generated image on the display 40. For example, when the white lines on the right and left sides of the traveling lane can be recognized, as shown in the drawings, the generation control section 22 displays a white line icon 71 for each of the right and left sides on the display 40.
When one of the right and left white lines cannot be recognized, the generation control section 22 displays, for the unrecognized side, an image different from the white line icon 71, for example, a line narrower than the white line icon 71, on the display 40. That is, the generation control section 22 separately generates the image representing the recognition state of the white line on the right side of the own vehicle and the image representing the recognition state of the white line on the left side of the own vehicle, and displays the images on the display 40. The images displayed on the display 40 constitute position images representing the positions of the white lines and objects.
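A minimal sketch of the left/right recognition-state display described above. The icon identifiers are placeholders standing in for the white line icon 71 and the narrower substitute image.

```python
def boundary_icons(left_recognized: bool, right_recognized: bool):
    """Choose an icon per side: the white line icon for a recognized white
    line, and a narrower line for a side whose white line is not recognized."""
    recognized, unrecognized = "white_line_icon_71", "narrow_line_icon"
    return {
        "left":  recognized if left_recognized else unrecognized,
        "right": recognized if right_recognized else unrecognized,
    }

print(boundary_icons(True, True))    # both sides recognized
print(boundary_icons(False, True))   # only the right white line recognized
```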
Then, in S40, the deviation prediction section 14 determines whether the own vehicle 100 will deviate depending on whether the own vehicle 100 has reached a control start position where the deviation avoidance section 24 causes the traveling control apparatus 30 to start the deviation avoidance control. The control start position defines the timing for the traveling control apparatus 30 to start the deviation avoidance control.
The control start position is determined from a map as a distance measured inward from the boundary on the deviation side of the traveling lane 200, using, for example, the lateral speed of the own vehicle 100, the curvature of the traveling lane 200, the width of the traveling lane 200, and the like as parameters.
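The control start position can be thought of as a table lookup keyed by quantities such as the lateral speed and the curvature. The sketch below interpolates such a map; the table values are made-up placeholders, not calibration data from the embodiment.

```python
import numpy as np

# Hypothetical map: rows = lateral speed [m/s], columns = curvature [1/m].
lat_speeds = np.array([0.0, 0.3, 0.6])
curvatures = np.array([0.0, 0.002, 0.005])
start_dist = np.array([   # distance inward from the deviation-side boundary [m]
    [0.20, 0.25, 0.30],
    [0.35, 0.40, 0.45],
    [0.50, 0.55, 0.60],
])

def control_start_position(lat_speed, curvature):
    """Bilinear interpolation of the control start position from the map."""
    col = np.array([np.interp(curvature, curvatures, row) for row in start_dist])
    return float(np.interp(lat_speed, lat_speeds, col))

print(control_start_position(0.45, 0.003))  # control starts earlier when drifting faster
```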
When it is determined in S40 that the own vehicle 100 has not reached the control start position 300, the present process proceeds to S230. In S230, the deviation avoidance section 24 causes the traveling control apparatus 30 to stop the deviation avoidance control, and then the present process is terminated.
When it is determined in S40 that the own vehicle 100 has reached the control start position 300, the deviation prediction section 14 predicts that the own vehicle 100 will deviate to the outside of the traveling lane 200. In this case, in S50 and S60, the deviation prediction section 14 determines whether any object exists on or outside the boundary on the deviation side.
When it is determined in S50 that no object exists on or outside the boundary on the deviation side, the present process proceeds to S70 described later. When it is determined in S50 that an object exists on or outside the boundary on the deviation side, the present process proceeds to S60, in which the deviation prediction section 14 determines the distance between the object and the boundary of the traveling lane, that is, to what degree the object is separated outward from the boundary. Specifically, the deviation prediction section 14 determines whether the distance between the object and the boundary is equal to or more than a permitted distance, that is, a distance beyond which the own vehicle 100 may be allowed to deviate to the outside of the boundary in the same manner as when no object exists on or outside the boundary. In the present embodiment, the permitted distance is set to 45 cm.
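A compact sketch of the S50/S60 decision described above, using the 45 cm permitted distance from the embodiment; the function and argument names are illustrative.

```python
from typing import Optional

PERMITTED_DISTANCE_M = 0.45  # permitted distance of 45 cm from the embodiment

def deviation_side_check(object_present: bool, object_to_boundary_m: Optional[float]):
    """Return which branch the process takes after the deviation prediction:
    'no_object'   -> S70 (no object on or outside the deviation-side boundary)
    'object_far'  -> S70 (object present but at least the permitted distance away)
    'object_near' -> S110 (object closer than the permitted distance)
    """
    if not object_present:
        return "no_object"
    if object_to_boundary_m >= PERMITTED_DISTANCE_M:
        return "object_far"
    return "object_near"

print(deviation_side_check(False, None))   # no_object
print(deviation_side_check(True, 0.60))    # object_far
print(deviation_side_check(True, 0.20))    # object_near
```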
When it is determined in S60 that the distance between the object and the boundary is equal to or more than the permitted distance, the present process proceeds to S70. In S70, the object parameter recognition section 20 determines whether the detected boundary of the traveling lane 200 on the deviation side is a white line. In this process, the white line includes a center line and a yellow line.
When it is determined in S70 that the boundary is a white line, the present process proceeds to S80. In S80, the object parameter recognition section 20 sets a command value for commanding the traveling control apparatus 30 to avoid the deviation of the own vehicle 100. For example, as shown in the drawings, the object parameter recognition section 20 sets the target maximum movement position 310 to a position whose distance D with respect to the boundary is "the boundary + 30 cm", where the boundary is the inner end 210a of the white line 210 on the deviation side. The plus sign of +30 cm indicates a position outside the traveling lane 200 with respect to the inner end 210a of the white line 210 on the deviation side. Upon completion of this step, the present process proceeds to S240.
When it is determined in S70 that the boundary is other than a white line, the object parameter recognition section 20 sets a command value for commanding the traveling control apparatus 30 to avoid the deviation of the own vehicle 100. For example, as shown in the drawings, the object parameter recognition section 20 sets the target maximum movement position 310 to a position whose distance D with respect to the boundary is "the boundary − L3 cm", where the boundary is the suitability boundary 222 on the deviation side, and the present process proceeds to S240. Since L3 is a positive value, the set target position 310 indicates a position inside the traveling lane 200 with respect to the suitability boundary 222 on the deviation side. L3 cm is set to, for example, 5 cm.
In contrast, in S60, when the distance between the object and the boundary is less than the permitted distance, the present process proceeds to S110, in which the object detection section 16 determines whether the object is a pedestrian.
When it is determined in S110 that the object is not a pedestrian, the present process proceeds to S120, in which the object detection section 16 determines whether the object is a vehicle. When the object is a vehicle, the object parameter recognition section 20 determines whether the vehicle is a parked vehicle, a parallel vehicle traveling in the same direction as that of the own vehicle, or an oncoming vehicle that is traveling in the opposite direction of the own vehicle, based on the relative speed between the own vehicle and the object.
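The classification into parked, parallel, and oncoming vehicles can be illustrated with a simple rule on the other vehicle's speed reconstructed from the relative speed; the tolerance below is an assumption for the sketch, not a value stated in the embodiment.

```python
def classify_vehicle(own_speed_mps: float, relative_speed_mps: float,
                     tol: float = 1.0):
    """Classify another vehicle from the relative speed acquired by the
    object parameter recognition section 20.

    relative_speed_mps: other vehicle's speed minus the own vehicle's speed,
                        measured along the own vehicle's traveling direction.
    The other vehicle's ground speed is then own_speed + relative_speed:
      roughly zero -> parked vehicle
      positive     -> parallel vehicle (same direction)
      negative     -> oncoming vehicle (opposite direction)
    """
    other_speed = own_speed_mps + relative_speed_mps
    if abs(other_speed) <= tol:
        return "parked"
    return "parallel" if other_speed > 0 else "oncoming"

print(classify_vehicle(own_speed_mps=15.0, relative_speed_mps=-15.0))  # parked
print(classify_vehicle(own_speed_mps=15.0, relative_speed_mps=-3.0))   # parallel
print(classify_vehicle(own_speed_mps=15.0, relative_speed_mps=-30.0))  # oncoming
```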
In S120, when the object is a vehicle, the process proceeds to S130, in which the generation control section 22 reads a vehicle icon 72, which is a picture representing a vehicle, from the memory and displays it on the display 40. More specifically, as shown in the drawings, the vehicle icon 72 is displayed together with the white line icon 71, and a different image is used depending on whether the detected vehicle is a parallel vehicle or a non-parallel vehicle.
Then, in S140, the object parameter recognition section 20 sets a command value for commanding the traveling control apparatus 30 to avoid the deviation of the own vehicle 100. For example, the object parameter recognition section 20 sets the target maximum movement position 310 to a position whose distance D with respect to the boundary is "the boundary − L2 cm", where the boundary is the inner end 210a of the white line 210 on the deviation side, and the present process proceeds to S240. L2 is a positive value, and the relationship L1 > L2 > L3 is established. L2 cm is set to, for example, 10 cm.
When it is determined in S120 that the object is not a vehicle, the present process proceeds to S150 to perform a boundary display process. The boundary display process is a process for displaying an image in accordance with the type of an object other than a vehicle or a pedestrian.
In the boundary display process, as shown in the flowchart, first, in S310, the object parameter recognition section 20 determines whether the detected object is a guard rail. When the object is a guard rail, the generation control section 22 displays an image representing a guard rail on the display 40.
As the image representing a guard rail, when a white line and a guard rail are detected on one side of the vehicle as shown in the drawings, an icon representing the guard rail is displayed together with the white line icon 71 for that side on the display 40.
When it is determined in S310 that the detected object is not a guard rail, the present process proceeds to S330, in which the object parameter recognition section 20 determines whether the object is another solid object. Another solid object refers to the above-described unsuitable section 220 for traveling of the own vehicle 100.
When it is determined in S330 that the object is another solid object, the present process proceeds to S340. In S340, the generation control section 22 displays an image representing the suitability boundary 222 on the display 40, and then the boundary display process is terminated.
In a possible situation where the suitability boundary 222 is displayed, for example, a grass field or the like is present on the left end of the road, as shown in the drawings.
Next, the description returns to the deviation avoidance process shown in the flowchart.
When it is determined in S110 that the object is a pedestrian 110, the present process proceeds to S210. In S210, the generation control section 22 displays an image representing a pedestrian on the display 40. For example, as shown in the drawings, the generation control section 22 reads a pedestrian icon 76, which is a picture representing a pedestrian, from the memory and displays it on the display 40.
Both the pedestrian icon 76 and the white line icon 71 are displayed, for example, only when a person such as a pedestrian is located within 45 cm from the white line, as shown in the drawings.
Next, in S220, the object parameter recognition section 20 sets a command value for commanding the traveling control apparatus 30 to avoid the deviation of the own vehicle 100. For example, the object parameter recognition section 20 sets the target maximum movement position 310 to a position whose distance D with respect to the boundary is "the boundary − L1 cm", where the boundary is the inner end 210a of the white line 210 on the deviation side, and the present process proceeds to S240. L1 is a positive value, and the relationship L1 > L3 is established. L1 cm is set to, for example, 15 cm.
Then, in S240, the generation control section 22 provides an under-control indication. The under-control indication is an indication that the deviation avoidance control is being performed, as shown in the drawings.
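Taken together, S80, S140, S220, and the non-white-line case choose how far the target maximum movement position 310 may lie relative to the deviation-side boundary. The sketch below summarizes that selection using the example values from the embodiment (+30 cm, and L1 = 15 cm > L2 = 10 cm > L3 = 5 cm); the sign convention (positive meaning outside the traveling lane) and the function names are assumptions for illustration.

```python
from typing import Optional

L1_M, L2_M, L3_M = 0.15, 0.10, 0.05   # example values from the embodiment (L1 > L2 > L3)

def target_max_offset(boundary_is_white_line: bool, nearby_object: Optional[str]):
    """Signed offset of the target maximum movement position 310 from the
    deviation-side boundary (positive = outside the traveling lane).

    nearby_object is None when no object is within the permitted distance,
    otherwise 'vehicle' or 'pedestrian'.
    """
    if nearby_object == "pedestrian":
        return -L1_M            # S220: keep the position farthest inside the lane
    if nearby_object == "vehicle":
        return -L2_M            # S140
    if boundary_is_white_line:
        return +0.30            # S80: up to 30 cm outside the white line
    return -L3_M                # non-white-line boundary (suitability boundary)

print(target_max_offset(True,  None))          # +0.30
print(target_max_offset(False, None))          # -0.05
print(target_max_offset(True,  "vehicle"))     # -0.10
print(target_max_offset(True,  "pedestrian"))  # -0.15
```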
Next, in S250, the deviation avoidance section 24 commands the traveling control apparatus 30 to set a target line 320 on which the own vehicle 100 travels during the deviation avoidance process. The traveling control apparatus 30 performs the deviation avoidance control with feedback control on power distribution to the steering motor 32 so that the own vehicle 100 can run on the commanded target line 320.
When a person is detected within a predetermined distance from the white line, the deviation avoidance section 24 performs offset control to move the lateral position of the own vehicle to the side away from the person within the traveling lane. In this case, as shown in the drawings, the target line 320 is set at a position offset toward the side away from the person within the traveling lane 200.
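A loose sketch of the offset control and the feedback toward the commanded target line 320. The 45 cm trigger reuses the permitted distance of the embodiment, while the offset magnitude, gain, and sign conventions are assumptions.

```python
def target_line_offset(person_detected: bool, person_to_line_m: float,
                       trigger_m: float = 0.45, offset_m: float = 0.30):
    """Lateral offset of the target line 320 from the lane center, moving the
    own vehicle away from a person detected near the white line."""
    if person_detected and person_to_line_m < trigger_m:
        return -offset_m   # negative = shift toward the side away from the person
    return 0.0

def steering_command(lateral_error_m: float, gain: float = 0.8):
    """Very simple proportional feedback; the traveling control apparatus 30
    would translate such a command into power distribution to the steering motor 32."""
    return gain * lateral_error_m

target = target_line_offset(person_detected=True, person_to_line_m=0.30)
current_lateral_position = 0.10  # own vehicle 10 cm right of the lane center
print(steering_command(target - current_lateral_position))  # steer toward the offset line
```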
According to the first embodiment described above in detail, the following advantageous effects can be obtained.
(1a) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the boundary detection section 12 acquires the positions of the boundary portions defining the both width-wise ends of the traveling lane in which the own vehicle is traveling, and the object detection section 16 acquires the position of an object around the traveling lane. The generation control section 22 generates the position image, which is an image representing the positions of the boundary portions and the position of the object, and displays the position image on the display device.
According to the deviation avoidance system 2, the position image indicates the positions of the boundary portions and the position of the object, which allows the passenger to favorably recognize the positional relationship between the boundary portions and the object. That is, it is possible to display more items as compared to the conventional technique of displaying an image representing only the positions of boundary portions.
(1b) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the position image includes an image indicating whether the positions of the boundary portions have been successfully acquired.
According to the deviation avoidance system 2, it is possible to allow the passenger to recognize whether the positions of the boundary portions have been successfully acquired.
(1c) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the positions of the boundary portions on the right and left sides of the traveling lane are acquired, and the position image includes an image indicating whether the position of the boundary portion on the right side of the traveling lane and the position of the boundary portion on the left side of the traveling lane have been successfully acquired.
According to the deviation avoidance system 2, it is possible to allow the passenger to recognize whether the respective positions of the right and left boundary portions have been successfully acquired.
(1d) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the movement direction of the object is recognized and the position image includes an image representing the movement direction of the object.
According to the deviation avoidance system 2, it is possible to allow the passenger to recognize the movement direction of the object.
(1e) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the type of the object is recognized and an image representing the type of the object is used to indicate the position of the object.
According to the deviation avoidance system 2, the image corresponding to the type of the recognized object is displayed, which allows the passenger to recognize the type of the object recognized by the display control apparatus.
(1f) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the relative speed between the own vehicle and the object is recognized, and it is determined whether the object is a vehicle. When the object is a vehicle, it is determined whether the recognized vehicle is a parallel vehicle traveling in the same direction as that of the own vehicle or a non-parallel vehicle traveling in a direction different from that of the own vehicle, based on the relative speed. Then, when the recognized vehicle is a parallel vehicle, an image representing the parallel vehicle is generated, or when the recognized vehicle is a non-parallel vehicle, an image representing the non-parallel vehicle different from the image representing the parallel vehicle is generated. The position image includes the image representing the parallel vehicle or the non-parallel vehicle.
According to the deviation avoidance system 2, when the object is a vehicle, a different image can be displayed in accordance with the running direction of the vehicle. This allows the passenger to recognize that the acquired object is a vehicle and the traveling direction of the vehicle.
(1g) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, it is recognized whether the object is a person, and when the object is recognized as a person, an image representing a pedestrian is generated, and the position image includes an image representing a pedestrian.
According to the deviation avoidance system 2, it is possible to allow the passenger to recognize that the acquired object is a person.
(1h) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, the position image is generated by combining an object icon graphically representing an object and a boundary icon graphically representing a boundary portion.
According to the deviation avoidance system 2, the prepared icons are combined to reduce the process load of generating the image.
(1i) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, as the boundary portion, the recognition result of the suitability boundary indicating the boundary between the unsuitable section 220, which is an unsuitable section for traveling of the own vehicle, and the traveling lane is acquired.
According to the deviation avoidance system 2, even when the both width-wise ends are not strictly defined, it is possible to acquire the boundary with the unsuitable section for traveling of the own vehicle as the suitability boundary.
(1j) In the deviation avoidance apparatus 10 of the deviation avoidance system 2, it is predicted whether the own vehicle will deviate from the traveling lane based on the traveling state of the own vehicle traveling on the traveling lane defined by the boundary portions. When the deviation prediction section predicts that the own vehicle will deviate from the traveling lane and an object exists on or outside the boundary portion on the side on which the own vehicle will deviate, the traveling control apparatus controlling the traveling state is commanded to suppress the deviation of the own vehicle from the traveling lane such that the maximum movement position, which the own vehicle reaches when moving to the deviation side, is located more inward in the traveling lane than when no object exists on or outside that boundary portion. The inward side refers to the direction in which the own vehicle comes closer to the desired traveling position as seen in the lateral direction of the traveling lane.
According to the deviation avoidance system 2, at the time of changing the traveling track of the vehicle to fall more inside the traveling lane under the control of suppressing the deviation of the own vehicle from the traveling lane due to the presence of an object around the boundary portion of the traveling lane, it is possible to notify the passenger of the performance of such control by display of the position image.
A second embodiment is basically similar in configuration to the first embodiment, and descriptions of the common components will be omitted and differences will be mainly described. The same reference signs as those of the first embodiment indicate the same components as those of the first embodiment, and the foregoing descriptions thereof are incorporated by reference.
The second embodiment is different from the first embodiment in that, in the deviation avoidance process, the mode of image display is set in consideration of the degree of psychological pressure on the driver, in other words, the degree of psychological margin in the driver.
With reference to the flowchart shown in the drawings, the deviation avoidance process of the second embodiment will be described. In this process, the degree of psychological pressure on the driver is first calculated, and the mode of displaying an image is set accordingly before S20 and the subsequent steps are performed.
The degree of psychological pressure is a numerical value representing the fear felt by the driver of the own vehicle toward the presence of another vehicle. The degree of psychological pressure is calculated, for example, by using the distance from the object such as another vehicle and the vehicle speed, that is, the speed of the own vehicle.
Specifically, as shown in the drawings, the degree of psychological pressure is calculated by using a map that defines the degree of psychological pressure in accordance with the distance from the other vehicle and the vehicle speed of the own vehicle. In the map shown in the drawings, the degree of psychological pressure is set to a larger value as the distance from the other vehicle becomes shorter and as the vehicle speed of the own vehicle becomes higher.
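A sketch of computing the degree of psychological pressure from the distance to the other vehicle and the own vehicle speed. The grid values, and the monotonic relationship encoded in them, are illustrative assumptions; the actual map of the embodiment is defined in the drawings.

```python
import numpy as np

# Hypothetical pressure map: rows = distance to the other vehicle [m],
# columns = own vehicle speed [km/h]; values are a unitless pressure degree.
distances_m  = np.array([2.0, 5.0, 10.0, 20.0])
speeds_kmh   = np.array([20.0, 40.0, 60.0, 80.0])
pressure_map = np.array([
    [40, 60, 80, 100],
    [25, 40, 60,  80],
    [10, 25, 40,  60],
    [ 5, 10, 25,  40],
])

def psychological_pressure(distance_m: float, speed_kmh: float) -> float:
    """Bilinear interpolation of the degree of psychological pressure."""
    by_speed = np.array([np.interp(speed_kmh, speeds_kmh, row) for row in pressure_map])
    return float(np.interp(distance_m, distances_m, by_speed))

print(psychological_pressure(3.0, 70.0))   # close and fast -> high pressure
print(psychological_pressure(18.0, 30.0))  # far and slow   -> low pressure
```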
Subsequently, in S420, the mode of displaying the vehicle on the display 40 is set. In this process, the display mode is set by using a map for setting the display mode based on the speed relative to the other vehicle and the calculated degree of psychological pressure. That is, as illustrated in the drawings, emphasized display is selected when the calculated degree of psychological pressure is high, for example, when a value indicating the burden on the driver exceeds a threshold; otherwise, normal display is selected.
When the display mode is set to emphasized display, the display of a flashing vehicle icon 81 is set, as shown in the drawings.
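A sketch of the display mode setting described above: the mode is chosen from the degree of psychological pressure and the relative speed, and emphasized display flashes the vehicle icon 81 or switches to a warning color. The thresholds and the normal-display settings are assumptions.

```python
def set_display_mode(pressure: float, relative_speed_mps: float,
                     pressure_threshold: float = 60.0,
                     closing_threshold_mps: float = -5.0):
    """Return display settings for the other-vehicle icon on the display 40."""
    emphasized = (pressure > pressure_threshold
                  or relative_speed_mps < closing_threshold_mps)  # closing in fast
    if emphasized:
        # Emphasized display: flashing vehicle icon 81 in a warning color.
        return {"icon": "vehicle_icon_81", "flashing": True, "color": "red"}
    # Normal display: steady vehicle icon 72.
    return {"icon": "vehicle_icon_72", "flashing": False, "color": "white"}

print(set_display_mode(pressure=75.0, relative_speed_mps=-2.0))  # emphasized display
print(set_display_mode(pressure=30.0, relative_speed_mps=-1.0))  # normal display
```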
Upon completion of the above process, S20 and the subsequent steps are performed as described above.
According to the second embodiment described above in detail, the following advantageous effects can be obtained in addition to the advantageous effect (1a) of the first embodiment.
(2a) In the configuration of the second embodiment, the degree of psychological pressure on the driver of the own vehicle is estimated and the mode of image display is changed depending on the degree of psychological pressure. When the degree of psychological pressure is high and the value indicating the burden on the driver of the own vehicle exceeds a threshold, the display mode is changed to attract the driver's attention such that the icon of the vehicle is flashed or the display color is changed to a warning color (for example, yellow or red).
According to the above configuration, it is possible to allow the driver to recognize an object with a high degree of psychological pressure through images.
The embodiments for implementing the present invention have been described. However, the present invention is not limited to the foregoing embodiments and can be implemented in various forms.
(3a) The deviation avoidance apparatus 10 may be configured such that the distance between the object icon and the boundary icon in the position image becomes longer as the distance between the acquired position of the object and the position of the boundary portion becomes longer. The object icon refers to an icon representing an object such as a vehicle or a pedestrian, and the boundary icon refers to an icon representing a white line or a suitability boundary.
For example, as illustrated in the drawings, when the distance between the object and the boundary portion is long, the object icon is displayed at a position farther from the boundary icon than when the distance is short.
According to the deviation avoidance system 2, it is possible to express the distance between the object icon and the boundary icon by the position image.
(3b) The deviation avoidance apparatus 10 may be configured to generate an image representing the distance between an object and a boundary portion as a numerical value and to include the image representing the numerical distance in the position image. For example, as illustrated in the drawings, the distance between the object and the boundary portion is displayed as a numerical value in the position image.
According to the deviation avoidance system 2, it is possible to recognize the distance between an object and a boundary portion by a numeric value in the position image.
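A sketch combining (3a) and (3b): the horizontal gap between the boundary icon and the object icon is scaled with the acquired distance, and the same distance is also rendered as a numeric label. The pixel scale and label format are assumptions for illustration.

```python
def position_image_layout(object_to_boundary_m: float,
                          boundary_x_px: int = 100,
                          px_per_m: float = 80.0,
                          max_gap_px: int = 160):
    """Place the object icon farther from the boundary icon as the actual
    distance grows (3a), and attach the distance as a numeric label (3b)."""
    gap_px = min(int(object_to_boundary_m * px_per_m), max_gap_px)
    return {
        "boundary_icon_x": boundary_x_px,
        "object_icon_x": boundary_x_px + gap_px,
        "label": f"{object_to_boundary_m * 100:.0f} cm",
    }

print(position_image_layout(0.45))  # wider gap, label "45 cm"
print(position_image_layout(0.10))  # narrower gap, label "10 cm"
```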
(3c) The function of one component in the above embodiment may be distributed to a plurality of components, or the functions of a plurality of components in the embodiment may be integrated into one component. Some of the components in the embodiment may be omitted. At least some of the components in the embodiment may be added to or replaced with components in the foregoing other embodiments.
(3d) Besides the foregoing deviation avoidance system, the present invention can be implemented in various modes such as an apparatus serving as a component of the deviation avoidance system, a program for allowing a computer to function as the deviation avoidance system, a non-transitory tangible recording medium such as a semiconductor memory recording the program, and a deviation avoidance method.
The deviation avoidance apparatus 10 in the foregoing embodiments corresponds to a display control apparatus in the present invention. The boundary detection section 12 in the foregoing embodiments corresponds to a boundary acquisition section in the present invention. The object detection section 16 in the foregoing embodiments corresponds to an object acquisition section in the present invention. The object parameter recognition section 20 in the foregoing embodiments corresponds to a movement recognition section, an object type recognition section, and a relative speed recognition section in the present invention.
In the display control apparatus (10) of the foregoing embodiment, the boundary acquisition section (12) acquires the positions of the boundary portions defining the both width-wise ends of the traveling lane (200) in which the own vehicle travels, and the object acquisition section (16) acquires the position of an object around the traveling lane. The generation control section (22) generates the position image, which is an image representing the positions of the boundary portions and the position of the object, and displays the position image on the display device.
According to the above display control apparatus, the position image indicates the positions of the boundary portions and the position of the object, which allows the passenger to favorably recognize the positional relationship between the boundary portions and the object. That is, it is possible to display more items as compared to the conventional technique for displaying an image representing the positions of boundary portions.
Number | Date | Country | Kind
---|---|---|---
JP2015-204596 | Oct 2015 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2016/080612 | 10/14/2016 | WO |

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2017/065297 | 4/20/2017 | WO | A

Number | Name | Date | Kind
---|---|---|---
20040193347 | Harumoto | Sep 2004 | A1
20070154068 | Stein | Jul 2007 | A1
20090112389 | Yamamoto | Apr 2009 | A1
20100123778 | Hada | May 2010 | A1
20100253593 | Seder et al. | Oct 2010 | A1
20120072097 | Ohta | Mar 2012 | A1
20120087546 | Focke | Apr 2012 | A1
20120154591 | Baur | Jun 2012 | A1
20120314055 | Kataoka | Dec 2012 | A1
20130054128 | Moshchuk | Feb 2013 | A1
20130197758 | Ueda | Aug 2013 | A1
20140032049 | Moshchuk | Jan 2014 | A1
20140226015 | Takatsudo | Aug 2014 | A1
20150103174 | Emura et al. | Apr 2015 | A1
20160098837 | Saiki | Apr 2016 | A1
20170132922 | Gupta | May 2017 | A1
20180170429 | Shimizu | Jun 2018 | A1

Number | Date | Country
---|---|---
102007027495 | Dec 2008 | DE
102013016242 | Apr 2015 | DE
2005-056372 | Mar 2005 | JP
2008-059458 | Mar 2008 | JP
2008059458 | Mar 2008 | JP
2009-083680 | Apr 2009 | JP
2010-173530 | Aug 2010 | JP
2013-120574 | Jun 2013 | JP
5316713 | Oct 2013 | JP
2014-133512 | Jul 2014 | JP
5616531 | Oct 2014 | JP
2015-096946 | May 2015 | JP

Number | Date | Country
---|---|---
20180322787 A1 | Nov 2018 | US