OBJECT AWARENESS DETERMINATION SYSTEM AND A METHOD FOR DETERMINING AWARENESS OF AN OBJECT

Information

  • Patent Application
  • Publication Number
    20080061999
  • Date Filed
    September 07, 2007
  • Date Published
    March 13, 2008
Abstract
An object awareness determination system includes an external object sensor system to sense objects and generate input data relating to the objects external to a host vehicle, wherein the input data include an object position (x, y), an object velocity |(ẋ, ẏ)| and an object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) associated with each object in said detecting range. Additionally, the system includes a controller for determining awareness of the user to an object that recently entered the detecting range of the external object sensor system, and a method for determining awareness of the object.
Description

BRIEF DESCRIPTION OF DRAWINGS

The invention will be described in further detail with reference to the appended drawings, where



FIG. 1 is a block scheme of an object awareness determination system according to the invention,



FIG. 2 shows a traffic situation including a host vehicle and a set of external objects,



FIG. 3 shows a recorded host vehicle movement track including recorded host vehicle position and host vehicle yaw angle and recorded direction of gaze of the user within a time span [t−n, t0] preceding a time of first entry (t0) of an external object into a detecting range of the sensor system 12 arranged on the host vehicle,





SPECIFICATION

The object awareness determination system will initially be described with reference to FIG. 1, which shows a block scheme of an object awareness determination system 10 according to the invention, and FIG. 2, which shows a traffic situation including a host vehicle and a set of external objects. The object awareness determination system 10 includes an external object sensor system 12 arranged on a host vehicle 14. The sensor system 12 is arranged to, within a detecting range 16 of the sensor system 12, sense objects and generate input data relating to objects 18 external to the host vehicle, wherein said input data include an object position (x, y), an object velocity |(ẋ, ẏ)| and an object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) associated with each object 18 in the detecting range 16. Sensor systems providing the relevant input data are commercially available and well known to persons skilled in the art. A suitable system is sold under the trade name Mobil Eye.


The object awareness determination system 10 furthermore includes a controller 20 for determining awareness of the user to an object 22 that has recently entered the detecting range 16 of the external object sensor system. The controller 20 is arranged to determine awareness of the object 22 which has recently entered the detecting range 16 of the sensor system 12 based on an assessed observation of the recently entered object by the user before the object entered the detecting range 16 of said external object sensor system. The expression "recently entered the detecting range" means that the object has entered the detecting range such that relevant input data concerning the object, including position and speed and optionally acceleration, may be determined. In some known sensor systems these input data may be determined at entry into the detecting range, while other systems rely on a plurality of samples in order to determine speed and acceleration. Since the accuracy of estimation of past trajectories decreases with increased distance from the time of observation, it is preferred to base the past trajectory estimation on an observation made as early as possible. For this reason, the past trajectory should preferably be based on an observation of the external object made at its entry into the detecting range, or at least on an early set of observations made by the sensor system. If the sensor system can only determine position, two consecutive observations are needed in order to determine velocity and three are needed in order to determine the acceleration of the external object. A suitable sampling rate for the sensor system is at least 10 Hz. At such a rate the position, velocity and acceleration may then be determined within approximately 0.2 seconds from entry into the detecting range. If the sensor system can detect the velocity of an object directly, the position, velocity and acceleration may be determined within approximately 0.1 seconds from entry into the detecting range at a sampling rate of 10 Hz.
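The finite-difference scheme implied above (two consecutive position samples for velocity, three for acceleration) can be sketched as follows. This is an illustration only, not the disclosed implementation; the function name and the default 10 Hz sample interval are assumptions:

```python
# Illustrative sketch: estimating an external object's velocity and
# acceleration from consecutive position samples of a position-only sensor.

def estimate_kinematics(positions, ts=0.1):
    """positions: chronological list of (x, y) samples taken every ts seconds
    (ts = 0.1 s corresponds to the 10 Hz rate mentioned in the text).
    Returns (position, velocity, acceleration) at the latest sample."""
    if len(positions) < 3:
        raise ValueError("three position samples are needed for acceleration")
    (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
    # First differences approximate velocity, second differences acceleration.
    vx, vy = (x2 - x1) / ts, (y2 - y1) / ts
    ax, ay = (x2 - 2 * x1 + x0) / ts ** 2, (y2 - 2 * y1 + y0) / ts ** 2
    return (x2, y2), (vx, vy), (ax, ay)
```

With three samples at 10 Hz, all three quantities are available about 0.2 s after the first detection, matching the timing stated above.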


The object awareness determination system 10 further includes an eye gaze monitor 24 arranged for determining the direction of gaze φrel of a user relative to the host vehicle. Eye gaze monitors 24 are well known in the art and are used to determine the direction of gaze of the user relative to the host vehicle. In order to determine the absolute direction of gaze φabs of the driver, the host vehicle yaw angle Ψ relative to the road must be determined. The host vehicle yaw angle Ψ and the host vehicle position (x, y) are determined by a host vehicle movement tracker 26 arranged on the host vehicle 14. The absolute direction of gaze can be calculated as φabs = φrel + Ψ. Host vehicle movement trackers are well known in the art. The host vehicle movement tracker 26 is arranged to determine past host vehicle position and past host vehicle yaw angle, preferably by use of the recorded past host vehicle yaw angle and a calculation based on the model below.
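The relation φabs = φrel + Ψ amounts to a single angle addition; a minimal sketch follows, with wrapping to (−π, π] added as an assumption (the patent does not specify an angle convention):

```python
import math

def absolute_gaze(phi_rel, yaw_psi):
    """Combine the driver's gaze direction relative to the vehicle (phi_rel)
    with the host vehicle yaw angle (yaw_psi) into an absolute gaze
    direction, phi_abs = phi_rel + psi. Angles in radians; the result is
    wrapped to (-pi, pi] via atan2 so repeated additions stay bounded."""
    phi_abs = phi_rel + yaw_psi
    return math.atan2(math.sin(phi_abs), math.cos(phi_abs))
```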


An eye gaze recorder 25 arranged for recording the direction of gaze of a user determined by the eye gaze monitor is included in the system.


A past trajectory estimator 28 is included in the object awareness determination system. The past trajectory estimator 28 is arranged to, after entry of an object into the detecting range 16 of the external object sensor system 12, estimate the past trajectory of the object which has recently entered the detecting range 16, within a time span [t−n, t0] preceding a time of first entry (t0) of said object into said detecting range.


The past trajectory estimator 28 is arranged to, for each object that has recently entered the detecting range, retrieve the object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0) from said external object sensor system, and to determine the past trajectory based on the object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0).


The past trajectory estimator 28 may estimate the object position (x, y)(t−i) at a point of time t−i within said time span [t−n, t0] preceding a time of first entry (t0) as:


(x, y)(t−i) = (x, y)(t0) − (ẋ, ẏ)(t0)Δt − (ẍ, ÿ)(t0)Δt²/2, where the acceleration (ẍ, ÿ)(t0) may be detected by the external object sensor system, calculated from changes in detected external object velocities, or neglected. Here t−i is a set of discrete points in time before an external object has entered the detecting range 16, which set of points in time forms a time span [t−n, t−1]. The time span is delimited by the end point t−n since the information available concerning the movement of the external object does not allow accurate estimation of the past position of the external object over very long time spans. Δt = t0 − t−i.
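The backward extrapolation above is a second-order Taylor step taken in reverse from the entry pose. A minimal sketch (the helper name is an assumption; components are handled per axis):

```python
def past_position(p0, v0, a0, dt):
    """Estimate the object position dt seconds BEFORE the time of first
    entry t0, using (x, y)(t-i) = (x, y)(t0) - v(t0)*dt - a(t0)*dt^2 / 2.
    p0, v0, a0 are (x, y) tuples of position, velocity and acceleration
    at t0; the acceleration may be passed as (0, 0) to neglect it."""
    return tuple(p - v * dt - a * dt * dt / 2 for p, v, a in zip(p0, v0, a0))
```

For example, an object first seen at x = 10 m moving at 5 m/s with 2 m/s² acceleration is placed at x = 10 − 5 − 1 = 4 m one second earlier.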


The controller 20 is arranged to determine awareness of an object 22 that has recently entered the detecting range based on an assessed observation, which observation is assessed by use of a past host vehicle position (x,y)host,past, a recorded host vehicle yaw angle ψrec, a recorded direction of gaze φrel,rec of the user, and an estimated past trajectory (x,y)obj,est of the object which has entered the detecting range 16 of the external object sensor system. The past trajectory of the host vehicle may be a recorded position given by a GPS sensor, a recorded position given by a vehicle tracking system, or calculated from data representing the motion of the vehicle. A calculation may be based on a Newtonian model as for the external object. Preferably, however, since the recorded vehicle yaw angle may be available, a calculation based on the following model should be used, since very accurate past host vehicle positions may then be obtained:







φ−i,rec = −Σ (k = 0 … −i) ψ̇abs,−k Ts

(x′, y′)−i;host,rec = Σ (k = 0 … −i) (cos φ−k, sin φ−k) v−k Ts
Here (x′, y′)−i;host,rec is the past position (x,y)host,past of the host vehicle at the discrete points in time t−i; Ts is the sample interval; ψ̇abs,−k is the absolute host vehicle yaw angle rate at time t−k; φ−i,rec is the yaw angle at time t−i; φ−k is the yaw angle at time t−k; and v−k is the host vehicle speed at time t−k.
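The model above is backward dead reckoning: the recorded yaw rate is integrated to recover past yaw angles, and the recorded speed is integrated along those headings to recover past positions. A sketch under stated assumptions (sample ordering, sign convention and function name are illustrative, not specified by the disclosure):

```python
import math

def host_track_backwards(yaw_rates, speeds, ts):
    """Reconstruct past host vehicle poses relative to the current pose,
    placed at the origin with zero heading. yaw_rates[k] and speeds[k] are
    the recorded samples k steps into the past; ts is the sample interval.

    Implements, step by step:
        phi_{-i}    = -sum_{k=0..i} yaw_rate_{-k} * ts
        (x, y)_{-i} =  sum_{k=0..i} (cos phi_{-k}, sin phi_{-k}) * v_{-k} * ts

    Returns a list of (x, y) past positions, most recent first."""
    phi, x, y, track = 0.0, 0.0, 0.0, []
    for rate, speed in zip(yaw_rates, speeds):
        phi -= rate * ts                 # accumulate yaw backwards in time
        x += math.cos(phi) * speed * ts  # step along the recovered heading
        y += math.sin(phi) * speed * ts
        track.append((x, y))
    return track
```

Driving straight at 10 m/s with 0.1 s samples, each backward step displaces the reconstructed position by 1 m along the heading axis.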


In FIG. 3 a recorded host vehicle movement track 30 including recorded host vehicle position and host vehicle yaw angle is shown, together with a past trajectory 34 of the external object 22 estimated by the past trajectory estimator 28. A set of recorded directions of gaze 32−5 to 32−1 of the user, within a time span [t−n, t−1] preceding the time of first entry (t0) of an external object into the detecting range of the sensor system 12 arranged on the host vehicle, is also indicated on the drawing. As may be noted, the absolute direction of gaze φabs corresponds to the direction of the external object for the recorded direction of gaze 32−5. The controller may thus determine that the driver observed the external object before its entry into the detecting range of the sensor system by using the recorded host vehicle position and yaw angle, the recorded direction of gaze of the driver and an estimated past trajectory of the external object that recently entered the detecting range of the sensor system. The controller may determine that the user is observing the object at the point of time t−i within said time span [t−n, t−1] preceding the time of first entry (t0) if the object position is within a sector around said absolute direction of gaze at the point of time t−i. A relevant size of the sector may be around ±2°. A further requirement may be that the object must remain within the sector during a predetermined minimum interval, which suitably may be 30 ms. A single observation at a single point in time may, however, also be sufficient for the purposes of this invention.


In practice, the time span [t−n, t−1] includes a set of discrete points of time ti {i=−n:−1}, wherein the past trajectory estimator is arranged to determine the object position (x, y)(t−i) at said discrete points of time ti {i=−n:−1}. The controller 20 is arranged to determine that the user is observing the object within the time span [t−n, t−1] preceding the time of first entry (t0) if the object position (x, y)(t−i) is within a sector around the absolute direction of gaze at any of said discrete points of time ti {i=−n:−1}.
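The per-step sector test can be sketched as below. The ±2° default follows the sector size suggested above; iterating from the most recent sample backwards and returning on the first match mirrors the controller's early-stopping strategy. Function and parameter names are illustrative assumptions:

```python
import math

def observed_in_past(gaze_abs, obj_positions, host_positions,
                     sector=math.radians(2.0)):
    """Return True if, at any discrete past time step, the estimated object
    position lies within +/- sector of the recorded absolute gaze direction.
    Index i of each list refers to the same past point of time, ordered from
    most recent (t-1) backwards; angles in radians, positions in metres."""
    for phi, (ox, oy), (hx, hy) in zip(gaze_abs, obj_positions, host_positions):
        bearing = math.atan2(oy - hy, ox - hx)          # host -> object direction
        diff = bearing - phi
        diff = math.atan2(math.sin(diff), math.cos(diff))  # wrap to (-pi, pi]
        if abs(diff) <= sector:
            return True                                  # stop at first match
    return False
```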


In one embodiment of the invention, the controller 20 is arranged to start by determining whether the user is observing the object at the point of time (t−1) immediately preceding said time of first entry (t0), and to continue with points of time ti consecutively more distant from the time of first entry (t0). The controller may be arranged to stop the evaluation for points of time more distant from the time of first entry (t0) than a point of time t−i as soon as the controller has determined that the user is observing the object at the point of time t−i, because the object position is then determined to be within a sector around said absolute direction of gaze at the point of time t−i.


A suitable size of the time span [t−n, t0] preceding the time of first entry (t0) of an external object into the detecting range of the sensor system 12, during which the past trajectory of the external object is estimated, is around 2-10 seconds, preferably around 5 seconds.


Instead of using a fixed time interval, the size of the interval may depend on the velocity of the host vehicle or of an aggregated average value of the velocities of the external objects observed by the sensor system.
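One possible velocity-dependent policy, offered purely as a hypothetical sketch (the disclosure says only that the interval "may depend" on velocity; the scaling law, reference speed and clamping below are all assumptions):

```python
def adaptive_span(speeds_mps, base=5.0, ref_speed=15.0, lo=2.0, hi=10.0):
    """Hypothetical policy: shrink the default 5 s look-back window as the
    average observed speed rises above a reference speed (and grow it when
    traffic is slower), clamped to the 2-10 s range given in the text.
    speeds_mps: host speed samples or aggregated external object speeds."""
    avg = sum(speeds_mps) / len(speeds_mps)
    return max(lo, min(hi, base * ref_speed / max(avg, 1e-6)))
```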

Claims
  • 1. An object awareness determination system comprising: an external object sensor system arranged on a host vehicle, said sensor system being arranged to, within a detecting range of said sensor system, sense objects and generate input data relating to objects external to said host vehicle, wherein said input data include an object position (x, y), an object velocity |(ẋ, ẏ)| and an object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) associated with each object in said detecting range, characterised in that said object awareness determination system further comprises: a controller for determining awareness of the user to an object that has recently entered the detecting range of the external object sensor system, said controller being arranged to determine awareness of said object based on an assessed observation of the recently entered object by the user before the object has entered the detecting range of said external object sensor system.
  • 2. An object awareness determination system according to claim 1, characterised in that said object awareness determination system further comprises: an eye gaze monitor arranged for determining the direction of gaze (φrel) of a user; a host vehicle movement tracker arranged on the host vehicle, which host vehicle movement tracker is arranged to determine past host vehicle position ((x,y)host, past) and past host vehicle yaw angle (φ−i); an eye gaze recorder arranged for recording the direction of gaze (φrel, −i) of a user, determined by the eye gaze monitor; a past trajectory estimator which is arranged to, after entry of an object into said detecting range of said external object sensor system, estimate a past trajectory of said object within a time span [t−n, t−1] preceding a time of first entry (t0) of said object into said detecting range; wherein said controller is arranged to determine awareness of said object based on an assessed observation, which observation is being assessed by use of determined past host vehicle position ((x,y)host, past), past host vehicle yaw angle (φ−i), recorded direction of gaze (φrel, −i) of the user, and estimated past trajectory of the object which has entered the detecting range of the external object sensor system.
  • 3. An object awareness determination system according to claim 2, characterised in that said controller is arranged to retrieve, for a point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0), an object position (x, y)(ti) from said past trajectory; a past host vehicle position ((x,y)host, past) and a past host vehicle yaw angle (φ−i) from said host vehicle movement tracker; and a recorded direction of gaze (φrel, −i) from the eye gaze recorder (25).
  • 4. An object awareness determination system according to claim 3, characterised in that said controller is arranged to retrieve an absolute direction of gaze (φabs, −i) for a point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0) from a recorded direction of gaze (φrel, −i) and a recorded host vehicle yaw angle (φ−i,rec) at said point of time t−i.
  • 5. An object awareness determination system according to claim 4, characterised in that said controller is arranged to determine that the user is observing the object at the point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0) if the object position (x, y)(t−i) is within a sector around said absolute direction of gaze (φabs, −i) at the point of time t−i.
  • 6. An object awareness determination system according to claim 5, characterised in that said time span [t−n, t−1] includes a set of discrete points of time ti {i=−n:−1}, that said past trajectory estimator is arranged to determine the object position (x, y)(t−i) at said discrete points of time ti {i=−n:−1}, and that said controller is arranged to determine that the user is observing the object within said time span [t−n, t−1] preceding a time of first entry (t0) if the object position (x, y)(t−i) is within a sector around the absolute direction of gaze (φabs, −i) at any of said discrete points of time ti {i=−n:−1}.
  • 7. An object awareness determination system according to claim 6, characterised in that said controller is arranged to start with determining if the user is observing the object at a point of time (t−1) immediately preceding said time of first entry (t0), and to continue with points of time ti consecutively being more distant from time of first entry (t0).
  • 8. An object awareness determination system according to claim 7, characterised in that said controller is arranged to stop the evaluation for points of time being more distant from the time of first entry (t0) than a point of time t−i as soon as the controller has determined that the user is observing the object at the point of time t−i, because the object position is determined to be within a sector around said absolute direction of gaze at the point of time t−i.
  • 9. An object awareness determination system according to claim 1, characterised in that said past trajectory estimator (28) is arranged to, for each object that has recently entered the detecting range, retrieve object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0), from said external object sensor system, and to determine the past trajectory based on the object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0).
  • 10. An object awareness determination system according to claim 9, characterised in that said past trajectory estimator (28) is arranged to estimate the object position (x, y)(t−i) at a point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0) as:
  • 11. An object awareness determination system according to claim 1, characterised in that said time span [t−n, t−1] preceding a time of first entry (t0) corresponds to at least 2 seconds immediately preceding said time of first entry.
  • 12. An object awareness determination system according to claim 1, characterised in that said time span [t−n, t0] preceding a time of first entry (t0) corresponds to less than 10 seconds immediately preceding said time of first entry.
  • 13. An object awareness determination system according to claim 1, characterised in that said time span [t−n, t−1] corresponds to approximately 5 seconds immediately preceding a time of first entry (t0).
  • 14. A method for determining awareness of an object comprising the steps of: using an external object sensor system arranged on a host vehicle to, within a detecting range of said sensor system, sense objects and generate input data relating to objects external to said host vehicle, wherein said input data include an object position (x, y), an object velocity |(ẋ, ẏ)| and an object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) associated with each object in said detecting range, characterised in that said method further comprises the step of: determining awareness of the user to an object that has recently entered the detecting range of the external object sensor system by use of a controller which determines awareness of said object based on an assessed observation of the recently entered object by the user before the object has entered the detecting range of said external object sensor system.
  • 15. A method according to claim 14, characterised in that said method further includes the steps of: determining, by use of an eye gaze monitor, the direction of gaze (φrel, −i) of a user; determining past host vehicle position ((x,y)host, past) and past host vehicle yaw angle (φ−i) by use of a host vehicle movement tracker arranged on the host vehicle; recording the direction of gaze (φrel, −i) of a user, determined by the eye gaze monitor, by use of an eye gaze recorder; after entry of an object into said detecting range of said external object sensor system, estimating the past trajectory of said object within a time span [t−n, t−1] preceding a time of first entry (t0) of said object into said detecting range, by use of a past trajectory estimator; wherein said controller determines awareness of said object based on an assessed observation, which observation is being assessed by use of determined past host vehicle position ((x,y)host, past), past host vehicle yaw angle (φ−i), recorded direction of gaze (φrel, −i) of the user, and estimated past trajectory of the object which has recently entered the detecting range of the external object sensor system.
  • 16. A method according to claim 15, characterised in that said method further includes the steps of: said controller retrieves, for a point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0), an object position (x, y)(t−i) from said past trajectory; a past host vehicle position ((x,y)host, past) and a host vehicle yaw angle (φ−i) from said host vehicle movement tracker; and a direction of gaze (φrel, −i) from the eye gaze recorder.
  • 17. A method according to claim 16, characterised in that said controller retrieves an absolute direction of gaze (φabs, −i) for a point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0) from a recorded direction of gaze (φrel, −i) and a recorded host vehicle yaw angle (φ−i) at said a point of time t−i.
  • 18. A method according to claim 17, characterised in that said controller determines that the user is observing the object at the point of time t−i within said time span [t−n, t−1] preceding a time of first entry (t0) if the object position (x, y)(t−i) is within a sector around said absolute direction of gaze (φabs, −i) at the point of time t−i.
  • 19. A method according to claim 18, characterised in that said time span [t−n, t−1] includes a set of discrete points of time ti {i=−n:−1}, that said past trajectory estimator determines the object position (x, y)(t−i) at said discrete points of time ti {i=−n:−1}, and that said controller determines that the user is observing the object within said time span [t−n, t−1] preceding a time of first entry (t0) if the object position (x, y)(t−i) is within a sector around the absolute direction of gaze (φabs, −i) at any of said discrete points of time ti {i=−n:−1}.
  • 20. A method according to claim 19, characterised in that said controller starts with determining if the user is observing the object at a point of time (t−1) immediately preceding said time of first entry (t0), and continues with points of time ti consecutively being more distant from the time of first entry (t0).
  • 21. A method according to claim 20, characterised in that said controller stops the evaluation for points of time being more distant from the time of first entry (t0) than a point of time t−i as soon as the controller has determined that the user is observing the object at the point of time t−i, because the object position is determined to be within a sector around said absolute direction of gaze at the point of time t−i.
  • 22. A method according to claim 14, characterised in that said past trajectory estimator, for each object that has recently entered the detecting range, retrieves object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0), from said external object sensor system, and determines the past trajectory based on the object position (x, y), object velocity |(ẋ, ẏ)| and object direction of movement ((ẋ, ẏ)/|(ẋ, ẏ)|) at or after the time of first entry (t0).
  • 23. A method according to claim 22, characterised in that said past trajectory estimator estimates the object position (x, y)(t−i) at a point of time t−i within said time span [t−n, t0] preceding a time of first entry (t0) as:
  • 24. A method according to claim 14, characterised in that said time span [t−n, t−1] preceding a time of first entry (t0) corresponds to at least 2 seconds immediately preceding said time of first entry.
  • 25. A method according to claim 14, characterised in that said time span [t−n, t−1] preceding a time of first entry (t0) corresponds to less than 10 seconds immediately preceding said time of first entry.
  • 26. A method according to claim 14, characterised in that said time span [t−n, t−1] corresponds to approximately 5 seconds immediately preceding a time of first entry (t0).
Priority Claims (1)
Number Date Country Kind
06120345.1 Sep 2006 EP regional