OBJECT MONITORING USING EVENT CAMERA DATA

Information

  • Patent Application
  • Publication Number: 20240177484
  • Date Filed: November 09, 2023
  • Date Published: May 30, 2024
Abstract
A method includes obtaining data from an event camera for each of a plurality of time instances. The data includes events corresponding to changes detected by a corresponding plurality of pixels of the event camera at each time instance. Temporally regularized optical flow velocities are determined at each time instance. Each of the pixels has a respective one of the optical flow velocities at each time instance. An optical flow of a feature of an object in a field of view of the event camera is determined based on a predetermined relationship between the temporally regularized optical flow velocities at a selected time instance and the temporally regularized optical flow velocities at a subsequent time instance. The feature of the object corresponds to one of the plurality of events at the selected time instance and one of the plurality of events at the subsequent time instance.
Description
BACKGROUND

Object monitoring techniques have multiple uses. For example, autonomous or assisted vehicle navigation includes detecting objects in a vicinity, trajectory or pathway of the vehicle. A variety of technologies have been employed for such object detection including cameras, ultrasound sensors, RADAR detectors and LIDAR detectors.


Event cameras present the potential for advancements in object monitoring because they have a much higher temporal resolution, lower power consumption, and higher dynamic range than other types of cameras. Event cameras operate differently than traditional cameras and do not output images. Instead, event cameras provide data indicating intensity or illumination brightness changes detected by each pixel of the camera.


One way in which event camera data is being used includes converting the event camera data into an image (frame) and then applying computer vision techniques to process such image data as if it came from a camera that generates image data. This approach suffers from at least two drawbacks: the resulting image data is not accurate, and the computer vision techniques are computationally expensive.


SUMMARY

An illustrative example method includes obtaining data from an event camera for each of a plurality of time instances. The data includes a plurality of events corresponding to changes detected by a corresponding plurality of pixels of the event camera at each time instance. Temporally regularized optical flow velocities are determined at each time instance. Each of the pixels has a respective one of the optical flow velocities at each time instance. An optical flow of a feature of an object in a field of view of the event camera is determined based on a predetermined relationship between the temporally regularized optical flow velocities at a selected time instance and the temporally regularized optical flow velocities at a subsequent time instance. The feature of the object corresponds to one of the plurality of events at the selected time instance and one of the plurality of events at the subsequent time instance.


The various features and advantages of an example embodiment will become apparent to those skilled in the art from the following detailed description. The drawings that accompany the detailed description can be briefly described as follows.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically illustrates selected portions of an example embodiment of a system including an event camera and a computing device that is configured to use the event camera data directly for monitoring an area corresponding to a field of view of the event camera.



FIG. 2 is a flowchart diagram summarizing an example method of monitoring an object in the field of view based on the event camera data.



FIG. 3 schematically illustrates a determined optical flow of an object according to an example embodiment.





DETAILED DESCRIPTION

Embodiments of this invention provide enhanced use of event camera data, including the ability to use the event camera data directly for monitoring purposes. Temporally regularized optical flow velocities determined directly from the event camera data allow for mapping optical flow of an object in a field of view of an event camera. There is no need to convert the event camera data into images, which allows for increased accuracy and reduced computational complexity.



FIG. 1 schematically shows a system 20 that includes an event camera 22 that operates in a generally known manner. The event camera 22 includes an array of pixels (not illustrated) that provide an indication of a change in illumination brightness or intensity sensed or detected by each pixel. A change in brightness from a preset baseline that exceeds a threshold results in a pixel output that is referred to as an event. Each pixel output is −1, 0, or 1 when there is a decrease event, no event, or an increase event, respectively.


The event camera 22 has a microsecond resolution and generates output data that indicates which of the pixels detected an event and which did not at a time instance, which can be considered a detection interval.
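As a minimal illustration only (not part of the disclosure), per-instance event data of this kind can be stored as a dense tensor whose entries are −1, 0, or 1; the tensor shape, variable names, and example indices below are assumptions chosen for the sketch.

import torch

# Hypothetical layout: T time instances of an H x W pixel array.
# Each entry is -1 (decrease event), 0 (no event), or +1 (increase event).
T, H, W = 100, 240, 320
events = torch.zeros(T, H, W)      # e(t, x) as a dense tensor
events[0, 120, 160] = 1.0          # example: an increase event at one pixel
events[1, 121, 160] = -1.0         # example: a decrease event at the next instance

# Pixels that reported an event at a given time instance:
t = 0
active_pixels = events[t].nonzero()    # (row, column) indices of event pixels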


The event camera 22 is associated with a host 24 that may be a moving object, such as a vehicle, or a stationary object, such as a building or another structure. The manner in which the event camera 22 is associated with the host 24 may vary depending on the particular implementation. For example, the event camera 22 may be mounted or supported on the host. Alternatively, the event camera 22 may be positioned near the host 24.


A computing device 26 receives and processes the data from the event camera for monitoring a vicinity of the host 24 corresponding to a field of view 28 of the event camera 22. The computing device 26 includes at least one processor and memory associated with the processor. The computing device 26 may be associated with the event camera 22 and the host 24 in a variety of ways. For example, the computing device 26 may be a dedicated component integral with the event camera 22, a dedicated computing component situated at or on the host 24, a portion of a computing device associated with the host 24 that is useful for additional purposes, or at least a portion of a remotely located device, such as a virtual machine in a cloud computing network. Those skilled in the art that have the benefit of this description will realize what type of computing device 26 will best suit their particular needs.


The computing device 26 is programmed or otherwise configured to process data from the event camera 22 for monitoring the vicinity of the host 24 that corresponds to a field of view 28 of the event camera 22. Monitoring the vicinity or field of view includes monitoring any objects within the field of view 28. For discussion purposes, FIG. 1 includes a single object 30, which is an individual person, within the field of view 28. The object 30 is referred to in this description for discussion purposes without limiting the manner in which the disclosed system and method can be used. For example, there may be more than one object, a variety of different types of objects, or no objects within the field of view 28 and the disclosed method is useful in all such situations.


Monitoring the object 30 includes at least one determination regarding the object 30. Example determinations include detecting the presence of the object 30, locating the object relative to the event camera 22 or host 24, tracking relative movement between the object 30 and the event camera 22 or host 24, classifying the object 30, identifying the object 30, or a combination of such determinations.


The computing device 26 is configured to use the data from the event camera 22 directly for monitoring the object 30. FIG. 2 is a flowchart diagram 40 that summarizes an example technique. At 42, the computing device 26 obtains data from the event camera 22 at each of a plurality of time instances. The data from the event camera 22 at each time instance includes an output from each of the camera pixels. Each pixel output indicates whether an event occurs at each pixel. The data from the event camera 22 includes a plurality of events when the object 30 is within the field of view 28 because the object 30 introduces differences in illumination detected by the pixels. At 44, the computing device 26 determines a temporally regularized optical flow velocity for each of the pixels at each time instance. The computing device determines an optical flow and correspondence of at least one feature of the object 30 based on the temporally regularized optical flow velocities at a selected time instance and a subsequent time instance at 46. The feature of the object 30 corresponds to at least one of the events at the selected time instance and at least one of the events at the subsequent time instance. Determining the optical flow and correspondence of all detected features of the object 30 allows for monitoring the entire object 30.
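A high-level sketch of the three steps summarized in the flowchart might read as follows; it is an illustrative reading only, and the helper routines estimate_regularized_flow and track_feature are hypothetical stand-ins, not functions defined by this disclosure.

import torch

def estimate_regularized_flow(events: torch.Tensor) -> torch.Tensor:
    # Step 44 stand-in: a real implementation would minimize the variational
    # loss described below; zero flow is returned here only so the sketch runs.
    T, H, W = events.shape
    return torch.zeros(T, 2, H, W)

def track_feature(flow, xy, t_selected, t_next):
    # Step 46 stand-in: advect a feature location by its regularized velocity.
    x, y = int(xy[0]), int(xy[1])
    vx, vy = flow[t_selected, 0, y, x], flow[t_selected, 1, y, x]
    dt = float(t_next - t_selected)          # number of time instances, arbitrary units
    return (xy[0] + float(vx) * dt, xy[1] + float(vy) * dt)

def monitor(events: torch.Tensor):
    # Step 42: event data for a plurality of time instances, shape (T, H, W).
    flow = estimate_regularized_flow(events)             # step 44
    return track_feature(flow, (160.0, 120.0), 0, 1)     # step 46, one example feature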



FIG. 3 schematically illustrates an example optical flow of the object 30. The data provided by the event camera 22 is schematically shown at 50 to represent how the data corresponds to time and space. The pixel outputs at each of a plurality of the time instances are shown at 52, 54, 56, and 58. The pixel outputs, which correspond to the field of view 28, are shown with respect to a two-dimensional space having x and y axes. The progression along the time axis t (moving upward according to the drawing) indicates how the data from the event camera 22 includes information regarding changes in that which is detected by the event camera 22 over time.


In this example, an appearance 30′ of the object 30 is in different locations within the field of view 28 because there is relative movement between the object 30 and the event camera 22. Such relative movement may be the result of the object 30 moving while the event camera 22 is stationary, the event camera 22 (and the host 24) moving while the object 30 is stationary, or simultaneous movement of the object 30 and the event camera 22. Depending on the location of the object 30 within the field of view 28 at a time instance, the outputs of different pixels of the event camera 22 indicate events corresponding to features of the object 30.


The object 30 can be monitored by determining the optical flow of at least some of the features of the object appearance 30′. The optical flow of the entire object appearance 30′ may be useful when sufficient data is available to make such determinations. For discussion purposes, two features of the object appearance 30′ are indicated in FIG. 3. For example, one feature 60 corresponds to one hand of the individual 30. As can be appreciated from the illustration, the feature 60 appears in different locations within the event camera data at the illustrated time instances 52-58. The optical flow of the feature 60 corresponds to the relative movement between the feature 60 and the event camera 22.


The computing device 26 determines the temporally regularized optical flow velocities of the pixels that indicate an event at each of the time instances shown at 52-58. At least one of those determined velocities corresponds to the feature 60. In many situations, multiple pixels and associated velocities correspond to a detected feature at each time instance. The predetermined relationship between the temporally regularized optical flow velocities minimizes changes between those velocities from time instance to time instance (e.g., from the time instance represented at 52 to the time instance represented at 58), and the computing device 26 determines the optical flow of the feature 60 as schematically represented at 62. In other words, the predetermined relationship between the temporally regularized optical flow velocities smooths the curve shown at 62.


Another feature 64 of the object appearance 30′, such as a foot of the individual, has an optical flow represented at 66. The computing device 26 uses the temporally regularized optical flow velocities of the pixels indicating events corresponding to the feature 64 and the predetermined relationship to determine the optical flow in a manner that smooths the curve 66.


An example embodiment of a method used to determine the optical flow of a monitored object, such as the object 30, includes a regularizer that enforces regularity along the velocity field.


The following definitions and assumptions apply:





Let e: [0,T]×Ω→{−1, 0, 1}, where Ω⊆ℝ², be the event stream represented as a continuum function.





Let ν: [0,T]×Ω→ℝ² denote the optical flow at each time.





Let ϕ: [0,T]×Ω→Ω denote the mapping (warp) between time 0 and time t.





Let ∂t denote the partial derivative with respect to time t, and ∇ denote the spatial gradient,


where Ω indicates the pixels of the event camera imaging plane, and ℝ² indicates the real plane.


The brightness or event constancy can be represented using the classical differential form, which is

∂te(t,x) + ∇e(t,x)·ν(t,x) = 0.

The regularizer limits or minimizes changes in velocity along paths t→ϕ(t,x). Differentiating

∂t[ν(t, ϕ(t,x))]

yields:

∂tν(t,x) + ∇ν(t,x)·ν(t,x) ≈ 0.

This example embodiment includes using the following variational problem or loss function to determine the optical flow velocity ν:

E(ν) = ∫_{[0,T]×Ω} (∂te + ∇e·ν)² dx dt + α ∫_{[0,T]×Ω} |∂tν + ∇ν·ν|² dx dt,

where (∂te + ∇e·ν)² is the event constancy term, |∂tν + ∇ν·ν|² is the temporal regularizer term, and α is a regularization parameter.
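A minimal PyTorch sketch of this loss is given below, assuming the event stream e and the flow ν are stored as dense tensors and using simple forward differences with unit grid spacing (the upwind and Burgers-type discretizations described later are not used here); the function name, tensor shapes, and the default α are illustrative assumptions.

import torch

def loss_E(e: torch.Tensor, v: torch.Tensor, alpha: float = 0.1) -> torch.Tensor:
    # e: (T, H, W) event stream with values in {-1, 0, 1}
    # v: (T, 2, H, W) optical flow velocities, channels (vx, vy)
    vx, vy = v[:, 0], v[:, 1]

    # Forward differences cropped to a common interior shape (unit spacing).
    def d_t(f): return f[1:, :-1, :-1] - f[:-1, :-1, :-1]
    def d_y(f): return f[:-1, 1:, :-1] - f[:-1, :-1, :-1]
    def d_x(f): return f[:-1, :-1, 1:] - f[:-1, :-1, :-1]

    vx_c = vx[:-1, :-1, :-1]
    vy_c = vy[:-1, :-1, :-1]

    # Event (brightness) constancy residual:  e_t + grad(e) . v
    constancy = d_t(e) + d_x(e) * vx_c + d_y(e) * vy_c

    # Temporal regularizer residual:  v_t + (grad v) v, per component
    reg_x = d_t(vx) + d_x(vx) * vx_c + d_y(vx) * vy_c
    reg_y = d_t(vy) + d_x(vy) * vx_c + d_y(vy) * vy_c

    return (constancy ** 2).sum() + alpha * (reg_x ** 2 + reg_y ** 2).sum()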


The computing device 26 determines the value of ν that minimizes the above loss function to determine the temporally regularized optical flow velocity of each event at each time instance. An example embodiment includes a gradient descent technique to minimize the loss function, which includes computing the variation:





δE·δν = ∫_{[0,T]×Ω} 2(∂te + ∇e·ν)∇e·δν dx dt + α ∫_{[0,T]×Ω} 2(∂tν + ∇ν·ν)·[∂t(δν) + ∇(δν)·ν + ∇ν·δν] dx dt
= ∫_{[0,T]×Ω} 2(∂te + ∇e·ν)∇e·δν dx dt + α ∫_{[0,T]×Ω} 2(∂tν + ∇ν·ν)·[∂t(δν) + ∇(δν·ν)] dx dt,

where the last equality uses

∇[δν·ν] = D(δν)ν + (Dν)δν.


Applying integration by parts to isolate δν includes letting






F := ∂tν + ∇ν·ν.


Then

δE·δν = ∫_{[0,T]×Ω} 2(∂te + ∇e·ν)∇e·δν dx dt − 2α ∫_{[0,T]×Ω} (∂tF + div(F)ν)·δν dx dt + 2α ∫_0^T ∫_∂Ω (F·N)ν·δν dx dt + 2α ∫_Ω F·δν dx |_{t=0}^{t=T},

where N is the unit normal to ∂Ω.


The gradient of E can therefore be expressed as:





∇E(ν) = (∂te + ∇e·ν)∇e − α[∂tF + div(F)ν].


With ν̃ := (1, ν) and D := (∂t, ∇), it is possible to write F as






F = (Dν)ν̃.


It follows that the gradient of E can be expressed as:





∇E(ν) = (∂te + ∇e·ν)∇e − α[∂t((Dν)ν̃) + div((Dν)ν̃)ν].


This example embodiment includes eliminating boundary terms from the loss function by applying the following boundary conditions:





(Dν)ν̃·N = 0 on ∂Ω,





(Dν)ν̃ = 0 on {0,T}×Ω,


which implies that the velocity is constant along paths across the boundary.


Writing all the equations to compute the gradient:





∇E(ν) = (∂te + ∇e·ν)∇e − α[∂t((Dν)ν̃) + div((Dν)ν̃)ν] on (0,T)×Ω,





(Dν)ν̃·N = 0 on ∂Ω,





(Dν)ν̃ = 0 on {0,T}×Ω.


The gradient descent PDE can therefore be expressed as





∂τν = −(∂te + ∇e·ν)∇e + α[∂t((Dν)ν̃) + div((Dν)ν̃)ν],


where τ indicates the evolution parameter. Gradient descent is known to have slow convergence and to be prone to local minimizers. To alleviate this, the computing device 26 is configured to use a known accelerated optimization technique, resulting in





∂²τν + a∂τν = −(∂te + ∇e·ν)∇e + α[∂t((Dν)ν̃) + div((Dν)ν̃)ν],


where a>0 is a damping factor.
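One way to read this in code is a heavy-ball style discretization of the damped second-order flow in the evolution parameter τ; this is only a sketch under assumptions, and the step size dtau, the damping value a, and the grad_E callable (which could be the closed-form gradient above or come from automatic differentiation) are illustrative placeholders.

import torch

def accelerated_descent(v0: torch.Tensor, grad_E, a: float = 3.0,
                        dtau: float = 0.1, steps: int = 500) -> torch.Tensor:
    # Discretizes  d2v/dtau2 + a dv/dtau = -grad E(v)  with an explicit update.
    v = v0.clone()
    p = torch.zeros_like(v)                  # discrete dv/dtau (momentum)
    for _ in range(steps):
        p = p + dtau * (-a * p - grad_E(v))  # damped second-order dynamics
        v = v + dtau * p                     # advance v along tau
    return v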


According to an example embodiment, the regularizer can be expressed in x and y components. Using ν = (ν1, ν2)ᵀ, then








(Dν)ν̃ = ((Dν1)·ν̃, (Dν2)·ν̃)ᵀ.




Deriving the first term of the regularizer yields






E_reg(ν) = ∫_{[0,T]×Ω} [(Dν1)·ν̃]² dx dt,

and it follows that





δE_reg(ν)·δν = 2∫_{[0,T]×Ω} [(Dν1)·ν̃]·[(D(δν1))·ν̃ + (Dν1)·δν̃] dx dt
= 2∫_{[0,T]×Ω} [(Dν1)·ν̃] ν̃·D(δν1) + [(Dν1)·ν̃](Dν1)·δν̃ dx dt
= 2∫_{[0,T]×Ω} −div([(Dν1)·ν̃]ν̃) δν1 + [(Dν1)·ν̃](Dν1)·δν̃ dx dt.


An example embodiment includes treating the loss function E with PyTorch auto-differentiation, which is a known technique, to differentiate the loss automatically, and using standard optimizers. The energy can be expressed as






E(ν) = ∥(De)·ν̃∥² + α∥(Dν)ν̃∥²,

which corresponds to the loss function defined above.
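A minimal sketch of that approach is shown below; it assumes the hypothetical loss_E function from the earlier sketch is in scope, and the optimizer choice (Adam), learning rate, iteration count, and tensor sizes are illustrative assumptions rather than anything specified by the disclosure.

import torch

# Assumes loss_E(e, v, alpha) from the earlier sketch is available in scope.
T, H, W = 32, 64, 64
e = torch.randint(-1, 2, (T, H, W)).float()        # stand-in event stream
v = torch.zeros(T, 2, H, W, requires_grad=True)    # flow field to optimize

optimizer = torch.optim.Adam([v], lr=1e-2)         # a standard optimizer
for _ in range(200):
    optimizer.zero_grad()
    loss = loss_E(e, v, alpha=0.1)                 # auto-differentiation builds the gradient
    loss.backward()
    optimizer.step()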


Discretizing (De)·ν̃ using an upwind scheme includes





(De)·ν̃ = ∂te + ν1∂xe + ν2∂ye,

and





νj(t,x)∂xje(t,x) ≈ max{0, νj(t,x)}Dj⁻e(t,x) + min{0, νj(t,x)}Dj⁺e(t,x).
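A small sketch of that upwind choice in code is given below; it assumes dense tensors, unit grid spacing, and roll-based (circular) handling of the borders purely for brevity, none of which is specified by the disclosure.

import torch

def upwind_transport(e: torch.Tensor, vj: torch.Tensor, dim: int) -> torch.Tensor:
    # Approximates v_j * de/dx_j with max(v_j, 0) * D_j^- e + min(v_j, 0) * D_j^+ e.
    backward = e - torch.roll(e, shifts=1, dims=dim)    # D_j^- e (unit spacing)
    forward = torch.roll(e, shifts=-1, dims=dim) - e    # D_j^+ e (unit spacing)
    return torch.clamp(vj, min=0) * backward + torch.clamp(vj, max=0) * forward

# Example use for an event tensor e of shape (T, H, W) with flow components vx, vy:
# grad_e_dot_v = upwind_transport(e, vx, dim=2) + upwind_transport(e, vy, dim=1)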


Discretizing (Dν)ν̃ using a technique similar to Burgers' equation includes:








(Dν)ν̃ = (∂tν1 + ½∂x(ν1²) + ν2∂yν1, ∂tν2 + ½∂y(ν2²) + ν1∂xν2)ᵀ.




The term ∂xj(νj)² is discretized as

∂xj(νj)²(t,x) ≈ [g(νj(t,x), νj(t,x+Δxj)) − g(νj(t,x−Δxj), νj(t,x))] / Δxj,

where

g(ν1, ν2) = max{ν1, 0}² + min{ν2, 0}².
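Sketched in code, the flux and the resulting conservative difference for ∂xj(νj²) could look like this; the argument names a and b replace the ν1, ν2 of the formula above to avoid clashing with the velocity components, and unit spacing with roll-based borders is an assumption of the sketch.

import torch

def g(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Numerical flux from the text: g(a, b) = max(a, 0)^2 + min(b, 0)^2.
    return torch.clamp(a, min=0) ** 2 + torch.clamp(b, max=0) ** 2

def burgers_term(vj: torch.Tensor, dim: int) -> torch.Tensor:
    # Approximates d(v_j^2)/dx_j as [g(v(x), v(x+dx)) - g(v(x-dx), v(x))] / dx, dx = 1.
    v_plus = torch.roll(vj, shifts=-1, dims=dim)   # v_j(t, x + dx_j)
    v_minus = torch.roll(vj, shifts=1, dims=dim)   # v_j(t, x - dx_j)
    return g(vj, v_plus) - g(v_minus, vj)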


The cross terms are similarly discretized as

νj(t,x)∂xjνi(t,x) ≈ max{νj(t,x), 0}Dj⁻νi(t,x) + min{νj(t,x), 0}Dj⁺νi(t,x),


where








Dj⁻νi(t,x) = [νi(t,x) − νi(t,x−Δxj)] / Δxj,

Dj⁺νi(t,x) = [νi(t,x+Δxj) − νi(t,x)] / Δxj.




A forward difference is used for the time derivatives:










∂tνi(t,x) ≈ [νi(t+Δt,x) − νi(t,x)] / Δt.
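Putting these discretization pieces together, one possible reading of the per-component residual (Dν)ν̃ is sketched below; the axis convention (t, y, x), unit spacings, roll-based borders, and the ½ factor on the Burgers-type terms are assumptions of this sketch, and the helpers repeat, in compact form, the constructions sketched above.

import torch

def regularizer_residual(vx: torch.Tensor, vy: torch.Tensor):
    # vx, vy: (T, H, W) flow components on axes (t, y, x), unit spacings.
    def d_minus(f, dim): return f - torch.roll(f, shifts=1, dims=dim)
    def d_plus(f, dim):  return torch.roll(f, shifts=-1, dims=dim) - f
    def d_t(f):          return torch.roll(f, shifts=-1, dims=0) - f   # forward time difference

    def upwind(coeff, f, dim):   # coeff * df/dx_dim with the upwind choice
        return torch.clamp(coeff, min=0) * d_minus(f, dim) + torch.clamp(coeff, max=0) * d_plus(f, dim)

    def g(a, b):                 # Burgers-type flux
        return torch.clamp(a, min=0) ** 2 + torch.clamp(b, max=0) ** 2

    def burgers(f, dim):         # conservative difference for d(f^2)/dx_dim
        return g(f, torch.roll(f, shifts=-1, dims=dim)) - g(torch.roll(f, shifts=1, dims=dim), f)

    # (Dv)v~ components: dv1/dt + 1/2 d(v1^2)/dx + v2 dv1/dy  and  dv2/dt + 1/2 d(v2^2)/dy + v1 dv2/dx
    r1 = d_t(vx) + 0.5 * burgers(vx, dim=2) + upwind(vy, vx, dim=1)
    r2 = d_t(vy) + 0.5 * burgers(vy, dim=1) + upwind(vx, vy, dim=2)
    return r1, r2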




Monitoring the object 30 and making determinations regarding the object 30 allows for controlling or adjusting at least one aspect of the host 24. For example, when the host 24 is a movable object, such as a vehicle, the computing device 26 provides information or control signals for controlling movement of the host 24. In some embodiments, controlling movement of a vehicle may include altering a trajectory of the vehicle for avoiding a collision with the object 30. Alternatively, the host 24 may be directed toward the object 30. If the host 24 is a stationary object, the information regarding the object 30 may be used to trigger an alarm, activate a lock, or control another feature of the host 24. Those skilled in the art who have the benefit of this description will realize how information regarding a monitored object will be useful to meet the needs of their particular situation.


Using the event data directly to determine the temporally regularized optical flow velocities, and then using those velocities to determine the optical flow of the object 30, avoids the drawbacks associated with converting event camera data into an image file and then processing such an image. The disclosed system and method provide more accuracy and accomplish that at a lower computational expense compared to previous attempts to utilize information from an event camera for purposes such as object monitoring.


The preceding description is exemplary rather than limiting in nature. Variations and modifications to the disclosed examples may become apparent to those skilled in the art that do not necessarily depart from the essence of this invention. The scope of legal protection given to this invention can only be determined by studying the following claims.

Claims
  • 1. A method, comprising: obtaining data from an event camera for each of a plurality of time instances, the data including a plurality of events corresponding to changes detected by a corresponding plurality of pixels of the event camera at each time instance;determining temporally regularized optical flow velocities at each time instance, wherein each of the pixels has a respective one of the optical flow velocities at each time instance;determining an optical flow and correspondence of a feature of an object in a field of view of the event camera based on a predetermined relationship between the temporally regularized optical flow velocities at a selected time instance and the temporally regularized optical flow velocities at a subsequent time instance, wherein the feature of the object corresponds to one of the plurality of events at the selected time instance and one of the plurality of events at the subsequent time instance.
  • 2. The method of claim 1, wherein the predetermined relationship minimizes a difference between the optical flow velocities along an optical flow trajectory induced by the velocity over time.
  • 3. The method of claim 1, wherein determining the temporally regularized optical flow velocities includes determining a velocity that minimizes a loss function having an event constancy term and a temporal regularizer term.
  • 4. The method of claim 3, wherein the temporal regularizer term includes a spatial gradient factor corresponding to a spatial gradient of the optical flow velocity.
  • 5. The method of claim 1, comprising determining relative movement between the object and the event camera based on the determined optical flow of the feature of the object.
  • 6. The method of claim 5, comprising determining the optical flow of a plurality of features of the object; anddetermining the relative movement based on the determined optical flow of the plurality of features.
  • 7. The method of claim 5, wherein the event camera is associated with a host and the method includes controlling at least one aspect of the host based on the determined relative movement between the object and the event camera.
  • 8. The method of claim 7, wherein the host comprises a vehicle and controlling at least one aspect of the host comprises automatically altering at least one aspect of movement of the vehicle.
CROSS-REFERENCE TO RELATED APPLICATION

The present disclosure claims priority to U.S. Provisional Application No. 63/424,267 filed Nov. 10, 2022.

Provisional Applications (1)
Number Date Country
63424267 Nov 2022 US