This application is related to and claims priority to European Patent Application No. 21154258.4, filed Jan. 29, 2021, entitled UNFORESEEN VEHICLE DRIVING SCENARIOS, the entirety of which is incorporated herein by reference.
The present disclosure relates to supporting assessment of unforeseen driving scenarios of a host vehicle.
An increasing number of modern vehicles have advanced driver-assistance systems, ADAS, to increase vehicle safety and more generally road safety. ADAS—which for instance may be represented by adaptive cruise control, ACC, lane departure avoidance, collision avoidance system, forward collision warning, etc.—are electronic systems that may aid a vehicle driver while driving. To function as intended, ADAS may rely on inputs from multiple data sources, such as e.g. LIDARs, radars, ultrasonics, cameras, automotive imaging, image processing, computer vision, and/or in-car networking.
Moreover, in a not too distant future, autonomous or automated driving systems, AD systems, will to a greater extent find their way into modern vehicles. An AD system is a complex combination of various components, and can be defined as a system where perception, decision making, and operation of the vehicle are performed by electronics and machinery instead of a human driver, introducing automation into road traffic. This includes handling of the vehicle, destination, as well as awareness of surroundings. While the automated system has control over the vehicle, it allows the human operator to leave all responsibilities to the system. An AD system commonly combines a variety of sensors to perceive the vehicle's surroundings, such as e.g. radar, LIDAR, sonar, camera, navigation and/or positioning system e.g. GNSS such as GPS, odometer and/or inertial measurement units, upon which advanced control systems may interpret sensory information to identify appropriate navigation paths, as well as obstacles and/or relevant signage.
Similar to manual vehicle driving, when a vehicle is under control of such an ADAS or AD system—and/or is transitioning between a manual drive mode and an ADAS or AD drive mode—unforeseen driving scenarios may occasionally occur. For instance, said vehicle may get involved in an unforeseen critical event e.g. a near accident or even an accident, and/or get involved in an unforeseen takeover behaviour to or from the ADAS or AD drive mode e.g. involving untimely takeover or unforeseen aborting of the ADAS or AD drive mode. Such unforeseen driving scenarios may be of interest to capture—for instance for post-analysis to identify a root cause for the unforeseen driving scenario—and it is known to reconstruct scenarios of critical events with support from input—e.g. videoclips—derived from image capturing devices, e.g. cameras, onboard the vehicle itself, or onboard potential surrounding vehicles in vicinity of said vehicle at the time of the critical event.
However, although input from image capturing devices of surrounding vehicles may be valuable in scenario reconstruction, it may be challenging to gather such input in an adequate and/or efficient manner.
It is therefore an object of embodiments herein to provide an approach for supporting, in an improved and/or alternative manner, assessment of unforeseen driving scenarios of a host vehicle.
The object above may be achieved by the subject-matter disclosed herein. Embodiments are set forth in the appended claims, in the following description and in the drawings.
The disclosed subject-matter relates to a method performed by a scenario reconstruction system for supporting assessment of unforeseen driving scenarios of a host vehicle. The scenario reconstruction system stores continuously and/or intermittently timestamped sensor data comprising the host vehicle's geographical position and surroundings captured with support from one or more surrounding detecting sensors onboard the host vehicle. Moreover, the scenario reconstruction system detects at a scenario time instance, involvement of the host vehicle in an unforeseen driving scenario, potentially involving one or more other objects. The scenario reconstruction system further derives from the timestamped sensor data, subsets of sensor data—e.g. snapshots—pertinent plural separate timestamps prior—and potentially at and/or subsequent—the scenario time instance. Moreover, the scenario reconstruction system identifies one or more surrounding detecting sensors-provided entities respectively present in the subsets pertinent at least two respective separate timestamps. The scenario reconstruction system further selects at least a first entity out of the identified entities, fulfilling selection criteria. Furthermore, the scenario reconstruction system communicates to the at least first selected entity, request data prompting the at least first selected entity to provide stored timestamped sensor data of its surroundings.
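Purely for illustration, the sequence of method steps described above—store, detect, derive, identify, select, communicate—may be sketched as follows. The sketch is a hypothetical simplification and not part of the disclosed subject-matter; all names, types and the selection-criteria callback are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    """Illustrative subset of stored timestamped sensor data,
    pertinent to one separate timestamp."""
    timestamp: float
    host_position: tuple   # host vehicle's geographical position
    entities_seen: set     # IDs of surrounding detecting sensors-provided entities

def reconstruct_scenario(snapshots, scenario_time, selection_criteria):
    """Identify entities present in subsets pertinent to at least two
    separate timestamps prior to the scenario time instance, select those
    fulfilling the selection criteria, and return the entities to which
    request data would be communicated."""
    prior = [s for s in snapshots if s.timestamp <= scenario_time]
    # count in how many separate-timestamp subsets each entity appears
    counts = {}
    for snap in prior:
        for entity in snap.entities_seen:
            counts[entity] = counts.get(entity, 0) + 1
    identified = {e for e, n in counts.items() if n >= 2}
    # keep only entities deemed suitable as witnesses
    return {e for e in identified if selection_criteria(e, prior)}
```

In this sketch, an entity seen at only one timestamp is never identified, and an identified entity is still discarded unless the selection-criteria callback accepts it, mirroring the two-stage filtering described above.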
The disclosed subject-matter further relates to a scenario reconstruction system for supporting assessment of unforeseen driving scenarios of a host vehicle. The scenario reconstruction system comprises a data storing unit for—and/or adapted for—storing continuously and/or intermittently timestamped sensor data comprising the host vehicle's geographical position and surroundings captured with support from one or more surrounding detecting sensors onboard the host vehicle. The scenario reconstruction system further comprises a scenario detecting unit for—and/or adapted for—detecting at a scenario time instance, involvement of the host vehicle in an unforeseen driving scenario, potentially involving one or more other objects. Moreover, the scenario reconstruction system comprises a data deriving unit for—and/or adapted for—deriving from the timestamped sensor data, subsets of sensor data—e.g. snapshots—pertinent plural separate timestamps prior—and potentially at and/or subsequent—the scenario time instance. The scenario reconstruction system further comprises an identifying unit for—and/or adapted for—identifying one or more surrounding detecting sensors-provided entities respectively present in the subsets pertinent at least two respective separate timestamps. Furthermore, the scenario reconstruction system comprises a selecting unit for—and/or adapted for—selecting at least a first entity out of the identified entities, fulfilling selection criteria. The scenario reconstruction system further comprises a request communicating unit for—and/or adapted for—communicating to the at least first selected entity, request data prompting the at least first selected entity to provide stored timestamped sensor data of its surroundings.
Furthermore, the disclosed subject-matter relates to an arrangement, for instance a vehicle and/or at least a first cloud server, comprising a scenario reconstruction system as described herein.
Moreover, the disclosed subject-matter relates to a computer program product comprising a computer program containing computer program code means arranged to cause a computer or a processor to execute the steps of the scenario reconstruction system described herein, or the storing step, the detecting step and the deriving step of the scenario reconstruction system described herein, or the identifying step, the selecting step and the communicating step—and potentially the providing step—of the scenario reconstruction system described herein, stored on a computer-readable medium or a carrier wave.
The disclosed subject-matter further relates to a non-volatile computer readable storage medium having stored thereon said computer program product.
Thereby, there is introduced an approach according to which input to a scenario reconstruction may be gathered in an adequate and/or efficient manner. That is, since there is continuously and/or intermittently stored timestamped sensor data comprising the host vehicle's geographical position and surroundings captured with support from one or more surrounding detecting sensors onboard the host vehicle, the world around the host vehicle may over time—e.g. as the host vehicle travels—be repeatedly captured and stored in real-time or essentially real-time, e.g. onboard the host vehicle, along with the host vehicle's current and potentially changing position. Accordingly, the host vehicle's current and/or ongoing geographical whereabouts and its surrounding world may continuously be logged, potentially along with the host vehicle's current and/or ongoing drive state indicating whether the host vehicle is in a manual drive mode or under control of the optional ADAS or AD system i.e. in an ADAS or AD drive mode. Furthermore, that is, since there is detected at a scenario time instance, involvement of the host vehicle in an unforeseen driving scenario, potentially involving one or more other objects, there is determined that the host vehicle at a timepoint referred to as a scenario time instance, is experiencing and/or has experienced an unforeseen driving scenario, such as a collision for instance with another object e.g. vehicle or such as an unforeseen takeover behaviour to or from an ADAS or AD drive mode. Moreover, that is, since there is derived from the timestamped sensor data subsets of sensor data, e.g. snapshots, pertinent plural separate timestamps prior—and potentially at and/or subsequent—the scenario time instance, there is obtained out of the stored timestamped sensor data comprising the host vehicle's position(s) and captured surroundings—for instance from the past exemplifying ten, thirty or sixty seconds—portions of sensor data, e.g. 
snapshots, respectively reflecting the host vehicle's position(s) and captured surroundings at respective two or more—at time intervals distributed—timepoints prior to the scenario time instance, and potentially also thereafter. Accordingly, since there is derived—from the stored timestamped sensor data—subsets of sensor data for differing—at regular or irregular time intervals distributed—time instances, a capacity-efficient abbreviated version, overview and/or extract of the stored captured—likely changing—world around the host vehicle, may be provided. Furthermore, that is, since there is identified one or more surrounding detecting sensors-provided entities respectively present in the subsets pertinent at least two respective separate timestamps, respective entity—e.g. another vehicle—provided with at least one surrounding detecting sensor, each of which is comprised in subsets of two or more respective separate timestamps, may be found. Thus, there may be identified entities which are deemed to support detection of surroundings and which further—during respective at least partly overlapping or differing time periods, at least to some extent effective prior to the scenario time instance—are deemed to have had the opportunity to detect and store surroundings comprising the host vehicle during the respective time period. Moreover, that is, since there is selected at least a first entity out of the identified entities fulfilling selection criteria, there is selected the one or more entities which comply with predeterminable selection criteria comprising conditions singling out entities deemed suitable as unforeseen driving scenario witnesses. Thus, merely entities considered preferred, suitable and/or good witness candidates are selected, while the other identified entities may be deemed redundant. 
Accordingly, entities identified to be provided with surrounding detecting sensor(s) and present in subsets of at least two separate timestamps, may nonetheless be discarded should they not fulfil said selection criteria. Consequently, for instance for differing moments in time or periods of time leading up to the scenario time instance—and potentially further at and/or subsequent to the scenario time instance—merely a few or even a single entity deemed suitable as unforeseen driving scenario witness(es), may be selected for respective moment in time and/or period of time, while the remaining identified entities may be discarded. Furthermore, that is, since there is communicated to the at least first selected entity, request data prompting the at least first selected entity to provide stored timestamped sensor data of its surroundings, the at least first selected entity is requested to upload recordings it has made—with support from its surrounding detecting sensor(s)—of the at least first selected entity's surroundings. Accordingly, since the at least first entity was identified to support detection of surroundings and was further deemed—at least during a time period—to have had the opportunity to detect and store surroundings comprising the host vehicle during said time period, the host vehicle—and potentially further the potential other object(s)—may be found in the requested stored timestamped sensor data of the at least first selected entity. Thus, said sensor data—e.g. video clips and/or object level data—may be requested for upload to be used as input for reconstruction of the unforeseen driving scenario, including scenarios leading up to—and potentially also following—said unforeseen driving scenario. 
This implies that entities no longer in near vicinity of the host vehicle at the time of the scenario time instance may be requested to provide their sensor data, as said sensor data—although not necessarily covering the unforeseen driving scenario at the actual scenario time instance—nonetheless may be essential in reconstruction of the scenario. Furthermore, since only entities fulfilling the selection criteria were selected, and the rest of the identified entities hence discarded e.g. for being deemed redundant, a request for uploading recorded sensor data is only communicated to entities considered preferred, suitable and/or good witness candidates. Thus, routinely requesting sensor data from all or essentially all surrounding detecting sensors-provided vehicles in vicinity of the host vehicle at—or around—the scenario time instance, as known in the art—which may translate into great amounts of and/or redundant sensor data being requested for upload—may accordingly be avoided. Consequently, requesting uploading of potentially unnecessarily large sensor data quantities, which in turn may lead to unnecessary data processing and/or data congestion, may subsequently similarly be avoided. Accordingly, with the introduced concept as described herein of selectively choosing which identified entities to request sensor data from and which to discard, input to scenario reconstruction is hence requested in an adequate and/or efficient manner.
For that reason, an approach is provided for supporting, in an improved and/or alternative manner, assessment of unforeseen driving scenarios of a host vehicle.
The technical features and corresponding advantages will be discussed in further detail in the following.
The various aspects of the non-limiting embodiments, including particular features and advantages, will be readily understood from the following detailed description and the accompanying drawings, in which:
Non-limiting embodiments of the present disclosure will now be described more fully hereinafter with reference to the accompanying drawings, in which currently preferred embodiments of the disclosure are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Like reference characters refer to like elements throughout. Dashed lines of some boxes in the figures indicate that these units or actions are optional and not mandatory.
In the following, according to embodiments herein which relate to supporting assessment of unforeseen driving scenarios of a host vehicle, there will be disclosed an approach according to which input to a scenario reconstruction may be gathered in an adequate and/or efficient manner.
Referring now to the figures, there is depicted in
The phrase “scenario reconstruction system” may refer to “scene reconstruction system” and/or “scenario assessment system”, and according to an example further to “incident reconstruction system”. The phrase “for supporting assessment”, on the other hand, may refer to “for supporting reconstruction” and/or “for selecting and/or gathering sensor data for reconstruction”, whereas “assessment of unforeseen driving scenarios” may refer to “assessment of unwanted and/or uncertain driving scenarios”, and according to an example further to “assessment of critical driving scenarios”. Moreover, “unforeseen driving scenarios” may refer to “unexpected driving scenarios”, “unforeseen driving events” and/or “unplanned, unwanted and/or uncertain driving scenarios”, and according to an example further to “unforeseen driving incidents”, whereas “unforeseen driving scenario of a host vehicle” may refer to “unforeseen driving scenario involving a host vehicle”.
Further depicted in
As depicted in exemplifying manners in
The timestamped sensor data 3 may be stored in any arbitrary—e.g. known—manner, such as in one or more memories and/or data buffers. Such an exemplifying at least first memory and/or data buffer may for instance be of predeterminable size, capacity and/or dimensions, such as continuously and/or intermittently buffering the past e.g. ten, thirty or sixty seconds of the sensor data 3, with older sensor data 3 being overwritten by newer and/or updated sensor data 3. Moreover, the timestamped sensor data 3 may further be of any arbitrary feasible size and/or format, such as in addition to the host vehicle's 2 position(s)—derived with support from an e.g. known positioning system—being indicated in any arbitrary manner e.g. represented by GPS coordinates and/or indicated in relation to a digital map such as e.g. an HD map, being represented by e.g. image data such as images and/or video and/or represented by e.g. object data. The one or more surrounding detecting sensors 22—which may be comprised in and/or be provided onboard the host vehicle 2 and distributed in any arbitrary feasible manner—may thus be represented by any arbitrary feasible sensors, functionality and/or systems adapted to capture surroundings of the host vehicle 2, for instance one or more image capturing devices such as cameras, and/or radar, lidar, ultrasonics etc. Moreover, the surrounding detecting sensor(s) 22 may potentially be provided in association with the optional perception system or similar system and/or the ADAS or AD system 21 discussed above. Furthermore, the surrounding detecting sensor(s) 22 may cover any arbitrary portion of vehicle surroundings, in any arbitrary direction from the host vehicle 2, for instance covering e.g. at least 90 degrees, at least 180 degrees or essentially 360 degrees of the surroundings.
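The exemplifying data buffer of predeterminable size described above—buffering e.g. the past thirty seconds, with older data being overwritten—may be sketched as a rolling buffer. The sketch below is a hypothetical illustration only; the class name, the tuple layout and the 30-second horizon are illustrative assumptions, not part of the disclosure.

```python
import collections

class SensorDataBuffer:
    """Illustrative rolling buffer keeping e.g. the past 30 seconds of
    timestamped sensor data; samples older than the horizon are dropped
    (i.e. effectively overwritten) as newer samples arrive."""

    def __init__(self, horizon_s=30.0):
        self.horizon_s = horizon_s
        self._buf = collections.deque()  # (timestamp, position, surroundings)

    def store(self, timestamp, position, surroundings):
        self._buf.append((timestamp, position, surroundings))
        # discard samples older than the buffer horizon
        while self._buf and timestamp - self._buf[0][0] > self.horizon_s:
            self._buf.popleft()

    def samples(self):
        return list(self._buf)
```

A time-based horizon is used here rather than a fixed element count, since the text exemplifies the buffer dimension in seconds of sensor data rather than in number of samples.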
The phrase “storing [ . . . ] timestamped sensor data” may refer to “logging, collecting and/or gathering [ . . . ] timestamped sensor data”, “storing digitally and/or electronically [ . . . ] timestamped sensor data” and/or “storing in one or more memories and/or data buffers [ . . . ] timestamped sensor data”, and according to an example further to “storing at said host vehicle [ . . . ] timestamped sensor data”, whereas “timestamped sensor data” may refer to “time-tagged and/or time-attributed sensor data”. Moreover, “sensor data” may refer to “sensor input”, “derived and/or gathered sensor data” and/or “current or essentially current status data”, whereas “sensor data comprising said host vehicle's geographical position and surroundings captured with support from one or more surrounding detecting sensors onboard said host vehicle” may refer to “sensor data comprising said host vehicle's geographical position captured with support from a positioning system and surroundings captured with support from one or more surrounding detecting sensors onboard said host vehicle” and/or “sensor data comprising said host vehicle's geographical position and surroundings captured continuously and/or intermittently with support from one or more surrounding detecting sensors onboard said host vehicle”. Moreover, according to an example, “sensor data comprising said host vehicle's geographical position and surroundings” may refer to “sensor data comprising said host vehicle's geographical position, drive state and surroundings”. The phrase “with support from one or more surrounding detecting sensors”, on the other hand, may refer to “utilizing and/or by means of one or more surrounding detecting sensors”, whereas “surrounding detecting sensors onboard said host vehicle” may refer to “surrounding detecting sensors comprised in said host vehicle and/or provided or situated onboard said host vehicle”.
As depicted in exemplifying manners in
The one or more other objects 4 may be represented by any static or dynamic objects and/or other road users, e.g. other vehicles, with which the host vehicle 2 potentially accidentally may collide or nearly collide. The scenario time instance, on the other hand, may refer to any arbitrary point in time at which—or essentially at which—the unforeseen driving scenario, e.g. collision and/or unforeseen takeover behaviour to or from an ADAS or AD drive mode, occurred and/or was detected. Moreover, the unforeseen driving scenario may be represented by any driving event, situation and/or incident involving the host vehicle 2 deemed unforeseen, unexpected, unplanned, unwanted and/or uncertain, or potentially deemed critical. The phrase “detecting [ . . . ] involvement” may refer to “learning, determining and/or determining based on a received signal and/or message [ . . . ] involvement”, and according to an example further to “detecting at said host vehicle [ . . . ] involvement”. The phrase “at a scenario time instance”, on the other hand, may refer to “at a current or essentially current scenario time instance”, “at a scenario timepoint”, “at an event time instance”, and according to an example further to “at an incident time instance”. Moreover, “involvement of said host vehicle in an unforeseen driving scenario” may refer to “occurrence of said host vehicle being involved in and/or experiencing an unforeseen driving scenario”, whereas “unforeseen driving scenario”, as previously indicated, may refer to “unexpected, unplanned, unwanted and/or uncertain driving scenario” and/or “unforeseen driving event”, and according to an example further to “unforeseen driving incident”. The phrase “potentially involving one or more other objects” may refer to “optionally involving one or more other objects”, “potentially with one or more other objects” and/or “potentially involving one or more other static or dynamic objects”.
Detecting involvement of the host vehicle 2 in an unforeseen driving scenario may be accomplished in any arbitrary feasible—e.g. known—manner, for instance with support from sensor readings such as readings of crash sensor(s), airbag(s) deployment, harsh braking and/or steering etc., and/or with support from other driving behaviour indicators such as criticality indicators and/or key performance indicators, KPIs, for instance exemplifying indicators and/or indications revealing harsh—e.g. unmotivated—braking and/or steering, off-lane-steering, time-to-collision, jerky driving, threat measures e.g. involving Steering Threat Number and/or Brake Threat Number, untimely and/or unplanned activation/deactivation of the optional ADAS or AD system 21, etc.
Thus, optionally, the detecting involvement of the host vehicle 2 in an unforeseen driving scenario may comprise—and/or said scenario detecting unit 102 may be adapted and/or configured for—detecting, from establishment of accident criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising an accident. Thereby, the unforeseen driving scenario may be represented by an accident—e.g. a collision such as with an other object 4—which is determined and/or deemed to have occurred following upon fulfilment of predeterminable accident criteria. The accident criteria may be represented by any feasible criteria stipulating one or more conditions for establishment of involvement of the host vehicle 2 in an accident, such as one or more conditions and/or thresholds identified and/or defined as—when fulfilled and/or e.g. exceeded—indicating occurrence of an accident. The accident criteria may accordingly for instance comprise a condition for an airbag of the host vehicle 2 being deployed, which when fulfilled may translate into the host vehicle 2 being deemed to be—and/or have been—involved in an accident.
Additionally or alternatively, optionally, the detecting involvement of the host vehicle 2 in an unforeseen driving scenario may comprise—and/or said scenario detecting unit 102 may be adapted and/or configured for—detecting, from establishment of critical event criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising a critical event, for instance a near accident. Thereby, the unforeseen driving scenario may be represented by a critical event such as a near accident—e.g. a near collision such as with an other object 4—which is determined and/or deemed to have occurred following upon fulfilment of predeterminable critical event criteria. The critical event criteria may be represented by any feasible criteria stipulating one or more conditions for establishment of involvement of the host vehicle 2 in a critical event, such as one or more thresholds—e.g. deviation thresholds in view of KPIs—identified and/or defined as—when e.g. exceeded—indicating occurrence of a critical event. The critical event criteria may accordingly for instance comprise threshold levels for a—to the host vehicle 2 applied—braking force and/or braking torque, steering force and/or steering angle etc., and/or a threshold level for time-to-collision such as a maximum TTC time threshold e.g. exemplified by 0.5 seconds, which when fulfilled and/or e.g. exceeded may translate into the host vehicle 2 being deemed to be—and/or have been—involved in a critical event.
Furthermore, optionally, additionally or alternatively, the detecting involvement of the host vehicle 2 in an unforeseen driving scenario may comprise—and/or said scenario detecting unit 102 may be adapted and/or configured for—detecting, from establishment of unforeseen takeover behaviour criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising an unforeseen takeover behaviour to or from an ADAS or AD system 21 drive mode. Thereby, the unforeseen driving scenario may be represented by an unforeseen takeover behaviour to or from an ADAS or AD drive mode—e.g. an untimely takeover and/or unforeseen takeover—which is determined and/or deemed to have occurred following upon fulfilment of predeterminable unforeseen takeover behaviour criteria. The unforeseen takeover behaviour criteria may be represented by any feasible criteria stipulating one or more conditions for establishment of involvement of the host vehicle 2 in an unforeseen takeover behaviour, such as one or more conditions and/or thresholds identified and/or defined as—when fulfilled and/or e.g. exceeded—indicating occurrence of an unforeseen takeover behaviour. The unforeseen takeover behaviour criteria may accordingly for instance stipulate under what conditions a takeover is considered untimely and/or under what conditions a takeover is considered unforeseen, unexpectedly aborted, unplanned and/or unwanted, which when fulfilled may translate into the host vehicle 2 being deemed to be—and/or have been—involved in an unforeseen takeover behaviour.
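The three optional criteria sets discussed above—accident criteria (e.g. airbag deployment), critical event criteria (e.g. a maximum TTC threshold exemplified by 0.5 seconds) and unforeseen takeover behaviour criteria—may, purely for illustration, be sketched as a single detection function. The function and its input keys are hypothetical; only the airbag condition and the 0.5 s TTC threshold are taken from the exemplifications above.

```python
def detect_unforeseen_scenario(readings, max_ttc_s=0.5):
    """Return a scenario label when a predeterminable criteria set is
    established as fulfilled, else None. Thresholds and reading keys
    are illustrative assumptions."""
    # accident criteria: e.g. an airbag of the host vehicle being deployed
    if readings.get("airbag_deployed"):
        return "accident"
    # critical event criteria: e.g. a maximum time-to-collision threshold
    ttc = readings.get("time_to_collision")
    if ttc is not None and ttc <= max_ttc_s:
        return "critical_event"
    # unforeseen takeover behaviour criteria: e.g. an untimely takeover
    # to or from an ADAS or AD drive mode
    if readings.get("untimely_takeover"):
        return "unforeseen_takeover"
    return None
```

The returned label would correspond to the scenario time instance being established at, or essentially at, the timepoint of the readings that fulfilled the criteria.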
As illustrated in exemplifying manners in
Respective subset 30 of sensor data may be represented by any feasible portion out of the stored timestamped sensor data 3, such as respective snapshots of the captured world around the host vehicle 2, effective at—and/or essentially at and/or around—the corresponding timestamp. The timestamps corresponding to the derived respective subsets 30, on the other hand, may be distributed in any feasible manner separating said timestamps, for instance to a predeterminable extent. The separate timestamps may accordingly be distributed at regular or irregular time intervals, which time intervals may be predeterminable and further vary, and for instance be greater than e.g. half a second, two seconds and/or five seconds. The number of separate timestamps for which subsets are derived may thus be represented by any feasible quantity, for instance range from just a few up to tens or even hundreds of separate timestamps. At least some, a majority and/or essentially all of said separate timestamps may be effective prior to the scenario time instance, i.e. prior to involvement of the host vehicle 2 in the unforeseen driving scenario. Optionally, one or more of said separate timestamps may potentially be effective at or essentially at the scenario time instance, i.e. around the moment of involvement of the host vehicle 2 in the unforeseen driving scenario, or thereafter.
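Deriving subsets at time intervals distributed over a lookback window prior to (and potentially at) the scenario time instance may, purely as an illustration, be sketched as follows. The function name, the sample layout and the exemplifying two-second interval over a thirty-second window are illustrative assumptions.

```python
def derive_snapshots(stored, scenario_time, interval_s=2.0, lookback_s=30.0):
    """From stored timestamped sensor data, derive one snapshot per
    interval over the lookback window prior to—and at—the scenario
    time instance. `stored` holds (timestamp, data) pairs."""
    snapshots = []
    samples = sorted(stored, key=lambda s: s[0])
    t = scenario_time - lookback_s
    while t <= scenario_time:
        # nearest stored sample at or after the target timestamp, if any
        candidates = [s for s in samples if s[0] >= t]
        if candidates and candidates[0] not in snapshots:
            snapshots.append(candidates[0])
        t += interval_s
    return snapshots
```

The result is the capacity-efficient abbreviated extract described above: a few representative snapshots per window rather than the full stored stream.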
The phrase “deriving [ . . . ] sensor data” may refer to “filtering out, selecting, obtaining, retrieving, gathering and/or providing [ . . . ] sensor data”, and according to an example further to “deriving at said host vehicle [ . . . ] sensor data”. Moreover, “subsets of sensor data” may refer to “respective subsets of sensor data” and/or “a respective subset or portion of sensor data”, whereas “deriving from said timestamped sensor data subsets of sensor data” may refer to “deriving subsets out of said timestamped sensor data”. The phrase “e.g. snapshots”, on the other hand, may refer to “comprising snapshots” and/or “e.g. a respective snapshot”. Furthermore, “subsets of sensor data [ . . . ] pertinent plural separate timestamps” may refer to “subsets of sensor data [ . . . ] corresponding to, valid for, reflective of and/or effective for plural timestamps”, and further to “subsets of sensor data [ . . . ] pertinent plural—at regular or irregular time intervals distributed—separate timestamps” and/or “subsets of sensor data [ . . . ] pertinent plural—to a predeterminable extent distributed—separate timestamps”. Moreover, “timestamps” may refer to “time instances and/or timepoints” and according to an example further to “past timestamps”, whereas “plural separate timestamps” may refer to “two or more separate time stamps”. The phrase “timestamps prior [ . . . ] said scenario time instance”, on the other hand, may refer to “timestamps effective prior [ . . . ] said scenario time instance”, whereas “timestamps prior—and potentially at and/or subsequent—said scenario time instance” may refer to “timestamps prior—and potentially essentially at and/or subsequent—said scenario time instance”.
As depicted in exemplifying manners in
In exemplifying
In exemplifying
Identifying at least a first surrounding detecting sensors-provided entity 5 present in at least two of the subsets 30 effective for separate timestamps, may for instance take place—at least to some extent—at a commonly known at least first off-board cloud server 111 adapted to communicate with vehicles including the host vehicle 2, such as e.g. a cloud and/or automotive cloud, cloud network adapted for cloud-based storage and/or back-end system. Furthermore, an entity described herein may be represented by any arbitrary dynamic or stationary entity, for instance another vehicle positioned in e.g. a same, adjacent, oncoming or crossing lane of the host vehicle 2 and/or a piece of infrastructure such as a building, gas station, bridge, etc., for instance provided with a surveillance camera arrangement. Moreover, establishing that there are entities comprised in the subsets 30, further establishing whether any of said entities are provided with one or more surrounding detecting sensors, and potentially also establishing respective field of view of the identified entities 5, may be accomplished in any feasible—e.g. known—manner, such as with support from object detection, entity identity detection, identity lookup-table(s), etc. Furthermore, the surrounding detecting sensor(s) of an identified entity 5 may, similarly to the surrounding detecting sensors of the host vehicle 2, be represented by any arbitrary feasible sensors, functionality and/or systems adapted to capture said entity's 5 surroundings, for instance one or more image capturing devices such as cameras, and/or radar, lidar, ultrasonics etc.
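The identification step described above—establishing which entities appear in subsets of at least two separate timestamps and, e.g. via an identity lookup-table, which of those are provided with surrounding detecting sensors—may be sketched as below. The sketch is purely illustrative; the function, the subset layout and the lookup set are hypothetical.

```python
def identify_witness_candidates(snapshots, sensor_equipped):
    """Return entities present in subsets pertinent to at least two
    separate timestamps AND known—e.g. via an identity lookup-table—to
    be provided with surrounding detecting sensor(s).
    `snapshots` holds (timestamp, set_of_entity_ids) pairs."""
    seen_at = {}
    for timestamp, entities in snapshots:
        for entity in entities:
            # record the separate timestamps at which each entity appears
            seen_at.setdefault(entity, set()).add(timestamp)
    return {entity for entity, stamps in seen_at.items()
            if len(stamps) >= 2 and entity in sensor_equipped}
```

Entities seen only once, or seen repeatedly but not known to carry surrounding detecting sensors, are excluded, which corresponds to the witness-candidate filtering discussed above.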
The phrase “identifying [ . . . ] entities” may refer to “pinpointing and/or determining [ . . . ] entities”, and according to an example further to “identifying—at least to some extent—at a cloud server [ . . . ] entities” and/or “identifying—at least to some extent—at a cloud server with support from said host vehicle [ . . . ] entities”. Moreover, “surrounding detecting sensors-provided entities” may refer to “entities respectively provided with at least a first surrounding detecting sensor”, and further to “surrounding detecting sensors-provided entities considered witness candidates”. The phrase “entities respectively present in said subsets”, on the other hand, may refer to “entities respectively comprised in said subsets”, and according to an example further to “entities respectively present in said subsets obtained from said host vehicle”. Furthermore, “pertinent at least two respective separate timestamps” may refer to “for at least two respective separate timestamps”. According to an example, the phrase “identifying one or more surrounding detecting sensors-provided entities respectively present in said subsets pertinent at least two respective separate timestamps” may refer to “identifying one or more surrounding detecting sensors-provided entities respectively present in said subsets pertinent at least two respective separate timestamps, a respective location and/or pose of respective entity relative respective position of said host vehicle—and/or potentially said one or more other objects—rendering a viewing angle deemed to at least partly and/or to a predeterminable extent cover said position” and/or “identifying one or more surrounding detecting sensors-provided entities respectively present in said subsets pertinent at least two respective separate timestamps, deemed to have respective field of views at least partly and/or to a predeterminable extent covering respective position of said host vehicle and/or potentially said one or more other objects”.
As illustrated in exemplifying manners in
Furthermore, optionally, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria to greater extent as compared to other entities of the identified entities 5. Thereby, entities 5 identified to be provided with surrounding detecting sensor(s) and present in subsets 30 of at least two separate timestamps, and further fulfilling the selection criteria, may nonetheless be discarded should there be at least a first other entity 6 outperforming said entities 5 in view of the selection criteria. Thus, in this case, where one or more entities 6 accordingly may be prioritized higher than other entities 5, merely entities 6 considered most preferred, best suited and/or best witness candidates may be selected while the other identified entities 5 are deemed redundant, whereby the number of selected entities 6 hence may be kept to a minimum. For instance, should there be two or more identified entities 5 in the subsets 30 which demonstrate similar conditions—such as in the subsets 30 having similar geographical positions and/or respective positions within a predeterminable distance from one another and/or being present in a same predeterminable geographical region e.g. in relation to the host vehicle's 2 position such as within a predeterminable sector of a circle originating from the host vehicle's 2 position—then a single entity 6 or merely a few entities 6 out of said entities 5—which to greater extent fulfil the selection criteria—may be selected i.e. prioritized. The phrase “fulfilling selection criteria to greater extent as compared to other entities of said identified entities” may according to an example refer to “fulfilling selection criteria and additionally further selection criteria”.
Selecting at least a first entity 6 out of the identified entities 5 may for instance take place—at least to some extent—at said at least first cloud server 111. The selection criteria, on the other hand, may vary and/or be dynamic, and thus e.g. change with the situation at hand and/or in view of the identified entities 5, for instance in comparison with one another. For instance, the selection criteria may vary with number of identified entities 5, such as the more identified entities 5 the stricter the selection criteria in order to keep down the number of selected entities 6, for instance to a minimum. Similarly, for instance, the selection criteria may vary in consideration of positions and/or poses of identified entities 5, and/or vary with reliability in the identification of the identified entities 5, such as reliability in establishing that there are entities comprised in the subsets 30, in establishing whether any of said entities are provided with one or more surrounding detecting sensors, and potentially in establishing respective field of view of the identified entities 5. The phrase “selecting at least a first entity” may refer to “filtering out, singling out, identifying, pinpointing and/or determining at least a first entity”, and according to an example further to “selecting at said cloud server at least a first entity”. The phrase “fulfilling selection criteria”, on the other hand, may refer to “fulfilling selection criteria comprising one or more conditions and/or thresholds”. Moreover, “selection criteria” may refer to “witness selection criteria”, “predeterminable selection criteria” and/or “preferred candidates criteria”, and according to an example further to “selection criteria stipulating conditions for selecting suitable witness candidates” and/or “selection criteria comprising conditions singling out entities deemed suitable as unforeseen driving scenario witnesses”.
The selection criteria may be represented by any feasible criteria stipulating under what circumstances the identified entities 5 should be selected or otherwise discarded. The selection criteria may accordingly comprise one or more conditions and/or thresholds defined as—when fulfilled and/or e.g. exceeded—indicating that the entity 6 is a suitable witness candidate.
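By way of a non-limiting illustration, selection criteria comprising one or more conditions and/or thresholds may be sketched in code as follows; the function, attribute names and threshold values are hypothetical and serve only to exemplify the concept:

```python
def fulfils_selection_criteria(entity, criteria):
    """Return True if an identified entity fulfils every condition.

    entity:   dict of attributes derived from the subsets, e.g.
              {"num_timestamps": 3, "min_distance_m": 12.0, "has_camera": True}
    criteria: dict mapping attribute names to predicate functions.
    """
    return all(check(entity.get(name)) for name, check in criteria.items())

# Exemplifying conditions/thresholds indicating a suitable witness candidate.
criteria = {
    "num_timestamps": lambda n: n is not None and n >= 2,     # present at >= 2 separate timestamps
    "min_distance_m": lambda d: d is not None and d <= 50.0,  # close enough to the host vehicle
    "has_camera": lambda c: bool(c),                          # provided with an image capturing device
}

entity = {"num_timestamps": 3, "min_distance_m": 12.0, "has_camera": True}
```

An entity failing any single condition, for instance being present at only one timestamp, would accordingly be discarded under this exemplifying criteria set.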
Thus, optionally, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to number of timestamps at which the at least first entity 6 is present in the subsets 30. Thereby, there may be taken into consideration in selection of at least a first entity 6, at how many timestamps—which may be an indication of time duration—respective identified entity 5 is present in the subsets 30 and hence had the opportunity to potentially detect and store surroundings comprising the host vehicle 2. Accordingly, entities 6 for instance demonstrating presence in the subsets 30 for a great enough number of timestamps exceeding a predeterminable quantity stipulated by the selection criteria—and/or for instance demonstrating greater number of timestamps as compared to one or more other identified entities 5—may be prioritized and hence selected. The phrase “fulfilling selection criteria relating to number of timestamps at which said at least first entity is present in said subsets” may refer to “fulfilling selection criteria relating to number of timestamps at which said at least first entity is present in said subsets as compared to one or more other entities of said identified entities”.
Additionally or alternatively, optionally, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to time occurrence of timestamps at which the at least first entity 6 is present in the subsets 30. Thereby, there may be taken into consideration in selection of at least a first entity 6, at which moments in time—for instance in relation to the scenario time instance—respective identified entity 5 is present in the subsets 30 and hence had the opportunity to potentially detect and store surroundings comprising the host vehicle 2. Accordingly, entities 6 for instance demonstrating presence in the subsets 30 at a time occurrence occurring at a predeterminable moment in time and/or within a predeterminable time range stipulated by the selection criteria—and/or for instance demonstrating presence in the subsets 30 at a time occurrence occurring closer in time to a predeterminable moment in time as compared to one or more other identified entities 5—may be prioritized and hence selected. The phrase “fulfilling selection criteria relating to time occurrence of timestamps at which said at least first entity is present in said subsets” may refer to “fulfilling selection criteria relating to time occurrence of timestamps at which said at least first entity is present in said subsets as compared to one or more other entities of said identified entities”.
Furthermore, additionally or alternatively, optionally, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to positions of the at least first entity 6 relative positions of the host vehicle 2 and/or the potential one or more other objects 4 in the subsets 30. Thereby, there may be taken into consideration in selection of at least a first entity 6, positions of respective identified entity 5—which may be an indication of distance(s) from and/or direction(s) toward—the host vehicle 2 and/or the potential other object(s) 4 in the subsets 30, from which positions respective entity 5 hence had the opportunity to potentially detect and store surroundings comprising the host vehicle 2 and/or the potential other object(s) 4. Accordingly, entities 6 for instance positioned at and/or within predeterminable geographical areas, distances and/or directions stipulated by the selection criteria—and/or for instance at shorter distances and/or preferred directions from the host vehicle's 2 and/or the potential other object(s)′ 4 positions as compared to one or more other identified entities 5—may be prioritized and hence selected. The phrase “fulfilling selection criteria relating to positions of said at least first entity relative positions of said host vehicle and/or said potential one or more other objects in said subsets” may refer to “fulfilling selection criteria relating to positions of said at least first entity relative positions of said host vehicle and/or said potential one or more other objects in said subsets as compared to one or more other entities of said identified entities”.
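Purely as an example, criteria relating to number of timestamps, time occurrence thereof and positions may be combined into a score by which identified entities are ranked relative one another; the weights and data shapes below are hypothetical:

```python
def score_entity(appearances, scenario_t, host_pos):
    """Score one identified entity from its observations in the subsets.

    appearances: list of (timestamp, (x, y)) tuples for this entity.
    More timestamps scores higher; a smaller gap to the scenario time
    instance and a smaller distance to the host vehicle score higher.
    The weights 0.1 and 0.01 are arbitrary, illustrative choices.
    """
    n = len(appearances)
    closest_dt = min(abs(t - scenario_t) for t, _ in appearances)
    closest_d = min(((x - host_pos[0]) ** 2 + (y - host_pos[1]) ** 2) ** 0.5
                    for _, (x, y) in appearances)
    return n - 0.1 * closest_dt - 0.01 * closest_d

def select_witnesses(entities, scenario_t, host_pos, k=1):
    """Select the k highest-scoring entities (best witness candidates)."""
    ranked = sorted(entities,
                    key=lambda e: score_entity(entities[e], scenario_t, host_pos),
                    reverse=True)
    return ranked[:k]

# Exemplifying data: entity "A" appears often, late and nearby; "B" does not.
entities = {
    "A": [(8.0, (10.0, 0.0)), (9.0, (8.0, 0.0)), (10.0, (6.0, 0.0))],
    "B": [(5.0, (100.0, 0.0)), (6.0, (90.0, 0.0))],
}
```

Under such a scoring, entity "A" would outperform "B" and hence be prioritized, in line with the comparative selection discussed above.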
Moreover, optionally, additionally or alternatively, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to viewing angle from the at least first entity 6 relative positions of the host vehicle 2 and/or the potential one or more other objects 4 in the subsets 30. Thereby, there may be taken into consideration in selection of at least a first entity 6, viewing angles of respective identified entity 5—which for instance may be derived from established poses of respective identified entity 5—in relation to the host vehicle 2 and/or the potential other object(s) 4 in the subsets 30, with which viewing angles respective entity 5 hence had the opportunity to potentially detect and store surroundings comprising the host vehicle 2 and/or the potential other object(s) 4. Accordingly, entities 6 with viewing angles for instance giving a coverage of the host vehicle's and/or the potential other object(s)′ 4 position(s) exceeding a predeterminable minimum coverage requirement stipulated by the selection criteria—and/or for instance to greater extent covering positions of the host vehicle 2 and/or the potential other object(s) 4 as compared to one or more other identified entities 5—may be prioritized and hence selected.
The phrase “viewing angles from said at least first entity” may refer to “viewing angle from the one or more surrounding detecting sensors onboard said at least first entity”, whereas “fulfilling selection criteria relating to viewing angles from said at least first entity relative positions of said host vehicle and/or said potential one or more other objects in said subsets” may refer to “fulfilling selection criteria relating to viewing angles from said at least first entity relative positions of said host vehicle and/or said potential one or more other objects in said subsets as compared to one or more other entities of said identified entities”.
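A viewing-angle condition may, as a simplified two-dimensional sketch with hypothetical names and parameters, be tested geometrically as follows:

```python
import math

def covers_position(sensor_pos, heading_rad, fov_rad, max_range_m, target_pos):
    """Illustrative 2-D field-of-view test: does a surrounding detecting
    sensor at sensor_pos, pointing along heading_rad with full opening
    angle fov_rad and range max_range_m, cover target_pos (e.g. the host
    vehicle's position in a subset)?"""
    dx = target_pos[0] - sensor_pos[0]
    dy = target_pos[1] - sensor_pos[1]
    dist = math.hypot(dx, dy)
    if dist > max_range_m:
        return False  # target beyond sensor range
    bearing = math.atan2(dy, dx)
    # Smallest signed angle between bearing to target and sensor heading.
    off = math.atan2(math.sin(bearing - heading_rad),
                     math.cos(bearing - heading_rad))
    return abs(off) <= fov_rad / 2
```

An entity whose established pose yields such coverage of the host vehicle's position, to greater extent than other identified entities, may accordingly be prioritized.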
Furthermore, additionally or alternatively, optionally, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to the at least first entity 6 comprising and/or being comprised in the one or more other objects 4 potentially involved in the unforeseen driving scenario. Thereby, there may be taken into consideration in selection of at least a first entity 6, whether any of the identified entities 5 is the one or more other objects 4—in
Moreover, optionally, additionally or alternatively, the selecting of at least a first entity 6 out of the identified entities 5 fulfilling selection criteria may comprise—and/or said selecting unit 105 may be adapted and/or configured for—selecting at least a first entity 6 out of the identified entities 5 fulfilling selection criteria relating to characteristics of the one or more surrounding detecting sensors of the at least first entity 6. Thereby, there may be taken into consideration in selection of at least a first entity 6, characteristics of respective identified entity's 5 surrounding detecting sensor(s), such as type of sensors, performance qualities etc.—derivable for instance from the subsets 30 e.g. with support from lookup tables—with which characteristics respective entity 5 hence had the opportunity to potentially detect and store surroundings comprising the host vehicle 2 and/or the potential other object(s) 4. Accordingly, entities 6 with characteristics for instance complying with predeterminable characteristics stipulated by the selection criteria, may be prioritized and hence selected.
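Deriving sensor characteristics with support from lookup tables may, purely by way of example with hypothetical sensor identifiers and specifications, look as follows:

```python
# Exemplifying lookup table mapping sensor model identifiers, as may be
# derivable from the subsets, to their characteristics.
SENSOR_LOOKUP = {
    "cam_v1": {"type": "camera", "resolution_px": (1280, 720)},
    "cam_v2": {"type": "camera", "resolution_px": (1920, 1080)},
    "radar_a": {"type": "radar", "resolution_px": None},
}

def sensors_comply(entity_sensor_ids, required_type, min_width_px):
    """Return True if any onboard sensor of an identified entity complies
    with the predeterminable characteristics stipulated by the criteria."""
    for sid in entity_sensor_ids:
        spec = SENSOR_LOOKUP.get(sid)
        if spec and spec["type"] == required_type:
            res = spec["resolution_px"]
            if res and res[0] >= min_width_px:
                return True
    return False
```

An entity carrying, for instance, a sufficiently high-resolution camera would comply and hence be eligible for prioritization, whereas one carrying only non-complying sensors would not.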
In exemplifying
Moreover, as illustrated in an exemplifying manner in
In exemplifying
In exemplifying
Communicating request data 8 prompting the at least first selected entity 6 to provide stored timestamped sensor data 63 of its surroundings, may be accomplished in any arbitrary—e.g. known—manner, such as at, from and/or via the cloud server 111 discussed above, and potentially further via one or more other automotive clouds 9 in communication with said cloud server 111 and the at least first selected entity 6. Moreover, the request data 8 may be represented by any arbitrary one or more data instructions and/or messages, e.g. wirelessly provided to the at least first selected entity 6, instructing said entity 6 to upload its stored timestamped sensor data 63 of its surroundings. The sensor data 63 of the at least first selected entity's 6 surroundings, on the other hand, may be represented by any arbitrary feasible sensor data of the world around said entity 6, such as having characteristics and/or format similar to—and/or having been stored and/or captured similarly to—the host vehicle sensor data 3 discussed above. Moreover, the requested sensor data 63 format and/or size, and further the time duration of and/or moment in time for which the requested sensor data 63 is effective, may vary between the different selected one or more entities 6.
The phrase “communicating [ . . . ] to said at least first entity” may comprise “transmitting, submitting and/or delivering [ . . . ] to said at least first entity”, “communicating directly or indirectly [ . . . ] to said at least first entity” and/or “communicating via one or more automotive clouds [ . . . ] to said at least first entity”, and according to an example further to “communicating at and/or from said cloud server [ . . . ] to said at least first entity”. The phrase “request data”, on the other hand, may refer to “a request message” and/or “instructions”, whereas “request data prompting said at least first entity to provide” may refer to “request data instructing and/or requesting said at least first entity to provide” and/or “request data prompting said at least first entity to transmit and/or upload”. Moreover, “provide stored timestamped sensor data of its surroundings” may refer to “provide its stored timestamped sensor data of its surroundings”, “provide continuously and/or intermittently stored timestamped sensor data of its surroundings captured with support from its one or more surrounding detecting sensors”, “provide recorded timestamped sensor data of its surroundings”, “provide stored timestamped sensor data of its surroundings at least pertinent and/or effective for one or more of said at least two respective separate timestamps”, and according to an example further to “provide stored timestamped sensor data of its surroundings comprising image data, video data, video clips and/or object level data”.
Optionally, the scenario reconstruction system 1 may—e.g. by means of an optional reconstruction unit 107—be adapted and/or configured for providing a reconstruction of the unforeseen driving scenario, which reconstruction comprises at least a portion of, from the at least first selected entity 6 obtained, timestamped sensor data 63 of the at least first selected entity's 6 surroundings. Thereby, a reconstruction of the unforeseen driving scenario, including scenarios leading up to—and potentially also following—said unforeseen driving scenario, may be put together, for instance for post-analysis e.g. to find the root cause(s) of the unforeseen driving scenario.
The providing of the reconstruction may for instance take place at the cloud server 111 discussed above. Furthermore, the reconstruction may be provided in any feasible—e.g. known—manner, such as by reconstructing, combining and/or concatenating the—from the one or more selected entities' 6 received—respective sensor data, taking into account temporal and geographical aspects and/or characteristics of the sensor data. The reconstruction may further comprise at least a portion of stored sensor data 3 derived and/or received from the host vehicle 2. Moreover, similarly, the reconstruction may further comprise at least a portion of sensor data of surroundings derived and/or received from an entity, e.g. a piece of infrastructure, in vicinity—but not in immediate surroundings—of the host vehicle 2, which e.g. piece of infrastructure may be provided with at least a first surrounding detecting sensor—such as a surveillance camera arrangement—having, during a time period, a range and/or viewing angle potentially capturing the host vehicle 2 and/or the potential other object(s) 4. Such a piece of infrastructure may for instance be identified by means of its geographical position in view of the host vehicle's 2—and/or the potential other object(s)′ 4—positions.
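Concatenating the received timestamped sensor data into a chronologically ordered reconstruction may, as a minimal sketch with hypothetical data shapes, be expressed as:

```python
def reconstruct_timeline(*sources):
    """Merge timestamped sensor data from the host vehicle and the one or
    more selected entities into one reconstruction timeline.

    Each source is a list of (timestamp, origin, observation) tuples; the
    reconstruction is their union, ordered by timestamp."""
    merged = [frame for src in sources for frame in src]
    merged.sort(key=lambda frame: frame[0])
    return merged

# Exemplifying data: host vehicle snapshots interleaved with a witness clip.
host = [(9.0, "host", "snapshot_h1"), (10.0, "host", "snapshot_h2")]
witness = [(9.5, "entity_6", "clip_a"), (10.5, "entity_6", "clip_b")]
timeline = reconstruct_timeline(host, witness)
```

A real reconstruction would additionally account for geographical alignment of the sensor data, as noted above; this sketch illustrates only the temporal concatenation.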
Potentially, should the reconstruction for instance be deemed insufficient and/or not reaching up to expectations, the step described herein of selecting at least a first entity 6 out of the identified entities 5, may be iterated, with modified selection criteria. Thereby, there may be selected further and/or other entities than those originally selected 6, which in turn may translate into further and/or other entities contributing to the reconstruction of the unforeseen driving scenario.
Further potentially, the reconstruction may subsequently further form a basis for identifying for instance challenging and/or problematic driving situations and/or geographical areas.
The phrase “providing a reconstruction” may refer to “providing a reconstructed scene and/or model”, “providing a scenario model”, “generating, producing, concatenating and/or presenting a reconstruction” and/or “providing a digital and/or virtual reconstruction”, and according to an example further to “providing at said cloud server a reconstruction”. The phrase “from said at least first selected entity obtained”, on the other hand, may refer to “from said at least first selected entity received and/or gathered”, whereas “timestamped sensor data of the at least first selected entity's surroundings” may refer to “timestamped sensor data of said at least first selected entity's surroundings captured continuously and/or intermittently with support from said at least first entity's one or more surrounding detecting sensors”, “timestamped sensor data of said at least first selected entity's surroundings at least pertinent and/or effective for one or more of said at least two respective separate timestamps”, and according to an example further to “timestamped sensor data of said at least first selected entity's surroundings comprising image data, video data, video clips and/or object level data”.
Further optionally, the providing of a reconstruction of the unforeseen driving scenario may comprise—and/or the optional reconstruction unit 107 may be adapted and/or configured for—providing the reconstruction in view of a digital map, which reconstruction then comprises elements, for instance road information, of the digital map. Thereby, the reconstruction may be augmented with map data. The digital map may be represented by any feasible digital map such as a high definition, HD, map, and/or an equivalent or successor thereof.
As further shown in
In Action 1001, for instance performed at the host vehicle 2, the scenario reconstruction system 1 stores—e.g. with support from the data storing unit 101—continuously and/or intermittently timestamped sensor data 3 comprising the host vehicle's 2 geographical position and surroundings captured with support from one or more surrounding detecting sensors 22 onboard the host vehicle 2.
In Action 1002, for instance performed at the host vehicle 2, the scenario reconstruction system 1 detects—e.g. with support from the scenario detecting unit 102—at a scenario time instance, involvement of the host vehicle 2 in an unforeseen driving scenario, potentially involving one or more other objects 4.
Optionally, Action 1002 of detecting involvement of the host vehicle 2 in an unforeseen driving scenario may comprise detecting, from establishment of accident criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising an accident; detecting, from establishment of critical event criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising a critical event, for instance a near accident; and/or detecting, from establishment of unforeseen takeover behaviour criteria being fulfilled, involvement of the host vehicle 2 in an unforeseen driving scenario comprising an unforeseen takeover behaviour to or from an ADAS or AD system 21 drive mode.
In Action 1003, for instance performed at the host vehicle 2, the scenario reconstruction system 1 derives—e.g. with support from the data deriving unit 103—from the timestamped sensor data 3, subsets 30 of sensor data, e.g. snapshots, pertinent plural separate timestamps prior—and potentially at and/or subsequent—the scenario time instance.
In Action 1004, for instance performed at the cloud server 111, the scenario reconstruction system 1 identifies—e.g. with support from the identifying unit 104—one or more surrounding detecting sensors-provided entities 5 respectively present in the subsets 30 pertinent at least two respective separate timestamps.
In Action 1005, for instance performed at the cloud server 111, the scenario reconstruction system 1 selects—e.g. with support from the selecting unit 105—at least a first entity 6 out of the identified entities 5, fulfilling selection criteria.
Optionally, Action 1005 of selecting at least a first entity 6 may comprise selecting at least a first entity 6 out of the identified entities 5, fulfilling selection criteria relating to number of and/or time occurrence of timestamps at which the at least first entity 6 is present in the subsets 30; positions of and/or viewing angles from the at least first entity 6 relative positions of the host vehicle 2 in the subsets 30; the at least first entity 6 comprising and/or being comprised in the one or more other objects 4 potentially involved in the unforeseen driving scenario; and/or characteristics of the one or more surrounding detecting sensors of the at least first entity 6.
Moreover, optionally, Action 1005 of selecting at least a first entity 6 may comprise selecting at least a first entity 6 out of the identified entities 5, fulfilling selection criteria to greater extent as compared to other entities of the identified entities 5.
In Action 1006, for instance performed at the cloud server 111, the scenario reconstruction system 1 communicates—e.g. with support from the request communicating unit 106—to the at least first selected entity 6, request data 8 prompting the at least first selected entity 6 to provide stored timestamped sensor data 63 of its surroundings.
In optional Action 1007, for instance performed at the cloud server 111, the scenario reconstruction system 1 may provide—e.g. with support from the optional reconstruction unit 107—a reconstruction of the unforeseen driving scenario, which reconstruction comprises at least a portion of, from the at least first selected entity 6 obtained, timestamped sensor data 63 of the at least first selected entity's 6 surroundings.
Optionally, Action 1007 of providing a reconstruction of the unforeseen driving scenario may comprise providing the reconstruction in view of a digital map, which reconstruction then comprises elements, for instance road information, of the digital map.
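The flow of Actions 1003 through 1006 may, in a highly condensed and purely exemplifying sketch with hypothetical data shapes and a deliberately simple selection criterion, be summarized as:

```python
def run_pipeline(log, scenario_t, window=2.0):
    """Condensed, illustrative sketch of Actions 1003-1006.

    log: list of timestamped snapshots, each a dict
         {"t": <timestamp>, "entities": [<entity id>, ...]}.
    """
    # Action 1003: derive subsets (snapshots) pertinent timestamps around
    # the scenario time instance.
    subsets = [s for s in log if abs(s["t"] - scenario_t) <= window]
    # Action 1004: identify entities present in the subsets pertinent at
    # least two respective separate timestamps.
    seen = {}
    for s in subsets:
        for ent in s["entities"]:
            seen.setdefault(ent, set()).add(s["t"])
    identified = [e for e, ts in seen.items() if len(ts) >= 2]
    # Action 1005: exemplifying selection criterion - most timestamps wins.
    selected = max(identified, key=lambda e: len(seen[e]), default=None)
    # Action 1006: request data prompting upload of stored sensor data.
    request = ({"to": selected, "provide": "stored timestamped sensor data"}
               if selected else None)
    return identified, selected, request

# Exemplifying host vehicle log around a scenario time instance of t = 10.
log = [
    {"t": 9.0, "entities": ["v5", "v6"]},
    {"t": 9.5, "entities": ["v6"]},
    {"t": 10.0, "entities": ["v6", "v7"]},
]
identified, selected, request = run_pipeline(log, scenario_t=10.0)
```

Actions 1001, 1002 and 1007 (storing, detection and reconstruction) are omitted from this sketch for brevity.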
The person skilled in the art realizes that the present disclosure by no means is limited to the preferred embodiments described above. On the contrary, many modifications and variations are possible within the scope of the appended claims. It should furthermore be noted that the drawings are not necessarily to scale and the dimensions of certain features may have been exaggerated for the sake of clarity. Emphasis is instead placed upon illustrating the principle of the embodiments herein. Additionally, in the claims, the word “comprising” does not exclude other elements or steps, and the indefinite article “a” or “an” does not exclude a plurality.
Number | Date | Country | Kind |
---|---|---|---|
21154258.4 | Jan 2021 | EP | regional |