This application claims priority to Japanese Patent Application No. 2019-211215 filed on Nov. 22, 2019, which is incorporated herein by reference in its entirety, including the specification, drawings and abstract.
The present disclosure relates to an image data distribution system and an image data display terminal for an on-vehicle camera.
A vehicle may be equipped with an on-vehicle camera that captures the outside or the inside of the vehicle.
Japanese Unexamined Patent Application Publication No. 2014-164316 (JP 2014-164316 A) discloses a technique in which image data of a desired on-vehicle camera of a vehicle that travels around a desired position is transmitted to a user terminal, so that the user can be informed of a current state of the position in detail.
Japanese Unexamined Patent Application Publication No. 2006-236292 (JP 2006-236292 A) discloses that image data captured by an on-vehicle camera before and after the occurrence of an accident is recorded and transmitted to an insurance entrusted company.
In Japanese Unexamined Patent Application Publication No. 2014-164316 (JP 2014-164316 A), the image data of the on-vehicle camera is merely provided to grasp the current state at a specific location.
In Japanese Unexamined Patent Application Publication No. 2006-236292 (JP 2006-236292 A), the image data of the on-vehicle camera about a specific situation, such as an accident, is merely transmitted to parties of the insurance entrusted company.
The on-vehicle camera captures image data at various positions and in various environments. It is conceivable that the convenience or satisfaction of the user can be improved by providing image data that matches conditions requested by the user.
The present disclosure provides a technique for supplying a user with image data of an on-vehicle camera captured at a position and in an environment desired by the user.
A first aspect of the present disclosure relates to an image data distribution system including a storage unit, an accepting unit, and a distribution unit. The storage unit is configured to store image data captured by an on-vehicle camera in association with imaging position information and imaging environment information. The accepting unit is configured to accept a distribution request in which an imaging position condition and an imaging environment condition are designated. The distribution unit is configured to distribute the image data associated with the imaging position information satisfying the imaging position condition and the imaging environment information satisfying the imaging environment condition.
In the first aspect of the present disclosure, the imaging environment condition may be a condition relating to a timing at which imaging is performed.
In the first aspect of the present disclosure, the imaging environment information may be information on an event occurring around a vehicle equipped with the on-vehicle camera, and the imaging environment condition may be a condition for designating the event.
In the first aspect of the present disclosure, the imaging environment condition may be a weather condition under which imaging is performed.
In the first aspect of the present disclosure, the image data distribution system may further include an editing unit configured to perform editing for time reduction or time extension on the image data, and the distribution unit may be configured to distribute the edited image data.
In the first aspect of the present disclosure, the image data distribution system may further include a receiving unit set to be communicable with a plurality of vehicles and configured to receive the image data captured by the on-vehicle camera of each vehicle, and the storage unit may be configured to store the image data received by the receiving unit.
A second aspect of the present disclosure relates to an image data display terminal including a designating unit, a receiving unit, and a display unit. The designating unit is configured to designate an imaging position condition and an imaging environment condition. The receiving unit is configured to receive image data captured by an on-vehicle camera and associated with imaging position information satisfying the imaging position condition and imaging environment information satisfying the imaging environment condition. The display unit is configured to display the received image data.
According to the aspects of the present disclosure, a user can view image data of an on-vehicle camera by designating a position and an imaging environment. Therefore, from a plurality of pieces of image data captured at the same position, data can be selected depending on the imaging environment, and the convenience or satisfaction of the user can be expected to be improved.
Features, advantages, and technical and industrial significance of exemplary embodiments of the present disclosure will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
Hereinafter, an embodiment will be described with reference to the drawings. In the description, specific aspects are shown for easy understanding, but the specific aspects are merely examples of the embodiment, and various other embodiments can be adopted.
Two vehicles 12, 14 in
The distribution system 30 is an example of an image data distribution system, and is a system built in the offices of a distribution company. The distribution system 30 can be built using a plurality of hardware devices connected to a network. The distribution system 30 includes a collection server 40, a storage server 50, and a distribution server 60. The collection server 40 receives the image data from the vehicles 12, 14 that have obtained permission to participate in the on-vehicle camera image utilization system 10, and stores the image data in the storage server 50. The storage server 50 is a storage device that stores the image data. The distribution server 60 distributes the image data according to a request of the user.
The smartphone 80 is an example of an image data display terminal, and is a portable communication terminal used by the user. By installing an application program on the smartphone 80, the user can accept distribution of the image data from the distribution system 30 and display the received image data on the display.
The on-vehicle camera 20 is a camera that is equipped on the vehicle 12 and captures a scene of the outside or the inside of the vehicle. The on-vehicle camera 20 is installed, for example, around the front end of the roof in the vehicle compartment, and captures the outside of the vehicle in front of the vehicle through the front windshield to acquire the image data. The image data is data that provides two-dimensional or three-dimensional visual information. The image data is generally a moving image, but may be still images captured at suitable time intervals. The on-vehicle camera 20 can be used as, for example, a drive recorder that records the travel status of the vehicle 12. For example, in a case where the vehicle 12 includes an autonomous driving mode, the on-vehicle camera 20 can be used as a sensor that grasps the traffic status in front of the vehicle. In the on-vehicle camera image utilization system 10, the image data of the on-vehicle camera 20 is also used in a manner that the image data is transmitted to the distribution system 30 and is distributed to a third party from the distribution system 30. A visible light camera using visible light is normally used as the on-vehicle camera 20, but cameras with various wavelength bands, such as an infrared camera and an ultraviolet camera, can also be used. Also, the on-vehicle camera 20 may capture the side or the rear of the vehicle 12 instead of the front of the vehicle.
The touch panel 22 is a display by which a driver of the vehicle 12 can perform an input operation. The user, such as the driver, can call the car navigation system on the touch panel 22 and display guidance on a route to a destination. The touch panel 22 is an example of an image data display terminal. Also, the user can display the application program of the on-vehicle camera image utilization system 10 on the touch panel 22, request distribution of the image data, and display the image data distributed from the distribution system 30. The application program can work in conjunction with the car navigation system.
The GPS 24 is an abbreviation of global positioning system, and is a sensor that detects the position of the vehicle 12 using satellites. The detection result of the GPS 24 is used as imaging position data that specifies the imaging position of the image data of the on-vehicle camera 20 of the vehicle 12. The travel route of the vehicle 12 can be recognized by reviewing the imaging position data chronologically.
The timepiece 26 is a device that keeps the date and time. The output of the timepiece 26 is used as imaging time data that specifies the imaging timing of the image data of the on-vehicle camera 20 of the vehicle 12.
The wireless communication device 28 is a device that communicates with the outside by wireless communication, such as Wi-Fi (registered trademark). The vehicle 12 transmits the captured image data, the corresponding imaging position data, and the corresponding imaging time data to the distribution system 30 through the wireless communication device 28. The vehicle 12 also receives various image data from the distribution system 30 through the wireless communication device 28.
The vehicle 12 may be further provided with a sensor that acquires data relating to a weather condition, such as a temperature sensor or an insolation sensor. The corresponding sensor output at the time of imaging may be transmitted together with the image data, as imaging weather condition data, to the distribution system 30 through the wireless communication device 28.
In the collection server 40, a collection condition setting unit 42, a data receiving unit 44, an individual data deleting processing unit 46, and a table creating unit 48 are built under the control of the application program.
The collection condition setting unit 42 sets conditions regarding the collection target of the image data of the on-vehicle camera 20. The collection condition may be set by a manager, or may be set automatically based on the program. Examples of the collection condition include designation of an area to be collected, designation of the vehicles 12, 14 to be collected in the area (the number of vehicles, the kind of vehicle, or the traveling speed), and designation of the imaging time. Setting the collection condition makes it possible to actively collect image data in an area in which, or at a time when, a small number of the vehicles 12, 14 travel. Setting the collection condition also makes it possible to prevent image data in an area in which, or at a time when, a large number of the vehicles 12, 14 travel from being collected more than needed.
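Although the disclosure does not specify an implementation, the collection condition check described above can be sketched as follows. This is an illustrative model only; the field names (`areas`, `vehicle_kinds`, `hours`) are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class CollectionCondition:
    """Illustrative collection condition: target areas, vehicle kinds, imaging hours."""
    areas: set          # names of areas to collect from
    vehicle_kinds: set  # kinds of vehicle to accept (empty set = any kind)
    hours: range        # imaging hours (local time) to accept

def meets_condition(cond, area, vehicle_kind, hour):
    """Return True when a vehicle's report matches the collection condition."""
    if area not in cond.areas:
        return False
    if cond.vehicle_kinds and vehicle_kind not in cond.vehicle_kinds:
        return False
    return hour in cond.hours

# Example: collect from areas A and B, any vehicle kind, daytime imaging only.
cond = CollectionCondition(areas={"A", "B"}, vehicle_kinds=set(), hours=range(6, 20))
```

Under this sketch, `meets_condition(cond, "A", "sedan", 9)` accepts the report, while a report from an undesignated area or outside the designated hours is rejected.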
The data receiving unit 44 is an example of a communication unit, and acquires, from the vehicles 12, 14, the image data and the corresponding imaging position data, imaging time data, and imaging weather condition data, according to the collection condition set by the collection condition setting unit 42. The data receiving unit 44 can also acquire traveling speed data at the time of imaging and vehicle kind data.
The individual data deleting processing unit 46 performs processing of deleting a part from which an individual can easily be identified, such as a person's face or a license plate included in the image data. The individual data deleting processing unit 46 discriminates a person's face or a license plate using a learning algorithm, such as deep learning, and performs processing of blurring the part.
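The detection step above relies on a learned model; the subsequent blurring of a detected region, however, can be sketched with plain array operations. The following is an illustrative stand-in only (a uniform box blur over a rectangle of a grayscale image held as a list of rows), not the disclosed implementation.

```python
def blur_region(image, top, left, height, width):
    """Blur out a rectangular region of a grayscale image by replacing every
    pixel in the region with the region's average intensity, as a stand-in
    for the shading-off of a detected face or license plate."""
    region = [row[left:left + width] for row in image[top:top + height]]
    avg = sum(sum(r) for r in region) // (height * width)
    out = [row[:] for row in image]   # leave the original image untouched
    for y in range(top, top + height):
        for x in range(left, left + width):
            out[y][x] = avg
    return out

# Example: blur the whole of a tiny 2x2 image (average intensity is 25).
sample = [[10, 20], [30, 40]]
blurred = blur_region(sample, 0, 0, 2, 2)
```

A real system would apply this only to regions returned by the face/plate detector, and would typically use a Gaussian rather than a box blur.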
The table creating unit 48 creates a table for searching the image data efficiently based on the imaging position data, the imaging time data, and the imaging weather condition data received with the image data. The table is created so as to include the imaging position information and the imaging environment information.
The imaging position information is information for specifying the position at which the image data is captured, and is basically arranged based on the received imaging position data. The imaging environment information is information relating to the timing or the weather condition under which the image data is captured. The timing at which the image data is captured is basically given by the imaging time data. In a case where the image data is captured around an event area during an event (for example, an artificial event, such as a festival or a sporting event held in the area, or a natural event, such as an earthquake or cherry blossoms in the area), the event can be included as imaging environment information relating to the timing. The imaging environment information relating to a weather condition is information on the weather, the wind direction and wind speed, and the temperature. The imaging environment information relating to a weather condition can be acquired based on information provided from the meteorological agency, or may be acquired using the imaging weather condition data acquired from the vehicle 12.
The storage server 50 is an example of a storage unit, and stores a table 52 created by the table creating unit 48 and image data 54. The storage server 50 can store the table 52 corresponding to the image data 54 captured in various periods and environments, within the country, in foreign countries, and around the world.
The distribution server 60 is an example of a distribution unit, and includes a distribution request accepting unit 62, an image searching unit 64, an image editing unit 66, and a distribution unit 68.
The distribution request accepting unit 62 is an example of an accepting unit, and accepts a distribution request for the image data from the touch panel 22 of the vehicles 12, 14 or the smartphone 80. In a case where the distribution request is made, the imaging position condition and the imaging environment condition may be designated.
The imaging position condition is a condition corresponding to the imaging position information, and designates the imaging position. For example, a condition that designates a start position, an end position, and a route between the start position and the end position is included in the imaging position condition. The imaging position condition may be a condition that broadly designates the imaging position. Examples of the broad designation include designating solely the start position and the end position, designating a travel road and one point included in the road, designating the start position and a travel direction, and designating the name of an area (for example, a city name, a tourist spot name, or a park name). The broad designation may also be the name of a specific location (for example, a station, a public facility, or a building). In this case, the periphery of the location corresponding to the name, or an area from which the location corresponding to the name is seen, can be set as the imaging position condition. A characteristic shared by a plurality of positions may also be designated as the imaging position condition; roads along the coast, autumn-foliage spots, and World Heritage cities are examples of designating a plurality of positions. In this case, for example, an aspect in which the corresponding image data is sequentially displayed according to a set priority order is conceivable.
The imaging environment condition is a condition corresponding to the imaging environment information, and is to designate a specific timing or weather condition under which imaging is performed. Examples of designating a specific timing include a year, a season, a month, a day, an hour, a day of the week, and an event (festival or occurrence of an earthquake) in which the imaging is performed. A weather condition includes information on the wind direction and the wind speed, a temperature, and a humidity in addition to weather such as clear, cloudy, rainy, foggy, and snowy. A weather condition also includes storms and tornadoes caused by typhoons.
In a case where the distribution request accepting unit 62 accepts a distribution request, the image searching unit 64 searches for the image data based on the imaging position condition and the imaging environment condition. That is, the image searching unit 64 searches for the corresponding image data 54 from the table 52 of the storage server 50 using the imaging position condition and the imaging environment condition as search keys. In a case where a plurality of pieces of image data 54 that satisfy the conditions is present, the image data 54 may be presented to the user and selected by the user, or may be selected according to a suitable algorithm. In a case where no image data 54 that satisfies the conditions is present, a plurality of pieces of image data 54 may be combined to satisfy the conditions, or image data 54 that does not satisfy the conditions but is close to them may be selected.
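The two-key search described above can be sketched as follows. This is an assumption-laden illustration, not the disclosed implementation: table entries are modeled as dictionaries, the imaging position condition as a set of positions that must all appear on the recorded route, and the imaging environment condition as key-value pairs that must match exactly.

```python
def search_images(table, position_condition, environment_condition):
    """Return entries whose recorded route covers every requested position and
    whose imaging environment fields match the environment condition."""
    results = []
    for entry in table:
        positions = {pos for pos, _time in entry["route_and_time"]}
        if not position_condition <= positions:   # every requested position visited?
            continue
        if all(entry.get(k) == v for k, v in environment_condition.items()):
            results.append(entry)
    return results

# Illustrative table rows (data numbers and routes are hypothetical).
table = [
    {"data_number": 5001, "route_and_time": [("A", "10:00"), ("B", "10:05")],
     "weather": "clear", "time_zone": "morning"},
    {"data_number": 5030, "route_and_time": [("A", "09:00"), ("D", "09:07")],
     "weather": "rainy", "time_zone": "morning"},
]
hits = search_images(table, {"A"}, {"weather": "clear"})
```

A production system would also need the approximate matching mentioned above (combining or relaxing results when no entry satisfies the conditions exactly), which this sketch omits.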
The image editing unit 66 is an example of an editing unit, and performs editing on the image data 54 to be distributed. Editing includes processing that performs time extension of reproduction, such as slow-motion reproduction, and processing that performs time reduction of reproduction, such as fast-forward reproduction, continuous reproduction of still images with time intervals, and omission of similar scenes. The image editing unit 66 also performs continuous reproduction processing in a case where a plurality of pieces of image data 54 is selected. The image editing unit 66 may perform editing automatically according to the setting, or may perform editing based on an instruction from the user.
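At the frame level, time reduction and time extension can be sketched as simple resampling of a frame sequence; this is an illustrative simplification, assuming the image data is available as an ordered list of frames.

```python
def reduce_time(frames, step):
    """Time reduction: keep every `step`-th frame (fast-forward style)."""
    return frames[::step]

def extend_time(frames, factor):
    """Time extension: repeat each frame `factor` times (slow-motion style,
    without interpolation between frames)."""
    return [f for f in frames for _ in range(factor)]
```

For example, `reduce_time(frames, 2)` halves the playback time, and `extend_time(frames, 3)` triples it; a real editor would interpolate intermediate frames for smooth slow motion rather than duplicating them.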
The distribution unit 68 performs distribution of the image data. The distribution can be performed by various methods, such as a streaming method and a download method.
An example of collection of the image data will be described with reference to
In a case where the vehicles 12, 14, 16, 18 meet the collection condition set by the collection condition setting unit 42, the data receiving unit 44 receives the image data captured by the on-vehicle camera 20 of the vehicles 12, 14, 16, 18 together with the imaging position data and the imaging time data. The image data is then processed by the individual data deleting processing unit 46 to delete individual data, and is subjected to the table creation processing by the table creating unit 48.
The “data number” indicates a number given to the image data 54 stored in the storage server 50. The “route and time” sequentially describes the times at which the vehicle travels at positions set on the map. The “year/month/day”, the “day of week”, and the “time zone” show the date, the day of the week, and the time zone in which the vehicle travels. The “weather” is an example of a weather condition, and shows weather information, such as clear and rainy.
In the example shown in
Similarly, the image data captured by the vehicle 16 is recorded as the data number “5030”, and indicates that the vehicle 16 has arrived at the position E via the positions A, B, and D and has stopped. The data of the data number “5088” captured by the vehicle 18 includes a record in which the vehicle 18 travels at the position G, the position D, the position B, and the position C. Then, the data of the data number “5124” captured by the vehicle 14 includes a record in which the vehicle 14 passes through the position C, the position B, the position D, the position E, and the position F.
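A row of the table described above (data number, route and time, year/month/day, day of week, time zone, weather) can be sketched as a record type. The column names follow the description; the concrete times and values below are hypothetical, except that data number "5030" and the route via the positions A, B, D to E are taken from the example above.

```python
from dataclasses import dataclass

@dataclass
class TableEntry:
    """Illustrative row of the search table created by the table creating unit."""
    data_number: int      # number given to the stored image data
    route_and_time: list  # [(position, time), ...] in travel order
    date: str             # "year/month/day"
    day_of_week: str
    time_zone: str        # e.g. "morning"
    weather: str          # e.g. "clear"

# The record for data number 5030: arrival at E via A, B, and D (times hypothetical).
entry = TableEntry(
    data_number=5030,
    route_and_time=[("A", "10:00"), ("B", "10:04"), ("D", "10:10"), ("E", "10:18")],
    date="2019/11/22",
    day_of_week="Fri",
    time_zone="morning",
    weather="clear",
)
```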
Examples of distribution and display of the image data will be described with reference to
In the example of
On the screen of the car navigation system 110, the imaging environment condition can be designated. Specifically, buttons of “season”, “time zone”, and “weather” are set below the car navigation system 110. These buttons are an example of a designating unit for designating the imaging environment condition.
In the example of
By operating the “weather” button, the user can select “clear”, “cloudy”, “rainy”, or “snowy”. The user sets the imaging environment condition relating to a weather condition at the time of imaging. In a case where the user does not operate the “weather” button, for example, a setting value that is prepared in advance is adopted.
In a case where the user operates a “reproduction start” button shown in FIG. 6, the vehicle 12 transmits a distribution request to the distribution system 30. In this case, in the distribution server 60, the distribution request accepting unit 62 accepts the distribution request, and the image searching unit 64 searches for the image data according to the set imaging position condition and the set imaging environment condition. Searching is performed by referring to the table shown in
The user can view the image data of the on-vehicle camera by designating the season or the weather, in addition to designating the position. Therefore, the range of utilization of the image data is expanded; for example, a drive can be simulated at a time when the autumn leaves are beautiful or when the night view is beautiful.
The distribution of the image data can be similarly requested from the smartphone 80 shown in
In the above description, only the display aspect of the image data is described, but for example, audio output may be performed in accordance with the display of the image data. The output audio data may be recorded at the time of capturing the image data, or may be other data (sound effect or music). As an example, in a case where the winter season is selected as the imaging environment condition, outputting a sound effect or music related to the designated imaging environment condition, such as playing music with a winter theme, is conceivable.
In the example described above, the aspect in which the past image data is displayed according to the imaging position condition and the imaging environment condition is described. However, for example, the current image data can be displayed according to the imaging position condition.
The configuration of the on-vehicle camera image utilization system 10 described above is merely an example, and can be variously modified. For example, in the example shown in
Number | Date | Country | Kind |
---|---|---|---|
JP2019-211215 | Nov 2019 | JP | national |
Number | Date | Country |
---|---|---|
2003-274382 | Sep 2003 | JP
2006-236292 | Sep 2006 | JP
2008-165033 | Jul 2008 | JP
2011-141762 | Jul 2011 | JP
2014-164316 | Sep 2014 | JP
2015-161592 | Sep 2015 | JP
2017-204104 | Nov 2017 | JP
2018-133055 | Aug 2018 | JP
2019-021187 | Feb 2019 | JP
Number | Date | Country | |
---|---|---|---|
20210158632 A1 | May 2021 | US |