The present disclosure relates to an information providing system that includes an image capture device connectable to a network and an information providing device that provides information to the image capture device, and to a computer program to be used in this system.
When shooting on location to obtain a video used in a movie or a TV drama, it is a common practice to research the circumstances of the location and actually visit it in advance in order to determine whether the location is suitable for the scene intended to be shot. Such a preliminary check often precedes the shooting on the scheduled day because location shooting is greatly affected by the surrounding environment.
Various examples are known of systems that assist in location shooting by providing an image capture device (hereinafter also referred to as “camera”) with information about the surrounding environment from a server when the image capture device is used on location.
Patent Document No. 1 discloses an example in which an information terminal having a Global Positioning System (GPS) receiver accesses a server via a network to obtain, from the server, information about objects in scenery visible from the current position, and displays the information.
Patent Document No. 2 discloses an example in which a warning is displayed to a user of an image capture device by determining, based on information about the image capture device such as its current position, orientation, and tilt, together with information about the current solar altitude, weather, and the like, whether the scene within the shooting range, which is determined by the field angle of the image capture device, is backlit by the sun.
The present disclosure provides a new technique that enables a user to grasp various types of information for assisting in shooting on location in advance.
An information providing device according to one embodiment of the present disclosure includes: an obtaining section that obtains first information indicating a position and direction of an image capture device; and a control section that obtains, from a storage medium, based on the first information, second information indicating geography and layout of buildings in surroundings of the image capture device, and third information indicating at least one of a running status of a public transportation system and a solar orbit, determines whether or not one of a shooting object and a non-shooting object is within a shooting range of the image capture device based on the first information to the third information, and outputs information indicating a result of the determination.
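By way of illustration only, the relationship among the first to third information and the determination described above can be sketched as follows in Python; all identifiers are hypothetical and are not part of the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class FirstInfo:
    """Position and direction of the image capture device."""
    latitude: float
    longitude: float
    azimuth_deg: float      # horizontal direction the camera points in
    elevation_deg: float    # vertical tilt of the camera
    field_angle_deg: float  # horizontal field angle

@dataclass
class SecondInfo:
    """Geography and layout of buildings in the surroundings."""
    terrain: list = field(default_factory=list)    # e.g. height-map samples
    buildings: list = field(default_factory=list)  # e.g. footprints with heights

@dataclass
class ThirdInfo:
    """Running status of a public transportation system and/or the solar orbit."""
    vehicle_positions: Optional[list] = None
    solar_orbit: Optional[list] = None

def determine(first: FirstInfo, second: SecondInfo, third: ThirdInfo) -> dict:
    """Decide whether the shooting/non-shooting object is within the shooting range."""
    # The geometric tests are sketched in the later examples.
    return {"in_shooting_range": False}
```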
According to the present disclosure, it is possible to enable the user to grasp the various types of information for assisting in the shooting on location in advance.
Embodiments are described in detail below with reference to the drawings as appropriate. Descriptions more detailed than necessary, however, may be omitted. For instance, a detailed description of a well-known matter or a redundant description of substantially the same components is omitted in some cases. This is to avoid lengthening the following description unnecessarily and to facilitate the understanding of a person skilled in the art.
The inventor of the present disclosure provides the accompanying drawings and the following description so that a person skilled in the art can fully understand the present disclosure; they are not intended to limit the subject matter described in the claims.
Before specific embodiments are described, a description is first given of the problems of conventional technologies that are solved by embodiments of the present disclosure, together with an outline of the embodiments disclosed herein.
When shooting on location in order to obtain a video used in a TV program or a movie, there are cases where, in spite of a preliminary check, an incident on the day of location shooting forces a camera crew to stop shooting or change location. One such case is an interruption of shooting caused by the operation of a public transportation system. Specifically, noise from an airplane flying over the location during shooting can stop the shooting, a train or bus passing nearby can hinder and stop the shooting, and an airplane, train, or bus passing through the perimeter can be captured accidentally in the background of a video being shot.
There are also cases where sunlight greatly affects location shooting. Specifically, a location that has been sunny in the preliminary check can be in the shadow of a building at the time shooting takes place. Conversely, there can be a case where the sky has been cloudy at the time of the preliminary check and the location turns out to be a backlit place on the day of shooting when the sky is clear. Various other factors can necessitate the interruption of shooting and a sudden change of location. These problems cannot be avoided with the methods disclosed in Patent Document Nos. 1 and 2.
On the other hand, there may also be cases where a vehicle of a public transportation system, the sun, a shaded place, or the like is actively sought after for shooting. For instance, the user may wish to shoot an airplane or a train, a sunrise or a sunset, or a scene with the main subject in the shade. In such cases, there is a chance, in spite of a preliminary check, that the desired object cannot be shot as planned because of a difference in the time of day, weather, or the like between the preliminary check and the actual shooting. In order to shoot successfully at the scheduled time, the preliminary check needs to be thorough.
The inventor of the present disclosure identified the problems described above and arrived at the technique of the present disclosure. According to an embodiment of the present disclosure, the interruption of shooting or a change of location can be prevented by enabling a user to grasp various types of information for assisting in shooting on location at the time of the shooting or prior to the shooting. According to another embodiment, a user knows whether or not an object that the user wishes to film is going to be shot properly, and the thorough preliminary check can therefore be simplified.
In the subsequent Step S12, the network communication section 36 in the information providing device 30 receives the information sent from the image capture device 10. In Step S13, the control section 34 in the information providing device 30 obtains second information, which indicates the geography and the spatial layout of buildings in the surroundings of the image capture device 10, from the storage medium 40, based on the received first information. The storage medium 40 stores a database that contains information about the geography (including information on mountains, rivers, oceans, and trees) and about the layout of buildings in the three-dimensional space (hereinafter may be referred to as “surrounding environment database”). Out of the information contained in the surrounding environment database, the control section 34 obtains information about the surroundings of the image capture device 10 as the second information. The “surroundings” here can stretch to, for example, a radius of several tens of meters to several kilometers, depending on the shooting conditions and shooting objects.
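A minimal sketch of how such a radius-limited query against a surrounding environment database might look, assuming a flat list of rows with hypothetical "lat"/"lon" keys (a real implementation would use a spatial index):

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def query_surroundings(db_rows, cam_lat, cam_lon, radius_m=2_000.0):
    """Select database entries (buildings, terrain, trees...) near the camera."""
    return [row for row in db_rows
            if haversine_m(cam_lat, cam_lon, row["lat"], row["lon"]) <= radius_m]
```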
In the subsequent Step S14, the control section 34 obtains third information, which indicates at least one of the running status of a public transportation system and the solar orbit, from the storage medium 40, based on the first information. The storage medium 40 stores a database that contains information about at least one of the running status of a public transportation system (airplanes, trains, buses, or the like) and the solar orbit. Out of the information contained in this database, the control section 34 obtains, as the third information, information about at least one of a public transportation system and the sun that has a possibility of affecting shooting with the image capture device 10 at the current position. The third information depends on an object planned to be shot with the image capture device 10 (hereinafter may be referred to as “shooting object”) and an object whose capture with the image capture device 10 is to be avoided (hereinafter may be referred to as “non-shooting object”). For instance, in a use where capturing a vehicle of a public transportation system is to be avoided, the vehicle of the public transportation system is the “non-shooting object” and information indicating the running status of the public transportation system is obtained as the third information. In a use where a sunrise is shot, on the other hand, the sun is the “shooting object” and information indicating the solar orbit is obtained as the third information. The shooting object and the non-shooting object vary from one embodiment to another, and various modes are feasible. Patterns of the shooting object and the non-shooting object are described later.
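The selection of the third information according to the shooting object and the non-shooting object could be sketched as follows; the object labels and database keys are hypothetical.

```python
def select_third_info(shooting_object, non_shooting_object, db):
    """Pick which data set serves as the third information."""
    third = {}
    objects = {shooting_object, non_shooting_object}
    if "transport_vehicle" in objects:
        third["running_status"] = db["public_transport"]   # real-time positions
    if objects & {"sun", "sunrise", "shade"}:
        third["solar_orbit"] = db["solar_orbit"]           # sun path for the date
    return third
```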
In the subsequent Step S15, the control section 34 determines whether or not the shooting object or the non-shooting object is within the shooting range based on the obtained first to third information. The “shooting range” means a range in the three-dimensional space that is displayed in a video obtained through shooting with the image capture device 10. For instance, when the sun is situated outside a range defined by the field angle of the image capture device 10, the sun is outside the shooting range at that time. Even when the sun is situated inside the range defined by the field angle, if blocked by a physical object such as a mountain or a building and not shown on the video, the sun is outside the shooting range at that time. In the following description, a physical object being situated within the shooting range may be expressed as “being captured in the shot”.
When the non-shooting object is a vehicle of a public transportation system, for example, the control section 34 determines whether or not the vehicle of the public transportation system is within the shooting range. When the shooting object is a sunrise, the control section 34 determines whether or not the sun is within the shooting range. The control section 34 thus performs determining processing suited to the shooting object and the non-shooting object. These are determined in a comprehensive manner from the position and direction of the image capture device 10, the geography and the spatial layout of buildings in the surroundings of the image capture device 10, and the movement of a vehicle of a public transportation system and/or of the sun, based on the first to third information. For instance, the control section 34 identifies a range that is defined by the field angle of the image capture device 10 based on the position and direction of the image capture device 10 and, from the positional relation of the shooting object (or the non-shooting object) to mountains, trees, or buildings that are within the shooting range, determines whether or not the shooting object (or the non-shooting object) is going to appear on the shot.
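A minimal geometric sketch of this determination, assuming directions are expressed as azimuth/elevation angles in degrees and each obstacle is reduced to the azimuth span and top elevation angle of the silhouette it presents to the camera (all structures hypothetical):

```python
import math

def within_field_angle(cam_azim, cam_elev, h_fov, v_fov, obj_azim, obj_elev):
    """True if the object's direction lies inside the camera's angular range."""
    d_azim = (obj_azim - cam_azim + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    d_elev = obj_elev - cam_elev
    return abs(d_azim) <= h_fov / 2 and abs(d_elev) <= v_fov / 2

def in_shooting_range(cam, obj_dir, obstacles):
    """Object is in the shooting range if inside the field angle AND not hidden."""
    if not within_field_angle(cam["azim"], cam["elev"], cam["h_fov"],
                              cam["v_fov"], obj_dir["azim"], obj_dir["elev"]):
        return False
    # An obstacle (mountain, building) hides the object when it covers the same
    # azimuth and its silhouette rises above the object's elevation angle.
    # (Spans crossing due north are not handled in this simplified test.)
    for ob in obstacles:
        if ob["azim_min"] <= obj_dir["azim"] <= ob["azim_max"] \
                and ob["elev_top"] >= obj_dir["elev"]:
            return False
    return True
```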
In the subsequent Step S16, the control section 34 transmits information indicating the result of the determination to the image capture device 10 via the network communication section 36. In Step S17, the image capture device 10 receives the information indicating the result of the determination via the network communication section 16, and the control section 14 displays the result of the determination on the displaying section.
Through the operation described above, the user of the image capture device 10 is informed of whether the “shooting object” which the user wishes to film or the “non-shooting object” which the user does not wish to film is within the shooting range (whether or not the object is going to appear on the shot).
Determination made by the control section 34 in the information providing device 30 is not limited to the determination described above, and the control section 34 may perform various types of determination processing suited to the type of the shooting object or the non-shooting object to notify the result thereof to the image capture device 10. Typical shooting object/non-shooting object patterns and examples of the specifics of determination are described below.
Example 3 is an example of the case where the non-shooting object is the sun, in other words, the case where backlit shooting is to be avoided. Example 4 is an example of the opposite case where the sun is the shooting object, e.g., when a sunrise, a sunset, a solar eclipse, or the like is to be shot intentionally. The specifics of determination in Examples 3 and 4 can be, for example, (i) whether or not the sun is within the shooting range, (ii) the time at which the sun enters the shooting range or the length of time till the sun enters the shooting range, and (iii) the direction of the image capture device that puts the sun inside the shooting range. The control section 34 determines about these matters and notifies the result of the determination to the image capture device 10, thereby enabling the user to take actions to avoid backlit shooting or actions to intentionally shoot a sunrise or a sunset.
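For item (ii), the entry time can be found by stepping the solar position forward in time, as in the following sketch; it reuses within_field_angle from the earlier sketch and takes a sun_direction(t, lat, lon) helper such as the simplified one sketched in the second embodiment below.

```python
from datetime import timedelta

def time_until_sun_in_range(cam, lat, lon, start, sun_direction,
                            horizon_minutes=12 * 60, step_minutes=1):
    """Scan forward to find when the sun first enters the shooting range.

    `sun_direction(t, lat, lon)` must return the solar (azimuth, elevation)
    in degrees; `start` is a timezone-aware UTC datetime.
    """
    for m in range(0, horizon_minutes, step_minutes):
        t = start + timedelta(minutes=m)
        azim, elev = sun_direction(t, lat, lon)
        if within_field_angle(cam["azim"], cam["elev"], cam["h_fov"],
                              cam["v_fov"], azim, elev):
            return t, timedelta(minutes=m)   # entry time and time remaining
    return None, None
```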
Example 5 is an example of the case where the non-shooting object is a shade. Example 6 is an example of the opposite case where the shooting object is a shade. The specifics of determination in Examples 5 and 6 can be, for example, (i) whether or not a shade is within the shooting range, (ii) whether or not the main subject is going to be in a shade, (iii) the time at which the motion of the sun puts the main subject in a shade or the length of time till the motion of the sun puts the main subject in a shade, (iv) the direction of the image capture device that puts the main subject in a shade, and (v) the proportion of the shade to the entire screen. The control section 34 determines about these matters and notifies the result of the determination to the image capture device 10, thereby enabling the user to take actions to avoid shooting a shade or actions to intentionally shoot a shade.
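Items (i), (ii), and (v) reduce to a point-in-shadow test plus sampling. The sketch below crudely models each building as a cylinder in local east/north metres; it is illustrative geometry only, not the disclosed method.

```python
import math

def is_in_shade(point, sun_azim_deg, sun_elev_deg, buildings):
    """A ground point is shaded when some building blocks its line to the sun.

    `point` is {"x", "y"}; each building is {"x", "y", "radius", "height"}
    in local metres, x east and y north.
    """
    if sun_elev_deg <= 0.0:
        return True                                   # sun below the horizon
    for b in buildings:
        dx, dy = b["x"] - point["x"], b["y"] - point["y"]
        dist = math.hypot(dx, dy)
        if dist <= b["radius"]:
            return True                               # point is under the building
        azim_to_b = math.degrees(math.atan2(dx, dy)) % 360.0  # bearing from north
        d_azim = (azim_to_b - sun_azim_deg + 180.0) % 360.0 - 180.0
        half_width = math.degrees(math.atan2(b["radius"], dist))
        top_elev = math.degrees(math.atan2(b["height"], dist - b["radius"]))
        # The building hides the sun when it sits toward the sun and its
        # silhouette rises above the sun's elevation angle.
        if abs(d_azim) <= half_width and top_elev >= sun_elev_deg:
            return True
    return False

def shade_proportion(grid_points, sun_azim_deg, sun_elev_deg, buildings):
    """Fraction of sampled shooting-range points that are in a shade."""
    shaded = sum(is_in_shade(p, sun_azim_deg, sun_elev_deg, buildings)
                 for p in grid_points)
    return shaded / len(grid_points)
```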
Example 7 is an example of the case where the shooting object and the non-shooting object both are set. In this example, a vehicle of a public transportation system is set as the shooting object and the sun is set as the non-shooting object. The specifics of determination in Example 7 can be an arbitrary combination of the specifics of determination in Example 2 and Example 3. The control section 34 determines about those matters and notifies the result of the determination to the image capture device 10, thereby enabling the user to take actions to film the vehicle of the public transportation system while avoiding shooting the vehicle backlit. The shooting object and the non-shooting object can thus be set both. Although Example 7 assumes the case where a vehicle of a public transportation system is shot intentionally and backlit shooting is avoided, other combinations of the shooting object and the non-shooting object can be used.
As described above, according to the information providing system of this disclosure, various matters, including whether or not the shooting object or the non-shooting object is within the shooting range, are determined based on information about the position and direction of the image capture device, the geography, buildings, public transportation systems, or the sun, and the results of the determination are notified to the image capture device. The information providing device can thus provide the user, before shooting starts, with such information as how many minutes remain until a vehicle of a public transportation system, e.g., an airplane, a train, or a bus, passes through the shooting range. Consequently, situations such as being forced to stop shooting can be prevented and efficient filming work is accomplished. The information providing system can also provide the user, in advance, with such information as whether the location is in a shade, or whether the location is backlit, at the scheduled date/time of shooting. Situations such as undergoing a change of location on the day of shooting can be prevented and efficient filming work is accomplished. Moreover, in the case where a vehicle of a public transportation system, a sunrise, or the like is to be shot intentionally, preparations can be simplified because an appropriate shooting time and an appropriate direction of the image capture device are grasped in advance.
More specific embodiments of the present disclosure are described below.
A first embodiment is described first. This embodiment relates to an information providing system that provides various types of information to a user in order to prevent an image capture device from accidentally capturing a vehicle of a public transportation system in the shot. In this embodiment, the user is provided with various types of information about a vehicle of a public transportation system that is the “non-shooting object”.
[1-1. Configuration]
The network 210 connects the camera 100 to the information providing server 220.
The information providing server 220 is a server computer (information processing device) that corresponds to the information providing device 30 in the description given above, and its configuration is the same as that of the information providing device 30.
The map database 230 provides a map and geographical data of an arbitrary spot. The building database 240 provides data about the shapes and sizes of buildings, i.e., data indicating their spatial layout in a three-dimensional coordinate system. The public transportation system database 250 provides real-time running status data such as the current whereabouts of a traveling vehicle of a public transportation system, e.g., a train, a bus, or an airplane. The map database 230 and the building database 240 may be integrated as a three-dimensional map database.
The codec 120 is a circuit that compresses/decompresses a video signal generated by the image capturing section 110 and outputs the signal. The image displaying section 130 is a display capable of displaying an obtained video and various types of setting information. The control section 140 is a processor that controls the overall operation of the camera 100, such as a central processing unit (CPU) or a microcomputer. The control section 140 controls the respective sections by executing a control program. The storage medium 170 is a memory such as a DRAM, and stores the control program executed by the control section 140 and various types of data generated in the process of computing. The control section 140 may be implemented by a combination of hardware such as an integrated circuit and software (a program), or may be implemented by hardware alone.
The network communication section 160 is a network interface capable of transmitting and receiving information over the network 210. The field angle detecting section 186 is a detection mechanism for identifying the field angle based on the zoom value and the size of the image pickup device in the image capturing section 110. The position sensor 180 is a sensor that detects the position of the camera 100 and can be implemented by, for example, a receiver that receives GPS signals. The orientation sensor 182 is a sensor that detects the orientation of the camera 100 and can be implemented by, for example, a magnetic compass. The elevation angle sensor 184 is a sensor that detects the elevation angle of the camera 100 and can be implemented by, for example, an acceleration sensor.
The camera 100 can include, in addition to the components described above, other components (not shown). For instance, the camera 100 may include an operation panel which is operated by the user, a power supply circuit which supplies power to the respective sections, a camera shake correcting mechanism, a microphone, an audio processing circuit, and a speaker. The camera 100 can have any configuration as long as the following operation can be carried out.
[1-2. Operation]
The operation of the camera 100 configured as above is described. Video signals obtained by the image capturing section 110 are compressed by the codec 120. The compressed video data is transferred via the bus 150 to the storage medium 170 to be recorded as a video file. The control section 140 exerts control on the transfer of the video data via the bus 150 and on the operation of recording the video data as a file. Through the operation described above, the camera 100 records video. Audio has low relevance to the technique of this embodiment, and a description about audio recording is omitted. The camera 100 can use known technologies to record audio.
The camera 100 can identify its current position by receiving, for example, GPS signals with the use of the position sensor 180. The camera 100 can also identify the orientation in which it is pointed (the angle in the horizontal direction) with the use of the orientation sensor 182. The camera 100 can further identify the elevation angle in the direction in which it is pointed (the angle in the vertical direction) with the use of the elevation angle sensor 184. Using the field angle detecting section 186, the camera 100 detects the zoom value and other optical parameters from the optical system of the image capturing section 110, such as the lens and the image pickup device, and identifies the field angle at which the shot is taken.
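The field angle follows from the focal length set by the zoom and the width of the image pickup device, via field angle = 2·atan(sensor width / (2·focal length)). A minimal sketch (the sensor width below is an assumed example value, not a value from the disclosure):

```python
import math

def field_angle_deg(sensor_width_mm, focal_length_mm):
    """Horizontal field angle from sensor width and current (zoomed) focal length."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

# Example: a 24.6 mm wide sensor at a 50 mm focal length gives about 27.7 degrees.
print(round(field_angle_deg(24.6, 50.0), 1))
```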
The network communication section 160 connects the camera 100 to the network 210 by cable or wireless connection. The camera 100 can transmit, through the network communication section 160, data detected by the sensors of the camera 100 to the network 210.
The control section 140 in the camera 100 transmits, via the network communication section 160, data detected respectively by the position sensor 180, the orientation sensor 182, the elevation angle sensor 184, and the field angle detecting section 186 to the network 210. The data transmitted from the camera 100 is transmitted to the information providing server 220 via the network 210.
The information providing server 220 first accesses the map database 230 based on the position information detected by the position sensor 180 to obtain map data of the surroundings of the current position of the camera 100.
The information providing server 220 next identifies buildings that may be captured in the shot taken with the camera 100 (be within the shooting range) based on the obtained map data, orientation information detected by the orientation sensor 182, elevation angle information detected by the elevation angle sensor 184, and field angle information detected by the field angle detecting section 186, and obtains data about the spatial layout of those buildings from the building database 240. From the map data, the information providing server 220 also identifies a railway, a road, an air route, or the like of a public transportation system that falls within the field angle range of the camera 100.
The information providing server 220 next accesses the public transportation system database 250 for the identified vehicle of the public transportation system to obtain information about the real-time whereabouts of a traveling train, bus, airplane, or the like. Having thus found the current whereabouts of the identified vehicle, the information providing server 220 can grasp in advance how many minutes remain until a train, a bus, an airplane, or the like passes through the current field angle range of the camera 100.
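A simplified sketch of this timing estimate, assuming the vehicle's route is known as a polyline in local metres around the camera and the vehicle moves at a constant speed; a real implementation would intersect route segments with the field angle wedge rather than testing successive vertices.

```python
import math

def minutes_until_vehicle_in_range(cam, route, vehicle_pos_idx, speed_m_s):
    """ETA in minutes until a vehicle travelling along `route` enters the wedge.

    `route` is a list of (x, y) points in local metres (x east, y north);
    `cam` carries "x", "y", "azim" and the horizontal field angle "h_fov".
    """
    def azim(p):
        return math.degrees(math.atan2(p[0] - cam["x"], p[1] - cam["y"])) % 360.0
    travelled = 0.0
    for i in range(vehicle_pos_idx, len(route) - 1):
        seg = math.dist(route[i], route[i + 1])
        d = (azim(route[i + 1]) - cam["azim"] + 180.0) % 360.0 - 180.0
        if abs(d) <= cam["h_fov"] / 2:           # next vertex is inside the wedge
            return (travelled + seg) / speed_m_s / 60.0
        travelled += seg
    return None                                   # route never crosses the wedge
```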
The information providing server 220 transmits, via the network 210, to the camera 100, information about the detected passing of a vehicle of a public transportation system. The control section 140 in the camera 100 obtains this information via the network communication section 160. Based on the received information, the control section 140 displays the information about the passing of a vehicle of a public transportation system on the image displaying section 130. The user of the camera 100 can thus be notified in advance of the fact that a train, a bus, an airplane, or the like is going to be captured in the video.
Processing executed by the information providing server 220 of this embodiment is described next with reference to a flow chart.
The control section 140 of the camera 100 first transmits the data detected by the respective sensors to the information providing server 220; the information providing server 220 then obtains the map data, the building data, and the running status data, performs the determination described above, and returns the result to the camera 100 for display.
[1-3. Effects and the Like]
As described, the information providing server 220 in this embodiment identifies a vehicle of a public transportation system that has a possibility of being captured in the shot taken with the camera 100 by utilizing information about the current position, orientation, elevation angle, and field angle of the camera 100, together with map information and building information of the surroundings of the camera 100. The information providing server 220 further refers to the running status of the vehicle of the public transportation system to determine how many minutes remain until a train, a bus, an airplane, or the like passes through the current field angle range of the camera 100, and provides information indicating the result of the determination to the camera 100. The user thus grasps how many minutes remain until such a vehicle passes through the field angle range of the camera 100 during filming, and can avoid situations such as having to stop shooting halfway.
This embodiment corresponds to Example 1 described above.
The functions of this embodiment described above can also be used when a scene in which a vehicle of a public transportation system passes through is shot intentionally. In this case, too, the camera 100 and the information providing server 220 operate in the same way as above.
A second embodiment is described next. This embodiment deals with an information providing system that provides a user with information by detecting, in advance, the sunlight conditions in location shooting, in particular, sunlight shining from behind a subject, namely, backlight. In this embodiment, the user is provided with various types of information about sunlight, which is the “non-shooting object”.
[2-1. Configuration]
The overall configuration of the information providing system in this embodiment is the same as the overall configuration of the first embodiment, except that the camera 200 is used in place of the camera 100. The following description focuses on differences from the first embodiment by omitting descriptions of matters common to the two embodiments.
[2-2. Operation]
The operation of the camera 200 and the information providing server 220 in this embodiment is described. The operation of shooting and recording video is the same as in the first embodiment described above, and a description thereof is omitted here.
The control section 140 in the camera 200 transmits, via the network communication section 160, data detected respectively by the position sensor 180, the orientation sensor 182, the elevation angle sensor 184, and the field angle detecting section 186 to the network 210. The data transmitted from the camera 200 is transmitted to the information providing server 220 via the network 210.
The information providing server 220 in this embodiment uses specified date/time information in addition to the information of the position, direction, and field angle of the camera 200. The specified date/time information is a date/time at which shooting is scheduled, which is set by the user of the camera 200 and transmitted to the information providing server 220 via the network 210.
The information providing server 220 first accesses the map database 230 based on the position information detected by the position sensor 180 to obtain map data of the surroundings of the current position of the camera 200.
The information providing server 220 next identifies buildings that may be captured in the shot taken with the camera 200 based on the obtained map data, orientation information detected by the orientation sensor 182, elevation angle information detected by the elevation angle sensor 184, and field angle information detected by the field angle detecting section 186. The information providing server 220 then obtains data about the spatial layout in the three-dimensional space of the buildings that may be captured in the shot from the building database 240. The information providing server 220 can consequently grasp the heights of buildings in the perimeter of a video shot with the camera 200 and the positional relation of the buildings to the camera.
The information providing server 220 next uses the camera position information detected by the position sensor 180 and the specified date/time information set by the user which is described above to obtain the position of the sun. The orientation and elevation angle of the sun can be identified if the position information and the date/time information are known.
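The solar orientation and elevation angle can be computed from the date/time and position with a low-precision ephemeris such as the following sketch (roughly half-degree accuracy; a production system would use a vetted astronomy library).

```python
import math
from datetime import datetime, timezone

def sun_direction(when_utc: datetime, lat_deg: float, lon_deg: float):
    """Approximate solar (azimuth, elevation) in degrees.

    `when_utc` must be a timezone-aware UTC datetime.
    """
    d = (when_utc - datetime(2000, 1, 1, 12, tzinfo=timezone.utc)).total_seconds() / 86400.0
    g = math.radians((357.528 + 0.9856003 * d) % 360.0)        # mean anomaly
    lam = math.radians((280.460 + 0.9856474 * d                # ecliptic longitude
                        + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g)) % 360.0)
    eps = math.radians(23.439 - 4e-7 * d)                      # obliquity
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))
    gmst = (280.46061837 + 360.98564736629 * d) % 360.0        # sidereal time (deg)
    h = math.radians((gmst + lon_deg) % 360.0) - ra            # hour angle
    lat = math.radians(lat_deg)
    elev = math.asin(math.sin(lat) * math.sin(dec)
                     + math.cos(lat) * math.cos(dec) * math.cos(h))
    az = math.atan2(math.sin(h),
                    math.cos(h) * math.sin(lat) - math.tan(dec) * math.cos(lat))
    return (math.degrees(az) + 180.0) % 360.0, math.degrees(elev)
```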
The information providing server 220 compares the information of the identified orientation and elevation angle of the sun with the data of the orientation, elevation angle, and field angle of the camera 200 to determine whether or not the sun is within the field angle of the camera. If the sun is within the field angle of the camera, there is a possibility that the location is backlit. At this point, the information providing server 220 in this embodiment which knows the heights and positional relation of the surrounding buildings can further determine cases where the buildings block the sun and the location is not backlit as a result. For instance, even when there is a chance that the sun is captured in the background of a video to be shot, the shot does not actually capture the sun in the background of the video and is not backlit in some cases because of tall buildings or the like. The information providing server 220 is capable of accurately determining whether or not the sun is within the shooting range in such cases, too.
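Combining the solar direction with the building silhouettes gives the backlight determination; the following sketch reuses within_field_angle from the general description and the same simplified silhouette model (azimuth span and top elevation per building).

```python
def is_backlit(cam, sun_azim, sun_elev, buildings):
    """Backlit only if the sun is inside the field angle AND no building hides it."""
    if sun_elev <= 0:
        return False                          # sun below the horizon
    if not within_field_angle(cam["azim"], cam["elev"], cam["h_fov"],
                              cam["v_fov"], sun_azim, sun_elev):
        return False
    for b in buildings:
        if b["azim_min"] <= sun_azim <= b["azim_max"] and b["elev_top"] >= sun_elev:
            return False                      # a building blocks the sun
    return True
```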
The information providing server 220 transmits the results of these determining operations as backlit shooting information to the camera 200 via the network 210. The control section 140 in the camera 200 obtains the backlit shooting information via the network communication section 160 and displays it on the image displaying section 130, thereby notifying the user of the camera 200 of whether or not the location is backlit at the specified date/time.
Processing executed by the information providing server 220 of this embodiment is described next with reference to a flow chart.
The control section 140 of the camera 200 first transmits the detected data and the specified date/time information to the information providing server 220; the information providing server 220 then obtains the map data, the building data, and the solar position, performs the backlight determination described above, and returns the result to the camera 200 for display.
[2-3. Effects and the Like]
As described, the information providing server 220 in this embodiment uses information of surrounding buildings and the solar orbit in addition to the position, orientation, elevation angle, and field angle of the camera 200 to accurately determine whether or not the location is backlit at a date/time specified by the user, and provides information indicating the result of the determination to the camera 200. This enables the user to avoid a situation where the shot taken in actual shooting is backlit.
This embodiment corresponds to Example 3 described above.
A third embodiment is described subsequently. This embodiment relates to an information providing system that provides a user with information for shooting the sun when the sun is to be shot intentionally on location. In this embodiment, the user is provided with various types of information about sunlight, which is the “shooting object”. This embodiment is effective when shooting, for example, a sunrise scene, a sunset scene, or a solar eclipse. The following assumes a case where the information providing system assists in the shooting of a sunrise scene.
[3-1. Configuration]
The overall configuration of the information providing system in this embodiment is the same as the overall configuration of the second embodiment. The components of the information providing server 220 and the camera 200 are the same as those in the second embodiment as well. The following description focuses on differences from the second embodiment by omitting descriptions on matters common to the third embodiment and the second embodiment.
[3-2. Operation]
The operation of the camera 200 and the information providing server 220 in this embodiment is described.
The camera 200 transmits position information, orientation information, elevation angle information, field angle information, and specified date/time information set by the user to the network 210. Specifically, the control section 140 in the camera 200 transmits, via the network communication section 160, the data detected respectively by the position sensor 180, the orientation sensor 182, the elevation angle sensor 184, and the field angle detecting section 186, together with the specified date/time information set by the user, to the network 210. The data and the specified date/time information transmitted from the camera 200 are transmitted to the information providing server 220 via the network 210.
The information providing server 220 first accesses the map database 230 based on the position information detected by the position sensor 180 to obtain map data of the surroundings of the current position of the camera 200.
The information providing server 220 next identifies buildings that may be captured in the shot taken with the camera 200 based on the obtained map data, orientation information detected by the orientation sensor 182, elevation angle information detected by the elevation angle sensor 184, and field angle information detected by the field angle detecting section 186. The information providing server 220 then obtains data about the spatial layout in the three-dimensional space of the buildings that may be captured in the shot from the building database 240. The information providing server 220 can consequently grasp the heights of buildings in the perimeter of a video shot with the camera 200 and the positional relation of the buildings to the camera. The information providing server 220 can also grasp the heights of mountains or the like in the perimeter of a video shot with the camera 200 and the positional relation of the mountains or the like with respect to the camera, because the map data obtained from the map database 230 includes the geography and altitude data of the surroundings.
The information providing server 220 next uses the information of the position of the camera 200 detected by the position sensor 180 and the specified date/time information set by the user to obtain the position of the sun. The orientation and elevation angle of the sun can be identified if the position information and the date/time information are known. In this example where the shooting of a sunrise is assisted, in particular, the information providing server 220 calculates the time, closest to the specified date/time set by the user, at which the sun appears on the horizon, as well as the orientation and elevation angle of the rising sun.
The information providing server 220 compares the information of the identified orientation and elevation angle of the sun with the data of the orientation, elevation angle, and field angle of the camera 200 to determine whether or not the sunrise is within the field angle of the camera 200. The information providing server 220 which knows the heights and positional relation of buildings, mountains, or the like in the surroundings of the camera can accurately determine whether the sun rises from behind the buildings or the mountains that are captured in the shot taken with the camera 200. The information providing server 220 is further capable of detecting a differential amount which indicates how much the predicted sunrise is shifted from the current orientation, elevation angle, and field angle of the camera.
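A sketch of the sunrise search and the differential amount, reusing the sun_direction sketch from the second embodiment; the one-minute scan step is an arbitrary choice, and blocking by mountains or buildings (which the server additionally checks) is ignored here.

```python
from datetime import timedelta

def next_sunrise(start_utc, lat, lon, step_minutes=1, horizon_hours=36):
    """First time after `start_utc` that the solar elevation crosses above 0 deg."""
    prev_elev = sun_direction(start_utc, lat, lon)[1]
    for m in range(step_minutes, horizon_hours * 60, step_minutes):
        t = start_utc + timedelta(minutes=m)
        azim, elev = sun_direction(t, lat, lon)
        if prev_elev <= 0.0 < elev:
            return t, azim                    # time and bearing of the sunrise
        prev_elev = elev
    return None, None

def camera_differential(cam_azim, sunrise_azim):
    """Signed pan (degrees) needed to centre the predicted sunrise."""
    return (sunrise_azim - cam_azim + 180.0) % 360.0 - 180.0
```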
The information providing server 220 transmits the results of these determining operations as sunrise information to the camera 200 via the network 210. Specifically, the information providing server 220 transmits the information of the time of sunrise that is closest to the specified date/time information set by the user, the position information that indicates at which point in the current shooting range of the camera the sun is to rise, the information of a differential between the current orientation, elevation angle, and field angle of the camera and the predicted point at which the sun rises, or the like.
The control section 140 of the camera 200 obtains these pieces of sunrise information via the network communication section 160. The control section 140 displays a predicted time of sunrise on the image displaying section 130 based on the sunrise time information out of the received pieces of sunrise information. The user of the camera 200 can thus be informed of a time when the sun rises that is closest to the specified date/time information set by the user.
The control section 140 can also display a predicted position of sunrise on the image displaying section 130 based on the information about the sunrise position out of the received pieces of sunrise information. The user of the camera 200 can thus be informed of a point in the shooting range at which the sunrise is going to be captured.
Based on the differential information out of the received pieces of sunrise information, the control section 140 can also inform the user of the camera 200 of the orientation, elevation angle, and field angle to which the camera is to be shifted in order to capture the sunrise, by displaying the differential amount from the current orientation, elevation angle, and field angle.
Processing executed by the information providing server 220 of this embodiment is described next with reference to a flow chart.
[3-3. Effects and the Like]
As described, the information providing server 220 in this embodiment provides the user with information for assisting in the intended shooting of the sun, such as the shooting of a sunrise. The user can thus shoot a sunrise with ease.
A fourth embodiment is described next. This embodiment relates to an information providing system that provides a user with information by detecting, in advance, sunlight in location shooting, in particular, whether or not a location is in a shade at a date/time scheduled for shooting. In this embodiment, the user is provided with various types of information about a shade that is the “non-shooting object”.
[4-1. Configuration]
The overall configuration of the information providing system in this embodiment is the same as the overall configuration of the second embodiment. The components of the information providing server 220 are the same as those in the second embodiment. The following description focuses on differences from the second embodiment by omitting descriptions on matters common to the fourth embodiment and the second embodiment.
The camera 300 is substantially the same as the camera 200 described in the second embodiment, except that the camera 300 further includes a distance sensor 388. Components of the camera 300 corresponding to those of the camera 200, such as the control section 340, the image displaying section 330, and the network communication section 360, are denoted by reference numerals in the 300s.
The distance sensor 388 is, for example, a dedicated ranging camera that detects the distance from the camera 300 to a subject. The dedicated ranging camera performs ranging by irradiating a subject with light from a light source that has a specific light emission pattern, and measuring the length of time until light reflected by the subject is detected. There are various methods of measuring a distance, for example, a ranging method that uses a laser, as in the laser range finders used in the field of robotics, and a ranging method that uses ultrasonic waves or millimeter waves. How the distance is detected in this embodiment is not limited to a specific method.
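The principle of such time-of-flight ranging is that the measured delay covers the round trip, so the distance is half the delay times the speed of light:

```python
def tof_distance_m(round_trip_seconds: float) -> float:
    """Time-of-flight ranging: the light travels to the subject and back."""
    c = 299_792_458.0                 # speed of light in m/s
    return c * round_trip_seconds / 2.0

# Example: a 66.7 ns round trip corresponds to a subject about 10 m away.
print(round(tof_distance_m(66.7e-9), 1))
```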
A subject whose distance is detected by the distance sensor 388 may herein be referred to as the “main subject”. The main subject is a subject on which the camera 300 is focused, either manually by the user or automatically by the camera 300. The main subject is typically a person, animal, plant, or object around the center of the shooting range, or a person's face or a conspicuous object that has been automatically detected.
[4-2. Operation]
The operation of the camera 300 and the information providing server 220 in this embodiment is described.
The control section 340 in the camera 300 transmits, via the network communication section 360, data indicating the position, orientation, elevation angle, field angle, and distance detected respectively by the position sensor 380, the orientation sensor 382, the elevation angle sensor 384, the field angle detecting section 386, and the distance sensor 388 to the network 210. The data transmitted from the camera 300 is transmitted to the information providing server 220 via the network 210.
In this embodiment, specified date/time information is used as in the second and third embodiments. The specified date/time information is a date/time at which shooting is scheduled, which is set by the user of the camera 300 and transmitted to the information providing server 220 via the network 210.
The information providing server 220 first accesses the map database 230 based on the position information detected by the position sensor 380 to obtain map data of the surroundings of the current position of the camera 300.
The information providing server 220 next identifies buildings that may be captured in the shot taken with the camera 300 and surrounding buildings that affect the shooting based on the obtained map data, orientation information detected by the orientation sensor 382, elevation angle information detected by the elevation angle sensor 384, and field angle information detected by the field angle detecting section 386. The information providing server 220 then obtains data about these buildings from the building database 240. The information providing server 220 can consequently grasp the heights of buildings in the perimeter of a video shot with the camera 300 and the positional relation of the buildings to the camera 300.
The information providing server 220 next uses the information of the position of the camera 300 which has been detected by the position sensor 380 and the specified date/time information set by the user which is described above to obtain the position of the sun. The orientation and elevation angle of the sun can be identified from the position information and the date/time information.
The information providing server 220 determines a range around the camera 300 that is in a shade from the information of the identified orientation and elevation angle of the sun, and from the information of the shapes and heights of the surrounding buildings described above. The information providing server 220 then refers to the distance information detected by the distance sensor 388 to determine whether or not the main subject of the camera 300 is going to be in a shade.
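A sketch of this final step: the main subject's ground position is derived from the camera position, orientation, and measured distance, and the shade test from the general description (is_in_shade) is then applied; coordinates are local metres, x east and y north, and all structures are hypothetical.

```python
import math

def subject_in_shade(cam, distance_m, sun_azim, sun_elev, buildings):
    """Locate the main subject from the ranging result, then apply the shade test."""
    subject = {
        "x": cam["x"] + distance_m * math.sin(math.radians(cam["azim"])),
        "y": cam["y"] + distance_m * math.cos(math.radians(cam["azim"])),
    }
    return is_in_shade(subject, sun_azim, sun_elev, buildings)
```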
The information providing server 220 transmits the results of these determining operations to the camera 300 via the network 210. The control section 340 in the camera 300 receives the determination results via the network communication section 360 and displays them on the image displaying section 330, thereby notifying the user of the camera 300 of whether or not the main subject is in a shade at the specified date/time.
Processing executed by the information providing server 220 is described next with reference to a flow chart.
The control section 340 of the camera 300 first transmits the detected data, including the distance to the main subject, and the specified date/time information to the information providing server 220; the information providing server 220 then obtains the map data, the building data, and the solar position, performs the shade determination described above, and returns the results to the camera 300 for display.
In this embodiment, the distance to the main subject is detected to determine whether or not the main subject is going to be in a shade. However, other modes than this may be employed. The information providing server 220 may simply determine whether or not a shade is within the shooting range, or may determine whether or not the proportion of the shade to the shooting range is higher than a given threshold.
[4-3. Effects and the Like]
As described above, this embodiment can provide information for assisting in shooting on location by detecting in advance whether a location is in a shade at a date/time specified by the user, or whether the main subject is going to be in a shade at the specified date/time. The user can thus avoid letting a location or the main subject be in a shade, and can accordingly prevent a change of location.
This embodiment corresponds to Example 5 described above.
The functions of this embodiment described above can also be used when a shaded scene is shot intentionally. In this case, too, the camera 300 and the information providing server 220 operate in the same way as above.
The first to fourth embodiments have been described above as exemplification of the technique disclosed in this application. However, the technique disclosed herein is not limited thereto, and is also applicable to embodiments that are modified suitably by changing, replacing, adding, or omitting components, functions, or the like of the embodiments. A new embodiment may also be created by combining the components described in the first to fourth embodiments. Another embodiment is given below as exemplification.
In the second to fourth embodiments, where information related to sunlight is handled, suitability/unsuitability for shooting depends on weather conditions at the time of the shooting. For that reason, a weather information database may additionally be used in order to determine suitability/unsuitability for shooting based on weather forecasts and other types of information.
The image capture device and the information providing server may have at least two of the functions of the first to fourth embodiments so that the user can switch between the functions. This is accomplished by, for example, configuring the operation section 390 in the image capture device 300 so that the user can specify, through the operation section 390, which function is to be used.
In each of the first to fourth embodiments, the functions of the embodiment are provided by the information providing system that includes the camera and the information providing server. Instead, the functions of the embodiment may be provided solely by the information providing device or the image capture device. An example of this embodiment is described below.
Through the operation described above, the result of the determination can be obtained by using only the information providing device 400, without requiring the user to operate a camera. Accordingly, a preliminary check for location shooting can be performed easily without stepping out of, for example, a studio.
In the embodiments described above, the information providing device performs various types of determining processing and outputs the results of the determination. This operation may be performed by the image capture device instead. For that purpose, a device having the same functions as those of the information providing device is installed in the image capture device. The image capture device configured as this obtains necessary information such as the geography, buildings, the running status of a public transportation system, the solar orbit, and the like via the network on its own, performs necessary determining processing, and outputs the result of the determination to a display or the like. The user thus needs only the image capture device to obtain various types of information for assisting in location shooting.
The technique disclosed herein is not limited to the information providing systems, information providing devices, and image capture devices described above, and is also applicable to software (a computer program) that defines the processing in any one of the embodiments described above. The operation defined by such a program is as described in the embodiments above.
The embodiments have now been described as exemplification of the technique disclosed herein. The accompanying drawings and the detailed description have been provided for that purpose.
Therefore, the components illustrated in the accompanying drawings and described in the detailed description include not only components that are indispensable for solving the problem but also, for the purpose of exemplifying the technique, components that are not indispensable for solving the problem. Those dispensable components should not be deemed indispensable merely because they appear in the accompanying drawings and the detailed description.
The embodiments described above are exemplifications of the technique disclosed herein, and are susceptible to various changes, replacements, additions, omissions, and the like within the scope of the claims or an equivalent scope.
The technique of the present disclosure is applicable to uses in which a user is provided with various types of information for assisting in shooting when, for example, location shooting is conducted.
Priority application: 2012-054157, filed March 2012, JP (national).
International filing: PCT/JP2012/007971, filed Dec. 13, 2012 (WO); 371(c) date: Jul. 19, 2013.