Embodiments described herein relate generally to a vehicle and a vehicle control device.
Conventionally, technology has been disclosed which sets search criteria for a person to be searched for in a facility, identifies a person who is likely to be the person to be searched for, based on the search criteria, and monitors the identified person (see, for example, Japanese Patent Application Laid-open No. 2016-201758).
The present disclosure has been made in consideration of the above, and an object of the present disclosure is to appropriately control the start or end of a monitoring process in a vehicle.
A vehicle according to an embodiment of the present disclosure includes: a first wheel; a second wheel; a body coupled to the first wheel and the second wheel, the body being movable by the first wheel and the second wheel; a first imaging circuit configured to capture an image of the exterior of the vehicle; and a processor. The processor is configured to: set a monitoring target; start, when a monitoring start condition is met, a monitoring process of determining whether the monitoring target belongs to a monitoring range based on a captured image by the first imaging circuit; and terminate, when a monitoring end condition is met, the monitoring process, the monitoring start condition or the monitoring end condition including a condition related to an operation of the vehicle.
Embodiments of a vehicle and a vehicle control device according to the present disclosure will be described below with reference to the drawings.
The vehicle 1 can run on two pairs of wheels 3 arranged along a predetermined direction. In this case, the predetermined direction in which the two pairs of wheels 3 are arranged is the direction in which the vehicle 1 moves, and the vehicle 1 can move forward or backward, for example, by shifting gears.
The vehicle 1 also includes a vehicle drive unit 5, an exterior imaging circuit 6, and an exterior speaker 30. The exterior imaging circuit 6 is an example of the first imaging circuit.
The vehicle drive unit 5 is a drive device mounted on the vehicle 1. The vehicle drive unit 5 is, for example, an engine, a motor, or a drive unit for the wheels 3.
The exterior imaging circuit 6 is a camera that captures images of the exterior of the vehicle. An exterior imaging circuit 6A is provided at a front portion of the vehicle 1. The exterior imaging circuit 6A captures an image of the front of the vehicle 1 and generates a captured image. An exterior imaging circuit 6B is provided at a rear portion of the vehicle 1. The exterior imaging circuit 6B captures an image of the rear of the vehicle 1 and generates a captured image. An exterior imaging circuit 6C and an exterior imaging circuit 6D are provided, for example, around the door mirrors of the vehicle 1. The exterior imaging circuit 6C and the exterior imaging circuit 6D capture images of the sides of the vehicle 1 and generate captured images.
In the present embodiment, the manner in which a plurality of exterior imaging circuits 6 (exterior imaging circuits 6A to 6D) are provided on the body 2 of the vehicle 1 is described by way of an example. These exterior imaging circuits 6 are arranged at positions where they can capture images of the outside of the body 2.
The exterior speaker 30 is connected to a vehicle control device 20 and outputs sound in accordance with an audio signal output from the vehicle control device 20. The exterior speaker 30 is provided outside the cabin of the vehicle 1, for example, in the engine room. The exterior speaker 30 outputs sound to the outside of the vehicle 1.
A seat 4 on which the driver who is the user of the vehicle 1 is seated is provided in the cabin of the vehicle 1. In other words, the seat 4 is the driver's seat. The driver seated in the seat 4 can drive the vehicle 1 by operating a steering wheel 11 or operating a not-illustrated accelerator pedal or brake pedal.
An interior imaging circuit 7, an engine switch 10, a shift lever 12, a parking brake 13, a door lock device 14, an in-vehicle display device 15, a human detection sensor 16, and the like are also provided in the cabin of the vehicle 1.
The interior imaging circuit 7 is an example of the second imaging circuit. The interior imaging circuit 7 is a camera that captures an image of the interior of the vehicle. The interior imaging circuit 7 is provided in the cabin. The interior imaging circuit 7 captures, for example, an image around a rear seat 8 on which a passenger is seated. The interior imaging circuit 7 captures an image around the rear seat 8 of the vehicle 1 and generates a captured image.
The engine switch 10 is a switch that the driver operates to start the engine of the vehicle 1. The engine switch 10 is sometimes referred to as an ignition switch. The engine of the vehicle 1 is started or stopped through the driver's operation of the engine switch 10. Further, for example, power supply to an electronic device mounted on the vehicle 1 is controlled through operation of the engine switch 10.
The shift lever 12 is a lever that the driver operates to change shift positions. The range of movement of the shift lever 12 includes, for example, a parking position, a reverse position, a neutral position, and a drive position.
With the shift lever 12 in the parking position, the power of the engine of the vehicle 1 is not transmitted to the wheels 3, which is called a parking state. With the shift lever 12 in the reverse position, the vehicle 1 is ready to move backward. With the shift lever 12 in the drive position, the vehicle 1 is ready to move forward.
The parking brake 13 is one of the braking mechanisms of the vehicle 1. The parking brake 13 is a manual control mechanism that allows the driver to manually stop the movement of the vehicle 1. With the parking brake 13 being applied, the movement of the vehicle 1 is stopped, which is the parking state. With the parking brake 13 being released, the vehicle 1 is ready to move.
The door lock device 14 is a device that switches the door of the vehicle 1 to a locked state or an unlocked state. For example, the door of the vehicle 1 can be switched to a locked state or an unlocked state by a vehicle key or the like from outside the vehicle 1.
The in-vehicle display device 15 is an example of the display device. The in-vehicle display device 15 has an image display function as well as audio output and input functions and the like. The image display function is implemented by, for example, a liquid crystal display (LCD) or an organic electro-luminescent display (OELD). The audio output function is implemented by, for example, a speaker for the interior of the vehicle. The in-vehicle display device 15 is configured as a touch panel having a function to accept an operation input by the user. The in-vehicle display device 15 may further include a switch that accepts an operation input by the user.
The in-vehicle display device 15 may include a navigation device having a location information acquisition function and a route search function using map information.
The human detection sensor 16 is a sensor that detects the presence or absence of a person inside the body 2. The human detection sensor 16 is, for example, an infrared sensor, an image analysis system for captured images, or a weight-based seating sensor.
The vehicle 1 is equipped with a communication device, a device to detect opening and closing states of the door of the vehicle 1, and a device to detect a state of ignition, which are not illustrated in
In the present embodiment, the vehicle control device 20 is mounted on the vehicle 1. The vehicle control device 20 is a device that can be mounted on the vehicle 1. The vehicle control device 20 sets a monitoring range and a monitoring target and determines whether the monitoring target has deviated from the monitoring range. Here, the monitoring target is, for example, a child riding in the vehicle 1. The vehicle control device 20 is, for example, an electronic control unit (ECU) or an on-board unit (OBU) installed inside the vehicle 1. Alternatively, the vehicle control device 20 may be an external device installed near the dashboard of the vehicle 1.
A hardware configuration of the vehicle control device 20 will now be described.
The interface 21 includes a variety of interfaces such as a communication interface, an image interface, an audio interface, a display interface, and an input interface.
The HDD 25 stores a variety of information. For example, the HDD 25 stores a captured image acquired from the exterior imaging circuit 6 and the interior imaging circuit 7.
The CPU 22 executes a computer program stored in the ROM 24 or the HDD 25 to execute a variety of processes. The CPU 22 is an example of a processor.
The vehicle control device 20 is connected to the exterior imaging circuit 6, the interior imaging circuit 7, the parking brake 13, the door lock device 14, the in-vehicle display device 15, the exterior speaker 30, a communication device 40, an ignition detection device 50, and a door open/close detection device 60.
The communication device 40 is a device for connecting to a communication network such as the Internet. The communication device 40 may be a wireless LAN-compatible communication device, a long term evolution (LTE)-compatible communication device, or a wired communication device that performs wired communication. The vehicle control device 20 can transmit and receive information to/from a terminal device such as a smartphone via the communication device 40.
The ignition detection device 50 is a sensor device connected to the engine switch 10 to detect a state of ignition. The ignition detection device 50 outputs information indicating a state of the ignition switch.
The door open/close detection device 60 is a device that detects the opening and closing of the doors on the driver's seat side and the front passenger's seat side in the front and rear rows of the vehicle 1. The door open/close detection device 60 is, for example, a sensor that detects the opening and closing of each door and is arranged, for example, at the hinge of each door. The door open/close detection device 60 detects that a door is open at an angle sufficient to allow an occupant to get in and out. When such a door opening angle is detected, it can be assumed that a vehicle operation enabling the user of the vehicle 1 to get out of the vehicle has been performed. The user of the vehicle 1 is, for example, an occupant in the rear seat.
The monitoring range setting circuit 201 sets a monitoring range. This monitoring range is a range that takes into consideration the possibility that the monitoring target becomes lost and that defines whether to perform a notification process for the monitoring target himself/herself or for an occupant in the vehicle 1 (for example, the driver); it is a geofence, that is, a geographical fence. In other words, the vehicle control device 20 does not perform the notification process when the monitoring target belongs to the monitoring range, and performs the notification process when the monitoring target deviates from the monitoring range.
For example, when a mode indicating that a monitoring range is to be set is selected through the user's input operation to the in-vehicle display device 15, the monitoring range setting circuit 201 starts a monitoring range setting process. In the monitoring range setting process, the monitoring range setting circuit 201 accepts input of selection of the exterior imaging circuit 6 to be used in the monitoring process and information defining the monitoring range in the selected exterior imaging circuit 6, through the user's input operation to the in-vehicle display device 15. The information defining the monitoring range in the selected exterior imaging circuit 6 is, for example, distance information from the selected exterior imaging circuit 6.
The monitoring range setting circuit 201 sets a monitoring range, based on selection of the exterior imaging circuit 6 to be used in the monitoring process and information defining the monitoring range in the selected exterior imaging circuit 6. An example of the monitoring range will now be described with reference to
The monitoring range setting circuit 201 may accept only the selection of the exterior imaging circuits 6 to be used in the monitoring process, and the respective image capturing ranges of the exterior imaging circuits 6 may be set as the respective monitoring ranges of the exterior imaging circuits 6. In this way, the monitoring range setting circuit 201 may define the monitoring range, based on the image capturing range of the exterior imaging circuit 6.
The monitoring range setting process will now be described with reference to
The monitoring range setting circuit 201 accepts selection of the exterior imaging circuits 6 to be used in the monitoring process (step S1). The monitoring range setting circuit 201 then accepts input of information defining the respective monitoring ranges of the exterior imaging circuits 6 (step S2). The monitoring range setting circuit 201 then sets the monitoring range (step S3).
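For illustration, steps S1 to S3 can be sketched in software as follows. The names used here (MonitoringRange, camera_id, max_distance_m, set_monitoring_ranges) are illustrative assumptions and do not appear in the present disclosure; the sketch merely records, for each selected exterior imaging circuit 6, the distance information that defines its monitoring range.

```python
from dataclasses import dataclass

# Hypothetical data structure for one monitoring range entry (names are illustrative).
@dataclass
class MonitoringRange:
    camera_id: str          # which exterior imaging circuit 6 (e.g., "6A" to "6D") is used
    max_distance_m: float   # distance from the selected imaging circuit that bounds the range


def set_monitoring_ranges(user_selections):
    """Steps S1-S3: accept camera selections and distance inputs, then store the ranges.

    ``user_selections`` is assumed to be a list of (camera_id, distance) pairs obtained
    from the user's input operation on the in-vehicle display device.
    """
    ranges = []
    for camera_id, distance in user_selections:              # S1: selected exterior imaging circuits
        ranges.append(MonitoringRange(camera_id, distance))  # S2: information defining each range
    return ranges                                            # S3: the monitoring ranges are now set


# Example: monitor up to 10 m from the front camera and 5 m from the rear camera.
monitoring_ranges = set_monitoring_ranges([("6A", 10.0), ("6B", 5.0)])
```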
Returning to
When a mode indicating manual registration is selected through the user's input operation to the in-vehicle display device 15, and the exterior imaging circuit 6 or the interior imaging circuit 7 to be used for image capture is selected, the monitoring target setting circuit 202 activates the selected exterior imaging circuit 6 or interior imaging circuit 7 and causes the selected imaging circuit to capture an image. The monitoring target setting circuit 202 acquires a captured image from the selected exterior imaging circuit 6 or interior imaging circuit 7, analyzes the captured image, and sets a whole body image and a face image of a person in the captured image as the monitoring target. The monitoring target setting circuit 202 may analyze a captured image transmitted from a terminal device owned by the user and set a whole body image and a face image of a person in that captured image as the monitoring target.
The manual monitoring target setting process will now be described with reference to
The monitoring target setting circuit 202 accepts selection of the exterior imaging circuit 6 or interior imaging circuit 7 to be used for setting the monitoring target (step S11). The monitoring target setting circuit 202 then activates the selected imaging circuit (step S12). The monitoring target setting circuit 202 acquires a captured image from the activated imaging circuit (step S13). The monitoring target setting circuit 202 analyzes the captured image and sets a whole body image and a face image of a person in the captured image as the monitoring target (step S14).
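A minimal sketch of steps S11 to S14 is given below. The helper callables activate_camera, capture_image, and extract_person are assumptions standing in for the camera driver and the image-analysis technique; only the overall flow of the manual setting process is illustrated.

```python
def set_monitoring_target_manually(selected_camera_id, activate_camera, capture_image, extract_person):
    """Sketch of steps S11-S14. ``selected_camera_id`` identifies the exterior imaging
    circuit 6 or interior imaging circuit 7 chosen at step S11; the three callables are
    hypothetical stand-ins for the camera driver and the image-analysis step."""
    camera = activate_camera(selected_camera_id)   # S12: activate the selected imaging circuit
    image = capture_image(camera)                  # S13: acquire a captured image from it
    person = extract_person(image)                 # S14: analyze the captured image
    if person is None:
        return None                                # no person detected; nothing is set
    # The detected person's whole body image and face image become the monitoring target.
    return {"whole_body": person["whole_body"], "face": person["face"]}
```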
The learning-based monitoring target setting will now be described. At a predetermined timing, such as immediately after the vehicle 1 starts running, the monitoring target setting circuit 202 activates the interior imaging circuit 7 and acquires a captured image, for example, around the rear seat 8 from the interior imaging circuit 7.
The monitoring target setting circuit 202 analyzes the captured image, extracts a whole body image and a face image of a person in the captured image, and stores the date of imaging, the whole body image, and the face image in association with each other as a monitoring target candidate. As a method for extracting a whole body image and a face image of a person in the captured image, the technique described in Japanese Patent Application Laid-open No. 2020-178167 can be applied.
The monitoring target setting circuit 202 refers to the monitoring target candidates stored in the past and determines whether there are a threshold number or more of face images that show the same face as the extracted face image and were captured on different dates. An example of the threshold is 10. As a method of referring to the past monitoring target candidates and determining whether two face images show the same face, the monitoring target setting circuit 202 can apply the person matching technique described in Japanese Patent Application Laid-open No. 2016-201758 above.
If there are the threshold number or more of such face images, the monitoring target setting circuit 202 sets the most recently stored face image and whole body image of the monitoring target candidate as the monitoring target.
The learning-based monitoring target setting process will now be described with reference to
The monitoring target setting circuit 202 acquires a captured image from the interior imaging circuit 7 (step S21). The monitoring target setting circuit 202 then terminates the process if no person is extracted as a result of image analysis of the captured image (No at step S22). On the other hand, if a person is extracted as a result of image analysis of the captured image (Yes at step S22), the monitoring target setting circuit 202 stores the date of imaging, the whole body image, and the face image in association with each other as a monitoring target candidate (step S23).
The monitoring target setting circuit 202 refers to the monitoring target candidates stored in the past and determines whether there are 10 or more face images that show the same face as the extracted face image and were captured on different dates. If there are not 10 or more such face images (No at step S24), the process ends.
If there are 10 or more face images that show the same face as the extracted face image and were captured on different dates (Yes at step S24), the monitoring target setting circuit 202 sets the most recently stored face image and whole body image of the monitoring target candidate as the monitoring target (step S25).
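The learning-based flow of steps S21 to S25 can be sketched as follows, assuming a hypothetical faces_match callable in place of the person matching technique cited above and a list of stored candidates.

```python
THRESHOLD = 10  # number of distinct imaging dates required before automatic registration


def update_learning_based_target(candidates, new_face, new_body, today, faces_match):
    """Steps S21-S25, sketched: store today's candidate and register the target once the
    same face has been seen on THRESHOLD or more different dates.

    ``candidates`` is a list of dicts {"date", "face", "whole_body"} stored so far, and
    ``faces_match(a, b)`` stands in for the person-matching technique cited above.
    """
    # S23: store the imaging date, whole body image, and face image as a candidate.
    candidates.append({"date": today, "face": new_face, "whole_body": new_body})

    # S24: count the distinct dates on which the same face appears among past candidates.
    matching_dates = {c["date"] for c in candidates if faces_match(c["face"], new_face)}
    if len(matching_dates) < THRESHOLD:
        return None  # not enough history yet; the process ends

    # S25: set the most recently stored candidate of that face as the monitoring target.
    return {"face": new_face, "whole_body": new_body}
```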
Returning to
The parking brake information indicates whether the parking brake 13 of the vehicle is applied or released. The information acquisition circuit 203 receives the parking brake information from, for example, a sensor device that detects a state of the parking brake 13.
The door lock status information is information indicating whether the door of the vehicle is locked or unlocked. This door lock status information also includes information indicating that the door has been locked by locking from outside the vehicle 1. The information acquisition circuit 203 receives door lock status information from the door lock device 14 that detects the locked state and the unlocked state of the door.
The door open/close information indicates whether the door is open or not. The information acquisition circuit 203 acquires the door open/close information from the door open/close detection device 60.
The ignition information is information indicating a state of the ignition switch. The information acquisition circuit 203 acquires the ignition information from the ignition detection device 50.
The monitoring control circuit 204 executes a monitoring process to determine whether the monitoring target belongs to the monitoring range, based on the captured image by the exterior imaging circuit 6. The monitoring control circuit 204 starts the monitoring process if a monitoring start condition is met, and terminates the monitoring process if a monitoring end condition is met.
The monitoring control circuit 204 sequentially acquires vehicle information from the information acquisition circuit 203 and determines whether the monitoring start condition or the monitoring end condition is met, based on the vehicle information. The monitoring control circuit 204 may activate the exterior imaging circuit 6 and the interior imaging circuit 7 and determine whether the monitoring start condition or the monitoring end condition is met, additionally using the image analysis result of the captured image acquired from the exterior imaging circuit 6 and the interior imaging circuit 7.
The monitoring start condition includes, for example, that the vehicle 1 is not locked from outside, the ignition switch is OFF, the door is open, the parking brake is being applied, and the monitoring target is detected to be outside the vehicle. The monitoring start condition may include that a monitoring start instruction is given by the user's input operation to the in-vehicle display device 15 or the user's terminal.
The monitoring control circuit 204 refers to the door lock status information and determines whether the door is locked by locking from outside the vehicle 1. The monitoring control circuit 204 also refers to the ignition information and determines whether the ignition switch is OFF. The monitoring control circuit 204 also refers to the door open/close information and determines whether the door of the vehicle 1 has been opened. The monitoring control circuit 204 also refers to the parking brake information and determines whether the parking brake is being applied.
The monitoring control circuit 204 analyzes the captured image acquired from the exterior imaging circuit 6 and detects that the monitoring target is outside the vehicle if the captured image includes the face image identical to that of the monitoring target.
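A sketch of the start-condition evaluation described above is shown below; the dictionary keys and the user_requested_start flag are illustrative assumptions built from the vehicle information acquired by the information acquisition circuit 203.

```python
def monitoring_start_condition_met(vehicle_info, target_seen_outside, user_requested_start=False):
    """Sketch of the monitoring start condition. ``vehicle_info`` is assumed to be a dict
    built from the vehicle information acquired by the information acquisition circuit 203."""
    return user_requested_start or (
        not vehicle_info["locked_from_outside"]    # the vehicle 1 is not locked from outside
        and not vehicle_info["ignition_on"]        # the ignition switch is OFF
        and vehicle_info["door_open"]              # a door of the vehicle 1 has been opened
        and vehicle_info["parking_brake_applied"]  # the parking brake is being applied
        and target_seen_outside                    # the monitoring target is detected outside
    )
```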
The monitoring control circuit 204 starts the monitoring process if the monitoring start condition is met. The monitoring control circuit 204 may start the monitoring process on the condition that the monitoring start condition is met and the monitoring end condition is not met.
The monitoring control circuit 204 activates the exterior imaging circuit 6 that has any one of the monitoring ranges as its imaging range, acquires the captured image from the exterior imaging circuit 6, and searches the acquired captured image for the monitoring target. The monitoring control circuit 204 may search for the monitoring target using not only the face portion of the acquired captured image but also the whole body image. As a technique for searching for the monitoring target using the face portion and the whole body image, the technique described in Japanese Patent Application Laid-open No. 2020-178167 may be applied.
The monitoring control circuit 204 searches for the monitoring target and determines whether the monitoring target belongs to the monitoring range. The monitoring control circuit 204 may determine whether the monitoring target belongs to the monitoring range by specifying the distance from the vehicle 1 to the monitoring target using the captured image including the monitoring target. As a technique for specifying the distance using the captured image, for example, the technique described in Japanese Patent Application Laid-open No. 2007-188417 may be applied, in which the size of the whole body is determined from a pattern of a part of the monitoring target.
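As one possible, simplified stand-in for the cited distance-estimation technique (not the technique of that publication itself), the distance may be approximated with a pinhole-camera relation from the height of the target in the captured image and an assumed real-world height, as sketched below; the focal length and height values are assumptions for illustration only.

```python
def estimate_distance_m(bbox_height_px, assumed_height_m, focal_length_px):
    """Rough pinhole-camera estimate of the distance to the monitoring target:
    distance ~ focal_length * real_height / pixel_height. This is only a generic
    stand-in for the distance-estimation technique cited above."""
    if bbox_height_px <= 0:
        raise ValueError("bounding box height must be positive")
    return focal_length_px * assumed_height_m / bbox_height_px


def belongs_to_monitoring_range(bbox_height_px, max_distance_m,
                                assumed_height_m=1.1, focal_length_px=1000.0):
    # Example assumptions: a child about 1.1 m tall, camera focal length of 1000 px.
    return estimate_distance_m(bbox_height_px, assumed_height_m, focal_length_px) <= max_distance_m
```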
As a result of determining whether the monitoring target belongs to the monitoring range based on the distance from the vehicle 1 to the monitoring target, if the monitoring target belongs to the monitoring range, the monitoring control circuit 204 continues to determine again whether the monitoring target belongs to the monitoring range, using the captured image acquired from the exterior imaging circuit 6.
The monitoring control circuit 204 performs a notification process if the monitoring target is not found in any of the captured images acquired from the exterior imaging circuit 6 or if the monitoring target does not belong to the monitoring range.
As the notification process, the monitoring control circuit 204 outputs sound through the exterior speaker 30 to instruct the monitoring target to return toward the vehicle 1. The sound may be pre-recorded voice of the driver, for example. As the notification process, the monitoring control circuit 204 may transmit and output message information to a mobile terminal owned by the monitoring target via the communication device 40 to instruct the monitoring target to return toward the vehicle 1.
As the notification process, the monitoring control circuit 204 may display and output, to the in-vehicle display device 15, map information including information indicating the current location and the captured image obtained most recently while the monitoring target belonged to the monitoring range.
If the monitoring target is not found in any of the captured images acquired from the exterior imaging circuit 6 more than once, or if the monitoring target does not belong to the monitoring range for a certain period of time, the monitoring control circuit 204 may, as the notification process, display and output to the in-vehicle display device 15 of the vehicle 1 an indication that the monitoring target cannot be identified or has deviated from the monitoring range.
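The staged notification behavior described above can be sketched as follows; the callables stand in for the exterior speaker 30, the communication device 40, and the in-vehicle display device 15, and the miss_limit threshold is an illustrative assumption.

```python
def notify(found, in_range, consecutive_misses,
           exterior_speaker, send_message, show_on_display, miss_limit=3):
    """Sketch of the staged notification process: first urge the monitoring target to
    return toward the vehicle, then alert the occupant if the target is still missing
    or out of range. Returns the updated count of consecutive misses."""
    if found and in_range:
        return 0  # no notification; reset the miss counter

    # First stage: urge the monitoring target itself to return toward the vehicle.
    exterior_speaker("Please come back to the car.")   # may be pre-recorded driver's voice
    send_message("Please come back to the car.")       # message to the target's mobile terminal

    consecutive_misses += 1
    if consecutive_misses >= miss_limit:
        # Second stage: notify the occupant (e.g., the driver) on the in-vehicle display.
        show_on_display("The monitoring target has left the monitoring range.")
    return consecutive_misses
```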
The vehicle 1 performs the notification process as described above, whereby the monitoring target is first urged to return voluntarily when the monitoring target no longer belongs to the monitoring range, and if the monitoring target still does not return to the monitoring range, information is output to notify the person riding in the vehicle 1, such as the driver. With this process, the vehicle 1 can prevent the monitoring target located around the vehicle 1, for example, playing around the vehicle 1 far from home, from moving away from the vehicle 1.
The monitoring control circuit 204 terminates the monitoring process if the monitoring end condition is met. The monitoring end condition includes, for example, a condition that the vehicle 1 is locked from outside, the parking brake is turned OFF, or the monitoring target returns to the interior of the vehicle. The monitoring end condition may include that a monitoring end instruction is given by the user's input operation to the in-vehicle display device 15 or the user's terminal.
When the monitoring control circuit 204 terminates the monitoring process because the vehicle 1 is locked from outside or because the parking brake is turned OFF, the monitoring control circuit 204 may display and output information to the in-vehicle display device 15 or the user's terminal device to indicate that the monitoring process is being executed.
The monitoring control circuit 204 refers to the door lock status information and determines whether the door is locked by locking from outside the vehicle 1. The monitoring control circuit 204 refers to the parking brake information and determines whether the parking brake is OFF.
The monitoring control circuit 204 analyzes the captured image acquired from the interior imaging circuit 7 and detects that the monitoring target has returned to the interior of the vehicle if the captured image includes the face image identical to that of the monitoring target.
The control procedure of the monitoring process will now be described with reference to
The monitoring control circuit 204 determines whether the monitoring start condition is met (step S31), and if the monitoring start condition is not met (No at step S31), the process ends. Here, the monitoring start condition is that the ignition is OFF, the parking brake 13 is ON, the door is open, and the monitoring target is found in any of the captured images acquired from the exterior imaging circuit 6.
If the monitoring start condition is met (Yes at step S31), the monitoring control circuit 204 proceeds to step S32 and determines whether the monitoring end condition is not met (step S32). The monitoring end condition is that the vehicle 1 has been locked from outside, that the parking brake is turned OFF, that a request to stop the monitoring process is made from the in-vehicle display device 15 through the user's input operation, or that the monitoring target has returned to the interior of the vehicle.
If the monitoring end condition is not met at step S32 (Yes at step S32), the monitoring control circuit 204 performs a process of searching for the monitoring target in the captured image acquired from the exterior imaging circuit 6 (step S33). As a result of the process of searching for the monitoring target, if the monitoring target can be identified (Yes at step S34), and if the monitoring target is within the monitoring range (Yes at step S35), the monitoring control circuit 204 proceeds to step S32 without performing the notification process.
As a result of the process of searching for the monitoring target, if the monitoring target fails to be identified (No at step S34), or if the monitoring target is not within the monitoring range at step S35 (No at step S35), the monitoring control circuit 204 performs the notification process (step S36) and proceeds to step S32. At step S32, if the monitoring end condition is met (No at step S32), the process ends.
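Putting steps S31 to S36 together, the control procedure can be sketched as a simple polling loop, assuming that the individual checks are provided as callables; the polling interval is an assumption and not part of the disclosure.

```python
import time


def run_monitoring(start_condition_met, end_condition_met, search_target, in_range, notify,
                   poll_interval_s=1.0):
    """Sketch of steps S31-S36. Every argument except the polling interval is a callable
    standing in for a check or action described above."""
    if not start_condition_met():                    # S31: start condition not met -> end
        return
    while not end_condition_met():                   # S32: repeat until the end condition is met
        target = search_target()                     # S33: search the captured images
        if target is not None and in_range(target):  # S34, S35: identified and within range
            pass                                     # no notification is performed
        else:
            notify()                                 # S36: notification process
        time.sleep(poll_interval_s)                  # assumed polling interval
```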
In the vehicle control device 20 mounted on the vehicle 1 in the present embodiment, the monitoring target setting circuit 202 sets the monitoring target based on the captured image, and when the monitoring start condition is met, the monitoring control circuit 204 starts the monitoring process to determine whether the monitoring target belongs to the monitoring range based on the captured image by the exterior imaging circuit 6. When the monitoring end condition is met, the monitoring control circuit 204 terminates the monitoring process. The monitoring start condition or the monitoring end condition includes a condition related to vehicle operation, such as the state of the parking brake 13.
In this way, the vehicle 1 controls the monitoring process in conjunction with the operation of the vehicle 1, so that the start or end of the monitoring process can be controlled without requiring a specific operation to start or terminate the monitoring process.
Furthermore, in the vehicle 1, the monitoring start condition includes that the parking brake 13 is being applied. At the timing when the vehicle 1 is parked, the occupant such as a child may move away from the vehicle 1, and the monitoring process is thought to be necessary. Based on this point, the monitoring start condition includes the condition based on the parking state that the parking brake 13 is being applied, whereby the vehicle 1 can properly control the start of the monitoring process.
Furthermore, in the vehicle 1, the monitoring start condition includes that the vehicle is not locked from the outside. When the vehicle is locked from the outside, the driver and other occupants presumably leave the vehicle 1 while the vehicle 1 is parked, so the vehicle 1 does not have to perform the monitoring process. In this way, the vehicle 1 can avoid execution of the monitoring process at a timing when the monitoring process is not necessary.
Furthermore, in the vehicle 1, the monitoring start condition includes that the door of the vehicle 1 is opened. In this way, the monitoring start condition includes that the door is opened, which suggests that the monitoring target leaves the vehicle 1, whereby the vehicle 1 can execute the monitoring process at a more appropriate timing.
Furthermore, in the vehicle 1, the monitoring start condition includes that the monitoring target has been extracted based on the captured image by the exterior imaging circuit 6. In this way, the monitoring start condition includes the condition indicating that the monitoring target stays away from the vehicle 1, whereby the vehicle 1 can execute the monitoring process at a more appropriate timing.
Furthermore, in the vehicle 1, the monitoring end condition includes that the parking brake 13 has been released. In this way, the monitoring end condition includes the condition suggesting that the vehicle 1 is no longer parked and the monitoring process is not necessary, whereby the vehicle 1 can control the monitoring process appropriately.
Furthermore, in the vehicle 1, the monitoring end condition includes that the vehicle 1 has been locked from the outside. When the vehicle is locked from the outside, both the driver and other occupants presumably leave the vehicle 1, so the vehicle 1 can avoid execution of the monitoring process at a timing when the monitoring process is not necessary.
Furthermore, in the vehicle 1, when it is determined that the monitoring target does not belong to the monitoring range based on the captured image by the exterior imaging circuit 6, the notification process is performed. In this way, the vehicle 1 can perform the process of preventing the monitoring target from becoming lost by giving notification when the monitoring target deviates from the monitoring range.
Furthermore, in the vehicle 1, the monitoring target is set based on the captured image by the interior imaging circuit 7 and the past captured images by the interior imaging circuit 7. In this way, since the vehicle 1 sets the monitoring target based on the past captured images, the vehicle 1 can properly set the monitoring target by automatically setting a frequently riding person as the monitoring target, without requiring complicated operations.
Furthermore, in the vehicle 1, the monitoring range is set based on the imaging range of the exterior imaging circuit 6. In this way, the vehicle 1 can properly set the monitoring range without requiring complicated operations.
In the foregoing embodiment, the monitoring start condition is that the vehicle 1 is not locked from the outside, the ignition switch is OFF, the door is open, the parking brake is being applied, and the monitoring target is detected to be outside the vehicle. However, embodiments are not limited to this. For example, additional conditions may be added, or only some of the monitoring start conditions in the foregoing embodiment may be used as the monitoring start condition.
In the foregoing embodiment, the monitoring end condition includes that the vehicle 1 has been locked from the outside, that the parking brake is turned OFF, or that the monitoring target has returned to the interior of the vehicle. However, embodiments are not limited to this. Additional conditions may be added, or only some of the monitoring end conditions in the foregoing embodiment may be used as the monitoring end condition.
The result of determining whether the vehicle is parked by detecting the state of the shift lever 12 may be used as the monitoring start condition or the monitoring end condition.
Instead of determining that the vehicle has been locked from the outside, whether there are no occupants may be determined based on the detection result by the human detection sensor 16.
In the foregoing embodiment, it is assumed that the monitoring target is a child. However, the monitoring target may be an elderly person, a pet, or the like. In the foregoing embodiment, the vehicle 1 extracts a monitoring target candidate using the captured image around the rear seat 8 captured by the interior imaging circuit 7. However, the interior imaging circuit 7 may capture an image around the front passenger seat, and the monitoring target candidate may be extracted using the captured image around the front passenger seat.
A computer program executed by the vehicle control device 20 in the present embodiment is provided as a file in an installable or executable format recorded on a computer-readable recording medium, for example, an optical recording medium such as a digital versatile disk (DVD), a USB memory, or a semiconductor memory such as a solid state drive (SSD).
The computer program executed by the vehicle control device 20 in the present embodiment may be stored on a computer connected to a network such as the Internet and downloaded via the network. The computer program executed by the vehicle control device 20 in the present embodiment may be provided or distributed via a network such as the Internet.
The computer program of the vehicle control device 20 in the present embodiment may be embedded in a ROM or the like in advance.
A vehicle according to a second embodiment has a body with a roof, and a cabin is formed inside the body. The cabin is a space in which occupants such as a driver and passengers ride. The main function of the vehicle is to run with occupants riding in the cabin and transport the occupants to a destination. The occupants may desire an enhanced sense of speed, sense of immersion, and realistic sensation while traveling in such a vehicle.
When the vehicle runs forward with occupants in the cabin, the occupants in the cabin can see the view through the front and side windows and feel that the front view is approaching. For example, when the vehicle runs on a road in a tunnel, the tunnel illumination approaches from a distance, and as the vehicle runs under the lights, the interior of the cabin successively takes on the color of the lights and then becomes dark again. As the vehicle runs through sunlight filtering through leaves, bright and dark areas alternate, and the brightness inside the cabin changes accordingly.
Since the vehicle runs on the ground, the occupants in the cabin feel as if they pass through the view on the road ahead while driving, but they cannot see the view above as it is interrupted by the roof. If an environment matching the view above the occupants is provided, it is expected that a sense of speed, a sense of immersion, and realistic sensation can be enhanced for the driving of the vehicle, and that the entertainment feature in the cabin can be enhanced.
In view of this, in the second embodiment, a plurality of light emitters are disposed in a line along the direction of travel on the ceiling of the cabin of the vehicle, and the respective light emission states of the light emitters are controlled in accordance with an image captured in front of the body, thereby improving the entertainment feature in the cabin.
Specifically, in the vehicle, the body has a roof covering the cabin. The roof has an outer surface and an inner surface. An imaging circuit is disposed at an upper front part in the cabin, and an illumination circuit including a plurality of light emitters disposed in a line along the direction of travel is provided on the inner surface of the roof. The imaging circuit can capture an image in an image acquisition area. The image acquisition area is a region in front of the body and having a height corresponding to the roof. The image acquisition area may be a partial region of the entire region included in the angle of view of the imaging circuit. In the illumination circuit, the respective light emission states of the light emitters change in accordance with an image in the image acquisition area. The imaging circuit acquires an image in the image acquisition area for a plurality of timings. The timings may correspond to a plurality of points aligned at predetermined intervals of the distance that the vehicle should travel. In the illumination circuit, the light emission states of the light emitters temporally change in accordance with temporal change of an attribute of the image in the image acquisition area at a plurality of timings. In the illumination circuit, the light emission states of the light emitters may change so as to flow in a direction opposite to the direction of travel when viewed from inside the cabin. Furthermore, in the illumination circuit, the light emission states of the light emitters spatially change in accordance with temporal change of an attribute of the image in the image acquisition area at a plurality of timings. In the illumination circuit, the light emission states of the light emitters may change in response to an attribute of the image in the image acquisition area. With this configuration, an illumination environment matching the view above can be implemented on the inner surface of the roof, so that a sense of speed, a sense of immersion, and realistic sensation can be enhanced for the driving of the vehicle. The entertainment feature in the cabin therefore can be enhanced.
More specifically, the vehicle is configured as illustrated in
The vehicle 1 includes a plurality of wheels 42-1 to 42-4, a body 43, an imaging circuit 44, an illumination circuit 45, and a vehicle control device 100.
The wheels 42-1 to 42-4 are each rotatable around the Y axis. The wheels (second wheel) 42-3 and 42-4 are disposed on the −X side of the wheels (first wheel) 42-1 and 42-2, and an axle on the +X side and an axle on the −X side (not illustrated) are disposed correspondingly. The wheels 42-1 and 42-2 are respectively joined to the ends on the −Y and +Y sides of the axle on the +X side extending in the Y direction. The wheels 42-3 and 42-4 are respectively joined to the ends on the −Y and +Y sides of the axle on the −X side extending in the Y direction. In
The body 43 rotatably supports the axles, and the wheels 42-1 to 42-4 are coupled to the body 43 through the axles. The body 43 can move in the X direction by rotation of the wheels 42-1 and 42-2 and the wheels 42-3 and 42-4. The body 43 forms a cabin 46. The body 43 has a roof 43a and a plurality of windows 43b, 43c-1, 43c-2, 43d-1, 43d-2, 43e-1, 43e-2, and 43f. The roof 43a covers the cabin 46 from the +Z side and defines a boundary on the +Z side of the cabin 46. The roof 43a has a surface on the +Z side as an outer surface and a surface on the −Z side as an inner surface. The surface on the −Z side of the roof 43a forms a ceiling 46a of the cabin 46. The windows 43b, 43c-1, 43c-2, 43d-1, 43d-2, 43e-1, 43e-2, and 43f define boundaries on the +X, −Y, +Y, −Y, +Y, −Y, +Y, and −X sides, respectively, of the cabin 46.
The occupant in the cabin 46 can see the view on the +X, −Y, +Y, −Y, +Y, −Y, +Y, and −X sides through the windows 43b, 43c-1, 43c-2, 43d-1, 43d-2, 43e-1, 43e-2, and 43f, but cannot see the view on the +Z side as it is interrupted by the roof 43a. The occupant is, for example, a driver 300 or passenger (not illustrated) in the cabin 46.
The imaging circuit 44 is disposed at the +X and +Z sides in the cabin 46. The imaging circuit 44 may be disposed adjacent to the window 43b on the −X side or may be disposed near an end on the +Z side of the window 43b. The imaging circuit 44 has a camera 44a. The camera 44a has an imaging plane facing the +X side and has an optical axis extending along the X direction as indicated by a dot-dash line in
The imaging circuit 44 can capture an image of the exterior of the body 43 and can capture images in image acquisition areas IM1 and IM2. The imaging circuit 44 may have one camera 44a capable of capturing images in both of the image acquisition areas IM1 and IM2, or may have a plurality of cameras 44a individually capable of capturing images in the image acquisition areas IM1 and IM2. In
The imaging range of the imaging circuit 44 includes a space away from the outer surface of the roof 43a with respect to a ground 200 on which at least one of the wheels 42-1 and 42-2 and wheels 42-3 and 42-4 is grounded. The image acquisition areas IM1 and IM2 are areas in front of the body 43 and having a height corresponding to the roof 43a, and may include, for example, a location where the height from the ground 200 is substantially equal to the height of the ceiling of the cabin 46, as indicated by a dotted line in
The illumination circuit 45 is disposed on the +Z side in the cabin 46. The illumination circuit 45 illuminates the inside of the cabin 46. The illumination circuit 45 includes a light emitter group 51, a light emitter group 52, a light emission control circuit 53, a light emission control circuit 54 (see
The light emitter group 51 is disposed on the +Z side in the cabin 46 in a region extending from the +X side to the −X side. The light emitter group 51 includes a plurality of light emitters GR1 to GR3. In
Each of the light emitters GR1 to GR3 of the light emitter group 51 may have a plurality of light sources disposed in a line along the X direction. The light emitter GR1 includes a plurality of light sources D1 to D5 disposed in a line along the X direction. The light emitter GR2 includes a plurality of light sources D6 to D10 disposed in a line along the X direction. The light emitter GR3 includes a plurality of light sources D11 to D15 disposed in a line along the X direction.
The light emitters GR1 to GR3 of the light emitter group 51 may be disposed slightly on the +Y side with respect to the center in the Y direction in the ceiling in the cabin 46. Each of the light sources D1 to D15 of the light emitter group 51 is, for example, a light emitting diode (LED). Each of the light sources D1 to D15 of the light emitter group 51 can adjust its brightness by performing at least one of: changing the duty ratio of voltage; and changing the magnitude of fed current. The emission color of each of the light sources D1 to D15 of the light emitter group 51 may be colored or monochrome, such as R (red), G (green), B (blue), or white. In the following, a case where the emission color of each of the light sources D1 to D15 of the light emitter group 51 is colored will be mainly described by way of example. Each of the light emitters GR1 to GR3 of the light emitter group 51 may be an LED display, a liquid crystal display, or an organic EL display extending along the X direction, instead of a plurality of light sources disposed in a line along the X direction.
The image acquisition area IM1 corresponds to a plurality of light emitters GR1 to GR3 of the light emitter group 51. The straight line along the direction in which the light emitters GR1 to GR3 of the light emitter group 51 are disposed passes through the image acquisition area IM1, as indicated by a dotted line in
For example, the imaging circuit 44 acquires an image in the image acquisition area IM1 for a plurality of timings. The timings may correspond to a plurality of points aligned at predetermined intervals of the distance that the vehicle 1 should travel.
The light emission control circuit 53 changes the light emission state of each light emitter GR1 to GR3 of the light emitter group 51 in response to the movement along the direction of travel of the body 43 and an image captured by the imaging circuit 44, under control of the vehicle control device 100. The light emission control circuit 53 changes the light emission state of each light emitter GR1 to GR3 of the light emitter group 51 in response to the movement along the direction of travel of the body 43 and an image captured by the imaging circuit 44 a predetermined period of time before. The predetermined period of time may vary depending on the speed of movement of the vehicle 1 in a predetermined direction.
The light emission control circuit 53 temporally changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 51 in accordance with temporal change of an attribute of the image in the image acquisition area IM1. The light emission control circuit 53 changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 51 so as to flow in a direction opposite to the direction of travel when viewed from inside of the cabin 46, in accordance with temporal change of an attribute of the image in the image acquisition area IM1. The light emission control circuit 53 may change the light and dark pattern of the light emitters GR1 to GR3 of the light emitter group 51 so as to flow in a direction opposite to the direction of travel when viewed from inside of the cabin 46, in accordance with temporal change of an attribute of the image in the image acquisition area IM1. The light emission control circuit 53 may change the color pattern of the light emitters GR1 to GR3 of the light emitter group 51 so as to flow in a direction opposite to the direction of travel, in accordance with temporal change of an attribute of the image in the image acquisition area IM1.
Furthermore, the light emission control circuit 53 spatially changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 51 in accordance with temporal change of an attribute of the image in the image acquisition area IM1, under control of the vehicle control device 100. The light emission control circuit 53 changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 51 in response to an attribute of the image in the image acquisition area IM1. The light emission control circuit 53 may change the light and dark pattern of the light emitters GR1 to GR3 of the light emitter group 51 in response to the pattern of luminance of the image in the image acquisition area IM1. The light emission control circuit 53 may change the color pattern of the light emitters GR1 to GR3 of the light emitter group 51 in response to the pattern of color component values of the image in the image acquisition area IM1.
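One possible software sketch of this control is given below, assuming that the "attribute" of the image in the image acquisition area IM1 is reduced to an average color and luminance, and that the newest attribute drives the front light emitter while older attributes are pushed rearward so that the pattern appears to flow opposite to the direction of travel. The function names and the Rec. 709 luminance weighting are illustrative choices, not part of the disclosure.

```python
def image_attribute(image_rgb):
    """Reduce an image of the image acquisition area to a simple attribute: its average
    color and luminance. ``image_rgb`` is assumed to be an iterable of (r, g, b) pixel
    tuples with components in 0..255."""
    pixels = list(image_rgb)
    n = max(len(pixels), 1)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b   # standard Rec. 709 weighting
    return (r, g, b), luminance


def drive_light_emitters(attribute_history, set_emitter):
    """Map the most recent attributes to the emitters so that the pattern appears to flow
    opposite to the direction of travel. ``set_emitter(i, color, brightness)`` stands in
    for the light emission control circuit; index 0 is assumed to be the front emitter."""
    # attribute_history[-1] is the newest attribute; it drives the front emitter, and the
    # older attributes are pushed toward the rear of the cabin.
    for i, (color, luminance) in enumerate(reversed(attribute_history[-3:])):
        set_emitter(i, color, luminance / 255.0)
```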
The light emitter group 52 is disposed on the +Z side in the cabin 46 in a region extending from the +X side to the −X side. The light emitter group 52 includes a plurality of light emitters GR1 to GR3. In
Each of the light emitters GR1 to GR3 of the light emitter group 52 may have a plurality of light sources disposed in a line along the X direction. The light emitter GR1 includes a plurality of light sources D1 to D5 disposed in a line along the X direction. The light emitter GR2 includes a plurality of light sources D6 to D10 disposed in a line along the X direction. The light emitter GR3 includes a plurality of light sources D11 to D15 disposed in a line along the X direction.
The light emitters GR1 to GR3 of the light emitter group 52 may be disposed slightly on the −Y side with respect to the center in the Y direction in the ceiling in the cabin 46. Each of the light sources D1 to D15 of the light emitter group 52 is, for example, a light emitting diode (LED). Each of the light sources D1 to D15 of the light emitter group 52 can adjust its brightness by performing at least one of: changing the duty ratio of voltage; and changing the magnitude of fed current. The emission color of each of the light sources D1 to D15 of the light emitter group 52 may be colored or monochrome, such as R (red), G (green), B (blue), or white. In the following, a case where the emission color of each of the light sources D1 to D15 of the light emitter group 52 is colored will be mainly described by way of example. Each of the light emitters GR1 to GR3 of the light emitter group 52 may be an LED display, a liquid crystal display, or an organic EL display extending along the X direction, instead of a plurality of light sources disposed in a line along the X direction.
The image acquisition area IM2 corresponds to a plurality of light emitters GR1 to GR3 of the light emitter group 52. The straight line along the direction in which the light emitters GR1 to GR3 of the light emitter group 52 are disposed passes through the image acquisition area IM2, as indicated by a dotted line in
For example, the imaging circuit 44 acquires an image in the image acquisition area IM2 for a plurality of timings. The timings may correspond to a plurality of points aligned at predetermined intervals of the distance that the vehicle 1 should travel.
The light emission control circuit 54 changes the light emission state of each light emitter GR1 to GR3 of the light emitter group 52 in response to the movement along the direction of travel of the body 43 and an image captured by the imaging circuit 44, under control of the vehicle control device 100. The light emission control circuit 54 changes the light emission state of each light emitter GR1 to GR3 of the light emitter group 52 in response to the movement along the direction of travel of the body 43 and an image captured by the imaging circuit 44 a predetermined period of time before. The predetermined period of time may vary depending on the speed of movement of the vehicle 1 in a predetermined direction.
The light emission control circuit 54 temporally changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 52 in accordance with temporal change of an attribute of the image in the image acquisition area IM2. The light emission control circuit 54 changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 52 so as to flow in a direction opposite to the direction of travel when viewed from inside of the cabin 46, in accordance with temporal change of an attribute of the image in the image acquisition area IM2. The light emission control circuit 54 may change the light and dark pattern of the light emitters GR1 to GR3 of the light emitter group 52 so as to flow in a direction opposite to the direction of travel when viewed from inside of the cabin 46, in accordance with temporal change of an attribute of the image in the image acquisition area IM2. The light emission control circuit 54 may change the color pattern of the light emitters GR1 to GR3 of the light emitter group 52 so as to flow in a direction opposite to the direction of travel when viewed from inside of the cabin 46, in accordance with temporal change of an attribute of the image in the image acquisition area IM2.
Furthermore, the light emission control circuit 54 spatially changes the light emission states of the light emitters GR1 to GR3 of the light emitter group 52 in accordance with temporal change of an attribute of the image in the image acquisition area IM2, under control of the vehicle control device 100. The light emission control circuit 54 changes the light emission states of the light emitters GR1 to GR3 in accordance with an attribute of the image in the image acquisition area IM2. The light emission control circuit 54 may change the light and dark pattern of the light emitters GR1 to GR3 of the light emitter group 52 in response to the pattern of luminance of the image in the image acquisition area IM2. The light emission control circuit 54 may change the color pattern of the light emitters GR1 to GR3 of the light emitter group 52 in response to the pattern of color component values of the image in the image acquisition area IM2.
As illustrated in
The vehicle control device 100 is an information processing device that can be mounted on the vehicle 1 and is disposed at any position in the body 43, for example, at the position indicated by the dotted line in
The vehicle control device 100 can be configured in terms of hardware as illustrated in
The vehicle control device 100 includes an imaging interface (IF) 101, a central processing unit (CPU) 102, a random access memory (RAM) 103, a read only memory (ROM) 104, an illumination IF 105, a global positioning system (GPS) IF 106, a vehicle speed IF 107, a communication IF 108, and a bus 109. The imaging IF 101, the CPU 102, the RAM 103, the ROM 104, the illumination IF 105, the GPS IF 106, the vehicle speed IF 107, and the communication IF 108 are connected to communicate with each other via the bus 109. A control program is stored in the ROM 104.
The imaging IF 101 is connected to communicate with the imaging circuit 44 via a cable or the like. The imaging IF 101 acquires an image captured by the imaging circuit 44. The imaging IF 101 continuously acquires a plurality of frame images of moving images captured by the imaging circuit 44.
The illumination IF 105 is connected to communicate with each of the light emission control circuits 53 and 54 in the illumination circuit 45 via a CAN, a cable, or the like. The illumination IF 105 can supply a control signal to each of the light emission control circuits 53 and 54. Thus, the illumination IF 105 can control a lighting state of each of the light sources D1 to D15 of the light emitter groups 51 and 52.
The GPS IF 106 is connected to communicate with a GPS sensor 47 via a CAN, a cable, or the like. The GPS sensor 47 receives GPS signals. The GPS IF 106 acquires GPS signals received by the GPS sensor 47 and generates location information such as latitude, longitude, and altitude of the vehicle 1 based on the GPS signals.
The vehicle speed IF 107 is connected to communicate with a vehicle speed sensor 48 via a CAN, a cable, or the like. The vehicle speed sensor 48 is disposed near the wheel 42 and generates a vehicle speed pulse indicating the rotational speed or the number of revolutions of the wheel 42. The vehicle speed IF 107 acquires a vehicle speed pulse generated by the vehicle speed sensor 48 and determines the traveling speed of the vehicle 1 based on the vehicle speed pulse.
The communication IF 108 is connected to communicate with a communication device 9 via a CAN, a cable, or the like. The communication device 9 can communicate with a server device (not illustrated) mainly via a wireless communication circuit and receive predetermined information from the server device. The communication IF 108 can acquire the predetermined information from the server device via the communication device 9.
The GPS sensor 47 and the vehicle speed sensor 48 are used to detect the location and the amount of movement of the vehicle and can function as a movement detector that detects movement of the body 43 in a predetermined direction of travel. The movement detector may be replaced by any other means that can detect movement of the body 43 in the predetermined direction of travel with a similar effect.
The vehicle control device 100 can be configured as illustrated in
The vehicle control device 100 includes an acquisition circuit 110, an acquisition circuit 120, and a control circuit 130. The control circuit 130 includes a plurality of shift registers 131 and 136, a processor 133, a distance calculator 134, and a timing controller 135. Each of the shift registers 131 and 136 includes registers 132-1 to 132-50 on multiple stages.
The acquisition circuit 110 acquires images in the image acquisition areas IM1 and IM2. The acquisition circuit 110 acquires, for example, a whole image IM as illustrated in
As indicated by a dotted line in
The acquisition circuit 120 illustrated in
The distance calculator 134 calculates the distance traveled by the vehicle 1, for example, by integrating the traveling speed. The distance calculator 134 supplies the distance traveled by the vehicle 1 to the timing controller 135.
The shift registers 131 and 136 correspond, respectively, to the image acquisition areas IM1 and IM2, to the light emission control circuits 53 and 54, and to the light emitter groups 51 and 52. The shift register 131 includes registers 132-1 to 132-50 on multiple stages. Each of the shift registers 131 and 136 receives and holds an image in the corresponding image acquisition area IM1 or IM2, and shifts the image held in each register 132 to the register 132 on the next stage at a timing controlled by the timing controller 135. In the following, the shift register 131, the image acquisition area IM1, the light emission control circuit 53, and the light emitter group 51 are mainly described by way of example, but the same applies to the shift register 136, the image acquisition area IM2, the light emission control circuit 54, and the light emitter group 52.
The timing controller 135 successively determines a plurality of timings with reference to the point of the image acquisition area IM1, in accordance with the distance traveled by the vehicle 1. As illustrated in
If the road is undulating or winding, the acquired image may differ from the actual scene at the location that the vehicle passes through. Therefore, the distance LIM may be set, for example, such that the time from acquisition of the image of the image acquisition area IM1 until the vehicle passes through the point of the image acquisition area IM1 is two to three seconds. The distance LIM may be, for example, several tens of meters. The predetermined interval ΔL and the number of divisions can be determined as desired in accordance with the traveling speed of the vehicle 1, the specifications of the imaging circuit 44, and the like.
A plurality of timings to be determined by the timing controller 135 correspond to a plurality of points aligned at predetermined intervals of the distance that the vehicle 1 should travel.
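As one possible illustration of the distance-based shift timing described above, the following Python sketch models a 50-stage image pipeline that is shifted each time the integrated traveling speed indicates that the vehicle has traveled the predetermined interval ΔL. The class name, the method names, the value of the interval, and the update interface are assumptions introduced only for illustration and are not part of the embodiment.

```python
# Minimal sketch, assuming a 50-stage pipeline shifted once per traveled interval.
# All names and values are illustrative.

DELTA_L_M = 1.0     # predetermined interval between points, in meters (assumed value)
NUM_STAGES = 50     # corresponds to the registers 132-1 to 132-50

class ImagePipeline:
    """Holds one image per point that the vehicle has not yet passed."""

    def __init__(self):
        self.stages = [None] * NUM_STAGES   # index 0 ~ register 132-1, index 49 ~ register 132-50
        self._distance_since_shift = 0.0

    def on_speed_sample(self, speed_mps, dt_s):
        """Integrate the traveling speed into distance (role of the distance calculator 134)
        and shift once per traveled interval (role of the timing controller 135)."""
        self._distance_since_shift += speed_mps * dt_s
        shifts = 0
        while self._distance_since_shift >= DELTA_L_M:
            self._distance_since_shift -= DELTA_L_M
            self.shift()
            shifts += 1
        return shifts

    def push_new_image(self, image):
        """Store the latest image of the image acquisition area on the first stage."""
        self.stages[0] = image

    def shift(self):
        """Move every held image to the next stage; the final stage feeds the processor 133."""
        self.stages = [None] + self.stages[:-1]

    def final_stage_image(self):
        return self.stages[-1]
```

In this sketch, each call to on_speed_sample would correspond to one vehicle speed sample obtained through the vehicle speed IF 107.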
In
Each of the timings is a timing when an image acquired by the acquisition circuit 110 illustrated in
In
Each of the timings is a timing when the shift register 131 illustrated in
In
Each of the timings is a timing for performing image processing in the processor 133 illustrated in
Each of the light emission control circuits 53 and 54 of the illumination circuit 45 has a shift register 531 and a plurality of control circuits C1 to C15. The control circuits C1 to C15 correspond to a plurality of light sources D1 to D15. The shift register 531 includes registers 532-1 to 532-3 on multiple stages. The register 532-1 corresponds to the control circuits C1 to C5, the register 532-2 corresponds to the control circuits C6 to C10, and the register 532-3 corresponds to the control circuits C11 to C15. Each of the control circuits C1 to C15 is connected to the corresponding light source and turns the light source on or off or changes the brightness or color of the light source.
The processor 133 generates a control value for controlling the light source based on the image acquired from the register 132-50 on the final stage, considering the result of statistical processing. The processor 133 stores the control value into the register 532 on the first stage of the shift register 531. The control value may include a Y value indicating luminance, an R value that is a red color component value, a G value that is a green color component value, and a B value that is a blue color component value.
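To make the propagation of control values concrete, the sketch below models the three-stage shift register 531, in which each stage drives one of the light emitters GR1 to GR3 through its control circuits. The dataclass, the method names, and the drive callback are assumptions; only the stage-to-emitter correspondence follows the description above.

```python
# Illustrative sketch of the three-stage illumination shift register 531.

from dataclasses import dataclass

@dataclass
class ControlValue:
    y: int   # Y value indicating luminance
    r: int   # red color component value
    g: int   # green color component value
    b: int   # blue color component value

class IlluminationShiftRegister:
    """Stage 0 drives GR1 (C1-C5), stage 1 drives GR2 (C6-C10), stage 2 drives GR3 (C11-C15)."""

    def __init__(self):
        self.stages = [None, None, None]   # registers 532-1 to 532-3

    def push(self, control_value):
        """Store a new control value on the first stage and shift the older values onward."""
        self.stages = [control_value] + self.stages[:-1]

    def apply(self, drive_light_emitter):
        """Let each group of control circuits drive its light sources with the held value."""
        for emitter_index, value in enumerate(self.stages):
            if value is not None:
                drive_light_emitter(emitter_index, value)
```

Pushing a new control value each time the vehicle travels the predetermined interval causes the same value to move from GR1 toward GR3, which corresponds to the flow of light described for the control value FR_49.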
The control of the vehicle interior illumination may be performed for one light source at a time, for a plurality of light sources in a batch, or for all light sources in a batch. When the control is performed for a plurality of light sources or for all light sources in a batch, the luminance and the lighting color are smoothed and interpolated so that they change smoothly. If the number of image frames is less than the number of units of control for the vehicle interior illumination, the luminance and the lighting color may be interpolated from the preceding and subsequent images so that they change smoothly.
In
The control value stored in the register 532-1 on the first stage of the shift register 531 is supplied to the control circuits C1 to C5 corresponding to the light emitter GR1. The control circuits C1 to C5 control the brightness and color of the light sources D1 to D5 in accordance with the control value.
In
Each of the timings is a timing for changing the control of the light emitter group 51 by the light emission control circuit 53 illustrated in
In
As illustrated in
For example, as illustrated in
As illustrated in
As illustrated in
Although not illustrated in the drawing, at timing t52 when the vehicle 1 reaches an X position x52, the register 532-2 of the shift register 531 transfers the control value FR_49 to the register 532-3 on the next stage. In response, the control circuits C11 to C15 turn on the light sources D11 to D15 of the light emitter GR3 corresponding to the point x49 in a color in accordance with the control value FR_49 and at intermediate brightness in accordance with the control value FR_49.
As illustrated in
As the statistical processing performed in the processor 133 illustrated in
In the control of the line illumination, it is also possible to capture only one pixel of the camera per image acquisition area. In this case, one pixel is acquired from one point on the extension of the dotted line in
To match actual experience, the image acquisition area is set as a horizontally long ellipse or a horizontally long rectangle centered on one point on the dotted line in
Each of the pixel signals included in the image includes, as the attribute, a Y signal indicating luminance, an R signal indicating a red (R) color component value, a G signal indicating a green (G) color component value, and a B signal indicating a blue (B) color component value. For the image to be processed, the processor 133 averages the luminance of the divided regions DV1 to DV25 to obtain luminance AVEY, and averages the red (R), green (G), and blue (B) color component values of the divided regions DV1 to DV25 to obtain color AVERGB. For the image to be processed, the processor 133 specifies a region with maximum luminance (e.g., DV18) among the divided regions DV1 to DV25, and sets the attribute of the specified region as representative value A. The representative value A includes luminance AY of the region with maximum luminance and color ARGB that is the red (R), green (G), and blue (B) color component values of the region with maximum luminance. For the image to be processed, the processor 133 specifies a region with minimum luminance (e.g., DV3) among the divided regions DV1 to DV25, and sets the attribute of the specified region as representative value B. The representative value B includes luminance BY of the region with minimum luminance and color BRGB that is the red (R), green (G), and blue (B) color component values of the region with minimum luminance. When the emission color of each of the light sources D1 to D15 is white, the attributes AVERGB, ARGB, and BRGB indicating color component values may be omitted.
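A minimal sketch of this statistical processing is given below, assuming that the image of the image acquisition area has already been divided into the regions DV1 to DV25 and that each region is summarized by its mean Y, R, G, and B values; the function name and the dictionary layout are placeholders.

```python
# Sketch of computing AVEY, AVERGB, representative value A, and representative value B.

def summarize_regions(regions):
    """regions: list of dicts such as {"y": 0.6, "r": 200, "g": 180, "b": 160} for DV1 to DV25."""
    n = len(regions)

    # Average luminance AVEY and average color AVERGB over all divided regions.
    ave_y = sum(r["y"] for r in regions) / n
    ave_rgb = tuple(sum(r[c] for r in regions) / n for c in ("r", "g", "b"))

    # Representative value A: attribute of the region with maximum luminance.
    region_a = max(regions, key=lambda r: r["y"])
    a_y, a_rgb = region_a["y"], (region_a["r"], region_a["g"], region_a["b"])

    # Representative value B: attribute of the region with minimum luminance.
    region_b = min(regions, key=lambda r: r["y"])
    b_y, b_rgb = region_b["y"], (region_b["r"], region_b["g"], region_b["b"])

    return {"AVEY": ave_y, "AVERGB": ave_rgb, "AY": a_y, "ARGB": a_rgb, "BY": b_y, "BRGB": b_rgb}
```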
Images of the image acquisition area are accumulated in the shift register as the vehicle 1 moves and time passes, and are reflected in the control of the line illumination. This process yields the luminance and color information of representative value A and representative value B. A similar evaluation is performed again for the next acquired image to obtain the regions giving representative value A and representative value B in that image. Dot-like bright spots such as tunnel illumination can be represented by defining representative value A over a wide image acquisition area.
In a dark area, turning on the line illumination using representative value A works well. In the daytime, however, representative value A may constantly reflect a bright sky, and the illumination would then be turned on monotonously in almost pure white. Representative value B is used to cope with this situation.
To match actual experience, a feature point of luminance is extracted from the image in the image acquisition area, and the vehicle interior illumination control is adjusted to match the luminance of the feature point. If the image in the image acquisition area has no feature point, the average luminance is used to control the vehicle interior illumination.
As illustrated in
At timing TM1 illustrated in
At timing TM2 illustrated in
At timing TM3 illustrated in
At timing TM4 illustrated in
At timing TM5 illustrated in
At timing TM6 illustrated in
As illustrated in
The operation flow of the vehicle 1 will now be described using
When receiving a start request from the occupant, the vehicle control device 100 determines that the driving of the vehicle 1 is to be started and performs initial settings (S101). The vehicle control device 100 initializes the parameters AVEY (*), AY (*), BY (*), AVERGB (*), ARGB (*), and BRGB (*) to zero. The number in parentheses in the parameter name identifies the image, and a smaller number represents a parameter of older data. The vehicle control device 100 initializes the registers 132 and 532 on each stage of the shift registers 131 and 531 to zero. In this case, the vehicle control device 100 does not allow light emission from each of the light emitters GR1 to GR3 of the light emitter groups 51 and 52, but may allow light emission such as a welcome light when a passenger gets on or off the vehicle.
The vehicle control device 100 checks a mode setting request for light emission control of the light emitter groups 51 and 52, and if setting to a light flowing mode is not requested (No at S102), the process returns to S101.
If setting to the light flowing mode is requested (Yes at S102), the vehicle control device 100 captures an image of the image acquisition area ahead, through the imaging circuit 44 (S103).
The vehicle control device 100 looks for the highest luminance point A and the lowest luminance point B in the captured image. If the highest luminance point A and the lowest luminance point B are found, the vehicle control device 100 calculates the luminance AY of the highest luminance point A and the ratio ARGB of color component values (S104).
If the latest image is the 50th image, the vehicle control device 100 writes information on the highest luminance point A in the 50th image into AY (50) and ARGB (50) and writes information on the lowest luminance point B in the 50th image into BY (50) and BRGB (50) (S105).
The vehicle control device 100 calculates the average luminance AVEY and the average RGB ratio AVERGB of the 50th image and writes them into AVEY (50) and AVERGB (50), respectively (S106).
The vehicle control device 100 updates the reference luminance AVEY50, which serves as a reference of luminance, by averaging the average luminance values of the 1st to 50th images (S107, B).
The data with n=0 is used to turn on the head of the line illumination. When the vehicle 1 passes through the location corresponding to that data, the data is discarded.
The vehicle control device 100 shifts the control value of light emission of the entire light emitter group by one in a direction opposite to the direction of travel when viewed from inside the cabin 46 (S108). Although the control value of light emission is described as being shifted by one, the control value is not necessarily a control value of one physical light source and may be a control value per unit of control, that is, for each light emitter. The process of averaging the control values among a plurality of units of control may be performed to smooth out color and light intensity variations between the preceding and subsequent control circuits.
The vehicle control device 100 compares the difference of the highest luminance from the reference luminance (AY(0)−AVEY50) with a reference value K1, for the zeroth image. If the difference of the highest luminance from the reference luminance (AY(0)−AVEY50) exceeds the reference value K1 (Yes at S109), the vehicle control device 100 determines that the point A should be reflected in the lighting control and performs lighting control of the head light source using AY(0) and ARGB(0) (S110).
If the difference of the highest luminance from the reference luminance (AY(0)−AVEY50) is equal to or less than the reference value K1 (No at S109), the vehicle control device 100 compares the difference of the lowest luminance from the reference luminance (AVEY50−BY(0)) with the reference value K2, for the zeroth image. If the difference of the lowest luminance from the reference luminance (AVEY50−BY(0)) is below the reference value K2 (Yes at S111), the vehicle control device 100 determines that the point B should be reflected in the light emission control and performs lighting control of the head light source using BY(0) and BRGB(0) (S112).
If the difference of the lowest luminance from the reference luminance (AVEY50−BY(0)) is equal to or more than the reference value K2 (No at S111), the vehicle control device 100 determines that the average value should be reflected in the light emission control and performs light emission control of the head light source using AVEY(0) and AVERGB(0) (S113).
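The branch at S109 to S113 can be summarized as in the following sketch, which follows the comparisons exactly as stated above; the function name, the parameter dictionary, and the concrete values of K1 and K2 are assumptions.

```python
# Sketch of the decision at S109 to S113 for the zeroth image. The thresholds K1 and K2 are
# design parameters; no concrete values are given in the description.

def select_head_control(params0, ave_y50, k1, k2):
    """params0: dict with AY, ARGB, BY, BRGB, AVEY, AVERGB of the zeroth image."""
    if params0["AY"] - ave_y50 > k1:
        # Yes at S109: reflect the highest luminance point A (S110).
        return params0["AY"], params0["ARGB"]
    if ave_y50 - params0["BY"] < k2:
        # Yes at S111: reflect the lowest luminance point B (S112).
        return params0["BY"], params0["BRGB"]
    # No at S111: reflect the average attributes of the image (S113).
    return params0["AVEY"], params0["AVERGB"]
```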
The vehicle control device 100 waits until the vehicle 1 travels a certain distance ΔL from the point at S103 (No at S114), and when the vehicle 1 travels the certain distance ΔL (Yes at S114), the vehicle control device 100 shifts the register on each stage of the shift register by one. In other words, n in the parameters AVEY(n), AY(n), BY(n), AVERGB(n), ARGB(n), and BRGB(n) is decremented by one (S115). When the certain distance ΔL to be traveled is matched to the physical pitch of one unit of light emission control, the light can flow smoothly in conjunction with changes in the ambient light.
Subsequently, the vehicle control device 100 returns the process to S102 (A) and repeats the process after S102.
As described above, in the second embodiment, in the vehicle 1, a plurality of light emitters GR1 to GR3 are disposed in a line along the direction of travel on the ceiling of the cabin 46, and the respective light emission states of the light emitters GR1 to GR3 are controlled in accordance with an image captured in front of the body 43. With this configuration, an illumination environment matching the view above can be implemented in real time on the ceiling of the cabin 46, and a sense of speed, a sense of immersion, and realistic sensation can be enhanced for the driving of the vehicle 1. The entertainment feature in the cabin 46 can therefore be enhanced.
The imaging circuit 44 may be disposed at any position other than the +X and +Z sides in the cabin 46 at which images in the image acquisition areas IM1 and IM2 can be captured, and may be disposed outside the cabin 46. A camera disposed in the cabin 46 for other purposes may be used as the imaging circuit 44. For example, a dashboard camera or a camera for advanced driver assistance systems (ADAS) may be used.
The illumination circuit 45 may have one light emitter group or three or more light emitter groups. When the illumination circuit 45 has three or more light emitter groups, the imaging circuit 44 may be capable of capturing images in three or more image acquisition areas corresponding to the three or more light emitter groups. The imaging circuit 44 may have one camera 44a capable of capturing images in three or more image acquisition areas, or may have three or more cameras 44a capable of individually capturing images in three or more image acquisition areas. Three or more cameras 44a may be aligned along the Y direction at a position on the +X and +Z sides in the cabin 46.
As a first modification of the second embodiment, if the vehicle 1 can acquire map information on a certain object on the roadside ahead, map information may be further reflected in illumination control in the cabin 46. The map information is information in which the geographic location of an object and the attributes such as color and brightness of the object on the roadside are associated with each other for a plurality of objects.
In the vehicle 1, the vehicle control device 100 receives map information from a cloud server or the like at a predetermined timing via the communication device 9 illustrated in
In the vehicle 1, the vehicle control device 100 sequentially receives GPS signals through the GPS sensor 47, sequentially determines the current location of the vehicle 1 using the GPS IF 106, and sequentially corrects the current location with the vehicle speed determined using the vehicle speed IF 107. The vehicle control device 100 makes a change in accordance with the map information when generating a control value for the light emitter groups 51 and 52 in accordance with the images captured for the image acquisition areas IM1 and IM2. When it is determined that the vehicle 1 has reached the geographic location of an object, the vehicle control device 100 specifies an attribute such as color and brightness of the object corresponding to the geographic location in the map information. The vehicle control device 100 changes the control value of the light emitter groups 51 and 52 based on the captured image, in accordance with the attribute specified from the map information. With this configuration, the light emission control of each of the light emitters GR1 to GR3 of the light emitter groups 51 and 52 can be further matched to an object on the roadside, and a sense of immersion and realistic sensation can be further enhanced for the driving of the vehicle 1.
The vehicle control device 100 may reflect the characteristic colors of the surrounding landscape in the vehicle interior. For example, if the object is a tree, the vehicle control device 100 controls the emission color of the light emitters GR1 to GR3 to green or the color of autumn leaves. If the object is a townscape or a historic building, the vehicle control device 100 controls the emission color of the light emitters GR1 to GR3 to the color of its walls or roof. The vehicle control device 100 darkens the light emitters GR1 to GR3 when the vehicle 1 enters a tunnel, and brightens the light emitters GR1 to GR3 when the vehicle 1 passes a streetlight after dark.
In the vehicle 1, the vehicle control device 100 may determine, for each object on the roadside, whether the control value of the light emitter groups 51 and 52 generated from the image captured by the imaging circuit 44 is to be used as it is, or whether the control value is to be changed in accordance with the map information. For example, objects for which the control value is used as it is are trees, and objects for which the control value is changed are tunnels, streetlights, townscapes, and historic buildings. Since the vehicle 1 can sequentially change the attribute specified from the map information depending on the weather, the date and time of day, and events, a variety of responses are possible, such as creating a sense of the season or changing colors depending on the time of day when the vehicle passes. The vehicle control device 100 can acquire roadside map information such as the season and weather in which autumn leaves can be seen through the vehicle windows along the road. In addition to capturing the luminance and color of the image acquisition area with the camera of the imaging circuit 44, the vehicle 1 can detect trees visible through the vehicle windows as objects and, assuming based on the roadside map information that the trees have uniformly turned red, cause red or yellow light to flow in the vehicle in accordance with the locations of the trees visible through the vehicle windows. Similarly, the vehicle 1 can cause blue and white light to flow when snow-covered trees or a blue sky are seen through the vehicle windows.
In this case, the control of luminance and the like in accordance with the map information may be combined with the control of luminance and the like as illustrated in
The vehicle control device 100 changes the values of AY and BY specified from the captured image to sufficiently large values relative to the reference values K1 and K2 so that the colors treated as high luminance point A and low luminance point B are reflected in the light emission control of the light emitter groups 51 and 52.
In
When the vehicle 1 reaches the “tunnel area”, illumination control is performed using map information+GPS. For three points in the “tunnel area”, the vehicle 1 sets the control values treated as high luminance point A to none, none, none, and sets the control values treated as low luminance point B to black, black, black.
When the vehicle 1 reaches the “sunset area” during early evening hours on a sunny day, illumination control is performed using map information+GPS+weather+date and time+time of day. For three points in the “sunset area”, the vehicle 1 sets the control value treated as high luminance point A to dark blue, purple, orange, and sets the control value treated as low luminance point B to none, none, none.
When the vehicle 1 reaches a “historic building area” during daytime hours, illumination control is performed using map information+GPS+time of day. For three points in “historic building area”, the vehicle 1 sets the control value treated as high luminance point A to white, white, white, and sets the control value treated as low luminance point B to none, none, none.
When the vehicle 1 reaches a “tree-lined avenue area” during daytime hours, illumination control using the captured image is combined with illumination control using map information+GPS. For five points in the “tree-lined avenue area”, the vehicle 1 sets the control value treated as high luminance point A to green, green, green, green, green and sets the control value treated as low luminance point B to none, none, none, none, none.
The vehicle control device 100 may update the object information in the map information received from a cloud server or the like via the communication device 9 with the information of an object detected when the vehicle actually runs, and then reflect the updated object information in the light emission control of the light emitter groups 51 and 52 when the vehicle runs on the roadside with the object. For example, when target objects increase or decrease in number or are relocated due to construction, or when the visible color of objects changes due to weather fluctuations, the vehicle control device 100 can update the map information to a color more appropriate for the objects.
As illustrated in
After the process at S101 to S107 is performed, the vehicle control device 100 receives GPS signals through the GPS sensor 47, determines the current location of the vehicle 1 with the GPS IF 106, and corrects the current location with the vehicle speed determined with the vehicle speed IF 107 (S121).
The vehicle control device 100 refers to the map information and changes AY, ARGB, BY, and BRGB, if available (S122, B). In other words, when it is determined that the vehicle 1 has reached the geographic location of an object, the vehicle control device 100 specifies the attribute such as color and brightness of the object corresponding to the geographic location in the map information. The vehicle control device 100 overwrites AY, ARGB, BY, and BRGB with values in accordance with the attribute specified from the map information. The vehicle control device 100 then performs the process after S108.
The vehicle control device 100 may skip the process at S121 and S122 if there is no map information, or if the map information is not enabled due to seasonal, time of day, or weather factors.
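The overwrite at S121 and S122 may be sketched as follows, assuming a hypothetical lookup helper that returns the attribute of the object registered in the map information for the current location; the helper name and its record layout are illustrative and are not defined in the embodiment.

```python
# Hedged sketch of S121 and S122: if the map information provides an object attribute for the
# current location, the image-derived parameters are overwritten before the light emission control.

def apply_map_information(params, current_location, map_info, lookup_object):
    """params: dict with AY, ARGB, BY, BRGB derived from the captured image."""
    if not map_info:
        # No map information, or the map information is not enabled (S121 and S122 skipped).
        return params

    obj = lookup_object(map_info, current_location)   # e.g. nearest registered object, if any
    if obj is None:
        return params

    # Overwrite the image-derived attributes with values in accordance with the attribute
    # specified from the map information (S122).
    updated = dict(params)
    for key in ("AY", "ARGB", "BY", "BRGB"):
        if key in obj:
            updated[key] = obj[key]
    return updated
```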
In this way, in the first modification of the second embodiment, the control values of the light emitter groups 51 and 52 based on the captured images are changed in accordance with the attribute specified from the map information. This process enables the light emission control of the light emitter groups 51 and 52 to be matched with the object on the roadside and can further enhance a sense of immersion and realistic sensation for the driving of the vehicle 1.
As a second modification of the second embodiment, the light emission control of the light emitter groups 51 and 52 may be deactivated in a steady state, and the light emission control of the light emitter groups 51 and 52 may be started when the geographic location of the vehicle 1 comes within a highlight point. The geographic location of the highlight point may be included in the map information. The vehicle control device 100 may acquire information on the geographic location of the highlight point in cooperation with the navigation system of the vehicle 1. For example, when it is detected that the sightseeing spot guidance function of the navigation system has been turned ON, the vehicle control device 100 may determine that the vehicle has come within the highlight point and start the light emission control of the light emitter groups 51 and 52. Alternatively, the vehicle control device 100 may determine that the vehicle has come within the highlight point and start the light emission control of the light emitter groups 51 and 52 in response to a start signal of the sightseeing commentary voice of the navigation system.
This process clarifies the location where the vehicle 1 should perform light emission control. In other words, the vehicle 1 does not emit light from each of the light emitters GR1 to GR3 until it reaches the highlight point, and performs light emission control of each of the light emitters GR1 to GR3 when it reaches the highlight point. This process can produce effects such as surprising the occupants when light is emitted, and attracting their interest in the explanation of the location.
As the second modification of the second embodiment, as illustrated in
If setting to the light flowing mode is requested (Yes at S102), the vehicle control device 100 waits (No at S131) until the geographic location of the vehicle 1 comes within the highlight point. If the geographic location of the vehicle 1 comes within the highlight point (Yes at S131), the vehicle control device 100 performs the process after S103. Although not illustrated in the drawing, the vehicle control device 100 may determine whether the vehicle 1 has come out of the highlight point after S115. The vehicle control device 100 may continue the light emission control of the light emitter groups 51 and 52 while the vehicle 1 is within the highlight point, and suspend the light emission control of the light emitter groups 51 and 52 when the vehicle 1 comes out of the highlight point. Then (A), the vehicle control device 100 may return the process to S102.
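The gating at S131 can be expressed, for example, as the following small sketch; the area class, its fields, and the commentary flag are assumptions introduced only to illustrate the two trigger conditions described above (the geographic location coming within a highlight point, or the start signal of the sightseeing commentary voice).

```python
# Small sketch of the gating at S131, assuming planar coordinates in meters and a simple
# circular extent for each highlight point.

class HighlightArea:
    """A highlight point with a circular extent, in planar coordinates (meters)."""

    def __init__(self, x_m, y_m, radius_m):
        self.x_m, self.y_m, self.radius_m = x_m, y_m, radius_m

    def contains(self, x_m, y_m):
        return ((x_m - self.x_m) ** 2 + (y_m - self.y_m) ** 2) ** 0.5 <= self.radius_m

def light_flowing_enabled(x_m, y_m, highlight_areas, commentary_started=False):
    """True while the vehicle is within a highlight point or the sightseeing commentary has started."""
    return commentary_started or any(area.contains(x_m, y_m) for area in highlight_areas)
```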
In this way, in the second modification of the second embodiment, the light emission control of the light emitter groups 51 and 52 based on the captured images is started in response to the geographic location of the vehicle 1 coming within the highlight point. This process can surprise the occupants when light is emitted and can further enhance a sense of immersion and realistic sensation for the driving of the vehicle 1.
A third embodiment will be described using the drawings.
As illustrated in
The vehicle 1 includes a pair of door mirrors 94 at both ends in the ±X direction of the body 2 at positions closer to the front tires 3f in the ±Y direction of the body 2 and at a predetermined height in the vehicle height direction (±Z direction orthogonal to the ±X and ±Y directions).
The vehicle 1 includes a plurality of seats 95a to 95f in the interior of the body 2. The seats 95a and 95b are arranged side by side in the ±X direction, closer to the front tires 3f. The seats 95c and 95d are arranged side by side in the ±X direction, between the front tires 3f and the rear tires 3r. The seats 95e and 95f are arranged side by side in the ±X direction, closer to the rear tires 3r. The seat 95c is arranged behind the seat 95a, the seat 95d is arranged behind the seat 95b, the seat 95e is arranged behind the seat 95c, and the seat 95f is arranged behind the seat 95d. The number and arrangement of the seats 95 in the vehicle 1 are not limited to the example in
The vehicle 1 includes a plurality of communication devices 96a to 96f on the inner walls of the body 2. The communication devices (96a, 96c, 96d, 96e, 96f) for use in communication between occupants in the vehicle are arranged near the seats.
As for the communication devices (96a, 96c, 96d, 96e, 96f), for example, the communication device 96a is arranged in front of the seats 95a and 95b, the communication device 96c is arranged to the right of the seat 95c, the communication device 96d is arranged to the left of the seat 95d, the communication device 96e is arranged to the right of the seat 95e, and the communication device 96f is arranged to the left of the seat 95f. The communication devices (96a, 96c, 96d, 96e, 96f) are used for communication between the occupants.
In the present description, the end surface on the front tire 3f side of the body 2 may be referred to as the front surface. The end surface on the rear tire 3r side of the body 2 may be referred to as the rear surface. Both end surfaces in the ±X direction of the body 2 may be referred to as the side surfaces. When a person is seated on any of the seats 95a to 95f in the vehicle 1, the side surface on the right side may be referred to as the right side surface and the side surface on the left side may be referred to as the left side surface.
In the present description, the direction toward the left side surface of the body 2 is the +X direction, and the direction toward the right side surface is the −X direction. The direction toward the front surface side of the body 2 is the +Y direction, and the direction toward the rear surface side is the −Y direction. The direction toward the top of the body 2 is the +Z direction, and the direction toward the bottom (road surface side) is the −Z direction.
In the present description, when the vehicle 1 is parked on a road surface having an ideal plane, the axis in the ±X direction (X axis) and the axis in the ±Y direction (Y axis) of the vehicle 1 are parallel to the road surface, and the axis in the ±Z direction (Z axis) of the vehicle 1 is parallel to the normal to the road surface.
The vehicle 1 can run on two pairs of wheels 3 arranged along the ±Y direction. In this case, the ±Y direction in which two pairs of wheels 3 are arranged is the traveling direction (movement direction) of the vehicle 1, and the vehicle 1 can move forward (travel in the +Y direction) or backward (travel in the −Y direction), for example by switching the gear. The vehicle 1 can also turn right and left by steering.
The communication control device 400 is mounted, for example, on the vehicle 1 and controls the communication devices 96a, 96c, 96d, 96e, and 96f. The communication control device 400 performs control of, for example, displaying various types of information on the communication control device 400 in response to the user's input. The details will be described later.
The CPU 221 executes a computer program to centrally control the operation of the communication control device 400 and implement various functions of the communication control device 400. The various functions of the communication control device 400 will be described later.
The ROM 222 is a nonvolatile memory and stores various data (information that is written in the manufacturing stage of the communication control device 400) including a computer program for activating the communication control device 400. The RAM 223 is a volatile memory having a work area for the CPU 221. The auxiliary storage device 224 stores various data such as computer programs executed by the CPU 221. The auxiliary storage device 224 is composed of, for example, a hard disk drive (HDD).
The input device 225 is a device for the occupant using the communication control device 400 (here, for example, the person operating the communication device) to perform various operations. The input device 225 is composed of, for example, a touch panel or hardware keys.
The display device 226 is a display that displays various types of information including icons and a seating chart. The display device 226 may be composed of, for example, a liquid crystal display, and the input device 225 and the display device 226 may be configured as one unit, for example, in the form of a touch panel.
The external I/F 227 is an interface for connecting (communicating) to external devices such as the communication devices 96a, 96c, 96d, 96e, and 96f, for example, over a local interconnect network (LIN).
The speaker I/F 228 is an interface for outputting a dial tone or a ringing tone when a transmitted icon and seat are received.
The light-emitting device 229 is a light emitter that successively turns on so that the lighting approaches, from the transmitting seat, the communication device corresponding to the transmitted seat. The light-emitting device 229 is composed of, for example, a light emitting diode (LED).
The display device 226 includes an operation surface 226A that can be at least touched with a finger, a sheet 226B arranged along the operation surface 226A on the underside of the operation surface 226A, a light emission circuit (display circuit) 226C arranged along the operation surface 226A on the underside of the sheet 226B and capable of emitting visible light (predetermined light), and a touch panel circuit (detection circuit) 226D arranged between the operation surface 226A and the light emission circuit 226C along the operation surface 226A and the light emission circuit 226C and capable of detecting at least finger movement on the operation surface 226A.
The sheet 226B may be affixed to the touch panel circuit 226D with double-sided tape or adhesive, or may be printed directly. The light emission circuit 226C is an organic electroluminescence (EL) display circuit or a liquid crystal display circuit with backlight. The sheet 226B and the touch panel circuit 226D are light-transmitting and allow light emitted by the light emission circuit 226C to pass through. For example, an ultrasonic surface acoustic wave, resistive, or capacitive touch panel is used for the touch panel circuit 226D.
The control circuit 31 includes an icon display control circuit 31A, an icon selection circuit 31B, a seat display control circuit 31C, a seat selection circuit 31D, a transmission control circuit 31E, a reception control circuit 31F, and a light emission control circuit 31G.
The icon display control circuit 31A has a control function of displaying at least one icon for use in communication between the occupants on the display. The at least one icon for use in communication between occupants is the icon data 32A stored in the storage memory 32. In the present example, the icon display control circuit 31A may have a function of displaying the icon data 32A for use in communication on the display when activated. Default icon data 32A may be set when the icon data 32A first appears on the display after the communication control device 400 is activated.
Here,
The control function of displaying icons on the display refers to the control function of displaying the icon data 32A on the display or displaying icon selection data 33A selected by the icon selection circuit 31B in response to the user's input from the icons of icon data 32A appearing on the display.
The icon selection circuit 31B has a function of selecting one icon from among at least one icon in response to the user's input. If the user wants to transmit an icon to another occupant, the user selects one icon from among at least one icon. The icon to be selected is the icon data 32A stored in the storage memory 32, and the selected icon is stored in the icon selection data 33A stored in the working memory 33.
The seat display control circuit 31C has a control function of displaying a seating chart indicating the positions of seats on the display. The seating chart indicating the positions of seats is a seating chart 32C stored in the storage memory 32. Here,
The seat selection circuit 31D has a function of selecting a seat serving as a transmission destination of the icon selected by the icon selection circuit from the seating chart in response to the user's input. The seat serving as a transmission destination of the icon is the seat data 32B stored in the storage memory 32, and the selected seat is stored in the seat selection data 33B stored in the working memory 33. The seat of the transmission source from which the user transmits is also stored in the seat selection data 33B.
The transmission control circuit 31E has a control function of transmitting an icon to the communication device corresponding to the seat selected by the seat selection circuit 31D. The selected seat is the seat selection data 33B stored in the working memory 33, and the icon to be transmitted is icon selection data 33A stored in the working memory 33.
The reception control circuit 31F has a control function of receiving an icon on the communication device corresponding to the seat transmitted by the transmission control circuit 31E. The seat transmitted is the seat selection data 33B stored in the working memory 33, and the icon received is the icon selection data 33A stored in the working memory 33.
The light emission control circuit 31G has a light emission control function in which the light emitter successively turns on so that the lighting approaches, from the transmitting seat, the communication device corresponding to the seat transmitted by the transmission control circuit 31E. The transmitting seat and the transmitted seat are the seat selection data 33B stored in the working memory 33, and the successive turn-on of the light emitters is controlled in accordance with the light emission presence/absence data 33D stored in the working memory 33.
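Purely as an illustration of how these circuits could share state, the following sketch groups the selection data into one structure; the class, field names, and types are assumptions and do not reflect the actual layout of the storage memory 32 and the working memory 33.

```python
# Illustrative grouping of the selection data handled by the control circuit 230.

from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class SelectionState:
    icon_selection: Optional[str] = None      # icon selection data 33A (e.g. an icon identifier)
    source_seat: Optional[str] = None         # transmitting seat, part of the seat selection data 33B
    destination_seat: Optional[str] = None    # transmitted seat, part of the seat selection data 33B
    light_emission_on: Dict[str, bool] = field(default_factory=dict)  # light emission presence/absence data 33D

def prepare_transmission(state, icon_id, source_seat, destination_seat):
    """Record the user's selections so that the transmission control circuit 31E can send them."""
    state.icon_selection = icon_id
    state.source_seat = source_seat
    state.destination_seat = destination_seat
    return state
```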
The process of transmitting an icon that is performed by the control circuit 230 will now be described.
First, the icon display control circuit 31A and the seat display control circuit 31C delete the icon selection data 33A, the seat selection data 33B including the sender and the receiver, and the seating chart data 33C, that is, all of the data (step S501). When all the data is deleted, the process proceeds to step S502.
At step S502, the reception control circuit 31F determines whether there is any reception data to receive icon selection data 33A on the communication device corresponding to seat selection data 33B transmitted by the transmission control circuit 31E. If it is determined that there is reception data (No at step S502), the process proceeds to a reception process S503. The function of the reception process S503 of the control circuit 230 will be described later in
At step S504, the icon display control circuit 31A determines whether the user has touched the display. If it is determined that the user has not touched the display (No at step S504), the process returns to step S502. If it is determined that the user has touched the display (Yes at step S504), the process proceeds to step S505.
At step S505, the icon display control circuit 31A displays icon data 32A so that one icon selection data 33A is selected from among at least one icon data 32A in response to the user's input. When icon data 32A is displayed, the process proceeds to step S506.
At step S506, it is determined whether the user changes the type of icon data 32A that the user wants to transmit. If it is determined that the user changes the type of icon data 32A (Yes at step S506), the process proceeds to step S507. At step S507, the icon display control circuit 31A changes the type of icon data 32A, and the process proceeds to step S506. If it is determined that the user does not change the type of icon data 32A (No at step S506), the process proceeds to step S508.
At step S508, it is determined whether the icon selection circuit 31B has selected one icon selection data 33A from among at least one icon data 32A in response to the user's input. If it is determined that the user has not selected icon data 32A (No at step S508), the process proceeds to step S509.
At step S509, the icon display control circuit 31A determines whether a predetermined time has elapsed after the display is touched. If it is determined that the predetermined time has elapsed (Yes at step S509), the process of transmitting icon selection data 33A ends, without the user transmitting icon selection data 33A. If it is determined that the predetermined time has not elapsed (No at step S509), the process proceeds to step S505. If it is determined that the user has selected icon data 32A (Yes at step S508), the process proceeds to step S510.
At step S510, the icon display control circuit 31A performs a process of erasing icon data 32A except for one icon selection data 33A selected from among at least one icon data 32A, in response to the user's input. When the display is erased except for the selected icon selection data 33A, the process proceeds to step S511.
At step S511, the seat display control circuit 31C performs control of displaying the seating chart 32C indicating the positions of seats on the display. When the seating chart 32C is displayed on the display, the process proceeds to step S512.
At step S512, it is determined whether the seat selection circuit 31D selects seat data 32B serving as a transmission destination of the icon selection data 33A selected by the icon selection circuit 31B from the seating chart 32C, in response to the user's input. If it is determined that seat data 32B is not selected (No at step S512), the process proceeds to step S513.
At step S513, it is determined whether a predetermined time has elapsed after the seat selection circuit 31D touches the display. If it is determined that the predetermined time has elapsed (Yes at step S513), the process of transmitting icon selection data 33A ends, without the user transmitting icon selection data 33A. If it is determined that the predetermined time has not elapsed (No at step S513), the process proceeds to step S512. If it is determined that seat data 32B is selected (Yes at step S512), the process proceeds to step S514.
At step S514, in response to user's input, the seat selection circuit 31D erases seat data 32B from the seating chart data 33C, except for the seat selection data 33B that is a transmission destination of icon selection data 33A selected by the icon selection circuit. When the seats are erased except for the selected seat, the process proceeds to step S515.
At step S515, the transmission control circuit 31E performs a process of transmitting icon selection data 33A to the communication device corresponding to the seat selection data 33B selected by the seat selection circuit 31D. When the process of transmitting the icon selection data 33A to the communication device corresponding to the selected seat selection data 33B is performed, the process proceeds to step S516.
At step S516, the light emission control circuit 31G performs a process for the light emitter arranged between the communication devices so that the light emitter successively turns on so as to approach from the transmitting seat selection data 33B toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E. When the process is performed so that the light emitter successively turns on so as to approach from the transmitting seat selection data 33B toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E, the process of transmitting icon selection data 33A that is performed by the control circuit 230 is completed.
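For reference, the transmission flow from S501 to S516 can be condensed into the following sketch; the ui and circuits objects and all of their methods are placeholders standing in for the corresponding circuits 31A to 31G, and the timeout handling is simplified.

```python
# Condensed sketch of the transmission flow S501 to S516. All helper names are assumptions.

def transmit_icon(ui, circuits, state, timeout_s=10.0):
    circuits.clear_selection(state)                      # S501: delete previous selection data
    if circuits.has_reception_data():                    # S502: reception data present?
        circuits.receive(state)                          # S503: reception process
    if not ui.wait_for_touch(timeout_s):                 # S504 (in the embodiment, the flow returns to S502; simplified here)
        return None
    icon = ui.select_icon(timeout_s)                     # S505 to S509: select (or change) an icon
    if icon is None:                                     # timed out without a selection
        return None
    ui.show_only_icon(icon)                              # S510: erase all icons except the selected one
    ui.show_seating_chart()                              # S511: display the seating chart
    seat = ui.select_seat(timeout_s)                     # S512 and S513: select a destination seat
    if seat is None:                                     # timed out without a selection
        return None
    ui.show_only_seat(seat)                              # S514: erase all seats except the selected one
    circuits.transmit(icon, seat)                        # S515: transmit the icon to that seat's device
    circuits.run_light_flow(state.source_seat, seat)     # S516: successive turn-on toward the destination
    return icon, seat
```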
Referring now to
As illustrated in
The screen 260 is formed of a liquid crystal panel and an electrostatic touch panel. A screen 260A mainly displays information about infotainment. For example, a screen 260C within the screen 260A displays information about music, a screen 260D displays information on an icon 261 about communication, and a screen 260E displays information about the vehicle.
A screen 260B mainly displays information necessary for driving, such as navigation. In the present disclosure, since it is assumed that the driver is driving a right-hand drive vehicle, the screen 260 displays information about infotainment in the area of the screen 260A far from the driver's seat and displays information necessary for driving in the screen 260B closer to the driver's seat. Information necessary for driving is displayed on the screen 260B so that the driver requires less eye movement and the driving is not interrupted.
As illustrated in
As illustrated in
Seats (265, 266, 267, 268) are displayed in the screen 260A (display), and the seat display control circuit 31C displays the seats (265, 266, 267, 268) indicating the positions of seats (seating chart 32C and seat data 32B) in the screen 260A (display). When icons (262, 263, 264) (icon data 32A) and seats (265, 266, 267, 268) (seating chart 32C and seat data 32B) are displayed in the screen 260A (display), the screen makes a transition to
As illustrated in
When the above operation is completed, the transmission control circuit 31E performs a process of transmitting the icon 263 (icon selection data 33A) to the communication device corresponding to the seat (seat selection data 33B) selected by the seat selection circuit 31D. After completion of the transmission process, the screen makes a transition to
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
In this way, in the vehicle according to the third embodiment, the communication control device 400 includes the seat selection circuit 31D that selects seat selection data 33B serving as a transmission destination of icon selection data 33A selected by the icon selection circuit 31B from among seat data 32B in the seating chart 32C, in response to the user's input, and the transmission control circuit 31E that transmits an icon to the communication device corresponding to the seat selection data 33B selected by the seat selection circuit 31D. With this configuration, the transmission control circuit 31E transmits icon selection data 33A to seat selection data 33B, thereby facilitating non-voice communication between occupants.
Furthermore, since the screen 260A (display) that displays information on communication in the front seat is arranged far from the driver's seat, and the screen 260B closer to the driver's seat displays information necessary for driving, the driver requires less eye movement and the driving is not interrupted, while non-voice communication is facilitated between the occupants in the rear seat.
Since the screen (71, 81) (display) that displays information on communication in the rear seat is provided on the trim near the occupant in the rear seat, operation is easy for the occupant, and selecting an icon and a seat is non-text and non-voice communication, which contributes to smooth communication between occupants. Furthermore, since the screen (71, 81) (display) has a decorative film on the surface layer, a sense of unity in the vehicle is further enhanced when the screen is off, which contributes to the design quality. In addition, the occupant slides the finger from his/her seat to the seat to which he/she wants to transmit an icon. This simple operation contributes to the ease of operation for occupants.
The process of receiving an icon that is performed by the control circuit 230 will now be described.
First, the reception control circuit 31F determines whether there is information that icon selection data 33A has been received on the communication device corresponding to seat selection data 33B transmitted by the transmission control circuit 31E (S901). If it is determined that no information has been received (No at step S901), the process of receiving icon selection data 33A ends without receiving icon selection data 33A. If it is determined that information has been received (Yes at step S901), the process proceeds to step S902.
Next, at step S902, the seat display control circuit 31C performs a process of displaying the seat position including the transmitted seat selection data 33B and the transmitted seating chart data 33C on the display, in accordance with the transmitted seat selection data 33B. The icon display control circuit 31A performs a process of displaying the icon selection data 33A on the display in accordance with the transmitted icon selection data 33A. When the above process is completed, the process of receiving the icon selection data 33A that is performed by the control circuit 230 is completed (step S902).
Referring now to
As illustrated in
The seat display control circuit 31C performs a process of displaying the seat position (seat selection data 33B) including the transmitted seat selection data 33B and the transmitted seating chart data 33C on the screen 301A (display), in accordance with the transmitted seat selection data 33B. The icon display control circuit 31A performs a process of displaying icons (302, 303) (icon selection data 33A) on the screen 301A (display) in accordance with the transmitted icons (302, 303) (icon selection data 33A). When the above process is completed, the process of receiving the icon selection data 33A that is performed by the control circuit 230 is completed. Here, the icon 302 is icon selection data 33A transmitted from the seat D, and the icon 303 includes a plurality of icon selection data 33A transmitted from the seat C, which means that a number of icons have been transmitted; the icon selection data 33A may be superimposed on each other.
As illustrated in
An icon 305 (icon selection data 33A) and a seating chart 306 (seating chart data 33C) are displayed in the screen 304, and seats (306A, 306B, 306C, 306D, 306E) (seat selection data 33B) are displayed in the seating chart 306 (seating chart data 33C). The seat display control circuit 31C performs a process of displaying the seat position (seat selection data 33B) including the transmitted seat 306C and seat 306D (seat selection data 33B) and the transmitted seating chart 306 (seating chart data 33C) on the screen 304 (display), in accordance with the transmitted seat 306C and seat 306D (seat selection data 33B). The icon display control circuit 31A performs a process of displaying the icon 305 (icon selection data 33A) on the screen 304 (display) in accordance with the transmitted icon 305 (icon selection data 33A). When the above process is completed, the process of receiving the icon selection data 33A that is performed by the control circuit 230 is completed.
In this way, in the vehicle according to the third embodiment, the communication control device 400 includes the reception control circuit 31F that receives an icon on the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E. The seat display control circuit 31C displays the transmitted seat position (seat selection data 33B) on the display in accordance with the transmitted seat selection data 33B. The icon display control circuit 31A displays icon selection data 33A on the display in accordance with the transmitted icon selection data 33A. With this configuration, the reception control circuit 31F displays the transmitted seat position (seat selection data 33B) and icon selection data 33A, thereby facilitating non-voice communication between occupants.
The process of returning an icon that is performed by the control circuit 230 will now be described.
First, the icon display control circuit 31A determines whether the user has touched the display (step S1101). If it is determined that the user has not touched the display (No at step S1101), the process of returning icon selection data 33A ends without returning icon selection data 33A. If it is determined that the user has touched the display (Yes at step S1101), the process proceeds to step S1102.
At step S1102, the icon display control circuit 31A displays icon data 32A so that one icon selection data 33A is selected from among at least one icon data 32A in response to the user's input. When icon data 32A is displayed, the process proceeds to step S1103.
At step S1103, it is determined whether the user changes the type of icon data 32A that the user wants to transmit. If it is determined that the user changes the type of icon data 32A (Yes at step S1103), the process proceeds to step S1104. At step S1104, the icon display control circuit 31A changes the type of icon data 32A, and the process proceeds to step S1103. If it is determined that the user does not change the type of icon data 32A (No at step S1103), the process proceeds to step S1105.
At step S1105, it is determined whether the icon selection circuit 31B has selected one icon data 32A from among at least one icon data 32A in response to the user's input. If it is determined that the user has not selected icon data 32A (No at step S1105), the process proceeds to step S1106.
At step S1106, the icon display control circuit 31A determines whether a predetermined time has elapsed after the display is touched. If it is determined that the predetermined time has elapsed (Yes at step S1106), the process of returning icon selection data 33A ends, without the user returning icon selection data 33A. If it is determined that the predetermined time has not elapsed (No at step S1106), the process proceeds to step S1102. If it is determined that the user has selected icon data 32A (Yes at step S1105), the process proceeds to step S1107.
At step S1107, the transmission control circuit 31E performs a process of transmitting icon selection data 33A to the communication device corresponding to the latest transmission source seat selection data 33B obtained by the reception control circuit 31F. When the process of transmitting the icon selection data 33A to the communication device corresponding to the selected seat selection data 33B is performed, the process proceeds to step S1108.
At step S1108, the light emission control circuit 31G performs a process for the light emitter arranged between the communication devices so that the light emitter successively turns on so as to approach from the transmitting seat selection data 33B toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E. When the process is performed so that the light emitter successively turns on so as to approach from the transmitting seat selection data 33B toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E, the process of returning icon selection data 33A that is performed by the control circuit 230 is completed.
Referring now to
As illustrated in
The seat display control circuit 31C performs a process of displaying the seat position (seat selection data 33B) including the transmitted seat selection data 33B and the transmitted seating chart data 33C on the screen 320A (display), in accordance with the transmitted seat selection data 33B. This is the screen after the icon display control circuit 31A performs the process of displaying the icons (121, 122) (icon selection data 33A) on the screen 320A (display) in accordance with the transmitted icons (121, 122) (icon selection data 33A), that is, the screen after receiving the icons (121, 122) (icon selection data 33A). Subsequently, when the icon display control circuit 31A determines that the user has touched the screen 320 (display), the screen makes a transition to
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
In this way, in the vehicle according to the third embodiment, the communication control device 400 includes the transmission control circuit 31E that transmits the icon 334 (icon selection data 33A) selected by the icon selection circuit 31B to a communication device corresponding to the seat position (seat selection data 33B) serving as a transmission destination in response to the user's input. With this configuration, the selected icon selection data 33A is transmitted to the communication device corresponding to the seat selection data 33B serving as the transmission destination, thereby facilitating non-voice communication between occupants.
As illustrated in
The light emitters of the light-emitting devices (144a, 144b, 144c, 144d, 144e, 144f, 144g, 144h, 144i, 144j) are provided on the inner walls of the vehicle and each may have a decorative film on the surface layer. The decorative film may be thinly sliced wood or may be a resin material printed with a wood-grain pattern.
As illustrated in
The light-emitting devices (144A, 144B, 144C, 144D, 144E, 144F, 144G, 144H, 144I, 144J) provided between the seat 142C and the seat 142D turn on in the +Y direction of the vehicle, from the light-emitting device 144C as the starting point to the light-emitting device 144A, then turn on from the light-emitting device 144A toward the light-emitting device 144F, along the +X direction of the vehicle, and then successively turn on in the −Y direction of the vehicle to the light-emitting device 144J as the endpoint.
In other words, in response to the user's input selecting the icon selection data 33A and the seat selection data 33B, the light emitters turn on successively so as to approach from the communication device at the transmitting seat (seat selection data 33B) toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E. The light emitters thus flow across the occupant's line of sight, relative to the direction of travel of the vehicle, toward the designated seat, which improves the visibility of the communication devices.
In this way, in the vehicle according to the third embodiment, the display further includes light emitters, each arranged between the communication devices. The vehicle includes the light emission control circuit 31G, which causes the light emitters to turn on successively so as to approach from the communication device at the transmitting seat (seat selection data 33B) toward the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E.
With this configuration, the light emitters turn on successively so as to approach the communication device corresponding to the seat selection data 33B transmitted by the transmission control circuit 31E, whereby not only the receiving occupant but also the other occupants in the vehicle can see the transmission and reception of icons. This contributes to activating communication between occupants.
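The "flowing" turn-on described above can be modeled as walking an ordered chain of light emitters from the sender's seat toward the destination seat. The Python sketch below is only a rough illustration: the chain order in LIGHT_CHAIN, the seat-to-emitter mapping, the turn_on stub, and the timing value are assumptions and do not reflect the actual wiring or the light emission control circuit 31G.

```python
import time

# Assumed order of the light-emitting devices along the path between the seats;
# only the emitters on the described 144C -> 144A -> 144F -> 144J path are listed.
LIGHT_CHAIN = ["144C", "144B", "144A", "144F", "144G", "144H", "144I", "144J"]

# Assumed mapping from a seat to the light-emitting device closest to it.
SEAT_TO_LIGHT = {"142C": "144C", "142D": "144J"}


def turn_on(emitter_id: str) -> None:
    # Placeholder for the real light-emitter driver; here the action is only logged.
    print(f"light emitter {emitter_id}: ON")


def flow(from_seat: str, to_seat: str, step_interval_s: float = 0.1) -> list[str]:
    """Turn the emitters on one by one from the sender's seat toward the destination seat.

    The successive turn-on makes the light appear to flow toward the communication
    device of the destination seat, which is the effect attributed to the light
    emission control circuit 31G.
    """
    start = LIGHT_CHAIN.index(SEAT_TO_LIGHT[from_seat])
    end = LIGHT_CHAIN.index(SEAT_TO_LIGHT[to_seat])
    step = 1 if end >= start else -1
    path = [LIGHT_CHAIN[i] for i in range(start, end + step, step)]
    for emitter in path:
        turn_on(emitter)
        time.sleep(step_interval_s)
    return path


# Example: an icon sent from seat 142C to seat 142D lights 144C first and 144J last,
# matching the path described for the third embodiment.
if __name__ == "__main__":
    flow("142C", "142D")
```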
A computer program executed by the communication control device in the present embodiment is embedded in advance in a ROM or the like.
The computer program executed by the communication control device in the present embodiment may be provided as a file in an installable or executable format recorded on a computer-readable recording medium such as CD-ROM, flexible disk (FD), CD-R, or digital versatile disk (DVD).
Furthermore, the computer program executed by the communication control device in the present embodiment may be stored on a computer connected to a network such as the Internet and downloaded via the network. The computer program executed by the communication control device in the present embodiment may be provided or distributed via a network such as the Internet.
In the foregoing third embodiment, the communication control device 400 is mounted on the vehicle 1 with four wheels. However, embodiments are not limited to this. The communication control device can be mounted on a movable body having seats in a plurality of rows in the front-to-rear direction, such as a bus, an airplane, or a train.
For the foregoing embodiments, the following A-1 to A-20 and B-1 to B-20 are disclosed.
(A-1)
A vehicle comprising:
(A-2)
The vehicle according to (A-1), wherein the first light emitter, the second light emitter, and the third light emitter have a light emission state changed in response to the movement of the body along the direction of travel and map information.
(A-3)
The vehicle according to (A-2), further comprising a wireless communication unit set to receive the map information from outside.
(A-4)
The vehicle according to (A-1), further comprising an imaging circuit capable of capturing an image of exterior of the body, wherein
(A-5)
The vehicle according to (A-4), wherein the first light emitter, the second light emitter, and the third light emitter have a light emission state changed in response to the movement of the body along the direction of travel and an image captured by the imaging circuit before a predetermined period of time.
(A-6)
The vehicle according to (A-5), wherein the predetermined period of time changes in accordance with a speed of movement of the vehicle in a predetermined direction.
(A-7)
The vehicle according to any one of (A-4) to (A-6), wherein the imaging circuit is disposed at an upper front part in the cabin.
(A-8)
The vehicle according to any one of (A-1) to (A-7), wherein each of the first light emitter, the second light emitter, and the third light emitter emits monochromatic light.
(A-9)
The vehicle according to any one of (A-1) to (A-7), wherein each of the first light emitter, the second light emitter, and the third light emitter emits light of multiple colors.
(A-10)
The vehicle according to any one of (A-1) to (A-9), wherein a light emission pattern of the first light emitter, the second light emitter, and the third light emitter changes to flow in a direction opposite to the direction of travel, in response to the movement of the body along the direction of travel.
(A-11)
A vehicle control device comprising:
(A-12)
The vehicle control device according to (A-11), wherein the control circuit changes a light emission state for the first light emitter, the second light emitter, and the third light emitter, in response to the movement of the body along the direction of travel and map information.
(A-13)
The vehicle control device according to (A-12), wherein the acquisition circuit acquires the map information received by a wireless communication unit from outside.
(A-14)
The vehicle control device according to (A-11), wherein
(A-15)
The vehicle control device according to (A-14), wherein the control circuit changes a light emission state for the first light emitter, the second light emitter, and the third light emitter, in response to the movement of the body along the direction of travel and an image captured by the imaging circuit before a predetermined period of time.
(A-16)
The vehicle control device according to (A-15), wherein the control circuit changes the predetermined period of time in accordance with a speed of movement of the vehicle in a predetermined direction.
(A-17)
The vehicle control device according to any one of (A-14) to (A-16), wherein the imaging circuit is disposed at an upper front part in the cabin.
(A-18)
The vehicle control device according to any one of (A-11) to (A-17), wherein the control circuit allows each of the first light emitter, the second light emitter, and the third light emitter to emit monochromatic light.
(A-19)
The vehicle control device according to any one of (A-11) to (A-17), wherein the control circuit allows each of the first light emitter, the second light emitter, and the third light emitter to emit light of multiple colors.
(A-20)
The vehicle control device according to any one of (A-11) to (A-19), wherein the control circuit changes a light emission pattern of the first light emitter, the second light emitter, and the third light emitter to flow in a direction opposite to the direction of travel, in response to the movement of the body along the direction of travel.
(B-1)
A vehicle comprising:
(B-2)
The vehicle according to (B-1), further comprising a reception control circuit configured to receive the icon onto the communication device corresponding to the seat transmitted by the transmission control circuit, wherein
(B-3)
The vehicle according to (B-1) or (B-2), wherein
(B-4)
The vehicle according to any one of (B-1) to (B-3), wherein the display at least includes a liquid crystal panel and an electrostatic touch panel.
(B-5)
The vehicle according to any one of (B-1) to (B-4), wherein the display further includes a decorative film on a surface layer.
(B-6)
The vehicle according to (B-5), wherein the decorative film has a wood-grain pattern.
(B-7)
The vehicle according to (B-1) or (B-2), wherein
(B-8)
The vehicle according to (B-7), wherein the display is arranged far from a driver's seat.
(B-9)
The vehicle according to (B-3), wherein
(B-10)
The vehicle according to (B-9), wherein
(B-11)
A communication control device mountable on a vehicle,
(B-12)
The communication control device according to (B-11), further comprising a reception control circuit configured to receive the icon onto the communication device corresponding to the seat transmitted by the transmission control circuit, wherein
(B-13)
The communication control device according to (B-11) or (B-12), wherein
(B-14)
The communication control device according to any one of (B-11) to (B-13), wherein the display at least includes a liquid crystal panel and an electrostatic touch panel.
(B-15)
The communication control device according to any one of (B-11) to (B-14), wherein the display further includes a decorative film on a surface layer.
(B-16)
The communication control device according to (B-15), wherein the decorative film has a wood-grain pattern.
(B-17)
The communication control device according to (B-11) or (B-12), wherein
(B-18)
The communication control device according to (B-17), wherein the display is arranged far from a driver's seat.
(B-19)
The communication control device according to (B-13), wherein
(B-20)
The communication control device according to (B-19), wherein
The effects of the embodiments described herein are merely examples and are not limiting; other effects may also be achieved.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind
--- | --- | --- | ---
2020-199003 | Nov 2020 | JP | national
2020-203881 | Dec 2020 | JP | national
2020-204613 | Dec 2020 | JP | national
This application is a continuation of International Application No. PCT/JP2021/036576, filed on Oct. 4, 2021, which claims the benefit of priority of the prior Japanese Patent Application No. 2020-199003, filed on Nov. 30, 2020, Japanese Patent Application No. 2020-203881, filed on Dec. 9, 2020 and Japanese Patent Application No. 2020-204613, filed on Dec. 9, 2020, the entire contents of which are incorporated herein by reference.
Relation | Number | Date | Country
--- | --- | --- | ---
Parent | PCT/JP2021/036576 | Oct 2021 | US
Child | 18199510 | | US