The present invention relates to a projection system and a projection method.
Advertising media known as digital signage, which displays information such as images on a flat display, a projector, or the like, has recently come into widespread use (see, for example, Patent document 1). Digital signage has the advantages that display content is easier to update than with paper media, that many kinds of display content can be switched and displayed periodically on a single display, and that the displays of many units can be updated simultaneously by distributing data through a communication line.
[Patent document 1] Japanese Patent Laid-Open No. 2009-289128
Digital signage is generally installed at places where many people see it, such as train stations, airports, and shopping malls. With its recent spread, however, digital signage has come to be placed at a wider variety of locations, and installation in toilet booths has also been proposed.
However, in the limited space inside a toilet booth, it is difficult to install a large flat display. Furthermore, when a display is installed within easy reach of a user's hand, it may be vandalized or stolen. It is therefore conceivable to install a projector at a high position such as the ceiling and project from the projector onto an inner wall of the toilet booth. When the projector is installed above and projects obliquely onto the inner wall of the booth serving as the projection target surface, however, the projected image is distorted, and the distortion must be corrected according to the projection angle. Moreover, because the amount of correction required also differs depending on the position of the user viewing the image, there is a problem that uniform correction does not yield an appropriate display.
Therefore, an object of the present invention is to provide a technique for correcting a projection image according to the position of a user.
In order to solve the foregoing problem, a projection system according to the present invention comprises:
a detection unit that detects a position of a user who uses a booth;
an image projection unit that projects an image onto a projection target surface determined for the booth; and
a correction unit that performs correction of distortion of the image according to the position of the user.
The projection system may further comprise a movement control unit that moves a projection position based on the position of the user, wherein the correction unit may correct the image to be projected to the projection position based on the projection position.
In the projection system, the detection unit may determine a viewpoint position of the user as the position of the user, and the movement control unit may move the projection position based on the viewpoint position.
In the projection system, the image projection unit may be provided at an upper portion of the booth, when the user enters the booth and closes a door at a doorway of the booth, the image projection unit may project the image with an inner wall of the door set as the projection target surface, when the user opens the door and exits, the image projection unit may project the image from the upper portion of the booth through the doorway onto a floor surface, and the floor surface may be set as the projection target surface.
In the projection system, the booth may be provided with a toilet bowl, and when the user is not seated on the toilet bowl, the toilet bowl may be set as the projection target surface.
The projection system may further comprise:
an action detection unit that detects an operation of the user;
a gesture determination unit that determines whether a user's action corresponds to a predetermined gesture; and
an image control unit that controls the image to be projected according to the gesture when the user's action corresponds to the gesture.
In order to solve the foregoing problem, a projection method according to the present invention causes a computer to execute:
a step of detecting a position of a user using a booth by a detection unit;
a step of causing an image projection unit to project an image onto a projection target surface determined for the booth; and
a step of performing correction of distortion of the image according to the position of the user.
The projection method may further execute a step of moving a projection position based on the position of the user, and in the step of performing the correction, the image to be projected to the projection position may be corrected based on the projection position.
In the projection method, the detection unit may determine a viewpoint position of the user as the position of the user, and the projection position may be moved based on the viewpoint position.
In the projection method, the image projection unit may be provided at an upper portion of the booth, when the user enters the booth and closes a door at a doorway of the booth, the image projection unit may project the image with an inner wall of the door set as the projection target surface, when the user opens the door and exits, the image projection unit may project the image from the upper portion of the booth through the doorway onto a floor surface, and the floor surface may be set as the projection target surface.
In the projection method, the booth may be provided with a toilet bowl, and when the user is not seated on the toilet bowl, the toilet bowl may be set as the projection target surface.
The projection method may execute:
a step of detecting an operation of the user;
a step of determining whether a user's action corresponds to a predetermined gesture; and
a step of controlling the image to be projected according to the gesture when the user's action corresponds to the gesture.
The present invention may be a program for causing a computer to execute the projection method.
According to the present invention, a technique of correcting a projection image according to the position of a user can be provided.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. Note that the embodiments are examples of the present invention, and the configuration of the present invention is not limited to the following examples.
The content server 2 periodically transmits content to the projection system 100, or transmits content in response to a request from the projection system 100. The relay device 6 of the projection system 100 receives the content transmitted from the content server 2 and distributes the content to the control device 3 of each floor. The control device 3 is connected to a detection unit 46 and the projector 1 which are provided in each booth 14, and causes the projector 1 to project an image based on the content to a projection position corresponding to the position of the user detected by the detection unit 46.
The booth 14 is, for example, a toilet booth that includes a toilet bowl 41 and is used by the public at commercial facilities such as a department store or a station.
As illustrated in
The booth 14 has a pair of right and left-side walls 14L and 14R and a rear wall 14B which surround three sides, and a door 9 that opens and closes a doorway 4 of the booth 14. The toilet bowl 41 is installed in the booth 14 which is surrounded on four sides thereof by the side walls 14L and 14R, the rear wall 14B and the door 9. The walls 14L, 14R, and 14B and the door 9 surrounding the booth 14 may have a height extending from the floor surface 14F to the ceiling surface 14C, but in the present embodiment, a space is provided between the ceiling surface 14C and each of the right and left-side walls 14L, 14R and the door 9 to allow air flow as illustrated in
Here, “right and left” mean the left side and the right side when facing the doorway 4 from the outside of the toilet, “front and rear” mean the front side and the rear side when sitting on the toilet bowl 41, and “upper and lower” mean the ceiling surface 14C side and the installation surface (floor) 14F side of the toilet bowl 41.
The right and left-side walls 14L and 14R are plate members each of which is J-shaped in cross-section, that is, forms a straight line on one side of the cross-section and a curved line on the other side of the cross-section, and has a planar rear portion and a front portion having a quadric surface (see
A guide rail 8 is installed on an inner upper portion of the right-side wall 14R (see
An operation panel 61, which has buttons for opening and closing the door 9 and is electrically connected to the door driving unit 63, is installed on the inner surface of the left-side end portion of the door 9. When the closing button of the operation panel 61 is pushed by a user's operation, the door driving unit 63 operates to close the door 9, and the lock 91 engages with the door 9 in a state where the left end of the door 9 abuts against the left-side wall 14L, thereby locking the door 9 and preventing it from opening.
When the opening button of the operation panel 61 is pushed, the door driving unit 63 drives the lock 91 to release the engagement with the door 9, thereby unlocking the door 9, and drives the door 9 in an opening direction. The lock 91 is not limited to the configuration in which the lock 91 is provided to the guide rail 8 and engaged with the door 9, and may be configured so as to be provided to the left-side wall 14L, the right-side wall 14R, the floor surface 14F or the like and engaged with the door 9, thereby preventing opening of the door.
Conversely, the lock 91 may be provided on the door 9 and engage with the guide rail 8, the left-side wall 14L, the right-side wall 14R, the floor surface 14F, or the like, thereby preventing opening of the door. Note that in this example, when the door 9 is closed, the lock 91 locks the door 9 to prevent it from opening; however, the lock 91 may be omitted when the closed door 9 cannot easily be opened from the outside, for example, when a gear of the door driving unit 63 does not rotate even if another person applies force to open the door 9 manually, so that the door 9 does not move. As described above, since the operation panel 61 for opening and closing the door 9 is provided inside the booth 14, a user who operates the operation panel 61 is present in the booth 14 while the door 9 is closed.
Furthermore, after the user opens the door 9 and exits the booth 14 after use, the door 9 remains open until the next user enters and closes it. Therefore, based on the open or closed state of the door 9, it can be detected that a user is present in the booth 14 when the door 9 is closed, and that no user is present when the door 9 is open.
Note that the open or closed state of the door 9 may be detected, for example, by providing the door driving unit 63 with a sensor (opening and closing sensor) that detects the position of the door 9 and determines whether the door 9 is at the closed position or the open position, or it may be determined from the driving history of the door 9 by the door driving unit 63.
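By way of illustration only, the occupancy determination described above might be sketched as follows. The class and method names are hypothetical abstractions, not taken from the embodiment; the sketch simply encodes the stated rule that a closed door implies a user is present, with the driving history as a fallback when no opening and closing sensor is provided.

```python
class DoorState:
    """Illustrative sketch of booth-occupancy detection from the door state.

    All names here are assumptions for the sketch; the embodiment only
    describes the behavior (sensor reading preferred, drive history as
    fallback, closed door implies occupied)."""

    def __init__(self):
        self.sensor_reading = None   # "closed", "open", or None if no sensor fitted
        self.drive_history = []      # commands issued by the door driving unit

    def is_closed(self):
        # Prefer the opening and closing sensor when one is provided.
        if self.sensor_reading is not None:
            return self.sensor_reading == "closed"
        # Otherwise fall back to the driving history of the door.
        return bool(self.drive_history) and self.drive_history[-1] == "close"

    def is_user_present(self):
        # A closed door implies a user is inside the booth; an open door
        # implies the booth is vacant.
        return self.is_closed()
```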
Note that
The booth 14 illustrated in
The operation panel 61 configured to operate the opening and closing of the door driving unit 63 is provided inside the left front wall 141L.
An upper frame 142 is bridged between the upper ends of the left front wall 141L and the right front wall 141R, and the lock 91 is provided to the upper frame 142. The lock 91 is driven by the door driving unit 63 in conjunction with the opening and closing of the door 9, and when the door 9 is closed, the lock 91 engages with the door 9 to lock the door, thereby preventing opening of the door.
The booth 14 illustrated in
The operation panel 61 configured to operate opening and closing of the door driving unit 63 is provided in the vicinity of the door 9 of the right-side wall 14R.
Returning to
The toilet seat device 42 is provided on the Western-style toilet bowl 41 and has a function of warming the seat surface on which a user sits and a cleaning function of discharging warm water to wash the anus and private parts of the user. The toilet seat device 42 is provided with a seating sensor 421 that detects whether the user is seated. When, based on the detection result of the seating sensor 421, seating is no longer detected after a predetermined time has elapsed since the seating of the user was detected, that is, when it is determined that the user has risen after relieving himself/herself, the toilet seat device 42 performs control such as discharging washing water for cleaning the toilet seat and reducing the temperature of the seating surface to enter a power saving mode while the user is not seated. Note that the toilet bowl 41 is not limited to the Western style and may be of a Japanese style. When a Japanese-style toilet bowl 41 is provided, the toilet seat device 42 is omitted. In this case, a human detection sensor or the like may detect that the user has squatted over the Japanese-style toilet bowl 41 in a posture for relieving himself/herself, and this may be treated as seating of the user.
As illustrated in
The display unit 432 displays information received from the control device 3 and the like as well as the set temperature of the toilet seat, the temperature of the warm water for washing, and the washing position.
The speaker 433 outputs an operation sound when the operation unit 431 is operated, an artificial sound simulating the sound of washing water flowing to flush the toilet bowl, sounds that constitute the content together with the image projected onto the projection target surface, and the like.
The detection unit 46 is a sensor that detects the position of a user in the booth 14 by sensing the presence of the user with, for example, infrared rays, radio waves, or ultrasonic waves. The detection unit 46 may be a passive sensor that senses infrared rays emitted by the user to detect the user's presence, or an active sensor in which a transmitter emits infrared rays, radio waves, or ultrasonic waves and a receiver detects the user's presence by capturing variations of those waves caused by blocking or reflection by the user.
Particularly, in the present embodiment, an active distance sensor 460 is installed on the ceiling surface 14C above each booth 14, and the distance to an object in the booth is detected based on the time from transmission of signal light such as infrared rays toward the toilet bowl 41 until reception of the light reflected from the object in the booth, or by triangulation from the light-receiving position at which the reflected waves are detected by a PSD (Position Sensitive Detector).
As illustrated in
The detection unit that detects the user's height information is not limited to a distance sensor. For example, a light projector may be provided on the ceiling surface 14C to project a predetermined pattern of infrared rays into the booth, a camera may pick up an image of the pattern projected on an object in the booth, and the predetermined pattern may be compared with the projected pattern so that the distance to the object present on the toilet bowl 41, that is, the height information of the object, is determined from the difference between the patterns.
Furthermore, the distance to the object present on the toilet bowl 41, that is, the height information of the object may be determined by a ToF distance image sensor. In this case, a human shape may be stored as a standard pattern, an object matching this standard pattern may be identified as a user by pattern matching, and a site of the object which matches the head portion of the standard pattern may be recognized to determine the height of the head portion and a viewpoint position.
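As a minimal sketch of the viewpoint determination described above: assuming a top-down grid of distances from a ceiling-mounted depth sensor, the nearest point is taken as the top of the head, and the viewpoint is placed a fixed offset below it. The grid format, ceiling height, and head-to-eye offset are illustrative assumptions; the embodiment only states that the head portion is recognized and the viewpoint position determined from its height.

```python
def estimate_viewpoint(depth_grid, ceiling_height_mm, eye_offset_mm=110):
    """Estimate a seated user's viewpoint height (mm above the floor).

    depth_grid: 2-D list of distances (mm) measured downward from the
    ceiling-mounted sensor. All parameters are illustrative assumptions.
    """
    nearest_mm = min(min(row) for row in depth_grid)  # shortest distance = head top
    head_top_mm = ceiling_height_mm - nearest_mm      # head height above the floor
    return head_top_mm - eye_offset_mm                # eyes sit a little below the head top
```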
A sensor of another device may be used as the detection unit 46. For example, the seating sensor 421 of the toilet seat device 42 or a sensor (not illustrated) for detecting that a user enters the booth 14 and operating lighting, air conditioning, a deodorizer, etc. may be used as the detection unit 46. Furthermore, the operation panel 61 or the door driving unit 63 may be used as the detection unit 46.
The control device 3 is a device that receives content from the content server 2 and controls the projector 1 to project an image of the content, and includes a content reception unit 411, an image control unit 412, a movement control unit 413, and a correction unit 414.
The content reception unit 411 receives content from the relay device 6. The content reception unit 411 may store content received from the relay device 6 in a memory and provide the content to the image control unit 412, or may acquire content from the relay device 6 and provide it to the image control unit 412 each time an image is projected.
The image control unit 412 transmits image information of the content received by the content reception unit 411 to the projector 1 to project an image. Note that the image control unit 412 may start the projection of the image when it is detected by the detection unit 46 that a user has entered the booth 14, and may stop the projection when the user has exited from the booth 14.
The movement control unit 413 moves the projection position of the image projection unit based on the position of the user detected by the detection unit 46. For example, when the seating sensor is turned on, it can be determined that the user is seated on the toilet bowl 41, that is, positioned on the toilet bowl 41, and thus the projection position is controlled so that the image is projected to a position where the seated user can easily see it. Here, since the toilet bowl 41 is fixed in position, the viewpoint position within a horizontal plane is substantially the same for all seated users, but the viewpoint position in the height direction differs depending on the user's body height. Therefore, the movement control unit 413 detects the height information of the user by the detection unit 46, determines the projection position according to the viewpoint height of each user, and projects the image onto that position. Note that the height of the projection position may be set, for example, to the same height as the viewpoint position, or to a height offset so as to be higher or lower than the viewpoint position by a predetermined distance. The height of the projection position is the height of a reference position such as the center of the image. In this example, the height of the projection position is expressed as an absolute value from the floor surface 14F, but it may be expressed as a relative value from the viewpoint position or the like.
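The height determination above can be sketched as follows. The offset and clamp limits are hypothetical values for illustration; the embodiment only states that the projection height may equal the viewpoint height or be offset from it by a predetermined distance.

```python
def projection_height(viewpoint_mm, offset_mm=0, min_mm=600, max_mm=1800):
    """Return the height of the image center (mm from the floor surface 14F).

    viewpoint_mm: detected viewpoint height of the seated user.
    offset_mm: predetermined offset above (+) or below (-) the viewpoint.
    min_mm/max_mm: assumed usable range of the projection target surface.
    """
    target = viewpoint_mm + offset_mm
    # Keep the image center within the usable range of the target surface.
    return max(min_mm, min(max_mm, target))
```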
The correction unit 414 performs distortion correction of the image projected onto the projection position according to the projection position. Since the distortion of the image projected onto the projection target surface differs depending on the angle or shape of the projection target surface at the projection position, a correction value for correcting this distortion is determined in advance for each projection position and stored in the auxiliary memory. The correction value is read from the memory according to the projection position determined by the movement control unit 413, and the image is corrected accordingly. Even when the image is corrected in this way, the effect of the correction differs depending on the viewpoint position from which the projected image is observed. Therefore, the correction unit 414 may correct distortion of the image according to both the projection position and the viewpoint position. For example, a correction value is obtained in advance for each combination of projection position and viewpoint position and stored in the memory; a correction value is then read from the memory according to the projection position determined by the movement control unit 413 and the viewpoint position based on the detection result of the detection unit 46, and the image is corrected according to that correction value.
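The lookup of pre-stored correction values might be sketched as below. The table keys, band width, and warp parameters are all illustrative assumptions; the embodiment only states that correction values are stored per projection position (optionally per viewpoint position) and read out as needed.

```python
# Hypothetical calibration table: (projection height band, viewpoint height
# band) -> correction parameters. Values are placeholders for the sketch.
CORRECTION_TABLE = {
    (1200, 1100): {"keystone": 0.12, "scale_y": 1.05},
    (1200, 1300): {"keystone": 0.10, "scale_y": 1.04},
    (1400, 1300): {"keystone": 0.08, "scale_y": 1.03},
}

IDENTITY = {"keystone": 0.0, "scale_y": 1.0}  # no correction


def lookup_correction(projection_mm, viewpoint_mm, band_mm=100):
    """Quantize both heights to bands and read the stored correction value,
    falling back to no correction when no entry has been calibrated."""
    key = (int(projection_mm / band_mm + 0.5) * band_mm,
           int(viewpoint_mm / band_mm + 0.5) * band_mm)
    return CORRECTION_TABLE.get(key, IDENTITY)
```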
The relay device 6 is a device that provides content received from the content server 2 to the control device 3, and includes a content reception unit 611 and a content distribution unit 612.
The content reception unit 611 receives content from the content server 2 via the network 5 such as the Internet. The content distribution unit 612 stores the received content in the memory, and when receiving a request for content from the control device 3, reads out the content and transmits it to the control device 3. Alternatively, the content reception unit 611 may acquire the content from the content server 2 every time the content distribution unit 612 receives a request from the control device 3, and the content distribution unit 612 may distribute the content to the control device 3 every time it is acquired from the content server 2.
The memory 22 includes a main memory and an auxiliary memory. The main memory is used as a work area of the CPU 21, a storage area for programs and data, and a buffer area for communication data. The main memory is a storage medium in which the CPU 21 caches programs and data and expands its work area, and is formed of, for example, RAM (Random Access Memory) or a combination of RAM and ROM (Read Only Memory). The auxiliary memory is a storage medium that stores programs to be executed by the CPU 21, operation setting information, and the like, and is, for example, an HDD (Hard Disk Drive), an SSD (Solid State Drive), an EPROM (Erasable Programmable ROM), a flash memory, a USB memory, or a memory card.
The input/output IF 23 is an interface that inputs and outputs data to and from devices such as a sensor, an operation unit, and a communication module connected to the content server 2, the relay device 6, or the control device 3. Note that each of the above-described components may be provided in the form of a plurality of elements, or some of the components may not be provided.
In the content server 2, the CPU 21 functions as a processing unit that executes processing of reading out content from the memory 22 and transmitting the content to the relay device 6 by executing a program. In the relay device 6, the CPU 21 functions as respective processing units of the content reception unit 611 and the content distribution unit 612 illustrated in
The liquid crystal display unit 12 is an element that displays an image based on the content, and in this example, the image is decomposed into three primary colors of light, and decomposed R (red), G (green), and B (blue) images are assigned to three liquid crystal display units one by one. The light source 13 illuminates each of the three liquid crystal display units 12. The prism 19 combines light fluxes of three primary colors transmitted through the three liquid crystal display units 12. The projection lens 11 projects the light fluxes combined by the prism 19 onto the projection target surface, and forms an enlarged image (color image) of the image displayed on each liquid crystal display unit 12. The lens driving unit 15 drives at least a part of the projection lens 11, and adjusts focus, tilt, and shift of the projection lens 11.
The base 17 is fixed to the ceiling surface 14C, and rotatably holds the housing 18 in which the projection lens 11, the liquid crystal display unit 12, the light source 13, the prism 19, and the lens driving unit 15 are accommodated. The projection position changing unit 16 changes the projection position of the image by rotating the housing 18 with respect to the base 17. In other words, the projector 1 is installed such that an optical axis 110 of the projection lens 11 directed to the projection target surface has a depression angle with respect to the ceiling surface 14C, and the projection position changing unit 16 changes the depression angle, whereby the position of the image projected onto the projection target surface is changed up and down.
<Correction Method>
As illustrated in
In
Here, the calibration pattern 1B projected onto the projection target surface as illustrated in
Then, as illustrated in
The correction unit 414 of the control device 3 reads a correction value from the memory according to the projection position and the viewpoint position, and performs image processing based on this correction value to deform an image of the content, thereby performing the correction.
Note that, of the distortion arising under projection, the trapezoidal distortion caused by projecting obliquely downward from the ceiling surface 14C onto a vertical wall surface can also be corrected optically by shifting the projection lens 11 using the lens driving unit 15 of the projector 1. For example, the projection lens 11 is shifted so as to correct the trapezoidal distortion of an image projected at a predetermined height, and in this state the calibration pattern 1B projected on the projection target surface is imaged by a camera and compared with the original calibration pattern as illustrated in
When the door 9 is configured to be flat as illustrated in
<Projection Method>
Next, a projection method in the projection system 100 of the present embodiment will be described.
First, when the detection unit 46 such as a human detection sensor installed in the booth 14 detects entry of a user, the control device 3 starts the processing of
The control device 3 further determines whether the user has exited from the booth 14 (step S30), and when the user has exited (step S30, Yes), the control device 3 ends the processing of
When it is determined that the user has been seated on the toilet bowl 41 (step S40, Yes), the control device 3 acquires height information of the user by the sensor 460, and determines a viewpoint position (step S50).
Next, the control device 3 determines a projection position based on the viewpoint position, and controls the projector 1 to project an image onto the position (step S60).
Furthermore, the control device 3 corrects the image of the content based on the correction value corresponding to the projection position (step S70), and causes the projector 1 to project the corrected image (step S80). Then, the control device 3 returns to step S30 and repeats this processing until the user has exited.
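The flow of steps S30 to S80 above can be sketched as a simple control loop. The booth interface used here (has_exited, is_seated, measure_viewpoint, projection_position, correct, project, content) is an assumed abstraction for illustration, not part of the embodiment.

```python
def projection_loop(booth):
    """Illustrative sketch of steps S30-S80; the booth object's methods are
    hypothetical names standing in for the units described in the text."""
    while not booth.has_exited():                        # step S30: user still inside?
        if not booth.is_seated():                        # step S40: wait for seating
            continue
        viewpoint = booth.measure_viewpoint()            # step S50: height info -> viewpoint
        position = booth.projection_position(viewpoint)  # step S60: move projection position
        image = booth.correct(booth.content, position)   # step S70: distortion correction
        booth.project(image, position)                   # step S80: project corrected image
```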
As described above, the projection system 100 according to the first embodiment can project an image for which distortion has been accurately corrected by performing distortion correction on the image according to the position of the user. In particular, by moving the projection position based on the position of the user, the projection system 100 according to the first embodiment can display an image at each position where the image is easily viewable for each user.
Note that in this example, the projection of an image is started when the user has entered the booth 14. However, the present invention is not limited to this example, and the step S20 may be omitted, and after the user has been seated on the toilet bowl 41, the projection position may be determined in conformity with the user's viewpoint position to start the projection.
The second embodiment adds, to the first embodiment described above, a configuration that controls a projection image according to a user's gesture. Note that the other configuration is the same; the same components are therefore denoted by the same reference numerals and symbols, and description thereof is omitted.
As illustrated in
The sensor 468 is provided on the ceiling surface 14C closer to the door 9 than the toilet bowl 41, and determines the distance from the ceiling surface 14C to an object existing on the door 9 side, that is, the position of the object in the height direction (vertical direction). The sensor 469 is provided on the left-side wall 14L closer to the door 9 than the toilet bowl 41, and determines the distance from the left-side wall 14L to the object existing on the door 9 side, that is, the position of the object in the horizontal direction.
The two-dimensional position, in the height direction and the horizontal direction, of a site such as a user's arm extended toward the door 9 side (that is, toward the projection target surface) is periodically detected by these sensors 468 and 469, whereby the motion of the site can be detected. Note that the detection unit that detects the motion of the user is not limited to distance sensors. For example, a light projector may be provided on the ceiling surface 14C to project a predetermined pattern of infrared rays into the booth; a camera images the pattern projected onto an object in the booth, the predetermined pattern is compared with the projected pattern, and the position of the object existing on the toilet bowl 41 is periodically determined from the difference between the patterns, thereby detecting the action of the user.
Furthermore, the position of the object existing on the toilet bowl 41 may be periodically determined by a ToF distance image sensor to determine the action of the object (user). In this case, a human shape may be stored as a standard pattern, an object matching the standard pattern may be identified as a user by pattern matching, and a site of the object matching the arm portion of the standard pattern may be recognized to determine the action of that arm portion.
The gesture determination unit 415 determines whether the user's action detected by the sensors 468 and 469 corresponds to a predetermined gesture. The predetermined gesture is, for example, a gesture of swinging the site extended toward the projection target surface from side to side or up and down, or of holding the extended site still for a predetermined time or more while pointing to a choice displayed in the projection image, etc.
When it is determined by the gesture determination unit 415 that the predetermined gesture has been performed, the image control unit 412 executes processing assigned to the gesture. For example, in the case of a swing gesture in the horizontal direction, the image control unit 412 executes fast-forwarding or fast-reversing of an image, and in the case of a swing gesture in the height direction, the image control unit 412 adjusts sound volume. Moreover, in the case of a gesture of stopping for a predetermined time or more while pointing to a choice (selection operation), the image control unit 412 executes processing in the case of the selection of the choice.
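The gesture-to-action assignment described above might be sketched as a simple dispatch. The gesture names and the image_control interface are hypothetical; the embodiment only specifies which kind of gesture triggers which kind of processing.

```python
def handle_gesture(gesture, image_control):
    """Illustrative dispatch of recognized gestures to image-control actions.
    Gesture labels and method names are assumptions for the sketch."""
    if gesture == "swing_horizontal":
        image_control.seek()            # fast-forward / fast-reverse the image
    elif gesture == "swing_vertical":
        image_control.adjust_volume()   # adjust the sound volume
    elif gesture == "point_and_hold":
        image_control.select_choice()   # treat as selection of the pointed choice
```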
As described above, according to the second embodiment, the user can operate on the image by a gesture alone, without touching the operation unit or the like, and can therefore operate easily and hygienically even during defecation.
The third embodiment adds, to the first or second embodiment described above, a configuration for projecting an image onto the toilet bowl 41 until the user is seated on the toilet bowl 41. Note that the other configuration is the same; the same components are therefore denoted by the same reference numerals and symbols, and duplicative description thereof is omitted.
First, when the detection unit 46 such as a human detection sensor installed in the booth 14 detects the entry of a user, the control device 3 starts the processing.
Next, in the same manner as described above, the control device 3 determines whether the user has exited from the booth 14 (step S30) and whether the user has been seated on the toilet bowl 41 (step S40). When the user has not been seated (step S40, No), the control device 3 returns to step S30.
When it is determined that the user has been seated on the toilet bowl 41 (step S40, Yes), the control device 3 stops the projection of the image onto the toilet bowl 41, and in the subsequent processing (steps S50 to S80) projects the image at the projection position corresponding to the viewpoint position of the user, as described above.
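The third-embodiment flow (steps S30 to S40 and the switch of the projection target on seating) can be sketched as follows. The sensor polls are stubbed as callables and the `Projector` class is an illustrative assumption; real detection would come from the detection unit 46.

```python
# Hypothetical sketch of the third-embodiment control flow: project onto the
# toilet bowl from entry until seating is detected, then switch the
# projection target to the wall for viewpoint-based projection.

class Projector:
    def __init__(self):
        self.target = None
        self.running = True

    def set_target(self, target):
        self.target = target

    def stop(self):
        self.running = False

def run_third_embodiment(exited, seated, projector, max_polls=100):
    """exited()/seated() are sensor polls returning bool; projector records targets."""
    projector.set_target("toilet_bowl")           # image shown until seating
    for _ in range(max_polls):
        if exited():                              # step S30: user left the booth
            projector.stop()
            return "exited"
        if seated():                              # step S40: user is seated
            projector.set_target("door_wall")     # steps S50-S80: viewpoint-based projection
            return "seated"
        # step S40 No: return to step S30 and keep polling
    projector.stop()
    return "timeout"
```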
As described above, in the third embodiment, an image can be presented effectively by projecting it onto the toilet bowl 41, which a user entering the booth 14 is sure to view. For example, by displaying the image as if living creatures inhabited the toilet bowl 41, a clean impression can be given to the user. By displaying an idyllic image such as goldfish, an effect of relaxing the user can be achieved. Furthermore, when an image including a water surface is projected, such as a bird's-eye view of goldfish swimming beneath the water surface, the image can be projected so that the water surface in the image coincides with the sealing water in the toilet bowl 41. This gives the user an augmented reality (AR) impression, as if goldfish were swimming under the actual sealing water, and draws the user's interest to the projection image, whereby the user's expectation for the image to be projected next onto the inner wall of the door 9 or the like can be enhanced.
A fourth embodiment adds, to any of the first to third embodiments described above, a configuration for projecting an image onto the floor surface 14F while the door 9 is open. The other configuration is the same as in any of the first to third embodiments, so the same components are denoted by the same reference numerals and symbols, and duplicative description thereof is omitted.
In the fourth embodiment, human detection sensors 466 and 467 are provided outside the booths 14 to detect that a user has entered a predetermined area close to a booth 14, that is, has approached a booth 14, whereupon projection of an image 70 onto the floor surface 14F is started. Specifically, the sensors 466 and 467 are provided in the vicinity of the doorways of the female toilet facilities 101 and the male toilet facilities 102. When the sensor 466 detects the presence of a user, it is determined that the user has approached a booth 14 in the female toilet facilities 101, and when the sensor 467 detects the presence of a user, it is determined that the user has approached a booth 14 in the male toilet facilities 102. The detection is not limited to human detection sensors: cameras 51 and 52 may be provided in the toilet facilities 101 and 102, and when a user is recognized in a captured image by pattern recognition, it may be determined that the user has approached a booth 14. A user in a wheelchair may also be detected based on the images captured by the cameras 51 and 52.
When the control device 3 detects, via the sensors 466 and 467 or the cameras 51 and 52, that a user has approached a booth 14, the control device 3 starts the processing.
Next, the control device 3 detects whether the door 9 is closed, using the opening/closing sensor of the door 9 or the like (step S25). When the door 9 is not closed (step S25, No), the control device 3 continues the display of the image started in step S20B.
When the door 9 is closed (step S25, Yes), the control device 3 stops the projection of the image onto the floor surface 14F and controls the projection position changing unit 16 of the projector 1 to set the projection target surface to the inner wall of the door 9 and project an image. At this time, since it can be estimated that the user has just entered the booth 14 and is still standing before sitting on the toilet bowl 41, the image is projected at a high position, for example, the maximum height of the adjustment range (for example, 1400 mm). The projection position is not limited to this position: it may be set to the middle of the adjustment range, or an average of the viewpoint positions (heights) detected within a predetermined period may be computed and the projection position set according to that averaged viewpoint position. Step S30 and the subsequent steps are the same as in the first to third embodiments described above.
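The fourth-embodiment sequence (approach detection, floor projection, then wall projection on door closure) can be sketched as a small event handler. This is an illustrative sketch under assumed event names and an assumed `Projector` interface; the 1400 mm value is the example maximum height from the description above.

```python
# Hypothetical sketch of the fourth-embodiment sequence: project guidance onto
# the floor while the door is open, then move the image to the door's inner
# wall at a high position once the door closes.

MAX_HEIGHT_MM = 1400  # example top of the projector's adjustment range

class Projector:
    def __init__(self):
        self.surface = None
        self.height_mm = None

    def project(self, surface, height_mm):
        self.surface = surface
        self.height_mm = height_mm

def on_event(event, projector):
    if event == "user_approached":        # human detection sensor or camera
        projector.project("floor", height_mm=0)                  # step S20B
    elif event == "door_closed":          # step S25 Yes
        projector.project("door_inner_wall", height_mm=MAX_HEIGHT_MM)
    # step S25 No: the door is still open, so the floor image simply continues
```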
As described above, according to the fourth embodiment, an image indicating, for example, whether a booth is available can be presented to a user located outside the booth. In particular, by projecting images such as the position of the washing button and how to use the controller 43 onto the floor surface 14F, the user can learn these before entering the booth and, once seated, can concentrate on viewing the images displayed on the wall surface. Furthermore, by recognizing a user in a wheelchair and indicating how to use the booth with a wheelchair, the convenience of the user in the wheelchair is enhanced.
<Others>
The present invention is not limited to the illustrated examples described above, and it goes without saying that various modifications can be made without departing from the gist of the present invention. Moreover, although a toilet booth provided with a toilet bowl has mainly been illustrated as the booth 14 in the foregoing embodiments, the booth 14 is not limited to a toilet booth and may be any place which a user uses alone, such as a shower booth, a dressing room, a fitting room, or a capsule hotel.
Number | Date | Country | Kind |
---|---|---|---|
2017-016104 | Jan 2017 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/003294 | 1/31/2018 | WO | 00 |