1. Technical Field
The present disclosure relates to a projection apparatus that projects an image.
2. Description of the Related Art
Unexamined Japanese Patent Publication No. 2004-48695 (Patent Literature 1) discloses a projection-type image display system that can change a projection position of an image. The projection-type image display system disclosed in Patent Literature 1 includes a sensor that performs sensing on a projection target region where an image is to be projected, and detection means that executes an edge detection process or a color distribution detection process based on the sensing information to output detection information. The projection-type image display system determines, based on the detection information, a projectable region having no obstruction within the projection target region, and adjusts the projection size of the image so that the image falls within the projectable region. Accordingly, when an obstruction is present within the projection target region, the image is projected with a reduced projection size so as to avoid the obstruction.
The present disclosure provides a projection apparatus that enables an object such as a person to easily see a projection image, without the image being affected by an obstruction, when the projection image is projected for presentation to the object.
The projection apparatus according to the present disclosure includes a projection unit, a detector, and a controller. The projection unit projects a projection image. The detector detects a state of an obstruction to projecting the projection image within a predetermined first projection region. The controller initially sets the region where the projection image is projected to the first projection region. When the state of the obstruction detected by the detector corresponds to a predetermined condition, the controller changes the region where the projection image is projected from the first projection region to a predetermined second projection region different from the first projection region.
The projection apparatus according to the present disclosure changes the projection region from the first projection region to the second projection region when, while the projection image is projected on the first projection region, the state of the obstruction corresponds to the predetermined condition. This enables an object such as a person to easily see the projection image, without the image being affected by the obstruction, when the projection image is projected for presentation to the object.
Exemplary embodiments will be described below in detail with reference to the drawings as necessary. However, unnecessarily detailed descriptions will sometimes be omitted. For example, detailed descriptions of matters already well known in the art and redundant descriptions of substantially identical configurations will sometimes be omitted. This is to prevent the following description from becoming needlessly redundant and to facilitate understanding by a person skilled in the art.
Note that the accompanying drawings and the following description are provided by the applicant in order for a person of ordinary skill in the art to sufficiently understand the present disclosure, and they are not intended to limit the subject matter set forth in the claims.
Projector apparatus 100 will be described as a specific exemplary embodiment of a projection apparatus according to the present disclosure.
The outline of an image projecting operation with projector apparatus 100 will be described with reference to the drawings.
Drive unit 110 can drive the body of projector apparatus 100 in a pan direction (horizontal direction) and a tilt direction (vertical direction) so as to change the projection direction of projector apparatus 100.
Projector apparatus 100 includes user interface device 200. Thus, projector apparatus 100 can execute various controls on a projection image according to an operation or a standing position of a person.
The configuration and operation of projector apparatus 100 will be described below in detail.
<1. Configuration of Projector Apparatus>
User interface device 200 includes controller 210, memory 220, and distance detector 230. Distance detector 230 is one example of a first detector that detects a state of an obstruction in projecting a projection image within a predetermined first projection region, and also one example of a second detector that detects a specific object.
Controller 210 is a semiconductor element that controls the whole of projector apparatus 100. Specifically, controller 210 controls the components of user interface device 200 (distance detector 230 and memory 220), light source unit 300, image generator 400, and projection optical system 500. Controller 210 can also perform a digital zoom control for zooming a projection image in and out through video image signal processing. Controller 210 may be formed only by hardware, or may be implemented by combining hardware and software.
Memory 220 is a memory element that stores various information, and is configured by a flash memory or a ferroelectric memory. Memory 220 stores a control program and the like for controlling projector apparatus 100, as well as various information supplied from controller 210. Memory 220 also stores settings such as the projection size with which a projection image is to be displayed, and data such as a table of focusing values according to distance information to a projection target.
Distance detector 230 is configured by a TOF (Time-of-Flight) sensor, for example, and linearly detects the distance to an opposed surface. When facing wall 140, distance detector 230 detects the distance from distance detector 230 to wall 140. Similarly, when facing floor 150, distance detector 230 detects the distance from distance detector 230 to floor 150. Distance detector 230 includes infrared light source unit 231 that emits infrared detection light, infrared light receiving unit 232 that receives, at each pixel, the infrared detection light reflected by the opposed surface, and sensor controller 233 that controls them.
Sensor controller 233 reads, from its internal memory, the phase of the infrared detection light emitted from infrared light source unit 231 and the phase of the infrared detection light received by each pixel of infrared light receiving unit 232. Sensor controller 233 measures the distance from distance detector 230 to the opposed surface based on the phase difference between the emitted and received infrared detection light, thereby generating distance information (a distance image).
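As a minimal illustration of this phase-difference calculation, the following sketch assumes a continuous-wave TOF sensor with a single modulation frequency; the function name and the 10 MHz example value are illustrative assumptions, not values from the embodiment.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_emitted: float, phase_received: float,
                 modulation_freq_hz: float) -> float:
    """Distance from the phase difference of a continuous-wave TOF signal.

    The light travels to the opposed surface and back, so the round trip
    covers 2*d; a phase shift of 2*pi corresponds to one modulation period.
    """
    phase_diff = (phase_received - phase_emitted) % (2.0 * math.pi)
    round_trip_time = phase_diff / (2.0 * math.pi * modulation_freq_hz)
    return C * round_trip_time / 2.0

# Example: a 90-degree phase shift at 10 MHz modulation -> about 3.75 m
print(tof_distance(0.0, math.pi / 2, 10e6))
```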
A TOF sensor is used as distance detector 230 in the above description. However, the present disclosure is not limited thereto. Specifically, distance detector 230 may be one that projects a known pattern such as a random dot pattern and calculates distance from the deviation of the detected pattern, or one that measures distance using the parallax of a stereo camera.
Next, the configuration of light source unit 300, image generator 400, and projection optical system 500, which are the components of projector apparatus 100 other than user interface device 200, will be described with reference to the drawings.
The configuration of light source unit 300 will be described first. Light source unit 300 includes semiconductor laser 310, light guide optical system 320, dichroic mirror 330, λ/4 plate 340, lens 350, phosphor wheel 360, and light guide optical system 370.
Semiconductor laser 310 is a solid-state light source that emits S-polarized blue light having a wavelength of 440 nm to 455 nm, for example. The S-polarized blue light emitted from semiconductor laser 310 is incident on dichroic mirror 330 through light guide optical system 320.
For example, dichroic mirror 330 is an optical element having a high reflectance of 98% or more for S-polarized blue light having a wavelength of 440 nm to 455 nm, and a high transmittance of 95% or more for P-polarized blue light having a wavelength of 440 nm to 455 nm and for green to red light having a wavelength of 490 nm to 700 nm regardless of the polarization state. Dichroic mirror 330 reflects the S-polarized blue light emitted from semiconductor laser 310 toward λ/4 plate 340.
λ/4 plate 340 is a polarization element that converts linearly polarized light into circularly polarized light, or circularly polarized light into linearly polarized light. λ/4 plate 340 is disposed between dichroic mirror 330 and phosphor wheel 360. The S-polarized blue light incident on λ/4 plate 340 is converted into circularly polarized blue light, and is then emitted toward phosphor wheel 360 through lens 350.
Phosphor wheel 360 is an aluminum flat plate configured to be rotatable at high speed. Phosphor wheel 360 has, on its surface, a plurality of B regions, which are diffuse-reflection regions; a plurality of G regions, which are coated with a phosphor that emits green light; and a plurality of R regions, which are coated with a phosphor that emits red light. Circularly polarized blue light emitted onto a B region of phosphor wheel 360 is diffusely reflected, and again enters λ/4 plate 340 as circularly polarized blue light. The circularly polarized blue light incident on λ/4 plate 340 is converted into P-polarized blue light, and then again enters dichroic mirror 330. Because the blue light incident on dichroic mirror 330 at this time is P-polarized, it passes through dichroic mirror 330 and enters image generator 400 through light guide optical system 370.
Blue light emitted onto a G region or an R region of phosphor wheel 360 excites the phosphor applied on that region, causing it to emit green light or red light. The green light or red light emitted from the G regions or the R regions enters dichroic mirror 330, passes through it, and enters image generator 400 through light guide optical system 370.
Due to the high-speed rotation of phosphor wheel 360, blue light, green light, and red light are emitted from light source unit 300 to image generator 400 in a time-division manner.
Image generator 400 generates a projection image according to a video image signal supplied from controller 210. Image generator 400 includes DMD (Digital Mirror Device) 420 and the like. DMD 420 is a display element in which many micromirrors are arrayed on a flat plane. DMD 420 deflects each of the arrayed micromirrors according to the video image signal supplied from controller 210 to spatially modulate the incident light. As described above, light source unit 300 emits blue light, green light, and red light in a time-division manner. DMD 420 repeatedly and sequentially receives the time-divided blue, green, and red light through light guide optical system 410, and deflects each of the micromirrors in synchronization with the timing at which light of each color arrives. Image generator 400 thereby generates a projection image according to the video image signal. According to the video image signal, DMD 420 deflects the micromirrors so that light is directed either to projection optical system 500 or outside the effective range of projection optical system 500. With this, image generator 400 can supply the generated projection image to projection optical system 500.
Projection optical system 500 includes optical members such as zoom lens 510 and focusing lens 520. Projection optical system 500 enlarges the light coming from image generator 400 and projects it on a projection plane. Controller 210 adjusts the position of zoom lens 510 to control the projection region on the projection target so as to attain a desired zoom value. To enlarge the projection image, controller 210 increases the zoom magnification by moving zoom lens 510 in the direction that widens the angle of view (toward the wide end), thereby expanding the projection region. Conversely, to shrink the projection image, controller 210 decreases the zoom magnification by moving zoom lens 510 in the direction that narrows the angle of view (toward the tele end), thereby narrowing the projection region. In addition, controller 210 adjusts the position of focusing lens 520 based on predetermined zoom tracking data so as to track the movement of zoom lens 510, thereby keeping the projection image in focus.
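The zoom tracking adjustment might be sketched as follows; the table of zoom and focusing lens positions is a hypothetical stand-in for the zoom tracking data stored in memory 220, and all values are illustrative.

```python
import bisect

# Hypothetical zoom tracking table: zoom lens position -> in-focus
# focusing lens position for a fixed projection distance.
ZOOM_TRACKING = [(0, 120), (25, 135), (50, 160), (75, 190), (100, 230)]

def focus_position(zoom_pos: float) -> float:
    """Linearly interpolate the focusing lens position for a zoom position."""
    zooms = [z for z, _ in ZOOM_TRACKING]
    i = bisect.bisect_right(zooms, zoom_pos)
    if i == 0:
        return ZOOM_TRACKING[0][1]
    if i == len(ZOOM_TRACKING):
        return ZOOM_TRACKING[-1][1]
    (z0, f0), (z1, f1) = ZOOM_TRACKING[i - 1], ZOOM_TRACKING[i]
    t = (zoom_pos - z0) / (z1 - z0)
    return f0 + t * (f1 - f0)

print(focus_position(60))  # -> 172.0
```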
In the above description, a DLP (Digital Light Processing) configuration using DMD 420 has been described as one example of projector apparatus 100. However, the present disclosure is not limited thereto. That is, a liquid crystal configuration may instead be used for projector apparatus 100.
A single-plate configuration, in which the light from the light source is time divided using phosphor wheel 360, has been described above as one example of projector apparatus 100. However, the present disclosure is not limited thereto. That is, a three-plate configuration including separate light sources for blue, green, and red light may be used for projector apparatus 100.
The configuration in which the blue light source for generating a projection image and the infrared light source for measuring distance are separate units has been described above. However, the present disclosure is not limited thereto. That is, a unit combining the blue light source for generating a projection image and the infrared light source for measuring distance may be used. If the three-plate type is employed, a unit combining the light sources of the respective colors and the infrared light source may be used.
<2. Operation>
2-1. Outline of Operation
The outline of a projecting operation of projector apparatus 100 according to the present exemplary embodiment will be described with reference to the drawings.
Projector apparatus 100 according to the present exemplary embodiment detects a specific person using the distance information from distance detector 230, and projects a predetermined projection image near the person while tracking the movement of the detected person. For example, projection image 10 is projected on projection position P1 on floor surface 81, ahead of detected person 6 in the direction of movement.
However, there may be a case where the region required to project projection image 10 cannot be ensured on floor surface 81 because floor surface 81 is crowded with many persons and the projection is obstructed. Therefore, in the present exemplary embodiment, the state of obstructions 7 other than person 6 on floor surface 81 is detected. When crowd 70 of obstructions 7 prevents projection on floor surface 81, projector apparatus 100 changes the region where projection image 10 is projected from floor surface 81 to wall surface 82.
The condition of crowd 70 changes from moment to moment. Crowd 70 may therefore clear after projection image 10 could not be projected on floor surface 81 due to crowd 70 acting as an obstruction, so that projection of projection image 10 on floor surface 81 becomes possible again. In such a case, projector apparatus 100 returns the projection region where projection image 10 is to be projected to floor surface 81, which is easily seen by person 6. For this purpose, in the present exemplary embodiment, the condition of crowd 70 on floor surface 81 is monitored even while projection image 10 is projected onto wall surface 82. Then, when crowd 70 clears away from projection position P1 on floor surface 81, projector apparatus 100 returns the region where projection image 10 is to be projected from wall surface 82 to floor surface 81.
2-2. Detail of Operation
The detail of the operation of projector apparatus 100 according to the present exemplary embodiment will be described below.
2-2-1. Tracking Operation of Projection Image
Firstly, the tracking operation of a projection image of projector apparatus 100 according to the present exemplary embodiment will be described with reference to the drawings.
2-2-2. Changing Projection Process
Next, the flow of the changing projection process of projector apparatus 100 according to the present exemplary embodiment will be described with reference to the drawings.
Firstly, controller 210 determines whether or not distance detector 230 detects specific person 6 (S100). Person 6 is an object that is tracked so that projection image 10 is projected for person 6. Person 6 is detected from the distance information of floor surface 81 on which person 6 is present. The distance information is, for example, an image showing the detection result of the distances detected by distance detector 230. The method for detecting person 6 will be described in detail below.
When it is determined that person 6 is detected (YES in S100), controller 210 detects the position and the direction of movement of detected person 6 based on the distance information (S102). The detail of the method for detecting the position and the direction of movement of person 6 will also be described below.
Next, controller 210 sets projection position P1 on floor surface 81 based on the position and the direction of movement of person 6 detected in step S102, and projects projection image 10 on projection position P1 (S104).
Next, controller 210 detects obstruction 7 near projection position P1 on the extension of the direction of movement of person 6, using the distance information (S106). Obstruction 7 is detected by extracting, from the distance information that is the detection result of distance detector 230, a detection amount indicating the congestion degree of obstructions 7 near projection position P1. The congestion degree is the number or density of obstructions within the projection region. The method for detecting the congestion degree of obstructions 7, that is, the method for detecting crowd 70, will be described in detail below.
Next, controller 210 determines whether or not the detection amount of obstruction 7 obtained by the detection process in step S106 exceeds a predetermined first threshold (S108). The first threshold is a reference for determining that crowd 70 obstructs the projecting operation due to an increase in obstructions 7. When it is determined that the detection amount of obstruction 7 does not exceed the first threshold (NO in S108), controller 210 returns to the process in step S102.
On the other hand, when it is determined that the detection amount of obstruction 7 exceeds the first threshold (YES in S108), controller 210 projects projection image 10 while changing the projection region from floor surface 81 to wall surface 82 (S110).
At this time, controller 210 sets projection position P2 based on the detection result in step S102, and controls drive unit 110 to change the projection region from floor surface 81 to wall surface 82. In addition, controller 210 controls image generator 400 to perform geometric correction of projection image 10 relative to wall surface 82, and controls projection optical system 500 to place the focal point of projection image 10 on projection position P2. The angle of view of distance detector 230 is set wider than the angle of view for projection. Therefore, although drive unit 110 changes the projection region of projection image 10 from floor surface 81 to wall surface 82, drive unit 110 drives projector apparatus 100 such that projection position P1 on floor surface 81 remains included in the detection region of distance detector 230.
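Geometric correction of this kind is commonly realized as a projective (homography) warp of the frame before projection. The following sketch uses OpenCV; the corner coordinates and image size are illustrative assumptions, and the embodiment's actual correction may differ.

```python
import cv2
import numpy as np

# Corners of the undistorted source image (w x h) and the corresponding
# corners of the projection region as seen on the oblique wall surface.
# The wall coordinates here are illustrative placeholders.
w, h = 640, 360
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
dst = np.float32([[30, 10], [610, 40], [600, 350], [20, 330]])

H = cv2.getPerspectiveTransform(src, dst)

image = np.zeros((h, w, 3), np.uint8)  # stand-in for projection image 10
cv2.putText(image, "10", (280, 200), cv2.FONT_HERSHEY_SIMPLEX, 3, (255,) * 3, 5)
corrected = cv2.warpPerspective(image, H, (w, h))  # pre-distorted frame to project
```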
Next, controller 210 detects an obstruction on floor surface 81 from the distance information on floor surface 81 (S112), as in the process in step S106.
Next, controller 210 determines whether or not the detection amount of obstruction 7 obtained by the detection process in step S112 exceeds a predetermined second threshold (S114). The second threshold is a reference for determining that crowd 70 has cleared due to a decrease in obstructions 7, and is set smaller than the first threshold.
When it is determined that the detection amount of obstruction 7 exceeds the second threshold (YES in S114), controller 210 detects the position and the direction of movement of person 6 that is now tracked (S116).
Next, controller 210 sets projection position P2 on wall surface 82 based on the position and the direction of movement of person 6 detected in step S116, and projects projection image 10 on projection position P2 (S118).
On the other hand, when it is determined that the detection amount of obstruction 7 does not exceed the second threshold (NO in S114), controller 210 returns the projection region from wall surface 82 to floor surface 81. Specifically, controller 210 projects projection image 10 by changing projection position P2 on wall surface 82 to projection position P1 on floor surface 81 (S120).
As described above, projector apparatus 100 according to the present exemplary embodiment monitors the condition of crowd 70 by continuously detecting the congestion degree of obstructions 7 on floor surface 81 in steps S106 and S112. Then, when crowd 70 occurs, projector apparatus 100 changes the projection position of projection image 10 to wall surface 82 from floor surface 81 (S110). When crowd 70 is cleared away after that, projector apparatus 100 returns the projection position to floor surface 81 (S120). With this, projection image 10 is projected on a position easily seen by person 6 according to the condition of crowd 70. Notably, floor surface 81 is one example of a first projection region where projection image 10 is projected for person 6, and wall surface 82 is one example of a second projection region different from the first projection region.
Further, in the present exemplary embodiment, projection positions P1 and P2 on floor surface 81 and wall surface 82 are changed using drive unit 110 in steps S110 and S120, and the angle of view for projection of projection image 10 is set for only one of floor surface 81 and wall surface 82 at a time. If the angle of view for projection were widened to cover the entire region where an image may be projected, brightness or resolution would be reduced. By instead narrowing the angle of view for projection and changing the projection direction with drive unit 110, as in the present exemplary embodiment, a bright, high-resolution projection image can be projected over a wide range.
In addition, drive unit 110 causes projection image 10 to track person 6 in steps S104 and S118. With this, the angle of view for projection of projection image 10 can further be narrowed on floor surface 81 or wall surface 82, so that image quality of projection image 10 can be enhanced.
Further, in the determination processes in steps S108 and S114, the second threshold for the changeover from projection position P2 to projection position P1 is set smaller than the first threshold for the changeover from projection position P1 to projection position P2, so as to form a hysteresis width. This stabilizes the operation of changing between projection positions P1 and P2.
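This two-threshold changeover behaves as a hysteresis state machine. A minimal sketch follows; the threshold values and class name are illustrative, not values from the embodiment.

```python
FIRST_THRESHOLD = 8    # congestion level at which projection moves to the wall
SECOND_THRESHOLD = 3   # level the congestion must fall to before returning

class ProjectionRegionSelector:
    """Hysteresis between the floor (first) and wall (second) regions."""

    def __init__(self):
        self.region = "floor"

    def update(self, obstruction_count: int) -> str:
        if self.region == "floor" and obstruction_count > FIRST_THRESHOLD:
            self.region = "wall"          # S110: crowd formed
        elif self.region == "wall" and obstruction_count <= SECOND_THRESHOLD:
            self.region = "floor"         # S120: crowd cleared
        return self.region

selector = ProjectionRegionSelector()
for n in (2, 9, 6, 4, 3, 2):
    print(n, selector.update(n))
# Counts 6 and 4 keep the wall: the gap between the two thresholds is what
# stabilizes the changeover.
```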
In addition, in the processes in steps S110 and S120, the image quality of projection image 10 may be changed when changing projection positions P1 and P2 on floor surface 81 and wall surface 82. Specifically, memory 220 preliminarily stores an image quality data table including attribute information, such as the color, diffuse reflectivity, and mirror reflectivity, of each of floor surface 81 and wall surface 82. Controller 210 reads the image quality data table from memory 220, and controls image generator 400 based on the read table to generate projection image 10 with chromaticity correction or brightness correction of a set value according to the attribute information of floor surface 81 or wall surface 82.
For example, in a case where wall surface 82 is red, red content in projection image 10 is not noticeable. Therefore, controller 210 emphasizes red in projection image 10, or replaces red in the content of projection image 10 with black.
Further, in a case where the projection plane on which a projection image is to be projected has a high diffuse reflectivity, projection light is diffused on the projection plane. Therefore, when one of floor surface 81 and wall surface 82 has a higher diffuse reflectivity even though the two surfaces have a similar color, controller 210 performs correction to increase the brightness of projection image 10 upon projecting onto that surface. Conversely, reflected light of projection image 10 is dazzling on a surface having a high mirror reflectivity, so controller 210 performs correction to decrease the brightness of projection image 10 upon projecting onto such a surface.
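One possible organization of the image quality data table and the correction selection described above is sketched below; the attribute values, correction factors, and key names are illustrative assumptions, not the embodiment's actual data.

```python
from dataclasses import dataclass

@dataclass
class SurfaceAttributes:
    color: str                    # dominant color of the projection plane
    diffuse_reflectivity: float   # 0.0 .. 1.0
    mirror_reflectivity: float    # 0.0 .. 1.0

# Hypothetical contents of the image quality data table in memory 220.
IMAGE_QUALITY_TABLE = {
    "floor_81": SurfaceAttributes("gray", 0.40, 0.05),
    "wall_82":  SurfaceAttributes("red",  0.75, 0.02),
}

def corrections(surface: SurfaceAttributes) -> dict:
    """Pick chromaticity/brightness corrections from surface attributes."""
    corr = {"brightness_gain": 1.0, "replace_red_with_black": False}
    if surface.color == "red":
        corr["replace_red_with_black"] = True  # red content is not noticeable
    if surface.diffuse_reflectivity > 0.6:
        corr["brightness_gain"] *= 1.3         # light is diffused: brighten
    if surface.mirror_reflectivity > 0.5:
        corr["brightness_gain"] *= 0.7         # glare: dim the image
    return corr

print(corrections(IMAGE_QUALITY_TABLE["wall_82"]))
```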
2-2-3. With Regard to Method for Detecting Person and Crowd
Next, the method for detecting a person and crowd with projector apparatus 100 according to the present exemplary embodiment will be described.
Firstly, the method for detecting a person in step S100 will be described.
Distance detector 230 preliminarily acquires distance information of floor surface 81 in a state where no person is present, and this information is held as basic depth information D1.
Controller 210 in projector apparatus 100 continuously acquires distance information on floor surface 81 using distance detector 230, and analyzes the change of the acquired distance information with respect to basic depth information D1. When person 6 enters floor surface 81 within the detection region of distance detector 230, the distances in the group of pixels corresponding to person 6 change, and controller 210 detects the presence of person 6 from this group of pixels.
When detecting the presence of person 6, controller 210 detects the position of person 6 based on the detected group of pixels in the distance information (see step S102).
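A minimal sketch of this presence-and-position detection follows, assuming the distance image and basic depth information D1 are given as NumPy arrays; the threshold values are illustrative. Tracking the returned position across successive distance images would yield direction of movement V6.

```python
import numpy as np

def detect_person_position(depth: np.ndarray, basic_depth_d1: np.ndarray,
                           change_thresh: float = 0.3, min_pixels: int = 400):
    """Presence and (row, col) position of person 6 in a distance image.

    A pixel counts as changed when its distance differs from basic depth
    information D1 by at least change_thresh; a human-sized cluster of
    changed pixels is taken as a person. Both thresholds are illustrative
    assumptions and depend on the sensor geometry.
    """
    changed = np.abs(depth - basic_depth_d1) >= change_thresh
    if changed.sum() < min_pixels:        # no human-sized change on floor 81
        return None
    rows, cols = np.nonzero(changed)
    return rows.mean(), cols.mean()       # centroid of the changed pixels
```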
Next, the method for detecting a crowd in steps S106 and S112 will be described.
In the detection of crowd 70, controller 210 firstly obtains the detection amount of obstructions 7 on floor surface 81. Specifically, controller 210 uses, as the detection amount, the number of obstructions 7 concurrently present in the distance image detected by distance detector 230. To obtain it, controller 210 first detects the pixels whose amount of change with respect to basic depth information D1 in the distance image is not less than a predetermined threshold, and extracts spatial groups of such pixels. When the size occupied by an extracted group of spatially continuous pixels exceeds a predetermined threshold corresponding to the size of a human, controller 210 determines that one obstruction 7 is present. Controller 210 counts the number of groups of pixels whose size is not less than the predetermined threshold, thereby detecting the number of obstructions 7.
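The counting step might look like the following sketch, which uses SciPy connected-component labeling as one way to form the spatial groups of pixels; the human-size threshold is an illustrative assumption.

```python
import numpy as np
from scipy import ndimage

HUMAN_SIZE_PIXELS = 400   # illustrative threshold for "the size of a human"

def count_obstructions(depth: np.ndarray, basic_depth_d1: np.ndarray,
                       change_thresh: float = 0.3) -> int:
    """Count human-sized groups of changed pixels (obstructions 7)."""
    changed = np.abs(depth - basic_depth_d1) >= change_thresh
    labels, n = ndimage.label(changed)    # group spatially continuous pixels
    if n == 0:
        return 0
    sizes = ndimage.sum(changed, labels, index=range(1, n + 1))
    return int((np.asarray(sizes) >= HUMAN_SIZE_PIXELS).sum())
```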
Next, controller 210 compares the detected number of obstructions 7 with the number corresponding to the first or second threshold to determine the congestion or clearing of crowd 70. Specifically, when the number of obstructions 7 exceeds the number corresponding to the first threshold, controller 210 determines that crowd 70 on floor surface 81 corresponds to the exception condition, and exceptionally projects the projection image on wall surface 82 (see steps S108 and S110).
The number of obstructions 7 may be detected in a region within a predetermined range in the direction of movement of person 6, such as a region overlapping projection position P1 or a region including projection position P1.
In addition, crowd 70 may be detected using, as the detection amount, the density of obstructions 7 overlapping floor surface 81. In this case, controller 210 first detects the pixels whose amount of change with respect to basic depth information D1, within the region of the predetermined range in the distance image, is not less than a predetermined threshold, and extracts the area occupied by the detected pixels. Controller 210 detects the density of obstructions 7 in the region within the predetermined range based on the extracted area. Controller 210 then compares the detected density of obstructions 7 with a predetermined density corresponding to the first or second threshold, thereby determining the exception condition as in the above case.
Alternatively, crowd 70 may be detected by extracting a region having no obstructions 7 on floor surface 81. In this case, controller 210 extracts a region not overlapping obstructions 7 within the predetermined range on floor surface 81, based on the distance image that is the detection result of distance detector 230, and detects the display size that falls within the extracted region. Controller 210 compares the detected display size with a predetermined display size corresponding to the first or second threshold, thereby determining the exception condition as in the above case. It is to be noted that, in this case, the display size corresponding to the first threshold may be set smaller than the display size corresponding to the second threshold.
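The density variant and the free-region variant might be sketched as follows; the maximal-square search is one possible way to measure a display size fitting the obstruction-free region, not necessarily the method of the embodiment. Both functions take the boolean "changed pixel" mask computed above.

```python
import numpy as np

def obstruction_density(changed_mask: np.ndarray) -> float:
    """Fraction of the monitored region occupied by obstruction pixels."""
    return float(changed_mask.mean())

def max_free_square(changed_mask: np.ndarray) -> int:
    """Side length (pixels) of the largest obstruction-free square.

    Classic dynamic-programming "maximal square" search; the result is
    compared with the display size required for projection image 10.
    """
    free = ~changed_mask
    h, w = free.shape
    dp = np.zeros((h, w), dtype=int)
    best = 0
    for r in range(h):
        for c in range(w):
            if free[r, c]:
                dp[r, c] = (1 if r == 0 or c == 0
                            else min(dp[r-1, c], dp[r, c-1], dp[r-1, c-1]) + 1)
                best = max(best, dp[r, c])
    return best
```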
2-2-4. With Regard to Projection Position of Projection Image
Next, the projection position of a projection image with projector apparatus 100 will be described with reference to the drawings.
In a case where a projection image is projected on floor surface 81, projection position P1 of the projection image is set at a position on the floor surface ahead of position p6 of person 6 by predetermined distance d1 in direction of movement V6 of person 6 being tracked.
On the other hand, in a case where a projection image is projected on wall surface 82, projection position P2 of the projection image is set at a position on wall surface 82 at height h6, the same level as position p6′ of the face of person 6, the position being ahead of position p6′ by predetermined distance d2 in direction of movement V6 of person 6.
Notably, if wall surface 82 overlaps the extension of direction of movement V6 of person 6, or overlaps the extension to the side of direction of movement V6 of person 6, the position at height h6 on wall surface 82 on the extension in that direction may be set as projection position P2. In addition, height h6 of the face of person 6 may be calculated as a predetermined ratio (for example, 80%) of the height of person 6.
Further, the projection size of the projection image may be changed according to the distance from person 6 to the projection position. For example, in a case where an image is projected on wall surface 82, which is relatively far from person 6, the image may be projected with a projection size larger than that of an image projected on floor surface 81, which is relatively near person 6. With this, visibility of the projection image can be ensured even when the image is projected at a position relatively distant from person 6.
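The placement rules above reduce to a few lines of vector arithmetic. The following sketch assumes hypothetical coordinate conventions (floor positions as 2D points, wall positions as (x, y, height)); the function names and the linear scaling rule for the projection size are illustrative assumptions.

```python
import numpy as np

def unit(v: np.ndarray) -> np.ndarray:
    return v / np.linalg.norm(v)

def projection_position_p1(p6: np.ndarray, v6: np.ndarray, d1: float):
    """P1 on floor surface 81: d1 ahead of position p6 of person 6 along V6."""
    return p6 + d1 * unit(v6)             # p6, v6: 2D floor coordinates

def projection_position_p2(p6_face: np.ndarray, v6: np.ndarray, d2: float,
                           person_height: float, face_ratio: float = 0.8):
    """P2 on wall surface 82: d2 ahead of the face along V6, at face height."""
    xy = p6_face + d2 * unit(v6)          # horizontal placement on the wall
    h6 = face_ratio * person_height       # height of the face (80% example)
    return np.array([xy[0], xy[1], h6])

def projection_size(base_size: float, dist_to_target: float,
                    reference_dist: float) -> float:
    """Enlarge the projection size for targets farther from person 6."""
    return base_size * dist_to_target / reference_dist

# Example: person at (0, 0) walking along +x, P1 placed 1.5 m ahead.
print(projection_position_p1(np.array([0.0, 0.0]), np.array([1.0, 0.0]), 1.5))
```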
<3. Effects>
As described above, in the present exemplary embodiment, projector apparatus 100 includes projection unit 250, distance detector 230, and controller 210. Projection unit 250 projects projection image 10. Distance detector 230 detects the state of obstruction 7 on floor surface 81 when projecting projection image 10. Controller 210 initially sets the region where projection image 10 is projected to floor surface 81. When the state of obstruction 7 detected by distance detector 230 corresponds to a predetermined condition, controller 210 changes the region where projection image 10 is projected from floor surface 81 to wall surface 82, which differs from floor surface 81. When the state of obstruction 7 no longer corresponds to the predetermined condition, controller 210 returns the region where projection image 10 is projected from wall surface 82 to floor surface 81.
With projector apparatus 100 according to the present exemplary embodiment, a projection image is basically projected on floor surface 81, and the projection region is changed from floor surface 81 to wall surface 82 when the state of obstruction 7 corresponds to the predetermined condition. When the state of obstruction 7 subsequently ceases to correspond to the predetermined condition, projector apparatus 100 returns projection image 10 to floor surface 81. With this, projection image 10 can be projected at a position where person 6 easily sees it, when projection image 10 is projected for presentation to person 6.
In addition, in the present exemplary embodiment, distance detector 230 detects specific person 6. Then, controller 210 causes projection image 10 projected with projection unit 250 to track person 6 detected by distance detector 230. Therefore, when person 6 moves, the projection image is projected while tracking person 6, so that visibility of the projection image for specific person 6 can be enhanced.
As described above, the first exemplary embodiment has been described as an illustration of the technology disclosed in the present application. However, the technology in the present disclosure is not limited to this, and can also be applied to exemplary embodiments in which changes, replacements, additions, omissions, and the like are made. Furthermore, a new exemplary embodiment can be formed by combining the components described in the first exemplary embodiment.
The other exemplary embodiments will be described below.
Projector apparatus 100 according to the first exemplary embodiment includes distance detector 230 as one example of the second detector that detects a person. However, the second detector is not limited thereto. For example, instead of or in addition to distance detector 230, an imaging unit that captures an image with visible light (RGB) may be provided, and controller 210 may recognize a person or an obstruction through an image analysis of the image captured by the imaging unit.
For example, projector apparatus 100 may include an imaging unit configured by a CCD camera or the like, and the direction of movement or orientation of a person, or the congestion degree of obstructions, may be extracted from the image captured by the imaging unit. For example, controller 210 may recognize the eye level of person 6 being tracked through an image analysis of the RGB image, and set projection position P2 on wall surface 82 according to the recognized eye level.
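Eye-level recognition of this kind could be sketched with OpenCV's stock frontal-face detector; the 40% offset from the top of the face box is an illustrative assumption, not a value from the embodiment.

```python
import cv2

# Haar cascade shipped with OpenCV, located via cv2.data.haarcascades.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def eye_level_rows(bgr_image) -> list:
    """Approximate eye level (image row) for each detected face."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Eyes sit roughly 40% down from the top of a frontal face box.
    return [int(y + 0.4 * h) for (x, y, w, h) in faces]
```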
Further, projector apparatus 100 according to the first exemplary embodiment includes distance detector 230 as one example of the first detector that detects the state of an obstruction. However, the first detector is not limited thereto. For example, crowd 70 may be detected from the image captured by the imaging unit described above.
Projector apparatus 100 according to the first exemplary embodiment includes distance detector 230 as one example of both the first and second detectors. That is, in the first exemplary embodiment, the first and second detectors are configured by one sensor. However, the configuration is not limited thereto. The first detector and the second detector may be configured by different sensors: for example, one of distance detector 230 and the imaging unit may serve as one of the first and second detectors, or distance detector 230 and the imaging unit may both function as the first and second detectors. In addition, distance detector 230 is fixed such that its detection direction and orientation are aligned with those of projection unit 250. However, the configuration is not limited thereto; for example, distance detector 230 may be provided at a position different from the installation position of projector apparatus 100.
In the first exemplary embodiment, the projection position of a projection image is changed so as to track a person with drive unit 110. However, the configuration is not limited thereto. For example, the angle of view for projection may be set wider than the projection image actually projected, and the projection image may be moved within the range of the angle of view for projection. In this case, the projection on a floor surface and the projection on a wall surface may be changed within the same angle of view for projection, for example.
In the first exemplary embodiment, an object to which a projection image is presented from projector apparatus 100 is specific person 6. However, the exemplary embodiment is not limited thereto. The object to which the projection image is presented may be a group of persons or a vehicle such as an automobile. In addition, an obstruction is not limited to a person, but may be a vehicle such as an automobile.
In addition, a projection image projected for presentation to an object may be a still image or a moving image. In a case where the projection apparatus projects a projection image while tracking an object, the projection apparatus may move the projection image so as to lead the object. The content of the projection image is not limited to content that leads person 6; it may be an advertisement, for example. In addition, the projection apparatus does not necessarily project a projection image while tracking an object. For example, the projection apparatus may project a projection image to a group of persons such that each person can easily see the projection image.
In the first exemplary embodiment, floor surface 81 is specified as the first projection region, and wall surface 82 is specified as the second projection region, for example. However, the first and second projection regions are not limited thereto. For example, a wall surface may be specified as the first projection region, and a floor surface may be specified as the second projection region. Further, a ceiling surface of a building may be specified as the first or second projection region, for example. For example, projector apparatus 100 may be installed on staircases, a wall surface may be specified as the first projection region, and a ceiling surface may be specified as the second projection region, then a projection image may basically be projected on the wall surface, and may exceptionally be projected on the ceiling surface.
The projection apparatus according to the present disclosure is applicable to a variety of uses for projecting a video image onto a projection plane.
Foreign Application Priority Data: No. 2014-263638, filed December 2014, Japan (national).
Related U.S. Application Data: parent application PCT/JP2015/005135, filed October 2015; child application No. 15220702.