The present invention relates to an image processing apparatus, an image processing method, and a storage medium that stores an image processing program.
WO2019/012774A discloses an information processing apparatus that outputs projector disposition information related to disposition of a projector based on projection conditions related to projection with the projector in order to reduce a burden on disposition design of the projector.
WO2017/179272A discloses an information processing apparatus that acquires setting information related to projection of an image with an image projection apparatus and that generates a simulation image including a plurality of the image projection apparatuses and a display region of each of a plurality of images projected by the plurality of image projection apparatuses, based on the acquired setting information.
JP2018-121964A discloses a projection toy in which a first placement portion, a second placement portion, and a third placement portion for placing a body, which is provided with a projection portion capable of projecting a video to an object, on a placement surface are provided in the body to facilitate projection in three directions, and the first placement portion, the second placement portion, and the third placement portion are provided to face in different directions from each other.
One embodiment according to the technology of the present disclosure provides an image processing apparatus, an image processing method, and a non-transitory computer-readable storage medium that stores an image processing program capable of improving user convenience related to the disposition of a projection surface or a projection apparatus.
According to one aspect of the present invention, there is provided an image processing apparatus comprising a processor, in which the processor is configured to: acquire first image data obtained by imaging a space with an imaging apparatus; determine a first position corresponding to a position of a virtual projection surface and a second position serving as a reference for an orientation of the virtual projection surface, in the space; determine the orientation of the virtual projection surface based on a positional relationship between the first position and the second position and generate virtual projection surface data representing the virtual projection surface; generate second image data representing a second image in which the virtual projection surface is displayed on a first image represented by the first image data, based on the first image data and on the virtual projection surface data; and output the second image data to an output destination.
According to another aspect of the present invention, there is provided an image processing method executed by a processor included in an image processing apparatus, the image processing method comprising: acquiring first image data obtained by imaging a space with an imaging apparatus; determining a first position corresponding to a position of a virtual projection surface and a second position serving as a reference for an orientation of the virtual projection surface, in the space; determining the orientation of the virtual projection surface based on a positional relationship between the first position and the second position and generating virtual projection surface data representing the virtual projection surface; generating second image data representing a second image in which the virtual projection surface is displayed on a first image represented by the first image data, based on the first image data and on the virtual projection surface data; and outputting the second image data to an output destination.
According to still another aspect of the present invention, there is provided a non-transitory computer-readable storage medium that stores an image processing program for causing a processor included in an image processing apparatus to execute a process comprising: acquiring first image data obtained by imaging a space with an imaging apparatus; determining a first position corresponding to a position of a virtual projection surface and a second position serving as a reference for an orientation of the virtual projection surface, in the space; determining the orientation of the virtual projection surface based on a positional relationship between the first position and the second position and generating virtual projection surface data representing the virtual projection surface; generating second image data representing a second image in which the virtual projection surface is displayed on a first image represented by the first image data, based on the first image data and on the virtual projection surface data; and outputting the second image data to an output destination.
According to the aspects of the present disclosure, it is possible to provide an image processing apparatus, an image processing method, and an image processing program capable of improving user convenience related to the disposition of a projection surface or a projection apparatus.
An example of an embodiment of the present invention will be described below with reference to the drawings.
The image processing apparatus according to the embodiment can be used, for example, to support disposition of the projection apparatus 10. The projection apparatus 10 comprises a projection portion 1, a control device 4, and an operation reception portion 2. The projection portion 1 is composed of, for example, a liquid crystal projector or a projector using liquid crystal on silicon (LCOS). In the following description, it is assumed that the projection portion 1 is a liquid crystal projector.
The control device 4 is a control device that controls projection performed by the projection apparatus 10. The control device 4 is a device including a control unit composed of various processors, a communication interface (not shown) for communicating with each portion, and a memory 4a such as a hard disk, a solid-state drive (SSD), or a read-only memory (ROM) and integrally controls the projection portion 1.
Examples of the various processors of the control unit of the control device 4 include a central processing unit (CPU) which is a general-purpose processor that executes a program to perform various types of processing, a programmable logic device (PLD) which is a processor capable of changing a circuit configuration after manufacture such as a field-programmable gate array (FPGA), a dedicated electrical circuit which is a processor having a circuit configuration exclusively designed to execute specific processing such as an application-specific integrated circuit (ASIC), or the like.
More specifically, a structure of these various processors is an electrical circuit in which circuit elements such as semiconductor elements are combined. The control unit of the control device 4 may be configured with one of the various processors or may be configured with a combination of two or more processors of the same type or different types (for example, a combination of a plurality of FPGAs or a combination of a CPU and an FPGA).
The operation reception portion 2 detects an instruction from a user by receiving various operations from the user. The operation reception portion 2 may be a button, a key, a joystick, or the like provided in the control device 4 or may be a reception portion or the like that receives a signal from a remote controller for remotely operating the control device 4.
A projection object 6 is an object such as a screen or a wall having a projection surface on which a projection image is displayed by the projection portion 1. In the example shown in
A projection surface 11 shown by a dot-dashed line is a region irradiated with projection light by the projection portion 1 in the projection object 6. In the example shown in
The projection portion 1, the control device 4, and the operation reception portion 2 are implemented by, for example, a single device (for example, see
As shown in
The light source 21 includes a light emitting element such as a laser or a light emitting diode (LED) and emits, for example, white light.
The optical modulation portion 22 is composed of three liquid crystal panels that emit each color image by modulating, based on image information, each color light beam which is emitted from the light source 21 and is separated into three colors of red, blue, and green by a color separation mechanism (not shown). Filters of red, blue, and green may be mounted in each of the three liquid crystal panels, and each color image may be emitted by modulating the white light emitted from the light source 21 in each liquid crystal panel.
The light emitted from the light source 21 and modulated by the optical modulation portion 22 is incident on the projection optical system 23. The projection optical system 23 includes at least one lens and is composed of, for example, a relay optical system. The light that has passed through the projection optical system 23 is projected onto the projection object 6.
In the projection object 6, a region irradiated with the light transmitted through the entire range of the optical modulation portion 22 is the projectable range in which the projection can be performed by the projection portion 1. Within this projectable range, a region irradiated with the light actually transmitted from the optical modulation portion 22 is the projection surface 11. For example, in the projectable range, a size, a position, and a shape of the projection surface 11 are changed by controlling a size, a position, and a shape of a region through which the light is transmitted in the optical modulation portion 22.
The control circuit 24 controls the light source 21, the optical modulation portion 22, and the projection optical system 23 based on the display data input from the control device 4, thereby projecting an image based on this display data onto the projection object 6. The display data input to the control circuit 24 is composed of three pieces of data including red display data, blue display data, and green display data.
In addition, the control circuit 24 changes the projection optical system 23 based on an instruction input from the control device 4, thereby enlarging or reducing the projection surface 11 (see
The projection apparatus 10 also comprises a shift mechanism that mechanically or optically moves the projection surface 11 while maintaining an image circle of the projection optical system 23. The image circle of the projection optical system 23 is a region where the projection light incident on the projection optical system 23 appropriately passes through the projection optical system 23 in terms of a light fall-off, color separation, edge part curvature, or the like.
The shift mechanism is implemented by at least any of an optical system shift mechanism that performs optical system shifting or an electronic shift mechanism that performs electronic shifting.
The optical system shift mechanism is, for example, a mechanism (for example, see
The electronic shift mechanism is a mechanism that performs pseudo shifting of the projection surface 11 by changing a range through which the light is transmitted in the optical modulation portion 22.
The projection apparatus 10 may also comprise a projection direction changing mechanism that moves the image circle of the projection optical system 23 and the projection surface 11. The projection direction changing mechanism is a mechanism that changes a projection direction of the projection portion 1 by changing the orientation of the projection portion 1 through mechanical rotation (for example, see
As shown in
The optical unit 106 comprises a first member 102 supported by the body part 101 and a second member 103 supported by the first member 102.
The first member 102 and the second member 103 may be an integrated member. The optical unit 106 may be configured to be attachable to and detachable from the body part 101 (in other words, configured to be interchangeable).
The body part 101 includes a housing 15 (see
As shown in
The light emitted from the light source 21 is incident on the optical modulation portion 22 of the optical modulation unit 12 and is spatially modulated and emitted by the optical modulation portion 22.
As shown in
As shown in
The first member 102 is a member having, for example, a rectangular cross-sectional outer shape, in which an opening 2a and an opening 2b are formed in surfaces perpendicular to each other. The first member 102 is supported by the body part 101 in a state in which the opening 2a is disposed at a position facing the opening 15a of the body part 101. The light emitted from the optical modulation portion 22 of the optical modulation unit 12 of the body part 101 is incident into the hollow portion 2A of the first member 102 through the opening 15a and the opening 2a.
The incidence direction of the light incident into the hollow portion 2A from the body part 101 will be referred to as a direction X1, the direction opposite to the direction X1 will be referred to as a direction X2, and the direction X1 and the direction X2 will be collectively referred to as a direction X. In
In addition, the direction perpendicular to the direction X and to the direction Z will be referred to as a direction Y. In the direction Y, the upward direction in
The projection optical system 23 shown in
The first optical system 121 includes at least one lens and guides the light that is incident on the first member 102 from the body part 101 and travels in the direction X1 to the reflective member 122.
The reflective member 122 reflects the light incident from the first optical system 121 in the direction Y1. The reflective member 122 is composed of, for example, a mirror. In the first member 102, the opening 2b is formed on the optical path of light reflected by the reflective member 122, and the reflected light travels to the hollow portion 3A of the second member 103 by passing through the opening 2b.
The second member 103 is a member having an approximately T-shaped cross-sectional outer shape, in which an opening 3a is formed at a position facing the opening 2b of the first member 102. The light that has passed through the opening 2b of the first member 102 from the body part 101 is incident into the hollow portion 3A of the second member 103 through the opening 3a. The first member 102 and the second member 103 may have any cross-sectional outer shape and are not limited to the above.
The second optical system 31 includes at least one lens and guides the light incident from the first member 102 to the reflective member 32.
The reflective member 32 reflects the light incident from the second optical system 31 in the direction X2 and guides the light to the third optical system 33. The reflective member 32 is composed of, for example, a mirror.
The third optical system 33 includes at least one lens and guides the light reflected by the reflective member 32 to the lens 34.
The lens 34 is disposed at an end part of the second member 103 on the direction X2 side in a form of closing the opening 3c formed at this end part. The lens 34 projects the light incident from the third optical system 33 onto the projection object 6.
The projection direction changing mechanism 104 is a rotation mechanism that rotatably connects the second member 103 to the first member 102. By the projection direction changing mechanism 104, the second member 103 is configured to be rotatable about a rotation axis (specifically, the optical axis K) that extends in the direction Y. The projection direction changing mechanism 104 is not limited to the disposition position shown in
The shift mechanism 105 is a mechanism for moving the optical axis K of the projection optical system (in other words, the optical unit 106) in a direction (direction Y in
The shift mechanism 105 may be a mechanism that moves the optical modulation portion 22 in the direction Y instead of moving the optical unit 106 in the direction Y. Even in this case, the image G1 projected onto the projection object 6 can be moved in the direction Y2.
Specifically, the image processing apparatus 50 displays, as an installation support image, a second image in which an image of a virtual projection surface and an image of a virtual projection apparatus are superimposed on a first image obtained by imaging the space in which the projection apparatus 10 is to be installed to perform projection.
The processor 61 is a circuit that performs signal processing, and is, for example, a CPU that controls the entire image processing apparatus 50. The processor 61 may be implemented by other digital circuits such as an FPGA and a digital signal processor (DSP). The processor 61 may also be implemented by combining a plurality of digital circuits.
For example, the memory 62 includes a main memory and an auxiliary memory. For example, the main memory is a random-access memory (RAM). The main memory is used as a work area of the processor 61.
The auxiliary memory is, for example, a non-volatile memory such as a magnetic disk or a flash memory. The auxiliary memory stores various programs for operating the image processing apparatus 50. The programs stored in the auxiliary memory are loaded into the main memory and executed by the processor 61.
In addition, the auxiliary memory may include a portable memory that can be detached from the image processing apparatus 50. Examples of the portable memory include a universal serial bus (USB) flash drive, a memory card such as a secure digital (SD) memory card, and an external hard disk drive.
The communication interface 63 is a communication interface for communicating with apparatuses outside the image processing apparatus 50. The communication interface 63 includes at least any of a wired communication interface for performing wired communication or a wireless communication interface for performing wireless communication. The communication interface 63 is controlled by the processor 61.
The user interface 64 includes, for example, an input device that receives an operation input from the user, and an output device that outputs information to the user. The input device can be implemented by, for example, a key (for example, a keyboard) or a remote controller. The output device can be implemented by, for example, a display or a speaker. In the image processing apparatus 50 shown in
The sensor 65 includes an imaging apparatus that includes an imaging optical system and an imaging element and that can perform imaging, a space recognition sensor that can three-dimensionally recognize a space around the image processing apparatus 50, and the like. For example, the imaging apparatus includes an imaging apparatus provided on a rear surface of the image processing apparatus 50 shown in
The space recognition sensor is, as an example, a light detection and ranging (LiDAR) sensor that emits laser light, measures the time taken for the emitted laser light to hit an object and reflect back, and thereby measures the distance and the direction to the object. However, the space recognition sensor is not limited thereto and can be any of various sensors such as a radar that emits radio waves or an ultrasonic sensor that emits ultrasonic waves.
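For illustration only, the time-of-flight relationship used by such a sensor can be written as a small function (the name and units are arbitrary); the measured round-trip time covers twice the sensor-to-object distance:

```python
# Illustrative sketch: converting a time-of-flight measurement to a distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the object given the measured laser round-trip time."""
    # The laser travels to the object and back, so halve the round-trip path.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# Example: a round trip of 20 ns corresponds to roughly 3 m.
print(tof_distance_m(20e-9))  # ~2.998 m
```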
The image processing apparatus 50 recognizes the physical space 70 by the space recognition sensor. Specifically, the image processing apparatus 50 recognizes the physical space 70 by a world coordinate system including an X-axis, a Y-axis, and a Z-axis, in which the X-axis is one horizontal direction in the physical space 70, the Y-axis is the direction of gravitational force in the physical space 70, and the Z-axis is the direction orthogonal to the X-axis and to the Y-axis in the physical space 70. Further, the image processing apparatus 50 displays, on the touch panel 51, a captured image based on imaging data obtained by imaging with the imaging apparatus as a through-image (live view) for the user. The imaging data is an example of the first image data. The captured image is an example of the first image.
In a case in which a surface that serves as a reference for the position and the orientation of the virtual projection surface, such as a wall or a projection screen, is present in the physical space 70, the position and the orientation of the virtual projection surface can be relatively easily determined by using information on the surface. In addition, the position and the orientation of the virtual projection apparatus can be determined by obtaining and presenting the installable range of the virtual projection apparatus from the virtual projection surface and by allowing the user to designate the position within the installable range. Meanwhile, in a case in which there is no surface that serves as the reference for the position and the orientation of the virtual projection surface, it is difficult to determine the positions and the orientations of the virtual projection surface and the virtual projection apparatus. On the other hand, according to the present example, even in a case in which the surface that serves as the reference for the position and the orientation of the virtual projection surface is not present, it is possible to efficiently determine the disposition of the virtual projection apparatus and the virtual projection surface.
A lateral direction of the virtual projection surface 80 is defined as an SX-axis, a vertical direction of the virtual projection surface 80 is defined as an SY-axis, and a direction perpendicular to the virtual projection surface 80 is defined as an SZ-axis. In a case in which only the virtual projection surface installation position 81 of the virtual projection surface 80 in the physical space 70 is determined, how the SX-axis, the SY-axis, and the SZ-axis are to be set with respect to the X-axis, the Y-axis, and the Z-axis of the world coordinate system, respectively, is not yet determined; that is, the orientation of the virtual projection surface 80 is not determined.
A vertical direction of the virtual projection apparatus is defined as a PY-axis, a lateral direction of the virtual projection apparatus is defined as a PX-axis, and a front-rear direction (projection direction) of the virtual projection apparatus is defined as a PZ-axis. In a case in which the virtual projection surface installation position 81 and the orientation of the virtual projection surface 80 are determined, the orientation of the virtual projection apparatus can be determined by setting the PY-axis of the virtual projection apparatus to the same orientation as the SY-axis of the virtual projection surface 80 and setting the PZ-axis of the virtual projection apparatus to the same orientation as the SZ-axis of the virtual projection surface 80.
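A minimal sketch of this alignment, assuming the SY- and SZ-axes are available as world-space vectors; choosing the PX-axis so that the frame is right-handed is an illustrative convention not stated above:

```python
import numpy as np

def apparatus_rotation_from_surface(sy_axis: np.ndarray, sz_axis: np.ndarray) -> np.ndarray:
    """3x3 matrix whose columns are the PX-, PY-, and PZ-axes in world coordinates.

    The PY-axis is aligned with the SY-axis and the PZ-axis with the SZ-axis of
    the virtual projection surface; the PX-axis completes a right-handed frame.
    """
    py = sy_axis / np.linalg.norm(sy_axis)
    pz = sz_axis / np.linalg.norm(sz_axis)
    px = np.cross(py, pz)  # y cross z = x for a right-handed frame
    return np.column_stack([px, py, pz])

# Example: a surface whose SY-axis points along world +Y and SZ-axis along world +Z
# yields the identity rotation for the virtual projection apparatus.
print(apparatus_rotation_from_surface(np.array([0.0, 1.0, 0.0]),
                                      np.array([0.0, 0.0, 1.0])))
```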
In a case in which the orientation of the virtual projection apparatus is determined, a projection distance D from the virtual projection apparatus to the virtual projection surface 80 can be determined. The size (the lateral width and the vertical width) of the virtual projection surface 80 can be determined based on the projection distance D.
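As a rough sketch of how the size could follow from the projection distance D, assuming a simple symmetric angle-of-view model for the virtual projection apparatus (the function and parameter names are illustrative, not taken from the text):

```python
import math

def virtual_surface_size(projection_distance_m: float,
                         horizontal_fov_deg: float,
                         aspect_ratio: float = 16 / 9) -> tuple[float, float]:
    """Lateral and vertical width of the virtual projection surface.

    Assumes the projected image spans the full horizontal angle of view
    symmetrically about the projection direction (PZ-axis).
    """
    lateral = 2.0 * projection_distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
    vertical = lateral / aspect_ratio
    return lateral, vertical

# Example: at D = 3 m with a 30-degree horizontal angle of view.
print(virtual_surface_size(3.0, 30.0))  # approximately (1.61 m, 0.90 m)
```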
The image processing apparatus 50 displays the position designation image via the touch panel 51. The position designation image is an image in which an image of a position object P1 is superimposed on the captured image such that a virtual position object P1 (for example, a sphere) can be seen to exist at a position moved by a distance d1 from the camera position in the direction TZ in the physical space 70. In addition, the image processing apparatus 50 receives an operation of providing an instruction to change the distance d1 from the user.
As a result, for example, while viewing the position designation image displayed on the touch panel 51 of the image processing apparatus 50, the user directs the imaging apparatus of the image processing apparatus 50 toward the position to be designated in the physical space 70 and thereby adjusts the position and the orientation of the image processing apparatus 50 such that the position to be designated is located on a straight line connecting the camera position and the position object P1. In addition, the user operates the image processing apparatus 50 to adjust the distance d1 such that the position to be designated and the position object P1 coincide with each other in the physical space 70. The user then performs the instruction position determination operation with respect to the image processing apparatus 50 in a state in which the position to be designated and the position object P1 coincide with each other in the physical space 70.
In a case in which the image processing apparatus 50 receives the instruction position determination operation, the image processing apparatus 50 determines the position of the position object P1 at that point in time as the position designated by the user in the physical space 70. Accordingly, the user can designate any position in the physical space 70 as, for example, the virtual projection surface installation position 81 or the virtual projection apparatus installation position 91 to the image processing apparatus 50.
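A minimal sketch of this designation scheme, assuming the camera pose is available as a world-space position and a viewing direction (the names are illustrative):

```python
import numpy as np

def position_object_world(camera_position: np.ndarray,
                          camera_forward: np.ndarray,
                          d1: float) -> np.ndarray:
    """World-space position of the position object P1.

    P1 sits on the camera's viewing axis, d1 metres in front of the camera,
    so the user can steer it by aiming the device and adjusting d1.
    """
    forward = camera_forward / np.linalg.norm(camera_forward)
    return camera_position + d1 * forward

# Example: camera at the world origin looking along +Z, P1 placed 2.5 m ahead.
print(position_object_world(np.array([0.0, 0.0, 0.0]),
                            np.array([0.0, 0.0, 1.0]), 2.5))  # [0. 0. 2.5]
```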
The virtual projection surface installation position 81 is a first position corresponding to the position of the virtual projection surface 80 in the physical space 70. The reference point 111 is a second position that serves as a reference for the orientation of the virtual projection surface 80, and is a position that is not on a plane including the virtual projection surface 80, in the physical space 70. The reference point 111 is, for example, the virtual projection apparatus installation position 91. In this case, the image processing apparatus 50 needs only receive the position designated as the virtual projection apparatus installation position 91 as the reference point 111, and need not receive the designation of the reference point 111 separately from the position of the virtual projection apparatus installation position 91.
A plane passing through the virtual projection surface installation position 81 and through the reference point 111, which are designated by the user, and parallel to the direction of gravitational force (Y-axis) of the physical space 70 is set as an installation position plane. As the coordinate axes in the installation position plane, the same Y-axis as the Y-axis in the physical space 70 and an X′-axis perpendicular to the Y-axis are set. The Y-axis in the installation position plane is a vertical direction, and the X′-axis in the installation position plane is a horizontal direction.
The first angle Θ is an angle formed by a first line segment S1 connecting the reference point 111 (second position) and the virtual projection surface installation position 81 (first position) and a second line segment S2 passing through the reference point 111 and parallel to the X′-axis, in the installation position plane. That is, the first angle Θ is an angle formed by the first line segment S1 connecting the reference point 111 (second position) and the virtual projection surface installation position 81 (first position) and a plane that includes the reference point 111 and that is horizontal, in the physical space 70.
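One possible way to compute the first angle Θ from the two designated points, using the convention above that the Y-axis is the vertical (gravity) direction (a sketch; the helper name is illustrative):

```python
import math
import numpy as np

def first_angle_deg(reference_point: np.ndarray,
                    surface_position: np.ndarray) -> float:
    """Angle between the first line segment S1 (reference point -> virtual
    projection surface installation position) and the horizontal plane
    through the reference point, in degrees.

    The world Y-axis is taken as vertical, so the horizontal component of S1
    is its projection onto the X-Z plane (the X'-axis direction within the
    installation position plane).
    """
    s1 = surface_position - reference_point
    vertical = abs(s1[1])                  # component along the Y-axis
    horizontal = math.hypot(s1[0], s1[2])  # horizontal (X'-axis) component
    return math.degrees(math.atan2(vertical, horizontal))

# Example: a surface position 1 m above and 2 m in front of the reference point.
print(first_angle_deg(np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 1.0, 2.0])))  # ~26.6 degrees
```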
In the example of
In the example of
As shown in
In this case, as shown in
In this case, as shown in
First, the image processing apparatus 50 determines the virtual projection apparatus installation position 91, the virtual projection surface installation position 81, and the reference point 111 (step S11). For example, the image processing apparatus 50 receives designation of the virtual projection apparatus installation position 91, the virtual projection surface installation position 81, and the reference point 111 as shown in
Next, the image processing apparatus 50 calculates a positional relationship between the virtual projection surface installation position 81 and the reference point 111 determined in step S11 (step S12). For example, the image processing apparatus 50 calculates the first angle Θ shown in
Next, the image processing apparatus 50 determines the orientation of the virtual projection surface 80 based on the positional relationship between the virtual projection surface installation position 81 and the reference point 111 calculated in step S12 (step S13). For example, as shown in
Next, the image processing apparatus 50 determines the orientation of the virtual projection apparatus based on the orientation of the virtual projection surface 80 determined in step S13 (step S14). For example, as described in
Next, the image processing apparatus 50 calculates the projection distance D between the virtual projection apparatus and the virtual projection surface 80 based on the orientation of the virtual projection apparatus determined in step S14 (step S15). For example, as shown in
Next, the image processing apparatus 50 determines the size of the virtual projection surface 80 based on the projection distance D calculated in step S15 (step S16). For example, the image processing apparatus 50 determines the lateral width and the vertical width of the virtual projection surface 80 based on the specification (for example, the angle of view or the aspect ratio) of the projection apparatus 10 represented by the virtual projection apparatus and on the projection distance D.
By the processing so far, the position (virtual projection apparatus installation position 91) and the orientation of the virtual projection apparatus, and the position (virtual projection surface installation position 81), the orientation, and the size of the virtual projection surface 80 are determined. Next, the image processing apparatus 50 uses this information to superimpose a virtual projection apparatus image representing the virtual projection apparatus and a virtual projection surface image representing the virtual projection surface 80 on the captured image represented by the imaging data obtained by imaging performed by the image processing apparatus 50 in the physical space 70 (step S17).
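One way the superimposition in step S17 could be sketched, assuming the orientation of the virtual projection surface 80 is held as a rotation matrix whose columns are the SX-, SY-, and SZ-axes in world coordinates (the names and corner ordering are illustrative):

```python
import numpy as np

def virtual_surface_corners(surface_position: np.ndarray,
                            surface_rotation: np.ndarray,
                            lateral_width: float,
                            vertical_width: float) -> np.ndarray:
    """World-space corners of the virtual projection surface.

    surface_rotation is a 3x3 matrix whose columns are the SX-, SY-, and
    SZ-axes expressed in the world coordinate system; surface_position is
    taken here as the center of the surface (the installation position).
    """
    sx, sy = surface_rotation[:, 0], surface_rotation[:, 1]
    half_w, half_h = lateral_width / 2.0, vertical_width / 2.0
    offsets = [(-half_w, -half_h), (half_w, -half_h),
               (half_w, half_h), (-half_w, half_h)]
    return np.array([surface_position + u * sx + v * sy for u, v in offsets])

# These corners can then be projected through the imaging apparatus's camera
# model and drawn over the captured image to form the installation support image.
```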
Next, the image processing apparatus 50 displays the superimposition image obtained in step S17 on the touch panel 51 as the installation support image (step S18). Accordingly, the user can see the installation support image that virtually shows a state in which the projection apparatus 10 and the projection surface 11 are disposed at the position and the orientation determined based on the virtual projection apparatus installation position 91 and the virtual projection surface installation position 81 which are designated in step S11, in the physical space 70. The installation support image is an example of the second image. The installation support image data representing the installation support image is an example of the second image data.
In addition, the image processing apparatus 50 may re-execute steps S17 and S18 each time the position or the orientation of the image processing apparatus 50 in the physical space 70 is changed (that is, each time the captured image is changed). That is, the image processing apparatus 50 may update the superimposed virtual projection apparatus image and virtual projection surface image and the disposition thereof in the installation support image to be displayed in accordance with the changed imaging data.
For example,
The image processing apparatus 50 updates the virtual projection surface image superimposed on the captured image based on the changed orientation of the virtual projection surface 80 and on the size of the virtual projection surface 80 determined again, and displays the installation support image (second image) in which the virtual projection surface image is updated.
Although the processing in a case in which the user changes the orientation of the virtual projection surface 80 has been described, for example, in a case in which the image processing apparatus 50 receives an instruction operation of changing the position of the virtual projection surface 80 in the SZ direction from the user after the processing shown in
For example, in the example of
In this case, the image processing apparatus 50 changes the virtual projection surface installation position 81 determined in step S11 such that an end part (lower end) of the virtual projection surface 80 is in contact with the floor surface 201 based on the size of the virtual projection surface 80 determined in step S16, between step S16 and step S17 shown in
Accordingly, in a state in which the virtual projection surface 80 is translated downward as compared with a case in which the virtual projection surface installation position 81 is not changed, the virtual projection apparatus image and the virtual projection surface image are superimposed on the captured image in step S17. The determination of the position of the virtual projection surface 80 based on the detection of the surface serving as the reference for the position of the virtual projection surface 80 may be executed even in a case in which steps S17 and S18 are re-executed due to the change in the position or the orientation of the image processing apparatus 50 in the physical space 70, as described above.
Here, a case in which the virtual projection surface installation position 81 is changed such that the end part of the virtual projection surface 80 is in contact with the detected surface (floor surface 201) has been described. However, the image processing apparatus 50 may change the virtual projection surface installation position 81 such that the distance between the detected surface (floor surface 201) and the end part of the virtual projection surface 80 becomes a certain offset value. The offset value may be predetermined or may be designated by the user.
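A small sketch of this adjustment, taking the world Y coordinate as increasing upward and treating the installation position as the center of the surface (both are assumptions made for illustration):

```python
def snapped_surface_center_y(vertical_width: float,
                             floor_y: float,
                             offset: float = 0.0) -> float:
    """Y coordinate of the virtual projection surface installation position
    after the adjustment: the lower end of the surface sits `offset` above
    the detected floor surface (offset = 0 means in contact with the floor).
    """
    return floor_y + offset + vertical_width / 2.0

# Example: a 0.9 m tall surface snapped onto a floor at Y = 0 with no offset.
print(snapped_surface_center_y(0.9, 0.0))  # 0.45
```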
While a case in which the virtual projection apparatus installation position 91 is used as the reference point 111 that serves as the reference for the orientation of the virtual projection surface 80 has been described, the image processing apparatus 50 may provisionally determine the orientation of the virtual projection surface 80 by using the position (camera position) of the image processing apparatus 50 before determining the final orientation of the virtual projection surface 80 based on the virtual projection apparatus installation position 91, and may display the installation support image based on the provisionally determined orientation of the virtual projection surface 80.
In the example of
As a result, as shown in
The determination of the provisional orientation of the virtual projection surface 80 based on the camera position in a case in which the first angle Θ is smaller than the threshold value has been described, but the image processing apparatus 50 may also determine the provisional orientation of the virtual projection surface 80 based on the camera position in a case in which the first angle Θ is equal to or larger than the threshold value.
As described above, the image processing apparatus 50 may determine the orientation of the virtual projection surface 80 based on the positional relationship between the virtual projection surface installation position 81 and the reference point 111 and on the position of the image processing apparatus 50 (imaging apparatus).
As described above, the processor 61 of the image processing apparatus 50 acquires first image data obtained by imaging the physical space 70 with an imaging apparatus of the sensor 65. In addition, the processor 61 of the image processing apparatus 50 determines the virtual projection surface installation position 81 (first position) corresponding to the position of the virtual projection surface 80 and the reference point 111 (second position) that is not on the plane including the virtual projection surface 80 and that serves as the reference for the orientation of the virtual projection surface 80, in the physical space 70, and determines the orientation of the virtual projection surface 80 based on the positional relationship between the virtual projection surface installation position 81 and the reference point 111 and generates virtual projection surface data representing the virtual projection surface 80. The processor 61 of the image processing apparatus 50 generates second image data representing the second image in which the virtual projection surface 80 is displayed on the first image represented by the first image data, based on the first image data and the virtual projection surface data, and outputs the second image data to the touch panel 51 (output destination).
As a result, even in a case in which the surface that serves as the reference for the position and the orientation of the virtual projection surface, such as a wall or a projection screen, is not present in the physical space 70, the position and the orientation of the virtual projection surface 80 can be easily determined. Therefore, it is possible to efficiently determine the disposition of the virtual projection apparatus and the virtual projection surface 80, and thus it is possible to improve the convenience of the user regarding the disposition of the projection surface 11 and the projection apparatus 10.
Modification examples related to each embodiment will be described.
Although a case in which the image processing apparatus 50 is a tablet terminal having the touch panel 51 has been described, the image processing apparatus 50 is not limited to such a configuration. For example, the image processing apparatus 50 may be an information terminal, such as a smartphone or a personal computer.
Although the configuration in which the image processing apparatus 50 displays the second image using the touch panel 51 has been described, the image processing apparatus 50 may transmit the generated second image to another apparatus to perform control to display the second image on the other apparatus. In this case, the image processing apparatus 50 may be an apparatus that does not comprise a display device.
Although a case in which the captured image representing the physical space 70 is an image obtained by imaging using an imaging apparatus of the image processing apparatus 50 has been described, the captured image may be an image obtained by imaging using an apparatus different from the image processing apparatus 50 and received by the image processing apparatus 50 from the apparatus. In this case, the image processing apparatus 50 may be an apparatus that does not comprise an imaging apparatus.
While a case in which the reference point 111 is the virtual projection apparatus installation position 91 has been described, the reference point 111 is not limited to this, and may be a position of the imaging apparatus (image processing apparatus 50), a position of an observer who observes the virtual projection surface 80, or a combination of these positions. Since the position of the imaging apparatus (image processing apparatus 50) is, for example, the origin of the world coordinate system in a case in which the image processing apparatus 50 recognizes the physical space 70, it is not necessary to receive designation from the user. Regarding the position of the observer who observes the virtual projection surface 80, the image processing apparatus 50 receives designation from the user, for example, by the designation method shown in
The image processing method described in the above embodiment can be implemented by executing an image processing program prepared in advance on a computer. This image processing program is recorded in a computer-readable storage medium and is executed by being read from the storage medium by a computer. In addition, this image processing program may be provided in a form of being stored in a non-transitory storage medium, such as a flash memory, or may be provided via a network, such as the Internet. The computer that executes this image processing program may be included in an image processing apparatus, may be included in an electronic apparatus such as a smartphone, a tablet terminal, or a personal computer capable of communicating with the image processing apparatus, or may be included in a server apparatus capable of communicating with the image processing apparatus and the electronic apparatus.
The embodiments and the modification examples can be implemented in combination with each other.
At least the following matters are described in the present specification.
An image processing apparatus comprising a processor,
The image processing apparatus according to (1),
The image processing apparatus according to (1) or (2),
The image processing apparatus according to any one of (1) to (3),
The image processing apparatus according to (4),
The image processing apparatus according to any one of (1) to (5),
The image processing apparatus according to any one of (1) to (6),
The image processing apparatus according to any one of (1) to (6),
The image processing apparatus according to any one of (1) to (8),
The image processing apparatus according to (9),
The image processing apparatus according to any one of (1) to (10),
The image processing apparatus according to (11),
The image processing apparatus according to any one of (1) to (12),
The image processing apparatus according to any one of (1) to (13),
An image processing method executed by a processor included in an image processing apparatus, the image processing method comprising:
An image processing program for causing a processor included in an image processing apparatus to execute a process comprising:
Although various embodiments have been described above, it goes without saying that the present invention is not limited to these examples. It is apparent that those skilled in the art may perceive various modification examples or correction examples within the scope disclosed in the claims, and those examples are also understood as falling within the technical scope of the present invention. In addition, each constituent in the embodiment may be used in any combination without departing from the gist of the invention.
The present application is based on Japanese Patent Application (JP2022-131119) filed on Aug. 19, 2022, the content of which is incorporated in the present application by reference.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2022-131119 | Aug 2022 | JP | national |
This is a continuation of International Application No. PCT/JP2023/026850 filed on Jul. 21, 2023, and claims priority from Japanese Patent Application No. 2022-131119 filed on Aug. 19, 2022, the entire content of which is incorporated herein by reference.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2023/026850 | Jul 2023 | WO |
| Child | 19055725 | | US |