This application claims the priority benefit of Taiwan application serial no. 111147125, filed on Dec. 8, 2022. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to a projector configuration method, and in particular to a method for automatically detecting a projector configuration and a projection system.
Generally speaking, in a multi-projector setup, the projectors are mostly managed through applications. The grouping of each projector group must be configured manually; for example, in a projector list, the projectors that should belong to the same group are selected by hand. Only after a projector group is completed can multi-machine applications, such as a splicing application, be set up. Accordingly, the current method allows only manual grouping, and only one group can be set at a time, which is inefficient. In addition, manual grouping requires personnel to be on site where the projectors are installed; otherwise, the position configuration of the projectors cannot easily be set.
The information disclosed in this Background section is only for enhancement of understanding of the background of the described technology and therefore it may contain information that does not form the prior art that is already known to a person of ordinary skill in the art. Further, the information disclosed in the Background section does not mean that one or more problems to be resolved by one or more embodiments of the invention was acknowledged by a person of ordinary skill in the art.
The disclosure provides a method for automatically detecting a projector configuration and a projection system, which can automatically detect an actual configuration relationship of projectors.
Other objectives and advantages of the disclosure can be further understood from the technical features disclosed in the disclosure.
In order to achieve one, a part, or all of the above objectives or other objectives, a method for automatically detecting a projector configuration of the disclosure includes the following steps. Multiple projectors are searched for in a network domain. Each projector is correspondingly equipped with an imaging apparatus. The projectors are driven to project, and the imaging apparatuses are driven to capture images. The projectors are grouped based on one or more projected ranges included in an imaging range of the imaging apparatus of each projector. A configuration relationship of the projected ranges of the projectors in a same group is determined based on overlapping areas between the projected ranges of the projectors grouped in the same group.
The projection system of the disclosure includes multiple projectors, multiple imaging apparatuses, and a processor. The projectors are disposed in a network domain. Each projector is correspondingly equipped with an imaging apparatus. The processor is coupled to the projectors and the imaging apparatuses, and is configured to perform the following. The projectors are searched for in the network domain. The projectors are driven to project, and the imaging apparatuses are driven to capture images. The projectors are grouped based on one or more projected ranges included in an imaging range of the imaging apparatus of each projector. A configuration relationship of the projected ranges of the projectors in a same group is determined based on overlapping areas between the projected ranges of the projectors grouped in the same group.
Based on the above, through automatic group pairing, the disclosure can automatically achieve the grouping of the projectors at the remote end and automatically detect the configuration relationship thereof, which improves the operating experience of a user.
Other objectives, features and advantages of the invention will be further understood from the further technological features disclosed by the embodiments of the invention wherein there are shown and described preferred embodiments of this invention, simply by way of illustration of modes best suited to carry out the invention.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the invention. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected,” “coupled,” and “mounted,” and variations thereof herein are used broadly and encompass direct and indirect connections, couplings, and mountings.
The aforementioned and other technical contents, features, and effects of the disclosure will be clearly presented in the following detailed description of a preferred embodiment with reference to the drawings. Directional terms, such as up, down, left, right, front, and back, mentioned in the following embodiments are only directions with reference to the drawings. Therefore, the directional terms are used to illustrate but not to limit the disclosure.
The processor 110 is coupled to the projectors 120-1 to 120-N and the imaging apparatuses 130-1 to 130-N, and is used to control operations of the projectors 120-1 to 120-N and the imaging apparatuses 130-1 to 130-N. The processor 110 is, for example, a central processing unit (CPU), a physical processing unit (PPU), a programmable microprocessor, an embedded control chip, a digital signal processor (DSP), an application specific integrated circuit (ASIC), or other similar apparatuses.
In the embodiment, it is assumed that the projectors 120-1 to 120-N are disposed in the same network domain. After searching for the projectors 120-1 to 120-N located in the same network domain, the processor 110 can detect a configuration relationship thereof through a method for automatically detecting a projector configuration described later.
In an embodiment, a central control apparatus 100C (an electronic apparatus having computing and networking functions, such as a computer or a mobile device) may be disposed in the projection system 100, as shown in
The storage apparatus includes one or more code fragments. After the code fragments are installed, they are executed by the processor 110 to implement the method for automatically detecting the projector configuration described later. The storage apparatus may adopt any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, other similar apparatuses, or a combination of the apparatuses.
The communication component is used to connect to a network to search for the projectors 120-1 to 120-N in the same network domain. The communication component may be a chip or a circuit adopting local area network (LAN) technology, wireless LAN (WLAN) technology, or mobile communication technology. The local area network is, for example, Ethernet. The wireless local area network is, for example, Wi-Fi. The mobile communication technology is, for example, global system for mobile communications (GSM), third-generation (3G) mobile communication technology, fourth-generation (4G) mobile communication technology, fifth-generation (5G) mobile communication technology, etc.
In another embodiment, the processor 110 may also be disposed in any one of the projectors 120-1 to 120-N.
Next, in Step S210, the processor 110 drives the projectors 120-1 to 120-N to project, and drives the imaging apparatuses 130-1 to 130-N to capture images. Afterwards, in Step S215, the processor 110 groups the projectors 120-1 to 120-N based on one or more projected ranges included in the imaging range of each of the imaging apparatuses 130-1 to 130-N. In detail, the projectors 120-1 to 120-N are respectively disposed in different spaces or situations, and Step S215 finds the projectors used together in the same space or situation, defines them as a group, and divides the projectors in different spaces or situations into multiple groups; that is, the projectors are grouped. For example, a group of one or more projectors in a conference room A may be found, a group of one or more projectors in a conference room B may be found, and so on. Projector groups may also be used in different situations within the same space, for example, projector groups located at different booths in an exhibition hall.
Here, the processor 110 may drive one or more of the projectors 120-1 to 120-N to project recognition images at one time, and drive one or more of the imaging apparatuses 130-1 to 130-N to capture images. Subsequently, based on the obtained captured images, the respective projected ranges of the projectors 120-1 to 120-N may be calculated, and the projected ranges included in the imaging range of each of the imaging apparatuses 130-1 to 130-N may be obtained, so as to analyze which projector each projected range included in the imaging range belongs to. After obtaining the projected ranges included in the respective imaging ranges of the imaging apparatuses 130-1 to 130-N, the projectors 120-1 to 120-N may then be grouped according to whether the projected ranges intersect (overlap).
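As a minimal illustrative sketch (not part of the claimed method), the projected range of a driven projector may be recovered from a captured image by simple thresholding. The function name, the representation of the captured image as a 2D list of brightness values, and the threshold value are all assumptions for illustration:

```python
# Illustrative sketch: locate a projected range inside a captured image
# by finding the bounding box of pixels brighter than a threshold.
# "image" is assumed to be a 2D list of brightness values (0-255).

def projected_range(image, threshold=128):
    """Return (top, left, bottom, right) of the bright region, or None."""
    rows = [r for r, row in enumerate(image)
            if any(p >= threshold for p in row)]
    cols = [c for c in range(len(image[0]))
            if any(row[c] >= threshold for row in image)]
    if not rows or not cols:
        return None  # this imaging range contains no projected range
    return (rows[0], cols[0], rows[-1], cols[-1])
```

In practice the processor would also decode the pattern of the recognition image; the bounding box alone suffices here to illustrate obtaining a projected range from a captured image.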
Then, in Step S220, the processor 110 determines a configuration relationship of the projected ranges of the projectors in the same group based on overlapping areas between the projected ranges of the projectors grouped in the same group.
In detail, the key to the automatic grouping is to establish a correlation among the projectors 120-1 to 120-N. After finding all the projectors 120-1 to 120-N in the same network domain, the projectors 120-1 to 120-N are sequentially or simultaneously driven to project recognition images with the same or different patterns or identification information. For example, the recognition image includes at least one of a number, a letter, a color, a pattern, etc. The recognition images are projected onto a projection surface by the projectors 120-1 to 120-N, the captured images are obtained through the imaging apparatuses 130-1 to 130-N, and the processor 110 then identifies the captured images via image processing and identification, so as to find the correlation among the projectors 120-1 to 120-N. Hereinafter, another embodiment is used to describe time points (time periods) of driving the projectors 120-1 to 120-N and the imaging apparatuses 130-1 to 130-N in detail.
In the following embodiments, N=11 is used for illustration, but the disclosure is not limited thereto. That is, it is assumed that projectors 120-1 to 120-11 (N=11) are included in the same network domain, and imaging apparatuses 130-1 to 130-11 (N=11) are disposed corresponding to the projectors 120-1 to 120-11. The projector 120-1 is correspondingly equipped with the imaging apparatus 130-1, the projector 120-2 is correspondingly equipped with the imaging apparatus 130-2, . . . , and the projector 120-11 is correspondingly equipped with the imaging apparatus 130-11. Moreover, the imaging range of each of the imaging apparatuses 130-1 to 130-11 is greater than the projected range of the corresponding projector among the projectors 120-1 to 120-11. That is, the imaging range of the imaging apparatus 130-1 is greater than the projected range of the projector 120-1, the imaging range of the imaging apparatus 130-2 is greater than the projected range of the projector 120-2, . . . , and the imaging range of the imaging apparatus 130-11 is greater than the projected range of the projector 120-11.
In the first embodiment, since only one of the projectors 120-1 to 120-11 is driven to project each time, the processor 110 may clearly know which projector is the source of each projection. Accordingly, the projectors 120-1 to 120-11 may adopt the recognition image 30I having the same pattern.
Please refer to
Afterwards, the processor 110 may further determine the projected range included in each imaging range according to 11 groups of captured images obtained (each group includes 11 captured images), so as to determine the correlation among the projectors 120-1 to 120-11.
Specifically, taking
Each of the imaging ranges C301 to C311 at least includes the projected range of its own corresponding projector, and whether the projected ranges of other projectors are included may be further determined based on the above action.
Next, in response to an i-th imaging range of the i-th imaging apparatus including an i-th projected range and at least one other projected range, the processor 110 sets each projector whose projected range overlaps with the i-th projected range to have correlation with the i-th projector, and sets each projector whose projected range does not overlap with the i-th projected range to have no correlation with the i-th projector.
That is, for an imaging range that includes multiple projected ranges, the processor 110 determines whether the projected ranges included in the imaging range overlap with the projected range of the corresponding projector, so as to determine that the projectors whose projected ranges overlap have correlation, and that the projectors whose projected ranges do not overlap have no correlation. For example, assuming that the imaging range C301 includes the projected ranges P201 and P202, if the projected range P201 overlaps with the projected range P202, it is determined that the projector 120-1 has correlation with the projector 120-2. If the projected range P201 does not overlap with the projected range P202, it is determined that the projector 120-1 has no correlation with the projector 120-2.
In addition, it is assumed that the imaging range C302 includes the projected ranges P201 to P203, and the projected range P202 overlaps with each of the projected range P201 and the projected range P203, but the projected range P201 does not overlap with the projected range P203. In this case, it is determined that the projector 120-2 has correlation (direct correlation) with each of the projectors 120-1 and 120-3. Moreover, although the projected range P201 does not overlap with the projected range P203, since the projector 120-1 has correlation with the projector 120-2, and the projector 120-2 has correlation with the projector 120-3, the projector 120-1 is determined to have indirect correlation with the projector 120-3 via the projector 120-2.
Then, the processor 110 performs grouping according to the correlation among the projectors 120-1 to 120-11. For example, the projectors having direct or indirect correlation are set to the same group.
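The grouping described above amounts to finding connected components of a correlation graph whose edges are overlapping projected ranges; direct correlation gives an edge, and indirect correlation follows from connectivity. A minimal sketch, assuming the projected ranges have already been extracted as axis-aligned rectangles (left, top, right, bottom) in a common coordinate system (the representation and function names are hypothetical):

```python
# Illustrative sketch: group projectors by direct/indirect correlation.
# Rectangles are (left, top, right, bottom); overlapping rectangles give
# their projectors a direct correlation, and each group is a connected
# component of the resulting correlation graph (union-find).

def overlaps(a, b):
    """True if two axis-aligned rectangles overlap."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def group_projectors(ranges):
    """ranges: dict projector_id -> rectangle. Returns a list of groups."""
    ids = list(ranges)
    parent = {i: i for i in ids}

    def find(i):  # union-find root lookup with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for a in ids:  # union every directly correlated pair
        for b in ids:
            if a < b and overlaps(ranges[a], ranges[b]):
                parent[find(a)] = find(b)

    groups = {}
    for i in ids:  # collect connected components
        groups.setdefault(find(i), set()).add(i)
    return list(groups.values())
```

Here a projector whose range overlaps only a middle projector still lands in the same group as the far one, which matches the indirect-correlation example of the projectors 120-1, 120-2, and 120-3 above.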
In the second embodiment shown in
Specifically, first, as shown in
Afterwards, the processor 110 determines the projected range of the corresponding projector based on the first captured image captured by each of the imaging apparatuses 130-1 to 130-11. Moreover, the processor 110 determines whether the imaging range includes the projected range of at least one of the other 10 projectors except the corresponding projector based on the second captured image captured by each of the imaging apparatuses 130-1 to 130-11 (that is, based on the second captured image captured by the j-th imaging apparatus). That is, the projected range included in each imaging range is determined, so as to determine the correlation among the projectors 120-1 to 120-11.
Afterwards, in response to the j-th imaging range including the projected range of at least one of the other 10 projectors, the processor 110, based on the first captured image and the second captured image captured by the j-th imaging apparatus, sets each projector whose projected range overlaps with a j-th projected range of the j-th projector to have correlation with the j-th projector, and sets each projector whose projected range does not overlap with the j-th projected range to have no correlation with the j-th projector.
For example, in
In the third embodiment shown in
Afterwards, the processor 110 respectively determines whether a k-th imaging range of a k-th imaging apparatus among the 11 imaging apparatuses 130-1 to 130-11 includes a k-th projected range of a k-th projector and the projected ranges of other projectors based on the 11 captured images, where k=1, 2, . . . , N. That is, each captured image obtained by the imaging apparatuses 130-1 to 130-11 is analyzed to determine which projected ranges are included in each imaging range. Afterwards, the projectors whose projected ranges overlap are set to have correlation, and the projectors whose projected ranges do not overlap are set to have no correlation. Then, the processor 110 performs grouping based on the correlation among the 11 projectors 120-1 to 120-11. For example, the projectors having direct or indirect correlation are set to the same group.
The driving timings of the projectors and the imaging apparatuses in the first to third embodiments are only examples for illustration, and the disclosure is not limited thereto.
After grouping ends, the processor 110 further determines the configuration relationship of the projected ranges of the projectors in each group. The configuration relationship includes one of a horizontal splicing configuration, a vertical splicing configuration, a stacking configuration, and a matrix splicing configuration.
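As an illustrative sketch of how the four configurations might be told apart, the centers of the projected ranges in one group can be quantized into rows and columns: one row and one column suggests a stacking configuration, one row with several columns a horizontal splicing configuration, one column with several rows a vertical splicing configuration, and several of both a matrix splicing configuration. The rectangle representation, tolerance value, and function name are assumptions, and the sketch assumes a group of at least two projectors:

```python
# Illustrative sketch: classify one group's configuration relationship
# from the centers of its projected ranges (left, top, right, bottom).

def classify(ranges, tol=0.2):
    """ranges: list of rectangles of one group (two or more projectors)."""
    centers = [((l + r) / 2, (t + b) / 2) for (l, t, r, b) in ranges]
    w = max(r - l for (l, t, r, b) in ranges)  # widest projected range
    h = max(b - t for (l, t, r, b) in ranges)  # tallest projected range
    # Quantize centers into columns/rows; centers within ~tol*size merge.
    xs = {round(x / (w * tol)) for x, _ in centers}
    ys = {round(y / (h * tol)) for _, y in centers}
    if len(xs) == 1 and len(ys) == 1:
        return "stacking"
    if len(ys) == 1:
        return "horizontal splicing"
    if len(xs) == 1:
        return "vertical splicing"
    return "matrix splicing"
```

The quantization step is one possible heuristic; an implementation could equally compare overlap offsets between neighboring projected ranges.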
The technical contents of the determination of correlation and of the recognition images (used for determining the configuration relationship) will each be further explained as follows.
First, the determination of correlation will be described with different situations of FIG. 7A,
After obtaining the corresponding captured images via the driving sequence of the projectors and the imaging apparatuses of
As shown in
As shown in
As shown in
Next, the recognition images (configuration relationship) will be described with reference to
A recognition image 800 of
A recognition image 810 of
In an embodiment, whether the recognition images projected by the projectors 120-1 to 120-11 are the same may be decided based on the driving sequence of the projectors 120-1 to 120-11 and the imaging apparatuses 130-1 to 130-11. For example, if the driving sequence of “separate projection and simultaneous or separate imaging” in the first embodiment is adopted, the recognition images projected by the projectors 120-1 to 120-11 may be the same. In addition, if the driving sequence of “simultaneous projection and separate or simultaneous imaging” in the second embodiment is adopted, the recognition images projected by the projectors 120-1 to 120-11 need to be different in order to determine the correlation among different projected ranges in one captured image.
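This rule can be summarized in a small sketch; the mode names are assumed labels for the embodiments' driving sequences, not terms from the disclosure:

```python
# Illustrative sketch: whether the projectors may share one recognition
# pattern depends on the projection driving sequence.

def patterns_must_differ(projection_mode):
    """'separate' projection isolates the source, so one shared pattern
    suffices; 'simultaneous' projection needs per-projector patterns so
    that the ranges in one captured image can be told apart."""
    if projection_mode == "separate":
        return False
    if projection_mode == "simultaneous":
        return True
    raise ValueError(projection_mode)
```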
The following further explains how to determine the configuration relationship of the projected ranges. The action of grouping the projectors is similar to the action of determining the configuration relationship of the projected ranges and may be achieved through driving sequence manners such as “separate projection and simultaneous imaging” or “simultaneous projection and separate imaging”.
In an embodiment, the processor 110 may sequentially drive an m-th (m=1, 2, . . . , M) projector among M projectors included in the same group to project a reference image, and drive M imaging apparatuses included in the same group to simultaneously capture images in the case where the m-th projector projects the reference image. Then, the configuration relationship between the m-th projector and the M−1 other projectors is determined based on imaging results of the M imaging apparatuses. Here, the reference image is also similar to the recognition images 800 and 810 shown in
Based on the captured images 911, 914, and 917 and identification patterns of the reference images 901 and 902, the projected ranges P201 and P202 of the projector 120-1 and the projector 120-2 may be confirmed to have a configuration relationship 91 shown in
In addition, in other embodiments, the processor 110 drives all the projectors included in the same group to respectively project multiple reference images having different patterns. In the case where the projectors all project one corresponding reference image, all the imaging apparatuses included in the same group are driven to respectively capture images, so as to obtain multiple captured images (corresponding to the number of the imaging apparatuses, that is, one imaging apparatus obtains one captured image). Afterwards, based on the captured images, the configuration relationship among all the projectors included in the same group is determined.
In addition, in other embodiments, the processor 110 may also complete the grouping and the confirmation of the configuration relationship at the same time. That is, all the projectors 120-1 to 120-11 are driven to respectively project recognition images having different patterns, and each of the imaging apparatuses 130-1 to 130-11 is driven to capture an image, so as to obtain 11 captured images. Here, 11 recognition images are used for the projectors 120-1 to 120-11 to respectively project. The recognition image projected by each projector includes a pattern or identification information related to itself, so that the processor 110 may identify the source of each projected range in the captured image. After confirming the grouping of the projectors 120-1 to 120-11, the processor 110 further identifies one or more of the patterns included in overlapping areas between the projected ranges of all the projectors in the same group based on the 11 captured images, so as to obtain identification results. Based on the identification results, the configuration relationship among all the projectors in the same group is determined.
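A minimal sketch of identifying the source of each projected range when every projector projects a unique pattern; here each projector's identification information is assumed, purely for illustration, to be a distinct pixel value that can be read back from one captured image:

```python
# Illustrative sketch: with unique per-projector patterns, a single
# captured image reveals which projector produced which projected range.
# Each projector's pattern is assumed to be a distinct pixel value.

def identify_ranges(image, id_values):
    """image: 2D list of pixel values; id_values: projector id -> value.
    Returns projector id -> (top, left, bottom, right) bounding box."""
    found = {}
    for pid, value in id_values.items():
        cells = [(r, c) for r, row in enumerate(image)
                 for c, p in enumerate(row) if p == value]
        if cells:  # this projector's pattern appears in the image
            rs = [r for r, _ in cells]
            cs = [c for _, c in cells]
            found[pid] = (min(rs), min(cs), max(rs), max(cs))
    return found
```

A real system would decode structured patterns rather than raw pixel values, since overlapping projected ranges blend on the projection surface; the sketch only illustrates mapping ranges back to their source projectors.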
Below is an example to illustrate how to complete the grouping and the confirmation of the configuration relationship at the same time.
The projectors 120-1 to 120-11 respectively project recognition images 1101 to 1111 having different patterns, as shown in
In
In summary, the disclosure can automatically detect the grouping of the projectors and the configuration relationship of the projected ranges. The disclosure may be implemented through an application, which may be performed automatically with one button, so as to save multiple operation actions, thereby improving the operating experience of the user. Moreover, the disclosure may be applied to remote control: the operation of each projector and its imaging apparatus is remotely driven through the central control apparatus, and the captured image obtained by each imaging apparatus is received, so that computations such as image processing and recognition are performed on the captured images to achieve automatic grouping and establishment of the configuration relationship. In addition, the disclosure may also be applied to an on-screen display (OSD). The processor is disposed in one of the projectors, and the projector may issue a command to drive the operation of each projector and its imaging apparatus, so as to automatically detect the grouping and the configuration relationship of the projectors.
The foregoing description of the preferred embodiments of the disclosure has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form or to exemplary embodiments disclosed. Accordingly, the foregoing description should be regarded as illustrative rather than restrictive. Obviously, many modifications and variations will be apparent to practitioners skilled in this art. The embodiments are chosen and described in order to best explain the principles of the disclosure and its best mode practical application, thereby to enable persons skilled in the art to understand the disclosure for various embodiments and with various modifications as are suited to the particular use or implementation contemplated. It is intended that the scope of the disclosure be defined by the claims appended hereto and their equivalents in which all terms are meant in their broadest reasonable sense unless otherwise indicated. Therefore, the term “the disclosure”, “the present disclosure” or the like does not necessarily limit the claim scope to a specific embodiment, and the reference to particularly preferred exemplary embodiments of the disclosure does not imply a limitation on the disclosure, and no such limitation is to be inferred. The disclosure is limited only by the spirit and scope of the appended claims. Moreover, these claims may refer to use “first”, “second”, etc. following with noun or element. Such terms should be understood as a nomenclature and should not be construed as giving the limitation on the number of the elements modified by such nomenclature unless specific number has been given. The abstract of the disclosure is provided to comply with the rules requiring an abstract, which will allow a searcher to quickly ascertain the subject matter of the technical disclosure of any patent issued from this disclosure. 
It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Any advantages and benefits described may not apply to all embodiments of the disclosure. It should be appreciated that variations may be made in the embodiments described by persons skilled in the art without departing from the scope of the disclosure as defined by the following claims. Moreover, no element and component in the disclosure is intended to be dedicated to the public regardless of whether the element or component is explicitly recited in the following claims.
Number | Date | Country | Kind
---|---|---|---
111147125 | Dec 2022 | TW | national