This application claims the priority benefit of China application serial no. 202311408914.7, filed on Oct. 27, 2023. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
The disclosure relates to a display technology, and particularly to a projection system, a projection device, and a control method thereof.
In current projector control methods, the user adjusts the projection image projected by the projector (for example, keystone correction) by manually operating the projector's remote control or the human-machine interface on the projector. The adjustment of the projection image of a traditional projector is therefore inconvenient.
The information disclosed in this Background section is only for enhancement of understanding of the background of the described technology and therefore it may contain information that does not form the prior art that is already known to a person of ordinary skill in the art. Further, the information disclosed in the Background section does not mean that one or more problems to be resolved by one or more embodiments of the disclosure was acknowledged by a person of ordinary skill in the art.
The disclosure provides a projection system, a projection device, and a control method thereof, which may achieve the function of correcting the projection image.
Other objects and advantages of the disclosure may be further understood from the technical features disclosed in the disclosure.
In order to achieve one or a portion of or all of the above objectives or other objectives, a control method of the projection device of the disclosure includes the following steps. A first original command is transmitted by a terminal device. An adjustment image is projected by the projection device in response to the first original command corresponding to an image correction operation of the projection device, wherein the adjustment image includes at least one pattern array and at least one adjustment reference point. A second original command is transmitted by the terminal device. A position of the at least one adjustment reference point of the adjustment image is adjusted by the projection device in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image, wherein the at least one adjustment reference point is located in the corresponding at least one pattern array.
In order to achieve one or a portion of or all of the above objectives or other objectives, a projection system of the disclosure includes a terminal device and a projection device. The terminal device is configured to transmit a first original command and a second original command. The projection device is coupled to the terminal device. The projection device is configured to project an adjustment image in response to the first original command corresponding to an image correction operation of the projection device. The adjustment image includes at least one pattern array and at least one adjustment reference point. The projection device is configured to adjust a position of the at least one adjustment reference point of the adjustment image in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image. The at least one adjustment reference point is located in the corresponding at least one pattern array.
In order to achieve one or a portion of or all of the above objectives or other objectives, a projection device of the disclosure includes a projection module, a communication interface, and a processor. The projection module is configured to project an adjustment image. The communication interface is configured to receive a first original command and a second original command. The processor is coupled to the projection module and the communication interface. The processor is configured to: control the projection module to project the adjustment image in response to the first original command corresponding to an image correction operation of the projection device, wherein the adjustment image includes at least one pattern array and at least one adjustment reference point; and control the projection module to adjust a position of the at least one adjustment reference point of the adjustment image in response to the second original command corresponding to an adjustment of the position of the at least one adjustment reference point of the adjustment image, wherein the at least one adjustment reference point is located in the corresponding at least one pattern array.
Based on the above, the projection system, the projection device, and the control method of the disclosure may instantly identify the commands input by the user and automatically generate corresponding commands to control the projection device to achieve the function of correcting the projection image.
Other objectives, features, and advantages of the present invention will be further understood from the technological features disclosed in the embodiments of the present invention, wherein preferred embodiments of this invention are shown and described, simply by way of illustration of modes best suited to carry out the invention.
It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected,” “coupled,” and “mounted,” and variations thereof herein are used broadly and encompass direct and indirect connections, couplings, and mountings.
In the embodiment, the terminal device 120 may have voice and/or text input functions. The terminal device 120 may be, for example, a smart phone, a tablet computer, a personal computer, a remote control of the projection device 110, or other smart portable devices or electronic devices with input functions. The projection device 110 (projector) may include, for example, a communication interface and a projection module. In the embodiment, the user may input original commands for controlling the projection device 110 through the terminal device 120, and the terminal device 120, the cloud server 130, and the natural language model 140 may identify the relevant original commands and generate standard commands for controlling the projection device 110, so as to achieve the function of controlling the projection device 110.
In the embodiment, the terminal device 120 includes, for example, at least one processor, a screen, and a communication interface. The processor is coupled to the screen and the communication interface. The at least one processor may include, for example, a central processing unit (CPU), a graphics processing unit (GPU), other programmable general-purpose or special-purpose microprocessors, a digital signal processor (DSP), a programmable controller, an application specific integrated circuit (ASIC), a programmable logic device (PLD), other similar processing devices, or a combination thereof. The terminal device 120 may also include a storage device (not shown). The storage device is, for example, any form of fixed or movable random access memory (RAM), a read-only memory (ROM), a flash memory, a hard disk, or other circuits or chips with similar functions, or a combination thereof. One or more application programs are stored in the storage device. After the application programs are installed on the terminal device 120, the application programs may be executed by the processor. In the embodiment, the screen is configured to display images, and the screen may be, for example, a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, etc. The terminal device 120 may further include a voice reception device, such as a microphone, and the voice reception device is coupled to the processor. The communication interface is, for example, a chip or circuit using wired and/or wireless communication technologies or mobile communication technology. The mobile communication technology is, for example, a global system for mobile communications (GSM), a third-generation mobile communication technology (3G), a fourth-generation mobile communication technology (4G), a fifth-generation mobile communication technology (5G), etc.
In the embodiment, the natural language model 140 may be, for example, a ChatBot. The ChatBot has a machine learning algorithm. The ChatBot may be, for example, any pre-trained ChatBot such as a chat generative pre-trained transformer (ChatGPT), Microsoft Bing, Google Bard, or an ERNIE Bot, or the natural language model 140 may be a dedicated ChatBot trained with data in a specific field. The natural language model 140 may be configured to perform natural language processing and understanding, dialogue management, speech recognition (speech-to-text), text-to-speech, etc. The natural language model 140 may identify multiple languages and multiple accents. In the embodiment, the natural language model 140 may be disposed in the cloud server 130 or a third-party cloud server. The cloud server 130 and the third-party cloud server may include at least one processor and a storage device. The storage device is, for example, configured to store a ChatBot with a machine learning algorithm, and the at least one processor is, for example, configured to execute the algorithm.
In step S210, the terminal device 120 may transmit the first original command. In step S220, the projection device 110 may project the adjustment image onto a projection target in response to the first original command corresponding to an image correction operation of the projection device 110. The adjustment image includes at least one pattern array and at least one adjustment reference point. For example, the user may input a voice command (the first original command) in the form of natural language such as “I want to correct the image” or “I want to adjust the tilted image” through the terminal device 120. The terminal device 120 or the cloud server 130 may then provide a corresponding standard command (a first standard command) or projector control code (a first projector control code) to the projection device 110, so that the projection device 110 may perform the image correction operation according to the corresponding standard command or projector control code to project the adjustment image onto the projection target.
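By way of illustration only, the following minimal Python sketch shows how a first standard command obtained from such a natural-language request might be dispatched so that the projection device projects the adjustment image. The command string "PROJECT_ADJUSTMENT_IMAGE", the ProjectionDevice class, and its method are hypothetical names introduced for this sketch and do not form part of the disclosure.

```python
# Hypothetical sketch: dispatching a first standard command so that the
# projection device projects the adjustment image (corresponding to step S220).

class ProjectionDevice:
    def project_adjustment_image(self):
        # In an actual device, this would drive the projection module to project
        # the adjustment image containing the pattern array(s) and the
        # adjustment reference point(s).
        print("Projecting adjustment image with pattern array and reference points")

def dispatch_first_standard_command(standard_command: str, device: ProjectionDevice) -> bool:
    """Return True when the standard command corresponds to the image correction operation."""
    if standard_command == "PROJECT_ADJUSTMENT_IMAGE":
        device.project_adjustment_image()
        return True
    return False

# A voice command such as "I want to correct the image" would first be converted
# (by the natural language model) into the assumed standard command below.
dispatch_first_standard_command("PROJECT_ADJUSTMENT_IMAGE", ProjectionDevice())
```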
In step S230, the terminal device 120 may transmit the second original command. In step S240, the projection device 110 may adjust a position of the at least one adjustment reference point of the adjustment image in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image. The at least one adjustment reference point is located in the corresponding at least one pattern array. In the embodiment, the at least one pattern array may include, for example, a grid pattern array, a dot pattern array, or other pattern arrays of any shape. The at least one adjustment reference point may be adjusted to any position on any pattern in the at least one pattern array, such as the center point or the vertex of the pattern. For example, the at least one adjustment reference point may be adjusted to any grid pattern of the at least one grid pattern array, or may be adjusted to any dot of the at least one dot pattern array.
For example, the pattern array is a grid pattern array, and each grid has a corresponding coordinate value. The at least one adjustment reference point is, for example, the four corner vertices of a quadrilateral adjustment image. The user may, for example, input a voice command (the second original command) in the form of natural language such as “I want to adjust the upper left corner of the adjustment image to coordinates (4,3)” through the terminal device 120. The terminal device 120 or the cloud server 130 may provide a corresponding standard command (a second standard command) or projector control code (a second projector control code) to the projection device 110, so that the projection device 110 may adjust the upper left corner of the adjustment image to the coordinates (4,3) according to the corresponding standard command or projector control code.
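As a minimal illustration of this kind of second command, the sketch below parses an assumed textual standard command into a corner name and target grid coordinates and updates the corresponding adjustment reference point. The command grammar ("MOVE_POINT corner=... x=... y=..."), the corner names, and the placeholder starting coordinates are assumptions made for this sketch; the disclosure only requires that the command correspond to adjusting the position of an adjustment reference point.

```python
import re

# Assumed current grid coordinates of the four corner reference points
# (illustrative placeholders only).
reference_points = {
    "upper_left": (0, 10),
    "upper_right": (10, 10),
    "lower_left": (0, 0),
    "lower_right": (10, 0),
}

def apply_move_point(command: str) -> None:
    """Parse a hypothetical MOVE_POINT standard command and move the reference point."""
    match = re.fullmatch(r"MOVE_POINT corner=(\w+) x=(-?\d+) y=(-?\d+)", command.strip())
    if match is None:
        raise ValueError(f"Not a reference-point adjustment command: {command!r}")
    corner, x, y = match.group(1), int(match.group(2)), int(match.group(3))
    reference_points[corner] = (x, y)  # move the reference point to the target grid cell

# Corresponds to "I want to adjust the upper left corner ... to coordinates (4,3)".
apply_move_point("MOVE_POINT corner=upper_left x=4 y=3")
print(reference_points["upper_left"])  # -> (4, 3)
```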
In steps S210 and S230, the terminal device 120 may transmit the first original command and the second original command to the cloud server 130. The cloud server 130 may receive the first original command and the second original command, and input a corresponding rule command, the first original command, and the second original command into the natural language model 140. The rule command may, for example, be a command configured to limit the natural language model 140 to output only commands that may be interpreted by the projection device 110 or converted into projector control codes of the projection device 110. For further related information regarding the rule command, reference may be made to U.S. application Ser. No. 18/784,932 (“PROJECTION SYSTEM, TERMINAL DEVICE, PROJECTION DEVICE AND CONTROL METHOD THEREOF”) filed on Jul. 26, 2024, the contents of which are hereby incorporated by reference. The natural language model 140 may respectively convert the first original command and the second original command into the first standard command and the second standard command according to the rule command, and transmit the first standard command and the second standard command to the cloud server 130. The cloud server 130 may receive the first standard command and the second standard command to control the projection device 110 to project and adjust the adjustment image. The cloud server 130 may further transmit the first standard command and the second standard command to the terminal device 120 to control the projection device 110 to project the adjustment image and adjust the position of the adjustment reference point of the adjustment image through the terminal device 120.
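For illustration only, the sketch below outlines one way a cloud server might combine such a rule command with the user's original command before querying the natural language model, so that the model replies only with commands the projection device can interpret. The rule text, the standard-command vocabulary, and the query_natural_language_model function are assumptions; the actual API call depends on the ChatBot service that is used.

```python
# Hypothetical sketch: constraining the natural language model with a rule
# command so that it outputs only interpretable standard commands.

RULE_COMMAND = (
    "You are a projector command converter. Reply with exactly one of the "
    "following standard commands and nothing else: PROJECT_ADJUSTMENT_IMAGE, "
    "MOVE_POINT corner=<name> x=<int> y=<int>, or UNSUPPORTED."
)

def query_natural_language_model(prompt: str) -> str:
    # Placeholder for the request to the pre-trained ChatBot service; the actual
    # call and endpoint depend on the natural language model that is used.
    raise NotImplementedError

def to_standard_command(original_command: str) -> str:
    prompt = f"{RULE_COMMAND}\nUser request: {original_command}"
    return query_natural_language_model(prompt).strip()

# e.g. to_standard_command("I want to correct the image") would be expected to
# return "PROJECT_ADJUSTMENT_IMAGE" under the assumed rule command above.
```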
In an embodiment, the projection device 110 may also be communicatively connected to the cloud server 130. The terminal device 120 may transmit the first original command and the second original command to the projection device 110, so that the projection device 110 transmits the first original command and the second original command to the cloud server 130. The cloud server 130 may transmit the first standard command and the second standard command to the projection device 110 to directly control the projection device 110 to project the adjustment image and adjust the position of the adjustment reference point of the adjustment image.
In another embodiment, the terminal device 120 or the cloud server 130 may further include a speech recognition model to convert the first original command and the second original command in the form of voice data into the first original command and the second original command in the form of text data, and then input the same (text data) into the natural language model 140. Alternatively, the natural language model 140 may also directly interpret the first original command and the second original command in the form of voice data.
In the embodiment, the first standard command and the second standard command may, for example, be commands that may be interpreted by the projection device 110. The cloud server 130 may transmit the first standard command and the second standard command to the terminal device 120, so that the terminal device 120 outputs the first standard command and the second standard command to the projection device 110 to control the projection device 110. Alternatively, the terminal device 120 may further convert the first standard command and the second standard command into the first projector control code and the second projector control code corresponding to the machine type of the projection device 110 according to projection device information of the projection device 110, and output the projector control code to the projection device 110. The projection device information may include, for example, the machine type or the model of the projection device 110. Alternatively, in an embodiment, the cloud server 130 may further convert the first standard command and the second standard command into the first projector control code and the second projector control code according to the projection device information of the projection device 110 to control the projection device 110. The projection device 110 may receive the first projector control code and the second projector control code from the cloud server 130. For further related information regarding the terminal device 120 obtaining the projection device information of the projection device 110, reference may be made to U.S. application Ser. No. 18/784,932 (“PROJECTION SYSTEM, TERMINAL DEVICE, PROJECTION DEVICE AND CONTROL METHOD THEREOF”) filed on Jul. 26, 2024, the contents of which are hereby incorporated by reference.
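A simple way to perform such a conversion is a lookup keyed by the projection device information and the standard command, as in the sketch below. The model names and control-code byte strings are invented placeholders; actual projector control codes are defined by the projector manufacturer for each machine type.

```python
# Hypothetical sketch: converting a standard command into a machine-specific
# projector control code according to the projection device information.

CONTROL_CODE_TABLE = {
    ("MODEL_A", "PROJECT_ADJUSTMENT_IMAGE"): b"\x02ADJ_IMG_ON\x03",   # placeholder bytes
    ("MODEL_B", "PROJECT_ADJUSTMENT_IMAGE"): b"\x02PATTERN 1\x03",    # placeholder bytes
}

def to_projector_control_code(standard_command: str, device_model: str) -> bytes:
    try:
        return CONTROL_CODE_TABLE[(device_model, standard_command)]
    except KeyError:
        raise ValueError(
            f"No control code for command {standard_command!r} on model {device_model!r}"
        )

# The terminal device (or cloud server) would then transmit the returned bytes
# to the projection device through its communication interface.
code = to_projector_control_code("PROJECT_ADJUSTMENT_IMAGE", "MODEL_A")
```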
In an embodiment, the terminal device 120 may receive the first projector control code and the second projector control code from the cloud server 130 and transmit the first projector control code and the second projector control code to the projection device 110. When the first original command and the second original command correspond to the operation of the projection device 110, the terminal device 120 may output the first projector control code and the second projector control code to the projection device 110 to drive the projection device 110 to perform the operations corresponding to the first projector control code and the second projector control code (such as projecting the adjustment image and adjusting the adjustment image). The first projector control code and the second projector control code respectively correspond to the first original command and the second original command. In an embodiment, if the original command does not correspond to the operation of the projection device 110, the natural language model 140 may generate feedback information according to the original command, and transmit the feedback information to at least one of the terminal device 120 and the projection device 110 through the cloud server 130 so as to inform the user that the projection device 110 cannot perform the operation. For example, the feedback information may be displayed on the screen of the terminal device 120 or projected on the projection target by the projection device 110 in the form of graphics and/or text, or the feedback information may be played in the form of audio through the speakers of the terminal device 120 or the projection device 110.
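The feedback path described above may be illustrated with the short sketch below, which assumes that an unsupported request is signalled by the hypothetical "UNSUPPORTED" standard command introduced in the earlier sketch; how the feedback is actually generated and worded depends on the natural language model.

```python
from typing import Optional

def handle_standard_command(standard_command: str) -> Optional[str]:
    """Return feedback text when the operation cannot be performed, otherwise None."""
    if standard_command == "UNSUPPORTED":
        return "The projection device cannot perform the requested operation."
    return None  # a supported command would instead be converted to a control code

feedback = handle_standard_command("UNSUPPORTED")
if feedback is not None:
    # The feedback may be shown on the terminal device's screen, projected as
    # graphics/text by the projection device, or played as audio through a speaker.
    print(feedback)
```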
The projection device 110 of the embodiment may adjust the projection image conveniently and quickly to achieve an accurate keystone correction effect. The data form and command content of the voice command may be determined based on the type of the natural language model 140. The adjustment image of the embodiment and the specific implementation of adjusting the adjustment image will be described in detail in the following embodiments.
In the embodiment, the user may first manually adjust the position of the projection device 110, so that a correction frame 300F of the projection target 300 (for example, the outer frame edge of the projection screen) may be located between the outer frame 401 and the inner frame 402.
In the embodiment, the user, for example, can adjust the positions of the adjustment reference points P1 to P4 through voice control (i.e., step S230), and the adjusted displacement amounts of the adjustment reference points P1 to P4 may be obtained by referring to the calculation of the following formula (3) to formula (6). In the following formula (3) to formula (6), DP1 to DP4 are the adjusted displacement amounts of the adjustment reference points P1 to P4 respectively. The unit of displacement amounts DP1 to DP4 is pixel distance. (X1,Y1), (X2,Y2), (X3,Y3), and (X4,Y4) are the target adjustment coordinates of the adjustment reference points P1 to P4 respectively, that is, based on the (second) original command provided by the user, the adjustment reference points P1 to P4 are moved to the target adjustment coordinates.
For example, assuming that the (second) original command output by the user is to adjust the adjustment reference point P1 to the target adjustment coordinate (8,2), then the displacement amount of the target adjustment coordinate is (80,−80) (the unit is pixel distance, and the positive and negative signs represent direction). Assuming that the (second) original command output by the user is to adjust the adjustment reference point P2 to the target adjustment coordinate (4,7), then the displacement amount of the target adjustment coordinate is (−60,−30). Assuming that the (second) original command output by the user is to adjust the adjustment reference point P3 to the target adjustment coordinate (6,3), then the displacement amount of the target adjustment coordinate is (60,30). Assuming that the (second) original command output by the user is to adjust the adjustment reference point P4 to the target adjustment coordinate (5,5), then the displacement amount of the target adjustment coordinate is (−50,50). In an embodiment, the projection device 110 may, for example, calculate the adjustment angle and the deformation amount required for adjustment of the adjustment image 400 projected by the projection device 110 according to the second original command. For example, the projection device 110 may calculate the slope changes of the adjustment reference points P1 to P4 according to the positions of the adjustment reference points P1 to P4 or the displacement amounts DP1 to DP4.
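The formulas themselves are not reproduced here. As an illustration only, the sketch below computes displacement amounts under the assumption that each displacement is the target grid coordinate minus the current grid coordinate of a reference point, scaled to pixel distance; the assumed 10-pixel grid cell and the assumed starting corner coordinates are illustrative values chosen so that the results reproduce the numbers in the example above.

```python
# Hypothetical sketch of the displacement calculation: displacement = (target
# grid coordinate - current grid coordinate) * pixels per grid cell.

PIXELS_PER_GRID_CELL = 10  # assumed pixel size of one grid cell (illustrative)

# Assumed current grid coordinates of the corner reference points (illustrative).
initial = {"P1": (0, 10), "P2": (10, 10), "P3": (0, 0), "P4": (10, 0)}
# Target adjustment coordinates taken from the example in the text.
targets = {"P1": (8, 2), "P2": (4, 7), "P3": (6, 3), "P4": (5, 5)}

def displacement(point: str) -> tuple:
    (x0, y0), (x1, y1) = initial[point], targets[point]
    return ((x1 - x0) * PIXELS_PER_GRID_CELL, (y1 - y0) * PIXELS_PER_GRID_CELL)

for p in ("P1", "P2", "P3", "P4"):
    print(p, displacement(p))
# -> P1 (80, -80), P2 (-60, -30), P3 (60, 30), P4 (-50, 50)
```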
The patterns, shapes, and numbers of the pattern arrays and the numbers of the adjustment reference points of the disclosure are not limited to those shown in the drawings.
In the embodiment, since the corrected adjustment image 500 is a non-tilted image, the pattern arrays 511 to 514 each have the shape of a rectangular grid. The four sides of the outer frame 501 of the adjusted adjustment image 500 may be respectively parallel to the four sides of the correction frame 300F. For example, the outer frame 501 may overlap the correction frame 300F; the correction frame 300F may be located between the outer frame 501 and the inner frame 502; or the correction frame 300F may at least partially overlap the outer frame 501 or the inner frame 502. In this way, the projection image of the projection device 110 may be corrected, and the projection image of the projection device 110 is located in the inner frame 502 of the corrected adjustment image 500 to effectively eliminate the problem of keystone distortion of the projection image.
In the embodiment, the projection module 612 may include, for example, light sources, light valves, projection lenses, and other related optical components and related circuit components thereof. The communication interface 613 may be connected to the cloud server 630. The projection device 610 and the cloud server 630 may communicate through wired and/or wireless communication methods. The wired communication method is, for example, a cable. The wireless communication method includes, for example, Wi-Fi, Bluetooth, and/or the Internet.
In the embodiment, the communication interface 613 is, for example, a chip or circuit using wired and/or wireless communication technologies or mobile communication technology. The mobile communication technology includes a global system for mobile communications, third/fourth/fifth generation mobile communication technologies, etc.
The projection device 610 may also have a built-in voice reception device (a microphone) to directly receive the first original command and the second original command in the form of voice data from the user. The at least one processor 611 may transmit the first original command and the second original command to the cloud server 630 through the communication interface 613. In an embodiment, the communication interface 613 may receive the first original command and the second original command from a terminal device (such as the terminal device 120 in the embodiment of
In the embodiment, the projection device 610 may communicate with the cloud server 630 through the communication interface 613 to provide the first original command and the second original command to the cloud server 630. The cloud server 630 may input the first original command and the second original command into the natural language model, and the natural language model may generate the corresponding first standard command and the second standard command. The cloud server 630 may directly provide the first standard command and the second standard command to the projection device 610. In the embodiment, the cloud server 630 may, for example, be connected to the natural language model through the Internet. In an embodiment, the natural language model can also be built into the cloud server 630.
In the embodiment, the at least one processor 611 may receive the first standard command and the second standard command through the communication interface 613. The at least one processor 611 may convert the first standard command and the second standard command into the first projector control code and the second projector control code according to the projection device information to control the projection module 612 and adjust the adjustment image. The projection device 610 of the embodiment may directly detect the user's voice signal and generate a corresponding voice original command. The projection device 610 of the embodiment may obtain the corresponding standard commands through the cloud server 630 and the natural language model, so as to achieve the function of effectively adjusting the adjustment image through voice.
For the specific implementation manner and related technical details of the original commands, standard commands, and projector control codes of the embodiment, please refer to the description of the foregoing embodiments.
To sum up, the projection system, the projection device, and the control method of the disclosure have at least one of the following advantages. The projection system, the projection device, and the control method of the disclosure may receive the user's natural language commands through the terminal device or the projection device, and may instantly identify the natural language commands input by the user and automatically generate corresponding standard commands, so as to enable the projection device to project the adjustment image and adjust the position of the adjustment reference point of the adjustment image, thereby achieving an effective projection image correction function.
The foregoing description of the preferred embodiments of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form or to the exemplary embodiments disclosed. Accordingly, the foregoing description should be regarded as illustrative rather than restrictive. Obviously, many modifications and variations will be apparent to practitioners skilled in this art. The embodiments are chosen and described in order to best explain the principles of the invention and its best mode of practical application, thereby enabling persons skilled in the art to understand the invention in its various embodiments and with various modifications as are suited to the particular use or implementation contemplated. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents, in which all terms are meant in their broadest reasonable sense unless otherwise indicated. Therefore, the term “the invention”, “the present invention”, or the like does not necessarily limit the claim scope to a specific embodiment, and the reference to particularly preferred exemplary embodiments of the invention does not imply a limitation on the invention, and no such limitation is to be inferred. The invention is limited only by the spirit and scope of the appended claims. Moreover, these claims may use terms such as “first”, “second”, etc., followed by a noun or element. Such terms should be understood as nomenclature and should not be construed as limiting the number of the elements modified by such nomenclature unless a specific number has been given. The abstract of the disclosure is provided to comply with the rules requiring an abstract, which will allow a searcher to quickly ascertain the subject matter of the technical disclosure of any patent issued from this disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. Any advantages and benefits described may not apply to all embodiments of the invention. It should be appreciated that variations may be made in the embodiments described by persons skilled in the art without departing from the scope of the present invention as defined by the following claims. Moreover, no element or component in the present disclosure is intended to be dedicated to the public, regardless of whether the element or component is explicitly recited in the following claims.