This patent application is based on and claims priority pursuant to 35 U.S.C. §119(a) to Japanese Patent Application Nos. 2015-179860 and 2016-151509, filed on Sep. 11, 2015, and Aug. 1, 2016, respectively, in the Japan Patent Office, the entire disclosures of which are hereby incorporated by reference herein.
Technical Field
Embodiments of the present invention relate to a video display system, an image display control method, and a recording medium storing an image display control program.
Background Art
In recent years, video display systems in which a video display device such as a projector or a display is connected to a plurality of information processing devices such as tablet personal computers (PCs), personal computers (PCs), and smartphones through a network are known in the art.
In such a video display system, a host of the video display system selects one of a plurality of information processing devices as an information processing device from which moving images are to be projected. Then, the selected information processing device sends video signals to the video display device, and the video display device displays the moving images based on the received video signals.
Embodiments of the present invention described herein provide a video display system, an image display control method, and a recording medium storing an image display control program. Each of the video display system, the image display control method, and the recording medium storing the image display control program includes obtaining a facial image of a user who uses at least one of a plurality of information processing devices, comparing the obtained facial image with a plurality of facial images each of which is associated with a corresponding one of the plurality of information processing devices, the facial images being stored in advance as association information, selecting one of the plurality of information processing devices associated with the obtained facial image, based on a result of comparison between the obtained facial image and the plurality of facial images, obtaining a video signal from the selected information processing device, and displaying a video based on the obtained video signal.
A more complete appreciation of exemplary embodiments and the many attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings.
The accompanying drawings are intended to depict exemplary embodiments of the present disclosure and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “includes” and/or “including”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In describing example embodiments shown in the drawings, specific terminology is employed for the sake of clarity. However, the present disclosure is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have the same structure, operate in a similar manner, and achieve a similar result.
In the following description, illustrative embodiments will be described with reference to acts and symbolic representations of operations (e.g., in the form of flowcharts) that may be implemented as program modules or functional processes including routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types and may be implemented using existing hardware at existing network elements or control nodes. Such existing hardware may include one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computers or the like. These terms in general may be collectively referred to as processors.
Unless specifically stated otherwise, or as is apparent from the discussion, terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical, electronic quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Embodiments of the present invention are described below in detail with reference to the drawings. In the present embodiment, a video display system is described in which a video display device such as a projector or a display is connected to a plurality of information processing devices such as tablet personal computers (PCs), personal computers (PCs), and smartphones through a network.
In the present embodiment, the description is given under the following assumptions. In a conference or lecture, the host of a video display system selects one of a plurality of information processing devices as an information processing device from which moving images are to be projected. Then, the selected information processing device sends video signals to a video display device, and the video display device displays the moving images based on the received video signals.
In such a configuration, the host (manager) of the video display system has to use the information processing device of his/her own to select an information processing device from which moving images are to be projected, from a list of information processing devices of participants.
In other words, in conventional video display systems, when an information processing device from which moving images are to be projected is to be selected, the host has to seek a relevant information processing device from a list of a plurality of information processing devices.
In order to deal with such a situation, firstly, the video display system according to the present embodiment associates the facial images of participants with the information processing devices that the participants operate, and produces a list in advance.
Then, in the video display system according to the present embodiment, the host uses the information processing device of his/her own to capture the facial image of the participant who operates the information processing device from which moving images are to be projected. Accordingly, the information processing device that is associated with the captured facial image is selected from the above-produced list. Then, the video display device in the video display system according to the present embodiment displays moving images based on the video signals sent from the information processing device as selected above.
Due to this configuration of the video display system according to the present embodiment, when the information processing device from which moving images are to be projected is to be selected, the only thing that the host has to do is to capture the facial image of the participant who operates the target information processing device. In other words, the host no longer has to seek a target information processing device on his/her own from the list of a plurality of information processing devices. Accordingly, the video display system according to the present embodiment can improve customer convenience mainly on the host side.
As illustrated in
In the following description, it is assumed that the client communication terminal 2a is operated by a host and the client communication terminals 2b to 2z are operated by participants. When it is not necessary to distinguish the client communication terminals 2a to 2z from each other in the following description, the client communication terminals 2a to 2z are collectively referred to as the client communication terminals 2.
The number of the client communication terminals 2 that are connected to the network 4 is not limited. Any greater number of client communication terminals may be connected to the network 4 in a large-scale system.
The projector 1 modulates the laser-beam bundles emitted from a light source according to the input video signals to form an optical image. Then, the projector 1 magnifies the formed projection image and projects it onto a projection plane such as a wall or a screen. In the present embodiment, moving images are projected according to the video signals sent from the client communication terminals 2. In other words, in the present embodiment, the projector 1 serves as a video display unit.
The client communication terminal 2 is an information processing terminal operated by a host or a participant, and is implemented by an information processing device such as a personal computer (PC), a personal digital assistant (PDA), a smartphone, or a tablet PC. In the present embodiment, when the client communication terminal 2 is selected as a device from which moving images are to be projected, the client communication terminal 2 captures the moving images being displayed, and sends the video signals of the captured images to the projector 1.
The network 4 is a limited network such as a local area network (LAN) at an office. The network 4 is implemented, for example, by a network using the Ethernet (registered trademark), a universal serial bus (USB), Bluetooth (registered trademark), wireless fidelity (Wi-Fi) (registered trademark), FeliCa (registered trademark), peripheral component interconnect express (PCIe), a video graphics array (VGA), a digital visual interface (DVI), or an interface compliant with an Institute of Electrical and Electronics Engineers (IEEE) standard.
The DHCP server 3 automatically issues information such as an Internet protocol (IP) address to a computer that is temporarily connected to the network such as the Internet. In other words, in the present embodiment, the DHCP server 3 serves as a location identification information manager.
In the video display system as configured above, the client communication terminals 2b to 2z and the facial images of the participants who operate these communication terminals 2b to 2z are associated with each other and a list is produced in advance. Then, the client communication terminal 2 that is associated with the facial image of a participant captured by the client communication terminal 2a of the host is automatically selected from the list.
Due to this configuration of the video display system according to the present embodiment, when the client communication terminal 2 from which a captured image is to be projected is to be selected, the only thing that the host has to do is to capture the facial image of the participant who operates the target client communication terminal 2. In other words, the host no longer has to seek a target client communication terminal 2 on his/her own from the list of a plurality of client communication terminals 2. Accordingly, the video display system according to the present embodiment can improve customer convenience.
As illustrated in
The CPU 10 serves as a computation unit, and controls the entire operation of the projector 1. The RAM 11 is a volatile storage medium capable of reading and writing data at high speed, and is used as a working area when the CPU 10 processes data. The ROM 12 is a read-only nonvolatile storage medium in which programs such as firmware are stored.
The HDD 13 is a data readable/writable nonvolatile memory in which various kinds of data such as image data, an operating system (OS), various kinds of control programs, or various kinds of programs such as an application program are stored.
The projection device 14 is hardware that implements specific functions in the projector 1. More specifically, the projection device 14 modulates the laser-beam bundles emitted from a light source to form an optical image, and magnifies the formed projection image and projects it onto a projection plane such as a wall or a screen. Note that each of the client communication terminals 2 does not have to be provided with the projection device 14 as the client communication terminals 2 are information processing terminals that are operated by a host or a participant.
The control device 15 is a user interface used to input data to the projector 1, and is implemented by an input device such as a keyboard, a mouse, an input key, and a touch panel.
The display 16 is a user interface that allows a user to visually monitor the status of the projector 1, and is implemented by a display device such as a liquid crystal display (LCD) and an output device such as a light-emitting diode (LED).
The imaging device 18 is a solid-state image sensing device such as a charge-coupled device (CCD) and a complementary metal oxide semiconductor (CMOS), and captures images around the projector 1 and converts the captured images into electrical signals.
The network interface 19 is an interface used to enable the projector 1 to communicate with other devices through the network, and the Ethernet (registered trademark), a universal serial bus (USB), Bluetooth (registered trademark), Wi-Fi (registered trademark), FeliCa (registered trademark), PCIe, an interface compliant with an IEEE standard, or the like is used as the network interface 19.
In such a hardware configuration, programs stored in a storage medium such as the ROM 12 and the HDD 13 are loaded into the RAM 11, and the CPU 10 performs computation according to these programs loaded onto the RAM 11. This series of processes configures a software controller.
The software controller as configured above and hardware are combined to configure a functional block that implements the functions of the projector 1 and the client communication terminals 2 according to the present embodiment.
Next, a functional configuration of the projector 1 according to the present embodiment is described with reference to
As illustrated in
The operation key 120 is an input interface used by a user to directly operate the projector 1 and to input data to the projector 1. The operation key 120 is implemented by the control device 15 illustrated in
The display panel 130 is an output interface on which the status of the projector 1 is visually displayed, and also is an input interface such as a touch panel used by a user to directly operate the projector 1 or to input data to the projector 1. Moreover, the display panel 130 may display an image to accept a user operation. The display panel 130 is implemented by the control device 15 and the display 16 illustrated in
The network interface 140 is an interface used to enable the projector 1 to communicate with other devices such as the client communication terminals 2 through the network, and the Ethernet (registered trademark), a universal serial bus (USB), Bluetooth (registered trademark), Wi-Fi (registered trademark), FeliCa (registered trademark), PCIe, an interface compliant with an IEEE standard, or the like is used as the network interface 140. The network interface 140 is implemented by the communication interface 17 illustrated in
The projection mechanism 150 is an output interface that modulates the laser-beam bundles emitted from a light source to form an optical image, magnifies the formed projection image, and projects the magnified projection image onto a projection plane such as a wall or a screen. The projection mechanism 150 is implemented by the projection device 14 illustrated in
The capturing mechanism 160 is a solid-state image sensing device such as a CCD and a CMOS, and also is an input interface that captures images around the projector 1 and converts the captured images into electrical signals. The capturing mechanism 160 is implemented by the imaging device 18 illustrated in
The controller 110 is configured by a combination of software and hardware. More specifically, a program that is stored in a storage medium such as the ROM 12 and the HDD 13 is loaded into the RAM 11 by the CPU 10, and the controller 110 is configured by a combination of hardware such as an integrated circuit and a software controller configured by the computation performed by the CPU 10 according to the program.
The main controller 100 controls each element of the controller 110, and gives a command to each element of the controller 110. The main controller 100 controls the input and output controller 102, and accesses other devices through the network interface 140.
The operation display controller 101 controls the display panel 130 to display an image under the control of the main controller 100, or inputs data, a signal, or a command, which are input through the operation key 120 or the display panel 130, to the main controller 100. Then, the main controller 100 gives a command to each element of the controller 110 according to the data, signal, or the command input by the operation display controller 101.
The input and output controller 102 sends data, a signal, or a command to other devices through the network interface 140 under the control of the main controller 100, or inputs the data, signal, or the command, which are input through the network interface 140, to the main controller 100. Then, the main controller 100 gives a command to each element of the controller 110 according to the data, signal, or the command input by the input and output controller 102.
The projection controller 103 serves as a video display controller that controls or drives the projection mechanism 150 under the control of the main controller 100 to control the moving images that are projected by the projector 1. The capturing controller 104 controls or drives the capturing mechanism 160 under the control of the main controller 100.
The user manager 105 manages a user list in which the facial images of the users of the video display system are associated with the user IDs of the users.
The terminal manager 106 manages a terminal list in which the IP addresses of the client communication terminals 2 are associated with the user IDs of the users who operate these client communication terminals 2.
The facial image comparator 107 compares the facial image of a participant who operates the client communication terminal 2 from which a captured image is to be projected, which is sent from the client communication terminal 2a that the host operates, with the facial images registered in the user list managed by the user manager 105. Then, based on the result of the comparison, the facial image comparator 107 selects the user ID that is associated with the best-matching facial image from the user list.
The projection selector 108 selects, from the terminal list managed by the terminal manager 106, the client communication terminal 2 from which a captured image is to be projected, based on the user ID selected by the facial image comparator 107.
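The roles of the user manager 105, the terminal manager 106, the facial image comparator 107, and the projection selector 108 described above can be sketched as follows. This is a minimal illustrative sketch under stated assumptions, not the claimed implementation: all class and function names are hypothetical, and `face_similarity` is a placeholder for any facial-recognition comparison method.

```python
def face_similarity(face_a, face_b):
    # Hypothetical placeholder: return a score in [0.0, 1.0].
    # A real system would apply a facial-recognition algorithm here.
    return 1.0 if face_a == face_b else 0.0

class UserManager:
    """Sketch of the user list: user ID -> facial image (cf. user manager 105)."""
    def __init__(self):
        self.user_list = {}  # user_id -> facial image data

    def register(self, user_id, facial_image):
        self.user_list[user_id] = facial_image

class TerminalManager:
    """Sketch of the terminal list: user ID -> IP address (cf. terminal manager 106)."""
    def __init__(self):
        self.terminal_list = {}  # user_id -> IP address

    def register(self, user_id, ip_address):
        self.terminal_list[user_id] = ip_address

def select_terminal(captured_face, users, terminals):
    """Pick the user ID whose stored facial image best matches the
    captured facial image (cf. facial image comparator 107), then look
    up that user's IP address (cf. projection selector 108)."""
    best_id = max(users.user_list,
                  key=lambda uid: face_similarity(captured_face,
                                                  users.user_list[uid]))
    return terminals.terminal_list[best_id]
```

For example, if two participants are registered, capturing the second participant's face would yield the IP address registered for that participant's terminal.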
The hot spot detector 109 detects a spot pointed to by a user of the video display system on the screen where moving images are projected by the projector 1.
Next, a functional configuration of the client communication terminal 2 according to the present embodiment is described with reference to
As illustrated in
The mouse/keyboard 220 is an input interface used to directly operate the client communication terminal 2 or to input data to the client communication terminal 2. The mouse/keyboard 220 is implemented by the control device 15 illustrated in
The display panel 230 is an output interface on which the status of the client communication terminal 2 is visually displayed, and also is an input interface such as a touch panel used by a user to directly operate the client communication terminal 2 or to input data to the client communication terminal 2. Moreover, the display panel 230 may display an image to accept a user operation. The display panel 230 is implemented by the control device 15 and the display 16 illustrated in
The network interface 240 is an interface used to enable the client communication terminal 2 to communicate with other devices such as the projector 1 through the network, and the Ethernet (registered trademark), a universal serial bus (USB), Bluetooth (registered trademark), Wi-Fi (registered trademark), FeliCa (registered trademark), PCIe, an interface compliant with an IEEE standard, or the like is used as the network interface 240. The network interface 240 is implemented by the communication interface 17 illustrated in
The controller 210 is configured by a combination of software and hardware. More specifically, a program that is stored in a storage medium such as the ROM 12 and the HDD 13 is loaded into the RAM 11 by the CPU 10, and the controller 210 is configured by a combination of hardware such as an integrated circuit and a software controller configured by the computation performed by the CPU 10 according to the program.
The main controller 200 controls each element of the controller 210, and gives a command to each element of the controller 210. The main controller 200 controls the input and output controller 202, and accesses other devices through the network interface 240.
The operation display controller 201 controls the display panel 230 to display an image under the control of the main controller 200, or inputs data, a signal, or a command, which are input through the mouse/keyboard 220 or the display panel 230, to the main controller 200. Then, the main controller 200 gives a command to each element of the controller 210 according to the data, signal, or the command input by the operation display controller 201.
The input and output controller 202 sends data, a signal, or a command to other devices through the network interface 240 under the control of the main controller 200, or inputs the data, signal, or the command, which are input through the network interface 240, to the main controller 200. Then, the main controller 200 gives a command to each element of the controller 210 according to the data, signal, or the command input by the input and output controller 202.
The capturing controller 203 controls or drives the capturing mechanism 250 under the control of the main controller 200. The image capturing unit 204 captures the moving images that are being displayed on the display panel 230 as captured images, under the control of the main controller 200.
In
Note also that the client communication terminal 2a that the host operates is connected to the projector 1 in a host mode, and that the client communication terminals 2b to 2z that the participants operate are connected to the projector 1 in a participant mode. Due to this configuration, the projector 1 can distinguish between the client communication terminal 2a that the host operates and the client communication terminals 2b to 2z that the participants operate.
As illustrated in
Then, the projector 1 accepts the connection request from the client communication terminal 2a, and requests the client communication terminal 2a that is the request sender to obtain terminal information (S703).
Then, the client communication terminal 2a accepts the request to obtain terminal information from the projector 1, and activates a capturing mode to display a facial image capturing screen for capturing the facial image of the host (S704).
When the host operates the facial image capturing screen to capture a facial image (S705), the client communication terminal 2a captures the facial image of the host (S706), and sends the terminal information including the captured facial image and its own IP address as a set to the projector 1 (S707).
Then, the projector 1 assigns a user ID to the facial image of the terminal information sent from the client communication terminal 2a (S708), and updates the user list and the terminal list that are managed by the user manager 105 and the terminal manager 106, respectively (S709).
Then, the projector 1 returns a connection-completion response to the client communication terminal 2a that is the request sender (S710).
After that, when the host operates for termination (S711), the client communication terminal 2a sends a disconnection request to the projector 1 (S712).
In response to the disconnection request, the projector 1 clears the user list and the terminal list that are managed by the user manager 105 and the terminal manager 106, respectively (S713), and returns a disconnection-completion response to the client communication terminal 2a that is the request sender (S714).
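The registration and disconnection steps above (S707 to S710, and S713) can be sketched as follows. This is an illustrative sketch only, with hypothetical names: the projector receives terminal information (a facial image and an IP address), assigns a user ID, updates both lists, and clears them on disconnection.

```python
import itertools

class Projector:
    """Illustrative sketch of the projector-side registration steps."""
    def __init__(self):
        self._next_id = itertools.count(1)
        self.user_list = {}      # user_id -> facial image (user manager 105)
        self.terminal_list = {}  # user_id -> IP address (terminal manager 106)

    def handle_terminal_info(self, facial_image, ip_address):
        user_id = f"user-{next(self._next_id)}"   # S708: assign a user ID
        self.user_list[user_id] = facial_image    # S709: update the user list
        self.terminal_list[user_id] = ip_address  # S709: update the terminal list
        return user_id                            # S710: completion response

    def disconnect(self):
        # S713: clear both lists in response to a disconnection request.
        self.user_list.clear()
        self.terminal_list.clear()
```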
As illustrated in
Once a participant who operates the client communication terminal 2 from which a captured image is to be projected is selected on the to-be-projected subject selection screen by the host (S903), the client communication terminal 2a uses the capturing mechanism 250 to capture the facial image of the selected participant (S904), and sends the captured facial image to the projector 1 (S905).
Then, the projector 1 uses the facial image comparator 107 to compare the facial image sent from the client communication terminal 2a with the facial images registered in the user list managed by the user manager 105 (S906), and selects the user ID associated with the facial image that best matches the received facial image from the user list managed by the user manager 105 (S907). In this configuration according to the present embodiment, the main controller 100 serves as a facial image acquisition unit.
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S907, from the terminal list managed by the terminal manager 106 (S908), and sends a captured-image obtaining request to the client communication terminal 2 that corresponds to the selected IP address (S909). In other words, in the present embodiment, the projection selector 108 serves as an information processing device selector, and an IP address is used as location identification information to identify the location of the client communication terminals 2 on the network 4.
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images and generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S910).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S911). In this configuration, the main controller 100 serves as a video signal acquisition unit.
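The projector-side selection steps S906 to S909 above can be sketched as one function. This is an illustrative sketch, not the claimed implementation: `similarity` and `request_captured_image` are hypothetical callables standing in for the facial comparison method and the network request, respectively.

```python
def handle_projection_request(captured_face, user_list, terminal_list,
                              similarity, request_captured_image):
    """Compare the received facial image against the user list, pick the
    best-matching user ID (S906-S907), look up its IP address in the
    terminal list (S908), and send a captured-image obtaining request to
    the terminal at that address (S909)."""
    best_user = max(user_list,
                    key=lambda uid: similarity(captured_face, user_list[uid]))
    ip_address = terminal_list[best_user]
    return request_captured_image(ip_address)
```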
As described above, the video display system according to the present embodiment associates the IP addresses of the client communication terminals 2b to 2z with the facial images of the participants who operate these communication terminals 2b to 2z, and produces a list in advance.
Then, in the video display system according to the present embodiment, the host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected, and the IP address that is associated with the captured facial image is selected from the above produced list. Then, in the video display system according to the present embodiment, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the IP address as selected above.
Due to this configuration of the video display system according to the present embodiment, when the client communication terminal 2 from which a captured image is to be projected is to be selected, the only thing that the host has to do is to capture the facial image of the participant who operates the target client communication terminal 2. In other words, the host no longer has to seek a target client communication terminal 2 on his/her own from the list of a plurality of client communication terminals 2. Accordingly, the video display system according to the present embodiment can improve customer convenience.
In the video display system according to the present embodiment, the projector 1 and the client communication terminals 2 are connected to each other through the network. However, the video display system according to the present embodiment may further include a server that is connected to the same network or a different network that is connected to the same network through public lines, and the server may be provided with functions that are equivalent to the user manager 105, the terminal manager 106, the facial image comparator 107, and the projection selector 108.
In the video display system according to the first embodiment, a host or a participant has to capture a facial image every time the client communication terminal 2 is to be connected to the projector 1. This is because the video display system according to the first embodiment has to associate a facial image with the latest IP address as an IP address may be changed every time the client communication terminal 2 is connected to the network.
In the video display system according to the present embodiment, firstly, the client communication terminals 2b to 2z that the participants operate are registered to the projector 1. In such registration processes, the video display system according to the present embodiment associates the media access control (MAC) addresses of the client communication terminals 2b to 2z with the facial images of the participants who operate these communication terminals 2b to 2z, and produces a list in advance.
Due to this configuration, the registration processes do not have to be repeated afterward in the video display system according to the present embodiment unless the host or participants are changed or newly added. This is because a MAC address is unique to each of the client communication terminals 2 and is never changed.
Then, the video display system according to the present embodiment obtains a MAC address and an IP address from each of the client communication terminals 2 when the client communication terminal 2 is connected to the projector 1, and associates the obtained IP address with the matching MAC address registered in the above list. Accordingly, the latest IP address and a facial image are associated with each other in the above list.
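The MAC-address-based association described above can be sketched as follows. This is an illustrative sketch with hypothetical names: facial images are registered once against permanent MAC addresses, and on each connection only the IP address entry for the matching MAC address is refreshed.

```python
class MacRegistry:
    """Illustrative sketch of the second embodiment's list: a facial image
    is bound to a permanent MAC address once, and the latest IP address is
    re-associated with that MAC address on every connection."""
    def __init__(self):
        self.face_by_mac = {}  # MAC address -> facial image (registered once)
        self.ip_by_mac = {}    # MAC address -> latest IP address

    def register(self, mac, facial_image):
        # One-time registration; repeated only if participants change.
        self.face_by_mac[mac] = facial_image

    def on_connect(self, mac, ip_address):
        # Refresh the IP address only for terminals registered in advance.
        if mac in self.face_by_mac:
            self.ip_by_mac[mac] = ip_address

    def ip_for_face(self, captured_face, similarity):
        # Select the best-matching registered face, then return the
        # latest IP address associated with its MAC address.
        best_mac = max(self.face_by_mac,
                       key=lambda m: similarity(captured_face,
                                                self.face_by_mac[m]))
        return self.ip_by_mac.get(best_mac)
```

In this sketch, reconnecting with a new DHCP-issued IP address simply overwrites the previous entry, so the facial image always resolves to the terminal's current address.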
Then, in the video display system according to the present embodiment, the host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected, and the IP address that is associated with the captured facial image is selected from the above produced list. Then, in the video display system according to the present embodiment, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the IP address as selected above.
Due to this configuration, in the video display system according to the present embodiment, a host or a participant no longer has to capture a facial image every time the client communication terminal 2 is to be connected to the projector 1. Accordingly, the video display system according to the present embodiment can further improve customer convenience.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first embodiment, and their detailed description is omitted.
As illustrated in
In
Note also that the client communication terminal 2a that the host operates is registered to the projector 1 in a host mode, and that the client communication terminals 2b to 2z that the participants operate are registered to the projector 1 in a participant mode. Due to this configuration, the projector 1 can distinguish between the client communication terminal 2a that the host operates and the client communication terminals 2b to 2z that the participants operate.
As illustrated in
Then, the projector 1 accepts the registration request from the client communication terminal 2a, and requests the client communication terminal 2a that is the request sender to obtain terminal information (S1203).
Then, the client communication terminal 2a accepts the request to obtain terminal information from the projector 1, and activates a capturing mode to display a facial image capturing screen for capturing the facial image of the host (S1204). Note that the facial image capturing screen is described above with reference to
When the host operates the facial image capturing screen to capture a facial image (S1205), the client communication terminal 2a captures the facial image of the host (S1206), and sends the terminal information including the captured facial image and its own MAC address to the projector 1 (S1207).
Then, the projector 1 assigns a user ID to the facial image included in the terminal information sent from the client communication terminal 2a (S1208), and updates the user list and the terminal list that are managed by the user manager 105 and the terminal manager 106, respectively (S1209). In this configuration, the main controller 100 serves as an association information registration unit.
Then, the projector 1 returns a registration-completion response to the client communication terminal 2a that is the request sender (S1210).
In
Note also that the client communication terminal 2a that the host operates is connected to the projector 1 in a host mode, and that the client communication terminals 2b to 2z that the participants operate are connected to the projector 1 in a participant mode. Due to this configuration, the projector 1 can distinguish between the client communication terminal 2a that the host operates and the client communication terminals 2b to 2z that the participants operate.
As illustrated in
Then, the projector 1 accepts the connection request from the client communication terminal 2a, and requests the client communication terminal 2a that is the request sender to obtain terminal information (S1303).
Then, the client communication terminal 2a sends the terminal information including its own IP address and MAC address as a set to the projector 1 (S1304).
Once the terminal information is sent from the client communication terminal 2a, the projector 1 associates the obtained IP address with one of the MAC addresses, registered to the terminal list stored in the terminal manager 106, that is the same as the MAC address sent together with the IP address. Accordingly, the terminal list is updated (S1305), and facial images are associated with the latest IP addresses in the terminal list. In other words, in the present embodiment, MAC addresses are used as individual identification information for identifying information processing devices individually. In this configuration, the main controller 100 and the terminal manager 106 serve as a location identification information acquisition unit and an associating unit, respectively.
Then, the projector 1 returns a connection-completion response to the client communication terminal 2a that is the request sender (S1306).
After that, when the host operates for termination (S1307), the client communication terminal 2a sends a disconnection request to the projector 1 (S1308).
In response to the disconnection request, the projector 1 clears only the IP addresses in the terminal list that is managed by the terminal manager 106 (S1309), and returns a disconnection-completion response to the client communication terminal 2a that is the request sender (S1310).
As described above, firstly, the video display system according to the present embodiment associates the MAC addresses of the client communication terminals 2b to 2z with the facial images of the participants who operate these communication terminals 2b to 2z, and produces a list in advance. By so doing, the client communication terminals 2 are registered to the video display system. Due to this configuration, the registration processes do not have to be repeated afterward in the video display system according to the present embodiment unless the host or participants are changed or newly added. This is because a MAC address is unique to each of the client communication terminals 2 and is never changed.
Then, the video display system according to the present embodiment obtains a MAC address and an IP address from each of the client communication terminals 2 when the client communication terminal 2 is connected to the projector 1, and associates the obtained IP address with one of the MAC addresses, registered to the above list, that is the same as the MAC address obtained together with the IP address. Accordingly, the latest IP address and a facial image are associated with each other in the above list.
Then, in the video display system according to the present embodiment, the host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected, and the IP address that is associated with the captured facial image is selected from the above produced list. Then, in the video display system according to the present embodiment, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the IP address as selected above.
Due to this configuration, in the video display system according to the present embodiment, a host or a participant no longer has to capture a facial image every time the client communication terminal 2 is to be connected to the projector 1. Accordingly, the video display system according to the present embodiment can further improve customer convenience.
The processes in which the projector 1 according to the present embodiment projects an image captured by the client communication terminal 2 are described as above with reference to
In the video display system according to the first and second embodiments, a host or a participant has to connect the client communication terminal 2 of his/her own to the projector 1 on an as-needed basis.
In order to deal with such a situation, firstly, the video display system according to the present embodiment associates the MAC addresses of the client communication terminals 2b to 2z with the facial images of the participants who operate these communication terminals 2b to 2z, and produces a list in advance. Further, the video display system according to the present embodiment generates an address resolution protocol (ARP) table to obtain a MAC address from the IP address of each of the client communication terminals 2b to 2z in advance.
Then, the video display system according to the present embodiment obtains from the DHCP server 3 the IP addresses of the client communication terminals 2 that are connected to the same network segment, and broadcasts an ARP request to the obtained IP addresses.
Then, the video display system according to the present embodiment associates the IP addresses to which an ARP request has been sent with one of the MAC addresses, registered to the ARP table, that is the same as the MAC address obtained by broadcasting the ARP request. Accordingly, the ARP table is completed.
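The completion of the ARP table described in the two preceding paragraphs can be sketched as follows. The dictionary layout and the name `on_arp_reply` are illustrative assumptions, and the actual ARP network traffic is omitted.

```python
# Illustrative sketch of completing the ARP table described above.
# At registration time, only the MAC addresses are known; the IP column is empty.
arp_table = {"AA:BB:CC:00:00:01": None,
             "AA:BB:CC:00:00:02": None}  # MAC address -> IP address

def on_arp_reply(ip, mac):
    """Associate the probed IP address with the matching registered MAC address."""
    if mac in arp_table:
        arp_table[mac] = ip

# Replies gathered after broadcasting ARP requests to the IP addresses
# obtained from the DHCP server (values here are hypothetical):
for ip, mac in [("192.168.0.11", "AA:BB:CC:00:00:01"),
                ("192.168.0.12", "AA:BB:CC:00:00:02")]:
    on_arp_reply(ip, mac)
```

Once every registered MAC address has received a reply, the table is complete and any facial match that yields a MAC address can be resolved to an IP address.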
Then, in the video display system according to the present embodiment, the host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected, and the MAC address that is associated with the captured facial image is selected from the above produced list.
Then, the video display system according to the present embodiment selects the IP address that is associated with one of the MAC addresses, registered to the ARP table, that is the same as the MAC address selected based on the facial image. Then, in the video display system according to the present embodiment, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the IP address as selected above.
In the video display system according to the present embodiment, a host or a participant no longer has to connect the client communication terminal 2 of his/her own to the projector 1 on an as-needed basis. Accordingly, the video display system according to the present embodiment can further improve customer convenience.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first and second embodiments, and their detailed description is omitted.
The ARP table manager 161 is used to obtain a MAC address from an IP address, and manages an ARP table in which the IP addresses and the MAC addresses of the client communication terminals 2 are associated with each other.
As illustrated in
In
Note also that the client communication terminal 2a that the host operates is registered to the projector 1 in a host mode, and that the client communication terminals 2b to 2z that the participants operate are registered to the projector 1 in a participant mode. Due to this configuration, the projector 1 can distinguish between the client communication terminal 2a that the host operates and the client communication terminals 2b to 2z that the participants operate.
As illustrated in
Then, the projector 1 accepts the registration request from the client communication terminal 2a, and requests the client communication terminal 2a that is the request sender to obtain terminal information (S1703).
Then, the client communication terminal 2a accepts the request to obtain terminal information from the projector 1, and activates a capturing mode to display a facial image capturing screen for capturing the facial image of the host (S1704). Note that the facial image capturing screen is described above with reference to
When the host operates the facial image capturing screen to capture a facial image (S1705), the client communication terminal 2a captures the facial image of the host (S1706), and sends the terminal information including the captured facial image and its own MAC address as a set to the projector 1 (S1707).
Then, the projector 1 assigns a user ID to the facial image included in the terminal information sent from the client communication terminal 2a (S1708), and updates the user list, the terminal list, and the ARP table that are managed by the user manager 105, the terminal manager 106, and the ARP table manager 161, respectively (S1709). Note that the MAC addresses are registered to the ARP table, but the IP addresses are not registered to the ARP table at this point in time.
Then, the projector 1 returns a registration-completion response to the client communication terminal 2a that is the request sender (S1710).
As illustrated in
Then, the client communication terminal 2 receives the ARP request, and returns its own MAC address to the projector 1 in an ARP reply (S1803).
Then, the projector 1 associates the IP addresses to which an ARP request has been sent with one of the MAC addresses, registered to the ARP table managed by the ARP table manager 161, that is the same as the MAC address obtained by broadcasting the ARP request. Accordingly, the projector 1 updates the ARP table (S1804). As a result, the ARP table is completed.
The projector 1 repeats such update processes of the ARP table at regular time intervals. Accordingly, the latest IP addresses and the latest MAC addresses are associated with each other in the ARP table.
As illustrated in
Once a participant who operates the client communication terminal 2 from which a captured image is to be projected is selected on the to-be-projected subject selection screen by the host (S1903), the client communication terminal 2a uses the capturing mechanism 250 to capture the facial image of the selected participant (S1904), and sends the captured facial image to the projector 1 (S1905).
Then, the projector 1 uses the facial image comparator 107 to compare the facial image sent from the client communication terminal 2a with the facial images that are registered to the user list managed by the user manager 105 (S1906), and selects the MAC address associated with the facial image that best matches the sent facial image from the terminal list managed by the terminal manager 106 (S1907).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the MAC address selected in S1907, from the ARP table managed by the ARP table manager 161 (S1908), and sends a captured-image obtaining request to the client communication terminal 2 that corresponds to the selected IP address (S1909).
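The two-step lookup in S1906 to S1908 (a facial match yields a MAC address, and the ARP table then yields an IP address) can be sketched as follows. For brevity, the user list and terminal list are collapsed into a single MAC-to-face mapping, and the similarity function is a toy stand-in for the facial image comparator 107; all names and values are illustrative assumptions.

```python
def select_projection_source(captured_face, faces_by_mac, arp_table, match_score):
    """Pick the registered (face, MAC) pair whose face best matches the capture,
    then resolve the MAC address to an IP address through the ARP table."""
    best_mac = max(faces_by_mac,
                   key=lambda mac: match_score(captured_face, faces_by_mac[mac]))
    return arp_table[best_mac]

# Hypothetical registered data.
faces_by_mac = {"AA:BB:CC:00:00:01": "face_b",
                "AA:BB:CC:00:00:02": "face_c"}
arp_table = {"AA:BB:CC:00:00:01": "192.168.0.11",
             "AA:BB:CC:00:00:02": "192.168.0.12"}

# Toy similarity score: 1.0 on an exact match, 0.0 otherwise.
score = lambda a, b: 1.0 if a == b else 0.0

ip = select_projection_source("face_c", faces_by_mac, arp_table, score)
```

The captured-image obtaining request of S1909 would then be sent to the terminal at the returned IP address.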
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images, generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S1910).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S1911).
As described above, firstly, the video display system according to the present embodiment associates the MAC addresses of the client communication terminals 2b to 2z with the facial images of the participants who operate these communication terminals 2b to 2z, and produces a list in advance. Further, the video display system according to the present embodiment generates an address resolution protocol (ARP) table to obtain a MAC address from the IP address of each of the client communication terminals 2b to 2z in advance.
Then, the video display system according to the present embodiment obtains from the DHCP server 3 the IP addresses of the client communication terminals 2 that are connected to the same network segment, and broadcasts an ARP request to the obtained IP addresses.
Then, the video display system according to the present embodiment associates the IP addresses to which an ARP request has been sent with one of the MAC addresses, registered to the ARP table, that is the same as the MAC address obtained by broadcasting the ARP request. Accordingly, the ARP table is completed.
Then, in the video display system according to the present embodiment, the host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected, and the MAC address that is associated with the captured facial image is selected from the above produced list.
Then, the video display system according to the present embodiment selects the IP address that is associated with one of the MAC addresses, registered to the ARP table, that is the same as the MAC address selected based on the facial image. Then, in the video display system according to the present embodiment, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the IP address as selected above.
In the video display system according to the present embodiment, a host or a participant no longer has to connect the client communication terminal 2 of his/her own to the projector 1 on an as-needed basis. Accordingly, the video display system according to the present embodiment can further improve customer convenience.
In the video display system according to the present embodiment, the projector 1 and the client communication terminals 2 are connected to each other through the network. However, the video display system according to the present embodiment may further include a server that is connected to the same network or a different network that is connected to the same network through public lines, and the server may be provided with functions that are equivalent to the user manager 105, the terminal manager 106, the facial image comparator 107, the projection selector 108, and the ARP table manager 161.
In the video display system according to the first, second, and third embodiments, a host or a participant uses the client communication terminals 2 of his/her own to capture a facial image. For this reason, the client communication terminals 2a to 2z have to be provided with a capturing function in the video display system according to the first, second, and third embodiments.
In order to avoid such a situation, the video display system according to the present embodiment is configured such that a host or a participant can use the projector 1 to capture a facial image. Due to this configuration, in the video display system according to the present embodiment, the client communication terminals 2a to 2z do not have to be provided with a capturing function.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first, second, and third embodiments, and their detailed description is omitted.
As illustrated in
Note that the client communication terminal 2a that the host operates is registered to the projector 1 in a host mode, and that the client communication terminals 2b to 2z that the participants operate are registered to the projector 1 in a participant mode. Due to this configuration, the projector 1 can distinguish between the client communication terminal 2a that the host operates and the client communication terminals 2b to 2z that the participants operate.
As illustrated in
Then, upon receiving the request to obtain terminal information from the projector 1, the client communication terminal 2 sends the terminal information including its own terminal name, its own IP address, and its own MAC address as a set to the projector 1 (S2103).
Then, the projector 1 assigns a user ID to the terminal information sent from the client communication terminal 2 (S2104), and updates the user list and the terminal list that are managed by the user manager 105 and the terminal manager 106, respectively (S2105).
Then, the projector 1 projects a terminal list screen including the terminal names, the user IDs, the IP addresses, the MAC addresses, and the facial images that are registered to the user list and the terminal list that are managed by the user manager 105 and the terminal manager 106, respectively (S2106).
Once a user operates the terminal list screen to select a communication terminal (S2107), the projector 1 activates a capturing mode to display a facial image capturing screen for capturing the facial image of the user (S2108). Note that the facial image capturing screen is described above with reference to
When the user operates the facial image capturing screen to capture a facial image (S2109), the projector 1 captures the facial image of the user (S2110).
Then, the projector 1 associates the facial image captured in S2110 with the user ID, which is registered to the user list managed by the user manager 105, that matches the user ID associated with the client communication terminal 2 selected in S2107 in the terminal list managed by the terminal manager 106. Accordingly, the user list is updated (S2111).
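The cross-list association of S2111 (the captured facial image is attached to the user ID that the selected terminal points to) can be sketched as follows. The dictionary layout and the name `attach_face` are illustrative assumptions.

```python
# Illustrative sketch of S2111. In this embodiment the facial image is captured
# by the projector itself, after the terminal has already been registered.
user_list = {"user_b": None}                     # user ID -> facial image
terminal_list = {"AA:BB:CC:00:00:01": "user_b"}  # MAC address -> user ID

def attach_face(selected_mac, captured_face):
    """Look up the user ID of the selected terminal, then store the
    freshly captured facial image under that user ID."""
    uid = terminal_list[selected_mac]
    user_list[uid] = captured_face

attach_face("AA:BB:CC:00:00:01", "face_of_b")
```

After this step, the registered entry behaves exactly as if the facial image had been captured on the terminal itself, so later facial comparisons work unchanged.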
As described above, the video display system according to the present embodiment is configured such that a host or a participant can use the projector 1 to capture a facial image. Due to this configuration, in the video display system according to the present embodiment, the client communication terminals 2a to 2z do not have to be provided with a capturing function.
The video display system according to the first, second, third, and fourth embodiments as described above is configured such that a host uses the client communication terminal 2a of his/her own to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected. For this reason, in the video display system according to the first, second, third, and fourth embodiments, the client communication terminal 2a has to be provided with a capturing function, and the host has to carry the client communication terminal 2a.
In order to deal with such a situation, the video display system according to the present embodiment is configured such that a host can use the projector 1 to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected. As the video display system according to the present embodiment is configured as above, the client communication terminal 2a does not have to be provided with a capturing function, and the host does not have to carry the client communication terminal 2a.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first, second, third, and fourth embodiments, and their detailed description is omitted.
As illustrated in
Once a participant who operates the client communication terminal 2 from which a captured image is to be projected is selected on the to-be-projected subject selection screen by the host (S2303), the client communication terminal 2a uses the hot spot detector 109 to specify the position selected by the host (S2304).
Then, the projector 1 uses the capturing mechanism 250 to capture the facial image of the selected participant (S2305), and uses the facial image comparator 107 to compare the captured facial image with the facial images that are registered to the user list managed by the user manager 105 (S2306). Then, the projector 1 selects the user ID associated with the facial image that best matches the captured facial image from the user list managed by the user manager 105 (S2307).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S2307, from the terminal list managed by the terminal manager 106 (S2308), and sends a captured-image obtaining request to the client communication terminal 2 that corresponds to the selected IP address (S2309).
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images, generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S2310).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S2311).
As described above, the video display system according to the present embodiment is configured such that a host can use the projector 1 to capture a facial image of the participant who operates the client communication terminal 2 from which a captured image is to be projected. As the video display system according to the present embodiment is configured as above, the client communication terminal 2a does not have to be provided with a capturing function, and the host does not have to carry the client communication terminal 2a.
In the video display system according to the first to fifth embodiments, the projector 1 can project only a single image captured by one client communication terminal 2 at a time. Accordingly, in the video display system according to the first to fifth embodiments, a host and participants can view only a single image captured by one client communication terminal 2 at a time.
In order to deal with such a situation, in the video display system according to the present embodiment, the projector 1 is configured to project a plurality of images captured by the multiple client communication terminals 2 all at once. Due to such a configuration, in the video display system according to the present embodiment, a host and participants can view a plurality of images captured by the multiple client communication terminals 2 all at once.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to fifth embodiments, and their detailed description is omitted.
As illustrated in
The projector 1 according to the present embodiment can refer to this terminal list to distinguish between the client communication terminal 2 from which a captured image is being projected and the client communication terminal 2 from which a captured image is not being projected. Accordingly, the projector 1 according to the present embodiment can avoid projecting multiple images captured by the same client communication terminal 2. In this configuration, the main controller 100 serves as a source determination unit.
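The source determination described above (the terminal list records whether each terminal is currently being projected, so the same terminal is never projected twice) can be sketched as follows. The set-based bookkeeping and the name `request_projection` are illustrative assumptions.

```python
# Illustrative sketch of the source determination described above.
projecting = set()  # IP addresses of terminals currently being projected

def request_projection(ip):
    """Return True and record the terminal if it is not yet being projected;
    return False for a duplicate request, so the same captured image is
    never projected twice."""
    if ip in projecting:
        return False
    projecting.add(ip)
    return True

first = request_projection("192.168.0.11")   # new source: accepted
second = request_projection("192.168.0.11")  # duplicate: rejected
```

A host's erroneous re-selection of a participant whose image is already on screen is thus silently ignored, matching the behavior described for this embodiment.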
As illustrated in
Next, cases in which the host erroneously operates the client communication terminal 2 of his/her own and selects again the participant who operates the client communication terminal 2 from which a captured image is being projected are described with reference to
As illustrated in
Next, cases in which the host operates the client communication terminal 2a of his/her own to specify the position at which an image captured by the client communication terminal 2 that the selected participant operates is projected and displayed are described with reference to
As illustrated in
As described above, in the video display system according to the present embodiment, the projector 1 can project a plurality of images captured by the multiple client communication terminals 2 all at once. Due to this configuration, in the video display system according to the present embodiment, a host and participants can view a plurality of images captured by the multiple client communication terminals 2 all at once.
In the video display system according to the first to sixth embodiments, the projector 1 is configured to project an image captured by the client communication terminal 2 of the participant selected by a host on the to-be-projected subject selection screen. For this reason, in order to specify the client communication terminal 2 from which a captured image is to be projected, the host has to capture a facial image of a participant whose captured image is to be projected and has to select the participant whose facial image is to be projected, from the captured images of the participants on the to-be-projected subject selection screen.
In order to avoid such a situation, in the video display system according to the present embodiment, the projector 1 is configured such that all a host has to do is capture a facial image of the participant whose captured image is to be projected in order to select the client communication terminal 2 from which a captured image is to be projected. Due to this configuration, in the video display system according to the present embodiment, the host no longer has to select a participant whose facial image is to be projected, from the captured images of the participants on the to-be-projected subject selection screen.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to sixth embodiments, and overlapping descriptions with the description of the first to sixth embodiments are omitted.
As illustrated in
Note that the camera 5 is configured by hardware that is similar to the hardware described above with reference to
As illustrated in
The gesture detector 162 obtains the moving images captured by the camera 5 through the network interface 140, and recognizes a specific gesture in the obtained moving images to detect a participant who made the specific gesture. This indicates that the gesture detector 162 serves as a motion detector. Moreover, the gesture detector 162 extracts a facial image of a participant from whom a specific gesture has been detected. Note also that the gesture detector 162 stores a gesture list that includes characteristic information for detecting a specific gesture.
In the present embodiment, the facial image comparator 107 compares the facial image of the participant who made the specific gesture, which is extracted by the gesture detector 162, with the facial images that are registered to the user list managed by the user manager 105, and selects the user ID that is associated with the best-matching facial image from the user list.
As illustrated in
Once the to-be-projected subject selection mode is activated on the client communication terminal 2a, the projector 1 requests the camera 5 to capture images (S3002). The camera 5 captures moving images so as to include all the participants (S3003), and sends the captured moving images to the projector 1 (S3004).
Then, the projector 1 uses the gesture detector 162 to detect a specific gesture from the moving images captured by the camera 5 (S3005), and extracts a facial image of a participant who made a specific gesture (S3006).
In
The projector 1 uses the facial image comparator 107 to compare an extracted facial image with the facial images that are registered to a user list managed by the user manager 105 (S3007), and selects the user ID associated with the facial image that best matches the extracted facial image from the user list managed by the user manager 105 (S3008).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S3008, from the terminal list managed by the terminal manager 106 (S3009), and sends a captured-image obtaining request to the client communication terminal 2 with the selected IP address (S3010).
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images, generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S3011).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S3012).
As described above, in the video display system according to the present embodiment, all a host has to do is capture a facial image of the participant whose captured image is to be projected in order to select the client communication terminal 2 from which a captured image is to be projected by the projector 1. Due to this configuration, in the video display system according to the present embodiment, the host no longer has to select a participant whose facial image is to be projected, from the captured images of the participants on the to-be-projected subject selection screen.
In the video display system according to the seventh embodiment, right after a participant who made a specific gesture is detected, the projector 1 projects the image captured by the client communication terminal 2 that is associated with the detected participant. However, in the video display system according to the seventh embodiment, the projector 1 is not configured to deal with a situation in which a plurality of participants make a specific gesture.
In order to deal with such a situation, priority levels are given to the participants in the video display system according to the present embodiment. Accordingly, even when there are a plurality of participants who make a specific gesture, the projector 1 according to the present embodiment can project an image captured by the client communication terminal 2 based on the given priority levels.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to seventh embodiments, and overlapping descriptions with the description of the first to seventh embodiments are omitted.
As illustrated in
Once the to-be-projected subject selection mode is activated on the client communication terminal 2a, the projector 1 requests the camera 5 to capture images (S3202). The camera 5 captures moving images so as to include all the participants (S3203), and sends the captured moving images to the projector 1 (S3204).
Then, the projector 1 uses the gesture detector 162 to detect a specific gesture from the moving images captured by the camera 5 (S3205), and extracts a facial image of a participant who made a specific gesture (S3206).
The projector 1 uses the facial image comparator 107 to compare an extracted facial image with the facial images that are registered to a user list managed by the user manager 105 (S3207).
The facial image comparator 107 checks the priority level of a facial image matched in the comparison made in S3207 (S3208). Note that the priority levels are given in advance to the facial images that are registered to the user list.
As illustrated in
In
The projector 1 uses the facial image comparator 107 to select the user ID associated with each of the matched facial images, from the user list managed by the user manager 105, in the order of the priority levels determined in S3208 (S3209).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S3209, from the terminal list managed by the terminal manager 106, in order of priority level (S3210), and sends a captured-image obtaining request to the client communication terminal 2 with the selected IP address (S3211).
Accordingly, in
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images and generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S3212).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S3213).
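The priority-ordered selection of S3208 through S3210 can be sketched as follows. This is an illustrative sketch under the assumption that each user ID carries a numeric priority level and that a smaller number means a higher priority; the names are hypothetical.

```python
# Hypothetical sketch of S3208-S3210: when several registered faces match,
# order the matched users by their pre-assigned priority levels
# (assuming a smaller number means a higher priority).
def order_by_priority(matched_user_ids, priority_levels):
    # priority_levels: {user_id: priority level assigned in advance}
    return sorted(matched_user_ids, key=lambda uid: priority_levels[uid])
```

The projector would then look up IP addresses and send captured-image obtaining requests in the returned order.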
As described above, in the video display system according to the present embodiment, even when there are a plurality of participants who make a specific gesture, an image captured by the client communication terminal 2 can be projected based on the priority levels given to the participants.
Then, in the video display system according to the seventh and eighth embodiments, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the participant from whom a specific gesture has been detected. However, in the video display system according to the seventh and eighth embodiments, the projector 1 cannot deal with a situation in which a participant makes a gesture that is specified as desired by a host.
In order to deal with such a situation, in the video display system according to the present embodiment, the projector 1 registers to the gesture detector 162 a gesture that is specified as desired by a host. Accordingly, the projector 1 can project an image captured by the client communication terminal 2 of the participant who made such a gesture specified by the host.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to eighth embodiments, and overlapping descriptions with the description of the first to eighth embodiments are omitted.
Firstly, processes in which the projector 1 according to the present embodiment registers a gesture to be detected by the gesture detector 162 are described with reference to
As illustrated in
Once the client communication terminal 2a is operated to register a gesture, the projector 1 waits until the moving images obtained by capturing the gesture to be registered are sent from the camera 5 (S3502). The host makes any desired gesture, and uses the camera 5 to capture the moving images of that gesture (S3503).
The camera 5 sends the captured moving images of the gesture to the projector 1 (S3504), and the projector 1 uses the gesture detector 162 to extract characteristics of the received moving images of the gesture. Then, the gesture detector 162 applies a gesture ID to the characteristic information of the moving images of the gesture (S3505), and registers the gesture with the gesture list stored in the gesture detector 162 (S3506).
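The registration step of S3505 and S3506 can be sketched as follows. This is a hypothetical sketch, assuming the gesture list is a dictionary keyed by a sequentially assigned gesture ID and that the extracted characteristic information is an opaque value.

```python
# Hypothetical sketch of S3505-S3506: assign a new gesture ID to the
# extracted characteristic information and store it in the gesture list.
class GestureList:
    def __init__(self):
        self.gestures = {}      # gesture_id -> characteristic information
        self._next_id = 1

    def register(self, features):
        gesture_id = self._next_id
        self._next_id += 1
        self.gestures[gesture_id] = features
        return gesture_id
```

At detection time, a newly observed gesture would be compared against the stored characteristic information to find a matching gesture ID.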
As illustrated in
Once the to-be-projected subject selection mode is activated on the client communication terminal 2a, the projector 1 requests the camera 5 to capture images (S3702). The camera 5 captures moving images so as to include all the participants (S3703), and sends the captured moving images to the projector 1 (S3704).
Then, the projector 1 uses the gesture detector 162 to detect a specific gesture from the moving images captured by the camera 5 (S3705), and checks whether the detected gesture is registered to the gesture list (S3706). The gesture detector 162 confirms that one of the gestures registered to the gesture list matches the detected gesture, and extracts a facial image of the participant who made that gesture (S3707).
The projector 1 uses the facial image comparator 107 to compare an extracted facial image with the facial images that are registered to a user list managed by the user manager 105 (S3708), and selects the user ID associated with the facial image that best matches the extracted facial image from the user list managed by the user manager 105 (S3709).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S3709, from the terminal list managed by the terminal manager 106 (S3710), and sends a captured-image obtaining request to the client communication terminal 2 with the selected IP address (S3711).
Then, the client communication terminal 2 that has received the captured-image obtaining request uses the image capturing unit 204 to capture images and generates a video signal based on the captured images, and sends the generated video signal to the projector 1 (S3712).
Then, after the video signal of the captured images is sent from the client communication terminal 2, the projector 1 projects the captured images of the client communication terminal 2 based on the received video signal of the captured images (S3713).
As described above, the video display system according to the present embodiment is configured such that an image captured by the client communication terminal 2 is projected when a participant makes a gesture specified as desired by a host. Due to this configuration, in the video display system according to the present embodiment, an image that is captured by the client communication terminal 2 of the participant who made a gesture specified as desired by the host can be displayed.
Then, in the video display system according to the seventh to ninth embodiments, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the participant from whom a specific gesture has been detected. However, in the video display system according to the seventh to ninth embodiments, the projector 1 can merely start the projection of the image captured by the selected client communication terminal 2, and cannot otherwise control that projection.
In order to deal with such a situation, in the video display system according to the present embodiment, the projector 1 registers for every gesture a set of data for controlling the projection of the client communication terminal 2. Accordingly, the projection of an image captured by the client communication terminal 2 can be controlled just by a gesture of a participant.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to ninth embodiments, and overlapping descriptions with the description of the first to ninth embodiments are omitted.
Firstly, processes in which the projector 1 according to the present embodiment registers a gesture to be detected by the gesture detector 162 and a set of data for controlling the projection of the client communication terminal 2 in association with each other are described with reference to
As illustrated in
Once the client communication terminal 2a is operated to register a gesture and configure the terminal operation, the projector 1 waits until the moving images obtained by capturing the gesture to be registered are sent from the camera 5 (S3803). The host makes any desired gesture, and uses the camera 5 to capture the moving images of that gesture (S3804).
The camera 5 sends the captured moving images of the gesture to the projector 1 (S3805), and the projector 1 uses the gesture detector 162 to extract characteristics of the received moving images of the gesture. Then, the gesture detector 162 applies a gesture ID to the characteristic information of the moving images of the gesture (S3806), and registers the set of data for controlling the projection of the client communication terminal 2 (S3807). Note also that the set of data for controlling the projection of the client communication terminal 2 is set when the terminal operation is configured in S3802.
Then, the gesture detector 162 registers the information with the gesture list stored in the gesture detector 162 (S3808). This indicates that the gesture detector 162 serves as a relevant terminal operation information storage unit.
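The combined registration of S3806 through S3808 can be sketched as follows. This is a hypothetical sketch: each gesture list entry is assumed to pair the characteristic information with the terminal operation configured in S3802, keyed by a sequentially assigned gesture ID.

```python
# Hypothetical sketch of S3806-S3808: store, for every gesture, both the
# characteristic information and the associated terminal operation.
class GestureOperationList:
    def __init__(self):
        self.entries = {}       # gesture_id -> (features, terminal_operation)
        self._next_id = 1

    def register(self, features, terminal_operation):
        gesture_id = self._next_id
        self._next_id += 1
        self.entries[gesture_id] = (features, terminal_operation)
        return gesture_id

    def operation_for(self, gesture_id):
        # Relevant terminal operation information for a detected gesture
        return self.entries[gesture_id][1]
```

When a gesture is later detected and matched, the stored terminal operation would be extracted and sent to the selected client communication terminal.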
In the present embodiment, the gesture detector 162 of the projector 1 detects a gesture made by a participant based on the gesture list as illustrated in
As illustrated in
Once the to-be-projected subject selection mode is activated on the client communication terminal 2a, the projector 1 requests the camera 5 to capture images (S4002). The camera 5 captures moving images so as to include all the participants (S4003), and sends the captured moving images to the projector 1 (S4004).
Then, the projector 1 uses the gesture detector 162 to detect a specific gesture from the moving images captured by the camera 5 (S4005), and checks whether the detected gesture is registered to the gesture list (S4006). The gesture detector 162 confirms that one of the gestures registered to the gesture list matches the detected gesture, and extracts relevant terminal operation information from the gesture list (S4007).
Then, the gesture detector 162 extracts a facial image of a participant who made a specific gesture, from the moving images captured by the camera 5 (S4008).
The projector 1 uses the facial image comparator 107 to compare an extracted facial image with the facial images that are registered to a user list managed by the user manager 105 (S4009), and selects the user ID associated with the facial image that best matches the extracted facial image from the user list managed by the user manager 105 (S4010).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S4010, from the terminal list managed by the terminal manager 106 (S4011), and checks the relevant terminal operation information (S4012). The projector 1 then sends the relevant terminal operation information to the client communication terminal 2 with the selected IP address (S4013).
Then, the client communication terminal 2 that has received the relevant terminal operation information executes the relevant terminal operation in the gesture list (S4014). The client communication terminal 2 then uses the image capturing unit 204 to reflect the relevant terminal operation in its image capturing status (S4015).
Accordingly, for example, when a gesture with the gesture ID: 1 is detected by the gesture detector 162, the projection of an image captured by the client communication terminal 2 on the projector 1 starts. When a gesture with the gesture ID: 2 is detected by the gesture detector 162, the projection of the image captured by the client communication terminal 2 on the projector 1 is terminated.
When a gesture with the gesture ID: 3 is detected by the gesture detector 162, the projection of the image captured by the client communication terminal 2 on the projector 1 is temporarily paused. When a gesture with the gesture ID: 4 is detected by the gesture detector 162, the projection of the image captured by the client communication terminal 2 on the projector 1 is resumed.
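The correspondence between gesture IDs and projection control operations described above can be illustrated with the following sketch. The concrete IDs follow the example given here, but the operation names are assumptions introduced for illustration.

```python
# Illustrative mapping of the gesture IDs described above to projection
# control operations (the operation names are hypothetical).
TERMINAL_OPERATIONS = {
    1: "start_projection",
    2: "terminate_projection",
    3: "pause_projection",
    4: "resume_projection",
}

def operation_for(gesture_id):
    # Returns None for a gesture ID that has no registered operation.
    return TERMINAL_OPERATIONS.get(gesture_id)
```

On detection, the projector would send the looked-up operation to the client communication terminal, which executes it.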
As described above, in the video display system according to the present embodiment, the projector 1 registers for every gesture a set of data for controlling the projection of the client communication terminal 2. Accordingly, the projection of an image captured by the client communication terminal 2 can be controlled just by a gesture of a participant.
Then, in the video display system according to the seventh to tenth embodiments, the projector 1 projects the image that is captured by the client communication terminal 2 associated with the participant from whom a specific gesture has been detected. However, in the video display system according to the seventh to tenth embodiments, the participant who is associated with the client communication terminal 2 from which a captured image is to be projected cannot terminate the projection of the captured image at a desired timing.
In order to deal with such a situation, in the video display system according to the present embodiment, the projector 1 only allows the participant who is associated with the client communication terminal 2 from which a captured image is to be projected to control the projection of the image captured by the client communication terminal 2. Accordingly, the projection of an image captured by the client communication terminal 2 can be controlled at a timing desired by a participant.
Embodiments of the present invention are described below in detail with reference to the drawings. Note that like reference signs are given to elements similar to those described in the first to tenth embodiments, and overlapping descriptions with the description of the first to tenth embodiments are omitted.
As illustrated in
Once the to-be-projected subject selection mode is activated on the client communication terminal 2a, the projector 1 requests the camera 5 to capture images (S4102). The camera 5 captures moving images so as to include all the participants (S4103), and sends the captured moving images to the projector 1 (S4104).
Then, the projector 1 uses the gesture detector 162 to detect a specific gesture from the moving images captured by the camera 5 (S4105), and checks whether the detected gesture is registered to the gesture list (S4106). The gesture detector 162 confirms that one of the gestures registered to the gesture list matches the detected gesture, and extracts a facial image of a participant who made a specific gesture, from the moving images captured by the camera 5 (S4107).
The projector 1 uses the facial image comparator 107 to compare an extracted facial image with the facial images that are registered to a user list managed by the user manager 105 (S4108), and selects the user ID associated with the facial image that best matches the extracted facial image from the user list managed by the user manager 105 (S4109).
Then, the projector 1 uses the projection selector 108 to select the IP address that is associated with the user ID selected in S4109, from the terminal list managed by the terminal manager 106 (S4110), and checks control permission setting information (S4111).
Hereinafter, the participant who is selected by the projection selector 108 to be the to-be-projected subject is assumed to be the participant himself or herself who is associated with the client communication terminal 2. The projector 1 uses the main controller 100 to send the control permission setting information to the client communication terminal 2 that is selected as a to-be-projected subject by the projection selector 108 (S4112).
Once the control permission setting information is received, the client communication terminal 2 uses the operation display controller 201 to display an indication of control permission on the display panel 230 (S4113). Note that the indication of control permission allows the participant to control the projection of an image captured by the client communication terminal 2.
When the participant operates the client communication terminal 2 of his/her own to terminate the projection while the indication of control permission is being displayed (S4114), the client communication terminal 2 terminates the transmission of a captured image (S4115), and the projector 1 stops projecting the image captured by the client communication terminal 2 (S4116).
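The permission check of S4111 through S4116 can be sketched as follows. This is a hypothetical sketch: it assumes control requests are identified by user ID and that only the participant whose terminal was selected as the to-be-projected subject holds control permission.

```python
# Hypothetical sketch of S4111-S4116: only the participant associated with
# the client communication terminal selected as the to-be-projected subject
# may control the projection of its captured image.
def handle_control_request(requesting_user_id, projected_user_id, operation):
    if requesting_user_id != projected_user_id:
        return "denied"
    # e.g. "terminate" stops the transmission of the captured image (S4115),
    # after which the projector stops projecting it (S4116)
    return operation
```

This way, the projection of an image captured by the client communication terminal 2 is controlled only at a timing chosen by the participant concerned.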
As described above, in the video display system according to the present embodiment, the projector 1 only allows the participant who is associated with the client communication terminal 2 from which a captured image is to be projected to control the projection of the image captured by the client communication terminal 2. Accordingly, the projection of an image captured by the client communication terminal 2 can be controlled at a timing desired by a participant.
Numerous additional modifications and variations are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of this disclosure and appended claims.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored on any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tape, nonvolatile memory cards, ROM, etc. Alternatively, any one of the above-described and other methods of the present invention may be implemented by ASICs, prepared by interconnecting an appropriate network of conventional component circuits, or by a combination thereof with one or more conventional general-purpose microprocessors and/or signal processors programmed accordingly.
Number | Date | Country | Kind |
---|---|---|---|
2015-179860 | Sep 2015 | JP | national |
2016-151509 | Aug 2016 | JP | national |