This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2020-145154, filed on Aug. 31, 2020, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
The present invention relates to a communication terminal, an image communication system, a method of displaying an image, and a non-transitory recording medium.
Some image capturing devices are capable of capturing images in all directions using a plurality of wide-angle lenses or fisheye lenses. The image data captured using such an image capturing device may be used in a system that distributes the captured image data in real time to another site, allowing a user at that other site to view an image being captured at the site where the image capturing device is provided.
For example, images captured by a plurality of cameras at a remote site are displayed at a terminal, such that a user at the terminal is able to view the remote site.
However, there may be some cases in which a specific location at the remote site where the image is captured needs attention from the user at the terminal. In such cases, the image capturing device needs to be locally operated to capture an image of the specific location to be distributed to the terminal to be viewed by the user.
Example embodiments include a communication terminal including: circuitry that displays, on a display, a part of a whole image as a viewable-area image, the whole image having been captured at a distribution site that is remotely located from a browsing site where the communication terminal is provided; and a memory that stores target area information indicating a plurality of target areas at the distribution site. In response to reception of target area designation information designating a particular target area of the plurality of target areas, from the distribution site, the circuitry controls the display to display the viewable-area image including the particular target area, based on the target area information of the particular target area.
Example embodiments include an image communication system including one or more of the above-described communication terminals, and a distribution system. The distribution system includes an image capturing device that captures the whole image covering the distribution site, and a distribution terminal that transmits the whole image captured at the distribution site to the communication terminal at the browsing site.
Example embodiments include a method including: displaying, on a display, a part of a whole image as a viewable-area image, the whole image having been captured at a distribution site that is remotely located from a browsing site where the communication terminal is provided; storing, in a memory, target area information indicating a plurality of target areas at the distribution site; receiving, from the distribution site, target area designation information designating a particular target area of the plurality of target areas; and controlling the display to display the viewable-area image including the particular target area, based on the target area information of the particular target area.
Example embodiments include a non-transitory recording medium storing a control program for causing one or more processors to perform the above-described method.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Referring now to the drawings, embodiments of the present disclosure are described below. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Embodiments of the present invention are described with reference to the drawings. In the description of the drawings, the same elements are denoted by the same reference numerals, and redundant descriptions thereof are omitted.
Generating Spherical Image
Referring to
First, referring to
As illustrated in
Next, referring to
Next, referring to
As illustrated in
The image capturing device 10 uses Open Graphics Library for Embedded Systems (OpenGL ES) to map the equirectangular projection image EC so as to cover the sphere surface as illustrated in
Since the spherical image CE is an image attached to the sphere surface to cover the sphere surface, as illustrated in
The predetermined-area image Q, which is an image of the predetermined area T illustrated in
Referring to
L/f = tan(α/2)  [Equation 1]
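For reference, Equation 1 can be rearranged to give the angle of view directly. The short sketch below is an illustrative calculation only; the function and variable names are assumptions and not part of the embodiment.

```python
import math

def angle_of_view(L, f):
    """Angle of view alpha obtained by rearranging Equation 1: L/f = tan(alpha/2)."""
    return 2.0 * math.atan2(L, f)

# Example: if the diagonal half-length L equals the distance f,
# the angle of view is 90 degrees.
print(math.degrees(angle_of_view(1.0, 1.0)))  # 90.0
```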
The image capturing device 10 described above is an example of an image capturing device capable of acquiring a wide-angle view image. In this disclosure, the spherical image is an example of a wide-angle view image. Here, the wide-angle view image is generally an image taken with a wide-angle lens, such as a lens capable of capturing a range wider than the range that the human eye can perceive. Further, the wide-angle view image is generally an image taken with a lens having a focal length of 35 mm or less in terms of 35 mm film.
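As a rough illustration of the mapping described above, the following sketch computes, for each pixel of an equirectangular projection image, the corresponding point on the unit sphere. It assumes a common equirectangular convention (longitude spanning the image width, latitude spanning the image height) and is not the OpenGL ES processing actually performed by the image capturing device 10.

```python
import numpy as np

def equirect_to_sphere(width, height):
    """Map each pixel of an equirectangular image to a point on the unit sphere.

    Assumed convention: longitude spans [-pi, pi] across the image width;
    latitude spans [pi/2, -pi/2] down the image height.
    """
    u = (np.arange(width) + 0.5) / width           # 0..1 across the image
    v = (np.arange(height) + 0.5) / height         # 0..1 down the image
    lon = (u - 0.5) * 2.0 * np.pi                  # -pi .. pi
    lat = (0.5 - v) * np.pi                        # pi/2 .. -pi/2
    lon, lat = np.meshgrid(lon, lat)
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    return np.stack([x, y, z], axis=-1)            # shape (height, width, 3)
```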
Overview of Image Communication System
Referring to
As illustrated in
The distribution terminal 30, the communication management system 50, and the communication terminal 70 in the image communication system 1 are each communicable with one another via a communication network 100. The communication network 100 is implemented by the Internet, a mobile communication network, a local area network (LAN), etc. The communication network 100 may include, in addition to a wired network, a wireless network in compliance with, for example, 3rd Generation (3G), 4th Generation (4G), 5th Generation (5G), Wireless Fidelity (Wi-Fi; Registered Trademark), Worldwide Interoperability for Microwave Access (WiMAX), or Long Term Evolution (LTE).
The image capturing device 10, the target area designation device 20, and the distribution terminal 30 at the distribution site together operate as a distribution system 3. The image capturing device 10 is a special digital camera, which captures an image of an object or surroundings such as scenery to obtain two hemispherical images, from which the spherical image is generated, as described above referring to
The target area designation device 20 is a dedicated device for designating a target area, which is a specific location at the distribution site. The target area designation device 20 is disposed in advance at a specific location of the distribution site, or at a location close to the specific location, such that the target area designation device 20 is able to detect trouble such as an abnormality at that specific location of the distribution site.
The distribution terminal 30 distributes an image acquired from the image capturing device 10 via a wired cable such as a universal serial bus (USB) cable to the communication terminal 70 via the communication management system 50. The distribution terminal 30 further distributes target area designation information, acquired from the target area designation device 20 via a wired cable such as a USB cable, to the communication terminal 70 via the communication management system 50.
The distribution site subjected to control by the image communication system 1 is not limited to a single site; the image communication system 1 may control a plurality of sites. The connection between the distribution terminal 30 and each of the image capturing device 10 and the target area designation device 20 may be a wireless connection using short-range wireless communication, for example, instead of a wired connection using a wired cable.
The communication management system 50 controls communication between the distribution terminal 30 and the communication terminal 70. Further, the communication management system 50 manages information on types (a general image type and a special image type) of image data to be exchanged in the communication between the distribution terminal 30 and the communication terminal 70. In this embodiment, a special image is a spherical image, and a general image is a planar image. The communication management system 50 is provided, for example, at a service provider that provides video communication service.
The communication management system 50 may be configured by a single computer or a plurality of computers to which divided portions (functions) are arbitrarily allocated. All or a part of the functions of the communication management system 50 may be implemented by a server computer residing on a cloud network or a server computer residing on an on-premise network.
The communication terminal 70 is a computer such as a PC operated by a user at each browsing site. The communication terminal 70 displays an image (still image or moving image) distributed from the distribution terminal 30. The communication terminal 70 acquires a spherical image, which is a captured image captured by the image capturing device 10, via the communication network 100. The communication terminal 70 is installed with OpenGL ES, which enables the communication terminal 70 to generate predetermined-area information that indicates a partial area of a spherical image, or to generate a predetermined-area image from a spherical image that is transmitted from the distribution terminal 30. Here, the communication terminal 70A is disposed at the browsing site A where a user A1 is present, and the communication terminal 70B is disposed at the browsing site B where the user B is present.
The arrangement of the terminals and devices (communication terminal, image capturing device, and distribution terminal), and users illustrated in
Referring to
The target area designation device 20 is disposed at a specific location of the distribution site and designates a target area corresponding to its installed location. The target area designation device 20a designates, for example, a target area 1 as a candidate target area of the distribution site for viewing by the user at the browsing site. Similarly, the target area designation device 20b designates, for example, a target area 2 as another candidate target area of the distribution site for viewing by the user at the browsing site.
If a plurality of captured images, captured by a plurality of remotely located image capturing devices, are displayed at one or more browsing sites, it has been difficult for the user to determine which captured image, taken by which image capturing device, should be viewed in order to grasp information on a specific location of the distribution site. In some cases, users at different browsing sites want to browse different places. In such cases, operations (for example, a pan-tilt-zoom (PTZ) operation) on the image capturing device by different users may conflict with each other, and it has been difficult to enable different users to browse different places of the same distribution site. For example, while a spherical image as described above can be captured by the image capturing device, a plurality of image capturing devices may be necessary to cover an entire distribution site. In such a case, each user may want to individually operate each of the plurality of image capturing devices to display a target location as desired, and it has been difficult for the user to operate the image capturing devices intuitively. In view of the above, the image communication system 1 enables a plurality of users to view different areas of the distribution site by intuitive operation on a display screen, using captured images acquired from the image capturing devices.
Hardware Configuration
Next, referring to
Hardware Configuration of Image Capturing Device
First, referring to
As illustrated in
The imaging unit 101 includes two wide-angle lenses (so-called fish-eye lenses) 102a and 102b (collectively referred to as the lens 102 unless they need to be distinguished from each other), each having an angle of view equal to or greater than 180 degrees so as to form a hemispherical image. The imaging unit 101 further includes the two imaging elements 103a and 103b corresponding to the lenses 102a and 102b, respectively. The imaging elements 103a and 103b each include an imaging sensor such as a complementary metal oxide semiconductor (CMOS) sensor or a charge-coupled device (CCD) sensor, a timing generation circuit, and a group of registers. The imaging sensor converts an optical image formed by the lenses 102a and 102b into electric signals to output image data. The timing generation circuit generates horizontal or vertical synchronization signals, pixel clocks, and the like for the imaging sensor. Various commands, parameters, and the like for operations of the imaging elements 103a and 103b are set in the group of registers.
Each of the imaging elements 103a and 103b of the imaging unit 101 is connected to the image processor 104 via a parallel I/F bus. In addition, each of the imaging elements 103a and 103b of the imaging unit 101 is connected to the imaging controller 105 via a serial I/F bus such as an I2C bus. The image processor 104, the imaging controller 105, and the audio processor 109 are each connected to the CPU 111 via a bus 110. Furthermore, the ROM 112, the SRAM 113, the DRAM 114, the operation unit 115, the input/output I/F 116, the short-range communication circuit 117, the electronic compass 118, the gyro sensor 119, the acceleration sensor 120, and the network I/F 121 are also connected to the bus 110.
The image processor 104 acquires image data from each of the imaging elements 103a and 103b via the parallel I/F bus and performs predetermined processing on each image data. Thereafter, the image processor 104 combines these image data to generate data of the equirectangular projection image as illustrated in
The imaging controller 105 usually functions as a master device while the imaging elements 103a and 103b each usually functions as a slave device. The imaging controller 105 sets commands and the like in the group of registers of the imaging elements 103a and 103b via the serial I/F bus such as the I2C bus. The imaging controller 105 receives various commands from the CPU 111. Further, the imaging controller 105 acquires status data and the like of the group of registers of the imaging elements 103a and 103b via the serial I/F bus such as the I2C bus. The imaging controller 105 sends the acquired status data and the like to the CPU 111.
The imaging controller 105 instructs the imaging elements 103a and 103b to output the image data at a time when a shutter button of the operation unit 115 is pressed. In some cases, the image capturing device 10 displays a preview image on a display (e.g., a display of an external terminal such as a smartphone that performs short-range communication with the image capturing device 10 through the short-range communication circuit 117) or displays a moving image (movie). In the case of displaying a moving image, the image data are continuously output from the imaging elements 103a and 103b at a predetermined frame rate (frames per second).
Furthermore, the imaging controller 105 operates in cooperation with the CPU 111 to synchronize the time when the imaging element 103a outputs image data and the time when the imaging element 103b outputs the image data. It should be noted that, although the image capturing device 10 does not include a display in this embodiment, the image capturing device 10 may include a display. The microphone 108 converts sounds to audio data (signal). The audio processor 109 acquires the audio data output from the microphone 108 via an I/F bus and performs predetermined processing on the audio data.
The CPU 111 controls the entire operation of the image capturing device 10, for example, by performing predetermined processing. The ROM 112 stores various programs for execution by the CPU 111. The SRAM 113 and the DRAM 114 each operate as a work memory to store programs loaded from the ROM 112 for execution by the CPU 111 or data in current processing. More specifically, in one example, the DRAM 114 stores image data currently processed by the image processor 104 and data of the equirectangular projection image on which processing has been performed.
The operation unit 115 collectively refers to various operation keys, a power switch, the shutter button, and a touch panel having functions of both displaying information and receiving input from a user, which can be used in combination. The user operates the operation unit 115 to input various image capturing (photographing) modes or image capturing (photographing) conditions.
The input/output I/F 116 collectively refers to an interface circuit such as a USB I/F that allows the image capturing device 10 to communicate data with an external medium such as an SD card or an external personal computer. The input/output I/F 116 supports at least one of wired and wireless communications. The data of the equirectangular projection image, which is stored in the DRAM 114, is stored in the external medium via the input/output I/F 116 or transmitted to an external terminal (apparatus) via the input/output I/F 116, as needed.
The short-range communication circuit 117 communicates data with the external terminal (apparatus) via the antenna 117a of the image capturing device 10 by short-range wireless communication such as NFC, Bluetooth, and Wi-Fi. The short-range communication circuit 117 transmits the data of equirectangular projection image to an external terminal (apparatus).
The electronic compass 118 calculates an orientation of the image capturing device 10 from the Earth's magnetism to output orientation information. This orientation and tilt information is an example of related information, which is metadata described in compliance with Exif. This information is used for image processing such as image correction of captured images. The related information also includes a date and time when the image is captured by the image capturing device 10, and a data size of the image data. The gyro sensor 119 detects the change in tilt (roll, pitch, yaw) of the image capturing device 10 with movement of the image capturing device 10. The change in angle is another example of related information (metadata) described in compliance with Exif, and is used for image processing such as image correction of captured images. The acceleration sensor 120 detects acceleration in three axial directions. The image capturing device 10 calculates its position (an angle with respect to the direction of gravity), based on the acceleration detected by the acceleration sensor 120. With the gyro sensor 119 and the acceleration sensor 120, the image capturing device 10 is able to correct the tilt of an image with high accuracy. The network I/F 121 is an interface for performing data communication via a router, for example, using the communication network 100 such as the Internet.
Hardware Configuration of Distribution Terminal
The CPU 301 controls entire operation of the distribution terminal 30. The ROM 302 stores a control program for controlling the CPU 301, such as an initial program loader (IPL). The RAM 303 is used as a work area for the CPU 301. The HD 304 stores various data such as a control program. The HDD controller 305 controls reading or writing of various data to or from the HD 304 under control of the CPU 301. The display 306 displays various information such as a cursor, menu, window, characters, or image. The display 306 is an example of a display (display device). In one example, the display 306 is a touch panel display provided with an input device (input means). The external device connection I/F 308 is an interface for connecting to various external devices. Examples of the external devices include, but are not limited to, a universal serial bus (USB) memory and a printer. The network I/F 309 is an interface that controls communication of data through the communication network 100. The bus line 310 is an address bus or a data bus, which electrically connects the elements in
The keyboard 311 is an example of an input device including a plurality of keys for inputting characters, numerical values, various instructions, and the like. The pointing device 312 is an example of an input device that allows a user to select or execute a specific instruction, select a target for processing, or move a cursor being displayed. The input device is not limited to the keyboard 311 and the pointing device 312, and may be a touch panel, a voice input device, or the like. The DVD-RW drive 314 reads and writes various data from and to a DVD-RW 313, which is an example of a removable recording medium.
As an alternative to the DVD-RW, any recording medium may be used, such as a DVD-R, a Blu-ray Disc (Registered Trademark), etc. The medium I/F 316 controls reading or writing (storing) of data with respect to a recording medium 315 such as a flash memory. The microphone 318 is an example of a built-in audio collecting device capable of inputting audio under control of the CPU 301. The audio I/O I/F 317 is a circuit for inputting an audio signal from the microphone 318 or outputting an audio signal to the speaker 319 under control of the CPU 301. The short-range communication circuit 320 communicates data with an external terminal (apparatus) by short-range wireless communication such as NFC, Bluetooth, and Wi-Fi.
Hardware Configuration of Communication Management System
Hardware Configuration of Communication Terminal
Hardware Configuration of Target Area Designation Device
Further, any one of the above-described control programs may be recorded in a file in a format installable or executable on a computer-readable recording medium for distribution. Examples of the recording medium include a CD-R (Compact Disc Recordable), a DVD (Digital Versatile Disk), a Blu-ray Disc, an SD card, and a USB memory. In addition, such recording medium may be provided in the form of a program product to users within a certain country or outside that country. For example, the communication terminal 70 executes the control program to implement a method of displaying an image according to the present disclosure.
Functional Configuration
Referring to
Functional Configuration of Image Capturing Device
Referring to
The communication unit 11 is implemented by instructions of the CPU 111, and transmits or receives various data or information to or from another device or terminal. The communication unit 11 communicates data with another device or terminal using, for example, the short-range communication circuit 117, based on short-range wireless communication technology. The communication unit 11 also communicates data with another device or terminal through the input/output I/F 116 using any desired cable. Furthermore, the communication unit 11 communicates data with another device or terminal over the communication network 100 through the network I/F 121.
The reception unit 12 is implemented by the operation unit 115, which operates according to the instructions of the CPU 111, to receive various selections or inputs from the user. The image capturing unit 13 is implemented by the imaging unit 101, the image processor 104, and the imaging controller 105, illustrated in
The storing and reading unit 19, which is implemented by the instructions of the CPU 111, stores various data or information in the storage unit 1000 or reads out various data or information from the storage unit 1000.
Functional Configuration of Target Area Designation Device
Next, referring to
The communication unit 21 is implemented by instructions of the CPU 201, and transmits or receives various data or information to or from another device or terminal. The communication unit 21 communicates data with another device or terminal using, for example, the short-range communication circuit 220, based on short-range wireless communication technology. The communication unit 21 also communicates data with another device or terminal through the external device connection I/F 208 using any desired cable. Furthermore, the communication unit 21 communicates data with another device or terminal over the communication network 100 through the network I/F 209.
The detection unit 22 is implemented by instructions of the CPU 201, and detects designation of a target area corresponding to a specific location where the target area designation device 20 is provided. For example, when a trouble occurs at the specific location (target area) where the target area designation device 20 is provided, the detection unit 22 determines to designate the target area where the trouble occurs to be viewed by the user at the browsing site. The detection unit 22 may detect a trouble in various ways. In one example, the detection unit 22 may detect occurrence of the trouble based on recognition of an image taken at the distribution site. In another example, the detection unit 22 may detect occurrence of the trouble based on outputs from various sensors of the target area designation device 20. In another example, the detection unit 22 may detect designation of a target area in response to pressing of a dedicated button provided at the target area designation device 20, by an operator at the distribution site.
Functional Configuration of Distribution Terminal
Next, referring to
The data exchange unit 31, which is implemented by the network I/F 309 that operates according to instructions of the CPU 301, transmits or receives various data or information to or from other device or terminal through the communication network 100.
The reception unit 32, which is implemented by the keyboard 311 or the pointing device 312 that operates according to instructions of the CPU 301, receives various selections or inputs from the user. The image and audio processing unit 33, which is implemented by the instructions of the CPU 301, applies image processing to the captured image data obtained by the image capturing device 10 capturing the object. The image and audio processing unit 33 further processes audio data, based on audio signals, which are converted from voice of the user by the microphone 318. Further, the image and audio processing unit 33 processes the captured image data received from the image capturing device 10 based on the image type information such as the source name. The display control unit 34 causes the display 306 to display an image based on the processed captured image data. More specifically, when the image type information indicates “special image”, the image and audio processing unit 33 converts the captured image data such as hemispherical image data as illustrated in
The display control unit 34, which is implemented by the instructions of the CPU 301, controls the display 306 to display various screens including various images or texts. The determination unit 35, which is implemented by instructions of the CPU 301, performs various determinations. For example, the determination unit 35 determines an image type corresponding to captured image data received from, for example, the image capturing device 10.
The generation unit 36 is implemented by instructions of the CPU 301. The generation unit 36 generates a source name, which is one example of the image type information, according to a naming rule, based on a determination result generated by the determination unit 35 indicating a general image or a special image (that is, spherical image in this disclosure). For example, when the determination unit 35 determines that the image type corresponding to the received captured image data is a general image, the generation unit 36 generates a source name of “Video” that indicates a general image type. By contrast, when the determination unit 35 determines that the image type corresponding to the received captured image data is a special image, the generation unit 36 generates a source name of “Video_Theta” that indicates a special image type.
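A minimal sketch of this naming rule is shown below; the function name and boolean flag are illustrative assumptions, not part of the embodiment.

```python
def generate_source_name(is_special_image: bool) -> str:
    """Generate a source name (one example of image type information) per the
    naming rule described above: "Video_Theta" for a special (spherical) image,
    "Video" for a general (planar) image."""
    return "Video_Theta" if is_special_image else "Video"

print(generate_source_name(True))   # "Video_Theta"
print(generate_source_name(False))  # "Video"
```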
The communication unit 37, which is implemented by the short-range communication circuit 320 that operates according to the instructions of the CPU 301, performs data communication with the communication unit 11 of the image capturing device 10 and the communication unit 21 of the target area designation device 20, by short-range wireless communication technology such as NFC, Bluetooth, or Wi-Fi. Although the communication unit 37 and the data exchange unit 31 are described above as independent from each other, the communication unit 37 and the data exchange unit 31 may be configured as a single unit.
The storing and reading unit 39, which is implemented by the instructions of the CPU 301, stores various data or information in the storage unit 3000 or reads out various data or information from the storage unit 3000.
Image Capturing Device Management Table
Image Type Management Table
The example of the image type management table illustrated in
Target Area Management Table
Functional Configuration of Communication Management System
Next, referring to
The data exchange unit 51, which is implemented by the network I/F 509 that operates according to instructions of the CPU 501, transmits or receives various data or information to or from other device or terminal through the communication network 100.
The determination unit 52, which is implemented by instructions of the CPU 501, performs various determinations. The generation unit 53, which is implemented by instructions of the CPU 501, generates an image data ID and predetermined-area information. The generation unit 53 generates, for example, predetermined area information indicating a predetermined area in a captured image captured by the image capturing device 10. The distribution site management unit 54 is implemented by instructions of the CPU 501, and manages distribution site information indicating a state of the distribution site.
The storing and reading unit 59, which is implemented by the instructions of the CPU 501, stores various data or information in the storage unit 5000 or reads out various data or information from the storage unit 5000.
Session Management Table
Image Type Management Table
Predetermined Area Management Table
In the example of
When the data exchange unit 51 newly receives predetermined-area information for a pair of the IP address of the communication terminal that is the transmission source of the captured image data and the IP address of the communication terminal that is the transmission destination of the captured image data, and that pair is the same as a pair currently stored in the table, the storing and reading unit 59 overwrites the currently managed predetermined-area information with the newly received predetermined-area information.
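Conceptually, the table behaves like a map keyed by the pair of IP addresses, as in the following sketch (the structure, field names, and example addresses are assumptions for illustration only).

```python
# Hypothetical in-memory stand-in for the predetermined-area management table.
# Keyed by the (source IP, destination IP) pair, so newly received
# predetermined-area information for an existing pair overwrites the old entry.
predetermined_area_table: dict[tuple[str, str], dict] = {}

def store_predetermined_area(src_ip: str, dst_ip: str, area_info: dict) -> None:
    predetermined_area_table[(src_ip, dst_ip)] = area_info  # overwrite if the pair already exists

store_predetermined_area("192.0.2.1", "192.0.2.2", {"r": 1.0, "theta": 90.0, "phi": 10.0})
store_predetermined_area("192.0.2.1", "192.0.2.2", {"r": 1.0, "theta": 90.0, "phi": 30.0})  # replaces the first entry
```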
Installation Information Management Table
Distribution Site Management Table
Functional Configuration of Communication Terminal
Next, referring to
The data exchange unit 71, which is implemented by the network I/F 709 that operates according to instructions of the CPU 701, transmits or receives various data or information to or from other device or terminal through the communication network 100.
The reception unit 72, which is implemented by the keyboard 711 or the pointing device 712 that operates according to instructions of the CPU 701, receives various selections or inputs from the user. The image and audio processing unit 73, which is implemented by instructions of the CPU 701, implements a function similar or substantially similar to that of the image and audio processing unit 33. The display control unit 74, which is implemented by the instructions of the CPU 701, controls the display 706 to display various screens including various images or texts. The determination unit 75, which is implemented by instructions of the CPU 701, performs various determinations.
The first generation unit 76, which is implemented by instructions of the CPU 701, implements a function similar or substantially similar to that of the generation unit 36. The target area specifying unit 77, which is implemented by instructions of the CPU 701, identifies a target area at the distribution site where the image capturing device 10 is installed. The target area specifying unit 77 specifies the target area to be displayed on a display screen, based on the target area designation information distributed from the distribution system 3 and the target area information stored in the target area management DB 7003.
The second generation unit 78, which is implemented by instructions of the CPU 701, generates predetermined-area information. The second generation unit 78 generates, for example, predetermined-area information indicating a predetermined area in a captured image captured by the image capturing device 10.
The storing and reading unit 79, which is implemented by the instructions of the CPU 701, stores various data or information in the storage unit 7000 or reads out various data or information from the storage unit 7000.
Image Type Management Table
Predetermined Area Management Table
Target Area Management Table
Processes and Operations
Next, referring to
When a user at the browsing site A (e.g., user A1) operates the communication terminal 70A to request display of the session selection screen for selecting a communication session, the reception unit 72 receives an instruction for displaying the session selection screen. The display control unit 74 of the communication terminal 70A causes the display 706 to display the selection screen 800 as illustrated in
When the user A1 selects a selection button (in this example, the selection button 810a) corresponding to a particular floor (in this example, floor A1) as the desired distribution site, the reception unit 72 receives selection of a corresponding communication session (in this example, se101) (S12). Then, the data exchange unit 71 transmits a request to participate in the communication session for communication with the desired distribution site, to the communication management system 50 (S13). This participation request includes a session ID identifying the communication session for which the selection is received at S12, and the IP address of the communication terminal 70A, which is a request source terminal. The communication management system 50 receives the participation request at the data exchange unit 51.
Next, the storing and reading unit 59 of the communication management system 50 performs a process for causing the communication terminal 70A to participate in the communication session (S14). More specifically, the storing and reading unit 59 adds, in the session management DB 5001 (
Operation of Managing Image Type Information
Next, referring to
When a user at the distribution site connects the image capturing device 10 to the distribution terminal 30, the storing and reading unit 19 of the image capturing device 10 reads out the GUID of the own device (image capturing device 10) stored in the storage unit 1000. Then, the communication unit 11 of the image capturing device 10 transmits the GUID of the image capturing device 10 to the distribution terminal 30 (S31). Accordingly, the communication unit 37 of the distribution terminal 30 receives the GUID of the image capturing device 10.
The determination unit 35 of the distribution terminal 30 determines whether a vendor ID and a product ID that are the same as the vendor ID and the product ID of the GUID received at S31 are stored in the image capturing device management DB 3001 (see
Next, the storing and reading unit 39 stores, in the image type management DB 3002 (see
Next, the data exchange unit 31 transmits a request for addition of the image type information to the communication management system 50 (S34). The request to add the image type information includes the IP address of the image capturing device 10, the site ID of the distribution site, and the image type information stored at S33. The communication management system 50 receives the request for addition of the image type information at the data exchange unit 51.
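The following sketch illustrates, under assumed data structures and field names, how the image type may be determined from the GUID at S32 and packaged into the addition request of S34. It is a conceptual illustration only, not the actual implementation, and the registered IDs are placeholders.

```python
# Hypothetical stand-in for the image capturing device management DB 3001:
# (vendor ID, product ID) pairs of devices known to produce spherical images.
KNOWN_SPHERICAL_DEVICES = {("vendor-aaaa", "product-0001")}

def determine_image_type(vendor_id: str, product_id: str) -> str:
    """Return "special" (spherical image) if the IDs are registered, else "general" (planar image)."""
    return "special" if (vendor_id, product_id) in KNOWN_SPHERICAL_DEVICES else "general"

def build_image_type_addition_request(ip_address: str, site_id: str, image_type: str) -> dict:
    """Illustrative payload of the S34 request: the IP address of the image capturing
    device, the site ID of the distribution site, and the image type information."""
    return {"ip_address": ip_address, "site_id": site_id, "image_type": image_type}

request = build_image_type_addition_request(
    "192.0.2.10", "site-A", determine_image_type("vendor-aaaa", "product-0001"))
```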
Next, the storing and reading unit 59 of the communication management system 50 searches the session management DB 5001 (see
Next, the generation unit 53 generates a unique image data ID (S36). Then, the storing and reading unit 59 adds, in the image type management DB 5002 (see
Next, the storing and reading unit 39 of the distribution terminal 30 stores, in the image type management DB 3002 (see
Further, the data exchange unit 51 of the communication management system 50 transmits a notification indicating the addition of the image type information to the communication terminal 70A (S40). The notification indicating addition of the image type information includes the image data ID generated at S36, and the IP address of the image capturing device 10 and the image type information that are stored at S37. The communication terminal 70A receives the notification indicating the addition of the image type information at the data exchange unit 71.
Next, the storing and reading unit 79 of the communication terminal 70A adds, in the image type management DB 7001 (see
Operation of Distributing Captured Image Data
Next, referring to
The communication unit 11 of the image capturing device 10 transmits, to the distribution terminal 30, captured image data obtained by capturing an object or surroundings, and audio data obtained by collecting sounds (S51). Since the image capturing device 10 is a device that is configured to obtain two hemispherical images, from which a spherical image is generated, the captured image data is configured by data of the two hemispherical images as illustrated in
Next, the data exchange unit 31 of the distribution terminal 30 transmits, to the communication management system 50, the captured image data and the audio data received from the image capturing device 10 (S52). Along with the captured image data and the audio data, an image data ID identifying the captured image data, which is a transmission target, is also transmitted. The communication management system 50 receives the captured image data, the audio data, and the image data ID at the data exchange unit 51.
Next, the data exchange unit 51 of the communication management system 50 transmits the captured image data and audio data to each of the communication terminals (communication terminals 70A and 70B) participating in the same session as the distribution terminal 30 (S53 and S55). Along with the captured image data and the audio data, an image data ID identifying the captured image data, which is a transmission target, is also transmitted. As a result, the data exchange units 71 of the communication terminals 70A and 70B receive the captured image data, the audio data, and the image data ID, respectively. The display control units 74 of the communication terminal 70A and the communication terminal 70B each cause the display 706 to display the captured image based on various types of data that are received (S54 and S56).
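Conceptually, S53 and S55 amount to relaying the same payload to every communication terminal participating in the session, as in the sketch below. The types, field names, and transport callback are assumptions for illustration and do not represent the actual protocol.

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """Hypothetical stand-in for one communication session managed by the system."""
    session_id: str
    participant_ips: list[str] = field(default_factory=list)

def relay_to_participants(session: Session, image_data_id: str,
                          captured_image: bytes, audio: bytes, send) -> None:
    """Relay the captured image data, audio data, and image data ID to every
    participant in the same session. `send(dest_ip, payload)` is an assumed
    transport callback."""
    payload = {"image_data_id": image_data_id,
               "captured_image": captured_image,
               "audio": audio}
    for ip in session.participant_ips:
        send(ip, payload)
```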
Through
Further, in the case of the special image, a part of the special image that is set by default, for example, is displayed. The part of the special image to be displayed may be changed according to a user instruction, as described below referring to
Referring now to
First, in response to an input operation performed by the user A1 on the spherical image, which is an example of the captured image, displayed on the communication terminal 70A, the reception unit 72 of the communication terminal 70A receives selection of a predetermined area for display (S71). For example, the user at the communication terminal 70A may select a certain part of the whole image that is currently displayed, with a pointing device such as a mouse or a finger.
Next, the data exchange unit 71 transmits a request for obtaining predetermined-area information to the communication management system 50 (S72). The predetermined-area information acquisition request includes the positional coordinates of the selected area received at S71. The data exchange unit 51 of the communication management system 50 receives the predetermined-area information acquisition request from the communication terminal 70A.
Next, the storing and reading unit 59 of the communication management system 50 searches the installation information management DB 5004 (see
Then, the generation unit 53 generates predetermined-area information corresponding to the positional coordinates received at S72 (S74). Specifically, the generation unit 53 generates predetermined-area information that causes generation of a predetermined-area image (an image transformed to be in perspective projection) from the spherical image, such that the received positional coordinates are located at the center of the predetermined-area image when displayed. In this example, the radius vector (r) and the polar angle (θ) are previously set to have predetermined values. In other words, processing is performed with the angle of view and the elevation angle of the image in perspective projection being constant. The azimuth angle (φ) is calculated as a relative value between the installation direction of the image capturing device 10, indicated in the installation information read at S73, and the direction of the positional coordinates as seen from the image capturing device 10. Based on the calculated result, the generation unit 53 generates predetermined-area information for causing generation of a predetermined-area image (in perspective projection) from the spherical image obtained by the image capturing device 10, with the predetermined-area image having the positional coordinates at its center when displayed. The installation position (coordinate values) and the installation direction (the direction in which the front surface of the image capturing device 10 faces) of the image capturing device 10 are set in advance in the installation information management DB 5004 by a person in charge, such as a system administrator.
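The following sketch illustrates one possible calculation of S74 under assumed coordinate conventions: the radius vector and polar angle are held at fixed values, and the azimuth angle is obtained from the installation direction and the direction of the selected positional coordinates as seen from the image capturing device 10. The function and field names are illustrative only.

```python
import math

def generate_predetermined_area_info(device_pos, device_dir_deg, target_xy,
                                     radius=1.0, polar_deg=90.0):
    """Sketch of S74 (assumed conventions): r and theta are fixed; phi is the
    bearing of the selected positional coordinates from the image capturing
    device, taken relative to the device's installation direction."""
    dx = target_xy[0] - device_pos[0]
    dy = target_xy[1] - device_pos[1]
    bearing_deg = math.degrees(math.atan2(dy, dx))        # direction of the point from the device
    azimuth_deg = (bearing_deg - device_dir_deg) % 360.0  # relative to installation direction
    return {"r": radius, "theta": polar_deg, "phi": azimuth_deg}

# Example with hypothetical installation information:
info = generate_predetermined_area_info(device_pos=(0.0, 0.0), device_dir_deg=90.0,
                                        target_xy=(2.0, 2.0))
print(info)  # phi = 315.0 degrees relative to the installation direction
```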
Next, the storing and reading unit 59 stores, in the predetermined-area management DB 5003 (see
Next, the storing and reading unit 79 of the communication terminal 70A stores, in the predetermined-area management DB 7002 (see
Next, in response to an input operation by the user A1, the reception unit 72 receives a selection of a display image to be displayed on the communication terminal 70A, from among the captured image data received at S53 (S78). When the communication terminal 70A is able to simultaneously display a plurality of captured images, or when the number of received captured images is smaller than the number of captured images that can be simultaneously displayed on the communication terminal 70A, the process of S78 may be omitted.
Next, in order to display an image of the predetermined area specified by the predetermined-area information received at S76, the image and audio processing unit 73 generates a predetermined-area image by applying perspective projection transformation to the captured image using the received predetermined-area information (S79). The display control unit 74 controls the display 706 to display the predetermined-area image generated at S79 (S80). The communication terminal 70A displays the predetermined-area image that includes a position of the spherical image being displayed, which is selected by the user A1 for viewing.
In a case where an image to be displayed is selected at S78, the predetermined-area image is displayed for the selected image at S80.
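For illustration, the perspective projection transformation of S79 can be sketched as sampling the equirectangular whole image along the rays of a virtual pinhole camera oriented by the predetermined-area information. The sketch below assumes particular angle conventions (φ as azimuth, θ as polar angle from the zenith, α as horizontal angle of view) and uses nearest-neighbor sampling; it is not the actual implementation of the image and audio processing unit 73.

```python
import numpy as np

def render_predetermined_area(equirect, phi_deg, theta_deg, alpha_deg, out_w=640, out_h=360):
    """Cut a perspective-projection view (predetermined-area image) out of an
    equirectangular whole image of shape (h, w, channels)."""
    h, w, _ = equirect.shape
    phi, theta, alpha = map(np.radians, (phi_deg, theta_deg, alpha_deg))
    pitch = np.pi / 2.0 - theta                      # elevation of the viewing direction

    # Ray directions for each output pixel on a pinhole image plane.
    f = (out_w / 2.0) / np.tan(alpha / 2.0)
    xs = np.arange(out_w) - out_w / 2.0 + 0.5
    ys = out_h / 2.0 - np.arange(out_h) - 0.5
    xs, ys = np.meshgrid(xs, ys)
    dirs = np.stack([np.full_like(xs, f), xs, ys], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Rotate rays by pitch (about the lateral axis), then by azimuth (about the vertical axis).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(phi), np.sin(phi)
    x, y, z = dirs[..., 0], dirs[..., 1], dirs[..., 2]
    x, z = x * cp - z * sp, x * sp + z * cp
    x, y = x * cy - y * sy, x * sy + y * cy

    # Back to longitude/latitude, then to source-pixel coordinates (nearest neighbor).
    lon = np.arctan2(y, x)
    lat = np.arcsin(np.clip(z, -1.0, 1.0))
    src_x = ((lon / (2.0 * np.pi) + 0.5) * w).astype(int) % w
    src_y = ((0.5 - lat / np.pi) * h).astype(int).clip(0, h - 1)
    return equirect[src_y, src_x]
```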
Display Processing of Target Area
Next, referring to
First, the detection unit 22 of the target area designation device 20 detects designation of a target area (S101). In this case, when the detection unit 22 detects occurrence of a trouble or any event to be notified at the distribution site, for example, based on the result of image recognition or various sensor outputs, the detection unit 22 determines to designate a target area where the event is detected as a target area for viewing. In alternative to such automatic detection, the detection unit 22 may detect designation of a target area in response to pressing of a dedicated button provided at the target area designation device 20, by an operator at the distribution site.
Next, the communication unit 21 transmits target area designation information to the distribution terminal 30 (S102). The target area designation information includes a target area ID for identifying the target area designation device 20. Accordingly, the communication unit 37 of the distribution terminal 30 receives the target area designation information transmitted from the target area designation device 20.
Next, the data exchange unit 31 of the distribution terminal 30 transmits, to the communication management system 50, the target area designation information transmitted from the target area designation device 20 (S103). The communication management system 50 receives the target area designation information at the data exchange unit 51.
Next, the data exchange unit 51 of the communication management system 50 transmits the target area designation information to each of the communication terminals (communication terminals 70A and 70B) participating in the same session as the distribution terminal 30 (S104 and S106). As a result, the data exchange units 71 of the communication terminals 70A and 70B each receive the target area designation information. Then, the communication terminal 70A and the communication terminal 70B execute processing to display the target area using the received target area designation information (S105 and S107).
Referring now to
First, in response to reception of the target area designation information by the data exchange unit 71, the determination unit 75 determines whether or not to change display of a display screen (S201). Specifically, the determination unit 75 determines whether or not to change display of a display screen, based on whether a predetermined input operation by the user is received at the reception unit 72. For example, when a request for viewing a particular area has been received, the determination unit 75 determines that the display screen is not to be changed according to the target area designation information. Otherwise, the determination unit 75 determines to change the display according to the target area designation information. With this configuration, the image communication system 1 is able to control both a communication terminal that changes display of a target area and a communication terminal that continues to display the same target area without changing the display. Therefore, in a case where the user does not want to change the display, such as a case where the user is looking at another location of the distribution site, the user may continue to view that other area of the spherical image without changing the display of the communication terminal 70, even when the target area designation information is received.
Based on a determination that the display screen is to be changed (YES at S201), the operation proceeds to S202. On the other hand, when it is determined that the display screen is not to be changed (NO at S201), the determination unit 75 ends the process without changing the display, at least according to the target area designation information.
Next, the target area specifying unit 77 searches the target area management DB 7003 (see
Next, in response to an input operation by the user A1, the reception unit 72 receives a selection of an image to be displayed (display image) on the communication terminal 70, from among the target area information obtained at S202 (S203). When the communication terminal 70 is able to simultaneously display a plurality of captured images, or when the number of received captured images is smaller than the number of captured images that can be simultaneously displayed on the communication terminal 70, the process of S203 may be omitted.
Next, the image and audio processing unit 73 generates a predetermined-area image by applying perspective projection transformation to a target area of the captured image data, as specified by the target area information corresponding to the display image selected at S203 (S204). The display control unit 74 controls the display 706 to change display, from the currently-displayed image, to the predetermined-area image generated at S204 (S205). For example, the display control unit 74 switches the display from the predetermined-area image (for example, the second predetermined-area image) displayed at S80 to the predetermined-area image (for example, the first predetermined-area image) generated at S204.
The communication terminal 70 executes the above-described processing on each of respective captured images being displayed. For example, assuming that it is determined that all captured images are to be changed at S201, the communication terminal 70 acquires the target area information for each of the currently-displayed captured images at S202, and changes each display image such that all captured images being displayed contain respective target areas, which differ among captured images being displayed.
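The overall handling of S201 through S205 for one displayed captured image can be sketched as follows. The table contents, function names, and callbacks are assumptions for illustration only.

```python
# Hypothetical stand-in for the target area management DB 7003: target area ID ->
# predetermined-area parameters (r, theta, phi) for that area.
target_area_table = {
    "area-1": {"r": 1.0, "theta": 90.0, "phi": 0.0},
    "area-2": {"r": 1.0, "theta": 90.0, "phi": 120.0},
}

def handle_target_area_designation(target_area_id: str,
                                   user_pinned_view: bool,
                                   render, update_display) -> None:
    """`render(area_info)` stands for the perspective projection transformation of S204;
    `update_display(image)` stands for the display switch of S205."""
    if user_pinned_view:
        # S201: the user has specified a viewing area, so the display is not changed.
        return
    area_info = target_area_table.get(target_area_id)   # S202: look up the target area information
    if area_info is None:
        return
    update_display(render(area_info))                    # S204-S205: generate and display the image
```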
As described above, the communication terminal 70 acquires the position of the target area designated by the target area designation device 20 based on the target area information set in advance, and displays the predetermined-area image that is generated based on the acquired target area information such that the designated target area is displayed. Specifically, the communication terminal 70 changes the display position (viewable area) of the spherical image so as to display the designated target area. Accordingly, the communication terminal 70 displays an image containing the target area, without requiring operation to the image capturing device 10.
Further, in a case of using an image capturing device that captures a general planar image, it is often necessary to physically change the direction of the image capturing device to display a target area, which may differ each time, at the communication terminal. In such a case, if more than one communication terminal is provided and an image displayed on one communication terminal is changed to include a designated target area, the image displayed on another communication terminal will be the same image including the designated target area. In view of this, the image communication system 1 uses the spherical image captured by the image capturing device 10, and transmits the spherical image to each communication terminal 70. Each communication terminal 70 can then cut out and display only a target area of the spherical image, as requested by the user at the communication terminal 70. That is, display images may differ between the communication terminals 70.
Variation in Processing of Displaying Target Area
Next, referring to
First, the communication unit 11 of the image capturing device 10 transmits the captured image data and the audio data to the distribution terminal 30 (S301). Thus, the data exchange unit 31 of the distribution terminal 30 receives the captured image data and the audio data transmitted from the image capturing device 10.
The determination unit 35 of the distribution terminal 30 determines whether or not to designate a target area (S302). Specifically, for example, in a case where a parameter such as the degree of match or the degree of similarity between a reference image previously set for a distribution site and an acquired captured image is out of a predetermined range, the determination unit 35 determines to designate a target area. Further, for example, the determination unit 35 determines a position of the spherical image at which the parameter value is out of the predetermined range, as the designated target area. The following describes the example case where the determination unit 35 determines to designate a target area.
Next, when the determination unit 35 determines at S302 that a target area is designated, the storing and reading unit 39 reads the target area designation information, from the target area management DB 3003, based on the coordinate value that is determined to be the designated target area (S303). Specifically, the storing and reading unit 39 reads, as the target area designation information, the target area ID associated with the coordinate value closest to the coordinate value of the designated target area, from among the coordinate values stored in the target area management DB 3003.
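A minimal sketch of S302 and S303 under assumed data structures is shown below: when the comparison with the reference image falls outside the predetermined range, the target area ID whose registered coordinate value is closest to the detected position is selected as the target area designation information. The table contents and names are placeholders.

```python
import math

# Hypothetical stand-in for the target area management DB 3003:
# target area ID -> registered (x, y) coordinate value.
target_area_coords = {"area-1": (2.0, 5.0), "area-2": (8.0, 1.0)}

def designate_target_area(anomaly_xy, out_of_range: bool):
    """Sketch of S302-S303: if the parameter is out of the predetermined range at
    position `anomaly_xy`, return the target area ID with the closest registered
    coordinate value; otherwise no target area is designated."""
    if not out_of_range:
        return None                                   # S302: no target area designated
    return min(target_area_coords,
               key=lambda area_id: math.dist(target_area_coords[area_id], anomaly_xy))

print(designate_target_area((7.0, 2.0), out_of_range=True))  # -> "area-2"
```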
Then, the data exchange unit 31 transmits the captured image data and audio data received at S301, and the target area designation information read at S303, to the communication management system 50 (S304). Thus, the data exchange unit 51 of the communication management system 50 receives the captured image data, the audio data, and the target area designation information transmitted from the distribution terminal 30. Processing from S305 to S308 is performed in the same or substantially the same manner as S104 to S107 described above with reference to
As described above, the image communication system according to the modified example is not provided with the target area designation device 20 at the distribution site. Even in such a case, the distribution terminal 30 determines whether a target area is designated and, if designated, the position of the target area, based on the content of the captured image, thereby causing the communication terminal 70 to display the designated target area. That is, the distribution terminal 30 is able to designate the target area in place of the target area designation device 20, such that the designated target area is displayed to the user at the browsing site.
In the image communication system 1 of the modified example, according to designation of a target area at the distribution terminal 30, the communication terminal 70 at the browsing site displays an image of the designated target area without requiring operation on the image capturing device 10 at the distribution site.
Further, in the image communication system 1, the communication terminal 70 cuts out a portion of the spherical image to display only the portion including the target area. Thus, for example, even when one communication terminal 70 changes its display to show the designated target area, another communication terminal 70 is able to continue to display a target area that differs from the designated target area, without changing its display.
In the above embodiment, the predetermined area T is specified by predetermined-area information indicating an imaging direction and an angle of view of the virtual camera IC in a three-dimensional virtual space containing the spherical image CE, but the present disclosure is not limited thereto. The predetermined area T may be specified by predetermined point information indicating the center point CP or an arbitrary point among the four corners of the rectangular predetermined area T in
In any one of the above-described embodiments, the communication terminal 70 at the browsing site does not need to be a dedicated terminal for browsing. In another example, the communication terminal 70 may be additionally provided with a function of distributing a captured image, such that the distribution function and the browsing function can be performed concurrently. Similarly, the distribution terminal 30 at the distribution site does not need to be a terminal dedicated to distribution. In another example, the distribution terminal 30 may be additionally provided with a function of displaying a captured image, such that the distribution function and the browsing function can be performed concurrently. As described above, the image communication system 1 may be configured to perform bidirectional communication of captured images between a plurality of sites.
As described above, according to one aspect, the communication terminal capable of displaying, on a display, a part of a whole image (such as a spherical image) as a viewable-area image (which may also be referred to as a predetermined-area image) is provided. The communication terminal 70 includes a storage unit (for example, the target area management DB 7003), a reception unit (for example, the data exchange unit 71), and a display control unit (for example, the display control unit 74). The display 706, which may be built in or an external device, is also provided. The target area management DB 7003 (an example of the storage unit) stores target area information indicating a plurality of target areas at the distribution site at which the whole image is captured. The reception unit receives, from the distribution system 3 at the distribution site, target area designation information designating a particular target area of the plurality of target areas. The display control unit causes the display to display a viewable-area image including the particular target area, based on the target area information corresponding to the particular target area indicated by the received target area designation information. With this configuration, the communication terminal is able to display an image of the distribution site corresponding to the target area designated at the distribution site, without requiring operation on the image capturing device at the distribution site.
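A minimal sketch of this configuration, assuming hypothetical class and method names, may help to relate the storage unit, the reception unit, and the display control unit.

```python
class CommunicationTerminal:
    """Sketch of the browsing-side terminal 70: a storage unit holding
    target area information, a reception step for target area designation
    information, and a display control step that shows the viewable-area
    image including the designated target area."""

    def __init__(self, target_area_db: dict):
        # Maps a target area ID to viewing parameters (e.g. pan/tilt of the
        # virtual camera), mirroring the target area management DB 7003.
        self.target_area_db = target_area_db
        self.current_view = None

    def on_receive(self, target_area_designation: str):
        # Reception of target area designation information from the
        # distribution system 3 at the distribution site.
        target_area_info = self.target_area_db[target_area_designation]
        self.display_viewable_area(target_area_info)

    def display_viewable_area(self, view_params: dict):
        # Display control: show the part of the whole image that includes
        # the designated target area.
        self.current_view = view_params
        print(f"displaying viewable area at {view_params}")

terminal = CommunicationTerminal({"ta02": {"pan": 45.0, "tilt": -10.0}})
terminal.on_receive("ta02")
```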
According to one aspect, the communication terminal displays, on the display, a viewable-area image, which is an image of a part of the whole image (for example, the spherical image), before the target area designation information is received. In one example, a viewable-area image including a part of the whole image specified by the user at the browsing site is displayed. In another example, a viewable-area image including a part of the whole image set by default is displayed. In response to reception of the target area designation information, the communication terminal changes the display from the currently-displayed viewable-area image to a viewable-area image including the designated target area. Accordingly, the communication terminal changes the area of the whole image being displayed to an area including the target area designated at the distribution site.
According to one aspect, the communication terminal 70 further includes a determination unit (for example, the determination unit 75) that determines whether to change the display, in response to the received target area designation information. When the determination unit 75 determines to change the display, the communication terminal 70 changes the display from the currently-displayed viewable-area image to a viewable-area image including the designated target area. For example, the determination unit determines whether the part of the whole image currently displayed as the viewable-area image is specified by user input. When the part of the whole image is specified by the user at the browsing site, the determination unit determines not to change the display. When the part of the whole image is not specified by the user, the determination unit determines to change the display. With this configuration, in response to reception of the target area designation information, the communication terminal 70 is able to selectively switch between a first case in which the display is changed to include the designated target area, and a second case in which the display is left unchanged to continue displaying the image specified by the user.
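The determination described above can be sketched as a simple policy: change the display only when the currently displayed part was not specified by the user. The class and attribute names below are hypothetical.

```python
class DisplaySwitchPolicy:
    """Sketch of the determination at the browsing site: when the currently
    displayed part of the whole image was specified by the user, keep it;
    otherwise switch to the viewable-area image including the designated
    target area."""

    def __init__(self):
        self.user_specified_view = False  # set True when the user pans/zooms

    def should_change_display(self) -> bool:
        return not self.user_specified_view

policy = DisplaySwitchPolicy()
print(policy.should_change_display())   # True: default view follows the designation
policy.user_specified_view = True
print(policy.should_change_display())   # False: keep the user's own view
```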
In one aspect, an image communication system 1 includes the communication terminal (for example, the communication terminal 70) and a distribution system (for example, the distribution system 3). The distribution system 3 includes an obtaining unit (for example, the image capturing device 10) and a transmission unit (for example, the data exchange unit 31 of the distribution terminal 30). The image capturing device 10 captures the whole image (for example, the spherical image) covering the distribution site. The data exchange unit 31 transmits the acquired whole image to the communication terminal 70.
In one example, the distribution system 3 further includes a target area designation device 20 that designates a target area. The distribution system 3 transmits, to the communication terminal 70, target area designation information indicating the target area designated by the target area designation device 20. With this configuration, in the image communication system 1, the communication terminal 70 displays the viewable-area image including the target area designated by the target area designation device 20. This allows the user at the browsing site to view an image of the distribution site corresponding to the designated target area. Further, in one example, the image communication system 1 may cause a plurality of communication terminals 70 to display images of different areas of the distribution site. Specifically, in such a case, each communication terminal 70 is caused to display the target area based on the target area designation information transmitted from the distribution system 3.
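The distribution-side flow can be sketched as follows, assuming hypothetical names for the capture and transmission callables. The sketch only illustrates that a pending designation from the target area designation device 20 is sent together with the whole image.

```python
import time

class DistributionSystem:
    """Sketch of the distribution side: an obtaining step (the image
    capturing device supplies the whole image) and a transmission step (the
    distribution terminal sends the whole image, together with any target
    area designation from the target area designation device, toward the
    browsing site)."""

    def __init__(self, capture, send):
        self.capture = capture           # callable returning the whole image
        self.send = send                 # callable transmitting a message
        self.pending_designation = None  # set by the target area designation device

    def designate(self, target_area_id: str):
        self.pending_designation = target_area_id

    def distribute_once(self):
        whole_image = self.capture()
        self.send({
            "timestamp": time.time(),
            "whole_image": whole_image,
            "target_area_designation": self.pending_designation,
        })
        self.pending_designation = None

system = DistributionSystem(capture=lambda: b"<equirectangular frame>",
                            send=print)
system.designate("ta02")
system.distribute_once()
```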
Further, in the image communication system 1 according to one aspect, the distribution system 3 includes a determination unit (for example, the determination unit 35) that determines whether or not to designate a target area with respect to the acquired whole image (for example, the spherical image). The distribution system 3 transmits target area designation information indicating the target area to the communication terminal 70, based on a determination that the target area is designated. With this configuration, in the image communication system 1, the distribution terminal 30 is able to designate the target area, in place of the target area designation device 20, such that the designated target area is displayed to the user at the browsing site.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA), system on a chip (SOC), graphics processing unit (GPU), and conventional circuit components arranged to perform the recited functions.
Further, various tables used in any one of the above-described embodiments may be generated by machine learning. Further, data of associated items may be classified by machine learning, such that the use of tables is optional. In the present disclosure, machine learning is a technique that enables a computer to acquire human-like learning ability. Machine learning refers to a technology in which a computer autonomously generates an algorithm required for determination, such as data identification, from learning data loaded in advance, and applies the generated algorithm to new data to make a prediction. Any suitable learning method may be applied for machine learning, for example, any one of supervised learning, unsupervised learning, semi-supervised learning, reinforcement learning, and deep learning, or a combination of two or more of these learning methods.
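As one illustration of replacing a table with classification, the following sketch uses a simple 1-nearest-neighbor rule over labeled coordinate samples. The samples, labels, and function name are hypothetical, and any other learning method mentioned above could be substituted.

```python
import math

# Hypothetical labeled samples: coordinate values observed at the
# distribution site, each labeled with the target area it belongs to.
training_samples = [
    ((510, 300), "ta01"), ((530, 330), "ta01"),
    ((1470, 280), "ta02"), ((1500, 305), "ta02"),
    ((970, 750), "ta03"), ((990, 780), "ta03"),
]

def classify(coordinate):
    """1-nearest-neighbor classification of a coordinate value into a target
    area, as a stand-in for a hand-maintained association table."""
    _, label = min(training_samples,
                   key=lambda sample: math.dist(sample[0], coordinate))
    return label

print(classify((1495, 290)))  # -> "ta02"
```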
One or more embodiments of the communication terminal (including the distribution terminal as an example), the image communication system, the image display method, and the program for controlling image display are described above. The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention. Any one of the above-described operations may be performed in various other ways, for example, in an order different from the one described above.