The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2013-154785 filed in Japan on Jul. 25, 2013, Japanese Patent Application No. 2013-199004 filed in Japan on Sep. 25, 2013, and Japanese Patent Application No. 2014-086773 filed in Japan on Apr. 18, 2014.
1. Field of the Invention
The present invention relates to a distribution management apparatus.
2. Description of the Related Art
Conventionally, electronic information boards capable of displaying a background image on a large-screen display and enabling a user to write a drawing image, such as a character, a number, or a graphic, on the background image have been used in meetings of business enterprises, educational institutions, administrative agencies, and the like. Such an electronic information board has an enlarged display function of displaying an enlarged image of an image displayed on a display screen of a personal computer (PC) connected to the electronic information board, a PC operating function of operating the connected PC through a touch panel function built into the electronic information board, an electronic blackboard function of displaying a drawn image, such as a character handwritten by a user on the touch panel likened to a blackboard, in a manner superimposed on the PC display image, and the like. Through the use of such an electronic information board, for example, in an office meeting, a user can directly write down points of note or the like in a display image while performing an operation to display explanatory materials on the electronic information board, and can record a drawn image written down on the electronic information board. Accordingly, it is possible to reuse the drawn image to summarize the contents of the meeting efficiently.
Incidentally, Japanese Patent No. 4696480 discloses a technique of storing, in a server, history data of memos handwritten on an electronic blackboard and superimposed on materials, thereby enabling drawn images to be displayed in a superimposed manner on electronic blackboards installed at multiple bases of a remote meeting.
However, to cause electronic information boards to operate as electronic blackboards in multiple bases of a remote meeting, the electronic information boards are required to have a high software processing capacity, which results in an increase in cost of equipment. Meanwhile, according to a technique as disclosed in Japanese Patent No. 4696480 in which software processing is performed by an external server, going through a network causes a delay in processing, and therefore displaying of a handwritten drawn image is delayed, which impedes the progress of a meeting.
In view of the above, there is a need to provide a distribution management apparatus capable of displaying, on a terminal, a drawn image handwritten by a user without delay at low cost.
It is an object of the present invention to at least partially solve the problems in the conventional technology.
A distribution management apparatus includes: a receiving unit that receives operation information, which indicates operation input that a terminal has accepted, from the terminal via a network; a browser that creates drawing information to be displayed on the terminal from the operation information; an encoder that encodes the drawing information; and a transmitting unit that transmits the encoded drawing information to the terminal.
The above and other objects, features, advantages and technical and industrial significance of this invention will be better understood by reading the following detailed description of presently preferred embodiments of the invention, when considered in connection with the accompanying drawings.
A distribution management apparatus (an image processing server) according to a first embodiment of the present invention is explained in detail below with reference to the accompanying drawings. Incidentally, the present invention is not limited to this embodiment. Furthermore, identical components are denoted by the same reference numerals in the drawings.
The image processing server 502 is realized by an information processing apparatus such as a workstation or a general-purpose computer, and includes storage devices such as a memory (a ROM, a RAM, or the like) and a recording medium (a CD-ROM, a hard disk, or the like), a communication device, output devices such as a display device and a printer, and an input device. An arithmetic processing unit such as a CPU in the information processing apparatus executes an image processing program stored in the memory, whereby the image processing server 502 performs image processing to be described later.
The drawing device 539 is a pen-shaped device equipped with a contact-sensing unit, which senses contact with a physical body, on the tip thereof, and is used to draw an image while being in contact with the display unit 536. When the contact-sensing unit of the drawing device 539 comes into contact with a physical body, the drawing device 539 transmits a contact signal, which indicates the contact with the physical body, together with identification information of the drawing device 539 to the coordinate detecting unit 538.
Incidentally, the drawing device 539 in the present embodiment is equipped with an erase-mode selector switch, for switching from the normal drawing mode to the erase mode, on the side surface or rear end thereof. When a user brings the drawing device 539 into contact with the display unit 536 while holding down the erase-mode selector switch, the drawing device 539 operates in the erase mode, and transmits a contact signal together with the identification information of the drawing device 539 and mode type information indicating the erase mode to the coordinate detecting unit 538. When the user brings the drawing device 539 into contact with the display unit 536 without holding down the erase-mode selector switch, the drawing device 539 operates in the drawing mode, and transmits a contact signal together with the identification information of the drawing device 539 to the coordinate detecting unit 538. Furthermore, the drawing device 539 allows the user to select an object, such as a menu or a button, displayed on the display unit 536. When the user brings the drawing device 539 into contact with an object displayed on the display unit 536 without holding down the erase-mode selector switch, i.e., when the contact position is within the coordinate area of an object, the drawing device 539 operates in the selection notification mode. In this case, the drawing device 539 transmits a contact signal together with the identification information of the drawing device 539 and mode type information indicating the selection notification mode to the coordinate detecting unit 538.
The contact-sensing device 537 senses contact of a physical body, such as the drawing device 539, with the display unit 536. In the present embodiment, an infrared-interruption type touch panel is adopted as the contact-sensing device 537. This contact-sensing device 537, with two light emitting/receiving devices placed at both lower ends of the display unit 536, emits infrared rays in a direction parallel to the display unit 536 and receives infrared rays reflected back along the same light paths by a reflecting member placed around the display unit 536. The contact-sensing device 537 notifies the coordinate detecting unit 538 of identification information of the infrared rays that have been emitted from the two light emitting/receiving devices and interrupted by the physical body. Incidentally, as the contact-sensing device 537, there may be adopted a capacitance type touch panel that detects contact of a physical body with the display unit 536 by sensing a change in capacitance. Furthermore, a resistive type touch panel that detects contact of a physical body with the display unit 536 from a change in voltage across two facing resistance films may be adopted as the contact-sensing device 537. Moreover, an electromagnetic induction type touch panel that detects contact of a physical body with the display unit 536 by sensing electromagnetic induction generated by the contact may be adopted as the contact-sensing device 537.
The coordinate detecting unit 538 identifies the coordinate position at which a physical body has made contact with the display unit 536 on the basis of the information notified by the contact-sensing device 537. The coordinate detecting unit 538 in the present embodiment uses the identification information of the infrared rays notified by the contact-sensing device 537 to calculate the coordinate position of the physical body. Furthermore, when the coordinate detecting unit 538 has received a contact signal from the drawing device 539, the coordinate detecting unit 538 issues an event (a drawing instruction event, a selection notification event, or an erase instruction event) corresponding to the operation mode (the drawing mode, the selection notification mode, or the erase mode) of the drawing device 539. This event includes the identification information of the drawing device 539 and mode type information indicating the operation mode. The coordinate detecting unit 538 further issues a sub-event in addition to the event. Sub-events issued by the coordinate detecting unit 538 include, for example, a sub-event (TOUCH) which notifies that a physical body has come into contact with or close to the display unit 536, a sub-event (MOVE) which notifies that the contact or close point has moved while the physical body is kept in contact with or close to the display unit 536, and a sub-event (RELEASE) which notifies that the physical body has separated from the display unit 536. These sub-events each include coordinate position information of the contact or close position.
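Incidentally, the relation between events and sub-events can be illustrated by the following Python sketch; the type names and fields are illustrative assumptions, not part of the disclosed configuration.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):          # operation mode carried by the contact signal
    DRAWING = auto()
    SELECTION = auto()     # selection notification mode
    ERASE = auto()

class SubEvent(Enum):      # phase of the contact reported with coordinates
    TOUCH = auto()
    MOVE = auto()
    RELEASE = auto()

@dataclass
class Event:
    device_id: str         # identification information of the drawing device 539
    mode: Mode             # mode type information
    sub_event: SubEvent
    x: int                 # coordinate position of the contact or close point
    y: int

# Example: a drawing instruction event with a TOUCH sub-event at (120, 48)
event = Event("pen-01", Mode.DRAWING, SubEvent.TOUCH, 120, 48)
```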
The communication unit 534 is a network interface with the network 504. The communication control unit 535 transmits information, such as authentication information and event information, to the image processing server 502 through the communication unit 534, and receives image data to be displayed on the display unit 536 from the image processing server 502 through the communication unit 534.
The ROM 532 is a non-volatile memory in which a boot program, such as a BIOS or an EFI, is stored. The RAM 533 is a main memory such as a DRAM or an SRAM, and provides a work area for executing an image processing program.
The processor 531 is an arithmetic processing unit such as a CPU or an MPU. It runs an OS, such as one of the Windows® series, UNIX®, Linux®, TRON, ITRON, or μITRON, and executes, under the control of the OS, an image processing program written in a programming language such as assembler, C, C++, Java®, JavaScript®, Perl, Ruby, or Python. The processor 531 reads out the image processing program from a hard disk device (not shown) that permanently holds software programs and various data, expands the read image processing program into the RAM 533, and executes the image processing program, thereby functioning as an event processing unit 5331, a drawing generating unit 5334 including a drawing-limits determining unit 5332 and a drawing-data generating unit 5333, an app-image generating unit 5335, a synthesizing unit 5336, and a display control unit 5337. The respective functions of these units are described later.
Incidentally, the image processing program executed by the image processing server 502 and the image processing apparatuses 503 according to the present embodiment is provided by recording the image processing program on a computer-readable recording medium, such as a CD-ROM, a flexible disk (FD), a CD-R, or a digital versatile disk (DVD), in an installable or executable file format.
Furthermore, the image processing program executed by the image processing server 502 and the image processing apparatuses 503 according to the present embodiment may be provided in such a manner that the image processing program is stored on a computer connected to a network such as the Internet so that the image processing program can be downloaded over the network 504. Moreover, the image processing program executed by the image processing server 502 and the image processing apparatuses 503 according to the present embodiment may be provided or distributed over a network such as the Internet. Furthermore, the image processing program according to the present embodiment may be embedded in a ROM or the like in advance.
The image processing program executed by the image processing server 502 and the image processing apparatuses 503 according to the present embodiment is composed of modules including the above-described units (the event processing unit 5331, the drawing generating unit 5334 including the drawing-limits determining unit 5332 and the drawing-data generating unit 5333, the app-image generating unit 5335, the synthesizing unit 5336, and the display control unit 5337). A CPU (a processor) as actual hardware reads out the image processing program from a storage medium and executes the image processing program, whereby the above-described units are loaded and generated on a main memory. Incidentally, at least some of the units may be realized by hardware such as an integrated circuit (IC).
Image Processing by Image Processing Server
The image processing server 502 distributes image data to some or all of the image processing apparatuses 503 at a predetermined frequency, and causes the image processing apparatuses 503 to update an image frame displayed on the display unit 536.
This image data is, as illustrated in
Incidentally, this image data may be compressed. In this case, the compressed image data is decompressed in the image processing apparatuses 503 to display the image data on respective display units 536 of the image processing apparatuses 503. Furthermore, when the image processing server 502 has been notified of a selection notification event or an erase instruction event, the image processing server 502 performs image processing according to the notified event.
Image Processing by Image Processing Apparatus
In the process at Step S1, it is determined whether the communication control unit 535 has received image data from the image processing server 502. When the communication control unit 535 has received image data (YES at Step S1), the image processing proceeds to a process at Step S11; on the other hand, when the communication control unit 535 has not received image data (NO at Step S1), the image processing proceeds to a process at Step S2.
In the process at Step S2, it is determined whether the event processing unit 5331 has received any event from the coordinate detecting unit 538. When the event processing unit 5331 has not received any event (NO at Step S2), the image processing returns to the process at Step S1 to wait to receive image data or an event; on the other hand, when the event processing unit 5331 has received an event (YES at Step S2), the image processing proceeds to a process at Step S3.
In the process at Step S3, it is determined whether the event received by the event processing unit 5331 is a drawing instruction event. When the received event is a drawing instruction event (YES at Step S3), the image processing proceeds to a process at Step S4. On the other hand, when the received event is not a drawing instruction event, i.e., when the received event is a selection notification event or an erase instruction event (NO at Step S3), the event processing unit 5331 notifies the image processing server 502 of the event (Step S8). After that, the image processing returns to the process at Step S1 to wait to receive image data or another event.
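Incidentally, the dispatch logic of Steps S1 to S3 and S8 can be summarized by the following Python sketch; the argument names and callbacks are hypothetical stand-ins for the units described above, not part of the disclosed implementation.

```python
def image_processing_loop(comm, events, server, on_image_data, on_drawing_event):
    """Receive-and-dispatch loop of the image processing apparatus 503.

    comm, events, and server stand in for the communication control
    unit 535, the coordinate detecting unit 538, and the image
    processing server 502; the two callbacks stand in for the
    image-data and drawing-data receiving processes described below.
    """
    while True:
        image_data = comm.poll_image_data()     # Step S1
        if image_data is not None:
            on_image_data(image_data)           # Steps S11 to S13, S6, S7
            continue
        event = events.poll()                   # Step S2
        if event is None:
            continue                            # keep waiting
        if event.is_drawing_instruction:        # Step S3
            on_drawing_event(event)             # Steps S4 to S7
        else:
            server.notify(event)                # Step S8: selection or erase event
```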
Drawing-Data Receiving Process
Subsequently, a drawing-data receiving process (Steps S4 to S7) performed when the event processing unit 5331 has received a drawing instruction event is explained. Through this drawing-data receiving process, a drawn image based on drawing data specified by the received drawing instruction event is displayed on the display unit 536.
In the process at Step S4, the event processing unit 5331 accepts drawing data specified by a sub-event, such as TOUCH, MOVE, or RELEASE, notified together with the drawing instruction event, and stores the drawing data in the RAM 533 in a manner associated with identification information of the drawing data. Furthermore, the event processing unit 5331 transmits the drawing instruction event together with identification information of the image processing apparatus 503, the drawing data, and the identification information of the drawing data to the image processing server 502 through the communication control unit 535. Incidentally, identification information of drawing data is issued for each drawing instruction event, and, for example, a value according to the time at which the drawing instruction event has been received is assigned. In this way, the process at Step S4 is completed, and the image processing proceeds to a process at Step S5.
In the process at Step S5, the drawing-limits determining unit 5332 updates the value of a drawing end register with the identification information of the drawing data issued at Step S4. Here, out of the pieces of drawing data that have been accepted and stored in the RAM 533 through drawing instruction events received from moment to moment, the limits of the drawing data displayed on the display unit 536 are specified by a drawing start register and the drawing end register. The drawing-limits determining unit 5332 sets the value of the drawing end register to the identification information of the latest drawing data, so that the latest drawing data can be displayed on the display unit 536. Incidentally, the drawing-limits determining unit 5332 sets the identification information of the drawing data corresponding to the first drawing instruction event as the initial value of the drawing start register. In this way, the process at Step S5 is completed, and the image processing proceeds to a process at Step S6.
In the process at Step S6, the drawing-data generating unit 5333 generates a drawing layer of a display image based on the drawing data on the RAM 533 within the range from the drawing start register to the drawing end register. In this way, the process at Step S6 is completed, and the image processing proceeds to a process at Step S7.
In the process at Step S7, the synthesizing unit 5336 synthesizes the drawing layer and an image layer generated from image data to be described later, and the display control unit 5337 displays the synthesized display image on the display unit 536. If an image layer has not been generated, a display image of only the drawing layer is output to the display unit 536. In this way, the process at Step S7 is completed, and the image processing returns to Step S1 to wait to receive image data or another event.
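Incidentally, the register management of Steps S4 to S6 can be illustrated by the following Python sketch; the class and method names are illustrative assumptions, and the time-based identification information follows the example given at Step S4.

```python
import time

class DrawingBuffer:
    """Drawing data held on the RAM 533, bounded by the drawing start
    register and the drawing end register (a minimal sketch)."""

    def __init__(self):
        self.data = {}       # identification information -> drawing data
        self.start = None    # drawing start register
        self.end = None      # drawing end register

    def accept(self, drawing_data):
        """Steps S4 and S5: store the data and advance the end register."""
        data_id = time.monotonic_ns()   # value according to the reception time
        self.data[data_id] = drawing_data
        if self.start is None:          # first drawing instruction event
            self.start = data_id
        self.end = data_id              # end register points at the latest data
        return data_id

    def displayed_data(self):
        """Step S6: the drawing data rendered into the drawing layer."""
        if self.start is None:
            return []
        return [d for i, d in sorted(self.data.items())
                if self.start <= i <= self.end]
```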
Image-Data Receiving Process
Subsequently, an image-data receiving process (Steps S11 to S13 and S6 to S7) performed when the communication control unit 535 has received image data from the image processing server 502 is explained. Through this image-data receiving process, a display image formed by synthesizing the received image data and the latest drawing data is displayed on the display unit 536.
In the process at Step S11, the drawing-limits determining unit 5332 refers to the identification information of an image processing apparatus 503 and the identification information of drawing data which are included in the image data received from the image processing server 502. When the identification information of the image processing apparatus 503 itself is included, the drawing-limits determining unit 5332 compares the identification information of the drawing data with the value of the drawing start register, and determines whether the identification information of the drawing data was issued later than the value of the drawing start register. When the identification information of the drawing data was issued later than the value of the drawing start register (YES at Step S11), the image processing proceeds to a process at Step S12. On the other hand, when the identification information of the drawing data was issued before the value of the drawing start register (NO at Step S11), the image processing proceeds to a process at Step S13. Incidentally, if the received image data does not include identification information of drawing data, or if no value has been set in the drawing start register, the image processing proceeds to the process at Step S13.
In the process at Step S12, the drawing-limits determining unit 5332 updates the value of the drawing start register with the identification information of the drawing data included in the image data received from the image processing server 502. In addition, the drawing-limits determining unit 5332 deletes, from the RAM 533, drawing data older than the drawing data corresponding to the updated drawing start register.
Incidentally, in the process at Step S11, if the identification information of the drawing data included in the image data received from the image processing server 502 is newer than the value of the drawing start register, that means part or all of the drawing data input to the image processing apparatus 503 is already included in the image data. Therefore, the value of the drawing start register is updated so that, out of the drawing data input to the image processing apparatus 503, only drawing data newer than the drawing data included in the image data is output to the display unit 536. At this time, as a safety margin, the value of the drawing start register may be updated with identification information of drawing data older than the identification information of the drawing data included in the received image data. In this way, the process at Step S12 is completed, and the image processing proceeds to the process at Step S13.
On the other hand, in the process at Step S11, if the identification information of the drawing data included in the image data received from the image processing server 502 is the one issued before the value of the drawing start register, that means the drawing data input to the image processing apparatus 503 is not included in the received image data. Therefore, the process at Step S12 is skipped so that already-input drawing data is output to the display unit 536 together with the image data received from the image processing server 502.
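Incidentally, the determination of Steps S11 and S12 can be sketched as follows in Python, reusing the hypothetical DrawingBuffer above; the field names of the received image data are illustrative assumptions.

```python
def update_start_register(buffer, image_data, own_apparatus_id):
    """Steps S11 and S12: advance the drawing start register when the
    received image data already reflects some locally input drawing data."""
    if own_apparatus_id not in image_data.get("apparatus_ids", []):
        return                              # proceed directly to Step S13
    data_id = image_data.get("drawing_data_id")
    if data_id is None or buffer.start is None:
        return                              # proceed directly to Step S13
    if data_id > buffer.start:              # issued later (YES at Step S11)
        buffer.start = data_id              # Step S12: update the register
        for old_id in [i for i in buffer.data if i < data_id]:
            del buffer.data[old_id]         # delete older drawing data
    # otherwise Step S12 is skipped, and already-input drawing data
    # remains displayed together with the received image data
```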
In the process at Step S13, the app-image generating unit 5335 generates an image layer of a display image from the image data received from the image processing server 502. For example, if the image data has been compressed, the app-image generating unit 5335 decompresses the image data to an image layer. In this way, the process at Step S13 is completed, and the image processing proceeds to the process at Step S6.
In the process at Step S6, as described above, the drawing-data generating unit 5333 generates a drawing layer of a display image from the drawing data on the RAM 533 within the range from the drawing start register to the drawing end register. In this way, the process at Step S6 is completed, and the image processing proceeds to the process at Step S7. Incidentally, if no value has been set in the drawing start register, the process at Step S6 is skipped.
In the process at Step S7, as described above, a display image formed by synthesizing the image layer and a drawing layer generated from the drawing data with the synthesizing unit 5336 is output to the display unit 536 through control by the display control unit 5337. If a drawing layer has not been generated, a display image of only the image layer is output to the display unit 536. In this way, the process at Step S7 is completed, and the image processing returns to Step S1 to wait to receive the latest image data or event.
As explained above, according to the image processing system, image processing method, and image processing program in the present embodiment, the image processing apparatus 503 displays thereon only the minimum drawing data until image processing by the image processing server 502 has been completed. Therefore, the image processing apparatus 503 is not required to have a high software processing capacity, and can display drawing data without delay. Furthermore, when image processing by the image processing server 502 has been completed, drawing data input before then is deleted from the RAM 533 (the memory) of the image processing apparatus 503; therefore, it is possible to reduce the memory capacity required of the image processing apparatus 503. Consequently, it is possible to reduce the software processing capacity and memory capacity required of the image processing apparatus 503, thereby achieving an image processing apparatus 503 capable of displaying a drawn image handwritten by a user without delay and at low cost. Furthermore, according to the image processing system 501 including two or more image processing apparatuses 503, the image processing apparatuses 503 are placed at the respective bases of a remote meeting; therefore, it is possible to easily achieve, at low cost, a remote meeting in which a drawn image handwritten by a user can be displayed without delay.
Subsequently, a distribution management apparatus according to a second embodiment of the present invention is explained in detail below with reference to the accompanying drawings. Incidentally, the present invention is not limited to this embodiment. Furthermore, identical components are denoted by the same reference numerals in the drawings.
A distribution system according to the present embodiment is explained in detail below with reference to the drawings. In the embodiment described below, the present invention is applied to a distribution system that uses cloud computing to convert Web content into video data, sound data, or video and sound data, and distributes the converted data to communication terminals such as a PC and an electronic blackboard. Incidentally, hereinafter, when at least one of video and sound is described, it is referred to as "video (sound)".
First, an outline of the present embodiment is explained with
Outline of System Configuration
First, an outline of a configuration of the distribution system 1 is explained.
As shown in
The communication terminals 5 are terminals used by users who receive the service of the distribution system 1. Out of the communication terminals 5, the communication terminals 5a1 and 5a2 are notebook PCs. The communication terminals 5b1 and 5b2 are mobile terminals, such as a smartphone and a tablet terminal. The communication terminal 5c is a multifunction peripheral (MFP) having multiple functions of copying, scanning, printing, and faxing. The communication terminal 5d is a projector. The communication terminal 5e is a video-conference terminal equipped with a camera, a microphone, and a speaker. The communication terminals 5f1 and 5f2 are electronic blackboards (whiteboards) capable of electronically converting content drawn by a user.
Incidentally, the communication terminals 5 are not limited to those shown in
The distribution management apparatus 2, the communication terminals 5, the terminal management apparatus 7, and the Web server 8 can communicate with one another over a communication network 9 such as the Internet and a local area network (LAN). The communication network 9 includes wireless communication networks, such as 3G (3rd Generation), WiMAX (Worldwide Interoperability for Microwave Access), and LTE (Long Term Evolution).
Incidentally, some of the communication terminals 5, such as the communication terminal 5d, have no function of communicating with other terminals and systems over the communication network 9. However, as shown in
The distribution management apparatus 2 has a so-called cloud browser (hereinafter referred to as the "browser 20") as a Web browser existing on a cloud. The distribution management apparatus 2 renders Web content on the cloud by using the browser 20, and distributes the obtained H.264 or MPEG-4 video (sound) data to a communication terminal 5.
The terminal management apparatus 7 has a function as a management server, and performs, for example, login authentication of the communication terminals 5 and management of contract information of the communication terminals 5. Furthermore, the terminal management apparatus 7 has a function of an SMTP (Simple Mail Transfer Protocol) server for sending e-mail. The terminal management apparatus 7 can be realized, for example, as a virtual machine deployed on IaaS (Infrastructure as a Service), which is a cloud service. The terminal management apparatus 7 is preferably multiplexed so as to provide continuous service while coping with contingencies.
Incidentally, the browser 20 of the distribution management apparatus 2 enables real-time communication/collaboration (RTC). Furthermore, an encoder bridge unit 30 (an encoding unit 19 shown in
Outlines of Various Distribution Methods
Subsequently, outlines of various distribution methods are explained.
Basic Distribution
Furthermore, as shown in
Composite Distribution
Furthermore, in the first base, a capture G1 of a screen displayed on a communication terminal 5a1 is used, so the communication terminals 5a1 and 5f1 are connected by wire or wirelessly. When the connection is wired, the screen capture G1 is transmitted to a capture device of the communication terminal 5f1 via an image transmission cable (VGA, HDMI®, DisplayPort, DVI-I/D, or the like), and the capture device transmits the screen capture G1 to an encoding unit 60 through an internal I/F (PCI-E, USB, or the like).
When the connection is wireless, the screen capture G1 is transmitted to an input device of the communication terminal 5f1 by using a wireless display transmission technique, and the input device transmits the screen capture G1 to the encoding unit 60 through the internal I/F. Examples of the wireless display transmission technique include Wi-Fi® Alliance Miracast and Intel® Wireless Display.
Incidentally, the communication terminal 5f1 can receive screen captures G1 from multiple communication terminals 5a. In this case, the communication terminal 5f1 displays multiple thumbnail images of the screen captures G1 on the screen of the communication terminal 5f1 so that a capture G1 of a screen of a communication terminal 5a corresponding to a thumbnail image selected by a user can be used.
In the second base, content A of a communication terminal 5a2 for which login has been authenticated by the terminal management apparatus 7 is used. The communication terminal 5a2 uploads the content A onto the Web server 8 via the communication network 9. The Web server 8 stores therein the content A of the communication terminal 5a2 as Web content data.
In the first base, video (sound) data [E1] acquired by the communication terminal 5e1 is encoded by the encoding unit 60, and then is transmitted to the distribution management apparatus 2. After that, the video (sound) data [E1] is decoded by a decoding unit 40 of the distribution management apparatus 2, and is input to the browser 20. Furthermore, operation data [p1] indicating a stroke drawn on the communication terminal 5f1 with the electronic pen P1 or the like is transmitted to the distribution management apparatus 2, and is input to the browser 20. Moreover, the screen capture [G1] of the communication terminal 5a1 is encoded by the encoding unit 60, and then is transmitted to the distribution management apparatus 2. After that, the screen capture [G1] is decoded by the decoding unit 40 of the distribution management apparatus 2, and is input to the browser 20. On the other hand, in the second base, video (sound) data [E2] acquired by the communication terminal 5e2 is encoded by the encoding unit 60, and then is transmitted to the distribution management apparatus 2. After that, the video (sound) data [E2] is decoded by the decoding unit 40 of the distribution management apparatus 2, and is input to the browser 20. Furthermore, operation data [p2] indicating a stroke drawn on the communication terminal 5f2 with the electronic pen P2 or the like is transmitted to the distribution management apparatus 2, and is input to the browser 20.
Meanwhile, the browser 20 acquires, for example, Web content data [A] of a background image displayed on respective displays of the communication terminals 5f1 and 5f2 from the Web server 8. Then, the browser 20 combines the Web content data [A], the screen capture data [G1], the operation data [p1] and [p2], and the video (sound) data [E1] and [E2] and performs rendering, thereby generating video (sound) data in which the above data are arranged in a desired layout. Then, the encoder bridge unit 30 encodes the video (sound) data, and the distribution management apparatus 2 distributes the same video (sound) data to the bases. Accordingly, in the first base, video ([A], [G1], [p1], [p2], [E1 (video part)], and [E2 (video part)]) is displayed on the display of the communication terminal 5f1, and sound [E2 (sound part)] is output from the speaker of the communication terminal 5e1. On the other hand, in the second base, the video ([A], [G1], [p1], [p2], [E1 (video part)], and [E2 (video part)]) is displayed on the display of the communication terminal 5f2, and sound [E1 (sound part)] is output from the speaker of the communication terminal 5e2. Incidentally, in the first base, the sound [E1 (sound part)] in the first base is not output by an echo cancellation function of the communication terminal 5f1. On the other hand, in the second base, the sound [E2 (sound part)] in the second base is not output by an echo cancellation function of the communication terminal 5f2.
In this way, it is possible to perform the remote sharing process for sharing the same information between remote locations of the first and second bases in real time; therefore, the distribution system 1 according to the present embodiment is useful in a remote meeting and the like.
Subsequently, the embodiment is explained in detail with
First, a hardware configuration of the present embodiment is explained with
As shown in
Incidentally, respective programs for each communication terminal, each system, and each server can be distributed in such a manner that each program is recorded on a computer-readable recording medium, such as the recording medium 206, in an installable or executable file format.
Subsequently, a functional configuration of the present embodiment is explained with
Functional Configuration of Distribution Management Apparatus
The distribution management apparatus 2 realizes the functional configuration shown in
Out of the above functional components, the browser 20 is a Web browser that operates in the distribution management apparatus 2. The browser 20 renders content data such as Web content data, thereby generating video (sound) data as RGB data (or pulse-code modulation (PCM) data). The browser 20 is constantly updated to the latest version so as to keep up with the tendency for Web content to become richer.
Furthermore, in the distribution system 1 according to the present embodiment, a plurality of browsers 20 is prepared in the distribution management apparatus 2, and a cloud browser used in a user session is selected from among these browsers 20. Incidentally, for the sake of simplicity, the case where a single browser 20 is prepared in the distribution management apparatus 2 is explained below.
The browser 20 has, for example, Media Player, Flash Player, JavaScript®, CSS (Cascading Style Sheets), and an HTML (HyperText Markup Language) renderer. Incidentally, the JavaScript® includes a standard one and one unique to the distribution system 1. The Media Player here is a browser plug-in for reproducing a multimedia file, such as a video (sound) file, in the browser 20. The Flash Player is a browser plug-in for reproducing Flash content in the browser 20. The unique JavaScript® is a JavaScript® group that provides an application programming interface (API) for services specific to the distribution system 1. The CSS is a technique for efficiently defining the appearance and style of a Web page written in HTML. The HTML renderer is a WebKit-based HTML rendering engine. Furthermore, the browser 20 receives operation data [p] from the browser managing unit 22, and generates drawing information or electronic pen information (drawing setting information) from the operation data [p]. The browser 20 stores the generated drawing information or electronic pen information in the storage unit 2000. The drawing information and the electronic pen information are described later.
The transmitting/receiving unit 21 transmits/receives various data, requests, and/or the like to/from the terminal management apparatus 7 and the Web server 8. For example, the transmitting/receiving unit 21 acquires Web content data from a content site of the Web server 8. Furthermore, the transmitting/receiving unit 21 transmits/receives recognition information and electronic blackboard information (drawing information and electronic pen information) to/from the terminal management apparatus 7.
The browser managing unit 22 manages the browser 20 and the encoder bridge unit 30. For example, the browser managing unit 22 instructs the browser 20 and the encoder bridge unit 30 to start or end, and assigns an encoder ID at the start or end. The encoder ID here is identification information assigned in order for the browser managing unit 22 to manage the process of the encoder bridge unit 30. Furthermore, each time the browser 20 is started, the browser managing unit 22 assigns and manages a browser ID. The browser ID here is identification information assigned by the browser managing unit 22 to manage the process of the browser 20 and to identify the browser 20.
Furthermore, the browser managing unit 22 acquires operation data [p] from a communication terminal 5 through the transmitting/receiving unit 21, and outputs the acquired operation data [p] to the browser 20. Incidentally, the operation data [p] is data generated by an operation event (an operation with the keyboard 211 or the mouse 212, a stroke of the electronic pen P1, or the like) in the communication terminal 5. When the communication terminal 5 is provided with sensors such as a temperature sensor, a humidity sensor, and an acceleration sensor, the browser managing unit 22 acquires sensor information, which corresponds to output signals of the sensors, from the communication terminal 5, and outputs the acquired sensor information to the browser 20.
The transmission FIFO 24 is a buffer that stores therein video (sound) data [AEp] generated by the browser 20.
The time managing unit 25 manages the time T unique to the distribution management apparatus 2. The time acquiring unit 26 performs a time adjusting process in cooperation with a time control unit 56 of a communication terminal 5. Specifically, the time acquiring unit 26 acquires time information (T) indicating the time T in the distribution management apparatus 2 from the time managing unit 25, receives time information (t) indicating the time t in the communication terminal 5 from the time control unit 56, and transmits the time information (t) and the time information (T) back to the time control unit 56.
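Incidentally, the following Python sketch illustrates one way the time control unit 56 could estimate the offset between the time t and the time T from this exchange; the halving of the round-trip time is an assumption made for illustration and is not specified in the present embodiment.

```python
def estimate_clock_offset(local_clock, exchange):
    """Sketch of the time adjusting process on the communication
    terminal 5 side.  local_clock returns the terminal time t;
    exchange is a hypothetical callable that sends t to the time
    acquiring unit 26 and returns the pair (t, T) transmitted back."""
    t_sent = local_clock()                 # time t sent with the request
    t_echoed, big_t = exchange(t_sent)     # (t, T) from the apparatus 2
    t_received = local_clock()             # terminal time on reception
    round_trip = t_received - t_sent
    # assume T was read halfway through the round trip
    offset = big_t - (t_echoed + round_trip / 2)
    return offset                          # added to t to approximate T
```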
The line adaptive control unit 27 calculates a reproduction delay time U on the basis of transmission delay time information (D), and calculates operating conditions, such as a frame rate and data resolution, for the converting unit 10 of the encoder bridge unit 30. This reproduction delay time U is a time by which reproduction is delayed in order to buffer data before the reproduction.
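Incidentally, a possible shape of this calculation is sketched below in Python; the percentile, margin, and threshold values are illustrative assumptions only, and the input is assumed to be a non-empty list of observed delays.

```python
def line_adaptive_control(delay_samples_ms):
    """Sketch of the line adaptive control unit 27: derive a
    reproduction delay time U and operating conditions for the
    converting unit 10 from observed transmission delay times."""
    samples = sorted(delay_samples_ms)
    p95 = samples[int(0.95 * (len(samples) - 1))]  # 95th-percentile delay
    reproduction_delay_u = p95 + 50                # buffering margin (ms)
    if p95 < 100:                                  # stable line
        conditions = {"frame_rate": 30, "resolution": (1280, 720)}
    elif p95 < 300:                                # moderate delay
        conditions = {"frame_rate": 15, "resolution": (960, 540)}
    else:                                          # congested line
        conditions = {"frame_rate": 5, "resolution": (640, 360)}
    return reproduction_delay_u, conditions
```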
The encoder bridge unit 30 outputs video (sound) data [AEp] that has been generated by the browser 20 and stored in the transmission FIFO 24 to the converting unit 10 of the encoder bridge unit 30. The encoder bridge unit 30 is explained in detail below with
As shown in
As shown in
The trimming unit 11 performs a process of cutting out only a part of the video (image). The resizing unit 12 rescales the video (image).
The encoding unit 19 encodes video (sound) data generated by the browser 20, thereby converting the video (sound) data into data that can be distributed to a communication terminal 5 via the communication network 9. Furthermore, if there is no motion in video (if there is no change between frames), the encoding unit 19 inserts skip frames until there is a motion in the video to save the bandwidth. Incidentally, in the case of sound, the encoding unit 19 performs only the encoding.
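Incidentally, the skip-frame behavior can be illustrated by the following Python sketch; the frame comparison and the output format are simplified assumptions.

```python
def encode_with_skip_frames(frames, encode_frame):
    """Sketch of the encoding unit 19: when a frame shows no change
    from the previous one, emit a lightweight skip frame instead of
    re-encoding it, saving bandwidth.  encode_frame stands in for
    the real video encoder."""
    previous = None
    for frame in frames:
        if previous is not None and frame == previous:
            yield {"type": "skip"}                       # no motion
        else:
            yield {"type": "coded", "data": encode_frame(frame)}
        previous = frame
```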
The generating/selecting unit 310 newly creates a converting unit 10, and selects video (sound) data to be input to an already-created converting unit 10. Cases where the generating/selecting unit 310 newly creates a converting unit 10 include, for example, when it is necessary to create a converting unit 10 capable of conversion according to reproduction capability of a communication terminal 5 to reproduce video (sound) data. Furthermore, when the generating/selecting unit 310 selects video (sound) data to be input to a converting unit 10, the generating/selecting unit 310 selects an already-created converting unit 10. For example, in starting data distribution to the communication terminal 5b in addition to data distribution to the communication terminal 5a, the same video (sound) data as that distributed to the communication terminal 5a may be distributed to the communication terminal 5b. In such a case, furthermore, the communication terminal 5b may have the same video (sound) data reproduction capability as the communication terminal 5a. That is, in such a case, the generating/selecting unit 310 uses an already-created converting unit 10a for the communication terminal 5a without creating a new converting unit 10b for the communication terminal 5b.
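Incidentally, this create-or-reuse behavior can be sketched as follows in Python; keying converting units by reproduction capability is an illustrative assumption.

```python
class GeneratingSelectingUnit:
    """Sketch of the generating/selecting unit 310: reuse an
    already-created converting unit 10 when one matching the
    terminal's reproduction capability exists, otherwise create one."""

    def __init__(self, make_converting_unit):
        self.converting_units = {}   # capability -> converting unit 10
        self.make_converting_unit = make_converting_unit

    def get(self, capability):
        # capability is, e.g., (codec, width, height, frame_rate)
        if capability not in self.converting_units:
            self.converting_units[capability] = self.make_converting_unit(capability)
        return self.converting_units[capability]
```

In this sketch, when the communication terminal 5b has the same reproduction capability as the communication terminal 5a, the same converting unit 10a is returned for both terminals, as described above.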
The selecting unit 320 selects a desired one from among already-created converting units 10. Through the selection by the generating/selecting unit 310 and the selecting unit 320, various patterns of distribution as shown in
Returning to
The transmission response control is a process of managing an HTTPS session for download requested by a communication terminal 5 in order to transmit data from the distribution management apparatus 2 to the communication terminal 5. The response to this HTTPS session for download is not terminated immediately, but is held for a given length of time (one to a few minutes). The transmitting/receiving unit 31 dynamically writes the data to be transmitted to the communication terminal 5 into the body part of the response. Furthermore, to eliminate the cost of reconnection, the transmitting/receiving unit 31 is configured to receive the next request from the communication terminal 5 before the previous session ends. Because the next request is kept waiting until the previous request completes, reconnection overhead can be eliminated even when a new session is established.
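Incidentally, the holding of a download response can be sketched as follows in Python; the queue-based interface and the hold time are illustrative assumptions.

```python
import queue
import time

def serve_download_session(out_queue, write_chunk, hold_seconds=60):
    """Sketch of the transmission response control: keep one download
    response open and dynamically write data into its body as the
    data becomes available.  write_chunk stands in for writing to
    the body part of the held HTTPS response."""
    deadline = time.monotonic() + hold_seconds
    while time.monotonic() < deadline:
        try:
            data = out_queue.get(timeout=1.0)  # real-time data to distribute
        except queue.Empty:
            continue                           # keep the response open
        write_chunk(data)                      # append to the response body
    # the communication terminal 5 issues its next request before this
    # session ends, so no reconnection gap is visible to the user
```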
The real-time data creation is a process of adding the original header to data (RTP data) of a compressed video (and a compressed sound) generated by the encoding unit 19 shown in
The command transmission is a process of generating command data to be transmitted to a communication terminal 5 and writing the command data into the body part of a downlink HTTPS response for distribution to the communication terminal 5.
The receiving response control is a process of managing an HTTPS session for transmission (uplink) requested by a communication terminal 5 in order for the distribution management apparatus 2 to receive data from the communication terminal 5. A response to this HTTPS session is not terminated immediately, and is held for a given length of time (one to a few minutes). The communication terminal 5 dynamically writes data to be transmitted to the transmitting/receiving unit 31 of the distribution management apparatus 2 in the body part of the request.
The received-data analysis is a process of analyzing data transmitted from a communication terminal 5 with respect to each type of the data and passing the data to a required process.
The gesture conversion is a process of converting a gesture event input on a communication terminal 5f as an electronic blackboard by a user with an electronic pen P or by hand into a form that the browser 20 can receive.
The receiving FIFO 34 is a buffer that stores therein video (sound) data decoded by the decoding unit 40.
The recognizing unit 35 performs processing on video (sound) data [E] received from a communication terminal 5. Specifically, for example, for signage, the recognizing unit 35 recognizes the face, age, and sex of a person or an animal from video taken by a camera 62. Furthermore, for an office, the recognizing unit 35 performs name tagging through facial recognition from video taken by the camera 62, replacement of a background image, and/or the like. The recognizing unit 35 stores recognition information on recognized content in the storage unit 2000. This recognizing unit 35 performs processing with a recognition expansion board to achieve high-speed processing.
The delay-information acquiring unit 37a is used in the downlink line adaptive control process, in correspondence with the delay-information acquiring unit 57 used in the uplink line adaptive control process. Specifically, the delay-information acquiring unit 37a acquires transmission delay time information (d1) indicating a transmission delay time d1 from the decoding unit 40, and holds the acquired transmission delay time information (d1) for a given length of time. When the delay-information acquiring unit 37a has acquired multiple pieces of transmission delay time information (d1), it outputs, to the line adaptive control unit 37b, transmission delay time information (d) indicating frequency distribution information based on the multiple transmission delay times d1.
The line adaptive control unit 37b is used in a downlink line adaptive control process in correspondence to the above-described line adaptive control unit 27 used in an uplink line adaptive control process. Specifically, the line adaptive control unit 37b calculates operating conditions of the encoding unit 60 on the basis of the transmission delay time information (d). Furthermore, the line adaptive control unit 37b transmits a line adaptive control signal indicating the operating conditions, such as a frame rate and data resolution, to the encoding unit 60 of a communication terminal 5 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51.
The decoding unit 40 decodes video (sound) data [E] transmitted from a communication terminal 5.
Functional Configuration of Communication Terminal
Subsequently, a functional configuration of the communication terminal 5 is explained with
The communication terminal 5 realizes the functional configuration shown in
The decoding unit 50 decodes video (sound) data [AEp] that has been distributed from the distribution management apparatus 2 and output from the reproduction control unit 53.
The transmitting/receiving unit 51 transmits/receives various data, requests, and/or the like to/from the transmitting/receiving unit 31 of the distribution management apparatus 2 and a transmitting/receiving unit 71a of the terminal management apparatus 7. For example, in a login process of the communication terminal 5, the transmitting/receiving unit 51 transmits a request for login to the transmitting/receiving unit 71a of the terminal management apparatus 7 on the basis of start-up of the communication terminal 5 through the operation unit 52.
The operation unit 52 receives user operation input. For example, the operation unit 52 receives input or selection made through a power switch, a keyboard, a mouse, an electronic pen P, or the like, and transmits the received input or selection as operation data [p] to the browser managing unit 22 of the distribution management apparatus 2.
The reproduction control unit 53 buffers video (sound) data [AEp] (a packet of real-time data) received from the transmitting/receiving unit 51, and outputs the video (sound) data [AEp] to the decoding unit 50 in consideration of a reproduction delay time U.
The rendering unit 55 renders data decoded by the decoding unit 50.
The time control unit 56 performs a time adjusting process in cooperation with the time acquiring unit 26 of the distribution management apparatus 2. Specifically, the time control unit 56 acquires the time information (t) indicating the time t in the communication terminal 5 from the storage unit 5000. Furthermore, the time control unit 56 requests the time acquiring unit 26 of the distribution management apparatus 2 to transmit time information (T) indicating the time T in the distribution management apparatus 2 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31. In this case, the time information (t) is transmitted together with the request for time information (T).
The delay-information acquiring unit 57 acquires transmission delay time information (D1) indicating a transmission delay time D1 from the reproduction control unit 53, and holds the acquired transmission delay time information (D1) for a given length of time. When the delay-information acquiring unit 57 has acquired multiple pieces of transmission delay time information (D1), it transmits, to the line adaptive control unit 27 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31, transmission delay time information (D) indicating frequency distribution information based on the multiple transmission delay times D1. Incidentally, the transmission delay time information (D) is transmitted, for example, once every 100 frames.
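Incidentally, the aggregation into frequency distribution information can be sketched as follows in Python; the 10 ms bin width is an illustrative assumption, while the 100-sample batch follows the example above.

```python
from collections import Counter

class DelayInfoAcquirer:
    """Sketch of the delay-information acquiring unit 57: hold the
    per-packet transmission delay times D1 and, once 100 samples
    have been collected, emit frequency distribution information (D)."""

    def __init__(self, send, batch=100, bin_ms=10):
        self.send = send      # forwards (D) toward the line adaptive control unit 27
        self.batch = batch
        self.bin_ms = bin_ms
        self.samples = []

    def on_delay(self, d1_ms):
        self.samples.append(d1_ms)
        if len(self.samples) >= self.batch:
            histogram = Counter((d // self.bin_ms) * self.bin_ms
                                for d in self.samples)
            self.send(dict(histogram))   # transmission delay time information (D)
            self.samples.clear()
```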
The display unit 58 reproduces data rendered by the rendering unit 55.
The encoding unit 60 encodes and transmits video (sound) data [E] acquired from the internal microphone 213 (see
Incidentally, the internal microphone 213 and the external camera 62 and microphone 63 are examples of an input means, and are devices that require encoding or decoding. The input means can output touch data and smell data besides video (sound) data. The input means include sensors such as a temperature sensor, a direction sensor, and an acceleration sensor.
Functional Configuration of Terminal Management Apparatus
Subsequently, a functional configuration of the terminal management apparatus 7 is explained with
The terminal management apparatus 7 realizes the functional configuration shown in
The distribution-destination selection menu data 7040 is data of a distribution-destination selection menu screen as shown in
The sharing ID is an ID used in the remote sharing process in which each user distributes, to other communication terminals 5, the same video (sound) data as that distributed to the user's own communication terminal 5, and is identification information for identifying those other communication terminals or communication terminal groups. In the example shown in
The installation position information indicates the installation position, for example, when the multiple communication terminals 5f1, 5f2, and 5f3 are placed side by side as shown in
That is, the display screen shown in
Returning to
The transmitting/receiving unit 71a transmits/receives various data, requests, and/or the like to/from the communication terminal 5. For example, the transmitting/receiving unit 71a receives a login request including a terminal ID and a terminal certificate from the transmitting/receiving unit 51 of the communication terminal 5, and transmits a result of authentication of the login request to the transmitting/receiving unit 51.
The transmitting/receiving unit 71b transmits/receives various data, requests, and/or the like to/from the distribution management apparatus 2. For example, the transmitting/receiving unit 71b receives a request for distribution-destination selection menu data from the transmitting/receiving unit 21 of the distribution management apparatus 2, and transmits the distribution-destination selection menu data to the transmitting/receiving unit 21. Furthermore, the transmitting/receiving unit 71b receives data of electronic blackboard information 7030 from the transmitting/receiving unit 21 of the distribution management apparatus 2, and transmits data of electronic blackboard information 7030 to the transmitting/receiving unit 21.
The authenticating unit 75 searches the terminal management table 7010 on the basis of the terminal ID and user certificate received from the transmitting/receiving unit 51 of the communication terminal 5, and determines whether there is the same combination of the terminal ID and the user certificate in the terminal management table 7010, thereby authenticating the communication terminal 5a.
Subsequently, the operation or processing of the present embodiment is explained with
Basic Distribution Processing
First, specific distribution processing by the distribution management apparatus 2 using the basic distribution method is explained with
As shown in
Next, the authenticating unit 75 of the terminal management apparatus 7 searches the terminal management table 7010 on the basis of the terminal ID and user certificate received from the communication terminal 5a, and determines whether there is the same combination of the terminal ID and the user certificate in the terminal management table 7010, thereby authenticating the communication terminal 5a (Step S22). Here, there is described the case where there is the same combination of the terminal ID and the user certificate in the terminal management table 7010, i.e., the communication terminal 5a is authenticated to be a valid terminal in the distribution system 1.
Then, the authenticating unit 75 of the terminal management apparatus 7 transmits an IP address of the distribution management apparatus 2 to the transmitting/receiving unit 51 of the communication terminal 5a through the transmitting/receiving unit 71a (Step S23). Incidentally, the IP address of the distribution management apparatus 2 has been acquired and stored in the storage unit 7000 by the terminal management apparatus 7 in advance.
Next, the transmitting/receiving unit 71b of the terminal management apparatus 7 transmits a request to start the browser 20 to the browser managing unit 22 through the transmitting/receiving unit 21 of the distribution management apparatus 2 (Step S24). In response to this start request, the browser managing unit 22 of the distribution management apparatus 2 starts the browser 20 (Step S25). Next, the generating/selecting unit 310 of the encoder bridge unit 30 creates a converting unit 10 according to reproduction capability of the communication terminal 5a (resolution of the display or the like) and a type of content (Step S26).
Next, the browser 20 requests content data [A] from the Web server 8 (Step S27). In response to this, the Web server 8 reads out the requested content data [A] from its own storage unit (not shown) (Step S28). Then, the Web server 8 transmits the content data [A] to the requestor browser 20 through the transmitting/receiving unit 21 of the distribution management apparatus 2 (Step S29).
Next, the browser 20 renders the content data [A] thereby generating video (sound) data [A], and outputs the video (sound) data [A] to the transmission FIFO 24 (Step S30). Then, the converting unit 10 encodes the video (sound) data [A] stored in the transmission FIFO 24 thereby converting the video (sound) data [A] into video (sound) data [A] to be distributed to the communication terminal 5a (Step S31).
Then, the encoder bridge unit 30 transmits the video (sound) data [A] to the reproduction control unit 53 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S32). In the communication terminal 5a, the video (sound) data [A] is output from the reproduction control unit 53 to the decoding unit 50, and the sound is reproduced from a speaker 61, and the video is reproduced on the display unit 58 through the rendering unit 55 (Step S33).
Communication Processing using Multiple Communication Terminals
Subsequently, a remote sharing process using the distribution management apparatus 2 is explained with
As shown in
On the other hand, when the encoding unit 60 of the communication terminal 5f1 has received input of content data [E] from the camera 62 and the microphone 63 (Step S43), the encoding unit 60 encodes the content data [E] and then transmits the content data [E] to the decoding unit 40 of the distribution management apparatus 2 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31 (Step S44). The content data [E] is decoded by the decoding unit 40 and then input to the browser 20 through the receiving FIFO 34. Then, the browser 20 renders the content data [E], thereby generating video (sound) data [E], and outputs the video (sound) data [E] to the transmission FIFO 24 (Step S45). In this case, the browser 20 combines the content data [E] with the already-acquired content data [A] and then outputs the combined data.
Furthermore, when the operation unit 52 of the communication terminal 5f1 has received input of a stroke operation of the electronic pen P1 (Step S46), the operation unit 52 transmits operation data [p] to the browser managing unit 22 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31 (Step S47-1). The operation data [p] is input from the browser managing unit 22 of the distribution management apparatus 2 to the browser 20. The browser 20 analyzes the operation data [p] (Step S47-2).
Here, the drawing process performed by the browser 20 at Step S47-2 is explained with reference to the corresponding flowchart. First, the browser 20 determines whether the operation data [p] is data related to a drawing process (Step S251).
When the operation data [p] is data related to a drawing process (YES at Step S251), the process proceeds to Step S252. The browser 20 determines whether information indicating the operation mode included in the operation data [p] indicates the drawing mode (Step S252). For example, when the electronic pen has an operation-mode selector switch, the information indicating the operation mode is a selection signal of the selector switch. Alternatively, the browser 20 can identify the information indicating the operation mode from the setting of the drawing menu.
When the operation mode is the drawing mode (YES at Step S252), the browser 20 searches the device IDs of the electronic pen information stored in the storage unit 2000, using the device ID of the electronic pen included in the operation data [p] as a search key, and reads out the retrieved electronic pen information (Step S253). Next, the browser 20 generates a drawing command from the electronic pen information and the electronic-pen position information included in the operation data [p] (Step S254). Then, the browser 20 draws a graphic indicated by the drawing command on a drawing layer (Step S255). Incidentally, when a graphic has already been drawn on the drawing layer, the browser 20 adds the graphic indicated by the drawing command generated at Step S254 onto the drawing layer (differential drawing). The browser 20 outputs image data (display information) in which the background image and the drawing layer are synthesized (Step S256), and ends the process.
When the operation mode is not the drawing mode, i.e., when the operation mode is the erase mode (NO at Step S252), the browser 20 selects, on the basis of the position information included in the operation data [p], the drawing command corresponding to the graphic to be erased (Step S257). Then, the browser 20 deletes the graphic corresponding to the selected drawing command from the image data (the drawing layer) (Step S258), and ends the process.
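The drawing-mode and erase-mode branches of Steps S252 through S258 can be sketched as follows, with the drawing layer kept as a list of drawing commands so that differential drawing is an append and erasing removes the command nearest the pen position; the data shapes, the pen table, and the nearest-command hit test are all assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class DrawCommand:
    x: float
    y: float
    color: str
    line_width: float

drawing_layer: list = []                 # commands already drawn (differential drawing)
pen_table = {"pen-1": {"color": "black", "line_width": 2.0}}  # electronic pen information

def handle_operation(op: dict) -> None:
    if op["mode"] == "draw":                                    # Step S252: drawing mode
        pen = pen_table[op["device_id"]]                        # Step S253: look up pen info
        cmd = DrawCommand(op["x"], op["y"], pen["color"], pen["line_width"])  # Step S254
        drawing_layer.append(cmd)                               # Step S255: differential drawing
    else:                                                       # erase mode
        # Step S257: pick the command whose position is nearest the pen position
        target = min(drawing_layer,
                     key=lambda c: (c.x - op["x"]) ** 2 + (c.y - op["y"]) ** 2)
        drawing_layer.remove(target)                            # Step S258: delete the graphic

handle_operation({"mode": "draw", "device_id": "pen-1", "x": 10.0, "y": 20.0})
handle_operation({"mode": "erase", "x": 10.0, "y": 20.0})
print(drawing_layer)  # [] -- the stroke was drawn and then erased
```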
Returning to the remote sharing process, the browser 20 outputs video (sound) data ([A], [E], [p]), obtained by combining the drawn image based on the operation data [p] with the content data ([A], [E]), to the transmission FIFO 24 (Step S48).
Next, the converting unit 10 encodes the video (sound) data ([A], [E], [p]) stored in the transmission FIFO 24, thereby converting it into video (sound) data ([A], [E], [p]) to be distributed to the communication terminals 5f1 and 5f2 (Step S49). Then, the encoder bridge unit 30 transmits the video (sound) data ([A], [E], [p]) to the reproduction control unit 53 of the communication terminal 5f1 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S50-1). After that, the video (sound) data ([A], [E], [p]) is decoded by the decoding unit 50 of the communication terminal 5f1 to output the sound to the speaker 61, and is rendered by the rendering unit 55 to output the video onto the display unit 58 (Step S51-1).
In the same manner as at Step S50-1, the encoder bridge unit 30 also transmits the same video (sound) data ([A], [E], [p]) to the reproduction control unit 53 of the communication terminal 5f2 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S50-2). After that, the video (sound) data ([A], [E], [p]) is decoded by the decoding unit 50 of the communication terminal 5f2 to output the sound to the speaker 61, and is rendered by the rendering unit 55 to output the video onto the display unit 58 (Step S51-2). Accordingly, the same video (sound) as that output on the communication terminal 5f1 is also output on the communication terminal 5f2.
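Steps S50-1 and S50-2 amount to a fan-out of one encoded stream. A minimal sketch, assuming each terminal is represented by a simple send callback:

```python
def fan_out(encoded_data: bytes, terminals: list) -> None:
    # One encoded stream is transmitted to every terminal in the shared
    # session (Steps S50-1 and S50-2), so all terminals show the same video.
    for send in terminals:
        send(encoded_data)

terminals = [lambda d: print("5f1 <-", d), lambda d: print("5f2 <-", d)]
fan_out(b"[A][E][p]", terminals)
```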
Time Adjusting Process
Subsequently, a time adjusting process performed between the distribution management apparatus 2 and the communication terminal 5 is explained with reference to the corresponding sequence diagram.
First, the time control unit 56 of the communication terminal 5 acquires time information (ts) in the communication terminal 5 from the storage unit 5000, which indicates the time at which the transmitting/receiving unit 51 requests time information (T) from the distribution management apparatus 2 (Step S81). Then, the transmitting/receiving unit 51 requests the time information (T) in the distribution management apparatus 2 from the transmitting/receiving unit 31 (Step S82). In this case, the time information (ts) is transmitted together with the request for the time information (T).
Next, the time acquiring unit 26 acquires time information (Tr) in the distribution management apparatus 2 from the time managing unit 25, which indicates the time at which the transmitting/receiving unit 31 has received the request at Step S82 (Step S83). Furthermore, the time acquiring unit 26 acquires time information (Ts) in the distribution management apparatus 2 from the time managing unit 25, which indicates the time at which the transmitting/receiving unit 31 sends a response to the request at Step S82 (Step S84). Then, the transmitting/receiving unit 31 transmits the time information (ts, Tr, Ts) to the transmitting/receiving unit 51 (Step S85).
Next, the time control unit 56 of the communication terminal 5 acquires time information (tr) in the communication terminal 5 from the storage unit 5000, which indicates the time at which the transmitting/receiving unit 51 has received the response at Step S85 (Step S86).
Then, the time control unit 56 of the communication terminal 5 calculates a time difference Δ between the distribution management apparatus 2 and the communication terminal 5 (Step S87). This time difference Δ is expressed by the following equation (1).
Δ=((Tr+Ts)/2)−((tr+ts)/2) (1)
Then, the time control unit 56 stores time difference data Δ in the storage unit 5000 (Step S88). A series of these processes for time adjustment is periodically performed, for example, on a minute-by-minute basis.
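Equation (1) is the usual midpoint estimate of clock offset obtained from one request/response round trip (as in NTP-style synchronization). A small Python sketch with stand-in timestamp values:

```python
def clock_offset(ts: float, Tr: float, Ts: float, tr: float) -> float:
    """Equation (1): offset of the apparatus clock relative to the terminal
    clock, from the terminal's send/receive times (ts, tr) and the
    apparatus's receive/send times (Tr, Ts) of one round trip."""
    return (Tr + Ts) / 2 - (tr + ts) / 2

# Terminal sends at ts=100.0, apparatus receives at Tr=105.2 and responds
# at Ts=105.3, terminal receives the response at tr=100.5:
delta = clock_offset(ts=100.0, Tr=105.2, Ts=105.3, tr=100.5)
print(delta)  # ~5.0 -> the apparatus clock runs about 5 s ahead
```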
Downlink Line Adaptive Control Process
Subsequently, a process of line adaptive control for (downlink) data to be transmitted from the distribution management apparatus 2 to the communication terminal 5 is explained with reference to the corresponding sequence diagram.
First, the encoder bridge unit 30 of the distribution management apparatus 2 transmits reproduction delay time information (U), which indicates a reproduction delay time for delaying reproduction so that data can be buffered before being reproduced, to the reproduction control unit 53 of the communication terminal 5 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S101). Furthermore, the encoder bridge unit 30 adds the current time T0, acquired from the time managing unit 25, as a time stamp to the video (sound) data [A] that has been acquired from the transmission FIFO 24 and encoded, and transmits the video (sound) data [A] to the reproduction control unit 53 of the communication terminal 5 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S102).
On the other hand, in the communication terminal 5, the reproduction control unit 53 waits until the time (T0+U−Δ) in the communication terminal 5 and then outputs the video (sound) data to the decoding unit 50, whereby the sound is reproduced from the speaker 61 and the video is reproduced on the display unit 58 through the rendering unit 55 (Step S103). That is, only the video (sound) data that the communication terminal 5 has received within the range of the reproduction delay time U expressed by the following equation (2) is reproduced; the video (sound) data outside the range is not reproduced and is erased.
U≧(t0+Δ)−T0 (2)
The reproduction control unit 53 reads out the current time t0 in the communication terminal 5 from the storage unit 5000 (Step S104). This time t0 indicates the time in the communication terminal 5 at which the communication terminal 5 has received the video (sound) data from the distribution management apparatus 2. Furthermore, the reproduction control unit 53 reads out the time difference information (Δ) indicating the time difference Δ stored at Step S88 (Step S105). Then, the reproduction control unit 53 calculates a transmission delay time D1, which indicates a time between transmission of the video (sound) data from the distribution management apparatus 2 and receiving of the video (sound) data by the communication terminal 5 (Step S106). This calculation is made by the following equation (3). If the communication network 9 is congested, the transmission delay time D1 gets longer.
D1=(t0+Δ)−T0 (3)
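Equations (2) and (3) together act as an acceptance test on the terminal side: a frame time-stamped T0 and received at local time t0 has transmission delay D1 = (t0 + Δ) − T0 and is reproduced only when D1 does not exceed U. A minimal sketch, with hypothetical function names:

```python
def transmission_delay(t0: float, delta: float, T0: float) -> float:
    """Equation (3): delay between transmission by the apparatus (time
    stamp T0) and reception by the terminal (local time t0 plus offset delta)."""
    return (t0 + delta) - T0

def should_reproduce(t0: float, delta: float, T0: float, U: float) -> bool:
    """Equation (2): reproduce only data received within the reproduction
    delay time U; anything later is discarded without being reproduced."""
    return transmission_delay(t0, delta, T0) <= U

# A frame stamped T0=105.0 arrives at local time t0=100.4 with delta=5.0,
# so D1 is about 0.4 s; it is reproduced if U is at least that long.
print(should_reproduce(t0=100.4, delta=5.0, T0=105.0, U=0.5))  # True
```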
Next, the delay-information acquiring unit 57 acquires transmission delay time information (D1) indicating the transmission delay time D1 from the reproduction control unit 53 and holds it for a given length of time. When multiple pieces of transmission delay time information (D1) have been acquired, the delay-information acquiring unit 57 transmits transmission delay time information (D), which indicates frequency distribution information based on the multiple transmission delay times D1, to the line adaptive control unit 27 of the distribution management apparatus 2 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31 (Step S107).
Next, the line adaptive control unit 27 of the distribution management apparatus 2 newly calculates a reproduction delay time U′ on the basis of the transmission delay time information (D), and calculates operating conditions, such as a frame rate and data resolution, for the converting unit 10 (Step S108).
Next, the encoder bridge unit 30 of the distribution management apparatus 2 transmits reproduction delay time information (U′) indicating the new reproduction delay time U′ calculated at Step S108 to the reproduction control unit 53 of the communication terminal 5 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S109).
Furthermore, the converting unit 10 included in the encoder bridge unit 30 changes its operating conditions on the basis of a line adaptive control signal (Step S110). For example, when the transmission delay time D1 is too long, increasing the reproduction delay time U accordingly makes reproduction of the video (sound) data on the speaker 61 and the display unit 58 too late, so there is a limit to how far the reproduction delay time U can be increased. Therefore, the line adaptive control unit 27 copes with congestion of the communication network 9 not only by causing the encoder bridge unit 30 to change the reproduction delay time U to the reproduction delay time U′, but also by causing the converting unit 10 to lower the frame rate and the resolution of the video (sound) data. Accordingly, the encoder bridge unit 30 transmits the video (sound) data, with the current time T0 added as a time stamp, to the reproduction control unit 53 of the communication terminal 5 in accordance with the changed operating conditions, as in Step S102 (Step S111).
Next, in the communication terminal 5, the reproduction control unit 53 waits until the time (T0+U′−Δ) in the communication terminal 5 and then outputs the video (sound) data to the decoding unit 50, whereby the sound is reproduced from the speaker 61 and the video is reproduced on the display unit 58 through the rendering unit 55, as in Step S103 (Step S112). After that, the processes from Step S104 onward are continuously performed. In this way, the downlink line adaptive control process is continuously performed.
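The patent leaves the control law of Step S108 open. One plausible sketch: take a high percentile of the observed transmission delays D1 as the new reproduction delay time U′, cap it, and lower the frame rate once the cap is reached, reflecting the limit on increasing U described above. The percentile, the cap, and the frame-rate values below are assumptions.

```python
def recalc_operating_conditions(delays: list, u_max: float = 1.0):
    """One possible policy for Step S108: cover most observed transmission
    delays D1 with the new reproduction delay time U', but cap U' and lower
    the frame rate instead once the cap is reached (cf. Step S110)."""
    ordered = sorted(delays)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]  # 95th-percentile delay
    u_new = min(p95, u_max)
    frame_rate = 30 if p95 <= u_max else 15        # congested: halve the frame rate
    return u_new, frame_rate

u_new, fps = recalc_operating_conditions([0.2, 0.3, 0.25, 0.9, 0.35])
print(u_new, fps)  # 0.35 30
```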
Uplink Line Adaptive Control Process
Subsequently, a process of line adaptive control for (uplink) data to be transmitted from the communication terminal 5 to the distribution management apparatus 2 is explained with reference to the corresponding sequence diagram.
First, the encoding unit 60 of the communication terminal 5 transmits video (sound) data [E], which is obtained by encoding video (sound) data acquired from the camera 62 and the microphone 63, together with time information (t0) indicating the current time t0 in the communication terminal 5 acquired from the storage unit 5000 and time difference information (Δ) indicating the time difference Δ acquired from the storage unit 5000, to the decoding unit 40 of the distribution management apparatus 2 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31 (Step S121).
Next, in the distribution management apparatus 2, the decoding unit 40 reads out, from the time managing unit 25, the time T0 at which the decoding unit 40 has received the video (sound) data [E] and the accompanying information transmitted at Step S121 (Step S122). Then, the decoding unit 40 calculates a transmission delay time d1, which indicates a time between transmission of the video (sound) data from the communication terminal 5 and receiving of the video (sound) data by the distribution management apparatus 2 (Step S123). This calculation is made by the following equation (4). If the communication network 9 is congested, the transmission delay time d1 gets longer.
d1=T0−(t0+Δ) (4)
Next, in the same manner as the delay-information acquiring unit 57 of the communication terminal 5, the delay-information acquiring unit 37a of the distribution management apparatus 2 acquires transmission delay time information (d1) indicating the transmission delay time d1 from the decoding unit 40 and holds it for a given length of time. When multiple pieces of transmission delay time information (d1) have been acquired, the delay-information acquiring unit 37a outputs transmission delay time information (d), which indicates frequency distribution information based on the multiple transmission delay times d1, to the line adaptive control unit 37b (Step S124).
Next, the line adaptive control unit 37b calculates operating conditions of the encoding unit 60 of the communication terminal 5 on the basis of the transmission delay time information (d) (Step S125). Then, the line adaptive control unit 37b transmits a line adaptive control signal, which indicates the operating conditions such as a frame rate and data resolution, to the encoding unit 60 of the communication terminal 5 through the transmitting/receiving unit 31 and the transmitting/receiving unit 51 (Step S126). That is, the line adaptive control unit 27 in the case of downlink outputs a line adaptive control signal to the encoder bridge unit 30 inside the distribution management apparatus 2; on the other hand, the line adaptive control unit 37b in the case of uplink transmits a line adaptive control signal from the distribution management apparatus 2 to the communication terminal 5 via the communication network 9.
Next, the encoding unit 60 of the communication terminal 5 changes its operating conditions on the basis of the received line adaptive control signal (Step S127). Then, in accordance with the new operating conditions, the encoding unit 60 transmits video (sound) data [E], which is obtained by encoding video (sound) data acquired from the camera 62 and the microphone 63, together with the time information (t0) indicating the current time t0 in the communication terminal 5 acquired from the storage unit 5000 and the time difference information (Δ) indicating the time difference Δ acquired from the storage unit 5000, to the decoding unit 40 of the distribution management apparatus 2 through the transmitting/receiving unit 51 and the transmitting/receiving unit 31, as in Step S121 (Step S128). After that, the processes from Step S122 onward are continuously performed. In this way, the uplink line adaptive control process is continuously performed.
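On the uplink, the same delay estimate runs on the apparatus side (equation (4)), and the result returns to the terminal as a line adaptive control signal. A minimal sketch of that loop; the congestion thresholds and operating-condition values are illustrative assumptions.

```python
def uplink_delay(T0: float, t0: float, delta: float) -> float:
    """Equation (4): delay between transmission by the terminal (local time
    t0 plus offset delta) and reception by the apparatus (time T0)."""
    return T0 - (t0 + delta)

def line_adaptive_control_signal(delays: list) -> dict:
    """One possible policy for Step S125: pick the terminal encoder's
    operating conditions from the observed uplink delays d1."""
    avg = sum(delays) / len(delays)
    if avg > 0.5:                       # heavily congested line
        return {"frame_rate": 10, "resolution": (640, 360)}
    if avg > 0.2:                       # mildly congested line
        return {"frame_rate": 20, "resolution": (1280, 720)}
    return {"frame_rate": 30, "resolution": (1920, 1080)}

d1 = uplink_delay(T0=105.6, t0=100.4, delta=5.0)   # about 0.2 s
print(line_adaptive_control_signal([d1, 0.3, 0.25]))
```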
As explained in detail above with specific examples, in the distribution system 1 according to the present embodiment, the distribution management apparatus 2 has the browser 20 and the encoder bridge unit 30 for encoding data on the cloud. Accordingly, the browser 20 generates video data or sound data from content data written in a given description language, and the encoder bridge unit 30 converts the data form of the generated data so that the data can be distributed via the communication network 9 and then distributes the data to the communication terminal 5. Therefore, the communication terminal 5 is relieved of the load of receiving content data written in a given description language and the load of converting the received content data into video data or sound data; consequently, the high processing load otherwise required to keep pace with increasingly rich content can be eliminated.
In particular, the browser 20 enables real-time communication, and the converting unit 10 encodes the video (sound) data generated by the browser 20 in real time. Therefore, unlike on-demand data distribution in which, for example, a DVD player selects and delivers non-real-time (i.e., previously encoded) video (sound) data, the distribution management apparatus 2 renders content acquired immediately before distribution to generate video (sound) data and then encodes it; this makes real-time distribution of video (sound) data possible.
Supplemental Explanation
In the distribution system 1 according to the present embodiment, the terminal management apparatus 7 and the distribution management apparatus 2 are configured as separate apparatuses; however, the terminal management apparatus 7 and the distribution management apparatus 2 can be configured to be integrated into one apparatus, for example, in such a manner that the distribution management apparatus 2 has the functions of the terminal management apparatus 7.
Furthermore, each of the distribution management apparatus 2 and the terminal management apparatus 7 according to the above-described embodiment can be built as a single computer, or can be built as multiple computers to which the divided units (functions, means, or storage units) of each apparatus are arbitrarily assigned.
Moreover, recording media, such as a CD-ROM, and the HDD 204 that store therein the program according to the above-described embodiment can be provided, both domestically and overseas, as program products.
According to an embodiment, it is possible to display, on a terminal, a drawn image handwritten by a user without delay at low cost.
Although the invention has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.