This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2021-126102, filed on Jul. 30, 2021, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
The present invention relates to an information processing apparatus, a text data editing method, and a communication system.
There is a communication system that transmits and receives images and audio between remote locations via a network.
Examples of the above-described communication system include a communication system that converts the audio of a video conference into text through voice recognition to support the video conference.
When a participant of the video conference or an editor edits the text with a first terminal, however, the result of the editing is not transmitted to a second terminal that is capable of displaying the text. Since the text generated by the voice recognition does not necessarily reflect the intention of a speaker perfectly, it is desirable that the text be edited by the editor. According to a method in which the editor corrects the text of meeting minutes afterward, for example, the editor has to correct a substantial amount of text by checking the text against his or her memory or by guessing the context. Further, a participant of the video conference may follow the contents of the conference by reading the text. In this case, if the editing is slow, the uncorrected text may hinder the participant from correctly understanding the contents of the conference.
In one embodiment of this invention, there is provided an information processing apparatus that communicates with a plurality of terminals via a network. The information processing apparatus includes, for example, circuitry that transmits text data to a first terminal and a second terminal of the plurality of terminals. The text data is converted from audio data transmitted from a particular terminal of the plurality of terminals. In response to receipt of a notification of start of editing the text data from the first terminal, the circuitry restricts editing of the text data by the second terminal. In response to receipt of the edited text data from the first terminal, the circuitry transmits at least an edited character of the edited text data to the second terminal.
In one embodiment of this invention, there is provided a text data editing method performed by an information processing apparatus that communicates with a plurality of terminals via a network. The text data editing method includes, for example, transmitting text data to a first terminal and a second terminal of the plurality of terminals. The text data is converted from audio data transmitted from a particular terminal of the plurality of terminals. The text data editing method further includes, in response to receipt of a notification of start of editing the text data from the first terminal, restricting editing of the text data by the second terminal, and in response to receipt of the edited text data from the first terminal, transmitting at least an edited character of the edited text data to the second terminal.
In one embodiment of this invention, there is provided a communication system that includes, for example, a plurality of terminals and an information processing apparatus. The plurality of terminals include a first terminal and a second terminal. The information processing apparatus communicates with the plurality of terminals via a network. The information processing apparatus includes apparatus circuitry that transmits text data to the first terminal and the second terminal. The text data is converted from audio data transmitted from a particular terminal of the plurality of terminals. In response to receipt of a notification of start of editing the text data from the first terminal, the apparatus circuitry restricts editing of the text data by the second terminal. In response to receipt of the edited text data from the first terminal, the apparatus circuitry transmits at least an edited character of the edited text data to the second terminal. The first terminal includes first terminal circuitry. The first terminal circuitry displays the text data received from the information processing apparatus, and receives the editing of the text data. The second terminal includes second terminal circuitry. The second terminal circuitry displays the text data received from the information processing apparatus. Based on the at least edited character of the edited text data received from the information processing apparatus, the second terminal circuitry changes the text data being displayed.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. In the drawings illustrating embodiments of the present invention, members or components having the same function or shape will be denoted with the same reference numerals to avoid redundant description.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
As an exemplary embodiment for implementing the present invention, a communication system and a text data editing method performed by the communication system will be described below with reference to the drawings.
According to a sharing support server of the present embodiment, while text data converted from audio of a meeting is being viewed by viewers, the text data is edited in real time by an editor, and the result of the editing is reflected in the text data viewed by the viewers. Consequently, the text data edited by the editor is displayed in substantially real time on display terminals of the viewers. When a person with hearing difficulty relies on the text generated by voice recognition to understand the contents of the meeting, for example, the text is promptly corrected, helping the person to correctly understand the contents of the meeting. Further, in a case in which the editor corrects the text data afterward, an increase in the volume of the text data makes it difficult for the editor to accurately correct the text data, given the limited memory capacity of the human brain. The present embodiment facilitates the real-time correction of the text data by the editor, thereby reducing the workload on the editor.
Herein, audio data refers to data converted from sound to be subjected to signal processing. Although the audio data may originally be analog or digital, the audio data is converted into digital form to be processed on a computer. It is assumed here that the audio data mainly represents voices. The audio data, however, may contain any kind of sound.
The text data includes characters such as letters (e.g., alphabet), numbers, and symbols represented by a character code.
The restriction of editing includes the prohibition of editing the entire text data and also the prohibition of editing a character forming part of the text data. The editing includes the addition, deletion, and change of a character.
The term “in real time” means that, when a certain process is executed, the result of the process is obtained within a certain range of delay after the execution of the process.
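Since at least an edited character of the edited text data is transmitted, as described above, only the characters that have actually been added, deleted, or changed need to be forwarded to another terminal. The following Python sketch, provided for illustration only, shows one possible way to extract such character-level edits by comparing the text data before and after editing; the function name and the record fields are assumptions made for the example and are not part of the embodiment.

```python
# Hedged sketch: extract only the edited characters between the original text
# data and the edited text data, so that a server could forward just those
# changes to other terminals. Names are illustrative only.
import difflib


def extract_edits(original: str, edited: str) -> list[dict]:
    """Return addition/deletion/change operations at the character level."""
    matcher = difflib.SequenceMatcher(a=original, b=edited)
    edits = []
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            continue  # unchanged characters are not transmitted
        edits.append({
            "operation": tag,            # "insert", "delete", or "replace"
            "position": i1,              # character offset in the original text
            "removed": original[i1:i2],  # characters deleted or replaced
            "added": edited[j1:j2],      # characters added or substituted
        })
    return edits


if __name__ == "__main__":
    # Example: a voice-recognition error corrected by an editor.
    print(extract_edits("The meating starts at ten.", "The meeting starts at ten."))
```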
A schematic configuration of a communication system 1 of the present embodiment will be described with reference to
The communication terminal 2, the video conference terminal 3, the vehicle navigation system 4, the PC 5, the sharing support server 6, the schedule management server 8, the audio-to-text conversion server 9, and the display terminal 10 are communicable with each other via a communication network N. The communication network N is implemented by the Internet, a mobile telecommunications network, or a local area network (LAN), for example. The communication network N may include not only a wired communication network but also a wireless communication network conforming to a standard such as third generation (3G), worldwide interoperability for microwave access (WiMAX), or long term evolution (LTE).
The communication terminal 2 is used in a meeting room X. The video conference terminal 3 is used in a meeting room Y. The meeting rooms X and Y may be installed with an electronic whiteboard. Herein, a shared item is an item reserved by a user. The vehicle navigation system 4 is used in a vehicle α. In this case, the vehicle α is a vehicle for car sharing. The vehicle includes an automobile, a motorcycle, a bicycle, and a wheelchair, for example.
The shared item refers to an object, service, space (e.g., room), place, or information shared by a plurality of people or organizations. The meeting rooms X and Y and the vehicle α are examples of the shared item shared by a plurality of users. An example of the shared item being information is an account. For example, use of a particular service provided on the World Wide Web (Web) may be limited to a single account.
The communication terminal 2 is a general-purpose computer such as a tablet terminal or a smartphone. The display terminal 10, the video conference terminal 3, and the vehicle navigation system 4 are also examples of a communication terminal. The communication terminal is a terminal that becomes usable in a video conference between different locations after sign-in by a user (see later-described step S33 in
The PC 5 is a general-purpose computer. The PC 5 is an example of a registration apparatus that registers, on the schedule management server 8, a reservation to use a shared item and an event scheduled to be executed by a user. The event includes a conference, assembly, meeting, gathering, consultation, discussion, drive, ride, and transport, for example. The PC 5 is also used as a terminal by an editor who edits the text data converted from audio. The PC 5 is an example of a first terminal. The PC 5 may include a plurality of PCs 5, such as PCs 5a, 5b, and so forth. Hereinafter, any one of the PCs 5a, 5b, and so forth will be referred to as the PC 5.
The sharing support server 6 is a computer. The sharing support server 6 supports the communication terminals in remotely sharing the shared item. The sharing support server 6 is an example of an information processing apparatus.
The schedule management server 8 is a computer. The schedule management server 8 manages the reservations of shared items and the schedules of users.
The audio-to-text conversion server 9 is a computer. The audio-to-text conversion server 9 converts sound (i.e., audio) data received from an external computer (e.g., the sharing support server 6) into text data.
Herein, the sharing support server 6, the schedule management server 8, and the audio-to-text conversion server 9 will be collectively referred to as the management system. The management system may be a computer that integrates all or part of the functions of the sharing support server 6, the schedule management server 8, and the audio-to-text conversion server 9, for example. Alternatively, the functions of the sharing support server 6, the schedule management server 8, and the audio-to-text conversion server 9 may be distributed to and implemented by a plurality of computers. It is assumed in the following description that each of the sharing support server 6, the schedule management server 8, and the audio-to-text conversion server 9 is a server computer located in a cloud environment. The sharing support server 6 and the audio-to-text conversion server 9, however, may be a server located in an on-premise environment. The schedule management server 8 may also be a server located in an on-premise environment.
Respective hardware configurations of apparatuses and terminals forming the communication system 1 will be described with reference to
A hardware configuration of the video conference terminal 3 will first be described.
The CPU 301 controls overall operation of the video conference terminal 3. The ROM 302 stores a program used to drive the CPU 301, such as an initial program loader (IPL). The RAM 303 is used as a work area for the CPU 301. The flash memory 304 stores a communication program and various data such as image data and audio data. The SSD 305 controls writing and reading of various data to and from the flash memory 304 under the control of the CPU 301. The SSD 305 may be replaced by a hard disk drive (HDD). The medium I/F 307 controls writing (i.e., storage) and reading of data to and from a recording medium 306 such as a flash memory. The operation buttons 308 are buttons operated to select the address of the video conference terminal 3, for example. The power switch 309 is a switch for turning on or off the power supply of the video conference terminal 3.
The network I/F 311 is an interface for performing data communication via the communication network N such as the Internet. The CMOS sensor 312 is a built-in imaging device that captures the image of a subject under the control of the CPU 301 to obtain image data. The imaging element I/F 313 is a circuit that controls the driving of the CMOS sensor 312. The microphone 314 is a built-in sound collector that receives input of audio. The audio input and output I/F 316 is a circuit that processes the input of an audio signal from the microphone 314 and the output of an audio signal to the speaker 315 under the control of the CPU 301. The display I/F 317 is a circuit that transmits image data to an external display 320 under the control of the CPU 301. The external apparatus connection I/F 318 is an interface for connecting the video conference terminal 3 to various external apparatuses. The near field communication circuit 319 is a communication circuit conforming to a standard such as near field communication (NFC) or Bluetooth (registered trademark).
The bus line 310 includes an address bus and a data bus for electrically connecting the CPU 301 and the other components in
The display 320 is a display (i.e., display device) implemented as a liquid crystal or organic electroluminescence (EL) display that displays the image of the subject and icons for operations, for example. The display 320 is connected to the display I/F 317 via a cable 320c. The cable 320c may be a cable for analog red-green-blue (RGB) video graphics array (VGA) signal, a cable for component video, or a cable for DisplayPort (registered trademark), high-definition multimedia interface (HDMI, registered trademark), or digital visual interface (DVI) signal.
The CMOS sensor 312 may be an imaging element such as a charge coupled device (CCD) sensor. The external apparatus connection I/F 318 is connectable to an external apparatus such as an external camera, microphone, or speaker via a universal serial bus (USB) cable, for example. If an external camera is connected to the external apparatus connection I/F 318, the external camera is driven in preference to the built-in CMOS sensor 312 under the control of the CPU 301. Similarly, if an external microphone or speaker is connected to the external apparatus connection I/F 318, the external microphone or speaker is driven in preference to the built-in microphone 314 or the built-in speaker 315 under the control of the CPU 301.
The recording medium 306 is removable from the video conference terminal 3. The flash memory 304 may be replaced by any nonvolatile memory for reading or writing data under the control of the CPU 301, such as an electrically erasable programmable ROM (EEPROM).
A hardware configuration of the vehicle navigation system 4 will be described.
The CPU 401 controls overall operation of the vehicle navigation system 4. The ROM 402 stores a program used to drive the CPU 401 such as an IPL. The RAM 403 is used as a work area for the CPU 401. The EEPROM 404 reads or writes various data of a program for the vehicle navigation system 4, for example, under the control of the CPU 401. The power switch 405 is a switch for turning on or off the power supply of the vehicle navigation system 4. The acceleration and orientation sensor 406 includes various sensors such as an electromagnetic compass that detects geomagnetism, a gyrocompass, and an acceleration sensor. The medium I/F 408 controls writing (i.e., storage) and reading of data to and from a recording medium 407 such as a flash memory. The GPS receiver 409 receives a GPS signal from a GPS satellite.
The vehicle navigation system 4 further includes a telecommunication circuit 411, an antenna 411a for the telecommunication circuit 411, a CMOS sensor 412, an imaging element I/F 413, a microphone 414, a speaker 415, an audio input and output I/F 416, a display 417, a display I/F 418, an external apparatus connection I/F 419, a near field communication circuit 420, and an antenna 420a for the near field communication circuit 420.
The telecommunication circuit 411 is a circuit that receives information provided by an external infrastructure outside the vehicle α, such as traffic congestion information, road construction information, and traffic accident information, and transmits positional information of the vehicle α and an emergency rescue signal, for example, to the outside of the vehicle α. The external infrastructure is a road information guide system such as the vehicle information and communication system (VICS, registered trademark), for example. The CMOS sensor 412 is a built-in imaging device that captures the image of a subject under the control of the CPU 401 to obtain image data. The imaging element I/F 413 is a circuit that controls the driving of the CMOS sensor 412. The microphone 414 is a built-in sound collector that receives the input of audio. The audio input and output I/F 416 is a circuit that processes the input of an audio signal from the microphone 414 and the output of an audio signal to the speaker 415 under the control of the CPU 401. The display 417 is a display (i.e., display device) such as a liquid crystal or organic EL display, for example, which displays the image of the subject and various icons, for example. The display 417 has the function of a touch panel. The touch panel is an input device for a user to operate the vehicle navigation system 4. The display I/F 418 is a circuit that causes the display 417 to display the image. The external apparatus connection I/F 419 is an interface for connecting the vehicle navigation system 4 to various external apparatuses. The near field communication circuit 420 is a communication circuit conforming to a standard such as NFC or Bluetooth. The vehicle navigation system 4 further includes a bus line 410. The bus line 410 includes an address bus and a data bus for electrically connecting the CPU 401 and the other components in
A description will be given of respective hardware configurations of the communication terminal 2, the PC 5, the display terminal 10, the sharing support server 6, the schedule management server 8, and the audio-to-text conversion server 9.
The CPU 501 controls overall operation of the communication terminal 2, the PC 5, or the display terminal 10. The ROM 502 stores a program used to drive the CPU 501 such as an IPL. The RAM 503 is used as a work area for the CPU 501. The HD 504 stores various data of programs, for example. The HDD controller 505 controls writing and reading of various data to and from the HD 504 under the control of the CPU 501. The medium I/F 507 controls writing (i.e., storage) and reading of data to and from a recording medium 506 such as a flash memory. The display 508 displays various information such as a cursor, menus, windows, text, and images. The display 508 is an example of a display (i.e., display device). The network I/F 509 is an interface for performing data communication via the communication network N. The keyboard 511 is an input device including a plurality of keys for inputting text, numerical values, and various instructions, for example. The mouse 512 is an input device used to select and execute various instructions, select a processing target, and move the cursor, for example. The CD-RW drive 514 controls writing and reading of various data to and from a CD-RW 513 as an example of a removable recording medium. The speaker 515 outputs an audio signal under the control of the CPU 501. The camera 516 captures the image within the angle of view under the control of the CPU 501 to generate image data. The microphone 517 collects an audio signal under the control of the CPU 501. The bus line 510 includes an address bus and a data bus for electrically connecting the CPU 501 and the other components in
The sharing support server 6 is implemented by a computer. As illustrated in
The schedule management server 8 is implemented by a computer. As illustrated in
The audio-to-text conversion server 9 is implemented by a computer. As illustrated in
Each of the above-described programs may be distributed as recorded on a computer readable recording medium in an installable or executable file format. Examples of the recording medium include a CD-recordable (CD-R), a digital versatile disc (DVD), a Blu-ray (registered trademark) disc, and a secure digital (SD) card. The recording medium may be shipped to the market as a program product. For example, with the execution of a program of the present invention, the communication terminal 2, the PC 5, or the display terminal 10 implements a text data editing method of the present invention.
The sharing support server 6 may be implemented by a single computer, or may be implemented by a plurality of computers to which units (e.g., functions or devices and memories) of the sharing support server 6 are divided and allocated as desired. The same applies to the schedule management server 8 and the audio-to-text conversion server 9.
A functional configuration of the communication system 1 of the present embodiment will be described with reference to
As for a functional configuration of the communication terminal 2, the communication terminal 2 includes a communication unit 21, a receiving unit 22, an image and audio processing unit 23, a display control unit 24, a determination unit 25, and a storage and reading unit 29, as illustrated in
The functional units of the communication terminal 2 will be described.
The communication unit 21 is implemented by a command from the CPU 501 and the network I/F 509 illustrated in
The receiving unit 22 is mainly implemented by a command from the CPU 501, the keyboard 511, the mouse 512, and the display 508 with a touch panel illustrated in
The image and audio processing unit 23 performs image processing on the image data of the image of the subject captured by the camera 516. The image and audio processing unit 23 further performs audio processing on audio data related to an audio signal converted from the voice of the user by the microphone 517. The image and audio processing unit 23 further outputs an audio signal related to audio data to the speaker 515 to output sound from the speaker 515.
The display control unit 24 is implemented by a command from the CPU 501 illustrated in
The determination unit 25 is implemented by a command from the CPU 501 illustrated in
The storage and reading unit 29 is implemented by a command from the CPU 501 and the HD 504 illustrated in
Each of the video conference terminal 3 and the vehicle navigation system 4 has functions similar to those of the communication terminal 2, and thus description thereof will be omitted here.
As for a functional configuration of the PC 5, the PC 5 includes a communication unit 51, a receiving unit 52, a display control unit 54, a generation unit 56, an audio control unit 58, and a storage and reading unit 59. These units are functions or functional units implemented when at least one of the components illustrated in
The functional units of the PC 5 will be described.
The communication unit 51 is implemented by a command from the CPU 501 and the network I/F 509 illustrated in
The receiving unit 52 is mainly implemented by a command from the CPU 501, the keyboard 511, and the mouse 512 illustrated in
The display control unit 54 is implemented by a command from the CPU 501 illustrated in
The generation unit 56 is implemented by a command from the CPU 501 illustrated in
The audio control unit 58 is implemented by a command from the CPU 501 illustrated in
The storage and reading unit 59 is implemented by a command from the CPU 501 and the HDD controller 505 illustrated in
As for a functional configuration of the sharing support server 6, the sharing support server 6 includes a communication unit 61, an authentication unit 62, a creation unit 63, a generation unit 64, a determination unit 65, a restriction unit 66, and a storage and reading unit 69. These units are functions or functional units implemented when at least one of the components illustrated in
A user authentication management table of the present embodiment will be described.
An access management table of the present embodiment will be described.
A schedule management table of the present embodiment will be described.
The scheduled event ID is identification information for identifying a scheduled event. The scheduled event ID is an example of scheduled event identification information for identifying an event scheduled to be executed. The executed event ID is identification information for identifying a scheduled event that has actually been executed or is actually being executed. The executed event ID is an example of executed event identification information for identifying an executed event or an event being executed. The name of the reserver is the name of the person who has reserved the shared item. If the shared item is a meeting room, the name of the reserver is the name of the organizer of a meeting in the meeting room, for example. If the shared item is a vehicle, the name of the reserver is the name of the driver of the vehicle, for example. The schedule start time represents the time at which the use of the shared item is scheduled to start. The schedule end time represents the time at which the use of the shared item is scheduled to end. The event name represents the name of the event scheduled to be executed by the reserver. The user IDs of the other participants are identification information for identifying the participants other than the reserver. The names of the other participants are the names of the participants other than the reserver, and include the name of the shared item. That is, the users in this case include the shared item as well as the reserver and the other participants. The file data is the file data of a material file used in the event corresponding to the scheduled event ID, i.e., the event registered by a user A on a later-described schedule input screen 550 in
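For illustration only, one record of the schedule management table described above may be modeled as in the following Python sketch. The field names are assumptions made for the example; the actual table layout of the schedule management DB is not limited to this form.

```python
# Hedged sketch: a data structure mirroring one record of the schedule
# management table described above. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class ScheduleRecord:
    scheduled_event_id: str          # identifies the scheduled event
    executed_event_id: str | None    # set once the event is actually executed
    reserver_name: str               # organizer of the meeting or driver of the vehicle
    schedule_start_time: datetime    # scheduled start of use of the shared item
    schedule_end_time: datetime      # scheduled end of use of the shared item
    event_name: str
    other_participant_ids: list[str] = field(default_factory=list)
    other_participant_names: list[str] = field(default_factory=list)  # may include the shared item itself
    file_data_url: str | None = None  # storage location of the material file, if any
```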
A content management table of the present embodiment will be described.
Herein, the content includes history information representing the executed contents of the event and an action item generated by the executed event. The history information represents recorded data or the data of a snapshot, audio text, or a material file, for example. Snapshot refers to a process in which a display screen displayed at a certain point of time in an ongoing event is acquired as image data. Snapshot may also be referred to as capture or image recognition, for example.
When the content processing type is recording, the content includes a uniform resource locator (URL) representing the storage location of the recorded audio data. When the content processing type is snapshot, the content includes a URL representing the storage location of the image data of the screen acquired by the snapshot (i.e., capture). Capture refers to storing a still or video image displayed on the display 508 as image data. When the content processing type is audio-to-text conversion, the content includes a URL representing the storage location of text data of the received audio text.
Herein, the action item represents the contents of an action that is generated in an event such as a meeting and should be performed by a person involved in the event. When the content processing type is the generation of an action event, the content includes the user ID of an executor of the action item, the due date to complete the action item, and a URL representing the storage location of image data representing the action item.
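Likewise, for illustration only, one record of the content management table described above may be modeled as in the following Python sketch, in which the content processing type determines which fields of the content are populated. The type values and field names are assumptions made for the example and are not part of the embodiment.

```python
# Hedged sketch: a content record keyed by content processing type, matching
# the content management table described above. Names are illustrative only.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum


class ContentProcessingType(Enum):
    RECORDING = "recording"                     # content holds the URL of the recorded audio data
    SNAPSHOT = "snapshot"                       # content holds the URL of the captured screen image
    AUDIO_TO_TEXT = "audio_to_text_conversion"  # content holds the URL of the converted text data
    ACTION_ITEM = "action_item_generation"      # content holds executor ID, due date, and image URL


@dataclass
class ContentRecord:
    executed_event_id: str
    content_processing_id: str
    processing_type: ContentProcessingType
    content_url: str                     # storage location of the content data
    start_datetime: datetime             # start of the content processing
    end_datetime: datetime               # end of the content processing
    executor_user_id: str | None = None  # only for action items
    due_date: datetime | None = None     # only for action items
```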
The functional units of the sharing support server 6 will be described in detail. In the following description of the functional units of the sharing support server 6, the relationships between the functional units of the sharing support server 6 and major ones of the components in
The communication unit 61 of the sharing support server 6 illustrated in
The authentication unit 62 is implemented by a command from the CPU 601 illustrated in
The creation unit 63 is implemented by a command from the CPU 601 illustrated in
The generation unit 64 is implemented by a command from the CPU 601 illustrated in
The determination unit 65 is implemented by a command from the CPU 601 illustrated in
If the text data starts being edited on one of the PCs 5, the restriction unit 66 restricts the editing of the text data by the other PCs 5 and the display terminals 10. That is, the restriction unit 66 performs exclusion control to prohibit more than one terminal or apparatus from editing the same text data at the same time. When the editor releases the selected text data, the restriction unit 66 lifts the restriction.
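For illustration only, the exclusion control performed by the restriction unit 66 may be sketched in Python as follows: the terminal that notifies the start of editing acquires a lock on the text data, editing by any other terminal is restricted while the lock is held, and the restriction is lifted when the editor releases the text data. The class and method names are assumptions made for the example and do not represent the actual implementation of the restriction unit 66.

```python
# Hedged sketch of the exclusion control performed when editing starts.
import threading


class EditRestrictionUnit:
    def __init__(self) -> None:
        self._locks: dict[str, str] = {}  # text data ID -> terminal ID holding the lock
        self._mutex = threading.Lock()

    def start_editing(self, text_id: str, terminal_id: str) -> bool:
        """Restrict editing by other terminals; returns False if already being edited."""
        with self._mutex:
            holder = self._locks.get(text_id)
            if holder is not None and holder != terminal_id:
                return False  # another terminal is editing: restrict
            self._locks[text_id] = terminal_id
            return True

    def is_restricted_for(self, text_id: str, terminal_id: str) -> bool:
        """True if this terminal is currently prohibited from editing the text data."""
        with self._mutex:
            holder = self._locks.get(text_id)
            return holder is not None and holder != terminal_id

    def release(self, text_id: str, terminal_id: str) -> None:
        """Lift the restriction when the editor releases the selected text data."""
        with self._mutex:
            if self._locks.get(text_id) == terminal_id:
                del self._locks[text_id]
```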
The storage and reading unit 69 is implemented by a command from the CPU 601 and the HDD controller 605 illustrated in
As for a functional configuration of the schedule management server 8, the schedule management server 8 includes a communication unit 81, an authentication unit 82, a generation unit 83, and a storage and reading unit 89. These units are functions or functional units implemented when at least one of the components illustrated in
A user authentication management table of the present embodiment will be described.
A user management table of the present embodiment will be described.
A shared item management table of the present embodiment will be described.
A shared item reservation management table of the present embodiment will be described.
An event management table of the present embodiment will be described.
A server authentication management table of the present embodiment will be described.
An executed event history management table of the present embodiment will be described.
An executed event management table of the present embodiment will be described.
A related information management table of the present embodiment will be described.
A text information management table of the present embodiment will be described.
The functional units of the schedule management server 8 will be described in detail. In the following description of the functional units of the schedule management server 8, the relationships between the functional units of the schedule management server 8 and major ones of the components in
The communication unit 81 of the schedule management server 8 illustrated in
The authentication unit 82 is implemented by a command from the CPU 801 illustrated in
The generation unit 83 is implemented by a command from the CPU 801 illustrated in
The storage and reading unit 89 is implemented by a command from the CPU 801 and the HDD controller 805 illustrated in
As for a functional configuration of the audio-to-text conversion server 9, the audio-to-text conversion server 9 includes a communication unit 91, a conversion unit 93, and a storage and reading unit 99. These units are functions or functional units implemented when at least one of the components illustrated in
The functional units of the audio-to-text conversion server 9 will be described in detail. In the following description of the functional units of the audio-to-text conversion server 9, the relationships between the functional units of the audio-to-text conversion server 9 and major ones of the components in
The communication unit 91 of the audio-to-text conversion server 9 illustrated in
The conversion unit 93 is implemented by a command from the CPU 901 illustrated in
The storage and reading unit 99 is implemented by a command from the CPU 901 and the HDD controller 905 illustrated in
The IDs described above are examples of identification information. The organization ID includes the company name, the office name, the department name, and the area name, for example. The user ID includes the employee number, the driver's license number, and My Number in the Japanese social security and tax number system, for example.
A functional configuration (i.e., functional components) of the display terminal 10 will be described.
The display terminal 10 includes a communication unit 11, a receiving unit 12, a display control unit 13, and a storage and reading unit 19. These units are functions or functional units implemented when at least one of the components illustrated in
The communication unit 11 is implemented by a command from the CPU 501 and the network I/F 509 illustrated in
The receiving unit 12 is mainly implemented by a command from the CPU 501, the keyboard 511, the mouse 512, and the display 508 with a touch panel illustrated in
The display control unit 13 is implemented by a command from the CPU 501 illustrated in
The storage and reading unit 19 is implemented by a command from the CPU 501 and the HD 504 illustrated in
The display terminal 10 may have functions similar to those of the communication terminal 2. The functions of the display terminal 10 illustrated in
Process or operations of the present embodiment will be described below.
With reference to
When the user A operates the keyboard 511 of the PC 5, for example, the display control unit 54 of the PC 5 causes the display 508 to display a sign-in screen 530 for the user A to sign in, as illustrated in
The user A then inputs his user ID and organization ID in the input field 531, inputs his password in the input field 532, and presses the “SIGN IN” button 538. Then, the receiving unit 52 of the PC 5 receives a user request for sign-in (step S12). The communication unit 51 of the PC 5 then transmits sign-in request information to the schedule management server 8 (step S13). The sign-in request information, which represents the request for sign-in, includes the information received at step S12 (i.e., the user ID, the organization ID, and the password). Thereby, the communication unit 81 of the schedule management server 8 receives the sign-in request information.
Then, the authentication unit 82 of the schedule management server 8 executes the authentication of the user A with the user ID, the organization ID, and the password (step S14). Specifically, the storage and reading unit 89 of the schedule management server 8 searches the user authentication management DB 8001 (see
The communication unit 81 then transmits an authentication result to the PC 5 (step S15). Thereby, the communication unit 51 of the PC 5 receives the authentication result.
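For illustration only, the authentication performed at step S14 may be sketched in Python as a lookup of the pair of user ID and organization ID followed by a comparison of the password, as follows. The in-memory dictionary and the plain-text password comparison are simplifications made for the example; an actual user authentication management DB would typically store salted password hashes, and all entries below are fictitious.

```python
# Hedged sketch of the sign-in authentication at steps S14 and S15.
# The keys mirror the pair of user ID and organization ID used as a search key;
# the values stand in for the stored passwords. All entries are fictitious.
USER_AUTH_DB = {
    ("taro.riko", "example-org"): "correct horse battery staple",
}


def authenticate_user(user_id: str, organization_id: str, password: str) -> bool:
    # Step S14: search for the pair of user ID and organization ID and compare
    # the password; the user is a valid user only if all three match.
    stored = USER_AUTH_DB.get((user_id, organization_id))
    return stored is not None and stored == password


# Step S15: the authentication result returned to the PC can then be a flag
# indicating whether the user is a valid user.
print(authenticate_user("taro.riko", "example-org", "correct horse battery staple"))  # True
```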
If the authentication result received at step S15 indicates that the user A is a valid user, the generation unit 56 of the PC 5 generates an initial screen 540 as illustrated in
Then, the storage and reading unit 89 of the schedule management server 8 performs a search through the user management DB 8002 (see
In the PC 5, the generation unit 56 then generates a schedule input screen 550 with the schedule input screen information received at step S21 (step S22). Then, the display control unit 54 of the PC 5 causes the display 508 to display the schedule input screen 550 as illustrated in
The schedule input screen 550 includes input fields 551, 552, 553, 554, and 555, a display area 556, a selection menu 557, an “OK” button 558, and a “CANCEL” button 559. The input field 551 is used to input the event name. The input field 552 is used to input the shared item ID or the shared item name. The input field 553 is used to input the scheduled start date and time when the execution of the event (i.e., the use of the shared item) is scheduled to start. The input field 554 is used to input the scheduled end date and time when the execution of the event (i.e., the use of the shared item) is scheduled to end. The input field 555 is used to input notes such as an agenda. The display area 556 is used to display the name of the reserver. The selection menu 557 is used to select the names of the other participants than the reserver. The “OK” button 558 is pressed to register a reservation. The “CANCEL” button 559 is pressed to cancel the input information or the information being input. The name of the reserver is the name of the user A who has input the information for sign-in to the PC 5 at step S12. The schedule input screen 550 further displays a mouse pointer p1.
The input field 552 may be used to input an email address. Further, if the name of the shared item is selected in the selection menu 557, the shared item is also added to the other participants.
Then, if the user A inputs particular information in the input fields 551 to 555, selects the names (i.e., user names) of the users desired to participate in the event from the selection menu 557 by using the mouse pointer p1, and presses the “OK” button 558, the receiving unit 52 receives the input of the schedule information (step S24). The communication unit 51 then transmits the schedule information to the schedule management server 8 (step S25). The schedule information includes the event name, the shared item ID (or the shared item name), the scheduled start date and time, the scheduled end date and time, the user IDs of the participants, and the notes. If the shared item ID is input in the input field 552 of the schedule input screen 550, the shared item ID is transmitted to the schedule management server 8. If the shared item name is input in the input field 552, the shared item name is transmitted to the schedule management server 8. On the schedule input screen 550, the user names are selected from the selection menu 557. Since the user IDs are also received at step S21, the user IDs corresponding to the user names are transmitted to the schedule management server 8. Thereby, the communication unit 81 of the schedule management server 8 receives the schedule information.
Then, the schedule management server 8 performs a search through the shared item management DB 8003 (see
Then, the storage and reading unit 89 stores the reservation information in the shared item reservation management DB 8004 (see
The storage and reading unit 89 further stores schedule information in the event management DB 8005 (see
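For illustration only, the registration process described above, from the receipt of the schedule information at step S25 to the storage of the reservation information and the schedule information, may be sketched in Python as follows. The dictionaries and field names are assumptions made for the example and merely stand in for the shared item management DB 8003, the shared item reservation management DB 8004, and the event management DB 8005.

```python
# Hedged sketch of the schedule registration: resolve the shared item ID when
# only the shared item name was input, then store the reservation and the event.
SHARED_ITEM_DB = {"Meeting Room X": "room-x-001", "Vehicle Alpha": "vehicle-a-001"}
RESERVATION_DB: list[dict] = []
EVENT_DB: list[dict] = []


def register_schedule(schedule_info: dict) -> None:
    # Resolve the shared item ID if only the shared item name was input on the
    # schedule input screen 550.
    shared_item_id = schedule_info.get("shared_item_id")
    if shared_item_id is None:
        shared_item_id = SHARED_ITEM_DB[schedule_info["shared_item_name"]]
    # Reservation information: who reserved which shared item and when.
    RESERVATION_DB.append({
        "shared_item_id": shared_item_id,
        "reserver_user_id": schedule_info["reserver_user_id"],
        "scheduled_start": schedule_info["scheduled_start"],
        "scheduled_end": schedule_info["scheduled_end"],
    })
    # Schedule information: the event itself and its participants.
    EVENT_DB.append({
        "event_name": schedule_info["event_name"],
        "participant_user_ids": schedule_info["participant_user_ids"],
        "scheduled_start": schedule_info["scheduled_start"],
        "scheduled_end": schedule_info["scheduled_end"],
        "notes": schedule_info.get("notes", ""),
    })
```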
According to the above-described process, the user A is able to register his schedule on the schedule management server 8. In the process described above with
An event starting process of the present embodiment will be described. Specifically, a process of having a meeting with the other participants by using the communication terminal 2 in the meeting room X reserved by the user A (the reserver named Taro Riko in the present example) will be described with reference to
When the user A presses a power switch of the communication terminal 2, the receiving unit 22 of the communication terminal 2 receives a power-on operation (or the launch of an application) performed by the user A (step S31). Then, as illustrated in
Then, if the user A presses the selection icon 113 and inputs his email address and his password, the receiving unit 22 receives a user request for sign-in (step S33). Then, the communication unit 21 transmits the sign-in request information to the sharing support server 6 (step S34). The sign-in request information representing the sign-in request includes time zone information of the country or region in which the communication terminal 2 is installed, as well as the user ID of the communication terminal 2, the organization ID, and the password received at step S33. Thereby, the communication unit 61 of the sharing support server 6 receives the sign-in request information.
Then, the authentication unit 62 of the sharing support server 6 executes the authentication of the user A with the user ID, the organization ID, and the password of the user A received at step S34 (step S35). Specifically, using the user ID, the organization ID, and the password of the user A received at step S34 as a search key, the storage and reading unit 69 of the sharing support server 6 searches the user authentication management DB 6001 (see
Then, the storage and reading unit 69 of the sharing support server 6 performs a search through the access management DB 6002 (see
Then, the communication unit 61 transmits reservation request information and schedule request information to the schedule management server 8 (step S37). The reservation request information represents the request for the reservation information of the shared item. The schedule request information represents the request for the schedule information of the user A. Each of the reservation request information and the schedule request information includes the time zone information, the user ID of the communication terminal 2, and the organization ID received at step S34 and the access ID and the access password read at step S36. Thereby, the communication unit 81 of the schedule management server 8 receives the reservation request information and the schedule request information.
Then, the authentication unit 82 of the schedule management server 8 executes the authentication of the sharing support server 6 with the access ID and the access password (step S38). Specifically, the storage and reading unit 89 of the schedule management server 8 searches the server authentication management DB 8006 (see
Using the user ID of the communication terminal 2 received at step S37 as a search key, the storage and reading unit 89 of the schedule management server 8 performs a search through the shared item reservation management DB 8004 (see
Using the user ID of the communication terminal 2 received at step S37 as a search key, the storage and reading unit 89 further performs a search through the event management DB 8005 (see
Then, the communication unit 81 transmits the reservation information read at step S39 and the schedule information read at step S40 to the sharing support server 6 (step S41). Thereby, the communication unit 61 of the sharing support server 6 receives the reservation information and the schedule information.
Then, the creation unit 63 of the sharing support server 6 creates a reservation list based on the reservation information and the schedule information received at step S41 (step S42). The communication unit 61 then transmits reservation list information to the communication terminal 2 (step S43). The reservation list information represents the contents of the reservation list. Thereby, the communication unit 21 of the communication terminal 2 receives the reservation list information.
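For illustration only, the creation of the reservation list at step S42 may be sketched in Python as a merge of the reservation information and the schedule information into a single list ordered by scheduled start date and time, as follows. The record fields are assumptions made for the example; the actual contents of the reservation list are determined by the reservation list screen 230 described below.

```python
# Hedged sketch of the reservation list creation at step S42.
def create_reservation_list(reservations: list[dict], schedules: list[dict]) -> list[dict]:
    # Merge the reservation information of the shared item and the schedule
    # information of the user into one list of event information items.
    merged = [
        {"event_name": r["event_name"], "reserver_name": r["reserver_name"],
         "scheduled_start": r["scheduled_start"], "scheduled_end": r["scheduled_end"]}
        for r in reservations + schedules
    ]
    # Present the events in order of scheduled start date and time.
    merged.sort(key=lambda item: item["scheduled_start"])
    return merged
```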
In the communication terminal 2, the display control unit 24 then causes the display 508 to display a reservation list screen 230 as illustrated in
Then, if the user A presses the start button 235s in
In the sharing support server 6, the generation unit 64 then generates a unique executed event ID (step S53). Then, the storage and reading unit 69 of the sharing support server 6 manages the executed event ID generated at step S53, the scheduled event ID received at step S52, the user ID and the organization ID of the reserver, and the event information item 235 in association with each other (step S54).
The user ID and the organization ID of the reserver and the event information item 235 are the IDs and information item based on the reservation information and the schedule information received at step S41. At this stage, there is no input in the “ATTENDANCE” field of the reservation management table (see
Then, in the sharing support server 6, the communication unit 61 transmits file data transmission request information to the schedule management server 8 (step S55). The file data transmission request information represents the request to transmit the file data registered in the schedule management server 8. The file data transmission request information includes the scheduled event ID received at step S52, the user ID of the communication terminal 2 and the organization ID received at step S34, and the access ID and the access password read at step S36. Thereby, the communication unit 81 of the schedule management server 8 receives the file data transmission request information.
Then, the storage and reading unit 89 of the schedule management server 8 performs a search through the event management DB 8005 (see
Then, the storage and reading unit 69 of the sharing support server 6 stores and manages the file data received at step S57 in the schedule management DB 6003 (see
The communication unit 61 then transmits the executed event ID generated at step S53 and the file data received at step S57 to the communication terminal 2 (step S59). Thereby, the communication unit 21 of the communication terminal 2 receives the executed event ID and the file data.
In the communication terminal 2, the storage and reading unit 29 then stores the executed event ID and the file data in the storage unit 2000 (step S60). In this step, the file data transmitted from the sharing support server 6 is stored in a particular storage area in the storage unit 2000. The communication terminal 2 accesses the particular storage area during the execution of the event, and the display control unit 24 of the communication terminal 2 causes the display 508 to display the file data stored in the particular storage area. Herein, the particular storage area is a temporary data storage location provided for each ongoing event, and is identified by any desired path (character string) representing a location in the storage unit 2000. The particular storage area is not necessarily provided in the communication terminal 2, and may be provided in an external storage device connected to the communication terminal 2 or in a local server located in an on-premise environment and communicable with the communication terminal 2, for example.
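For illustration only, the particular storage area may be derived as in the following Python sketch, in which a directory path keyed by the executed event ID serves as the temporary data storage location for the ongoing event. The base directory name is an assumption made for the example.

```python
# Hedged sketch: derive a per-event storage area identified by a path.
from pathlib import Path


def event_storage_area(executed_event_id: str, base: Path = Path("storage/events")) -> Path:
    # The particular storage area is a temporary data storage location provided
    # for each ongoing event and identified by a path in the storage unit.
    area = base / executed_event_id
    area.mkdir(parents=True, exist_ok=True)
    return area
```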
Then, the display control unit 24 causes the display 508 to display a detailed information screen 250 of the selected event, as illustrated in
Then, when the user A ticks the checkboxes corresponding to the names of the users actually participating in the event out of the prospective participants and presses the “CLOSE” button 259, the receiving unit 22 of the communication terminal 2 receives the user selection of the participants (step S62). The communication unit 21 of the communication terminal 2 then transmits the user IDs and the attendance information of the prospective participants to the sharing support server 6 (step S63). Thereby, the communication unit 61 of the sharing support server 6 receives the user IDs and the attendance information of the prospective participants.
The sharing support server 6 then stores and manages the attendance information in the “ATTENDANCE” field of the schedule management DB 6003 (step S64), which has been blank until this step.
With the above-described process, the user A starts the event (a policy-making meeting in the present example) with the shared item (the meeting room X in the present example) and the communication terminal 2. When the event starts, the display control unit 24 of the communication terminal 2 causes the display 508 to display a display screen 100a as illustrated in
The various icons included in the display screen 100a displayed on the communication terminal 2 are examples of a receiving area. The receiving area is not limited to an image such as an icon or a button, and may be text such as “CHANGE” or a combination of an image and text. The image in this case is not limited to a symbol or an object, and may be any image viewable by the user, such as an illustration or a pattern. Further, the selection (i.e., pressing) of the various icons is an example of an operation performed on the various icons. The operation performed on the various icons includes an input operation performed on the display 508 with the keyboard 511 and the mouse 512, for example.
The user A is able to have a meeting in the meeting room X with the communication terminal 2. For example, if the user A of the communication terminal 2 presses the operation icon 125c, the receiving unit 22 of the communication terminal 2 receives the user selection of the operation icon 125c, and the display control unit 24 of the communication terminal 2 causes the display 508 to display the file data of the material file stored in the particular storage area of the storage unit 2000. The display control unit 24 may cause the display 508 to display, as well as the file data received at step S59, file data previously stored in the storage unit 2000 or file data newly generated in the started and ongoing event. In this case, the storage and reading unit 29 of the communication terminal 2 stores the file data generated or updated in the started and ongoing event in the particular storage area of the storage unit 2000.
A process of registering the executed event history will be described with
The determination unit 25 of the communication terminal 2 first determines the type of the content processing in the started and ongoing event (step S71). Specifically, if the content is the audio data generated through the recording by the image and audio processing unit 23, the determination unit 25 determines the type of the content processing as recording. If the content is the image data acquired through the snapshot (i.e., capture) by the image and audio processing unit 23, the determination unit 25 determines the type of the content processing as snapshot. If the content is the material file data transmitted by the communication unit 21, the determination unit 25 determines the type of the content processing as the transmission of the material.
Then, the communication unit 21 transmits registration request information to the sharing support server 6 (step S72). The registration request information represents the request to register the generated content. In this case, each time the content is generated, the communication unit 21 automatically transmits the registration request information to the sharing support server 6. The registration request information includes the executed event ID, the user ID of the transmission source of the content, the content data, and the content processing type information. Thereby, the communication unit 61 of the sharing support server 6 receives the registration request information.
Based on the content processing type information included in the registration request information received by the communication unit 61, the determination unit 65 of the sharing support server 6 determines the type of the received content processing (step S73). Then, if the determination unit 65 determines the type of the content processing as recording, the communication unit 61 transmits the audio data, which is the content data, to the audio-to-text conversion server 9 (step S74). Thereby, the communication unit 91 of the audio-to-text conversion server 9 receives the audio data. If the type of the content processing is determined to be other than recording, the sharing support server 6 proceeds to the process of step S77 without executing the processes of steps S74 to S76.
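For illustration only, the determination at step S73 and the branching at step S74 may be sketched in Python as follows: the audio data is forwarded to the audio-to-text conversion server only when the content processing type is recording, and the other types proceed directly to step S77. The send_to_conversion_server() helper is a placeholder assumed for the example and stands in for the transmission over the communication network N.

```python
# Hedged sketch of the dispatch at steps S73 and S74.
def handle_registration_request(request: dict) -> str | None:
    # Step S73: determine the type of the content processing from the content
    # processing type information in the registration request.
    if request["content_processing_type"] == "recording":
        # Step S74: transmit the audio data to the audio-to-text conversion server.
        return send_to_conversion_server(request["content_data"])
    # For snapshot or transmission of material, proceed to step S77 without conversion.
    return None


def send_to_conversion_server(audio_data: bytes) -> str:
    # Placeholder assumed for the example; in the embodiment this is a network
    # transmission to the audio-to-text conversion server 9, which returns the
    # text data converted from the audio data.
    return f"text converted from {len(audio_data)} bytes of audio"
```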
The conversion unit 93 of the audio-to-text conversion server 9 converts the audio data received by the communication unit 91 into text data (step S75).
The audio-to-text conversion process of the audio-to-text conversion server 9 will be described with
The conversion unit 93 first acquires information representing the date and time of reception of the audio data by the communication unit 91 (step S75-1). The information acquired at step S75-1 may be information representing the date and time of reception of the audio data by the sharing support server 6 or the date and time of transmission of the audio data by the sharing support server 6. In this case, the communication unit 91 of the audio-to-text conversion server 9 receives, at step S74, the audio data transmitted from the sharing support server 6 and the information representing the above-described date and time.
Then, the conversion unit 93 executes the process of converting the audio data received by the communication unit 91 into text data (step S75-2). Then, when the process of converting the audio data into text data is completed (YES at step S75-3), the conversion unit 93 proceeds to the process of step S75-4. The conversion unit 93 repeats the process of step S75-2 until the process of converting the audio data into text data is completed. When the audio data received by the communication unit 91 is converted into a particular amount of text, the conversion unit 93 determines that the process of converting the audio data into text data is completed. For example, when the audio data is converted into one sentence of text, the conversion unit 93 determines that the process of converting the audio data into text data is completed. The conversion unit 93 then generates the text data converted from the audio data (step S75-4). Thereby, the audio-to-text conversion server 9 converts the audio data transmitted from the sharing support server 6 into the text data. Since the audio-to-text conversion server 9 receives, as necessary, the audio data transmitted from the sharing support server 6, the audio-to-text conversion server 9 repeatedly executes the process illustrated in
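For illustration only, the loop of steps S75-1 to S75-4 may be sketched in Python as follows. The voice recognition itself is outside the scope of the sketch; the fragments passed to the function are assumed to be pieces of text already recognized from successive portions of the audio data, and the check for a sentence-ending character stands in for the determination that a particular amount of text has been produced.

```python
# Hedged sketch of the conversion loop at steps S75-1 to S75-4.
from datetime import datetime, timezone


def assemble_text(fragments) -> tuple[str, datetime]:
    """Accumulate recognized fragments until one sentence of text is complete."""
    received_at = datetime.now(timezone.utc)  # step S75-1: date and time of reception
    buffer = ""
    for fragment in fragments:                # step S75-2: repeated for incoming audio
        buffer += fragment
        if buffer.rstrip().endswith((".", "?", "!")):  # step S75-3: one sentence completed
            break
    return buffer, received_at                # step S75-4: the generated text data


text, received = assemble_text(["Let us start ", "the policy-making meeting", "."])
print(text)  # -> "Let us start the policy-making meeting."
```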
Referring back to
The communication unit 91 transmits the text data converted by the conversion unit 93 to the sharing support server 6 (step S76). In this step, the communication unit 91 transmits, as well as the text data, the information representing the date and time acquired at step S75-1 to the sharing support server 6.
Then, the generation unit 64 of the sharing support server 6 generates a unique content processing ID for identifying the content processing that has occurred in the event (step S77). The generation unit 64 further generates the URL of the content data representing the content (step S78). Then, for each executed event ID received at step S72, the storage and reading unit 69 of the sharing support server 6 manages, in the content management DB 6005 (see
If the content processing type is audio-to-text conversion, the start date and time and the end date and time of the content processing correspond to the date and time of conversion of the audio data into the text data. Herein, the date and time of conversion of the audio data into the text data corresponds to the date and time of transmission of the audio data by the communication unit 61 of the sharing support server 6 and the date and time of reception of the text data by the communication unit 61 of the sharing support server 6. Alternatively, the date and time of conversion of the audio data into the text data may correspond to the date and time of reception of the audio data by the communication unit 91 of the audio-to-text conversion server 9 and the date and time of transmission of the text data by the communication unit 91 of the audio-to-text conversion server 9. Further, if the content processing type is audio-to-text conversion, the start date and time and the end date and time of the content processing may be the same as the start date and time and the end date and time of the content processing related to the audio data that is to be converted into text data.
Further, if the content processing type is recording, snapshot, or transmission of material, the start date and time and the end date and time of the content processing correspond to the date and time of reception of the content data (e.g., the audio data, the image data, or the file data) by the communication unit 61 of the sharing support server 6. Alternatively, if the content processing type is recording, snapshot, or transmission of material, the start date and time and the end date and time of the content processing may correspond to the date and time of transmission of the content data by the communication unit 21 of the communication terminal 2. Further, if the content processing type is recording, the start date and time and the end date and time of the content processing may correspond to the start date and time and the end date and time of the recording by the image and audio processing unit 23. Further, if the content processing type is snapshot, the start date and time and the end date and time of the content processing may correspond to the date and time of snapshot (i.e., capture) by the image and audio processing unit 23.
The communication unit 61 further transmits the text data to the communication terminal 2 and to the display terminal 10 and the PC 5, with which the session is established (step S80). Thereby, the text data converted from the audio data is displayed in real time on the communication terminal 2 and the display terminal 10.
Then, as illustrated in
The storage and reading unit 69 then performs a search through the access management DB 6002 (see
Then, the communication unit 61 transmits executed event history registration request information to the schedule management server 8 (step S93). The executed event history registration request information represents the request to register the content data. The executed event history registration request information includes the executed event ID, the user ID of the transmission source of the content, the content data received at step S72, the content processing ID generated at step S77, the URL of the content data generated at step S78, the access ID and the access password read at step S92, and the start date and time and the end date and time of the content processing. Thereby, the communication unit 81 of the schedule management server 8 receives the executed event history registration request information.
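For illustration only, the executed event history registration request information of step S93 may be pictured as a single record carrying the items listed above. The field names and values in the following sketch are hypothetical and do not represent an actual data format of the communication system 1.

```python
# Hypothetical representation of the executed event history registration
# request information transmitted at step S93 (field names and values are illustrative).
executed_event_history_registration_request = {
    "executed_event_id": "E001",                  # executed event ID
    "source_user_id": "user01",                   # user ID of the transmission source of the content
    "content_data": b"...",                       # content data received at step S72
    "content_processing_id": "CP001",             # content processing ID generated at step S77
    "content_data_url": "https://example.com/contents/CP001",  # URL generated at step S78
    "access_id": "access01",                      # access ID read at step S92
    "access_password": "********",                # access password read at step S92
    "processing_start": "2021-07-30T10:00:00",    # start date and time of the content processing
    "processing_end": "2021-07-30T10:00:05",      # end date and time of the content processing
}
```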
Then, in the schedule management server 8, the authentication unit 82 executes the authentication of the sharing support server 6 with the access ID and the access password (step S94). This authentication process is similar to that of step S38, and thus description thereof will be omitted here. It is assumed in the following description that the sharing support server 6 is authenticated.
Then, the storage and reading unit 89 of the schedule management server 8 stores and manages the various data (or information) received at step S93 in the executed event history management DB 8008 (see
Further, the generation unit 83 of the schedule management server 8 generates the related information in which the content data received at step S93 is associated with each content generation time period (step S96). The content generation time period included in the related information is generated with the scheduled event start date and time stored in the event management DB 8005 and the start date and time and the end date and time of the content processing stored in the executed event history management DB 8008. That is, the content generation time period represents the time elapsed from the event start date and time to the time of generation of the content in the executed event. Then, the storage and reading unit 89 of the schedule management server 8 stores and manages the related information generated by the generation unit 83 in the related information management DB 8010 (see
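As a worked illustration of step S96, the content generation time period is simply the time elapsed from the scheduled event start date and time to the start and end of the content processing. The following minimal Python sketch uses illustrative date and time values only.

```python
# Minimal sketch of computing the content generation time period (step S96).
# All date and time values below are illustrative.
from datetime import datetime

event_start = datetime(2021, 7, 30, 10, 0, 0)         # scheduled event start date and time (event management DB 8005)
processing_start = datetime(2021, 7, 30, 10, 12, 30)  # start date and time of the content processing (DB 8008)
processing_end = datetime(2021, 7, 30, 10, 12, 35)    # end date and time of the content processing (DB 8008)

# Time elapsed from the event start to the generation of the content.
generation_period = (processing_start - event_start, processing_end - event_start)
print(generation_period)  # (datetime.timedelta(seconds=750), datetime.timedelta(seconds=755))
```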
The storage and reading unit 89 of the schedule management server 8 then stores and manages the text information, which includes the text data received at step S93, in the text information management DB 8012 (see
With the above-described process, the communication terminal 2 transmits the executed event ID of the ongoing event and the content generated in the event to the schedule management server 8. Further, for each executed event ID, the schedule management server 8 stores the received content in the executed event history management DB 8008. According to the communication system 1, therefore, the content generated in the executed event is stored for each event.
A process of correcting the text data in real time will be described with reference to
At steps S101 and S102, the editor acquires the URL of the storage location of the text data (see the “CONTENT” field in
At step S103, the communication unit 51 of the PC 5 connects to the URL of the storage location of the text data. It is assumed in
At steps S104, S105, and S106, the communication unit 21 of the communication terminal 2 transmits the audio data to the audio-to-text conversion server 9. The audio-to-text conversion server 9 then converts the audio data into the text data, as described above with
At steps S107, S108, and S109, the communication unit 61 of the sharing support server 6 transmits in real time the newly transmitted text data (i.e., the converted text data) to the PC 5, the display terminal 10, and the communication terminal 2, with which the session is established. When the communication unit 51 of the PC 5 receives the text data, the display control unit 54 of the PC 5 causes the display 508 of the PC 5 to display the newly received text data so that the newly received text data follows the latest text data being displayed. When the communication unit 11 of the display terminal 10 receives the text data, the display control unit 13 of the display terminal 10 similarly causes the display 508 of the display terminal 10 to display the newly received text data so that it follows the latest text data being displayed. An operation similar to the above-described operation also takes place in the communication terminal 2. Thereby, the text data displayed on the PC 5 or the display terminal 10 is synchronized in substantially real time with the voice uttered by the user.
At step S110, the editor starts editing the text data of the utterance, and the receiving unit 52 of the PC 5 receives the editing. Herein, to start editing refers to the editor making the cursor movable on the text data to specify the input position. The PC 5 may display an input field for inputting the text data, such as a dialogue box.
At step S111, the communication unit 51 of the PC 5 transmits an editing start notification to the sharing support server 6 by specifying the corresponding text ID.
At steps S112 and S113, the communication unit 61 of the sharing support server 6 receives the editing start notification, and the restriction unit 66 of the sharing support server 6 transmits, via the communication unit 61, an editing prohibition notification to the communication terminal 2 and the display terminal 10, with which the session is established, by specifying the text ID and the editor ID identified based on the sign-in process. Herein, the editor ID is transmitted to the communication terminal 2 and the display terminal 10 to allow the users thereof to recognize who is editing the text data. Alternatively, the editor name may be transmitted to the communication terminal 2 and the display terminal 10 in place of the editor ID.
At steps S114 and S115, the communication unit 21 of the communication terminal 2 and the communication unit 11 of the display terminal 10 receive the editing prohibition notification. Then, the receiving unit 22 of the communication terminal 2 and the receiving unit 12 of the display terminal 10 restrict the editing of the text data identified by the text ID. Preferably, the display control unit 24 of the communication terminal 2 and the display control unit 13 of the display terminal 10 highlight the text data identified by the text ID (i.e., the text data currently being edited) in its entirety, or highlight an edited character thereof. With the editing restricted, even if one of the users attempts to edit the text data, the cursor is not displayed, for example. Further, the display control units 24 and 13 display, as well as the currently edited text data, the editor ID (or the editor name associated with the editor ID) in the form of text or an icon.
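A minimal sketch of how a viewing terminal might handle the editing prohibition notification is given below. The class and method names are hypothetical and do not represent the disclosed configuration of the communication terminal 2 or the display terminal 10; the sketch merely illustrates that the text data identified by the text ID is locked, highlighted, and labeled with the editor.

```python
# Hypothetical sketch of a viewing terminal handling an editing prohibition
# notification (steps S114 and S115); all names are illustrative only.
class TextItem:
    def __init__(self, text_id, text):
        self.text_id = text_id
        self.text = text
        self.locked_by = None            # editor ID while editing is restricted

class ViewerTerminal:
    def __init__(self):
        self.items = {}                  # text ID -> TextItem

    def on_editing_prohibition(self, text_id, editor_id):
        """Restrict editing of the text data identified by the text ID and
        indicate who is editing it (e.g., by a highlight and an editor label)."""
        item = self.items[text_id]
        item.locked_by = editor_id       # editing restricted; the cursor is not shown for this item
        self.highlight(item)             # e.g., enclose the item in a colored frame
        self.show_editor_label(item, editor_id)

    def highlight(self, item): ...
    def show_editor_label(self, item, editor_id): ...
```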
At step S116, when the editor edits the text data (e.g., changes a given character to another character or adds or deletes a character), the receiving unit 52 of the PC 5 receives the editing.
At step S117, the communication unit 51 of the PC 5 transmits the edited text data to the sharing support server 6 by specifying the corresponding text ID. This transmission takes place in real time. Herein, "in real time" indicates that each time at least one character is changed, added, or deleted, the contents of the editing are transmitted. The communication unit 51 may transmit the entirety of the text data, or may transmit an edited character of the text data. If the communication unit 51 transmits the edited character of the text data, the communication unit 51 transmits the position of the edited character in the currently edited text data (i.e., the character number counted from the first character of the text data) and the post-editing state based on the change, addition, or deletion. If the editing is the change of a character, the communication unit 51 transmits a notification of change and the changed character. If the editing is the addition of a character, the communication unit 51 transmits the position of addition and the added character. If the editing is the deletion of a character, the communication unit 51 transmits a notification of deletion.
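Purely as an illustration of the per-character editing messages described above, the change, addition, and deletion cases may be pictured as follows. The message layout and field names are hypothetical, not a definitive wire format of the communication system 1.

```python
# Hypothetical examples of the editing messages transmitted at step S117.
# Each message carries the text ID, the position of the edited character counted
# from the first character of the text data, and the post-editing state.
change_message = {
    "text_id": "T001",
    "operation": "change",     # a given character is changed to another character
    "position": 4,             # position of the edited character in the text data
    "character": "s",          # the changed (new) character
}
add_message = {
    "text_id": "T001",
    "operation": "add",        # a character is added
    "position": 10,            # position of addition
    "character": "a",          # the added character
}
delete_message = {
    "text_id": "T001",
    "operation": "delete",     # a character is deleted
    "position": 7,             # position of the deleted character
}
```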
At steps S118 and S119, in response to receipt of the edited text data, the communication unit 61 of the sharing support server 6 reflects the editing in the text data, and transmits in real time the contents of the editing to the communication terminal 2 and the display terminal 10 (and another PC 5 for editing, if any), with which the session is established. The text data may be being displayed on the communication terminal 2 and the display terminal 10, for example. When the content of the utterance is corrected on the PC 5, therefore, the text data displayed on the display terminal 10 is synchronized in substantially real time with the corrected content.
The communication unit 61 of the sharing support server 6 may transmit, as well as the edited text data, the user identification information of the user (i.e., the editor) currently editing the text data to the communication terminal 2 and the display terminal 10. Then, if the user identification information of the editor who has started the editing (i.e., the user identification information acquired at steps S112 and S113) matches the user identification information transmitted from the sharing support server 6, the communication terminal 2 and the display terminal 10 may reflect the editing in the text data identified by the text ID. Alternatively, when the received text data does not match the original text data, the communication terminal 2 and the display terminal 10 may reflect the editing in the text data. That is, the communication unit 61 does not necessarily transmit an instruction to reflect the editing in the text data.
At steps S120 and S121, the communication unit 21 of the communication terminal 2 and the communication unit 11 of the display terminal 10 receive the edited text data, and the editing is reflected in the text data identified by the text ID. That is, the display control unit 24 of the communication terminal 2 and the display control unit 13 of the display terminal 10 replace the entirety of the currently displayed text data with the received text data, or replace the corresponding character of the currently displayed text data with the edited character of the received text data. Preferably, the display control units 24 and 13 cause the display 508 to display in highlight the entirety of the edited text data or the edited character of the edited text data. The display in highlight will be described later.
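The following is a minimal Python sketch of how a receiving terminal might reflect such a per-character edit in the text data being displayed (steps S120 and S121). The helper function is hypothetical and assumes zero-based character positions and the illustrative message layout shown above.

```python
# Hypothetical sketch of reflecting a per-character edit in the displayed text
# (steps S120 and S121). Positions are assumed to be zero-based here.
def apply_edit(text: str, message: dict) -> str:
    pos = message["position"]
    if message["operation"] == "change":
        return text[:pos] + message["character"] + text[pos + 1:]
    if message["operation"] == "add":
        return text[:pos] + message["character"] + text[pos:]
    if message["operation"] == "delete":
        return text[:pos] + text[pos + 1:]
    return text

displayed = "The shape of large-scale vaccination"
print(apply_edit(displayed, {"text_id": "T001", "operation": "delete", "position": 4}))
# -> "The hape of large-scale vaccination"
```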
If the editor confirms the editing by pressing the return key, completes the editing by pressing the escape key, or releases the selection of the currently edited text data by selecting another text data, the cursor disappears, and the communication unit 51 of the PC 5 transmits a release notification to the sharing support server 6. In response to receipt of the release notification, the restriction unit 66 of the sharing support server 6 lifts the restriction on the display terminal 10 and the communication terminal 2.
A display example of the text data will be described.
The currently recognized text 1008 displays part of the text data of an utterance of a user that is currently undergoing voice recognition. The currently recognized text 1008 is a character string that is yet to be confirmed as an uttered sentence.
A text data editing screen displayed by the PC 5 will be described.
As a supplementary explanation of the editor, the editor is a person who corrects the text data for those who want to view, in real time, the text data corresponding to the audio data of the utterances made in the meeting, or a person who corrects the text data to be used as the minutes of the meeting. Herein, those who want to view the text data in real time include a person with hearing loss or difficulty. The present embodiment, however, is also useful to other people.
For example, in the present example, the editor has corrected the text data item 1005 from “The shape of large-scale vaccination is the Tokyo branch in Chuo Ward, Chiba City” to “The site of large-scale vaccination is the Tokyo branch in Chuo Ward, Chiba City.”
The corrected text data item 1005 reading “The site of large-scale vaccination is the Tokyo branch in Chuo Ward, Chiba City” or the changed word “site” made of four characters is transmitted to the display terminal 10 via the sharing support server 6. Therefore, the text data being displayed by the display terminal 10 is changed in real time.
The currently recognized text 1008 and a chat input field 1101 in
A detailed display example of the text data editing screen will be described.
It is assumed here that the text data item 1005 has been selected. During the editing, the text data item 1005 is displayed in highlight with the frame 1202 in a color such as yellow, for example. The display terminal 10 and the communication terminal 2 are also notified of the text data item 1005 being edited. On the display terminal 10 and the communication terminal 2, therefore, the text data item 1005 is displayed in highlight in a similar manner to that on the PC 5. Further, the editing of the text data item 1005 is restricted on the display terminal 10 and the communication terminal 2; the users of the display terminal 10 and the communication terminal 2 are prevented from placing the cursor over the text data item 1005.
The frame 1202 is displayed in any color with a highlighting effect (e.g., red or orange), and may be flashed by the display control unit 54. The display control unit 54 may change the color of the area of the text data item 1005, change the color of the text data item 1005, or increase the font size or line width of the text data item 1005.
The text data item 1005 is also displayed with a frame 1010 on the text display screen 1900 displayed by the display terminal 10, indicating that the text data item 1005 is being edited. That is, the text data item 1005 is displayed as enclosed in a frame. The frame 1010 may be displayed in a color with a highlighting effect (e.g., red or orange), and may be flashed by the display control unit 13. The display control unit 13 may change the color of the area of the text data item 1005, change the color of the text data item 1005, or increase the font size or line width of the text data item 1005.
The screen displayed by the PC 5 and the screen displayed by the display terminal 10 are not necessarily synchronized with each other (the PC 5 and the display terminal 10 are different in screen size, for example). Therefore, some of the text data displayed by the PC 5 may not be displayed by the display terminal 10. If the editor edits text data that is displayed by the PC 5 but not by the display terminal 10, the display terminal 10 may temporarily hold the edited text data and then display the edited text data, as illustrated in
The edited text data item 1005 is registered in the text information management table in
Each time the text data is edited by the editor, the contents of the editing are transmitted to the sharing support server 6, and the text information management table is updated. The updated text data is transmitted to the display terminal 10. Thereby, the text data displayed by the display terminal 10 is also updated in real time. If the editor edits the text “shape” into “site,” the change from “shape” to “site” also occurs in substantially real time in the text data displayed by the display terminal 10, as illustrated in
If the user taps the edited text data item 1005, pre-editing characters 1013 (i.e., “shape”) and post-editing characters (i.e., “site”) are displayed at the same time. Thereby, the user is able to check both pre-editing text data and post-editing text data. Alternatively, the display control unit 13 of the display terminal 10 may display, instead of the changed characters, the entirety of the pre-editing text data item 1005 and the entirety of the post-editing text data item 1005.
A description will be given of the display of content after the completion of the meeting.
As described above, according to the sharing support server 6 of the present embodiment, the text data edited by the editor is displayed in substantially real time on the display terminal 10 of the viewer. For example, in a case in which a person with hearing difficulty relies on the text data generated by the voice recognition to understand the contents of the meeting, the text data is promptly corrected, helping the person to correctly understand the contents of the meeting. Further, in a case in which the editor later corrects the text data, an increase in the volume of text data makes it difficult for the editor to accurately correct the text data due to the limited memory capacity of the human brain. The present embodiment facilitates real-time correction of the text data by the editor, thereby reducing the workload on the editor.
In the above-described configuration examples such as the example in
The apparatuses described in the exemplary embodiment form one of a plurality of computing environments for implementing the embodiment disclosed in the present specification. In an embodiment of the present invention, the sharing support server 6 is a server cluster including a plurality of computing devices configured to communicate with each other via a desired type of communication link such as a network or a shared memory, for example, to execute the processes disclosed in the present specification.
Further, the sharing support server 6 may be configured to share the process steps disclosed in the embodiment, such as those illustrated in
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a processor programmed to perform the recited functions with software, such as a processor implemented by an electronic circuit, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit modules arranged to perform the recited functions. Further, the above-described steps are not limited to the order disclosed herein.
Number | Date | Country | Kind
2021-126102 | Jul. 2021 | JP | national