This patent application is based on and claims priority pursuant to 35 U.S.C. § 119(a) to Japanese Patent Application No. 2021-140045, filed on Aug. 30, 2021, in the Japan Patent Office, the entire disclosure of which is hereby incorporated by reference herein.
Embodiments of the present disclosure relate to a display apparatus, a display system, a display control method, and a non-transitory recording medium.
Efficient editing that reuses original image information can be performed by editing a range, such as a part of an entry section or a part of a table, included in an image acquired by a device such as a scanner or a facsimile machine. Such editing is performed by deleting information in a specified range of the image and then inserting information such as characters into the specified range.
An embodiment of the present disclosure includes a display apparatus including circuitry to display, on a display, an image including a table, receive an operation of specifying a range to be edited in the image, acquire coordinates of lines of the table in the range, and change a color of pixels other than pixels corresponding to the lines in the range to a predetermined color.
An embodiment of the present disclosure includes a display system including a display apparatus and a terminal device communicably connected to the display apparatus. The display apparatus includes display apparatus circuitry to display, on a display, an image including a table, receive an operation of specifying a range to be edited in the image, acquire coordinates of lines of the table in the range, and change a color of pixels other than pixels corresponding to the lines in the range to a predetermined color. The terminal device includes terminal device circuitry configured to output, on a screen of the display apparatus, the image including the table.
An embodiment of the present disclosure includes a display system including a display apparatus and a storage server communicably connected to each other. The display apparatus includes display apparatus circuitry to display, on a display, an image including a table, receive an operation of specifying a range to be edited in the image, acquire coordinates of lines of the table in the range, and change a color of pixels other than pixels corresponding to the lines in the range to a predetermined color. The storage server includes storage server circuitry to store image data representing the image including the table and transmit the image data to the display apparatus. The display apparatus displays the image including the table based on the image data received from the storage server.
An embodiment of the present disclosure includes a display control method including displaying, on a display, an image including a table, receiving an operation of specifying a range to be edited in the image, acquiring coordinates of lines of the table in the range, and changing a color of pixels other than pixels corresponding to the lines in the range to a predetermined color.
An embodiment of the present disclosure includes a non-transitory recording medium storing a plurality of instructions which, when executed by one or more processors, causes the processors to perform a method. The method includes displaying, on a display, an image including a table, receiving an operation of specifying a range to be edited in the image, acquiring coordinates of lines of the table in the range, and changing a color of pixels other than pixels corresponding to the lines in the range to a predetermined color.
A more complete appreciation of the disclosure and many of the attendant advantages and features thereof can be readily obtained and understood from the following detailed description with reference to the accompanying drawings, wherein:
The accompanying drawings are intended to depict embodiments of the present invention and should not be interpreted to limit the scope thereof. The accompanying drawings are not to be considered as drawn to scale unless explicitly noted. Also, identical or similar reference numerals designate identical or similar components throughout the several views.
In describing embodiments illustrated in the drawings, specific terminology is employed for the sake of clarity. However, the disclosure of this specification is not intended to be limited to the specific terminology so selected and it is to be understood that each specific element includes all technical equivalents that have a similar function, operate in a similar manner, and achieve a similar result.
Referring now to the drawings, embodiments of the present disclosure are described below. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
A display apparatus, a display system, a display control method, and a non-transitory recording medium according to one or more embodiments of the present disclosure are described below with reference to the attached drawings.
An electronic whiteboard 2 is connected to the terminal device 4, an image forming apparatus 9, and a storage server 3 via a communication network 1. Alternatively, the electronic whiteboard 2 or the display 6 is connected to a projector 8 or the terminal device 4 via a display cable. The electronic whiteboard 2 displays, on the screen, image data acquired from the terminal device 4, the image forming apparatus 9, or the storage server 3 via the communication network 1. Alternatively, the electronic whiteboard 2 and the display 6 display, on the screen, an image output from the terminal device 4 or the projector 8 connected by a cable. The storage server 3 stores image data to be displayed on the display apparatus 7. The image forming apparatus 9 acquires image data to be displayed on the display apparatus 7 by using a scan function. Alternatively, the image data may be captured by a camera, or may be an electronic file created by an application executed by the terminal device 4, such as a document creation, spreadsheet, presentation, or image editing application, for example.
The configuration of the display system 5 described above is merely one example, and the display system 5 may have any other suitable configuration.
The CPU 201 controls the entire operation of the electronic whiteboard 2. The ROM 202 stores a control program such as an initial program loader (IPL) to boot the CPU 201. The RAM 203 is used as a work area for the CPU 201. The SSD 204 stores various data such as a control program for the electronic whiteboard. The network I/F 205 controls communication with an external device through the communication network 1.
The external device connection I/F 206 is an interface for connecting various external devices. Such external devices include a universal serial bus (USB) memory 230 and other external devices (a microphone 240, a speaker 250, and a camera 260).
The electronic whiteboard 2 further includes a capture device 211, a graphics processing unit (GPU) 212, a display controller 213, a contact sensor 214, a sensor controller 215, an electronic pen controller 216, a near-range communication circuit 219, an antenna 219a for the near-range communication circuit 219, a power switch 222, and selection switches 223.
The capture device 211 acquires image (video) information as a still image or a moving image from the terminal device 4 such as a personal computer (PC), which is external to the electronic whiteboard 2, and causes a display 280 to display the still image or the moving image. The GPU 212 is a semiconductor chip dedicated to graphics processing. The display controller 213 controls screen display to output an image processed by the GPU 212 to the display 280. The contact sensor 214 detects a touch on the display 280 with an electronic pen 290 or a user's hand H. The sensor controller 215 controls operation of the contact sensor 214. The contact sensor 214 inputs and detects coordinates by using an infrared blocking system. The inputting and detecting of coordinates may be performed as follows. For example, two light receiving and emitting devices are disposed at both ends of the upper face of the display 280, and a reflector frame surrounds the periphery of the display 280. The light receiving and emitting devices emit a plurality of infrared rays in parallel to the surface of the display 280. The rays are reflected by the reflector frame, and a light-receiving element receives light returning along the same optical path as that of the emitted infrared rays.
The contact sensor 214 outputs, to the sensor controller 215, an identifier (ID) of the infrared ray that is blocked by an object after being emitted from the light receiving and emitting devices. Based on the ID of the infrared ray, the sensor controller 215 detects the specific coordinates touched by the object. The electronic pen controller 216 communicates with the electronic pen 290 to detect contact of the tip or the bottom of the electronic pen 290 with the display 280. The near-range communication circuit 219 is a communication circuit in compliance with, for example, near field communication (NFC) or BLUETOOTH. The power switch 222 turns the power of the electronic whiteboard 2 on or off. The selection switches 223 are a group of switches for adjusting, for example, brightness and hue of the display on the display 280.
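As a purely hypothetical illustration of how blocked-ray IDs might be converted into a contact coordinate, the following Python sketch assumes evenly spaced rays identified by sequential integer IDs and takes the center of the blocked bundle as the contact position; the ray pitch, the ID scheme, and the averaging are assumptions for illustration only and are not part of the embodiment.

```python
# Hypothetical sketch: map IDs of blocked infrared rays to an (x, y) contact
# point, assuming rays are evenly spaced across the display surface.

def blocked_rays_to_coordinate(blocked_x_ids, blocked_y_ids, ray_pitch_mm=2.0):
    """Return an approximate (x, y) contact point in millimeters.

    blocked_x_ids: IDs of blocked rays that locate the contact along x.
    blocked_y_ids: IDs of blocked rays that locate the contact along y.
    ray_pitch_mm:  assumed spacing between adjacent rays.
    """
    if not blocked_x_ids or not blocked_y_ids:
        return None  # no contact detected
    # Use the center of the blocked bundle as the contact position.
    x = sum(blocked_x_ids) / len(blocked_x_ids) * ray_pitch_mm
    y = sum(blocked_y_ids) / len(blocked_y_ids) * ray_pitch_mm
    return (x, y)
```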
The electronic whiteboard 2 further includes a bus line 210. The bus line 210 is an address bus or a data bus that electrically connects the above-described components to one another.
The contact sensor 214 is not limited to the infrared blocking system type, and may be a different type of detector, such as a capacitance touch panel that identifies a contact position by detecting a change in capacitance, a resistance film touch panel that identifies a contact position by detecting a change in voltage of two opposed resistance films, or an electromagnetic induction touch panel that identifies a contact position by detecting electromagnetic induction caused by contact of an object to a display. In addition to or in alternative to detecting a touch by the tip or bottom of the electronic pen 290, the electronic pen controller 216 may also detect a touch by another part of the electronic pen 290, such as a part held by a hand of the user.
The CPU 501 controls the entire operation of each of the terminal device 4 and the storage server 3. The ROM 502 stores programs such as an initial program loader (IPL) to boot the CPU 501. The RAM 503 is used as a work area for the CPU 501. The HD 504 stores various data such as a program. The HDD controller 505 controls reading and writing of various data from and to the HD 504 under control of the CPU 501. The display 506 displays various information such as a cursor, a menu, a window, a character, or an image. The external device connection I/F 508 is an interface that connects to various external devices. Examples of the external devices include, but are not limited to, a universal serial bus (USB) memory and a printer. The network I/F 509 is an interface for performing data communication using a network, such as the Internet. The bus line 510 is an address bus, a data bus, or the like that electrically connects the above-described components, such as the CPU 501, to one another.
The keyboard 511 is an example of an input device provided with a plurality of keys used to input characters, numerals, or various instructions. The pointing device 512 is an example of an input device that allows a user to select or execute various instructions, select an item for processing, or move a cursor being displayed. The DVD-RW drive 514 reads and writes various data from and to a DVD-RW 513, which is an example of a removable storage medium (recording medium). The removable storage medium is not limited to the DVD-RW 513 and may be, for example, a digital versatile disc recordable (DVD-R). The medium I/F 516 controls reading and writing (storing) of data from and to a storage medium 515 such as a flash memory.
The display control unit 10 displays image information of image data on a display screen of the display apparatus 7.
The operation reception unit 11 receives an operation such as inputting characters or pressing buttons by a user via a touch panel of the display apparatus 7 or the keyboard or the pointing device of the terminal device 4.
The communication unit 12 is, for example, a function of the display apparatus 7, and receives image data from the storage server 3 via the communication network 1.
The storage unit 13 stores image data to be displayed on a screen of the display apparatus 7, for example. The storage unit 13 also stores an edited image obtained after editing an image displayed on the display apparatus 7 as image data.
The identification unit 14 identifies each of the lines corresponding to grid lines of a table present in a range specified by a user (specified range) on the screen of the display apparatus 7. The identification unit 14 also acquires information on the coordinates of end points and intersection points of the straight lines of the table and on the color and thickness of each line. As a method of identifying each of the lines of the table, for example, a horizontal length and a vertical length of the specified range are calculated, and a straight line having a length equal to or greater than a value obtained by dividing the corresponding length by a predetermined ratio is set as one of the lines.
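For illustration, the following is a minimal Python sketch of the length-threshold test described above. It assumes that candidate line segments have already been extracted from the image (for example, by run-length analysis or a Hough transform) as coordinate tuples; the segment-extraction step, the data representation, and the value of the ratio are assumptions, not part of the embodiment.

```python
# Minimal sketch: keep only segments long enough, relative to the specified
# range, to be treated as table lines.

def select_table_lines(segments, spec_range, ratio=4.0):
    """Return the segments treated as table lines.

    segments:   iterable of (x1, y1, x2, y2) tuples in image coordinates
    spec_range: (left, top, right, bottom) of the user-specified range
    ratio:      a roughly horizontal segment qualifies if it is at least
                (range width / ratio) long; vertical segments use the height
    """
    left, top, right, bottom = spec_range
    width, height = right - left, bottom - top
    table_lines = []
    for x1, y1, x2, y2 in segments:
        if abs(y2 - y1) <= abs(x2 - x1):            # roughly horizontal
            if abs(x2 - x1) >= width / ratio:
                table_lines.append((x1, y1, x2, y2))
        else:                                        # roughly vertical
            if abs(y2 - y1) >= height / ratio:
                table_lines.append((x1, y1, x2, y2))
    return table_lines
```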
The acquisition unit 15 acquires a background color of a closed region surrounded by lines of the table (a cell region of the table) or of a closed region surrounded by one or more lines of the table and one or more boundary lines of the specified range. As a method of acquiring the background color, for example, the colors of pixels adjacent to a boundary line in the specified range are acquired, and the most frequently occurring color is set as the background color. For example, an RGB (red, green, and blue) color model is used as the pixel format when acquiring a color. One pixel is represented by three values (for example, 8 bits each) indicating the brightness of red, green, and blue. These values are referred to as the pixel values of the pixel.
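The following is a minimal sketch of the majority-vote background-color estimate described above. It assumes the image is accessible as rows of (R, G, B) tuples and that "pixels adjacent to a boundary line" means the one-pixel ring just inside the closed region; both representations are illustrative assumptions.

```python
# Minimal sketch: the most frequent RGB value on the inner border of a region
# is taken as that region's background color.

from collections import Counter

def estimate_background_color(image, region):
    """Return the most frequent (r, g, b) value on the border of region.

    image:  image[y][x] -> (r, g, b), each component 0-255
    region: (left, top, right, bottom), inclusive pixel coordinates
    """
    left, top, right, bottom = region
    border = []
    for x in range(left, right + 1):
        border.append(image[top][x])
        border.append(image[bottom][x])
    for y in range(top + 1, bottom):
        border.append(image[y][left])
        border.append(image[y][right])
    color, _count = Counter(border).most_common(1)[0]
    return color
```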
The redisplay unit 16 changes the color of pixels in the specified range other than pixels at the coordinates of the lines of the table to a predetermined color, or changes all the pixels in each closed region to the same color as the background color, thereby displaying an image in which information such as characters is deleted while the lines of the table in the specified range are kept. For example, the predetermined color may be set by a user or a designer by using the RGB color model.
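A minimal sketch of this redisplay processing is given below. It assumes a mutable image buffer and a set of pixel coordinates occupied by the table lines; both are illustrative representations rather than the embodiment's actual data structures. The same routine covers both variants described above: fill_color may be the background color acquired by the acquisition unit 15 or the predetermined color.

```python
# Minimal sketch: overwrite every pixel of a region, except the table-line
# pixels, with a single fill color.

def clear_region_keep_lines(image, region, line_pixels, fill_color):
    """Fill region with fill_color while preserving table-line pixels.

    image:       mutable buffer, image[y][x] = (r, g, b)
    region:      (left, top, right, bottom), inclusive pixel coordinates
    line_pixels: set of (x, y) coordinates belonging to the table lines
    fill_color:  (r, g, b) background or predetermined color
    """
    left, top, right, bottom = region
    for y in range(top, bottom + 1):
        for x in range(left, right + 1):
            if (x, y) not in line_pixels:
                image[y][x] = fill_color
```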
The assigning unit 17 assigns an address for identifying each closed region, which is a region surrounded by lines of a table in a specified range or a region surrounded by one or more lines of the table and one or more boundary lines of the specified range.
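One possible way to assign such addresses is sketched below under the assumption that the closed regions form a grid delimited by the vertical and horizontal table lines and the edges of the specified range; indexing each cell by its (row, column) position is an illustrative choice, not the only possibility.

```python
# Minimal sketch: assign a (row, column) address to each closed region of a
# grid formed by the table lines and the edges of the specified range.

def assign_cell_addresses(spec_range, v_line_xs, h_line_ys):
    """Return {(row, col): (left, top, right, bottom)} for each closed region.

    spec_range: (left, top, right, bottom) of the specified range
    v_line_xs:  x-coordinates of vertical table lines inside the range
    h_line_ys:  y-coordinates of horizontal table lines inside the range
    """
    left, top, right, bottom = spec_range
    xs = sorted({left, right, *v_line_xs})
    ys = sorted({top, bottom, *h_line_ys})
    cells = {}
    for row in range(len(ys) - 1):
        for col in range(len(xs) - 1):
            cells[(row, col)] = (xs[col], ys[row], xs[col + 1], ys[row + 1])
    return cells
```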
Step S21: The operation reception unit 11 receives a user operation for specifying a range to be edited by using the hand H or the electronic pen 290 with respect to an image being displayed on the screen of the display apparatus 7. As a method of specifying a range, for example, assuming that a range to be specified is a rectangle, the user specifies or designates a pair of diagonal vertices of the rectangle. The range that is specified is referred to as a specified range.
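A minimal sketch of deriving the specified range from the two diagonal vertices is given below; the two points may be tapped in either order, and the coordinate convention is an assumption for illustration.

```python
# Minimal sketch: normalize two diagonal corner points into a rectangle.

def range_from_diagonal(p1, p2):
    """Return (left, top, right, bottom) from two diagonal corner points."""
    (x1, y1), (x2, y2) = p1, p2
    return (min(x1, x2), min(y1, y2), max(x1, x2), max(y1, y2))
```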
Step S22: The identification unit 14 identifies the lines of the part of the table present in the specified range.
Step S23: The acquisition unit 15 acquires a background color of each closed region surrounded by lines of the table (a cell region of the table) in the specified range 52 or of each closed region surrounded by one or more lines of the table and one or more boundary lines of the specified range 52. When there are a plurality of closed regions, the acquisition unit 15 acquires the background color for each closed region. As a method of acquiring the background color, for example, the colors of pixels adjacent to a boundary line in the specified range are acquired, and the most frequently occurring color is set as the background color. For example, the RGB color model is used as the pixel format when acquiring a color. In the following description of the embodiments, an area for which the background color is acquired is simply referred to as a “closed region”.
Step S24: The redisplay unit 16 changes, in each closed region, the color of all pixels that differ from the background color, other than the pixels at the coordinates of the lines of the table, to the same color as the background color, thereby displaying an image in which information including characters is deleted while the lines remain in the specified range.
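The following sketch ties steps S23 and S24 together by reusing the hypothetical helper functions estimate_background_color and clear_region_keep_lines from the earlier sketches; it is a composition example under those assumptions, not the embodiment's actual implementation.

```python
# Hypothetical composition of steps S23-S24: for each addressed closed region,
# estimate its background color, then overwrite its non-line pixels with it.
# Relies on estimate_background_color and clear_region_keep_lines defined in
# the sketches above.

def delete_characters_keep_lines(image, cells, line_pixels):
    """Clear character pixels in every closed region while keeping table lines.

    cells: {(row, col): (left, top, right, bottom)} as produced by
           assign_cell_addresses in the earlier sketch
    """
    for region in cells.values():
        background = estimate_background_color(image, region)
        clear_region_keep_lines(image, region, line_pixels, background)
```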
Alternatively, the redisplay unit 16 may change all pixels within the specified range other than the pixels corresponding to the drawing positions of the lines of the table to a predetermined color. When this method is used, the background color of each closed region does not have to be acquired in step S23. However, when the background color within the specified range is different from the predetermined color, the original background color is changed. In this case, the characters present in the specified range 62 are deleted by being overwritten with the predetermined color.
The redisplay unit 16 may display the lines of the table again after changing the colors of all the pixels in each closed region including the coordinates of the lines or the colors of all the pixels in the specified range, thereby obtaining the same result as the above-described processing.
Step S25: The operation reception unit 11 receives a user operation of inputting a character in the specified range. A character can be input via the touch panel of the display apparatus 7 by using the hand H or the electronic pen 290, in substantially the same manner as the operations described above.
In addition, the storage unit 13 may store an image edited on the screen of the display apparatus 7 as image data.
A first variation of the first embodiment is described below, in particular, regarding differences from the first embodiment.
Step S31: The assigning unit 17 assigns an address for identifying each closed region, which is a region surrounded by lines of the table in the specified range or a region surrounded by one or more lines of the table and one or more boundary lines of the specified range.
Step S32: The operation reception unit 11 receives an operation of specifying a closed region to be edited by the user. The closed region to be edited can be specified by a user operation of clicking the closed region using the hand H or the electronic pen 290, for example. In response to a user operation of clicking on a closed region to be specified, the display control unit 10 can change the background color of the specified closed region to allow the user to confirm that the specifying operation has been accepted.
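A minimal sketch of resolving which addressed closed region was clicked is given below; it assumes the (row, column) cell map of the earlier address-assignment sketch and a click point expressed in the same pixel coordinates.

```python
# Minimal sketch: hit-test a click point against the addressed closed regions.

def find_clicked_cell(cells, point):
    """Return the (row, col) address of the cell containing point, or None.

    cells: {(row, col): (left, top, right, bottom)} as produced by
           assign_cell_addresses in the earlier sketch
    point: (x, y) click coordinates
    """
    x, y = point
    for address, (left, top, right, bottom) in cells.items():
        if left <= x <= right and top <= y <= bottom:
            return address
    return None
```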
In the subsequent step S24, in the first variation, the closed region specified by the user is overwritten with the background color, and the character strings are deleted. In step S25, character input is performable in the closed region specified by the user.
Further, after the execution of step S23 and before the transition to step S31, the display control unit 10 may display a screen that allows the user to select whether to delete the character strings of all the closed regions or to specify a particular closed region to be edited.
As a second variation of the first embodiment of the present disclosure, a case where no line of the table is present in the range specified by the user in step S21 is described below.
In the first embodiment of the present disclosure, the display apparatus 7 can delete the characters while keeping the table format, within the range specified by a user (specified range) in the image displayed on the screen. In addition, the display apparatus 7 can allow the user to specify a closed region to be edited from among the closed regions, each of which is a region surrounded by lines of the table in the specified range or a region surrounded by one or more lines of the table and one or more boundary lines of the specified range.
For example, there is a use case in which a user connects a laptop computer to the electronic whiteboard 2 by a cable to output material (an image) including a table being displayed on a screen of the laptop computer to the electronic whiteboard 2, so that the user can hold a discussion with other users while viewing the image being displayed on the electronic whiteboard 2. In such a case, the user can edit the displayed table by using a function or an application of the electronic whiteboard 2 without operating an application, such as a spreadsheet or presentation application, on the laptop computer. In other words, the user can edit the table displayed on the electronic whiteboard 2 without activating the application associated with the file that is the source of the image displayed on the electronic whiteboard 2.
The description above concerns some embodiments of the present disclosure. Embodiments of the present disclosure are not limited to the specific embodiments described above, and various modifications and replacements are possible within the scope of aspects of the disclosure.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Here, the “processing circuit or circuitry” in the present specification includes a programmed processor that executes each function by software, such as a processor implemented by an electronic circuit, and devices such as an application specific integrated circuit (ASIC), a digital signal processor (DSP), a field programmable gate array (FPGA), and conventional circuit modules arranged to perform the recited functions.
The group of apparatuses or devices described above is one example of plural computing environments that implement the embodiments disclosed in this specification. In some embodiments, the display system 5 includes multiple computing devices, such as a server cluster. The plurality of computing devices is configured to communicate with one another through any type of communication link, including a network, shared memory, etc., and perform the processes disclosed herein.
In the related art, characters cannot be deleted while the lines of a table present in a specified range of an image are kept.
According to one or more embodiments of the present disclosure, characters can be deleted while the lines of a table present in a specified range of an image are kept.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, elements and/or features of different illustrative embodiments may be combined with each other and/or substituted for each other within the scope of the present invention. Any one of the above-described operations may be performed in various other ways, for example, in an order different from the one described above.
The present disclosure can be implemented in any convenient form, for example using dedicated hardware, or a mixture of dedicated hardware and software. The present disclosure may be implemented as computer software implemented by one or more networked processing apparatuses. The network can include any conventional terrestrial or wireless communications network, such as the Internet. The processing apparatuses can include any suitably programmed apparatus such as a general purpose computer, personal digital assistant, mobile telephone (such as a wireless application protocol (WAP) or 3G-compliant phone), for example. Since the present disclosure can be implemented as software, each or every aspect of the present disclosure thus encompasses computer software implementable on a programmable device. The computer software can be provided to the programmable device using any storage medium for storing processor readable code such as a floppy disk, hard disk, CD ROM, magnetic tape device or solid state memory device.
The hardware platform includes any desired kind of hardware resources including, for example, a CPU, a RAM, and an HDD. The CPU may be implemented by any desired number of processors of any desired kind. The RAM may be implemented by any desired kind of volatile or non-volatile memory. The HDD may be implemented by any desired kind of non-volatile memory capable of storing a large amount of data. The hardware resources may additionally include an input device, an output device, or a network device, depending on the type of the apparatus. Alternatively, the HDD may be provided outside the apparatus as long as the HDD is accessible. In this example, the CPU, such as a cache memory of the CPU, and the RAM may function as a physical memory or a primary memory of the apparatus, while the HDD may function as a secondary memory of the apparatus.