The present disclosure relates to an information processing device in which it is possible to draw (input) drawing information on a display with a touch pen, and also relates to an information processing method and an information processing program.
Conventionally, an electronic board (also referred to as an electronic blackboard or an electronic whiteboard) is known as one of the display devices (information processing devices) that use a touch panel to receive an instruction input (touch) from a user. The electronic board reads the position coordinates of information (an object) handwritten with a touch pen or the like on the touch panel, recognizes the object as a character based on the read position coordinates, converts the character into text, and displays the converted text on a display.
A tag image, in which a so-called tag paper is electronically represented, has been proposed for the electronic board. A tag function enables various opinions to be displayed in tag images at a meeting or the like. However, a conventional tag function only provides a mode for displaying a plurality of tag images on a display, a mode for moving the tag images to any position, and the like, and it is therefore difficult to efficiently create materials such as meeting minutes.
An object of the present disclosure is to provide an information processing device capable of efficiently creating materials utilizing a tag function, and also to provide an information processing method and an information processing program.
An information processing device according to an aspect of the present disclosure includes an input detector that detects a designated position of a display, a tag generator that generates a tag image by associating the position detected by the input detector with input information to be input and displays the generated tag image on the display, and a display processor that displays, on the display, a plurality of pieces of the input information associated with each of a plurality of the tag images generated by the tag generator, based on each of attributes of the plurality of tag images.
An information processing method according to another aspect of the present disclosure includes using one or more processors to execute: an input detection process for detecting a designated position of a display; a tag generation process for generating a tag image by associating the position detected in the input detection process with input information to be input, and displaying the generated tag image on the display; and a display process for displaying, on the display, a plurality of pieces of the input information associated with each of a plurality of the tag images generated in the tag generation process, based on each of attributes of the plurality of tag images.
In a non-transitory recording medium storing an information processing program according to another aspect of the present disclosure, the program causes one or more processors to execute: an input detection process for detecting a designated position of a display; a tag generation process for generating a tag image by associating the position detected in the input detection process with input information to be input, and displaying the generated tag image on the display; and a display process for displaying, on the display, a plurality of pieces of the input information associated with each of a plurality of the tag images generated in the tag generation process, based on each of attributes of the plurality of tag images.
According to the present disclosure, it is possible to provide an information processing device capable of efficiently creating materials utilizing a tag function, and also to provide an information processing method and an information processing program.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description with reference where appropriate to the accompanying drawings. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
An embodiment of the present disclosure will be described below with reference to the attached drawings. The following embodiment is an example in which the present disclosure is embodied, and does not intend to limit the technical scope of the present disclosure.
As illustrated in the drawings, an information processing device 1 includes a touch display 100, a touch pen 300, and a control device 200.
The touch display 100 includes a touch panel 110 and a display 120. The touch panel 110 may be a capacitive touch panel, or may be a pressure sensitive or an infrared ray touch panel. That is, the touch panel 110 may be any device capable of appropriately receiving a user operation input such as a touch. The touch panel 110 is provided in the display 120. The display 120 is, for example, a liquid crystal display. The display 120 is not limited to a liquid crystal display, and may be a light emitting diode (LED) display, an organic electro-luminescence (EL) display, a projector, or the like.
The touch display 100 may be a device provided with a touch panel, such as a computer, a tablet terminal, a smartphone, or a car navigation system.
The touch pen 300 is a pen used by the user to touch (input to) the touch display 100. If the touch pen 300 is omitted, the user touches (inputs to) the touch display 100 with a finger. For example, the user inputs (draws) an object such as a character or a figure by handwriting with the touch pen 300 or a finger.
As illustrated in the drawings, the control device 200 includes a controller 210 and a storage 220.
The storage 220 is a non-volatile storage such as a hard disk drive (HDD) or a solid state drive (SSD) that stores various types of information.
Specifically, the storage 220 stores tag information 222 on a tag image created on the touch display 100 by the user.
Further, the storage 220 also stores input information input to an area other than the tag image P.
Further, the storage 220 stores a control program such as an information processing program 221 for causing the controller 210 to execute the information processing described later.
The controller 210 includes control devices such as a CPU, a ROM, and a RAM. The CPU is a processor that executes various types of arithmetic processes. The ROM is a non-volatile storage in which control programs such as a BIOS and an OS for causing the CPU to execute various types of arithmetic processes are stored in advance. The RAM is a volatile or non-volatile storage that stores various types of information, and is used as a temporary storage memory (working area) for various types of processes executed by the CPU. The controller 210 controls the information processing device 1 by causing the CPU to execute various types of control programs stored in advance in the ROM or the storage 220.
Specifically, as illustrated in the drawings, the controller 210 includes an input detector 211, a tag generator 212, a tag processor 213, and a display processor 214.
The input detector 211 detects an input to the touch display 100 by the touch pen 300 or a finger. The input detector 211 is an example of an input detector according to the present disclosure. Specifically, the input detector 211 detects the page number and position coordinates of a position specified by the touch pen 300 or the finger of the user on the touch panel 110. For example, if the user inputs a character by handwriting on the touch panel 110, the input detector 211 detects the page number and the position coordinates of the input position. Further, if the user performs an operation to activate a tag function on the touch panel 110, the input detector 211 detects the page number and the position coordinates of a position corresponding to the operation. Specifically, if the user maintains the contact state of the touch pen 300 at any position of the touch panel 110 (depresses the touch panel 110 with the touch pen 300 at any position) for a predetermined time, the tag function is activated (the mode transitions to a tag mode). In this case, the input detector 211 detects the contact position of the touch pen 300, that is, the page number and the position coordinates of the position of the touch panel 110 depressed by the touch pen 300 for a predetermined time.
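By way of illustration, the long-press detection described above might be sketched as follows. This is a minimal example in Python, not part of the disclosed embodiment: the `TouchSample` type, the sample stream, and the value of `LONG_PRESS_SECONDS` (the "predetermined time") are all assumptions.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional, Tuple

# Hypothetical touch sample; the actual touch-panel API is not specified here.
@dataclass
class TouchSample:
    timestamp: float                # seconds
    page: int                       # page number being displayed
    position: Tuple[float, float]   # position coordinates on the touch panel
    in_contact: bool                # True while the pen or finger touches the panel

LONG_PRESS_SECONDS = 1.0            # assumed value of the "predetermined time"

def detect_long_press(samples: Iterable[TouchSample]) -> Iterator[Tuple[int, Tuple[float, float]]]:
    """Yield (page, position) each time a contact is held for the predetermined time."""
    press_start: Optional[TouchSample] = None
    fired = False
    for s in samples:
        if s.in_contact:
            if press_start is None:
                press_start, fired = s, False
            elif not fired and s.timestamp - press_start.timestamp >= LONG_PRESS_SECONDS:
                yield press_start.page, press_start.position  # transition to tag mode
                fired = True
        else:
            press_start = None      # contact released before the threshold
```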
The tag generator 212 generates the tag image P by associating the page number and the position detected by the input detector 211 with the input information input by the user, and displays the generated tag image P on the display 120. Further, the tag generator 212 displays the input information in the area of the tag image P selected on the display 120. The tag generator 212 is an example of a tag generator according to the present disclosure.
For example, as illustrated in the drawings, if the user depresses any position X on the touch panel 110 with the touch pen 300 for the predetermined time, the tag generator 212 displays a tag selection screen D1 at the position X. If the user selects a color on the tag selection screen D1, the tag generator 212 generates the tag image P of the selected color and displays the generated tag image P at the position X on the display 120.
Here, the tag selection screen D1 may include an item for selecting the size of the tag image P. In this case, the tag generator 212 generates the tag image P in the size selected by the user. If the item for selecting the size of the tag image P is not included in the tag selection screen D1, the tag generator 212 generates the tag image P in a size according to an initial setting. The size of the tag image P may be changed (enlarged or reduced) at any timing after the tag image P is displayed on the display 120.
The tag generator 212 registers, in the tag information 222, tag information corresponding to the generated tag image P. Specifically, the tag generator 212 registers, in the tag information 222, information including the page number at which the tag image P is generated, the position coordinates corresponding to the position X (display position) of the tag image P, the color, the size, and the input information, in association with the identification information (tag ID) of the tag image P.
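One possible in-memory layout for the tag information 222 is sketched below. The field and function names are illustrative assumptions; the disclosure does not specify a schema.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class TagRecord:
    page: int                       # page number on which the tag image is placed
    position: Tuple[float, float]   # display position X of the tag image
    color: str                      # background color attribute ("red", "blue", ...)
    size: Tuple[int, int]           # width and height of the tag image
    text: str                       # input information shown in the tag area

# tag ID -> record, mirroring the role of the tag information 222
tag_information: Dict[str, TagRecord] = {}

def register_tag(tag_id: str, record: TagRecord) -> None:
    """Register a newly generated tag image in association with its tag ID."""
    tag_information[tag_id] = record

register_tag("tag-001", TagRecord(page=1, position=(120.0, 80.0),
                                  color="red", size=(200, 150), text="opinion A"))
```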
The tag processor 213 changes at least any one of the page number, the position, the color (background color), the size, and the input information of the tag image P displayed on the display 120. The tag processor 213 is an example of a tag processor according to the present disclosure.
For example, if the user performs a drag-and-drop operation on the tag image P displayed on the display 120, the tag processor 213 moves the tag image P to the page or position at which the tag image P is dropped. Further, if the user selects another color on the tag selection screen D1 or performs an enlargement/reduction operation on the tag image P, the tag processor 213 changes the background color or the size of the tag image P accordingly.
If at least any one of the page number, the position, the background color, the size, and the input information of the tag image P is changed, the tag processor 213 updates the corresponding information in the tag information 222.
The display processor 214 displays, on the display 120, a plurality of pieces of input information associated with each of the plurality of tag images P generated by the tag generator 212, based on each of the attributes of the plurality of tag images P. The display processor 214 is an example of a display processor according to the present disclosure. Specifically, the display processor 214 sorts the plurality of pieces of input information into different areas or different pages for each of the attributes based on the tag information 222, and then displays the sorted information.
For example, the display processor 214 displays, in a first area, first input information associated with one or more first tag images P having the attribute of a first color, and displays, in a second area, second input information associated with one or more second tag images P having the attribute of a second color. For example, the display processor 214 displays, on a first sheet S1, the input information associated with the red tag images P, displays, on a second sheet S2, the input information associated with the blue tag images P, and displays, on a third sheet S3, the input information associated with the yellow tag images P.
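The sorting by color attribute can be illustrated with a short sketch. The tag entries below are hypothetical; only the grouping logic reflects the description above.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def sort_by_color(entries: List[Tuple[str, str]]) -> Dict[str, List[str]]:
    """Group pieces of input information into one sheet per color attribute."""
    sheets: Dict[str, List[str]] = defaultdict(list)
    for color, text in entries:
        sheets[color].append(text)
    return dict(sheets)

# Hypothetical tags: (color attribute, input information)
tags = [("red", "T1"), ("blue", "T2"), ("yellow", "T3"),
        ("red", "T4"), ("red", "T5"), ("red", "T9")]
print(sort_by_color(tags))
# {'red': ['T1', 'T4', 'T5', 'T9'], 'blue': ['T2'], 'yellow': ['T3']}
```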
Further, the display processor 214 displays the pieces of input information on a sheet based on a predetermined priority order.
Further, the display processor 214 may convert the input information into a predetermined display mode and then display the converted input information on the display 120. For example, the display processor 214 converts handwritten characters included in the input information into text in a predetermined font and displays the converted text on the display 120.
Further, the display processor 214 may determine a display size and line spacing of the characters displayed on the display 120 based on the number of characters or the number of lines of the characters included in the plurality of pieces of input information, and display the plurality of pieces of input information on the display 120 in accordance with the determined display size and line spacing. For example, when sorting the input information for each attribute, the display processor 214 determines the display size and the line spacing of the characters based on the sheet (here, the first sheet S1) with the largest amount of input information (such as the number of characters and the number of lines), and applies the determined display size and line spacing to each of the sheets so that the character size and the line spacing are unified across the sheets.
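A minimal sketch of one way to derive a unified character size from the fullest sheet follows. The sizing rule (line spacing of 1.5 times the character size and a 48-pixel cap) is an assumption for illustration, not the disclosed method.

```python
from typing import Dict, List

def unified_font_size(sheets: Dict[str, List[str]],
                      sheet_height_px: int = 1080,
                      max_font_px: int = 48) -> int:
    """Pick one character size that lets the fullest sheet fit, assuming
    line spacing of 1.5 times the character size (an illustrative rule)."""
    max_lines = max(sum(text.count("\n") + 1 for text in sheet)
                    for sheet in sheets.values())
    fitting = int(sheet_height_px / (max_lines * 1.5))
    return min(max_font_px, fitting)

sheets = {"red": ["T1", "T4", "T5", "T9"], "blue": ["T2"], "yellow": ["T3"]}
size = unified_font_size(sheets)   # the same size is then applied to every sheet
```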
Further, the display processor 214 may display the background color of each of the sheets in a color associated with the tag image P. For example, the display processor 214 displays the background color of the first sheet S1 in “red”, the background color of the second sheet S2 in “blue”, and the background color of the third sheet S3 in “yellow”.
In this way, the display processor 214 embeds, in each of the sheets, the input information sorted in accordance with the attributes to create a material. The display processor 214 stores (saves) the created material (sheet) in the storage 220. The controller 210 may perform control to print the material as meeting minutes or transmit data of the material to a user terminal in response to a request from the user.
The controller 210 may execute a process for drawing the input information on the display 120 in accordance with the attribute of the touch pen 300. For example, the controller 210 receives an identification signal from the touch pen 300 and performs control to draw, on the display 120, characters, figures, and symbols in accordance with the attribute (writing color, character thickness, pen tip shape, and the like) associated with the identification signal. The controller 210 may receive an identification signal of another writing instrument (such as an eraser) and perform control to execute processing (such as erasing of drawing information) in accordance with the writing instrument.
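A rough sketch of mapping a pen's identification signal to drawing attributes is shown below. The signal values, the attribute fields, and the eraser handling are hypothetical; the disclosure does not define the signal format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PenAttributes:
    color: str       # writing color
    width_px: int    # character (stroke) thickness
    tip: str         # pen tip shape

# Hypothetical identification signals for two pens and an eraser.
PEN_TABLE = {
    0x01: PenAttributes(color="black", width_px=2, tip="round"),
    0x02: PenAttributes(color="red", width_px=4, tip="square"),
}
ERASER_ID = 0x10

def handle_signal(signal: int) -> Optional[PenAttributes]:
    """Return drawing attributes for a pen, or None to trigger erasing."""
    if signal == ERASER_ID:
        return None                      # erase drawing information instead of drawing
    return PEN_TABLE.get(signal, PenAttributes("black", 2, "round"))
```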
Information processing executed in the information processing device 1 will be described below with reference to the flowcharts in the accompanying drawings.
The present disclosure can be regarded as an invention of an information processing method in which one or more steps included in the information processing are executed. Further, one or more steps included in the information processing described here may be omitted where appropriate. In addition, each of the steps in the information processing may be executed in a different order as long as a similar operation and effect is obtained. Further, although a case where each of the steps in the information processing is executed by the controller 210 will be described as an example here, an information processing method in which each of the steps in the information processing is executed in a distributed manner by a plurality of processors may be regarded as another embodiment.
First, in step S11, the controller 210 determines whether an input to the touch display 100 by the touch pen 300 or a finger is detected. If the input is detected (S11: YES), the processing proceeds to step S12. If the input is not detected (S11: NO), the controller 210 repeats the determination in step S11. Step S11 is an example of an input detection process according to the present disclosure.
In step S12, the controller 210 determines whether an end command is selected by the user. If the end command is selected (S12: YES), the processing ends. If the controller 210 does not acquire the end command (S12: NO), the processing proceeds to step S13.
In step S13, the controller 210 determines whether the mode has shifted to the tag mode. For example, if the user maintains the contact state of the touch pen 300 at any position on the touch panel 110 (depresses the touch panel 110 with the touch pen 300 at any position) for a predetermined time, the mode shifts to the tag mode. If the mode shifts to the tag mode (S13: YES), the controller 210 executes the tag generation process described later. If the mode does not shift to the tag mode (S13: NO), the processing proceeds to step S14.
In step S14, the controller 210 determines whether a change in the tag content of the tag image P displayed on the display 120 is received from the user. If the change in the tag content is received (S14: YES), the controller 210 executes the tag change process described later. If the change is not received (S14: NO), the processing proceeds to step S15.
In step S15, the controller 210 determines whether an instruction for material creation is received from the user. If the instruction for material creation is received (S15: YES), the controller 210 executes the material creation process described later. If the instruction is not received (S15: NO), the processing proceeds to step S16.
In step S16, the controller 210 determines whether another command is received from the user. If another command is received (S16: YES), the controller 210 executes the corresponding command process. If another command is not received (S16: NO), the processing returns to step S11. The command is, for example, a printing process or a saving process for the created material. These command processes are well known, and description thereof will be omitted.
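The flow of steps S11 through S16 could be organized as the following dispatch loop. This is a sketch only: `ui` is a hypothetical front end, and the sub-process functions are placeholders for the flowcharts described below.

```python
# Placeholder sub-processes; the real ones are described in the flowcharts below.
def tag_generation_process(ui): ...
def tag_change_process(ui): ...
def material_creation_process(ui): ...

def information_processing(ui) -> None:
    """Main loop mirroring steps S11 through S16."""
    while True:
        if not ui.input_detected():           # S11
            continue
        if ui.end_command_selected():         # S12: end command ends the processing
            break
        if ui.tag_mode_entered():             # S13: long press entered tag mode
            tag_generation_process(ui)
        if ui.tag_change_received():          # S14: change to a displayed tag image
            tag_change_process(ui)
        if ui.material_creation_requested():  # S15: instruction for material creation
            material_creation_process(ui)
        if ui.other_command_received():       # S16: e.g. printing or saving
            ui.execute_command()
```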
The tag generation process will be described below with reference to the flowchart in the accompanying drawings.
In step S21, the controller 210 acquires the page number and the position coordinates of the input position input by the user. Specifically, the controller 210 acquires the page number and the position coordinates of the position X on the touch panel 110 that the user depresses with the touch pen 300 for the predetermined time.
Next, in step S22, the controller 210 acquires the color selected by the user on the tag selection screen D1.
Next, in step S23, the controller 210 generates the tag image P based on the acquired position coordinates and color, and performs control to display the tag image P on the display 120.
Next, in step S24, the controller 210 performs control to display the input information input by the user in the area of the tag image P, and registers the information on the generated tag image P in the tag information 222. Then, the processing proceeds to step S14 described above.
The tag change process will be described below with reference to the flowchart in the accompanying drawings.
In step S31, the controller 210 determines whether an operation of changing the page number or the position of the tag image P is received from the user. If the operation is received (S31: YES), the processing proceeds to step S32. On the other hand, if the operation is not received (S31: NO), the processing proceeds to step S33.
In step S32, the controller 210 changes the page number or the position of the tag image P in response to the operation of the user. For example, if the user performs a drag-and-drop operation on the tag image P with the touch pen 300 or a finger, the tag image P is moved to another page or position.
In step S33, the controller 210 determines whether an operation of changing the color of the tag image P is received from the user. If the operation is received (S33: YES), the processing proceeds to step S34. On the other hand, if the operation is not received (S33: NO), the processing proceeds to step S35.
In step S34, the controller 210 changes the background color of the tag image P to a color selected by the user.
In step S35, the controller 210 determines whether an operation of changing the size of the tag image P is received from the user. If the operation is received (S35: YES), the processing proceeds to step S36. On the other hand, if the operation is not received (S35: NO), the processing proceeds to step S37.
In step S36, the controller 210 changes the size of the tag image P in response to the operation of the user. For example, if the user performs an enlargement/reduction operation such as a pinch-in/pinch-out operation on the tag image P, the controller 210 changes the size of the tag image P.
In step S37, the controller 210 determines whether an operation of changing the input information displayed on the tag image P is received from the user. If the operation is received (S37: YES), the processing proceeds to step S38. On the other hand, if the operation is not received (S37: NO), the processing ends.
In step S38, the controller 210 changes the input information displayed on the tag image P to information rewritten by the user.
In step S39, the controller 210 updates each piece of information registered in the tag information 222. Then, the processing proceeds to step S15 described above.
The material creation process will be described below with reference to the flowchart in the accompanying drawings.
In step S41, the controller 210 creates a different area or a different sheet (page) for the input information for each attribute. For example, the controller 210 creates the first sheet S1 in red, the second sheet S2 in blue, and the third sheet S3 in yellow.
In step S42, the controller 210 determines the display mode of the input information associated with each tag image P. For example, the controller 210 converts the handwritten characters into characters in a predetermined font, and determines the display size and the line spacing of the characters.
In step S43, the controller 210 performs control to display (embed) the input information on each of the sheets. For example, the controller 210 performs control to display, on the first sheet S1, the input information T1, T4, T5, and T9 associated with the red tag images P, and similarly displays the input information associated with the blue tag images P on the second sheet S2 and the input information associated with the yellow tag images P on the third sheet S3.
In step S44, the controller 210 performs control to save the created material. For example, the controller 210 performs control to save the data of the material including the first sheet S1, the second sheet S2, and the third sheet S3 in the storage 220. Then, the processing proceeds to step S16 described above.
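Putting steps S41 through S44 together, the created material might be represented as a list of simple sheet structures. The `Sheet` type and the grouped input below are illustrative assumptions, not the disclosed data format.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Sheet:
    background: str                                  # color associated with the tag images (S41)
    lines: List[str] = field(default_factory=list)   # embedded input information (S43)

def create_material(grouped: Dict[str, List[str]]) -> List[Sheet]:
    """Build one sheet per color attribute and embed its input information."""
    return [Sheet(background=color, lines=texts) for color, texts in grouped.items()]

material = create_material({"red": ["T1", "T4", "T5", "T9"],
                            "blue": ["T2", "T7"],
                            "yellow": ["T3"]})
# material[0] -> Sheet(background='red', lines=['T1', 'T4', 'T5', 'T9'])
```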
The information processing device 1 according to the embodiment executes the information processing as described above.
As described above, the information processing device 1 creates an area or a page (sheet) for each attribute (color) of the tag image P, and inserts the input information of the tag image P into each page. Further, the information processing device 1 converts the handwritten characters into predetermined font characters and displays the converted characters on each page. Further, the information processing device 1 displays the characters in which the line spacing and the character size are unified in each page. As described above, the information processing device 1 can aggregate the plurality of pieces of input information in accordance with the attribute of the tag image P, and thus, efficiently create materials utilizing the tag function.
The present disclosure is not limited to the above-described embodiment. As another embodiment, the information processing device 1 may use a specific keyword displayed on the tag image P as an attribute. For example, the information processing device 1 may create a page for each specific mark (figure) displayed on the tag image P to sort the input information. Further, the keyword may be the content of the input information. For example, the information processing device 1 may create a page for each specific term included in the input information to sort the input information. As described above, the attribute according to the present disclosure may be a color, a character, a figure, or a symbol associated with the tag image P.
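Sorting by a specific term could be sketched in the same way as the color-based sorting above. The keyword list is hypothetical; only the page-per-term grouping reflects this alternative embodiment.

```python
from collections import defaultdict
from typing import Dict, List

KEYWORDS = ["cost", "schedule", "quality"]   # hypothetical specific terms

def sort_by_keyword(texts: List[str]) -> Dict[str, List[str]]:
    """Create one page per specific term included in the input information."""
    pages: Dict[str, List[str]] = defaultdict(list)
    for text in texts:
        for term in KEYWORDS:
            if term in text:
                pages[term].append(text)
                break
        else:
            pages["other"].append(text)   # no specific term found
    return dict(pages)

print(sort_by_keyword(["cut cost by 10%", "schedule slip", "misc note"]))
# {'cost': ['cut cost by 10%'], 'schedule': ['schedule slip'], 'other': ['misc note']}
```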
In a configuration of the information processing device according to the present disclosure, the touch panel 110 and the touch pen 300 may be omitted. For example, the information processing device according to the present disclosure may include the control device 200 and the display 120. Furthermore, the information processing device according to the present disclosure may include only the control device 200. In this configuration, the information processing device according to the present disclosure causes the display 120 connected via a network to execute various types of display processes.
Further, in the information processing device according to the present disclosure, the tag image may be a graphic image drawn by an operation of the user. For example, the tag image is a graphic image created by drawing application software, document creation application software, or the like, and in the graphic image, an attribute such as a color, a character, a figure, and a symbol is associated with input information input by the user. Further, in the tag image, it is possible to change at least one of a page number, a position, a background color, a size, and input information by an operation of the user. As described above, the tag image according to the present disclosure includes various electronic images that play the same role as the tag paper.
It is noted that, in the information processing device 1 according to the present disclosure, within the scope of the invention described in the claims, the embodiments described above may be freely combined, appropriately modified, or partly omitted.
It is to be understood that the embodiments herein are illustrative and not restrictive, since the scope of the disclosure is defined by the appended claims rather than by the description preceding them, and all changes that fall within metes and bounds of the claims, or equivalence of such metes and bounds thereof are therefore intended to be embraced by the claims.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
2019-100123 | May 2019 | JP | National
This application is a continuation of U.S. Patent Application No. 16/876,746, filed May 18, 2020, which claims the benefit of priority from the corresponding Japanese Patent Application No. 2019-100123 filed on May 29, 2019, the entire contents of which are incorporated herein by reference.
Related U.S. Application Data

Relation | Number | Date | Country
---|---|---|---
Parent | 16876746 | May 2020 | US
Child | 17957309 | | US