The entire disclosure of Japanese Patent Application No. 2014-9110 filed on Jan. 22, 2014, including description, claims, drawings, and abstract, is incorporated herein by reference in its entirety.
1. Field of the Invention
The present invention relates to an object operation system capable of displaying and operating an object, a recording medium storing an object operation control program for controlling operation of an object, and an object operation control method.
2. Description of the Related Art
In recent years, a display screen usable by multiple users (hereinafter referred to as a shared screen) has been used in electronic conferences and the like, in which display elements (hereinafter referred to as objects) such as characters, figures, and images are written and drawn on the shared screen for discussion. On such a shared screen, multiple users write various kinds of objects, collect multiple objects into a group, move objects and groups to any given place on the shared screen, and perform enlarging/reducing operations (hereinafter referred to as enlarging and reducing operation). In particular, when a shared screen is made of a touch panel supporting multi-touch, various kinds of operations can be done on objects and groups with multi-touch operation.
As a technique of multi-touch operation, for example, JP 2013-8326 A discloses an image processing apparatus including a recognition unit configured to recognize touches at two points of the display screen, a selection unit configured to select a type of image processing on the basis of the distance between the two recognized touch points, a calculation unit configured to adopt one of the two touched points as a center of rotation, obtain a rotation angle from movement of the other touched point, and calculate the amount of processing from the rotation angle, and an image processing unit configured to perform image processing on an image displayed on the display screen on the basis of the amount of processing and the type of the image processing.
JP 2013-37396 A (US 2013-0033717 A) discloses an image forming apparatus including a display unit and a position detection unit configured to detect a contact position on a display screen of the display unit, wherein the image forming apparatus performs image forming processing to form an image on a recording sheet on the basis of a display image displayed on the display unit, and wherein the image forming apparatus includes an editing unit configured to partially edit the display image on the basis of a direction of a straight line connecting two points detected by the position detection unit.
JP 2012-79279 A (US 2012-0056831 A) discloses an information processing apparatus including a display unit having a screen and a touch panel arranged so as to overlap the screen, wherein the information processing apparatus further includes a control unit which sets a writing mode by detecting a predetermined mode switching operation, in which at least two points on the touch panel are designated as a still point and an operation point, and which inputs, as writing data, a series of coordinate data corresponding to a trace of movement of the operation point.
In the conventional techniques of JP 2013-8326 A, JP 2013-37396 A (US 2013-0033717 A), and JP 2012-79279 A (US 2012-0056831 A), the difference in the touch operation is recognized, and the operation on the object is changed in accordance with that difference. In multi-touch operation, however, there may be a case where a user wants to switch between two different operations on an object even though the touch operations performed on the apparatus are completely the same. For example, when a predetermined object and a group including the predetermined object are displayed, the user may touch the predetermined object and the group at a single point each and move each touch position, intending either of two types of operations: an operation for enlarging or reducing the entire group including the predetermined object, or an operation for retrieving the predetermined object from the group and individually moving the predetermined object.
However, in the conventional systems shown in JP 2013-8326 A, JP 2013-37396 A (US 2013-0033717 A), and JP 2012-79279 A (US 2012-0056831 A), both of these operations are recognized as completely the same touch operation, and therefore, it is impossible to switch the operation performed on the objects and the groups. For this reason, it is necessary to use a method different from the touch operation to switch the operation on the objects and the groups, and there is a problem in that this makes the operation cumbersome.
The present invention has been made in view of the above problem, and it is a main object of the present invention to provide an object operation system, an object operation control program, and an object operation control method capable of switching operation on an object and a group in accordance with touch operation even when the same touch operation is performed.
To achieve the abovementioned object, according to an aspect, an object operation system reflecting one aspect of the present invention comprises a display unit which displays an object on a screen, an operation unit which receives a touch operation on the screen, and a control unit which controls the display unit and the operation unit, wherein when a touch operation for touching a plurality of points on the screen at a time is performed, the control unit determines whether the touch operation is a touch operation performed with both hands or a touch operation performed with a single hand, and carries out an operation on an object in accordance with a rule which is defined differently in advance in accordance with a determination result of the touch operation.
To achieve the abovementioned object, according to an aspect, a non-transitory recording medium storing a computer readable object operation control program operating on an apparatus for controlling a touch panel, including a display unit which displays an object on a screen and an operation unit which receives a touch operation on the screen, wherein the object operation control program reflecting one aspect of the present invention causes the apparatus to execute first processing for, when a touch operation for touching a plurality of points on the screen at a time is performed, determining whether the touch operation is a touch operation performed with both hands or a touch operation performed with a single hand, and second processing for carrying out an operation on an object in accordance with a rule which is defined differently in advance in accordance with a determination result of the touch operation.
To achieve the abovementioned object, according to an aspect, an object operation control method, reflecting one aspect of the present invention, for a system including a display unit which displays an object on a screen, an operation unit which receives a touch operation on the screen, and a control unit which controls the display unit and the operation unit, wherein the control unit executes first processing for, when a touch operation for touching a plurality of points on the screen at a time is performed, determining whether the touch operation is a touch operation performed with both hands or a touch operation performed with a single hand, and second processing for carrying out an operation on an object in accordance with a rule which is defined differently in advance in accordance with a determination result of the touch operation.
The above and other objects, advantages and features of the present invention will become more fully understood from the detailed description given hereinbelow and the appended drawings which are given by way of illustration only, and thus are not intended as a definition of the limits of the present invention, and wherein:
Hereinafter, an embodiment of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the illustrated examples.
When a discussion is held while operating objects such as characters, figures, and images, and groups displayed on a shared screen as explained in the Description of the Related Art, various kinds of operations are performed on the objects and the groups, e.g., moving the objects and the groups or performing enlarging and reducing operation on the objects and the groups. In particular, when a shared screen is made of a touch panel supporting multi-touch, various kinds of operations can be done on objects and groups with multi-touch operation.
In this case, even when an operation is recognized as completely the same touch operation, a user may want to execute different operations on the objects and the groups. For example, when a pinch operation (an operation for touching two points and changing the distance between the two points) is performed on a single object in a group, a user may want to either enlarge/reduce the entire group or enlarge/reduce only the target object. In a conventional system, however, as long as an operation is recognized as the same touch operation, the operation can be performed according to only one rule. For this reason, a method different from the touch operation is used to switch the operation, and there is a problem in that the operation is cumbersome.
Therefore, in an embodiment of the present invention, a touch panel is provided with a detection unit for detecting the state of the touch operation (the hand with which the touch operation is performed). When a multi-touch operation is performed, a determination is made, on the basis of the detection result of the detection unit, as to whether the operation is a multi-touch operation using two fingers of two hands (referred to as a both-hands multi-touch operation) or a multi-touch operation using two fingers of a single hand (referred to as a single-hand multi-touch operation). When the operation is determined to be the both-hands multi-touch operation, an operation is performed on an element or a group according to a first rule defined in advance, and when the operation is determined to be the single-hand multi-touch operation, an operation is performed on an element or a group according to a second rule different from the first rule. Hereinafter, this will be explained with reference to drawings.
The present invention can be applied to both of the case where there is a single operator and the case where there are multiple operators, but in the present specification, a system having a shared work area that can be operated by multiple operators will be hereinafter explained. The system mode includes the following two modes. As shown in
An object operation system 10 according to the present embodiment is a display panel having a calculation function, an electronic blackboard, and the like, and includes a control unit 20, a storage unit 30, a display unit 40, an operation unit 50, a detection unit 60, and the like as shown in
The control unit 20 includes a CPU (Central Processing Unit) 21, memories such as a ROM (Read Only Memory) 22, and a RAM (Random Access Memory) 23. The CPU 21 calls a control program from the ROM 22 and the storage unit 30, and extracts the control program to the RAM 23 and executes the control program, thus controlling operation of the entire object operation system 10. As shown in
The operation determination unit 20a determines whether an operation is a touch operation with a touch at a single point (single touch operation) or a touch operation with a touch at multiple points (multi-touch operation) on the basis of information given by the operation unit 50 (information about touch positions). Then, when the operation is determined to be the multi-touch operation, the information given by the detection unit 60 is analyzed, and a determination is made as to whether the multi-touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation on the basis of the analysis result. Then, the determination result (touch positions and the type of the touch operation) is notified to the processing unit 20b.
It should be noted that the method for determining the multi-touch operation is not particularly limited. For example, images obtained when the display screen is touched with multiple fingers of a single hand or with both hands are captured in advance, patterns obtained by extracting feature points of each of the images are stored, and a determination is made as to whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation by comparing the image of the hands obtained from the detection unit 60 with the patterns stored in advance.
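As an illustrative sketch only (not the claimed implementation), the comparison of a detected hand image with the stored patterns can be modeled as nearest-pattern matching on extracted feature vectors; the function name and the reduction of feature-point extraction to simple numeric vectors below are assumptions for illustration.

```python
# Hypothetical sketch: classify a touch as single-hand or both-hands by
# comparing a feature vector extracted from the camera image against
# feature patterns stored in advance (the nearest stored pattern wins).

def classify_by_image(features, single_hand_patterns, both_hand_patterns):
    """features: feature vector extracted from the detection unit's image.
    The pattern lists hold vectors extracted from reference images."""
    def distance(a, b):
        # Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    best_single = min(distance(features, p) for p in single_hand_patterns)
    best_both = min(distance(features, p) for p in both_hand_patterns)
    return "single-hand" if best_single <= best_both else "both-hands"
```

A real system would derive the feature vectors from contour or feature-point extraction rather than use raw coordinates.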
Alternatively, a touch area size and a touch pressure of each of the touch positions are obtained when the touch panel is touched with multiple fingers of a single hand or with both hands, a combination pattern of the touch area size and the touch pressure is stored, and a determination is made as to whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation by comparing the touch area size and the touch pressure obtained by the operation unit 50 with the combination pattern stored in advance. For example, when the touch panel is touched with both hands, the same finger of each hand is used in many cases, and when the same fingers of both hands are used, the touch area size and the touch pressure are substantially the same. On the other hand, when the touch panel is touched with different fingers of a single hand, the touch area size and the touch pressure are different. Therefore, whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation can be determined on the basis of the combination of the touch area size and the touch pressure. In this case, the state of the touch operation can be determined on the basis of information given by the operation unit 50, and therefore, the detection unit 60 may be omitted.
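The area/pressure heuristic above can be sketched as follows; the function name and the tolerance threshold are illustrative assumptions, not values taken from the disclosure.

```python
# Hypothetical sketch of the area/pressure heuristic: both hands usually
# touch with the same finger of each hand, so the two touch points have
# nearly equal area and pressure; a single hand uses different fingers,
# so the values differ.  The tolerance is an illustrative assumption.

def classify_by_touch_profile(touch1, touch2, tolerance=0.2):
    """touch1, touch2: (area, pressure) tuples for the two touch points."""
    area_close = abs(touch1[0] - touch2[0]) <= tolerance * max(touch1[0], touch2[0])
    pressure_close = abs(touch1[1] - touch2[1]) <= tolerance * max(touch1[1], touch2[1])
    return "both-hands" if (area_close and pressure_close) else "single-hand"
```

For example, two touches with nearly identical area and pressure would be classified as both-hands, while a thumb/index pair of one hand, with clearly different profiles, would be classified as single-hand.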
In accordance with operation with the operation unit 50, the processing unit 20b displays a hand-written object on the display unit 40, obtains data of an object from the storage unit 30, and displays the object on the display unit 40. The processing unit 20b selects an object (element) or a group which is to be operated, on the basis of the determination result given by the operation determination unit 20a (the touch position and the type of the touch operation), and performs operation on the selected object or group and changes the state of display of the object or the group. For example, when the operation determination unit 20a determines that the operation is the single touch operation, an object displayed at the touch position on the screen of the display unit 40 (hereinafter abbreviated as a touched object) or a group including the touched object is moved in accordance with the change of the touch position. When the operation determination unit 20a determines that the operation is the both-hands multi-touch operation, the touched object or the group including the touched object is changed in accordance with the first rule associated in advance with the both-hands multi-touch operation (for example, the object is enlarged, reduced, or moved in accordance with the change of the two touch positions). When the operation determination unit 20a determines that the operation is the single-hand multi-touch operation, the touched object or the group including the touched object is changed in accordance with the second rule associated in advance with the single-hand multi-touch operation (for example, the group is enlarged or reduced in accordance with the change of the two touch positions).
It should be noted that the enlarging and reducing operation according to the present invention includes both of enlargement and reduction of the size while maintaining a certain ratio between the vertical side and the horizontal side of an object (which means similar figure) and enlargement or reduction of the size while changing the ratio between the vertical side and the horizontal side of an object (which means deformation).
A group according to the present invention is considered to be constituted by a single or multiple objects registered in advance, but multiple objects in a predetermined range (for example, objects in a predetermined range from the center, i.e., the touched object) may be adopted as the group. A group may be constituted based on the types of objects, or a group may be constituted based on the sizes and the colors of objects. When data of objects are managed in a hierarchical structure, one or multiple objects in the same level of hierarchy may be adopted as the group. When objects are associated with users, one or multiple objects associated with the same user may be adopted as a group. An area of a group according to the present invention may be only a display area of objects, or may be an area including the vicinity of objects.
The operation determination unit 20a and the processing unit 20b may be implemented as hardware, or may be realized by causing the CPU 21 provided in the control unit 20 to execute software functioning as the operation determination unit 20a and the processing unit 20b (an operation control program).
The storage unit 30 is constituted by a memory, an HDD (Hard Disk Drive), an SSD (Solid State Drive), and the like, and stores the contents of operation performed with the operation unit 50 (information about the touch position, the type of the touch operation, and the like), information about an object displayed on the display unit 40 (data of objects, a number for identifying an object, objects constituting a group, and the like), a pattern for determining whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation, and the like. When the functions of the operation determination unit 20a and the processing unit 20b are achieved by causing the CPU 21 to execute the operation control program, this operation control program is stored in the storage unit 30.
The display unit 40 is constituted by an LCD (Liquid Crystal Display), an organic EL (Electro Luminescence) display, and the like, and provides a shared work area in which multiple operators can operate objects. The operation unit 50 is constituted by a touch sensor made of lattice-like electrodes arranged on the display unit 40, hard keys, and the like, and is configured to receive operations performed by the user. The display unit 40 and the operation unit 50 constitute a touch panel, and a signal according to touch operation performed on the touch panel is output to the operation determination unit 20a and the processing unit 20b.
The detection unit 60 is constituted by a CCD (Charge Coupled Device) camera and the like, and uses visible light or infrared light to capture an image of a hand with which a touch operation is performed, and outputs captured image data or data obtained by processing the image data (for example, data obtained by extracting the contour of an image) to the operation determination unit 20a. As long as the detection unit 60 is capable of capturing an image of a hand with which a touch operation is performed, the configuration and the arrangement thereof are not particularly limited. For example, when the touch panel has optical transparency, the detection unit 60 is arranged on the back surface side of the touch panel as shown in
Hereinafter, an operation control method of an object using the object operation system 10 having the above configuration will be explained. The CPU 21 extracts an operation control program stored in the ROM 22 or the storage unit 30 to the RAM 23 and executes the operation control program, thus executing processing in each step as shown in the flowchart diagram of
First, the operation determination unit 20a obtains information about the touch operation from the operation unit 50 (information about the touch position), and obtains information about a hand with which a touch operation is performed from the detection unit 60 (image data of a hand or contour data of a hand) (S101).
Subsequently, the operation determination unit 20a compares information about a hand with which a touch operation is performed and a pattern stored in the storage unit 30 in advance, thus determining whether a touch operation is a both-hands multi-touch operation (the touch positions are associated with different hands) or a single-hand multi-touch operation (the touch positions are associated with the same hand) (S102).
Then, when the touch operation is determined to be the both-hands multi-touch operation, the processing unit 20b performs operation on an object or a group in accordance with the first rule associated with the both-hands multi-touch operation in advance (S103). On the other hand, when the touch operation is determined to be the single-hand multi-touch operation, the processing unit 20b executes operation on the object or the group in accordance with the second rule associated with the single-hand multi-touch operation in advance (S104).
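The dispatch in S102 through S104 amounts to binding each touch type to a rule in advance and applying the rule bound to the detected type. The following is a minimal sketch under that assumption; the function name and the example rule bindings are hypothetical.

```python
# Hypothetical sketch of steps S102-S104: once the touch type has been
# determined, the operation is dispatched to the rule bound to that type
# in advance.

def handle_multi_touch(touch_type, target, rules):
    """rules: {"both-hands": first_rule, "single-hand": second_rule},
    each rule being a callable applied to the touched object or group."""
    return rules[touch_type](target)

# Illustrative rule bindings (first rule operates on the element itself,
# second rule operates on the group containing it).
rules = {
    "both-hands": lambda target: ("scale", target),                 # S103
    "single-hand": lambda target: ("scale", "group-of-" + target),  # S104
}
```

Because the rules are looked up rather than hard-coded, the first and second rules can be redefined per embodiment without changing the dispatch itself.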
As described above, even if information about the touch operation detected by the touch panel is completely the same, different operations can be carried out by determining whether a touch operation is performed with a single hand or both hands. Therefore, when a target having a hierarchical structure is operated, an operation target can be selected with a single multi-touch operation. For example, whether an object is operated or a group is operated can be selected by a single multi-touch operation. An operation content can be selected by a single multi-touch operation. For example, whether an object or a group is moved or is enlarged/reduced can be selected by a single multi-touch operation. Therefore, the object and the group can be operated efficiently, which can improve the user's convenience.
Even when a user operates an object having a multi-layer hierarchical structure, e.g., another group (large group) is formed by collecting multiple groups (small groups), an operation target (small group/large group) and an operation content (movement/enlarging and the like) can be selected by a single multi-touch operation. Even when a window is displayed on a screen and a sub-window is displayed in the window, the sub-window is adopted as an object and the window is adopted as a group, so that an operation target (object/group) and an operation content (movement/enlarging and the like) can be selected by a single multi-touch operation.
In order to explain the embodiments of the present invention described above in further detail, the object operation system, the object operation control program, and the object operation control method according to the first embodiment of the present invention will be explained with reference to
An operation control method of an object according to the present embodiment will be explained. The CPU 21 extracts an operation control program stored in the ROM 22 or the storage unit 30 to the RAM 23 and executes the operation control program, thus executing processing in each step as shown in the flowchart diagrams of
First, processing for changing an operation target will be explained with reference to
When the touch operation is determined to be the both-hands multi-touch operation, the processing unit 20b executes operation for enlarging and reducing the element (object) (S203). On the other hand, when the touch operation is determined to be the single-hand multi-touch operation, the processing unit 20b executes operation for enlarging and reducing the group (S204). More specifically, the operation target (element/group) is switched according to whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation.
Subsequently, processing for changing the operation target and the operation content will be explained with reference to
When the touch operation is determined to be the both-hands multi-touch operation, the processing unit 20b identifies the target touched by each finger on the basis of the touch position of the finger (S303). More specifically, a determination is made as to whether each finger is touching the same element (the same object), one of the fingers is touching an element (object) and the other of the fingers is touching a group (a portion of the group area other than the objects), or fingers are touching different elements (different objects).
When each finger touches the same element, the processing unit 20b executes operation for enlarging and reducing the touched element (S304). When one of the fingers is touching an element and the other of the fingers is touching a group, the processing unit 20b executes operation for separately moving the element and the group (S305). When the fingers are touching different elements, the processing unit 20b executes operation for separately moving each element (S306).
On the other hand, when the touch operation is determined to be the single-hand multi-touch operation in S302, the processing unit 20b executes operation for enlarging and reducing the group (S307). More specifically, the operation target (element/group) is switched according to whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation, and the operation content (movement/enlarging and reducing operation) is switched on the basis of the touch target.
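The branching of S302 through S307 can be summarized as a small decision tree. The sketch below is illustrative only; the function name and the convention of representing a touch on the group area (outside any element) by the string "group" are assumptions.

```python
# Hypothetical sketch of the first-embodiment decision tree (S302-S307).
# Each touched target is an element identifier, or "group" when the
# finger is on a portion of the group area outside any element.

def decide_operation(touch_type, target1, target2):
    if touch_type == "single-hand":
        return ("scale", "group")           # S307: enlarge/reduce the group
    if target1 == target2:
        return ("scale", target1)           # S304: both fingers on the same element
    # S305 (one element and the group) and S306 (two different elements)
    # both move the two touched targets separately.
    return ("move-separately", (target1, target2))
```

The operation target (element/group) thus follows from the touch type, and the operation content (movement/enlarging and reducing) from what each finger touched.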
Hereinafter, this will be explained in a more specific manner with reference to
On the other hand, as shown at the left drawing of
On the other hand, as shown at the left drawing of
On the other hand, as shown in the left drawing of
As described above, when the multi-touch operation is performed, the operation target (element/group) is switched in accordance with whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation, and further, the operation content (movement/enlarging and reducing operation and the like) is switched in accordance with the touched target. Therefore, the object and the group can be operated efficiently, which can improve the user's convenience.
Subsequently, an object operation system, an object operation control program, and an object operation control method according to the second embodiment of the present invention will be explained with reference to
In the first embodiment explained above, the element and the group are switched as the operation target in accordance with the touch operation, but when an object is managed in a multi-layer hierarchical structure, a first group (referred to as a small group) may be formed by one or more objects, and further, a second group (referred to as a large group) may be formed by multiple first groups or the first group and at least another object. Therefore, in the present embodiment, a case where a large group and a small group are switched as an operation target will be explained. It should be noted that the small group of the present embodiment corresponds to the element of the first embodiment, and the large group of the present embodiment corresponds to the group of the first embodiment.
The operation control method of the object in this case will be explained. The CPU 21 extracts an operation control program stored in the ROM 22 or the storage unit 30 to the RAM 23 and executes the operation control program, thus executing processing in each step as shown in the flowchart diagram of
First, the operation determination unit 20a obtains information about the touch operation from the operation unit 50, and obtains information about a hand with which a touch operation is performed from the detection unit 60 (S401). Then, the operation determination unit 20a compares information about a hand with which a touch operation is performed with a pattern stored in the storage unit 30 in advance, thus determining whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation (S402).
When the touch operation is determined to be the both-hands multi-touch operation, the processing unit 20b identifies the target touched by each finger on the basis of the touch position of the finger (S403). More specifically, a determination is made as to whether each finger is touching the same small group, one of the fingers is touching a small group and the other of the fingers is touching a large group (a portion of the area of the large group other than the small groups), or each finger is touching a different small group.
When each finger is determined to be touching the same small group, the processing unit 20b executes operation for enlarging and reducing the touched small group (S404). When one of the fingers is determined to be touching a small group and the other of the fingers is determined to be touching a large group, the processing unit 20b executes operation for separately moving the small group and the large group (S405). When each finger is determined to be touching a different small group, the processing unit 20b executes operation for separately moving each of the small groups (S406).
On the other hand, when the touch operation is determined to be the single-hand multi-touch operation in S402, the processing unit 20b executes operation for enlarging and reducing a large group (S407). More specifically, in accordance with whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation, the operation target (small group/large group) is switched, and on the basis of the touch target, the operation content (movement/enlarging and reducing operation) is switched.
Hereinafter, this will be explained in detail with reference to
On the other hand, as shown in the left drawing of
On the other hand, as shown in the left drawing of
On the other hand, as shown in the left drawing of
As described above, when the multi-touch operation is performed, the operation target (small group/large group) is switched in accordance with whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation, and further, the operation content (movement/enlarging and reducing operation and the like) is switched in accordance with the touched target. Therefore, the objects can be operated efficiently in units of groups, which can improve the user's convenience.
Subsequently, an object operation system, an object operation control program, and an object operation control method according to the third embodiment of the present invention will be explained with reference to
In the first embodiment, an object or a group is adopted as an operation target, and in the second embodiment, a small group or a large group is adopted as an operation target. In the present embodiment, a window displayed on a screen of a display unit 40, a sub-window displayed inside of the window, and an object displayed inside of the window or the sub-window are adopted as operation targets, and an operation target and an operation content are switched in accordance with the touch operation. More specifically, in the present embodiment, when a window is displayed on the screen of the display unit 40 and a sub-window is displayed inside of the window, an individual sub-window is adopted as an element (object), and the entire window including all the sub-windows inside of the window is treated as a group. When a window is displayed on the screen of the display unit 40 and an object is displayed inside of the window, an individual object is adopted as an element, and the entire window including all the objects inside of the window is treated as a group.
The operation control method of the object in this case will be explained. The CPU 21 loads the operation control program stored in the ROM 22 or the storage unit 30 into the RAM 23 and executes it, thereby executing the processing in each step shown in the flowchart of
First, the operation determination unit 20a obtains information about the touch operation from the operation unit 50, and obtains information about the hand with which the touch operation is performed from the detection unit 60 (S501). Then, the operation determination unit 20a compares the information about the hand with which the touch operation is performed with patterns stored in the storage unit 30 in advance, thus determining whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation (S502).
When the touch operation is determined to be the both-hands multi-touch operation, the processing unit 20b executes the operation for enlarging and reducing the display size inside the sub-window (for example, each object displayed inside the sub-window) (S503). On the other hand, when the touch operation is determined to be the single-hand multi-touch operation, the processing unit 20b executes the operation for enlarging and reducing the display size inside the window (for example, the entire sub-window displayed inside the window) (S504). More specifically, the operation target (sub-window/window) is switched according to whether the touch operation is the both-hands multi-touch operation or the single-hand multi-touch operation. In
Hereinafter, this will be explained in detail with reference to
As shown in the left drawing of
On the other hand, as shown in the left drawing of
As described above, the operation target (sub-window/window) is switched according to whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation, so that the user's convenience can be improved. For example, when a page with an embedded map is displayed in a browser and the map is enlarged so that it fills the entire browser, a location other than the map cannot be touched in a normal browser, and therefore the display size of the browser cannot be changed. According to the above control, however, the display size of the browser can be changed simply by changing the hand used for the operation.
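The flow described above (classifying the touch as both-hands or single-hand, then scaling either the sub-window contents or the window contents) can be sketched as follows. This is a hedged sketch, not the embodiment's implementation: the function names, the pattern-matching predicate, and the dictionary layout of the window are all illustrative assumptions.

```python
# Minimal sketch of the flowchart steps, under illustrative assumptions.
# classify_touch stands in for the operation determination unit 20a;
# apply_pinch stands in for the processing unit 20b.

def classify_touch(hand_info, both_hands_patterns):
    """Compare hand information with patterns stored in advance and
    decide whether the touch is a both-hands or single-hand multi-touch."""
    return "both_hands" if hand_info in both_hands_patterns else "single_hand"


def apply_pinch(touch_kind, window, factor):
    """Scale the sub-window contents or the entire sub-window."""
    if touch_kind == "both_hands":
        # Both hands: enlarge/reduce each object inside the sub-window.
        for obj in window["sub_window"]["objects"]:
            obj["scale"] *= factor
    else:
        # Single hand: enlarge/reduce the entire sub-window in the window.
        window["sub_window"]["scale"] *= factor
    return window


# Usage: a both-hands pinch doubles each object's scale but leaves the
# sub-window itself untouched.
window = {"sub_window": {"scale": 1.0,
                         "objects": [{"scale": 1.0}, {"scale": 1.0}]}}
kind = classify_touch("left+right", {"left+right"})
apply_pinch(kind, window, 2.0)
```

With this separation, the browser/map example above falls out naturally: the same pinch gesture is routed to the map (the sub-window contents) or to the browser window depending only on the detected hand pattern.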
As shown in the left drawing of
On the other hand, as shown in the left drawing of
On the other hand, as shown in the left drawing of
On the other hand, as shown in the left drawing of
On the other hand, as shown in the left drawing of
As described above, when the multi-touch operation is performed, the operation target (an object, a window or a sub-window including an object, a window including a sub-window, and the like) is switched in accordance with whether the operation is the both-hands multi-touch operation or the single-hand multi-touch operation, and further, the operation content (movement, enlarging and reducing operation, and the like) is switched in accordance with the touched target. Therefore, the windows and sub-windows can be operated efficiently, which can improve the user's convenience.
It should be noted that the present invention is not limited to the above embodiment, and the configuration and the control of the present invention can be changed as necessary as long as not deviating from the gist of the present invention.
For example, in the above embodiments, in the case of the both-hands multi-touch operation, a relatively small range such as an element (an object, a sub-window including an object, and the like) is adopted as an operation target, and in the case of the single-hand multi-touch operation, a relatively large range such as a group (a group including multiple objects, an entire window, an entire sub-window, and the like) is adopted as an operation target. The assignment may be reversed, e.g., in the case of the both-hands multi-touch operation, a large range such as a group is adopted as an operation target, and in the case of the single-hand multi-touch operation, a small range such as an element is adopted as an operation target.
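Since the assignment between touch pattern and operation target is a simple mapping, it can be held as data and inverted without changing any other logic. The names below are hypothetical, used only to illustrate the point.

```python
# Illustrative sketch: the pattern-to-target assignment as data.
# Either mapping can be installed; the dispatch code stays the same.
DEFAULT_MAP = {"both_hands": "element", "single_hand": "group"}
INVERTED_MAP = {"both_hands": "group", "single_hand": "element"}


def operation_target(touch_kind, mapping=DEFAULT_MAP):
    """Return the operation target (element or group) for a touch kind."""
    return mapping[touch_kind]
```

Swapping `DEFAULT_MAP` for `INVERTED_MAP` realizes the reversed assignment described above with no other change.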
In the above embodiments, examples of operation contents include moving and enlarging and reducing operations, but any operation that can be performed on an element or a group may be applied.
The above embodiments have been explained using the shared screen on which multiple users can operate objects at a time. However, the object operation system according to the present invention may be any apparatus having a touch panel, and for example, the present invention can be applied in the same manner to a personal computer having a touch panel and to portable terminals such as tablet terminals and smartphones.
The present invention can be applied to a system capable of operating objects such as characters, figures, and images, and more particularly, the present invention can be used for a system that can be operated by multiple operators in a cooperative manner, an operation control program operating on the system, a recording medium recording the operation control program, and an operation control method controlling operation of an object on the system.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the scope of the present invention being interpreted by the terms of the appended claims.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2014-009110 | Jan 2014 | JP | national |