This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2017-170612 filed Sep. 5, 2017.
The present invention relates to an information processing apparatus, an image forming apparatus, and a non-transitory computer readable medium.
JP-A-2011-186742 discloses an apparatus that includes a detection unit that detects that one object overlaps with another object, and a display unit that, when the detection unit detects the overlap, displays the other object shifted from its alignment position.
In an information processing apparatus that includes a display unit, for example, an operator performs an operation on the display unit to select an image on the display unit or to move the image. At this point, for example, when images on the display unit are arranged close to each other, an error in the operation may occur. An image that is different from the image originally intended to be selected may be selected, or an image may be moved to a portion that is different from the originally intended portion.
Aspects of non-limiting embodiments of the present disclosure relate to reducing an error in an operation performed by an operator on an image which is displayed on a display unit, when the operator performs the operation on the image on the display unit, compared with a case where processing that changes an arrangement of images is not performed.
Aspects of certain non-limiting embodiments of the present disclosure overcome the above disadvantages and/or other disadvantages not described above. However, aspects of the non-limiting embodiments are not required to overcome the disadvantages described above, and aspects of the non-limiting embodiments of the present disclosure may not overcome any of the disadvantages described above.
According to an aspect of the present disclosure, there is provided an information processing apparatus including: a display that displays an overlapping image including plural images which are partially overlapped and are mutually deviated; a gaze detection unit that detects a gaze of an operator, which is fixed on the overlapping image; a motion detection unit that detects a specific motion that is made when the operator performs an operation on the overlapping image; and a display control unit that changes an arrangement of the plural images included in the overlapping image in a case where the gaze fixed on the overlapping image is detected and where the specific motion is detected.
Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
Exemplary embodiments of the present invention will be described below with reference to the accompanying drawings.
As illustrated in
A display unit 106 as one example of a display displays an image and thus performs notification of information to an operator who operates the image forming apparatus 10. The display unit 106, as illustrated in
An operation from a user is received by an operation receiving unit 107 (see
At this point, the display unit 106 and the operation receiving unit 107, for example, are configured as a touch panel type display. The display unit 106 and the operation receiving unit 107 are arranged so as to overlap each other.
It is noted that, in the present embodiment, the case where the operation receiving unit 107 is configured with the touch panel type display is described, but that the operation receiving unit 107 may be configured with a pointing device such as a mouse.
In this case, when a user operates the pointing device, a pointer 106C (see
The display unit 106 (see
Then, in the present embodiment, according to the user's operation that is received by the operation receiving unit 107, processing that corresponds to the operation is performed.
An image reading unit 108 (see
The image forming unit 109, as one example of an image forming unit, uses, for example, an electrophotographic method and forms a toner image in accordance with the image data on a paper sheet, which is one example of a recording material. It is noted that the image forming unit 109 may perform image formation using any other method, such as an ink jet method. The communication unit 110 functions as a communication interface that is connected to a communication line (not illustrated) and performs communication with any other apparatus that is connected to the communication line.
An image processing unit 111 performs image processing, such as color correction or gray level correction, on an image that is represented by the image data.
The camera 112 is one example of an image capture unit, and captures an image of an area in the vicinity of the image forming apparatus 10. Furthermore, in a case where an operator is standing in front of the image forming apparatus 10, the camera 112 captures an image of the operator.
The camera 112 is configured with a Charge Coupled Device (CCD) or the like. The camera 112, as illustrated in
A gaze detection device 113 (see
The gaze detection device 113, as illustrated in
For example, data that is received in the communication unit 110, or the reading image (the image data) that is generated in the image reading unit 108 is stored in a storage unit 105 (see
The control unit 60 controls each unit of the image forming apparatus 10. The control unit 60 is configured with a Central Processing Unit (CPU) 102, a Read Only Memory (ROM) 103, and a Random Access Memory (RAM) 104.
A program that is executed by the CPU 102 is stored in the ROM 103. The CPU 102 reads the program that is stored in the ROM 103, and executes the program with the RAM 104 as a working area.
At this point, the program that is executed by the CPU 102 may be provided to the image forming apparatus 10, in a state of being stored in a magnetic recording medium (such as a magnetic tape or a magnetic disk), an optical recording medium (such as an optical disc), a magneto-optical recording medium, a semiconductor memory, or the like, which is computer-readable.
Furthermore, the program that is executed by the CPU 102 may be downloaded to the image forming apparatus 10 using communication means such as the Internet.
When the program is executed by the CPU 102, each unit of the image forming apparatus 10 is controlled by the CPU 102, and thus, the image forming apparatus 10, for example, forms an image on a paper sheet, or reads an original document and generates a reading image of the original document.
Furthermore, in the present embodiment, the program is executed by the CPU 102, and thus, as illustrated in
At this point, the display unit 106 and the control unit 60 according to the present embodiment may be regarded as an information processing apparatus that performs image display.
The gaze detection unit 61 (see
Specifically, based on positional information indicating a position of the overlapping image on the display unit 106, and on an output from the gaze detection device 113, the gaze detection unit 61 detects the operator's gaze fixed on the overlapping image.
In other words, the gaze detection unit 61 determines whether or not the overlapping image is present on a destination of the operator's gaze, and, in a case where the overlapping image is present, outputs information to the effect that the operator is taking a look at the overlapping image.
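The hit test performed by the gaze detection unit 61 can be sketched as follows. This is an illustrative sketch only, not code from the application: the rectangle convention and the function name are assumptions, standing in for comparing the gaze destination reported by the gaze detection device 113 against the registered positional information of the overlapping image.

```python
# Illustrative sketch (assumed names): determine whether the operator's
# gaze point lies within the rectangle occupied by the overlapping image.
# image_rect is assumed to be (left, top, width, height) in display
# coordinates; the gaze point is (gaze_x, gaze_y) in the same coordinates.

def gaze_on_overlapping_image(gaze_x, gaze_y, image_rect):
    """Return True if the gaze point falls inside the image rectangle."""
    left, top, width, height = image_rect
    return left <= gaze_x <= left + width and top <= gaze_y <= top + height

# Overlapping image occupying a 200 x 150 region whose top-left is (100, 50).
print(gaze_on_overlapping_image(150, 100, (100, 50, 200, 150)))  # True
print(gaze_on_overlapping_image(20, 20, (100, 50, 200, 150)))    # False
```

When the test returns True, the unit would output information to the effect that the operator is looking at the overlapping image.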
The motion detection unit 62 as one example of an operation detection unit interprets an output from the camera 112 or the pointing device, and detects a specific motion that is made when the operator performs an operation on the overlapping image.
Specifically, in the present embodiment, the motion detection unit 62 detects an operator's motion of causing an operation tool or a finger of his/her own to approach the overlapping image, or an operator's motion of causing the pointer 106C, which is displayed on the display unit 106, to approach the overlapping image. At this point, a pen-type tool is given as one example of the operation tool.
The display control unit 63 as one example of a display control unit performs display control on the display unit 106.
Specifically, in a case where the gaze fixed on the overlapping image is detected and the specific motion is detected, the display control unit 63 changes a state where a plurality of images that are included in the overlapping image are arranged.
Furthermore, among the plurality of images that are displayed on the display unit 106, the display control unit 63 moves an image having a predetermined positional relationship with a specific position that is specified by the position specification unit 65 (which will be described below).
An operator's selection with respect to the image on the display unit 106 is received by the receiving unit 64 as one example of a receiving unit.
Specifically, the receiving unit 64 obtains positional information on an image that is displayed on the display unit 106, and an output (information indicating an operation position at which the operator performs an operation) from the operation receiving unit 107 (see
Based on the output from the gaze detection device 113, the position specification unit 65 as one example of a specification unit specifies a position on the display unit 106, which is a position of a destination toward which the operator's gaze is directed.
As described in
The virtual image, as illustrated in
It is noted that the detection of the direction of the user's gaze may be performed by any other known method, without being limited to methods that are illustrated in
In
The overlapping image 80 is configured with a plurality of images 81 that are partially overlapped and are mutually deviated or shifted. Additionally, the overlapping image 80 is configured with the plurality of images 81 that correspond to a plurality of pages, respectively, which are mutually deviated.
Furthermore, in the overlapping image 80, the plurality of images 81 that correspond to a plurality of pages, respectively, are arranged in a state of being deviated in the direction of a diagonal of an image 81 that corresponds to any one page. Furthermore, in the overlapping image 80, the images 81, which constitute the overlapping image 80, are arranged side by side in a manner that is equally spaced.
Furthermore, in
Specifically, in the present embodiment, a portion of the overlapping image 80 where edge parts 81A of the images 81 are arranged side by side (hereinafter referred to as an edge part and portion) is registered in advance, as a specific portion, in the storage unit 105 of the image forming apparatus 10. In an example that is illustrated in
At this point, each image 81 that is included in the overlapping image 80 is formed in the shape of a rectangle and has sides. Specifically, each image 81 has a short side 81E and a long side 81F as sides.
For this reason, in the present embodiment, as the edge part and portion, there are present a first edge part and portion 81X in which the edge parts 81A are arranged side by side along the short side 81E in the upward-downward direction in
Then, in the present embodiment, the first edge part and portion 81X and the second edge part and portion 81Y, in which the edge parts 81A are arranged side by side, and the like are registered in advance, as the specific portion.
In an example that is illustrated in
In
In the present embodiment, in this manner, when the operator causes the finger 200 of his/her own to approach the second edge part and portion 81Y (the specific portion), this motion in which the finger 200 is caused to approach the second edge part and portion 81Y is detected by the motion detection unit 62.
Additionally, in the present embodiment, the operator's motion of causing the finger 200 of his/her own to approach the specific portion is registered in advance, as a specific motion, in the storage unit 105 of the image forming apparatus 10. In the present embodiment, when the operator makes this specific motion, the specific motion is detected by the motion detection unit 62.
More specifically, the motion detection unit 62 interprets an output from the camera 112, and, based on the output from the camera 112 and on positional information on the overlapping image 80 (information indicating a position of the overlapping image 80 on the display unit 106), determines whether or not the operator causes the finger 200 to approach the second edge part and portion 81Y of the overlapping image 80.
Then, in a case where the finger 200 approaches the second edge part and portion 81Y of the overlapping image 80, the motion detection unit 62 detects that the operator makes a predetermined specific motion.
Then, in the present embodiment, in this manner, in a case where the operator's gaze fixed on the specific portion (the second edge part and portion 81Y) of the overlapping image 80 is detected and where it is detected that the operator makes the predetermined specific motion (the motion of causing the finger 200 to approach the second edge part and portion 81Y), the display control unit 63, as illustrated in
Specifically, the display control unit 63 changes the arrangement of the plurality of images 81 in such a manner that a deviation between the images 81, among the plurality of images 81 that are included in the overlapping image 80, increases.
More specifically, the display control unit 63 moves each of the plurality of images 81 in such a manner that a gap between the edge parts 81A adjacent to each other, which are positioned in the second edge part and portion 81Y, is broadened.
Additionally, in an example of the present embodiment, the display control unit 63 moves each image 81 that is included in the overlapping image 80, along a direction in which the short side 81E, which each image 81 has, extends.
Furthermore, the display control unit 63 moves each image 81 in one direction, as indicated by an arrow 6C in
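The arrangement change described above, in which each image is moved in one direction so that the mutual deviation grows, can be sketched as follows. This is an illustrative sketch, not code from the application: the coordinate representation, step size, and function name are assumptions.

```python
# Illustrative sketch (assumed names): shift each page image of the
# overlapping image so the deviation between adjacent images increases.
# `positions` holds (x, y) top-left coordinates ordered from the front
# page to the back page; each successive image is moved one extra `step`
# along `direction`, so the gap between adjacent edge parts is broadened.

def spread_images(positions, direction=(1, 0), step=20):
    dx, dy = direction
    return [(x + i * step * dx, y + i * step * dy)
            for i, (x, y) in enumerate(positions)]

# Pages originally deviated by 10 px each become deviated by 30 px each.
print(spread_images([(0, 0), (10, 10), (20, 20)], direction=(1, 1)))
# [(0, 0), (30, 30), (60, 60)]
```

Because downstream images receive larger shifts, the movement amount naturally increases toward the images positioned downstream in the direction of movement.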
In this manner, in a case where the image 81 that is included in the overlapping image 80 is moved, it is difficult for an error in an operation to occur when the operator selects the image 81. More specifically, it is difficult for the error in an operation to occur when one or several images 81 are selected from among the plurality of images 81 that are included in the overlapping image 80.
Additionally, as in the present embodiment, when a deviation between the images 81 is increased, it is difficult for an error in selection to occur when one or several images 81 are selected from among the plurality of images 81, compared with a case where the deviation between the images 81 is not increased.
It is noted that, in the present embodiment, when a state is reached that is illustrated in
In other words, before the finger 200 of the operator reaches the overlapping image 80, the display control unit 63 changes the arrangement of the plurality of images 81. Additionally, while the finger 200 of the operator is in the middle of getting closer to the overlapping image 80, the display control unit 63 moves the image 81 and increases the deviation between the images 81.
In this case, the operator makes a selection of the image 81 at an earlier timing than in a case where the arrangement is changed after the finger 200 of the operator reaches the overlapping image 80.
It is noted that the processing in the case where the finger 200 is caused to approach is described above, but that, in a case where the selection of the image 81 is made with the operation tool or the pointer 106C (see
In the processing that is illustrated in each of
In contrast, in processing that is illustrated in each of
In the case where the finger 200 of the operator approaches the first edge part and portion 81X, as illustrated in
In other words, each image 81 moves along the upward-downward direction in
In this case, in the same manner as described above, it is also difficult for the error in selection to occur when one or several images 81 are selected from among the plurality of images 81.
In the present embodiment, as illustrated in
Specifically, in an example that is illustrated in
At this point, the direction of movement when each image 81 is moved is not limited to a direction along the side, and may be a direction that intersects the direction along the side.
Specifically, for example, in a case where the destination of the operator's gaze is present in the overlapping image 80 and where the finger 200 of the operator approaches the overlapping image 80, as illustrated in
In this case, on both the short side 84 side and the long side 85 side of the overlapping image 80, the deviation between the images 81 increases (the gap between the edge parts 81A is broadened), and on both the short side 84 side and the long side 85 side, the selection of the image 81 is easy to make.
It is noted that, in a case where each image 81 is moved in the diagonal direction, an area that is occupied by the overlapping image 80 after the image 81 is moved increases much more than in a case where each image 81 is moved along the side.
For this reason, for example, in a case where a space for moving the image 81 is small, such as in a case where an area of the display unit 106 is small, and so forth, as illustrated in
Furthermore, in moving each image 81 that is included in the overlapping image 80, instead of moving the image 81 only in one direction, one or several images 81 may be moved in one direction and any other image 81 may be moved in the direction opposite to the one direction.
Specifically, for example, as illustrated in
In other words, in this example, among the images 81 that are included in the overlapping image 80, an image 81 that is positioned close to one end portion (an end portion (a corner portion) in the lower right side of
It is noted that, in this movement, in the same manner as described above, an amount of movement is increased as much as necessary to reach the image 81 that is positioned downstream in the direction of movement.
At this point, when a configuration is employed in which the image 81 is moved only in one direction, and when any other image or the like is present downstream in the one direction, the amount of the movement of the image 81 is small and the amount of the movement of the image 81 is difficult to secure.
As described above, if the image 81 is made to be moved not only in one direction but also in the opposite direction, the amount of the movement of the image 81 is easier to secure than in the case where the image 81 is moved only in one direction.
It is noted that, in an example that is illustrated in
At this point, in a case where the image 81 is moved along the side of the image 81 that is included in the overlapping image 80, one or several images 81 that are included in the overlapping image 80, for example, are moved in the rightward direction, and any one or several other images 81 are moved in the leftward direction.
Furthermore, in addition to this, in the case where the image 81 is moved along the side of the image 81 that is included in the overlapping image 80, one or several images 81 that are included in the overlapping image 80, for example, are moved in the upward direction, and any one or several other images 81 are moved in the downward direction.
Furthermore, in moving the image 81 that is included in the overlapping image 80, the image 81 may be moved toward a broader gap, among a plurality of gaps that are positioned adjacent to the overlapping image 80.
In many cases, as illustrated in
More specifically, for example, in some cases, any other images 89, which are arranged to be spaced with a gap 9G over or under the overlapping image 80, to the left or right side of the overlapping image 80, and in any other position adjacent to the overlapping image 80, may be displayed.
In this case, in moving the image 81 that is included in the overlapping image 80, it is preferable that, among gaps 9G each of which is positioned between the overlapping image 80 and any other image 89, the image 81 is moved toward a gap 9G other than the smallest gap 9G.
In an example that is illustrated in
More preferably, among the plurality of gaps 9G, the image 81 is moved toward the greatest gap 9G.
In this example, the gap 9G that is indicated by a reference character 9B is the greatest gap 9G, and it is preferable that the image 81 is moved toward the greatest gap 9G.
In a case where the image 81 is moved toward a small gap 9G, the amount of the movement of the image 81 is difficult to secure. In contrast, in a case where the image 81 is moved toward a great gap 9G, the amount of the movement of the image 81 is easy to secure.
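The preference described above, moving the images toward the greatest of the surrounding gaps because a greater gap makes the movement amount easier to secure, amounts to a simple maximum selection. A minimal illustrative sketch (the direction names, gap values, and function name are assumptions, not part of the application):

```python
# Illustrative sketch (assumed names): pick the direction whose adjacent
# gap 9G is greatest, so the movement amount of the images is easy to secure.

def choose_move_direction(gaps):
    """gaps maps a direction name to the size (px) of the gap 9G on that
    side of the overlapping image; the direction of the greatest gap wins."""
    return max(gaps, key=gaps.get)

print(choose_move_direction({"up": 12, "right": 40, "down": 8}))  # right
```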
In the present embodiment, first, the gaze detection unit 61 determines whether or not the destination of the operator's gaze is present in the above-described specific portion of the overlapping image 80 (Step 101).
Then, in a case where it is determined that the destination of the operator's gaze is present in the specific portion, the motion detection unit 62 determines whether or not the finger 200 approaches the specific portion (Step 102).
Then, in a case where the finger 200 approaches the specific portion, the display control unit 63 changes the arrangement of the images 81 that are included in the overlapping image 80 and increases the deviation between the images 81 (Step 103).
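The flow of Steps 101 to 103 can be sketched as follows. This is illustrative only: the callables stand in for the gaze detection unit 61, the motion detection unit 62, and the display control unit 63, and all names are assumptions.

```python
# Illustrative sketch (assumed names): the arrangement is changed only
# when both the gaze condition and the motion condition hold.

def maybe_change_arrangement(gaze_on_portion, finger_near_portion, spread):
    """Return True if the arrangement of the images was changed."""
    if gaze_on_portion():          # Step 101: gaze on the specific portion?
        if finger_near_portion():  # Step 102: finger approaching it?
            spread()               # Step 103: increase the deviation
            return True
    return False

calls = []
print(maybe_change_arrangement(lambda: True, lambda: True,
                               lambda: calls.append("spread")))  # True
print(calls)  # ['spread']
```

If either condition fails, `spread` is never invoked and the arrangement is left as-is.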
It is noted that, in the present embodiment, in the case where the destination of the operator's gaze is present in the specific portion of the overlapping image 80 and where the finger 200 of the operator gets closer to the specific portion of the overlapping image 80, the arrangement of the images 81 is changed.
Incidentally, in a case where the destination of the operator's gaze is present in any portion of the overlapping image 80 and where the finger 200 of the operator approaches any portion of the overlapping image 80, the change of the arrangement may be performed. In other words, in a case where, without any limitation to a specific portion, the gaze is directed toward any portion of the overlapping image 80 and the finger 200 of the operator gets closer to any portion of the overlapping image 80, the arrangement may be changed.
Furthermore, the change of the arrangement may be performed after the finger 200 of the operator reaches the overlapping image 80 (the change of the arrangement may be performed after the finger 200 of the operator reaches the overlapping image 80 and comes into contact with the operation receiving unit 107).
In this processing example, as illustrated in
Then, in this processing, as indicated by a reference character 11A in
More specifically, the receiving unit 64 obtains positional information on each of the images 81 that are displayed on the display unit 106, and an output (information indicating an operation position at which the operator performs an operation) from the operation receiving unit 107, and receives the selection of the image 81 (the image 81 thus selected is hereinafter referred to as a "selection image 86"). In other words, the receiving unit 64 receives content that is selected by the operator.
Next, in the present embodiment, the position specification unit 65 specifies a position on the display unit 106, which is a position of a destination toward which the gaze of the operator who makes a selection of the image 81 is directed.
Specifically, based on the output from the gaze detection device 113 (see
In this example, a portion that is indicated by a reference character 11B in
At this point, in an example that is illustrated in
Thereafter, in this display example, as illustrated in
Thereafter, as indicated by an arrow 12A in
In other words, the display control unit 63 moves the image 81 that is positioned adjacent to the specific position 78, in a direction away from the specific position 78, and broadens the gap between the two images 81 that are adjacent to each other.
It is noted that, in this display example, the case is described where the gap between two images 81 is broadened in a case where the operator moves the finger 200 in the state where the selection image 86 is selected, in such a manner as to face the specific position 78, but that the gap between two images 81 may also be broadened in a case where the finger 200 is not moved in such a manner as to face the specific position 78.
Specifically, for example, if a position (the specific position 78) of the destination of the operator's gaze is present in the gap between two images 81, although the finger 200 is not moved, the gap between two images 81 may be broadened.
In the present embodiment, among a plurality of images 81 on the display unit 106, the display control unit 63 moves an image 81 (an image 81 that is positioned adjacent to the specific position 78) that has a predetermined positional relationship with the specific position 78 which is specified by the position specification unit 65, and thus broadens the gap between two images 81 that are adjacent to each other.
More specifically, the display control unit 63 moves both the two images 81 with the specific position 78 in between, in the direction away from the specific position 78, and thus broadens the gap. It is noted that in this processing example, in this manner, both the two images 81 are moved, but that only one image may be moved.
Furthermore, in the present embodiment, instead of only an image 81 (hereinafter referred to as an “adjacent image 88”) being moved in the direction away from the specific position 78, an image 81 that is positioned more downstream than the adjacent image 88 is also moved in the direction away from the specific position 78.
Specifically, in the present embodiment, an image 81 that is indicated by a reference character 12B in
In the present embodiment, at this time, every image 81 that is positioned more downstream than the adjacent image 88 in the direction away from the specific position 78 (the images 81 that are indicated by a reference character 12Y) is also moved in the direction away from the specific position 78.
Accordingly, an amount of movement of the adjacent image 88 is easier to secure than in a case where the images 81 that are positioned more downstream than the adjacent image 88 are not moved.
Thereafter, in the present embodiment, an operation (movement of the selection image 86 by the operator) by the operator is further performed, and, as illustrated in
In the present embodiment, when the operator moves the selection image 86, a space (a gap) that is the destination to which the selection image 86 is moved is broadened, so that the selection image 86 is easy to move and it is difficult for an error in an operation to occur in moving the selection image 86.
In this processing example, as indicated by a reference character 13A in
In other words, in this processing example, it is considered that the operator moves the selection image 86 in such a manner as to be positioned between the rim 106B of the display unit 106 and the image 81 that is displayed on the display unit 106.
In this case, as indicated by an arrow 13B in
It is noted that in this processing example, in the same manner as described above, an image 81 (an image that is indicated by a reference character 13C) that is positioned more downstream than the adjacent image 88 is moved in the direction away from the specific position 78.
In this processing that is illustrated in
Subsequently, the position specification unit 65 determines whether the specific position 78 is present between two images 81 or between an image 81 and the rim 106B (Step 202).
Then, in a case where the specific position 78 is present between two images 81 or between an image 81 and the rim 106B, the display control unit 63 moves the adjacent image 88 that is positioned adjacent to the specific position 78, and thus broadens a gap that is the destination to which the selection image 86 is moved (Step 203).
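In one dimension, the flow of Steps 201 to 203, checking whether the specific position 78 lies in a gap and, if so, moving the adjacent image 88 together with every image downstream of it away from that position, can be sketched as follows. This is an illustrative sketch with assumed names and pixel values, not code from the application.

```python
# Illustrative sketch (assumed names): broaden the gap at the gaze
# destination so the selection image can be dropped there.

def broaden_gap(image_lefts, width, specific_x, shift=15):
    """image_lefts: sorted left edges of equally wide images in a row.
    If specific_x (the gaze destination) lies in the gap between two
    adjacent images, shift the right-hand adjacent image and every image
    downstream of it further right, broadening the destination gap."""
    new_lefts = list(image_lefts)
    for i in range(len(image_lefts) - 1):
        gap_start = image_lefts[i] + width
        gap_end = image_lefts[i + 1]
        if gap_start <= specific_x <= gap_end:        # Step 202
            for j in range(i + 1, len(image_lefts)):  # Step 203
                new_lefts[j] += shift
            break
    return new_lefts

# Three 50-px-wide images; the gaze rests in the gap between the 1st and 2nd.
print(broaden_gap([0, 60, 120], width=50, specific_x=55))
# [0, 75, 135]
```

If the specific position lies on an image rather than in a gap, nothing moves, which matches the branch taken when Step 202 fails.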
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---|
2017-170612 | Sep 2017 | JP | national |
Number | Name | Date | Kind |
---|---|---|---|
20090046075 | Kim | Feb 2009 | A1 |
20110219297 | Oda | Sep 2011 | A1 |
20130176208 | Tanaka | Jul 2013 | A1 |
20140189580 | Kawamata | Jul 2014 | A1 |
20160026244 | Ogawa | Jan 2016 | A1 |
Number | Date | Country |
---|---|---|
2011186742 | Sep 2011 | JP |
Number | Date | Country
---|---|---
20190075210 A1 | Mar 2019 | US