This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2015-056904 filed Mar. 19, 2015.
The present invention relates to a selection support apparatus, a selection support method, and a non-transitory computer readable medium.
According to an aspect of the invention, there is provided a selection support apparatus including a display, an enlarging unit, and a determining unit. The display displays an image on a display screen including multiple unit regions in which a designation operation by an operator is detected. The enlarging unit enlarges the image such that a part of the image displayed on one of the multiple unit regions of the display screen is displayed on an enlarged region including two or more of the multiple unit regions neighboring each other of the display screen. The determining unit determines that the part of the image is selected if the designation operation by the operator is detected in all or at least one of the unit regions included in the enlarged region.
An exemplary embodiment of the present invention will be described in detail based on the following figures, wherein:
Hereinafter, an exemplary embodiment of the present invention will be described in detail with reference to the attached drawings.
The image processing apparatus 10 is an example of a selection support apparatus and, for example, a so-called “general-purpose personal computer (PC)”. The image processing apparatus 10 is designed to perform image information generation or the like by running various software applications under the control of an operating system (OS).
The display 20 displays the image on a display screen 21. The display 20 includes a device, such as a liquid crystal display for a PC, a liquid crystal television set, or a projector, that has a function of displaying an image on the basis of additive color mixture. Accordingly, the display system used by the display 20 is not limited to the liquid crystal system. In the example in
The input device 30 includes a keyboard, a mouse, and the like. The input device 30 is used for starting and terminating image processing application software and inputting designation for image processing into the image processing apparatus 10 (described later in detail).
The image processing apparatus 10 is connected to the display 20 on the basis of, for example, digital visual interface (DVI). Instead of DVI, high-definition multimedia interface (HDMI) (registered trademark), DisplayPort, or the like may be used for the connection.
The image processing apparatus 10 is also connected to the input device 30 on the basis of, for example, universal serial bus (USB). Instead of USB, IEEE1394, RS-232C, or the like may be used for the connection.
In the image processing system 1 as described above, the display 20 first displays an original image that is an image yet to undergo image processing. When the user inputs designation for the image processing into the image processing apparatus 10 by using the input device 30, the image processing apparatus 10 performs the image processing on the image information regarding the original image. The result of the image processing is reflected on the image displayed on the display 20, and the display 20 displays the image that has undergone the image processing and has thus been redrawn.
The mode of the image processing system 1 in the present exemplary embodiment is not limited to the mode in
As described above, the display screen 21 has minimum-unit regions in which designation input from the input device 30 is detected, regardless of whether the display 20 is a device different from the input device 30, functions as the input device 30, or is integrated with the input device 30. In the present exemplary embodiment, the minimum-unit regions are used as an example of unit regions in which a designation operation by a user is detected. In addition, the display screen 21 is provided as an example of a display screen having multiple unit regions, and the display 20 is provided as an example of a display that displays an image on the display screen.
The image information acquiring unit 11 acquires the image information regarding the original image. Specifically, the image information acquiring unit 11 acquires image information yet to undergo image processing. The image information is, for example, red, green, and blue (RGB) video data for displaying an image on the display 20.
The user designation receiving unit 12 receives image processing designation input by the user through the input device 30. Specifically, the user designation receiving unit 12 receives, as user-designated information, designation for obtaining an enlarged image by enlarging the original image displayed on the display 20. The user designation receiving unit 12 also receives, as user-designated information, designation for selecting one or more dots on the enlarged image. The user designation receiving unit 12 receives the designation so as to set a seed on the original image by using the enlarged image displayed on the display 20. The term “seed” denotes a movement line, a point, or the like resulting from an operation performed by the user on a specific region for segregating the region from the other regions. The movement line, the point, or the like may be set, for example, by dragging the mouse on the image displayed on the display screen or by moving a finger of the user, a touch pen, or the like over the image displayed on the display screen.
The enlargement processor 13 performs image processing on the image information regarding the original image on the basis of the user designation received by the user designation receiving unit 12 so as to display, on the display screen 21, the enlarged image obtained by enlarging the original image displayed on the display 20. For example, in a case where one pixel in the original image is represented by one dot on the display screen 21, the original image is enlarged such that a pixel (hereinafter, referred to as an “enlarged pixel”) of the enlarged image corresponding to the one pixel of the original image is represented by multiple dots on the display screen 21. In this case, a pixel is an example of a part of an image, a region in which the one dot is displayed is an example of a unit region, and a region in which the multiple dots are displayed is an example of an enlarged region. The enlargement processor 13 is provided as an example of an enlarging unit that enlarges an image so as to display, in the enlarged region, the part of the image displayed in the unit region. The enlargement processor 13 will be described in detail later.
The seed setting unit 14 sets a seed on the original image by using the enlarged image on the basis of the user designation received by the user designation receiving unit 12. Specifically, when at least one of the dots representing an enlarged pixel is selected on the enlarged image, the seed setting unit 14 determines, on the basis of the proportion of the selected dot relative to the dots representing the enlarged pixel, a relationship of the feature amount between the enlarged pixel and the already selected enlarged pixel, and the like, whether to select the enlarged pixel. The seed setting unit 14 changes the display from the enlarged image back to the original image and sets the seed on a pixel in the original image corresponding to the enlarged pixel determined to be selected. In the present exemplary embodiment, the seed setting unit 14 is provided as an example of a determining unit that determines that the part of the image is selected. The seed setting unit 14 will also be described in detail later.
The region detecting unit 15 detects a designated region in the original image displayed on the display 20 on the basis of information regarding the seed set in the original image by the seed setting unit 14, the designated region being designated by the user as an image region for the image processing. In practice, the region detecting unit 15 performs processing of segregating the designated region from the other regions in the original image displayed on the display 20. Specifically, the region detecting unit 15 first adds a flag to a pixel of a part for which the seed is set. If the pixel value of the pixel with the seed is close to that of one of the neighboring pixels (on the basis of, for example, a Euclidean distance between RGB values), the region detecting unit 15 couples the pixels together. If the pixel values are not close, the region detecting unit 15 does not couple the pixels together. The region detecting unit 15 repeats this processing to extend the region.
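The coupling-and-repeating processing described above amounts to a breadth-first region growing. A minimal sketch follows, assuming 4-connected neighbors, RGB tuples, and an illustrative distance threshold (the function name and threshold value are not from the specification):

```python
import math
from collections import deque

def grow_region(pixels, seeds, threshold=30.0):
    """Breadth-first region growing: starting from seed pixels, couple a
    neighboring pixel whenever its RGB value is close (small Euclidean
    distance) to the pixel it is reached from, and repeat to extend the region."""
    height, width = len(pixels), len(pixels[0])
    region = set(seeds)      # pixels flagged as part of the designated region
    queue = deque(seeds)
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4-connected neighbors
            ny, nx = y + dy, x + dx
            if 0 <= ny < height and 0 <= nx < width and (ny, nx) not in region:
                d = math.dist(pixels[y][x], pixels[ny][nx])  # RGB Euclidean distance
                if d <= threshold:
                    region.add((ny, nx))
                    queue.append((ny, nx))
    return region
```

For example, starting from a seed on a red pixel, nearly identical red neighbors are coupled into the region while distant colors are left out.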
The image processor 16 actually performs the image processing on the designated region thus segregated. For example, the image processor 16 performs adjustment of hue, saturation, and brightness in the segregated designated region or adjustment for enhancing visibility, such as retinex.
The image information output unit 17 outputs the image having undergone the image processing in this manner. The image information having undergone the image processing is thereby transmitted to the display 20. Then, the display 20 displays the image on the basis of the image information.
How the enlargement processor 13 performs processing will be described using a specific example.
In the same display resolution situation as described above, the enlargement processor 13 enlarges a region corresponding to one pixel in accordance with an enlargement ratio. In
Since a tool of a fixed size, such as a touch pen or the user's finger, is used to select a pixel regardless of the image, enlarging the pixel in this manner facilitates selection of the pixel.
Meanwhile, general image processing software places emphasis on the appearance of an image and thus enlarges the image with interpolation so that colors change smoothly between pixels. In contrast, the present exemplary embodiment places emphasis on presenting the enlarged pixels clearly to the user, and thus enlarges an image such that dots of the same color are arranged in a square for each pixel.
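Such interpolation-free enlargement, in which each pixel becomes a square block of identically colored dots, can be sketched as follows (the function name and list-of-lists image representation are illustrative):

```python
def enlarge_pixels(image, ratio):
    """Enlarge an image so that each source pixel is displayed as a
    ratio x ratio square of identically colored dots, with no interpolation
    between neighboring pixels."""
    enlarged = []
    for row in image:
        # Repeat each pixel 'ratio' times horizontally ...
        wide_row = [pixel for pixel in row for _ in range(ratio)]
        # ... then repeat the widened row 'ratio' times vertically.
        enlarged.extend([list(wide_row) for _ in range(ratio)])
    return enlarged
```

With an enlargement ratio of 2, one pixel of the original image is represented by a 2×2 block of dots on the display screen.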
In addition, in the present exemplary embodiment, the enlarged pixels at the time when the enlargement processor 13 enlarges an image may be presented to the user as illustrated in
Meanwhile, the following four methods are examples of conceivable methods by which the enlargement processor 13 receives user designation and by which the enlargement processor 13 enlarges an image in accordance with the user designation. In the first method, in response to tapping of the display screen 21 by the user, an image in a predetermined range having the tapped pixel in the center is enlarged at a predetermined ratio. In the second method, in response to designation of a region by the user, the image of the region is fully enlarged on the entire display screen 21. In the third method, in response to a two-finger pinch-out operation by the user, an image is enlarged at the enlargement ratio associated with the operation. In the fourth method, in response to pressing of an enlargement button by the user, an image is enlarged at an enlargement ratio associated with the enlargement button.
Note that, particularly in the second and third methods, the enlargement ratio might not be an integral multiple (that is, when expressed as a percentage, an integral multiple of 100%). In this case, for example, the enlargement ratio identified from the operation may be rounded to the nearest whole number, and the rounded integral ratio may be used when a pixel is enlarged.
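The rounding step can be sketched as below; the clamp to a minimum of 1 is an assumption added here so that a fractional ratio never shrinks the image, and is not stated in the specification:

```python
def integral_ratio(ratio):
    """Round an enlargement ratio derived from a pinch-out or region
    designation to the nearest whole number, never going below 1
    (the lower clamp is an assumption, not from the specification)."""
    return max(1, round(ratio))
```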
How the seed setting unit 14 performs processing will be described in detail by using a specific example.
In contrast, in a case where dots are selected on the basis of a line, not a point, the number of selected dots varies depending on the enlarged pixel.
Note that the threshold for the number of dots is herein provided on the assumption that the enlargement ratio is 2, but the threshold is not limited to this. If various enlargement ratios are assumed, a threshold for a proportion of the number of selected dots relative to the number of dots representing an enlarged pixel may be provided.
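The proportion-based decision described above can be sketched as follows (the 0.5 threshold is illustrative; the specification leaves the concrete value open):

```python
def pixel_selected(selected_dots, total_dots, proportion_threshold=0.5):
    """Decide that an enlarged pixel is selected when the proportion of its
    dots touched by the user's designation operation meets the threshold.
    The default threshold of 0.5 is an illustrative assumption."""
    return selected_dots / total_dots >= proportion_threshold
```

For a 2×2 enlarged pixel (four dots), selecting two dots would meet a 0.5 threshold while selecting one would not.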
In
Further in the present exemplary embodiment, whether an enlarged pixel neighboring an already selected enlarged pixel is selected may be determined on the basis of a relationship of the feature amount between the selected enlarged pixel and the neighboring pixel. Any feature amount may be used as long as the feature amount is obtained from pixel values. A feature amount of a color will herein be described as an example. Specifically, when the user selects one enlarged pixel, it is determined that among the enlarged pixels within the predetermined distance from the selected enlarged pixel, one or more enlarged pixels having a color similar to that of the selected enlarged pixel are also selected.
Note that examples of the simplest color-similarity determination method include a method by which a Euclidean distance between RGB values is obtained. The method is used to obtain a distance D between a color of (R1, G1, B1) and a color of (R2, G2, B2) in accordance with the following formula.

D = √((R1 − R2)² + (G1 − G2)² + (B1 − B2)²)  (Formula 1)
If the distance D is equal to or less than a threshold, the colors may be determined to be similar to each other.
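The distance computation and threshold test above can be sketched as follows (the threshold value is illustrative):

```python
import math

def colors_similar(c1, c2, threshold=30.0):
    """Return True when the Euclidean distance D between two RGB colors
    is equal to or less than the threshold (the default threshold is an
    illustrative assumption)."""
    d = math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))
    return d <= threshold
```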
Note that the condition in which the color of the neighboring enlarged pixel is similar to the color of the enlarged pixel selected by the user is herein used as the condition for selecting the neighboring enlarged pixel, but this is only an example. A more generalized condition for selecting a neighboring enlarged pixel may be used. Specifically, the condition may be a condition in which the feature amount of a neighboring enlarged pixel has a predetermined relationship with the feature amount of an enlarged pixel selected by the user.
The case where dots are selected on the basis of a line, not a point, may also be considered in the method for determining whether to select an enlarged pixel neighboring the selected enlarged pixel in
Note that α and β are weights assigned to the distance D and the dot proportion (M/N), respectively, and are each a positive number. The evaluation value V increases as the distance D decreases and as the dot proportion (M/N) increases. Accordingly, if the evaluation value V is equal to or larger than a threshold, it may be determined that the neighboring enlarged pixel is selected. However, this is only an example. A more generalized condition may be used, in consideration of the possible use of a formula other than Formula 2. Specifically, if the evaluation value V satisfies a predetermined condition, it may be determined that the neighboring enlarged pixel is selected.
Although a relationship between the values of α and β has not been described, the value of β may be set larger than the value of α because, for example, the higher the enlargement ratio, the more reliable the dot proportion (M/N) becomes. In other words, a modification is also conceivable in which the value of β relative to the value of α is set larger as the enlargement ratio becomes higher.
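Since Formula 2 is not reproduced here, the sketch below uses one plausible linear combination that satisfies the stated properties (V increases as D decreases and as M/N increases); the exact form, the weight values, and the zero threshold are all assumptions:

```python
def evaluation_value(distance_d, selected_dots, total_dots, alpha=1.0, beta=100.0):
    """One plausible form of the evaluation value V: a weighted linear
    combination that increases as the color distance D decreases and as
    the dot proportion M/N increases. This form is an assumption standing
    in for Formula 2, which is not reproduced here."""
    return -alpha * distance_d + beta * (selected_dots / total_dots)

def neighbor_selected(v, threshold=0.0):
    # The neighboring enlarged pixel is determined to be selected when V
    # is equal to or larger than the threshold (illustrative value).
    return v >= threshold
```

A neighboring enlarged pixel with a small color distance and a high touched-dot proportion would score high and be selected; a distant color with no touched dots would not.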
The image information acquiring unit 11 first acquires RGB data as image information regarding an original image for image processing (step 101). The RGB data is transmitted to the display 20, and the original image yet to undergo the image processing is displayed on the display 20.
When the user designates enlargement of the original image displayed on the display 20 by using the input device 30, the user designation receiving unit 12 receives the designation of the enlargement (step 102).
The enlargement processor 13 performs, on the RGB data acquired in step 101, image processing for enlarging a pixel at a designated enlargement ratio, for example, as illustrated in
Next, the user inputs a seed such as a movement line by using the input device 30, thus designating a designated region that is an image region for the image processing. The user designation receiving unit 12 receives information regarding the seed (step 104).
The seed information is information for selecting a dot corresponding to an enlarged pixel in the enlarged image displayed on the display 20. Accordingly, the seed setting unit 14 performs the processing described with reference to
The region detecting unit 15 performs processing of segregating a designated region on the basis of the seed set in step 106 (step 107).
The image processor 16 performs the image processing on the segregated designated region (step 108).
The image information output unit 17 outputs image information having undergone the image processing (step 109). The image information is RGB data. The RGB data is transmitted to the display 20, and the display screen 21 displays the image having undergone the image processing.
The aforementioned processing performed by the image processing apparatus 10 in the present exemplary embodiment is prepared as a program such as a software application.
Accordingly, the processing performed by the image processing apparatus 10 in the present exemplary embodiment may also be regarded as a program causing a computer to execute a process including: displaying an image on a display screen including multiple unit regions in which a designation operation by an operator is detected; enlarging the image such that a part of the image displayed on one of the multiple unit regions of the display screen is displayed on an enlarged region including two or more of the multiple unit regions neighboring each other of the display screen; and determining that the part of the image is selected if the designation operation by the operator is detected in all or at least one of the unit regions included in the enlarged region.
The program that implements the present exemplary embodiment may be provided through not only a communication unit but also a recording medium such as a compact disc read only memory (CD-ROM) storing the program therein.
The foregoing description of the exemplary embodiment of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
2015-056904 | Mar 2015 | JP | national |