This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2013-166811 filed Aug. 9, 2013.
(i) Technical Field
The present invention relates to an image processing apparatus and a non-transitory computer readable medium.
(ii) Related Art
In recent years, face authentication technologies have been developed and become widely available, and have been applied to various fields.
According to an aspect of the invention, there is provided an image processing apparatus including an image capturing unit, a registration unit, a display, and an authentication unit. The image capturing unit captures a face image of a user. The registration unit registers the face image captured by the image capturing unit. The display displays, in a case where a face image is to be captured and registered in the registration unit, a guide image for capturing, in addition to a first image which is the face image of the user, at least any one of an upward-oriented image and a downward-oriented image, the upward-oriented image being an image in which a face of the user is oriented upward relative to the first image, the downward-oriented image being an image in which the face of the user is oriented downward relative to the first image. The authentication unit performs face authentication by comparing a face image obtained by the image capturing unit with a registered face image.
An exemplary embodiment of the present invention will be described in detail based on the following figures, wherein:
Hereinafter, an exemplary embodiment of the present invention will be described with reference to the drawings.
The image processing apparatus 10 includes a person detecting sensor 191, a first image capturing unit 192, and a second image capturing unit 193.
The person detecting sensor 191 is constituted by, for example, an infrared sensor, and is provided on a front surface of a housing of the image processing apparatus 10. The person detecting sensor 191 detects a human body existing in a detection region F1 illustrated in
The first image capturing unit 192 is, for example, a camera including a wide-angle lens, and is provided on the front surface of the housing of the image processing apparatus 10. The first image capturing unit 192 captures an image of a detection region F2 illustrated in
The second image capturing unit 193 is, for example, a camera, and is provided next to an operation unit 13 and a display 14 on a top surface of the housing of the image processing apparatus 10. The second image capturing unit 193 captures a face image of a user who uses the image processing apparatus 10.
An operation region F3 illustrated in
The controller 11 includes, for example, a central processing unit (CPU) and a memory, and controls the individual units of the image processing apparatus 10. The CPU reads out and executes a program stored in the memory or the storage unit 15. The memory includes a read only memory (ROM) and a random access memory (RAM). The ROM stores a program and various pieces of data in advance. The RAM temporarily stores a program and data, and functions as a working area when the CPU executes a program.
The communication unit 12 is a communication interface connected to a communication line. The communication unit 12 communicates with a client apparatus or another image processing apparatus 10 connected to the communication line, via the communication line.
The operation unit 13 is constituted by, for example, a touch panel and keys, and supplies data corresponding to a user operation to the controller 11.
The display 14 is, for example, a liquid crystal display, and displays various pieces of information. The operation unit 13 and the display 14 are provided on the top surface of the housing of the image processing apparatus 10. The operation unit 13 and the display 14 may be integrated together into a touch panel.
The storage unit 15 is a hard disk, a semiconductor memory, or the like, and stores various programs and data used by the controller 11.
The image reading unit 16 is an image scanner, and reads an image of a document and generates image data.
The image forming unit 17 forms an image corresponding to image data on a sheet medium, such as paper. The image forming unit 17 may form an image by using an electrophotographic system, or may form an image by using another method. The image forming unit 17 typically functions as a printer.
The power-source circuit 18 supplies power to the individual units of the image processing apparatus 10.
The person detecting device 19 detects a user of the image processing apparatus 10, and includes the person detecting sensor 191, the first image capturing unit 192, and the second image capturing unit 193.
The image processing unit 194 analyzes an image captured by the first image capturing unit 192 and an image captured by the second image capturing unit 193, and executes various processing operations. The image processing unit 194 may be constituted by a CPU and a memory, or may be constituted by an application specific integrated circuit (ASIC).
The communication controller 195 controls communication performed between the person detecting device 19 and the controller 11. Specifically, when a person is detected from an image captured by the first image capturing unit 192 or the second image capturing unit 193, the communication controller 195 transmits a detection signal to the controller 11.
The operation mode controller 101 is implemented by the controller 11, and controls the operation modes of the individual units of the image processing apparatus 10. The operation mode controller 101 controls the operation modes of a main system of the image processing apparatus 10, the operation modes of the first image capturing unit 192 and the second image capturing unit 193, and the operation modes of the image processing unit 194 and the communication controller 195. The main system corresponds to the configuration of the image processing apparatus 10 except the person detecting device 19, and includes, for example, the image reading unit 16 and the image forming unit 17.
The operation modes of the main system include a standby mode and a sleep mode. In the standby mode, the power that is necessary for operation is supplied to the main system, and an operable state is achieved. After the mode has shifted to the standby mode, the image processing apparatus 10 executes scan processing, copy processing, print processing, or facsimile processing in response to a user operation. In the sleep mode, power supply to at least a part of the main system is stopped, and at least the part of the main system is brought into a non-operation state. In the sleep mode, power supply to a part of the controller 11, and to the display 14, the image reading unit 16, and the image forming unit 17 is stopped.
The operation modes of the first image capturing unit 192 and the second image capturing unit 193 include an ON-state and an OFF-state. In the ON-state, power is supplied to the first image capturing unit 192 and the second image capturing unit 193, and the power of the first image capturing unit 192 and the second image capturing unit 193 is turned on. In the OFF-state, power supply to the first image capturing unit 192 and the second image capturing unit 193 is stopped, and the power of the first image capturing unit 192 and the second image capturing unit 193 is turned off.
The operation modes of the image processing unit 194 and the communication controller 195 include a standby mode and a sleep mode. In the standby mode, the power that is necessary for operation is supplied to the image processing unit 194 and the communication controller 195, and an operable state is achieved. In the sleep mode, power supply to at least a part of the image processing unit 194 and the communication controller 195 is stopped, and the image processing unit 194 and the communication controller 195 are brought into a non-operation state.
The operation mode controller 101 includes a first timer 111 and a second timer 112. The first timer 111 is used to shift the main system to the sleep mode. The second timer 112 is used to bring the first image capturing unit 192 and the second image capturing unit 193 into the OFF-state, and to shift the image processing unit 194 and the communication controller 195 to the sleep mode.
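The two-timer arrangement described above can be sketched as follows. The class name, attribute names, the use of a monotonic idle clock, and the specific timeout handling are assumptions made for illustration; the embodiment does not specify them.

```python
import time
from enum import Enum

class Mode(Enum):
    STANDBY = "standby"
    SLEEP = "sleep"

class OperationModeController:
    """Illustrative sketch of the first-timer / second-timer mode control."""

    def __init__(self, main_timeout, camera_timeout):
        # first timer 111: shifts the main system to the sleep mode
        self.main_timeout = main_timeout
        # second timer 112: turns the cameras off and shifts the image
        # processing unit / communication controller to the sleep mode
        self.camera_timeout = camera_timeout
        self.main_mode = Mode.STANDBY
        self.subsystem_mode = Mode.STANDBY
        self.camera_on = True
        self.last_activity = time.monotonic()

    def notify_activity(self):
        """Reset both timers and restore the operable state."""
        self.last_activity = time.monotonic()
        self.main_mode = Mode.STANDBY
        self.subsystem_mode = Mode.STANDBY
        self.camera_on = True

    def tick(self, now=None):
        """Check elapsed idle time against both timeouts."""
        now = time.monotonic() if now is None else now
        idle = now - self.last_activity
        if idle >= self.main_timeout:
            self.main_mode = Mode.SLEEP
        if idle >= self.camera_timeout:
            self.camera_on = False
            self.subsystem_mode = Mode.SLEEP
```

Because the second timeout is longer than the first in this sketch, the cameras and the image processing unit remain available for a while after the main system sleeps, which matches the staged shutdown described above.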
The power controller 102 controls, under control performed by the operation mode controller 101, power supply from the power-source circuit 18 to the individual units of the image processing apparatus 10. The power controller 102 constantly supplies power to the person detecting sensor 191.
The approach determining unit 103 is implemented by the image processing unit 194, and determines, by using an image captured by the first image capturing unit 192, whether or not a person existing in the detection region F2 is approaching the image processing apparatus 10. Specifically, the approach determining unit 103 detects the shape of a person from a captured image and detects the orientation of the person. If the detected person is oriented toward the image processing apparatus 10, the approach determining unit 103 determines that the person is approaching the image processing apparatus 10. Otherwise, the approach determining unit 103 determines that the person is not approaching the image processing apparatus 10.
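The orientation test above can be illustrated with a small geometric sketch: a person is treated as approaching when his or her facing direction points toward the apparatus within some angular tolerance. The function name, coordinate convention, and the tolerance value are assumptions of this sketch, not values given in the embodiment.

```python
import math

def is_approaching(person_pos, facing_deg, device_pos=(0.0, 0.0),
                   tolerance_deg=45.0):
    """Return True when the facing direction points at the device.

    person_pos / device_pos are (x, y) floor coordinates; facing_deg is
    the person's facing bearing in degrees (atan2 convention).
    """
    dx = device_pos[0] - person_pos[0]
    dy = device_pos[1] - person_pos[1]
    to_device = math.degrees(math.atan2(dy, dx))
    # smallest signed angular difference between facing and device bearing
    diff = (facing_deg - to_device + 180.0) % 360.0 - 180.0
    return abs(diff) <= tolerance_deg
```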
The stay determining unit 104 is implemented by the image processing unit 194, and determines, by using an image captured by the first image capturing unit 192, whether or not a person exists in the operation region F3.
The authentication unit 105 is implemented by the image processing unit 194, and authenticates a user by the user's face, using an image captured by the second image capturing unit 193. Specifically, the authentication unit 105 extracts a face region from an image captured by the second image capturing unit 193, compares features of the extracted face region with features of a pre-registered face image of a user, and thereby determines whether or not the captured image matches the pre-registered face image. If the captured image is determined to match the face image of the user, user authentication succeeds. On the other hand, if the captured image is determined not to match, user authentication fails. A pre-registered face image of a user will be described below. In this exemplary embodiment, authenticating a user by using a face is referred to as “face authentication”.
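The feature-comparison step can be sketched as a similarity test between feature vectors. The use of cosine similarity and the threshold value are assumptions of this sketch; the embodiment does not specify the feature representation or matching rule.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def authenticate(captured_features, registered_features, threshold=0.9):
    """Face authentication sketch: succeed when the captured feature
    vector is close enough to any pre-registered one."""
    return any(cosine_similarity(captured_features, f) >= threshold
               for f in registered_features)
```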
The authentication unit 105 may execute, in addition to face authentication processing (first authentication processing), ID and password authentication processing (second authentication processing). Specifically, when a user operates the operation unit 13 or the display 14 to input an ID and a password, the authentication unit 105 compares the input ID and password with the pre-registered ID and password of the user, and thereby authenticates the user. Authentication using an ID and a password is executed by the controller 11, not by the image processing unit 194, because a face image is not necessary.
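The second authentication processing amounts to a lookup-and-compare, which can be sketched as follows. The function name and the data layout are assumptions; the constant-time comparison via `hmac.compare_digest` is a detail of this sketch, not of the embodiment.

```python
import hmac

def check_credentials(entered_id, entered_pw, registered):
    """ID/password authentication sketch.

    `registered` maps user IDs to their pre-registered passwords.
    compare_digest avoids leaking match length through timing.
    """
    expected = registered.get(entered_id)
    return expected is not None and hmac.compare_digest(entered_pw, expected)
```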
The authentication unit 105 executes face authentication processing by using an image captured by the second image capturing unit 193, and thus the second image capturing unit 193 needs to be in the ON-state. That is, the authentication unit 105 is capable of operating when the second image capturing unit 193 is in the ON-state and the image processing unit 194 is in the standby mode.
In an initial state, the operation modes of the main system of the image processing apparatus 10, the image processing unit 194, and the communication controller 195 have been shifted to the sleep mode, and the first image capturing unit 192 and the second image capturing unit 193 are in the OFF-state.
When a user does not exist in the detection region F1, as illustrated in
If the user moves to the detection region F1, as illustrated in
The first image capturing unit 192 captures an image of the detection region F2 at a certain time interval while it is activated. If an image is captured by the first image capturing unit 192, approach determination processing and stay determination processing are executed.
If the user moves in a direction D1 to approach the image processing apparatus 10, as illustrated in
If the user moves into the operation region F3, as illustrated in
After the user has finished using the image processing apparatus 10, performed certain logout processing on the image processing apparatus 10, and moved to the outside of the operation region F3 with his/her back to the image processing apparatus 10, as illustrated in
Finally, if the user moves to the outside of the detection region F1, as illustrated in
Focusing on the second image capturing unit 193, the image processing unit 194, and the communication controller 195, the second image capturing unit 193 comes into the ON-state at the timing when the user moves into the detection region F1, as illustrated in
As described above, in the state illustrated in
The face image that is pre-registered and stored in the memory is an image that has been captured by the second image capturing unit 193 and stored in the memory before the user actually uses the image processing apparatus 10. However, even the same user may differ in height between registration and face authentication; for example, the user may wear low-heeled shoes at the time of registration and high-heeled shoes at the time of face authentication, or vice versa. In a case where the user is taller at the time of registration than at the time of face authentication, the face position of the user in an image captured by the second image capturing unit 193 at the time of face authentication is lower relative to the second image capturing unit 193 than at the time of registration, and a downward-oriented face image is obtained. Conversely, in a case where the user is shorter at the time of registration than at the time of face authentication, the face position at the time of face authentication is higher relative to the second image capturing unit 193 than at the time of registration, and an upward-oriented face image is obtained. When features are extracted and compared between such an upward-oriented or downward-oriented face image and the face image of a valid user registered and stored in the memory, the two face images have different face angles, and comparing their features may therefore be more difficult than comparing images captured at the same face angle.
In the image processing apparatus 10 according to this exemplary embodiment, at the time of registering a face image before face authentication, the second image capturing unit 193 captures, in addition to a front face image of the user, at least any one of an upward-oriented image and a downward-oriented image, and the captured images are registered in the memory.
That is, any one of the following (1) to (3) is registered in this exemplary embodiment.
(1) front face image+upward-oriented face image
(2) front face image+downward-oriented face image
(3) front face image+upward-oriented face image+downward-oriented face image
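A registration record covering options (1) to (3) above can be sketched as a front template plus optional upward- and downward-oriented templates. The class and field names are assumptions for illustration; here each image is represented by its feature vector.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FaceRegistration:
    """Registered face images of one user: a front image plus optional
    upward- and downward-oriented images, stored as feature vectors."""
    user_id: str
    front: List[float]
    upward: Optional[List[float]] = None
    downward: Optional[List[float]] = None

    def templates(self):
        """All registered templates available for matching."""
        return [t for t in (self.front, self.upward, self.downward)
                if t is not None]
```

At authentication time, the captured image can then be compared against every registered template, so that a tilted capture still has a similarly angled template to match.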
The “front image” is an example of a first image according to an exemplary embodiment of the present invention, and is an image captured when the user's face is oriented toward the display 14. After the image has been captured on the screen illustrated in
The guide mark 143 is displayed in an upper portion of the screen of the display 14, the guide mark 145 is displayed in a lower portion of the screen of the display 14, and the guide mark 144 is displayed in a left portion of the screen of the display 14, in consideration of the relative positional relationship between the display 14 and the second image capturing unit 193. That is, as illustrated in the top view in
The display position of the guide mark 144 is determined in accordance with the relative positional relationship between the second image capturing unit 193 and the display 14. Thus, the display position of the guide mark 144 changes if the installation position of the second image capturing unit 193 changes. For example, if the second image capturing unit 193 is provided on the right side of the display 14, the guide mark 144 is displayed in a right portion of the screen of the display 14. When the user operates the start button 145 on the screen illustrated in
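The placement rule can be sketched as a small layout function: the upward-orientation mark goes in the upper screen portion, the downward-orientation mark in the lower portion, and the sideways mark on whichever side the second image capturing unit sits. The function name and the sign convention (negative offset meaning the camera is to the left of the display) are assumptions of this sketch.

```python
def guide_mark_layout(camera_offset_x):
    """Return the screen portion for each guide mark, given the camera's
    horizontal offset from the display centre (negative = camera on the
    left side of the display)."""
    return {
        "upward_mark": "upper portion",
        "downward_mark": "lower portion",
        "sideways_mark": ("left portion" if camera_offset_x < 0
                          else "right portion"),
    }
```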
Face images 208 to 212 are updated face images. That is, in a case where face authentication succeeds, a face image of the user at that time is newly registered in the memory as an update. Accordingly, the accuracy of face authentication may be maintained or increased even if the face image of the user changes over time.
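The update behaviour can be sketched as a bounded store of recent templates: each successful authentication appends the newly captured image, and only the most recent few are kept alongside the enrolment image. The class name is an assumption, and the capacity of five simply mirrors the count of face images 208 to 212 above.

```python
from collections import deque

class TemplateStore:
    """Enrolment template plus a rolling window of post-authentication
    updates, so matching can track gradual changes in the user's face."""

    def __init__(self, initial, capacity=5):
        self.initial = initial              # registered at enrolment; kept
        self.updates = deque(maxlen=capacity)

    def on_authentication_success(self, captured):
        """Register the newly captured face for update; the oldest
        update is silently dropped once capacity is reached."""
        self.updates.append(captured)

    def all_templates(self):
        """Templates available for the next authentication attempt."""
        return [self.initial] + list(self.updates)
```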
An exemplary embodiment of the present invention has been described above. The exemplary embodiment of the present invention is not limited thereto, and various modifications may be implemented.
For example, according to the above-described exemplary embodiment, shoes are regarded as a cause of change in the height of the user, but the exemplary embodiment of the present invention is not limited thereto.
Also, in the above-described exemplary embodiment, at least any one of an upward-oriented image and a downward-oriented image of a user is captured by the second image capturing unit 193 and is registered in the memory of the image processing unit 194. Alternatively, the controller 11 may create at least any one of an upward-oriented image and a downward-oriented image from a front image registered in the memory by using computer graphics (CG), and may register the created image. A technique of creating an image captured from a different viewpoint by using an image captured from a certain viewpoint is available in the related art. In this case, an upward-oriented image and a downward-oriented image are automatically created and are registered in the memory in the image processing apparatus 10. Thus, the guide screens and guide marks illustrated in
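As a toy stand-in for the CG step described above, a tilted view can be crudely approximated by remapping the rows of the front image so that one edge is foreshortened. This is only an illustrative sketch under strong assumptions (a real implementation would re-render from a 3D face model); the function name and the row-remapping rule are inventions of this sketch.

```python
def simulate_pitch(front_image, factor):
    """Approximate a pitched view of `front_image` (a list of rows) by
    nonlinearly resampling rows toward one edge.

    factor == 1.0 leaves the image unchanged; factor > 1.0 stretches the
    top rows (a crude downward-tilt effect).
    """
    h = len(front_image)
    out = []
    for y in range(h):
        t = y / max(h - 1, 1)
        # map each output row back to a source row with a varying scale
        src = int(round((t ** factor) * (h - 1)))
        out.append(list(front_image[src]))
    return out
```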
Further, in this exemplary embodiment, the screen automatically changes to the guide screen illustrated in
Further, in this exemplary embodiment, as illustrated in
The foregoing description of the exemplary embodiment of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
2013-166811 | Aug 2013 | JP | national |