An embodiment of the present invention will be described hereinbelow. The relationship between the constitutional requirements of the present invention and the embodiments appearing in the detailed description of the present invention is exemplified as follows. This description serves to confirm that embodiments supporting the present invention appear in the detailed description of the invention. Hence, even if an embodiment that appears in the detailed description of the present invention and conforms to a constitutional requirement is not described here, this does not mean that the embodiment does not conform to that constitutional requirement. Conversely, even though an embodiment is described here as conforming to a certain constitutional requirement, this does not mean that the embodiment does not conform to constitutional requirements other than that constitutional requirement.
In other words, the monitoring device according to the first aspect of the present invention comprises accumulation means (the biological information database 22 in
The monitoring device can further comprise feature extraction means for extracting a feature from the facial image of the checking target person (the facial feature extraction section 271 in
The monitoring device can further comprise unregistered person registration means for registering a facial image of the checking target person as an unregistered person when it is judged by the similarity judgment means that the checking target person is not the registered person (the unregistered player database registration section 253 in
The control means can comprise focus control means for controlling the focal position of the image pickup means (the focus control section 221b in
The monitoring device can further comprise recording means for recording an image that is picked up by the image pickup means (the recording section 223 in
The monitoring device can further comprise passage sensing means for sensing the passage of the checking target person (the entrance dual sensor 43 in
The monitoring method and program of the first aspect of the present invention comprise: an accumulation step of accumulating facial images of registered people that have been pre-registered (step S82 in
Gaming facilities 1-1 to 1-n are so-called "pachinko parlors" or "pachi-slot parlors." Further, the gaming facilities 1-1 to 1-n are chain facilities or participating facilities of a biological information management center or a gaming facility management center and are facilities requiring the centralized management of a plurality of facilities. The respective gaming facilities 1-1 to 1-n are connected by a biological information management bus 6 and a gaming facility management bus 7 and exchange biological information and gaming facility management information with one another via these buses and public communication networks 8 and 9, which are represented by the Internet and so forth. Hereinafter, when there is no need to distinguish between the individual gaming facilities 1-1 to 1-n, they are simply called the gaming facilities 1; the same applies to the other constitutions.
The biological information management bus 6 mainly functions as a transmission path for distributing the biological information that is managed by the biological information recognition device 21 of the respective gaming facilities 1. Further, the gaming facility management bus 7 mainly functions as a transmission path for distributing the game medium management information that is managed by the game medium management device 27 of the respective gaming facilities 1.
The biological information management center 2 is a server that is used by a business person that manages a biological information management center. The biological information management center 2 updates a registered player DB 352 (
The gaming facility management center 4 is a server that is used by a business operator that manages the gaming facility management center. On the basis of information supplied by the respective gaming facilities 1, the gaming facility management center 4 updates the game medium management information held in the gaming facility management database (DB) 5 and distributes the updated latest game medium management information to the game medium management device 27 of the respective gaming facilities 1.
The biological information recognition device 21 checks facial image information, which is extracted by the image processing units 39-1 to 39-(m+p) or the entrance image processing units 41-1 to 41-q from images picked up by the cameras 38-1 to 38-m, the in-store cameras 40-1 to 40-p, and the entrance cameras 42-1 to 42-q and is supplied via a biological information bus 31, against facial images that have been pre-registered in the biological information database 22. In the case of a match, the biological information recognition device 21 communicates the arrival at the facility of the registered player to the portable terminal 44 and displays the registered player on a display section 23 that comprises a CRT (Cathode Ray Tube), an LCD (Liquid Crystal Display), or the like. Furthermore, when the check against the facial images pre-registered in the biological information database 22 yields no match, the biological information recognition device 21 accesses the biological information management database 3 and registers the player in the unregistered player DB 351 as an unregistered player.
A gaming facility management device 24 is a so-called "hall computer" and monitors the operation of the gaming machines 36-1 to 36-m via a gaming facility management information bus 30. The gaming facility management device 24 executes predetermined processing in accordance with monitored states such as information on the payout of pachinko balls or medals by the gaming machines 36, call information for the players of the respective gaming machines 36-1 to 36-m, or the occurrence of an error, and displays the execution results on a display section 25 comprising a CRT or LCD. The gaming facility management device 24 associates the information supplied by each of a payment machine 33, a lending machine 34, a counter 35, the gaming machines 36-1 to 36-m, and gaming machine peripheral terminals 37-1 to 37-m with identification information identifying each of these devices (gaming machine numbers, for example) and manages these associations by means of a gaming machine management database 26.
The game medium management device 27 uses a game medium management database 29 to manage the game medium management information on the game media that are lent out, on the basis of information from the payment machine 33, the lending machine 34, and the counter 35. When the game medium management information registered in the game medium management database 29 is updated, the game medium management device 27 sends the updated information to the gaming facility management center 4 via the gaming facility management bus 7 and the public communication network 9. In addition, the game medium management device 27 acquires the game medium management information that is supplied by the gaming facility management center 4 via the gaming facility management bus 7 and the public communication network 9 and registers this game medium management information in the game medium management database 29.
Upon accepting a predetermined amount of money in the form of cash or a pre-paid card or the like when a player plays a game at a gaming machine 36, the lending machine 34 lends game media in the quantity corresponding to that amount. Thereupon, the lending machine 34 supplies information such as the accepted amount of money or the number of units remaining on the pre-paid card and information on the number of game media lent out to the game medium management device 27. As a result, the game medium management device 27 registers the accepted amount of money or the number of units remaining on the pre-paid card and the number of game media lent out in the game medium management database 29.
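As a rough illustration of the data flow just described, the following sketch registers a lending record in a stand-in for the game medium management database 29; the record fields, table layout, and use of SQLite are assumptions made purely for illustration and are not details given by the embodiment.

```python
from dataclasses import dataclass
import sqlite3

# Hypothetical record mirroring the lending information described above: the
# accepted amount of money (or units remaining on a pre-paid card) and the
# number of game media lent out.  Field names are illustrative only.
@dataclass
class LendingRecord:
    machine_id: str        # identifies the lending machine 34
    accepted_amount: int   # cash or pre-paid card value accepted
    remaining_units: int   # units remaining on the pre-paid card
    media_lent: int        # number of game media lent out

def register_lending(db: sqlite3.Connection, rec: LendingRecord) -> None:
    """Store a lending record in a stand-in for the game medium management database 29."""
    db.execute(
        "INSERT INTO lending(machine_id, accepted_amount, remaining_units, media_lent) "
        "VALUES (?, ?, ?, ?)",
        (rec.machine_id, rec.accepted_amount, rec.remaining_units, rec.media_lent))
    db.commit()

# Minimal in-memory example.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE lending(machine_id TEXT, accepted_amount INT, "
           "remaining_units INT, media_lent INT)")
register_lending(db, LendingRecord("lending-34-01", 1000, 9, 250))
```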
The payment machine 33 sells pre-paid cards, adding a frequency corresponding to the number of balls to be lent. Thereupon, the payment machine 33 supplies information on the frequency of the pre-paid cards sold and the amount of money received to the game medium management device 27. Further, the payment machine 33 counts the cash and pays out money on the basis of the remaining frequency of the pre-paid card or the like, that is, the remaining number of game media to be lent out. Thereupon, the payment machine 33 supplies the number remaining on the pre-paid card and the amount of cash paid back to the game medium management device 27.
The counter 35 counts the number of game medium acquired as a result of the player using a gaming machine 36 to play a game and outputs the result of the count as a magnetic card or receipt.
The gaming machines 36-1 to 36-m execute a game as a result of a predetermined operation being performed by the player and pay out game balls or medals in accordance with a so-called 'small strike' or 'large strike' (winnings).
The gaming machine peripheral terminals 37-1 to 37-m are so-called inter-station devices that are provided in correspondence with the respective gaming machines 36-1 to 36-m; inter-station ball lending machines (fundamentally similar to the lending machines 34) or the like are provided. Furthermore, the gaming machine peripheral terminal 37 acquires biological information such as a facial image of the player playing on the gaming machine 36 and transmits the biological information to the gaming facility management device 24 together with the game station identification information (gaming machine numbers). Further,
The cameras 38-1 to 38-m may be provided below the information display lamps 61-1 to 61-4 that are installed at the top of the respective gaming machines 36-1 to 36-4 as shown in
Furthermore, protrusions 71-1 to 71-4 may be provided on the gaming machine peripheral terminals 37-1 to 37-4 as shown in
In addition, the cameras 38-1 to 38-m may be provided in the center of the gaming machines 36 (on the surface of the game board of the gaming machines 36) as shown in
The in-store cameras 40-1 to 40-p and the entrance cameras 42-1 to 42-q are installed at predetermined locations within the gaming facility 1 and at the doorways, respectively, and supply the picked-up images to the image processing units 39-(m+1) to 39-(m+p) and the entrance image processing units 41-1 to 41-q. Further, the entrance dual sensors 43-1 to 43-q are installed at the doorways 112 (
The in-store cameras 40-1 to 40-p, entrance cameras 42-1 to 42-q, and entrance dual sensors 43-1 to 43-q are installed as shown in
In other words, in
In addition, an in-store camera 40-a is provided in front of the lending machine 34, an in-store camera 40-b is provided in front of the payment machine 33 and an in-store camera 40-c is provided in front of the counter 35; each of which is capable of picking up an image of a player using the lending machine 34, payment machine 33, and counter 35.
In other words, as shown in
In addition, an entrance dual sensor 43 is constituted by an external sensor 43a and an internal sensor 43b that are provided with the doorway 112 interposed therebetween. The external sensor 43a and the internal sensor 43b each detect the passage of a player by means of an optical sensor and, when a player is detected by the internal sensor 43b after having been detected by the external sensor 43a, the passage of the player through the doorway 112, that is, the fact that the player has entered the gaming facility 1, is detected.
A constitutional example of the image processing unit 39 will be described next with reference to
An image acquisition section 201 acquires the images picked up by the cameras 38 (or the in-store cameras 40) and supplies the images to a facial image extraction section 202. The facial image extraction section 202 extracts a rectangular image containing a facial image, on the basis of the disposition pattern of the parts constituting a face, from within the image supplied by the image acquisition section 201 and supplies the rectangular image thus extracted to a transmission section 203. The transmission section 203 transmits the facial image to the biological information recognition device 21.
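The following is a minimal sketch of the kind of processing attributed to the facial image extraction section 202: cropping the rectangular region that contains a face from a picked-up frame. The use of OpenCV's bundled Haar cascade detector is an assumption made purely for illustration and is not the extraction method specified by the embodiment.

```python
import cv2

# Assumed stand-in for the facial image extraction section 202: detect a face
# from the arrangement of facial parts and return the enclosing rectangular
# image.  OpenCV's bundled Haar cascade is used here purely for illustration.
_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face(frame):
    """Return the first detected rectangular facial image, or None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                      # nothing to hand to the transmission section 203
    x, y, w, h = faces[0]
    return frame[y:y + h, x:x + w]       # rectangular image containing the face
```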
A constitutional example of an entrance image processing unit 41 and the entrance dual sensor 43 will be described next with reference to
A camera control section 221 comprises a pan-tilt-zoom control section 221a, a focus control section 221b, and an iris control section 221c and, on the basis of the judgment result from a facial detection judgment section 222, controls these sections respectively to adjust the pan, tilt, and zoom of the entrance camera 42, its focal position, and its iris. The facial detection judgment section 222 judges, on the basis of an image from the image acquisition section 201 and the player passage judgment result from the entrance dual sensor 43, whether the facial image extraction section 202 is able to extract a facial image from the image picked up by the entrance camera 42 and, in cases where a facial image cannot be detected, controls a defective detection information generation section 224 to generate defective detection information and supply same to the biological information recognition device 21. Further, while judging whether a facial image can be detected, the facial detection judgment section 222 controls a video recording section 223 to record the images supplied by the image acquisition section 201 successively in a memory 223a. When the facial detection judgment section 222 judges that extraction of a facial image is impossible, the defective detection information generation section 224 reads the images that have been recorded in the memory 223a of the video recording section 223 and generates the defective detection information.
The external sensor 43a of the entrance dual sensor 43 is an optical sensor and, as shown in
The internal sensor 43b in the entrance dual sensor 43 is an optical sensor and is provided on the side of the doorway 112 on the inside of the gaming facility 1 as shown in
The passage judgment section 233 judges whether a player has passed through the doorway 112 on the basis of the timing of the ON and OFF of the signals supplied by the external sensor sensing section 231 and the internal sensor sensing section 232. When it is judged that the player has passed through the doorway 112, a signal indicating that fact is supplied to the facial detection judgment section 222 of the entrance image processing unit 41. More specifically, the passage judgment section 233 detects the passage of the player when the signal supplied by the external sensor sensing section 231 changes from an OFF state to an ON state and then turns OFF, and the signal supplied by the internal sensor sensing section 232 subsequently changes from an OFF state to an ON state and then turns OFF.
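A minimal sketch of the passage judgment just described, assuming the two sensing sections deliver simple ON/OFF events in time order; a passage through the doorway 112 is recognized only when the external sensor's ON-then-OFF transition is followed by the internal sensor's ON-then-OFF transition. The event representation is an assumption for illustration.

```python
# Sketch of the passage judgment section 233: an entry through the doorway 112
# is recognized when the external sensor 43a turns ON and then OFF, and the
# internal sensor 43b subsequently turns ON and then OFF.
EXPECTED = [("external", True), ("external", False),
            ("internal", True), ("internal", False)]

def count_entries(events):
    """Count doorway passages in a time-ordered list of (sensor, is_on) events."""
    entries, step = 0, 0
    for event in events:
        if event == EXPECTED[step]:
            step += 1
            if step == len(EXPECTED):    # full external-then-internal sequence seen
                entries += 1
                step = 0
        elif event == EXPECTED[0]:       # restart on a fresh external-sensor ON
            step = 1
        else:
            step = 0
    return entries

# Example: one player enters the gaming facility 1.
print(count_entries([("external", True), ("external", False),
                     ("internal", True), ("internal", False)]))   # -> 1
```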
An example of the constitution of the biological information recognition device 21 will be described next with reference to
A facial image acquisition section 251 acquires the facial images that are supplied by the image processing units 39 or the entrance image processing units 41 and supplies these images to a checking section 252. The checking section 252 checks a facial image acquired from the facial image acquisition section 251 against the facial images that have been pre-registered in the biological information database 22 and, if facial images that are highly similar candidates are present in the biological information database 22, the checking section 252 displays the facial image candidates up to the third candidate on the display section 23 as the checking result. Further, when no facial image that is a highly similar candidate exists, the checking section 252 supplies the supplied facial image to an unregistered player database registration section 253.
More precisely, a feature extraction section 271 of the checking section 252 extracts a feature for identifying the facial image and supplies the feature to a similarity calculation section 272 together with the facial image. The similarity calculation section 272 extracts the features of the facial images of the registered players registered in the biological information database 22, uses the feature supplied by the feature extraction section 271 to determine the similarity with the facial images of all the registered players registered in the biological information database 22, and supplies the facial image supplied by the facial image acquisition section 251 and the similar facial images up to the third-ranked facial image to a similarity judgment section 273. More specifically, the similarity calculation section 272 determines a differential sum, a mean percentage, or a percentage sum or the like on the basis of facial features of various types, such as the distance between the eyes, the length from the chin to the forehead, and the length from the chin to the nose, for example.
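As a hedged illustration of the similarity measures named above, the sketch below computes a differential sum, a mean percentage, and a percentage sum from a small feature vector (eye distance and chin-to-forehead and chin-to-nose lengths); the exact feature set and formulas are assumptions, since the embodiment only names these quantities as examples.

```python
import numpy as np

# Illustrative feature vector: [eye distance, chin-to-forehead length,
# chin-to-nose length], e.g. in pixels.  The three functions are assumed
# readings of the "differential sum", "mean percentage", and "percentage sum"
# mentioned above.
def differential_sum(probe: np.ndarray, registered: np.ndarray) -> float:
    return float(np.sum(np.abs(probe - registered)))      # smaller = more similar

def mean_percentage(probe: np.ndarray, registered: np.ndarray) -> float:
    ratios = np.minimum(probe, registered) / np.maximum(probe, registered)
    return float(ratios.mean())                            # close to 1 = more similar

def percentage_sum(probe: np.ndarray, registered: np.ndarray) -> float:
    ratios = np.minimum(probe, registered) / np.maximum(probe, registered)
    return float(ratios.sum())                             # larger = more similar

probe = np.array([62.0, 180.0, 115.0])        # features of the picked-up facial image
registered = np.array([60.0, 184.0, 112.0])   # features of a registered facial image
print(percentage_sum(probe, registered))
```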
The similarity judgment section 273 compares the similarity of the first-ranked facial image, among the respective similarities of the facial images up to the third-ranked facial image supplied by the similarity calculation section 272, with a predetermined threshold value. When, based on the result of the comparison, the first-ranked registered facial image is similar to the facial image supplied by the facial image acquisition section 251 (when a similarity in which a higher value indicates a higher degree of similarity is higher than the predetermined threshold value, or a similarity in which a lower value indicates a higher degree of similarity is lower than the predetermined threshold value), the similarity judgment section 273 supplies the facial images up to the third-ranked facial image together with their similarity information to the display section 23 so that the information is displayed by the display section 23 and also supplies the information to a communication section 254. Further, when, based on the result of the comparison of the similarity of the first-ranked facial image with the predetermined threshold value, the first-ranked registered facial image is not similar to the facial image supplied by the facial image acquisition section 251, the similarity judgment section 273 supplies the facial image supplied by the facial image acquisition section 251 to the unregistered player database registration section 253.
The unregistered player database registration section 253 registers, in the unregistered player DB 351 of the biological information management database 3, the facial image that has been supplied as a result of the checking section 252 regarding the player with that facial image as unregistered.
An operation section 255 is constituted by buttons, a mouse, or a keyboard or the like and is operated when any of the earlier-mentioned facial images up to the third-ranked facial image which are displayed on the display section 23 is selected. The result of operating the operation section 255 is supplied to a communication section 254. The communication section 254 is constituted by a modem or the like and distributes the selected facial image to a portable terminal 44 on the basis of an operation signal from the operation section 255.
Further, an example is described here in which the similarity takes a higher value the closer the picked-up facial image is to the facial image registered for a registered player, as with the percentage sum, for example; when the similarity is higher than the predetermined threshold value, the picked-up facial image is judged to be the facial image of the registered player corresponding to that similarity. However, in cases where the similarity is expressed as the differential sum of the respective feature values of the picked-up facial image and the facial images registered for the registered players, for example, the similarity judgment section 273 regards the picked-up facial image as the facial image of the registered player if the similarity is less than the threshold value; in the case of a mean percentage or the like, the player can be regarded as the same person if the mean percentage is close to 1, that is, equal to or more than a predetermined value in the range 0 to 1.
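The following sketch captures the point made above: whether a similarity value indicates the same person by exceeding or by falling below the threshold depends on the measure in use. The measure names and the example threshold are assumptions for illustration.

```python
# Sketch of the similarity judgment: whether "more similar" means a larger or a
# smaller value depends on the measure, as noted above.  Measure names and the
# example threshold are assumptions.
def is_registered_player(similarity: float, measure: str, threshold: float) -> bool:
    if measure == "differential_sum":
        return similarity < threshold      # smaller difference = same person
    if measure == "mean_percentage":
        return similarity >= threshold     # threshold chosen close to 1
    if measure == "percentage_sum":
        return similarity > threshold      # larger sum = same person
    raise ValueError(f"unknown measure: {measure}")

# Example: a percentage-sum similarity of 2.9 against an assumed threshold of 2.7.
print(is_registered_player(2.9, "percentage_sum", 2.7))    # -> True
```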
When a new registered player database 352 (
A defective detection information acquisition section 257 acquires and accumulates the defective detection information that is supplied by the entrance image processing unit 41. When defective detection information has accumulated in the defective detection information acquisition section 257, a defective detection information image display control section 258 generates a video recording list image on the basis of the defective detection information and displays same on the display section 23; when playback of a predetermined image is instructed, the defective detection information image display control section 258 reads the corresponding image information by means of the defective detection information acquisition section 257 and displays this information on the display section 23.
A constitutional example of the biological information management center 2 will be described next with reference to
The biological information management center 2 is constituted by a DB distribution section 341, a DB update section 342, and a DB update judgment section 343 and updates the registered player DB 352 stored in the biological information management database 3 on the basis of an unregistered player DB 351. More specifically, the DB update judgment section 343 contains an RTC (Real Time Clock) 343a that produces time information and, when a predetermined time has elapsed on the basis of the built-in RTC 343a, the DB update judgment section 343 accesses the biological information management DB 3 and judges whether a new unregistered player has been registered in the unregistered player DB 351.
When an unregistered player is newly registered in the unregistered player DB 351, the DB update judgment section 343 communicates this fact to the DB update section 342. The DB update section 342 accesses the unregistered player DB 351 and judges whether an unregistered player that has been registered a predetermined number of times or more exists among the unregistered players thus registered. In cases where an unregistered player that has been registered as an unregistered player a predetermined number of times or more exists, the DB update section 342 reads the registered player DB 352 and registers and updates the unregistered player registered a predetermined number of times or more in the registered player DB 352. In addition, when the registered player DB 352 is updated, the DB update section 342 communicates the fact that the registered player DB 352 has been updated to the DB distribution section 341.
When the fact that the registered player DB 352 has been updated by the DB update section 342 is communicated, the DB distribution section 341 accesses the biological information management database 3, reads the registered player DB 352, and distributes same to the biological information recognition devices 21 of the respective gaming facilities 1.
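A minimal sketch of the update performed by the DB update section 342, under the assumptions that repeated registrations of the same unregistered player share an identifier (here a hypothetical face_id column) and that the two DBs can be treated as plain tables; SQLite is used only to keep the example self-contained.

```python
import sqlite3

# Sketch of the DB update section 342: unregistered players registered a
# predetermined number of times or more are promoted to the registered player
# DB and then removed from the unregistered player DB.  The face_id column and
# table layout are assumptions.
PREDETERMINED_COUNT = 3

def update_registered_players(db: sqlite3.Connection) -> bool:
    """Return True when the registered player DB was updated and needs distribution."""
    rows = db.execute(
        "SELECT face_id FROM unregistered_players "
        "GROUP BY face_id HAVING COUNT(*) >= ?", (PREDETERMINED_COUNT,)).fetchall()
    for (face_id,) in rows:
        db.execute("INSERT OR IGNORE INTO registered_players(face_id) VALUES (?)",
                   (face_id,))
        db.execute("DELETE FROM unregistered_players WHERE face_id = ?", (face_id,))
    db.commit()
    return bool(rows)

# Minimal in-memory example.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE unregistered_players(face_id TEXT)")
db.execute("CREATE TABLE registered_players(face_id TEXT PRIMARY KEY)")
db.executemany("INSERT INTO unregistered_players VALUES (?)",
               [("player-A",)] * 3 + [("player-B",)])
print(update_registered_players(db))   # -> True: player-A was promoted
```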
A constitutional example of the portable terminal 44 will be described next with reference to
A communication section 371 is constituted by a modem or the like and exchanges data with the biological information recognition device 21 via a wireless communication network in the gaming facility 1. Further, the communication section 371 acquires the information, distributed by the biological information recognition device 21, indicating that a player with a facial image similar to a facial image supplied by the image processing units 39 or the entrance image processing units 41 has arrived at the facility and supplies this information to an image processing section 372.
The image processing section 372 generates, on the basis of the information supplied by the communication section 371 indicating that a player with a facial image similar to a facial image supplied by an image processing unit 39 or an entrance image processing unit 41 has arrived at the facility, an image to be displayed on a display section 373 that is constituted by an LCD or the like and displays the image thus generated on the display section 373.
The processing to monitor the arrival at the facility of a registered player will be described next with reference to the flowchart in
In step S1, the entrance camera 42 judges whether the door at the entrance is open on the basis of the information of the picked-up image or a signal from the entrance door, and this processing is repeated until it is judged that the door is open.
In cases where it is judged that the door is open in step S1, the entrance camera 42 picks up images in the installation range in step S2 and supplies the picked up images to the entrance image processing unit 41. The image acquisition section 201 of the entrance image processing unit 41 acquires the images thus supplied and supplies these images to the facial image extraction section 202.
In step S3, the facial image extraction section 202 extracts the facial images of players from the images thus supplied and supplies the extracted images to the transmission section 203. More specifically, the facial image extraction section 202 identifies, from the color and so forth of the picked-up images, regions in which skin is exposed, extracts facial images from the arrangement of characteristic parts such as the eyes and the nose within those regions, and supplies the facial images to the transmission section 203.
In step S4, the transmission section 203 transmits the facial images that have been supplied by the facial image extraction section 202 to the biological information recognition device 21. Thereupon, the transmission section 203 adds information such as the camera IDs for identifying the camera 38, in-store camera 40, and entrance camera 42 and transmission time information to the facial images before transmitting the facial images to the biological information recognition device 21.
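As a small illustration of the transmission step, the sketch below packages an extracted facial image together with a camera ID and a transmission time; the message layout and field names are assumptions, not a format defined by the embodiment.

```python
import json
import time

# Sketch of the transmission step: the extracted facial image is sent together
# with a camera ID and a transmission time.  The message layout and the send
# mechanism are assumptions for illustration.
def build_message(face_jpeg: bytes, camera_id: str) -> bytes:
    header = {"camera_id": camera_id,       # identifies camera 38, 40, or 42
              "sent_at": time.time()}       # transmission time information
    return json.dumps(header).encode("utf-8") + b"\n" + face_jpeg

message = build_message(b"<jpeg bytes>", "entrance-camera-42-1")
```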
In step S21, the facial image acquisition section 251 of the biological information recognition device 21 acquires the facial images. In step S22, the facial image acquisition section 251 extracts any one unprocessed image among the supplied facial images and supplies same to the facial feature extraction section 271.
In step S23, the facial feature extraction section 271 of the checking section 252 extracts facial features from the facial image thus supplied and supplies the facial features to the similarity calculation section 272 together with the facial image.
In step S24, the similarity calculation section 272 determines facial features of various types, such as the ratio between the distance between the eyes, the length from the chin to the forehead, and the length from the chin to the nose, for the facial image supplied by the facial feature extraction section 271, calculates the differential sum, the mean percentage, or the percentage sum or the like as the similarity, and determines the ranking order of the similarities which constitute the calculation results; the similarity calculation section 272 then supplies the facial images up to the third-ranked facial image and the similarity information to the similarity judgment section 273 together with the facial image supplied by the facial feature extraction section 271.
In step S25, the similarity judgment section 273 judges whether the highest similarity is greater than a predetermined threshold value on the basis of the facial images up to the third-ranked facial image and the similarity information that have been supplied by the similarity calculation section 272. That is, the similarity judgment section 273 compares the similarity of the most similar registered player (the registered player most similar to the facial image acquired by the facial feature extraction section 271 among the facial images registered in the biological information database 22: here, the registered player of the highest similarity) with a predetermined threshold value.
Thus, as mentioned earlier, depending on the definition of the similarity, the similarity of the facial image of the registered player most similar to the picked-up facial image is not necessarily the highest value; the magnitude relationship between the similarity and the threshold value therefore sometimes differs from that of the example provided here.
In step S26, the similarity judgment section 273 controls the display section 23 to display a report screen 401 that shows the facial images up to the third-ranked facial image supplied by the similarity calculation section 272 as candidates for the facial image of a registered player.
Thereupon, the report screen 401 shown in
In the report screen 401 in
Furthermore, below the similarity level display fields 413-1 to 413-3, ID display fields 414-1 to 414-3 are provided in corresponding positions, and the IDs identifying the respective facial images in the biological information database 22 are displayed and, in
In addition, ‘Apply’ buttons 418-1 to 418-3 which are operated by the operation section 255 when selecting the respective candidates are provided below the ID display fields 414-1 to 414-3 in respective corresponding positions.
Further, a camera ID display field 415 that identifies the camera that picked up the facial image is provided below the camera image display field 411. In
In addition, an other person button 417 is provided below the time display field 416. The other person button 417 is operated by means of the operation section 255 when the facial image of the camera image is not regarded as being similar to any of the facial images of the registered players displayed as the first to third candidates in the facial image display fields 412-1 to 412-3.
In step S27, the communication section 254 judges whether any of the candidate facial images has been selected by operation of the operation section 255, that is, whether any of the Apply buttons 418-1 to 418-3 has been operated by means of the operation section 255 when the report screen 401 shown in
In step S27, when the Apply button 418-1 has been operated, for example, it is considered that the first candidate has been selected and, in step S28, the communication section 254 transmits the selected first candidate facial image and the camera image picked up by the entrance camera 42 to the portable terminal 44 and communicates the fact that the relevant registered player has come to the gaming facility.
In step S41, the communication section 371 judges whether the fact that a registered player has arrived at the facility has been communicated and repeats the processing until such a communication is made. For example, in step S41, when the fact that a registered player has arrived at the facility has been communicated as a result of the processing of step S28, in step S42, the communication section 371 receives the communication, transmitted by the biological information recognition device 21, that a registered player has arrived at the facility and supplies the facial image of the registered player and the camera image picked up by the entrance camera 42, which are transmitted together with the communication, to the image processing section 372. The image processing section 372 processes the information of the selected facial image and the camera image picked up by the entrance camera 42 into a format that can be displayed on the display section 373 and, in step S43, displays the information on the display section 373.
As a result of the above processing, when a clerk in the gaming facility 1 is in possession of the portable terminal 44, the clerk is able to recognize the arrival at the facility of the registered player.
Further, in step S29, the facial image acquisition section 251 judges whether processing has been executed with respect to all of the supplied facial images and, when there is an unprocessed facial image, the processing returns to step S22. That is, until processing has been executed for all the facial images, the processing of steps S22 to S30 is repeated. Further, when it is judged that the processing of all the facial images has ended, the processing returns to step S21.
Meanwhile, in step S27, in cases where no candidate facial image is selected and the other person button 417 of the report screen 401 in
As a result of the above processing, when the facial image supplied by the entrance image processing unit 41 is regarded by the biological information recognition device 21 as not being registered in the biological information database 22, the facial image is registered as the facial image of an unregistered player in the unregistered player DB 351 in the biological information management database 3 managed by the biological information management center 2.
The processing to update the registered player DB 352 will be described next with reference to the flowchart in
In step S61, the DB update judgment section 343 of the biological information management center 2 judges whether a predetermined time has elapsed by consulting the RTC 343a and repeats the processing until a predetermined time has elapsed. For example, when it is judged that the predetermined time has elapsed in step S61, in step S62, the DB update judgment section 343 accesses the unregistered player DB 351 of the biological information management database 3 and judges whether a new unregistered player has been registered, repeating the processing of steps S61 and S62 until a new unregistered player has been registered.
In step S62, in cases where a new unregistered player has been registered by the processing of step S30, for example, in step S63, the DB update judgment section 343 judges whether any of the unregistered players newly registered in the unregistered player DB 351 has been registered a number of times equal to or more than a predetermined number of times. When no unregistered player has been registered the predetermined number of times or more, the processing returns to step S61.
Meanwhile, in cases where it is judged in step S63 that an unregistered player who has been registered in the unregistered player DB 351 a number of times equal to or more than the predetermined number of times exists, for example, in step S64, the DB update judgment section 343 communicates to the DB update section 342 that the registered player DB 352 is to be updated. By way of response, the DB update section 342 accesses the unregistered player DB 351 and reads all of the facial image information of the unregistered players that have been registered the predetermined number of times or more, accesses the registered player DB 352, adds IDs identifying the facial images to the facial image information thus read, and newly registers the unregistered players in the registered player DB 352, thereby updating it. In addition, the DB update section 342 communicates the fact that the registered player DB 352 has been updated to the DB distribution section 341. Further, at the point where the update to the registered player DB 352 has ended, the unregistered players that have been registered the predetermined number of times or more are deleted from the unregistered player DB 351.
In step S65, the DB distribution section 341 accesses and reads the newly updated registered player DB 352 of the biological information management database 3 and distributes the registered player DB 352 which is read to the database management section 256 of the biological information recognition device 21.
In step S81, the database management section 256 judges whether the new registered player DB 352 which has been updated by the biological information management center 2 has been distributed and repeats the processing until the new registered player DB 352 has been distributed.
In step S81, when the newly updated registered player DB 352 is distributed by the DB distribution section 341 of the biological information management center 2 by means of the processing of step S65, for example, in step S82, the database management section 256 acquires the registered player DB 352 that has been distributed and updates the biological information database 22 on the basis of the information of the registered player DB 352. More precisely, the database management section 256 copies the information of the newly distributed registered player DB 352 to the biological information database 22 so that the information therein is overwritten.
As a result of the above processing, when a player has been registered as an unregistered player in the unregistered player DB 351 a predetermined number of times or more at any of the gaming facilities 1-1 to 1-n, the player is newly registered in the registered player DB 352 and is thereby registered in the biological information database 22 of the biological information recognition device 21 of each of the gaming facilities 1-1 to 1-n. Hence, when an image of that player is subsequently picked up by the entrance camera 42 at any of the plurality of gaming facilities 1, the arrival of the player at the gaming facility is reported as that of a registered player. Furthermore, because the registered player DB 352 is updated on the basis of the information from the respective gaming facilities 1-1 to 1-n, customer facility arrival records are generated for a plurality of facilities and can be centrally managed.
The above description was for an example in which, as shown by the processing of step S1 in the flowchart of
Furthermore, although an example in which, as indicated by the processing of step S1 in the flowchart in
As detailed earlier, it is possible to collect not only customer facility arrival information for the respective facilities but also customer facility arrival records for a plurality of facilities and to implement the centralized management thereof.
Entrance camera adjustment processing will be described next with reference to the flowchart in
In step S101, the camera control section 221 of the entrance image processing unit 41 controls the pan-tilt-zoom control section 221a, the focus control section 221b, and the iris control section 221c to set the pan, tilt, and zoom of the entrance camera 42 as well as its focal position and iris to their respective default values.
In step S102, the facial detection judgment section 222 executes facial detection judgment processing and judges whether the extraction of a facial image from the image is possible on the basis of the image that is supplied to the facial image extraction section 202 by the image acquisition section 201.
Facial detection processing will be described here with reference to the flowchart in
In step S121, the facial detection judgment section 222 resets the passing people counter T and the facial detection people counter K.
In step S122, the facial detection judgment section 222 controls the video recording section 223 to start recording the images supplied by the image acquisition section 201 and to accumulate the recorded images sequentially in the memory 223a.
In step S123, the passage judgment section 233 of the entrance dual sensor 43 judges whether a signal indicating that the external sensor 43a is ON has been input by the external sensor sensing section 231 and repeats the processing until a signal indicating that the external sensor 43a is ON has been input.
In step S123, when the external sensor 43a installed outside the doorway 112 of the gaming facility 1 detects the passage of a player, for example, produces a predetermined signal and supplies same to the external sensor sensing section 231, the external sensor sensing section 231 supplies a signal indicating that the external sensor 43a is ON to the passage judgment section 233. As a result, the passage judgment section 233 judges that the external sensor 43a is ON and the processing moves on to step S124.
In step S124, the passage judgment section 233 of the entrance dual sensor 43 judges whether a signal indicating that the external sensor 43a is OFF has been input by the external sensor sensing section 231 and repeats the processing until a signal indicating that the external sensor 43a is OFF has been input.
In step S124, as a result of a player completely passing through the range in which sensing by the external sensor 43a is possible, for example, the external sensor 43a installed on the outside of the doorway 112 of the gaming facility 1 enters a state where the player is not detected and, when the production of the predetermined signal is stopped, the external sensor sensing section 231 supplies a signal indicating that the external sensor 43a is OFF to the passage judgment section 233. As a result, the passage judgment section 233 judges that the external sensor 43a is OFF and the processing moves on to step S125.
In step S125, the passage judgment section 233 of the entrance dual sensor 43 judges whether a signal indicating that the internal sensor 43b is ON has been input by the internal sensor sensing section 232 and repeats the processing until a signal indicating that the internal sensor 43b is ON is input.
In step S125, for example, when the internal sensor 43b installed on the inside of the doorway 112 of the gaming facility 1 detects the passage of the player, produces a predetermined signal, and supplies same to the internal sensor sensing section 232, the internal sensor sensing section 232 supplies a signal indicating that the internal sensor 43b is ON to the passage judgment section 233. As a result, the passage judgment section 233 judges that the internal sensor 43b is ON and the processing moves on to step S126.
In step S126, the passage judgment section 233 of the entrance dual sensor 43 judges whether a signal indicating that the internal sensor 43b is OFF has been input by the internal sensor sensing section 232 and repeats the processing until a signal indicating that the internal sensor 43b is OFF has been input.
In step S126, as a result of a player completely passing through the range in which sensing by the internal sensor 43b is possible, for example, the internal sensor 43b installed on the inside of the doorway 112 of the gaming facility 1 enters a state where the player is not detected and, when the production of the predetermined signal is stopped, the internal sensor sensing section 232 supplies a signal indicating that the internal sensor 43b is OFF to the passage judgment section 233. As a result, the passage judgment section 233 judges that the internal sensor 43b is OFF and the processing moves on to step S127.
In step S127, the passage judgment section 233 communicates to the facial detection judgment section 222 the fact that a player has passed through the doorway 112 and entered the gaming facility. In other words, as indicated by the judgment processing of steps S123 to S126, the detection state ends after the passage of the player has been detected by the external sensor 43a, and the detection state then ends after the passage of the player has been detected by the internal sensor 43b, whereby the passage of the player through the doorway 112 of the gaming facility 1 from the outside into the gaming facility is detected.
In step S128, the facial detection judgment section 222 increases the passing people counter T by one on the basis of the communication indicating the passage of the player through the doorway 112 from the passage judgment section 233.
In step S129, the facial detection judgment section 222 detects a facial image from the image supplied by the image acquisition section 201 by means of the same technique as the facial image extraction technique of the facial image extraction section 202. Further, the timing with which the image is supplied by the image acquisition section 201 is the timing at which the processing of steps S2 and S3 in
In step S130, the facial detection judgment section 222 judges whether a face can be detected from the image supplied by the image acquisition section 201, in other words, whether a facial image can be extracted. As shown on the left side of
In step S130, in cases where it is judged that the face has been detected, in step S131, the facial detection judgment section 222 increases the facial detection people counter K by one. However, when it is judged that the face cannot be detected, the processing in step S131 is skipped.
In step S132, the facial detection judgment section 222 judges whether the passing people counter T is equal to or more than a predetermined threshold value Tth and, in cases where the passing people counter T is not equal to or more than the threshold value Tth, for example, the processing returns to step S123 and the processing of steps S123 to S132 is repeated. When it is judged in step S132 that the passing people counter T is equal to or more than the threshold value Tth, in other words, that facial detection processing has been performed for the number of people corresponding to the predetermined threshold value Tth, in step S133, the facial detection judgment section 222 calculates the facial detection ratio R (=facial detection people counter K/threshold value Tth). In other words, the facial detection ratio R, which indicates the ratio of faces that could be detected with respect to the number of people entering the facility via the doorway 112, is calculated.
In step S134, the facial detection judgment section 222 judges whether the facial detection ratio R is equal to or more than a predetermined threshold value Rth. In cases where the facial detection ratio R is equal to or more than the predetermined threshold value Rth, in other words, in cases where the ratio of detected faces with respect to the number of people entering the facility via the doorway 112 is high, the facial detection judgment section 222 judges in step S135 that a state where faces can be adequately detected exists.
However, in step S134, in cases where the facial detection ratio R is not equal to or more than the predetermined threshold value Rth, in other words, in cases where the ratio of faces that could be detected with respect to the number of people entering the facility via the doorway 112 is low, the facial detection judgment section 222 judges in step S136 that a state in which faces can be adequately detected does not exist.
In step S137, the facial detection judgment section 222 controls the video recording section 223 to stop video recording; the images that have been recorded continuously since video recording was started by the processing of step S122 are recorded as one file in the memory 223a, whereupon the processing ends.
As a result of the above processing, when facial detection processing has been performed for a predetermined number of passing people (the number of people corresponding to the predetermined threshold value Tth), the facial detection ratio R is determined; it is judged that faces can be adequately detected when the facial detection ratio R is high and that faces cannot be adequately detected when the facial detection ratio R is low.
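A compact sketch of the facial detection judgment processing summarized above, assuming that a frame is available for each passage reported by the entrance dual sensor 43 and that a face detector is supplied; the threshold values Tth and Rth shown are placeholders.

```python
# Sketch of the facial detection judgment processing: count passages T reported
# by the entrance dual sensor 43 and detected faces K, and once T reaches Tth
# decide from the facial detection ratio R = K / Tth whether faces are being
# adequately detected.  detect_face, Tth, and Rth are assumed placeholders.
def facial_detection_adequate(frames_per_passage, detect_face, Tth=10, Rth=0.7):
    """frames_per_passage: iterable of frames captured each time a player passes."""
    T = K = 0
    for frame in frames_per_passage:
        T += 1                            # passing people counter
        if detect_face(frame):
            K += 1                        # facial detection people counter
        if T >= Tth:
            break
    R = K / Tth                           # facial detection ratio
    return R >= Rth                       # True: faces are adequately detectable
```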
In
Let us now return to the description of the flowchart in
In step S103, the facial detection judgment section 222 judges whether a state where adequate facial detection is possible through facial detection processing exists. For example, in cases where it is judged by the processing of step S135 that a face can be adequately detected, in step S104, the facial detection judgment section 222 measures the brightness difference D between a suitable facial image and the facial image that was picked up.
In step S105, the facial detection judgment section 222 judges whether the brightness difference D is equal to or more than a predetermined threshold value Dth and, in cases where the brightness difference D is equal to or more than the predetermined threshold value Dth, for example, respective correction values for the pan-tilt direction, zoom magnification, and iris are calculated on the basis of the brightness difference D and the correction values thus calculated are supplied to the camera control section 221 in step S106.
In step S107, the camera control section 221 controls the pan-tilt-zoom control section 221a, the focus control section 221b, and the iris control section 221c in order to make adjustments in accordance with the correction values, whereupon the processing returns to step S102. In other words, the camera control section 221 corrects the pan-tilt direction, zoom magnification, focal length, and iris to suitable states under conditions permitting the adequate extraction of a facial image.
However, in step S105, in cases where the brightness difference D is not equal to or more than the predetermined threshold value Dth, it is considered that there is no need for such correction and the processing of steps S106 and S107 is skipped.
Furthermore, for example, in cases where it is judged in step S103, as a result of the processing of step S136, that a state permitting adequate detection does not exist, in step S108, the facial detection judgment section 222 judges whether the control state of the entrance camera 42 has been set to the final position that can be set as the focal position; when it is judged that the position is not the final position, the facial detection judgment section 222 instructs the camera control section 221 to change the focal position by one level in step S109. In response to the instruction, the camera control section 221 controls the pan-tilt-zoom control section 221a and the focus control section 221b to establish the focal position of the entrance camera 42 in a position that is one stage further away from the entrance camera 42.
In other words, in the setting of the home position which is the default value (also known as the HP(Home Position) hereinafter), the entrance camera 42 is established in the focal position HP that is closest to the entrance camera 42 from the doorway 112 as shown at the top of
In a state where a facial image cannot be adequately detected, it can be considered that, as mentioned earlier, the player is at a distance from the doorway 112 and too close to the entrance camera 42. Therefore, in cases where adequate detection of a facial image is not possible, as shown in the middle of
In cases where the face cannot be adequately detected by this processing, the focal position is moved away stepwise from the entrance camera 42 a predetermined distance at a time by the processing of steps S102, S103, S108, and S109. Further, once a state permitting adequate detection is assumed, in step S103, it is judged that a face can be adequately detected and the processing moves on to the processing of step S104 and subsequent steps.
However, the processing of steps S102, S103, S108, and S109 is repeated and, in cases where adequate detection is not possible even in the farthest position Pn for detecting the face by means of the entrance camera 42 as shown at the bottom of
In step S111, the facial detection judgment section 222 judges whether one or more facial images could be detected, in other words, whether the facial detection people counter K is one or more by that point. When the facial detection people counter K is one or more, at least one facial image has been detected, which means that an anomaly has not occurred in the entrance camera 42, and the processing then moves on to step S104.
However, in step S111, in cases where the facial detection people counter K is not one or more, in other words, no facial images have been detected, it is considered that an anomaly has occurred in the entrance camera 42 and, in step S112, the facial detection judgment section 222 instructs the defective detection information generation section 224 to generate defective detection information. By way of response, the defective detection information generation section 224 reads the data of the images recorded thus far in the memory 223a of the video recording section 223, in other words, the images that have been recorded by the processing of steps S122 through S137 of the flowchart in
As a result of the above processing, when a state where a face can be adequately detected exists, the pan, tilt, zoom, focal length, and iris that have been set are corrected to suitable values. When a state where adequate facial detection is possible does not exist, the focal length is changed stepwise to positions farther away from the entrance camera 42 so that the pan, tilt, zoom, focal length, and iris are brought to conditions under which a face can be detected; a facial image can therefore be correctly extracted from the image picked up by the entrance camera 42. Further, in cases where facial detection is not possible even when the adjustment is repeated while changing the focal position stepwise, it is considered that an anomaly has occurred in the entrance camera 42, defective detection information is generated, and the anomaly of the entrance camera 42 can thereby be detected.
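The adjustment loop summarized above might be sketched as follows, with the camera-control and detection routines passed in as assumed callables and the list of focal positions (HP, P1, ..., Pn) supplied by the caller; this is an illustration of the stepping-and-fallback logic only.

```python
# Sketch of the adjustment loop (steps S102 to S112): while faces cannot be
# adequately detected, the focal position is stepped one stage further away
# from the entrance camera 42; if detection still fails at the farthest
# position and not a single face was detected, defective detection information
# is generated.  All callables and the position list are assumed placeholders.
def adjust_entrance_camera(focal_positions, set_focal_position, run_detection,
                           generate_defective_detection_info):
    faces_detected = 0
    for position in focal_positions:                 # HP, P1, ..., Pn
        set_focal_position(position)
        adequate, faces_detected = run_detection()   # (R >= Rth, counter K)
        if adequate:
            return True                              # brightness fine-tuning follows
    if faces_detected == 0:
        # No face at any focal position: treat the entrance camera as anomalous.
        generate_defective_detection_info()
    return False
```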
Processing by the biological information recognition device 21 to display defective detection information in cases where defective detection information has been transmitted will be described next with reference to the flowchart in
In step S151, the defective detection information acquisition section 257 of the biological information recognition device 21 judges whether defective detection information has been transmitted by the entrance image processing unit 41 and the processing is repeated until defective detection information has been transmitted. For example, in cases where defective detection information has been transmitted by the processing of step S112 of the flowchart in
In step S152, the defective detection information acquisition section 257 acquires and stores defective detection information that has been transmitted by the entrance image processing unit 41.
In step S153, the defective detection information image display control section 258 reads defective detection information that has been stored in the defective detection information acquisition section 257, generates a video recording list image 451 as shown in
In
Further, video recording thumbnail display fields 461-1 to 461-4 which display a list of images contained in the defective detection information are displayed below the message display. ‘Most recent image’, ‘one image before’, ‘two images before’, and ‘three images before’ are displayed starting from the left and the leading frames of recorded images are displayed as thumbnail images, for example.
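As a small illustration, the video recording list described above could be assembled from the accumulated recordings roughly as follows; the labels follow the screen description, while the record representation and file names are hypothetical.

```python
# Sketch of assembling the video recording list from accumulated recordings,
# newest last; labels follow the screen described above, the file names are
# hypothetical.
LABELS = ["Most recent image", "One image before",
          "Two images before", "Three images before"]

def build_recording_list(recordings):
    """Pair the four newest recorded files with their thumbnail-field labels."""
    newest_first = list(reversed(recordings))[:4]
    return list(zip(LABELS, newest_first))

print(build_recording_list(["cam42_001.avi", "cam42_002.avi", "cam42_003.avi",
                            "cam42_004.avi", "cam42_005.avi"]))
```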
Playback buttons 462-1 to 462-4 are provided below the video recording thumbnail display fields 461-1 to 461-4 in correspondence therewith and are operated by means of the operation section 255 when the respective images are to be played back. A Return button 463 that displays 'return to original image' is provided below and to the right of the video recording list image 451 and is operated by the operation section 255 when switching from the display screen of the video recording list image 451 to the normal display screen shown in
In step S154, the defective detection information image display control section 258 judges, by consulting the operation section 255, whether the playback of any image has been instructed, in other words, whether any of the playback buttons 462-1 to 462-4 has been operated by means of the operation section 255. When the playback button 462-1 has been operated, for example, the defective detection information image display control section 258 judges that playback has been instructed and, in step S155, reads and plays back the data of the image for which the playback instruction has been issued, which is stored in the defective detection information acquisition section 257, and displays the data on the display section 23. In this case, because playback of the 'most recent image' has been instructed, the images that were supplied by the image acquisition section 201 while the processing of the above-described steps S122 to S137 was executed are played back.
However, in cases where no playback is instructed in step S154, the processing of step S155 is skipped.
In step S156, the defective detection information image display control section 258 judges whether the Return button 463 has been operated by consulting the operation section 255 and, when the Return button 463 has not been operated, the processing returns to step S154.
However, in step S156, in cases where it is judged that the Return button 463 has been operated, in step S157, the defective detection information image display control section 258 stops displaying the video recording list image 451, returns to the normal display screen shown in
As a result of the above processing, when defective detection information is transmitted by the entrance image processing unit 41, the defective detection information is displayed. Therefore, when an anomaly occurs in the entrance camera 42, this fact is communicated to the clerk of the gaming facility 1 managing the biological information recognition device 21, and the anomaly of the entrance camera 42 can be rapidly recognized while the biological information recognition device 21 is being monitored.
As a result, it is possible to easily extract a facial image without increasing the number of cameras at the image pickup point and adjustment of individual cameras can be easily implemented. The image pickup state of the cameras can always be maintained in the optimum state and, therefore, facial image recognition processing can be more accurately implemented.
Further, the above-described series of monitoring processing can be executed by hardware but can also be executed by software. In cases where the series of processing is executed by software, the program constituting the software is installed via a recording medium on a computer incorporated in dedicated hardware or on a general-purpose personal computer or the like, for example, which is capable of executing various functions through the installation of various programs.
Connected to the input/output interface 1005 are an input section 1006 comprising input devices such as a keyboard and a mouse with which the user inputs operating commands, an output section 1007 that outputs images of the processing operation screen and the processing results to a display device, a storage section 1008 comprising a hard disk drive or the like for storing programs and a variety of data, and a communication section 1009, comprising a LAN (Local Area Network) adapter or the like, that executes communication processing via a network represented by the Internet. In addition, a drive 1010 is connected which reads and writes data to removable media 1011 such as magnetic disks (including flexible disks), optical disks (including CD-ROMs (Compact Disc-Read Only Memory) and DVDs (Digital Versatile Disc)), magneto-optical disks (including MDs (Mini Disc)), or semiconductor memory.
The CPU 1001 executes various processing in accordance with programs stored in the ROM 1002 or programs which are installed in the storage section 1008 after being read from the removable media 1011 such as magnetic disks, optical disks, magneto-optical disks, or semiconductor memory and which are loaded from the storage section 1008 into the RAM 1003. Data required by the CPU 1001 to execute various processing is also suitably stored in the RAM 1003.
In this specification, the steps describing the programs recorded on the recording media include not only processing that is executed chronologically in the described sequence but also, naturally, processing that is not necessarily executed chronologically and is instead executed in parallel or individually.
Furthermore, in this specification, the term 'system' represents the entire apparatus constituted by a plurality of devices.
Number | Date | Country | Kind |
---|---|---|---|
2006-126364 | Apr 2006 | JP | national |