This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In some embodiments, a system is provided. The system comprises an automated teller machine (ATM), a first network that communicatively couples the ATM to a financial system, an IP camera device, a second network that is communicatively coupled to the IP camera and separate from the first network, and a video annotation device. The video annotation device is communicatively coupled to the first network to receive information addressed to the ATM, and is communicatively coupled to the IP camera device via the second network.
In some embodiments, a method is provided. A computing device monitors automated teller machine (ATM) transaction information. The computing device generates on-screen display information based on the ATM transaction information. The computing device causes the on-screen display information to be added to a video or a still image snapshot associated with the ATM transaction information.
In some embodiments, a non-transitory computer-readable medium is provided. The computer-readable medium has computer-executable instructions stored thereon. The instructions, in response to execution by one or more processors of a computing device, cause the computing device to perform actions comprising monitoring communication signals addressed to an automated teller machine (ATM) on a first network, and generating annotation information based on the communication signals.
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
Automated teller machines, or ATMs, are becoming the predominant way in which individuals conduct transactions with financial institutions such as banks. While the convenience provided by ATMs is a benefit of the technology, the locations of ATMs are often not as secure as a bank lobby. Accordingly, various steps are generally taken in order to monitor the locations of ATMs for malicious activity.
For monitoring purposes, one or more camera devices are positioned such that their field of view captures activities that occur near the ATM 96. As shown, an analog camera device 88 and/or an Internet Protocol (IP) camera device 84 may be so positioned. Some cameras 84, 88 may be positioned within a cabinet of the ATM 96 so as to capture a face of a user of the ATM 96. Some cameras 84, 88 may be positioned away from the ATM 96 and may be directed toward the ATM 96 in order to capture a profile view of a user of the ATM 96. The analog camera device 88 may be coupled to a video recording device 86 via component video or composite video cables, and the video recording device 86 may record video generated by the analog camera device 88 in a suitable format and on a suitable medium. The IP camera device 84 may be communicatively coupled to a network video recorder 82 via a second network 94, and the network video recorder 82 may record video generated by the IP camera device 84 and transmitted over the second network 94.
The first network 92 and the second network 94 may each include any suitable communication technology, including but not limited to Ethernet, fiber optics, Wi-Fi, WiMAX, 2G, 3G, 4G, LTE, modem communication over a telephony network, and the Internet. In some embodiments, the first network 92 and the second network 94 may be completely isolated from each other, and may not share any common resources such as transmission wires or access points. In some embodiments, some portions of the first network 92 and the second network 94 may share some hardware (such as separate subnets connected by a shared router), but devices on the second network 94 may nevertheless be prevented from communicating with devices on the first network 92 for security purposes. In some embodiments, the first network 92 may be a 100 Mbps Ethernet network, and the second network 94 may be a gigabit Ethernet network.
While capturing a face 210 of a person standing in front of the ATM 96 can be useful in fraud prevention and theft prevention, the traditional system 100 has technical drawbacks. For example, the image 204 cannot easily be correlated to a transaction that was occurring at the time the image 204 was captured. Even if a timestamp generated by the camera device 84, 88 for the image 204 were accurately correlated with a timestamp of financial records generated by the financial institution system 98, it is unlikely that a reviewer of the image 204 would have access to the financial records stored by the financial institution system 98: providing such access would raise security and privacy concerns, and the video recording device 86, the network video recorder 82, and the IP camera device 84 are walled off from the first network 92 and therefore cannot access such information. What is desired are techniques for making transaction information available while reviewing images or videos captured by the camera devices 84, 88, while retaining the security and privacy offered by the separation of the first network 92 and the second network 94.
Unlike the traditional system 100, the system 300 includes a video annotation device 302. The video annotation device 302 includes a first network interface that is coupled to the first network 92, and listens to network traffic between the ATM 96 and the financial institution system 98. From this network traffic, the video annotation device 302 extracts information about transactions occurring using the ATM 96, and generates annotations based on the information.
In some embodiments, the analog camera device 88 provides a video signal to the video annotation device 302. The video annotation device 302 then updates the video signal with an annotation, and provides the updated video signal to the video recording device 86. In some embodiments, the video annotation device 302 includes a second network interface that is coupled to the second network 94. The video annotation device 302 can then transmit annotation information to the IP camera device 84 to be added to a digital video signal as on-screen display (OSD) information. The video annotation device 302 may also retrieve video clips or snapshots from the IP camera device 84 for storage on the video annotation device 302 along with the annotation information.
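As one non-limiting illustration of these two paths, the following Python sketch routes an annotation to either the analog path or the IP path; the helper functions named here (store_annotated_snapshot and push_osd_annotation) are hypothetical and are sketched in more detail later in this description:

    def route_annotation(annotation_text, analog_path, frame=None, conn=None, captured_at=None):
        """Route an annotation to the analog path or to the IP-camera path."""
        if analog_path:
            # Analog path: the overlay itself is applied before the signal reaches the
            # video recording device 86; a digital copy of the annotated frame may also
            # be kept (see the snapshot sketch later in this description).
            store_annotated_snapshot(conn, frame, annotation_text, captured_at)
        else:
            # IP path: ask the IP camera device 84, over the second network 94, to render
            # the annotation as on-screen display text (see the ONVIF sketch later on).
            push_osd_annotation(annotation_text)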
Further details of the components of the system 300 are provided below.
As shown, the video annotation device 302 includes one or more processors 402, a first network interface 404, a second network interface 406, a video in interface 416, an analog video processing device 418, a video out interface 420, and a non-transitory computer-readable medium 408.
In some embodiments, the one or more processors 402 may include any type of commercially available computer processor. In some embodiments, the first network interface 404 and the second network interface 406 may include hardware and associated software and/or firmware for connecting to suitable networking technologies, including but not limited to wired technologies (including but not limited to Ethernet, USB, FireWire, fiber optic, and serial communication) and/or wireless technologies (including but not limited to 2G, 3G, 4G, LTE, Wi-Fi, WiMAX, and Bluetooth). As one non-limiting example, the first network interface 404 may include a first RJ-45 socket to accept a cable coupled to the first network 92, and the second network interface 406 may include a second RJ-45 socket to accept a cable coupled to the second network 94.
In some embodiments, the video in interface 416 includes a connector such as an RCA socket or jack configured to be coupled via a coaxial cable to an output of an analog camera device 88. The signal received via the video in interface 416 is provided to the analog video processing device 418, which may be an integrated circuit, processor, or other circuitry configured to process the video signal as described below. In some embodiments, the video out interface 420 includes another connector such as an RCA socket or jack configured to be coupled via a coaxial cable to an input of the video recording device 86. The signal processed by the analog video processing device 418 is provided to the video recording device 86 via the video out interface 420. Though the analog camera device 88, the video in interface 416, the video out interface 420, and the video recording device 86 are described as relating to “analog” video, in some embodiments, these components provide video in another format, such as digital, over similar direct-wired video connections. For example, the connection between the analog camera device 88 and the video annotation device 302 (and the connection between the video annotation device 302 and the video recording device 86) may be via a DVI connection, a DisplayPort connection, an HDMI connection, or another type of video transfer technology.
In some embodiments, the computer-readable medium 408 may include one or more computer-readable media that use any suitable technology, including but not limited to a hard drive, a flash drive, an optical drive, an EEPROM, and RAM. As shown, the computer-readable medium 408 has computer-executable instructions stored thereon that, in response to execution by the one or more processors 402, cause the video annotation device 302 to provide a transaction monitoring engine 410 and an annotation generation engine 412.
In some embodiments, the transaction monitoring engine 410 monitors communication traffic on the first network 92 to detect transactions at the ATM 96. In some embodiments, the annotation generation engine 412 uses transaction information detected by the transaction monitoring engine 410 to create annotations to be added to or stored with data generated by the analog camera device 88 and/or the IP camera device 84. As shown, the computer-readable medium 408 also stores thereon a transaction data store 414. In some embodiments, the transaction data store 414 is configured to store copies of data generated by the analog camera device 88 and/or the IP camera device 84, either as updated with the annotations, or along with the annotation information.
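As one non-limiting illustration, the transaction data store 414 might be organized as a single SQLite table that keeps each captured snapshot or clip together with its annotation information; the table name and columns in the following Python sketch are illustrative assumptions only:

    import sqlite3

    def open_transaction_data_store(path="transactions.db"):
        """Create (if needed) and open a minimal transaction data store."""
        conn = sqlite3.connect(path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS annotated_captures ("
            "id INTEGER PRIMARY KEY, "
            "captured_at TEXT, "        # when the clip or snapshot was captured
            "annotation_text TEXT, "    # annotation derived from the transaction
            "media BLOB)"               # snapshot bytes or a short video clip
        )
        return conn

    def store_capture(conn, captured_at, annotation_text, media_bytes):
        """Store one annotated snapshot or clip."""
        conn.execute(
            "INSERT INTO annotated_captures (captured_at, annotation_text, media) "
            "VALUES (?, ?, ?)",
            (captured_at, annotation_text, media_bytes),
        )
        conn.commit()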
In general, the word “engine,” as used herein, refers to logic embodied in hardware or software instructions, which can be written in a programming language, such as C, C++, COBOL, JAVA™, PHP, Perl, HTML, CSS, JavaScript, VBScript, ASPX, Microsoft .NET™, and/or the like. An engine may be compiled into executable programs or written in interpreted programming languages. Software engines may be callable from other engines or from themselves. Generally, the engines described herein refer to logical modules that can be merged with other engines, or can be divided into sub-engines. The engines can be stored in any type of computer-readable medium or computer storage device and be stored on and executed by one or more general-purpose computers, thus creating a special-purpose computer configured to provide the engine or the functionality thereof.
A “data store” as described herein may be any suitable device configured to store data for access by a computing device. One example of a data store is a highly reliable, high-speed relational database management system (DBMS) executing on one or more computing devices. Another example of a data store is a key-value store. However, any other suitable storage technique and/or device capable of quickly and reliably providing the stored data in response to queries may be used, and the computing device may be accessible over a network instead of locally, or may be provided as a cloud-based service. A data store may also include data stored in an organized manner on a computer-readable storage medium, as described further below. The single data store described herein may be divided into multiple data stores or may be stored on multiple computer-readable media without departing from the scope of the present disclosure.
In some embodiments, the video annotation device 302 may include other components. For example, the video annotation device 302 may include a power source such as a 12V 2.0A wall power supply or a Power over Ethernet (PoE) interface. As another example, the video annotation device 302 may include a user interface engine that generates a user interface through which information stored within the transaction data store 414 can be searched, browsed, viewed, downloaded, and/or otherwise managed. In some embodiments, the video annotation device 302 may include multiple interfaces to allow connections to multiple different types of networks, camera devices, and/or recording devices, either concurrently or at different times. For example, a single video annotation device 302 may include RCA video in/out interfaces along with HDMI video in/out interfaces, and may include RJ-45 network interfaces along with USB interfaces and/or wireless interfaces.
At block 504, in response to the transaction request, the ATM 96 receives a function command from the financial institution system 98 via the first network 92. As with the transaction request, the function command may use a standard protocol such as the protocols listed above. In some embodiments, the function command includes instructions to the ATM 96 that cause the ATM 96 to perform actions such as dispensing an instructed amount of currency and/or presenting information to the user. In some embodiments, the function command also includes information describing the transaction. For example, the function command may include “receipt text,” which is a plain-text listing of information relevant to the transaction, to be printed on a receipt.
At block 508, a transaction monitoring engine 410 of a video annotation device 302 receives the function command via the first network 92. In some embodiments, the function command may be addressed to the ATM 96 using an Internet Protocol (IP) address or a media access control (MAC) address that uniquely identifies the ATM 96. As with other IP-based communication, packets intended for the ATM 96 are transmitted on the physical layer of the first network 92, and include the MAC address and/or IP address of the ATM 96 as a destination address in a header portion. Though any device coupled to the first network 92 on the same subnet as the ATM 96 would be able to see the packets traversing the physical layer, only the ATM 96 would typically process the packets due to the addressing. In some embodiments, the transaction monitoring engine 410 is configured during setup of the system 300 to be aware of the MAC address and/or the IP address of the ATM 96, and the transaction monitoring engine 410 reviews packets addressed to the MAC address and/or the IP address of the ATM 96 to look for function commands. When such packets are identified, the transaction monitoring engine 410 receives the packets in order to process the information therein. In some embodiments, the first network interface 404 is placed in “promiscuous mode” so that packets or frames addressed to the ATM 96 are not dropped before being passed to the transaction monitoring engine 410 for processing. In some embodiments, a network switch to which both the video annotation device 302 and the ATM 96 are communicatively coupled may be configured during setup of the system 300 to provide network traffic addressed to the ATM 96 to both the ATM 96 and the video annotation device 302.
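As one non-limiting illustration of this monitoring step, the following Python sketch observes frames addressed to the ATM 96 using the third-party scapy packet-capture library; the interface name and ATM address shown are assumptions for purposes of illustration only:

    from scapy.all import IP, Raw, sniff   # assumption: the third-party scapy library is installed

    ATM_IP = "192.168.1.20"        # assumed address of the ATM 96 on the first network 92
    FIRST_NET_IFACE = "eth0"       # assumed interface corresponding to the first network interface 404

    def handle_frame(packet):
        """Pass payloads addressed to the ATM 96 to a function-command parser."""
        if packet.haslayer(IP) and packet[IP].dst == ATM_IP and packet.haslayer(Raw):
            inspect_function_command(packet[Raw].load)   # hypothetical parser, sketched below

    # The interface should be in promiscuous mode (scapy typically enables this by default)
    # so that frames addressed to the ATM 96 are not dropped by the NIC.
    sniff(iface=FIRST_NET_IFACE,
          filter=f"dst host {ATM_IP}",   # BPF filter: only traffic addressed to the ATM 96
          prn=handle_frame,
          store=False)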
At block 510, the transaction monitoring engine 410 extracts annotation information from the function command. In some embodiments, the annotation information may be retrieved from the receipt text, which may be detected between standard field separators within the function command. In some embodiments, the annotation information may include additional information other than that found in the receipt text, including but not limited to an account number, an account name, a timestamp, and information identifying the ATM 96 or the video annotation device 302.
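As one non-limiting illustration of this extraction step, the following Python sketch assumes a function command in which fields are delimited by the ASCII field-separator character (0x1C) and in which the receipt text is carried in one of those fields; the actual message layout depends on the protocol used between the ATM 96 and the financial institution system 98:

    FIELD_SEPARATOR = b"\x1c"   # assumed field separator; the real value is protocol-dependent

    def inspect_function_command(payload: bytes) -> dict:
        """Split an assumed function-command payload into fields and pull out
        annotation information such as the receipt text."""
        fields = payload.split(FIELD_SEPARATOR)
        # Assumption for illustration: the receipt text is the longest printable field.
        # A real implementation would use the field positions defined by the protocol.
        printable = [f for f in fields if f and all(32 <= b < 127 for b in f)]
        receipt_text = max(printable, key=len).decode("ascii") if printable else ""
        return {
            "receipt_text": receipt_text,
            "timestamp": None,   # e.g., taken from the command itself or from the local clock
        }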
The method 500 then proceeds to a continuation terminal (“terminal A”). From terminal A, the method 500 advances to decision block 512, where a determination is made regarding whether the annotation will be added to an analog video signal.
If it is determined that the annotation will be added to an analog video signal, then the result of decision block 512 is YES. At block 514, the video annotation device 302 receives an analog video signal from an analog camera device 88. At block 516, an annotation generation engine 412 of the video annotation device 302 adds an annotation based on the annotation information to the analog video signal to create an annotated analog video signal. The annotation generation engine 412 may generate the annotation based on the annotation information extracted by the transaction monitoring engine 410, and the annotation itself may be added to the video signal by the analog video processing device 418. In some embodiments, the annotation is added as a text overlay, within a closed caption signal, or in any other suitable format. In some embodiments, the annotation generation engine 412 may also store a digital copy of a portion of the analog video signal, such as a 10-second clip or a snapshot, in the transaction data store 414, either with the annotation applied or in association with the annotation information. At block 518, the video annotation device 302 transmits the annotated analog video signal to a video recording device 86, and the video recording device 86 stores the annotated analog video signal using any suitable technique.
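While the overlay itself may be applied in hardware by the analog video processing device 418, the following non-limiting Python sketch illustrates the digital-copy portion of this step, assuming the OpenCV library is available for burning the annotation text into a captured frame before the frame is stored using the data-store sketch above:

    import cv2   # assumption: the OpenCV library is available on the video annotation device 302

    def annotate_frame(frame, annotation_text):
        """Burn the annotation text into the lower-left corner of a captured frame."""
        cv2.putText(frame,
                    annotation_text,
                    (10, frame.shape[0] - 10),   # text origin near the lower-left corner
                    cv2.FONT_HERSHEY_SIMPLEX,
                    0.6,                         # font scale
                    (255, 255, 255),             # white text
                    2)                           # line thickness, in pixels
        return frame

    def store_annotated_snapshot(conn, frame, annotation_text, captured_at):
        """Encode the annotated frame as JPEG and keep it in the transaction data store 414,
        using the store_capture helper sketched earlier."""
        ok, jpeg = cv2.imencode(".jpg", annotate_frame(frame, annotation_text))
        if ok:
            store_capture(conn, captured_at, annotation_text, jpeg.tobytes())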
In some embodiments, instead of processing the video using the analog video processing device 418, the video annotation device 302 may be communicatively coupled to the video recording device 86 via a data connection such as a USB connection or a serial connection. In such embodiments, instead of processing the video to add the annotation at block 516, the annotation generation engine 412 may transmit the annotation information via the data connection to the video recording device 86, and the video recording device 86 may add the annotation to the video.
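As one non-limiting illustration of this alternative, the following Python sketch assumes the pySerial library and an illustrative serial port, baud rate, and line-oriented framing; the actual connection parameters and message format would be dictated by the video recording device 86:

    import serial   # assumption: the pySerial library is installed

    def send_annotation_over_serial(annotation_text,
                                    port="/dev/ttyUSB0",   # assumed serial port
                                    baudrate=9600):        # assumed baud rate
        """Hand the annotation information to the video recording device 86, which then
        applies the annotation to the recorded video itself."""
        with serial.Serial(port, baudrate, timeout=1) as link:
            # Assumed framing: one newline-terminated ASCII line per annotation.
            link.write(annotation_text.encode("ascii", errors="replace") + b"\n")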
Returning to decision block 512, if it is determined that the annotation will be added to digital video data, then the result of decision block 512 is NO. At block 520, the annotation generation engine 412 transmits a command to an IP camera device 84 via a second network 94 to add an on-screen display annotation based on the annotation information. In some embodiments, the annotation generation engine 412 may transmit a command using the ONVIF standard to indicate that the annotation information should be added as on-screen display (OSD) text to the video being captured by the IP camera device 84.
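As one non-limiting illustration, the following Python sketch assumes the python-onvif-zeep package and illustrative camera credentials; the exact OSD request structure varies by camera and should be taken from the ONVIF Media Service specification and the capabilities reported by the IP camera device 84, rather than from this sketch:

    from onvif import ONVIFCamera   # assumption: the python-onvif-zeep package is installed

    def push_osd_annotation(annotation_text,
                            host="192.168.2.30",     # assumed address of the IP camera device 84
                            user="admin",            # assumed credentials
                            password="password"):
        """Ask the IP camera device 84 to render the annotation as on-screen display text."""
        camera = ONVIFCamera(host, 80, user, password)
        media = camera.create_media_service()
        # Assumption: the camera already exposes at least one text-type OSD entry;
        # otherwise a CreateOSD request would be needed, per the ONVIF Media Service.
        for osd in media.GetOSDs() or []:
            if getattr(osd, "TextString", None) is not None:
                osd.TextString.PlainText = annotation_text   # field names per the ONVIF OSD schema
                media.SetOSD({"OSD": osd})
                break
        return camera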
At block 522, the transaction monitoring engine 410 retrieves a snapshot photo from the IP camera device 84 via the second network 94. The snapshot photo may be retrieved using the ONVIF standard, or using any other technique for communicating with the IP camera device 84. At block 524, the transaction monitoring engine 410 stores the snapshot photo and the annotation information in a transaction data store 414. The annotation information may be applied to the snapshot photo, or may be stored in association with the snapshot photo. In some embodiments, instead of or in addition to retrieving a snapshot photo, the transaction monitoring engine 410 may retrieve a video clip (such as a ten-second video clip).
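As a companion non-limiting illustration, the following Python sketch retrieves a snapshot using the ONVIF GetSnapshotUri operation and stores it with the annotation information, reusing the illustrative camera object and data-store helper from the sketches above; the profile selection, credentials, and authentication scheme are assumptions:

    import requests                          # assumption: used to fetch the snapshot over HTTP
    from datetime import datetime, timezone

    def fetch_and_store_snapshot(camera, conn, annotation_text,
                                 user="admin", password="password"):   # assumed credentials
        """Retrieve a snapshot photo from the IP camera device 84 and store it, together
        with the annotation information, in the transaction data store 414."""
        media = camera.create_media_service()
        profile = media.GetProfiles()[0]     # first media profile, chosen only for illustration
        uri = media.GetSnapshotUri({"ProfileToken": profile.token}).Uri
        # Assumption: basic authentication; some cameras require HTTP digest authentication.
        response = requests.get(uri, auth=(user, password), timeout=5)
        if response.ok:
            store_capture(conn,
                          datetime.now(timezone.utc).isoformat(),
                          annotation_text,
                          response.content)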
In its most basic configuration, the computing device 700 includes at least one processor 702 and a system memory 704 connected by a communication bus 706. Depending on the exact configuration and type of device, the system memory 704 may be volatile or nonvolatile memory, such as read only memory (“ROM”), random access memory (“RAM”), EEPROM, flash memory, or similar memory technology. Those of ordinary skill in the art and others will recognize that system memory 704 typically stores data and/or program modules that are immediately accessible to and/or currently being operated on by the processor 702. In this regard, the processor 702 may serve as a computational center of the computing device 700 by supporting the execution of instructions.
As further illustrated, the computing device 700 may include a network interface 710 comprising one or more components for communicating with other devices over a network. Depending on the desired embodiment, the network interface 710 may provide access to wired and/or wireless network technologies such as those described above.
In the exemplary embodiment depicted, the computing device 700 also includes a storage medium 708. The storage medium 708 may be volatile or nonvolatile, removable or non-removable, and may be implemented using any technology capable of storing information, such as, but not limited to, a hard drive, a solid state drive, optical disc storage, magnetic tape, and/or the like.
As used herein, the term “computer-readable medium” includes volatile and non-volatile and removable and non-removable media implemented in any method or technology capable of storing information, such as computer-readable instructions, data structures, program modules, or other data. In this regard, the system memory 704 and the storage medium 708 described herein are both examples of computer-readable media.
Suitable implementations of computing devices that include a processor 702, system memory 704, communication bus 706, storage medium 708, and network interface 710 are known and commercially available. For ease of illustration, and because it is not important for an understanding of the claimed subject matter, typical components of many computing devices, such as a keyboard, a mouse, a display, and other input/output devices, are not described herein.
While illustrative embodiments have been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.
This application claims the benefit of Provisional Application No. 62/660,769, filed Apr. 20, 2018, the entire disclosure of which is hereby incorporated by reference herein for all purposes.