The disclosed technology relates generally to the capture of digital media files, and more particularly some embodiments relate to the capture of authenticatable digital media files.
Systems and methods for capturing authenticatable digital media files on connected media-capture devices are disclosed. In general, one aspect disclosed features a media-capture device, comprising: one or more sensors; a hardware processor; and a non-transitory machine-readable storage medium encoded with instructions executable by the hardware processor to perform a method comprising: initiating acquisition of one or more sensor data samples representing analog phenomena captured by the one or more sensors; receiving the one or more sensor data samples; responsive to receiving the one or more sensor data samples, encoding the one or more sensor data samples; generating a to-be-signed data structure comprising at least one of: the one or more encoded sensor data samples, or one or more cryptographic hashes of the one or more encoded sensor data samples; generating a cryptographic hash of the to-be-signed data structure; transmitting a time-stamping request to a time-stamping server, wherein the time-stamping request comprises the cryptographic hash of the to-be-signed data structure, and wherein the time-stamping server generates a signed time-stamp responsive to receiving the time-stamping request; generating a digital signature using the to-be-signed data structure, the signed time-stamp, a private cryptographic key, and a signed certificate for the corresponding public cryptographic key; and generating a second data structure comprising the one or more encoded or unencoded sensor data samples, the to-be-signed data structure, and the digital signature.
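For concreteness, the following Python sketch walks the claimed flow end to end under loud simplifying assumptions: the JSON field names, the placeholder sample bytes, and the faked time-stamp token are illustrative inventions, not the claimed formats, and ECDSA via the cryptography package merely stands in for whatever signature scheme a device actually uses.

```python
import hashlib
import json

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Device signing key pair (generation and storage are described later).
private_key = ec.generate_private_key(ec.SECP256R1())

# Stand-ins for encoded sensor data samples (e.g., a JPEG byte stream).
encoded_samples = [b"<jpeg bytes>", b"<location fix bytes>"]

# To-be-signed data structure: here, cryptographic hashes of the samples.
to_be_signed = {
    "sample_hashes": [hashlib.sha256(s).hexdigest() for s in encoded_samples],
}
tbs_bytes = json.dumps(to_be_signed, sort_keys=True).encode()

# The hash of the to-be-signed structure is what the time-stamping request
# carries; the signed token the server would return is faked here.
tbs_hash = hashlib.sha256(tbs_bytes).digest()
signed_time_stamp = b"<signed time-stamp token over tbs_hash>"

# Digital signature over the to-be-signed structure and the time-stamp.
signature = private_key.sign(tbs_bytes + signed_time_stamp,
                             ec.ECDSA(hashes.SHA256()))

# Second data structure: the container that lands in the file system.
container = {
    "samples": [s.hex() for s in encoded_samples],
    "to_be_signed": to_be_signed,
    "time_stamp": signed_time_stamp.hex(),
    "signature": signature.hex(),
}
```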
Embodiments of the system may include one or more of the following features. In some embodiments, the method further comprises storing the second data structure in a file system of the device. In some embodiments, the method further comprises generating auxiliary data based on the one or more encoded or unencoded sensor data samples; generating a hash of the auxiliary data; and adding the hash of the auxiliary data to the first data structure. In some embodiments, the method further comprises, prior to initiating acquisition of the one or more sensor data samples, determining whether the certificate for the public key corresponding to the private cryptographic key has expired; and responsive to determining the certificate for the public key corresponding to the private cryptographic key has expired, disabling acquisition of the one or more sensor data samples. In some embodiments, the method further comprises, responsive to determining the certificate for the public key corresponding to the private cryptographic key has expired, generating a new cryptographic key pair comprising a new public key and a new private key, generating a certificate signing request for the new public key, signing the certificate signing request with the new private key, and transmitting the signed certificate signing request to a registration server; wherein, responsive to receiving the signed certificate signing request, the registration server validates eligibility of the media-capture device to receive a certificate, and responsive to a successful validation relays the signed certificate signing request to a certification server; wherein, responsive to receiving the relayed signed certificate signing request, the certification server issues a signed certificate for the new public key and relays the signed certificate to the registration server; wherein, responsive to receiving the signed certificate, the registration server relays the signed certificate to the media-capture device; and responsive to receiving the signed certificate, storing the signed certificate and enabling acquisition of the one or more sensor data samples. In some embodiments, the certificate for the public key corresponding to the private cryptographic key has a validity window; and determining whether the certificate for the public key corresponding to the private cryptographic key has expired comprises comparing the certificate's validity window to a local time value generated by a local clock in the device. In some embodiments, the method further comprises, prior to determining whether the certificate for the public key corresponding to the private cryptographic key has expired: obtaining a trusted time value from the time-stamping server; and initiating the local clock with the trusted time value.
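The expiry check recited above reduces to a window comparison. A minimal sketch, assuming the validity bounds have already been read out of the stored certificate and the local clock has been initialized with a trusted time value:

```python
from datetime import datetime, timezone

def certificate_expired(not_before: datetime, not_after: datetime,
                        local_now: datetime) -> bool:
    # The certificate is usable only while the local time value falls
    # inside its validity window.
    return not (not_before <= local_now <= not_after)

if certificate_expired(datetime(2021, 3, 1, tzinfo=timezone.utc),
                       datetime(2022, 3, 1, tzinfo=timezone.utc),
                       datetime.now(timezone.utc)):
    acquisition_enabled = False  # then begin the re-enrollment flow above
```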
The present disclosure, in accordance with one or more various embodiments, is described in detail with reference to the following figures. The figures are provided for purposes of illustration only and merely depict typical or example embodiments.
The figures are not exhaustive and do not limit the present disclosure to the precise form disclosed.
Digital media files, such as photos, videos, and audio recordings, are created by media recording devices that digitize analog phenomena into binary information, then encode this binary information into files for storage, transport, or both. Typically, the binary files that encode the digitized analog phenomena carry supplemental information (typically called metadata) that describes the media file and may be helpful to the viewer. For example, this metadata may include the date and time when the media was captured and digitized, or the location where that took place. Some of the metadata may itself be the result of the digitization of analog phenomena (e.g., a capture device's location inferred from a radio transceiver that captures satellite or cellular signals and computes the device's location).
While the resulting media file, which carries both the digitized audiovisual phenomena and the metadata, ostensibly reflects a faithful reproduction of the analog environment that the media capture device digitized, there is typically no way for a downstream consumer of the media file to know that with any certainty. It is possible, and in fact common, for the binary information in media files to be manipulated without leaving evidence of manipulation. While several categories of manipulations are entirely benign, some manipulations may be intended to deceive the media consumer. For example, a manipulator may use several readily available tools and emerging artificial intelligence (AI) technology to add or remove objects from a photo, swap faces in videos, or synthesize someone else's voice to replace words in a recorded speech. This may leave the media consumer defenseless against such manipulations, especially as the technology that enables manipulation grows sophisticated enough to evade forensic detection techniques.
The disclosed embodiments provide credentials that allow a consumer of a digital media file captured using the disclosed technologies to ascertain whether the integrity of the file has been preserved since it was first created. That is, with these credentials, the user can verify that the file has not been altered. The integrity can be ascertained even if the system that issued the credentials by which the integrity is guaranteed no longer functions or exists.
The system 100 may include a time-stamping server 104. The time-stamping server 104 may provide a trusted time value that media capture devices 102 may use to apply trusted time-stamps to the media files they create, thereby proving the existence of a particular piece of data at a given point in time.
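In deployed systems this role is commonly played by an RFC 3161-style time-stamping authority. The sketch below is not the RFC 3161 wire format; it reduces the service to its essence, namely that the server binds a submitted hash to a time value under its own signature:

```python
import time

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

tsa_key = ec.generate_private_key(ec.SECP256R1())  # the server's own key

def issue_time_stamp(submitted_hash: bytes) -> dict:
    # Bind the submitted hash to the current time and sign the pair,
    # proving the hashed data existed no later than this moment.
    ts = int(time.time())
    payload = submitted_hash + ts.to_bytes(8, "big")
    return {"time": ts,
            "signature": tsa_key.sign(payload, ec.ECDSA(hashes.SHA256()))}
```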
The system 100 may include a registration server 108. The registration server 108 may authenticate and approve requests from media-capture devices for cryptographic credentials. The system 100 may include a certification server 110. The certification server 110 may issue the cryptographic credentials. The system 100 may include a validation server 106. The validation server 106 may confirm the validity of the cryptographic credentials.
The system 100 may communicate via a network 112. The network 112 may be public, private, or a combination thereof, and may include the Internet.
While the functions of the time-stamping server 104, registration server 108, certification server 110, and validation server 106 are depicted separately, it should be understood that this is a separation of logical functions, and should not be construed as a mandate for a physical separation of these functions across different servers or machines. In some implementations, these functions may be combined together in various permutations or further subdivided as needed.
The core system of the media-capture device 200 may include one or more sensor data acquisition modules 206. The sensor data acquisition modules 206 may acquire and optionally preprocess the signals from sensors. The media-capture device 200 may include a different sensor data acquisition module 206 for each sensor 202, or one or more sensors 202 may share a data acquisition module 206. Each sensor data acquisition module 206 may be implemented in a dedicated or shared hardware block, software code that executes in a dedicated or shared processor, or a combination of both.
The core system of the media-capture device 200 may include one or more sensor data encoding modules 208. Each sensor data encoding module 208 may encode preprocessed sensor data into a final form. The encoding may compact the sensor data or change its representation in order to make it understandable by downstream recipients, whether human or machine.
The core system of the media-capture device 200 may include a file system 210. The file system 210 may store both ephemeral and non-ephemeral files, including, optionally, media files which may result from the recording activity of the connected media-capture device 200.
The media-capture device 200 may include a capture application (App) 212. The capture application 212 may initiate, control, and receive the results of a media capture operation. In some embodiments, the capture application 212 may be a standalone application that operates autonomously and automatically in the media-capture device 200.
In some embodiments, the capture application 212 may be a user-facing application designed to receive commands from an external actor (e.g., a human user) and relay information about the media capture operation. In such embodiments, the capture application 212 may feature a user control module 218 which is designed to enable an external actor to issue commands to the capture application 212 to effect the capture operation. Also in such embodiments, the capture application 212 may feature a user preview module 216. The user preview module 216 may create a presentation of the sensor data to an external actor that represents a digitized form of the analog phenomena 204. For example, the user preview module 216 in a camera application may present a representation of the data seen by the image sensor through the lens system to a human (e.g., a photographer) via a display subsystem. There may be multiple user preview modules 216 that correspond to different sensors 202 in the media-capture device 200, suitable for the analog phenomena 204 that each sensor 202 converts to electrical signals. In addition to the above-described optional functions, the capture application 212 may contain application logic 214 that implements its core behavior.
The media-capture device 200 may include a controlled capture subsystem 220. The controlled capture subsystem 220 may oversee and control capture operations. The controlled capture subsystem 220 may be responsible for generating a final representation of the captured media along with data that can prove its integrity.
The media-capture device 200 may include an abstraction module 222. The abstraction module 222 may act as an interface to the capture application 212 and the file system 210.
The media-capture device 200 may include a key generation module 226. The key generation module 226 may generate cryptographic keys. The cryptographic keys may be used for generating cryptographic primitives such as digital signatures.
The media-capture device 200 may include a key storage and retrieval module 228. The key storage and retrieval module 228 may provide non-volatile storage for the cryptographic keys generated by the key generation module 226. The key storage and retrieval module 228 may serve up the cryptographic keys for use by other functions.
The media-capture device 200 may include a cryptographic operations module 230. The cryptographic operations module 230 may generate cryptographic primitives such as digital signatures and cryptographic hashes over data it receives from other functions, and may use cryptographic keys when needed.
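One plausible shape for the interplay of modules 226, 228, and 230, sketched with the cryptography package; the unencrypted PEM round trip merely stands in for whatever secure non-volatile storage a real device provides:

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# Key generation module 226: create a signing key pair.
key = ec.generate_private_key(ec.SECP256R1())

# Key storage and retrieval module 228: persist, then later reload, the key.
pem = key.private_bytes(serialization.Encoding.PEM,
                        serialization.PrivateFormat.PKCS8,
                        serialization.NoEncryption())
key = serialization.load_pem_private_key(pem, password=None)

# Cryptographic operations module 230: sign data received from other
# functions, using the retrieved key when needed.
signature = key.sign(b"data from another module", ec.ECDSA(hashes.SHA256()))
```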
The media-capture device 200 may include a communication module 232. The communication module 232 may transmit and receive data over networks such as the public or private networks 112 of FIG. 1.
The media-capture device 200 may include an orchestration module 224. The orchestration module 224 may act as the core logic of the controlled capture subsystem 220.
The hierarchy and division of the modules of the connected media-capture device 200 are only logical. In various implementations, these modules may be merged together, subdivided further, and the like. The modules may span multiple physical, logical, or virtual hardware and software components within the media-capture device 200, as well as multiple security boundaries. The modules may be performed by dedicated hardware, by firmware executing in specialized or generic hardware, by software executing in specialized or generic processing hardware, or any combination thereof. Additionally, these logical modules may make use of hardware, firmware or software resources that are not explicitly depicted in FIG. 2.
The process 300 may begin with the invocation of the application logic 214 of the capture application 212. In some embodiments, the invocation may be initiated by a user. In some embodiments, the invocation may be autonomously effected in the media-capture device 200 without an external agent or trigger. The invocation of the application logic 214 may be the result of the loading of the capture application 212 by an internal or external trigger, or the switching of the capture application 212 into a specialized capture mode (e.g., similar to the invocation of a “panorama” capture mode in a camera app).
The application logic 214 in turn may invoke the controlled capture subsystem 220 by sending a message to the abstraction module 222, passing configuration parameters as part of the invocation. The abstraction module 222 in turn may load and activate the orchestration module 224, passing configuration parameters in the process, such as parameters that define the desired characteristics of the encoded sensor data, for example the desired pixel width and height of a still photograph.
The orchestration module 224 may request that the sensor data acquisition module 206 for one or more sensors 202 initialize, and may pass configuration parameters in the process, such as the desired accuracy level of the sensor data. The sensor data acquisition module 206 for the initialized sensors 202 may signal its success in initializing the sensors 202 to the orchestration module 224. At this point the sensors 202 are ready for capture, as shown at 302 in FIG. 3.
With the orchestration module 224 loaded, and the user preview module 216 optionally operational, the orchestration module 224 may commence the process of preparing cryptographic credentials which will be used to apply integrity data to the captured media data.
If the existing key pair is not valid, at 310, the process 300 may include generating a public/private cryptographic key pair, at 314.
The communication module 232 may send the registration server 108 a request for a signed certificate, passing along the signed certificate signing request. The registration server 108 may receive the signed certificate signing request, and may prepare the signed certificate signing request for transmission to the certification server 110. The method by which the certification server 110 is made aware of the presence of a pending signed certificate signing request at the registration server 108 may vary by implementation, depending on the security objectives of the system. In some implementations, the registration server 108 may initiate a connection to the certification server 110 and transmit the signed certificate signing request. In other implementations, the certification server 110 may poll the registration server 108 at some interval to check for any signed certificate signing requests that are awaiting certification by the certification server 110. If the registration server 108 finds a pending signed certificate signing request, the registration server 108 may signal this to the certification server 110 and may transmit the signed certificate signing request to the certification server 110.
Upon receiving the signed certificate signing request, the certification server 110 may validate its parameters, including the presence and validity of mandatory and optional information about the new key pair and its subject. The subject may be the unique identity of the specific media-capture device 200 that generated the signing request, the identity of the capture application 212 (for example, a name and software version number), the identity of the controlled capture subsystem 220 (for example, a name and software version number), or similar information. If the parameters are valid, the certification server 110 may compose an unsigned certificate that binds the new key pair to its subject, sets a validity period during which the key pair may be used, and optionally places restrictions on what the new key pair may be used for.
Then, the certification server 110 may sign the unsigned certificate using its own private key, and may signal the success of the operation back to the registration server 108, passing along the signed certificate. In some embodiments, the signed certificate may be in an industry-standard format such as X.509v3. In some embodiments, both the registration server 108 and the certification server 110 may record the receipt of the signed certificate signing request and the issuing of the signed certificate in internal databases.
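A hedged sketch of the two ends of this exchange using the cryptography package's X.509 builders; the subject names, validity period, and curve choice are illustrative assumptions rather than requirements of the disclosed system:

```python
import datetime

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

device_key = ec.generate_private_key(ec.SECP256R1())
ca_key = ec.generate_private_key(ec.SECP256R1())

# Device side: a certificate signing request for the new public key,
# signed with the new private key.
csr = (x509.CertificateSigningRequestBuilder()
       .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME,
                                                   "device-1234")]))
       .sign(device_key, hashes.SHA256()))

# Certification server side: bind the requested public key to its subject,
# set a validity period, and sign with the server's own private key.
now = datetime.datetime.now(datetime.timezone.utc)
cert = (x509.CertificateBuilder()
        .subject_name(csr.subject)
        .issuer_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME,
                                                   "example-ca")]))
        .public_key(csr.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .sign(ca_key, hashes.SHA256()))
```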
The capture process 400 is described in terms of capturing and encoding a snapshot in time of the value of one or more sensors. In this example, a single “capture” command may take place once, and is expected to return a result. For example, the result may include a two-dimensional array of pixel values representing an image digitized by an image sensor around a particular moment in time, or the value of a location sensor or a temperature sensor around a particular moment in time.
However, the disclosed technology also applies to other forms of capture that aggregate together multiple sensor readings into a single encoded sensor data value. The single value may be a composite of multiple individual readings from the sensor, anchored around a moment in time. For example, a single, instantaneous “capture” command may trigger a process by which multiple frames of image sensor data are acquired then combined into a single encoded image, for noise reduction or other image enhancement purposes. A similar process may be applied to a “burst” operation, which is usually accomplished by holding down a capture button. This burst operation results in multiple single encoded sensor data values.
The abstraction module 222 may pass along the capture command and any configuration parameters to the orchestration module 224. The orchestration module 224 may then query each sensor data acquisition module 206 for one or more sensor readings, and may pass along configuration parameters as part of the query.
The capture process 400 may include, responsive to receiving the one or more preprocessed sensor data samples, encoding them using the coding mechanism expected by downstream recipients of the media file, at 406.
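As an illustration of this encoding step, the sketch below compresses a stand-in pixel array into a JPEG byte stream with Pillow; the codec and quality setting are assumptions, not requirements of the disclosed technology:

```python
import io

import numpy as np
from PIL import Image

# Preprocessed sensor data: a stand-in 480x640 RGB pixel array.
pixels = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)

# Encode into the representation expected by downstream recipients.
buf = io.BytesIO()
Image.fromarray(pixels).save(buf, format="JPEG", quality=90)
encoded_sample = buf.getvalue()  # this byte stream is what gets hashed and signed
```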
An encoded sensor data snapshot value may have multiple dimensions. For example, an encoded sensor data snapshot value may be a dimensionless unitary value, a one-dimensional array of values, a two-dimensional matrix of values, a three-dimensional tensor of values, and values having other dimensions. Whatever the dimensionality, the data may represent a single encoded value of the sensor reading anchored at a specific moment in time.
A single capture operation may include the capture of encoded sensor data snapshot values from multiple sensors 202 at the same time. For example, for a media-capture device capable of capturing still photos, a still photo capture may include data values from an image sensor, a location sensor, a depth sensor, a pressure sensor, a magnetometer, and similar sensors. The orchestration module 224 may acquire data from multiple different sensor types via corresponding sensor data acquisition modules 206 and sensor data encoding modules 208, and may collate their data into a single file container at the end of the capture operation.
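A sketch of such collation; the container layout and field names are invented for illustration, and a real implementation would more likely embed this data in a standard media container format:

```python
import hashlib
import json
import time

# Snapshot values gathered from several sensors during one capture operation.
snapshots = {
    "image":    b"<jpeg bytes>",
    "location": b'{"lat": 0.0, "lon": 0.0}',
    "pressure": b"1013.25",
}

# Collate everything into one container, keyed by sensor, with a hash per
# sample so each value can later be individually protected.
capture_container = {
    "captured_at": int(time.time()),
    "samples": {name: {"data": data.hex(),
                       "sha256": hashlib.sha256(data).hexdigest()}
                for name, data in snapshots.items()},
}
container_bytes = json.dumps(capture_container, sort_keys=True).encode()
```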
While the process 400 describes the use of sensor data samples, auxiliary data about the sensor data samples, and protectable non-sensor data, various embodiments may include only the sensor data samples, only the sensor data samples and the auxiliary data about the sensor data samples, only the protectable non-sensor data, or any combination thereof.
The abstraction module 222 may then signal the success of the capture operation back to the application logic 214, passing the file handle for the container file that houses the captured media and the associated cryptographic authentication data. The application logic 214 may then signal the success of the capture operation to the user via the user preview module 216, and may re-enable the user control module 218 for the user.
Once the file is available on the file system 210, it may be shared with other modules inside the media-capture device 200, or shared with external devices, services, or human actors for consumption and evaluation. A downstream recipient may use digital signature validation tools to verify that the data protected by the digital signature (i.e. the “to-be-signed” data structure, and the data that it in turn protects) has not been modified since the time of the creation of the digital signature.
More importantly, because the file contains a signed time-stamp and a signed certificate for the public key (either directly or as part of the digital signature), a downstream validator can validate the digital signature even after the key pair that was used to create the digital signature has expired or is no longer valid. This is because the signed time-stamp guarantees that the digital signature was created while the signing key pair still had a valid public key certificate, and the two are linked by the hash of the to-be-signed data structure.
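Under the simplified formats sketched earlier, downstream validation might proceed as follows: verify the signature with the public key carried in the certificate, then confirm the signed time-stamp falls inside the certificate's validity window. The function below assumes a cryptography-package certificate object and naive UTC datetimes:

```python
from datetime import datetime

from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def validate(tbs_bytes: bytes, time_stamp: bytes, signature: bytes,
             cert: x509.Certificate, ts_time: datetime) -> bool:
    try:
        # The signature must cover the to-be-signed structure plus the
        # signed time-stamp, exactly as assembled at capture time.
        cert.public_key().verify(signature, tbs_bytes + time_stamp,
                                 ec.ECDSA(hashes.SHA256()))
    except InvalidSignature:
        return False
    # The time-stamp must fall inside the certificate's validity window,
    # so validation succeeds even after the certificate itself expires.
    return cert.not_valid_before <= ts_time <= cert.not_valid_after
```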
The computer system 500 also includes a main memory 506, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 502 for storing information and instructions to be executed by processor 504. Main memory 506 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 504. Such instructions, when stored in storage media accessible to processor 504, render computer system 500 into a special-purpose machine that is customized to perform the operations specified in the instructions.
The computer system 500 further includes a read only memory (ROM) 508 or other static storage device coupled to bus 502 for storing static information and instructions for processor 504. A storage device 510, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 502 for storing information and instructions.
The computer system 500 may be coupled via bus 502 to a display 512, such as a liquid crystal display (LCD) (or touch screen), for displaying information to a computer user. An input device 514, including alphanumeric and other keys, is coupled to bus 502 for communicating information and command selections to processor 504. Another type of user input device is cursor control 516, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 504 and for controlling cursor movement on display 512. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.
The computing system 500 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.
In general, the terms “component,” “engine,” “system,” “database,” “data store,” and the like, as used herein, can refer to logic embodied in hardware or firmware, or to a collection of software instructions, which may have entry and exit points, written in a programming language, such as, for example, Java, C or C++. A software component may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software components may be callable from other components or from themselves, and/or may be invoked in response to detected events or interrupts. Software components configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware components may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
The computer system 500 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 500 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 500 in response to processor(s) 504 executing one or more sequences of one or more instructions contained in main memory 506. Such instructions may be read into main memory 506 from another storage medium, such as storage device 510. Execution of the sequences of instructions contained in main memory 506 causes processor(s) 504 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.
The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 510. Volatile media includes dynamic memory, such as main memory 506. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.
Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 502. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
The computer system 500 also includes a communication interface 518 coupled to bus 502. Communication interface 518 provides a two-way data communication coupling to one or more network links that are connected to one or more local networks. For example, communication interface 518 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 518 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 518 sends and receives electrical, electromagnetic or optical signals that carry digital data streams.
A network link typically provides data communication through one or more networks to other data devices. For example, a network link may provide a connection through local network to a host computer or to data equipment operated by an Internet Service Provider (ISP). The ISP in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet.” Local network and Internet both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link and through communication interface 518, which carry the digital data to and from computer system 500, are example forms of transmission media.
The computer system 500 can send messages and receive data, including program code, through the network(s), network link and communication interface 518. In the Internet example, a server might transmit a requested code for an application program through the Internet, the ISP, the local network and the communication interface 518.
The received code may be executed by processor 504 as it is received, and/or stored in storage device 510, or other non-volatile storage for later execution.
Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code components executed by one or more computer systems or computer processors comprising computer hardware. The one or more computer systems or computer processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The various features and processes described above may be used independently of one another, or may be combined in various ways. Different combinations and sub-combinations are intended to fall within the scope of this disclosure, and certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate, or may be performed in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The performance of certain of the operations or processes may be distributed among computer systems or computer processors, not only residing within a single machine, but deployed across a number of machines.
As used herein, a circuit might be implemented utilizing any form of hardware, or a combination of hardware and software. For example, one or more processors, controllers, ASICs, PLAs, PALs, CPLDs, FPGAs, logical components, software routines or other mechanisms might be implemented to make up a circuit. In implementation, the various circuits described herein might be implemented as discrete circuits or the functions and features described can be shared in part or in total among one or more circuits. Even though various features or elements of functionality may be individually described or claimed as separate circuits, these features and functionality can be shared among one or more common circuits, and such description shall not require or imply that separate circuits are required to implement such features or functionality. Where a circuit is implemented in whole or in part using software, such software can be implemented to operate with a computing or processing system capable of carrying out the functionality described with respect thereto, such as computer system 500.
As used herein, the term “or” may be construed in either an inclusive or exclusive sense. Moreover, the description of resources, operations, or structures in the singular shall not be read to exclude the plural. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps.
Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. Adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known,” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. The presence of broadening words and phrases such as “one or more,” “at least,” “but not limited to” or other like phrases in some instances shall not be read to mean that the narrower case is intended or required in instances where such broadening phrases may be absent.
The present application claims priority to U.S. Provisional Patent Application No. 63/159,048, filed Mar. 10, 2021, entitled “SYSTEM AND METHOD FOR CAPTURING AUTHENTICATABLE DIGITAL MEDIA FILES ON CONNECTED MEDIA-CAPTURE DEVICES,” the disclosure thereof incorporated by reference herein in its entirety.