End-to-end secure operations from a natural language expression

Information

  • Patent Grant
  • Patent Number
    11,196,540
  • Date Filed
    Friday, January 19, 2018
  • Date Issued
    Tuesday, December 7, 2021
Abstract
Systems and methods for an end-to-end secure operation from an expression in natural language. Exemplary methods include: receiving a set of queries from a natural language processor, the set of queries being produced by a method including: getting data schemas associated with a target data source; obtaining the expression in natural language; performing natural language processing on the expression to determine a desired operation; and generating the set of queries using at least one of matching and inference techniques over the desired operation with respect to the data schemas; encrypting the set of queries using a homomorphic encryption technique; providing the encrypted set of queries to a server, the server including the target data source; acquiring encrypted results, the encrypted results being responsive to the encrypted set of queries; and decrypting the encrypted results using a decryption key to produce desired results.
Description
TECHNICAL FIELD

The present technology relates generally to encryption, and more specifically to homomorphic encryption.


BACKGROUND

The approaches described in this section could be pursued but are not necessarily approaches that have previously been conceived or pursued. Therefore, unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section.


Encryption is the process of encoding a message or information in such a way that only authorized parties can access it and those who are not authorized cannot. Encryption does not by itself prevent interference, but denies the intelligible content to a would-be interceptor. In an encryption scheme, the intended information or message, referred to as plaintext, is encrypted using an encryption algorithm, referred to as a cipher, generating ciphertext that can only be read if decrypted. A cryptosystem is a pair of algorithms (encryption and decryption) that take a key and convert plaintext to ciphertext and back.


Encryption is used by militaries and governments to facilitate secret communication. It is also used to protect information within civilian systems. Encryption can be used to protect data “at rest,” such as information stored on computers and storage devices. Encryption is also used to protect data in transit, for example, data being transferred via networks (e.g., the Internet, e-commerce), mobile telephones, Bluetooth devices and bank automatic teller machines (ATMs).


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described in the Detailed Description below. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


The present disclosure is related to various systems and methods for an end-to-end secure operation from an expression in natural language. Specifically, a method may comprise: receiving a set of queries from a natural language processor, the set of queries being produced by a method including: getting data schemas associated with a target data source; obtaining the expression in natural language; performing natural language processing on the expression to determine a desired operation; and generating the set of queries using at least one of matching and inference techniques over the desired operation with respect to the data schemas; encrypting the set of queries using a homomorphic encryption technique; providing the encrypted set of queries to a server, the server including the target data source; acquiring encrypted results, the encrypted results being responsive to the encrypted set of queries; and decrypting the encrypted results using a decryption key to produce desired results.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments are illustrated by way of example, and not by limitation, in the figures of the accompanying drawings, in which like references indicate similar elements and in which:



FIG. 1 is a simplified representation of a system for encryption, according to some embodiments.



FIG. 2 is a simplified representation of a system for homomorphic encryption, according to various embodiments.



FIG. 3 is a simplified block diagram of a system for end-to-end secure operations from a natural language expression, in accordance with some embodiments.



FIG. 4 is a flow diagram of a method for end-to-end secure queries from speech, in accordance with various embodiments.



FIG. 5 is a simplified block diagram of a computing system, according to some embodiments.





DETAILED DESCRIPTION

While this technology is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail several specific embodiments with the understanding that the present disclosure is to be considered as an exemplification of the principles of the technology and is not intended to limit the technology to the embodiments illustrated. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the technology. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that like or analogous elements and/or components, referred to herein, may be identified throughout the drawings with like reference characters. It will be further understood that several of the figures are merely schematic representations of the present technology. As such, some of the components may have been distorted from their actual scale for pictorial clarity.



FIG. 1 illustrates system 100 for encryption, according to some embodiments. System 100 can include source system 110, destination system 120, and communications link 130. Source system 110 and destination system 120 can include at least some of the characteristics of computing systems described further in relation to FIG. 5. Source system 110 can include encryption engine 112. Destination system 120 can include decryption engine 122 and process 124. Encryption engine 112, decryption engine 122, and/or process 124 can include any of an application-specific integrated circuit (ASIC), field programmable gate array (FPGA), application-specific standard product (ASSP), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.


Encryption engine 112 can encrypt plaintext A to ciphertext A′ using an encryption algorithm and an encryption key. Decryption engine 122 can decrypt ciphertext A′ to plaintext A using the encryption algorithm and a decryption key.


In symmetric-key encryption schemes, the encryption and decryption keys are the same. In symmetric-key encryption schemes, source system 110 and destination system 120 should have the same key in order to achieve secure communication over communications link 130. Examples of symmetric-key encryption schemes include: Twofish, Serpent, AES (Rijndael), Blowfish, CAST5, Kuznyechik, RC4, 3DES, Skipjack, Safer+/++ (Bluetooth), and IDEA.
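For illustration only, the following minimal Python sketch shows the shared-key property using the third-party cryptography package's Fernet recipe (an AES-based authenticated scheme not listed above); the names and roles are hypothetical and do not limit the embodiments.

```python
# Minimal sketch of symmetric-key encryption: one shared key both encrypts and
# decrypts. Uses the third-party "cryptography" package (pip install cryptography);
# Fernet is an AES-128-CBC + HMAC construction, shown here purely for illustration.
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()       # both source and destination must hold this key
source = Fernet(shared_key)              # encryption engine 112 (source system 110)
destination = Fernet(shared_key)         # decryption engine 122 (destination system 120)

ciphertext = source.encrypt(b"plaintext A")      # A -> A'
plaintext = destination.decrypt(ciphertext)      # A' -> A
assert plaintext == b"plaintext A"
```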


In public-key encryption schemes, the encryption key (public key) is published for anyone (e.g., source system 110) to use and encrypt messages. However, only the receiving party (e.g., destination system 120) has access to the decryption key (private key) that enables messages to be read. Examples of public-key encryption schemes include: RSA, ElGamal, Elliptic Curve Cryptography (ECC), and Cramer-Shoup.
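Again for illustration only, a minimal Python sketch of the public-key pattern using RSA-OAEP from the third-party cryptography package; key sizes and parameters are illustrative, not prescriptive.

```python
# Minimal sketch of public-key encryption: anyone may encrypt with the published
# public key, but only the holder of the private key can decrypt. Uses RSA-OAEP
# from the third-party "cryptography" package; parameters are illustrative only.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # held by destination 120
public_key = private_key.public_key()                                         # published to source 110

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
ciphertext = public_key.encrypt(b"plaintext A", oaep)   # source encrypts with the public key
plaintext = private_key.decrypt(ciphertext, oaep)       # only the private key can recover it
assert plaintext == b"plaintext A"
```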


Process 124 can be any operation performed (or application which works) on information (e.g., plaintext A). For example, process 124 can be a database search, Internet search, financial transaction, ecommerce transaction, word processing application, spreadsheet application, and the like.


Although depicted as separate systems, source system 110 and destination system 120 can be a single system where ciphertext (encrypted or encoded information) is created, stored, and (subsequently) converted back to plaintext (readable information). Communications link 130 can be various combinations and permutations of wired and wireless networks (e.g., Ethernet, Wi-Fi, Bluetooth, mobile broadband, the Internet, etc.), internal/external computer busses, and the like, such as described in relation to FIG. 5.



FIG. 2 shows system 200 for homomorphic encryption, according to various embodiments. System 200 can include source system 210, destination system 220, and communications link 230. Source system 210 and destination system 220 can include at least some of the characteristics of computing systems described further in relation to FIG. 5. Source system 210 can include encryption engine 212. Destination system 220 can include process 224. Encryption engine 212 and/or process 224 can include any of an application-specific integrated circuit (ASIC), field programmable gate array (FPGA), application-specific standard product (ASSP), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.


Encryption engine 212 can encrypt plaintext B to ciphertext B′ using a homomorphic encryption algorithm and an encryption key. Homomorphic encryption is a form of encryption in which a certain algebraic operation (generally referred to as addition or multiplication) performed on plaintext is equivalent to another operation performed on ciphertext. Homomorphic encryption algorithms can be partially homomorphic (exhibiting either additive or multiplicative homomorphism, or supporting an unlimited number of operations of one type and only a limited number of the other) or fully homomorphic (exhibiting both additive and multiplicative homomorphism). For example, in the Paillier cryptosystem, which is partially (additively) homomorphic, multiplying two ciphertexts produces a ciphertext that decrypts to the sum of the corresponding plaintexts.
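A toy, deliberately insecure Paillier example in pure Python illustrates the additive case: multiplying ciphertexts modulo n² yields a ciphertext of the sum of the plaintexts. The tiny parameters are for exposition only and are not part of the claimed system.

```python
# Toy textbook Paillier cryptosystem with tiny, insecure parameters, shown only to
# illustrate additive homomorphism: multiplying ciphertexts modulo n^2 yields a
# ciphertext of the sum of the plaintexts. Not a production implementation.
import math, random

p, q = 17, 19                      # toy primes (a real system uses large primes)
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)       # Carmichael function lambda(n) for n = p*q
g = n + 1                          # standard choice of generator
mu = pow(lam, -1, n)               # with g = n + 1, L(g^lam mod n^2) = lam mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (((pow(c, lam, n2) - 1) // n) * mu) % n   # L(c^lam mod n^2) * mu mod n

c1, c2 = encrypt(12), encrypt(30)
assert decrypt((c1 * c2) % n2) == 42                 # E(12) * E(30) decrypts to 12 + 30
```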


Examples of partially homomorphic cryptosystems include: RSA (multiplicative homomorphism), ElGamal (multiplicative homomorphism), and Paillier (additive homomorphism). Other partially homomorphic cryptosystems include the Okamoto-Uchiyama, Naccache-Stern, Damgård-Jurik, Sander-Young-Yung, Boneh-Goh-Nissim, and Ishai-Paskin cryptosystems. Examples of fully homomorphic cryptosystems include: the Brakerski-Gentry-Vaikuntanathan, Brakerski's scale-invariant, NTRU-based, and Gentry-Sahai-Waters (GSW) cryptosystems.


Process 224 can be an operation performed (or application which works) on homomorphically encrypted information (e.g., ciphertext B′) such that decrypting the result of the operation is the same as the result of some operation performed on the corresponding plaintext (e.g., plaintext B). For example, a homomorphically encrypted Internet search engine receives encrypted search terms and compares them with an encrypted index of the web. By way of further non-limiting example, a homomorphically encrypted financial database stored in the cloud allows users to ask how much money an employee earned in a particular time period; the database accepts an encrypted employee name and outputs an encrypted answer, avoiding the privacy problems that can plague online services that deal with such sensitive data.
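As a hedged sketch of this pattern (not the claimed system), the third-party python-paillier (phe) package can sum encrypted salary figures on a server that never sees the plaintext values.

```python
# Hedged illustration of the pattern above using the third-party python-paillier
# package ("pip install phe"): the server sums encrypted salary figures without
# ever seeing them; only the client, holding the private key, can read the total.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Client side: encrypt an employee's per-month earnings before uploading.
monthly_earnings = [5200, 5200, 6100]
encrypted = [public_key.encrypt(x) for x in monthly_earnings]

# Server side: operates only on ciphertexts (additive homomorphism).
encrypted_total = sum(encrypted[1:], encrypted[0])

# Client side: decrypt the returned result.
assert private_key.decrypt(encrypted_total) == sum(monthly_earnings)
```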


Communications link 230 can be various combinations and permutations of wired and wireless networks (e.g., Ethernet, Wi-Fi, Bluetooth, mobile broadband, the Internet, etc.), internal/external computer busses, and the like, such as described in relation to FIG. 5.



FIG. 3 depicts system 300 for end-to-end secure operations from a natural language statement, in accordance with some embodiments. System 300 can include one or more clients 3101-310M, one or more servers 3201-320N, communications links 330, and one or more natural language processors 3401-340O. One or more clients 3101-310M, one or more servers 3201-320N, and one or more natural language processors 3401-340O can each be disposed in same and/or different locations (e.g., offices, data centers, cities, counties, geographic regions, countries, continents, etc.). Additionally or alternatively, one or more clients 3101-310M, one or more servers 3201-320N, and one or more natural language processors 3401-340O can each be in varied computing environments, including shared computing architectures, hybrid architectures, distinct architectures (e.g., cloud computing environments), and combinations thereof. One or more clients 3101-310M, one or more servers 3201-320N, and one or more natural language processors 3401-340O can each include any of an application-specific integrated circuit (ASIC), field programmable gate array (FPGA), application-specific standard product (ASSP), an electronic circuit, a processor (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality. Additionally or alternatively, one or more clients 3101-310M, one or more servers 3201-320N, and one or more natural language processors 3401-340O can include at least some of the characteristics of computing systems described further in relation to FIG. 5.


A target data source may be in a single server or distributed over multiple servers of one or more servers 3201-320N as target data source 3221-322N. Target data source 3221-322N can be unencrypted (in plaintext form), deterministically encrypted (e.g., RSA), semantically encrypted (e.g., AES), and combinations thereof. When target data source 3221-322N is a combination of encrypted and unencrypted fields, each field can be consistently encrypted or unencrypted. For example, when target data source 3221-322N includes an unencrypted “employee name” field, the employee names are all unencrypted, as opposed to some names being encrypted and others unencrypted. By way of further non-limiting example, when target data source 3221-322N includes an encrypted “social security number” field, the social security numbers are all encrypted, as opposed to some social security numbers being encrypted and others unencrypted. Data stored in and/or retrieved from target data source 3221-322N can be encrypted and/or decrypted as described in relation to FIG. 1.


One or more natural language processors 3401-340O perform (speaker-dependent and/or speaker-independent) automatic speech recognition (ASR), computer speech recognition, speech to text (STT), and the like, according to some embodiments. For example, one or more natural language processors 3401-340O receive spoken/audible natural (human) language (speech/statement/expression) (e.g., using a transducer such as a microphone, analog-to-digital converter, digital signal processor, etc.), recognize the speech/statement/expression, and translate it into text. A natural language expression can be one or more words conveying a desired operation (e.g., query or analytic). Additionally or alternatively, one or more natural language processors 3401-340O convert the translated text into a desired operation (e.g., query or analytic).


In some embodiments, one or more natural language processors 3401-340O identify boundaries between words, syllables, or phonemes in spoken natural languages, referred to as speech segmentation. For example, one or more natural language processors 3401-340O understand a complex spoken sentence by decomposing it into smaller lexical segments (approximately, the words of the language), associating a meaning to each segment, and combining those meanings according to the grammar rules of the language, referred to as lexical recognition. By way of further non-limiting example, one or more natural language processors 3401-340O use phonotactic cues to identify boundaries between lexical units. Phonotactic cues can be restrictions in a language based on the permissible combinations of phonemes, such as permissible syllable structure, consonant clusters and vowel sequences. In various embodiments, one or more natural language processors 3401-340O use acoustic modeling and/or language modeling, such as Hidden Markov Models (HMM), dynamic time warping (DTW), neural networks, deep learning (deep feedforward neural network (DNN)), “end-to-end” ASR, and the like.
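For illustration, a minimal pure-Python dynamic time warping (DTW) sketch aligns two one-dimensional feature sequences; production recognizers apply such techniques to multidimensional acoustic feature vectors, and this sketch is not the claimed method.

```python
# Minimal dynamic time warping (DTW) sketch: computes the minimum alignment cost
# between two 1-D feature sequences. Real speech recognizers apply this (or HMM /
# neural models) to multidimensional acoustic feature vectors, not raw scalars.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])                    # local distance
            cost[i][j] = d + min(cost[i - 1][j],            # insertion
                                 cost[i][j - 1],            # deletion
                                 cost[i - 1][j - 1])        # match
    return cost[n][m]

# Two utterances of the "same word" at different speaking rates align cheaply.
template = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0]
utterance = [0.0, 1.0, 1.0, 2.0, 3.0, 3.0, 2.0, 1.0, 0.0]
assert dtw_distance(template, utterance) < dtw_distance(template, [3.0] * 9)
```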


Additionally or alternatively, one or more natural language processors 3401-340O can process a request for an operation (e.g., query or analytic) in a natural (human) language expression, according to some embodiments. A natural language expression can be one or more words conveying a desired operation (e.g., query or analytic). For example, one or more natural language processors 3401-340O can receive a natural (human) language expression in text (e.g., received using a physical keyboard, virtual keyboard, etc.), and convert the text expression into a desired operation (e.g., query or analytic).


In various embodiments, one or more natural language processors 3401-340O process (e.g., analyze syntax, semantics, and the like) a received natural (human) language expression. For example, one or more natural language processors 3401-340O perform at least one of lemmatization (e.g., identifying the intended part of speech and meaning of a word in a sentence and within the larger context surrounding that sentence), part of speech tagging (e.g., determining the part of speech for each word), parsing (e.g., determining a parse tree (grammatical analysis) of a sentence, such as with Dependency Parsing and Constituency Parsing), sentence breaking (e.g., finding sentence boundaries), stemming (e.g., reducing inflected or derived words to their word stem, base, or root form), word segmentation (e.g., separating continuous text into separate words), terminology extraction (e.g., extracting relevant terms from a corpus), and the like.


By way of further non-limiting example, one or more natural language processors 3401-340O perform at least one of lexical semantics (e.g., determining a computational meaning of individual words in context), named entity recognition (NER; e.g., determining which items in the text map to proper names and the type of each proper name), natural language understanding (e.g., identifying an intended semantic from multiple possible semantics in the form of organized notations of natural language concepts), word-sense disambiguation (WSD; e.g., identifying which sense of a word (meaning) is used in an expression), and the like. By way of further non-limiting example, optical character recognition (OCR) can be performed on a representation of a (e.g., image or other electronic/digital form of written/printed) natural language expression (e.g., received using a camera, scanner, and the like). By way of further non-limiting example, one or more natural language processors 3401-340O can apply machine learning techniques to natural language processing tasks. The operations of one or more natural language processors 3401-340O, such as described above, may be collectively referred to as natural language processing.
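As a non-limiting illustration of several of these tasks, the following sketch assumes the third-party spaCy library and its small English model are installed; the expression is hypothetical.

```python
# Hedged sketch of several of the tasks above (sentence breaking, lemmatization,
# part-of-speech tagging, dependency parsing, named entity recognition) using the
# third-party spaCy library. Assumes "en_core_web_sm" has been downloaded via
# `python -m spacy download en_core_web_sm`; the expression is hypothetical.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("How much did Alice Smith earn in March 2016?")

for sent in doc.sents:                                  # sentence breaking
    for token in sent:
        # lemma, part of speech, and dependency relation for each token
        print(token.text, token.lemma_, token.pos_, token.dep_)

for ent in doc.ents:                                    # named entity recognition
    print(ent.text, ent.label_)                         # e.g., PERSON, DATE
```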


Communications links 330 can be various combinations and permutations of wired and wireless networks (e.g., Ethernet, Wi-Fi, Bluetooth, mobile broadband, the Internet, etc.), internal/external computer busses, and the like, such as described in relation to FIG. 5. Although depicted as a single “block,” communications links 330 can be, for example, multiple distinct/separate combinations and permutations of wired and wireless networks, internal/external computer busses, and the like. For example, one or more natural language processors 3401-340O are each integrated with a client of the one or more clients 3101-310M and communicate over a computer bus. By way of further non-limiting example, one or more natural language processors 3401-340O are separate from the one or more clients 3101-310M and communicate over a wired network (e.g., LAN).


In some embodiments, system 300 encrypts a desired operation (e.g., query or analytic) to be executed over target data source 3221-322N using a homomorphic encryption scheme, such as described in relation to FIG. 2. For example, system 300 (e.g., one or more clients 3101-310M) encrypts the desired query as a set of homomorphic queries Q_E. The homomorphic queries Q_E are encrypted, and the desired query should not be recoverable without a private key. Because one or more servers 3201-320N do not decrypt the desired query or the encrypted results E(R), one or more servers 3201-320N do not have the private key. The homomorphic queries Q_E provide a secure and (completely) encrypted way to perform a query. In contrast, traditional methods of performing queries over data sources require decryption of the query.
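One well-known way to realize such an encrypted query is a Paillier-based private lookup, sketched below purely as an illustration (it is not the specific construction of Q_E in this disclosure): the client encrypts a one-hot selector and the server combines it homomorphically with its data.

```python
# Illustration only: a Paillier-based private lookup. The client encrypts a one-hot
# selector (standing in for Q_E); the server combines it with its column of values
# homomorphically and never learns which row was requested. Uses the third-party
# "phe" package; this is a sketch of the general idea, not the patent's construction.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# --- client: wants row 2 of the server's "salary" column, without revealing it ---
num_rows = 4
selector = [public_key.encrypt(1 if i == 2 else 0) for i in range(num_rows)]  # Q_E

# --- server: evaluates Q_E over its column values, sees only ciphertexts ---
salaries = [50000, 62000, 71000, 58000]
encrypted_result = selector[0] * salaries[0]
for enc_bit, value in zip(selector[1:], salaries[1:]):
    encrypted_result = encrypted_result + enc_bit * value       # E(sum b_i * v_i)

# --- client: decrypts E(R) with the private key the server never held ---
assert private_key.decrypt(encrypted_result) == 71000
```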



FIG. 4 illustrates a method 400 for end-to-end secure queries from speech, in accordance with various embodiments. System 300 (FIG. 3) can perform method 400. At step 410, one or more natural language processors 3401-340O can receive information about target data source 3221-322N from one or more servers 3201-320N. In some embodiments, a natural language processor of one or more natural language processors 3401-340O can receive data schemas D_S associated with data in target data source 3221-322N. Data schemas D_S can be the structure of a database. For example, the information can include a number of records, fields in each record (e.g., name, telephone number, social security number, etc.), and the like in target data source 3221-322N. By way of further non-limiting example, the information can denote whether target data source 3221-322N is unencrypted, encrypted, and combinations thereof. When a part of target data source 3221-322N is encrypted, a client of one or more clients 3101-310M can receive a decryption key—associated with the encryption method used to encrypt the part of target data source 3221-322N—to decrypt returned encrypted data.
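A hypothetical shape for data schemas D_S is sketched below; the table name, field names, types, and encryption flags are illustrative only.

```python
# Hypothetical shape for the data schemas D_S received at step 410. Field names,
# types, and encryption flags are illustrative only.
data_schema = {
    "table": "employees",
    "record_count": 10_000,
    "fields": [
        {"name": "name",                   "type": "text",    "encrypted": False},
        {"name": "telephone_number",       "type": "text",    "encrypted": False},
        {"name": "social_security_number", "type": "text",    "encrypted": True},
        {"name": "monthly_salary",         "type": "integer", "encrypted": True},
    ],
}
```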


At step 420, a natural (human) language expression including a request for a desired operation (desired query or analytic) can be received. The natural (human) language expression can be audible human speech, text, an image, and the like in a digital form. For example, the natural language request is received using a transducer (e.g., microphone), analog-to-digital-converter, a digital signal processor, and the like. By way of further non-limiting example, the natural language request is received using a physical keyboard, virtual keyboard, camera, scanner, and the like. In some embodiments, step 420 is performed by one or more natural language processors 3401-340O.


At step 430, a desired operation U can be determined using the natural language expression. For example, automatic speech recognition (ASR) is performed on the (spoken) natural language expression to determine the desired operation U (desired query or analytic). By way of further non-limiting example, natural language processing is performed on the (text) natural language expression to determine the desired operation U (desired query or analytic). In some embodiments, step 430 is performed by one or more natural language processors 3401-340O.


At step 440, a set of queries Q can be produced using the desired operation U and the received information. For example, the set of queries Q is produced using matching and inference techniques over the desired operation U with respect to data schemas D_S. Queries Q={Q_i} can contain terms and expressions valid within data schemas D_S that satisfy the desired operation U. In some embodiments, step 440 is performed by one or more natural language processors 3401-340O.
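A hedged, toy sketch of this matching step is shown below; the structure of the desired operation U, the synonym table, and the output form are all hypothetical and do not limit the embodiments.

```python
# Hedged sketch of step 440: a toy matcher that maps terms from the desired
# operation U onto fields that actually exist in data schemas D_S, yielding a set
# of queries Q expressed in the schema's own vocabulary. The operation structure,
# synonym table, and output form are all hypothetical.
SYNONYMS = {"earn": "monthly_salary", "salary": "monthly_salary",
            "phone": "telephone_number", "employee": "name"}

def generate_queries(desired_operation, data_schema):
    valid_fields = {f["name"] for f in data_schema["fields"]}
    select_field = SYNONYMS.get(desired_operation["metric"], desired_operation["metric"])
    match_field = SYNONYMS.get(desired_operation["entity_type"], desired_operation["entity_type"])
    queries = []
    if select_field in valid_fields and match_field in valid_fields:   # terms valid within D_S
        queries.append({"select": select_field,
                        "where": {match_field: desired_operation["entity"]},
                        "aggregate": desired_operation.get("aggregate", "sum")})
    return queries

# Desired operation U as it might arrive from the natural language processor,
# and a minimal stand-in for data schemas D_S (both hypothetical).
U = {"metric": "earn", "entity_type": "employee",
     "entity": "Alice Smith", "aggregate": "sum"}
D_S = {"fields": [{"name": "name"}, {"name": "telephone_number"},
                  {"name": "social_security_number"}, {"name": "monthly_salary"}]}
print(generate_queries(U, D_S))
# -> [{'select': 'monthly_salary', 'where': {'name': 'Alice Smith'}, 'aggregate': 'sum'}]
```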


At step 450, queries Q are encoded as a set of homomorphic queries Q_E using a homomorphic encryption scheme H. Homomorphic queries Q_E are encrypted and queries Q should not be recoverable from homomorphic queries Q_E without a private key associated with homomorphic encryption scheme H. In some embodiments, step 450 is performed by one or more clients 3101-310M.


At step 460, using techniques of the homomorphic encryption scheme H, one or more servers 3201-320N can evaluate homomorphic queries Q_E over target data source 3221-322N and produce encrypted results E(R). At step 470, encrypted results E(R) can be provided by one or more servers 3201-320N (FIG. 3) and received by one or more clients 3101-310M.


At step 480, encrypted results E(R) can be decrypted using the private key associated with homomorphic queries Q_E. For example, a client of one or more clients 3101-310M can decrypt the encrypted results E(R). Optionally at step 490, the results R can be decrypted using another decryption key associated with the encryption method used to encrypt the underlying data in target data source 3221-322N.



FIG. 5 depicts an exemplary computer system (or computing system) 500 that may be used to implement some embodiments of the present invention. The computer system 500 in FIG. 5 may be implemented in the contexts of the likes of computing systems, networks, servers, or combinations thereof. The computer system 500 in FIG. 5 includes processor unit(s) 510 and main memory 520. Main memory 520 stores, in part, instructions and data for execution by processor unit(s) 510. Main memory 520 stores the executable code when in operation, in this example. The computer system 500 in FIG. 5 further includes a mass data storage 530, portable storage device 540, output devices 550, user input devices 560, a graphics display system 570, and peripheral device(s) 580.


The components shown in FIG. 5 are depicted as being connected via a single bus 590. The components may be connected through one or more data transport means. Processor unit(s) 510 and main memory 520 are connected via a local microprocessor bus, and the mass data storage 530, peripheral device(s) 580, portable storage device 540, and graphics display system 570 are connected via one or more input/output (I/O) buses.


Mass data storage 530, which can be implemented with a magnetic disk drive, solid state drive, or an optical disk drive, is a non-volatile storage device for storing data and instructions for use by processor unit(s) 510. Mass data storage 530 stores the system software for implementing embodiments of the present disclosure for purposes of loading that software into main memory 520.


Portable storage device 540 operates in conjunction with a portable non-volatile storage medium, such as a flash drive, floppy disk, compact disk, digital video disc, or Universal Serial Bus (USB) storage device, to input and output data and code to and from the computer system 500 in FIG. 5. The system software for implementing embodiments of the present disclosure is stored on such a portable medium and input to the computer system 500 via the portable storage device 540.


User input devices 560 can provide a portion of a user interface. User input devices 560 may include one or more microphones, an alphanumeric keypad, such as a keyboard, for inputting alphanumeric and other information, or a pointing device, such as a mouse, a trackball, stylus, or cursor direction keys. User input devices 560 can also include a touchscreen. Additionally, the computer system 500 as shown in FIG. 5 includes output devices 550. Suitable output devices 550 include speakers, printers, network interfaces, and monitors.


Graphics display system 570 includes a liquid crystal display (LCD) or other suitable display device. Graphics display system 570 is configurable to receive textual and graphical information and process the information for output to the display device.


Peripheral device(s) 580 may include any type of computer support device to add additional functionality to the computer system.


The components provided in the computer system 500 in FIG. 5 are those typically found in computer systems that may be suitable for use with embodiments of the present disclosure and are intended to represent a broad category of such computer components that are well known in the art. Thus, the computer system 500 in FIG. 5 can be a personal computer (PC), hand held computer system, telephone, mobile computer system, workstation, tablet, phablet, mobile phone, server, minicomputer, mainframe computer, wearable, or any other computer system. The computer may also include different bus configurations, networked platforms, multi-processor platforms, and the like. Various operating systems may be used including UNIX, LINUX, WINDOWS, MAC OS, PALM OS, QNX, ANDROID, IOS, CHROME, and other suitable operating systems.


Some of the above-described functions may be composed of instructions that are stored on storage media (e.g., computer-readable medium). The instructions may be retrieved and executed by the processor. Some examples of storage media are memory devices, tapes, disks, and the like. The instructions are operational when executed by the processor to direct the processor to operate in accord with the technology. Those skilled in the art are familiar with instructions, processor(s), and storage media.


In some embodiments, the computing system 500 may be implemented as a cloud-based computing environment, such as a virtual machine and/or container operating within a computing cloud. In other embodiments, the computing system 500 may itself include a cloud-based computing environment, where the functionalities of the computing system 500 are executed in a distributed fashion. Thus, the computing system 500, when configured as a computing cloud, may include pluralities of computing devices in various forms, as will be described in greater detail below.


In general, a cloud-based computing environment is a resource that typically combines the computational power of a large grouping of processors (such as within web servers) and/or that combines the storage capacity of a large grouping of computer memories or storage devices. Systems that provide cloud-based resources may be utilized exclusively by their owners or such systems may be accessible to outside users who deploy applications within the computing infrastructure to obtain the benefit of large computational or storage resources.


The cloud is formed, for example, by a network of web servers that comprise a plurality of computing devices, such as the computing system 500, with each server (or at least a plurality thereof) providing processor and/or storage resources. These servers manage workloads provided by multiple users (e.g., cloud resource customers or other users). Typically, each user places workload demands upon the cloud that vary in real-time, sometimes dramatically. The nature and extent of these variations typically depends on the type of business associated with the user.


It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the technology. The terms “computer-readable storage medium” and “computer-readable storage media” as used herein refer to any medium or media that participate in providing instructions to a CPU for execution. Such media can take many forms, including, but not limited to, non-volatile media, volatile media and transmission media. Non-volatile media include, for example, optical, magnetic, and solid-state disks, such as a fixed disk. Volatile media include dynamic memory, such as system random-access memory (RAM). Transmission media include coaxial cables, copper wire and fiber optics, among others, including the wires that comprise one embodiment of a bus. Transmission media can also take the form of acoustic or light waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, any other physical medium with patterns of marks or holes, a RAM, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a Flash memory, any other memory chip or data exchange adapter, a carrier wave, or any other medium from which a computer can read.


Various forms of computer-readable media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.


Computer program code for carrying out operations for aspects of the present technology may be written in any combination of one or more programming languages, including an object oriented programming language such as JAVA, SMALLTALK, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of wired and/or wireless network, including a (wireless) local area network (LAN/WLAN) or a (wireless) wide area network (WAN/WWAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider, wireless Internet provider, and the like).


The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present technology has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. Exemplary embodiments were chosen and described in order to best explain the principles of the present technology and its practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.


Aspects of the present technology are described above with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.


The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.


The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present technology. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.



Claims
  • 1. A computer implemented method for an end-to-end secure operation from an expression in natural language, comprising: receiving a set of queries from a natural language processor via a client, the set of queries being produced by a method including: getting data schemas associated with a target data source; obtaining the expression in the natural language; performing natural language processing on the expression to determine a desired operation expressed in the natural language; and generating the set of queries using at least one of matching and inference techniques over the desired operation with respect to the data schemas, the set of queries containing terms and expressions valid within the data schemas, the terms and expressions satisfying the desired operation, the set of queries being unencrypted; encrypting the set of queries, via the client, using a homomorphic encryption technique; providing the encrypted set of queries from the client to a server, the server including the target data source; acquiring encrypted results provided by the server and received by the client, the encrypted results being responsive to the encrypted set of queries; and decrypting the encrypted results by the client, using a decryption key to produce desired results.
  • 2. The method of claim 1, wherein the encrypted results are produced by a process including evaluating the encrypted set of queries over the target data source.
  • 3. The method of claim 1, wherein the natural language processing uses at least one of speech segmentation, lexical recognition, phonotactic cues, lemmatization, part of speech tagging, parsing, sentence breaking, stemming, and word segmentation.
  • 4. The method of claim 1, wherein the natural language processing uses at least one of Hidden Markov Models (HMM), dynamic time warping (DTW), neural networks, deep feedforward neural network (DNN), and “end-to-end” automatic speech recognition.
  • 5. The method of claim 1, wherein the desired operation is at least one of a query and an analytic.
  • 6. The method of claim 1, wherein the target data source is at least one of: unencrypted, deterministically encrypted, and semantically encrypted.
  • 7. The method of claim 1, further comprising: decrypting the desired results using another key when the target data source is encrypted, the other key being associated with an encryption method used to encrypt the target data source.
  • 8. The method of claim 1, wherein the homomorphic encryption technique is partially homomorphic.
  • 9. The method of claim 1, wherein the homomorphic encryption technique is fully homomorphic.
  • 10. The method of claim 1, wherein the natural language processor and server each comprise multiple instances of one or more of a hardware server, virtual machine, and container, each instance of the multiple instances including a subset of the target data source.
  • 11. A system for an end-to-end secure operation from an expression in natural language, comprising: a natural language processor: getting data schemas associated with a target data source; obtaining the expression in natural language; performing natural language processing on the expression to determine a desired operation expressed in the natural language; and generating a set of queries using at least one of matching and inference techniques over the desired operation with respect to the data schemas, the set of queries containing terms and expressions valid within the data schemas, the terms and expressions satisfying the desired operation, the set of queries being unencrypted; a client: receiving the set of queries from the natural language processor; encrypting the set of queries using a homomorphic encryption technique; providing the encrypted set of queries to a server, the server including the target data source; acquiring encrypted results, the encrypted results being responsive to the encrypted set of queries; and decrypting the encrypted results using a decryption key to produce desired results; and the server: evaluating the encrypted set of queries over the target data source to produce the encrypted results.
  • 12. The system of claim 11, wherein the natural language processing uses at least one of speech segmentation, lexical recognition, phonotactic cues, lemmatization, part of speech tagging, parsing, sentence breaking, stemming, and word segmentation.
  • 13. The system of claim 11, wherein the natural language processing uses at least one of Hidden Markov Models (HMM), dynamic time warping (DTW), neural networks, deep feedforward neural network (DNN), and “end-to-end” automatic speech recognition.
  • 14. The system of claim 11, wherein the desired operation is at least one of a query and an analytic.
  • 15. The system of claim 11, wherein the target data source is at least one of: unencrypted, deterministically encrypted, and semantically encrypted.
  • 16. The system of claim 11, further comprising: decrypting result R using another key when the target data source is encrypted, the other key being associated with an encryption method used to encrypt the target data source.
  • 17. The system of claim 11, wherein the homomorphic encryption technique is partially homomorphic.
  • 18. The system of claim 11, wherein the homomorphic encryption technique is fully homomorphic.
  • 19. The system of claim 11, wherein the natural language processor and server each comprise multiple instances of one or more of a hardware server, virtual machine, and container, each instance of the multiple instances including a subset of the target data source.
  • 20. A system for an end-to-end secure operation from an expression in natural language comprising: means for receiving a set of queries from a natural language processor, the set of queries being produced by a method including: getting data schemas associated with a target data source; obtaining the expression in the natural language; performing natural language processing on the expression to determine a desired operation expressed in the natural language; and generating a set of queries using at least one of matching and inference techniques over the desired operation with respect to the data schemas, the set of queries containing terms and expressions valid within the data schemas, the terms and expressions satisfying the desired operation, the set of queries being unencrypted; means for encrypting the set of queries using a homomorphic encryption technique; means for providing the encrypted set of queries to a server, the server including the target data source; means for acquiring encrypted results, the encrypted results being responsive to the encrypted set of queries; and means for decrypting the encrypted results by a client using a decryption key to produce desired results.
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Application No. 62/448,890, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,918, filed on Jan. 20, 2017; United States Provisional Application No. 62/448,893, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,906, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,908, filed on Jan. 20, 2017; United States Provisional Application No. 62/448,913, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,916, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,883, filed on Jan. 20, 2017; United States Provisional Application No. 62/448,885, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,902, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,896, filed on Jan. 20, 2017; U.S. Provisional Application No. 62/448,899, filed on Jan. 20, 2017; and U.S. Provisional Application No. 62/462,818, filed on Feb. 23, 2017, all the disclosures of which are hereby incorporated by reference.

US Referenced Citations (190)
Number Name Date Kind
5732390 Katayanagi et al. Mar 1998 A
6178435 Schmookler Jan 2001 B1
6745220 Hars Jun 2004 B1
6748412 Ruehle Jun 2004 B2
6910059 Lu et al. Jun 2005 B2
7712143 Comlekoglu May 2010 B2
7937270 Smaragdis et al. May 2011 B2
8515058 Gentry Aug 2013 B1
8565435 Gentry et al. Oct 2013 B2
8781967 Tehranchi et al. Jul 2014 B2
8832465 Gulati et al. Sep 2014 B2
9059855 Johnson et al. Jun 2015 B2
9094378 Yung Jul 2015 B1
9189411 Mckeen et al. Nov 2015 B2
9215219 Krendelev et al. Dec 2015 B1
9288039 Monet et al. Mar 2016 B1
9491111 Roth et al. Nov 2016 B1
9503432 El Emam et al. Nov 2016 B2
9514317 Martin et al. Dec 2016 B2
9565020 Camenisch et al. Feb 2017 B1
9577829 Roth et al. Feb 2017 B1
9652609 Kang et al. May 2017 B2
9846787 Johnson et al. Dec 2017 B2
9852306 Cash et al. Dec 2017 B2
9942032 Kornaropoulos et al. Apr 2018 B1
9946810 Trepetin et al. Apr 2018 B1
9973334 Hibshoosh et al. May 2018 B2
10027486 Liu Jul 2018 B2
10055602 Deshpande et al. Aug 2018 B2
10073981 Arasu et al. Sep 2018 B2
10075288 Khedr et al. Sep 2018 B1
10129028 Kamakari et al. Nov 2018 B2
10148438 Evancich et al. Dec 2018 B2
10181049 El Defrawy et al. Jan 2019 B1
10210266 Antonopoulos et al. Feb 2019 B2
10235539 Ito et al. Mar 2019 B2
10255454 Kamara et al. Apr 2019 B2
10333715 Chu et al. Jun 2019 B2
10375042 Chaum Aug 2019 B2
10396984 French et al. Aug 2019 B2
10423806 Cerezo Sanchez Sep 2019 B2
10489604 Yoshino et al. Nov 2019 B2
10496631 Tschudin et al. Dec 2019 B2
10644876 Williams et al. May 2020 B2
10693627 Carr Jun 2020 B2
10721057 Carr Jul 2020 B2
10728018 Williams et al. Jul 2020 B2
10771237 Williams et al. Sep 2020 B2
10790960 Williams et al. Sep 2020 B2
10817262 Carr et al. Oct 2020 B2
10873568 Williams Dec 2020 B2
10880275 Williams Dec 2020 B2
10902133 Williams et al. Jan 2021 B2
10903976 Williams et al. Jan 2021 B2
10972251 Carr Apr 2021 B2
20020032712 Miyasaka et al. Mar 2002 A1
20020073316 Collins et al. Jun 2002 A1
20020104002 Nishizawa et al. Aug 2002 A1
20030037087 Rarick Feb 2003 A1
20030059041 MacKenzie et al. Mar 2003 A1
20030110388 Pavlin et al. Jun 2003 A1
20040167952 Gueron et al. Aug 2004 A1
20050008152 MacKenzie Jan 2005 A1
20050076024 Takatsuka et al. Apr 2005 A1
20050259817 Ramzan et al. Nov 2005 A1
20060008080 Higashi et al. Jan 2006 A1
20060008081 Higashi et al. Jan 2006 A1
20070053507 Smaragdis et al. Mar 2007 A1
20070095909 Chaum May 2007 A1
20070140479 Wang et al. Jun 2007 A1
20070143280 Wang et al. Jun 2007 A1
20090037504 Hussain Feb 2009 A1
20090083546 Staddon et al. Mar 2009 A1
20090193033 Ramzan et al. Jul 2009 A1
20090268908 Bikel et al. Oct 2009 A1
20090279694 Takahashi et al. Nov 2009 A1
20090287837 Felsher Nov 2009 A1
20100202606 Almeida Aug 2010 A1
20100205430 Chiou et al. Aug 2010 A1
20100241595 Felsher Sep 2010 A1
20110026781 Osadchy et al. Feb 2011 A1
20110107105 Hada May 2011 A1
20110110525 Gentry May 2011 A1
20110243320 Halevi et al. Oct 2011 A1
20110283099 Nath et al. Nov 2011 A1
20120039469 Mueller et al. Feb 2012 A1
20120054485 Tanaka et al. Mar 2012 A1
20120066510 Weinman Mar 2012 A1
20120201378 Nabeel et al. Aug 2012 A1
20120265794 Niel Oct 2012 A1
20120265797 Niel Oct 2012 A1
20130010950 Kerschbaum Jan 2013 A1
20130051551 El Aimani Feb 2013 A1
20130054665 Felch Feb 2013 A1
20130114811 Boufounos et al. May 2013 A1
20130148868 Troncoso Pastoriza et al. Jun 2013 A1
20130170640 Gentry Jul 2013 A1
20130191650 Balakrishnan Jul 2013 A1
20130195267 Alessio et al. Aug 2013 A1
20130198526 Goto Aug 2013 A1
20130216044 Gentry et al. Aug 2013 A1
20130230168 Takenouchi Sep 2013 A1
20130237242 Oka et al. Sep 2013 A1
20130246813 Mori et al. Sep 2013 A1
20130326224 Yavuz Dec 2013 A1
20130339722 Krendelev Dec 2013 A1
20130339751 Sun Dec 2013 A1
20130346741 Kim et al. Dec 2013 A1
20130346755 Nguyen et al. Dec 2013 A1
20140164758 Ramamurthy et al. Jun 2014 A1
20140189811 Taylor et al. Jul 2014 A1
20140233727 Rohloff et al. Aug 2014 A1
20140281511 Kaushik et al. Sep 2014 A1
20140355756 Iwamura Dec 2014 A1
20150100785 Joye et al. Apr 2015 A1
20150100794 Joye et al. Apr 2015 A1
20150205967 Naedele et al. Jul 2015 A1
20150215123 Kipnis et al. Jul 2015 A1
20150227930 Quigley et al. Aug 2015 A1
20150229480 Joye et al. Aug 2015 A1
20150244517 Nita Aug 2015 A1
20150248458 Sakamoto Sep 2015 A1
20150304736 Lal et al. Oct 2015 A1
20150358152 Ikarashi et al. Dec 2015 A1
20150358153 Gentry Dec 2015 A1
20160004874 Ioannidis et al. Jan 2016 A1
20160036826 Pogorelik et al. Feb 2016 A1
20160072623 Joye et al. Mar 2016 A1
20160105402 Soon-Shiong Apr 2016 A1
20160105414 Bringer et al. Apr 2016 A1
20160119346 Chen et al. Apr 2016 A1
20160140348 Nawaz et al. May 2016 A1
20160179945 Lastra Diaz et al. Jun 2016 A1
20160182222 Rane Jun 2016 A1
20160323098 Bathen Nov 2016 A1
20160335450 Yoshino Nov 2016 A1
20160344557 Chabanne et al. Nov 2016 A1
20160350648 Gilad-Bachrach Dec 2016 A1
20170070340 Hibshoosh et al. Mar 2017 A1
20170070351 Yan Mar 2017 A1
20170099133 Gu et al. Apr 2017 A1
20170134158 Pasol May 2017 A1
20170185776 Robinson et al. Jun 2017 A1
20170264426 Joye et al. Sep 2017 A1
20180091466 Friedman et al. Mar 2018 A1
20180139054 Chu May 2018 A1
20180198601 Laine et al. Jul 2018 A1
20180204284 Cerezo Sanchez Jul 2018 A1
20180212751 Williams et al. Jul 2018 A1
20180212753 Williams Jul 2018 A1
20180212754 Williams et al. Jul 2018 A1
20180212755 Williams et al. Jul 2018 A1
20180212756 Carr Jul 2018 A1
20180212757 Carr Jul 2018 A1
20180212758 Williams et al. Jul 2018 A1
20180212759 Williams et al. Jul 2018 A1
20180212775 Williams Jul 2018 A1
20180212933 Williams Jul 2018 A1
20180224882 Carr Aug 2018 A1
20180234254 Camenisch et al. Aug 2018 A1
20180267981 Sirdey Sep 2018 A1
20180270046 Carr Sep 2018 A1
20180276417 Cerezo Sanchez Sep 2018 A1
20180343109 Koseki et al. Nov 2018 A1
20180349632 Bent et al. Dec 2018 A1
20180359097 Lindell Dec 2018 A1
20180373882 Veugen Dec 2018 A1
20190013950 Becker et al. Jan 2019 A1
20190042786 Williams et al. Feb 2019 A1
20190108350 Bohli et al. Apr 2019 A1
20190158272 Chopra et al. May 2019 A1
20190229887 Ding et al. Jul 2019 A1
20190238311 Zheng Aug 2019 A1
20190251553 Ma et al. Aug 2019 A1
20190251554 Ma et al. Aug 2019 A1
20190253235 Zhang et al. Aug 2019 A1
20190260585 Kawai et al. Aug 2019 A1
20190280880 Zhang et al. Sep 2019 A1
20190312728 Poeppelmann Oct 2019 A1
20190327078 Zhang et al. Oct 2019 A1
20190334716 Kocsis et al. Oct 2019 A1
20190349191 Soriente et al. Nov 2019 A1
20190371106 Kaye Dec 2019 A1
20200134200 Williams et al. Apr 2020 A1
20200150930 Carr et al. May 2020 A1
20200204341 Williams et al. Jun 2020 A1
20200382274 Williams et al. Dec 2020 A1
20200396053 Williams et al. Dec 2020 A1
20210034765 Williams et al. Feb 2021 A1
20210105256 Williams Apr 2021 A1
Foreign Referenced Citations (11)
Number Date Country
2873186 Mar 2018 EP
5680007 Mar 2015 JP
101386294 Apr 2014 KR
WO2014105160 Jul 2014 WO
WO2015094261 Jun 2015 WO
WO2016003833 Jan 2016 WO
WO2016018502 Feb 2016 WO
WO2018091084 May 2018 WO
WO2018136801 Jul 2018 WO
WO2018136804 Jul 2018 WO
WO2018136811 Jul 2018 WO
Non-Patent Literature Citations (35)
Entry
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/014535, dated Apr. 19, 2018, 9 pages.
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/014530, dated Apr. 23, 2018, 7 pages.
“International Search Report” and “Written Opinion of the International Searching Authority,” Patent Cooperation Treaty Application No. PCT/US2018/014551, dated Apr. 24, 2018, 8 pages.
Petition to Institute Derivation Proceeding Pursuant to 35 USC 135; Case No. DER2019-00009, US Patent and Trademark Office Patent Trial and Appeal Board; Jul. 26, 2019, 272 pages. (2 PDFs).
SCAMP Working Paper L29/11, “A Woods Hole Proposal Using Striping,” Dec. 2011, 14 pages.
O'Hara, Michael James, “Shovel-ready Private Information Retrieval,” Dec. 2015, 4 pages.
Carr, Benjamin et al., “Proposed Laughing Owl,” NSA Technical Report, Jan. 5, 2016, 18 pages.
Carr, Benjamin et al., “A Private Stream Search Technique,” NSA Technical Report, Dec. 1, 2015, 18 pages.
Drucker et al., “Paillier-encrypted databases with fast aggregated queries,” 2017 14th IEEE Annual Consumer Communications & Networking Conference (CCNC), Jan. 8-11, 2017, pp. 848-853.
Tu et al., “Processing Analytical Queries over Encrypted Data,” Proceedings of the VLDB Endowment, vol. 6, Issue No. 5, Mar. 13, 2013 pp. 289-300.
Boneh et al., “Private Database Queries Using Somewhat Homomorphic Encryption”, Cryptology ePrint Archive: Report 2013/422, Stanford University [online], Jun. 27, 2013, [retrieved on Dec. 9, 2019], 22 pages.
Chen et al., “Efficient Multi-Key Homomorphic Encryption with Packed Ciphertexts with Application to Oblivious Neural Network Inference”, CCS '19 Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security, May 19, 2019. pp. 395-412.
Armknecht et al., “A Guide to Fully Homomorphic Encryption” IACR Cryptology ePrint Archive: Report 2015/1192 [online], Dec. 14, 2015, 35 pages.
Bayar et al., “A Deep Learning Approach to Universal Image Manipulation Detection Using a New Convolutional Layer”, IH&MMSec 2016, Jun. 20-22, 2016. pp. 5-10.
Juvekar et al. “GAZELLE: A Low Latency Framework for Secure Neural Network Inference”, 27th USENIX Security Symposium, Aug. 15-17, 2018. pp. 1650-1668.
Viejo et al., “Asymmetric homomorphisms for secure aggregation in heterogeneous scenarios,” Information Fusion 13, Elsevier B.V., Mar. 21, 2011, pp. 285-295.
Patil et al., “Big Data Privacy Using Fully Homomorphic Non-Deterministic Encryption,” IEEE 7th International Advance Computing Conference, Jan. 5-7, 2017, 15 pages.
Williams, Ellison Anne et al., “WIDESKIES: Scalable Private Information Retrieval,” Jun. 8, 2016, 14 pages.
Bösch et al.,“SOFIR: Securely Outsourced Forensic Recognition,” 2014 IEEE International Conference on Acoustic, Speech and Signal Processing (ICASSP), IEEE 978-1-4799-2893-4/14, 2014, pp. 2713-2717.
Waziri et al., “Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption,” World Academy of Science, Engineering and Technology International Journal of Computer, Electrical, Automation, Control and Information Engineering, vol. 9, No. 3, 2015, pp. 744-753.
Bajpai et al., “A Fully Homomorphic Encryption Implementation on Cloud Computing,” International Journal of Information & Computation Technology, ISSN 0974-2239 vol. 4, No. 8, 2014, pp. 811-816.
Panda et al., “FPGA Prototype of Low Latency BBS PRNG,” IEEE International Symposium on Nanoelectronic and Information Systems, Dec. 2015, pp. 118-123, 7 pages.
Sahu et al., “Implementation of Modular Multiplication for RSA Algorithm,” 2011 International Conference on Communication Systems and Network Technologies, 2011, pp. 112-114, 3 pages.
Drucker et al., “Achieving trustworthy Homomorphic Encryption by combining it with a Trusted Execution Environment,” Journal of Wireless Mobile Networks, Ubiquitous Computing, and Dependable Application (JoWUA), Mar. 2018, pp. 86-99.
Google Scholar, search results for “trusted execution environment database”, 2 pages, Aug. 1, 2020.
PIRK Code Excerpt—QuerierDriver, https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/querier/wideskies/QuerierDriver.java; Jul. 11, 2016; 5 pages.
PIRK Code Excerpt—QuerierDriverCLI, https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/querier/wideskies/QuerierCLI.java; Jul. 11, 2016; 9 pages.
PIRK Code Excerpt—Query; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/query/wideskies/Query.java>; Jul. 11, 2016; 7 pages.
PIRK Code Excerpt—Queryinfo; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/query/wideskies/QueryInfo.java>; Jul. 11, 2016; 4 pages.
PIRK Code Excerpt—ComputeResponse; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/responder/wideskies/spark/ComputeResponse.java> Jul. 11, 2016; 8 pages.
PIRK Code Excerpt—HashSelectorsAndPartitionData; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/responder/wideskies/spark/HashSelectorsAndPartitionData.java>; Jul. 11, 2016; 2 pages.
PIRK Code Excerpt—HashSelectorsAndFormPartitionsBigInteger; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/responder/wideskies/common/HashSelectorAndPartitionData.java>; Jul. 11, 2016; 3 pages.
PIRK Code Excerpt—QueryUtils; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/query/wideskies/QueryUtils.java>; Jul. 11, 2016; 8 pages.
PIRK Code Excerpt—QuerySchema; [online]; Retreived from the Internet: <URL: https://github.com/apache/incubator-retired-pirk/blob/master/src/main/java/org/apache/pirk/schema/query/QuerySchema.java>; Jul. 11, 2016; 3 pages.
“PIRK Proposal” Apache.org [online], [retreived on Oct. 28, 2020]; Retreived from the Internet: <URL:https://cwiki.apache.org/confluence/display/incubator/PirkProposal>; Apr. 10, 2019; 5 pages.
Related Publications (1)
Number Date Country
20180212752 A1 Jul 2018 US
Provisional Applications (13)
Number Date Country
62448890 Jan 2017 US
62448918 Jan 2017 US
62448893 Jan 2017 US
62448906 Jan 2017 US
62448908 Jan 2017 US
62448913 Jan 2017 US
62448916 Jan 2017 US
62448883 Jan 2017 US
62448885 Jan 2017 US
62448902 Jan 2017 US
62448896 Jan 2017 US
62448899 Jan 2017 US
62462818 Feb 2017 US