Descriptive statistics (e.g., mean, variance, skewness, kurtosis, etc.) may be represented as statistical moments in different degrees (e.g., variance may be represented as a second degree statistical moment, skewness as a third degree statistical moment, kurtosis as a fourth degree statistical moment, and so on). As descriptive statistics may highlight certain characteristics of a dataset, they have a wide variety of uses, including for machine learning, data mining, and data normalization.
Using conventional techniques, computing a higher-degree statistical moment (i.e., a statistical moment of the second degree or higher) of a dataset requires performing a first scan of the dataset, loading its data entries into memory one at a time to compute the mean, and then performing at least a second scan to calculate the desired statistical moment. To calculate some statistical moments, more than one additional scan of a dataset may be needed. The inventors have appreciated that employing such sequential techniques, which involve multiple dataset scans, can be impractical for several reasons. For example, scanning a very large dataset (e.g., one having hundreds of millions of records) multiple times expends unnecessary processor cycles.
Some embodiments of the invention apply algorithms enabling the calculation of one or more statistical moments in a single “pass” (i.e., scan) of a dataset. Using such algorithms, one or more statistical moments may be calculated for a dataset of any size, without the dataset having to be scanned multiple times. Some embodiments of the invention apply such algorithms to a dataset using a software framework known as the “map-reduce” framework. Generally, use of a map-reduce framework involves partitioning an input dataset into multiple shards, using a separate “map” process to apply a user-defined algorithm to each shard, and then using one or more “reduce” processes to consolidate the results generated by all map processes across all of the shards of the dataset. In some embodiments, each map process applies an algorithm enabling calculation of one or more statistical moments in a single scan of an input shard, and one or more reduce processes apply a recursive algorithm to calculate the statistical moments across the entire dataset. Similar techniques may be employed to compute a covariance between data elements expressed in a dataset of any size.
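As a minimal illustrative sketch (not necessarily the algorithm of any claimed embodiment), the following Python accumulates a mean and second statistical moment in a single scan of a dataset, using the well-known one-pass (Welford-style) update; the function name is illustrative:

```python
# Single-scan ("one pass") accumulation of the mean and the second
# statistical moment M2 = sum((x - mean)**2), per the well-known
# Welford update. No second scan of the data is needed.

def one_pass_mean_m2(data):
    n, mean, m2 = 0, 0.0, 0.0
    for x in data:
        n += 1
        delta = x - mean          # deviation from the old mean
        mean += delta / n         # update the running mean
        m2 += delta * (x - mean)  # uses the *updated* mean
    return n, mean, m2

n, mean, m2 = one_pass_mean_m2([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
variance = m2 / n  # population variance, obtained from a single scan
```

A summary of this form, (n, mean, M2), is also the kind of per-shard result a map process can emit for later consolidation.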
The foregoing is a non-limiting summary of the invention, some embodiments of which are defined by the attached claims.
The accompanying drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. In the drawings:
Some embodiments of the invention may employ one or more algorithms (e.g., recursive algorithms) enabling the calculation of one or more statistical moments in a single pass of a dataset. For example, some embodiments may apply recursive algorithms for calculating statistical moments to a dataset using a map-reduce framework, whereby an input dataset is partitioned into multiple shards, a separate map process is used to apply an algorithm enabling calculation of one or more statistical moments in a single scan to each shard, and one or more reduce processes consolidate the results generated by the map processes to calculate the one or more statistical moments across the entire dataset. In accordance with some embodiments of the invention, a map-reduce framework may be employed to apply algorithms enabling calculation of a covariance between data elements expressed in a dataset, instead of or in addition to statistical moments.
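A minimal in-process sketch of this map-reduce arrangement follows, assuming per-shard summaries of the form (n, mean, M2) and a pairwise combining rule in the reduce step; the function names and sharding scheme are illustrative, not those of any particular framework:

```python
from functools import reduce

def map_shard(shard):
    """'Map' step: summarize one shard as (n, mean, M2) in one scan of it."""
    n = len(shard)
    mean = sum(shard) / n
    m2 = sum((x - mean) ** 2 for x in shard)
    return n, mean, m2

def reduce_pair(a, b):
    """'Reduce' step: recursively combine two shard summaries."""
    (n1, mean1, m2_1), (n2, mean2, m2_2) = a, b
    n = n1 + n2
    delta = mean2 - mean1                        # difference of shard means
    mean = mean1 + delta * n2 / n
    m2 = m2_1 + m2_2 + delta ** 2 * n1 * n2 / n  # pairwise M2 update
    return n, mean, m2

data = list(range(20))
shards = [data[i:i + 5] for i in range(0, len(data), 5)]   # partition into 4 shards
n, mean, m2 = reduce(reduce_pair, map(map_shard, shards))  # map, then reduce
```

Because the reduce step operates only on the small per-shard summaries, no part of the raw dataset is ever scanned more than once.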
As noted above, the technique illustrated in
Some embodiments of the invention provide an alternative to the technique shown in
In the example depicted in
In some embodiments, each of map processes 215, 220 is executed by a different computer. However, it should be appreciated that embodiments of the invention are not limited to such an implementation, as processing may be performed in any suitable manner, using any suitable combination of processing resources. For example, a different processing node may execute each map process, and each processing node may reside on the same computer or a different computer than other processing nodes.
In the example shown, each map process calculates one or more statistical moments on its respective shard in a single pass. Specifically, in some embodiments, each of map processes 215 and 220 applies the following algorithm to compute p statistical moments for data elements x expressed in shard L:
Mp,L = Σx∈L (x − x̄L)^p

where x̄L is the mean of the data elements x in shard L.
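As an illustrative sketch of how the quantity Mp,L can be accumulated in a single scan of a shard (here for p up to 4), the following uses well-known one-pass update formulas for central moments; it is not necessarily the exact algorithm of the depicted embodiment:

```python
def one_pass_moments(shard):
    """Single scan of a shard; returns (n, mean, M2, M3, M4),
    where Mp = sum((x - mean)**p) over the shard."""
    n, mean, m2, m3, m4 = 0, 0.0, 0.0, 0.0, 0.0
    for x in shard:
        n1 = n
        n += 1
        delta = x - mean
        delta_n = delta / n
        delta_n2 = delta_n * delta_n
        term1 = delta * delta_n * n1
        mean += delta_n
        # M4 and M3 must be updated before M2 (they use the old M2, M3).
        m4 += (term1 * delta_n2 * (n * n - 3 * n + 3)
               + 6 * delta_n2 * m2 - 4 * delta_n * m3)
        m3 += term1 * delta_n * (n - 2) - 3 * delta_n * m2
        m2 += term1
    return n, mean, m2, m3, m4
```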
In the example of
Results 225, 230, which are generated by map processes 215, 220, respectively, are passed to reducer process 235, which applies a recursive algorithm to determine one or more statistical moments Mp,L across the entire dataset L. Specifically, reducer process 235 applies the following formula to compute statistical moment Mp,L:
In the formula above, p is the order of the statistical moment, L represents the dataset having two shards L1 and L2, Mp,L is the p-th statistical moment for dataset L, n is the number of records in dataset L, and δ2,1 is the difference in mean values between shards L2 and L1.
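The recursive combination can be sketched as follows, assuming each shard summary is a tuple (n, mean, M2, M3, M4); the M3 and M4 terms follow the well-known pairwise update formulas for central moments and are shown here for illustration only:

```python
def shard_summary(shard):
    """Direct summary (n, mean, M2, M3, M4) of one shard."""
    n = len(shard)
    mean = sum(shard) / n
    return (n, mean,
            sum((x - mean) ** 2 for x in shard),
            sum((x - mean) ** 3 for x in shard),
            sum((x - mean) ** 4 for x in shard))

def merge_moments(a, b):
    """Combine the summaries of shards L1 and L2 into the summary of L."""
    n1, mean1, m2_1, m3_1, m4_1 = a
    n2, mean2, m2_2, m3_2, m4_2 = b
    n = n1 + n2
    d = mean2 - mean1  # difference of the shard means (delta_{2,1})
    mean = mean1 + d * n2 / n
    m2 = m2_1 + m2_2 + d ** 2 * n1 * n2 / n
    m3 = (m3_1 + m3_2
          + d ** 3 * n1 * n2 * (n1 - n2) / n ** 2
          + 3.0 * d * (n1 * m2_2 - n2 * m2_1) / n)
    m4 = (m4_1 + m4_2
          + d ** 4 * n1 * n2 * (n1 * n1 - n1 * n2 + n2 * n2) / n ** 3
          + 6.0 * d ** 2 * (n1 * n1 * m2_2 + n2 * n2 * m2_1) / n ** 2
          + 4.0 * d * (n1 * m3_2 - n2 * m3_1) / n)
    return n, mean, m2, m3, m4
```

Combining the summaries of L1 and L2 this way yields the same (n, mean, M2, M3, M4) as a direct computation over the union of the shards, so the reduce step never needs to revisit the raw data.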
Using this formula, reducer process 235 calculates M2,L as follows:
Reducer process 235 calculates M3,L as follows:
Reducer process 235 calculates M4,L as follows:
It should be appreciated that although the example technique shown in
In some embodiments of the invention, each of map processes 315, 320 computes a local covariance C2,L as follows:
C2,L = Σ(u,v)∈L (u − ū)(v − v̄)

where ū and v̄ are the means of the u and v data items, respectively, in shard L.
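An illustrative single-scan accumulation of the local co-moment C2,L follows, using the well-known online covariance update; the function name is illustrative:

```python
def one_pass_comoment(pairs):
    """Single scan of (u, v) pairs; returns (n, mean_u, mean_v, C2),
    where C2 = sum((u - mean_u) * (v - mean_v))."""
    n, mean_u, mean_v, c2 = 0, 0.0, 0.0, 0.0
    for u, v in pairs:
        n += 1
        du = u - mean_u           # deviation from the *old* u mean
        mean_u += du / n
        mean_v += (v - mean_v) / n
        c2 += du * (v - mean_v)   # v deviation uses the *updated* v mean
    return n, mean_u, mean_v, c2
```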
Thus, in the example depicted in
It should be appreciated that although
Reducer process 335 then applies a recursive algorithm to determine the covariance between u and v across the dataset L. In some embodiments, reducer process 335 applies the following recursive algorithm to determine covariance between u and v across the entirety of dataset L:
In the formula above, C2,L is the covariance for dataset L, which is a set of pairs x = (u, v). δu,2,1 is the difference of the mean values of the u data item between sets L2 and L1, and δv,2,1 is the difference of the mean values of the v data item between sets L2 and L1.
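A sketch of the combining step, assuming each shard summary is the tuple (n, ū, v̄, C2); the cross term uses the differences of the per-shard means of u and v:

```python
def merge_comoments(a, b):
    """Combine per-shard summaries (n, mean_u, mean_v, C2) of L1 and L2."""
    n1, mu1, mv1, c1 = a
    n2, mu2, mv2, c2 = b
    n = n1 + n2
    du = mu2 - mu1  # difference of u means between L2 and L1
    dv = mv2 - mv1  # difference of v means between L2 and L1
    return (n,
            mu1 + du * n2 / n,
            mv1 + dv * n2 / n,
            c1 + c2 + du * dv * n1 * n2 / n)

def comoment_direct(pairs):
    """Direct (two-pass) reference summary, for comparison."""
    n = len(pairs)
    mu = sum(u for u, _ in pairs) / n
    mv = sum(v for _, v in pairs) / n
    return n, mu, mv, sum((u - mu) * (v - mv) for u, v in pairs)
```

Dividing the combined C2 by n gives the population covariance over the whole dataset without any additional scan of the data.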
As with the example technique described above with reference to
It should also be appreciated that although the example techniques described with reference to
It should further be appreciated that via the foregoing example techniques, embodiments of the present invention enable calculation of one or more statistical moments of a dataset, and/or of covariance between elements expressed in two or more datasets, in a single pass. As a result, embodiments of the invention may eliminate unnecessary processor cycles associated with scanning a dataset multiple times. In addition, embodiments of the invention enable parallelization of calculation operations, thereby removing limitations on the size of the dataset(s) on which the operations may be performed.
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The computing environment may execute computer-executable instructions, such as program modules. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
With reference to
Computer 410 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 410 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes both volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 410. Communication media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above should also be included within the scope of computer readable media.
The system memory 430 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 431 and random access memory (RAM) 432. A basic input/output system 433 (BIOS), containing the basic routines that help to transfer information between elements within computer 410, such as during start-up, is typically stored in ROM 431. RAM 432 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 420. By way of example, and not limitation,
The computer 410 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media discussed above and illustrated in
The computer 410 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 480. The remote computer 480 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 410, although only a memory storage device 481 has been illustrated in
When used in a LAN networking environment, the computer 410 is connected to the LAN 471 through a network interface or adapter 470. When used in a WAN networking environment, the computer 410 typically includes a modem 472 or other means for establishing communications over the WAN 473, such as the Internet. The modem 472, which may be internal or external, may be connected to the system bus 421 via the user input interface 460, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 410, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
Having thus described several aspects of at least one embodiment of this invention, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art.
Such alterations, modifications, and improvements are intended to be part of this disclosure, and are intended to be within the spirit and scope of the invention. Further, though advantages of the present invention are indicated, it should be appreciated that not every embodiment of the invention will include every described advantage. Some embodiments may not implement any features described as advantageous herein. Accordingly, the foregoing description and drawings are by way of example only.
The above-described embodiments of the present invention can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers. Such processors may be implemented as integrated circuits, with one or more processors in an integrated circuit component. Though, a processor may be implemented using circuitry in any suitable format.
Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.
Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in another audible format.
Such computers may be interconnected by one or more networks in any suitable form, including as a local area network or a wide area network, such as an enterprise network or the Internet. Such networks may be based on any suitable technology and may operate according to any suitable protocol and may include wireless networks, wired networks or fiber optic networks.
Also, the various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.
In this respect, the invention may be embodied as a computer readable storage medium (or multiple computer readable media) (e.g., a computer memory, one or more floppy discs, compact discs (CD), optical discs, digital video disks (DVD), magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above. As is apparent from the foregoing examples, a computer readable storage medium may retain information for a sufficient time to provide computer-executable instructions in a non-transitory form. Such a computer readable storage medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above. As used herein, the term “computer-readable storage medium” encompasses only a computer-readable medium that can be considered to be a manufacture (i.e., article of manufacture) or a machine. Alternatively or additionally, the invention may be embodied as a computer readable medium other than a computer-readable storage medium, such as a propagating signal.
The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of the present invention as discussed above. Additionally, it should be appreciated that according to one aspect of this embodiment, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.
Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.
Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that conveys relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.
Various aspects of the present invention may be used alone, in combination, or in a variety of arrangements not specifically discussed in the embodiments described in the foregoing; the invention is therefore not limited in its application to the details and arrangement of components set forth in the foregoing description or illustrated in the drawings. For example, aspects described in one embodiment may be combined in any manner with aspects described in other embodiments.
Also, the invention may be embodied as a method, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed; such terms are used merely as labels to distinguish one claim element having a certain name from another element having the same name (but for use of the ordinal term).
Also, the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.