This patent application is related to U.S. patent application Ser. No. 12/359,344 filed concurrently herewith, and hereby incorporated by reference.
Discovery of sequential patterns is becoming increasingly useful and valuable in many scientific and commercial applications. Consider for example a Microsoft® Office command sequence. It is valuable information for Microsoft Corporation's developers and support personnel to know how the product is used, such as to know the answer to “What other features are used before or after feature X?” or “What do users do after they visit help?” or “Is feature X easy to find?” (which corresponds to knowing how many clicks are needed in order to execute command X).
However, there are vast numbers of such patterns in these and other scientific and commercial applications. The main challenge of pattern mining is how to automatically obtain meaningful patterns from very large sets of data.
This Summary is provided to introduce a selection of representative concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used in any way that would limit the scope of the claimed subject matter.
Briefly, various aspects of the subject matter described herein are directed towards a technology by which sequential data is processed into patterns, such as for use in analyzing program usage. In one aspect, sequential data may be first transformed by removing repeated data, grouping similar data into sub-sequences, and/or removing noisy data. One or more finite state machines may be used to perform the data transformation or transformations.
The sequential data, which may be the transformed data, is segmented into units. A pattern extraction mechanism extracts patterns from the units into a pattern set. This may be performed by calculating a stability score between succeeding units, such as a mutual information score, selecting the pair of units having the most stability (e.g., the highest mutual information score), and adding corresponding information for that pair into the pattern set. The pattern extraction is iteratively repeated until a stopping criterion is met, e.g., the pattern set reaches a defined size, the stability score falls below a pre-set threshold, and so forth.
Other advantages may become apparent from the following detailed description when taken in conjunction with the drawings.
The present invention is illustrated by way of example and not limited in the accompanying figures in which like reference numerals indicate similar elements and in which:
Various aspects of the technology described herein are generally directed towards an automatic pattern extraction mechanism and method which aim to identify stable usage patterns from sequential data, in which the patterns are straightforward to interpret. In one implementation, data transformation is performed with a finite state machine. An iterative algorithm extracts stable patterns in sequential data, and pattern extraction is performed on high-level units constructed from original sequences.
While Microsoft® Office commands and the like are used to exemplify command sequence analysis, it should be understood that any of the examples described herein are non-limiting examples. As such, the present invention is not limited to any particular embodiments, aspects, concepts, structures, functionalities or examples described herein. Rather, any of the embodiments, aspects, concepts, structures, functionalities or examples described herein are non-limiting, and the present invention may be used in various ways that provide benefits and advantages in computing and data processing in general.
As described below, in general, data transformation is performed with at least one transducer finite state machine (FSM). More particularly, data transformation to remove noise is performed before extracting patterns. Because a proper data transformation process involves some human knowledge with respect to different data sources, a transducer finite state machine is chosen.
By way of example, the raw command stream is noisy, in part because many commands that appear within top patterns are never actually used by the user. For example, when a user selects words in a document, a floating menu appears automatically for the selection, whereby two or three commands are recorded even though the user may not use any feature from the menu. Further, consider that navigation commands often appear in the same context, e.g., “Copy-PageUp-Paste,” “Copy-PageDown-Paste,” or “Copy-StartOfLine-Paste”. Also, some commands are often used repeatedly: Delete, DeleteBackward, PageUp/PageDown, Zoom, and so forth.
As another example, there is a group of commands used for relocating the position of the current cursor, referred to as “navigation commands.” Navigation commands are often used in the same context, and it is more useful to treat such variations as the same usage pattern.
Other commands, such as delete and navigation commands, are often used repeatedly. Patterns with the same commands repeated multiple times are in general useless for analysis purposes.
As represented in
To reduce repeated commands, it is noted that some commands are often used repeatedly. Mining patterns directly on raw sequences results in many patterns containing the same commands repeated different numbers of times; such patterns are normally useless. Therefore, repeated commands are collapsed into a new unit. To this end, suppose:
To group similar commands, it is noted that some commands, such as page-up and page-down, have similar usage behavior that normally does not need to be distinguished. Such commands are clustered as the same unit. Suppose:
To remove noisy commands, note that some commands are known to be automatically inserted, and they need to be filtered out. Suppose:
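By way of a non-limiting sketch only, the three transformations above (collapsing repeats, grouping similar commands, and removing noisy commands) might be applied in a single pass over the command stream; the command names, groupings, and the transform() helper below are illustrative assumptions rather than the actual transducer rules of any implementation described herein:

```python
# Illustrative sketch only: collapse repeated commands, group similar
# commands into one unit, and drop automatically inserted (noisy) commands.
# The specific command names and groupings are assumptions for the example.

NOISY = {"FloatingMenuShow", "FloatingMenuHide"}    # assumed auto-inserted commands
GROUPS = {                                          # assumed navigation grouping
    "PageUp": "Navigate", "PageDown": "Navigate",
    "StartOfLine": "Navigate", "EndOfLine": "Navigate",
}

def transform(commands):
    """Return a cleaned unit stream for downstream segmentation."""
    output = []
    for cmd in commands:
        if cmd in NOISY:                    # remove noisy commands
            continue
        unit = GROUPS.get(cmd, cmd)         # group similar commands
        if output and output[-1] == unit:   # collapse consecutive repeats
            continue
        output.append(unit)
    return output

# For example, ["Copy", "PageDown", "PageDown", "FloatingMenuShow", "Paste"]
# would become ["Copy", "Navigate", "Paste"].
```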
In general, sequence segmentation via the sequence segmenter 104 segments the original sequence (following data transformation) into high level units in order to group similar sub-sequences together. In this way, more meaningful patterns are found. In one implementation, the sequence segmenter 104 incorporates a finite state machine that identifies sub-sequences in the sequential data that have similar tree-like microstructures. In one implementation, the finite state machine operates by entering a path state when an input is a path command corresponding to a non-leaf node; the state machine remains in the path state until another command comprises an action command corresponding to a leaf node, whereby it outputs an action unit, or until the other command comprises a path command that is not the parent or the sibling of a next command, whereby a browsing unit is output. Additional details of one suitable sequence segmenter 104 are described in the aforementioned related U.S. patent application.
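As a rough, hedged illustration of such a segmenting state machine (the is_action() and parent() tree helpers below are assumed for the example and are not taken from the related application), the path/action/browsing behavior might be sketched as:

```python
# Hypothetical sketch of segmentation into "action" and "browsing" units.
# is_action(cmd) is assumed to report whether cmd is a leaf (action) node,
# and parent(cmd) is assumed to return its parent in the command tree.

def segment(commands, is_action, parent):
    units, path_buffer = [], []
    for i, cmd in enumerate(commands):
        if is_action(cmd):
            # A leaf (action) command closes the buffered path: action unit.
            units.append(("action", path_buffer + [cmd]))
            path_buffer = []
            continue
        path_buffer.append(cmd)
        nxt = commands[i + 1] if i + 1 < len(commands) else None
        # A path command that is neither the parent nor a sibling of the next
        # command ends the unit without an action: browsing unit.
        if nxt is not None and parent(nxt) != cmd and parent(nxt) != parent(cmd):
            units.append(("browsing", path_buffer))
            path_buffer = []
    if path_buffer:
        units.append(("browsing", path_buffer))
    return units
```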
The pattern extraction mechanism 106 iteratively extracts stable patterns. More particularly, for a sequential data set X={x1, x2 . . . , xt}, which is generated by a unit set S={s1, s2 . . . , sN}, the mechanism 106 finds sequences of units which tend to appear together. To this end, the pattern extraction mechanism 106 computes a measurement of the stability of a pair of units:
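Consistent with the weighted mutual information (WMI) described below, one way to write this stability measurement (a reconstruction for readability, not a quotation of the disclosed equation) is:

```latex
\mathrm{WMI}(s_i, s_j) \;=\; P(s_i, s_j)\,\log\frac{P(s_i, s_j)}{P(s_i)\,P(s_j)}
```

where P(s_i, s_j) is the probability that s_j immediately follows s_i in X, and P(s_i) and P(s_j) are the individual occurrence probabilities of the units.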
Once computed, the mechanism 106 iteratively identifies the patterns with the maximum WMI.
In one implementation, the stability between succeeding units is measured by weighted mutual information (WMI), as generally represented by step 406. In this process, the co-occurrence probability of two units is divided by their individual occurrence probabilities. The log of this quantity is the mutual information between them. The log term is then weighted by the co-occurrence probability.
As represented at step 406, two units with the maximum WMI are output as a pattern with a new unit identifier into a maintained pattern set. Via step 408, the process is carried out iteratively until a stop criterion is met, e.g., the pattern set reaches a pre-defined size, the stability score falls below a pre-set threshold, and so forth. With such iterations, rather long patterns can be identified, as in the examples below, in which parentheses mark the iterations for generating these patterns:
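To make the iteration concrete, a minimal sketch of the extraction loop follows (the stopping constants, the merge-and-rescan strategy, and the extract_patterns() name are assumptions for illustration, not necessarily the disclosed implementation):

```python
# Hypothetical sketch of iterative pattern extraction using weighted mutual
# information (WMI) over adjacent units; constants are illustrative only.
from collections import Counter
from math import log

def extract_patterns(sequence, max_patterns=50, min_wmi=1e-4):
    """Iteratively merge the most stable adjacent pair of units into a new
    unit (pattern) until a stopping criterion is met."""
    patterns = []
    seq = list(sequence)
    while len(patterns) < max_patterns and len(seq) > 1:
        total = len(seq)
        unit_counts = Counter(seq)
        pair_counts = Counter(zip(seq, seq[1:]))

        def wmi(pair):
            # Co-occurrence probability times the log of its ratio to the
            # product of the individual occurrence probabilities.
            p_ab = pair_counts[pair] / (total - 1)
            p_a = unit_counts[pair[0]] / total
            p_b = unit_counts[pair[1]] / total
            return p_ab * log(p_ab / (p_a * p_b))

        best = max(pair_counts, key=wmi)
        if wmi(best) < min_wmi:              # stability below threshold: stop
            break
        patterns.append(best)                # record the pair as a new unit
        # Rewrite the sequence, replacing each occurrence of the pair with
        # the new unit so that longer patterns can emerge in later rounds.
        merged, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == best:
                merged.append(best)
                i += 2
            else:
                merged.append(seq[i])
                i += 1
        seq = merged
    return patterns
```

Because each merged pair is itself a unit that may be merged again in a later round, the nested tuples this sketch produces naturally mirror the parenthesized iterations in such examples.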
The invention is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to: personal computers, server computers, hand-held or laptop devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, and so forth, which perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in local and/or remote computer storage media including memory storage devices.
With reference to
The computer 510 typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer 510 and includes both volatile and nonvolatile media, and removable and non-removable media. By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 510. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media. Combinations of any of the above may also be included within the scope of computer-readable media.
The system memory 530 includes computer storage media in the form of volatile and/or nonvolatile memory such as read only memory (ROM) 531 and random access memory (RAM) 532. A basic input/output system 533 (BIOS), containing the basic routines that help to transfer information between elements within computer 510, such as during start-up, is typically stored in ROM 531. RAM 532 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 520. By way of example, and not limitation,
The computer 510 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only,
The drives and their associated computer storage media, described above and illustrated in
The computer 510 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 580. The remote computer 580 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 510, although only a memory storage device 581 has been illustrated in
When used in a LAN networking environment, the computer 510 is connected to the LAN 571 through a network interface or adapter 570. When used in a WAN networking environment, the computer 510 typically includes a modem 572 or other means for establishing communications over the WAN 573, such as the Internet. The modem 572, which may be internal or external, may be connected to the system bus 521 via the user input interface 560 or other appropriate mechanism. A wireless networking component 574 such as comprising an interface and antenna may be coupled through a suitable device such as an access point or peer computer to a WAN or LAN. In a networked environment, program modules depicted relative to the computer 510, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation,
An auxiliary subsystem 599 (e.g., for auxiliary display of content) may be connected via the user interface 560 to allow data such as program content, system status and event notifications to be provided to the user, even if the main portions of the computer system are in a low power state. The auxiliary subsystem 599 may be connected to the modem 572 and/or network interface 570 to allow communication between these systems while the main processing unit 520 is in a low power state.
While the invention is susceptible to various modifications and alternative constructions, certain illustrated embodiments thereof are shown in the drawings and have been described above in detail. It should be understood, however, that there is no intention to limit the invention to the specific forms disclosed, but on the contrary, the intention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention.
Number | Name | Date | Kind |
---|---|---|---|
5983180 | Robinson | Nov 1999 | A |
6108714 | Kumagai et al. | Aug 2000 | A |
6473757 | Garofalakis | Oct 2002 | B1 |
6785663 | Wang et al. | Aug 2004 | B2 |
7188340 | Ostertag et al. | Mar 2007 | B2 |
7508985 | Van Lunteren | Mar 2009 | B2 |
7877401 | Hostetter et al. | Jan 2011 | B1 |
20030229471 | Guralnik et al. | Dec 2003 | A1 |
20040024773 | Stoffel et al. | Feb 2004 | A1 |
20040044528 | Chelba et al. | Mar 2004 | A1 |
20050005242 | Hoyle | Jan 2005 | A1 |
20050071465 | Zeng et al. | Mar 2005 | A1 |
20060279546 | Karmazyn | Dec 2006 | A1 |
20070113170 | Dignum et al. | May 2007 | A1 |
20090271720 | Deshpande et al. | Oct 2009 | A1 |
20100191693 | Su | Jul 2010 | A1 |
Number | Date | Country |
---|---|---|
2006076760 | Jul 2006 | WO |
Number | Date | Country
---|---|---
20100191753 A1 | Jul 2010 | US |