More data is being received, processed, analyzed, and stored than ever before, because businesses recognize the value of this data for analyzing consumer spending behaviors, trends, and other information patterns that allow for increased sales, customer profiling, better service, risk analysis, and so on. However, due to the sheer volume of the information, mechanisms such as data mining have been devised that extract and analyze subsets of data from different perspectives in an attempt to summarize the data into useful information.
One function of data mining is the creation of a model. Models can be descriptive, helping to explain underlying processes or behavior, or predictive, estimating an unknown value from other known values. Using a combination of machine learning, statistical analysis, modeling techniques, and database technology, data mining finds patterns and subtle relationships in data and infers rules that allow the prediction of future results.
The process of data mining generally consists of initial exploration, model building or pattern identification, and deployment (the application of the model to new data in order to generate predictions). Exploration can start with data preparation, which may involve cleaning the data, applying data transformations, and selecting subsets of records. Model building and validation can involve considering various models and choosing the best one based on, for example, their predictive performance; this can involve an elaborate process of competitive evaluation of the models to find the best performer. Deployment involves applying the selected model to new data in order to generate predictions or estimates of the expected outcome.
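By way of a non-limiting illustration of this explore, build/validate, and deploy flow, the following Python sketch trains several candidate models, keeps the best performer on a validation set, and then applies it to new records. The synthetic data, the candidate algorithms, and the accuracy-based scoring are assumptions made for illustration only.

```python
# Illustrative sketch of the explore -> build/validate -> deploy flow described above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # prepared (cleaned/transformed) records
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # known outcomes used for training

X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.3, random_state=0)

# Competitive evaluation: train several candidate models and keep the best performer.
candidates = {"decision_tree": DecisionTreeClassifier(max_depth=4),
              "naive_bayes": GaussianNB()}
scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = accuracy_score(y_valid, model.predict(X_valid))

best_name = max(scores, key=scores.get)
best_model = candidates[best_name]

# Deployment: apply the selected model to new data to generate predictions.
X_new = rng.normal(size=(10, 5))
predictions = best_model.predict(X_new)
```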
Mining models are trained so that they remain viable as patterns in the data change. However, such mining models can quickly become outdated if they are not periodically updated to reflect changes in the behavior of the entities being modeled.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed innovation. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
The disclosed innovation allows for automatically keeping mining models up-to-date with respect to evolving source/training data. A typical scenario is where the user wants the model to be based on a moving window of data, for instance, the last three months of purchases.
Systems are also disclosed that support update training of models at times other than in realtime. Accordingly, periodic, incremental updates can be scheduled through this mechanism as well. The user can configure a refresh interval and other associated values through the training parameters for the mining structure and/or model. Training can also be triggered by other user-defined events such as database notifications and/or alerts from other operational systems. Once the mining structure and its contained models are initially processed, they are automatically reprocessed by the data mining engine.
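A minimal sketch of such non-realtime update training follows, assuming a hypothetical reprocess() routine standing in for the data mining engine and an in-memory event queue standing in for database notifications and operational alerts; the parameter names are illustrative and are not the disclosed schema.

```python
# Sketch only: periodic refresh plus event-triggered reprocessing of a mining structure.
import time
import queue
import threading
from dataclasses import dataclass, field

@dataclass
class TrainingParameters:
    refresh_interval_s: float = 3600.0   # user-configured refresh interval (seconds)
    triggers: set = field(default_factory=lambda: {"db_notification", "ops_alert"})

def reprocess(structure_name: str) -> None:
    # Placeholder for the data mining engine reprocessing the structure and its contained models.
    print(f"reprocessing mining structure {structure_name!r}")

def run_update_loop(params: TrainingParameters, events: "queue.Queue[str]",
                    stop: threading.Event) -> None:
    """Reprocess when the refresh interval elapses or a configured trigger event arrives."""
    last_run = time.monotonic()
    while not stop.is_set():
        try:
            event = events.get(timeout=1.0)   # e.g., database notification or operational alert
        except queue.Empty:
            event = None
        interval_elapsed = (time.monotonic() - last_run) >= params.refresh_interval_s
        if interval_elapsed or event in params.triggers:
            reprocess("PurchasesStructure")
            last_run = time.monotonic()
```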
The invention disclosed and claimed herein, in one aspect thereof, comprises a computer-implemented system for training of a data mining model. The system can include a data mining model component for training a data mining model on a dataset in realtime, and an update component for updating the data mining model according to predetermined criteria.
In another aspect thereof, the user can specify automatic model training information using a mining model definition language, both in XML DDL (data definition language, the Analysis Services scripting language) and in query language enhancements to the DMX language (Data Mining eXtensions to the SQL language).
In another aspect, the invention functions in conjunction with model versioning and version comparison to detect significant changes and retain updated models only if a threshold criterion is met.
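The sketch below illustrates one possible form of such threshold-gated retention, using a validation-accuracy delta as the change measure; the metric and threshold value are assumptions, since the disclosure does not fix a particular comparison criterion.

```python
# Illustrative sketch: retain an updated model only when the change threshold is met.
from dataclasses import dataclass

@dataclass
class ModelVersion:
    version: int
    accuracy: float   # measured on a common holdout set

def retain_updated_model(current: ModelVersion, updated: ModelVersion,
                         threshold: float = 0.01) -> ModelVersion:
    """Keep the updated model only if it changes the measured quality by at least
    `threshold`; otherwise keep the current version."""
    if abs(updated.accuracy - current.accuracy) >= threshold:
        return updated
    return current

# Example: a retrained model that barely differs is discarded in favor of the current version.
kept = retain_updated_model(ModelVersion(3, 0.842), ModelVersion(4, 0.845))
assert kept.version == 3
```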
In yet another aspect, the system utilizes a data mining engine and algorithm enhancements including incremental training and aging/weighting of training data (e.g., older data can be retained, but assigned less weight during the learning process).
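A brief sketch of age-based weighting follows, in which older records are retained but contribute less to the learner through per-sample weights; the exponential decay, the half-life value, and the scikit-learn estimator are illustrative choices rather than the disclosed algorithm enhancements.

```python
# Sketch of aging/weighting: older data is kept but assigned less weight during learning.
import numpy as np
from sklearn.linear_model import LogisticRegression

def age_weights(ages_in_days: np.ndarray, half_life_days: float = 30.0) -> np.ndarray:
    """Weight decays by half for every `half_life_days` of record age."""
    return 0.5 ** (ages_in_days / half_life_days)

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
y = (X[:, 0] > 0).astype(int)
ages = rng.uniform(0, 120, size=500)                # record age in days

model = LogisticRegression()
model.fit(X, y, sample_weight=age_weights(ages))    # older data contributes less
```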
Additionally, scenarios not addressed by existing products are enabled, providing product differentiation for SQL Server Data Mining in the data mining market.
In still another aspect thereof, a machine learning and reasoning component is provided that employs a probabilistic and/or statistical-based analysis to prognose or infer an action that a user desires to be automatically performed.
To the accomplishment of the foregoing and related ends, certain illustrative aspects of the disclosed innovation are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles disclosed herein can be employed and are intended to include all such aspects and their equivalents. Other advantages and novel features will become apparent from the following detailed description when considered in conjunction with the drawings.
The innovation is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the innovation can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate a description thereof.
As used in this application, the terms “component” and “system” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers.
The disclosed innovation allows data mining systems to automatically maintain up-to-date mining models in realtime with respect to evolving source and/or training data. A typical scenario is where the model is based on a moving window of data that includes the last three months of purchases, for example.
Additionally, scenarios are described wherein models do not need to be updated in a realtime fashion, such as for periodic, incremental updates scheduled for off-peak processing, for example. The system is suitably robust to provide for user-configuration of a refresh interval, for example, and other associated values/parameters via training parameters for the mining structure and/or model. Training can also be triggered by other user-defined events such as database notifications, or alerts from other operational systems.
Once the mining structure and its contained models are initially processed, they are automatically reprocessed by the data mining engine according to triggering events, predetermined criteria, and/or learned data, for example. These and other aspects are described in greater detail infra.
Referring initially to the drawings,
At 200, a data mining model is developed and trained on a dataset. At 202, an event is detected which triggers an automatic (and realtime) update process for updating the existing model. At 204, the model is updated.
At 406, the user can select an update shift (or stepping) parameter that defines how often the window should be moved (or stepped) forward. For example, if the user chooses a 3-month sliding window, the shift parameter can be set to one month; that is, the window is slid forward in 1-month increments once a month. At 408, once the settings are made, the sliding window algorithm can be initiated to facilitate the update process. As can be seen, the sliding window update process implements model updating on a regular basis regardless of whether the model needs updating at all; a more efficient approach is addressed below.
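The following sketch illustrates the 3-month window stepped forward in 1-month increments as described at 406-408; the use of python-dateutil and whole-month arithmetic are assumptions made for illustration.

```python
# Sketch of sliding-window selection: a 3-month window stepped forward one month at a time.
from datetime import date
from dateutil.relativedelta import relativedelta  # assumption: python-dateutil is available

def window_bounds(anchor: date, width_months: int = 3):
    """Return (start, end) of a window of `width_months` ending at `anchor`."""
    return anchor - relativedelta(months=width_months), anchor

def step_windows(start_anchor: date, steps: int, shift_months: int = 1, width_months: int = 3):
    """Yield successive window bounds, shifted forward by `shift_months` each step."""
    anchor = start_anchor
    for _ in range(steps):
        yield window_bounds(anchor, width_months)
        anchor = anchor + relativedelta(months=shift_months)

for start, end in step_windows(date(2024, 6, 1), steps=3):
    print(f"retrain on purchases from {start} to {end}")
```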
Referring now to
In support of managing and storing many different models 1008, the system 1000 can further include a model selection component 1010 that facilitates the selection of one or more of the models 1008 for analysis, processing, versioning, and updating, for example.
The system 1000 can also include a database server system 1012 which interfaces to the model repository 1006 to provide data 1014 against which the one or more models 1008 can be processed, and through which training data 1016 can be accessed.
The event detection component 302 can also process alerts and/or notifications from other systems as triggers to perform various functions. For example, an alert from a remote system (e.g., the database server 1012) can indicate that a sufficient amount of new data has arrived in the data 1014 to warrant a model training update process being performed. In another example, a remote system (not shown) is configured to transmit notifications that are processed as trigger events for performing one or more system functions (e.g., aging out data, weighting data, etc.).
The automatic adjustment component 1004 can be employed to make adjustments to system parameters based on, for example, the changing state of the underlying datasets, the training data, the accuracy of the existing model on data, and so on. Accordingly, algorithms can be designed and implemented that monitor functions and results of the system 1000, and based on predetermined adjustment criteria, alter settings, parameters, etc., accordingly to provide the desired outputs.
The learning and reasoning (LR) component 1002 can learn system behaviors and reason about what changes should be made. The subject invention (e.g., in connection with selection) can employ various LR-based schemes for carrying out various aspects thereof. For example, a process for determining when to perform a training model update can be facilitated via an automatic classifier system and process. Moreover, where the database server 1012 has data that is, for example, distributed over several locations, the classifier can be employed to determine which location will be selected for model processing.
A classifier is a function that maps an input attribute vector, x=(x1, x2, x3, x4, . . . , xn), to a class label class(x). The classifier can also output a confidence that the input belongs to a class, that is, f(x)=confidence(class(x)). Such classification can employ a probabilistic and/or other statistical analysis (e.g., one factoring into the analysis utilities and costs to maximize the expected value to one or more people) to prognose or infer an action that a user desires to be automatically performed. In the case of data systems, for example, attributes can be words or phrases, or other data-specific attributes derived from the words (e.g., database tables, the presence of key terms), and the classes are categories or areas of interest (e.g., levels of priorities).
A support vector machine (SVM) is an example of a classifier that can be employed. The SVM operates by finding a hypersurface in the space of possible inputs that splits the triggering input events from the non-triggering events in an optimal way. Intuitively, this makes the classification correct for testing data that is near to, but not identical to, the training data. Other directed and undirected model classification approaches that can be employed include, for example, naive Bayes, Bayesian networks, decision trees, neural networks, fuzzy logic models, and probabilistic classification models providing different patterns of independence. Classification as used herein is also inclusive of statistical regression that is utilized to develop models of ranking or priority.
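The sketch below illustrates both notions above: a classifier whose output pairs a class label with a confidence, f(x)=confidence(class(x)), realized here with a probability-capable SVM that separates triggering from non-triggering input events. The two event features and the synthetic training labels are assumptions made for illustration.

```python
# Illustrative sketch: an SVM separating triggering from non-triggering events and
# reporting f(x) = confidence(class(x)). The features and labels below are assumptions.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
features = rng.uniform(0.0, 1.0, size=(200, 2))          # x = (new-data volume, time since update)
is_trigger = (features.sum(axis=1) > 1.0).astype(int)    # 1 = event should trigger retraining

svm = SVC(kernel="rbf", probability=True).fit(features, is_trigger)

def classify_with_confidence(x: np.ndarray):
    """Return (class(x), confidence(class(x))) for a single attribute vector x."""
    probs = svm.predict_proba(x.reshape(1, -1))[0]
    label = int(np.argmax(probs))
    return label, float(probs[label])

label, confidence = classify_with_confidence(np.array([0.7, 0.6]))
print(f"trigger={bool(label)} with confidence {confidence:.2f}")
```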
As will be readily appreciated from the subject specification, the subject invention can employ classifiers that are explicitly trained (e.g., via generic training data) as well as implicitly trained (e.g., via observing user behavior, receiving extrinsic information). For example, SVMs are configured via a learning or training phase within a classifier constructor and feature selection module. Thus, the classifier(s) can be employed to automatically learn and perform a number of functions according to predetermined criteria.
In one example, the LR component 1002 can monitor mining results associated with a sliding window with respect to the quality of the mining model being generated therefrom and/or the amount of change computed between models. For example, consider a trained mining model that is applied against data extracted in a 5-month wide sliding window, which is being moved every two weeks. Based on a qualitative description parameter that is a measure of how well the model describes the data, or a prediction parameter that provides some measure of how well the trained model predicts data patterns or behavior, the LR component can learn and reason to make adjustments to sliding window parameters accordingly. For example, if the description measure falls below a predetermined level, the LR component can control the automatic adjustment component 1004 to reduce the window width to four months in an attempt to improve the measure. Once the measure improves, the LR component 1002 can signal the adjustment component 1004 to continue at the present settings or even to relax back to the 5-month wide window.
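One way such an adjustment policy could look is sketched below: the window width is narrowed when a description-quality measure falls below a floor and relaxed back once the measure recovers. The 4- and 5-month widths follow the example above, while the floor and hysteresis margin are assumptions for illustration.

```python
# Sketch of a window-width adjustment policy driven by a description-quality measure.
def adjust_window_months(current_width: int, description_measure: float,
                         floor: float = 0.8, margin: float = 0.05,
                         narrow: int = 4, wide: int = 5) -> int:
    if description_measure < floor:
        return narrow                      # tighten the window to try to improve the measure
    if description_measure >= floor + margin:
        return wide                        # quality recovered; relax back to the wider window
    return current_width                   # otherwise continue at the present setting

width = 5
for measure in (0.82, 0.76, 0.79, 0.87):
    width = adjust_window_months(width, measure)
    print(f"description measure {measure:.2f} -> window width {width} months")
```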
Similarly, the LR component can learn and reason to adjust the stepping time from two weeks to another value, for example, three weeks, based on descriptive and/or predictive qualities.
In another example, the LR component 1002 can perform basic analysis on the data or be made aware of the type of data being modeled, which type information can change the behavior in operation of the system 1000. For example, if the data being analyzed is medical information, the degree of accuracy required can be much higher than if the analysis were based on customer shopping behavior or patterns. The LR component 1002 can detect this and make adjustments through the automatic adjustment component 1004 accordingly.
In yet another example, the LR component 1002 can learn that a first model performs better than another model even though the underperforming model is the most recently trained version. Accordingly, the first model can be retained until a better model has been created, tested, and trained for implementation.
The LR component can also learn and reason that the training data 1016 employed is negatively affecting the quality of the models being used, and thus cause a new set of training data to be generated, tested, and employed for model training.
In yet another example, the LR component 1002 can learn and reason that system notifications and/or alerts are normally associated with certain types or versions of models, which can then be automatically implemented based on the next received alert or notification.
As indicated by example, the potential benefits obtained by the LR component 1002 are numerous, and the examples presented herein are not to be construed as limiting in any way. Other implementations can employ the LR component 1002 to facilitate processing of aging data, for example, such that aged data is treated differently than more recent data.
The subject invention finds application to data mining extensions (DMX) and data definition language (DDL) enhancements to allow specification of the parameters for automatic processing. DMX is a query language for data mining models, much like SQL (structured query language) is a query language for relational databases and MDX is a query language for OLAP databases. DMX is composed of DDL statements, data manipulation language (DML) statements, and functions and operators. The DDL part of DMX includes DDL statements which can be used to create, process, delete, copy, browse, and predict against data mining models, for example, create new data mining models and mining structures (via
DML statements can be used to train mining models (via
Accordingly, the user can specify automatic model training information using a mining model definition language, both in XML (extensible markup language) DDL (data definition language, the Analysis Services scripting language) and in query language enhancements to the DMX language (Data Mining eXtensions to the SQL language).
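For illustration, the Python sketch below issues DMX statements of the kind discussed. The CREATE MINING MODEL and INSERT INTO ... OPENQUERY forms follow standard DMX; the commented AUTO_REFRESH_INTERVAL clause is purely hypothetical, showing only where automatic-training parameters could be attached, and the execute_dmx() helper is an assumed placeholder rather than an actual client API.

```python
# Sketch only: DMX statements carried as Python strings; execute_dmx() is a placeholder.
CREATE_MODEL_DMX = """
CREATE MINING MODEL [PurchasePredictor]
(
    [Customer Key] LONG KEY,
    [Age]          LONG CONTINUOUS,
    [Region]       TEXT DISCRETE,
    [Buyer]        LONG DISCRETE PREDICT
)
USING Microsoft_Decision_Trees
-- Hypothetical extension point for the disclosed automatic training parameters:
-- WITH AUTO_REFRESH_INTERVAL = '1 MONTH'
"""

TRAIN_MODEL_DMX = """
INSERT INTO [PurchasePredictor]
    ([Customer Key], [Age], [Region], [Buyer])
OPENQUERY([PurchasesDataSource],
    'SELECT CustomerKey, Age, Region, Buyer FROM vTrainingCases')
"""

def execute_dmx(statement: str) -> None:
    # Placeholder for a call into an Analysis Services client library.
    print(statement.strip().splitlines()[0], "...")

execute_dmx(CREATE_MODEL_DMX)
execute_dmx(TRAIN_MODEL_DMX)
```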
Referring now to
Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the inventive methods can be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, minicomputers, mainframe computers, as well as personal computers, hand-held computing devices, microprocessor-based or programmable consumer electronics, and the like, each of which can be operatively coupled to one or more associated devices.
The illustrated aspects of the innovation may also be practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.
A computer typically includes a variety of computer-readable media. Computer-readable media can be any available media that can be accessed by the computer and includes both volatile and non-volatile media, removable and non-removable media. By way of example, and not limitation, computer-readable media can comprise computer storage media and communication media. Computer storage media includes both volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital video disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer.
With reference again to
The system bus 1208 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. The system memory 1206 includes read-only memory (ROM) 1210 and random access memory (RAM) 1212. A basic input/output system (BIOS) is stored in a non-volatile memory 1210 such as ROM, EPROM, EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 1202, such as during start-up. The RAM 1212 can also include a high-speed RAM such as static RAM for caching data.
The computer 1202 further includes an internal hard disk drive (HDD) 1214 (e.g., EIDE, SATA), which internal hard disk drive 1214 may also be configured for external use in a suitable chassis (not shown), a magnetic floppy disk drive (FDD) 1216 (e.g., to read from or write to a removable diskette 1218) and an optical disk drive 1220 (e.g., to read a CD-ROM disk 1222, or to read from or write to other high capacity optical media such as a DVD). The hard disk drive 1214, magnetic disk drive 1216 and optical disk drive 1220 can be connected to the system bus 1208 by a hard disk drive interface 1224, a magnetic disk drive interface 1226 and an optical drive interface 1228, respectively. The interface 1224 for external drive implementations includes at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies. Other external drive connection technologies are within contemplation of the subject innovation.
The drives and their associated computer-readable media provide nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For the computer 1202, the drives and media accommodate the storage of any data in a suitable digital format. Although the description of computer-readable media above refers to a HDD, a removable magnetic diskette, and removable optical media such as a CD or DVD, it should be appreciated by those skilled in the art that other types of media which are readable by a computer, such as zip drives, magnetic cassettes, flash memory cards, cartridges, and the like, may also be used in the exemplary operating environment, and further, that any such media may contain computer-executable instructions for performing the methods of the disclosed innovation.
A number of program modules can be stored in the drives and RAM 1212, including an operating system 1230, one or more application programs 1232, other program modules 1234 and program data 1236. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 1212. It is to be appreciated that the innovation can be implemented with various commercially available operating systems or combinations of operating systems.
A user can enter commands and information into the computer 1202 through one or more wired/wireless input devices, e.g., a keyboard 1238 and a pointing device, such as a mouse 1240. Other input devices (not shown) may include a microphone, an IR remote control, a joystick, a game pad, a stylus pen, touch screen, or the like. These and other input devices are often connected to the processing unit 1204 through an input device interface 1242 that is coupled to the system bus 1208, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, etc.
A monitor 1244 or other type of display device is also connected to the system bus 1208 via an interface, such as a video adapter 1246. In addition to the monitor 1244, a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
The computer 1202 may operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 1248. The remote computer(s) 1248 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1202, although, for purposes of brevity, only a memory/storage device 1250 is illustrated. The logical connections depicted include wired/wireless connectivity to a local area network (LAN) 1252 and/or larger networks, e.g., a wide area network (WAN) 1254. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, e.g., the Internet.
When used in a LAN networking environment, the computer 1202 is connected to the local network 1252 through a wired and/or wireless communication network interface or adapter 1256. The adapter 1256 may facilitate wired or wireless communication to the LAN 1252, which may also include a wireless access point disposed thereon for communicating with the wireless adapter 1256.
When used in a WAN networking environment, the computer 1202 can include a modem 1258, or is connected to a communications server on the WAN 1254, or has other means for establishing communications over the WAN 1254, such as by way of the Internet. The modem 1258, which can be internal or external and a wired or wireless device, is connected to the system bus 1208 via the serial port interface 1242. In a networked environment, program modules depicted relative to the computer 1202, or portions thereof, can be stored in the remote memory/storage device 1250. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
The computer 1202 is operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
Wi-Fi, or Wireless Fidelity, allows connection to the Internet from a couch at home, a bed in a hotel room, or a conference room at work, without wires. Wi-Fi is a wireless technology similar to that used in a cell phone that enables such devices, e.g., computers, to send and receive data indoors and out; anywhere within the range of a base station. Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wired networks (which use IEEE 802.3 or Ethernet).
Wi-Fi networks can operate in the unlicensed 2.4 and 5 GHz radio bands. IEEE 802.11 applies generally to wireless LANs and provides 1 or 2 Mbps transmission in the 2.4 GHz band using either frequency hopping spread spectrum (FHSS) or direct sequence spread spectrum (DSSS). IEEE 802.11a is an extension to IEEE 802.11 that applies to wireless LANs and provides up to 54 Mbps in the 5 GHz band. IEEE 802.11a uses an orthogonal frequency division multiplexing (OFDM) encoding scheme rather than FHSS or DSSS. IEEE 802.11b (also referred to as 802.11 High Rate DSSS or Wi-Fi) is an extension to 802.11 that applies to wireless LANs and provides 11 Mbps transmission (with a fallback to 5.5, 2 and 1 Mbps) in the 2.4 GHz band. IEEE 802.11g applies to wireless LANs and provides 20+ Mbps in the 2.4 GHz band. Products can contain more than one band (e.g., dual band), so the networks can provide real-world performance similar to the basic 10BaseT wired Ethernet networks used in many offices.
Referring now to
The system 1300 also includes one or more server(s) 1304. The server(s) 1304 can also be hardware and/or software (e.g., threads, processes, computing devices). The servers 1304 can house threads to perform transformations by employing the invention, for example. One possible communication between a client 1302 and a server 1304 can be in the form of a data packet adapted to be transmitted between two or more computer processes. The data packet may include a cookie and/or associated contextual information, for example. The system 1300 includes a communication framework 1306 (e.g., a global communication network such as the Internet) that can be employed to facilitate communications between the client(s) 1302 and the server(s) 1304.
Communications can be facilitated via a wired (including optical fiber) and/or wireless technology. The client(s) 1302 are operatively connected to one or more client data store(s) 1308 that can be employed to store information local to the client(s) 1302 (e.g., cookie(s) and/or associated contextual information). Similarly, the server(s) 1304 are operatively connected to one or more server data store(s) 1310 that can be employed to store information local to the servers 1304.
What has been described above includes examples of the disclosed innovation. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the innovation is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term “includes” is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term “comprising” as “comprising” is interpreted when employed as a transitional word in a claim.