The present invention relates generally to computing system evaluation and, more particularly, to techniques for quantitatively measuring and benchmarking complexity in information technology management.
The complexity of managing computing systems and information technology (IT) processes represents a major impediment to efficient, high-quality, error-free, and cost-effective service delivery, ranging from small-business servers to global-scale enterprise backbones. IT systems and processes with a high degree of complexity demand human resources and expertise to manage that complexity, increasing the total cost of ownership. Likewise, complexity increases the amount of time that must be spent interacting with a computing system, or between operators, to perform the desired function, decreasing efficiency and productivity. Furthermore, complexity challenges human reasoning and thus results in erroneous decisions even by skilled operators.
Due to the high level of complexity incurred in service delivery processes, service providers are actively seeking to reduce IT complexity by designing, architecting, implementing, and assembling systems and processes with a minimal complexity level. In order to do so, they must be able to quantitatively measure and benchmark the degree of IT management complexity exposed by particular computing systems or processes, so that global delivery executives, program managers, and project leaders can evaluate the prospective complexity before investing in them, and designers, architects, and developers can rebuild and optimize them for reduced complexity. Besides improving decision making for projects and technologies, quantitative complexity evaluation can help computing service providers and outsourcers quantify the amount of human management that will be needed to provide a given service, allowing them to more effectively evaluate costs and set price points. All these scenarios require standardized, representative, accurate, easily compared quantitative assessments of IT management complexity, with metrics mapped to human-perceived complexity such as labor cost, efficiency, and error rate. This motivates the need for a system and methods for calibrating and extrapolating complexity metrics of information technology management.
The prior art of computing system evaluation includes no system or methods for calibrating and extrapolating complexity metrics of information technology management. Well-studied computing system evaluation areas include system performance analysis, software complexity analysis, human-computer interaction analysis, dependability evaluation, and basic complexity evaluation.
System performance analysis attempts to compute quantitative measures of the performance of a computer system, considering both hardware and software components. This is a well-established area rich in analysis techniques and systems. However, none of these methodologies and systems for system performance analysis considers complexity-related aspects of the system under evaluation, nor do they collect or analyze complexity-related data. Therefore, system performance analysis provides no insight into the complexity of the IT management being evaluated.
Software complexity analysis attempts to compute quantitative measures of the complexity of a piece of software code, considering both the intrinsic complexity of the code and the complexity of creating and maintaining the code. However, processes for software complexity analysis do not collect management-related statistics or data and therefore provide no insight into the management complexity of the computing systems and processes running the analyzed software.
Human-computer interaction (HCI) analysis attempts to identify interaction problems between human users and computer systems, typically focusing on identifying confusing, error-prone, or inefficient interaction patterns. However, HCI analysis focuses on detecting problems in human-computer interaction rather than performing an objective, quantitative complexity analysis of that interaction. HCI analysis methods are not designed specifically for measuring management complexity, and typically do not operate on management-related data. In particular, HCI analysis collects human performance data from costly observations of many human users, and does not collect and use management-related data directly from a system under test. Additionally, HCI analysis typically produces qualitative results suggesting areas for improvement of a particular user interface or interaction pattern; it does not produce quantitative results that evaluate the overall complexity of managing a system, independent of the particular user interface experience. The Model Human Processor approach to HCI analysis does provide objective, quantitative results; however, these results quantify interaction time for motor-function tasks like moving a mouse or clicking an on-screen button, and thus provide no insight into the complexity of managing computing systems and services.
Dependability evaluation combines aspects of objective, reproducible performance benchmarking with HCI analysis techniques, focusing on configuration-related problems; see, e.g., Brown et al., “Experience with Evaluating Human-Assisted Recovery Processes,” Proceedings of the 2004 International Conference on Dependable Systems and Networks, Los Alamitos, Calif., IEEE, 2004. This approach includes a system for measuring configuration quality as performed by human users, but it does not measure configuration complexity, nor does it provide reproducible or objective measures.
Basic complexity evaluation quantitatively evaluates the complexity of computing system configuration; see, e.g., Brown et al., “System and methods for quantitatively evaluating complexity of computing system configuration,” Ser. No. 11/205,972, filed on Aug. 17, 2005, and Brown et al., “System and methods for integrating authoring with complexity analysis for computing system operation procedures.” However, these approaches do not provide metrics calibration that maps configuration-related data taken directly from a system under test to human-perceived complexity measures such as labor cost, efficiency, and error rate.
The invention broadly and generally provides a method for calibrating the relationship between management-inherent complexity metrics, deriving from the management structure, and human-perceived complexity of information technology management, comprising: (a) obtaining a set of management-inherent complexity metrics; (b) obtaining a set of human-perceived complexity metrics; (c) constructing a control model identifying a set of dominant indicators selected from the aforesaid set of management-inherent complexity metrics; and (d) establishing a value model mapping from the aforesaid set of dominant indicators to the aforesaid set of human-perceived complexity metrics.
The method may further comprise evaluating and validating the aforesaid control model and the aforesaid value model for quality assessment. This step may be repeated.
In some embodiments, the aforesaid set of management-inherent complexity metrics comprises at least one of: (a) execution complexity metrics; (b) parameter complexity metrics; and (c) memory complexity metrics.
In some embodiments, the aforesaid value model may be constructed using a statistical approach, such as linear regression.
In some embodiments, the aforesaid value model is constructed using machine learning, for example an artificial neural network. This artificial neural network may be a radial basis function network.
Advantageously, the aforesaid steps of obtaining complexity metrics may comprise at least one of: (a) obtaining management-inherent complexity metrics from a complexity analysis; and (b) acquiring human-perceived complexity metrics through controlled user studies.
The aforesaid step of constructing a control model may comprise at least one of: (a) obtaining a subset of management-inherent complexity metrics as a set of dominant indicators under study; (b) constructing a value model from the aforesaid set of dominant indicators and the aforesaid set of human-perceived complexity metrics based on a set of information technology management data; and (c) evaluating the quality of the aforesaid value model based on a different set of information technology management data.
The method may further comprise selecting a different subset of management-inherent complexity metrics as the aforesaid set of dominant indicators under study. This step may be repeated until no better set of dominant indicators is found.
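By way of illustration only, the control-model construction just described can be sketched as a search over candidate subsets of management-inherent metrics: fit a value model on one set of IT management data, score it on a different set, and keep the best-scoring subset as the dominant indicators. The sketch below assumes a linear value model and hypothetical data arrays; it is one possible realization, not the claimed implementation.

```python
import numpy as np
from itertools import combinations

def fit_value_model(X, y):
    """Fit a linear value model y ~ b0 + X @ b by least squares."""
    A = np.column_stack([np.ones(len(X)), X])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def model_error(coeffs, X, y):
    """Mean squared prediction error of a value model on held-out data."""
    A = np.column_stack([np.ones(len(X)), X])
    return np.mean((A @ coeffs - y) ** 2)

def select_dominant_indicators(X_cal, y_cal, X_val, y_val, k):
    """Evaluate every size-k subset of metrics; the subset whose value
    model generalizes best to a different data set wins."""
    best_subset, best_err = None, np.inf
    for subset in combinations(range(X_cal.shape[1]), k):
        coeffs = fit_value_model(X_cal[:, subset], y_cal)
        err = model_error(coeffs, X_val[:, subset], y_val)
        if err < best_err:
            best_subset, best_err = subset, err
    return best_subset, best_err
```

Growing k, or otherwise varying the candidate subsets, until the held-out error stops improving corresponds to repeating the step until no better set of dominant indicators is found.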
The invention further broadly and generally provides a method for extrapolating from management-inherent complexity metrics to human-perceived complexity of information technology management, the aforesaid method comprising: (a) collecting a set of management-inherent complexity metrics; (b) obtaining a value model; and (c) predicting human-perceived complexity based on the aforesaid set of management-inherent complexity metrics and the aforesaid value model.
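A minimal sketch of this extrapolation, assuming a previously calibrated linear value model; the coefficient values and metric names below are hypothetical placeholders, not calibrated results.

```python
import numpy as np

# Hypothetical coefficients [b0, b1, b2] from a prior calibration phase.
value_model = np.array([2.0, 1.8, 3.5])

def extrapolate(inherent_metrics):
    """Predict a human-perceived complexity metric (e.g., execution time)
    from management-inherent metrics, with no new user study required."""
    return value_model[0] + inherent_metrics @ value_model[1:]

# Metrics collected from the system under test: [nActions, nCtxSw].
print(extrapolate(np.array([15.0, 4.0])))  # predicted execution time
```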
The invention further broadly and generally provides a program storage device readable by a digital processing apparatus and having a program of instructions which are tangibly embodied on the storage device and which are executable by the processing apparatus to perform a method for calibrating the relationship between management-inherent complexity metrics, deriving from the management structure, and human-perceived complexity of information technology management, the aforesaid method comprising: (a) obtaining a set of management-inherent complexity metrics; (b) obtaining a set of human-perceived complexity metrics; (c) constructing a control model identifying a set of dominant indicators selected from the aforesaid set of management-inherent complexity metrics; and (d) establishing a value model mapping from the aforesaid set of dominant indicators to the aforesaid set of human-perceived complexity metrics.
Exemplary embodiments of the invention as described herein generally include a system and methods for calibrating and extrapolating complexity metrics of information technology management.
For illustrative purposes, exemplary embodiments of the invention will be described with specific reference, if needed, to calibrating and extrapolating complexity metrics of information technology management of a configuration procedure, wherein the management-inherent complexity metrics deriving from the management structure comprise one or more execution complexity metrics, parameter complexity metrics, and/or memory complexity metrics, and the human-perceived complexity metrics comprise one or more cost metrics, efficiency metrics, and/or quality metrics. It is to be understood, however, that the present invention is not limited to any particular kind of information technology management. Rather, the invention is more generally applicable to any information technology management in which it would be desirable to conduct complexity model calibration and extrapolation.
It is to be understood that the system and methods described herein in accordance with the present invention may be implemented in various forms of hardware, software, firmware, special purpose processors, or a combination thereof. Preferably, the present invention is implemented in software comprising program instructions that are tangibly embodied on one or more program storage devices (e.g., hard disk, magnetic floppy disk, RAM, CD-ROM, DVD, ROM, and flash memory), and executable by any device or machine comprising a suitable architecture.
It is to be further understood that because the constituent system modules and method steps depicted in the accompanying Figures can be implemented in software, the actual connections between the system components (or the flow of the process steps) may differ depending upon the manner in which the application is programmed. Given the teachings herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the present invention.
An exemplary embodiment of the present invention begins by obtaining (or collecting) a set of human-perceived complexity metrics (110) from the system administrator (103), through user studies, for example, and obtaining a set of management-inherent complexity metrics (111) from the quantified results of the complexity evaluation (108). Thereafter, the calibration analysis (112) is conducted to generate calibration models (113), which quantify the relationship between management-inherent complexity metrics and the human-perceived complexity of the configuration procedure.
A different data processing system (120) collects and evaluates configuration-related data utilizing techniques taught in U.S. patent application Ser. No. 11/205,972, filed on Aug. 17, 2005. Without collecting a new set of human-perceived complexity metrics from the system administrator through user studies (which can be costly or even infeasible), the present invention conducts extrapolation analysis (132), based on the set of management-inherent complexity metrics (131) from the data processing system (120) and the calibration models (113) from the calibration analysis (112), to generate the human-perceived complexity metrics (133).
The value model can be constructed using statistical approaches or machine learning approaches. For example, a linear regression model can be constructed as
ET = b0 + b1*nActions + b2*nCtxSw
where the model inputs include explanatory variables such as the number of actions (nActions) and the number of context switches (nCtxSw), and the model output is the execution time (ET). The model coefficients b0, b1, and b2 can be obtained using a least-squares approach.
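For concreteness, the least-squares fit can be carried out as below. This is a minimal sketch; the calibration data shown are hypothetical, standing in for measurements gathered through user studies.

```python
import numpy as np

# Hypothetical calibration data: one row per observed management task,
# columns are the dominant indicators [nActions, nCtxSw].
X = np.array([[4, 1], [9, 3], [12, 2], [20, 7], [25, 9]], dtype=float)
ET = np.array([11.0, 24.0, 27.0, 55.0, 68.0])  # observed execution times

# Prepend a column of ones so the intercept b0 is fitted with b1 and b2.
A = np.column_stack([np.ones(len(X)), X])

# Ordinary least squares: minimize ||A @ b - ET||^2 over b = [b0, b1, b2].
(b0, b1, b2), *_ = np.linalg.lstsq(A, ET, rcond=None)
print(f"ET = {b0:.2f} + {b1:.2f}*nActions + {b2:.2f}*nCtxSw")
```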
Alternatively, a type of neural network called a radial basis function (RBF) network can be constructed as
ET = RBF(nActions, nCtxSw, . . . , goal, . . . )
which can capture a nonlinear relationship, and can further comprise environment variables that classify the different IT management types to build a higher-quality model.
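An RBF network can likewise be sketched in a few lines: Gaussian basis functions centered on the calibration samples, with linear output weights fit by least squares. The shared width and the reuse of the hypothetical data from the previous sketch are simplifying assumptions for illustration.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian activation of each sample against each basis center."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * width ** 2))

X = np.array([[4, 1], [9, 3], [12, 2], [20, 7], [25, 9]], dtype=float)
ET = np.array([11.0, 24.0, 27.0, 55.0, 68.0])

centers = X.copy()   # one basis function per calibration sample
width = 5.0          # shared Gaussian width (a tuning choice)

# Fit the output layer (bias + one weight per center) by least squares.
Phi = np.column_stack([np.ones(len(X)), rbf_features(X, centers, width)])
weights, *_ = np.linalg.lstsq(Phi, ET, rcond=None)

def predict_et(X_new):
    """Nonlinear estimate of execution time for new metric vectors."""
    Phi_new = np.column_stack([np.ones(len(X_new)),
                               rbf_features(X_new, centers, width)])
    return Phi_new @ weights

print(predict_et(np.array([[15.0, 4.0]])))
```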
While changes and variations to the embodiments may be made by those skilled in the art, the scope of the invention is to be determined by the appended claims.
U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
4835372 | Gombrich et al. | May 1989 | A |
5504921 | Dev et al. | Apr 1996 | A |
5724262 | Ghahramani | Mar 1998 | A |
5734837 | Flores et al. | Mar 1998 | A |
5765138 | Aycock et al. | Jun 1998 | A |
5774661 | Chatterjee et al. | Jun 1998 | A |
5826239 | Du et al. | Oct 1998 | A |
5850535 | Maystrovsky et al. | Dec 1998 | A |
5870545 | Davis et al. | Feb 1999 | A |
5884302 | Ho | Mar 1999 | A |
5907488 | Arimoto et al. | May 1999 | A |
5937388 | Davis et al. | Aug 1999 | A |
6049776 | Donnelly et al. | Apr 2000 | A |
6131085 | Rossides | Oct 2000 | A |
6249769 | Ruffin et al. | Jun 2001 | B1 |
6259448 | McNally et al. | Jul 2001 | B1 |
6263335 | Paik et al. | Jul 2001 | B1 |
6308208 | Jung et al. | Oct 2001 | B1 |
6339838 | Weinman, Jr. | Jan 2002 | B1 |
6363384 | Cookmeyer, II et al. | Mar 2002 | B1 |
6453269 | Quernemoen | Sep 2002 | B1 |
6473794 | Guheen et al. | Oct 2002 | B1 |
6496209 | Horii | Dec 2002 | B2 |
6523027 | Underwood | Feb 2003 | B1 |
6526387 | Ruffin et al. | Feb 2003 | B1 |
6526392 | Dietrich et al. | Feb 2003 | B1 |
6526404 | Slater et al. | Feb 2003 | B1 |
6618730 | Poulter et al. | Sep 2003 | B1 |
6675149 | Ruffin et al. | Jan 2004 | B1 |
6738736 | Bond | May 2004 | B1 |
6789101 | Clarke et al. | Sep 2004 | B2 |
6810383 | Loveland | Oct 2004 | B1 |
6865370 | Ho et al. | Mar 2005 | B2 |
6879685 | Peterson et al. | Apr 2005 | B1 |
6907549 | Davis et al. | Jun 2005 | B2 |
6970803 | Aerdts et al. | Nov 2005 | B1 |
6988088 | Miikkulainen et al. | Jan 2006 | B1 |
6988132 | Horvitz | Jan 2006 | B2 |
7010593 | Raymond | Mar 2006 | B2 |
7039606 | Hoffman et al. | May 2006 | B2 |
7089529 | Sweitzer et al. | Aug 2006 | B2 |
7114146 | Zhang et al. | Sep 2006 | B2 |
7177774 | Brown et al. | Feb 2007 | B1 |
7236966 | Jackson et al. | Jun 2007 | B1 |
7260535 | Galanes et al. | Aug 2007 | B2 |
7293238 | Brook et al. | Nov 2007 | B1 |
7315826 | Guheen et al. | Jan 2008 | B1 |
7364067 | Steusloff et al. | Apr 2008 | B2 |
7403948 | Ghoneimy et al. | Jul 2008 | B2 |
7412502 | Fearn et al. | Aug 2008 | B2 |
7467198 | Goodman et al. | Dec 2008 | B2 |
7472037 | Brown et al. | Dec 2008 | B2 |
7562143 | Fellenstein et al. | Jul 2009 | B2 |
7580906 | Faihe | Aug 2009 | B2 |
7707015 | Lubrecht et al. | Apr 2010 | B2 |
7802144 | Vinberg et al. | Sep 2010 | B2 |
20010047270 | Gusick et al. | Nov 2001 | A1 |
20020019837 | Balnaves | Feb 2002 | A1 |
20020055849 | Georgakopoulos et al. | May 2002 | A1 |
20020091736 | Wall | Jul 2002 | A1 |
20020099578 | Eicher et al. | Jul 2002 | A1 |
20020111823 | Heptner | Aug 2002 | A1 |
20020140725 | Horii | Oct 2002 | A1 |
20020147809 | Vinberg | Oct 2002 | A1 |
20020161875 | Raymond | Oct 2002 | A1 |
20020169649 | Lineberry et al. | Nov 2002 | A1 |
20020186238 | Sylor et al. | Dec 2002 | A1 |
20030004746 | Kheirolomoom et al. | Jan 2003 | A1 |
20030018629 | Namba | Jan 2003 | A1 |
20030018771 | Vinberg | Jan 2003 | A1 |
20030033402 | Battat et al. | Feb 2003 | A1 |
20030065764 | Capers et al. | Apr 2003 | A1 |
20030065805 | Barnes | Apr 2003 | A1 |
20030097286 | Skeen | May 2003 | A1 |
20030101086 | San Miguel | May 2003 | A1 |
20030154406 | Honarvar et al. | Aug 2003 | A1 |
20030172145 | Nguyen | Sep 2003 | A1 |
20030187719 | Brocklebank | Oct 2003 | A1 |
20030225747 | Brown et al. | Dec 2003 | A1 |
20040024627 | Keener | Feb 2004 | A1 |
20040158568 | Colle et al. | Aug 2004 | A1 |
20040172466 | Douglas et al. | Sep 2004 | A1 |
20040181435 | Snell et al. | Sep 2004 | A9 |
20040186757 | Starkey | Sep 2004 | A1 |
20040186758 | Halac et al. | Sep 2004 | A1 |
20040199417 | Baxter et al. | Oct 2004 | A1 |
20050027585 | Wodtke et al. | Feb 2005 | A1 |
20050027845 | Secor et al. | Feb 2005 | A1 |
20050066026 | Chen et al. | Mar 2005 | A1 |
20050091269 | Gerber et al. | Apr 2005 | A1 |
20050114306 | Shu et al. | May 2005 | A1 |
20050114829 | Robin et al. | May 2005 | A1 |
20050136946 | Trossen | Jun 2005 | A1 |
20050138631 | Bellotti et al. | Jun 2005 | A1 |
20050159969 | Sheppard | Jul 2005 | A1 |
20050187929 | Staggs | Aug 2005 | A1 |
20050203917 | Freeberg et al. | Sep 2005 | A1 |
20050223299 | Childress et al. | Oct 2005 | A1 |
20050223392 | Cox et al. | Oct 2005 | A1 |
20050254775 | Hamilton et al. | Nov 2005 | A1 |
20060067252 | John et al. | Mar 2006 | A1 |
20060069607 | Linder | Mar 2006 | A1 |
20060112036 | Zhang et al. | May 2006 | A1 |
20060112050 | Miikkulainen et al. | May 2006 | A1 |
20060129906 | Wall | Jun 2006 | A1 |
20060168168 | Xia et al. | Jul 2006 | A1 |
20060178913 | Lara et al. | Aug 2006 | A1 |
20060184410 | Ramamurthy et al. | Aug 2006 | A1 |
20060190482 | Kishan et al. | Aug 2006 | A1 |
20060224569 | DeSanto et al. | Oct 2006 | A1 |
20060224580 | Quiroga et al. | Oct 2006 | A1 |
20060235690 | Tomasic et al. | Oct 2006 | A1 |
20060282302 | Hussain | Dec 2006 | A1 |
20060287890 | Stead et al. | Dec 2006 | A1 |
20070043524 | Brown et al. | Feb 2007 | A1 |
20070055558 | Shanahan et al. | Mar 2007 | A1 |
20070073576 | Connors et al. | Mar 2007 | A1 |
20070073651 | Imielinski | Mar 2007 | A1 |
20070083419 | Baxter et al. | Apr 2007 | A1 |
20070118514 | Mariappan | May 2007 | A1 |
20070168225 | Haider et al. | Jul 2007 | A1 |
20070219958 | Park et al. | Sep 2007 | A1 |
20070234282 | Prigge et al. | Oct 2007 | A1 |
20070282470 | Hernandez et al. | Dec 2007 | A1 |
20070282622 | Hernandez et al. | Dec 2007 | A1 |
20070282645 | Brown et al. | Dec 2007 | A1 |
20070282653 | Bishop et al. | Dec 2007 | A1 |
20070282655 | Jaluka et al. | Dec 2007 | A1 |
20070282659 | Bailey et al. | Dec 2007 | A1 |
20070282692 | Bishop et al. | Dec 2007 | A1 |
20070282776 | Jaluka et al. | Dec 2007 | A1 |
20070282876 | Diao et al. | Dec 2007 | A1 |
20070282942 | Bailey et al. | Dec 2007 | A1 |
20070288274 | Chao et al. | Dec 2007 | A1 |
20070292833 | Brodie et al. | Dec 2007 | A1 |
20080065448 | Hull et al. | Mar 2008 | A1 |
20080109260 | Roof | May 2008 | A1 |
20080213740 | Brodie et al. | Sep 2008 | A1 |
20080215404 | Diao et al. | Sep 2008 | A1 |
20090012887 | Taub et al. | Jan 2009 | A1 |
Foreign Patent Documents

Number | Date | Country
---|---|---
2007143516 | Dec 2007 | WO |
Publication

Number | Date | Country
---|---|---
20070282644 A1 | Dec 2007 | US