Method for performance measurement and analysis of local exchange carrier interconnections

Information

  • Patent Grant
  • 6480749
  • Patent Number
    6,480,749
  • Date Filed
    Wednesday, December 29, 1999
  • Date Issued
    Tuesday, November 12, 2002
Abstract
Performance measurement and analysis of local exchange carrier interconnections. Performance measurements can be used to establish that an incumbent local exchange carrier (“ILEC”) is providing interconnections to one or more competitive local exchange carriers (“CLEC's”) that are at least equal in quality to the interconnections provided to itself. Performance measurements are defined to measure the timeliness, accuracy and availability of the interconnections provided to the CLEC's. A performance measurement is based upon performance data that is collected from one of the ILEC's processes (preordering, ordering, provisioning, collocation, billing, maintenance and repair, emergency 911, operator service/directory assistance and trunk blockage). A performance report is defined to specify the types of performance measurements and dimensions that are included, as well as the reporting period.
Description




TECHNICAL FIELD




This invention relates in general to the collection and analysis of data, and more particularly to collecting data to determine a performance measurement that measures an interconnection provided by a local exchange carrier.




BACKGROUND OF THE INVENTION




A local exchange carrier (“LEC”) utilizes many different processes in order to serve its customers. For example, one or more processes typically support customer ordering. Customer ordering includes the initiation of new service or the modification of existing service. Other processes support maintenance and repair and billing.




At one time, a local telephone market was served by a single LEC. However, local telephone markets now are open to competition from competitive local exchange carriers (“CLEC”). The existing or incumbent LEC (“ILEC”) is required to offer quality interconnection services, resale of capacity at wholesale rates, dialing parity, number portability and unbundled access to its networks to the CLEC's. Although the ILEC is required to satisfy these requirements, there is no established method for measuring the ILEC's compliance. For example, the ILEC has a duty not to prohibit, and not to impose unreasonable or discriminatory conditions or limitations on, the resale of its telecommunications services. However, there is no well-defined method or measurement to ensure that this duty is met.




The telecommunications industry and the FCC issued a notice of proposed rule making (“NPRM”) that set forth some model performance measurements for measuring an ILEC's compliance. However, no rules have been promulgated. In addition, some states or public service commissions have attempted to define performance measurements, but either the performance measurements have not been enacted or the performance measurements are not well-defined. Therefore, there is a need to define performance measurements that establish that an ILEC has satisfied the requirements to provide equivalent service to a CLEC.




Once the performance measurements have been defined, then the ILEC must also determine how to collect the required data and present it in a useable format. Because the requirements to provide equivalent service to a CLEC cover a wide range of services, the ILEC must collect data from the various processes that it uses. These processes can be located on systems that are physically separate from one another. In addition, the processes can use data formats that are incompatible. In order to determine the performance measurements, the ILEC must identify the data that needs to be collected. Once the data is collected, the system must then normalize the data or transform the data into a common format so that data from multiple systems can be used to determine the performance measurements. Therefore, there is a need for a method of collecting data from a variety of systems or processes that may be incompatible with one another, normalizing the data and using the data to determine performance measurements.




Although it may be possible for the performance measurements to be determined manually, the manual collection and analysis of data greatly limits the number of measurements that can be taken. In addition, if the system is manually intensive, then it is difficult to alter the types of data that are collected or to alter the types of reports or other analysis that is generated from the data. Thus, the method for collecting, normalizing and analyzing the data should be automated. In addition, the method should be flexible so that the types of data collected and the reports generated can be easily modified.




SUMMARY OF THE INVENTION




The present invention meets the needs described above by providing a method for defining, analyzing and reporting performance measurements. The performance measurements can be used to establish that an incumbent local exchange carrier (“ILEC”) is providing interconnections to one or more competitive local exchange carriers (“CLEC's”) that are at least equal in quality to the interconnections provided to itself.




Typically, performance measurements are defined to measure such things as timeliness, accuracy and availability. The performance measurements can be used to compare the services provided by the ILEC to the CLEC's to the services provided by the ILEC to itself. An ILEC utilizes many different processes to serve its customers. Exemplary processes include preordering, ordering, provisioning, collocation, billing, maintenance and repair, emergency 911, operator service/directory assistance and trunk blockage.




A performance measurement is based upon a calculation that uses performance data collected from the processes. When the performance data is collected, the data is identified with one or more dimensions, such as the geography, entity, product and time dimensions. A dimension defines how a performance measurement is reported.




The method for defining, determining and reporting a performance measurement includes the steps of defining the performance measurement and defining the dimensions for the performance measurement. The method also includes defining the performance data needed to determine the performance measurement. In some instances, the performance data needed to determine a performance measurement is used for other purposes and is thus available from the process. However, in other instances, the performance data is created or collected especially for the performance measurement. For example, a performance measurement based upon timeliness may require that the process use timestamps when such timestamps were not previously used. If timestamps are required, then the process associates a timestamp with certain events in order to measure an interval or response time. The method also includes defining the performance reports to specify the types of performance measurements and dimensions that are included, as well as the reporting period. Preferably, the definition of the report can be easily modified to adapt to changes in the requirements, processes or user requests.




The performance data needed to determine the performance measurements is obtained and is used to determine the performance measurement. The performance measurement can be determined using a combination of dimensions. For example, a performance measurement can be calculated for a particular CLEC in a particular geographic area for a particular time period. Once the performance measurement is determined it can be included in the performance report.
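The comparison described here amounts to computing the same measurement twice, once over the CLEC's records and once over the ILEC's own records, and reporting both. The sketch below illustrates that idea in Python; the record fields and entity names are invented for illustration and are not part of the patent.

```python
def average_completion_interval_hours(orders, entity):
    """Average completion interval, in hours, for one entity's orders."""
    rows = [o for o in orders if o["entity"] == entity]
    if not rows:
        return None
    return sum(o["completed_hour"] - o["issued_hour"] for o in rows) / len(rows)

# Invented order records; issue and completion are expressed in hours for brevity.
orders = [
    {"entity": "CLEC-A", "issued_hour": 0, "completed_hour": 72},
    {"entity": "CLEC-A", "issued_hour": 10, "completed_hour": 58},
    {"entity": "ILEC",   "issued_hour": 0, "completed_hour": 48},
]

# Parity check: the same measurement for a particular CLEC and for the ILEC itself.
clec, ilec = (average_completion_interval_hours(orders, e) for e in ("CLEC-A", "ILEC"))
print(f"CLEC-A: {clec} h, ILEC: {ilec} h")
```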




A performance measurement and analysis platform (“PMAP”) system supports the collection of performance data, the determination of the performance measurements and the generation of the performance reports. The PMAP system includes source systems, a staging database, a normalized operational data store, a dimensional data store database and a user interface. A number of source systems provide data to the PMAP system. Typically, the source systems correspond to the processes used by the ILEC.




Once the data is collected from the source systems, the data is loaded into a staging database and the data is filtered and normalized. The normalized operational data store is used to validate the data against business rules and data relationships and transform the data to conform to the PMAP data model. The dimensional data store database includes performance measurements which include aggregate and summary data. The PMAP system provides a variety of reporting capabilities. The reports include aggregate and CLEC-specific reports, state and regional reports, and reports directed to the different processes or subject areas.




The PMAP system also creates raw data files that contain detailed information about specific local service requests, service orders, trouble tickets and other items that are typically reported. Typically, the raw data is used to recreate performance reports or to enable a user to create a custom report. A user can download raw data files, import raw data files into a program, such as a spreadsheet program, or manipulate the raw data to create a measurement in any of the performance reports.




These and other aspects, features and advantages of the present invention may be more clearly understood and appreciated from a review of the following detailed description of the disclosed embodiments and by reference to the appended drawings and claims.











BRIEF DESCRIPTION OF THE DRAWINGS





FIG. 1

is a block diagram illustrating exemplary processes, in accordance with an embodiment of the present invention.





FIG. 2

is a block diagram illustrating exemplary dimensions, in accordance with an embodiment of the present invention.





FIG. 3

is a flow chart illustrating the definition of a performance measurement, in accordance with an embodiment of the present invention.





FIG. 4

is a block diagram illustrating the processing of performance data, in accordance with an embodiment of the present invention.











DETAILED DESCRIPTION OF THE INVENTION




The present invention is directed to a method for defining, analyzing and reporting performance measurements. The performance measurements can be used to establish that an incumbent local exchange carrier (“ILEC”) is providing interconnections to one or more competitive local exchange carriers (“CLEC's”) that are at least equal in quality to the interconnections provided to the ILEC itself. Briefly described, the performance measurements measure the timeliness, accuracy and effectiveness of the interconnections provided by the ILEC. The performance measurements are determined using performance data that is collected from systems associated with the ILEC. Typically, the systems correspond to the processes used by the ILEC to support the interconnections, such as ordering and billing. The collected performance data is normalized and the data is used to determine the performance measurements. The performance measurements are reported using a variety of formats. The performance measurements for a particular CLEC can be reported or performance measurements for all CLEC's can be reported. The performance measurements for one or more CLEC's can also be compared to the performance measurements for the ILEC.




Exemplary Local Exchange Carrier Processes




Typically, a local exchange carrier (“LEC”) utilizes many different processes to serve its customers. The exemplary processes illustrated in FIG. 1 include preordering 100, ordering 102, provisioning 104, collocation 106, billing 108, maintenance and repair 110, emergency 911 112, operator service/directory assistance 114, and trunk blockage 116. As will be apparent to those skilled in the art, different LEC's may have different processes than those illustrated by FIG. 1. Some LEC's may combine one or more of the processes illustrated in FIG. 1 into a single process or may divide a single process illustrated in FIG. 1 into multiple processes. FIG. 1 also illustrates customer events 118. The customer events interface directly with some of the processes. In FIG. 1, the customer events 118 interface with the emergency 911 112, billing 108, operator service/directory assistance 114, and trunk blockage 116 processes.




The preordering process 100 is directed to activities that occur prior to the submittal of a local service request by an LEC. These activities include verifying the customer's street address, determining available products and services, estimating the service interval and reserving a telephone number. Data validation is performed to ensure that the local service request is complete and accurate. Performance measurements directed to the preordering process include the availability of the preordering process and the response time.




The ordering process 102 begins when an LEC enters a local service request and ends when the LEC receives confirmation that an order has been created in the system. Orders may be submitted electronically or via facsimile, telephone or e-mail. Performance measurements directed to the ordering process include availability of order progress information.




The provisioning process 104 includes facilities assignment, software changes, service design, issuance of technician work orders and activation procedures. The provisioning process ends when a billing record is created for the new account, or the billing record is updated if the order is being provisioned for a change order. Performance measurements directed to the provisioning process include the timeliness of service delivery and the accuracy of the services provided to the LEC.




The collocation process 106 includes activities related to placing customer-owned equipment in the ILEC's central office for interconnection to the ILEC's tariffed services and unbundled network elements. Performance measurements directed to the collocation process include the timeliness of the interconnection.




The billing process 108 includes the activities associated with accumulating usage data and determining the charges to be billed to a customer's account. Performance measurements directed to the billing process include the accuracy and the timeliness of customer invoices.




The maintenance and repair process 110 includes activities directed to responding to a customer's maintenance and repair needs. The maintenance and repair process begins when a customer reports a service problem. A trouble ticket is entered to document the problem. The equipment and facilities are tested to locate the source of the trouble and once the problem is repaired, the customer is notified and the trouble ticket is closed. Performance measurements directed to the maintenance and repair process include the timeliness of the repair and the rate of repeat problems.




The emergency 911 process 112 includes activities that support emergency 911 service. In one embodiment, the emergency 911 process includes database updates to a third party emergency 911 vendor to ensure that the vendor has the most up-to-date information for providing emergency service to residents and businesses. In other embodiments, the emergency 911 process is directed to activities that include the actual provision of emergency 911 service. Performance measurements directed to the emergency 911 process include the timeliness and accuracy of the service.




The operator service/directory assistance process 114 includes activities required to provide additional services to the customer, such as directory inquiries to retrieve telephone numbers. Performance measurements directed to the operator services process include the timeliness of the service.




The trunk blockage process 116 includes the collection of traffic performance data on the trunk groups in the network. Performance measurements directed to the trunk blockage process include the number of attempted calls and the number of blocked calls.




Definition of Performance Measurements




Performance measurements are defined to demonstrate that an ILEC is providing interconnections to CLEC's that are at least equal in quality to those provided by the ILEC to itself. The performance measurements typically are used to compare the services provided by the ILEC to the CLEC's to the services provided by the ILEC to itself. In some instances a comparison is made between the ILEC and all other CLEC's. In other instances a comparison is made between the ILEC and a particular CLEC.




Performance measurements are defined to measure such things as timeliness, accuracy and availability. In the exemplary embodiment described herein, the performance measurements are defined to measure activities associated with the different processes described in the preceding section. Typically, a performance measurement is defined as a total, percentage, interval or accuracy measurement. A performance measurement that measures a total is defined to be the sum of a number of events or occurrences. For example, a flow through error analysis performance measurement associated with the ordering process is defined as the sum of errors by type.




A performance measurement that measures a percentage is defined to be an actual number divided by a total or scheduled number and multiplied by 100. For example, the missed repair appointments performance measurement associated with the maintenance and repair process is defined as follows:






Percentage of missed repair appointments=Σ (count of customer troubles not cleared by the quoted commitment date and time)/Σ (total trouble reports closed in reporting period)×100






A performance measurement that measures an interval is defined to be an actual time interval for an event divided by the total number of events.




For example, the average completion interval performance measurement associated with the provisioning process is defined as follows:






Average Completion Interval=Σ[(completion date and time)−(order issue date and time)] /Σ (total orders completed in reporting period)






A performance measurement that measures accuracy is defined to be the percentage of correct events to total events. For example, the invoice accuracy performance measurement associated with the billing process is defined as follows:




 Invoice Accuracy=[(total billed revenues)−(billing related adjustments)]/(total billed revenues)×100
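Read literally, the measurement types above translate into simple arithmetic over collected counts and timestamps. The following sketch restates the three worked formulas in Python; the function and variable names are illustrative, not taken from the patent.

```python
from datetime import datetime

def percent_missed_repair_appointments(troubles_not_cleared, troubles_closed):
    # Σ(count of customer troubles not cleared by the quoted commitment date
    # and time) / Σ(total trouble reports closed in reporting period) × 100
    return troubles_not_cleared / troubles_closed * 100

def average_completion_interval(completion_times, issue_times):
    # Σ[(completion date and time) − (order issue date and time)]
    # / Σ(total orders completed in reporting period), in seconds
    total = sum((done - issued).total_seconds()
                for done, issued in zip(completion_times, issue_times))
    return total / len(completion_times)

def invoice_accuracy(total_billed, billing_adjustments):
    # [(total billed revenues) − (billing related adjustments)]
    # / (total billed revenues), expressed as a percentage
    return (total_billed - billing_adjustments) / total_billed * 100

print(percent_missed_repair_appointments(12, 400))                       # 3.0
print(average_completion_interval(
    [datetime(2000, 3, 3, 12)], [datetime(2000, 3, 1, 12)]) / 3600)      # 48.0 hours
print(invoice_accuracy(1_000_000, 25_000))                               # 97.5
```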




Exemplary performance measurements associated with the processes are summarized in the following tables. Each table corresponds to one of the previously described processes and includes the name of the performance measurements and the definition of each performance measurement. In addition, each table includes the dimensions and performance data associated with the process. The dimensions and the performance data can be associated with more than one of the performance measurements in the table.












TABLE 1
PRE-ORDERING PROCESS

Performance Measurements | Performance Definition | Dimensions | Performance Data
Average OSS Response Interval | Σ[(Date and Time of Legacy Response) − (Date and Time of Request to Legacy)]/(Number of Legacy Requests During the Reporting Period) × 100 | Time | Number of Legacy Requests
OSS Interface Availability | (Functional Availability)/(Scheduled Availability) × 100 | Geography | Summation of Response Interval


TABLE 2
ORDERING PROCESS

Performance Measurements | Performance Definition | Dimensions | Performance Data
Percent Flow Through Service Requests | Σ(Total number of valid service requests that flow through to the ILEC OSS)/(Total number of valid service requests delivered to the ILEC OSS) × 100 | Entity | Total Number Rejected Service Requests
Percent Rejected Service Requests | (Total Number of Rejected Service Requests)/(Total Number of Service Requests Received) × 100 | Time | Total Number Service Requests Received
Reject Interval | Σ[(Date and Time of Service Request Rejection) − (Date and Time of Service Request Receipt)]/(Number of Service Requests Rejected in Reporting Period) | Geography | Date and Time of Service Request rejection
Firm Order Confirmation Timeliness | Σ[(Date and Time of Firm Order Confirmation) − (Date and Time of Service Request Receipt)]/(Number of Service Requests Confirmed in Reporting Period) | | Date and Time of Service Request receipt
Speed of Answer in Ordering Center | (Total Time in seconds to reach the Local Carrier Service Center)/(Total Number of Calls) in the Reporting Period | | Date and Time of Firm Order Confirmation
Flow Through Error Analysis | Σ of errors by type | Class of Service/Product | Number Service Requests confirmed (in reporting period)

TABLE 3
PROVISIONING

Performance Measurements | Performance Definition | Dimensions | Performance Data
Average Completion Interval | Σ[(Completion Date and Time) − (Order Issue Date and Time)]/Σ(Count of Orders Completed in Reporting Period) | Entity | Total Service Orders Completed
Order Completion Interval Distribution | Σ(Service Orders Completed in “X” days)/(Total Service Orders Completed in Reporting Period) × 100 | Time | Service Order Completion Date and Time
Mean Held Order Interval | Σ(Reporting Period Close Date − Committed Order Due Date)/(Number of Orders Pending and Past the Committed Due Date) for all orders pending and past the committed due date | Geography | Service Order Issue Date and Time
Held Order Distribution Interval | (Number of Orders Held for ≧ “X” days)/(Total number of orders pending but not completed) × 100 | | Service Orders completed in “X” Days
Average Jeopardy Notice Interval | Σ(Reporting Period Close Date − Committed Order Due Date)/(Number of Orders Pending and Past the Committed Due Date) for all orders pending and past the committed due date | Class of Service/Product | Committed Order Due Date
Percentage of Orders Given Jeopardy Notices | Σ[(Number of Orders Given Jeopardy Notices in Reporting Period)/(Number of Orders Committed in Reporting Period)] | | Number of Service Orders Pending and Past the Committed Due Date
Percent Missed Installation Appointments | Σ(Number of Orders Not Complete by Committed Due Date in Reporting Period)/(Number of Orders Completed in Reporting Period) × 100 | | Number of Service Orders Held for ≧ 90 Days
Percent Provisioning Troubles within 30 days | Σ(Trouble reports on all completed orders ≦ 30 days following service order(s) completion)/(All Service Orders completed in the report calendar month) × 100 | | Number of Service Orders Held for ≧ 15 Days
Coordinated Customer Conversions | Σ[(Completion Date and Time for Cross Connection of an Unbundled Loop) − (Disconnection Date and Time of an Unbundled Loop)]/(Total Number of Unbundled Loop Items for the reporting period) | | Total Number of Service Orders Pending But Not Completed
Average Completion Notice Interval | Σ[(Date and Time of Notice of Completion) − (Date and Time of Work Completion)]/(Number of Orders Completed in Reporting Period) | | Number of Service Orders missed in Reporting Period

TABLE 4
COLLOCATION

Performance Measurements | Performance Definition | Dimensions | Performance Data
Average Response Time | Σ[(Request Response Date) − (Request Submission Date)]/(Count of Responses Returned within Reporting Period) | Entity | Request Response Date & Time
Average Arrangement Time | Σ[(Date Collocation Arrangement is Complete) − (Date Order for Collocation Arrangement Submitted)]/(Total Number of Collocation Arrangements Completed during Reporting Period) | Time | Request Submission Date & Time
Percent of Due Dates Missed | Σ(Number of Orders not completed within ILEC Committed Due Date during Reporting Period)/(Number of Orders Completed in Reporting Period) × 100 | Geography | Count of Requests Submitted

TABLE 5
BILLING

Performance Measurements | Performance Definition | Dimensions | Performance Data
Invoice Accuracy | [(Total Billed Revenues during current month) − (Billing Related Adjustments during current month)]/(Total Billed Revenues during current month) × 100 | Entity | Total Local Services billed Revenues
Invoice Timeliness | Σ(total number of usage records sent within six (6) calendar days from initial recording/receipt)/Σ(Total number of usage records sent) × 100 | Time | Total Adjustment Revenues
Usage Data Delivery Completeness | Σ(total number of usage records sent within six (6) calendar days from initial recording/receipt)/Σ(Total number of usage records sent) × 100 | Geography | Summation of Time to Transmit Invoices
Usage Data Delivery Timeliness | Σ(Total number of usage records sent within six calendar days from initial recording/receipt)/Σ(Total number of usage records sent) × 100 | | Total No. of Invoices
Usage Data Delivery Accuracy | Σ[(Total number of usage data packs sent during current month) − (Total number of usage data packs requiring retransmission during current month)]/(Total number of usage data packs sent during current month) × 100 | Class of Service/Product | Number of Usage Data Packs Sent

TABLE 6
MAINTENANCE AND REPAIR

Performance Measurements | Performance Definition | Dimensions | Performance Data
OSS Interface Availability | (Actual System Functional Availability)/(Actual planned System Availability) × 100 | Entity | Total time in seconds for ILEC Repair Center Response
Average OSS Response Interval | [(Query Response Date and Time for Category “X”) − (Query Request Date and Time for Category “X”)]/(Number of Queries Submitted in the Reporting Period), where “X” is 0-4, ≧4 to 10, ≧10, ≧30 seconds | Time | Total Number of Calls Received
Average Answer Time − Repair | [(Time ILEC Repair Attendant Answers Call) − (Time of entry into queue until ACD Selection)]/(Total number of calls by reporting period) | Geography | Count of Customer Troubles Not Resolved by the Quoted Resolution Time and Date
Missed Repair Appointments | Σ(Count of Customer Troubles Not Cleared by the Quoted Commitment Date and Time)/Σ(Total Trouble reports closed in Reporting Period) × 100 | Interval Distribution | Count of Customer Trouble Tickets Closed
Customer Trouble Report Rate | (Count of Initial and Repeated Trouble Reports in the Current Period)/(Number of Service Access Lines in Service at End of the Report Period) × 100 | | Count of Repeated Trouble Reports in the Current Period
Maintenance Average Duration | Σ[(Date and Time of Service Restoration) − (Date and Time Trouble Ticket was Opened)]/Σ(Total Closed Troubles in the Reporting Period) | Class of Service/Product | Number of Service Access Lines in Service at End of the Report Period
Out of Service (“OOS”) >24 Hours | (Total Troubles OOS >24 Hours)/(Total OOS Troubles in Reporting Period) × 100 | | Total Duration Time from the Receipt to the Clearing of Trouble Reports
Percent Repeat Troubles within 30 days | (Count of Customer Troubles where more than one trouble report was logged for the same service line within a continuous 30 days)/(Total Trouble Reports Closed in Reporting Period) × 100 | | Total Out of Service Troubles

TABLE 7
EMERGENCY 911

Performance Measurements | Performance Definition | Dimensions | Performance Data
Timeliness | Σ(Number of batch orders processed within 24 hours ÷ Total number of batch orders submitted) × 100 | Time | Number of Confirmed Orders
Accuracy | Σ(Number of record individual updates processed with no errors ÷ Total number of individual record updates) × 100 | Geography | Number of Orders missed in Reporting Period
Mean Interval | Σ(Date and time of batch order completion − Date and time of batch order submission)/(Number of batch orders completed) | | Total Number of SOIR orders for E911 Updates

TABLE 8
OPERATOR SERVICES

Performance Measurements | Performance Definition | Dimensions | Performance Data
Average Speed to Answer | (Total call waiting seconds)/(Total calls served) | Time | Call Waiting Seconds
Percent Answered within “X” seconds | (Total number of calls answered within X seconds)/(Total calls served) × 100 | Geography | Number of Calls served

Note: In some embodiments the operator services performance measurements are provided by the operator services process. No raw data is provided.

TABLE 9
TRUNK BLOCKAGE

Performance Measurements | Performance Definition | Dimensions | Performance Data
Trunk Group Service Report | (Total number of blocked calls)/(Total number of attempted calls) × 100 | Entity | Number of Trunk Groups Measured


Dimensions




A performance measurement is based upon performance data collected from the processes. When the performance data is collected, the data is identified with a particular dimension. A dimension defines how a performance measurement based upon the data is reported. In the embodiment illustrated by FIG. 2, the primary dimensions include geography 200, entity 210, product 220 and time 230.




The geography dimension permits performance measurements to be calculated based on specific geographic criteria. The geographic criteria shown in FIG. 2 are region 202, state 204, MSA (market service area or metropolitan statistical area) 206 and wire center (or switching center) 208. The entity dimension permits performance measurements to be calculated based on a set of LEC's or a particular LEC. The entities shown in FIG. 2 include aggregate 212 (all CLEC's and the ILEC), CLEC 214 (all CLEC's), ILEC 216 (incumbent LEC), CLEC company 218 (a particular CLEC) and CLEC Identifiers 219.




The product dimension permits performance measurements to be calculated based upon specific products or services. The products shown in FIG. 2 include LA Product 222, GA Product 224 and product class 226. The time dimension permits performance measurements to be calculated based upon certain time intervals. The time dimensions shown in FIG. 2 include year 232, quarter 234 and month 236. Multiple dimensions can be used to report performance measurements. For example, a report can be based on a particular region, CLEC and year. As will be apparent to those skilled in the art, other embodiments can include alternative or additional dimensions.




As illustrated in FIG. 2, a dimension has multiple layers. If a performance report uses a particular layer of a dimension, a user can obtain additional detail by accessing a different layer. For example, to obtain additional detail for a report that uses the “year” layer of the time dimension, the user can access the “quarter” layer or the “month” layer.
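One way to model the layered dimensions of FIG. 2 is as ordered hierarchies that a report can drill into. The sketch below shows that assumption for the geography and time dimensions only; the structure is illustrative and is not the patent's data model.

```python
# Each dimension is an ordered list of layers, coarsest first, mirroring FIG. 2.
DIMENSIONS = {
    "geography": ["region", "state", "MSA", "wire center"],
    "time":      ["year", "quarter", "month"],
}

def drill_down(dimension, layer):
    """Return the next, more detailed layer of a dimension, if any."""
    layers = DIMENSIONS[dimension]
    i = layers.index(layer)
    return layers[i + 1] if i + 1 < len(layers) else None

print(drill_down("time", "year"))     # 'quarter'
print(drill_down("time", "quarter"))  # 'month'
```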




Method for Defining, Determining and Reporting Performance Measurement




Exemplary steps for defining, determining and reporting a performance measurement are shown in FIG. 3. In step 302, the performance measurement is defined. Defining the performance measurement includes defining a quantifiable measure of an interconnection provided by the ILEC. The performance measurements are typically associated with one of the processes of the ILEC. For example, to define a performance measurement for the maintenance and repair process, a performance measurement could be defined to measure trouble reports not cleared by a committed date and time (missed repair appointments measurement, Table 6). The preceding tables include exemplary definitions of performance measurements. In step 304, the dimensions for the performance measurement are defined. For example, the dimensions for the missed repair appointments measurement include the entity, geography, product and time dimensions.




In step 306, the performance data needed to determine the performance measurement are defined. The performance data are defined by considering the definition of the performance measurement and the dimensions for the performance measurement. For example, the performance data needed to determine the missed repair appointments performance measurement are the count of customer troubles not cleared by the quoted commitment date and time and the total trouble reports closed in the reporting period. The performance data needed to determine the missed repair appointments performance measurement includes data associated with the entity, geography, product and time dimensions.




In some instances, the performance data needed to determine a performance measurement is used for other purposes and is thus available from the process. However, in other instances, the performance data is created or collected especially for the performance measurement. For example, a performance measurement based upon timeliness may require that the process use timestamps when such timestamps were not previously used. If timestamps are required, then the process associates a timestamp with certain events in order to measure an interval or response time.
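Where a process must add timestamps solely to support a timeliness measurement, the recording can be as simple as stamping the paired events and differencing them later. This is a sketch with assumed event names and an in-memory event log; it is not the patent's implementation.

```python
from datetime import datetime, timezone

# Assumed event log: the ordering process stamps the receipt of a service
# request and the later firm order confirmation so the interval can be measured.
events = []

def record_event(request_id, event_type):
    events.append({"request_id": request_id,
                   "event": event_type,
                   "timestamp": datetime.now(timezone.utc)})

def firm_order_confirmation_interval(request_id):
    """Seconds between service request receipt and firm order confirmation."""
    stamps = {e["event"]: e["timestamp"]
              for e in events if e["request_id"] == request_id}
    return (stamps["confirmation"] - stamps["receipt"]).total_seconds()

record_event("LSR-001", "receipt")
record_event("LSR-001", "confirmation")
print(firm_order_confirmation_interval("LSR-001"))
```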




In step 308, the performance reports are defined. The performance reports are defined to specify the types of performance measurements and dimensions and the reporting period. The reports can be defined by the ILEC, a CLEC that is accessing the system or another entity. For example, if the PSC (public service commission) requires a specific type of report or a report that includes specific information, then the report can be defined to meet those requirements. Preferably, the definition of the report can be easily modified so that the report can adapt to changes in the requirements, processes or user requests.




In step 310, the performance data needed to determine the performance measurements are obtained. The performance data can be obtained from a number of different source systems associated with the different processes. Additional details about the source systems are provided in the following section. In step 312, the performance data is used to determine the performance measurement. A performance measurement can be determined using a combination of dimensions. For example, the percentage of missed repair appointments can be calculated for a particular CLEC in a particular geographic area for a particular quarter. In step 314, the performance measurement is included in the performance report. The performance report can be a written report or an on-line report.
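A report definition of the kind described in steps 308 through 314 could be captured as a small declarative structure naming the measurement, the dimension values and the reporting period. The sketch below uses invented structures and a trivial stand-in measurement; it is not the patent's report format.

```python
# Hypothetical declarative report definition (step 308).
report_definition = {
    "measurement": "percent_missed_repair_appointments",
    "dimensions": {"entity": "CLEC-A", "geography": "GA"},
    "period": {"year": 2000, "month": 3},
}

def build_report(definition, measurement_functions, performance_data):
    """Steps 310-314: obtain the data, compute the measurement, report it."""
    compute = measurement_functions[definition["measurement"]]
    value = compute(performance_data, definition["dimensions"], definition["period"])
    return {**definition, "value": value}

# A stand-in measurement so the sketch runs end to end; field names are invented.
def percent_missed(data, dims, period):
    rows = [r for r in data
            if r["entity"] == dims["entity"] and r["state"] == dims["geography"]
            and r["year"] == period["year"] and r["month"] == period["month"]]
    missed = sum(1 for r in rows if r["missed"])
    return missed / len(rows) * 100 if rows else None

data = [{"entity": "CLEC-A", "state": "GA", "year": 2000, "month": 3, "missed": True},
        {"entity": "CLEC-A", "state": "GA", "year": 2000, "month": 3, "missed": False}]

print(build_report(report_definition,
                   {"percent_missed_repair_appointments": percent_missed}, data))
```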




Performance Measurement and Analysis Platform System





FIG. 4 illustrates the logical layers of data that exist in the performance measurement and analysis platform (“PMAP”) system 401 to support the required measurement and analysis. A number of source systems 400a, 400b . . . 400n provide data to the PMAP system. Typically, the source systems correspond to the processes previously described. The source systems can be existing legacy systems and can use data formats that are not compatible with one another.




In the exemplary embodiment illustrated by FIG. 4, the LEO system 400a is a local exchange ordering system, the LON system 400b is a local order number system, and the EXACT system 400c is an exchange access control and tracking system. The LEO, LON and EXACT systems all correspond to the ordering process. The SOCS system 400d is a service order and control system and corresponds to the provisioning process. The WFA 400e system is a work force administration system and corresponds to the maintenance and repair process.




The CRIS system 400f is a customer record information system and corresponds to the billing process. The LMOS system is a line maintenance operation system and corresponds to the maintenance and repair process. The TIRKS system 400h is a trunk integrated record keeping system and corresponds to the trunk blockage process. The SOIR system 400n is a service order information system and corresponds to the emergency 911 process. As will be apparent to those skilled in the art, alternative or additional source systems may be included. In addition, a single system may provide data for multiple processes.




The data is collected from the source systems by accessing existing databases or by retrieving the data manually. The manual retrieval of information can include receiving data via facsimile or e-mail and may require human intervention to enter the data into the PMAP system 401.




Once the data is collected from the source systems 400a, 400b . . . 400n, the data is loaded into a staging database 402. When the data is received by the staging database 402, the data is not normalized. Once in the staging database, the data is filtered and normalized. Data errors are captured and handled in the staging database.
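Because the source systems may use incompatible formats, the staging step has to map each feed into one common shape before validation. The sketch below shows one assumed way to do that with per-source field mappings; the source names reuse those in FIG. 4, but the field names and file layout are invented.

```python
import csv, io

# Assumed per-source field mappings: each legacy feed names the same facts
# differently, so staging maps them onto one common record layout.
FIELD_MAPS = {
    "LEO":   {"lsr_no": "request_id", "recvd": "received_at", "st": "state"},
    "EXACT": {"asr_number": "request_id", "received_date": "received_at",
              "state_cd": "state"},
}

def stage_records(source, raw_csv):
    """Filter and normalize one source feed into the common staging format."""
    mapping = FIELD_MAPS[source]
    staged, errors = [], []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            staged.append({common: row[native] for native, common in mapping.items()})
        except KeyError as missing:        # capture data errors in staging
            errors.append({"source": source, "row": row, "missing": str(missing)})
    return staged, errors

leo_feed = "lsr_no,recvd,st\nLSR-001,1999-12-01,GA\n"
print(stage_records("LEO", leo_feed))
```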




The NODS database 404 is a normalized operational data store. In NODS, the data is validated against the business rules and data relationships and transformed to conform to the PMAP data model. Between the NODS database and the DDS database, the data undergoes an aggregation process.
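Validation against business rules and the aggregation that feeds the dimensional store could look like the following sketch. The rules, field names and grouping keys are assumptions for illustration, not the PMAP data model.

```python
from collections import defaultdict

def validate(record):
    """Apply assumed business rules; return a list of rule violations."""
    problems = []
    if record["state"] not in {"GA", "LA", "FL"}:          # assumed valid states
        problems.append("unknown state")
    if record.get("closed_at") and record["closed_at"] < record["received_at"]:
        problems.append("closed before received")
    return problems

def aggregate(records, keys=("entity", "state", "month")):
    """Roll validated detail records up to the summary level kept in the DDS."""
    totals = defaultdict(lambda: {"total": 0, "missed": 0})
    for r in records:
        cell = totals[tuple(r[k] for k in keys)]
        cell["total"] += 1
        cell["missed"] += 1 if r["missed_commitment"] else 0
    return dict(totals)

records = [
    {"entity": "CLEC-A", "state": "GA", "month": "2000-03",
     "received_at": "2000-03-01", "closed_at": "2000-03-04", "missed_commitment": True},
    {"entity": "CLEC-A", "state": "GA", "month": "2000-03",
     "received_at": "2000-03-02", "closed_at": "2000-03-03", "missed_commitment": False},
]
clean = [r for r in records if not validate(r)]
print(aggregate(clean))
```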




The DDS database 406 is the dimensional data store database. The DDS database includes performance measurements which include aggregate and summary data. If access to detailed information is required, then the data must be accessed by drilling down into the measurement data by accessing the NODS database 404 or the raw data files 412 as described below.




The PMAP system creates raw data files 412 that contain detailed information about specific local service requests, service orders, trouble tickets and other items that are typically reported. Typically, the raw data is used to recreate performance reports or to enable a user to create a custom report. A user can download raw data files, import raw data files into a program, such as a spreadsheet program, or manipulate the raw data to create a measurement in any of the performance reports.
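Since the raw data files carry the detail behind every reported measurement, a user could recreate a published figure directly from them. This sketch assumes a simple delimited raw file whose column names are invented for illustration and are not the PMAP raw-file layout.

```python
import csv, io

# Hypothetical raw data file for closed trouble tickets.
raw_file = """ticket_id,entity,state,closed_month,missed_commitment
T-100,CLEC-A,GA,2000-03,Y
T-101,CLEC-A,GA,2000-03,N
T-102,CLEC-B,GA,2000-03,N
"""

def recreate_missed_appointments(raw, entity, month):
    """Recompute the missed repair appointments percentage from raw detail."""
    rows = [r for r in csv.DictReader(io.StringIO(raw))
            if r["entity"] == entity and r["closed_month"] == month]
    missed = sum(1 for r in rows if r["missed_commitment"] == "Y")
    return missed / len(rows) * 100 if rows else None

print(recreate_missed_appointments(raw_file, "CLEC-A", "2000-03"))  # 50.0
```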




The PMAP system 401 provides a variety of reporting capabilities. The reports 410 include aggregate and CLEC-specific reports, state and regional reports, and reports directed to the different processes or subject areas. In one embodiment, the reports can be accessed via a network, such as the Internet. Typically, a user is provided with a user ID and a password in order to access the reports (and the raw data). A user generally is permitted to access reports and data related to all CLEC's and the CLEC associated with the user, but generally is not permitted to access reports and data related to another CLEC. A user associated with the incumbent LEC can be provided with broader access to the data. As will be apparent to those skilled in the art, other types of reports and other methods of reporting can also be used with the PMAP system 401.
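The access rule described here, where a CLEC user sees aggregate reports and its own data while an ILEC user is given broader access, can be expressed as a small authorization check. The user and report attributes below are assumptions made only for illustration.

```python
def can_view(user, report):
    """Assumed access rule for PMAP reports: aggregate reports are open to all
    authenticated users, CLEC-specific reports only to that CLEC or the ILEC."""
    if report["scope"] == "aggregate":
        return True
    if user["company"] == "ILEC":          # incumbent users get broader access
        return True
    return report["scope"] == "CLEC" and report["clec"] == user["company"]

user = {"id": "jdoe", "company": "CLEC-A"}
print(can_view(user, {"scope": "aggregate"}))                 # True
print(can_view(user, {"scope": "CLEC", "clec": "CLEC-A"}))    # True
print(can_view(user, {"scope": "CLEC", "clec": "CLEC-B"}))    # False
```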




Additional alternative embodiments will be apparent to those skilled in the art to which the present invention pertains without departing from its spirit and scope. For example, additional or alternative performance measurements can be used or the definition of the performance measurements described can be modified. Accordingly, the scope of the present invention is described by the appended claims and is supported by the foregoing description.



Claims
  • 1. A method for determining a set of performance measurements for interconnection of a competitive local exchange carrier to an incumbent local exchange carrier's network, comprising: defining a first performance measurement that measures timeliness of the incumbent local exchange carrier's response to a first request from the competitive local exchange carrier, the first performance measurement based on a first set of data provided by the incumbent local exchange carrier; collecting the first set of data from a first process of the incumbent local exchange carrier; using the first set of data to determine the first performance measurement; and providing the first performance measurement to the competitive local exchange carrier.
  • 2. The method of claim 1, further comprising: defining a second performance measurement that measures accuracy of the incumbent local exchange carrier's response to a second request from the competitive local exchange carrier, the second performance measurement based on a second set of data provided by the incumbent local exchange carrier; collecting the second set of data from a second process of the incumbent local exchange carrier; using the second set of data to determine the second performance measurement; and providing the second performance measurement to the competitive local exchange carrier.
  • 3. The method of claim 2, wherein the second process is a collocation process.
  • 4. The method of claim 1, wherein providing the first performance measurement to the competitive local exchange carrier comprises providing a performance report via a network.
  • 5. The method of claim 1, further comprising: providing the first set of data to the competitive local exchange carrier.
  • 6. The method of claim 1, wherein the first process is a provisioning process.
  • 7. A method for determining a performance measurement, comprising: defining the performance measurement for an interconnection provided by a local exchange carrier, the performance measurement relating to a service provided by the local exchange carrier to a competitive local exchange carrier; defining the performance data needed to determine the performance measurement for the local exchange carrier and the competitive local exchange carrier; obtaining the performance data from a process of the local exchange carrier; and using the performance data to determine the performance measurement for the local exchange carrier and the performance measurement for the competitive local exchange carrier; so that the performance measurement for the local exchange carrier can be compared to the performance measurement for the competitive local exchange carrier.
  • 8. The method of claim 7, further comprising: defining a dimension corresponding to the performance measurement.
  • 9. The method of claim 8, wherein the dimension is based upon a product or service provided by the local exchange carrier.
  • 10. The method of claim 7, wherein obtaining the performance data from a process of the local exchange carrier comprises obtaining the performance data from an ordering process.
  • 11. The method of claim 7, wherein obtaining the performance data from a process of the local exchange carrier comprises obtaining the performance data from a maintenance and repair process.
  • 12. The method of claim 7, further comprising: providing a performance report that includes the performance data to the competitive local exchange carrier.
  • 13. A method for creating a performance report that provides information about an interconnection provided by a local exchange carrier, comprising: defining a performance measurement based on performance data from a process of the local exchange carrier for the interconnection; defining a dimension for the performance measurement; obtaining the performance data corresponding to the performance measurement and the dimension; and using the performance data to create the performance report.
  • 14. The method of claim 13, wherein the dimension is based upon geography.
  • 15. The method of claim 13, wherein the performance measurement is related to timeliness.
  • 16. The method of claim 13, wherein the performance measurement is related to accuracy.
  • 17. The method of claim 13, wherein the performance measurement included in the performance report relates to all competitive local exchange carriers served by the local exchange carrier.
  • 18. The method of claim 13, wherein the performance report compares the performance measurement for the local exchange carrier to the performance measurement for a competitive local exchange carrier.
  • 19. The method of claim 13, wherein the process is an emergency process.
  • 20. The method of claim 13, wherein the process is a billing process.
RELATED APPLICATION

This U.S. patent application claims priority to U.S. Provisional Patent Application Ser. No. 60/164,682 entitled “A System and Method for Collecting and Analyzing Performance Measurements for Telecommunications Systems,” filed Nov. 10, 1999, which is incorporated herein by reference. The present application and the related U.S. provisional patent application are commonly assigned to BellSouth Intellectual Property Corporation.

US Referenced Citations (4)
Number Name Date Kind
5164983 Brown et al. Nov 1992 A
5799072 Vulcan et al. Aug 1998 A
5862203 Wulkan et al. Jan 1999 A
5898765 Teglovic et al. Apr 1999 A
Provisional Applications (1)
Number Date Country
60/164682 Nov 1999 US