The present invention generally relates to the field of computer-aided design (CAD). In particular, the present invention is directed to a method and associated apparatus to enable automated help with use of a CAD design tool.
Computer-Aided Design (CAD) programs are typically utilized to create, model, and optimize the design of a product or article for subsequent manufacture, often by rendering a 3D surface representation of the designed product. CAD tools typically include a user interface for enabling a user to input design requirements, constraints, required performance criteria, testing criteria, and required elements or materials.
Systems to provide help to users of software products are known in the industry. One is the menu-driven help system in applications such as Microsoft Word, in which a user navigates a menu of search topics to find applicable information, either by reviewing an index of help topics or by entering keywords that search the index and the associated help content.
Another utility providing general assistance is Ask.com, which includes a Q&A community in which specific questions on general topics can be submitted to groups of experts in those topics. See http://www.ask.com/answers/browse?qsrc=321&qo=channelNavigation&o=0&1=dir, last visited Jun. 12, 2014.
Finally, remote help systems are also known, by which computer support technicians can access users' computers remotely in order to provide support. See, for example, http://www.apextechservices.com/it-consulting/g110202013.aspx?gclid=COvK27np274CFc9xOgodeBcAig, last visited Jun. 12, 2014.
Accordingly, while prior art exists for providing remote technical support, that support relies either on static menus or on users articulating their questions in a form understandable to the help system. A need has arisen in the art for help systems that rely on neither static menus nor user articulation of help requests.
In an aspect, a method of providing automated help includes receiving, at a computer system including at least one processor and at least one computer readable medium storing machine-executable instructions, user inputs for utilizing a plurality of functions of a computer program. The method includes detecting, at the computer system, a plurality of user actions utilizing the plurality of functions, as a function of the user inputs. The method includes detecting, at the computer system, an action trigger indicating a user help condition, wherein detecting further comprises identifying, in a help trigger database associating a plurality of action triggers with user help conditions, an action trigger matching at least a user action of the plurality of user actions. The method includes identifying, by the computer system, a first area of expertise associated with the user help condition. The method includes, based on the identified first area of expertise associated with the user help condition, identifying, by the computer system, expertise to provide assistance to the user.
In another aspect, a system for automated help includes a computer system, wherein the computer system is configured to receive user inputs for utilizing a plurality of functions of a computer program, detect a plurality of user actions utilizing the plurality of functions as a function of the user inputs, detect an action trigger indicating a user help condition, wherein detecting further comprises identifying, in a help trigger database associating a plurality of action triggers with user help conditions, an action trigger matching at least a user action of the plurality of user actions, identify a first area of expertise associated with the user help condition, and, based on the identified first area of expertise associated with the user help condition, identify expertise to provide assistance to the user.
In another aspect, a non-transitory computer readable medium storing machine-executable instructions includes a plurality of modules that are executed by a processor, the instructions causing the processor to execute a method. The method includes receiving, at a computer system including at least one processor and at least one computer readable medium storing machine-executable instructions, user inputs for utilizing a plurality of functions of a computer program. The method includes detecting, at the computer system, a plurality of user actions utilizing the plurality of functions, as a function of the user inputs. The method includes detecting, at the computer system, an action trigger indicating a user help condition, wherein detecting further comprises identifying, in a help trigger database associating a plurality of action triggers with user help conditions, an action trigger matching at least a user action of the plurality of user actions. The method includes identifying, by the computer system, a first area of expertise associated with the user help condition. The method includes, based on the identified first area of expertise associated with the user help condition, identifying, by the computer system, expertise to provide assistance to the user.
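By way of non-limiting illustration only, the claimed steps may be outlined in code. The following Java sketch is an assumption-laden outline rather than a required implementation: the class and field names (e.g., AutomatedHelpMethod, helpTriggerDatabase, expertiseByCondition) are hypothetical, and simple map lookups stand in for the database operations described herein.

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;

// Skeletal outline of the claimed method; all names and data structures are illustrative assumptions.
public class AutomatedHelpMethod {

    // Help trigger database: associates action triggers with user help conditions.
    private final Map<String, String> helpTriggerDatabase;
    // Associates each user help condition with a first area of expertise.
    private final Map<String, String> expertiseByCondition;

    public AutomatedHelpMethod(Map<String, String> helpTriggerDatabase,
                               Map<String, String> expertiseByCondition) {
        this.helpTriggerDatabase = helpTriggerDatabase;
        this.expertiseByCondition = expertiseByCondition;
    }

    // Receives detected user actions and returns the area of expertise, if any,
    // for which assistance should be identified.
    public Optional<String> identifyExpertise(List<String> detectedUserActions) {
        // Detect an action trigger indicating a user help condition by matching
        // user actions against the help trigger database.
        Optional<String> helpCondition = detectedUserActions.stream()
                .filter(helpTriggerDatabase::containsKey)
                .map(helpTriggerDatabase::get)
                .findFirst();
        // Identify the first area of expertise associated with the user help condition;
        // a downstream step would then identify expertise (e.g., an expert) to assist the user.
        return helpCondition.map(expertiseByCondition::get);
    }
}
```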
These and other aspects and features of non-limiting embodiments of the present invention will become apparent to those skilled in the art upon review of the following description of specific non-limiting embodiments of the invention in conjunction with the accompanying drawings.
For the purpose of illustrating the invention, the drawings show aspects of one or more embodiments of the invention. However, it should be understood that the present invention is not limited to the precise arrangements and instrumentalities shown in the drawings, wherein:
In the description to follow, flowcharts are used to indicate the methods in accordance with a first embodiment of the invention. These flowcharts indicate corresponding sequences of computer code that accomplish the depicted method steps.
The invention could be embodied in one of several ways. All of the code modules could be written as a standalone program in any computer language, such as Java or C++. Alternatively, the code modules of the invention could be embedded within an existing 3D CAD/CAM program, such as SolidWorks, AutoCAD, zwCAD from ZWCAD Software Co, TurboCAD from IMSI/Design LLC, or others. If the code modules are separately programmed, they can interact with a CAD product through its application program interface (API). Any code modules of the invention that are integrated into an existing CAD product would be written in an applicable programming language for CAD products, such as LISP.
Herein, a “structure” (or the “product” that is designed) may be any object or part having a particular geometry. A 3D computer “model” may be a virtual representation of a structure and may be created using an appropriate CAD program, such as those set forth above. A “designer” or “user” may be the designer of a 3D computer model, a purchaser, an agent of the purchaser, a consumer, a home user, or a customer, among others. Examples of a structure include a piece of sheet metal, a solid cube, a cylindrical pipe, an injection molded plastic toy, an article of clothing such as a shirt made of cotton, and an assembly of various parts such as a vehicle, among others. A project (or design) may refer to a CAD model of a part or an assembly of CAD models of parts that may be a virtual representation of a particular structure and may be created using one or more appropriate CAD programs.
As set forth above, the invention can be designed either as a standalone program that operates a separate CAD program through its APIs or as a module integrated into a conventional CAD program such as those listed above. In the invention, CAD program 200 includes conventional CAD functions modules 210 that are typically found in commercial CAD programs, enabling the design of one or more objects. As a result, the conventional CAD functions 210 produce a CAD model 220 of the object to be designed. The CAD model 220 includes conventional information (such as composition, object dimensions, surface bends, welds, and the like). The user interacts with and controls the conventional CAD functions modules (hereinafter “CAD program functions” or “CAD functions” 210), and thereby manipulates the resultant CAD model 220, through graphical user interface (GUI) 230.
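By way of non-limiting example, the conventional information carried by CAD model 220 might be represented by a simple data structure such as the following sketch; the field names are assumptions chosen for illustration and are not a required schema.

```java
import java.util.List;

// Illustrative stand-in for the conventional information carried by a CAD model;
// field names are assumptions, not a required schema.
public class CadModel {
    public String composition;             // e.g., "6061 aluminum" or "cotton"
    public double[] boundingDimensionsMm;  // overall object dimensions
    public List<String> surfaceBends;      // identifiers of bend features
    public List<String> welds;             // e.g., "TIG welding", "spot weld"
}
```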
In the invention, a help center module 240 is provided to enable automated help intervention. As will be described in more detail below, the help center module 240 includes an observer software module 245 that reads and records recent interactions between the user (via GUI 230) and the conventional CAD program functions 210. As will be described in more detail below, the observer software module 245 includes a design state database 250 that records changes and updates to the CAD model 220. The design state database (as well as all the other databases described herein) can be any commercial relational database product, such as DB2 from IBM or Oracle Database 12c, that can store the information and related tables of information described and can be either directly a part of the CAD program 200 (as shown) or accessed by the program modules of the invention remotely (such as in a cloud service environment).
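A minimal sketch of how observer software module 245 might record recent interactions into design state database 250 is shown below; the in-memory list stands in for the relational database described above, and all identifiers are illustrative assumptions.

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of an observer that records user interactions with CAD functions.
public class ObserverModule {

    // One recorded interaction between the user and a CAD function.
    public static class UserAction {
        final String functionName;   // e.g., "sheet-metal bend", "weld"
        final Instant timestamp;
        final long durationSeconds;  // time the user spent on the action

        UserAction(String functionName, Instant timestamp, long durationSeconds) {
            this.functionName = functionName;
            this.timestamp = timestamp;
            this.durationSeconds = durationSeconds;
        }
    }

    // Stand-in for the design state database; a real system would use a relational store.
    private final List<UserAction> designStateDatabase = new ArrayList<>();

    // Called whenever the GUI reports that the user invoked a CAD function.
    public void recordAction(String functionName, long durationSeconds) {
        designStateDatabase.add(new UserAction(functionName, Instant.now(), durationSeconds));
    }

    // Returns the most recent interactions, e.g., for the help center's analysis.
    public List<UserAction> recentActions(int count) {
        int from = Math.max(0, designStateDatabase.size() - count);
        return new ArrayList<>(designStateDatabase.subList(from, designStateDatabase.size()));
    }
}
```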
The help center module 240 further includes an expert remote control unit module 255, which in turn includes an expert interface 257. The expert remote control unit module 255 enables experts to remotely access the CAD program 200 through the expert interface 257. In a preferred embodiment of the invention, the expert interface 257 is part of the GUI 230 by which the user interacts with CAD program 200. The expert interface is shown in
The invention further includes a help center “switchboard” module 260, which utilizes the output of the design state database 250 to select an expert from the expert database 270 who has listed credentials that are similar to the user's requirements for assistance as indicated by the help center module 240. As a result, an appropriate expert 280 is selected, and a connection is established between the user and expert 280 by any one of a number of known communication technologies (such as, by way of example and not limitation, video chat such as Skype; landline telephone connection; cellphone/smartphone connection; emails; instant messages; and communications over social media portals such as Facebook and Twitter). The help center switchboard module 260 further includes an analyzer module 275 and an optional payment module 277. As will be described in more detail below, the analyzer module 275 compares the output of the design state database 250, as well as (optionally) inputs from the user via GUI 230 and the CAD model 220 itself, to the entries in the expert database 270, so as to determine which expert 280-280N should be selected to provide support to the user. Finally, as described in more detail below, each of the experts 280-280N may have their own version of CAD program 200 for entering information into the CAD model 220, while interacting with the user (for example, so that the expert can explain her recommended changes to the CAD model 220) through the expert interface 257. Each expert of experts 280-280N may have access to one or more additional computing resources, including without limitation an Expert CAD program 285 or other computer modeling program or module.
Still referring to
With reference to
As previously stated, and as will be described in more detail below with reference to
In step 410, the expertise area of the associated action is characterized by the observer software module 245 and recorded into the design state database 250. As shown in
Therefore, in this step 410, the observer software module 245 compares incoming user actions from GUI 230 and the CAD program functions 210 to the database shown in
Note also that the calculated running total shown in column 624 reflects two other aspects: the multiplier/weighing factors 616 and certain actions for which running time is not assigned. As shown most clearly with action 620C, the actual time spent was five seconds, yet the calculated running time associated with that action is ten seconds. That is because the help center module 240 multiplies the time spent for each action in column 622 by the multiplier (or weighing factor) 616 for that action, and the result is included in the calculated running total 624. Note that while an embodiment of the invention adds weighing factors to the running total calculation and discounts one or more steps from the running totals, it is to be understood that the invention can be practiced without applying one or both of those aspects; for example, the invention could calculate running time solely as a function of the time spent on each action, apply only the weighing factors, or only discount certain steps.
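The weighted running-total calculation described above may be sketched as follows. The class and method names and the particular expertise areas are assumptions; the example values mirror the description above (five seconds of actual time with a multiplier of two yielding ten seconds of calculated running time, and an action whose running time is discounted entirely).

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the weighted running-total calculation; identifiers and values are illustrative.
public class RunningTimeCalculator {

    // Multiplier (weighing factor) per expertise area; 0 means the action is discounted.
    private final Map<String, Double> multipliers = new LinkedHashMap<>();

    // Calculated running total per expertise area, in seconds.
    private final Map<String, Double> runningTotals = new LinkedHashMap<>();

    public RunningTimeCalculator() {
        multipliers.put("welding", 2.0);          // weighted more heavily, as with action 620C
        multipliers.put("sheet metal", 1.0);
        multipliers.put("file management", 0.0);  // an action carrying no running time
    }

    // Adds one observed action: time spent multiplied by the area's weighing factor.
    public void addAction(String expertiseArea, double secondsSpent) {
        double multiplier = multipliers.getOrDefault(expertiseArea, 1.0);
        runningTotals.merge(expertiseArea, secondsSpent * multiplier, Double::sum);
    }

    // Expertise area with the highest calculated running total, used to suggest a help topic.
    public String dominantArea() {
        return runningTotals.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse(null);
    }

    public static void main(String[] args) {
        RunningTimeCalculator calc = new RunningTimeCalculator();
        calc.addAction("welding", 5.0);           // 5 s actual x 2.0 multiplier = 10 s added
        calc.addAction("sheet metal", 8.0);       // 8 s x 1.0 = 8 s
        calc.addAction("file management", 30.0);  // discounted entirely
        System.out.println(calc.dominantArea());  // prints "welding"
    }
}
```

Under these assumptions, a multiplier of zero corresponds to discounting an action entirely, while omitting multipliers altogether reduces the calculation to the plain time spent on each action.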
With reference to
In step 510, the help center module 240 presents the resultant data to the user for selection of help topics. A schematic view 700 of the GUI 230 screen providing that information to the user is shown in
Continuing with a description of the method of
In step 525, the analyzer module 275 of the help center switchboard module 260 carries out a search on the expert database 270, looking for experts that have expertise in at least the primary expertise area as indicated either by the user (by virtue of the selection in step 515) or by the calculations from help center module 240 (by determining the expertise area with the highest total running time), as well as (optionally) looking for such experts that do not have “restrictions” on their expertise that would render them inapplicable due to the attributes of the CAD model 220 as discussed above.
In
In the invention, a quantitative indication is provided of the relative expertise of the listed experts in each of the expertise areas listed in
Finally, the database table includes listings of restrictions 810. So, for example, while expert Baker offers expertise in welding (as indicated by his score of 6), he does not work on tungsten inert gas (TIG) welding. So, if the CAD model 220 indicates that type of welding is used, Baker will not be chosen even if his expertise score would otherwise indicate that he is suitable. Optionally, other information could be provided in column 810 on what a given expert will (or will not) provide support for.
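One possible sketch of the search over expert database 270 is shown below, including the restriction check illustrated by the Baker/TIG example and the tie-breaking on a next-most-important expertise area described in the following step; the Expert fields, scoring scale, and method names are illustrative assumptions rather than a required schema.

```java
import java.util.*;

// Sketch of the analyzer's expert-selection step; data layout is hypothetical.
public class ExpertSelector {

    public static class Expert {
        final String name;
        final Map<String, Integer> expertiseScores; // e.g., {"welding": 6, "sheet metal": 4}
        final Set<String> restrictions;             // e.g., {"TIG welding"}
        final boolean available;

        Expert(String name, Map<String, Integer> scores, Set<String> restrictions, boolean available) {
            this.name = name;
            this.expertiseScores = scores;
            this.restrictions = restrictions;
            this.available = available;
        }

        int score(String area) { return expertiseScores.getOrDefault(area, 0); }
    }

    // Picks the available expert with the highest score in the primary area who is not
    // disqualified by a restriction matching an attribute of the CAD model; ties are broken
    // by the score in the secondary (next-most-important) expertise area.
    public static Optional<Expert> select(List<Expert> experts, String primaryArea,
                                          String secondaryArea, Set<String> modelAttributes) {
        return experts.stream()
                .filter(e -> e.available)
                .filter(e -> e.score(primaryArea) > 0)
                .filter(e -> Collections.disjoint(e.restrictions, modelAttributes))
                .max(Comparator.<Expert>comparingInt(e -> e.score(primaryArea))
                        .thenComparingInt(e -> e.score(secondaryArea)));
    }
}
```

Under these assumptions, if the CAD model 220 uses TIG welding and expert Baker lists “TIG welding” as a restriction, Baker is excluded even though his welding score of 6 might otherwise make him the highest-ranked candidate.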
Then, as a result of step 520, the analyzer module 275 of help center switchboard module 260 determines the best expert to provide help, based on the highest expertise score for the desired expertise area and on availability (and optionally, on whether the expert would be disqualified because a restriction 810 is applicable for the desired expertise area). In step 530, the analyzer module 275 determines if there is a tie between designated experts. If there is no tie, the highest-ranked expert is designated in step 540. If there is a tie, in step 535 the analyzer module searches the expert database 270 to compare the relative expertise of the tied experts for the next-most-important expertise area. This second expertise area is either indicated by the user (in step 515 and as shown in and described with reference to
In step 545, the designated expert is then identified to the user by a screen view in GUI 230. The presented screen includes an option enabling the user to allow the expert to assume remote control of the CAD program 200, in particular the CAD program functions 210 and the resulting CAD model 220. If in step 550 the user grants the expert remote access capability, the process continues in step 555 to the flowchart of
With reference to
Referring now to
At step 1015, and still referring to
In an embodiment, and continuing to refer to
Still viewing
With continued reference to
Still referring to
With continued reference to
In an embodiment, computer system may not trigger upon detection of user stagnation alone; for instance, user stagnation combined with a second trigger may indicate user need for help, upon which computer system may identify an area of expertise associated with the second trigger.
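One way such a composite trigger might be detected is sketched below; the five-minute stagnation threshold and the undo-based second trigger are assumptions chosen purely for illustration, not limitations of the invention.

```java
import java.time.Duration;
import java.time.Instant;

// Sketch of a composite help trigger: stagnation alone is not enough.
public class HelpTriggerDetector {

    private static final Duration STAGNATION_THRESHOLD = Duration.ofMinutes(5); // assumed value

    private Instant lastUserAction = Instant.now();
    private String secondTrigger;   // e.g., repeated undo of the same feature; null if none

    public void onUserAction(String actionName) {
        lastUserAction = Instant.now();
        // A real implementation would consult the help trigger database here.
        if (actionName.startsWith("undo")) {
            secondTrigger = actionName;
        }
    }

    // Help is indicated only when stagnation coincides with a second trigger.
    public boolean helpConditionDetected() {
        boolean stagnant = Duration.between(lastUserAction, Instant.now())
                .compareTo(STAGNATION_THRESHOLD) > 0;
        return stagnant && secondTrigger != null;
    }

    // The area of expertise is then derived from the second trigger, not from stagnation itself.
    public String triggerForExpertiseLookup() {
        return secondTrigger;
    }
}
```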
Still viewing
At step 1020, computer system identifies a first area of expertise associated with the user help condition. This may be implemented as described above in reference to
In an embodiment, and still referring to
At step 1020, and still referring to
It is to be noted that any one or more of the aspects and embodiments described herein may be conveniently implemented using one or more machines (e.g., one or more computing devices that are utilized as a user computing device for an electronic document, one or more server devices, such as a document server, etc.) programmed according to the teachings of the present specification, as will be apparent to those of ordinary skill in the computer art. Appropriate software coding can readily be prepared by skilled programmers based on the teachings of the present disclosure, as will be apparent to those of ordinary skill in the software art. Aspects and implementations discussed above employing software and/or software modules may also include appropriate hardware for assisting in the implementation of the machine executable instructions of the software and/or software module.
Such software may be a computer program product that employs a machine-readable storage medium. A machine-readable storage medium may be any medium that is capable of storing and/or encoding a sequence of instructions for execution by a machine (e.g., a computing device) and that causes the machine to perform any one of the methodologies and/or embodiments described herein. Examples of a machine-readable storage medium include, but are not limited to, a magnetic disk, an optical disc (e.g., CD, CD-R, DVD, DVD-R, etc.), a magneto-optical disk, a read-only memory “ROM” device, a random access memory “RAM” device, a magnetic card, an optical card, a solid-state memory device, an EPROM, an EEPROM, and any combinations thereof. A machine-readable medium, as used herein, is intended to include a single medium as well as a collection of physically separate media, such as, for example, a collection of compact discs or one or more hard disk drives in combination with a computer memory. As used herein, a machine-readable storage medium does not include transitory forms of signal transmission.
Such software may also include information (e.g., data) carried as a data signal on a data carrier, such as a carrier wave. For example, machine-executable information may be included as a data-carrying signal embodied in a data carrier in which the signal encodes a sequence of instructions, or portion thereof, for execution by a machine (e.g., a computing device) and any related information (e.g., data structures and data) that causes the machine to perform any one of the methodologies and/or embodiments described herein.
Examples of a computing device include, but are not limited to, an electronic book reading device, a computer workstation, a terminal computer, a server computer, a handheld device (e.g., a tablet computer, a smartphone, etc.), a web appliance, a network router, a network switch, a network bridge, any machine capable of executing a sequence of instructions that specify an action to be taken by that machine, and any combinations thereof. In one example, a computing device may include and/or be included in a kiosk.
Memory 1108 may include various components (e.g., machine readable media) including, but not limited to, a random access memory component, a read only component, and any combinations thereof. In one example, a basic input/output system 1116 (BIOS), including basic routines that help to transfer information between elements within computer system 1100, such as during start-up, may be stored in memory 1108. Memory 1108 may also include (e.g., stored on one or more machine-readable media) instructions (e.g., software) 1120 embodying any one or more of the aspects and/or methodologies of the present disclosure. In another example, memory 1108 may further include any number of program modules including, but not limited to, an operating system, one or more application programs, other program modules, program data, and any combinations thereof.
Computer system 1100 may also include a storage device 1124. Examples of a storage device (e.g., storage device 1124) include, but are not limited to, a hard disk drive, a magnetic disk drive, an optical disc drive in combination with an optical medium, a solid-state memory device, and any combinations thereof. Storage device 1124 may be connected to bus 1112 by an appropriate interface (not shown). Example interfaces include, but are not limited to, SCSI, advanced technology attachment (ATA), serial ATA, universal serial bus (USB), IEEE 1394 (FIREWIRE), and any combinations thereof. In one example, storage device 1124 (or one or more components thereof) may be removably interfaced with computer system 1100 (e.g., via an external port connector (not shown)). Particularly, storage device 1124 and an associated machine-readable medium 1128 may provide nonvolatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for computer system 1100. In one example, software 1120 may reside, completely or partially, within machine-readable medium 1128. In another example, software 1120 may reside, completely or partially, within processor 1104.
Computer system 1100 may also include an input device 1132. In one example, a user of computer system 1100 may enter commands and/or other information into computer system 1100 via input device 1132. Examples of an input device 1132 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device, a joystick, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), a cursor control device (e.g., a mouse), a touchpad, an optical scanner, a video capture device (e.g., a still camera, a video camera), a touchscreen, and any combinations thereof. Input device 1132 may be interfaced to bus 1112 via any of a variety of interfaces (not shown) including, but not limited to, a serial interface, a parallel interface, a game port, a USB interface, a FIREWIRE interface, a direct interface to bus 1112, and any combinations thereof. Input device 1132 may include a touch screen interface that may be a part of or separate from display 1136, discussed further below. Input device 1132 may be utilized as a user selection device for selecting one or more graphical representations in a graphical interface as described above.
A user may also input commands and/or other information to computer system 1100 via storage device 1124 (e.g., a removable disk drive, a flash drive, etc.) and/or network interface device 1140. A network interface device, such as network interface device 1140, may be utilized for connecting computer system 1100 to one or more of a variety of networks, such as network 1144, and one or more remote devices 1148 connected thereto. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof. Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network (e.g., a network associated with an office, a building, a campus or other relatively small geographic space), a telephone network, a data network associated with a telephone/voice provider (e.g., a mobile communications provider data and/or voice network), a direct connection between two computing devices, and any combinations thereof. A network, such as network 1144, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used. Information (e.g., data, software 1120, etc.) may be communicated to and/or from computer system 1100 via network interface device 1140.
Computer system 1100 may further include a video display adapter 1152 for communicating a displayable image to a display device, such as display device 1136. Examples of a display device include, but are not limited to, a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma display, a light emitting diode (LED) display, and any combinations thereof. Display adapter 1152 and display device 1136 may be utilized in combination with processor 1104 to provide graphical representations of aspects of the present disclosure. In addition to a display device, computer system 1100 may include one or more other peripheral output devices including, but not limited to, an audio speaker, a printer, and any combinations thereof. Such peripheral output devices may be connected to bus 1112 via a peripheral interface 1156. Examples of a peripheral interface include, but are not limited to, a serial port, a USB connection, a FIREWIRE connection, a parallel connection, and any combinations thereof.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments, what has been described herein is merely illustrative of the application of the principles of the present invention. Additionally, although particular methods herein may be illustrated and/or described as being performed in a specific order, the ordering may be varied within ordinary skill to achieve methods, systems, and software according to the present disclosure. Accordingly, this description is meant to be taken only by way of example, and not to otherwise limit the scope of this invention.
The present application is a continuation in part of U.S. patent application Ser. No. 14/313,676, filed on Jun. 24, 2014, and titled “Systems and Methods for Automated Help,” the entirety of which is incorporated herein by reference.
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14313676 | Jun 2014 | US |
Child | 16020297 | — | US |