Connecting graphical shapes using gestures

Information

  • Patent Grant
  • Patent Number
    10,698,599
  • Date Filed
    Monday, June 5, 2017
  • Date Issued
    Tuesday, June 30, 2020
Abstract
The present disclosure describes systems and apparatuses for connecting graphical shapes. A client digital data processor receives selection events. Each selection event tracks one or more input locations. The client digital data processor identifies a source and target graphical shape based at least on the received selection events. The client digital data processor determines a source and target connection point for a connector based at least on the source and target graphical shapes. The client digital data processor determines a length for the connector based at least on the source and target connection points. The client digital data processor generates and displays the connector based at least on the source and target connection points and the length. The present disclosure also describes methods for operating a client digital data processor as described above, and a computer-readable medium storing a program having instructions for so operating a client digital data processor.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to digital data processing, and more particularly, to methods, apparatus, systems, and computer-readable media for connecting graphical shapes. The teachings herein have application, by way of non-limiting example, to rapid and improved generation of graphical connectors in diagramming applications.


BACKGROUND

Applications typically provide graphical shapes and connectors on a virtual canvas to generate diagrams. In creating business process flows, flowcharts, or illustrations, applications provide connectors to represent relationships between two or more graphical shapes. To generate a connector between graphical shapes, traditional applications typically require users to select a connector tool, select a first shape, and drag toward a second shape using, for example, a mouse pointer, cursor, or finger.


An object of this invention is to provide improved systems and methods for digital data processing. A more particular object is to provide improved systems and methods for connecting graphical shapes.


A further object is to provide such improved systems and methods as facilitate connecting graphical shapes using rapid and intuitive gestures.


SUMMARY

The foregoing are among the objects attained by the invention which provides, in one aspect, a digital data processing system, apparatus, method, and computer-readable medium for connecting graphical shapes.


In some embodiments, the system includes a client apparatus. The client apparatus includes a display and a client digital data processor. The display is configured to present a source graphical shape and a target graphical shape. The client digital data processor is in communicative coupling with the display. The client digital data processor is configured to receive one or more selection events tracking one or more input locations. The client digital data processor is further configured to identify the source graphical shape and the target graphical shape based at least on the received selection events. The client digital data processor is still further configured to determine a source connection point and a target connection point for a connector based at least on the source graphical shape and the target graphical shape. The client digital data processor is yet further configured to determine a length for the connector based at least on the source connection point and the target connection point. The client digital data processor is still further configured to generate and display, on the display, the connector based at least on the source connection point, the target connection point, and the length.


Further aspects of the invention provide a method for connecting graphical shapes, the method comprising the steps of identifying a source graphical shape and a target graphical shape based at least on one or more received selection events tracking one or more input locations; determining a source connection point and a target connection point for a connector based at least on the source graphical shape and the target graphical shape; determining a length for the connector based at least on the source connection point and the target connection point; and generating and displaying the connector based at least on the source connection point, the target connection point, and the length.


The invention provides, in further aspects, a non-transitory computer-readable medium having stored therein a computer program product having instructions, which when executed by a client digital data processor cause the client digital data processor to: identify a source graphical shape and a target graphical shape based at least on one or more received selection events tracking one or more input locations; determine a source connection point and a target connection point for a connector based at least on the source graphical shape and the target graphical shape; determine a length for the connector based at least on the source connection point and the target connection point; and generate and display the connector based at least on the source connection point, the target connection point, and the length.


In related aspects, the one or more selection events include one or more of: a single tap; a long tap held for any of microseconds, milliseconds, and seconds; and a multi-touch event indicating a plurality of input locations.


In further related aspects, the step of receiving the one or more selection events includes receiving the one or more selection events from one or more of: conductive gloves, wand controllers, any of an augmented reality peripheral and controller, any of a virtual reality peripheral and controller, a camera, and a machine vision peripheral.


In still further related aspects, the step of identifying the source graphical shape and the target graphical shape includes identifying the source graphical shape and a plurality of target graphical shapes based on receiving a plurality of selection events, the source graphical shape being identified based on a first selection event, and the plurality of target graphical shapes being identified based on a plurality of subsequent selection events; and the step of generating and displaying the connector includes generating and displaying a plurality of connectors between the source graphical shape and the plurality of target graphical shapes based on determining a source connection point, a target connection point, and a length for each connector among the plurality of connectors.


In other related aspects, the invention further includes responding to receiving a subsequent selection event by determining whether a said connector exists between the source graphical shape and the target graphical shape; and, upon determining that a said connector exists, generating and displaying an inverted connector between the source graphical shape and the target graphical shape that replaces the connector.


In yet other related aspects, the subsequent selection event includes any of: a single tap of the connector, a double tap of the connector, a multi-touch single tap on the source graphical shape and the target graphical shape, a multi-touch double tap on the source graphical shape and the target graphical shape, and a long press on the source graphical shape followed by any of a single tap and a double tap of the target graphical shape.


In related aspects, the step of identifying the source graphical shape and the target graphical shape is based on one or more of: a relative position of the source graphical shape and the target graphical shape, a relative time that the source graphical shape was added compared to the target graphical shape, a color of any of the source graphical shape and the target graphical shape, and a size of any of the source graphical shape and the target graphical shape.


The foregoing and other aspects of the invention are evident in the text that follows and in the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

Various objects, features, and advantages of the present disclosure can be more fully appreciated with reference to the following detailed description when considered in connection with the following drawings, in which like reference numerals identify like elements. The following drawings are for the purpose of illustration only and are not intended to be limiting of the invention, the scope of which is set forth in the detailed description that follows.



FIG. 1 illustrates an example shape connection system, in accordance with some embodiments of the present invention.



FIG. 2 illustrates an example method for connecting shapes, in accordance with some embodiments of the present invention.



FIG. 3 illustrates an example interaction in which the shape connection engine processes a single selection event, in accordance with some embodiments of the present invention.



FIG. 4 illustrates an example interaction in which the shape connection engine processes multiple selection events, in accordance with some embodiments of the present invention.



FIGS. 5-6 illustrate example interactions in which the shape connection engine processes a subsequent selection event to invert a graphical connector, in accordance with some embodiments of the present invention.





DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS

The systems, apparatus, methods, and computer-readable media described herein allow for rapid generation of connections between graphical shapes. In some embodiments, the shape connection engine described herein can be used in any application that displays graphical shapes. Non-limiting example applications include a business application displaying a workflow canvas, a visual model editor, a visual code editor, a presentation or slide program, or other diagramming program. (The shape connection system and related apparatus, methods, and computer-readable medium as discussed here and elsewhere herein are sometimes referred to herein as the “shape connection engine.”)


The shape connection system and methods involve generating a connector between two graphical shapes based on receiving and processing one or more selection events on a client apparatus. In some embodiments, a selection event corresponds to receiving a single tap on a graphical shape displayed by a client mobile device such as a tablet or phone. Thus, in response to receiving sequential selection events that correspond to single taps on a source shape followed by a target shape, the shape connection engine generates and displays a graphical connector between the source shape and the target shape. In further embodiments, the selection event can correspond to receiving a tap held for several microseconds, milliseconds, or seconds on the client mobile device to identify the source shape (sometimes referred to herein as a “long press” or “long tap”). In still further aspects, the client mobile device can change visual aspects of the source shape upon receiving the selection event, such as graphically enlarging the size of the source shape or changing the color of the source shape so as to simulate that the source shape is elevated or selected. In response to receiving multiple selection events sequentially while a long press is held, additional embodiments of the shape connection engine can generate multiple connectors from the source shape to multiple target shapes corresponding to the multiple selection events.
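
By way of non-limiting illustration, the following sketch shows one way a client might track this long-press flow: a held source selection that accepts sequential taps, producing one connector per tapped target. The names here (Shape, GestureState) and the 500 ms threshold are assumptions of the example, not elements of the disclosure.

```typescript
// Illustrative gesture tracker: a long press selects the source shape, and
// each tap while the press is held adds a connector to the tapped target.
interface Shape { id: string; }
interface Connector { sourceId: string; targetId: string; }

const LONG_PRESS_MS = 500; // assumed threshold; the text allows micro- to seconds

class GestureState {
  private source: Shape | null = null;
  private pressStart = 0;
  readonly connectors: Connector[] = [];

  // A finger goes down on a shape and is held there.
  press(shape: Shape, timestampMs: number): void {
    this.source = shape;
    this.pressStart = timestampMs;
  }

  // A subsequent tap lands on another shape while the first finger is held.
  tap(target: Shape, timestampMs: number): void {
    const longPressed =
      this.source !== null && timestampMs - this.pressStart >= LONG_PRESS_MS;
    if (longPressed && this.source!.id !== target.id) {
      this.connectors.push({ sourceId: this.source!.id, targetId: target.id });
    }
  }

  // The first finger lifts; the source selection ends.
  release(): void {
    this.source = null;
  }
}
```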


In other embodiments, a selection event contains two selection points, corresponding to a single tap in which two fingers touch substantially simultaneously on two graphical shapes displayed by the client mobile device (sometimes referred to herein as “multi-touch”). In response to receiving a selection event tracking multi-touch, the shape connection engine identifies the source shape and target shape. Some embodiments of the shape connection engine can identify the source shape as the leftmost shape and the target shape as the rightmost shape on a graphical canvas. Other embodiments of the shape connection engine can identify the source shape as a first shape that was added earlier in time to the canvas, regardless of its relative position left or right. The target shape can be a second shape added later in time to the canvas, regardless of relative position. In still other embodiments, the shape connection engine can identify the source shape and target shape based on any shape property that allows the user to define an ordering criterion in the application. Non-limiting example properties include position (e.g., up, down, left, right), time that the shape was added to the canvas, color (e.g., green, red, or blue shapes), or size (e.g., small to large).
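
The ordering criterion lends itself to a small pluggable comparator, as in the non-limiting sketch below; the shape fields (x, addedAt, area) are assumptions of the example.

```typescript
// Pick source and target from a simultaneous two-shape selection using a
// pluggable ordering criterion; whichever shape sorts first becomes the source.
interface Shape { id: string; x: number; addedAt: number; area: number; }

type OrderingCriterion = (a: Shape, b: Shape) => number;

const byPosition: OrderingCriterion = (a, b) => a.x - b.x;        // leftmost is source
const byAge: OrderingCriterion = (a, b) => a.addedAt - b.addedAt; // earliest-added is source
const bySize: OrderingCriterion = (a, b) => a.area - b.area;      // smallest is source

function identifyEndpoints(
  pair: [Shape, Shape],
  criterion: OrderingCriterion = byPosition,
): { source: Shape; target: Shape } {
  const [source, target] = [...pair].sort(criterion);
  return { source, target };
}
```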


The shape connection engine determines a source connection point and target connection point based on the source shape and target shape. For example, the source connection point or target connection point can correspond to a geometrical center of the source shape or target shape, even if the user tapped a point inside the source shape or target shape that is offset from the geometrical center. Alternatively, the shape connection engine can determine the source connection point and target connection point so as to correspond to a point on a boundary of the source shape and a point on a boundary of the target shape that minimizes a distance and connector length between the source shape and target shape. The shape connection engine determines a length for the connector based on the source connection point and target connection point. In some embodiments, the connector type can be a straight connector, a right angle connector (e.g., having one or more right angles between the source connection point and target connection point) or a curved connector (e.g., having a curved path between the source connection point and target connection point instead of a straight line). Accordingly, some embodiments of the shape connection engine can determine the length further based on the connector type.
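
As a non-limiting sketch of the two connection-point strategies just described, consider axis-aligned rectangular shapes; the clamping step below is a simplifying assumption that yields near-minimal boundary points for disjoint rectangles.

```typescript
// Two connection-point strategies for rectangular shapes: geometric centers,
// or boundary points chosen to keep the connector short.
interface Rect { x: number; y: number; w: number; h: number; }
interface Point { x: number; y: number; }

const center = (r: Rect): Point => ({ x: r.x + r.w / 2, y: r.y + r.h / 2 });
const clamp = (v: number, lo: number, hi: number) => Math.min(Math.max(v, lo), hi);

// Strategy 1: connect geometric centers, regardless of where the taps landed.
function centerPoints(source: Rect, target: Rect): [Point, Point] {
  return [center(source), center(target)];
}

// Strategy 2: clamp each shape's center into the other shape's extent; for
// disjoint axis-aligned rectangles this lands on each boundary at (or near)
// the pair of points minimizing the straight-line distance.
function boundaryPoints(source: Rect, target: Rect): [Point, Point] {
  const cs = center(source);
  const ct = center(target);
  const onSource: Point = {
    x: clamp(ct.x, source.x, source.x + source.w),
    y: clamp(ct.y, source.y, source.y + source.h),
  };
  const onTarget: Point = {
    x: clamp(cs.x, target.x, target.x + target.w),
    y: clamp(cs.y, target.y, target.y + target.h),
  };
  return [onSource, onTarget];
}
```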


The shape connection engine generates and displays the connector between the source shape and target shape based at least on the source connection point, target connection point, and length. In some embodiments, the displayed connector can be a solid or dashed line between the source connection point and the target connection point. In further embodiments, the shape connection engine can display the connector using a thinner or thicker line based on a connection weight or thickness. In still further embodiments, the displayed connector can include an arrow at the beginning to indicate the source shape, an arrow at the end to indicate the target shape, or arrows at the beginning and end.


Some embodiments of the shape connection engine can invert existing connectors. The shape connection engine can receive a subsequent selection event. For example, the subsequent selection event can be a multi-touch event corresponding to a single tap tracking two input locations touched substantially simultaneously (e.g., single tapping with two fingers substantially simultaneously, one finger on an existing source shape and another finger on an existing target shape). Further embodiments of the subsequent selection event can correspond to a double tap of the two input locations touched twice substantially simultaneously (e.g., double tapping with two fingers substantially simultaneously on an existing source shape and target shape). Still further embodiments of the subsequent selection event can include receiving a user's long press on an existing source shape or target shape, followed sequentially by a single tap or double tap of an existing target shape or source shape. If the shape connection engine receives a subsequent selection event, the shape connection engine determines whether a connector already exists between the source shape and target shape. Upon an affirmative determination that a connector already exists, the shape connection engine sets the original source shape to be the new target shape and the original target shape to be the new source shape, and inverts the existing connector with the beginning connected to the new source shape and the end connected to the new target shape. In alternate embodiments, the shape connection engine removes the existing connector and generates and displays a new inverted connector with the beginning connected to the new source shape and the end connected to the new target shape. In further embodiments, if there are multiple connectors between the original source shape and original target shape, the shape connection engine selects a connector to invert based on a contextual criterion relevant to the context of the flow or diagram. Non-limiting example contextual criteria include the oldest or newest connector, the front-most or back-most connector, the highest or lowest connector in the stack, the thickest or thinnest connector, or the darkest or lightest connector. Alternately, the shape connection engine can allow the user to select which connector to invert.
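
A minimal sketch of the inversion check follows; the createdAt field and the "newest connector" tie-breaker stand in for the contextual criteria listed above and are assumptions of the example.

```typescript
// Invert an existing connector between two shapes, if one exists. When
// several connectors match, the newest is chosen as the contextual criterion.
interface Connector { sourceId: string; targetId: string; createdAt: number; }

function findBetween(connectors: Connector[], a: string, b: string): Connector[] {
  return connectors.filter(
    c => (c.sourceId === a && c.targetId === b) ||
         (c.sourceId === b && c.targetId === a),
  );
}

// Returns true when an inversion happened, false when no connector existed.
function invertIfExists(connectors: Connector[], a: string, b: string): boolean {
  const matches = findBetween(connectors, a, b);
  if (matches.length === 0) return false;
  const chosen = matches.reduce((m, c) => (c.createdAt > m.createdAt ? c : m));
  // Swap endpoints: the original target becomes the new source and vice versa.
  [chosen.sourceId, chosen.targetId] = [chosen.targetId, chosen.sourceId];
  return true;
}
```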



FIG. 1 illustrates example shape connection system 100 in accordance with some embodiments of the present invention. Shape connection system 100 includes client 102 and server 104, in communication over network 106.


Client 102 runs application 108. Application 108 is configured to display graphical shapes on a virtual canvas. Non-limiting examples of application 108 include a business application displaying a workflow canvas, a visual code editor, visual model editor, a presentation or slide program, or other diagramming program, all of the type known in the art as adapted in accord with the teachings herein. In this regard, application 108 can be implemented, for example, on one or more digital data processing systems in the conventional manner known in the art, again, as adapted in accord with the teachings herein.


Application 108 uses event processor 110, connection generator 112, and canvas display 114 to generate and display connectors among graphical shapes. Event processor 110 receives one or more selection events. In some embodiments, the selection events can represent a single multi-touch event (e.g., single tapping two or more graphical shapes using two or more fingers). In other embodiments, the shape connection engine can receive multiple selection events that represent a sequence or series of multi-touch events (e.g., double tapping two or more graphical shapes using two or more fingers). In still other embodiments, the shape connection engine can receive selection events from a sensor array that does not depend on touch input to determine input locations or selection points. By way of non-limiting example, the shape connection engine can receive input from a stylus, digital pen or pencil, conductive gloves, wand controllers, augmented or virtual reality peripherals or controllers, photo camera, video camera, or other machine vision peripherals, or other sensor arrays configured to detect a user's finger or pointer position in a two dimensional plane or three dimensional space (e.g., sensor detection in front of or behind a display) to allow a user to manipulate a graphical shape or other object directly or virtually. A non-limiting example augmented or virtual reality peripheral includes the HoloLens augmented reality environment from Microsoft Corporation in Redmond, Wash., United States. A non-limiting example camera peripheral includes the Kinect camera peripheral also from Microsoft Corporation. Connection generator 112 generates a connector between a source shape and a target shape based at least on the received selection events. Canvas display 114 displays the generated connector in application 108.
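
One non-limiting way to wire event processor 110, connection generator 112, and canvas display 114 together is sketched below; the interfaces are assumptions of this example rather than the components' actual APIs.

```typescript
// Illustrative wiring of the three components: buffered selection events flow
// into the generator, and any resulting connector is handed to the display.
interface SelectionEvent { locations: { x: number; y: number }[]; }
interface Connector { sourceId: string; targetId: string; }

interface ConnectionGenerator {
  generate(events: SelectionEvent[]): Connector | null;
}
interface CanvasDisplay {
  draw(connector: Connector): void;
}

function makeEventProcessor(gen: ConnectionGenerator, canvas: CanvasDisplay) {
  const buffer: SelectionEvent[] = [];
  return {
    receive(event: SelectionEvent): void {
      buffer.push(event);
      const connector = gen.generate(buffer);
      if (connector !== null) {
        canvas.draw(connector);
        buffer.length = 0; // reset once a connection is made
      }
    },
  };
}
```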


Some embodiments of client 102 include one or more client digital data processors. The client digital data processors can be of the type commercially available in the marketplace suitable for operation in shape connection system 100 and adapted in accord with the teachings herein, for example, in communication with applications executing in one or more rules engines, e.g. as discussed elsewhere herein. Client 102 may be implemented in mobile computers such as mobile phones, tablet computers, and personal digital assistants (PDAs), or in desktop computers, laptop computers, workstations, or other suitable apparatus adapted based on the systems and methods described herein. The client digital data processor includes central processing, memory, storage using a non-transitory computer-readable medium (e.g., a magnetic disk, solid state drive, or other storage medium), and input/output units and other constituent components (not shown) of the type conventional in the art that are programmed or otherwise configured in accord with the teachings herein.


In some embodiments, client 102 and application 108 communicate with server 104 over network 106. However, server 104 and network 106 are optional for shape connection system 100, which can be configured using the client digital data processor and application 108 on client 102. Additionally, some embodiments of application 108 and client 102 can run in an offline mode, disconnected from network 106 and server 104.


In some embodiments, server 104 includes one or more server digital data processors. The server digital data processors can be digital processors of the type commercially available in the marketplace suitable for operation in shape connection system 100 and adapted in accord with the teachings herein, for example, utilizing models and rules that form enterprise applications executing in one or more rules engines, e.g. as discussed elsewhere herein. Though server 104 is typically implemented in server-class computers such as a minicomputer, server 104 may also be implemented in desktop computers, workstations, laptop computers, tablet computers, personal digital assistants (PDAs), mobile computers, or other suitable apparatus adapted based on the systems and methods described herein. The server digital data processor includes central processing, memory, storage using a non-transitory computer-readable medium (e.g., a magnetic disk, solid state drive, or other storage medium), and input/output units and other constituent components (not shown) of the type conventional in the art that are programmed or otherwise configured in accord with the teachings herein.


In some embodiments, an enterprise can deploy shape connection system 100 in support of enterprise applications executing on server 104 remote to application 108 on client 102. Such enterprise applications can include specialized software or hardware used within a specific industry or business function (e.g., human resources, finance, healthcare, telecommunications, insurance, etc.). Alternatively, the enterprise applications can include cross-industry applications (e.g., project management), or other types of software or hardware applications.


In some embodiments, rules define the enterprise applications. Server 104 can be in communication with rules engine 116. Rules engine 116 can be in communication with rules base 118 and transactional database 120. As the enterprise application executes on a server digital data processor (e.g., server 104), shape connection system 100 may retrieve any portion of the rules that define the enterprise application from rules base 118 and process or execute the rules in response to requests or events signaled to or detected by the server digital data processors or client digital data processors at run-time (e.g., using rules engine 116).


Rules base 118 can include a rules base of the type known in the art (albeit configured in accord with the teachings herein) for storing rules (e.g., scripts, logic, controls, instructions, metadata, etc.) and other application-related information in tables, database records, database objects, and so forth. Preferred rules and rules bases can be of the type described in U.S. Pat. No. 5,826,250, entitled “Rules Bases and Methods of Access Therein” and U.S. Pat. No. 7,640,222, entitled “Rules Base Systems and Methods with Circumstance Translation,” both of which are incorporated by reference herein in their entirety. In other embodiments, rules and rules bases that are architected or operated differently may be used as well.


Some embodiments of shape connection system 100 may utilize multiple rules bases. For example, rules base 118 may be an enterprise-wide rules base in communication with rules engine 116, and domain-specific rules bases may be accessible to server 104 or application 108 on client 102 via network 106. If multiple rules bases are provided in a given embodiment, the rules bases may be of like architecture and operation or may differ in architecture and operation as well.


In some embodiments, rules comprise meta-information structures. For example, the rules can include data elements or method elements. The method elements can be procedural or declarative. For example, method elements in a rule may be procedural insofar as the rule comprises one or more of a series of ordered steps. Declarative elements in a rule may set forth (i.e., declare) a relation between variables or values (e.g., a loan rate calculation or a decision-making criterion). Alternatively, declarative elements may declare a desired computation or result without specifying how the computations should be performed or how the result should be achieved. In one non-limiting example, a declarative portion of a rule may declare a desired result of retrieving a specified value without specifying a data source for the value or a particular query language for such retrieval (e.g., SQL, CQL, .QL, etc.). In other cases, the declarative portion of a meta-information structure may comprise declarative programming language statements (e.g., SQL). Still other types of declarative meta-information structures are possible.
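
A rough sketch of a rule carrying both element kinds might look as follows; the field names are assumptions of this example, not the structures actually used by rules base 118.

```typescript
// A rule as a meta-information structure mixing procedural (ordered steps)
// and declarative (relations between values) elements.
interface ProceduralStep { order: number; action: string; }
interface DeclarativeRelation { declare: string; expression: string; }

interface Rule {
  name: string;
  procedural?: ProceduralStep[];       // ordered series of steps
  declarative?: DeclarativeRelation[]; // declared relations, not how to compute
  references?: string[];               // other rules this rule incorporates
}

// Non-limiting example: a loan-rate rule with both element kinds.
const loanRate: Rule = {
  name: "LoanRateCalculation",
  declarative: [{ declare: "rate", expression: "baseRate + riskPremium" }],
  procedural: [
    { order: 1, action: "fetch baseRate" },
    { order: 2, action: "fetch riskPremium" },
  ],
};
```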


While some rules may comprise meta-information structures that are wholly procedural and other rules may comprise meta-information structures that are wholly declarative, shape connection system 100 can also include rules that comprise both procedural and declarative meta-information structures. That is, such rules can have meta-information structure portions that are declarative, as well as meta-information structure portions that are procedural. Furthermore, rules of the illustrated embodiments that comprise meta-information structures may also reference or incorporate other rules. Those other rules may themselves in turn reference or incorporate still other rules. As a result, editing such a rule may affect one or more rules that incorporate it (if any).


An advantage of rules that comprise meta-information structures over conventional rules is that meta-information structures provide administrators with flexibility to apply code-based or model-driven techniques in development and modification of applications or computing platforms. Particularly, like models in a model-driven environment, meta-information structures comprise data elements that can be used to define aspects of a complex system at a higher level of abstraction than source code written in programming languages such as Java or C++. On the other hand, administrators may also embed programming language statements into meta-information structures if the administrators deem that to be the most efficient design for the system being developed or modified. At run-time, rules engine 116 can convert the data elements of the meta-information structures along with programming language statements (if any) automatically into executable code for the application.


Thus, in some embodiments rules may be the primary artifacts that get created, stored (e.g., in rules base 118) or otherwise manipulated to define or modify the overall functionality of rules-based enterprise applications. The enterprise applications may automate or manage various types of work in different business domains at run-time. By way of non-limiting example, rules stored in rules base 118 may be configured to define aspects of an enterprise application. For example, rules can define the user interface, decision logic, integration framework, process definition, data model, reports, or security settings of a given enterprise application.


Transactional database 120 can include databases of the type known in the art (albeit configured in accord with the teachings herein) for storing corporate, personal, governmental, or other data. Rules such as in rules base 118 may generate, update, transform, delete, store, or retrieve the data (herein collectively referred to as “processing” the data). Example data may include financial data; customer records; personal data; design-time, development-time, or runtime data related to an application; or other types of data. Transactional database 120 may store the data in tables, database records, or database objects, for example.


Transactional database 120 may or may not be present in any given embodiment. Conversely, some embodiments may use multiple transactional databases, e.g., an enterprise-wide database accessible to server 104 and branch-office specific databases accessible to client 102, by way of non-limiting example. If multiple transactional databases are provided in a given embodiment, the transactional databases may be of like architecture and operation; though, they may have differing architecture or operation, as well.


Rules engine 116 can be of the type conventionally known in the art (albeit configured in accord with the teachings herein) for use in processing or executing rules from rules base 118 to process data in (or for storage to) transactional database 120, e.g. in connection with events signaled to or detected by rules engine 116. Preferred such rules engines are of the type described in U.S. Pat. No. 5,826,250, entitled “Rules Bases and Methods of Access Therein,” U.S. Pat. No. 7,640,222, entitled “Rules Base Systems and Methods with Circumstance Translation,” and U.S. Pat. No. 8,250,525, entitled “Proactive Performance Management For Multi-User Enterprise Software Systems,” all of which are incorporated by reference in their entirety herein. Rules engine 116 may be implemented in a single software program, multiple software programs or modules, or a combination of software modules or programs. Rules engine 116 may comprise programming instructions, scripts, or rules (e.g., rules stored in rules base 118) or a combination thereof.


Some embodiments of rules engine 116 may execute on or over multiple digital data processors. For example, shape connection system 100 may invoke rules engine 116 for execution on a single digital data processor (e.g., a digital data processor on server 104 or client 102). Subsequently, shape connection system 100 may apportion, distribute, or execute portions of rules engine 116 (or, potentially, the entirety of rules engine 116) over multiple digital data processors.


Other ways of implementing or executing rules engine 116 are also possible. By way of non-limiting example, rules engine 116 may have additional distinct components or portions that can be apportioned and distributed separately. Non-limiting example components include a data access component for processing data during rule execution, a session management component for keeping track of activity across sessions of interaction with a digital data processor, or a performance monitoring component for monitoring and interacting with various system resources or event logs to manage performance thresholds.


Network 106 can include one or more networks of the type commercially available in the marketplace or otherwise suitable for supporting communication between client 102 and server 104 in accord with the teachings herein. Network 106 can be wired or wireless, a cellular network, a Local Area Network (LAN), a Wireless LAN (WLAN), a Metropolitan Area Network (MAN), a Wireless MAN (WMAN), a Wide Area Network (WAN), a Wireless WAN (WWAN), a Personal Area Network (PAN), a Wireless PAN (WPAN), or a network operating in accordance with existing IEEE 802.11, 802.11a, 802.11b, 802.11g, 802.11n, 802.16, 802.16d, 802.16e, 802.16m standards or future versions or derivatives of the above standards.



FIG. 2 illustrates an example method 200 for connecting shapes, in accordance with some embodiments of the present invention. The client digital data processor receives one or more selection events (step 210). The selection events can track input locations on a sensor array. By way of non-limiting example, the sensor array can be part of a touch-sensitive display used on a mobile phone or tablet. The input locations can be coordinates on the touch-sensitive display where the user taps or presses on a source shape or target shape. Some embodiments of the selection events can track a single input location (e.g., a location corresponding to a single tap on a graphical shape). Other embodiments of the selection events can use a single selection event to track “multi-touch” with multiple input locations (e.g., multiple locations corresponding to a single tap with multiple fingers on multiple graphical shapes).
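
One possible shape for such a selection event is sketched below in a browser setting, where a TouchEvent stands in for the patent's sensor array; the field names are assumptions of the example.

```typescript
// A selection event tracking one or more input locations; more than one
// location indicates multi-touch.
interface InputLocation { x: number; y: number; }
interface SelectionEvent {
  locations: InputLocation[];
  timestampMs: number;
}

// Adapt a browser TouchEvent (stand-in for the sensor array) into the
// selection-event structure above.
function toSelectionEvent(e: TouchEvent): SelectionEvent {
  const locations = Array.from(e.touches, t => ({ x: t.clientX, y: t.clientY }));
  return { locations, timestampMs: e.timeStamp };
}
```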


The shape connection engine identifies a source shape and a target shape based on the received selection events (step 220). For example, the shape connection engine determines the input locations from the selection events. These input locations are sometimes referred to herein as “selection points.” The shape connection engine can use boundary metadata for shapes to determine that a first input location or selection point is bounded within a first graphical shape, and a second input location or selection point is bounded within a second graphical shape. The shape connection engine proceeds to identify one graphical shape as the source shape, and the other graphical shape as the target shape. Some embodiments of the shape connection engine can identify the source shape as the leftmost shape and the target shape as the rightmost shape on the graphical canvas. Other embodiments of the shape connection engine can identify the source shape as a first shape that was added earlier in time to the canvas, regardless of its relative position left or right. The target shape can be a second shape added later in time to the canvas, regardless of relative position. In still other embodiments, the shape connection engine can identify the source shape and target shape based on any shape property that allows the user to define an ordering in the application. Non-limiting example properties include position (e.g., up, down, left, right), time that the shape was added to the canvas, color (e.g., green, red, or blue shapes), or size (e.g., small to large).
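
The boundary-metadata test reduces to a point-in-bounds check; below is a minimal sketch assuming rectangular bounds.

```typescript
// Map an input location to the shape whose boundary metadata contains it.
interface Point { x: number; y: number; }
interface Shape {
  id: string;
  bounds: { x: number; y: number; w: number; h: number }; // boundary metadata
}

function shapeAt(shapes: Shape[], p: Point): Shape | undefined {
  return shapes.find(s =>
    p.x >= s.bounds.x && p.x <= s.bounds.x + s.bounds.w &&
    p.y >= s.bounds.y && p.y <= s.bounds.y + s.bounds.h,
  );
}
```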


The shape connection engine determines a source connection point and a target connection point based on the identified source shape and target shape (step 230). For example, the source connection point or target connection point can correspond to a geometrical center of the source shape or target shape, even if the user tapped a point inside the source shape or target shape that is offset from the geometrical center. Alternatively, the shape connection engine can determine the source connection point and target connection point so as to correspond to a point on a boundary of the source shape and a point on a boundary of the target shape that minimizes a distance and connector length between the source shape and target shape.


The shape connection engine further determines a length for the connector based on the source connection point and target connection point (step 240). For example, if the connector type is a straight connector, the shape connection engine determines (xs, ys) coordinates for the source connection point and (xt, yt) coordinates for the target connection point. The shape connection engine determines the connector length according to √((xt − xs)² + (yt − ys)²). In some embodiments, the connector type can be a right angle connector (e.g., having one or more right angles between the source connection point and target connection point) or a curved connector (e.g., having a curved path between the source connection point and target connection point instead of a straight line). Accordingly, some embodiments of the shape connection engine can determine the length further based on the connector type.
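
The three connector types yield different lengths from the same pair of connection points. The sketch below assumes a single-elbow right-angle path and a quadratic curve with an arbitrarily chosen bow; both are assumptions of the example.

```typescript
// Connector length for the three types discussed in step 240.
interface Point { x: number; y: number; }

// Straight connector: Euclidean distance between the connection points.
function straightLength(s: Point, t: Point): number {
  return Math.hypot(t.x - s.x, t.y - s.y);
}

// Right-angle connector with a single elbow: the Manhattan distance.
function rightAngleLength(s: Point, t: Point): number {
  return Math.abs(t.x - s.x) + Math.abs(t.y - s.y);
}

// Curved connector: approximate arc length of a quadratic Bezier whose
// control point bows 40 px above the midpoint (an assumed curvature).
function curvedLength(s: Point, t: Point, samples = 32): number {
  const ctrl = { x: (s.x + t.x) / 2, y: (s.y + t.y) / 2 - 40 };
  let len = 0;
  let prev = s;
  for (let i = 1; i <= samples; i++) {
    const u = i / samples;
    const q = {
      x: (1 - u) ** 2 * s.x + 2 * (1 - u) * u * ctrl.x + u ** 2 * t.x,
      y: (1 - u) ** 2 * s.y + 2 * (1 - u) * u * ctrl.y + u ** 2 * t.y,
    };
    len += Math.hypot(q.x - prev.x, q.y - prev.y);
    prev = q;
  }
  return len;
}
```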


The shape connection engine generates and displays the connector between the source shape and target shape based at least on the source connection point, target connection point, and length (step 250). In some embodiments, the displayed connector can be a solid or dashed line between the source connection point and the target connection point. In further embodiments, the shape connection engine can display the connector using a thinner or thicker line based on a connection weight or thickness. In still further embodiments, the displayed connector can include an arrow at the beginning to indicate the source shape, an arrow at the end to indicate the target shape, or arrows at the beginning and end.



FIG. 3 illustrates an example interaction 300 in which the shape connection engine processes a single selection event, in accordance with some embodiments of the present invention. Client 102 receives a selection event tracking selection points 302, 304, and uses the shape connection engine to generate connector 306 in response to the received selection event.


Client 102 receives a selection event, for example from a touch-sensitive display in communicative coupling with the client digital data processor. In some embodiments the received selection event is a single event that tracks multiple selection points such as selection points 302, 304. Selection points 302, 304 can correspond to a user tapping multiple graphical shapes substantially simultaneously. The shape connection engine identifies selection points 302, 304 from the selection event. In further embodiments, the shape connection engine can receive multiple selection events within a short timeframe such as a few microseconds, milliseconds, or seconds, and process selection points 302, 304 from the multiple selection events as if the selection points were received substantially simultaneously.
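
Grouping events that arrive within a short window into one "substantially simultaneous" selection can be done with a simple time filter; the 50 ms window below is an assumed value, since the text allows anything from microseconds to seconds.

```typescript
// Treat selection events arriving within a short window of the first one as
// a single simultaneous multi-touch selection.
interface Point { x: number; y: number; }
interface SelectionEvent { point: Point; timestampMs: number; }

const SIMULTANEITY_WINDOW_MS = 50; // assumed window

function groupSimultaneous(events: SelectionEvent[]): Point[] {
  if (events.length === 0) return [];
  const t0 = events[0].timestampMs;
  return events
    .filter(e => e.timestampMs - t0 <= SIMULTANEITY_WINDOW_MS)
    .map(e => e.point);
}
```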


The shape connection engine identifies a source shape and a target shape corresponding to selection points 302, 304. First, the shape connection engine identifies graphical shapes corresponding to selection points 302, 304. For example, the shape connection engine uses shape boundary metadata to determine that selection point 302 is bounded within a first rectangle shape, and selection point 304 is bounded within a second rectangle shape. Next, the shape connection engine identifies one shape as the source shape and the other shape as the target shape. For example, some embodiments of the shape connection engine can identify source shape 308 as the leftmost shape and target shape 310 as the rightmost shape on the graphical canvas. Other embodiments of the shape connection engine can identify the source shape as a first shape that was added earlier in time to the canvas, regardless of its relative position left or right. The target shape can be a second shape added later in time to the canvas, regardless of relative position. In still other embodiments, the shape connection engine can identify the source shape and target shape based on any shape property that allows the user to define an ordering in the application. Non-limiting example properties include position (e.g., up, down, left, right), time that the shape was added to the canvas, color (e.g., green, red, or blue shapes), or size (e.g., small to large).


The shape connection engine determines a source connection point and a target connection point based on the source shape and target shape. For example, the source connection point or target connection point can correspond to a geometrical center of the source shape or target shape, even if the user tapped a point inside the source shape or target shape that is offset from the geometrical center. Alternatively, the shape connection engine can determine the source connection point and target connection point so as to correspond to a point on a boundary of the source shape and a point on a boundary of the target shape that minimizes a distance and connector length between the source shape and target shape. The shape connection engine further determines a length for the connector based on the source connection point and target connection point. In some embodiments, the connector type can be a straight connector, a right angle connector such as connector 306 (e.g., having one or more right angles between the source connection point and target connection point) or a curved connector (e.g., having a curved path between the source connection point and target connection point instead of a straight line). Accordingly, some embodiments of the shape connection engine can determine the length further based on the connector type. The shape connection engine generates and displays the connector, such as connector 306, between the source shape and target shape based at least on the source connection point, target connection point, and length.



FIG. 4 illustrates an example interaction 400 in which the shape connection engine processes multiple selection events, in accordance with some embodiments of the present invention. First, client 102 receives selection events 402, 404, and uses the shape connection engine to generate connector 406 in response to the received selection events. Next, client 102 receives selection event 408 while the user is still touching source shape 412 corresponding to selection event 402, and uses the shape connection engine to generate connector 410 in response to the received selection events.


Client 102 receives first selection event 402, for example from a touch-sensitive display in communicative coupling with the client digital data processor. In some embodiments, first selection event 402 can include metadata that indicates the user is holding a finger in substantially the same position for several microseconds, milliseconds, or seconds (e.g., a “long press” or “long tap”). FIG. 4 illustrates this long press with a clock over first selection event 402. Some embodiments of the shape connection engine can graphically enlarge the size of source shape 412 upon receiving first selection event 402, so as to simulate that source shape 412 is elevated or selected. In other embodiments, first selection event 402 can correspond to a single tap to select source shape 412, without a long press. The shape connection engine identifies a source selection point from selection event 402. The shape connection engine identifies source shape 412 corresponding to the source selection point. For example, the shape connection engine uses shape boundary metadata to determine that the source selection point is bounded within a first rectangle shape.


Client 102 receives second selection event 404, for example from the touch-sensitive display. In some embodiments, the shape connection engine identifies a target selection point from second selection event 404. Accordingly, the shape connection engine allows the user to tap the first shape to identify source shape 412, and sequentially tap the second shape to identify target shape 414. In other embodiments, the shape connection engine first determines whether first selection event 402 indicates that the user's finger is still held down, or “long pressing,” on the source selection point corresponding to source shape 412. Upon an affirmative determination that the user is still long pressing the source selection point, the shape connection engine identifies the target selection point from second selection event 404. Upon a negative determination that the user's finger is no longer long pressing the source selection point, the shape connection engine does not identify a target selection point, and does not generate or display a graphical connector to connect source shape 412 to target shape 414. Accordingly, the shape connection engine requires the user to long press the first shape to identify source shape 412, and subsequently tap the second shape to identify target shape 414.


After identifying a target selection point from second selection event 404, the shape connection engine identifies target shape 414 corresponding to the target selection point. For example, the shape connection engine uses shape boundary metadata to determine that the target selection point is bounded within a second rectangle shape. Because second selection event 404 follows first selection event 402 in time, the shape connection engine determines that first selection event 402 corresponds to source shape 412 and second selection event 404 corresponds to target shape 414.


The shape connection engine determines a source connection point and a target connection point based on source shape 412 and target shape 414. For example, the source connection point or target connection point can correspond to a geometrical center of source shape 412 or target shape 414, even if the user tapped a point inside source shape 412 or target shape 414 that is offset from the geometrical center. Alternatively, the shape connection engine can determine the source connection point and target connection point so as to correspond to a point on a boundary of source shape 412 and a point on a boundary of target shape 414 that minimizes a distance and connector length between source shape 412 and target shape 414. The shape connection engine further determines a length for connector 406 based on the source connection point and target connection point. In some embodiments, the connector type can be a straight connector, a right angle connector such as connector 406 (e.g., having one or more right angles between the source connection point and target connection point) or a curved connector (e.g., having a curved path between the source connection point and target connection point instead of a straight line). Accordingly, some embodiments of the shape connection engine can determine the length further based on the connector type. The shape connection engine generates and displays the connector, such as connector 406, between source shape 412 and target shape 414 based at least on the source connection point, target connection point, and length.


Client 102 subsequently receives third selection event 408. In some embodiments, the shape connection engine determines whether first selection event 402 indicates that the user's finger is still long pressing the source selection point corresponding to source shape 412. Upon an affirmative determination that the user's finger is still long pressing the source selection point, the shape connection engine identifies a subsequent target selection point from third selection event 408. After identifying the subsequent target selection point from third selection event 408, the shape connection engine identifies subsequent target shape 416 corresponding to the subsequent target selection point. For example, the shape connection engine uses shape boundary metadata to determine that the subsequent target selection point is bounded within a third rectangle shape. The shape connection engine determines a subsequent target connection point based on subsequent target shape 416, for example, a point on a boundary of subsequent target shape 416 that minimizes a distance and connector length between source shape 412 and subsequent target shape 416. The shape connection engine further determines a length for connector 410 based on the source connection point and subsequent target connection point. In this manner, the shape connection engine provides rapid generation of multiple connectors to connect multiple target shapes 414, 416 from long pressing a single source shape 412.



FIG. 5 illustrates an example interaction 500 in which the shape connection engine processes a subsequent selection event to invert a connector, in accordance with some embodiments of the present invention. Client 102 initially uses the shape connection engine to generate and display connector 306 in accordance with the interaction described earlier in connection with FIG. 3. Next, client 102 receives a subsequent selection event.


In some embodiments, the shape connection engine verifies whether the subsequent selection event indicates the user has performed a second single tap using two fingers, thereby identifying selection points 502, 504. In other embodiments, the shape connection engine verifies whether the subsequent selection event indicates the user has performed a double tap using two fingers, so as to identify selection points 502, 504. In alternate embodiments, the shape connection engine verifies whether the subsequent selection event indicates the user has performed a single tap or double tap substantially near connector 306. If the user has tapped two shapes rather than an existing connector, the shape connection engine determines whether a connector already exists between the shapes identified by selection points 502, 504. For example, FIG. 5 illustrates that connector 306 already exists between source shape 308 and target shape 310.
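
Deciding whether a tap landed "substantially near" a straight connector reduces to a point-to-segment distance test, sketched below with an assumed 12 px tolerance.

```typescript
// Distance from a tap to the straight segment between two connection points,
// compared against a hit tolerance.
interface Point { x: number; y: number; }

const NEAR_PX = 12; // assumed tolerance

function distanceToSegment(p: Point, a: Point, b: Point): number {
  const dx = b.x - a.x;
  const dy = b.y - a.y;
  const lenSq = dx * dx + dy * dy;
  // Parameter of p's projection onto the segment, clamped to [0, 1].
  const u = lenSq === 0
    ? 0
    : Math.max(0, Math.min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / lenSq));
  return Math.hypot(p.x - (a.x + u * dx), p.y - (a.y + u * dy));
}

function tapIsNearConnector(tap: Point, source: Point, target: Point): boolean {
  return distanceToSegment(tap, source, target) <= NEAR_PX;
}
```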


Upon an affirmative determination that connector 306 already exists, the shape connection engine sets original source shape 308 to be new target shape 510 and original target shape 310 to be new source shape 508, and inverts connector 306 to create connector 506 beginning at new source shape 508 and ending at new target shape 510. In alternate embodiments, the shape connection engine removes existing connector 306 and generates and displays connector 506 as a new inverted connector with the beginning connected to new source shape 508 and the end connected to new target shape 510. In further embodiments, if there are multiple connectors between the original source shape and original target shape, the shape connection engine selects a connector to invert based on any property relevant to the context of the flow or diagram (e.g., oldest or newest connector, front-most or back-most connector, highest or lowest connector in stack, thickest or thinnest connector, darkest or lightest connector). Alternately, the shape connection engine can allow the user to select which connector to invert.



FIG. 6 illustrates an example interaction 600 in which the shape connection engine processes a subsequent selection event to invert a connector, in accordance with some embodiments of the present invention. Client 102 initially uses the shape connection engine to generate and display connector 606 between source shape 602 and target shape 604, in accordance with the interaction described earlier in connection with FIG. 4. Next, client 102 receives subsequent selection event 608.


In some embodiments, the shape connection engine verifies whether subsequent selection event 608 indicates the user has performed a long press, thereby identifying source shape 602. Next, client 102 receives a further selection event 610. In some embodiments, further selection event 610 can be a single tap or a double tap on new source shape 612. In further embodiments, the order of selection does not matter between original source shape 602 and original target shape 604. That is, in some embodiments the shape connection engine may allow the user to select original source shape 602 first and original target shape 604 second, or select original target shape 604 first and original source shape 602 second. Although FIG. 6 illustrates the result of selecting original source shape 602 first and original target shape 604 second, in some embodiments the same result would occur if the user selected original target shape 604 first (e.g., via long press) and original source shape 602 second (e.g., via single tap or double tap). The shape connection engine determines whether a connector already exists between the shapes identified by selection events 608, 610. For example, FIG. 6 illustrates that connector 606 already exists between original source shape 602 and original target shape 604.


Upon an affirmative determination that connector 606 already exists, the shape connection engine sets original source shape 602 to be new target shape 614 and original target shape 604 to be new source shape 612, and inverts existing connector 606 to generate connector 616 with the beginning connected to new source shape 612 and the end connected to new target shape 614. In alternate embodiments, the shape connection engine removes existing connector 606 and generates and displays connector 616 as a new inverted connector with the beginning connected to new source shape 612 and the end connected to new target shape 614. In further embodiments, if there are multiple connectors between original source shape 602 and original target shape 604, the shape connection engine selects a connector to invert based on any property relevant to the context of the flow or diagram (e.g., oldest or newest connector, front-most or back-most connector, highest or lowest connector in stack, thickest or thinnest connector, darkest or lightest connector). Alternately, the shape connection engine can allow the user to select which connector to invert.


Other embodiments are within the scope and spirit of the shape connecting systems and methods. For example, the shape connecting functionality described above can be implemented using software, hardware, firmware, hardwiring, or combinations of any of these. One or more digital data processors operating in accordance with instructions may implement the functions associated with shape connecting in accordance with the present disclosure as described above. If such is the case, it is within the scope of the shape connecting systems and methods that such instructions may be stored on one or more non-transitory computer-readable storage media (e.g., a magnetic disk, solid state drive, or other storage medium). Additionally, as described earlier, modules implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.


The shape connecting systems and methods are not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments of and modifications to the shape connecting, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the shape connecting systems and methods described herein. Furthermore, although the shape connecting has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the shape connecting may be beneficially implemented in any number of environments for any number of purposes.

Claims
  • 1. A client apparatus for connecting graphical shapes, the apparatus comprising: a display configured to present a source graphical shape and a plurality of target graphical shapes, a client digital data processor in communicative coupling with the display, wherein the client digital data processor is configured to: identify the source graphical shape and the plurality of target graphical shapes based at least on a plurality of selection events determined by tracking input locations on a sensor array that is coupled to the client digital data processor, wherein the plurality of selection events comprises a multi-touch event comprising substantially simultaneous touching of a plurality of the input locations and includes (i) a first selection event in which a user holds a first finger on the sensor array in substantially a same position for a period constituting any of a long press or a long tap to identify the source graphical shape, (ii) a second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a first said target graphical shape, and (iii) a third selection event successive to the second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a second said target graphical shape; determine at least the source graphical shape based on the first selection event; determine at least the first target graphical shape based on the second selection event; determine at least the second target graphical shape based on the third selection event; determine a source connection point and a target connection point for a first connector between the source graphical shape and the first target graphical shape based at least on the source graphical shape and the first target graphical shape; determine a source connection point and a target connection point for a second connector between the source graphical shape and the second target graphical shape based at least on the source graphical shape and the second target graphical shape; determine a length for the first connector based at least on the source connection point and the target connection point of the first connector; determine a length for the second connector based at least on the source connection point and the target connection point of the second connector; generate and display, on the display, the first connector based at least on the source connection point, the target connection point, and the length of the first connector; and generate and display, on the display, the second connector based at least on the source connection point, the target connection point, and the length of the second connector.
  • 2. The apparatus of claim 1, wherein the plurality of selection events is received from one or more of: conductive gloves, wand controllers, any of an augmented reality peripheral and controller, any of a virtual reality peripheral and controller, a camera, and a machine vision peripheral.
  • 3. The apparatus of claim 1, wherein the client digital data processor is further configured to:
    responsive to receiving a selection event subsequent to the third selection event, determine whether a said connector exists between the source graphical shape and a selected one of said target graphical shapes; and
    upon a determination that a said connector exists, generate and display an inverted connector that replaces that connector between the source graphical shape and the selected target graphical shape.
  • 4. The apparatus of claim 3, wherein the subsequent selection event includes any of: a single tap of the connector, a double tap of the connector, a multi-touch single tap on the source graphical shape and the selected target graphical shape, a multi-touch double tap on the source graphical shape and the selected target graphical shape, and a long press on the source graphical shape followed by any of a single tap and a double tap of the selected target graphical shape.
  • 5. The apparatus of claim 1, wherein the client digital data processor is configured to identify the source graphical shape and one of the target graphical shapes based further on one or more of: a relative position of the source graphical shape and that target graphical shape, a relative time that the source graphical shape was added compared to that target graphical shape, a color of any of the source graphical shape and that target graphical shape, and a size of any of the source graphical shape and that target graphical shape.
  • 6. A method for operating a client digital data processor to connect graphical shapes displayed thereby, the method comprising:
    identifying a source graphical shape and a plurality of target graphical shapes based on a plurality of received selection events determined by tracking one or more input locations on a sensor array that is coupled to the client digital data processor, wherein the plurality of selection events comprises a multi-touch event comprising substantially simultaneous touching of a plurality of the input locations and includes (i) a first selection event in which a user holds a first finger on the sensor array in substantially a same position for a period constituting any of a long press or a long tap to identify the source graphical shape, (ii) a second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a first said target graphical shape, and (iii) a third selection event successive to the second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a second said target graphical shape;
    with the client digital data processor, determining at least the source graphical shape based on the first selection event;
    with the client digital data processor, determining at least the first target graphical shape based on the second selection event;
    with the client digital data processor, determining at least the second target graphical shape based on the third selection event;
    with the client digital data processor, determining a source connection point and a target connection point for a first connector between the source graphical shape and the first target graphical shape based at least on the source graphical shape and the first target graphical shape;
    with the client digital data processor, determining a source connection point and a target connection point for a second connector between the source graphical shape and the second target graphical shape based at least on the source graphical shape and the second target graphical shape;
    with the client digital data processor, determining a length for the first connector based at least on the source connection point and the target connection point of the first connector;
    with the client digital data processor, determining a length for the second connector based at least on the source connection point and the target connection point of the second connector;
    with the client digital data processor, generating and displaying the first connector based at least on the source connection point, the target connection point, and the length of the first connector; and
    with the client digital data processor, generating and displaying the second connector based at least on the source connection point, the target connection point, and the length of the second connector.
  • 7. The method of claim 6, wherein the step of receiving the plurality of selection events includes receiving the selection events from one or more of: conductive gloves, wand controllers, any of an augmented reality peripheral and controller, any of a virtual reality peripheral and controller, a camera, and a machine vision peripheral.
  • 8. The method of claim 6, further comprising:
    responding to receiving a selection event subsequent to the third selection event by determining whether a said connector exists between the source graphical shape and a selected one of said target graphical shapes; and
    upon determining that a said connector exists, generating and displaying an inverted connector that replaces that connector between the source graphical shape and the selected target graphical shape.
  • 9. The method of claim 8, wherein the subsequent selection event includes any of: a single tap of the connector, a double tap of the connector, a multi-touch single tap on the source graphical shape and the selected target graphical shape, a multi-touch double tap on the source graphical shape and the selected target graphical shape, and a long press on the source graphical shape followed by any of a single tap and a double tap of the selected target graphical shape.
  • 10. The method of claim 6, including the step of identifying the source graphical shape and one of the target graphical shapes based on one or more of: a relative position of the source graphical shape and that target graphical shape, a relative time that the source graphical shape was added compared to that target graphical shape, a color of any of the source graphical shape and that target graphical shape, and a size of any of the source graphical shape and that target graphical shape.
  • 11. A non-transitory computer-readable medium having stored therein a computer program product having instructions, which when executed by a client digital data processor cause the client digital data processor to:
    identify a source graphical shape and a plurality of target graphical shapes based at least on a plurality of received selection events determined from tracking one or more input locations on a sensor array that is coupled to the client digital data processor, wherein the plurality of selection events comprises a multi-touch event comprising substantially simultaneous touching of a plurality of the input locations and includes (i) a first selection event in which a user holds a first finger on the sensor array in substantially a same position for a period constituting any of a long press or a long tap to identify the source graphical shape, (ii) a second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a first said target graphical shape, and (iii) a third selection event successive to the second selection event in which the user, simultaneous with the first selection event, touches a finger other than the first finger on the sensor array with a tap to identify a second said target graphical shape;
    determine at least the source graphical shape based on the first selection event;
    determine at least the first target graphical shape based on the second selection event;
    determine at least the second target graphical shape based on the third selection event;
    determine a source connection point and a target connection point for a first connector between the source graphical shape and the first target graphical shape based at least on the source graphical shape and the first target graphical shape;
    determine a source connection point and a target connection point for a second connector between the source graphical shape and the second target graphical shape based at least on the source graphical shape and the second target graphical shape;
    determine a length for the first connector based at least on the source connection point and the target connection point of the first connector;
    determine a length for the second connector based at least on the source connection point and the target connection point of the second connector;
    generate and display the first connector based at least on the source connection point, the target connection point, and the length of the first connector; and
    generate and display the second connector based at least on the source connection point, the target connection point, and the length of the second connector.
  • 12. The non-transitory computer-readable medium of claim 11, wherein the computer program product has instructions which, when executed by a client digital data processor, cause the client digital data processor to receive the plurality of selection events from one or more of: conductive gloves, wand controllers, any of an augmented reality peripheral and controller, any of a virtual reality peripheral and controller, a camera, and a machine vision peripheral.
  • 13. The non-transitory computer-readable medium of claim 11, wherein the computer program product has instructions which, when executed by a client digital data processor, cause the client digital data processor to:
    respond to receiving a selection event subsequent to the third selection event by determining whether a said connector exists between the source graphical shape and a selected one of said target graphical shapes; and
    upon determining that a said connector exists, generate and display an inverted connector that replaces that connector between the source graphical shape and the selected target graphical shape.
  • 14. The non-transitory computer-readable medium of claim 13, wherein the subsequent selection event includes any of: a single tap of the connector, a double tap of the connector, a multi-touch single tap on the source graphical shape and the selected target graphical shape, a multi-touch double tap on the source graphical shape and the selected target graphical shape, and a long press on the source graphical shape followed by any of a single tap and a double tap of the selected target graphical shape.
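To make the geometric steps recited in claim 1 concrete, the following is a minimal sketch, in TypeScript, of one way a client might compute the recited source and target connection points and the connector length. It assumes axis-aligned rectangular shapes on a 2D canvas and takes each connection point as the intersection of the center-to-center segment with the shape's boundary; the names (Shape, connectionPoint, buildConnector) and the boundary-intersection strategy are illustrative assumptions and are not prescribed by the patent.

    interface Point { x: number; y: number; }

    interface Shape {
      x: number;      // top-left corner, canvas coordinates
      y: number;
      width: number;
      height: number;
    }

    function center(s: Shape): Point {
      return { x: s.x + s.width / 2, y: s.y + s.height / 2 };
    }

    // One way to obtain a "connection point": walk from the shape's center
    // toward the other shape and return where that segment crosses the
    // shape's rectangular boundary.
    function connectionPoint(s: Shape, toward: Point): Point {
      const c = center(s);
      const dx = toward.x - c.x;
      const dy = toward.y - c.y;
      if (dx === 0 && dy === 0) return c; // degenerate case: coincident centers
      // Scale factors at which the ray reaches the vertical and horizontal edges.
      const sx = dx !== 0 ? (s.width / 2) / Math.abs(dx) : Infinity;
      const sy = dy !== 0 ? (s.height / 2) / Math.abs(dy) : Infinity;
      const t = Math.min(sx, sy); // whichever edge the ray crosses first
      return { x: c.x + dx * t, y: c.y + dy * t };
    }

    // The connector length follows directly from the two connection points.
    function connectorLength(a: Point, b: Point): number {
      return Math.hypot(b.x - a.x, b.y - a.y);
    }

    function buildConnector(source: Shape, target: Shape) {
      const from = connectionPoint(source, center(target));
      const to = connectionPoint(target, center(source));
      return { from, to, length: connectorLength(from, to) };
    }

Under these assumptions the connector always meets each shape on the edge facing the other shape; other geometries, such as fixed anchor ports, would satisfy the claim language equally well.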
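The selection protocol of claims 1, 6, and 11 (a sustained long press identifies the source while taps by other fingers, made during the press, identify successive targets) could be tracked with a small state machine over touch events. The sketch below reuses the Shape type from the previous example; the 500 ms long-press threshold, the TouchPoint record, and the hitTest callback are assumptions, since the claims fix neither a threshold nor an event model.

    interface TouchPoint { id: number; x: number; y: number; }

    const LONG_PRESS_MS = 500; // assumed threshold for a "long press or long tap"

    class ConnectGesture {
      private pressId: number | null = null; // finger held on the source shape
      private pressStart = 0;
      private source: Shape | null = null;
      private targets: Shape[] = [];

      constructor(private hitTest: (x: number, y: number) => Shape | null) {}

      // First finger down starts a candidate long press on the source shape;
      // later finger-downs, while the press is held, tap successive targets.
      onTouchStart(t: TouchPoint, now: number): void {
        if (this.pressId === null) {
          this.pressId = t.id;
          this.pressStart = now;
          this.source = this.hitTest(t.x, t.y);
        } else if (this.source !== null && now - this.pressStart >= LONG_PRESS_MS) {
          const shape = this.hitTest(t.x, t.y);
          if (shape !== null && shape !== this.source) this.targets.push(shape);
        }
      }

      // When the holding finger lifts, the gesture completes; each collected
      // target would then receive a connector from the source (for example,
      // via buildConnector above).
      onTouchEnd(t: TouchPoint): { source: Shape; targets: Shape[] } | null {
        if (t.id !== this.pressId) return null; // a tapping finger lifted; ignore
        const done = this.source !== null && this.targets.length > 0
          ? { source: this.source as Shape, targets: this.targets }
          : null;
        this.pressId = null;
        this.source = null;
        this.targets = [];
        return done;
      }
    }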
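Claims 3, 8, and 13 recite replacing an existing connector with an inverted one when a subsequent selection event (of the kinds enumerated in claims 4, 9, and 14) names an already-connected pair. A minimal sketch, assuming connectors are stored as simple from/to records over the shapes; the create-when-missing branch is an illustrative assumption about the overall apparatus behavior rather than claim text:

    interface Connector { from: Shape; to: Shape; }

    // If the pair is not yet connected, connect it; if it is, replace the
    // existing connector with its inversion (endpoints swapped).
    function toggleConnector(connectors: Connector[], source: Shape, target: Shape): Connector[] {
      const existing = connectors.find(
        c => (c.from === source && c.to === target) ||
             (c.from === target && c.to === source));
      if (existing === undefined) {
        return [...connectors, { from: source, to: target }];
      }
      const inverted: Connector = { from: existing.to, to: existing.from };
      return connectors.map(c => (c === existing ? inverted : c));
    }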
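Claims 5 and 10 list factors for identifying which shapes a selection refers to: relative position, relative time of addition, color, and size. One hypothetical way to combine them is a scoring pass over candidate shapes near the touch; the weights and the extra addedAt and color fields below are illustrative assumptions only.

    interface CanvasShape extends Shape {
      addedAt: number; // when the shape was added to the canvas
      color: string;
    }

    // Rank candidate shapes by the factors listed in claims 5 and 10;
    // the highest-scoring candidate wins. Weights are arbitrary choices.
    function pickTarget(candidates: CanvasShape[], source: CanvasShape, touch: Point): CanvasShape | null {
      let best: CanvasShape | null = null;
      let bestScore = -Infinity;
      for (const c of candidates) {
        if (c === source) continue;
        const ctr = center(c);
        let score = -Math.hypot(ctr.x - touch.x, ctr.y - touch.y); // relative position
        if (c.addedAt > source.addedAt) score += 10;   // added after the source
        if (c.color === source.color) score += 5;      // matching color
        score -= 0.1 * Math.log1p(c.width * c.height); // mild penalty for large shapes
        if (score > bestScore) { bestScore = score; best = c; }
      }
      return best;
    }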
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Ser. No. 62/345,367, filed Jun. 3, 2016 and entitled “Connecting Graphical Shapes Using Gestures,” the entire contents of which are incorporated herein by reference.

Related Publications (1)
Number Date Country
20170351425 A1 Dec 2017 US
Provisional Applications (1)
Number Date Country
62345367 Jun 2016 US