Method and apparatus for generating a linguistic representation of raw input data

Information

  • Patent Grant
  • Patent Number
    10,776,561
  • Date Filed
    Tuesday, January 15, 2013
  • Date Issued
    Tuesday, September 15, 2020
  • CPC
    • G06F40/103
    • G06F16/95
    • G06F40/106
    • G06F40/40
    • G06F16/211
    • G06F16/258
  • Field of Search
    • US
    • 715/243
  • International Classifications
    • H04N21/4728
    • H04N21/472
    • G06F40/103
    • G06F16/95
    • G06F40/40
    • G06F40/106
    • G06F16/21
    • G06F16/25
    • Term Extension
      146
Abstract
Methods, apparatuses, and computer program products are described herein that are configured to be embodied as and/or performed by a document planner. In some example embodiments, a method is provided that comprises selecting a schema based on one or more messages available in a message store and using the selected schema and one or more messages available in the message store to generate a document plan. The schema of this embodiment may be defined by a specification containing one or more queries for selecting one or more messages, one or more messages, and/or one or more predefined phrases to instantiate a document plan. The method of this embodiment may also include applying an optimization specification to the document plan to generate an optimized document plan.
Description
TECHNOLOGICAL FIELD

Embodiments of the present invention relate generally to natural language generation technologies and, more particularly, relate to a method, apparatus, and computer program product for document planning.


BACKGROUND

In some examples, a natural language generation (NLG) system is configured to transform raw input data that is expressed in a non-linguistic format into a format that can be expressed linguistically, such as through the use of natural language. For example, raw input data may take the form of a value of a stock market index over time and, as such, the raw input data may include data that is suggestive of a time, a duration, a value and/or the like. Therefore, an NLG system may be configured to input the raw input data and output text that linguistically describes the value of the stock market index; for example, “securities markets rose steadily through most of the morning, before sliding downhill late in the day.”


Data that is input into an NLG system may be provided in, for example, a recurrent formal structure. The recurrent formal structure may comprise a plurality of individual fields and defined relationships between the plurality of individual fields. For example, the input data may be contained in a spreadsheet or database, presented in a tabulated log message or other defined structure, encoded in a ‘knowledge representation’ such as the resource description framework (RDF) triples that make up the Semantic Web and/or the like. In some examples, the data may include numerical content, symbolic content or the like. Symbolic content may include, but is not limited to, alphanumeric and other non-numeric character sequences in any character encoding, used to represent arbitrary elements of information. In some examples, the output of the NLG system is text in a natural language (e.g. English, Japanese or Swahili), but may also be in the form of synthesized speech.


BRIEF SUMMARY

Methods, apparatuses, and computer program products are described herein that are configured to be embodied as and/or performed by a document planner in a natural language generation system. In some example embodiments, a method is provided that comprises selecting a schema based on one or more messages available in a message store and using the selected schema and the one or more messages available in the message store to generate a document plan. The schema of this embodiment may include one or more queries for selecting one or more messages from the message store, one or more messages, and/or predefined text. In some example embodiments, an optimization specification may be applied to optimize the document plan. Such optimization specification may be applied during the generation of the document plan or to a completed document plan. In some example embodiments, the optimization specification comprises rules for at least one of modifying the document plan and/or selecting a subset of the document plan. The document planner of this embodiment may then output the document plan to a microplanner or the like.





BRIEF DESCRIPTION OF THE DRAWINGS

Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:



FIG. 1 is a schematic representation of a natural language generation environment that may benefit from some example embodiments of the present invention;



FIG. 2 illustrates an example document planner according to some example embodiments described herein;



FIG. 3 illustrates an example document plan in accordance with some example embodiments of the present invention;



FIG. 4 illustrates a block diagram of an apparatus that embodies a natural language generation system in accordance with some example embodiments of the present invention;



FIG. 5 illustrates a flowchart of operations that may be performed by a document planner in accordance with some example embodiments of the present invention;



FIG. 6 illustrates the temporal order of messages used in a document plan in accordance with some example embodiments of the present invention;



FIGS. 7a-7g illustrate generating an example document plan in accordance with some example embodiments of the present invention; and



FIG. 8 illustrates an example document plan in accordance with some example embodiments of the present invention.





DETAILED DESCRIPTION

Example embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments are shown. Indeed, the embodiments may take many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. The terms “data,” “content,” “information,” and similar terms may be used interchangeably, according to some example embodiments, to refer to data capable of being transmitted, received, operated on, and/or stored. Moreover, the term “exemplary”, as may be used herein, is not provided to convey any qualitative assessment, but instead merely to convey an illustration of an example. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.


Natural language generation (NLG) is a field of study devoted to building technology to map data or other underlying information into natural language text. The generation of natural language texts involves subtasks such as document planning, microplanning and realization. In some example embodiments, document planning includes the process of selecting and mapping fragments of data, information or the like (e.g. messages) into data structures (e.g. document plan trees or the like), such that the data structures can be further processed into text specifications (e.g. phrase specifications, sentence plans or the like) by a microplanner so that the document plan may be expressed in natural language. In other words, a document planner, such as the document planner described herein, is configured to select information (e.g. messages) to be communicated in a text and to determine how to order and structure the selected information into sentences and paragraphs.


The task of document planning can be described as selecting, from an input message set (e.g. a message store), a subset of messages that fulfills the informational requirements of the user, partitioning the selected subset of messages into sentences and paragraphs, and ordering the messages within each of the partitions. An exhaustive search through all possible combinations of selection, partitioning and ordering of messages for an appropriate document plan is computationally very expensive. As such, a knowledge-based approach may be appropriate for document planning. In addition, it may not be possible to identify a single unique document plan because more than one document plan may be appropriate for a particular communicative context. Therefore, document planning may further involve finding an optimum document plan among a number of alternative document plans.
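As a back-of-the-envelope illustration only (the counting model below is an assumption made for exposition, not part of the described embodiments), the number of candidate document plans grows super-exponentially with the number of available messages, which is why exhaustive search quickly becomes impractical:

# Illustrative arithmetic only (not the described embodiments): counts candidate
# document plans if a plan is formed by (1) selecting a subset of n available
# messages, (2) ordering the selected messages, and (3) cutting the ordered
# sequence into contiguous groups (sentences/paragraphs).
from math import comb, factorial

def candidate_plans(n: int) -> int:
    total = 0
    for k in range(n + 1):                          # size of the selected subset
        orderings = factorial(k)                    # ways to order k messages
        groupings = 1 if k == 0 else 2 ** (k - 1)   # contiguous cuts into groups
        total += comb(n, k) * orderings * groupings
    return total

for n in (3, 6, 9):
    print(n, candidate_plans(n))   # 3 -> 40, 6 -> 37,987, 9 -> 153,161,722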


In some examples, and as is described herein, a document planner may be configured using top-down planning and bottom-up narrative optimization. Top-down planning is a type of document planning in which a document planner uses schemas to define the structure of the document. A schema is a template that specifies how a particular document plan should be constructed from constituent elements, where those constituent elements may be individual messages or, recursively, instantiations of other schemas. As is described herein, a schema may be expressed using a plan specification in terms of ordered messages or queries to retrieve messages. An example schema may specify a document plan that controls the global structure and global coherence of a generated text, as well as the conditions under which the schema is applicable.


Bottom-up narrative optimization achieves required variations of a document plan when variations of a schema are possible, such as when the global structure and/or ordering of messages is underspecified in a schema. Bottom-up narrative optimization may provide functions such as locally ordering multiple returned messages, globally reordering messages to achieve variation, and/or inserting or deleting subtrees of additional messages into a document plan tree. As is described herein, bottom-up narrative optimization may be configured to use an optimization specification that operates on a document plan or a set of messages. The optimization specification may be configured to control, in some examples, discourse features such as local coherence, continuity, text size, text fluency, discourse-focus maintenance and narration development. In some example embodiments, top-down planning may be combined with bottom-up narrative optimization to generate a document plan that may be input to, or otherwise be accessed by, a microplanner in a natural language generation system. In some embodiments, bottom-up narrative optimization may be used during generation of a document plan by top-down planning and/or bottom-up narrative optimization may be used to modify a document plan once top-down planning is complete.



FIG. 1 is an example block diagram of example components of an example natural language generation environment 100. In some example embodiments, the natural language generation environment 100 comprises a natural language generation system 102, a message store 110, a domain model 112 and/or linguistic resources 114. The natural language generation system 102 may take the form of, for example, a code module, a component, circuitry, and/or the like. The components of the natural language generation environment 100 are configured to provide various logic (e.g. code, instructions, functions, routines and/or the like) and/or services related to a document planner.


A message store 110 is configured to store one or more messages that are accessible by the natural language generation system 102. Messages are language independent data structures that correspond to informational elements in a text and/or collect together underlying data, referred to as slots, arguments or features, which can be presented within a fragment of natural language such as a phrase or sentence. Messages may be represented in various ways; for example, each slot may consist of a named attribute and its corresponding value; these values may recursively consist of sets of named attributes and their values, and each message may belong to one of a set of predefined types. The concepts and relationships that make up messages may be drawn from an ontology (e.g. a domain model 112) that formally represents knowledge about the application scenario.
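For illustration only, a message of the kind described above may be sketched as a small data structure in which a predefined type is paired with named slots whose values may themselves be nested structures; the class name, field names and the wind example below are hypothetical, not drawn from the described embodiments:

# Hypothetical sketch of a message: a language-independent structure with a
# predefined type and named slots whose values may be simple data or,
# recursively, further sets of named attributes.
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class Message:
    msg_type: str                      # one of a set of predefined types
    slots: Dict[str, Any] = field(default_factory=dict)
    importance: float = 0.0            # optional relative-importance annotation

# Example instance loosely modeled on the wind discussion in the text.
wind = Message(
    msg_type="WIND_EVENT",
    slots={
        "speed": {"value": 35, "unit": "km/h"},
        "direction": "NW",
        "time_period": {"start": "06:00", "end": "12:00"},
    },
    importance=0.8,
)
print(wind.msg_type, wind.slots["speed"]["value"])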


In some examples, the domain model 112 is a representation of information about a particular domain. For example, a domain model may contain an ontology that specifies the kinds of objects, concepts and/or the like that may exist in the domain in concrete or abstract form, properties that may be predicated of the objects, concepts and the like, relationships that may hold between the objects, concepts and the like, and representations of any specific knowledge that is required to function in the particular domain.


In some examples, messages are created based on a requirements analysis as to what is to be communicated for a particular scenario (e.g. for a particular domain or genre). A message typically corresponds to a fact about the underlying data (for example, the existence of some observed event) that could be expressed via a simple sentence (although it may ultimately be realized by some other linguistic means). For example, to linguistically describe wind, a user may want to know a speed, a direction, a time period or the like, but the user may also want to know changes in speed over time, warm or cold fronts, geographic areas and/or the like. In some cases, users do not even want to know wind speed values, but instead want an indication that describes the presence of a dangerous wind condition. Thus, a message related to wind speed may include fields to be populated by data related to the speed, direction, time period or the like, and may have other fields related to different time points, front information or the like. The mere fact that wind exists may be found in the data, but to linguistically describe “light wind” or “gusts,” different data interpretation must be undertaken, as is described herein.


In some examples, a message is created in an instance in which the raw input data warrants the construction of such a message. For example, a wind message would only be constructed in an instance in which wind data was present in the raw input data. Alternatively or additionally, while some messages may correspond directly to observations taken from a raw data input, others may be derived from the observations by means of a process of inference or based on one or more detected events. For example, the presence of rain may be indicative of other conditions, such as the potential for snow at some temperatures.


Messages may be instantiated based on many variations of source data, such as but not limited to time series data, time and space data, data from multiple data channels, an ontology, sentence or phrase extraction from one or more texts, a text, survey responses, structured data, unstructured data and/or the like. For example, in some cases, messages may be generated based on text related to multiple news articles focused on the same or similar news stories in order to generate a news story; whereas in other examples, messages may be built based on survey responses and/or event data.


Messages may be annotated with an indication of their relative importance; this information can be used in subsequent processing steps or by the natural language generation system 102 to make decisions about which information may be conveyed and which information may be suppressed. Alternatively or additionally, messages may include information on relationships between the one or more messages or an indication that a message is a focus of discourse.


In some example embodiments, a natural language generation system, such as natural language generation system 102, is configured to generate phrases, sentences, text or the like which may take the form of natural language text. The natural language generation system 102 comprises, in some example embodiments, a document planner 130, a microplanner 132 and/or a realizer 134. The natural language generation system 102 may also be in data communication with the message store 110, the domain model 112 and/or the linguistic resources 114. In some examples, the linguistic resources 114 include, but are not limited to, text schemas, aggregation rules, reference rules, lexicalization rules and/or grammar rules that may be used by one or more of the document planner 130, the microplanner 132 and/or the realizer 134. Other natural language generation systems may be used in some example embodiments, such as a natural language generation system as described in Building Natural Language Generation Systems by Ehud Reiter and Robert Dale, Cambridge University Press (2000), which is incorporated by reference in its entirety herein.


The document planner 130 is configured to input the one or more messages from the message store 110. The document planner 130 is further configured to determine how to arrange those messages in order to describe the patterns in the one or more data channels derived from the raw input data. The document planner 130 may comprise a content determination process that is configured to select the messages, such as the messages that contain a representation of the data that is to be output via a natural language text. For example, an intravenous feed message may be described prior to a milk feed message in output text describing the status of a baby's feeding. In other examples, an administration method message may be described after, but in relation to, a fluid details message. See, for example, the document plan tree 302 in FIG. 3. The document planner 130 is further described with reference to FIG. 2 below.


The output of the document planner 130 may be a tree-structured object or other data structure that is referred to in some embodiments as a document plan tree. In an instance in which a tree-structured object is chosen for the document plan, the leaf nodes of the document plan tree may contain the messages or pre-defined text to be presented in a document, and the intermediate nodes of the tree-structured object may be configured to indicate how the subordinate nodes are related (e.g. elaboration, consequence, contrast, sequence and/or the like) to each other, specify document structure (e.g. paragraph breaks), and/or the like. In some embodiments, nodes of the document plan tree may also contain parameters for use with a microplanner, such as microplanner 132.
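A purely illustrative sketch of such a tree-structured object follows; the node representation and names are assumptions made for exposition, not the described implementation:

# Hypothetical sketch of a document plan tree node. Leaf nodes carry a message
# (or predefined text); internal nodes record how their children relate
# (e.g. "elaboration", "sequence") or mark structure such as a paragraph break.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class DocPlanNode:
    relation: Optional[str] = None          # e.g. "sequence", "elaboration"
    message: Optional[dict] = None          # present only on leaf nodes
    children: List["DocPlanNode"] = field(default_factory=list)

    def add_child(self, node: "DocPlanNode") -> None:
        self.children.append(node)

# Loosely mirrors the FIG. 3 example: an IV feed message followed by a milk
# feed message under a single node.
paragraph = DocPlanNode(relation="sequence")
paragraph.add_child(DocPlanNode(message={"type": "IV_FEED"}))
paragraph.add_child(DocPlanNode(message={"type": "MILK_FEED"}))
print(len(paragraph.children))  # 2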


The microplanner 132 is configured to construct a text specification based on the document plan output from the document planner 130, such that the document plan may be expressed in natural language. In some example embodiments, the microplanner 132 may perform aggregation, lexicalization and referring expression generation. In some examples, aggregation includes, but is not limited to, determining whether two or more messages can be combined together linguistically to produce a more complex sentence. For example, one or more events may be aggregated so that both of the events are described by a single sentence.


In some examples, lexicalization includes, but is not limited to, choosing particular words for the expression of concepts and relations. For example, the phrase “along with” may be used to describe coinciding conditions or “administered” may be used to describe the causal event.


In some examples, referring expression generation includes, but is not limited to, choosing how to refer to an entity so that it can be unambiguously identified by the reader. For example, in a first sentence “John Smith” and “a heart rate alarm” may be used where “he” and “it” may be used in subsequent sentences.


The output of the microplanner 132, in some example embodiments, is a tree-structured text specification whose leaf nodes are phrase specifications, and whose internal nodes express rhetorical relations between the leaf nodes. A phrase specification may correspond to a sentence or a sub-sentence fragment (e.g. a title) and is produced from one or more messages. A phrase specification is configured to contain one or more syntactic constituents (e.g. subject, verb, prepositional phrase and/or the like) and one or more syntactic features (e.g. tense).
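For illustration, a phrase specification of the kind just described might be sketched as a mapping from syntactic constituents and syntactic features to values; the particular keys and the example sentence are hypothetical assumptions, not drawn from the described embodiments:

# Hypothetical sketch of a phrase specification for a single sentence:
# syntactic constituents plus syntactic features such as tense.
phrase_spec = {
    "subject": "the baby",
    "verb": "receive",
    "object": "50 ml of milk",
    "prepositional_phrase": "via a nasogastric tube",
    "features": {"tense": "past"},
}

# A realizer would later map a specification like this into a sentence such as
# "The baby received 50 ml of milk via a nasogastric tube."
print(phrase_spec["features"]["tense"])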


A realizer 134 is configured to traverse a text specification output by the microplanner 132 to express the text specification in natural language. The realization process that is applied to each phrase specification in the text specification makes use of a grammar (e.g. the grammar of the linguistic resources 114) which specifies the valid syntactic constituents in the language and further provides a way of mapping from phrase specifications into the corresponding natural language sentences. The output of the process is, in some example embodiments, a well-formed natural language text. In some examples, the natural language text may include embedded mark-up.



FIG. 2 illustrates an example document planner 130 according to some example embodiments described herein. The document planner 130 is configured to build a document plan using a top-down document planner 212, which is driven by a schema, and a bottom-up plan optimizer 214, which makes use of an optimization specification.


A schema may be defined using a plan specification language that is configured to define one or more messages and/or one or more queries for messages to be included in the document plan and the order in which the messages are to be presented in the output document plan. For example, a schema may specify compulsory or optional queries that may be used to extract messages from message store 110 for instantiating the schema. A schema may additionally or alternatively specify one or more messages or predefined phrases for instantiating the schema. The one or more schemas may be stored in or accessible via a schema store 202.


The schema may be configured to represent the structure of the document plan, such as via Extensible Markup Language (XML). Advantageously, by defining a schema, such as by using XML as the specification language, a user may define the structure of a document plan and insert a particular message or set of messages in a particular location in the document, where the messages may be retrieved based on queries specified in the schema or the messages may be directly specified in the schema. For example, a top-down schema may be represented using a specification such as the one below, which is further illustrated in FIG. 3.














<?xml version="1.0"?>
<!-- Baby Feed Section Example -->
<document xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:noNamespaceSchemaLocation="docplanner-schema.xsd"
          title="Baby Feed Section">
  <template id="Baby Feed" type="root">
    <section id="babyfeedSection">
      <sentence id="ivfeedSen" focus="true">
        <message-single-query>
          <messagestore-class>IV_FEED</messagestore-class>
          <order-by>
            <order-by-property name="date_of_entry" order="descending" />
          </order-by>
        </message-single-query>
      </sentence>
      <sentence id="milkfeedSen" focus="false">
        <message-single-query>
          <messagestore-class>MILK_FEED</messagestore-class>
          <order-by>
            <order-by-property name="date_of_entry" order="descending" />
          </order-by>
        </message-single-query>
      </sentence>
    </section>
  </template>
</document>









As is shown in the example schema, multiple sections, and messages that make up sections, may be defined. In some examples and as shown above, the IV Feed message 304 of FIG. 3 may be defined in a node of the document plan as a message that will instantiate that particular location in the document plan if it is available in the message store 110.


This example schema specifies queries for an IV Feed message and a Milk Feed message. The example schema further specifies that an IV Feed message should be followed by a Milk Feed message. In the message store, both the IV Feed and the Milk Feed messages may refer or link to other messages in the message store, such as a Fluid Details message (the details of the fluid given to the baby) and an Administration Method message (how the feed was actually administered). Because the messages are linked in the message store, there is no need to explicitly specify these messages in the schema. As the example schema contains queries that return only single messages, a top-down planning approach alone may be appropriate. If the queries of a schema return multiple messages, or if an order for multiple messages is not specified in the schema, generating the document plan may require the combination of top-down planning and bottom-up optimization, as described herein.
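As an illustrative sketch only (the message store format and the linking mechanism shown are assumptions, not the described implementation), following such links when a message is retrieved might look like the following:

# Hypothetical sketch: messages in the store may link to related messages, so
# retrieving IV_FEED can also pull in its FLUID_DETAILS and
# ADMINISTRATION_METHOD messages without the schema naming them.
message_store = {
    "iv1": {"type": "IV_FEED", "links": ["fd1", "am1"]},
    "fd1": {"type": "FLUID_DETAILS", "links": []},
    "am1": {"type": "ADMINISTRATION_METHOD", "links": []},
    "mf1": {"type": "MILK_FEED", "links": []},
}

def with_linked(msg_id, store):
    """Return a message together with the messages it links to."""
    msg = store[msg_id]
    return [msg] + [store[link] for link in msg["links"]]

print([m["type"] for m in with_linked("iv1", message_store)])
# ['IV_FEED', 'FLUID_DETAILS', 'ADMINISTRATION_METHOD']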


Alternatively or additionally, a schema may invoke sub-schemas. For example, a schema may invoke another schema for the purposes of building a particular paragraph or other section of the document plan.


In some example embodiments, the document planner 130 may include a top-down document planner 212 that provides functionality to generate document plans by instantiating one or more schemas selected from the schema store 202 and one or more messages selected from the message store 110. The schema may be expressed using a planning specification. As described above, the schema may contain queries for the selection of the one or more messages from the message store 110 based on at least one of user defined features; features possessed by the messages; features that describe the communicative context of the messages; or previously selected messages. Once the selected schema is instantiated by the top-down document planner 212, the top-down document planner 212 may output one or more document plans that represent the messages and/or pre-defined text to the bottom-up plan optimizer 214. The top-down document planner 212 is further described with respect to FIG. 5.


In some example embodiments, the document planner 130 may include a bottom-up plan optimizer 214 that is configured to apply an optimization specification during generation of the document plan or against the complete document plan output by the top-down document planner 212 to provide an optimized document plan for output, such as to microplanner 132. The bottom-up plan optimizer 214 is further described with respect to FIG. 5.


An optimization specification may be made up of functions that perform tasks such as locally ordering multiple returned messages, globally ordering messages, or inserting and/or deleting subtrees of additional messages, for example. Such planning functions may run in a fixed sequence or may be called from the top-down document planner as necessary.


An optimization specification may contain and execute rules comprised of triggering conditions and actions to be taken to generate a second set of one or more optimal document plans for output. For example, such rules may be of the form “if <condition> then <action1> else <action2>”. In some embodiments, the rules may reference externally specified parameters, for example, message properties such as the start-time of an event used for ordering messages in the document plan. In some embodiments, the rules may also call support functions, such as an “importance(message)” function to compute the importance of a given message.
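One way to picture such condition/action rules, purely as a sketch under assumed representations (the rule, plan and support-function formats below are hypothetical), is as pairs of functions applied to a document plan:

# Hypothetical sketch of "if <condition> then <action1> else <action2>" rules.
# Conditions and actions are plain functions over a document plan (here just a
# list of messages); importance() stands in for a support function that rules
# may call.
def importance(message):
    return message.get("importance", 0.0)

def make_rule(condition, action_if_true, action_if_false=None):
    def apply(plan):
        if condition(plan):
            return action_if_true(plan)
        return action_if_false(plan) if action_if_false else plan
    return apply

# Example rule: if the plan grows too long, keep only the most important
# messages; otherwise leave it unchanged.
too_long = lambda plan: len(plan) > 3
truncate = lambda plan: sorted(plan, key=importance, reverse=True)[:3]
rule = make_rule(too_long, truncate)

plan = [{"id": i, "importance": i / 10} for i in range(5)]
print([m["id"] for m in rule(plan)])   # the three most important messages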


In some example embodiments, the optimization specification may comprise rules for document and/or text size, text fluency, repetition avoidance, determination of paragraph breaks, message ordering, ensuring narrative coherence, maintaining discourse focus, narration development, and/or the like. The optimization specification may also specify sequencing patterns for messages and aggregation of messages.


In some embodiments, rules may be domain specific, such as rules acquired from a corpus or from a domain expert, and these may be represented as follow-on rules. A follow-on rule associates a follow-on score with a pair of messages ordered in a specific sequence. A follow-on score might be estimated by analyzing a corpus to determine the proportion of times a pair of messages appears in a specific order in the corpus. Alternatively, domain experts could specify follow-on scores. For example, a follow-on rule may include “if lead_Message is A RAIN_EVENT and the follow_on_Message is A SKY_STATE_EVENT then follow_on_Score=1.0”. This means that a RAIN_EVENT should always be ordered before a SKY_STATE_EVENT in the document plan (because the follow-on score is 1). In some embodiments, rules may be domain independent, such as where messages in all domains have an “importance” property and rules may specify ordering, reordering, or inserting of messages based on the importance value. For example, a domain independent rule may include “if importance(incoming_Message)>highestImportance(currentDocPlan) then addToFront(incoming_Message, currentDocPlan)”.
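A minimal sketch of estimating a follow-on score from a corpus follows; the corpus format, function name and example proportions are assumptions made for illustration only:

# Hypothetical sketch: estimate a follow-on score for a (lead, follow-on)
# message-type pair as the proportion of times the pair appears in that order
# in a corpus of ordered message-type sequences.
from itertools import combinations

corpus = [
    ["RAIN_EVENT", "SKY_STATE_EVENT", "WIND_EVENT"],
    ["RAIN_EVENT", "WIND_EVENT", "SKY_STATE_EVENT"],
    ["SKY_STATE_EVENT", "RAIN_EVENT"],    # counter-example ordering
]

def follow_on_score(lead, follow, corpus):
    forward = backward = 0
    for doc in corpus:
        for a, b in combinations(doc, 2):     # pairs taken in document order
            if (a, b) == (lead, follow):
                forward += 1
            elif (a, b) == (follow, lead):
                backward += 1
    total = forward + backward
    return forward / total if total else 0.0

print(follow_on_score("RAIN_EVENT", "SKY_STATE_EVENT", corpus))  # 2/3, approx. 0.67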


In some example embodiments, the bottom-up plan optimizer 214 may retrieve optimization specifications from an optimization specification store 204 to apply against a document plan generated by the top-down document planner 212. In some example embodiments, an optimization specification may include rules comprising triggering conditions and actions to be taken to modify document plans.


Alternatively or additionally, the optimization specification may be configured to specify acceptable sequencing patterns of messages returned from the message store or specify the aggregation of the selected messages.



FIG. 3 illustrates an example document plan tree 302 that may be generated by the top-down document planner 212 for input to a microplanner, such as microplanner 132. As is shown in FIG. 3 and as is described herein, the document plan 302 may contain one or more leaf nodes that contain messages, such as messages 306, 308, 310, and 312. The document plan illustrated in FIG. 3 may be created using a schema, such as the schema described above. In such example, the schema specifies queries for an IV Feed message and a Milk Feed message and further specifies that an IV Feed message is followed by a Milk Feed message, if both exist in the message store. The document plan 302 generated using the schema may then be used in the natural language generation system to generate an output text, such as output text 314.



FIG. 4 is an example block diagram of an example computing device for practicing embodiments of an example document planner. In particular, FIG. 4 shows a computing system 400 that may be utilized to implement a natural language generation environment 100 having a natural language generation system 102 including, in some examples, a document planner 130, a microplanner 132 and/or a realizer 134; and/or a user interface 410. One or more general purpose or special purpose computing systems/devices may be used to implement the natural language generation system 102 and/or the user interface 410. In addition, the computing system 400 may comprise one or more distinct computing systems/devices and may span distributed locations. In some example embodiments, the natural language generation system 102 may be configured to operate remotely via the network 450. In other example embodiments, a pre-processing module or other module that requires a heavy computational load may be configured to perform that computationally intensive processing remotely and thus may be on a remote device or server. For example, the realizer 134 may be accessed remotely. Furthermore, each block shown may represent one or more such blocks as appropriate to a specific example embodiment. In some cases, one or more of the blocks may be combined with other blocks. Also, the natural language generation system 102 may be implemented in software, hardware, firmware, or in some combination to achieve the capabilities described herein.


In the example embodiment shown, computing system 400 comprises a computer memory (“memory”) 401, a display 402, one or more processors 403, input/output devices 404 (e.g., keyboard, mouse, CRT or LCD display, touch screen, gesture sensing device and/or the like), other computer-readable media 405, and communications interface 406. The processor 403 may, for example, be embodied as various means including one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an application-specific integrated circuit (ASIC) or field-programmable gate array (FPGA), or some combination thereof. Accordingly, although illustrated in FIG. 4 as a single processor, in some embodiments the processor 403 comprises a plurality of processors. The plurality of processors may be in operative communication with each other and may be collectively configured to perform one or more functionalities of the example document planner as described herein.


The natural language generation system 102 is shown residing in memory 401. The memory 401 may comprise, for example, transitory and/or non-transitory memory, such as volatile memory, non-volatile memory, or some combination thereof. Although illustrated in FIG. 4 as a single memory, the memory 401 may comprise a plurality of memories. The plurality of memories may be embodied on a single computing device or may be distributed across a plurality of computing devices collectively configured to function as the example document planner. In various example embodiments, the memory 401 may comprise, for example, a hard disk, random access memory, cache memory, flash memory, a compact disc read only memory (CD-ROM), digital versatile disc read only memory (DVD-ROM), an optical disc, circuitry configured to store information, or some combination thereof.


In other embodiments, some portion of the contents and/or some or all of the components of the natural language generation system 102 may be stored on and/or transmitted over the other computer-readable media 405. The components of the natural language generation system 102 preferably execute on one or more processors 403 and are configured to enable operation of an example document planner, as described herein.


Alternatively or additionally, other code or programs 430 (e.g., an administrative interface, a Web server, and the like) and potentially other data repositories, such as other data sources 440, also reside in the memory 401, and preferably execute on one or more processors 403. Of note, one or more of the components in FIG. 4 may not be present in any specific implementation. For example, some embodiments may not provide other computer readable media 405 or a display 402.


The natural language generation system 102 is further configured to provide functions such as those described with reference to FIG. 1. The natural language generation system 102 may interact, via the network 450 and the communications interface 406, with remote data sources 456 (e.g. remote reference data, remote lexicalization rules, remote aggregation data, remote genre parameters and/or the like), third-party content providers 454 and/or client devices 458. The network 450 may be any combination of media (e.g., twisted pair, coaxial, fiber optic, radio frequency), hardware (e.g., routers, switches, repeaters, transceivers), and protocols (e.g., TCP/IP, UDP, Ethernet, Wi-Fi, WiMAX, Bluetooth) that facilitate communication between remotely situated humans and/or devices. In some instances, the network 450 may take the form of the internet or may be embodied by a cellular network such as an LTE based network. In this regard, the communications interface 406 may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like. The client devices 458 include desktop computing systems, notebook computers, mobile phones, smart phones, personal digital assistants, tablets and/or the like.


In an example embodiment, components/modules of the natural language generation system 102 are implemented using standard programming techniques. For example, the natural language generation system 102 may be implemented as a “native” executable running on the processor 403, along with one or more static or dynamic libraries. In other embodiments, the natural language generation system 102 may be implemented as instructions processed by a virtual machine that executes as one of the other programs 430. In general, a range of programming languages known in the art may be employed for implementing such example embodiments, including representative implementations of various programming language paradigms, including but not limited to, object-oriented (e.g., Java, C++, C#, Visual Basic.NET, Smalltalk, and the like), functional (e.g., ML, Lisp, Scheme, and the like), procedural (e.g., C, Pascal, Ada, Modula, and the like), scripting (e.g., Perl, Ruby, Python, JavaScript, VBScript, and the like), and declarative (e.g., SQL, Prolog, and the like).


The embodiments described above may also use synchronous or asynchronous client-server computing techniques. Also, the various components may be implemented using more monolithic programming techniques, for example, as an executable running on a single processor computer system, or alternatively decomposed using a variety of structuring techniques, including but not limited to, multiprogramming, multithreading, client-server, or peer-to-peer, running on one or more computer systems each having one or more processors. Some embodiments may execute concurrently and asynchronously, and communicate using message passing techniques. Equivalent synchronous embodiments are also supported. Also, other functions could be implemented and/or performed by each component/module, and in different orders, and by different components/modules, yet still achieve the described functions.


In addition, programming interfaces to the data stored as part of the natural language generation system 102 can be made available by mechanisms such as application programming interfaces (APIs) (e.g. C, C++, C#, and Java); libraries for accessing files, databases, or other data repositories; scripting languages such as XML; or Web servers, FTP servers, or other types of servers providing access to stored data. The message store 110, the domain model 112 and/or the linguistic resources 114 may be implemented as one or more database systems, file systems, or any other technique for storing such information, or any combination of the above, including implementations using distributed computing techniques. Alternatively or additionally, the message store 110, the domain model 112 and/or the linguistic resources 114 may be local data stores but may also be configured to access data from the remote data sources 456.


Different configurations and locations of programs and data are contemplated for use with techniques described herein. A variety of distributed computing techniques are appropriate for implementing the components of the illustrated embodiments in a distributed manner including but not limited to TCP/IP sockets, RPC, RMI, HTTP, Web Services (XML-RPC, JAX-RPC, SOAP, and the like). Other variations are possible. Also, other functionality could be provided by each component/module, or existing functionality could be distributed amongst the components/modules in different ways, yet still achieve the functions described herein.


Furthermore, in some embodiments, some or all of the components of the natural language generation system 102 may be implemented or provided in other manners, such as at least partially in firmware and/or hardware, including, but not limited to one or more ASICs, standard integrated circuits, controllers executing appropriate instructions, and including microcontrollers and/or embedded controllers, FPGAs, complex programmable logic devices (“CPLDs”), and the like. Some or all of the system components and/or data structures may also be stored as contents (e.g., as executable or other machine-readable software instructions or structured data) on a computer-readable medium so as to enable or configure the computer-readable medium and/or one or more associated computing systems or devices to execute or otherwise use or provide the contents to perform at least some of the described techniques. Some or all of the system components and data structures may also be stored as data signals (e.g., by being encoded as part of a carrier wave or included as part of an analog or digital propagated signal) on a variety of computer-readable transmission mediums, which are then transmitted, including across wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). Such computer program products may also take other forms in other embodiments. Accordingly, embodiments of this disclosure may be practiced with other computer system configurations.



FIG. 5 illustrates an example flowchart of the operations performed by an apparatus, such as computing system 400 of FIG. 4, in accordance with example embodiments of the present invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, one or more processors, circuitry and/or other devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory 401 of an apparatus employing an embodiment of the present invention and executed by a processor 403 in the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus provides for implementation of the functions specified in the flowcharts' block(s). These computer program instructions may also be stored in a non-transitory computer-readable storage memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable storage memory produce an article of manufacture, the execution of which implements the function specified in the flowcharts' block(s). The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowcharts' block(s). As such, the operations of FIG. 5, when executed, convert a computer or processing circuitry into a particular machine configured to perform an example embodiment of the present invention. Accordingly, the operations of FIG. 5 define an algorithm for configuring a computer or processor, to perform an example embodiment. In some cases, a general purpose computer may be provided with an instance of the processor which performs the algorithm of FIG. 5 to transform the general purpose computer into a particular machine configured to perform an example embodiment.


Accordingly, blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowcharts, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.


In some example embodiments, certain ones of the operations herein may be modified or further amplified as described below. Moreover, in some embodiments additional optional operations may also be included. It should be appreciated that each of the modifications, optional additions or amplifications described herein may be included with the operations herein either alone or in combination with any others among the features described herein.



FIG. 5 is a flowchart illustrating an example method for generating an optimal document plan tree performed by a document planner according to some example embodiments. As shown in block 502, the document planner 130 may include means, such as the top-down document planner 212, the processor 403, or the like, for accessing a message store, such as message store 110, containing one or more messages. At block 504, the document planner 130 may include means, such as the top-down document planner 212, the processor 403, or the like, for selecting a schema from a schema store, such as schema store 202, based on the one or more messages available in the message store.


At block 506, the document planner 130 may include means, such as the top-down document planner 212, the processor 403, or the like, for beginning operations for generating a document plan. For example, the document planner 130 may begin generation of a document plan using the selected schema and one or more messages from the message store. In some example embodiments, the selected schema may call a sub-schema that is also to be used in generating the first set of document plans. In some embodiments, a schema may also specify pre-defined messages or phrases that may be used in generating a document plan.


At block 508, the document planner 130 may include means, such as the top-down document planner 212, the processor 403, or the like, for selecting one or more messages from the message store based on the schema. The schema may specify queries to be executed against the message store to retrieve message content for use in generating a document plan. In some example embodiments, a schema may further specify alternate queries that may be run against the message store if the initial queries do not return a result including one or more messages. The schema may also specify predefined messages or text for use in generating the document plan.


At block 510, the document planner 130 may include means, such as the top-down document planner 212, the processor 403, or the like, for determining whether optimization is needed based on the messages retrieved from a message store, such as message store 110. For example, if a query returns more than one message, or if the schema does not specify the ordering for multiple messages, the document planner 130 may determine that optimization is needed to generate the desired document plan. If optimization of the returned messages is needed, for example because multiple messages are returned that need to be locally ordered, operation continues to block 512 (510-YES). If optimization of the returned messages is not needed, operation continues to block 516 (510-NO).


At block 512, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the processor 403, or the like, for retrieving an optimization specification, such as from optimization specification store 204, for use in optimizing the retrieved messages to be added to a document plan. An optimization specification may contain rules comprised of triggering conditions and actions to be taken to determine how messages may be added to a document plan during generation of the document plan. For example, the optimization specification may provide rules for locally ordering messages for a section of the document plan.


At block 514, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the processor 403, or the like, for applying the optimization specification rules against the retrieved messages to determine optimal placement of the messages.


At block 516, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the top-down document planner 212, the processor 403, or the like, for adding the messages to the document plan. The document planner 130 may add the retrieved messages to the document plan based on the schema or based on the rules of an optimization specification.


At block 518, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the top-down document planner 212, the processor 403, or the like, for determining whether the generation of a document plan is complete. For example, in some embodiments, the document planner 130 may determine that the schema has been completely instantiated or that all the relevant messages from a message store have been placed in the document plan. If it is determined that the document plan is not complete, for example, there are additional queries specified in the schema, operation returns to block 508 (518-NO). If it is determined that the document plan is complete, operation may continue to block 520 (518-YES).


At block 520, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the processor 403, or the like, for retrieving an optimization specification, such as from optimization specification store 204, for use in optimizing the completed document plan. An optimization specification may contain rules comprised of triggering conditions and actions to be taken to modify the completed document plan to provide an optimal document plan for output, such as to a microplanner. In some example embodiments, the optimization specification may comprise rules for document and/or text size, text fluency, repetition avoidance, determination of paragraph breaks, message ordering, ensuring narrative coherence, maintaining discourse focus, narration development, and/or the like. The optimization specification may also specify sequencing patterns for messages and aggregation of messages.


At block 522, the document planner 130 may include means, such as the bottom-up plan optimizer 214, the processor 403, or the like, for applying the optimization specification rules against the completed document plan to generate an optimal document plan. The optimal document plan may then be provided as input to the microplanner.
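The flow of blocks 502 through 522 can be condensed into the following purely illustrative sketch; all function names, data formats and the specific optimization rules are hypothetical stand-ins for the components described above, not the described implementation:

# Hypothetical condensation of FIG. 5: top-down planning over a schema's
# queries (blocks 508-518), bottom-up optimization when a query returns
# multiple messages or leaves their order unspecified (blocks 510-514), and a
# final optimization pass over the completed plan (blocks 520-522).
def run_query(query, store):
    return [m for m in store if m["type"] == query["type"]]

def order_locally(messages, spec):
    # e.g. order the returned messages by a property named in the specification
    return sorted(messages, key=lambda m: m.get(spec["order_key"], 0), reverse=True)

def optimize_completed_plan(plan, spec):
    # e.g. drop consecutive messages of the same type (a removeRepeats-style rule)
    out = []
    for msg in plan:
        if not out or out[-1]["type"] != msg["type"]:
            out.append(msg)
    return out

def generate_document_plan(schema, store, spec):
    plan = []                                            # stands in for a plan tree
    for query in schema["queries"]:                      # block 508
        messages = run_query(query, store)
        if len(messages) > 1 or not query.get("order"):  # block 510
            messages = order_locally(messages, spec)     # blocks 512-514
        plan.extend(messages)                            # block 516
    return optimize_completed_plan(plan, spec)           # blocks 520-522

store = [{"type": "WEATHER_EVENT", "importance": 0.4},
         {"type": "WEATHER_EVENT", "importance": 0.9},
         {"type": "TEMPERATURE_EVENT", "importance": 0.5}]
schema = {"queries": [{"type": "WEATHER_EVENT"},
                      {"type": "TEMPERATURE_EVENT", "order": "ascending"}]}
print([m["importance"] for m in generate_document_plan(schema, store, {"order_key": "importance"})])
# [0.9, 0.5]  -- the weather messages were locally ordered, then the repeated type removed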



FIGS. 6-8 illustrate generating an example document plan using top-down document planning and bottom-up narrative optimization in accordance with some example embodiments of the present invention. In some example embodiments, generating an output document plan comprises a process of first generating a document plan using a top-down document planner and then optimizing the document plan using bottom-up narrative optimization. In some example embodiments, generating an output document plan comprises a process of optimizing the document plan using bottom-up narrative optimization during generation of the document plan using a top-down document planner, i.e., calling the bottom-up optimization operation from within the top-down document planning operation.


In an example embodiment, a sample schema for top-down document planning to generate a weather and temperature text may be represented as:














<?xml version="1.0"?>
<!-- Weather Temperature Example -->
<document xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:noNamespaceSchemaLocation="docplanner-schema.xsd">
  <template id="weatherTemperature" type="root">
    <section id="Summary">
      <paragraph>
        <message-single-query>
          <messagestore-class>WEATHER_EVENT</messagestore-class>
          <order-by>
            <order-by-property name="START_TIME" order="ascending" />
          </order-by>
        </message-single-query>
        <message-single-query>
          <messagestore-class>TEMPERATURE_EVENT</messagestore-class>
          <order-by>
            <order-by-property name="TEMPERATURE_VALUE" order="ascending" />
          </order-by>
        </message-single-query>
      </paragraph>
    </section>
  </template>
</document>










Such a schema orders the temperature information after other weather information, capturing the global order of the text, but fails to specify how to order the multiple weather messages. In such situations, the schema may underspecify the global structure and ordering of messages for a text.


In some embodiments, the document planner 130 may first call the top-down document planner 212 to select a schema to construct a document plan. When certain conditions are fulfilled, the document planner 130 may then call the bottom-up plan optimizer 214 to provide document plan optimization, for example by calling optimization functions such as orderMessages( ) or applyDomainRules( ) to locally order multiple messages returned by a query.


To generate the document plan illustrated in FIG. 8, in an example embodiment, the document planner 130 may access a message store containing messages including:

    • 1. FROST_EVENT
    • 2. PATCHY_FOG_EVENT
    • 3. FREEZING_FOG_EVENT
    • 4. CLOUDY_EVENT
    • 5. FRESHENING_BREEZE_EVENT
    • 6. LIFTING_FOG_EVENT
    • 7. MIN_TEMPERATURE_EVENT
    • 8. MAX_TEMPERATURE_EVENT
    • 9. POLLEN_COUNT_EVENT.



FIG. 6 illustrates the temporal order of the events represented by these messages. As illustrated, message 2 (patchy fog event) and message 3 (freezing fog event) temporally overlap message 1 (frost event). Additionally, message 5 (freshening breeze event) and message 6 (lifting fog event) temporally overlap message 4 (cloudy event).


The top-down document planner 212 may select the Weather+Temperature schema to generate the document plan. As shown in FIG. 7a, the top-down document planner 212 may first create the ‘root’ docPlan Node, then create a ‘section’ docPlan node and add it to root as a child node, and then create a ‘paragraph’ docPlan node and add it to section as a child node. The top-down document planner 212 may execute Message-single-query for WEATHER_EVENT from the schema returning messages 1 through 6 (as listed above) from the message store, with the schema specifying a temporal order for these six messages, as illustrated in FIG. 6.


Because the query returned multiple messages and because there may be an opportunity to further optimize the ordering of these messages, the document planner 130 may call the bottom-up plan optimizer 214 to optimize the document plan being generated. The bottom-up plan optimizer 214 may call an orderMessages( ) function and create a docPlan node to be set as the root of the subtree to be created with the messages returned from the query, as illustrated in FIG. 7b. The orderMessages( ) function may receive the six WEATHER_EVENT messages and the subtree docPlan node. The bottom-up plan optimizer 214 may then execute rules to determine a domain specific subtree construction method. In the example embodiment, the bottom-up plan optimizer 214 may call a createTemporalStructure( ) function to create the subtree with the six messages.


The createTemporalStructure( ) function may receive the temporally ordered list of six messages and the subtree root docPlan node. The createTemporalStructure( ) function creates a docPlan node with the first (temporally ordered) message (#1 Frost_Event) and adds it to the subtree root docPlan node as a child, as illustrated in FIG. 7c. The createTemporalStructure( ) function checks through the rest of the message list to determine if any messages temporally overlap the first message (#1). The createTemporalStructure( ) function finds messages #2 and #3, since, as shown in FIG. 6, messages #2 and #3 temporally overlap with message #1. The createTemporalStructure( ) function creates docPlan nodes for messages #2 and #3 and adds them as children to the docPlan node with message #1, as illustrated in FIG. 7d. The createTemporalStructure( ) function creates a docPlan node with the next message (#4 Cloudy_Event) and adds it to the subtree root docPlan node. The createTemporalStructure( ) function checks through the rest of the message list to determine if any messages temporally overlap the fourth message and finds messages #5 and #6, since, as shown in FIG. 6, messages #5 and #6 temporally overlap with message #4. The createTemporalStructure( ) function creates docPlan nodes for messages #5 and #6 and adds them as children to the docPlan node with message #4, as illustrated in FIG. 7e.
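The walkthrough above may be condensed into the following sketch; this is a hypothetical reconstruction of the described behavior of createTemporalStructure( ), with illustrative start and end times, not the actual implementation:

# Hypothetical sketch of createTemporalStructure(): walk a temporally ordered
# list of messages, make each non-overlapping message a child of the subtree
# root, and attach any later messages that temporally overlap it as its children.
def overlaps(a, b):
    return a["start"] < b["end"] and b["start"] < a["end"]

def create_temporal_structure(messages, subtree_root):
    used = set()
    for i, msg in enumerate(messages):
        if i in used:
            continue
        node = {"message": msg["name"], "children": []}
        for j in range(i + 1, len(messages)):
            if j not in used and overlaps(msg, messages[j]):
                node["children"].append({"message": messages[j]["name"], "children": []})
                used.add(j)
        subtree_root["children"].append(node)
    return subtree_root

# The six weather messages of FIG. 6, with illustrative start/end times.
msgs = [
    {"name": "FROST_EVENT",             "start": 0, "end": 4},
    {"name": "PATCHY_FOG_EVENT",        "start": 1, "end": 2},
    {"name": "FREEZING_FOG_EVENT",      "start": 2, "end": 3},
    {"name": "CLOUDY_EVENT",            "start": 5, "end": 9},
    {"name": "FRESHENING_BREEZE_EVENT", "start": 6, "end": 7},
    {"name": "LIFTING_FOG_EVENT",       "start": 7, "end": 8},
]
tree = create_temporal_structure(msgs, {"message": None, "children": []})
print([n["message"] for n in tree["children"]])  # ['FROST_EVENT', 'CLOUDY_EVENT']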


The bottom-up plan optimizer 214 then adds the subtree received from orderMessages( ) to the main document plan by merging the subtree root docPlan node with the paragraph docPlan node, as illustrated in FIG. 7f.
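One plausible reading of this merge is sketched below, using the same hypothetical node type as the earlier sketches: the children gathered under the temporary subtree root are re-attached to the paragraph node, so the placeholder root itself does not appear in the finished plan.

    import java.util.ArrayList;
    import java.util.List;

    // Illustrative sketch of merging the optimizer's subtree into the main plan.
    class MergeSketch {
        static class Node { final List<Node> children = new ArrayList<>(); }

        static void mergeSubtree(Node paragraph, Node subtreeRoot) {
            paragraph.children.addAll(subtreeRoot.children); // FIG. 7f: subtree content under 'paragraph'
            subtreeRoot.children.clear();                    // the temporary root is discarded
        }
    }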


The top-down document planner 212 then executes the Message-single-query for TEMPERATURE_EVENT, which returns messages #7 and #8 from the message store, with the schema specifying how to order these messages. Because the query returned multiple messages, the bottom-up plan optimizer 214 may again be called to further optimize the sub-plan; in this example, however, it would not find any further optimizations. The top-down document planner 212 then creates docPlan nodes for each of messages #7 and #8 and adds them as children to the paragraph docPlan node, as illustrated in FIG. 7g. The top-down document planner 212 may then determine that the document plan is complete, as there are no more messages or queries to be executed in the schema (note that the schema did not select the POLLEN_COUNT_EVENT from the message store, so that message is never used in the output document plan). In some embodiments, upon completion of the top-down document planning, the document planner 130 may call the bottom-up plan optimizer 214 to apply further optimization functions to the document plan, such as a removeRepeats( ) function that removes repeated messages. The document planner 130 may then output the completed document plan, as illustrated in FIG. 8; FIG. 8 illustrates an exemplary document plan generated using top-down planning and bottom-up optimization, which may then be output from the document planner 130 to, for example, the microplanner 132.
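For illustration, a removeRepeats( )-style pass might be sketched as follows; the traversal strategy and the notion of a "repeat" (the same message type seen earlier in the plan) are assumptions made for this example.

    import java.util.ArrayList;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;

    // Illustrative sketch: walk the finished document plan and drop any node whose
    // message type has already been seen.
    class RemoveRepeatsSketch {
        static class Node {
            final String messageType;                      // null for structural nodes
            final List<Node> children = new ArrayList<>();
            Node(String messageType) { this.messageType = messageType; }
        }

        static void removeRepeats(Node node, Set<String> seen) {
            node.children.removeIf(child ->
                child.messageType != null && !seen.add(child.messageType));
            for (Node child : node.children) {
                removeRepeats(child, seen);
            }
        }

        public static void main(String[] args) {
            Node paragraph = new Node(null);
            paragraph.children.add(new Node("MIN_TEMPERATURE_EVENT"));
            paragraph.children.add(new Node("MIN_TEMPERATURE_EVENT")); // duplicate is dropped
            removeRepeats(paragraph, new HashSet<>());
            System.out.println(paragraph.children.size());  // prints 1
        }
    }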


Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims
  • 1. A computer-implemented method for generating an output text by transforming received raw input data into a format that can be linguistically expressed in the output text comprising: selecting, using a processor, a schema from a schema store, wherein the schema is selected based on one or more messages in a message store, each of the one or more messages being a language independent data structure, each message corresponding to and selected in response to an existence of at least one fact about the received raw input data, the received raw input data expressed in a non-linguistic format; generating, using the processor, a document plan that is instantiated with the one or more messages, wherein the document plan is instantiated according to the schema that comprises a query set, at least one query in the query set configured for selecting any number of the one or more messages in the message store; altering, using the processor, the document plan according to an optimization specification, wherein the optimization specification comprises a set of one or more rules, the rules comprising at least one rule for arranging the one or more messages of the document plan based on an optimization function; and generating, using the processor, an output text for display on a user interface based on the altered document plan, the output text being a linguistic representation of the raw input data.
  • 2. A method according to claim 1, wherein generating the document plan further comprises the selected schema selecting, using the processor, a second schema to also be used in the generating of the document plan.
  • 3. A method according to claim 1, wherein the schema further comprises one or more of pre-defined text and queries to be executed against the message store for use in generating the document plan.
  • 4. A method according to claim 3, wherein an alternate query may be specified if an initial query executed against the message store does not return a result including one or more messages.
  • 5. A method according to claim 1, wherein the document plan further comprises nodes having parameters specifying one or more of document structure, rhetorical structure, message ordering, document plan optimization, and microplanner parameters.
  • 6. A method according to claim 1, wherein the document plan further comprises nodes having microplanner parameters.
  • 7. A method according to claim 1, wherein the optimization specification further comprises rules for at least one or more of text size, text fluency, avoiding repetition, determining paragraph breaks, ordering messages, ensuring narrative coherence, maintaining discourse focus, and narration development.
  • 8. A method according to claim 7, wherein the optimization specification rules further comprise triggering conditions defined over at least one or more of message properties, message types, and relationships of messages.
  • 9. A method according to claim 1, wherein the optimization specification further specifies one or more of sequencing patterns of messages and aggregation of messages.
  • 10. A method according to claim 1, wherein the one or more rules are configured for comparing a value for at least one message property of the one or more messages in the message store to arrange the one or more messages for maintaining discourse focus.
  • 11. An apparatus that is configured to generate an output text by transforming received raw input data into a format that can be linguistically expressed in the output text, the apparatus comprising: at least one processor; and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to at least: select, using the at least one processor, a schema from a schema store, wherein the schema is selected based on one or more messages in a message store, each of the one or more messages being a language independent data structure, each message corresponding to and selected in response to an existence of at least one fact about the received raw input data, the received raw input data expressed in a non-linguistic format; generate, using the at least one processor, a document plan that is instantiated with the one or more messages, wherein the document plan is instantiated according to the schema that comprises a query set, at least one query in the query set configured for selecting any number of the one or more messages in the message store; alter, using the at least one processor, the document plan in accordance with an optimization specification, wherein the optimization specification comprises a set of one or more rules, the rules comprising at least one rule for arranging the one or more messages of the document plan based on an optimization function; and generate, using the at least one processor, an output text for display on a user interface based on the altered document plan, the output text being a linguistic representation of the raw input data.
  • 12. An apparatus according to claim 11, wherein generating the document plan further comprises the selected schema selecting a second schema to also be used in the generating of the document plan.
  • 13. An apparatus according to claim 11, wherein the schema further comprises one or more of pre-defined text and queries to be executed against the message store for use in generating the document plan.
  • 14. An apparatus according to claim 13, wherein an alternate query may be specified if an initial query executed against the message store does not return a result including one or more messages.
  • 15. An apparatus according to claim 11, wherein the document plan further comprises nodes having parameters specifying one or more of document structure, rhetorical structure, message ordering, document plan optimization, and microplanner parameters.
  • 16. An apparatus according to claim 11, wherein the optimization specification comprises rules for at least one or more of text size, text fluency, avoiding repetition, determining paragraph breaks, ordering messages, ensuring narrative coherence, maintaining discourse focus, and narration development.
  • 17. An apparatus according to claim 16, wherein the optimization specification rules further comprise triggering conditions defined over at least one or more of message properties, message types, and relationships of messages.
  • 18. An apparatus according to claim 11, wherein the optimization specification specifies one or more of sequencing patterns of messages and aggregation of messages.
  • 19. An apparatus according to claim 11, wherein the one or more rules are configured to compare a value for at least one message property of the one or more messages in the message store to arrange the one or more messages for maintaining discourse focus.
  • 20. A computer program product that is configured to generate an output text by transforming received raw input data into a format that can be linguistically expressed in the output text, the computer program product comprising: at least one computer readable non-transitory memory medium having program code instructions stored thereon, the program code instructions which, when executed by an apparatus having at least one processor, cause the apparatus at least to: select, using the at least one processor, a schema from a schema store, wherein the schema is selected based on one or more messages in a message store, each of the one or more messages being a language independent data structure, each message corresponding to and selected in response to an existence of at least one fact about the received raw input data, the received raw input data expressed in a non-linguistic format; generate, using the at least one processor, a document plan that is instantiated with the one or more messages, wherein the document plan is instantiated according to the schema that comprises a query set, at least one query in the query set configured for selecting any number of the one or more messages in the message store; alter, using the at least one processor, the document plan in accordance with an optimization specification, wherein the optimization specification comprises a set of one or more rules, the rules comprising at least one rule for arranging the one or more messages of the document plan based on an optimization function; and generate, using the at least one processor, an output text for display on a user interface based on the altered document plan, the output text being a linguistic representation of the raw input data.
PCT Information
Filing Document: PCT/IB2013/050375; Filing Date: 1/15/2013; Country: WO; Kind: 00
Publishing Document: WO2014/111753; Publishing Date: 7/24/2014; Country: WO; Kind: A
Related Publications (1)
Number: 20150363364 A1; Date: Dec 2015; Country: US