Method and apparatus for expressing time in an output text

Information

  • Patent Grant
  • Patent Number
    10,853,584
  • Date Filed
    Friday, April 19, 2019
  • Date Issued
    Tuesday, December 1, 2020
  • Field of Search
    • CPC
    • G06F17/2881
    • G06F17/2785
    • G06F17/279
    • G06F40/268
    • G06F40/274
    • G06F40/279
    • G06F40/30
    • G06F40/56
  • International Classifications
    • G06F40/40
    • G06Q10/06
    • G06F40/30
    • G06F40/35
    • G06F40/56
    • G06F40/279
    • G06F40/268
    • G06F40/274
  • Disclaimer
    This patent is subject to a terminal disclaimer.
Abstract
Methods, apparatuses, and computer program products are described herein that are configured to express a time in an output text. In some example embodiments, a method is provided that comprises identifying a time period to be described linguistically in an output text. The method of this embodiment may also include identifying a communicative context for the output text. The method of this embodiment may also include determining one or more temporal reference frames that are applicable to the time period and a domain defined by the communicative context. The method of this embodiment may also include generating a phrase specification that linguistically describes the time period based on the descriptor that is defined by a temporal reference frame of the one or more temporal reference frames. In some examples, the descriptor specifies a time window that is inclusive of at least a portion of the time period to be described linguistically.
Description
TECHNOLOGICAL FIELD

Embodiments of the present invention relate generally to natural language generation technologies and, more particularly, relate to a method, apparatus, and computer program product for expressing time in an output text.


BACKGROUND

In some examples, a natural language generation (NLG) system is configured to transform raw input data that is expressed in a non-linguistic format into a format that can be expressed linguistically, such as through the use of natural language. For example, raw input data may take the form of a value of a stock market index over time and, as such, the raw input data may include data that is suggestive of a time, a duration, a value and/or the like. Therefore, an NLG system may be configured to input the raw input data and output text that linguistically describes the value of the stock market index; for example, “securities markets rose steadily through most of the morning, before sliding downhill late in the day.”


Data that is input into an NLG system may be provided in, for example, a recurrent formal structure. The recurrent formal structure may comprise a plurality of individual fields and defined relationships between the plurality of individual fields. For example, the input data may be contained in a spreadsheet or database, presented in a tabulated log message or other defined structure, encoded in a ‘knowledge representation’ such as the resource description framework (RDF) triples that make up the Semantic Web and/or the like. In some examples, the data may include numerical content, symbolic content or the like. Symbolic content may include, but is not limited to, alphanumeric and other non-numeric character sequences in any character encoding, used to represent arbitrary elements of information. In some examples, the output of the NLG system is text in a natural language (e.g. English, Japanese or Swahili), but may also be in the form of synthesized speech.


BRIEF SUMMARY

Methods, apparatuses, and computer program products are described herein that are configured to linguistically describe a time period detected in a data structure in an output text generated by a natural language generation system. In some example embodiments, a method is provided that comprises identifying the time period to be described linguistically in an output text. The method of this embodiment may also include identifying a communicative context for the output text. The method of this embodiment may also include determining one or more temporal reference frames that are applicable to the time period and are appropriate for the domain defined by the communicative context. The method of this embodiment may also include generating a phrase specification that linguistically describes the time period based on the descriptor that is defined by a temporal reference frame of the one or more temporal reference frames. In some examples, the descriptor specifies a time window that is inclusive of at least a portion of the time period to be described linguistically.





BRIEF DESCRIPTION OF THE DRAWINGS

Having thus described embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:



FIG. 1 is a schematic representation of a natural language generation environment that may benefit from some example embodiments of the present invention;



FIG. 2 illustrates an example expression of a time using a temporal description system according to some example embodiments described herein;



FIG. 3 illustrates a block diagram of an apparatus that embodies a natural language generation system in accordance with some example embodiments of the present invention; and



FIG. 4 illustrates a flowchart that may be performed by a temporal description system in accordance with some example embodiments of the present invention.





DETAILED DESCRIPTION

Example embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments are shown. Indeed, the embodiments may take many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. The terms “data,” “content,” “information,” and similar terms may be used interchangeably, according to some example embodiments, to refer to data capable of being transmitted, received, operated on, and/or stored. Moreover, the term “exemplary”, as may be used herein, is not provided to convey any qualitative assessment, but instead merely to convey an illustration of an example. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.


Natural language generation systems may be configured to describe a time of an event, happening or the like in an output text. The time of the event may, in some cases, be described by its numerical time, such as “11:02 am”, but more often, the time of the event may be referred to by a description of that time period, such as “late morning”, “before lunch”, “before dawn”, “early in the semester” or the like. As such, to generate a description of a time, a natural language generation system may need external information or a communicative context (e.g. the domain of the event, the location of the reader of the output text, the time the output text is generated in comparison to the timing of the event and/or the like) of the output text in order to generate the preferred or otherwise appropriate description of the time. For example, a local time of 6:00 am may be before sunrise in some areas of the world whereas in other locations 6:00 am may be after sunrise. In other examples, “early morning” may be 6 am for a soldier, whereas “early morning” may be 9 am for a professional.


As is described herein and according to some example embodiments, a temporal description system is provided that enables the generation of a linguistic description, based on communicative context, in the form of a phrase specification for a time period that can be incorporated in an output text. A time period is a space of seconds, minutes, hours, days, weeks, months or years with an established beginning date and ending date. A time period may also include and/or be used interchangeably with, for example, a time point, a time window, a duration of time, an instance of time and/or the like.


In some examples, a microplanner may receive a document plan tree that contains or otherwise refers to a message that includes reference to a time period in a slot of the message. In order to convert the time period in the message into a phrase specification or a syntactic constituent for use in a phrase specification that may be processed by a microplanner, the microplanner may call or otherwise access a temporal description system, such as the temporal description system described herein. In some example embodiments, the temporal description system may reference or otherwise utilize one or more temporal reference frames that are aligned to a current communicative context and define one or more descriptors (e.g. linguistic words or phrases that describe a time window) to linguistically describe the time period. A temporal reference frame is a means of partitioning a given timeline into a set of time partitions called descriptors that can be refined based on a hierarchy. For example, a temporal reference frame may relate to seasons and have descriptors called winter, spring, summer and fall. Those descriptors may be refined, such as by using a modifier “early,” “middle” and/or “late.” As such, the temporal description system may select one of the descriptors that at least partially include the time period received to describe the time period, such as “during the spring.” In some examples, a temporal relationship between a time period and a descriptor may be also be linguistically described to provide a more precise time reference, for example “early spring”.


FIG. 1 is an example block diagram of example components of an example natural language generation environment 100. In some example embodiments, the natural language generation environment 100 comprises a natural language generation system 102, message store 110, a domain model 112 and/or linguistic resources 114. The natural language generation system 102 may take the form of, for example, a code module, a component, circuitry and/or the like. The components of the natural language generation environment 100 are configured to provide various logic (e.g. code, instructions, functions, routines and/or the like) and/or services related to the natural language generation system, the microplanner and/or a temporal description system.


A message store 110 or knowledge pool is configured to store one or more messages that are accessible by the natural language generation system 102. Messages are language independent data structures that correspond to informational elements in a text and/or collect together underlying data, referred to as slots, arguments or features, which can be presented within a fragment of natural language such as a phrase or sentence. Messages may be represented in various ways; for example, each slot may consist of a named attribute and its corresponding value; these values may recursively consist of sets of named attributes and their values, and each message may belong to one of a set of predefined types. The concepts and relationships that make up messages may be drawn from an ontology (e.g. a domain model 112) that formally represents knowledge about the application scenario. In some examples, the domain model 112 is a representation of information about a particular domain. For example, a domain model may contain an ontology that specifies the kinds of objects, instances, concepts and/or the like that may exist in the domain in concrete or abstract form, properties that may be predicated of the objects, concepts and the like, relationships that may hold between the objects, concepts and the like, a communicative context and representations of any specific knowledge that is required to function in the particular domain.
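A minimal sketch of such a message, assuming a simple dictionary representation (the message type and slot names below are hypothetical, not drawn from the disclosure):

```python
# A message: a language-independent structure with a predefined type
# and named slots; slot values may recursively be nested messages.
def make_message(msg_type, **slots):
    return {"type": msg_type, "slots": slots}

rain_message = make_message(
    "RainEvent",
    location=make_message("Location", city="Aberdeen"),  # nested slot set
    start_time="0600 11 Oct 2012",
    end_time="1500 11 Oct 2012",
)
```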


In some examples, messages are created based on a requirements analysis as to what is to be communicated for a particular scenario (e.g. for a particular domain or genre). A message typically corresponds to a fact about the underlying data (for example, the existence of some observed event) that could be expressed via a simple sentence (although it may ultimately be realized by some other linguistic means). For example, to linguistically describe a weather event, such as a rain storm, a user may want to know the location of the rain storm, when it will reach the user's location, the last time rain was detected and/or the like. In some cases, the user may not want to know about a weather event, but instead want to be warned in an instance in which the weather presents a danger in a particular area; for example, “high winds predicted this evening.”


In some examples, a message is created in an instance in which the raw input data warrants the construction of such a message. For example, a wind message would only be constructed in an instance in which wind data was present in the raw input data. Alternatively or additionally, while some messages may correspond directly to observations taken from a raw data input, others may be derived from the observations by means of a process of inference or based on one or more detected events. For example, the presence of rain may be indicative of other conditions, such as the potential for snow at some temperatures.


Messages may be instantiated based on many variations of source data, such as but not limited to time series data, time and space data, data from multiple data channels, an ontology, sentence or phrase extraction from one or more texts, a text, survey responses, structured data, unstructured data and/or the like. For example, in some cases, messages may be generated based on text related to multiple news articles focused on the same or similar news story in order to generate a news story; whereas, in other examples, messages may be built based on survey responses and/or event data.


Messages may be annotated with an indication of their relative importance; this information can be used in subsequent processing steps or by the natural language generation system 102 to make decisions about which information may be conveyed and which information may be suppressed. Alternatively or additionally, messages may include information on relationships between the one or more messages.


In some example embodiments, a natural language generation system, such as natural language generation system 102, is configured to generate words, phrases, sentences, text or the like which may take the form of a natural language text. The natural language generation system 102 comprises a document planner 130, a microplanner 132 and/or a realizer 134. The natural language generation system 102 may also be in data communication with the message store 110, the domain model 112 and/or the linguistic resources 114. In some examples, the linguistic resources 114 include, but are not limited to, text schemas, communicative context, aggregation rules, reference rules, lexicalization rules and/or grammar rules that may be used by one or more of the document planner 130, the microplanner 132 and/or the realizer 134. Other natural language generation systems may be used in some example embodiments, such as a natural language generation system as described in Building Natural Language Generation Systems by Ehud Reiter and Robert Dale, Cambridge University Press (2000), which is incorporated by reference in its entirety herein.


The document planner 130 is configured to input the one or more messages from the message store 110 and to determine how to arrange those messages in order to describe one or more patterns in the one or more data channels derived from the raw input data. The document planner 130 may also comprise a content determination process that is configured to select the messages, such as the messages that contain a representation of the data that is to be output via a natural language text.


The document planner 130 may also comprise a structuring process that determines the order of messages to be included in a text. In some example embodiments, the document planner 130 may access one or more text schemas for the purposes of content determination and document structuring. A text schema is a rule set that defines the order in which a number of messages are to be presented in a document. For example, a rain message may be described prior to a temperature message. In other examples, a wind message may be described after, but in a specific relation to, the rain message.
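As a sketch of how such a rule set might drive the structuring process (the message types and their ordering are assumptions for illustration only):

```python
# A text schema as an ordered list of message types: rain described
# before temperature, wind last.
SCHEMA_ORDER = ["RainEvent", "TemperatureEvent", "WindEvent"]

def structure(messages, schema=SCHEMA_ORDER):
    """Order document-plan messages according to the schema."""
    return sorted(messages, key=lambda m: schema.index(m["type"]))
```

For instance, `structure([{"type": "WindEvent"}, {"type": "RainEvent"}])` places the rain message first.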


The output of the document planner 130 may be a tree-structured object or other data structure that is referred to as a document plan. In an instance in which a tree-structured object is chosen for the document plan, the leaf nodes of the tree may contain the messages, and the intermediate nodes of the tree structure object may be configured to indicate how the subordinate nodes are related (e.g. elaboration, consequence, contrast, sequence and/or the like) to each other. An example document plan is shown with respect to document plan 202 of FIG. 2. An example message is also shown in FIG. 2 as message 206.


The microplanner 132 is configured to construct a text specification based on the document plan from the document planner 130, such that the document plan may be expressed in natural language. In some example embodiments, the microplanner 132 may convert the one or more messages in the document plan into one or more phrase specifications in a text specification. In some example embodiments, the microplanner 132 may perform aggregation, lexicalization and referring expression generation. In some examples, aggregation includes, but is not limited to, determining whether two or more messages can be combined together linguistically to produce a more complex phrase specification. For example, one or more messages may be aggregated so that both of the messages can be described by a single sentence. In some examples, lexicalization includes, but is not limited to, choosing particular words for the expression of concepts and relations. In some examples, referring expression generation includes, but is not limited to, choosing how to refer to an entity so that it can be unambiguously identified by the reader.


In some example embodiments, the microplanner 132 may embody or otherwise may be in data communication with a temporal description system 140. The microplanner 132 may interact with the temporal description system 140 in an instance in which the microplanner detects a time period (e.g. a time point, a time window, a duration or the like) in a slot of a message in the document plan tree received or otherwise accessed via the document planner 130. As such, the temporal description system 140 is configured to determine or otherwise identify the communicative context of an output text as is provided via the domain model 112 and/or the linguistic resources 114. A communicative context is a factor or combinations of factors of the environment in which the events to be described are occurring and which have an influence on the output text. In some example embodiments, the factor or combination of factors may include a domain for which the text is to be generated (e.g. medical, weather, academic, sports and/or the like), a location of a reader of the output text or a location described by the output text (e.g. Scotland may have a later sunrise when compared to Italy), the current time that the output text is being generated (e.g. “6 am” may be an appropriate descriptor for an event tomorrow, but “in the morning next month” may be a more appropriate descriptor to identify an event in the future), the time of the event (e.g. in order to set the tense of a verb), user or reader preferences (e.g. “early morning,” “6 am” or “0600”), language preferences (e.g. descriptors chosen based on regional dialects), and/or the like.


In some example embodiments, the temporal description system 140 may be configured to output a phrase specification that describes the input time period. The temporal description system 140 may linguistically describe the time period in the phrase specification using generic time descriptors, such as a day name, a date, an hour or the like. However, in other example embodiments, the temporal description system 140 may linguistically describe the time period using a descriptor that is defined by a temporal reference frame. A temporal reference frame is a set of time partitionings that are used to describe time in a particular domain (e.g. trimesters in pregnancy, semesters in a university and/or the like). A descriptor is the linguistically describable name of the various partitionings within a temporal reference frame (e.g. first, second and third trimesters of a pregnancy, fall semester and spring semester in a university and/or the like). The various partitionings may be further refined by using modifiers, may be refined based on a hierarchy and/or the like. In some examples, the temporal reference frames are configured to be aligned to or otherwise instantiated based on the communicative context (e.g. first trimester aligned to the conception date of a pregnancy, fall semester aligned to the fall start date, morning tied to a sunrise time and/or the like) by the temporal description system 140. Descriptors may also be aligned within the temporal reference frames in some example embodiments.
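The alignment step might be sketched as follows, assuming three 13-week trimesters anchored to a conception date drawn from the communicative context (the partition lengths are an assumption, not stated in the disclosure):

```python
from datetime import date, timedelta

def align_trimesters(conception):
    """Instantiate the TRIMESTER reference frame by aligning its
    partitions to a conception date from the communicative context."""
    frame, start = {}, conception
    for name in ("first trimester", "second trimester", "third trimester"):
        end = start + timedelta(weeks=13)
        frame[name] = (start, end)
        start = end
    return frame
```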


The temporal description system 140, in some example embodiments, may then select a temporal reference frame and a descriptor from the available temporal reference frames. In some examples, the temporal reference frame and a descriptor may be chosen based on whether a particular descriptor describes the entire time period. In other examples, a domain model or the linguistic resources may provide a preferred ordering of temporal reference frames to be used in a particular output text. In other examples, a temporal reference frame and descriptor may be chosen based on previously chosen temporal reference frames or previously generated phrase specifications. By way of example, a previously referred-to temporal reference frame may be solar movement (e.g. before sunrise) and, as such, the temporal description system 140 may subsequently use a day period (e.g. morning or afternoon) temporal reference frame to add to readability and/or variety in the output text.


In some examples, the temporal reference frame and descriptor may be chosen based on a scoring system. The scoring system may be based on the ability of a descriptor to describe the entire time period, the detection of false positives (e.g. describing a time period that does not include the event) or false negatives (e.g. failing to describe a time period that contains the event) based on the descriptor and/or the like. In some examples, the temporal reference frame may be selected by the temporal description system 140 randomly. Alternatively or additionally, multiple descriptors within a reference frame may be used to describe a time period.
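One way such a scoring system might be sketched, rewarding coverage of the time period while penalising false positives and false negatives (the weights and the hour-based windows are illustrative assumptions):

```python
def score(period, descriptor_window):
    """Score a descriptor window (hours) against a time period (hours):
    reward overlap, penalise descriptor time outside the period (false
    positive) and period time the descriptor misses (false negative)."""
    p_start, p_end = period
    d_start, d_end = descriptor_window
    overlap = max(0, min(p_end, d_end) - max(p_start, d_start))
    false_pos = (d_end - d_start) - overlap
    false_neg = (p_end - p_start) - overlap
    return overlap - 0.5 * false_pos - 1.0 * false_neg

# Pick the best descriptor for an 8 am to 11 am period.
CANDIDATES = {"morning": (6, 12), "afternoon": (12, 18)}
best = max(CANDIDATES, key=lambda d: score((8, 11), CANDIDATES[d]))
```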


Once a temporal reference frame and one or more descriptors have been identified, the temporal description system 140 is configured to determine a temporal relationship between the time period and the one or more descriptors. For example, if a single descriptor, such as day period, is used, a temporal relationship may be defined as “early,” “late” or “mid” if the time period relates to the descriptor in such a way (e.g. 8 am is “early in the day”). In an instance in which two or more descriptors are used, the temporal relationship between the time period and the descriptors may be defined by describing the time period as “between” or “overlapping” (e.g. “between lunch and dinner” or “shift 1 overlaps shift 2”).
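A hedged sketch of this relationship step, assuming hour-based descriptor windows (the names and thresholds are assumptions):

```python
def relate(period, descriptors):
    """period is a (start, end) pair in hours; descriptors is a list of
    (name, (start, end)) pairs from the chosen reference frame."""
    if len(descriptors) == 2:
        # two descriptors: describe the period as lying between them
        (n1, _), (n2, _) = descriptors
        return f"between {n1} and {n2}"
    name, (d_start, d_end) = descriptors[0]
    midpoint = (period[0] + period[1]) / 2
    third = (d_end - d_start) / 3
    if midpoint < d_start + third:
        return f"early in the {name}"
    if midpoint > d_end - third:
        return f"late in the {name}"
    return f"mid-{name}"
```

For example, an 8 am period within a 6 am to 6 pm "day" descriptor is related as "early in the day".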


Using the temporal reference frame, the one or more descriptors and the determined relationship, the temporal description system 140 is configured to generate a phrase specification subject to one or more constraints. Constraints may include, but are not limited to, constraints imposed by a domain model, user preferences, language constraints, output text length constraints, readability and variety constraints, previously referred to temporal reference frames, previous phrase specifications and/or the like. The temporal description system may then output or otherwise provide the phrase specification to the microplanner 132 to be used in a text specification as its own phrase specification or more likely incorporated inside another phrase specification. The output of the microplanner 132, in some example embodiments, is a tree-structured realization specification whose leaf-nodes are phrase specifications, and whose internal nodes express rhetorical relations between the leaf nodes.


A realizer 134 is configured to traverse a text specification output by the microplanner 132 to express the text specification in natural language. The realization process is applied to each phrase specification and makes use of a grammar (e.g. the grammar of the linguistic resources 114), which specifies the valid syntactic structures in the language and further provides a way of mapping from phrase specifications into the corresponding natural language sentences. The output of the process is, in some example embodiments, a natural language text. In some examples, the natural language text may include embedded mark-up.



FIG. 2 illustrates an example expression of a time period using a temporal description system according to some example embodiments described herein. In some examples, and by using the systems and methods described with respect to FIG. 1, a time period described in a message may be linguistically described via a temporal description system. One or more non-limiting examples of various operations of the temporal description system will now be described with respect to FIG. 2.


In one example, the timing of weather events may make use of domain specific temporal reference frames, such as solar movement (e.g. before or after sunrise), day period (e.g. morning, afternoon, evening), mealtime (e.g. around lunch), time (e.g. before 5 pm) and/or the like. As such, and according to some example embodiments, a microplanner may detect a message, such as message 206 in document plan 202 that contains a time period such as a start time (e.g. StartTime=0600 11 Oct. 2012) and an end time (e.g. EndTime=1500 11 Oct. 2012). The microplanner, when converting the message to a phrase specification, is configured to call or otherwise access the temporal description system 140 to convert the time period defined by the StartTime and the EndTime into a phrase specification, such as phrase specification 208 that is part of phrase specification 210 in the text specification 204 (e.g. “from early morning into the afternoon”). Using the message 206 and the phrase specification 210, a text may be generated (e.g. via the realizer) such as: “rain is expected from early morning into the afternoon.”
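The conversion in this example might be sketched as follows, assuming hour-based day-period windows and the "from ... into ..." wording (both are assumptions for illustration):

```python
# Illustrative day-period reference frame (hour windows are assumptions).
DAY_PERIODS = [("morning", (6, 12)), ("afternoon", (12, 18)), ("evening", (18, 24))]

def span_phrase(start_hour, end_hour):
    """Describe a period that may span several day-period descriptors."""
    covering = [name for name, (s, e) in DAY_PERIODS
                if s < end_hour and e > start_hour]
    first, last = covering[0], covering[-1]
    f_start, f_end = dict(DAY_PERIODS)[first]
    # Refine the first descriptor if the period begins early within it.
    prefix = "early " if start_hour - f_start < (f_end - f_start) / 3 else ""
    if first == last:
        return f"in the {prefix}{first}"
    return f"from {prefix}{first} into the {last}"
```

With these assumed windows, `span_phrase(6, 15)` reproduces "from early morning into the afternoon" for the 0600 to 1500 period of FIG. 2.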


In further examples, a document plan may have multiple messages having references to one or more time periods. As such, multiple temporal reference frames may be used to describe the multiple time periods. For example, the following text uses four temporal reference frames: “showers are likely to develop before sunrise before clearing into the morning. There should be some sunny spells around lunchtime lasting until around 5 pm.” The four example temporal reference frames selected by the temporal description system, in this example, are: solar movement (“before sunrise”), day period (“into the morning”), mealtime (“around lunchtime”) and time (“around 5 pm”). In other examples, the temporal description system 140 may also select alternate temporal reference frames. Alternatively or additionally, the temporal reference frames may be used in a different order, such as is shown in the following text: “Showers are likely to develop during the early morning before clearing later. There should be some sunny spells from midday lasting into the evening.”


By way of another example in a medical domain, a time period used to describe a key event during a pregnancy could be described as occurring during the following non-exhaustive list of temporal reference frames: month number (e.g. “in the 6th month”), trimester (e.g. “during the 2nd trimester”), month name (e.g. “during July”), week number (e.g. “in week 30”), day number (e.g. “around day 210”) as well as simply giving a date or time (e.g. “Friday”, “July 5”, “5 am” or the like). By way of further example, the temporal reference frame TRIMESTER may include example descriptors “first trimester”, “second trimester” or “third trimester”. As such, using the domain-specific temporal reference frames, an example output text may include three descriptors that belong to three different temporal reference frames: “During the 2nd trimester, the baby developed as expected. There were concerns during April that the baby was starting to turn but these concerns have subsided due to the baby turning back to the normal position on day 90 of this pregnancy.” In this example, the baby's development has been detected as fine for a 3-month-long period. Without the introduction of the TRIMESTER temporal reference frame, another descriptor may have been used, such as “May to July,” but this may, in some examples, result in fairly repetitive and limited text. By enabling the temporal description system 140 to describe the same time span using multiple descriptors, advantageously, for example, a greater variation in descriptors may be achieved in the resultant output text. Alternatively or additionally, temporal reference frames may be reused in some example embodiments.


In some examples, the temporal description system 140 is configured to use descriptors such as sunrise and sunset times for a specific location along with the time of day and span of an input time period to generate an output text to linguistically describe a time period. By making use of sunrise and sunset times, it is also possible to take into account the time of year when creating the linguistic description of the input time period. For example, during the summer, the night time is shorter than in the middle of winter. Therefore the time period that can be linguistically described as “morning” covers a longer time period if the input time period is during the summer than if the input time period was during the winter. Further, during the winter, the sun may only be up in the morning for a few hours in some locations, therefore the temporal description system 140 may not need to describe a relationship between the time period and the descriptor and, as such, may drop “early” from “early morning.”
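A sketch of this seasonal adjustment, assuming a "morning" window running from sunrise to noon and an equal-thirds "early" refinement that is dropped when the window is short (the thresholds are assumptions):

```python
def morning_descriptor(hour, sunrise, noon=12.0):
    """Describe an hour within a sunrise-aligned morning window."""
    if not sunrise <= hour < noon:
        return None
    if noon - sunrise < 4:  # short winter morning: drop "early"
        return "morning"
    if hour - sunrise < (noon - sunrise) / 3:
        return "early morning"
    return "morning"
```

With a summer sunrise of 4:30 am, 6 am is "early morning"; with a winter sunrise of 8:45 am, the same refinement is dropped and 9 am is simply "morning".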



FIG. 3 is an example block diagram of an example computing device for practicing embodiments of an example temporal description system. In particular, FIG. 3 shows a computing system 300 that may be utilized to implement a natural language generation environment 100 having a natural language generation system 102 including, in some examples, a document planner 130, a microplanner 132 and/or a realizer 134 and/or a user interface 320. One or more general purpose or special purpose computing systems/devices may be used to implement the natural language generation system 102 and/or the user interface 320. In addition, the computing system 300 may comprise one or more distinct computing systems/devices and may span distributed locations. In some example embodiments, the natural language generation system 102 may be configured to operate remotely via the network 350. In other example embodiments, a pre-processing module or other module that imposes a heavy computational load may perform that processing on a remote device or server. For example, the realizer 134 or the temporal description system may be accessed remotely. In other example embodiments, a user device may be configured to operate or otherwise access the natural language generation system 102. Furthermore, each block shown may represent one or more such blocks as appropriate to a specific example embodiment. In some cases one or more of the blocks may be combined with other blocks. Also, the natural language generation system 102 may be implemented in software, hardware, firmware, or in some combination to achieve the capabilities described herein.


In the example embodiment shown, computing system 300 comprises a computer memory (“memory”) 302, a display 304, one or more processors 306, input/output devices 308 (e.g., keyboard, mouse, CRT or LCD display, touch screen, gesture sensing device and/or the like), other computer-readable media 310, and communications interface 312. The processor 306 may, for example, be embodied as various means including one or more microprocessors with accompanying digital signal processor(s), one or more processor(s) without an accompanying digital signal processor, one or more coprocessors, one or more multi-core processors, one or more controllers, processing circuitry, one or more computers, various other processing elements including integrated circuits such as, for example, an application-specific integrated circuit (ASIC) or field-programmable gate array (FPGA), or some combination thereof. Accordingly, although illustrated in FIG. 3 as a single processor, in some embodiments the processor 306 comprises a plurality of processors. The plurality of processors may be in operative communication with each other and may be collectively configured to perform one or more functionalities of the temporal description system as described herein.


The natural language generation system 102 is shown residing in memory 302. The memory 302 may comprise, for example, transitory and/or non-transitory memory, such as volatile memory, non-volatile memory, or some combination thereof. Although illustrated in FIG. 3 as a single memory, the memory 302 may comprise a plurality of memories. The plurality of memories may be embodied on a single computing device or may be distributed across a plurality of computing devices collectively configured to function as the natural language generation system, the microplanner and/or the temporal description system. In various example embodiments, the memory 302 may comprise, for example, a hard disk, random access memory, cache memory, flash memory, a compact disc read only memory (CD-ROM), digital versatile disc read only memory (DVD-ROM), an optical disc, circuitry configured to store information, or some combination thereof.


In other embodiments, some portion of the contents and/or some or all of the components of the natural language generation system 102 may be stored on and/or transmitted over the other computer-readable media 310. The components of the natural language generation system 102 preferably execute on one or more processors 306 and are configured to enable operation of a temporal description system, as described herein.


Alternatively or additionally, other code or programs 340 (e.g., an administrative interface, one or more application programming interfaces, a Web server, and the like) and potentially other data repositories, such as other data sources 330, also reside in the memory 302, and preferably execute on one or more processors 306. Of note, one or more of the components in FIG. 3 may not be present in any specific implementation. For example, some embodiments may not provide other computer-readable media 310 or a display 304.


The natural language generation system 102 is further configured to provide functions such as those described with reference to FIG. 1. The natural language generation system 102 may interact with the network 350, via the communications interface 312, with remote data sources 352 (e.g. remote reference data, remote performance data, remote aggregation data, remote knowledge pools and/or the like), third-party content providers 354 and/or client devices 356. The network 350 may be any combination of media (e.g., twisted pair, coaxial, fiber optic, radio frequency), hardware (e.g., routers, switches, repeaters, transceivers), and protocols (e.g., TCP/IP, UDP, Ethernet, Wi-Fi, WiMAX, Bluetooth) that facilitate communication between remotely situated humans and/or devices. In some instances, the network 350 may take the form of the internet or may be embodied by a cellular network such as an LTE based network. In this regard, the communications interface 312 may be capable of operating with one or more air interface standards, communication protocols, modulation types, access types, and/or the like. The client devices 356 include desktop computing systems, notebook computers, mobile phones, smart phones, personal digital assistants, tablets and/or the like. In some example embodiments, a client device may embody some or all of computing system 300.


In an example embodiment, components/modules of the natural language generation system 102 are implemented using standard programming techniques. For example, the natural language generation system 102 may be implemented as a “native” executable running on the processor 306, along with one or more static or dynamic libraries. In other embodiments, the natural language generation system 102 may be implemented as instructions processed by a virtual machine that executes as one of the other programs 340. In general, a range of programming languages known in the art may be employed for implementing such example embodiments, including representative implementations of various programming language paradigms, including but not limited to, object-oriented (e.g., Java, C++, C#, Visual Basic.NET, Smalltalk, and the like), functional (e.g., ML, Lisp, Scheme, and the like), procedural (e.g., C, Pascal, Ada, Modula, and the like), scripting (e.g., Perl, Ruby, Python, JavaScript, VBScript, and the like), and declarative (e.g., SQL, Prolog, and the like).


The embodiments described above may also use synchronous or asynchronous client-server computing techniques. Also, the various components may be implemented using more monolithic programming techniques, for example, as an executable running on a single processor computer system, or alternatively decomposed using a variety of structuring techniques, including but not limited to, multiprogramming, multithreading, client-server, or peer-to-peer, running on one or more computer systems each having one or more processors. Some embodiments may execute concurrently and asynchronously, and communicate using message passing techniques. Equivalent synchronous embodiments are also supported. Also, other functions could be implemented and/or performed by each component/module, and in different orders, and by different components/modules, yet still achieve the described functions.


In addition, programming interfaces to the data stored as part of the natural language generation system 102 can be made available through application programming interfaces (APIs) (e.g. C, C++, C#, and Java); libraries for accessing files, databases, or other data repositories; markup languages such as XML; or through Web servers, FTP servers, or other types of servers providing access to stored data. The message store 110, the domain model 112 and/or the linguistic resources 114 may be implemented as one or more database systems, file systems, or any other technique for storing such information, or any combination of the above, including implementations using distributed computing techniques. Alternatively or additionally, the message store 110, the domain model 112 and/or the linguistic resources 114 may be local data stores but may also be configured to access data from the remote data sources 352.


Different configurations and locations of programs and data are contemplated for use with techniques described herein. A variety of distributed computing techniques are appropriate for implementing the components of the illustrated embodiments in a distributed manner including but not limited to TCP/IP sockets, RPC, RMI, HTTP, Web Services (XML-RPC, JAX-RPC, SOAP, and the like). Other variations are possible. Also, other functionality could be provided by each component/module, or existing functionality could be distributed amongst the components/modules in different ways, yet still achieve the functions described herein.


Furthermore, in some embodiments, some or all of the components of the natural language generation system 102 may be implemented or provided in other manners, such as at least partially in firmware and/or hardware, including, but not limited to one or more ASICs, standard integrated circuits, controllers executing appropriate instructions, and including microcontrollers and/or embedded controllers, FPGAs, complex programmable logic devices (“CPLDs”), and the like. Some or all of the system components and/or data structures may also be stored as contents (e.g., as executable or other machine-readable software instructions or structured data) on a computer-readable medium so as to enable or configure the computer-readable medium and/or one or more associated computing systems or devices to execute or otherwise use or provide the contents to perform at least some of the described techniques. Some or all of the system components and data structures may also be stored as data signals (e.g., by being encoded as part of a carrier wave or included as part of an analog or digital propagated signal) on a variety of computer-readable transmission mediums, which are then transmitted, including across wireless-based and wired/cable-based mediums, and may take a variety of forms (e.g., as part of a single or multiplexed analog signal, or as multiple discrete digital packets or frames). Such computer program products may also take other forms in other embodiments. Accordingly, embodiments of this disclosure may be practiced with other computer system configurations.



FIG. 4 is a flowchart illustrating an example method performed by a temporal description system in accordance with some example embodiments described herein. As is shown in operation 402, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for receiving a time period to be described linguistically in an output text. In some example embodiments, the time period may define a time point or a series of time points. As is shown in operation 404, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for identifying a communicative context for the output text.
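Operations 402 and 404 above can be illustrated with two minimal data structures for their inputs. This is a sketch under assumed names (`TimePeriod`, `CommunicativeContext` are not the patent's identifiers); it shows only that a time period may be a point or a series of points, and that the communicative context carries a domain and location:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class TimePeriod:
    """A time point or a series of time points to be described (operation 402)."""
    points: List[datetime]

    @property
    def start(self) -> datetime:
        return min(self.points)

    @property
    def end(self) -> datetime:
        return max(self.points)

@dataclass
class CommunicativeContext:
    """Context identified for the output text (operation 404)."""
    domain: str                     # e.g. "weather" or "medical"
    location: str = "unknown"       # used for, e.g., sunrise/sunset lookup
    now: datetime = field(default_factory=datetime.now)
```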


As is shown in operation 406, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for determining one or more temporal reference frames that are applicable to the time period and are appropriate for the domain defined by the communicative context. For example, a temporal reference frame that is partitioned into the trimesters of a pregnancy term would likely be appropriate only for the medical domain and not for the weather domain. As is shown in operation 408, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for instantiating or otherwise aligning the one or more temporal reference frames to the communicative context. For example, if the temporal reference frame is partitioned into trimesters that define a pregnancy term, then an example first trimester would be instantiated with the conception date as its start and a date three months later as its end.
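Operations 406 and 408 can be sketched as a domain filter over a registry of reference frames, followed by instantiation against a concrete date. The registry contents, the 91-day trimester length, and all identifiers here are illustrative assumptions, not the patent's implementation:

```python
from datetime import date, timedelta

# Hypothetical registry: each temporal reference frame lists the domains
# in which it is an appropriate way to talk about time (operation 406).
REFERENCE_FRAMES = {
    "calendar_months": {"domains": {"weather", "medical", "finance"}},
    "pregnancy_trimesters": {"domains": {"medical"}},
}

def applicable_frames(domain: str):
    """Keep only the reference frames appropriate for the given domain."""
    return [name for name, frame in REFERENCE_FRAMES.items()
            if domain in frame["domains"]]

def instantiate_trimesters(conception: date):
    """Align the trimester frame to a conception date (operation 408),
    yielding (descriptor, start, end) partitions of roughly three months."""
    partitions = []
    start = conception
    for name in ("first trimester", "second trimester", "third trimester"):
        end = start + timedelta(days=91)
        partitions.append((name, start, end))
        start = end
    return partitions
```

Asked for the weather domain, the filter excludes the trimester frame; asked for the medical domain, both frames remain available.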


As is shown in operation 410, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for selecting a descriptor within a temporal reference frame of the one or more temporal reference frames based on the time period and one or more previously generated phrase specifications. As is described herein, a temporal reference frame may be chosen at random, may be selected by a user, may be dictated by a domain model and/or the like. In some examples, a temporal reference frame may be different from the previously referred-to temporal reference frame. In some example embodiments, each temporal reference frame of the one or more temporal reference frames is partitioned, such that a descriptor is configured to linguistically describe each partition (e.g. each partition or descriptor has a linguistically-expressible name).
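A partitioned reference frame with a linguistically expressible name per partition, and descriptor selection against previously generated phrases (operation 410), might look like the following sketch. The partition boundaries are assumed, and a real system might switch reference frames on reuse rather than merely flagging it:

```python
from datetime import datetime

# A time-of-day reference frame partitioned so that every partition has a
# linguistically expressible name (illustrative hour boundaries).
TIME_OF_DAY = [
    ("night", 0, 6),
    ("morning", 6, 12),
    ("afternoon", 12, 18),
    ("evening", 18, 24),
]

def select_descriptor(t: datetime, previously_used=()):
    """Pick the descriptor whose partition contains t.

    Returns (descriptor, reused), where reused indicates that the same
    descriptor appeared in a previous phrase specification -- a cue that
    a different reference frame may be preferable.
    """
    for name, lo, hi in TIME_OF_DAY:
        if lo <= t.hour < hi:
            return name, (name in previously_used)
    raise ValueError("hour out of range")
```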


As is shown in operation 412, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for determining a temporal relationship between the time period and the descriptor. For example, the time period may fall at the beginning of a descriptor; as such, the relationship may be expressed as "early" in the descriptor "March." As is shown in operation 414, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for generating a phrase specification based on the descriptor and the temporal relationship between the time period and the descriptor. As is shown in operation 416, an apparatus may include means, such as the microplanner 132, the temporal description system 140, the processor 306, or the like, for verifying the phrase specification using one or more constraints. A constraint may include, but is not limited to, verifying that the same descriptor is not used consecutively in the same sentence, that a descriptor is not used incorrectly within a communicative context (e.g. 6 am is not described as "morning" when the sun has not yet risen), context-based constraints, constraints due to the text length and/or the like.
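Operations 412 through 416 can be sketched end to end: position the time period within the descriptor, build a phrase specification, and verify it against constraints. The thirds-based "early/mid/late" split, the dict stand-in for a phrase specification, and the two constraints shown are assumptions made for illustration:

```python
def temporal_relationship(day: int, days_in_month: int = 30) -> str:
    """Operation 412: position of the time period within a month
    descriptor, split into thirds (an assumed rule)."""
    if day <= days_in_month // 3:
        return "early"
    if day <= 2 * days_in_month // 3:
        return "mid"
    return "late"

def build_phrase_spec(descriptor: str, relationship: str) -> dict:
    """Operation 414: a minimal phrase specification (dict stand-in)."""
    return {"head": descriptor, "modifier": relationship,
            "text": f"{relationship} {descriptor}"}

def verify(spec: dict, previous_specs, sun_is_up: bool = True) -> bool:
    """Operation 416: reject specs that repeat the preceding descriptor
    or misuse a descriptor in context (e.g. 'morning' before sunrise)."""
    if previous_specs and previous_specs[-1]["head"] == spec["head"]:
        return False
    if spec["head"] == "morning" and not sun_is_up:
        return False
    return True
```

For the 5th of March, this yields the phrase specification for "early March"; the same specification would be rejected if the immediately preceding phrase already used "March".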



FIG. 4 illustrates an example flowchart of the operations performed by an apparatus, such as computing system 300 of FIG. 3, in accordance with example embodiments of the present invention. It will be understood that each block of the flowchart, and combinations of blocks in the flowchart, may be implemented by various means, such as hardware, firmware, one or more processors, circuitry and/or other devices associated with execution of software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions which embody the procedures described above may be stored by a memory 302 of an apparatus employing an embodiment of the present invention and executed by a processor 306 in the apparatus. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (e.g., hardware) to produce a machine, such that the resulting computer or other programmable apparatus provides for implementation of the functions specified in the flowchart's block(s). These computer program instructions may also be stored in a non-transitory computer-readable storage memory that may direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable storage memory produce an article of manufacture, the execution of which implements the function specified in the flowchart's block(s). The computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operations to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide operations for implementing the functions specified in the flowchart's block(s). 
As such, the operations of FIG. 4, when executed, convert a computer or processing circuitry into a particular machine configured to perform an example embodiment of the present invention. Accordingly, the operations of FIG. 4 define an algorithm for configuring a computer or processor, to perform an example embodiment. In some cases, a general purpose computer may be provided with an instance of the processor which performs the algorithm of FIG. 4 to transform the general purpose computer into a particular machine configured to perform an example embodiment.


Accordingly, blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.


In some example embodiments, certain ones of the operations herein may be modified or further amplified as described below. Moreover, in some embodiments additional optional operations may also be included. It should be appreciated that each of the modifications, optional additions or amplifications described herein may be included with the operations herein either alone or in combination with any others among the features described herein.


Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.

Claims
  • 1. An apparatus comprising at least one processor and at least one memory including computer program code, the at least one memory and the computer program code configured to, with the at least one processor, cause the apparatus to: identify a time period to be described linguistically in an output text and one or more linguistic resources for the output text; generate a phrase specification that linguistically describes the time period based on one or more descriptors that are defined by a temporal reference frame of one or more temporal reference frames, the one or more temporal reference frames applicable to the time period and a domain defined by the one or more linguistic resources, wherein the one or more descriptors specify a time window that is inclusive of at least a portion of the time period to be described linguistically; and generate the output text based on the phrase specification.
  • 2. The apparatus according to claim 1, wherein the at least one memory including the computer program code is further configured to, with the at least one processor, cause the apparatus to: align the one or more temporal reference frames to the one or more linguistic resources; and select the one or more descriptors defined by the temporal reference frame of the one or more temporal reference frames based on the time period and one or more previously generated phrase specifications.
  • 3. The apparatus according to claim 1, wherein the at least one memory including the computer program code is further configured to, with the at least one processor, cause the apparatus to: determine a temporal relationship between the time period and the one or more descriptors, wherein the phrase specification is further based on the temporal relationship between the time period and the one or more descriptors.
  • 4. The apparatus according to claim 1, wherein the one or more linguistic resources comprises a preferred ordering of the one or more temporal reference frames to be used in the output text.
  • 5. The apparatus according to claim 1, wherein the at least one memory including the computer program code is further configured to, with the at least one processor, cause the apparatus to: verify the phrase specification using one or more constraints.
  • 6. The apparatus according to claim 1, wherein the phrase specification is incorporated inside a second phrase specification.
  • 7. The apparatus according to claim 1, wherein each temporal reference frame of the one or more temporal reference frames are partitioned, such that a descriptor is configured to linguistically describe each partition.
  • 8. The apparatus according to claim 7, wherein the one or more linguistic resources is defined by at least one of a domain model, a current time or a current location.
  • 9. The apparatus according to claim 1, wherein the time period comprises at least one of a time point or a series of time points.
  • 10. A computer program product comprising at least one computer readable non-transitory memory medium having program code instructions stored thereon, the program code instructions which when executed by an apparatus cause the apparatus to: identify a time period to be described linguistically in an output text and one or more linguistic resources for the output text; generate a phrase specification that linguistically describes the time period based on one or more descriptors that are defined by a temporal reference frame of one or more temporal reference frames, the one or more temporal reference frames applicable to the time period and a domain defined by the one or more linguistic resources, wherein the one or more descriptors specify a time window that is inclusive of at least a portion of the time period to be described linguistically; and generate the output text based on the phrase specification.
  • 11. A computer-implemented method, comprising: identifying, using a processor, a time period to be described linguistically in an output text and one or more linguistic resources for the output text; generating, using the processor, a phrase specification that linguistically describes the time period based on one or more descriptors that are defined by a temporal reference frame of one or more temporal reference frames, the one or more temporal reference frames applicable to the time period and a domain defined by the one or more linguistic resources, wherein the one or more descriptors specify a time window that is inclusive of at least a portion of the time period to be described linguistically; and generating, using the processor, the output text based on the phrase specification.
  • 12. The method according to claim 11, further comprising: instantiating or aligning, using the processor, the one or more temporal reference frames to the one or more linguistic resources; and selecting, using the processor, the one or more descriptors defined by the temporal reference frame of the one or more temporal reference frames based on the time period and one or more previously generated phrase specifications.
  • 13. The method according to claim 11, further comprising: determining, using the processor, a temporal relationship between the time period and the one or more descriptors, wherein the phrase specification is further based on the temporal relationship between the time period and the one or more descriptors.
  • 14. The method according to claim 11, wherein the one or more linguistic resources comprises a preferred ordering of the one or more temporal reference frames to be used in the output text.
  • 15. The method according to claim 11, further comprising: verifying, using the processor, the phrase specification using one or more constraints.
  • 16. The method according to claim 11, wherein the phrase specification is incorporated inside a second phrase specification.
  • 17. The method according to claim 11, wherein each temporal reference frame of the one or more temporal reference frames are partitioned, such that a descriptor is configured to linguistically describe each partition.
  • 18. The method according to claim 17, wherein the one or more linguistic resources is defined by at least one of a domain model, a current time or a current location.
  • 19. The method according to claim 11, wherein the time period comprises at least one of a time point or a series of time points and wherein the phrase specification comprises at least one word.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 15/872,826, titled “METHOD AND APPARATUS FOR EXPRESSING TIME IN AN OUTPUT,” filed Jan. 16, 2018, which is a continuation of U.S. patent application Ser. No. 14/702,352, titled “METHOD AND APPARATUS FOR EXPRESSING TIME IN AN OUTPUT TEXT,” filed May 1, 2015, now U.S. Pat. No. 9,904,676, which is continuation of International Application No. PCT/IB2012/056514, titled “METHOD AND APPARATUS FOR EXPRESSING TIME IN AN OUTPUT TEXT,” filed Nov. 16, 2012, the contents of which are hereby incorporated herein by reference in their entirety.

20150169659 Lee et al. Jun 2015 A1
20150169720 Byron et al. Jun 2015 A1
20150169737 Byron et al. Jun 2015 A1
20150179082 Byron et al. Jun 2015 A1
20150227508 Howald et al. Aug 2015 A1
20150242384 Reiter Aug 2015 A1
20150261744 Suenbuel et al. Sep 2015 A1
20150261836 Madhani et al. Sep 2015 A1
20150279348 Cao et al. Oct 2015 A1
20150310013 Allen et al. Oct 2015 A1
20150310112 Allen et al. Oct 2015 A1
20150310861 Waltermann et al. Oct 2015 A1
20150324343 Carter et al. Nov 2015 A1
20150324347 Bradshaw et al. Nov 2015 A1
20150324351 Sripada et al. Nov 2015 A1
20150324374 Sripada et al. Nov 2015 A1
20150324413 Gubin et al. Nov 2015 A1
20150325000 Sripada Nov 2015 A1
20150326622 Carter et al. Nov 2015 A1
20150331845 Guggilla et al. Nov 2015 A1
20150331846 Guggilla et al. Nov 2015 A1
20150332670 Akbacak et al. Nov 2015 A1
20150347400 Sripada Dec 2015 A1
20150356127 Pierre et al. Dec 2015 A1
20150363363 Bohra et al. Dec 2015 A1
20150363364 Sripada Dec 2015 A1
20150363382 Bohra et al. Dec 2015 A1
20150363390 Mungi et al. Dec 2015 A1
20150363391 Mungi et al. Dec 2015 A1
20150371651 Aharoni et al. Dec 2015 A1
20160019200 Allen Jan 2016 A1
20160027125 Bryce Jan 2016 A1
20160055150 Bird et al. Feb 2016 A1
20160328385 Reiter Nov 2016 A1
20200058145 Reiter Feb 2020 A1
Foreign Referenced Citations (43)
Number Date Country
2011247830 Dec 2011 AU
2011253627 Dec 2011 AU
2013201755 Sep 2013 AU
2013338351 May 2015 AU
2577721 Mar 2006 CA
2826116 Mar 2006 CA
103999081 Aug 2014 CN
104182059 Dec 2014 CN
104881320 Sep 2015 CN
1336955 May 2006 EP
2707809 Mar 2014 EP
2750759 Jul 2014 EP
2849103 Mar 2015 EP
2518192 Mar 2015 GB
61-221873 Oct 1986 JP
2004-21791 Jan 2004 JP
2014165766 Sep 2014 JP
WO 2000074394 Dec 2000 WO
WO 2002031628 Apr 2002 WO
WO 2002073449 Sep 2002 WO
WO 2002073531 Sep 2002 WO
WO 2002031628 Oct 2002 WO
WO 2006010044 Jan 2006 WO
WO 2007041221 Apr 2007 WO
WO 2009014465 Jan 2009 WO
WO 2010049925 May 2010 WO
WO 2010051404 May 2010 WO
WO 2012071571 May 2012 WO
WO 2013009613 Jan 2013 WO
WO 2013042115 Mar 2013 WO
WO 2013042116 Mar 2013 WO
WO 2013177280 Nov 2013 WO
WO 2014035402 Mar 2014 WO
WO 2014098560 Jun 2014 WO
WO 2014140977 Sep 2014 WO
WO 2014187076 Nov 2014 WO
WO 2015028844 Mar 2015 WO
WO 2015113301 Aug 2015 WO
WO 2015148278 Oct 2015 WO
WO 2015159133 Oct 2015 WO
WO 2015164253 Oct 2015 WO
WO 2015175338 Nov 2015 WO
WO 2016004266 Jan 2016 WO
Non-Patent Literature Citations (110)
Entry
Alawneh et al., “Pattern Recognition Techniques Applied to the Abstraction of Traces of Inter-Process Communication,” Software Maintenance and Reengineering (CSMR), 2011 15th European Conference on Year: 2011, IEEE Conference Publications, pp. 211-220, (2011).
Andre et al., "From Visual Data to Multimedia Presentations," Grounding Representations: Integration of Sensory Information in Natural Language Processing, Artificial Intelligence and Neural Networks, IEE Colloquium on, pp. 1-3, (1995).
Andre et al., “Natural Language Access to Visual Data: Dealing with Space and Movement,” Report 63, German Research Center for Artificial Intelligence (DFKI) SFB 314, Project VITRA, pp. 1-21, (1989).
Barzilay et al., "Aggregation via Set Partitioning for Natural Language Generation," Proceedings of the Human Language Technology Conference of the North American Chapter of the ACL, pp. 359-366, (2006).
Bhoedjang et al., “Optimizing Distributed Data Structures Using Application-Specific Network Interface Software,” Parallel Processing, Proceedings; International Conference on Year: 1998, IEEE Conference Publications, pp. 485-492, (1998).
Cappozzo et al., “Surface-Marker Cluster Design Criteria for 3-D Bone Movement Reconstruction,” IEEE Transactions on Biomedical Engineering, 44(12):1165-1174, (1997).
Dalianis et al., "Aggregation in Natural Language Generation," Trends in Natural Language Generation, an Artificial Intelligence Perspective, pp. 88-105, (1993).
Dragon et al., “Multi-Scale Clustering of Frame-to-Frame Correspondences for Motion Segmentation,” Computer Vision ECCV, Springer Berlin Heidelberg, pp. 445-458, (2012).
Gatt et al., "From Data to Text in the Neonatal Intensive Care Unit: Using NLG Technology for Decision Support and Information Management," AI Communication, pp. 153-186, (2009).
Gorelov et al., “Search Optimization in Semistructured Databases Using Hierarchy of Document Schemas,” Programming and Computer Software, 31(6):321-331, (2005).
Herzog et al., "Combining Alternatives in the Multimedia Presentation of Decision Support Information for Real-Time Control," IFIP, 15 pages, (1998).
Kottke et al., “Motion Estimation Via Cluster Matching,” 8180 IEEE Transactions on Pattern Analysis and Machine Intelligence, 16(11):1128-1132, (1994).
Kukich, "Knowledge-Based Report Generation: A Knowledge-Engineering Approach to Natural Language Report Generation," Dissertation to The Interdisciplinary Department of Information Science, University of Pittsburgh, 260 pages, (1983).
Leonov et al., “Construction of an Optimal Relational Schema for Storing XML Documents in an RDBMS Without Using DTD/XML Schema,” Programming and Computer Software, 30(6):323-336, (2004).
Perry et al., “Automatic Realignment of Data Structures to Improve MPI Performance,” Networks (ICN), Ninth International Conference on Year: 2010, IEEE Conference Publications, pp. 42-47, (2010).
Quinlan, “Induction of Decision Trees,” Machine Learning, Kluwer Academic Publishers, 1(1):81-106, (1986).
Radev et al., “Generating Natural Language Summaries from Multiple On-Line Sources,” Association of Computational Linguistics, 24(3):469-500, (1998).
Reiter et al., “Building Applied Natural Language Generation Systems,” Natural Language Engineering 1 (1), 31 pages, (1995).
Reiter et al., "Studies in Natural Language Processing—Building Natural Language Generation Systems," Cambridge University Press, (2000).
Reiter, “An Architecture for Data-to-Text Systems,” Proceedings of ENLG-2007, pp. 97-104, (2007).
Shaw, "Clause Aggregation Using Linguistic Knowledge," Proceedings of IWNLG, pp. 138-147, (1998). Retrieved from <http://acl.ldc.upenn.edu/W/W98/W98-1415.pdf>.
Spillner et al., "Algorithms for Dispersed Processing," Utility and Cloud Computing (UC), 2014 IEEE/ACM 7th International Conference on Year: 2014, IEEE Conference Publications, pp. 914-921, (2014).
Voelz et al., “Rocco: A RoboCup Soccer Commentator System,” German Research Center for Artificial Intelligence DFKI GmbH, 11 pages, (1999).
Yu et al., “Choosing the Content of Textual Summaries of Large Time-Series Data Sets,” Natural Language Engineering, 13:1-28, (2007).
International Preliminary Report on Patentability for Application No. PCT/IB2012/056513 dated May 29, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2012/056514 dated May 19, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2012/057773 dated Jun. 30, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2012/057774 dated Jun. 30, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2013/050375 dated Jul. 21, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2013/058131 dated May 5, 2015.
International Preliminary Report on Patentability for Application No. PCT/IB2014/060846 dated Oct. 18, 2016.
International Preliminary Report on Patentability for Application No. PCT/US2012/053115 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/053127 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/053128 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/053156 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/053183 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/061051 dated Mar. 3, 2015.
International Preliminary Report on Patentability for Application No. PCT/US2012/063343 dated May 5, 2015.
International Search Report and Written Opinion for Application No. PCT/IB2012/056513 dated Jun. 26, 2013.
International Search Report and Written Opinion for Application No. PCT/IB2012/056514 dated Jun. 26, 2013.
International Search Report and Written Opinion for Application No. PCT/IB2012/057773 dated Jul. 1, 2013.
International Search Report and Written Opinion for Application No. PCT/IB2012/057774 dated Sep. 20, 2013.
International Search Report and Written Opinion for Application No. PCT/IB2013/050375 dated May 7, 2013.
International Search Report and Written Opinion for Application No. PCT/IB2013/058131 dated Jul. 3, 2014.
International Search Report and Written Opinion for Application No. PCT/IB2014/060846 dated Feb. 4, 2015.
International Search Report and Written Opinion for Application No. PCT/US2012/053115 dated Jul. 24, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/053127 dated Jul. 24, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/053128 dated Jun. 27, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/053156 dated Sep. 26, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/053183 dated Jun. 4, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/061051 dated Jul. 24, 2013.
International Search Report and Written Opinion for Application No. PCT/US2012/063343 dated Jan. 15, 2014.
Notice of Allowance for U.S. Appl. No. 14/023,023 dated Apr. 11, 2014.
Notice of Allowance for U.S. Appl. No. 14/023,056 dated Apr. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/311,806 dated Dec. 28, 2016.
Notice of Allowance for U.S. Appl. No. 14/311,998 dated Dec. 22, 2015.
Notice of Allowance for U.S. Appl. No. 14/311,998 dated Jan. 21, 2016.
Notice of Allowance for U.S. Appl. No. 14/634,035 dated Mar. 30, 2016.
Notice of Allowance for U.S. Appl. No. 14/702,352 dated Oct. 17, 2017.
Notice of Allowance for U.S. Appl. No. 15/074,425 dated May 8, 2020.
Notice of Allowance for U.S. Appl. No. 15/186,927 dated Dec. 20, 2018.
Notice of Allowance for U.S. Appl. No. 15/188,423 dated Dec. 28, 2018.
Notice of Allowance for U.S. Appl. No. 15/421,921 dated Mar. 14, 2018.
Notice of Allowance for U.S. Appl. No. 15/872,826 dated Jan. 22, 2019.
Notice of Allowance for U.S. Appl. No. 16/009,006 dated Jul. 31, 2019.
Office Action for U.S. Appl. No. 14/023,023 dated Mar. 4, 2014.
Office Action for U.S. Appl. No. 14/023,056 dated Nov. 21, 2013.
Office Action for U.S. Appl. No. 14/311,806 dated Jun. 10, 2016.
Office Action for U.S. Appl. No. 14/311,998 dated Feb. 20, 2015.
Office Action for U.S. Appl. No. 14/311,998 dated Oct. 7, 2015.
Office Action for U.S. Appl. No. 14/634,035 dated Aug. 28, 2015.
Office Action for U.S. Appl. No. 14/634,035 dated Dec. 10, 2015.
Office Action for U.S. Appl. No. 14/634,035 dated Mar. 30, 2016.
Office Action for U.S. Appl. No. 14/702,352 dated Mar. 29, 2017.
Office Action for U.S. Appl. No. 15/074,425 dated Feb. 26, 2018.
Office Action for U.S. Appl. No. 15/074,425 dated May 10, 2017.
Office Action for U.S. Appl. No. 15/074,425 dated Nov. 27, 2018.
Office Action for U.S. Appl. No. 15/074,425 dated Oct. 4, 2019.
Office Action for U.S. Appl. No. 15/188,423 dated Jul. 20, 2018.
Office Action for U.S. Appl. No. 15/188,423 dated Oct. 23, 2017.
Office Action for U.S. Appl. No. 15/188,423 dated Oct. 30, 2018.
Office Action for U.S. Appl. No. 15/421,921 dated Sep. 27, 2017.
Office Action for U.S. Appl. No. 15/872,826 dated Aug. 15, 2018.
Office Action for U.S. Appl. No. 16/009,006 dated Dec. 3, 2018.
Office Action for U.S. Appl. No. 16/367,095 dated May 28, 2020.
Statement in accordance with the Notice from the European Patent Office dated Oct. 1, 2007 concerning business methods (OJ EPO Nov. 2007, 592-593), (XP002456414), 1 page.
U.S. Appl. No. 12/779,636; entitled “System and Method for Using Data to Automatically Generate a Narrative Story” filed May 13, 2010.
U.S. Appl. No. 13/186,308; entitled “Method and Apparatus for Triggering the Automatic Generation of Narratives” filed Jul. 19, 2011.
U.S. Appl. No. 13/186,329; entitled “Method and Apparatus for Triggering the Automatic Generation of Narratives” filed Jul. 19, 2011.
U.S. Appl. No. 13/186,337; entitled “Method and Apparatus for Triggering the Automatic Generation of Narratives” filed Jul. 19, 2011.
U.S. Appl. No. 13/186,346; entitled “Method and Apparatus for Triggering the Automatic Generation of Narratives” filed Jul. 19, 2011.
U.S. Appl. No. 13/464,635; entitled “Use of Tools and Abstraction in a Configurable and Portable System for Generating Narratives” filed May 4, 2012.
U.S. Appl. No. 13/464,675; entitled “Configurable and Portable System for Generating Narratives” filed May 4, 2012.
U.S. Appl. No. 13/464,716; entitled “Configurable and Portable System for Generating Narratives” filed May 4, 2012.
U.S. Appl. No. 14/023,023; entitled “Method and Apparatus for Alert Validation;” filed Sep. 10, 2013.
U.S. Appl. No. 14/023,056; entitled “Method and Apparatus for Situational Analysis Text Generation;” filed Sep. 10, 2013.
U.S. Appl. No. 14/027,684; entitled “Method, Apparatus, and Computer Program Product for User-Directed Reporting;” filed Sep. 16, 2013.
U.S. Appl. No. 14/027,775; entitled “Method and Apparatus for Interactive Reports;” filed Sep. 16, 2013.
U.S. Appl. No. 14/311,806; entitled Method and Apparatus for Alert Validation; In re: Reiter, filed Jun. 23, 2014.
U.S. Appl. No. 14/311,998, entitled Method and Apparatus for Situational Analysis Text Generation; In re: Reiter; filed Jun. 23, 2014.
U.S. Appl. No. 14/634,035, entitled Method and Apparatus for Annotating a Graphical Output; In re: Reiter; filed Feb. 27, 2015.
U.S. Appl. No. 14/914,461, filed Feb. 25, 2016; In re: Reiter et al., entitled Text Generation From Correlated Alerts.
U.S. Appl. No. 15/022,420, filed Mar. 16, 2016; In re: Mahamood, entitled Method and Apparatus for Document Planning.
U.S. Appl. No. 15/074,425, filed Mar. 18, 2016; In re: Reiter, entitled Method and Apparatus for Situational Analysis Text Generation.
U.S. Appl. No. 15/093,337, filed Apr. 7, 2016; In re: Reiter, entitled Method and Apparatus for Referring Expression Generation.
U.S. Appl. No. 15/093,365, filed Apr. 7, 2016; In re: Logan et al., entitled Method and Apparatus for Updating a Previously Generated Text.
U.S. Appl. No. 15/188,423, filed Jun. 21, 2016; In re: Reiter, entitled Method and Apparatus for Annotating a Graphical Output.
U.S. Appl. No. 15/421,921, filed Feb. 1, 2017; In re: Reiter, entitled Method and Apparatus for Alert Validation.
U.S. Appl. No. 15/872,826, filed Jan. 16, 2018, U.S. Pat. No. 10,311,145, Issued.
U.S. Appl. No. 14/702,352, filed May 1, 2015, U.S. Pat. No. 9,904,676, Issued.
Related Publications (1)
Number Date Country
20200081985 A1 Mar 2020 US
Continuations (3)
Number Date Country
Parent 15872826 Jan 2018 US
Child 16389523 US
Parent 14702352 May 2015 US
Child 15872826 US
Parent PCT/IB2012/056514 Nov 2012 US
Child 14702352 US