The present invention relates generally to information aggregation systems and, in particular, to a method and apparatus for multi-dimensional graphical representation of queries and results.
Currently, browser-based queries take the form of either text, for example, “gunshot at Michigan and Jackson in Chicago,” or a multimedia file uploaded to a system for search.
One of ordinary skill in the art will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the present invention. Also, common and well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention.
A method and user terminal are provided that graphically formulate a search query. The method and user terminal display, via a display screen, a multi-dimensional graphical representation of a search query space, receive, via a user interface of the user terminal, a plurality of parameters from a user, wherein the parameters define the search query space, position a multi-dimensional icon in the multi-dimensional representation of the search query space, associate one or more of a keyword and multimedia content with the icon, and generate a search query based on the keyword and the position of the icon in the multi-dimensional representation of the search query space. The method and user terminal further may graphically display the results of the corresponding database search, wherein the retrieved content is displayed as one or more icons positioned in a multi-dimensional graph having a plurality of axes associated with the plurality of parameters defining a context of the search query.
Generally, an embodiment of the present invention encompasses a method for graphically formulating a search query. The method includes displaying a multi-dimensional graphical representation of a search query space, receiving a plurality of parameters from a user, wherein the parameters define the search query space, positioning and sizing a multi-dimensional icon in the multi-dimensional representation of the search query space, associating one or more of a keyword and multimedia content with the icon, and generating a search query based on the one or more of the keyword and multimedia content, and the position and size of the icon in the multi-dimensional representation of the search query space.
Another embodiment of the present invention encompasses a method for graphically displaying results of a database search. The method includes retrieving search-related multimedia content from one or more databases based on a search query and displaying the search results in a multi-dimensional graphical format on a display screen, wherein the retrieved multimedia content is displayed as one or more icons positioned in a multi-dimensional graph having a plurality of axes, wherein each axis of the plurality of axes is associated with a parameter of a plurality of parameters defining a context of the search query, and wherein a relationship among search results is indicated.
Yet another embodiment of the present invention encompasses a user terminal that includes a user interface comprising a display screen. The user terminal further includes a processor that is configured to display, via the display screen, a multi-dimensional graphical representation of a search query space, receive, via the user interface, a plurality of parameters from a user, wherein the parameters define the search query space, position and size a multi-dimensional icon in the multi-dimensional representation of the search query space, associate one or more of a keyword and multimedia content with the icon, and generate a search query based on the one or more of the keyword and multimedia content, and the position and size of the icon in the multi-dimensional representation of the search query space.
Still another embodiment of the present invention encompasses a user terminal for graphically displaying results of a database search. The user terminal includes a display screen and a processor that is configured to retrieve search-related multimedia content from one or more databases based on a search query and display, via the display screen, the search results in a multi-dimensional graphical format, wherein the retrieved multimedia content is displayed as one or more icons positioned in a multi-dimensional graph having a plurality of axes, wherein each axis of the plurality of axes is associated with a parameter of a plurality of parameters defining a context of the search query, and wherein a relationship among search results is indicated.
Turning now to the drawings, the present invention may be more fully described with reference to
User terminal 102 may be any kind of user device into which a user may enter a data query and which includes a display for displaying results of that query. More particularly, user terminal 102 includes a user interface 104 via which a user may input a search query into the user terminal, and a display screen 106 for displaying the search query and results of a corresponding search. For example, user terminal 102 may be a wireless mobile device, such as a cellular telephone, a radio telephone, a smart phone, or a personal digital assistant (PDA) with radio frequency (RF) capabilities, may be a personal computer, a laptop computer, or a tablet computer with or without radio frequency (RF) capabilities, or may be a communication console, such as is used in a computer-assisted dispatch (CAD) system, for example, in a public safety or enterprise system. User terminal 102 further includes a network interface 108, for example, a wireless, wireline, or optical interface, for connecting to data network 110.
Servers 120-122 each includes a respective database 130-132 that may be searched by user terminal 102. More particularly, servers 120-122 each includes a server entity that may collect, process, and maintain data in the corresponding database 130-132 and further includes a respective search engine 140-142 that may search the database, or other databases that may be internal or external to the server, in response to receiving a query from user terminal 102. In other embodiments of the present invention, one or more of search engines 140-142 may be external to, and in communication with, a corresponding server 120-122. Servers 120-122 each may be connected to data network 110 via any of a wireless, wireline, or optical connection, or any other connection known in the art. Databases 130-132 maintain multimedia content, such as video recordings, audio recordings, emails, tweets, and/or any other social media, such as Facebook© entries. Further, it is assumed herein that multimedia content stored in databases 130-132 is stored in association with one or more of: one or more content-defining parameters, one or more keywords, and one or more keyword modifiers, as described in greater detail below, and can be retrieved by searching for those parameters/keywords/keyword modifiers.
Referring now to
User terminal 102 further includes user interface 104 and network interface 108, which user interface and network interface are each coupled to processor 202. As described above, network interface 108 may be a wireless, wireline, or optical interface capable of conveying messaging, such as data packets, to, and receiving messaging from, data network 110. User interface 104 includes display screen 106, which display screen may or may not comprise a capacitive touchscreen, and further may include a keypad, buttons, a touch pad, a joystick, a mouse, an additional display, or any other device useful for providing an interface between a user and an electronic device such as user terminal 102 and via which the user may input instructions into the user terminal. For example, the user may select an icon displayed on display screen 106, as described in greater detail below, by touching the icon on the touchscreen or by selecting the icon by use of the mouse. By way of another example, the user may input text in an icon or label an axis of a graph displayed on display screen 106 by selecting the icon or axis and then entering text into the icon/axis via the keypad. Display screen 106 may be a liquid crystal display (LCD), a light emitting diode (LED) display, a plasma display, or any other means for visually displaying information.
At least one memory device 204 includes a display screen driver 206 that is executed by processor 202. Display screen driver 206 comprises data and programs that control an operation of display screen 106, for example, for providing a multi-dimensional graphical representation on the display screen in response to user inputs. Further, when display screen 106 comprises a touchscreen, display screen driver 206 comprises data and programs for sensing a capacitive change in the touchscreen and determining a location of a user's touch on the touchscreen.
At least one memory device 204 further includes a multi-dimensional graphical user interface (GUI) user query converter 208 that, when executed by processor 202, converts a user's graphical query into a machine-readable format for execution by a search engine, such as search engines 140-142, and that converts text received in response to a search to graphical information (text-to-graph conversion) for display on display screen 106. More particularly, display screen driver 206 is configured such that a user of user terminal 102 may enter a search query in a graphical form, that is, by manipulating an icon in a multi-dimensional graphical representation depicted on display screen 106, wherein each dimension, or axis, of the graphical representation corresponds to a parameter to be searched, such as a time and a location of an event. The query then is converted by the display screen driver to an instruction that is converted to searchable code by user query converter 208. Similarly, display screen driver 206 then may present the results of the search in a multi-dimensional graphical representation on display screen 106, again, for example, wherein each dimension, or axis, of the graphical representation corresponds to a searched parameter.
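By way of illustration only, the following sketch shows one possible form such a conversion might take: an icon's axis values, keywords, keyword modifiers, and attached multimedia references are gathered into a structured, machine-readable query. The class name QueryIcon, the field names, and the dictionary output format are hypothetical and are not prescribed by the present description.

```python
# Illustrative sketch only: one possible way a converter such as user query
# converter 208 might turn a positioned icon into a machine-readable query.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class QueryIcon:
    # Position of the icon expressed as a value on each labeled axis.
    axis_values: Dict[str, str]                           # e.g. {"location": "Michigan & Jackson", "time": "21:00"}
    keywords: List[str] = field(default_factory=list)     # e.g. ["Gunshot"]
    modifiers: List[str] = field(default_factory=list)    # e.g. ["2"] for two gunshots
    media_refs: List[str] = field(default_factory=list)   # references to uploaded multimedia files

def icon_to_query(icon: QueryIcon) -> dict:
    """Convert one positioned icon into a structured, machine-readable query."""
    return {
        "parameters": dict(icon.axis_values),
        "keywords": list(icon.keywords),
        "keyword_modifiers": list(icon.modifiers),
        "media": list(icon.media_refs),
    }

# Example: an icon labeled 'GS 2' positioned at (Michigan & Jackson, 21:00).
icon = QueryIcon({"location": "Michigan & Jackson", "time": "21:00"},
                 keywords=["Gunshot"], modifiers=["2"])
print(icon_to_query(icon))
```

The output of icon_to_query could then be serialized and submitted to any of search engines 140-142 in whatever request format those engines accept.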
In other words, communication system 100, and in particular user terminal 102, provides a user of the user terminal with a multimedia query system comprising a multi-dimensional graphical query and result representation. More particularly, user terminal 102 allows the user to input a search query into the user terminal by positioning an icon in a multi-dimensional graph that is displayed on display screen 106, wherein the different dimensions, or axes, of the graph correspond to different parameters of the query, such as location and time, thereby allowing the user to use the positioning of the icon to input the parameters of the query. Further, the user can input text into the icon, thereby inputting one or more keywords and associated parameters to be searched. Furthermore, the user can upload multimedia files into the icon to provide additional search parameters; multimedia files can be indexed to facilitate the search.
For example, and referring now to
The user then may input, and user terminal 102 receive (308), one or more values, or instances, associated with each of one or more parameters (that is, the parameters corresponding to each axis of the graphical representation). Each such value received with respect to a parameter may correspond to a position along the axis associated with that parameter. The user terminal then may label the corresponding positions along the corresponding axis using the provided values. The user then may input, and user terminal 102 receive (310), the values to be used in the search query by positioning a multi-dimensional icon in the multi-dimensional search space, wherein the position of the icon corresponds to a value assigned to the icon in association with each of the multiple parameters corresponding to the multiple axes of the search space. For example, the user may specifically label discrete values along one or more of the axes. In addition, the user may input one or more keywords and, optionally, keyword modifiers and multimedia content or files, that the user terminal then associates (312) with each icon. For example, the user may input text and/or upload multimedia content or files into each icon, which text may comprise one or more keywords, and associated keyword modifiers, to be searched and which multimedia content/files provide additional search parameters; multimedia files can be indexed to facilitate the search.
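As a non-limiting illustration of steps 308 and 310, the sketch below shows one way user-supplied axis labels might be mapped to positions along an axis, and how an icon's drop position might then be translated back into parameter values. The Axis class, the normalized-position convention, and the example labels are assumptions made purely for illustration.

```python
# Illustrative only: mapping axis labels (step 308) and an icon's drop
# position (step 310) to parameter values.
class Axis:
    def __init__(self, name, labels):
        self.name = name          # e.g. "time" or "location"
        self.labels = labels      # discrete, user-supplied values along the axis

    def value_at(self, fraction):
        """Return the labeled value nearest a normalized position (0.0-1.0)."""
        index = round(fraction * (len(self.labels) - 1))
        return self.labels[index]

# A two-dimensional search space: time on the horizontal axis, location on the vertical axis.
time_axis = Axis("time", ["20:00", "21:00", "22:00", "23:00"])
location_axis = Axis("location", ["Adams St", "Jackson Blvd", "Michigan Ave"])

# An icon dropped part way along each axis acquires the corresponding parameter values.
icon_position = {"time": time_axis.value_at(0.33),
                 "location": location_axis.value_at(0.5)}
print(icon_position)   # {'time': '21:00', 'location': 'Jackson Blvd'}
```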
Further, a user may adjust (314) a position of an icon, that is, input an instruction to adjust a position of an icon, in response to which user terminal 102 adjusts the position of the icon and, by repositioning the icon in the search space, or graph, adjusts one or more values of a parameter associated with an icon, thereby redefining a value assigned to the icon in association with the parameter, when generating the search query. For example, with reference to
After inputting his or her search query, comprising the parameters defining the search space, a positioning of an icon in the search space, and an associating of one or more keywords and keyword modifiers with the icon, the user of user terminal 102 then may instruct the user terminal to perform (316) a database search associated with the icon. For example, display screen 106 may include a separate ‘Search’ icon corresponding to an instruction to perform a search based on the positioning of the search space icon(s). When the user selects the ‘Search’ icon, display screen driver 206 converts this selection to an instruction to perform a search based on the positioning of the search space icon(s), and in response to the instruction, user query converter 208 generates (318) a search query based on the position(s) of the icon(s) in the multi-dimensional search space. However, in another embodiment of the present invention, the user of user terminal 102 may instruct the user terminal to perform a database search by individually selecting a particular icon in the multi-dimensional search space, thereby instructing user terminal 102 to generate a search query based on the position of that particular icon in the multi-dimensional graph.
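The two triggering mechanisms described above, a global ‘Search’ selection covering every positioned icon and selection of an individual icon, might be sketched as follows. The dictionary-based icon and query structures are illustrative assumptions only.

```python
# Illustrative sketch of steps 316-318: generating a query from all icons,
# or from one individually selected icon.  Icons are shown as plain
# dictionaries; the structure is an assumption for illustration.
def generate_query(icons):
    """Combine every positioned icon into one search query."""
    return {"events": [dict(icon) for icon in icons]}

def generate_query_for_icon(icon):
    """Generate a search query for one individually selected icon."""
    return {"events": [dict(icon)]}

icons = [
    {"keywords": ["Gunshot"], "modifiers": ["1"],
     "parameters": {"location": "Michigan & Jackson", "time": "21:00"}},
    {"keywords": ["Gunshot"], "modifiers": ["2"],
     "parameters": {"location": "Michigan & Jackson", "time": "21:05"}},
]
print(generate_query(icons))               # search on both events
print(generate_query_for_icon(icons[1]))   # search on the 'two gunshots' event only
```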
User terminal 102, and in particular user query converter 208, then executes a search (320), via one or more of search engines 140-142, of one or more of the multiple databases 130-132 based on the search query, and downloads (322) the results of the search, that is, search-related content, to user terminal 102. User terminal 102 then may display (324) the search results in a multi-dimensional graphical format, as described in greater detail with respect to
For example, and referring now to
For example, in
Further, as depicted in
Referring now to
That is, similar to
However, in
Thus, when the user instructs user terminal 102 to generate a search query based on the positioning of icons 406 and 408 in
Referring now to
Thus, when the user instructs user terminal 102 to generate a search query based on the positioning of icons 406, 408, and 602 in
In other embodiments of the present invention, the user may individually select a particular icon in the multi-dimensional graph, such as individually selecting one of icons 406, 408, and 602, thereby instructing user terminal 102 to generate a search query based on the positioning of that particular icon in the multi-dimensional graph.
Referring now to
The user of user terminal 102 then may select an icon, in response to which display screen driver 206 generates an instruction (708) to display all search results, that is, content, associated with that icon/event. In response to receiving the instruction, user terminal 102 displays (710) the content associated with the selected icon, and logic flow diagram 700 ends (712).
For example, and referring now to
For example, in
Further, each of icons 806, 808, and 810 is positioned at a different position along the horizontal, for example, time, axis 802. Thus, it may be inferred that a second event-defining parameter, time, is different for each of the events represented by icons 806, 808, and 810, and further that an event represented by icon 808 occurred later in time than an event represented by icon 806, and that an event represented by icon 810 occurred later in time than the event represented by icon 808.
In addition, each icon includes text corresponding to keywords, and optionally keyword modifiers, searched with respect to the corresponding event. For example, icon 806 includes the text ‘GS 1,’ corresponding to the keyword ‘Gunshot’ (GS) and the keyword modifier ‘1’ (that is, one gunshot). Similarly, icon 808 includes the text ‘GS 2,’ corresponding to the keyword ‘Gunshot’ (GS) and the keyword modifier ‘2’ (that is, two gunshots). And icon 810 includes the text ‘C-TIRE,’ corresponding to the keywords ‘Car Tire Screech.’
User terminal 102 then may display the retrieved content by indicating, in multi-dimensional graphical representation 800, which of the events have been found in the search. For example, user terminal 102 may highlight an icon to indicate that content associated with the event corresponding to that icon has been found. For example, in multi-dimensional graphical representation 800, icon 808 is highlighted (by shading). This may be interpreted to mean that “media content was found that meets at least one parameter of the ‘two gunshots’ query.” For example, a video with two consecutive gunshots may have been found and/or a tweet noting the occurrence of two consecutive gunshots may have been found. The user of user terminal 102 then may select the highlighted icon, that is, icon 808, to instruct the user terminal to display all search results, that is, content, associated with that icon/event, such as all video recordings, audio recordings, emails, tweets, and/or any other social media associated with that event, such as Facebook© entries. The search results, that is, the retrieved content, then are displayed by user terminal 102 on display screen 106, for example, in a list, and the user then may select particular content to view by inputting a selection of an item in the list.
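One possible, purely illustrative way of deciding which icons to highlight is sketched below: an icon is highlighted if at least one retrieved item meets at least one of that icon's query terms (its keyword/modifier pair or one of its parameter values). The field names and the matching rule are assumptions and are not required by the present description.

```python
# Illustrative only: mapping retrieved content back onto query icons so that
# icons with at least one match are highlighted in the results graph.
def matches(icon, result):
    """True if the retrieved result meets at least one of the icon's query terms."""
    same_event = (icon["keyword"], icon["modifier"]) == \
                 (result.get("keyword"), result.get("modifier"))
    same_parameter = any(result.get("parameters", {}).get(name) == value
                         for name, value in icon["parameters"].items())
    return same_event or same_parameter

def icons_to_highlight(icons, results):
    """Return the ids of icons that should be highlighted in the results graph."""
    return {icon["id"] for icon in icons
            if any(matches(icon, result) for result in results)}

icons = [
    {"id": 806, "keyword": "Gunshot", "modifier": "1",
     "parameters": {"time": "21:00", "location": "Michigan & Jackson"}},
    {"id": 808, "keyword": "Gunshot", "modifier": "2",
     "parameters": {"time": "21:05", "location": "Michigan & Jackson"}},
]
# A tweet reporting two consecutive gunshots at about 21:05; location not determined.
results = [{"type": "tweet", "keyword": "Gunshot", "modifier": "2",
            "parameters": {"time": "21:05"}}]
print(icons_to_highlight(icons, results))   # {808}
```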
Referring now to
For example, the search may have produced a tweet of someone mentioning a gunshot at Michigan Avenue, and a few minutes later the same person tweeting about two gunshots at Michigan Avenue. Or the search may have produced an audio recording where one gunshot is followed a few minutes later by two gunshots, where the location is unknown. The user of user terminal 102 then may select the highlighting stripe 812 to instruct the user terminal to display all search results. Again, the retrieved content may be presented by user terminal 102 on display screen 106 in a list that is ordered based on the number and the importance of parameters/keywords/keyword modifiers/multimedia content or files met by the content, with content meeting a larger number and importance of the parameters/keywords/keyword modifiers/multimedia content or files associated with the event (for example, content wherein the one gunshot and the following two gunshots are indicated to be close in time and at a same location) being positioned higher in the list.
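A minimal sketch of such an ordering, assuming illustrative relevance weights for keywords, keyword modifiers, and parameters, is given below; the weights and field names are hypothetical.

```python
# Illustrative only: order retrieved content by the number and weight of
# query terms it meets, so the best-matching items appear first in the list.
WEIGHTS = {"keyword": 3.0, "keyword_modifier": 2.0, "parameter": 1.0}

def relevance(result, query):
    """Score one result against the query; higher scores rank first."""
    score = WEIGHTS["keyword"] * len(set(result.get("keywords", []))
                                     & set(query.get("keywords", [])))
    score += WEIGHTS["keyword_modifier"] * len(set(result.get("modifiers", []))
                                               & set(query.get("modifiers", [])))
    score += WEIGHTS["parameter"] * sum(
        1 for name, value in query.get("parameters", {}).items()
        if result.get("parameters", {}).get(name) == value)
    return score

def rank_results(results, query):
    """Order retrieved content so the best-matching items are listed first."""
    return sorted(results, key=lambda result: relevance(result, query), reverse=True)

query = {"keywords": ["Gunshot"], "modifiers": ["1", "2"],
         "parameters": {"location": "Michigan & Jackson"}}
results = [
    {"type": "tweet", "keywords": ["Gunshot"], "modifiers": ["2"], "parameters": {}},
    {"type": "audio", "keywords": ["Gunshot"], "modifiers": ["1", "2"],
     "parameters": {"location": "Michigan & Jackson"}},
]
# The audio recording meets more terms, so it is listed first.
print([result["type"] for result in rank_results(results, query)])   # ['audio', 'tweet']
```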
Referring now to
For example, suppose someone heard a gunshot at the intersection of Michigan Avenue and Jackson Boulevard and then pulled out his or her camera phone to record events. The person then recorded video that includes the sound of two more gunshots and that further depicts a person running away from an apparent location of the gunshots, getting in a car on Adams Street, and speeding away. Or perhaps, instead of actually recording video of the person getting in the car and driving away, the person just recorded the sound of the car screeching. Further, suppose the person recording the event then uploaded the video to a social media website, such as Facebook©. Such videos would meet all, or nearly all, of the search parameters/keywords/keyword modifiers associated with icons 808 and 810, and accordingly would be retrieved by one or more of search engines 140-142 and positioned near the top of a list of search results.
Further, and referring now to
Referring now to
More particularly, graphical representation 1301 comprises four icons 1306-1309 that each represents a person who is searched. For example, icon 1306 includes a keyword identifying a ‘Person 1’ (P1 ID), icon 1307 includes a keyword identifying a ‘Person 2’ (P2 ID), icon 1308 includes a keyword identifying a ‘Person 3’ (P3 ID), and icon 1309 includes a keyword identifying a ‘Person 4’ (P4 ID). Further, icons 1306-1309 are approximately aligned in time (along the horizontal axis), indicating that a parameter of the search query is content that includes one or more of these persons and which content is of approximately the same time (which searched time is the “present” time).
Graphical representation 1321, displaying the search results, also depicts the content-defining parameters, that is, “person” on the vertical axis and again “time” on the horizontal axis, and the events searched (icons 1306-1309). However, graphical representation 1321 further displays, via an icon-based representation, the search results, again by highlighting icons to indicate that content associated with the event corresponding to that icon has been found. For example, highlighted icon combination 1314 may indicate that the search produced media, or content, placing Persons 1 and 2 (corresponding to icons 1306 and 1307) at a same location but at different times. By way of another example, highlighted icon combination 1316 may indicate that the search produced media, or content, placing Persons 1 and 2 (again, corresponding to icons 1306 and 1307) at a same location at approximately a same time. By way of yet another example, highlighted icon combination 1318 may indicate that the search produced media, or content, placing Person 1 (corresponding to icon 1306) at a same location at two different times. And further, graphical representation 1321 includes a textual summary 1322 of the depicted search results.
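By way of illustration, relationships such as those indicated by highlighted icon combinations 1314, 1316, and 1318 might be detected with logic of the following general form; the ten-minute tolerance for “approximately the same time” and the record layout are assumptions made for the sketch only.

```python
# Illustrative only: detect pairwise relationships among retrieved results
# (same location at different times, or at approximately the same time).
from itertools import combinations

SAME_TIME_WINDOW_MIN = 10   # assumed tolerance for "approximately the same time"

def detect_relationships(results):
    """Yield (result_a, result_b, relationship) tuples for related result pairs."""
    for a, b in combinations(results, 2):
        if a["location"] != b["location"]:
            continue
        if abs(a["time_min"] - b["time_min"]) <= SAME_TIME_WINDOW_MIN:
            yield a, b, "same location, approximately same time"
        else:
            yield a, b, "same location, different times"

results = [
    {"person": "Person 1", "location": "Michigan Ave", "time_min": 0},
    {"person": "Person 2", "location": "Michigan Ave", "time_min": 5},
    {"person": "Person 1", "location": "Michigan Ave", "time_min": 120},
]
for a, b, relationship in detect_relationships(results):
    print(a["person"], "and", b["person"], "->", relationship)
```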
Again, the user may select particular content to view by inputting a selection of a particular highlighting stripe, in response to which user terminal 102 generates an instruction for the display of, and responsively displays on display screen 106, all search results, that is, content, associated with that highlighting stripe, such as a list of all video recordings or audio recordings associated with that highlighting stripe.
As described above, a method and apparatus are provided for presenting search queries and results in a multi-dimensional graphical representation. The axes of the graph may be any content-defining parameters sought to be searched, for example, any parameters that may be used to define an event that is being searched. While examples herein disclose the parameters time, location, and persons, any other event-defining parameter may be used that may occur to one who wishes to perform a database search. Further, while the examples herein disclose a two-dimensional search space, this is not meant to limit the invention as additional dimensions may be searched, for example, time, location, and persons. Further, as indicated, the values assigned to the content-defining parameters (the axes of a search space) need not be proximate to each other; for example, various values along a ‘location’ axis may be cities, such as ‘Chicago’ and ‘Paris,’ or various values along a ‘time’ axis may be hours, days, months, or years.
The search results depicted in
Thus, the multi-dimensional graphical representation may display any kind of relationship of interest. Further, the graphical query may set out a parameter range, rather than merely specifying specific instances of a parameter. For example, in another embodiment of the present invention, graphical representation 1301 may correspond to a search query “find all media from the past 4 months (‘present’ to ‘−4’ months) that includes any of these four people and display any relationship that is detected among them.”
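Such a range-based query might be expressed, for illustration only, as follows; the query structure, the field names, and the use of a “report_relationships” flag are assumptions rather than features of the present description.

```python
# Illustrative only: an axis value expressed as a range ("the past 4 months")
# rather than a discrete instance, combined with a set of person keywords.
from datetime import datetime, timedelta

def range_query(people, months_back=4, now=None):
    """Build a query over a time range and a set of person keywords."""
    now = now or datetime.now()
    return {
        "keywords": list(people),
        "parameters": {
            "time": {"from": now - timedelta(days=30 * months_back), "to": now},
        },
        "report_relationships": True,   # ask that any detected relationships be returned
    }

query = range_query(["P1 ID", "P2 ID", "P3 ID", "P4 ID"])
print(query["parameters"]["time"]["from"], "->", query["parameters"]["time"]["to"])
```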
Further, when the user selects a highlighting stripe, the relationship can also be revealed graphically. For example, the highlighting stripe 1314 in
A user then may review, for example, view or listen to, the retrieved media by selecting an icon in a graphical representation of the search results, thereby instructing user terminal 102 to display all media, for example, a list of retrieved media, associated with that icon.
In the foregoing specification, specific embodiments have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of present teachings.
The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all of the claims. The invention is defined solely by the appended claims, including any amendments made during the pendency of this application and all equivalents of those claims as issued.
Moreover, in this document, relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “has,” “having,” “includes,” “including,” “contains,” “containing,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises, has, includes, or contains a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a,” “has . . . a,” “includes . . . a,” or “contains . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises, has, includes, or contains the element. The terms “a” and “an” are defined as one or more unless explicitly stated otherwise herein. The terms “substantially,” “essentially,” “approximately,” “about,” or any other version thereof, are defined as being close to as understood by one of ordinary skill in the art, and in one non-limiting embodiment the term is defined to be within 10%, in another embodiment within 5%, in another embodiment within 1%, and in another embodiment within 0.5%. The term “coupled” as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is “configured” in a certain way is configured in at least that way, but may also be configured in ways that are not listed.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.