Overview user interface of emergency call data of a law enforcement agency

Information

  • Patent Number
    10,042,524 (Patent Grant)
  • Date Filed
    Tuesday, December 23, 2014
  • Date Issued
    Tuesday, August 7, 2018
Abstract
Techniques in this disclosure may provide a user interface that concurrently displays multiple panels which provide visualization of emergency call data of a law enforcement agency. The user interface can provide a high-level overview of emergency calls in a geographical area. Each panel in the user interface can provide visualization of the emergency calls and/or statistics relating to the calls. A user can customize which panels to include in the user interface and/or customize settings for each panel. The user may apply various types of filters to the data displayed in the user interface, and the panels can update the visualizations according to the filters. The user interface can also provide the ability to show data at various levels of detail within the same user interface or panel. The techniques in the disclosure can provide a convenient, digestible overview of tactical and/or strategic data in a single user interface.
Description
TECHNICAL FIELD

The present disclosure relates to systems and techniques for data integration, analysis, and visualization. More specifically, the present disclosure relates to visualization of law enforcement agency data.


BACKGROUND

Law enforcement agencies (e.g., a police department of a city) can monitor emergency calls in a designated area. Systems and methods for allowing such agencies to better (e.g., more quickly, more accurately, etc.) interact with such data are desired.


SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.


In one embodiment, a computer system configured to provide a customizable user interface relating to visualization of data associated with a law enforcement agency comprises: one or more hardware computer processors configured to execute code in order to cause the system to: generate a user interface configured to concurrently display a plurality of panels each including a visual representation based on emergency call data of a law enforcement agency, the emergency call data comprising data associated with a plurality of emergency calls, wherein the plurality of panels comprises at least: a first panel displaying a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable precinct indicators representing a corresponding plurality of precincts for which the law enforcement agency has at least some law enforcement responsibilities, the first panel configured to: in response to receiving a selection of a particular precinct indicator corresponding to a particular precinct, update the first panel to display one or more emergency call indicators representing a corresponding one or more emergency calls within the particular precinct; and in response to receiving a selection of a particular emergency call indicator corresponding to a particular emergency call, update the first panel to display information relating to the particular emergency call.


In another embodiment, a method of providing a customizable user interface relating to visualization of data associated with a law enforcement agency comprises: generating, using one or more hardware computer processors, a user interface configured to concurrently display a plurality of panels each including a visual representation based on emergency call data of a law enforcement agency, the emergency call data comprising data associated with a plurality of emergency calls; displaying in the user interface at least a first panel of the plurality of panels, the first panel displaying a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable precinct indicators representing a corresponding plurality of precincts for which the law enforcement agency has at least some law enforcement responsibilities; in response to receiving a selection of a particular precinct indicator corresponding to a particular precinct, updating the first panel to display one or more emergency call indicators representing a corresponding one or more emergency calls within the particular precinct; and in response to receiving a selection of a particular emergency call indicator corresponding to a particular emergency call, updating the first panel to display information relating to the particular emergency call.


In yet another embodiment, a non-transitory computer readable medium comprises instructions for providing a customizable user interface relating to visualization of data associated with a law enforcement agency that cause a computer processor to: generate a user interface configured to concurrently display a plurality of panels each including a visual representation based on emergency call data of a law enforcement agency, the emergency call data comprising data associated with a plurality of emergency calls; display in the user interface at least a first panel of the plurality of panels, the first panel displaying a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable precinct indicators representing a corresponding plurality of precincts for which the law enforcement agency has at least some law enforcement responsibilities; in response to receiving a selection of a particular precinct indicator corresponding to a particular precinct, update the first panel to display one or more emergency call indicators representing a corresponding one or more emergency calls within the particular precinct; and in response to receiving a selection of a particular emergency call indicator corresponding to a particular emergency call, update the first panel to display information relating to the particular emergency call.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates one embodiment of a user interface comprising multiple panels for visualizing various aspects of emergency call data of a law enforcement agency.



FIGS. 2A-2E illustrate one embodiment of a panel for visualizing emergency calls in a geographical area associated with a law enforcement agency at various levels of detail.



FIG. 3 illustrates one embodiment of a panel for visualization of high priority emergency calls in a geographical area associated with a law enforcement agency.



FIG. 4 illustrates one embodiment of a panel for visualization of statistics comparing past jobs and current jobs.



FIG. 5 illustrates one embodiment of a panel for visualization of statistics relating to common radio codes.



FIG. 6 illustrates one embodiment of a panel for visualization of statistics relating to common radio subcodes.



FIG. 7 illustrates another embodiment of a user interface comprising multiple panels for visualizing data of a law enforcement agency.



FIGS. 8A-8I illustrate various panels that can be included in a user interface for visualizing data of a law enforcement agency.



FIG. 9 illustrates a flowchart for providing a user interface including multiple panels for visualizing emergency call data of a law enforcement agency, according to certain embodiments.



FIG. 10 illustrates one embodiment of a database system using an ontology.



FIG. 11 illustrates a computer system with which certain methods discussed herein may be implemented.





DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview


Techniques in this disclosure may provide a user interface that concurrently displays multiple panels which provide visualization of emergency call data of a law enforcement agency, such as data that is specifically pertinent to a particular agency director/supervisor. For example, the panels on an “executive dashboard” may be customized to include the most relevant/useful data for a particular user, department, or agency. In one embodiment, the panels provide summary data that is useful for a director, supervisor, or other “executive,” while still allowing the executive to drill down into the data in order to view detailed information about any summarized data. For example, top officials or decision makers of a law enforcement agency, such as police chiefs or sheriffs, may not be interested in the details of each call, but would want to know which sections or divisions in the geographical area have a high level of unassigned calls.


The user interface can provide a high-level overview of emergency calls in a geographical area that is overseen by the law enforcement agency. Examples of law enforcement agencies include police departments (e.g., NYPD, LAPD, etc.), sheriff's departments, etc. Each panel in the user interface can provide a visualization (e.g., a chart, graph, diagram, list, map, drawing, etc.) indicating aspects related to emergency calls and/or statistics relating to the calls. For example, the user interface can include a panel that displays the number of emergency calls by precinct or section on a map of the geographical area.


A user (e.g., system administrator, analyst, etc.) can customize which panels to include in the user interface and/or customize settings for each panel. For instance, the user can specify which panels should be included in the user interface, dimensions and/or size of a panel, the order in which the panels should be displayed, etc. Then, the user interface can determine the optimal layout and display the panels according to the user-specified requirements. In one embodiment, a first user, such as a system administrator, customizes the user interface for use by one or more other users, such as analysts. Alternatively, the user that views and interacts with the user interface may customize the user interface.


The user may apply various types of filters to the data displayed in the user interface, and the panels can update the visualizations according to the filters. For example, the user may wish to view emergency calls in a specific time frame and apply a filter for that time frame, and the visualizations in all or some of the panels can be automatically updated to show the results of applying the filter. The user interface can also provide the ability to show data at various levels of detail within the same user interface. For instance, the user can click on an emergency call in the precinct or section, and the details of the call can be displayed in the same panel. In this manner, the techniques in the disclosure can provide a convenient, digestible overview of tactical and/or strategic data in a single user interface.


In one embodiment, the executive dashboard allows a user to easily identify trends (geographic, temporal, etc.) based on Computer Aided Dispatch (CAD) Job data. Maps allow the user to visualize CAD Jobs geographically, with the option to view the map with a street view or based on precinct zones. Various charts also display time-based trends, allowing users to compare current conditions with conditions from last week (or another time in the past). Filters allow the user to drill down further based on various criteria, including radio code, precinct, and status. These features, as well as others that may be implemented in various embodiments, are discussed further below.


Example Executive Dashboard



FIG. 1 illustrates one embodiment of a user interface 100 comprising multiple panels 110 (including panels 110a, 110b, 110c, 110d, and 110e) for visualizing various aspects of emergency call data of a law enforcement agency. In the example of FIG. 1, the user interface 100 includes five different panels 110. Each panel is explained in more detail with respect to FIGS. 2-6. The first panel 110a is a map of a geographical area under the control of the law enforcement agency. Details relating to the first panel are explained with respect to FIGS. 2A-2E. The second panel 110b is a table of high priority calls for the geographical area. Details relating to the second panel are explained with respect to FIG. 3. The third panel 110c is a bar chart showing a comparison of the number of jobs at the current time and at a past time. Details relating to the third panel are explained with respect to FIG. 4. The fourth panel 110d is a bar chart showing the top radio codes. Details relating to the fourth panel are explained with respect to FIG. 5. The fifth panel 110e is a bar chart showing the top radio subcodes. Details relating to the fifth panel are explained with respect to FIG. 6.


As explained above, the panels 110 in the user interface 100 can be customized, and the settings for a panel 110 can also be customized. The user can choose the types of panels 110 that are included in the user interface 100. Some examples of types of panels 110, where panels are referenced based on the type of visualization included with the panels, include: map, table, bar chart, line graph, flow diagram, word cloud diagram, etc. Visualizations may also be referred to as “visual representations.” These types of panels 110 are further explained with respect to FIGS. 2-8. The user may also define the order in which the panels 110 should be displayed. A layout engine can generate or render the user interface 100 and the panels 110 included in the user interface 100, and can display the panels 110 in the order specified.


The user can also specify the dimensions or size for a particular panel 110 and/or multiple or all panels 110. In one embodiment, the size of a panel 110 is defined by number of rows and columns, and the layout engine renders the panels 110 in the user interface 100 based on the size of each panel 110. Specifying in terms of rows and columns may make the process simpler since the user does not have to use pixels. For instance, the rows and columns may be defined by a unit that is associated with a group of pixels. The sizes of panels 110 can be the same or different from each other. In some embodiments, the user may define the number of columns in the user interface 100, and the panels 110 are arranged in columns. The customization setting for the panels 110 may be specified for an individual, a group of individuals, the entire organization, etc.
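

By way of a non-limiting illustration, the following TypeScript sketch shows one possible way a layout engine could place panels specified in rows and columns; the PanelSpec type, the per-cell pixel sizes, and the left-to-right flow placement are assumptions made for this example and are not taken from the disclosure.

    // Hypothetical sketch of a row/column panel layout, assuming a fixed unit
    // size per grid cell; not the actual layout engine described herein.
    interface PanelSpec {
      id: string;
      rows: number;     // height in grid rows
      cols: number;     // width in grid columns
      order: number;    // user-specified display order
    }

    interface PlacedPanel extends PanelSpec {
      x: number;        // left offset in pixels
      y: number;        // top offset in pixels
    }

    const CELL_WIDTH = 320;   // assumed pixels per grid column
    const CELL_HEIGHT = 240;  // assumed pixels per grid row

    // Place panels left to right in the specified order, wrapping to a new row
    // when the dashboard's column budget is exhausted.
    function layoutPanels(panels: PanelSpec[], dashboardCols: number): PlacedPanel[] {
      const placed: PlacedPanel[] = [];
      let col = 0;
      let row = 0;
      let rowHeight = 0;
      for (const p of [...panels].sort((a, b) => a.order - b.order)) {
        if (col + p.cols > dashboardCols) {     // wrap to the next row
          col = 0;
          row += rowHeight;
          rowHeight = 0;
        }
        placed.push({ ...p, x: col * CELL_WIDTH, y: row * CELL_HEIGHT });
        col += p.cols;
        rowHeight = Math.max(rowHeight, p.rows);
      }
      return placed;
    }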


The user interface 100 may also include one or more filters 150 to apply to the data displayed in the user interface 100. Types of filters can include: keyword filters, date filters, area filters (e.g., precinct or other geographic filters), call information filters (e.g., radio code, caller information, etc.), or filters based on any other attribute associated with the emergency call data. Such filters may be configured using any available user interface controls, such as radio buttons, drop-down lists, text entry fields, etc. The example in FIG. 1 includes a radio code filter 150a, a time filter 150b, a precinct filter 150c, and a patrol borough filter 150d. In the radio code filter 150a, the user can type in the radio code of interest (or select via any other user interface controls), and the user interface 100 filters the results in one or more panels 110 according to the radio code. The radio code filter 150a may accept one or more radio codes. A “radio code” may refer to a code used by a law enforcement agency to indicate a specific type of situation and/or action. For example, a particular radio code can indicate robbery or burglary. In one embodiment, radio codes may be grouped such that the user (e.g., an executive) can filter data included in the various panels based on a group identifier (e.g., violent crimes) that includes multiple radio codes.


The time filter 150b can accept a time frame or time period from the user. The time frame may be a range, a specific time, or a specific date. The time filter 150b can be selected using a date picker, a drop down menu that provides a list of time frames as options, or other user interface controls. The precinct filter 150c and the patrol borough filter 150d can filter the data by precinct and by patrol borough, respectively. They may be drop down menus (or other user interface controls) that provide a list of precincts and a list of patrol boroughs. In one example, a patrol borough consists of one or more precincts (e.g., in New York City). The list of precincts provided by the precinct filter 150c can vary based on the borough selected in the patrol borough filter 150d. A “precinct” may refer to a geographical section or division within a geographical area that is served by or is under the jurisdiction of a particular law enforcement agency. Different terms may be used by various law enforcement agencies to refer to such geographical section or division within the geographical area (e.g., “area”, “division,” etc.).
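

As a non-limiting sketch of how such filters might be applied to the underlying call data, the TypeScript example below filters call records by radio code, time frame, precinct, and patrol borough; the CallRecord and CallFilters shapes are illustrative assumptions rather than the actual data model.

    // Hypothetical call record and filter shapes; field names are illustrative.
    interface CallRecord {
      jobId: string;
      radioCode: string;
      precinct: string;
      patrolBorough: string;
      receivedAt: Date;
    }

    interface CallFilters {
      radioCodes?: string[];                  // one or more radio codes
      timeRange?: { from: Date; to: Date };   // time frame filter
      precinct?: string;
      patrolBorough?: string;
    }

    // Return only the calls that satisfy every filter that is set.
    function applyFilters(calls: CallRecord[], f: CallFilters): CallRecord[] {
      return calls.filter(c =>
        (!f.radioCodes || f.radioCodes.includes(c.radioCode)) &&
        (!f.timeRange || (c.receivedAt >= f.timeRange.from && c.receivedAt <= f.timeRange.to)) &&
        (!f.precinct || c.precinct === f.precinct) &&
        (!f.patrolBorough || c.patrolBorough === f.patrolBorough)
      );
    }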


The type and/or content of information displayed in the panels 110 and/or the user interface 100 may vary depending on the requirements of a law enforcement agency. For example, one law enforcement agency could be interested in viewing emergency call data; this law enforcement agency may want to view data for various precincts and quickly determine which precincts have a backlog in terms of resource assignment. The user interface 100 and the panels 110 can show which precincts have a high number of jobs that are not assigned to a resource. On the other hand, another law enforcement agency might be more interested in viewing arrest data. Examples of user interface and panels displaying arrest data are discussed in detail in connection with FIGS. 7-8.


The user interface 100 may be generated by systems described with respect to FIGS. 10-11. Such systems can be based on object-centric data models. For example, any data displayed in the user interface 100 may be represented by data objects, links, relationships, etc. as explained with respect to FIGS. 10-11. In some embodiments, the user interface 100 can be provided in connection with the systems described in connection with FIGS. 10-11. For instance, the user interface 100 may be implemented as an overview layer for such systems, e.g., based on the data of the systems. The user interface 100 can be generated by a platform configured to build the overview layer. Details relating to systems that may generate or create the user interface 100 are further explained with respect to FIG. 7.


A panel 110 can include any type of visualization. For example, the panels 110 in the user interface 100 may include any of the visualizations illustrated in FIGS. 8A-8I, in addition to the visualizations illustrated in FIGS. 2-6. The user interface 100 and the panels 110 in the user interface 100 can be easily configured and customized. The panels 110 may be easily rearranged (e.g., by drag and drop), and the dimensions of the panels 110 can be specified in a simple manner (e.g., by number of rows and columns or resizing with a mouse or touchscreen input). Details regarding customizability of the user interface 100 are further explained with respect to FIGS. 7-8.


Example Drill-Down of Emergency Call Data



FIGS. 2A-2E illustrate one embodiment of a panel 200 for visualizing emergency calls in a geographical area associated with a law enforcement agency at various levels of detail. The panel 200 displays a map 260 of a geographical area under the control of the law enforcement agency, which can provide a high-level overview of emergency call activity in the area, while allowing the user to zoom in on particular areas to be provided with lower-level details. Moving from general information to more detailed information may be referred to as “drilling down.” The panel 200 can provide the ability to drill down within the actual panel 200, which may be one of multiple panels included in an executive dashboard.


In FIG. 2A, the panel 200 displays a map 260 of precincts that are under the jurisdiction of the law enforcement agency. The number of emergency calls for a precinct can be shown in circles or orbs 230. For example, the number for circle 230a is “15,” which indicates that the current number of emergency calls in that precinct is 15. The circles 230 may serve as visual indicators for different precincts and may be selectable. In some embodiments, other indicators of data associated with particular precincts may be used, such as other shapes, overlays, etc. For example, in one embodiment, a precinct area on a map may be overlaid with a color, gradient, shading, etc. to indicate data associated with the precinct (e.g., color may be shaded differently based on quantity of calls in a precinct relative to other precincts) and/or selection of a precinct. The circles 230 may be color coded or otherwise distinguished (e.g., by pattern) according to the value of the number. A legend 210 may provide an explanation for a color or a pattern. In FIG. 2A, the patterns indicate a quantity of calls within each area. In this example, a first pattern in the legend 210 refers to less than 10 emergency calls; a second pattern refers to between 10 and 100 calls; and a third pattern refers to more than 100 calls. Depending on the embodiment, patterns may be replaced with or supplemented by other visual indications, such as different colors of circles (or other shaped indicator), font changes, blinking text/circles, etc. Furthermore, in some embodiments, the pattern criteria may be user defined and/or automatically adjusted by the system. For example, a user may adjust patterns (or colors or other visual indicators) to distinguish between precincts having less than 20 calls, between 20 and 200 calls, and more than 200 calls. In one embodiment, the system may automatically select the most informative ranges based on quantities of calls in various precincts, for example. In some embodiments, the patterns (or other visual indicators) may be based on percentages (e.g., calls per population of a precinct) or other parameters. FIGS. 2A-2D use the term “pin,” which may refer to a visual indicator that represents a particular emergency call. FIG. 2A does not include any pins, but shows another legend 220 that provides explanations for colors or patterns used for the pins.
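

The following TypeScript sketch illustrates one possible way to compute per-precinct call counts and map each count to a legend bucket such as the less-than-10 / 10-to-100 / more-than-100 ranges described above; the thresholds and bucket names are assumptions for illustration and could equally be user defined or selected automatically.

    // Hypothetical sketch: count calls per precinct and map each count to a
    // legend bucket. Default thresholds mirror the <10 / 10-100 / >100 ranges.
    type Bucket = "low" | "medium" | "high";

    function countsByPrecinct(calls: { precinct: string }[]): Map<string, number> {
      const counts = new Map<string, number>();
      for (const c of calls) {
        counts.set(c.precinct, (counts.get(c.precinct) ?? 0) + 1);
      }
      return counts;
    }

    function bucketFor(count: number, thresholds = { low: 10, high: 100 }): Bucket {
      if (count < thresholds.low) return "low";       // e.g., lightly shaded circle
      if (count <= thresholds.high) return "medium";  // e.g., medium shading
      return "high";                                  // e.g., darkest shading
    }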



FIG. 2B illustrates the same panel 200 as in FIG. 2A, now with a particular precinct selected on the map 260 in the panel 200. For example, the user hovers over the circle 230a (with a mouse pointer, touchscreen, or other input device) for a precinct, and the boundary for that precinct is indicated on the map 260. The user can click on the circle 230a to select the precinct. FIG. 2C illustrates the panel 200 after the user has selected circle 230a, wherein the panel 200 now includes a magnified view of the geographic area of the precinct associated with circle 230a and the surrounding area.



FIG. 2D illustrates the panel 200 updated so the map 260 displays the pins representing emergency calls within the particular selected precinct (associated with circle 230a in this example). In this embodiment, emergency calls can be represented by visual indicators, which can be referred to as “pins” as explained above, and the pins 240 can be color-coded or otherwise distinguished (e.g., by pattern) according to type of the emergency call, time call has been open, urgency of call, and/or other characteristics of the call. The legend 220 can provide information relating to each color or pattern used for pins 240. The pin details illustrated in FIG. 2D may be displayed in response to the user selecting the circle 230a for the precinct in the panel 200 of FIG. 2C or may be displayed initially when the map is magnified in response to selecting circle 230a in the panel 200 of FIG. 2B.


In FIGS. 2A-2D, the legend 220 lists the radio codes used to classify the emergency calls. The pins 240 may have different shapes. For example, square pins 240b may be used to indicate emergency calls that occurred within the past hour, whereas circular pins 240a may be used as the default shape.



FIG. 2E illustrates the panel 200 displaying details relating to an emergency call when the user clicks on (or hovers over or otherwise selects) a particular pin 240a. In this embodiment, the details are displayed near the pin 240a. The details may include the job number, the time and date of the job, location of the job, comments, etc. A job may be created for one or more emergency calls, and the law enforcement agency may manage various emergency calls by handling and processing the jobs. In this manner, the user interface 100 can display general information and detailed information in the same panel 200. More detailed information may be displayed in a separate window within the panel 200. For example, the emergency call can be represented as an object by the system that provides the user interface, and a separate pop-up window can be displayed to show various properties relating to the object. The window can have some degree of transparency so that it does not block the visualization behind it. The separate window may be referred to as the “object inspector.”


If any filters are applied in the user interface, as in FIG. 1, the data in the panel 200 may be filtered accordingly. For example, the emergency calls displayed in the panel 200 may be calls for a particular time frame (e.g., a specific shift of the police department). In certain embodiments, the map 260 may include other features and functionalities described in co-pending U.S. patent application Ser. No. 13/917,571, filed Jun. 13, 2013, entitled “INTERACTIVE GEOSPATIAL MAP,” which is incorporated by reference in its entirety.


Other Example Panels



FIG. 3 illustrates one embodiment of a panel 300 for visualization of high priority emergency calls in a geographical area associated with a law enforcement agency. Priority can refer to the urgency or importance level of a job. Lower numbers may indicate higher priority. In some implementations, priority is determined based on the radio code, such as automatically by the system (e.g., radio codes are mapped to priorities so that the system can automatically look up a priority when a radio code is known) or manually by the dispatch operator. The priority can be updated based on circumstances related to a call, such as by officers on the scene or by changing events. In the embodiments discussed herein, lower numbers indicate higher priorities, but in other embodiments different priority scales and orders may be used to distinguish priorities of particular calls.


In the example of FIG. 3, the panel 300 is a table or tabular visualization. In FIG. 3, the panel 300 provides a table 310 of high priority calls. The table 310 can be organized as a number of pages, and one page may be displayed at a time. Multiple pages of the table 310 can be navigated using “prev,” “next,” “first,” and “last” buttons. Each page in the table may be scrollable within the panel 300. The information in the table 310 can be displayed in rows and columns. Types of information displayed in the table 310 can include date, job ID or number, precinct, priority, etc. The table 310 can support sorting, and the emergency calls in the table 310 may be sorted based on priority. For example, the table 310 displays the calls with higher priority (e.g., priority level “3” in FIG. 3) first, then displays the calls with lower priority (e.g., priority level “4”). The calls may be sorted by priority and then sorted by time within the same priority level. The example of FIG. 3 shows calls with priority level “3” first, and within the calls having priority level “3”, more recent calls are listed first.
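

A minimal TypeScript sketch of the sort order described above (higher priority first, then most recent first within a priority level) might look like the following; the PriorityCall fields are assumptions for illustration.

    // Hypothetical sketch: sort high-priority calls by priority (lower number
    // first), then by time (most recent first) within the same priority level.
    interface PriorityCall {
      jobId: string;
      priority: number;     // lower number = higher priority
      receivedAt: Date;
    }

    function sortHighPriorityCalls(calls: PriorityCall[]): PriorityCall[] {
      return [...calls].sort((a, b) =>
        a.priority !== b.priority
          ? a.priority - b.priority                            // higher priority first
          : b.receivedAt.getTime() - a.receivedAt.getTime()    // then most recent first
      );
    }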



FIG. 4 illustrates one embodiment of a panel 400 for visualization of statistics comparing past jobs and current jobs. Law enforcement agencies may be interested in seeing trends in the number of jobs over time, which may be referred to as historical trends. As explained above, a job may be created for each emergency call, and the law enforcement agency can process the jobs to handle the emergency calls. For example, a particular law enforcement agency might want a comparison of current jobs and jobs from two weeks prior. The panel 400 provides a bar chart 410 of past jobs and current jobs. The law enforcement agency can select a time in the past for the comparison and/or a default time may be automatically selected. The x-axis for the bar chart 410 indicates time periods of a day, and the y-axis represents the number of jobs. In the example of FIG. 4, the maximum number of jobs on the y-axis is 1,400. In the panel 400, each set of bars 420 represents the number of current jobs and the number of past jobs at a certain time in the day. In particular, the first set of bars 420a illustrates jobs from 14 days ago during the 8-9 am time period as bar 422a and jobs on the current day during the 8-9 am time period as bar 424a. In this embodiment, similar bars for the previous and current period during different hourly periods are illustrated. The bars for past jobs 422 and the bars for current jobs 424 can be distinguished by color or pattern. The legend 430 can include an explanation for a particular color or pattern. The pattern 432 indicates past jobs, and the pattern 434 indicates current jobs.
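

As a non-limiting illustration, the TypeScript sketch below buckets jobs into hourly counts for the current day and a comparison day (e.g., 14 days earlier) so the two series can be drawn as paired bars; the field names are assumptions for this example.

    // Hypothetical sketch: hourly job counts for two days, for paired bars.
    function hourlyCounts(jobs: { receivedAt: Date }[]): number[] {
      const counts: number[] = new Array(24).fill(0);
      for (const j of jobs) {
        counts[j.receivedAt.getHours()] += 1;
      }
      return counts;
    }

    function compareDays(current: { receivedAt: Date }[], past: { receivedAt: Date }[]) {
      const now = hourlyCounts(current);
      const then = hourlyCounts(past);
      // One entry per hour, matching the paired past/current bars in the chart.
      return now.map((c, hour) => ({ hour, past: then[hour], current: c }));
    }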



FIG. 5 illustrates one embodiment of a panel 500 for visualization of statistics relating to common radio codes. The panel 500 provides a bar chart 510 for top ten radio codes. As mentioned above, radio codes can signify different types of situations and/or responses. For example, “10-54” refers to ambulance calls, “10-53” refers to vehicle accidents, and “10-10” refers to general crimes (e.g., shoplifting, shots fired, suspicious vehicle, etc.). The top ten radio code bar chart 510 can provide an overview of which types of jobs being dispatched are most common at a glance. Depending on the filters that are applied, the panel 500 may show the top ten radio codes for a patrol borough, a precinct, a time frame, etc.
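

One possible way to compute such a top-ten list is sketched below in TypeScript; the job record shape is an assumption for illustration, and the aggregation would be run over whichever calls remain after any active filters are applied.

    // Hypothetical sketch: count jobs per radio code and keep the ten most common.
    function topRadioCodes(jobs: { radioCode: string }[], limit = 10) {
      const counts = new Map<string, number>();
      for (const j of jobs) {
        counts.set(j.radioCode, (counts.get(j.radioCode) ?? 0) + 1);
      }
      return [...counts.entries()]
        .sort((a, b) => b[1] - a[1])   // most common first
        .slice(0, limit)
        .map(([code, count]) => ({ code, count }));
    }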



FIG. 6 illustrates one embodiment of a panel 600 for visualization of statistics relating to common radio subcodes. The panel 600 provides a bar chart 610 for the top ten radio subcodes. The bar chart 610 can be similar to the bar chart 510 in FIG. 5. The bar chart 610 for top ten radio subcodes can provide further details on which types of jobs are more common within particular radio codes. As in FIG. 6, the most common subcodes may be from the same radio code. Depending on the filters that are applied, the panel 600 may show the top ten radio subcodes for a patrol borough, a precinct, a time frame, etc.


Example Executive Dashboard



FIG. 7 illustrates another embodiment of a user interface 700 comprising multiple panels 710 for visualizing data of a law enforcement agency, referred to as an Executive Dashboard. FIG. 7 uses arrest data for illustrative purposes, but the user interface 700 can also be used to display emergency call data. Similar to the user interface 100 in FIG. 1, the user interface 700 includes multiple panels that provide various information related to law enforcement activities. However, user interface 700 includes different types of information in different panels that are of interest to the particular executive who views the user interface 700. Accordingly, the executive dashboard system allows different users to focus on the types of information that are most important to the particular user, and panels included on such executive dashboards may vary between law enforcement agencies and even between different users within a same law enforcement organization (e.g., two detectives in a police department may have different panels on their executive dashboards). As another example, two police departments in different regions of the U.S. could be interested in showing different types of information in the user interface 700 or user interface 100. The user interface 700 illustrates additional panels that are not included in user interface 100 and, as discussed herein, other user interfaces may be configured to include any portion of the panels discussed with reference to FIG. 1 or 7.


In the example of FIG. 7, the user interface 700 includes nine different panels 710. Each panel 710 can include a visualization relating to the data of the law enforcement agency. The panels 710 are explained in more detail with respect to FIGS. 8A-8I. The panel 710a provides a map of important arrests in a geographical area (e.g., the U.S.). The panel 710b provides a bar chart showing arrests by region. The panel 710c provides a bar chart showing arrest priority. The panel 710d includes a table of high priority calls for the geographical area. The panel 710e is an information panel showing brief details of the largest arrest. The panel 710f includes a line graph or time series chart showing arrest trend. The panel 710g includes a sankey or flow diagram. The panel 710h shows a word diagram or a word cloud diagram. The panel 710i shows another line graph or time series chart showing arrest trend by priority. Details relating to the panels 710a-710i are explained with respect to FIGS. 8A-8I, respectively.


Other types of visualizations can include a line graph, object summary, pie chart, time wheel, time series chart, etc. A time wheel may refer to a circular representation of a unit of time (e.g., an hour, a day, a week, a month), which may be subdivided into smaller units of time. The panels 710 can be similar to the panels 110 in the user interface 100. A panel 710 can include any type of visualization. For example, the panels 710 in the user interface 700 may include any of the visualizations illustrated in FIGS. 2-6, in addition to the visualizations illustrated in FIGS. 8A-8I. The panels 710, 110 can update or refresh periodically (e.g., at a specified interval), in response to an action or an input (e.g., keywords in a keyword filter), in response to changes in information in another panel 710, 110 (e.g., from application of one or more filters), etc.


Similar to FIG. 1, the user interface 700 can also include one or more filters 750 to apply to the data displayed in the user interface 700. Types of filters can include: keyword filters, multiple value filters, date filters, etc. The example in FIG. 7 includes a time filter 750a, a region filter 750b, and a priority filter 750c. The time filter 750a can accept a time frame or time period from the user. In the example of FIG. 7, the time filter 750a is a drop down menu, and “past year” is selected as the relevant time frame. The region filter 750b may also be a drop down menu, but in the example of FIG. 7, no value is selected for the region. Therefore, the data shown in the user interface 700 is not filtered for a particular region. The priority filter 750c may be a multi-value filter that can accept multiple values. Here, selected priority levels are “high” and “urgent.”


The user interface 700 can be configured so that visualizations in all of the panels 710 may be automatically updated (e.g., in realtime). In one embodiment, the visualizations are updated in response to changes to the filter criteria. In another embodiment, only a portion of the panels 710 are updated in response to filter changes, such as a predefined set of panels 710 and/or a user selected group of panels 710. In this way, the user may be able to view unfiltered data in one or more panels 710, while adjusting filters that are automatically applied to other panels 710. In another embodiment, different sets of filters may be applied to different sets of one or more panels 710.
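

A minimal sketch of this selective-update behavior, assuming a simple per-panel flag indicating whether the panel responds to filter changes, might look like the following (the interface and field names are illustrative only).

    // Hypothetical sketch: apply a filter change only to the panels that are
    // configured to react to filters, leaving the rest unfiltered.
    interface FilterablePanel {
      id: string;
      respondsToFilters: boolean;
      update(filters: Record<string, unknown>): void;
    }

    function onFiltersChanged(panels: FilterablePanel[], filters: Record<string, unknown>): void {
      for (const panel of panels) {
        if (panel.respondsToFilters) {
          panel.update(filters);   // refresh this panel's visualization with filtered data
        }
      }
    }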


The user interface 700 or the user interface 100 may be implemented as a layer on top of systems using object centric data models as described with respect to FIGS. 10-11. For example, the user interface 700, 100 can be an overview layer that provides a high-level overview of data in the underlying systems. As mentioned above, the overview layer may be referred to as the “executive dashboard” since it displays relevant high-level data in one user interface and can be useful to executives and decision makers. The user interface 700, 100 can be a web interface or a user interface for an application program. The underlying systems may be web applications or native applications.


A platform may be provided for building overview layers for object-centric systems. For example, such platform can be provided by the systems described in FIGS. 10-11. The platform may be used to generate the user interface 700, 100. An interface may be provided between the overview layer and the underlying system or data source in order to provide a level of abstraction. By using a standard interface, an overview layer can be compatible with any type of underlying system or data source. For example, abstract forms of filters and data may be provided, and these abstract filters and data can be compatible with systems or data sources using different formats. The data displayed in a single visualization can be obtained from multiple systems or data sources. For example, a time series chart could display the number of current incidents by the hour and also overlay the average number of incidents over a period of several days. In another example, a map might show both emergency calls and arrests.
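

The following TypeScript sketch illustrates one possible form such an abstraction layer could take, with abstract filters passed to interchangeable data sources whose results are merged for a single visualization; the AbstractFilter and DataSource shapes are assumptions for illustration, not the actual interface of the disclosure.

    // Hypothetical abstraction between the overview layer and underlying systems:
    // each data source translates abstract filters into its own query format and
    // returns rows in a common shape.
    interface AbstractFilter {
      field: string;
      operator: "equals" | "in" | "between";
      value: unknown;
    }

    interface DataSource {
      name: string;
      query(filters: AbstractFilter[]): Promise<Record<string, unknown>[]>;
    }

    // Fetch from several sources (e.g., emergency calls and arrests) and merge
    // the rows for a single visualization such as a combined map.
    async function fetchForVisualization(
      sources: DataSource[],
      filters: AbstractFilter[]
    ): Promise<Record<string, unknown>[]> {
      const results = await Promise.all(sources.map(s => s.query(filters)));
      return results.flat();
    }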


The platform may also provide general scripts that can be used to create customized user interfaces so that the user does not have to write code. Such scripts can include SQL or proprietary scripts, for example. In some embodiments, the platform provides a configuration plugin template, and the user can use a command line to start implementing the overview user interface or the executive dashboard. The template may specify the configurations and/or settings for generating the customized user interface 700, 100. The platform can allow various aspects of the user interface 700, 100 to be configured, such as by using a visualization framework, configuration framework, and/or layout framework. In one embodiment, the visualization framework allows users to implement their own visualizations that can be used in the dashboard. The configuration framework may allow users to implement their own data source and transform the data to be displayed in a visualization. The layout framework may take the requested size of a visualization (e.g., 2 rows high and 2 columns wide) and fit it into the dashboard's layout.
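

As a non-limiting illustration of the kind of configuration such a template might capture, the TypeScript sketch below describes one panel in terms of its visualization, data source, optional transform, and requested size; all names and fields are assumptions made for this example.

    // Hypothetical configuration template for one dashboard panel, tying together
    // the visualization, configuration, and layout frameworks mentioned above.
    interface PanelConfig {
      visualization: string;                      // e.g., "bar-chart", "map", "table"
      dataSource: string;                         // name of the configured data source
      transform?: (rows: object[]) => object[];   // reshape rows for the visualization
      size: { rows: number; cols: number };       // requested size for the layout framework
      title: string;
    }

    const topRadioCodesPanel: PanelConfig = {
      visualization: "bar-chart",
      dataSource: "cad-jobs",
      transform: rows => rows,                    // placeholder; a real transform would aggregate
      size: { rows: 2, cols: 2 },
      title: "Top 10 Radio Codes",
    };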


One of the features of the user interface 700, 100 is that customization of panels 710, 110 in the user interface 700, 100 can be simple and easy for the users. The panels 710, 110 may be easily rearranged, and the dimensions of the panels 710, 110 can be specified in a simple manner (e.g., by number of rows and columns). In one embodiment, a panel 710, 110 can be dragged to a location within the user interface 700, 100 to arrange its position. In another embodiment, the user can add or select panels 710, 110 of the user's choice to create a customized user interface 700, 100 (e.g., in real-time). The user interface 700, 100 may be a web interface.


Other aspects of the user interface 700, 100 may also be configured and customized. Such aspects can include pages, page layout, security settings, visualizations, etc. In some embodiments, the user interface 700, 100 is organized as multiple pages, and one page of the user interface 700, 100 may be displayed at one time. The different pages may be grouped by categories. Similarly, the panels 710, 110 can be grouped by categories. Access or security settings may be specified for a group of pages or panels 710, 110, and only those with the access privileges may be able to view a certain page or panel 710, 110. For example, if a user does not have access rights to a particular panel 710, 110, the user interface 700, 100 would not display the panel 710, 110 for that user. The features of a visualization could also be customized. For example, the user can specify the maximum length of a bar in a bar chart within the panel 710, 110 or the colors used for the bars.


In certain embodiments, the user interface 700, 100 detects or is otherwise aware of the user interface mode in which it is displayed and adjusts the way the user interface 700, 100 is presented on a display device. For example, the user interface 700, 100 may be displayed in full screen mode. The user interface mode (“UI mode”) can be designated by a specific application. For large screens (e.g., 10-foot display), the user interface 700, 100 could be displayed for showcasing purposes, rather than (or in addition to) performing daily operations. In such case, the user interface 700, 100 can be adjusted to be more fitting for large screens. The UI mode for displaying on large screens may be referred to as “large screen mode” or “10-foot mode.” In large screen mode, the user interface 700, 100 may not concurrently display the panels 710, 110, but instead rotate one or more panels 710, 110 at an interval. Also, filters 750, 150 may not be displayed, and fewer details may be shown in the user interface 700, 100 and/or the panels 710, 110. Text and/or graphical elements can be larger so they can be seen from a distance. Each panel 710, 110 may also know about the UI mode and adjust itself to be more appropriately displayed in a selected UI mode.
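

A minimal sketch of how a dashboard might adjust its presentation per UI mode (e.g., hiding filters, rotating panels, and enlarging text in large screen mode) is shown below in TypeScript; the mode names, rotation interval, and scale factor are illustrative assumptions.

    // Hypothetical sketch: derive presentation settings from the UI mode.
    type UiMode = "standard" | "large-screen";

    interface DashboardPresentation {
      showFilters: boolean;
      concurrentPanels: boolean;
      rotationIntervalMs?: number;   // rotate panels at this interval, if set
      textScale: number;             // enlarge text for viewing at a distance
    }

    function presentationFor(mode: UiMode): DashboardPresentation {
      if (mode === "large-screen") {
        return { showFilters: false, concurrentPanels: false, rotationIntervalMs: 15000, textScale: 2.0 };
      }
      return { showFilters: true, concurrentPanels: true, textScale: 1.0 };
    }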


Figures have been explained with respect to law enforcement agencies, but the user interfaces described in this disclosure may be used by other types of organizations. Some examples include fire departments, offices of mayors, etc. Fire departments may use the overview user interface to track service calls. A mayor's office may use the overview user interface to manage building jobs and building complaints. For example, by viewing the building jobs and building complaints in the same panel or user interface, the mayor's office can note any correlations between the jobs and the complaints at a glance. As explained above, the techniques of this disclosure may provide a user interface 700, 100 that can be easily configured and customized and that can display a customizable overview of high-level data. The overview data can assist decision makers in obtaining relevant data at a glance and making informed decisions.



FIGS. 8A-8I are expanded views of the various panels 710 illustrated in FIG. 7, which can be included in a user interface for visualizing data of a law enforcement agency. FIG. 8A illustrates one embodiment of the panel 710a that provides a map 810 of important arrests in a geographical area (e.g., the U.S.). The map 810 may be narrowed down by a geographical region of interest. In most cases, law enforcement agencies are local to particular regions, and the default region shown in the panel 710a can be set to the particular region associated with a law enforcement agency. The circles 815 are visual indicators that represent arrests across the geographical area. The importance or priority level of the arrests may be indicated by the relative size of the circles 815. The importance of an arrest may be determined by the law enforcement agency, e.g., based on a variety of factors. For example, a law enforcement agency may assign a number or a level (e.g., high, low, etc.) indicating the importance of the arrest. Importance may be provided as a property of an arrest object or event. Also, different types of arrests may be indicated by colors or patterns of the circles 815.


The map 810 is configured to allow the user to drill down from a larger area to a smaller area. In one example, the map 810 starts with showing a country (e.g., the U.S.). The user clicks (or scrolls, presses a certain key, provides an audible command, or any other predefined input) on a state within the country, and the map 810 zooms in to show the state. The user clicks on a county in the state, and the map 810 zooms in to display the county. The user then can click on a city in the county, and the map 810 zooms in to show the details for that city, and so forth. In this way, the same panel or user interface can display information of varying levels of depth.



FIG. 8B illustrates one embodiment of a panel 710b that provides a bar chart 820 showing arrests by regions. The example in FIG. 8B displays the arrests for different cities. A bar can represent the number of arrests for a city or region. The number of arrests may also be displayed over the bar for the city or region. In one embodiment, clicking on a bar works as a filter, and when the user clicks on a bar for a city or region, the data in the other panels in the user interface may update to apply the filter. The bar chart 820 can display multiple series of data, for example, in the form of stacked sets of bars, which may be referred to as the stacked bar chart.



FIG. 8C illustrates one embodiment of a panel 710c that provides a bar chart 830 showing arrest priority. The example in FIG. 8C displays three different priority levels for arrests: low, high, and urgent. A bar can represent the number of arrests for a specific priority level, and the number of arrests can also be displayed over the bar for the priority level. The priority levels may be defined by a law enforcement agency as appropriate.



FIG. 8D illustrates one embodiment of a panel 710d that provides a table 840 of high priority calls for the geographical area. Similar to FIG. 3, the table 840 supports paging and can be organized as a number of pages. One page may be displayed at a time, and the different pages of the table 840 can be navigated using “prev,” “next,” “first,” and “last” buttons. A page in the table may be scrolled within the panel 710d. The information in the table 840 can be displayed in rows and columns. The table 840 can include various types of information, depending on the preference or requirements of a law enforcement agency. The example of FIG. 8D includes date, arrest description, arrest region, arrest priority, and arrest subjects. Priority may indicate the level of crime associated with the arrest, such that more dangerous crimes may have a higher level of priority. Subjects may refer to subjects or topics relating to the arrest. The data in the table 840 can be sorted by columns (e.g., priority).



FIG. 8E illustrates one embodiment of a panel 710e that provides brief details of the largest arrest. The user interface may include a panel that provides overview information for a prominent arrest in a geographical area. In one embodiment, the largest arrest is an arrest with the largest number of individuals arrested in relation to a certain event. In the example of FIG. 8E, the information included is person count, status, and location name. Person count may refer to the number of individuals involved in the arrest. The user can view more detailed information by clicking on a link or a button. The panel may be an example of object summary visualization. For example, an arrest can be represented as an object as explained in connection with FIGS. 10-11. Information about the arrest can be represented as various properties of the object. The object summary may include some or all of the properties relating to the object. The object summary can include a link to the object itself. For example, clicking on a link or a button can bring up the screen that includes more detailed information about the object. Such screen may be referred to as the “object inspector.” The object summary or the object inspector may also include links for obtaining information about the object in other applications.



FIG. 8F illustrates one embodiment of a panel 710f that provides a time series chart 860 showing arrest trend. A time series chart can be a specific example of a line graph. A time series chart may show changes over time or time-bucketed events as a line chart. Discrete events may be represented by visual indicators (e.g., circles or bubbles). The time series chart can overlay multiple series as illustrated with respect to FIG. 8I. The time series chart 860 in FIG. 8F shows the arrest trend over several months. The months are plotted on the x-axis, and the number of arrests is plotted on the y-axis. The legend 865 provides information on the type of data that is plotted. Here, the line represents arrests. In certain embodiments, the time series chart can support selecting a time range on the time series chart itself. For example, the user can drag from left to right in order to create a time region, and the time region could be applied as a filter to some or all of the panels 710. The time series chart may also display detailed information for a point when the mouse hovers over that point.



FIG. 8G illustrates one embodiment of a panel 710g that provides a sankey or flow diagram 870. A flow diagram can show how objects move between states and/or categories. The example of FIG. 8G includes three states: State A 875a; State B 875b; and State C 875c. The flow diagram 870 shows arrests moving from State A 875a to State B 875b, arrests moving from State A 875a to State C 875c, and arrests moving from State B 875b to State C 875c. In one example, State A 875a can represent arrest warrant issued, and State C 875c can represent arrest completed. State B 875b can represent an intermediate state, for example, where an attempt at arrest has been made but was unsuccessful. The width of a flow can be proportional to the quantity included in that flow. Similar to other panels, the flow diagram 870 may be updated in realtime as the user adjusts one or more filters, such as to adjust a time period for which data is compiled in generating the flow diagram 870.



FIG. 8H illustrates one embodiment of a panel 710h that provides a word diagram 880. A word diagram may display various keywords according to the relative occurrence, count, or size associated with each keyword. The example of FIG. 8H displays city names in different sizes. A city that has more arrests than other cities may appear bigger in the word diagram 880; conversely, a city that has fewer arrests than other cities may appear smaller. Different colors may be used to display the keywords. The colors could represent another type of information, or simply make it easier to set the words apart from each other. The word diagram can help visualize prevalence of various words. In other embodiments, size of the keywords may be determined based on other factors, such as emergency calls, average response time to emergency calls, casualties, and/or any other attribute or combination of attributes associated with respective law enforcement agencies.
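

One simple way to derive keyword sizes from counts is sketched below in TypeScript, scaling each word's font size linearly between a minimum and a maximum; the pixel bounds are assumptions for illustration.

    // Hypothetical sketch: scale each keyword's font size in proportion to its
    // count (e.g., arrests per city), between a minimum and maximum size.
    function wordSizes(
      counts: Map<string, number>,
      minPx = 12,
      maxPx = 48
    ): Map<string, number> {
      const values = [...counts.values()];
      const lo = Math.min(...values);
      const hi = Math.max(...values);
      const sizes = new Map<string, number>();
      for (const [word, count] of counts) {
        const t = hi === lo ? 1 : (count - lo) / (hi - lo);   // normalize to 0..1
        sizes.set(word, Math.round(minPx + t * (maxPx - minPx)));
      }
      return sizes;
    }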



FIG. 8I illustrates one embodiment of a panel 710i that provides another time series chart 890 showing arrest trend by priority. The time series chart 890 can be similar to the time series chart 860 in FIG. 8F. The time series chart 890 also shows arrest trend, but breaks down the arrests by priority level. A separate line represents arrests at a priority level. The months are plotted on the x-axis, and the number of arrests is plotted on the y-axis. The legend 895 provides information on the type of data that is plotted. Here, the first line indicates high priority arrests; the second line indicates low priority arrests; and the third line indicates urgent arrests.


Filters can narrow or drill down the information displayed in the user interface 700, 100 and/or the panels 710, 110. Filters can be temporal filters or object filters. Information displayed in the panels 710, 110 can be represented by objects, and object filters may filter data based on various properties of objects. Different types of filters may include: a keyword filter, a date filter, a time filter, a multiple value filter, etc. A keyword filter can accept one or more keywords for filtering the data. A keyword filter may support full-text searching. For example, full texts of objects can be searched. A date filter, a time filter, or a date picker may allow the user to select a specific time, date, range of time, range of date, etc. A multiple value or multi-value filter may accept more than one value for filtering the data. Users may also create their own filters (e.g., as plug-in filters). Various types of filters may be used in combination. For example, a single filter may be both a keyword filter and a multi-value filter, in which the user can enter one or more keywords.


When filters are applied in the user interface 700, 100 or a panel 710, 110 in the user interface 700, 100, the user interface 700, 100 or the panel 710, 110 can show an indication that the data has been filtered. For example, the user interface 700, 100 can include a button for undoing the filtering or returning to the previous unfiltered data, or the user interface 700, 100 can have a reset button to return to the initial view without any filtering. Applying a filter in one panel 710, 110 may filter the data in some or all of the other panels 710, 110. Generally, the filter will apply to the panels 710, 110 included in the user interface 700, 100. However, in some cases, application of the filter to a particular panel 710, 110 may make it more difficult for the user to navigate. For example, in FIG. 8B, the selection of a bar for a city or a region can work as a filter to display the data for that city or region; the cities or regions initially displayed in the panel 710b are not filtered, so that the user can click on the bars for different cities or regions to view the corresponding data, but the data in the other panels 710 in the user interface is updated to reflect the filter. The panels 710, 110 and/or the user interface 700, 100 may refresh when filters are applied.


Example Method



FIG. 9 illustrates a flowchart for providing a user interface including multiple panels for visualizing emergency call data of a law enforcement agency, according to certain embodiments. The process 900 is explained in connection with FIGS. 1-2, but may also apply to FIGS. 7-8 or other executive dashboards. Certain details relating to the process 900 are explained in more detail with respect to FIGS. 1-8. The process 900 may be implemented by one or more systems described with respect to FIGS. 10-11, such as by a server system that has access to the various emergency call data and generates the various panels and visualizations requested by users. Depending on the embodiment, the process 900 may include fewer or additional blocks, and the blocks may be performed in an order that is different than illustrated.


At block 902, the process 900 generates a user interface 100 configured to concurrently display a plurality of panels 110 each including a visual representation based on emergency call data of a law enforcement agency. In this embodiment, the emergency call data includes data associated with a plurality of emergency calls. Types of visual representation or visualizations included in a panel 110 may include: a map, a bar chart, a table, a line graph, a time series chart, an object summary, a flow diagram, a word diagram, a pie chart, a time wheel, etc. The details relating to each type of visual representation are explained with respect to FIGS. 1-8.


At block 904, a map panel of a geographical region associated with the law enforcement agency is included in the user interface (such as the first panel 110a discussed above). The map of the geographical region may include a plurality of selectable precinct indicators representing a corresponding plurality of precincts for which the law enforcement agency has at least some law enforcement responsibilities. The plurality of precinct indicators may each show the number of emergency calls in a particular precinct. The precinct indicators may have different colors to convey information at a glance. For example, indicators for precincts with a high number of emergency calls may be shown in orange; indicators for precincts with a low number of emergency calls may be shown in green.


At block 906, in response to receiving a selection (e.g., from an executive viewing the executive dashboard) of a particular precinct indicator corresponding to a particular precinct, the map panel is updated to display one or more emergency call indicators representing a corresponding one or more emergency calls within the selected precinct. Emergency calls may be represented as objects and have properties associated with them, such as job ID, time and date, location, radio code, assigned resource, comments, etc. The one or more emergency call indicators can have different colors and/or shapes to convey information quickly. For example, different colors correspond to different types of jobs or radio codes, and different shapes indicate how recent the emergency calls are.


At block 908, in response to receiving a selection of a particular emergency call indicator corresponding to a particular emergency call, the map panel is updated to display information relating to the particular emergency call. Information about the emergency call may be displayed on the map itself or in a separate pop-up window. In one example, brief information is shown on the map itself, next to the emergency call indicator, and detailed information is displayed in the separate pop-up window. The details may include job number, time and date, location, comments, etc.
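

As a simple illustration of the distinction between brief and detailed information, the sketch below (with assumed field names) returns a short summary for display next to the indicator and a fuller record for the pop-up window:

```python
# Sketch of block 908: brief information shown on the map, detailed
# information shown in a separate pop-up window.
def brief_info(call):
    return {"job_id": call["job_id"], "radio_code": call["radio_code"]}

def detailed_info(call):
    return {k: call[k] for k in ("job_id", "time", "location", "comments")}

call = {"job_id": "J-1", "radio_code": "10-31", "time": "2014-12-23 14:02",
        "location": "5th Ave & W 34th St", "comments": "caller on scene"}
print(brief_info(call))
print(detailed_info(call))
```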


In certain embodiments, a second panel displaying a statistic relating to at least some of the emergency call data is included in the user interface. In one embodiment, the statistic in the second panel is associated with the one or more emergency calls within the particular precinct. In another embodiment, the statistic in the second panel is associated with emergency calls in all of the plurality of precincts in the geographical region.


In some embodiments, a filter may be applied to the map panel. Applying the filter may update the map panel to display the one or more emergency call indicators that meet criteria indicated by the filter. Types of filters can include: a keyword filter, a date filter, a time filter, a multiple value filter, etc. The filter applied in the map panel can also be applied in the second panel and/or another panel of the plurality of panels.
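

The predicates below sketch, under assumed field names, how each of these filter types might be expressed and combined; they are illustrative only and not the system's filtering API:

```python
# Illustrative predicates for keyword, date, time, and multiple value filters.
from datetime import date, time

def keyword_filter(field, keyword):
    return lambda call: keyword.lower() in str(call.get(field, "")).lower()

def date_filter(start, end):
    return lambda call: start <= call["date"] <= end

def time_filter(start, end):
    return lambda call: start <= call["time"] <= end

def multiple_value_filter(field, allowed):
    return lambda call: call.get(field) in set(allowed)

calls = [{"comments": "noise complaint", "date": date(2014, 12, 20),
          "time": time(22, 15), "radio_code": "10-31"}]
active = [keyword_filter("comments", "noise"),
          multiple_value_filter("radio_code", ["10-31", "10-13"])]
# An indicator is displayed only if it meets the criteria of every active filter.
filtered = [c for c in calls if all(f(c) for f in active)]
```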


The type of information displayed in the panels may include: high priority emergency calls, historical trend of emergency calls, top radio codes, top radio subcodes, etc. For example, the plurality of panels other than the map panel can display any of the different types of information listed above.
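

As an illustration, a "top radio codes" panel could be driven by a simple aggregation such as the following sketch, in which the field name is assumed:

```python
# Sketch of the aggregation behind a "top radio codes" panel.
from collections import Counter

def top_radio_codes(calls, n=5):
    return Counter(c["radio_code"] for c in calls).most_common(n)

calls = [{"radio_code": "10-31"}, {"radio_code": "10-31"},
         {"radio_code": "10-13"}]
print(top_radio_codes(calls))   # [('10-31', 2), ('10-13', 1)]
```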


Definitions


In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.


Ontology: Stored information that provides a data model for storage of data in one or more databases. For example, the stored data may comprise definitions for object types and property types for data in a database, and how objects and properties may be related.


Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (Oracle database, mySQL database, etc.), spreadsheets, XML files, and text files, among others.


Data Object or Object: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g. metadata about the object) may be represented in one or more properties.


Object Type: Type of a data object (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g. an agent may be a sub-object type of a person object type), and the properties the object type may have.


Properties: Attributes of a data object that represent individual data items. At a minimum, each property of a data object has a property type and a value or values.


Property Type: The type of data a property is, such as a string, an integer, or a double. Property types may include complex property types, such as a series of data values associated with timed ticks (e.g., a time series), etc.


Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.


Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.


Link Set: Set of multiple links that are shared between two or more data objects.


Object Centric Data Model


To provide a framework for the following discussion of specific systems and methods described herein, an example database system 1210 using an ontology 1205 will now be described. This description is provided for the purpose of providing an example and is not intended to limit the techniques to the example data model, the example database system, or the example database system's use of an ontology to represent information.


In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 1205. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 1209 based on the ontology 1205. For example, each object of the conceptual data model may correspond to one or more rows in a relational database or an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.



FIG. 10 illustrates an object-centric conceptual data model according to an embodiment. An ontology 1205, as noted above, may include stored information providing a data model for storage of data in the database 1209. The ontology 1205 may be defined by one or more object types, which may each be associated with one or more property types. At the highest level of abstraction, data object 1201 is a container for information representing things in the world. For example, data object 1201 can represent an entity such as a person, a place, an organization, a market instrument, or other noun. Data object 1201 can represent an event that happens at a point in time or for a duration. Data object 1201 can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object 1201 is associated with a unique identifier that uniquely identifies the data object within the database system.
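

A minimal Python sketch of this object-centric model is shown below; the class names and fields are illustrative and do not describe the actual database system:

```python
# Illustrative sketch of the object-centric data model: each data object has
# a unique identifier, an object type, and a set of typed properties.
import uuid

class Property:
    def __init__(self, property_type, value):
        self.property_type = property_type   # e.g., "date" or "currency"
        self.value = value

class DataObject:
    def __init__(self, object_type):
        self.id = str(uuid.uuid4())          # unique identifier
        self.object_type = object_type       # e.g., "Person", "Event", "Document"
        self.properties = []                 # a data object may have many properties

    def add_property(self, property_type, value):
        self.properties.append(Property(property_type, value))
```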


Different types of data objects may have different property types. For example, a “Person” data object might have an “Eye Color” property type and an “Event” data object might have a “Date” property type. Each property 203 as represented by data in the database system 1210 may have a property type defined by the ontology 1205 used by the database 1209.


Objects may be instantiated in the database 1209 in accordance with the corresponding object definition for the particular object in the ontology 1205. For example, a specific monetary payment (e.g., an object of type “event”) of US$30.00 (e.g., a property of type “currency”) taking place on Mar. 27, 2009 (e.g., a property of type “date”) may be stored in the database 1209 as an event object with associated currency and date properties as defined within the ontology 1205.


The data objects defined in the ontology 1205 may support property multiplicity. In particular, a data object 1201 may be allowed to have more than one property 203 of the same property type. For example, a “Person” data object might have multiple “Address” properties or multiple “Name” properties.
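

Continuing the illustrative sketch above, the payment event from the preceding example could be instantiated as follows, and property multiplicity permits more than one property of the same type on a single object:

```python
# Instantiating the example payment event (assumed property names and formats).
payment = DataObject("Event")
payment.add_property("currency", 30.00)       # US$30.00
payment.add_property("date", "2009-03-27")

# Property multiplicity: a data object may have more than one property of the
# same property type.
person = DataObject("Person")
person.add_property("Name", "J. Smith")
person.add_property("Name", "John Smith")
```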


Each link 1202 represents a connection between two data objects 1201. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical. For example, “Person” data object A may be connected to “Person” data object B by a “Child Of” relationship (where “Person” data object B has an asymmetric “Parent Of” relationship to “Person” data object A), a “Kin Of” symmetric relationship to “Person” data object C, and an asymmetric “Member Of” relationship to “Organization” data object X. The type of relationship between two data objects may vary depending on the types of the data objects. For example, “Person” data object A may have an “Appears In” relationship with “Document” data object Y or have a “Participate In” relationship with “Event” data object E. As an example of an event connection, two “Person” data objects may be connected by an “Airline Flight” data object representing a particular airline flight if they traveled together on that flight, or by a “Meeting” data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for example, an “Appears In” relationship.


As an example of a matching properties connection, two “Person” data objects representing a brother and a sister, may both have an “Address” property that indicates where they live. If the brother and the sister live in the same home, then their “Address” properties likely contain similar, if not identical property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link and other types of connections may be represented; embodiments are not limited to any particular types of connections between data objects. For example, a document might contain references to two different objects. For example, a document may contain a reference to a payment (one object), and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.


Each data object 1201 can have multiple links with another data object 1201 to form a link set 1204. For example, two “Person” data objects representing a husband and a wife could be linked through a “Spouse Of” relationship, a matching “Address” property, and one or more matching “Event” properties (e.g., a wedding). Each link 1202 as represented by data in a database may have a link type defined by the database ontology used by the database.
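

The sketch below illustrates links and link sets under the definitions above, using plain string identifiers as stand-ins for data objects; the class names are assumptions for illustration:

```python
# Illustrative links and link sets between two data objects.
class Link:
    def __init__(self, obj_a, obj_b, link_type, directional=False):
        self.obj_a, self.obj_b = obj_a, obj_b
        self.link_type = link_type            # relationship, event, or
        self.directional = directional        # matching-property connection

class LinkSet:
    """A set of multiple links shared between the same two data objects."""
    def __init__(self, obj_a, obj_b):
        self.obj_a, self.obj_b = obj_a, obj_b
        self.links = []

    def add(self, link_type, directional=False):
        self.links.append(Link(self.obj_a, self.obj_b, link_type, directional))
        return self

husband, wife = "person:husband", "person:wife"   # stand-ins for data objects
marriage = LinkSet(husband, wife)
marriage.add("Spouse Of").add("Matching Address").add("Wedding Event")
print(len(marriage.links))   # 3
```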


Implementation Mechanisms


According to one embodiment, the techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include circuitry or digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques. The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.


Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface (“GUI”), among other things.


For example, FIG. 11 is a block diagram that illustrates a computer system 1800 upon which an embodiment may be implemented. For example, the computing system 1800 may comprise a server system that accesses law enforcement data and provides user interface data to one or more users (e.g., executives) that allows those users to view their desired executive dashboards and interact with the data. Other computing systems discussed herein, such as those of the user (e.g., an executive), may include any portion of the circuitry and/or functionality discussed with reference to system 1800.


Computer system 1800 includes a bus 1802 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 1804 coupled with bus 1802 for processing information. Hardware processor(s) 1804 may be, for example, one or more general purpose microprocessors.


Computer system 1800 also includes a main memory 1806, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 1802 for storing information and instructions to be executed by processor 1804. Main memory 1806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 1804. Such instructions, when stored in storage media accessible to processor 1804, render computer system 1800 into a special-purpose machine that is customized to perform the operations specified in the instructions.


Computer system 1800 further includes a read only memory (ROM) 1808 or other static storage device coupled to bus 1802 for storing static information and instructions for processor 1804. A storage device 1810, such as a magnetic disk, optical disk, or USB thumb drive (Flash drive), etc., is provided and coupled to bus 1802 for storing information and instructions.


Computer system 1800 may be coupled via bus 1802 to a display 1812, such as a cathode ray tube (CRT) or LCD display (or touch screen), for displaying information to a computer user. An input device 1814, including alphanumeric and other keys, is coupled to bus 1802 for communicating information and command selections to processor 1804. Another type of user input device is cursor control 1816, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 1804 and for controlling cursor movement on display 1812. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.


Computing system 1800 may include a user interface module to implement a GUI that may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.


In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.


Computer system 1800 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 1800 to be a special-purpose machine. According to one embodiment, the techniques herein are performed by computer system 1800 in response to processor(s) 1804 executing one or more sequences of one or more instructions contained in main memory 1806. Such instructions may be read into main memory 1806 from another storage medium, such as storage device 1810. Execution of the sequences of instructions contained in main memory 1806 causes processor(s) 1804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.


The term “non-transitory media,” and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion. Such non-transitory media may comprise non-volatile media and/or volatile media. Non-volatile media includes, for example, optical or magnetic disks, such as storage device 1810. Volatile media includes dynamic memory, such as main memory 1806. Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.


Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 1802. Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 1804 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 1800 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 1802. Bus 1802 carries the data to main memory 1806, from which processor 1804 retrieves and executes the instructions. The instructions received by main memory 1806 may optionally be stored on storage device 1810 either before or after execution by processor 1804.


Computer system 1800 also includes a communication interface 1818 coupled to bus 1802. Communication interface 1818 provides a two-way data communication coupling to a network link 1820 that is connected to a local network 1822. For example, communication interface 1818 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 1818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or a WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 1818 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.


Network link 1820 typically provides data communication through one or more networks to other data devices. For example, network link 1820 may provide a connection through local network 1822 to a host computer 1824 or to data equipment operated by an Internet Service Provider (ISP) 1826. ISP 1826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the “Internet” 1828. Local network 1822 and Internet 1828 both use electrical, electromagnetic or optical signals that carry digital data streams. The signals through the various networks and the signals on network link 1820 and through communication interface 1818, which carry the digital data to and from computer system 1800, are example forms of transmission media.


Computer system 1800 can send messages and receive data, including program code, through the network(s), network link 1820 and communication interface 1818. In the Internet example, a server 1830 might transmit a requested code for an application program through Internet 1828, ISP 1826, local network 1822 and communication interface 1818.


The received code may be executed by processor 1804 as it is received, and/or stored in storage device 1810, or other non-volatile storage for later execution.


Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware. The processes and algorithms may be implemented partially or wholly in application-specific circuitry.


The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.


Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.


Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.


It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Claims
  • 1. A computer system configured to provide a customizable user interface relating to visualization of data associated with a law enforcement agency, the computer system comprising: one or more hardware computer processors configured to execute code in order to cause the system to: generate a user interface configured to concurrently display a plurality of panels each including a visual representation based on data of a law enforcement agency, the data comprising data associated with a plurality of events, wherein the plurality of panels comprises at least: a first panel displaying a selectable list of precincts and a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable indicators representing a corresponding plurality of events, the first panel configured to: in response to receiving a selection of a particular indicator, update the map displayed in the first panel to display, overlaid on the map, one or more detail information items related to the particular indicator; and in response to receiving a selection of a particular precinct from the selectable list of precincts, update the first panel to show events associated with the particular precinct and not events associated with other precincts; a second panel displaying a list of the events corresponding to at least some of the plurality of events displayed in the first panel, the second panel configured to dynamically update the list in response to the selection of the particular precinct from the selectable list of precincts in the first panel to show events associated with the particular precinct and not events associated with other precincts; and a third panel displaying data of the law enforcement agency related to events, wherein the third panel is configured not to update in response to the selection of the particular precinct in the first panel.
  • 2. The system of claim 1, wherein the list in the second panel is further dynamically updated to show events associated with the geographical region.
  • 3. The system of claim 1, wherein the plurality of panels further comprise a visual representation including at least one of: a map, a bar chart, a table, a line graph, a time series chart, an object summary, a flow diagram, a word diagram, a pie chart, or a time wheel.
  • 4. The system of claim 1, wherein the plurality of events include at least one of: high priority emergency calls, historical trend of emergency calls, top radio codes, or top radio subcodes.
  • 5. The system of claim 1, wherein the code is further configured to cause the system to apply a filter to the first panel, wherein applying the filter updates the first panel to display the one or more detail information items that meet criteria indicated by the filter.
  • 6. The system of claim 5, wherein the filter comprises at least one of: a keyword filter, a date filter, a time filter, or a multiple value filter.
  • 7. The system of claim 5, wherein applying the filter to the first panel applies the filter to one or more other panels of the plurality of panels.
  • 8. The system of claim 1, wherein each panel in the plurality of panels is resizable.
  • 9. The system of claim 1, wherein settings for each panel in the plurality of panels are customized for an individual user, a group of users, or an organization.
  • 10. The system of claim 1, wherein the one or more detail information items are displayed in a pop-up overlaid on the map.
  • 11. A method of providing a customizable user interface relating to visualization of data associated with a law enforcement agency, the method comprising: generating, using one or more hardware computer processors, a user interface configured to concurrently display a plurality of panels each including a visual representation based on data of a law enforcement agency, the data comprising data associated with a plurality of events; displaying in the user interface at least a first panel of the plurality of panels, the first panel displaying a selectable list of precincts and a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable indicators representing a corresponding plurality of events; in response to receiving a selection of a particular indicator, updating the map displayed in the first panel to display, overlaid on the map, one or more detail information items related to the particular indicator; in response to receiving a selection of a particular precinct from the selectable list of precincts, updating the first panel to show events associated with the particular precinct and not events associated with other precincts; displaying in the user interface a second panel of the plurality of panels, the second panel configured to display a list of the events corresponding to at least some of the plurality of events displayed in the first panel, and dynamically update the list in response to the selection of the particular precinct from the selectable list of precincts in the first panel to show events associated with the particular precinct and not events associated with other precincts; and displaying in the user interface a third panel of the plurality of panels, the third panel configured to display data of the law enforcement agency related to events, wherein the third panel is configured not to update in response to the selection of the particular precinct in the first panel.
  • 12. The method of claim 11, wherein the plurality of panels further comprise a visual representation including at least one of: a map, a bar chart, a table, a line graph, a time series chart, an object summary, a flow diagram, a word diagram, a pie chart, or a time wheel.
  • 13. The method of claim 11, further comprising applying a filter to the second panel, wherein applying the filter updates the second panel to display the list of events including events that meet criteria indicated by the filter.
  • 14. The method of claim 11, wherein the plurality of events include at least one of: high priority emergency calls, historical trend of emergency calls, top radio codes, or top radio subcodes.
  • 15. The method of claim 11, further comprising applying a filter to the first panel, wherein applying the filter updates the first panel to display the one or more detail information items that meet criteria indicated by the filter.
  • 16. The method of claim 15, wherein the filter comprises at least one of: a keyword filter, a date filter, a time filter, or a multiple value filter.
  • 17. The method of claim 15, wherein applying the filter to the first panel applies the filter to one or more other panels of the plurality of panels.
  • 18. The method of claim 11, wherein the one or more detail information items are displayed in a pop-up overlaid on the map.
  • 19. A non-transitory computer readable medium comprising instructions for providing a customizable user interface relating to visualization of data associated with a law enforcement agency that cause a hardware computer processor to: generate a user interface configured to concurrently display a plurality of panels each including a visual representation based on data of a law enforcement agency, the data comprising data associated with a plurality of events; display in the user interface at least a first panel of the plurality of panels, the first panel displaying a selectable list of precincts and a map of a geographical region associated with the law enforcement agency, the map of the geographical region comprising a plurality of selectable indicators representing a corresponding plurality of events; in response to receiving a selection of a particular indicator, update the map displayed in the first panel to display, overlaid on the map, one or more detail information items related to the particular indicator; in response to receiving a selection of a particular precinct from the selectable list of precincts, update the first panel to show events associated with the particular precinct and not events associated with other precincts; display in the user interface a second panel of the plurality of panels, the second panel displaying a list of the events corresponding to at least some of the plurality of events displayed in the first panel; dynamically update the list in the second panel in response to the selection of the particular precinct from the selectable list of precincts in the first panel to show events associated with the particular precinct and not events associated with other precincts; and display in the user interface a third panel of the plurality of panels, the third panel displaying data of the law enforcement agency related to events, wherein the third panel is configured not to update in response to the selection of the particular precinct in the first panel.
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS

This application is a continuation of U.S. application Ser. No. 14/108,187, filed Dec. 16, 2013, which claims the benefit of U.S. Provisional Application No. 61/893,058, filed Oct. 18, 2013, the entire content of which is incorporated herein by reference. Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 CFR 1.57.

Foreign Referenced Citations (45)
Number Date Country
10 2014 103 482 Sep 2014 DE
102013222023 Jan 2015 DE
102014215621 Feb 2015 DE
0763201 Mar 1997 EP
1 672 527 Jun 2006 EP
2 551 799 Jan 2013 EP
2560134 Feb 2013 EP
2575107 Apr 2013 EP
2 778 977 Sep 2014 EP
2 778 983 Sep 2014 EP
2 779 082 Sep 2014 EP
2778913 Sep 2014 EP
2835745 Feb 2015 EP
2835770 Feb 2015 EP
2838039 Feb 2015 EP
2846241 Mar 2015 EP
2851852 Mar 2015 EP
2858014 Apr 2015 EP
2858018 Apr 2015 EP
2863326 Apr 2015 EP
2863346 Apr 2015 EP
2869211 May 2015 EP
2881868 Jun 2015 EP
2884439 Jun 2015 EP
2884440 Jun 2015 EP
2891992 Jul 2015 EP
2 516 155 Jan 2015 GB
2518745 Apr 2015 GB
2012778 Nov 2014 NL
2013306 Feb 2015 NL
624557 Dec 2014 NZ
WO 95032424 Nov 1995 WO
WO 00009529 Feb 2000 WO
WO 2002065353 Aug 2002 WO
WO 2004057268 Jul 2004 WO
WO 2005013200 Feb 2005 WO
WO 2005104736 Nov 2005 WO
WO 2008064207 May 2008 WO
WO 2009061501 May 2009 WO
WO 2009123975 Oct 2009 WO
WO 2010000014 Jan 2010 WO
WO 2010030913 Mar 2010 WO
WO 2011058507 May 2011 WO
WO 2013010157 Jan 2013 WO
WO 2013102892 Jul 2013 WO
Non-Patent Literature Citations (226)
“A Word About Banks and the Laundering of Drug Money,” Aug. 18, 2012, http://www.golemxiv.co.uk/2012/08/a-word-about-banks-and-the-laundering-of-drug-money/.
AMNET, “5 Great Tools for Visualizing Your Twitter Followers,” posted Aug. 4, 2010, http://www.amnetblog.com/component/content/article/115-5-grate-tools-for-visualizing-your-twitter-followers.html.
Boyce, Jim, “Microsoft Outlook 2010 Inside Out,” Aug. 1, 2010, retrieved from the internet https://capdtron.files.wordpress.com/2013/01/outlook-2010-inside_out.pdf.
Celik, Tantek, “CSS Basic User Interface Module Level 3 (CSS3 UI),” Section 8 Resizing and Overflow, Jan. 17, 2012, retrieved from internet http://www.w3.org/TR/2012/WD-css3-ui-20120117/#resizing-amp-overflow retrieved on May 18, 2015.
Chung, Chin-Wan, “Dataplex: An Access to Heterogeneous Distributed Databases,” Communications of the ACM, Association for Computing Machinery, Inc., vol. 33, No. 1, Jan. 1, 1990, pp. 70-80.
Definition “Identify” downloaded Jan. 22, 2015, 1 page.
Definition “Overlay” downloaded Jan. 22, 2015, 1 page.
Hardesty, “Privacy Challenges: Analysis: It's Surprisingly Easy to Identify Individuals from Credit-Card Metadata,” MIT News on Campus and Around the World, MIT News Office, Jan. 29, 2015, 3 pages.
Hogue et al., “Thresher: Automating the Unwrapping of Semantic Content from the World Wide Web,” 14th International Conference on World Wide Web, WWW 2005: Chiba, Japan, May 10-14, 2005, pp. 86-95.
Li et al., “Interactive Multimodal Visual Search on Mobile Device,” IEEE Transactions on Multimedia, vol. 15, No. 3, Apr. 1, 2013, pp. 594-607.
Nierman, “Evaluating Structural Similarity in XML Documents,” 2002, 6 pages.
Olanoff, Drew, “Deep Dive with the New Google Maps for Desktop with Google Earth Integration, It's More than Just a Utility,” May 15, 2013, pp. 1-6, retrieved from the internet: http://web.archive.org/web/20130515230641/http:techcrunch.com//2013/05/15/deep-dive-with-the-new-google-maps-for-desktop-with-google-earth-integration-its-more-than-just-a-utility/.
“Potential Money Laundering Warning Signs,” snapshot taken 2003, https://web.archive.org/web/20030816090055/http:/finsolinc.com/ANTI-MONEY%20LAUNDERING%20TRAINING%20GUIDES.pdf.
“Refresh CSS Ellipsis When Resizing Container—Stack Overflow,” Jul. 31, 2013, retrieved from internet http://stackoverflow.com/questions/17964681/refresh-css-ellipsis-when-resizing-container, retrieved on May 18, 2015.
Thompson, Mick, “Getting Started with GEO,” Getting Started with GEO, Jul. 26, 2011.
Umagandhi et al., “Search Query Recommendations Using Hybrid User Profile with Query Logs,” International Journal of Computer Applications, vol. 80, No. 10, Oct. 1, 2013, pp. 7-18.
Wikipedia, “Federated Database System,” Sep. 7, 2013, retrieved from the internet on Jan. 27, 2015 http://en.wikipedia.org/w/index.php?title=Federated_database_system&oldid=571954221.
Yang et al., “HTML Page Analysis Based on Visual Cues,” 2001, pp. 859-864.
Official Communication for Australian Patent Application No. 2014201511 dated Feb. 27, 2015.
Official Communication for Australian Patent Application No. 2014202442 dated Mar. 19, 2015.
Official Communication for Australian Patent Application No. 2014210604 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014210614 dated Jun. 5, 2015.
Official Communication for Australian Patent Application No. 2014213553 dated May 7, 2015.
Official Communication for Australian Patent Application No. 2014250678 dated Jun. 17, 2015.
Official Communication for European Patent Application No. 14180142.3 dated Feb. 6, 2015.
Official Communication for European Patent Application No. 14180281.9 dated Jan. 26, 2015.
Official Communication for European Patent Application No. 14180321.3 dated Apr. 17, 2015.
Official Communication for European Patent Application No. 14180432.8 dated Jun. 23, 2015.
Official Communication for European Patent Application No. 14186225.0 dated Feb. 13, 2015.
Official Communication for European Patent Application No. 14187739.9 dated Jul. 6, 2015.
Official Communication for European Patent Application No. 14187996.5 dated Feb. 12, 2015.
Official Communication for European Patent Application No. 14189344.6 dated Feb. 20, 2015.
Official Communication for European Patent Application No. 14189802.3 dated May 11, 2015.
Official Communication for European Patent Application No. 14191540.5 dated May 27, 2015.
Official Communication for European Patent Application No. 14197879.1 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14197895.7 dated Apr. 28, 2015.
Official Communication for European Patent Application No. 14199182.8 dated Mar. 13, 2015.
Official Communication for Great Britain Patent Application No. 1404574.4 dated Dec. 18, 2014.
Official Communication for Great Britain Patent Application No. 1408025.3 dated Nov. 6, 2014.
Official Communication for Great Britain Patent Application No. 1411984.6 dated Dec. 22, 2014.
Official Communication for Great Britain Patent Application No. 1413935.6 dated Jan. 27, 2015.
Official Communication for Netherlands Patent Application No. 2013306 dated Apr. 24, 2015.
Official Communication for New Zealand Patent Application No. 622517 dated Apr. 3, 2014.
Official Communication for U.S. Appl. No. 12/556,318 dated Jul. 2, 2015.
Official Communication for U.S. Appl. No. 13/247,987 dated Apr. 2, 2015.
Official Communication for U.S. Appl. No. 13/831,791 dated Mar. 4, 2015.
Official Communication for U.S. Appl. No. 13/835,688 dated Jun. 17, 2015.
Official Communication for U.S. Appl. No. 13/839,026 dated Aug. 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/102,394 dated Aug. 25, 2014.
Notice of Allowance for U.S. Appl. No. 14/108,187 dated Aug. 29, 2014.
Notice of Allowance for U.S. Appl. No. 14/135,289 dated Oct. 14, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/148,568 dated Mar. 26, 2015.
Notice of Allowance for U.S. Appl. No. 14/192,767 dated Dec. 16, 2014.
Notice of Allowance for U.S. Appl. No. 14/225,084 dated May 4, 2015.
Notice of Allowance for U.S. Appl. No. 14/268,964 dated Dec. 3, 2014.
Official Communication for U.S. Appl. No. 14/196,814 dated May 5, 2015.
Official Communication for U.S. Appl. No. 14/225,006 dated Sep. 10, 2014.
Official Communication for U.S. Appl. No. 14/225,006 dated Feb. 27, 2015.
Official Communication for U.S. Appl. No. 14/225,084 dated Sep. 2, 2014.
Official Communication for U.S. Appl. No. 14/225,084 dated Feb. 20, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Feb. 11, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Aug. 12, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated May 20, 2015.
Official Communication for U.S. Appl. No. 14/225,160 dated Oct. 22, 2014.
Official Communication for U.S. Appl. No. 14/225,160 dated Jul. 29, 2014.
Official Communication for U.S. Appl. No. 14/268,964 dated Sep. 3, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jul. 18, 2014.
Official Communication for U.S. Appl. No. 14/289,596 dated Jan. 26, 2015.
Official Communication for U.S. Appl. No. 14/289,596 dated Apr. 30, 2015.
Official Communication for U.S. Appl. No. 14/289,599 dated Jul. 22, 2014.
Official Communication for U.S. Appl. No. 14/289,599 dated May 29, 2015.
Official Communication for U.S. Appl. No. 14/294,098 dated Aug. 15, 2014.
Official Communication for U.S. Appl. No. 14/294,098 dated Nov. 6, 2014.
Notice of Allowance for U.S. Appl. No. 14/294,098 dated Dec. 29, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated Feb. 18, 2015.
Official Communication for U.S. Appl. No. 14/306,138 dated Sep. 23, 2014.
Official Communication for U.S. Appl. No. 14/306,138 dated May 26, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Feb. 19, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Aug. 7, 2015.
Official Communication for U.S. Appl. No. 14/306,147 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/306,154 dated Mar. 11, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated May 15, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Jul. 6, 2015.
Official Communication for U.S. Appl. No. 14/306,154 dated Sep. 9, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Jun. 16, 2015.
Official Communication for U.S. Appl. No. 14/319,765 dated Nov. 25, 2014.
Official Communication for U.S. Appl. No. 14/319,765 dated Feb. 4, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Jun. 22, 2015.
Official Communication for U.S. Appl. No. 14/323,935 dated Nov. 28, 2014.
Official Communication for U.S. Appl. No. 14/323,935 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Dec. 2, 2014.
Official Communication for U.S. Appl. No. 14/326,738 dated Jul. 31, 2015.
Official Communication for U.S. Appl. No. 14/326,738 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/473,552 dated Feb. 24, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,552 dated Jul. 24, 2015.
Notice of Allowance for U.S. Appl. No. 14/473,860 dated Jan. 5, 2015.
Official Communication for U.S. Appl. No. 14/486,991 dated Mar. 10, 2015.
Notice of Allowance for U.S. Appl. No. 14/486,991 dated May 1, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Mar. 31, 2015.
Official Communication for U.S. Appl. No. 14/504,103 dated Feb. 5, 2015.
Notice of Allowance for U.S. Appl. No. 14/504,103 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated Aug. 19, 2015.
Official Communication for U.S. Appl. No. 14/579,752 dated May 26, 2015.
Notice of Allowance for U.S. Appl. No. 14/616,080 dated Apr. 2, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated May 18, 2015.
Official Communication for U.S. Appl. No. 14/639,606 dated Jul. 24, 2015.
“A First Look: Predicting Market Demand for Food Retail using a Huff Analysis,” TRF Policy Solutions, Jul. 2012, pp. 30.
“A Quick Guide to UniProtKB Swiss-Prot & TrEMBL,” Sep. 2011, pp. 2.
Acklen, Laura, “Absolute Beginner's Guide to Microsoft Word 2003,” Dec. 24, 2003, pp. 15-18, 34-41, 308-316.
Ananiev et al., “The New Modality API,” http://web.archive.org/web/20061211011958/http://java.sun.com/developer/technicalArticles/J2SE/Desktop/javase6/modality/ Jan. 21, 2006, pp. 8.
Bluttman et al., “Excel Formulas and Functions for Dummies,” 2005, Wiley Publishing, Inc., pp. 280, 284-286.
Bugzilla@Mozilla, “Bug 18726—[feature] Long-click means of invoking contextual menus not supported,” http://bugzilla.mozilla.org/show_bug.cgi?id=18726 printed Jun. 13, 2013 in 11 pages.
Canese et al., “Chapter 2: PubMed: The Bibliographic Database,” The NCBI Handbook, Oct. 2002, pp. 1-10.
Chen et al., “Bringing Order to the Web: Automatically Categorizing Search Results,” CHI 2000, Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Apr. 1-6, 2000, The Hague, The Netherlands, pp. 145-152.
Conner, Nancy, “Google Apps: The Missing Manual,” Sharing and Collaborating on Documents, May 1, 2008, pp. 93-97, 106-113 & 120-121.
Delcher et al., “Identifying Bacterial Genes and Endosymbiont DNA with Glimmer,” BioInformatics, vol. 23, No. 6, 2007, pp. 673-679.
Dramowicz, Ela, “Retail Trade Area Analysis Using the Huff Model,” Directions Magazine http://www.directionsmag.com/articles/retail-trade-area-analysis-using-the-huff-model/123411, Jul. 2, 2005 in 10 pages.
GIS-NET 3 Public—Department of Regional Planning. Planning & Zoning Information for Unincorporated LA County. Retrieved Oct. 2, 2013 from http://gis.planning.lacounty.gov/GIS-NET3_Public/Viewer.html.
Goswami, Gautam, “Quite ‘Writely’ Said!” One Brick at a Time, Aug. 21, 2005, pp. 7.
Griffith, Daniel A., “A Generalized Huff Model,” Geographical Analysis, Apr. 1982, vol. 14, No. 2, pp. 135-144.
Hansen et al., “Analyzing Social Media Networks with NodeXL: Insights from a Connected World”, Elsevier Science, Sep. 2010, Ch. 4 & 10, pp. 53-67 & 143-164.
Hibbert et al., “Prediction of Shopping Behavior Using a Huff Model Within a GIS Framework,” Healthy Eating in Context, Mar. 18, 2011, pp. 16.
Huff et al., “Calibrating the Huff Model Using ArcGIS Business Analyst,” ESRI, Sep. 2008, pp. 33.
Huff, David L., “Parameter Estimation in the Huff Model,” ESRI, ArcUser, Oct.-Dec. 2003, pp. 34-36.
Kahan et al., “Annotea: An Open RDF Infrastructure for Shared WEB Annotations”, Computer Networks, 2002, vol. 39, pp. 589-608.
Keylines.com, “An Introduction to KeyLines and Network Visualization,” Mar. 2014, http://keylines.com/wp-content/uploads/2014/03/KeyLines-White-Paper.pdf downloaded May 12, 2014 in 8 pages.
Keylines.com, “KeyLines Datasheet,” Mar. 2014, http://keylines.com/wp-content/uploads/2014/03/KeyLines-datasheet.pdf downloaded May 12, 2014 in 2 pages.
Keylines.com, “Visualizing Threats: Improved Cyber Security Through Network Visualization,” Apr. 2014, http://keylines.com/wp-content/uploads/2014/04/Visualizing-Threats1.pdf downloaded May 12, 2014 in 10 pages.
Kitts, Paul, “Chapter 14: Genome Assembly and Annotation Process,” The NCBI Handbook, Oct. 2002, pp. 1-21.
Liu, Tianshun, “Combining GIS and the Huff Model to Analyze Suitable Locations for a New Asian Supermarket in the Minneapolis and St. Paul, Minnesota USA,” Papers in Resource Analysis, 2012, vol. 14, pp. 8.
Madden, Tom, “Chapter 16: The BLAST Sequence Analysis Tool,” The NCBI Handbook, Oct. 2002, pp. 1-15.
Manno et al., “Introducing Collaboration in Single-user Applications through the Centralized Control Architecture,” 2010, pp. 10.
Manske, “File Saving Dialogs,” http://www.mozilla.org/editor/ui_specs/FileSaveDialogs.html, Jan. 20, 1999, pp. 7.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.bing.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.google.com.
Map of San Jose, CA. Retrieved Oct. 2, 2013 from http://maps.yahoo.com.
Microsoft—Developer Network, “Getting Started with VBA in Word 2010,” Apr. 2010, http://msdn.microsoft.com/en-us/library/ff604039%28v=office,14%29.aspx as printed Apr. 4, 2014 in 17 pages.
Microsoft Office—Visio, “About connecting shapes,” http://office.microsoft.com/en-us/visio-help/about-connecting-shapes-HP085050369.aspx printed Aug. 4, 2011 in 6 pages.
Microsoft Office—Visio, “Add and glue connectors with the Connector tool,” http://office.microsoft.com/en-us/visio-help/add-and-glue-connectors-with-the-connector-tool-HA010048532.aspx?CTT=1 printed Aug. 4, 2011 in 1 page.
Mizrachi, Ilene, “Chapter 1: GenBank: The Nucleotide Sequence Database,” The NCBI Handbook, Oct. 2002, pp. 1-14.
Palmas, et al., “An Edge-Bundling Layout for Interactive Parallel Coordinates,” Proceedings of the 2014 IEEE Pacific Visualization Symposium, Mar. 2014, pp. 57-64.
Rouse, Margaret, “OLAP Cube,” http://searchdatamanagement.techtarget.com/definition/OLAP-cube, Apr. 28, 2012, pp. 16.
Sigrist, et al., “PROSITE, a Protein Domain Database for Functional Characterization and Annotation,” Nucleic Acids Research, 2010, vol. 38, pp. D161-D166.
Sirotkin et al., “Chapter 13: The Processing of Biological Sequence Data at NCBI,” The NCBI Handbook, Oct. 2002, pp. 1-11.
“The FASTA Program Package,” fasta-36.3.4, Mar. 25, 2011, pp. 29.
Official Communication in European Search Report in Application No. 14189347.9 dated Mar. 4, 2015.
Official Communication in New Zealand Application No. 622513 dated Apr. 3, 2014.
Official Communication in European Application No. EP 14158861.6 dated Jun. 16, 2014.
Official Communication in New Zealand Application No. 622517 dated Apr. 3, 2014.
Official Communication in British Application No. GB1408025.3 dated Nov. 6, 2014.
Official Communication in New Zealand Application No. 624557 dated May 14, 2014.
Official Communication in New Zealand Application No. 628585 dated Aug. 26, 2014.
Official Communication in New Zealand Application No. 628495 dated Aug. 19, 2014.
Official Communication in New Zealand Application No. 628263 dated Aug. 12, 2014.
Official Communication in New Zealand Application No. 628161 dated Aug. 25, 2014.
Official Communication in New Zealand Application No. 627962 dated Aug. 5, 2014.
Official Communication in New Zealand Application No. 628840 dated Aug. 28, 2014.
“Andy Turner's GISRUK 2012 Notes” <https://docs.google.com/document/d/1cTmxg7mVx5gd89lqblCYvCEnHA4QAivH4l4WpyPsqE4/edit?pli=1> printed Sep. 16, 2013 in 15 pages.
Barnes et al., “Viewshed Analysis”, GIS-ARC/INFO 2001, <www.evsc.virginia.edu/˜jhp7e/evsc466/student_pres/Rounds.pdf>.
Carver et al., “Real-Time Visibility Analysis and Rapid Viewshed Calculation Using a Voxel-Based Modelling Approach,” GISRUK 2012 Conference, Apr. 11-13, Lancaster UK, Apr. 13, 2012, pp. 6.
Ghosh, P., “A Solution of Polygon Containment, Spatial Planning, and Other Related Problems Using Minkowski Operations,” Computer Vision, Graphics, and Image Processing, 1990, vol. 49, pp. 1-35.
Gorr et al., “Crime Hot Spot Forecasting: Modeling and Comparative Evaluation”, Grant 98-IJ-CX-K005, May 6, 2002, 37 pages.
Haralick et al., “Image Analysis Using Mathematical Morphology,” Pattern Analysis and Machine Intelligence, IEEE Transactions, Jul. 1987, vol. PAMI-9, No. 4, pp. 532-550.
“HunchLab: Heat Map and Kernel Density Calculation for Crime Analysis,” Azavea Journal, printed from www.azavea.com/blogs/newsletter/v4i4/kernel-density-capabilities-added-to-hunchlab/ on Sep. 9, 2014, 2 pages.
Ipbucker, C., “Inverse Transformation for Several Pseudo-cylindrical Map Projections Using Jacobian Matrix,” ICCSA 2009, Part 1 LNCS 5592, pp. 553-564.
Levine, N., “Crime Mapping and the Crimestat Program,” Geographical Analysis, 2006, vol. 38, pp. 41-56.
Mandagere, Nagapramod, “Buffer Operations in GIS,” <http://www-users.cs.umn.edu/˜npramod/enc_pdf.pdf> retrieved Jan. 28, 2010, pp. 7.
Map Builder, “Rapid Mashup Development Tool for Google and Yahoo Maps!” <http://web.archive.org/web/20090626224734/http://www.mapbuilder.net/> printed Jul. 20, 2012 in 2 pages.
Murray, C., Oracle Spatial Developer's Guide—6 Coordinate Systems (Spatial Reference Systems), <http://docs.oracle.com/cd/B28359_01/appdev.111/b28400.pdf>, Jun. 2009.
Open Street Map, “Amm's Diary:Unconnected ways and other data quality issues,” http://www.openstreetmap.org/user/amm/diary printed Jul. 23, 2012 in 3 pages.
POI Editor, “How to: Create Your Own Points of Interest,” <http://www.poieditor.com/articles/how_to_create_your_own_points_of_interest/> printed Jul. 22, 2012 in 4 pages.
Pozzi et al., “Vegetation and Population Density in Urban and Suburban Areas in the U.S.A.” Third International Symposium of Remote Sensing of Urban Areas Istanbul, Turkey, Jun. 2002, pp. 8.
Qiu, Fang, “3d Analysis and Surface Modeling”, <http://web.archive.org/web/20091202221925/http://www.utsa.edu/Irsg/Teaching/EES6513/08-3D.pdf> printed Sep. 16, 2013 in 26 pages.
Reddy et al., “Under the hood of GeoVRML 1.0,” SRI International, Proceedings of the fifth symposium on Virtual Reality Modeling Language (Web3D-VRML), New York, NY, Feb. 2000, pp. 23-28. <http://pdf.aminer.org/000/648/038/under_the_hood_of_geovrml.pdf>.
Reibel et al., “Areal Interpolation of Population Counts Using Pre-classified Land Cover Data,” Population Research and Policy Review, 2007, vol. 26, pp. 619-633.
Reibel, M., “Geographic Information Systems and Spatial Data Processing in Demography: a Review,” Population Research and Policy Review, 2007, vol. 26, pp. 601-618.
Rizzardi et al., “Interfacing U.S. Census Map Files with Statistical Graphics Software: Application and Use in Epidemiology,” Statistics in Medicine, Oct. 1993, vol. 12, No. 19-20, pp. 1953-1964.
Snyder, “Map Projections—A Working Manual,” U.S. Geological Survey Professional paper 1395, United States Government Printing Office, Washington: 1987, pp. 11-21 and 60-70.
Sonris, “Using the Area of Interest Tools,” <http://web.archive.org/web/20061001053327/http://sonris-www.dnr.state.la.us/gis/instruct_files/tutslide12> printed Jan. 3, 2013 in 1 page.
Tangelder et al., “Freeform Shape Matching Using Minkowski Operations,” The Netherlands, Jun. 1996, pp. 12.
Valentini et al., “Ensembles of Learning Machines”, M. Marinaro and R. Tagliaferri (Eds.): WIRN VIETRI 2002, LNCS 2486, pp. 3-20.
VB Forums, “Buffer a Polygon,” Internet Citation, <http://www.vbforums.com/showthread.php?198436-Buffer-a-Polygon>, Specifically Thread #1, #5 & #11 retrieved on May 2, 2013, pp. 8.
Vivid Solutions, “JTS Topology Suite: Technical Specifications,” <http://www.vividsolutions.com/jts/bin/JTS%20Technical%20Specs.pdf> Version 1.4, 2003, pp. 36.
Wikipedia, “Douglas-Peucker-Algorithms,” <http://de.wikipedia.org/w/index.php?title=Douglas-Peucker-Algorithmus&oldid=91846042> printed Jul. 2011, pp. 2.
Wikipedia, “Ramer-Douglas-Peucker Algorithm,” <http://en.wikipedia.org/wiki/Ramer%E2%80%93Douglas%E2%80%93Peucker_algorithm> printed Jul. 2011, pp. 3.
Wongsuphasawat et al., “Visual Analytics for Transportation Incident Data Sets,” Transportation Research Record 2138, 2009, pp. 135-145.
Woodbridge, Stephen, “[geos-devel] Polygon simplification,” <http://lists.osgeo.org/pipermail/geos-devel/2011-May/005210.html> dated May 8, 2011, pp. 3.
IBM—i2 Integrated Law Enforcement, https://www-03.ibm.com/software/products/en/integrated-law-enforcement, as printed Feb. 15, 2017 in 2 pages.
IBM—i2 Analyze, https://www-03.ibm.com/software/products/en/i2-analyze, as printed Feb. 15, 2017 in 2 pages.
IBM—Data analysis—i2 Analyst's Notebook, http://www-03.ibm.com/software/products/en/analysts-notebook, as printed Feb. 16, 2017 in 2 pages.
Visual Analysis, “Overview of merging timeline charts and creating hybrid charts,” available at https://www.youtube.com/watch?v=dl6jzNtEVpA, as published on Mar. 9, 2015.
IBM Analytics, “IBM i2 Intelligence Analysis Portfolio Overview,” available at https://www.youtube.com/watch?v=ElFu_oUiaBY, as published on Sep. 24, 2015.
i2—An IBM Company, “IBM i2 Intelligent Law Enforcement Demo,” available at https://www.youtube.com/watch?v=_KCXZ2iTMXQ, as published on Dec. 3, 2012.
Yair Shaked, “IBM i2 Enterprise Insight Analysis—cyber Demo,” available at https://www.youtube.com/watch?v=ZXmTWKqkfF4, as published on Nov. 19, 2015.
Visual Analysis, “Overview of importing data and creating timelines,” available at https://www.youtube.com/watch?v=SovxKrvkZZs, as published on Mar. 9, 2015.
IBM Corporation, “IBM i2 Analyst's Notebook,” Aug. 2015, in 4 pages.
IBM Corporation, “IBM i2 Analyst's Notebook Connector for ESRI,” May 2012, in 3 pages.
IBM Corporation, “IBM i2 Enterprise Insight Analysis V2.0 delivers a modern contextual user interface and enhanced software operational warehouse support,” http://www-01.ibm.com/common/ssi/ShowDoc.wss?docURL=/common/ssi/rep_ca/2/897/ENUS215-302/index.html&lang=en&request_locale=en, as published on Sep. 1, 2015.
IBM Support, “Software lifecycle—i2 Analyst's Notebook Premium 9.0.0,” https://www-01.ibm.com/software/support/lifecycleapp/PLCDetail.wss?q45=I570331B72886X86, as printed Mar. 7, 2017 in 1 page.
IBM Support, “Software lifecycle—i2 Enterprise Insight Analysis 2.0.0,” https://www-01.ibm.com/software/support/lifecycleapp/PLCDetail.wss?q45=E170786H45496l53, as printed Mar. 7, 2017 in 1 page.
i2 a ChoicePoint Company, “i2 Analyst's Notebook 7 User Guide: Creating Charts” Jun. 2007, 373 pages.
Gatewaynews, “New Crime Fighting Tool ‘Coplink’” available at https://www.youtube.com/watch?v=GbU6E0grnTw, as published on Mar. 8, 2008.
COPLINK, “Incident Analyzer User Guide,” created Nov. 5, 2010 (as indicated by the PDF file metadata), 14 pages.
International Search Report and Written Opinion in Application No. PCT/US2009/056703 dated Mar. 15, 2010.
Notice of Allowance for U.S. Appl. No. 13/948,859 dated Dec. 10, 2014.
Notice of Allowance for U.S. Appl. No. 14/319,765 dated Nov. 25, 2016.
Official Communication for Australian Patent Application No. 2010227081 dated Mar. 18, 2011.
Official Communication for Australian Patent Application No. 2010257305 dated Apr. 12, 2011.
Official Communication for Australian Patent Application No. 2010257305 dated Sep. 22, 2011.
Official Communication for European Patent Application No. 08839003.4 dated Jun. 12, 2013.
Official Communication for European Patent Application No. 08839003.4 dated Aug. 14, 2012.
Official Communication for European Patent Application No. 10195798.3 dated May 17, 2011.
Official Communication for European Patent Application No. 12186236.1 dated May 17, 2013.
Official Communication for European Patent Application No. 14159464.8 dated Jul. 31, 2014.
Official Communication for European Patent Application No. 14189347.9 dated Oct. 13, 2017.
Official Communication for Great Britain Patent Application No. 1319225.7 dated May 2, 2014.
Official Communication for Great Britain Patent Application No. 1404457.2 dated Aug. 14, 2014.
Official Communication for New Zealand Patent Application No. 616167 dated Oct. 10, 2013.
Official Communication for U.S. Appl. No. 12/840,673 dated Sep. 17, 2014.
Official Communication for U.S. Appl. No. 12/840,673 dated Jan. 2, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Mar. 17, 2015.
Official Communication for U.S. Appl. No. 13/728,879 dated Jan. 27, 2015.
Official Communication for U.S. Appl. No. 14/319,161 dated Jan. 23, 2015.
Official Communication for U.S. Appl. No. 14/672,009 dated Jul. 14, 2017.
Official Communication for U.S. Appl. No. 14/672,009 dated May 26, 2017.
Related Publications (1)
Number Date Country
20150178877 A1 Jun 2015 US
Provisional Applications (1)
Number Date Country
61893058 Oct 2013 US
Continuations (1)
Number Date Country
Parent 14108187 Dec 2013 US
Child 14581823 US