The present invention relates to interactive graphical displays of data visualizations representing quantitative data.
Electronic devices—such as desktop, laptop, and tablet computers, as well as mobile devices such as smartphones—are often capable of storing and displaying various forms of data visualizations (or “reports”) representing quantitative data. Data visualizations can represent any information, such as financial information, marketing information, and/or the like, in any suitable tabular, text-based, and/or graphical format.
It is known in the art to provide interactivity for data visualizations. In many computing environments, including web-based applications such as browsers for presenting web pages, a user can interact with a data visualization so as to change the format and/or nature of the displayed data, to highlight certain elements and/or obscure others, and/or to zoom into and out of a displayed report.
One particular form of interactivity is to provide a mechanism for a user to invoke and/or control the display of secondary reports from a primary report. Yuen, U.S. Pat. No. 5,423,033 for “Report Generation System and Method”, issued Jun. 6, 1995, describes a report generation system and method wherein a secondary report can be generated containing detailed information concerning a specific data element of a primary report. A user selects a data element on a primary report; upon activation, a secondary report is generated, using new parameters determined by the particular data element selected by the user.
Yuen's technique, and similar techniques, offer limited functionality as to the type(s) of secondary report that can be generated, and as to the degree of user control of the nature and format of the secondary report. In general, such techniques generate a single type of secondary report based solely on the user's selection of a particular data element. The user is not generally able to interactively select among a plurality of available secondary reports or visualizations directly from the primary report.
According to various embodiments of the present invention, a user interface is provided for a computer or other electronic device that displays quantitative data in graphical form, allowing a user to dynamically invoke and control the display of secondary data visualizations based on a selected element of a primary data visualization. In at least one embodiment, previews of these secondary data visualizations are presented in response to user interaction with the primary visualization. In response to user input, one or more of the previews can be dynamically expanded. This allows a user to dynamically “drill down” into selected aspects and/or elements of the primary data visualization, in a manner that is highly user-configurable, interactive, and responsive.
In at least one embodiment, the system and method of the present invention are implemented in such a manner as to respond to direct manipulation of the displayed elements, for example via a touch-sensitive screen. Any touch-sensitive, proximity-sensitive, or gesture-based system can be used. Known gestures such as pinching and rotating can be interpreted in an intuitive manner to provide improved control and feedback in response to user input.
For example, in at least one embodiment, a gesture including a two-finger spreading motion invokes previews of available secondary visualizations for a given element of a displayed primary visualization. The position at which the gesture is performed specifies which data element of the primary visualization is being explored. The axis defined by the two points of contact determines which of the displayed previews of secondary visualizations is to be highlighted and/or expanded; the user can rotate his or her fingers to change the axis and thereby highlight and/or expand different secondary visualizations. In at least one embodiment, the user can tap on a displayed preview to expand it, or can increase the distance between the spread fingers, or perform some other action to cause the displayed preview to be expanded.
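By way of illustration only, the following TypeScript sketch shows one possible way to compute the geometry of such a gesture from two contact points; the type and function names are hypothetical and form no part of the invention.

```typescript
// Hypothetical sketch of the two-finger gesture geometry described
// above; Point, midpoint, axisAngle, and spreadDistance are
// illustrative names only.

interface Point {
  x: number;
  y: number;
}

// Midpoint between the contact points: identifies which data element
// of the primary visualization is being explored.
function midpoint(a: Point, b: Point): Point {
  return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
}

// Angle (in radians) of the axis defined by the two contact points:
// rotating the fingers changes this angle and thereby which preview
// is highlighted.
function axisAngle(a: Point, b: Point): number {
  return Math.atan2(b.y - a.y, b.x - a.x);
}

// Distance between the contact points: spreading the fingers farther
// apart can be used to expand the displayed preview.
function spreadDistance(a: Point, b: Point): number {
  return Math.hypot(b.x - a.x, b.y - a.y);
}
```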
As described in more detail herein, in various embodiments, a hierarchy of visualizations can be established, and the user can navigate among primary, secondary, tertiary, and/or additional levels of visualizations in a similar interactive manner. The system and method of the present invention thereby provide an improved level of user control and interactivity in the display of visualizations on an electronic device.
For purposes of the description herein, the terms “report”, “data visualization”, “visualization”, and “graph” are used interchangeably to refer to any suitable representation or representations of quantitative data, with the examples depicted and described herein being provided for illustrative purposes with no intention of limiting the invention to those particular types of visualizations. Such representations can be graphical, tabular, text-based, or any combination thereof.
Further details and variations are described herein.
The accompanying drawings illustrate several embodiments of the invention. Together with the description, they serve to explain the principles of the invention according to the embodiments. One skilled in the art will recognize that the particular embodiments illustrated in the drawings are merely exemplary, and are not intended to limit the scope of the present invention.
System Architecture
According to various embodiments, the present invention can be implemented on any electronic device equipped to receive, store, and present quantitative data. Such an electronic device may be, for example, a desktop computer, laptop computer, smartphone, tablet computer, or the like.
Although the invention is described herein in connection with an implementation in a computer, one skilled in the art will recognize that the techniques of the present invention can be implemented in other contexts, and indeed in any suitable device capable of presenting quantitative data graphically and/or interactively. Accordingly, the following description is intended to illustrate various embodiments of the invention by way of example, rather than to limit the scope of the claimed invention.
Referring now to the accompanying drawings, there is shown an architecture for practicing the present invention in a standalone electronic device 101, according to one embodiment.
In at least one embodiment, device 101 has a number of hardware components well known to those skilled in the art. Input device 102 can be any element that receives input from user 100, including, for example, a keyboard, mouse, stylus, touch-sensitive screen (touchscreen), touchpad, trackball, accelerometer, five-way switch, microphone, or the like. Input can be provided via any suitable mode, including for example, one or more of: pointing, tapping, typing, dragging, and/or speech. Display screen 103 can be any element that graphically displays quantitative data.
Processor 104 can be a conventional microprocessor for performing operations on data under the direction of software, according to well-known techniques. Memory 105 can be random-access memory, having a structure and architecture as are known in the art, for use by processor 104 in the course of running software.
Data store 106 can be any magnetic, optical, or electronic storage device for data in digital form; examples include flash memory, magnetic hard drive, CD-ROM, DVD-ROM, or the like. In at least one embodiment, data store 106 stores information describing quantitative data 107.
Data store 106 can be local or remote with respect to the other components of device 101. In at least one embodiment, device 101 is configured to retrieve data from a remote data storage device when needed. Such communication between device 101 and other components can take place wirelessly, by Ethernet connection, via a computing network such as the Internet, or by any other appropriate means. This communication with other electronic devices is provided as an example and is not necessary to practice the invention.
In at least one embodiment, data store 106 is detachable in the form of a CD-ROM, DVD, flash drive, USB hard drive, or the like. Quantitative data 107 can be entered into such a detachable data store 106 from a source outside of device 101 and later displayed after data store 106 is connected to device 101. In another embodiment, data store 106 is fixed within device 101.
Referring now to the accompanying drawings, there is shown an architecture for practicing the present invention in a client/server environment, according to one embodiment.
Client device 108 can be any electronic device incorporating input device 102 and display screen 103, such as a desktop computer, laptop computer, personal digital assistant (PDA), cellular telephone, smartphone, music player, handheld computer, tablet computer, kiosk, game system, or the like. Any suitable communications network 109, such as the Internet, can be used as the mechanism for transmitting data between client 108 and server 110, according to any suitable protocols and techniques. In addition to the Internet, other examples include cellular telephone networks, EDGE, 3G, 4G, long term evolution (LTE), Session Initiation Protocol (SIP), Short Message Peer-to-Peer protocol (SMPP), SS7, WiFi, Bluetooth, ZigBee, Hypertext Transfer Protocol (HTTP), Secure Hypertext Transfer Protocol (SHTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), and/or the like, and/or any combination thereof. In at least one embodiment, client device 108 transmits requests for data via communications network 109, and receives responses from server 110 containing the requested data.
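By way of illustration only, a minimal TypeScript sketch of such a request/response exchange follows, assuming an HTTP/JSON interface; the endpoint path and function name are hypothetical and form no part of the invention.

```typescript
// Hypothetical sketch: client device 108 requests quantitative data
// from server 110 over communications network 109.

async function requestData(
  serverUrl: string,
  dataSetId: string
): Promise<unknown> {
  const response = await fetch(
    `${serverUrl}/datasets/${encodeURIComponent(dataSetId)}`
  );
  if (!response.ok) {
    throw new Error(`Server responded with status ${response.status}`);
  }
  // The response from server 110 contains the requested data.
  return response.json();
}
```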
In this implementation, server 110 is responsible for data storage and processing, and incorporates data store 106 for storing quantitative data 107. Server 110 may include additional components as needed for retrieving data from data store 106 in response to requests from client device 108.
In at least one embodiment, quantitative data 107 is organized into one or more well-ordered data sets, with one or more data entries in each set. Data store 106, however, can have any suitable structure. Accordingly, the particular organization of quantitative data 107 within data store 106 need not resemble the form in which it is displayed to user 100. In at least one embodiment, an identifying label is also stored along with each data entry, for display alongside that entry.
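By way of illustration only, one possible (hypothetical) TypeScript representation of such well-ordered data sets, with an identifying label stored alongside each entry:

```typescript
// Hypothetical sketch of quantitative data 107 organized into
// well-ordered data sets; the type names are illustrative only.

interface DataEntry {
  label: string; // identifying label, displayed with the entry
  value: number; // the quantitative value itself
}

interface DataSet {
  name: string;
  entries: DataEntry[]; // kept in a well-defined order
}

// Example: a data set backing a "total yearly sales" visualization.
const yearlySales: DataSet = {
  name: "Total yearly sales",
  entries: [
    { label: "2009", value: 1_200_000 },
    { label: "2010", value: 1_450_000 },
  ],
};
```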
Quantitative data can be retrieved from client-based or server-based data store(s), and/or from any other source. In at least one embodiment, input device 102 is configured to receive data entries from user 100, to be added to quantitative data 107 held in data store 106. User 100 may provide such data entries via the hardware and software components described above according to means that are well known to those skilled in the art.
Display screen 103 presents one or more data visualizations that present quantitative data 107 in some visual form, whether text-based, tabular, graphical, interactive, and/or any other suitable format. Such data visualizations may, for example, take the form of bar graphs or other visual graphs that present all of, or some subset of, quantitative data 107. In at least one embodiment where only some of quantitative data 107 is presented at a time, a dynamic control, such as a scrolling mechanism, may be available via input device 102 to change which data entries are currently displayed, and/or to alter the manner in which the data is displayed.
In at least one embodiment, data visualizations presented via display screen 103 include visual cues, such as height, distance, and/or area, to convey the value of each data entry. In at least one embodiment, labels accompany data entries on display screen 103, or can be displayed when user 100 taps on or clicks on a data entry, or causes an on-screen cursor to hover over a data entry.
Method
As described in more detail herein, various embodiments of the present invention provide techniques for dynamically expanding aspects of displayed data visualizations in an interactive manner, such as for example in response to user input.
Referring now to the accompanying drawings, there is shown a flowchart depicting a method for dynamically expanding aspects of displayed data visualizations, according to one embodiment.
A primary visualization is displayed 210. The primary visualization can be any representation of quantitative data 107, such as a graph, table, chart, and/or the like.
According to various embodiments, user 100 can interact with the primary visualization in any of a number of different ways. In at least one embodiment, display screen 103 is a touch-sensitive screen, allowing user 100 to interact with the primary visualization by touching the screen or causing a stylus or other object to touch the screen. In at least one embodiment, user 100 can move an on-screen cursor, via an input device 102 such as a mouse, trackball, touchpad, keyboard, five-way switch, and/or any other suitable device, to point to and interact with various areas of the primary visualization. Such interactions may include resizing, moving, scrolling, and/or reformatting the primary visualization, and/or performing any other type of operation in connection with the visualization.
In at least one embodiment, user 100 can provide input causing one or more secondary visualization(s) to be displayed. Such secondary visualization(s) may provide more details regarding a particular aspect or element of the primary visualization (such as a rectangle 301 of graph 300), and/or they may present data in a different form than the primary visualization. For example, a secondary visualization may depict the same or similar data as that displayed in the first visualization, but displayed according to a different dimension or in a different format or scale. As another example, a secondary visualization may depict a subset or superset of the data displayed in the first visualization.
As an example, a primary data visualization may depict total yearly sales. Secondary visualizations associated with such a primary visualization may include total yearly sales subdivided according to various categories.
In at least one embodiment, the input to activate a secondary visualization may include, for example, tapping or clicking on a particular element of the primary visualization, or activating a keyboard command, or any other suitable technique. The position at which such input is provided may determine the particular aspect of the primary visualization to be expanded or presented in the secondary visualization, and/or it may determine the format and/or nature of the secondary visualization. For example, tapping on a particular rectangle 301 may cause secondary visualization(s) associated with a corresponding data value to be made available.
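By way of illustration only, a hypothetical TypeScript sketch of hit-testing an input position against the rectangles of a bar graph to identify the data element whose secondary visualizations should be made available:

```typescript
// Hypothetical sketch: find the rectangle 301 (if any) containing
// the point at which the user's input was provided.

interface Rect {
  x: number;
  y: number;
  width: number;
  height: number;
  label: string; // identifies the underlying data element
}

function hitTest(rects: Rect[], px: number, py: number): Rect | undefined {
  return rects.find(
    (r) =>
      px >= r.x &&
      px <= r.x + r.width &&
      py >= r.y &&
      py <= r.y + r.height
  );
}
```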
In at least one embodiment, a set of previews of available secondary visualizations can be presented, to allow user 100 to select a desired secondary visualization more intelligently, and to give user 100 a sense of what kinds of secondary visualizations are available. Accordingly, in response to receiving 211 user input to activate previews of secondary visualizations, previews are displayed 212. In the example described above, previews could be displayed for the various visualizations depicting total yearly sales subdivided by such categories.
Any suitable form of input can be provided for activating previews of available secondary visualizations. In one example, a pinch gesture is used, as described in more detail herein. Alternatively, input for activating previews of available secondary visualizations can be provided in the form of tapping, gesturing, clicking, keyboard input, interaction with on-screen buttons or links, and/or the like. Voice command can also be used.
The position at which the user's input is provided may determine which previews are displayed; for example, tapping on a particular rectangle 301 may activate display of previews for secondary visualization(s) associated with a data value corresponding to that rectangle 301.
In at least one embodiment, user 100 performs some sort of gesture to activate previews of secondary visualizations, wherein the gesture is detected by input device 102 and/or by a touch-sensitive display screen 103. For example, user 100 may touch screen 103 at two contact points 401A, 401B and spread his or her fingers apart in a two-finger spreading gesture to activate the previews.
In at least one embodiment, the on-screen location of the input (for example the position of contact points 401A, 401B, or the midpoint between contact points 401A, 401B) determines which portion of the displayed data is to be expanded by presentation of secondary visualizations. Accordingly, in the example described herein, previews 501 of secondary visualizations associated with rectangle 301B are displayed, since the midpoint between contact points 401A, 401B lies on rectangle 301B.
Previews 501 can take any suitable form. For example, previews 501 can be presented as thumbnail depictions of the available secondary visualizations, arranged on screen 103 around the selected element of the primary visualization.
In at least one embodiment, previews 501 for all available secondary visualizations are displayed. In at least one embodiment, previews 501 for a subset of available secondary visualizations are shown, for example if there are too many available secondary visualizations to effectively display previews 501 for all of them. The display may scroll through available visualizations, may present only the most popular or suitable ones, or may provide some mechanism for user 100 to view previews other than those initially displayed. In at least one embodiment, a hierarchy of previews may be established, allowing user 100 to more easily navigate to the one he or she is interested in; such a technique is described in more detail below.
In at least one embodiment, user 100 can provide input 214 to cause one of the displayed previews 501 to be highlighted; the selected preview is highlighted 216. For example, user 100 may tap on one of the displayed previews 501 to cause it to be highlighted. Alternatively, user 100 may rotate the axis of the pinch gesture to cause different previews 501 to be highlighted: as the axis defined by contact points 401A, 401B rotates, the preview 501 toward which the axis points becomes highlighted, and rotating the fingers further moves the highlight from one preview 501 to the next.
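By way of illustration only, the following hypothetical TypeScript sketch maps the axis angle to a highlighted preview, assuming for concreteness that previews 501 are arranged at evenly spaced angles around the gesture midpoint:

```typescript
// Hypothetical sketch: map the rotating axis to one of the displayed
// previews, assuming an evenly spaced angular arrangement.

function highlightedPreviewIndex(
  axisAngleRad: number,
  previewCount: number
): number {
  const sector = (2 * Math.PI) / previewCount;
  // Normalize the angle into [0, 2π) before assigning a sector.
  const normalized =
    ((axisAngleRad % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI);
  return Math.floor(normalized / sector) % previewCount;
}
```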
Highlighted preview 501 can be displayed in any visually distinctive manner. For example, it can be brightened, shown in color (while other previews 501 are black-and-white), enlarged, and/or made dynamic; any other suitable effect may be applied. In at least one embodiment, previews 501 initially depict various types of visualizations without actually containing the visualizations themselves; highlighting a preview 501 may cause that preview to go “live”, showing actual data based on the currently selected element of the primary visualization. In at least one embodiment, highlighting a preview 501 causes that preview to be selected for further operations, such as hierarchical navigation, configuration, naming, modification, deletion, and/or the like.
In at least one embodiment, when previews 501 are displayed, the primary visualization is temporarily dismissed, grayed out, blurred, and/or shown in a subdued manner.
In at least one embodiment, additional information may be displayed for a selected preview 501. For example, a text box, ToolTip, or other element containing descriptive information may be shown for a preview 501 when that preview is selected. The descriptive element can be displayed alongside the selected preview 501, or on top of it (for example, in a translucent manner), or at some other location on the display screen. In another embodiment, an audio description (such as speech) of the selected preview 501 can be output on a speaker or similar component.
In at least one embodiment, if user 100 spreads his or her fingers further apart (or otherwise continues the gesture that caused previews 501 to be activated), all previews 501, or the selected preview 501, can dynamically expand in size. Dynamic resizing of previews 501, or the selected preview 501, can continue in response to further gestures by user 100; for example, previews 501, or the selected preview 501, can change their size dynamically based on the user 100 bringing his or her fingers closer together or farther apart.
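By way of illustration only, one possible (hypothetical) resizing rule in TypeScript, assuming the preview scale tracks the ratio of the current finger spread to the spread at the moment previews were activated; the clamp bounds are arbitrary:

```typescript
// Hypothetical sketch of dynamic preview resizing driven by the
// distance between the user's fingers.

function previewScale(
  initialDistance: number,
  currentDistance: number
): number {
  const scale = currentDistance / initialDistance;
  // Clamp so the preview stays within reasonable on-screen bounds.
  return Math.min(Math.max(scale, 0.5), 3.0);
}
```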
In at least one embodiment, user 100 can shift finger position(s) with respect to the primary visualization, while previews 501 are being displayed. If user 100 shifts his or her fingers so that the center point between contact points 401A, 401B moves to a different element of the primary visualization (such as a different rectangle 301), the displayed previews 501 can be dynamically updated to reflect the newly selected element of the primary visualization. For example, if user 100 shifts so that the center point is now on rectangle 301C instead of rectangle 301B, previews 501 are updated so that they now depict available secondary visualizations for rectangle 301C. In at least one embodiment, user 100 can move his or her fingers around screen 103 to cause different elements of the primary visualization to be selected and thereby cause different sets of previews 501 to be displayed. In addition, rotation of the axis between contact points 401A, 401B can continue to dynamically change the selected preview 501.
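By way of illustration only, a hypothetical TypeScript sketch of refreshing the preview set when the gesture midpoint crosses into a different element of the primary visualization; the state and callback names are illustrative:

```typescript
// Hypothetical sketch: update the selected element, and the previews
// shown for it, only when the midpoint moves onto a new element.

interface GestureState {
  selectedElement: string | null;
}

function updateSelection(
  state: GestureState,
  elementAtMidpoint: string | null,
  refreshPreviews: (element: string) => void
): void {
  if (
    elementAtMidpoint !== null &&
    elementAtMidpoint !== state.selectedElement
  ) {
    state.selectedElement = elementAtMidpoint;
    refreshPreviews(elementAtMidpoint);
  }
}
```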
In embodiments using different input modes, such dynamic updating of previews can be performed in a manner suited to the particular input mode being used. For example, appropriate mechanisms for changing the set of displayed previews 501, and/or selecting particular previews 501, can be implemented for various types of input devices 102.
In at least one embodiment, user 100 can provide input 215 to cause one of the displayed previews 501 to be expanded; the selected preview is expanded 217. For example, in at least one embodiment, user 100 can remove his or her fingers from screen 103 while a particular preview 501 is selected, to cause that preview 501 to be expanded. In such a technique, lifting the fingers while a preview 501 is selected causes that preview to expand into a full-sized display 1001 of the corresponding secondary visualization.
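By way of illustration only, a hypothetical TypeScript sketch of handling finger removal, assuming a browser TouchEvent source; selectedPreview and expandPreview are illustrative application state and helper names:

```typescript
// Hypothetical sketch: when the last finger leaves the screen while
// a preview is selected, expand that preview to full size.

function onTouchEnd(
  event: TouchEvent,
  selectedPreview: string | null,
  expandPreview: (previewId: string) => void
): void {
  if (event.touches.length === 0 && selectedPreview !== null) {
    expandPreview(selectedPreview);
  }
}
```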
In other embodiments, other input mechanisms can be used for invoking expansion of a preview 501. For example, user 100 can tap on or click on a displayed preview 501 to cause it to be expanded 217. Alternatively, user 100 can hit a key, click a mouse, or perform some other input operation to cause a displayed or selected preview 501 to be expanded 217.
Any suitable mechanism can be provided for causing dismissal of full-sized display 1001 of the secondary visualization. For example, user 100 may tap on the secondary visualization to dismiss it or to cause it to transition back to preview form. Alternatively, user 100 can click on a dismiss button, or hit a key, or perform some other input operation to cause the secondary visualization to be dismissed. In at least one embodiment, user 100 can interact with the secondary visualization, for example to perform additional operations to view and/or manipulate various aspects and elements of the secondary visualization in different ways.
In at least one embodiment, while previews 501 are displayed, user 100 can provide input 218 to cause the displayed previews 501 to be dismissed; in response, previews 501 are dismissed 219, and the original form of primary visualization 300 is restored. Any suitable input can be provided for causing such dismissal 219. For example, user 100 can tap on an area of screen 103, or click on a dismiss button, or hit a key, or perform some other input operation to cause previews 501 to be dismissed.
Hierarchy of Visualizations
In at least one embodiment, available secondary visualizations can be organized in a hierarchy. The user can navigate the hierarchy to find and select a particular visualization for viewing and/or activation. The hierarchy can be organized according to any suitable scheme, such as by data type, format, style, and/or the like. Any number of levels can be available within the hierarchy.
Any suitable mechanism can be provided for navigating the hierarchy of visualizations. In at least one embodiment, a hierarchy of previews is made available; a top level can be provided to indicate a particular type of visualization, and subordinate levels of previews can be provided so as to provide the user with information about individual visualizations within the type. In at least one embodiment, so as to avoid cluttering the screen with an excessive number of previews at any given time, each set of second-level previews associated with a particular first-level preview is displayed only when user 100 selects, activates, highlights, or otherwise interacts with that corresponding first-level preview. In at least one embodiment, multiple (or all) available sets of second-level previews can be displayed concurrently. In at least one embodiment, user 100 can select which sets of second-level previews should be displayed at any given time. In at least one embodiment, similar techniques can be used for successively lower levels of previews.
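By way of illustration only, a hypothetical TypeScript sketch of such a hierarchy, in which second-level previews are visible only while their first-level parent is selected:

```typescript
// Hypothetical sketch of a hierarchy of previews; the node shape is
// illustrative only.

interface PreviewNode {
  id: string;
  title: string;
  children?: PreviewNode[]; // subordinate (second-level) previews
}

// All first-level previews are visible, plus the second-level
// previews belonging to the currently selected first-level preview.
function visiblePreviews(
  firstLevel: PreviewNode[],
  selectedId: string | null
): PreviewNode[] {
  const selected = firstLevel.find((p) => p.id === selectedId);
  return [...firstLevel, ...(selected?.children ?? [])];
}
```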
Referring now to the accompanying drawings, there is shown a flowchart depicting a method of navigating a hierarchy of previews of visualizations, according to one embodiment.
Although the example described here includes two levels of previews, one skilled in the art will recognize that the same techniques can be extended to any number of levels within the hierarchy.
A primary visualization is displayed 210. In response to receiving 231 user input to activate first-level previews of secondary visualizations, previews are displayed 232. Such display of first-level previews can be performed in a manner similar to that described above in connection with steps 211 and 212.
In at least one embodiment, user 100 can provide input 233 to cause one of the displayed first-level previews to be highlighted; the selected first-level preview is highlighted 234. Such highlighting can be performed in a manner similar to that described above in connection with steps 214 and 216.
In at least one embodiment, user 100 can provide input 235 to cause second-level previews to be displayed for a highlighted first-level preview. In response, second-level previews are displayed 236.
Any suitable mechanism can be used for allowing user 100 to provide input 235 to cause second-level previews to be displayed 236. In the example, user 100 maintains the gesture so that axis 601, defined by the user's finger positions, points toward first-level preview 501D; second-level previews 501E-H associated with that first-level preview are then displayed.
In the example, only those second-level previews 501E-H associated with the currently selected first-level preview 501D are displayed. If user 100 rotates his or her fingers so that axis 601 no longer points to first-level preview 501D, second-level previews 501E-H are dismissed. One skilled in the art will recognize that other input schemes are possible, including for example a scheme whereby previews are dismissed only upon receipt of explicit input from user 100 to dismiss them.
In the example, selected first-level preview 501D depicts a visualization that is a representative example of a category or type of visualizations. Other visualizations 501E-501H are part of the same category or type. The preview 501D shown as the representative example may be chosen based on a determination that its visualization is a “best fit” for the needs of user 100, or it can be chosen by some other means.
In another embodiment, each first-level preview can instead be an indication of a category or type, rather than a representative example depicting a particular visualization of that category or type; second-level visualizations can belong to the indicated category or type.
In at least one embodiment, the display of second-level previews 501E-501H is persistent; once they are displayed, they remain on the screen to allow user 100 to select among them and/or interact with them. For example, user 100 can drag, tap, click on, or otherwise interact with one of the displayed second-level previews to activate it, causing it to be expanded.
In the example, user 100 activates one of the displayed second-level previews, for example by dragging or tapping on it.
Referring again to the method described above, in at least one embodiment, user 100 can provide input to cause an activated second-level preview to be expanded; in response, the selected second-level preview is expanded.
In the example, the activated second-level preview is expanded into a full-size display of the corresponding visualization 1400.
As described above, all transitions can be implemented in a smooth manner, with previews expanding, changing position, and/or being dismissed gradually using effects such as zooming in/out, fading in/out, and/or dissolving. One skilled in the art will recognize that any suitable transition effects can be used to reinforce relationships between elements as they are moved, introduced, dismissed, or otherwise transitioned from one state to another.
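By way of illustration only, a hypothetical TypeScript sketch of one such smooth expand transition, using the standard Web Animations API; the duration and easing values are arbitrary choices:

```typescript
// Hypothetical sketch: animate a preview element as it expands,
// zooming and fading in rather than appearing abruptly.

function animateExpand(element: HTMLElement): Animation {
  return element.animate(
    [
      { transform: "scale(0.2)", opacity: 0.3 },
      { transform: "scale(1.0)", opacity: 1.0 },
    ],
    { duration: 250, easing: "ease-out", fill: "forwards" }
  );
}
```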
User 100 can dismiss displayed visualization 1400, for example by tapping on or clicking on a dismiss button or icon (not shown), or entering a keyboard command, or by another mechanism. In at least one embodiment, visualization 1400 may automatically be dismissed after some predefined period of time. In at least one embodiment, after visualization 1400 is dismissed, the display may return to its initial state, or to the state that was shown just before visualization 1400 was invoked, or to any other suitable state.
One skilled in the art will recognize that the examples depicted and described herein are merely illustrative, and that other arrangements of user interface elements can be used. In addition, some of the depicted elements can be omitted or changed, and additional elements depicted, without departing from the essential characteristics of the invention.
The present invention has been described in particular detail with respect to possible embodiments. Those of skill in the art will appreciate that the invention may be practiced in other embodiments. First, the particular naming of the components, capitalization of terms, the attributes, data structures, or any other programming or structural aspect is not mandatory or significant, and the mechanisms that implement the invention or its features may have different names, formats, or protocols. Further, the system may be implemented via a combination of hardware and software, or entirely in hardware elements, or entirely in software elements. Also, the particular division of functionality between the various system components described herein is merely exemplary, and not mandatory; functions performed by a single system component may instead be performed by multiple components, and functions performed by multiple components may instead be performed by a single component.
Reference in the specification to “one embodiment” or to “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment of the invention. The appearances of the phrases “in one embodiment” or “in at least one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
In various embodiments, the present invention can be implemented as a system or a method for performing the above-described techniques, either singly or in any combination. In another embodiment, the present invention can be implemented as a computer program product comprising a non-transitory computer-readable storage medium and computer program code, encoded on the medium, for causing a processor in a computing device or other electronic device to perform the above-described techniques.
Some portions of the above are presented in terms of algorithms and symbolic representations of operations on data bits within a memory of a computing device. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of steps (instructions) leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared and otherwise manipulated. It is convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. Furthermore, it is also convenient at times, to refer to certain arrangements of steps requiring physical manipulations of physical quantities as modules or code devices, without loss of generality.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “displaying” or “determining” or the like, refer to the action and processes of a computer system, or similar electronic computing module and/or device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Certain aspects of the present invention include process steps and instructions described herein in the form of an algorithm. It should be noted that the process steps and instructions of the present invention can be embodied in software, firmware and/or hardware, and when embodied in software, can be downloaded to reside on and be operated from different platforms used by a variety of operating systems.
The present invention also relates to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computing device. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, DVD-ROMs, magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, flash memory, solid state drives, magnetic or optical cards, application specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each coupled to a computer system bus. Further, the computing devices referred to herein may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
The algorithms and displays presented herein are not inherently related to any particular computing device, virtualized system, or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent from the description provided herein. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any references above to specific languages are provided for disclosure of enablement and best mode of the present invention.
Accordingly, in various embodiments, the present invention can be implemented as software, hardware, and/or other elements for controlling a computer system, computing device, or other electronic device, or any combination or plurality thereof. Such an electronic device can include, for example, a processor, an input device (such as a keyboard, mouse, touchpad, trackpad, joystick, trackball, microphone, and/or any combination thereof), an output device (such as a screen, speaker, and/or the like), memory, long-term storage (such as magnetic storage, optical storage, and/or the like), and/or network connectivity, according to techniques that are well known in the art. Such an electronic device may be portable or non-portable. Examples of electronic devices that may be used for implementing the invention include: a mobile phone, personal digital assistant, smartphone, kiosk, server computer, enterprise computing device, desktop computer, laptop computer, tablet computer, consumer electronic device, or the like. An electronic device for implementing the present invention may use any operating system such as, for example and without limitation: Linux; Microsoft Windows, available from Microsoft Corporation of Redmond, Wash.; Mac OS X, available from Apple Inc. of Cupertino, Calif.; iOS, available from Apple Inc. of Cupertino, Calif.; Android, available from Google, Inc. of Mountain View, Calif.; and/or any other operating system that is adapted for use on the device.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of the above description, will appreciate that other embodiments may be devised which do not depart from the scope of the present invention as described herein. In addition, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, the disclosure of the present invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the claims.
This application is a continuation of U.S. application Ser. No. 13/535,019, entitled “Dynamic Expansion of Data Visualizations,” filed Jun. 27, 2012, which claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application Ser. No. 61/506,912 for “Drill by Expansion,” filed Jul. 12, 2011, the entire contents of each of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
5359712 | Cohen et al. | Oct 1994 | A |
5375201 | Davoust | Dec 1994 | A |
5416895 | Anderson et al. | May 1995 | A |
5423033 | Yuen | Jun 1995 | A |
5461708 | Kahn | Oct 1995 | A |
5550964 | Davoust | Aug 1996 | A |
5581678 | Kahn | Dec 1996 | A |
5586240 | Khan et al. | Dec 1996 | A |
5600775 | King et al. | Feb 1997 | A |
5625767 | Bartell et al. | Apr 1997 | A |
5634133 | Kelley | May 1997 | A |
5689667 | Kurtenbach | Nov 1997 | A |
5737557 | Sullivan | Apr 1998 | A |
5844558 | Kumar et al. | Dec 1998 | A |
5929854 | Ross | Jul 1999 | A |
5933823 | Cullen et al. | Aug 1999 | A |
5970471 | Hill | Oct 1999 | A |
5990888 | Blades et al. | Nov 1999 | A |
6016502 | Haneda et al. | Jan 2000 | A |
6023280 | Becker et al. | Feb 2000 | A |
6298174 | Lantrip et al. | Oct 2001 | B1 |
6484168 | Pennock et al. | Nov 2002 | B1 |
6529217 | Maguire, III et al. | Mar 2003 | B1 |
6577304 | Yablonski et al. | Jun 2003 | B1 |
6613100 | Miller | Sep 2003 | B2 |
6626959 | Moise et al. | Sep 2003 | B1 |
6707454 | Barg et al. | Mar 2004 | B1 |
6904427 | Hagiwara | Jun 2005 | B1 |
6940509 | Crow et al. | Sep 2005 | B1 |
6985898 | Ripley et al. | Jan 2006 | B1 |
6995768 | Jou et al. | Feb 2006 | B2 |
7002580 | Aggala et al. | Feb 2006 | B1 |
7103837 | Sato | Sep 2006 | B2 |
7249328 | Davis | Jul 2007 | B1 |
7353183 | Musso | Apr 2008 | B1 |
7421648 | Davis | Sep 2008 | B1 |
7522176 | Tolle et al. | Apr 2009 | B2 |
7546522 | Tolle et al. | Jun 2009 | B2 |
7605804 | Wilson | Oct 2009 | B2 |
7685159 | Mitchell et al. | Mar 2010 | B2 |
7689933 | Parsons | Mar 2010 | B1 |
7705847 | Helfman et al. | Apr 2010 | B2 |
7788606 | Patel et al. | Aug 2010 | B2 |
7809582 | Wessling et al. | Oct 2010 | B2 |
8089653 | Kobashi | Jan 2012 | B2 |
8099674 | Mackinlay et al. | Jan 2012 | B2 |
8130211 | Abernathy | Mar 2012 | B2 |
8145600 | Lewis | Mar 2012 | B1 |
8176096 | Allyn et al. | May 2012 | B2 |
8185839 | Jalon | May 2012 | B2 |
8201096 | Robert et al. | Jun 2012 | B2 |
8214747 | Yankovich et al. | Jul 2012 | B1 |
8245156 | Mouilleseaux | Aug 2012 | B2 |
8261194 | Billiard et al. | Sep 2012 | B2 |
8286098 | Ju | Oct 2012 | B2 |
8289316 | Reisman | Oct 2012 | B1 |
8291349 | Park | Oct 2012 | B1 |
8291350 | Park | Oct 2012 | B1 |
8296654 | Ahlberg et al. | Oct 2012 | B2 |
8434007 | Morita | Apr 2013 | B2 |
8456466 | Reisman | Jun 2013 | B1 |
8463790 | Joshi et al. | Jun 2013 | B1 |
8468466 | Cragun | Jun 2013 | B2 |
8499284 | Pich et al. | Jul 2013 | B2 |
8549432 | Warner | Oct 2013 | B2 |
8566700 | Ueda | Oct 2013 | B2 |
8578294 | Eom | Nov 2013 | B2 |
8579814 | Fotiades et al. | Nov 2013 | B2 |
8621391 | Leffert et al. | Dec 2013 | B2 |
8624858 | Fyke | Jan 2014 | B2 |
8627233 | Cragun | Jan 2014 | B2 |
8645863 | Mandic et al. | Feb 2014 | B2 |
8661358 | Duncker et al. | Feb 2014 | B2 |
8667418 | Chaudhri et al. | Mar 2014 | B2 |
8671343 | Oberstein | Mar 2014 | B2 |
8683389 | Bar-Yam et al. | Mar 2014 | B1 |
8686962 | Christie | Apr 2014 | B2 |
8707192 | Robert et al. | Apr 2014 | B2 |
8713467 | Goldenberg et al. | Apr 2014 | B1 |
8738814 | Cronin | May 2014 | B1 |
8745280 | Cronin | Jun 2014 | B1 |
8799826 | Missig | Aug 2014 | B2 |
8806336 | Miyazawa | Aug 2014 | B2 |
8812947 | Maoz | Aug 2014 | B1 |
8826178 | Zhang | Sep 2014 | B1 |
8826181 | Mouilleseaux et al. | Sep 2014 | B2 |
8863019 | Pourshahid et al. | Oct 2014 | B2 |
8878879 | Lee | Nov 2014 | B2 |
8886622 | Parent et al. | Nov 2014 | B1 |
8914740 | Joos et al. | Dec 2014 | B1 |
8959423 | Hammoud | Feb 2015 | B2 |
9026944 | Kotler | May 2015 | B2 |
9081494 | Migos | Jul 2015 | B2 |
9086794 | Gil et al. | Jul 2015 | B2 |
9098182 | Migos | Aug 2015 | B2 |
9104365 | Sirpal | Aug 2015 | B2 |
9182900 | Choi | Nov 2015 | B2 |
9195368 | Kuscher | Nov 2015 | B2 |
9201589 | Nasraoui | Dec 2015 | B2 |
9202297 | Winters | Dec 2015 | B1 |
9223496 | Howard | Dec 2015 | B2 |
9235978 | Charlton | Jan 2016 | B1 |
9244562 | Rosenberg | Jan 2016 | B1 |
9250789 | Kobayashi | Feb 2016 | B2 |
9251722 | Miyazawa | Feb 2016 | B2 |
9261989 | Kuscher | Feb 2016 | B2 |
9280263 | Kim | Mar 2016 | B2 |
9292199 | Choi | Mar 2016 | B2 |
9299170 | Moon | Mar 2016 | B1 |
9310907 | Victor | Apr 2016 | B2 |
9310993 | Choi | Apr 2016 | B2 |
9329769 | Sekiguchi et al. | May 2016 | B2 |
9354780 | Miyake | May 2016 | B2 |
9367198 | Radakovitz et al. | Jun 2016 | B2 |
9390349 | Awano | Jul 2016 | B2 |
9395826 | Cronin | Jul 2016 | B1 |
9400997 | Beaver et al. | Jul 2016 | B2 |
9424333 | Bisignani et al. | Aug 2016 | B1 |
9459791 | Mouilleseaux et al. | Oct 2016 | B2 |
9465452 | Nishizawa | Oct 2016 | B2 |
9477315 | Fujimura | Oct 2016 | B2 |
9513799 | Fleizach | Dec 2016 | B2 |
9582187 | Gil | Feb 2017 | B2 |
9588645 | Heo | Mar 2017 | B2 |
9612736 | Lee | Apr 2017 | B2 |
9652056 | Park | May 2017 | B2 |
9658766 | Nan et al. | May 2017 | B2 |
9678343 | Kuehne | Jun 2017 | B2 |
9716825 | Manzari | Jul 2017 | B1 |
9721375 | Rivard | Aug 2017 | B1 |
9733734 | Chase | Aug 2017 | B2 |
9733796 | Warner | Aug 2017 | B2 |
9747018 | Han | Aug 2017 | B2 |
9804726 | Joos et al. | Oct 2017 | B1 |
9817548 | Lai | Nov 2017 | B2 |
9875023 | Brown | Jan 2018 | B2 |
9880701 | Hyun | Jan 2018 | B2 |
9881645 | Novikoff | Jan 2018 | B2 |
9886183 | Lee | Feb 2018 | B2 |
9996171 | Chase | Jun 2018 | B2 |
10078421 | Jeon | Sep 2018 | B2 |
10168817 | Hiraga | Jan 2019 | B2 |
10254927 | Missig | Apr 2019 | B2 |
20020087894 | Foley et al. | Jul 2002 | A1 |
20020091678 | Miller et al. | Jul 2002 | A1 |
20030028504 | Burgoon et al. | Feb 2003 | A1 |
20030069873 | Fox et al. | Apr 2003 | A1 |
20030074292 | Masuda | Apr 2003 | A1 |
20030128883 | Kim et al. | Jul 2003 | A1 |
20030158855 | Farnham et al. | Aug 2003 | A1 |
20030167278 | Baudel | Sep 2003 | A1 |
20030193502 | Patel et al. | Oct 2003 | A1 |
20040150668 | Myers | Aug 2004 | A1 |
20040189717 | Conally et al. | Sep 2004 | A1 |
20040230599 | Moore et al. | Nov 2004 | A1 |
20050068320 | Jaeger | Mar 2005 | A1 |
20050091254 | Stabb et al. | Apr 2005 | A1 |
20050091612 | Stabb et al. | Apr 2005 | A1 |
20050134578 | Chambers et al. | Jun 2005 | A1 |
20050246643 | Gusmorino et al. | Nov 2005 | A1 |
20050275622 | Patel et al. | Dec 2005 | A1 |
20050278625 | Wessling et al. | Dec 2005 | A1 |
20060004718 | McCully et al. | Jan 2006 | A1 |
20060020623 | Terai et al. | Jan 2006 | A1 |
20060026535 | Hotelling | Feb 2006 | A1 |
20060036950 | Himberger et al. | Feb 2006 | A1 |
20060041178 | Viswanathan et al. | Feb 2006 | A1 |
20060095865 | Rostom | May 2006 | A1 |
20060136819 | Tolle et al. | Jun 2006 | A1 |
20060242164 | Evans et al. | Oct 2006 | A1 |
20060244735 | Wilson | Nov 2006 | A1 |
20060288284 | Peters et al. | Dec 2006 | A1 |
20070008300 | Yang | Jan 2007 | A1 |
20070022000 | Bodart et al. | Jan 2007 | A1 |
20070061714 | Stuple et al. | Mar 2007 | A1 |
20070083911 | Madden | Apr 2007 | A1 |
20070094592 | Turner et al. | Apr 2007 | A1 |
20070124677 | de los Reyes | May 2007 | A1 |
20070171716 | Wright et al. | Jul 2007 | A1 |
20070179969 | Finley et al. | Aug 2007 | A1 |
20070186173 | Both et al. | Aug 2007 | A1 |
20070186177 | Both et al. | Aug 2007 | A1 |
20070186186 | Both et al. | Aug 2007 | A1 |
20070189737 | Chaudhri | Aug 2007 | A1 |
20070245238 | Fugitt et al. | Oct 2007 | A1 |
20070252821 | Hollemans | Nov 2007 | A1 |
20070256029 | Maxwell | Nov 2007 | A1 |
20070271528 | Park et al. | Nov 2007 | A1 |
20080037051 | Otsubo | Feb 2008 | A1 |
20080115049 | Tolle et al. | May 2008 | A1 |
20080136754 | Tsuzaki et al. | Jun 2008 | A1 |
20080168404 | Ording | Jul 2008 | A1 |
20080180458 | Favart et al. | Jul 2008 | A1 |
20080195639 | Freeman et al. | Aug 2008 | A1 |
20080244454 | Shibaike | Oct 2008 | A1 |
20080307343 | Robert et al. | Dec 2008 | A1 |
20080309632 | Westerman | Dec 2008 | A1 |
20090006318 | Lehtipalo et al. | Jan 2009 | A1 |
20090007012 | Mandic | Jan 2009 | A1 |
20090007017 | Anzures | Jan 2009 | A1 |
20090024411 | Albro et al. | Jan 2009 | A1 |
20090070301 | McLean et al. | Mar 2009 | A1 |
20090077501 | Partridge et al. | Mar 2009 | A1 |
20090096812 | Boixel et al. | Apr 2009 | A1 |
20090106674 | Bray et al. | Apr 2009 | A1 |
20090150177 | Buck et al. | Jun 2009 | A1 |
20090164171 | Wold et al. | Jun 2009 | A1 |
20090210814 | Agrusa et al. | Aug 2009 | A1 |
20090235155 | Ueda | Sep 2009 | A1 |
20090282325 | Radakovitz et al. | Nov 2009 | A1 |
20090284478 | De la Torre Baltierra | Nov 2009 | A1 |
20090307213 | Deng et al. | Dec 2009 | A1 |
20090307622 | Jalon et al. | Dec 2009 | A1 |
20090307626 | Jalon et al. | Dec 2009 | A1 |
20090315848 | Ku | Dec 2009 | A1 |
20090319897 | Kotler et al. | Dec 2009 | A1 |
20090327213 | Choudhary | Dec 2009 | A1 |
20090327955 | Mouilleseaux et al. | Dec 2009 | A1 |
20090327963 | Mouilleseaux | Dec 2009 | A1 |
20090327964 | Mouilleseaux | Dec 2009 | A1 |
20100005008 | Duncker et al. | Jan 2010 | A1 |
20100005411 | Duncker et al. | Jan 2010 | A1 |
20100031203 | Morris | Feb 2010 | A1 |
20100067048 | Suzuki | Mar 2010 | A1 |
20100070254 | Tsai et al. | Mar 2010 | A1 |
20100077354 | Russo | Mar 2010 | A1 |
20100080491 | Ohnishi | Apr 2010 | A1 |
20100083172 | Breeds et al. | Apr 2010 | A1 |
20100083190 | Roberts | Apr 2010 | A1 |
20100087322 | Yuan et al. | Apr 2010 | A1 |
20100095234 | Lane | Apr 2010 | A1 |
20100097338 | Miyashita | Apr 2010 | A1 |
20100100849 | Fram | Apr 2010 | A1 |
20100138766 | Nakajima | Jun 2010 | A1 |
20100157354 | Darwish | Jun 2010 | A1 |
20100161374 | Horta et al. | Jun 2010 | A1 |
20100162152 | Allyn et al. | Jun 2010 | A1 |
20100185962 | Zhang et al. | Jul 2010 | A1 |
20100185989 | Shiplacoff | Jul 2010 | A1 |
20100188353 | Yoon et al. | Jul 2010 | A1 |
20100192102 | Chmielewski | Jul 2010 | A1 |
20100192103 | Cragun | Jul 2010 | A1 |
20100194778 | Robertson et al. | Aug 2010 | A1 |
20100199202 | Becker | Aug 2010 | A1 |
20100205563 | Haapsaari et al. | Aug 2010 | A1 |
20100211895 | Mistry et al. | Aug 2010 | A1 |
20100218115 | Curtin et al. | Aug 2010 | A1 |
20100231536 | Chaudhri et al. | Sep 2010 | A1 |
20100235726 | Ording | Sep 2010 | A1 |
20100235771 | Gregg, III | Sep 2010 | A1 |
20100238176 | Guo et al. | Sep 2010 | A1 |
20100251151 | Alsbury et al. | Sep 2010 | A1 |
20100251179 | Cragun | Sep 2010 | A1 |
20100251180 | Cragun | Sep 2010 | A1 |
20100275144 | Dejoras et al. | Oct 2010 | A1 |
20100275159 | Matsubara et al. | Oct 2010 | A1 |
20100283750 | Kang et al. | Nov 2010 | A1 |
20100299637 | Chmielewski | Nov 2010 | A1 |
20100299638 | Choi | Nov 2010 | A1 |
20100306702 | Warner | Dec 2010 | A1 |
20100312462 | Gueziec et al. | Dec 2010 | A1 |
20100312803 | Gong et al. | Dec 2010 | A1 |
20100332511 | Stockton et al. | Dec 2010 | A1 |
20110001628 | Miyazawa | Jan 2011 | A1 |
20110004821 | Miyazawa | Jan 2011 | A1 |
20110016390 | Oh | Jan 2011 | A1 |
20110016433 | Shipley | Jan 2011 | A1 |
20110018806 | Yano | Jan 2011 | A1 |
20110041098 | Kajiya | Feb 2011 | A1 |
20110050562 | Schoen | Mar 2011 | A1 |
20110055691 | Carlen et al. | Mar 2011 | A1 |
20110055760 | Drayton | Mar 2011 | A1 |
20110069019 | Carpendale | Mar 2011 | A1 |
20110074171 | Maehara | Mar 2011 | A1 |
20110074696 | Rapp et al. | Mar 2011 | A1 |
20110074716 | Ono | Mar 2011 | A1 |
20110074718 | Yeh et al. | Mar 2011 | A1 |
20110074719 | Yeh | Mar 2011 | A1 |
20110077851 | Ogawa | Mar 2011 | A1 |
20110115814 | Heimendinger | May 2011 | A1 |
20110141031 | McCullough | Jun 2011 | A1 |
20110148796 | Hollemans | Jun 2011 | A1 |
20110173569 | Howes et al. | Jul 2011 | A1 |
20110179376 | Berestov et al. | Jul 2011 | A1 |
20110188760 | Wright et al. | Aug 2011 | A1 |
20110199639 | Tani | Aug 2011 | A1 |
20110205163 | Hinckley | Aug 2011 | A1 |
20110209048 | Scott et al. | Aug 2011 | A1 |
20110209088 | Hinckley | Aug 2011 | A1 |
20110209093 | Hinckley | Aug 2011 | A1 |
20110212717 | Rhoads et al. | Sep 2011 | A1 |
20110234503 | Fitzmaurice | Sep 2011 | A1 |
20110270851 | Mishina et al. | Nov 2011 | A1 |
20110271233 | Radakovitz et al. | Nov 2011 | A1 |
20110276603 | Bojanic et al. | Nov 2011 | A1 |
20110279363 | Shoji et al. | Nov 2011 | A1 |
20110283231 | Richstein et al. | Nov 2011 | A1 |
20110291988 | Bamji et al. | Dec 2011 | A1 |
20110298708 | Hsu | Dec 2011 | A1 |
20110302490 | Koarai | Dec 2011 | A1 |
20110320458 | Karana | Dec 2011 | A1 |
20120011437 | James et al. | Jan 2012 | A1 |
20120032901 | Kwon | Feb 2012 | A1 |
20120036434 | Oberstein | Feb 2012 | A1 |
20120050192 | Kobayashi | Mar 2012 | A1 |
20120056836 | Cha et al. | Mar 2012 | A1 |
20120056878 | Miyazawa | Mar 2012 | A1 |
20120081375 | Robert | Apr 2012 | A1 |
20120084644 | Robert | Apr 2012 | A1 |
20120089933 | Garand et al. | Apr 2012 | A1 |
20120092286 | O'Prey et al. | Apr 2012 | A1 |
20120127206 | Thompson et al. | May 2012 | A1 |
20120133585 | Han | May 2012 | A1 |
20120144335 | Abeln et al. | Jun 2012 | A1 |
20120154269 | Oki | Jun 2012 | A1 |
20120162265 | Heinrich et al. | Jun 2012 | A1 |
20120166470 | Baumgaertel | Jun 2012 | A1 |
20120174034 | Chae et al. | Jul 2012 | A1 |
20120180002 | Campbell | Jul 2012 | A1 |
20120210275 | Park | Aug 2012 | A1 |
20120254783 | Pourshahid | Oct 2012 | A1 |
20120262489 | Caliendo, Jr. | Oct 2012 | A1 |
20120284753 | Roberts | Nov 2012 | A1 |
20120293427 | Mukai | Nov 2012 | A1 |
20120306748 | Fleizach | Dec 2012 | A1 |
20120319977 | Kuge | Dec 2012 | A1 |
20130002802 | Mock | Jan 2013 | A1 |
20130007577 | Hammoud | Jan 2013 | A1 |
20130007583 | Hammoud | Jan 2013 | A1 |
20130019175 | Kotler et al. | Jan 2013 | A1 |
20130019205 | Gil et al. | Jan 2013 | A1 |
20130033448 | Yano | Feb 2013 | A1 |
20130036380 | Symons | Feb 2013 | A1 |
20130047125 | Kangas et al. | Feb 2013 | A1 |
20130067391 | Pittappilly | Mar 2013 | A1 |
20130076668 | Maeda | Mar 2013 | A1 |
20130080444 | Wakefield et al. | Mar 2013 | A1 |
20130093782 | Wakefield et al. | Apr 2013 | A1 |
20130097177 | Fan et al. | Apr 2013 | A1 |
20130097544 | Parker et al. | Apr 2013 | A1 |
20130104079 | Yasui | Apr 2013 | A1 |
20130114913 | Nagarajan et al. | May 2013 | A1 |
20130127758 | Kim | May 2013 | A1 |
20130127911 | Brown | May 2013 | A1 |
20130145244 | Rothschiller | Jun 2013 | A1 |
20130145316 | Heo | Jun 2013 | A1 |
20130169549 | Seymour | Jul 2013 | A1 |
20130174032 | Tse et al. | Jul 2013 | A1 |
20130201106 | Naccache | Aug 2013 | A1 |
20130204862 | Marchiori | Aug 2013 | A1 |
20130219340 | Linge | Aug 2013 | A1 |
20130222265 | Smith | Aug 2013 | A1 |
20130222340 | Ito | Aug 2013 | A1 |
20130235071 | Ubillos | Sep 2013 | A1 |
20130254662 | Dunko | Sep 2013 | A1 |
20130275898 | Fujimoto | Oct 2013 | A1 |
20130293672 | Suzuki | Nov 2013 | A1 |
20130307861 | Lang | Nov 2013 | A1 |
20130321340 | Seo | Dec 2013 | A1 |
20130328804 | Oshima | Dec 2013 | A1 |
20130346906 | Farago | Dec 2013 | A1 |
20140019899 | Cheng | Jan 2014 | A1 |
20140022192 | Hatanaka | Jan 2014 | A1 |
20140033127 | Choi | Jan 2014 | A1 |
20140047380 | Mak | Feb 2014 | A1 |
20140071063 | Kuscher | Mar 2014 | A1 |
20140075388 | Kuscher | Mar 2014 | A1 |
20140078102 | Araki | Mar 2014 | A1 |
20140089828 | Okuma | Mar 2014 | A1 |
20140092100 | Chen | Apr 2014 | A1 |
20140101579 | Kim | Apr 2014 | A1 |
20140111422 | Chow | Apr 2014 | A1 |
20140111516 | Hall et al. | Apr 2014 | A1 |
20140129564 | Kritt | May 2014 | A1 |
20140157200 | Jeon | Jun 2014 | A1 |
20140157210 | Katz | Jun 2014 | A1 |
20140173457 | Wang et al. | Jun 2014 | A1 |
20140173530 | Mesguich Havilio | Jun 2014 | A1 |
20140189581 | Kawannata | Jul 2014 | A1 |
20140210759 | Toriyama | Jul 2014 | A1 |
20140215365 | Hiraga | Jul 2014 | A1 |
20140229871 | Tai | Aug 2014 | A1 |
20140245217 | Asahara | Aug 2014 | A1 |
20140267084 | Krulce | Sep 2014 | A1 |
20140282145 | Dewan | Sep 2014 | A1 |
20140313142 | Yairi | Oct 2014 | A1 |
20140331179 | Tullis et al. | Nov 2014 | A1 |
20140340204 | O'Shea | Nov 2014 | A1 |
20140351738 | Kokovidis | Nov 2014 | A1 |
20150009157 | Chung | Jan 2015 | A1 |
20150012854 | Choi et al. | Jan 2015 | A1 |
20150022432 | Stewart | Jan 2015 | A1 |
20150029095 | Gomez | Jan 2015 | A1 |
20150029553 | Fujimoto | Jan 2015 | A1 |
20150033165 | Yoo | Jan 2015 | A1 |
20150035800 | Uchiyama | Feb 2015 | A1 |
20150062046 | Cho | Mar 2015 | A1 |
20150066356 | Kirsch | Mar 2015 | A1 |
20150067555 | Joo | Mar 2015 | A1 |
20150074615 | Han | Mar 2015 | A1 |
20150106709 | Kritt | Apr 2015 | A1 |
20150135109 | Zambetti | May 2015 | A1 |
20150143233 | Weksler et al. | May 2015 | A1 |
20150153571 | Ballard | Jun 2015 | A1 |
20150160807 | Vakharia | Jun 2015 | A1 |
20150160843 | Kim | Jun 2015 | A1 |
20150169057 | Shiroor | Jun 2015 | A1 |
20150169096 | Nishizawa | Jun 2015 | A1 |
20150169530 | Otero et al. | Jun 2015 | A1 |
20150169531 | Campbell et al. | Jun 2015 | A1 |
20150186350 | Hicks | Jul 2015 | A1 |
20150186351 | Hicks | Jul 2015 | A1 |
20150205483 | Takamura | Jul 2015 | A1 |
20150212688 | Mcmillan | Jul 2015 | A1 |
20150227308 | Kim | Aug 2015 | A1 |
20150234562 | Ording | Aug 2015 | A1 |
20150261728 | Davis | Sep 2015 | A1 |
20150268805 | Patel | Sep 2015 | A1 |
20150286636 | Elkhou et al. | Oct 2015 | A1 |
20150301609 | Park | Oct 2015 | A1 |
20150338974 | Stone | Nov 2015 | A1 |
20150341212 | Hsiao et al. | Nov 2015 | A1 |
20150356160 | Berwick et al. | Dec 2015 | A1 |
20150363082 | Zhao | Dec 2015 | A1 |
20150378978 | Gross et al. | Dec 2015 | A1 |
20160055232 | Yang et al. | Feb 2016 | A1 |
20160070430 | Kim et al. | Mar 2016 | A1 |
20160070461 | Herbordt | Mar 2016 | A1 |
20160092080 | Swanson | Mar 2016 | A1 |
20160139695 | Chase | May 2016 | A1 |
20160147308 | Gelman | May 2016 | A1 |
20160188181 | Smith | Jun 2016 | A1 |
20160202892 | Rath | Jul 2016 | A1 |
20160253086 | Jiang | Sep 2016 | A1 |
20160259528 | Foss | Sep 2016 | A1 |
20160274686 | Alonso Ruiz | Sep 2016 | A1 |
20160274733 | Hasegawa | Sep 2016 | A1 |
20160274750 | Stewart | Sep 2016 | A1 |
20160283049 | Faydi | Sep 2016 | A1 |
20160283054 | Suzuki | Sep 2016 | A1 |
20160283081 | Johnston | Sep 2016 | A1 |
20160291849 | Stockwell | Oct 2016 | A1 |
20160306328 | Ko et al. | Oct 2016 | A1 |
20160313911 | Langseth et al. | Oct 2016 | A1 |
20160364367 | Takayama | Dec 2016 | A1 |
20160370994 | Galu, Jr. | Dec 2016 | A1 |
20170010781 | Bostick | Jan 2017 | A1 |
20170031587 | Kimoto | Feb 2017 | A1 |
20170060819 | Rucine | Mar 2017 | A1 |
20170102838 | Roy et al. | Apr 2017 | A1 |
20170109023 | Cherna | Apr 2017 | A1 |
20170109026 | Ismailov | Apr 2017 | A1 |
20170147188 | Rong | May 2017 | A1 |
20170185258 | Fu | Jun 2017 | A1 |
20170185281 | Park | Jun 2017 | A1 |
20170192658 | Kim | Jul 2017 | A1 |
20170193058 | Fung et al. | Jul 2017 | A1 |
20170199651 | Pintoffl | Jul 2017 | A1 |
20170221244 | Hiraga | Aug 2017 | A1 |
20170228138 | Paluka | Aug 2017 | A1 |
20170269696 | Naidoo | Sep 2017 | A1 |
20170269800 | Park | Sep 2017 | A1 |
20170277367 | Pahud | Sep 2017 | A1 |
20170287230 | Gortler | Oct 2017 | A1 |
20170315635 | Chase | Nov 2017 | A1 |
20170315721 | Merel | Nov 2017 | A1 |
20170329458 | Kanemaru | Nov 2017 | A1 |
20180040154 | Gibb | Feb 2018 | A1 |
20180069983 | Cho | Mar 2018 | A1 |
20180101239 | Yin | Apr 2018 | A1 |
20180152636 | Yim | May 2018 | A1 |
20180203596 | Dhaliwal | Jul 2018 | A1 |
20180239519 | Hinckley | Aug 2018 | A1 |
20180239520 | Hinckley | Aug 2018 | A1 |
20180246639 | Han | Aug 2018 | A1 |
20180329623 | Usami | Nov 2018 | A1 |
20190056856 | Simmons | Feb 2019 | A1 |
20190094850 | Li | Mar 2019 | A1 |
20190146643 | Foss | May 2019 | A1 |
Number | Date | Country | |
---|---|---|---|
61506912 | Jul 2011 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 13535019 | Jun 2012 | US
Child | 14884597 | | US