The disclosed implementations relate generally to data visualization and more specifically to systems, methods, and user interfaces that enable a user to interactively explore and investigate a data set.
Data visualization is a powerful tool for exploring data sets. Graphical views provide user-friendly ways to visualize and interpret data. However, the task of effectively visualizing databases imposes significant demands on the human-computer interface to the visualization system, especially on a mobile device with a small screen.
As computing and networking speeds increase, data visualization that was traditionally performed on desktop computers can also be performed on portable electronic devices, such as smart phones, tablets, and laptop computers. These portable devices typically use touch-sensitive surfaces (e.g., touch screens and/or trackpads) as input devices. These portable devices typically have significantly smaller displays than desktop computers. Thus, additional challenges arise in using touch-sensitive surfaces to manipulate graphical views of data in a user-friendly manner on portable devices.
Consequently, there is a need for faster, more efficient methods and interfaces for manipulating graphical views of data. Such methods and interfaces may complement or replace conventional methods for visualizing data. Such methods and interfaces reduce the cognitive burden on a user and produce a more efficient human-machine interface. For battery-operated devices, such methods and interfaces conserve power and increase the time between battery charges.
The above deficiencies and other problems associated with visualizing data are reduced or eliminated by the disclosed methods, devices, graphical user interfaces, and computer readable storage media. Various implementations of methods, devices, graphical user interfaces, and storage media within the scope of this disclosure and the appended claims each have several aspects, no single one of which is solely responsible for the attributes described herein.
Thus methods, systems, and graphical user interfaces are provided that enable users to more easily and more efficiently analyze data sets.
For a better understanding of the aforementioned implementations of the invention as well as additional implementations thereof, reference should be made to the Description of Implementations below, in conjunction with the following drawings in which like reference numerals refer to corresponding parts throughout the figures.
Reference will now be made in detail to implementations, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will be apparent to one of ordinary skill in the art that the present invention may be practiced without these specific details.
As portable electronic devices become more compact, and the number of functions performed by applications on any given device increases, it has become a significant challenge to design user interfaces that allow users to interact with the applications easily. This challenge is particularly significant for portable devices with smaller screens and/or limited input devices. In addition, data visualization applications need to provide user-friendly ways to explore data in order to enable a user to extract significant meaning from a particular data set. Some application designers have resorted to using complex menu systems to enable a user to perform desired functions. These conventional user interfaces often result in complicated key sequences and/or menu hierarchies that must be memorized by the user and/or that are otherwise cumbersome and/or not intuitive to use.
The methods, devices, and graphical user interfaces (GUIs) described herein make manipulation of data sets and data visualizations more efficient and intuitive for a user. In some instances, a data visualization is referred to as a “chart.” A number of different intuitive user interfaces for data visualizations are described below. For example, applying a filter to a data set can be accomplished by a simple touch input on a given portion of a displayed data visualization rather than via a nested menu system.
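The tap-to-filter interaction described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation: the data structures, names, and rectangular mark bounds are assumptions introduced for the example.

```python
# Illustrative sketch: mapping a tap on a displayed data mark to a data filter.
# A tap inside a mark's bounds keeps only the rows matching that mark's category.

from dataclasses import dataclass

@dataclass
class DataMark:
    category: str        # dimension value this mark represents
    x: float
    y: float
    width: float
    height: float

    def contains(self, tx: float, ty: float) -> bool:
        # Simple rectangular hit test against the mark's bounds.
        return (self.x <= tx <= self.x + self.width and
                self.y <= ty <= self.y + self.height)

def filter_from_tap(marks, rows, tx, ty, dimension):
    """Return only the rows matching the tapped mark's category,
    or all rows unchanged if the tap hit no mark."""
    for mark in marks:
        if mark.contains(tx, ty):
            return [r for r in rows if r[dimension] == mark.category]
    return rows
```

For example, tapping inside a bar representing one content type would retain only the rows with that content type, replacing the nested menu interaction.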
Attention is now directed toward implementations of portable devices with touch-sensitive displays. Implementations of electronic devices and user interfaces for such devices are described. In some implementations, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Other portable electronic devices include laptops or tablet computers with touch-sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that, in some implementations, the device is not a portable communications device, but is a desktop computer with a touch-sensitive surface (e.g., a touch screen display and/or a touch pad). In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device optionally includes one or more other physical user-interface devices, such as a physical keyboard, a mouse, a microphone, and/or a joystick.
The device 100 includes one or more processing units (CPU's) 302, an input/output (I/O) subsystem 306, memory 308 (which optionally includes one or more computer readable storage media), and a network communications interface 310. In some implementations, these components communicate over one or more communication buses or signal lines 304. In some implementations, the communication buses 304 include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
The memory 308 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices, and may include non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some implementations, the memory 308 includes one or more storage devices remotely located from processor(s) 302. The memory 308, or alternately the non-volatile memory device(s) within the memory 308, comprises a non-transitory computer readable storage medium.
In some implementations, the software components stored in the memory 308 include an operating system 318, a communication module 320, an input/output (I/O) module 322, and one or more applications 328 (e.g., a data visualization application 422). In some implementations, one or more of the various modules comprises a set of instructions in memory 308. In some implementations, the memory 308 stores one or more data sets in one or more database(s) 332.
The operating system 318 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware, software, and/or firmware components.
The communication module 320 facilitates communication with other devices over one or more external ports and also includes various software components for handling data received from other devices.
The I/O module 322 includes a touch input sub-module 324 and a graphics sub-module 326. In some implementations, the touch input sub-module 324 detects touch inputs with touch screen 102 and other touch sensitive devices (e.g., a touchpad or physical click wheel). The touch input sub-module 324 includes various software components for performing various operations related to detection of a touch input, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). The touch input sub-module 324 receives contact data from the touch-sensitive surface (e.g., touch screen 102). In some implementations, these operations are applied to single touch inputs (e.g., one finger contacts) or to multiple simultaneous touch inputs (e.g., “multitouch”/multiple finger contacts). In some implementations, the touch input sub-module 324 detects contact on a touchpad.
In some implementations, the touch input sub-module 324 detects a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. In some implementations, a gesture is detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of a data mark). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
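The contact-pattern classification described above can be sketched as a small function. The event representation and the tap-slop threshold are illustrative assumptions, not details from the disclosure.

```python
# Illustrative sketch: classifying a sequence of contact events into a gesture.
# A tap is finger-down followed by finger-up at substantially the same position;
# a swipe is finger-down, one or more finger-dragging events, then finger-up.

def classify_gesture(events, tap_slop=10.0):
    """events: list of (kind, x, y) tuples, where kind is "down", "drag", or "up".
    tap_slop: maximum movement (assumed units) still counted as "the same position"."""
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return "unknown"
    down, up = events[0], events[-1]
    # Distance between the finger-down and finger-up positions.
    moved = ((up[1] - down[1]) ** 2 + (up[2] - down[2]) ** 2) ** 0.5
    has_drag = any(kind == "drag" for kind, _, _ in events[1:-1])
    if not has_drag and moved <= tap_slop:
        return "tap"
    if has_drag:
        return "swipe"
    return "unknown"
```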
The graphics sub-module 326 includes various known software components for rendering and displaying graphics on the touch screen 102 or other display, including components for changing the visual impact (e.g., brightness, transparency, saturation, contrast or other visual property) of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including data visualizations, icons (such as user-interface objects including soft keys), text, digital images, animations, and the like. In some implementations, the graphics sub-module 326 stores data representing graphics to be used. In some implementations, each graphic is assigned a corresponding code. The graphics sub-module 326 receives (e.g., from applications) one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to the display or touch screen.
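The code-to-graphic lookup described above can be sketched as a small registry that resolves graphic codes, together with their property data, into draw commands. All names here are assumptions introduced for the example, not part of the disclosure.

```python
# Illustrative sketch: each graphic is assigned a code, and the graphics sub-module
# resolves (code, properties) requests into draw commands for the display.

class GraphicsRegistry:
    def __init__(self):
        self._graphics = {}

    def register(self, code, render_fn):
        """Associate a code with a function that renders the graphic."""
        self._graphics[code] = render_fn

    def render(self, requests):
        """requests: list of (code, properties) pairs; returns draw commands,
        silently skipping codes that were never registered."""
        commands = []
        for code, props in requests:
            render_fn = self._graphics.get(code)
            if render_fn is not None:
                commands.append(render_fn(props))
        return commands

reg = GraphicsRegistry()
reg.register("icon", lambda p: f"draw icon at ({p['x']}, {p['y']})")
```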
In some implementations, the applications 328 include a data visualization module 330 or data visualization application 422 for displaying graphical views of data and one or more other applications. Examples of other applications that are optionally stored in the memory 308 include word processing applications, email applications, and presentation applications.
In conjunction with the I/O interface 306, including the touch screen 102, the CPU(s) 302, and/or the database(s) 332, the data visualization module 330 includes executable instructions for displaying and manipulating various graphical views of data.
Each of the above identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules are, optionally, combined or otherwise re-arranged in various implementations. In some implementations, the memory 308 stores a subset of the modules and data structures identified above. In some implementations, the memory 308 stores additional modules and data structures not described above.
The device 200 typically includes one or more processing units/cores (CPUs) 352, one or more network or other communications interfaces 362, memory 350, an I/O interface 356, and one or more communication buses 354 for interconnecting these components. In some implementations, the communication buses 354 include circuitry (sometimes called a chipset) that interconnects and controls communications between system components.
The I/O interface 356 includes a screen 202 (also sometimes called a display), a touch-sensitive surface 204, and one or more sensor(s) 360 (e.g., optical, acceleration, proximity, and/or touch-sensitive sensors). In some implementations, the I/O interface 356 includes a keyboard and/or mouse (or other pointing device) 358. The I/O interface 356 couples input/output peripherals on the device 200, such as the screen 202, the touch-sensitive surface 204, other input devices 358, and one or more sensor(s) 360, to the CPU(s) 352 and/or the memory 350.
The screen 202 provides an output interface between the device and a user. The screen 202 displays visual output to the user. In some implementations, the visual output includes graphics, text, icons, data marks, and any combination thereof (collectively termed “graphics”). In some implementations, some or all of the visual output corresponds to user-interface objects. In some implementations, the screen 202 uses LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies are used in other implementations.
In addition to the touch screen, the device 200 includes a touch-sensitive surface 204 (e.g., a touchpad) for detecting touch inputs. The touch-sensitive surface 204 accepts input from the user via touch inputs (e.g., the touch input 210).
The memory 350 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices. In some implementations, the memory includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some implementations, the memory 350 includes one or more storage devices remotely located from CPU(s) 352. In some implementations, the software components stored in the memory 350 include an operating system 364, a communication module 366, an input/output (I/O) module 368, and one or more applications 374 (e.g., a data visualization application 422). In some implementations, one or more of the various modules comprises a set of instructions in the memory 350. In some implementations, the memory 350 stores one or more data sets in one or more database(s) 378. In some implementations, the I/O module 368 includes a touch input sub-module 370 and a graphics sub-module 372. In some implementations, the applications 374 include data visualization module 376.
In some implementations, the memory 350 stores programs, modules, and data structures analogous to the programs, modules, and data structures stored in the memory 308 of the portable multifunction device 100, or a subset thereof. In some implementations, the memory 350 stores additional programs, modules, and data structures not present in the memory 308 of the portable multifunction device 100. In some implementations, the memory 350 of the device 200 stores drawing, presentation, and word processing applications.
The device 200 also includes a power system for powering the various components. The power system optionally includes a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management, and distribution of power in portable devices.
In some implementations, the memory 414 includes high-speed random access memory, such as DRAM, SRAM, DDR RAM or other random access solid state memory devices. In some implementations, the memory 414 includes non-volatile memory, such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices. In some implementations, the memory 414 includes one or more storage devices remotely located from the CPU(s) 402. The memory 414, or alternately the non-volatile memory device(s) within the memory 414, comprises a non-transitory computer readable storage medium. In some implementations, the memory 414, or the computer readable storage medium of the memory 414, stores the following programs, modules, and data structures, or a subset thereof:
Each of the above identified executable modules, applications, or set of procedures may be stored in one or more of the previously mentioned memory devices, and corresponds to a set of instructions for performing a function described above. The above identified modules or programs (i.e., sets of instructions) need not be implemented as separate software programs, procedures, or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various implementations. In some implementations, the memory 414 stores a subset of the modules and data structures identified above. Furthermore, the memory 414 may store additional modules or data structures not described above.
Disclosed user interfaces are optionally implemented on a portable multifunction device 100 or device 200. The following examples are shown utilizing a touch screen (e.g., a touch screen 102). However, it should be understood that, in some implementations, the inputs (e.g., finger contacts) are detected on a touch-sensitive surface on a device that is distinct from a display on the device. In addition, while the following examples are given primarily with reference to finger inputs (e.g., finger contacts, finger tap gestures, finger swipe gestures), it should be understood that, in some implementations, one or more of the finger inputs are replaced with input from another input device (e.g., a mouse based input or stylus input). For example, a swipe gesture is, optionally, replaced with a mouse click (e.g., instead of a contact) followed by movement of the cursor along the path of the swipe (e.g., instead of movement of the contact). As another example, a tap gesture is, optionally, replaced with a mouse click while the cursor is located over the location of the tap gesture (e.g., instead of detection of the contact followed by ceasing to detect the contact). Similarly, when multiple user inputs are simultaneously detected, it should be understood that multiple computer mice are, optionally, used simultaneously, or a mouse and finger contacts are, optionally, used simultaneously.
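The input substitution described above can be sketched as a translation from mouse events into the same contact-event vocabulary that finger input produces, so that downstream gesture handling is shared. The event names are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative sketch: mouse press/move/release events are mapped onto the same
# down/drag/up contact events that finger input produces, so a single gesture
# pipeline can serve both input devices.

MOUSE_TO_TOUCH = {"press": "down", "move": "drag", "release": "up"}

def mouse_events_to_contacts(mouse_events):
    """mouse_events: list of (kind, x, y) tuples; returns the equivalent
    contact events, dropping any event kinds with no touch counterpart."""
    return [(MOUSE_TO_TOUCH[kind], x, y)
            for kind, x, y in mouse_events
            if kind in MOUSE_TO_TOUCH]
```

Under this mapping, a mouse click followed by cursor movement along a path produces the same down/drag/up sequence as a finger swipe.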
When the dimensions are independent of each other, the process is as described. In some instances, there are hierarchical dependencies between the dimensions. In some implementations, when there is a hierarchical dependency between the selected dimensions, filtering out a dependent value filters out the dependent value only within its hierarchy.
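The hierarchy-aware filtering described above can be sketched as follows: filtering out a dependent value removes it only under its parent value, so the same child label occurring under a different parent survives. The dimension and value names are illustrative assumptions.

```python
# Illustrative sketch: filtering out a dependent value only within its hierarchy.
# Given hierarchically dependent dimensions (e.g., region > city), removing a
# city under one region must not remove the same city name under another region.

def filter_dependent(rows, parent_dim, child_dim, parent_value, child_value):
    """Drop rows where child_dim == child_value, but only within
    parent_dim == parent_value; all other rows are kept."""
    return [r for r in rows
            if not (r[parent_dim] == parent_value and r[child_dim] == child_value)]
```

For example, filtering out "Springfield" within the "East" region keeps a "Springfield" that belongs to the "West" region.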
One of the possible dimensions 712 is selected and highlighted, as illustrated by the “Content Type” dimension 714.
In some implementations, the column header “content type” 718 can be switched to another dimension. In some implementations, this is achieved with a horizontal scroll gesture at a contact point on the content type header.
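The header-switching interaction can be sketched by treating the available dimensions as a cyclic list that a horizontal scroll steps through. The function and dimension names are illustrative assumptions, not from the disclosure.

```python
# Illustrative sketch: a horizontal scroll on a column header cycles the header
# to the next (or previous) dimension in the list of available dimensions.

def switch_dimension(dimensions, current, delta):
    """dimensions: ordered list of dimension names.
    current: the dimension currently shown in the header.
    delta: +1 for a scroll in one direction, -1 for the other."""
    i = dimensions.index(current)
    return dimensions[(i + delta) % len(dimensions)]
```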
The terminology used in the description of the invention herein is for the purpose of describing particular implementations only and is not intended to be limiting of the invention. As used in the description of the invention and the appended claims, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof.
The foregoing description, for purposes of explanation, has been presented with reference to specific implementations. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The implementations were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to best utilize the invention and various implementations with various modifications as are suited to the particular use contemplated.
This application is a continuation-in-part of U.S. patent application Ser. No. 14/603,302, filed Jan. 22, 2015, entitled “Methods and Devices for Adjusting Chart Filters,” which claims priority to U.S. Provisional Application No. 62/047,429, filed Sep. 8, 2014, entitled “Methods and Devices for Manipulating Graphical Views of Data,” each of which is hereby incorporated by reference in its entirety. This application further claims priority to U.S. Provisional Application No. 62/221,084, filed Sep. 20, 2015, entitled “Interactive Data Visualization User Interface,” which is hereby incorporated by reference in its entirety. This application is related to U.S. patent application Ser. No. 15/172,052, filed Jun. 2, 2016, and U.S. patent application Ser. No. 15/172,076, filed Jun. 2, 2016, each of which is hereby incorporated by reference in its entirety.
Publication: US 2016/0274750 A1, Sep. 2016 (US).

Provisional applications: 62/047,429, Sep. 2014 (US); 62/221,084, Sep. 2015 (US).

Continuation data: parent application Ser. No. 14/603,302, Jan. 2015 (US); child application Ser. No. 15/172,085 (US).