Natural input for spreadsheet actions

Information

  • Patent Grant
  • Patent Number
    10,732,825
  • Date Filed
    Thursday, June 29, 2017
  • Date Issued
    Tuesday, August 4, 2020
Abstract
Different gestures and actions are used to interact with spreadsheets. The gestures are used in manipulating the spreadsheet and performing other actions in the spreadsheet. For example, gestures may be used to move within the spreadsheet, select data, filter, sort, drill down/up, zoom, split rows/columns, perform undo/redo actions, and the like. Sensors that are associated with a device may also be used in interacting with spreadsheets. For example, an accelerometer may be used for moving and performing operations within the spreadsheet.
Description
BACKGROUND

Many people utilize spreadsheets to interact with data. Generally, users interact with spreadsheets through input devices, such as mice, touch screens, graphical user interfaces and keyboards. Sometimes this interaction can be frustrating. For example, interacting with and manipulating a large spreadsheet on a small screen device (e.g. cell phone, tablet) can be difficult and tedious.


SUMMARY

This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Different gestures and actions are used to interact with spreadsheets. The gestures are used in manipulating the spreadsheet and performing other actions in the spreadsheet. For example, gestures may be used to move within the spreadsheet, select data, filter, sort, drill down/up, zoom, split rows/columns, perform undo/redo actions, and the like. Sensors that are associated with a device may also be used in interacting with spreadsheets. For example, an accelerometer may be used for moving and performing operations within the spreadsheet.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an exemplary computing device;



FIG. 2 illustrates an exemplary touch input system;



FIG. 3 shows a system for using gestures and sensor information to interact with an application;



FIG. 4 shows a zooming gesture within a spreadsheet;



FIG. 5 illustrates the use of a gesture box;



FIG. 6 shows a karate chop gesture;



FIG. 7 shows a user selecting data and then drawing a chart gesture to change a view of the selected data;



FIG. 8 shows a user drawing a trend line gesture on a chart;



FIG. 9 illustrates a comment gesture;



FIG. 10 shows a vortex effect in response to an action being performed on data;



FIG. 11 illustrates a display and interaction with a grip user interface element;



FIG. 12 shows spreadsheet objects being displayed based on a movement of the device; and



FIG. 13 shows an illustrative process for using gestures and sensors to interact with a spreadsheet.





DETAILED DESCRIPTION

Referring now to the drawings, in which like numerals represent like elements, various embodiments will be described. In particular, FIG. 1 and the corresponding discussion are intended to provide a brief, general description of a suitable computing environment in which embodiments may be implemented.


Generally, program modules include routines, programs, components, data structures, and other types of structures that perform particular tasks or implement particular abstract data types. Other computer system configurations may also be used, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. Distributed computing environments may also be used where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.


Referring now to FIG. 1, an illustrative computer architecture for a computer 100 utilized in the various embodiments will be described. The computer architecture shown in FIG. 1 may be configured as a mobile or a desktop computer and includes a central processing unit 5 (“CPU”), a system memory 7, including a random access memory 9 (“RAM”) and a read-only memory (“ROM”) 10, and a system bus 12 that couples the memory to the CPU 5. According to embodiments, computer 100 is a handheld computing device such as a mobile phone, tablet, laptop, netbook, PDA, and the like.


A basic input/output system containing the basic routines that help to transfer information between elements within the computer, such as during startup, is stored in the ROM 10. The computer 100 further includes a mass storage device 14 for storing an operating system 16, application program(s) 24, other program modules 25, and gesture manager 26, which will be described in greater detail below.


The mass storage device 14 is connected to the CPU 5 through a mass storage controller (not shown) connected to the bus 12. The mass storage device 14 and its associated computer-readable media provide non-volatile storage for the computer 100. Although the description of computer-readable media contained herein refers to a mass storage device, such as a hard disk or CD-ROM drive, the computer-readable media can be any available physical media that can be accessed by the computer 100.


By way of example, and not limitation, computer-readable media may comprise computer storage media and communication media. Computer storage media includes physical volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, Erasable Programmable Read Only Memory (“EPROM”), Electrically Erasable Programmable Read Only Memory (“EEPROM”), flash memory or other solid state memory technology, CD-ROM, digital versatile disks (“DVD”), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer 100.


According to various embodiments, computer 100 may operate in a networked environment using logical connections to remote computers through a network 18, such as the Internet. The computer 100 may connect to the network 18 through a network interface unit 20 connected to the bus 12. The network connection may be wireless and/or wired. The network interface unit 20 may also be utilized to connect to other types of networks and remote computer systems. The computer 100 may also include an input/output controller 22 for receiving and processing input from a number of other devices, including a display/touch input device 28. The touch input device may utilize any technology that allows touch input to be recognized. For example, the technologies may include, but are not limited to: heat, finger pressure, high capture rate cameras, infrared light, optic capture, tuned electromagnetic induction, ultrasonic receivers, transducer microphones, laser rangefinders, shadow capture, and the like. An exemplary touch input device is shown in FIG. 2. The touch input device 28 may also be separate from the display. The input/output controller 22 may also provide output to one or more display screens, a printer, or other type of output device.


Computer 100 may also include one or more sensors 21. According to an embodiment, computer 100 includes an accelerometer for sensing acceleration of the computer or a portion of the computer. For example, the accelerometer may detect movement of display 28. The accelerometer may be a single axis or multi-axis accelerometer that is used to sense orientation, acceleration, vibration, and other types of actions that may be sensed by an accelerometer. Other sensors may also be included, such as location sensors (i.e. GPS), audio sensors, infrared sensors, other types of tilt sensors, and the like. Information received by sensor 21 may be used to interact with an application program. For example, when a user moves the computing device, different parts of a spreadsheet may be shown in response to the movement.


As mentioned briefly above, a number of program modules and data files may be stored in the mass storage device 14 and RAM 9 of the computer 100, including an operating system 16 suitable for controlling the operation of a networked personal computer, such as the WINDOWS® 7 operating system from MICROSOFT CORPORATION of Redmond, Wash. According to one embodiment, the operating system is configured to include support for touch input device 23. According to another embodiment, a gesture manager 26 may be utilized to process some/all of the touch input that is received from touch input device 23.


The mass storage device 14 and RAM 9 may also store one or more program modules. In particular, the mass storage device 14 and the RAM 9 may store one or more application programs 24, such as a spreadsheet application. According to an embodiment, the spreadsheet application is the MICROSOFT EXCEL spreadsheet application. Other spreadsheet applications may also be used.


In conjunction with the operation of the application, gesture manager 26 is configured to detect gestures that are received by the touch input device 28. Generally, gesture manager 26 is configured to sense when a gesture is received that is related to performing an operation in conjunction with an application program, such as a spreadsheet application. Different types of gestures may be received. For example, a swipe gesture, a cut/paste gesture, an insert gesture, a vortex gesture, a grip gesture, a chart gesture, a trend line gesture, a comment gesture, a zoom gesture, a sort gesture, an undo/redo gesture, and the like may be received.


Gesture manager 26 is also configured to receive input from one or more sensors. The information received from the sensor(s) may be used alone and/or in combination with a received gesture. For example, tilting the device may cause a spreadsheet to scroll/pan in the tilted direction. Shaking the device may be used to clear a filter, reset a state, perform an undo, and the like. Jerking the device may cause an acceleration in scrolling or a jump in the scroll position. Tilting the device steeply (e.g. greater than 30 degrees or some other predetermined angle) may cause the spreadsheet objects contained within spreadsheet 23 to appear as if they are spilling to the top of the spreadsheet, thereby allowing the user to select one of the spreadsheet objects. Upon selection, the spreadsheet objects can return to their original location and the view may center on the chosen object. The gestures and sensor information may be used to change a display of information, activate/deactivate functions, and/or perform some other type of operation associated with application 24 or some other function and/or program. Additional details regarding the gestures and sensor information will be provided below.
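By way of illustration only, and not as part of the disclosure, the mapping from raw sensor readings to such actions might resemble the following Python sketch; the threshold values, class names, and action labels are assumptions chosen for the example (the disclosure only gives the ~30-degree figure for a "steep" tilt).

```python
import math
from dataclasses import dataclass

# Hypothetical thresholds; only the ~30-degree "steep tilt" value comes from the text.
STEEP_TILT_DEG = 30.0
SHAKE_MAGNITUDE_G = 2.5
JERK_MAGNITUDE_G = 1.8

@dataclass
class AccelSample:
    x: float  # lateral acceleration (g)
    y: float  # longitudinal acceleration (g)
    z: float  # vertical acceleration (g)

def classify_sensor_action(sample: AccelSample, tilt_deg: float) -> str:
    """Map one accelerometer reading plus a tilt angle to a spreadsheet action."""
    magnitude = math.sqrt(sample.x ** 2 + sample.y ** 2 + sample.z ** 2)
    if magnitude > SHAKE_MAGNITUDE_G:
        return "clear_filters_or_undo"       # shaking clears filters / resets state
    if magnitude > JERK_MAGNITUDE_G:
        return "accelerated_scroll"          # a jerk accelerates or jumps the scroll
    if tilt_deg > STEEP_TILT_DEG:
        return "spill_objects_to_top"        # steep tilt spills spreadsheet objects
    if abs(tilt_deg) > 2.0:
        return "scroll_in_tilt_direction"    # gentle tilt pans the grid
    return "no_action"

print(classify_sensor_action(AccelSample(0.1, 0.2, 1.0), tilt_deg=35.0))  # spill_objects_to_top
```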



FIG. 2 illustrates an exemplary touch input system. Touch input system 200 as illustrated comprises a touch panel 202 that has several sensors 204 integrated therein. According to one embodiment, the sensors 204 are Infrared (IR) sensors. The touch input system 200 is configured to detect objects that are either in contact with the touchable surface 206 or are close to but not in actual contact with (“adjacent”) touchable surface 206. The objects that are sensed may be many different types of objects, such as fingers, hands, or other physical objects. Infrared sensors 204 are distributed throughout touch panel 202 and are disposed parallel to touchable surface 206. One or more of the infrared sensors 204 may detect infrared radiation reflected from objects, such as hand 208, as indicated by the arrow. Although the term “above” is used in this description, it should be understood that the orientation of the touch panel system is irrelevant. As shown in FIG. 2, touchable surface 206 is horizontal, but in a different embodiment generated by rotating system 200 clockwise by 90 degrees, touchable surface 206 could be vertical. In that embodiment, the objects from which reflected IR radiation is detected are to the side of touchable surface 206. The term “above” is intended to be applicable to all such orientations. Touchable surface 206 may also be changed to other orientations.


Touch panel 202 may comprise filters 212 that absorb visible light and transmit infrared radiation and are located between touchable surface 206 and IR sensors 204 in order to shield IR sensors 204 from visible light incident on touchable surface 206 in the case where IR sensors 204 are sensitive to a broader range of wavelengths than purely infrared wavelengths.


Touch panel 202 may comprise a display that is configured to display images that are viewable via touchable surface 206. For example, the displayed images may relate to an application, such as a spreadsheet. The display may be, for example, an LCD, an organic light emitting diode (OLED) display, a flexible display such as electronic paper, or any other suitable display in which an IR sensor can be integrated.


System 200 may comprise a backlight 216 for the display. Backlight 216 may comprise at least one IR source 218 that is configured to illuminate objects in contact with or adjacent to touchable surface 206 with infrared radiation through touchable surface 206, as indicated by the arrows. IR sensors 204 are sensitive to radiation incident from above, so IR radiation traveling directly from backlight 216 to IR sensors 204 is not detected.


The output of sensors 204 may be processed by gesture manager 26 and/or functionality included within an operating system or some other application to detect when a physical object (e.g., a hand, a bottle, a glass, a finger, a hat, etc.) has come into physical contact with a portion of the touch input surface 206 and/or a physical object is in close proximity to the surface. For example, sensors 204 can detect when a portion of hand 208, such as one or more fingers, has come in contact with or is near to the touch input display surface 206. Additional sensors can be embedded in the touch input display surface 206 and can include for example, accelerometers, pressure sensors, temperature sensors, image scanners, barcode scanners, etc., to detect multiple simultaneous inputs.


When the sensors 204 are IR sensors, the IR radiation reflected from the objects may be reflected from a user's hands, fingers, reflective ink patterns on the objects, metal designs on the objects or any other suitable reflector. Fingers reflect enough of the near IR to detect that a finger or hand is located at a particular location on or adjacent to the touchable surface. A higher density of IR sensors may be used to scan objects at a higher resolution.


Sensors 204 can be included (e.g., embedded) in a plurality of locations. The density of sensors 204 can be sufficient such that contact across the entirety of touch input surface 206 can be detected. Sensors 204 are configured to sample the surface of touch input display surface 206 at specified intervals (e.g., 1 ms, 5 ms) for detected contact and/or near contact. The sensor data received from sensors 204 changes between sampling intervals as detected objects move on the touch surface, as detected objects leave the range of detection, and as new objects come into range of detection. For example, gesture manager 26 can determine that contact was first detected at a first location and then contact was subsequently moved to other locations. In response, the gesture manager 26 may determine when a gesture is received and what type of gesture is received.
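As a minimal illustration of that idea, and not as text taken from the disclosure, the sketch below classifies a sequence of sampled contact points into a simple swipe direction; the function name and pixel threshold are assumptions.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def classify_swipe(samples: List[Point], min_distance: float = 50.0) -> str:
    """Classify a sequence of sampled contact points as a simple swipe.

    Only the net displacement between the first and last sampled contact is
    considered; a real gesture manager would track far more state.
    """
    if len(samples) < 2:
        return "none"
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    if max(abs(dx), abs(dy)) < min_distance:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

print(classify_swipe([(10, 10), (40, 12), (90, 15)]))  # swipe_right
```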



FIG. 2 provides just one example of a touch input system. In other exemplary touch systems, the backlight may not comprise any IR sources and the surface 206 may include a frontlight which comprises at least one IR source. In such an example, the touchable surface 206 of the system is a surface of the frontlight. The frontlight may comprise a light guide, so that IR radiation emitted from IR source travels through the light guide and is directed towards touchable surface and any objects in contact with or adjacent to it. In other touch panel systems, both the backlight and frontlight may comprise IR sources. In yet other touch panel systems, there is no backlight and the frontlight comprises both IR sources and visible light sources. In further examples, the system may not comprise a frontlight or a backlight, but instead the IR sources may be integrated within the touch panel. In an implementation, the touch input system 200 may comprise an OLED display which comprises IR OLED emitters and IR-sensitive organic photosensors (which may comprise reverse-biased OLEDs). In some touch systems, a display may not be included. Even if the touch system comprises one or more components or elements of a display, the touch system may be configured to not display images. For example, this may be the case when the touch input tablet is separate from a display. Other examples include a touchpad, a gesture pad, and similar non-display devices and components.


For some applications, it may be desirable to detect an object only if it is in actual contact with the touchable surface of the touch panel system. For example, according to one embodiment, a gesture may not be recognized when the gesture is not performed touching the surface. Similarly, a gesture may be recognized when performed above the surface. The IR source of the touch input system may be turned on only if the touchable surface is touched. Alternatively, the IR source may be turned on regardless of whether the touchable surface is touched, and detection of whether actual contact between the touchable surface and the object occurred is processed along with the output of the IR sensor. Actual contact between the touchable surface and the object may be detected by any suitable means, including, for example, by a vibration sensor or microphone coupled to the touch panel. A non-exhaustive list of examples for sensors to detect contact includes pressure-based mechanisms, micro-machined accelerometers, piezoelectric devices, capacitive sensors, resistive sensors, inductive sensors, laser vibrometers, and LED vibrometers.



FIG. 3 shows a system for using gestures and sensor information to interact with an application. As illustrated, system 300 includes spreadsheet application 302, callback code 312, sensor(s) 304, gesture manager 26, spreadsheet 310 and touch input device/display 340. According to an embodiment, the functionality of system 300 is included within a mobile computing device.


In order to facilitate communication with the gesture manager 26, one or more callback routines, illustrated in FIG. 3 as callback code 312, may be implemented. According to one embodiment, gesture manager 26 is configured to receive input from a touch-sensitive input device 340 and sensor(s) 304. For example, gesture manager 26 may provide an indication to application 302 when a user's hand (i.e. hand 312) or some other object performs a gesture that is used in interacting with spreadsheet 310. Sensor information may also be received by gesture manager 26 to interact with spreadsheet 310. For example, a user may tilt or tap the side of the computing device to scroll the display of spreadsheet 310.


Gesture manager 26 is configured to recognize many different types of gestures. Some of the gestures may be context dependent, be specific to an application and/or be used within many different types of applications. For example, gestures may be used to interact with a spreadsheet 310 that is associated with a spreadsheet application 302. Gestures may be received in many different locations relating to touch input device/display 340. For example, a gesture may be received within a display of spreadsheet 310, within a gesture box 314 and/or at some other location on display 340.


Gestures may be predetermined and/or specified in different ways. For example, some gestures may be predetermined to be associated with a particular action whereas other gestures may be associated with one or more actions by a user. For instance, a user could specify that when a particular gesture is received then one or more spreadsheet operations are to occur. The operation may be specified in many different ways. For example, programming code may be used, a macro may be created, a formula may be configured, and the like.


There are many different examples of actions that may occur. For example, certain gestures when recognized may reverse the polarity of conditional formatting (i.e. change from high-to-low to low-to-high formatting). Performing a gesture may change formatting of the data (e.g. high values that were originally formatted to be green are red after the user performs a certain gesture). One or more gestures may result in reversing the orientation of charts. For example, a chart was originally displaying data from January to March; after a gesture is performed, the chart reverses and displays data from March to January. One or more gestures may expose more information. For example, suppose that a chart is currently displaying data from January 2009 to March 2009. When a gesture is received, the display of the chart shifts and shows data from February 2009 to June 2009. Exemplary gestures and interactions with a spreadsheet are described below with reference to FIGS. 4-12.


Sensor information received by gesture manager 26 may be used to interact with spreadsheet 310. For example, the relative position of the computing device/display may be used to determine the portion of the spreadsheet to display. For example, moving the device to the left may scroll/pan the display of the spreadsheet to the left whereas moving the device to the right may scroll/pan the display of the spreadsheet to the right. The scrolling/panning of the display may be a combination of horizontal and vertical scrolling/panning. For example, moving the device diagonally may result in an equal amount of horizontal and vertical scroll/pan. The terms “scroll” and “pan” as used herein may be used interchangeably. Moving the device upwards or downwards may also affect the display of spreadsheet 310. For example, moving the device down may perform a zoom out operation such that more of the spreadsheet is displayed, whereas moving the device in an upwards direction may perform a zoom in operation such that a more detailed view of the spreadsheet is displayed. According to an embodiment, a zoom out operation displays a thumbnail view for each sheet of the spreadsheet workbook when the zoom exceeds a predetermined zoom level. Moving the device upwards/downwards may also cause drilling operations to be performed on the spreadsheet. For example, moving the device in an upwards direction may perform a drill up operation to show less detail in the spreadsheet whereas moving the device in a downwards direction may perform a drill down operation to show more detail.
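As an illustrative sketch only (the pan/zoom scale factors and the ~5% thumbnail threshold are assumptions, not values from the disclosure), device motion deltas could be translated into a new view state as follows:

```python
# Illustrative only: the thumbnail view appears below an assumed ~5% zoom level.
THUMBNAIL_ZOOM_THRESHOLD = 0.05

def update_view(scroll_x, scroll_y, zoom, dx, dy, dz, pan_scale=10.0, zoom_scale=0.1):
    """Translate device motion deltas into a new spreadsheet view state.

    dx/dy: lateral motion (pans the grid; a diagonal move pans both axes equally).
    dz:    vertical motion (down, i.e. negative, zooms out; up zooms in).
    Returns (scroll_x, scroll_y, zoom, show_thumbnails).
    """
    scroll_x += dx * pan_scale
    scroll_y += dy * pan_scale
    zoom = max(0.01, zoom + dz * zoom_scale)
    show_thumbnails = zoom < THUMBNAIL_ZOOM_THRESHOLD
    return scroll_x, scroll_y, zoom, show_thumbnails

# A diagonal move pans both axes; a large downward move zooms far enough out
# that the per-sheet thumbnail view would be shown.
print(update_view(0, 0, 1.0, dx=3, dy=3, dz=-9.6))
```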


The speed of the movement of the spreadsheet/computing device may also be used in determining a speed of the scrolling and/or zooming. For example, a sudden movement in a direction may increase the speed of the scrolling and/or zooming action. According to an embodiment, scrolling may slow/stop when it comes near a predetermined location (“speed bump”) within the spreadsheet. The speed bumps may be placed at different locations within the spreadsheet. They may be automatically determined and/or manually located. A speed bump may be automatically placed whenever there is a gap in the data within the spreadsheet. For example, suppose that a spreadsheet has content in rows 1-3 and 10-20 and no content in rows 4-9. A speed bump may be placed at row 4 (318). When the user pans to row 4, the panning stops as if it hit the end of the spreadsheet. The panning may also slow as it nears the end of the data before the speed bump. To continue panning, the user simply performs the panning operation again. Speed bumps may also be placed periodically throughout the grid of the spreadsheet (e.g. every 100 rows, 200 rows, 1000 rows, and the like). Different actions may also be associated with the speed bumps. For example, the movement of the spreadsheet may stop when it hits a speed bump, slow when it hits a speed bump and then speed up as it moves away from the speed bump, and the like.
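One plausible way to derive the automatic speed-bump locations is sketched below; this is illustrative only, and the helper name and 1000-row period are assumptions.

```python
def speed_bump_rows(populated_rows, periodic_every=1000):
    """Compute rows where panning should slow or stop ("speed bumps").

    A bump is placed at the first empty row after each block of data, plus
    periodically throughout the grid.
    """
    populated = sorted(set(populated_rows))
    bumps = set()
    for prev, curr in zip(populated, populated[1:]):
        if curr - prev > 1:              # a gap in the data
            bumps.add(prev + 1)          # bump at the first empty row
    last = populated[-1] if populated else 0
    bumps.update(range(periodic_every, last + periodic_every, periodic_every))
    return sorted(bumps)

# Content in rows 1-3 and 10-20 places a bump at row 4 (compare item 318 above).
print(speed_bump_rows(list(range(1, 4)) + list(range(10, 21))))  # [4, 1000]
```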


The orientation of the spreadsheet/computing device may also be monitored. Tilting the display of the spreadsheet causes the display of the spreadsheet to scroll/pan in the tilted direction. An amount of tilt and/or a speed of the tilt may also be used to perform operations. For example, tilting the device steeply in the vertical direction may cause all the spreadsheet objects within spreadsheet 310 to appear as if they are spilling to the top of the spreadsheet (See FIG. 12). The tilting may also be interpreted to navigate different data within an object. For example, tilting the device may navigate across the data series in a chart.


Rotating the device may be used to change views that are associated with the spreadsheet. For example, rotating the display of the spreadsheet may cause the view to change from a sheets view to a Named Object View that displays each object that is associated with spreadsheet 310. Similarly, when a user is viewing an object, the view may be changed based on the object. For example, when a user is viewing a pie chart, rotating the spreadsheet may cause the view to change to some other type of chart (e.g. bar chart).


Shaking the device may also be detected and used to perform operations relating to the spreadsheet. For example, shaking the device may be used to clear filters that are set on a table or pivot table within a spreadsheet, reset a state, perform an undo operation, and the like.



FIGS. 4-12 illustrate exemplary gestures and actions to interact with a spreadsheet.



FIG. 4 shows a zooming gesture within a spreadsheet. As illustrated, FIG. 4 shows a spreadsheet view 410 before zooming and a zoomed view 420 after performing a zoom out operation. According to an embodiment, a zoom gesture is detected when a user spreads two of their fingers apart as indicated by gesture 445. A zoom in gesture may be detected when a user closes two of their fingers. The zoomed out view 420 may be shown when a user zooms out beyond a certain point (e.g. <5%) and/or based on a speed of the movement between the two fingers of the gesture. For example, a very quick separation of the fingers may cause the thumbnail zoomed view 420 to be shown. As illustrated, zoomed view 420 is a thumbnail view of each sheet within the spreadsheet workbook. Other zoomed views may also be provided. For example, before zooming to the thumbnail view of each sheet within the spreadsheet, a Named Object View of the current sheet may be displayed at a predetermined zoom level (e.g. at a zoom level of 15-20%).
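A minimal sketch of how the two-finger zoom gesture might be quantified is shown below; it is illustrative only, and the function and coordinates are assumptions rather than part of the disclosure.

```python
import math

def pinch_zoom_factor(start_pts, end_pts):
    """Compute a zoom factor from the start/end positions of two touch points."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    d_start = dist(*start_pts)
    d_end = dist(*end_pts)
    return d_end / d_start if d_start else 1.0

# Fingers spread from 100 px apart to 300 px apart -> factor 3.0.
# In the FIG. 4 example, spreading the fingers leads to the zoomed-out view;
# closing them (factor < 1) zooms back in.
print(pinch_zoom_factor([(0, 0), (100, 0)], [(0, 0), (300, 0)]))
```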



FIG. 5 illustrates the use of a gesture box. As illustrated, FIG. 5 illustrates display 530 that shows spreadsheet 540, selection 512, and gesture box 518. According to an embodiment, the gesture box 518 is drawn near a corner of the display. A gesture 516 may be drawn into the gesture box 518. Each gesture is associated with a particular command. For example, some exemplary commands are illustrated by gesture box commands 520. For example: drawing S could save the spreadsheet, drawing B could bold text, drawing a + sign could write a sum formula, drawing a − sign could write a subtraction formula, drawing an up arrow could sort upwards, drawing a down arrow could sort downwards, and drawing a circle could write an average formula. Other gesture commands may be used within gesture box 518. For example, a user could create custom gestures and/or modify existing gestures. These gestures could be recorded and associated with programming code, commands, and/or macros.


As illustrated, a user has created a selection 512 of the numbers 10, 12 and 5 with gesture 514 and then draws a “+” symbol within gesture box 518. In response to drawing the + symbol, a sum formula is written to grid location A4 that causes the value 27 to be displayed. A user may draw any gesture command into the box that is appropriate for selection 512. For example, a user may draw a circle within gesture box 518 that writes an average formula, causing 9 to be displayed at location A4.
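A tiny illustrative dispatch table for the gesture-box commands above might look like the following sketch; the symbol names, command names, and A1-style range are assumptions, and recognizing the drawn symbol is outside its scope.

```python
# Hypothetical mapping of gesture-box symbols to commands, following the
# examples of gesture box commands 520.
GESTURE_BOX_COMMANDS = {
    "S": "save_workbook",
    "B": "bold_selection",
    "+": "write_sum_formula",
    "-": "write_subtraction_formula",
    "up_arrow": "sort_ascending",
    "down_arrow": "sort_descending",
    "circle": "write_average_formula",
}

def apply_gesture_box(symbol, selection):
    """Dispatch a recognized gesture-box symbol against the current selection."""
    command = GESTURE_BOX_COMMANDS.get(symbol)
    if command is None:
        raise ValueError(f"Unrecognized gesture: {symbol}")
    if command == "write_sum_formula":
        return f"=SUM({selection})"
    if command == "write_average_formula":
        return f"=AVERAGE({selection})"
    return command

print(apply_gesture_box("+", "A1:A3"))       # =SUM(A1:A3) -> 27 for the values 10, 12, 5
print(apply_gesture_box("circle", "A1:A3"))  # =AVERAGE(A1:A3) -> 9
```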


While gesture box 518 is shown at the bottom left location of display 530, the gesture box may be displayed at other locations. For example, gesture box 518 may be displayed on top of spreadsheet 540, or at any other location on display 530.



FIG. 6 shows a karate chop gesture. As illustrated, FIG. 6 shows spreadsheet 610 receiving a horizontal karate type gesture 612 and a vertical karate type gesture 614. The karate type gesture is used to insert a row/column at a particular location within a spreadsheet. The karate chop gesture is detected when a karate chop motion is performed. For example, the edge of a user's hand may be detected, or another physical object with an edge having similar characteristics to a user's hand may be detected, to receive the karate type gesture. When a horizontal karate chop gesture 612 is received, a new row is inserted at the location of the karate chop gesture. When a vertical karate chop gesture 614 is received, a new column is inserted at the location of the karate chop gesture. In the current example, a new row is inserted between rows 2 and 3 and a new column is inserted after column 6 (see spreadsheet 620). According to another embodiment, the karate chop gesture is associated with other commands. For example, the karate chop gesture could trigger the automatic conversion of text-to-columns.
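Illustratively, and only as an assumed sketch with contact coordinates expressed directly in column/row units for simplicity, the chop orientation and insertion point could be derived like this:

```python
def handle_karate_chop(grid, contact_points):
    """Insert a row or column at the location of a karate-chop gesture.

    contact_points: sampled (x, y) positions of the detected hand edge, given
    here in column/row units. A mostly horizontal edge inserts a row; a mostly
    vertical edge inserts a column.
    """
    xs = [p[0] for p in contact_points]
    ys = [p[1] for p in contact_points]
    if max(xs) - min(xs) >= max(ys) - min(ys):   # edge lies horizontally
        row_index = round(sum(ys) / len(ys))     # row position of the chop
        grid["rows"].insert(row_index, "new row")
        return f"inserted row at index {row_index}"
    col_index = round(sum(xs) / len(xs))
    grid["cols"].insert(col_index, "new column")
    return f"inserted column at index {col_index}"

sheet = {"rows": ["row 1", "row 2", "row 3"], "cols": ["col 1", "col 2"]}
print(handle_karate_chop(sheet, [(0.5, 2.0), (3.0, 2.1), (6.5, 2.0)]))  # inserted row at index 2
```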



FIG. 7 shows a user selecting data and then drawing a chart gesture to change a view of the selected data. As illustrated, FIG. 7 shows a spreadsheet 710 comprising data 716. In the current example, a user has selected data 716 and then draws a chart gesture to create a display of a chart to represent the selected data.


Chart gesture 722 shows the user drawing a circle that represents a pie chart. When the user has completed chart gesture 722, a pie chart 712 is displayed. The chart may be displayed at different locations. For example, the chart may replace the selected data, the chart may be placed near the selected data or the chart may be placed at a user designated position. The chart may also be placed on another sheet of the spreadsheet workbook.


Chart gesture 724 shows a user drawing a chart gesture that is in the form of a line that creates a bar chart 714 for selected data 716. According to an embodiment, more than one type of chart may be displayed with selected data.


Chart gesture 726 shows a user drawing a chart gesture that is in the form of a semi-circle line that creates a scatter chart 718 for selected data 716. According to an embodiment, more than one type of chart may be displayed with selected data.
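The shape-to-chart mapping described for FIG. 7 (circle, line, and semi-circle gestures) could be represented, purely as an illustrative sketch with assumed names, by a small lookup table:

```python
# Hypothetical lookup from a recognized gesture shape to a chart type,
# following the circle/line/semi-circle examples of FIG. 7.
CHART_GESTURES = {
    "circle": "pie_chart",
    "line": "bar_chart",
    "semi_circle": "scatter_chart",
}

def create_chart(shape, selected_data):
    """Create a chart of the type implied by the drawn gesture shape."""
    chart_type = CHART_GESTURES.get(shape)
    if chart_type is None:
        raise ValueError(f"No chart associated with gesture shape: {shape}")
    return {"type": chart_type, "data": list(selected_data)}

print(create_chart("circle", [10, 12, 5]))  # {'type': 'pie_chart', 'data': [10, 12, 5]}
```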



FIG. 8 shows a user drawing a chart gesture that is recognized as a trend line gesture on a chart. As illustrated, FIG. 8 shows a bar chart 810, a bar chart 814 with a trend line, a scatter chart 820, and a scatter chart 824 with a logarithmic trend line. In the example of the bar chart, a user draws a trend line gesture 812 near the edges of the displayed chart elements to create a display of a trend line with the bar chart (814). In the example of the scatter chart 820, a user draws a trend line gesture 822 that is a line near the middle of the data to create a display of a trend line with the scatter data (824).
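Once a trend line gesture is recognized, fitting the line itself is a standard least-squares computation; the sketch below is illustrative only and is not taken from the disclosure (a logarithmic trend, as on scatter chart 824, could be fit the same way after transforming x to log x).

```python
def linear_trend_line(points):
    """Fit a straight trend line y = m*x + b to (x, y) points by least squares."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Three bars rising roughly linearly -> slope ~2.05, intercept ~-0.03
print(linear_trend_line([(1, 2.1), (2, 3.9), (3, 6.2)]))
```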



FIG. 9 illustrates a comment gesture. A comment gesture creates a comment at a location near the beginning of the comment gesture. For example, in FIG. 9, a comment is located near the top of the last bar of the chart (920). The comment gesture is a line followed by a portion of a circle, as illustrated by display 910. While the comment gesture is shown being performed on a chart, the comment gesture may be located anywhere within a display and be associated with different types of elements. For example, a comment gesture may be associated with a single cell, a group of selected cells, a chart, a table, or some other object. In response to receiving the comment gesture, a comment box is displayed that allows a user to enter a comment.



FIG. 10 shows a vortex effect in response to an action being performed on data. As illustrated, display 1010 shows a user cutting selected data. In response to the cut option being selected, an animation that looks like a “vortex” is displayed creating the illusion that the content that is cut is getting sucked into the finger (1012). According to an embodiment, content disappears to the point where the finger last contacted the screen.


Display 1020 shows a user pasting data. On a paste command, the content is drawn outward from the finger point of contact (1014) and the vortex animation creates the illusion that the pasted content is coming out of the finger.



FIG. 11 illustrates a display and interaction with a grip user interface element. A grip user interface element 1112 is displayed on top of a document, such as a spreadsheet 1110. According to an embodiment, the grip user interface element 1112 is placed on the side of the screen of the non-dominant hand (e.g. for right handed users on the left side and for left handed users on the right side). When the grip 1112 is not held down, a slide action by the dominant hand 1114 is interpreted as a pan.


When the grip is held down (e.g. by the non-dominant hand) as illustrated in display 1120, any slide action by the other hand 1124 is interpreted as selecting cells. Different effects may be applied with the display of grip 1112. For example, the grid of the spreadsheet can visually “bend” at the point of contact with the grip to provide visual feedback that the grip is being held down.



FIG. 12 shows spreadsheet objects being displayed based on a movement of the device.


Display 1220 shows spreadsheet objects spilling off of the display. For example, tilting the computing device in the vertical direction may cause the spreadsheet objects within spreadsheet 1210 to appear as if they are spilling to the top of the spreadsheet as shown in display 1220. Tilting the device horizontally may cause the spreadsheet objects to spill to the side of the device. Upon selection of an object, the spreadsheet is displayed with the view centered on the chosen object. This provides a faster way to navigate objects on a sheet.


Display 1230 shows spreadsheet objects being gravity sorted. For example, the user may tilt the computing device from landscape to portrait mode, and then shake the device. According to an embodiment, the combined actions sort the data so the largest items appear on the bottom. The data may be numbers, chart data, spreadsheet objects and the like. For example, when applied to a bar chart, it would appear that the heaviest bars of the chart are falling to the bottom.


Referring now to FIG. 13, an illustrative process 1300 for using gestures and sensors to interact with a spreadsheet will be described. When reading the discussion of the routines presented herein, it should be appreciated that the logical operations of various embodiments are implemented (1) as a sequence of computer implemented acts or program modules running on a computing system and/or (2) as interconnected machine logic circuits or circuit modules within the computing system. The implementation is a matter of choice dependent on the performance requirements of the computing system implementing the invention. Accordingly, the logical operations illustrated and making up the embodiments described herein are referred to variously as operations, structural devices, acts or modules. These operations, structural devices, acts and modules may be implemented in software, in firmware, in special purpose digital logic, and any combination thereof.


After a start operation, the process flows to operation 1310, where a gesture is received and/or sensor information is received. The gesture may be any of the gestures described herein or other recognized gestures, such as a karate chop gesture, a grip gesture, a shape gesture, a trend line gesture, a comment gesture, a zoom gesture, a sort gesture, and the like. According to one embodiment, the user places at least a portion of their hand (e.g. one or more fingers) on the touch surface. Additionally, according to some embodiments, the user may place their hand near the surface of the touch surface but not on the touch surface. The sensor information may relate to many different types of sensor information that may be used in interacting with a display. For example, the sensor information may relate to accelerometer data that may be used in determining an orientation of the computing device and a speed of the device.


Moving to operation 1320, the action to perform is determined. According to an embodiment, the action relates to interacting with a spreadsheet and comprises actions such as panning, tilting, sorting, zooming, drilling, and the like. While the actions described relate to interaction with spreadsheets, other applications may be utilized with the gestures described.


Flowing to operation 1330, the determined action is performed. Generally, the action relates to updating a spreadsheet. For example, a chart may be created, a trend line may be added, data may be sorted or summed, or some other operation may be performed on the data.


Transitioning to operation 1340, the display is updated. The display is updated in response to the gesture and/or sensor information that is received and the action that is performed.


The process then flows to an end operation and returns to processing other actions.
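A compressed, purely illustrative walk-through of operations 1310-1340 in code form follows; the event format, action names, and spreadsheet structure are assumptions made for the sketch, not an API from the disclosure.

```python
def process_input(event, spreadsheet):
    """One pass through the illustrative process of FIG. 13 (operations 1310-1340)."""
    # Operation 1310: receive gesture and/or sensor information.
    gesture = event.get("gesture")
    sensor = event.get("sensor")

    # Operation 1320: determine the action to perform.
    if gesture == "sort":
        action = ("sort", None)
    elif gesture == "zoom":
        action = ("zoom", event.get("factor", 1.0))
    elif sensor and abs(sensor.get("tilt", 0)) > 2:
        action = ("pan", sensor["tilt"])
    else:
        action = ("none", None)

    # Operation 1330: perform the determined action on the spreadsheet.
    name, arg = action
    if name == "sort":
        spreadsheet["data"].sort()
    elif name == "zoom":
        spreadsheet["zoom"] *= arg
    elif name == "pan":
        spreadsheet["scroll_x"] += arg

    # Operation 1340: update the display (here, just return the new state).
    return spreadsheet

sheet = {"data": [3, 1, 2], "zoom": 1.0, "scroll_x": 0}
print(process_input({"gesture": "sort"}, sheet))  # data becomes [1, 2, 3]
```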


The above specification, examples and data provide a complete description of the manufacture and use of the composition of the invention. Since many embodiments of the invention can be made without departing from the spirit and scope of the invention, the invention resides in the claims hereinafter appended.

Claims
  • 1. A method for interacting with a spreadsheet, comprising: receiving an indication to change a view of the spreadsheet, wherein the indication includes an indication to perform an operation including one or more of: a zoom out operation with respect to the spreadsheet in response to moving a touch input device in a first direction, and a zoom in operation with respect to the spreadsheet in response to moving the touch input device in a second direction, and wherein the zoom out operation and the zoom in operation slow at one or more predetermined locations within the spreadsheet, wherein the one or more predetermined locations are automatically prepositioned within the spreadsheet based at least in part on an arrangement of data within the spreadsheet; and updating a display of the spreadsheet.
  • 2. The method of claim 1, wherein the indication to change the view of the spreadsheet is a gesture received on the touch input device.
  • 3. The method of claim 2, wherein the gesture comprises one of: a karate chop gesture, a chart gesture, a comment gesture, a grip gesture, a gesture within a gesture box, and a sort gesture.
  • 4. The method of claim 3, wherein a different operation is performed based on the gesture.
  • 5. The method of claim 4, wherein the different operation comprises one of: in response to receiving the karate chop gesture on the spreadsheet, inserting at least one of a row and a column in the spreadsheet near a location of the karate chop gesture; in response to receiving the chart gesture, inserting at least one of: a trend line on a displayed chart in the spreadsheet and a chart in the spreadsheet; in response to receiving the comment gesture, displaying a comment box to receive a comment at a location near a received gesture location; in response to receiving the gesture within the gesture box, determining a gesture command from the gesture and performing the gesture command on the data within the spreadsheet; and in response to receiving the sort gesture, sorting at least a portion of data within the spreadsheet.
  • 6. The method of claim 5, wherein performing a gesture command further comprises: displaying a vortex animation in response to a cut command or in response to a paste command.
  • 7. The method of claim 1, further comprising: receiving sensor information, wherein the sensor information comprises accelerometer data used to change the view of the spreadsheet.
  • 8. The method of claim 7, wherein the accelerometer data is used to perform one or more of the zoom out operation and the zoom in operation.
  • 9. The method of claim 1, further comprising: in response to performing the zoom out operation, displaying a thumbnail for each sheet within a workbook associated with the spreadsheet when a zoom level exceeds a threshold.
  • 10. A computer system, comprising: at least one processor; and at least one memory storing computer-executable instructions that when executed by the at least one processor cause the computer system to: receive an indication to change a view of a spreadsheet, wherein the indication includes an indication to perform an operation including one or more of: a zoom out operation with respect to the spreadsheet in response to moving a touch input device in a first direction, and a zoom in operation with respect to the spreadsheet in response to moving the touch input device in a second direction, wherein the zoom out operation and the zoom in operation slow at one or more predetermined locations within the spreadsheet, wherein the one or more predetermined locations are automatically prepositioned within the spreadsheet based at least in part on an arrangement of data within the spreadsheet; and update a display of the spreadsheet.
  • 11. The computer system of claim 10, wherein the indication to change the view of the spreadsheet is the gesture received on a touch input device.
  • 12. The computer system of claim 11, wherein a different operation is performed based on the gesture.
  • 13. The computer system of claim 10, the computer-executable instructions further causing the computer system to: receive sensor information, wherein the sensor information comprises accelerometer data used to change the view of the spreadsheet.
  • 14. The computer system of claim 13, wherein the accelerometer data is used to perform one or more of the zoom out operation and the zoom in operation.
  • 15. The computer system of claim 10, the computer-executable instructions further causing the computer system to: in response to performing the zoom out operation, display a thumbnail for each sheet within a workbook associated with the spreadsheet when a zoom level exceeds a threshold.
  • 16. A computer storage medium comprising computer-executable instructions that when executed by a processor cause the processor to: receive an indication to change a view of a spreadsheet, wherein the indication includes an indication to perform an operation including one or more of: a zoom out operation with respect to the spreadsheet in response to moving a touch input device in a first direction, and a zoom in operation with respect to the spreadsheet in response to moving the touch input device in a second direction, wherein the zoom out operation and the zoom in operation slow at one or more predetermined locations within the spreadsheet, wherein the one or more predetermined locations are automatically prepositioned within the spreadsheet based at least in part on an arrangement of data within the spreadsheet; and update a display of the spreadsheet.
  • 17. The computer storage medium of claim 16, wherein the indication to change the view of the spreadsheet is a gesture received on the touch input device.
  • 18. The computer storage medium of claim 17, wherein a different operation is performed based on the gesture.
  • 19. The computer storage medium of claim 16, the computer-executable instructions further causing the processor to: receive sensor information, wherein the sensor information comprises accelerometer data used to change the view of the spreadsheet.
  • 20. The computer storage medium of claim 19, wherein the accelerometer data is used to perform one or more of the zoom out operation and the zoom in operation.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation application of U.S. patent application Ser. No. 12/986,473, entitled “NATURAL INPUT FOR SPREADSHEET ACTIONS,” filed on Jan. 7, 2011, the entire disclosure of which is hereby incorporated herein by reference.

US Referenced Citations (322)
Number Name Date Kind
5095429 Harris et al. Mar 1992 A
5212788 Lomet et al. May 1993 A
5249296 Tanaka Sep 1993 A
5339392 Risberg et al. Aug 1994 A
5371675 Greif et al. Dec 1994 A
5403639 West et al. Apr 1995 A
5416895 Anderson et al. May 1995 A
5418902 West et al. May 1995 A
5423034 Cohen-Levy et al. Jun 1995 A
5452447 Nelson et al. Sep 1995 A
5455945 Vanderdrift Oct 1995 A
5555403 Cambot et al. Sep 1996 A
5581670 Bier et al. Dec 1996 A
5581760 Atkinson Dec 1996 A
5604854 Glassey Feb 1997 A
5613058 Kopplou et al. Mar 1997 A
5664127 Anderson Sep 1997 A
5669005 Curbow et al. Sep 1997 A
5694608 Shostak Dec 1997 A
5708827 Kaneko et al. Jan 1998 A
5717939 Bricklin et al. Feb 1998 A
5727161 Purcell, Jr. Mar 1998 A
5745714 Glass et al. Apr 1998 A
5819292 Hitz et al. Oct 1998 A
5848187 Bricklin et al. Dec 1998 A
5852439 Musgrove et al. Dec 1998 A
5883623 Cseri Mar 1999 A
5890174 Khanna et al. Mar 1999 A
5893123 Tuinenga Apr 1999 A
5893125 Shostak Apr 1999 A
5899988 Depledge et al. May 1999 A
5978818 Lin Nov 1999 A
5987481 Michelman et al. Nov 1999 A
6003012 Nick Dec 1999 A
6009455 Doyle Dec 1999 A
6023691 Bertrand et al. Feb 2000 A
6038639 O'Brien et al. Mar 2000 A
6097391 Wilcox Aug 2000 A
6157934 Khan et al. Dec 2000 A
6160549 Touma et al. Dec 2000 A
6199099 Gershman et al. Mar 2001 B1
6216138 Wells et al. Apr 2001 B1
6247008 Cambot et al. Jun 2001 B1
6249606 Kiraly et al. Jun 2001 B1
6256651 Tuli Jul 2001 B1
6269403 Anders Jul 2001 B1
6293334 Ghiani Sep 2001 B1
6298334 Burfield et al. Oct 2001 B1
6323846 Westerman et al. Nov 2001 B1
6360246 Begley et al. Mar 2002 B1
6411313 Conlon et al. Jun 2002 B1
6412070 Van Dyke et al. Jun 2002 B1
6460059 Wisniewski Oct 2002 B1
6484186 Rungta Nov 2002 B1
6490593 Proctor Dec 2002 B2
6501491 Brown et al. Dec 2002 B1
6507865 Hanson et al. Jan 2003 B1
6578027 Cambot et al. Jun 2003 B2
6592626 Bauchot et al. Jul 2003 B1
6613098 Sorge et al. Sep 2003 B1
6625603 Garg et al. Sep 2003 B1
6626959 Moise et al. Sep 2003 B1
6631497 Jamshidi Oct 2003 B1
6631498 McCauley et al. Oct 2003 B1
6632249 Pollock Oct 2003 B2
6633851 Engler et al. Oct 2003 B1
6651075 Kusters et al. Nov 2003 B1
6662341 Cooper et al. Dec 2003 B1
6691100 Alavi et al. Feb 2004 B1
6701485 Igra et al. Mar 2004 B1
6757867 Bauchot et al. Jun 2004 B2
6775675 Nwabueze et al. Aug 2004 B1
6801910 Bedell et al. Oct 2004 B1
6832351 Batres Dec 2004 B1
6892211 Hitz et al. May 2005 B2
6906717 Couckuyt et al. Jun 2005 B2
6961905 Cover et al. Nov 2005 B1
6988241 Guttman et al. Jan 2006 B1
6990632 Rothchiller et al. Jan 2006 B2
6993533 Barnes Jan 2006 B1
7013312 Bala et al. Mar 2006 B2
7015911 Shaughnessy et al. Mar 2006 B2
7017112 Collie et al. Mar 2006 B2
7031979 Kauffman Apr 2006 B2
7047380 Tormasov et al. May 2006 B2
7231657 Honarvar et al. Jun 2007 B2
7234107 Aoki et al. Jun 2007 B1
7441197 Tschiegg et al. Oct 2008 B2
7469381 Ording Dec 2008 B2
7519223 Dehlin et al. Apr 2009 B2
7580928 Wu et al. Aug 2009 B2
7584414 Mortensen Sep 2009 B2
7640496 Chaulk et al. Dec 2009 B1
7650644 Cheng et al. Jan 2010 B2
7657571 Battagin et al. Feb 2010 B2
7660843 Atkinson et al. Feb 2010 B1
7673340 Cohen et al. Mar 2010 B1
7676763 Rummel Mar 2010 B2
7680823 Garfinkle et al. Mar 2010 B2
7730425 de los Reyes Jun 2010 B2
7752536 Megiddo et al. Jul 2010 B2
7792847 Dickerman et al. Sep 2010 B2
7797621 Danner et al. Sep 2010 B1
7805437 Andersson et al. Sep 2010 B1
7908549 Khen et al. Mar 2011 B2
7949937 Wu et al. May 2011 B2
7990843 Laroia et al. Aug 2011 B2
8121975 Averbuch et al. Feb 2012 B2
8245156 Mouilleseaux Aug 2012 B2
8255789 Berger et al. Aug 2012 B2
8274534 Montague Sep 2012 B2
8279174 Jee Oct 2012 B2
8321781 Tolle Nov 2012 B2
8352423 Phillips et al. Jan 2013 B2
8381133 Iwema Feb 2013 B2
8392890 Miller Mar 2013 B2
8416217 Eriksson et al. Apr 2013 B1
8468444 Middlefart Jun 2013 B2
8549432 Warner Oct 2013 B2
8566953 Campbell et al. Oct 2013 B2
8601389 Schulz Dec 2013 B2
8719251 English et al. May 2014 B1
8817053 Robert Aug 2014 B2
8854433 Rafii Oct 2014 B1
8943142 Simon et al. Jan 2015 B1
9003298 Hoke et al. Apr 2015 B2
9053083 Waldman et al. Jun 2015 B2
9171099 Prish et al. Oct 2015 B2
10180714 Kin Jan 2019 B1
20010055013 Fuki Dec 2001 A1
20020010743 Ryan et al. Jan 2002 A1
20020015059 Clarke Feb 2002 A1
20020049778 Bell et al. Apr 2002 A1
20020065846 Ogawa et al. May 2002 A1
20020070953 Barg et al. Jun 2002 A1
20020077803 Kudoh et al. Jun 2002 A1
20020077842 Charisius et al. Jun 2002 A1
20020078086 Alden et al. Jun 2002 A1
20020099824 Bender et al. Jul 2002 A1
20020113822 Windl et al. Aug 2002 A1
20020129054 Ferguson et al. Sep 2002 A1
20020143780 Gorman Oct 2002 A1
20020158887 Samra et al. Oct 2002 A1
20020184131 Gatto Dec 2002 A1
20030011638 Chung Jan 2003 A1
20030016247 Lai et al. Jan 2003 A1
20030018644 Bala et al. Jan 2003 A1
20030033329 Bergman et al. Feb 2003 A1
20030044762 Bergan et al. Mar 2003 A1
20030051209 Androski et al. Mar 2003 A1
20030061305 Copley et al. Mar 2003 A1
20030065638 Robert Apr 2003 A1
20030066030 Curns et al. Apr 2003 A1
20030071814 Jou et al. Apr 2003 A1
20030088586 Fitzpatrick et al. May 2003 A1
20030105765 Smith et al. Jun 2003 A1
20030120999 Miller et al. Jun 2003 A1
20030164817 Graham et al. Sep 2003 A1
20030169295 Becerra, Jr. Sep 2003 A1
20030212960 Shaughnessy et al. Nov 2003 A1
20030226105 Waldau Dec 2003 A1
20030233257 Matian et al. Dec 2003 A1
20040003353 Rivera et al. Jan 2004 A1
20040006539 Royer et al. Jan 2004 A1
20040015783 Lennon et al. Jan 2004 A1
20040049465 Engler et al. Mar 2004 A1
20040060001 Coffen Mar 2004 A1
20040064449 Ripley et al. Apr 2004 A1
20040100501 Dornback May 2004 A1
20040103366 Peyton-Jones et al. May 2004 A1
20040117731 Blyashov Jun 2004 A1
20040119727 Dietz et al. Jun 2004 A1
20040125130 Flamini et al. Jul 2004 A1
20040128147 Vallinayagam et al. Jul 2004 A1
20040143788 Aureglia et al. Jul 2004 A1
20040168115 Bauernschmidt et al. Aug 2004 A1
20040174397 Cereghini et al. Sep 2004 A1
20040181748 Jamshidi et al. Sep 2004 A1
20040199867 Brandenborg Oct 2004 A1
20040205595 DelGobbo et al. Oct 2004 A1
20040205638 Thomas et al. Oct 2004 A1
20040221233 Thielen Nov 2004 A1
20040260673 Hitz et al. Dec 2004 A1
20040268364 Faraj Dec 2004 A1
20050039114 Naimat et al. Feb 2005 A1
20050044496 Kotler et al. Feb 2005 A1
20050049906 Leymann et al. Mar 2005 A1
20050068290 Jaeger Mar 2005 A1
20050097146 Konstantinou et al. May 2005 A1
20050102608 Batres May 2005 A1
20050108052 Omaboe May 2005 A1
20050114661 Cheng et al. May 2005 A1
20050144554 Salmon et al. Jun 2005 A1
20050165829 Varasano Jul 2005 A1
20050166159 Mondry et al. Jul 2005 A1
20050210389 Middelfart Sep 2005 A1
20050240985 Alkove et al. Oct 2005 A1
20050268215 Battagin et al. Dec 2005 A1
20050275622 Patel et al. Dec 2005 A1
20050278647 Leavitt Dec 2005 A1
20050289136 Wu et al. Dec 2005 A1
20060013462 Sadikali Jan 2006 A1
20060055662 Rimas-Ribikauskas Mar 2006 A1
20060069696 Becker et al. Mar 2006 A1
20060168536 Portmann Jul 2006 A1
20060233257 Keith et al. Oct 2006 A1
20060265641 Garfinkle et al. Nov 2006 A1
20060288267 DeSpain Dec 2006 A1
20070028159 Ying et al. Feb 2007 A1
20070050416 Battagin et al. Mar 2007 A1
20070061669 Major et al. Mar 2007 A1
20070061698 Megiddo et al. Mar 2007 A1
20070061699 Battagin et al. Mar 2007 A1
20070130517 Wu Jun 2007 A1
20070136653 Khen et al. Jun 2007 A1
20070143715 Hollins et al. Jun 2007 A1
20070176898 Suh Aug 2007 A1
20070233811 Rochelle et al. Oct 2007 A1
20070260585 Bodine et al. Nov 2007 A1
20070266342 Chang et al. Nov 2007 A1
20080005678 Buttner et al. Jan 2008 A1
20080010670 Campbell et al. Jan 2008 A1
20080036743 Westerman Feb 2008 A1
20080046462 Kaufman et al. Feb 2008 A1
20080046803 Beauchamp et al. Feb 2008 A1
20080195930 Tolle Aug 2008 A1
20080204476 Montague Aug 2008 A1
20080235352 Yolleck et al. Sep 2008 A1
20080270886 Gossweiler et al. Oct 2008 A1
20080271127 Naibo et al. Oct 2008 A1
20080271227 Mollo Nov 2008 A1
20080294751 Dreiling Nov 2008 A1
20080307385 Dreiling Dec 2008 A1
20090019063 Gandhi et al. Jan 2009 A1
20090083619 Davis Mar 2009 A1
20090100345 Miller Apr 2009 A1
20090100360 Janzen et al. Apr 2009 A1
20090109187 Noma Apr 2009 A1
20090158190 Higginson Jun 2009 A1
20090198566 Greenberg Aug 2009 A1
20090198683 Robertson et al. Aug 2009 A1
20090217147 Thomsen Aug 2009 A1
20090254572 Redlich et al. Oct 2009 A1
20090271735 Anderson et al. Oct 2009 A1
20090300544 Psenka et al. Dec 2009 A1
20090307623 Agarawala et al. Dec 2009 A1
20090307762 Cudd, Jr. Dec 2009 A1
20090309849 Iwema Dec 2009 A1
20090313268 Folting et al. Dec 2009 A1
20090327964 Mouilleseaux Dec 2009 A1
20090328010 Cao Dec 2009 A1
20100026649 Shimizu Feb 2010 A1
20100031152 Villaron et al. Feb 2010 A1
20100031167 Roytman Feb 2010 A1
20100077344 Gaffney Mar 2010 A1
20100094658 Mok et al. Apr 2010 A1
20100100854 Russell et al. Apr 2010 A1
20100131529 Kasera et al. May 2010 A1
20100136957 Horodezky et al. Jun 2010 A1
20100192103 Cragun Jul 2010 A1
20100214322 Lim et al. Aug 2010 A1
20100229090 Newton Sep 2010 A1
20100235794 Ording Sep 2010 A1
20100262900 Romatier et al. Oct 2010 A1
20100306702 Warner Dec 2010 A1
20100318890 Billharz Dec 2010 A1
20100325526 Ellis et al. Dec 2010 A1
20100333044 Kethireddy Dec 2010 A1
20110041087 Leveille et al. Feb 2011 A1
20110074699 Marr et al. Mar 2011 A1
20110087954 Dickerman et al. Apr 2011 A1
20110145299 Zhou Jun 2011 A1
20110145689 Campbell et al. Jun 2011 A1
20110154268 Trent, Jr. Jun 2011 A1
20110163968 Hogan Jul 2011 A1
20110164058 Lemay Jul 2011 A1
20110283176 Zulian Nov 2011 A1
20110320563 Seo Dec 2011 A1
20120011195 Prish Jan 2012 A1
20120013539 Hogan et al. Jan 2012 A1
20120013540 Hogan Jan 2012 A1
20120023449 Zabielski Jan 2012 A1
20120030567 Victor Feb 2012 A1
20120072820 Weinman, Jr. Mar 2012 A1
20120173963 Hoke et al. Jul 2012 A1
20120180002 Campbell et al. Jul 2012 A1
20120198374 Zhang et al. Aug 2012 A1
20120212438 Vaisanen Aug 2012 A1
20120221933 Heiney et al. Aug 2012 A1
20120226967 Oh Sep 2012 A1
20120254782 Van Ieperen et al. Oct 2012 A1
20120254783 Pourshahid et al. Oct 2012 A1
20120272192 Grossman et al. Oct 2012 A1
20120330995 Muenkel Dec 2012 A1
20130013993 Oh Jan 2013 A1
20130061122 Sethi et al. Mar 2013 A1
20130111320 Campbell et al. May 2013 A1
20130117651 Waldman et al. May 2013 A1
20130117653 Sukhanov et al. May 2013 A1
20130159832 Ingargiola et al. Jun 2013 A1
20130159833 Look et al. Jun 2013 A1
20130174025 Lee et al. Jul 2013 A1
20130198323 Prish et al. Aug 2013 A1
20130212470 Karunamuni et al. Aug 2013 A1
20130212541 Dolenc Aug 2013 A1
20130229373 Eriksson et al. Sep 2013 A1
20130321283 Mak Dec 2013 A1
20130321285 Hoyer Dec 2013 A1
20130321308 Lee Dec 2013 A1
20130339903 Cheng et al. Dec 2013 A1
20140019842 Montagna et al. Jan 2014 A1
20140032575 Kiang et al. Jan 2014 A1
20140033093 Brauninger et al. Jan 2014 A1
20140194162 Tsudik Jul 2014 A1
20140310649 Berstein et al. Oct 2014 A1
20140358733 Achuthan et al. Dec 2014 A1
20140372856 Radakovitz et al. Dec 2014 A1
20140372858 Campbell et al. Dec 2014 A1
20140372932 Rutherford et al. Dec 2014 A1
20150161095 Wang et al. Jun 2015 A1
20150347372 Waldman et al. Dec 2015 A1
20160041964 Prish et al. Feb 2016 A1
Foreign Referenced Citations (97)
Number Date Country
2006291313 Jul 2011 AU
2618224 Mar 2007 CA
2616563 May 2015 CA
102016795 Jun 1989 CN
1578949 Feb 2005 CN
1655120 Aug 2005 CN
1755679 Apr 2006 CN
1790325 Jun 2006 CN
1877505 Dec 2006 CN
1904879 Jan 2007 CN
101013954 Aug 2007 CN
101258485 Sep 2008 CN
101263453 Sep 2008 CN
101300564 Nov 2008 CN
101326520 Dec 2008 CN
101371255 Feb 2009 CN
101404009 Apr 2009 CN
101553812 Oct 2009 CN
101882108 Nov 2010 CN
101983388 Mar 2011 CN
102334098 Jan 2012 CN
103034708 Apr 2013 CN
103049476 Apr 2013 CN
0798655 Oct 1997 EP
0990972 Apr 2000 EP
1037157 Sep 2000 EP
1367514 Dec 2003 EP
1603053 Dec 2005 EP
1922939 May 2008 EP
1979804 Oct 2008 EP
2354851 Apr 2001 GB
H01165795 Jun 1989 JP
03-268185 Nov 1991 JP
04003477 Jan 1992 JP
06-028349 Feb 1994 JP
H06139261 May 1994 JP
07-334696 Dec 1995 JP
8-500200 Jan 1996 JP
H1027089 Jan 1998 JP
H1069480 Mar 1998 JP
10-508403 Aug 1998 JP
H10214264 Aug 1998 JP
H11-143606 May 1999 JP
2000056888 Feb 2000 JP
2001092444 Apr 2001 JP
2001109741 Apr 2001 JP
2001312442 Nov 2001 JP
2001357343 Dec 2001 JP
2002-140159 May 2002 JP
2002189595 Jul 2002 JP
2003050964 Feb 2003 JP
2003108440 Apr 2003 JP
2003281128 Oct 2003 JP
2003533755 Nov 2003 JP
2004145713 May 2004 JP
2006-048110 Feb 2006 JP
2006107444 Apr 2006 JP
2007511002 Apr 2007 JP
2007141190 Jun 2007 JP
2008-059010 Mar 2008 JP
2008123199 May 2008 JP
2009-508237 Feb 2009 JP
2010-152801 Jul 2010 JP
2010524095 Jul 2010 JP
2010170573 Aug 2010 JP
2014501996 Jan 2014 JP
20060046307 May 2006 KR
20060106640 Oct 2006 KR
20070013739 Jan 2007 KR
10-2009-0007365 Jan 2009 KR
10-2009-0013551 Feb 2009 KR
1020090017517 Feb 2009 KR
1020090116591 Nov 2009 KR
1020100096424 Sep 2010 KR
10-2011-0139649 Dec 2011 KR
278251 Aug 2010 MX
2231117 Jun 2004 RU
2250492 Apr 2005 RU
200411T770 May 2005 RU
2383923 Mar 2010 RU
2390834 May 2010 RU
2419853 May 2011 RU
2433449 Oct 2011 RU
2439683 Jan 2012 RU
117587 Dec 2005 SG
536673 Jun 2003 TW
200424889 Nov 2004 TW
TA 1416342 Nov 2013 TW
WO 9707454 Feb 1997 WO
WO 200072197 Nov 2000 WO
WO 0146868 Jun 2001 WO
WO 200203595 Jan 2002 WO
WO 2002084531 Oct 2002 WO
WO 2007032907 Mar 2007 WO
WO 2007061057 May 2007 WO
WO 2010065664 Jun 2010 WO
WO 2010071630 Jun 2010 WO
Non-Patent Literature Citations (325)
Entry
“Remote Control for Zoomable UI on TV”, An IP.com Prior Art Database Technical Disclosure, Authors et al.: Disclosed Anonymously, IP.com No. IPCOM000159392D.
“Visualization Exploration and Encapsulation via a Spreadsheet-Like Interface”, T.J. Jankun-Kelly and Kwan-Liu Ma, Member, IEEE, Jul. 2001.
European Search Report Issued in Patent Application No. 12732029.9, dated Dec. 19, 2017, 13 Pages.
Shimpi, et al., “Apple's iPad—the AnandTech Review—AnandTech Your Source for Hardware Analysis and News”, Retrieved From <<https://www.anandtech.com/show/3640/apples-ipad-the-anandtech-review/4>>, Apr. 7, 2010, 6 Pages.
Socubeliveg, et al., “Apple iPad Numbers Tutorial—how to use Numbers on ipad”, Retrieved From <<https://www.youtube.com/watch?v=p0aOZWK12fM>>, Aug. 31, 2010, 1 Page.
PCT International Search Report, dated Jun. 27, 2013, Application No. PCT/US2013/027547, Filed Date: Feb. 25, 2013, pp. 10.
“Touch”, Retrieved at <<http://msdn.microsoft.com/en-us/library/windows/desktop/cc872774.aspx>>, Retrieved Date: Feb. 1, 2012, pp. 18.
Office Action received for U.S. Appl. No. 12/887,003, dated Apr. 4, 2014, 25 pages.
U.S. Appl. No. 13/418,489, Office Action dated Feb. 13, 2014, 12 pages.
U.S. Appl. No. 13/418,489, Office Action dated Aug. 27, 2014, 16 pages.
U.S. Appl. No. 13/418,489, Notice of Allowance dated Dec. 9, 2014, 10 pages.
“Cologo: A Collaborative Web-based Programming Environment”, Published on: Sep. 5, 2011, Available at: http://www.cologo-lang.org/docs_starting.html.
“WP01: WebSphere MQ Workflow-Performance Estimates and Capacity Assessments”, http://www.1.ibm.com/support/docview.wss?rs=171&uid=swg24006573&loc=enUS&cs=utf-8&lang=en, 2 pgs.
“Data Warehouse Trend, Part 2 OLAP is enabled on WWW Browser, formulation/operation of data warehouse becomes easy and enlargement of user target is accelerated”; Nikkei Computer, No. 440, pp. 224-227; Nikkei Business Publications, Inc., Japan, Mar. 30, 1998 (cited in Jan. 31, 2012 JP NOR).
“How To: Save a Workbook as a Web Page in Excel 2002; Summary,” Retrieved from the Internet: http://support.microsoft.com/default.aspx?scid=kb;en-us;289260, Retrieved on Dec. 12, 2005, 3 pgs.
“Object Lens: A Spreadsheet for Cooperative Work; Abstract,” by Kum-Yew Lai, et al., Sep. 1988, Retrieved from the Internet: https://hpds1.mit.edu/bitstream/1721.1/2210/1/SWP-2053-21290214.pdf, Retrieved on Dec. 12, 2005, 42 pgs.
“Welcome to Gnumeric!” Retrieved from the Internet: http://www.gnome.org/projects/gnumeric/, Retrieved on Dec. 12, 2005, 2 pgs.
“XESS the Advanced X Windows Spreadsheet System,” Retrieved from the Internet: http://www.ais.com/Xess/xess5_product_sheet.html, Retrieved on Dec. 12, 2005, 3 pgs.
Andrews et al., “Liquid Diagrams: Information Visualization Gadgets”; Information Visualization (IV), 2010 14th International Conference, IEEE, Jul. 26, 2010; pp. 104-109 (cited in Jul. 16, 2015 ISR).
Australian Examination Report dated May 29, 2007, cited in Appln No. SG 200503164-6 dated May 29, 2007, 6 pages.
Australian Examination Report dated Oct. 22, 2009, cited in Appln No. 2006284595, 3 pages.
Australian Examination Report dated Oct. 26, 2010 cited in Appln No. 2006291313, 2 pgs.
Australian Examination Report dated Oct. 29, 2010 cited in Appln No. 2006287357, 2 pgs.
Australian Office Action in Application 2012204477, dated Apr. 24, 2016, 3 pgs.
Australian Office Action Issued in Australian Patent Application No. 2012204477, dated Aug. 23, 2016, 3 Pages.
Author Unknown, “Trade Like a Geek—One Click Stock Quotes in Excel—Learn How to Learn Excel”; 2009 Pointy Haired Dilbert—Chandoo.org; 6 pgs. (cited in Sep. 22, 2015 EP ISR).
Author Unknown, “Use Online Data in Excel 2010 Spreadsheets—How to Geek”; Jan. 6, 2012; Retrieved from http://web.archive.org/web/201020106083121/http://howtogeek.com/howto/24285/use-online-data-in-excel-2010-spreadsheets; 6 pgs. (cited in Sep. 22, 2015 EP ISR).
Author Unknown, Using GoogleFinance to Track Stocks on the Australian Securities Exchange (ASX); Ben's Blog; Sep. 18, 2010; 2 pgs. (cited in Sep. 22, 2015 EP ISR).
Author Unknown, About Dynamic Data Exchange—Published Date: Sep. 6, 2011, 5 pgs; http://msdn.microsoft.com/en-us/library/windows/desktop/ms648774%28v=vs.85%29.aspx Dynamic Data Exchange Protocol.
Battagin, Dan, Using Excel Web Services in a SharePoint Web Part—Published Date: Nov. ##, 2006, 8 pgs; http://msdn.microsoft.com/en-us/library/aa973804%28v=office.12%29.aspx.
Blattner et al., “Special Edition Using Microsoft Excel 2000”, May 3, 1999, Que, pp. 1-13 (cited in Oct. 29, 2008 OA).
Blattner, “Special Edition Using Microsoft Excel 2003”; Que, published Sep. 11, 2003, pp. 16, 47-51, 350-369 and 445-447, 30 pgs.
Brain Matter [Online], AlphaBlox, Jul. 22, 2001 [Retrieved on Sep. 7, 2006]. Retrieved from <URL:http://web.archive.org/web/20010818124342/www.blox.com/products?subsection=spreadhseets>, 1 pg.
Brain Matter [Online], AlphaBlox, Apr. 5, 2001 [Retrieved on Sep. 7, 2006]. Retrieved from <URL:http://web.archive.org/ web/20010405152714/www.blox.com/?id=sheet>.
Canadian Notice of Allowance dated Apr. 3, 2014 in Appln No. 2,618,224, 2 pgs.
Canadian Office Action dated Dec. 17, 2012 in Appln No. 2,618,211, 2 pages.
Canadian Office Action dated May 13, 2013 in Appln No. 2,618,224, 4 pages.
Chilean Office Action cited in Appln No. 1155-2005 dated Jan. 16, 2008, 10 pgs.
Chilean Second Office Action cited in Appln No. 1155-2005 dated Jun. 23, 2009, 8 pgs.
Chilean Third Office Action cited in Appln No. 1155-2005 dated Jun. 8, 2010, 11 pgs.
Chinese 1st Office Action in Application 201210044546.8, dated Feb. 24, 2016, 12 pgs.
Chinese 2nd Office Action in Application 201210044546.8, dated Aug. 2, 2016, 13 pgs.
Chinese 4th Office Action in Application 201210434821.7, dated Jun. 22, 2016, 13 pgs.
Chinese Notice of Grant dated Nov. 27, 2015 in Appln No. 201210012142.0, 4 pgs.
Chinese Office Action and Search Report Issued in Patent Application No. 201210434821.7, dated Oct. 27, 2014, 13 Pages.
Chinese Office Action and Search Report Issued in Patent Application No. 201380007011.6, dated Feb. 3, 2016, 12 Pages.
Chinese Office Action cited in Appln No. 200510075819.5 , dated Dec. 14, 2007, 17 pgs.
Chinese Office Action dated Jan. 6, 2014 in Appln No. 201210012142.0, 10 pgs.
Chinese Office Action dated Jul. 24, 2009, cited in Appln No. 200680031441.1, 11 pgs.
Chinese Office Action dated May 22, 2009, cited in Appln No. 200680032787.3.
Chinese Second Office Action cited in Appln No. 200510075819.5 dated May 30, 2008, 18 pgs.
Chinese Second Office Action dated Feb. 5, 2010, cited in Appln No. 200680031441.1, 7 pgs.
Chinese Second Office Action dated Nov. 13, 2009, cited in Appln No. 200680032787.3, 5 pgs.
Chinese Second Office Action Issued in Patent Application No. 201210012142.0, dated Nov. 4, 2014, 6 Pages.
Chinese Second Office Action Issued in Patent Application No. 201210434821.7, dated Jun. 19, 2015, 8 Pages.
Chinese Third Office Action and Search Report Issued in Patent Application No. 201210012142.0, dated May 18, 2015, 13 Pages.
Chinese Third Office Action cited in Appln No. 200510075819.5 dated Nov. 7, 2008, 8 pgs.
Chinese Third Office Action Issued in Patent Application No. 201210434821.7, dated Dec. 17, 2015, 10 Pages.
Curie, D., “The Medium is the Message: Data Downlink's.xls Lets Number Stay Numbers,” Online, Nov.-Dec. 1997, vol. 21, No. 6, p. 64, 66.
Dodge, et al., Microsoft Office Excel 2003; Official Manual, the first edition, Nikkei BP Soft Press, Inc., Jul. 12, 2004, pp. 129-135 and 387-392 (cited in Feb. 3, 2013 JP NOR).
Dovico™ Software, “Investing in Better Time & Expense Gathering”, Jun. 2005, http://www.dovico.com/documents/Investing-in-Better-Time-Expense-Gathering.pdf; 11 pgs.
Entology, “Large Diversified Manufacturer Achieves Sarbanes-Oxley Compliance through Financial Document Management”, http://www.entology.com/press/cs/cs_029.htm, 2003, 2 pgs.
European Communication in Application 14736166.1, dated Jan. 22, 2016, 2 pgs.
European Extended Search Report dated Jul. 16, 2015 in Appln No. PCT/US2012/063133, 9 pgs.
European Extended Search Report dated Sep. 22, 2015 in Appln No. PCT/US2013/022824, 8 pgs. (also known as EP 13741052.8).
European Office Action in Application 06790149.6, dated May 17, 2016, 6 pgs.
European Search Report dated Mar. 6, 2012 in Appl No. 06790149.6, 12 pgs. (also known as European Search Report dated Feb. 27, 2012 in Appl No. PCT/US2006/034312).
European Search Report dated Mar. 6, 2012 in Appl No. PCT/US2006/034312; 12 pgs.
Fox, Pamela, How to Convert a Google Spreadsheet into JSON, XML, and MySQL—Published Date: May 17, 2009; 7 pgs; http://blog.pamelafox.org/2009/05/how-to-convert-google-spreadsheet-into.html.
Google.com; “Getting Started with Spreadsheets Gadgets”, accessed Oct. 20, 2011, at: http://code.google.com/apis/spreadsheets/gadgets/; 8 pgs.
Granet, V., “The Xxl Spreadsheet Project”; Linux Journal, 1999, http://www.linuxjournal.com/article/3186; downloaded Sep. 21, 2005; 10 pgs.
Hudson, S.E., “User interface specification using an enhanced spreadsheet model,” ACM Transactions on Graphics, 1994, 13(3), 209-239.
Huynh, D.; “Timeline Gadget for Google Spreadsheets”, Retrieved on: Sep. 20, 2011, Available at: http://s3.amazonaws.com/iac-production/attachments/28/TimeLine_Gadget_for_Google_Spreadsheets.pdf.
India First Examination Report dated Jan. 22, 2014 cited in 1286/DEL/2005, 2 pgs.
India First Examination Report dated Oct. 9, 2015 cited in 1575/DELNP/2008, 3 pgs.
India First Examination Report dated Sep. 30, 2015 cited in 1943/DELNP/2008, 3 pgs.
Indian Exam Report in Application 1981/DELNP/2008, dated Apr. 18, 2016, 7 pgs.
International Search Report dated Jan. 8, 2007, issued in PCT Application No. PCT/US2006/033800, 2 pgs.
International Preliminary Report on Patentability Issued for PCT Patent Application No. PCT/US2014/041258, dated Aug. 31, 2015, 8 Pages.
International Search Report and Written Opinion for International Application No. PCT/US2012/026672, dated Oct. 25, 2012, 11 pgs.
International Search Report and Written Opinion Issued for PCT Patent Application No. PCT/US2014/041258, dated Feb. 18, 2015, 10 Pages.
International Search Report and Written Opinion Issued for PCT Patent Application No. PCT/US2014/041258, dated Jun. 12, 2015, 7 Pages.
International Search Report dated Aug. 21, 2007, issued in EP 05104560; 3 pgs.
International Search Report dated Mar. 22, 2013, issued in PCT/US2012/063133, 9 pages.
Israeli Office Action cited in Appln No. 168621 dated Sep. 22, 2009, 2 pgs.
IWork for iOS—Numbers—Innovative spreadsheets in a few taps, Retrieved on: Apr. 26, 2013, Available at: https://movies.apple.com/ca/apps/iwork/numbers/, 9 pgs.
IWork—Numbers—Create perfect spreadsheets in minutes., Retrieved on: Apr. 26, 2013, Available at: http://www.apple.com/in/iwork/numbers/#spreadsheet, 3 pgs.
Japanese Final Notice of Rejection dated Dec. 7, 2012 in Appln No. 2008-530095, 4 pages.
Japanese Notice of Rejection cited in Appln No. 2008-530095 dated Jan. 31, 2012, 6 pgs.
Japanese Notice of Rejection dated Apr. 22, 2011 cited in Appln No. 2008-529328, 7 pgs.
Japanese Notice of Rejection dated Feb. 3, 2012 cited in Appln No. 2008-530243; 9 pgs.
Japanese Notice of Rejection in Appln No. 2005-161206 dated Oct. 22, 2010.
Japanese Notice of Rejection Issued in Patent Application No. 2013-548479, dated Jan. 26, 2016, 9 Pages.
Jones, S.P., “A user-centered approach to functions in Excel,” JCEP, 2003, 165-176.
Khor, “Microsoft Office Excel 2003 Preview”, Jun. 2003, Microsoft Office Excel 2003 Preview, Microsoft Excel 2003 Technical Articles, Microsoft Corporation Publishing (cited in Nov. 26, 2008 OA).
Korean Preliminary Rejection dated Jan. 13, 2013 in Appln No. 10-2008-7004303; 13 pgs.
Levin, Carol, “Skinny Clients to Rule on Web—Corporate Intranets Will Fuel a New Breed of Applications,” PC Magazine, Mar. 26, 1996, vol. 15, No. 6, p. 37.
Loney et al., “An Overview of Databases and Instances”; In: Oracle Database 10g DBA Handbook, Mar. 24, 2005; Oracle Press; 2 pgs. (cited in Feb. 27, 2012 EP Search Report).
Loney et al., “Dynamic Data Replication”; In: Oracle Database 10g DBA Handbook, Mar. 24, 2005; Oracle Press; 2 pgs. (cited in Feb. 27, 2012 EP Search Report).
Loney et al., “Oracle Logical Database Structures”; In: Oracle Database 10g DBA Handbook, Mar. 24, 2005; Oracle Press; 16 pgs. (cited in Feb. 27, 2012 EP Search Report).
Loney et al., “Overview of Oracle Net”; In: Oracle Database 10g DBA Handbook, Mar. 24, 2005; Oracle Press; 12 pgs. (cited in Feb. 27, 2012 EP Search Report).
Malaysia Modified Substantive Examination Report dated Feb. 13, 2015 in Appln No. PI 20080396, 2 pgs.
Malaysia Substantive Examination Report dated Jan. 15, 2014 in Appln No. PI 20080503, 3 pgs.
Malaysia Substantive Examination Report dated Jul. 15, 2015 in Appln No. PI 20080503, 2 pgs.
Malaysian Modified Substantive Examination Report dated Aug. 15, 2014 in Appln No. PI 20080396, 2 pgs.
Malaysian Notice of Allowance dated Jun. 13, 2014 in Appln No. PI 20080500, 2 pgs.
Malaysian Office Action cited in Appln No. PI 20052416, dated Sep. 15, 2010, 3 pgs.
McManus, Sean, “Excel Everywhere for HTML: Transform Static Excel Spreadsheets Into Smart Interactive Web Pages,” Internet Magazine, Mar. 2004, No. 115, p. 106.
McPherson, Bruce, Serializing Excel Data for Input to any Google Visualization—Published Date: Jan. 26, 2011; 6 pgs; http://www.eggheadcafe.com/tutorials/excel/571d84dc-9fcf-44de-b2ad-005c12372ab3/serializing-excel-data-for-input-to-any-google-visualization.aspx.
Mexican Second Office Action cited in Appl No. PA/a/2005/005855, dated Nov. 18, 2009, 2 pgs.
Mexican Office Action cited in Appl No. MX/a/2008/003318, dated Aug. 17, 2010, 3 pgs.
Mexican Office Action dated Apr. 27, 2012, cited in Appl No. MX/a/2008/003318, 3 pages.
Mexican Office Action dated Dec. 8, 2010, cited in Appl No. MX/a/2008/003318; 8 pgs.
Mexican Office Action dated Feb. 11, 2013 in Appln No. MX/a/2008/003309, 7 pages.
Mexican Office Action dated Jul. 14, 2010, cited in Appl No. MX/a/2008/002501; 4 pgs.
Mexican Office Action dated May 6, 2013, cited in Appl No. MX/a/2008/003318; 15 pgs.
Mexican Office Action dated Oct. 4, 2011 in Appln No. MX/a/2008/003309.
New Zealand Examination Report cited in Appln No. 540420 dated Jun. 7, 2005, 2 pgs.
New Zealand Examination Report dated Jan. 20, 2011 cited in Appln No. 566309, 2 pages.
New Zealand Examination Report dated May 5, 2011 cited in Appln No. 566309, 3 pages.
New Zealand Examination Report dated Sep. 9, 2011 cited in Appln No. 594997, 2 pgs.
New Zealand Further Examination Report dated Dec. 14, 2012 cited in Appln No. 594997, 2 pgs.
Norwegian Office Action and Search Report in Application 20080596, dated Oct. 11, 2016, 5 pgs.
Notice of Allowance dated Feb. 6, 2015, issued in U.S. Appl. No. 13/289,663, 57 pgs.
Notice of Allowance dated Jun. 25, 2015, issued in U.S. Appl. No. 13/570,071, 7 pgs.
Office Action dated Apr. 20, 2012, issued in U.S. Appl. No. 11/860,394, 20 pages.
Office Action dated Apr. 22, 2013, issued in U.S. Appl. No. 11/223,541, 33 pages.
Office Action dated Apr. 23, 2009, issued in U.S. Appl. No. 11/223,180, 18 pages.
Office Action dated Apr. 30, 2012, issued in U.S. Appl. No. 11/223,541; 35 pgs.
Office Action dated Apr. 7, 2009, issued in U.S. Appl. No. 10/858,188, 25 pgs.
Office Action dated Dec. 20, 2013, issued in U.S. Appl. No. 11/223,541, 54 pgs.
Office Action dated Dec. 24, 2014, issued in U.S. Appl. No. 12/986,473, 28 pgs.
Office Action dated Dec. 8, 2010, issued in U.S. Appl. No. 11/860,394, 16 pages.
Office Action dated Feb. 13, 2013, issued in U.S. Appl. No. 13/289,663, 15 pages.
Office Action dated Feb. 20, 2008, issued in U.S. Appl. No. 10/858,188, 19 pgs.
Office Action dated Jan. 20, 2010, issued in U.S. Appl. No. 11/223,541, 14 pages.
Office Action dated Jan. 8, 2008, issued in U.S. Appl. No. 11/214,676, 13 pages.
Office Action dated Jul. 2, 2008, issued in U.S. Appl. No. 11/214,676, 13 pages.
Office Action dated Jul. 6, 2007, issued in U.S. Appl. No. 10/858,188.
Office Action dated Jul. 7, 2014, issued in U.S. Appl. No. 11/223,541, 26 pgs.
Office Action dated Jun. 1, 2009, issued in U.S. Appl. No. 11/214,676, 12 pages.
Office Action dated Jun. 17, 2011, issued in U.S. Appl. No. 11/860,394, 17 pages.
Office Action dated Jun. 23, 2011, issued in U.S. Appl. No. 11/223,541, 23 pages.
Office Action dated Mar. 2, 2010, issued in U.S. Appl. No. 11/298,380, 12 pages.
Office Action dated Mar. 2, 2011, issued in U.S. Appl. No. 11/223,541, 23 pages.
Office Action dated Mar. 24, 2014, issued in U.S. Appl. No. 13/570,071, 13 pgs.
Office Action dated Mar. 26, 2015 issued in U.S. Appl. No. 13/035,689, 24 pgs.
Office Action dated Mar. 30, 2009, issued in U.S. Appl. No. 11/298,380, 14 pages.
Office Action dated May 15, 2006, issued in U.S. Appl. No. 10/858,188, 19 pgs.
Office Action dated May 22, 2014, issued in U.S. Appl. No. 13/035,689, 60 pgs.
Office Action dated May 26, 2009, issued in U.S. Appl. No. 11/223,541, 13 pages.
Office Action dated May 8, 2014, issued in U.S. Appl. No. 12/986,473, 75 pgs.
Office Action dated Nov. 19, 2008, issued in U.S. Appl. No. 11/223,541, 7 pages.
Office Action dated Nov. 19, 2014, issued in U.S. Appl. No. 13/035,689, 31 pgs.
Office Action dated Nov. 21, 2006, issued in U.S. Appl. No. 10/858,188, 19 pgs.
Office Action dated Nov. 21, 2008, issued in U.S. Appl. No. 11/214,676, 12 pages.
Office Action dated Nov. 26, 2008, issued in U.S. Appl. No. 11/223,180, 15 pages.
Office Action dated Nov. 6, 2013, issued in U.S. Appl. No. 13/289,663, 15 pages.
Office Action dated Nov. 8, 2011, issued in U.S. Appl. No. 11/860,394, 20 pgs.
Office Action dated Oct. 29, 2008, issued in U.S. Appl. No. 10/858,188, 26 pgs.
Office Action dated Oct. 7, 2011, issued in U.S. Appl. No. 11/223,541, 27 pages.
Office Action dated Oct. 8, 2014, issued in U.S. Appl. No. 13/570,071, 14 pgs.
Office Action dated Sep. 1, 2009, issued in U.S. Appl. No. 11/223,180, 16 pages.
Office Action dated Sep. 14, 2009, issued in U.S. Appl. No. 11/298,380, 15 pages.
Office Action dated Sep. 15, 2010, issued in U.S. Appl. No. 11/223,541, 21 pages.
Office Action dated Sep. 23, 2015 issued in U.S. Appl. No. 13/035,689, 38 pgs.
Office Action dated Sep. 24, 2015, issued in U.S. Appl. No. 12/986,473, 26 pgs.
Office Action dated Sep. 25, 2012, issued in U.S. Appl. No. 11/223,541, 35 pages.
Oliver, Andrew C. and Barozzi, Nicola Ken, POI-HSSF and POI-XSSF—Java API to Access Microsoft Excel Format Files—Retrieved Date: Jan. 11, 2012; 2 pgs; http://poi.apache.org/spreadsheet/.
PCT Application PCT/US2013/022824, International Search Report dated May 30, 2013, 10 pages.
PCT International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2013/022824, dated Aug. 7, 2014, 6 Pages.
PCT International Search Report dated Jan. 8, 2007, issued in PCT Application No. PCT/US2006/033800; 2 pgs.
PCT Search Report dated Jan. 9, 2007 in Appln No. PCT/US2006/034312, 4 pages.
PCT Search Report dated Oct. 25, 2012 in Appln No. PCT/US2012/026672.
Pemberton et al., XHTML 1.0: The Extensible Hypertext Markup Language, A Reformulation of HTML 4.0 in XML 1.0, W3C Working Draft, May 5, 1999, http://www.w3.org/TR/1999/xhtml1-19990505/, 16 pgs.
Person, R.; “Creating Charts”, Special Edition Using Microsoft Excel '97; © 1997 Que Corp., pp. 385-410 (cited in Apr. 22, 2013 OA).
Powell, Jim, “Add-Ins Turn App Docs Into Web Pages: Microsoft Internet Assistant Tools,” Windows Magazine, Jun. 1, 1996, vol. 7, No. 6, p. 120.
Quixa, Builder/Quixa Solutions, http://www.guixa.com/ultimus/builder.asp, 2005, 4 pgs.
Russian Notice of Allowance in Application 2013131022, dated Jul. 22, 2016, pgs.
Russian Office Action cited in Appln No. 2005116667 dated Apr. 24, 2009, 4 pgs.
Russian Office Action dated Dec. 18, 2015 in Appln No. 2013131022 2412-197167, 5 pgs.
Russian Office Action dated Jul. 29, 2010, cited in Appln No. 2008107762, 4 pgs.
Russian Office Action dated Sep. 22, 2010, cited in Appln No. 2008108992, 5 pgs.
Russian Office Action dated Sep. 6, 2010, cited in Appln No. 2008108999; 5 pgs.
Smedley, T.J., et al., “Expanding the utility of spreadsheets through the integration of visual programming and user interface objects,” The ACM Digital Library, 1996, 148-155.
Stinson, C., Microsoft Office Excel 2003, Sep. 3, 2003, Microsoft Press, pp. 1-7 (cited in Nov. 26, 2008 OA).
Taiwan Notice of Allowance dated Jul. 30, 2013 in Appln No. 95132059, 3 pgs.
Truvé, Staffan., “Dynamic what-if analysis: exploring computational dependencies with slidercells and micrographs,” Mosaic of Creativity, 1995, 280-281.
U.S. Appl. No. 14/731,023, filed Jun. 4, 2015 entitled “Interaction Between Web Gadgets and Spreadsheets”.
U.S. Appl. No. 14/920,277, filed Oct. 22, 2015 entitled “System and Method for Providing Calculation Web Services for Online Documents”.
U.S. Appl. No. 12/986,473, Office Action dated Jun. 2, 2016, 16 pgs.
U.S. Appl. No. 12/986,473, Office Action dated Sep. 19, 2016, 20 pgs.
U.S. Appl. No. 13/035,689, Office Action dated Jul. 1, 2016, 30 pgs.
U.S. Appl. No. 13/918,914, Office Action dated Apr. 6, 2016, 22 pgs.
U.S. Appl. No. 13/918,914, Office Action dated Jun. 8, 2016, 19 pgs.
U.S. Appl. No. 14/731,023, Amendment after Allowance filed Aug. 17, 2016, 3 pgs.
U.S. Appl. No. 14/731,023, Notice of Allowance dated Jul. 29, 2016, 7 pgs.
U.S. Appl. No. 14/731,023, Notice of Allowance dated Aug. 17, 2016, 2 pgs.
U.S. Appl. No. 14/731,023, Office Action dated Apr. 11, 2015, 7 pgs.
Walkenbach, John; “Microsoft Office Excel 2007”; Chapters 15, 27, 40; Excel 2007 Bible; Wiley Publishing; 44 pgs. (cited in Sep. 22, 2015 EP ISR).
Zhao, Jensen J., “Developing Web-Enabled Interactive Financial Tools Without HTML and Script Languages”, in Information Technology, Learning, and Performance Journal, Fall 2001, vol. 19, No. 2, 2001, 5 Pages.
U.S. Appl. No. 14/731,023, Notice of Allowance dated Oct. 31, 2016, 2 pgs.
Japanese Office Action in Application 2014-541110, dated Sep. 26, 2016, pgs.
Chinese 2nd Office Action in Application 201380007011.6, dated Nov. 30, 2016, 9 pgs.
U.S. Appl. No. 13/918,871, Office Action dated Jun. 4, 2015, 18 pgs.
U.S. Appl. No. 13/918,871, Office Action dated Dec. 18, 2015, 16 pgs.
U.S. Appl. No. 13/918,871, Office Action dated Oct. 3, 2016, 19 pgs.
PCT International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/041071, dated Jul. 22, 2015, 6 Pages.
PCT International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/041071, dated Jul. 7, 2015, 6 Pages.
Schnell, Joshua, “Grid: An Upcoming, Modern Approach to Spreadsheets for iOS Devices”, Published on: Aug. 20, 2012, Available at: http://www.macgasm.net/2012/08/20/grid-an-upcoming-modern-approach-to-spreadsheets-for-ios-devices/, 8 pages.
Lardinois, Frederic, “YC-Backed Grid Reinvents the Spreadsheet for the Tablet Age”, Published on: Aug. 8, 2012, Available at: http://techcrunch.com/2012/08/08/grid-launch/, 5 pages.
“LiveCode Grid for Mobile Devices”, Retrieved on: Apr. 26, 2013, Available at: http://www.runrevplanet.com/index.php?option=com_content&view=article&id=250&Itemid=148, 2 pages.
“Grid”, Published on: Jul. 12, 2012, Available at: http://www.infragistics.com/products/windows-forms/grid/, 6 pages.
Ramakrishnan, et al., “XcelLog: A Deductive Spreadsheet System”, Published on: Sep. 2007, Available at: http://www.cs.sunysb.edu/~cram/Papers/RRW_KER07/paper.pdf, 15 pages.
PCT International Search Report and Written Opinion for PCT/US2014/041071 dated Sep. 1, 2014, 9 pgs.
U.S. Appl. No. 13/918,904, Office Action dated Apr. 8, 2015, 23 pgs.
U.S. Appl. No. 13/918,904, Office Action dated Nov. 20, 2015, 26 pgs.
U.S. Appl. No. 13/918,904, Advisory Action dated May 5, 2016, 4 pgs.
U.S. Appl. No. 13/918,904, Office Action dated Oct. 6, 2016, 30 pgs.
Smith, et al., “Analyzing (Social Media) Networks with NodeXL”, in Proceedings of the 4th International Conference on Communities and Technologies, Jun. 25, 2009, 9 pgs.
Gibbs, Samuel, “Google Spreadsheets Gains Filtering, One More Reason not to Use Excel”, Published on: Mar. 24, 2011, Available at: http://downloadsquad.switched.com/2011/03/24/google-spreadsheets-gains-filtering-one-more-reason-not-to-use/, 8 pgs.
PCT International Search Report and Written Opinion Issued in PCT Application No. PCT/US2014/041276, dated Jan. 16, 2015, 17 Pages.
Jelen, Bill, “Microsoft Excel 2010 in Depth”, ISBN 9780789744265, published 2010, 11 pgs.
PCT International Preliminary Report on Patentability Issued in PCT Application No. PCT/US2014/041276, dated Dec. 23, 2015, 6 pgs.
European Extended Search Report in Application 14737383.1, dated Dec. 22, 2016, 6 pgs.
Australian Notice of Allowance in Application 2012204477, dated Oct. 24, 2016, 4 pgs.
U.S. Appl. No. 14/920,277, Office Action dated Jan. 6, 2017, 23 pgs.
U.S. Appl. No. 12/986,473, Notice of Allowance dated Apr. 28, 2017, 10 pgs.
U.S. Appl. No. 13/035,689, Office Action dated Apr. 6, 2017, 39 pgs.
Webb, J., “Excel 2003 Programming: A Developer's Notebook”, Chapter 6, Explore Security in Depth, copyright Aug. 20, 2004, O'Reilly Media, Inc. pp. 197-241.
Chinese Notice of Allowance in Application 201210044546.8, dated Feb. 6, 2017, 4 pgs.
Chinese Notice of Allowance in Application 201210434821.7, dated Feb. 6, 2017, 4 pgs.
Japanese Notice of Allowance in Application 2014-541110, dated Apr. 3, 2017, 3 pgs. (No English Translation yet).
Chinese 3rd Office Action in Application 201380007011.6, dated Apr. 1, 2017, 10 pgs.
U.S. Appl. No. 13/918,914, Office Action dated May 2, 2017, 24 pgs.
European Supplementary Search Report in Application 06814383.3, dated May 26, 2017, 10 pages.
U.S. Appl. No. 13/918,871, Office Action dated Jun. 16, 2017, 23 pgs.
U.S. Appl. No. 13/918,904, Office Action dated May 26, 2017, 32 pgs.
Chinese 4th Office Action in Application 201380007011.6, dated Jul. 4, 2017, 10 pages.
European Office Action in Application 14736150.5, dated Aug. 2, 2017, 7 pages.
Anonymous: “Selecting non-blank cells in Excel with VBA—Stack Overflow”, May 1, 2009 (May 1, 2009), XP55391820, Retrieved from the Internet: URL:https://stackoverflow.com/questions/821364/selecting-non-blank-cells-in-excel-with-vba [retrieved on Jul. 18, 2017], 3 pages.
Anonymous: “Select Non Empty Cells”, Oct. 1, 2009 (Oct. 1, 2009), XP55391822, Retrieved from the Internet: URL:http://www.ozgrid.com/forum/showthread.php?t=141201, [retrieved on Jul. 18, 2017], 5 pages.
U.S. Appl. No. 14/920,277, Office Action dated Jul. 26, 2017, 16 pgs.
Mexican Office Action in Application MX/a/2015/017360, dated Jun. 26, 2017, 5 pages.
“First Office Action and Search Report Issued in Chinese Patent Application No. 201480045299.0”, dated Mar. 8, 2018, 11 Pages.
Chinese Office Action and Search Report Issued in Chinese Application No. 201480045377.7, dated Feb. 13, 2018, 14 Pages.
U.S. Appl. No. 13/918,871, Office Action dated Apr. 5, 2018, 28 pgs.
European Office Action Issued in European Patent Application No. 14736150.5, dated Mar. 15, 2018, 5 Pages.
“International Search Report & Written Opinion Issued in PCT Application No. PCT/US2012/020192”, dated Jul. 31, 2012, 8 Pages.
“Office Action Issued in Mexican Patent Application No. MX/a/2016/013976”, dated Jun. 5, 2017, 4 Pages. No English translation.
“Notice of Allowance Issued in Japanese Patent Application No. 2013-548479”, dated Jul. 4, 2016, 4 Pages.
U.S. Appl. No. 13/918,914, Office Action dated Dec. 6, 2017, 28 pgs.
Chinese First Office Action and Search Report Issued in Chinese Patent Application No. 201480045249.2, dated Nov. 29, 2017,15 Pages.
Chinese Notice of Allowance in Application 201380007011.6, dated Nov. 7, 2017, 4 pages.
Canadian Notice of Allowance in Application 2822066, dated Nov. 23, 2017, 1 page.
U.S. Appl. No. 13/918,904, Office Action dated Jan. 5, 2018, 30 pgs.
“Office Action Issued in European Patent Application No. 13741052.8”, dated Jul. 1, 2019, 6 Pages.
“Office Action Issued in European Patent Application No. 14736166.1”, dated Jul. 1, 2019, 10 Pages.
“Office Action Issued in Australian Patent Application No. 2014278514”, dated Feb. 12, 2019, 3 Pages.
“Office Action Issued in Korean Patent Application No. 10-2014-7025384”, dated Jul. 22, 2019, 5 Pages.
“Office Action Issued in European Patent Application No. 12732029.9”, dated Jul. 26, 2019, 7 Pages.
“Non-Final Office Action Issued in U.S. Appl. No. 13/918,871”, dated Apr. 4, 2019, 35 Pages.
Omura, Atsushi, “Protect method—Protecting Sheets, Introduction to VBA Programming for Excel Users”, in Nikkei Business Publications Inc., Jan. 1, 2001, 5 Pages.
“Summons to Attend Oral Proceedings Issued in European Patent Application No. 14737383.1”, dated Apr. 15, 2019, 5 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/918,914”, dated Jun. 27, 2019, 41 Pages.
“Office Action Issued in Australian Patent Application No. 2014278514”, dated May 23, 2019, 3 Pages.
“Office Action Issued in Chinese Patent Application No. 201480045299.0”, dated May 27, 2019, 8 Pages.
“Office Action Issued in Mexican Patent Application No. MX/a/2015/017360”, dated Jul. 16, 2019, 8 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/918,871”, dated Oct. 28, 2019, 49 Pages.
“Office Action Issued in Indian Patent Application No. 4834/CHENP/2013”, dated Nov. 6, 2019, 10 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/918,904”, dated Sep. 5, 2018, 28 Pages.
“Final Office Action Issued in U.S. Appl. No. 13/918,871”, dated Nov. 1, 2018, 33 Pages.
“Use Slicers to Filter Data”, Retrieved from: https://support.office.com/en-us/article/use-slicers-to-filter-data-2491966b-a9d5-4b0f-b31a-12651785d29d, Jan. 2018, 3 Pages.
“Complete Illustration of Mini Excel 2007 Graph”, Published by Exmedia Ltd., Sep. 2, 2007, 1 Page.
“Notice of Allowance Issued in Japanese Patent Application No. 2016519558”, dated Nov. 1, 2018, 6 Pages.
Takei, Kazumi, “Gmail & Google Document—Show you all you need to know”, Published by Shoeisha Co., Ltd., 1st Edition, Oct. 3, 2011, 1 Page.
“Non Final Office Action Issued in U.S. Appl. No. 13/918,914”, dated Dec. 14, 2018, 41 Pages.
“Second Office Action Issued in Chinese Patent Application No. 201480045299.0”, dated Nov. 23, 2018, 10 Pages.
“Notice of Allowance Issued in Chinese Application No. 201480045377.7”, dated Oct. 26, 2018, 8 Pages.
“Office Action Issued in Mexico Patent Application No. MX/a/2015/017360”, dated Nov. 5, 2018, 11 Pages.
“Office Action Issued in Mexico Patent Application No. MX/a/2016/013976”, dated Nov. 5, 2018, 16 Pages.
“Put an Excel Snapshot into Word”, Retrieved from: <<https://www.journalofaccountancy.com/issues/2003/apr/putanexcelsnapshatintoword.html>>, Apr. 1, 2003, 1 Page.
“Office Action Issued in European Patent Application No. 05104560.7”, dated Feb. 25, 2008, 6 Pages.
Hong, Wu, “Excel Essential Toolbox: Fun for Only a Three-Month Mortgage”, Retrieved from: <<http://article.ochome.net/content-1047708-all.html>>, Jan. 25, 2010, 4 Pages.
“Supplementary Search Report Issued in European Patent Application No. 06790082.9”, dated Dec. 1, 2017, 9 Pages.
“Office Action Issued in European Patent Application No. 06790149.6”, dated Jun. 6, 2018, 6 Pages.
“Partial Search Report Issued in European Patent Application No. 06814383.3”, dated May 26, 2017, 10 Pages.
“Search Report Issued in European Patent Application No. 06814383.3”, dated Sep. 1, 2017, 9 Pages.
“Office Action Issued in Korean Patent Application No. 10-2008-7003836”, dated Aug. 21, 2012, 6 Pages.
“Office Action Issued in Korean Patent Application No. 10-2008-7005475”, dated Feb. 26, 2013, 6 Pages.
“Office Action Issued in Korean Patent Application No. 10-2013-7017694”, dated Apr. 30, 2018, 11 Pages.
“Office Action Issued in European Patent Application No. 13761403.8”, dated Mar. 20, 2017, 6 Pages.
“Search Report Issued in European Patent Application No. 13761403.8”, dated Nov. 16, 2015, 7 Pages.
“Non Final Office Action Issued in U.S. Appl. No. 14/920,277”, dated May 16, 2018, 22 Pages.
“Office Action Issued in European Patent Application No. 14737383.1”, dated May 24, 2018, 5 Pages.
“Office Action Issued in Israeli Patent Application No. 189152”, dated Jun. 13, 2011, 4 Pages.
“Office Action Issued in Canadian Patent Application No. 2,616,563”, dated Sep. 13, 2013, 2 Pages.
“First Office Action Issued in China Patent Application No. 200680033019.X”, dated Jun. 26, 2009, 20 Pages.
“Second Office Action Issued in Chinese Patent Application No. 200680033019.X”, dated Jun. 23, 2010, 10 Pages.
“Office Action Issued in Russian Patent Application No. 2008108999”, dated Jun. 29, 2010, 4 Pages. (W/o English Translation).
“Office Action Issued in Japanese Patent Application No. 2008-530243”, dated Aug. 17, 2012, 3 Pages.
“First Office Action Issued in Chinese Patent Application No. 201380014464.1”, dated Aug. 3, 2016, 12 Pages.
“Second Office Action Issued in Chinese Patent Application No. 201380014464.1”, dated Apr. 14, 2017, 14 Pages.
“Second Office Action Issued in Chinese Patent Application No. 201480045249.2”, dated Jul. 23, 2018, 5 Pages.
“Office Action and Search Report Issued in Russian Patent Application No. 2015153382”, dated Apr. 9, 2018, 8 Pages. (W/o English Translation).
“Office Action Issued in Japanese Patent Application No. 2016-519558”, dated May 9, 2018, 7 Pages.
“Office Action Issued in Egyptian Patent Application No. 399/2008”, dated Jun. 9, 2006, 3 Pages. (W/o English Translation).
“Office Action Issued in New Zealand Patent Application No. 566309”, dated Oct. 14, 2009, 2 Pages.
“Office Action and Search Report Issued in Taiwanese Patent Application No. 95132059”, dated Feb. 8, 2013, 9 Pages.
“Office Action Issued in Mexican Patent Application No. MX/a/2008/003309”, dated Apr. 24, 2014, 8 Pages. (W/o English Translation).
“Office Action Issued in Mexican Patent Application No. MX/a/2008/003318”, dated Dec. 15, 2011, 5 Pages.
“Office Action Issued in Mexican Patent Application No. MX/a/2008/003318”, dated Apr. 11, 2013, 7 Pages.
“Office Action Issued in Mexican Patent Application No. MX/a/2016/013976”, dated Feb. 20, 2018, 4 Pages.
Schlatter, et al., “The Business Object Management System”, in IBM Systems Journal, vol. 33, Issue 2, 1994, pp. 239-263.
“International Search Report and Written Opinion Issued in PCT Application No. PCT/US06/35168”, dated Feb. 26, 2007, 9 Pages.
“Office Action Issued in Philippines Patent Application No. PH/1/2008/500354”, dated Jun. 6, 2011, 1 Page.
“Office Action Issued in Malaysian Patent Application No. PI 200800396”, dated Jun. 28, 2013, 3 Pages.
“Office Action Issued in Malaysian Patent Application No. PI 20080500”, dated Jul. 31, 2012, 3 Pages.
“Office Action Issued in Malaysian Patent Application No. PI 20080503”, dated Jul. 15, 2015, 2 Pages.
“Office Action Issued in Brazilian Patent Application No. PI0615023-3”, dated Feb. 2, 2018, 6 Pages.
“Notice of Allowance Issued in Korean Patent Application No. 1020147025384”, dated Jan. 6, 2020, 5 Pages.
“Office Action and Search Report Issued in Chinese Patent Application No. 201710255434.X”, dated Feb. 3, 2020, 9 Pages.
“Office Action Issued in Brazil Patent Application No. BR112015031195.4”, dated Feb. 18, 2020, 5 Pages.
“Summons to Attend Oral Proceedings Issued in European Patent Application No. 13741052.8”, Mailed Date: Apr. 22, 2020, 7 Pages.
Related Publications (1)
Number Date Country
20170300222 A1 Oct 2017 US
Continuations (1)
Number Date Country
Parent 12986473 Jan 2011 US
Child 15637788 US