1. Field
The embodiments discussed herein are directed to a set of GUI tracking menus for different navigation tasks, where each navigation menu has action tools associated with its navigation task and the action tools are arranged in rings with the most frequently used tools on the outside.
2. Description of the Related Art
Today, operating with three-dimensional (3D) scenes is becoming more and more common. As a result, new or novice users are confronted with systems that can be difficult to use. However, it is also necessary to allow experienced users to use 3D systems effectively. What is needed is a system that accommodates and helps both novice and advanced users.
A navigation system is provided for navigating a three-dimensional (3D) scene that includes a model or object with which a user can interact. The system accommodates and helps both novice and advanced users. To do this, the system provides a set of GUI tracking menus for different navigation tasks, where each navigation menu has action tools associated with its navigation task. The action tools are arranged in rings with the most frequently used tools on the outside. In particular, a universal navigation graphical user interface can be displayed on a display for navigating within a three-dimensional scene on the display, where the GUI includes a tracking menu having tools arranged in a circle. An outer ring includes a zoom tool for zooming into a displayed scene, located in an upper quadrant of the outer ring; a pan tool for panning in a scene, located in a bottom quadrant of the outer ring; an orbit tool for orbiting with respect to an object in a displayed scene, located in a left quadrant of the outer ring; and a rewind tool for rewinding states of a view history, located in a right quadrant of the outer ring. An inner circle includes a center tool for centering a view on the display, located in an upper left wedge of the inner circle; a walk tool for moving a view away from a start point in a scene, located in an upper right wedge of the inner circle; a look tool for adjusting the view direction for a scene, located in a lower left wedge of the inner circle; and an up/down tool for moving up/down in a scene relative to an up vector of a model, located in a bottom right wedge of the inner circle. Also included are a menu button and a close button, both located outside the circle.
These together with other aspects and advantages which will be subsequently apparent, reside in the details of construction and operation as more fully hereinafter described and claimed, reference being had to the accompanying drawings forming a part hereof, wherein like numerals refer to like parts throughout.
FIG. 2a depicts other mini-wheels.
FIGS. 3a and 3b depict the hit zones of the wheels.
FIGS. 4a and 4b show switching between big and mini wheels.
FIGS. 11a-11d show first contact GUIs.
FIGS. 23a-23c show rewind cursor appearance.
FIGS. 24a-24c show rewinding.
FIGS. 27a and 27b show a perspective slider.
FIG. 29a shows an up/down slider and FIG. 29b shows up/down motion.
The embodiments of the present invention discussed herein provide a navigation system for three-dimensional (3D) scene navigation that allows both new and experienced users to navigate quickly and safely within a 3D scene (or 2D scene). The system, as depicted in
The set of widgets or wheels includes a view object wheel 110 that would typically be used by a novice or inexperienced 3D user to navigate around an object in a scene, such as a model of an automobile or a part. The view object wheel 110 includes a center action tool 112. The center tool 112 is located in the upper wedge of the view object wheel 110. The center tool is designed to bring (through an automatic pan operation) a clicked point on a model part or other object to the center of the canvas view. The tool's secondary purpose is to establish a reference point in 3D space for the other tools on the wheel 110. The orbit tool 114 allows a user to orbit around an object in a scene where the center of the orbit is the center reference point. The zoom tool 116 allows a user to zoom in toward or out away from the center point. The rewind tool 118 allows the user to rewind back through waypoints that are automatically saved as the user navigates about a scene.
A tour wheel 120 would typically be used by a novice to navigate within a building. A full navigation wheel 130 is designed for an experienced user and includes all of the tools of the object 110 and tour 120 wheels. The tour wheel 120 includes the rewind tool discussed above. The tour wheel 120 also includes a forward action tool 122. This forward tool 122 allows a user to move forward in the scene toward an object or model. If there is no object beneath the cursor, no motion occurs and the user is informed via an invalid action or prohibited action message. A look tool 124 allows a user to “look” or change the view toward the left, right, up or down, much the way a person turns their head in these directions. An up/down tool 126 allows the user to move vertically up and down within the scene.
The full navigation wheel 130 includes all of the tool functionality of the object 110 and tour 120 navigation wheels with additional capability. A walk tool 132 allows users to interactively walk about in 3D space, moving forward and backwards and/or turning freely to the left or right. A pan tool 134 allows the user to move up, down, right and left in the scene. The zoom tool moves in and out from the current cursor position instead of the center point.
Each of the wheels 110, 120 and 130 has a corresponding miniature or mini wheel 142, 144 and 146, respectively, each of which includes the same action tools arranged in a pie type shape with each action tool occupying a pie wedge that corresponds to the general location of the action tool in the full size version.
A 2D navigation wheel 150 can also be provided as depicted in
FIG. 2a depicts enlarged versions of different mini-wheels, some of which have places for additional functions.
The navigation tools or wheels have some common characteristics.
The wheels are all tracking menus. A tracking menu is a cluster of graphical buttons; as with traditional menus, the cursor (sometimes shaped like an arrow) can be moved within the menu to select and interact with items. However, unlike traditional menus, when the cursor hits the edge of the menu, the menu moves to continue tracking the cursor. Thus the menu always stays under the cursor and close at hand. Each wheel is displayed on the “canvas” of the screen display for 2D drawing and 3D model views. As long as the tool is active, the wheel's menu surface will track and follow the mouse or input device cursor when the cursor hits the edge of the menu surface. No mouse button press is required to get the menu to follow along. If the mouse cursor leaves the canvas area, the menu still follows the mouse off-canvas, and the navigation tool menu re-displays when the mouse cursor returns to the canvas, moving the menu along with it.
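By way of illustration, the edge-following rule can be sketched in a few lines of code. The following is a minimal sketch only, assuming an axis-aligned bounding box for the menu surface; the TrackingMenu class and on_cursor_move method are illustrative names, not part of the described embodiment.

```python
# Minimal sketch of tracking-menu edge following: the menu stays put while
# the cursor moves inside it, and is dragged along once the cursor hits an
# edge. The axis-aligned-bounds simplification and all names are illustrative.

class TrackingMenu:
    def __init__(self, x, y, width, height):
        self.x, self.y = x, y                # top-left corner of the menu surface
        self.width, self.height = width, height

    def on_cursor_move(self, cx, cy):
        """Shift the menu just enough to keep the cursor inside it."""
        if cx < self.x:                      # cursor pushed past the left edge
            self.x = cx
        elif cx > self.x + self.width:       # past the right edge
            self.x = cx - self.width
        if cy < self.y:                      # past the top edge
            self.y = cy
        elif cy > self.y + self.height:      # past the bottom edge
            self.y = cy - self.height
```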
Each wheel is “pinned” to the mouse cursor when the left mouse button is held and a child tool is selected. When a child tool is no longer in use, the wheel re-appears under the mouse cursor on the canvas. The cursor is displayed exactly on the tool where it was clicked to start the tool usage. So, for example, when the zoom tool is released, the wheel is aligned under the cursor exactly as it was when the mouse button was first pressed, while the cursor was on the zoom tool. This allows users to quickly re-invoke the same tool if needed. If the mouse cursor is off the canvas when the sub-tool is exited, the wheel continues to track the cursor. If the user moves the cursor back towards the canvas area the wheel is moved along until it becomes visible again.
Each of the wheels has the most used functions or action tools on the exterior and lesser used functions in the center. In addition, functions can occupy different amounts of space consistent with the importance of that function. The wheels, both full and mini, can have different sizes from 256×256 pixels down to 16×16 pixels. At a size of 32×32 the arrow cursor is no longer visible. The wheels appear on top of the canvas, and any objects that are in the canvas are still visible since the wheels can be made semi-transparent. The wheels' level of opacity can be set in a range between 25% and 90% (i.e., mostly opaque). No matter what the transparency setting, the navigation tool or wheel is completely hidden (invisible) while one of its child tools is in use. Each child tool will determine any on screen display that may be required for itself. When the child tool is released, the wheel is redisplayed using the current transparency settings.
The area of each action tool is highlighted (including wedges in the mini wheels) as the cursor moves over the tool to alert the user that the user is over that tool. The area of each tool corresponds to the activation or hit zones, as depicted in
The highlighting and selection for a mini wheel are particularly depicted in
A mouse down event activates a highlighted command and the mini wheel cursor is replaced by the appropriate tool cursor. A user can then drag to operate the command (for example, dragging to zoom in/out using the standard zoom tool cursor). Releasing the mouse button brings back the mini wheel cursor as depicted in
The wedge selection or tool associated with a mini wheel or pie cursor can be enhanced by extending the hit zone of the active wedge as shown in
Each wheel exhibits a slide under behavior when the ShowMotion UI and/or the View Cube are displayed on the canvas. The wheel passes under both of these other features' onscreen displays.
As the above cursors are moved about on the canvas, when the cursor encounters the edge of the display or canvas, the cursor wraps around as depicted in
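A minimal sketch of such wrap-around follows, assuming the application can simply reposition the cursor; the function name and the modulo treatment of the canvas as a torus are illustrative assumptions.

```python
# Illustrative sketch of cursor wrap-around at the canvas edges: when a drag
# pushes the cursor off one side, it reappears on the opposite side, so the
# effective drag distance is unbounded.

def wrap_cursor(x, y, canvas_width, canvas_height):
    """Return the wrapped cursor position (modulo the canvas bounds)."""
    return x % canvas_width, y % canvas_height

# Example: dragging 10 px past the right edge of a 1280x1024 canvas
print(wrap_cursor(1290, 500, 1280, 1024))  # -> (10, 500)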
All of the tools of each wheel can have a tool tip or description graphic (HUD—heads-up-display) that is displayed when the cursor hovers over the tool.
As depicted in
The menu items for the full wheel can include:
When the 2D wheel is active it will display the following menu items in its menu:
The first four menu items in the view, tour and full wheels switch to the corresponding wheel; the “Basic Wheels” item is a sub-menu with the two novice wheels. Go Home moves the camera to the document's saved Home View with a quick animation lasting 1 second. Fit to Window adjusts the current camera to frame the 3D content to fit entirely within the view. Level Camera flattens the camera view rotation relative to the XY ground plane. Restore Original Center restores the view object wheel center point to the center of the currently loaded model's bounding box. Increase/Decrease Walk Speed doubles or halves the current walk speed. Help brings up the help window for the wheels. Preferences brings up the Preference window for the wheels. Close Wheel exits from the current wheel navigation tool and returns to the previous tool (or the default tool such as Select). Optional wheel menu arrangements can be provided.
To help users get started a set of graphic tool tips is provided that includes first contact tool tips depicted in
When the system is first run and a 3D model is loaded into the canvas, either the view object wheel or the full navigation wheel is preferably displayed in the canvas or display in the lower left corner. While displayed in this manner the wheel is even more transparent than when it is active. The wheel is in an inactive or pinned state, and only responds to mouse cursor rollover. While in this state the pinned wheel does not interact with any other tools or features. The pinned wheel has no other functions. After the first run from installation, the currently selected wheel is pinned to the same location. (The wheel selection is stored as an option across sessions.) If a mini wheel is the currently selected wheel, then the full navigation wheel is displayed. When the user moves the cursor into the pinned wheel's space, a small combined graphical tooltip as discussed above and roll-over switch will pop up. As described above, the tool tip is broken up into different regions corresponding to a wheel type. When the cursor rolls over one of these tooltip regions, the pinned wheel changes to bring up the matching wheel. When a mouse click is detected in the tool tip region or the wheel, the current wheel is activated (i.e., unpinned). The wheel is not pinned on startup again unless the user re-enables the behavior manually (“Always Show the Pinned Wheel on Startup” setting). If no click is detected the wheel stays pinned on cursor exit. If the user does not click and moves the cursor beyond either the border of the wheel or the tool tip, the tool tips go away and the wheel remains in the pinned state. Preferably, to get rid of the wheel users first activate it and then close the wheel, or select another tool. Otherwise the wheel remains pinned in the 3D canvas.
A main menu pull down menu can also be provided in an application to allow activation of one of the navigation tools, such as:
The above menu is ordered for an advanced user and could be inverted in order for a novice user.
As previously mentioned the center tool moves a clicked or designated point on a model part to the center of the view and establishes a 3D space reference point. When the user places the cursor over the center wedge and over a model part in the canvas or scene at the same time, the cursor will display as the select arrow 1202, as shown in
The center point sphere represents a 3D point in model space. As discussed above, this point is set by the center tool and used by the orbit and zoom tools that are found on the view object wheel. If the user has not defined a center point by clicking in the canvas view, then the default center point location in space is preferably defined as the center of the currently loaded Model's bounding box. The sphere is a 3D model that is placed into the current model view. Since the sphere is a geometric object with 3D geometry it may appear larger or smaller depending on how far the target point is from the camera view, and it may be occluded by any model geometry that falls between it and the camera view.
Preferably, the center sphere is a “green ball” pivot indicator 1602 with a text label 1604. The sphere can have axis rings 1702 around the perimeter as depicted in
The sphere is preferably always accompanied by a text label where the label appears in a fixed pixel size on the canvas. The label is preferably always drawn in front of any model geometry in the view. The label text changes depending on which tool is using the center point sphere. The label text when seen with the center tool is: CENTER
The view object wheel version of the orbit tool 114 (see
The zoom tool 116 (see
During operation, after a single left mouse click and release the camera view will perform an animated transition and zoom in towards the current target point. The amount the camera travels during this transition is preferably defined as 25% of the distance from the camera view's current location to the target point location. The zoom tool on the view object wheel preferably always uses as its target point the last point created by the center tool. This means that the view object wheel zoom tool preferably always zooms directly to the center of the canvas. Zoom is typically a camera dolly operation; it typically does not change the camera's field of view (FOV) in any way. Holding the Shift key when clicking the zoom tool inverts the single click zoom, and moves the camera view directly away from the target point. The zoom out distance is preferably defined as 125% of the distance from the current camera view location to the target point location.
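For illustration, the single-click zoom transition can be sketched as follows. This is a simplified sketch, not the embodiment's implementation: the animation is omitted, and the 125% zoom-out figure is interpreted here as the distance the camera travels away from the target point.

```python
# Sketch of the single-click zoom transition: the camera travels 25% of the
# way to the target point, or (with Shift held) 125% of that distance away
# from it. Interpreting 125% as travel distance is an assumption; a real
# system would animate the move rather than jump.

def zoom_click(camera_pos, target, zoom_out=False):
    """Return the camera position after a single-click zoom."""
    factor = -1.25 if zoom_out else 0.25   # fraction of camera-to-target distance
    return tuple(c + factor * (t - c) for c, t in zip(camera_pos, target))

# Zoom in: camera at origin, target 4 units away -> camera moves 1 unit closer
print(zoom_click((0.0, 0.0, 0.0), (0.0, 0.0, 4.0)))                 # (0, 0, 1)
print(zoom_click((0.0, 0.0, 0.0), (0.0, 0.0, 4.0), zoom_out=True))  # (0, 0, -5)
```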
Clicking on the zoom tool wedge will activate the zoom tool and also allow mouse dragging to zoom in or out from the current focus point. Again, the last defined center point is used as the target point for the view object wheel zoom tool. This means that this version of zoom always zooms in to the center of the canvas.
In the view object wheel, the zoom tool cannot zoom forward beyond the target point. The camera view will stop moving once it reaches the target point, and further mouse drag in that direction will have no effect. The user can always reverse the mouse drag direction to zoom back away from the target point. When the user has reached this boundary an additional message is displayed in the HUD: Change the Center point to zoom further
The zoom tool is also constrained when zooming out or away from the target point when using the view object wheel. The camera view cannot be moved further away than a maximum distance, which is determined by the model's bounds. Once the view has backed out to this maximum distance the camera will stop, and further mouse drag in that direction will have no effect. The user can always reverse the mouse drag direction to zoom back inward. A HUD message is provided as depicted in
The zoom tool can have additional modes available. When the tool is active and the user holds down the Shift key, a “+” cursor is shown and the system allows the user to perform a “zoom extents” operation. In this operation, clicking and dragging defines a rectangle, which is visually shown to the user, to define a zoomed view. Releasing the mouse after the clicking and dragging zooms the view (using an animation) to match the defined rectangle as best as possible. At the end of the transition, the visual zoom rectangle fades away.
A cursor drag up or drag right will zoom the view in. A cursor drag left or drag down mouse will zoom the view out. By default the directions for zoom-In are cursor drag up and right, but this can be inverted based on the zoom direction options dialog setting.
The mouse scroll wheel can also be enabled for the view object wheel. Scrolling the wheel forward causes the view to zoom in, and scrolling backwards causes it to zoom out. Zooming with the scroll wheel is preferably always towards, or away from, the center of the canvas. While scrolling with the wheel the target point sphere is displayed at the center of the canvas with the label discussed previously.
The rewind tool is found in all of the wheel interfaces. Its behavior does not change no matter which wheel it is found on, although it is located differently depending on the wheel. The tool allows the user to navigate 3D spaces with a level of security, as they can always rewind back through their navigation history to a previous view location. The tool presents this rewind history as a set of onscreen thumbnails that show each step the user has taken so far. The rewind tool also lets users interactively drag between the waypoint thumbnails to run back and forth through their navigation history, animating the camera view as they do so. When the cursor is held over the rewind tool a tooltip is shown below the wheel in the canvas with the text: Rewind through your movement history. The system uses “waypoints” to facilitate the rewind function or command. Each waypoint is a captured camera view position+orientation, along with a thumbnail of the model as seen from the camera's current view at that location. As the user navigates using the tools provided by the wheels (e.g., forward, orbit, center), the ViewCube available from AutoDesk (clicking or dragging), or any other 3D scene navigation tool, these events are added to the navigation Rewind history. Every camera and viewport combination has a separate Rewind navigation history queue. Thus, in a quad view (2×2 viewport) arrangement where there is a Front, Top, Right and Perspective camera/view, 4 separate Rewind queues are maintained. Each time the mouse is released when using a navigation tool, the system stores the current view as a new waypoint and appends it to a history list for that camera/view combination. This new waypoint is set as the “current” waypoint. In general, waypoints are preferably not captured or recorded (but can be) at stages during a tool's usage, only at the ending view position when the tool interaction is completed. Some tools, such as the walk tool described below, can generate multiple waypoints (see the section below).
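A sketch of this waypoint bookkeeping appears below. The class and field names, and the use of a simple Python list, are illustrative assumptions; the embodiment specifies only the behavior described here (one queue per camera/view, capture on mouse release, Undo/Redo-style truncation as discussed next, and a default cap of 100 events).

```python
# Hedged sketch of rewind waypoint bookkeeping: one history queue per
# camera/viewport combination, a waypoint captured on each mouse release,
# and truncation of later waypoints when a new one is created after a rewind.

from dataclasses import dataclass, field

MAX_HISTORY = 100  # default cap on saved history events

@dataclass
class Waypoint:
    position: tuple        # captured camera view position
    orientation: tuple     # captured camera view orientation
    thumbnail: bytes       # image of the model as seen from that view

@dataclass
class RewindHistory:
    waypoints: list = field(default_factory=list)
    current: int = -1      # index of the "current" waypoint

    def capture(self, position, orientation, thumbnail):
        # Creating a new waypoint discards everything after the current one,
        # like an Undo/Redo stack.
        del self.waypoints[self.current + 1:]
        self.waypoints.append(Waypoint(position, orientation, thumbnail))
        if len(self.waypoints) > MAX_HISTORY:
            self.waypoints.pop(0)          # drop the oldest event
        self.current = len(self.waypoints) - 1

# One queue per camera/viewport, e.g. in a quad-view arrangement:
histories = {name: RewindHistory() for name in ("Front", "Top", "Right", "Perspective")}
```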
With the cursor positioned over the rewind tool in the view navigation widget, when the user clicks on the tool, the widget fades out and the waypoint history fades in as shown in
The rewind history list preferably works like an Undo/Redo mechanism: when the user rewinds and subsequently creates a new waypoint, the new waypoint becomes the end of the history list and all others after it in time are discarded. The rewind history is stored for the current model only. If a new section or model is loaded then the history is discarded. The rewind history also persists across wheel switching, and all of the wheels can make use of it in the same manner. When another tool is selected and the wheels are dismissed, the rewind history continues to be stored. When the user returns to navigation with the wheels by selecting them again, all previous history waypoints are retained and the user can rewind through them normally. A maximum number of history events is saved (by default 100 events).
A single mouse click on the rewind wedge in the wheels 110, 120 and 130 (see
If the user left mouse clicks and holds the mouse down, the rewind GUI as shown in
If the user then drags the cursor to the right or left, the current waypoint's border will start to fade out and the next (in the dragged direction) thumbnail's border will begin to brighten as shown in
The distance the cursor is currently placed between any two waypoint thumbnails is used to control the transition of the camera view between those two waypoint locations and orientations. So, if the cursor is halfway between two waypoints, the camera view will be smoothly animated to the halfway point between the two waypoint view locations using linear interpolation. The camera view is constantly updated to follow the transitions between these thumbnails and waypoints as the user drags over them using the linear interpolation. This sequence of cursor position changes and corresponding scene changes is shown in
When the user releases the mouse the camera will remain at its current position and orientation even if this is part way between two waypoints.
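The interpolation described above can be illustrated with a short sketch. Only the camera position is interpolated here; a complete implementation would interpolate orientation as well, which is an assumption beyond this fragment.

```python
# Minimal sketch of the drag interpolation: the fraction of the cursor's
# travel between two adjacent thumbnails drives a linear interpolation of
# the camera between the corresponding waypoint view positions.

def interpolate_view(way_a, way_b, t):
    """Linearly interpolate a camera position between two waypoints, 0 <= t <= 1."""
    return tuple(a + t * (b - a) for a, b in zip(way_a, way_b))

# Cursor halfway between two thumbnails -> camera halfway between the views
print(interpolate_view((0.0, 0.0, 0.0), (4.0, 2.0, 0.0), 0.5))  # (2.0, 1.0, 0.0)
```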
When the mouse is released: if the closest thumbnail under the cursor is not the last in the history list (there are more thumbnails to the right of it), or the cursor is not exactly on a saved waypoint, a new waypoint thumbnail will be generated and its information will be stored or updated. This selected waypoint becomes the “current waypoint” and any new waypoints will be created after it in the list; if any new waypoints are created, the entire list after this current one will be discarded.
When the mouse is released, if the closest waypoint under the cursor is the last waypoint in the history list (no more thumbnails to the right of the selected item), then the waypoint's information is replaced with the camera's current position+orientation+thumbnail. This last waypoint becomes the current waypoint and any new views will be created after it in the list.
The rewind cursor is constrained to move only horizontally and graphical “corner borders” are also displayed surrounding the cursor to frame the current thumbnail.
To access rewind history beyond what is currently visible in the thumbnail strip on the canvas, an automatic thumbnail strip scrolling has been defined. When only two visible thumbnails remain in the history due to canvas clipping, the rewind cursor remains fixed and the thumbnail strip begins to move in the opposite direction of the mouse movement. This movement is position-based, not rate-controlled.
Rather than discard (or write over) all later in time waypoints when a past waypoint is selected, it is also possible to create a branch in the rewind history and maintain all of the waypoints. Multiple waypoints can also be compressed. As previously mentioned, waypoints are typically saved at the end of tool operation and at intervals within some tools. However, it is possible to constantly record waypoints in “high resolution” with small separations in time between them.
As previously discussed the tour building wheel 120 (see
The forward tool or command 122 is located at the top center of the tour building wheel 120 (see
The forward tool moves the view towards a clicked focus point location. When the mouse is clicked over a model surface with the forward tool, a focus point is defined by projecting the cursor location on the canvas onto the model surface using ray casting in a “click-through” type manner. If no model part falls under the cursor, then the camera view will not move forward and a prohibited operation cursor 2602 (see also 710 of
The center point sphere is reused as the target point and the sphere remains displayed until the user releases the mouse. For this tool the sphere has no label. Instead an endpoint 2702 of a perspective slider (HUD) 2704 reads: Surface—as depicted in
Clicking and dragging on the forward tool wedge also sets the focus point in the same manner as for the click to go forward behavior described previously, but instead of animating the camera towards the target automatically, the user may drag the mouse to move the view forward (and back) to set the distance to the focus point interactively.
As with the zoom tool, the user may perform both the forward and backward movement with a single mouse drag action. This is basically a zoom interaction but the zoom amount is capped in both directions. While interactively dragging the view with the forward tool, the tour building wheel is hidden. Dragging the mouse/cursor upward moves the view forward. Dragging the cursor downward does the opposite.
The zoom forward distance is capped so that when the user has moved (dragged) forward right up to the focus point position, the tool resists further mouse drag beyond that point in a resistance zone, which is a percentage of the distance to the surface, such as 95%. If the user continues to drag the mouse in the forward direction for a period of time, they will be allowed to “pop-through” the view to beyond the existing focus point position and continue to zoom in/move forward from there. This behavior allows users to push through solid surfaces or walls.
The user may also interactively move backwards from their starting position towards an invisible focus point that is behind the camera view. A “sticky point” or range of no movement is defined for the start position of the forward tool. To go backwards, the user must drag some distance downward to move off the sticky point. Similar to the forward movement, a ray is cast behind the camera view plane to see how far the user can move without entering a wall or passing through a model surface. If a surface is found, the camera is allowed to move backwards only as far as ½ the distance from the start position to the surface found behind the camera. Unlike forward movement, the user cannot push beyond the surface that is detected behind the camera. If no backward surface is found during the initial ray cast, the camera is allowed to move backwards a small amount (roughly 25% of a forward ray). Again the cursor is hidden during this behavior.
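The drag limits just described can be summarized in a sketch. The function and constant names are illustrative, and modeling “pop-through” as a simple flag (set after sustained forward dragging) is an assumption.

```python
# Sketch of the forward tool's drag limits: forward motion resists at 95% of
# the distance to the focus surface (with "pop-through" past it after
# sustained dragging), and backward motion is capped at half the distance to
# a surface found behind the camera, or at ~25% of a forward ray otherwise.

RESISTANCE_FRACTION = 0.95   # resistance zone as a fraction of surface distance
BACKWARD_FRACTION = 0.5      # backward cap when a rear surface is detected
NO_SURFACE_BACKWARD = 0.25   # backward cap when no rear surface is found

def clamp_forward_drag(requested, dist_to_surface, dist_behind=None,
                       pop_through=False):
    """Clamp a signed drag distance (positive = forward) to the tool's limits."""
    if requested >= 0:                        # moving toward the focus point
        limit = dist_to_surface * RESISTANCE_FRACTION
        if requested > limit and not pop_through:
            return limit                      # resist at the surface
        return requested                      # pop-through permits going beyond
    if dist_behind is not None:               # a surface exists behind the camera
        return max(requested, -dist_behind * BACKWARD_FRACTION)
    return max(requested, -dist_to_surface * NO_SURFACE_BACKWARD)
```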
During use, a user places the cursor over a target and clicks on the forward tool wedge. At this point, as depicted in
The up/down tool 126 (see
When the cursor is over the up/down tool on the wheel the tool tip text is: Move up or down.
A bounding box of the model (plus a small amount of extra padding, 10-15%) is used to determine absolute high and low Y-axis values. The up/down tool is constrained by these values and cannot push the view beyond them in either direction.
To avoid sudden jumps in the camera view when the up/down tool is invoked, the following rules apply. If the camera is already above the bounding box high mark when the user first clicks and drags the mouse, then the system uses the current camera position as the high mark. If the camera is already below the bounding box low mark when the user first clicks and drags the mouse, then the system uses the current camera position as the low mark.
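These limits and rules can be expressed compactly as follows; the sketch assumes a 10% padding value, one choice within the 10-15% range stated above, and the function name is illustrative.

```python
# Sketch of the up/down range computation: the model's bounding box plus
# padding sets the absolute high/low Y limits, and a camera already outside
# that range becomes the limit itself so the view does not jump.

def updown_limits(bbox_min_y, bbox_max_y, camera_y, padding=0.10):
    """Return the (low, high) Y marks that constrain the up/down tool."""
    pad = (bbox_max_y - bbox_min_y) * padding
    low, high = bbox_min_y - pad, bbox_max_y + pad
    # Rules to avoid sudden view jumps when the tool is first invoked:
    if camera_y > high:
        high = camera_y   # camera already above the box: use it as the high mark
    if camera_y < low:
        low = camera_y    # camera already below the box: use it as the low mark
    return low, high
```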
When the up/down tool is active the display (see
During use of the up/down tool, a vertical mouse/cursor drag upward (relative to the view) will push the view up towards the high or top marker. A mouse drag downward does the opposite.
The look tool on the tour wheel has movement keyboard keys disabled, whereas on the full navigation wheel, as discussed below, they are enabled. The look tool 124 located in the top half of the middle circle on the tour building wheel 120 (see
To look left or right, a horizontal drag or X-axis mouse movement (relative to the view) causes the view to rotate to the left or right. To look up or down, a vertical drag or Y-axis mouse movement causes the view to rotate up or down. The up and down looking is optionally constrained so that the view never flips upside down. The camera or view can be rotated until it looks nearly straight up or down in the tour wheel, but not beyond. When the Look tool is active the cursor is changed to the look cursor 702 as shown in
While the mouse is held down the look tool graphical tool description element is displayed below the cursor on the canvas with the text: Look Tool.
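By way of a hedged sketch, the look mapping might be implemented as below; the sensitivity constant and the pitch clamp margin are assumed values, not parameters taken from the embodiment.

```python
# Illustrative sketch of the look mapping: horizontal drag turns the view
# left/right (yaw) and vertical drag tilts it up/down (pitch), with pitch
# clamped so the view can approach but never pass straight up or down.

import math

SENSITIVITY = 0.005               # radians of rotation per pixel of drag (assumed)
PITCH_LIMIT = math.pi / 2 - 0.01  # just short of straight up/down (assumed margin)

def look(yaw, pitch, dx, dy):
    """Update view yaw/pitch (radians) from a mouse drag of (dx, dy) pixels."""
    yaw += dx * SENSITIVITY
    # Screen y grows downward, so dragging up (dy < 0) tilts the view upward.
    pitch = max(-PITCH_LIMIT, min(PITCH_LIMIT, pitch - dy * SENSITIVITY))
    return yaw, pitch
```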
As previously discussed, the full navigation wheel 130 (see
The walk tool 132 allows users to interactively walk about in 3D space, moving forward and backwards and/or turning freely to the left or right. The walk is preferably along a horizontal axis or plane of the scene. The walk tool also allows shortcut key access to the up/down tool to adjust the view up or down while walking. The walk tool is preferably located in the top right interior wedge on the full navigation wheel. The tool tip text is: Click and drag in the direction you want to walk.
Clicking the left mouse button with the cursor over the walk tool activates the walk tool. When the mouse is first clicked on the walk tool the cursor changes to a default walk icon or cursor 3002 as depicted in
The direction of the cursor from the center target or zone 3102 determines the walk direction. Up is forward, down is backwards, left is turn left, right is turn right, and diagonal combines the two closest headings. The distance of the cursor from the center target 3102 determines the speed of walking. The closer to the center is slower, farther away is faster. A default speed is set based on scene space parameters, such as the size of the bounding box, and can be changed by the user. Placing the cursor over the center circle area 3102 will stop camera or view movement. At any time the user may release the mouse button to end walking and redisplay the wheel.
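A sketch of this cursor-to-motion mapping follows. The dead-zone radius and the speed scale are illustrative parameters; the embodiment derives its default speed from scene space parameters as noted above.

```python
# Sketch of the walk mapping: the cursor's offset from the center zone gives
# the heading (up = forward, left/right = turn, diagonals blend the two) and
# its distance from the center gives the speed.

import math

DEAD_ZONE = 12.0     # pixels: inside this radius the camera does not move (assumed)
SPEED_SCALE = 0.02   # speed gain per pixel beyond the dead zone (assumed)

def walk_input(cursor, center, base_speed=1.0):
    """Return (heading_angle, speed) for the current cursor position."""
    dx, dy = cursor[0] - center[0], cursor[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist <= DEAD_ZONE:
        return 0.0, 0.0                    # inside the center zone: stop moving
    heading = math.atan2(dx, -dy)          # 0 = forward, +/- pi/2 = turn right/left
    speed = base_speed * (dist - DEAD_ZONE) * SPEED_SCALE
    return heading, speed
```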
While walking the cursor is changed to an arrow (a direction indication cursor) 3202 that indicates the current move direction and rotation. In
There is a default cursor 3302 for when the camera is not moving and the cursor is hovering inside the center circle or zone as shown on the far left of
While the walk tool is active the following text is displayed as a HUD element on the canvas: Hold down Shift key to move up or down. This HUD text does not follow the cursor movement, and is instead fixed to the lower center of the canvas area.
While the walk tool is active the user may press and hold a hot key such as the Shift key at any time to temporarily activate the Up/Down tool and move up or down. The walk tool is suspended while the Shift key is held and the up/down tool is active and operates as previously discussed. When Shift is pressed, the cursor location (and therefore the walking speed and heading) is temporarily stored. When the Shift key is released the cursor is placed back at its previous location and the walk operation continues on from where it left off. A walk tool option allows users to adjust the walking movement direction based on their current camera looking direction instead of the default ground plane. Thus, it can be used in combination with the look tool to move around in 3D space.
It is possible and likely that users will activate the walk tool for long periods of time (30 seconds to a minute, etc.). In order to capture some of these long walk events, the system defines multiple rewind events during a single mouse drag walk event. Preferably, this is the only tool that allows multiple rewind events to be defined. By default, a walk rewind event is preferably recorded every 5 seconds. Consequently, a thumbnail is added to the rewind thumbnail strip (history) for each event. If it is not possible to create thumbnails during a walk (due to performance issues or if it is too noticeable during the walk), it may be possible to create the thumbnails on the mouseUp event or, as a less favorable solution, to use the same thumbnail for all of the walk events belonging to a given long walk drag history. To increase walk speed (increase movement rate) the user presses CTRL+Shift+>. To decrease walk speed (decrease movement rate) the user presses CTRL+Shift+<. Alternative hotkeys may be used.
The look tool 136 on the full navigation wheel 130 allows the user to move the view while looking by pressing the arrow keys on the keyboard. The tool is located on the bottom left interior wedge on the full navigation wheel 130 (see
The look tool can also have an additional mode. When the user holds down the Shift key while the tool is active, the system performs a “look at” operation. Similar to the Center tool, the green ball is positioned and can be dragged along the model surface. The surface beneath the green ball is highlighted. Once the key is released, the view animates to move the green ball to the center of the view, roughly keeping the zoom factor the same but orienting the view to be looking “head-on” at the surface where the green ball resides (e.g., the surface normal is pointing directly at the user/camera).
Shortcut key inputs from the keyboard are provided to extend the look tool for users of the full navigation tool who have keyboard access. These keys cause the camera to move forward or backward, and left or right. These movement commands are available while the look tool is active (left mouse down). The forward key command causes the camera or view to move forwards (dolly in) based on the camera's current heading. For example, if the user is looking straight up and presses this key they will fly up into the air like Superman. If they are looking straight ahead they will fly directly ahead. The camera movement rate for the keys is determined by a walk speed factor that can be set by the user. The move backwards key command causes the camera to move backwards (dolly out) based on the camera's current heading. The move left or sidestep left key command causes the camera to move to the left as if the user were panning horizontally. The move right or sidestep right key command causes the camera view to move to the right as if the user were panning horizontally. No up or down movement key is provided.
Both the walk tool and the look tool take into consideration the scales of the model space in which they are operating. The speed to move through an engine block is very different than what will work to move past the Empire State Building. A base movement rate value is automatically calculated, based on the model's bounding box when the model is loaded. Both look and walk use the same base setting. Increase or decrease speed commands can modify this base move rate setting to fit the user's needs, and these ± modifications can be stored for the duration of the session.
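One plausible form of this calculation is sketched below, assuming the base rate is taken as a fixed fraction of the bounding box diagonal; the 1% fraction is an assumed heuristic, not a value specified by the embodiment.

```python
# Sketch of the automatic base movement rate: derived from the model's
# bounding box on load, so walking past an engine block and walking past a
# skyscraper both feel reasonable. User +/- speed commands scale the base.

import math

def base_move_rate(bbox_min, bbox_max, fraction=0.01):
    """Base speed as an assumed fraction of the bounding box diagonal."""
    diagonal = math.dist(bbox_min, bbox_max)
    return diagonal * fraction

rate = base_move_rate((0, 0, 0), (10, 4, 2))  # small model -> slow movement
rate *= 2   # user issued the "increase walk speed" command (doubles the rate)
```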
When the look tool is active the cursor is changed to the look cursor 702 as depicted in
The orbit tool 114 appears on both the view object wheel 110 and the full navigation wheel 130. The version on the full wheel uses the same mouse control scheme to orbit the view about a pivot point, but this tool has a more dynamically defined pivot point that is not always directly in the center of the view. The tool is located on the left side wedge of the full wheel 130 (see
If the user activates CTRL+Click or CTRL+Click and drags on a model in the view with the orbit tool, a new pivot point is defined about which to orbit. Subsequent orbits will use this pivot until the user defines a new one, by using another tool such as zoom, by another CTRL+click on the orbit tool, or by navigating the view with another tool so that the pivot location ends up off screen.
By default the Keep Scene Upright option is on. If it is turned off, then the orbit tool behaves in screen space (not model space). When “Keep Scene Upright” is off, a Shift key modifier is enabled that, on mouse-down, presents a graphical ring HUD (see
When a user makes a selection, the pivot point is moved to the center of the bounding box of the current selection. Subsequent orbit operations will use this point unless it is moved again by another selection or other navigation actions (e.g., pan operation, etc.).
Pan operation of the full wheel is preferably a common pan operation, except that the clicked point in the view will stay fixed to the cursor rather than flying off too fast or lagging too slow.
For the full navigation wheel the zoom tool is not constrained, and can zoom towards or away from any point visible in the canvas. A single click to zoom in (left mouse button click and release) behaves like the single click behavior for the view wheel zoom tool, with the following exceptions. The target point is set based on where the user placed the cursor over the model, not the center of the canvas. The user is allowed to click anywhere on the canvas to set the zoom target point, and zoom will zoom into this location. This behavior may be turned off. The single click to zoom out (Shift+left mouse button click and release) zooms out away from the clicked point on the canvas. The target point is set based on where the user placed the cursor over the model, not the center of the canvas. The user is allowed to click on “nothing” and can still zoom. This zoom tool also supports the invert mouse zoom option setting, where the drag directions for zooming in and out are inverted. Whenever the zoom tool is active on the full wheel, the target point is indicated as a green sphere with a text label and is the same HUD graphic as for the Center tool shown earlier. The mouse scroll wheel will zoom in to the last defined pivot/focus point in the same manner as the CTRL+Zoom shown above.
The mini wheels (see
Showing the standard arrow cursor within a mini wheel or pie cursor can be visually heavy and confusing. Preferably no cursor is shown and the mini wheel visually acts as the cursor. However, there may be times when it is useful to have a visible cursor. In such a situation, a smaller dot cursor (what can be called a flea cursor) can be provided as depicted in
All of the wheels can share a particular key as a shortcut key. Pressing this shortcut can bring up the last used wheel for 3D, or just bring up the 2D wheel for 2D content. Tapping the shortcut key when a wheel is already up will bring the wheel down and activate the last tool (e.g., Select Tool). CTRL+Shift or another key can be used as a shortcut key or hot key to activate/deactivate the wheels. Activating a wheel should occur “transparently” from the current tool. For example, if a first tool is active, a user can hit Ctrl+Shift to activate the view object wheel, navigate, hit Ctrl+Shift again to bring down the wheel, and be placed back into the first tool to continue working.
The flow of this shortcut key process 3500 is depicted in
The center tool operation is depicted in
When the green ball or center sphere is used, it can be moved about in the model space. As depicted in
The orbit operation, except as described above, has a conventional flow and so is not described herein.
For the tools that allow cursor wrapping, this operation, as depicted in
The zoom operation starts with the user initiating 3902 the zoom tool as depicted in
During tool operation, the rewind history is saved and this operation 4002 is shown in
The movement history uses a linked list data structure in which each entry 4020 includes view location, orientation, etc.
During operation of the forward tool, when the user presses 4202 the mouse button, the system determines 4204 whether the cursor is over the model as depicted in
The up/down tool flow is activated 4302 and when the user moves 4304 the cursor, the camera view is moved 4306 along the up vector (a vector vertical with respect to the model) based on the current position as shown in
As shown in
The pan operation (see
The mini-wheel activation (see
Perspective and orthographic projections differ in that orthographic projection has no sense of depth. Thus, when using only a view direction to determine the perspective and orthographic frustums for a scene, a great disparity can emerge between the two after several camera operations have been carried out. Depicted in
The standard pinhole perspective camera model has been augmented to include the distance to the geometry in the scene. This is updated whenever the camera position changes. That is, the user performs a camera move operation 4702, the camera parameters are updated 4704 and the distance to the object is updated 4706. We then use this extra information to scale the orthographic frustum such that the size of the geometry as it is viewed in orthographic projection mimics the size of the geometry as it is viewed in perspective projection. In doing this, a determination 4708 is made as to whether the mode is orthographic or perspective. If perspective, the perspective frustum is calculated 4710. If orthographic, the distance is used to calculate 4712 the frustum. After the frustum is determined, the view is rendered 4714.
This distance value is used when calculating 4712 the orthographic frustum; the parameters of orthographic width and height are calculated from the pinhole camera model as follows:
ORTHO_W = d × tan(f_hor)
ORTHO_H = d × tan(f_ver)
where d is the distance to the geometry in the scene, and f_hor and f_ver are the horizontal and vertical fields of view, respectively.
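This calculation can be illustrated directly. Treating f_hor and f_ver as half-angle fields of view is an assumption consistent with the tangent formulas above; a system storing full angles would halve them first.

```python
# Sketch of the frustum-matching computation: the orthographic extents are
# scaled by the distance to the scene geometry so that geometry appears the
# same size in orthographic projection as it does in perspective projection.

import math

def ortho_frustum(distance, f_hor, f_ver):
    """Orthographic half-width/half-height matching a perspective view at the
    given distance to the scene geometry (angles in radians, half-angles)."""
    return distance * math.tan(f_hor), distance * math.tan(f_ver)

# Geometry 10 units away with 45-degree half-angles -> 10 x 10 ortho extents
print(ortho_frustum(10.0, math.radians(45), math.radians(45)))
```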
The approximate distance to the object can be calculated based on the distance of the perspective camera's eye to the center point. In doing this the position of the center point is updated throughout all camera operations and generally lies on scene geometry. This is an effective and computationally inexpensive way of approximating distance to the geometry in the scene. In addition, the magnitude of the view direction vector is transparent to the perspective projection model. However, as this is an approximation, it does not work in all cases.
Another approach is to calculate the actual distance to the geometry view for every frame. While this ensures the calculation of the orthographic frustum is accurate, it is computationally expensive. Carrying out such calculations on a graphics processing unit (GPU) helps reduce the load on a central processing unit (CPU) but can result in slower than real-time interaction for complex geometry.
The typical hardware 4800 for the system discussed herein, as depicted in
The embodiments of the present invention discussed provide a safe 3D navigation environment by clustering and caching tools; creating task and skill-based tool sets; providing orientation awareness; enhancing tool feedback; providing pre-canned navigation; preventing errors; and recovering from errors. Rapid access and use are provided by the tracking menu and click-through features. The circular design of the wheels, with important action tools on the exterior and less used tools in the interior, organizes the tools so they are easy to access in an order that fits the tasks. The tool selection dialogs that allow new and advanced users to select a navigation tool make users quickly aware of what is available and how each navigation tool is used, and allow immediate selection. The grouping of commonly used tools by task, limited to the tools for each type of task, helps prevent users from becoming confused during navigation. The customization of the action tools for the type of task also prevents confusion and errors during navigation. The sphere provides visual feedback to orient users, and the limitations on placement of the sphere prevent navigation errors. The sliders provide location feedback for dragging operations. Labels and icons reinforce the operations of the tools being used. Constraints on navigation, such as limiting motion to the bounding box or to a rear wall, along with cursor control like the cursor screen wrapping, keep users from getting into visually confusing states. Error recovery using rewind allows users to return to a familiar visual state.
When any of the 3D wheels are selected and the user then navigates to a 2D sheet, the 2D navigation wheel can automatically become the selected tool. When the 2D navigation tool is the selected tool and the user navigates to a 3D model, the last used 3D wheel will become the selected tool. If the “pinned” option is enabled, the 3D wheel is shown pinned to the canvas. If a non-compatible section type is displayed in the canvas, the wheel(s) are dismissed and the Select Tool is activated instead.
Depending on the hotkey being used, it may be preferable to offer spring-loaded activation of the wheel. That is, only when the hotkey is pressed down, the wheel is visible and a user can interact with it. Once the hotkey is released, the wheel goes away. Again, this is an optional offering depending on the available hotkey and ease of articulation.
Alternative, and less ideal, systems may be built that offer the collection and sets of navigation tools not using the Tracking Menu design but instead using standard tools on a toolbar as well as potentially offering hotkey access to the individual tools. So, for example, a 3D navigation tool palette may offer: Forward, Look, Rewind, and Up/Down tools as one set and Center, Zoom, Rewind and Orbit as another set.
Additionally, the shapes of the Tracking Menus may be altered (from circles to other shapes such as ovals or squares) and the tool regions shapes may be reshaped and their proportions changed. The positioning of the tools may also be rearranged (e.g., the Orbit wedge on the left side of the wheel could be moved to the top wedge).
Other elements such as the “green ball” may have different colors or shapes (e.g., square, pyramid or cross), different orientation demarcations (e.g., tick marks instead of axis belts) and different transparency dynamics.
Note also that the help text in the HUDs and tooltips changes based on the language settings of the computer (e.g., French, Chinese, etc.).
The Rewind tool can have different visual representations for each waypoint and be presented to the user in different styles (such as a line with beads on it or timestamps, or a stack of thumbnails or a stack of graphical objects) and different orientations (e.g., vertical instead of horizontal or in 3D). The Rewind mechanism can be attached to different input devices such as the mouse scrollwheel. Additional information can be saved in the Rewind queue beyond camera position and orientation such as user settings, background color and styles, environment settings, rendering modes (e.g., line or shaded), camera options such as orthographic or perspective projection, etc. When Rewinding occurs, the previous state information is installed. In some cases, interpolation of application state information can occur between two waypoints. For example, if the user switched between perspective and orthographic projection, these two projection modes would be blended when rewinding between the two waypoint events.
In addition to the thumbnail strip, supplementary information can be presented such as user or system annotations or indicating which tools were active during the navigation history or other system activity. The Rewind thumbnail strip and graphics may be drawn semi-transparently.
Note that for the Rewind tool, it is possible to save the exact movement path the user performed (instead of just the last position and orientation when the mouse is released). We call this “high resolution”. We have found that this is mostly undesired and requires a lot of information to be saved; when we prototyped it, playback was too frenzied when someone dragged through the rewind history. Nevertheless, it may be useful in some applications, in which case a modifier key can be hit to toggle between using low or high resolution rewind information.
Also navigation events may be created by the system (not explicitly by the user). These can be added to the Rewind history.
Note that some events may be compressed or combined. For example, navigation events in which the view did not change, or in which the view consisted of empty space, are candidates for being dropped from the Rewind history or compressed into a single event.
The perspective sliders can be used in other contexts where, for example, it is useful to indicate 3D depth information. Note that an interesting property of the perspective slider is that it can take less visual space than a 2D slider since it is drawn in perspective.
The perspective slider does not have to relate to positional information about the 3D scene. For example, it can adjust volume settings. The point is that it is drawn in perspective and the perspective may not even match the current perspective in the 3D scene. Also, it could be used in a 2D canvas or 2D application.
The slider can be drawn such that it becomes more transparent the deeper it is in the scene. Thus, the “Start” end of the slider would be more opaque and the “Surface” end would be more transparent.
The perspective slider can be drawn in many different visual ways using different metaphors such as a measurement ruler, or a road/driveway.
User or system annotation labels (text, graphics, iconic symbols) can be added on or near the slider to provide information and context.
Additional semantic controls can be provided to go up one floor or down one floor, or other units (e.g., one meter, one foot, etc.), instead of up/down one page or screenful of information as on typical scrollbars.
Input snapping at interesting points along the slider (e.g., at each floor or tick mark indicating distance) may also be added for additional user control.
The up/down slider may also have additional graphics, annotations and tick marks on it to indicate application specific information or sizing/scale information. For example, tick marks may indicate floors in a building or where important elements are in a building (such as pipes, electrical junctions, etc.). Drag snapping can occur at designated points along the slider for precise movement. In some cases, it may be useful to enable the tool without drawing the graphical slider. The slider may also be drawn semi-transparently. The preferred behavior is for the slider to initially appear near the current cursor position, but it may be useful to have it come up in other locations such as the center of the 3D canvas.
Pre and Post actions may be conducted while using the slider. For example, temporarily leveling the camera such that it is parallel to the ground plane before and after actions are performed while using the Up/Down slider.
During mouse-dragging input for the Up/Down tool, the mouse movement can be mapped to a non-linear function to move the camera. For example, an exponential function (which we use) has the property that initial mouse movements move the view in small amounts (for precise adjustments) and, as the drag motion gets longer, larger view movements are generated. This design is handy if the user is viewing a condominium building and wants to move up/down either in small distances (e.g., within a floor) or large distances (multiple floors at a time).
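A sketch of one such mapping follows; the particular exponential form and its constants are illustrative assumptions rather than the embodiment's tuned function.

```python
# Illustrative sketch of the non-linear drag mapping: an exponential curve
# keeps short drags precise while long drags cover multiple floors at once.

import math

def updown_displacement(drag_pixels, unit=0.01, growth=0.02):
    """Map a signed vertical drag (pixels) to camera Y displacement.
    The unit and growth constants are assumed, not tuned values."""
    sign = 1.0 if drag_pixels >= 0 else -1.0
    # Small drags move ~linearly; long drags grow exponentially.
    return sign * unit * (math.exp(growth * abs(drag_pixels)) - 1.0)

print(updown_displacement(10))    # short drag: ~0.002 units (fine adjustment)
print(updown_displacement(300))   # long drag: ~4 units (multiple floors)
```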
The Walk tool may deploy intelligent sensing to automatically determine a constant height above the current “ground plane” or horizontal surface the camera is currently over. Smoothing algorithms would make the transition to different heights more visually pleasing. Additional sensing could be used to avoid objects or walls. In addition, note that application specific or user customized HUD messages and graphic annotations can be triggered and displayed depending on the user's current location, direction and speed of travel. Moreover, to ensure smooth traveling, the system may deploy “level of detail” adjustments in the graphics rendering of the scene, drawing fewer polygons and performing less rendering in dense scenes to approximate the visuals and keep the refresh redrawing at a high rate. Hotkeys may be used to increase or decrease the speed factor of the walk movement.
It is possible to have the walk left/right action and the up/down action be assigned to different parameters (e.g., navigation directions). For example, left/right may pan the user left/right instead of the default look left/right. It is also possible to have the up/down mouse direction map to the up/down direction instead of the in/out direction. This yields a rate-based pan operation that behaves directionally like the standard Pan operation but is rate based.
As a convenience, if the user is walking using the Walk tool and performs cursor-wrapping, moving the cursor into the dead zone will cause the movement to stop and reset the speed to zero. Thus, the user does not have to disengage the Walk tool (by releasing the mouse), or “unwrap” the cursor to pause and reset movement.
The method of cursor-wrapping can be activated while the user input is in the tracking state (e.g., moving a mouse device with no buttons pressed) or during a drag state (e.g. moving a mouse device with the left mouse button pressed). In most scenarios, the cursor-wrapping is active when a tool is active (such as dragging while orbiting).
With cursor-wrapping, the concepts are the same for a view displaying a mixture of 2D and 3D or purely 2D content. When the cursor reaches the out-of-view position, an audio signal or visual signal can be displayed near the cursor location. For example, a chirp sound may be heard and the cursor could fade out, then fade in on the other side. Alternatively, as an example, a small animation could be played showing a puff of smoke near the cursor location. Note that it is possible to display a portion of the cursor (and auxiliary graphics that are currently moving with the cursor—such as a tool/cursor label or status information) at the out-of-view position and the remaining portion at the new wrapped-around position. A number of variations are possible: clip the cursor and auxiliary graphics; re-format or reposition the auxiliary graphics to remain visible until the cursor completes its wrap; always keep the cursor displayed whole but allow auxiliary graphics to be split.
The cursor-wrapping boundary may be dynamically defined to be not the current view boundary but the boundary of the currently selected object or set of objects in the scene. In addition, the cursor-wrapping boundary could be dynamically determined to be the bounds of the object it currently is over. Both of these cases are useful in that it keeps the cursor near where the user is working and auxiliary information that is moving with the cursor (such as dimensioning, status info, or state information) can remain nearby.
For the Tap Activation functionality, some embodiments will only implement the quick tap toggle to activate/deactivate a tool and not the spring-loaded tap to activate a tool. In the quick tap situation, the trigger will often require that “no mouse buttons are pressed” as this is typically reserved for modifying the current selection.
If spring-loaded mode is supported, then mouse button presses are processed. However, the trigger key is already in use, so if the tool requires this key, it is unavailable. That is why we like less used combinations such as Ctrl+Shift or the Alt key. The First Contact pop-up mechanism is anchored to some visual element (e.g., a tool icon button in a toolbar or a widget in the 2D/3D scene). In our case we put a grayed-out 2D graphic on top of the 3D scene.
An important behavioral design to capture is that the FirstContact graphic will dismiss itself when the user moves the cursor outside of the bounds of the graphic. Moving the cursor over sub-elements of the graphic may change the graphic and switch default settings for the tool if the user chooses to activate the tool by clicking with the mouse. Thus, the user customizes the tool while learning about the tool or collection of tools.
It is important to emphasize that the user can switch between options by clicking on sub-element graphics without activating the tool. These graphics are used to select different sub-groups. We have found that an important grouping is by user expertise. Moreover, users are very bad at self-assessment and classifying themselves as a novice or expert. Thus, we strategically do not use these terms and instead use terms such as “Familiar with 3D” and “New to 3D”.
In one style of FirstContact, the FirstContact graphics and tool-choosing options are shown only until the user activates the tool for the first time. This reduces information overload and clutter. The next time the user hovers over the anchor element, the FirstContact will not be presented. This state information is preserved in the user preferences, and thus when the application is re-launched, the FirstContact exposure state is respected.
A mechanism also exists to re-activate FirstContact on every application start-up, for users who want the FirstContact graphics to always be shown.
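Persisting the exposure state could be as simple as a per-tool flag in the user preferences. The sketch below assumes a hypothetical JSON preferences file; the path and key names are illustrative:

```python
import json
import os

PREFS_PATH = os.path.expanduser("~/.app_prefs.json")  # hypothetical location

def load_first_contact_state() -> dict:
    """Return the per-tool FirstContact exposure flags, if any."""
    try:
        with open(PREFS_PATH) as f:
            return json.load(f)
    except (OSError, json.JSONDecodeError):
        return {}

def mark_tool_activated(prefs: dict, tool_id: str,
                        always_show: bool = False) -> dict:
    """After first activation, suppress FirstContact for that tool unless
    the user opted to re-activate it on every start-up."""
    prefs[tool_id] = {"show_first_contact": always_show}
    with open(PREFS_PATH, "w") as f:
        json.dump(prefs, f)
    return prefs
```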
The “green ball” navigation sphere has some important characteristics. The sphere becomes more transparent as the user gets closer during a camera movement operation (such as zoom), until the sphere is completely transparent.
It is also important to note that the size of the sphere resets after any camera movement operation. When the sphere is interactively moved within the 3D scene, its size changes depending on its depth in the scene. By default, the sphere does not cast a 3D shadow, but it could if desired. A texture could be added to the sphere to alter its appearance.
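For instance, the distance-based fade might be computed as a linear ramp between two cutoff distances; both the linear shape and the cutoffs are assumptions, as is the particular depth-scaling rule:

```python
def sphere_alpha(camera_distance: float, fade_start: float,
                 fade_end: float) -> float:
    """Linear fade: fully opaque beyond fade_start, fully transparent at or
    inside fade_end."""
    if camera_distance >= fade_start:
        return 1.0
    if camera_distance <= fade_end:
        return 0.0
    return (camera_distance - fade_end) / (fade_start - fade_end)

def sphere_radius(base_radius: float, depth: float,
                  reference_depth: float) -> float:
    """One plausible depth-dependent sizing: scale the sphere with its depth
    so it reads consistently in the scene. Per the description, the caller
    would reset the radius to base_radius after any camera movement."""
    return base_radius * (depth / reference_depth)
```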
We display the focus point sphere on tool activation (i.e., mouse down) and remove it on tool deactivation.
Alternatively, one may choose to show/hide the sphere when the tool is selected from the toolbar, in addition to when the tool is being used (e.g., while the mouse button is pressed). The sphere could change color when the tool is being used (i.e., mouse down).
The sphere could change shape or color, or be annotated with graphics or icons, to indicate state or mode information. For example, within the Orbit tool, the sphere could indicate whether dimensional constraints are active (e.g., only allowing orbit in the x dimension).
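One way to organize such state-dependent appearance is a small mapping from tool state to display properties; the colors, labels, and state names here are all illustrative assumptions:

```python
from enum import Enum

class ToolState(Enum):
    SELECTED = "selected"   # tool chosen in the toolbar
    ACTIVE = "active"       # mouse button down, tool in use

def sphere_appearance(state: ToolState, constrained_axes: tuple = ()) -> dict:
    """Map tool state to sphere display properties: a color change while the
    tool is in use, and an annotation when an axis constraint is active."""
    color = (0.0, 1.0, 0.3) if state is ToolState.ACTIVE else (0.0, 0.8, 0.0)
    annotation = None
    if constrained_axes:
        annotation = "orbit constrained: " + ",".join(constrained_axes)
    return {"color": color, "annotation": annotation}
```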
In general, modifier keys can be used with all of the tools to add functionality.
The many features and advantages of the embodiments are apparent from the detailed specification and, thus, it is intended by the appended claims to cover all such features and advantages of the embodiments that fall within the true spirit and scope thereof. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the inventive embodiments to the exact construction and operation illustrated and described, and accordingly all suitable modifications and equivalents may be resorted to, falling within the scope thereof.
This application is related to and claims priority to U.S. provisional application entitled “Steering Wheels”, having Ser. No. 60/975,366, by Fitzmaurice et al., filed Sep. 26, 2007 and incorporated by reference herein. This application is also related to and claims priority to U.S. provisional application entitled “Steering Wheels”, having Ser. No. 61/022,640, by Fitzmaurice et al., filed Jan. 22, 2008 and incorporated by reference herein.