Gesture based interaction with traffic data

Information

  • Patent Grant
  • Patent Number
    9,046,924
  • Date Filed
    Tuesday, September 14, 2010
  • Date Issued
    Tuesday, June 2, 2015
Abstract
Gesture based interaction with traffic data is disclosed. A virtual broadcast presentation may be generated based on dynamic information such as traffic information, weather information, or other information that may be featured on a virtual broadcast presentation. A gesture made by a user is detected and processed to determine an input command associated with the detected gesture. The virtual broadcast presentation may be manipulated based on the input command.
Description
FIELD OF THE INVENTION

The present invention generally relates to broadcast presentation technology. More particularly, the present invention concerns the use of a gesture to interact with traffic information and other related data presented during a virtual broadcast presentation.


BACKGROUND OF THE INVENTION

Existing broadcast presentations generally include a variety of maps, images, and animations that display current or forecasted conditions for reference by a presenter (e.g., a news reporter) during a broadcast presentation such as a traffic or weather report. The broadcast presentation is often produced prior to a scheduled broadcast for presentation by a traffic or weather reporter in a fixed arrangement (much like a slide show) with a pre-rehearsed script. Although the presenter has the ability to control the speed and manner in which the broadcast presentation is presented to a viewing audience, the content in the maps and images remains fixed. That is, the content presented during the broadcast presentation is not real-time and may be outdated. The reporting of outdated information (e.g., traffic or weather information) may have a drastic effect on a viewing audience who may rely on the reported information to make decisions about such things as travel or logistics.


Another shortcoming of existing broadcast technology is the lack of interaction with the content of the broadcast presentation. Since the presentation contains pre-determined content, a presenter is unable to directly interact with or manipulate the maps and images of the presentation. The presenter cannot, for example, retrieve real-time conditions or other information associated with the maps or images of the presentation.


As such, there is a need in the art for gesture based interaction with traffic data and other related data.


SUMMARY OF THE INVENTION

Embodiments of the present invention allow a presenter to interact, in real-time and using one or more gestures, with traffic data and other related data presented on a display screen or other suitable display.


In a first claimed embodiment, a method for gesture based interaction with traffic data is claimed. Through the method, a virtual broadcast presentation is generated based on traffic data received from one or more information sources. A gesture made by a user interacting with the virtual broadcast presentation is processed, the gesture being associated with an input command for manipulating the virtual broadcast presentation. The input command associated with a detected gesture is determined and the virtual broadcast presentation is manipulated based on the input command.


In a second claimed embodiment, a system for gesture based interaction with traffic data is claimed. The system includes at least a communications module, a presentation rendering module, and a gesture recognition module, each module stored in memory and executable by a processor. Execution of the communications module by the processor detects a gesture made by a user interacting with a virtual broadcast presentation. The gesture is associated with an input command for manipulating the virtual broadcast presentation. Execution of the gesture recognition module processes, recognizes, and/or interprets a gesture and determines whether the gesture corresponds to a known input command. Execution of the presentation rendering module by the processor generates the virtual broadcast presentation based on traffic data received from one or more information sources, and manipulates the virtual broadcast presentation based on the input command.


In a third claimed embodiment, a non-transitory computer-readable storage medium is claimed. The storage medium includes a computer program that is executable by a processor to perform a method for gesture based interaction with traffic data. A gesture made by a user interacting with the virtual broadcast presentation is processed, the gesture being associated with an input command for manipulating the virtual broadcast presentation. The input command associated with a detected gesture is then determined and the virtual broadcast presentation is manipulated based on the input command.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time.



FIG. 2 illustrates the virtual broadcast presentation engine of FIG. 1.



FIGS. 3A-3B illustrate a virtual broadcast presentation displayed on a display screen.



FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation.



FIGS. 5A-5B illustrate a virtual broadcast presentation in ‘trip time’ mode.



FIGS. 6A-6B illustrate a traffic camera appearing within a virtual broadcast presentation.



FIG. 7 is a flowchart illustrating a method for gesture based interaction with traffic data presented in a virtual broadcast presentation.





DETAILED DESCRIPTION

The present invention provides for the use of a gesture to interact with traffic information and other related data during a virtual broadcast presentation. The virtual broadcast presentation may include maps, images, graphics, animations, multimedia overlays, and the like, that are rendered in a two-dimensional or three-dimensional manner on a display screen such as a liquid crystal display (LCD) screen or computer monitor screen. A presenter may refer to the virtual broadcast presentation in real-time and may manipulate a view of the presentation using a gesture. The presenter may also use a gesture to select an interactive element included in the presentation.



FIG. 1 illustrates a block diagram of an environment for the broadcast of a virtual broadcast presentation that a user may interact with and reference in real-time. The environment 100 of FIG. 1 includes a computing device 110 having a virtual broadcast presentation engine 115, a gesture recognition module 120, and a database 125. The computing device 110 of FIG. 1 is communicatively coupled to information sources 130, a display screen 140, and a broadcast system 150. While FIG. 1 illustrates one particular environment 100 including certain elements for the broadcast of a virtual presentation, alternative embodiments may be implemented that utilize elements differing from those disclosed in FIG. 1 (or combinations of the same), but that otherwise fall within the scope and spirit of the present invention.


The computing device 110 and the virtual broadcast presentation engine 115 may generate a composite presentation that includes a virtual broadcast presentation. The virtual broadcast presentation may be two-dimensional or three-dimensional. The composite presentation may be generated using information obtained in real-time (or near real-time) from the information sources 130 as described in further detail below. The virtual broadcast presentation engine 115, in particular, is discussed with respect to FIG. 2. The computing device 110 may include various components such as one or more of communications interfaces, a processor, memory, storage, and any number of buses providing communication therebetween (not shown). The processor may execute instructions implemented through computing modules or engines while the memory and storage may both permanently or temporarily store data including the aforementioned modules and engines.


The computing device 110 of FIG. 1 includes a gesture recognition module 120 which is stored in the memory of computing device 110 and is executable by a processor to process, recognize, and/or interpret a gesture. The gesture recognition module 120 may employ any technique known in the art for processing and detecting a gesture such as computer vision or image processing. In one embodiment, the gesture recognition module 120 may identify a received gesture by comparing the gesture to a plurality of previously identified gestures stored in database 125.
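By way of illustration only, such a comparison could be implemented as a nearest-neighbor match between a feature vector extracted from the detected gesture and stored gesture templates. The feature representation, gesture labels, and distance threshold below are assumptions for the sketch, not details of the disclosed embodiment.

```python
import math

# Hypothetical gesture templates as might be stored in database 125:
# each entry maps a gesture label to a small feature vector
# (e.g., normalized displacement of a tracked hand over the gesture).
GESTURE_TEMPLATES = {
    "swipe_left":  [-1.0, 0.0],
    "swipe_right": [1.0, 0.0],
    "point_down":  [0.0, -1.0],
}

def identify_gesture(features, templates=GESTURE_TEMPLATES, threshold=0.5):
    """Return the label of the closest stored gesture, or None if no
    template is within the (assumed) distance threshold."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = math.dist(features, template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

print(identify_gesture([0.1, -0.9]))  # -> "point_down"
```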


A gesture may be static or dynamic and may include any form of movement, motion, or other non-verbal communication or form of expression made by the body of a user. A gesture, for example, may include a static pose or configuration, or may include movement or motion of the limbs (e.g., hands, legs) or face (e.g., eyes, lips) of the user. In one embodiment, each detected gesture may be associated with an input command for directing the computing device 110 to perform a particular action or task such as manipulating a view of the virtual broadcast presentation. For example, detection of vertical downward movement of a pointed index finger may correspond to an input command for scrolling down a virtual broadcast presentation. Gesture recognition module 120 may process the detected gesture and determine whether the gesture corresponds to a known input command.
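For instance, the association between recognized gestures and input commands might be represented as a simple lookup table, as in the sketch below; the gesture and command names are hypothetical.

```python
# Hypothetical mapping from recognized gesture labels to input commands
# for manipulating the virtual broadcast presentation.
GESTURE_TO_COMMAND = {
    "point_down":  "scroll_down",
    "swipe_left":  "pan_left",
    "swipe_right": "pan_right",
    "pinch_out":   "zoom_in",
    "pinch_in":    "zoom_out",
}

def command_for_gesture(gesture_label):
    """Look up the input command associated with a detected gesture,
    returning None for gestures with no known command."""
    return GESTURE_TO_COMMAND.get(gesture_label)

print(command_for_gesture("point_down"))  # -> "scroll_down"
```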


In another embodiment, a gesture may be associated with a voice command (e.g., a spoken word or phrase). The computing device 110 may include a microphone for voice detection and a voice recognition module for processing the detected voice command (not shown). The gesture recognition module 120 may match a detected gesture to a detected voice command and determine whether the gesture and voice command are associated with a particular input command. The computing device 110 may perform a specific action or task based on the detection of a gesture associated with a particular voice command and input command.


To capture or detect a gesture or any other movement or pose made by a user, the computing device 110 may be communicatively coupled to or include one or more cameras known in the art such as depth-aware cameras, stereo cameras, single cameras, or any other suitable cameras. The computing device 110 may also include one or more sensors known in the art for gesture detection or motion sensing such as an optical sensor (e.g., infrared or laser), an image sensor, an accelerometer, a gyroscopic component, a magnetometer, and the like. Processing of a detected gesture or image, for example, may include enhancement of the image and extraction and grouping of image features. The sensors may take various measurements and a gesture may be identified using pattern recognition, neural network algorithms, or any other technique known in the art.


The computing device 110 may also include a database 125 for storing a plurality of acceptable gestures each associated with a particular input command and/or voice command. The database 125 may be integrated with or separate from the computing device 110. In one embodiment, a user may customize or update the database 125 by adding or removing acceptable gestures, voice commands, and input commands. The detection of a specific gesture and/or voice command associated with a particular input command causes the computing device (i.e., virtual broadcast presentation engine 115) to perform some action such as manipulating the virtual broadcast presentation based on the input command.
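A minimal, in-memory stand-in for such a customizable store of acceptable gestures, optional voice commands, and associated input commands might look like the following; the class and entries are assumptions for illustration.

```python
class GestureCommandRegistry:
    """Illustrative stand-in for database 125: maps an accepted gesture
    (optionally paired with a voice command) to an input command."""

    def __init__(self):
        self._entries = {}  # (gesture, voice) -> input command

    def add(self, gesture, input_command, voice=None):
        """Add or update an acceptable gesture/voice pair."""
        self._entries[(gesture, voice)] = input_command

    def remove(self, gesture, voice=None):
        """Remove a gesture/voice pair, if present."""
        self._entries.pop((gesture, voice), None)

    def lookup(self, gesture, voice=None):
        """Return the associated input command, or None if unknown."""
        return self._entries.get((gesture, voice))

registry = GestureCommandRegistry()
registry.add("wave_right", "pan_right")
registry.add("point", "select_element", voice="select")
print(registry.lookup("point", voice="select"))  # -> "select_element"
```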


The information sources 130 may be provided by various organizations and in a variety of forms. The information sources 130 may include data sources related to traffic data, such as traffic flow information as described in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, or weather data such as forecasts. The information sources 130 may also include data sources related to newsworthy events or incidents such as school closings, election results, and other information that may be featured in a virtual broadcast presentation. The information sources 130 may require subscription or authentication for access and may be accessible via Telnet, FTP, or web services protocols. The information may be received from the information sources 130 in real-time or near real-time to allow for generation of an equally real-time or near real-time presentation. That presentation may, in turn, be manipulated in real-time.
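As a rough sketch of receiving dynamic information over a web services protocol, the snippet below polls a hypothetical JSON feed; the URL, response schema, and field names are assumptions and do not describe any actual service.

```python
import json
import urllib.request

# Hypothetical traffic-data web service endpoint (illustration only).
TRAFFIC_FEED_URL = "https://example.com/traffic/incidents.json"

def fetch_traffic_data(url=TRAFFIC_FEED_URL, timeout=5):
    """Fetch and decode one snapshot of dynamic traffic information."""
    with urllib.request.urlopen(url, timeout=timeout) as response:
        return json.load(response)

# Example usage (assumed response shape):
# snapshot = fetch_traffic_data()
# for incident in snapshot.get("incidents", []):
#     print(incident["road"], incident["description"])
```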


In an embodiment of the present invention utilizing traffic data specific to the San Francisco Bay area, the information sources 130 may include one or more of the 511.org system (a collaboration of public agencies including the California Highway Patrol, Metropolitan Transportation Commission, and CALTRANS), the California Highway Patrol (CHP) World Wide Web server, the PeMS system at the University of California at Berkeley, various public event listings, or a publicly or privately accessible user input mechanism. For weather data, the information sources 130 may include the National Weather Service among other weather information sources. Other data sources or alternative types of data sources (e.g., non-traffic and non-weather related sources) may be incorporated and utilized in various embodiments of the present invention.


The display screen 140 may be any media display screen known in the art such as an LCD screen, television screen, computer monitor screen, projection screen, multi-touch touch screen, or any other suitable display. A presenter may interact with the display screen 140 using a gesture.


The broadcast system 150 disseminates the composite presentation to viewers. Dissemination may occur via radio waves such as UHF or VHF, cable, satellite, or the World Wide Web. Hardware and software necessary to effectuate a broadcast may be included in the broadcast system 150 and are generally known to those skilled in the broadcasting art. Broadcast images are also commonly disseminated to the public via cable transmission, Internet Protocol television (IPTV) transmission, and digital broadcasts such as using Digital Video Broadcasting (DVB) and the like.



FIG. 2 illustrates the virtual broadcast presentation engine of FIG. 1. The virtual broadcast presentation engine 115 of FIG. 2 includes a communications module 210, a presentation rendering module 220, a selection module 230, a feedback module 240, and a trip calculation module 250. The virtual broadcast presentation engine 115 and its constituent modules may be stored in memory and executed by a processing device to effectuate the functionality corresponding thereto. The virtual broadcast presentation engine 115 may be composed of more or fewer modules (or combinations of the same) and still fall within the scope of the present invention. For example, the functionality of the selection module 230 and the functionality of the feedback module 240 may be combined into a single module.


Execution of the communications module 210 allows for receipt of a gesture detected or captured by a sensor or camera of the computing device 110. Execution of the communications module 210 may also allow for a user selection such as the selection by a presenter of an interactive element displayed within the virtual broadcast presentation. Execution of the communications module 210 may additionally allow for a user selection of other components included within the virtual broadcast presentation such as various soft keys with different functionalities. Execution of the communications module 210 may also allow for receipt of dynamic information from the information sources 130. This dynamic information may be used by other modules for generating, manipulating, and interacting with the virtual broadcast presentation.


Referring again to FIG. 2, execution of the presentation rendering module 220 allows for the generation of a virtual broadcast presentation based on the dynamic information received through execution of the communications module 210. The dynamic information may include traffic information, weather information, newsworthy events or incidents, election results, school closings, or other information that may be featured in a virtual broadcast presentation.


Execution of the presentation rendering module 220 may also allow for manipulation of a view of the virtual broadcast presentation in response to the gesture received by the communications module 210. Manipulating the view of the presentation may include one or more of panning across, rotating, tilting, or zooming in/out of the virtual broadcast presentation. A gesture associated with a particular input command may be assigned to a specific manipulation of the virtual broadcast presentation. For example, a gesture of waving a hand from left to right may adjust the view to pan across the map or presentation. As another example, a gesture corresponding to the interaction technique of “pinching” may be associated with an input command to zoom in/out of a current viewpoint of a presentation. A presenter may manipulate the presentation by gesturing and either spreading two fingers apart or bringing two fingers close together. As another example, a gesture may be associated with an input command to select a soft key displayed within the virtual broadcast presentation. Selection of one soft key, for example, may affect zoom speed, while selection of a different soft key may affect zoom direction.
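The mapping from input commands to view manipulations could be sketched as follows; the View fields, step sizes, and command names are illustrative assumptions rather than details of the presentation rendering module 220.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class View:
    """Simplified camera state for the presentation view."""
    center_x: float = 0.0
    center_y: float = 0.0
    zoom: float = 1.0
    tilt_deg: float = 0.0
    heading_deg: float = 0.0

def apply_command(view, command, amount=1.0):
    """Return a new view manipulated according to an input command."""
    if command == "pan_right":
        return replace(view, center_x=view.center_x + amount / view.zoom)
    if command == "pan_left":
        return replace(view, center_x=view.center_x - amount / view.zoom)
    if command == "zoom_in":   # e.g., a spreading "pinch" gesture
        return replace(view, zoom=view.zoom * (1.0 + 0.1 * amount))
    if command == "zoom_out":  # e.g., a closing "pinch" gesture
        return replace(view, zoom=view.zoom / (1.0 + 0.1 * amount))
    if command == "tilt":
        return replace(view, tilt_deg=view.tilt_deg + 5.0 * amount)
    if command == "rotate":
        return replace(view, heading_deg=(view.heading_deg + 15.0 * amount) % 360)
    return view  # unknown commands leave the view unchanged

v = apply_command(View(), "zoom_in", amount=2.0)
print(v.zoom)  # -> 1.2
```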


Execution of the selection module 230 allows for selection of an interactive element included in the virtual broadcast presentation in response to the received gesture. A gesture may be associated with an input command to select an interactive element displayed within the virtual broadcast presentation. An interactive element may include a soft key displayed within the virtual broadcast presentation. The interactive element may also represent a traffic alert. For example, if road construction is taking place at a given intersection of two streets, an icon indicative of road construction may be placed in the virtual broadcast presentation at a position that corresponds to that given intersection. Execution of the selection module 230 may also select the interactive element when the interactive element is positioned near the center of the virtual broadcast presentation.
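A minimal check of the kind described, treating an interactive element as selectable when it lies near the center of the presentation, might be sketched as follows; the coordinate convention and radius are assumptions.

```python
import math

def element_selectable(element_xy, view_center_xy, radius=0.05):
    """An interactive element is treated as selectable when it lies
    within an (assumed) radius of the presentation center."""
    return math.dist(element_xy, view_center_xy) <= radius

print(element_selectable((0.02, 0.01), (0.0, 0.0)))  # -> True
```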


Selecting an interactive element may cause one of a variety of responses from the virtual broadcast presentation. For example, selection of the interactive element may cause additional information related to the interactive element to be displayed within the virtual broadcast presentation. In one embodiment, the interactive element may correspond to a traffic camera wherein selection of the interactive element causes a live camera view to appear within the virtual broadcast presentation.


Execution of the feedback module 240 provides feedback to the presenter to inform the presenter that a given interactive element is selectable. For example, the interactive element may be selectable in certain regions of the virtual broadcast presentation, such as the center. When the interactive element enters or leaves the center of the virtual broadcast presentation, the presenter may be informed via feedback. The feedback may include highlighting of the interactive element. To avoid distracting or otherwise undesirable imagery, such as a cursor being included in the virtual broadcast presentation, non-visual feedback, such as an audible tone, may be invoked. The interactive element may also be selectable by gesturing to hover a pointer over the virtual broadcast presentation and applying a selection gesture when the location of the pointer coincides with that of the interactive element.
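One way to sketch this feedback behavior is to trigger highlighting or an audible cue only when an element crosses into or out of the selectable region; the callback names below are hypothetical.

```python
def feedback_on_transition(was_selectable, is_selectable, highlight, play_tone):
    """Invoke feedback callbacks only when an element enters or leaves
    the selectable region, so the presenter is informed of the change."""
    if is_selectable and not was_selectable:
        highlight(True)
        play_tone("enter")   # non-visual cue, e.g., an audible tone
    elif was_selectable and not is_selectable:
        highlight(False)
        play_tone("leave")

feedback_on_transition(
    False, True,
    highlight=lambda on: print("highlight:", on),
    play_tone=lambda which: print("tone:", which),
)
```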


Execution of the feedback module 240 also provides feedback to the presenter that a given interactive element has been successfully selected. For example, if the presenter has selected a particular interactive element, the feedback module 240 may highlight the interactive element, change the color or appearance of the interactive element, or cause the interactive element to blink or flash continually. Such feedback confirms the selection of the interactive element and prevents the presenter from selecting the same interactive element multiple times.


Execution of the trip calculation module 250 may allow for the determination or calculation of an estimated amount of time (e.g., ‘trip time’) needed to travel from a selected location to another location. In one embodiment, the presenter may select a first interactive element displayed in the virtual broadcast presentation wherein the first interactive element corresponds to a starting point or location. The presenter may then select a second interactive element displayed in the presentation that corresponds to a desired end point or destination location. An interactive element or starting/end point may include a particular street, road, landmark or point of interest, highway, neighborhood, town, city, area, region or the like. The trip calculation module 250 may calculate or forecast the estimated amount of time required to traverse the real world distance from the first selected interactive element to the second interactive element in real-time considering, at least in part, information from the information sources 130. When calculating a trip time, the trip calculation module 250, for example, may consider the actual distance from the starting point to the end point, as well as various conditions affecting travel, including current weather conditions or traffic conditions such as a recent accident or road closure. In another embodiment, trip calculation module 250 may be used to calculate an estimated travel distance between two selected locations. Execution of trip calculation module 250 may occur following the selection of a ‘mode key’ as discussed further in FIG. 3A below.
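As an illustrative sketch of such a calculation, the snippet below sums per-segment travel times from real-time speeds and adds assumed delays for conditions such as a recent accident; the segment lengths, speeds, and delay values are made up for the example.

```python
# Route segments between the selected starting point and end point:
# (length in miles, current average speed in mph).
SEGMENTS = [
    (3.0, 55.0),
    (2.5, 20.0),   # congested segment
    (4.0, 60.0),
]
INCIDENT_DELAYS_MIN = [4.0]  # e.g., an assumed delay from a recent accident

def estimate_trip_time_minutes(segments, incident_delays_min=()):
    """Estimate trip time as driving time at current speeds plus delays."""
    driving = sum(length / speed * 60.0 for length, speed in segments)
    return driving + sum(incident_delays_min)

print(round(estimate_trip_time_minutes(SEGMENTS, INCIDENT_DELAYS_MIN)))  # ~19
```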


Execution of the virtual broadcast presentation engine 115 may output the virtual broadcast presentation to other components of the computing device 110 for generation of the composite presentation. Accordingly, the computing device 110 may output the composite presentation to the broadcast system 150 for dissemination to viewers.



FIG. 3A illustrates a virtual broadcast presentation 300 displayed on a display screen 140. The presentation 300 of FIG. 3A includes traffic information. The principles described herein with respect to traffic are equally applicable to embodiments of the present invention that include weather information, newsworthy events or incidents, school closings, election results, or other information that may be featured in a virtual broadcast presentation. Presentation 300 may be generated and manipulated by execution of the presentation rendering module 220 in real-time. Presentation 300 may include satellite images of a given area with an animated road traffic report. A detailed description of animated road traffic reports may be found in U.S. patent application Ser. No. 11/302,418, now U.S. Pat. No. 7,221,287, the disclosure of which is incorporated by reference.


Satellite images may be manipulated by execution of the presentation rendering module 220 to aid in generating three-dimensional information. For example, two-dimensional satellite images may be processed in the context of other geographical information (e.g., topographical information) to generate a three-dimensional satellite image that reflects information along an x-, y-, and z-axis as illustrated in presentation 300. The textured three-dimensional representation of landscape of a particular urban area aligns with and provides the three-dimensional coordinates for the roadways that may be animated and overlaid on the satellite images.
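A toy sketch of lifting two-dimensional map coordinates into three dimensions using topographical information might sample an elevation grid as below; the grid values and nearest-cell sampling are assumptions.

```python
# Toy elevation grid (z values) standing in for topographical information.
ELEVATION = [
    [10.0, 12.0, 15.0],
    [11.0, 14.0, 18.0],
    [13.0, 16.0, 21.0],
]

def to_3d(x, y, elevation=ELEVATION):
    """Return an (x, y, z) coordinate by sampling the nearest grid cell."""
    row = min(max(int(round(y)), 0), len(elevation) - 1)
    col = min(max(int(round(x)), 0), len(elevation[0]) - 1)
    return (x, y, elevation[row][col])

print(to_3d(1.2, 2.0))  # -> (1.2, 2.0, 16.0)
```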


The presentation 300 may also include a variety of markers (310A-310C) to identify or label various locations, landmarks, or points of interest appearing in the presentation 300 such as exit ramps, highways, named sections of highways, or city streets. These markers may be readily or universally recognizable, such as a highway marker resembling a California state highway sign with the appropriate highway number. The presentation 300 may also include markers or icons corresponding to the location of traffic incidents, road construction, and traffic cameras. Some or all of these markers 310C may be interactive elements of the virtual broadcast presentation 300 and show real-time conditions, such as an average traffic speed associated with a particular location. An interactive element may include any marker, icon, label, object, or image appearing in the presentation 300 that may be associated with real-time content or data. An interactive element, for example, may include a street, road, bridge, highway, landmark, point of interest, traffic incident or alert, road construction, or traffic camera.



FIG. 4 illustrates an interactive element appearing in a virtual broadcast presentation 300 displayed on display screen 140. A presenter 305 may select an interactive element using a gesture. Any gesture corresponding to an input command for selection of an interactive element may be used. Pointing to an interactive element with one hand, for example, may cause selection of the interactive element. In one embodiment, an interactive element 410 (i.e., a traffic incident) may be marked by a particular icon, image, or symbol (e.g., an arrow pointing to the location of the traffic incident), as shown in FIG. 4. When an interactive element is selected, additional information related to that interactive element may be displayed. In one embodiment, an interactive element marking a traffic incident may be selected, resulting in detailed textual information describing the traffic incident being displayed within the presentation 300 (not shown).


Returning to FIG. 3A, presentation 300 may include images of vehicles 315 appearing along a specific roadway or highway. A vehicle 315 may be animated, for example, to show the speed and direction of traffic along a particular highway. The presentation 300 may also use color coding to demonstrate real-time traffic conditions. Color coding may help a viewer of the presentation 300 to quickly understand real-time traffic conditions associated with a depicted map or location. The presentation 300 may include a legend 320 describing various objects or color representations used in the presentation 300. A ‘green’ colored section of a road, street, or highway, for example, may represent that real-time traffic is moving at a speed of 50 miles per hour or higher (e.g., normal or optimal conditions). A ‘yellow’ colored highway may represent traffic speeds of 25 miles per hour or higher (e.g., delayed conditions), while a ‘red’ colored highway may represent traffic speeds that are less than 25 miles per hour (e.g., slow or impacted conditions).
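The color coding described above maps directly to speed thresholds, as in this minimal sketch (the function name is illustrative):

```python
def traffic_color(speed_mph):
    """Map a real-time average speed to the color coding described above."""
    if speed_mph >= 50:
        return "green"    # normal or optimal conditions
    if speed_mph >= 25:
        return "yellow"   # delayed conditions
    return "red"          # slow or impacted conditions

for speed in (62, 38, 12):
    print(speed, traffic_color(speed))
```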


In one embodiment, the presentation 300 may display one or more soft keys with various functionalities such as orientation key 325, tilt key 330, rotation key 335, synchronization key 340, previous and next presentation display keys 345A-345B, and mode key 350. In another embodiment, soft keys may be absent from the presentation 300 where the functionality of a soft key is activated by a specific gesture alone.


The presenter 305 may select a soft key or activate the functionality of a soft key to facilitate or enhance the understanding of the content of presentation 300. For example, the presenter 305 may use the tilt key 330 to adjust or modify a view or perspective of the presentation 300 vertically or horizontally. The presenter 305 may also change the perspective of the presentation 300 by selecting the rotation key 335. Changing the perspective of the presentation 300 may alter the orientation of the presentation such that a ‘north’ direction of a map or image is not oriented at the top of the display screen 140. As such, the presenter 305 may select the orientation key 325 to return the ‘north’ direction to the top of the display screen 140. In one embodiment, the presenter 305 may select a soft key (e.g., tilt key 330 or rotation key 335) with one gesture (i.e., using one hand) while using another gesture (i.e., the other hand) to activate the functionality of the soft key (e.g., move or adjust the presentation 300 in the desired direction).


The presentation 300 may also include a synchronization key 340. The presentation 300 may be generated based on information received in real-time or near real-time through execution of the communications module 210. The presenter 305 may select the synchronization key 340 to cause the synchronization of data in real time such that the presentation 300 reflects the most current information and conditions. In one embodiment, synchronization of data may be done automatically. In another embodiment, the presenter 305 or another user may program or instruct the computing device 110 to synchronize data at regular time periods (e.g., every 10 seconds, every minute, every two minutes, etc.).
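A rough sketch of synchronizing data automatically at a regular, user-programmable interval could use a background timer loop as below; the interval default and function names are assumptions.

```python
import threading

def start_periodic_sync(sync_fn, interval_s=60.0):
    """Call sync_fn immediately and then every interval_s seconds until
    the returned event is set."""
    stop = threading.Event()

    def loop():
        while not stop.is_set():
            sync_fn()
            stop.wait(interval_s)

    threading.Thread(target=loop, daemon=True).start()
    return stop

# Example usage:
# stop_event = start_periodic_sync(lambda: print("synchronizing data"), 10)
# ... later: stop_event.set()
```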


A presenter 305 may zoom in or out of the presentation 300 by selecting keys corresponding to a particular view of the presentation, such as a previous key 345A and a next key 345B. For example, the previous key 345A may revert to a previously displayed, zoomed-out view, while the next key 345B may advance to a view that zooms in from the current view. The previous and next keys may, alternatively, be assigned zoom in or zoom out functionality. A presenter 305 may select a particular key (345A, 345B) multiple times to further zoom in or out of the current view. In one embodiment, the keys (345A, 345B) may be used to display or shift to a different image or map within the presentation 300.


The presentation 300 may also include a mode key 350. A presenter 305 may operate the presentation 300 in different modes such as ‘trip time mode’ or ‘navigation mode.’ A presenter 305 may switch between various modes by selecting the mode key 350. A presenter 305 may use ‘navigation mode’ to view the presentation 300 as described in FIG. 3B below. ‘Trip time mode’ is discussed in further detail in FIGS. 5A and 5B below.



FIG. 3B illustrates the virtual broadcast presentation 300 of FIG. 3A following manipulation by the presenter 305. The presenter 305 may manipulate the presentation 300 in ‘navigation mode’ to review or illustrate real-time traffic conditions (e.g., average traffic speeds, traffic incidents, etc.) associated with various locations depicted in the presentation 300. A view of the presentation 300 may be manipulated to give the effect of ‘flying’ or scrolling through the three-dimensional virtual representation of the traffic map and images. As the presenter 305 scrolls through the presentation 300, various interactive elements may be highlighted and/or become available for selection. FIG. 3B illustrates the presentation 300 of FIG. 3A following the presenter 305 making a gesture that is associated with an input command to scroll through the presentation 300. As a result, the presentation 300 (as shown in FIG. 3B) shows a different portion of the presentation 300 (i.e., the intersection of highways 287 and 107) and the associated traffic conditions (e.g., traffic speeds).


The presenter 305 may also manipulate the presentation 300 by panning, tilting, and/or rotating the view. For example, as the presenter 305 makes a gesture corresponding to an input command to scroll through the presentation 300, the gesture is detected through execution of the communications module 210 and processed by the computing device 110. In turn, the presentation rendering module 220 may be executed to move or rotate the presentation 300 a corresponding amount or in proportion to the gesture of the presenter 305. The correspondence of the presentation 300 to the gesture of the presenter 305 gives the presenter 305 the sensation of directly controlling the presentation 300. Such manipulation of the view may also be used in selecting interactive elements. For example, if a particular interactive element may be selected only when near the center of the presentation 300, the presenter 305 may cause the view to be manipulated such that the particular interactive element is centered and therefore selectable.
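The proportional correspondence between a gesture and the resulting movement of the presentation could be sketched as a simple gain applied to the tracked hand displacement; the pixel units and gain value are assumptions.

```python
def pan_from_gesture(hand_dx_px, hand_dy_px, gain=0.01):
    """Scale raw hand displacement (in pixels) into a proportional pan of
    the presentation, giving the sensation of direct control."""
    return (hand_dx_px * gain, hand_dy_px * gain)

print(pan_from_gesture(120, -40))  # -> (1.2, -0.4)
```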



FIGS. 5A-5B illustrate a virtual broadcast presentation in ‘trip time mode.’ The presenter 305 may activate trip time mode by making a gesture associated with an input command to activate trip time mode. In another embodiment, the presenter 305 may make a gesture to select the mode key 350 to activate trip time mode. Once trip time mode has been activated, the presenter 305 may make a gesture associated with an input command to select an interactive element corresponding to a first location or starting point displayed within the virtual broadcast presentation 300 on the display screen 140. As shown in FIG. 5A, the presenter 305 has selected or designated “83rd Ave” as a starting point. Following selection of the first location, a display 355A may appear confirming the selection of the presenter 305.


The presenter 305 may then select another interactive element corresponding to a second location or end point or travel destination by making a gesture associated with an input command to select a second interactive element within the presentation 300. As shown in FIG. 5B, the presenter 305 has selected or designated “1st Ave” as an end point. Following selection of the second interactive element, the trip calculation module 250 may calculate the estimated amount of time required to traverse the real world distance from the first selected interactive element (i.e., “83rd Ave”) to the second interactive element (i.e., “1st Ave”) in real-time considering, at least in part, information from information sources 130. The trip calculation module 250 may consider various conditions affecting travel such as weather conditions or traffic conditions such as a recent accident, a road closure, special events (e.g., holidays, sporting events, festivals, concerts, graduations, etc.) or any other delay. A display 355B may then show the estimated trip time (i.e., “28 minutes”), as well as any condition affecting travel within presentation 300 on the display screen 140. The display 355B may also show the route (i.e., highway “25”) associated with the calculated trip time.


Besides calculating the estimated trip time in real-time, the trip calculation module 250 may calculate or forecast the estimated trip time based on a time of day and/or date (i.e., a special day or occasion) designated by the presenter 305. For example, the presenter 305 may want to determine the estimated trip time at 9:00 AM (e.g., morning rush hour) or at 8:00 PM (e.g., a later evening hour). As another example, the presenter 305 may want to determine the estimated trip time when departing at a particular time on the Labor Day holiday or on a date when a sporting event, concert, or other large gathering is scheduled at a venue. In trip time mode, the presenter 305 may input the desired time of day and/or date and select a starting point and end point for trip time calculation. In another embodiment, trip time mode may also be used to calculate an estimated travel distance between two selected locations (not shown). The calculated estimated travel distance may also be displayed within the presentation 300.



FIGS. 6A-6B illustrate a traffic camera appearing within the virtual broadcast presentation 300 displayed on the display screen 140. In one embodiment, an interactive element appearing in the presentation 300 may include a traffic camera (610A, 610B). The presenter 305 may select a traffic camera 610A by making a gesture associated with an input command to select the traffic camera 610A (as shown in FIG. 6A). Following selection of a traffic camera 610A associated with a particular location, a live video feed 620 corresponding to the location of a real-world traffic camera may be displayed within the presentation 300 (as shown in FIG. 6B). The presenter 305 may then use the live video feed 620 to view actual traffic conditions associated with the real world location of traffic camera 610A.



FIG. 7 is a flow chart illustrating a method 700 for gesture based interaction with traffic data presented in a virtual broadcast presentation. The steps of method 700 may be performed in varying orders. Steps may be added or subtracted from the method 700 and still fall within the scope of the present invention. The steps of the process of FIG. 7 may be embodied in hardware or software including a non-transitory computer-readable storage medium comprising instructions executable by a processor of a computing device.


At step 710, a real-time, virtual broadcast presentation 300 is generated. The presentation 300 may be based on dynamic information and may be two-dimensional or three-dimensional. Execution of the presentation rendering module 220 may perform step 710. The dynamic information may include real-time traffic information or real-time weather information and be received from the information sources 130 in conjunction with execution of the communications module 210.


At step 720, a gesture made by a user may be detected by a sensor or camera of the computing device 110. The gesture may be associated with an input command for directing the computing device 110 to perform a particular action. For example, the input command may cause the computing device 110 to select an interactive element displayed within the presentation 300 or to manipulate a view of the presentation 300. Step 720 may be performed by execution of the communications module 210. Detection of the gesture at step 720 allows for processing of the gesture at step 730.


At step 730, information related to the detected gesture is processed. Execution of the gesture recognition module 120 may perform step 730. The gesture recognition module 120 may process, recognize, and/or interpret a gesture and determine whether the gesture is associated with a known input command stored in database 125.


At step 740, the presentation 300 is manipulated based on the input command identified at step 730. Execution of the presentation rendering module 220 may perform step 740. The presentation 300 may be processed, for example, to allow for real-time manipulation of the presentation 300 and various views thereof such as zooming in and out, scrolling, panning across, tilting, or rotating. The presentation 300 may also be manipulated based on a gesture associated with an input command for selection of a soft key or selection of a particular mode, such as navigation mode or trip time mode, displayed within the presentation 300. The presentation 300 may also be manipulated based on a gesture associated with an input command for the display of information associated with an interactive element selected by the presenter 305, such as information regarding a traffic incident, road closure, or average travel speeds.
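Steps 720 through 740 can be summarized in miniature as detect, recognize, resolve the input command, and manipulate the view; the stub callables in this sketch are hypothetical stand-ins for the modules described above.

```python
def handle_gesture_event(features, identify, command_for, apply_to_view, view):
    """Recognize the gesture and look up its input command (step 730),
    then manipulate the presentation view (step 740)."""
    gesture = identify(features)
    command = command_for(gesture)
    return apply_to_view(view, command) if command else view

new_view = handle_gesture_event(
    features=[0.0, -1.0],                      # detected at step 720
    identify=lambda f: "point_down",           # stand-in for gesture recognition
    command_for=lambda g: {"point_down": "scroll_down"}.get(g),
    apply_to_view=lambda v, c: {**v, "last_command": c},
    view={"zoom": 1.0},
)
print(new_view)  # -> {'zoom': 1.0, 'last_command': 'scroll_down'}
```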


Any number of additional and/or optional steps that are not otherwise depicted may be included in method 700. These steps may include selection of an interactive element included in the virtual broadcast presentation or feedback being provided to the presenter to inform the presenter that an interactive element is selectable.


It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the invention. Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASHEPROM, or any other memory chip or cartridge.


Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus may carry data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM may optionally be stored on a fixed disk either before or after execution by a CPU.


The foregoing detailed description of the technology herein has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the technology to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. The described embodiments were chosen in order to best explain the principles of the technology and its practical application to thereby enable others skilled in the art to best utilize the technology in various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the technology be defined by the claims appended hereto.

Claims
  • 1. A method for gesture based interaction with dynamic traffic data, the method comprising: generating a real time virtual broadcast presentation based on dynamic traffic data received from one or more information sources in real time; and executing instructions stored in memory, wherein execution of the instructions by a processor: processes information related to a first gesture performed by a user interacting with the virtual broadcast presentation, the first gesture detected by a sensor coupled to a computing device, wherein the first gesture is associated with a first input command that selects a soft key displayed within the virtual broadcast presentation, and wherein the first gesture is a movement of a first hand performed by the user in front of the sensor without touching the sensor or the computing device; processes information related to a second gesture performed by the user, the second gesture detected by the sensor during detection of the first gesture, wherein the second gesture is associated with a second input command that modifies a view of the virtual broadcast presentation by performing a functionality associated with the selected soft key, and wherein the second gesture is a movement of a second hand performed by the user in front of the sensor without touching the sensor or the computing device; determines the first and second input commands associated with the detected first and second gestures by comparing each of the detected first and second gestures to a plurality of previously identified gestures stored in a database, wherein the database is customizable by the user; and manipulates the real time virtual broadcast presentation based on the first and second input commands.
  • 2. The method of claim 1, wherein generating the real time virtual broadcast presentation is further based on dynamic weather data received from one or more information sources.
  • 3. The method of claim 1, wherein the real time virtual broadcast presentation is three-dimensional.
  • 4. The method of claim 1, wherein further execution of the instructions processes information related to a third gesture performed by the user, the third gesture detected by a sensor coupled to the computing device, wherein the third gesture is associated with a third input command.
  • 5. The method of claim 4, wherein the third input command selects an interactive element displayed within the virtual broadcast presentation.
  • 6. The method of claim 5, wherein the interactive element is a traffic alert.
  • 7. The method of claim 5, wherein the interactive element is a live camera showing a real time traffic condition associated with a real world location of the live camera.
  • 8. The method of claim 4, wherein the third input command includes calculating an estimated travel time between two user selected locations, the two user selected locations being interactive elements displayed within the real time virtual broadcast presentation, and wherein the first selected location corresponds to a first location designated as a starting point and the second selected location corresponds to a second location designated as an end point.
  • 9. The method of claim 8, wherein calculating the estimated travel time is based on a designated time of day.
  • 10. The method of claim 4, wherein the third input command includes calculating an estimated travel distance between two user selected locations, the two user selected locations being interactive elements displayed within the real time virtual broadcast presentation, and wherein the first selected location corresponds to a first location designated as a starting point and the second selected location corresponds to a second location designated as an end point.
  • 11. The method of claim 1, wherein manipulating the real time virtual broadcast presentation includes rotating or tilting a view of the real time virtual broadcast presentation.
  • 12. The method of claim 1, wherein manipulating the real time virtual broadcast presentation includes zooming in or zooming out of the real time virtual broadcast presentation.
  • 13. The method of claim 1, wherein manipulating the real time virtual broadcast presentation includes scrolling through the real time virtual broadcast presentation.
  • 14. The method of claim 1, wherein manipulating the real time virtual broadcast presentation includes displaying information regarding a user selected interactive element included in the real time virtual broadcast presentation.
  • 15. A system for gesture based interaction with dynamic traffic data, the system comprising: a processor; and memory that stores instructions executable by the processor to: detect a first gesture performed by a user interacting with a real time virtual broadcast presentation, the first gesture detected by a sensor coupled to the processor, wherein the first gesture is associated with a first input command that selects a soft key displayed within the virtual broadcast presentation, and wherein the first gesture is a movement of a first hand performed by the user in front of the sensor without touching the sensor; detect a second gesture performed by the user, the second gesture detected by the sensor during detection of the first gesture, wherein the second gesture is associated with a second input command that modifies a view of the virtual broadcast presentation by performing a functionality associated with the selected soft key, and wherein the second gesture is a movement of a second hand performed by the user in front of the sensor without touching the sensor; determine the first and second input commands associated with the detected first and second gestures by comparing each of the detected first and second gestures to a plurality of previously identified gestures stored in a database, wherein the database is customizable by the user; generate the real time virtual broadcast presentation based on dynamic traffic data received from one or more information sources in real time; and manipulate the real time virtual broadcast presentation based on the first and second input commands.
  • 16. The system of claim 15, wherein further execution of the instructions processes information related to a third gesture performed by the user, the third gesture detected by a sensor coupled to the processor, wherein the third gesture is associated with a third input command, and the third input command selects an interactive element included in the real time virtual broadcast presentation.
  • 17. The system of claim 16, further comprising instructions executable by the processor to inform the user that the interactive element is selectable.
  • 18. The system of claim 15, further comprising instructions executable by the processor to determine an estimated travel time or travel distance between two user selected locations.
  • 19. A non-transitory computer readable storage medium having a program embodied thereon, the program executable by a processor to perform a method for gesture based interaction with dynamic traffic data, the method comprising: generating a real time virtual broadcast presentation based on dynamic traffic data received from one or more information sources in real time; processing information related to a first gesture performed by a user interacting with the virtual broadcast presentation, the first gesture detected by a sensor coupled to a computing device, wherein the first gesture is associated with a first input command that selects a soft key displayed within the virtual broadcast presentation, and wherein the first gesture is a movement of a first hand performed by the user in front of the sensor without touching the sensor or the computing device; processing information related to a second gesture performed by the user, the second gesture detected by the sensor during detection of the first gesture, wherein the second gesture is associated with a second input command that modifies a view of the virtual broadcast presentation by performing a functionality associated with the selected soft key, and wherein the second gesture is a movement of a second hand performed by the user in front of the sensor without touching the sensor or the computing device; determining the first and second input commands associated with the detected first and second gestures by comparing each of the detected first and second gestures to a plurality of previously identified gestures stored in a database, wherein the database is customizable by the user; and manipulating the real time virtual broadcast presentation based on the first and second input commands.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part and claims the priority benefit of U.S. patent application Ser. No. 12/398,120, now U.S. Pat. No. 8,619,072, filed Mar. 4, 2009 and entitled “Controlling a Three-Dimensional Virtual Broadcast.” The disclosure of the aforementioned application is incorporated herein by reference.

US Referenced Citations (336)
Number Name Date Kind
4734863 Honey et al. Mar 1988 A
4788645 Zavoli et al. Nov 1988 A
4792803 Madnick et al. Dec 1988 A
4796191 Honey et al. Jan 1989 A
4878170 Zeevi Oct 1989 A
4914605 Longhmiller, Jr. et al. Apr 1990 A
4926343 Tsuruta et al. May 1990 A
5068656 Sutherland Nov 1991 A
5095532 Mardus Mar 1992 A
5126941 Gurmu et al. Jun 1992 A
5164904 Sumner Nov 1992 A
5173691 Sumner Dec 1992 A
5182555 Sumner Jan 1993 A
5220507 Kirson Jun 1993 A
5247439 Gurmu et al. Sep 1993 A
5262775 Tamai et al. Nov 1993 A
5276785 Mackinlay et al. Jan 1994 A
5283575 Kao et al. Feb 1994 A
5291412 Tamai et al. Mar 1994 A
5291413 Tamai et al. Mar 1994 A
5291414 Tamai et al. Mar 1994 A
5297028 Ishikawa Mar 1994 A
5297049 Gurmu et al. Mar 1994 A
5303159 Tamai et al. Apr 1994 A
5311195 Mathis et al. May 1994 A
5311434 Tamai May 1994 A
5339246 Kao Aug 1994 A
5343400 Ishikawa Aug 1994 A
5345382 Kao Sep 1994 A
5359529 Snider Oct 1994 A
5374933 Kao Dec 1994 A
5377113 Shibazaki et al. Dec 1994 A
5390123 Ishikawa Feb 1995 A
5394333 Kao Feb 1995 A
5402120 Fujii et al. Mar 1995 A
5414630 Oshizawa et al. May 1995 A
5428545 Maegawa et al. Jun 1995 A
5430655 Adachi Jul 1995 A
5440484 Kao Aug 1995 A
5465079 Bouchard et al. Nov 1995 A
5477220 Ishikawa Dec 1995 A
5485161 Vaughn Jan 1996 A
5488559 Seymour Jan 1996 A
5499182 Ousborne Mar 1996 A
5508931 Snider Apr 1996 A
5515283 Desai May 1996 A
5515284 Abe May 1996 A
5539645 Mandhyan et al. Jul 1996 A
5546107 Deretsky et al. Aug 1996 A
5548822 Yogo Aug 1996 A
5550538 Fujii et al. Aug 1996 A
5554845 Russell Sep 1996 A
5583972 Miller Dec 1996 A
5608635 Tamai Mar 1997 A
5610821 Gazis et al. Mar 1997 A
5689252 Ayanoglu et al. Nov 1997 A
5694534 White, Jr. et al. Dec 1997 A
5699056 Yoshida Dec 1997 A
5706503 Poppen et al. Jan 1998 A
5712788 Liaw et al. Jan 1998 A
5729458 Poppen Mar 1998 A
5731978 Tamai et al. Mar 1998 A
5742922 Kim Apr 1998 A
5751245 Janky et al. May 1998 A
5751246 Hertel May 1998 A
5757359 Morimoto et al. May 1998 A
5774827 Smith et al. Jun 1998 A
5818356 Schuessler Oct 1998 A
5822712 Olsson Oct 1998 A
5845227 Peterson Dec 1998 A
5850190 Wicks et al. Dec 1998 A
5862244 Kleiner et al. Jan 1999 A
5862509 Desai et al. Jan 1999 A
5864305 Rosenquist Jan 1999 A
5867110 Naito et al. Feb 1999 A
5893081 Poppen Apr 1999 A
5893898 Tanimoto Apr 1999 A
5898390 Oshizawa et al. Apr 1999 A
5902350 Tamai et al. May 1999 A
5904728 Tamai et al. May 1999 A
5908464 Kishigami et al. Jun 1999 A
5910177 Zuber Jun 1999 A
5911773 Mutsuga et al. Jun 1999 A
5912635 Oshizawa et al. Jun 1999 A
5916299 Poppen Jun 1999 A
5922042 Sekine et al. Jul 1999 A
5928307 Oshizawa et al. Jul 1999 A
5931888 Hiyokawa Aug 1999 A
5933100 Golding Aug 1999 A
5938720 Tamai Aug 1999 A
5948043 Mathis et al. Sep 1999 A
5978730 Poppen et al. Nov 1999 A
5982298 Lappenbusch et al. Nov 1999 A
5987381 Oshizawa et al. Nov 1999 A
5991687 Hale et al. Nov 1999 A
5999882 Simpson et al. Dec 1999 A
6009374 Urahashi Dec 1999 A
6011494 Watanabe et al. Jan 2000 A
6016485 Amakawa et al. Jan 2000 A
6021406 Kuznetsov Feb 2000 A
6038509 Poppen et al. Mar 2000 A
6058390 Liaw et al. May 2000 A
6064970 McMillan et al. May 2000 A
6091359 Geier Jul 2000 A
6091956 Hollenberg Jul 2000 A
6097399 Bhatt et al. Aug 2000 A
6111521 Mulder et al. Aug 2000 A
6144919 Ceylan et al. Nov 2000 A
6147626 Sakakibara Nov 2000 A
6150961 Alewine et al. Nov 2000 A
6161092 Latshaw et al. Dec 2000 A
6169552 Endo et al. Jan 2001 B1
6188956 Walters Feb 2001 B1
6209026 Ran et al. Mar 2001 B1
6222485 Walters et al. Apr 2001 B1
6226591 Okumura et al. May 2001 B1
6236933 Lang May 2001 B1
6253146 Hanson et al. Jun 2001 B1
6253154 Oshizawa et al. Jun 2001 B1
6256577 Granuke Jul 2001 B1
6259987 Ceylan et al. Jul 2001 B1
6282486 Bates et al. Aug 2001 B1
6282496 Chowdhary Aug 2001 B1
6292745 Robare et al. Sep 2001 B1
6295492 Lang et al. Sep 2001 B1
6297748 Lappenbusch et al. Oct 2001 B1
6298305 Kadaba et al. Oct 2001 B1
6317685 Kozak et al. Nov 2001 B1
6317686 Ran Nov 2001 B1
6335765 Daly et al. Jan 2002 B1
6353795 Ranjan Mar 2002 B1
6356836 Adolph Mar 2002 B1
6360165 Chowdhary Mar 2002 B1
6362778 Neher Mar 2002 B2
6415291 Bouve et al. Jul 2002 B2
6424910 Ohler et al. Jul 2002 B1
6442615 Nordenstam et al. Aug 2002 B1
6456931 Polidi et al. Sep 2002 B1
6456935 Ng Sep 2002 B1
6463400 Barkley-Yeung Oct 2002 B1
6466862 DeKock et al. Oct 2002 B1
6470268 Ashcraft et al. Oct 2002 B1
6473000 Secreet et al. Oct 2002 B1
6480783 Myr Nov 2002 B1
6504541 Liu et al. Jan 2003 B1
6529143 Mikkola et al. Mar 2003 B2
6532304 Liu et al. Mar 2003 B1
6539302 Bender et al. Mar 2003 B1
6542814 Polidi et al. Apr 2003 B2
6552656 Polidi et al. Apr 2003 B2
6556905 Mittelsteadt et al. Apr 2003 B1
6559865 Angwin May 2003 B1
6574548 DeKock et al. Jun 2003 B2
6584400 Beardsworth Jun 2003 B2
6594576 Fan et al. Jul 2003 B2
6598016 Zavoli et al. Jul 2003 B1
6600994 Polidi Jul 2003 B1
6603405 Smith Aug 2003 B2
6622086 Polidi Sep 2003 B2
6639550 Knockeart et al. Oct 2003 B2
6643581 Ooishi Nov 2003 B2
6650997 Funk Nov 2003 B2
6654681 Kiendl et al. Nov 2003 B1
6675085 Straub Jan 2004 B2
6681176 Funk et al. Jan 2004 B2
6687615 Krull et al. Feb 2004 B1
6700503 Masar et al. Mar 2004 B2
6710774 Kawasaki et al. Mar 2004 B1
6720889 Yamaki et al. Apr 2004 B2
6728605 Lash et al. Apr 2004 B2
6728628 Peterson Apr 2004 B2
6731940 Nagendran May 2004 B1
6735516 Manson May 2004 B1
6754833 Black et al. Jun 2004 B1
6785606 DeKock et al. Aug 2004 B2
6791472 Hoffberg Sep 2004 B1
6807483 Chao et al. Oct 2004 B1
6845316 Yates Jan 2005 B2
6859728 Sakamoto et al. Feb 2005 B2
6862524 Nagda et al. Mar 2005 B1
RE38724 Peterson Apr 2005 E
6885937 Sunranyi Apr 2005 B1
6901330 Krull et al. May 2005 B1
6914541 Zierden Jul 2005 B1
6922629 Yoshikawa et al. Jul 2005 B2
6931309 Phelan et al. Aug 2005 B2
6952643 Matsuoka et al. Oct 2005 B2
6965665 Fan et al. Nov 2005 B2
6983204 Knutson Jan 2006 B2
6987964 Obradovich et al. Jan 2006 B2
6989765 Gueziec Jan 2006 B2
6999873 Krull et al. Feb 2006 B1
7010583 Aizono et al. Mar 2006 B1
7062378 Krull et al. Jun 2006 B2
7069143 Peterson Jun 2006 B2
7103854 Fuchs et al. Sep 2006 B2
7161497 Gueziec Jan 2007 B2
7209828 Katou Apr 2007 B2
7221287 Gueziec May 2007 B2
7243134 Bruner et al. Jul 2007 B2
7343242 Breitenberger et al. Mar 2008 B2
7356392 Hubbard et al. Apr 2008 B2
7375649 Gueziec May 2008 B2
7424388 Sato Sep 2008 B2
7433676 Kobayashi et al. Oct 2008 B2
7440842 Vorona Oct 2008 B1
7486201 Kelly et al. Feb 2009 B2
7508321 Gueziec Mar 2009 B2
7557730 Gueziec Jul 2009 B2
7558674 Neilley et al. Jul 2009 B1
7603138 Zhang et al. Oct 2009 B2
7610145 Kantarjiev et al. Oct 2009 B2
7613564 Vorona Nov 2009 B2
7634352 Soulchin et al. Dec 2009 B2
7702452 Kantarjiev et al. Apr 2010 B2
7792642 Neilley et al. Sep 2010 B1
7880642 Gueziec Feb 2011 B2
7908076 Downs et al. Mar 2011 B2
7912627 Downs et al. Mar 2011 B2
8103443 Kantarjiev et al. Jan 2012 B2
8358222 Gueziec Jan 2013 B2
8428856 Tischer Apr 2013 B2
8531312 Gueziec Sep 2013 B2
8537033 Gueziec Sep 2013 B2
8564455 Gueziec Oct 2013 B2
8619072 Gueziec Dec 2013 B2
8660780 Kantarjiev Feb 2014 B2
8718910 Gueziec May 2014 B2
8725396 Gueziec May 2014 B2
8781718 Margulici Jul 2014 B2
8786464 Gueziec Jul 2014 B2
8825356 Vorona Sep 2014 B2
8958988 Gueziec Feb 2015 B2
8982116 Gueziec Mar 2015 B2
20010014848 Walgers et al. Aug 2001 A1
20010018628 Jenkins et al. Aug 2001 A1
20010026276 Sakamoto et al. Oct 2001 A1
20010033225 Razavi et al. Oct 2001 A1
20010047242 Ohta Nov 2001 A1
20020042819 Reichert et al. Apr 2002 A1
20020077748 Nakano Jun 2002 A1
20020152020 Seibel Oct 2002 A1
20020177947 Cayford Nov 2002 A1
20030046158 Kratky Mar 2003 A1
20030055558 Watanabe et al. Mar 2003 A1
20030109985 Kotzin Jun 2003 A1
20030135304 Sroub et al. Jul 2003 A1
20030151592 Ritter Aug 2003 A1
20030182052 DeLorme et al. Sep 2003 A1
20040034464 Yoshikawa et al. Feb 2004 A1
20040046759 Soulchin et al. Mar 2004 A1
20040049424 Murray et al. Mar 2004 A1
20040080624 Yuen Apr 2004 A1
20040107288 Menninger et al. Jun 2004 A1
20040143385 Smyth et al. Jul 2004 A1
20040166939 Leifer et al. Aug 2004 A1
20040225437 Endo et al. Nov 2004 A1
20040249568 Endo et al. Dec 2004 A1
20050021225 Kantarjiev et al. Jan 2005 A1
20050027436 Yoshikawa et al. Feb 2005 A1
20050143902 Soulchin et al. Jun 2005 A1
20050154505 Nakamura et al. Jul 2005 A1
20050212756 Marvit et al. Sep 2005 A1
20060122846 Burr et al. Jun 2006 A1
20060143959 Stehle et al. Jul 2006 A1
20060145892 Gueziec Jul 2006 A1
20060158330 Gueziec Jul 2006 A1
20060238521 Westerman et al. Oct 2006 A1
20060238617 Tamir Oct 2006 A1
20060284766 Gruchala et al. Dec 2006 A1
20070013551 Gueziec Jan 2007 A1
20070038362 Gueziec Feb 2007 A1
20070060384 Dohta Mar 2007 A1
20070066394 Ikeda et al. Mar 2007 A1
20070115252 Burgmans May 2007 A1
20070197217 Sutardja Aug 2007 A1
20070208495 Chapman et al. Sep 2007 A1
20070208496 Downs et al. Sep 2007 A1
20070211026 Ohta Sep 2007 A1
20070211027 Ohta Sep 2007 A1
20070222750 Ohta Sep 2007 A1
20070247291 Masuda et al. Oct 2007 A1
20070265766 Jung et al. Nov 2007 A1
20080071465 Chapman et al. Mar 2008 A1
20080084385 Ranta et al. Apr 2008 A1
20080096654 Mondesir et al. Apr 2008 A1
20080133120 Romanick Jun 2008 A1
20080248848 Rippy et al. Oct 2008 A1
20080255754 Pinto Oct 2008 A1
20080287189 Rabin Nov 2008 A1
20080297488 Operowsky et al. Dec 2008 A1
20090005965 Forstall et al. Jan 2009 A1
20090061971 Weitzner et al. Mar 2009 A1
20090066495 Newhouse et al. Mar 2009 A1
20090082950 Vorona Mar 2009 A1
20090096753 Lim Apr 2009 A1
20090112465 Weiss et al. Apr 2009 A1
20090118017 Perlman et al. May 2009 A1
20090118996 Kantarjiev et al. May 2009 A1
20090189979 Smyth Jul 2009 A1
20090192702 Bourne Jul 2009 A1
20100079306 Liu et al. Apr 2010 A1
20100094531 MacLeod Apr 2010 A1
20100100307 Kim Apr 2010 A1
20100145569 Bourque et al. Jun 2010 A1
20100145608 Kurtti et al. Jun 2010 A1
20100175006 Li Jul 2010 A1
20100198453 Dorogusker et al. Aug 2010 A1
20100225643 Gueziec Sep 2010 A1
20100305839 Wenzel Dec 2010 A1
20100312462 Gueziec Dec 2010 A1
20110037619 Ginsberg et al. Feb 2011 A1
20110106427 Kim et al. May 2011 A1
20110304447 Marumoto Dec 2011 A1
20120072096 Chapman et al. Mar 2012 A1
20120123667 Gueziec May 2012 A1
20120150422 Kantarjiev et al. Jun 2012 A1
20120150425 Chapman et al. Jun 2012 A1
20120158275 Huang et al. Jun 2012 A1
20120290202 Gueziec Nov 2012 A1
20120290204 Gueziec Nov 2012 A1
20120296559 Gueziec Nov 2012 A1
20130033385 Gueziec Feb 2013 A1
20130204514 Margulici Aug 2013 A1
20130207817 Gueziec Aug 2013 A1
20130211701 Baker et al. Aug 2013 A1
20140088871 Gueziec Mar 2014 A1
20140091950 Gueziec Apr 2014 A1
20140107923 Gueziec Apr 2014 A1
20140129124 Margulici May 2014 A1
20140129142 Kantarjiev May 2014 A1
20140139520 Gueziec May 2014 A1
20140236464 Gueziec Aug 2014 A1
20140249734 Gueziec Sep 2014 A1
20140316688 Margulici Oct 2014 A1
20140320315 Gueziec Oct 2014 A1
Foreign Referenced Citations (34)
Number Date Country
6710924 Jul 2013 CO
19856704 Jun 2001 DE
0 749 103 Dec 1996 EP
0 987 665 Mar 2000 EP
1 006 367 Jun 2000 EP
2 178 061 Apr 2010 EP
2 635 989 Sep 2011 EP
2 616 910 Jul 2013 EP
2 638 493 Sep 2013 EP
2 710 571 Mar 2014 EP
2 820 631 Jan 2015 EP
2 400 293 Oct 2004 GB
05-313578 Nov 1993 JP
08-77485 Mar 1996 JP
10-261188 Sep 1998 JP
10-281782 Oct 1998 JP
10-293533 Nov 1998 JP
2000-055675 Feb 2000 JP
2000-113387 Apr 2000 JP
2001-330451 Nov 2001 JP
WO 9636929 Nov 1996 WO
WO 9823018 May 1998 WO
WO 0050917 Aug 2000 WO
WO 0188480 Nov 2001 WO
WO 02077921 Oct 2002 WO
WO 03014671 Feb 2003 WO
WO 2005013063 Feb 2005 WO
WO 2005076031 Aug 2005 WO
WO 2010073053 Jul 2010 WO
WO 2012024694 Feb 2012 WO
WO 2012037287 Mar 2012 WO
WO 2012065188 May 2012 WO
WO 2012159803 Nov 2012 WO
WO 2013113029 Aug 2013 WO
Non-Patent Literature Citations (154)
Entry
WSI, “TrueView Interactive Training Manual, Showfx Student Guide,” Print Date: Sep. 2004, Document Version: 4.3x. Link: http://apollo.lsc.vsc.edu/intranet/WSI—Showfx/training/970-TVSK-SG-43.pdf.
Truett, R., "Car Navigation System May Live on After Test," The Orlando Sentinel, Feb. 17, 1993, 3 pages.
U.S. Dept. of Transportation, Closing the Data Gap: Guidelines for Quality Advanced Traveler Information System (ATIS) Data, Version 1.0, Sep. 2000, 41 pages.
Watanabe et al, “Development of a Three-dimensional Bird's-Eye View Map Drawing Technique for Car Navigation Systems,” Society of Automotive Engineers, Inc., 1996, pp. 11-18.
Wischhof, L. et al., “SOTIS—A Self-Organizing Traffic Information System,” Proceedings of the 57th IEEE Vehicular Technology Conference (VTC—'03), 2003, pp. 2442-2446, New York, NY, USA.
Yokouchi, K., “Car-Navigation Systems,” Mitsubishi Electr. Adv. Technical Reports, 2000, vol. 91, pp. 10-14, Japan.
You, J. et al., “Development and Evaluation of a Hybrid Travel Time Forecasting Model,” Transportation Research Part C 8, 2000, pp. 231-256, Pergamon Press Ltd., Elsevier Science Ltd., U.K.
Zhu, C. et al., “3D Terrain Visualization for Web GIS,” Center for Advanced Media Technology, Nanyang Technological University, Singapore, 2003, 8 pages.
Audi-V150 Manual, Oct. 2001, 152 pages, Japan.
Birdview Navigation System by Nissan Motor Corp, 240 Landmarks of Japanese Automotive Technology, 1995, 2 pages, Society of Automotive Engineers of Japan, Inc., Japan.
Carin Navigation System Manual and Service Manual for Model Carin 22SY520, 76 pages, Philips Car Systems, The Netherlands.
Dancer, F. et al., "Vehicle Navigation Systems: Is America Ready?," Navigation and Intelligent Transportation Systems, Automotive Electronics Series, Society of Automotive Engineers, 1998, Cover page, Table of Contents, pp. 3-8.
Endo et al., "Development and Evaluation of a Car Navigation System Providing a Bird's-Eye View Map Display," Navigation and Intelligent Transportation Systems, Automotive Electronics Series, Society of Automotive Engineers, 1998, Cover page, Table of Contents, pp. 19-22.
Panasonic Portable Navigation System User Manual for Products KX-GT30, KX-GT30X, and KX-GT30Z, Cover page, pp. 1-5, 132-147, End pages, Matsushita Denki Sangyo K.K., Fukuoka City, Japan [Date Unknown].
Inman, V.W. et al., “TravTek Global Evaluation and Executive Summary,” Publication No. FHWA-RD-96-031, Mar. 1996, 104 pages, U.S. Department of Transportation, McLean, VA, USA.
Inman, V.W. et al., “TravTek Evaluation Rental and Local User Study,” Publication No. FHWA-RD-96-028, Mar. 1996, 110 pages, U.S. Department of Transportation, McLean, VA, USA.
Vollmer, R., "Navigation Systems—Intelligent Co-Drivers with Knowledge of Road and Tourist Information," Navigation and Intelligent Transportation Systems, Automotive Electronics Series, Society of Automotive Engineers, 1998, Cover page, Table of Contents, pp. 9-17.
Blumentritt, K. et al., “TravTek System Architecture Evaluation,” Publication No. FHWA-RD-96-141, Jul. 1995, 504 pages, U.S. Department of Transportation, McLean, VA, USA.
Zhao, Y., "Vehicle Location and Navigation Systems," 1997, 370 pages, Artech House, Inc., Norwood, MA, USA.
“TravTek Information and Services Center Policy/Procedures Manual,” Feb. 1992, 133 pages, U.S. Department of Transportation, McLean, VA, USA.
N'Fit Xanavi, unknown date, 94 pages, Japan.
XM Radio Introduces Satellite Update Service for Vehicle Navigation, Apr. 8, 2004, 2 pages.
Slothower, D., “Sketches & Applications,” SIGGRAPH 2001, pp. 138-144, Stanford University.
Karbassi, A. et al., “Vehicle Route Prediction and Time of Arrival Estimation Techniques for Improved Transportation System Management,” in Proceedings of the Intelligent Vehicles Symposium, 2003, pp. 511-516, IEEE, New York, NY, USA.
Koller, D. et al., “Virtual GIS: A Real-Time 3D Geographic Information System,” Proceedings of the 6th IEEE Visualization Conference (VISUALIZATION '95) 1995, pp. 94-100, IEEE, New York, NY, USA.
Acura Debuts AcuraLink™ Satellite-Linked Communication System with Industry's First Standard Real Time Traffic Feature at New York International Auto Show, 2004, 4 pages.
Burgett, A. L., "Safety Evaluation of TravTek," Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), P-253, Part 1, Oct. 1991, pp. 819-825, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
de Cambray, B. “Three-Dimensional (3D) Modelling in a Geographical Database,” Auto-Carto'11, Eleventh International Conference on Computer Assisted Cartography, Oct. 30, 1993-Nov. 1, 1993, pp. 338-347, Minneapolis, USA.
Campbell, J. L., “Development of Human Factors Design Guidelines for Advanced Traveler Information Systems (ATIS)”, Proceedings Vehicle Navigation and Information Systems Conference, 1995, pp. 161-164, IEEE, New York, NY, USA.
Campbell, J. L., "Development of Human Factors Design Guidelines for Advanced Travel Information Systems (ATIS) and Commercial Vehicle Operations (CVO)", Publication No. FHWA-RD-98-057, Report Date Sep. 1998, 294 pages, U.S. Department of Transportation, McLean, VA 22010-2296.
Cathey, F.W. et al., “A Prescription for Transit Arrival/Departure Prediction Using Automatic Vehicle Location Data,” Transportation Research Part C 11, 2003, pp. 241-264, Pergamon Press Ltd., Elsevier Ltd., U.K.
Chien, S. I. et al., “Predicting Travel Times for the South Jersey Real-Time Motorist Information System,” Transportation Research Record 1855, Paper No. 03-2750, Revised Oct. 2001, pp. 32-40.
Chira-Chavala, T. et al. “Feasibility Study of Advanced Technology HOV Systems,” vol. 3: Benefit Implications of Alternative Policies for Including HOV lanes in Route Guidance Networks, Dec. 1992, 84 pages, UCB-ITS-PRR-92-5 PATH Research Report, Inst. of Transportation Studies, Univ. of Calif., Berkeley, USA.
Clark, E.L., "Development of Human Factors Guidelines for Advanced Traveler Information Systems (ATIS) and Commercial Vehicle Operations (CVO): Comparable Systems Analysis," Dec. 1996, 199 pages.
Davies, P. et al., “Assessment of Advanced Technologies for Relieving Urban Traffic Congestion,” National Cooperative Highway Research Program Report 340, Dec. 1991, 106 pages.
Dillenburg, J.F. et al., “The Intelligent Travel Assistant,” IEEE 5th International Conference on Intelligent Transportation Systems, Sep. 3-6, 2002, pp. 691-696, Singapore.
Dingus, T.A. et al., “Human Factors Engineering the TravTek Driver Interface,” Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), P-253, Part 2, Oct. 1991, pp. 749-755, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Eppinger, A. et al., “Dynamic Route Guidance—Status and Trends,” Convergence 2000 International Congress on Transportation Electronics, Oct. 16-18, 1999, 7 pages, held in Detroit, MI, SAE International Paper Series, Warrendale, PA, USA.
GM Exhibits Prototype of Travtek Test Vehicle, Inside IVHS, Oct. 28, 1991, V. 1, No. 21, 2 pages.
Golisch, F., Navigation and Telematics in Japan, International Symposium on Car Navigation Systems, May 21, 1997, 20 pages, held in Barcelona, Spain.
Gueziec, A., “Architecture of a System for Producing Animated Traffic Reports,” Mar. 30, 2001, 42 pages.
Handley, S. et al., “Learning to Predict the Duration of an Automobile Trip,” Proceedings of the Fourth International Conference on Knowledge Discovery and Data Mining, 1998, 5 pages, AAAI Press, New York, NY, USA.
Hirata et al., “The Development of a New Multi-AV System Incorporating an On-Board Navigation Function,” International Congress and Exposition, Mar. 1-5, 1993, pp. 1-12, held in Detroit, MI, SAE International, Warrendale, PA, USA.
Hoffmann, G. et al., Travel Times as a Basic Part of the LISB Guidance Strategy, Third International Conference on Road Traffic Control, May 1-3, 1990, pp. 6-10, London, U.K.
Hofmann, T. “2005 Acura RL Prototype Preview,” Auto123.com, 4 pages.
Hu, Z. et al., “Real-time Data Fusion on Tracking Camera Pose for Direct Visual Guidance,” IEEE Vehicles Symposium, Jun. 14-17, 2004, pp. 842-847, held in Parma, Italy.
Hulse, M.C. et al., “Development of Human Factors Guidelines for Advanced Traveler Information Systems and Commercial Vehicle Operations: Identification of the Strengths and Weaknesses of Alternative Information Display Formats,” Publication No. FHWA-RD-96-142, Oct. 16, 1998, 187 pages, Office of Safety and Traffic Operation R&D, Federal Highway Administration, USA.
Jiang, G., “Travel-Time Prediction for Urban Arterial Road: A Case on China,” Proceedings Intelligent Transportation Systems, Oct. 12-15, 2003, pp. 255-260, IEEE, New York, NY, USA.
Kopitz et al., Table of Contents, Chapter 6, Traffic Information Services, and Chapter 7, Intelligent Transport Systems and RDS-TMC in RDS: The Radio Data System, 1992, Cover p. XV, pp. 107-167, Back Cover page, Artech House Publishers, Boston, USA and London, Great Britain.
Krage, M.K., “The TravTek Driver Information System,” Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), P-253, Part 1, Oct. 1991, pp. 739-748, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Ladner, R. et al., “3D Mapping of an Interactive Synthetic Environment,” Computing Practices, Mar. 2000, pp. 33-39, IEEE, New York, NY, USA.
Lowenau, J. et al., “Final Map Actualisation Requirements,” Version 1.1, ActMAP Consortium, Sep. 30, 2004, 111 pages.
Meridian Series of GPS Receivers User Manual, Magellan, 2002, 106 pages, Thales Navigation, Inc., San Dimas, CA, USA.
Ness, M. et al, “A Prototype Low Cost In-Vehicle Navigation System,” IEEE—IEE Vehicle Navigation & Information Systems Conference (VNIS), 1993, pp. 56-59, New York, NY, USA.
Noonan J. et al., “Intelligent Transportation Systems Field Operational Test Cross-Cutting Study Advanced Traveler Information Systems,” Sep. 1998, 27 pages, U.S. Department of Transportation, McLean, VA, USA.
Odagaki et al., "Automobile Navigation System with Multi-Source Guide Information," International Congress & Exposition, Feb. 24-28, 1992, pp. 97-105, SAE International, Warrendale, PA, USA.
Raper, J.F. et al., “Three-Dimensional GIS,” in Geographical Information Systems: Principles and Applications, 1991, vol. 1, Chapter 20, 21 pages.
Reference Manual for the Magellan RoadMate 500/700, 2003, 65 pages, Thales Navigation, Inc., San Dimas, CA, USA.
Rilett, L.R., "Simulating the TravTek Route Guidance Logic Using the Integration Traffic Model," Vehicle Navigation & Information System, P-253, Part 2, Oct. 1991, pp. 775-787, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Rillings, J. H., et al., “TravTek,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), P-253, Part 2, Oct. 1991, pp. 729-737, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Rupert, R.L., “The TravTek Traffic Management Center and Traffic Information Network,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), P-253, Part 1, Oct. 1991, pp. 757-761, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Schofer, J. L., “Behavioral Issues in the Design and Evaluation of Advanced Traveler Information Systems,” Transportation Research Part C 1, 1993, pp. 107-117, Pergamon Press Ltd., Elsevier Science Ltd.
Schulz, W., “Traffic Management Improvement by Integrating Modern Communication Systems,” IEEE Communications Magazine, Oct. 1996, pp. 56-60, New York, NY, USA.
Shepherd, I.D.H. et al., "Information Integration and GIS," in Geographical Information Systems: Principles and Applications, 1991, vol. 1, Cover page, pp. 337-360, end page.
Sirius Satellite Radio: Traffic Development Kit Start Up Guide, Sep. 27, 2005, Version 00.00.01, NY, New York, 14 pages.
Sumner, R. “Data Fusion in Pathfinder and TravTek,” Part 1, Vehicle Navigation & Information Systems Conference Proceedings (VNIS '91), Oct. 1991, Cover & Title page, pp. 71-75.
Tamura et al., "Toward Realization of VICS—Vehicle Information and Communications System," IEEE—IEE Vehicle Navigation & Information Systems Conference (VNIS '93), 1993, pp. 72-77, held in Ottawa, Canada.
Taylor, K.B., “TravTek—Information and Services Center,” Vehicle Navigation & Information System Conference Proceedings (VNIS '91), P-253, Part 2, Oct. 1991, pp. 763-774, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Thompson, S.M. et al., "Exploiting Telecommunications to Deliver Real Time Transport Information," Road Transport Information and Control, Conf. Publication No. 454, Apr. 21-23, 1998, pp. 59-63, IEE, U.K.
“The Challenge of VICS: The Dialog Between the Car and Road has Begun,” Oct. 1, 1996, pp. 19-63, The Road Traffic Information Communication System Centre (VICS Centre), Tokyo, Japan.
Tonjes, R., "3D Reconstruction of Objects from Aerial Images Using a GIS," presented at ISPRS Workshops on "Theoretical and Practical Aspects of Surface Reconstructions and 3-D Object Extraction," Sep. 9-11, 1997, 8 pages, held in Haifa, Israel.
Nissan Automobile Navigation System User Manual, [date unknown], 163 pages.
Levinson, D. et al., "Assessing the Benefits and Costs of Intelligent Transportation Systems: The Value of Advanced Traveler Information Systems," Publication UCB-ITS-PRR-99-20, California PATH Program, Jul. 1999, Institute of Transportation Studies, University of California, Berkeley, CA, USA.
Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. v. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 46 pages.
Attachment A of Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. v. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 6 pages.
Attachment B of Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. v. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 618 pages.
Preliminary Invalidity Contentions of Defendant TomTom, Inc., Certificate of Service and Exhibit A filed May 16, 2011 in Triangle Software, LLC. v. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 354 pages.
Garmin International, Inc. and Garmin USA, Inc.'s Answer and Counterclaim to Triangle Software, LLC's Supplemental Complaint filed Jun. 17, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 36 pages.
Initial Expert Report of William R. Michalson, Ph. D. dated Jun. 17, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 198 pages.
Supplemental Expert Report of William R. Michalson, Ph. D. Regarding Invalidity of the Patents-in-Suit dated Jul. 5, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 23 pages.
Initial Expert Report of Roy Sumner dated Jun. 16, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 289 pages.
Expert Report of Dr. Michael Goodchild Concerning the Validity of U.S. 5,938,720 dated Jun. 16, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 16 pages.
Balke, K.N. et al., “Advanced Technologies for Communicating with Motorists: A Synthesis of Human Factors and Traffic Management Issues,” Report No. FHWA/TX-92/1232-8, May 1992, Texas Department Transportation, Austin, TX, USA, 62 pages.
Fawcett, J. et al., "Adaptive Routing for Road Traffic," IEEE Computer Graphics and Applications, May/Jun. 2000, pp. 46-53, IEEE, New York, NY, USA.
Rillings, J.H. et al., “Advanced Driver Information Systems,” IEEE Transactions on Vehicular Technology, Feb. 1991, vol. 40, No. 1, pp. 31-40, IEEE, New York, NY, USA.
Adib Kanafani, “Towards a Technology Assessment of Highway Navigation and Route Guidance,” Program on Advanced Technology for the Highway, Institute of Transportation Studies, University of California, Berkeley, Dec. 1987, PATH Working Paper UCB-ITS-PWP-87-6.
Answer, Affirmative Defenses, and Counterclaims by Defendant Westwood One, Inc., to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement.
Answer and Counterclaims of TomTom, Inc. to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement.
Amended Answer and Counterclaims of TomTom, Inc. to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement.
Brooks, et al., "Turn-by-Turn Displays versus Electronic Maps: An On-the-Road Comparison of Driver Glance Behavior," Technical Report, The University of Michigan, Transportation Research Institute (UMTRI), Jan. 1999.
Declaration Under 37 C.F.R. 1.131 and Source Code from U.S. Appl. No. 10/897,550.
Fleischman, R.N., “Research and Evaluation Plans for the TravTek IVHS Operational Field Test,” Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), P-253, Part 2, Oct. 1991, pp. 827-837, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Garmin International, Inc.'s Answer and Counterclaims to Triangle Software, LLC's Complaint.
Garmin International, Inc.'s Amended Answer and Counterclaims to Triangle Software, LLC's Complaint.
Goldberg et al., "Computing the Shortest Path: A* Search Meets Graph Theory," Proc. of the 16th Annual ACM-SIAM Sym. on Discrete Algorithms, Jan. 23-25, 2005, Vancouver, BC.
Goldberg et al., "Computing the Shortest Path: A* Search Meets Graph Theory," Microsoft Research, Technical Report MSR-TR-2004, Mar. 24, 2003.
Gueziec, Andre, “3D Traffic Visualization in Real Time,” ACM Siggraph Technical Sketches, Conference Abstracts and Applications, p. 144, Los Angeles, CA, Aug. 2001.
Hankey, et al., “In-Vehicle Information Systems Behavioral Model and Design Support: Final Report,” Feb. 16, 2000, Publication No. 00-135, Research, Development, and Technology, Turner-Fairbank Highway Research Center, McLean, Virginia.
Nintendo Wii Operations Manual Systems Setup. 2009.
Rockwell, Mark, “Telematics Speed Zone Ahead,” Wireless Week, Jun. 15, 2004, Reed Business Information, http://www.wirelessweek.com.
User Guide of TomTom ONE, 2006.
Volkswagen Group of America, Inc.'s Answer and Counterclaim.
Watanabe, M. et al., “Development and Evaluation of a Car Navigation System Providing a Bird's-Eye View Map Display,” Technical Paper No. 961007, Feb. 1, 1996, pp. 11-18, SAE International.
Yim et al., TravInfo Field Operational Test Evaluation, "Evaluation of TravInfo Field Operational Test," Apr. 25, 2000.
Yim et al., “TravInfo Field Operational Test Evaluation: Information Service Providers Customer Survey” (2000).
PCT Application No. PCT/US2004/23884, Search Report and Written Opinion mailed Jun. 17, 2005.
PCT Application No. PCT/US2011/48680, Search Report and Written Opinion mailed Feb. 7, 2012.
PCT Application No. PCT/US2011/51647, Search Report and Written Opinion mailed Feb. 2, 2012.
PCT Application No. PCT/US2011/60663, Search Report and Written Opinion mailed May 31, 2012.
PCT Application No. PCT/US2012/38702, Search Report and Written Opinion mailed Aug. 24, 2012.
EP Patent Application No. 11 825 897.9, Communication mailed May 3, 2013.
U.S. Appl. No. 10/379,967, Final Office Action mailed May 11, 2005.
U.S. Appl. No. 10/379,967, Office Action mailed Sep. 20, 2004.
U.S. Appl. No. 10/897,550, Office Action mailed Jun. 12, 2009.
U.S. Appl. No. 10/897,550, Office Action mailed Jan. 21, 2009.
U.S. Appl. No. 10/897,550, Office Action mailed Aug. 1, 2008.
U.S. Appl. No. 10/897,550, Office Action mailed Oct. 3, 2007.
U.S. Appl. No. 11/509,954, Office Action mailed Nov. 23, 2007.
U.S. Appl. No. 11/751,628, Office Action mailed Jan. 29, 2009.
U.S. Appl. No. 12/283,748, Office Action mailed Aug. 20, 2009.
U.S. Appl. No. 12/283,748, Office Action mailed Mar. 11, 2009.
U.S. Appl. No. 12/398,120, Final Office Action mailed Mar. 26, 2013.
U.S. Appl. No. 12/398,120, Office Action mailed Nov. 14, 2012.
U.S. Appl. No. 12/398,120, Final Office Action mailed Apr. 12, 2012.
U.S. Appl. No. 12/398,120, Office Action mailed Nov. 15, 2011.
U.S. Appl. No. 12/763,199, Final Office Action mailed Nov. 1, 2010.
U.S. Appl. No. 12/763,199, Office Action mailed Aug. 5, 2010.
U.S. Appl. No. 12/860,700, Office Action mailed Feb. 26, 2013.
U.S. Appl. No. 12/967,045, Final Office Action mailed Jun. 27, 2012.
U.S. Appl. No. 12/967,045, Office Action mailed Jul. 18, 2011.
U.S. Appl. No. 13/296,108, Office Action mailed May 9, 2013.
U.S. Appl. No. 13/316,250, Office Action mailed Jan. 18, 2013.
U.S. Appl. No. 13/475,502, Office Action mailed Apr. 22, 2013.
U.S. Appl. No. 13/561,269, Office Action mailed Dec. 13, 2012.
U.S. Appl. No. 13/561,327, Office Action mailed Oct. 26, 2012.
U.S. Appl. No. 12/860,700, Final Office Action mailed Jun. 26, 2013.
U.S. Appl. No. 13/316,250, Final Office Action mailed Jun. 24, 2013.
U.S. Appl. No. 13/475,502, Final Office Action mailed Sep. 10, 2013.
U.S. Appl. No. 13/747,454, Office Action mailed Jun. 17, 2013.
U.S. Appl. No. 13/752,351, Office Action mailed Jul. 22, 2013.
PCT Application No. PCT/US2013/23505, Search Report and Written Opinion mailed May 10, 2013.
U.S. Appl. No. 13/296,108, Final Office Action mailed Oct. 25, 2013.
U.S. Appl. No. 12/860,700, Final Office Action mailed Jul. 22, 2014.
U.S. Appl. No. 12/860,700, Office Action mailed Apr. 3, 2014.
Yang, Qi; “A Simulation Laboratory for Evaluation of Dynamic Traffic Management Systems”, Massachusetts Institute of Technology, Jun. 1997.
U.S. Appl. No. 14/100,985, Office Action mailed Sep. 23, 2014.
Huang, Tsan-Huang, Chen, Wu-Cheng; “Experimental Analysis and Modeling of Route Choice with the Revealed and Stated Preference Data” Journal of the Eastern Asia Society for Transportation Studies, vol. 3, No. 6, Sep. 1999—Traffic Flow and Assignment.
U.S. Appl. No. 14/323,352, Office Action mailed Nov. 26, 2014.
U.S. Appl. No. 14/058,195, Office Action mailed Nov. 12, 2014.
U.S. Appl. No. 14/100,985, Final Office Action mailed Mar. 25, 2015.
U.S. Appl. No. 14/327,468, Office Action mailed Mar. 12, 2015.
U.S. Appl. No. 14/323,352, Final Office Action mailed Apr. 3, 2015.
U.S. Appl. No. 14/058,195, Final Office Action mailed Apr. 8, 2015.
Related Publications (1)
Number Date Country
20100333045 A1 Dec 2010 US
Continuation in Parts (1)
Number Date Country
Parent 12398120 Mar 2009 US
Child 12881690 US