Controlling a three-dimensional virtual broadcast presentation

Information

  • Patent Grant
  • Patent Number
    9,448,690
  • Date Filed
    Monday, December 9, 2013
  • Date Issued
    Tuesday, September 20, 2016
Abstract
Control of a three-dimensional virtual broadcast presentation is disclosed. The three-dimensional virtual broadcast presentation may be generated based on dynamic information such as traffic information, weather information, or other information that may be featured on a three-dimensional virtual broadcast presentation. A signal generated by a control device maneuvered by a presenter and reflecting positional information of the control device is received. A view of the three-dimensional virtual broadcast presentation is manipulated in response to the received signal, the manipulation of the virtual broadcast presentation at least partially based on positional information of the control device.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention


The present invention generally relates to broadcast presentation technology. More specifically, the present invention relates to controlling a three-dimensional virtual broadcast presentation.


2. Description of the Related Art


Broadcast presentations such as traffic reports and weather forecasts generally include a series of maps and images that a presenter refers to during the course of such presentations. The maps and images may be quite complex, incorporating animations, three-dimensional graphics, and multimedia overlays. Transitions between the maps and images may have added effects as well.


These broadcast presentations are conventionally performed in a scripted manner. The series of maps and images referred to by the presenter are produced prior to the broadcast presentation and arranged in a fixed sequence much like a slide show. The presenter may have an ability to control progression or retrogression of the sequence, but is otherwise bound to the initially included material. As a result, the information included in the maps and images is outdated at the time of presentation. This outdated information can have a drastic impact on the accuracy of, for example, traffic reports.


Existing broadcast presentation technology also lacks the ability for the presenter to interact with the maps and images during the broadcast presentation. The presenter is unable to spontaneously change a view of a particular map to investigate an area that was not a part of the original script.


There is, therefore, a need in the art for the broadcast of three-dimensional virtual presentations that can be referred to and manipulated by a user in real-time.


SUMMARY OF THE PRESENTLY CLAIMED INVENTION

Embodiments of the present invention allow a presenter to change a view of a broadcast presentation and to interact with the broadcast presentation in real-time.


In a first claimed embodiment, a method for controlling a three-dimensional virtual broadcast presentation is disclosed. The method includes generating a three-dimensional virtual broadcast presentation based on dynamic information. A signal generated by a control device maneuvered by a presenter is received, the signal reflecting positional information of the control device. Positional information of the control device may be associated with motion or attitude of the control device. A view of the three-dimensional virtual broadcast presentation is manipulated in response to the received signal, the manipulation of the virtual broadcast presentation at least partially based on positional information of the control device.


In a second claimed embodiment, a system for controlling a three-dimensional virtual broadcast presentation is disclosed. The system includes a communications module stored in memory and executable by a processor to receive a signal generated by a control device maneuvered by a presenter, the signal reflecting positional information of the control device. The system also includes a presentation rendering module stored in memory and executable by a processor to generate a three-dimensional virtual broadcast presentation based on dynamic information and to manipulate a view of the three-dimensional virtual broadcast presentation. The manipulation of the virtual broadcast presentation is at least partially based on positional information of the control device.


A third claimed embodiment discloses a computer readable storage medium having a program embodied thereon. The program is executable by a processor to perform a method for controlling a three-dimensional virtual broadcast presentation. The method includes generating a three-dimensional virtual broadcast presentation based on dynamic information; receiving a signal generated by a control device maneuvered by a presenter, the signal reflecting positional information of the control device; and manipulating a view of the three-dimensional virtual broadcast presentation in response to the received signal, the manipulation of the virtual broadcast presentation at least partially based on positional information of the control device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A is a block diagram illustrating an environment for the broadcast of three-dimensional virtual presentations that can be referred to and manipulated by a user in real-time.



FIG. 1B illustrates an exemplary control device as referenced in FIG. 1A.



FIG. 2A is a block diagram of an exemplary virtual broadcast presentation engine.



FIG. 2B illustrates an exemplary configuration panel as may be launched through execution of the virtual broadcast presentation engine of FIG. 2A.



FIG. 3 illustrates an exemplary three-dimensional virtual broadcast presentation.



FIG. 4 is a flowchart illustrating an exemplary method for controlling a three-dimensional virtual broadcast presentation.





DETAILED DESCRIPTION

The present invention provides for control of a three-dimensional virtual broadcast presentation. The three-dimensional virtual broadcast presentation may include maps and images rendered in a three-dimensional manner and that can be referred to in real-time by a presenter during the broadcast presentation. The presenter may maneuver a handheld control device to manipulate a view of the three-dimensional virtual broadcast presentation. The presenter may also select interactive elements included in the three-dimensional virtual broadcast presentation using the handheld control device.


Referring now to FIG. 1A, a block diagram of an environment 100 for the broadcast of three-dimensional virtual presentations that can be referred to and manipulated by a user in real-time is shown. The environment 100 of FIG. 1A includes a computing device 105 having a virtual broadcast presentation engine 110. The computing device 105 of FIG. 1A is in communication with information sources 115, a control device 120, a chroma key system 125, and a broadcast system 130. While FIG. 1A illustrates one particular environment 100 for the broadcast of a three-dimensional virtual presentation and including certain elements, alternative embodiments may be implemented that utilize differing elements than those disclosed in FIG. 1A (or combinations of the same), but that otherwise fall within the scope and spirit of the present invention.


The computing device 105 and the virtual broadcast presentation engine 110 generate a composite presentation that includes a three-dimensional virtual broadcast presentation and possibly footage of a presenter. The composite presentation may be generated using information obtained in real-time (or near real-time) from the information sources 115 and the chroma key system 125 as described in further detail below. The virtual broadcast presentation engine 110 is, in particular, discussed with respect to FIG. 2A. The computing device 105 may include various components (not depicted) such as one or more of communications interfaces, a processor, memory, storage, and any number of buses providing communication therebetween. The processor may execute instructions implemented through computing modules or engines while the memory and storage may both permanently or temporarily store data including the aforementioned modules and engines.


The information sources 115 may be provided by various organizations and in a variety of forms. The information sources 115 may include data sources related to traffic data such as traffic flow and as described in U.S. patent application Ser. No. 11/302,418 or weather data such as forecasts. The information sources 115 may also include data sources related to election results, newsworthy events or incidents, school closings, and other information that may be featured on a three-dimensional virtual broadcast presentation. The information sources 115 may require subscription or authentication for access and may be accessible via Telnet, FTP, or web services protocols. Information may be received from information sources 115 in real-time or near real-time to allow for generation of an equally real-time or near real-time presentation. That presentation may, in turn, be manipulated in real-time.
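Because the feeds above differ in protocol and format, an implementation would typically normalize incoming records before rendering. The following Python sketch illustrates one plausible approach; the `DynamicRecord` schema and function names are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass
import time

@dataclass
class DynamicRecord:
    """A normalized record from an information source (hypothetical schema)."""
    source: str
    kind: str        # e.g. "traffic", "weather", "incident"
    payload: dict
    received_at: float

def normalize(source_name: str, kind: str, raw: dict) -> DynamicRecord:
    """Wrap a raw feed record with provenance and a receipt timestamp so the
    presentation engine can favor the freshest available data."""
    return DynamicRecord(source=source_name, kind=kind, payload=raw,
                         received_at=time.time())

def freshest(records, kind):
    """Return the most recently received record of a given kind, or None."""
    matching = [r for r in records if r.kind == kind]
    return max(matching, key=lambda r: r.received_at) if matching else None
```

Keeping a receipt timestamp on every record is one way to honor the real-time or near real-time requirement: the renderer can always select the newest data at frame-generation time.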


In an embodiment of the present invention utilizing traffic data specific to the San Francisco Bay area, the information sources 115 may include one or more of the 511.org system (a collaboration of public agencies including the California Highway Patrol, Metropolitan Transportation Commission, and CALTRANS), the California Highway Patrol (CHP) World Wide Web server, the PeMS system at the University of California at Berkeley, various public event listings, or a publicly or privately accessible user input mechanism. For weather data, the information sources 115 may include the National Weather Service among other weather information sources. Other data sources or alternative types of data sources (e.g., non-traffic and non-weather related sources) may be incorporated and utilized in various embodiments of the present invention.


Control device 120 may include a wireless handheld controller. The control device 120 may be in communication with the computing device 105 via a Bluetooth, WiFi, or other wireless connection. The control device 120 may sense its own motion and/or attitude. Attitude of the control device 120 describes the inclination of the principal axes of the control device 120 relative to the direction of the Earth's gravitational pull. The control device 120 may include a three-axis accelerometer that can sense orientation or changes in orientation of the control device 120 relative to the direction of the Earth's gravitational pull. The control device 120 may also be capable of sensing its own motion or attitude by detecting illumination emitted by positioned emitters. As the presenter maneuvers the control device 120 by turning, rotating, tilting, or twisting about various axes, the control device 120 generates a signal based at least partially on positional information of the control device 120. The positional information may be associated with the motion of the control device 120 or the attitude of the control device 120. A magnitude of such maneuvers of the control device 120 by the presenter may be included in the signal as described further herein. Stability control may be implemented in some embodiments of the control device 120 such that small and often unintentional motions of the control device 120 are smoothed or ignored.
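A three-axis accelerometer measures the direction of gravity when the device is roughly static, from which pitch and roll can be estimated; a dead-zone filter is one simple form of the stability control mentioned above. This is a minimal sketch of those two ideas, not the patent's implementation; the threshold value is an assumption.

```python
import math

def attitude_from_accelerometer(ax, ay, az):
    """Estimate pitch and roll (radians) of a roughly static device from a
    three-axis accelerometer reading, using gravity as the reference."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def apply_dead_zone(value, threshold=0.02):
    """Ignore small, likely unintentional motions (simple stability control):
    readings below the threshold are treated as zero."""
    return 0.0 if abs(value) < threshold else value
```

With the device lying flat (acceleration of 1 g on the z-axis only), both angles come out zero, matching the intuition that attitude is inclination relative to the Earth's gravitational pull.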


The control device 120 may include other components such as buttons, switches, or triggers. Actuation of these other components may be a partial basis for any signal generated by the control device 120. Actuation of these other components may be combined with certain motions or attitudes of the control device yielding a wider variety of signal possibilities. For example, actuation of various buttons, switches, or triggers may control certain zooming functions, open or close pop-up windows in the three-dimensional virtual broadcast presentation, or obtain a default orientation of the three-dimensional virtual broadcast presentation (e.g., align the three-dimensional virtual broadcast presentation to face north). Additionally, a signal associated with positional information of the control device 120 may be sent by the control device 120 to the computing device 105 only when a particular button is depressed, in accordance with exemplary embodiments. This may be implemented as a safety feature so that the presenter cannot accidentally or automatically affect the three-dimensional virtual broadcast presentation.


The control device 120 may include various portable devices capable of detecting positional information. For example, the control device 120 may be an iTouch™ or iPhone™, both available from Apple Inc., of Cupertino, Calif. The control device 120 may also be a Wii™ Remote (sometimes referred to as a Wiimote) available from Nintendo Co., Ltd., of Kyoto, Japan. A control device similar to the Wii™ Remote is described in greater detail in U.S. patent application Ser. No. 11/532,328 filed Sep. 15, 2006 and entitled “Video Game System with Wireless Modular Handheld Controller,” the disclosure of which is incorporated herein by reference.



FIG. 1B illustrates an exemplary control device 120. Other devices may be used as the control device 120 in the context of the present invention. Signals may be sent by the control device 120 that correspond to positional information of the control device 120, and to actuation of buttons 135-175 and trigger 180. These signals may be configured to instruct the computing device 105 and/or the virtual broadcast presentation engine 110 to control various aspects of the three-dimensional virtual broadcast presentation. Such configuration of the signals may be customized for a specific application or to suit preferences of a specific presenter.


Actuation of the button 135 may turn on or turn off the control device 120 or another component of the environment 100. Various actuations of the directional button 140 may change a focus of the three-dimensional virtual broadcast presentation to different points of interest within the presentation. Actuation of the button 145 may control whether signals are sent that correspond to maneuvers of the control device 120. For example, signals corresponding to maneuvers of the control device 120 may be sent to the computing device 105 only when the button 145 is depressed by the presenter. Actuation of the buttons 150 and 155 may result in a zoom-in or zoom-out of a view of the three-dimensional virtual broadcast presentation. Actuation of the button 160 may result in the three-dimensional virtual broadcast presentation returning to a default orientation (e.g., the three-dimensional virtual broadcast presentation being aligned to face north). Actuation of the button 170 may result in selection of interactive elements included in the three-dimensional virtual broadcast presentation. Actuation of the button 175 may cause certain elements of the three-dimensional virtual broadcast presentation to be hidden such as pop-up windows. Actuation of the trigger 180 may effectuate similar functions as actuation of the button 145.
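Since the patent notes that button assignments are configurable per application or per presenter, a button-to-command table is a natural implementation. The binding names and command strings below are hypothetical stand-ins for illustration.

```python
# Hypothetical mapping of controller buttons to presentation commands; the
# patent leaves the exact assignments configurable per presenter.
DEFAULT_BINDINGS = {
    "button_150": "zoom_in",
    "button_155": "zoom_out",
    "button_160": "reset_orientation",   # realign presentation to face north
    "button_170": "select_element",
    "button_175": "hide_popups",
}

def dispatch(pressed, bindings=DEFAULT_BINDINGS):
    """Translate a set of pressed buttons into presentation commands,
    silently ignoring any button without a binding."""
    return [bindings[b] for b in sorted(pressed) if b in bindings]
```

Swapping in a different `bindings` dictionary is all it takes to customize the controller for a specific presenter.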


Chroma key system 125 may be used to capture footage of the presenter that can be used as part of the composite presentation generated by the computing device 105. The chroma key system 125 may provide the presenter with a preview of the composite presentation to allow the presenter to appear as though he or she is naturally interacting with the three-dimensional virtual broadcast presentation.


Chroma key systems are well known in the art. To illustrate the basic principles of such a system, consider a weather forecast broadcast. The presenter appears to be standing in front of a large weather map. In the television studio, however, the weather map is actually a large ‘blue screen.’ The presenter stands in front of the ‘blue screen’ and the weather map is added to those parts of the image where the color is blue.
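The keying principle described above can be shown with a toy per-pixel compositor: wherever a foreground pixel is close to the key color, the background (here, the weather map) shows through instead. Dedicated chroma key hardware does this far more robustly; the tolerance value is an assumption.

```python
def chroma_key(foreground, background, key=(0, 0, 255), tolerance=60):
    """Composite two same-sized RGB images (lists of rows of (r, g, b)
    tuples): wherever the foreground pixel is close to the key color
    (the 'blue screen'), show the background pixel instead."""
    def close(pixel):
        return all(abs(pixel[i] - key[i]) <= tolerance for i in range(3))
    return [[bg if close(fg) else fg
             for fg, bg in zip(fg_row, bg_row)]
            for fg_row, bg_row in zip(foreground, background)]
```

Blue and green are the customary key colors because they are far from human skin tones, so the presenter's body is unlikely to be keyed out.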


The chroma key system 125 may include a blue or green screen, a monitor meant only for viewing by the presenter and that shows the presenter ‘overlaid’ on the three-dimensional virtual broadcast presentation, and a camera that captures video footage of the presenter in front of the screen. Various components may be included in the chroma key system 125 depending on the particular implementation of the same.


The broadcast system 130 disseminates the composite presentation to viewers. Dissemination may occur via radio waves such as UHF or VHF, cable, satellite, or the World Wide Web. Hardware and software necessary to effectuate a broadcast may be included in the broadcast system 130 and are generally known to those skilled in the broadcasting art.



FIG. 2A is a block diagram of the virtual broadcast presentation engine 110. The virtual broadcast presentation engine 110 of FIG. 2A includes a communications module 205, a presentation rendering module 210, a selection module 215, and a feedback module 220. The virtual broadcast presentation engine 110 and its constituent modules may be stored in memory and executed by a processing device to effectuate the functionality corresponding thereto. The virtual broadcast presentation engine 110 may be composed of more or fewer modules (or combinations of the same) and still fall within the scope of the present invention. For example, the functionality of the selection module 215 and the functionality of the feedback module 220 may be combined into a single module.


Execution of the communications module 205 allows for receipt of a signal generated by the control device 120, which may be based at least partially on the positional information of the control device 120 as maneuvered by the presenter. The signal may additionally be based on—in part or in whole—the actuation of other components included in the control device 120 such as buttons, switches, or triggers.


In addition to the signal generated by the control device 120, execution of the communications module 205 may also allow for receipt of dynamic information from the information sources 115. This dynamic information may be used by other modules for generating, manipulating, and interacting with the three-dimensional virtual broadcast presentation.


The communications module 205 may also allow the presenter or other users to control certain aspects of the control device 120 such as how signals received from the control device 120 are interpreted by the modules of the virtual broadcast presentation engine 110. FIG. 2B illustrates an exemplary configuration panel 225. The configuration panel 225 may be accessed and manipulated by the presenter or the other users by use of components associated with the computing device 105. These components (not depicted) may include a monitor, a keyboard, a mouse, and other various peripheral devices.


As depicted, the configuration panel 225 includes sliders 230 and check boxes 235. The sliders 230 may be used to adjust sensitivity to various maneuvers of the control device 120 by the presenter. The check boxes 235 may be used to activate or deactivate various buttons included in the control device 120. The configuration panel 225 may also include status information about the control device 120. For example, as depicted, the configuration panel 225 includes a power meter 240 that may indicate a power level of batteries included in the control device 120.
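A sensitivity slider of the kind shown in the configuration panel might be applied as a gain on the raw maneuver magnitude, combined with a deadband so tiny motions are ignored. The gain curve and deadband threshold below are assumptions for illustration only.

```python
def scaled_response(raw_magnitude, sensitivity=0.5, deadband=0.05):
    """Apply a user-set sensitivity (a slider value in [0, 1]) to a raw
    maneuver magnitude. Motions below the deadband are discarded; the
    remainder is scaled by an assumed linear gain curve."""
    if abs(raw_magnitude) < deadband:
        return 0.0
    return raw_magnitude * (0.25 + 1.75 * sensitivity)
```

At the slider's midpoint the maneuver passes through roughly unchanged; at the maximum, the same wrist motion moves the presentation about twice as far.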


Referring again to FIG. 2A, execution of the presentation rendering module 210 allows for the generation of a three-dimensional virtual broadcast presentation based on the dynamic information received through execution of the communications module 205. The dynamic information may include traffic information, weather information, election results, newsworthy events or incidents, school closings, or other information that may be featured on a three-dimensional virtual broadcast presentation.


Execution of the presentation rendering module 210 may also allow for manipulation of a view of the three-dimensional virtual broadcast presentation in response to the signal received by the communications module 205 from the control device 120. Manipulating the view of the presentation may include one or more of zooming into, panning across, rotating, or tilting the three-dimensional virtual broadcast presentation. Signals corresponding to various motions or attitudes of the control device 120 may be assigned to various other manipulations of the three-dimensional virtual broadcast presentation. For example, actuation of a trigger included in the control device 120 may affect zoom speed, whereas a certain motion or attitude of the control device 120 may affect zoom direction. Furthermore, the magnitude of the maneuvers of the control device 120 by the presenter may be included in the signal received by the communications module 205 to adjust a speed at which the view of the presentation is manipulated.
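The view manipulations named above (zoom, pan, rotate, tilt, with magnitude governing speed) can be sketched as updates to a small camera state. The field names, step sizes, and clamping limits are assumptions, not the patent's values.

```python
from dataclasses import dataclass

@dataclass
class CameraView:
    """View state for the virtual presentation (hypothetical fields)."""
    x: float = 0.0
    y: float = 0.0
    zoom: float = 1.0
    heading_deg: float = 0.0   # 0 = presentation facing north
    tilt_deg: float = 45.0

def apply_signal(view, command, magnitude):
    """Manipulate the view in response to one control signal; magnitude
    (the strength of the presenter's maneuver) scales the speed."""
    if command == "zoom":
        view.zoom = max(0.1, view.zoom * (1.0 + 0.1 * magnitude))
    elif command == "rotate":
        view.heading_deg = (view.heading_deg + 5.0 * magnitude) % 360.0
    elif command == "tilt":
        view.tilt_deg = min(90.0, max(0.0, view.tilt_deg + 5.0 * magnitude))
    elif command == "pan":
        view.x += 0.01 * magnitude
    return view
```

A sharper twist of the control device yields a larger magnitude and therefore a faster rotation, which is the speed-scaling behavior described in the paragraph above.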


Execution of the selection module 215 allows for selection of an interactive element included in the three-dimensional virtual broadcast presentation in response to the received signal. The interactive element may represent a traffic alert. For example, if road construction is taking place at a given intersection of two streets, an icon indicative of road construction may be placed in the three-dimensional virtual broadcast presentation at a position that corresponds to that given intersection. Execution of the selection module 215 may also select the interactive element when the interactive element is positioned near the center of the three-dimensional virtual broadcast presentation.


Selecting the interactive element may cause one of a variety of responses from the three-dimensional virtual broadcast presentation. For example, if the interactive element corresponds to a traffic camera, selecting the interactive element may cause a live camera view to appear within the three-dimensional virtual broadcast presentation.


Execution of the feedback module 220 provides feedback to the presenter to inform the presenter that a given interactive element is selectable. For example, the interactive element may only be selectable in certain regions of the three-dimensional virtual broadcast presentation, such as the center. When the interactive element enters or leaves the center of the three-dimensional virtual broadcast presentation, the presenter may be informed via feedback. The feedback may include highlighting of the interactive element. To avoid distracting or otherwise undesirable imagery such as a cursor being included in the three-dimensional virtual broadcast presentation, non-visible feedback may be invoked. Examples of non-visible feedback include a vibration of the control device or an audible tone. Visible feedback mechanisms may be employed only on the monitor of the chroma key system 125, which is meant only for the presenter, while the composite presentation used by the broadcast system 130 may not include the visible feedback mechanism.
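The center-region selectability test and the split between presenter-only visible feedback and non-visible feedback can be sketched as follows. Coordinates, the radius, and the returned field names are illustrative assumptions.

```python
def selection_feedback(element_xy, center_xy=(0.5, 0.5), radius=0.1):
    """Decide whether an interactive element is selectable (near the view
    center, in normalized screen coordinates) and which feedback channels
    to fire. Visible highlighting goes only to the presenter's chroma key
    monitor; the broadcast composite stays clean."""
    dx = element_xy[0] - center_xy[0]
    dy = element_xy[1] - center_xy[1]
    selectable = dx * dx + dy * dy <= radius * radius
    return {
        "selectable": selectable,
        "highlight_presenter_monitor": selectable,  # never in broadcast feed
        "vibrate_control_device": selectable,       # non-visible feedback
    }
```

Routing the highlight only to the presenter's monitor is what keeps cursors and outlines out of the composite presentation that viewers see.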


Execution of the virtual broadcast presentation engine 110 may output the three-dimensional virtual broadcast presentation to other components of the computing device 105 for generation of the composite presentation. Accordingly, the computing device 105 may output the composite presentation to the broadcast system 130 for dissemination to the viewers.



FIG. 3 illustrates an exemplary three-dimensional virtual broadcast presentation 300. The presentation 300 of FIG. 3 includes traffic information. The principles described herein with respect to traffic are equally applicable to embodiments of the present invention that include weather information, election results, newsworthy events or incidents, school closings, or other information that may be featured on a three-dimensional virtual broadcast presentation. The presentation 300 may be generated and manipulated by execution of the presentation rendering module 210 in real-time. Presentation 300 may include satellite images of a given area with an animated road traffic report. A detailed description of animated road traffic reports may be found in U.S. patent application Ser. No. 11/302,418, the disclosure of which has been previously incorporated by reference.


Satellite images may be manipulated by execution of the presentation rendering module 210 to aid in generating three-dimensional information. For example, two-dimensional satellite images may be processed in the context of other geographical information (e.g., topographical information) to generate a three-dimensional satellite image that reflects information along an x-, y-, and z-axis as illustrated in presentation 300. The textured three-dimensional representation of the landscape of a particular urban area aligns with and provides the three-dimensional coordinates for the roadways that may be animated and overlain on the satellite images.
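Draping a flat image over a height map is one common way to produce the x-, y-, z-coordinates described above: each pixel keeps its map position and gains an elevation from the topographic data. The sketch below assumes an `elevation(x, y)` callable standing in for real topographical information.

```python
def drape_image(width, height, elevation):
    """Turn a flat satellite image grid into three-dimensional vertices by
    draping each pixel over a topographic height map. `elevation(x, y)` is
    assumed to return the terrain height at that pixel."""
    return [(x, y, elevation(x, y))
            for y in range(height)
            for x in range(width)]
```

Because the roadway animations share these same coordinates, overlain traffic flows hug the terrain instead of floating above it.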


The presentation 300 may also include a variety of markers such as city street labels 305, exit labels, nicknamed sections of highways, or city streets. These markers may be readily recognizable, such as a highway marker 310 resembling a California state highway sign with the appropriate highway number applied to the sign. Presentation 300 may include markers or icons that correspond to the location of traffic incidents, road construction, and traffic cameras such as incident marker 315. Some or all of these markers may be interactive elements of the three-dimensional virtual broadcast presentation 300. Accordingly, the interactive elements may be selected by the presenter using the control device 120. When an interactive element is selected, additional information related to that interactive element may be displayed within the presentation 300. In one example, an interactive element marking a traffic incident may be selected resulting in detailed textual information describing that traffic incident being displayed.


A view of the presentation 300 may be manipulated to give the effect of ‘flying’ through the three-dimensional virtual representation of the urban area by a combination of zooming, panning, tilting, and/or rotating the view. For example, as the presenter rotates the control device 120, the control device 120 generates a corresponding signal that is received in conjunction with execution of the communications module 205. In turn, the presentation rendering module 210 is executed to rotate the presentation 300 a corresponding amount as the presenter rotated the control device 120. This correspondence of the presentation 300 to manipulation of the control device 120 gives the presenter the sensation of directly controlling the presentation 300. Such manipulation of the view may also be used in selecting interactive elements. For example, if a particular interactive element may be selected only when near the center of the presentation 300, the presenter may cause the view to be manipulated such that that particular interactive element is centered and therefore selectable.



FIG. 4 is a flowchart illustrating an exemplary method 400 for controlling a three-dimensional virtual broadcast presentation. The steps of method 400 may be performed in varying orders. Steps may be added or subtracted from the method 400 and still fall within the scope of the present invention.


In step 405, a three-dimensional (3D), real-time, virtual broadcast presentation is generated. The presentation may be based on dynamic information. Execution of the presentation rendering module 210 may perform step 405. The dynamic information may include real-time traffic information or real-time weather information and be received in conjunction with execution of the communications module 205 from the information sources 115.


In step 410, a signal may be received that is generated by the control device 120 maneuvered by a presenter. Step 410 may be performed by execution of the communications module 205. The signal may be based at least partially on positional information of the control device 120. The signal may also be based at least partially on actuation of other components such as buttons, switches, or triggers of the control device 120. Receipt of the signal in step 410 allows for real-time manipulation of the presentation in step 415.


In step 415, a view of the three-dimensional virtual broadcast presentation is manipulated in real-time and in response to the signal received in step 410. Execution of the presentation rendering module 210 may perform step 415. Real-time manipulation of the presentation and various views thereof may include one or more of zooming into, panning across, tilting, or rotating the three-dimensional virtual broadcast presentation.


Any number of additional and/or optional steps that are not otherwise depicted may be included in method 400. These steps may include one or more of an interactive element included in the three-dimensional virtual broadcast presentation being selected using the control device 120 or feedback being provided to the presenter to inform the presenter that an interactive element is selectable.
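The three core steps of method 400 (generate in step 405, receive in step 410, manipulate in step 415) can be summarized as a single rendering pass. The dictionary-based frame and view representations here are hypothetical stand-ins for the patent's modules.

```python
def run_presentation_step(dynamic_info, control_signal, view):
    """One pass of method 400: generate a frame from dynamic information
    (step 405), then, if a control signal arrived (step 410), manipulate
    the view in response (step 415). `view` maps parameter names such as
    'zoom' to values; `control_signal` maps the same names to deltas."""
    frame = {"data": dynamic_info, "view": dict(view)}   # step 405
    if control_signal is not None:                       # step 410
        for key, delta in control_signal.items():        # step 415
            view[key] = view.get(key, 0.0) + delta
        frame["view"] = dict(view)
    return frame
```

Running this pass in a loop, with fresh dynamic information each iteration, is what makes the presentation both current and controllable in real time.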


It is noteworthy that any hardware platform suitable for performing the processing described herein is suitable for use with the invention. Computer-readable storage media refer to any medium or media that participate in providing instructions to a central processing unit (CPU) for execution. Such media can take forms, including, but not limited to, non-volatile and volatile media such as optical or magnetic disks and dynamic memory, respectively. Common forms of computer-readable storage media include a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a CD-ROM disk, digital video disk (DVD), any other optical medium, RAM, PROM, EPROM, a FLASH EPROM, any other memory chip or cartridge.


Various forms of transmission media may be involved in carrying one or more sequences of one or more instructions to a CPU for execution. A bus carries the data to system RAM, from which a CPU retrieves and executes the instructions. The instructions received by system RAM can optionally be stored on a fixed disk either before or after execution by a CPU.


While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. The descriptions are not intended to limit the scope of the invention to the particular forms set forth herein. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments. It should be understood that the above description is illustrative and not restrictive. To the contrary, the present descriptions are intended to cover such alternatives, modifications, and equivalents as may be included within the spirit and scope of the invention as defined by the appended claims and otherwise appreciated by one of ordinary skill in the art. The scope of the invention should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the appended claims along with their full scope of equivalents.

Claims
  • 1. A method for controlling a three-dimensional virtual broadcast presentation, the method comprising: processing two-dimensional satellite images in the context of geographical information in real-time to generate a three-dimensional virtual broadcast presentation;displaying traffic-based information obtained from a plurality of different data sources on the generated three-dimensional virtual broadcast presentation via one or more traffic icons that are selectable by a user to provide additional information associated with the selected traffic icon;receiving a first signal and a second signal from a control device, wherein the first signal is associated with a motion or attitude of the control device caused by a maneuver of the control device by the user and the second signal is associated with actuation of a component of the control device by the user, wherein the first signal is processed only when concurrently receiving the second signal;manipulating a view of the three-dimensional virtual broadcast presentation in real-time in response to the processing of the first signal, wherein the manipulation of the view of the three-dimensional virtual broadcast presentation includes at least one of panning or tilting the view; anddisplaying the manipulated view of the three-dimensional virtual broadcast presentation, wherein the manipulated view highlights a sub-set of the one or more traffic icons that can be selected by the user.
  • 2. The method of claim 1, wherein the maneuver of the control device by the user is selected from a group consisting of: turning, rotating, tilting, or twisting the control device about principal axes of the control device.
  • 3. The method of claim 1, wherein the component of the control device is a button, switch, or trigger of the control device.
  • 4. The method of claim 1, wherein the manipulation of the view includes aligning the view of the three-dimensional virtual broadcast presentation with a default orientation.
  • 5. The method of claim 1, wherein the manipulation of the view changes a focus of the view of the three-dimensional virtual broadcast presentation to a point of interest within the presentation, wherein the presentation includes a plurality of points of interest.
  • 6. The method of claim 1, wherein the first signal received is associated with a magnitude of the maneuver of the control device by the user.
  • 7. The method of claim 6, wherein a speed at which the view is manipulated is based on the magnitude of the maneuver of the control device by the user.
  • 8. The method of claim 1, wherein a sensitivity of the control device to a maneuver of the control device is adjustable by the user.
  • 9. The method of claim 1, wherein the control device is configurable by the user to activate and deactivate components of the control device.
  • 10. The method of claim 1, further comprising: receiving a selection of one or more traffic icons on the generated three-dimensional virtual broadcast presentation from the user; determining that the selected traffic icon is selectable based on the manipulation of the three-dimensional virtual broadcast presentation; and displaying detailed traffic-based information based on the selected traffic icons.
  • 11. The method of claim 1, wherein the highlights used to identify selectable traffic icons for the user are only provided to the user.
  • 12. A system for controlling a three-dimensional virtual broadcast presentation, the system comprising: a control device; and a computing device communicatively coupled to the control device, the computing device including a processor and memory storing: a communications module executable by the processor to receive a first signal and a second signal from the control device, wherein the first signal received is associated with a motion or attitude of the control device caused by a maneuver of the control device by a user, wherein the second signal received is associated with actuation of a component of the control device by the user, and wherein the first signal is only received during concurrent receipt of the second signal; and a presentation rendering module executable by the processor to: process two-dimensional satellite images in the context of geographical information in real-time to generate a three-dimensional virtual broadcast presentation, display the traffic-based information obtained from a plurality of different data sources on the generated three-dimensional virtual broadcast presentation via one or more traffic icons, wherein the traffic icons are selectable by the user to provide additional information associated with the selected traffic icon, manipulate a view of the three-dimensional virtual broadcast presentation in real-time in response to the processing of the first signal, wherein the manipulation of the view of the three-dimensional virtual broadcast presentation includes at least one of panning or tilting the view, and display the manipulated view of the three-dimensional virtual broadcast presentation, wherein the manipulated view highlights a sub-set of the one or more traffic icons that can be selected by the user.
  • 13. The system of claim 12, wherein the maneuver of the control device by the user is selected from a group consisting of: turning, rotating, tilting, or twisting the control device about principal axes of the control device.
  • 14. The system of claim 12, wherein the component of the control device is a button, switch, or trigger of the control device.
  • 15. The system of claim 12, wherein the manipulation of the view includes aligning the view of the three-dimensional virtual broadcast presentation with a default orientation.
  • 16. The system of claim 12, wherein the manipulation of the view changes a focus of the view of the three-dimensional virtual broadcast presentation to a point of interest within the presentation, wherein the presentation includes a plurality of points of interest.
  • 17. The system of claim 12, wherein the first signal received is associated with a magnitude of the maneuver of the control device by the user.
  • 18. The system of claim 17, wherein a speed at which the view is manipulated is based on the magnitude of the maneuver of the control device by the user.
  • 19. The system of claim 12, wherein the communications module further comprises a configuration panel stored in memory and executable by the processor to adjust a sensitivity of the control device to a maneuver of the control device by the user.
  • 20. The system of claim 19, wherein the configuration panel is further executable by a processor to activate and deactivate components of the control device.
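The control flow recited in claims 1, 6, and 7 — a motion signal that is processed only while an actuation signal is concurrently received, with manipulation speed scaled by the magnitude of the maneuver — can be illustrated with a minimal sketch. All names, fields, and the sensitivity parameter below are hypothetical illustrations, not part of the claimed implementation.

```python
from dataclasses import dataclass


@dataclass
class ControlSignal:
    """First signal: motion/attitude of the control device (hypothetical fields)."""
    pitch: float      # tilt component of the maneuver
    yaw: float        # pan component of the maneuver
    magnitude: float  # strength of the maneuver (claim 6)


@dataclass
class View:
    """Current orientation of the three-dimensional presentation view."""
    pan: float = 0.0
    tilt: float = 0.0


def process_signals(view: View, first_signal: ControlSignal,
                    second_signal_active: bool, sensitivity: float = 1.0) -> View:
    """Apply the motion signal only while the actuation signal is concurrent.

    Per the claimed gating, the first (motion) signal is ignored unless the
    second (actuation) signal is concurrently received; the speed of the
    view manipulation scales with the maneuver's magnitude (claims 6-7) and
    an adjustable sensitivity (claim 8).
    """
    if not second_signal_active:
        return view  # motion ignored without concurrent actuation
    speed = sensitivity * first_signal.magnitude
    view.pan += first_signal.yaw * speed
    view.tilt += first_signal.pitch * speed
    return view
```

For example, a maneuver received without the component actuated leaves the view unchanged, while the same maneuver received during actuation pans and tilts the view in proportion to its magnitude.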
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation and claims the priority benefit of U.S. patent application Ser. No. 12/398,120 filed Mar. 4, 2009, the disclosure of which is incorporated herein by reference.

US Referenced Citations (390)
Number Name Date Kind
4734863 Honey et al. Mar 1988 A
4788645 Zavoli et al. Nov 1988 A
4792803 Madnick et al. Dec 1988 A
4796191 Honey et al. Jan 1989 A
4878170 Zeevi Oct 1989 A
4914605 Loughmiller, Jr. et al. Apr 1990 A
4926343 Tsuruta et al. May 1990 A
5068656 Sutherland Nov 1991 A
5086510 Guenther et al. Feb 1992 A
5095532 Mardus Mar 1992 A
5126941 Gurmu et al. Jun 1992 A
5164904 Sumner Nov 1992 A
5173691 Sumner Dec 1992 A
5182555 Sumner Jan 1993 A
5220507 Kirson Jun 1993 A
5247439 Gurmu et al. Sep 1993 A
5262775 Tamai et al. Nov 1993 A
5276785 Mackinlay et al. Jan 1994 A
5283575 Kao et al. Feb 1994 A
5291412 Tamai et al. Mar 1994 A
5291413 Tamai et al. Mar 1994 A
5291414 Tamai et al. Mar 1994 A
5297028 Ishikawa Mar 1994 A
5297049 Gurmu et al. Mar 1994 A
5303159 Tamai et al. Apr 1994 A
5311195 Mathis et al. May 1994 A
5311434 Tamai May 1994 A
5339246 Kao Aug 1994 A
5343400 Ishikawa Aug 1994 A
5345382 Kao Sep 1994 A
5359529 Snider Oct 1994 A
5374933 Kao Dec 1994 A
5377113 Shibazaki et al. Dec 1994 A
5390123 Ishikawa Feb 1995 A
5394333 Kao Feb 1995 A
5402120 Fujii et al. Mar 1995 A
5414630 Oshizawa et al. May 1995 A
5428545 Maegawa et al. Jun 1995 A
5430655 Adachi Jul 1995 A
5440484 Kao Aug 1995 A
5465079 Bouchard et al. Nov 1995 A
5477220 Ishikawa Dec 1995 A
5485161 Vaughn Jan 1996 A
5488559 Seymour Jan 1996 A
5499182 Ousborne Mar 1996 A
5508931 Snider Apr 1996 A
5515283 Desai et al. May 1996 A
5515284 Abe May 1996 A
5539645 Mandhyan et al. Jul 1996 A
5546107 Deretsky et al. Aug 1996 A
5548822 Yogo Aug 1996 A
5550538 Fujii et al. Aug 1996 A
5554845 Russell Sep 1996 A
5583972 Miller Dec 1996 A
5608635 Tamai Mar 1997 A
5610821 Gazis et al. Mar 1997 A
5689252 Ayanoglu et al. Nov 1997 A
5694534 White, Jr. et al. Dec 1997 A
5699056 Yoshida Dec 1997 A
5706503 Poppen et al. Jan 1998 A
5712788 Liaw et al. Jan 1998 A
5729458 Poppen Mar 1998 A
5731978 Tamai et al. Mar 1998 A
5742922 Kim Apr 1998 A
5751245 Janky et al. May 1998 A
5751246 Hertel May 1998 A
5757359 Morimoto et al. May 1998 A
5774827 Smith et al. Jun 1998 A
5818356 Schuessler Oct 1998 A
5822712 Olsson Oct 1998 A
5842142 Murray et al. Nov 1998 A
5845227 Peterson Dec 1998 A
5850190 Wicks et al. Dec 1998 A
5862244 Kleiner et al. Jan 1999 A
5862509 Desai et al. Jan 1999 A
5864305 Rosenquist Jan 1999 A
5867110 Naito et al. Feb 1999 A
5893081 Poppen Apr 1999 A
5893898 Tanimoto Apr 1999 A
5898390 Oshizawa et al. Apr 1999 A
5902350 Tamai et al. May 1999 A
5904728 Tamai et al. May 1999 A
5908464 Kishigami et al. Jun 1999 A
5910177 Zuber Jun 1999 A
5911773 Mutsuga et al. Jun 1999 A
5912635 Oshizawa et al. Jun 1999 A
5916299 Poppen Jun 1999 A
5922042 Sekine et al. Jul 1999 A
5928307 Oshizawa et al. Jul 1999 A
5931888 Hiyokawa Aug 1999 A
5933100 Golding Aug 1999 A
5938720 Tamai Aug 1999 A
5948043 Mathis et al. Sep 1999 A
5978730 Poppen et al. Nov 1999 A
5982298 Lappenbusch et al. Nov 1999 A
5987381 Oshizawa et al. Nov 1999 A
5991687 Hale et al. Nov 1999 A
5999882 Simpson et al. Dec 1999 A
6009374 Urahashi Dec 1999 A
6011494 Watanabe et al. Jan 2000 A
6016485 Amakawa et al. Jan 2000 A
6021406 Kuznetsov Feb 2000 A
6038509 Poppen et al. Mar 2000 A
6058390 Liaw et al. May 2000 A
6064970 McMillan et al. May 2000 A
6091359 Geier Jul 2000 A
6091956 Hollenberg Jul 2000 A
6097399 Bhatt et al. Aug 2000 A
6111521 Mulder et al. Aug 2000 A
6144919 Ceylan et al. Nov 2000 A
6147626 Sakakibara Nov 2000 A
6150961 Alewine et al. Nov 2000 A
6161092 Latshaw et al. Dec 2000 A
6169552 Endo et al. Jan 2001 B1
6188956 Walters Feb 2001 B1
6209026 Ran et al. Mar 2001 B1
6222485 Walters et al. Apr 2001 B1
6226591 Okumura et al. May 2001 B1
6236933 Lang May 2001 B1
6253146 Hanson et al. Jun 2001 B1
6253154 Oshizawa et al. Jun 2001 B1
6256577 Graunke Jul 2001 B1
6259987 Ceylan et al. Jul 2001 B1
6282486 Bates et al. Aug 2001 B1
6282496 Chowdhary Aug 2001 B1
6292745 Robare et al. Sep 2001 B1
6295492 Lang et al. Sep 2001 B1
6297748 Lappenbusch et al. Oct 2001 B1
6298305 Kadaba et al. Oct 2001 B1
6317685 Kozak et al. Nov 2001 B1
6317686 Ran Nov 2001 B1
6335765 Daly et al. Jan 2002 B1
6353795 Ranjan Mar 2002 B1
6356836 Adolph Mar 2002 B1
6360165 Chowdhary Mar 2002 B1
6360168 Shimabara Mar 2002 B1
6362778 Neher Mar 2002 B2
6415291 Bouve et al. Jul 2002 B2
6424910 Ohler et al. Jul 2002 B1
6442615 Nordenstam et al. Aug 2002 B1
6456931 Polidi et al. Sep 2002 B1
6456935 Ng Sep 2002 B1
6463400 Barkley-Yeung Oct 2002 B1
6466862 DeKock et al. Oct 2002 B1
6470268 Ashcraft et al. Oct 2002 B1
6473000 Secreet et al. Oct 2002 B1
6480783 Myr Nov 2002 B1
6504541 Liu et al. Jan 2003 B1
6526335 Treyz et al. Feb 2003 B1
6529143 Mikkola et al. Mar 2003 B2
6532304 Liu et al. Mar 2003 B1
6539302 Bender et al. Mar 2003 B1
6542814 Polidi et al. Apr 2003 B2
6552656 Polidi et al. Apr 2003 B2
6556905 Mittelsteadt et al. Apr 2003 B1
6559865 Angwin May 2003 B1
6574548 DeKock et al. Jun 2003 B2
6584400 Beardsworth Jun 2003 B2
6594576 Fan et al. Jul 2003 B2
6598016 Zavoli et al. Jul 2003 B1
6600994 Polidi Jul 2003 B1
6603405 Smith Aug 2003 B2
6622086 Polidi Sep 2003 B2
6639550 Knockeart et al. Oct 2003 B2
6643581 Ooishi Nov 2003 B2
6650948 Atkinson et al. Nov 2003 B1
6650997 Funk Nov 2003 B2
6654681 Kiendl et al. Nov 2003 B1
6675085 Straub Jan 2004 B2
6681176 Funk et al. Jan 2004 B2
6687615 Krull et al. Feb 2004 B1
6700503 Masar et al. Mar 2004 B2
6710774 Kawasaki et al. Mar 2004 B1
6720889 Yamaki et al. Apr 2004 B2
6728605 Lash et al. Apr 2004 B2
6728628 Peterson Apr 2004 B2
6731940 Nagendran May 2004 B1
6735516 Manson May 2004 B1
6754833 Black et al. Jun 2004 B1
6785606 DeKock et al. Aug 2004 B2
6791472 Hoffberg Sep 2004 B1
6807483 Chao et al. Oct 2004 B1
6845316 Yates Jan 2005 B2
6859728 Sakamoto et al. Feb 2005 B2
6862524 Nagda et al. Mar 2005 B1
RE38724 Peterson Apr 2005 E
6885937 Suranyi Apr 2005 B1
6901330 Krull et al. May 2005 B1
6914541 Zierden Jul 2005 B1
6922629 Yoshikawa et al. Jul 2005 B2
6931309 Phelan et al. Aug 2005 B2
6952643 Matsuoka et al. Oct 2005 B2
6965665 Fan et al. Nov 2005 B2
6983204 Knutson Jan 2006 B2
6987964 Obradovich et al. Jan 2006 B2
6989765 Gueziec Jan 2006 B2
6999873 Krull et al. Feb 2006 B1
7010583 Aizono et al. Mar 2006 B1
7062378 Krull et al. Jun 2006 B2
7069143 Peterson Jun 2006 B2
7103854 Fuchs et al. Sep 2006 B2
7161497 Gueziec Jan 2007 B2
7209828 Katou Apr 2007 B2
7221287 Gueziec May 2007 B2
7243134 Bruner et al. Jul 2007 B2
7343242 Breitenberger et al. Mar 2008 B2
7356392 Hubbard et al. Apr 2008 B2
7375649 Gueziec May 2008 B2
7424388 Sato Sep 2008 B2
7433676 Kobayashi et al. Oct 2008 B2
7440842 Vorona Oct 2008 B1
7486201 Kelly et al. Feb 2009 B2
7508321 Gueziec Mar 2009 B2
7557730 Gueziec Jul 2009 B2
7558674 Neilley et al. Jul 2009 B1
7603138 Zhang et al. Oct 2009 B2
7610145 Kantarjiev et al. Oct 2009 B2
7613564 Vorona Nov 2009 B2
7634352 Soulchin et al. Dec 2009 B2
7702452 Kantarjiev et al. Apr 2010 B2
7792642 Neiley et al. Sep 2010 B1
7880642 Gueziec Feb 2011 B2
7908076 Downs et al. Mar 2011 B2
7912627 Downs et al. Mar 2011 B2
8024111 Meadows et al. Sep 2011 B1
8103443 Kantarjiev et al. Jan 2012 B2
8229658 Dabell Jul 2012 B1
8358222 Gueziec Jan 2013 B2
8428856 Tischer Apr 2013 B2
8531312 Gueziec Sep 2013 B2
8537033 Gueziec Sep 2013 B2
8564455 Gueziec Oct 2013 B2
8618954 Free Dec 2013 B2
8619072 Gueziec Dec 2013 B2
8660780 Kantarjiev Feb 2014 B2
8718910 Gueziec May 2014 B2
8725396 Gueziec May 2014 B2
8781718 Margulici Jul 2014 B2
8786464 Gueziec Jul 2014 B2
8825356 Vorona Sep 2014 B2
8958988 Gueziec Feb 2015 B2
8965695 Tzamaloukas Feb 2015 B2
8972171 Barth Mar 2015 B1
8982116 Gueziec Mar 2015 B2
9002636 Udeshi et al. Apr 2015 B2
9046924 Gueziec Jun 2015 B2
9070291 Gueziec Jun 2015 B2
9082303 Gueziec Jul 2015 B2
9127959 Kantarjiev Sep 2015 B2
9158980 Ferguson et al. Oct 2015 B1
9293039 Margulici Mar 2016 B2
9368029 Gueziec Jun 2016 B2
9390620 Gueziec Jul 2016 B2
20010014848 Walgers et al. Aug 2001 A1
20010018628 Jenkins et al. Aug 2001 A1
20010026276 Sakamoto et al. Oct 2001 A1
20010033225 Razavi et al. Oct 2001 A1
20010047242 Ohta Nov 2001 A1
20010049424 Petiniot et al. Dec 2001 A1
20020022923 Hirabayashi et al. Feb 2002 A1
20020042819 Reichert et al. Apr 2002 A1
20020077748 Nakano Jun 2002 A1
20020152020 Seibel Oct 2002 A1
20020177947 Cayford Nov 2002 A1
20030009277 Fan et al. Jan 2003 A1
20030046158 Kratky Mar 2003 A1
20030055558 Watanabe et al. Mar 2003 A1
20030109985 Kotzin Jun 2003 A1
20030135304 Sroub et al. Jul 2003 A1
20030151592 Ritter Aug 2003 A1
20030182052 DeLorme et al. Sep 2003 A1
20040034464 Yoshikawa et al. Feb 2004 A1
20040046759 Soulchin et al. Mar 2004 A1
20040049424 Murray et al. Mar 2004 A1
20040080624 Yuen Apr 2004 A1
20040107288 Menninger et al. Jun 2004 A1
20040143385 Smyth et al. Jul 2004 A1
20040166939 Leifer et al. Aug 2004 A1
20040225437 Endo et al. Nov 2004 A1
20040249568 Endo et al. Dec 2004 A1
20050021225 Kantarjiev et al. Jan 2005 A1
20050027436 Yoshikawa et al. Feb 2005 A1
20050083325 Cho Apr 2005 A1
20050099321 Pearce May 2005 A1
20050143902 Soulchin et al. Jun 2005 A1
20050154505 Nakamura et al. Jul 2005 A1
20050212756 Marvit et al. Sep 2005 A1
20050240340 Ishikawa et al. Oct 2005 A1
20060074546 DeKock et al. Apr 2006 A1
20060122846 Burr et al. Jun 2006 A1
20060143959 Stehle et al. Jul 2006 A1
20060145892 Gueziec Jul 2006 A1
20060158330 Gueziec Jul 2006 A1
20060238521 Westerman et al. Oct 2006 A1
20060238617 Tamir Oct 2006 A1
20060284766 Gruchala et al. Dec 2006 A1
20070009156 O'Hara Jan 2007 A1
20070013551 Gueziec Jan 2007 A1
20070038362 Gueziec Feb 2007 A1
20070060384 Dohta Mar 2007 A1
20070066394 Ikeda et al. Mar 2007 A1
20070115252 Burgmans May 2007 A1
20070142995 Wottlermann Jun 2007 A1
20070197217 Sutardja Aug 2007 A1
20070208495 Chapman et al. Sep 2007 A1
20070208496 Downs et al. Sep 2007 A1
20070211026 Ohta Sep 2007 A1
20070211027 Ohta Sep 2007 A1
20070222750 Ohta Sep 2007 A1
20070247291 Masuda et al. Oct 2007 A1
20070265766 Jung et al. Nov 2007 A1
20080014908 Vasant Jan 2008 A1
20080021632 Amano Jan 2008 A1
20080071465 Chapman et al. Mar 2008 A1
20080084385 Ranta et al. Apr 2008 A1
20080096654 Mondesir et al. Apr 2008 A1
20080133120 Romanick Jun 2008 A1
20080248848 Rippy et al. Oct 2008 A1
20080255754 Pinto Oct 2008 A1
20080287189 Rabin Nov 2008 A1
20080297488 Operowsky et al. Dec 2008 A1
20090005965 Forstall et al. Jan 2009 A1
20090061971 Weitzner et al. Mar 2009 A1
20090066495 Newhouse et al. Mar 2009 A1
20090082950 Vorona Mar 2009 A1
20090096753 Lim Apr 2009 A1
20090112465 Weiss et al. Apr 2009 A1
20090118017 Perlman et al. May 2009 A1
20090118996 Kantarjiev et al. May 2009 A1
20090189979 Smyth Jul 2009 A1
20090192702 Bourne Jul 2009 A1
20090254272 Hendrey Oct 2009 A1
20100036594 Yamane Feb 2010 A1
20100045517 Tucker Feb 2010 A1
20100079306 Liu et al. Apr 2010 A1
20100094531 MacLeod Apr 2010 A1
20100100307 Kim Apr 2010 A1
20100145569 Bourque et al. Jun 2010 A1
20100145608 Kurtti et al. Jun 2010 A1
20100164753 Free Jul 2010 A1
20100175006 Li Jul 2010 A1
20100194632 Raento et al. Aug 2010 A1
20100198453 Dorogusker et al. Aug 2010 A1
20100225643 Gueziec Sep 2010 A1
20100305839 Wenzel Dec 2010 A1
20100312462 Gueziec Dec 2010 A1
20100333045 Gueziec Dec 2010 A1
20110029189 Hyde et al. Feb 2011 A1
20110037619 Ginsberg et al. Feb 2011 A1
20110106427 Kim et al. May 2011 A1
20110304447 Marumoto Dec 2011 A1
20120044066 Mauderer et al. Feb 2012 A1
20120065871 Deshpande et al. Mar 2012 A1
20120072096 Chapman et al. Mar 2012 A1
20120123667 Gueziec May 2012 A1
20120150422 Kantarjiev et al. Jun 2012 A1
20120150425 Chapman et al. Jun 2012 A1
20120158275 Huang et al. Jun 2012 A1
20120226434 Chiu Sep 2012 A1
20120290202 Gueziec Nov 2012 A1
20120290204 Gueziec Nov 2012 A1
20120296559 Gueziec Nov 2012 A1
20130033385 Gueziec Feb 2013 A1
20130204514 Margulici Aug 2013 A1
20130207817 Gueziec Aug 2013 A1
20130211701 Baker et al. Aug 2013 A1
20130297175 Davidson Nov 2013 A1
20130304347 Davidson Nov 2013 A1
20130304349 Davidson Nov 2013 A1
20140088871 Gueziec Mar 2014 A1
20140091950 Gueziec Apr 2014 A1
20140107923 Gueziec Apr 2014 A1
20140129124 Margulici May 2014 A1
20140129142 Kantarjiev May 2014 A1
20140200807 Geisberger Jul 2014 A1
20140236464 Gueziec Aug 2014 A1
20140249734 Gueziec Sep 2014 A1
20140316688 Margulici Oct 2014 A1
20140320315 Gueziec Oct 2014 A1
20150081196 Petty et al. Mar 2015 A1
20150141043 Abramson et al. May 2015 A1
20150168174 Abramson et al. Jun 2015 A1
20150168175 Abramson et al. Jun 2015 A1
20150177018 Gueziec Jun 2015 A1
20150248795 Davidson Sep 2015 A1
20150261308 Gueziec Sep 2015 A1
20150268055 Gueziec Sep 2015 A1
20150268056 Gueziec Sep 2015 A1
20150325123 Gueziec Nov 2015 A1
20160047667 Kantarjiev Feb 2016 A1
Foreign Referenced Citations (34)
Number Date Country
6710924 Jul 2013 CO
19856704 Jun 2001 DE
0 749 103 Dec 1996 EP
0 987 665 Mar 2000 EP
1 006 367 Jun 2000 EP
2 178 061 Apr 2010 EP
2 635 989 Sep 2011 EP
2 616 910 Jul 2013 EP
2 638 493 Sep 2013 EP
2 710 571 Mar 2014 EP
2 820 631 Jan 2015 EP
2 400 293 Oct 2004 GB
05-313578 Nov 1993 JP
08-77485 Mar 1996 JP
10-261188 Sep 1998 JP
10-281782 Oct 1998 JP
10-293533 Nov 1998 JP
2000-055675 Feb 2000 JP
2000-113387 Apr 2000 JP
2001-330451 Nov 2001 JP
WO 9636929 Nov 1996 WO
WO 9823018 May 1998 WO
WO 0050917 Aug 2000 WO
WO 0188480 Nov 2001 WO
WO 0277921 Oct 2002 WO
WO 03014671 Feb 2003 WO
WO 2005013063 Feb 2005 WO
WO 2005076031 Aug 2005 WO
WO 2010073053 Jul 2010 WO
WO 2012024694 Feb 2012 WO
WO 2012037287 Mar 2012 WO
WO 2012065188 May 2012 WO
WO 2012159083 Nov 2012 WO
WO 2013113029 Aug 2013 WO
Non-Patent Literature Citations (187)
Entry
US 9,019,260, 04/2015, Gueziec (withdrawn)
U.S. Appl. No. 12/881,690, Final Office Action mailed May 21, 2014.
U.S. Appl. No. 14/155,174, Christopher Kantarjiev, System and Method for Delivering Departure Notifications, filed Jan. 14, 2014.
U.S. Appl. No. 14/029,617, Andre Gueziec, Generating Visual Information Associated With Traffic, filed Sep. 17, 2013.
U.S. Appl. No. 14/022,224, Andre Gueziec, Method for Choosing a Traffic Route, filed Sep. 10, 2013.
U.S. Appl. No. 14/029,621, Andre Gueziec, Method for Predicting a Travel Time for a Traffic Route, filed Sep. 17, 2013.
U.S. Appl. No. 14/058,195, J.D. Margulici, Estimating Time Travel Distributions on Signalized Arterials, filed Oct. 18, 2013.
Acura Debuts AcuraLink™ Satellite-Linked Communication System with Industry's First Standard Real Time Traffic Feature at New York International Auto Show, 2004, 4 pages.
Adib Kanafani, “Towards a Technology Assessment of Highway Navigation and Route Guidance,” Program on Advanced Technology for the Highway, Institute of Transportation Studies, University of California, Berkeley, Dec. 1987, PATH Working Paper UCB-ITS-PWP-87-6.
Answer, Affirmative Defenses, and Counterclaims by Defendant Westwood One, Inc., to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement, Mar. 11, 2011.
Answer and Counterclaims of TomTom, Inc. to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement, May 16, 2011.
Amended Answer and Counterclaims of TomTom, Inc. to Plaintiff Triangle Software, LLC's Complaint for Patent Infringement, Mar. 16, 2011.
Attachment A of Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. V. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 6 pages.
Attachment B of Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. V. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 618 pages.
Audi-V150 Manual, Oct. 2001, 152 pages, Japan.
Balke, K.N., “Advanced Technologies for Communicating with Motorists: A Synthesis of Human Factors and Traffic Management Issues,” Report No. FHWA/TX-92/1232-8, May 1992, Texas Department Transportation, Austin, TX, USA, 62 pages.
Barnaby J. Feder, “Talking Deals; Big Partners in Technology,” Technology, The New York Times, Sep. 3, 1987.
Birdview Navigation System by Nissan Motor Corp, 240 Landmarks of Japanese Automotive Technology, 1995, 2 pages, Society of Automotive Engineers of Japan, Inc., Japan.
Blumentritt, K. et al., “Travel System Architecture Evaluation,” Publication No. FHWA-RD-96-141, Jul. 1995, 504 pages, U.S. Department of Transportation, McLean, VA, USA.
Brooks, et al., “Turn-by-Turn Displays versus Electronic Maps: An On-the-Road Comparison of Driver Glance Behavior,” Technical Report, The University of Michigan, Transportation Research Institute (UMTRI), Jan. 1999.
Burgett, A.L., “Safety Evaluation of TravTek,” Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), p. 253, Part 1, Oct. 1991, pp. 819-825, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Campbell, J.L. “Development of Human Factors Design Guidelines for Advanced Traveler Information Systems (ATIS)”, Proceedings Vehicle Navigation and Information Systems Conference, 1995, pp. 161-164, IEEE, New York, NY, USA.
Campbell, J.L. “Development of Human Factors Design Guidelines for Advanced Traveler Information Systems (ATIS) and Commercial Vehicle Operations (CVO)”, Publication No. FHWA-RD-98-057, Report Date Sep. 1998, 294 pages, U.S. Department of Transportation, McLean, VA 22010-2296.
Carin Navigation System Manual and Service Manual for Model Carin 22SY520, 75 pages, Philips Car Systems, The Netherlands, [date unknown].
Cathey, F.W. et al., “A Prescription for Transit Arrival/Departure Prediction Using Automatic Vehicle Location Data,” Transportation Research Part C 11, 2003, pp. 241-264, Pergamon Press Ltd., Elsevier Ltd., U.K.
Chien, S.I. et al., “Predicting Travel Times for the South Jersey Real-Time Motorist Information System,” Transportation Research Record 1855, Paper No. 03-2750, Revised Oct. 2001, pp. 32-40.
Chira-Chavala, T. et al., “Feasibility Study of Advanced Technology HOV Systems,” vol. 3: Benefit Implications of Alternative Policies for Including HOV lanes in Route Guidance Networks, Dec. 1992, 84 pages, UCB-ITS-PRR-92-5 PATH Research Report, Inst. of Transportation Studies, Univ. of Calif., Berkeley, USA.
Clark, E.L., Development of Human Factors Guidelines for Advanced Traveler Information Systems (ATIS) and Commercial Vehicle Operations (CVO): Comparable Systems Analysis, Dec. 1996, 199 pages.
Dancer, F. et al., “Vehicle Navigation Systems: Is America Ready?,” Navigation and Intelligent Transportation System, Automotive Electronics Series, Society of Automotive Engineers, 1998, pp. Cover page, Table of Contents pp. 3-8.
Davies, P. et al., “Assessment of Advanced Technologies for Relieving Urban Traffic Congestion” National Cooperative Highway Research Program Report 340, Dec. 1991, 106 pages.
de Cambray, B. “Three-Dimensional (3D) Modeling in a Geographical Database,” Auto-Carto'11, Eleventh International Conference on Computer Assisted Cartography, Oct. 30, 1993-Nov. 1, 1993, pp. 338-347, Minneapolis, USA.
Declaration Under 37 C.F.R. 1.131 and Source Code from U.S. Appl. No. 10/897,550, Oct. 27, 2008.
Dillenburg, J.F. et al., “The Intelligent Travel Assistant,” IEEE 5th International Conference on Intelligent Transportation Systems, Sep. 3-6, 2002, pp. 691-696, Singapore.
Dingus, T.A. et al., “Human Factors Engineering the TravTek Driver Interface,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), p. 253, Part 2, Oct. 1991, pp. 749-755, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Endo, et al., “Development and Evaluation of a Car Navigation System Providing a Birds Eye View Map Display,” Navigation and Intelligent Transportation Systems, Automotive Electronics Series, Society of Automotive Engineers, 1998, pp. Cover page, Table of Contents, pp. 19-22.
Eppinger, A. et al., “Dynamic Route Guidance—Status and Trends,” Convergence 2000 International Congress on Transportation Electronics, Oct. 16-18, 1999, 7 pages, held in Detroit, MI, SAE International Paper Series, Warrendale, PA, USA.
Expert Report of Dr. Michael Goodchild Concerning the Validity of U.S. Pat. No. 5,938,720 dated Jun. 16, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 16 pages.
Fawcett, J., “Adaptive Routing for Road Traffic,” IEEE Computer Graphics and Applications, May/Jun. 2000, pp. 46-53, IEEE, New York, NY, USA.
Fleischman, R.N., “Research and Evaluation Plans for the TravTek IVHS Operational Field Test, ”Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), p. 253, Part 2, Oct. 1991, pp. 827-837, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Garmin International, Inc.'s Answer and Counterclaims to Triangle Software, LLC's Complaint, Feb. 24, 2011.
Garmin International, Inc.'s Amended Answer and Counterclaims to Triangle Software, LLC's Complaint, Mar. 16, 2011.
Garmin International, Inc. and Garmin USA, Inc.'s Answer and Counterclaim to Triangle Software, LLC's Supplemental Complaints filed Jun. 17, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 36 pages.
Garmin's Preliminary Invalidity Contentions and Certificate of Service filed May 16, 2011 in Triangle Software, LLC. V. Garmin International, Inc. et al., Case No. 1: 10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 46 pages.
Goldberg et al., “Computing the Shortest Path: A* Search Meets Graph Theory,” Proc. of the 16th Annual ACM-SIAM Sym. on Discrete Algorithms, Jan. 23-25, 2005. Vancouver, BC.
Goldberg et al., “Computing the Shortest Path: A* Search Meets Graph Theory,” Microsoft Research, Technical Report MSR-TR-2004 Mar. 24, 2003.
Golisch, F., Navigation and Telematics in Japan, International Symposium on Car Navigation Systems, May 21, 1997, 20 pages, held in Barcelona, Spain.
GM Exhibits Prototype of TravTek Test Vehicle, Inside IVHS, Oct. 28, 1991, V. 1, No. 21, 2 pages.
Gueziec, Andre, “3D Traffic Visualization in Real Time,” ACM Siggraph Technical Sketches, Conference Abstracts and Applications, p. 144, Los Angeles, CA, Aug. 2001.
Gueziec, A., “Architecture of a System for Producing Animated Traffic Reports,” Mar. 30, 2011, 42 pages.
Handley, S. et al., “Learning to Predict the Duration of an Automobile Trip,” Proceedings of the Fourth International Conference on Knowledge Discovery and Data Mining, 1998, 5 pages, AAAI Press, New York, NY, USA.
Hankey, et al., “In-Vehicle Information Systems Behavioral Model and Design Support: Final Report,” Feb. 16, 2000, Publication No. 00-135, Research, Development, and Technology, Turner-Fairbank Highway Research Center, McLean, Virginia.
Hirata et al., “The Development of a New Multi-AV System Incorporating an On-Board Navigation Function,” International Congress and Exposition, Mar. 1-5, 1993, pp. 1-12, held in Detroit, MI, SAE International, Warrendale, PA, USA.
Hoffmann, G. et al., Travel Times as a Basic Part of the LISB Guidance Strategy, Third International Conference on Road Traffic Control, May 1-3, 1990, pp. 6-10, London, U.K.
Hoffmann, T., “2005 Acura RL Prototype Preview,” Auto123.com, 4 pages.
Hu, Z. et al., “Real-time Data Fusion on Tracking Camera Pose for Direct Visual Guidance,” IEEE Vehicles Symposium, Jun. 14-17, 2004, pp. 842-847, held in Parma, Italy.
Hulse, M.C. et al., “Development of Human Factors Guidelines for Advanced Traveler Information Systems and Commercial Vehicle Operations: Identification of the Strengths and Weaknesses of Alternative Information Display Formats,” Publication No. FHWA-RD-96-142, Oct. 16, 1998, 187 pages, Office of Safety and Traffic Operation R&D, Federal Highway Administration, USA.
Initial Expert Report of Roy Summer dated Jun. 16, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 289 pages.
Initial Expert Report of William R. Michalson, PH.D. dated Jun. 17, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 198 pages.
Inman, V.W., et al., “TravTek Global Evaluation and Executive Summary,” Publication No. FHWA-RD-96-031, Mar. 1996, 104 pages, U.S. Department of Transportation, McLean, VA, USA.
Inman, V.W., et al., “TravTek Evaluation Rental and Local User Study,” Publication No. FHWA-RD-96-028, Mar. 1996, 110 pages, U.S. Department of Transportation, McLean, VA, USA.
Jiang, G., “Travel-Time Prediction for Urban Arterial Road: A Case on China,” Proceedings Intelligent Transportation Systems, Oct. 12-15, 2003, pp. 255-260, IEEE, New York, NY, USA.
Karabassi, A. et al., “Vehicle Route Prediction and Time and Arrival Estimation Techniques for Improved Transportation System Management,” in Proceedings of the Intelligent Vehicles Symposium, 2003, pp. 511-516, IEEE, New York, NY, USA.
Koller, D. et al., “Virtual GIS: A Real-Time 3D Geographic Information System,” Proceedings of the 6th IEEE Visualization Conference (Visualization 95) 1995, pp. 94-100, IEEE, New York, NY, USA.
Kopitz et al., Table of Contents, Chapter 6, Traffic Information Services, and Chapter 7, Intelligent Transport Systems and RDS-TMC in RDS: The Radio Data System, 1992, Cover p. XV, pp. 107-167, Back Cover page, Artech House Publishers, Boston, USA and London, Great Britain.
Krage, M.K., “The TravTek Driver Information System,” Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), P-253, Part 1, Oct. 1991, pp. 739-748, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Ladner, R. et al., “3D Mapping of Interactive Synthetic Environment,” Computing Practices, Mar. 2000, pp. 33-39, IEEE, New York, NY, USA.
Levinson, D., “Assessing the Benefits and Costs of Intelligent Transportation Systems: The Value of Advanced Traveler Information System,” Publication UCB-ITS-PRR-99-20, California Path Program, Jul. 1999, Institute of Transportation Studies, University of California, Berkeley, CA, USA.
Lowenau, J., “Final Map Actualisation Requirements,” Version 1.1, ActMAP Consortium, Sep. 30, 2004, 111 pages.
Meridian Series of GPS Receivers User Manual, Magellan, 2002, 106 pages, Thales Navigation, Inc., San Dimas, CA, USA.
Ness, M., “A Prototype Low Cost In-Vehicle Navigation System,” IEEE-IEE Vehicle Navigation & Information Systems Conference (VNIS), 1993, pp. 56-59, New York, NY, USA.
N'Fit Xanavi, unknown date, 94 pages, Japan.
Nintendo Wii Operations Manual: System Setup, 2009.
Nissan Automobile Navigation System User Manual, [date unknown], 163 pages.
Noonan, J., “Intelligent Transportation Systems Field Operational Test Cross-Cutting Study Advanced Traveler Information Systems,” Sep. 1998, 27 pages, U.S. Department of Transportation, McLean, VA, USA.
Odagaki et al., Automobile Navigation System with Multi-Source Guide Information, International Congress & Exposition, Feb. 24-28, 1992, pp. 97-105. SAE International, Warrendale, PA, USA.
Panasonic Portable Navigation System User Manual for Products KX-GT30, KX-GT30X, and KX-GT30Z, Cover page, pp. 1-5, 132-147, End pages, Matsushita Denki Sangyo K.K., Fukuoka City, Japan [Date Unknown].
Preliminary Invalidity Contentions of Defendant TomTom, Inc., Certificate of Service and Exhibit A filed May 16, 2011 in Triangle Software, LLC v. Garmin International, Inc. et al., Case No. 1:10-cv-1457-CMH-TCB in the United States District Court for the Eastern District of Virginia, Alexandria Division, 354 pages.
Raper, J.F., “Three-Dimensional GIS,” in Geographical Information Systems: Principles and Applications, 1991, vol. 1, Chapter 20, 21 pages.
“Reference Manual for the Magellan RoadMate 500/700.” 2003, 65 pages, Thales Navigation, Inc., San Dimas, CA, USA.
Rilett, L.R., “Simulating the TravTek Route Guidance Logic Using the Integration Traffic Model,” Vehicle Navigation & Information System, p. 253, Part 2, Oct. 1991, pp. 775-787, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Rillings, J.H., “Advanced Driver Information Systems,” IEEE Transactions on Vehicular Technology, Feb. 1991, vol. 40, No. 1, pp. 31-40, IEEE, New York, NY, USA.
Rillings, J.H., “TravTek,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), p. 253, Part 2, Oct. 1991, pp. 729-737, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Rockwell, Mark, “Telematics Speed Zone Ahead,” Wireless Week, Jun. 15, 2004, Reed Business Information, http://www.wirelessweek.com.
Rupert, R.L., “The TravTek Traffic Management Center and Traffic Information Network,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), p. 253, Part 1, Oct. 1991, pp. 757-761, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Schofer, J.L., “Behavioral Issues in the Design and Evaluation of Advanced Traveler Information Systems,” Transportation Research Part C 1, 1993, pp. 107-117, Pergamon Press Ltd., Elsevier Science Ltd.
Schulz, W., “Traffic Management Improvement by Integrating Modern Communication Systems,” IEEE Communications Magazine, Oct. 1996, pp. 56-60, New York, NY, USA.
Shepard, I.D.H., “Information Integration and GIS,” in Geographical Information Systems: Principles and Applications, 1991, vol. 1, pp. Cover page, 337-360, end page.
Sirius Satellite Radio: Traffic Development Kit Start Up Guide, Sep. 27, 2005, Version 00.00.01, NY, New York, 14 pages.
Slothower, D., “Sketches & Applications,” SIGGRAPH 2001, pp. 138-144, Stanford University.
Sumner, R., Data Fusion in Pathfinder and TravTek, Part 1, Vehicle Navigation & Information Systems Conference Proceedings (VNIS'91), Oct. 1991, Cover & Title page, pp. 71-75.
Supplemental Expert Report of William R. Michalson, Ph.D. Regarding Invalidity of the Patents-in-Suit dated Jul. 5, 2011 in Triangle Software, LLC v. Garmin International Inc. et al., in the United States District Court for the Eastern District of Virginia, Alexandria Division, Case No. 1:10-cv-1457-CMH-TCB, 23 pages.
Tamura et al., “Toward Realization of VICS—Vehicle Information and Communications System,” IEEE-IEE Vehicle Navigation & Information Systems Conference (VNIS'93), 1993, pp. 72-77, held in Ottawa, Canada.
Taylor, K.B., “TravTek-Information and Services Center,” Vehicle Navigation & Information System Conference Proceedings (VNIS'91), p. 253, Part 2, Oct. 1991, pp. 763-774, Soc. of Automotive Engineers, Inc., Warrendale, PA, USA.
Texas Transportation Institute, “2002 Urban Mobility Study: 220 Mobility Issues and Measures: The Effects of Incidents—Crashes and Vehicle Breakdowns” (2002).
“The Challenge of VICS: The Dialog Between the Car and Road has Begun,” Oct. 1, 1996, pp. 19-63, The Road Traffic Information Communication System Centre (VICS Centre), Tokyo, Japan.
Thompson, S.M., “Exploiting Telecommunications to Deliver Real Time Transport Information,” Road Transport Information and Control, Conf. Publication No. 454, Apr. 21-23, 1998, pp. 59-63, IEE, U.K.
Tonjes, R., “3D Reconstruction of Objects from Aerial Images Using a GIS,” presented at ISPRS Workshops on “Theoretical and Practical Aspects of Surface Reconstruction and 3-D Object Extraction,” Sep. 9-11, 1997, 8 pages, held in Haifa, Israel.
“Travtek Information and Services Center Policy/Procedures Manual,” Feb. 1992, 133 pages, U.S. Department of Transportation, McLean, VA, USA.
Truett, R., “Car Navigation System May Live on After Test,” The Orlando Sentinel, Feb. 17, 1993, 3 pages.
U.S. Dept. of Transportation, Closing the Data Gap: Guidelines for Quality Advanced Traveler Information System (ATIS) Data, Version 1.0, Sep. 2000, 41 pages.
User Guide of TomTom ONE, 2006.
Vollmer, R., “Navigation Systems—Intelligent Co-Drivers with Knowledge of Road and Tourist Information,” Navigation and Intelligent Transportation Systems, Automotive Electronics Series, Society of Automotive Engineers, 1998, pp. Cover page, Table of Contents, pp. 9-17.
Volkswagen Group of America, Inc.'s Answer and Counterclaim, Feb. 24, 2011.
Watanabe, M. et al., “Development and Evaluation of a Car Navigation System Providing a Bird's-Eye View Map Display,” Technical Paper No. 961007, Feb. 1, 1996, pp. 11-18, SAE International.
Wischhof, L. et al., “SOTIS—A Self-Organizing Traffic Information System,” Proceedings of the 57th IEEE Vehicular Technology Conference (VTC—03), 2003, pp. 2442-2446, New York, NY, USA.
WSI, “TrueView Interactive Training Manual, Showfx Student Guide,” Print Date: Sep. 2004, Document Version: 4.3x. Link: http://apollo.lsc.vsc.edu/intranet/WSI_Showfx/training/970-TVSK-SG-43.pdf.
XM Radio Introduces Satellite Update Service for Vehicle Navigation, Apr. 8, 2004, 2 pages.
Yim et al., “TravInfo Field Operational Test Evaluation: Evaluation of TravInfo Field Operation Test,” Apr. 25, 2000.
Yim et al., “Travinfo Field Operational Test Evaluation: Information Service Providers Customer Survey”, May 1, 2000.
Yokouchi, K., “Car-Navigation Systems,” Mitsubishi Electr. Adv. Technical Reports, 2000, vol. 91, pp. 10-14, Japan.
You, J. et al., “Development and Evaluation of a Hybrid Travel Time Forecasting Model,” Transportation Research Part C 9, 2000, pp. 231-256, Pergamon Press Ltd., Elsevier Science Ltd., U.K.
Zhao, Y., “Vehicle Location and Navigation Systems,” 1997, 370 pages, Artech House, Inc., Norwood, MA, USA.
Zhu, C. et al., “3D Terrain Visualization for Web GIS,” Center for Advanced Media Technology, Nanyang Technological University, Singapore, 2003, 8 pages.
PCT Application No. PCT/US2004/23884, Search Report and Written Opinion mailed Jun. 17, 2005.
PCT Application No. PCT/US2011/48680, Search Report and Written Opinion mailed Feb. 7, 2012.
PCT Application No. PCT/US2011/51647, Search Report and Written Opinion mailed Feb. 2, 2012.
PCT Application No. PCT/US2011/60663, Search Report and Written Opinion mailed May 31, 2012.
PCT Application No. PCT/US2012/38702, Search Report and Written Opinion mailed Aug. 24, 2012.
PCT Application No. PCT/US2013/23505, Search Report and Written Opinion mailed May 10, 2013.
U.S. Appl. No. 10/379,967, Final Office Action mailed May 11, 2005.
U.S. Appl. No. 10/379,967, Office Action mailed Sep. 20, 2004.
U.S. Appl. No. 10/897,550, Office Action mailed Jun. 12, 2009.
U.S. Appl. No. 10/897,550, Office Action mailed Jan. 21, 2009.
U.S. Appl. No. 10/897,550, Office Action mailed Aug. 1, 2008.
U.S. Appl. No. 10/897,550, Office Action mailed Oct. 3, 2007.
U.S. Appl. No. 11/509,954, Office Action mailed Nov. 23, 2007.
U.S. Appl. No. 11/751,628, Office Action mailed Jan. 29, 2009.
U.S. Appl. No. 12/283,748, Office Action mailed Aug. 20, 2009.
U.S. Appl. No. 12/283,748, Office Action mailed Mar. 11, 2009.
U.S. Appl. No. 12/398,120, Final Office Action mailed Mar. 26, 2013.
U.S. Appl. No. 12/398,120, Office Action mailed Nov. 14, 2012.
U.S. Appl. No. 12/398,120, Final Office Action mailed Apr. 12, 2012.
U.S. Appl. No. 12/398,120, Office Action mailed Nov. 15, 2011.
U.S. Appl. No. 12/763,199, Final Office Action mailed Nov. 1, 2010.
U.S. Appl. No. 12/763,199, Office Action mailed Aug. 5, 2010.
U.S. Appl. No. 12/860,700, Final Office Action mailed Jun. 26, 2013.
U.S. Appl. No. 12/860,700, Office Action mailed Feb. 26, 2013.
U.S. Appl. No. 12/881,690, Office Action mailed Jan. 9, 2014.
U.S. Appl. No. 12/881,690, Final Office Action mailed Aug. 9, 2013.
U.S. Appl. No. 12/881,690, Office Action mailed Apr. 22, 2013.
U.S. Appl. No. 12/967,045, Final Office Action mailed Jun. 27, 2012.
U.S. Appl. No. 12/967,045, Office Action mailed Jul. 18, 2011.
U.S. Appl. No. 13/296,108, Final Office Action mailed Oct. 25, 2013.
U.S. Appl. No. 13/296,108, Office Action mailed May 9, 2013.
U.S. Appl. No. 13/316,250, Final Office Action mailed Jun. 25, 2013.
U.S. Appl. No. 13/316,250, Office Action mailed Jan. 18, 2013.
U.S. Appl. No. 13/475,502, Final Office Action mailed Sep. 10, 2013.
U.S. Appl. No. 13/475,502, Office Action mailed Apr. 22, 2013.
U.S. Appl. No. 13/561,269, Office Action mailed Dec. 13, 2012.
U.S. Appl. No. 13/561,327, Office Action mailed Oct. 26, 2012.
U.S. Appl. No. 13/747,454, Office Action mailed Jun. 17, 2013.
U.S. Appl. No. 13/752,351, Office Action mailed Jul. 22, 2013.
U.S. Appl. No. 14/327,468, Andre Gueziec, GPS Generated Traffic Information, filed Jul. 9, 2014.
U.S. Appl. No. 14/323,352, J.D. Margulici, Estimating Time Travel Distributions on Signalized Arterials, filed Jul. 3, 2014.
U.S. Appl. No. 14/265,290, Andre Gueziec, Crowd Sourced Traffic Reporting, filed Apr. 29, 2014.
U.S. Appl. No. 14/275,702, Andre Gueziec, System for Providing Traffic Data and Driving Efficiency Data, filed May 12, 2014.
U.S. Appl. No. 12/860,700, Final Office Action mailed Jul. 22, 2014.
U.S. Appl. No. 12/860,700, Office Action mailed Apr. 3, 2014.
Yang, Qi, “A Simulation Laboratory for Evaluation of Dynamic Traffic Management Systems,” Massachusetts Institute of Technology, Jun. 1997.
U.S. Appl. No. 12/881,690, Office Action mailed Sep. 3, 2014.
Huang, Tsan-Huang and Chen, Wu-Cheng, “Experimental Analysis and Modeling of Route Choice with the Revealed and Stated Preference Data,” Journal of the Eastern Asia Society for Transportation Studies, vol. 3, No. 6, Sep. 1999—Traffic Flow and Assignment.
U.S. Appl. No. 14/323,352, Office Action mailed Nov. 26, 2014.
U.S. Appl. No. 14/058,195, Office Action mailed Nov. 12, 2014.
U.S. Appl. No. 14/624,498, Andre Gueziec, Method for Choosing a Traffic Route, filed Feb. 17, 2015.
U.S. Appl. No. 14/637,357, Andre Gueziec, Touch Screen Based Interaction With Traffic Data, filed Mar. 3, 2015.
U.S. Appl. No. 14/327,468, Office Action mailed Mar. 12, 2015.
U.S. Appl. No. 14/323,352, Final Office Action mailed Apr. 3, 2015.
U.S. Appl. No. 14/058,195, Final Office Action mailed Apr. 8, 2015.
U.S. Appl. No. 14/265,290, Office Action mailed Jul. 23, 2015.
U.S. Appl. No. 14/327,468, Final Office Action mailed Apr. 4, 2015.
U.S. Appl. No. 14/058,195, Office Action mailed Aug. 4, 2015.
U.S. Appl. No. 14/846,576, Christopher Kantarjiev, System and Method for Delivering Departure Notifications, filed Sep. 4, 2015.
U.S. Appl. No. 14/793,879, Andre Gueziec, Generating Visual Information Associated With Traffic, filed Jul. 8, 2015.
U.S. Appl. No. 14/726,858, Andre Gueziec, Gesture Based Interaction With Traffic Data, filed Jun. 1, 2015.
EP Patent Application No. 12785688.8 Extended European Search Report dated Aug. 12, 2015.
U.S. Appl. No. 14/275,702, Office Action mailed Nov. 30, 2015.
U.S. Appl. No. 14/265,290, Final Office Action mailed Jan. 29, 2016.
U.S. Appl. No. 14/624,498, Office Action mailed Feb. 18, 2016.
U.S. Appl. No. 14/058,195, Final Office Action mailed Mar. 1, 2016.
U.S. Appl. No. 14/726,858 Office Action mailed Feb. 22, 2016.
U.S. Appl. No. 15/077,880, J.D. Margulici, Estimating Time Travel Distributions on Signalized Arterials, filed Mar. 22, 2016.
Canada Patent Application No. 2,688,129 Office Action dated Jan. 18, 2016.
U.S. Appl. No. 14/265,290, Office Action mailed May 31, 2016.
U.S. Appl. No. 15/181,221, Andre Gueziec, GPS Generated Traffic Information, filed Jun. 13, 2016.
U.S. Appl. No. 15/207,377, Andre Gueziec, System for Providing Traffic Data and Driving Efficiency Data.
U.S. Appl. No. 15/077,880, Office Action mailed Jul. 21, 2016.
U.S. Appl. No. 15/218,619, Andre Gueziec, Method for Predicting a Travel Time for a Traffic Route.
Related Publications (1)

Number: 20140139520 A1; Date: May 2014; Country: US

Continuations (1)

Parent: 12398120, Mar. 2009, US
Child: 14100985, US