Aspects of the present disclosure relate to video and geographic information coordination, and more particularly to systems and methods that facilitate the sharing of video information synchronized with geographic information.
Video recording technology has become ubiquitously available, relatively inexpensive, and easy to use. For example, inexpensive high resolution digital cameras with full motion video, often in high definition formats, are readily available. Similarly, smart phone technology now commonly includes a high resolution digital video camera and sufficient memory to store several minutes or even hours of video. Video camera technology may also now be found in helmet-mountable formats. With such readily available technologies, Internet based video sharing platforms, like YouTube™, have also become wildly successful.
Similarly, systems for recording geographic location data, like global positioning system (GPS) data, have also become ubiquitously available, relatively inexpensive, and easy to use. For example, GPS watches are readily available, as are GPS tracking applications on smart phones and other dedicated mobile devices.
Despite these advances and the proliferation of video technologies and geographic location technologies, there is a lack of technologies that bring the two platforms together. It is with these observations in mind, among others, that various aspects of the present disclosure were conceived and developed.
Aspects of the present disclosure involve a method of data synchronization within a browser operating on a computing device. Within the browser, the method involves receiving a video data file and a geographic data file. An image from the video data file is displayed, as is a route corresponding to the geographic data file. A selection on the route is received, such as by a user touching a point on a map or selecting it with a mouse, where the selection identifies a location corresponding to the image from the video data file. The received selection generates a synchronization association between the geographic data file and the video data file.
Aspects of the disclosure may further involve a system of data synchronization occurring at a server, which may be some form of computing device accessible over a network. The server provides access to an application in response to a request. The application is configured to run within a browser at a computing device from which the request was initiated and the application is further configured to receive a video data file and a geographic data file. The application may cause a display of an image from the video data file and a display of a route corresponding to the geographic data file. Finally, the application is configured to process a selection on the route identifying a location corresponding to the image from the video data file where the selection establishes a synchronization link between the geographic data file and the video data file.
Aspects of the disclosure may also involve a system of data synchronization, occurring at a server device that provides access to an application in response to a request. The application may be configured to run within a browser at a computing device from which the request was initiated. Further, the application may be configured to receive a video data file and to extract geographic data embedded within the video data file and to create a geographic data file from the extracted geographic data.
Finally, aspects of the present disclosure may further involve a method of synchronizing video and geographic data that comprises receiving a first wireless message indicating an initiation of recording a first data file at a first recording device, and at a second device, initiating a recording of a second data file at the second device in response to the receipt of the first wireless message.
Aspects of the present disclosure involve systems and methods for synchronizing video and geographic information and for sharing synchronized video and geographic information, such as a video data file synchronized with a geographic data file, with third parties and/or providing the synchronized data to an application or mechanism that may use the data. Aspects of the present disclosure may also involve synchronizing video data and geographic data when that data is recorded separately.
Some particular uses of the systems and methods discussed herein involve recording a video of an activity, such as bicycling, trail running, driving, motocross, cross country skiing, downhill (alpine) skiing, kayaking, and the like, and associating or otherwise synchronizing geographic data with the video file. Generally speaking, the geographic data and the video data are synchronized when there is a link or other relationship, such as a temporal relationship, between a specific portion of the video and the geographic data. The synchronized data may then be used by a variety of possible applications and made available to third parties, among other possible uses. For example, a user may obtain the synchronized data set and play back the video along with a digital map so that a location on the map is displayed for the portion of the video being played. A digital display element on the map, such as a colorful dot or flashing dot, may move along the digital map to track the location being shown in the synchronized video, and the video and digital display element move contemporaneously. With such synchronized data it is possible to display a video along with geographic information related to the portion of video being shown, such as a route along a digital map, elevation information, speed information, and other possible geographic information or information derived from geographic data. Example implementations and uses of the methods and systems discussed herein are primarily presented in the context of sporting activities; however, the systems and methods discussed herein are equally applicable to automobiles, automobile racing, boating, hang-gliding, parasailing, and nearly any activity that involves movement and where video and geographic data synchronization might prove interesting.
To further illustrate aspects of the technology described herein and expand on example uses, a system user may record a video of some or all of a particular bicycle ride. The types of rides that could be recorded are limited only by the user's creativity: the rider may ride and video a stage of a race, video themselves in a race with other riders, record their favorite mountain biking trail, and so on. The user may also record or otherwise obtain geographic data associated with the ride. In one specific example, the user may use some form of global positioning system (GPS) device to record GPS geographic data during the ride while also recording video of the ride. Stated differently, the rider films video, such as with a helmet camera, and at the same time has a GPS device recording GPS data during the ride.
After the ride, the user may access the system at a web browser and provide the video and GPS data to the system. In some cases the data may be pre-synchronized, as will be discussed in more detail below, in which case the user may use the system to locally edit the data and otherwise define various possible meta-data attributes that will become associated with the data. The video and geographic data are separated to form two separate data files, and those files are then uploaded to the system. Synchronization information may be included in either or both files, and may also be included in the meta-data file. In other instances, the video data and the geographic data may be recorded separately. The user accesses the system and identifies the two data sources associated with the same activity (e.g., the user selects the video data file and its corresponding geographic data file, or vice versa). Locally (on the user's device) but using functionality of the system served from a remote location, the user may synchronize the two data sets by associating a specific location in a portion of the video being displayed (e.g., a still or paused image showing a recognizable landmark in the video) with a specific location along a digital route displayed using the geographic data. The system generates synchronization information or creates a synchronization relationship associating the video frame or specific location with the specific map location, and because the video and the geographic information were recorded during the same time (regardless of whether one recording started earlier or ended later), the system then knows the geographic data elements associated with the video frames as the video progresses. The synchronized data is then uploaded to the system, which may be in the form of two separate data files and a meta-data file, and the data may then be accessed and used by the creator (member), by other parties, or by devices.
Generally speaking, a video data file is synchronized with a geographic data file when there is a relationship created between the two files, data sets, or discrete data elements, or the files or data sets are integrated, such that the scenes in the video have a substantially accurate real world relationship to a geographic data element in the geographic data. A video file, whether compressed or otherwise processed, typically resolves to a frame that is momentarily displayed or projected, and through showing a sequence of frames at the proper speed, full motion video is achieved. Various possible video devices may record at different speeds. Geographic data, such as GPS data, often involves a sequence of data elements that identify some particular location on the surface of the Earth. For example, GPS data may include a series of data elements describing latitude and longitude coordinates, and may also include elevation information. The accuracy and resolution of the geographic data is often related to the equipment used to record the data, the equipment and data used to determine or calculate elevation, the methods used to calculate location, the number of GPS satellites visible to the device, the resolution of the GPS device, and other factors. Despite the variability in the video and geographic systems, synchronization can nonetheless be achieved by linking at least one frame or discrete portion of the video to at least one geographic data element. For example, assuming both devices record during the same time over the same path, if the user links a frame of the video accurately with a geographic location, the other portions of the video and the geographic data will naturally align, as they were recorded over the same period and along the same geographic path, even if data was not recorded at the same rate. Thus, synchronization information may involve some indicia that links or otherwise identifies a portion of the video with a geographic data element.
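By way of a non-limiting illustration only, the following sketch (written here in TypeScript, with type and function names that are assumptions rather than any actual implementation of the system) shows how a single user-defined link between one video time and one geographic sample can resolve the geographic data element for any other moment in the video, assuming both recordings carry elapsed-time information.

```typescript
// A single recorded geographic sample; field names are illustrative.
interface GeoSample {
  elapsedSeconds: number; // seconds since the first sample in the file
  latitude: number;
  longitude: number;
  elevation?: number;
  speed?: number;
}

// One synchronization link: a time within the video associated with
// one elapsed time on the geographic track.
interface SyncPoint {
  videoSeconds: number;
  geoElapsedSeconds: number;
}

// Translate any video time into the corresponding elapsed time on the
// geographic track and return the nearest recorded sample, or null when
// the video time falls outside the recorded track.
function geoSampleForVideoTime(
  videoSeconds: number,
  sync: SyncPoint,
  samples: GeoSample[]
): GeoSample | null {
  const offset = sync.geoElapsedSeconds - sync.videoSeconds;
  const target = videoSeconds + offset;
  if (samples.length === 0) return null;
  if (
    target < samples[0].elapsedSeconds ||
    target > samples[samples.length - 1].elapsedSeconds
  ) {
    return null; // this portion of video has no corresponding geographic data
  }
  // Linear scan for the closest sample; a binary search would also work.
  let best = samples[0];
  for (const s of samples) {
    if (
      Math.abs(s.elapsedSeconds - target) <
      Math.abs(best.elapsedSeconds - target)
    ) {
      best = s;
    }
  }
  return best;
}
```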
With such synchronization, a digital map display may identify a route along a map commensurate with the recorded video and may identify a specific point on the map, such as through an illuminated point or flashing icon that moves along the displayed route in accordance with the location of the particular part of the video being shown. A third party or a third party device may thus access the synchronized data and display or otherwise use the synchronized data. So, for example, a third party may preview a ride and get a sense for the difficulty of the ride by watching the video and also viewing the geographic information to understand the climbs, descents and flat sections of the ride. When elevation data is also included with the geographic data, an elevation profile may also be displayed and may also include some form of identifier related to the portion of the video being displayed. When speed data is included with or derived from the geographic data, a speed profile may also be displayed and may include some identifier related to the portion of the video being displayed. In such examples, an identifier is positioned on the elevation profile and/or the speed profile in the location corresponding to the video and map icon. A class instructor may use the video during an indoor cycling (IC) class or other exercise class, and instruct the class to alter the resistance of their IC bicycle, or alter their cadence, based on the geographic data. The data file or files with the synchronized information may also be used by exercise equipment in an automated way. Including real video and actual geographic information linked or otherwise synchronized with the video provides numerous possible creative uses for the data.
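Continuing the illustrative sketch above, a playback display of this kind might be wired up along the following non-limiting lines; the lookup function is the mapping from the previous sketch, and setMarkerPosition is a placeholder for whatever digital map library is used rather than any particular API.

```typescript
// Move a map marker in step with video playback. On each timeupdate
// event the marker is moved to the location recorded for the portion of
// video currently being shown.
function bindVideoToMap(
  video: HTMLVideoElement,
  lookup: (videoSeconds: number) => { latitude: number; longitude: number } | null,
  setMarkerPosition: (latitude: number, longitude: number) => void
): void {
  video.addEventListener("timeupdate", () => {
    const sample = lookup(video.currentTime);
    if (sample) {
      setMarkerPosition(sample.latitude, sample.longitude);
    }
  });
}
```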
Referring to the drawings,
After the video data and geographic data are made accessible to the web-uploader 106, the data may be synchronized so that the images in the video data are aligned with specific geographic data instances of the geographic data file. Stated differently and in one specific example, a particular frame or sequence of frames may be associated with a specific latitude and longitude of GPS data. After synchronizing (in cases where synchronizing is needed), the synchronized data sets are uploaded and stored in a database 114, from which they are accessible by another party operating a computing device 116, such as a personal computer, tablet or smart phone. In some specific implementations, the data files remain stored at the user device while synchronization and other operations occur, and then the data is uploaded. One or both of the video and geographic data files may include synchronization information, stored separately or as a single record. Further, a separate synchronization record, table or other computing structure may be created and stored in the database or elsewhere, which may or may not involve distinct synchronization information being included in either of the data files. In one specific implementation, after the user synchronizes the data sets, three files are uploaded to the server: a video file, a GPS file, and a meta-data file.
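As a non-limiting sketch of this three-file upload from the browser (the endpoint path and form field names are assumptions for illustration, not an actual API of the system):

```typescript
// Upload the synchronized set: video file, GPS file and meta-data file.
async function uploadSynchronizedSet(
  videoFile: File,
  gpsFile: File,
  metadataFile: File
): Promise<Response> {
  const form = new FormData();
  form.append("video", videoFile);
  form.append("gps", gpsFile);
  form.append("meta", metadataFile);
  // In practice a large video file might instead be transferred in
  // discrete chunks; a single multipart request is shown for brevity.
  return fetch("/api/tracks", { method: "POST", body: form });
}
```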
More specifically, and also now referring to the process described with respect to
Referring to
Referring again to
Regardless, when the video file is loaded or otherwise the web-uploader is provided with access to the video file, the web-uploader will display a video player in the video window, and will also display, within the video player, a still image of the first frame or some other frame, of the video file (
The video file may be recorded using any form of digital video recording device. Additionally, the video file may also be in various possible formats and conforming to various possible file formats, compression and encoding standards, and the like. In one specific implementation, the video file is provided in a form accepted by the web browser natively in HTML 5. In such a specific implementation, the video file may be a .mp4 file or a .mov file (or files) using H.264 (MPEG-4 AVC) video compression encoding.
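By way of illustration only, a browser-based uploader might preview such a natively supported file along the following lines; the element identifiers are assumptions used for this sketch.

```typescript
// Preview a locally selected video file in an HTML5 <video> element.
const fileInput = document.getElementById("video-file") as HTMLInputElement;
const player = document.getElementById("video-player") as HTMLVideoElement;

fileInput.addEventListener("change", () => {
  const file = fileInput.files?.[0];
  if (!file) return;
  // H.264 video in an .mp4/.mov container is decoded natively by the
  // browser, so a still frame can be shown without uploading anything.
  player.src = URL.createObjectURL(file);
  player.addEventListener(
    "loadeddata",
    () => {
      // Without autoplay, the paused element displays the first frame
      // (or any frame the user seeks to) as a still image.
      player.currentTime = 0;
    },
    { once: true }
  );
});
```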
The video may be recorded by a standalone video recorder. Alternatively, the video may be recorded using a smart phone with a high definition video camera, such as an iPhone™, which may also automatically apply H.264 compression and provide the video as a .mov file. Of course, like the other specific technological platforms discussed herein, the use of an iPhone™ and the particular compression standards are not meant to imply that the technology discussed herein is limited to those platforms, but rather are discussed simply to provide some specific illustrations. Additionally, the video recorder may only record video data, or it may also be capable of recording geographic data associated with the video recording. In one example, the video recorder may be capable of recording GPS data with the video. One such device with video and GPS functionality is the ContourGPS™. In the event the recorded video is on conventional film, the video should first be digitized into an acceptable format before providing the video file to the system.
Referring again to
In these examples, the video data and the geographic data are recorded during the same event. So, for example, a bicycle rider would record a video during a ride while at the same time recording GPS data. In such a situation, the video data has an inherent synchronization relationship with the geographic data because the two files are being recorded at the same time and along the same route. Thus, there may be a series of discrete video frames recorded over a period of time that correspond with a series of discrete geographic data elements recorded over the same period of time and along the same route as the video. Actually synchronizing the data files, as discussed herein, therefore may involve defining or otherwise including at least one synchronizing point or link between the files, which may be identified in either or both files or in a separate data structure or file, so that a specific video feature is aligned with a specific geographic data feature, such as by creating a link between a specific video frame and a particular GPS coordinate. Knowing the time or some other temporal aspect of the files, that link can then be used to align the series of video frames and the series of geographic data elements. However, also as discussed herein, should the device be capable of recording both video and geographic data, it is possible to obtain one file with both video data and geographic data that are synchronized by the device while recording. For example, the device may embed GPS data within the video file as both are being recorded. In another example, the two files may be started and stopped at the same time under some form of control, and by knowing the relationship between the two files, they are synchronized temporally.
Referring now to
Alternatively, when the video data does not include geographic data, the web-uploader application prompts the user to load a geographic data file (operation 404). Referring again to
In one specific example, the geographic data device is a GPS device, which may be referred to as a GPS receiver, that records GPS data in either GPX format or NMEA 0183 form. The device may be a dedicated device or one that includes GPS functionality, such as a smart phone or the like. The geographic data obtained and stored by the device in a geographic data file includes a series of data fields, where each field may include a latitude coordinate, a longitude coordinate, elevation information, a calculated speed number, and possibly other information measured, derived, or calculated. Each data field may be recorded at the rate at which the GPS receiver calculates the information for the data field, which may be every one (1) second, every two (2) seconds, or at some other frequency depending on the device. The GPS device records the information in a geographic data file, which may be in comma separated form or some other format.
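A minimal, non-limiting parsing sketch for a comma separated geographic data file of this general kind follows; the column order and the absence of a header row are assumptions, and GPX or NMEA 0183 input would instead be handled by a format-specific parser.

```typescript
// Matches the GeoSample shape used in the earlier alignment sketch.
interface GeoSample {
  elapsedSeconds: number;
  latitude: number;
  longitude: number;
  elevation?: number;
  speed?: number;
}

// Parse lines of the form: elapsedSeconds,latitude,longitude,elevation,speed
function parseGeoFile(text: string): GeoSample[] {
  return text
    .split(/\r?\n/)
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => {
      const [elapsed, lat, lon, ele, speed] = line.split(",").map(Number);
      return {
        elapsedSeconds: elapsed,
        latitude: lat,
        longitude: lon,
        elevation: ele,
        speed,
      };
    });
}
```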
In
As discussed in more detail herein, the geographic data file may also include video synchronization information. In one example, the synchronization information may be an index to a position of the video, such as an index to a frame of the video, to which a particular data field of the geographic data file pertains. In a specific example, the first data field of the geographic data file may be indexed to a particular frame of a synchronized video file, which may be the first frame of the video file. As discussed herein, however, there is no requirement that synchronization information be limited to the first geographic data field and/or the first video frame. Similarly, synchronization information may be stored in the video file and refer to a particular data field of the geographic data file. Moreover, synchronization information, as well as possibly other information related to a video file that is synchronized with a geographic data file, may be stored in a separate file or database structure, or may be reflected in the nature of the data storage itself.
In the example of the
Nonetheless, returning to
As shown in
Regardless of whether the geographic data is embedded in the video file or loaded separately, the web-uploader allows the user to synchronize the start of the geographic data with the beginning of the video file and to also identify the end of the geographic data. In the example shown in
The video file is associated with a time parameter (or parameters) between the start of the video and the end of the video. Similarly, the GPS data is also associated with a time parameter. Both the video file time parameter and the GPS data may be normalized to real time or some other common time parameter. Regardless, because the rider is moving and recording video and geographic data, the geographic data may be automatically synchronized with the video data once a video frame or set of frames is synchronized to a geographic location. Hence, it is sufficient to define or otherwise include one synchronization location. When the first video frame is synchronized with the geographic data (as shown in
Alternatively, all of the recorded GPS data is transferred to the database along with information to synchronize the GPS data to the video data. In such a situation, a third party or device may receive only the GPS data associated with a video file, but the entire GPS data set would be stored. Accordingly, if the user that originally transferred the video and geographic data to the database made an error in synchronization, or desired to further edit the video and resynchronize the geographic data, that raw original data would still be available even if the user had deleted the original file on the user device 102 or connected device 110. In such an instance, the user would access the web-uploader but would select a file from the remote database rather than from local storage.
If some frame besides the start frame is associated with the start indicator, the web-uploader may automatically synchronize the start of the video data and the start of the GPS data. In such a case, the user initially sets the flag at a point on the map, and then, once the start of the route is identified, the flag will be displayed at the start of the route. The system is able to automatically determine and set the start synchronization regardless of whether the start of the GPS data precedes the start of the video file or whether the start of the GPS data follows the start of the video file. Generally, the system is able to determine the starting map location by knowing the elapsed time from the first map location to the selected map location and comparing it to the time of the video frame shown in the window 302. To illustrate the operation, assume, for example, that the selected start latitude and longitude location for the synchronization occurs at a time equal to 5 minutes, meaning that the elapsed time between the first GPS data input and the selected GPS point is 5 minutes. In one example scenario, the video frame being synchronized is at a greater time stamp than the GPS file, say 6 minutes. In another example scenario, the video frame being synchronized is at a lesser time stamp than the GPS file, say 4 minutes. In the 6 minute example, the start of the GPS file is the first GPS data record, and the video is begun at the frame 5 minutes before the synchronization frame, or the frame at a time stamp of 1 minute. In this scenario, the first one minute of video does not correspond to any GPS data and is cropped or otherwise deleted. Alternatively, a start indicator is included in the video file at that location and the preceding video is not cropped. Alternatively, the start time is saved or otherwise recorded in a separate synchronization record or in meta-data included with one or both files or as a separate file. In yet another example, it is also possible to store a time offset in either or both files, or in the meta-data file. In the 4 minute example, the start of the video file is the first frame of video and the start of the synchronized GPS file is at the GPS coordinates 4 minutes earlier than the synchronization GPS record (the GPS record at a time stamp of 1 minute). Here, the first minute of the GPS data does not correspond to any video, so that first minute is cropped or otherwise deleted. Alternatively, a start indicator is included in the GPS file and the GPS data preceding the start is not deleted. Alternatively, the start time is saved or otherwise recorded in a separate synchronization record or meta-data included with one or both files. In either case, the two files are synchronized by having time information or some other common temporal information for the discrete data elements (GPS coordinates or video frames) of the respective video and GPS files. In the first example, the video file is cropped or marked to synchronize to the start of the GPS file, whereas in the second example the GPS file is cropped or marked to synchronize to the start of the video file.
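The start-alignment decision described above may be sketched, purely for illustration, as follows; the names are assumptions and the offsets are expressed in seconds.

```typescript
// One user-selected synchronization pair, as in the earlier sketch.
interface SyncPoint {
  videoSeconds: number;      // time of the synchronized frame within the video
  geoElapsedSeconds: number; // elapsed time of the selected GPS point
}

// Offsets describing how the starts of the two recordings relate. A
// positive videoStartSeconds means that much leading video has no GPS
// data (and is cropped or merely marked); a positive geoStartSeconds
// means that much leading GPS data has no video.
interface StartAlignment {
  videoStartSeconds: number;
  geoStartSeconds: number;
}

function alignStarts(sync: SyncPoint): StartAlignment {
  const diff = sync.videoSeconds - sync.geoElapsedSeconds;
  if (diff > 0) {
    // As in the text: frame at 6 min vs. GPS point at 5 min elapsed,
    // so the first minute of video precedes any GPS data.
    return { videoStartSeconds: diff, geoStartSeconds: 0 };
  }
  // As in the text: frame at 4 min vs. GPS point at 5 min elapsed,
  // so the first minute of GPS data precedes the video.
  return { videoStartSeconds: 0, geoStartSeconds: -diff };
}
```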
In the instance where a location on the map is selected to match the first frame of the video file, the GPS file is cropped to remove the GPS data elements preceding the start indicator. Alternatively, the GPS file is not actually cropped but is instead marked, a separate file or meta-data is created that identifies the starting GPS data element, or a time offset is defined. For example, if 10 minutes of GPS data precede the GPS data point associated with the start indicator, then that 10 minutes of data is removed from the GPS file and not included with the synchronization files. In such a case, the files are synchronized at the beginning of the respective files. Alternatively, the actual starting GPS data element is marked or otherwise identified. Similarly, the GPS file is either marked or cropped to identify or remove any GPS data elements that exceed the length of the video. So, for example, if the video has a total length of 10 minutes, and the GPS file has 15 minutes of GPS records after the GPS data point at the start indicator, then the GPS data records from 10 to 15 minutes are cropped, deleted or otherwise removed from the GPS file, or the ending data element is marked or otherwise identified and the data elements after the last element are not made available or displayed to third parties.
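A corresponding non-limiting sketch of cropping (or, equivalently, marking) the geographic data to the span of the video, using the start offset described above, might read as follows; the function name is illustrative.

```typescript
// Keep only the samples that fall within the span of the video: samples
// before the start indicator and samples beyond the video's total length
// are dropped. geoStartSeconds is the elapsed GPS time aligned with the
// first retained video frame.
function cropGeoData<T extends { elapsedSeconds: number }>(
  samples: T[],
  geoStartSeconds: number,
  videoDurationSeconds: number
): T[] {
  const geoEndSeconds = geoStartSeconds + videoDurationSeconds;
  return samples.filter(
    (s) =>
      s.elapsedSeconds >= geoStartSeconds && s.elapsedSeconds <= geoEndSeconds
  );
}
```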
When the synchronization operation is complete (in cases where synchronization is involved), an index is generated that maps the geographic data file to the video file to synchronize the two. In one specific example, after the video or GPS file is physically or virtually cropped (when the files differ in length), the starting GPS data element is indexed to the first frame of the video file. The data files, with any associated indexing information, are then stored at a remote data location, such as the database 114, by the web-uploader, and are thereby available for access over the network or for other purposes.
More specifically, when the synchronization operations are complete and the user decides to upload the actual files to the database, a communication with the server 108, using an API, initiates the collection of the appropriate files. In one specific example, the server initiates the transfer of the video file, which may be transferred in discrete chunks, the associated and synchronized GPS file, and a meta-data file. The meta-data file may include a name, a length, a vehicle description (bicycle, car, etc.), and a file description, among other attributes. In some instances the meta-data file may include linking data. Referring again to
In one specific example, meta-data may be provided in a .kino file (or other file extension) that is XML formatted. The XML fields may include the title of the track (as entered by the user), the description of the type of track (a text description entered by the user that summarizes the course), the type of vehicle or other mechanism from which the video was taken (car, 4×4, cycle, etc.), and the video format of the video (e.g., mp3, mp4, avi, wmv, 3gp, etc.). The XML fields may further include:
An email is sent to the contributor when this process is complete and the synchronized data sets are ready for sharing and for use in apps or the like. While a database is shown and discussed, the term database in the context of this application can include various possible cloud-based storage systems and may be provided in a discrete location, such as a data center, or may be stored or otherwise distributed across one or more physical or virtual locations. Similarly, while a single server is shown in
The stored data may be organized with a first set of data that contains one line in the database for each video and a second set of data that contains one line in the database for each geographic data set. The stored data may also be organized to include a third set of data concerning the creator of the video and/or geographic data files, and other meta-data. The data fields for each data set type (video, geographic coordinate, and member) may be illustrated or otherwise discussed with reference to three associated database tables. Each table references various possible data fields for each data set, and may also reference meta information (meta-data) as well as synchronization information that creates a relationship between the data sets.
More specifically,
More specifically, referring first to the member table, a primary key in the form of a member identification (mem_id) is shown as the first entry in the member table. The same data is included in the track table (trk_mem_id), which provides a data linking element between the two tables (and data) by establishing a relationship between the two sets of data. Specifically, the trk_mem_id may be a foreign key, based on the mem_id, that establishes a data link between the member table and the track table.
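Expressed as illustrative record types (a sketch only; apart from the mem_id and trk_mem_id keys named above, the field spellings are assumptions, and the full tables carry many more fields, as discussed below):

```typescript
// One row of the member table. mem_id is the primary key.
interface MemberRow {
  mem_id: number;
  mem_username: string;
  mem_password: string;
  mem_mail: string;
  mem_name: string;
}

// One row of the track table, representing one uploaded, synchronized
// video. trk_mem_id is a foreign key based on mem_id, linking the track
// back to the member who created or uploaded it.
interface TrackRow {
  trk_id: number;
  trk_mem_id: number; // references MemberRow.mem_id
  trk_name: string;
}
```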
The member table includes several data fields related specifically to the creator or uploader of the video and geographic data. In this specific case, a member has an account or is otherwise a registered user of the system, although the system does not require member accounts in order to function for the purpose of uploading synchronized video and geographic data. Specifically, the member table may include a username entry, a password entry, a mail (email) entry, a name entry, a Facebook (or other social media) entry (Facebook user information for the member), and other information related to the member. The member table may also include information and synchronization information associated with the video and geographic data created or otherwise uploaded by the member. For example, the member table may include a duration_total data element associated with the length of the video and a length_total data element associated with the length of the GPS coordinate file, where the three sets of data are linked through keys and are associated with a particular video and a particular set of geographic data created by a specific member. Of course, a member may upload any number of synchronized data sets, in which case common member data elements from the member table will be used and unique data elements associated with a specific video and set of coordinate data will also be included in the database entries.
The track table includes numerous data fields associated with a video file. In one example, each video file and each line in the track table represents one specific uploaded and synchronized video in the database. More specifically, the id is a unique identifier of the video. Note that the track table data entries include the prefix "trk," which is left off in this discussion for convenience. The name field is the name chosen or entered by the member. For example, referring to
The ne_lat, ne_long, sw_lat, and sw_long data fields contain the latitude and longitude coordinates of a bounding rectangle that defines the initial map view for the video. The coo_id_start and the coo_id_end are identifiers of the first and last GPS coordinates associated with the video. These values may be particular instances of synchronization information. These values may be updated and/or defined when the user sets the start and/or end locations on the map. The first and last GPS coordinates may be found in the coordinates table.
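These bounding coordinates may be derived directly from the synchronized GPS coordinates, for example along the following non-limiting lines (assuming at least one coordinate is present):

```typescript
// Derive the north-east and south-west corners of the rectangle that
// bounds all GPS coordinates for a video, for use as the initial map view.
function boundingBox(
  points: { latitude: number; longitude: number }[]
): { ne_lat: number; ne_long: number; sw_lat: number; sw_long: number } {
  let ne_lat = -90, ne_long = -180, sw_lat = 90, sw_long = 180;
  for (const p of points) {
    ne_lat = Math.max(ne_lat, p.latitude);
    ne_long = Math.max(ne_long, p.longitude);
    sw_lat = Math.min(sw_lat, p.latitude);
    sw_long = Math.min(sw_long, p.longitude);
  }
  return { ne_lat, ne_long, sw_lat, sw_long };
}
```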
Still referring to the track table, the table also includes creation date information, date of uploading information, rating information (e.g., 1-5 stars), average rating information, identification information associated with those rating the video, the date and time of the start of the video recording, time zone information for the location where the video was recorded, a length (e.g., in meters) of the video, which may be derived from the geographic data, and the geographic addresses for the start and end locations of the video. The addresses may be derived from a reverse geolocating service based on the GPS coordinates for the start and end of the video, or manually entered by the member.
The table may further include a polyline field providing a mechanism (equation) by which a course, associated with the GPS coordinates for the video file, may be rendered on a digital map. The polyline information may further be supplemented with polyline weighting information and resolution information. The table may include numerous other types of data associated with the video including:
Other fields shown in the table have meanings that may be understood from the data names. Further, it will be apparent that some data elements of each table are defined or obtained from the originally uploaded data. Other fields are defined or modified afterward. For example, the trk_view field is tracked by an application running on the server that increments the data value each time a user downloads or views the video.
The coordinates table includes a number of data entries including:
Turning more specifically to the GPS device, in this example, the GPS device is a smart phone or similar device with a GPS chip set and associated GPS functionality. Accordingly, the device is able to generate and store a GPS data file. The GPS device in this example includes an application, which may be an “app,” that when running is configured to receive or otherwise detect a synchronization signal from the video recording device. Referring to
Referring again now to
A sequence of XML data for a given coordinate file may be transmitted to the user device. So, for example, if there are 100 GPS coordinates in the file, then 100 sets of XML coordinate data may be transmitted. The coordinate data in the XML stream includes latitude and longitude information, altitude or elevation information, speed information (the speed of travel when the latitude and longitude information was received), position information, time information (the relative time since the first geographic element), and the accumulated length.
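Purely as an illustration of serializing one such coordinate element for transmission (the element and attribute names below are assumptions and do not reproduce the actual stream format):

```typescript
// One coordinate record as described above; one such XML element would
// be emitted per recorded GPS sample in the file.
interface CoordinateRecord {
  latitude: number;
  longitude: number;
  elevation: number;
  speed: number;               // speed of travel at this sample
  position: number;            // ordinal position within the file
  relativeTimeSeconds: number; // time since the first geographic element
  accumulatedLengthMeters: number;
}

function coordinateToXml(c: CoordinateRecord): string {
  return (
    `<coordinate position="${c.position}">` +
    `<lat>${c.latitude}</lat><lng>${c.longitude}</lng>` +
    `<ele>${c.elevation}</ele><speed>${c.speed}</speed>` +
    `<time>${c.relativeTimeSeconds}</time>` +
    `<length>${c.accumulatedLengthMeters}</length>` +
    `</coordinate>`
  );
}
```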
The XML data for the track data may be provided as follows:
The coordinate data and track data that is sent to the user or device is temporally aligned, so additional synchronization information is not necessary, but such information may nonetheless be sent.
Referring to
The I/O section 804 is connected to one or more user-interface devices (e.g., a keyboard 816 and a display unit 818), a disc storage unit 812, and a disc drive unit 820. In the case of a tablet or smart phone device, there may be no physical keyboard but rather a touch screen with a computer generated touch screen keyboard. Generally, the disc drive unit 820 is a DVD/CD-ROM drive unit capable of reading the DVD/CD-ROM medium 810, which typically contains programs and data 822. Computer program products containing mechanisms to effectuate the systems and methods in accordance with the presently described technology may reside in the memory section 804, on a disc storage unit 812, on the DVD/CD-ROM medium 810 of the computer system 800, or on external storage devices made available via a cloud computing architecture, with such computer program products including one or more database management products, web server products, application server products, and/or other additional software components. Alternatively, a disc drive unit 820 may be replaced or supplemented by other storage medium drive unit types. Similarly, the disc drive unit may be replaced or supplemented with a random access memory (RAM) and/or various possible forms of semiconductor based memories commonly found in smart phones and tablets. The network adapter 824 is capable of connecting the computer system 800 to a network via the network link 814, through which the computer system can receive instructions and data. Examples of such systems include personal computers, Intel or PowerPC-based computing systems, AMD-based computing systems, and other systems running a Windows-based, UNIX-based, iOS, or other operating system. It should be understood that computing systems may also embody devices such as personal digital assistants (PDAs), mobile phones, smart phones, tablets or slates, multimedia consoles, gaming consoles, set top boxes, etc.
When used in a LAN-networking environment, the computer system 800 is connected (by wired connection or wirelessly) to a local network through the network interface or adapter 824, which is one type of communications device. When used in a WAN-networking environment, the computer system 800 typically includes a modem, a network adapter, or any other type of communications device for establishing communications over the wide area network. In a networked environment, program modules depicted relative to the computer system 800, or portions thereof, may be stored in a remote memory storage device. It is appreciated that the network connections shown are examples, and that other means of and communications devices for establishing a communications link between the computers may be used.
Some or all of the operations described herein may be performed by the processor 802. Further, local computing systems, remote data sources and/or services, and other associated logic represent firmware, hardware, and/or software configured to control operations of the web-uploader 106, the user devices 102, 116, or the other devices 112 and 110. Such services may be implemented using a general purpose computer and specialized software (such as a server executing service software), a special purpose computing system and specialized software (such as a mobile device or network appliance executing service software), or other computing configurations. In addition, one or more functionalities disclosed herein may be generated by the processor 802 and a user may interact with a Graphical User Interface (GUI) using one or more user-interface devices (e.g., the keyboard 816, the display unit 818, and the user devices 804) with some of the data in use directly coming from online sources and data stores. The system set forth in
In the present disclosure, the methods disclosed may be implemented as sets of instructions or software readable by a device. Further, it is understood that the specific order or hierarchy of steps in the methods disclosed are instances of example approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the method can be rearranged while remaining within the disclosed subject matter. The accompanying method claims present elements of the various steps in a sample order, and are not necessarily meant to be limited to the specific order or hierarchy presented.
The described disclosure may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computer system (or other electronic devices) to perform a process according to the present disclosure. A machine-readable medium includes any mechanism for storing information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The machine-readable medium may include, but is not limited to, magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic instructions.
The description above includes example systems, methods, techniques, instruction sequences, and/or computer program products that embody techniques of the present disclosure. However, it is understood that the described disclosure may be practiced without these specific details.
It is believed that the present disclosure and many of its attendant advantages will be understood by the foregoing description, and it will be apparent that various changes may be made in the form, construction and arrangement of the components without departing from the disclosed subject matter or without sacrificing all of its material advantages. The form described is merely explanatory, and it is the intention of the following claims to encompass and include such changes.
While the present disclosure has been described with reference to various embodiments, it will be understood that these embodiments are illustrative and that the scope of the disclosure is not limited to them. Many variations, modifications, additions, and improvements are possible. More generally, embodiments in accordance with the present disclosure have been described in the context of particular implementations. Functionality may be separated or combined in blocks differently in various embodiments of the disclosure or described with different terminology. These and other variations, modifications, additions, and improvements may fall within the scope of the disclosure as defined in the claims that follow.
The present application is a non-provisional utility application claiming priority under 35 U.S.C. §119 to co-pending provisional application No. 61/819,265 titled “VIDEO DATA AND GEOGRAPHIC DATA SYNCHRONIZATION AND SHARING,” filed on May 3, 2013, which is hereby incorporated by reference herein for all purposes.