The present disclosure generally relates to an apparatus, system, and method for training, and more particularly to an apparatus, system, and method for firearms training.
Conventional scenario-based firearms training typically involves a high level of risk-taking and/or may be costly in terms of time and resources. Also, conventional scenario-based firearms training typically fails to accomplish some or all training goals.
For example, virtual reality training usually involves a “video game” approach that lacks realism, relying on features such as fake weapons and wearable electronics. Approaches using simulated guns and cinematic apparatuses utilize video-based scenarios that lack both real shooting and interaction with other shooters. Such approaches also involve considerable cost.
Conventional tactical shooting sequences attempt to exhaust a student physically, e.g., by taxing the student's cognitive system through elaborate confusion-inducing exercises or by enhancing an environment with atmospherics (e.g., stimuli that may be taxing to the student). Such intense approaches usually raise safety issues as training becomes more realistic. Also, such approaches typically lack actual interaction with other shooters. Conventional reflective target set-ups do not offer accurate shot tracking and placement analysis, and therefore lack meaningful measures for evaluating student performance.
The exemplary disclosed system and method are directed to overcoming one or more of the shortcomings set forth above and/or other deficiencies in existing technology.
In one exemplary aspect, the present disclosure is directed to a system. The system includes a target assembly having a reflective surface, a plurality of acoustic sensors disposed at the target assembly, and at least one optical sensor. The target assembly includes sound-absorbing material. The at least one optical sensor is configured to image a plurality of users of the system.
In another aspect, the present disclosure is directed to a method. The method includes using a reflective surface of a target assembly to reflect an image of a first user at a first location to be visible on the reflective surface to a second user at a second location, acoustically sensing a bullet impact location point on the target assembly of a bullet fired from a firearm of the second user, and optically sensing a location of the first user and the second user.
System 300 may include an acoustic sensor array 305, an optical sensor array 310, a target assembly 315, and an analysis system 320. Data collected by acoustic sensor array 305 and optical sensor array 310 may be utilized by analysis system 320 to determine an impact of a user firearm on target assembly 315. Acoustic sensor array 305, optical sensor array 310, and target assembly 315 may be installed in a training range 325.
Acoustic sensor array 305 may include one or more impact sensors 330 and one or more trigger sensors 335. As described for example below, one or more impact sensors 330 may be disposed around target assembly 315, and one or more trigger sensors 335 may be disposed in an area of training range 325.
One or more impact sensors 330 may be any suitable sensor for locating a position of an impact of a projectile on target assembly 315. For example, impact sensor 330 may be any suitable sensor that may be disposed near, around, and/or within target assembly 315 and that may detect the location at which a projectile such as a bullet fired by a firearm may strike target assembly 315. One or more impact sensors 330 may be any suitable acoustic triangulation device. One or more impact sensors 330 may form a microphone array that may be used to triangulate a location of an impact of a projectile on target assembly 315. A plurality of impact sensors 330 (e.g., six or any other desired number or configuration of impact sensors 330) may be disposed around target assembly 315 as described for example below. As described below, target assembly 315 may be formed in such a way as to create a characteristic noise when a projectile such as a bullet strikes target assembly 315, which may be detected by one or more impact sensors 330. For example, one or more impact sensors 330 may be one or more microphones mounted around target assembly 315 as illustrated for example in
Any suitable firearm for training may be used with system 300. For example, a user may use his or her service weapon (e.g., service firearm) or personal weapon with live ammunition. Weapons such as pistols, rifles, and/or any other desired firearm having any suitable caliber and/or other desired features may be used. Weapons using bullets of any suitable dimension and characteristics may be used with system 300.
One or more trigger sensors 335 may be any suitable sensors (e.g., acoustic sensors) for determining a time at which a firearm is fired (e.g., shot start of a projectile such as a bullet) and/or a time of arrival (“TOA”) of a projectile impacting target assembly 315. For example, one or more trigger sensors 335 may open a “window of measurement” for acoustic sensor array 305 to sense target assembly 315, which may permit data recording by acoustic sensor array 305 at the desired time frame (e.g., as the exemplary method may be focused on a narrow noise frequency to discern the recorded data from blast noise saturation and impact noise on target assembly). Also for example, TOA may be relevant to an exemplary triangulation method utilized by system 300 (e.g., the TOA for the impact noise to acoustic sensor array 305 may allow for triangulation positioning). It is also contemplated that either or both of sensors 330 and/or 335 may determine a location of impact of a projectile on target assembly 315, a shot start, and/or a time of arrival. One or more trigger sensors 335 may for example be any suitable microphone sensors for determining a shot start time of a projectile. As illustrated in
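Purely as a non-limiting illustration of the exemplary trigger logic described above (and not as a required implementation), the following Python sketch shows one way a trigger signal could open a fixed-length window of measurement; the threshold, sampling rate, and synthetic signal are hypothetical values chosen for the example.

```python
import numpy as np

def find_measurement_window(trigger_signal, sample_rate_hz, threshold, window_s=0.02):
    """Return (start, end) sample indices of the window opened when the
    trigger microphone first exceeds the shot-detection threshold."""
    above = np.flatnonzero(np.abs(trigger_signal) >= threshold)
    if above.size == 0:
        return None  # no shot detected within this buffer
    start = int(above[0])
    end = min(start + int(window_s * sample_rate_hz), trigger_signal.size)
    return start, end

# Hypothetical usage with a synthetic muzzle-blast spike:
fs = 150_000                  # assumed 150 kHz sampling rate
signal = np.zeros(fs)
signal[30_000:30_010] = 1.0   # simulated shot-start transient
print(find_measurement_window(signal, fs, threshold=0.5))  # (30000, 33000)
```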
Optical sensor array 310 may include one or more optical sensors 340. One or more optical sensors 340 may be any suitable sensors for determining a location of shot origination of a projectile such as a bullet and a location of a target (e.g., a user acting as a target and/or a target reflection) as described below. For example, one or more optical sensors 340 may be any suitable sensor for imaging and/or determining a position of a barrel of a weapon (e.g., firearm) used by a user and/or an actual target (e.g., role-playing user) or reflected target of a user as described for example herein. In at least some exemplary embodiments, optical sensor 340 may be a stereoscopic camera and/or any other suitable device for stereo photography, stereo videography, and/or stereoscopic vision. Optical sensor 340 may be a three-dimensional video sensor. One or more optical sensors 340 may include a plurality of cameras or a single camera configured to collect three-dimensional image data. One or more optical sensors 340 may be disposed in training range 325 as described for example below. In an exemplary embodiment and as illustrated in
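As a non-limiting sketch of how a stereoscopic optical sensor 340 could recover a three-dimensional position (e.g., of a barrel of a firearm or of a user), the following Python example applies the standard rectified pinhole stereo relationship; the focal length, baseline, and pixel coordinates are illustrative assumptions rather than parameters of the exemplary disclosed system.

```python
import numpy as np

def stereo_to_3d(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Recover a 3-D point from a rectified stereo pair using the
    standard pinhole relationship Z = f * B / disparity."""
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal_px * baseline_m / disparity
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return np.array([x, y, z])

# Hypothetical values: 1200 px focal length, 12 cm baseline,
# principal point at (640, 360)
print(stereo_to_3d(700.0, 676.0, 300.0, 1200.0, 0.12, 640.0, 360.0))
# -> [ 0.3 -0.3  6. ]  (metres in the camera frame)
```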
It is also contemplated that any of sensors 330, 335, and/or 340 may be any desired type of sensor such as, for example, an acoustic sensor, an optical sensor, a thermal sensor, a pressure sensor, an infrared sensor, an ultrasonic sensor, an accelerometer, and/or any other suitable sensor for collecting the exemplary data described herein.
Target assembly 315 may be any suitable assembly for reflecting a target image as described for example herein and/or for creating a characteristic noise from an impact of a projectile that can be located (e.g., by triangulation) based on an operation of acoustic sensor array 305. As illustrated in
Reflective layer 345 may be any suitable material for reflecting an image. For example, reflective layer 345 may be a reflective film formed from any suitable reflective material. For example, reflective layer 345 may be a polyester film such as a metalized polyester film. For example, reflective layer 345 may be any suitable metalized plastic film. Also for example, reflective layer 345 may be any film having a suitable thickness and/or tear mitigation coating properties for use on target assembly 315.
Acoustic layer 350 may be any suitable layer for creating a characteristic noise that may be utilized by acoustic sensor array 305 to determine a location of impact of a projectile such as a bullet striking target assembly 315. Acoustic layer 350 may be for example an acoustic screen including a plurality of layers. For example, acoustic layer 350 may include an absorbing layer 355 and a plate member 360. Absorbing layer 355 may be any suitable material for substantially preventing an echoing of sound caused by an impact of a projectile on target assembly 315. For example, absorbing layer 355 may include any suitable sound-absorbing material such as porous material (e.g., foam material), textile material, acoustic composite material, and/or any other suitable material for absorbing sound. For example, absorbing layer 355 may include mineral wool material such as stone wool, slag wool, glass wool, and/or ceramic fiber. For example, absorbing layer 355 may be Rockwool™ insulation. Absorbing layer 355 may include any suitable material for isolating impact noise of a projectile striking plate member 360 from a shot noise of a firearm firing that projectile and/or a blast (e.g., subsonic, transonic, and/or supersonic blast depending for example on a caliber and velocity of the projectile) associated with the projectile. Absorbing layer 355 may be an acoustic insulation layer (e.g., sound insulation layer). Plate member 360 may be any suitable structural material for use in an acoustic layer such as, for example, plastic material, composite material, and/or any other suitable material for supporting sound-absorbing material. For example, plate member 360 may be an impact plate. In at least some exemplary embodiments, reflective layer 345 and acoustic layer 350 may be formed from transparent and/or translucent materials. Target assembly 315 may be of any suitable dimensions and/or configuration for use in a live-fire training exercise using firearms. For example, target assembly 315 may be between about 2 meters and about 5 meters wide and between about 1 meter and about 3 meters in height (e.g., about 3 meters wide by about 2 meters in height), and/or any other desired width and height. Target assembly 315 may for example be disposed at between about 2 meters and about 20 meters from an entrance to training range 325 (e.g., between about 4 meters and about 15 meters inside training range 325). Reflective layer 345, absorbing layer 355, and plate member 360 may be integrally formed as a single unit. Alternatively for example, reflective layer 345, absorbing layer 355, and plate member 360 may be modular components that may be assembled together to form target assembly 315 by a user. Also for example, target assembly 315 may be formed as a cassette-type unit including two components (e.g., two of reflective layer 345, absorbing layer 355, and plate member 360) that are integrally formed with the third component attachable (e.g., insertable) by a user.
Returning to
In at least some exemplary embodiments, analysis system 320 may include components 365 and/or 370. For example, component 365 may include hardware for controlling an operation of acoustic sensor array 305 and/or for synchronizing an operation of acoustic sensor array 305 and optical sensor array 310. Component 365 may include hardware that may provide for shooting detection, microphone signal formatting of acoustic sensor array 305, camera synchronization of optical sensor array 310 with other components of system 300, and/or microphone signal acquisition and storage for acoustic sensor array 305. Component 370 may include a computing device and software that may be similar to the exemplary computing device and software described below regarding
The components of system 300 may be directly connected (e.g., by wire, cable, USB connection, and/or any other suitable electro-mechanical connection) to each other and/or connected via a network (e.g., via Ethernet LAN) that may be similar to the exemplary network disclosed below regarding
For example, system 300 may include any suitable transceiver devices (e.g., transmitter device and/or receiver device) for transmitting data between components of system 300 and also for receiving data from other components of system 300. System 300 may also include a plurality of computing devices, a plurality of exemplary user interfaces, and/or a plurality of any other components of system 300 that may be in direct communication and/or connected via network. For example, components of system 300 may receive and transmit data as disclosed below regarding exemplary communication techniques of
Analysis system 320 may include any suitable user interface for receiving input and/or providing output to a user. For example, one or more user interfaces of analysis system 320 may include a touchscreen device (e.g., of a smartphone, a tablet, a smartboard, and/or any suitable computer device), a computer keyboard and monitor (e.g., desktop or laptop), an audio-based device for entering input and/or receiving output via sound, a tactile-based device for entering input and receiving output based on touch or feel, a dedicated user interface designed to work specifically with other components of system 300, and/or any other suitable user interface (e.g., including components and/or configured to work with components described below regarding
As illustrated in
Training range 325 may include a barrier 390 that may be disposed in shooting room 380. Barrier 390 may serve as a safety barrier between occupants of training range 325 during live-fire exercises. For example, barrier 390 may be formed from any suitable structural material including wood, plastic, concrete, and/or any other suitable material. Barrier 390 may be a movable, standing barrier that may be moved within shooting room 380 (e.g., or other locations of training range 325) as desired. Barrier 390 may be bulletproof to prevent projectiles such as bullets fired from firearms of users of system 300 from passing through barrier 390 (e.g., projectiles such as bullets fired from firearms may instead become lodged within barrier 390). Some or all walls, ceilings, and/or floors of shooting room 380 (e.g., including target wall 375) may be formed from similar materials as barrier 390. Some or substantially all components of training range 325 may be modular components that may be easily and/or quickly installed and uninstalled in a desired location (e.g., in a warehouse, training facility building, and/or built from scratch in an open area such as a field). For example, target wall 375, target assembly 315, and/or other structural components of training range 325 may be modular components having attachment devices configured to removably attach to each other in a modular fashion. For example, some or all components of training range 325 may be removably attachable to each other to facilitate quick and easy modular installation and uninstallation at multiple locations as desired. For example, when uninstalled, some or all components of training range 325 may be compactable (e.g., foldable) so that they may be loaded onto vehicles and moved between installation locations (e.g., hard-site or other permanent locations, tactical locations, and/or any other desired training location). Components of acoustic sensor array 305, optical sensor array 310, target assembly 315, and/or analysis system 320 may be configured to be partially or substantially entirely removable from or retained on modular elements of training range 325 during uninstallation, storage, and/or transport.
As illustrated in
Lighting within shooting room 380 may affect the features viewed by shooter 400 on target assembly 315. System 300 may adjust lighting to allow for an optimal operation of optical sensor array 310. System 300 may utilize any suitable techniques to selectively modify lighting and/or room interiors and appearances such as, for example, “Pepper's ghost” optical techniques and any other desired optical effects using reflective surfaces. For example, any desired backdrop or configuration of room features may be made visible to shooter 400 based on any desired scenario (e.g., urban warfare scenario, narcotics seizure scenario, terrorism scenario with room features configured based on known intelligence, hostage release scenario, and/or any other desired visual representations). Shooter 400 may thereby view a dynamically moving target reflection 410 on target assembly 315 based on movements of role-player 405. Role-player 405 may thereby interact with shooter 400 via reflected movements of target reflection 410. For example, target assembly 315 may be substantially entirely reflective so that a reflection of shooting room 380 (e.g., including target reflection 410) may be constantly viewed by shooter 400.
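The apparent position of target reflection 410 follows standard mirror geometry: the virtual image of role-player 405 lies at the role-player's position mirrored across the plane of the reflective surface. The following Python sketch is a non-limiting illustration of that computation; the plane placement and coordinates are hypothetical.

```python
import numpy as np

def mirror_point(p, plane_point, plane_normal):
    """Reflect a 3-D point across the plane of the reflective surface.
    The virtual image of the role-player appears at this mirrored point."""
    p = np.asarray(p, float)
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    d = np.dot(p - np.asarray(plane_point, float), n)
    return p - 2.0 * d * n

# Hypothetical layout: target surface in the plane x = 0, normal along +x
role_player = np.array([3.0, 1.0, 1.5])  # metres, in the range frame
virtual_target = mirror_point(role_player, np.zeros(3), [1.0, 0.0, 0.0])
print(virtual_target)  # [-3.  1.  1.5] -- image appears 3 m behind the surface
```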
As illustrated in
In at least some exemplary embodiments and as illustrated in
As illustrated in
In at least some exemplary embodiments, system 300 may include a target assembly (e.g., target assembly 315) having a reflective surface (e.g., surface of reflective layer 345), a plurality of acoustic sensors (e.g., impact sensors 330) disposed at the target assembly, and at least one optical sensor (e.g., optical sensor 340). The target assembly may include sound-absorbing material. The at least one optical sensor may be configured to image a plurality of users (e.g., shooter 400 and role-player 405) of the system. The target assembly may include a reflective layer (e.g., reflective layer 345) and an acoustic layer (e.g., acoustic layer 350). The reflective layer may be a metalized polyester film or any other type of metalized coated plastic film that provides sufficient clarity of reflection. The acoustic layer may include the sound-absorbing material (e.g., absorbing layer 355) and an impact plate (e.g., plate member 360). The at least one optical sensor may include a plurality of stereoscopic cameras. The plurality of acoustic sensors may include a plurality of microphones. The plurality of microphones may be configured to locate a bullet impact point on the target assembly based on time of arrival extraction and/or triangulation computing. The system may further include an additional plurality of acoustic sensors (e.g., trigger sensors 335) disposed at a location facing the reflective surface of the target assembly. The additional plurality of acoustic sensors may include a plurality of microphones configured to determine a time at which a firearm is fired and trigger a start of data collection by the plurality of acoustic sensors and the at least one optical sensor. The at least one optical sensor may be configured to collect three-dimensional image data of a barrel of a firearm of the plurality of users. The at least one optical sensor may be configured to collect image data of the plurality of users.
In at least some exemplary embodiments, system 300 may include an impact shot tracking module (e.g., analysis system 320), comprising computer-executable code stored in non-volatile memory, a processor, an acoustic sensor array (e.g., acoustic sensor array 305), an optical sensor array (e.g., optical sensor array 310), and a target assembly (e.g., target assembly 315). The impact shot tracking module, the processor, the acoustic sensor array, the optical sensor array, and the target assembly may be configured to reflect an image (e.g., target reflection 410) of a first user (e.g., role-player 405) on the target assembly, the reflected image of the first user being visible to a second user (e.g., shooter 400) at a second location, sense a time at which a firearm of the second user is fired at the target assembly, sense a projectile impact location point on the target assembly of a projectile fired from the firearm of the second user, and determine a location of the reflected image (e.g., target reflection 410) of the first user on a reflective surface of the target assembly. The system may also use the optical sensor array to sense a location of a barrel of the firearm. The acoustic sensor array may sense the time at which the firearm is fired and the projectile impact location point. Determining the location of the reflected image of the first user on the reflective surface of the target assembly may be based on using the optical sensor array to sense a location of the first user.
The exemplary disclosed apparatus, system, and method may be used in any suitable application for firearms training. For example, the exemplary disclosed apparatus, system, and method may be used in any suitable application for live-fire training of firearms and/or professional training of personnel. The exemplary disclosed apparatus, system, and method may be used in any suitable application for tactical training of military, police, and security force personnel in using their respective service weapons. The exemplary disclosed apparatus, system, and method may also be used to train firearms users in general.
An exemplary operation of the exemplary disclosed apparatus, system, and method will now be described. For example,
At step 515, shooter 400 and role-player 405 may interact on training range 325 to, for example, conduct a live-fire training exercise using real firearms and live ammunition. As illustrated for example in
At step 520, system 300 may collect data of the movements, interaction, and/or firearm usage of step 515. Step 520 may occur concurrently with step 515, with data being collected in real time or near real time by system 300. One or more trigger sensors 335 may open a data gathering window for data collection by acoustic sensor array 305 and/or optical sensor array 310 as described above. One or more impact sensors 330 may detect a location at which a projectile such as a bullet fired by a firearm may strike target assembly 315. One or more optical sensors 340 may determine a position of a barrel of a firearm used by shooter 400. System 300 may also determine a location of target reflection 410 from a point of view of shooter 400 (e.g., based on one or more optical sensors 340 sensing a location of role-player 405, which may be used to determine a location of target reflection 410). Analysis system 320 may synchronize an operation of acoustic sensor array 305 and optical sensor array 310 during data collection so that data sensed by each sensor is synchronized (e.g., synchronized with respect to a time of occurrence or measurement) with data sensed by other sensors. Based on the calibration of step 510, location data sensed by each sensor may be calibrated with respect to location (e.g., location data may be referenced to the same coordinate system) with data sensed by other sensors.
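As a non-limiting sketch of the synchronization described above, the following Python example matches a time-stamped acoustic event to the nearest optical frame, assuming both sensor streams have already been referenced to a common clock (e.g., by component 365); the frame rate and event time are hypothetical.

```python
import numpy as np

def nearest_frame(event_time_s, frame_times_s):
    """Match a time-stamped acoustic event to the closest optical frame,
    assuming both streams share a common, calibrated clock."""
    frame_times_s = np.asarray(frame_times_s)
    idx = int(np.argmin(np.abs(frame_times_s - event_time_s)))
    return idx, frame_times_s[idx]

# Hypothetical 60 fps camera and an impact detected at t = 1.2345 s
frames = np.arange(0.0, 2.0, 1.0 / 60.0)
print(nearest_frame(1.2345, frames))  # (74, ~1.2333)
```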
At step 525, system 300 (e.g., automatically or based on input from an operator of system 300) may determine whether or not interaction by users with system 300 will continue. If interaction continues, system 300 may return to steps 515 and 520. If interaction is complete (e.g., a firearm training exercise is complete), system 300 may proceed to step 530 to determine a shot trajectory reconstruction. It is also contemplated that steps 515, 520, and 530 may occur concurrently and that system 300 may collect data and determine a shot trajectory reconstruction in real time or near real time as users interact with system 300.
At step 530, system 300 may perform a shot trajectory reconstruction as illustrated for example in
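As a non-limiting illustration of one way such a reconstruction could be scored, the following Python sketch intersects the shooter's line of sight to the mirrored (virtual) target with the reflective surface, and compares that point of apparent target presentation with the acoustically located impact point; the geometry and coordinates are hypothetical and build on the reflection sketch above.

```python
import numpy as np

def aim_point_on_surface(eye, virtual_target, plane_point, plane_normal):
    """Intersect the shooter's line of sight to the mirrored (virtual)
    target with the reflective surface: the spot on the surface where
    the reflected image of the role-player appears to the shooter."""
    n = np.asarray(plane_normal, float)
    n = n / np.linalg.norm(n)
    d = np.asarray(virtual_target, float) - eye
    t = np.dot(np.asarray(plane_point, float) - eye, n) / np.dot(d, n)
    return eye + t * d

# Hypothetical geometry: surface plane x = 0, shooter's eye 4 m out,
# virtual target mirrored 3 m behind the surface
eye = np.array([4.0, 0.0, 1.6])
virtual = np.array([-3.0, 1.0, 1.5])
hit_ref = aim_point_on_surface(eye, virtual, np.zeros(3), [1.0, 0.0, 0.0])
impact = np.array([0.0, 0.55, 1.56])    # acoustically located impact point
print(np.linalg.norm(impact - hit_ref))  # miss distance in metres (~0.027)
```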
In at least some exemplary embodiments, the exemplary disclosed method may include using a reflective surface of a target assembly (e.g., target assembly 315) to reflect an image of a first user (e.g., role-player 405) at a first location to be visible on the reflective surface to a second user (e.g., shooter 400) at a second location. The method may also include acoustically sensing (e.g., sensing using acoustic sensors) a bullet impact location point on the target assembly of a bullet fired from a firearm of the second user and optically sensing (e.g., sensing using optical sensors) a location of the first user and the second user. The method may also include acoustically sensing a time at which the firearm of the second user is fired and optically sensing a location of a barrel of the firearm. Acoustically sensing and optically sensing may be synchronized. The method may also include disposing a line-of-sight barrier (e.g., barrier 390) between the first user at the first location and the second user at the second location.
It is contemplated that system 300 may provide for an interactive exercise in which role-player 405 may also fire (e.g., fire live-fire ammunition) at a target reflection of shooter 400 in a manner similar to the exemplary techniques described above. For example, system 300 may simultaneously evaluate shots fired by role-player 405 at a target reflection of shooter 400 while shooter 400 is firing at target reflection 410 of role-player 405 (e.g., role-player 405 may become another shooter firing back at a target reflection of shooter 400). In this exemplary embodiment, system 300 may provide for a realistic force-on-force simulation in which each shooter attempts to hit the other shooter first, as in a realistic confrontational situation often faced by military and law enforcement personnel. Additional sensors and computing components similar to the exemplary elements described above may be added to system 300 to facilitate this exemplary embodiment. Additions and adjustments may also be made to training range 325 to facilitate this exemplary embodiment.
In at least some exemplary embodiments, system 300 may track the impacts of shots fired from a weapon (e.g., pistol, rifle, and/or any other suitable firearm) in real time to be able to make the shot sequence (e.g., towards a moving/live reflection) analyzable and valuable from a professional training point of view. For example, training range 325 may be calibrated from target assembly 315 in a Cartesian reference frame by system 300.
In at least some exemplary embodiments, system 300 may provide scenario-based firearms training using naturally moving targets in a secure and realistic way. The targets (e.g., target reflection 410) may be lively and dynamic moving targets that allow for scenario-based shot sequences, which may bring target training to the level of professional engagement, use of force engagement, and high-fidelity realism. For example, system 300 may provide training of an interactive nature that uses high-accuracy tracking capability. System 300 may allow for an individual and focused experience for a user such as a student, while permitting the capture and measurement of shooting performance in a simple, modular, and cost-effective way. System 300 may analyze and process large amounts of training data related to firearms training. System 300 may also provide analysis of shot grouping and precision of impact to measure proficiency, while also providing opportunities to train with realistic moving targets, encounter confrontational emotional situations, and practice decision making regarding the use of force. The exemplary method may place and track, with high precision, impacts of projectiles fired at a short distance onto a large surface (e.g., target assembly 315), involving differing calibers and bullet velocities, in a challenging acoustic environment.
In at least some exemplary embodiments, system 300 may provide video feed of a user such as a trainee (e.g., shooter 400) from smart security glasses (e.g., worn by shooter 400 and/or role-player 405) to provide additional data for analysis regarding performance. System 300 may also include personal devices (e.g., physiological measurement devices such as bracelets) to provide additional data for the analysis of a given shot sequence (e.g., including heart rate, stress indicators, and/or any other desired biometric indicators). Also for example, fixed cameras may be linked to system 300 to record position and movement of shooter 400 and also equipment of shooter 400 for further analysis. Such a data feed may be used by trainers for performance analysis.
At step 605, system 300 may perform an acoustic signal acquisition step. System 300 may digitize signals from a plurality of impact sensors 330. For example, system 300 may digitize signals from six microphones.
At step 610, system 300 may perform a time of arrival (TOA) step. System 300 may determine a signal time band for a triangulation calculation.
At step 615, system 300 may perform a “triangulation by time interval” step. System 300 may carry out a triangulation calculation to determine (e.g., evaluate) a position of an impact point of the projectile.
At step 620, system 300 may determine an optimal solution for process 600. For example, system 300 may determine an optimal solution for an impact point on target assembly 315 with respect to overall coherence of the plurality of impact sensors 330 (e.g., a plurality of microphones such as six microphones).
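Purely as a non-limiting illustration of steps 610 through 620, the following Python sketch locates an impact point by grid-searching the target surface for the position whose predicted time-difference-of-arrival pattern is most coherent across all microphones; the microphone layout, surface dimensions, speed of sound, and planar approximation are illustrative assumptions.

```python
import numpy as np

C_AIR = 343.0  # assumed speed of sound in air, m/s

def locate_impact(mic_xy, toas, surface_w=3.0, surface_h=2.0, step=0.01):
    """Grid-search the surface for the impact point whose predicted
    time-difference-of-arrival pattern best matches the measured TOAs,
    i.e., the solution most coherent across all microphones.
    Planar approximation: microphones and impact lie in the surface plane."""
    xs = np.arange(0.0, surface_w, step)
    ys = np.arange(0.0, surface_h, step)
    gx, gy = np.meshgrid(xs, ys)
    toas = np.asarray(toas)
    dists = [np.hypot(gx - mx, gy - my) for mx, my in mic_xy]
    residual = np.zeros_like(gx)
    # Work with TOA differences so the unknown shot time cancels out.
    for i in range(len(mic_xy)):
        for j in range(i + 1, len(mic_xy)):
            predicted = (dists[i] - dists[j]) / C_AIR
            residual += (predicted - (toas[i] - toas[j])) ** 2
    k = np.unravel_index(np.argmin(residual), residual.shape)
    return gx[k], gy[k]

# Hypothetical check: six microphones around a 3 m x 2 m surface
mics = [(0, 0), (1.5, 0), (3, 0), (3, 2), (1.5, 2), (0, 2)]
true_pt = np.array([1.8, 0.9])
sim_toas = [np.hypot(*(true_pt - m)) / C_AIR for m in mics]
print(locate_impact(mics, sim_toas))  # ~ (1.8, 0.9)
```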
As illustrated in the examples of
At step 705, system 300 may measure acoustic signals using impact sensors 330. For example, system 300 may use a plurality of microphones (e.g., six microphones).
At step 710, system 300 may utilize a high-pass filter to reduce a sound of a detonation of a firearm. For example, system 300 may utilize a high-pass filter at 2000 Hz to reduce the sound of detonation of the firearm.
At step 715 and step 720, system 300 may trigger an analog-to-digital conversion by using trigger sensor 335 to detect the start of a shot from a firearm. For example during steps 715 and 720, the sampling frequency may be about 150 kHz or any other suitable frequency and the duration of the acquisition window may be about 20 ms or any other suitable duration.
At step 725, the digitized signals may be stored for subsequent processing. For example, the digitized signals may be stored using high-capacity RAM.
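As a non-limiting sketch of steps 710 through 725, the following Python example applies a high-pass filter at 2000 Hz (here a fourth-order Butterworth design, an assumed choice) and cuts an approximately 20 ms acquisition window at about 150 kHz sampling; the channel count and synthetic data are hypothetical.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 150_000     # sampling frequency (about 150 kHz, per the example above)
WINDOW_S = 0.02  # about 20 ms acquisition window

def highpass_2khz(raw):
    """Attenuate low-frequency muzzle-blast energy below 2000 Hz so the
    broadband impact noise on the plate stands out (step 710)."""
    sos = butter(4, 2000.0, btype="highpass", fs=FS, output="sos")
    return sosfilt(sos, raw)

def acquire_window(channels, trigger_index):
    """Cut the ~20 ms window of all microphone channels starting at the
    trigger instant (steps 715-720); the result would be stored (step 725)."""
    n = int(WINDOW_S * FS)
    return channels[:, trigger_index:trigger_index + n]

# Hypothetical usage: six channels of one second of synthetic data
data = np.random.default_rng(0).normal(size=(6, FS))
filtered = np.vstack([highpass_2khz(ch) for ch in data])
print(acquire_window(filtered, trigger_index=30_000).shape)  # (6, 3000)
```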
In at least some exemplary embodiments and as illustrated in
System 300 may use any suitable technique for acoustic triangulation. For example, system 300 may use a plurality of impact sensors 330 such as six impact sensors 330 or any other desired number of impact sensors 330. For example, system 300 may use three impact sensors 330. For example as illustrated in
The exemplary disclosed apparatus, system, and method may provide an efficient and easy-to-implement technique for providing ultra-realistic training to a user (e.g., trainee) that may involve no additional equipment or gear to be used by the user. Also, the exemplary disclosed apparatus, system, and method may provide realistic live-fire training for a user, who may use his or her own service weapon and live ammunition in the training. The exemplary disclosed apparatus, system, and method may provide for high precision of shot sequence analyses for complex engagement scenarios, which may allow for precise evaluation of a user's accuracy and performance in negotiating the tasks of a training exercise. The exemplary disclosed apparatus, system, and method may also provide for versatile use of role-playing actors (e.g., posing as a threat) and backdrops to add realism to the user's training experience. The exemplary disclosed apparatus, system, and method may provide a modular system that may be quickly and efficiently installed in a variety of locations.
An illustrative representation of a computing device appropriate for use with embodiments of the system of the present disclosure is shown in
Various examples of such general-purpose multi-unit computer networks suitable for embodiments of the disclosure, their typical configuration and many standardized communication links are well known to one skilled in the art, as explained in more detail and illustrated by
According to an exemplary embodiment of the present disclosure, data may be transferred to the system, stored by the system and/or transferred by the system to users of the system across local area networks (LANs) (e.g., office networks, home networks) or wide area networks (WANs) (e.g., the Internet). In accordance with the previous embodiment, the system may be comprised of numerous servers communicatively connected across one or more LANs and/or WANs. One of ordinary skill in the art would appreciate that there are numerous manners in which the system could be configured and embodiments of the present disclosure are contemplated for use with any configuration.
In general, the system and methods provided herein may be employed by a user of a computing device whether connected to a network or not. Similarly, some steps of the methods provided herein may be performed by components and modules of the system whether connected to a network or not. While such components/modules are offline, the data they generate may be stored locally, and that data will then be transmitted to the relevant other parts of the system once the offline component/module comes online again with the rest of the network (or a relevant part thereof). According to an embodiment of the present disclosure, some of the applications of the present disclosure may not be accessible when not connected to a network; however, a user or a module/component of the system itself may be able to compose data offline from the remainder of the system that will be consumed by the system or its other components when the user/offline system component or module is later connected to the system network.
Referring to
According to an exemplary embodiment, as shown in
Components or modules of the system may connect to server 203 via WAN 201 or other network in numerous ways. For instance, a component or module may connect to the system i) through a computing device 212 directly connected to the WAN 201, ii) through a computing device 205, 206 connected to the WAN 201 through a routing device 204, iii) through a computing device 208, 209, 210 connected to a wireless access point 207 or iv) through a computing device 211 via a wireless connection (e.g., CDMA, GSM, 3G, 4G) to the WAN 201. One of ordinary skill in the art will appreciate that there are numerous ways that a component or module may connect to server 203 via WAN 201 or other network, and embodiments of the present disclosure are contemplated for use with any method for connecting to server 203 via WAN 201 or other network. Furthermore, server 203 could be comprised of a personal computing device, such as a smartphone, acting as a host for other computing devices to connect to.
The communications means of the system may be any means for communicating data, including image and video, over one or more networks or to one or more peripheral devices attached to the system, or to a system module or component. Appropriate communications means may include, but are not limited to, wireless connections, wired connections, cellular connections, data port connections, Bluetooth® connections, near field communications (NFC) connections, or any combination thereof. One of ordinary skill in the art will appreciate that there are numerous communications means that may be utilized with embodiments of the present disclosure, and embodiments of the present disclosure are contemplated for use with any communications means.
Traditionally, a computer program includes a finite sequence of computational instructions or program instructions. It will be appreciated that a programmable apparatus or computing device can receive such a computer program and, by processing the computational instructions thereof, produce a technical effect.
A programmable apparatus or computing device includes one or more microprocessors, microcontrollers, embedded microcontrollers, programmable digital signal processors, programmable devices, programmable gate arrays, programmable array logic, memory devices, application specific integrated circuits, or the like, which can be suitably employed or configured to process computer program instructions, execute computer logic, store computer data, and so on. Throughout this disclosure and elsewhere a computing device can include any and all suitable combinations of at least one general purpose computer, special-purpose computer, programmable data processing apparatus, processor, processor architecture, and so on. It will be understood that a computing device can include a computer-readable storage medium and that this medium may be internal or external, removable and replaceable, or fixed. It will also be understood that a computing device can include a Basic Input/Output System (BIOS), firmware, an operating system, a database, or the like that can include, interface with, or support the software and hardware described herein.
Embodiments of the system as described herein are not limited to applications involving conventional computer programs or programmable apparatuses that run them. It is contemplated, for example, that embodiments of the disclosure as claimed herein could include an optical computer, quantum computer, analog computer, or the like.
Regardless of the type of computer program or computing device involved, a computer program can be loaded onto a computing device to produce a particular machine that can perform any and all of the depicted functions. This particular machine (or networked configuration thereof) provides a technique for carrying out any and all of the depicted functions.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Illustrative examples of the computer readable storage medium may include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A data store may be comprised of one or more of a database, file storage system, relational data storage system or any other data system or structure configured to store data. The data store may be a relational database, working in conjunction with a relational database management system (RDBMS) for receiving, processing and storing data. A data store may comprise one or more databases for storing information related to the processing of moving information and estimate information as well as one or more databases configured for storage and retrieval of moving information and estimate information.
Computer program instructions can be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner. The instructions stored in the computer-readable memory constitute an article of manufacture including computer-readable instructions for implementing any and all of the depicted functions.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The elements depicted in flowchart illustrations and block diagrams throughout the figures imply logical boundaries between the elements. However, according to software or hardware engineering practices, the depicted elements and the functions thereof may be implemented as parts of a monolithic software structure, as standalone software components or modules, or as components or modules that employ external routines, code, services, and so forth, or any combination of these. All such implementations are within the scope of the present disclosure. In view of the foregoing, it will be appreciated that elements of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions, program instruction techniques for performing the specified functions, and so on.
It will be appreciated that computer program instructions may include computer executable code. A variety of languages for expressing computer program instructions are possible, including without limitation C, C++, Java, JavaScript, assembly language, Lisp, HTML, Perl, and so on. Such languages may include assembly languages, hardware description languages, database programming languages, functional programming languages, imperative programming languages, and so on. In some embodiments, computer program instructions can be stored, compiled, or interpreted to run on a computing device, a programmable data processing apparatus, a heterogeneous combination of processors or processor architectures, and so on. Without limitation, embodiments of the system as described herein can take the form of web-based computer software, which includes client/server software, software-as-a-service, peer-to-peer software, or the like.
In some embodiments, a computing device enables execution of computer program instructions including multiple programs or threads. The multiple programs or threads may be processed more or less simultaneously to enhance utilization of the processor and to facilitate substantially simultaneous functions. By way of implementation, any and all methods, program codes, program instructions, and the like described herein may be implemented in one or more threads. The thread can spawn other threads, which can themselves have assigned priorities associated with them. In some embodiments, a computing device can process these threads based on priority or any other order based on instructions provided in the program code.
Unless explicitly stated or otherwise clear from the context, the verbs “process” and “execute” are used interchangeably to indicate execute, process, interpret, compile, assemble, link, load, any and all combinations of the foregoing, or the like. Therefore, embodiments that process computer program instructions, computer-executable code, or the like can suitably act upon the instructions or code in any and all of the ways just described.
The functions and operations presented herein are not inherently related to any particular computing device or other apparatus. Various general-purpose systems may also be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these systems will be apparent to those of ordinary skill in the art, along with equivalent variations. In addition, embodiments of the disclosure are not described with reference to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the present teachings as described herein, and any references to specific languages are provided for disclosure of enablement and best mode of embodiments of the disclosure. Embodiments of the disclosure are well suited to a wide variety of computer network systems over numerous topologies. Within this field, the configuration and management of large networks include storage devices and computing devices that are communicatively coupled to dissimilar computing and storage devices over a network, such as the Internet, also referred to as “web” or “world wide web”.
Throughout this disclosure and elsewhere, block diagrams and flowchart illustrations depict methods, apparatuses (e.g., systems), and computer program products. Each element of the block diagrams and flowchart illustrations, as well as each respective combination of elements in the block diagrams and flowchart illustrations, illustrates a function of the methods, apparatuses, and computer program products. Any and all such functions (“depicted functions”) can be implemented by computer program instructions; by special-purpose, hardware-based computer systems; by combinations of special purpose hardware and computer instructions; by combinations of general purpose hardware and computer instructions; and so on—any and all of which may be generally referred to herein as a “component”, “module,” or “system.”
While the foregoing drawings and description set forth functional aspects of the disclosed systems, no particular arrangement of software for implementing these functional aspects should be inferred from these descriptions unless explicitly stated or otherwise clear from the context.
Each element in flowchart illustrations may depict a step, or group of steps, of a computer-implemented method. Further, each step may contain one or more sub-steps. For the purpose of illustration, these steps (as well as any and all other steps identified and described above) are presented in order. It will be understood that an embodiment can contain an alternate order of the steps adapted to a particular application of a technique disclosed herein. All such variations and modifications are intended to fall within the scope of this disclosure. The depiction and description of steps in any particular order is not intended to exclude embodiments having the steps in a different order, unless required by a particular application, explicitly stated, or otherwise clear from the context.
The functions, systems and methods herein described could be utilized and presented in a multitude of languages. Individual systems may be presented in one or more languages and the language may be changed with ease at any point in the process or methods described above. One of ordinary skill in the art would appreciate that there are numerous languages the system could be provided in, and embodiments of the present disclosure are contemplated for use with any language.
It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment may be employed with other embodiments as the skilled artisan would recognize, even if not explicitly stated herein. Descriptions of well-known components and processing techniques may be omitted so as to not unnecessarily obscure the embodiments.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed system and method. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed method and apparatus. It is intended that the specification and examples be considered as exemplary only, with a true scope being indicated by the following claims.