The present invention pertains to augmented reality systems, and in particular to augmented reality systems and methods for use at construction sites.
Estimating in the field of heavy civil construction is a high-risk, high-reward endeavor. A single estimating error can either lead to losses on the project or result in not obtaining a contract at all.
Typically, an estimator goes to a construction site with a paper copy of the drawings. Drawings may include details of roads, utilities, buildings, etc. Upon arriving, the estimator attempts to translate in their mind's eye how these drawings and the actual site itself correlate. Visualizing how the drawings will translate into reality is a very difficult task, and an error in this exercise can be very costly. Misinterpreting the proximity of structures, environmental challenges, the need for removal and reconstruction of roads and sidewalks, conflicts with overhead power lines, or misunderstanding site boundaries are just a few common estimating mistakes.
Therefore, there is a need for a method and apparatus that allows an estimator to easily visualize drawing and map information from different sources that obviates or mitigates one or more limitations of the prior art.
This background information is provided to reveal information believed by the applicant to be of possible relevance to the present invention. No admission is necessarily intended, nor should be construed, that any of the preceding information constitutes prior art against the present invention.
An object of embodiments of the present invention is to provide methods and apparatus that allow an estimator or other party to “Walk the Drawings.” An estimator can open one or more sets of drawings on a mobile device and walk those same electronic drawings as they actually walk or traverse the physical site itself. In other words, as the estimator physically moves across the construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings on their mobile device. The estimator can then identify and locate features that must remain accessible even though they are buried under asphalt. While walking the drawings, the estimator can label any challenges or features by simply clicking their avatar, which posts that geo-stamped location, complete with corresponding notes and photos, straight to the drawings for later review and analysis.
Embodiments may provide extended mapping features to enable site leaders to design their site layouts for optimal construction efficiency. A user can label the best locations for haul roads, access points, equipment laydown areas, material storage spots, and any other job details that might improve site efficiency. Placing these details on the drawings translates into actual placement in the field at those exactly identified locations. These mapping features can also be used to document extra billings, quality control issues, or environmental challenges.
Embodiments may include project schedules that allow for unlimited sub-schedules, such as embedded equipment plans, embedded crew plans, or any other embedded plan a user requires. Schedule data, dates, and resources may be filtered and viewed separately. As an overall schedule changes at the highest level, all the embedded schedules change with it. These embedded schedules provide a holistic view of an entire site, or of an entire company's resource utilization.
In accordance with embodiments of the present invention, there is provided a method for navigation. The method includes receiving, on a mobile device, a terrain image and a map overlay from a server, where the terrain image and the map overlay are aligned together. The method also includes tracking the movement of the mobile device as it moves within the area of the terrain image, and annotating a position of the mobile device superimposed on the terrain image and the map overlay.
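When the terrain image is north-up and its geographic bounds are known, annotating the device's position reduces to projecting each GPS fix onto image pixel coordinates. A minimal sketch in Python (the function name and the linear latitude/longitude interpolation are illustrative assumptions, not the claimed implementation):

```python
def gps_to_pixel(lat, lon, bounds, image_size):
    """Map a GPS fix onto pixel coordinates of a north-up terrain image.

    bounds     -- (north, south, east, west) edges of the image, in degrees
    image_size -- (width, height) of the terrain image, in pixels
    """
    north, south, east, west = bounds
    width, height = image_size
    # Linear interpolation across the bounding box; adequate for site-scale areas.
    x = (lon - west) / (east - west) * width
    y = (north - lat) / (north - south) * height
    return (x, y)
```

For a small build site a linear interpolation across the bounding box is usually adequate; a larger coverage area would call for a proper map projection.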
In accordance with embodiments of the present invention, there is provided a method for aligning layers on an augmented reality display. The method includes receiving an image file and rotating the image file to a predetermined heading. The method also includes selecting a plurality of reference points on a terrain image, and selecting a first alignment point in the image file corresponding to one of the plurality of reference points, then selecting a second alignment point in the image file, where the second alignment point is located on a line connecting two of the reference points.
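Two point correspondences are enough to fix the scale, rotation, and translation that register a drawing to the terrain image. A sketch of deriving a similarity transform from two alignment points (the function names and the complex-number formulation are illustrative assumptions, not prescribed by the method):

```python
def similarity_from_two_points(img_pts, map_pts):
    """Derive the similarity transform mapping two image-file alignment
    points onto their two corresponding terrain reference points.

    Treating 2-D points as complex numbers, z -> a*z + b is a similarity
    transform: a encodes scale and rotation, b encodes translation.
    """
    (x1, y1), (x2, y2) = img_pts
    (u1, v1), (u2, v2) = map_pts
    z1, z2 = complex(x1, y1), complex(x2, y2)
    w1, w2 = complex(u1, v1), complex(u2, v2)
    a = (w2 - w1) / (z2 - z1)  # scale + rotation
    b = w1 - a * z1            # translation
    return a, b

def apply_transform(a, b, point):
    """Apply the transform (a, b) to a single (x, y) point."""
    w = a * complex(*point) + b
    return (w.real, w.imag)
```

Once derived, the same transform is applied to every pixel (or corner) of the image file so the whole layer lands on the terrain image.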
In accordance with embodiments of the present invention, there is provided a method for displaying annotated image data. The method includes receiving image data, processing the image data into a byteslist, and injecting the byteslist into a native map image layer. The method also includes combining the native map image layer with a visual object and rendering a map image.
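The byteslist conversion and the layer-combining step can be illustrated with plain RGBA byte sequences. This sketch assumes a simple "over" alpha blend and is not tied to any particular native map SDK:

```python
def to_byteslist(image_data):
    """Flatten raw image bytes into the list form assumed by the map layer."""
    return list(image_data)

def composite_rgba(base, overlay):
    """Blend an overlay onto a base map layer using the 'over' rule.

    Both arguments are flat RGBA byte sequences of equal length
    (4 bytes per pixel).
    """
    assert len(base) == len(overlay) and len(base) % 4 == 0
    out = bytearray(len(base))
    for i in range(0, len(base), 4):
        r1, g1, b1, a1 = base[i:i + 4]
        r2, g2, b2, a2 = overlay[i:i + 4]
        af = a2 / 255.0  # overlay opacity for this pixel
        out[i]     = round(r2 * af + r1 * (1 - af))
        out[i + 1] = round(g2 * af + g1 * (1 - af))
        out[i + 2] = round(b2 * af + b1 * (1 - af))
        out[i + 3] = round(a2 + a1 * (1 - af))
    return bytes(out)
```

In practice the blend would be done by the platform's compositor; the per-pixel loop above only makes the "over" semantics explicit.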
In accordance with embodiments of the present invention, there is provided a method for capturing a photo. The method includes receiving a camera heading associated with a camera, then rotating a terrain image until a heading of the terrain image matches the camera heading. The method also includes capturing a photo with the camera, tagging the photo with metadata to produce a tagged photo, and sending the tagged photo to a server.
In further embodiments, the camera is a component of the mobile device.
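The heading-matching rotation and the metadata tagging can be sketched as follows. The signed-angle formula keeps the rotation within ±180° so the terrain image turns the short way round; the tag fields shown are illustrative examples of metadata the method might attach, not a defined schema:

```python
def rotation_needed(terrain_heading, camera_heading):
    """Smallest signed rotation (degrees) that aligns the terrain image
    heading with the camera heading. Result is in (-180, 180]."""
    return (camera_heading - terrain_heading + 180) % 360 - 180

def tag_photo(photo_bytes, lat, lon, heading, timestamp):
    """Attach geo-stamp metadata to a captured photo before upload.
    Field names here are illustrative, not a prescribed format."""
    return {
        "image": photo_bytes,
        "lat": lat,
        "lon": lon,
        "heading": heading,    # direction the camera was pointing
        "timestamp": timestamp,
    }
```

The tagged record would then be serialized and sent to the server for display on the shared drawings.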
Embodiments have been described above in conjunction with aspects of the present invention upon which they can be implemented. Those skilled in the art will appreciate that embodiments may be implemented in conjunction with the aspect with which they are described, but may also be implemented with other embodiments of that aspect. When embodiments are mutually exclusive, or are otherwise incompatible with each other, this will be apparent to those skilled in the art. Some embodiments may be described in relation to one aspect, but may also be applicable to other aspects, as will be apparent to those of skill in the art.
Further features and advantages of the present invention will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
Embodiments will now be described with reference to the figures. For simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures, and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.
Various terms used throughout the present description may be read and understood as follows, unless the context indicates otherwise: “or” as used throughout is inclusive, as though written “and/or”; singular articles and pronouns as used throughout include their plural forms, and vice versa; similarly, gendered pronouns include their counterpart pronouns so that pronouns should not be understood as limiting anything described herein to use, implementation, performance, etc. by a single gender; “exemplary” should be understood as “illustrative” or “exemplifying” and not necessarily as “preferred” over other embodiments. Further definitions for terms may be set out herein; these may apply to prior and subsequent instances of those terms, as will be understood from a reading of the present description.
Embodiments of the present invention provide an augmented reality computer system, together with methods, that allows an estimator or other party to “Walk the Drawings.” An estimator can open one or more sets of drawings on a mobile device and walk those same electronic drawings as they actually walk or traverse the physical site itself. In other words, as the estimator physically moves across the construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings on their mobile device. The estimator can then identify and locate features that must remain accessible even though they are buried under asphalt. While walking the drawings, the estimator can label any challenges or features by simply clicking their avatar, which posts that geo-stamped location, complete with corresponding notes and photos, straight to the drawings for later review and analysis.
Embodiments may provide extended mapping features to enable site leaders to design their site layouts for optimal construction efficiency. A user can label the best locations for haul roads, access points, equipment laydown areas, material storage spots, and any other job details that might improve site efficiency. Placing these details on the drawings translates into actual placement in the field at those exactly identified locations. These mapping features can also be used to document extra billings, quality control issues, or environmental challenges.
Embodiments may include project schedules that allow for unlimited sub-schedules, such as embedded equipment plans, embedded crew plans, or any other embedded plan a user requires. Schedule data, dates, and resources may be filtered and viewed separately. As an overall schedule changes at the highest level, all the embedded schedules change with it. These embedded schedules provide a holistic view of an entire site, or of an entire company's resource utilization.
Augmented reality systems as described in embodiments herein may be used in a number of industries and applications. Land developers and home builders may use embodiments to show how to drive to a site by creating an annotated Google Maps equivalent before Google Maps coverage is actually available in that area. A person can use the augmented reality system to drive to their lot without it being staked and understand the orientation, size, and view of key features. The shipping industry may utilize a map overlay over waterways to facilitate the travel of vessels in predefined shipping lanes or parking spots/berths, while avoiding hazards. Similarly, the airline industry may overlay runways for pilots. Embodiments may also be used by the mining industry to accurately overlay features, obstacles, hazards, etc. on the mine area.
In embodiments, a user 108 may open a set of drawings or maps on their mobile device 106 and walk those same electronic drawings as they actually walk the physical site itself. In other words, as an estimator, foreman, or laborer (user 108) physically moves across a construction site in the real world, their electronic icon (avatar) moves across the corresponding electronic drawings or maps that they have chosen to view on their mobile device 106. These maps can include drawings, safety maps, underground locate maps, job construction details, and material details.
With reference to
While moving about the construction site, a user 108, such as a construction estimator, can simultaneously see themselves as an avatar on the drawings 900 on site and see information on satellite image 800, map 900, and other layers that have been configured for the build site. Also, while walking the drawings, the user 108 can label any challenges by simply clicking their avatar, and the mobile device 106 software will post that geo-stamped location, complete with any corresponding notes and photos (showing the direction of the camera), straight to the shared drawings for later review and analysis. This feature can also be used to document extra billings, quality control issues, or environmental challenges, to name just a few use cases. Information, such as notes, may be entered by user 108 by typing at a keyboard or keypad, using voice-to-text software, adding voice recordings, etc. Other information, such as absolute or relative location, bearing, altitude, azimuth, etc., may be obtained from sensors included in the mobile device 106.
Embodiments extend the “Walk the Drawings” mapping feature to enable other users, such as site leaders, to design their site layouts for optimal construction efficiency. For example, a user 108 can label the best locations for haul roads, access points, equipment laydown areas, material storage spots, and any other job details that might improve site efficiency. Placing these details on the drawings 900 translates into actual placement in the field 800 at those exactly identified locations, as a person delivering items can follow their avatar to the drop-off spots using their own mobile device connected to the augmented reality system. In embodiments, the augmented reality system may be used to issue work orders, and if another user accepts a work order, they may be automatically added to the system for that build site. Users who are not able to access the system may be provided with static or dynamic drawings, images, maps, etc.
In embodiments, any user that has access and permission can label points from the field to the office or from the office to the field. Photos or video added from a phone will include metadata, including GPS location data, to show where that picture was taken, be geo-stamped, and may indicate the direction that the camera was pointing. These points and maps can be stored, catalogued, and filtered by category depending on what the user requires for what they are doing and whether they are on their phone or the computer. Categories may be customizable and may include additional information such as site logistics, safety/hazard points, indicators for extra billings, quality control points, environmental, tendering, etc. System administrators may configure the system to accept, store, filter, and display any number of metadata fields that may be used for that particular application.
In embodiments, when looking at the drawings on the computer, the transparency of layers can be adjusted to simultaneously view an informational layer as well as the underlying terrain. Tools may be added to allow a user to perform absolute or relative measurements of distance, height, angle, etc. between points or other references.
In embodiments, locations may be indicated, such as spill piles, areas to place or not to place materials, recommended or prohibited routes, etc. When a user enters a site, their avatar will appear on the map of the area, and locations may be viewed with the user's avatar indicated. For example, if a driver arrives to deliver gravel via a dispatch received through the augmented reality system, they may access the system to see their location and where they should deliver the gravel. The driver may access the system through a user interface, such as by clicking an icon to access the build site, or through automatic detection of the driver's location. The system may indicate directions to the destination or launch a GPS program to direct the driver. Maps may be rotated so that the driver's direction of travel is in front of them to make for easier navigation, and the system may alert the driver when they reach their destination and provide additional information on how and where the gravel should be placed.
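The arrival alert described above can be implemented as a simple geofence test. This sketch uses the haversine great-circle distance and an assumed 25 m threshold; both the function names and the threshold are illustrative:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def arrived(driver, destination, threshold_m=25.0):
    """True once the driver's fix is within the geofence around the drop-off spot."""
    return haversine_m(*driver, *destination) <= threshold_m
```

The threshold would be tuned to GPS accuracy and site conditions; firing the alert on the first fix inside the fence keeps the check stateless.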
Embodiments may provide enhancements to traditional project management software, including a macro scheduler with sub-schedules that move as the higher-level schedule changes. In the case of a construction site, this could be a job schedule at its highest level, similar to schedules supported by software such as Microsoft Project. Embodiments may improve on this by allowing for unlimited sub-schedules, such as embedded equipment plans, embedded crew plans, or any other required embedded information. Sub-schedules may be filtered using different criteria and viewed separately. Examples of filtering criteria are duration, dates, milestones, equipment, location, crew, other resources, etc. As the overall schedule changes at the highest level, all the embedded schedules change with it. These embedded schedules provide a holistic view of the entire company's resource utilization.
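Propagating a change in the top-level schedule to every embedded sub-schedule can be sketched recursively. The dictionary shape used here is an illustrative assumption about how a schedule and its sub-schedules might be represented:

```python
from datetime import date, timedelta

def shift_schedule(schedule, days):
    """Shift a schedule and, recursively, every embedded sub-schedule
    by the same number of days, so they all move together."""
    delta = timedelta(days=days)
    return {
        "start": schedule["start"] + delta,
        "end": schedule["end"] + delta,
        "sub_schedules": [
            shift_schedule(s, days) for s in schedule.get("sub_schedules", [])
        ],
    }
```

A real scheduler would also re-run constraint checks after the shift, but the key property, that embedded plans inherit the parent's change automatically, is captured by the recursion.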
As an example, using embodiments of the augmented reality system described herein, an equipment sub-schedule may show the actual fleet available on any date and compare that to the aggregated projected equipment demands. This scheduler can include specific equipment or a type of equipment (e.g., unit #1 or “large backhoe”). Embodiments may take into consideration factors such as equipment out of service, under repair, or to be rented out at a future date. As the job schedules change or the equipment fleet changes, the equipment schedule may change with it, keeping the information related to equipment demands and availability up to date and relevant. Being able to see equipment demands for each type of equipment well in advance helps companies make better decisions around renting vs. buying equipment, selling vs. repairing equipment, choosing hourly vs. monthly rental terms, or even moving projects around.
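Comparing aggregated demand against the available fleet reduces to counting bookings per equipment type per day. A sketch under an assumed booking-tuple representation (the data shapes and names are illustrative, not part of the described system):

```python
from collections import Counter
from datetime import date, timedelta

def daily_demand(bookings):
    """Aggregate projected demand per (equipment_type, day) across all jobs.

    bookings -- list of (equipment_type, start_date, end_date) tuples,
                with both endpoints inclusive.
    """
    demand = Counter()
    for etype, start, end in bookings:
        d = start
        while d <= end:
            demand[(etype, d)] += 1
            d += timedelta(days=1)
    return demand

def shortfall(demand, fleet):
    """Days on which projected demand exceeds the available fleet.

    fleet -- mapping of equipment_type -> units available.
    """
    return {key: n - fleet.get(key[0], 0)
            for key, n in demand.items()
            if n > fleet.get(key[0], 0)}
```

Surfacing the shortfall dates well in advance is what supports the rent-vs-buy and hourly-vs-monthly decisions described above.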
Embodiments may include a user interface that may be used by methods of adding equipment, or other resources, to a project. Embodiments may also include a user interface that shows a holistic view of equipment, or other resources, and how they may move to illustrate schedule changes.
As shown, the device includes a processor 710, such as a central processing unit (CPU) or a specialized processor such as a graphics processing unit (GPU) or other such processor unit, memory 720, non-transitory mass storage 730, I/O interface 740, network interface 750, video adapter 770, and any required transceivers 760, all of which are communicatively coupled via bidirectional bus 725. Video adapter 770 may be connected to one or more displays 775, and I/O interface 740 may be connected to one or more I/O devices 745, which may be used to implement a user interface. According to certain embodiments, any or all of the depicted elements may be utilized, or only a subset of the elements. Further, computing device 700 may contain multiple instances of certain elements, such as multiple processors, memories, or transceivers. Also, elements of the hardware device may be directly coupled to other elements without the bus 725. Additionally, or alternatively to a processor and memory, other electronics, such as integrated circuits, may be employed for performing the required logical operations.
The memory 720 may include any type of non-transitory memory such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous DRAM (SDRAM), read-only memory (ROM), any combination of such, or the like. The mass storage element 730 may include any type of non-transitory storage device, such as a solid state drive, hard disk drive, magnetic disk drive, optical disk drive, USB drive, or any computer program product configured to store data and machine-executable program code. According to certain embodiments, the memory 720 or mass storage 730 may have recorded thereon statements and instructions executable by the processor 710 for performing any of the method operations described above.
It will be appreciated that it is within the scope of the technology to provide a computer program product or program element, or a program storage or memory device such as a magnetic or optical wire, tape or disc, USB stick, file, or the like, for storing signals readable by a machine, for controlling the operation of a computer according to the method of the technology and/or to structure some or all of its components in accordance with the system of the technology. Acts associated with the method described herein can be implemented as coded instructions in a computer program product. In other words, the computer program product is a computer-readable medium upon which software code is recorded to execute the method when the computer program product is loaded into memory and executed on the microprocessor of a computing device.
Although the present invention has been described with reference to specific features and embodiments thereof, it is evident that various modifications and combinations can be made thereto without departing from the invention. The specification and drawings are, accordingly, to be regarded simply as an illustration of the invention as defined by the appended claims, and are contemplated to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the present invention.
This application claims the benefit of priority to U.S. provisional patent application Ser. No. 63/302,540 entitled “AUGMENTED REALITY SYSTEM WITH INTERACTIVE OVERLAY DRAWING” filed Jan. 24, 2022, hereby incorporated by reference in its entirety.
Number | Date | Country
---|---|---
63302540 | Jan 2022 | US