The present invention generally relates to the field of computer-aided design and structure fabrication. In particular, the present invention is directed to automated joining data recommending and presenting methods for computer-modeled structures.
Technical specifications and engineering drawings typically convey a variety of information about a to-be-fabricated structure, such as a part or an assembly of components. Examples of such information include information about geometry, materials, finishes, connections, hardware, special processes, dimensions, and tolerances, among other things known in the art. Engineers prepare these documents, and manufacturers rely on them during manufacturing preparation to build the desired structure, such as a part or an assembly of multiple components. Differing skill levels on both the engineering side and the manufacturing side have led to a gap between the two stages involved in fabricating a structure.
For the joining of structures, an engineer communicates desired techniques and parameters to the manufacturer using complex symbols (e.g., American National Standards Institute, or ANSI, symbols). The manufacturer must understand the meaning of these symbols, which may be prone to error and require interpretation skill and/or time. The manufacturer also verifies that the structure can be fabricated using the engineer's specified joining technique. If the manufacturer identifies a potential issue or an opportunity to improve the engineer's design, the manufacturer must contact the engineer directly, which is time consuming and inefficient.
In an implementation, the present disclosure is directed to a method of assisting a user in selecting a join for a structure represented in a computer model. The method includes receiving via a join-recommending system a join request; receiving via the join-recommending system input joining data; executing, within the join-recommending system and in response to receiving the join request, a joining recommendation algorithm that determines recommended joining data as a function of the input joining data; presenting the recommended joining data to the user via a join-recommendation user interface; receiving via the join-recommendation user interface a user selection of the recommended joining data so as to create selected joining data; and automatedly associating the selected joining data with the computer model.
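The recited sequence of steps can be illustrated with a brief sketch. Every name below (the `recommend_join` function, the dict-based model, and the toy algorithm and selector) is a hypothetical placeholder for illustration only and is not an API defined by the disclosure.

```python
# Hypothetical sketch of the claimed flow: receive input joining data,
# run a joining recommendation algorithm, take the user's selection,
# and associate it with the computer model. All names are illustrative.

def recommend_join(model, input_joining_data, algorithm, select):
    recommended = algorithm(input_joining_data)     # determine recommendations
    selected = select(recommended)                  # user selection via the UI
    model.setdefault("joins", []).append(selected)  # associate with the model
    return model

# Toy stand-ins for the algorithm and the join-recommendation UI.
model = {"name": "bracket"}
toy_algorithm = lambda d: ["seam weld", "stitch weld"] if d["metal"] else ["adhesive"]
model = recommend_join(model, {"metal": True}, toy_algorithm, lambda recs: recs[0])
```

In this toy run, the algorithm recommends welds for a metal part and the selector stands in for the user choosing the first recommendation, which is then attached to the model.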
For the purpose of illustrating the invention, the drawings show aspects of one or more embodiments of the invention. However, it should be understood that the present invention is not limited to the precise arrangements and instrumentalities shown in the drawings, wherein:
Aspects of the present invention include software tools and techniques for automatically recommending and presenting joining data and methods for joining in one or more instantiations of a structure that is represented in a computer model through the use of a join-recommendation user interface. Using various ones of these tools and techniques, precise joining features and other joining data extracted from the computer model can be used along with joining parameters, as well as input joining data from an input-joining-data user interface, to create highly precise and highly repeatable joining recommendations to be presented to a user. Other aspects of the present invention include software tools and techniques for allowing a user to optionally share or send the presented recommendations to another. Still other aspects of the present invention include software tools and techniques that optionally allow a user to select a joining method from the presented recommendations through a join-recommendation user interface; the selected joining data may be associated with the computer model in any of various ways (e.g., through the use of a feature tree or by visually displaying the selected joining data on the computer model). These and other aspects of the present invention will become readily apparent upon reviewing this entire disclosure. Before proceeding with describing numerous aspects of the present invention in detail, a number of definitions of certain terms used throughout this disclosure are first presented. These terms shall have the following meanings throughout unless noted otherwise. Like terms, such as differing parts of speech, differing tenses, singulars, plurals, etc., for the same term (e.g., fabricating, fabricate, fabrication, fabrications, fabricated) shall have commensurate meanings.
Structure: A “structure” can be any physical thing that can be made by or under the control of a human and/or under the control of one or more machines. For example, a “structure” can be made by hand, using one or more tools, using one or more pieces of machinery, or any combination thereof. Examples of structures include, but are not limited to, objects, parts, assemblies of components, buildings, vehicles, machines, semiconductor devices, computing devices, and electronic equipment, among many others. Fundamentally, there is no limitation on what a “structure” can be other than that it is fabricated.
Fabricate: To “fabricate” a structure is to perform a step or collection of steps needed to physically instantiate the structure. In this context, fabrication includes, but is not limited to, steps of cutting, machining, milling, turning, making connections, molding (in particular, injection molding of parts), casting, stamping, forming, bending, etching, and drilling, among others. Synonyms that fall within the meaning of “fabricate” herein include, but are not limited to, manufacture, erect, assemble, mold, and form, among many others.
Computer model: A “computer model” is a virtual, for example, digital, model of a physical structure as created using appropriate computer-modeling software, such as SolidWorks (available from Dassault Systèmes SolidWorks Corp., Waltham, Mass.), AutoCAD (available from Autodesk, Inc., San Rafael, Calif.), and MicroStation (available from Bentley Systems, Inc., Exton, Pa.), among many others. A “computer model” can be of any type, such as a wireframe or solid model, among others, or any combination thereof, and can be saved in a computer file using any suitable file protocol, such as .SLDPRT, .SLDASM, .STP, .IGS, .DWG, .DXF, .DGN, etc. A “computer model” includes information about the geometry and/or other properties of the modeled structure.
Joining: A “joining” is a means that attaches portions of a structure or structures together in such a way as to ensure that the overall structure is structurally sound (e.g., strong enough to resist and withstand the force loads expected of it). A joining can be made by hand, using one or more tools, using one or more pieces of machinery, or any combination thereof. Joining can be fully automated (e.g., robotic) or involve varying degrees of human input. Examples of joinings include, but are not limited to, seam welds, stitch welds, spot welds, rivets, nails, screws, nuts and bolts, adhesives, and fusing, among many others. Fundamentally, there is no limitation on what “joining” can mean other than that it attaches portions of a structure or structures to one another.
Input joining data: “Input joining data” are input data to a system of the present disclosure that may be extracted from a computer model and that influence the recommended joining data of the structure(s) represented by the computer model. It is noted that input joining data may be either “extracted” or “non-extracted.” The process of extracting joining data from a computer model is more one of scraping than extraction, because the input joining data is not removed from the computer model (which would destroy its integrity) but rather scraped, i.e., copied from the computer model and processed as needed for use in the system. “Non-extracted input joining data” can be input or received in any suitable manner, such as via an input-joining-data user interface, a non-computer-model electronic document (such as a form-fillable portable document format (PDF) document), or a non-computer-model data file, among others. Examples of “input joining data” include, but are not limited to, geometry (such as size, shape, dimensions, areas, configurations, numbers of components and other features, such as openings, recesses, bosses, etc.), type(s) of material (in computer models wherein materials can be specified), connection type(s) and features (in computer models wherein such information can be specified), and finish type(s) (in computer models wherein finishes can be specified), among others. Fundamentally, there is no limitation on the data that can constitute “input joining data,” other than that they influence the recommended joining data of the structure(s) and, therefore, the selected joining data.
Recommended joining data: “Recommended joining data” are output data from a system of the present disclosure that are determined by a joining recommendation algorithm based on at least a portion of a computer model of at least a portion of a structure. Recommended joining data may be presented to the user via a join-recommendation user interface. For example, recommended joining data may include a specific type of joining, one or more joining characteristics for a joining, quantitative parameter values for a joining, or qualitative parameter values for a joining.
Joining characteristic: A “joining characteristic” includes any quantitative and/or qualitative data that conveys manufacturing-specific knowledge about joining to a user. For example, a joining characteristic may be a minimum material thickness, a maximum material thickness, compatibility information for material types, strength of a joint, a cosmetic descriptor of a join, joining time per edge length segment, heat distortion factors per edge length measurement, tolerance adjustments per edge length segment, or a cost to be incurred by the user for the join, among others. Fundamentally, there is no limitation on the data that can constitute a “joining characteristic,” other than that the data convey manufacturing-specific knowledge about joining.
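As a concrete, purely hypothetical illustration, joining characteristics such as those listed above could be carried in a simple record. The field names and values below are assumptions for illustration only; the disclosure does not prescribe any particular data layout.

```python
# Hypothetical record for joining characteristics; the fields mirror the
# examples in the text (thickness limits, strength, cosmetics, cost).
from dataclasses import dataclass

@dataclass
class JoiningCharacteristic:
    min_thickness_mm: float   # minimum material thickness
    max_thickness_mm: float   # maximum material thickness
    strength: str             # qualitative strength of the joint
    cosmetic: str             # cosmetic descriptor of the join
    cost_per_mm: float        # cost per edge length segment

# Invented example values for a spot weld.
spot_weld = JoiningCharacteristic(0.5, 3.0, "medium", "discrete nuggets", 0.02)
```

Records of this kind could then be compared across recommended joining methods, as the join-recommendation user interface described later is said to allow.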
With the foregoing terms and meanings in mind, reference is now made to
Examples of hardware 144 that can be used to implement the various steps of method 140 include, but are not limited to, web servers, desktop computers, laptop computers, tablet computers, smartphones, Internet appliances, and wearable computers (such as a GOOGLE GLASS™ computing device), among others. A network of two or more of such devices can include any one or more types of networks, including, but not limited to, a global communications network (such as the Internet), a wide-area network, a local-area network, and a telecommunications network, among others. In this connection, those skilled in the art will also recognize the myriad of ways that the steps of method 140 can be implemented across a network. For example, if any steps of method 140 are implemented on one or more web servers, they may be performed by suitable software residing on, provided by, and/or executed by such server(s). Such software can include a software application, a software module (such as a plugin to another software application, such as a computer-modeling application, web browser, etc.), and/or a software code segment. In many cases, the same or similar software, or any portion thereof, can be implemented on a desktop computer, a laptop computer, and a tablet computer. As another example, various steps of method 140 can be performed by one or more mobile apps running on, for example, a smartphone or tablet computer, and provided with the ability to communicate with one or more other computing devices running software that performs one or more other steps of the method.
In a particular embodiment, all steps of method 140 can be performed by a single computing device, such as a desktop computer or a webserver accessible via a browser, running an integrated software application that performs all steps of method 140 and that may also include computer-modeling functionality as well, such as a computer-modeling software application 148. In another embodiment, some steps of method 140 can be performed on a first computing device, whereas other steps of the method are performed on a second computing device located remotely from the first computing device across a network. Those skilled in the art will understand how to implement any variation reasonably conceivable using only known programming techniques and this disclosure as a guide. Consequently, it is not necessary to describe every potential variation for skilled artisans to practice the present invention to the fullest scope accorded by the appended claims. Regardless of the type of hardware 144 used to implement software 136 made in accordance with the present invention, the hardware works in combination with and under the control of such software to form join-recommending system 100, which provides functionality described herein.
Referring now to
At step 210, in conjunction with the user making joining request 152, software 136 receives input joining data 120. As described above, input joining data 120 can be any data that a joining recommendation algorithm 156 of software 136 needs for determining recommended joining data 104. Those skilled in the art will understand that input joining data 120′ can be determined in a variety of ways. In one example embodiment, joining data 120′ may be inputted manually by the user through any of a variety of means (e.g., a text-entry dialogue box or prompt, a drop-down selection menu, and/or a bulleted or button selection, among others). In another exemplary embodiment, joining data 120′ may be extracted from computer model 116 automatically in any of a variety of ways. For example, if data-extraction code is built into computer-modeling software, such as computer-modeling software application 148, the data-extraction code may be preprogrammed to recognize input joining data 120′ within computer model 116 and utilize the internal protocols of that application to gather that data. As another example, if the data-extraction code is implemented as an external plugin module to computer-modeling software, the code might utilize the application's plugin module protocols. As yet another example, if the data-extraction code is executed externally from the computer-modeling application but not as a plugin, the external code may utilize an application programming interface of the application. Regardless of how software 136 and the data-extraction code are configured, those skilled in the art will readily understand how to design the code.
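The "scraping" behavior described in the definitions (copying joining data out of the model without altering it) can be sketched as follows. The dict-based model schema and field names here are assumptions; a real implementation would go through a modeling application's internal protocols, plugin protocols, or application programming interface, as just described.

```python
# Minimal sketch of scraping input joining data from a computer model:
# joining-relevant fields are copied out, leaving the model intact.
import copy

def extract_joining_data(model):
    keys = ("material", "thickness_mm", "edge_lengths_mm")  # assumed fields
    return {k: copy.deepcopy(model[k]) for k in keys if k in model}

model = {"material": "steel", "thickness_mm": 2.0,
         "edge_lengths_mm": [10.0, 25.0], "units": "mm"}
scraped = extract_joining_data(model)
```

Note that the deep copy preserves the model's integrity: the scraped data can be processed freely without mutating the geometry it was copied from.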
Alternatively, input joining data 120 may comprise non-extracted input joining data, which can be received in any of a variety of ways, depending on how software 136 is configured in a particular instantiation. For example, software 136 may be configured to allow a user to input joining data via the input-joining-data user interface 124 that includes one or more data-input features of one or more differing types, such as, but not necessarily limited to, keyed-input fields, drop-down menus, radio-control selectors, hyperlinks, and other selectors, among others. Those skilled in the art will readily appreciate that the type(s) and number of data-input features can depend on the robustness of computer model 116 relative to input joining data 120′ that it contains and the variety of options available from a particular fabricator. Regarding the former, as noted above, if a version of computer model 116 includes input joining data 120′ such as material type(s), then the input-joining-data user interface 124 does not necessarily need a data-input feature directed to the material type(s), since that information will be extracted from the computer model. However, for a version of computer model 116 that does not allow the user to specify materials within the computer-modeling software application 148, therefore prohibiting the extraction of such data, the input-joining-data user interface 124 would need one or more data-input features to allow the user to input the appropriate material(s) needed for recommending joining method(s). Of course, material type is but one example of a joining-data type that can be of either an extracted type or a non-extracted type, and other joining data that can be of either type can be handled similarly.
The input-joining-data user interface 124 of the present disclosure can be implemented in any of a number of ways. For example, if the input-joining-data software code is implemented within computer-modeling software, such as computer-modeling software application 148, then the user interface (not shown) may be presented as a graphical user interface of the software application. Similarly, if the input-joining-data software code is executed in a plug-in-type external module, then the user interface may be presented in the same manner. It is noted that such computer-modeling software can be of the type that presents its graphical user interface via an on-screen window under the control of the operating system of the computer on which the application is implemented. However, in other embodiments, the graphical user interface can be presented in another way (via a web-browser, for example) when the computer-modeling software application 148 includes the input-joining-data software code and is implemented over the World-Wide Web, perhaps in a software-as-a-service implementation, among others, or when the input-joining-data software code is implemented separately from the computer-modeling software application. Regardless of how the input-joining-data software code is implemented for receiving joining data, skilled artisans will be able to create the appropriate software code.
At step 215, software 136 determines and presents recommended joining data 104 to a user via a join-recommendation user interface 114. Depending on the configuration of software 136 and where the various software components of the joining recommendation portion of the software are physically executed, step 215 will typically include several substeps. For example, at substep 220, joining recommendation algorithm 156 may determine recommended joining data 104 by, for example, utilizing input joining data 120. The precise joining recommendation algorithm 156 used in a particular embodiment will be highly dependent on, for example, the type(s) of structure(s) 112 handled by software 136, as well as the particularities of the fabricator(s) needed to fabricate the instantiation(s). Algorithm 156 may, in one embodiment, function in the form of a decision tree. This decision tree structure may function through a series of branching decisions, the outcomes of which may be determined by the available input joining data 120. This algorithm results in recommended joining data 104 being presented to the user. Joining recommendation algorithm 156 makes these recommendations by analyzing the geometry of the selected features and/or the surrounding features and comparing this geometrical data to known parameters and limitations for various joining methods and techniques.
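A toy decision tree in the spirit of algorithm 156 might branch on material and thickness to reach a recommendation. The thresholds, materials, and joining methods below are invented for illustration and are not values from the disclosure.

```python
# Toy decision tree: a series of branching decisions, each driven by the
# available input joining data, terminating in recommended joining methods.
# All thresholds and method names are illustrative assumptions.

def recommend(data):
    if data.get("material") == "steel":
        if data.get("thickness_mm", 0.0) >= 1.0:
            return ["seam weld", "stitch weld"]  # thicker steel: continuous welds
        return ["spot weld"]                     # thin steel sheet
    if data.get("material") == "plastic":
        return ["adhesive", "screws"]            # non-weldable material
    return ["rivets"]                            # conservative fallback

recs = recommend({"material": "steel", "thickness_mm": 2.0})
```

In a real embodiment the branches would compare the geometry of the selected and surrounding features against known parameters and limitations of the various joining techniques, as the text describes.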
Step 215 may also include a substep 225 at which software 136 conveys recommended joining data 104 to the user. These recommendations may be presented to the user in any one or more of a variety of ways. For example, recommended joining data 104 can be displayed on a display screen (not shown) of the user's computer, conveyed in an email, and/or provided in some other type of message, including regular mail, an instant message, a text message, etc. and/or as an attachment thereto, among others. Illustrative embodiments of this display screen will further be shown in
Step 215 may further include an optional substep 230 at which software 136 conveys recommended joining data 104 to an entity other than the user. For example, software 136 may transmit recommended joining data to others wishing to collaborate on the design of structure 112 (e.g., user's manager, design project team members, etc.). As another example, software 136 may transmit recommended joining data 104 and input joining data 120 to one or more fabricator(s) for purposes of data collection. As yet another example, software 136 may transmit recommended joining data and input joining data 120 to a social media system (not shown). Fundamentally, there are no limitations on how recommended joining data can be conveyed to other entities.
At optional step 235, the user determines the selected joining data 142 to associate with the computer model 116 of the structure 112 from the recommended joining data 104 presented via a join-recommendation user interface 114. For example, selected joining data 142 can be chosen from a display screen (not shown) of the user's computer, conveyed in an email, and/or provided in some other type of message. Step 235 may involve an iterative process in which the user is provided additional information about recommended joining data 104 before determining the selected joining data 142. Step 235 may involve the user specifying additional input information (e.g., parameter values) required for the selected joining data 142 to be associated with the computer model 116 via the input-joining-data user interface 124.
At optional step 240, in response to receiving the selected joining data 142, software 136 associates the selected joining data 142 with the computer model 116 of the structure 112. For example, software 136 may append information about the selected joining data 142 to the computer model 116, create a data link between information about the selected joining data 142 and the computer model 116, and/or display a visual representation of the selected joining data 142 with a visual representation of the computer model 116.
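One hedged way to realize this association is sketched below. Appending a feature-tree entry and returning a separate link record are assumptions about how a particular implementation might behave, not a prescription from the disclosure.

```python
# Hypothetical association of selected joining data with a model: add a
# feature-tree entry and create a separate data link to the selection.
# The dict-based "model" and all field names are illustrative assumptions.

def associate(model, selected_joining_data):
    model.setdefault("feature_tree", []).append(selected_joining_data)
    link = {"model": model.get("name"), "join": selected_joining_data["method"]}
    return link

model = {"name": "bracket"}
link = associate(model, {"method": "seam weld", "length_mm": 40.0})
```

A fuller implementation might also drive the third option mentioned above, rendering a visual representation of the selected joining data on the displayed model.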
It is particularly emphasized that the order of performance of the foregoing steps of method 200 need not be as shown. Rather, they may be implemented in any logical order that results in determining and conveying recommended joining data.
In this example, optional window 300 further includes a panel 316 that conveys one or more joining characteristics 320 for ones of the recommended joining methods 308(1) to 308(N). For illustrative purposes, panel 316 includes information about the cosmetic appearance, strength, and cost of ones of the recommended joining methods 308(1) to 308(N). In one embodiment, each joining characteristic 320 includes a control 324 for conveying quantitative and/or qualitative information regarding the characteristic 320 which further empowers the user with the ability to compare manufacturing knowledge across recommended joining methods.
The one or more recommended joining methods 308(1) to 308(N) may be selectable by the user. When the user selects a recommended joining method 308(N) displayed in panel 304, information for one or more joining characteristics 320(1) to 320(N) associated with the selection is displayed in panel 316. This feature enables users to quickly toggle between the one or more recommended joining methods 308(1) to 308(N) and compare the one or more joining characteristics 320(1) to 320(N).
Window 300 further includes a panel 328 that enables the user to signal software 136 regarding recommended joining data 104 and/or recommended joining method(s) 308 that may or may not be associated with the model. Panel 328 includes a confirmation button 332 that enables the user to signal software 136 that ones of the recommended joining methods 308(1) to 308(N) are to be associated with the model. For example, upon selection of the confirmation button 332, software 136 may execute step 240 of
Window 400 also includes a panel 416 that includes one or more controls for conveying recommended joining data 104 determined by joining recommendation algorithm 156. For example, controls 420(1) to 420(N) may convey quantitative and/or qualitative joining parameter values 424(1) to 424(N) associated with the joining method 408. Controls 420(1) to 420(N) may enable the user to modify one or more of the recommended joining parameter values 424(1) to 424(N).
Window 400 further includes panel 316
Window 400 also includes a panel 428 that enables the user to signal software 136 regarding recommended joining data 104 that may or may not be associated with the model. Panel 428 includes a confirmation button 432 that enables the user to signal software 136 that the joining method 408 and quantitative and/or qualitative joining parameter values 424(1) to 424(N) are to be associated with the model. For example, upon selection of the confirmation button 432, software 136 may execute step 240 of
Exemplary Computing Device
As noted above, aspects and embodiments described herein may be conveniently implemented using one or more machines (e.g., one or more computing devices/computer systems that are part of a join-recommending system 100 of
Such software may be, for example, a computer program product that employs one or more machine-readable hardware storage mediums. A machine-readable storage medium may be any medium that is capable of storing and/or encoding a sequence of instructions for execution by a machine (e.g., a computing device) and that causes the machine to perform any one of the methodologies and/or embodiments described herein. Examples of a machine-readable hardware storage medium include, but are not limited to, a magnetic disk (e.g., a conventional floppy disk, a hard drive disk), an optical disk (e.g., a compact disk “CD”, such as a readable, writeable, and/or re-writable CD; a digital video disk “DVD”, such as a readable, writeable, and/or rewritable DVD), a magneto-optical disk, a read-only memory “ROM” device, a random access memory “RAM” device, a magnetic card, an optical card, a solid-state memory device (e.g., a flash memory), an EPROM, an EEPROM, and any combinations thereof. A machine-readable medium, as used herein, is intended to include a single medium as well as a collection of physically separate media, such as, for example, a collection of compact disks or one or more hard disk drives in combination with a computer memory. As used herein, a machine-readable storage medium does not include a signal.
Such software may also include information (e.g., data) carried as a data signal on a data carrier, such as a carrier wave. Such a data signal or carrier wave would not be considered a machine-readable hardware storage medium. For example, machine-executable information may be included as a data-carrying signal embodied in a data carrier in which the signal encodes a sequence of instructions, or portion thereof, for execution by a machine (e.g., a computing device) and any related information (e.g., data structures and data) that causes the machine to perform any one of the methodologies and/or embodiments described herein.
Examples of a computing device include, but are not limited to, a computer workstation, a terminal computer, a server computer, a handheld device (e.g., tablet computer, a personal digital assistant “PDA”, a mobile telephone, etc.), a web appliance, a network router, a network switch, a network bridge, any machine capable of executing a sequence of instructions that specify an action to be taken by that machine, and any combinations thereof.
Computing device 500 can also include a memory 508 that communicates with the one or more processors 504, and with other components, for example, via a bus 512. Bus 512 may include any of several types of bus structures including, but not limited to, a memory bus, a memory controller, a peripheral bus, a local bus, and any combinations thereof, using any of a variety of bus architectures.
Memory 508 may include various components (e.g., machine-readable hardware storage media) including, but not limited to, a random access memory component (e.g., a static RAM “SRAM”, a dynamic RAM “DRAM”, etc.), a read only component, and any combinations thereof. In one example, a basic input/output system 516 (BIOS), including basic routines that help to transfer information between elements within computing system 500, such as during start-up, may be stored in memory 508. Memory 508 may also include (e.g., stored on one or more machine-readable hardware storage media) instructions (e.g., software) 520 embodying any one or more of the aspects and/or methodologies of the present disclosure. In another example, memory 508 may further include any number of program modules including, but not limited to, an operating system, one or more application programs, other program modules, program data, and any combinations thereof.
Computing device 500 may also include a storage device 524, such as, but not limited to, a machine-readable hardware storage medium as described above. Storage device 524 may be connected to bus 512 by an appropriate interface (not shown). Example interfaces include, but are not limited to, SCSI, advanced technology attachment (ATA), serial ATA, universal serial bus (USB), IEEE 1394 (FIREWIRE), and any combinations thereof. In one example, storage device 524 (or one or more components thereof) may be removably interfaced with computing system 500 (e.g., via an external port connector (not shown)). Particularly, storage device 524 and an associated machine-readable medium 528 may provide nonvolatile and/or volatile storage of machine-readable instructions, data structures, program modules, and/or other data for computing device 500. In one example, software instructions 520 may reside, completely or partially, within machine-readable hardware storage medium 528. In another example, software instructions 520 may reside, completely or partially, within processors 504.
Computing device 500 may also include an input device 532. In one example, a user of computing system 500 may enter commands and/or other information into computing system 500 via one or more input devices 532. Examples of an input device 532 include, but are not limited to, an alpha-numeric input device (e.g., a keyboard), a pointing device, a joystick, a gamepad, an audio input device (e.g., a microphone, a voice response system, etc.), a cursor control device (e.g., a mouse), a touchpad, an optical scanner, a video capture device (e.g., a still camera, a video camera), a touch screen, and any combinations thereof. Input device(s) 532 may be interfaced to bus 512 via any of a variety of interfaces (not shown) including, but not limited to, a serial interface, a parallel interface, a game port, a USB interface, a FIREWIRE interface, a direct interface to bus 512, and any combinations thereof. Input device(s) 532 may include a touch screen interface that may be a part of or separate from display(s) 536, discussed further below. Input device(s) 532 may be utilized as a user selection device for selecting one or more graphical representations in a graphical interface as described above.
A user may also input commands and/or other information to computing device 500 via storage device 524 (e.g., a removable disk drive, a flash drive, etc.) and/or network interface device(s) 540. A network interface device, such as any one of network interface device(s) 540, may be utilized for connecting computing system 500 to one or more of a variety of networks, such as network 544, and one or more remote devices 548 connected thereto. Examples of a network interface device include, but are not limited to, a network interface card (e.g., a mobile network interface card, a LAN card), a modem, and any combination thereof. Examples of a network include, but are not limited to, a wide area network (e.g., the Internet, an enterprise network), a local area network, a telephone network, a data network associated with a telephone/voice provider, a direct connection between two computing devices, and any combinations thereof. A network, such as network 544, may employ a wired and/or a wireless mode of communication. In general, any network topology may be used. Information (e.g., data, software instructions 520, etc.) may be communicated to and/or from computing system 500 via network interface device(s) 540.
Computing device 500 may further include one or more video display adapters 552 for communicating a displayable image to one or more display devices, such as display device(s) 536. Examples of a display device include, but are not limited to, a liquid crystal display (LCD), a cathode ray tube (CRT), a plasma display, a light emitting diode (LED) display, and any combinations thereof. Display adapter(s) 552 and display device(s) 536 may be utilized in combination with processor(s) 504 to provide graphical representations of aspects of the present disclosure to a user. In addition to a display device, computing system 500 may include one or more other peripheral output devices including, but not limited to, an audio speaker, a printer, and any combinations thereof. Such peripheral output devices may be connected to bus 512 via a peripheral interface 556. Examples of a peripheral interface include, but are not limited to, a serial port, a USB connection, a FIREWIRE connection, a THUNDERBOLT connection, a parallel connection, and any combinations thereof.
The foregoing has been a detailed description of illustrative embodiments of the invention. Various modifications and additions can be made without departing from the spirit and scope of this invention. Features of each of the various embodiments described above may be combined with features of other described embodiments as appropriate in order to provide a multiplicity of feature combinations in associated new embodiments. Furthermore, while the foregoing describes a number of separate embodiments of the apparatus and method of the present invention, what has been described herein is merely illustrative of the application of the principles of the present invention. Additionally, although the methods herein have been illustrated as being performed in a specific order, the order of performance may be varied within ordinary skill while still achieving the joining data recommending and presenting methods, systems, and software described herein. Accordingly, this description is meant to be taken only by way of example and not to otherwise limit the scope of this invention.
References Cited

U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
4495559 | Gelatt, Jr. et al. | Jan 1985 | A |
5117354 | Long | May 1992 | A |
5465221 | Merat et al. | Nov 1995 | A |
5495430 | Matsunari et al. | Feb 1996 | A |
5552995 | Sebastian | Sep 1996 | A |
5570291 | Dudle et al. | Oct 1996 | A |
5655087 | Hino et al. | Aug 1997 | A |
5758328 | Giovannoli | May 1998 | A |
5847971 | Ladner et al. | Dec 1998 | A |
5870719 | Maritzen et al. | Feb 1999 | A |
5937189 | Branson et al. | Aug 1999 | A |
6031535 | Barton | Feb 2000 | A |
6112133 | Fishman | Aug 2000 | A |
6295513 | Thackston | Sep 2001 | B1 |
6341271 | Salvo et al. | Jan 2002 | B1 |
6343285 | Tanaka et al. | Jan 2002 | B1 |
6611725 | Harrison | Aug 2003 | B1 |
6647373 | Carlton-Foss | Nov 2003 | B1 |
6701200 | Lukis et al. | Mar 2004 | B1 |
6750864 | Anwar | Jun 2004 | B1 |
6834312 | Edwards et al. | Dec 2004 | B2 |
6836699 | Lukis et al. | Dec 2004 | B2 |
6859768 | Wakelam et al. | Feb 2005 | B1 |
6917847 | Littlejohn et al. | Jul 2005 | B2 |
6922701 | Ananian et al. | Jul 2005 | B1 |
7006084 | Buss et al. | Feb 2006 | B1 |
7058465 | Emori et al. | Jun 2006 | B2 |
7079990 | Haller | Jul 2006 | B2 |
7085687 | Eckenwiler et al. | Aug 2006 | B2 |
7089082 | Lukis et al. | Aug 2006 | B1 |
7123986 | Lukis et al. | Oct 2006 | B2 |
7134096 | Brathwaite et al. | Nov 2006 | B2 |
7299101 | Lukis et al. | Nov 2007 | B2 |
7305367 | Hollis et al. | Dec 2007 | B1 |
7327869 | Boyer | Feb 2008 | B2 |
7343212 | Brearley et al. | Mar 2008 | B1 |
7359886 | Sakurai et al. | Apr 2008 | B2 |
7366643 | Verdura et al. | Apr 2008 | B2 |
7369970 | Shimizu et al. | May 2008 | B2 |
7418307 | Katircioglu | Aug 2008 | B2 |
7467074 | Faruque et al. | Dec 2008 | B2 |
7496487 | Wakelam et al. | Feb 2009 | B2 |
7496528 | Lukis et al. | Feb 2009 | B2 |
7499871 | McBrayer et al. | Mar 2009 | B1 |
7523411 | Carlin | Apr 2009 | B2 |
7526358 | Kawano et al. | Apr 2009 | B2 |
7529650 | Wakelam et al. | May 2009 | B2 |
7565139 | Neven, Sr. et al. | Jul 2009 | B2 |
7565223 | Moldenhauer et al. | Jul 2009 | B2 |
7567849 | Trammell et al. | Jul 2009 | B1 |
7568155 | Axe et al. | Jul 2009 | B1 |
7571166 | Davies et al. | Aug 2009 | B1 |
7574339 | Lukis et al. | Aug 2009 | B2 |
7590466 | Lukis et al. | Sep 2009 | B2 |
7590565 | Ward et al. | Sep 2009 | B2 |
7603191 | Gross | Oct 2009 | B2 |
7606628 | Azuma | Oct 2009 | B2 |
7630783 | Walls-Manning et al. | Dec 2009 | B2 |
7656402 | Abraham et al. | Feb 2010 | B2 |
7689936 | Rosel | Mar 2010 | B2 |
7733339 | Laning et al. | Jun 2010 | B2 |
7747469 | Hinman | Jun 2010 | B2 |
7748622 | Schon et al. | Jul 2010 | B2 |
7761319 | Gil et al. | Jul 2010 | B2 |
7822682 | Arnold et al. | Oct 2010 | B2 |
7836573 | Lukis et al. | Nov 2010 | B2 |
7840443 | Lukis et al. | Nov 2010 | B2 |
7908200 | Scott et al. | Mar 2011 | B2 |
7957830 | Lukis et al. | Jun 2011 | B2 |
7979313 | Baar | Jul 2011 | B1 |
7993140 | Sakezles | Aug 2011 | B2 |
8000987 | Hickey et al. | Aug 2011 | B2 |
8024207 | Ouimet | Sep 2011 | B2 |
8140401 | Lukis et al. | Mar 2012 | B2 |
8170946 | Blair et al. | May 2012 | B2 |
8175933 | Cook, Jr. et al. | May 2012 | B2 |
8180396 | Athsani et al. | May 2012 | B2 |
8209327 | Danish et al. | Jun 2012 | B2 |
8239284 | Lukis et al. | Aug 2012 | B2 |
8249329 | Silver | Aug 2012 | B2 |
8271118 | Pietsch et al. | Sep 2012 | B2 |
8275583 | Devarajan et al. | Sep 2012 | B2 |
8295971 | Krantz | Oct 2012 | B2 |
8417478 | Gintis et al. | Apr 2013 | B2 |
8441502 | Reghetti et al. | May 2013 | B2 |
8515820 | Lopez et al. | Aug 2013 | B2 |
8554250 | Linaker | Oct 2013 | B2 |
8571298 | McQueen et al. | Oct 2013 | B2 |
8595171 | Qu | Nov 2013 | B2 |
8700185 | Yucel et al. | Apr 2014 | B2 |
8706607 | Sheth et al. | Apr 2014 | B2 |
8768651 | Bhaskaran et al. | Jul 2014 | B2 |
8798324 | Conradt | Aug 2014 | B2 |
8806398 | Brathwaite et al. | Aug 2014 | B2 |
8830267 | Brackney | Sep 2014 | B2 |
8849636 | Becker et al. | Sep 2014 | B2 |
8861005 | Grosz | Oct 2014 | B2 |
8874413 | Mulligan et al. | Oct 2014 | B2 |
8923650 | Wexler | Dec 2014 | B2 |
8977558 | Nielsen et al. | Mar 2015 | B2 |
9037692 | Ferris | May 2015 | B2 |
9055120 | Firman | Jun 2015 | B1 |
9106764 | Chan et al. | Aug 2015 | B2 |
20010023418 | Suzuki et al. | Sep 2001 | A1 |
20010047251 | Kemp | Nov 2001 | A1 |
20020065790 | Oouchi | May 2002 | A1 |
20020087440 | Blair et al. | Jul 2002 | A1 |
20020099579 | Stowell et al. | Jul 2002 | A1 |
20020107673 | Haller | Aug 2002 | A1 |
20020152133 | King et al. | Oct 2002 | A1 |
20030018490 | Magers et al. | Jan 2003 | A1 |
20030069824 | Menninger | Apr 2003 | A1 |
20030078846 | Burk et al. | Apr 2003 | A1 |
20030139995 | Farley | Jul 2003 | A1 |
20030149500 | Faruque et al. | Aug 2003 | A1 |
20030163212 | Smith et al. | Aug 2003 | A1 |
20030172008 | Hage et al. | Sep 2003 | A1 |
20030212610 | Duffy et al. | Nov 2003 | A1 |
20030220911 | Tompras et al. | Nov 2003 | A1 |
20040008876 | Lure et al. | Jan 2004 | A1 |
20040113945 | Park et al. | Jun 2004 | A1 |
20040195224 | Kanodia et al. | Oct 2004 | A1 |
20050055299 | Chambers et al. | Mar 2005 | A1 |
20050125092 | Lukis et al. | Jun 2005 | A1 |
20050144033 | Vreeke et al. | Jun 2005 | A1 |
20050171790 | Blackmon | Aug 2005 | A1 |
20050251478 | Yanavi | Nov 2005 | A1 |
20050273401 | Yeh et al. | Dec 2005 | A1 |
20060085322 | Crookshanks | Apr 2006 | A1 |
20060185275 | Yatt | Aug 2006 | A1 |
20060253214 | Gross | Nov 2006 | A1 |
20070016437 | Elmufdi et al. | Jan 2007 | A1 |
20070067146 | Devarajan et al. | Mar 2007 | A1 |
20070073593 | Perry et al. | Mar 2007 | A1 |
20070112635 | Loncaric | May 2007 | A1 |
20070198231 | Walch | Aug 2007 | A1 |
20080120086 | Lilley et al. | May 2008 | A1 |
20080183614 | Gujral et al. | Jul 2008 | A1 |
20080269942 | Free | Oct 2008 | A1 |
20080281678 | Keuls et al. | Nov 2008 | A1 |
20090058860 | Fong et al. | Mar 2009 | A1 |
20090208773 | DuPont | Aug 2009 | A1 |
20090299799 | Racho et al. | Dec 2009 | A1 |
20090319388 | Yuan et al. | Dec 2009 | A1 |
20110040542 | Sendhoff et al. | Feb 2011 | A1 |
20110047140 | Free | Feb 2011 | A1 |
20110209081 | Chen et al. | Aug 2011 | A1 |
20110213757 | Bhaskaran et al. | Sep 2011 | A1 |
20120016678 | Gruber et al. | Jan 2012 | A1 |
20120072299 | Sampsell | Mar 2012 | A1 |
20120230548 | Calman et al. | Sep 2012 | A1 |
20120316667 | Hartloff | Dec 2012 | A1 |
20130055126 | Jackson | Feb 2013 | A1 |
20130097259 | Li | Apr 2013 | A1 |
20130100128 | Steedly et al. | Apr 2013 | A1 |
20130138529 | Hou | May 2013 | A1 |
20130144566 | De Biswas | Jun 2013 | A1 |
20130166470 | Grala et al. | Jun 2013 | A1 |
20130218961 | Ho | Aug 2013 | A1 |
20130293580 | Spivack | Nov 2013 | A1 |
20130297320 | Buser | Nov 2013 | A1 |
20130297460 | Spivack | Nov 2013 | A1 |
20130311914 | Daily | Nov 2013 | A1 |
20130325410 | Jung et al. | Dec 2013 | A1 |
20140042136 | Daniel et al. | Feb 2014 | A1 |
20140067333 | Rodney et al. | Mar 2014 | A1 |
20140075342 | Corlett | Mar 2014 | A1 |
20140098094 | Neumann et al. | Apr 2014 | A1 |
20140157579 | Chhabra et al. | Jun 2014 | A1 |
20140207605 | Allin et al. | Jul 2014 | A1 |
20140229316 | Brandon | Aug 2014 | A1 |
20140279177 | Stump | Sep 2014 | A1 |
20140379119 | Sciacchitano et al. | Dec 2014 | A1 |
20150055085 | Fonte et al. | Feb 2015 | A1 |
20150066189 | Mulligan et al. | Mar 2015 | A1 |
20150127480 | Herrman et al. | May 2015 | A1 |
20150234377 | Mizikovsky | Aug 2015 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
0154476 | Aug 2001 | WO |
0171626 | Sep 2001 | WO |
0177781 | Oct 2001 | WO |
2006086332 | Aug 2006 | WO |
2007067248 | Jun 2007 | WO |
2011139630 | Nov 2011 | WO |
2011140646 | Nov 2011 | WO |
2013058764 | Apr 2013 | WO |
2014152396 | Sep 2014 | WO |
Other Publications

Entry |
---|
Wu et al., "Interactive 3D Geometric Modelers with 2D UI," State University of Campinas, www.dca.fee.unicamp.br, Sao Paulo, Brazil, 2002; 8 pages. |
“Upload Your Photos, Print a 3D Model with hypr3D.” SolidSmack. http://www.solidsmack.com/cad-design-news/hypr3d-photo-video-3d-print/; last accessed on Oct. 13, 2015. |
Rothganger et al. “3D Object Modeling and Recognition from Photographs and Image Sequences.” Toward Category-Level Object Recognition. 2006, pp. 105-126, vol. 4170 of the series Lecture Notes in Computer Science. Springer Berlin Heidelberg. |
Retrieved from: http://www.solidworks.com/sw/products/3d-cad/manufacturing-cost-estimation-quoting.htm; p. 1: Automatic Manufacturing Cost Estimation Overview; SolidWorks; 2015. |
Retrieved from: http://www.gom.com/fileadmin/user_upload/industries/touch_probe_fixtures_EN.pdf; Application Example: Quality Control, Online Calibration and Validation of Fixtures, Jigs and Gauges. GOM mbH, 2008. |
Kim, Jin Baek, and Arie Segev. "A web services-enabled marketplace architecture for negotiation process management." Decision Support Systems 40.1 (2005): 71-87; http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.194.7785&rep=rep1&type=pdf. |
Jaiswal, Ashutosh et al., “Design and Implementation of a Secure Multi-Agent Marketplace”, Elsevier Science, pp. 1-23, Jun. 3, 2004; http://magnet.cs.umn.edu/papers/Jaiswal04cera.pdf. |
http://www.bridgelinedigital.com/File%20Library/Repository/eCommerce/Sample-eCommerce-RFP-Template—Bridgeline-Digital.pdf. Sample RFP Template: Ecommerce Platform, Bridgeline Digital, 2014. |
Matchbook, Tealbook, http://www.matchbookinc.com/; Sep. 28, 2015. |
3Diligent, Source Smarter, http://www.3diligent.com/customer.html; Sep. 28, 2015. |
Dassault Systemes, Brochure, Mar. 24, 2010: New Features Type3ToCatia; http://www.type3.us/content/download/2202/405535/file/New%20Feature_Type3ToCatia_2010_US%20old.pdf. |
Xue, S., X. Y. Kou, and S. T. Tan. "Natural voice-enabled CAD: modeling via natural discourse." Computer-Aided Design and Applications 6.1 (2009): 125-136. |
Kou, X. Y., S. K. Xue, and S. T. Tan. “Knowledge-guided inference for voice-enabled CAD.” Computer-Aided Design 42.6 (2010): 545-557. |
Sharma, Anirudh, et al. “MozArt: a multimodal interface for conceptual 3D modeling.” Proceedings of the 13th international conference on multimodal interfaces. ACM, 2011. |
U.S. Appl. No. 14/267,447, Aug. 5, 2015, Office Action. |
U.S. Appl. No. 14/197,922, Nov. 26, 2014, Office Action. |
U.S. Appl. No. 14/197,922, Apr. 27, 2015, Response to Office Action. |
U.S. Appl. No. 14/197,922, May 15, 2015, Office Action. |
U.S. Appl. No. 14/267,447, Jun. 18, 2015, Response to Office Action. |
U.S. Appl. No. 14/263,665, Oct. 8, 2015, Office Action. |
U.S. Appl. No. 14/053,222, Jan. 29, 2016, Office Action. |
U.S. Appl. No. 14/311,943, Apr. 27, 2016, Office Action. |
U.S. Appl. No. 14/486,550, May 26, 2016, Office Action. |
U.S. Appl. No. 14/060,033, Jun. 15, 2016, Office Action. |
U.S. Appl. No. 14/172,462, Jul. 6, 2016, Office Action. |
U.S. Appl. No. 14/053,222, Jul. 29, 2016, Response to Office Action. |
U.S. Appl. No. 14/185,204, Jul. 29, 2016, Office Action. |
U.S. Appl. No. 14/062,947, Sep. 16, 2016, Office Action. |
U.S. Appl. No. 14/060,033, filed Oct. 22, 2013. |
U.S. Appl. No. 14/053,222, filed Oct. 14, 2013. |
U.S. Appl. No. 14/172,462, filed Oct. 16, 2013. |
U.S. Appl. No. 14/062,947, filed Oct. 25, 2013. |
U.S. Appl. No. 14/172,404, filed Feb. 4, 2014. |
U.S. Appl. No. 14/303,372, filed Jun. 12, 2014. |
U.S. Appl. No. 14/185,204, filed Feb. 20, 2014. |
U.S. Appl. No. 14/195,391, filed Mar. 3, 2014. |
U.S. Appl. No. 14/246,254, filed Apr. 7, 2014. |
U.S. Appl. No. 14/229,008, filed Mar. 28, 2014. |
U.S. Appl. No. 14/197,922, filed Mar. 5, 2014. |
U.S. Appl. No. 14/263,665, filed Apr. 28, 2014. |
U.S. Appl. No. 14/267,447, filed May 1, 2014. |
U.S. Appl. No. 14/311,943, filed Jun. 23, 2014. |