The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate one or more implementations and, together with the description, explain these implementations. In the drawings:
The following detailed description refers to the accompanying drawings. The same reference numbers in different drawings may identify the same or similar elements.
Software developers may utilize a variety of sources to learn how to program in a particular programming environment (e.g., a technical computing environment (TCE)). For example, the software developers may use manuals, text books, videos, etc. to learn how to program in a programming environment. A video may be a particularly helpful tool for teaching software developers since it may include visual use of features of the programming environment. However, any information (e.g., program code) shown in the video may need to be replicated by a software developer in the programming environment utilized by the software developer. For example, if the video shows how to enter code to create a particular model, the software developer would need to retype the code, shown in the video, into the programming environment in order to create the particular model.
Systems and/or methods described herein may provide TCE information associated with a computing environment (e.g., a TCE) into a video recording of the TCE. The TCE information may include inputs to the TCE and images displayed by the TCE based on the inputs. The video with the TCE information may be played at a device with a local TCE associated with a user. The user may select all or a portion of the TCE information, of the video, to be provided in the local TCE. Upon user selection, the selected TCE information may be provided to the local TCE. Alternatively, or additionally, the TCE information may be automatically streamed to the local TCE as the video is playing on the device, without user interaction.
The images displayed by the TCE, over a particular time period, may be provided to a recording device. In one example, the recording device may be incorporated within a device providing the TCE or may be separate from the device providing the TCE. As further shown in
As shown in
Such an arrangement may enable any information (e.g., TCE code) shown in the enriched video to be provided in the local TCE associated with the user, without the user having to replicate the information. The information may be automatically provided to the local TCE based on the user selection of the retrieval mechanism. Alternatively, or additionally, the information may be automatically streamed to the local TCE as the enriched video is playing on the device. For example, the enriched video may include information that instructs an application displaying the video to connect to a TCE process on the device. If the video is displayed in the TCE, the enriched video information may be directly used in the TCE. If the video is displayed in another application, the application may connect to the TCE through an inter-process communication means (e.g., based on socket communication). The application may start a TCE process before sending the information from the enriched video to the TCE. The communication of the enriched video information may rely on an application programming interface (API) between the application and the TCE for the purpose of communicating TCE commands and controlling the TCE execution. For example, the enriched video may contain information associated with an attempt to locate a process on a device that corresponds to a local TCE.
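The socket-based inter-process communication described above may be sketched as follows. This Python sketch is illustrative only: the JSON message format, the acknowledgment reply, and the stub standing in for a local TCE process are assumptions for the example, not an actual TCE protocol or API.

```python
# Sketch: an application playing an enriched video forwards an embedded
# TCE command to a local TCE process over a socket connection.
import json
import socket
import threading

def tce_stub(server_sock):
    """Minimal stand-in for a local TCE process listening for commands."""
    conn, _ = server_sock.accept()
    with conn:
        command = json.loads(conn.recv(4096).decode("utf-8"))
        # A real TCE would execute the command; the stub just acknowledges it.
        reply = {"status": "ok", "echo": command["code"]}
        conn.sendall(json.dumps(reply).encode("utf-8"))

def send_video_command(port, code):
    """Send one command extracted from the enriched video to the TCE."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(json.dumps({"code": code}).encode("utf-8"))
        return json.loads(sock.recv(4096).decode("utf-8"))

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # OS-assigned free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=tce_stub, args=(server,), daemon=True).start()

reply = send_video_command(port, "A = [1.1, 2.2, 3.3]")
print(reply["status"])
```

In a real arrangement, the application would first attempt to locate (or start) the TCE process and negotiate the port through the API mentioned above.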
The terms “code” and “program code,” as used herein, are to be used interchangeably and are to be broadly interpreted to include text-based code that may require further processing to execute (e.g., C++ code, Hardware Description Language (HDL) code, very-high-speed integrated circuits (VHSIC) HDL (VHDL) code, Verilog, Java, and/or other types of hardware or software based code that may be compiled and/or synthesized); binary code that may be executed (e.g., executable files that may directly be executed by an operating system, bitstream files that can be used to configure a field programmable gate array (FPGA), Java byte code, object files combined together with linker directives, source code, makefiles, etc.); text files that may be executed in conjunction with other executables (e.g., Python text files, a collection of dynamic-link library (DLL) files with text-based combining, configuration information that connects pre-compiled modules, an extensible markup language (XML) file describing module linkage, etc.); etc. In one example, code may include different combinations of the above-identified classes (e.g., text-based code, binary code, text files, etc.). Alternatively, or additionally, code may include a dynamically-typed programming language (e.g., the M language, a MATLAB® language, a MATLAB-compatible language, a MATLAB-like language, etc.) that can be used to express problems and/or solutions in mathematical notations. Alternatively, or additionally, code may be of any type, such as function, script, object, etc., and a portion of code may include one or more characters, lines, etc. of the code.
User interfaces, as described herein, may include graphical user interfaces (GUIs) or non-graphical user interfaces, such as text-based interfaces. The user interfaces may provide information to users via customized interfaces (e.g., proprietary interfaces) and/or other types of interfaces (e.g., browser-based interfaces, etc.). The user interfaces may receive user inputs via one or more input devices, may be user-configurable (e.g., a user may change the sizes of the user interfaces, information displayed in the user interfaces, color schemes used by the user interfaces, positions of text, images, icons, windows, etc., in the user interfaces, etc.), and/or may not be user-configurable. Information associated with the user interfaces may be selected and/or manipulated by a user of the TCE (e.g., via a touch screen display, a mouse, a keyboard, a keypad, voice commands, etc.).
Client device 210 may include one or more devices that are capable of communicating with server device 220 via network 230. For example, client device 210 may include a laptop computer, a personal computer, a tablet computer, a desktop computer, a workstation computer, a smart phone, a personal digital assistant (PDA), and/or other computation and communication devices.
Server device 220 may include one or more server devices, or other types of computation and communication devices, that gather, process, and/or provide information in a manner described herein. Server device 220 may include a device that is capable of communicating with client device 210 (e.g., via network 230). In one example, server device 220 may include one or more laptop computers, personal computers, workstation computers, servers, central processing units (CPUs), graphical processing units (GPUs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), etc. and/or software (e.g., a simulator) executing on the aforementioned devices. In one example, server device 220 may include TCE 240 and may perform some or all of the functionality described herein for client device 210. Alternatively, server device 220 may be omitted and client device 210 may perform all of the functionality described herein for client device 210.
Network 230 may include a network, such as a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), a telephone network, such as the Public Switched Telephone Network (PSTN), an intranet, the Internet, or a combination of networks.
TCE 240 may be provided within a computer-readable medium of client device 210. Alternatively, or additionally, TCE 240 may be provided in another device (e.g., server device 220) that is accessible by client device 210. TCE 240 may include hardware or a combination of hardware and software that provides a computing environment that allows users to perform tasks related to disciplines, such as, but not limited to, mathematics, science, engineering, medicine, business, etc., more efficiently than if the tasks were performed in another type of computing environment, such as an environment that required the user to develop code in a conventional programming language, such as C++, C, Fortran, Pascal, etc. In one implementation, TCE 240 may include a dynamically-typed programming language (e.g., the M language, a MATLAB® language, a MATLAB-compatible language, a MATLAB-like language, etc.) that can be used to express problems and/or solutions in mathematical notations.
For example, TCE 240 may use an array as a basic element, where the array may not require dimensioning. These arrays may be used to support array-based programming where an operation may apply to an entire set of values included in the arrays. Array-based programming may allow array-based operations to be treated as high-level programming that may allow, for example, operations to be performed on entire aggregations of data without having to resort to explicit loops of individual non-array operations. In addition, TCE 240 may be adapted to perform matrix and/or vector formulations that can be used for data analysis, data visualization, application development, simulation, modeling, algorithm development, etc. These matrix and/or vector formulations may be used in many areas, such as statistics, image processing, signal processing, control design, life sciences modeling, discrete event analysis and/or design, state based analysis and/or design, etc.
TCE 240 may further provide mathematical functions and/or graphical tools (e.g., for creating plots, surfaces, images, volumetric representations, etc.). In one implementation, TCE 240 may provide these functions and/or tools using toolboxes (e.g., toolboxes for signal processing, image processing, data plotting, parallel processing, etc.). Alternatively, or additionally, TCE 240 may provide these functions as block sets or in another way, such as via a library, etc.
TCE 240 may be implemented as a text-based environment (e.g., MATLAB software; Octave; Python; Comsol Script; MATRIXx from National Instruments; Mathematica from Wolfram Research, Inc.; Mathcad from Mathsoft Engineering & Education Inc.; Maple from Maplesoft; Extend from Imagine That Inc.; Scilab from The French Institution for Research in Computer Science and Control (INRIA); Virtuoso from Cadence; Modelica or Dymola from Dynasim; etc.); a graphically-based environment (e.g., Simulink® software, Stateflow® software, SimEvents® software, Simscape™ software, etc., by The MathWorks, Inc.; VisSim by Visual Solutions; LabView® by National Instruments; Dymola by Dynasim; SoftWIRE by Measurement Computing; WiT by DALSA Coreco; VEE Pro or SystemVue by Agilent; Vision Program Manager from PPT Vision; Khoros from Khoral Research; Gedae by Gedae, Inc.; Scicos from INRIA; Virtuoso from Cadence; Rational Rose from IBM; Rhapsody or Tau from Telelogic; Ptolemy from the University of California at Berkeley; aspects of a Unified Modeling Language (UML) or SysML environment; etc.); or another type of environment, such as a hybrid environment that includes one or more of the above-referenced text-based environments and one or more of the above-referenced graphically-based environments.
TCE 240 may include a programming language (e.g., the MATLAB language) that may be used to express problems and/or solutions in mathematical notations. The programming language may be dynamically typed and/or array-based. In a dynamically typed array-based computing language, data may be contained in arrays and data types of the data may be determined (e.g., assigned) at program execution time.
For example, suppose a program, written in a dynamically typed array-based computing language, includes the following statements:
A='hello'
A=int32([1, 2])
A=[1.1, 2.2, 3.3]
Now suppose the program is executed, for example, in a TCE, such as TCE 240. During run-time, when the statement "A='hello'" is executed, the data type of variable "A" may be a string data type. Later, when the statement "A=int32([1, 2])" is executed, the data type of variable "A" may be a 1-by-2 array containing elements whose data types are 32-bit integers. Later, when the statement "A=[1.1, 2.2, 3.3]" is executed, since the language is dynamically typed, the data type of variable "A" may be changed from the above 1-by-2 array to a 1-by-3 array containing elements whose data types are floating point. As can be seen from this example, data in a program written in a dynamically typed array-based computing language may be contained in an array. Moreover, the data type of the data may be determined during execution of the program. Thus, in a dynamically typed array-based computing language, data may be represented by arrays and data types of data may be determined at run-time.
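The retyping behavior described above can be illustrated in Python, which is also dynamically typed. This is an analogy only (Python is not the TCE); the `array` module is used here merely to stand in for a typed 1-by-2 integer array.

```python
# Analogy: a variable's type follows its most recent assignment,
# mirroring the three TCE statements discussed above.
import array

A = "hello"                      # A holds a string
assert isinstance(A, str)

A = array.array("i", [1, 2])     # A becomes an array of signed integers
assert A.typecode == "i" and len(A) == 2

A = [1.1, 2.2, 3.3]              # A becomes a sequence of floating point values
assert all(isinstance(x, float) for x in A)
```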
TCE 240 may provide mathematical routines and a high-level programming language suitable for non-professional programmers and may provide graphical tools that may be used for creating plots, surfaces, images, volumetric representations, or other representations. TCE 240 may provide these routines and/or tools using toolboxes (e.g., toolboxes for signal processing, image processing, data plotting, parallel processing, etc.). TCE 240 may also provide these routines in other ways, such as, for example, via a library, local or remote database (e.g., a database operating in a computing cloud), remote procedure calls (RPCs), and/or an application programming interface (API). TCE 240 may be configured to improve runtime performance when performing computing operations. For example, TCE 240 may include a just-in-time (JIT) compiler.
Recording device 250 may include one or more devices that are capable of communicating with client device 210 and/or server device 220, via network 230, and are capable of recording video generated by and/or inputs provided to client device 210 and/or server device 220. For example, recording device 250 may include a laptop computer, a personal computer, a tablet computer, a desktop computer, a workstation computer, a video camera, a digital camera with video capability, and/or other computation and communication devices. In one example, recording device 250 may be part of client device 210 or server device 220. Alternatively, or additionally, recording device 250 may be a separate device from client device 210 and server device 220.
Although
Processing unit 320 may include one or more processors, microprocessors, or other types of processing units that may interpret and execute instructions. Main memory 330 may include one or more random access memories (RAMs) or other types of dynamic storage devices that may store information and/or instructions for execution by processing unit 320. ROM 340 may include one or more ROM devices or other types of static storage devices that may store static information and/or instructions for use by processing unit 320. Storage device 350 may include a magnetic and/or optical recording medium and its corresponding drive.
Input device 360 may include a mechanism that permits a user to input information to device 300, such as a keyboard, a camera, an accelerometer, a gyroscope, a mouse, a pen, a microphone, voice recognition and/or biometric mechanisms, a remote control, a touch screen, a neural interface, etc. Output device 370 may include a mechanism that outputs information to the user, including a display, a printer, a speaker, etc. Communication interface 380 may include any transceiver-like mechanism that enables device 300 to communicate with other devices, networks, and/or systems. For example, communication interface 380 may include mechanisms for communicating with another device or system via a network.
As described herein, device 300 may perform certain operations in response to processing unit 320 executing software instructions contained in a computer-readable medium, such as main memory 330. A computer-readable medium may be defined as a non-transitory memory device. A memory device may include space within a single physical memory device or spread across multiple physical memory devices. The software instructions may be read into main memory 330 from another computer-readable medium, such as storage device 350, or from another device via communication interface 380. The software instructions contained in main memory 330 may cause processing unit 320 to perform processes described herein.
Alternatively, hardwired circuitry may be used in place of or in combination with software instructions to implement processes described herein. Thus, implementations described herein are not limited to any specific combination of hardware circuitry and software.
Although
As shown in
A user associated with client/server device 210/220 may utilize a keyboard, a mouse, etc. to input information (e.g., user inputs 530) to client/server device 210/220 and/or TCE 240. For example, the user may input TCE code that appears in the command window of TCE 240. In one implementation, client/server device 210/220 may record user inputs 530 from an API associated with an operating system executed by client/server device 210/220. Alternatively, or additionally, TCE 240 may record user inputs 530 (e.g., TCE code entered by the user, results of execution of the TCE code, model images, etc.) in a file, such as, for example, a script file, a hypertext markup language (HTML) file, etc. In some implementations, TCE 240 may obtain the file and may execute commands in the file. This may enable the user to precisely control (e.g., via a video publishing script) how a video should be created, how the video should be enriched, and how the video should be published.
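Recording user inputs into a script file, as described above, may be sketched as follows. The one-timestamped-command-per-line format is an illustrative assumption; a real recording might use an HTML file or another format, as noted above.

```python
# Sketch: log each user input (e.g., a TCE command) with the elapsed
# time since recording began, so playback can be synchronized with video.
import io
import time

class InputRecorder:
    def __init__(self, stream):
        self.stream = stream
        self.start = time.monotonic()

    def record(self, command):
        elapsed = time.monotonic() - self.start
        self.stream.write(f"{elapsed:.3f}\t{command}\n")

log = io.StringIO()
recorder = InputRecorder(log)
recorder.record("x = 0:0.1:10;")
recorder.record("plot(x, sin(x))")

lines = log.getvalue().splitlines()
print(len(lines))
```

A video publishing script could then replay such a file to control how the video is created and enriched.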
As further shown in
Returning to
As further shown in
Enriched video file 540 may include the video file embedded with user inputs 530 provided in the script file. For example, as shown in
In one example implementation, the user may programmatically add and/or delete user inputs 530 to/from enriched video file 540. For example, as shown in
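Programmatically adding and deleting user inputs 530, as described above, may be sketched with a simple in-memory model of the embedded metadata. The layout (a sorted list of timestamped entries) is an assumption for illustration, not an actual enriched-video file format.

```python
# Sketch: embedded user inputs as (timestamp, input) pairs that can be
# added, deleted, and queried by playback window.
class EnrichedVideo:
    def __init__(self):
        self.inputs = []            # (timestamp_seconds, tce_input) pairs

    def add_input(self, timestamp, tce_input):
        self.inputs.append((timestamp, tce_input))
        self.inputs.sort(key=lambda entry: entry[0])

    def delete_input(self, tce_input):
        self.inputs = [e for e in self.inputs if e[1] != tce_input]

    def inputs_at(self, start, end):
        """Inputs whose timestamps fall within a playback window."""
        return [e[1] for e in self.inputs if start <= e[0] < end]

video = EnrichedVideo()
video.add_input(12.0, "A = magic(4);")
video.add_input(3.5, "x = linspace(0, 1);")
video.delete_input("A = magic(4);")
print(video.inputs_at(0.0, 60.0))
```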
Alternatively, or additionally, the user may interact with a video file created by recording device 250. For example, as shown in
The user may associate further properties with object 730, such as information that may be retrieved by interacting with object 730 during playback of video file 710. Object tracking may enable the associated information (e.g., identifier 750 and the further properties) to be retrieved through interaction with object 730 during playback of video file 710. The user may include this information by adding code 760 to video file 710. In one example, code 760 may include the following syntax:
Alternatively, or additionally, as shown in
Although
As shown in
As further shown in
Returning to
In one example implementation, the user associated with client/server device 210/220 may instruct client/server device 210/220 to play enriched video file 540, and client/server device 210/220 may play enriched video file 540, as indicated by reference number 1110 in
Alternatively, or additionally, the user associated with client/server device 210/220 may instruct client/server device 210/220 to play enriched video file 540, and client/server device 210/220 may play enriched video file 540, as indicated by reference number 1210 in
In one example, the information associated with the particular block 1230 may include a set of parameters associated with the particular block 1230, such as parameters associated with an appearance of the particular block 1230 (e.g., a foreground color, a background color, presence of a drop shadow, a block image, etc.). Alternatively, or additionally, the information associated with the particular block 1230 may include information about an execution behavior of the particular block 1230, such as a sample time, port data types, fixed point scaling, etc. Alternatively, or additionally, the information associated with the particular block 1230 may include information about a code generation configuration of the particular block 1230, such as whether to create a function-call, how to name a function-call, etc. In some implementations, the information that is retrieved may be stored in corresponding parameters of a block identified in a model that is open in the local TCE (e.g., by selecting the block before selecting the retrieval affordance). For example, a sample time of a block shown in the enriched video may be retrieved by selecting a corresponding graphical affordance and stored in a sample time parameter of a block in a graphical model that is open in a local TCE.
Alternatively, or additionally, the user associated with client/server device 210/220 may instruct client/server device 210/220 to play enriched video file 540, and client/server device 210/220 may play enriched video file 540, as indicated by reference number 1310 in
Playing video file 1310 may include a retrieval mechanism 1340 (e.g., an icon, an image, a button, a menu, etc.) that may enable the user to store configuration information 1330 for direct access by the local TCE 240 provided by client/server device 210/220. For example, if the user selects retrieval mechanism 1340, client/server device 210/220 may store configuration information 1330 in a memory used by the local TCE 240 and/or in a repository associated with persistent memory. The local TCE 240 may retrieve configuration information 1330 and may display TCE model 1320 and/or configuration information 1330 to the user. In one example, retrieving configuration information 1330 may cause client/server device 210/220 to create an object in the local TCE 240. For example, a configuration set object may be created in the local TCE 240, and values corresponding to configuration information 1330 may be set accordingly in the created object. In some implementations, the configuration information may be assigned to a model in the local TCE 240 (e.g., a model loaded in memory, a model displayed on the screen, a model selected in a file system content display, etc.).
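Creating a configuration set object from retrieved configuration information 1330, as described above, may be sketched as follows. The class name and the parameter names (solver, stop time, etc.) are illustrative assumptions, not an actual TCE API.

```python
# Sketch: build a configuration set object in the local TCE and set its
# values from configuration information retrieved from the enriched video.
class ConfigurationSet:
    def __init__(self):
        self.params = {}

    def set_param(self, name, value):
        self.params[name] = value

def apply_retrieved_configuration(retrieved):
    """Create a configuration set and fill it from retrieved video info."""
    config = ConfigurationSet()
    for name, value in retrieved.items():
        config.set_param(name, value)
    return config

retrieved = {"Solver": "ode45", "StopTime": "10", "FixedStep": "auto"}
config = apply_retrieved_configuration(retrieved)
print(config.params["Solver"])
```

The resulting object could then be assigned to a model in the local TCE, as noted above.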
Alternatively, or additionally, the user associated with client/server device 210/220 may instruct client/server device 210/220 to play enriched video file 540, and client/server device 210/220 may play enriched video file 540, as indicated by reference number 1410 in
In one example, retrieval mechanism 1440 may include a subset of parameters 1450 associated with block 1430. As shown in
In one example implementation, client/server device 210/220 may receive a video file that displays use of a TCE 240 but does not include TCE information embedded in the video file. In such a situation, client/server device 210/220 may extract the TCE information from the video file with image processing techniques. With reference to
Client/server device 210/220 may perform image processing techniques (e.g., optical character recognition (OCR), image recognition, etc.) on playing video file 1510 in order to extract TCE information from playing video file 1510. As shown in
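The extraction pipeline described above may be sketched as follows. The `ocr()` function is a stand-in (a real system might use an OCR engine such as Tesseract), frames are simulated as text, and the pattern used to recognize TCE code lines is an illustrative assumption.

```python
# Sketch: scan video frames, OCR each one, and collect deduplicated
# lines that look like TCE code (assignments), skipping menu chrome.
import re

def ocr(frame):
    """Stand-in OCR: in this sketch a 'frame' is already text."""
    return frame

def extract_tce_code(frames):
    """Collect unique lines that look like TCE assignments."""
    code_pattern = re.compile(r"^\s*\w+\s*=.*$")
    seen, code = set(), []
    for frame in frames:
        for line in ocr(frame).splitlines():
            if code_pattern.match(line) and line not in seen:
                seen.add(line)
                code.append(line.strip())
    return code

frames = [
    "File  Edit  View\nA = int32([1, 2])",
    "File  Edit  View\nA = int32([1, 2])\nB = A * 2",
]
extracted = extract_tce_code(frames)
print(extracted)
```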
Alternatively, or additionally, client/server device 210/220 may recognize a version of and/or license information associated with TCE 240 provided in a playing video file, and may utilize a correct version and/or license information at the local TCE 240. In one example, as long as the video file is playing, client/server device 210/220 may permit the local TCE 240 to utilize the TCE information in the playing video file even if client/server device 210/220 is not licensed for the TCE information.
In some implementations, a user may utilize an authoring tool for generating enriched videos either directly or by means of creating a publishing script to generate the enriched videos. The authoring tool may include an interactive tool with video editing capabilities and editing capabilities for TCE interaction commands.
In some implementations, the enriched video may be rewound to a previous point in time and/or forwarded to a future point in time and the state of the TCE (including which graphical models are open, which figures are showing, which variables are in a workspace, values of the variables in the workspace, which files are open, which connections to other open applications are established, etc.) may be set accordingly. For example, an authoring environment may permit identification of a specific frame of a video stream and the user may associate a corresponding time stamp of that frame with an action that the TCE may take. The authoring tool may execute in synchrony with the TCE and store the state of the TCE at various times. The stored state of the TCE may be associated with the corresponding time stamps of the video. When moving back and forth through the video stream (e.g., fast forward, rewind, jump to a frame, etc.), the state of the TCE may be kept consistent with the location in the video.
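Keeping the TCE state consistent while scrubbing through the video, as described above, may be sketched as follows: states are snapshotted at recorded timestamps, and seeking restores the latest snapshot at or before the target time. The snapshot contents (a workspace dictionary) are an illustrative assumption.

```python
# Sketch: map video timestamps to stored TCE states so rewinding or
# fast-forwarding restores the state recorded at that point in time.
import bisect

class StateTimeline:
    def __init__(self):
        self.timestamps = []         # sorted recording times (seconds)
        self.snapshots = []          # TCE state captured at each time

    def snapshot(self, timestamp, state):
        self.timestamps.append(timestamp)
        self.snapshots.append(dict(state))

    def state_at(self, timestamp):
        """Latest snapshot at or before the requested playback time."""
        i = bisect.bisect_right(self.timestamps, timestamp) - 1
        return self.snapshots[i] if i >= 0 else {}

timeline = StateTimeline()
timeline.snapshot(0.0, {"workspace": {}})
timeline.snapshot(5.0, {"workspace": {"A": "int32 [1, 2]"}})
timeline.snapshot(9.0, {"workspace": {"A": "[1.1, 2.2, 3.3]"}})
print(timeline.state_at(7.0))
```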
In some implementations, the TCE may analyze a video stream and automatically identify TCE-related objects in the video (e.g., diagrams of graphical models, an image of a modeled system, scripts of TCE code, etc.). The corresponding objects may have their location on the local file system listed, have their URL on a remote repository listed, and/or be opened for access by the user. For example, if the analysis determines that a particular model of a power window is shown in a video, the location of this (or a related) power window model may be presented to the user.
Although
The foregoing description of implementations provides illustration and description, but is not intended to be exhaustive or to limit the implementations to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of the implementations.
For example, while series of blocks have been described with regard to
It will be apparent that example aspects, as described above, may be implemented in many different forms of software, firmware, and hardware in the implementations illustrated in the figures. The actual software code or specialized control hardware used to implement these aspects should not be construed as limiting. Thus, the operation and behavior of the aspects were described without reference to the specific software code—it being understood that software and control hardware could be designed to implement the aspects based on the description herein.
Further, certain portions of the implementations may be implemented as a “component” that performs one or more functions. This component may include hardware, such as a processor, an application-specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), or a combination of hardware and software.
Even though particular combinations of features are recited in the claims and/or disclosed in the specification, these combinations are not intended to limit the disclosure of the specification. In fact, many of these features may be combined in ways not specifically recited in the claims and/or disclosed in the specification. Although each dependent claim listed below may directly depend on only one other claim, the disclosure of the specification includes each dependent claim in combination with every other claim in the claim set.
No element, act, or instruction used in the present application should be construed as critical or essential unless explicitly described as such. Also, as used herein, the article “a” is intended to include one or more items. Where only one item is intended, the term “one” or similar language is used. Further, the phrase “based on” is intended to mean “based, at least in part, on” unless explicitly stated otherwise.
This application is a continuation-in-part of U.S. patent application Ser. No. 13/185,318, filed on Jul. 18, 2011, which is a divisional of U.S. patent application Ser. No. 11/687,510, filed on Mar. 16, 2007 (now U.S. Pat. No. 8,005,812). The entire contents of U.S. patent application Ser. Nos. 11/687,510 and 13/185,318 are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4862376 | Ferriter et al. | Aug 1989 | A |
5311438 | Sellers et al. | May 1994 | A |
5980096 | Thalhammer-Reyero | Nov 1999 | A |
6240411 | Thearling | May 2001 | B1 |
6279006 | Shigemi et al. | Aug 2001 | B1 |
6415320 | Hess et al. | Jul 2002 | B1 |
6445782 | Elfe et al. | Sep 2002 | B1 |
6658393 | Basch et al. | Dec 2003 | B1 |
6714201 | Grinstein et al. | Mar 2004 | B1 |
6961688 | Bankes | Nov 2005 | B2 |
6968538 | Rust et al. | Nov 2005 | B2 |
7272618 | Bisotti et al. | Sep 2007 | B1 |
7334216 | Molina-Moreno et al. | Feb 2008 | B2 |
7373317 | Kopelman et al. | May 2008 | B1 |
7451065 | Pednault et al. | Nov 2008 | B2 |
7512932 | Davidov et al. | Mar 2009 | B2 |
7542892 | Clark et al. | Jun 2009 | B1 |
7650432 | Bosworth et al. | Jan 2010 | B2 |
7747648 | Kraft et al. | Jun 2010 | B1 |
7788123 | Ekhaus et al. | Aug 2010 | B1 |
7809770 | Jain et al. | Oct 2010 | B2 |
7890378 | Clarke et al. | Feb 2011 | B2 |
7934194 | Kinnucan et al. | Apr 2011 | B2 |
8005812 | Mosterman et al. | Aug 2011 | B1 |
8181150 | Szpak et al. | May 2012 | B2 |
8359304 | Mosterman et al. | Jan 2013 | B1 |
20010013009 | Greening | Aug 2001 | A1 |
20010026272 | Feld et al. | Oct 2001 | A1 |
20020019971 | Zygmont et al. | Feb 2002 | A1 |
20020026390 | Ulenas et al. | Feb 2002 | A1 |
20020029136 | Hagiwara et al. | Mar 2002 | A1 |
20020042835 | Pepin et al. | Apr 2002 | A1 |
20020123874 | Rosener et al. | Sep 2002 | A1 |
20020129059 | Eck | Sep 2002 | A1 |
20020143800 | Lindberg et al. | Oct 2002 | A1 |
20020169789 | Kutay et al. | Nov 2002 | A1 |
20030018953 | Aberg | Jan 2003 | A1 |
20030036975 | Martin et al. | Feb 2003 | A1 |
20030065663 | Chu | Apr 2003 | A1 |
20030140126 | Budhiraja et al. | Jul 2003 | A1 |
20030176931 | Pednault et al. | Sep 2003 | A1 |
20030187534 | Suzuki et al. | Oct 2003 | A1 |
20030191618 | Gabele et al. | Oct 2003 | A1 |
20030195921 | Becker et al. | Oct 2003 | A1 |
20030220911 | Tompras et al. | Nov 2003 | A1 |
20040034652 | Hofmann et al. | Feb 2004 | A1 |
20040054690 | Hillerbrand et al. | Mar 2004 | A1 |
20040064349 | Humenansky et al. | Apr 2004 | A1 |
20040215599 | Apps et al. | Oct 2004 | A1 |
20040243483 | Baumann et al. | Dec 2004 | A1 |
20050004930 | Hatta | Jan 2005 | A1 |
20050015363 | Dessloch et al. | Jan 2005 | A1 |
20050021435 | Hakanoglu et al. | Jan 2005 | A1 |
20050076294 | DeHamer et al. | Apr 2005 | A1 |
20050114229 | Ackley et al. | May 2005 | A1 |
20050165822 | Yeung et al. | Jul 2005 | A1 |
20050187717 | Paxson et al. | Aug 2005 | A1 |
20050187745 | Lurie et al. | Aug 2005 | A1 |
20050187747 | Paxson et al. | Aug 2005 | A1 |
20050193269 | Haswell et al. | Sep 2005 | A1 |
20050198646 | Kortela | Sep 2005 | A1 |
20050251755 | Mullins et al. | Nov 2005 | A1 |
20050268171 | House et al. | Dec 2005 | A1 |
20050289123 | Dettinger et al. | Dec 2005 | A1 |
20060004852 | Abraham et al. | Jan 2006 | A1 |
20060026168 | Bosworth et al. | Feb 2006 | A1 |
20060053014 | Yoshizawa | Mar 2006 | A1 |
20060168577 | Melo et al. | Jul 2006 | A1 |
20060173663 | Langheier et al. | Aug 2006 | A1 |
20060200795 | MacLay | Sep 2006 | A1 |
20070037214 | Luo et al. | Feb 2007 | A1 |
20070050201 | Gardner et al. | Mar 2007 | A1 |
20070073837 | Johnson-McCormick et al. | Mar 2007 | A1 |
20070073894 | Erickson et al. | Mar 2007 | A1 |
20070078529 | Thiele et al. | Apr 2007 | A1 |
20070083421 | McNair et al. | Apr 2007 | A1 |
20070112714 | Fairweather | May 2007 | A1 |
20070143266 | Tang et al. | Jun 2007 | A1 |
20070174290 | Narang et al. | Jul 2007 | A1 |
20070229537 | Kohli et al. | Oct 2007 | A1 |
20070288885 | Brunel et al. | Dec 2007 | A1 |
20070300179 | Friedlander | Dec 2007 | A1 |
20080004993 | Horspool et al. | Jan 2008 | A1 |
20080005076 | Payne et al. | Jan 2008 | A1 |
20080010597 | Seemann et al. | Jan 2008 | A1 |
20080126022 | Hoguet | May 2008 | A1 |
20080126394 | Jain et al. | May 2008 | A1 |
20080215583 | Gunawardena et al. | Sep 2008 | A1 |
20090182450 | Goldschmidt | Jul 2009 | A1 |
20100020075 | Edecker et al. | Jan 2010 | A1 |
20100030734 | Chunilal | Feb 2010 | A1 |
20110099474 | Grossman et al. | Apr 2011 | A1 |
20110153524 | Schnackel | Jun 2011 | A1 |
20110161054 | Woolf et al. | Jun 2011 | A1 |
20110191676 | Guttman | Aug 2011 | A1 |
Entry |
---|
www.3dexport.com, “3dexport.com—online 3d models shop”, http://web.archive.org/web/20051210033500/http://www.3dexport.com, 2004-2005, 2 pages. |
Co-pending U.S. Appl. No. 13/185,342, filed Jul. 18, 2011 entitled “Collaborative Modeling Environment” by Pieter J. Mosterman et al., 83 pages. |
Co-pending U.S. Appl. No. 13/185,359, filed Jul. 18, 2011 entitled “Collaborative Modeling Environment” by Pieter J. Mosterman et al., 83 pages. |
Co-pending U.S. Appl. No. 13/185,318, filed Jul. 18, 2011 entitled “Collaborative Modeling Environment” by Pieter J. Mosterman et al., 83 pages. |
Co-pending U.S. Appl. No. 13/185,374, filed Jul. 18, 2011 entitled “Collaborative Modeling Environment” by Pieter J. Mosterman et al., 83 pages. |
Dynast Features, http://dynast.net/contents.html, Aug. 1, 2007 (print date), 1 page. |
eBay—New & Used electronics, cars, apparel, collectibles, sporting goods & more at low . . . , http://ebay.com, Aug. 1, 2007 (print date), 2 pages. |
eBay, Inc., http://web.archive.org/web/20050424074640/http://www.ebay.com, Apr. 24, 2005, 1 page. |
Exporting a Model to the Web: Exporting Simulink Models to Web Viewers (Report Generator), http://www.mathworks.com/access/helpdesk/help/toolbox/rptgen/ug/bqmz372-1.html, Aug. 1, 2007 (print date), 3 pages. |
iGoogle, http://www.google.com/ig?hl=en, Aug. 1, 2007 (print date), 1 page. |
Microsoft Office Online: Help and How-to: About finding files, http://office.microsoft.com/assistance/htws.aspx?AssetID=HP850527431033&CTT=1&or . . . , Mar. 13, 2007 (print date), 2 pages. |
Microsoft Office Online: Help and How-to: Find a file, http://office.microsoft.com/assistance/hfws.aspx?AssetID=HP010182231033&CTT=1&Or . . . , Mar. 13, 2007 (print date), 2 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,243, dated Sep. 23, 2011, 14 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,318, dated Dec. 8, 2011, 58 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,318, dated May 25, 2012, 48 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,342, dated Mar. 15, 2012, 43 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,342, dated Sep. 10, 2012, 36 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,359, dated May 24, 2012, 57 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,359, dated Sep. 27, 2011, 31 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,359, dated Dec. 19, 2011, 46 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,374, dated Jul. 2, 2012, 42 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,374, dated Sep. 28, 2011, 35 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,374, dated Dec. 16, 2011, 33 pages. |
Office Action from corresponding U.S. Appl. No. 13/185,243, dated Apr. 24, 2012, 60 pages. |
The MathWorks—MATLAB Central—File Exchange, http://www.mathworks.com/matlabcentral/fileexchange/loadCategory.do, Aug. 1, 2007 (print date), 1 page. |
The MathWorks—Simulink®—Simulation and Model-Based Design, http://www.mathworks.com/products/simulink, Aug. 1, 2007 (print date), 2 pages. |
http://www.physiome.org, Mar. 20, 2008 (print date), 1 page. |
http://opencores.org/projects.cgi/web/opencores/missions, Mar. 20, 2008 (print date), 2 pages. |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 11687510 | Mar 2007 | US |
Child | 13185318 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 13185318 | Jul 2011 | US |
Child | 13827887 | | US |