The present disclosure relates generally to technologies for presenting virtual scenarios. More particularly, in certain embodiments, the present disclosure is related to a system for predictive virtual scenario presentation.
Electronic display technology is rapidly developing to support the presentation of virtual environments. For example, augmented reality and virtual reality technology can present virtual environments.
As described above, augmented reality and virtual reality technology is typically used to provide user amusement. The virtual environments presented by previous augmented reality and virtual reality technologies have little or no additional usefulness to the user. This disclosure recognizes that the ability to efficiently visualize possible outcome scenarios of prospective activities would significantly improve the performance of systems used to provide virtual representations, as well as the utility of the information provided in these virtual representations. Previous technology fails to provide tools for such a purpose.
Certain embodiments of this disclosure may be integrated into the practical application of a virtual scenario presentation device that provides improvements to previous technology, including those identified above. The virtual scenario presentation device presents virtual representations of potential scenarios based on selectable activities. For example, the disclosed system and device provide several technical advantages over previous augmented reality and virtual reality technology, which include: (1) the generation and presentation of improved virtual presentations that include representations of possible outcomes that may result in response to certain activities; (2) the determination and presentation of resources consumed in response to certain activities, thereby facilitating more efficient resource consumption; and (3) the presentation of immersive visual representations showing how an activity may be perceived over time (e.g., as the environment changes, the item changes, etc.). Through these technical improvements, the disclosed system and device provide more accurate and more readily interpretable visual representations of possible outcomes of possible activities and thus provide an improvement to augmented reality and virtual reality technology. As an example, the virtual scenario presentation device may provide a more accurate and usable representation of resource consumption over time in response to a prospective activity, which may facilitate the more efficient use of resources. As such, this disclosure may improve the function of computer systems used for providing augmented reality and/or virtual reality presentations. In some embodiments, the virtual scenario presentation device transforms certain pieces of descriptive information (e.g., from repositories of activity data and/or user data) into immersive virtual presentations that accurately reflect the possible outcomes of certain activities.
In an embodiment, a system includes an activity data repository that stores activity data. The activity data includes one or more of item properties associated with items involved with possible activities, environment properties associated with environments in which the possible activities take place, and resource properties corresponding to resources consumed to engage in the possible activities. A scenario presentation device includes a network interface communicatively coupled to the activity data repository, an electronic display, and a processor operatively coupled to the network interface and the electronic display. The processor presents, in the electronic display, one or more selectable options corresponding to possible activities that can be performed and a representation of an environment associated with the possible activities. The processor receives a selection of a first scenario option. The selected first scenario option corresponds to a selected activity, which itself corresponds to an action involving an item to be performed over a period of time. The processor determines, using the selected activity and the environment properties of the activity data, anticipated changes to the environment over the period of time. The processor determines, using the anticipated changes to the environment over the period of time, a projected environment representation for the selected activity. The projected environment representation includes one or more images depicting changes to the environment over the period of time associated with the selected activity. The projected environment representation determined for the selected activity is presented in the electronic display.
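The processing flow recited above can be sketched in code. The following Python sketch is illustrative only; every name, data shape, and change model below is an assumption for exposition and is not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class ActivityData:
    """Hypothetical stand-in for records in the activity data repository."""
    item_properties: dict
    environment_properties: dict
    resource_properties: dict

def anticipated_environment_changes(selected_activity: dict,
                                    env_props: dict) -> list[float]:
    """Project a cumulative environment-change value for each year of the period."""
    years = selected_activity["period_years"]
    rate = env_props.get("yearly_change", 0.0)  # assumed linear change model
    return [rate * y for y in range(1, years + 1)]

def projected_environment_representation(changes: list[float]) -> list[str]:
    """Map each anticipated change to an image identifier for the display."""
    return [f"env_frame_year_{i + 1}" for i in range(len(changes))]

# Illustrative repository contents and a selected activity.
data = ActivityData(item_properties={"automobile": {"condition": 1.0}},
                    environment_properties={"yearly_change": 0.1},
                    resource_properties={"cost_per_year": 100.0})
selected = {"action": "acquire", "item": "automobile", "period_years": 3}
changes = anticipated_environment_changes(selected, data.environment_properties)
frames = projected_environment_representation(changes)
```

In this sketch, the returned image identifiers stand in for the one or more images depicting changes to the environment over the period of time; an actual device would render or retrieve images rather than strings.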
Certain embodiments of this disclosure may include some, all, or none of these advantages. These advantages and other features will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings and claims.
For a more complete understanding of this disclosure, reference is now made to the following brief description, taken in connection with the accompanying drawings and detailed description, wherein like reference numerals represent like parts.
This disclosure provides solutions to the aforementioned and other problems of previous technology by providing the practical application of a virtual scenario presentation device that determines improved information about possible outcomes to prospective activities. The scenario presentation device may provide users information that was not previously available in a format that is both user-friendly and readily interpretable (e.g., as one or more images or as immersive visual presentation). An example of a system in which the scenario presentation device may operate is shown in
Example System for Predictive Virtual Scenario Presentation
The scenario presentation device 102 is configured to determine a scenario representation for an activity selected by a user of the scenario presentation device 102. For instance, the scenario presentation device 102 may use a selected activity (see
In some embodiments, the scenario presentation device 102 is a wearable device, such as a wearable virtual reality device, such that the scenario representations 110 are an immersive experience for the user of the scenario presentation device 102. In some embodiments, the scenario presentation device 102 is an augmented reality device (e.g., a wearable device, a smart phone, or the like). In such embodiments, the scenario presentation device 102 may overlay images determined as a scenario representation 110 on images captured by a camera 112 (e.g., such that the scenario representation 110 appears to be presented in the local environment of the scenario presentation device 102). In some embodiments, the scenario presentation device 102 is another device that provides for the presentation of the scenario presentation(s) 110 (e.g., as visual or audiovisual presentation(s)).
The scenario presentation device 102 includes at least a processor 104, a memory 106, and a display 108. Further embodiments may include a camera 112, a wireless communication interface 114, a network interface 116, a microphone 118, a global position system (GPS) sensor 120, and/or one or more input devices 122. The scenario presentation device 102 may be configured as shown or in any other suitable configuration. For example, the scenario presentation device 102 may include one or more additional components, and/or one or more shown components may be omitted.
The processor 104 includes one or more processors operably coupled to and in signal communication with memory 106, display 108, camera 112, wireless communication interface 114, network interface 116, microphone 118, GPS sensor 120, and input devices 122. Processor 104 is configured to receive and transmit electrical signals among one or more of memory 106, display 108, camera 112, wireless communication interface 114, network interface 116, microphone 118, GPS sensor 120, and input devices 122. The electrical signals are used to send and receive data (e.g., images captured from camera 112, scenario representations 110 to display on display 108, etc.) and/or to control or communicate with other devices.
The processor 104 is any electronic circuitry including, but not limited to, state machines, one or more central processing unit (CPU) chips, logic units, cores (e.g., a multi-core processor), field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), or digital signal processors (DSPs). The processor 104 may be a programmable logic device, a microcontroller, a microprocessor, or any suitable combination of the preceding. The one or more processors are configured to process data and may be implemented in hardware or software. For example, the processor 104 may be 8-bit, 16-bit, 32-bit, 64-bit or of any other suitable architecture. The processor 104 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components.
The one or more processors are configured to implement various instructions, including the scenario generation instructions 124, described further below. For example, the one or more processors are configured to execute instructions to implement the function disclosed herein, such as some or all of those described with respect to
The memory 106 is operable to store any of the information described with respect to
Display 108 is configured to present visual information, such as scenario representations 110 and images from camera 112, to a user (e.g., user 202 of
Examples of camera 112 include, but are not limited to, charge-coupled device (CCD) cameras and complementary metal-oxide semiconductor (CMOS) cameras. Camera 112 is configured to capture images of an environment local to the scenario presentation device 102. Camera 112 may be configured to capture images continuously, at predetermined intervals, or on-demand. For example, camera 112 may be configured to receive a command to capture an image. In another example, camera 112 is configured to continuously capture images to form a video stream. Camera 112 is communicably coupled to processor 104.
Examples of wireless communication interface 114 include, but are not limited to, a Bluetooth interface, an RFID interface, an NFC interface, a local area network (LAN) interface, a personal area network (PAN) interface, a wide area network (WAN) interface, a Wi-Fi interface, a ZigBee interface, or any other suitable wireless communication interface as would be appreciated by one of ordinary skill in the art upon viewing this disclosure. Wireless communication interface 114 is configured to facilitate processor 104 in communicating with other devices. For example, wireless communication interface 114 is configured to enable processor 104 to send and receive signals with other devices, such as a wireless input device 122. Wireless communication interface 114 is configured to employ any suitable communication protocol.
The network interface 116 is configured to enable wired and/or wireless communications. The network interface 116 is configured to communicate data between the scenario presentation device 102 and other network devices, systems, or domain(s), such as network 144. For example, the network interface 116 may comprise a WIFI interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, or a router. The processor 104 is configured to send and receive data using the network interface 116. The network interface 116 may be configured to use any suitable type of communication protocol as would be appreciated by one of ordinary skill in the art.
Microphone 118 is configured to capture audio signals (e.g. voice signals or commands) from a user, such as first user 202 of
GPS sensor 120 is configured to capture and to provide geographical location information. For example, GPS sensor 120 is configured to provide a geographic location of a user, such as user 202 of
Input device(s) 122 are configured to capture a user input for use by the scenario presentation device 102. For instance, a user input may be provided to indicate a selection of an activity for which a scenario representation 110 should be generated and presented (see
Information about the user of the scenario presentation device 102 (e.g., user 202 of
The memory 130 of the user data repository 126 is operable to store any data, instructions, logic, rules, or code operable to execute the functions of the user data repository 126. The memory 130 may store the user data 134 as well as any other logic, code, rules, and the like to execute functions of the user data repository 126. As illustrated in
The network interface 132 of the user data repository 126 is configured to enable wired and/or wireless communications. The network interface 132 is configured to communicate data between the user data repository 126 and other network devices, systems, or domain(s), such as the network 144 and scenario presentation device 102. The network interface 132 is an electronic circuit that is configured to enable communications between devices. For example, the network interface 132 may include one or more serial ports (e.g., USB ports or the like) and/or parallel ports (e.g., any type of multi-pin port) for facilitating this communication. As a further example, the network interface 132 may include a WIFI interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, or a router. The processor 128 is configured to send and receive data using the network interface 132. The network interface 132 may be configured to use any suitable type of communication protocol as would be appreciated by one of ordinary skill in the art. The network interface 132 communicates user data 134 to the scenario presentation device 102.
Information about possible activities for which scenario representations 110 may be determined may be stored as activity data 142 in the activity data repository 134 and accessed as necessary by the scenario presentation device 102. The example activity data repository 134 of
The memory 138 of the activity data repository 134 is operable to store any data, instructions, logic, rules, or code operable to execute the functions of the activity data repository 134. The memory 138 may store activity data 142 as well as any other logic, code, rules, and the like to execute functions of the activity data repository 134. As illustrated in
The network interface 140 of the activity data repository 134 is configured to enable wired and/or wireless communications. The network interface 140 is configured to communicate data between the activity data repository 134 and other network devices, systems, or domain(s), such as the network 144 and scenario presentation device 102. The network interface 140 is an electronic circuit that is configured to enable communications between devices. For example, the network interface 140 may include one or more serial ports (e.g., USB ports or the like) and/or parallel ports (e.g., any type of multi-pin port) for facilitating this communication. As a further example, the network interface 140 may include a WIFI interface, a local area network (LAN) interface, a wide area network (WAN) interface, a modem, a switch, or a router. The processor 136 is configured to send and receive data using the network interface 140. The network interface 140 may be configured to use any suitable type of communication protocol as would be appreciated by one of ordinary skill in the art. The network interface 140 communicates activity data 142 to the scenario presentation device 102.
The communication network 144 may facilitate communication within the system 100. This disclosure contemplates the communication network 144 being any suitable network operable to facilitate communication between the scenario presentation device 102, the user data repository 126, and the activity data repository 134. Communication network 144 may include any interconnecting system capable of transmitting audio, video, signals, data, messages, or any combination of the preceding. Communication network 144 may include all or a portion of a public switched telephone network (PSTN), a public or private data network, a local area network (LAN), a metropolitan area network (MAN), a wide area network (WAN), a local, regional, or global communication or computer network, such as the Internet, a wireline or wireless network, an enterprise intranet, or any other suitable communication link, including combinations thereof, operable to facilitate communication between the components. In other embodiments, system 100 may not have all of the components listed and/or may have other elements instead of, or in addition to, those listed above.
Example Operation of the Scenario Presentation Device
The selected activity type 204 is used at step 210 to present one or more selectable activity options 214 and/or an environment representation 212 to the user 202 on the display 108. The activity options 214 correspond to possible activities that the user 202 can perform. For example, activity options 214 may correspond to activities the user 202 may consider performing within the class of activities identified by the selected activity type 204. For instance, if an activity type 204 of acquiring an item is selected, then a number of possible items may be presented as the activity options 214. The environment representation 212 is a representation (e.g., one or more images, a video, or the like) of an environment associated with the possible activities. For example, the environment representation 212 may include images captured by the camera 112 of the scenario presentation device 102. As another example, an environment representation 212 may be previously stored or generated for an environment associated with the activity options 214 or corresponding to a user location (e.g., determined from GPS sensor 120). For instance, if activity options 214 correspond to actions 218 that can be performed in different locations, one or more of these locations may be presented as the environment representation 212. An action 218 may be associated with an item 216 and a time 220. For instance, the action 218 may be to acquire the item 216 at a given time 220. The time 220 may be a time period over which the item 216 may be used.
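The mapping at step 210 from a selected activity type 204 to presentable activity options 214 could be organized as a simple catalog lookup. The catalog contents and key names below are hypothetical, chosen only to mirror the action 218 / item 216 / time 220 structure described above.

```python
# Hypothetical catalog keyed by activity type; each option pairs an action,
# an item, and a time period, mirroring elements 218, 216, and 220 above.
ACTIVITY_CATALOG = {
    "acquire_item": [
        {"action": "purchase", "item": "automobile", "period_years": 5},
        {"action": "purchase", "item": "bicycle", "period_years": 3},
    ],
    "travel": [
        {"action": "book", "item": "flight", "period_years": 1},
    ],
}

def activity_options_for(activity_type: str) -> list[dict]:
    """Return the selectable activity options for a selected activity type."""
    return ACTIVITY_CATALOG.get(activity_type, [])

options = activity_options_for("acquire_item")
```

A device implementing step 210 would then render one selectable control per returned option alongside the environment representation 212.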
At step 222, a selection 224 of one of the scenario options 214 is received. The selected scenario option 214 corresponds to a selected activity 226 for the user 202. The selected activity 226 is an action that might be performed by the user 202 at a future time and for which the user 202 wishes to view a representation of a possible outcome in the form of a scenario representation 110. As a few non-limiting examples, the selected activity 226 may be for the user 202 to acquire an item (e.g., an automobile as illustrated in the examples of
As illustrated in
The user data 134 may include user properties 244 and/or user resources 246. The user properties 244 may include properties of the user 202, such as characteristics of their appearance, biometric features of the user 202, and the like. The user resources 246 may include information regarding resources available to the user to engage in various activities. For example, user resources 246 may indicate an amount of funds available to acquire item 228, projected user income, physical space available to the user 202 for storing item 228, energy available to the user 202 to engage in selected activity 226, and the like.
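A projection of user resources 246 over the activity period could be computed with a simple period-by-period balance. The income-minus-cost model below is an assumption for illustration and is not the disclosure's method.

```python
def projected_resources(initial_funds: float, income_per_period: float,
                        cost_per_period: float, periods: int) -> list[float]:
    """Remaining funds at the end of each period if the activity is performed.

    Assumed model: a constant income and a constant activity cost per period.
    """
    balance = initial_funds
    history = []
    for _ in range(periods):
        balance += income_per_period - cost_per_period
        history.append(round(balance, 2))  # round to cents for display
    return history
```

For example, a user with 1000.0 in funds, 50.0 income per period, and a 100.0 per-period activity cost would see a declining balance across three periods, which could feed the resource representations presented later.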
At step 248, the processor 104 uses the selected activity 226 and the activity data 142 and/or the user data 134 to determine activity properties 250. The activity properties 250 may include one or more of anticipated environment changes 252, anticipated item changes 254, anticipated resource consumption 256, and anticipated user changes 258. For example, the processor 104 may use the selected activity 226 and the environment properties 238 to determine anticipated environment changes 252 that correspond to expected changes to the environment representation 212 over time 232. In an example embodiment (see
Anticipated item changes 254 may be determined using the selected activity 226 and the item properties 236. The anticipated item changes 254 may correspond to changes to the appearance or other characteristics of item 228 over time 232. For instance, effects of wear on item 228 may be determined based on anticipated usage of item 228 by the user 202, the length of time 232 over which the selected activity 226 will proceed, the environment properties 238 of the location in which item is located for the selected activity 226, and the like. Anticipated resource consumption 256 may be determined using the selected activity 226 and the resource properties 240. The anticipated resource consumption 256 corresponds to the expected consumption of resources over time 232. An example presentation of anticipated resource consumption 256 is illustrated in
Anticipated user changes 258 may be determined using the selected activity 226 and the user properties 244. The anticipated user changes 258 generally correspond to expected changes to the user 202 over the time 232 of the selected activity 226. In an example embodiment (see
At step 260, the processor 104 uses the activity properties 250 to determine the scenario representation 110 for the selected activity 226. The scenario representation 110 may include representations (e.g., visual or audio-visual representations) of the projected environment 262 (e.g., the environment representation 212 as it may appear over time 232 if the selected activity 226 is performed), the projected item 264 (e.g., a representation of the item 228 as it may appear over time 232 if the selected activity 226 is performed), the projected user 266 (a representation of the user 202 as they may appear over time 232 if the selected activity 226 is performed), and/or the projected resources 268 (e.g., a projected change in, or consumption of, user resources 246 if the selected activity 226 is performed). For example, the representation of the projected environment 262 may be determined using the anticipated changes to the environment 252 over time 232. The representation of the projected environment 262 may include one or more images depicting changes to the environment (e.g., changes to environment representation 212) over time 232. In an example where the selected activity 226 corresponds to acquiring item 228, the representation of the projected environment 262 may further depict the acquired item 228 within the environment at different points in a time interval defined by time 232. As another example, the anticipated item changes 254 over time 232 may be used to determine the representation of the projected item 264 that includes images depicting changes to the item 228 over time 232. As another example, the anticipated user changes 258 over time 232 may be used to determine the representation of the projected user 266 that includes images depicting changes to the user 202 over time 232. 
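The bundling at step 260 of the four projections into a single scenario representation can be sketched as a per-time-point merge. All field names here are illustrative, and the inputs are opaque placeholders standing in for rendered imagery.

```python
def scenario_representation(projected_environment: list,
                            projected_item: list,
                            projected_user: list,
                            projected_resources: list) -> list[dict]:
    """Combine the four projections into one frame per time point."""
    return [
        {"time": t, "environment": env, "item": item,
         "user": user, "resources": res}
        for t, (env, item, user, res) in enumerate(
            zip(projected_environment, projected_item,
                projected_user, projected_resources))
    ]

# Placeholder projections at two time points.
rep = scenario_representation(
    ["sunny", "rainy"], ["new", "worn"], ["single", "couple"], [950.0, 900.0])
```

Each resulting frame carries everything needed to present one time point of the scenario: the projected environment, item, user, and resources together.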
As yet another example, the anticipated resource consumption 256 over time 232 may be used to determine the representation of the projected resources 268 that illustrates (e.g., as a value entry, chart, or the like) the anticipated resource consumption at a given time during time 232. Examples of representations of a projected environment 262, projected item 264, projected user 266, and projected resources 268 are shown in
At step 270, the processor 104 provides the scenario representation 110 for presentation in the display 108. The scenario representation 110 may be presented as an image or a video. In some cases, the scenario representation 110 may be overlaid over the environment representation 212 such that one or more of the projected environment 262, projected item 264, projected user 266, and projected resources 268 are shown overlaid or integrated within the local environment of the user 202. In some cases, the scenario representation 110 may be an immersive representation (e.g., a virtual reality representation). For instance, the scenario representation 110 may show, at each of a plurality of time points during time 232 of the selected activity 226, an image of the item 228 as projected to appear according to the projected item 264, an image of the user 202 as projected to appear according to the projected user 266, and/or an image of the environment as projected to appear according to the projected environment 262.
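In the augmented reality mode, the presentation at step 270 amounts to compositing each scenario frame over the corresponding camera image. This sketch treats frames and camera images as opaque identifiers; a real device would perform image compositing rather than dictionary pairing.

```python
def overlay_on_camera(scenario_frames: list[dict],
                      camera_images: list[str]) -> list[dict]:
    """Pair each scenario frame with the camera image it is overlaid on."""
    return [
        {"background": image, "foreground": frame}
        for frame, image in zip(scenario_frames, camera_images)
    ]

# Hypothetical frames and camera captures at two time points.
composited = overlay_on_camera(
    [{"time": 0, "item": "new"}, {"time": 1, "item": "worn"}],
    ["cam_0.png", "cam_1.png"])
```

The same pairing structure could drive either an augmented reality overlay on the live camera feed or a fully virtual, immersive playback of the time points.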
Example Scenario Representations
Visual Representations of Projected Outcomes of Activities Over Time
At the initial time 302, the projected environment 262a corresponds to an expected sunny location. For example, the scenario presentation device 102 may determine that at time 302, the user 202 is expected to live in a location with mild weather. For instance, this location determination may be made using user properties 244 indicating current and future locations of the user 202, environment properties 238 indicating projected weather trends in various locations, and/or user location information (e.g., from GPS sensor 120). At subsequent time 312 (
At the initial time 302, the projected item 264a corresponds to an expected appearance of the item 228. For example, the scenario presentation device 102 may determine that at time 302, the item 228 is relatively new and unchanged from its initial appearance. For example, the projected item 264a at time 302 may correspond to an initial item appearance included in the item properties 236. At subsequent times 312 and 322 (
The representations of projected user 266a-c at times 302, 312, 322 provide a visual representation of how the family of the user 202 is projected to change over the time 232 associated with the selected activity 226. In this example, the user 202 is represented by a projected user 266a at time 302 that is a single person (e.g., user 202), a projected user 266b at time 312 that is a pair of people (e.g., user 202 and a partner), and projected user 266c that is a larger family at time 322. Information about the projected user 266a-c over time may provide previously unavailable insights to the user 202 for deciding whether to engage in the selected activity 226. For instance, the user 202 can determine whether the item 228 will be suitable for the user's needs at the different time points 302, 312, 322 before engaging in the selected activity 226. For example, the automobile item 228 shown as the representation of projected item 264c may be too small to hold the family shown as the representation of the projected user 266a-c.
The representations of projected resources 268a-c at times 302, 312, 322 provide a visual representation of resources available to the user 202 at each time 302, 312, 322. In the examples of
In some cases, an alternative scenario representation 110 may be presented, such as is illustrated in
Immersive Scenario Representations
As described above, in some embodiments, scenario representations 110 may provide an immersive experience to the user 202 with improved accuracy for presenting potential outcomes of performing a selected activity 226.
Images 600, 610, 620 of
The scenario presentation device 102 of this disclosure allows the user 202 to view an immersive scenario representation 110 that captures such changes in the user's abilities. This improved immersive experience allows the user 202 to more reliably allocate resources to prospective activities. These immersive scenario representations 110 also provide more readily interpretable and more accurate visual representations of possible outcomes of the selected activity 226 than is possible through the user's mind alone or using previous technology. Thus, at least certain embodiments of the scenario presentation device 102 provide the practical application of presenting improved and more realistic information to the user 202.
While several embodiments have been provided in this disclosure, it should be understood that the disclosed system and method might be embodied in many other specific forms without departing from the spirit or scope of this disclosure. The present examples are to be considered as illustrative and not restrictive, and the intention is not to be limited to the details given herein. For example, the various elements or components may be combined or integrated in another system or certain features may be omitted, or not implemented.
In addition, techniques, systems, subsystems, and methods described and illustrated in the various embodiments as discrete or separate may be combined or integrated with other systems, modules, techniques, or methods without departing from the scope of this disclosure. Other items shown or discussed as coupled or directly coupled or communicating with each other may be indirectly coupled or communicating through some interface, device, or intermediate component whether electrically, mechanically, or otherwise. Other examples of changes, substitutions, and alterations are ascertainable by one skilled in the art and could be made without departing from the spirit and scope disclosed herein.
To aid the Patent Office, and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants note that they do not intend any of the appended claims to invoke 35 U.S.C. § 112(f) as it exists on the date of filing hereof unless the words “means for” or “step for” are explicitly used in the particular claim.
Number | Date | Country | |
---|---|---|---|
20230057371 A1 | Feb 2023 | US |