The present invention is directed to a system and method for facilitating adaptive recentering in virtual reality (“VR”) environments. The use of VR has recently expanded into several different fields including gaming, entertainment, business, scientific research, education, and more. VR offers immersive advantages including realistic simulations, risk-free practice, and experiential training. Overall, VR provides a unique opportunity to immerse a user in a virtual environment that could otherwise be challenging, unsafe, or impossible to achieve in the real world. However, the advantages of VR come with several challenges, including limited accessibility for users with mental or physical disabilities or impairments. The present invention provides a system and method for facilitating adaptive recentering in VR environments in order to increase the accessibility of VR and provide a more seamless experience overall for any user.
In immersive VR gaming experiences, the adaptive recentering system can enhance accessibility for players with limited mobility or physical disabilities. As players engage in virtual battles or explore vast virtual worlds, the system can automatically prompt recentering when their viewpoint deviates significantly, allowing them to seamlessly adjust their perspective without disrupting the gameplay.
For VR cinema or theater experiences, the recentering system can ensure that viewers with disabilities can comfortably enjoy the virtual performance by automatically adjusting their viewpoint to the optimal vantage point without requiring manual inputs or assistance. In VR-based educational simulations or virtual classrooms, the adaptive recentering feature can assist students with disabilities in maintaining a focused and optimal viewpoint during lectures, demonstrations, or interactive learning experiences, enhancing their engagement and comprehension.
Architects and designers can leverage the adaptive recentering system in VR walkthroughs of proposed building designs or virtual models. As they navigate the virtual spaces, the system can automatically recenter their viewpoint, enabling them to inspect intricate details or visualize the design from different perspectives without manually adjusting their position. In virtual reality tourism experiences, the recentering system can enhance the immersive experience for users with disabilities by automatically adjusting their viewpoint to capture the most captivating angles of famous landmarks or natural wonders without needing physical inputs that could be challenging or impossible for some users.
Furthermore, adaptive recentering technology can be integrated into VR-based therapy or rehabilitation programs for individuals with physical or cognitive impairments. As patients engage in virtual exercises or simulations, the system can dynamically adjust their viewpoint, ensuring they maintain the optimal perspective for effective therapy sessions without strain or discomfort.
The present invention pertains to a system and method for facilitating adaptive recentering in virtual reality (VR) environments. The system autonomously monitors the user's viewpoint orientation within a VR environment using integrated sensors, such as gyroscopes and accelerometers, to detect deviations exceeding a predetermined threshold angle from a central reference point. Utilizing an innovative approach, the system employs an algorithm to evaluate the duration and extent of the user's viewpoint deviation, enabling detection of potential recentering needs without requiring direct physical input. Upon detecting a significant deviation indicative of a recentering requirement, the system dynamically generates a recentering prompt within the VR environment, seamlessly integrating the prompt into the ongoing user interaction. This prompt facilitates an intuitive recentering process, allowing users to easily adjust their viewpoint with minimal effort and without the need for additional hardware inputs, such as adaptive input devices.
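By way of a hedged, illustrative sketch rather than a required implementation, the deviation check described above can be pictured as measuring the angle between the user's current gaze direction and the central reference direction and comparing it to the threshold angle. In the Python fragment below, the vector representation, the function names, and the 30-degree default threshold are assumptions made purely for illustration.

```python
import math

def angular_deviation(current_dir, reference_dir):
    """Angle in degrees between two 3D gaze direction vectors."""
    dot = sum(c * r for c, r in zip(current_dir, reference_dir))
    norm = (math.sqrt(sum(c * c for c in current_dir))
            * math.sqrt(sum(r * r for r in reference_dir)))
    # Clamp to guard against floating-point error before acos.
    cos_angle = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_angle))

def exceeds_threshold(current_dir, reference_dir, threshold_deg=30.0):
    """True when the viewpoint deviates beyond the predetermined threshold angle."""
    return angular_deviation(current_dir, reference_dir) > threshold_deg

# Example: head turned roughly 45 degrees to the right of the central reference.
print(exceeds_threshold((0.707, 0.0, 0.707), (0.0, 0.0, 1.0)))  # True
```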
Moreover, the system can discern intentional from unintentional deviations, dismissing the recentering prompt when the user's viewpoint returns within the acceptable range, ensuring a continuous and uninterrupted VR experience. The disclosed methods and systems significantly enhance accessibility for users with disabilities, leveraging minimal input for maximum navigational control within VR environments. This adaptive recentering approach simplifies the user interface and promotes a more inclusive and accessible VR experience across various applications and user demographics.
The present invention's recentering system employs a multi-faceted approach, incorporating hardware and software components to deliver an innovative solution to user recentering in VR environments. The core elements of the system include widespread sensor compatibility, user interaction detection, recentering prompt generation, and multi-channel user feedback.
The system leverages standard sensors in most VR headsets, such as gyroscopes, accelerometers, and magnetometers. This ensures that the adaptive recentering functionality can be implemented across various VR devices without the need for additional hardware modifications. Gyroscopes are used to measure the angular velocity of the user's head movements. They provide precise data on the rotational movement along the X, Y, and Z axes. This allows the system to detect the direction and speed of head turns, which is crucial for identifying deviations from the central reference point. Accelerometers measure the linear acceleration of the user's head. They help in determining the tilt and inclination of the head by detecting changes in velocity. This is essential for understanding the overall head movement and orientation within the VR environment. Magnetometers are used to measure the orientation of the user's head relative to the earth's magnetic field. They help provide an absolute reference point for orientation, which complements the data from gyroscopes and accelerometers. This ensures accurate tracking of the user's head position and movement.
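As a non-limiting illustration of how readings from these sensors might be combined, the sketch below fuses a gyroscope rate with an accelerometer tilt estimate using a simple complementary filter; a magnetometer reading could be blended in the same way to stabilize heading against drift. The axis conventions, the 0.98 weighting, and the function names are assumptions for illustration rather than part of the disclosed system.

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch angle (degrees) inferred from gravity as measured by the accelerometer."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(prev_pitch, gyro_rate_y, ax, ay, az, dt, alpha=0.98):
    """Fuse the gyroscope angular rate (deg/s) with the accelerometer tilt estimate.

    The gyroscope term tracks fast head motion; the accelerometer term
    corrects the slow drift that pure integration would accumulate.
    A magnetometer heading could be fused the same way for yaw.
    """
    gyro_pitch = prev_pitch + gyro_rate_y * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch(ax, ay, az)

# Example: fold in one 10 ms sensor sample.
pitch = complementary_filter(prev_pitch=12.0, gyro_rate_y=5.0,
                             ax=-0.21, ay=0.0, az=0.98, dt=0.01)
print(round(pitch, 2))
```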
A sophisticated adaptive algorithm processes and analyzes user data from integrated sensors in real-time to detect deviations from a central reference point. The algorithm evaluates the magnitude and duration of these deviations to determine when recentering is necessary. The adaptive recentering process involves several key steps, beginning with deviation detection. The algorithm continuously monitors the user's head movement using integrated hardware sensors, such as gyroscopes and accelerometers, to measure the current viewpoint orientation. When the deviation angle exceeds a predefined threshold, a timer is initiated. The next step is prompt generation. If a deviation persists beyond the specified waiting time, a recentering prompt is generated and displayed to the user within the VR environment. Following the prompt generation, the system tracks how the user responds to the recentering prompt. If the user presses a button to recenter, this indicates that the prompt was necessary, and the system records this as a “successful response.” If the user ignores the prompt and manually adjusts their viewpoint to return within the acceptable range, this is recorded as an “automatic recovery.” Following this, the algorithm dynamically adjusts the detection threshold based on user interaction data. Initially, the threshold is adjusted based on all available historical data. As more data is collected, the algorithm places greater emphasis on recent interactions. If the user frequently achieves automatic recovery without requiring a prompt, the threshold is increased to accommodate the user's ability to control the deviation. On the other hand, if the user consistently responds to prompts by interacting with the recenter prompt, indicating the prompt is needed, the threshold is decreased to prompt recentering sooner.
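The deviation-timer-prompt sequence described above can be sketched, under stated assumptions, as the small monitoring class below. The class name, the default threshold and waiting time, and the outcome labels mirror the terms used in this description but are otherwise illustrative, not a prescribed implementation.

```python
import time

class RecenterMonitor:
    """Minimal sketch of deviation detection, the waiting timer, and prompt generation.

    Threshold adaptation itself is handled elsewhere; this class only detects
    persistent deviations, raises a prompt, and classifies the outcome as a
    "successful response" or an "automatic recovery".
    """

    def __init__(self, threshold_deg=30.0, wait_seconds=2.0):
        self.threshold_deg = threshold_deg
        self.wait_seconds = wait_seconds
        self.deviation_started = None   # timestamp when the deviation began
        self.prompt_active = False
        self.history = []               # recorded outcomes

    def update(self, deviation_deg, recenter_pressed, now=None):
        now = time.monotonic() if now is None else now
        if deviation_deg > self.threshold_deg:
            if self.deviation_started is None:
                self.deviation_started = now            # start the timer
            elif (not self.prompt_active
                  and now - self.deviation_started >= self.wait_seconds):
                self.prompt_active = True               # deviation persisted: show prompt
        else:
            if self.prompt_active:
                # View returned on its own; dismiss the prompt silently.
                self.history.append("automatic recovery")
            self.deviation_started = None
            self.prompt_active = False

        if self.prompt_active and recenter_pressed:
            # User confirmed the prompt was needed.
            self.history.append("successful response")
            self.deviation_started = None
            self.prompt_active = False

        return self.prompt_active
```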
The system then continuously monitors the user's viewpoint and adjusts the threshold in real-time, ensuring that the adaptive recentering remains aligned with the user's specific needs and behaviors. The system also allows users to customize the sensitivity of the recentering prompts, providing flexibility to tailor the system's responsiveness according to their specific needs and preferences. Key customization options include angular threshold adjustment, prompt frequency, sensitivity of response, and response time. Specifically, users can modify the initial angular threshold, which determines how much deviation from the central reference point is allowed before a recentering prompt is triggered. This allows users to set a threshold that best suits their comfort level and movement patterns within the VR environment. Users can also adjust how often the system checks for deviations and generates recentering prompts. By setting the prompt frequency, users can control how quickly the system responds to significant deviations, ensuring the prompts are timely without being overly intrusive. The system's responsiveness can be fine-tuned based on user interaction data. Users can customize how the system dynamically adjusts the threshold based on their behavior. For instance, users can set how aggressively the system should increase or decrease the threshold based on their manual adjustments or button presses. Users can also set the waiting time before a recentering prompt is displayed after detecting a deviation. This allows users to balance immediate feedback against giving themselves a chance to self-correct without interruption. By incorporating these adaptive mechanisms and customization options, the system remains responsive to the user's behavior, providing prompts only when necessary and allowing for a more natural and seamless VR experience. This flexibility enhances the overall user experience, making the VR environment more accessible and comfortable for a diverse range of users.
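One hedged way to picture these customization options is as a small settings structure handed to the monitoring logic, as in the sketch below; the field names, units, and default values are illustrative assumptions rather than prescribed parameters.

```python
from dataclasses import dataclass

@dataclass
class RecenterSettings:
    """User-adjustable parameters for the adaptive recentering behavior."""
    angular_threshold_deg: float = 30.0   # initial deviation allowed before prompting
    prompt_frequency_hz: float = 10.0     # how often deviations are checked
    sensitivity: float = 0.5              # how aggressively the threshold adapts (0..1)
    response_time_s: float = 2.0          # waiting time before a prompt is displayed

# Example: a user who prefers earlier prompts but a longer self-correction window.
settings = RecenterSettings(angular_threshold_deg=25.0, response_time_s=3.0)
print(settings)
```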
Upon identifying a significant deviation, the system generates a recentering prompt within the VR environment. This prompt seamlessly integrates into the user's current interaction, minimizing disruption and facilitating an intuitive recentering process. The detailed steps involved include state recording, wherein the system automatically records the player's current progress state before displaying the recenter prompt. This current progress state may, for example, include a highlighted menu option in the player's view. By recording this state, the system ensures that the user's context is preserved for a smooth transition back to their original task after recentering. The recentering prompt is then displayed within the VR environment. To minimize disruption, the prompt is designed to be easily accessible and intuitive. The system automatically sets the highlighted option to the recenter option, enabling users to quickly and effortlessly respond to the prompt. Whether the user responds to the recentering prompt by pressing a button or manually adjusts their view to return within the acceptable range, the system ensures a seamless transition by automatically restoring the previously recorded states after recentering the user's view. For example, this may include resetting a highlighted option to its original state, allowing users to continue their menu interaction without interruption. By implementing these steps, the system ensures that the recentering process is effective and helpful while remaining as unobtrusive as possible. The automatic state recording and restoration allow users to maintain flow and continuity within the VR experience, enhancing overall accessibility and usability.
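The state recording and restoration steps can be illustrated, purely as an assumed sketch, with the toy menu class below; the option names and the idea of tracking a single highlighted index are illustrative stand-ins for whatever interaction state a particular VR application actually preserves.

```python
class MenuState:
    """Minimal sketch of recording and restoring the highlighted menu option."""

    def __init__(self, options):
        self.options = options
        self.highlighted = 0
        self.saved_highlight = None

    def show_recenter_prompt(self):
        # Record the player's current progress state before the prompt appears.
        self.saved_highlight = self.highlighted
        # Pre-select the recenter option so a single button press responds to it.
        self.highlighted = self.options.index("Recenter")

    def close_recenter_prompt(self):
        # Restore the previously recorded state so menu interaction resumes seamlessly.
        if self.saved_highlight is not None:
            self.highlighted = self.saved_highlight
            self.saved_highlight = None

menu = MenuState(["Resume", "Settings", "Recenter", "Quit"])
menu.highlighted = 1            # player was hovering over "Settings"
menu.show_recenter_prompt()     # prompt shown, "Recenter" highlighted
menu.close_recenter_prompt()    # back to "Settings" after recentering
print(menu.options[menu.highlighted])
```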
The recentering prompts are designed to include various feedback mechanisms, ensuring that users are aware of the need to recenter and can respond promptly. Key elements include visual cues, auditory signals, haptic feedback, and continuous transitions. The system provides clear visual indicators within the VR environment to alert users when recentering is required. These cues are designed to be easily noticeable without disrupting the ongoing interaction. In addition to these visual cues, the system can emit auditory signals to notify users of the need to recenter. These sounds are designed to be distinct yet not startling, ensuring users can recognize them without looking directly at the visual cues. For users with compatible devices, haptic feedback can provide a tactile alert. This feedback can benefit users with visual or auditory impairments, ensuring they receive the recentering prompt through another sensory channel. To ensure user comfort, the system uses smooth fade-in and fade-out transitions to maintain user continuity when recentering the viewpoint. This approach avoids a sudden change in viewpoint, which could cause discomfort or disorientation. Instead, the viewpoint adjusts smoothly, allowing the user to maintain their sense of immersion and spatial orientation.
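As an illustrative sketch only, the multi-channel prompt delivery described above might be wired up as below; the callables, clip name, and pulse duration are placeholders for whatever rendering, audio, and haptic interfaces a given VR runtime actually exposes, and are not part of the disclosure.

```python
def issue_recenter_prompt(show_visual_cue, play_audio_cue, pulse_haptics,
                          haptics_available=True):
    """Deliver the recentering prompt over every available feedback channel.

    The three callables stand in for a VR runtime's rendering, audio, and
    controller-vibration interfaces.
    """
    show_visual_cue("Recenter view?")          # visual indicator in the headset
    play_audio_cue("recenter_chime")           # distinct but non-startling sound
    if haptics_available:
        pulse_haptics(duration_ms=150)         # tactile alert for compatible devices

# Example wiring with print placeholders in place of real runtime calls.
issue_recenter_prompt(
    show_visual_cue=lambda text: print("HUD:", text),
    play_audio_cue=lambda clip: print("audio:", clip),
    pulse_haptics=lambda duration_ms: print("haptic pulse:", duration_ms, "ms"),
)
```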
Other features and aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the features in accordance with embodiments of the invention. The summary is not intended to limit the scope of the invention, which is defined solely by the claims attached hereto.
The various embodiments are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Having thus described the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
In most embodiments, the system may optionally include some type of network. The network can be any type of network familiar to those skilled in the art that can support data communications using any of a variety of commercially available protocols, including without limitation TCP/IP, SNA, IPX, AppleTalk, and the like. Merely by way of example, the network can be a local area network (“LAN”), such as an Ethernet network, a Token-Ring network and/or the like; a wide-area network; a virtual network, including without limitation a virtual private network (“VPN”); the Internet; an intranet; an extranet; a public switched telephone network (“PSTN”); an infra-red network; a wireless network (e.g., a network operating under any of the IEEE 802.11 suite of protocols, GPRS, GSM, UMTS, EDGE, 2G, 2.5G, 3G, 4G, WiMAX, WiFi, CDMA 2000, WCDMA, the Bluetooth protocol known in the art, and/or any other wireless protocol); and/or any combination of these and/or other networks.
The system may also include one or more server computers which can be general purpose computers, specialized server computers (including, merely by way of example, PC servers, UNIX servers, mid-range servers, mainframe computers, rack-mounted servers, etc.), server farms, server clusters, or any other appropriate arrangement and/or combination. One or more of the servers may be dedicated to running applications, such as a business application, a Web server, application server, etc. Such servers may be used to process requests from user computers. The applications can also include any number of applications for controlling access to resources of the servers.
The web server can be running an operating system including any of those discussed above, as well as any commercially-available server operating systems. The Web server can also run any of a variety of server applications and/or mid-tier applications, including HTTP servers, FTP servers, CGI servers, database servers, Java servers, business applications, and the like. The server(s) may also be one or more computers capable of executing programs or scripts in response to requests from the user computers. As one example, a server may execute one or more Web applications. The Web application may be implemented as one or more scripts or programs written in any programming language, such as Java®, C, C#, or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming/scripting languages. The server(s) may also include database servers, including without limitation those commercially available from Oracle®, Microsoft®, Sybase®, IBM® and the like, which can process requests from database clients running on a user computer.
End users, or users that are viewing and using the network platform, all contribute data to the cloud. A web service platform helps secure that data and maintain the service's functionalities. Only authorized users and entities can authorize or deauthorize content and monitor data stored within the web service. The platform's web services help maintain the operations of elements managed by the storage system.
The system may also optionally include one or more databases. The database(s) may reside in a variety of locations. By way of example, a database 620 may reside on a storage medium local to (and/or resident in) one or more of the computers. Alternatively, it may be remote from any or all of the computers, and/or in communication (e.g., via the network) with one or more of these. In a particular set of embodiments, the database may reside in a storage-area network (“SAN”) familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers may be stored locally on the respective computer and/or remotely, as appropriate. In one set of embodiments, the database may be a relational database, such as Oracle 10g, that is adapted to store, update, and retrieve data in response to SQL-formatted commands.
The above may be referred to as the “recentering criterion” which, when exceeded, prompts the system to generate a recentering prompt. The system then continuously monitors the user's viewpoint in relation to the above parameters and adjusts the threshold in real-time, ensuring that the adaptive recentering remains aligned with the user's specific needs and behaviors. The parameters may be customized by the user, providing flexibility to tailor the system's responsiveness according to their specific needs and preferences. Customization options may include and are not limited to angular threshold adjustment, prompt frequency, sensitivity of response, and response time.
Users can modify the initial angular threshold, which determines how much deviation from the central reference point is allowed before a recentering prompt is triggered. This allows users to set a threshold that best suits their comfort level and movement patterns within the VR environment. Users can also adjust how often the system checks for deviations and generates recentering prompts. By setting the prompt frequency, users can control how quickly the system responds to significant deviations, ensuring the prompts are timely without being overly intrusive.
The system's responsiveness can be fine-tuned based on user interaction data. Users can customize how the system dynamically adjusts the threshold based on their behavior. For instance, users can set how aggressively the system should increase or decrease the threshold based on their manual adjustments or button presses. Users can also set the waiting time before a recentering prompt is displayed after detecting a deviation. This allows users to balance immediate feedback against giving themselves a chance to self-correct without interruption. By incorporating these adaptive mechanisms and customization options, the system remains responsive to the user's behavior, providing prompts only when necessary and allowing for a more natural and seamless VR experience. This flexibility enhances the overall user experience, making the VR environment more accessible and comfortable for a diverse range of users. Furthermore, the VR device may contain eye-tracking capabilities. In such embodiments, a fixed heads-up display (“HUD”) may be implemented in the VR environment which may be configured to follow the user's head movements. Recentering prompts may be generated when the user looks at the HUD for a specified duration, allowing the user to actively recenter by changing their gaze.
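A hedged sketch of the gaze-dwell trigger contemplated in eye-tracking embodiments follows; the 1.5-second dwell time, the class name, and the boolean gaze-on-HUD signal are illustrative assumptions about how an eye tracker's output might be consumed.

```python
class GazeDwellTrigger:
    """Minimal sketch of triggering recentering by gazing at a fixed HUD element."""

    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds
        self.gaze_started = None

    def update(self, gaze_on_hud, now):
        """Return True once the user has looked at the HUD element long enough."""
        if not gaze_on_hud:
            self.gaze_started = None    # gaze left the HUD: reset the dwell timer
            return False
        if self.gaze_started is None:
            self.gaze_started = now
        return now - self.gaze_started >= self.dwell_seconds

trigger = GazeDwellTrigger(dwell_seconds=1.5)
print(trigger.update(gaze_on_hud=True, now=0.0))   # False, dwell just started
print(trigger.update(gaze_on_hud=True, now=1.6))   # True, recentering prompt fires
```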
To minimize disruption, the prompt is designed to be easily accessible and intuitive. The prompt may contain a menu comprising an option for the user to confirm recentering and an option to dismiss the prompt. The system may automatically highlight the recenter option, enabling users to quickly and effortlessly respond to the prompt. Whether the user responds to the recentering prompt by pressing a button or manually adjusts their view to return within the acceptable range, the system ensures a seamless transition by automatically restoring the previously recorded states after recentering the user's view. For example, this may include resetting a highlighted option to its original state, allowing users to continue their menu interaction without interruption. By implementing these steps, the system ensures that the recentering process is effective and helpful while remaining as unobtrusive as possible. The automatic state recording and restoration allow users to maintain flow and continuity within the VR experience, enhancing overall accessibility and usability.
The recentering prompts are designed to include various feedback mechanisms, ensuring that users are aware of the need to recenter and can respond promptly. Key elements include visual cues, auditory signals, haptic feedback, and continuous transitions. The system provides clear visual indicators within the VR environment to alert users when recentering is required. These cues are designed to be easily noticeable without disrupting the ongoing interaction. In addition to these visual cues, the system can emit auditory signals to notify users of the need to recenter. These sounds are designed to be distinct yet not startling, ensuring users can recognize them without looking directly at the visual cues. For users with compatible devices, haptic feedback can provide a tactile alert. This feedback can benefit users with visual or auditory impairments, ensuring they receive the recentering prompt through another sensory channel. To ensure user comfort, the system uses smooth fade-in and fade-out transitions to maintain user continuity when recentering the viewpoint. This approach avoids a sudden change in viewpoint, which could cause discomfort or disorientation. Instead, the viewpoint adjusts smoothly, allowing the user to maintain their sense of immersion and spatial orientation.
If the user frequently achieves automatic recovery without requiring a prompt, and the average deviation angle θ_avg exceeds the current threshold θ₀, the threshold is adjusted upwards:

New θ₀ = θ_avg + Δθ

If the user consistently presses the recenter button, indicating the prompt is necessary, and θ_avg is less than θ₀, the threshold is adjusted downwards:

New θ₀ = θ_avg − Δθ
If the user's responses are mixed or if there are not enough data points to make a significant change, the threshold may remain the same until more data is collected.
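A minimal sketch of this update rule, under stated assumptions, appears below: it averages the most recent deviation angles, tallies the two recorded outcome types, and applies the adjustments above. The Δθ of 2 degrees, the five-sample minimum, and the simple majority test are illustrative choices, not disclosed values.

```python
def updated_threshold(theta_0, recent_deviations, recent_outcomes,
                      delta_theta=2.0, min_samples=5):
    """Adjust the threshold θ₀ from recent deviation angles and prompt outcomes.

    recent_outcomes holds "automatic recovery" or "successful response"
    labels for the same interactions as recent_deviations.
    """
    if len(recent_outcomes) < min_samples:
        return theta_0                          # not enough data: keep θ₀ unchanged

    theta_avg = sum(recent_deviations) / len(recent_deviations)
    recoveries = recent_outcomes.count("automatic recovery")
    responses = recent_outcomes.count("successful response")

    if recoveries > responses and theta_avg > theta_0:
        return theta_avg + delta_theta          # user self-corrects: raise θ₀
    if responses > recoveries and theta_avg < theta_0:
        return theta_avg - delta_theta          # prompts are needed: lower θ₀
    return theta_0                              # mixed behavior: leave θ₀ as is

# Example: five automatic recoveries at large angles push the threshold up.
print(updated_threshold(30.0, [38, 41, 36, 40, 39],
                        ["automatic recovery"] * 5))
```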
The system continuously monitors the user's viewpoint and makes real-time adjustments to the threshold based on the latest data. This ensures that the adaptive recentering is effective and tailored to the user's specific needs and behaviors. Regardless of whether the user uses the recenter button or manually adjusts their view, the system ensures a smooth transition. The previously recorded state is automatically restored, allowing users to continue their interaction without interruption. The transition involves a smooth fade-in and fade-out to prevent discomfort from sudden angle changes.
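The fade-in and fade-out transition can be sketched, purely for illustration, as a time-varying dimming curve like the one below, with the viewpoint snapped to its recentered orientation while the screen is fully dimmed; the cosine easing and the 0.3-second durations are assumptions rather than prescribed values.

```python
import math

def fade_alpha(t, fade_out_s=0.3, hold_s=0.1, fade_in_s=0.3):
    """Screen-dim alpha (0 = fully visible, 1 = fully dimmed) at time t seconds.

    The viewpoint is realigned during the brief hold while the screen is
    dimmed, so the user never sees a sudden jump in orientation.
    """
    if t < fade_out_s:                                  # ease the view out
        return 0.5 - 0.5 * math.cos(math.pi * t / fade_out_s)
    if t < fade_out_s + hold_s:                         # fully dimmed: recenter here
        return 1.0
    t_in = t - fade_out_s - hold_s
    if t_in < fade_in_s:                                # ease the view back in
        return 0.5 + 0.5 * math.cos(math.pi * t_in / fade_in_s)
    return 0.0

for t in (0.0, 0.15, 0.35, 0.55, 0.8):
    print(t, round(fade_alpha(t), 2))
```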
If the user confirms the need for recentering by interacting with the prompt, the system executes recentering instructions to realign the user's viewpoint with the central reference point. If the user does not interact with the prompt and instead returns their viewpoint to within the threshold angle, the system automatically dismisses the recentering prompt, recognizing this as an indication of intentional deviation. If the user frequently returns their viewpoint to within the threshold angle without requiring a prompt, the threshold is increased to accommodate the user's ability to control the deviation. If the user consistently responds to prompts by interacting with the recenter prompt, indicating the prompt is needed, the threshold is decreased to prompt recentering sooner. Following either user recentering confirmation or automatic prompt dismissal due to viewpoint correction, the user continues their VR experience without interruption.
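The confirm-or-dismiss handling just described can be pictured with the simplified, per-event sketch below, a deliberately cruder variant of the averaged adjustment rule shown earlier; the one-degree nudges and the 10-to-60-degree clamping bounds are illustrative assumptions.

```python
def handle_prompt_outcome(event, threshold_deg, step_deg=1.0,
                          min_deg=10.0, max_deg=60.0):
    """Resolve an active recentering prompt and nudge the threshold.

    event is "confirmed" when the user interacts with the prompt, or
    "viewpoint_returned" when the view comes back within the threshold on
    its own, which is treated as an intentional deviation.
    """
    if event == "confirmed":
        action = "recenter_view"                               # realign to the central reference
        threshold_deg = max(min_deg, threshold_deg - step_deg) # prompt sooner next time
    elif event == "viewpoint_returned":
        action = "dismiss_prompt"                              # no interruption needed
        threshold_deg = min(max_deg, threshold_deg + step_deg) # tolerate larger deviations
    else:
        action = "keep_prompt"                                 # unrecognized event: no change
    return action, threshold_deg

print(handle_prompt_outcome("confirmed", 30.0))          # ('recenter_view', 29.0)
print(handle_prompt_outcome("viewpoint_returned", 30.0)) # ('dismiss_prompt', 31.0)
```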
While various embodiments of the disclosed technology have been described above, it should be understood that they have been presented by way of example only, and not of limitation. Likewise, the various diagrams may depict an example architectural or other configuration for the disclosed technology, which is done to aid in understanding the features and functionality that may be included in the disclosed technology. The disclosed technology is not restricted to the illustrated example architectures or configurations, but the desired features may be implemented using a variety of alternative architectures and configurations. Indeed, it will be apparent to one of skill in the art how alternative functional, logical or physical partitioning and configurations may be used to implement the desired features of the technology disclosed herein. Also, a multitude of different constituent module names other than those depicted herein may be applied to the various partitions. Additionally, with regard to flow diagrams, operational descriptions and method claims, the order in which the steps are presented herein shall not mandate that various embodiments be implemented to perform the recited functionality in the same order unless the context dictates otherwise.
Although the disclosed technology is described above in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead may be applied, alone or in various combinations, to one or more of the other embodiments of the disclosed technology, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the technology disclosed herein should not be limited by any of the above-described exemplary embodiments.
Terms and phrases used in this document, and variations thereof, unless otherwise expressly stated, should be construed as open ended as opposed to limiting. As examples of the foregoing: the term “including” should be read as meaning “including, without limitation” or the like; the term “example” is used to provide exemplary instances of the item in discussion, not an exhaustive or limiting list thereof; the terms “a” or “an” should be read as meaning “at least one,” “one or more” or the like; and adjectives such as “conventional,” “traditional,” “normal,” “standard,” “known” and terms of similar meaning should not be construed as limiting the item described to a given time period or to an item available as of a given time, but instead should be read to encompass conventional, traditional, normal, or standard technologies that may be available or known now or at any time in the future. Likewise, where this document refers to technologies that would be apparent or known to one of ordinary skill in the art, such technologies encompass those apparent or known to the skilled artisan now or at any time in the future.