Virtual reality (VR) is a technology for creating three-dimensional virtual environments with specific functionality. VR headsets may be used in combination with hand controllers to generate and render realistic images, sounds and sensations that simulate a user's physical presence in a virtual environment. In particular, the user is able to look around an artificial world, move within it, and interact with virtual features or items in a meaningful way. Each VR headset includes a display and lenses, a processor, memory in the form of both RAM and storage, a battery, a head strap, and integral speakers and microphones. The hand controllers include batteries, buttons, triggers, and haptics.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A virtual reality system for retail store design includes a network, a virtual reality server coupled to the network and including a voice streaming server configured to stream audio data, and a plurality of computing devices coupled to the network and the virtual reality server. Each computing device has a display assembly configured to render virtual reality spaces and objects. At least one of the plurality of computing devices is a virtual reality headset that includes one or more tracking cameras configured to sense positioning of a user's hands relative to the headset worn by the user, at least one microphone and at least one speaker. The voice streaming server is activated to stream microphone audio data from the microphone to speakers associated with others of the plurality of computing devices when one of the user's hands associated with the headset is located within a threshold distance of the tracking camera of the headset.
A method of streaming microphone audio data in a virtual reality system for retail store design includes providing a plurality of computing devices coupled to a network and a virtual reality server. Each computing device has a display assembly configured to render virtual reality spaces and objects, at least one microphone and at least one speaker. At least one of the computing devices includes a virtual reality headset configured to be worn by a user. With one or more tracking cameras associated with the headset, a position of the user's hands relative to the headset worn by the user is sensed. Microphone audio data from the microphone of the headset is streamed to speakers of the others of the plurality of computing devices when one of the user's hands associated with the headset is located within a threshold distance of the tracking camera of the headset.
A virtual reality system for retail store design includes a network, a virtual reality server coupled to the network and a plurality of computing devices coupled to the network and the virtual reality server. Each computing device is configured to be operated by a user and includes a display assembly configured to render virtual reality spaces and objects. The virtual reality server is configured to render a virtual palette containing a plurality of available product assets in virtual form and to render a virtual retail display unit. The rendered virtual product assets are dimensionally accurate digital representations of real-world products, and the rendered virtual retail display unit is a dimensionally accurate digital representation of a real-world retail display unit, so that the user can fill the space on the rendered virtual retail display unit with only a limited quantity of rendered virtual product assets, just as in the real world. The user operates the computing device to select one or more rendered virtual product assets to virtually merchandise the rendered virtual retail display unit. Data associated with the virtual product assets virtually merchandised on the virtual retail display unit are exported into a data file for use in reproducing the merchandising on a real retail display unit in the real world.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
A virtual reality (VR) system for retail store design uses VR to provide a simulated experience for collaboratively and individually designing retail stores and store displays.
Under one embodiment, VR system 100 includes a plurality of headsets 102, each having a corresponding pair of handheld touch controllers 104 that communicate with their corresponding headset 102, and one or more tracking cameras 105 to access VR server 110. Together, each headset 102, pair of touch controllers 104 and tracking cameras 105 are configured to generate and render realistic images, sounds and sensations that simulate a user's physical presence in a virtual reality environment. Tracking cameras 105 are operable to track both headset 102 and the corresponding pair of touch controllers 104. It should be realized that tracking cameras 105 may also perform hand tracking of a user's physical hands without the need for handheld touch controllers 104.
One exemplary headset 102 and corresponding touch controllers 104 are the Oculus Quest 2 VR headset and touch controllers created by Oculus, a brand owned by Facebook Technologies, LLC. In addition, each headset 102 of VR system 100 may be capable of running either as a standalone headset with an internal operating system or connected to software running on a user computing system 106 over, for example, a USB connection. Headsets 102 and optional user computing system 106 are in communication with a network 108, for example, the Internet, in order to communicate with VR server 110. Alternatively, user computing system 106 may not be connected to a VR headset and corresponding touch controllers. In such an embodiment, user computing system 106 includes controls (such as hand controls or a keyboard) for controlling movement in VR system 100, a display (rather than a headset) for displaying the virtual world, and a microphone and speaker for providing voice communication.
VR system 100 includes collaborative VR spaces with synchronized media control and user avatars, as illustrated in the accompanying figures.
One or more sensors 124 are input devices that sense properties such as acceleration and include one or more tracking cameras 105. Regardless of whether sensors and cameras 124 are mounted on headset 102 (as illustrated) or are separate from headset 102, sensors and cameras 124 include one or more tracking cameras 105 associated with each headset 102 and are configured to perform positional tracking, touch controller tracking or hand tracking. Sensors and cameras 124 may also provide pass-through viewing, which allows user 101 to temporarily look outside of the virtual reality provided by display assembly 112 and see a real-time view of the environment around user 101.
Under an embodiment where a user enters VR system 100 by donning headset 102 and holding touch controllers 104a and 104b in each hand, a distance between the physical real-world floor and each of the touch controllers 104a and 104b is determined by VR system 100. This determination allows user 101 to enter into the virtual reality of collaborative VR spaces with their user avatar at a dimensionally correct height with respect to the artificial world. However, it should be realized that the user may instead enter VR system 100 as their user avatar by way of a different user computing device 106 that includes a display and controls.
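A minimal sketch of this height calibration, assuming the tracked floor plane sits at y = 0 in meters; all names (Vec3, calibrate_avatar_height, EYE_TO_CROWN_M) are hypothetical stand-ins rather than the actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float  # height above the tracked floor plane, in meters
    z: float

def calibrate_avatar_height(headset: Vec3, left: Vec3, right: Vec3) -> float:
    """Estimate the user's standing height so the avatar is rendered at a
    dimensionally correct scale in the collaborative VR space."""
    # The floor-to-controller distances confirm the controllers are tracked
    # and held above the floor before trusting the headset's height reading.
    if min(left.y, right.y) <= 0.0:
        raise ValueError("controllers must be tracked above the floor plane")
    EYE_TO_CROWN_M = 0.11  # assumed average offset from eyes to top of head
    return headset.y + EYE_TO_CROWN_M
```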
The VR spaces in VR system 100 have interchangeable user-defined virtual environments. For example, by selecting multiuser in the UI illustrated in screenshot 200, a user may enter a shared VR space occupied by other user avatars.
The VR theater allows users to view, as user avatars, user-based presentations or slides on a virtual display screen 208. Controls may be selected or manipulated to operate features in the VR space, such as to play or stop the presentation being shown on display screen 208. Controls for the presentations on display screen 208 are accessible by the user avatar from a menu or directly on virtual podium 206.
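One way such synchronized media control might be realized is sketched below; the message format, the MediaSyncServer name, and the per-client send API are assumptions, not details taken from the system described here:

```python
import json
import time

class MediaSyncServer:
    """Rebroadcasts play/stop state so every viewer's screen stays in sync."""

    def __init__(self) -> None:
        self.clients: list = []   # connected client sessions (transport-specific)
        self.playing = False
        self.position_s = 0.0     # playback position when last stopped
        self.started_at = 0.0     # server time when playback last started

    def handle_command(self, command: str) -> None:
        """Apply a 'play' or 'stop' command from any avatar's controls."""
        now = time.time()
        if command == "play" and not self.playing:
            self.playing, self.started_at = True, now
        elif command == "stop" and self.playing:
            self.position_s += now - self.started_at
            self.playing = False
        self.broadcast()

    def broadcast(self) -> None:
        # Shared server timestamp lets each client compute the same position.
        state = json.dumps({"playing": self.playing,
                            "position_s": self.position_s,
                            "server_time": time.time()})
        for client in self.clients:
            client.send(state)    # assumed per-session send API
```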
As previously discussed, VR server 110 includes voice streaming server 148. Voice streaming server 148 is configured to receive audio data from each microphone 122 in VR system 100 and transmit audio data to each speaker 120 in VR system 100 so as to facilitate voice communications between users in the collaborative VR system 100. In one embodiment, voice streaming server 148 is configured to transmit audio data to each speaker 120 so that each user 101 is capable of hearing all voice communications in the collaborative VR space. However, in this embodiment, not all audio data from microphone 122 is constantly transmitted in the collaborative VR space. To allow the transmission of audio data from microphone 122 to other users in the collaborative VR space, tracking camera 105 detects whether the physical left-hand touch controller 104a (or, in the case of hand tracking, a user's left hand) is in a particular position or range of positions relative to tracking camera 105 or headset 102. Upon detection of that position or range of positions, audio data from microphone 122 is transmitted to other users in the collaborative VR system 100. In one example, camera 105 may detect the physical left-hand touch controller 104a (or the user's left hand) as being within a threshold distance of headset 102, which activates the transmission of audio data from microphone 122 to speakers associated with other users or headsets in the collaborative VR system 100. In another example, camera 105 may detect the physical left-hand touch controller 104a (or the user's left hand) as being in a threshold positional orientation, which likewise activates the transmission. Certain threshold positional orientations may include the physical touch controller being held upward or the user's hand being squeezed into a fist and positioned upright, as if pretending the hand is a microphone. It should be understood that activating the transmission of audio data based on other touch controllers, such as the physical right-hand touch controller 104b, or the right hand is also possible.
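A minimal sketch of this distance-based activation, assuming hand and headset positions arrive as (x, y, z) tuples from tracking cameras 105 and that voice_stream stands in for a client-side interface to voice streaming server 148; the threshold value and all names are assumptions:

```python
import math

ACTIVATION_DISTANCE_M = 0.25  # assumed threshold distance from the headset

def hand_within_threshold(hand, headset) -> bool:
    """True when the tracked left hand (or controller 104a) is close enough
    to the headset to activate microphone streaming."""
    return math.dist(hand, headset) <= ACTIVATION_DISTANCE_M

def update_voice_gate(voice_stream, hand, headset) -> None:
    # Stream microphone 122 audio to the other users' speakers 120 only
    # while the hand is held within the threshold distance.
    if hand_within_threshold(hand, headset):
        voice_stream.start_streaming()
    else:
        voice_stream.stop_streaming()
```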
In the VR retail store design space, users as user avatars can work independently or collaboratively to design a space or spaces in a retail store. In this simulated environment, there may be open space and floor to work on new store designs. A user may access a menu of virtual retail display fixtures and features to add or insert into their open space. Once a fixture is added, the user can select and manipulate it by moving it around in any direction and rotating it anywhere from 0 to 360 degrees. As soon as the user deselects the fixture, other users in the VR environment may select and manipulate it. Therefore, a feature created by one user may be manipulated by other users in the virtual environment.
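This select/deselect behavior amounts to an exclusive, transferable lock on each fixture. A hypothetical sketch of such a lock manager follows; the class and method names are assumptions, not the system's actual API:

```python
class FixtureLockManager:
    """Grants one user at a time the right to manipulate a fixture."""

    def __init__(self) -> None:
        self.owners: dict[str, str] = {}   # fixture_id -> user_id

    def try_select(self, fixture_id: str, user_id: str) -> bool:
        """Grant exclusive manipulation rights if the fixture is free."""
        current = self.owners.get(fixture_id)
        if current is None or current == user_id:
            self.owners[fixture_id] = user_id
            return True
        return False                        # another avatar holds it

    def deselect(self, fixture_id: str, user_id: str) -> None:
        """Release the fixture so other users may select and move it."""
        if self.owners.get(fixture_id) == user_id:
            del self.owners[fixture_id]
```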
In the VR retail store design space, it is possible that an item a user selects to add to a store design is unavailable. This can occur if the user violates a standard design rule, such as an aisle spacing or physical boundary rule. The user can add other fixtures or features from menu items as the design develops. After the collaborative users have finished the design, or are taking a break from a still unfinished design, the design may be exported. The data may be exported into CAD drawings or 2D plan views, as well as into 3D models, including augmented reality files. While in the virtual design room, different design areas having design features can be simplified for purposes of image rendering. For example, a first area of the design room may be in the midst of being designed while a second area of the design room holds a design that is not currently being worked on. The features built in the second area are simplified to basic boxes for purposes of image rendering, because the second area is not the subject of the current designing and therefore need not be fully rendered. This helps keep the headset optimized for the virtual reality of the subject design.
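A sketch of this rendering simplification, with hypothetical scene-graph types standing in for the real ones (the area, fixture, and renderer objects and their fields are assumptions):

```python
def render_design_room(areas, active_area_id, renderer) -> None:
    """Draw the area under active design at full detail; draw every other
    area's fixtures as basic boxes to keep the headset frame rate high."""
    for area in areas:
        if area.id == active_area_id:
            for fixture in area.fixtures:
                renderer.draw_mesh(fixture.full_mesh)     # full-detail geometry
        else:
            for fixture in area.fixtures:
                renderer.draw_box(fixture.bounding_box)   # simplified box proxy
```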
When moving or dragging a fixture in the VR retail store space, there is a fast move or drag, in which the fixture moves in one-grid-block increments, or what would be one-foot increments in the real world. In addition, there is a slow move or drag, in which the fixture moves in increments smaller than one grid block, or what would be one-inch increments. The speed of movement (fast or slow) is determined by tracking camera(s) 105 in headset 102 detecting the speed of movement of the touch controller being used to direct the movement of the object.
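A minimal sketch of this two-speed snapping, assuming a controller-speed threshold; the 0.5 m/s value and the function name are illustrative assumptions:

```python
FOOT_M = 0.3048   # one grid block, expressed in meters
INCH_M = 0.0254
FAST_SPEED_M_PER_S = 0.5   # assumed controller-speed threshold

def snap_drag(coordinate_m: float, controller_speed_m_per_s: float) -> float:
    """Snap a dragged fixture's coordinate to the grid increment selected by
    how fast the touch controller is moving: feet when fast, inches when slow."""
    step = FOOT_M if controller_speed_m_per_s >= FAST_SPEED_M_PER_S else INCH_M
    return round(coordinate_m / step) * step
```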
Besides retail display fixtures being deleted, duplicated, inserted, moved or swapped, other features and objects may be manipulated. For example, the ceiling types may be swapped out. This gives users a sense of a drop-down ceiling versus an open ceiling or other types. In-store marketing (ISM) signs may also be swapped out in real time to give users a sense of which signage would be most effective. The users may view the virtual space from the floor level, as illustrated in the accompanying figures.
In still another exemplary collaborative VR space, users may enter a VR planogram design space for virtually merchandising a retail display fixture.
After entering the VR planogram design space, the user, as the user avatar, teleports to a targeted position or location to begin merchandising display fixture 248. When in the targeted position, the user depresses grip button 140 on left-hand touch controller 104a to view a virtual palette or menu of product assets 250. Menu of product assets 250 may include product assets provided by a third party.
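Because each virtual product asset is a dimensionally accurate representation of a real-world product, the system can limit how many assets fit on a fixture just as in a real store. A hypothetical sketch of that capacity check follows; the data classes and field names are assumptions:

```python
from dataclasses import dataclass

@dataclass
class ProductAsset:
    sku: str
    width_m: float
    depth_m: float
    height_m: float

@dataclass
class Shelf:
    width_m: float
    depth_m: float
    clearance_m: float   # vertical space beneath the shelf above

def max_units(shelf: Shelf, item: ProductAsset) -> int:
    """Units of one product that physically fit on the shelf, as they
    would on the corresponding real-world display fixture."""
    if item.height_m > shelf.clearance_m:
        return 0
    across = int(shelf.width_m // item.width_m)   # facings across the front
    deep = int(shelf.depth_m // item.depth_m)     # units front to back
    return across * deep
```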
While the merchandise being set on display fixture 248 in the example above may be rigid or boxed goods, soft goods such as folded or hanging clothing items may also be virtually merchandised.
In another embodiment, real-time, automatic, non-user manipulation of soft goods is provided during the setting of soft good merchandise. For example, a user avatar selects a VR folded configuration of a clothing item to set on a VR display fixture table. Upon duplication of the selected VR folded item, or selection of other VR folded items, and placement of the VR folded items on top of each other, the height dimension of the stacked VR items changes. In other words, the VR height of the stack of VR folded items may not equal a multiple of item heights in the folded configuration. The VR height of the stack of VR folded items is less than the multiple of item heights to account for the compression of soft goods when stacked on top of each other. Such logic may include a percentage change in height based on the number of VR folded items stacked together. In this way, an accurate representation of how many folded soft goods may be stacked together on a real-world display fixture is produced.
In another example, a user avatar selects a VR hanger configuration of a clothing item to set on a VR display fixture rack. Upon duplication of the selected VR hanger item, or selection of other VR hanger items, and placement of the VR hanger items on a rack, the width dimension of the hanging VR items changes. In other words, the VR width of the side-by-side hanging items may not equal a multiple of item widths in the hanger configuration. The VR width of the side-by-side hanging items may be less than the multiple of item widths to account for the compression of hanging soft goods when placed next to each other on a rack. Such logic may include a percentage change in width based on the number of VR hanger items next to each other. In this way, an accurate representation of how many hanging soft goods may be hung side-by-side on a real-world display fixture is produced.
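A sketch of this compression logic covering both cases, using assumed per-item compression percentages; the 15% and 30% values and the function names are illustrative, not taken from the description:

```python
def stacked_height(item_height_m: float, count: int,
                   compression: float = 0.15) -> float:
    """Height of `count` folded items: every item under another compresses,
    so the stack is shorter than count * item_height_m."""
    if count <= 0:
        return 0.0
    return item_height_m + (count - 1) * item_height_m * (1.0 - compression)

def hung_width(item_width_m: float, count: int,
               compression: float = 0.30) -> float:
    """Rack width consumed by `count` side-by-side hung items, which press
    together so the total is less than count * item_width_m."""
    if count <= 0:
        return 0.0
    return item_width_m + (count - 1) * item_width_m * (1.0 - compression)
```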
Although elements have been shown or described as separate embodiments above, portions of each embodiment may be combined with all or part of other embodiments described above.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms for implementing the claims.
The present application is based on and claims the benefit of U.S. provisional patent application Ser. No. 63/234,437, filed Aug. 18, 2021, the content of which is hereby incorporated by reference in its entirety.