The display screens with which many modern communication devices are equipped are typically designed to display a two-dimensional (2D) image from a single viewing perspective. As a result, despite the ability of those devices to display sharp, richly featured, high-definition images, interactive group communications such as video conferencing tend to be less engaging and immersive than they would be if the participants could be given the illusion of being together in person.
One conceivable improvement over the conventional approach of providing 2D images is to render group communications using 3D imagery. However, several significant obstacles to the wider use of 3D imagery remain. For example, projecting a 3D image typically requires multiple projectors, augmented reality (AR) headgear, and/or other complex display technologies to create the illusion of a real-world 3D image. Additional complications can arise if the 3D image is to be viewed from multiple perspectives.
There are provided communication systems and methods for generating a floating image of a remote venue, substantially as shown in and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
The following description contains specific information pertaining to implementations in the present disclosure. One skilled in the art will recognize that the present disclosure may be implemented in a manner different from that specifically discussed herein. The drawings in the present application and their accompanying detailed description are directed to merely exemplary implementations. Unless noted otherwise, like or corresponding elements among the figures may be indicated by like or corresponding reference numerals. Moreover, the drawings and illustrations in the present application are generally not to scale, and are not intended to correspond to actual relative dimensions.
It is further noted that, as used in the present application, the terms “central processing unit” or “CPU” and “graphics processing unit” or “GPU” have their customary meaning in the art. That is to say, a CPU includes an Arithmetic Logic Unit (ALU) for carrying out the arithmetic and logical operations of computing platform 102, as well as a Control Unit (CU) for retrieving programs, such as software code 108, from system memory 106. A GPU is configured to reduce the processing overhead of the CPU by performing computationally intensive graphics processing tasks.
In addition, for the purposes of the present application, the term “perspective” refers to the particular viewing angle from which an object, virtual object, or image is viewed by an observer.
Moreover, the terms “render” and “rendering” are defined to mean causing one or more images to appear on a display screen, such as display screen 160 for example. Thus, rendering an image may mean causing an entirely new image to appear on the display screen, or refreshing an image previously appearing on the display screen. With respect to the term “privacy filter,” as used in the present application, privacy filter refers to a film or a structure, such as a louvered structure, affixed to a display screen so as to prevent viewing of the display screen outside of a predetermined viewing angle.
Local users 168a and 168b may be positioned so as to view floating image 116 of remote venue 117 from a variety of perspectives. For example, in some implementations, users 168a and 168b may be situated so as to view floating image 116 of remote venue 117 from a number of discrete perspectives, such as three discrete perspectives located approximately 120° apart on an imaginary 360° circle surrounding floating image 116. However, in other implementations, users 168a and 168b may be able to view floating image 116 of remote venue 117 from the perspective of any position on such an imaginary circle surrounding floating image 116 of remote venue 117.
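By way of illustration only, the selection among such discrete perspectives can be sketched in a few lines of code. The following Python sketch is not part of the disclosed implementations, and all names and parameters in it are hypothetical; it merely shows how evenly spaced perspectives on an imaginary 360° circle might be enumerated, and how a viewer's position on that circle might be mapped to the nearest one.

```python
def perspective_angles(num_perspectives: int) -> list[float]:
    """Azimuths, in degrees, of evenly spaced perspectives, e.g.
    [0.0, 120.0, 240.0] for three perspectives located 120 degrees apart."""
    step = 360.0 / num_perspectives
    return [i * step for i in range(num_perspectives)]

def nearest_perspective(viewer_azimuth_deg: float, num_perspectives: int) -> float:
    """Map a viewer's position on the imaginary circle to the closest
    rendered perspective."""
    step = 360.0 / num_perspectives
    index = round((viewer_azimuth_deg % 360.0) / step) % num_perspectives
    return index * step

print(perspective_angles(3))          # [0.0, 120.0, 240.0]
print(nearest_perspective(100.0, 3))  # 120.0
```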
In some implementations, one or more of local users 168a and 168b may be interactively engaged with the remote venue depicted by floating image 116 via communication system 100 including computing platform 102, 360° camera 118, projection system 164, audio system 166, sensor network 120, and display screen 160. That is to say, in those implementations, CPU 112 of ASIC 110 may be configured to execute software code 108 to utilize transceiver 104, projection system 164, audio system 166, sensor network 120, GPU 114, and display screen 160 to generate and sustain floating image 116 of remote venue 117 during communications with remote venue 117.
Projection system 164 may include image projection components that are wholly integrated with display 190, may include image projection components controlled by but remote from display 190, or may be partially integrated with display 190 while including remote image projection components. Projection system 164 may include multiple projection sources, and may be configured to provide projection lighting of varying intensity and varying colors, for example.
Analogously, audio system 166 may be wholly integrated with display 190, may include elements, such as audio speakers, controlled by but remote from display 190, or may be partially integrated with display 190 while including remote audio elements. In one implementation, audio system 166 may include a theater quality Dolby® high definition (HD) surround-sound system, for example.
Transceiver 104 may be implemented as a wireless communication unit controlled by CPU 112 and enabling communication system 100 to exchange data with remote venue 117. For example, transceiver 104 may be implemented to support communication via WiFi, may take the form of a 3G or 4G wireless transceiver, or may be a 5G wireless transceiver configured to satisfy the IMT-2020 requirements established by the International Telecommunication Union (ITU).
It is noted that sensor network 120 is described in greater detail below.
It is noted that the specific sensors shown to be included among sensors 122 of sensor network 120 are merely exemplary, and in other implementations, sensors 122 of sensor network 120 may include more, or fewer, sensors than RFID sensor 122a, FR sensor 122b, ASR sensor 122c, OR sensor 122d, image sensor 122e, laser sensor 122f, and P/R sensor(s) 130. RFID sensor 122a, FR sensor 122b, ASR sensor 122c, OR sensor 122d, image sensor 122e, laser sensor 122f, and P/R sensor(s) 130 may be implemented using any suitable sensors for those respective functions, as known in the art. Microphone(s) 124 may include one or more stationary and/or moving microphone(s). For example, stationary microphone(s) of microphone(s) 124 may be distributed in a 360° array surrounding base 140 to enhance directional sensing of sound, such as speech, produced by one or more of local users 168a and 168b.
In some implementations, one or more moving microphone(s) of microphone(s) 124 may rotate in synchronization with rotor 144 for display 190. In those implementations, P/R sensor(s) 130 may be used in combination with microphone(s) 124 to identify the direction from which a sound sensed using microphone(s) 124 is received.
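A minimal sketch of that direction-sensing geometry follows, assuming an evenly spaced circular microphone array. A practical system would use beamforming or cross-correlation rather than this loudest-microphone heuristic, and every name below is hypothetical.

```python
def sound_direction(levels: list[float], rotor_angle_deg: float = 0.0) -> float:
    """levels[i] is the sound level at microphone i of an evenly spaced
    circular array. For microphones spinning with the rotor, the rotor angle
    reported by the P/R sensor(s) is added to recover a venue-fixed azimuth."""
    step = 360.0 / len(levels)
    loudest = max(range(len(levels)), key=lambda i: levels[i])
    return (loudest * step + rotor_angle_deg) % 360.0

# Eight microphones; the fifth (index 4, nominally at 180 degrees) is loudest.
print(sound_direction([0.1, 0.2, 0.3, 0.5, 0.9, 0.4, 0.2, 0.1]))  # 180.0
```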
Image sensor 122e may correspond to one or more sensors for obtaining visual images of local users 168a and 168b, as well as of the local venue in which communication system 100 and local users 168a and 168b are located. Image sensor 122e may be implemented as one or more stationary and/or rotating video cameras, for example, or as a vertical array of image capture pixels controlled by a physical or global electronic shutter and configured to rotate with display 190.
It is noted that the distribution of features identified by reference numbers 132a, 134a, 136a, 138a, 132b, 134b, 136b, and 138b between base sensor(s) 130a and rotating sensor(s) 130b is merely exemplary. In another implementation, for example, the positions of features 132a, 134a, 136a, 138a, 132b, 134b, 136b, and 138b may be reversed. That is to say, one or more of IR LED 132a, magnet 134a, visible light LED 136a, and glyph or visible marker 138a may be included as rotating sensor(s) 130b, while one or more of IR receiver 132b, Hall effect sensor 134b, photo diode 136b, and camera(s) 138b may be included as base sensor(s) 130a. It is further noted that camera(s) 138b may include one or more still camera(s) and/or one or more video camera(s), for example.
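As one hypothetical model of how such a magnet-and-Hall-effect-sensor pairing might track rotor position, the sketch below treats each Hall pulse as marking a full revolution and interpolates the angle between pulses from elapsed time. It is offered as an illustration only, not as the disclosed implementation, and all names in it are assumed.

```python
class RotorTracker:
    def __init__(self):
        self.last_pulse_time = None
        self.period = None          # seconds per revolution

    def on_hall_pulse(self, t: float) -> None:
        """Called when the magnet passes the Hall effect sensor (0 degrees)."""
        if self.last_pulse_time is not None:
            self.period = t - self.last_pulse_time
        self.last_pulse_time = t

    def angle_at(self, t: float) -> float:
        """Estimate rotor angle in degrees at time t, assuming a constant
        spin rate over the most recent revolution."""
        if self.period is None:
            return 0.0
        return 360.0 * ((t - self.last_pulse_time) / self.period % 1.0)

tracker = RotorTracker()
tracker.on_hall_pulse(0.00)
tracker.on_hall_pulse(0.05)        # one revolution in 50 ms -> 20 rev/s
print(tracker.angle_at(0.0625))    # 90.0 degrees past the sensor
```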
Display 290A includes image capture device 258 mounted on display 290A and configured to rotate with display 290A, as well as display screen 260 having optional privacy filter 266 affixed to display surface 236 of display screen 260.
Communication system 200A corresponds in general to communication system 100 and may share any of the characteristics attributed to that corresponding system above.
Moreover, display 290A including display screen 260 corresponds in general to display 190 including display screen 160, and may share any of the characteristics attributed to those corresponding features above.
It is noted that remote venue 217 corresponds to remote venue 117, described above.
In some implementations, display screen 160/260 may be a liquid-crystal display (LCD) screen or an organic light-emitting diode (OLED) display screen, for example. Moreover, in some implementations, computing platform 102 and display 190/290A may be integrated with a mobile communication device configured to spin with rotor 144/244. For example, computing platform 102 and display 190/290A may be provided by a smartphone or a tablet computer. It is noted that although display screen 160/260 is depicted as a substantially flat display screen in the accompanying figures, that depiction is merely exemplary.
Although, in some implementations, optional privacy filter 266 may be an advantageous or desirable feature for reducing flicker and/or blur, in some other implementations it may be preferable to omit optional privacy filter 266. For example, in implementations in which true volumetric images are to be displayed as floating image 116, privacy filter 266 may be preferentially omitted.
It is noted that CPU 112 may execute software code 108 to control motor 142/242 in order to spin rotor 144/244 and display 190/290A about vertical axis 154/254 at a varying spin rate, or at a substantially constant predetermined spin rate. It is also noted that spin direction 256 may be in either a counterclockwise direction with respect to the plane of horizontal axis 152/252, or in a clockwise direction with respect to that plane.
In some implementations, CPU 112 may execute software code 108 to use GPU 114 to modify 2D image 219 as rotor 144/244 and display 190/290A rotate, so as to generate multiple perspectives of floating image 116 that are appropriate respectively to the locations of each of local users 168a and 168b within the local venue.
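A minimal sketch of this per-user rendering logic, assuming user locations are known from the tracking described above, follows; the renderer itself is stubbed out and all names are hypothetical.

```python
def frame_for_rotor_angle(rotor_angle_deg: float,
                          user_azimuths_deg: dict[str, float]) -> str:
    """Pick the user whose azimuth is closest to the screen's current facing,
    and render that user's perspective of the remote venue (stubbed here)."""
    facing = rotor_angle_deg % 360.0

    def angular_distance(a: float) -> float:
        d = abs(a - facing) % 360.0
        return min(d, 360.0 - d)

    user = min(user_azimuths_deg, key=lambda u: angular_distance(user_azimuths_deg[u]))
    return f"perspective from azimuth {user_azimuths_deg[user]:.0f} for {user}"

users = {"user_168a": 30.0, "user_168b": 210.0}
print(frame_for_rotor_angle(45.0, users))   # renders user_168a's perspective
print(frame_for_rotor_angle(200.0, users))  # renders user_168b's perspective
```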
Image capture device 258 may include one or more image sensors 122e for obtaining visual images of local users 168a and 168b, as well as of the local venue in which communication system 100/200A and local users 168a and 168b are situated. Image capture device 258 may be implemented as one or more video cameras, for example, or as a vertical array of image capture pixels controlled by a physical or global electronic shutter and configured to rotate with display 190/290A.
Display 290B corresponds in general to display 190/290A and may share any of the characteristics attributed to those corresponding features above.
Display 290B differs from display 290A in that display 290B includes two display screens: first display screen 260a and second display screen 260b.
Although each of first and second display screens 260a and 260b is shown to have 2D image 219 of remote venue 117/217 rendered thereon, in some implementations, first and second display screens 260a and 260b may show different respective perspectives of remote venue 117/217. That is to say, 2D image 219 corresponding to a first perspective of remote venue 117/217 may be rendered on first display screen 260a while a second, different, perspective of remote venue 117/217 is rendered on second display screen 260b. For example, CPU 112 may execute software code 108 to use GPU 114 to render a particular perspective of remote venue 117/217 on first display screen 260a, while substantially concurrently rendering a 180° opposite perspective of remote venue 117/217 on second display screen 260b.
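Purely as an illustrative sketch with hypothetical names, the pairing of the two screens can be expressed as follows:

```python
def screen_perspectives(rotor_angle_deg: float) -> tuple[float, float]:
    """For any rotor facing, the first screen shows the remote-venue
    perspective at that facing; the second shows the 180-degree opposite."""
    first = rotor_angle_deg % 360.0
    second = (first + 180.0) % 360.0
    return first, second

print(screen_perspectives(45.0))  # (45.0, 225.0)
```

One practical consequence of this geometry, offered as an observation rather than as a statement from the disclosure, is that two opposed screens can present the full circle of perspectives at roughly half the spin rate a single screen would require.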
Communication system 300a corresponds in general to communication system 100/200A/200B and may share any of the characteristics attributed to those corresponding systems above.
Moreover, display 390a including display screen 360 corresponds in general to display 190/290A/290B including display screen(s) 160/260/260a/260b, and may share any of the characteristics attributed to those corresponding features above.
It is noted that floating image 316 of remote venue 117/217/317 corresponds to floating image 116, described above.
Each of local venue 370 and remote venue 117/217/317 may correspond to a video conferencing venue in an office complex, hospital, university, or hotel business center, for example. In implementations in which local venue 370 and remote venue 117/217/317 are video conferencing venues, for example, local users 168a/368a, 168b/368b, 368c, and 368d may correspond to local participants in a video conference, while remote users 369a, 369b, 369c, and 369d may correspond to remote participants in the video conference.
Substantially concurrently with the spinning of display 390a to generate floating image 116/316, local image data 357a of local venue 370, including local users 168a/368a, 168b/368b, 368c, and 368d, may be obtained by communication system 300a using one or more of camera(s) 138b, 360° camera 118, laser 119 and laser sensor 122f, and image capture device 258. Local image data 357a, along with local audio data obtained using microphone(s) 124, for example, may be transmitted as local audio-visual data 359a to remote communication system 300b at remote video conferencing venue 117/217/317 via wireless communication link 361.
By way of example, communication system 100/200A/200B/300a and remote communication system 300b can be used for video conferencing in a number of different exemplary implementations. For example, in one implementation, an image feed captured from remote communication system 300b can be translated into a 1:1 recreation or mirror image that is displayed in local venue 370 by communication system 100/200A/200B/300a. For instance, the position of a camera view on remote communication system 300b can be mapped to the opposite side on communication system 100/200A/200B/300a. In this way, display 190/290A/290B/390a of communication system 100/200A/200B/300a would act as a cylindrical window, allowing local users 168a/368a, 168b/368b, 368c, and 368d to walk around display 190/290A/290B/390a and observe remote venue 117/217/317 from different angles.
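The cylindrical-window mapping can be sketched as follows, assuming, purely for illustration, that remote image data arrives as an equirectangular 360° frame; the slice extraction and all names below are hypothetical.

```python
import numpy as np

def window_slice(frame_360: np.ndarray, viewer_azimuth_deg: float,
                 fov_deg: float = 90.0) -> np.ndarray:
    """frame_360: equirectangular image of shape (height, width, 3) covering
    0..360 degrees horizontally. Returns the horizontal slice centered on the
    azimuth opposite the viewer, so the display behaves like a window."""
    h, w, _ = frame_360.shape
    center = ((viewer_azimuth_deg + 180.0) % 360.0) / 360.0 * w
    half = int(fov_deg / 360.0 * w / 2)
    cols = np.arange(int(center) - half, int(center) + half) % w  # wrap around
    return frame_360[:, cols]

frame = np.zeros((512, 2048, 3), dtype=np.uint8)
print(window_slice(frame, viewer_azimuth_deg=30.0).shape)  # (512, 512, 3)
```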
Alternatively, in one implementation, each of local users 168a/368a, 168b/368b, 368c, and 368d could view remote venue 117/217/317 from a perspective substantially matching their individual locations within local venue 370, reduced in size to fit display screen 160/260/260a/260b/360.
In implementations in which remote display 390b includes a traditional flat screen as display screen 360, a distorted fisheye view of local image data 357a may be rendered on display 390b. In those implementations, remote users 369a, 369b, 369c, and 369d would see an expanded/warped view of local venue 370 generated from a set of images obtained and transmitted by communication system 100/200A/200B/300a. Other implementations can include additional functionality using different forms of tracking. For example, facial tracking and voice tracking can be used to direct attention to a specific person in the venue, whether local or remote (e.g., the person presently speaking during the video conference).
Base 440 and rotor 444 correspond in general to base 140/240/340 and rotor 144/244/344, and may share any of the characteristics attributed to those corresponding features above. In addition, local users 468a and 468b correspond respectively to local users 168a/368a and 168b/368b, described above.
Communication system 500 corresponds in general to communication system 100/200A/200B/300a/400 and may share any of the characteristics attributed to those corresponding systems above.
Moreover, display 590 including display screen 560 corresponds in general to display 190/290A/290B/390a including display screen 160/260/260a/260b/360, and may share any of the characteristics attributed to those corresponding features above.
Wearable floating image tracking sensor 578 may be implemented as an augmented reality (AR) or virtual reality (VR) viewing device, for example, worn by local user 568 as a head-mounted tracking sensor. Wearable floating image tracking sensor 578 is in communication with computing platform 102 of communication system 100/200A/200B/300a/400/500, through sensor network 120 or transceiver 104, for example, and via wireless communication link 576. As local user 568 moves within local venue 370/470/570, for example from location 569a to location 569b, wearable floating image tracking sensor 578 enables the generation of perspectives of floating image 116/316 of remote venue 117/217/317 appropriate respectively to locations 569a and 569b in local venue 370/470/570 relative to floating image 116/316.
For example, wearable floating image tracking sensor 578 enables local user 568 to view floating image 116/316 of remote venue 117/217/317 from first floating image perspective 516a when local user 568 is at location 569a, and to advantageously view floating image 116/316 of remote venue 117/217/317 from location-appropriate second floating image perspective 516b when local user 568 is at location 569b. Moreover, in some implementations, local user 568 can utilize wearable floating image tracking sensor 578 to look around remote venue 117/217/317 as if standing where remote communication system 300b is located in remote venue 117/217/317.
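Converting a tracked head position to a display-relative azimuth, from which a location-appropriate perspective can then be selected as sketched earlier, might look like the following; the coordinate conventions and names are assumptions for illustration.

```python
import math

def viewer_azimuth(display_xy: tuple[float, float],
                   viewer_xy: tuple[float, float]) -> float:
    """Azimuth of the viewer about the display, in degrees, measured
    counterclockwise from the +x axis of a venue-fixed frame."""
    dx = viewer_xy[0] - display_xy[0]
    dy = viewer_xy[1] - display_xy[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

display = (0.0, 0.0)
print(viewer_azimuth(display, (1.0, 1.0)))   # 45.0  (location 569a, say)
print(viewer_azimuth(display, (-1.0, 0.0)))  # 180.0 (location 569b, say)
```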
The functionality of communication system 100/200A/200B/300a/400/500 will be further described by reference to flowchart 680.
Flowchart 680 continues with rendering image data 357b on display screen 160/260/260a/260b/360/560 of communication system 100/200A/200B/300a/400/500 while spinning display 190/290A/290B/390a/590 using motor 142/242 and rotor 144/244/344/444/544 to generate floating image 116/316 of remote venue 117/217/317 (action 684). Rendering of image data 357b on display screen 160/260/260a/260b/360/560 of communication system 100/200A/200B/300a/400/500 while spinning display 190/290A/290B/390a/590 to generate floating image 116/316 of remote venue 117/217/317 may be performed by software code 108, executed by CPU 112 of ASIC 110, and, according to some implementations, using GPU 114 of ASIC 110.
CPU 112 of ASIC 110 may be configured to execute software code 108 to control motor 142/242 to spin rotor 144/244/344/444/544 and display 190/290A/290B/390a/590 about vertical axis 154/254 at a variable spin rate, or at a predetermined, substantially constant spin rate, which may be on the order of tens or hundreds of rotations per second, for example.
According to various implementations of the present inventive concepts, the spin rate of rotor 144/244/344/444/544 and display 190/290A/290B/390a/590 may depend in part on the frame rate of display 190/290A/290B/390a/590. As known in the art, the term “frame rate” refers to the rate or frequency with which a new frame can be rendered on a display, expressed in frames per second (fps). Thus, frame rate is to be distinguished from refresh rate, which is the rate or frequency with which the same frame can be redrawn on a display. In addition to the frame rate of display 190/290A/290B/390a/590, the spin rate with which rotor 144/244/344/444/544 and display 190/290A/290B/390a/590 spin or rotate may be based on the number of perspectives of floating image 116/316 of remote venue 117/217/317 being displayed by communication system 100/200A/200B/300a/400/500.
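As a worked example of that relationship, stated as an assumption consistent with the text rather than as a formula quoted from it: if each revolution must present one frame per displayed perspective, the required spin rate follows directly from the display's frame rate and the number of perspectives.

```python
def required_spin_rate(frame_rate_fps: float, num_perspectives: int) -> float:
    """Spin rate in revolutions per second such that every perspective is
    redrawn once per revolution at the display's frame rate."""
    return frame_rate_fps / num_perspectives

# A hypothetical 240 fps display showing three perspectives 120 degrees apart:
print(required_spin_rate(240.0, 3))  # 80.0 revolutions per second
```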
Flowchart 680 continues with, concurrently with spinning display 190/290A/290B/390a/590, using one or more of 360° camera 118, laser 119 and laser sensor 122f, camera(s) 138b, and image capture device 258 to obtain local image data 357a of local venue 370/470/570 (action 686). It is noted that, as used in the present application, the expression “image capture device” may refer to any or all of the features disclosed herein as 360° camera 118, laser 119 and laser sensor 122f, camera(s) 138b, and image capture device 258. Moreover, and as discussed above, image capture device 258 includes an image sensor configured to rotate with display 190/290A/290B/390a/590. For example, image capture device 258 may include a vertical array of image sensors, such as a vertical array of approximately 1,024 or 2,048 sensors, mounted on display 190/290A/290B/390a/590.
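Purely for illustration, assembling such column captures into a 360° image of the local venue might be sketched as follows; the array height and per-revolution firing count below are hypothetical.

```python
import numpy as np

def assemble_panorama(columns: list[np.ndarray]) -> np.ndarray:
    """columns: per-shutter captures from the rotating vertical array, each of
    shape (array_height, 3), ordered by rotor angle over one full revolution.
    Returns an equirectangular-style image of shape (array_height, n, 3)."""
    return np.stack(columns, axis=1)

# A 1,024-pixel vertical array fired 2,000 times per revolution:
cols = [np.zeros((1024, 3), dtype=np.uint8) for _ in range(2000)]
print(assemble_panorama(cols).shape)  # (1024, 2000, 3)
```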
It is noted that in implementations in which local image data is obtained using 360° camera 118, local image data 357a includes 360° image data of local venue 370/470/570. Obtaining local image data 357a of local venue 370/470/570 concurrently with spinning display 190/290A/290B/390a/590 may be performed by software code 108, executed by CPU 112 of ASIC 110, and, according to some implementations, using GPU 114 of ASIC 110.
Flowchart 680 can conclude with transmitting, using transceiver 104 of communication system 100/200A/200B/300a/400/500, local audio-visual data 359a including local image data 357a to remote venue 117/217/317 (action 688). Local audio-visual data 359a including local image data 357a may be transmitted to remote communication system 300b of remote venue 117/217/317 via wireless communication link 361, for example, by software code 108, executed on communication system 100/200A/200B/300a/400/500 by CPU 112 of ASIC 110, and using transceiver 104.
Thus, the present application discloses communication systems and methods for generating a floating image of a remote venue. By spinning a display upon which a 2D image of a remote venue is rendered, the present communication solution is capable of generating an apparently floating image of the remote venue that may appear to be realistically 3D. In addition, by using an image capture device to obtain local image data of a local venue concurrently with spinning of the display, the present communication solution generates data that can be transmitted to the remote venue in real-time. As a result, the present communication solution advantageously enables realistic, engaging, and immersive group interactions among group participants who are physically remote from one another.
From the above description it is manifest that various techniques can be used for implementing the concepts described in the present application without departing from the scope of those concepts. Moreover, while the concepts have been described with specific reference to certain implementations, a person of ordinary skill in the art would recognize that changes can be made in form and detail without departing from the scope of those concepts. As such, the described implementations are to be considered in all respects as illustrative and not restrictive. It should also be understood that the present application is not limited to the particular implementations described herein, but many rearrangements, modifications, and substitutions are possible without departing from the scope of the present disclosure.