The present invention is directed to configurable display adaptor devices having transparent or semi-transparent display surfaces.
There exist in the art teleprompters and heads-up displays. These types of devices project content generated by an electronic display onto a semi-transparent surface. An observer of this semi-transparent surface is able to view the projected content as well as events transpiring behind the display. As a result, an observer is able to evaluate both an event and contextual information regarding that event simultaneously. However, such devices have drawbacks and limitations.
Teleprompters and the like are commonly used only for displaying text and not for video or mixed media content. Furthermore, teleprompters require custom monitor setups, black traps, and other specialized, expensive equipment. This equipment must be set up with some advance notice and is often cumbersome for a single individual to use.
What is needed in the technical field are easy-to-use, portable, cost-effective devices that provide semi-transparent surfaces for use in connection with widely available display device hardware.
In one particular implementation of the display device adaptor described herein, a supporting body is provided that defines a first plane and is configured to support a light projection or electronic display device. The display device adaptor also includes a beam splitter mirror configured to transmit a portion of light incident upon a surface and reflect a portion of light incident upon that surface. The beam splitter mirror is oriented such that the light generated by the light projection or electronic display device is reflected by the beam splitter mirror. The beam splitter mirror is coupled to the supporting body and pivotable about an axis. Here, the beam splitter mirror is configured to move from a closed position, oriented in parallel with the first plane defined by the supporting body, to a viewing position that orients the beam splitter mirror at an angle relative to the first plane. For instance, the beam splitter is inclined at a 45-degree angle relative to the plane defined by the supporting body. The display device adaptor also includes a coupling device that attaches the display device adaptor to a surface.
In a particular further implementation of the display device adaptor, the display device supported by the supporting body is a portable computing device having at least one output display. In a further configuration, the supporting body also includes a securing device that secures the display device (such as a smartphone) to the supporting body to prevent slippage.
In yet a further implementation, the supporting body also includes a power supply and/or a wireless charging interface to charge or power a portable display device.
The invention is illustrated in the figures of the accompanying drawings which are meant to be exemplary and not limiting, in which like references are intended to refer to like or corresponding parts, and in which:
By way of broad overview, and with reference to
By way of non-limiting example, the display device adaptor 100 positions a beam splitter mirror relative to the visible content generated by a display device such that the light generated by the display device is reflected off the beam splitter mirror to an observer and the light originating from behind the beam splitter mirror is also transmitted to the observer. Thus, the display device adaptor 100 described enables an observer to view the content provided by the display device while also viewing events transpiring within the field of view behind the semi-transparent display or surface.
With particular reference to
In one or more configurations, the supporting body 102 is formed of molded or cast materials, such as thermoplastic, resins and the like. Alternatively, the supporting body 102 is formed of metal, synthetic, natural, composite, or other commonly used materials that are suitable for the forms and purposes described herein.
In a particular implementation, the supporting body 102 couples to a beam splitter mirror 104. Beam splitter mirrors, beam splitter glass, one-way mirrors and other similar optical devices simultaneously partially reflect and transmit light. In one particular implementation, a sheet of glass or plastic is coated with, or is encased within, a thin, near-transparent layer of metal (for example aluminum). This optical thin-film coating allows the ratio of reflection to transmission of light to be set at specific values. By way of non-limiting example, the beam splitter mirror can be set to a 70%/30% ratio of transmittance to reflectance. Alternatively, the beam splitter mirror can be manufactured to have a 50%/50% ratio of transmittance to reflectance. Because of the reflective and transmissive properties of beam splitter optical devices, such as beam splitter mirror 104, a light source can be reflected off of the mirror to an observer, while the same observer is able to observe light passing through the beam splitter mirror 104.
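As a simple worked illustration of such ratios (the luminance values below are illustrative assumptions only and are not limiting), with transmittance T and reflectance R the intensities reaching the observer are:

\[
I_{\text{transmitted}} = T \cdot I_{\text{scene}}, \qquad I_{\text{reflected}} = R \cdot I_{\text{display}}, \qquad T + R \le 1 .
\]

For a 70%/30% beam splitter and a display emitting roughly 500 cd/m^2, the observer would see on the order of 0.30 x 500 = 150 cd/m^2 of reflected display content, superimposed on approximately 70% of the luminance of the scene behind the mirror.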
By way of further detail, in one implementation the beam splitter mirrors 104 include a glass plate with a reflective dielectric coating on one side (the so-called “reflective” side) that provides a phase shift of 0 or π for light incident upon the beam splitter, depending on the side from which the light is incident. Without being held to any particular theory of operation, light waves incident upon the reflective side of the beam splitter glass are phase-shifted by π, whereas light incident upon the non-reflective side has no phase shift. As shown in
As shown in
In one particular orientation, as shown in
As shown with respect to
As shown in
In the illustrated implementations of
Returning to
As additionally shown in
In a particular non-limiting implementation, the output of the power supply 112 is in communication with an inductive charging device. Here, inductive charging devices are devices configured to transmit power (electrical energy) wirelessly to a receiving device equipped with a suitable wireless power interface. Where the display device 501 includes such interfaces, the power supply 112 supplies power to the display device 501.
In yet a further implementation, the supporting body 102 includes or incorporates one or more near field communication tag(s) 808. For instance, the supporting body 102 includes Near Field Communication (NFC) tags of types 1 through 5 that contain data in the NFC Data Exchange Format (NDEF). Such NFC tags provide a suitably equipped computing device with data stored within the NFC tag.
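By way of a hedged, illustrative sketch only (the record contents below are hypothetical and are not drawn from any particular implementation described herein), a minimal NDEF text record of the kind such a tag might carry can be assembled as follows:

# Minimal sketch: building an NDEF "Text" record by hand.
# Assumes a short record (SR=1) with a well-known type (TNF=0x01), type "T".
def make_ndef_text_record(text: str, lang: str = "en") -> bytes:
    # Payload of a Text record: status byte (language-code length), language code, UTF-8 text.
    payload = bytes([len(lang)]) + lang.encode("ascii") + text.encode("utf-8")
    header = bytes([
        0xD1,          # MB=1, ME=1, SR=1, TNF=0x01 (well-known)
        0x01,          # type length ("T" is one byte)
        len(payload),  # payload length (short record, under 256 bytes)
    ])
    return header + b"T" + payload

# Example: hypothetical data a suitably equipped computing device could read from the tag.
record = make_ndef_text_record("display adaptor configuration v1")
print(record.hex())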
With reference to
In one or more particular implementations as provided in
Processors, computing elements and microprocessors described herein are, in one or more implementations, connected, directly or indirectly, to one or more memory storage devices (memories). The memory is a persistent or non-persistent storage device that is operative to store an operating system for the processor in addition to one or more software modules. In accordance with one or more embodiments, the memory comprises one or more volatile and non-volatile memories, such as Read Only Memory (“ROM”), Random Access Memory (“RAM”), Electrically Erasable Programmable Read-Only Memory (“EEPROM”), Phase Change Memory (“PCM”), Single In-line Memory (“SIMM”), Dual In-line Memory (“DIMM”) or other memory types. Such memories can be fixed or removable, as is known to those of ordinary skill in the art, such as through the use of removable media cards or modules. The computer memories may also comprise secondary computer memory, such as magnetic or optical disk drives or flash memory, that provides long-term storage of data in a manner similar to the persistent memory device. In one or more embodiments, the memory of the processors provides for storage of application programs and data files when needed.
It will be further appreciated that computers, processors or computing devices described herein can communicate with one or more remote networks using USB, digital input/output pins, eSATA, parallel ports, serial ports, FIREWIRE, Wi-Fi, Bluetooth, or other communication interfaces. In a particular configuration, computing devices, processors or computers provided herein may be further configurable through hardware and software modules so as to connect to one or more remote servers, computers, peripherals or other hardware using standard or custom communication protocols and settings (e.g., TCP/IP, etc.) either through a local or remote network or through the Internet. Computing devices, processors or computers provided herein may utilize wired or wireless communication means, such as, but not limited to, CDMA, GSM, Ethernet, Wi-Fi, Bluetooth, USB, serial communication protocols and hardware to connect to one or more access points, exchanges, network nodes or network routers.
The processors or computers described are configured to execute code written in a standard, custom, proprietary or modified programming language such as a standard set, subset, superset or extended set of JavaScript, PHP, Ruby, Scala, Erlang, C, C++, Objective C, Swift, C#, Java, Assembly, Go, Python, Perl, R, Visual Basic, Lisp, or Julia or any other object-oriented, functional or other paradigm-based programming language.
While this specification contains many specific embodiment details, these should not be construed as limitations on the scope of any embodiment or of what can be claimed, but rather as descriptions of features that can be specific to particular embodiments. Certain features that are described in this specification in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable sub-combination. Moreover, although features can be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination can be directed to a sub-combination or variation of a sub-combination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing can be advantageous. Moreover, the separation of various system components in the embodiments described above should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products.
This application is now further described below, with reference to an alternative implementation.
By way of introduction and overview, in one or more implementations the present application provides systems and methods that include a display device that provides a three-dimensional, holographic appearance without requiring the use of laser light and associated technology to generate the appearance. In one or more implementations, the display device is configured with a crystalline portion, through which at least some visual content is displayed. By providing visual content through a transparent or translucent material, such as glass, plastic, crystal, or other suitable material, a futuristic holographic-looking display is provided, including in color.
In one or more implementations, three-dimensional color hologram-looking images are projected as a function of hardware and software. Such images can be provided in a pre-recorded fashion, such as pre-recorded messages. Alternatively or in addition, the images can be provided in a dynamic environment, such as provided in TEAMTIME, which supports video conferencing and is configured to display individuals within the crystal or crystal-like material. As a person speaks, (s)he is featured individually within the crystal or crystal-like material, which gives the appearance of a person physically present in the room, even though that person may be thousands of miles away.
In one or more implementations, three-dimensionality is supported as a function of etching the crystalline structure, such as by a laser. Software can be executed by one or more processors to control an etching unit. For example, the head of a laser passes back and forth over the crystal or crystal-like material, which etches a shape into the crystal as a function of the focused beam of light and the resulting heat on the crystal or crystal-like material. In one or more implementations, subsurface laser engraving is employed that uses a physical mask, which ensures improved detail and can eliminate stray errors. This process effectively cuts the crystal or crystal-like material and creates a design. The portion of the mask is removed, thereby exposing the crystal or crystal-like material that is etched. The three-dimensional map inside the crystal or crystal-like material features a dithering effect around a core generic head. A shape of a human head accommodates different hairstyles, facial structures, glasses or other accessories, or the like, associated with respective persons being displayed.
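Purely as an illustrative sketch of how software might convert a model of a generic head into dithered etch points (the depth-map representation, point spacing, and density below are assumptions and do not describe the actual etching control software):

import numpy as np

# Sketch: convert a depth map of a generic head into a dithered cloud of
# candidate etch points for subsurface engraving.  Spacing and density are
# arbitrary example values.
def dither_etch_points(depth_map: np.ndarray, spacing_mm: float = 0.1,
                       density: float = 0.35) -> list:
    rng = np.random.default_rng(0)
    points = []
    rows, cols = depth_map.shape
    for r in range(rows):
        for c in range(cols):
            z = depth_map[r, c]
            if z > 0 and rng.random() < density:  # stochastic dithering
                points.append((c * spacing_mm, r * spacing_mm, float(z)))
    return points

# Example: a flat 10 x 10 mm patch etched 5 mm below the surface.
demo = dither_etch_points(np.full((100, 100), 5.0))
print(len(demo), "etch points")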
In one or more implementations, the present application includes one or more modules that include code that, when executed by a processor, configures the processor to provide content within a predefined video frame shown on each of at least one display device. The module(s) configure the processor to provide the content such that the content does not appear to leave the frame, notwithstanding movement of the subject and/or capture device during capture.
For example, an area of a person's upper body (e.g., a person's head or face) is captured by a camera during a videoconference session. During movement (subject and/or camera movement), the body portion is tracked and the person's face (or upper body) is maintained within a centered portion of a display screen. In one or more implementations, a graphical user interface is provided on at least one computing device that includes one or more selectable controls and/or options, e.g., cursor controls, selection controls (e.g., round, rectangular, free-form selection tools), or other suitable controls. During the point of capture (or at a point before or after), the user makes a selection, such as by a single-click action, a dragged selection of a region, or the like. The selection can be used by one or more processors to define a region within the frame where particular content (e.g., a person's head or face) is to be displayed, regardless of movement. Providing controls to enable users to make specific selections within a frame increases flexibility beyond simply confining content to the center of the frame. In addition to selecting the region where content is to be displayed, a user can use one or more controls to define a particular area of the subject (e.g., a person, an animal, a vehicle or virtually any movable or non-movable subject) to be displayed in a defined region within a display. For example, by clicking on and/or selecting within an area associated with the subject, one or more processors can define a respective region to be tracked and maintained within the center or another previously defined region within the display, regardless of movement. Thus, as shown and described herein, the present application provides the benefit of keeping a face (head, shoulders, etc.) positioned (e.g., centered) in a display screen during a videoconference or other video experience, and of offsetting distracting movements.
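As a minimal, non-limiting sketch of keeping a detected face centered in an output crop, assuming OpenCV's bundled Haar cascade detector stands in for the tracking step (the actual tracking approach described herein is not limited to this technique, and the output dimensions are illustrative):

import cv2

# Sketch: detect the largest face and crop the frame so that face stays centered.
# Assumes the captured frame is at least out_w x out_h pixels.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def center_face(frame, out_w=640, out_h=480):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return cv2.resize(frame, (out_w, out_h))        # no face: fall back to the full frame
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detected face
    cx, cy = x + w // 2, y + h // 2                     # face center
    x0 = max(0, min(frame.shape[1] - out_w, cx - out_w // 2))
    y0 = max(0, min(frame.shape[0] - out_h, cy - out_h // 2))
    return frame[y0:y0 + out_h, x0:x0 + out_w]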
In one or more implementations, confining the display to a specific area of a subject within a specific region of a display can be accomplished in various ways. For example, a capture device, such as a camera, can be configured with one or more gyroscopes that detect particular camera movement. For example, as a user inadvertently moves the capture device, the area of the subject previously defined by the user moves in and out of the frame. One or more modules executing on one or more processors track the area substantially in real time, and the images are adjusted to offset the inadvertent movement in order to maintain the positioning of the area of the subject within the frame. In one or more implementations, a calculation can be made in order to determine the percentage of the total subject to be included in the frame (e.g., X and Y coordinates of a face within the total frame). The calculation can be used to define vertical and/or horizontal percentage values for mapping the area of the subject to be maintained in a specific region within the frame. This can be accomplished by overscanning or another similar technique to preclude the entire captured frame from being displayed in the restricted field of view of the display device. As the capture device moves, the one or more gyroscopes detect the movements and information representing the motion is received and processed by one or more processors to adjust (e.g., reposition and/or resize) the overscanned area and maintain the area of the subject in the predefined region within the frame.
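The following is a hedged sketch of one possible overscan-and-offset calculation of the kind described above; the field-of-view values, overscan fraction, and sign conventions are illustrative assumptions only:

# Sketch: overscan the captured frame and shift the displayed crop to offset
# small camera rotations reported by a gyroscope.
def stabilized_crop(frame_w, frame_h, yaw_deg, pitch_deg,
                    overscan=0.8, fov_h_deg=60.0, fov_v_deg=40.0):
    crop_w, crop_h = int(frame_w * overscan), int(frame_h * overscan)
    # Map angular motion to a pixel shift (fraction of frame per degree of rotation).
    dx = int(-yaw_deg / fov_h_deg * frame_w)
    dy = int(pitch_deg / fov_v_deg * frame_h)
    x0 = min(max((frame_w - crop_w) // 2 + dx, 0), frame_w - crop_w)
    y0 = min(max((frame_h - crop_h) // 2 + dy, 0), frame_h - crop_h)
    return x0, y0, crop_w, crop_h  # crop rectangle to display

# Example: a 1920x1080 frame with 1.5 degrees of inadvertent yaw.
print(stabilized_crop(1920, 1080, yaw_deg=1.5, pitch_deg=0.0))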
Accordingly, in one or more implementations head tracking and corresponding display is provided. The present application supports proper projection mapping thereof onto a specific portion of a crystal or crystal-like structure, such as a 3-D surface comprising an etched masked surface 906. In this way, a person's face that is captured by an image receiver, such as a camera, is tracked, rescaled and then displayed in an accurate position substantially in real time.
In addition to registering the display accurately, the present application supports forms of segmentation to separate a subject (e.g., a person) from the background. In one or more implementations, a virtual selection in an image of the subject is made substantially in real time, and any non-selected portion of the image is replaced with a solid black color. In this way, the subject is segmented from the background, which substantially improves the appearance of the subject in the three-dimensional display device.
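A minimal sketch of the black-out step follows, assuming a binary subject mask has already been produced by some segmentation step (the mask source is deliberately left open):

import numpy as np

# Sketch: keep only the selected subject; every non-selected pixel becomes solid
# black, so unlit regions effectively disappear when displayed through the crystal.
def black_out_background(frame: np.ndarray, subject_mask: np.ndarray) -> np.ndarray:
    keep = subject_mask.astype(bool)
    out = np.zeros_like(frame)
    out[keep] = frame[keep]
    return out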
In one or more implementations, a gyroscope may not be accurate enough to provide the information necessary to maintain the area of the subject precisely within the defined region. Moreover, one or more image capture devices may not be configured with a gyroscope. In such (or similar) circumstances, the present application can include one or more modules to provide subject stabilization. Overscanning, as described above, is supported to preclude the entire captured image from being displayed in the entire frame of the display device. For example, the captured frame is cropped and the cropped portion is modified in response to detected movement (e.g., camera movement and/or subject movement). As the person or camera moves, the image(s) can be manipulated to maintain the subject (e.g., face) in the center of the display.
Referring to
Three-dimensional display device 902 can be configured with various circuitry and components, some or all of which are displayed in
In addition, a visual capture device (e.g., a camera) 908 can be included as depicted in
Three-dimensional display device 902 and/or any user computing device 904 can also include one or more input or output (“I/O”) devices and interfaces 225 which are provided to allow a user to provide input to, receive output from, and otherwise transfer data to and from the system. These I/O devices may include a mouse, keypad or a keyboard, a touch panel or a multi-touch input panel, camera, network interface, modem, other known I/O devices or a combination of such I/O devices (not shown). The touch input panel may be a single touch input panel which is activated with a stylus or a finger or a multi-touch input panel which is activated by one finger or a stylus or multiple fingers, and the panel is capable of distinguishing between one or two or three or more touches and is capable of providing inputs derived from those touches to the three-dimensional display device 902 and/or user computing device 904. The I/O devices and interfaces may include a connector for a dock or a connector for a USB interface, FireWire, etc. to connect with another device, external component, or a network.
Moreover, the I/O devices and interfaces can include a gyroscope and/or accelerometer (not shown), which can be configured to detect 3-axis angular acceleration around the X, Y and Z axes, enabling precise calculation, for example, of yaw, pitch, and roll. The gyroscope and/or accelerometer can be configured as a sensor that detects acceleration, shake, vibration, shock, or fall of a device 902/904, for example, by detecting linear acceleration along one of three axes (X, Y and Z). The gyroscope can work in conjunction with the accelerometer to provide detailed and precise information about the device's axial movement in space. More particularly, the 3 axes of the gyroscope combined with the 3 axes of the accelerometer enable the device to recognize approximately how far, how fast, and in which direction it has moved, and to generate telemetry information associated therewith that is processed to generate coordinated presentations, such as shown and described herein.
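As a hedged illustration of how the two sensors might be combined (the complementary-filter coefficient and update rate below are assumptions, not part of the described hardware):

import math

# Sketch: complementary filter fusing a gyroscope rate with an accelerometer
# tilt estimate to track pitch over time.
def fuse_pitch(pitch_prev, gyro_rate_dps, accel_x, accel_y, accel_z,
               dt=0.01, alpha=0.98):
    accel_pitch = math.degrees(math.atan2(-accel_x,
                                          math.hypot(accel_y, accel_z)))
    return alpha * (pitch_prev + gyro_rate_dps * dt) + (1 - alpha) * accel_pitch

# Example: a device rotating at 2 deg/s while slightly tilted.
pitch = 0.0
for _ in range(100):
    pitch = fuse_pitch(pitch, 2.0, 0.1, 0.0, 9.7)
print(round(pitch, 2))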
User computing devices 202 preferably have the ability to send and receive data across communication network 204, and are equipped with web browsers, software applications, or other software and/or hardware tools, to provide received data on audio/visual devices incorporated therewith. By way of example, user computing devices 202 may be personal computers such as Intel Pentium-class and Intel Core-class computers or Apple Macintosh computers, tablets, or smartphones, but are not limited to such computers. Other computing devices which can communicate over a global computer network, such as palmtop computers, personal digital assistants (PDAs) and mass-marketed Internet access devices such as, for example, WebTV, can be used. In addition, the hardware arrangement of the present invention is not limited to devices that are physically wired to communication network 204; wireless communication can be provided between wireless devices. In one or more implementations, the present application provides improved processing techniques to prevent packet loss, to improve handling of interruptions in communications, and to address other issues associated with wireless technology.
Thus, as shown and described herein, production/display module 912 is configured to project onto a generic, three-dimensional etched mask portion 906. The images that are projected are properly registered, including as a function of one or more instructions executed by processing module 909 to adjust (e.g., reposition and/or resize) an overscanned area and to maintain the area of the subject in the predefined region within the frame and projected on the respective etched mask portion 906.
Although implementations shown in the figures illustrate various components apart from one another, the present application is not so limited. In one or more implementations, all of the modules and/or components are contained within a single structure or housing, such as within three-dimensional display device 902. In this way, a single unit can be placed on a desk or other surface (or simply held by a user), and can operate as shown and described herein without a need for additional and separate hardware and/or software.
Thus, as shown and described herein, technology is provided for an improved three-dimensional holographic-like display. Extending the screen display onto a surface provided within or on a crystal or crystal-like structure in accordance with the teachings herein provides a novel and intriguing 3D shape inside the crystal or crystal-like material, including to project a face thereon. In one or more implementations, a face-and-bust etched mask area is provided in full white, and images projected thereon fade out at the edges to blur and accommodate ears and hair. Using tracking functionality, such as shown and described herein, a person's face fills the respective etched portion inside the crystal or crystal-like material.
Furthermore, and in accordance with the teachings herein, a mask of a generic face is etched inside the crystal or crystal-like material, and light is projected on the surface of the crystal or crystal-like material 904, i.e., on the mask 906, thereby providing a 3-D display, such as of a face. The 3D etched portion 906 can be configured as approximately half of a face/bust and at least semi-hollow, such as a half-shell profile of a generic human form with no mouth section. In this way, as images of a person are projected thereon and the person speaks, there are no lips represented other than the projected lips. Similarly, preferably no eyelids are etched on the crystal or crystal-like material, as the movement of the eyes and mouth is accomplished by the projections. A 3-D presentation is accomplished in part because the etched mask 906 forms a 3-D surface, having relief, within the crystal or crystal-like component 904. The mask effectively serves as a convex-shaped screen, which provides a geometric effect. The video projector 912 projects the light onto the surface.
Unlike merely projecting onto a convex-shaped screen on a wall or other location, the image is significantly enhanced by being displayed through the crystal or crystal-like component. On a wall or other location, reflection of the light occurs, which affects the appearance of the projected image. The light passes through the crystal or crystal-like component and does not reflect away, thereby providing a bright and sharp image having good contrast and color.
Moreover, the present application supports improved positioning and alignment/registration of the projected image(s) with the etched mask on the subsurface of the crystal or crystal-like component. Etching a mask of a generic human face on the subsurface of the crystal or crystal-like component and using that mask as a support for displaying projected images brings the images to life in a new way that was previously only imagined.
In one or more implementations, the projection is provided from the rear of the etched mask. By placing the projection from the rear, the line of view of the display portion of the crystal or crystal-like component is not blocked or otherwise impeded for the viewer. Alternatively, the projection is from a direction other than the rear and a mirror or other reflective surface (e.g., a prism) is usable to redirect the projected image(s) to the etched mask, which can similarly prevent or avoid interfering with the viewpoint of the user.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising”, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should be noted that use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term) to distinguish the claim elements. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having,” “containing,” “involving,” and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
Particular embodiments of the subject matter described in this specification have been described. Other embodiments are within the scope of the following claims. For example, the actions recited in the claims can be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain embodiments, multitasking and parallel processing can be advantageous.
Publications and references to known registered marks representing various systems are cited throughout this application, the disclosures of which are incorporated herein by reference. Citation of any above publications or documents is not intended as an admission that any of the foregoing is pertinent prior art, nor does it constitute any admission as to the contents or date of these publications or documents. All references cited herein are incorporated by reference to the same extent as if each individual publication and references were specifically and individually indicated to be incorporated by reference.
While the invention has been particularly shown and described with reference to a preferred embodiment thereof, it will be understood by those skilled in the art that various changes in form and details may be made. For example, the present application provides significant flexibility and creativity in connection with creating and viewing coordinated presentations. Although illustrated embodiments of the present invention have been shown and described, it should be understood that various substitutions and alterations can be made by one of ordinary skill in the art without departing from the scope of the present invention.
This patent application is based on and claims priority to U.S. Provisional Patent Application 62/599,753, filed Dec. 17, 2017, and this patent application is based on and claims priority to U.S. Provisional Patent Application 62/713,816, filed Aug. 2, 2018, the entire contents of both of which are incorporated by reference as if expressly set forth in their respective entireties herein.