Near-eye display (NED) systems include personal imaging systems that create an image in the field of view of one or both of a viewer's eyes. Unlike imaging systems that project an image onto a screen or surface for viewing, certain NED systems project the image from a viewing area on a lens onto the human retina, where the image is perceived to be in front of the viewer. The distance from a viewing pupil on the lens to the viewer's eye may be only a few millimeters. Many NED systems are provided in wearable portable devices resembling eyeglasses or goggles.
Some NED systems are virtual reality (VR) systems, in which an immersive viewing experience enables the viewer to see only the image projected by the system, while the immersive viewing system blocks light from other sources. VR systems may be used, for example, in gaming, simulators, training systems, or virtual two-dimensional or three-dimensional viewing for movies, games, or video presentations. Certain alternative systems that use NED are transmissive systems, where lenses act as optical combiners. In such alternative systems, the viewer looks through the lens of the NED, and the lens optically combines the images provided by the system with the scene the viewer is observing. Examples are augmented reality (AR) systems. Some NED systems are mixed reality (XR) systems, in which an immersive viewing experience enables the viewer to see only the image projected by the system, while the immersive viewing system also uses cameras to project virtual renderings of objects in the real world.
In one example, a method includes obtaining, by a control circuit, first and second image frames. The first image frame has first image content to be displayed during a first image frame period. The second image frame has second image content to be displayed during a second image frame period occurring after the first image frame period. The method further includes determining, by the control circuit, a first position within the first image frame for a first object of a plurality of objects included within the first image content of the first image frame. The method further includes determining, by the control circuit, a motion vector for the first object. The method further includes determining, by the control circuit, a second position of the first object using the determined motion vector, the second position being different from the first position. The method further includes outputting, by the control circuit, output signals configured to control the display of the first object at the second position before displaying the second image content during the second image frame.
In another example, a system includes a video receiver, a control circuit coupled to the video receiver, and a spatial light modulator coupled to the control circuit. The video receiver is configured to receive video input. The control circuit is configured to obtain first and second image frames from the video input. The first image frame has first image content to be displayed during a first image frame period. The second image frame has second image content to be displayed during a second image frame period occurring after the first image frame period. The control circuit is further configured to determine a first position within the first image frame for a first object of a plurality of objects included within the first image content of the first image frame. The control circuit is further configured to determine a motion vector for the first object. The control circuit is further configured to determine a second position of the first object using the determined motion vector, the second position being different from the first position. The control circuit is further configured to output display signals configured to control the display of the first object at the second position before displaying the second image content during the second image frame. The spatial light modulator is configured to spatially modulate a light beam responsive to the display signals outputted by the control circuit.
In another example, a method includes obtaining, by a control circuit, first and second image frames. The first image frame has first image content to be displayed during a first image frame period. The second image frame has second image content to be displayed during a second image frame period occurring after the first image frame period. The method further includes determining, by the control circuit, a first position and a dimension of an object frame encompassing an object included within the first image content of the first image frame. The method further includes outputting, by the control circuit, a sequence of display signals configured to control the display of the object at the first position, to subsequently display the object at a second position different from the first position, and to subsequently display at least a portion of the second image content.
The same reference numbers or other reference designators are used in the drawings to designate the same or similar (functionally and/or structurally) features. The figures are not necessarily drawn to scale.
Among other technical advantages described herein, projection subsystem 100 may be configured to project static virtual objects that are perceived by the viewer to remain at a fixed position within the field of view, independent of any movement of the viewer. For example, projection subsystem 100 may render a display of a battery icon that appears to remain at a fixed position within a viewer's field of view even, for example, while the viewer's head rotates when traveling within a vehicle. Projection subsystem 100 also may be configured to project moving virtual objects (referred to herein as "dynamic objects") with enhanced realism, such that they are perceived by the viewer to realistically move along respective motion vectors, independent of any motion of the viewer.
To enhance the realism of the perceived independent motion of dynamic objects, projection subsystem 100 may be configured to derive shifted positions of dynamic objects, such that, when displayed in appropriate sequence with dynamic object data received from a source, the motion of dynamic objects may be perceived by the viewer to be more realistic and independent of any movement of the viewer. For example, a data source may provide projection subsystem 100 with a sequence of positional data for a dynamic object; and projection subsystem 100 may be configured to derive one or more intermediate positions for the dynamic object, each derived position being located between a respective two positions of the provided positional data for the dynamic object. The display of such derived positional data may be perceived by the viewer to smooth the motion of a dynamic object and hence cause its movement to appear more realistic to the viewer. Projection subsystem 100 may also be configured to adaptively derive certain dynamic object positions in real-time, responsive to movement(s) of a viewer, such as the movement of the viewer's pupil, the viewer's head position, or the body of the viewer in general (e.g., the viewer may be traveling in a vehicle).
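The intermediate-position derivation described above can be sketched with simple linear interpolation. The following Python snippet is an illustrative sketch only, not part of the described system; the function and parameter names are hypothetical:

```python
def derive_intermediate_positions(p0, p1, num_intermediate):
    """Derive evenly spaced intermediate positions between two
    source-provided object positions p0 and p1, each an (x, y)
    pixel coordinate. Returns num_intermediate derived positions,
    each lying between p0 and p1."""
    positions = []
    for k in range(1, num_intermediate + 1):
        t = k / (num_intermediate + 1)  # interpolation fraction
        positions.append((round(p0[0] + t * (p1[0] - p0[0])),
                          round(p0[1] + t * (p1[1] - p0[1]))))
    return positions
```

For example, three derived positions between (0, 0) and (40, 20) would fall at the quarter points of the motion, smoothing the perceived movement between the two source-provided positions.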
Control circuit 110 refers to any suitable circuitry configured to receive video output 112 and output corresponding control outputs 114 and 116 formatted to be applied to inputs of SLM 120 and light source 130, respectively. Control circuit 110 may include various hardware and software subcomponents, as explained further with reference to
SLM 120 refers to any suitable optical device that imposes some form of spatially varying modulation on a beam of light. SLM 120 includes an array of individually addressable and controllable pixel elements that modulate light according to video input data streams. Various optical devices can implement spatial light modulation, such as one or more digital micromirror devices (DMD), liquid crystal displays (LCD), liquid crystal on silicon (LCoS), micro-light-emitting-diode (microLED) and so forth.
Projection subsystem 100 may be adapted for use with one or more SLM(s) 120. In some systems, projection subsystem 100 may further be adapted, or may alternatively be adapted, for use with a phase light modulator (PLM) that imposes some form of phase varying modulation on a beam of light. Various optical devices can implement phase light modulation, such as those based on DMD, LCD, LCoS, microLED, or other technologies capable of phase light modulation and/or spatial light modulation.
Control circuit 110 may determine the relative positional changes or motion vectors of virtual objects projected by projection subsystem 100. Control circuit 110 may adaptively determine respective motion vectors for various virtual objects to be projected within a given image frame. The motion vectors may be based on, for example, respective movement vectors of the projected objects themselves, any directional change of the gaze of the viewer's pupil(s), or any movement vector of the viewer's head or entire body.
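One simple way to combine the contributing vectors named above is to subtract viewer-induced apparent motion from the object's own movement vector. The sketch below is a hedged illustration of that idea, assuming all vectors are expressed in pixels per image frame; the names are hypothetical and the combination rule is a simplification:

```python
def net_motion_vector(object_vec, gaze_vec, head_vec):
    """Combine an object's own movement vector with compensation
    for viewer-induced apparent motion (gaze change and head
    movement). All vectors are (dx, dy) in pixels per image frame."""
    dx = object_vec[0] - gaze_vec[0] - head_vec[0]
    dy = object_vec[1] - gaze_vec[1] - head_vec[1]
    return (dx, dy)
```

A stationary object viewed while the gaze shifts right, for instance, would acquire a leftward net motion vector so that it appears fixed in the real-world scene.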
In some AR systems, the field of view at any point in time may be perceived as one or more image frames projected by projection subsystem 100 that are optically combined with the real-world scene also within the viewer's field of view. In some systems capable of rendering three-dimensional content, multiple image frames may be used simultaneously to render projections to both of a viewer's eyes in a manner that produces a three-dimensional effect. In some VR or XR systems, the field of view at any point in time may be one or more image frames projected by projection subsystem 100, in which at least part of the image frame(s) may contain virtual renderings of objects in the real world.
Certain motion vectors determined by control circuit 110 may be explained in the context of an example AR system. If a viewer is a passenger in a moving vehicle, for example, control circuit 110 may be configured to cause projection subsystem 100 to project a moving virtual object in the foreground, such as, for example, an image of a person or a drone rapidly moving from right to left across the viewer's field of view (e.g., object 404 of
Control circuit 110 may also be configured to cause projection subsystem 100 to project one or more virtual objects that are perceived by the viewer to be at a fixed position relative to certain marked positions in the real world. For example, control circuit 110 may cause a virtual advertisement object to be projected within the field of view, such that the object is perceived by the viewer to be fixed in its position or anchored relative to a non-moving feature in the field of view. As shown in
Object 406 in this example may also be classified as dynamic in that it may change position from one image frame to another (e.g., responsive to a determination that the real world seen through the field of view has changed). If any movement of the viewer (including pupil gaze) causes the anchor point to leave the field of view, control circuit 110 may also cause object 406 to be removed from the field of view or to newly display a replacement, unanchored object.
Control circuit 110 may also be configured to cause projection subsystem 100 to project a virtual object at one static position relative to the field of view. For example, a virtual object indicating the charge level of a battery may be projected at a static position in the upper-right hand corner of the field of view, as shown by object 410 of
After control circuit 110 determines the appropriate position for all virtual objects to be projected, control circuit 110 may send control signals causing projection subsystem 100 to project those objects (e.g., both the advertising and battery-level objects), such that the viewer perceives them to all be simultaneously within the same field of view at the appropriate position. For example, control circuit 110 may cause the projection subsystem 100 to project a series of image frames appropriately positioning any suitable combination or number of dynamic objects or static objects. In the above example scenario involving two dynamic objects and one static object, for example, control circuit 110 may cause projection subsystem 100 to appropriately position those objects within a series of image frames, even while the viewer's gaze, head, or body is in motion, such that the first dynamic object appears to be anchored to an anchor point in the real world, the second dynamic object appears to be in motion relative to the real world, and the static object appears to the viewer to remain at a fixed location within the field of view.
Optical elements 140A-140B refer to any suitable optical device(s) capable of receiving and transmitting incident light beams in a manner that concentrates, diverges, refracts, diffracts, redirects, reshapes, integrates, or reflects the incident light beams. In some systems, optical elements 140A-140B collectively optically couple light source 130 to SLM 120. For example, optical elements 140A-140B may be configured to concentrate light beams emitted by light source 130 and direct focused light beams toward SLM 120. For example, optical elements 140B may be configured to receive light beams spatially modulated by SLM 120 and concentrate, diverge, refract, diffract, redirect, reshape, integrate, or reflect the received spatially modulated light beams toward a waveguide 141. In some systems, the waveguide 141 may be configured to receive the spatially modulated light beams and to transmit the same to the retina of a viewer wearing an NED system, such as the NED system 200 illustrated in
In this description, elements that are optically coupled have an optical connection between the elements, but various intervening optical components can exist between elements that are optically coupled. Similarly, in this description, when the term coupled describes relationships between elements, it is not limited to connected or directly connected, but may also include connections made with intervening elements, and additional elements and various connections may exist between any elements that are coupled.
In some systems, the light illuminating SLM 120 is tinged with a color, for example by using either a white light source and some type of color filter or by using one or more light sources that each provide a respective colored light beam. This enables some display systems using spatial light modulation to display colored images.
In this example, SLM 120 is illustrated as being optically coupled to light source 130 at an angle that facilitates selective reflection to spatially modulate light beams. A DMD is an example device capable of such reflective spatial modulation. A DMD is an optical micro-electrical-mechanical system (MEMS) that contains an array of highly reflective aluminum micromirrors, each corresponding to at least one display pixel. Each micromirror may be individually addressed in either an on or off state, where an on state of a given micromirror causes light beams spatially corresponding to that micromirror to be projected onto a pupil of a viewer wearing a NED system containing the projection subsystem 100. Gray scale may be created by causing the micromirrors to oscillate some preset number of times within a timeframe corresponding to the display of a single image. A full color gamut may be created by time multiplexing the individual display of three or more primary colors (e.g., red, green, and blue).
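The gray-scale scheme described above is commonly realized with binary-weighted pulse-width modulation: each bit of a pixel's intensity value keeps the mirror "on" for a time slice proportional to the bit's weight. The Python sketch below illustrates that general scheme under simplifying assumptions (one pixel, one color, an idealized frame period); it is not a description of any particular DMD controller:

```python
def mirror_on_time_fractions(intensity, bit_depth=8):
    """For a DMD-style display, return the fraction of the frame
    period each bit plane (MSB first) holds the micromirror 'on'
    for a given pixel intensity. A cleared bit contributes 0."""
    total = (1 << bit_depth) - 1  # e.g., 255 for 8-bit intensity
    fractions = []
    for bit in range(bit_depth - 1, -1, -1):
        weight = (1 << bit) / total  # binary weighting of this bit plane
        fractions.append(weight if (intensity >> bit) & 1 else 0.0)
    return fractions
```

Full intensity (255) keeps the mirror on for the entire frame period, while intensity 128 keeps it on only during the most significant bit plane's slice, roughly half the period.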
In other systems, SLM 120 may output modulated light beams using one or more devices different from the general DMD description above. For example, SLM 120 may output modulated light beams by selective redirection using reflective LCOS display technology. In an LCOS display, a complementary metal-oxide semiconductor (CMOS) chip may be used to control the voltage on square reflective aluminum electrodes below the chip surface, each controlling one pixel. For example, a chip with XGA resolution will have 1024×768 plates, each with an independently addressable voltage. Typical cells are about 1-3 centimeters square and about 2 mm thick, with pixel pitch as small as 2.79 μm. A common voltage for all the pixels is supplied by a transparent conductive layer.
In addition, SLM 120 may selectively transmit incident light beams using an LCD crystal panel or an interferometric modulator. Some LCD projectors use transmissive LCD, in which an LCD panel functions as a spatial light modulator by selectively allowing light beams to pass through the LCD panel depending on the orientation of liquid crystal molecules at each LCD pixel. Each pixel of an LCD consists of a layer of molecules aligned between two transparent electrodes and two polarizing filters. Without the liquid crystal between the polarizing filters, light passing through the first filter would be blocked by the second polarizer. The orientation of each LCD pixel can be “switched” on or off by selectively applying an electrical field.
The position of light source 130 or optical elements 140A may be altered, in some systems, to accommodate an alternative SLM 120 that modulates light beams differently from what is representatively shown in
A variety of visual information, cues, or aids may be displayed by NED system 200. For example, notifications can be displayed and viewed along with a scene being observed. Examples of such notifications include social media messages, text including navigation information, weather, traffic, historical or tourism information about an object or place, retail offers such as sales or advertising related to a store or place near to or being viewed by a viewer, stock quotes, sports scores, or other context-driven notifications or information. Some systems may enable interactive gaming, such as scavenger hunts or games involving finding virtual objects at a location, or games scoring the viewer's ability to find a target place or object. Some systems may enable battle simulations in either a gaming or military training context, in which a virtual object, such as a drone or a person, is perceived as entering a field of view at a certain vector and as continuing to realistically and independently travel through the field of view even while the viewer adjusts their gaze. Some AR systems provide a full field of view display that is always in the view of the viewer, while other AR systems may provide a small display provided at a portion of the view that the viewer must specifically look at to see, such as smart glasses.
NED system 200 can include network connections, such as cellular connections, Wi-Fi connections, Bluetooth connections. In addition, NED system 200 can be coupled to another device including such connections, such as a smartphone, tablet, portable web browser, video player, or laptop computer.
In some systems, a viewer wears a headset or eyeglass(es) in a manner similar to sunglasses, eyeglasses, or a monocle, and NED system 200 displays information that augments the real visual environment observed by the viewer while wearing the device. In other systems, such as automotive or aerospace heads-up displays (HUDs), the viewer looks into the NED system 200, and the imaging system adds images to the scene in front of the viewer. In this way, the viewer can observe a scene while receiving additional information at the same time, such as vehicle speed, fuel gauges, system messages, and similar data.
Video receiver 310 refers to any suitable circuitry configured to receive video input. For example, the video input may be received wirelessly or over a wired connection.
Image processor 320 refers to any suitable circuitry configured to process video input to output corresponding image frames. Each image frame may correspond to a full-array image to be displayed during an image frame period. In some systems implementing a colored display, each image frame may be subdivided into multiple color-specific image subframes (e.g., red, green, and blue).
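The color-specific subframe subdivision mentioned above can be sketched as a per-channel split of the full-array image. The snippet below is an illustrative sketch, assuming the frame is represented as a 2-D grid of (r, g, b) tuples; the representation and names are hypothetical:

```python
def split_color_subframes(frame_rgb):
    """Split a full-color image frame into three color-specific
    subframes (red, green, blue), one per time-multiplexed color.
    frame_rgb is a 2-D list of (r, g, b) intensity tuples."""
    red = [[px[0] for px in row] for row in frame_rgb]
    green = [[px[1] for px in row] for row in frame_rgb]
    blue = [[px[2] for px in row] for row in frame_rgb]
    return red, green, blue
```

Each returned subframe would then be displayed during its own portion of the image frame period in a sequential-color system.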
Sub-image splitter 330 refers to any suitable circuitry configured to split a given image frame or image subframe into one or more sub-images. Example operations of sub-image splitting are described further herein with reference to
Frame memory manager 340 refers to any suitable circuitry configured to control the reading and writing of image data to and from frame memory 350. Example corresponding operations are described further herein with reference to
Frame memory 350 refers to any suitable circuitry configured to store image data that is to be displayed using SLM 120. For example, the image data may be stored in memory cells of frame memory.
Display scheduler 360 refers to any suitable circuitry configured to control the scheduling of when certain image content is to be displayed. Example scheduling schemes that may be used by display scheduler are described further herein with reference to
Sub-image shifter 370 refers to any suitable circuitry configured to control the position shifting of sub-images (also referred to herein as objects or object frames) to be displayed. The processing of control circuit 110 to effect the relative image shifting may be explained in the context of video input divided into multiple image frames. Each image frame may correspond to a full-array image to be projected by projection subsystem 100 during an image frame period. In some systems implementing a colored display, each image frame may be subdivided into multiple color-specific image subframes (e.g., red, green, and blue).
In some examples, the motion of virtual objects within an image frame may be perceived by the viewer to be smoother and hence more realistic if an image frame period is temporally divided into image frame subperiods of equal duration, referred to herein as object frames. To achieve this smoothing effect, control circuit 110 may be configured to encode different objects with different respective vectors of movement. For the battery-level object described above, for example, there may be just one position in the movement vector, which causes the battery-level object to appear stationary in the field of view. For the advertisement object described above, there may be a number of positional shifts applied within the same image frame period or applied during temporally adjacent image frame periods. For projected objects that are perceived to be moving more quickly across a field of view, an increased number of positional shifts may be applied (e.g., seven, eight, nine, ten, etc.). Additional detail concerning operations performed by sub-image shifter 370 is explained further herein with reference to
Display formatter 380 refers to any suitable circuitry configured to format the output of sub-image shifter 370 in order to produce output signals formatted to be applied to input interfaces of SLM 120. For example, display formatter 380 may divide image frames or object frames into multiple bit planes. Each bit plane represents an image arrangement of one bit extracted from the full array of pixels in the input image frame. In some systems implementing a colored display, a number of bit planes may be applied during the image frame period for each color, which may enable modulating color brightness or intensity for each pixel during the image frame period. The output signals provided by display formatter 380 may be further configured to control sequential display of a plurality of objects included within a given image frame, where each object is to be displayed in a respective one of several object frame periods, the object frame periods sequentially dividing up between them the full duration of an image frame period.
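The bit-plane division described above can be sketched in a few lines of Python. This is an illustrative sketch only, assuming the frame is a 2-D grid of integer pixel intensities; bit plane b holds bit b of every pixel's value:

```python
def extract_bit_planes(frame, bit_depth=8):
    """Split a 2-D list of per-pixel intensities into bit planes.
    Returns bit_depth binary images, ordered LSB first; plane b
    holds, for every pixel, bit b of that pixel's value."""
    planes = []
    for b in range(bit_depth):
        planes.append([[(pixel >> b) & 1 for pixel in row] for row in frame])
    return planes
```

For a 3-bit frame containing pixels 5 (binary 101) and 2 (binary 010), the three planes would be [[1, 0]], [[0, 1]], and [[1, 0]] respectively.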
In the context of an AR system, object 404 may be a real-world structure (e.g., a café) or natural feature (e.g., a mountain or other landmark) naturally visible to the viewer. In an alternative context of either a VR or XR system, object 404 may be a virtual image representing a structure or natural feature in the real world.
Objects 406, 408, and 410 may each be virtual objects projected by projection subsystem 100. In this example, object 406 is in the background, object 408 is in the foreground, and object 410 is in the upper-right corner. As explained above, object 406 may be a virtual advertisement image intended to be anchored in its position relative to a fixed position in the real world (e.g., object 404), as explained with reference to
Object 408 may be a virtual image having a motion vector in a direction from right to left relative to the field of view of block 402. For example, object 408 may be an image of a person running, or a drone flying, in the direction indicated, as explained further with reference to
Object 410 may be a virtual icon representing the charge level of a battery, for example.
The four sides of image frame 412 collectively represent the edges of an image frame corresponding to the field of view shown in block 402. In this example, object 404 of block 402 does not appear within the illustrated image frame 412. This is because, in this example, object 404 represents a real-world object in the background of the field of view perceived by the viewer, whereas image frame 412 contains only virtual objects of an image frame to be projected by projection subsystem 100.
In some systems, control circuit 110 may analyze virtual objects within an image frame and determine an object frame sufficient to encompass the largest of the virtual objects included within a given image frame. The size of the determined object frame may be used by the control circuit to determine an equal duration of time to be applied to each object frame period of a plurality of object frame periods collectively dividing up an image frame period. For example, control circuit 110 may determine the smallest rectangle (e.g., in terms of total pixel count) sufficient to enclose the largest virtual object within the image frame. The smallest rectangle sufficient to enclose the largest virtual object within a given image frame is referred to herein as the object frame for that image frame. In this example, object 408 is the largest of the three virtual images represented by objects 406, 408, and 410. Accordingly, the dimensions of rectangle 416 are deemed the object frame dimensions for the image frame—i.e., in the illustrated example, rectangle 416 is the smallest rectangle sufficient to enclose the largest of the objects 406, 408, and 410 within image frame 412.
In some systems, control circuit 110 may apply a pixel buffer that increases the dimensions of the object frame for a given image frame. For example, the determined object frame may be enlarged by a predetermined number of pixels in terms of height and width. The applied pixel buffer may enable positioning of the object frame relative to the largest of virtual objects such that at least a threshold number of buffer pixels (e.g., a number selected within the range of 1 to 32 pixels) separates the outermost edges of the largest virtual object from all sides of the buffered object frame.
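The object-frame sizing with a pixel buffer described above can be sketched as follows. The snippet is illustrative only, assuming each object is given as an (x, y, width, height) rectangle; the function name and clamping to the image frame dimensions are assumptions, not part of the described system:

```python
def object_frame_dimensions(objects, buffer_px=8, frame_w=1920, frame_h=1080):
    """Determine object-frame dimensions for an image frame: the
    smallest rectangle enclosing the largest object (by pixel
    count), grown on all sides by a pixel buffer and clamped to
    the image frame size. Each object is (x, y, width, height)."""
    largest = max(objects, key=lambda o: o[2] * o[3])  # largest by area
    w = min(largest[2] + 2 * buffer_px, frame_w)
    h = min(largest[3] + 2 * buffer_px, frame_h)
    return (w, h)
```

With an 8-pixel buffer, a 40×25 largest object would yield a 56×41 object frame, leaving at least 8 buffer pixels between the object's outermost edges and every side of the buffered object frame.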
Once the template dimensions for the object frame are determined, control circuit 110 may determine the location of similarly sized object frames encompassing each one of the virtual objects within the same image frame.
While rectangle 418 is shown to be at the outermost edge of image frame 412, in some systems, there may be a buffer zone of pixels (e.g., a buffer zone 32 pixels wide) extending around all sides of the entire image frame. Such a buffer zone may be used, for example, to allow for an object frame to be positioned relative to its subject in a manner that extends beyond the image frame corresponding to the field of view. Use of a buffer zone around the entire image frame may also facilitate control circuit 110 processing the shifting of an entire image frame responsive to a determined shift in the viewer's gaze, head position, or geolocation.
Once the appropriate encompassing rectangles have been determined for each virtual object within an image frame, control circuit 110 may determine whether to mark each object as either static or dynamic relative to its position within the display frame. As explained with reference to
In some instances, the marking determination may include control circuit 110 determining whether a given object is already marked with either a static or dynamic classification. Such prior marking may have occurred, for example, as a result of control circuit 110 previously analyzing one or more image frames derived from video input received by video receiver 310.
If a given object is not already marked as static or dynamic, the marking determination for a given virtual object of an image frame may include control circuit 110 determining whether the object has a motion vector greater than zero. The motion vector of a given object may be determined based on any of a variety of factors. For example, control circuit 110 may determine a given virtual object has a non-zero motion vector by comparing multiple image frames to each other. As explained further with reference to
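The frame-comparison marking described above can be sketched as follows. This is an illustrative sketch, assuming object positions are tracked by identifier across two consecutive image frames; the names and dictionary representation are hypothetical:

```python
def mark_objects(prev_positions, curr_positions):
    """Mark each object as 'static' or 'dynamic' by comparing its
    position across two consecutive image frames. Each argument
    maps an object identifier to an (x, y) position. An object with
    a non-zero motion vector is marked dynamic."""
    marks = {}
    for obj_id, curr in curr_positions.items():
        prev = prev_positions.get(obj_id, curr)  # new objects default to static
        vec = (curr[0] - prev[0], curr[1] - prev[1])
        marks[obj_id] = "dynamic" if vec != (0, 0) else "static"
    return marks
```

An object whose position is unchanged between frames (a zero motion vector) is marked static; any positional change yields a dynamic classification.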
In certain systems, a series of projectable image frames may be derived from video input data at a certain maximum rate. Accordingly, each image frame may be allotted a certain time window, referred to herein as an image frame period, for processing corresponding data (including any virtual objects within an image frame) to produce display output formatted to be applied to input interfaces of SLM 120. The maximum number of image frames that control circuit 110 may be capable of processing per second is referred to herein as the image frame rate. Because the object frames described herein each represent a relatively small fraction (in terms of total number of pixels) of the overall image frame in which they are contained, control circuit 110 may be capable of processing object frames of a given image frame at a maximum object frame rate that exceeds the maximum image frame rate.
The maximum object frame rate may itself vary from one image frame to another depending on the variable size of the determined object frame for a given image frame. Thus, control circuit 110 may be configured to determine the maximum object frame rate that may be applied to a given image frame based, at least in part, on the variable size of the object frame being used.
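The relationship between object frame size and the supportable object frame rate can be sketched with a simple throughput model. The snippet below is a hedged illustration, assuming processing throughput scales inversely with the pixel count of the frame being processed; the model and names are assumptions, not a description of any particular controller:

```python
def max_object_frame_rate(image_frame_rate, image_pixels, object_frame_pixels):
    """Estimate the maximum object-frame rate for a given object
    frame size, under the simplifying assumption that processing
    throughput (pixels per second) is fixed, so smaller frames can
    be processed proportionally faster."""
    return image_frame_rate * (image_pixels / object_frame_pixels)
```

For example, a 60 Hz image frame rate over a 1920×1080 frame with a 128×128 object frame would support an object frame rate on the order of thousands of object frames per second under this model.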
Once control circuit 110 has determined the maximum object frame rate that can be supported for the object frame size of a given image frame, control circuit 110 may be configured to provide control output 114 or 116 that causes projection subsystem 100 to project the image content of the determined object frames. In some systems, control circuit 110 may cause all the object frames determined for a given image frame to be projected one at a time, in a sequential manner (e.g., using pulse width modulation timing), during the corresponding image frame period. In some systems, control circuit 110 may cause multiple object frames determined for a given image frame to be projected at the same time, in a spatially-interleaved manner, during the corresponding image frame period. Example spatially-interleaved pixel patterns that may be applied are explained further with reference to
The image content of Image Frames 1, 2 and 3 and Image Intraframes 1.5 and 2.5 includes three objects described previously with reference to
In this example, object 406 is anchored to a real-world position and yet is illustrated (by a frame-to-frame comparison) as gradually moving across the field of view from right to left at a relatively constant speed. Such frame-to-frame movement of anchored object 406 may be the result of control circuit 110 adjusting the position of object 406 within the field of view based on detection of the viewer traveling at high speed (e.g., the viewer may be a passenger in a vehicle).
In this example, object 408 appears to the viewer to move more rapidly in the foreground from right to left, consistent with the relative movement of object 406 anchored in the background. Object 410 (e.g., a static battery icon) remains fixed in its position for each one of the five example frames shown.
Control circuit 110 may apply various intraframe control schemes to render multiple virtual objects in a manner that smooths any motion of objects marked as being dynamic, while maintaining the brightness of other objects marked as being static. For example, control circuit 110 may temporally divide image frame periods into multiple object frame periods and derive intraframe shift amounts of any dynamic objects from one image frame to another. Such time division may facilitate smoothing the motion of dynamic objects from the perspective of a viewer, thereby making the objects appear to be more realistic. Such time division may also facilitate adaptively modifying the position of dynamic objects within a field of view responsive to detected changes in the viewer's gaze, head position, or body geolocation. Example timing schemes that may be applied to divide an image frame period into multiple object frame periods for use in displaying virtual objects are described further with reference to
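The derivation of intraframe shift amounts described above can be sketched as a linear subdivision of an object's frame-to-frame displacement. This is a hedged illustration, not the disclosed implementation; the function name and the choice of linear interpolation are assumptions.

```python
# Hypothetical sketch of deriving intraframe shift amounts: the shift of a
# dynamic object between two image frames is subdivided linearly across the
# object frame periods within one image frame period.

def intraframe_positions(pos_a, pos_b, subdivisions):
    """Return `subdivisions` intermediate (x, y) pixel positions stepping
    from pos_a toward pos_b, excluding pos_a itself and including pos_b."""
    (ax, ay), (bx, by) = pos_a, pos_b
    return [
        (round(ax + (bx - ax) * k / subdivisions),
         round(ay + (by - ay) * k / subdivisions))
        for k in range(1, subdivisions + 1)
    ]

# A dynamic object moving from (100, 40) to (120, 40) over one image frame
# period that is divided into 4 object frame periods:
steps = intraframe_positions((100, 40), (120, 40), 4)
# steps == [(105, 40), (110, 40), (115, 40), (120, 40)]
```

Displaying the object at each intermediate position during successive object frame periods is what makes its motion appear smoother to the viewer.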
In some systems, an image frame may be spatially divided in an interleaved manner, such that certain pixels of an image frame are assigned to displaying one object and other pixels of the same image frame are assigned to displaying a different object. The video input may itself contain spatially interleaved patterns of objects. In some systems, control circuit 110 may apply spatially-interleaved patterns for use in displaying more than one object during the same object frame, while individually controlling the relative positional shifting of each object within a field of view. Examples of such interleaved spatial subdivision of an image frame are described further with reference to
In some systems, the motion of dynamic objects may be perceived by the viewer to be smoother and hence more realistic if image frame periods are temporally divided into multiple object frame periods and the shift amount from image frame to image frame is itself subdivided through the use of multiple object frames. Static objects may also be repeatedly displayed a commensurate number of times during the same image frame period, so as to be perceived by the viewer as having at least the same relative brightness as dynamic objects. Timing scheme 900 illustrates an example of how such temporal subdivision may be accomplished. Specifically, in this example, timing scheme 900 subdivides each image frame period into at least four respective object frame periods.
The top row 910 of timing scheme 900 refers to a vertical sync clock or VSYNC signal, which in some systems is produced by control circuit 110. The VSYNC signal may be used, for example, to initiate (and thereby temporally differentiate) each image frame in a sequence of multiple image frames. Row 910 further shows how two sequential image frame periods may each be temporally subdivided into respective sequential, nonoverlapping object frame periods.
Row 920 shows that image content corresponding to at least one static object S1 and one dynamic object D1 is obtained during respective object frame periods.
Row 940 shows that the image content for both static object S1 and dynamic object D1 may be written to frame buffers 1 and 2, respectively. Frame buffers 1 and 2 may each be a portion of random-access memory (RAM) containing image content (e.g., a bitmap) that drives a video display. In some systems, frame buffers 1 and 2 may be included within frame memory 350 of control circuit 110. Row 940 further illustrates an example timing scheme for the writing of image content to frame buffers 1 and 2. For example, the image content for static object S1 may be received and written to frame buffer 1 during a first object frame period; and the image content for dynamic object D1 may be received and written to frame buffer 2 during a second object frame period.
Row 930 indicates that while image content is being received and written, a “video idle mode” may be Inactive. Once the receiving and writing of object frame content is complete for a given image frame, the video idle mode may be switched to Active to indicate that video idle mode has commenced. Because no new video content is obtained while the video idle mode is in an Active state, any power associated with obtaining video may be turned off. Use of the video idle mode may thus provide power savings. While the video idle mode is Inactive, information may be written to frame buffers 1 and 2, as described above with reference to row 940. In addition, certain image shift processing may occur while the video idle mode is Inactive, as described further with reference to row 950.
Row 950 indicates that there is zero shift with respect to static object S1 and dynamic object D1 during the second, third, and fourth object frame periods of the first image frame period. During the first object frame period of the second image frame period, however, there will be a shift per requirement (req.) (in terms of a relative position within an image frame) of dynamic object D1. As described with reference to
This is further indicated in row 970 by the reference D1 (shifted) shown in the first object frame period of the second image frame period. This is also further indicated in rows 950 and 970 by the last object frame shown, in which row 950 indicates Shift per req. and row 970 indicates D2 (shifted) for the first object frame period of the third image frame period.
Row 960 indicates the timing for two different settings as Setting 1 and Setting 2. In this example, Setting 1 is used for object frames marked as being static and Setting 2 is used for object frames marked as being dynamic.
Row 970 indicates example timing for image content being displayed during the object frame periods of a given image frame period. For example, row 970 shows static object S1 as being displayed twice during the first image frame period. Specifically, static object S1 is displayed during both the second and fourth object frame periods of the first illustrated image frame period. The repeated display of static object S1 during the same image frame period may enable increasing the brightness of static object S1 from a viewer's perspective.
Image content corresponding to the dynamic object is likewise displayed multiple times within a time interval equivalent to an image frame period. For example, the dynamic object is displayed at a shifted position D1 (shifted) relative to position D1 and new image content D2 for the dynamic object is likewise displayed during the third object frame period of the second image frame. In other words, row 970 refers to D1 (shifted) as an intermediate or smoothed positional shift of a certain number of pixels between image content of the dynamic object corresponding to D1 and D2. The image content D2 (shifted) (which represents a shift of the D2 image content) will be displayed during the first object frame period of the next image frame shown. With reference to
Thus, timing scheme 900 provides an example of how to smooth motion of a dynamic object by temporally dividing image frame periods into multiple object frame periods and deriving intraframe shift amounts of the dynamic object from one image frame to another. Timing scheme 900 may be applied, for example, by display formatter 380 of
Row 970 also indicates that image content represented as static object S1 may be displayed while image content represented as dynamic object D1 is received and written. Similarly, row 970 indicates that image content represented as D1 (shifted) may be displayed while image content represented as static object S2 is received and written. In this example, no image content is received or written while the Video idle mode is in an active state.
In some systems, the motion of dynamic objects may be perceived by the viewer to be smoother and hence more realistic if image frame periods are temporally divided into multiple object frame periods and the shift amount from image frame to image frame is itself subdivided through the use of multiple object frames. Static objects may also be repeatedly displayed a commensurate number of times during the same image frame period, so as to be perceived by the viewer as having at least the same relative brightness as dynamic objects. Timing scheme 1000 illustrates an example of how such temporal subdivision may be accomplished. Specifically, in this example, timing scheme 1000 subdivides each image frame period into at least eight respective object frame periods.
Timing scheme 1000 includes rows 1010, 1020, 1030, 1040, 1050, 1060, 1070 that generally correspond in their respective descriptions to rows 910, 920, 930, 940, 950, 960, 970 of timing scheme 900.
The top row 1010 of timing scheme 1000 refers to a VSYNC signal, which in some systems is produced by control circuit 110. The VSYNC signal may be used, for example, to initiate (and thereby temporally differentiate) each image frame in a sequence of multiple image frames. Row 1010 further shows that a single image frame period may be temporally subdivided into at least eight sequential, nonoverlapping object frame periods.
Row 1020 shows that image content corresponding to at least one static object S1 and two dynamic objects D1 and D2 is obtained during respective object frame periods.
Row 1040 illustrates an example timing scheme for the writing of image content to frame buffers 1 and 2. In this example, the image content for static object S1 is received and written to Frame Buffer 1 during a first object frame period; the image content for dynamic object D1 is received and written to Frame Buffer 2 during a second object frame period; and the image content for dynamic object D2 is received and written to Frame Buffer 2 during a sixth object frame period. Frame buffers 1 and 2 may each be a portion of random-access memory (RAM) containing image content (e.g., a bitmap) that drives a video display. In some systems, frame buffers 1 and 2 may be included within frame memory 350 of control circuit 110.
Row 1030 indicates that while image content is being received and written, a “video idle mode” may be “Inactive.” Once the receiving and writing of object frame content is complete for a given image frame, the video idle mode may be switched to “Active” to indicate that video idle mode has commenced. Because no new video content is obtained while the video idle mode is in an Active state, any power associated with obtaining video may be turned off. Use of the video idle mode may thus provide power savings.
Row 1050 indicates that there is “Zero shift” with respect to the dynamic objects D1 and D2 during the second, third, fourth, sixth, seventh, and eighth object frame periods of the first image frame period. A shift per requirement (e.g., in terms of a relative position within an image frame) may be applied to first and second dynamic objects D1 and D2 during respective object frame periods. As described with reference to
As shown in
Row 1060 indicates the timing for three different settings as “Setting 1,” “Setting 2,” and “Setting 3.” In this example, Setting 1 is used for object frames corresponding to static object S1, Setting 2 is used for object frames corresponding to dynamic object D1, and Setting 3 is used for object frames corresponding to dynamic object D2.
Row 1070 indicates example timing for image content being displayed during the object frame periods of a given image frame period. For example, row 1070 shows static object S1 as being displayed four times during the first image frame period. Specifically, static object S1 is displayed during the second, fourth, sixth, and eighth object frame periods of the first illustrated image frame period. The repeated display of static object S1 during the same image frame period may enable increasing the brightness of static object S1 from a viewer's perspective.
Image content corresponding to the two dynamic objects D1 and D2 is likewise displayed multiple times within a time interval equivalent to an image frame period. During the first image frame period, for example, dynamic object D1 is displayed during the third object frame period, and a shifted intraframe position of dynamic object D1 is displayed during the fifth object frame period. Dynamic object D2 is displayed during the seventh object frame period; and a shifted intraframe position of dynamic object D2 is displayed during the first object frame period of the next image frame. Thus, in this example, dynamic objects D1 and D2 are both displayed at least twice within a time interval equivalent to an image frame period (i.e., the time interval inclusively extending from the second object frame period of the first image frame period to the second object frame period of the second image frame period).
Timing scheme 1000 provides an example of how to smooth motion of multiple dynamic objects by temporally dividing image frame periods into multiple object frame periods and deriving object-level intraframe shift amounts of multiple dynamic objects from one image frame to another. Timing scheme 1000 may be applied, for example, by display formatter 380 of
In this example, row 1070 shows static object S1 as being displayed four times during the first image frame period, while dynamic objects D1 and D2 are each displayed twice during an equivalent timeframe. The result of applying such a timing scheme 1000 may be that static object S1 appears to the viewer to be twice as bright as dynamic objects D1 and D2. The brightness of static object S1 can be reduced to better match the relative brightness of dynamic objects D1 and D2. For example, the amount of time that static object S1 is displayed may be reduced by half relative to what is shown in row 1070. Alternatively, the pixel intensity level may be reduced by half, such that static object S1 is displayed for the full object frame periods shown in row 1070, but the pixel intensity level is digitally reduced by half.
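The intensity-halving alternative described above can be sketched as a simple scaling of the static object's digital pixel value by the ratio of display repetitions. This is an illustrative model with hypothetical names; it assumes perceived brightness is proportional to the time-integrated light output.

```python
# Illustrative sketch: when a static object is displayed more often than a
# dynamic object within an image frame period, its pixel intensity can be
# digitally scaled down so the time-integrated output of the two matches.

def matched_intensity(pixel_value, static_repeats, dynamic_repeats):
    """Scale a static object's pixel value so its time-integrated light
    output matches a dynamic object shown fewer times per frame period."""
    scale = dynamic_repeats / static_repeats
    return int(pixel_value * scale)

# Static object shown 4x per image frame period vs. dynamic objects shown 2x:
matched = matched_intensity(200, static_repeats=4, dynamic_repeats=2)
# matched == 100, i.e. the intensity is reduced by half
```

The same ratio could instead be applied to the display time, as the first alternative in the passage above suggests.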
Row 1070 also indicates that image content represented as static object S1 may be displayed while image content for dynamic object D1 is received and written. Similarly, row 1070 indicates that image content represented as S1 may be displayed (a third time) while image content represented as dynamic object D2 is received and written. In this example, no image content is received or written while the “Video idle mode” is in an active state.
In some systems, the motion of dynamic objects may be perceived by the viewer to be smoother and hence more realistic if image frame periods are temporally divided into multiple object frame periods and the shift amount from image frame to image frame is itself subdivided through the use of multiple object frames. Static objects may also be repeatedly displayed a commensurate number of times during the same image frame period, so as to be perceived by the viewer as having at least the same relative brightness as dynamic objects. Timing scheme 1100 illustrates an example of how such temporal subdivision may be accomplished. Specifically, in this example, timing scheme 1100 subdivides each image frame period into at least four respective object frame periods.
The top row 1110 of timing scheme 1100 refers to a vertical sync clock or VSYNC signal, which in some systems is produced by control circuit 110. The VSYNC signal may be used, for example, to initiate (and thereby temporally differentiate) each image frame in a sequence of multiple image frames. Row 1110 further indicates, relative to the remainder of
Row 1120 shows that image content corresponding to at least one static object (S1) and at least one dynamic object (D1) is obtained during the same first and second object frame periods.
Row 1140 shows that the image content for both a single static object (e.g., S1) and for multiple dynamic objects (e.g., D1 and D2) may be written to one or more frame buffers, such as, for example, one or more of frame buffers 1 or 2. Frame buffers 1 and 2 may each be a portion of random-access memory (RAM) containing image content (e.g., a bitmap) that drives a video display. In some systems, frame buffers 1 and 2 may be included within frame memory 350 of control circuit 110.
Row 1140 further illustrates an example timing scheme for the writing of image content to frame buffers 1 and 2. For example, the image content for static object S1 and dynamic object D1 may be received and written to frame buffer 1 during first and second object frame periods of an image frame period; and the image content for static object S1 and dynamic object D1(shifted) may be received and written to frame buffer 2 during first and second object frame periods of a subsequent image frame period.
Row 1130 indicates that while object frame content is being received and written, a “video idle mode” may be “Inactive.” Once the receiving and writing of object frame content is complete for a given image frame, the video idle mode may be switched to “Active” to indicate that video idle mode has commenced. Because no new video content is obtained while the video idle mode is in an Active state, any power associated with obtaining video may be turned off. Use of the video idle mode may thus provide power savings. While the video idle mode is Inactive, information may be written to frame buffers 1 and 2, as described above with reference to row 1140. In addition, certain image shift processing may occur while the video idle mode is Inactive, as described further with reference to row 1150.
Row 1150 indicates that there is “Zero shift” with respect to static object S1 and dynamic object D1 during the third and fourth object frame periods of the first image frame period and during the first object frame period of the second image frame period. A “shift per requirement” (e.g., in terms of a relative position within an image frame) may be applied to first and second dynamic objects D1 and D2 during respective object frame periods. As described with reference to
As shown in row 1150, a “shift per requirement” (in terms of relative position within an image frame) may be applied to dynamic object D1 during the second object frame period of the second image frame period. The positional shifting of dynamic object D1 is further indicated in row 1170 by the reference D1(shifted) shown in the second object frame period of the second image frame period. Similarly, a “shift per requirement” (in terms of relative position within an image frame) may be applied to dynamic object D2 during the second object frame period of the third image frame period shown. The positional shifting of dynamic object D2 is further indicated in rows 1150 and 1170 by the last object frame shown, in which row 1150 indicates “Shift per req.” and row 1170 indicates “D2(shifted)” for the second object frame period of the third image frame period.
Row 1160 indicates the timing for two different settings as Setting 1 and Setting 2. In this example, Setting 1 is used for object frames marked as being static and Setting 2 is used for object frames marked as being dynamic.
Row 1170 indicates example timing for image content being displayed during the object frame periods of a given image frame period. For example, row 1170 shows static object S1 as being displayed once during the third object frame period of the first image frame period and once during the first object frame of the second image frame period. Row 1170 further shows dynamic object D1 as being displayed in the fourth object frame of the first image frame period. The dynamic object D1 is displayed at a shifted position D1(shifted) relative to position D1 in the second object frame period of the second image frame. With reference to
Thus, timing scheme 1100 provides an example of how to smooth motion of a dynamic object by temporally dividing image frame periods into multiple object frame periods and deriving intraframe shift amounts of the dynamic object from one image frame to another. Timing scheme 1100 may be applied, for example, by display formatter 380 of
In some systems, the respective display times of the static and dynamic objects may occur during respective, mutually exclusive object frames, as shown in row 1170 of
In some systems, the motion of dynamic objects (e.g., D1 and D2) may be perceived by the viewer to be smoother and hence more realistic if image frame periods are temporally divided into multiple object frame periods and the shift amount from image frame to image frame is itself subdivided through the use of multiple object frames. Static objects (e.g., S1) may also be repeatedly displayed a commensurate number of times during the same image frame period, so as to be perceived by the viewer as having at least the same relative brightness as dynamic objects. Timing scheme 1200 illustrates an example of how such temporal subdivision may be accomplished. Specifically, in this example, timing scheme 1200 subdivides each image frame period into at least eight respective object frame periods.
The top row 1210 of timing scheme 1200 refers to a vertical sync clock or VSYNC signal, which in some systems is produced by control circuit 110. The VSYNC signal may be used, for example, to initiate (and thereby temporally differentiate) each image frame in a sequence of multiple image frames. Row 1210 also provides an example of how two sequential image frame periods may each be temporally subdivided into respective sequential, nonoverlapping object frame periods. While this example uses an 8:1 ratio of object frame periods per image frame period, any suitable ratio may be used.
Row 1220 shows that virtual image content corresponding to static object S1 and dynamic object D1 may be obtained during a first image frame period and that virtual image content corresponding to static object S2 and dynamic object D2 may be obtained during a second image frame period.
Row 1240 shows that the respective image content for static object S1 and dynamic object D1 may be written to frame buffer 1 during a first image frame period; and the respective image content for static object S2 and dynamic object D2 may be written to frame buffer 2 during the second image frame period. Frame buffers 1 and 2 may each be a portion of random-access memory (RAM) containing image content (e.g., a bitmap) that drives a video display. In some systems, frame buffers 1 and 2 may be included within frame memory 350 of control circuit 110.
Row 1230 indicates that while image content is being received and written, a “video idle mode” may be Inactive. Once the receiving and writing of object frame content is complete for a given image frame, the video idle mode may be switched to Active to indicate that video idle mode has commenced. Because no new video content is obtained while the video idle mode is in an Active state, any power associated with obtaining video may be turned off. Use of the video idle mode may thus provide power savings. While the video idle mode is Inactive, information may be written to frame buffers 1 and 2, as described above with reference to row 1240. In addition, certain image shift processing may occur while the video idle mode is Inactive, as described further with reference to row 1250.
Row 1250 indicates that there is “Zero shift” for first, second, third, fourth, fifth, and seventh object frame periods of the second image frame period. A “shift per requirement” (e.g., in terms of a relative position within an image frame) may be applied to first and second dynamic objects D1 and D2 during respective object frame periods. As described with reference to
Various processing may occur during the first image frame period (which may continue during the second image frame period) to determine whether and the extent to which any positional shifting of a dynamic object should be implemented. In this example, the first image frame period shown in
As shown in
Row 1260 indicates the timing for two different settings as “Setting 1” and “Setting 2.” In this example, Setting 1 is used for object frames marked as being static and Setting 2 is used for object frames marked as being dynamic. Row 1270 indicates example subframe numbering that may be used to reference specific object frames.
Row 1280 indicates example timing for image content being displayed during the object frame periods of a given image frame period. For example, row 1280 shows static object S1 as being displayed twice during the second image frame period: once during the first object frame period and once during the fifth object frame period. Row 1280 further shows static object S2 as also being displayed twice during the second image frame period: once during the third object frame period and once during the seventh object frame period. Row 1280 further shows dynamic object D1 and D1 (shifted) being displayed during the second and sixth object frame periods, respectively, of the second image frame period. Row 1280 also shows dynamic object D2 and D2(shifted) being displayed during the fourth and eighth object frame periods of the second image frame period.
Thus, timing scheme 1200 provides an example of how to smooth motion of a dynamic object by temporally dividing image frame periods into multiple object frame periods and deriving intraframe shift amounts of the dynamic object from one image frame to another. Timing scheme 1200 may be applied, for example, by display formatter 380 of
In some systems, the respective display times of the two static and two dynamic objects may occur during respective, mutually exclusive object frames, as shown in row 1280 of
Step 1310 includes control circuit 110 obtaining image frames. In some systems, control circuit 110 may include an image processor 320 configured to analyze video input received from video receiver 310 to derive a plurality of corresponding image frames, as explained herein with reference to
Step 1320 includes control circuit 110 determining the position(s) and size(s) of object(s) or object frame(s) within each image frame. In some systems, control circuit 110 may include a sub-image splitter 330 configured to determine the dimensions and locations of one or more object frames within a given image frame, with each object frame encompassing a respective virtual object, as explained further herein with reference to
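The determination of object frame dimensions and locations in step 1320 can be sketched as finding the bounding box of an object's pixels within an image frame. This is a hedged illustration of one possible approach, not the disclosed implementation; the mask representation and function name are assumptions.

```python
# Illustrative sketch of step 1320: an object frame may be determined as the
# tightest bounding rectangle enclosing an object's pixels in an image frame.

def object_frame_bounds(mask):
    """Given a 2D 0/1 mask (list of rows) marking an object's pixels,
    return (left, top, width, height) of the enclosing rectangle,
    or None if the mask contains no object pixels."""
    rows = [i for i, row in enumerate(mask) if any(row)]
    cols = [j for j in range(len(mask[0])) if any(row[j] for row in mask)]
    if not rows:
        return None
    return (min(cols), min(rows),
            max(cols) - min(cols) + 1, max(rows) - min(rows) + 1)

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
box = object_frame_bounds(mask)
# box == (1, 1, 2, 2)
```

The resulting rectangle's size also feeds into the maximum object frame rate determination discussed earlier, since smaller object frames can be displayed more frequently.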
Step 1330 includes control circuit 110 determining the motion vector(s) of object(s) or object frame(s). As explained with reference to
Step 1340 includes control circuit 110 marking object(s) or object frame(s) as static or dynamic. The marking of object(s) or object frame(s) as being static or dynamic may be based on the motion vector(s) determined in step 1330. As explained with reference to
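One plausible way to mark an object as static or dynamic from its motion vector, as step 1340 describes, is a simple magnitude threshold. This is an assumption for illustration; the threshold value and function name are hypothetical.

```python
import math

# Hypothetical sketch of step 1340: an object may be marked "dynamic" when
# the magnitude of its motion vector exceeds a threshold, and "static"
# otherwise. The 1-pixel threshold is illustrative only.

def mark_object(motion_vector, threshold_px=1.0):
    """Classify an object from its (dx, dy) motion vector in pixels."""
    magnitude = math.hypot(*motion_vector)
    return "dynamic" if magnitude > threshold_px else "static"

moving = mark_object((3, 4))   # magnitude 5 px -> "dynamic"
fixed = mark_object((0, 0))    # magnitude 0 px -> "static"
```

A fixed battery icon like object 410 would be marked static under such a rule, while objects 406 and 408 would be marked dynamic.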
Step 1350 includes control circuit 110 determining any shifted position(s) of object(s). For example, control circuit 110 may include a sub-image shifter 370 configured to control the position shifting of sub-images to be displayed, as explained further herein with reference to
Step 1360 includes control circuit 110 dividing image frame periods into object frame periods. The dividing may include, for example, the control circuit dividing a first image frame period into a certain number of object frame periods and further dividing a second image frame period into a corresponding number of object frame periods.
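The subdivision performed in step 1360 can be sketched as computing the start time of each object frame period relative to the image frame's VSYNC edge. The equal-duration assumption follows the timing schemes above; the function name is illustrative.

```python
# Illustrative sketch of step 1360: subdividing an image frame period into
# equal, nonoverlapping object frame periods, returning the start time of
# each object frame period (in seconds) relative to the VSYNC edge.

def object_frame_starts(image_frame_period_s, num_object_frames):
    slot = image_frame_period_s / num_object_frames
    return [k * slot for k in range(num_object_frames)]

# A 60 Hz image frame period (~16.67 ms) divided into 4 object frame
# periods, as in timing scheme 900:
starts = object_frame_starts(1 / 60, 4)
# object frame periods begin at 0.0, ~4.17 ms, ~8.33 ms, and ~12.5 ms
```

An 8:1 subdivision, as in timing schemes 1000 and 1200, would simply pass 8 as the second argument.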
In some systems, each object frame of an image frame is allotted the same amount of time for a given image frame period. Such a timing scheme may facilitate calculating a frame rate for displaying content of object frames, in which the frame rate is represented as the number of object frames per image frame divided by the amount of time allotted per image frame. A maximum frame rate that may be used may be determined, for example, based on the amount of image content included within an object frame. As explained herein with reference to
Step 1370 includes control circuit 110 scheduling object(s) or object frame(s) to be displayed during respective object frame period(s). Various timing schemes may be used in scheduling object(s) or object frame(s) to be displayed. Example timing schemes are described above with reference to
While control circuit 110 may apply any suitable schedule, in some systems, the schedule may be configured for displaying a given object in the following order: (1) the object or object frame is displayed in accordance with a first position within a first image frame; (2) sometime later the object or object frame is displayed in accordance with a second position that is derived by control circuit 110, based on a motion vector of the object, as a shifted position relative to the first position; and (3) sometime later the object or object frame is displayed in accordance with a third position within a second image frame subsequent to the first image frame, wherein the shifted second position of the object is somewhere between the first position and the third position.
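The three-step display order above can be sketched as follows. This is a hedged illustration: the half-vector choice for the intermediate position is one assumption consistent with the second position lying between the first and third positions, and all names are hypothetical.

```python
# Hypothetical sketch of the display order described above: an object is
# shown (1) at its first-frame position, (2) at a shifted position derived
# from its motion vector, and (3) at its position in the next image frame.

def display_schedule(first_pos, motion_vector, fraction=0.5):
    """Return the ordered positions at which an object is displayed.
    The intermediate position is first_pos plus `fraction` of the motion
    vector, so it lies between the first and third positions."""
    (x, y), (dx, dy) = first_pos, motion_vector
    second_pos = (x + dx * fraction, y + dy * fraction)  # intraframe shift
    third_pos = (x + dx, y + dy)                         # next image frame
    return [first_pos, second_pos, third_pos]

sched = display_schedule((10, 20), (8, 0))
# sched == [(10, 20), (14.0, 20.0), (18, 20)]
```

In the terms used by timing scheme 900, the three entries correspond to D1, D1 (shifted), and D2 respectively.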
In certain instances, an object or object frame may be scheduled for display, at different positions, at least twice during respective object frame periods of the same image frame period. For example, a first display may be in accordance with a first position of the object or object frame within a first image frame; and a second display may be in accordance with a second position that is shifted relative to the first position and that is derived by control circuit 110. A specific example of such a timing scheme is described further herein with respect to
In certain instances, an object or object frame may be scheduled for display at a first position and at derived shifted position, with the first position scheduled to be displayed during a first image frame period and the derived shifted position scheduled to be displayed during a second image frame period subsequent to the first image frame period. An example of such a timing scheme is described further herein with respect to
Step 1380 includes control circuit 110 outputting display signals configured to control the display of object(s) or object frame(s) in accordance with the schedule determined in step 1370. In some systems, control circuit 110 may include a display formatter 380 configured to produce an output formatted to be applied to input interfaces of SLM 120, as explained further with reference to
Step 1390 includes displaying the object(s) or object frame(s) in accordance with the output display signals outputted by the control circuit in step 1380. In some systems, the display signals outputted by control circuit 110 may be provided to the projection subsystem 100 of
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context. To aid the Patent Office, and any readers of any patent issued on this application, in interpreting the claims appended hereto, applicant notes that there is no intention that any of the appended claims invoke paragraph 6 of 35 U.S.C. § 112 as it exists on the date of filing hereof unless the words “means for” or “step for” are explicitly used in the claim language.
In the foregoing descriptions, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of one or more examples. However, this disclosure may be practiced without some or all of these specific details, as will be evident to one having ordinary skill in the art. In other instances, well-known process steps or structures have not been described in detail in order not to unnecessarily obscure this disclosure. In addition, while the disclosure is described in conjunction with examples, this description is not intended to limit the disclosure to the described examples. To the contrary, the description is intended to cover alternatives, modifications, and equivalents as may be included within the spirit and scope of the disclosure as defined by the appended claims.
This application claims priority to U.S. Provisional Application Ser. No. 63/476,968, entitled “SYSTEM AND METHOD FOR CONTENT ADAPTIVE INTRA-FRAME SHIFT IN DISPLAY SYSTEMS,” filed Dec. 23, 2022, which is hereby incorporated by reference herein in its entirety.