DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS

Information

  • Publication Number
    20120140033
  • Date Filed
    March 10, 2011
  • Date Published
    June 07, 2012
Abstract
Displaying three-dimensional (3D) video content to low frame-rate display devices can involve sending 3D video content to a display device. One implementation includes sending a first image for viewing by a user's first eye to the display device via first video frame(s) and sending a second image for viewing by the user's second eye via second video frame(s). Additionally, the implementation includes transmitting an inter-frame blanking signal to a blanking device. The inter-frame blanking signal instructs the blanking device to blank concurrently both of the user's eyes during a transition period of the display device. During the transition period, the display device concurrently displays both a portion of the first video frame(s) and a portion of the second video frame(s).
Description
BACKGROUND

1. The Field of the Invention


This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.


2. Background and Relevant Art


Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the images appear to the human brain to be 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image. 3D display technology generally incorporates the use of a filtering or blanking device, such as glasses, which filter displayed image data to the correct eye. Filtering devices can be passive, meaning that image data is filtered passively (e.g., by color code or by polarization), or active, meaning that the image data is filtered actively (e.g., by shuttering).


Traditional display devices, such as computer monitors, television sets, and portable display devices, have been either incapable of producing suitable image data for 3D viewing, or have produced an inferior 3D viewing experience using known devices and processes. For instance, viewing 3D content from traditional display devices generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer. This is true even for display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.


Recently, 3D display devices designed specifically for displaying 3D content have become increasingly popular. These 3D display devices are generally used in connection with active filtering devices (e.g., shuttering glasses) to produce 3D image quality not previously available from traditional display devices. These 3D display devices, however, are relatively expensive when compared to traditional display devices.


As a result, consumers who desire to view 3D content face the purchase of expensive 3D display devices, even when they may already have traditional display devices available. Accordingly, there are a number of considerations to be made regarding the display of 3D content.


BRIEF SUMMARY

Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices. When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.


For example, a method of sending 3D content to a display device can involve sending one or more first video frames that include a first image for viewing by a user's first eye to a display device, and sending one or more second video frames that include a second image for viewing by the user's second eye to the display device. The method can also involve transmitting an inter-frame blanking signal to a blanking device. The inter-frame blanking signal instructs the blanking device to concurrently blank both the user's first eye and the user's second eye during a display of a transition between the one or more first video frames and the one or more second video frames. During the transition, the display device concurrently displays at least a portion of the one or more first video frames and at least a portion of the one or more second video frames.


Another implementation can include a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The method involves receiving a 3D input signal including one or more input video frames. The input video frames include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye. The method also includes determining frame-rate capabilities of a display device.


After determining frame-rate capabilities of the display device, the method includes generating a 3D output signal for the display device. The 3D output signal comprises first output video frame(s) which include the first image and second output video frame(s) which include the second image. Then, the method further includes transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities. The method also includes transmitting a blanking instruction to a blanking device. The blanking instruction directs the blanking device to blank the user's view of the display device while the display device transitions between the first output video frame(s) and the second output video frame(s).


This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Additional features and advantages of exemplary implementations of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary implementations. The features and advantages of such implementations may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary implementations as set forth hereinafter.





BRIEF DESCRIPTION OF THE DRAWINGS

In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:



FIG. 1 illustrates a schematic diagram of a three-dimensional (3D) content conversion system for sending 3D content to a variety of display devices in accordance with one or more implementations of the present invention;



FIG. 2 illustrates a plurality of flow diagrams which demonstrate output video frames customized to physical characteristics of a destination display device in accordance with one or more implementations of the present invention;



FIG. 3 illustrates a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal in accordance with one or more implementations of the present invention;



FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content, a corresponding blanking signal, and resulting display states in accordance with one or more implementations of the present invention;



FIG. 5 illustrates a schematic diagram of a system for sending 3D content to low frame-rate devices in accordance with one or more implementations of the present invention;



FIG. 6 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention of sending 3D content to a display device; and



FIG. 7 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device.





DETAILED DESCRIPTION

Implementations of the present invention solve one or more problems in the art with systems, methods, and computer program products configured to send three-dimensional (3D) content to a broad range of display devices. When sending 3D content using one or more implementations of the present invention, the viewer at the display device can experience a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can alleviate or eliminate the need to purchase a 3D-specific display device by enabling traditional display devices to display 3D content in a high quality manner.


Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and decreasing a frame overlap interval. The frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second). Frame overlap interval refers to the period of time that elapses when transitioning between two frames. During the frame overlap interval, the display device displays at least a portion of two or more video frames concurrently. Longer frame overlap intervals are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame overlap intervals can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
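
By way of a worked example, the following minimal Python sketch (the numbers are illustrative assumptions, not measurements of any particular device) relates frame-rate, frame period, and the frame overlap interval:

    def frame_period_ms(frame_rate_hz):
        """Time allotted to each frame, in milliseconds."""
        return 1000.0 / frame_rate_hz

    def overlap_fraction(frame_rate_hz, overlap_ms):
        """Portion of each frame period consumed by inter-frame overlap."""
        return overlap_ms / frame_period_ms(frame_rate_hz)

    # A 60 Hz display allots roughly 16.7 ms per frame; an assumed 5 ms
    # frame overlap interval would consume about 30% of every period,
    # which is why longer overlap intervals become perceptible as blur.
    print(frame_period_ms(60.0))        # 16.666...
    print(overlap_fraction(60.0, 5.0))  # 0.3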


One or more implementations of the present invention provide for sending 3D content to lower frame-rate display devices in a manner customized for the display device. This can include, for example, customizing the frame-rate or the frame size of the 3D content for the device, and compensating for the frame overlap interval. Compensating for the frame overlap interval can involve blanking of some or all of the frame overlap interval from the user's view. “Inter-frame” blanking can involve sending a blanking instruction to a blanking device which instructs the blanking device to block all or part of the frame overlap interval from the user's view. In one or more implementations, the system can send the inter-frame blanking instruction synchronously with sending the 3D content to the display device. Thus, one or more implementations allow for sending 3D content to a broad range of display devices, including devices that have lower frame-rates and longer frame overlap intervals, while overcoming problems such as motion blurring and ghosting.



FIG. 1, for example, illustrates a schematic diagram of a 3D content conversion system 100 for sending 3D content to a variety of display devices in accordance with one or more implementations of the invention. As illustrated, the 3D content conversion system 100 includes a video processing device 102. The video processing device 102 can receive input 3D content 104 via a video receiver 114, and can transmit output 3D content 108 via a video transmitter 130.


The video processing device 102 can optimize, tailor, or customize the output 3D content 108 for a particular destination display device (e.g., destination display device 310, FIG. 3) by considering physical characteristics of the destination display device. Customization or tailoring of the output 3D content 108 can include customizing the encoding format, the frame-rate, the frame size, etc. of the output 3D content 108 for the destination display device. To assist with 3D viewing, the video processing device 102 can also generate a blanking signal 136 and transmit the generated blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312, FIG. 3).


In one or more implementations, the video processing device 102 includes a processing component 116 which can include a plurality of sub-components or modules (which can be separate or combined). For example, the processing component 116 can include a decoder 118, frame buffers 122, 124, and an encoder 120. The decoder 118 can receive the input 3D content 104, which can include one or more input video frames 106 that comprise left eye image data and right eye image data. The decoder 118 can detect the 3D encoding format of the input 3D content 104. Then, the decoder 118 can decode left eye image data of the input video frame(s) 106 into one frame buffer (e.g., frame buffer 122) and decode right eye image data of the input video frame(s) into the other frame buffer (e.g., frame buffer 124).


In some circumstances, decoding may involve decoding image data from a plurality of input video frames 106 to construct complete image data for each frame buffer 122, 124. This may be the case, for example, if the input video frames 106 encode each image using a plurality of interlaced video frames. In other circumstances, decoding may involve decoding image data from a single input video frame 106 to construct complete image data for both frame buffers 122, 124. This may be the case, for example, if the input video frames 106 encode both images on a single frame (e.g., using spatial compression, or interleaving). As discussed more fully hereinafter, decoding can include modifying the left eye image data and the right eye image data based on physical characteristics of the destination display device.
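
The two decoding situations can be sketched as follows (a minimal Python illustration in which frames are modeled as lists of rows; the function names are hypothetical and are not the actual decoder 118):

    def decode_spatially_compressed(frame):
        """Split one side-by-side input frame into left and right images."""
        half = len(frame[0]) // 2
        left_image = [row[:half] for row in frame]    # -> frame buffer 122
        right_image = [row[half:] for row in frame]   # -> frame buffer 124
        return left_image, right_image

    def decode_interlaced_fields(odd_field, even_field):
        """Weave two interlaced input frames into one complete image."""
        image = []
        for odd_row, even_row in zip(odd_field, even_field):
            image.extend([odd_row, even_row])
        return image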


Regardless of the specific decoding process, the encoder 120 can encode image data previously decoded into the frame buffers 122, 124 to generate the output 3D content 108. In one or more implementations, the encoder 120 can encode the left and right eye image data from the frame buffers 122, 124 into alternating “left” output video frames 110 and “right” output video frames 112. The left video frame(s) can encode only left image data from a corresponding frame buffer (e.g., frame buffer 122). On the other hand, the right video frame(s) can encode only right image data from a corresponding frame buffer (e.g., frame buffer 124). Thus, the video transmitter 130 can first send one or more output video frames 110 for one eye (e.g., one or more left video frames) to the destination display device, and then send one or more output video frames 112 for the other eye (e.g., one or more right video frames). The decoder 118 can then decode new image data into the frame buffers 122, 124, and the encoder 120 can then encode new output video frames 110, 112 from the frame buffers 122, 124 into the output 3D content 108.
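
A minimal sketch of this frame-sequential encoding loop (Python; the names are hypothetical, and format conversion is omitted):

    def encode_frame_sequential(decoded_pairs):
        """Yield alternating left/right output frames from buffer contents.

        decoded_pairs: iterable of (left_image, right_image) tuples, one
        tuple per decoded input image pair.
        """
        for left_image, right_image in decoded_pairs:
            yield ("left", left_image)    # output video frame(s) 110
            yield ("right", right_image)  # output video frame(s) 112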


The encoding and decoding process can include customizing the output 3D content 108 to physical characteristics of the destination display device. Both the decoder 118 and the encoder 120 can consider physical characteristics of the destination display device to generate customized output 3D content 108 that is appropriate for the particular destination display device. This can include any combination of generating customized types of output video frames (e.g., interlaced or progressive) and/or generating customized sizes of output video frames (e.g., 480, 720, or 1080 vertical lines). This can also include generating output video frames at a target frame-rate (e.g., 60 Hz, 120 Hz). When generating output video frames 110, 112 at a target frame-rate, the 3D content conversion system 100 can send the frames to the destination display device at a rate that would cause the display device to receive a target number of frames per second.
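
Purely by way of illustration, such customization logic might be sketched as follows (Python; the field names and fallback values are assumptions):

    def choose_output_format(display):
        """Pick a frame type, size, and rate suited to a display device."""
        return {
            "progressive": display.get("progressive", True),
            "vertical_lines": min(1080, display.get("vertical_lines", 480)),
            "frame_rate_hz": display.get("max_frame_rate_hz", 60),
        }

    # choose_output_format({"progressive": False, "vertical_lines": 480,
    #                       "max_frame_rate_hz": 60})
    # -> {"progressive": False, "vertical_lines": 480, "frame_rate_hz": 60}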


Turning briefly to FIG. 2, for example, illustrated are a plurality of flow diagrams 202, 204, 206, in accordance with one or more implementations, which demonstrate output video frames 110, 112 customized to physical characteristics of a destination display device. Flow diagrams 202 and 204 illustrate output video frames 110, 112 customized to destination display devices that respectively receive progressive or interlaced video frames. Taking the progressive case, flow diagram 202 illustrates that sending one or more “left” video frames can involve sending a single progressive video frame 110 that includes left image data 208 (e.g., left image data from frame buffer 122). Similarly, the progressive case can also involve sending a single progressive video frame 112 that includes right image data 210 (e.g., right image data from frame buffer 124).


In the interlaced case, on the other hand, flow diagram 204 illustrates that sending one or more “left” video frames can involve sending two (or more) “left” interlaced video frames (110a, 110b). These frames can include left image data 208 (e.g., left image data from frame buffer 122). Similarly, sending one or more “right” video frames can involve sending two (or more) “right” interlaced video frames (112a, 112b). These frames can include right image data 210 (e.g., right image data from frame buffer 124). Of course, one of ordinary skill in the art would understand in view of this disclosure that each of the interlaced video frames encodes only partial image data (e.g., odd lines or even lines).
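
A sketch of producing such partial-image interlaced frames (Python; a hypothetical representation, the inverse of the weaving shown earlier):

    def split_into_fields(image):
        """Derive two interlaced output frames, each with partial data."""
        odd_lines = image[0::2]    # e.g., “left” video frame 110a
        even_lines = image[1::2]   # e.g., “left” video frame 110b
        return odd_lines, even_lines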


Flow diagram 206 illustrates output video frames 110, 112 that would upscale the frame-rate of the output 3D content 108 in accordance with one or more implementations. As illustrated, the output 3D content 108 includes a progressive “left” video frame 110 (corresponding to the left image data 208) and a subsequent progressive “right” video frame 112 (corresponding to right image data 210). Then, the “left” video frame 110 and the “right” video frame 112 repeat, using the same image data (208, 210). One will appreciate in light of the disclosure herein that repeating the same image data twice can double the frame-rate. For example, if the input 3D content 104 had a frame-rate of 60 Hz (i.e., the decoder 118 decoded sixty complete frames per second), then repeating the same image data twice can result in output 3D content 108 having an upscaled frame-rate of 120 Hz.
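
This repetition-based upscaling can be sketched as follows (Python; a hypothetical structure operating on the alternating left/right sequence described above):

    def upscale_double(frames):
        """Repeat each left/right pair, doubling the output frame-rate."""
        out = []
        for i in range(0, len(frames), 2):
            pair = frames[i:i + 2]  # one “left” frame and one “right” frame
            out.extend(pair)        # first pass of the image data 208, 210
            out.extend(pair)        # repeat the same image data
        return out                  # e.g., a 60 Hz sequence becomes 120 Hz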


Downscaling, on the other hand, can involve reducing the number of video frames in the output 3D content 108. This may be useful when sending the output 3D content 108 to destination display devices that may not be optimal for, or capable of, displaying higher frame-rates. Downscaling can involve omitting some frames of left and right image data stored in the frame buffers 122, 124. Downscaling can also involve detecting differences between sequential video frames and generating new frames that capture these differences. Thus, the video processing device 102 can generate new frames at a lower frame-rate than the original frames, thereby reducing the frame-rate of the output 3D content 108.
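
The simplest form of downscaling, omitting every other left/right pair, can be sketched as follows (Python; hypothetical, and the difference-based approach is not shown because it would require motion analysis):

    def downscale_half(frames):
        """Keep one left/right pair out of every two, halving the rate."""
        out = []
        for i in range(0, len(frames), 4):
            out.extend(frames[i:i + 2])  # keep this pair, omit the next
        return out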


Any combinations of customization of the output 3D content 108 are possible. Illustratively, the output 3D content 108 generated for one destination display device may comprise progressive video frames having 1080 lines of vertical resolution sent to the destination display device at 120 Hz. On the other hand, for the same input 3D content 104, but for a different destination display device, the output 3D content 108 may alternatively comprise interlaced video frames having 480 lines of vertical resolution sent to the destination display device at 60 Hz. Of course, these examples are merely illustrative and are not limiting.


Returning to FIG. 1, in one or more implementations customizing the output 3D content 108 can involve the use of additional components or modules, such as a detection module 126. The detection module 126 can detect physical characteristics of the destination display device and provide this information to the other modules or components, such as the decoder 118 and/or the encoder 120. In one or more implementations, the detection module 126 can receive the physical characteristic information via an input receiver 132. The detection module 126 can receive physical characteristic information directly from the destination display device (e.g., via a High-Definition Multimedia Interface (HDMI) connection) or manually (e.g., via user input). The physical characteristic information can include frame size and frame-rate capabilities of the destination display device, an inter-frame overlap interval of the destination display device, etc.
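
The physical characteristic information gathered by the detection module 126 might be modeled as in the following Python sketch (the field names are assumptions, not an actual EDID or HDMI data structure):

    from dataclasses import dataclass

    @dataclass
    class DisplayCharacteristics:
        max_frame_rate_hz: int       # e.g., 60 or 120
        vertical_lines: int          # e.g., 480, 720, or 1080
        progressive: bool            # progressive vs. interlaced frames
        overlap_interval_ms: float   # inter-frame overlap interval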


In one or more implementations, receiving physical characteristic information via user input can involve receiving user feedback about output 3D content 108 displayed on the destination display device. For instance, in a configuration mode, the video processing device 102 can generate and transmit “configuration” output 3D content 108 and a corresponding “configuration” blanking signal 136 in a manner intended to elicit user feedback. The user can then provide any appropriate user feedback about his or her perception of the “configuration” output 3D content 108 and blanking signal 136. The video processing device 102 can then adjust the output 3D content 108 and/or the blanking signal 136 until optimized for the physical characteristics of the destination display device.


For example, the video processing device 102 can send the output 3D content 108 in various different formats to the display device. When the user is able to clearly view an image, the user can provide feedback to the video processing device 102. In one or more implementations, the user can press a button on either a shuttering device, the display device, or the video processing device 102 to signify to the input receiver 132 that the image is clear. The input receiver 132 can forward this input to the detection module 126, which can then determine the physical characteristics of the destination display device. Alternatively, the user can use the input receiver 132 to enter or select a make and model of the destination display device. The detection module 126 can then determine the physical characteristics of the destination display device based on the user input.
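
A sketch of this configuration-mode feedback loop (Python; show_test_pattern and user_confirmed are hypothetical callbacks standing in for the transmission and input paths described above):

    def configure(candidate_formats, show_test_pattern, user_confirmed):
        """Cycle candidate formats until the user reports a clear image."""
        for fmt in candidate_formats:
            show_test_pattern(fmt)  # “configuration” content and blanking
            if user_confirmed():    # e.g., a button press
                return fmt          # physical characteristics determined
        return None                 # fall back to manual entry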


The decoder 118 and the encoder 120 can each be capable of encoding and/or decoding both analog and digital content. Thus, the video processing device 102 can convert digital input 3D content 104 into analog output 3D content 108. Alternatively, the video processing device 102 can convert analog input 3D content 104 into digital output 3D content 108. Of course, the video processing device 102 can also receive digital content and output digital content.


The processing component 116 can also include a blanking signal generator 128, which can generate a blanking signal 136 comprising a plurality of blanking instructions. The blanking signal transmitter 134 can transmit the blanking signal 136 to one or more blanking devices (e.g., blanking device(s) 312, FIG. 3) prior to or concurrently with the transmission of the output 3D content 108 to the destination display device. The blanking instructions in the blanking signal 136 can instruct the blanking device(s) to shutter the display of the output 3D content 108. Thus, the blanking device(s) can respond to the blanking instructions synchronously with the display of the output video frames 110, 112 at the destination display device to shutter the displayed output video frames 110, 112 from the user's view.
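
The instruction sequence such a generator might emit can be sketched as follows (Python; the instruction names are hypothetical stand-ins for the blanking instructions 318, 320, and 322 discussed below):

    BLANK_RIGHT = "blank_right"   # while “left” frames display uniquely
    BLANK_LEFT = "blank_left"     # while “right” frames display uniquely
    BLANK_BOTH = "blank_both"     # while frames overlap in transition

    def blanking_instructions(eye_sequence):
        """eye_sequence: 'L'/'R' markers in display order, e.g., 'LRLR'."""
        previous = None
        for eye in eye_sequence:
            if previous is not None and previous != eye:
                yield BLANK_BOTH  # cover the inter-frame overlap interval
            yield BLANK_RIGHT if eye == "L" else BLANK_LEFT
            previous = eye

    # list(blanking_instructions("LRL")) ->
    # ['blank_right', 'blank_both', 'blank_left', 'blank_both', 'blank_right']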


One will appreciate in light of the disclosure herein that the video processing device 102 can include any number of additional components or modules, or can contain a fewer number of components or modules. Accordingly, the video processing device 102 can depart from the illustrated form without departing from the scope of this disclosure. Furthermore, the video processing device 102 can implement any combination of the components or modules in hardware, software, or a combination thereof. For example, the video processing device 102 can implement one or more components or modules using Field Programmable Gate Arrays (FPGAs).


Turning to FIG. 3, illustrated is a schematic diagram of the shuttering of the display of 3D video content in response to a blanking signal, according to one or more implementations. FIG. 3 illustrates a destination display device 310 and one or more blanking devices 312 in each of three distinct states 302, 304, 306. In each state, the destination display device 310 displays either unique 3D image data 208, 210 or an inter-frame overlap 308, while the blanking device 312 responds to an appropriate blanking instruction 318, 320, 322. For example, the destination display device 310 may be displaying one or more of the output video frames 110, 112 of customized output 3D content 108, while the blanking device(s) 312 may be responding to the blanking signal 136.


Each blanking device 312 can be a “shuttering device” that can blank (or block) one or more portions of a viewer's view of the destination display device 310 to provide the illusion of 3D content display. In state 302, for example, the video processing device 102 can transmit one or more “left” output video frames (e.g., output video frames 110) to the destination display device 310 and can transmit a blanking instruction 318 (blank right) to the blanking device(s) 312. In one or more implementations, the blanking instruction can include a data packet. Thus, when the display device 310 displays “left eye content” 208 in state 302, the video processing device 102 can send the blanking device 312 a blanking signal that includes one or more data packets 318. The data packet 318 can include instructions to use shuttering component 316 to blank the viewer's right eye view of the display device 310. Upon receipt of data packet 318, the blanking device 312 can blank or occlude the viewer's right eye view of the display device 310 using shuttering component 316. In other words, the destination display device 310 can uniquely display left image data 208 while each blanking device 312 uses a “right” blanking component 316 to blank the viewer's right eye view of the displayed left image data 208.


Similarly, in state 306, the video processing device 102 can transmit one or more “right” output video frames (e.g., output video frames 112) to the destination display device 310 and can transmit a blanking instruction 322 (blank left) to the blanking device(s) 312. The data packet or blanking instruction 322 can include instructions to use shuttering component 314 to blank the viewer's left eye view of the display device 310. Thus, upon receipt of data packet 322, the blanking device 312 can blank or occlude the viewer's left eye view of the display device 310 using shuttering component 314. In other words, the destination display device 310 can uniquely display right image data 210 and each blanking device 312 can use a “left” blanking component 314 to blank the viewer's left eye view of the displayed right image data 210.


One will appreciate in view of the disclosure herein that the appropriate shuttering or blanking of a single eye, as in states 302 and 306, when combined with the synchronous display of right and left image data, can provide the illusion that the two-dimensional left and right images are 3D. Of course, states 302 and 306 are not limited to displaying “left” and “right” video frames in the manner illustrated. For instance, in state 302, the destination display device 310 can display right image data 210, and each blanking device 312 can use the “left” blanking component 314 to blank the viewer's left eye. In state 306, on the other hand, the destination display device 310 can display left image data 208, and each blanking device 312 can use the “right” blanking component 316 to blank the viewer's right eye.


In addition to blanking left and right eyes individually, one or more implementations provide an enhanced 3D viewing experience by introducing a third state that blanks both of the viewer's eyes during inter-frame overlap. State 304 illustrates an inter-frame overlap interval occurring after the video processing device 102 transmits the one or more “right” output video frames (e.g., output video frames 112) subsequent to transmitting the “left” frame(s). During this interval, inter-frame overlap 308 may occur, whereby the destination display device 310 concurrently displays portions of image data from two or more video frames (e.g., image data 208 from video frame 110 and image data 210 from video frame 112). During this inter-frame overlap interval, the video processing device 102 can transmit a blanking instruction 320 (blank both) to the blanking device(s) 312. The blanking instruction or data packet 320 can include instructions to use shuttering components 314, 316 to blank the viewer's entire view of the display device 310. Thus, the blanking device(s) 312 can concurrently use both blanking components 314, 316 to blank both the viewer's left eye view and the viewer's right eye view during the inter-frame overlap 308.


By blanking both eyes during state 304, the blanking device(s) 312 can prevent the viewer(s) from viewing at least a portion of the inter-frame overlap 308 during at least a portion of the inter-frame overlap interval. This “inter-frame blanking,” or the synchronous blanking of both eyes during inter-frame overlap intervals, can enhance the clarity of the perceived 3D image. Inter-frame blanking can reduce or eliminate the undesirable effects common to 3D content display, such as motion blurring and ghosting. Thus, the disclosed inter-frame blanking techniques, when synchronously combined with the customized output 3D content 108, can allow for viewing of 3D content on display devices that may have lower frame-rates and/or longer frame overlap intervals.



FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content 108, a corresponding blanking signal 136, and resulting display states, consistent with one or more implementations. Illustrated is a snapshot 400 of time during the transmission of the output 3D content 108 to the destination display device 310, and the transmission of the blanking signal 136 to the blanking device(s) 312. The display states 402 indicate the states 302, 304, 306 discussed herein above in connection with FIG. 3. The horizontal ellipses to the left and right of the snapshot 400 indicate that the snapshot 400 may extend to any point in the past or in the future.


At a time 406, the video processing device 102 can transmit left output video frame(s) 110 to the destination display device 310. As illustrated, time 406 can correspond to the beginning of state 302, in which the destination display device 310 uniquely displays left image data (208, FIG. 3) from the left video frame(s) 110. The video processing device 102 may have started transmission of the left video frame(s) 110 prior to time 406, and a state 304 of inter-frame overlap may have occurred. The video processing device 102 may also have started transmission at the beginning of time 406. Regardless of when transmission began, FIG. 4 illustrates that during the time period between time 406 and a time 408, the output 3D content 108 includes the left output video frame(s) 110, and the blanking signal 136 includes an appropriate blanking instruction 318 (blank right).


At time 408, the video processing device 102 can cease transmitting the left output video frame(s) 110 and begin transmitting right output video frame(s) 112. The video processing device 102 can base the timing of the transition between the left and right video frames on a target frame-rate of the output 3D content 108 tailored for the destination display device 310. For example, if the destination display device 310 would optimally receive sixty progressive frames per second, then the video processing device 102 can transmit a progressive left video frame for 1/60th of a second. Subsequently, the video processing device 102 can transmit a progressive right video frame for another 1/60th of a second. Of course, if the destination display device 310 receives interlaced frames, then the video processing device 102 can transmit a plurality of left video frames and then a plurality of right video frames, each for an appropriate period of time. The transition between transmitting two video frames can occur immediately after the video processing device 102 transmits the last line of a video frame (e.g., after transmitting the 720th line, in the case of “720p” video frames).
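
This per-frame pacing can be sketched as follows (Python; send_frame is a hypothetical stand-in for the line-by-line transmission performed by the video transmitter 130):

    import time

    def transmit(frames, frame_rate_hz, send_frame):
        """Hold each frame on the wire for 1/frame_rate_hz seconds."""
        period = 1.0 / frame_rate_hz  # 1/60th of a second at 60 Hz
        for frame in frames:
            start = time.monotonic()
            send_frame(frame)
            remaining = period - (time.monotonic() - start)
            if remaining > 0:
                time.sleep(remaining)  # wait out this frame's time slot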


Based on the physical characteristic information of the destination display device 310, the video processing device 102 can determine a state 304 from time 408 to a time 410. During this time period, the destination display device 310 would display inter-frame overlap (308, FIG. 3) as the display device transitions between uniquely displaying the left output video frame(s) 110 and the right output video frame(s) 112. Thus, FIG. 4 illustrates that from time 408 to time 410 the blanking signal can include an inter-frame blanking instruction 320 (blank both). As discussed, the inter-frame blanking instruction 320 can blank the inter-frame overlap (308, FIG. 3) from the user's view.
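
The timing relationships of FIG. 4 can be laid out in a simple schedule (Python sketch; the frame period and overlap interval are assumed inputs, and the event labels are illustrative):

    def timeline(frame_period_s, overlap_s, pairs=2):
        """List (time, displayed content, blanking instruction) events."""
        events, t = [], 0.0
        for _ in range(pairs):
            for content, instruction in (
                ("left frame(s) 110", "blank_right"),  # state 302
                ("right frame(s) 112", "blank_left"),  # state 306
            ):
                events.append((round(t, 4), content, instruction))
                t += frame_period_s
                # state 304: inter-frame overlap 308 (e.g., time 408-410)
                events.append((round(t, 4), "overlap 308", "blank_both"))
                t += overlap_s
        return events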


Next, during state 306 the destination display device 310 will have transitioned past the inter-frame overlap and will uniquely display the right output video frame(s) 112. Thus, the video processing device 102 can send an appropriate blanking instruction 322 (blank left) to the blanking device(s) 312. Subsequently, the video processing device 102 can send another one or more left frames, another one or more right frames, etc. These frames can include new image data decoded into the frame buffers, or can include the same data sent previously (i.e., when increasing the frame-rate) in the output 3D content 108.


One will also appreciate that while FIG. 4 illustrates a series of alternating left and right video frames (in any order), one or more implementations extend to any sequence of video frames. In one implementation, for example, the output 3D content 108 can comprise differing sequences of left and right video frames (e.g., left, left, right, right). In another implementation, the output 3D content 108 can include only video frames intended for viewing with both eyes. In yet another implementation, the output 3D content 108 can comprise a combination of different video frame types. One combination, for instance, can include both video frames intended for viewing with both eyes, as well as video frames intended for viewing with a single eye.


Furthermore, in some instances, the blanking signal 136 can instruct the blanking device(s) 312 to blank an entire time period. In other instances, however, the blanking signal 136 can instruct the blanking device(s) 312 to blank only a portion of a corresponding time period, or to blank more than a corresponding time period. In addition, the blanking signal 136 can include other blanking instructions, such as a blanking instruction that causes the blanking device to refrain from blanking.


One will appreciate in light of the disclosure herein that the blanking signal 136 can include any appropriate sequence of blanking instructions that correspond to the output 3D content 108. For instance, if the output 3D content 108 includes a different sequence of left and right video frames, the blanking signal 136 can include an appropriate different sequence of blanking instructions. Furthermore, the blanking signal 136 can depart from the illustrated implementations. For example, the blanking signal 136 can refrain from blanking during one or more time periods corresponding to a transition. Furthermore, the blanking signal 136 can include any number of other blanking instructions, such as blanking instructions that do no blanking (e.g., when displaying a video frame intended for viewing with both eyes).



FIG. 5 illustrates a schematic diagram of a system 500 for sending 3D video content to lower frame-rate devices. FIG. 5 illustrates that the system 500 can include the video processing device 102, one or more blanking devices 312, and a destination display device 310. These devices can be separate or combined. For instance, in one or more implementations the video processing device 102 and the destination display device 310 are separate units, while in one or more other implementations these devices form a single unit.


In one or more implementations the video processing device 102 receives the input 3D content 104 from a media device. The media device can comprise any number of devices capable of transmitting 3D video content to the video processing device 102. For example, FIG. 5 illustrates that the media device can comprise a streaming source 502 (e.g., a satellite box, cable box, the Internet), a gaming device (e.g., XBOX 504, PLAYSTATION 506), a player device (e.g., Blu-Ray player 508, DVD player 510) capable of reading media 512, and the like. Of course, the video processing device 102 can, itself, comprise one or more media devices. In this instance, the video receiver 114 can comprise one or more media devices (e.g., media devices 502, 504, 506, 508, 510).


The video processing device 102 can communicate with the destination display device 310 and the blanking device(s) 312 in any appropriate manner. For instance, an appropriate wired mechanism, such as HDMI, component, composite, coaxial, network, optical, and the like can couple the video processing device 102 and the destination display device 310 together. Additionally, or alternatively, an appropriate wireless mechanism, such as BLUETOOTH, Wi-Fi, etc., can couple the video processing device 102 and the destination display device 310 together. Likewise, any appropriate wired or wireless mechanism (e.g., BLUETOOTH, infrared, etc.) can couple the video processing device 102 and the blanking device(s) 312 together.


One will appreciate that the video processing device 102 can generate any appropriate output signal comprising output 3D content 108. For example, when the video processing device 102 and the destination display device 310 are coupled via a digital mechanism (e.g., HDMI), the video processing device 102 can generate a digital signal that includes the output 3D content 108. On the other hand, when the video processing device 102 and the destination display device 310 are coupled via an analog mechanism (e.g., component, composite or coaxial), the video processing device 102 can generate an analog signal that includes the output 3D content 108.


One will appreciate in view of the disclosure herein that the video processing device 102 can take any of a variety of forms. For example, the video processing device 102 may be a set-top box or other customized computing system. The video processing device 102 may also be a general purpose computing system (e.g., a laptop computer, a desktop computer, a tablet computer, etc.). Alternatively, the video processing device 102 can be a special purpose computing system (e.g., a gaming console, a set-top box, etc.) that has been adapted to implement one or more disclosed features.


The destination display device 310 can be any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., Cathode Ray, Plasma, LCD, LED, OLED). Furthermore, the destination display device 310 can take any of a number of forms, such as a television set, a computer display (e.g., desktop computer monitor, laptop computer display, tablet computer display), a handheld display (e.g., cellular telephone, PDA, handheld gaming device, handheld multimedia device), or any other appropriate form. While the destination display device 310 can be a display device designed specifically to display 3D content, the destination display device 310 can also be a more traditional display device, such as a lower frame-rate device. One will appreciate in light of the disclosure herein that the destination display device 310 can include both digital and analog display devices.


The blanking device(s) 312 can be any blanking device(s) configured to interoperate with video processing device 102 and to respond to one or more blanking instructions received via the blanking signal 136. In one or more implementations, the blanking device(s) 312 comprise shuttering components (314, 316) that include one or more liquid crystal layers. The liquid crystal layers can have the property of becoming opaque (or substantially opaque) when voltage is applied (or, alternatively, when voltage is removed). Otherwise, the liquid crystal layers can have the property of being transparent (or substantially transparent) when voltage is removed (or, alternatively, when voltage is applied). Thus, the blanking device(s) 312 can apply or remove voltage from the shuttering components to block the user's view, as instructed by the blanking signal.
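
Driving such shuttering components from a received blanking instruction can be sketched as follows (Python; set_voltage is a hypothetical hardware call, and the polarity, in which applied voltage makes a layer opaque, is one of the two alternatives described above):

    def respond(instruction, set_voltage):
        """Apply or remove voltage so the instructed eye view is blanked."""
        set_voltage("left", instruction in ("blank_left", "blank_both"))
        set_voltage("right", instruction in ("blank_right", "blank_both"))

    # respond("blank_both", ...) darkens both shuttering components 314,
    # 316; an instruction that does no blanking leaves both transparent.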


Accordingly, FIGS. 1-5 provide a number of components and mechanisms for sending 3D content to display devices synchronously with an inter-frame blanking signal. The 3D content is customized to particular destination display devices, and the inter-frame blanking signal can block inter-frame overlap from a user's view. Thus, one or more disclosed implementations allow for viewing of 3D content on a broad range of display devices, even when that content is not encoded for viewing on those devices.


Additionally, implementations of the present invention can also be described in terms of flowcharts comprising one or more acts in a method for accomplishing a particular result. Along these lines, FIGS. 6-7 illustrate flowcharts of computerized methods of sending 3D content to a display device. For example, FIG. 6 illustrates a flowchart of a method of sending 3D content to a display device. Similarly, FIG. 7 illustrates a flowchart of a method of sending 3D content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The acts of FIGS. 6 and 7 are described herein below with respect to the schematics, diagrams, devices and components shown in FIGS. 1-5.


For example, FIG. 6 shows that a method of sending 3D content to a display device can comprise an act 602 of sending first video frame(s) to a display device. Act 602 can include sending one or more first video frames that include a first image for viewing by a user's first eye to a display device. For example, the act can include the video processing device 102 transmitting output video frames 110 of output 3D content 108 to the destination display device 310 via the video transmitter 130. Also, as illustrated in FIG. 2, sending one or more first video frames can include sending a plurality of interlaced first video frames (e.g., video frames 110a, 110b) or sending a single progressive first video frame (e.g., video frame 110). Furthermore, sending the one or more first video frames can include sending the one or more first video frames at a frame-rate customized to the display device.



FIG. 6 also shows that the method can comprise an act 604 of transmitting an inter-frame blanking signal to a blanking device. Act 604 can include transmitting an inter-frame blanking signal to a blanking device that instructs the blanking device to concurrently blank both of the user's first eye and the user's second eye during a display of a transition during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are to be displayed concurrently at the display device. For example, the act can include the video processing device sending the blanking instruction 320 (blank both) to the blanking device(s) 312 via the blanking signal 136. Of course, the blanking signal 136 can include a blanking instruction 320 that instructs the blanking device to blank both of the user's first eye and the user's second eye during less than an entire display of the transition.


Other blanking instructions are possible. For instance, the inter-frame blanking signal can also instruct the blanking device to blank the user's first eye during individual display of the second image at the display device. The inter-frame blanking signal can also instruct the blanking device to blank the user's second eye during individual display of the first image at the display device. These instructions may correspond with blanking instructions 318 or 322 (in any order), for example.


Additionally, FIG. 6 shows that the method can comprise an act 606 of sending second video frame(s) to the display device. Act 606 can include sending the one or more second video frames that include a second image for viewing by the user's second eye to the display device. For example, the act can include the video processing device 102 transmitting output video frames 112 of output 3D content 108 to the destination display device 310 via the video transmitter 130. Similar to act 602, sending one or more second video frames can include sending a plurality of interlaced second video frames (e.g., video frames 112a, 112b) or sending a single progressive second video frame (e.g., video frame 112). Furthermore, sending the one or more second video frames can include sending the one or more second video frames at a frame-rate customized to the display device.


Although not illustrated, the method can include any number of additional acts. For example, the method can include acts of generating the one or more first video frames and generating the one or more second video frames based on one or more physical characteristics of the display device, including a frame-rate and a frame size of the display device. Illustratively, the generating can include generating output video frames 110, 112 of the output 3D content 108 having a number of lines customized to the destination display device 310 (e.g., 480, 720, 1080). Generating video frames can include generating a number of video frames based on the target frame-rate for the destination display device 310. As well, the method can include an act of generating the inter-frame blanking signal based on one or more physical characteristics of the display device, including an inter-frame overlap interval of the display device, which can be a time period corresponding to the display of the transition.


In addition to the foregoing, FIG. 7 illustrates a method of sending three-dimensional (3D) content to a display device while synchronously sending an inter-frame blanking signal to a blanking device. The method can comprise an act 702 of receiving a 3D input signal. Act 702 can include receiving a 3D input signal including one or more input video frames that include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye. For example, the act can include the video processing device 102 receiving, via the video receiver 114, the input 3D content 104, which includes one or more input video frame(s) 106. In some instances, the one or more input video frames 106 comprise a single video frame (e.g., when the video frame encodes left and right image data using spatial compression or interleaving). In other instances, the one or more input video frames 106 comprise a plurality of video frames (e.g., when separate progressive or interlaced frames encode the left and right image data).


Furthermore, FIG. 7 illustrates that the method can comprise an act 704 of determining capabilities of the display device. Act 704 can include determining frame-rate capabilities of a display device. For example, the act can include the video processing device 102 receiving physical characteristic information of the destination display device 310 via the input receiver 132. The physical characteristic information can include, for instance, frame-rate capabilities, frame size capabilities, frame overlap interval(s), etc. Thus, the act can also include determining frame size capabilities of the display device, or determining a frame overlap interval for the display device. Furthermore, the act can comprise receiving physical characteristic information (e.g., frame-rate capabilities) directly from the display device or via manual user input.



FIG. 7 also illustrates that the method can comprise an act 706 of generating a 3D output signal. Act 706 can include generating a 3D output signal for the display device, comprising one or more first output video frames including the first image and one or more second output video frames including the second image. For example, the act can include the video processing device 102 using the encoder 120 to encode a plurality of output video frames 110, 112 from the frame buffers 122, 124. Of course, when generating output video frames 110, 112, the encoder can take physical capabilities of the display device into account. Thus, the act can also include generating the one or more first output video frames and the one or more second output video frames based on determined capabilities (e.g., frame size, frame-rate).


In addition, FIG. 7 illustrates that the method can comprise an act 708 of transmitting the 3D output signal to the display device. Act 708 can include transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities. For example, the act can include the video processing device 102 using the video transmitter 130 to send the output 3D content 108 to a destination display device 310. To transmit at a specific frame-rate, the act can include sending each video frame for a specific time period appropriate for the frame-rate. For example, if the frame-rate is 60 Hz, the act can include sending each frame for 1/60th of a second.



FIG. 7 also shows that the method can include an act 710 of transmitting a blanking instruction to a blanking device. Act 710 can include transmitting a blanking instruction to a blanking device which directs the blanking device to blank the user's view of the display device while the display device transitions between the one or more first output video frames and the one or more second output video frames. For example, the act can include the video processing device 102 transmitting the blanking signal 136 via the blanking signal transmitter 134. The blanking signal 136 can include a first blanking instruction (e.g., blanking instruction 320) which instructs the blanking device to blank both of a user's eyes.


Of course, the method can include transmitting any number of additional blanking instructions. For example, the method can include transmitting a second blanking instruction to the blanking device which directs the blanking device to blank the user's first eye view of the display device while the display device uniquely displays the one or more second output video frames (e.g., blanking instruction 318). The method can also include transmitting a third blanking instruction to the blanking device which directs the blanking device to blank the user's second eye view of the display device while the display device uniquely displays the one or more first output video frames (e.g., blanking instruction 322). The method can also include transmitting other blanking instructions, such as a blanking instruction which directs the blanking device to refrain from blanking.


Accordingly, FIGS. 1-7 provide a number of components and mechanisms for sending 3D video content to a broad range of display devices. One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame-rates and longer frame overlap intervals, or that are not otherwise specifically designed for displaying 3D video content.


The implementations of the present invention can comprise special purpose or general-purpose computing systems. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, Blu-Ray players, gaming systems, and video converters. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor.


The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory. The memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).


Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.


Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.


A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.


Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.


Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.


Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.


The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Claims
  • 1. At a computer system, the computer system including one or more processors and a memory, a method of sending three-dimensional content to a display device, the method comprising the acts of: sending one or more first video frames that include a first image for viewing by a user's first eye to a display device; transmitting an inter-frame blanking signal to a blanking device that instructs the blanking device to concurrently blank both of the user's first eye and the user's second eye during a display of a transition during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are to be displayed concurrently at the display device; and sending the one or more second video frames that include a second image for viewing by the user's second eye to the display device.
  • 2. The method of claim 1, wherein the inter-frame blanking signal instructs the blanking device to blank both of the user's first eye and the user's second eye during less than an entire display of the transition.
  • 3. The method of claim 1, wherein: sending one or more first video frames comprises sending a plurality of interlaced first video frames, and sending the one or more second video frames comprises sending a plurality of interlaced second video frames.
  • 4. The method of claim 1, wherein: sending one or more first video frames comprises sending a single progressive first video frame; and sending the one or more second video frames comprises sending a single progressive second video frame.
  • 5. The method of claim 1, further comprising generating the inter-frame blanking signal based on one or more physical characteristics of the display device, including an inter-frame overlap interval of the display device.
  • 6. The method of claim 5, wherein the inter-frame overlap interval includes a time period corresponding to the display of the transition.
  • 7. The method of claim 1, wherein sending the one or more first video frames and sending the one or more second video frames comprises sending the one or more first video frames and the one or more second video frames at a frame-rate customized to the display device.
  • 8. The method of claim 1, wherein: the one or more first video frames include image data for only the first image; andthe one or more second video frames include image data for only the second image.
  • 9. The method of claim 1, wherein the inter-frame blanking signal also instructs the blanking device to blank the user's first eye during individual display of the second image at the display device and to blank the user's second eye during individual display of the first image at the display device.
  • 10. The method of claim 1, further comprising generating the one or more first video frames and generating the one or more second video frames based on one or more physical characteristics of the display device, including a frame-rate and a frame size of the display device.
  • 11. At a computer system, the computer system including one or more processors and a memory, a method of tailoring and sending three-dimensional (3D) content to a display device while synchronously sending an inter-frame blanking signal to a blanking device, the method comprising the acts of: receiving a 3D input signal including one or more input video frames that include a first image for viewing by a user's first eye and a second image for viewing by the user's second eye; determining frame-rate capabilities of a display device; generating a 3D output signal for the display device, comprising one or more first output video frames including the first image and one or more second output video frames including the second image; transmitting the 3D output signal to the display device at a frame-rate based on the determined frame-rate capabilities; and transmitting a blanking instruction to a blanking device which directs the blanking device to blank the user's view of the display device while the display device transitions between the one or more first output video frames and the one or more second output video frames.
  • 12. The method of claim 11, wherein the blanking instruction is a first blanking instruction, the method further comprising: transmitting a second blanking instruction to the blanking device which directs the blanking device to blank the user's first eye view of the display device while the display device uniquely displays the one or more second output video frames; and transmitting a third blanking instruction to the blanking device which directs the blanking device to blank the user's second eye view of the display device while the display device uniquely displays the one or more first output video frames.
  • 13. The method of claim 11, further comprising determining frame size capabilities of the display device.
  • 14. The method of claim 13, wherein generating the 3D output signal for the display device comprises generating the one or more first output video frames and the one or more second output video frames at a frame size based on the determined frame size capabilities.
  • 15. The method of claim 11, further comprising determining a frame overlap interval of the display device, during which the display device transitions between the display of two or more video frames.
  • 16. The method of claim 11, wherein the one or more input video frames comprise a single video frame.
  • 17. The method of claim 11, wherein the one or more input video frames comprise a plurality of video frames.
  • 18. The method of claim 11, wherein determining frame-rate capabilities of the display device comprises receiving frame-rate capabilities directly from the display device.
  • 19. The method of claim 11, wherein determining frame-rate capabilities of the display device comprises receiving manual user input.
  • 20. A computer program product for implementing a method for sending three-dimensional content to a display device, the computer program product for use at a computer system, the computer program product comprising one or more computer storage devices having stored thereon computer-executable instructions that, when executed by the computer system, cause one or more processors of the computer system to perform the method, comprising the acts of: sending one or more first video frames that include a first image for viewing by a user's first eye to a display device; transmitting an inter-frame blanking signal to a blanking device that instructs the blanking device to concurrently blank both of the user's first eye and the user's second eye during a display of a transition during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are to be displayed concurrently at the display device; and sending the one or more second video frames that include a second image for viewing by the user's second eye to the display device.
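
By way of illustration only, and not as a limitation or characterization of the claims, the following minimal Python sketch outlines the sending sequence recited in claims 1 and 20. Every class, attribute, and method name here is a hypothetical placeholder; an actual implementation would depend on the particular display interface and blanking-device link.

# Hypothetical stand-ins for the display device and blanking device.
class Display:
    inter_frame_overlap_interval = 0.004  # seconds; a hypothetical transition period

    def send_frames(self, frames):
        pass  # placeholder for transmitting video frames to the display

class BlankingDevice:
    def blank_both_eyes(self, duration):
        pass  # placeholder for transmitting an inter-frame blanking signal

def send_3d_content(display, blanking_device, left_frames, right_frames):
    # Act 1: send first video frame(s) containing the image for the first eye.
    display.send_frames(left_frames)
    # Act 2: instruct the blanking device to blank both eyes concurrently
    # while the display transitions between the first and second frames.
    blanking_device.blank_both_eyes(duration=display.inter_frame_overlap_interval)
    # Act 3: send second video frame(s) containing the image for the second eye.
    display.send_frames(right_frames)

Claim 11 additionally recites determining the display device's frame-rate capabilities and generating the output frames at a rate based on those capabilities before transmission; such steps would precede the sequence sketched above.
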
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a U.S. National Stage Application corresponding to PCT Patent Application No. PCT/US2011/027933, filed Mar. 10, 2011, which claims priority to U.S. Provisional Application No. 61/416,708, filed Nov. 23, 2010, entitled “3D VIDEO CONVERTER.” The present application is also a continuation-in-part of: PCT Patent Application No. PCT/US2011/025262, filed Feb. 17, 2011, entitled “BLANKING INTER-FRAME TRANSITIONS OF A 3D SIGNAL;” PCT Patent Application No. PCT/US2011/027175, filed Mar. 4, 2011, entitled “FORMATTING 3D CONTENT FOR LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/027981, filed Mar. 10, 2011, entitled “SHUTTERING THE DISPLAY OF INTER-FRAME TRANSITIONS;” PCT Patent Application No. PCT/US2011/032549, filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES;” and PCT Patent Application No. PCT/US2011/031115, filed Apr. 4, 2011, entitled “DEVICE FOR DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS.” The entire content of each of the foregoing applications is incorporated by reference herein.

PCT Information
Filing Document Filing Date Country Kind 371c Date
PCT/US11/27933 3/10/2011 WO 00 12/16/2011
Provisional Applications (1)
Number Date Country
61416708 Nov 2010 US
Continuation in Parts (5)
Number Date Country
Parent PCT/US11/25262 Feb 2011 US
Child 13378981 US
Parent PCT/US11/27175 Mar 2011 US
Child PCT/US11/25262 US
Parent PCT/US11/27981 Mar 2011 US
Child PCT/US11/27175 US
Parent PCT/US11/32549 Apr 2011 US
Child PCT/US11/27981 US
Parent PCT/US11/31115 Apr 2011 US
Child PCT/US11/32549 US