Not applicable.
This invention is in the field of interactive display systems. Embodiments of this invention are more specifically directed to determining the location at a display at which a control device is pointing during the interactive operation of a computer system.
The ability of a speaker to communicate a message to an audience is generally enhanced by the use of visual information, in combination with the spoken word. In the modern era, the use of computers and associated display systems to generate and display visual information to audiences has become commonplace, for example by way of applications such as the POWERPOINT presentation software program available from Microsoft Corporation. For large audiences, such as in an auditorium environment, the display system is generally a projection system (either front or rear projection). For smaller audiences such as in a conference room or classroom environment, flat-panel (e.g., liquid crystal) displays have become popular, especially as the cost of these displays has fallen over recent years. New display technologies, such as small projectors (“pico-projectors”), which do not require a special screen and thus are even more readily deployed, are now reaching the market. For presentations to very small audiences (e.g., one or two people), the graphics display of a laptop computer may suffice to present the visual information. In any case, the combination of increasing computer power and better and larger displays, all at less cost, has increased the use of computer-based presentation systems, in a wide array of contexts (e.g., business, educational, legal, entertainment).
A typical computer-based presentation involves the speaker standing remotely from the display system, so as not to block the audience's view of the visual information. Because the visual presentation is computer-generated and computer-controlled, the presentation is capable of being interactively controlled, to allow selection of visual content of particular importance to a specific audience, annotation or illustration of the visual information by the speaker during the presentation, and invocation of effects such as zooming, selecting links to information elsewhere in the presentation (or online), moving display elements from one display location to another, and the like. This interactivity greatly enhances the presentation, making it more interesting and engaging to the audience.
The ability of a speaker to interact, from a distance, with displayed visual content is therefore desirable. More specifically, a hand-held device that a remotely-positioned operator could use to point to, and interact with, the displayed visual information is desirable.
U.S. Pat. No. 8,217,997, issued Jul. 10, 2012, entitled “Interactive Display System”, commonly assigned herewith and incorporated herein by reference, describes an interactive display system including a wireless human interface device (“HID”) constructed as a handheld pointing device including a camera or other video capture system. The pointing device captures images displayed by the computer, including one or more human-imperceptible positioning targets inserted by the computer into the displayed image data. The location, size, and orientation of the recovered positioning target identify the aiming point of the remote pointing device relative to the display.
The positioning of the aiming point of the pointing device according to the approach described in the above-referenced U.S. Pat. No. 8,217,997 is performed at a rate corresponding to the frame rate of the display system. More specifically, a new position can be determined as each new frame of data is displayed, by the combination of the new frame (and its positioning target) and the immediately previous frame (and its complementary positioning target). This approach works quite well in many situations, particularly in the context of navigating and controlling a graphical user interface in a computer system, such as pointing to and “clicking” icons, click-and-drag operations involving displayed windows and frames, and the like. A particular benefit of this approach described in U.S. Pat. No. 8,217,997 is that the positioning is “absolute”, in the sense that the result of the determination is a specific position on the display (e.g., pixel coordinates). The positioning carried out according to this approach is quite accurate over a wide range of distances between the display and the handheld device, ranging for example from physical contact with the display screen to tens of feet away.
U.S. Patent Application Publication No. US 2014/0062881, published Mar. 6, 2014 from copending and commonly assigned U.S. patent application Ser. No. 14/018,695, incorporated herein by this reference, describes an interactive display system including a wireless pointing device and positioning circuitry capable of determining both absolute and relative positions of the display at which the pointing device is aimed. A comparison between the absolute and relative positions at a given time is used to compensate the relative position determined by the motion sensors, enabling both rapid and frequent positioning provided by the motion sensors and also the excellent accuracy provided by absolute positioning.
U.S. Patent Application Publication No. US 2014/0111433, published Apr. 24, 2014 from copending and commonly assigned U.S. patent application Ser. No. 14/056,286, incorporated herein by this reference, describes an interactive display system including a wireless pointing device and positioning circuitry capable of detecting motion of the pointing device between the times at which two frames are captured in order to identify the aiming point of the remote pointing device relative to the display. The ability of the pointing device to detect the positioning target is improved, according to the system and method described in this publication, by aligning the two captured images with one another according to the extent and direction of the detected motion.
Conventional digital cameras typically use a “rolling shutter” mechanism to control the time that the camera sensor is exposed to incident light in obtaining the image (i.e., the “shutter speed”). As known in the art, the rolling shutter describes the technique by way of which the image frame is recorded by the sensor in a scanning manner, either vertically or horizontally, rather than by all sensor pixels capturing the image simultaneously. The rolling shutter technique improves the effective sensitivity of the sensor, because it allows the sensor to gather photons over the acquisition process. However, the rolling shutter can result in distortion in the captured image, particularly if the subject is moving during the exposure (and thus changes location from one portion of the image to another), or if a flash of light occurs during the exposure.
In the context of an interactive display system as described in the above-incorporated patents and publications, in which image capture of consecutive frames by the pointing device is used to determine the aimed-at location of a display at which different and changing images or frames are displayed over time, the use of a rolling shutter can cause artifacts in the captured images. Because the rolling shutter of the pointing device is not synchronized with the display timing, part of the captured image may include pixels released to the display in one frame while another part of the captured image includes pixels released in the next frame. In this case, a visible line (i.e., a “scan line”) of low signal-to-noise ratio or a polarity reversal (i.e., part of the image having light features over dark background and another part of the same image having dark features over light background) will appear in the captured or processed image at the boundary between those frames. This rolling shutter effect can result in inaccurate or indeterminate positioning of the location of the display at which the pointing device is aimed.
Embodiments of this invention provide an interactive display system and method for rapidly and accurately determining an absolute position of the location at a display at which a handheld human interface device, such as a pointing device, using a rolling shutter is pointing.
Some embodiments of this invention provide such a system and method in which the pointing device can be used with a wide range of display types and technologies.
Some embodiments of this invention provide such a system and method in which such absolute positioning can be performed without requiring an external synchronization source.
Other objects and advantages of the various embodiments of this invention will be apparent to those of ordinary skill in the art having reference to the following specification together with its drawings.
Embodiments of this invention may be implemented into an interactive display system and method of operating the same in which a pointing device includes an image capture subsystem, using a rolling shutter, for identifying an absolute location at a displayed image. The pointing device operates by detecting a “scan line”, which is a boundary in the captured image that appears between pixels scanned in different frames; the scan line is present when the image capture by the pointing device is not synchronized with the timing at which frames are released to the display. Circuitry in the pointing device operates to determine the phase difference required to move the scan line to a point outside of the visible pixel data. Other circuitry operates to adjust the phase of one of the pointing device shutter and the display frame scan according to the determined phase difference.
FIGS. 1a and 1b are schematic perspective views of a speaker presentation being carried out using an interactive display system according to embodiments of the invention.
FIGS. 2a through 2c are electrical diagrams, in block form, each illustrating an interactive display system according to an embodiment of the invention.
FIG. 4a is a timing diagram illustrating examples of synchronized and mis-synchronized image capture and frame release in the operation of the systems of FIGS. 2a through 2c.
FIGS. 4b through 4h are illustrations of captured images and subtracted images illustrating the effects of synchronized and mis-synchronized image capture and frame release in the operation of the systems of FIGS. 2a through 2c.
FIGS. 6a through 6c are flow diagrams illustrating the operation of scan line detection in the process of FIG. 5.
FIGS. 7a and 7b are flow diagrams illustrating the operation of phase adjustment in the process of FIG. 5.
FIGS. 8a through 8c are flow diagrams illustrating the operation of optional frequency synchronization as useful in the process of FIG. 5.
This invention will be described in connection with one or more of its embodiments, namely as implemented into a computerized presentation system including a display visible by an audience, as it is contemplated that this invention will be particularly beneficial when applied to such a system. However, it is also contemplated that this invention can be useful in connection with other applications, such as gaming systems, general input by a user into a computer system, and the like. Accordingly, it is to be understood that the following description is provided by way of example only, and is not intended to limit the true scope of this invention as claimed.
FIG. 1a illustrates a simplified example of an environment in which embodiments of this invention are useful.
The types of display 20 used for presenting the visual aids to audience A can also vary, often depending on the size of the presentation environment. In rooms ranging from conference rooms to large-scale auditoriums, display 20 may be a projection display, including a projector disposed either in front of or behind a display screen. In that environment, computer 22 would generate the visual aid image data and forward it to the projector. In smaller environments, display 20 may be an external flat-panel display, such as of the plasma or liquid crystal display (LCD) type, directly driven by a graphics adapter in computer 22. For presentations to one or two audience members, computer 22 in the form of a laptop or desktop computer may simply use its own display 20 to present the visual information. Also for smaller audiences A, hand-held projectors (e.g., “pocket projectors” or “pico projectors”) are becoming more common, in which case the display screen may be a wall or white board.
The use of computer presentation software to generate and present graphics and text in the context of a presentation is now commonplace. A well-known example of such presentation software is the POWERPOINT software program available from Microsoft Corporation.
FIG. 1b illustrates another use of the system and method of embodiments of this invention, in which speaker SPKR closely approaches display 20 to interact with the visual content. In this example, display 20 is operating as a “white board” on which speaker SPKR may “draw” or “write” using pointing device 10 to actively draw content as annotations to the displayed content, or even on a blank screen as suggested by FIG. 1b.
In either case, as described in the above-incorporated U.S. Pat. No. 8,217,997, in the above-incorporated U.S. Patent Application Publications No. US 2014/0062881 and No. US 2014/0111433, and in further detail below in connection with particular embodiments of the invention, speaker SPKR carries out this interaction by way of pointing device 10, which is capable of capturing all or part of the image at display 20 and of interacting with a pointed-to (or aimed-at) target location at that image.
Referring to FIG. 2a, a generalized example of the construction of an interactive display system according to embodiments of this invention will now be described.
In its payload image generation function, computer 22 will generate or have access to the visual information to be displayed (i.e., the visual “payload” images), for example in the form of a previously generated presentation file stored in memory, or in the form of active content such as computer 22 may retrieve over a network or the Internet; for a “white board” application, the payload images will include the inputs provided by the user via pointing device 10, typically displayed on a blank background. This human-visible payload image frame data from computer 22 will be combined with positioning target image content generated by target generator function 23 that, when displayed at graphics display 20, can be captured by pointing device 10 and used by positioning circuitry 25 to deduce the location pointed to by pointing device 10. Graphics adapter 27 includes the appropriate functionality suitable for presenting a sequence of frames of image data, including the combination of the payload image data and the positioning target image content, in the suitable display format, to projector 21. Projector 21 in turn projects the corresponding images I at display screen 20, in this projection example.
The particular construction of computer 22, positioning circuitry 25, target generator circuitry 23, and graphics adapter 27 can vary widely. For example, it is contemplated that a single personal computer or workstation (in desktop, laptop, or other suitable form), including the appropriate processing circuitry (CPU, or microprocessor) and memory, can be constructed and programmed to perform the functions of generating the payload images, generating the positioning target, combining the two prior to or by way of graphics adapter 27, as well as receiving and processing data from pointing device 10 to determine the pointed-to location at the displayed image. Alternatively, it is contemplated that separate functional systems external to computer 22 may carry out one or more of the functions of target generator 23, receiver 24, and positioning circuitry 25, such that computer 22 can be realized as a conventional computer operating without modification; in this event, graphics adapter 27 could itself constitute an external function (or be combined with one or more of the other functions of target generator 23, receiver 24, and positioning circuitry 25, external to computer 22), or alternatively be realized within computer 22, to which output from target generator 23 is presented. Other various alternative implementations of these functions are also contemplated. In any event, it is contemplated that computer 22, positioning circuitry 25, target generator 23, and other functions involved in the generation of the images and positioning targets displayed at graphics display 20, will include the appropriate program memory in the form of computer-readable media storing computer program instructions that, when executed by its processing circuitry, will carry out the various functions and operations of embodiments of the invention as described in this specification. It is contemplated that those skilled in the art having reference to this specification will be readily able to arrange the appropriate computer hardware and corresponding computer programs for implementation of these embodiments of the invention, without undue experimentation.
Pointing device 10 in this example includes a camera function consisting of optical system 12 and image sensor 14. In this example, shutter 13 of the conventional type is implemented as part of image sensor 14 (i.e., as an electronic shutter), and controls the exposure of sensor 14 to light when actuated. Alternatively, shutter 13 may be implemented at or within optical system 12. In the embodiments described in this specification, shutter 13 is of the “rolling shutter” type, in that its opening effectively scans across the pixel field of sensor 14, either horizontally or vertically, which improves the sensitivity of sensor 14 and thus the quality of the captured image, as known in the art. With pointing device 10 aimed at display 20, image sensor 14 is exposed with the captured image via shutter 13, that captured image corresponding to all or part of image I at display 20, depending on the distance between pointing device 10 and display 20, the focal length of lenses within optical system 12, and the like. Image capture subsystem 16 includes the appropriate circuitry known in the art for acquiring and storing a digital representation of the captured image at a particular point in time selected by the user, or as captured at each of a sequence of sample times, including the circuitry that controls the timing and duration of the opening of shutter 13. Pointing device 10 also includes actuator 15, which is a conventional push-button or other switch by way of which the user of pointing device 10 can provide user input in the nature of a mouse button, to actuate an image capture, or for other functions as will be described below and as will be apparent to those skilled in the art. In this example, one or more inertial sensors 17 are also included within pointing device 10, to assist or enhance user interaction with the displayed content; examples of such inertial sensors include accelerometers, magnetic sensors (i.e., for sensing orientation relative to the earth's magnetic field), gyroscopes, and other inertial sensors.
It is contemplated that the particular location of positioning circuitry 25 in the interactive display system of embodiments of this invention may vary from system to system. It is not particularly important, in the general sense, which hardware subsystem (i.e., the computer driving the display, the pointing device, a separate subsystem in the video data path, or some combination thereof) performs the determination of the pointed-to location at display 20.
According to embodiments of this invention, the interactive display system includes scan line detection circuitry 30, phase detection circuitry 32, and phase adjustment circuitry 34.
FIG. 2b illustrates an alternative generalized arrangement of an interactive display system according to embodiments of this invention. This system includes projector 21 and display 20 as in the example of FIG. 2a.
FIG. 2c illustrates an alternative architecture of the interactive display system, according to an embodiment of this invention. This architecture arranges pointing device 10 and positioning circuitry 25 in the manner described above relative to FIG. 2b.
In any of these cases, positioning circuitry 25, 25′ (hereinafter referred to generically as positioning circuitry 25) determines the location at display 20 at which pointing device 10, 10′ (hereinafter referred to generically as pointing device 10) is aimed, as will be described in detail below. As described in the above-incorporated U.S. Pat. No. 8,217,997 and in the above-incorporated U.S. Patent Application Publications No. US 2014/0062881 and No. US 2014/0111433, positioning circuitry 25 performs “absolute” positioning, in the sense that the pointed-to location at the display is determined with reference to a particular pixel position within the displayed image. As described in U.S. Pat. No. 8,217,997 and in U.S. Patent Application Publications No. US 2014/0062881 and No. US 2014/0111433, image capture subsystem 16 captures images from two or more frames, those images including one or more positioning targets that are presented as patterned modulation of the intensity (e.g., variation in pixel intensity) in one display frame of the visual payload, followed by the same pattern but with the opposite modulation in a later (e.g., the next successive) frame.
For purposes of this description, the intensity modification applied for the positioning target is described in a monochromatic sense, with the overall intensity of each pixel described as modulated either brighter or dimmer at the positioning target. Of course, modern displays are color displays, typically realized based on frame data with different intensities for each component color (e.g., red, green, blue). As such, it is contemplated for some embodiments of this invention that the intensity of each component color would be modulated by ±p at the positioning target locations; alternatively, the modulation may vary from color to color.
This process is then repeated for the next frames j+2, j+3, etc., resulting in a sequence of images displayed at display 20, with one or more positioning targets appearing in successive frames, but alternating between being brighter and being dimmer in those successive frames. Because the response of the human eye is generally too slow to perceive individual display frames at modern frame rates on the order of 60 Hz or higher, human viewers will tend to average the perceived displayed images, and thus will not perceive the positioning targets within the displayed payload images.
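In monochromatic terms, if a pixel of the payload image has intensity I and the positioning target modulates that intensity by ±p in successive frames, this averaging by the viewer and the recovery by frame subtraction (described below) can be expressed simply as:

$$\frac{(I + p) + (I - p)}{2} = I, \qquad (I + p) - (I - p) = 2p$$

The viewer thus perceives only the payload intensity I, while the frame-to-frame difference isolates a residue of magnitude 2p at the positioning target locations.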
As described in U.S. Pat. No. 8,217,997 and in U.S. Patent Application Publications No. US 2014/0062881 and No. US 2014/0111433, however, the interactive display system is capable of detecting and identifying the positioning target included within the displayed image I. In summary, image capture subsystem 16 captures images from each of frames j and j+1, each captured image including image data containing the payload image FD(j,j+1) and the complementary positioning target PT1. Positioning circuitry 25 (whether located at computer 22 or at pointing device 10, as described above) subtracts these captured images from one another, cancelling the payload image data and recovering the positioning target, from which the location at display 20 at which pointing device 10 is aimed can be determined.
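For illustration only, the following minimal sketch (in Python, with hypothetical image dimensions, target location, and modulation depth p) demonstrates how this subtraction cancels the payload image data and recovers the positioning target; it is a sketch of the principle, not the actual implementation of positioning circuitry 25:

```python
import numpy as np

# Hypothetical payload image and positioning-target pattern (depth p = 0.05).
payload = np.random.rand(480, 640)      # human-visible image content
target = np.zeros((480, 640))
target[200:220, 300:340] = 0.05         # positioning target region

frame_j = payload + target              # frame j: target modulated brighter
frame_j1 = payload - target             # frame j+1: complementary modulation

difference = frame_j - frame_j1         # payload cancels, target reinforces
assert np.allclose(difference, 2 * target)
```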
However, if the image capture carried out by pointing device 10 is asynchronous relative to the release of frames to display 20 by graphics adaptor 27, this “in sync” condition is a matter of happenstance.
The effects of mis-synchronization between image capture subsystem 16 and the release of frames to display 20 are especially disruptive to the positioning operation described above.
However, the effects of mis-synchronization between image capture and display frame release drastically affect the fidelity of the recovery of positioning targets according to this subtraction technique.
The width of noise band SL will depend on the time required to release and display a new frame at display 20, relative to the rolling shutter interval. A longer frame release interval and/or a shorter rolling shutter exposure time will result in a wider scan line noise band SL in the subtracted images if the shutter is open during that frame release time, because the frame release interval will correspond to a larger portion of the captured images. As such, mis-synchronization of the image capture time with the frame release time can result in a subtracted image that is largely noise, and thus of little use for positioning purposes.
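As a rough quantitative illustration (the values below are assumptions for the sake of example, not taken from this description): if the rolling shutter takes time $T_{scan}$ to traverse the sensor and a new frame takes time $T_{rel}$ to be released to display 20 while the shutter is scanning, the scan line noise band occupies a fraction of the image height H of approximately

$$\frac{w_{SL}}{H} \approx \frac{T_{rel}}{T_{scan}}$$

For example, a 4 ms frame release captured during a 12 ms shutter traversal would corrupt roughly one-third of the captured rows.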
FIG. 4h illustrates another complication that can arise from mis-synchronized image capture. As evident from the above description of the interactive display system, pointing device 10 can be held by the user in various attitudes; the images captured and subtracted in the positioning process can therefore themselves be rotated relative to the displayed image.
According to embodiments of the invention, the image capture process by pointing device 10 is synchronized with the release of frames to display 20, such that each image captured by pointing device 10 for positioning purposes corresponds to one and only one image frame displayed at display 20, and does not include scan line noise or image information from multiple frames. In a general sense according to these embodiments, one may consider the synchronization problem as involving the synchronization of both frequency (i.e., the image capture rate should match the frame release rate) and phase (i.e., the timing of image capture should occur at a desired time relative to the frame release cycle). Frequency synchronization could be accomplished in a master/slave fashion by having the display system (computer 22, graphics adaptor 27, or display 20) as the master and pointing device 10 as the slave, or having pointing device 10 be the master and the display system as the slave, or having both the display system and pointing device 10 slaved to an external master device. However, the interactive display system described above and in U.S. Pat. No. 8,217,997 and U.S. Patent Application Publications No. US 2014/0062881 and No. US 2014/0111433 desirably allows pointing device 10 to operate with multiple display systems, most if not all of which may be pre-installed without regard to a particular pointing device. As such, it may not be practical in many instances to provide such a master/slave arrangement to attain frequency synchronization. Embodiments of this invention therefore control the synchronization of image capture relative to frame release, in other words reducing the phase difference between those events so that the undesired artifacts described above do not appear in the subtracted image data used for positioning purposes.
Referring now to FIG. 5, the overall operation of synchronizing image capture at the pointing device with the release of frames to display 20, according to embodiments of this invention, will be described.
Synchronization of image capture at pointing device 10 (or pointing device 10′, as the case may be; for purposes of clarity, the following description will refer to either of these pointing devices as pointing device 10) with the release of frames to display 20 begins with process 70, in which image frames are displayed at display 20 by the operation of computer 22, target generator 23, graphics adaptor 27, and projector 21 as described above in connection with FIGS. 2a through 2c.
According to embodiments of this invention, phase synchronization process 75 is performed to synchronize the timing of image capture with the release of frames to display 20, to avoid the situations described above relative to FIGS. 4b through 4h.
According to embodiments of the invention, phase synchronization process 75 begins with process 76, in which scan line detection circuitry 30 detects the position of a scan line in images captured by image capture subsystem 16.
Once the scan line position has been detected in process 76, phase detection circuitry 32 executes process 78 to determine a phase difference between the timing of image capture and that of the release of a frame to display 20, based on the position of the scan line determined in process 76. It is contemplated that process 78 will typically be based on a transform from the spatial position of the scan line in the captured or subtracted image, as determined in process 76, into a temporal relationship of that scan line position relative to the period of the frame rate. As such, it is contemplated that the specific approach involved in process 78 will be apparent to those skilled in the art having reference to this specification.
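A minimal sketch of such a transform, under the simplifying assumption that the rolling shutter traverses the sensor once per frame period (the function and parameter names here are hypothetical):

```python
def phase_difference(scan_line_row, image_rows, frame_period_s):
    # A scan line at a given fractional height indicates that the frame
    # release occurred that fraction of the way through the shutter scan;
    # scaling by the frame period converts position to a time offset.
    fraction = scan_line_row / image_rows
    return fraction * frame_period_s
```

For example, a scan line detected at row 360 of a 720-row image, with a 60 Hz frame rate (a period of about 16.7 ms), would correspond to a phase difference of roughly 8.3 ms.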
Following the determination of the phase difference in process 78, according to embodiments of the invention, process 80 is then performed by phase adjustment circuitry 34 to adjust the relative phase of image capture and the release of frames to display 20. Specific implementations of process 80 will be described in detail below by way of example. In general, process 80 may be performed by adjusting the timing of image capture by image capture subsystem 16 in pointing device 10, in which case phase adjustment circuitry 34 will be realized in pointing device 10, or alternatively by adjusting the timing of the release of display image frames to display 20, in which case phase adjustment circuitry 34 will be realized in computer 22, graphics adaptor 27, or projector 21 in the implementations of FIGS. 2a through 2c.
Particular embodiments of the manner in which scan line detection process 76 may be implemented will now be described in connection with FIGS. 6a through 6c.
In the embodiment shown in FIG. 6a, scan line detection process 76a begins with process 82, in which one or more raw captured images are retrieved from image capture subsystem 16 for analysis.
In some embodiments, as discussed above, pointing device 10 may include inertial sensors 17 that are capable of detecting the relative motion of pointing device 10, including the rotation of pointing device 10 by the user. As discussed above, such rotation can cause the scan line to appear at an angle in the captured image; optional process 84 therefore de-rotates the retrieved image or images according to the rotation sensed by inertial sensors 17, to facilitate detection of the scan line.
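A minimal sketch of such a de-rotation step, assuming the roll angle about the aiming axis is available from inertial sensors 17 (the interface shown is an assumption for illustration):

```python
from scipy import ndimage

def derotate(image, roll_degrees):
    # Counter-rotate the captured image by the sensed roll angle so that
    # any scan line lies parallel to the image rows before detection.
    return ndimage.rotate(image, -roll_degrees, reshape=False, mode="nearest")
```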
In process 86, scan line detection circuitry 30 processes the retrieved images according to conventional image processing algorithms to detect any linear region of high noise in the image or images. As discussed above, it is contemplated that the mis-synchronization of image capture relative to frame release will often present a region of high spatial noise (e.g., significant high frequency variations) at the locations of the image obtained during a transition from one displayed image frame to the next. Accordingly, process 86 analyzes the retrieved image, for example by applying a spatial frequency transform algorithm, to determine whether a linear region of high noise is present, and if so, the position of that region within the image.
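One possible realization of process 86 is sketched below, under the assumption that the noise band manifests as rows with unusually high adjacent-pixel variation; the threshold and names are illustrative, not prescribed by this description:

```python
import numpy as np

def find_noise_band(image):
    # High-spatial-frequency energy per row: rows within the
    # mis-synchronization noise band show large pixel-to-pixel variation.
    energy = np.abs(np.diff(image.astype(float), axis=1)).mean(axis=1)
    threshold = energy.mean() + 2.0 * energy.std()
    noisy_rows = np.where(energy > threshold)[0]
    if noisy_rows.size == 0:
        return None                    # no scan line detected
    return int(noisy_rows.mean())      # approximate center row of the band
```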
The result of process 86, and thus of scan line detection process 76a, is then forwarded to phase detection circuitry 32 for determination of the phase difference in process 78.
FIG. 6b illustrates another approach to scan line detection process 76 according to an embodiment of the invention. In this embodiment, scan line detection process 76b begins with the retrieval of one or more subtracted frames, in process 88, following subtraction process 74 as used in the positioning process. The retrieved subtracted frames are optionally rotated in process 84 based on information from inertial sensors 17 (if present), as described above.
In process 90, scan line detection circuitry 30 performs an image processing routine on the retrieved subtracted image or images to detect a boundary between image features of the opposite polarity.
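One way in which process 90 might locate that boundary is sketched below, assuming the subtracted image is predominantly of one polarity on each side of the scan line (the smoothing window and names are illustrative):

```python
import numpy as np

def find_polarity_boundary(subtracted):
    # The positioning-target residue is positive on one side of the scan
    # line and negative on the other, so a smoothed per-row mean of the
    # subtracted image changes sign near the boundary.
    row_mean = subtracted.astype(float).mean(axis=1)
    smoothed = np.convolve(row_mean, np.ones(15) / 15.0, mode="same")
    crossings = np.where(np.diff(np.sign(smoothed)) != 0)[0]
    return int(crossings[0]) if crossings.size else None
```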
The result of process 90 is then forwarded to phase detection circuitry 32 for determination of the phase difference in process 78a, in the same manner as discussed above.
FIG. 6c illustrates another embodiment of scan line detection process 76. In this embodiment, scan line detection process 76c essentially operates by identifying the absence of a scan line in the analyzed images. As such, this embodiment of scan line detection process 76c is incorporated in combination with phase adjustment process 80′, such that both processes are iteratively performed together. In other words, upon completion of process 76c, the relative timing of image capture and frame release will have already been adjusted.
For purposes of this embodiment, either raw captured images from process 82 or subtracted images from process 74 may be used in the scan line detection. Scan line detection process 76c thus begins with either of retrieval processes 82 or 88, depending upon whether raw captured images or subtracted images are to be analyzed. In either case, rotation process 84 is then optionally performed to de-rotate the retrieved image or images according to information from inertial sensors 17, if present. The retrieved images are then processed, for example by either of image processing processes 86, 90 described above or by another similar approach, to detect whether a scan line is present in the images. For purposes of this embodiment, it is not essential that the position of the scan line within the image be identified in process 86, 90; rather, the images need only be processed to determine whether a scan line is present. In particular, processes 86, 90 may be performed simply to determine whether the images are sufficiently clear (i.e., noise-free) to identify positioning targets; if not, then the presence of a scan line can be assumed.
In decision 91, scan line detection circuitry 30 evaluates the results of process 86, 90. If a scan line is present (decision 91 is “yes”), phase adjustment process 80′ is performed to incrementally adjust the timing of image capture relative to the release of display image frames to display 20, by adjusting either or both of image capture subsystem 16 or the display system (computer 22, graphics adaptor 27, or projector 21). Retrieval process 82, 88, optional rotation process 84, and image processing process 86, 90 are then repeated, and decision 91 is again evaluated. As mentioned above, knowledge of the phase difference indicated by the scan line position is not essential, nor is the polarity of the phase adjustment applied in process 80′ critical; the iterative nature of this approach will eventually settle on proper synchronization. However, convergence to synchronized operation can occur more rapidly if the phase difference and preferred polarity of adjustment are taken into consideration, as will be described below. Upon decision 91 determining that no scan line is present or that the image quality is sufficient to accurately perform the positioning process (decision 91 is “no”), the result is forwarded to process 78b, which in this embodiment determines that the phase difference is zero (i.e., no scan line is present, and therefore image capture and frame release are synchronized).
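The combined operation of scan line detection process 76c and phase adjustment process 80′ can be sketched as a simple loop; the callables below are hypothetical placeholders for the circuitry described above:

```python
def iterative_synchronize(capture_image, scan_line_present, nudge_phase,
                          step_s=0.0005, max_iterations=200):
    # Repeatedly test for a scan line and incrementally shift the relative
    # timing until the captured images are free of scan-line noise.
    for _ in range(max_iterations):
        if not scan_line_present(capture_image()):
            return True      # synchronized; phase difference effectively zero
        nudge_phase(step_s)  # advance (or retard) the timing slightly
    return False             # did not converge within the iteration budget
```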
It is contemplated that scan line detection process 76c in this embodiment will be particularly useful in those implementations in which the mis-synchronized state does not exhibit a visible scan line, but rather results in a raw or subtracted image that is essentially noise over most if not all of the image field. This situation may present itself if the duration of the rolling shutter exposure is relatively long, occupying much of the period of the display frame.
Particular embodiments of the manner in which phase adjustment process 80 may be implemented will now be described in connection with FIGS. 7a and 7b.
Phase adjustment process 80a as shown in FIG. 7a begins with process 92, in which phase adjustment circuitry 34 retrieves the phase difference Δφ determined in process 78.
In process 94, phase adjustment circuitry 34 applies the phase difference Δφ to either or both of image capture subsystem 16 in pointing device 10, or to the appropriate component of the display system if the timing of frame release to display 20 is to be adjusted. In the case of adjustment of the timing of image capture subsystem 16 in pointing device 10, it is contemplated that adjustment process 94 may be carried out in any one of a number of ways, depending on the particular implementation of image capture subsystem 16. For example, if the image capture timing is a programmable parameter in image capture subsystem 16, timing adjustment process 94 may be performed by altering a timing parameter stored in a control register or other operative memory element of image capture subsystem 16, or by issuing a software or firmware command to logic circuitry in image capture subsystem 16. In other cases, adjustment of the timing of operation of image capture subsystem 16 may be performed by issuing a hardware synchronization signal (e.g., a “sync” pulse) to the appropriate circuitry. Conversely, phase adjustment process 94 may be similarly performed to adjust the timing of frame release, for example by similarly updating a software/firmware register within, or by issuing a hardware synchronization signal to, the appropriate component of the display system (computer 22, graphics adaptor 27, projector 21). It is contemplated that those skilled in the art having reference to this specification will be readily able to realize the adjustment of this relative timing in process 80a for particular implementations, without undue experimentation.
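By way of illustration, a phase adjustment applied through a programmable timing parameter might resemble the following sketch; the register name and sensor interface are assumptions for illustration, not the API of any actual image sensor:

```python
def apply_phase_offset(sensor, delta_phi_s):
    # Convert the phase difference into timing-clock ticks and program the
    # start of the next rolling-shutter scan accordingly, so that the
    # exposure no longer straddles a frame-release boundary.
    ticks = int(delta_phi_s * sensor.timing_clock_hz)
    sensor.write_register("CAPTURE_START_DELAY", ticks)
```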
FIG. 7b illustrates phase adjustment process 80b according to an alternative implementation, in which the relative timing of image capture and frame release is incrementally adjusted. Process 92 is again performed by phase adjustment circuitry 34 to retrieve the phase difference Δφ determined in process 78. In this embodiment, an incremental adjustment is applied to image capture subsystem 16 or to the appropriate component of the display system (computer 22, graphics adaptor 27, projector 21), to advance or retard the timing of image capture or frame release by increment dφ. This increment dφ may vary, depending on the value of the retrieved phase difference Δφ, or instead may be a constant increment, for example at or near the smallest timing increment available. Similarly as described above relative to FIG. 7a, this adjustment may be applied by altering a programmable timing parameter or by issuing a hardware synchronization signal, as appropriate for the particular implementation.
As mentioned above, synchronization of the frequency at which image capture subsystem 16 acquires images with the rate at which display frames are “released” to display 20 by the display system (computer 22, graphics adaptor 27, or projector 21, as the case may be) can assist in the operation of some embodiments of the invention. These embodiments will now be described in connection with FIGS. 8a through 8c.
FIG. 8a illustrates a first embodiment of this optional frequency synchronization. In process 100, the rate at which frames are released to display 20 is measured or otherwise identified. It is contemplated that any one of a number of approaches may be used to carry out process 100, including interrogation of a control register or other setting of graphics adaptor 27 (e.g., by computer 22), use of a counter to actually measure the time elapsed between sync or other signals indicative of the frame rate, and the like. It is contemplated that this process 100 will be carried out at the display system. In process 102, the frame release rate measured in process 100 is communicated to pointing device 10, for example by way of signals communicated by transceiver 24′ to transceiver 18′, and the rate at which image capture subsystem 16 acquires images is then set to match that frame release rate.
FIG. 8b illustrates a phase-locked loop approach to frequency synchronization, according to an alternative embodiment, which is carried out at pointing device 10. This approach begins with process 106, in which pointing device 10 identifies the release rate of frames to display 20. Process 106 may be performed in a number of ways, for example by receiving sync signals or other start-of-frame indications from the display system, or alternatively by detecting scan lines or other events in captured images. Circuitry in pointing device 10 then identifies a frequency error between the frame release rate obtained in process 106 and the current image capture rate, in process 108. This frequency error value is used to adjust the image capture rate at image capture subsystem 16, in process 110, in a direction and by a value that reduces the frequency error. Processes 106, 108, 110 are then repeated in “PLL” fashion to maintain the two frequencies in synchronization. Phase synchronization process 75 can then be carried out, for example according to one of the embodiments described above, preferably in parallel with the continued frequency synchronization processes of this embodiment.
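One iteration of processes 106, 108, and 110 can be sketched as a simple proportional control step (the gain and names are illustrative assumptions):

```python
def pll_step(frame_release_rate_hz, capture_rate_hz, gain=0.5):
    # Process 108: frequency error between the display's frame-release
    # rate and the current image capture rate.
    error_hz = frame_release_rate_hz - capture_rate_hz
    # Process 110: adjust the capture rate to reduce the error; repeating
    # this step tracks the display in phase-locked-loop fashion.
    return capture_rate_hz + gain * error_hz
```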
FIG. 8c illustrates another approach to frequency synchronization, particularly in connection with the architecture of FIG. 2c described above.
Other alternative approaches to attaining frequency synchronization are also contemplated. For example, an internal clock in pointing device 10 that controls the rate of image capture subsystem 16 may be synchronized to an internal clock in the display system (i.e., in computer 22, graphics adaptor 27, or projector 21) that controls the release of frames to display 20, or vice versa. This frequency synchronization of the respective internal clocks may be accomplished by one of pointing device 10 or the display system communicating its internal clock rate (or “beat” signal) to the other, for example over the wireless communication link between transceivers 18′, 24′ described above.
According to these embodiments of this invention, therefore, the ability of a pointing device to operate and control an interactive display system is improved. Specifically, the positioning of the location of a display at which a remote pointing device is aimed, and thus of the displayed graphics or text element that is to be controlled by the user by way of the pointing device, can be carried out more accurately and reliably, by ensuring good fidelity in the images captured by the pointing device for use in the positioning process. Some of the embodiments described enable the benefits of image capture synchronization to be attained over a wide range of display types, without requiring reconfiguration of the display system. It is contemplated that these advantages as applied to the absolute positioning process will significantly improve the operation of the interactive display system, as well as the experience provided to the audience.
While this invention has been described according to its embodiments, it is of course contemplated that modifications of, and alternatives to, these embodiments, such modifications and alternatives obtaining the advantages and benefits of this invention, will be apparent to those of ordinary skill in the art having reference to this specification and its drawings. It is contemplated that such modifications and alternatives are within the scope of this invention as subsequently claimed herein.
This application claims priority, under 35 U.S.C. §119(e), of Provisional Application No. 61/871,377, filed Aug. 29, 2013, incorporated herein by this reference.
Number | Date | Country
---|---|---
61/871,377 | Aug. 29, 2013 | US