The subject disclosure relates to an interactive input system and method.
Interactive input systems that allow users to inject input (e.g., digital ink, mouse events, etc.) into an application program using an active pointer (e.g. a pointer that emits light, sound or other signal), a passive pointer (e.g. a finger, cylinder or other suitable object) or other suitable input device such as for example, a mouse or trackball, are known. These interactive input systems include but are not limited to: touch systems comprising touch panels employing analog resistive or machine vision technology to register pointer input such as those disclosed in U.S. Pat. Nos. 5,448,263; 6,141,000; 6,337,681; 6,747,636; 6,803,906; 7,232,986; 7,236,162; and 7,274,356 assigned to SMART Technologies ULC of Calgary, Alberta, Canada, assignee of the subject application, the entire disclosures of which are incorporated herein by reference; touch systems comprising touch panels employing electromagnetic, capacitive, acoustic or other technologies to register pointer input; laptop and tablet personal computers (PCs); smartphones, personal digital assistants (PDAs) and other handheld devices; and other similar devices.
Above-incorporated U.S. Pat. No. 6,803,906 to Morrison et al. discloses a touch system that employs machine vision to detect pointer interaction with a touch surface on which a computer-generated image is presented. A rectangular bezel or frame surrounds the touch surface and supports imaging devices in the form of digital cameras at its corners. The digital cameras have overlapping fields of view that encompass and look generally across the touch surface. The digital cameras acquire images looking across the touch surface from different vantages and generate image data. Image data acquired by the digital cameras is processed by on-board digital signal processors to determine if a pointer exists in the captured image data. When it is determined that a pointer exists in the captured image data, the digital signal processors convey pointer characteristic data to a master controller, which in turn processes the pointer characteristic data to determine the location of the pointer in (x,y) coordinates relative to the touch surface using triangulation. The pointer coordinates are conveyed to a computer executing one or more application programs. The computer uses the pointer coordinates to update the computer-generated image that is presented on the touch surface. Pointer contacts on the touch surface can therefore be recorded as writing or drawing or used to control execution of application programs executed by the computer.
U.S. Pat. No. 6,281,878 to Montellese discloses an input device for detecting input with respect to a reference plane. The input device includes a light sensor positioned to sense light at an acute angle with respect to the reference plane and for generating a signal indicative of sensed light, and a circuit responsive to the light sensor for determining a position of an object with respect to the reference plane.
U.S. Patent Application Publication No. 2011/0242060 to McGibney et al., assigned to SMART Technologies ULC, discloses an interactive input system comprising at least one imaging assembly having a field of view looking into a region of interest and capturing image frames, and processing structure in communication with the at least one imaging assembly. When a pointer exists in captured image frames, the processing structure demodulates the captured image frames to determine frequency components thereof and examines the frequency components to determine at least one attribute of the pointer.
U.S. Pat. No. 6,219,011 to Aloni et al. discloses an electro-optical display apparatus that includes a plurality of modular units each having a projector for receiving electrical signals, converting them to optical images, and projecting the optical images via an optical projection system onto a screen. The modular units are arranged in a side-by-side array so as to produce a combined display on the screen. A calibration system detects distortions in the combined display caused by the projection system of each modular unit and modifies the electrical signals applied to the projector of each modular unit to correct the combined display with respect to the detected distortions.
One disadvantage of machine vision interactive input systems is that they are susceptible to ambient light, which can cause light artifacts to appear in captured image frames. Such artifacts can cause inaccuracies when processing the captured image frames in order to determine pointer locations. Several approaches to dealing with ambient light have been considered, including calculating difference image frames to cancel out ambient light, using modulated light sources, and using light-emitting pen tools in conjunction with an optical filter overlaying the image sensor of the imaging devices, whereby light emitted by the pen tools is frequency matched to the optical filter so that it may pass through the optical filter to the image sensor. These approaches often improve the ability of the interactive input system to deal with ambient light, but can add to the cost of the interactive input system due to the requirement for additional bezels, filters, light sources and/or computer processing power.
As a result, improvements are desired. It is therefore an object to provide a novel interactive input system and method.
Accordingly, in one aspect there is provided a method of determining pointer position in an interactive input system, the method comprising: identifying pixels of at least one captured image frame as being associated with coherent light; generating a processed image frame from the identified pixels; and determining from the processed image frame a position of at least one pointer that emits coherent light.
In one embodiment, the identifying comprises determining an intensity variance for pixels of the at least one captured image frame and identifying pixels having an intensity variance above a threshold value as the identified pixels. The method may further comprise determining a mean intensity for the pixels of the at least one captured image frame. In one embodiment, the mean intensity is used as the threshold value while in another embodiment, the mean intensity plus one or more standard deviations of estimated noise is used as the threshold value.
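The identification described above can be sketched in code. The following is a minimal illustration, not the disclosed implementation: it assumes the intensity variance is computed per pixel across a stack of successive image frames, and that the processed image frame retains the mean intensity at identified pixels and zero elsewhere. All function and parameter names are illustrative.

```python
import numpy as np

def coherent_light_mask(frames, noise_sigma=0.0, n_sigma=0):
    """Identify pixels likely associated with coherent light.

    frames: stack of successive captures, shape (N, H, W).  Coherent (laser)
    light tends to produce speckle whose intensity fluctuates frame to frame,
    so a high per-pixel intensity variance flags coherent illumination while
    steady ambient light is suppressed.  With the defaults, the threshold is
    simply the mean intensity; passing noise_sigma and n_sigma yields the
    "mean plus N standard deviations of estimated noise" variant.
    """
    stack = np.asarray(frames, dtype=np.float64)
    variance = stack.var(axis=0)                 # per-pixel intensity variance
    threshold = stack.mean() + n_sigma * noise_sigma
    return variance > threshold                  # boolean mask of identified pixels

def processed_frame(frames, mask):
    """Generate a processed image frame from the identified pixels only."""
    mean_frame = np.asarray(frames, dtype=np.float64).mean(axis=0)
    return np.where(mask, mean_frame, 0.0)
```

A pixel whose intensity alternates strongly across the stack survives the mask; a constant background pixel does not.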
In one embodiment, the at least one pointer emits coherent light and is in the form of a pen tool having a diffused tip section configured to emit the coherent light. The coherent light may be coherent infrared light.
According to another aspect, there is provided an interactive input system comprising: at least one imaging device configured to capture image frames of a region of interest; and one or more processors configured to process captured image frames to: identify pixels associated with coherent light, generate processed image frames from the identified pixels, and determine from the processed image frames a position of at least one pointer that emits coherent light.
According to yet another aspect, there is provided a method of processing image frames captured in an interactive system, the method comprising: determining an intensity variance for pixels of the captured image frame; identifying pixels having an intensity variance above a threshold value as being associated with coherent light; and generating a processed image frame from the identified pixels.
According to yet another aspect, there is provided an interactive input system comprising: at least one imaging device capturing image frames of a region of interest; and one or more processors configured to process captured image frames to: determine an intensity variance for pixels of captured image frames, identify pixels having an intensity variance above a threshold value as being associated with coherent light, and generate processed image frames from the identified pixels.
According to yet another aspect, there is provided a non-transitory computer readable medium embodying program code, which when executed by one or more processors, causes an apparatus at least to determine an intensity variance for pixels of captured image frames; identify pixels having an intensity variance above a threshold value as being associated with coherent light; and generate processed image frames from the identified pixels.
Embodiments will now be described more fully with reference to the accompanying drawings in which:
Turning now to
Assembly 22 comprises a frame assembly that is integral with or attached to the display unit and surrounds the display surface 24. Frame assembly comprises a bezel having three bezel segments 40 to 44, four corner pieces 46 and a tool tray segment 48. Bezel segments 40 and 42 extend along opposite side edges of the display surface 24 while bezel segment 44 extends along the top edge of the display surface 24. The tool tray segment 48 extends along the bottom edge of the display surface 24 and supports one or more pen tools P and an eraser tool (not shown). The corner pieces 46 adjacent the top left and top right corners of the display surface 24 couple the bezel segments 40 and 42 to the bezel segment 44. The corner pieces 46 adjacent the bottom left and bottom right corners of the display surface 24 couple the bezel segments 40 and 42 to the tool tray segment 48. In this embodiment, the corner pieces 46 adjacent the bottom left and bottom right corners of the display surface 24 accommodate imaging assemblies 60 that look generally across the entire display surface 24 from different vantages. The bezel segments 40 to 44 are oriented so that their inwardly facing surfaces are seen by the imaging assemblies 60. In this embodiment, the inwardly facing surface of each of the bezel segments 40 to 44 has a light absorbing material thereon.
Turning now to
The clock receiver 76 and serializer 78 employ low voltage, differential signaling (LVDS) to enable high speed communications with the DSP unit 26 over inexpensive cabling. The clock receiver 76 receives timing information from the DSP unit 26 and provides clock signals to the image sensor 70 that determine the rate at which the image sensor 70 captures and outputs image frames. Each image frame output by the image sensor 70 is serialized by the serializer 78 and output to the DSP unit 26 via the connector 72 and communication lines 28.
Turning now to
The computing device 30 in this embodiment is a personal computer or other suitable processing device comprising, for example, a processing unit comprising one or more processors, system memory (volatile and/or non-volatile memory), other non-removable or removable memory (e.g., a hard disk drive, RAM, ROM, EEPROM, CD-ROM, DVD, flash memory, etc.) and a system bus coupling the various computer components to the processing unit. The computing device 30 may also comprise a network connection to access shared or remote drives, one or more networked computers, or other networked devices.
During operation, the controller 120 conditions the clocks 130 and 132 to output clock signals that are conveyed to the imaging assemblies 60 via the communication lines 28. The clock receiver 76 of each imaging assembly 60 uses the clock signals to set the frame rate of its associated image sensor 70. In this embodiment, the controller 120 generates clock signals such that the image frame capture rate of each image sensor 70 is 480 frames per second, four (4) times the desired image frame output rate.
Each imaging assembly 60 typically sees a generally dark region as a result of the light absorbing material on the inwardly facing surfaces of the bezel segments 40 to 44, as well as artifacts resulting from ambient light. If an active pen tool P is brought into contact with the display surface 24 with sufficient force to push the actuator 186 into the tip 184, each imaging assembly 60 will also see an illuminated region corresponding to the illuminated tip 184 of the pen tool P. For example,
As mentioned above, each imaging assembly 60 captures successive image frames and conveys the captured image frames to the DSP unit 26. As each image frame is received, the controller 120 stores the image frame in a buffer. For each imaging assembly 60, once four (4) successive image frames 302 are available, the DSP unit 26 subjects the set of four (4) successive image frames to an image frame processing method, which is shown in
Once the processed image frame has been generated, the controller 120 processes the processed image frame by generating a vertical intensity profile (VIP) for each pixel column, and identifies intensity values that exceed a threshold value and that represent the likelihood that a pointer exists in the processed image frame. If no pen tool P exists in the successive image frames, the resulting processed image frame will not comprise any bright regions. As a result, the intensity values of the generated VIPs will not exceed the threshold value, signifying that no pen tool exists. If one or more pen tools P exist in the successive image frames, the resulting processed image frame will comprise a bright region for each pen tool P. As a result, the intensity values of one or more generated VIPs will exceed the threshold value, signifying that one or more pointers exist. The controller 120 in turn determines the peak locations of the VIPs having intensity values surpassing the threshold value. Using the VIP peak locations, the controller 120 calculates the position of each pen tool P in (x,y) coordinates relative to the display surface 24 using triangulation in the well-known manner, such as that described in above-incorporated U.S. Pat. No. 6,803,906 to Morrison et al. Approaches for generating VIPs are described in U.S. Patent Application Publication No. 2009/0277697 to Bolt et al., entitled “Interactive Input System and Pen Tool Therefor” and assigned to SMART Technologies ULC, the relevant portions of the disclosure of which are incorporated herein by reference.
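The VIP and triangulation steps described above can be sketched as follows. This is an illustrative simplification, not the patented implementation: it assumes the VIP is the column-wise sum of pixel intensities, that a single peak is sought, and that each camera's peak column has already been converted (via camera calibration, not shown) into a sight-line angle measured from the baseline joining the two cameras. All names are hypothetical.

```python
import numpy as np

def vertical_intensity_profile(frame):
    """Sum pixel intensities down each pixel column of a processed frame."""
    return np.asarray(frame, dtype=np.float64).sum(axis=0)

def vip_peak(vip, threshold):
    """Column index of the strongest peak above threshold, or None if no pointer."""
    idx = int(np.argmax(vip))
    return idx if vip[idx] > threshold else None

def triangulate(angle_left, angle_right, baseline):
    """Intersect sight lines from two cameras at opposite ends of a baseline.

    Cameras sit at (0, 0) and (baseline, 0); each angle is measured from the
    baseline toward the pointer.  The pointer satisfies
    y = x*tan(aL) and y = (baseline - x)*tan(aR).
    """
    tl, tr = np.tan(angle_left), np.tan(angle_right)
    x = baseline * tr / (tl + tr)
    y = x * tl
    return x, y
```

For example, a pointer seen at 45 degrees by both cameras across a baseline of 4 units triangulates to (2, 2).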
Once the position of each pen tool P has been determined, it is conveyed by the controller 120 to the general purpose computing device 30 via the USB cable 32. The general purpose computing device 30 in turn processes the received pointer coordinates, and updates the image output provided to the display unit, if required, so that the image presented on the display surface 24 reflects the pointer activity.
Although an embodiment has been described above with reference to
Turning now to
The overhead unit 526 comprises a base assembly 540, a digital signal processor (DSP) unit 544, a projection unit 546, a light curtain module 548, an imaging assembly 550, and a curved mirror 552.
The base assembly 540 comprises mounting structure (not shown) allowing overhead unit 526 to be mounted on the wall or other surface.
The DSP unit 544 communicates with the general purpose computing device 530 via USB cable 532. Alternatively, the DSP unit 544 may communicate with the general purpose computing device 530 over another wired connection such as for example, a parallel bus, an RS-232 connection, an Ethernet connection etc. or may communicate with the general purpose computing device 530 over a wireless connection using a suitable wireless protocol such as for example Bluetooth, WiFi, ZigBee, ANT, IEEE 802.15.4, Z-Wave, etc.
The projection unit 546 projects images received from the general purpose computing device 530 via a USB cable or other suitable wired or wireless communication link (not shown) onto the display surface 524 via curved mirror 552, as indicated by dotted lines 574a, shown in
The light curtain module 548 comprises an infrared (IR) light source such as for example one or more IR laser diodes and optical components that receive the laser diode output and generate a coherent light plane 560, as shown in
The imaging assembly 550 has a field of view encompassing the display surface 524 via curved mirror 552, as indicated by dashed lines 570a in
General purpose computing device 530 receives captured image frames from the DSP unit 544 and processes the captured image frames to detect pointer activity. The general purpose computing device 530 adjusts image data that is output to the projection unit 546 allowing the image presented on the display surface 524 to reflect pointer activity. In this manner, pointer activity proximate to the display surface 524 is recorded as writing or drawing or used to control the execution of one or more application programs executed by the general purpose computing device 530.
In the example shown in
Since ideal environments rarely exist during real world operation, sources of unwanted light may appear in image frames captured by imaging assembly 550. In
To resolve pointer locations and remove sources of ambient light, the general purpose computing device 530 employs an image frame processing method, which is shown in
Once the processed image frame has been generated, the general purpose computing device 530 analyzes the intensity value of each pixel in the processed image frame, and maps coordinates of bright regions in the image plane to coordinates in the display plane for interpretation as ink or mouse events by one or more application programs. Approaches for detecting one or more bright regions in image frames, and mapping the coordinates thereof to pointer positions, are described in U.S. Patent Application Publication No. 2010/0079385 to Holmgren et al. filed on Sep. 29, 2008 and assigned to SMART Technologies ULC, and International PCT Application No. PCT/CA2013/000024 filed on Jan. 11, 2013, the relevant portions of the disclosures of which are incorporated herein by reference.
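The bright-region detection and image-to-display mapping can be sketched as follows. This is a minimal illustration under stated assumptions, not the approach of the incorporated references: it locates one bright region by intensity-weighted centroid and maps it through a 3x3 homography assumed to have been obtained during calibration. All names are hypothetical.

```python
import numpy as np

def bright_region_centroid(frame, threshold):
    """Intensity-weighted centroid (col, row) of pixels above threshold, or None."""
    f = np.asarray(frame, dtype=np.float64)
    rows, cols = np.nonzero(f > threshold)
    if rows.size == 0:
        return None                       # no bright region: no pointer
    w = f[rows, cols]
    return (float((cols * w).sum() / w.sum()),
            float((rows * w).sum() / w.sum()))

def image_to_display(point, H):
    """Map an image-plane point to the display plane via homography H (3x3)."""
    x, y = point
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w                   # normalize homogeneous coordinates
```

With an identity homography the mapping is a pass-through; in practice H would be fitted from known calibration points on the display surface.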
Although the light curtain module is described above as emitting light generally continuously, those skilled in the art will appreciate that the light curtain module may alternatively pulse the emitted light such that it is in sequence with image frame capture.
Although the overhead unit is described as comprising the imaging assembly and the projection unit, in other embodiments, the imaging assembly and the projection unit may alternatively be separate units. In one such embodiment, the projection unit may alternatively be positioned behind the display surface, similar to configurations used in conventional rear-projection devices. In a related embodiment, the imaging assembly may also be positioned behind the display surface, such that the imaging assembly views the back of the display surface.
Also, although the light curtain module is used to provide a coherent light plane spaced from and generally parallel to the display surface, in other embodiments, other modules may be used to provide coherent light adjacent the display surface. For example, in one embodiment, a planar body within which coherent light propagates via total internal reflection (TIR) may be overlaid on the display surface, such that when a pointer contacts the planar body, the totally internally reflected coherent light is frustrated at the contact locations, escapes from the planar body and appears in image frames captured by the imaging assembly.
In alternative embodiments, one or more light curtain modules may be integrated into the interactive input system 20 described above and with reference to
Although in embodiments above, the sources of coherent light are infrared coherent light sources, in other embodiments, the interactive input systems may alternatively be configured to process coherent light generated by non-infrared coherent light sources, such as for example by visible light sources. In such embodiments, each imaging assembly may alternatively comprise a visible-pass/IR block filter, or may alternatively comprise no filter.
Although in embodiments described above, the tip of the active pen tool houses an illumination source, comprising one or more miniature infrared laser diodes, configured to emit coherent light, in other embodiments, illumination sources that emit other frequencies of light may alternatively be used.
Although in embodiments described above, the variance of intensity is compared to a threshold value to identify pixel locations corresponding to coherent light, where the threshold value is the mean intensity, in other embodiments, other threshold values may alternatively be used. In one such embodiment, the threshold value may alternatively be the mean intensity plus a number N of standard deviations of estimated noise. The estimated noise can be determined from one or more “background” image frames obtained during calibration under normal operating conditions, when no pointers or pen tools are in proximity with the interactive surface/display surface. In laboratory testing, a threshold value of the mean intensity plus five (5) standard deviations of estimated noise was found to yield a low number of false positives.
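The threshold calibration described above can be sketched as follows. This is an illustrative reading, not the disclosed procedure: it assumes the noise estimate is the per-pixel temporal standard deviation of pointer-free background frames, averaged over the sensor. All names are hypothetical.

```python
import numpy as np

def estimate_noise_sigma(background_frames):
    """Estimate sensor noise from pointer-free calibration frames.

    background_frames: stack of shape (N, H, W) captured when no pointer or
    pen tool is near the interactive surface.  The per-pixel standard
    deviation across the stack is averaged into a single noise figure.
    """
    stack = np.asarray(background_frames, dtype=np.float64)
    return float(stack.std(axis=0).mean())

def variance_threshold(frames, background_frames, n_sigma=5):
    """Mean intensity plus N standard deviations of estimated noise."""
    mean_intensity = float(np.asarray(frames, dtype=np.float64).mean())
    return mean_intensity + n_sigma * estimate_noise_sigma(background_frames)
```

With noiseless (constant) background frames the threshold collapses to the plain mean-intensity threshold of the earlier embodiment; noisier calibration data raises it proportionally.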
Although embodiments have been described above with reference to the accompanying drawings, those of skill in the art will appreciate that variations and modifications may be made without departing from the scope thereof as defined by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 61/783,383 to Barton filed on Mar. 14, 2013, entitled “Interactive Input System and Method”, the entire content of which is incorporated herein by reference.