The present invention relates generally to interactive input systems, and particularly to an interactive input system and an arm assembly therefor.
Interactive input systems that allow users to inject input (e.g. digital ink, mouse events, etc.) into an application program using an active pointer (e.g. a pointer that emits light, sound or other signal), a passive pointer (e.g. a finger, cylinder or other object) or other suitable input device such as for example, a mouse or trackball, are well known. These interactive input systems include but are not limited to: touch systems comprising touch panels employing analog resistive or machine vision technology to register pointer input such as those disclosed in U.S. Pat. Nos. 5,448,263; 6,141,000; 6,337,681; 6,747,636; 6,803,906; 7,232,986; 7,236,162; 7,274,356; and 7,532,206 assigned to SMART Technologies ULC of Calgary, Alberta, Canada, assignee of the subject application, the contents of which are incorporated herein by reference in their entirety; touch systems comprising touch panels employing electromagnetic, capacitive, acoustic or other technologies to register pointer input; tablet personal computers (PCs); laptop PCs; personal digital assistants (PDAs); and other similar devices.
Above-incorporated U.S. Pat. No. 6,803,906 to Morrison et al. discloses a touch system that employs machine vision to detect pointer interaction with a touch surface on which a computer-generated image is presented. A rectangular bezel or frame surrounds the touch surface and supports digital imaging devices at its corners. The digital imaging devices have overlapping fields of view that encompass and look generally across the touch surface. The digital imaging devices acquire images looking across the touch surface from different vantages and generate image data. Image data acquired by the digital imaging devices is processed by on-board digital signal processors to determine if a pointer exists in the captured image data. When it is determined that a pointer exists in the captured image data, the digital signal processors convey pointer characteristic data to a master controller, which in turn processes the pointer characteristic data to determine the location of the pointer in (x,y) coordinates relative to the touch surface using triangulation. The pointer coordinates are conveyed to a computer executing one or more application programs. The computer uses the pointer coordinates to update the computer-generated image that is presented on the touch surface. Pointer contacts on the touch surface can therefore be recorded as writing or drawing or used to control execution of application programs executed by the computer.
Above-incorporated U.S. Pat. No. 7,532,206 to Morrison et al. discloses a touch system and method that differentiates between passive pointers used to contact a touch surface so that pointer position data generated in response to a pointer contact with the touch surface can be processed in accordance with the type of pointer used to contact the touch surface. The touch system comprises a touch surface to be contacted by a passive pointer and at least one imaging device having a field of view looking generally along the touch surface. At least one processor communicates with the at least one imaging device and analyzes images acquired by the at least one imaging device to determine the type of pointer used to contact the touch surface and the location on the touch surface where pointer contact is made. The determined type of pointer and the location on the touch surface where the pointer contact is made are used by a computer to control execution of an application program executed by the computer.
In order to determine the type of pointer used to contact the touch surface, in one embodiment a curve of growth method is employed to differentiate between different pointers. During this method, a horizontal intensity profile (HIP) is formed by calculating a sum along each row of pixels in each acquired image thereby to produce a one-dimensional profile having a number of points equal to the row dimension of the acquired image. A curve of growth is then generated from the HIP by forming the cumulative sum from the HIP.
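By way of illustration only, the following sketch shows one way the horizontal intensity profile and curve of growth described above could be computed for a grayscale image frame; the function names, frame dimensions and use of the NumPy library are assumptions of this sketch rather than details taken from the above-incorporated patent.

```python
import numpy as np

def horizontal_intensity_profile(frame: np.ndarray) -> np.ndarray:
    """Sum pixel intensities along each row, producing a one-dimensional
    profile whose number of points equals the row dimension of the frame."""
    return frame.sum(axis=1).astype(np.float64)

def curve_of_growth(hip: np.ndarray) -> np.ndarray:
    """Form the cumulative sum of the HIP; the shape of the resulting curve
    can then be used to differentiate between pointer types."""
    return np.cumsum(hip)

# Example with a synthetic 480 x 640 frame (values are illustrative only).
frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
hip = horizontal_intensity_profile(frame)
cog = curve_of_growth(hip)
assert hip.shape == (480,) and cog.shape == (480,)
```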
Although passive touch systems provide some advantages over active touch systems and work extremely well, using both active and passive pointers in conjunction with a touch system provides more intuitive input modalities while reducing the number of processors and/or the processor load.
U.S. Pat. Nos. 6,335,724 and 6,828,959 to Takekawa et al. disclose a coordinate-position input device having a rectangular frame with reflecting members, provided on the inner sides of the four frame edges, for recursively reflecting light. Two optical units irradiate light toward the reflecting members and receive the reflected light. By means of a mounting member, the frame can be detachably attached to a white board. The two optical units are located at both ends of one of the frame edges, and the two optical units and the frame body are integrated with each other.
U.S. Pat. No. 6,828,959 to Takekawa also discloses a coordinate-position input device having a frame comprising a plurality of frame edges having a nested, telescoping arrangement. The frame edges together with retractable reflecting members are accommodated in frame-end sections. Since the frame edges are extendable, the size of the coordinate-position input device can be adjusted according to the size of a white board or a display unit used with the device. Mounting members are provided on each of the frame-end sections that are used to mount the device to the white board or the display unit. An optical unit can be removably attached to each frame-end section, and the irradiating direction of the optical unit is adjustable.
Although adjustable coordinate-position input devices are known, improvements are desired. It is therefore an object of the present invention to provide at least a novel interactive input system and a novel arm assembly therefor.
Accordingly, in one aspect there is provided an interactive input system comprising a display unit having a display surface; a bezel disposed around at least a portion of the periphery of a region of interest proximate said display surface and having an inwardly facing surface; and an elongate arm assembly mounted to the display unit, said arm assembly supporting imaging devices thereon and being longitudinally extendable to position the imaging devices at spaced locations relative to the display surface such that the fields of view of the imaging devices encompass the region of interest.
In one embodiment, the arm assembly comprises a body configured to be mounted to the display unit and at least one moveable arm received by the body, the arm being longitudinally slidable relative to the body. In one form, the arm assembly comprises two moveable arms received by the body, the arms being longitudinally slidable relative to the body in opposite directions, each of the arms supporting a respective one of the imaging devices. In another form, the arm assembly comprises one moveable arm received by the body and one fixed arm extending from the body in a direction opposite to the direction of sliding movement of the moveable arm.
In one embodiment, each of the imaging devices is accommodated within a housing adjacent a distal end of the respective arm. Each housing comprises an aperture through which the imaging device looks.
In one embodiment, the interactive input system further comprises a controller unit mounted on the arm assembly. The controller unit is mounted either within the interior of the at least one moveable arm, within the body or on the body.
In another aspect, there is provided an arm assembly configured to be mounted to a display unit, said arm assembly supporting imaging devices thereon and being longitudinally extendable to position the imaging devices at spaced locations relative to a display surface of said display unit such that the fields of view of the imaging devices look generally across said display surface.
In still another aspect, there is provided a kit for an interactive input system comprising a plurality of bezel segments configurable to form a reflective bezel for surrounding at least a portion of the periphery of a region of interest adjacent a display surface of a display unit; and an elongate arm assembly configured to be mounted to the display unit, said arm assembly supporting imaging devices thereon and being longitudinally extendable to position the imaging devices at spaced locations relative to the display surface such that the fields of view of the imaging devices encompass the region of interest.
Embodiments will now be described more fully with reference to the accompanying drawings in which:
FIGS. 2a and 2b are front and side elevational views, respectively, of the interactive input system of
FIGS. 6a and 6b are perspective views of an arm assembly forming part of the interactive input system of
a is a perspective view of an alignment pin forming part of the interactive input system of
b is a perspective view of an alignment jig for use with the alignment pin of
FIGS. 13a and 13b are perspective and cross-sectional views, respectively, of another embodiment of a bezel forming part of the interactive input system of
FIGS. 17a and 17b are perspective views of a bezel forming part of the interactive input system of
The following is directed to an interactive input system comprising an arm assembly having one or two moveable arms on which imaging devices are mounted. The arm assembly is generally lightweight, and is configured to be fastened or otherwise secured to a display unit, such as for example a plasma display panel, a liquid crystal display (LCD) panel etc., that has a display surface above which a region of interest is generally defined. The region of interest is surrounded by a reflective or retro-reflective bezel. The moveable arm or arms enable the imaging devices to be positioned relative to the edges of the display panel so that at least the entirety of the region of interest is within the fields of view of the imaging devices. This adjustability allows the arm assembly to be used with display panels of more than one size. The bezel may be segmented, and the segments may be cut to size so as to fit the periphery of the display panel. The subject interactive input system is a low cost, adjustable alternative to prior art interactive input systems.
Turning now to
An adjustable arm assembly 40 is mounted to the bottom of display unit 22. Arm assembly 40 comprises two longitudinally extendable arms 44a and 44b extending from opposite ends of a body 45. Imaging devices 46a and 46b are mounted on arms 44a and 44b, respectively. Arm assembly 40 is adjustable so as to allow the imaging devices 46a and 46b to be positioned so that the field of view of each imaging device looks generally across the display surface 24 and views the inwardly facing surfaces of the bezel segments 26, 28 and 30. In this manner, pointers brought into a region of interest in proximity with the display surface 24 are seen by the imaging devices 46a and 46b, as will be described.
Arm assembly 40 also comprises a master controller 48 accommodated by the body 45 that communicates with the imaging devices 46a and 46b and with a general purpose computing device 50 and a display controller 52. Display controller 52 is in communication with the display unit 22 and communicates display output thereto. The general purpose computing device 50 executes one or more application programs and uses pointer location information communicated from the master controller 48 to generate and update the display output that is provided to the display controller 52 for output to the display unit 22, so that the image presented on the display surface 24 reflects pointer activity proximate the display surface 24. In this manner, pointer activity proximate the display surface 24 can be recorded as writing or drawing or used to control execution of one or more application programs running on the general purpose computing device 50. The display controller 52 also modifies the display output provided to the display unit 22 when a pointer ambiguity condition is detected to allow the pointer ambiguity condition to be resolved thereby to improve pointer verification, localization and tracking.
Referring to
The general purpose computing device 50 in this embodiment is a personal computer or the like comprising, for example, a processing unit, system memory (volatile and/or non-volatile memory), other non-removable or removable memory (e.g. a hard disk drive, RAM, ROM, EEPROM, CD-ROM, DVD, flash memory, etc.) and a system bus coupling the various computing device components to the processing unit. The general purpose computing device 50 may also comprise a network connection to access shared or remote drives, one or more networked computers, or other networked devices. The processing unit runs a host software application/operating system which, during execution, provides a graphical user interface that is presented on the display surface 24 such that freeform or handwritten ink objects and other objects can be input and manipulated via pointer interaction with the region of interest in proximity with display surface 24.
Turning now to
The body 45 has strips of fastener material (not shown) disposed on its upper surface. The strips of fastener material cooperate with corresponding strips of fastener material (not shown) disposed on the underside of the display unit 22 thereby to secure the arm assembly 40 to the display unit 22. In this embodiment, the fastener material on the body 45 and the corresponding fastener material on the display unit 22 are of the 3M Dual Lock™ type. Each arm 44a, 44b also has one or more strips of fastener material thereon (not shown). The strips of fastener material on the arms 44a and 44b cooperate with strips of fastener material (not shown) on the underside of display unit 22 once the arms have been extended and properly positioned relative to display surface 24, as will be further described below. In this embodiment, the fastener material on the arms 44a and 44b and the corresponding fastener material on the underside of display unit 22 are also of the 3M Dual Lock™ type.
In this embodiment, image sensor 54 has a field of view that is slightly greater than 90 degrees, and is oriented such that the boundaries of its field of view in the vertical plane (i.e. the plane parallel to display surface 24) are generally aligned with the horizontal and vertical edges of display surface 24. Accordingly, to properly position the imaging devices 46a and 46b on the moveable arms 44a and 44b relative to display surface 24 so as to enable the entirety of the display surface 24 and the surrounding bezel to be within the fields of view of the imaging devices, the lens 56 of each image sensor 54 should be vertically aligned with the reflective surfaces on the bezel segments. The imaging devices 46a and 46b should also be aligned with respect to the normal direction of the display surface 24 such that both the bezel and the display surface 24 are within the fields of view of the image sensors 54. This may be achieved, for example, by repositioning arm assembly 40 relative to the display unit 22, as necessary.
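As a purely illustrative check of this positioning requirement, the sketch below verifies that an imaging device mounted at one corner of a rectangular region of interest, with its field of view slightly greater than 90 degrees and its boundaries aligned with the display edges, sees all four corners of the region; the coordinates, the 45 degree boresight and the function name are assumptions of the sketch.

```python
import math

def corners_within_fov(cam_xy, fov_deg, boresight_deg, roi_w, roi_h):
    """Return True if every corner of a roi_w x roi_h rectangle (with its
    origin at the camera's corner) lies inside the camera's field of view."""
    half = math.radians(fov_deg) / 2.0
    bore = math.radians(boresight_deg)
    for cx, cy in [(0, 0), (roi_w, 0), (roi_w, roi_h), (0, roi_h)]:
        dx, dy = cx - cam_xy[0], cy - cam_xy[1]
        if dx == 0 and dy == 0:
            continue  # skip the corner the camera itself occupies
        bearing = math.atan2(dy, dx)
        # smallest signed angular difference between bearing and boresight
        diff = math.atan2(math.sin(bearing - bore), math.cos(bearing - bore))
        if abs(diff) > half:
            return False
    return True

# A camera at the bottom-left corner with a 92 degree field of view whose
# boresight bisects the corner, looking across a 16:9 region of interest.
print(corners_within_fov((0.0, 0.0), 92.0, 45.0, 1.6, 0.9))  # True
```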
In operation, the DSP 62 of each imaging device 46a and 46b generates clock signals so that the image sensor 54 of each imaging device captures image frames at the desired frame rate. The clock signals provided to the image sensors 54 are synchronized such that the image sensors of the imaging devices 46a and 46b capture image frames substantially simultaneously. The DSP 62 of each imaging device also signals the current control module 67a. In response, each current control module 67a connects its associated IR light source 67b to the power supply 68, thereby illuminating the IR light source and providing IR backlighting over the display surface 24. When no pointer is in proximity with the display surface 24, image frames captured by the image sensors 54 comprise a substantially uninterrupted bright band as a result of the infrared backlighting reflected by the retro-reflective surfaces 34 of the bezel segments 26, 28 and 30. However, when one or more pointers are brought into proximity with the display surface 24, each pointer occludes the IR backlighting reflected by the bezel segments and appears in captured image frames as a dark region interrupting the bright band.
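Under assumptions not found in the text above (a simple fixed threshold and hypothetical function names), the following sketch illustrates how such a dark region could be located in a one-dimensional intensity profile taken along the bright band.

```python
import numpy as np

def find_dark_regions(band_profile: np.ndarray, dark_threshold: float):
    """Return (left_edge, right_edge) column pairs for each contiguous run
    of the band profile that falls below dark_threshold."""
    dark = band_profile < dark_threshold
    regions, start = [], None
    for col, is_dark in enumerate(dark):
        if is_dark and start is None:
            start = col
        elif not is_dark and start is not None:
            regions.append((start, col - 1))
            start = None
    if start is not None:
        regions.append((start, len(dark) - 1))
    return regions

# A bright band of level ~200 with one pointer occluding columns 300-320.
profile = np.full(640, 200.0)
profile[300:321] = 30.0
print(find_dark_regions(profile, dark_threshold=100.0))  # [(300, 320)]
```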
Each image frame output by the image sensor 54 of each imaging device 46a and 46b is conveyed to its associated DSP 62. When a DSP 62 receives an image frame, the DSP 62 processes the image frame to detect the existence of one or more pointers. If one or more pointers exist in the image frame, the DSP 62 creates an observation for each pointer in the image frame. Each observation is defined by the area formed between two straight lines, one line of which extends from the focal point of the imaging device and crosses the right edge of the dark region representing the pointer and the other line of which extends from the focal point of the imaging device and crosses the left edge of the dark region representing the pointer. The DSP 62 then conveys the observation(s) to the master controller 48 via the serial line driver 76 and the communication lines 74a and 74b.
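The sketch below illustrates one way such an observation could be represented, assuming a simple pinhole-camera model; the calibration parameters, boresight angle and function names are hypothetical and are not taken from the description above.

```python
import math

def edge_to_bearing(col, principal_col, focal_px, boresight_rad):
    """Bearing (radians, in display coordinates) of the ray extending from
    the focal point of the imaging device through the given image column."""
    return boresight_rad + math.atan2(col - principal_col, focal_px)

def make_observation(left_col, right_col, principal_col, focal_px, boresight_rad):
    """An observation: the angular sector bounded by the rays crossing the
    left and right edges of the dark region seen by one imaging device."""
    return (edge_to_bearing(left_col, principal_col, focal_px, boresight_rad),
            edge_to_bearing(right_col, principal_col, focal_px, boresight_rad))

# Example: a dark region spanning columns 300-320 of a 640-column sensor,
# with an assumed 320-pixel focal length and a 45 degree boresight.
obs = make_observation(300, 320, principal_col=319.5, focal_px=320.0,
                       boresight_rad=math.radians(45.0))
print([round(math.degrees(a), 2) for a in obs])
```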
The master controller 48, in response to observations received from the imaging devices 46a and 46b, examines the observations to determine which observations from imaging devices 46a and 46b overlap. When both imaging devices 46a and 46b see the same pointer, resulting in observations that overlap, the center of the resultant bounding box, which is delineated by the intersecting lines of the overlapping observations, and hence the position of the pointer in (x,y) coordinates relative to the display surface 24, is calculated using well-known triangulation, as described in above-incorporated U.S. Pat. No. 6,803,906 to Morrison et al.
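For illustration, the simplified sketch below intersects one ray from each imaging device to recover an (x,y) position in the plane of the display surface; the formulation, camera positions and bearings are assumptions of this sketch and are not details taken from the above-incorporated patent.

```python
import math

def triangulate(cam_a_xy, bearing_a, cam_b_xy, bearing_b):
    """Intersect two rays, each defined by a camera position and a bearing
    angle (radians) in the plane of the display surface; returns (x, y)."""
    ax, ay = cam_a_xy
    bx, by = cam_b_xy
    dax, day = math.cos(bearing_a), math.sin(bearing_a)
    dbx, dby = math.cos(bearing_b), math.sin(bearing_b)
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        return None  # rays are (nearly) parallel; no reliable intersection
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Cameras at the bottom-left and bottom-right corners of a 1.6 x 0.9
# region of interest, both observing a pointer at (0.8, 0.45).
cam_a, cam_b, pointer = (0.0, 0.0), (1.6, 0.0), (0.8, 0.45)
bearing_a = math.atan2(pointer[1] - cam_a[1], pointer[0] - cam_a[0])
bearing_b = math.atan2(pointer[1] - cam_b[1], pointer[0] - cam_b[0])
print(triangulate(cam_a, bearing_a, cam_b, bearing_b))  # approx (0.8, 0.45)
```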
The master controller 48 then examines the triangulation results to determine if one or more pointer ambiguity conditions exist. If no pointer ambiguity condition exists, the master controller 48 outputs each calculated pointer position to the general purpose computing device 50. The general purpose computing device 50 in turn processes each received pointer position and updates the display output provided to the display controller 52, if required. The display output generated by the general purpose computing device 50 in this case passes through the display controller 52 unmodified and is received by the display unit 22. The display unit 22 in turn presents an image reflecting pointer activity. In this manner, pointer interaction with display surface 24 can be recorded as writing or drawing or used to control execution of one or more application programs running on the general purpose computing device 50.
If one or more pointer ambiguity conditions exist, the master controller 48 conditions the display controller 52 to dynamically manipulate the display output of the general purpose computing device 50 in a manner to allow each pointer ambiguity condition to be resolved as described in International PCT Application No. PCT/CA2010/000190, assigned to SMART Technologies ULC of Calgary, Alberta, Canada, assignee of the subject application, the content of which is incorporated herein by reference in its entirety. Once resolved, the master controller 48 outputs each calculated pointer position to the general purpose computing device 50. The general purpose computing device 50 in turn processes each received pointer position and updates the display output provided to the display controller 52, if required. The display output generated by the general purpose computing device 50 again passes through the display controller 52 unmodified and is received by the display unit 22 and displayed on the display surface 24.
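Although the description above does not define the pointer ambiguity conditions, one well-known ambiguity in two-camera systems arises when two pointers are present: pairing the two observations from each imaging device yields four candidate intersections, only two of which correspond to real pointers. The sketch below, which is illustrative only and is not drawn from the incorporated PCT application, enumerates these candidates.

```python
import itertools
import math

def intersect(cam_a, bearing_a, cam_b, bearing_b):
    """Intersect two rays defined by camera positions and bearing angles."""
    ax, ay = cam_a
    bx, by = cam_b
    dax, day = math.cos(bearing_a), math.sin(bearing_a)
    dbx, dby = math.cos(bearing_b), math.sin(bearing_b)
    denom = dax * dby - day * dbx
    if abs(denom) < 1e-9:
        return None
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (round(ax + t * dax, 3), round(ay + t * day, 3))

cam_a, cam_b = (0.0, 0.0), (1.6, 0.0)
pointers = [(0.4, 0.6), (1.2, 0.3)]  # two real pointer positions
bearings_a = [math.atan2(y - cam_a[1], x - cam_a[0]) for x, y in pointers]
bearings_b = [math.atan2(y - cam_b[1], x - cam_b[0]) for x, y in pointers]

# All pairings of one observation per camera: four candidates, of which
# only (0.4, 0.6) and (1.2, 0.3) are real; the other two are "ghosts".
for ba, bb in itertools.product(bearings_a, bearings_b):
    print(intersect(cam_a, ba, cam_b, bb))
```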
a shows an alternative embodiment of fasteners, each generally indicated by reference numeral 186, for mounting the arms 44a and 44b to the underside of display unit 22. The fasteners 186 are positioned at longitudinally spaced locations and are secured to the underside of the display unit 22. Each fastener 186 comprises a strip of fastening material 187 and an alignment pin 188 protruding from the surface of fastening material 187. Each fastener 186 is configured to be affixed to the underside of the display unit 22. In this embodiment, the fastening material is of the 3M Dual Lock™ type. Pin 188 is sized to be received in a corresponding aperture (not shown) formed in the upper surface of its respective arm. Fastening material 187 engages a corresponding strip of fastening material (not shown) disposed on the upper surface of the respective arm surrounding the aperture. Fasteners 186 may be applied in the correct positions to the underside of display unit 22 using an alignment jig 189, as shown in
FIGS. 13a and 13b show an alternative embodiment of a bezel for use with the interactive input system 20. In this embodiment, the bezel comprises a plurality of nested bezel segments 292 to 294. The nested bezel segments are slidably moveable relative to each other to provide an adjustable bezel having dimensions corresponding to the periphery of the display surface 24. In this embodiment, the plurality of nested bezel segments comprises corner segments 292, center segments 293, and end segments 294, which are nested within each other and are extendable and retractable relative to each other, as shown. As will be appreciated, the adjustability of the bezel formed from nested bezel segments 292 to 294 allows the bezel to be fitted to display units 22 of more than one size.
Although the embodiments described above are directed to an interactive input system comprising two imaging devices, the interactive input system may comprise additional imaging devices.
Although the bezels described above are formed of bezel segments that are generally linear, the bezel segments may alternatively be curved to improve imaging of the retro-reflective surface of the bezel. An interactive input system comprising curved bezel segments is shown in
Although the arm assembly described above comprises two longitudinally extendable arms, the arm assembly may alternatively comprise only one arm that is moveable. For example,
In the embodiments described above the imaging devices are in communication with the master controller through cables. The cables may be embodied in a serial bus, a parallel bus, a universal serial bus (USB), an Ethernet connection or other suitable wired connection. Alternatively, the imaging devices may communicate with the master controller by means of a wireless connection using a suitable wireless protocol such as for example Bluetooth, WiFi, ZigBee, ANT, IEEE 802.15.4, Z-Wave etc. Similarly, the master controller may communicate with the display controller and/or the general purpose computing device over one of a variety of wired connections such as for example, a universal serial bus, a parallel bus, an RS-232 connection, an Ethernet connection etc., or over a wireless connection.
Although in embodiments described above the controller unit is positioned within the interior of one of the arms of the arm assembly, the controller unit is not limited to this position and in other embodiments may alternatively be positioned anywhere in the interactive input system, including being mounted on the outside of the body of the arm assembly, or mounted within the interior of the body of the arm assembly. In any of these arrangements, the controller unit is positioned so as not to impede the movement of the arms relative to the body.
Although embodiments described above comprise a display surface having a periphery on which a reflective or retro-reflective bezel is disposed, such a bezel need not be employed. Alternatively, a series of light emitting diodes (LEDs) or other light sources may be disposed along the periphery of the display surface and optionally positioned behind a diffuser to illuminate the region of interest over the display surface and provide IR lighting to the imaging devices. In this case, the imaging devices do not require the IR light sources. Alternatively, the LEDs could be configured to emit light that reflects off of a diffuse reflector, as disclosed in U.S. Pat. No. 7,538,759 to Newton and assigned to Next Holdings. Alternatively, the display surface could comprise a bezel that is illuminated using optical fibers or other forms of waveguide, as disclosed in U.S. Pat. No. 7,333,095 to Lieberman et al. assigned to Lumio. Such a powered bezel could be powered through a power connection to the arm assembly, a battery, a solar power source, or any other suitable power source.
Although embodiments described above comprise imaging devices that are fixedly mounted within the housings such that they have a fixed viewing angle relative to the arms, the imaging devices need not be fixedly mounted and alternatively may be pivotably mounted within the housings.
Although in the embodiments described above the fastener material is of the 3M™ Dual Lock™ type, those of skill in the art will appreciate that alternative fastener material known in the art, such as, but not limited to, Velcro™ may be used. Of course, rather than using fastener material, those of skill in the art will appreciate that other fasteners known in the art, such as, but not limited to, screws, straps, and the like may be used.
Although embodiments have been described, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 61/218,028 to Wiebe et al., filed on Jun. 17, 2009, the content of which is incorporated herein by reference in its entirety.