The subject application relates generally to a method, apparatus and interactive input system.
Interactive input systems that allow users to inject input such as for example digital ink, mouse events etc. into an application program using an active pointer (e.g., a pointer that emits light, sound or other signal), a passive pointer (e.g., a finger, cylinder or other object) or other suitable input device such as for example, a mouse or trackball, are well known. These interactive input systems include but are not limited to: touch systems comprising touch panels employing analog resistive or machine vision technology to register pointer input such as those disclosed in U.S. Pat. Nos. 5,448,263; 6,141,000; 6,337,681; 6,747,636; 6,803,906; 7,232,986; 7,236,162; and 7,274,356 and in U.S. Patent Application Publication No. 2004/0179001, all assigned to SMART Technologies ULC of Calgary, Alberta, Canada, assignee of the subject application, the entire disclosures of which are incorporated herein by reference; touch systems comprising touch panels employing electromagnetic, capacitive, acoustic or other technologies to register pointer input; tablet and laptop personal computers (PCs); personal digital assistants (PDAs) and other handheld devices; and other similar devices.
In educational environments, lessons have been conducted in which students draw physical devices such as laptop computers, calculators, televisions etc. on paper. During such lessons, students are asked to design a physical device on paper, giving the students the freedom to bend the rules of conventional design and to focus the design on personal choice and creativity rather than on practical layout and functional placement. While these lessons promote creative thinking, enhancing the impact of these lessons is desired.
It is therefore an object to provide a novel method, apparatus and interactive input system.
Accordingly, in one aspect there is provided a method comprising mapping elements of an image on a digitizer surface to functions of a widget executed on a computing device that communicates with said digitizer; and responsive to user interaction with the elements, executing the widget functions mapped to the elements.
In one embodiment, the method further comprises displaying the result of the executed widget functions on the digitizer surface. The mapping comprises associating the elements with corresponding widget functions and tracing the elements on the digitizer surface. The associating comprises, for each element, selecting a graphical object displayed on the digitizer surface associated with the corresponding widget function prior to the tracing.
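The associate-then-trace mapping described above can be sketched as follows. This is an illustrative sketch only: the class and method names (`ElementMap`, `Region`, `associate`, `dispatch`) are assumptions for illustration and not part of the described system. Each traced element is stored as a region on the digitizer surface paired with the widget function selected for it, and subsequent touch points are hit-tested against the stored regions.

```python
# Illustrative sketch of mapping traced image elements to widget functions.
# All names here are hypothetical; regions are simplified to axis-aligned
# rectangles for clarity.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass(frozen=True)
class Region:
    """Axis-aligned rectangle traced on the digitizer surface."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


class ElementMap:
    def __init__(self) -> None:
        self._mapping: Dict[Region, Callable[[], None]] = {}

    def associate(self, traced: Region, function: Callable[[], None]) -> None:
        # Called after the user selects the graphical object for a widget
        # function and then traces the corresponding element on the surface.
        self._mapping[traced] = function

    def dispatch(self, px: float, py: float) -> bool:
        # On user interaction, execute the widget function mapped to the
        # element containing the touched coordinates, if any.
        for region, function in self._mapping.items():
            if region.contains(px, py):
                function()
                return True
        return False
```

In use, touching within a traced region would invoke the associated function, e.g. `elements.associate(Region(10, 10, 30, 30), lambda: digits.append("7"))`.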
The method may further comprise, prior to the mapping, placing the image on the digitizer surface within a designated region. The designated region may be a specified area within a window displayed on the digitizer surface. The image may be one of a hand-drawn image on a substrate, a picture, photograph or other illustration on a substrate or a digital image.
According to another aspect there is provided an apparatus comprising memory; one or more processors communicating with said memory, said one or more processors executing program instructions stored in said memory to cause said apparatus at least to: map elements of an image on a digitizer surface to functions of a widget executed on a computing device that communicates with said digitizer; and responsive to user interaction with the elements, execute the widget functions mapped to the elements.
According to another aspect there is provided a non-transitory computer readable medium embodying executable program code, said program code when executed by one or more processors, causing an apparatus to carry out a method comprising mapping elements of an image on a digitizer surface to functions of a widget executed on a computing device that communicates with said digitizer; and responsive to user interaction with the elements, executing the widget functions mapped to the elements.
According to another aspect there is provided an interactive input system comprising a digitizer having an interactive surface on which a computer-generated image is presented; and processing structure communicating with said digitizer, said processing structure executing an application program that causes said processing structure to: map elements presented on said interactive surface to corresponding functions of said application program; execute, in response to user interaction with the elements, the corresponding application functions; and update the computer-generated image presented on the interactive surface in accordance with the executed application functions.
Embodiments will now be described more fully with reference to the accompanying drawings in which:
In the following, a method, apparatus, non-transitory computer-readable medium and interactive input system are described wherein the method comprises mapping elements in an image on a digitizer surface to functions of a widget executed on a computing device that communicates with said digitizer; and responsive to user interaction with the elements, executing the widget functions mapped to the elements.
Turning now to
The interactive board 22 employs machine vision to detect one or more pointers brought into a region of interest in proximity with the interactive surface 24. The interactive board 22 communicates with a general purpose computing device 28 executing one or more application programs via a universal serial bus (USB) cable 30 or other suitable wired or wireless communication link. General purpose computing device 28 processes the output of the interactive board 22 and adjusts image data that is output to the projector 34, if required, so that the image presented on the interactive surface 24 reflects pointer activity. In this manner, the interactive board 22, general purpose computing device 28 and projector 34 allow pointer activity proximate to the interactive surface 24 to be recorded as writing or drawing or used to control execution of one or more application programs executed by the general purpose computing device 28.
The bezel 26 is mechanically fastened to the interactive surface 24 and comprises four bezel segments that extend along the edges of the interactive surface 24. In this embodiment, the inwardly facing surface of each bezel segment comprises a single, longitudinally extending strip or band of retro-reflective material. To take best advantage of the properties of the retro-reflective material, the bezel segments are oriented so that their inwardly facing surfaces lie in a plane generally normal to the plane of the interactive surface 24.
A tool tray 36 is affixed to the interactive board 22 adjacent the bottom bezel segment using suitable fasteners such as for example, screws, clips, adhesive etc. As can be seen, the tool tray 36 comprises a housing having an upper surface configured to define a plurality of receptacles or slots. The receptacles are sized to receive one or more pen tools 38 as well as an eraser tool that can be used to interact with the interactive surface 24. Control buttons are also provided on the upper surface of the tool tray housing to enable a user to control operation of the interactive input system 20. Further specifics of the tool tray 36 are described in International PCT Application Publication No. WO 2011/085486 filed on Jan. 13, 2011, and entitled “INTERACTIVE INPUT SYSTEM AND TOOL TRAY THEREFOR”.
Imaging assemblies (not shown) are accommodated by the bezel 26, with each imaging assembly being positioned adjacent a different corner of the bezel. Each of the imaging assemblies comprises an image sensor and associated lens assembly that provides the image sensor with a field of view sufficiently large as to encompass the entire interactive surface 24. A digital signal processor (DSP) or other suitable processing device sends clock signals to the image sensor causing the image sensor to capture image frames at the desired frame rate. During image frame capture, the DSP also causes an infrared (IR) light source to illuminate and flood the region of interest over the interactive surface 24 with IR illumination. Thus, when no pointer exists within the field of view of the image sensor, the image sensor sees the illumination reflected by the retro-reflective bands on the bezel segments and captures image frames comprising a continuous bright band. When a pointer exists within the field of view of the image sensor, the pointer occludes IR illumination and appears as a dark region interrupting the bright band in captured image frames.
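The occlusion principle described above can be illustrated with a short sketch. Assuming the retro-reflective bezel appears as a bright band in each captured image frame, a pointer registers as a sufficiently wide run of dark pixels interrupting that band; the function name, threshold and minimum-width values below are illustrative assumptions, not parameters of the described system.

```python
# Hypothetical sketch: locate a pointer as a dark run interrupting the
# bright retro-reflective band in a 1-D intensity profile taken from an
# image frame row. Threshold and minimum run width are illustrative.

from typing import List, Optional, Tuple


def find_dark_region(profile: List[int], threshold: int = 80,
                     min_width: int = 3) -> Optional[Tuple[int, int]]:
    """Return (start, end) of the first dark run wide enough to be a pointer."""
    start = None
    for i, value in enumerate(profile):
        if value < threshold:
            if start is None:
                start = i  # dark run begins
        else:
            if start is not None and i - start >= min_width:
                return (start, i)  # dark run wide enough: pointer candidate
            start = None
    if start is not None and len(profile) - start >= min_width:
        return (start, len(profile))
    return None
```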
The imaging assemblies are oriented so that their fields of view overlap and look generally across the entire interactive surface 24. In this manner, any pointer such as for example a user's finger, a cylinder or other suitable object, a pen tool 38 or an eraser tool lifted from a receptacle of the tool tray 36, that is brought into proximity of the interactive surface 24 appears in the fields of view of the imaging assemblies and thus, is captured in image frames acquired by multiple imaging assemblies. When the imaging assemblies acquire image frames in which a pointer exists, the imaging assemblies convey pointer data to the general purpose computing device 28.
The general purpose computing device 28 in this embodiment is a personal computer or other suitable processing device comprising, for example, a processing unit comprising one or more processors, system memory (volatile and/or non-volatile memory), other non-removable or removable memory (e.g., a hard disk drive, RAM, ROM, EEPROM, CD-ROM, DVD, flash memory, etc.) and a system bus coupling the various computing device components to the processing unit. The general purpose computing device 28 may also comprise networking capabilities using Ethernet, WiFi, and/or other suitable network format, to enable connection to shared or remote drives, one or more networked computers, or other networked devices.
The general purpose computing device 28 processes pointer data received from the imaging assemblies to resolve pointer ambiguity by combining the pointer data generated by the imaging assemblies, and to compute the locations of pointers proximate the interactive surface 24 using well known triangulation. The computed pointer locations are then recorded as writing or drawing or used as input commands to control execution of an application program as described above.
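The well-known triangulation referred to above can be sketched as the intersection of two sight lines. The sketch assumes two imaging assemblies at opposite corners of the surface separated by a baseline of width `w`, each reporting the angle between that baseline and its sight line to the pointer; these assumptions are illustrative and do not reflect the particular imaging-assembly geometry of the described system.

```python
# Minimal two-camera triangulation sketch. Camera 1 sits at the origin and
# camera 2 at (w, 0); each reports the angle (radians) between the baseline
# and its sight line to the pointer. The pointer lies at the intersection
# of the two sight lines:
#   y = x * tan(theta1)          (camera 1)
#   y = (w - x) * tan(theta2)    (camera 2)

import math
from typing import Tuple


def triangulate(theta1: float, theta2: float, w: float) -> Tuple[float, float]:
    t1, t2 = math.tan(theta1), math.tan(theta2)
    x = w * t2 / (t1 + t2)  # solve x*t1 = (w - x)*t2 for x
    y = x * t1
    return x, y
```

For example, with both angles at 45 degrees and a baseline of 100 units, the pointer resolves to the centre point (50, 50).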
In this embodiment, one of the application programs executed by the general purpose computing device 28 is a widget that allows an image comprising functional elements to be placed on the interactive surface 24 and then pre-canned functions of the widget to be assigned to the functional elements in the image so that user interactions with the functional elements in the image invoke the assigned pre-canned functions. An exemplary widget in the form of a “My Paper CALCULATOR” application will now be described with particular reference to
Initially, when the “My Paper CALCULATOR” application is executed by the general purpose computing device 28, a window comprising a start screen 100 is presented on the interactive surface 24 as shown in
After the user has properly positioned the piece of paper 118 within the designated region 112 and has selected the “Next” button 116, a calculator configuration screen 130 is presented in the window as shown in
Once the calculator configuration procedure has been completed and the “Next” button 138 has been selected, a calculator initialization screen 140 is presented in the window as shown in
As will be appreciated, the “My Paper CALCULATOR” widget allows a hand-drawn calculator to be virtually augmented by placing the hand-drawn calculator 120 on the interactive surface 24. While a “My Paper CALCULATOR” widget has been described above, those of skill in the art will appreciate that other widgets may be employed, where it is desired to bring an image of a physical device to life. For example, the widget may alternatively comprise pre-canned functions associated with a television, radio, clock, telephone, game controller, boom box, thermostat etc. allowing images of these physical devices to be brought to life.
If desired, the selectable graphic objects 132 may include a body button corresponding to the body or perimeter of the hand-drawn calculator 120 or other widget. Also, if desired, the general purpose computing device 28 may store the relative positions of the selectable graphic objects 132 and may include a redraw button. This permits quick re-orientation or bringing of the hand-drawn calculator 120 “back to life” should the substrate separate from the interactive surface 24 and require reaffixing, or should the substrate be rotated or moved relative to the interactive surface 24. In this case, only a subset of the elements of the hand-drawn calculator 120 may need to be traced. For example, just tracing the screen 120a of the hand-drawn calculator 120 may be sufficient to re-orient the hand-drawn calculator.
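The re-orientation step described above can be sketched as follows. Assuming the general purpose computing device 28 stores each element's position relative to a reference element such as the screen 120a, re-tracing only that reference element yields a translation that can be applied to every other stored element; the function and parameter names are hypothetical, and the sketch handles translation only (a fuller version would also recover rotation from the traced outline).

```python
# Illustrative sketch: recompute stored element positions after the
# substrate is moved, from a single re-traced reference element.
# Names are hypothetical; translation only, for clarity.

from typing import Dict, Tuple

Point = Tuple[float, float]


def reorient(elements: Dict[str, Point], reference: str,
             new_reference_pos: Point) -> Dict[str, Point]:
    """Shift all element positions by the reference element's displacement."""
    ox, oy = elements[reference]
    dx = new_reference_pos[0] - ox
    dy = new_reference_pos[1] - oy
    return {name: (x + dx, y + dy) for name, (x, y) in elements.items()}
```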
In the embodiments described above, although a piece of paper on which a calculator is hand drawn is described as being placed on the interactive surface 24, those of skill in the art will appreciate that variations are available. For example, the image need not be hand drawn and the substrate need not be a piece of paper. The image may be in the form of a picture, photograph or other illustration that is printed, adhered, taped or otherwise applied or secured to the substrate and the substrate may be formed of any suitable material on which an image can be placed.
The application program may comprise program modules including routines, instruction sets, object components, data structures, and the like, and may be embodied as computer readable program code stored on a non-transitory computer readable medium. The non-transitory computer readable medium is any data storage device that can store data. Examples of non-transitory computer readable media include for example read-only memory, random-access memory, CD-ROMs, magnetic tape, USB keys, flash drives and optical data storage devices. The computer readable program code can also be distributed over a network including coupled computer systems so that the computer readable program code is stored and executed in a distributed fashion.
Although in embodiments described above, the digitizer is described as comprising machine vision to register pointer input, those skilled in the art will appreciate that digitizers employing other machine vision configurations, analog resistive, electromagnetic, capacitive, acoustic or other technologies to register input may be employed. The digitizer need not be mounted on a wall surface. The digitizer may be suspended or otherwise supported in an upright orientation or may be arranged to take on an angled or horizontal orientation.
In embodiments described above, a projector is employed to project the computer-generated image onto the interactive surface 24. Those of skill in the art will appreciate that alternatives are available. For example, the digitizer may comprise a display panel such as for example a liquid crystal display (LCD) panel, a plasma display panel etc. on which the computer-generated image is presented. In this case, using a transparent or translucent material for the substrate is preferred to ensure the image presented on the display panel is clearly visible through the substrate and not occluded thereby.
In other embodiments, no physical substrate is used. In these embodiments, rather than place a substrate having an image thereon on the interactive surface 24, a digital image is selected from a gallery of images and placed in the designated region 112. Once the digital image is placed in the designated region 112, the digital image can be “brought to life”, in the same manner as described above. An added advantage of using digital images is that once the elements of the digital image have been traced and assigned pre-canned widget functions, as the display of the digital image is under the control of general purpose computing device 28, rotation, scaling and movement of the digital image is possible without losing the functions mapped to the elements of the digital image.
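The reason the mappings survive manipulation of the digital image can be sketched directly: because the computing device controls the image's display, the same rotation, scale and translation it applies to the image can be applied to each mapped element's coordinates. The function and parameter names below are illustrative assumptions.

```python
# Illustrative sketch: apply the image's rotation/scale/translation to
# mapped element coordinates so element-to-function mappings are preserved.
# Names and the choice of a similarity transform are assumptions.

import math
from typing import List, Tuple

Point = Tuple[float, float]


def transform(points: List[Point], angle: float, scale: float,
              translate: Point, center: Point = (0.0, 0.0)) -> List[Point]:
    """Rotate and scale each point about `center`, then translate."""
    c, s = math.cos(angle), math.sin(angle)
    cx, cy = center
    tx, ty = translate
    out = []
    for x, y in points:
        rx, ry = x - cx, y - cy
        out.append((cx + scale * (c * rx - s * ry) + tx,
                    cy + scale * (s * rx + c * ry) + ty))
    return out
```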
Although embodiments have been described above with reference to the accompanying drawings, those of skill in the art will appreciate that variations and modifications may be made without departing from the scope thereof as defined by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 61/929,971 to Popovich, filed on Jan. 21, 2014, entitled “Method, Apparatus and Interactive Input System”, the entire disclosure of which is incorporated herein by reference.