The present invention relates generally to participant response systems and in particular to a data presentation method and a participant response system employing the same.
Participant response systems for enabling participants of an event to enter responses to posed questions, motions or the like are well known in the art and have wide applicability. For example, during a conference, seminar or the like, participants can be provided with handsets that enable the participants to respond to questions, or to vote on motions raised during the conference or seminar. In the entertainment field, audience members can be provided with handsets that enable the audience members to vote for entertainment programmes or sports events. These participant response systems are also applicable in the field of education. Participants can be provided with handsets that enable the participants to respond to questions posed during lessons, tests or quizzes. Of significant advantage, these participant response systems provide immediate feedback to presenters, teachers, entertainment programme producers, or event organizers. With respect to the field of education, research shows that teachers teach and participants learn more effectively when there is rapid feedback concerning the state of participants' comprehension or understanding. It is therefore not surprising that such participant response systems are gaining wide acceptance in the field of education.
Participant response systems fall generally into two categories, namely wired and wireless participant response systems. In wired participant response systems, participants respond to posed questions or vote on motions using remote units that are physically connected to a local area network and communicate with a base or host general purpose computing device. In wireless participant response systems, the remote units communicate with the base or host general purpose computing device wirelessly.
A number of different wired and wireless participant response systems have been considered. For example, U.S. Pat. No. 4,247,908 to Lockhart, Jr. et al. discloses a two-way communication system for use with a host general purpose computing device that includes a control unit, a base station and multiple, hand-held, portable radio/data terminal units. The control unit interfaces directly with the host general purpose computing device but uses a radio link to interface with the portable radio/data terminal units. Each portable radio/data terminal unit includes a two-way radio and a data terminal. The data terminal includes a keyboard for data entry and an LED display for readout of either received data or locally generated data. The host general purpose computing device initiates communication through polling and/or selection of portable radio/data terminal units via the control unit. The control unit, in response to a “poll” from the host general purpose computing device, responds by sending either a previously received message from a portable radio/data terminal unit, or if no message has been received, a “no message” response. Polling by the control unit is an invitation to the portable radio/data terminal units to send data to the control unit to be stored, grouped if necessary and sent on to the host general purpose computing device. The control unit polls the portable radio/data terminal units by address in a particular sequence. The control unit transmits acknowledgements to the portable radio/data terminal units for received data on the next polling cycle.
U.S. Pat. No. 5,002,491 to Abrahamson et al. discloses an interactive electronic classroom system for enabling teachers to teach participants concepts and to receive immediate feedback regarding how well the participants have learned the taught concepts. Structure is provided for enabling participants to proceed in lockstep or at their own pace through exercises and quizzes, responding electronically to questions asked, the teacher being able to receive the responses, and to interpret a readout, in histogram or other graphic display form, of participant responses. The electronic classroom comprises a central computer and a plurality of participant computers, which range from simple devices to full-fledged personal computers, connected to the central computer over a network. Optional peripheral hardware, such as video cassette recorders (VCRs) or other recording/reproducing devices, may be used to provide lessons to participants in association with the computer network.
U.S. Pat. No. 6,790,045 to Drimmer discloses a method and system for analyzing participant performance by classifying participant performance into discrete performance classifications associated with corresponding activities related to an electronic course. An observed participant performance level for at least one of the performance classifications is measured. A benchmark performance level or range is established for one or more of the performance classifications. It is then determined whether the observed participant performance level is compliant with the established benchmark performance level for the at least one performance classification. Instructive feedback is determined for the observed participant based upon any material deviation of the observed participant performance from at least one benchmark.
U.S. Patent Application Publication No. 2004/0072136 to Roschelle et al. discloses a method and system for assessing a participant's understanding of a process that may unfold over time and space. The system comprises thin client devices in the form of wireless, hand-held, palm-sized computers that communicate with a host workstation. The system provides a sophisticated approach of directing participants to perform self-explanation, and enables instructors to enhance the value of this pedagogical process by providing meaningful and rapid feedback in a classroom setting.
U.S. Patent Application Publication No. 2004/0072497 to Buehler et al. discloses a response system and method of retrieving user responses from a plurality of users. The response system comprises a plurality of base units and a plurality of response units. Each of the response units is adapted to receive a user input selection and to communicate that user's input selection with at least one base unit utilizing wireless communication. Personality data is provided for the response units to facilitate communication with a particular base unit. The personality data of a particular response unit is changed when it is desired to change the base unit to which that response unit communicates. This allows a response unit to become grouped with a particular base unit at a particular time and become grouped with another base unit at another particular time.
Although known participant response systems are capable of analyzing student responses, some known participant response systems may not be capable of visually representing analysis results according to spatial information of participants. For example, in the case of a school classroom, some known participant response systems may not be capable of visually representing analysis results as a function of student location in a classroom, in a school, or in other geographic areas such as a school district or a city. A participant response system capable of such visual representation may better enable the teacher to identify possible relationships between student academic performance and seat location, for example, or may better enable a school board administrator to identify possible relationships between student academic performance and school location or student demographics.
Additionally, conventional database systems typically require a defined data structure. If a user needs to expand a table in a database to include data of a type that has not been defined therein, the user needs to first modify the structure to include additional fields for accommodating the data, and then enter the data into the modified structure. As will be appreciated, a participant response system utilizing a more flexible database configuration is desirable.
It is therefore an object of the present invention to provide a novel data presentation method and a participant response system employing same.
Accordingly, in one aspect there is provided a method of presenting data gathered by a participant response system, comprising obtaining spatial information for participants; collecting response data from the participants and generating result data; overlaying the result data on a map image in accordance with the obtained spatial information to form a data map; and displaying the data map.
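By way of illustration only, the overlay step of the method recited above may be sketched as follows. The function and data shapes are hypothetical and do not form part of the claimed embodiment; they merely show how per-participant result data could be joined to spatial information to form a data map.

```python
# Minimal sketch, assuming spatial information is a mapping of
# participant identifiers to map coordinates and result data is a
# mapping of the same identifiers to computed values. All names
# here are illustrative, not the patented implementation.
def build_data_map(spatial_info, result_data):
    """Overlay per-participant result data onto map coordinates.

    spatial_info: {participant_id: (x, y)} seat or map coordinates
    result_data:  {participant_id: value} e.g. a score or statistic
    Returns a list of (x, y, value) overlay points for rendering.
    """
    data_map = []
    for pid, coords in spatial_info.items():
        if pid in result_data:  # only overlay participants with results
            data_map.append((*coords, result_data[pid]))
    return data_map
```

The returned overlay points would then be drawn over the imported map image to display the data map.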
In one embodiment, the spatial information comprises at least one of spatial coordinates and geographic coordinates of the participants. The spatial information may be stored as at least one tag forming part of a tag string created for each participant. In this case, the method may further comprise analyzing the tag strings, and associating tags comprising the spatial information with the map image. The method may also further comprise parsing the tag string of each participant into at least one tag, partitioning each tag into a keyword and a tag value, and/or creating the map image based on the tag values of tags comprising spatial information.
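The parsing and partitioning steps described above can be sketched as follows. The delimiter choices are an assumption for illustration: tags separated by semicolons, each tag partitioned at its first colon into a keyword and a tag value (e.g. "seat location: (2, 3)").

```python
def parse_tag_string(tag_string):
    """Parse a participant's tag string into {keyword: value} pairs.

    Assumes an illustrative format in which tags are separated by
    semicolons and each tag is partitioned at its first colon, e.g.
    "seat location: (2, 3); postal code: T2N 1N4".
    """
    tags = {}
    for tag in tag_string.split(";"):
        keyword, sep, value = tag.partition(":")
        if sep:  # keep only well-formed "keyword: value" tags
            tags[keyword.strip()] = value.strip()
    return tags
```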
In another embodiment, the method further comprises selecting data other than the result data to be displayed, overlaying the selected data on the map image in accordance with the spatial information to form a second data map, and displaying the second data map.
The response data may comprise answer data to at least one question. In this case, the method may further comprise, prior to the collecting, administering an assessment to the participants, the assessment comprising the at least one question, and analyzing the response data with regard to the answer to the at least one question to generate the result data.
According to another aspect, there is provided a participant response system comprising a display; a plurality of participant response devices, each participant response device configured to generate response data in response to user input; and processing structure communicating with the display and the participant response devices, said processing structure being configured to analyze response data received from participant response devices and generate result data, overlay the result data on a map image in accordance with spatial information to form a data map, and display the data map on the display.
According to yet another aspect, there is provided an apparatus comprising a display; and processing structure communicating with the display, the processing structure executing program code causing the apparatus to analyze response data received from participant response devices and generate result data; overlay the result data on a map image in accordance with spatial information to form a data map; and display the data map on the display.
According to still yet another aspect, there is provided a computer-readable medium having embodied thereon computer program code which, upon execution by processing structure, causes an apparatus to collect response data from participant response devices and generate result data; overlay said result data on a map image in accordance with spatial information to form a data map; and display said data map.
Embodiments will now be described more fully with reference to the accompanying drawings in which:
Turning now to
The participant response system firmware for communicating with remote units 26A in this embodiment is implemented on top of IEEE 802.15.4 media access control (MAC) protocol layer software provided by Texas Instruments (TI). The TI MAC protocol layer software comprises a small real-time kernel and a so-called Z-stack™ ZigBee-compliant protocol stack to provide simple real-time operating system (OS) facilities such as, for example, timer management, task management and interrupt management. Abstraction layers are used to separate the OS and the hardware drivers for ease of porting to a different OS and hardware platform.
As is best seen in
The IWB 18 employs machine vision to detect one or more pointers brought into a region of interest in proximity with the interactive surface 34. The IWB 18 communicates with the general purpose computing device 16, which executes one or more application programs, via the USB cable 20. General purpose computing device 16 processes the output of the IWB 18 and adjusts image data that is output to the projector 40, if required, so that the image presented on the interactive surface 34 reflects pointer activity. In this manner, the IWB 18, general purpose computing device 16 and projector 40 allow pointer activity proximate to the interactive surface 34 to be recorded as writing or drawing or used to control execution of one or more application programs executed by the general purpose computing device 16.
The bezel 36 in this embodiment is mechanically fastened to the interactive surface 34 and comprises four bezel segments that extend along the edges of the interactive surface 34. In this embodiment, the inwardly facing surface of each bezel segment comprises a single, longitudinally extending strip or band of retro-reflective material. To take best advantage of the properties of the retro-reflective material, the bezel segments are oriented so that their inwardly facing surfaces extend in a plane generally normal to the plane of the interactive surface 34.
A tool tray 42 is affixed to the IWB 18 adjacent the bottom bezel segment using suitable fasteners such as for example, screws, clips, adhesive etc. As can be seen, the tool tray 42 comprises a housing having an upper surface configured to define a plurality of receptacles or slots. The receptacles are sized to receive one or more pen tools 44 as well as an eraser tool (not shown) that can be used to interact with the interactive surface 34. Control buttons (not shown) are provided on the upper surface of the housing to enable a user to control operation of the IWB 18. Further details of the tool tray 42 are provided in U.S. patent application Ser. No. 12/709,424 to Bolt et al., filed on Feb. 19, 2010, and entitled “INTERACTIVE INPUT SYSTEM AND TOOL TRAY THEREFOR”, the content of which is herein incorporated by reference in its entirety.
Imaging assemblies (not shown) are accommodated by the bezel 36, with each imaging assembly being positioned adjacent a different corner of the bezel. Each of the imaging assemblies has an infrared light source and an imaging sensor having an associated field of view. The imaging assemblies are oriented so that their fields of view overlap and look generally across the entire interactive surface 34. In this manner, any pointer such as for example a user's finger, a cylinder or other suitable object, or a pen tool 44 or eraser tool lifted from a receptacle of the tool tray 42, that is brought into proximity of the interactive surface 34 appears in the fields of view of the imaging assemblies.
The general purpose computing device 16 in this embodiment is a personal computer or other suitable processing device or structure comprising, for example, a processing unit, system memory (volatile and/or non-volatile memory), other non-removable or removable memory (e.g., a hard disk drive, RAM, ROM, EEPROM, CD-ROM, DVD, flash memory, etc.) and a system bus coupling the various computer components to the processing unit. The general purpose computing device 16 may also comprise networking capability using Ethernet, WiFi, and/or other network format, for connection to access shared or remote drives, one or more networked computers, or other networked devices.
Turning now to
One of the remote units 26A is better illustrated in
The display 104 comprises an upper row of LCD icons 132 disposed above a character display area 134. The LCD icons 132 comprise a plurality of status indicators such as for example a question number icon 132A, a user status icon 132B, a network status icon 132C, a hands-up (?) icon 132D, a battery status icon 132E and a transmission status icon 132F. The character display area 134 comprises a 128×48 pixel array that is divided into three lines. Each line can display a total of sixteen (16) characters.
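The geometry of the character display area described above (a 128×48 pixel array divided into three lines of sixteen characters each) implies simple line-wrapping behavior, sketched below. The function name and the truncation of overflow are assumptions for illustration; any paging of longer messages is not shown.

```python
LINE_CHARS = 16  # characters per line, per the embodiment above
NUM_LINES = 3    # lines in the 128x48 pixel character display area

def wrap_for_display(text):
    """Break a message into at most three 16-character display lines.

    A hypothetical helper; a longer message would presumably be paged
    rather than truncated, but paging is omitted from this sketch.
    """
    lines = [text[i:i + LINE_CHARS] for i in range(0, len(text), LINE_CHARS)]
    return lines[:NUM_LINES]
```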
The casing 100 of the remote unit 26A defines an enclosure in which electronics (not shown) are housed. In this embodiment, the electronics housed within the casing 100 comprise a microprocessor, an LCD control module, an omni-directional antenna and memory. Power is provided to the remote unit 26A by non-rechargeable and/or rechargeable batteries (not shown) also housed within the casing 100. The remote unit 26A is also configured to be powered by a standard 110V/220V power source via a power cord (not shown). When the batteries used are rechargeable batteries and the remote unit 26A is connected to the standard 110V/220V power source, the remote unit 26A charges the batteries while being able to simultaneously communicate with the general purpose computing device 16. When battery charging has been completed, an indication of such is provided to the user and the remote unit 26A is then able to be disconnected from the power cord and used in a wireless manner.
In the case of the remote units 26A, the client-side application 150 is implemented as firmware stored in the memory of each remote unit 26A, and is executed by the microprocessor when the remote unit 26A is booted up. The client-side application 150 receives via the omni-directional antenna 62 the questions sent by the host-side application 142, stores received questions in the memory, and displays them on the character display area 134 via the LCD control module. The client-side application 150 collects user input entered via the keypad 102, stores the user input in the memory, and when the Enter key 130 is pressed, transmits the user input to the host-side application 142. Further specifics of the remote unit 26A are disclosed in PCT Patent Application Publication No. WO/2008/083486 to Doerksen et al. entitled “Participant Response System Employing Battery Powered, Wireless Remote Units” filed on Jan. 10, 2008, and assigned to SMART Technologies ULC of Calgary, Alberta, Canada, assignee of the subject application, the content of which is incorporated herein by reference in its entirety.
In the case of the portable computing devices 26B, the client-side application 150 is implemented as a software application running on the portable computing devices 26B. In this embodiment, the software application running on the portable computing devices 26B is SMART Notebook™ Student Edition software, offered by SMART Technologies ULC. In this implementation, the client-side application 150 presents a graphical user interface (GUI) window shown in
Referring again to
The management module 146 also comprises a GUI in the form of a management module window that is presented on the display screen of the general purpose computing device 16 (and/or optionally the interactive surface) when the management module 146 is being employed. The management module 146 provides a variety of functions selectable by the facilitator for generally managing participants, groups, response devices, and assessments.
As described above, the host-side application 142 runs on the general purpose computing device 16 which, in this embodiment, uses a Microsoft® Windows® XP operating system. As shown in
Participant response system 10 further comprises a database stored by the general purpose computing device 16, which is shown in
The data categories 262, 264 and 266 further comprise spatial information 270 including spatial coordinates such as, for example, seat coordinates in a classroom, or for example geographic coordinates of a school on a map.
Some data within each of the data categories 262, 264 and 266 is organized using “tags” data fields. Each “tags” data field comprises a tag string in which information about each of the organizations, groups and participants is represented.
The host-side application 142 uses a data management process for generally managing data used by the participant response system 10. A flowchart showing the steps of the data management process is shown in
By selecting either an “import map” button (not shown) in the “Home” tab of the management module window 180, or an “import map” function (not shown) in a menu of the management module window 180, an “import map” dialogue box (not shown) is displayed for enabling a map image selected by the facilitator to be imported into the database 260 (step 326). The map image may be imported into the database 260 from any storage medium in communication with the database 260, such as for example a flash drive or a hard drive, or from a network page or a folder at a network location, or may be imported into the database 260 by copying the map image from an image viewing and/or image processing application and pasting the image into the database 260. Once the map image has been imported, the facilitator is then able to add one or more hot spots to the map image (step 328). Each of the hot spots is a point at which data is to be associated with the map image. For example, hot spots may be added to an image of a classroom to designate seat locations, and hot spots may for example be added to a city map image to designate postal code zones.
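A hot spot, as described above, is simply a named point on the imported map image at which data is later associated. The record structure below is a hypothetical sketch of step 328; the class and function names are illustrative only.

```python
# Hypothetical hot-spot record: a labelled point on an imported map
# image at which data is to be associated (e.g. a seat location or a
# postal code zone). Not the patented data structure.
class HotSpot:
    def __init__(self, label, x, y):
        self.label = label     # e.g. "(2, 3)" for a seat, or "T2N" for a zone
        self.x, self.y = x, y  # position on the map image, in pixels

def add_hot_spot(hot_spots, label, x, y):
    """Register a point on the map image at which data will be shown."""
    hot_spots[label] = HotSpot(label, x, y)
    return hot_spots
```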
In this embodiment, a plurality of map images may be imported and then linked together to form a combined map image of greater size and having a map structure over a range of scales for enabling the facilitator to view data at different zoom levels. For example, a teacher may view data in a city, and then “zoom-in” to view data within the school, and then “zoom-in” still further to view data within a classroom. Similarly, the teacher for example may also “zoom-out” to view data on larger scales. In this embodiment, the plurality of map images is linked together automatically by the management module 146. Following step 328, the process returns to step 322 to await input of another command.
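The linking of map images over a range of scales can be sketched as a simple parent/child structure, where "zoom-out" follows the parent link and "zoom-in" follows a child link. Everything below is an illustrative assumption; the automatic linking performed by the management module 146 is not specified here.

```python
# Sketch of map images linked over a range of scales. "Zoom-out"
# follows .parent; "zoom-in" follows an entry of .children.
class MapNode:
    def __init__(self, name, image=None):
        self.name = name    # e.g. "city", "school", "classroom 27"
        self.image = image  # the imported map image (omitted here)
        self.parent = None  # larger-scale map (zoom-out target)
        self.children = []  # smaller-scale maps (zoom-in targets)

    def link_child(self, child):
        child.parent = self
        self.children.append(child)
        return child

# Hypothetical three-level structure: city -> school -> classroom.
city = MapNode("city")
school = city.link_child(MapNode("school"))
classroom = school.link_child(MapNode("classroom 27"))
```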
By selecting the icon 228 on pop-up menu 224, the management module 146 is launched, enabling the facilitator to “set up” a group (step 330). Here, the facilitator may create a new group or edit an existing group, and may input or modify group information through interaction with the management module window 180. The group information may comprise, for example, a name of a class, a classroom number, names of students in the class, and a class schedule. Once a group has been set up, the facilitator may then add participants to the group (step 332). Here, the facilitator may also input or modify participant information, such as for example student ID, student name, and tag strings. Once all participant information has been entered, the management module 146 then analyzes the tag strings of participants of the group (step 334).
Tag values 290 of the keyword 288 are then associated with the relevant map image at corresponding hot spots (step 374). The management module 146 then checks to determine if all tag values 290 have been associated with the relevant map image (step 376). If yes, the process proceeds to step 382; otherwise, an orphan zone is created within the map image (step 378). Any tag value 290 that is not already associated with a corresponding hot spot of the map image is associated with the orphan zone (step 380). An empty legend zone is then created within the map image (step 382).
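Steps 374 through 380 above amount to matching tag values against the map image's hot spots and collecting unmatched values in an orphan zone. A minimal sketch, with illustrative names and a plain dict standing in for the map image's hot spots:

```python
def associate_tag_values(tag_values, hot_spots):
    """Associate tag values with matching hot spots (cf. step 374);
    values with no matching hot spot go to an orphan zone
    (cf. steps 378-380). All names here are illustrative only.

    tag_values: iterable of tag value strings
    hot_spots:  {tag_value: hot_spot} mapping for the map image
    """
    placed, orphan_zone = {}, []
    for value in tag_values:
        if value in hot_spots:
            placed[value] = hot_spots[value]
        else:
            orphan_zone.append(value)  # shown in the orphan zone
    return placed, orphan_zone
```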
Turning again to
Turning again to
A “show data” command may be entered at step 322 by selecting a “show data” button (not shown) presented by the management module 146, or a “show data” button (not shown) presented by the assessment tool 144. In this embodiment, the selected data comprises the response data generated during step 344. However, as will be understood, the selected data may be any data stored in the database 260 and selected by the facilitator for display. If the data selected for display is a statistical result that has not yet been calculated, the management module 146 calculates the statistical result and saves it in the database 260. Following step 346, the process returns to step 322 to await input of another command.
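The lazy calculation described above (compute a statistical result only if it has not yet been stored, then save it to the database) can be sketched as follows. The dict-backed "database" and the mean statistic are assumptions for illustration; database 260 and the actual statistics are not specified at this level.

```python
# Sketch of the lazy statistic described above: calculate a selected
# statistical result only if not yet stored, then save it. A plain
# dict stands in for database 260; the mean is an example statistic.
def get_statistic(database, key, response_values):
    if key not in database:  # not yet calculated
        database[key] = sum(response_values) / len(response_values)
    return database[key]     # subsequently served from the database
```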
If a “quit” command is received at step 322, the process 310 ends (step 348).
In this embodiment, map image 702 is a combined map image formed from a plurality of smaller map images that are linked together, and has a map structure over a range of scales for enabling the facilitator to view data 708 at different zoom levels. Here, the facilitator may “zoom-in” the map image 702 to generate a new data map showing data within a postal code zone, and may further “zoom-in” to generate a new data map showing data within a classroom, as shown in
Although in embodiments described above, tags are set up for each participant, in other embodiments, tags may alternatively be set up for each response device 26. For example,
In the embodiment shown in
In other embodiments, the management module may alternatively be configured to accept entry of a tag value without an associated keyword. During subsequent analyzing of tags, tag values not having an associated keyword are compared to a set of feasible tag values and feasible tag value formats, and an associated keyword is then determined for the tag value according to predefined rules. For example, instead of inputting a seat arrangement tag as “seat location: (2, 3)”, the facilitator may simply input a tag value of “(2, 3)”, and the associated keyword of “seat location” will be determined based on the tag value.
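The keyword-determination rules described above can be sketched as pattern matching against feasible tag value formats. The two rules below (coordinate pairs map to "seat location", Canadian-style postal codes map to "postal code") are hypothetical examples; the actual predefined rule set is not specified.

```python
import re

# Illustrative predefined rules mapping tag-value formats to keywords.
# These two patterns are assumptions, not the system's actual rules.
RULES = [
    (re.compile(r"^\(\s*\d+\s*,\s*\d+\s*\)$"), "seat location"),   # e.g. "(2, 3)"
    (re.compile(r"^[A-Z]\d[A-Z]\s?\d[A-Z]\d$"), "postal code"),    # e.g. "T2N 1N4"
]

def infer_keyword(tag_value):
    """Determine the associated keyword for a bare tag value by
    comparing it to feasible tag value formats."""
    for pattern, keyword in RULES:
        if pattern.match(tag_value.strip()):
            return keyword
    return None  # no feasible format matched
```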
Although in embodiments described above, the value of the “seat location” tag is a set of seat coordinates, in other embodiments, the value of the “seat location” tag may alternatively be a seat number.
Although in embodiments described above, the spatial information tags comprise any of seat location, classroom location, postal code, and mailing address, it will be understood by those skilled in the art that other tags may be used. For example, a tag “location” comprising tag values of “home” and “school” (e.g., “location: home”) may be used. As another example, a tag “location” comprising tag values of classroom names (e.g., “location: classroom 27” or “location: Computer Lab 3”) may also be used. As other examples, a “participant group” tag comprising tag values describing one or more participant groups may be used, and an “organization” tag comprising tag values describing one or more organizations may be used. It will be understood that still other tags may be used.
Although in embodiments described above, the tag keywords are entered by the facilitator, in other embodiments, the tag keywords may alternatively be previously defined.
Although in embodiments described above, spatial information of the participants is obtained from tags, in other embodiments, spatial information may alternatively be collected automatically from other sources. For example, in one related embodiment, a facilitator may set up a group and allow participants to join the group from one or more remote geographic locations. When participants join the group, the IP addresses of the response devices used by the participants are collected, and are then analyzed to determine the cities, provinces/states and countries of the participants. In still other embodiments, spatial information may be collected from other sources, such as, for example, by querying LDAP servers in the network, from a cellular network, from WiFi network location services, such as for example the Skyhook WiFi location service provided by Skyhook Wireless, and/or from Global Positioning System (GPS) devices incorporated into the response devices. In related embodiments, tags may be automatically generated based on the collected spatial information. For example, a GPS location tag may be automatically generated as “geo: 51.062770, −114.082139” for a response device equipped with a GPS device. Here, the tag values of automatically generated GPS location tags may be automatically updated in real-time.
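The automatic generation of a GPS location tag in the form given above ("geo: 51.062770, −114.082139") can be sketched as simple string formatting of device coordinates. The function name and the six-decimal precision (taken from the example) are illustrative assumptions.

```python
def make_geo_tag(latitude, longitude):
    """Generate a GPS location tag string, e.g.
    "geo: 51.062770, -114.082139", from device coordinates.
    Six-decimal formatting follows the example above; a real-time
    implementation would regenerate this tag as the device moves.
    """
    return f"geo: {latitude:.6f}, {longitude:.6f}"
```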
As will be understood, the configurations of the host-side and client-side applications are not limited to those described above and in other embodiments, other configurations of the host-side and client-side applications may be used. For example, the host-side application 142 may reside and run on one or more servers that communicate with one another through a network. As another example, any of the assessment tool and the management module may alternatively be web applications running on one or more servers, and may provide one or more GUIs to the facilitator via a web browser on a computing device used by the facilitator. Similarly, the client-side applications may alternatively be web applications that run on one or more servers, and may provide a GUI to each participant via a web browser on each response device. As a further example, both host-side and client-side applications may be web applications that run on one or more servers, and may provide one or more GUIs to the facilitator and participants via web browsers.
Although in embodiments described above, the management module is used by the facilitator to set up groups, in other embodiments, the management module may alternatively be used by a system administrator to set up groups.
Although in embodiments described above, the management module associates the tag values with a relevant map image, in other embodiments, the management module may alternatively associate the tag values with a plurality of relevant map images.
Although in embodiments described above, the shape applied to the room is a rectangle, in other embodiments, other shapes may alternatively be applied to the room. In other embodiments, the shape may be chosen by a facilitator upon being prompted to choose a shape similar to the shape of the room.
Although in embodiments described above, the response devices comprise remote units and portable laptop or tablet computing devices, in other embodiments, the response devices may alternatively comprise other computing devices, such as, for example, smartphones, personal digital assistants (PDAs), etc. Here, the smartphones and/or PDAs would communicate with the general purpose computing device 16 wirelessly via the transceiver 22 or via other commercial wireless transceivers such as wireless routers, or via wired connections such as, for example, Ethernet or the Internet.
In other embodiments, the participant response system may be connected to a network and participant information, including tags, may alternatively be stored in a central database such that, when setting up a group, the facilitator may simply retrieve participant information from the central database and associate it with the group. Those skilled in the art will appreciate that instead of using tags, the system may alternatively provide the facilitator with an administration tool for setting up fields of various types of participant information, allowing the facilitator to input participant information to corresponding fields.
Although the general purpose computing device is described as being physically connected to the IWB and transceiver via cables, the general purpose computing device may alternatively communicate with the IWB and transceiver over a wireless communication link.
It will be understood that in other embodiments, the keypad of the remote units may alternatively comprise a set of keys that is different from that of the embodiment described above, such as for example a full QWERTY key set or a DVORAK key set, or a subset thereof. If desired, the entire physical keypad or a portion thereof may be replaced with a touch screen overlying the LCD display to allow a user to interact with virtual keys.
Although in embodiments described above, the remote units are powered by any of batteries and a 110/220V source via a power cord, in other embodiments, the remote unit may be powered by any of a photovoltaic source and a manually cranked generator.
Although in embodiments described above, the facilitator is a teacher, the participants are students of a class, the group is a class, and the assessment is a test, it will be understood that in other embodiments, the facilitator and the participants may be other persons, the group may alternatively be another grouping, and the assessment may alternatively be another form of assessment.
Although embodiments have been described above with reference to the accompanying drawings, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.
This application is a continuation of U.S. patent application Ser. No. 13/070,005, filed Mar. 23, 2011, the contents of which are incorporated herein by reference.
Relation | Number | Date | Country
---|---|---|---
Parent | 13/070,005 | Mar. 2011 | US
Child | 13/335,243 | | US