Networks of digital displays that render digital signage related to advertising, entertainment, education, and/or other content items are installed across a wide range of physical spaces (e.g., a retailer's network of stores). Content items displayed on such digital displays can include images, videos, and interactive content with which users can interact in real time. The content may be associated with a plurality of different entities, such as advertisers, having different needs and/or goals for display of their respective content items.
Embodiments of the present disclosure will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments. Furthermore, embodiments described herein may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the systems and methods herein described.
In one embodiment, an interactive display device management device (or simply “management device”) manages a plurality of digital display devices that are in communication with the management device via one or more networks. In one embodiment, the display devices comprise various types and/or sizes of display devices. For example, a first display device may include a touch-sensitive display screen, while a second display device includes a projector that projects video images onto a surface and one or more cameras that detect movements of users with respect to the surface. Additionally, the display devices may be positioned in different geographical locations, such as at different retail establishments in different locales, cities, states, or countries.
The management device may receive content items from each of one or more content clients, such as advertising clients, advertising networks, or agents of an entity that provides digital content for rendering, which the content client desires to have displayed on one or more display devices. Each content item includes video images, audio files, and/or software code configured for execution by a computing device, such as a processor of a display device, in order to generate video images, such as video images including virtual objects with which a user may interact. Content items may include any number of video, audio, and/or image files that are selectively rendered by respective digital display devices according to software code that is also included as part of the content item, such as in response to user interactions with virtual objects that are rendered by the digital display devices. The content items may be related to one or more of advertising, education, entertainment, and/or any other purpose for which a content client desires to display the content item.
In one embodiment, content items may include associated content display parameters that indicate parameters for display of the content item on each of one or more particular display devices, types of display devices, geographical locations, etc. Thus, the same content item may be rendered in different manners based on the size of the display device on which the content item is rendered, for example. The content item may also include interaction rules regarding how interactions between users near display devices and virtual objects displayed on the respective display devices affect operation of the display device. For example, interaction rules of a particular content item may indicate changes to the rendered content item in response to user interactions with respective portions of video images, such as virtual items, that are displayed on a display device as part of the content item.
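By way of illustration only, a content item with its display parameters and interaction rules might be represented as structured data along the following lines; every field name, type, and value in this sketch is hypothetical rather than drawn from any particular embodiment:

```python
from dataclasses import dataclass, field

@dataclass
class InteractionRule:
    """Hypothetical rule mapping a user interaction to a rendering change."""
    virtual_object: str   # identifier of a rendered virtual object
    interaction: str      # e.g., "touch" or "gesture:swipe_left"
    action: str           # e.g., "play_animation" or "swap_content"

@dataclass
class ContentItem:
    """Sketch of a content item with per-device display parameters."""
    item_id: str
    media_files: list[str]   # video/audio/image assets
    display_params: dict[str, dict] = field(default_factory=dict)
    interaction_rules: list[InteractionRule] = field(default_factory=list)

# The same content item rendered differently depending on the display device:
item = ContentItem(
    item_id="duckling-ad-01",
    media_files=["ducklings.mp4"],
    display_params={
        "touchscreen-55in": {"resolution": (1920, 1080), "fps": 60},
        "floor-projector": {"resolution": (1024, 768), "fps": 30},
    },
    interaction_rules=[InteractionRule("duckling", "touch", "play_animation")],
)
```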
In one embodiment, the interactive display management device manages content of digital display devices across multiple networks, including multiple heterogeneous display devices. In one embodiment, certain display devices are interactive, such as an interactive display device that includes one or more cameras and/or projectors. In one embodiment, the management device may also selectively issue commands to respective display devices, such as in response to input received from one or more content clients associated with a particular display device and/or content item(s) rendered on the display device, and may distribute content items to the network of digital displays.
In one embodiment, the display management device receives interaction data from at least some of the display devices. Interaction data may include indications of interactions that users have had with virtual objects of the content item(s), possibly from multiple content providers. Interaction data from multiple display devices may be analyzed by the display management device and changes and/or updates to one or more display schedules (also referred to herein as “playlists”) for display devices may be based at least partly on the interaction data.
The interaction data from each of the interactive display devices 140 may be intermittently transmitted to the interactive display management device 120, which may analyze the interaction data and update the content items, playlists, and/or commands for one or more of the interactive display devices 140. For example, if the interaction data from display device 140A indicates that a particular content item was repeatedly interacted with in a certain manner, the interactive display management device 120 may be configured to adjust the playlists of interactive display device 140B and/or interactive display device 140C to cause those interactive display devices to display the particular content item more or less frequently. Depending on the embodiment, the network 160 may include any suitable communication network, such as one or more LANs, WANs, and/or the Internet, for example. The devices 120, 140, and other devices described herein, may communicate via the network 160 using one or more wired and/or wireless communication mediums.
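As a non-limiting sketch of such interaction-driven playlist adjustment, the following fragment adds or removes playlist slots for an item based on aggregate interaction counts reported by the display devices; the function name, thresholds, and playlist representation are all assumptions made for illustration:

```python
def adjust_playlists(interaction_counts, playlists,
                     boost_threshold=100, demote_threshold=5):
    """Add or drop playlist slots for items based on aggregate counts.
    playlists maps device id -> ordered list of content item ids."""
    for item_id, count in interaction_counts.items():
        for playlist in playlists.values():
            if item_id not in playlist:
                continue
            if count >= boost_threshold:
                playlist.append(item_id)   # schedule an extra slot
            elif count <= demote_threshold and playlist.count(item_id) > 1:
                playlist.remove(item_id)   # drop a slot, keep at least one
    return playlists

playlists = {"140B": ["a", "duckling-ad-01"], "140C": ["duckling-ad-01", "b"]}
print(adjust_playlists({"duckling-ad-01": 150}, playlists))
```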
In one embodiment, the one or more content clients also communicate with the interactive display management device 120 via the network 160. The content clients may transmit content items, scheduling rules, commands, and/or any other data that might be used in controlling or monitoring display of video data and/or interactions with displayed video data on one or more interactive display devices 140. For example, a content client may upload content items to the management device 120 (or otherwise make the content items available), such as a software application that presents interactive advertising content that is responsive to interactions with rendered virtual objects. In one embodiment, the content clients may access information regarding their respective content items, playlists including those content items (possibly including indications of other content items associated with other content clients), and/or interactions with content items, and/or provide further content items to the management device 120 via one or more Internet-accessible interfaces, such as interfaces that are renderable in one or more browsers, for example. Depending on the embodiment, content clients may communicate with the management device 120 via one or more standalone software applications that are executed on computing devices of the content client, and/or by transmitting data to the management device 120 via any other electronic communication medium and protocol, such as File Transfer Protocol (FTP), Hypertext Transfer Protocol (HTTP), email, or Short Message Service (SMS), for example.
In operation, the interactive display management device 120 transmits updated content items, playlists, and/or commands to each of the interactive display devices 140, as necessary.
In one embodiment, the content client device 170 provides scheduling rules to the management device 120 that are used by the management device 120 to determine where, when, and how content items are displayed. For example, the content client may indicate scheduling rules for a particular content item, or multiple content items, that include rules related to dayparting (e.g., the time of day when the content items should or must be displayed), location information that indicates general and/or specific geographic locations (e.g., one or more ZIP code areas), a particular type of location where the content item should or must be displayed (e.g., a mall, a department store, a university, etc.), limitations on adjacent content (e.g., content before and/or after a particular content item), such as an indication that content items within one or more classes or associated with one or more specific content clients may not be displayed within a specific time before or after the content item, a particular class and/or particular display device(s) on which the content may or may not be displayed, and/or rules that define weather, current events, and/or demographics of areas around display devices that must match corresponding predefined thresholds before the content item(s) are transmitted to the corresponding display device.
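One possible encoding of such scheduling rules is as composable predicates evaluated against a display device and the current time, as in the following sketch; the rule constructors, device fields, and example values are hypothetical:

```python
from datetime import datetime, time

def daypart_rule(start: time, end: time):
    """Hypothetical dayparting rule: the item may play only between start and end."""
    return lambda device, now: start <= now.time() <= end

def location_rule(allowed_zips: set):
    """Hypothetical location rule: the item may play only in listed ZIP code areas."""
    return lambda device, now: device["zip"] in allowed_zips

def may_schedule(rules, device, now=None):
    """An item is schedulable on a device only if every rule passes."""
    now = now or datetime.now()
    return all(rule(device, now) for rule in rules)

# Example: an evening-only item restricted to two ZIP code areas.
rules = [daypart_rule(time(17, 0), time(22, 0)), location_rule({"90210", "10001"})]
print(may_schedule(rules, {"id": "140A", "zip": "90210"}))
```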
The data 145, 155 that is downloaded to and uploaded by display devices 140 may be transferred over the network 160 in an incremental manner, such that the amount of data that needs to be transferred, and thus the network bandwidth consumed, is reduced. For example, the management device 120 and a particular display device 140 may initially compare directory trees, file timestamps, and/or file hashes, for example, associated with data (e.g., content items or playlists for the particular display device) that is to be common on both devices, in order to determine the specific differences between the data stored on the two devices, and only transmit the files, and/or portions of files, that have been updated or added since the last incremental update.
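A minimal sketch of the hash-based comparison follows, assuming the display device reports a manifest mapping file paths to digests so that only new or changed files are queued for transfer (directory-tree and timestamp comparisons are omitted for brevity):

```python
import hashlib
from pathlib import Path

def file_hashes(root: Path) -> dict:
    """Map each file path (relative to root) to a SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in root.rglob("*") if p.is_file()
    }

def files_to_send(server_root: Path, device_manifest: dict) -> list:
    """Return only the files whose content differs from the device's copy."""
    server = file_hashes(server_root)
    return [path for path, digest in server.items()
            if device_manifest.get(path) != digest]
```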
The interactive display devices 140A, 140B, which may comprise different types of display devices, are each configured to detect interactions with users. A user may include a visitor, customer, or student, for example, that interacts with a display device 140. Thus, if the display device 140A is positioned in a shopping center, users may include customers that are shopping in the shopping center. In one embodiment, the interactive display devices 140 transmit some or all of the interaction data representing interactions between displayed content items, or more specifically between virtual objects displayed as part of the content items, and one or more users. The interaction data may include data indicating counts of each of certain interactions and/or types of interactions, how users interact with respective virtual objects, how long users interact with various virtual objects, etc.
The interactive display management device 120 is advantageously configured to receive interaction data from each display device 140, to analyze the interaction data, such as with reference to scheduling rules from respective content client devices 170, and to determine adjustments to playlists, including the particular content items and/or content display parameters associated with the content items, for display devices 140. In one embodiment, the management device 120 receives and/or compiles interaction data associated with content items from a particular content client device 170 from multiple display devices 140. Thus, the management device 120 may make playlist adjustments to display devices not just based on the interaction data from a single display device, but based on interaction data from multiple display devices 140.
In one embodiment, the management device 120 generates and provides the content client device 170 with access to interaction reports that include data regarding interactions with the content items of the content client 170, and possibly other reports associated with the content items of the content client 170 (and possibly of other content clients). In one embodiment, an interaction report indicates the interactions with a particular content item and compares those interactions with interactions with other content items, such as other content items of the content client 170 and/or content items of other content clients 170. For example, an interaction report may indicate interactions with a particular content item of a content client 170, as well as a comparison of the interactions with other content items rendered by the particular display devices 140 and/or other display devices. Thus, an interaction report may include data that allows the content client 170 to determine how interactive a particular content item is in relation to other content items and, accordingly, may be useful in determining which content items should be displayed more or less frequently and/or which content items may benefit from updating.
In one embodiment, the interactive display management device 120 comprises, for example, one or more servers or personal computers that are IBM, Macintosh, or Linux/Unix compatible. In another embodiment, the interactive display management device 120 comprises a laptop computer, smart phone, personal digital assistant, or other computing device. In one embodiment, the interactive display management device 120 includes one or more central processing units (“CPU”) 205, which may include conventional or special-purpose microprocessors. In one embodiment, the interactive display management device 120 includes computer-readable storage media, including one or more memories 230, such as random access memory (“RAM”) for temporary storage of information, a read only memory (“ROM”) for permanent storage of information (not shown), and one or more mass storage devices 220 (including illustrative mass storage devices 220A and 220B).
The interactive display management device 120 is generally controlled and coordinated by operating system software, such as Windows 95, 98, NT, 2000, XP, Vista, 7; SunOS; Solaris; Blackberry OS, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X. In other embodiments, the interactive display management device 120 may be controlled by any other suitable operating system, such as a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file systems, networking, and I/O services, and/or provide a user interface, such as a graphical user interface (“GUI”), among other functions.
The illustrative interactive display management device 120 may include one or more commonly available input/output (I/O) interfaces and devices 210, such as a keyboard, mouse, touchpad, and printer. In one embodiment, the I/O interfaces and devices 210 include interfaces to one or more display devices, such as a monitor, that allow the visual presentation of data to a user. More particularly, a display device provides for the presentation of content items, such as video images including GUIs, menus, icons, animations, application software data, and multimedia presentations, for example. In one embodiment, the I/O interfaces and devices 210 comprise devices that are in communication with modules of the interactive display management device 120 via a network, such as the network 160 or any local area network, including secured local area networks, or any combination thereof. The interactive display management device 120 may also include one or more multimedia devices 240, such as speakers, video cards, graphics accelerators, and microphones, for example.
The application modules of the illustrative interactive display management device 120 include a client interface module 250 and a display device interface module 255. In general, the client interface module 250 is configured to interface with one or more content client devices 170.
The management device 120 also includes a display device interface module 255 that is configured to interface with each of a plurality of interactive display devices 140.
Additionally, the display device interface module 255 receives interaction data from the respective display devices 140. As discussed above, interaction data may include various types and formats of data regarding interactions with particular content items and/or groups of content items, and comparisons of interaction data associated with certain content items.
The illustrative interactive display device 300 includes one or more central processing units 305 and one or more memories 330, such as random access memories and/or read only memories, one or more I/O interfaces and devices 310, and one or more mass storage devices, such as the device data store 320. The central processing unit 305, memory 330, and I/O interfaces and devices 310 may include any suitable devices, such as those described above with respect to the interactive display management device 120.
In general, the content player 350 is configured to generate a video output comprising video images for display by the display device 300. For example, in one embodiment the content player 350 generates video output at a predetermined or adjustable frame rate (e.g., 20, 30, 40, 50, or 60 frames per second) that is transmitted to a projection device, such as a digital movie projector, for projection onto a display surface, such as a display screen, floor, wall, or other surface. The video output may be customized depending on the specifications of the particular display device. For example, video output for a projector may be formatted differently (e.g., a different frame rate and/or resolution) than a video output of the same content item for a touchscreen display. Thus, in one embodiment the content player 350 of the display device 300 is configured to render video output for display on the particular display hardware of the display device 300. In one embodiment, the content player initiates execution of content items in the form of software code via the CPU 305 and generates corresponding video images for display by the display device 300. The content player 350 may also be configured to render video and/or audio files in various formats, such as AVI, MPEG, VCD, DVD, WMV, ASF, MP4, 3GP, DivX, XviD, DAT, RM, RMVB, MOV, QT, M4V, FLV, and MKV.
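For illustration, a fixed-rate output loop might be paced along the following lines; real players typically synchronize to display hardware or decoder clocks, so this sleep-based sketch is only an approximation:

```python
import time

def play(render_frame, fps=30, duration_s=5.0):
    """Call render_frame(t) roughly fps times per second for duration_s
    seconds, sleeping until each frame deadline to hold the target rate."""
    frame_interval = 1.0 / fps
    start = time.monotonic()
    frame = 0
    while (t := time.monotonic() - start) < duration_s:
        render_frame(t)
        frame += 1
        next_deadline = start + frame * frame_interval
        time.sleep(max(0.0, next_deadline - time.monotonic()))

play(lambda t: print(f"frame at t={t:.3f}s"), fps=30, duration_s=0.1)
```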
The activity monitor 355 is generally configured to detect interactions of users with the interactive display device 300, such as based on one or more inputs from other multimedia devices and/or I/O interfaces and devices 310. For example, if the display device 300 comprises a touch sensitive display screen, the activity monitor 355 may detect interactions of the user with certain virtual objects then displayed on the touch sensitive display screen in response to information received from the touch sensitive display screen, such as screen location(s), duration, pressure, and/or repetition, etc. associated with the user's interaction. Similarly, if display device 300 comprises a camera that records images of at least a portion of a user interacting with a screen onto which video images are being projected, such as by a projector, the activity monitor 355 may be configured to analyze the recorded images from the camera and determine when an interaction with a virtual object of the video images has occurred. In one embodiment, the activity monitor 355 may determine gestures performed by the user based on a series of interactions with a display screen, for example, such as a user moving a finger in a predefined pattern across a touch sensitive display screen or moving a hand (or other body part) in a certain pattern in front of a camera that records images of the user.
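As an illustrative sketch, a gesture such as a horizontal swipe might be classified from a time-ordered series of touch samples as follows; the pixel thresholds and gesture names are assumptions:

```python
def detect_swipe(points, min_dx=200, max_dy=60):
    """Classify a horizontal swipe from (x, y) touch samples in time order:
    large net horizontal travel with small vertical drift."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = abs(points[-1][1] - points[0][1])
    if abs(dx) >= min_dx and dy <= max_dy:
        return "swipe_right" if dx > 0 else "swipe_left"
    return None

print(detect_swipe([(100, 300), (180, 305), (420, 310)]))  # swipe_right
```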
In one embodiment, the activity monitor 355 records logs that include data indicating the starting and ending time and identification for content items that were rendered on the display device, frame rate, memory usage, and/or application errors and warnings, for example. The logs may include system metrics, such as available disk space and memory, frame rates, display status, display problems such as overheating or component failure, raw and analyzed images from webcams or vision systems, application status, and/or display errors. The activity monitor 355 may use such logs and interaction data to generate alerts and reports. The alerts and reports may be custom defined; for example, a custom alert may be defined by code entered into a web interface using a language such as SQL, or by entering a specific item of monitoring data and an associated threshold for alerting.
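A minimal sketch of such threshold-based alerting over logged metrics follows; the metric names and limits are illustrative only:

```python
def check_alerts(metrics: dict, thresholds: dict) -> list:
    """Emit an alert string for each metric outside its configured limit.
    thresholds maps a metric name to a ("min" or "max", limit) pair."""
    alerts = []
    for name, (kind, limit) in thresholds.items():
        value = metrics.get(name)
        if value is None:
            continue
        if (kind == "min" and value < limit) or (kind == "max" and value > limit):
            alerts.append(f"{name}={value} violates {kind} limit {limit}")
    return alerts

metrics = {"free_disk_mb": 512, "frame_rate": 18, "temperature_c": 91}
thresholds = {"free_disk_mb": ("min", 1024),
              "frame_rate": ("min", 24),
              "temperature_c": ("max", 85)}
print(check_alerts(metrics, thresholds))
```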
The scheduler 360 is generally configured to receive playlists and/or playlist updates from the management device 120, maintain the appropriate content data on the device data store 320, and initiate rendering of respective content items at the appropriate times (e.g., the time indicated in the playlist) or in response to a particular condition (e.g., a certain interaction with a first content item may trigger rendering of a second content item by a display device). In one embodiment, the scheduler 360 may be configured to update the playlist based on one or more user interactions with the display device 300, commands from the management device 120, and/or scheduling rules from a content client 170. Thus, in one embodiment the display device 300 is configured to update a playlist for the display device 300 according to scheduling rules established by one or more content clients, for example, rules based on interaction data of the interactive display device 300. The scheduler 360 may also be configured to upload content items to one or more other devices, such as other display devices 140, and/or perform quality assurance testing on content items.
The device data store 320 is configured to store content items and/or location information of content items, content display parameters associated with respective content items, scheduling rules for particular content items, and/or interaction data associated with respective content items.
In one embodiment, the live images from the video camera 420 are processed in substantially real-time in order to separate mobile objects (e.g., people) from the static background, regardless of what the background is. The processing can be done as follows: first, input frames from the video camera may be converted to grayscale to reduce the amount of data and to simplify the detection process. Next, the converted input frames may be blurred slightly to reduce noise. In one embodiment, any object that does not move over a long period of time is presumed to be background; therefore, the system is able to eventually adapt to changing lighting or background conditions. In one embodiment, a model image of the background can be generated by numerous methods that examine the input frames over a range of time. In one method, the last several input frames (or a subset thereof) are examined to generate a model of the background, either through averaging, generating the median, detecting periods of constant brightness, or other heuristics, for example. The length of time over which the input frames are examined determines the rate at which the model of the background adapts to changes in the input image.
An object of interest, e.g., a portion of a user's body, is presumed to differ in brightness from the background. In order to find objects, the current video input may be subtracted from the model image of the background. In this embodiment, if the absolute value of this difference at a particular location is larger than a particular threshold, then that location is classified as an object; otherwise, it is classified as background.
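The pipeline described above might be sketched with OpenCV as follows, modeling the background as the median of recent grayscale, blurred frames and classifying pixels whose absolute difference from the model exceeds a threshold as objects; the history length, threshold value, and camera index are assumptions:

```python
import cv2
import numpy as np

HISTORY = 60     # frames retained for the background model (illustrative)
THRESHOLD = 25   # per-pixel brightness difference for "object" (illustrative)

def preprocess(frame):
    """Grayscale then blur slightly to reduce data and noise."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cv2.GaussianBlur(gray, (5, 5), 0)

history = []
cap = cv2.VideoCapture(0)  # live camera; device index 0 is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = preprocess(frame)
    history.append(gray)
    if len(history) > HISTORY:
        history.pop(0)
    # The median over recent frames adapts slowly to lighting changes.
    background = np.median(np.stack(history), axis=0).astype(np.uint8)
    mask = (cv2.absdiff(gray, background) > THRESHOLD).astype(np.uint8) * 255
    cv2.imshow("objects", mask)
    if cv2.waitKey(1) == 27:  # press Esc to quit
        break
cap.release()
```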
In one embodiment, a computing system associated with the camera accesses the object/background classification of an image (possibly in addition to other data) as input, and outputs a video image based on this input, possibly in real time. Software that performs this functionality can take on an infinite number of forms, and is thus as broadly defined as a computer application. For example, this component could be as simple as producing a spotlight in the shape of the detected objects, or as complicated as a paint program controlled through gestures made by people who are detected as objects. In addition, applications could use other forms of input, such as sound, temperature, or keyboard input, as well as additional forms of output, such as audio, tactile, virtual reality, or aromatic output.
In one embodiment, objects detected by the camera and/or associated computing device are able to interact with virtual objects of the video images displayed on the display screen 423. For example, an interactive content item showing a group of ducklings could be programmed to follow behind any real object (e.g. a person) that walks in front of the display. As another example, computer games that can be played by people moving in front of the camera form another class of interactive content items.
Beginning in block 510, the interactive display management device 120 intermittently transmits content items, playlists, and/or commands to each of the interactive display devices. As described above, the content items for display on display devices may be transmitted for storage at the interactive display device 140 or may be stored elsewhere accessible to the display devices 140, such as in response to a display device providing authentication information to the storage device. The playlist may include a list of content items to be rendered by a particular display device, as well as one or more display parameters for each of the content items and/or for multiple content items. The data transmitted to the display devices 140 may include scheduling rules, such as some or all of the scheduling rules that are received from one or more content clients, that indicate criteria for adjusting a playlist of a respective display device 140, such as in response to certain interaction data at a particular display device.
Next, in block 520, interaction data is received from multiple interactive display devices 140. Interaction data generally indicates interactions between a user and one or more video images and/or specific virtual objects rendered in the video images. Interaction data may be in various formats, such as in a log format that indicates each interaction between a user and an interactive display device, along with a time of each interaction and possibly other information regarding each interaction, or in one or more summary formats that indicate different combinations of the interaction data, such as a total number of a particular interaction during a predetermined time period. In other embodiments, the interaction data may include any other information related to interactions with the content items rendered on a display device and/or content items displayed on other display devices.
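By way of example, a per-event log of the assumed form (timestamp, content item, interaction type) might be reduced to summary counts over a recent window as follows:

```python
from collections import Counter
from datetime import datetime, timedelta

def summarize(log, window, now=None):
    """Count (content item, interaction type) pairs seen within the window."""
    now = now or datetime.now()
    cutoff = now - window
    return Counter((item_id, kind) for ts, item_id, kind in log if ts >= cutoff)

log = [(datetime.now(), "duckling-ad-01", "touch"),
       (datetime.now(), "duckling-ad-01", "gesture:swipe_left")]
print(summarize(log, timedelta(hours=1)))
```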
Moving to block 530, based on the received interaction data, and possibly on other data such as the scheduling rules from one or more content clients 170, the interactive display management device 120 determines if updates to the content items, playlists, and/or commands should be made for any of the one or more display devices 140 in the interactive display network.
Beginning in block 610, the interactive display management device 120 receives interaction data from one or more interactive display devices 140, such as is described above with reference to block 520.
Next, in block 630, scheduling rules of one or more content clients are applied to at least the interaction data. For example, substantially real-time interaction data may be received by the interactive display management device 120 and may be provided to one or more content client devices 170 in a substantially real-time manner, such as via one or more Internet-accessible user interfaces, for example. In this embodiment, scheduling rules for a particular content client and/or particular content item may be applied to the real-time interaction data associated with the content item in order to determine if the scheduling rules are currently satisfied. For example, a particular content item may require a predetermined minimum quantity of interactions with the content item, and/or with particular virtual objects of the content item, in order to continue rendering of the content item (in some embodiments, the content client pays for rendering of the content item based on the amount of time the content item is displayed on one or more display devices and, thus, may not want to pay for continued rendering of the content item if the content item is not being sufficiently interacted with by users). Thus, if the particular content item has not met the predetermined minimum user interaction threshold, the application of the rules to the interaction data would indicate that the minimum threshold has not been met.
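A sketch of such a minimum-interaction check follows; the function signature and threshold are hypothetical:

```python
def meets_minimum_interactions(counts, item_id, minimum, virtual_objects=None):
    """Continued rendering requires the item's interaction count, optionally
    including counts for specific virtual objects, to reach a client-set floor."""
    total = counts.get(item_id, 0)
    for obj in (virtual_objects or []):
        total += counts.get(obj, 0)
    return total >= minimum

counts = {"duckling-ad-01": 12}
if not meets_minimum_interactions(counts, "duckling-ad-01", minimum=50):
    print("minimum not met: candidate for playlist adjustment")
```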
Similarly, scheduling rules of multiple content clients may be applied to the interaction data, such as scheduling rules of each content client for which content items are scheduled over a predetermined future time. As noted above, scheduling rules may include limitations on adjacent content items, such that the scheduling rules of content items that are scheduled for rendering in the future may be relevant to the current content item on the playlist. For example, if a current content item triggers a scheduling rule for a later scheduled content item, e.g., the current content item is being rendered too close in time to the later scheduled content item, the interactive display management device 120 detects triggering of the scheduling rule. Additionally, scheduling rules may consider interaction data from multiple display devices 140, such as interaction data associated with multiple content items associated with a particular content client that are rendered on multiple display devices 140. For example, a particular content item of a content client may be rendered on display devices 140 in multiple states, countries, types of locations, etc., and the interactive display management device 120 may compile the interaction data for all of the display devices 140 on which the particular content item has been rendered and/or is scheduled to be rendered, possibly within a predetermined time. Thus, the scheduling rules of the content client may also include minimum and/or maximum rendering time quantities for a particular content item or group of content items, across multiple display devices 140, locations, location types, and/or any other criteria.
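The adjacency limitation described above might be checked along these lines, where a playlist is assumed, for illustration, to be an ordered list of (item, class) slots:

```python
def violates_adjacency(playlist, item_id, excluded_classes, min_gap=2):
    """True if the item appears within min_gap playlist slots of any item
    belonging to an excluded content class."""
    positions = [i for i, (pid, _) in enumerate(playlist) if pid == item_id]
    for i in positions:
        lo, hi = max(0, i - min_gap), min(len(playlist), i + min_gap + 1)
        for j in range(lo, hi):
            if j != i and playlist[j][1] in excluded_classes:
                return True
    return False

playlist = [("soda-ad", "beverage"),
            ("duckling-ad-01", "entertainment"),
            ("juice-ad", "beverage")]
print(violates_adjacency(playlist, "duckling-ad-01", {"beverage"}))  # True
```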
Moving to block 640, the interactive display management device 120 determines if updates to playlists of one or more display devices 140 are indicated by the application of scheduling rules. If updates to playlists are indicated, the method continues to block 650, where the interactive management device 120 determines playlist adjustments and selectively transmits updated playlists, and/or portions of playlists, to respective display devices 140.
Beginning in block 710, the display device 140 intermittently receives content items, playlist updates, and/or command data from the interactive display management device 120. In one embodiment, content items are transmitted on a regular schedule, such as nightly at a particular time when bandwidth for transfer of large content items is more easily accessible. Similarly, playlists may be transmitted on a predetermined schedule, such as once every hour, every six hours, or once per day, for example. In some embodiments, content items, playlists (or playlist updates), and/or command data may be transmitted on an as-needed basis or upon request from the display device 140. For example, if the interactive display management device 120 determines that a playlist update is needed, such as in response to application of one or more scheduling rules on real-time interaction data from one or more display devices 140, updated playlist information may be received by the interactive display device 140 outside of a normal schedule (or there may not be a normal schedule, such that playlist updates are only transmitted when updated).
In block 720, the display device executes commands indicated in the received command data, if any. Commands may be entered by a content client and/or administrator in order to cause a particular display device, and/or one or more display devices that execute a particular content item or content items, to perform an associated function. For example, a content client may issue a command indicating that all renderings of a particular content item are to be stopped as soon as possible and a replacement content item of the client is to be rendered in its stead. Similarly, an administrator may issue a command to slow the frame rate of a particular content item on a particular display device in response to receiving log data from the display device that indicates the display device is overheating. In some embodiments, commands are not transmitted to the display device, such that block 720 is not performed by the display device.
Next, in block 730, the display device initiates display of one or more content items according to the received playlist. If updated playlist information was received, such as in block 710, content items selected for rendering on the display device are selected based on the updated playlist. In one embodiment, the display device may adjust the playlist based on application of scheduling rules by the interactive display device 140, hardware considerations associated with the interactive display device 140, and/or any other relevant factors.
Continuing to block 740, the display device 140 detects interactions with video images and/or portions of video images, such as virtual objects that are displayed as part of video images. As noted above, detection of such interactions may be accomplished in various manners such as by analyzing images of a user near a display screen upon which the video images are projected.
Next, in block 750, the display device intermittently transmits incremental interaction data to the management device 120. Depending on the embodiment, the transmission of the interaction data may occur on a periodic schedule, may be transmitted in response to a request for interaction data from the interactive display management device 120 and/or the content client device 170, and/or may be transmitted in response to detection of certain types, quantities, and/or patterns of interactions in the interaction data. For example, a particular interactive display device 140 may be configured to transmit interaction data on a regular basis, unless a particular interaction is detected by the display device. Thus, real-time indications of the particular interaction may be transmitted to the interactive display management device 120, which may adjust not only a playlist for the particular display device 140, but possibly also playlists for other display devices 140 in the interactive display network, based at least partly on the detected interaction.
The foregoing description details certain embodiments of the present disclosure. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the present disclosure can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of systems or methods should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects with which that terminology is associated. The scope of this disclosure should therefore be construed in accordance with the appended claims and any equivalents thereof.
This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application Ser. No. 61/061,105, filed on Jun. 12, 2008, which is hereby expressly incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2917980 | Grube et al. | Dec 1959 | A |
3068754 | Benjamin et al. | Dec 1962 | A |
3763468 | Ovshinsky et al. | Oct 1973 | A |
4053208 | Kato et al. | Oct 1977 | A |
4275395 | Dewey et al. | Jun 1981 | A |
4573191 | Kidode et al. | Feb 1986 | A |
4725863 | Dumbreck et al. | Feb 1988 | A |
4769697 | Gilley et al. | Sep 1988 | A |
4791572 | Green et al. | Dec 1988 | A |
4843568 | Krueger et al. | Jun 1989 | A |
4887898 | Halliburton et al. | Dec 1989 | A |
4948371 | Hall | Aug 1990 | A |
5001558 | Burley et al. | Mar 1991 | A |
5138304 | Bronson | Aug 1992 | A |
5151718 | Nelson | Sep 1992 | A |
5239373 | Tang et al. | Aug 1993 | A |
5276609 | Durlach | Jan 1994 | A |
5319496 | Jewell et al. | Jun 1994 | A |
5325472 | Horiuchi et al. | Jun 1994 | A |
5325473 | Monroe et al. | Jun 1994 | A |
5418583 | Masumoto | May 1995 | A |
5426474 | Rubstov et al. | Jun 1995 | A |
5436639 | Arai et al. | Jul 1995 | A |
5442252 | Golz | Aug 1995 | A |
5454043 | Freeman | Sep 1995 | A |
5473396 | Okajima et al. | Dec 1995 | A |
5497269 | Gal | Mar 1996 | A |
5510828 | Lutterbach et al. | Apr 1996 | A |
5526182 | Jewell et al. | Jun 1996 | A |
5528263 | Platzker et al. | Jun 1996 | A |
5528297 | Seegert et al. | Jun 1996 | A |
5534917 | MacDougall | Jul 1996 | A |
5548694 | Gibson | Aug 1996 | A |
5591972 | Noble et al. | Jan 1997 | A |
5594469 | Freeman et al. | Jan 1997 | A |
5633691 | Vogeley et al. | May 1997 | A |
5662401 | Shimizu et al. | Sep 1997 | A |
5703637 | Miyazaki et al. | Dec 1997 | A |
5771307 | Lu et al. | Jun 1998 | A |
5808784 | Ando et al. | Sep 1998 | A |
5846086 | Bizzi et al. | Dec 1998 | A |
5861881 | Freeman et al. | Jan 1999 | A |
5882204 | Iannazo et al. | Mar 1999 | A |
5900982 | Dolgoff et al. | May 1999 | A |
5923380 | Yang et al. | Jul 1999 | A |
5923475 | Kurtz et al. | Jul 1999 | A |
5953152 | Hewlett | Sep 1999 | A |
5966696 | Giraud | Oct 1999 | A |
5969754 | Zeman | Oct 1999 | A |
5978136 | Ogawa et al. | Nov 1999 | A |
5982352 | Pryor | Nov 1999 | A |
6008800 | Pryor | Dec 1999 | A |
6058397 | Barrus et al. | May 2000 | A |
6075895 | Qiao et al. | Jun 2000 | A |
6084979 | Kanada et al. | Jul 2000 | A |
6088612 | Blair | Jul 2000 | A |
6097369 | Wambach | Aug 2000 | A |
6106119 | Edwards | Aug 2000 | A |
6118888 | Chino et al. | Sep 2000 | A |
6125198 | Onda | Sep 2000 | A |
6166744 | Jaszlics et al. | Dec 2000 | A |
6176782 | Lyons et al. | Jan 2001 | B1 |
6191773 | Maruno et al. | Feb 2001 | B1 |
6198487 | Fortenbery et al. | Mar 2001 | B1 |
6198844 | Nomura | Mar 2001 | B1 |
6217449 | Kaku | Apr 2001 | B1 |
6254246 | Tiao et al. | Jul 2001 | B1 |
6263339 | Hirsh | Jul 2001 | B1 |
6270403 | Watanabe et al. | Aug 2001 | B1 |
6278418 | Doi | Aug 2001 | B1 |
6292171 | Fu et al. | Sep 2001 | B1 |
6304267 | Sata | Oct 2001 | B1 |
6308565 | French et al. | Oct 2001 | B1 |
6323895 | Sata | Nov 2001 | B1 |
6333735 | Anvekar | Dec 2001 | B1 |
6335977 | Kage | Jan 2002 | B1 |
6339748 | Hiramatsu | Jan 2002 | B1 |
6349301 | Mitchell et al. | Feb 2002 | B1 |
6351222 | Swan et al. | Feb 2002 | B1 |
6353428 | Maggioni et al. | Mar 2002 | B1 |
6359612 | Peter et al. | Mar 2002 | B1 |
6388657 | Natoli | May 2002 | B1 |
6394896 | Sugimoto | May 2002 | B2 |
6400374 | Lanier | Jun 2002 | B2 |
6407870 | Hurevich et al. | Jun 2002 | B1 |
6414672 | Rekimoto et al. | Jul 2002 | B2 |
6445815 | Sato | Sep 2002 | B1 |
6454419 | Kitazawa | Sep 2002 | B2 |
6464375 | Wada et al. | Oct 2002 | B1 |
6480267 | Yanagi et al. | Nov 2002 | B2 |
6491396 | Karasawa et al. | Dec 2002 | B2 |
6501515 | Iwamura | Dec 2002 | B1 |
6513953 | Itoh | Feb 2003 | B1 |
6522312 | Ohshima et al. | Feb 2003 | B2 |
6545706 | Edwards et al. | Apr 2003 | B1 |
6552760 | Gotoh et al. | Apr 2003 | B1 |
6598978 | Hasegawa | Jul 2003 | B2 |
6607275 | Cimini et al. | Aug 2003 | B1 |
6611241 | Firester et al. | Aug 2003 | B1 |
6654734 | Mani et al. | Nov 2003 | B1 |
6658150 | Tsuji et al. | Dec 2003 | B2 |
6661918 | Gordon et al. | Dec 2003 | B1 |
6677969 | Hongo | Jan 2004 | B1 |
6707054 | Ray | Mar 2004 | B2 |
6707444 | Hendriks et al. | Mar 2004 | B1 |
6712476 | Ito et al. | Mar 2004 | B1 |
6720949 | Pryor et al. | Apr 2004 | B1 |
6732929 | Good et al. | May 2004 | B2 |
6747666 | Utterback | Jun 2004 | B2 |
6752720 | Clapper et al. | Jun 2004 | B1 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6791700 | Omura et al. | Sep 2004 | B2 |
6808293 | Watanabe et al. | Oct 2004 | B2 |
6826727 | Mohr et al. | Nov 2004 | B1 |
6831664 | Marmaropoulos et al. | Dec 2004 | B2 |
6871982 | Holman et al. | Mar 2005 | B2 |
6873710 | Cohen-Solal et al. | Mar 2005 | B1 |
6877882 | Haven et al. | Apr 2005 | B1 |
6882480 | Yanagisawa | Apr 2005 | B2 |
6902310 | Im | Jun 2005 | B2 |
6912313 | Li | Jun 2005 | B2 |
6965693 | Kondo et al. | Nov 2005 | B1 |
6975360 | Slatter | Dec 2005 | B2 |
6999600 | Venetianer | Feb 2006 | B2 |
7000200 | Martins | Feb 2006 | B1 |
7015894 | Morohoshi | Mar 2006 | B2 |
7042440 | Pryor et al. | May 2006 | B2 |
7054068 | Yoshida et al. | May 2006 | B2 |
7058204 | Hildreth et al. | Jun 2006 | B2 |
7068274 | Welch et al. | Jun 2006 | B2 |
7069516 | Rekimoto | Jun 2006 | B2 |
7084859 | Pryor et al. | Aug 2006 | B1 |
7088508 | Ebina et al. | Aug 2006 | B2 |
7129927 | Mattsson | Oct 2006 | B2 |
7149262 | Nayar et al. | Dec 2006 | B1 |
7158676 | Rainsford | Jan 2007 | B1 |
7170492 | Bell | Jan 2007 | B2 |
7190832 | Frost et al. | Mar 2007 | B2 |
7193608 | Stuerzlinger | Mar 2007 | B2 |
7227526 | Hildreth et al. | Jun 2007 | B2 |
7259747 | Bell | Aug 2007 | B2 |
7262874 | Suzuki | Aug 2007 | B2 |
7268950 | Poulsen | Sep 2007 | B2 |
7289130 | Satoh et al. | Oct 2007 | B1 |
7330584 | Weiguo et al. | Feb 2008 | B2 |
7331856 | Nakamura et al. | Feb 2008 | B1 |
7339521 | Scheidemann et al. | Mar 2008 | B2 |
7348963 | Bell | Mar 2008 | B2 |
7379563 | Shamaie | May 2008 | B2 |
7382897 | Brown et al. | Jun 2008 | B2 |
7394459 | Bathiche et al. | Jul 2008 | B2 |
7428542 | Fink et al. | Sep 2008 | B1 |
7431253 | Yeh | Oct 2008 | B2 |
7432917 | Wilson et al. | Oct 2008 | B2 |
7468742 | Ahn et al. | Dec 2008 | B2 |
7536032 | Bell | May 2009 | B2 |
7559841 | Hashimoto | Jul 2009 | B2 |
7576727 | Bell | Aug 2009 | B2 |
7598942 | Underkoffler et al. | Oct 2009 | B2 |
7619824 | Poulsen | Nov 2009 | B2 |
7665041 | Wilson et al. | Feb 2010 | B2 |
7671321 | Perlman et al. | Mar 2010 | B2 |
7710391 | Bell et al. | May 2010 | B2 |
7728280 | Feilkas et al. | Jun 2010 | B2 |
7737636 | Li et al. | Jun 2010 | B2 |
7738725 | Raskar et al. | Jun 2010 | B2 |
7745771 | Troxell et al. | Jun 2010 | B2 |
RE41685 | Feldman et al. | Sep 2010 | E |
7809167 | Bell | Oct 2010 | B2 |
7834846 | Bell | Nov 2010 | B1 |
7961906 | Ruedin | Jun 2011 | B2 |
7971156 | Albertson et al. | Jun 2011 | B2 |
8018579 | Krah | Sep 2011 | B1 |
8035612 | Bell et al. | Oct 2011 | B2 |
8035624 | Bell et al. | Oct 2011 | B2 |
8081822 | Bell | Dec 2011 | B1 |
8085293 | Brodsky et al. | Dec 2011 | B2 |
8085994 | Kim | Dec 2011 | B2 |
8098277 | Bell | Jan 2012 | B1 |
8159682 | Bell | Apr 2012 | B2 |
8199108 | Bell et al. | Jun 2012 | B2 |
8230367 | Bell et al. | Jul 2012 | B2 |
8259163 | Bell et al. | Sep 2012 | B2 |
20010012001 | Rekimoto et al. | Aug 2001 | A1 |
20010033675 | Maurer et al. | Oct 2001 | A1 |
20020006583 | Michiels et al. | Jan 2002 | A1 |
20020032697 | French et al. | Mar 2002 | A1 |
20020032906 | Grossman | Mar 2002 | A1 |
20020041327 | Hildreth et al. | Apr 2002 | A1 |
20020046100 | Kinjo | Apr 2002 | A1 |
20020064382 | Hildreth et al. | May 2002 | A1 |
20020073417 | Kondo et al. | Jun 2002 | A1 |
20020078441 | Drake et al. | Jun 2002 | A1 |
20020081032 | Chen et al. | Jun 2002 | A1 |
20020103617 | Uchiyama et al. | Aug 2002 | A1 |
20020105623 | Pinhanez | Aug 2002 | A1 |
20020130839 | Wallace et al. | Sep 2002 | A1 |
20020140633 | Rafii et al. | Oct 2002 | A1 |
20020140682 | Brown et al. | Oct 2002 | A1 |
20020158984 | Brodsky et al. | Oct 2002 | A1 |
20020178440 | Agnihorti et al. | Nov 2002 | A1 |
20020186221 | Bell | Dec 2002 | A1 |
20030032484 | Ohshima et al. | Feb 2003 | A1 |
20030065563 | Elliott et al. | Apr 2003 | A1 |
20030076293 | Mattsson | Apr 2003 | A1 |
20030078840 | Strunk et al. | Apr 2003 | A1 |
20030091724 | Mizoguchi | May 2003 | A1 |
20030093784 | Dimitrova et al. | May 2003 | A1 |
20030098819 | Sukthankar et al. | May 2003 | A1 |
20030103030 | Wu | Jun 2003 | A1 |
20030113018 | Nefian et al. | Jun 2003 | A1 |
20030122839 | Matraszek et al. | Jul 2003 | A1 |
20030126013 | Shand | Jul 2003 | A1 |
20030137494 | Tulbert | Jul 2003 | A1 |
20030161502 | Morihara et al. | Aug 2003 | A1 |
20030178549 | Ray | Sep 2003 | A1 |
20040005924 | Watabe et al. | Jan 2004 | A1 |
20040015783 | Lennon et al. | Jan 2004 | A1 |
20040046736 | Pryor et al. | Mar 2004 | A1 |
20040046744 | Rafii et al. | Mar 2004 | A1 |
20040073541 | Lindblad et al. | Apr 2004 | A1 |
20040091110 | Barkans | May 2004 | A1 |
20040095768 | Watanabe et al. | May 2004 | A1 |
20040165006 | Kirby et al. | Aug 2004 | A1 |
20040183775 | Bell | Sep 2004 | A1 |
20050028188 | Latona et al. | Feb 2005 | A1 |
20050039206 | Opdycke | Feb 2005 | A1 |
20050086695 | Keele et al. | Apr 2005 | A1 |
20050088407 | Bell | Apr 2005 | A1 |
20050089194 | Bell | Apr 2005 | A1 |
20050104506 | Youh et al. | May 2005 | A1 |
20050110964 | Bell | May 2005 | A1 |
20050122308 | Bell et al. | Jun 2005 | A1 |
20050132266 | Ambrosino et al. | Jun 2005 | A1 |
20050147135 | Kurtz et al. | Jul 2005 | A1 |
20050147282 | Fujii | Jul 2005 | A1 |
20050151850 | Ahn et al. | Jul 2005 | A1 |
20050162381 | Bell et al. | Jul 2005 | A1 |
20050185828 | Semba et al. | Aug 2005 | A1 |
20050195598 | Dancs et al. | Sep 2005 | A1 |
20050265587 | Schneider | Dec 2005 | A1 |
20060001760 | Matsumura et al. | Jan 2006 | A1 |
20060010400 | Dehlin et al. | Jan 2006 | A1 |
20060031786 | Hillis et al. | Feb 2006 | A1 |
20060078015 | Franck | Apr 2006 | A1 |
20060132432 | Bell | Jun 2006 | A1 |
20060132725 | Terada et al. | Jun 2006 | A1 |
20060139314 | Bell | Jun 2006 | A1 |
20060168515 | Dorsett, Jr. et al. | Jul 2006 | A1 |
20060184993 | Goldthwaite et al. | Aug 2006 | A1 |
20060187545 | Doi | Aug 2006 | A1 |
20060227099 | Han et al. | Oct 2006 | A1 |
20060242145 | Krishnamurthy et al. | Oct 2006 | A1 |
20060256382 | Matraszek et al. | Nov 2006 | A1 |
20060258397 | Kaplan et al. | Nov 2006 | A1 |
20060294247 | Hinckley et al. | Dec 2006 | A1 |
20060294258 | Powers-Boyle et al. | Dec 2006 | A1 |
20070002039 | Pendleton et al. | Jan 2007 | A1 |
20070019066 | Cutler | Jan 2007 | A1 |
20070199035 | Schwartz et al. | Aug 2007 | A1 |
20070285419 | Givon | Dec 2007 | A1 |
20080013826 | Hillis et al. | Jan 2008 | A1 |
20080018595 | Hildreth et al. | Jan 2008 | A1 |
20080030460 | Hildreth et al. | Feb 2008 | A1 |
20080036732 | Wilson et al. | Feb 2008 | A1 |
20080040692 | Sunday et al. | Feb 2008 | A1 |
20080062123 | Bell | Mar 2008 | A1 |
20080062257 | Corson | Mar 2008 | A1 |
20080090484 | Lee et al. | Apr 2008 | A1 |
20080135733 | Feilkas et al. | Jun 2008 | A1 |
20080150890 | Bell et al. | Jun 2008 | A1 |
20080150913 | Bell et al. | Jun 2008 | A1 |
20080159591 | Ruedin | Jul 2008 | A1 |
20080170776 | Albertson et al. | Jul 2008 | A1 |
20080245952 | Troxell et al. | Oct 2008 | A1 |
20080252596 | Bell et al. | Oct 2008 | A1 |
20080292144 | Kim | Nov 2008 | A1 |
20090027337 | Hildreth | Jan 2009 | A1 |
20090077504 | Bell et al. | Mar 2009 | A1 |
20090079813 | Hildreth | Mar 2009 | A1 |
20090102788 | Nishida et al. | Apr 2009 | A1 |
20090106785 | Pharn | Apr 2009 | A1 |
20090172606 | Dunn et al. | Jul 2009 | A1 |
20090179733 | Hattori et al. | Jul 2009 | A1 |
20090225196 | Bell et al. | Sep 2009 | A1 |
20090235295 | Bell et al. | Sep 2009 | A1 |
20090251685 | Bell et al. | Oct 2009 | A1 |
20100026624 | Bell et al. | Feb 2010 | A1 |
20100039500 | Bell et al. | Feb 2010 | A1 |
20100060722 | Bell et al. | Mar 2010 | A1 |
20110157316 | Okamoto et al. | Jun 2011 | A1 |
20120080411 | Mizuyama et al. | Apr 2012 | A1 |
20120200843 | Bell et al. | Aug 2012 | A1 |
20120287044 | Bell et al. | Nov 2012 | A1 |
20120293625 | Schneider et al. | Nov 2012 | A1 |
20120317511 | Bell et al. | Dec 2012 | A1 |
Number | Date | Country |
---|---|---|
0 055 366 | Jul 1982 | EP |
0 626 636 | Nov 1994 | EP |
0 913 790 | May 1999 | EP |
1 689 172 | Jun 2002 | EP |
57-094672 | Jun 1982 | JP |
10-207619 | Aug 1998 | JP |
11-057216 | Mar 1999 | JP |
2000-105583 | Apr 2000 | JP |
2002-014997 | Jan 2002 | JP |
2002-092023 | Mar 2002 | JP |
2002-171507 | Jun 2002 | JP |
2003-517642 | May 2003 | JP |
2003-271084 | Sep 2003 | JP |
2004-246578 | Sep 2004 | JP |
2007-514242 | May 2007 | JP |
2003-0058894 | Jul 2003 | KR |
WO 9838533 | Sep 1998 | WO |
WO 0016562 | Mar 2000 | WO |
WO 0163916 | Aug 2001 | WO |
WO 0201537 | Jan 2002 | WO |
WO 02100094 | Dec 2002 | WO |
WO 2004055776 | Jul 2004 | WO |
WO 2004097741 | Nov 2004 | WO |
WO 2005003948 | Jan 2005 | WO |
WO 2005041578 | May 2005 | WO |
WO 2005041579 | May 2005 | WO |
WO 2005057398 | Jun 2005 | WO |
WO 2005057399 | Jun 2005 | WO |
WO 2005057921 | Jun 2005 | WO |
WO 2005091651 | Sep 2005 | WO |
WO 2007019443 | Feb 2007 | WO |
WO 2008124820 | Oct 2008 | WO |
WO 2009035705 | Mar 2009 | WO |
Entry |
---|
Notice of Opposition in European Application No. 02739710.8 dated Aug. 23, 2010. |
Official Report in Australian Application No. 2008299883, dated Dec. 8, 2010. |
DePiero et al; “3-D Computer Vision Using Structured Light: Design, Calibrations and Implementation Issues”; Advances in Computers, vol. 43, pp. 243-278, 1996. |
Huang, Mark et al. “Shadow Vision,” Introduction to Computer Graphics, Fall 1999, Dec. 6, 1999; pp. 1-10, XP55013291 <http:groups.csail.mit.edu/graphics/classes/6.837/F99/projects/reports/team16.pdf>. |
Leibe, Bastian, et al., “The Perspective Workbench; Toward Spontaneous and Natural Interaction in Semi-Immersive Virtual Environments,” Mar. 18-22, 2000, IEEE Computer Society, Los Alamitos, CA; pp. 13-20. |
Paradiso, Joseph et al., “Optical Tracking for Music and Dance Performance,” Conference on Optical 3-D Measurement Techniques, XX, XX, No. 4th, Sep. 30, 1997, pp. 1-8, XP002548974. <http://www.media.mit.edu/resenv/pubs/papers/97—09—Zurich—3D4.pdf>. |
Quinz, Emanuele; “Conference Papers”, Apr. 10, 2001, XP55013293, Retrieved from the internet <http://www.isea2000.com/pop—actes.htm>. |
Quinz, Emanuele; “Digital Performance”, pp. 1-3, Retrieved from the internet on Nov. 28, 2011 <http://www.noemalab.org/sections/ideas/ideas—articles/pdf/. |
Sparacino, Flavia, et al., “Dance Space: An Interactive Video Interface”, Actes/Proceeding, ISEA2000—Oct. 12, 2000—Auditorium 1, Dec. 10, 2000. |
Maria Langer, “Mac OS X 10.2: Visual QuickStart Guide,” Sep. 17, 2002, Peachpit Press, p. 111. |
Rekimoto, Jun, “SmartSkin: Am Infrastructure for Freehand Manipulation on Interactive Surfaces.” Vol. No. 4, Issue No. 1, pp. 113-120, Apr. 2002. |
Xiao, Yang; “Throughput and Delay Limits of IEEE 802.11,” IEEE Communications Letters, vol. 6, No. 8, pp. 355-357, Aug. 2002. |
International Preliminary Report on Patentability for PCT/US2008/10750, filed Sep. 15, 2008. |
Letter of the opponent O2 dated May 28, 2010 in European Application No. 02739710.8, filed Jun. 4, 2002. |
EffecTV Version 0.2.0 released Mar. 27, 2001, available online at <http://web.archive.org/web/20010101-20010625re—http://effectv.sourceforge.net>. |
Index of EffecTV, as downloaded on Apr. 30, 2007 at <http://effectv.cvs.sourceforge.net/effectv/EffecTV/?pathrev=rel—0—2—0>. |
R111, The Transformation From Digital Information to Analog Matter, available online at <http://www.particles.de/paradocs/r111/10mkp2004/hmtl/r111—text111hock04.html>. |
2001 Symposium on Interactive 3D Graphics program description, ACM SIGGRAPH, held Mar. 19-21, 2001, Research Triangle Park, NC, downloaded from <http://www.allconferences.com/conferences/2000830092631/>; cited during opposition of European Application No. |
Affidavit of Daniel Barthels regarding EffecTV, dated May 15, 2007 (partial machine translation), cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002. |
Announcement: Workshop on Perceptual User Interfaces, The Banff Rocky Mountain Resort, Banff, Alberta, Canada, Oct. 20-21, 1997, can be found at <http://www.research.microsoft.com/PUIWorkshop/>, cited during opposition of European Application No. 02739. |
Bodymover Body Movement as a Means to Obtain an Audiovisual Spatial Experience, 2000 ART+COM AG Berlin; <http://www.artcom.de/index.php?option=com—acprojects&page=6&id=28&Itemid=144&details=0&lang=en>. |
ART+COM Bodymover 2000, as downloaded on Aug. 21, 2009 from <http://www.artcom.de/index.php?option=com—acprojects&page=6&id=28&Itemid=144&details=0&lang=en>, cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002. |
Article 96(2) Communication dated Feb. 25, 2005 in European Application No. 02739710.8. |
Article 96(2) Communication dated Mar. 31, 2004 in European Application No. 02739710.8. |
Brown, Matthew, et al. “Multi-Image Matching using Multi-Scale Oriented Patches,” Technical Report, Dec. 2004, pp. 1-48, available online at <ftp://ftp.research.microsoft.com/pub/tr/TR-2004-133.pdf>. |
Brown, Matthew, et al., “Multi-Image Matching using Multi-Scale Oriented Patches,” Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Conference Publication Date: Jun. 20-25, 2005, 8 pgs. |
Buxton, Bill, “Multi-Touch Systems That I Have Known and Loved,” accessed Mar. 21, 2007, http://billbuxton.com/multitouchOverview.html. |
Communication dated Dec. 10, 2008 from Patentanwalt attaching article by Katy Bachman, entitled “Reactrix Up for Sale,” cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002. |
Crouser, P.D., et al., “Unattenuated tracer particle extraction through time-averaged, background image subtraction with outlier rejection,” Experiments in Fluids, 22, 1997, 220-228, Springer-Verlag. |
Davis, J.W., et al., “SIDEshow: A Silhouette-based Interactive Dual-screen Environment,” Aug. 1998, MIT Media Lab Tech Report No. 457. |
Demarest, Ken, “Sand,” 2000, Mine Control, art installation, available online at <http://www.mine-control.com>. |
EffecTV Software Source: effect module, dated May 20, 2001 (German); cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002. |
Eigammai, Ahmed, et al., “Non-parametric Model for Background Subtraction,” Jun. 2000, European Conference on Computer Vision, Lecture Notes on Computer Science, vol. 1843, pp. 751-767. |
Extended Search Report for European Application No. 06010825.5, filed Jun. 4, 2002, dated Jul. 10, 2006. |
Dachselt, Raimund, et al., “CONTIGRA: An XML-Based Architecture for Component-Oriented 3D Applications, 3D Technologies for the World Wide Web, Proceedings of the Seventh International Conference on 3D Technology,” ACM, Feb. 24-28, 2002, pp. 155-163. |
Foerterer, Holger, “Fluidum,” 1999, art installation, description available online at <http://www.foerterer.com/fluidum>.
Foerterer, Holger, “Helikopter,” 2001, art installation, description available online at <http://www.foerterer.com/helikopter>.
Freeman, William, et al., “Computer vision for interactive computer graphics,” May-Jun. 1998, IEEE Computer Graphics and Applications, vol. 18, No. 3, pp. 42-53.
Frisken, Sarah F., et al., “Adaptively Sampled Distance Fields: A General Representation of Shape for Computer Graphics,” Jul. 23-28, 2000, Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, pp. 249-254.
Fujihata, Masaki, “Beyond Pages,” 1995, art installation, description available online at <http://on1.zkm.de/zkm/werke/BeyondPages>.
Goetz, Frank, et al., “An XML-based Visual Shading Language for Vertex and Fragment Shaders,” 3D Technologies for the World Wide Web, Proceedings of the Ninth International Conference on 3D Technology, ACM, Apr. 5-8, 2004, pp. 87-97.
GroundFX Document, GestureTek (Very Vivid, Inc.), description available online at <http://www.gesturetek.com/groundfx>, downloaded on Aug. 11, 2006.
Haller, Michael, et al., “Coeno-Storyboard: An Augmented Surface for Storyboard Presentations,” Mensch & Computer 2005, Sep. 4-7, 2005, Linz, Austria.
Han, Jefferson Y., “Low-Cost Multi-Touch Sensing Through Frustrated Total Internal Reflection,” Oct. 23-26, 2005, ACM Symposium on User Interface Software and Technology (UIST).
Harville, Michael, et al., “Foreground Segmentation Using Adaptive Mixture Models in Color and Depth,” Jul. 8, 2001, Proceedings of IEEE Workshop on Detection and Recognition of Events in Video, pp. 3-11.
Lozano-Hemmer, Rafael, “Body Movies,” 2002, art project/installation, description available online at <http://www.lozano-hemmer.com/eproyecto.html>.
Hoff, Kenneth E. III, et al., “Fast and Simple 2D Geometric Proximity Queries Using Graphics Hardware,” Mar. 19-21, 2001, Proc. of the 2001 Symposium on Interactive 3D Graphics, pp. 145-148.
International Preliminary Examination Report for PCT/US2002/017843, filed Jun. 4, 2002.
International Preliminary Report on Patentability for PCT/US2004/035477, filed Oct. 25, 2004.
International Preliminary Report on Patentability for PCT/US2004/035478, filed Oct. 25, 2004.
International Preliminary Report on Patentability for PCT/US2004/041318, filed Dec. 9, 2004.
International Preliminary Report on Patentability for PCT/US2004/041319, filed Dec. 9, 2004.
International Preliminary Report on Patentability for PCT/US2004/041320, filed Dec. 9, 2004.
International Preliminary Report on Patentability for PCT/US2005/008984, filed Mar. 18, 2005.
International Preliminary Report on Patentability for PCT/US2006/030720, filed Aug. 4, 2006.
International Preliminary Report on Patentability for PCT/US2008/059900, filed Apr. 10, 2008.
International Search Report for PCT/US03/40321, filed Dec. 15, 2003.
International Search Report for PCT/US2002/017843, filed Jun. 4, 2002, dated Feb. 5, 2003.
International Search Report for PCT/US2004/035477, filed Oct. 25, 2004.
Invitation to Pay Additional Fees and Partial International Search Report for PCT/US2004/035478, filed Oct. 25, 2004.
International Search Report for PCT/US2004/035478, filed Oct. 25, 2004.
International Search Report for PCT/US2004/041318, filed Dec. 9, 2004.
International Search Report for PCT/US2004/041319, filed Dec. 9, 2004.
International Search Report for PCT/US2004/041320, filed Dec. 9, 2004.
International Search Report for PCT/US2005/008984, filed Mar. 18, 2005.
International Search Report for PCT/US2006/030720, filed Aug. 4, 2006.
International Search Report for PCT/US2008/059900, filed Apr. 10, 2008.
International Search Report for PCT/US2008/10750, filed Sep. 15, 2008.
Peterson, Ivars, “Artificial reality: combining a person's live video image with computer graphics suggests novel ways of working and playing with computers,” Science News, Jun. 22, 1985.
Jabri, Sumer, et al., “Detection and Location of People in Video Images Using Adaptive Fusion of Color and Edge Information,” presented at the Int. Conf. on Pattern Recognition, Barcelona, Spain, 2000.
Joyce, Arthur W. III, et al., “Implementation and capabilities of a virtual interaction system,” Sep. 10-11, 1998, Proceedings 2nd European Conference on Disability, Virtual Reality and Associated Technologies, Skovde, Sweden, pp. 237-245.
Katz, Itai, et al., “A Multi-Touch Surface Using Multiple Cameras,” Oct. 3, 2007, Advanced Concepts for Intelligent Vision Systems, vol. 4678/2007.
Keays, Bill, “metaField Maze,” 1998, exhibited at SIGGRAPH '99 Emerging Technologies and Ars Electronica, Aug. 8-13, 1999, description available online at <http://www.billkeays.com/metaFieldInfosheet1A.pdf>.
Keays, Bill, “Using High-Bandwidth Input/Output in Interactive Art,” Jun. 1999, Master's Thesis, Massachusetts Institute of Technology, School of Architecture and Planning.
Khan, Jeff, “Intelligent Room with a View,” Apr.-May 2004, RealTime Arts Magazine, Issue 60, available online at <www.realtimearts.net/article/60/7432>.
Kjeldsen, Rick, et al., “Dynamically Reconfigurable Vision-Based User Interfaces,” Apr. 2003, 3rd International Conference on Vision Systems (ICVS '03), Graz, Austria, pp. 6-12.
Kjeldsen, R., et al., “Interacting with Steerable Projected Displays,” May 20-21, 2002, Proceedings of the 5th International Conference on Automatic Face and Gesture Recognition, Washington, D.C.
Krueger, Myron, “Videoplace—An Artificial Reality,” Apr. 1985, Conference on Human Factors in Computing Systems, San Francisco, California, pp. 35-40.
Krueger, Myron, “Videoplace,” 1969 and subsequent, summary available online at <http://www.jtnimoy.com/itp/newmediahistory/videoplace>.
Kurapati, Kaushal, et al., “A Multi-Agent TV Recommender,” Jul. 13-14, 2001, Workshop on Personalization in Future TV, pp. 1-8, XP02228335.
Lamarre, Mathieu, et al., “Background subtraction using competing models in the block-DCT domain,” Pattern Recognition, 2002 Proceedings, 16th International Conference, Quebec City, Que., Canada, Aug. 11-15, 2002, Los Alamitos, CA, USA, IEEE Computer Society.
Lantagne, Michel, et al., “VIP: Vision tool for comparing Images of People,” Vision Interface, Jun. 11-13, 2003, pp. 1-8.
Leibe, Bastian, et al., “Towards Spontaneous Interaction with the Perceptive Workbench, a Semi-Immersive Virtual Environment,” Nov./Dec. 2000, IEEE Computer Graphics and Applications, vol. 20, No. 6, pp. 54-65.
Lengyel, Jed, et al., “Real-Time Robot Motion Planning Using Rasterizing Computer Graphics Hardware,” Aug. 1990, ACM SIGGRAPH Computer Graphics, vol. 24, Issue 4, pp. 327-335.
Levin, Golan, “Computer Vision for Artists and Designers: Pedagogic Tools and Techniques for Novice Programmers,” Aug. 2006, AI & Society, vol. 20, Issue 4, pp. 462-482.
Letter dated May 16, 2007 from Christian Zuckschwerdt regarding EffecTV (partial machine translation), cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002.
Lin, Mingxiu, et al., “A New Approach for Vision-based Rear Vehicle Tracking,” Key Laboratory of Integrated Automation of Process Industry, Ministry of Education, Northeastern University, Shenyang, Liaoning Province, China, held May 23-25, 2007, pp. 107-111.
Livingston, Mark Alan, “Vision-based Tracking with Dynamic Structured Light for Video See-through Augmented Reality,” 1998, Ph.D. Dissertation, University of North Carolina at Chapel Hill.
Malik, Shahzad, et al., “Visual Touchpad: A Two-Handed Gestural Input Device,” Oct. 13-15, 2004, International Conference on Multimodal Interfaces (ICMI '04).
MacIver, Malcolm, et al., “Body Electric,” Apr. 2003, art installation, description available online at <http://www.neuromech.northwestern.edu/uropatagium/#ArtSci>.
Mandala Systems, “Video Gesture Control System Concept,” 1986, description available online at <http://www.vividgroup.com/tech.html>.
Microsoft Surface multi-touch interface table unveiled, May 30, 2007, downloaded from <http://www.dancewithshadows.com/tech/microsoft-surface.asp>.
Microsoft Surface Web Page, downloaded from <http://www.microsoft.com/surface/Pages/Product/WhatIs.aspx> on Sep. 24, 2009.
Experience Microsoft Surface, downloaded from <http://www.microsoft.com/surface/Pages/Product/Specifications.aspx> on Sep. 24, 2009.
Microsoft Surface, downloaded from <http://en.wikipedia.org/wiki/Microsoft_surface> on Sep. 24, 2009.
Mitsubishi DiamondTouch, <http://www.merl.com/projects/DiamondTouch/>, visited Mar. 21, 2007.
Mo, Zhenyao, “SmartCanvas: A Gesture-Driven Intelligent Drawing Desk System,” Jan. 9-12, 2005, Proceedings of Intelligent User Interfaces (IUI '05).
Morano, Raymond A., et al., “Structured Light Using Pseudorandom Codes,” Mar. 1998, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, No. 3.
Morris, T., et al., “Real-Time Fingertip Detection for Hand Gesture Recognition,” Sep. 9-11, 2002, Advanced Concepts for Intelligent Vision Systems (ACIVS '02), Ghent University, Belgium.
Muench, Wolfgang, “Bubbles,” Prix Ars Electronica Catalog 1999, Springer-Verlag, Berlin, Germany; available online at <http://hosting.zkm.de/wmuench/bub/text>.
Notice of Opposition in European Application No. 02739710.8 dated May 14, 2007.
Provision of the minutes in European Application No. 02739710.8 dated Dec. 28, 2009.
Decision revoking the European Patent in European Application No. 02739710.8 dated Dec. 28, 2009.
Observation by third party Michael Saup dated Jan. 17, 2005, cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002.
Observation by third party Petra Trefzger dated Jan. 17, 2005, cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002.
Observation by third party Simon Penny dated Jan. 17, 2005, cited during opposition of European Application No. 02739710.8, filed Jun. 4, 2002.
Paradiso, Joseph, et al., “New Sensor and Music Systems for Large Interactive Surfaces,” Aug. 2000, Proceedings of the International Computer Music Conference, Berlin, Germany, pp. 277-280.
Penny, Simon, “Fugitive,” Oct. 1997, available online at <http://www.ace.uci.edu/penny/works/fugitive/fugitive.html>.
Penny, Simon, et al., “Fugitive II,” Jan. 8-Mar. 14, 2004, Australian Center for the Moving Image, art installation, description available online at <http://www.acmi.net.au/fugitive.jsp?>.
Penny, Simon, et al., “Traces: Wireless Full Body Tracking in the CAVE,” Dec. 16-18, 1999, ICAT Virtual Reality Conference, Japan, available online at <http://turing.ace.uci.edu/pennytexts/traces/>.
Pinhanez, C., et al., “Ubiquitous Interactive Graphics,” Jul. 29-31, 2003, IBM Research Report RC22495, available online at <http://www.research.ibm.com/ed/publications/rc22495.pdf>.
Pinhanez, C., “The Everywhere Displays Projector: A Device to Create Ubiquitous Graphical Interfaces,” Sep. 29-Oct. 2, 2001, Proceedings of the UbiComp 2001 Conference, Ubiquitous Computing, Lecture Notes in Computer Science, Springer-Verlag, Berlin.
Plasma, 3 pages, <http://www.particles.de/paradocs/plasma/index.html>, cited in U.S. Appl. No. 10/160,217, filed Aug. 8, 2005.
Reactrix, Inc. website, Mar. 28, 2003, <http://web.archive.org/web/20030328234205/http://www.reactrix.com> and <http://web.archive.org/web/20030328234205/http://www.reactrix.com/webdemo.php>.
Rekimoto, J., et al., “Perceptual Surfaces: Towards a Human and Object Sensitive Interactive Display,” Oct. 19-21, 1997, Proceedings of the Workshop on Perceptual User Interfaces, Banff, Canada, pp. 30-32.
Ringel, M., et al., “Barehands: Implement-Free Interaction with a Wall-Mounted Display,” Mar. 31-Apr. 5, 2001, Proceedings of the 2001 ACM CHI Conference on Human Factors in Computing Systems (Extended Abstracts), pp. 367-368.
Rogala, Miroslaw, “Lovers Leap,” Nov. 21-26, 1995, art installation, Dutch Electronic Arts Festival, description available online at <http://wayback.v2.nl/DEAF/persona/rogala.html>.
Rokeby, David, “Very Nervous System (VNS),” Mar. 1995, Wired Magazine, available online at <http://www.wired.com/wired/archive/3.03/rokeby.html>; sold as software at <http://homepage.mac.com/davidrokeby/softVNS.html>.
Rokeby, David, “softVNS 2 real time video processing and tracking software for Max,” SoftVNS 2 downloads, as downloaded from <http://homepage.mac.com/davidrokeby/softVNS.html> on Mar. 16, 2007.
Sato, Yoichi, et al., “Fast Tracking of Hands and Fingertips in Infrared Images for Augmented Desk Interface,” Mar. 2000, 4th International Conference on Automatic Face- and Gesture-Recognition, Grenoble, France.
Schneider, John K., “Improved Fingerprint System Using Rolled and Multi-segmented Techniques,” U.S. Appl. No. 60/575,952, filed Jun. 1, 2004, pp. 1-6.
Screenshots of Reactrix Product Demo Video, Mar. 28, 2003, <http://web.archive.org/web/20030407174258/http://www.reactrix.com/demo/reactrix_demo.wmv>.
Sester, Marie, “Access,” Dec. 2001, Interaction 99 Biennial Catalog, Gifu, Japan, available online at <http://www.accessproject.net/concept.html>.
Snibbe, Scott, “Boundary Functions,” Sep. 7-12, 1998, art installation, description available online at <http://snibbe.com/scott/bf/index.html>.
Snibbe, Scott, “Screen Series,” 2002-2003, art installation, description available online at <http://snibbe.com/scott/screen/index.html>.
Sonneck, Georg, et al., “Optimized One-to-One Personalization of Web Applications using a Graph Based Model,” IEEE-22, Apr. 26, 2003, 9 pgs.
Sparacino, Flavia, et al., “Media in performance: interactive spaces for dance, theater, circus and museum exhibits,” Nov. 2000, IBM Systems Journal, vol. 39, No. 3-4, pp. 479-510.
Sparacino, Flavia, “(Some) computer vision based interfaces for interactive art and entertainment installations,” 2001, INTER_FACE Body Boundaries, Anomalie digital_arts, No. 2, Paris, France.
Stauffer, Chris, et al., “Learning Patterns of Activity Using Real-Time Tracking,” Aug. 2000, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, No. 8, pp. 747-757.
Summons to Attend Oral Proceedings in European Application No. 02739710.8, dated Aug. 12, 2005.
Summons to Attend Oral Proceedings in European Application No. 02739710.8, dated Jun. 12, 2009.
Supreme Particles, “PLASMA/Architexture,” 1994, available online at <http://www.particles.de/paradocs/plasma/index.html>.
Supreme Particles, “R111,” 1999, available online at <http://www.r111.org>, XP-002989704.
Tan, P., et al., “Highlight Removal by Illumination-Constrained Inpainting,” Ninth IEEE International Conference on Computer Vision, Oct. 13-16, 2003.
The History of Microsoft Surface, downloaded from <http://www.microsoft.com/presspass/presskits/surfacecomputing/docs/SurfaceHistoryBG.doc> on Sep. 24, 2009.
Torr, P.H.S., et al., “The Development and Comparison of Robust Methods for Estimating the Fundamental Matrix,” Sep./Oct. 1997, International Journal of Computer Vision, vol. 24, No. 3, pp. 271-300.
Toth, Daniel, et al., “Illumination-Invariant Change Detection,” Apr. 2-4, 2000, 4th IEEE Southwest Symposium on Image Analysis and Interpretation, p. 3.
Trefzger, Petra, “Vorwerk,” 2000, art installation, description available online at <http://www.petracolor.de>.
Utterback, Camille, et al., “Text Rain,” 1999, art installation, available online at <www.camilleutterback.com/textrain.html>.
Vogt, Florian, et al., “Highlight Substitution in Light Fields,” IEEE International Conference on Image Processing, Sep. 22-25, 2002.
Wang, Junxian, et al., “Specular reflection removal for human detection under aquatic environment,” Jun. 27-Jul. 2, 2004, IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW'04), vol. 8, p. 130.
Wellner, Pierre, “Interacting with paper on the DigitalDesk,” Jul. 1993, Communications of the ACM, Special issue on computer augmented environments: back to the real world, vol. 36, Issue 7, pp. 87-96.
Wellner, Pierre, “Digital Desk Calculator: Tangible Manipulation on a Desktop Display,” Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), Hilton Head, South Carolina, Nov. 11-13, 1991.
Wilson, Andrew, “PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System,” ACM Symposium on User Interface Software and Technology (UIST), Oct. 23-27, 2005, Seattle, Washington, U.S.A.
Written Opinion for PCT/US2002/017843, filed Jun. 4, 2002, dated Feb. 5, 2003.
Written Opinion of the International Searching Authority for PCT/US2004/035477, filed Oct. 25, 2004.
Written Opinion of the International Searching Authority for PCT/US2004/035478, filed Oct. 25, 2004.
Written Opinion of the International Searching Authority for PCT/US2004/041318, filed Dec. 9, 2004.
Written Opinion of the International Searching Authority for PCT/US2004/041319, filed Dec. 9, 2004.
Written Opinion of the International Searching Authority for PCT/US2004/041320, filed Dec. 9, 2004.
Written Opinion of the International Searching Authority for PCT/US2005/008984, filed Mar. 18, 2005.
Written Opinion of the International Searching Authority for PCT/US2006/030720, filed Aug. 4, 2006.
Written Opinion of the International Searching Authority for PCT/US2008/059900, filed Apr. 10, 2008.
Prior Publication Data: US 2010/0121866 A1, May 2010, United States.
Provisional Application Data: No. 61/061,105, Jun. 2008, United States.