When accessing files on a computing machine, a user can utilize one or more input devices coupled to the computing machine. The user can configure or manipulate one or more of the input devices when selecting at least one file on the computing machine to access. Once at least one of the files has been selected by the user, the user can proceed to access and/or manipulate at least one of the files utilizing the input device and the computing machine.
Various features and advantages of the disclosed embodiments will be apparent from the detailed description which follows, taken in conjunction with the accompanying drawings, which together illustrate, by way of example, features of the embodiments.
Further, as shown in
As noted above, the computing machine 100 includes a processor 120. The processor 120 sends data and/or instructions to one or more components of the computing machine 100, such as the sensor 130 and/or the file application 110. Additionally, the processor 120 receives data and/or instructions from one or more components of the computing machine 100, such as the sensor 130 and/or the file application 110.
The file application 110 is an application which can be utilized in conjunction with the processor 120 and at least one sensor 130 to detect an object and a user interacting with the object. When detecting the user interacting with the object, a sensor 130 can additionally be configured to capture the user interacting with the object. For the purposes of this application, an object can be any physical object, media, and/or document which a sensor 130 can detect and a user can physically interact with.
In one embodiment, the object includes a document, a folder of documents, a book, a notepad, and/or a newspaper. In other embodiments, the object can include additional objects or additional forms of physical media or documents in addition to and/or in lieu of those noted above.
A user can be any person who can physically interact with an object. If the object is detected by the sensor 130, the file application 110 can proceed to associate the object with at least one file 160 on the computing machine 100. A file 160 on the computing machine can be a digital file. The digital file can be a digital document, a system or application file, a digital media file, and/or any additional digital file type. The file 160 can include numbers, characters, and/or a combination of the above. Additionally, the file 160 can include audio, video, images, and/or a combination of the above.
Once the file application 110 has associated the object with at least one file 160, the file application 110 can configure a display device 170 to render a user accessing an associated file 160. Further, the file application 110 can edit one or more associated files 160 on the computing machine 100 in response to the user interacting with the object.
The file application 110 can be firmware which is embedded onto the computing machine 100. In other embodiments, the file application 110 is a software application stored on the computing machine 100 within ROM or on the storage device 140 accessible by the computing machine 100, or the file application 110 is stored on a computer readable medium that is readable and accessible by the computing machine 100 from a different location.
Additionally, in one embodiment, the storage device 140 is included in the computing machine 100. In other embodiments, the storage device 140 is not included in the computing machine 100, but is accessible to the computing machine 100 utilizing a network interface of the computing machine 100. The network interface can be a wired or wireless network interface card.
In a further embodiment, the file application 110 is stored and/or accessed through a server coupled through a local area network or a wide area network. The file application 110 communicates with devices and/or components coupled to the computing machine 100 physically or wirelessly through a communication bus 150 included in or attached to the computing machine 100. In one embodiment, the communication bus 150 is a memory bus. In other embodiments, the communication bus 150 is a data bus.
As noted above, the file application 110 can be utilized in conjunction with the processor 120 and at least one sensor 130 to detect an object and/or capture a user interacting with the object. When detecting the object and the user interacting with the object, the file application 110 and/or the processor 120 can configure the sensor 130 to scan an environment around the computing machine 100 for an object and/or the user. For the purposes of this application, the environment includes a space or volume around the computing machine 100 and/or around the sensor 130.
A sensor 130 is a detection device configured to scan for or to receive information from the environment around the sensor 130 or the computing machine 100. In one embodiment, a sensor 130 is a 3D depth image capturing device configured to scan a volume in front of the sensor 130. In another embodiment, the sensor 130 can include at least one from the group consisting of a motion sensor, a proximity sensor, an infrared sensor, a stereo device, and/or any additional image capturing device. In other embodiments, a sensor 130 can include additional devices and/or components configured to receive and/or to scan for information from an environment around the sensor 130 or the computing machine 100.
A sensor 130 can be configured by the processor 120 and/or the file application 110 to actively, periodically, or upon request scan the environment for the object and/or the user. When configuring the sensor 130, the processor 120 and/or the file application 110 can send one or more instructions for the sensor 130 to scan the environment. Further, at least one sensor 130 can be coupled to one or more locations on or around the computing machine 100. In another embodiment, at least one sensor 130 can be integrated as part of the computing machine 100. In other embodiments, at least one of the sensors 130 can be coupled to or integrated as part of one or more components of the computing machine 100, such as the display device 170.
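The scanning behavior described above lends itself to a short illustrative sketch. The following Python fragment is only a minimal stand-in, assuming a hypothetical DepthSensor class and scan_environment helper that are not part of the disclosure; it shows how a sensor might be polled periodically (rather than actively or upon request) for an object or user.

```python
import time
from typing import Optional

class DepthSensor:
    """Hypothetical stand-in for sensor 130: a 3D depth image capturing
    device that scans the volume in front of it."""

    def capture_frame(self) -> Optional[dict]:
        # A real device driver would return a depth frame describing any
        # object and user in view; this stub reports nothing found.
        return None

def scan_environment(sensor: DepthSensor, period_s: float = 1.0,
                     attempts: int = 5) -> Optional[dict]:
    """Scan periodically until an object appears in the viewing area."""
    for _ in range(attempts):
        frame = sensor.capture_frame()
        if frame is not None:
            return frame          # an object and/or user was detected
        time.sleep(period_s)      # wait before the next periodic scan
    return None

print(scan_environment(DepthSensor(), period_s=0.01, attempts=2))  # None
```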
In one embodiment, if an object and a user are found by the sensor 130 within the volume or viewing area of the sensor 130, the sensor 130 can proceed to capture and identify a distance of the object. The file application 110 will then compare the identified distance of the object to a predefined distance.
The predefined distance can be defined by the file application 110, the computing machine 100, and/or by a user.
In one embodiment, if the object is within a predefined distance of the computing machine 100 and/or the sensor 130, the file application 110 will configure the sensor 130 to detect and/or capture information of the object and proceed to identify the object. In another embodiment, if the identified distance is greater than the predefined distance, the file application 110 can configure the display device 170 to prompt the user to bring the object closer to the computing machine 100 and/or the sensor 130.
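The distance comparison can be pictured with a small sketch, assuming a hypothetical threshold value and function name; the 1.5 meter figure is illustrative only and is not defined in the disclosure.

```python
PREDEFINED_DISTANCE_M = 1.5   # hypothetical threshold; could be set by the file
                              # application, the computing machine, or the user

def check_proximity(identified_distance_m: float) -> str:
    """Compare the distance reported by the sensor to the predefined distance
    and return the action to take (names here are illustrative)."""
    if identified_distance_m <= PREDEFINED_DISTANCE_M:
        return "identify_object"            # within proximity: capture information
    return "prompt_user_to_bring_closer"    # render a prompt on the display device

print(check_proximity(2.3))   # prompt_user_to_bring_closer
print(check_proximity(0.8))   # identify_object
```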
When detecting and/or capturing information of the object, the file application 110 configures the sensor 130 to scan the object for information which can identify the object. In one embodiment, the file application 110 configures the sensor 130 to capture and identify at least one dimension of the object. A dimension of the object can include a length of the object, a width of the object, a shape of the object, and/or a color of the object. In another embodiment, the object can include a bar code or a visual signature and the file application 110 can configure the sensor 130 to scan the bar code or the visual signature. In other embodiments, the sensor 130 can capture an image of the object.
Utilizing the detected and/or captured information of the object, the file application 110 can attempt to identify the object by comparing the information to predefined information corresponding to one or more recognized objects of the computing machine 100. A recognized object is an object which is recognized by the computing machine 100 and is associated with one or more files 160 on the computing machine 100.
The predefined information of the recognized object can list dimensions of the corresponding recognized object. In another embodiment, the predefined information lists a bar code or visual signature of an object which is recognized by the computing machine 100 and is associated with one or more corresponding files 160. In other embodiments, the predefined information includes an image of a recognized object of the computing machine 100.
The predefined information can be included as part of one or more files 160 on the computing machine 100. In another embodiment, the predefined information can be stored in a list of objects. In other embodiments, the predefined information can be included in a database of objects. The list of objects and/or the database of objects can be stored in the storage device 140 or on another device accessible to the file application 110.
Further, a recognized object can list one or more files 160 on the computing machine 100 associated with the corresponding recognized object. If a match is found between the captured information and the predefined information, the file application 110 will proceed to identify the object as the recognized object. Additionally, the file application 110 will associate the object with one or more of the files 160 listed to be associated with the recognized object.
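The matching step described above can be sketched as a simple lookup. The RecognizedObject class, identify_object function, tolerance value, and sample data below are hypothetical; they only illustrate comparing captured dimensions or a bar code against predefined information and returning the listed files on a match.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class RecognizedObject:
    """Predefined information for a recognized object and its listed files."""
    name: str
    dimensions: Tuple[float, float]          # (length_cm, width_cm) in this sketch
    barcode: Optional[str] = None
    associated_files: List[str] = field(default_factory=list)

def identify_object(captured: dict, recognized: List[RecognizedObject],
                    tolerance_cm: float = 1.0) -> Optional[RecognizedObject]:
    """Compare captured information against each recognized object; a bar code
    match or a dimension match within tolerance identifies the object."""
    for obj in recognized:
        if captured.get("barcode") and captured["barcode"] == obj.barcode:
            return obj
        dims = captured.get("dimensions")
        if dims and all(abs(c - p) <= tolerance_cm
                        for c, p in zip(dims, obj.dimensions)):
            return obj
    return None   # no match: fall back to association through user gestures

notepad = RecognizedObject("notepad", (21.0, 14.8), None, ["notes_2010.txt"])
match = identify_object({"dimensions": (20.5, 14.9)}, [notepad])
print(match.associated_files if match else "not recognized")   # ['notes_2010.txt']
```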
As a result, the file application 110 associates at least one of the files 160 with the object. In another embodiment, if no match is found, the file application 110 can associate the object with one or more files 160 on the computing machine 100 in response to the user interacting with the object. When associating one or more of the files in response to the user interacting with the object, the file application can configure the display device 170 to render one or more of the files 160 for display.
A display device 170 is a device that can create and/or project one or more images and/or videos for display. In one embodiment, the display device 170 can be a monitor and/or a television. In another embodiment, the display device 170 is a projector that can project one or more images and/or videos. The display device 170 can include analog and/or digital technology. Additionally, the display device 170 can be coupled to the computing machine 100 or the display device 170 can be integrated as part of the computing machine 100.
When utilizing a user interaction to associate the object with one or more files 160 on the computing machine 100, the sensor 130 can detect the user interacting with the object through one or more gestures. A gesture can correspond to a command recognized by the file application 110 and/or the computing machine 100. Further, the gesture can be made by the user to or from the object. Additionally, the gesture can be made by the user between the object and the computing machine 100.
In one embodiment, the gesture can be a visual motion made by the user to or from the object or the computing machine 100. The visual motion can include one or more hand motions or hand gestures. In another embodiment, the gesture can include audio from the user captured by the sensor 130 or touch motions made by the user and detected by the sensor 130.
In one embodiment, when associating the object with one or more files 160, the sensor 130 can detect the user making one or more hand gestures. The sensor can detect the user closing his hand over the object and moving his hand to one or more files 160 being displayed on the display device 170. The user can then release his hand to an open position over an area of the display device 170 where one or more of the files 160 are rendered. In response, the file application 110 can identify which of the files 160 are being rendered at the area of the display device 170 and proceed to associate the corresponding file or files 160 with the object.
In another embodiment, the file application 110 can additionally create a binary and/or pixel map to identify a position of the object, the user, the computing machine, and/or one or more files rendered for display. As the user makes one or more gestures, the file application 110 can track and mark the coordinates and/or positions of where the gesture is being made. As a result, the file application can accurately track a direction of the gesture and identify which object or files are included in the gesture.
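One way to picture the coordinate tracking is a sketch that maps the position where a gesture ends to the rectangular region of the display device in which each file is rendered. The region layout, function names, and sample coordinates are assumptions for illustration.

```python
from typing import Dict, List, Optional, Tuple

# Hypothetical layout map: each rendered file occupies a rectangle
# (x0, y0, x1, y1) on the display device, recorded when it is drawn.
FileRegions = Dict[str, Tuple[int, int, int, int]]

def file_at(position: Tuple[int, int], regions: FileRegions) -> Optional[str]:
    """Map the coordinate where the user's hand opened to the file rendered
    at that area of the display device."""
    x, y = position
    for name, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def resolve_gesture(path: List[Tuple[int, int]],
                    regions: FileRegions) -> Optional[str]:
    """Track the marked coordinates of a gesture and identify which file the
    gesture ends on; the direction of the path could also be inspected."""
    return file_at(path[-1], regions) if path else None

regions = {"report.pdf": (0, 0, 200, 150), "photo.jpg": (210, 0, 400, 150)}
print(resolve_gesture([(500, 400), (350, 220), (250, 80)], regions))  # photo.jpg
```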
Once the user has associated the object with one or more of the files, the file application 110 can proceed to store the detected and/or captured information of the object as predefined information of a recognized object associated with one or more of the files 160.
Additionally, in response to the object being associated with one or more of the files 160 on the computing machine 100, the file application 110 can proceed to access the associated files 160. Further, the file application 110 can render the associated files 160 for a user to continue to access or edit in response to the user interacting with the object through at least one gesture.
In one embodiment, when accessing or editing an associated file 160, the file application 110 configures the sensor 130 to capture a gesture from the user and proceeds to identify a corresponding command associated with the gesture. The file application 110 can then proceed to execute the identified command on one or more of the associated files 160.
In one embodiment, one or more of the associated files 160 list commands which can be executed on the associated files 160. Additionally, the commands can list a corresponding gesture which can be entered by the user when entering the command. In another embodiment, one or more commands and the corresponding gestures are stored in a list of commands or in a database of commands.
When accessing or editing one or more of the associated files 160, the file application 110 can scroll through one or more of the associated files. Additionally, the file application 110 can edit the associated files by adding or removing content from one or more of the associated files 160. In another embodiment, the file application 110 can associate the associated files 160 with additional files 160 on the computing machine 100. In other embodiments, the file application 110 can perform additional commands and/or execute additional commands on one or more associated files 160 in addition to and/or in lieu of those noted above.
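The gesture-to-command lookup can be illustrated with a brief sketch, assuming a hypothetical table of recognized gestures and simple command callables; the gesture names and commands shown are examples, not part of the disclosure.

```python
from typing import Callable, Dict, List

def scroll(files: List[str]) -> None:
    print("scrolling through", files)

def add_text(files: List[str]) -> None:
    print("adding content to", files)

def remove_text(files: List[str]) -> None:
    print("removing content from", files)

# Hypothetical table of recognized gestures and the commands they map to;
# in the text this information is listed with the associated files or kept
# in a separate list or database of commands.
COMMANDS: Dict[str, Callable[[List[str]], None]] = {
    "swipe_left": scroll,
    "pinch_open": add_text,
    "pinch_close": remove_text,
}

def execute_gesture(gesture: str, associated_files: List[str]) -> bool:
    """Identify the command associated with a captured gesture and execute it
    on the associated files; unrecognized gestures are ignored."""
    command = COMMANDS.get(gesture)
    if command is None:
        return False
    command(associated_files)
    return True

execute_gesture("swipe_left", ["report.pdf"])   # scrolling through ['report.pdf']
```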
Additionally, as illustrated in
As noted above, the sensor 230 is configured by a processor of the computing machine 200 and/or a file application to detect and/or capture information of an object 280. Additionally, the sensor 230 is configured to detect and/or capture a user 210 interacting with the object 280. In response to detecting the object 280 and the user 210 interacting with the object 280, the file application can associate the object 280 with one or more files on the computing machine 200. Additionally, the file application can access and/or edit one or more of the associated files in response to the user 210 interacting with the object 280.
As shown in the present embodiment, the sensor 230 can be configured to detect and/or capture a view or a volume of an environment around the computing machine 200 by scanning and/or detecting information around the computing machine 200. The sensor 230 captures a view of any objects 280 within the environment of the computing machine 200. As noted above, the sensor 230 can actively scan the environment for an object 280 or the sensor 230 can periodically or upon request scan the environment for an object 280.
As illustrated in
Once an object 280 has been found within a view of the sensor 230, the sensor 230 can proceed to determine a distance of the object 280. As noted above, the sensor 230 can be a 3D depth image capturing device. The sensor 230 can scan a volume or a viewing area in front of the sensor 230 for the object 280 and/or the user. The sensor 230 can then identify the distance of any object 280 and/or user detected within the volume or the viewing area.
The sensor 230 then passes the identified distance of the object 280 to a file application. The file application compares the captured distance to a predefined distance to determine whether the object 280 is within proximity of the computing machine 200 and/or the sensor 230. If the file application determines that the distance of the object is greater than the predefined distance, the file application can render a user interface 275 on the display device 270 to prompt the user 210 to bring the object 280 closer so the file application can identify the object 280.
Additionally, the user interface 275 can be configured to render the object 280 and/or the user 210. In another embodiment, the user interface 275 can additionally be configured by the file application to render one or more files of the computing machine and/or the user 210 accessing one or more of the files. In other embodiments, the user interface 275 can be configured to display additional information or details in addition to and/or in lieu of those noted above.
If the file application determines that the distance of the object 280 is less than the predefined distance, then the object 280 will be determined to be within proximity of the computing machine 200 and/or the sensor 230. The file application will then proceed to identify the object 280 and associate the object 280 with one or more files on the computing machine 200.
As noted above, when identifying the object 280, the sensor 230 can detect and/or capture information of the object 280. The information can include one or more dimensions of the object 280. The dimensions can include a length, a width, a height, a shape, and/or a color of the object 280. In another embodiment, the information can include a bar code or a visual signature on the object 280. In other embodiments, the information can include an image of the object 280.
The file application can then compare the information to predefined information corresponding to one or more recognized objects on the computing machine 200. As noted above, a recognized object can be associated with one or more files on the computing machine 200. If a match is found, then the file application will proceed to identify the object as the recognized object and associate the object 280 with one or more of the files associated with the corresponding recognized object.
In another embodiment, if no match is found, the user 210 can associate the object 280 with one or more of the files on the computing machine 200 using one or more gestures 290. The file application can additionally configure the display device 270 to render the user interface 275 to prompt the user 210 to associate the object 280 with one or more of the files on the computing machine 200.
As noted above and as illustrated in
As shown in
Utilizing one or more captured gestures 290, the user 210 can associate the object 280 with one or more files on the computing machine. Once the object 280 is associated with one or more of the files, the file application can configure the display device 270 to render one or more of the associated files. In one embodiment, the display device 270 further renders the user interface 275 to display one or more of the associated files being accessed and/or edited by the user 210.
As illustrated in
In one embodiment, as illustrated in
As shown in
The sensor 330 sends information of this hand gesture to the file application to identify. The file application identifies the hand gesture 390 between the object 380 and file 360. Additionally, the file application determines that a corresponding command for the hand gesture 390 is an association between the object 380 and file 360. As a result, the file application proceeds to associate file 360 with the object 380.
In one embodiment, the file application additionally stores captured information of the object 380 as predefined information of a recognized object. The predefined information of the recognized object can be used by the file application subsequently to access associated file 360 when object 380 is detected and identified by the sensor 330.
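Storing captured information as predefined information of a new recognized object could look like the following sketch, which assumes a hypothetical JSON file on the storage device and a register_recognized_object helper; both names are illustrative.

```python
import json

def register_recognized_object(captured: dict, associated_files: list,
                               store_path: str = "recognized_objects.json") -> None:
    """Append the captured information of an object, together with its
    associated files, to a JSON store so the object can be identified as a
    recognized object the next time the sensor detects it."""
    try:
        with open(store_path) as fh:
            store = json.load(fh)
    except FileNotFoundError:
        store = []
    store.append({"predefined_information": captured,
                  "associated_files": associated_files})
    with open(store_path, "w") as fh:
        json.dump(store, fh, indent=2)

register_recognized_object({"dimensions": [21.0, 29.7], "barcode": None},
                           ["quarterly_report.doc"])
```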
As noted above, once the object 380 is associated with one or more of the files, the file application can access and render the associated file 360 on the digital display device 370. Additionally, the file application can continue to configure the sensor 330 to detect one or more gestures 390 from the user when accessing and/or editing the associated files 360.
In one embodiment, the sensor 430 has detected object 480. As a result, the file application 410 configures the sensor 430 to capture information of the object 480. As shown in
The file application 410 then proceeds to attempt to identify object 480 and associate object 480 with one or more files on the computing machine. As illustrated in
As illustrated in
In one embodiment, the list of objects 420 is stored on a storage device accessible to the file application 410. In another embodiment, the predefined information of recognized objects can be stored as part of a file. In other embodiments, the predefined information can be included in a database of objects accessible to the computing machine and/or the file application 410.
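The database-of-objects variant can be pictured with a minimal sketch using an in-memory SQLite table; the table layout, signature format, and lookup helper are assumptions made for illustration only.

```python
import sqlite3

# A single table pairs each recognized object's signature with the files it
# is associated with (stored here as a semicolon-separated string).
conn = sqlite3.connect(":memory:")   # a real system might use storage device 140
conn.execute("""CREATE TABLE recognized_objects (
                    signature TEXT PRIMARY KEY,
                    associated_files TEXT)""")
conn.execute("INSERT INTO recognized_objects VALUES (?, ?)",
             ("barcode:0123456789", "invoice_march.pdf;invoice_april.pdf"))
conn.commit()

def lookup(signature: str):
    """Return the files associated with a recognized object, or None."""
    row = conn.execute("SELECT associated_files FROM recognized_objects "
                       "WHERE signature = ?", (signature,)).fetchone()
    return row[0].split(";") if row else None

print(lookup("barcode:0123456789"))   # ['invoice_march.pdf', 'invoice_april.pdf']
```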
As shown in
As noted above, once the object 480 has been associated with one or more files of the computing machine, the associated files can be accessed and/or edited by the user in response to the sensor 430 detecting the user interacting with the object 480 through one or more gestures.
Additionally, as illustrated in
As shown in
As illustrated in
In response, the file application 510 attempts to identify the gesture and a command associated with the gesture. As shown in
In other embodiments, the user can interact with the object to access or edit the associated media 560 using additional gestures in addition to and/or in lieu of those noted above and illustrated in
As noted above, the processor and/or the file application can configure the sensor to scan an environment of the computing machine for an object and detect a user interacting with the object 700. The object can include any physical object, media, and/or document which a user can interact with. In one embodiment, the object can include a document, a folder or collection of documents, a book, a newspaper, and/or a notepad.
Once an object is detected, the file application will then proceed to associate the object with one or more files on the computing machine 710. As noted above, a file on the computing machine can include a system file, an application file, a document file, and/or a media file.
In one embodiment, before associating the object with one or more files, the file application will configure the sensor to detect and/or capture information of the object and attempt to identify the object. The information can include dimensions of the object. In another embodiment, the information can include a bar code or a visual signature included on the object. In other embodiments, the information can include a visual image of the object.
Utilizing the information detected and/or captured from the object, the file application proceeds to compare the information to predefined information of one or more recognized objects. The recognized objects are objects which are recognized by the computing machine and associated with one or more files on the computing machine. Further, the predefined information can include dimensions of the recognized object, a bar code or visual signature of the recognized object, and/or an image of the recognized object.
If the file application finds that the detected and/or captured information matches predefined information of one of the recognized objects, the file application will proceed to identify the object as the recognized object and associate the object with the files listed to be associated with the recognized object.
In another embodiment, if the file application does not find a match, the user can proceed to associate the object with one or more of the files on the computing machine. As noted above, one or more of the files can be rendered as part of a user interface on a display device. When associating the object with one or more of the files, the user can utilize one or more gestures. A gesture can include a visual gesture, an audio gesture, and/or a touch gesture. The gesture can be made to or from the object and/or the computing machine.
In one embodiment, the user makes a visual gesture by moving his hand from the object to a media file displayed on the digital display device. The sensor can detect and capture this gesture for the file application. The file application then proceeds to determine that the user is associating the object with the corresponding file. As a result, the file application associates the corresponding file with the object. Additionally, the file application can create a new recognized object with captured information of the object and list the recognized object to be associated with the corresponding files.
Once the object has been associated with one or more of the files on the computing machine, the file application can access and/or retrieve one or more of the associated files. Additionally, the file application can configure the display device to render one or more associated files being accessed in response to the user interacting with the object 720.
As noted above, when accessing and/or editing one or more of the associated files, the file application can configure the sensor to detect and/or capture one or more gestures from the user and attempt to identify a corresponding command associated with the gesture. As noted above, one or more recognized gestures and the corresponding commands can be listed as part of an associated file. In another embodiment, one or more gestures and their corresponding commands can be included in a list or a database.
If the file application is able to find a matching gesture and an associated command corresponding to the gesture, the file application will execute the corresponding command when accessing or editing an associated file. The method is then complete or the file application can continue to access and/or edit the associated files in response to the user interacting with the object. In other embodiments, the method of
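Taken together, the steps of this method (700, 710, 720) could be sketched as a short driver, assuming hypothetical helper names for detection, association, and rendering; it is an illustration of the flow, not the disclosed implementation.

```python
from typing import Dict, List, Optional

def detect_object_and_user(sensor_frame: dict) -> Optional[dict]:
    """Step 700: scan the environment for an object and a user interacting with it."""
    return sensor_frame if sensor_frame.get("object") else None

def associate_files(detected: dict, recognized: Dict[str, List[str]]) -> List[str]:
    """Step 710: associate the object with files, either through a recognized
    object's predefined information or through the files the user's gesture
    pointed at."""
    if detected["object"] in recognized:
        return recognized[detected["object"]]
    return detected.get("gesture_targets", [])

def render_and_edit(files: List[str], gestures: List[str]) -> None:
    """Step 720: render the associated files and apply gesture-driven commands."""
    print("rendering", files)
    for gesture in gestures:
        print("applying", gesture, "to", files)

frame = {"object": "notepad", "gestures": ["swipe_left"]}
recognized = {"notepad": ["meeting_notes.txt"]}
detected = detect_object_and_user(frame)
if detected:
    render_and_edit(associate_files(detected, recognized), detected["gestures"])
```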
As noted above, the file application and/or the processor can initially configure the sensor to scan an environment around the computing machine for an object and identify a distance of the object. In one embodiment, the sensor is a 3D depth image capture device configured to scan a viewing area and/or a volume around the computing machine for the object. Additionally, the object can include any physical object, physical document, and/or physical media.
If an object is found, the file application will then proceed to determine whether the object is within a proximity of the computing machine 800. When determining whether the object is within a proximity of the computing machine, the file application will compare a detected and/or captured distance of the object to a predefined distance. If the distance is less than the predefined distance, the file application will determine that the object is within the predefined distance and proceed to configure the sensor to detect and/or capture information of the object and a user interacting with the object.
In another embodiment, if the object is not within proximity of the computing machine, the file application can configure the display device to render a message for the user to bring the object closer. Once the object is within proximity, the sensor can detect and/or capture information of the object and the user interacting with the object.
As noted above, the information detected and/or captured by the sensor can include one or more dimensions of the object, a bar code or visual signature of the object, and/or an image of the object. In other embodiments, the information can include additional information in addition to and/or in lieu of those noted above.
Utilizing the captured information, the file application will determine whether the captured information matches any predefined information from a recognized object 820. As noted above, a recognized object is an object which the computing machine and/or the file application recognizes. Additionally, the recognized object is associated with one or more files of the computing machine. Further, the predefined information can include one or more dimensions of the recognized object, a bar code or a visual signature of the recognized object, and/or an image of the recognized object.
The file application will utilize the information detected and/or captured by the sensor and scan the predefined information of the recognized objects for a match. If a match is found, the file application will proceed to identify the object as the recognized object and proceed to associate one or more of the files associated with the recognized object with the object 830.
In one embodiment, if no match is found, the file application will determine that the object is not recognized and no files are immediately associated with the object. In another embodiment, the file application will proceed to associate at least one file with the object in response to at least one gesture made by the user 840. As noted above, a gesture can include a visual gesture consisting of motion made by the user, an audio gesture made when the user speaks, and/or a touch gesture when the user touches the object and/or one or more components of the computing machine.
In one embodiment, once the object is associated with one or more files of the computing machine, the file application can configure the sensor to detect at least one gesture made by the user when the user is interacting with the object 850. Additionally, the file application can configure the display device to render a user interface of at least one of the associated files being accessed in response to the user interacting with the object 860. The user interface can display the associated files being accessed and/or edited. In another embodiment, the user interface can display the object and/or the user interacting with the object.
If the sensor detects any additional gestures from the user, the sensor will capture the gesture for the file application to attempt to identify. As noted above, one or more of the associated files can list or store recognized gestures and corresponding commands for the recognized gestures. The file application will determine whether the captured gesture matches one or more of the recognized gestures. If a match is found, the file application will proceed to edit and/or access at least one of the files in response to the gesture 870.
In one embodiment, the file application additionally configures the display device to render any associated files being edited and/or accessed. The method is then complete or the file application can continue to access and/or edit any associated files of the object in response to the user interacting with the object. In other embodiments, the method of
By configuring a sensor to detect an object, the object can accurately and securely be associated with one or more files of a computing machine. Additionally, by accessing and/or editing one or more of the associated files in response to the user making one or more gestures with the object, a natural and user friendly experience can be created for the user when the user is interacting with the object.