Monitoring large and complex environments is a challenging task for security operators: situations evolve quickly, information is distributed across multiple screens and systems, uncertainty is rampant, decisions can carry high risk and far-reaching consequences, and responses must be quick and coordinated when problems occur. The increased market presence of single-touch and multi-touch interaction devices such as the iPhone, GPS navigators, the HP TouchSmart laptop, Microsoft Surface, and Blackberry mobile devices offers a significant opportunity to investigate new gesture-based interaction techniques that can improve operator performance during complex monitoring and response tasks.
However, the solutions typically adopted to address the myriad needs of complex security environments often consist of adding a multitude of features and functions. Adding such features requires operators to remember which features are available, as well as when and how to access them. It would therefore be desirable for the added features to be intuitive, and thereby easy to use.
In the following description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific embodiments which may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that structural, electrical, and optical changes may be made without departing from the scope of the present invention. The following description of example embodiments is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims.
The functions or algorithms described herein may be implemented in software, or in a combination of software and human-implemented procedures, in one embodiment. The software may consist of computer-executable instructions stored on computer-readable media such as memory or other types of storage devices. Further, such functions correspond to modules, which may be software, hardware, firmware, or any combination thereof. Multiple functions may be performed in one or more modules as desired, and the embodiments described are merely examples. The software may be executed on a digital signal processor, ASIC, microprocessor, or other type of processor operating on a computer system, such as a personal computer, server, or other computer system.
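Purely as an illustrative sketch of such a software module (the interface and names below are assumptions of this example, not part of the disclosure), the gesture-handling functions described in this document could be packaged as computer-executable instructions behind a small interface:

```typescript
// Hypothetical module boundary for the gesture-handling functions described
// in this disclosure. The interface and names are illustrative assumptions;
// the same functions could equally be realized in hardware or firmware.
export interface GestureModule {
  onTouchStart(ev: TouchEvent): void; // finger(s) placed on the display
  onTouchMove(ev: TouchEvent): void;  // finger(s) dragged across the display
  onTouchEnd(ev: TouchEvent): void;   // finger(s) lifted from the display
}
```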
One example embodiment is illustrated in the accompanying figures, in which a touch-sensitive display 10 shows a camera 12 and a field of view 14 associated with the camera 12.
Another example embodiment, relating to manipulating the zoom of a camera 12, is also illustrated in the figures.
As shown in the figures, the zoom of the camera 12 may be manipulated by moving fingers on the touch-sensitive display 10 relative to the camera 12.
In some embodiments, a level indicator 19 appears on the display 10. In the illustrated example embodiment, the level indicator 19 appears within the camera 12.
Although not shown in the figures, the zoom of the camera 12 could also be adjusted by moving fingers away from the camera 12. The field of view 14 would then become narrower as the fingers move away from the camera 12.
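The zoom behavior just described can be sketched as follows. This is a hedged, minimal TypeScript illustration, not the patent's implementation: the CameraIcon type, the distance-to-field-of-view mapping, and constants such as MIN_FOV_DEG and MAX_DIST_PX are assumptions made for the sketch.

```typescript
// Minimal sketch (assumptions noted above): dragging a finger away from a
// camera icon narrows the camera's field of view (zooming in), matching the
// behavior described for camera 12 and field of view 14.

interface CameraIcon {
  x: number;        // icon position on the touch-sensitive display (px)
  y: number;
  fovDeg: number;   // current field of view, in degrees
}

const MIN_FOV_DEG = 5;    // fully zoomed in
const MAX_FOV_DEG = 90;   // fully zoomed out
const MAX_DIST_PX = 300;  // finger distance at which zoom saturates (assumed)

// Farther from the icon => narrower field of view (greater zoom).
function fovFromTouch(cam: CameraIcon, touchX: number, touchY: number): number {
  const dist = Math.hypot(touchX - cam.x, touchY - cam.y);
  const t = Math.min(dist / MAX_DIST_PX, 1);
  return MAX_FOV_DEG - t * (MAX_FOV_DEG - MIN_FOV_DEG);
}

function onTouchMove(cam: CameraIcon, ev: TouchEvent): void {
  if (ev.touches.length === 0) return;
  const touch = ev.touches[0];
  cam.fovDeg = fovFromTouch(cam, touch.clientX, touch.clientY);
  // A level indicator such as element 19 could report the zoom fraction:
  const zoom = (MAX_FOV_DEG - cam.fovDeg) / (MAX_FOV_DEG - MIN_FOV_DEG);
  console.log(`zoom ${(zoom * 100).toFixed(0)}%`);
}
```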
As shown in the figures, a plurality of cameras 12, each with an associated field of view 14, may in some embodiments be shown on the touch-sensitive display 10.
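One way to picture this multi-camera embodiment is the following sketch, which draws each camera 12 and its field of view 14 as a wedge on an HTML canvas standing in for the touch-sensitive display 10. The Camera type and all drawing parameters are assumptions of this example, not details from the disclosure.

```typescript
// Illustrative sketch: draw several camera icons, each with a field-of-view
// wedge, on a canvas standing in for the touch-sensitive display.

interface Camera {
  x: number;
  y: number;            // position on the display (px)
  headingRad: number;   // direction the camera points
  fovRad: number;       // angular width of the field of view
  range: number;        // how far to draw the wedge (px)
}

function drawCameras(ctx: CanvasRenderingContext2D, cameras: Camera[]): void {
  for (const cam of cameras) {
    // Field-of-view wedge
    ctx.beginPath();
    ctx.moveTo(cam.x, cam.y);
    ctx.arc(cam.x, cam.y, cam.range,
            cam.headingRad - cam.fovRad / 2,
            cam.headingRad + cam.fovRad / 2);
    ctx.closePath();
    ctx.fillStyle = "rgba(0, 120, 255, 0.25)";
    ctx.fill();

    // Camera icon
    ctx.beginPath();
    ctx.arc(cam.x, cam.y, 6, 0, 2 * Math.PI);
    ctx.fillStyle = "black";
    ctx.fill();
  }
}
```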
A block diagram of a computer system that executes programming 1225 for performing the above method is shown in FIG. 12.
Computer 1210 may include or have access to a computing environment that includes input 1216, output 1218, and a communication connection 1220. The input 1216 may be a keyboard and mouse/touchpad, or other type of data input device, and the output 1218 may be a display device or printer or other type of device to communicate information to a user. In one embodiment, a touch screen device may be used as both an input and an output device.
The computer may operate in a networked environment using a communication connection to connect to one or more remote computers. The remote computer may include a personal computer (PC), server, router, network PC, a peer device or other common network node, or the like. The communication connection may include a Local Area Network (LAN), a Wide Area Network (WAN) or other networks.
Computer-readable instructions stored on a computer-readable medium are executable by the processing unit 1202 of the computer 1210. A hard drive, CD-ROM, and RAM are some examples of articles including a computer-readable medium.
The methods described herein may help security personnel effectively support security monitoring and response tasks. Users can interact with a touch-sensitive display using intuitive gestures that support tasks and activities such as monitoring unrelated assets and/or responding to an incident. The information provided on the display gives users the context needed for effective interaction with assets (e.g., cameras) within a complex environment. Users can effectively interact with (i.e., view and/or adjust) assets using a variety of single-touch and multi-touch gestures on the touch-sensitive display.
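As a rough sketch of such gesture-to-asset interaction (the Asset interface and the touch-count-based gesture classification below are inventions of this example, not the disclosure's), a simple dispatcher might look like:

```typescript
// Illustrative only: classify a touch event by touch count and route it to
// an asset action. Real gesture recognition would track touch history.

type Gesture = "tap" | "drag" | "pinch";

interface Asset {
  id: string;
  select(): void;                    // e.g., bring a camera's video forward
  pan(dx: number, dy: number): void; // e.g., adjust a camera's orientation
  zoom(scale: number): void;         // e.g., adjust a camera's field of view
}

function classify(ev: TouchEvent): Gesture {
  if (ev.touches.length >= 2) return "pinch";
  return ev.type === "touchmove" ? "drag" : "tap";
}

function dispatch(asset: Asset, ev: TouchEvent, dx = 0, dy = 0, scale = 1): void {
  switch (classify(ev)) {
    case "tap":   asset.select(); break;
    case "drag":  asset.pan(dx, dy); break;  // deltas from touch history
    case "pinch": asset.zoom(scale); break;  // scale from finger spread
  }
}
```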
The display may show 3-D or 2-D views of an environment, depending on which is the most effective representation of the situation (environment and context). The environment (e.g., a building) or assets (e.g., equipment) can be shown on the touch-sensitive display such that a user can easily access and manipulate the assets using gestures on the touch-sensitive display.
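A minimal sketch of that representation choice follows; the Situation fields and the selection criteria are illustrative assumptions, not criteria stated in the disclosure.

```typescript
// Hypothetical rule for choosing between 2-D and 3-D views of the environment.

type ViewMode = "2D" | "3D";

interface Situation {
  respondingToIncident: boolean;
  assetCount: number;
}

function chooseViewMode(s: Situation): ViewMode {
  // A plan-style 2-D view scales better when monitoring many assets;
  // a 3-D view can give more spatial context during an incident response.
  return s.respondingToIncident && s.assetCount < 10 ? "3D" : "2D";
}
```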
The Abstract is provided to comply with 37 C.F.R. §1.72(b) to allow the reader to quickly ascertain the nature and gist of the technical disclosure. The Abstract is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims.
Number | Name | Date | Kind |
---|---|---|---|
4992866 | Morgan | Feb 1991 | A |
5483261 | Yasutake | Jan 1996 | A |
5872594 | Thompson | Feb 1999 | A |
6208329 | Ballare | Mar 2001 | B1 |
6542191 | Yonezawa | Apr 2003 | B1 |
6680746 | Kawai et al. | Jan 2004 | B2 |
6697105 | Kato et al. | Feb 2004 | B1 |
6888565 | Tanaka et al. | May 2005 | B1 |
6954224 | Okada et al. | Oct 2005 | B1 |
6965376 | Tani et al. | Nov 2005 | B2 |
6965394 | Gutta et al. | Nov 2005 | B2 |
6973200 | Tanaka et al. | Dec 2005 | B1 |
7030861 | Westerman et al. | Apr 2006 | B1 |
7061525 | Tanaka et al. | Jun 2006 | B1 |
7183944 | Gutta et al. | Feb 2007 | B2 |
7278115 | Conway et al. | Oct 2007 | B1 |
7362221 | Katz | Apr 2008 | B2 |
7394367 | Aupperle et al. | Jul 2008 | B1 |
7411575 | Hill et al. | Aug 2008 | B2 |
7479949 | Jobs et al. | Jan 2009 | B2 |
7519223 | Dehlin et al. | Apr 2009 | B2 |
7535463 | Wilson | May 2009 | B2 |
20010026263 | Kanamori et al. | Oct 2001 | A1 |
20020067412 | Kawai et al. | Jun 2002 | A1 |
20050036036 | Stevenson et al. | Feb 2005 | A1 |
20050079896 | Kokko et al. | Apr 2005 | A1 |
20050225634 | Brunetti et al. | Oct 2005 | A1 |
20060026521 | Hotelling et al. | Feb 2006 | A1 |
20060034043 | Hisano et al. | Feb 2006 | A1 |
20060036944 | Wilson | Feb 2006 | A1 |
20060187196 | Underkoffler et al. | Aug 2006 | A1 |
20070146337 | Ording et al. | Jun 2007 | A1 |
20070171273 | Saleh et al. | Jul 2007 | A1 |
20070229471 | Kim et al. | Oct 2007 | A1 |
20080013826 | Hillis et al. | Jan 2008 | A1 |
20080129686 | Han | Jun 2008 | A1 |
20080143559 | Dietz et al. | Jun 2008 | A1 |
20080168403 | Westerman et al. | Jul 2008 | A1 |
20080231610 | Hotelling et al. | Sep 2008 | A1 |
20090040188 | Shu | Feb 2009 | A1 |
20090084612 | Mattice et al. | Apr 2009 | A1 |
20090160785 | Chen et al. | Jun 2009 | A1 |
20090262091 | Ikeda et al. | Oct 2009 | A1 |
20100053219 | Kornmann | Mar 2010 | A1 |
20100138763 | Kim | Jun 2010 | A1 |
20100188423 | Ikeda et al. | Jul 2010 | A1 |
20100192109 | Westerman et al. | Jul 2010 | A1 |
20100211920 | Westerman et al. | Aug 2010 | A1 |
20100304731 | Bratton et al. | Dec 2010 | A1 |
20110093822 | Sherwani | Apr 2011 | A1 |
20110117526 | Wigdor et al. | May 2011 | A1 |
20110181526 | Shaffer et al. | Jul 2011 | A1 |
20110199314 | Laberge et al. | Aug 2011 | A1 |
20110199386 | Dharwada et al. | Aug 2011 | A1 |
20110199516 | Laberge et al. | Aug 2011 | A1 |
20110225553 | Abramson et al. | Sep 2011 | A1 |
20110239155 | Christie | Sep 2011 | A1 |
20120023509 | Blumenberg | Jan 2012 | A1 |
20120088526 | Lindner | Apr 2012 | A1 |
20120242850 | Laberge et al. | Sep 2012 | A1 |
Number | Date | Country |
---|---|---|
WO-9849663 | Nov 1998 | WO |
Entry |
---|
U.S. Appl. No. 12/704,950, Advisory Action mailed May 8, 2012, 5 pgs. |
U.S. Appl. No. 12/704,950, Final Office Action mailed Mar. 8, 2012, 8 pgs. |
U.S. Appl. No. 12/704,950, Non Final Office Action mailed Jun. 15, 2012, 9 pgs. |
U.S. Appl. No. 12/704,950, Non Final Office Action mailed Dec. 16, 2011, 6 pgs. |
U.S. Appl. No. 12/704,950, Response filed Jan. 5, 2012 to Non Final Office Action mailed Dec. 16, 2011, 8 pgs. |
U.S. Appl. No. 12/704,950, Response filed Apr. 25, 2012 to Final Office Action mailed Mar. 8, 2012, 8 pgs. |
U.S. Appl. No. 12/704,950, Response filed Jun. 7, 2012 to Final Office Action mailed Mar. 8, 2012, 10 pgs. |
“Atmel's New Family of Touch Screen Solutions Enable Two Touch Gestures for Intuitive User Interfaces”, http://news.thomasnet.com/companystory/821709, (Oct. 22, 2008). |
“Getac Announces Technology Breakthrough With Resistive-Type Multi-Touch Technology for “Hands-On” Applications With or Without Gloves”, http://www.getac.com/news/edm/multi-touch.html, Getac Press Release, (Oct. 6, 2009). |
“HTC TouchFLO review”, http://msmobiles.com/news.php/6616.html, (Aug. 16, 2007). |
“Touch Screen and User Interface”, http://www.sony.jp/products/overseas/contents/pickup/contents/touch_screen/index.html, Undated, (Downloaded Oct. 29, 2009). |
“TOUCH1600 Touch Screen DVR”, http://helpdesk.portasystems.com/download/security/dvr.pdf, Porta Systems Corp., (Sep. 2008). |
Davies, Chris, “Getac V100 Tablet PC gets glove-friendly multitouch display”, http://www.slashgear.com/getac-v100-tablet-pc-gets-glove-friendly-multitouch-display-0759517/, (Oct. 7, 2009). |
Niper, E. D, “INEL central alarm monitoring and assessment system”, Nuclear materials management, 12, (1983), 150-155. |
Posey, Brien, “Touch screen gestures”, http://itknowledgeexchange.techtarget.com/brien-posey/touch-screen-gestures/, Brien Posey's Windows Blog, (Mar. 31, 2009). |
U.S. Appl. No. 12/705,026, filed Feb. 10, 2010, Overlay Feature to Provide User Assistance in a Multi-Touch Interactive Display Environment. |
U.S. Appl. No. 12/704,950, filed Feb. 12, 2010, Method of Showing Video on a Touch-Sensitive Display. |
U.S. Appl. No. 12/704,886, filed Feb. 12, 2010, Gestures on a Touch-Sensitive Display. |
U.S. Appl. No. 13/052,879, filed Mar. 21, 2011, Method of Defining Camera Scan Movements Using Gestures. |
U.S. Appl. No. 12/704,886, Examiner Interview Summary mailed May 20, 2013, 3 pgs. |
U.S. Appl. No. 12/704,886, Non Final Office Action mailed Apr. 12, 2013, 4 pgs. |
U.S. Appl. No. 12/704,886, Notice of Allowance mailed May 23, 2013, 6 pgs. |
U.S. Appl. No. 12/704,886, Response filed Mar. 25, 2013 to Restriction Requirement mailed Mar. 18, 2013, 4 pgs. |
U.S. Appl. No. 12/704,886, Response filed May 14, 2013 to Non Final Office Action mailed Apr. 12, 2013, 6 pgs. |
U.S. Appl. No. 12/705,026, Response filed Mar. 25, 2013 to Final Office Action mailed Feb. 12, 2013, 5 pgs. |
U.S. Appl. No. 12/705,026, Examiner Interview Summary mailed Mar. 28, 2013, 3 pgs. |
U.S. Appl. No. 12/705,026, Examiner Interview Summary mailed May 22, 2013, 3 pgs. |
U.S. Appl. No. 12/705,026, Non Final Office Action mailed Apr. 11, 2013, 7 pgs. |
U.S. Appl. No. 12/704,886, Restriction Requirement mailed Mar. 18, 2013, 5 pgs. |
U.S. Appl. No. 12/705,026, Final Office Action mailed Feb. 13, 2013, 11 pgs. |
U.S. Appl. No. 12/705,026, Non Final Office Action mailed Nov. 23, 2012, 10 pgs. |
U.S. Appl. No. 12/705,026, Response filed Jan. 2, 2013 to Non Final Office Action mailed Nov. 23, 2012, 9 pgs. |
U.S. Appl. No. 12/704,950, Final Office Action mailed Oct. 4, 2012, 10 pgs. |
U.S. Appl. No. 12/704,950, Response filed Sep. 14, 2012 to Non Final Office Action mailed Jun. 15, 2012, 10 pgs. |
Number | Date | Country |
---|---|---|
20110199495 A1 | Aug 2011 | US |