System, method, and computer program product for preventing image-related data loss

Information

  • Patent Grant
  • Patent Number
    9,215,197
  • Date Filed
    Saturday, March 24, 2012
  • Date Issued
    Tuesday, December 15, 2015
Abstract
A system, method, and computer program product are provided for preventing data loss associated with an image. In use, an image is identified, and it is determined whether the image includes predetermined data. In addition, an action is performed based on the determination, for preventing data loss.
Description
FIELD OF THE INVENTION

The present invention relates to data loss prevention, and more particularly to systems for preventing data loss.


BACKGROUND

Typically, data loss prevention systems are utilized for identifying and analyzing text-based data in order to ensure that predetermined data is not leaked. However, such data loss prevention systems have traditionally been ineffective with respect to data loss stemming from images. For example, a screenshot which includes confidential data is generally not recognized by a data loss prevention system which only identifies and analyzes text-based data. Thus, conventional data loss prevention systems have typically been unable to prevent data loss caused by the transfer of images containing confidential data. Such images include not only screenshots and/or screen captures, but also various other types of images.


In some cases, data loss prevention systems have attempted to prevent data loss from images; however, such systems have exhibited various limitations. For example, such systems have merely attempted to prevent image creation (e.g. by disabling predefined user input commands which are used to create images, etc.). Nevertheless, numerous techniques exist to circumvent these precautionary measures. For example, browser and software resources may be utilized which allow images (e.g. screenshots, etc.) to be created without utilizing such predefined user input commands.


There is thus a need for addressing these and/or other issues associated with the prior art.


SUMMARY

A system, method, and computer program product are provided for preventing data loss associated with an image. In use, an image is identified, and it is determined whether the image includes predetermined data. In addition, an action is performed based on the determination, for preventing data loss.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a network architecture, in accordance with one embodiment.



FIG. 2 shows a representative hardware environment that may be associated with the servers and/or clients of FIG. 1, in accordance with one embodiment.



FIG. 3 shows a method for preventing data loss associated with an image, in accordance with one embodiment.



FIG. 4 shows a method for determining whether an image includes confidential data, in accordance with another embodiment.



FIG. 5 shows a method for identifying an image as including confidential data, in accordance with yet another embodiment.



FIG. 6 shows a system for preventing data loss associated with an image, in accordance with one embodiment.





DETAILED DESCRIPTION


FIG. 1 illustrates a network architecture 100, in accordance with one embodiment. As shown, a plurality of networks 102 is provided. In the context of the present network architecture 100, the networks 102 may each take any form including, but not limited to, a local area network (LAN), a wireless network, a wide area network (WAN) such as the Internet, a peer-to-peer network, etc.


Coupled to the networks 102 are servers 104 which are capable of communicating over the networks 102. Also coupled to the networks 102 and the servers 104 is a plurality of clients 106. Such servers 104 and/or clients 106 may each include a desktop computer, laptop computer, hand-held computer, mobile phone, personal digital assistant (PDA), peripheral (e.g. printer, etc.), any component of a computer, and/or any other type of logic. In order to facilitate communication among the networks 102, at least one gateway 108 is optionally coupled therebetween.



FIG. 2 shows a representative hardware environment that may be associated with the servers 104 and/or clients 106 of FIG. 1, in accordance with one embodiment. Such figure illustrates a typical hardware configuration of a workstation in accordance with one embodiment having a central processing unit 210, such as a microprocessor, and a number of other units interconnected via a system bus 212.


The workstation shown in FIG. 2 includes a Random Access Memory (RAM) 214, Read Only Memory (ROM) 216, an I/O adapter 218 for connecting peripheral devices such as disk storage units 220 to the bus 212, a user interface adapter 222 for connecting a keyboard 224, a mouse 226, a speaker 228, a microphone 232, and/or other user interface devices such as a touch screen (not shown) to the bus 212, a communication adapter 234 for connecting the workstation to a communication network 235 (e.g., a data processing network), and a display adapter 236 for connecting the bus 212 to a display device 238.


The workstation may have resident thereon any desired operating system. It will be appreciated that an embodiment may also be implemented on platforms and operating systems other than those mentioned. One embodiment may be written using JAVA, C, and/or C++ language, or other programming languages, along with an object oriented programming methodology. Object oriented programming (OOP) has become increasingly used to develop complex applications.


Of course, the various embodiments set forth herein may be implemented utilizing hardware, software, or any desired combination thereof. For that matter, any type of logic may be utilized which is capable of implementing the various functionality set forth herein.



FIG. 3 shows a method 300 for preventing data loss associated with an image, in accordance with one embodiment. As an option, the method 300 may be carried out in the context of the architecture and environment of FIGS. 1 and/or 2. Of course, however, the method 300 may be carried out in any desired environment.


As shown in operation 302, an image is identified. In the context of the present description, the image may include any graphic or picture representation of data. In various embodiments, the image may include a Joint Photographic Experts Group (JPEG) image, a graphics interchange format (GIF) image, a tagged image file format (TIFF) image, a screenshot, and/or any other type of image that meets the above definition. In one embodiment, the image may be stored as a raster data set of binary or integer values that represent various data.


Additionally, in one embodiment, the image may be identified in response to communication of the image (or a request for such communication) over a network. For example, the image may be sent as an attachment to an electronic mail message over a network from a sender to a recipient. Just by way of example, the network may include any of the networks described above with respect to FIG. 1. Further, the image may be identified by analyzing, monitoring, etc. data communications over the network. Optionally, the image may be identified by analyzing outgoing electronic mail messages.
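As an illustration of this identification step, the following sketch inspects an outgoing mail message for image attachments using Python's standard email library. The sample message contents and the function name are illustrative assumptions, not part of the patent.

```python
import email
from email.message import EmailMessage

def find_image_attachments(raw_message):
    """Return filenames of image attachments in an RFC 822 message.

    A sketch of the identification step: in practice the message would be
    intercepted from outgoing mail traffic; here it is passed in as text.
    """
    msg = email.message_from_string(raw_message)
    images = []
    for part in msg.walk():
        if part.get_content_maintype() == "image":  # MIME type image/*
            images.append(part.get_filename() or "<unnamed image>")
    return images

# Build a sample outgoing message with an image attachment (hypothetical data).
sample = EmailMessage()
sample["From"] = "sender@example.com"
sample["To"] = "recipient@example.com"
sample["Subject"] = "report"
sample.set_content("See attached.")
sample.add_attachment(b"\x89PNG...", maintype="image", subtype="png",
                      filename="screenshot.png")

print(find_image_attachments(sample.as_string()))  # ['screenshot.png']
```

A real deployment would apply the same check at a mail server or gateway rather than to a locally built message.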


In another embodiment, the image may be stored on a device (e.g. such as any of the devices described above with respect to FIGS. 1 and/or 2). For example, the image may be stored on a hard disk drive of a personal computer. To this end, the image may optionally be identified during a system scan. Just by way of example, memory may be scanned for identifying the image (e.g. by identifying predetermined file formats, etc.).
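The system-scan variant, identifying images by predetermined file formats, could be sketched as follows; the extension list and the throwaway directory are assumptions for illustration.

```python
import os
import tempfile

# File extensions treated as image formats for the scan (an assumption;
# the description names JPEG, GIF, and TIFF as examples).
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".gif", ".tif", ".tiff", ".png"}

def scan_for_images(root):
    """Walk a directory tree, returning paths whose extension marks an image."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if os.path.splitext(name)[1].lower() in IMAGE_EXTENSIONS:
                hits.append(os.path.join(dirpath, name))
    return hits

# Demo on a throwaway directory holding one image and one text file.
root = tempfile.mkdtemp()
for name in ("screenshot.PNG", "notes.txt"):
    open(os.path.join(root, name), "w").close()

found = scan_for_images(root)
print([os.path.basename(p) for p in found])  # ['screenshot.PNG']
```

Each hit would then be handed to the analysis step described below.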


As also shown, it is determined whether the image includes predetermined data. Note operation 304. The predetermined data may include any information, content, etc. that has been predetermined. In one embodiment, the predetermined data may include confidential data. In another embodiment, the predetermined data may include textual data. In still another embodiment, the predetermined data may include fingerprinted (e.g. hashed, etc.) data.


Additionally, it may be determined whether the image includes the predetermined data in any manner. For example, in one embodiment, the image may be analyzed. In another embodiment, data (e.g. textual data, etc.) may be extracted from the image. In still another embodiment, the textual data extracted from the image may be fingerprinted.


In yet another embodiment, it may be determined whether the image includes the predetermined data by comparing the image, or any portion of the image, against the predetermined data. For example, data extracted from the image, fingerprinted data associated with the image, etc. may be compared against the predetermined data. In another embodiment, the image may be determined to include the predetermined data, if the image, portion of the image, data extracted from the image, etc. matches the predetermined data or a portion thereof.


Furthermore, as shown in operation 306, an action is performed based on the determination, for preventing data loss. In the context of the present description, the data loss may refer to any loss of data. For example, the data loss may include data leakage, data deletion, a compromise of secure data, etc. In one embodiment, the data loss may result from communication of the image (e.g. over the network, etc.), presentation of the image to a user unauthorized to view the image, etc. Just by way of example, the data loss may be external and/or internal with respect to a network, organization, etc.


Additionally, the action may include any action capable of, at least in part, preventing data loss. For example, in one embodiment, communication of the image may be prevented. In this way, communication of the image may be prevented if it is determined that the image includes the predetermined data, as an option. Similarly, communication of the image may be allowed, if it is determined that the image does not include the predetermined data, as another option.


In yet another embodiment, the image may be quarantined (e.g. until manually deleted, allowed, etc. by a user). In still another embodiment, one or more of an administrator(s), a management team (e.g. a plurality of users, etc.), a sender of the image, and a recipient of the image may be notified. Additionally, a determination that the image includes the predetermined data may be logged. In this way, data loss involving images which include predetermined data may be prevented. Moreover, such data loss may optionally be prevented independently of a manner (e.g. device such as a camera, keyboard, mouse, application, etc.) in which the image is created (e.g. a manner in which a screenshot is captured, etc.).
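The actions described above (allowing, quarantining, notifying, logging) can be sketched as a simple dispatcher. The function signature, the quarantine-by-moving approach, and the notification callback are illustrative assumptions, not the patent's implementation.

```python
import logging
import os
import shutil

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dlp")

def enforce(image_path, contains_predetermined, quarantine_dir, notify=print):
    """Apply a data-loss-prevention action based on the determination.

    Returns 'allow' when the image is clean, 'quarantine' when it matched.
    """
    if not contains_predetermined:
        return "allow"                       # communication may proceed
    os.makedirs(quarantine_dir, exist_ok=True)
    shutil.move(image_path, quarantine_dir)  # hold until manually reviewed
    log.info("quarantined %s", image_path)   # log the determination
    notify("image %s matched predetermined data" % image_path)  # alert parties
    return "quarantine"
```

In practice `notify` might mail an administrator, a management team, or the sender and recipient of the image, as the embodiments above describe.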


Furthermore, prevention of the data loss may be accomplished utilizing any desired device. In various embodiments, the identification of the image, the determination of whether the image includes predetermined data, and the performance of the action may be performed at a gateway, a mail server, a web server, a file transfer protocol (FTP) server, a client device (e.g. desktop, laptop, etc.), and/or any other location, for that matter. Further, the different operations may or may not be performed at the same location.


More illustrative information will now be set forth regarding various optional architectures and features with which the foregoing technique may or may not be implemented, per the desires of the user. It should be strongly noted that the following information is set forth for illustrative purposes and should not be construed as limiting in any manner. Any of the following features may be optionally incorporated with or without the exclusion of other features described.



FIG. 4 shows a method 400 for determining whether an image includes confidential data, in accordance with another embodiment. As an option, the method 400 may be carried out in the context of the architecture and environment of FIGS. 1-3. Of course, however, the method 400 may be carried out in any desired environment. It should also be noted that the aforementioned definitions may apply during the present description.


As shown in operation 402, communication of an image over a network is identified. In one embodiment, the communication may include sending an electronic mail message containing the image from a sender to a recipient. In another embodiment, the communication may include uploading the image to one computer from another computer. In yet another embodiment, the communication may include downloading the image from one computer by another computer. However, the communication may include any transmission of the image over a network.


Furthermore, the communication of the image over the network may be identified in any manner. For example, the image may be identified by monitoring, flagging, tracking, etc. communications over the network. In one embodiment, the communication of the image may include a request for communication of the image over the network, such that identifying the communication may include identifying the request for such communication. As another option, the communication of the image may be identified by intercepting communication of the image over the network.


In addition, the image is analyzed. See operation 404. In one embodiment, the analysis may include extracting data (e.g. text, etc.) from the image. In another embodiment, the data extracted from the image may be fingerprinted. In yet another embodiment, the image may be flagged and examined by an administrator. However, the image may be analyzed in any manner.


Furthermore, in decision 406, it is determined whether the image includes confidential data. In the context of the present embodiment, the confidential data may include any data which is known to be confidential. Optionally, such data may be determined to be confidential if any predetermined words, phrases, etc. (or a number thereof meeting a predefined threshold) indicating a confidential nature of the data are identified. For example, the confidential data may include company policy information, trade secret information, information associated with a nondisclosure agreement, etc.


Optionally, the determination may be made automatically (e.g. via computer code, etc.). Additionally, in one embodiment, data extracted from the image may be compared to known confidential data for determining whether the image includes the confidential data. For example, all data extracted from the image may be fingerprinted and compared against a database of fingerprinted known confidential data. In another example, keywords selected from the data extracted from the image may be fingerprinted and compared against a database of fingerprinted keywords known to be indicative of confidential data. In another embodiment, an administrator may manually examine the image to determine if it includes the confidential data. Of course, the determination whether an image includes the confidential data may be made in any manner.
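A minimal sketch of the fingerprinted-keyword comparison, assuming SHA-256 as the fingerprint function and an invented keyword list; the threshold of two mirrors the figure recited in the claims.

```python
import hashlib

def fingerprint(word):
    """Fingerprint a keyword by hashing it (one of the schemes described)."""
    return hashlib.sha256(word.lower().encode("utf-8")).hexdigest()

# Database of fingerprinted keywords known to be indicative of confidential
# data (the keyword list itself is a hypothetical example).
CONFIDENTIAL_KEYWORD_DB = {fingerprint(w) for w in
                           ("confidential", "proprietary", "nda")}

def includes_predetermined_data(extracted_text, threshold=2):
    """Count extracted words whose fingerprints hit the database; the image
    is flagged when the count meets the predefined threshold."""
    hits = sum(1 for w in extracted_text.split()
               if fingerprint(w) in CONFIDENTIAL_KEYWORD_DB)
    return hits >= threshold

print(includes_predetermined_data("This CONFIDENTIAL memo is proprietary"))  # True
print(includes_predetermined_data("holiday photo"))                          # False
```

A production system would normalize punctuation and tokenize more carefully before hashing; this sketch splits on whitespace only.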


If it is determined in decision 406 that the image does not include confidential data, communication of the image is allowed. See operation 408. Additionally, any technique for allowing communication of the image may be performed. For example, in one embodiment, an electronic mail message containing the image may be sent to its intended recipient(s). In another embodiment, the image may be uploaded to a server in response to a request. In yet another embodiment, controls capable of being utilized (e.g. by a user, etc.) to send the image may be enabled.


If it is determined in decision 406 that the image includes confidential data, in operation 410, an action is performed for preventing data loss from the communication of the image. In one embodiment, such data loss may be prevented by preventing communication of the image over the network (e.g. denying a request to communicate the data, cancelling a communication of the data, etc.). Just by way of example, an electronic mail message may be prevented from being sent to its intended recipients. In another embodiment, an administrator or senior manager may be alerted or notified. However, any action for preventing data loss from communication of the image may be performed.



FIG. 5 shows a method 500 for identifying an image as including confidential data, in accordance with yet another embodiment. As an option, the method 500 may be carried out in the context of the architecture and environment of FIGS. 1-4. For example, the method 500 may be used in the context of operation 404 of FIG. 4. Of course, however, the method 500 may be carried out in any desired environment. Again, it should also be noted that the aforementioned definitions may apply during the present description.


As shown in operation 502, text in an image is identified. In the context of the present description, the text may be numeric, symbolic, alphabetic, and/or any other type of text capable of being identified in an image. In one embodiment, the text in the image may be identified by performing an optical character recognition (OCR) procedure on the image. In another embodiment, the text in the image may be identified by performing an intelligent character recognition (ICR) procedure on the image. In yet another embodiment, the text in the image may be identified utilizing one or more neural networks (e.g. mathematical models, computational models, etc.). Of course, however, the text in the image may be identified in any manner.


Additionally, a fingerprint of the text is determined. Note operation 504. In the context of the present description, the fingerprint may be any value, string, etc. that identifies the text. For example, the fingerprint may be with respect to all or part of the text. To this end, the fingerprint may include a hash of the text, a hash of a portion of the text, etc.


In this way, the fingerprint of the text may be determined in any manner. For example, in one embodiment, the entire text may be hashed. In another embodiment, a portion of the text may be hashed.
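As a concrete illustration of such a fingerprint, hashing the entire text and hashing a portion of it; SHA-256 is an assumption here, as the description does not name a hash algorithm.

```python
import hashlib

def fingerprint(text):
    """Hash text to a fixed-length hex digest usable as a fingerprint."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

report = "Quarterly revenue figures: ..."   # hypothetical extracted text
whole = fingerprint(report)                 # hash of the entire text
partial = fingerprint(report[:18])          # hash of a portion of the text

print(len(whole), whole == partial)  # 64 False
```

Because the digest length is fixed, fingerprints of texts of any size can be stored and compared uniformly in the database described next.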


Further, as shown in operation 506, the fingerprint is compared to a database of known confidential data. The database may include a list, an index, a string, a table, a relational database management system (RDBMS), and/or any other data structure which is capable of storing the known confidential data. Additionally, it should be noted that the database of known confidential data may store textual data, hashed data, modified data, and/or any other form of the known confidential data.


Moreover, the fingerprint may be compared with the database of known confidential data in any desired manner. For example, in one embodiment, the fingerprint may be compared with fingerprints stored in the database of known confidential data. In another embodiment, the database may include a hash table, such that the fingerprint may be utilized as an index for identifying known confidential data (or a fingerprint thereof) in the database to which the fingerprint is compared.


Further still, as shown in decision 508, it is determined if a match is found. In one embodiment, the match may be found, if the fingerprint exactly matches an entry in the database of known confidential data. In another embodiment, the match may be found, if the fingerprint approximately matches an entry in the database of known confidential data. In yet another embodiment, the match may be found, if a portion of the fingerprint matches a portion of an entry in the database of known confidential data. However, the match may be found in any manner.
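The three matching modes above (exact, approximate, and partial) could be sketched as follows. With hash fingerprints only exact comparison is meaningful, so the approximate and partial checks are shown on raw text; the similarity cutoff of 0.8 and the sample database entry are illustrative assumptions.

```python
import difflib

def exact_match(fp, db):
    """Exact match: the fingerprint equals an entry in the database."""
    return fp in db

def partial_match(text, entry):
    """Partial match: some portion of one string appears in the other."""
    return text in entry or entry in text

def approximate_match(text, entry, cutoff=0.8):
    """Approximate match: the strings are similar under a ratio cutoff."""
    return difflib.SequenceMatcher(None, text, entry).ratio() >= cutoff

db_entry = "project aurora launch plan"
print(exact_match("abc123", {"abc123", "def456"}))               # True
print(partial_match("aurora launch", db_entry))                  # True
print(approximate_match("project aurora lanch plan", db_entry))  # True
```

Approximate matching is useful when character recognition introduces small errors, as in the misspelled "lanch" above.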


If it is determined in decision 508 that no match is found, the image is identified as not including confidential data in operation 510. In one embodiment, the image may be tagged with an indication that it does not contain confidential data. In another embodiment, a null set may be returned. In still another embodiment, a user and/or administrator may be notified that the image does not contain confidential data. Of course, however, the image may be identified as not including confidential data in any manner.


If, in decision 508, it is determined that a match is found, the image is identified as including confidential data. Note operation 512. In one embodiment, the image may be tagged with an indication that it contains confidential data. In another embodiment, a notification may be generated for notifying the user and/or administrator that the image contains confidential data. Of course, however, the image may be identified as including confidential data in any manner.



FIG. 6 shows a system 600 for preventing data loss associated with an image, in accordance with another embodiment. As an option, the system 600 may be implemented in the context of the architecture and environment of FIGS. 1-5. Of course, however, the system 600 may be implemented in any desired environment. Again, it should also be noted that the aforementioned definitions may apply during the present description.


As shown, a first device 602 is in communication with a second device 606 via an analyzer 612. While only two devices are shown, it should be noted that the first device 602 and/or second device 606 may also optionally be in communication with a plurality of other devices. In addition, the analyzer 612 may include a single analyzer, as shown, but, of course, may also include any combination of analyzers by which the first device 602 and the second device 606 communicate. In addition, the first device 602 is also in communication with the second device 606 via a first network 604.


As also shown, the analyzer 612 is in communication with a database 614. The database 614 may include any data structure capable of being utilized for storing data. Further, such data may include known confidential data, fingerprints of known confidential data, etc. Thus, the analyzer 612 may be capable of reading, retrieving, etc. data from the database 614. Optionally, the database 614 may receive data updates [e.g. from a server (not shown), etc.].


In one embodiment, the analyzer 612 may be utilized for identifying the communication of an image (e.g. over a network). For example, the analyzer 612 may be capable of identifying the communication of the image sent from the first device 602 to the second device 606. Such identification may be made by intercepting the communication, for example. While not shown, it should be noted that the analyzer 612 may be integrated with (e.g. installed on, etc.) the first device 602 and/or the second device 606.


In another embodiment, a gateway 608 may be utilized for identifying the communication of the image over the first network 604, as shown. Further, the gateway 608 may communicate the image (or a copy thereof) to the analyzer 612. It should be noted that while the analyzer 612 is shown as being separate from the gateway 608, the analyzer 612 may also be integrated with the gateway 608.


Additionally, in another embodiment, the analyzer 612 may analyze the image and determine whether the image includes confidential data. For example, the analyzer 612 may identify text in the image. Additionally, the analyzer 612 may fingerprint the text and compare the fingerprint of the text to fingerprints of known confidential data found in the database 614.


Further, in still another embodiment, the analyzer 612 may allow (e.g. forward, etc.) the communication of the image if the analyzer 612 determines that the image does not include confidential data. For example, the analyzer 612 may allow the first device 602 to communicate the image to the second device 606 if the analyzer 612 determines that the image does not contain any confidential data.


Furthermore, in still another embodiment, the analyzer 612 may perform an action for preventing data loss from the communication of the image if the analyzer 612 determines that the image includes confidential data. For example, if the analyzer 612 determines that the image sent from the first device 602 to the second device 606 includes confidential data, the analyzer 612 may block transmission of the image and send a message regarding the confidential nature of the image to the first device 602, the second device 606, an administrator, etc.


In yet another embodiment, the analyzer 612 may perform an on-demand or automated scan of the first device 602 and/or the second device 606 in order to determine if such devices 602, 606 store images which include confidential data. For example, the analyzer 612 may scan disk storage devices of the first device 602 and/or the second device 606 on a monthly basis in order to determine if the devices store images which include confidential data. Further, the analyzer 612 may perform one or more actions to prevent data loss if it is determined that any of such devices 602, 606 store images which include confidential data. Of course, such scan may also be carried out on external media such as a compact disc (CD), a digital versatile disc (DVD), a memory stick, a universal serial bus (USB) drive, etc.


As also shown, the first network 604 is in communication with a second network 610 via the gateway 608. While only two networks are shown, it should be noted that the first network 604 may also optionally be in communication with a plurality of other networks, and that the second network 610 may also optionally be in communication with a plurality of other networks.


In one embodiment, the gateway 608 may monitor network traffic between the first network 604 and the second network 610. To this end, the gateway 608 may be capable of identifying the communication of an image between such networks 604, 610. For example, the gateway 608 may identify the communication of an image from the first network 604 to the second network 610 or from the second network 610 to the first network 604.


In one embodiment, the gateway 608 may send the identified image to the analyzer 612 to determine whether the image contains confidential data. The analyzer 612 may also alert the gateway 608 of whether the image contains confidential data. Furthermore, the gateway 608 may allow communication of the image based on the alert received from the analyzer 612. For example, if the alert indicates that the image does not include confidential data, communication of the image may be allowed. However, if the alert indicates that the image includes confidential data, the gateway 608 may perform an action for preventing data loss from the communication of the image. For example, if the gateway 608 determines that an image sent from the first network 604 to the second network 610 includes confidential data, the gateway 608 may inform an administrator of one or both networks 604, 610 that communication of an image including confidential information was attempted, may block communication of the image, and/or may perform any other action to prevent data loss from the communication of the image.


While various embodiments have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of a preferred embodiment should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims
  • 1. At least one non-transitory computer readable medium having instructions stored thereon, the instructions when executed on a machine cause the machine to: identify an image included within a communication; extract textual data from the image; determine whether the image includes predetermined data by comparing a representation of a portion of the textual data with the predetermined data, wherein the predetermined data includes data determined to be associated with confidential information, and wherein the image is determined to include the predetermined data if a number of predetermined keywords within the portion of the textual data matches a predefined threshold of predetermined keywords within the predetermined data known to be indicative of confidential data; and perform an action based on the determination, wherein the action includes preventing a subsequent communication of the image based on determining that the image includes the predetermined data; wherein the predefined threshold of predetermined keywords is equal to or greater than two.
  • 2. The non-transitory computer readable medium of claim 1, wherein the action further includes quarantining the image based on determining that the image includes the predetermined data.
  • 3. The non-transitory computer readable medium of claim 1, wherein extracting textual data from the image comprises performing an optical character recognition (OCR) procedure on the image.
  • 4. The non-transitory computer readable medium of claim 1, wherein extracting textual data from the image comprises performing an intelligent character recognition (ICR) procedure on the image.
  • 5. The non-transitory computer readable medium of claim 1, wherein the extracting of the textual data from the image is performed utilizing at least one neural network.
  • 6. The non-transitory computer readable medium of claim 1, wherein preventing subsequent communication of the image includes preventing a subsequent electronic mail message, which attaches the image, from propagating in a network.
  • 7. The non-transitory computer readable medium of claim 1, wherein the image is a screenshot.
  • 8. The non-transitory computer readable medium of claim 1, wherein the image is identified in response to communication of the image over a network.
  • 9. The non-transitory computer readable medium of claim 1, wherein the image is stored on a device.
  • 10. The non-transitory computer readable medium of claim 1, wherein the image is identified during a system scan.
  • 11. The non-transitory computer readable medium of claim 1, wherein the image is included as an attachment in an electronic mail message.
  • 12. The non-transitory computer readable medium of claim 1, wherein the action further includes at least one of: notifying an administrator; notifying a management team; notifying a sender of the image; and notifying a recipient of the image.
  • 13. The non-transitory computer readable medium of claim 1, wherein the representation of the portion of the textual data is generated by fingerprinting at least a portion of the textual data extracted from the image.
  • 14. The non-transitory computer readable medium of claim 13, wherein fingerprinting the portion of the textual data includes hashing of the at least a portion of the textual data.
  • 15. The non-transitory computer readable medium of claim 13, wherein the image is determined to include the predetermined data, if the fingerprinted textual data matches the predetermined data.
  • 16. The non-transitory computer readable medium of claim 1, wherein the identifying, determining, and performing are performed at one or more of a gateway, a mail server, a web server, and a file transfer protocol server.
  • 17. The non-transitory computer readable medium of claim 1, further comprising allowing communication of the image, if it is determined that the image does not include the predetermined data.
  • 18. A computer-implemented method, comprising: identifying an image included within a communication;extracting textual data from the image;determining whether the image includes predetermined data by comparing a representation of a portion of the textual data with the predetermined data, wherein the predetermined data includes data determined to be associated with confidential information, and wherein the image is determined to include the predetermined data if a number of predetermined keywords within the portion of the textual data matches a predefined threshold of predetermined keywords within the predetermined data known to be indicative of confidential data; andperforming an action based on the determination, wherein the action includes preventing a subsequent communication of the image based on determining that the image includes the predetermined data; wherein the predefined threshold of predetermined keywords is equal to or greater than two.
  • 19. A system, comprising: a processor, wherein the system is configured for: identifying an image included within a communication; extracting textual data from the image; determining whether the image includes predetermined data by comparing a representation of a portion of the textual data with the predetermined data, wherein the predetermined data includes data determined to be associated with confidential information, and wherein the image is determined to include the predetermined data if a number of predetermined keywords within the portion of the textual data matches a predefined threshold of predetermined keywords within the predetermined data known to be indicative of confidential data; and performing an action based on the determination, wherein the action includes preventing a subsequent communication of the image based on determining that the image includes the predetermined data; wherein the predefined threshold of predetermined keywords is equal to or greater than two.
  • 20. The system of claim 19, further comprising memory coupled to the processor via a bus.
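The claims above recite two ways of deciding that text extracted from an image contains predetermined confidential data: matching a fingerprint (hash) of at least a portion of the extracted text against stored fingerprints (claims 13-15), and counting predetermined keywords against a predefined threshold of at least two (claims 18-19). The sketch below is illustrative only, not code from the patent; the function names, the normalization step, and the choice of SHA-256 are assumptions, and the OCR step that produces the extracted text is outside the sketch:

```python
import hashlib


def fingerprint(text: str) -> str:
    """Fingerprint a portion of extracted textual data by hashing it
    (claims 13-14). Lowercasing and whitespace collapsing are assumed
    normalization details, not specified by the claims."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def matches_fingerprint(extracted_text: str, known_fingerprints: set) -> bool:
    """Claim 15: the image is determined to include the predetermined data
    if the fingerprinted textual data matches a stored fingerprint."""
    return fingerprint(extracted_text) in known_fingerprints


def keyword_count(extracted_text: str, keywords: list) -> int:
    """Count how many predetermined keywords appear in the extracted text."""
    lowered = extracted_text.lower()
    return sum(1 for kw in keywords if kw.lower() in lowered)


def should_block(extracted_text: str, known_fingerprints: set,
                 keywords: list, threshold: int = 2) -> bool:
    """Claims 18-19: prevent the communication if the number of predetermined
    keywords meets the threshold (which must be two or greater), or if the
    fingerprint of the extracted text matches known confidential data."""
    if threshold < 2:
        raise ValueError("claims require a threshold of at least two keywords")
    return (matches_fingerprint(extracted_text, known_fingerprints)
            or keyword_count(extracted_text, keywords) >= threshold)
```

In a deployment matching claim 16, a check like this would run at a gateway, mail server, web server, or FTP server over text produced by OCR on the intercepted image.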
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation (and claims the benefit of priority under 35 U.S.C. §120) of U.S. application Ser. No. 11/840,831, filed Aug. 17, 2007, and entitled “SYSTEM, METHOD, AND COMPUTER PROGRAM PRODUCT FOR PREVENTING IMAGE-RELATED DATA LOSS”. The disclosure of the prior application is considered part of and is hereby incorporated by reference in its entirety in the disclosure of this application.

US Referenced Citations (203)
Number Name Date Kind
4797447 Gergen et al. Jan 1989 A
5195086 Baumgartner et al. Mar 1993 A
5280527 Gullman et al. Jan 1994 A
5485068 Vaught Jan 1996 A
5572694 Uchino Nov 1996 A
5796948 Cohen Aug 1998 A
5845068 Winiger Dec 1998 A
5941915 Federle et al. Aug 1999 A
5987610 Franczek et al. Nov 1999 A
6073142 Geiger et al. Jun 2000 A
6081265 Nakayama et al. Jun 2000 A
6177932 Galdes et al. Jan 2001 B1
6240417 Eastwick et al. May 2001 B1
6367019 Ansell et al. Apr 2002 B1
6460050 Pace et al. Oct 2002 B1
6658566 Hazard Dec 2003 B1
6718367 Ayyadurai Apr 2004 B1
6741851 Lee et al. May 2004 B1
6820204 Desai et al. Nov 2004 B1
6934857 Bartleson et al. Aug 2005 B1
6957330 Hughes Oct 2005 B1
6961765 Terry Nov 2005 B2
7023816 Couillard Apr 2006 B2
7100123 Todd et al. Aug 2006 B1
7124197 Ocepek et al. Oct 2006 B2
7149778 Patel et al. Dec 2006 B1
7194623 Proudler et al. Mar 2007 B1
7194728 Sirota et al. Mar 2007 B1
7222305 Teplov et al. May 2007 B2
7257707 England et al. Aug 2007 B2
7278016 Detrick et al. Oct 2007 B1
7313615 Fitzpatrick et al. Dec 2007 B2
7346778 Guiter et al. Mar 2008 B1
7350074 Gupta et al. Mar 2008 B2
7350084 Abiko et al. Mar 2008 B2
7383433 Yeager et al. Jun 2008 B2
7424543 Rice et al. Sep 2008 B2
7434513 Dannoux Oct 2008 B2
7437752 Heard et al. Oct 2008 B2
7441000 Boehringer et al. Oct 2008 B2
7461249 Pearson et al. Dec 2008 B1
7475420 Hernacki Jan 2009 B1
7484247 Rozman et al. Jan 2009 B2
7490355 Wong Feb 2009 B2
7497447 Musselman Mar 2009 B2
7506155 Stewart et al. Mar 2009 B1
7519984 Bhogal et al. Apr 2009 B2
7523484 Lum et al. Apr 2009 B2
7526654 Charbonneau Apr 2009 B2
7539857 Bartlett et al. May 2009 B2
7559080 Bhargavan et al. Jul 2009 B2
7581004 Jakobson Aug 2009 B2
7630986 Herz et al. Dec 2009 B1
7653811 Yagiura Jan 2010 B2
7660845 Fusari Feb 2010 B2
7661124 Ramanathan et al. Feb 2010 B2
7689563 Jacobson Mar 2010 B1
7725934 Kumar et al. May 2010 B2
7730040 Reasor et al. Jun 2010 B2
7742406 Muppala Jun 2010 B1
7847694 Lee et al. Dec 2010 B2
7877616 Abiko et al. Jan 2011 B2
7890587 Chebiyyam Feb 2011 B1
7940756 Duffy et al. May 2011 B1
8103727 Lin Jan 2012 B2
8111413 Nuggehalli et al. Feb 2012 B2
8151363 Smithson Apr 2012 B2
8181036 Nachenberg May 2012 B1
8199965 Basavapatna et al. Jun 2012 B1
8272058 Brennan Sep 2012 B2
8353053 Chebiyyam Jan 2013 B1
8446607 Zucker et al. May 2013 B2
8590002 Chebiyyam Nov 2013 B1
8621008 Chebiyyam Dec 2013 B2
8713468 Chebiyyam Apr 2014 B2
8893285 Zucker et al. Nov 2014 B2
20010046069 Jones Nov 2001 A1
20020046275 Crosbie et al. Apr 2002 A1
20020046575 Hayes et al. Apr 2002 A1
20020083003 Halliday et al. Jun 2002 A1
20020099944 Bowlin Jul 2002 A1
20020157089 Patel et al. Oct 2002 A1
20030043036 Merrem et al. Mar 2003 A1
20030043039 Salemi et al. Mar 2003 A1
20030046679 Singleton Mar 2003 A1
20030065937 Watanabe et al. Apr 2003 A1
20030097583 Lacan et al. May 2003 A1
20030105979 Itoh et al. Jun 2003 A1
20030133443 Klinker et al. Jul 2003 A1
20030135744 Almeida Jul 2003 A1
20030177394 Dozortsev Sep 2003 A1
20030182435 Redlich et al. Sep 2003 A1
20030192033 Gartside et al. Oct 2003 A1
20030233421 Shibata et al. Dec 2003 A1
20040003255 Apvrille et al. Jan 2004 A1
20040006715 Skrepetos Jan 2004 A1
20040010686 Goh et al. Jan 2004 A1
20040027601 Ito et al. Feb 2004 A1
20040034794 Mayer et al. Feb 2004 A1
20040054928 Hall Mar 2004 A1
20040064732 Hall Apr 2004 A1
20040088433 Kaler et al. May 2004 A1
20040111482 Bourges-Waldegg et al. Jun 2004 A1
20040117802 Green Jun 2004 A1
20040146006 Jackson Jul 2004 A1
20040172557 Nakae et al. Sep 2004 A1
20040199555 Krachman Oct 2004 A1
20040199566 Carlson et al. Oct 2004 A1
20040199596 Nutkis Oct 2004 A1
20040230572 Omigui Nov 2004 A1
20040255138 Nakae Dec 2004 A1
20050004359 Rai et al. Jan 2005 A1
20050033810 Malcolm Feb 2005 A1
20050038853 Blanc et al. Feb 2005 A1
20050044359 Eriksson et al. Feb 2005 A1
20050058285 Stein et al. Mar 2005 A1
20050060643 Glass et al. Mar 2005 A1
20050116749 Pentakota et al. Jun 2005 A1
20050131990 Jewell Jun 2005 A1
20050132184 Palliyil et al. Jun 2005 A1
20050154885 Viscomi et al. Jul 2005 A1
20050166066 Ahuja et al. Jul 2005 A1
20050172140 Ide Aug 2005 A1
20050198285 Petit Sep 2005 A1
20050204009 Hazarika et al. Sep 2005 A1
20050216749 Brent Sep 2005 A1
20050262208 Haviv et al. Nov 2005 A1
20050272861 Qiao et al. Dec 2005 A1
20050275861 Ferlitsch Dec 2005 A1
20050289181 Deninger et al. Dec 2005 A1
20060005244 Garbow et al. Jan 2006 A1
20060010150 Shaath et al. Jan 2006 A1
20060010209 Hodgson Jan 2006 A1
20060010217 Sood Jan 2006 A1
20060021043 Kaneko et al. Jan 2006 A1
20060026593 Canning et al. Feb 2006 A1
20060031359 Clegg et al. Feb 2006 A1
20060039554 Fry Feb 2006 A1
20060041930 Hafeman et al. Feb 2006 A1
20060050879 Iizuka Mar 2006 A1
20060059548 Hildre et al. Mar 2006 A1
20060070089 Shoaib et al. Mar 2006 A1
20060075040 Chmaytelli Apr 2006 A1
20060075502 Edwards Apr 2006 A1
20060112166 Pettigrew et al. May 2006 A1
20060120526 Boucher et al. Jun 2006 A1
20060123413 Collet et al. Jun 2006 A1
20060123479 Kumar et al. Jun 2006 A1
20060132824 Aritomi Jun 2006 A1
20060168026 Keohane et al. Jul 2006 A1
20060190986 Mont Aug 2006 A1
20060224589 Rowney et al. Oct 2006 A1
20060248252 Kharwa Nov 2006 A1
20070022285 Groth et al. Jan 2007 A1
20070028112 Mackelden et al. Feb 2007 A1
20070029744 Musselman Feb 2007 A1
20070033283 Brown Feb 2007 A1
20070064883 Rosenthal et al. Mar 2007 A1
20070074292 Mimatsu Mar 2007 A1
20070094394 Singh et al. Apr 2007 A1
20070101419 Dawson May 2007 A1
20070110089 Essafi et al. May 2007 A1
20070118904 Goodman et al. May 2007 A1
20070136593 Plavcan et al. Jun 2007 A1
20070143472 Clark et al. Jun 2007 A1
20070143851 Nicodemus et al. Jun 2007 A1
20070174909 Burchett et al. Jul 2007 A1
20070198656 Mazzaferri et al. Aug 2007 A1
20070214220 Alsop et al. Sep 2007 A1
20070220319 Desai et al. Sep 2007 A1
20070245148 Buer Oct 2007 A1
20070256142 Hartung et al. Nov 2007 A1
20070279668 Czyszczewski et al. Dec 2007 A1
20070280112 Zheng et al. Dec 2007 A1
20080034224 Ferren et al. Feb 2008 A1
20080040358 Deng Feb 2008 A1
20080065882 Goodman et al. Mar 2008 A1
20080065903 Goodman et al. Mar 2008 A1
20080079730 Zhang et al. Apr 2008 A1
20080083037 Kruse et al. Apr 2008 A1
20080120689 Morris et al. May 2008 A1
20080170785 Simmons et al. Jul 2008 A1
20080208988 Khouri et al. Aug 2008 A1
20080229428 Camiel Sep 2008 A1
20080262991 Kapoor et al. Oct 2008 A1
20080279381 Narendra et al. Nov 2008 A1
20080309967 Ferlitsch et al. Dec 2008 A1
20090055536 Jo Feb 2009 A1
20090086252 Zucker et al. Apr 2009 A1
20090172786 Backa Jul 2009 A1
20090182931 Gill et al. Jul 2009 A1
20090232300 Zucker et al. Sep 2009 A1
20090327743 Finlayson et al. Dec 2009 A1
20100174784 Levey et al. Jul 2010 A1
20100250547 Grefenstette et al. Sep 2010 A1
20110167265 Ahuja et al. Jul 2011 A1
20110273584 Ogawa Nov 2011 A1
20120011189 Werner et al. Jan 2012 A1
20120191792 Chebiyyam Jul 2012 A1
20130246534 Chebiyyam Sep 2013 A1
20130276061 Chebiyyam et al. Oct 2013 A1
20140115086 Chebiyyam Apr 2014 A1
20140283145 Chebiyyam et al. Sep 2014 A1
Foreign Referenced Citations (3)
Number Date Country
2411330 Aug 2005 GB
WO 02093410 Nov 2002 WO
WO 2006076536 Jul 2006 WO
Non-Patent Literature Citations (99)
Entry
U.S. Appl. No. 11/349,479, filed on Feb. 6, 2006.
Non-Final Rejection in U.S. Appl. No. 11/349,479 mailed on Dec. 8, 2008.
Final Rejection in U.S. Appl. No. 11/349,479 mailed on Jun. 10, 2009.
Notice of Appeal in U.S. Appl. No. 11/349,479, filed on Dec. 12, 2009.
Examiner Interview Summary in U.S. Appl. No. 11/349,479 mailed on Feb. 5, 2010.
Non-Final Rejection in U.S. Appl. No. 11/349,479 mailed on Mar. 22, 2010.
Notice of Allowance in U.S. Appl. No. 11/349,479 mailed on Nov. 8, 2010.
U.S. Appl. No. 11/473,930, filed on Jun. 23, 2006.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Aug. 17, 2009.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Jan. 26, 2010.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Jul. 16, 2010.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Mar. 10, 2011.
Final Office Action in U.S. Appl. No. 11/473,930 mailed on Sep. 14, 2011.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Mar. 1, 2012.
Final Office Action in U.S. Appl. No. 11/473,930 mailed on Aug. 8, 2012.
Non-Final Office Action in U.S. Appl. No. 11/473,930 mailed on Feb. 4, 2013.
Final Office Action in U.S. Appl. No. 11/473,930 mailed on Jul. 16, 2013.
U.S. Appl. No. 11/546,745, filed on Nov. 29, 2006.
Non-Final Office Action in U.S. Appl. No. 11/546,745 mailed on Nov. 2, 2009.
Non-Final Office Action in U.S. Appl. No. 11/546,745 mailed on Apr. 21, 2010.
Final Office Action in U.S. Appl. No. 11/546,745 mailed on Oct. 21, 2010.
Non-Final Office Action in U.S. Appl. No. 11/546,745 mailed on Jan. 19, 2012.
Final Office Action in U.S. Appl. No. 11/546,745 mailed on Jun. 4, 2012.
Non-Final Office Action in U.S. Appl. No. 11/546,745 mailed on Apr. 5, 2013.
Notice of Allowance in U.S. Appl. No. 11/546,745 mailed on Jul. 29, 2013.
U.S. Appl. No. 11/740,884, filed on Apr. 26, 2007.
Non-Final Office Action in U.S. Appl. No. 11/740,884 mailed on May 14, 2009.
Final Office Action in U.S. Appl. No. 11/740,844 mailed on Jan. 11, 2010.
Advisory Action in U.S. Appl. No. 11/740,844 mailed on Mar. 25, 2010.
Non-Final Office Action in U.S. Appl. No. 11/740,844 mailed on Jun. 24, 2010.
Final Office Action in U.S. Appl. No. 11/740,844 mailed on Feb. 18, 2011.
Advisory Action in U.S. Appl. No. 11/740,844 mailed on Apr. 27, 2011.
Non-Final Office Action in U.S. Appl. No. 11/740,844 mailed on Jul. 20, 2011.
Final Office Action in U.S. Appl. No. 11/740,844 mailed on Feb. 16, 2012.
Non-Final Office Action in U.S. Appl. No. 11/740,844 mailed on May 10, 2012.
Final Office Action in U.S. Appl. No. 11/740,844 mailed on Aug. 15, 2012.
Non-Final Office Action in U.S. Appl. No. 11/740,844 mailed on May 3, 2013.
Notice of Allowance in U.S. Appl. No. 11/740,844 mailed on Sep. 5, 2013.
U.S. Appl. No. 11/840,831, filed on Aug. 17, 2007.
Non-Final Office Action in U.S. Appl. No. 11/840,831 mailed on Oct. 12, 2010.
Final Office Action in U.S. Appl. No. 11/840,831 mailed on May 5, 2011.
Non-Final Office Action in U.S. Appl. No. 11/840,831 mailed on Jul. 21, 2011.
Final Office Action in U.S. Appl. No. 11/840,831 mailed on Dec. 21, 2011.
Notice of Allowance in U.S. Appl. No. 11/840,831 mailed on Mar. 16, 2012.
Notice of Allowance in U.S. Appl. No. 11/840,831 mailed on Apr. 3, 2012.
Notice of Allowance in U.S. Appl. No. 11/840,831 mailed on May 9, 2012.
Non-Final Office Action in U.S. Appl. No. 11/905,420 mailed on May 23, 2011.
Final Office Action in U.S. Appl. No. 11/905,420 mailed on Nov. 2, 2011.
Non-Final Office Action in U.S. Appl. No. 11/905,420 mailed on Jul. 23, 2011.
Notice of Allowance in U.S. Appl. No. 11/905,420 mailed on Dec. 6, 2012.
Non-Final Office Action in U.S. Appl. No. 12/076,163 mailed on Apr. 28, 2011.
Final Office Action in U.S. Appl. No. 12/076,163 mailed on Oct. 19, 2011.
Non-Final Office Action in U.S. Appl. No. 12/076,163 mailed on Sep. 4, 2012.
Final Office Action in U.S. Appl. No. 12/076,163 mailed on Mar. 25, 2013.
Non-Final Office Action in U.S. Appl. No. 12/076,163 mailed on Sep. 10, 2013.
Non-Final Office Action in U.S. Appl. No. 12/187,207 mailed on Mar. 25, 2011.
Notice of Allowance in U.S. Appl. No. 12/187,207 mailed on Aug. 24, 2011.
Notice of Allowance in U.S. Appl. No. 12/187,207 mailed on Sep. 11, 2012.
U.S. Appl. No. 13/434,777, filed on Mar. 29, 2012, entitled “System, Method, and Computer Program Product for Determining Whether an Electronic Mail Message is Compliant with Etiquette Policy”, Inventor Gopi Krishna Chebiyyam.
Non-Final Office Action in U.S. Appl. No. 13/434,777 mailed on Aug. 20, 2012.
Final Office Action in U.S. Appl. No. 13/434,777 mailed on Feb. 12, 2013.
Non-Final Office Action in U.S. Appl. No. 13/434,777 mailed on May 23, 2013.
Notice of Allowance in U.S. Appl. No. 13/434,777 mailed on Dec. 17, 2013.
U.S. Appl. No. 11/850,432, filed on Sep. 5, 2007.
U.S. Appl. No. 12/123,370, filed on May 19, 2008.
U.S. Appl. No. 12/102,526, filed Apr. 14, 2008, entitled “Computer Program Product and Method for Permanently Storing Data Based on Whether a Device is Protected with an Encryption Mechanism and Whether Data in a Data Structure Requires Encryption”, inventor Gopi Krishna Chebiyyam.
Non-Final Office Action in U.S. Appl. No. 12/102,526 mailed on Nov. 24, 2010.
Final Office Action in U.S. Appl. No. 12/102,526 mailed on May 25, 2011.
Advisory Action in U.S. Appl. No. 12/102,526 mailed on Aug. 1, 2011.
Notice of Allowance in U.S. Appl. No. 12/102,526 mailed on Sep. 21, 2013.
U.S. Appl. No. 11/210,321, filed on Aug. 23, 2005.
Fumera, G. et al., “Spam Filtering Based on the Analysis of Text Information Embedded into Images,” Journal of Machine Learning Research, Dec. 2006, 22 pages.
Layland, Robin, “Data Leak Prevention: Coming Soon to a Business Near You,” Business Communications Review, May 2007 (pp. 44-49).
Heikkila, Faith M., “Encryption: Security Considerations for Portable Media Devices,” IEEE Computer Society, IEEE Security & Privacy, Jul./Aug. 2007 (pp. 22-27).
ClearContext, www.clearcontext.com/user_guide/; [available online at URL <http://web.archive.org/20061107135010/http://www.clearcontext.com/user_guide/>], Nov. 7, 2006 (pp. 1-24).
Dabbish, et al., “Understanding Email Use: Predicting Action on a Message,” CHI 2005—Papers: Email and Security, Portland Oregon; available online at URL: <http://www.cs.cmu.edu/˜kraut/Rkraut.site.files/articles/dabbish05-UnderstandingEmailUse.pdf>] Apr. 2-7, 2005 (pp. 691-700).
U.S. Appl. No. 14/144,136, filed on Dec. 30, 2013, 27 pages.
Non-Final Office Action in U.S. Appl. No. 14/144,136 mailed Jun. 2, 2014, 39 pages.
Non-Final Office Action for U.S. Appl. No. 13/429,363 mailed on Sep. 23, 2013, 22 pages.
Final Office Action for U.S. Appl. No. 13/429,363 mailed Mar. 21, 2014, 27 pages.
Final Office Action in U.S. Appl. No. 12/076,163 mailed Mar. 18, 2014, 76 pages.
Notice of Allowance in U.S. Appl. No. 12/076,163 mailed Jul. 18, 2014, 27 pages.
Masaru Takesue, “A Scheme for Protecting the Information Leakage via Portable Devices”, International Conference on Emerging Security Information, Systems and Technologies, 0-7695-2989-5/07 © 2007 IEEE, pp. 54-59.
Hangbae Chang et al., “Design of Inside Information Leakage Prevention System in Ubiquitous Computing Environment”, O. Gervasi et al. (Eds.): ICCSA 2005, LNCS 3483, pp. 128-137, 2005 © Springer-Verlag Berlin Heidelberg 2005.
Faith M. Heikkila, “Encryption: Security Considerations for Portable Media Devices”, Cryptography, IEEE Computer Society, 1540-7993/07 © 2007 IEEE, pp. 22-27.
Robin Layland, “Data Leak Prevention: Coming Soon to a Business Near You”, Business Communications Review, May 2007, pp. 44-49.
Mingdi Xu et al., “A New Data Protecting Scheme Based on TPM”, 8th ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing, 0-7695-2909-7/07 © 2007 IEEE, pp. 943-947.
Peter Hannay et al., “Pocket SDV with SD Guardian: A Secure & Forensically Safe Portable Execution Environment”, Edith Cowan University, Australian Digital Forensics Conference, Security Research Institute Conferences, Dec. 3, 2007, http://ro.edu.edu.au/adf/17, 11 pages.
Morejon, Mario, “Review: Remote Desktop Support Out of the Box”, CRN, May 21, 2007, 2 pages.
Non-Final Office Action received for U.S. Appl. No. 11/850,432 mailed on Oct. 7, 2010, 13 pages.
Final Office Action received in U.S. Appl. No. 11/850,432 mailed May 10, 2011, 14 pages.
Non-Final Office Action received for U.S. Appl. No. 11/850,432 mailed on Jul. 16, 2013, 17 pages.
Final Office Action received for U.S. Appl. No. 11/850,432 mailed on Jan. 31, 2014, 19 pages.
Non-Final Office Action received for U.S. Appl. No. 12/102,526 mailed on Feb. 6, 2012, 16 pages.
Final Office Action received for U.S. Appl. No. 11/850,432 mailed on May 10, 2011, 14 pages.
Notice of Allowance in U.S. Appl. No. 14/144,136 mailed Sep. 17, 2014, 12 pages.
Non-Final Office Action in U.S. Appl. No. 12/187,207 mailed on Sep. 17, 2014, 16 pages.
Notice of Allowance in U.S. Appl. No. 12/102,526 mailed on Sep. 12, 2012, 13 pages.
U.S. Appl. No. 14/543,869, filed Nov. 17, 2014, 37 pages.
Related Publications (1)
Number Date Country
20120183174 A1 Jul 2012 US
Continuations (1)
Number Date Country
Parent 11840831 Aug 2007 US
Child 13429363 US