The present disclosure relates to mobile device security.
Mobile devices, such as laptops, mobile phones, wearable computing devices, implantable devices, portable computing devices, etc., are susceptible to unauthorized users accessing data stored on the mobile devices. To illustrate, if an unauthorized user obtains (e.g., steals) a laptop that has information associated with an authorized user of the laptop, the unauthorized user may access the information and use the information for improper purposes. As a non-limiting example, the unauthorized user can steal identity information of the authorized user (e.g., identity theft).
According to one implementation, a computer-implemented method includes generating, at a processor of a mobile device, a user profile for an authorized user of the mobile device based on behavior patterns associated with the authorized user. The behavior patterns are determined based on historical data indicating past user behavior of the authorized user. The computer-implemented method also includes detecting subsequent user behavior of a particular user during an attempt by the particular user to access the mobile device. For example, the subsequent user behavior may include a heart rate of the particular user, a time of day that the particular user attempts to access the mobile device, etc. The computer-implemented method also includes comparing the subsequent user behavior to the behavior patterns of the user profile to determine whether the particular user is authorized or unauthorized. In response to determining that the particular user is an unauthorized user, the computer-implemented method includes detecting activity by the unauthorized user and performing at least one countermeasure of a plurality of countermeasures in response to detecting the activity. Each countermeasure of the plurality of countermeasures has a security level and corresponds to a degree of the activity. Different countermeasures may be performed for similar scenarios. For example, for a particular degree of activity, a first particular countermeasure or a second particular countermeasure may be performed. According to one implementation, a first countermeasure having a first security level protects data stored at the mobile device from the unauthorized user. To illustrate, the first countermeasure may delete data stored at the mobile device. A second countermeasure having a second security level determines information about the unauthorized user. 
To illustrate, the second countermeasure may activate a camera of the mobile device to capture the surroundings of the mobile device, scan a fingerprint of the unauthorized user and compare the fingerprint to one or more fingerprint databases, etc. A third countermeasure having a third security level performs additional security measures or countermeasures. Each countermeasure is triggered by the degree of activity (e.g., improper activity) of the unauthorized user.
According to another implementation, an apparatus includes a memory that stores a user profile for an authorized user of a mobile device. The user profile is generated based on behavior patterns associated with the authorized user. The behavior patterns are determined based on historical data indicating past user behavior of the authorized user. The apparatus also includes a processor coupled to the memory. The processor is configured to detect subsequent user behavior of a particular user during an attempt by the particular user to access the mobile device. The processor is further configured to compare the subsequent user behavior to the behavior patterns of the user profile to determine whether the particular user is authorized or unauthorized. In response to a determination that the particular user is an unauthorized user, the processor is configured to detect activity by the unauthorized user and perform at least one countermeasure of a plurality of countermeasures in response to detecting the activity. Each countermeasure of the plurality of countermeasures has a security level and corresponds to a degree of the activity. Different countermeasures may be performed for similar scenarios. For example, for a particular degree of activity, a first particular countermeasure or a second particular countermeasure may be performed. According to one implementation, a first countermeasure having a first security level protects data stored at the mobile device from the unauthorized user, a second countermeasure having a second security level determines information about the unauthorized user, and a third countermeasure having a third security level performs additional security measures or countermeasures. Each countermeasure is triggered by the degree of activity (e.g., improper activity) of the unauthorized user.
According to another implementation, a computer-readable storage device includes instructions that, when executed by a processor, cause the processor to perform operations including generating a user profile for an authorized user of a mobile device based on user behavior patterns associated with the authorized user. The behavior patterns are determined based on historical data indicating past user behavior of the authorized user. The operations also include detecting subsequent user behavior of a particular user during an attempt by the particular user to access the mobile device. The operations also include comparing the subsequent user behavior to the user behavior patterns of the user profile to determine whether the particular user is authorized or unauthorized. In response to determining that the particular user is an unauthorized user, the operations include detecting activity by the unauthorized user and performing at least one countermeasure of a plurality of countermeasures in response to detecting the activity. Each countermeasure of the plurality of countermeasures has a security level and corresponds to a degree of the activity. Different countermeasures may be performed for similar scenarios. For example, for a particular degree of activity, a first particular countermeasure or a second particular countermeasure may be performed. According to one implementation, a first countermeasure having a first security level protects data stored at the mobile device from the unauthorized user, a second countermeasure having a second security level determines information about the unauthorized user, and a third countermeasure having a third security level performs additional security measures or countermeasures. Each countermeasure is triggered by the degree of activity (e.g., improper activity) of the unauthorized user.
One advantage of the above-described implementations is that a mobile device can detect that an unauthorized user has accessed the mobile device. Different countermeasures can be triggered at the mobile device based on levels of improper activity by the unauthorized user. For example, if the improper activity corresponds to a low-level security breach, the mobile device can delete data to prevent the unauthorized user from accessing the data. If the improper activity corresponds to a higher-level security breach, the mobile device can actively collect data about the unauthorized user while the mobile device is in the unauthorized user's possession. The collected data may be used to identify the location of the mobile device (and thus the location of the unauthorized user) and to determine an identity of the unauthorized user. As a non-limiting example, to determine the location of the mobile device, a camera of the mobile device may be activated to capture the surroundings of the mobile device. To determine the identity of the unauthorized user, a fingerprint scanner on the mobile device may scan a fingerprint of the unauthorized user and compare the fingerprint to one or more fingerprint databases. Thus, the mobile device may perform active measures to identify and locate the unauthorized user. Additionally, the features, functions, and advantages that have been described can be achieved independently in various implementations or may be combined in yet other implementations, further details of which are disclosed with reference to the following description and drawings.
Particular embodiments of the present disclosure are described below with reference to the drawings. In the description, common features are designated by common reference numbers throughout the drawings.
The figures and the following description illustrate specific exemplary embodiments. It will be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles described herein and are included within the scope of the claims that follow this description. Furthermore, any examples described herein are intended to aid in understanding the principles of the disclosure and are to be construed as being without limitation. As a result, this disclosure is not limited to the specific embodiments or examples described below, but is instead defined by the claims and their equivalents.
The present disclosure describes a mobile computing security system that is implemented on a mobile device. The mobile computing security system detects unauthorized use of the mobile device by an unauthorized user (e.g., a nefarious user or “bad actor”). For example, over time, the mobile device logs behavioral information and environmental cues about an authorized user (or authorized users) of the mobile device. The mobile computing security system uses machine learning logic and algorithms to determine behavior patterns of the authorized user and develops a user profile based on the behavior patterns. In response to detecting behavior that varies from the behavior patterns of the user profile, the mobile computing security system determines that an unauthorized user is using the mobile device. For example, the mobile computing security system may determine that an unauthorized user is using the mobile device in response to detecting a variation in location of the mobile device from locations associated with the behavior patterns, a variation in detected biometric measurements (e.g., retinal scan, facial scan, galvanic skin response, heart rate, etc.) of a user from biometric measurements associated with the behavior patterns, a variation in voice signature of a user from voice signatures associated with the behavior patterns, etc.
The mobile computing security system monitors activities and patterns (e.g., nefarious patterns) along with environmental and situational cues to determine a severity of a security breach by the unauthorized user. As a non-limiting example, the mobile computing security system may monitor the activities to determine whether the unauthorized user is attempting to steal identity information (e.g., a social security number) of the authorized user. Based on the severity of the security breach, the mobile computing security system performs countermeasures to protect the authorized user (e.g., protect the identity information). For example, the mobile computing security system can delete the identity information, encrypt the identity information, etc. Additionally, the mobile computing security system can generate false identity information (e.g., a fake social security number) and provide the false identity information to the unauthorized user.
According to some implementations, the mobile computing security system may gather information about the unauthorized user. As non-limiting examples, the mobile computing security system may activate a camera of the mobile device to capture a surrounding environment of the unauthorized user, record audio to receive a voice signature of the unauthorized user and transmit the voice signature over available networks (e.g., cellular networks), scan a fingerprint of the unauthorized user if the unauthorized user touches a particular portion of the mobile device, measure biometric data of the unauthorized user, spoof a wireless network using a hotspot that mines data from other mobile devices in a network range, etc.
Thus, based on the severity of the security breach, the mobile computing security system may perform active countermeasures to protect data stored at the mobile device and to obtain information about the unauthorized user. As a result, in scenarios where immediate action needs to be taken against the unauthorized user, the mobile computing security system may assist to gather information (e.g., the whereabouts and identity) associated with the unauthorized user.
The mobile device 102 is associated with (e.g., belongs to) an authorized user 190. To access data stored at the mobile device 102 and to access applications installed on the mobile device 102, the authorized user 190 provides access data 126 to the user login unit 114. According to one implementation, the access data 126 includes a personal identification number (PIN) or a user password. For example, the user login unit 114 presents a login interface to the authorized user 190, and the authorized user 190 enters the access data 126 at the login interface. The login interface may be presented via a display screen, such as a display 928 in
The activity detector 122 detects activity of the authorized user 190 (e.g., authorized activity 128) while the authorized user 190 is providing the access data 126 and while the authorized user 190 is using the mobile device 102. For example, the activity detector 122 may detect a geographical location of the mobile device 102 while the authorized user 190 is using the mobile device 102, a time when the authorized user 190 is using the mobile device 102, physical behavior patterns (e.g., heart rate, stress level, etc.) of the authorized user 190 while the authorized user 190 is using the mobile device 102, etc. According to some implementations, the authorized activity 128 also indicates data that the authorized user 190 accesses while using the mobile device 102, applications that the authorized user 190 uses, etc. For example, the authorized activity 128 may indicate websites that the authorized user 190 frequently visits, applications that the authorized user 190 frequently uses, etc.
The data logging unit 116 logs the authorized activity 128 of the authorized user 190 over time as past user behavior 130. The past user behavior 130 may be stored at the mobile device 102 or stored at a remote device (e.g., a remote server or “cloud”). The past user behavior 130 is historical data indicating past activities of the authorized user 190. The past user behavior 130 is collected over time (e.g., days, months, or years). The processor 106 uses the past user behavior 130 logged by the data logging unit 116 to determine authorized user behavior patterns 132 of the authorized user 190. For example, the processor 106 retrieves the machine learning algorithm 110 from the memory 104 and provides the past user behavior 130 as an input to the machine learning algorithm 110. The processor 106 executes the machine learning algorithm 110 to determine the authorized user behavior patterns 132. The authorized user behavior patterns 132 include geographical location patterns of the authorized user 190, mobile device usage time patterns of the authorized user 190, typing speed patterns of the authorized user 190, data-type patterns of the authorized user 190, biometric data patterns (e.g., heart rate patterns, stress level patterns, sleep patterns, etc.), and other user behavior patterns.
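The pattern-derivation step above can be sketched as follows. This is a minimal statistical sketch rather than the machine learning algorithm 110 itself, and the record fields (`hour`, `location`, `typing_speed_cps`, `heart_rate`) are illustrative assumptions, not names from the disclosure:

```python
from statistics import mean, stdev

def build_behavior_patterns(past_user_behavior):
    """Derive simple behavioral baselines from logged activity records.

    Each record is a dict of the assumed form:
    {"hour": 14, "location": "home", "typing_speed_cps": 5.2, "heart_rate": 72}
    """
    hours = [r["hour"] for r in past_user_behavior]
    speeds = [r["typing_speed_cps"] for r in past_user_behavior]
    rates = [r["heart_rate"] for r in past_user_behavior]
    return {
        # Locations where the authorized user has historically used the device.
        "known_locations": {r["location"] for r in past_user_behavior},
        # Typical usage hours and their spread.
        "hour_mean": mean(hours), "hour_stdev": stdev(hours),
        # Typing-speed and biometric baselines.
        "typing_mean": mean(speeds), "typing_stdev": stdev(speeds),
        "heart_rate_mean": mean(rates), "heart_rate_stdev": stdev(rates),
    }

past_log = [
    {"hour": 9,  "location": "home",   "typing_speed_cps": 5.0, "heart_rate": 70},
    {"hour": 13, "location": "office", "typing_speed_cps": 5.4, "heart_rate": 74},
    {"hour": 20, "location": "home",   "typing_speed_cps": 4.8, "heart_rate": 68},
]
patterns = build_behavior_patterns(past_log)
```

A production system would replace these summary statistics with a trained model, but the output plays the same role as the authorized user behavior patterns 132: a compact baseline against which later behavior can be compared.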
The user profile generator 118 generates a user profile 135 for the authorized user 190 based on the authorized user behavior patterns 132. The user profile 135 includes information that is unique to the authorized user 190 (based on past behavior and activity) and that may be used to determine whether an unauthorized user 192 has accessed (e.g., stolen) the mobile device 102, as described below. For example, the user profile 135 indicates geographical locations where the authorized user 190 has accessed the mobile device 102, geographical locations where the authorized user 190 is likely to access the mobile device 102, time periods when the authorized user 190 has accessed the mobile device 102, time periods when the authorized user 190 is likely to access the mobile device 102, physical attributes (e.g., heart-rate, stress level, etc.) of the authorized user 190, etc.
If the mobile device 102 determines that a potential unknown user is attempting to access the mobile device 102, the mobile device 102 detects subsequent user behavior 134 of the potential unknown user and compares the subsequent user behavior 134 to the authorized user behavior patterns 132 of the user profile 135. To illustrate, if the user login unit 114 determines that the access data 126 has been manipulated or falsified, the user login unit 114 may determine that a potential unknown user is attempting to access the mobile device 102. As a non-limiting example, if the access data 126 is a PIN and it takes more than ten seconds for the PIN to be entered, the user login unit 114 may determine that the access data 126 has been manipulated or falsified. Alternatively, if the typing speed at which the PIN is entered significantly differs from the typing speed of the authorized user 190 (indicated in the user profile 135), the user login unit 114 may determine that the access data 126 has been manipulated or falsified. As another non-limiting example, if the user login unit 114 detects cyber activity or cyber threats while the access data 126 is provided, the user login unit 114 may determine that the access data 126 has been manipulated or falsified. To illustrate, if the user login unit 114 determines that the access data 126 is computer-generated data, the user login unit 114 may determine that the access data 126 has been manipulated or falsified.
According to one implementation, if a particular user enters the PIN into the user login unit 114 and the processor 106 determines that the particular user is not following historical norms (e.g., a typing speed historically used to enter the PIN, a login time when the PIN is historically entered, a login location where the PIN is historically entered, etc.), the processor 106 may activate a camera to capture an image of the particular user. Alternatively or in addition, the processor 106 may activate a microphone to record a voice sample of the particular user. The processor 106 may compare the image, the voice sample, or both, to information associated with the authorized user 190. In response to a determination that the camera is blocked (e.g., unable to capture an image of the particular user), the processor 106 may increase a security level. For example, the processor 106 may indicate that the particular user is an unauthorized user 192 and that the unauthorized user 192 is involved in improper activity.
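A hedged sketch of this login-time check follows. Apart from the ten-second PIN-entry limit described above, all field names and thresholds here are illustrative assumptions rather than values from the disclosure:

```python
def assess_login_attempt(patterns, pin_entry_seconds, typing_speed_cps,
                         hour, location, camera_blocked=False):
    """Return a list of anomaly flags for a login attempt.

    An empty list means the attempt is consistent with the authorized
    user's historical norms; each flag marks one deviation.
    """
    flags = []
    # A PIN that takes more than ten seconds to enter may be guessed or scripted.
    if pin_entry_seconds > 10:
        flags.append("slow_pin_entry")
    # Typing speed far from the historical baseline suggests a different user.
    if abs(typing_speed_cps - patterns["typing_mean"]) > 3 * patterns["typing_stdev"]:
        flags.append("typing_speed_mismatch")
    # Access from an unrecognized location or at an atypical hour.
    if location not in patterns["known_locations"]:
        flags.append("unknown_location")
    if abs(hour - patterns["hour_mean"]) > 2 * patterns["hour_stdev"]:
        flags.append("unusual_hour")
    # A blocked camera during verification escalates the concern.
    if camera_blocked:
        flags.append("camera_blocked")
    return flags

# Illustrative profile standing in for the user profile 135.
profile = {"typing_mean": 5.0, "typing_stdev": 0.3,
           "hour_mean": 14, "hour_stdev": 3,
           "known_locations": {"home", "office"}}
flags = assess_login_attempt(profile, pin_entry_seconds=14, typing_speed_cps=2.0,
                             hour=3, location="unknown", camera_blocked=True)
```

Each flag could then feed into the security-level determination described later, with more flags (or more heavily weighted flags) corresponding to a higher security level.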
In response to a determination that a potential unknown user has accessed the mobile device 102, the activity detector 122 may detect activity of the potential unknown user and the data logging unit 116 may log the activity as the subsequent user behavior 134. The comparison unit 120 may compare the subsequent user behavior 134 to the authorized user behavior patterns 132 of the user profile 135. If the deviation between the subsequent user behavior 134 and the authorized user behavior patterns 132 satisfies a threshold, the comparison unit 120 determines that an unauthorized user 192 has accessed the mobile device 102. As a non-limiting example, if the subsequent user behavior 134 indicates the mobile device 102 is accessed in a location that is not recognized by the authorized user behavior patterns 132 and at a time that is not consistent with access times in the authorized user behavior patterns 132, the comparison unit 120 initiates a process to determine whether an unauthorized user 192 has accessed the mobile device 102. After the process is initiated, the comparison unit 120 uses other metrics (e.g., biometric measurements) of the subsequent user behavior 134 to determine whether an unauthorized user 192 has accessed the mobile device 102. In an illustrative example, the comparison unit 120 compares the heart-rate indicated by the subsequent user behavior 134, the stress level indicated by the subsequent user behavior 134, and other biometric measurements indicated by the subsequent user behavior 134 to corresponding metrics of the authorized user behavior patterns 132 to determine whether an unauthorized user 192 has accessed the mobile device 102. 
According to some implementations, the activity detector 122 and the comparison unit 120 may also determine, after the login process, that an unauthorized user 192 has accessed the mobile device 102 if activity associated with the mobile device 102 significantly differs from the authorized user behavior patterns 132 of the user profile 135. The activity “significantly differs” from the authorized user behavior patterns 132 if a particular number of weighted user behavior metrics differ from corresponding metrics of the past user behavior 130 by a threshold.
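The weighted-metric test described in this paragraph might be sketched as follows; the specific metrics, weights, tolerances, and threshold are illustrative assumptions:

```python
def significantly_differs(observed, baseline, weights, tolerances, weight_threshold):
    """Return True if enough weighted metrics deviate beyond their tolerances.

    A metric "deviates" when the observed value differs from the baseline
    value by more than that metric's tolerance; the decision is based on
    the accumulated weight of deviating metrics.
    """
    deviating_weight = 0.0
    for metric, weight in weights.items():
        if abs(observed[metric] - baseline[metric]) > tolerances[metric]:
            deviating_weight += weight
    return deviating_weight >= weight_threshold

baseline   = {"heart_rate": 70, "typing_speed_cps": 5.0, "hour": 14}
tolerances = {"heart_rate": 15, "typing_speed_cps": 1.0, "hour": 4}
# Typing speed is weighted more heavily than the other metrics.
weights    = {"heart_rate": 1.0, "typing_speed_cps": 2.0, "hour": 1.0}

observed = {"heart_rate": 95, "typing_speed_cps": 2.5, "hour": 3}
breach = significantly_differs(observed, baseline, weights, tolerances, 3.0)
```

Weighting the metrics lets a single strong signal (such as an unfamiliar typing cadence) outweigh several weak ones, which matches the idea that some behavior metrics are more distinctive of a user than others.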
In response to a determination that the unauthorized user 192 has accessed the mobile device 102, the activity detector 122 detects activity by the unauthorized user 192 (e.g., unauthorized activity 136). For example, the activity detector 122 detects whether the unauthorized user 192 is accessing data stored at the mobile device 102, detects whether the mobile device 102 is near unknown networks, detects whether the unauthorized user 192 is attempting to access identity information of the authorized user 190, etc.
The processor 106 determines a security level that the unauthorized activity 136 satisfies (e.g., breaches) and performs an appropriate countermeasure. For example, the security level data 112 includes data indicating activities associated with a breach of a first security level 138, data indicating activities associated with a breach of a second security level 140, and data indicating activities associated with a breach of a third security level 142. Although three security levels 138-142 are illustrated in
The processor 106 determines the security levels 138-142 that are breached by the unauthorized activity 136 to determine the appropriate countermeasure to perform. For example, the countermeasure unit 124 is configured to perform a first countermeasure 144, a second countermeasure 146, and a third countermeasure 148. Although three countermeasures 144-148 are illustrated in
The countermeasures may be performed sequentially based on the unauthorized activity 136. For example, if the countermeasure unit 124 determines that the unauthorized activity 136 satisfies the first security level 138, the countermeasure unit 124 performs the first countermeasure 144. As described with respect to
It should be understood that the activity detector 122 may continuously monitor the unauthorized activity 136. For example, the unauthorized activity 136 may increase (or decrease) in severity and the activity detector 122 may monitor the change in severity. Thus, the second countermeasure 146, the third countermeasure 148, or both, may be triggered in response to a change in the unauthorized activity 136. Additionally, it should be understood that the countermeasures described with respect to
Referring to
The first countermeasure 144a protects information stored at the mobile device 102 from the unauthorized user 192. For example, authorized user data 202 is stored at the memory 104. In response to determining that the unauthorized activity 136 satisfies the first security level 138, the countermeasure unit 124 deletes the authorized user data 202 from the memory 104, as illustrated in
Referring to
The first countermeasure 144b protects information stored at the mobile device 102 from the unauthorized user 192. For example, in the example provided in
To perform the first countermeasure 144b, the processor 106 detects a request 306 for the authorized user data 202 from the unauthorized user 192. For example, if the unauthorized user 192 attempts to access the authorized user data 202 from the memory 104, the processor 106 may receive the request 306. In response to receiving the request, the processor 106 may generate the false data 304 and provide the false data 304 to the unauthorized user 192. Thus, the first countermeasure 144b prevents the unauthorized user 192 from receiving the authorized user data 202 (e.g., protects the authorized user data 202 from the unauthorized user 192). Additionally, upon use by the unauthorized user 192, the false data 304 may notify other parties of activities by the unauthorized user 192. For example, a bank or other financial entity may be notified if the unauthorized user 192 uses the fake credit card number.
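One possible sketch of this false-data countermeasure is shown below, using Python's `secrets` module for randomness; the data keys and honeytoken formats are illustrative assumptions, not part of the disclosure:

```python
import secrets

SENSITIVE_KEYS = {"ssn", "credit_card"}

def generate_false_data(key):
    """Produce plausible-looking but fake identity data (a honeytoken)."""
    if key == "ssn":
        return (f"{secrets.randbelow(900) + 100:03d}-"
                f"{secrets.randbelow(100):02d}-"
                f"{secrets.randbelow(10000):04d}")
    if key == "credit_card":
        return " ".join(f"{secrets.randbelow(10000):04d}" for _ in range(4))
    return "<redacted>"

def handle_data_request(key, store, breach_active, audit_log):
    """Serve real data normally; during a breach, serve false data instead.

    The served honeytoken is logged so that third parties (e.g., a bank)
    can be alerted if the fake value is ever used.
    """
    if breach_active and key in SENSITIVE_KEYS:
        fake = generate_false_data(key)
        audit_log.append((key, fake))
        return fake
    return store[key]

store = {"ssn": "123-45-6789"}  # placeholder "real" data
audit_log = []
served = handle_data_request("ssn", store, breach_active=True, audit_log=audit_log)
```

Logging each honeytoken is what makes the notification step possible: any later use of a logged fake value can be traced back to the breach during which it was served.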
Referring to
The second countermeasure 146a enables the mobile device 102 to determine information associated with the unauthorized user 192. For example, performing the second countermeasure 146a includes activating a camera 404 of the mobile device 102 to determine a surrounding environment of the unauthorized user 192. To illustrate, in the example provided in
Additionally, the camera activation unit 402 may activate the camera 404 in such a manner that the camera 404 captures an image of the face of the unauthorized user 192. The transmitter 410 may transmit the image to the above-mentioned authorities, and the authorities may perform facial recognition to determine the identity of the unauthorized user 192.
Referring to
The second countermeasure 146b enables the mobile device 102 to determine information (e.g., identity information) about the unauthorized user 192. For example, the processor 106 may prompt the unauthorized user 192 to touch a particular portion of the mobile device 102 that includes a scanner 502. The scanner may be integrated into an input device or keypad, such as the input device 978 of
After the fingerprint 504 is scanned, an identification of the unauthorized user 192 is determined based on the fingerprint 504. According to one implementation, the transmitter 410 sends the fingerprint 504 to the above-mentioned authorities to assist the authorities in identifying the unauthorized user 192. According to another implementation, the processor 106 accesses one or more fingerprint databases (stored at the mobile device 102 or stored at a remote server) and compares the fingerprint 504 to fingerprints in the databases. If a match is determined, the processor 106 determines the identity of the unauthorized user 192, and the transmitter 410 sends information associated with the identity to the appropriate authorities. Thus, the second countermeasure 146b enables the mobile device 102 to determine the identity of the unauthorized user 192 by prompting the unauthorized user 192 to touch the scanner 502 and “secretly” scanning the fingerprint 504 of the unauthorized user 192 without the unauthorized user 192 knowing.
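A toy sketch of the database comparison follows, assuming the fingerprint has already been reduced to a numeric feature vector; real fingerprint matchers use minutiae-based algorithms rather than cosine similarity, so this only illustrates the threshold-and-best-match structure:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_fingerprint(scan, database, threshold=0.95):
    """Return the identity of the best match at or above threshold, else None."""
    best_id, best_score = None, threshold
    for identity, template in database.items():
        score = cosine_similarity(scan, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Illustrative template database keyed by identity.
database = {
    "person_a": [0.9, 0.1, 0.4],
    "person_b": [0.1, 0.8, 0.2],
}
match = match_fingerprint([0.9, 0.1, 0.4], database)
```

Returning `None` below the threshold matters here: an inconclusive match should be reported as such rather than naming the closest person in the database.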
Referring to
The second countermeasure 146c enables the mobile device 102 to determine information (e.g., identity information) about the unauthorized user 192. For example, the processor 106 may prompt the unauthorized user 192 to speak into a microphone of the mobile device 102. The processor 106 may generate voice characteristics 604 (e.g., a voice signature) of the unauthorized user 192 while the unauthorized user 192 speaks into the microphone.
After the voice characteristics 604 are generated, an identification of the unauthorized user 192 is determined based on the voice characteristics 604. According to one implementation, the transmitter 410 sends the voice characteristics 604 to the above-mentioned authorities to assist the authorities in identifying the unauthorized user 192. According to another implementation, the processor 106 accesses one or more voice databases and compares the voice characteristics 604 to voices in the databases. If a match is determined, the processor 106 determines the identity of the unauthorized user 192, and the transmitter 410 sends information associated with the identity to the appropriate authorities. Thus, the second countermeasure 146c enables the mobile device 102 to determine the identity of the unauthorized user 192 by prompting the unauthorized user 192 to speak and “secretly” generating the voice characteristics 604 (e.g., the voice signature) of the unauthorized user 192.
The second countermeasures 146a-146c described with respect to
The techniques described with respect to
Referring to
The method 700 includes generating, at a processor of a mobile device, a user profile for an authorized user of the mobile device based on a plurality of authorized user behavior patterns associated with the authorized user, at 702. The plurality of authorized user behavior patterns is determined based on historical data indicating past user behavior of the authorized user. The historical data includes geographical locations where the authorized user accesses the mobile device, time periods when the authorized user accesses the mobile device, physical behavior patterns of the authorized user, or a combination thereof. For example, referring to
The method 700 also includes detecting subsequent user behavior of a particular user during an attempt by the particular user to access the mobile device, at 704. For example, referring to
The method 700 also includes comparing the subsequent user behavior to the plurality of authorized user behavior patterns of the user profile to determine whether the particular user is the authorized user or an unauthorized user, at 706. For example, referring to
The method 700 also includes determining that the particular user is the unauthorized user based on the comparison, at 708. The method 700 also includes detecting activity by the unauthorized user, at 710. For example, referring to
The method 700 also includes performing at least one countermeasure of a plurality of countermeasures in response to detecting the activity, at 712. Each countermeasure of the plurality of countermeasures has a different security level and corresponds to a degree of the activity. For example, referring to
According to the method 700, performing the at least one countermeasure also includes determining that the detected activity 136 satisfies the second security level 140 and performing the second countermeasure 146 of the plurality of countermeasures in response to determining that the detected activity 136 satisfies the second security level 140. The second countermeasure 146 enables the mobile device 102 to determine information associated with the unauthorized user 192, as described with respect to
The method 700 of
Referring to
At 802, the countermeasure unit 124 determines whether the detected activity 136 satisfies the first security level 138. If the detected activity 136 does not satisfy the first security level 138, the method 800 ends, at 804. It should be understood, however, that the detected activity 136 is continuously monitored if the method 800 ends and a further determination at 802 may be performed if the security level associated with the detected activity 136 increases. If the detected activity 136 satisfies the first security level 138, the countermeasure unit 124 performs the first countermeasure 144, at 806.
After the first countermeasure 144 is performed, the countermeasure unit 124 determines whether the detected activity 136 satisfies the second security level 140, at 808. If the detected activity 136 does not satisfy the second security level 140, the method 800 ends, at 810. It should be understood, however, that the detected activity 136 is continuously monitored if the method 800 ends and a further determination at 802 may be performed if the security level associated with the detected activity 136 increases. If the detected activity 136 satisfies the second security level 140, the countermeasure unit 124 performs the second countermeasure 146, at 812.
After the second countermeasure 146 is performed, the countermeasure unit 124 determines whether the detected activity 136 satisfies the third security level 142, at 814. If the detected activity 136 does not satisfy the third security level 142, the method 800 ends, at 816. It should be understood, however, that the detected activity 136 is continuously monitored if the method 800 ends, and a further determination at 802 or 808 may be performed if the security level associated with the detected activity 136 increases. If the detected activity 136 satisfies the third security level 142, the countermeasure unit 124 performs the third countermeasure 148, at 818. The third countermeasure 148 may include more serious measures and usually involves notifying law enforcement.
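The sequential determinations of method 800 (802 through 818) can be sketched as a cascade of threshold tests. The following is a minimal, hypothetical Python illustration; the integer security levels and the string labels for countermeasures 144, 146, and 148 are assumptions, and the step numbers appear only as comments:

```python
def method_800(detected_level: int) -> list[str]:
    """Escalating countermeasure flow of method 800 (illustrative)."""
    performed = []

    # 802: does the detected activity satisfy the first security level?
    if detected_level < 1:
        return performed             # 804: end (activity remains monitored)
    performed.append("countermeasure 144")   # 806

    # 808: does the detected activity satisfy the second security level?
    if detected_level < 2:
        return performed             # 810: end
    performed.append("countermeasure 146")   # 812

    # 814: does the detected activity satisfy the third security level?
    if detected_level < 3:
        return performed             # 816: end
    performed.append("countermeasure 148")   # 818
    return performed
```

Because each check is reached only after the previous countermeasure is performed, higher-level activity triggers every lower-level countermeasure as well.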
Referring to
In a particular implementation, the mobile device 102 includes the processor 106, such as a central processing unit (CPU), coupled to the memory 104. The memory 104 includes instructions 960 (e.g., executable instructions) such as computer-readable instructions or processor-readable instructions. The instructions 960 may include one or more instructions that are executable by a computer, such as the processor 106.
The mobile device 102 may include a display controller 926 that is coupled to the processor 106 and to a display 928. A coder/decoder (CODEC) 934 may also be coupled to the processor 106. An output device 930 (e.g., one or more loudspeakers) and a microphone 948 may be coupled to the CODEC 934. The CODEC 934 may include a digital-to-analog converter (DAC) 902 and an analog-to-digital converter (ADC) 904.
In some implementations, the processor 106, the display controller 926, the memory 104, the CODEC 934, the network interface 932, and the transmitter 410 are included in a system-in-package or system-on-chip device 922. In some implementations, an input device 978 and a power supply 944 are coupled to the system-on-chip device 922. The input device 978 includes the scanner 502. Moreover, in a particular implementation, as illustrated in
In an illustrative implementation, the memory 104 includes or stores the instructions 960 (e.g., executable instructions), such as computer-readable instructions or processor-readable instructions. For example, the memory 104 may include or correspond to a non-transitory computer-readable medium storing the instructions 960. The instructions 960 may include one or more instructions that are executable by a computer, such as the processor 106. The instructions 960 may cause the processor 106 to perform the method 700 of
Referring to
For example, in response to a determination that the detected activity 136 satisfies the third security level, the dedicated countermeasure hardware 1000 may generate one or more jamming signals, generate an electromagnetic pulse (EMP), generate an electrical shocking signal, destroy read-only memories (e.g., flash drives) at the mobile device 102a, arm the mobile device 102a, etc. According to one implementation, the dedicated countermeasure hardware 1000 may include a self-destruct processor (not shown) that destroys a non-volatile memory. The dedicated countermeasure hardware 1000 may be integrated into dedicated defensive devices.
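A hypothetical dispatch of the dedicated countermeasure hardware 1000 at the third security level might be sketched as follows. The function names and returned strings are illustrative stubs standing in for the jamming, EMP, and memory-destruct actions described above; nothing here is specified by the disclosure:

```python
THIRD_SECURITY_LEVEL = 3  # assumed numeric encoding of level 142

def trigger_jamming() -> str:
    # Stub for generating one or more jamming signals.
    return "jamming signal emitted"

def trigger_emp() -> str:
    # Stub for generating an electromagnetic pulse.
    return "EMP generated"

def destroy_memory() -> str:
    # Stub for the self-destruct processor destroying non-volatile memory.
    return "non-volatile memory destroyed"

HARDWARE_ACTIONS = [trigger_jamming, trigger_emp, destroy_memory]

def dispatch_hardware(detected_level: int) -> list[str]:
    """Invoke the dedicated hardware only at the third security level."""
    if detected_level < THIRD_SECURITY_LEVEL:
        return []
    return [action() for action in HARDWARE_ACTIONS]
```

Gating the dispatch on the highest level mirrors the escalation of method 800, in which the most destructive actions are reached last.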
The illustrations of the examples described herein are intended to provide a general understanding of the structure of the various implementations. The illustrations are not intended to serve as a complete description of all of the elements and features of apparatus and systems that utilize the structures or methods described herein. Many other implementations may be apparent to those of skill in the art upon reviewing the disclosure. Other implementations may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. For example, method operations may be performed in a different order than shown in the figures or one or more method operations may be omitted. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.
The steps of a method or algorithm described in connection with the implementations disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of non-transitory storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a computing device or a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a computing device or user terminal. A storage device is not a signal.
Moreover, although specific examples have been illustrated and described herein, it should be appreciated that any subsequent arrangement designed to achieve the same or similar results may be substituted for the specific implementations shown. This disclosure is intended to cover any and all subsequent adaptations or variations of various implementations. Combinations of the above implementations, and other implementations not specifically described herein, will be apparent to those of skill in the art upon reviewing the description.
The Abstract of the Disclosure is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, various features may be grouped together or described in a single implementation for the purpose of streamlining the disclosure. Examples described above illustrate but do not limit the disclosure. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the present disclosure. As the following claims reflect, the claimed subject matter may be directed to less than all of the features of any of the disclosed examples. Accordingly, the scope of the disclosure is defined by the following claims and their equivalents.
Number | Name | Date | Kind |
---|---|---|---|
6981155 | Lyle | Dec 2005 | B1 |
7042852 | Hrastar | May 2006 | B2 |
7096499 | Munson | Aug 2006 | B2 |
7647645 | Edeki | Jan 2010 | B2 |
7665134 | Hernacki | Feb 2010 | B1 |
8244532 | Begeja | Aug 2012 | B1 |
8528091 | Bowen | Sep 2013 | B2 |
8549640 | Lyle | Oct 2013 | B2 |
8549643 | Shou | Oct 2013 | B1 |
8595834 | Xie | Nov 2013 | B2 |
8732829 | Johnson | May 2014 | B2 |
8843767 | Hars | Sep 2014 | B2 |
8875255 | Dotan | Oct 2014 | B1 |
9009829 | Stolfo | Apr 2015 | B2 |
9015075 | Hughes | Apr 2015 | B2 |
9088903 | Kim | Jul 2015 | B2 |
9154515 | Russo | Oct 2015 | B1 |
9185083 | Glatfelter et al. | Nov 2015 | B1 |
9185095 | Moritz | Nov 2015 | B1 |
9342674 | Abdallah | May 2016 | B2 |
9356942 | Joffe | May 2016 | B1 |
9426139 | McClintock | Aug 2016 | B1 |
9529987 | Deutschmann | Dec 2016 | B2 |
9531710 | Deutschmann | Dec 2016 | B2 |
9838405 | Guo | Dec 2017 | B1 |
10142794 | Diamanti | Nov 2018 | B1 |
10331937 | Goldberg | Jun 2019 | B2 |
20020157021 | Sorkin | Oct 2002 | A1 |
20030051026 | Carter | Mar 2003 | A1 |
20040111636 | Baffes | Jun 2004 | A1 |
20040193892 | Tamura | Sep 2004 | A1 |
20060026682 | Zakas | Feb 2006 | A1 |
20070180516 | Aoki | Aug 2007 | A1 |
20070226795 | Conti | Sep 2007 | A1 |
20070276521 | Harris | Nov 2007 | A1 |
20080168135 | Redlich | Jul 2008 | A1 |
20090150631 | Wilsey | Jun 2009 | A1 |
20100050268 | Sheymov | Feb 2010 | A1 |
20100207721 | Nakajima et al. | Aug 2010 | A1 |
20110154438 | Price | Jun 2011 | A1 |
20110276597 | Little | Nov 2011 | A1 |
20120151121 | Braga | Jun 2012 | A1 |
20130091539 | Khurana | Apr 2013 | A1 |
20130097701 | Moyle | Apr 2013 | A1 |
20130104236 | Ray | Apr 2013 | A1 |
20130111600 | Guenther | May 2013 | A1 |
20140075570 | Hsu et al. | Mar 2014 | A1 |
20140096229 | Burns | Apr 2014 | A1 |
20140143404 | Kennedy | May 2014 | A1 |
20140143873 | Stirtzinger | May 2014 | A1 |
20140201526 | Burgess | Jul 2014 | A1 |
20140215550 | Adams | Jul 2014 | A1 |
20150033306 | Dickenson | Jan 2015 | A1 |
20150040226 | Barau | Feb 2015 | A1 |
20150264069 | Beauchesne | Sep 2015 | A1 |
20150288687 | Heshmati | Oct 2015 | A1 |
20150317475 | Aguayo Gonzalez | Nov 2015 | A1 |
20150350914 | Baxley | Dec 2015 | A1 |
20160014146 | Nakata | Jan 2016 | A1 |
20160021081 | Caceres | Jan 2016 | A1 |
20160065539 | Mermelstein | Mar 2016 | A1 |
20160080403 | Cunningham | Mar 2016 | A1 |
20160080415 | Wu | Mar 2016 | A1 |
20160120075 | Logan | Apr 2016 | A1 |
20160127931 | Baxley | May 2016 | A1 |
20160182543 | Aabye | Jun 2016 | A1 |
20160224777 | Rebelo | Aug 2016 | A1 |
20160275277 | Huang | Sep 2016 | A1 |
20160283715 | Duke | Sep 2016 | A1 |
20160294837 | Turgeman | Oct 2016 | A1 |
20160366161 | Mehta | Dec 2016 | A1 |
20170070521 | Bailey | Mar 2017 | A1 |
20170161746 | Cook | Jun 2017 | A1 |
20170310702 | Chantz | Oct 2017 | A1 |
20170359370 | Humphries | Dec 2017 | A1 |
20180285562 | Radhakrishnan | Oct 2018 | A1 |
20180308100 | Haukioja | Oct 2018 | A1 |
20190020669 | Glatfelter et al. | Jan 2019 | A1 |
20190052675 | Krebs | Feb 2019 | A1 |
Entry |
---|
Arslan, et al., “A Review on Mobile Threats and Machine Learning Based Detection Approaches”, IEEE Xplore, May 19, 2016, 7 pages. |
Communication pursuant to Article 94(3) EPC for Application No. 18182208.1 dated Apr. 28, 2021, 6 pgs. |
Singapore Written Opinion for Application No. 10201805558U dated Jun. 10, 2021, pp. 1-10. |
Number | Date | Country | |
---|---|---|---|
20190020676 A1 | Jan 2019 | US |