This invention relates generally to door lock systems, and more particularly to an intelligent door lock system that is coupled to or includes a camera, where the camera is configured to decide whether or not to record in response to door lock status.
Existing security systems for homes and commercial properties feature multiple video cameras connected to a security box. The security box contains electronics to convert analog video and optional audio inputs to digital form and performs audio and video compression with a System-On-Chip (SoC) processor, which then stores the results on a hard disk. The system can be programmed for continuous recording in a loop, recording upon a trigger caused by an external alarm or a scene-change threshold, or timed scheduled recording. The cameras are connected by cabling and video is transmitted as analog signals to the main system. Such cabling makes it difficult to install the multiple cameras inside and outside a residence or commercial property because of the routing of long cabling between a user-accessible box and the cameras. Such a system provides 240 frames-per-second capture, which is divided among the multiple cameras. For an 8-camera system, each camera's video is captured at 240/8, or 30 fps, but capture resolution is usually low, at CIF resolution (352×240). Such a security box can display captured video live from the cameras, or from the hard disk, on a monitor or TV, and user functions are controlled by front-panel buttons or an infrared remote-control unit (RCU). This means such a security box must be located near a TV and be visible for RCU operation. Such a system also provides means for remote viewing over the Internet, and can also send email messages with snapshots of video when an alarm trigger occurs. However, such a system has significant vulnerabilities. If the Internet is not working at the time of an intrusion because phone or Internet cables have been externally cut, then no such email can be sent. A thief can easily remove or damage the whole security box, which removes all security data.
Other existing video security systems use a networked approach where multiple camera units are connected to a PC or laptop computer over a local area network or wide-area network. For example, 9 wireless camera units can connect to a PC computer using Ethernet wires or 802.11 wireless communications. In this case, each camera unit contains a video camera, video compression, and a network interface. Existing systems use JPEG, MPEG-2, or MPEG-4 compression, but in the future new designs will probably extend to the advanced H.264 video compression standard as well. If there is no local computer, it is also possible to connect the cameras to a router connected to a WAN gateway, so that multiple security video channels can be streamed to a remote PC or laptop. The remote PC or laptop can perform remote viewing or recording of one or multiple channels on its hard disk storage. One of the disadvantages of such a security system is that if Internet access is deliberately interrupted at the time of a security event, then it is not possible to stream the data for the event to the remote PC for recording.
If the PC is located locally, then it could easily be removed by the perpetrators. Furthermore, such a system requires continuous transmission of multiple video streams over local and wide area networks, which places a considerable load on such networks, causing unreliable operation and slowing other network activity. Cabled systems using Ethernet cabling also require difficult cabling of multiple camera units. Units configured to use 802.11g contend for bandwidth, and suffer collisions, with other systems, cordless phones, microwave ovens, and other wireless communication systems on a limited number of channels. Thus, it becomes difficult and unreliable to transfer a plurality of live compressed video streams in real time without interruptions. Such systems also consume considerable energy.
There is a need for an intelligent door lock system. There is a further need for an improved intelligent door lock system with a camera that begins recording in response to door lock status.
An object of the present invention is to provide an intelligent door lock system coupled to or including a camera configured to decide whether or not to record.
Another object of the present invention is to provide an intelligent door lock system coupled to or including a camera configured to begin recording in response to door lock status.
A further object of the present invention is to provide an intelligent door lock system that includes a camera that begins recording as soon as a door is opened and continues until the door is closed and locked.
Yet another object of the present invention is to provide an intelligent door lock system with a lock that communicates directly to a camera.
Another object of the present invention is to provide an intelligent door lock system with a lock that communicates with a camera via Bluetooth.
Yet another object of the present invention is to provide an intelligent door lock system with a lock that communicates with a camera, where the camera begins recording as soon as the door is open and continues recording until the door is closed.
These and other objects of the present invention are achieved in an intelligent door lock system coupled to a door at a dwelling. A door status device is at the dwelling. The door status device is coupled to a drive shaft of a lock device to assist in locking and unlocking a lock of a lock device at the door. The lock device is coupled to the door status device and includes a bolt. An engine, an energy source and a memory are coupled together. A camera is coupled to or part of the intelligent door lock system. The camera is configured to record in response to door lock status.
In another embodiment of the present invention an intelligent door lock system is coupled to a door at a dwelling. An existing door lock system at the dwelling includes a lock device with a lock, and a drive shaft. An intelligent door lock system is retrofitted to the existing door lock system to create a retrofitted intelligent door lock system to lock and unlock the door. The intelligent door lock system includes: a door status device configured to be coupled to the drive shaft of the lock device. The door status device determines lock or unlock status of the lock device. An engine provides power to lock and unlock the lock and is coupled to the door status device. A circuit is coupled to the engine, and an energy source is coupled to the circuit and provides power to lock or unlock the door. A camera is coupled to or part of the intelligent door lock system and is configured to record in response to door lock status.
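The following minimal sketch (in Python) illustrates this recording behavior; the class, method, and event names are hypothetical and not part of the claimed system, and a production lock would report status over its wireless communication devices rather than by direct calls.

```python
# Hedged sketch: event names, classes, and methods below are illustrative
# assumptions, not the actual firmware of the intelligent door lock system.

DOOR_UNLOCKED = "unlocked"
DOOR_OPENED = "opened"
DOOR_CLOSED = "closed"
DOOR_LOCKED = "locked"


class Camera:
    """Toy camera that records only while the door is not secured."""

    def __init__(self):
        self.recording = False

    def start_recording(self):
        if not self.recording:
            self.recording = True
            print("camera: recording started")

    def stop_recording(self):
        if self.recording:
            self.recording = False
            print("camera: recording stopped")

    def on_lock_status(self, status, door_closed):
        # Begin recording as soon as the door is opened; continue until the
        # door is both closed and locked again.
        if status in (DOOR_UNLOCKED, DOOR_OPENED):
            self.start_recording()
        elif status == DOOR_LOCKED and door_closed:
            self.stop_recording()


if __name__ == "__main__":
    cam = Camera()
    # Simulated sequence of status messages from the lock.
    cam.on_lock_status(DOOR_OPENED, door_closed=False)   # starts recording
    cam.on_lock_status(DOOR_CLOSED, door_closed=True)    # still recording
    cam.on_lock_status(DOOR_LOCKED, door_closed=True)    # stops recording
```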
As used herein, the term engine refers to software, firmware, hardware, or another component that can be used to effectuate a purpose, such as serving computations and the like. The engine will typically include software instructions that are stored in non-volatile memory (also referred to as secondary memory). When the software instructions are executed, at least a subset of the software instructions can be loaded into memory (also referred to as primary memory) by a processor. The processor then executes the software instructions in memory. The processor may be a shared processor, a dedicated processor, or a combination of shared or dedicated processors. A typical program will include calls to hardware components (such as I/O devices), which typically requires the execution of drivers. The drivers may or may not be considered part of the engine, but the distinction is not critical.
As used herein, the term database is used broadly to include any known or convenient means for storing data, whether centralized or distributed, relational or otherwise.
As used herein a mobile device includes, but is not limited to, a cell phone, such as Apple's iPhone®, other portable electronic devices, such as Apple's iPod Touches®, Apple's iPads®, and mobile devices based on Google's Android® operating system, and any other portable electronic device that includes software, firmware, hardware, or a combination thereof that is capable of at least receiving the signal, decoding if needed, and exchanging information with a server to verify information. Typical components of a mobile device may include but are not limited to persistent memories like flash ROM, random access memory like SRAM, a camera, a battery, an LCD driver, a display, a cellular antenna, a speaker, a Bluetooth® circuit, and WiFi circuitry, where the persistent memory may contain programs, applications, and/or an operating system for the mobile device. A mobile device can also be a key fob, which can be a type of security token: a small hardware device with built-in authentication mechanisms. It is used to manage and secure access to network services and data, provide access, communicate with door systems to open and close doors, and the like.
As used herein, the term “computer” or “mobile device or computing device” is a general purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem. A computer includes at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations based on stored information. Peripheral devices allow information to be retrieved from an external source, and the result of operations saved and retrieved.
As used herein, the term “Internet” is a global system of interconnected computer networks that use the standard Internet protocol suite (TCP/IP) to serve billions of users worldwide. It is a network of networks that consists of millions of private, public, academic, business, and government networks, of local to global scope, that are linked by a broad array of electronic, wireless and optical networking technologies. The Internet carries an extensive range of information resources and services, such as the inter-linked hypertext documents of the World Wide Web (WWW) and the infrastructure to support email. The communications infrastructure of the Internet consists of its hardware components and a system of software layers that control various aspects of the architecture, and can also include a mobile device network, e.g., a cellular network.
As used herein, the term “extranet” is a computer network that allows controlled access from the outside. An extranet can be an extension of an organization's intranet that is extended to users outside the organization that can be partners, vendors, and suppliers, in isolation from all other Internet users. An extranet can be an intranet mapped onto the public Internet or some other transmission system not accessible to the general public, but managed by more than one company's administrator(s). Examples of extranet-style networks include but are not limited to:
LANs or WANs belonging to multiple organizations and interconnected and accessed using remote dial-up
LANs or WANs belonging to multiple organizations and interconnected and accessed using dedicated lines
Virtual private network (VPN) that is comprised of LANs or WANs belonging to multiple organizations, and that extends usage to remote users using special “tunneling” software that creates a secure, usually encrypted network connection over public lines, sometimes via an ISP
As used herein, the term “Intranet” is a network that is owned by a single organization that controls its security policies and network management. Examples of intranets include but are not limited to:
A LAN
A Wide-area network (WAN) that is comprised of a LAN that extends usage to remote employees with dial-up access
A WAN that is comprised of interconnected LANs using dedicated communication lines
A Virtual private network (VPN) that is comprised of a LAN or WAN that extends usage to remote employees or networks using special “tunneling” software that creates a secure, usually encrypted connection over public lines, sometimes via an Internet Service Provider (ISP)
For purposes of the present invention, the Internet, extranets and intranets collectively are referred to as (“Network Systems”).
For purposes of the present invention, Bluetooth LE devices and peripheral devices are Bluetooth low energy devices, marketed as Bluetooth Smart.
For purposes of the present invention, “third party access to a dwelling, which can be programmatic” is authorized access to the dwelling, and can be secured access, granted by an occupant, owner, or end-user of the dwelling. In one embodiment the access is access via an intelligent door lock system as described herein. In one embodiment the third party secured access to the dwelling, which can be programmatic, is granted by the occupant, owner, or end-user of a dwelling to a service provider, can be multi-tiered, and can be used for only one time, multiple times, recurring times, set times, changeable times, and can be revocable, and the like. In one embodiment the access is a secured access, and in one embodiment it is authenticated, with authorization provided to access the dwelling via a lock of an intelligent door lock system, and it can include authorized resetting of the lock.
For purposes of the present invention, the term “service provider” means organizations and individuals that provide services for a dwelling or for an occupant at a dwelling. The services provided can include any maintenance of the dwelling, delivery and pick-up of items to and from a dwelling, and services related to dwellings and dwelling occupants, including but not limited to craftspeople, housekeeping services, laundry and dry-cleaning, skilled laborers, unskilled laborers, delivery people, childcare, hairstyling and barbering, makeup and beauty, pet sitting, pet training, pet grooming, funeral services, tailoring, delivery of packages and other items from delivery companies and the U.S. Post Office, the delivery of household items including groceries, and the like. A service provider can be an individual or an organization, including but not limited to one with more than a single person, such as a corporation, a DBA, a partnership, and the like, with multiple layers of management and multiple layers of providers, from a CEO down to an individual that performs an actual activity at the dwelling. An occupant, owner, or end-user of a dwelling can grant service provider access to a corporation or organization, which can grant access to its employees, contractors, consultants, and the like, all of which can be revoked by the corporation or organization relative to a person given dwelling access. The corporation or organization can maintain records in a database regarding dwelling access dates, times, and the like, all of which can be audited, videoed, monitored, and maintained by the service provider and/or the occupant, owner, or end-user of the dwelling, who can revoke access to the dwelling at any time.
In one embodiment of the present invention a dwelling security system 11(a) is provided with a camera coupled to a WiFi/BTLE or cellular/BTLE bridge 11, or more generally a long range networking/low power short range networking bridge 11.
In one embodiment the present invention provides an improved dwelling security system.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11 and a wireless camera.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a camera system which is fully wireless, powered by batteries, and has the performance and endurance necessary to ensure a dwelling's entry is properly secured.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11 and a wireless camera 10(c), where the camera can be activated via any internet-connected device.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11, a wireless camera and a sensor.
In one embodiment the present invention provides a dwelling security system 11(a) that can include a WiFi bridge 11, a wireless camera 10(c), and a sensor selected from at least one of a doorbell, occupancy sensor, entry keypad, touch sensor, pressure sensor, mobile device/phone, key fob/card, and other sensor. In one embodiment the wireless camera 10(c) and a motion detection device 10(g) are integrated as one unit, or are at least in communication with each other.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11 and a wireless camera 10(c) that does not need a communication cable or external power.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11 and a battery powered wireless camera.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11, a wireless camera 10(c) and an intelligent door lock system 10.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11, a wireless camera 10(c) and an intelligent door lock system 10 that is configured to confirm delivery of items to the dwelling.
In one embodiment the present invention provides a dwelling security system 11(a) that includes a WiFi bridge 11, a wireless camera 10(c) and an intelligent door lock system 10 that is configured to allow entrance into the dwelling by a person delivering an item to the dwelling.
The specific embodiments of the dwelling security system 11(a) of the present invention are discussed hereafter.
Referring to
In one embodiment the vibration/tapping sensing device 11 senses knocking on the door and locks or unlocks the door. In one embodiment the vibration/tapping sensing device 11 is not included as part of the actual intelligent door lock system. In one embodiment the vibration/tapping sensing device 11 is coupled to the drive shaft 14. It will be appreciated that the vibration/tapping sensing device 11 can be coupled to other elements of the intelligent door lock system 10. The vibration/tapping sensing device detects vibration or knocking applied to a door that is used to unlock or lock the intelligent door lock system 10. This occurs following programming of the intelligent door lock system 10. The programming includes a user's vibration code/pattern, and the like. Additionally, a dwelling user, resource owner, or end-user can give a third person a knock code/pattern to unlock the intelligent door lock system of the door. The knocking is one that is recognized as having been defined by a user of the door lock system as a means to unlock the door. The knocking can have a variety of different patterns, tempos, durations, intensities, and the like.
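As one illustrative sketch of how a knock code/pattern could be matched, assuming the pattern is stored as the gaps between knocks and compared against a fixed tolerance (both assumptions, not taken from this disclosure):

```python
# Hedged sketch: the pattern representation and tolerance below are
# illustrative assumptions, not the actual matching logic of the system.

def knock_pattern_matches(knock_times, stored_pattern, tolerance=0.15):
    """Compare the gaps between detected knocks (seconds) to a stored
    code/pattern of gaps; return True if every gap is within tolerance."""
    gaps = [t2 - t1 for t1, t2 in zip(knock_times, knock_times[1:])]
    if len(gaps) != len(stored_pattern):
        return False
    return all(abs(g - p) <= tolerance for g, p in zip(gaps, stored_pattern))


# Example: a user-defined "knock, knock, pause, knock" code.
stored = [0.4, 1.0]                      # gaps, in seconds
detected = [0.0, 0.45, 1.5]              # timestamps from the vibration sensor
print(knock_pattern_matches(detected, stored))   # True -> unlock may proceed
```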
The vibration/tapping sensing device 11 detects oscillatory motion resulting from the application of oscillatory or varying forces to a structure. Oscillatory motion reverses direction. The oscillation may be continuous during some time period of interest or it may be intermittent. It may be periodic or nonperiodic, i.e., it may or may not exhibit a regular period of repetition. The nature of the oscillation depends on the nature of the force driving it and on the structure being driven.
Motion is a vector quantity, exhibiting a direction as well as a magnitude. The direction of vibration is usually described in terms of some arbitrary coordinate system (typically Cartesian or orthogonal) whose directions are called axes. The origin for the orthogonal coordinate system of axes is arbitrarily defined at some convenient location.
In one embodiment, the vibratory responses of structures can be modeled as single-degree-of-freedom spring mass systems, and many vibration sensors use a spring mass system as the mechanical part of their transduction mechanism.
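For reference, and not as a limitation of any embodiment, the undamped natural frequency of such a single-degree-of-freedom spring-mass system with stiffness \(k\) and mass \(m\) follows the standard relation \(f_n = \frac{1}{2\pi}\sqrt{k/m}\), so a stiffer spring or a smaller proof mass shifts the sensor's response toward higher frequencies.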
In one embodiment the vibration/tapping sensing device 11 can measure displacement, velocity, acceleration, and the like.
A variety of different vibration/tapping sensing devices 11 can be utilized, including but not limited to accelerometers, optical devices, electromagnetic and capacitive sensors, contact devices, transducers, displacement transducers, piezoelectric sensors, piezoresistive devices, variable capacitance, servo devices, audio devices where transfer of the vibration can be gas, liquid or solid, including but not limited to microphones, geo-phones, and the like.
Suitable accelerometers include but are not limited to: piezoelectric (PE), high-impedance output; integral electronics piezoelectric (IEPE), low-impedance output; piezoresistive (PR), silicon strain gauge sensor; variable capacitance (VC), low-level, low-frequency; servo force balance; and the like.
The vibration/tapping sensing device 11 can be in communication with an intelligent door lock system back-end 68, via Network Systems, as more fully described hereafter.
In one embodiment, the intelligent door lock system 10 is configured to be coupled to a door 12 of a structure, including but not limited to a house, building, and the like, or to a window, locked cabinet, storage box, bike, automobile door or window, computer lock, vehicle door or window, vehicle storage compartment, and the like. In one embodiment, the intelligent door lock system 10 is coupled to an existing drive shaft 14 of a lock device 22 already installed and is retrofitted to all or a portion of the lock device 22, which includes a bolt/lock 24. In another embodiment, the intelligent door lock system 10 is attached to a door 12, and the like, that does not have a pre-existing lock device.
In one embodiment, the intelligent door lock system 10 includes a positioning sensing device 16, a motor 38, an engine/processor 36 with a memory, and one or more wireless communication devices 40 coupled to a circuit 18. The motor 38 converts any form of energy into mechanical energy. As a non-limiting example, three or four wireless communication devices 40 are in communication with circuit 18. In one embodiment the vibration sensing device can be included with the positioning sensing device.
In one embodiment, the intelligent door lock system 10 is provided with the position sensing device 16 configured to be coupled to the drive shaft 14 of the lock device 22. The position sensing device 16 senses position of the drive shaft 14 and assists in locking and unlocking the bolt/lock 24 of the lock device 22. The engine 36 is provided with a memory. The engine 36 is coupled to the positioning sensing device 16. A circuit 18 is coupled to the engine 36 and an energy source 50 is coupled to the circuit. A device 38 converts energy into mechanical energy and is coupled to the circuit 18, positioning sensing device 16 and the drive shaft 14. Device 38 is coupled to the energy source 50 to receive energy from the energy source 50, which can be via the circuit 18.
In one embodiment, the intelligent door lock system 10 includes any or all of the following, a face plate 20, ring 32, latches such as wing latches 37, adapters 28 coupled to a drive shaft 14, one or more mounting plates 26, a back plate 30, a power sensing device 46, energy sources, including but not limited to batteries 50, and the like.
In one embodiment (see
In one embodiment, the memory of engine/processor 36 includes states of the door 12. The states are whether the door 12 is a left-hand mounted door or a right-hand mounted door, e.g., opens from a left side or a right side relative to a door frame. The states are used with the position sensing device 16 to determine, via the engine/processor 36, if the lock device 22 is locked or unlocked.
In one embodiment, the engine/processor 36 with the circuit 18 regulates the amount of energy that is provided from energy source 50 to the motor 38. This thermally protects the motor 38 from receiving too much energy and ensures that the motor 38 does not overheat or become taxed.
A variety of position sensing devices 16 can be used, including but not limited to, accelerometers, optical encoders, magnetic encoders, mechanical encoders, Hall Effect sensors, potentiometers, contacts with ticks, optical camera encoders, and the like.
As a non-limiting example, an accelerometer 16, well known to those skilled in the art, detects acceleration. The accelerometer 16 provides a voltage output that is proportional to a detected acceleration. Suitable accelerometers 16 are disclosed in U.S. Pat. Nos. 8,347,720, 8,544,326, 8,542,189, and 8,522,596, and in EP0486657B1 and EP 2428774 A1, incorporated herein by reference.
In one embodiment, the position sensing device 16 is an accelerometer 16. Accelerometer 16 includes a flex circuit coupled to the accelerometer 16. The accelerometer reports X, Y, and Z axis information for the drive shaft 14 to the engine/processor 36. The engine/processor 36 determines the orientation of the drive shaft 14, as well as door knocking, bolt/lock 24 position, door 12 close/open (action) sensing, manual key sensing, and the like, as more fully explained hereafter.
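A minimal sketch of this determination is given below; the axis conventions, angle thresholds, and handedness handling are assumptions for illustration only, and rely on the gravity vector rotating with the shaft on a vertically hung door.

```python
# Hedged sketch: axis conventions, thresholds, and handedness handling are
# illustrative assumptions, not the actual firmware of engine/processor 36.
import math


def shaft_angle_degrees(ax, ay):
    """Estimate the drive-shaft rotation angle from the accelerometer's X/Y
    readings (the gravity vector rotates with the shaft on a vertical door)."""
    return math.degrees(math.atan2(ay, ax))


def lock_state(angle_deg, right_handed, locked_angle=90.0, tolerance=10.0):
    """Map a shaft angle to locked/unlocked, mirroring the angle for a
    left-hand mounted door."""
    if not right_handed:
        angle_deg = -angle_deg
    if abs(angle_deg - locked_angle) <= tolerance:
        return "locked"
    if abs(angle_deg) <= tolerance:
        return "unlocked"
    return "in between"


# Example reading: gravity almost entirely along Y -> roughly 90 degrees.
angle = shaft_angle_degrees(ax=0.05, ay=0.99)
print(angle, lock_state(angle, right_handed=True))
```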
Suitable optical encoders are disclosed in U.S. Pat. Nos. 8,525,102, 8,351,789, and 8,476,577, incorporated herein by reference.
Suitable magnetic encoders are disclosed in U.S. Publication 20130063138, U.S. Pat. No. 8,405,387, EP2579002A1, EP2642252 A1, incorporated herein by reference.
Suitable mechanical encoders are disclosed in U.S. Pat. No. 5,695,048 and EP2564165A2, incorporated herein by reference.
Suitable Hall Effect sensors are disclosed in EP2454558B1 and EP0907068A1, incorporated herein by reference.
Suitable potentiometers are disclosed in U.S. Pat. No. 2,680,177, EP1404021A3, and CA2676196A1, incorporated herein by reference.
In various embodiments, the positioning sensing device 16 is coupled to the drive shaft 14 by a variety of means, including but not limited to the adapters 28. In one embodiment, the position sensing device 16 uses a single measurement, as defined herein, of drive shaft 14 position sensing, which is used to determine movement in order to determine the location of the drive shaft 14 and the positioning sensing device 16. The exact position of the drive shaft 14 can be measured with another measurement without knowledge of any previous state. Single movement, which is one determination of position sensing, is the knowledge of whether the door 12 is locked, unlocked, or in between. One advantage of the accelerometer is that one can determine position, leave it off, come back at a later time, and the accelerometer 16 will know its current position even if it has been moved since it was turned off. It will always know its current position.
In one embodiment, the positioning sensing device 16 is directly coupled to the drive shaft 14, as illustrated in
In one embodiment, the same positioning sensing device 16 is able to detect knocks by detecting motion of the door 12 in the Z axis. As a non-limiting example, position sensing is in the range of counter-clockwise and clockwise rotation of up to 180 degrees for readings. The maximum rotation limit is limited by the position sensing device 16, and more particularly by the accelerometer cable. In one embodiment, the result is sub-1° resolution in position sensing. This provides a longer lifetime because sampling can be done at a slower rate, due to knowing the position after the position sensing device 16 has been turned off for a time period of no greater than 100 milliseconds. With the present invention, accuracy can be enhanced by taking repeated measurements. With the present invention, the positioning sensing device 16, such as the accelerometer, does not need to consume additional power beyond what the knock sensing application already uses.
In one embodiment, the position sensing device 16 is positioned on the drive shaft 14, or on an element coupled to the drive shaft 14. In one embodiment, a position of the drive shaft 14 and the power sensing device and/or a torque limited link 38 are known. When the position of the drive shaft 14 is known, it is used to detect if the bolt/lock 24 of a door lock device 22 is in a locked or unlocked position, as well as a depth of bolt/lock 24 travel of lock device 22, and the like. This includes, but is not limited to, whether someone turned the bolt/lock 24 of lock device 22 from the inside using the ring 32, used the key to open the door 12, kicked down the door 12, attempted to pick the bolt/lock 24, banged on the door 12, or knocked on the door 12, as well as opening and closing motions of the door 12, and the like. In various embodiments, the intelligent door lock system 10 can be interrogated via hardware, including but not limited to a key, a mobile device 201, a computer, key fob, key cards, personal fitness devices, such as fitbit®, nike fuel, jawbone up, pedometers, smart watches, smart jewelry, car keys, smart glasses, including but not limited to Google Glass, and the like.
During a power up mode, the current position of the drive shaft 14 is known.
Real time position information of the drive shaft 14 is determined and the bolt/lock 24 of lock device 22 travels can be inferred from the position information of the drive shaft 14. The X axis is a direction along a width of the door 12, the Y axis is in a direction along a length of a door 12, and the Z axis is in a direction extending from a surface of the door 12.
In one embodiment, the accelerometer 16 is the knock sensor. Knocking can be sensed, as well as the number of times a door 12 is closed or opened, the physical swing of the door 12, and the motion of the door 12 opening and closing. With the present invention, a determination is made as to whether or not someone successfully swung the door 12, if the door 12 was slammed, and the like. Additionally, by coupling the position sensing device 16 to the movable drive shaft 14, or to an element coupled to it, a variety of information is provided, including but not limited to whether the bolt/lock 24 is stored in the correct orientation, whether the door 12 is properly mounted, and the like.
In one embodiment, a calibration step is performed to determine the amount of drive shaft 14 rotation needed to fully lock and unlock the bolt/lock 24 of lock device 22. The drive shaft 14 is rotated in a counter-clockwise direction until it can no longer rotate, and the same is then done in the clockwise direction. These positions are then stored in the engine memory. Optionally, the force is also stored. A command is then received to rotate the drive shaft 14 to record the amount of rotation. This determines the correct amount of drive shaft 14 rotation to properly lock and unlock the lock device 22.
In another embodiment, the drive shaft 14 is rotated until it does not move anymore. This amount of rotation is then stored in the memory and used for locking and unlocking the lock device 22.
In another embodiment, the drive shaft 14 is rotated until it does not move anymore. However, this may not provide the answer as to full lock and unlock. It can provide information as to partial lock and unlock. Records from the memory are then consulted to see how the drive shaft 14 behaved in the past. At different intervals, the drive shaft 14 is rotated until it does not move anymore. This is then statistically analyzed to determine the amount of drive shaft 14 rotation for full locking and unlocking. This is then stored in the memory.
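The calibration bookkeeping can be sketched as follows; `rotate_until_stall` is a hypothetical stand-in for the motor/position-sensor interaction and is not an identifier used by this disclosure.

```python
# Hedged sketch: rotate_until_stall() and the motor interface are hypothetical
# stand-ins for hardware calls; only the calibration bookkeeping is shown.

def calibrate(rotate_until_stall, memory):
    """Find the counter-clockwise and clockwise end stops of the drive shaft
    and store them, so the travel needed to fully lock/unlock is known."""
    ccw_stop = rotate_until_stall(direction=-1)   # degrees at the CCW stop
    cw_stop = rotate_until_stall(direction=+1)    # degrees at the CW stop
    memory["ccw_stop"] = ccw_stop
    memory["cw_stop"] = cw_stop
    memory["travel"] = abs(cw_stop - ccw_stop)    # full lock/unlock rotation
    return memory


# Example with a fake actuator whose stops are at -95 and +92 degrees.
fake_stops = {-1: -95.0, +1: 92.0}
mem = calibrate(lambda direction: fake_stops[direction], memory={})
print(mem)   # {'ccw_stop': -95.0, 'cw_stop': 92.0, 'travel': 187.0}
```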
In one embodiment, the engine/processor 36 is coupled to at least one wireless communication device 40 that utilizes audio and RF communication to communicate with a wireless device, including but not limited to a mobile device/key fob 210, with the audio used to communicate a security key to the intelligent door lock system 10 from the wireless device 210, and the RF increasing a wireless communication range to and from the at least one wireless communication device 40. In one embodiment, only one wireless communication device 40 is used for both audio and RF. In another embodiment, one wireless communication device 40 is used for audio, and a second wireless communication device 40 is used for RF. In one embodiment, the bolt/lock 24 is included in the intelligent door lock system 10. In one embodiment, the audio communicates initial set-up information from a mobile device/key fob 210 to the intelligent door lock system 10, including at least one of a WiFi SSID, a WiFi password, a Bluetooth key, a security key, and door configurations.
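One possible shape of that set-up information is sketched below; the field names and JSON framing are illustrative assumptions rather than the actual audio payload format.

```python
# Hedged sketch: field names and the JSON framing are illustrative assumptions
# about the set-up information carried over the audio channel.
import json

setup_message = {
    "wifi_ssid": "example-network",       # SSID WiFi
    "wifi_password": "example-password",  # password WiFi
    "bluetooth_key": "0123456789abcdef",  # Bluetooth key
    "security_key": "fedcba9876543210",   # security key for the lock
    "door_config": {"handedness": "right", "auto_lock_seconds": 30},
}

# Serialized form that could be modulated onto audio by the mobile device /
# key fob 210 and demodulated by wireless communication device 40.
payload = json.dumps(setup_message).encode("utf-8")
print(len(payload), "bytes")
```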
In one embodiment, an audio signal processor unit includes an audio receiver, a primary amplifier circuit, a secondary amplifier circuit, a current amplifier circuit, a wave detection circuit, a switch circuit and a regulator circuit. In one embodiment, the audio receiver of each said audio signal processor unit is a capacitive microphone. In one embodiment, the switch circuit of each audio signal processor unit is selected from one of a transistor and a diode. In one embodiment, the regulator circuit of each audio signal processor unit is a variable resistor. In one embodiment, the audio mixer unit includes a left channel mixer and a right channel mixer. In one embodiment, the amplifier unit includes a left audio amplifier and a right audio amplifier. In one embodiment, the Bluetooth device includes a sound volume control circuit with an antenna, a Bluetooth microphone and a variable resistor, and is electrically coupled with the left channel mixer and right channel mixer of said audio mixer unit. Additional details are in U.S. Publication US20130064378 A1, incorporated fully herein by reference.
In one embodiment, the faceplate 20 and/or ring 32 is electrically isolated from the circuit 18 and does not become part of circuit 18. This allows transmission of RF energy through the faceplate 20. In various embodiments, the faceplate and/or ring are made of materials that provide for electrical isolation. In various embodiments, the faceplate 20, and/or the ring 32 are at ground. As non-limiting examples, (i) the faceplate 20 can be grounded and in non-contact with the ring 32, (ii) the faceplate 20 and the ring 32 are in non-contact with the ring 32 grounded, (iii) the faceplate 20 and the ring can be coupled, and the ring 32 and the faceplate 20 are all electrically isolated from the circuit 18. In one embodiment, the ring 32 is the outer enclosure to the faceplate 20, and the bolt/lock 24 and lock device 22 is at least partially positioned in an interior defined by the ring 32 and the faceplate 20.
In one embodiment, the lock device 22 has an off center drive mechanism relative to the outer periphery that allows up to R displacements from a center of rotation of the bolt/lock 24 of lock device 22, where R is a radius of the bolt/lock 24, 0.75 R displacements, 0.5 R displacements, and the like, as illustrated in
As illustrated in
In one embodiment of the present invention, a haptic device 49 is included to provide the user with haptic feedback for the intelligent door lock system 10, see
In one embodiment, the wing latches 37 are used to secure the intelligent door lock system 10 to a mounting plate 26 coupled to the door 12. In one embodiment, the wing latches 37 secure the intelligent door lock system 10 to a mounting plate 26 coupled to a door 12 without additional tools other than the wing latches 37.
In one embodiment, the main circuit 18 is coupled to the engine 36 with a processor and memory, the motor 38, a wireless communication device 40 such as a WiFi device or a Bluetooth device with an antenna, the position sensing device 16, a speaker/microphone 17, a temperature sensor 42, a battery voltage sensor 44, a current sensor or power sensor 46 that determines how hard the motor 38 is working, a protection circuit to protect the motor from overheating, an LED array 48 that reports status, and one or more batteries 50 that power circuit 18, see
The current sensor 46 monitors the amount of current that goes to the motor 38, and this information is received and processed by the engine/processor 36 with memory, which is coupled to the circuit 18. The amount of current going to the motor 38 is used to determine the amount of friction experienced by the door 12 and/or lock device 22 with lock/bolt 24 in opening and/or closing, as applied by the intelligent door lock system 10 and the positioning sensing device 16 to the drive shaft 14. The circuit 18 and engine/processor 36 can provide for an adjustment of current. The engine/processor 36 can provide information regarding the door and friction to the dwelling user, resource owner, or end-user of the door 12.
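A simplified sketch of this current-based friction assessment follows; the thresholds and the cutback policy are assumptions chosen for illustration.

```python
# Hedged sketch: current thresholds and the cutback policy are illustrative
# assumptions, not the actual behavior of circuit 18 or engine/processor 36.

def assess_motor_current(samples_amps, friction_warn=0.6, overheat_limit=0.9):
    """Use the average motor current as a proxy for door/lock friction and
    flag when the drive should be limited to protect the motor."""
    avg = sum(samples_amps) / len(samples_amps)
    return {
        "avg_current_a": avg,
        "high_friction": avg >= friction_warn,   # notify the user
        "limit_drive": avg >= overheat_limit,    # protect the motor
    }


# Prints the averaged current and the resulting friction/limit flags.
print(assess_motor_current([0.55, 0.62, 0.70, 0.66]))
```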
In one embodiment the housing 58 has an interior volume of at least 200,000 cubic mm.
In one embodiment the amount of torque applied to the mechanical components of intelligent door lock system 10 is less than 8 in-lbs. In one embodiment the amount of torque applied to the drive shaft 14, bolt/lock 24, and lock device 22 is less than 8 in-lbs. As non-limiting examples, the amount of torque applied is less than: 7 in-lbs; 6 in-lbs; 5 in-lbs; 4 in-lbs; and the like.
In one embodiment, a lead in ramp,
The intelligent door lock system 10 can be in communication with an intelligent door lock system back-end 68, via Network Systems, as more fully described hereafter.
In one embodiment, the flex circuit 18, which has an out-of-plane deflection of at least 1 degree, includes a position detector connector 46, a Bluetooth circuit, and associated power points, as well as other elements.
In one embodiment, the intelligent door lock system 10 can use incremental data transfer via Network Systems, including but not limited to BLUETOOTH® and the like. The intelligent door lock system 10 can transmit data through the inductive coupling for wireless charging. The dwelling user, resource owner, or end-user is also able to change the frequency of data transmission.
In one embodiment, the intelligent door lock system 10 can engage in intelligent switching between incremental and full syncing of data based on available communication routes. As a non-limiting example, this can be via cellular networks, WiFi, BLUETOOTH® and the like.
In one embodiment, the intelligent door lock system 10 can receive firmware and software updates from the intelligent lock system back-end 68.
In one embodiment, the intelligent door lock system 10 produces an output that can be received by an amplifier, and decoded by an I/O decoder to determine I/O logic levels, as well as, both clock and data information. Many such methods are available including ratio encoding, Manchester encoding, Non-Return to Zero (NRZ) encoding, or the like; alternatively, a UART type approach can be used. Once so converted, clock and data signals containing the information bits are passed to a memory at the intelligent door lock system 10 or intelligent door lock system back-end 68.
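As a generic illustration of one of the named schemes, the sketch below shows Manchester encoding and decoding, using the convention that a 0 bit is sent as a low-to-high half-bit pair and a 1 bit as high-to-low; the convention actually chosen by an implementation may differ.

```python
# Hedged sketch: a generic Manchester encoder/decoder; the specific convention
# and framing used by the intelligent door lock system may differ.

def manchester_encode(bits):
    """Each data bit becomes two half-bit levels carrying both clock and data."""
    out = []
    for b in bits:
        out.extend([0, 1] if b == 0 else [1, 0])
    return out


def manchester_decode(levels):
    bits = []
    for first, second in zip(levels[0::2], levels[1::2]):
        if (first, second) == (0, 1):
            bits.append(0)
        elif (first, second) == (1, 0):
            bits.append(1)
        else:
            raise ValueError("invalid Manchester pair (no mid-bit transition)")
    return bits


data = [1, 0, 1, 1, 0]
encoded = manchester_encode(data)
assert manchester_decode(encoded) == data
print(encoded)   # [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
```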
In one embodiment, the intelligent door lock system 10, or associated back-end 68, can include a repeatable pseudo randomization algorithm in ROM or in ASIC logic.
As illustrated in
The back-end 68 knows that an intelligent door lock system 10 is associated with a dwelling user, resource owner, or end-user, and includes a database with the dwelling user, resource owner, or end-user's account information. The back-end 68 knows if the dwelling user, resource owner, or end-user is registered or not. When the intelligent door lock system 10 is powered up, the back-end 68 associates that intelligent door lock system 10 with the dwelling user, resource owner, or end-user.
The conditioning electronics 80 can provide signal conditioning, including but not limited to amplification, filtering, converting, range matching, isolation, and any other processes required to make sensor output suitable for processing after conditioning. Signal inputs accepted by the conditioning electronics include DC voltage and current, AC voltage and current, frequency, and electric charge. Outputs of the signal conditioning electronics can be voltage, current, frequency, timer or counter, relay, resistance or potentiometer, and other specialized outputs.
In one embodiment, the one or more processors 78 can include a memory, such as a read only memory used to store instructions that the processor may fetch in executing its program, a random access memory (RAM) used by the processor 78 to store information, and a master clock. The one or more processors 78 can be controlled by a master clock that provides a master timing signal used to sequence the one or more processors 78 through internal states in their execution of each processed instruction. In one embodiment, the one or more processors 78 can be low power devices, such as CMOS, as is the necessary logic used to implement the processor design. Information received from the signals can be stored in memory.
In one embodiment, electronics 92 are provided for use in intelligent door lock system 10 analysis of data transmitted via Network Systems. The electronics 92 can include an evaluation device 94 that provides for comparisons with previously stored intelligent door lock system 10 information.
Signal filtering is used when only part of the signal frequency spectrum contains valid data. It is the most common signal conditioning function, as usually not all of the signal frequency spectrum contains valid data.
Signal amplification performs two important functions: increases the resolution of the inputted signal, and increases its signal-to-noise ratio.
Suitable amplifiers 86 include but are not limited to sample and hold amplifiers, peak detectors, log amplifiers, antilog amplifiers, instrumentation amplifiers, programmable gain amplifiers and the like.
Signal isolation can be used in order to pass the signal from its source to a measurement device without a physical connection. It can be used to isolate possible sources of signal perturbations.
In one embodiment, the intelligent door lock system back-end 68 can provide magnetic or optic isolation. Magnetic isolation transforms the signal from voltage to a magnetic field, allowing the signal to be transmitted without a physical connection (for example, using a transformer). Optic isolation takes an electronic signal and modulates it to a signal coded by light transmission (optical encoding), which is then used for input for the next stage of processing.
In one embodiment, the intelligent door lock system 10 and/or the intelligent door lock system back-end 68 can include Artificial Intelligence (AI) or Machine Learning-grade algorithms for analysis. Examples of AI algorithms include classifiers, expert systems, case-based reasoning, Bayesian networks, behavior-based AI, neural networks, fuzzy systems, evolutionary computation, and hybrid intelligent systems.
Information received or transmitted from the back-end 68 to the intelligent door lock system 10 and mobile device 210 can use logic resources, such as AI and machine learning grade algorithms, to provide reasoning, knowledge, planning, learning, communication, and to create actions.
In one embodiment, AI is used to process information from the intelligent door lock system 10, from mobile device 210, and the like. The back-end 68 can compute scores associated with various risk variables involving the intelligent door lock system 10. These scores can be compared to a minimum threshold from a database and an output created. Alerts can be provided to the intelligent door lock system 10, mobile device 210, and the like. The alert can provide a variety of options for the intelligent door lock system 10 to take, and categorizations of the received data from the mobile device 210, the intelligent door lock system 10, and the like, can be created. A primary option can be created as well as secondary options.
In one embodiment, data associated with the intelligent door lock system 10 is received. The data can then be pre-processed and an array of action options can be identified. Scores can be computed for the options. The scores can then be compared to a minimum threshold and to each other. A sorted list of the action options based on the comparison can be outputted to the intelligent door lock system 10, the mobile device 210 and the like. Selections can then be received indicating which options to pursue. Action can then be taken. If an update to the initial data is received, the back-end 68 can then return to the step of receiving data.
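A toy version of this score/threshold/sort pipeline is sketched below; the scoring function, weights, and minimum threshold are assumptions and the option names are hypothetical.

```python
# Hedged sketch: the scoring function, weights, and threshold are illustrative
# assumptions about how back-end 68 might rank action options.

def rank_options(options, weights, minimum=0.5):
    """Score each action option from its risk variables, drop options below a
    minimum threshold, and return the rest sorted from best to worst."""
    scored = []
    for name, variables in options.items():
        score = sum(weights.get(k, 0.0) * v for k, v in variables.items())
        if score >= minimum:
            scored.append((name, round(score, 3)))
    return sorted(scored, key=lambda item: item[1], reverse=True)


options = {
    "lock_now":   {"door_closed": 1.0, "user_nearby": 0.0},
    "alert_user": {"door_closed": 0.0, "user_nearby": 1.0},
    "do_nothing": {"door_closed": 0.2, "user_nearby": 0.1},
}
weights = {"door_closed": 0.7, "user_nearby": 0.6}
print(rank_options(options, weights))
# e.g. [('lock_now', 0.7), ('alert_user', 0.6)] -- 'do_nothing' falls below
```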
Urgent indicators can be determined and directed to the intelligent door lock system 10, including unlocking, locking and the like.
Data received by the intelligent door lock system 10 and mobile device 210 can also be compared to data sources for third party secured access to a dwelling, which can be programmatic.
In data evaluation and decision making, algorithm files from a memory can be accessed specific to data and parameters received from the intelligent door lock system 10 and mobile device 210.
Scoring algorithms, protocols and routines can be run for the various received data and options. Resultant scores can then be normalized and weights assigned with likely outcomes.
The intelligent door lock system 10 can be a new lock system mounted to a door 12, with all or most of the elements listed above, or it can be retrofitted over an existing lock device 22.
To retrofit the intelligent door lock system 10 to an existing lock system, the dwelling user, resource owner, or end-user makes sure that the existing lock device 22 and bolt/lock 24 are installed right-side up. The existing thumb-turn is then removed. With some lock devices 22, additional mounting plates 26 need to be removed, and the intelligent door lock system 10 can include replacement screws 56 that are used. The correct mounting plate 26 is then selected. With the existing screws 56 in the thumb-turn, the dwelling user, resource owner, or end-user sequentially aligns them with 1 of 4 mounting plates 26 that are supplied or exist. This assists in determining the correct diameter and placement of the screws 56 required by the bolt/lock 24. The mounting plate 26 is then positioned. The correct adapter 28 is positioned in a center of the mounting plate 26 to assist in proper positioning. Caution is taken to ensure that the adapter 28 does not rub the sides of the mounting plate 26, and the screws 56 are then tightened on the mounting plate 26. The intelligent door lock system bolt/lock 24 of lock device 22 is then attached. In one embodiment, this is achieved by pulling out side wing latches 37, sliding the lock device 22 and/or bolt/lock 24 over the adapter 28 and pin, and then clamping down the wing latches 37 to the mounting plate 26. The faceplate is rotated to open the battery compartment and the battery tabs are then removed to allow use of the battery contacts 64. An outer metal ring 32 used to lock and unlock the door 12 is then rotated. An app from mobile device 210 and/or key then brings the dwelling user, resource owner, or end-user through a pairing process.
A door 12 can be deformed, warped, and the like. It is desirable to provide a customer or dwelling user, resource owner, or end-user information about the door, e.g., if it is deformed, out of alignment, if too much friction is applied when opening and closing, and the like.
As recited above, the current sensor 46 monitors the amount of current that goes to the motor 38, and this information is received and processed by the engine/processor 36 with memory, which is coupled to the circuit 18. The amount of current going to the motor 38 is used to determine the amount of friction experienced by the door 12 and/or lock device 22 in opening and/or closing, as applied by the intelligent door lock system 10 and the positioning sensing device 16 to the drive shaft 14. The circuit 18 and engine/processor 36 can provide for an adjustment of current. The engine/processor 36 can provide information regarding the door and friction to the dwelling user, resource owner, or end-user of the door 12.
In one embodiment of the present invention, the intelligent door lock system 10 provides an ability to sense friction on the lock device 22 and/or door 12 by measuring the torque required to move the bolt/lock 24. The intelligent door lock system 10 increases the applied torque gradually until the bolt/lock 24 moves into its desired position, and the applied torque is the minimum amount of torque required to move the bolt/lock 24, which is directly related to how deformed the door is.
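The torque ramp can be sketched as follows; `try_move_bolt` is a hypothetical hardware call and the step sizes are illustrative only.

```python
# Hedged sketch: try_move_bolt() is a hypothetical hardware call; only the
# gradual torque ramp used to estimate friction is shown.

def minimum_torque_to_move(try_move_bolt, start=1.0, step=0.5, max_torque=8.0):
    """Increase applied torque until the bolt/lock moves, and report that
    minimum torque as a proxy for door deformation/friction."""
    torque = start
    while torque <= max_torque:
        if try_move_bolt(torque):          # True once the bolt reaches position
            return torque
        torque += step
    return None                            # could not move the bolt at all


# Example with a fake bolt that needs at least 3.2 in-lbs to move.
needed = 3.2
print(minimum_torque_to_move(lambda t: t >= needed))   # 3.5
```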
In one embodiment, when a bad door is detected, a customer can be notified that their door may require some servicing. In one embodiment, door deformation can be detected when a torque device is used to determine if the torque applied when the lock is rotated is too high. As a non-limiting example, this can be 2-15 in-lbs of torque. The intelligent door lock system back-end 68 can then perform a comparison between the measured torque and a standard, or a norm, that is included in the one or more databases 88.
In one embodiment of the present invention, before the door is serviced, the intelligent door lock system 10 allows operation by offering a high-friction mode. As non-limiting examples, the high friction mode is when 2 in-lbs, 3 in-lbs, 3.5 in-lbs, and the like are required to open the door. In the high friction mode, the bolt/lock 24 is driven while the dwelling user, resource owner, or end-user is pushing, lifting, torquing, or pulling the door, for example via force applied to the doorknob, or performing visual inspections for rust, blockage, or other conditions that can compromise a door, and the like. The position sensing device 16 is used to determine if the bolt/lock 24 was moved to a final position. In the high friction mode, motion of the door closing is confirmed. Upon detecting the closing of the door, the bolt/lock 24 is then driven. When the dwelling user, resource owner, or end-user receives an auditory, visual, or any other type of perceptible confirmation, the dwelling user, resource owner, or end-user then knows that the door has been locked. In one embodiment, the firmware elements of the intelligent door lock system 10, as well as other door lock device 22 elements, can also attempt to drive the bolt/lock 24 a second time when the first attempt fails. However, this can result in more power consumption, reducing the lifetime of the power source, particularly when it is battery 50 based.
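A minimal sketch of the high-friction locking sequence, with a single retry, is shown below; all of the callables are hypothetical stand-ins for the motor drive, position sensing device 16, and user notification path.

```python
# Hedged sketch: drive_bolt() and bolt_in_final_position() stand in for the
# motor drive and position sensing device 16; the single retry mirrors the
# behavior described above, at the cost of extra battery 50 energy.
import time


def lock_in_high_friction_mode(door_is_closed, drive_bolt,
                               bolt_in_final_position, notify):
    """Drive the bolt once the door is confirmed closed; verify with the
    position sensor and retry one time before reporting failure."""
    if not door_is_closed():
        return False
    for attempt in (1, 2):                 # first try, then one retry
        drive_bolt()
        time.sleep(0.1)                    # allow the mechanism to settle
        if bolt_in_final_position():
            notify("door locked")          # perceptible confirmation
            return True
    notify("door NOT locked")
    return False


# Demo with fakes: the bolt reaches its final position on the second attempt.
attempts = {"n": 0}
def fake_drive(): attempts["n"] += 1
print(lock_in_high_friction_mode(
    door_is_closed=lambda: True,
    drive_bolt=fake_drive,
    bolt_in_final_position=lambda: attempts["n"] >= 2,
    notify=print))
```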
In one embodiment of the present invention, the intelligent door lock system 10 seeks to have the motor 38 operate with reduced energy consumption for energy source lifetime purposes, as well as to eliminate or reduce undesirable noises, operations, and dwelling user, resource owner, or end-user experiences that occur when there is a failure in door locking and unlocking, particularly due to door deformation, door non-alignment, as well as other problems with the door that can be irritating to the person locking or unlocking the door.
In one embodiment of the present invention, the intelligent door lock system back-end 68 can track performance of doors and friction levels across time and build a service to encourage dwelling users, resource owners, or end-users to better maintain their doors. Such a service can be a comparison of a door's friction level to those of other users in similar geographic locations, with similar weather patterns, such that the dwelling user, resource owner, or end-user is encouraged to maintain their doors at a competent level. There can be a comparison to standards indicating that at a certain level the door becomes unsafe. Guidelines are provided as to how to maintain their doors. This can be achieved by asking a door's dwelling user, resource owner, or end-user what improves their door, including but not limited to pushing, lifting, torquing the door, pulling, visual inspections of rust, blockage, other conditions that can compromise a door, and the like. The analysis and comparison can be conducted at the back-end 68 and the results communicated to the door lock operator as well as others.
In one embodiment of the present invention, the intelligent door lock system 10 has a deformed operation mode that can be activated after a selected amount of time. As non-limiting examples, this can be immediately after the dwelling user, resource owner, or end-user has been notified, more than 1 picosecond, 1 second, 5 seconds, and greater periods of time. The deformed operation mode can be activated by the intelligent door lock system 10 itself, or by the intelligent door lock system back-end 68. It can be activated on the door operator's request. In one embodiment, the back-end 68 can anticipate these problems. As non-limiting examples, this can be due to analysis of doors 12 in similar geographic areas, doors under similar conditions, doors with similar histories, similar environmental conditions, as well as the history of a particular door, and the like.
The deformed mode provides cooperation with the dwelling user, resource owner, or end-user to more readily open the door. In one embodiment, this is a mechanism for the door to communicate back to the door lock operator. As a non-limiting example, feedback can be provided to the door operator. Such feedback can include, but is not limited to, communication via tactile, audio, visual, temperature, or electronic means, wirelessly, through a computer, mobile device 201, and the like. In another embodiment, the operator can signify to the door the operator's desire to leave by unlocking and opening the door 12. This is a door operator and lock communication. The door operator can close the door, which is sensed by the intelligent door lock system 10; a timer can then be initiated to provide the door operator with a selected time period in which the door operator can manually alleviate the friction problem. When the time has expired, the intelligent door lock system 10 can then lock the door 12. Upon detecting a successful door locking event, the intelligent door lock system 10 can advise the door operator that there is a successful door locking. If the door locking is not successful, the intelligent door lock system 10 can provide a message to the door operator via a variety of means, including but not limited to a message or alert to the door lock operator's mobile device 201. Such a mobile device message provides the door operator with notification that door locking was not successful or achieved, and the door lock operator can then take action to lock the door 12 either in person, wirelessly, and the like.
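The deformed-mode sequence described above can be sketched as follows; the grace period value and the alert function are illustrative assumptions.

```python
# Hedged sketch: the timer value and the alert call are illustrative
# assumptions about the deformed-operation mode described above.
import time


def deformed_mode_lock(grace_period_s, drive_bolt, bolt_locked, send_alert):
    """After the door is sensed closed, give the operator a grace period to
    relieve friction manually, then drive the bolt and report the outcome."""
    time.sleep(grace_period_s)             # operator may lift/push the door
    drive_bolt()
    if bolt_locked():
        send_alert("door locked successfully")
        return True
    send_alert("door locking was NOT successful; please lock manually")
    return False


# Demo with fakes and a short grace period.
deformed_mode_lock(0.01, drive_bolt=lambda: None,
                   bolt_locked=lambda: False, send_alert=print)
```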
For entry, communication with the lock device 22 may be different. In one embodiment, it can be unlocking coupled with close proximity to a mobile device 201 that is exterior to the door.
In another embodiment of the present invention, the intelligent door lock system back-end 68 can track performance of doors and friction levels across time and build a simple service to encourage dwelling users, resource owners, or end-users to better maintain their doors, as discussed above.
This information can be stored in the one or more databases 64.
In one embodiment of the present invention, the intelligent door lock system 10 unlocks when a selected temperature is reached, when smoke is detected, when a fire is detected by the processor 36, and the like. As non-limiting examples, the intelligent door lock system 10 unlocks the bolt/lock 24 when the temperature sensor 46 senses a temperature that is, as non-limiting examples, greater than 40 degrees C., any temperature over 45 degrees C., and the like. The temperature sensor 46 sends a signal to the processor 36, which communicates with the motor 38 that will then cause the drive shaft 14 to rotate sufficiently and unlock the bolt/lock 24. An alarm can also be activated. It will be appreciated that the processor 36 can be anywhere as long as it is in communication with the temperature sensor 46 and the motor 38; it can be at the intelligent door lock system 10, at the back-end 68, anywhere in the building, or at any remote location. The processor 36 determines if there is an unsafe condition, e.g., based on a rise in temperature, and this then results in an unlocking of the bolt/lock 24.
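The following is a hedged sketch of that fail-safe check; the threshold constant and the read_temperature(), unlock(), and sound_alarm() callables are placeholders standing in for the temperature sensor 46, processor 36, motor 38, and alarm, not their actual interfaces:

```python
# When the sensed temperature exceeds a selected threshold, command the motor to
# rotate the drive shaft, withdraw the bolt/lock, and optionally sound an alarm.
UNLOCK_TEMPERATURE_C = 40.0   # e.g., 40 degrees C; 45 degrees C is another example

def check_fire_condition(read_temperature, unlock, sound_alarm):
    temperature_c = read_temperature()
    if temperature_c > UNLOCK_TEMPERATURE_C:
        unlock()          # rotate the drive shaft far enough to unlock the bolt/lock
        sound_alarm()     # optional alarm activation
        return True
    return False
```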
In one embodiment, the intelligent door lock system back-end 68 can track performance of doors and friction levels across time and build a service to encourage dwelling users, resource owners, or end-users to better maintain their doors, as discussed above.
In another embodiment, the mobile device 201 or computing device 210 may execute a browser stored in its memory, using its processor, to interact with the intelligent door lock system back-end component 114. Each of the elements shown in
As shown in
The intelligent door lock assembly 100 may have an extension gear which extends through the baseplate of the smart door lock. The baseplate may have one or more oval mounting holes to accommodate various rose screw distances, from 18 mm to 32 mm, to accommodate various different doors. In one implementation, the intelligent door lock system 100 may have a circular shape and also a rotating bezel. The rotating bezel allows a dwelling user, resource owner, or end-user to rotate the smart door lock and thus manually lock or unlock the bolt as before. The extension gear extends through the baseplate and then interacts with the existing bolt elements and allows the smart door lock to lock/unlock the bolt. The extension gear may have a modular adapter slot at its end which interfaces with an extension rod of the bolt assembly 124. These modular adapters, as shown in
The intelligent door lock system 100 may be used for various functions. As a non-limiting example, the intelligent door lock system 100 may enable a method to exchange a security token between a mobile or computing device 210 and the intelligent door lock system 100. Any or all of the intelligent door lock systems 100 may be registered with the intelligent door lock back-end 68 with a unique registration ID. The unique ID of an intelligent door lock system 100 may be associated with a unique security token that can only be used to command a specific intelligent door lock system 100 to lock or unlock. Through a virtual key provisioning interface of the intelligent door lock system back-end 68, a master user, who may be an administrator, can issue a new security token to a particular mobile or computing device 210. The intelligent door lock system 100 can periodically broadcast an advertisement of its available services over Network Systems. When the mobile or computing device 210 is within a predetermined proximity of the intelligent door lock system 100, which varies depending on the protocol being used, the mobile or computing device 210 can detect the advertisement from the intelligent door lock assembly 100.
The application on the mobile or computing device 210 detects the intelligent door lock system 100 and a communications session can be initiated. The token, illustrated as a key 118 in
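One way to picture the token binding described above is the sketch below; the HMAC construction, the shared secret, and the function names are assumptions made for illustration and are not the provisioning protocol itself:

```python
# Illustrative sketch: the back-end issues a token bound to one lock registration
# ID and one device; the lock accepts a lock/unlock command only if the presented
# token matches that binding.
import hmac
import hashlib

def provision_token(backend_secret: bytes, lock_registration_id: str, device_id: str) -> str:
    """Back-end side: issue a token tied to a specific lock and mobile/computing device."""
    msg = f"{lock_registration_id}:{device_id}".encode()
    return hmac.new(backend_secret, msg, hashlib.sha256).hexdigest()

def lock_accepts(backend_secret: bytes, lock_registration_id: str,
                 device_id: str, presented_token: str) -> bool:
    """Lock side: validate the token presented over the communications session."""
    expected = provision_token(backend_secret, lock_registration_id, device_id)
    return hmac.compare_digest(expected, presented_token)
```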
The intelligent door lock system 100 may also allow for the triggering of multiple events upon connection to an intelligent door lock system 100 by a mobile or computing device 210. As a non-limiting example, the intelligent door lock system 100 can detect and authenticate the mobile or computing device 210, as described herein, and initiate a series of actions, including but not limited to, unlocking doors, turning on lights, adjusting temperature, turning on a stereo, etc.
As non-limiting examples, suitable devices that can be controlled by a mobile device 201 include but are not limited to: doors and windows, burglar alarms, generators, thermostats, lighting, smoke/CO detectors, refrigerators, ranges, electronic devices, door locks, water alarms or shutoffs, washers and dryers, music systems, heating and air conditioning systems, water systems, sprinkler systems, and the like. With the present invention, anomalies in any of the preceding can be detected. When an anomaly is detected, the owner can be notified via their mobile device 201, via the cloud, or through a system back-end, and an action can be taken. In certain embodiments, authorities can be immediately contacted to investigate the situation and/or the dwelling user, resource owner, or end-user.
In one embodiment the motivation for anomaly detection is to discover events that are outside of threshold settings. Events could be malicious ones such as a hacker attempting to remotely operate a lock, a burglar breaking a window or pushing in a door, or non-malicious events such as a door that has been left open for longer than normal.
As a non-limiting example normal events would be normal smart lock operation and door operations that fall within threshold values, such as a commuter who operates the door lock once in the morning, and then once in the evening when they return home.
In one embodiment the calculation of an anomaly could be done by collecting signal values from devices and sensors such as a smart door lock, camera, microphone, etc. that capture video, audio, motion, seismic, or other event data. Data from each device could have a weighting factor attached to it, and a total event value could be calculated by multiplying each signal value by its weighting factor, and then summing all weighted signal values. If the total event value is greater than the anomaly threshold, then additional alerts or actions could be triggered, such as automatically locking the door or sending a notification to the dwelling user, resource owner, or end-user.
Total Event Value = w1*k1(door) + w2*k2(window) + . . . + wn*kn(sensor n)
where w1, w2, . . . , wn are weighting factors
and k1(door), k2(window), . . . , kn(sensor n) are signal values from devices or sensors
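The formula translates directly into code; the sketch below uses placeholder signal values, weights, and an arbitrary threshold purely to illustrate the calculation:

```python
# Total Event Value = w1*k1(door) + w2*k2(window) + ... + wn*kn(sensor n)
ANOMALY_THRESHOLD = 1.0   # placeholder threshold

def total_event_value(signals: dict[str, float], weights: dict[str, float]) -> float:
    return sum(weights[name] * value for name, value in signals.items())

signals = {"door": 0.9, "window": 0.2, "microphone": 0.6}   # example signal values
weights = {"door": 1.0, "window": 0.8, "microphone": 0.5}   # example weighting factors

if total_event_value(signals, weights) > ANOMALY_THRESHOLD:
    # e.g., automatically lock the door or notify the dwelling user,
    # resource owner, or end-user
    print("Anomaly detected: triggering alerts/actions")
```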
The commands for these actions may be carried out by the mobile or computing device 210 or the intelligent door lock system back-end 68. In addition, through a web interface of the intelligent door lock system back-end 68, the dwelling user, resource owner, or end-user may define one or more events to be triggered upon proximity detection and authentication of the dwelling user, resource owner, or end-user's mobile or computing device 210 to the intelligent door lock system 100.
The intelligent door lock system 100 may also allow for the intelligent triggering of events associated with an individual. In particular, environmental settings may be defined per individual in the intelligent door lock system back-end 68 and then applied intelligently upon each ingress by that person into a building that has an intelligent door lock system 100. For example: person A arrives home and his mobile or computing device 210 is authenticated by the intelligent door lock system 100. His identity is shared with the intelligent door lock system back-end 68. The intelligent door lock system back-end 68 may send environmental changes to other home controllers, such as "adjust heat to 68 degrees". Person B arrives at the same building an hour later and her mobile or computing device 210 is also authenticated and shared with the intelligent door lock system back-end 68. The intelligent door lock system back-end 68 accesses her preferred environmental variables, such as "adjust heat to 71 degrees". The intelligent door lock system back-end understands that person B has asked for a temperature increase and issues the respective command to the dwelling user, resource owner, or end-user's thermostat. In one example, the intelligent door lock back-end system 68 has logic that defers to the higher temperature request or can deny it. Therefore, if person A entered the home after person B, the temperature would not be decreased.
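A minimal sketch of that deference logic follows, assuming stored per-person temperature preferences and a hypothetical set_thermostat() command; the figures of 68 and 71 degrees mirror the example above:

```python
# On each authenticated ingress, look up the person's preferred temperature and
# only raise, never lower, the current setpoint.
PREFERRED_TEMP_F = {"person_a": 68, "person_b": 71}   # assumed stored preferences

def on_authenticated_entry(person_id: str, current_setpoint_f: int, set_thermostat) -> int:
    requested = PREFERRED_TEMP_F.get(person_id, current_setpoint_f)
    if requested > current_setpoint_f:        # defer to the higher temperature request
        set_thermostat(requested)
        return requested
    return current_setpoint_f                 # e.g., person A after person B: no decrease
```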
Referring to
The mobile or computing device can include a display that can be a touch sensitive display. The touch-sensitive display is sometimes called a “touch screen” for convenience, and may also be known as or called a touch-sensitive display system. The mobile or computing device may include a memory (which may include one or more computer readable storage mediums), a memory controller, one or more processing units (CPU's), a peripherals interface, Network Systems circuitry, including but not limited to RF circuitry, audio circuitry, a speaker, a microphone, an input/output (I/O) subsystem, other input or control devices, and an external port. The mobile or computing device may include one or more optical sensors. These components may communicate over one or more communication buses or signal lines.
It should be appreciated that the mobile or computing device is only one example of a portable multifunction mobile or computing device, and that the mobile or computing device may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components may be implemented in hardware, software or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Memory may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory by other components of the mobile or computing device, such as the CPU and the peripherals interface, may be controlled by the memory controller.
The peripherals interface couples the input and output peripherals of the device to the CPU and memory. The one or more processors run or execute various software programs and/or sets of instructions stored in memory to perform various functions for the mobile or computing device and to process data.
In some embodiments, the peripherals interface, the CPU, and the memory controller may be implemented on a single chip. In some other embodiments, they may be implemented on separate chips.
The Network System circuitry receives and sends signals, including but not limited to RF, also called electromagnetic signals. The Network System circuitry converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The Network Systems circuitry may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The Network Systems circuitry may communicate with Network Systems 110 and other devices by wireless communication.
The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), BLUETOOTH®, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voiceover Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS)), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry, the speaker, and the microphone provide an audio interface between a user and the mobile or computing device. The audio circuitry receives audio data from the peripherals interface, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker. The speaker converts the electrical signal to human-audible sound waves. The audio circuitry also receives electrical signals converted by the microphone from sound waves. The audio circuitry converts the electrical signal to audio data and transmits the audio data to the peripherals interface for processing. Audio data may be retrieved from and/or transmitted to memory and/or the Network Systems circuitry by the peripherals interface. In some embodiments, the audio circuitry also includes a headset jack. The headset jack provides an interface between the audio circuitry and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
The I/O subsystem couples input/output peripherals on the mobile or computing device, such as the touch screen and other input/control devices, to the peripherals interface. The I/O subsystem may include a display controller and one or more input controllers for other input or control devices. The one or more input controllers receive/send electrical signals from/to other input or control devices. The other input/control devices may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, and joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons may include an up/down button for volume control of the speaker and/or the microphone. The one or more buttons may include a push button. A quick press of the push button may disengage a lock of the touch screen or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, “Unlocking a Device by Performing Gestures on an Unlock Image,” filed Dec. 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the push button may turn power to the mobile or computing device on or off. The user may be able to customize a functionality of one or more of the buttons. The touch screen is used to implement virtual or soft buttons and one or more soft keyboards.
The touch-sensitive touch screen provides an input interface and an output interface between the device and a user. The display controller receives and/or sends electrical signals from/to the touch screen. The touch screen displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
A touch screen has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. The touch screen and the display controller (along with any associated modules and/or sets of instructions in memory) detect contact (and any movement or breaking of the contact) on the touch screen and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen and the user corresponds to a finger of the user.
The touch screen may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen and the display controller may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen.
A touch-sensitive display in some embodiments of the touch screen may be analogous to the multi-touch sensitive tablets described in the following U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in their entirety. However, a touch screen displays visual output from the portable mobile or computing device, whereas touch sensitive tablets do not provide visual output.
A touch-sensitive display in some embodiments of the touch screen may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 12, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
The touch screen may have a resolution in excess of 1000 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 1060 dpi. The user may make contact with the touch screen using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, the mobile or computing device may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, the mobile or computing device may include a physical or virtual click wheel as an input control device. A user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the user via the click wheel may be processed by an input controller as well as one or more of the modules and/or sets of instructions in memory. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen and the display controller, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by user contact with the touch screen.
The mobile or computing device also includes a power system for powering the various components. The power system may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
The mobile or computing device may also include one or more sensors, including but not limited to optical sensors. In one embodiment an optical sensor is coupled to an optical sensor controller in the I/O subsystem. The optical sensor may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with an imaging module (also called a camera module), the optical sensor may capture still images or video. In some embodiments, an optical sensor is located on the back of the mobile or computing device, opposite the touch screen display on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the user's image may be obtained for videoconferencing while the user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor can be changed by the user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.
The mobile or computing device may also include one or more proximity sensors. In one embodiment, the proximity sensor is coupled to the peripherals interface. Alternately, the proximity sensor may be coupled to an input controller in the I/O subsystem. The proximity sensor may perform as described in U.S. patent application Ser. No. 11/241,839, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 11/240,788, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 13/096,386, "Using Ambient Light Sensor To Augment Proximity Sensor Output"; Ser. No. 11/586,862, "Automated Response To And Sensing Of User Activity In Portable Devices," filed Oct. 24, 2006; and Ser. No. 11/638,251, "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen when the multifunction device is placed near the user's ear (e.g., when the user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
In some embodiments, the software components stored in memory may include an operating system, a communication module (or set of instructions), a contact/motion module (or set of instructions), a graphics module (or set of instructions), a text input module (or set of instructions), a Global Positioning System (GPS) module (or set of instructions), and applications (or set of instructions).
The operating system (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communication module facilitates communication with other devices over one or more external ports and also includes various software components for handling data received by the Network Systems circuitry and/or the external port. The external port (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over Network Systems 110. In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
The contact/motion module may detect contact with the touch screen (in conjunction with the display controller) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module and the display controller also detect contact on a touchpad. In some embodiments, the contact/motion module and the controller detect contact on a click wheel.
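For the movement quantities mentioned above, a small sketch (the (x, y, t) sample format is an assumption) shows how speed and velocity could be derived from two successive touch samples:

```python
# Compute speed (magnitude) and velocity (magnitude and direction) from two
# successive contact samples; acceleration would require a third sample
# (change in velocity over time).
import math

def motion_between(p0, p1):
    """p0, p1: (x, y, t) touch samples; returns (speed, velocity_vector)."""
    dx, dy, dt = p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]
    velocity = (dx / dt, dy / dt)
    speed = math.hypot(*velocity)
    return speed, velocity
```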
Examples of other applications that may be stored in memory include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with the touch screen, display controller, contact module, graphics module, and text input module, a contacts module may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating mobile device number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing mobile device numbers or e-mail addresses to initiate and/or facilitate communications by mobile device 201, video conference, e-mail, or IM; and so forth.
The foregoing description of various embodiments of the claimed subject matter has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art. Particularly, while the concept "component" is used in the embodiments of the systems and methods described above, it will be evident that such concept can be interchangeably used with equivalent concepts such as class, method, type, interface, module, object model, and other suitable concepts. Embodiments were chosen and described in order to best describe the principles of the invention and its practical application, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments, and the various modifications that are suited to the particular use contemplated.
Referring now to
The mobile or computing device 210 can include a display 1214 that can be a touch sensitive display. The touch-sensitive display 1214 is sometimes called a “touch screen” for convenience, and may also be known as or called a touch-sensitive display system. The mobile or computing device 210 may include a memory 1216 (which may include one or more computer readable storage mediums), a memory controller 1218, one or more processing units (CPU's) 1220, a peripherals interface 1222, Network Systems circuitry 1224, including but not limited to RF circuitry, audio circuitry 1226, a speaker 1228, a microphone 1230, an input/output (I/O) subsystem 1232, other input or control devices 1234, and an external port 1236. The mobile or computing device 210 may include one or more optical sensors 1238. These components may communicate over one or more communication buses or signal lines 1240.
It should be appreciated that the mobile or computing device 210 is only one example of a portable multifunction mobile or computing device 210, and that the mobile or computing device 210 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in
Memory 1216 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 1216 by other components of the mobile or computing device 210, such as the CPU 1220 and the peripherals interface 1222, may be controlled by the memory controller 1218.
The peripherals interface 1222 couples the input and output peripherals of the device to the CPU 1220 and memory 1216. The one or more processors 1220 run or execute various software programs and/or sets of instructions stored in memory 1216 to perform various functions for the mobile or computing device 210 and to process data.
In some embodiments, the peripherals interface 1222, the CPU 1220, and the memory controller 1218 may be implemented on a single chip, such as a chip 1242. In some other embodiments, they may be implemented on separate chips.
The Network System circuitry 1244 receives and sends signals, including but not limited to RF, also called electromagnetic signals. The Network System circuitry 1244 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. The Network Systems circuitry 1244 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. The Network Systems circuitry 1244 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication.
The wireless communication may use any of a plurality of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), BLUETOOTH®, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for email (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), and/or Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS)), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
The audio circuitry 1226, the speaker 1228, and the microphone 1230 provide an audio interface between a dwelling user, resource owner, or end-user and the mobile or computing device 210. The audio circuitry 1226 receives audio data from the peripherals interface 1222, converts the audio data to an electrical signal, and transmits the electrical signal to the speaker 1228. The speaker 1228 converts the electrical signal to human-audible sound waves. The audio circuitry 1226 also receives electrical signals converted by the microphone 1230 from sound waves. The audio circuitry 1226 converts the electrical signal to audio data and transmits the audio data to the peripherals interface 1222 for processing. Audio data may be retrieved from and/or transmitted to memory 1216 and/or the Network Systems circuitry 1244 by the peripherals interface 1222. In some embodiments, the audio circuitry 1226 also includes a headset jack. The headset jack provides an interface between the audio circuitry 1226 and removable audio input/output peripherals, such as output-only headphones or a headset with both output (e.g., a headphone for one or both ears) and input (e.g., a microphone).
The I/O subsystem 1232 couples input/output peripherals on the mobile or computing device 210, such as the touch screen 1214 and other input/control devices 1234, to the peripherals interface 1222. The I/O subsystem 1232 may include a display controller 1246 and one or more input controllers 1252 for other input or control devices. The one or more input controllers 1252 receive/send electrical signals from/to other input or control devices 1234. The other input/control devices 1234 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 1252 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons may include an up/down button for volume control of the speaker 1228 and/or the microphone 1230. The one or more buttons may include a push button. A quick press of the push button may disengage a lock of the touch screen 1214 or begin a process that uses gestures on the touch screen to unlock the device, as described in U.S. patent application Ser. No. 11/322,549, "Unlocking a Device by Performing Gestures on an Unlock Image," filed Dec. 23, 2005, which is hereby incorporated by reference in its entirety. A longer press of the push button may turn power to the mobile or computing device 210 on or off. The dwelling user, resource owner, or end-user may be able to customize a functionality of one or more of the buttons. The touch screen 1214 is used to implement virtual or soft buttons and one or more soft keyboards.
The touch-sensitive touch screen 1214 provides an input interface and an output interface between the device and a dwelling user, resource owner, or end-user. The display controller 1246 receives and/or sends electrical signals from/to the touch screen 1214. The touch screen 1214 displays visual output to the dwelling user, resource owner, or end-user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed "graphics"). In some embodiments, some or all of the visual output may correspond to user-interface objects, further details of which are described below.
A touch screen 1214 has a touch-sensitive surface, sensor or set of sensors that accepts input from the dwelling user, resource owner, or end-user based on haptic and/or tactile contact. The touch screen 1214 and the display controller 1246 (along with any associated modules and/or sets of instructions in memory 1216) detect contact (and any movement or breaking of the contact) on the touch screen 1214 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on the touch screen. In an exemplary embodiment, a point of contact between a touch screen 1214 and the dwelling user, resource owner, or end-user corresponds to a finger of the dwelling user, resource owner, or end-user.
The touch screen 1214 may use LCD (liquid crystal display) technology, or LPD (light emitting polymer display) technology, although other display technologies may be used in other embodiments. The touch screen 1214 and the display controller 1246 may detect contact and any movement or breaking thereof using any of a plurality of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with a touch screen 1214.
A touch-sensitive display in some embodiments of the touch screen 1214 may be analogous to the multi-touch sensitive tablets described in the following: U.S. Pat. No. 6,323,846 (Westerman et al.), U.S. Pat. No. 6,570,557 (Westerman et al.), and/or U.S. Pat. No. 6,677,932 (Westerman), and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference in their entirety. However, a touch screen 1214 displays visual output from the portable mobile or computing device 210, whereas touch sensitive tablets do not provide visual output.
A touch-sensitive display in some embodiments of the touch screen 1214 may be as described in the following applications: (1) U.S. patent application Ser. No. 11/381,313, “Multipoint Touch Surface Controller,” filed May 12, 2006; (2) U.S. patent application Ser. No. 10/840,862, “Multipoint Touchscreen,” filed May 6, 2004; (3) U.S. patent application Ser. No. 10/903,964, “Gestures For Touch Sensitive Input Devices,” filed Jul. 30, 2004; (4) U.S. patent application Ser. No. 11/048,264, “Gestures For Touch Sensitive Input Devices,” filed Jan. 31, 2005; (5) U.S. patent application Ser. No. 11/038,590, “Mode-Based Graphical User Interfaces For Touch Sensitive Input Devices,” filed Jan. 18, 2005; (6) U.S. patent application Ser. No. 11/228,758, “Virtual Input Device Placement On A Touch Screen User Interface,” filed Sep. 16, 2005; (7) U.S. patent application Ser. No. 11/228,700, “Operation Of A Computer With A Touch Screen Interface,” filed Sep. 16, 2005; (8) U.S. patent application Ser. No. 11/228,737, “Activating Virtual Keys Of A Touch-Screen Virtual Keyboard,” filed Sep. 16, 2005; and (9) U.S. patent application Ser. No. 11/367,749, “Multi-Functional Hand-Held Device,” filed Mar. 3, 2006. All of these applications are incorporated by reference herein in their entirety.
The touch screen 1214 may have a resolution in excess of 1000 dpi. In an exemplary embodiment, the touch screen has a resolution of approximately 1060 dpi. The dwelling user, resource owner, or end-user may make contact with the touch screen 1214 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which are much less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the dwelling user, resource owner, or end-user.
In some embodiments, in addition to the touch screen, the mobile or computing device 210 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from the touch screen 1214 or an extension of the touch-sensitive surface formed by the touch screen.
In some embodiments, the mobile or computing device 210 may include a physical or virtual click wheel as an input control device 1234. A dwelling user, resource owner, or end-user may navigate among and interact with one or more graphical objects (henceforth referred to as icons) displayed in the touch screen 1214 by rotating the click wheel or by moving a point of contact with the click wheel (e.g., where the amount of movement of the point of contact is measured by its angular displacement with respect to a center point of the click wheel). The click wheel may also be used to select one or more of the displayed icons. For example, the dwelling user, resource owner, or end-user may press down on at least a portion of the click wheel or an associated button. User commands and navigation commands provided by the dwelling user, resource owner, or end-user via the click wheel may be processed by an input controller 1252 as well as one or more of the modules and/or sets of instructions in memory 1216. For a virtual click wheel, the click wheel and click wheel controller may be part of the touch screen 1214 and the display controller 1246, respectively. For a virtual click wheel, the click wheel may be either an opaque or semitransparent object that appears and disappears on the touch screen display in response to dwelling user, resource owner, or end-user interaction with the device. In some embodiments, a virtual click wheel is displayed on the touch screen of a portable multifunction device and operated by dwelling user, resource owner, or end-user contact with the touch screen.
The mobile or computing device 210 also includes a power system 1214 for powering the various components. The power system 1214 may include a power management system, one or more power sources (e.g., battery 1254, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
The mobile or computing device 210 may also include one or more sensors 1238, including but not limited to optical sensors 1238. An optical sensor can be coupled to an optical sensor controller 1248 in the I/O subsystem 1232. The optical sensor 1238 may include charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) phototransistors. The optical sensor 1238 receives light from the environment, projected through one or more lenses, and converts the light to data representing an image. In conjunction with an imaging module 1258 (also called a camera module), the optical sensor 1238 may capture still images or video. In some embodiments, an optical sensor is located on the back of the mobile or computing device 210, opposite the touch screen display 1214 on the front of the device, so that the touch screen display may be used as a viewfinder for either still and/or video image acquisition. In some embodiments, an optical sensor is located on the front of the device so that the dwelling user, resource owner, or end-user's image may be obtained for videoconferencing while the dwelling user, resource owner, or end-user views the other video conference participants on the touch screen display. In some embodiments, the position of the optical sensor 1238 can be changed by the dwelling user, resource owner, or end-user (e.g., by rotating the lens and the sensor in the device housing) so that a single optical sensor 1238 may be used along with the touch screen display for both video conferencing and still and/or video image acquisition.
The mobile or computing device 210 may also include one or more proximity sensors 1250. In one embodiment, the proximity sensor 1250 is coupled to the peripherals interface 1222. Alternately, the proximity sensor 1250 may be coupled to an input controller in the I/O subsystem 1232. The proximity sensor 1250 may perform as described in U.S. patent application Ser. No. 11/241,839, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 11/240,788, "Proximity Detector In Handheld Device," filed Sep. 30, 2005; Ser. No. 13/096,386, "Using Ambient Light Sensor To Augment Proximity Sensor Output"; Ser. No. 11/586,862, "Automated Response To And Sensing Of User Activity In Portable Devices," filed Oct. 24, 2006; and Ser. No. 11/638,251, "Methods And Systems For Automatic Configuration Of Peripherals," which are hereby incorporated by reference in their entirety. In some embodiments, the proximity sensor turns off and disables the touch screen 1214 when the multifunction device is placed near the dwelling user, resource owner, or end-user's ear (e.g., when the dwelling user, resource owner, or end-user is making a phone call). In some embodiments, the proximity sensor keeps the screen off when the device is in the dwelling user, resource owner, or end-user's pocket, purse, or other dark area to prevent unnecessary battery drainage when the device is in a locked state.
In some embodiments, the software components stored in memory 1216 may include an operating system 1260, a communication module (or set of instructions) 1262, a contact/motion module (or set of instructions) 1264, a graphics module (or set of instructions) 1268, a text input module (or set of instructions) 1270, a Global Positioning System (GPS) module (or set of instructions) 1272, and applications (or set of instructions) 1272.
The operating system 1260 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
The communication module 1262 facilitates communication with other devices over one or more external ports 1274 and also includes various software components for handling data received by the Network Systems circuitry 1244 and/or the external port 1274. The external port 1274 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector that is the same as, or similar to and/or compatible with the 30-pin connector used on iPod (trademark of Apple Computer, Inc.) devices.
The contact/motion module 1264 may detect contact with the touch screen 1214 (in conjunction with the display controller 1246) and other touch sensitive devices (e.g., a touchpad or physical click wheel). The contact/motion module 1264 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred, determining if there is movement of the contact and tracking the movement across the touch screen 1214, and determining if the contact has been broken (i.e., if the contact has ceased). Determining movement of the point of contact may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., "multitouch"/multiple finger contacts). In some embodiments, the contact/motion module 1264 and the display controller 1246 also detect contact on a touchpad. In some embodiments, the contact/motion module 1264 and the controller 1286 detect contact on a click wheel.
Examples of other applications that may be stored in memory 1216 include other word processing applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
In conjunction with touch screen 1214, display controller 1246, contact module 1276, graphics module 1278, and text input module 1280, a contacts module 1282 may be used to manage an address book or contact list, including: adding name(s) to the address book; deleting name(s) from the address book; associating telephone number(s), e-mail address(es), physical address(es) or other information with a name; associating an image with a name; categorizing and sorting names; providing telephone numbers or e-mail addresses to initiate and/or facilitate communications by telephone, video conference, e-mail, or IM; and so forth.
The cloud lock can provide dwelling user, resource owner, or end-user access services with the utilization and allocation of hardware and software resource(s) to remote clients. The system can concurrently service requests from several clients without participant perception of degraded computing performance as compared to conventional techniques where computational tasks can be performed upon a client or a server within a proprietary intranet. The cloud services provider (e.g., a provider of secured dwelling user, resource owner, or end-user access with or without an intelligent door lock system 10) supports a collection of hardware and/or software resources. The hardware and/or software resources can be maintained by an off-premises party, and the resources can be accessed and utilized by identified participants over a Network System. Resources provided by the cloud services provider can be centrally located and/or distributed at various geographic locations. For example, the cloud services provider can include any number of data center machines that provide resources. The data center machines can be utilized for storing/retrieving data, effectuating computational tasks, rendering graphical outputs, routing data, and so forth.
In one embodiment the cloud is used for remote door 12 status operations, remote door operations for locking, unlocking, and the like.
According to an illustration, the cloud services provider can provide any number of resources such as data storage services, computational services, word processing services, electronic mail services, presentation services, spreadsheet services, gaming services, web syndication services (e.g., subscribing to a RSS feed), and any other services or applications that are conventionally associated with personal computers and/or local servers. Further, utilization of any number of the cloud service providers similar to the cloud services provider is contemplated. According to an illustration, disparate cloud services providers can be maintained by differing off-premise parties and a participant can employ, concurrently, at different times, and the like, all or a subset of the cloud services providers.
By leveraging resources supported by the cloud services provider, limitations commonly encountered with respect to hardware associated with clients and servers within proprietary intranets can be mitigated. Off-premises parties or Network System administrators of servers within proprietary intranets, can maintain, troubleshoot, replace and update the hardware resources. Further, for example, lengthy downtimes can be mitigated by the cloud services provider utilizing redundant resources; thus, if a subset of the resources are being updated or replaced, the remainder of the resources can be utilized to service requests from participants. According to this example, the resources can be modular in nature, and thus, resources can be added, removed, tested, modified, etc. while the remainder of the resources can support servicing participant requests. Moreover, hardware resources supported by the cloud services provider can encounter fewer constraints with respect to storage, processing power, security, bandwidth, redundancy, graphical display rendering capabilities, etc. as compared to conventional hardware associated with clients and servers within proprietary intranets.
The system can include a client device, which can be the wearable device and/or mobile device 201 that employs resources of the cloud services provider. Although one client device is depicted, it is to be appreciated that the system can include any number of client devices similar to the client device, and the plurality of client devices can concurrently utilize supported resources. By way of illustration, the client device can be a desktop device (e.g., personal computer), mobile device 201, and the like. Further, the client device can be an embedded system that can be physically limited, and hence, it can be beneficial to leverage resources of the cloud services provider.
Resources can be shared amongst a plurality of client devices subscribing to the cloud services provider. According to an illustration, one of the resources can be at least one central processing unit (CPU), where CPU cycles can be employed to effectuate computational tasks requested by the client device. Pursuant to this illustration, the client device can be allocated a subset of an overall total number of CPU cycles, while the remainder of the CPU cycles can be allocated to disparate client device(s). Additionally or alternatively, the subset of the overall total number of CPU cycles allocated to the client device can vary over time. Further, a number of CPU cycles can be purchased by the participant of the client device. In accordance with another example, the resources can include data store(s) that can be employed by the client device to retain data. The participant employing the client device can have access to a portion of the data store(s) supported by the cloud services provider, while access can be denied to remaining portions of the data store(s) (e.g., the data store(s) can selectively mask memory based upon participant/device identity, permissions, and the like). It is contemplated that any additional types of resources can likewise be shared.
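As a hedged sketch of apportioning CPU cycles among subscribing client devices (the share values, the proportional scheme, and the idea of a purchased top-up are illustrative assumptions, not the provider's actual allocator):

```python
# Divide a pool of CPU cycles among client devices in proportion to their
# entitlements, which may vary over time or be increased by purchase.
def allocate_cpu_cycles(total_cycles: int, shares: dict[str, float]) -> dict[str, int]:
    total_share = sum(shares.values()) or 1.0
    return {client: int(total_cycles * share / total_share)
            for client, share in shares.items()}

# Example: client_a purchased extra cycles this period, so its share is larger.
print(allocate_cpu_cycles(1_000_000, {"client_a": 0.5, "client_b": 0.3, "client_c": 0.2}))
```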
The cloud services provider can further include an interface component that can receive input(s) from the client device and/or enable transferring a response to such input(s) to the client device (as well as perform similar communications with any disparate client devices). According to an example, the input(s) can be request(s), data, executable program(s), etc. For instance, request(s) from the client device can relate to effectuating a computational task, storing/retrieving data, rendering a participant interface, and the like via employing one or more resources. Further, the interface component can obtain and/or transmit data over a Network System connection. According to an illustration, executable code can be received and/or sent by the interface component over the Network System connection. Pursuant to another example, a participant (e.g. employing the client device) can issue commands via the interface component.
In one embodiment, the cloud services provider includes a dynamic allocation component that apportions resources, which as a non-limiting example can be hardware resources supported by the cloud services provider, to process and respond to the input(s) (e.g., request(s), data, executable program(s), and the like) obtained from the client device.
Although the interface component is depicted as being separate from the dynamic allocation component, it is contemplated that the dynamic allocation component can include the interface component or a portion thereof. The interface component can provide various adaptors, connectors, channels, communication paths, etc. to enable interaction with the dynamic allocation component.
In one embodiment a system includes the cloud services provider that supports any number of resources (e.g., hardware, software, and firmware) that can be employed by the client device and/or disparate client device(s) not shown. The cloud services provider further comprises the interface component that receives resource utilization requests, including but not limited to requests to effectuate operations utilizing resources supported by the cloud services provider from the client device and the dynamic allocation component that partitions resources, including but not limited to, between participants, devices, computational tasks, and the like. Moreover, the dynamic allocation component can further include a participant state evaluator, an enhancement component and an auction component.
The participant state evaluator can determine a state associated with a participant and/or the client device employed by the participant, where the state can relate to a set of properties. For instance, the participant state evaluator can analyze explicit and/or implicit information obtained from the client device (e.g., via the interface component) and/or retrieved from memory associated with the cloud services provider (e.g., preferences indicated in subscription data). State related data yielded by the participant state evaluator can be utilized by the dynamic allocation component to tailor the apportionment of resources.
In one embodiment, the participant state evaluator can consider characteristics of the client device, which can be used to apportion resources by the dynamic allocation component. For instance, the participant state evaluator can identify that the client device is a mobile device 201 with limited display area. Thus, the dynamic allocation component can employ this information to reduce resources utilized to render an image upon the client device, since such a device may be unable to display a rich graphical participant interface.
Moreover, the enhancement component can facilitate increasing an allocation of resources for a particular participant and/or client device.
In one embodiment a system employs load balancing to optimize utilization of resources. The system includes the cloud services provider that communicates with the client device (and/or any disparate client device(s) and/or disparate cloud services provider(s)). The cloud services provider can include the interface component that transmits and/or receives data from the client device and the dynamic allocation component that allots resources. The dynamic allocation component can further comprise a load balancing component that optimizes utilization of resources.
In one embodiment, the load balancing component can monitor resources of the cloud services provider to detect failures. If a subset of the resources fails, the load balancing component can continue to optimize the remaining resources. Thus, if a portion of the total number of processors fails, the load balancing component can enable redistributing cycles associated with the non-failing processors.
In one embodiment a system archives and/or analyzes data utilizing the cloud services provider. The cloud services provider can include the interface component that enables communicating with the client device. Further, the cloud services provider comprises the dynamic allocation component that can apportion data retention resources, for example. Moreover, the cloud services provider can include an archive component and any number of data store(s). Access to and/or utilization of the archive component and/or the data store(s) by the client device (and/or any disparate client device(s)) can be controlled by the dynamic allocation component. The data store(s) can be centrally located and/or positioned at differing geographic locations. Further, the archive component can include a management component, a versioning component, a security component, a permission component, an aggregation component, and/or a restoration component.
The data store(s) can be, for example, either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. By way of illustration, and not limitation, nonvolatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM). The data store(s) of the subject systems and methods is intended to comprise, without being limited to, these and any other suitable types of memory. In addition, it is to be appreciated that the data store(s) can be a server, a database, a hard drive, and the like.
The management component facilitates administering data retained in the data store(s). The management component can enable providing multi-tiered storage within the data store(s), for example. According to this example, unused data can be aged-out to slower disks and important data used more frequently can be moved to faster disks; however, the claimed subject matter is not so limited. Further, the management component can be utilized (e.g. by the client device) to organize, annotate, and otherwise reference content without making it local to the client device. Pursuant to an illustration, enormous video files can be tagged via utilizing a cell phone. Moreover, the management component enables the client device to bind metadata, which can be local to the client device, to file streams (e.g., retained in the data store(s)); the management component can enforce and maintain these bindings.
Additionally or alternatively, the management component can allow for sharing data retained in the data store(s) with disparate participants and/or client devices. For example, fine-grained sharing can be supported by the management component.
The versioning component can enable retaining and/or tracking versions of data. For instance, the versioning component can identify a latest version of a document (regardless of a saved location within data store(s)).
The security component limits availability of resources based on participant identity and/or authorization level. For instance, the security component can encrypt data transferred to the client device and/or decrypt data obtained from the client device. Moreover, the security component can certify and/or authenticate data retained by the archive component.
The permission component can enable a participant to assign arbitrary access permissions to various participants, groups of participants and/or all participants.
Further, the aggregation component assembles and/or analyzes collections of data. The aggregation component can seamlessly incorporate third party data into a particular participant's data.
The restoration component rolls back data retained by the archive component. For example, the restoration component can continuously record an environment associated with the cloud services provider. Further, the restoration component can playback the recording.
An algorithm described hereafter computes the proximity of a Bluetooth device 21 of a dwelling user, resource owner, or end-user to the intelligent door lock system 10 and to the one or more bridges 11 in the dwelling. The relative signal strength of connections to these two devices during lock operations is recorded as a threshold value. When the proximity to the bridge 11, which is placed inside the home, is closer than it was before the lock operation, the algorithm determines that the device is inside the home.
In one embodiment, time spent with a relatively consistent signal strength value is a strong indicator of a person being in the dwelling. A rapid change of proximity following a lock operation is an indicator of arriving.
In one embodiment a lock device 22 operation of the intelligent door lock system 10 followed by a rapid change of proximity is an indicator of departing from the dwelling.
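The following is a minimal sketch, under assumptions, of the inside/outside determination described above: the signal strength to the interior bridge 11 at the moment of a lock operation is recorded as the threshold, and a later reading stronger than that threshold is taken to mean the device is inside. The function names, the margin value, and the example RSSI figures are illustrative, not taken from the text.

# Hedged sketch: classify a Bluetooth device 21 as inside or outside by
# comparing its current signal strength to the bridge 11 against the
# threshold recorded at the last lock operation.
def record_lock_threshold(rssi_to_bridge: float, rssi_to_lock: float) -> dict:
    """Snapshot relative signal strengths at the moment the lock operates."""
    return {"bridge": rssi_to_bridge, "lock": rssi_to_lock}

def is_inside(current_bridge_rssi: float, threshold: dict, margin_db: float = 3.0) -> bool:
    """Judged inside when the device is now closer to the interior bridge
    (stronger RSSI) than it was when the lock last operated."""
    return current_bridge_rssi > threshold["bridge"] + margin_db

# Example usage with made-up RSSI values (dBm):
threshold = record_lock_threshold(rssi_to_bridge=-72.0, rssi_to_lock=-55.0)
print(is_inside(current_bridge_rssi=-60.0, threshold=threshold))  # True: closer to the bridge than before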
The process of
At step 320, parameter data of the non-interconnect device is determined. The parameter data may include location, time, and/or velocity coordinates associated with the non-interconnect device at the time of the signal strength measurement. Step 320 may be performed in any variety of ways, including but not limited to the use of GPS information. Further, step 320 may be initiated by a predefined schedule or a predefined event, as discussed above.
At step 330, the signal strength and parameter data are transmitted to the cloud lock access services. Step 330 may be performed in any of the ways discussed above, including using upstream control, communication, or out-of-band channels of Network System. The signal strength and parameter data, and optionally additional data, may be combined to form network status data, which is transmitted to the cloud lock access services at step 330.
At step 340, the signal strength and parameter data are used to analyze the signal strength between the bridge 11 and a Bluetooth LE device 21. The network operations center 150 is able to process the data in any of the ways discussed above, including mapping the signal strength to geographic representations of the bridge 11 and a Bluetooth LE device 21, based on the parameter data. A graphical representation of at least a section of the strength of the signal between the bridge 11 and a Bluetooth LE device 21 may be generated to illustrate instances of measured signal strength plotted based on corresponding parameter data. Network operators may use the output of the cloud lock access services to analyze, configure, reconfigure, overhaul, and/or optimize the wireless network, as discussed above.
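A hedged sketch of the measure, parameterize, transmit, and analyze flow of steps 320-340 follows. Only the shape of the data (signal strength plus location, time, and velocity parameters) is taken from the text; the measurement, GPS, and transmission hooks are injected placeholders.

import json, time
from dataclasses import dataclass, asdict

@dataclass
class NetworkStatus:
    rssi_dbm: float          # measured signal strength between bridge 11 and BLE device 21
    latitude: float          # parameter data (e.g., from GPS) - step 320
    longitude: float
    velocity_mps: float
    timestamp: float

def collect_status(measure_rssi, read_gps) -> NetworkStatus:
    rssi = measure_rssi()                      # signal strength measurement
    lat, lon, vel = read_gps()                 # step 320: parameter data
    return NetworkStatus(rssi, lat, lon, vel, time.time())

def transmit(status: NetworkStatus, send):
    send(json.dumps(asdict(status)))           # step 330: upstream transmission

# Step 340 (cloud-side analysis) could then bin samples by position so that a
# signal-strength map of the bridge/device geometry can be rendered.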
The process of
At step 420, it is determined whether the measured signal strength is lower than a predetermined threshold. The predetermined threshold may be defined by network operators and may be based on a desired level of signal power that provides effective signal strength. If it is determined at step 420 that the measured signal strength is not lower than the predetermined threshold, the process returns to step 410, at which step another measurement of signal strength is obtained either immediately, according to an established schedule, or in response to a predetermined trigger event.
On the other hand, if it is determined at step 420 that the measured signal strength is lower than the predetermined threshold, the process continues at step 430. In one embodiment, at step 430, parameter data of the Bluetooth LE device 21 is determined. As non-limiting examples, the parameter data may include location, time, and/or velocity coordinates associated with the Bluetooth LE device 21 relative to the bridge 11. Step 430 may be performed in any of the ways discussed above, including using GPS signals to determine GPS coordinate data.
At step 440, it is determined whether the measured signal strength is adequate for transmission of data upstream to the cloud lock access services from the Bluetooth LE device 21. Step 440 may be performed by comparing the measured signal strength to a predetermined transmission threshold, which may be defined by network operators based on a level of signal power that supports reliable upstream data transmissions from the wireless device.
If it is determined at step 440 that the measured signal strength is inadequate for transmission of data, the process continues at step 445. At step 445, the signal strength and parameter data are buffered for subsequent transmission. Step 445 may be performed by storing the data to memory to maintain the data until it can be transmitted. In one embodiment, from step 445, the process returns to step 410 to obtain another measurement of signal strength. Multiple instances of data may be buffered until signal strength becomes strong enough for the stored data to be transmitted from the Bluetooth LE device 21. In other words, steps 410-440 may be repeated with different measurements being gathered and stored for later transmission when the signal strength becomes strong enough to support upstream transmissions.
If it is determined at step 440 that the measured signal strength is adequate for data transmission, the process continues to step 450. At step 450, the signal strength and parameter data are transmitted to the cloud lock access services. Step 450 may be performed in any of the ways discussed above, including using upstream control, communication, or out-of-band channels of the wireless network 144. The signal strength and parameter data, and optionally additional data, may be combined to form network status data, which is transmitted to the cloud lock access services at step 450.
At step 460, the signal strength and parameter data are used to analyze any number of parameters relative to Bluetooth LE device 21, particularly its location. The cloud is able to process the data in any of the ways discussed above, including mapping the signal strength to geographic representations of the wireless network 144, based on the parameter data. A graphical representation may be generated to illustrate instances of measured signal strength plotted based on corresponding parameter data.
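The sketch below illustrates, under assumptions, the buffering behavior of steps 410-460: a weak sample triggers parameter capture, samples are queued locally while the link cannot support an upstream send, and buffered samples are flushed once the link is adequate. The threshold values are placeholders, not values from the text.

from collections import deque

REPORT_THRESHOLD_DBM = -80.0    # step 420: below this, the sample is worth reporting
TX_THRESHOLD_DBM = -90.0        # step 440: minimum strength for a reliable upstream send

buffer = deque()                # step 445: local store for deferred samples

def on_sample(rssi_dbm, parameters, send_upstream):
    if rssi_dbm >= REPORT_THRESHOLD_DBM:
        return                              # step 420 -> back to step 410
    record = {"rssi": rssi_dbm, **parameters}   # step 430: attach parameter data
    if rssi_dbm >= TX_THRESHOLD_DBM:        # step 440: link adequate for transmission
        while buffer:                       # flush anything buffered earlier
            send_upstream(buffer.popleft())
        send_upstream(record)               # step 450: transmit to cloud lock access services
    else:
        buffer.append(record)               # step 445: keep for later transmission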
Triangulation forms sets of circles. Each of the reference points, access points 1, 2 or 3, becomes the center of a circle, and the distances between the object and the center, d1, d2 or d3, becomes the radius of that circle.
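A minimal trilateration sketch follows, assuming three access points and measured distances d1, d2, d3 (the circle radii described above). It linearizes the circle equations and solves the resulting 2x2 system; as noted next, real deployments would need a least-squares fit because the circles rarely meet at a single point. All coordinates and distances in the example are invented.

def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtract circle 3's equation from circles 1 and 2 to remove the quadratic terms.
    a1, b1 = 2 * (x3 - x1), 2 * (y3 - y1)
    c1 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Example: access points at (0,0), (10,0), (0,10); object truly at (3,4).
print(trilaterate((0, 0), (10, 0), (0, 10), 5.0, 8.062, 6.708))  # approx (3.0, 4.0)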
Triangulation estimates locations based on various intersection areas formed by these circles. If three formed circles meet at a single spot, that spot becomes the location estimate as a result of the triangulation. However, as a practical matter, the three circles rarely will meet at a single spot. More often, if the circles intersect, they will intersect in multiple spots. In
rms=√((Sa−ai)²+(Sb−bi)²+(Sc−ci)²) (1)
wherein (Sa, Sb, Sc) represents off-line signal strength measurements at the location of interest.
In particular, these five triplets are: signal strength triplet (a1, b1, c1) at position L1 (x1 , y1) from a, b and c; signal strength triplet (a2, b2, c2) at position L2 (x2, y2) from a, b and c; and signal strength triplet (a5, b5, c5) at position L5 (x5, y5) from a, b and c. L1, . . . , L5 are determined by using the location information database. The location information database for RF-based static scene analysis typically contains entries used to map RF signal metrics to positions (i.e., transfer from signal domain to space domain). The positions of these five locations are averaged to yield the location estimate of the object as follows:
L=(L1+L2+L3+L4+L5)/5 (2)
x=(A1+B2+C2)/3 (3)
In one embodiment the conventional static scene analysis maps from the radio signal domain to the space domain. The final estimate is typically within a coordinate system. A main drawback of static scene analysis is that it cannot effectively cope with the impact of errors in the radio signal domain. Due to interference and noise, objects at different locations might be represented similarly in the radio signal domain, a phenomenon called aliasing. The conventional methods cannot detect aliasing, and may provide these different locations with similar location estimates.
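A hedged sketch of the static scene analysis described above: compute the rms distance of equation (1) between the observed triplet (Sa, Sb, Sc) and each stored fingerprint (ai, bi, ci), then average the positions of the k closest fingerprints as in equation (2). The fingerprint database contents are invented for illustration.

import math

def rms(observed, fingerprint):
    # Equation (1): distance in the signal domain.
    return math.sqrt(sum((s - f) ** 2 for s, f in zip(observed, fingerprint)))

def estimate_location(observed, database, k=5):
    # database: list of (fingerprint_triplet, (x, y)) entries from the location information database.
    ranked = sorted(database, key=lambda entry: rms(observed, entry[0]))[:k]
    xs = [pos[0] for _, pos in ranked]
    ys = [pos[1] for _, pos in ranked]
    return sum(xs) / len(xs), sum(ys) / len(ys)   # Equation (2): average of the nearest positions

database = [((-60, -72, -80), (1.0, 2.0)),
            ((-62, -70, -79), (1.5, 2.5)),
            ((-75, -60, -68), (4.0, 1.0)),
            ((-58, -74, -82), (0.5, 2.0)),
            ((-70, -65, -70), (3.0, 3.0))]
print(estimate_location((-61, -71, -80), database, k=3))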
In one embodiment selective fusion location estimation (SELFLOC) algorithm selectively combines or fuses multiple location information sources to yield a combined estimate in a theoretically optimal manner. The SELFLOC algorithm is disclosed in U.S. patent application Ser. No. 10/330,523, filed Dec. 27, 2002, which is incorporated herein by reference.
The branch weights w1, w2 and w3 are calibrated during the off-line stage using error feedback. A minimum mean square error (MMSE) algorithm can be used for SELFLOC weight training and calibration. As shown in
X=w1·X1+w2·X2+w3·X3 (4)
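Below is a sketch of the weighted fusion in equation (4), X = w1·X1 + w2·X2 + w3·X3, together with an illustrative least-squares (MMSE-style) calibration of the branch weights from off-line training data. This is a generic fit under assumed toy data, not the SELFLOC implementation referenced above.

import numpy as np

def fuse(weights, estimates):
    """Combine three per-branch location estimates into one (equation 4)."""
    return float(np.dot(weights, estimates))

def train_weights(branch_estimates, ground_truth):
    """branch_estimates: N x 3 matrix of X1, X2, X3 per training sample;
    ground_truth: length-N vector of true coordinates. Solves for the weights
    minimizing mean squared error."""
    w, *_ = np.linalg.lstsq(branch_estimates, ground_truth, rcond=None)
    return w

# Toy calibration data (assumed): three branches, four training samples.
X = np.array([[1.1, 0.9, 1.0], [2.2, 1.8, 2.1], [3.0, 3.1, 2.8], [4.1, 3.9, 4.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])
w = train_weights(X, y)
print(fuse(w, np.array([2.5, 2.4, 2.6])))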
In one embodiment, a security system 10(a) for a dwelling user, resource owner, or end-user is provided as illustrated in
The camera 10(c) consumes less power in a standby mode because the first radio 10(d) consumes less power when configured to receive triggers and the third radio 10(d) is very efficient at transmitting video over Network Systems.
In one embodiment a generic input device (hereafter "keypad") is provided. The keypad can be part of the intelligent door lock system or be an accessory to the intelligent door lock system. In one embodiment the keypad is retrofitted to an existing intelligent door lock system after the intelligent door lock system has been installed. In one embodiment the keypad is installed when the intelligent door lock system is installed, and can be sold with the intelligent door lock system. In one embodiment the keypad is at an exterior of the dwelling and in another embodiment it is in the interior of the dwelling.
In one embodiment the keypad includes: a battery, keypad, a Bluetooth chip and board. Optionally included are LED lighting and a proximity sensor. Suitable examples of proximity sensors are disclosed herein.
The keypad provides for entering a communication that is encrypted, in order to gain access to the intelligent door lock system to lock and unlock. In one embodiment the communication is via Bluetooth Low Energy (BLE).
In one embodiment the keypad has a BLE range of 20-30 feet. In one embodiment the keypad is within 3-5 feet of the door. As a non-limiting example the keypad can have a communication distance of at least thirty feet.
In one embodiment, the dwelling user, resource owner, or end-user, on initial setup, programs the keypad via its mobile device 201, or other web-enabled device. The initial setup program is encrypted and can be achieved with symmetric key encryption, public key encryption and the like.
The dwelling user, resource owner, or end-user can communicate with the keypad by a variety of different mechanisms, including but not limited to entering digits, letters, codes, tapping, a code with a pattern and the like.
In one embodiment the proximity sensor is integral with the keypad. In one embodiment the keypad lights up via the LEDs as the dwelling user, resource owner, or end-user walks towards the keypad.
As non-limiting examples, the keypad can be configured to have codes with expiration times, codes that are only available for a certain amount of time, and codes that recur on an identified time basis; the dwelling user, resource owner, or end-user can set the availability of time for access via the keypad, who can use it, and how often it can be used.
In one embodiment the keypad can be programmed via a bridge 11. This can be achieved remotely.
As non-limiting examples the keypad can be utilized using a mobile device 201, a computing device, via an API and the like. As a non-limiting example, a delivery company can issue a pass to a delivery person for access to the dwelling. This can be done at any time, or at the last minute via an API.
In one embodiment, illustrated in
In one embodiment the intelligent security system includes an infrared detector mounted on a circuit board, along with a photoresistive detector for visible light. As non-limiting examples the following technologies can be used for the motion detection device 10(g): passive infrared (PIR); microwave, which detects motion through the principle of Doppler radar; ultrasonic; tomographic motion detection; video camera 10(c) software; and the like.
As non-limiting examples suitable motion detection devices 10(g) include but are not limited to Infrared (passive and active sensors); optics (video and camera systems); radio frequency energy (radar, microwave and tomographic motion detection); sound (microphones and acoustic sensors); vibration (triboelectric, seismic, and inertia-switch sensors); magnetism (magnetic sensors and magnetometers); and the like.
In one embodiment, in a first step, motion detection device 10(g) is used to detect motion of an individual approaching the dwelling. In a second step, if the motion detection device 10(g) detects the approach of the individual, then the camera 10(c) is turned on in sufficient time to take a face or body picture of the individual. In one embodiment, motion detection of the individual and turning on of camera 10(c) is processed in the cloud via its server/engine and the like, and in another embodiment in an intelligent door lock system back-end.
As a non-limiting example the first distance for the motion detection device 10(g) to detect approach of an individual is 5 meters, 10 meters and the like, and the first trigger is at 10 meters, 5 meters and the like. As a non-limiting example the second distance to wake up camera 10(c) can be 2 meters, and any distance sufficient for a camera 10(c) to identify that there is a person will suffice. In one embodiment the second distance can be 5 meters for body detection.
As a non-limiting example, as the approaching person reaches 5 meters, the motion detection device 10(g) signals that something has happened and wakes up camera 10(c); at 2 meters it determines whether it is a person. The camera 10(c) is awakened in sufficient time to take a picture and send a notice to the owner, to any device capable of receiving messages and notifications, to the cloud, and to the authorities such as law enforcement, who can then be dispatched to the dwelling.
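A minimal sketch of the two-stage wake-up described above follows: a far trigger from the motion detection device 10(g) wakes the camera 10(c), and a nearer check confirms a person before capturing an image and sending notices. The distances follow the example above; the camera and notification hooks are illustrative assumptions.

WAKE_DISTANCE_M = 5.0      # first trigger: something is approaching
PERSON_CHECK_M = 2.0       # second distance: confirm a person is present

def on_range_update(distance_m, camera, notifier, state):
    if distance_m <= WAKE_DISTANCE_M and not state.get("camera_awake"):
        camera.wake()                       # give the camera time to start up
        state["camera_awake"] = True
    if distance_m <= PERSON_CHECK_M and state.get("camera_awake"):
        frame = camera.capture()
        if camera.detect_person(frame):     # body/face detection (cloud or local back-end)
            notifier.send(owner=True, cloud=True, image=frame)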
In one embodiment the authorization sensing device 10(g) is a person sensing device, including but not limited to a button. As non-limiting examples, the button can be integrated with a doorbell, a body or person sensing device, a haptic device and the like. One embodiment of a suitable doorbell is disclosed in US 2004008205, incorporated herein by reference.
In other embodiments the camera 10(c) can be activated by an access authorization event. Suitable access authorization events include but are not limited to: use of an authorized mobile device 201 to unlock a door of the dwelling; detection of an approaching face by another camera 10(c) that is powered; someone pressing the doorbell via a mechanical switch or a capacitive sensor that senses touch; and the like. In other embodiments access to a dwelling is given to a person with one of the authorized devices recited above. In one embodiment, instead of a doorbell, a device is provided that translates mechanical movement or contact into an electrical signal. These devices include but are not limited to a rocker switch, body-heat sensitive switches, capacitive switches, pressure sensitive switches and the like.
In one embodiment the camera 10(c) is activated when a person is detected in proximity to an entrance to the dwelling. As a non-limiting example this can be achieved using a proximity sensor situated inside the doorbell; by pressure sensors on a dwelling floor; with the use of other proximity sensors with coverage in front of a dwelling access point such as a door; and the like.
In one embodiment the camera 10(c) is in an interior of the dwelling and the camera 10(c) is activated when a person entering the dwelling is detected.
In one embodiment a power supply powers the intelligent doorbell by extracting power from the two doorbell leads of the dwelling without ringing the doorbell, and without affecting the doorbell's ability to ring. In one embodiment the intelligent doorbell is a bridge 11 configured to communicate with another bridge 11.
In one embodiment the camera 10(c) is positioned at the doorbell and is activated by a sensor or when the doorbell is depressed. In one embodiment a doorbell module is integrated with the camera 10(c). In one embodiment the doorbell module of a dwelling user, resource owner, or end-user is coupled to a wireless camera 10(c) that provides for wireless transmission of an image, and the like.
In one embodiment the camera is a micro-camera 10(c) mounted to a circuit board and is positioned in alignment with a hole defined in a case for photographing the visitor.
In one embodiment of the present invention, illustrated in
In one embodiment the computing device 13 is configured to connect Bluetooth LE devices 21 to the Network Systems.
In one embodiment the bridge 11 is coupled to the intelligent door lock system 10 via secure digital keys distributed by the cloud lock access services.
In one embodiment the bridge 11 allows BLE devices in the dwelling to interact with the cloud lock access services and with other Internet-connected devices via the intermediary that is the cloud lock access services. It will be appreciated that the dwelling includes all structures besides homes.
In one embodiment the bridge 11 determines signal strength between the bridge 11 and the Bluetooth LE device 21. In another embodiment the bridge 11 determines signal strength between the bridge 11, the Bluetooth LE device 21 and the intelligent door lock system 10(a).
The retrieved signal strength is processed . . . . In one embodiment, as described below, a triangulation algorithm is applied between the bridge 11, the Bluetooth LE device 21 and the intelligent door lock system.
In one embodiment the bridge 11 uses detection of known Bluetooth devices and peripheral devices, hereafter collectively Bluetooth devices 21, tied to specific individual people in the interior or at an exterior of the dwelling. The bridge 11 tracks signal strength over time to determine: (i) if known or unknown people are inside or outside the dwelling, and (ii) if people are approaching the dwelling, entering the dwelling, exiting the dwelling, moving away from the building and the like. In one embodiment the bridge 11, with the detection of the presence of a Bluetooth device 21, relays lock operations of the intelligent door lock system (manual or via a mobile application), door 12 movements, and door 12 knocks to allow making these determinations of presence and movement with an algorithm as set forth below.
In one embodiment the bridge 11 interacts with the cloud lock access services to gather and relay data. This data can be gathered and stored locally, at the back-end 68, and in a cloud lock access services based data layer. This is then used to determine the location and movement of people in and out of the dwelling.
In one embodiment the bridge 11 discovers the intelligent door lock system 10 over a Bluetooth 21 network. In one embodiment this is achieved by the bridge discovering lock devices 22 and their available services by scanning the Bluetooth LE 21 network for connected devices advertising their presence and their services; the bridge obtains lock device 22 status (secured or unsecured), communicates lock device 22 activity, communicates door 12 activity (door 12 opening and closing, door 12 knocks, and the like), and operates the lock to lock and unlock the bolt 24 to secure or unsecure the lock device 22.
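The sketch below illustrates the discovery step in outline only: the bridge scans for advertising lock devices 22, reads their status service, and subscribes to door and lock activity. The scanner and peripheral objects, the service UUID, and the method names are hypothetical stand-ins for whatever BLE stack the bridge 11 actually uses.

LOCK_SERVICE_UUID = "0000aaaa-0000-1000-8000-00805f9b34fb"   # placeholder UUID, not a real assignment

def discover_locks(ble_scanner):
    """Return peripherals advertising the (assumed) lock service."""
    return [p for p in ble_scanner.scan(timeout_s=10)
            if LOCK_SERVICE_UUID in p.advertised_services]

def attach(lock):
    lock.connect()
    status = lock.read("lock_status")            # secured / unsecured
    lock.subscribe("door_activity", on_event)    # opening, closing, knocks
    return status

def on_event(event):
    print("door 12 event:", event)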
In one embodiment the bridge 11 provides communication to other Bluetooth devices 21 without the use of a mobile device 201. As non-limiting examples, the bridge 11: allows WiFi-enabled devices in a dwelling to interact with Bluetooth devices 21 in the dwelling; allows WiFi-enabled devices in a dwelling to interact with the intelligent door lock system 10 over Bluetooth; allows a Bluetooth device 21 in a dwelling to interact with Internet-based services and API's using the dwelling's home WiFi network and Network System connection; allows people to operate an intelligent door lock system and other Bluetooth devices over a Network System from anywhere outside a dwelling; and extends network coverage of Bluetooth devices in a dwelling in order to understand who is in the dwelling, who is away, who is coming and who is going when doors 12 and lock devices 22 are operated, and the like.
In one embodiment the bridge 11 extends Network System coverage of Bluetooth devices 21 other than lock devices 22 to perform device-specific operations, including but not limited to: gathering information about the presence of the Bluetooth device 21, the operational status of the Bluetooth device 21, the operational history of the Bluetooth device 21 and performing Bluetooth device 21 specific operations including but not limited to: turning the Bluetooth device 21 off and on, changing the mode of operations of the Bluetooth device 21, changing the operational settings of the Bluetooth device 21 and scheduling these device operations based on ad hoc, daily, weekly, monthly or other schedules.
In one embodiment the intelligent door lock system 10 trusts the bridge 11 for commands (remote status) after an intelligent door lock system owner or designee is registered at the back-end of the intelligent door lock system using a cloud lock access services-based access system that grants the bridge 11 access to the intelligent door lock system 10.
In one embodiment the intelligent door lock system 10 owner or designee grants the bridge 11 access to the lock device 22 by using their digital credentials, which can be stored at the cloud lock access services or at the back-end 68, to pair a specific bridge 11 with a specific intelligent door lock system 10 and grant specific rights. As non-limiting examples, the specific rights include but are not limited to gathering of status and operational history of the system 10, triggering lock device 22 operations in real-time, as well as applications for interfacing with the bridge 11 and a Bluetooth device 21.
In one embodiment the bridge 11 is used to determine if an intelligent door lock system 10 owner or designee with a non-internet-connected device is at an interior or an exterior of a dwelling.
In one embodiment the bridge 11 is used to determine if the person is approaching or moving away from the dwelling. In one embodiment the bridge 11 measures the signal strength of the Bluetooth LE devices 21.
In one embodiment, as a Bluetooth LE device 21 coupled to a person moves away from the bridge 11, the signal strength decreases, as more fully discussed hereafter. Similarly, as the signal strength increases this indicates that a person with the Bluetooth LE device is approaching the dwelling.
In one embodiment, each room of a dwelling with the intelligent door lock system has a bridge 11. In another embodiment, the major rooms of the dwelling each have a bridge 11.
In one embodiment the bridge 11 learns habits, movements, and the like of the intelligent door lock system 10 owners or designee.
In one embodiment a triangulation is provided between the bridge 11, the intelligent door lock system 10 and a Bluetooth LE device 21, as more fully explained hereafter.
In one embodiment the computing device 13 provides for coordination of information flow between the two radios 15 and 17. The computing device 13 is configured to enable the two radios 15 and 17 to communicate, and to take incoming and outgoing information from one radio into a format that the other radio can transmit and receive. The internet facing radio 15 is configured to communicate through a router 25 to the Network Systems; the Bluetooth LE devices 21 connect to Network Systems via one of the radios 15, 17, through the computing device 13, through the internet facing radio 15, and through the router 25, with the bridge 11 communicating with a data center 27. In one embodiment a router is not required when an alternative bridge 11 is constructed to bridge between cellular and BTLE.
In one embodiment the internet facing radio is configured to communicate through the router 25 to Network Systems. The Bluetooth LE devices 21 connect to Network Systems, via the computing device 13, with the bridge 11 communicating with a data center 27.
The computing device 13 provides for coordination of information flow between the two radios 15 and 17. Because most radios speak in different frequencies or protocols, packet sizes, and the like, the computing device 13 enables the two radios 15 and 17 to communicate, takes incoming and outgoing information from one radio into the proper format that the other radio can transmit and receive. In one embodiment the computing device makes the first and second radios 16 and 18 the same thing.
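An illustrative sketch, under assumptions, of this coordination role follows: frames arriving on the BLE-facing radio are reassembled and forwarded to the Internet-facing radio, and downstream messages are fragmented to fit BLE packet sizes. The radio interfaces and the payload size are hypothetical.

BLE_MAX_PAYLOAD = 20        # assumed BLE characteristic write size, for illustration only

def ble_to_internet(ble_radio, internet_radio):
    chunks = []
    while True:
        chunk = ble_radio.receive()
        if chunk is None:
            break
        chunks.append(chunk)
    if chunks:
        internet_radio.send(b"".join(chunks))   # reassembled message forwarded upstream

def internet_to_ble(internet_radio, ble_radio):
    message = internet_radio.receive()
    for i in range(0, len(message), BLE_MAX_PAYLOAD):
        ble_radio.send(message[i:i + BLE_MAX_PAYLOAD])  # fragment for the BLE link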
In one embodiment a wall wart in the dwelling is configured to communicate with other Bluetooth devices, including but not limited to redundant or backup power supplies, redundant data communications connections, environmental controls (e.g., air conditioning, fire suppression) and various security devices, thermostats, audio systems, appliances, gates, outdoor electrical equipment and the like.
In one embodiment the internet facing radio 15 is configured to communicate through the router 25 to Network Systems and Bluetooth LE devices 21 connected to Network Systems via the computing device 13. The bridge 11 communicates with the data center 27.
In one embodiment the computing device 13 is a wall wart, or equivalent element, which is a power adapter that contains the plug for a wall outlet.
In one embodiment the radios 15 and 17 transmit radio waves for communication purposes.
In one embodiment the bridge 11 provides at least a partial probability analysis of where a person with a Bluetooth LE device 21 is located, as well as of the existence of an adverse condition, including but not limited to entrance via a window or door to the dwelling.
Referring to
Third party secured access to a dwelling, which can be programmatic, is protected access in that it is governed by an access control code that is running on the server. The third party secured access to the dwelling can be automatically revoked, along with automatic revocation of access credentials.
In one embodiment credentials are granted for third party secured access to a dwelling, which can be programmatic, in a secure manner via the server. In one embodiment the server communicates with a server of the third party that is granted secured access to the dwelling. As a non-limiting example, the third party can be a service provider, including but not limited to a grocery delivery service, a house cleaning company/person, and package delivery organizations, including but not limited to FedEx, UPS, and the like, as defined above.
In this embodiment the dwelling user, resource owner, or end-user grants to a third party secured access to the dwelling, which can be programmatic and which can be via the intelligent door lock system 10. The access can be at a certain time of day/night, and for a certain length of time. In one embodiment a mobile device 201 is utilized, or a keypad can also be used. In one embodiment the third party asks for a customer account of an organization that has been granted previous access to the dwelling. The company is able to give its employees, consultants, associates and the like access to the dwelling via the intelligent door lock system. In one embodiment the access is granted at a certain date and time. In one embodiment one or more cameras 10(c) are utilized to video the activities of the person granted access to the dwelling. In one embodiment a first camera 10(c) is at the interior and a second one is at the exterior to video the actions of the third party. In one embodiment the videos can be uploaded and sent to the third party employer, and the like, for monitoring every activity and movement. An accessible database can be provided and used by the third party service provider.
As a non-limiting example, third party secured access to the dwelling user, resource owner, or end-user, which can be programmatic, is authenticated by a back end of intelligent lock system 10 or via the Cloud, and authorized with a resetting of the lock.
Notifications are provided to a third party system of the third party granted secured access to the dwelling user, resource owner, or end-user, along with an audit trail that can be stored for a defined time period, as well as perpetually.
In one embodiment there can be a transfer of access rights to a new resident of the dwelling for secure, authorized access to the dwelling. As a non-limiting example this is a secure transfer of rights, and the original occupants, owners or end-users of the dwelling are then dissociated from access rights without further rights, which can include resetting and a change of credentials.
In one embodiment this is achieved using a server which maintains access right privileges. The server is an intermediary. The person with the dwelling and the intelligent door lock system 10 grants permission via the server to give a third party secured access to the dwelling, which can be programmatic. Because that third party, including but not limited to a service provider, is authorized, it can give temporary rights to an individual or dwelling service provider, and then the occupant, owner or end-user of the dwelling and lock system 10 can revoke those rights at any time, for one or all. As non-limiting examples cameras 10(c) are utilized to see the person entering or exiting the dwelling. It will be appreciated that cameras 10(c) are optional. Without the use of cameras 10(c), the date and time of the third party unlocking or locking the intelligent door lock system 10 is provided to the third party, such as a service provider, as well as to the person with the dwelling and intelligent door lock system.
In one embodiment a lock system is coupled to a lock at a dwelling of a dwelling user, resource owner, or end-user, collectively the user. An intelligent door lock system is provided with a remotely operable lock at the dwelling accessible by the user, as illustrated in
In one embodiment, when the dwelling user, resource owner, or end-user enables an automatic unlock of system 10, it can view its current location on a map as a means of visual verification that it is being set up properly. In one embodiment a dwelling threshold region can be viewed and/or adjusted by the dwelling user, resource owner, or end-user. In this embodiment the dwelling user, resource owner, or end-user implicitly indicates that when it has left the area, then on returning to the area it would like its door unlocked as it approaches. A dwelling threshold region can be adjusted larger to compensate for poor GPS positioning available in a remote region.
As a non-limiting example the following steps can be followed. The dwelling user, resource owner, or end-user exits the dwelling. One or both inner and outer geo-fences 512, 514 can be exited. In the event that both geo-fences 512, 514 are exited, after a certain period of time door 12 is then locked. When only the inner fence 512 is exited, system 10 does not lock door 12. The server 510 can enable detailed GPS detection to verify that the dwelling user, resource owner, or end-user has left the area defined by geo-fences 512 and 514. The dwelling user, resource owner, or end-user can enter outer geo-fence 514 and the mobile device 210 App wakes up. The dwelling user, resource owner, or end-user's mobile device 210 examines data sufficient to determine whether to activate Bluetooth and search for door 12 in order to unlock door 12. The data can include time spent outside geo-fence 512 or geo-fence 514, time of day, past dwelling user, resource owner, or end-user activity patterns and habits, motion data recorded (including but not limited to driving activity, walking or running activity, and the like), dwelling user, resource owner, or end-user activity data, WiFi access point information and the like. The server 510 can enable detailed GPS detection to verify that the dwelling user, resource owner, or end-user has returned to the dwelling. Using this information the server 510 decides if the dwelling user, resource owner, or end-user is returning to the dwelling.
The server 510 synthesizes the configuration data (which can include thresholds and probabilities) with the activity data on the service and determines if the dwelling user, resource owner, or end-user is returning to the dwelling. In the case where there are multiple geo-fences, e.g., more than just inner geo-fence 512, the server 510 can decide multiple times if this is true in order to determine if the dwelling user, resource owner, or end-user is returning to the dwelling. Bluetooth is activated on the dwelling user, resource owner, or end-user's mobile phone to search for the dwelling lock. When the lock is detected the server 510 connects to it, establishes a secure connection, unlocks door 12, and can notify the dwelling user, resource owner, or end-user via mobile device 210 that door 12 has been unlocked.
System 10 receives notifications when the dwelling user, resource owner, or end-user has left and entered the area as defined by the one or more geo-fences 512, 514. At least a portion of the Bayesian algorithm, or other filtering algorithm, filters spurious location events that can be erroneous due to environmental or technical glitches, which as a non-limiting example can be a power outage. If system 10 sees that the dwelling user, resource owner, or end-user has re-entered the dwelling area less than 90 seconds after exiting, then system 10 can discard the event.
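A condensed sketch of the geo-fence logic described above follows: lock after both fences have been exited for a period, and on re-entry of the outer fence evaluate contextual data before enabling Bluetooth and unlocking. The delay value, the prediction hook on server 510, and the method names are placeholders, not values or interfaces taken from the text.

import time

LOCK_DELAY_S = 120          # assumed delay after both fences are exited

def on_fence_exit(exited_inner, exited_outer, state):
    if exited_inner and exited_outer:
        state["exit_time"] = time.time()      # both geo-fences 512, 514 exited

def maybe_lock(lock, state):
    if state.get("exit_time") and time.time() - state["exit_time"] > LOCK_DELAY_S:
        lock.lock()                           # door 12 locked after the delay

def on_outer_fence_entry(context, server, lock):
    # context: time away, time of day, motion history, WiFi access points, etc.
    if server.predict_returning_home(context):    # server 510 decision
        lock.enable_bluetooth_search()
        if lock.connect_secure():
            lock.unlock()
            server.notify_user("door 12 unlocked")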
In one embodiment of the present invention first and second geo-fences 512 and 514 are provided, also known as inner and outer geo-fences 512 and 514. When first and second geo-fences 512 and 514 are exited, after a certain time period door 12 is then locked or a notification is sent warning the user that door 12 at the user's dwelling has been unlocked. If only the first geo-fence 512 is exited, system 10 may not lock door 12. The user's mobile device 210 can also enable detailed GPS detection to verify that the user has left the area.
In one embodiment of the present invention an automatic-unlock feature uses a mobile device's 210 location features to tell when the mobile device 210 user has left a geo-fence 512 and/or 514 in an area outside of the dwelling. As a non-limiting example, this can be the first geo-fence 512, the second geo-fence 514, both, or just one. At a later time, when the mobile device 210 user returns to its neighborhood, an app of the mobile device 210 prepares to connect to the intelligent door lock system 10 via the system server 510 and/or cloud based server 510. As the mobile device 210 owner approaches an entrance to the dwelling the intelligent lock system 10 can automatically unlock a door 12.
In one embodiment the automatic unlock feature is on the dwelling user, resource owner, or end-user's mobile device 210 app, which uses the mobile device's 210 location features to tell when the dwelling user, resource owner, or end-user leaves. At this point the first geo-fence 512 is triggered. As a result of the dwelling user, resource owner, or end-user leaving, the geo-fence 512/514 is triggered, and when the dwelling user, resource owner, or end-user is at the dwelling, the geo-fence 512/514 is on and the Bluetooth is also on. At entrance to the first geo-fence 512, the Bluetooth is turned on and the Bluetooth creates a signal to unlock system 10. There is no need for the dwelling user, resource owner, or end-user to utilize its mobile device 210.
When the dwelling user, resource owner, or end-user enables an automatic unlock of system 10 it may view its current location on a map as a means of visual verification that it is being set up properly. A home threshold region can be viewed, adjusted and the like by the dwelling user, resource owner, or end-user. In one embodiment the dwelling user, resource owner, or end-user implicitly indicates that when it has left the area, then upon returning to the area it would like its door 12 unlocked as it approaches the door 12. A home threshold region is provided that can be adjusted larger to compensate for poor GPS positioning. As a non-limiting example this can be caused when the dwelling is in a remote region and the like.
As the dwelling user, resource owner, or end-user exits the dwelling, the first and second geo-fences 512 and 514 are exited. When this occurs, after a certain time period the door 12 is then locked. If only the first geo-fence 512 is exited, system 10 may not lock the door 12. The system 10 server 510 and/or cloud based server 510 can enable detailed GPS detection to verify that the dwelling user, resource owner, or end-user has left the area.
In one embodiment, as the dwelling user, resource owner, or end-user re-enters the second geo-fence 514, an app on the dwelling user, resource owner, or end-user's mobile device 210 wakes up and examines a host of data to determine whether to activate Bluetooth of the mobile device 210 and search for the door 12 to unlock. As non-limiting examples the data can include: the amount of time spent outside the first or second geo-fence 512, 514 respectively, time of day, past dwelling user, resource owner, or end-user activity patterns, motion data recorded by the device (including driving activity, walking or running activity), and WiFi access point information. The system 10 server 510 and/or cloud based server 510 can enable detailed GPS detection to verify that the dwelling user, resource owner, or end-user has returned home.
Using this information the system 10 server 510 and/or cloud based server 510 determines if the dwelling user, resource owner, or end-user is returning to the dwelling. The dwelling user, resource owner, or end-user's mobile device 210 synthesizes configuration data (which can include thresholds and probabilities) with activity data on the system 10 server 510 and/or cloud based server 510 and determines if the dwelling user, resource owner, or end-user is returning to the dwelling. Mobile device 210 can employ this logic multiple times over a period of time as input data changes to decide whether mobile device 210 believes that the user is returning home.
The Bluetooth is activated on the dwelling user, resource owner, or end-user's mobile phone to search for the dwelling lock. When the lock is detected the system 10 server 510 and/or cloud based server 510 connects to it, establishes a secure connection, unlocks door 12, and notifies the dwelling user, resource owner, or end-user via notification that door 12 has been unlocked.
System 10 receives notifications when the dwelling user, resource owner, or end-user has left and entered the area. Spurious location events are filtered out that could be erroneous due to environmental or technical glitches. As a non-limiting example this can be a power outage. If the system sees that the dwelling user, resource owner, or end-user has re-entered the home area less than a selected debounce period, provided by the system 10 server 510 and/or cloud based server 510 and as a non-limiting example 90 seconds, after exiting, then the system may discard the event.
In one embodiment a debounce is provided by the system and/or cloud based server 510 such that the dwelling user, resource owner, or end-user must leave and be outside a geo-fence 512, 514 (a second geo-fence as described hereafter), and the like, for a selected period of time in order for an entrance to trigger a Bluetooth connection to the lock to unlock door 12. With the debounce, the dwelling user, resource owner, or end-user can leave and come back quickly and then is not considered to have left the dwelling. The use of the debounce eliminates false unlocking of system 10 when the dwelling user, resource owner, or end-user is at the dwelling and artifacts from cell tower, WiFi and other systems that provide assistance in providing location data are faulty and incorrectly assert that the user has left the dwelling.
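A sketch of the debounce described above follows: a departure only "counts" if the user stays outside the geo-fence for a minimum dwell time, so a quick leave-and-return does not trigger the automatic unlock flow. The 90-second value follows the example in the text; the class structure is illustrative.

DEBOUNCE_S = 90

class Debounce:
    def __init__(self):
        self.exit_time = None

    def on_exit(self, now):
        self.exit_time = now            # geo-fence exit event

    def on_entry(self, now) -> bool:
        """Return True only if this entry should trigger the unlock flow."""
        if self.exit_time is None:
            return False
        left_long_enough = (now - self.exit_time) >= DEBOUNCE_S
        self.exit_time = None
        return left_long_enough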
In one embodiment intelligent door lock system 10 is configured to have a remotely operable lock 12 at a dwelling accessible by the user. The intelligent door lock system 10 is configured to be in communication with a server 510. The user communicates with the server 510 using the user's mobile device 210, and the server 510 is configured to transmit a crossing notification message in response to tracking the user's mobile device 201 and enable an automatic unlock feature of the lock using the server 510 and a mobile device 201 App.
In one embodiment a second geo-fence 514 is provided, or the first geo-fence 512 is extended, for early entrance detection. The second geo-fence 514 is larger than the first geo-fence 512 and can be of a different geometric configuration. The second geo-fence 514 can increase or decrease the first geo-fence 512, or a combination of both. In one embodiment the second geo-fence 514 is adjustable. The second geo-fence 514 increases sensitivity.
In another embodiment a map view position and radius adjustment is made. In this embodiment system 10 does not unlock until after the mobile device 210 owner has manually performed an unlock.
In one embodiment data is collected and false motion data is filtered out.
In one embodiment mobile device 210 stores data over time as events occur. As non-limiting examples this can include motion data, geo-fence regions exited or entered, and the like. Any configuration of parameters from the server 510 can be downloaded at any time as well. Each geo-fence 512/514 is associated with a lock or door 12. When a geo-fence area has been entered, the mobile device 201 collects the history of crossing geo-fences for that lock and creates a data set along with other live data, including but not limited to WiFi access point data, accumulated motion data during the time the user was outside a geo-fence, time of day, and other information. That information is processed based on the configuration data, producing a limited set of parameter types and associated values. These are fed to a decision making engine of mobile device 210, which may be a Bayesian filter and the like, that associates a different probability value with each of these parameters and combines them to form a single yes/no decision on whether the user desires door 12 to be locked or unlocked. In the unlocked case, the device enables Bluetooth and begins trying to connect to the dwelling lock. When the dwelling lock is connected the device unlocks door 12. In the case of the lock being bridged to the internet via a wireless access point, Bluetooth is not used; the mobile device 210 connects to the bridge 11 via Network System. The lock is then locked or unlocked remotely by mobile device 210 using the Network System to a cloud service and/or system 10 backend that establishes a secure connection to the lock. In this embodiment there is a wall wart, doorbell, or another device that includes Bluetooth and is within an appropriate distance of door 12.
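A hedged sketch of such a decision-making engine follows: each parameter contributes a probability that the user is returning home, and the combined score yields the single yes/no unlock decision. The specific features and probability values are illustrative assumptions, not the parameters used by the actual filter.

def combine(probabilities, prior=0.5):
    """Naive Bayes-style fusion of independent evidence into one posterior."""
    odds = prior / (1.0 - prior)
    for p in probabilities:
        odds *= p / (1.0 - p)
    return odds / (1.0 + odds)

def should_unlock(features):
    evidence = []
    evidence.append(0.8 if features["minutes_away"] > 10 else 0.3)   # time outside the geo-fence
    evidence.append(0.7 if features["motion"] == "driving" else 0.4) # recorded motion activity
    evidence.append(0.9 if features["home_wifi_visible"] else 0.5)   # WiFi access point data
    return combine(evidence) > 0.5     # single yes/no decision

print(should_unlock({"minutes_away": 45, "motion": "driving", "home_wifi_visible": True}))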
In one embodiment security system 10(a) is an intelligent security system that produces fewer false alarms or alerts. In one embodiment security system 10(a) uses motion detection device 10(g) to look for an actual person, which can be an outline of a person, at or near the door 12 or other entrance to the dwelling. If motion detection device 10(g) with camera 10(c) sees a person, an outline of a person, and the like, an alert is sent to the mobile device 210 of the dwelling user, resource owner, or end-user. The dwelling user, resource owner, or end-user views the video, picture and the like taken by camera 10(c) and has various options including but not limited to: determine who is at the dwelling; communicate with the person who is at the dwelling; notify authorities of an unwanted person at the dwelling; unlock or lock the dwelling, including but not limited to door 12, a window and the like, in order to allow or deny access to the person.
In one embodiment intelligent security system 10(a) is split into a plurality of parts. As a non-limiting example intelligent security system 10(a) can be split into parts that: (i) wake up the motion detection device 10(g) and prepare for a video call; (ii) detect motion, with or without person detection; (iii) detect people at the dwelling without linger detection; and (iv) detect people at the dwelling with linger detection.
As a non-limiting example intelligent security system 10(a) triggers motion detection device 10(g) to wake up and turn on camera 10(c) when motion is triggered. As a non-limiting example one use of this capability is to prepare for a video call in response to a doorbell press. When motion is detected, camera 10(c) takes a still image to be used in a notification and makes other preparations for a video call to the dwelling occupant in the event that the doorbell is pressed, or the dwelling occupant initiates a video call in response to a motion/person notification. As a non-limiting example motion detection device 10(g) does whatever is possible ahead of time to speed up the perceived connection time of a notice to the dwelling occupant, which can be a video call, and the like.
When the short range motion detection device 10(g) is triggered, the doorbell wakes up and camera 10(c) takes a still image. A countdown to a timeout begins.
If the doorbell button is pressed before the timeout, a doorbell press notification is sent to dwelling user, resource owner, or end-user and a motion notification is not sent.
If the doorbell button is not pressed before the timeout, a motion notification can be sent.
As a non-limiting example mobile device OS motion notification can include the following text or equivalent: “Motion detected at <door name> at <house name>”. The in-app accept/reject UI can be the same as for a button press except with the following differences: (i) it can have the following text: “Motion detected at <door name> at <house name>”; (ii) it will display the still image taken by camera 10(c) when it was awakened by the motion it detected; (iii) the red X and green check icons can remain the same, but the labels below them can be “Dismiss” and “View” (note: the timeout can be a variety of different times; as a non-limiting example it can be in the 5-10 second range, although other ranges can be utilized); (iv) a motion event can be put into an activity feed with the image taken in response to the motion, and this can happen regardless of whether the dwelling user, resource owner, or end-user responds to the notification.
In one embodiment if the dwelling user, resource owner, or end-user taps “View” to initiate a video call, an on-demand video session can be put into the activity feed. In this embodiment, there can be two events in the activity feed, one for the motion and one for the video call.
In one embodiment toggle settings can be provided in settings to enable/disable motion alerts. In one embodiment person notifications are provided. As a non-limiting example the person notifications are sent to the dwelling user, resource owner, or end-user.
As a non-limiting example person detection can be based on whether there is a person in the frames captured by camera 10(c), which can be included with the doorbell as described above. As a non-limiting example there is detection of a person via camera 10(c)/doorbell and the like. In one embodiment there is no checking that the same person is being detected in every frame captured by camera 10(c).
When either the long range or short range motion sensor 10(g) is triggered, camera 10(c)/doorbell wakes up, takes a still image and initiates intelli-vision analytics. Camera 10(c)/doorbell can continue taking still images, save them, and feed them to the server.
A countdown to a timeout can begin.
If the doorbell button is pressed before the timeout, a doorbell press notification can be sent, and no motion or person notification is sent.
If the doorbell button is not pressed before the timeout, either a motion notification or a person notification is sent.
If the analytics engine does not detect a person at any time during the timeout period, a motion notification is sent as described above.
If the analytics engine does detect a person, at any point during the timeout, a person notification is sent.
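The notification selection steps above can be summarized in a short sketch. This Python fragment is illustrative only; the function name and inputs are assumptions, and it simply encodes the precedence rule that a doorbell press suppresses motion/person notifications and that a person detection takes precedence over plain motion.

```python
def decide_notification(doorbell_pressed, person_detected_during_timeout):
    """Return the single notification to send for one activation of the
    intelligent security system 10(a), per the rules described above."""
    if doorbell_pressed:
        return "doorbell_press"   # motion/person notifications are suppressed
    if person_detected_during_timeout:
        return "person"           # analytics engine saw a person before timeout
    return "motion"               # timeout expired with motion only

# Example: timeout elapsed, no button press, analytics detected a person.
assert decide_notification(False, True) == "person"
```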
In one embodiment the mobile device OS notification can have the following text: “Person detected at <door name> at <house name>”. In one embodiment the in-app accept/reject UI is the same as for a doorbell press, but may instead have the following text: “Person detected at <door name> at <house name>”. In one embodiment it can display the first image that the server/analytics engine tags as containing a person. The red X and green check icons can remain the same, but the labels below them can be “Dismiss” and “View”.
The timeout period is then selected. As a non-limiting example this can be 5-10 seconds. It will be appreciated that other periods of time can be used for the timeout period.
In one embodiment Person events are placed in the Activity Feed the same way described above for motion events. In one embodiment the motion and person events are mutually exclusive. For a given activation of the intelligent security system 10(a), a person event or motion event can be placed into the Activity Feed, not both.
In one embodiment on demand video session events in the Activity Feed work the same as described above. As a non-limiting example these events can be in addition to motion/person events.
In one embodiment there are two toggles in settings, one to enable/disable motion notifications, and the other to enable/disable person notifications. The defaults for these settings need to be determined. In one embodiment the person notifications can be on by default and motion notifications can be off by default.
Lingering notifications can be selected.
In one embodiment “Person Notifications” only detects whether a frame contains a person. It does nothing to track a person from frame to frame, or to determine if it is the same person in each frame. The result is that on a busy street where people frequently pass by the doorbell and/or camera 10(c), the person notification feature is triggered even though no one person is actually lingering in front of door 12. This issue can be addressed using the server/analytics engine's ability to detect lingering. From a user perspective, the feature can operate the same way and can surface the same settings. In one embodiment system 10 only sends person notifications when it is detected that the same person is lingering in front of the door 12, window and the like. This provides better accuracy and fewer false person notifications.
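The lingering check can be illustrated with a short sketch. The following Python example is hypothetical; it assumes the server/analytics engine exposes per-frame person identifiers from a tracker, and the frame-count threshold is an assumption rather than a value specified by the system.

```python
from collections import Counter

def is_lingering(frame_person_ids, min_frames=15):
    """Return True if any single tracked person ID appears in at least
    `min_frames` of the analyzed frames (the same person stays in view),
    rather than merely some person appearing in each frame.

    frame_person_ids: list of sets, one set of person IDs per frame, as
    might be produced by a tracker in the server/analytics engine
    (hypothetical interface; the threshold is assumed).
    """
    counts = Counter()
    for ids in frame_person_ids:
        counts.update(ids)
    return any(n >= min_frames for n in counts.values())

# Passers-by produce many short-lived IDs; a lingering visitor produces one
# ID that persists across frames and triggers the person notification.
```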
In one embodiment camera 10(c) is coupled to an analytics engine. The analytics engine includes a person detection module which wakes camera 10(c) from the sleep state earlier.
In one embodiment intelligent security system 10(a) has reduced latency. In one embodiment the reduced latency is achieved by the analytics engine/person detection module. As a person approaches and comes to door 12 or any dwelling entrance, including but not limited to a window, and the like, camera 10(c) is activated with or without pressing the doorbell, wakes up from a sleep mode, and the intelligent security system 10(a) is ready to go. Motion detection device 10(g), with or without the analytics engine, wakes camera 10(c) up from the sleep mode, and when the person is at the entrance of the dwelling the camera is set to take a picture, video with one or more frames, and the like. This reduces latency and improves speed. As a non-limiting example, the time from activation of camera 10(c), integrated or not integrated with the doorbell, upon the pressing of the doorbell (when the doorbell and camera 10(c) are coupled and/or integrated) to the sending of a picture or video to a mobile device 210 can be on the order of 30 seconds or less. This results from motion detection device 10(g) starting recognition of a person or an object at an earlier time. When the person is at the dwelling entrance, camera 10(c) is then set to start recording.
In one embodiment security system 10(a) has reduced power consumption. In this embodiment the security system 10(a) camera 10(c) is only started up from its sleep state to a wake state when something passes by and/or is within a selected distance relative to camera 10(c), which is detected by motion detection device 10(g) and/or the person detection module. More particularly, the person detection module is used alone or together with motion detection device 10(g) to determine if something or someone passes by. In response to that determination, if a person or something similar to a person passes by, camera 10(c) is awakened and put in an active state. This results in a savings of power because camera 10(c) is in the active state less often.
In one embodiment a server 510 is in communication with the intelligent door lock system 10. In another embodiment a low power device that communicates with small payloads is used instead of the intelligent door lock system 10. In one embodiment low power means running off of consumer batteries. In one embodiment a small payload is 12 bytes or less. The server 510 has a handshake key (Kh) with a key exchange that provides for a communication session between a mobile device 210 and the intelligent door lock system 10. The mobile device 210 does not have the (Kh). A user mobile device 210 is in communication with the server 510 and uses a cipher to provide secured communication between the mobile device 210 and the intelligent door lock system 10.
In one embodiment cipher block chaining is utilized to provide the secured communication.
In one embodiment an initialization is done when the intelligent door lock system 10 and/or the lock 22 is created. In one embodiment communication is initialized by generating a session nonce (Ns) on the mobile device 210. In one embodiment the session nonce is encrypted using a handshake key (Kh) that is at the server 510. In one embodiment the (Kh) is a shared secret known to intelligent door lock system 10 and to an entity encrypting the nonce that includes the server 510. In one embodiment the intelligent door lock system 10 modifies the nonce and returns a modified value to the mobile device 210. In one embodiment an entity with the server 510 that has the (Kh) decrypts a session key, and the mobile device 210 uses that to encrypt subsequent communication, wherein the intelligent door lock system 10 generates the session key and provides for an unlocking of a lock 22 at the intelligent door lock system 10.
In one embodiment when the intelligent door lock system 10 is assembled a random number is provisioned on the lock 22.
In one embodiment the (Kh) can be common to all locks 22 of a given lock model.
In one embodiment the (Kh) is a shared secret between the server 510 and all locks 22 of a model class.
As a non-limiting example the (Kh) is exchanged for a unit-specific unit key (Ku). As a non-limiting example the (Ku) is a random number.
In one embodiment the server 510 associates the (Ku) with a unique identification of a lock 22 of the intelligent door lock system 10. In one embodiment the cloud service responds to the mobile device 210 with a unit key encrypted using an open block cipher. As a non-limiting example the (Ku) is sent to the lock 22 and the lock 22 stores it in non-volatile memory, with the (Ku) used as a handshake key. In one embodiment the mobile device 210 generates two random numbers that are communicated with the server 510 to create one half of the (Kh). In response the server 510 generates one or more random numbers to create a second half of the (Kh). For each session the server 510 creates a different set of one or more numbers for the second half of the (Kh).
In one embodiment, in response to the server 510 creating the second half of the (Kh), the mobile device 210 functions without having the (Kh).
In one embodiment the mobile device 210 generates a checksum. In one embodiment the mobile device 210 creates a request for a private communication, and in response the server 510 creates a second half of the (Kh) without the mobile device 210 having the (Kh). In response to transmission of the (Kh) a session is initiated that provides for unlocking of the door 12. Production of half of the (Kh) by the mobile device 210 initiates a request packet for a private communication, and thereafter a validation of the source is conducted.
As a non-limiting example a third party mobile device 210 never has on it an encryption key for the intelligent door lock system 10.
Terms:
The following terms are used: encryption function: e(key, block)
decryption function: d(key, block)
random number function: r(number)
The handshake key will be called Kh
The generated session key, Ks
A 32-bit constant used during Initialization, IC
A 32-bit constant used during Initialization, RC
mFooBar means foobar for the mobile device 210
lFooBar means foobar for the lock (it's an el not an eye).
Communication is initialized by generating a session nonce (Ns) on the mobile device 210. This session nonce is encrypted using a handshake key (Kh). The handshake key is a shared secret known to the lock and to the entity encrypting the nonce. The lock then modifies the nonce and returns the modified value to the mobile device 210. The entity that has the handshake key then decrypts the session key and the mobile device 210 uses that to encrypt subsequent communication.
Factory Initialization
When the lock is assembled, a 16-byte key (random number) can be provisioned on the device. This key can be common to all locks of a given model and referred to as model key (Km). The model key is a shared secret between the August API server 510 and all locks of that model class. It can only be used at setup or factory reset time to exchange a new, unit-specific unit key (Ku).
Owner Initialization
When the owner of the lock sets it up for the first time, or after a factory reset of the lock, the mobile application can generate a 16-byte random number and send it to the server 510. This number can be referred to as the unit key (Ku). The API server 510 associates the unit key with the unique ID of the lock. This is a shared secret between the lock and the service.
After associating the unit key with the lock, the service responds to the mobile device 210 with the unit key encrypted using AES CBC. The mobile device 210 then sends this key to the lock and the lock stores it in non-volatile memory. This unit key can be used as the handshake key (Kh) in future communications.
The mobile device 210 generates a key, sends it to the service server 510 for encryption.
When a mobile device 210 has authenticated against the August REST API and has been authorized to communicate with a particular lock, it generates two 32-bit random values: mRand1 and mRand2, which combined constitute the session nonce (Ns).
It also generates a 32-bit checksum, mKeyChecksum, that is the two's-complement of the sum of the two random numbers.
The service server 510 encrypts the key, sends encrypted data back to mobile device 210.
This set of numbers is transmitted to the server 510, where it is encrypted using the handshake key:
Kh=Ku
mKeyExchange=e(Kh,{mRand1,mRand2,mKeyChecksum})
mKeyExchange is then returned to the mobile device 210.
The mobile device 210 transmits the key to the low power device where data is validated. When the mobile device 210 receives the encrypted key from the server 510, it sends it to the lock. The lock then decrypts the message:
Kh=Ku
{mRand1′,mRand2′,mKeyChecksum′}=d(Kh,mKeyExchange)
The lock confirms the validity of the message by summing the three decrypted 32-bit values:
mKeyCheck=mRand1′+mRand2′+mKeyChecksum′
If mKeyCheck is non-zero, the lock disconnects from the mobile device 210.
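The checksum arithmetic used throughout this exchange can be illustrated as follows. This Python sketch only demonstrates the 32-bit two's-complement construction and the wrap-to-zero validation; the example values are arbitrary and the helper names are assumptions.

```python
MASK32 = 0xFFFFFFFF

def make_checksum(*values):
    """Two's-complement of the 32-bit sum, so that the values plus the
    checksum sum to zero modulo 2^32 (as with mKeyChecksum above)."""
    return (-sum(values)) & MASK32

def check_sum(*values_and_checksum):
    """A non-zero result means the decrypted message is rejected."""
    return sum(values_and_checksum) & MASK32

# Mobile side (arbitrary illustrative values):
mRand1, mRand2 = 0x12345678, 0x9ABCDEF0
mKeyChecksum = make_checksum(mRand1, mRand2)

# Lock side, after decrypting mKeyExchange:
mKeyCheck = check_sum(mRand1, mRand2, mKeyChecksum)
assert mKeyCheck == 0  # otherwise the lock disconnects from mobile device 210

# The lock-side lKeyChecksum works the same way, with the constant LX included:
# lKeyChecksum = make_checksum(LX, lRand1, lRand2)
```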
Lock Generates Key, Encrypts, and Sends Data
The lock now generates two 32-bit random values, lRand1 and lRand2, and a 32-bit lKeyChecksum that is the two's-complement of the sum of lRand1, lRand2, and the 32-bit constant LX.
lKeyChecksum=2^32−(LX+lRand1+lRand2)
The values are combined into a message and encrypted using the unit key, Kh.
lKeyExchange=e(Kh,{LX,lRand1,lRand2,lKeyChecksum})
The value lKeyExchange is sent to the mobile device 210.
Mobile device 210 transmits the encrypted key to service server 510 which validates it. The mobile device 210 sends lKeyExchange to the server 510, which uses the unit key to decrypt it:
Kh=Ku
{LX′,lRand1′,lRand2′,lKeyChecksum′}=d(Kh,lKeyExchange)
The server 510 confirms the validity of the message by summing the four decrypted 32-bit values:
lKeyCheck=LX′+lRand1′+lRand2′+lKeyChecksum′
If lKeyCheck is not zero, the server 510 returns an error to the mobile device and the mobile device 210 disconnects from the lock. If lKeyCheck is zero, the server 510 returns lRand1′ and lRand2′ to the mobile device 210.
The mobile device 210 generates the candidate session key, mSK:
mSK={mRand1,mRand2,lRand1′,lRand2′}
The mobile device 210 initiates a test communication.
The mobile device 210 then generates two 32-bit random values, iRand1 and iRand2, as well as another 32-bit value, iInitChecksum, that is the two's-complement of the sum of the two random values and a 32-bit constant, IC:
iInitChecksum=2^32−(IC+iRand1+iRand2)
The values are combined into a message and encrypted using the candidate session key, mSK.
iInitComm=e(mSK,{IC,iRand1,iRand2,iInitChecksum})
The mobile device 210 then transmits iInitComm to the lock.
The low power device validates an initialization command. When the lock receives the candidate session key from the mobile device, it generates its candidate session key (lSK) from the generated and shared random values and decrypts the message:
lSK={mRand1′,mRand2′,lRand1,lRand2}
{IC′,iRand1′,iRand2′,iInitChecksum′}=d(lSK,iInitComm)
The low power device confirms the validity of the message by summing the four decrypted 32-bit values:
iInitCheck=IC′+iRand1′+iRand2′+iInitChecksum′
If iInitCheck is non-zero or if IC′ does not match the expected value for IC, the lock disconnects from the mobile device 210. If the message is valid, communication is now considered secure by the lock and future messages can be decrypted using Ks=lSK.
The intelligent door lock system 10 sends an initialization response.
The intelligent door lock system 10 now generates a 32-bit value, rInitChecksum, that is the two's-complement of the sum of the random values iRand1′ and iRand2′ and a 32-bit constant, RC.
rInitChecksum=2^32−(RC+iRand1′+iRand2′)
The values are combined into a message and encrypted using the session key Ks:
rInitResponse=e(Ks,{RC,iRand1′,iRand2′,rInitChecksum})
The value rInitResponse is sent to the mobile device 210.
The mobile device 210 validates initialization response.
When the mobile device 210 receives rInitResponse, it decrypts it using mSK:
{RC′,iRand1″,iRand2″,rInitChecksum′}=d(mSK,rInitResponse)
The mobile device 210 confirms the validity of the message by summing the four decrypted 32-bit values:
rInitCheck=RC′+iRand1″+iRand2″+rInitChecksum′
If all of the conditions below are met, communication is considered secure and future messages can be encrypted or decrypted with Ks=mSK:
rInitCheck==0
RC′==RC
iRand1″==iRand1
iRand2″==iRand2
If any of the above conditions are not met, the mobile device 210 disconnects from the low power device.
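For illustration, the session-key formation and the test-communication check described above can be sketched as follows. The sketch assumes AES in CBC mode via the Python cryptography package with a fixed all-zero IV, and an arbitrary value for the constant IC; the description above only calls for a block cipher and does not fix these choices, so they are assumptions of the example.

```python
import os
import struct
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

MASK32 = 0xFFFFFFFF

def twos_complement(*values):
    return (-sum(values)) & MASK32

def pack_words(*words):
    """Pack four 32-bit values into one 16-byte block."""
    return struct.pack(">4I", *words)

def encrypt_block(key, block, iv=bytes(16)):
    """Single-block AES-CBC encrypt; the fixed IV is an assumption."""
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return enc.update(block) + enc.finalize()

# The candidate session keys are the concatenation of the mobile's and the
# lock's 32-bit random values: mSK on the mobile, lSK on the lock.
mRand1, mRand2 = struct.unpack(">2I", os.urandom(8))
lRand1, lRand2 = struct.unpack(">2I", os.urandom(8))
mSK = pack_words(mRand1, mRand2, lRand1, lRand2)  # 16-byte key
lSK = pack_words(mRand1, mRand2, lRand1, lRand2)  # identical if the exchange succeeded

# Test communication: iInitComm = e(mSK, {IC, iRand1, iRand2, iInitChecksum})
IC = 0x494E4954  # assumed placeholder value for the 32-bit constant IC
iRand1, iRand2 = struct.unpack(">2I", os.urandom(8))
iInitChecksum = twos_complement(IC, iRand1, iRand2)
iInitComm = encrypt_block(mSK, pack_words(IC, iRand1, iRand2, iInitChecksum))

# The lock decrypts iInitComm with lSK and accepts only if the 32-bit sum
# wraps to zero and IC matches; rInitResponse mirrors this with the constant RC.
```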
The low power device can have three states that define its security level:
Not Connected
Connected Not Secure
Connected Secure
The security level defines the behavior of the service characteristics. When advertising, the device is in the Not Connected state. After a connection is established, the peripheral is in the Connected Not Secure state. After the handshake process is complete, the device enters the Connected Secure state.
In one embodiment firmware updates are provided to the intelligent door lock system 10 from the server 510. In one embodiment a dwelling Bluetooth device 21 sends a packet with or without acknowledgment. As a non-limiting example server 510 sends updates to mobile device 210 or Bluetooth to WiFi bridge 11. Mobile device 210 or WiFi bridge 11 then sends the updates to a Bluetooth device 21 at the dwelling, e.g., to lock system 10.
In one embodiment the update is sent as a payload. As a non-limiting example a payload of N bits is sent. As a non-limiting example the payload can be 96 bits. Each packet includes n number of bits. Each bit has a traceable marker. As a non-limiting example the payload is sent without a response. This increases the speed of sending the payload. As a non-limiting example the speed can be a packet sent three times a second.
Even though the payload is sent without a response, in response to receipt of the payload a notification is required from lock system 10. In one embodiment the notification is a bit mask of how many bits of the payload were received.
Typically not all of the bits are received by lock system 10. Notification of the non-received bits is then provided to the server. When there is a 0 in the bit mask, the portion not received is resent. A payload is resent until all of the bits have been received.
In this manner a backfill of non-received bits is processed. This is repeated until all of the original bits in the update have been received.
With subsequent updates a checksum is made on the payload and a checksum is sent back.
In this embodiment intelligent door lock system 10 receives fast updates of firmware where there have been slow connections.
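The backfill procedure above can be sketched as a simple loop. This Python example is illustrative only; send_packet and request_bitmask are hypothetical placeholders for the Bluetooth transport and the lock's bit-mask notification, not actual APIs of the system.

```python
def send_firmware(payload_bits, send_packet, request_bitmask):
    """Send `payload_bits` (a list of 0/1 values) without per-packet
    acknowledgments, then backfill using the lock's bit-mask notification
    until every bit has been received.

    send_packet(index, bit) and request_bitmask() are hypothetical
    placeholders standing in for the Bluetooth transport.
    """
    outstanding = set(range(len(payload_bits)))
    while outstanding:
        for i in sorted(outstanding):
            send_packet(i, payload_bits[i])   # fire-and-forget transmission
        received = request_bitmask()          # e.g., [1, 0, 1, ...] per bit
        outstanding = {i for i in outstanding if not received[i]}
    # Loop exits only when the bit mask reports every bit as received.
```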
In one embodiment the intelligent door lock system 10, with server 510, notifies the dwelling user, resource owner, or end-user, hereafter (dwelling user), of the status of the intelligent door lock system batteries 21. As a non-limiting example intelligent door lock system 10 reports its battery voltages to server 510, and then the server 510 determines battery life, more particularly when the batteries will die and need to be changed. Notification is sent to the dwelling user.
In one embodiment intelligent door lock system 10 reports its voltage to server 510. Server 510 then determines when the one or more batteries 21 are going to die. The user is then notified of the battery 21 status and can be prompted to change the batteries 21.
In one embodiment mobile device 210 or Bluetooth to WiFi bridge 11 records the voltage every time intelligent door lock system 10 locks or unlocks. The recorded voltage is sent to server 510. This is done on a periodic basis. In one embodiment a band pass filter is used in processing the recorded voltage.
In one embodiment server 510 initially runs a linear regression on the data for a period of a number of months, which can be from 1-6 months. Thereafter server 510 runs a cubic regression on the data set of the recorded voltages.
The server 510 pulls from memory the average lifetime of a battery 21. As a non-limiting example, in the beginning, if the recorded voltage is above 5 volts and a certain threshold is reached, a cubic regression predicts immediate death of batteries 21. A comparison is run of the battery 21 status now and when the battery 21 was charged. When the cubic regression ticks back up, linear regression is applied. Linear regression works up to a certain range of battery life. As a non-limiting example linear regression is applied up to 75% of battery life.
In one embodiment server 510 initially runs linear regression on the data set of recorded voltages. Cubic regression is then applied. When the cubic regression diverges, server 510 switches back to linear regression, and this is used to determine the death date.
The data check of battery life voltages can be run any number of times, including periodically. As a non-limiting example the data check of battery voltages is run every day.
In one embodiment, when Bluetooth-WiFi bridge 11 is used, the battery voltage check is done automatically.
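The battery-life estimation above can be illustrated with a regression sketch. The following Python example, using numpy polynomial fits, is a hypothetical approximation of the described approach; the cutoff voltage, the 75% linear-range rule, and the divergence test are assumptions for the example.

```python
import numpy as np

def predict_battery_death(days, volts, cutoff_volts=3.6, linear_limit=0.75):
    """Fit linear and cubic models to logged lock voltages and estimate the
    day the batteries 21 reach `cutoff_volts`.  The cutoff, the 75% rule,
    and the divergence check are assumptions of this sketch."""
    days, volts = np.asarray(days, float), np.asarray(volts, float)
    lin = np.poly1d(np.polyfit(days, volts, 1))
    cub = np.poly1d(np.polyfit(days, volts, 3))

    # Fraction of the usable voltage range already consumed.
    fraction_used = (volts[0] - volts[-1]) / max(volts[0] - cutoff_volts, 1e-9)
    # Treat a large disagreement 30 days out as the cubic "ticking back up".
    diverged = abs(cub(days[-1] + 30) - lin(days[-1] + 30)) > 0.5

    model = lin if (fraction_used <= linear_limit or diverged) else cub
    future = np.arange(days[-1], days[-1] + 3650)
    below = future[model(future) <= cutoff_volts]
    return int(below[0]) if below.size else None  # predicted death day

# Example with synthetic daily readings:
d = np.arange(0, 120)
v = 6.0 - 0.004 * d + np.random.normal(0, 0.01, d.size)
death_day = predict_battery_death(d, v)
```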
In one embodiment the dwelling user is notified when a door 12, window, opening and the like of a dwelling is manually opened. Manually opened includes operation of the keypad, of a key, or of any mechanism which cannot notify server 510 directly. This in effect provides a notification of “I need attention”.
In one embodiment every time the lock 24 is operated a log of that event is put in the lock's memory. One or more bits of this information are received by the wireless bridge 11, which processes all of the logs. This provides a notification of “I need attention”. A push notification is sent to the dwelling user.
In another embodiment server 510 can send a similar push notification to a third party that provides a service to the dwelling user, including but not limited to an alarm company, which then can cause the alarm to not be activated, and a notification is also provided to the dwelling user, which as a non-limiting example can be via mobile device 210.
The wireless bridge 11 processes all of the logs.
In one embodiment, illustrated in the accompanying figures, the security system maintains an x-second video buffer.
In one embodiment the buffer size is selected based on analytics and a determination can be made as to when to start the video. As a non-limiting example motion can be triggered by a PIR sensor, but once motion is triggered, the security system 10(a) analyzes the x-second video buffer for pixel changes. This provides that the beginning of the video is aligned with the first pixel change. In another embodiment, motion speed is evaluated so that the slower the event, the longer the buffer time x is.
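The alignment of the recorded clip with the first pixel change can be sketched as follows. This Python example is illustrative only; the grayscale frame representation and the change threshold are assumptions, not parameters of the described system.

```python
import numpy as np

def trim_buffer_to_first_change(frames, threshold=8.0):
    """Given an x-second pre-roll buffer of grayscale frames (a list of 2-D
    numpy arrays), return the buffer starting at the first frame whose mean
    absolute pixel change from its predecessor exceeds `threshold`.
    The threshold value is an assumption for this sketch."""
    for i in range(1, len(frames)):
        delta = np.mean(np.abs(frames[i].astype(float) -
                               frames[i - 1].astype(float)))
        if delta > threshold:
            return frames[i - 1:]   # start just before the first change
    return frames                   # no change found; keep the whole buffer

# When the PIR sensor fires, the retained clip begins at the first pixel
# change rather than at an arbitrary point in the buffer.
```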
Referring to the accompanying figures, and as set forth above, sensor 16, which can be an accelerometer, can be used to detect rotation of the lock device 22 and the magnetometer 712 can detect the movement of the door 12. The magnetic sensor provides door ajar status, while sensor 16 is used to determine if bolt/lock 24 is extended or retracted.
As non-limiting examples, the magnetometer 712 can be: (i) a vector magnetometer 712 that measures vector components of a magnetic field; (ii) a total field magnetometer 712 or scalar magnetometer 712 to measure a magnitude of the vector magnetic field; and the like.
In one embodiment the magnetometer 712 is an absolute magnetometer 712 that measures an absolute magnitude or vector magnetic field, using an internal calibration or known physical constants of the magnetic sensor. In one embodiment magnetometer 712 is a relative magnetometer 712 that measures magnitude or vector magnetic field relative to a fixed but un-calibrated baseline, and can be used to measure variations in magnetic field.
In one embodiment, illustrated in the accompanying figures, the magnetometer 712 is used to determine if the door 12 is actually closed or ajar. This information, as well as information provided by sensor 16, is used to determine whether or not door 12 is secured.
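As a simple illustration of combining the two sensors, the following Python sketch derives a door status from a magnetometer reading and a bolt rotation angle. The numeric thresholds and the status labels are assumptions for the example, not values specified by the system.

```python
def door_secured(magnetometer_magnitude_uT, bolt_angle_deg,
                 closed_threshold_uT=60.0, extended_angle_deg=80.0):
    """Combine the two sensors described above: the magnetometer 712 reading
    indicates whether door 12 is closed (field is strong near the door-frame
    magnet) and the accelerometer-derived bolt rotation indicates whether
    bolt/lock 24 is extended.  Thresholds are assumptions of the sketch."""
    door_closed = magnetometer_magnitude_uT >= closed_threshold_uT
    bolt_extended = bolt_angle_deg >= extended_angle_deg
    if not door_closed:
        return "ajar"
    return "secured" if bolt_extended else "closed_but_unlocked"

# Example: door_secured(75.0, 90.0) evaluates to "secured".
```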
As a non-limiting example the magnetometer 712 measures magnetism, including but not limited to magnetization of a magnetic material, strength of a magnetic field, or a direction of the magnetic field at a point in space. A magnetic material 716 can be existing in the door 12; be at the door frame; can be added, which as a non-limiting example can be a discrete magnet 716 coupled to the door frame; be a magnetic strike plate; and the like, as illustrated in the accompanying figures.
In one embodiment the magnetometer 712 communicates wirelessly, which as a non-limiting example can be RF and the like. In one embodiment magnetometer 712 receives power from power source, battery 50.
The magnetometer 712 can have performance characteristics and capabilities that include all or a portion of the following:
Noise—the random fluctuations generated by the magnetometer 712 sensor or electronics, which as a non-limiting example can be given in units of nT/√Hz, where the frequency component refers to the bandwidth.
Sensitivity—the larger of the noise or the resolution.
Heading error—the change in the measurement due to a change in orientation of the instrument in a constant magnetic field.
A dead zone—the angular region of magnetometer 712 orientation in which the instrument produces poor or no measurements.
Gradient tolerance—the ability of a magnetometer 712 to obtain a reliable measurement in the presence of a magnetic field gradient.
In one embodiment, illustrated in the accompanying figures, a wide field camera 810 is provided, which as a non-limiting example can be coupled to door 12.
Additionally the wide field camera 810 can be a doorbell, be included as part of a doorbell, and also allow a resident to see and speak with a third party viewed by the wide field camera 810. In one embodiment the wide field camera 810 is included with a keypad, can be a keypad, and the like, allowing a person to enter a code. In one embodiment the wide field camera 810 also provides for detecting if the door has been closed and can serve as a door sensor, e.g., to determine open or closed door status. As a non-limiting example, the open/close status of the door can be used to operate the lock, or trigger alarms.
In one embodiment the wide field camera 810 captures a follow-through video of a third party, who may or may not have been given access to the dwelling by the dwelling occupant, by using the wide field camera 810, which can be coupled to door 12 and moves with door 12 as door 12 moves. Because the camera moves with the door, this allows wide angle camera 810 to view and capture activities and third party movement inside and outside of the dwelling, all with a single camera system.
Referring to the accompanying figures, in one embodiment a safe zone is delimited by one or more virtual fences 814, 816 and the like, which the occupant of the dwelling can define, and in the event the one or more virtual fences 814 and 816 are broken, different alerts can be activated, including but not limited to a local alarm, notification to the occupant's mobile device, a vocal warning, a communication with authorities, and the like.
In one embodiment the one or more virtual fences are multi-layer fences that can define multiple concentric zones. This allows for tracking of the occupant, a non-occupant that is granted access to the dwelling by the occupant, including but not limited to a delivery person, housekeeper and the like. In one embodiment, the order in which concentric fences are broken allows tracking of a movement as either towards door 12, or away from door 12. As a non-limiting example breaking the first fence 814 can result in a vocal warning, while breaking second fence 816 can result in an alarm being activated, and breaking a third fence can result in a call to authorities.
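The escalation by fence order can be sketched as follows. This Python example is hypothetical; it assumes higher fence numbers are closer to door 12 and uses an assumed response table, so both the numbering and the mapping are illustrative only.

```python
ESCALATION = {1: "vocal warning", 2: "local alarm", 3: "notify authorities"}

def respond_to_breaches(breached_fences_in_order):
    """Map the order in which concentric virtual fences are broken to an
    escalating response, and infer the direction of travel from that order.
    Fence numbering and the response table are assumptions of this sketch."""
    actions = [ESCALATION.get(f, "log event") for f in breached_fences_in_order]
    if len(breached_fences_in_order) >= 2:
        direction = ("toward door" if breached_fences_in_order[-1] >
                     breached_fences_in_order[0] else "away from door")
    else:
        direction = "unknown"
    return actions, direction

# Breaking fence 1 and then fence 2 while approaching:
# respond_to_breaches([1, 2]) -> (["vocal warning", "local alarm"], "toward door")
```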
Further details of these embodiments are illustrated in the accompanying figures.
Referring to the accompanying figures, as a non-limiting example, information is used from the open or closed status of the door 12. In one embodiment camera 10(c) begins recording as soon as the door 12 is open and can continue recording until the door 12 is closed. In one embodiment the camera 10(c) continues recording until the door is closed and locked. In one embodiment camera 10(c) is a doorbell camera.
In one embodiment the lock communicates information to the camera 10(c) that includes multiple status data. As a non-limiting example the multiple status data can include at least one of: whether the door 12 is open or closed, whether the lock device 22 is locked or unlocked, and the like.
In one embodiment the camera 10(c) uses input from any of the sensors at the dwelling, or included in the intelligent door lock system 10, to decide to begin recording or not.
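As an illustrative sketch of a recording decision driven by door and lock status, the following Python fragment encodes the policy described above. The function and its inputs are assumptions for the example; the actual policy in the described system is configurable.

```python
def should_record(door_open, locked, recording_in_progress):
    """Decide whether camera 10(c) should be recording, following the
    embodiment above: start when door 12 opens, continue until it is
    closed (and, in one variant, locked).  Sketch only."""
    if door_open:
        return True                 # begin or continue recording while open
    if recording_in_progress and not locked:
        return True                 # closed but not yet locked: keep recording
    return False                    # closed and locked: stop recording

# Example: door just opened -> start; door closed and locked -> stop.
assert should_record(True, False, False) is True
assert should_record(False, True, True) is False
```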
In one embodiment the intelligent door lock system 10 is retrofitted to an existing lock system.
In one embodiment a plurality of cameras, which can be camera 10(c), camera 812, a doorbell camera and the like, along with a dwelling occupant's mobile device 210, communicate with each other, as well as communicate with lock device 22, which as a non-limiting example can be via Bluetooth, as well as via WiFi to the server in a cloud lock access services infrastructure.
In one embodiment multiple cameras 10(c) can be set along an expected path of the dwelling. As a non-limiting example a third party, such as a delivery person or other third party, can be recorded, observed, and watched along a desired path as the third party approaches the dwelling, is in the interior of the dwelling, and exits the dwelling.
The foregoing description of various embodiments of the claimed subject matter has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the claimed subject matter to the precise forms disclosed. Many modifications and variations will be apparent to the practitioner skilled in the art. Particularly, while the concept “component” is used in the embodiments of the systems and methods described above, it will be evident that such concept can be interchangeably used with equivalent concepts such as, class, method, type, interface, module, object model, and other suitable concepts. Embodiments were chosen and described in order to best describe the principles of the invention and its practical application, thereby enabling others skilled in the relevant art to understand the claimed subject matter, the various embodiments and with various modifications that are suited to the particular use contemplated.
This application is a continuation of U.S. patent application Ser. No. 15/798,425, filed Oct. 31, 2017, which is a continuation-in-part of Ser. No. 15/497,383, filed Apr. 26, 2017, which (1) claims the benefit of and priority to U.S. Provisional Patent Application No. 62/481,797, filed Apr. 5, 2017, and (2) is a continuation-in-part of U.S. patent application Ser. No. 15/497,327, filed Apr. 26, 2017, which is a continuation-in-part of U.S. patent application Ser. No. 15/463,022, filed Mar. 20, 2017, which (1) is a continuation-in-part of U.S. patent application Ser. No. 15/410,845, filed Jan. 20, 2017, which is a continuation of U.S. patent application Ser. No. 15/066,210, filed Mar. 10, 2016, which is (a) a continuation of U.S. patent application Ser. No. 14/205,608, filed Mar. 12, 2014, which claims the benefit of and priority to (i) U.S. Provisional Patent Application No. 61/800,937, filed Mar. 15, 2013, (ii) U.S. Provisional Patent Application No. 61/801,335, filed Mar. 15, 2013, and (iii) U.S. Provisional Patent Application No. 61/801,294, filed Mar. 15, 2013, (b) a continuation of U.S. patent application Ser. No. 14/205,973, filed Mar. 12, 2014, (c) a continuation of U.S. patent application Ser. No. 14/206,536, filed Mar. 12, 2014, (d) a continuation of U.S. patent application Ser. No. 14/206,619, filed Mar. 12, 2014, (e) a continuation of U.S. patent application Ser. No. 14/207,833, filed Mar. 13, 2014, (f) a continuation of U.S. patent application Ser. No. 14/207,882, filed March 13, 2014, (g) a continuation of U.S. patent application Ser. No. 14/208,947, filed Mar. 13, 2014, (h) a continuation of U.S. patent application Ser. No. 14/208,182, filed Mar. 13, 2014, (i) a continuation of U.S. patent application Ser. No. 14/212,569, filed Mar. 14, 2014, (j) a continuation-in-part of U.S. patent application Ser. No. 14/321,260, filed Jul. 1, 2014, (k) a continuation-in-part of U.S. patent application Ser. No. 14/321,000, filed Jul. 1, 2014, (l) a continuation-in-part of U.S. patent application Ser. No. 14/465,513, filed Aug. 21, 2014, (m) a continuation-in-part of U.S. patent application Ser. No. 14/465,527, filed Aug. 21, 2014, (n) a continuation-in-part of U.S. patent application Ser. No. 14/469,186, filed Aug. 26, 2014, (o) a continuation-in-part of U.S. patent application Ser. No. 14/471,470, filed Aug. 28, 2014, (p) a continuation-in-part of U.S. patent application Ser. No. 14/622,054, filed Feb. 13, 2015, (q) a continuation-in-part of U.S. patent application Ser. No. 14/622,192, filed Feb. 13, 2015, (r) a continuation-in-part of U.S. patent application Ser. No. 14/622,578, filed Feb. 13, 2015, (s) a continuation-in-part of U.S. patent application Ser. No. 14/622,654, filed Feb. 13, 2015, (t) a continuation of U.S. patent application Ser. No. 14/730,848, filed Jun. 4, 2015, (u) a continuation-in-part of U.S. patent application Ser. No. 14/732,290, filed Jun. 5, 2015, (v) a continuation-in-part of U.S. patent application Ser. No. 14/796,994, filed Jul. 10, 2015, (w) a continuation-in-part of U.S. patent application Ser. No. 15/066,091, filed Mar. 10, 2016, (x) a continuation-in-part of U.S. patent application Ser. No. 14/459,054, filed Aug. 13, 2014, (y) a continuation-in-part of U.S. patent application Ser. No. 14/461,177, filed Aug. 15, 2014, and (z) a continuation of U.S. patent application Ser. No. 15/065,657, filed Mar. 9, 2016, which is (i) a continuation of U.S. patent application Ser. No. 14/205,608, filed Mar. 12, 2014, (ii) a continuation of U.S. patent application Ser. No. 
14/205,783, filed Mar. 12, 2014, (iii) a continuation of U.S. patent application Ser. No. 14/205,973, filed Mar. 12, 2014, (iv) a continuation of U.S. patent application Ser. No. 14/206,536, filed Mar. 12, 2014, (v) a continuation of U.S. patent application Ser. No. 14/206,619, filed Mar. 12, 2014, (vi) a continuation of U.S. patent application Ser. No. 14/207,833, filed Mar. 13, 2014, (vii) a continuation of U.S. patent application Ser. No. 14/207,882, filed Mar. 13, 2014, (viii) a continuation of U.S. patent application Ser. No. 14/208,947, filed Mar. 13, 2014, (ix) a continuation of U.S. patent application Ser. No. 14/208,182, filed Mar. 13, 2014, (x) a continuation of U.S. patent application Ser. No. 14/212,569, filed Mar. 14, 2014, (xi) a continuation-in-part of U.S. patent application Ser. No. 14/321,260, filed Jul. 1, 2014, (xii) a continuation-in-part of U.S. patent application Ser. No. 14/321,000, filed Jul. 1, 2014, (xiii) a continuation-in-part of U.S. patent application Ser. No. 14/465,513, filed Aug. 21, 2014, (xiv) a continuation-in-part of U.S. patent application Ser. No. 14/465,527, filed Aug. 21, 2014, (xv) a continuation-in-part of U.S. patent application Ser. No. 14/469,127, filed Aug. 26, 2014, (xvi) a continuation-in-part of U.S. patent application Ser. No. 14/469,186, filed Aug. 26, 2014, (xvii) a continuation-in-part of U.S. patent application Ser. No. 14/471,414, filed Aug. 28, 2014, (xviii) a continuation-in-part of U.S. patent application Ser. No. 14/471,470, filed Aug. 28, 2014, (xix) a continuation-in-part of U.S. patent application Ser. No. 14/622,054, filed Feb. 13, 2015, (xx) a continuation-in-part of U.S. patent application Ser. No. 14/622,192, filed Feb. 13, 2015, (xxi) a continuation-in-part of U.S. patent application Ser. No. 14/622,578, filed Feb. 13, 2015, (xxii) a continuation-in-part of U.S. patent application Ser. No. 14/622,396, filed Feb. 13, 2015, (xxiii) a continuation-in-part of U.S. patent application Ser. No. 14/622,654, filed Feb. 13, 2015, (xxiv) a continuation-in-part of U.S. patent application Ser. No. 14/730,848, filed Jun. 4, 2015, (xxv) a continuation-in-part of U.S. patent application Ser. No. 14/731,092, filed Jun. 4, 2015, (xxvi) a continuation-in-part of U.S. patent application Ser. No. 14/732,290, filed Jun. 5, 2015, and (xxvii) a continuation-in-part of U.S. patent application Ser. No. 14/796,994, filed Jul. 10, 2015, and (2) claims the benefit and priority to (a) U.S. Provisional Patent Application No. 62/036,979, filed Aug. 13, 2014, (b) U.S. Provisional Patent Application No. 62/036,989, filed Aug. 13, 2014, (c) U.S. Provisional Patent Application No. 62/036,991, filed Aug. 13, 2014, and (d) U.S. Provisional Patent Application No. 62/036,993, filed Aug. 13, 2014.
Number | Name | Date | Kind |
---|---|---|---|
2680177 | Rosenthal | Jun 1954 | A |
3898976 | Coffman, Jr. | Aug 1975 | A |
4849749 | Fukamachi et al. | Jul 1989 | A |
5245329 | Gokcebay | Sep 1993 | A |
5306407 | Hauzer et al. | Apr 1994 | A |
5407035 | Cole et al. | Apr 1995 | A |
5594430 | Cutter et al. | Jan 1997 | A |
5695048 | Tseng | Dec 1997 | A |
5712626 | Andreou et al. | Jan 1998 | A |
5717379 | Peters | Feb 1998 | A |
5774058 | Henry et al. | Jun 1998 | A |
5903225 | Schmitt et al. | May 1999 | A |
5933086 | Tischendorf et al. | Aug 1999 | A |
5979199 | Elpern et al. | Nov 1999 | A |
6032500 | Collard, Jr. et al. | Mar 2000 | A |
6089058 | Elpern et al. | Jul 2000 | A |
6196936 | Meckel | Mar 2001 | B1 |
6215781 | Kato et al. | Apr 2001 | B1 |
6282931 | Padiak et al. | Sep 2001 | B1 |
6323846 | Westerman et al. | Nov 2001 | B1 |
6334636 | Huang et al. | Jan 2002 | B1 |
6360573 | Ming-Chih | Mar 2002 | B1 |
6407520 | Kleefeldt et al. | Jun 2002 | B1 |
6418764 | Lerchner | Jul 2002 | B1 |
6422457 | Frich et al. | Jul 2002 | B1 |
6429893 | Xin | Aug 2002 | B1 |
6568726 | Caspi et al. | May 2003 | B1 |
6570557 | Westerman et al. | May 2003 | B1 |
6580871 | Proidl | Jun 2003 | B1 |
6612415 | Yamane | Sep 2003 | B2 |
6624739 | Stobbe | Sep 2003 | B1 |
6677932 | Westerman | Jan 2004 | B1 |
6729169 | Moore | May 2004 | B2 |
6891479 | Eccleston | May 2005 | B1 |
6910301 | Kalempa et al. | Jun 2005 | B2 |
6961763 | Wang et al. | Nov 2005 | B1 |
6967562 | Menard et al. | Nov 2005 | B2 |
6972660 | Montgomery, Jr. et al. | Dec 2005 | B1 |
7127083 | Han et al. | Oct 2006 | B2 |
7248836 | Taylor | Jul 2007 | B2 |
7252311 | Pratt et al. | Aug 2007 | B2 |
7351910 | Magisano et al. | Apr 2008 | B1 |
7373795 | Kilbourne | May 2008 | B2 |
7420456 | Fisher | Sep 2008 | B2 |
7439850 | Boulard et al. | Oct 2008 | B2 |
7520152 | Sabo et al. | Apr 2009 | B2 |
7614008 | Ording | Nov 2009 | B2 |
7619529 | Goff et al. | Nov 2009 | B2 |
7633076 | Huppi et al. | Dec 2009 | B2 |
7643056 | Silsby | Jan 2010 | B2 |
7653883 | Hotelling et al. | Jan 2010 | B2 |
7657849 | Chaudhri et al. | Feb 2010 | B2 |
7663607 | Hotelling et al. | Feb 2010 | B2 |
7734249 | Mitchell, Jr. et al. | Jun 2010 | B1 |
7810852 | Alacqua et al. | Oct 2010 | B2 |
7844914 | Andre et al. | Nov 2010 | B2 |
7891222 | Ratkus et al. | Feb 2011 | B2 |
8006002 | Kalayjian et al. | Aug 2011 | B2 |
8019353 | Smithey et al. | Sep 2011 | B1 |
8024186 | De Bonet | Sep 2011 | B1 |
8035478 | Lee | Oct 2011 | B2 |
8122645 | Theile et al. | Feb 2012 | B2 |
8239784 | Hotelling et al. | Aug 2012 | B2 |
8269627 | Gore et al. | Sep 2012 | B2 |
8279180 | Hotelling et al. | Oct 2012 | B2 |
8310365 | Siegler, II et al. | Nov 2012 | B2 |
8325039 | Picard et al. | Dec 2012 | B2 |
8347720 | De Los Santos et al. | Jan 2013 | B2 |
8351789 | Wagener et al. | Jan 2013 | B2 |
8405387 | Novak et al. | Mar 2013 | B2 |
8445779 | Gretz | May 2013 | B1 |
8476577 | Nagahama et al. | Jul 2013 | B2 |
8479122 | Hotelling et al. | Jul 2013 | B2 |
8498572 | Schooley et al. | Jul 2013 | B1 |
8522596 | Avery | Sep 2013 | B2 |
8525102 | Augustyniak et al. | Sep 2013 | B2 |
8533144 | Reeser et al. | Sep 2013 | B1 |
8542189 | Milne et al. | Sep 2013 | B2 |
8544326 | Je | Oct 2013 | B2 |
8586902 | Verfuerth | Nov 2013 | B2 |
8600430 | Herz et al. | Dec 2013 | B2 |
8653982 | Yulkowski et al. | Feb 2014 | B2 |
8671723 | Dayanikli et al. | Mar 2014 | B2 |
8826708 | Lopes | Sep 2014 | B2 |
8864049 | Nolte et al. | Oct 2014 | B2 |
8872915 | Scalisi et al. | Oct 2014 | B1 |
8896416 | Lundy et al. | Nov 2014 | B1 |
8918208 | Hickman et al. | Dec 2014 | B1 |
9000916 | Meeker et al. | Apr 2015 | B2 |
9024759 | Uyeda et al. | May 2015 | B2 |
9049352 | Scalisi et al. | Jun 2015 | B2 |
9057210 | Dumas et al. | Jun 2015 | B2 |
9113051 | Scalisi | Aug 2015 | B1 |
9187929 | Webb et al. | Nov 2015 | B2 |
9222282 | Russo et al. | Dec 2015 | B2 |
9251679 | Wandel et al. | Feb 2016 | B2 |
9322194 | Cheng et al. | Apr 2016 | B2 |
9322201 | Cheng et al. | Apr 2016 | B1 |
9326094 | Johnson et al. | Apr 2016 | B2 |
9359794 | Cheng | Jun 2016 | B2 |
9382739 | Johnson et al. | Jul 2016 | B1 |
9396598 | Daniel-Wayman et al. | Jul 2016 | B2 |
9447609 | Johnson et al. | Sep 2016 | B2 |
9454893 | Warren et al. | Sep 2016 | B1 |
9470017 | Cheng et al. | Oct 2016 | B1 |
9470018 | Cheng et al. | Oct 2016 | B1 |
9514585 | Ahearn et al. | Dec 2016 | B2 |
9528294 | Johnson et al. | Dec 2016 | B2 |
9528296 | Cheng et al. | Dec 2016 | B1 |
9530262 | Johnson | Dec 2016 | B2 |
9530295 | Johnson | Dec 2016 | B2 |
9534420 | Cheng et al. | Jan 2017 | B1 |
9574372 | Johnson et al. | Feb 2017 | B2 |
9613476 | Johnson | Apr 2017 | B2 |
9624695 | Cheng et al. | Apr 2017 | B1 |
9640053 | Siann et al. | May 2017 | B2 |
9644398 | Cheng et al. | May 2017 | B1 |
9644399 | Johnson et al. | May 2017 | B2 |
9644400 | Cheng et al. | May 2017 | B1 |
9647996 | Johnson et al. | May 2017 | B2 |
9652917 | Johnson et al. | May 2017 | B2 |
9683391 | Johnson et al. | Jun 2017 | B2 |
9683392 | Cheng et al. | Jun 2017 | B1 |
9685015 | Johnson | Jun 2017 | B2 |
9685017 | Johnson | Jun 2017 | B2 |
9685018 | Johnson | Jun 2017 | B2 |
9691198 | Cheng et al. | Jun 2017 | B2 |
9695616 | Johnson et al. | Jul 2017 | B2 |
9704314 | Johnson et al. | Jul 2017 | B2 |
9704320 | Johnson et al. | Jul 2017 | B2 |
9706365 | Johnson et al. | Jul 2017 | B2 |
9725927 | Cheng | Aug 2017 | B1 |
9727328 | Johnson | Aug 2017 | B2 |
9728023 | Johnson | Aug 2017 | B2 |
9761073 | Cheng et al. | Sep 2017 | B2 |
9761074 | Cheng et al. | Sep 2017 | B2 |
9767632 | Johnson | Sep 2017 | B2 |
9769435 | Scalisi et al. | Sep 2017 | B2 |
9818247 | Johnson | Nov 2017 | B2 |
9916746 | Johnson et al. | Mar 2018 | B2 |
9922481 | Johnson et al. | Mar 2018 | B2 |
10017963 | Johnson et al. | Jul 2018 | B2 |
10140828 | Johnson et al. | Nov 2018 | B2 |
10181232 | Cheng et al. | Jan 2019 | B2 |
10198884 | Johnson | Feb 2019 | B2 |
11417182 | Shamai | Aug 2022 | B2 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020099945 | McLintock et al. | Jul 2002 | A1 |
20020117868 | Bates et al. | Aug 2002 | A1 |
20020138767 | Hamid et al. | Sep 2002 | A1 |
20020196771 | Vij et al. | Dec 2002 | A1 |
20030160681 | Menard et al. | Aug 2003 | A1 |
20030167693 | Mainini | Sep 2003 | A1 |
20040003257 | Mitchell | Jan 2004 | A1 |
20040012352 | Kachouh et al. | Jan 2004 | A1 |
20040075532 | Ueda et al. | Apr 2004 | A1 |
20040212678 | Cooper et al. | Oct 2004 | A1 |
20040215910 | Okaue et al. | Oct 2004 | A1 |
20040236918 | Okaue et al. | Nov 2004 | A1 |
20040237609 | Hosselet | Dec 2004 | A1 |
20040243779 | Okaue et al. | Dec 2004 | A1 |
20050007451 | Chiang | Jan 2005 | A1 |
20050029345 | Waterhouse et al. | Feb 2005 | A1 |
20050088145 | Loch | Apr 2005 | A1 |
20050179517 | Harms et al. | Aug 2005 | A1 |
20050212750 | Marvit et al. | Sep 2005 | A1 |
20050212752 | Marvit et al. | Sep 2005 | A1 |
20050248444 | Joao | Nov 2005 | A1 |
20050252739 | Callahan et al. | Nov 2005 | A1 |
20050286466 | Tagg et al. | Dec 2005 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060033724 | Chaudhri et al. | Feb 2006 | A1 |
20060158144 | Theile et al. | Jul 2006 | A1 |
20060164208 | Schaffzin et al. | Jul 2006 | A1 |
20060193262 | McSheffrey et al. | Aug 2006 | A1 |
20060197753 | Hotelling | Sep 2006 | A1 |
20060267409 | Mullet et al. | Nov 2006 | A1 |
20060283219 | Bendz et al. | Dec 2006 | A1 |
20070056338 | Sabo et al. | Mar 2007 | A1 |
20070090843 | De Doncker et al. | Apr 2007 | A1 |
20070132849 | Hill et al. | Jun 2007 | A1 |
20070150842 | Chaudhri et al. | Jun 2007 | A1 |
20070188307 | Lai et al. | Aug 2007 | A1 |
20070229350 | Scalisi et al. | Oct 2007 | A1 |
20070246396 | Brollier | Oct 2007 | A1 |
20080011032 | Groff | Jan 2008 | A1 |
20080047200 | Krause et al. | Feb 2008 | A1 |
20080055241 | Goldenberg et al. | Mar 2008 | A1 |
20080101654 | Sano | May 2008 | A1 |
20080125965 | Carani et al. | May 2008 | A1 |
20080129498 | Howarter et al. | Jun 2008 | A1 |
20080211775 | Hotelling et al. | Sep 2008 | A1 |
20080223093 | Amir | Sep 2008 | A1 |
20080236214 | Han | Oct 2008 | A1 |
20080238669 | Linford | Oct 2008 | A1 |
20080297602 | Chang | Dec 2008 | A1 |
20080309624 | Hotelling | Dec 2008 | A1 |
20090006736 | Pantos et al. | Jan 2009 | A1 |
20090029672 | Manz | Jan 2009 | A1 |
20090066320 | Posey | Mar 2009 | A1 |
20090085878 | Heubel et al. | Apr 2009 | A1 |
20090128329 | Sato et al. | May 2009 | A1 |
20090180933 | Kauling et al. | Jul 2009 | A1 |
20090217596 | Neundorf et al. | Sep 2009 | A1 |
20090250552 | Kearns et al. | Oct 2009 | A1 |
20090256676 | Piccirillo et al. | Oct 2009 | A1 |
20090267732 | Chauvin et al. | Oct 2009 | A1 |
20090273438 | Sultan et al. | Nov 2009 | A1 |
20100000750 | Andel | Jan 2010 | A1 |
20100070281 | Conkie et al. | Mar 2010 | A1 |
20100089109 | Bliding et al. | Apr 2010 | A1 |
20100127517 | Bliding et al. | May 2010 | A1 |
20100141381 | Bliding et al. | Jun 2010 | A1 |
20100141762 | Siann et al. | Jun 2010 | A1 |
20100145164 | Howell | Jun 2010 | A1 |
20100156809 | Nutaro et al. | Jun 2010 | A1 |
20100283579 | Kraus et al. | Nov 2010 | A1 |
20100306549 | Ullmann | Dec 2010 | A1 |
20100325194 | Williamson et al. | Dec 2010 | A1 |
20110148631 | Lanham et al. | Jan 2011 | A1 |
20110056253 | Greiner et al. | Mar 2011 | A1 |
20110082634 | Povirk et al. | Apr 2011 | A1 |
20110100076 | Weinstein | May 2011 | A1 |
20110109678 | Schwartz et al. | May 2011 | A1 |
20110148575 | Sobecki et al. | Jun 2011 | A1 |
20110154740 | Matsumoto et al. | Jun 2011 | A1 |
20110185554 | Huang et al. | Aug 2011 | A1 |
20110215597 | Weum | Sep 2011 | A1 |
20110265528 | Saari | Nov 2011 | A1 |
20110276207 | Falkenstein | Nov 2011 | A1 |
20110277520 | Nunuparov | Nov 2011 | A1 |
20110285501 | Chen | Nov 2011 | A1 |
20120008526 | Borghei | Jan 2012 | A1 |
20120011905 | Gui | Jan 2012 | A1 |
20120068817 | Fisher | Mar 2012 | A1 |
20120073482 | Meeker et al. | Mar 2012 | A1 |
20120092502 | Knasel et al. | Apr 2012 | A1 |
20120199374 | Herth | Aug 2012 | A1 |
20120257615 | Eskildsen et al. | Oct 2012 | A1 |
20120280783 | Gerhardt et al. | Nov 2012 | A1 |
20120280789 | Gerhardt et al. | Nov 2012 | A1 |
20120280790 | Gerhardt | Nov 2012 | A1 |
20120306655 | Tan et al. | Dec 2012 | A1 |
20120319827 | Pance et al. | Dec 2012 | A1 |
20130010120 | Nnoruka | Jan 2013 | A1 |
20130023278 | Chin | Jan 2013 | A1 |
20130038550 | Chien et al. | Feb 2013 | A1 |
20130050106 | Chung et al. | Feb 2013 | A1 |
20130062892 | Chow et al. | Mar 2013 | A1 |
20130063138 | Takahashi et al. | Mar 2013 | A1 |
20130064378 | Chuang | Mar 2013 | A1 |
20130067969 | Webb et al. | Mar 2013 | A1 |
20130076048 | Aerts et al. | Mar 2013 | A1 |
20130126666 | Brown | May 2013 | A1 |
20130138826 | Ling et al. | May 2013 | A1 |
20130154823 | Ostrer et al. | Jun 2013 | A1 |
20130166202 | Bandyopadhyay et al. | Jun 2013 | A1 |
20130176107 | Dumas et al. | Jul 2013 | A1 |
20130178233 | McCoy et al. | Jul 2013 | A1 |
20130192318 | Yanar et al. | Aug 2013 | A1 |
20130207773 | Hathaway et al. | Aug 2013 | A1 |
20130229274 | Kumar et al. | Sep 2013 | A1 |
20130237193 | Dumas et al. | Sep 2013 | A1 |
20130271261 | Ribas et al. | Oct 2013 | A1 |
20130307670 | Ramaci | Nov 2013 | A1 |
20140020295 | Bonahoom et al. | Jan 2014 | A1 |
20140021725 | Baty et al. | Jan 2014 | A1 |
20140028443 | Ebner | Jan 2014 | A1 |
20140033773 | Myers et al. | Feb 2014 | A1 |
20140039366 | Joseph | Feb 2014 | A1 |
20140047878 | Zheng et al. | Feb 2014 | A1 |
20140049366 | Vasquez | Feb 2014 | A1 |
20140049369 | Ahearn et al. | Feb 2014 | A1 |
20140051355 | Ahearn et al. | Feb 2014 | A1 |
20140051425 | Ahearn et al. | Feb 2014 | A1 |
20140052783 | Swatsky et al. | Feb 2014 | A1 |
20140062466 | Thibault et al. | Mar 2014 | A1 |
20140067452 | Anderson et al. | Mar 2014 | A1 |
20140125599 | Seeley | May 2014 | A1 |
20140145666 | Swanson | May 2014 | A1 |
20140159865 | Eto et al. | Jun 2014 | A1 |
20140165673 | Tyner et al. | Jun 2014 | A1 |
20140189758 | Kozlowski | Jul 2014 | A1 |
20140202220 | Denison et al. | Jul 2014 | A1 |
20140218173 | Long et al. | Aug 2014 | A1 |
20140239647 | Jadallah et al. | Aug 2014 | A1 |
20140265359 | Cheng et al. | Sep 2014 | A1 |
20140267736 | DeLean | Sep 2014 | A1 |
20140267740 | Almomani et al. | Sep 2014 | A1 |
20140292481 | Dumas et al. | Oct 2014 | A1 |
20140324590 | Kong et al. | Oct 2014 | A1 |
20140340196 | Myers et al. | Nov 2014 | A1 |
20140354820 | Danialian et al. | Dec 2014 | A1 |
20140365773 | Gerhardt et al. | Dec 2014 | A1 |
20140375422 | Huber et al. | Dec 2014 | A1 |
20150008685 | Beck | Jan 2015 | A1 |
20150015513 | Kwak et al. | Jan 2015 | A1 |
20150022466 | Levesque | Jan 2015 | A1 |
20150027178 | Scalisi | Jan 2015 | A1 |
20150049189 | Yau et al. | Feb 2015 | A1 |
20150049191 | Scalisi et al. | Feb 2015 | A1 |
20150065167 | Scalisi | Mar 2015 | A1 |
20150096341 | Oovergaard | Apr 2015 | A1 |
20150102609 | Johnson et al. | Apr 2015 | A1 |
20150102610 | Johnson et al. | Apr 2015 | A1 |
20150102927 | Johnson et al. | Apr 2015 | A1 |
20150109104 | Fadell et al. | Apr 2015 | A1 |
20150116075 | Cregg et al. | Apr 2015 | A1 |
20150116080 | Cregg et al. | Apr 2015 | A1 |
20150116082 | Cregg et al. | Apr 2015 | A1 |
20150116490 | Scalisi | Apr 2015 | A1 |
20150128667 | Yoon et al. | May 2015 | A1 |
20150130587 | Lydecker | May 2015 | A1 |
20150145796 | Lee | May 2015 | A1 |
20150156031 | Fadell | Jun 2015 | A1 |
20150160770 | Stewart et al. | Jun 2015 | A1 |
20150170448 | Robfogel et al. | Jun 2015 | A1 |
20150185311 | Lohier | Jul 2015 | A1 |
20150194000 | Schoenfelder et al. | Jul 2015 | A1 |
20150199860 | Hong et al. | Jul 2015 | A1 |
20150211259 | Dumas et al. | Jul 2015 | A1 |
20150213658 | Dumas et al. | Jul 2015 | A1 |
20150213663 | Dumas et al. | Jul 2015 | A1 |
20150216326 | Artwohl et al. | Aug 2015 | A1 |
20150218850 | Uyeda et al. | Aug 2015 | A1 |
20150218857 | Hamada | Aug 2015 | A1 |
20150222517 | McLaughlin et al. | Aug 2015 | A1 |
20150227201 | Nakao | Aug 2015 | A1 |
20150227227 | Myers et al. | Aug 2015 | A1 |
20150228167 | Scalisi et al. | Aug 2015 | A1 |
20150233153 | Smart et al. | Aug 2015 | A1 |
20150233154 | Smart et al. | Aug 2015 | A1 |
20150240521 | Vaknin et al. | Aug 2015 | A1 |
20150240531 | Blust et al. | Aug 2015 | A1 |
20150241974 | Takeda | Aug 2015 | A1 |
20150242007 | Iwaizumi et al. | Aug 2015 | A1 |
20150242036 | Heidari | Aug 2015 | A1 |
20150242038 | Steiner et al. | Aug 2015 | A1 |
20150242045 | Choi et al. | Aug 2015 | A1 |
20150242047 | Zafiris | Aug 2015 | A1 |
20150242074 | Iwamoto | Aug 2015 | A1 |
20150242113 | Nguyen Thien et al. | Aug 2015 | A1 |
20150242115 | Gao et al. | Aug 2015 | A1 |
20150242696 | Kim et al. | Aug 2015 | A1 |
20150259949 | Cheng | Sep 2015 | A1 |
20150287254 | Ribas et al. | Oct 2015 | A1 |
20150300048 | Yen et al. | Oct 2015 | A1 |
20150302738 | Geerlings et al. | Oct 2015 | A1 |
20150308157 | Lin et al. | Oct 2015 | A1 |
20150310712 | Lyman | Oct 2015 | A1 |
20150348399 | Cree et al. | Dec 2015 | A1 |
20150356345 | Velozo et al. | Dec 2015 | A1 |
20150363989 | Scalisi | Dec 2015 | A1 |
20160024831 | Houser et al. | Jan 2016 | A1 |
20160032621 | Johnson et al. | Feb 2016 | A1 |
20160036594 | Conrad et al. | Feb 2016 | A1 |
20160037306 | Johnson et al. | Feb 2016 | A1 |
20160042581 | Kumar et al. | Feb 2016 | A1 |
20160044287 | Scalisi | Feb 2016 | A1 |
20160047145 | Johnson et al. | Feb 2016 | A1 |
20160048370 | Zenoff | Feb 2016 | A1 |
20160049024 | Johnson et al. | Feb 2016 | A1 |
20160049025 | Johnson | Feb 2016 | A1 |
20160049026 | Johnson | Feb 2016 | A1 |
20160050515 | Johnson | Feb 2016 | A1 |
20160055694 | Saeedi et al. | Feb 2016 | A1 |
20160055695 | Saeedi et al. | Feb 2016 | A1 |
20160092954 | Bassett et al. | Mar 2016 | A1 |
20160116510 | Kalous et al. | Apr 2016 | A1 |
20160127874 | Kingsmill et al. | May 2016 | A1 |
20160133071 | Henderson | May 2016 | A1 |
20160134932 | Karp | May 2016 | A1 |
20160180618 | Ho et al. | Jun 2016 | A1 |
20160180620 | Eyring et al. | Jun 2016 | A1 |
20160180621 | Desinor, Jr. | Jun 2016 | A1 |
20160189453 | Johnson et al. | Jun 2016 | A1 |
20160189454 | Johnson et al. | Jun 2016 | A1 |
20160189459 | Johnson et al. | Jun 2016 | A1 |
20160189502 | Johnson et al. | Jun 2016 | A1 |
20160189503 | Johnson et al. | Jun 2016 | A1 |
20160208541 | Goto | Jul 2016 | A1 |
20160225240 | Voddhi et al. | Aug 2016 | A1 |
20160275762 | Kuenzi | Sep 2016 | A1 |
20160284170 | Kasmir et al. | Sep 2016 | A1 |
20160284181 | Johnson | Sep 2016 | A1 |
20160290009 | Feirer et al. | Oct 2016 | A1 |
20160291966 | Johnson | Oct 2016 | A1 |
20160300476 | Kasmir et al. | Oct 2016 | A1 |
20160319569 | Johnson et al. | Nov 2016 | A1 |
20160319571 | Johnson | Nov 2016 | A1 |
20160326775 | Johnson | Nov 2016 | A1 |
20160328901 | Johnson | Nov 2016 | A1 |
20160330413 | Scalisi et al. | Nov 2016 | A1 |
20160343181 | Cheng et al. | Nov 2016 | A1 |
20160343188 | Johnson | Nov 2016 | A1 |
20160358433 | Johnson | Dec 2016 | A1 |
20160358437 | Johnson et al. | Dec 2016 | A1 |
20170011570 | Johnson et al. | Jan 2017 | A1 |
20170016249 | Johnson et al. | Jan 2017 | A1 |
20170019378 | Johnson et al. | Jan 2017 | A1 |
20170032597 | Johnson | Feb 2017 | A1 |
20170032602 | Cheng et al. | Feb 2017 | A1 |
20170053468 | Johnson | Feb 2017 | A1 |
20170053469 | Cheng et al. | Feb 2017 | A1 |
20170109952 | Johnson | Apr 2017 | A1 |
20170132864 | Adam | May 2017 | A1 |
20170169679 | Johnson et al. | Jun 2017 | A1 |
20170193724 | Johnson et al. | Jul 2017 | A1 |
20170228603 | Johnson | Aug 2017 | A1 |
20170243420 | Lien | Aug 2017 | A1 |
20170243455 | Johnson et al. | Aug 2017 | A1 |
20170263065 | Johnson | Sep 2017 | A1 |
20180040183 | Cheng et al. | Feb 2018 | A1 |
20180073274 | Johnson et al. | Mar 2018 | A1 |
20180135336 | Johnson et al. | May 2018 | A1 |
20180135337 | Johnson et al. | May 2018 | A1 |
20180179786 | Johnson | Jun 2018 | A1 |
20180253951 | Johnson et al. | Sep 2018 | A1 |
20180261029 | Johnson et al. | Sep 2018 | A1 |
20180268675 | Johnson et al. | Sep 2018 | A1 |
20180340350 | Johnson et al. | Nov 2018 | A1 |
20190019364 | Cheng et al. | Jan 2019 | A9 |
20190040653 | Baumgarte et al. | Feb 2019 | A1 |
20200011088 | Harris | Jan 2020 | A1 |
20200279446 | Hage et al. | Sep 2020 | A1 |
20210383663 | Shamai | Dec 2021 | A1 |
20220058904 | Ross et al. | Feb 2022 | A1 |
20220101672 | Abueshsheikh et al. | Mar 2022 | A1 |
20220101706 | Abueshsheikh et al. | Mar 2022 | A1 |
Number | Date | Country |
---|---|---|
2014236999 | Oct 2015 | AU |
2676196 | Jul 2008 | CA |
2834964 | Nov 2012 | CA |
2905009 | Sep 2014 | CA |
110276876 | Sep 2019 | CN |
110706382 | Jan 2020 | CN |
0 244 750 | Nov 1987 | EP |
0 486 657 | May 1992 | EP |
0 907 068 | Apr 1999 | EP |
1 404 021 | Mar 2004 | EP |
1 529 904 | May 2005 | EP |
2 428 774 | Mar 2012 | EP |
2 447 450 | May 2012 | EP |
2 454 558 | May 2012 | EP |
2 564 165 | Mar 2013 | EP |
2 579 002 | Apr 2013 | EP |
2 631 400 | Aug 2013 | EP |
2 642 252 | Sep 2013 | EP |
2 259 737 | Mar 1993 | GB |
WO 9119986 | Dec 1991 | WO |
WO 2006085852 | Aug 2006 | WO |
WO 2009142596 | Nov 2009 | WO |
WO 2011006515 | Jan 2011 | WO |
WO 2011139682 | Nov 2011 | WO |
WO 2012151290 | Nov 2012 | WO |
WO 2014062321 | Apr 2014 | WO |
WO 2014107196 | Jul 2014 | WO |
WO 2014151692 | Sep 2014 | WO
WO 2015023737 | Feb 2015 | WO |
WO 2015138726 | Sep 2015 | WO |
WO 2015138740 | Sep 2015 | WO |
WO 2015138747 | Sep 2015 | WO |
WO 2015138755 | Sep 2015 | WO |
WO 2016130777 | Aug 2016 | WO |
WO 2016196025 | Dec 2016 | WO |
Entry |
---|
International Search Report and Written Opinion for International Application No. PCT/US2014/026254, dated Nov. 18, 2014. |
International Preliminary Report on Patentability for International Application No. PCT/US2014/026254, dated Sep. 24, 2015. |
International Search Report and Written Opinion for International Application No. PCT/US2015/020180, dated Jun. 16, 2015. |
International Preliminary Report on Patentability for International Application No. PCT/US2015/020180, dated Sep. 22, 2016. |
International Search Report and Written Opinion for International Application No. PCT/US2015/020206, dated Jun. 29, 2015. |
International Preliminary Report on Patentability for International Application No. PCT/US2015/020206, dated Sep. 22, 2016. |
International Search Report and Written Opinion for International Application No. PCT/US2015/020216, dated Jun. 17, 2015. |
International Preliminary Report on Patentability for International Application No. PCT/US2015/020216, dated Sep. 22, 2016. |
International Search Report and Written Opinion for International Application No. PCT/US2015/020226, dated Jun. 25, 2015. |
International Preliminary Report on Patentability for International Application No. PCT/US2015/020226, dated Sep. 22, 2016. |
International Search Report and Written Opinion for International Application No. PCT/US2016/017508, dated Jun. 14, 2016. |
International Preliminary Report on Patentability for International Application No. PCT/US2016/017508, dated Aug. 24, 2017. |
International Search Report and Written Opinion for International Application No. PCT/US2016/033257, dated Aug. 22, 2016. |
International Preliminary Report on Patentability for International Application No. PCT/US2016/033257, dated Dec. 14, 2017. |
Number | Date | Country
---|---|---|
20210396046 A1 | Dec 2021 | US |
Number | Date | Country
---|---|---|
62481797 | Apr 2017 | US | |
61800937 | Mar 2013 | US | |
61801335 | Mar 2013 | US | |
61801294 | Mar 2013 | US | |
62036979 | Aug 2014 | US | |
62036989 | Aug 2014 | US | |
62036991 | Aug 2014 | US | |
62036993 | Aug 2014 | US |
Relation | Number | Date | Country
---|---|---|---|
Parent | 15798425 | Oct 2017 | US |
Child | 17360705 | US | |
Parent | 15066210 | Mar 2016 | US |
Child | 15410845 | US | |
Parent | 14205608 | Mar 2014 | US |
Child | 15066210 | US | |
Parent | 14205973 | Mar 2014 | US |
Child | 15066210 | Mar 2016 | US |
Parent | 14206536 | Mar 2014 | US |
Child | 14205973 | US | |
Parent | 14206619 | Mar 2014 | US |
Child | 14206536 | US | |
Parent | 14207833 | Mar 2014 | US |
Child | 14206619 | US | |
Parent | 14207882 | Mar 2014 | US |
Child | 14207833 | US | |
Parent | 14208947 | Mar 2014 | US |
Child | 14207882 | US | |
Parent | 14208182 | Mar 2014 | US |
Child | 14208947 | US | |
Parent | 14212569 | Mar 2014 | US |
Child | 14208182 | US | |
Parent | 14205608 | Mar 2014 | US |
Child | 15065657 | US | |
Parent | 14205783 | Mar 2014 | US |
Child | 14205608 | US | |
Parent | 14205973 | Mar 2014 | US |
Child | 14205783 | US | |
Parent | 14206536 | Mar 2014 | US |
Child | 14205973 | US | |
Parent | 14206619 | Mar 2014 | US |
Child | 14206536 | US | |
Parent | 14207833 | Mar 2014 | US |
Child | 14206619 | US | |
Parent | 14207882 | Mar 2014 | US |
Child | 14207833 | US | |
Parent | 14208947 | Mar 2014 | US |
Child | 14207882 | US | |
Parent | 14208182 | Mar 2014 | US |
Child | 14208947 | US | |
Parent | 14212569 | Mar 2014 | US |
Child | 14208182 | US | |
Parent | 14730848 | Jun 2015 | US |
Child | 14622654 | US |
Relation | Number | Date | Country
---|---|---|---|
Parent | 15497383 | Apr 2017 | US |
Child | 15798425 | US | |
Parent | 15497327 | Apr 2017 | US |
Child | 15497383 | US | |
Parent | 15463022 | Mar 2017 | US |
Child | 15497327 | US | |
Parent | 15410845 | Jan 2017 | US |
Child | 15463022 | US | |
Parent | 14321260 | Jul 2014 | US |
Child | 14212569 | US | |
Parent | 14321000 | Jul 2014 | US |
Child | 14321260 | US | |
Parent | 14459054 | Aug 2014 | US |
Child | 15463022 | Mar 2017 | US |
Parent | 14461177 | Aug 2014 | US |
Child | 14459054 | US | |
Parent | 14465513 | Aug 2014 | US |
Child | 15066210 | Mar 2016 | US |
Parent | 14469186 | Aug 2014 | US |
Child | 15066210 | Mar 2016 | US |
Parent | 14471470 | Aug 2014 | US |
Child | 14469186 | US | |
Parent | 14622054 | Feb 2015 | US |
Child | 14471470 | US | |
Parent | 14622192 | Feb 2015 | US |
Child | 14622054 | US | |
Parent | 14622578 | Feb 2015 | US |
Child | 14622192 | US | |
Parent | 14622654 | Feb 2015 | US |
Child | 14622578 | US | |
Parent | 14730848 | Jun 2015 | US |
Child | 14622654 | US | |
Parent | 14732290 | Jun 2015 | US |
Child | 14730848 | US | |
Parent | 14796994 | Jul 2015 | US |
Child | 14732290 | US | |
Parent | 15065657 | Mar 2016 | US |
Child | 14796994 | US | |
Parent | 14321260 | Jul 2014 | US |
Child | 14212569 | US | |
Parent | 14321000 | Jul 2014 | US |
Child | 14321260 | US | |
Parent | 14465513 | Aug 2014 | US |
Child | 14321000 | US | |
Parent | 14465527 | Aug 2014 | US |
Child | 14465513 | US | |
Parent | 14469127 | Aug 2014 | US |
Child | 14465527 | US | |
Parent | 14469186 | Aug 2014 | US |
Child | 14469127 | US | |
Parent | 14471414 | Aug 2014 | US |
Child | 14469186 | US | |
Parent | 14471470 | Aug 2014 | US |
Child | 14471414 | US | |
Parent | 14622054 | Feb 2015 | US |
Child | 14471470 | US | |
Parent | 14622192 | Feb 2015 | US |
Child | 14622054 | US | |
Parent | 14622578 | Feb 2015 | US |
Child | 14622192 | US | |
Parent | 14622396 | Feb 2015 | US |
Child | 14622578 | US | |
Parent | 14622654 | Feb 2015 | US |
Child | 14622396 | US | |
Parent | 14731092 | Jun 2015 | US |
Child | 14730848 | US | |
Parent | 14732290 | Jun 2015 | US |
Child | 14731092 | US | |
Parent | 14796994 | Jul 2015 | US |
Child | 14732290 | US | |
Parent | 15066091 | Mar 2016 | US |
Child | 15066210 | Mar 2016 | US |