The present invention relates generally to tracking animals, such as pets, using a smart collar system, and to preventing animals from attacking other animals.
Pet collars have long been used to provide identification information regarding the home or owner of a pet, particularly when the pet is lost. Collars are also often used as an attachment means for leashes or other retention mechanisms for walking and the like. Recently, smart collars have been used to detect the location of animals using GPS. Some smart collars include stimulus mechanisms to help train the animal or warn of danger. Training collars have been developed in an attempt to train animals, such as to train a dog when and when not to bark. Often these collars emit a shock or sound in response to an audible bark. However, current systems lack reliability and accuracy, and certain situations compound the problem, such as when a plurality of noises is present, some of which may be similar to a dog bark or may come from another dog barking in the environment. Improvements in these areas are desired, and the present invention seeks to improve upon them in a novel manner.
One problem with pets, such as dogs or cats, is that the pet may attack and harm or kill wildlife. For example, stray and owned cats kill millions of birds a year in the United States and Canada alone. Cats are thought to have contributed to the extinction of 63 species of wild animals, mostly birds. Dogs are thought to have contributed to the extinction of nearly one dozen species of wild animals. Both cats and dogs are threats to hundreds of species worldwide, including species of birds, mammals, reptiles, and amphibians.
Accordingly, it is desirable to have a system and method for identifying and deterring an animal from successfully attacking and injuring or killing another animal. The system includes an animal collar with built-in electronics that detect animal motion; the detected motion is determinative of actions, and those actions can be used to determine and predict behavior, in particular behavior indicating that the animal is preparing for or engaging in an attack. Accordingly, the present application relates to systems and methods for detecting that an animal is about to attack another animal and for preventing the animal from doing so.
In one such embodiment, a method of deterring an animal from attacking comprises: receiving signals from at least one sensor of a plurality of sensors disposed on a smart collar worn by the animal; analyzing the signals from the plurality of sensors to determine a behavior of the animal; comparing the determined behavior to behavior known to indicate that the animal is about to attack; and, when the determined behavior is indicative of an animal that is about to attack, providing a deterring stimulus configured to prevent the animal from attacking or providing a warning stimulus configured to warn nearby animals or people.
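The steps of this embodiment can be pictured with a minimal sketch. The following Python fragment is illustrative only: the action labels, the `classify_behavior` placeholder, and the `deter`/`warn` callbacks are assumptions standing in for the actual sensor analysis and stimulus hardware.

```python
# Illustrative sketch of the deterrence method; sensor analysis and
# stimulus delivery are hypothetical stand-ins, not the claimed algorithm.

# Behavior patterns known to precede an attack (example labels only).
ATTACK_BEHAVIORS = {("stalk", "crouch", "lunge"), ("stalk", "lunge")}

def classify_behavior(signals):
    """Map raw sensor signals to a sequence of actions (placeholder logic).

    A real implementation would analyze accelerometer waveforms; here each
    signal is assumed to already carry an action label."""
    return tuple(s["action"] for s in signals)

def deter_if_attacking(signals, deter, warn):
    """Compare the determined behavior to known attack behavior and respond."""
    behavior = classify_behavior(signals)
    if behavior in ATTACK_BEHAVIORS:
        deter()   # e.g., audio, vibration, electro-shock, or visual stimulus
        warn()    # e.g., audio, vibration, visual, or mobile notification
        return True
    return False

# Example: a stalk -> crouch -> lunge sequence triggers the response.
triggered = deter_if_attacking(
    [{"action": "stalk"}, {"action": "crouch"}, {"action": "lunge"}],
    deter=lambda: None,
    warn=lambda: None,
)
```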
The deterring stimulus can include at least one of: an audio stimulus, a vibrating stimulus, an electro-shock stimulus, and a visual stimulus.
The warning stimulus can include at least one of: an audio stimulus, a vibrating stimulus, a visual stimulus and a notification sent to a mobile device.
Both the deterring and warning stimuli can be generated by the smart collar, although in some variations third-party devices in the vicinity, for example a remote Bluetooth- or Wi-Fi-enabled speaker, can also be triggered to generate the listed deterring and warning stimuli.
The audio stimulus can generally be configured to startle a nearby animal, causing it to become alert or to flee.
The above embodiment can include recording a location of the animal when it is determined that the animal is about to attack. This location can be used to update the animal behavior detection and deterrent algorithm, and it can also determine the mode in which the smart collar operates. Similarly, the smart collar or another device can be used to record sound signals when it is determined that the animal is about to attack.
The received signals can be processed and used to indicate movement of the animal, from which one or more actions can be determined.
The determined behavior can include one or more sequential actions taken by the animal.
In another embodiment, a method of training an animal attack detection and deterrent system comprises: setting an attack detection system, which comprises a smart collar, a remote server, and a database, into a training mode; receiving a plurality of input signals from one or more sensors associated with the smart collar, wherein at least one sensor is a motion sensor; observing or inducing the animal to engage in a plurality of actions, at least one of which is a simulated attack or part of a simulated attack; corresponding the received plurality of input signals to each of the observed actions; and updating an attack detection algorithm based on the observed actions and the plurality of input signal data.
This embodiment can further include the steps of: changing the attack detection system from training mode to deterrent mode; and recording signal data indicative of actions leading up to and forming part of an attack. It can also include the steps of: recording location data associated with the attack; and recording ambient noise data associated with the attack.
In the above method the induced actions could include any of: walking, running, jumping, lying down, eating, pouncing, growling, digging, cuddling, and sleeping, or other known actions that the animal can replicate.
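The training flow above can be sketched as collecting labeled signal samples per action and folding them into an updated model. The data structures and the mean-magnitude summary below are illustrative assumptions chosen for brevity, not the actual update step.

```python
# Minimal sketch of the training-mode flow: collect sensor signals,
# label them with observed actions, and update the detection model.
# All names and the summary statistic are illustrative assumptions.

def train_attack_detector(samples):
    """samples: list of (action_label, signal_values) observed in training mode.

    Returns a per-action summary (mean signal magnitude) standing in for
    an updated attack detection algorithm."""
    model = {}
    for action, values in samples:
        model.setdefault(action, []).extend(values)
    # Collapse each action's signals to a simple summary statistic.
    return {action: sum(vals) / len(vals) for action, vals in model.items()}

# Example session: ordinary walking plus a simulated attack component.
model = train_attack_detector([
    ("walking", [1.0, 1.2, 1.1]),
    ("pouncing", [5.0, 6.5]),   # simulated attack component
    ("walking", [0.9, 1.0]),
])
```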
In yet another embodiment, a smart collar for deterring an animal from attacking comprises: a power supply; at least one sensor configured to receive signals indicative of actions taken by the animal; a memory configured to store behavior patterns based on one or more actions that indicate the animal is likely to attack; a processor configured to determine whether the received signals indicate likely-to-attack behavior of the animal by identifying one or more actions based on the received signals and determining whether those one or more actions are indicative of behavior similar to the stored behavior patterns; and a deterrent device configured to deter the animal from attacking, wherein the deterrent device deters the animal from attacking or continuing to attack when the processor determines that the animal's behavior indicates it is likely to attack or is already attacking.
The at least one sensor of the smart collar can include an accelerometer, a gyroscope, or an altimeter that determines movement of the animal.
The signals detected by the one or more sensors can include a magnitude indicating the type of movement and an angular component indicating direction of movement.
The deterrent device of the smart collar can include a haptic device or an audio-emitting device that emits an audio signal. As noted in another embodiment, however, these could also be devices external to the smart collar but in communication with it.
Again, the audio signal(s) generated can be configured to warn nearby animals or persons.
Additionally, the smart collar can include a location determination device. This location determination device can be configured to temporarily disable the deterrent device when it is determined that the animal is located inside its dwelling or another safety zone. The location determination device could be a Wi-Fi transceiver configured to connect the smart collar to Wi-Fi networks, a Bluetooth transmitter, or even a GPS device. In the variation in which the smart collar is connected to a specified Wi-Fi network, the processor could be configured not to analyze the animal's actions or behavior. In such a manner the Wi-Fi network or its range is used to identify a particular zone.
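The safe-zone gating described here can be pictured as a simple check on the currently connected network. The network names and the gating function below are hypothetical examples, not part of any claimed implementation.

```python
# Sketch of disabling behavior analysis inside a known safe zone.
# SAFE_ZONE_NETWORKS and the connected-SSID input are illustrative.

SAFE_ZONE_NETWORKS = {"HomeWiFi", "CabinWiFi"}  # example SSIDs only

def should_analyze(connected_ssid):
    """Skip behavior analysis while the collar is on a safe-zone network.

    connected_ssid is None when the collar is out of range of any
    known network, i.e. presumably outside the safety zone."""
    return connected_ssid not in SAFE_ZONE_NETWORKS
```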
The smart collar can also include a training mode in which the one or more sensors receive signals, the user indicates which sets of received signals are to be associated with certain actions, and the user indicates which of the one or more actions indicate behavior consistent with the animal being about to attack or attacking.
Other variations and configurations will become evident in the detailed description below.
The foregoing and other objects, features, and advantages of the invention will be apparent from the following description of particular embodiments of the invention, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
This application describes a behavior detection system and implementation method. Part of the behavior detection system is training the system to discern what is and is not an attack on another animal and what is normal movement.
Additionally, the collar system can include various sensors and interfaces, including but not limited to a tangible display 132 providing pet identification information, optical sensors such as cameras 130, a Global Positioning System (GPS) receiver 110, RFID, infrared communication mechanisms, accelerometers, Wi-Fi adapters, Bluetooth adapters, SIM or GSM communication modules, temperature sensors, microphones, light sensors, ultrasonic sensors, radios, or virtually any other contemplated sensor that those having skill in the art would recognize as useful for providing the desired feedback regarding a pet activity. It will be appreciated that these various sensors are indicated by a single reference number but can be provided as virtually any sensor that would provide information regarding a particular activity, parameter, or characteristic.
Additionally, display 132 of the collar can be an interactive display which can be used to access information or settings of the collar by a user. In some embodiments, the collar can also include lights as noted above or other indicia which can aid in pet location when lost, particularly at night or other dark conditions. Such a light can also be illuminated automatically in low-light conditions so as to aid in visibility, such as on walks for passing motorists, etc.
The accelerometers are used to measure motion of the smart collar, which is indicative of the motion of the pet that is wearing the smart collar. The measured motion can be used to determine an action or behavior of the pet. For example, if a pet is walking, the accelerometer signal will indicate a steady pace of movement at a relatively low velocity. When the pet is running, the accelerometer signal will indicate that the animal is moving at a higher velocity. If the pet is inactive, the accelerometer signal will indicate no movement.
When a pet such as a dog or cat is hunting, the pet will typically move slowly, come to a stop, crouch or otherwise get low to the ground, and then burst with speed at the prey. This series of behaviors and movements results in a series of accelerometer signals having a detectable waveform that indicates an imminent attack.
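One way to picture detecting that slow-stop-burst waveform is a simple state machine over speed samples derived from the accelerometer. The thresholds and phase logic below are illustrative assumptions, not the claimed detection algorithm.

```python
# Illustrative detector for the hunt pattern described above:
# slow movement, then a stop/crouch, then a sudden burst of speed.
# Threshold values are arbitrary examples.

SLOW_MAX = 1.5    # at or below this (but moving): slow stalking movement
STOP_MAX = 0.2    # at or below this: effectively stationary (crouch/stop)
BURST_MIN = 5.0   # at or above this: sudden burst toward prey

def looks_like_imminent_attack(speeds):
    """Return True if the speed series shows slow -> stop -> burst in order."""
    phase = "start"
    for s in speeds:
        if phase == "start" and STOP_MAX < s <= SLOW_MAX:
            phase = "seen_slow"       # slow stalking detected
        elif phase == "seen_slow" and s <= STOP_MAX:
            phase = "seen_stop"       # stop/crouch detected
        elif phase == "seen_stop" and s >= BURST_MIN:
            return True               # burst after slow and stop
    return False
```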
In order to differentiate between acceptable behavior and an attack on another animal, an attack detection algorithm has been developed to measure signals from the accelerometer. The signals from the accelerometer are analyzed to determine pet behavior or movement. When the signal from the accelerometer indicates that the animal is about to attack, the smart collar enters attack deterrent mode. Various flowcharts of how this is accomplished are illustrated in the accompanying drawings.
In one embodiment, the attack detection algorithm also includes using sounds sensed by the audio sensor 230. The audio sensor 230 senses sound around the collar 100, including sound made by the pet. Many pets make a sound before attacking, such as a growl or a hiss. The attack detection algorithm includes using sound signals that indicate the pet is about to attack.
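A combined motion-plus-sound check might be pictured as simple cue fusion, with deterrent mode entered only when enough cues agree. The boolean indicator inputs and the scoring scheme are illustrative simplifications of the sensor processing described above.

```python
# Sketch of fusing motion and sound cues: each cue contributes to a
# score, and deterrent mode is entered when the score meets a threshold.
# The boolean cue inputs are illustrative stand-ins for signal analysis.

def attack_score(motion_indicates_attack, sound_indicates_attack):
    """Return a simple 0-2 score from the two cue sources."""
    return int(motion_indicates_attack) + int(sound_indicates_attack)

def enter_deterrent_mode(motion_cue, sound_cue, threshold=2):
    """Enter deterrent mode only when enough cues agree.

    With threshold=2, both the motion cue (e.g., the hunt waveform)
    and the sound cue (e.g., a growl or hiss) must be present."""
    return attack_score(motion_cue, sound_cue) >= threshold
```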
In one embodiment, the attack deterrent mode includes a negative reinforcement stimulus that is applied to the pet in order to deter the pet from attacking another animal. The negative stimulus or action can include one or more of shocking the animal, using the haptic feedback to buzz the animal, playing an audio signal, connecting to the user's cell phone to allow the user to reprimand the animal, or playing an audio recording of the user reprimanding the animal.
In one embodiment, the attack deterrent mode includes playing an audio signal that is likely to warn or scare the animal that is about to be attacked, causing that animal to flee and making it more likely to avoid the attack.
In one embodiment, both a negative stimulus or action and playing an audio signal are provided. The audio signal may also be an audio signal that both deters the pet from attacking and startles the attacked animal into fleeing.
In one embodiment, the attack deterrent mode includes recording an audio signal received from the audio sensor 230. The recorded audio signal can be transmitted to the user to allow the user to listen to the attack. Also, the recorded audio signal can be used to determine sounds the pet makes during an attack and can be used to update the attack detection algorithm.
As described above and in more detail below, the collar 100 includes multiple sensors that can be used to determine the location of the collar 100. The location of the collar 100 can be used to turn on the attack detection algorithm and to turn off the attack detection algorithm. When a pet is located inside, such as in its owner's home, the attack detection algorithm can be turned off so actions such as the pet playing are not detected as potential attacks. When the pet location indicates that the pet is outside, the attack detection algorithm is turned on.
In one embodiment, the attack detection algorithm may be manually turned on or off by the user by using a mobile device in communication with the collar, or with a button on the collar 100 itself. This enables the user to turn the attack detection algorithm on while the pet is indoors if there are other animals around. Also, it enables the user to turn off the attack detection algorithm when the pet is outside and the user is not concerned about the pet attacking other animals.
The attack deterrent system also includes an algorithm training mode. In this mode, the collar 100 and the various sensors detect input while the pet is playing in a way that simulates attacking another animal. For example, a young cat may “hunt” a toy mouse attached to a string that is pulled by the user. The user enters the system into a training mode, and as the cat “attacks” the toy mouse, the sensor data is collected and analyzed. This collected and analyzed data is used to update and fine tune the attack detection algorithm.
A historical database of positively detected attacks or attempted attacks can be updated periodically as attacks are appropriately detected. This information can be further used to fine-tune the algorithm to distinguish between a pet playing and a pet about to attack another animal.
Also contemplated herein is a pet training and location system which can be utilized by a pet owner to train a pet to behave in a certain manner depending on the pet's particular location as determined by a collar being worn by the pet. The attack detection algorithm and system can be incorporated onto a collar such as that shown in the accompanying drawings.
In accordance with these concepts, the system as contemplated can include a collar 100, as shown in the accompanying drawings.
It will be appreciated that providing a negative stimulus, such as an electric shock through an electrode, has been utilized in many previously known systems and is known as a relatively effective training method. However, one aspect of the present invention involves providing not only a negative stimulus for a negative behavior, but also a positive stimulus for corrective or desired behavior. In particular, one aspect of the present invention involves providing a positive stimulus when a pet stops barking after barking is detected. For example, when it is detected that a pet is barking and the pet is told to stop, either by a person directly or by a person's voice recorded and stored in the collar, and the pet then stops barking, the pet receives a positive stimulus. In order to provide a positive stimulus, the system as contemplated can also include an audio transmitter, such as a speaker, which can be configured to provide an audio stimulus in the audible range or at ultrasonic frequencies which can be heard by the pet but not the owner/user. In such cases, the audio transmitter can be configured to provide an audio signal which can be either pleasing or unpleasant to the pet in response to determined behaviors. The audio transmitter can provide positively trained sounds or recordings when positive activities are determined.
Further, it will also be understood that the audio transmitter can also be used for negative reinforcement, rather than using electric shock. In such cases the user can record a verbal reprimand, or some other negative reinforcement noise so as to provide a more humane negative reinforcement over the electric shock and electrode methodology.
In some embodiments, the user platform can include a dedicated screen in the application devoted to hands-on training. As desired, the user can press a positive button to cause the collar to issue the positive reinforcement signal, to help with real-time association of a particular sound with positive reinforcement. A separate button on the same screen can then cause the collar to issue the negative reinforcement signal(s) for real-time association of a particular sound with negative reinforcement.
With this interface, a trainer can perform general training with the collar's reinforcement signals instead of, or in addition to, traditional reinforcement signals.
It will also be understood that a power source, such as a battery, can be provided within the collar and configured to provide power to each of the aforementioned accessories, sensors, etc. The power source can be configured to be rechargeable either through a power port or through wireless charging technology.
As discussed in some detail above, the system will include a user platform, such as an application, which can be configured to receive input from a user. It will be understood that the application/user platform can be accessed through mobile devices, web portals, or any number of suitable means. It will be understood that the platform is operable to define at least one permitted zone where the pet is permitted to reside and at least one restricted zone where the pet is restricted from entering. This can be achieved by defining or drawing boundaries, for example on a map.
Additionally, it will be understood that the collar can be provided with a local processing unit and non-transitory computer-readable media for tracking location or activities and saving data regarding those activities locally. Such a local processor and non-transitory computer-readable media can store computer instructions wherein sounds, warnings, positive reinforcement, or negative reinforcement steps, and when each is applied, can be determined and performed locally after transfer of such instructions from the user platform. Alternatively, the user platform can be connected to a remote server having a remote processor and non-transitory computer-readable media that can be utilized remotely, and instructions can then be transmitted to the collar over a mobile or home network to perform any such step or action.
The system can also include processing capabilities and data storage capabilities which allow for activities to be determined, stored, and enter a desired mode based on a predetermined set of instructions in response to input or commands from the various sensor or commands provided through the communication systems. In some instances, pet data can be transmitted and stored over an external network or service for data tracking of various pet activities, parameters, etc.
In various aspects of the present invention, the various sensors can be divided into various primary groups and subset groups, and in response to various sensor inputs the collar can be prompted to enter various modes involving those primary and subset groups. For example, the audio sensor or microphone in combination with an accelerometer can detect when a dog is barking; in response to barking detected over a predetermined timeframe, the collar can then activate the optical sensor or camera so as to detect or otherwise capture an image or video of what the dog is barking at. It is thus contemplated herein that various sensors can be primary sensors and can cause the collar to enter various modes wherein various sensor subsets are activated or deactivated in response to sensor input. It will be appreciated that the collar system and application can have a predetermined mode set, and in some instances custom modes can be created or certain sensors can be manually controlled using the mobile application.
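The primary/subset grouping described above can be pictured as a small mode table mapping a primary-sensor event to the secondary sensors it activates. The event names and sensor groupings below are hypothetical examples only.

```python
# Illustrative mode table: a primary-sensor event activates a subset
# of secondary sensors. Event names and groupings are example assumptions.

MODE_TRIGGERS = {
    "bark_detected": {"camera"},               # capture what the dog barks at
    "rapid_motion": {"camera", "microphone"},  # record a possible chase
}

def active_subset(primary_event, currently_active):
    """Return the updated set of active secondary sensors for an event.

    Unknown events leave the currently active set unchanged."""
    return set(currently_active) | MODE_TRIGGERS.get(primary_event, set())
```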
The mobile application, which can be specifically designed to connect to the collar using Bluetooth technology on a smart device, can allow for control of the collar itself in real-time. For example, an active mode or a user connected mode can allow the pet to leave a predetermined area without signaling alerts to a previously defined area perimeter.
In yet additional embodiments, the collar system can be connected to a control system or program during a charging process or through other hard-connection means when not being worn by the pet. Such connection and charging means can be provided using USB or other serial connections and charging methods.
It will be appreciated that a speaker can be provided through which the user/owner can give verbal commands, positive reinforcement, or otherwise. In other embodiments, negative reinforcement mechanisms such as vibrators or shock electrodes can be provided so as to provide negative reinforcement for certain behaviors. Each of these functions can be activated automatically or, alternatively, manually via the mobile smart device of the owner/user using the control application.
In certain embodiments, the control application can be utilized to customize a collar response by registering each of the individual independent proximity sensors and saving a profile therefor which prompts specific stimuli based on the proximity thereto. For example, a sensor on a cat and a sensor on a dog can together prompt a negative response to discourage the dog from chasing the cat.
In yet additional embodiments certain collar responses can be customized by a user using the control application, such as a custom recording including particular and customized praise(s)/reprimand(s) to be played back by the speaker based on a sensed proximity to a particular item or boundary.
As an alternative to using location information, audible noise can also be used. For example, the ambient noise changes from being inside to being outside. Thus, if the ambient noise is indicative of being outside, that can trigger activation of the attack detection and prevention mode.
In yet another alternative, the microphone can be used to further analyze the ambient noise to determine if other animals or persons are in the vicinity.
As noted above, various sensors can be used along with historical and other animal behavior datasets in remote databases that are utilized by, and used to improve, the attack detection and animal behavior algorithms.
It should be understood, if not already, that these algorithms can be learning algorithms, including supervised learning, which uses the user input data; unsupervised learning, which includes gathering data from multiple smart collars and animals to generate a database of signal information that can be used to refine determined actions and ultimately predicted behavior; and reinforcement learning, which is known in the art. As noted, the sensor, location, audio, and historical data ultimately enable the methods and systems above to deter and warn, as well as to improve with usage.
While the principles of the invention have been described herein, it is to be understood by those skilled in the art that this description is made only by way of example and not as a limitation as to the scope of the invention. Other embodiments are contemplated within the scope of the present invention in addition to the exemplary embodiments shown and described herein. Modifications and substitutions by one of ordinary skill in the art are considered to be within the scope of the present invention.
This application claims the benefit of U.S. provisional patent application No. 63/082,739, filed on Sep. 24, 2020, which is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
5067441 | Weinstein | Nov 1991 | A |
5576994 | Kato et al. | Nov 1996 | A |
5749324 | Moore | May 1998 | A |
5857433 | Files | Jan 1999 | A |
5868100 | Marsh | Feb 1999 | A |
6043748 | Touchton et al. | Mar 2000 | A |
6166643 | Janning et al. | Dec 2000 | A |
6232880 | Anderson et al. | May 2001 | B1 |
6232916 | Grillo et al. | May 2001 | B1 |
6271757 | Touchton et al. | Aug 2001 | B1 |
6487992 | Hollis | Dec 2002 | B1 |
6581546 | Dalland et al. | Jun 2003 | B1 |
6683564 | McBurney | Jan 2004 | B1 |
6700492 | Touchton et al. | Mar 2004 | B2 |
6903682 | Maddox | Jun 2005 | B1 |
6928958 | Crist et al. | Aug 2005 | B2 |
7000570 | Napolez et al. | Feb 2006 | B2 |
7017524 | Gillis et al. | Mar 2006 | B2 |
7046152 | Peinetti et al. | May 2006 | B1 |
7068174 | Peinetti et al. | Jun 2006 | B1 |
7110777 | Duncan | Sep 2006 | B2 |
7111586 | Lee et al. | Sep 2006 | B2 |
7117822 | Peinetti et al. | Oct 2006 | B1 |
7174855 | Gerig et al. | Feb 2007 | B2 |
7198009 | Crist et al. | Apr 2007 | B2 |
7204204 | Peinetti et al. | Apr 2007 | B1 |
7222589 | Lee, IV et al. | May 2007 | B2 |
7252051 | Napolez et al. | Aug 2007 | B2 |
7259718 | Patterson et al. | Aug 2007 | B2 |
7278376 | Peinetti et al. | Oct 2007 | B1 |
7343879 | Gerig et al. | Mar 2008 | B2 |
7345588 | Gerig | Mar 2008 | B2 |
7360505 | Gerig et al. | Apr 2008 | B2 |
7394390 | Gerig | Jul 2008 | B2 |
7409924 | Kates | Aug 2008 | B2 |
7495570 | Peinetti et al. | Feb 2009 | B1 |
7552699 | Moore | Jun 2009 | B2 |
7565885 | Moore | Jul 2009 | B2 |
7602302 | Hokuf et al. | Oct 2009 | B2 |
7667642 | Frericks et al. | Feb 2010 | B1 |
7710263 | Boyd | May 2010 | B2 |
7779788 | Moore | Aug 2010 | B2 |
7861676 | Kates | Jan 2011 | B2 |
7946252 | Lee, IV et al. | May 2011 | B2 |
8011327 | Mainini et al. | Sep 2011 | B2 |
8018334 | DiMartino et al. | Sep 2011 | B1 |
8152745 | Smith et al. | Apr 2012 | B2 |
8239133 | Wang et al. | Aug 2012 | B2 |
8342135 | Peinetti et al. | Jan 2013 | B2 |
8430064 | Groh et al. | Apr 2013 | B2 |
8436735 | Mainini | May 2013 | B2 |
8448607 | Giunta | May 2013 | B2 |
8704728 | Mujahed et al. | Apr 2014 | B2 |
8736499 | Goetzl et al. | May 2014 | B2 |
8779925 | Rich et al. | Jul 2014 | B2 |
8803692 | Goetzl et al. | Aug 2014 | B2 |
8823513 | Jameson et al. | Sep 2014 | B2 |
8839744 | Bianchi et al. | Sep 2014 | B1 |
8934923 | Golden | Jan 2015 | B1 |
8935093 | Chansarkar | Jan 2015 | B2 |
8939111 | Berntsen | Jan 2015 | B2 |
8947240 | Mainini | Feb 2015 | B2 |
8947241 | Trenkle et al. | Feb 2015 | B2 |
8955462 | Golden | Feb 2015 | B1 |
8972180 | Zhao et al. | Mar 2015 | B1 |
8978592 | Duncan et al. | Mar 2015 | B2 |
9146113 | Funk et al. | Sep 2015 | B1 |
9173380 | Trenkle et al. | Nov 2015 | B2 |
9258982 | Golder | Feb 2016 | B1 |
9538329 | Vivathana | Jan 2017 | B1 |
9648849 | Vivathana | May 2017 | B1 |
9654925 | Solinsky et al. | May 2017 | B1 |
9848295 | Mason et al. | Dec 2017 | B1 |
9861076 | Rochelle et al. | Jan 2018 | B2 |
9922522 | Solinsky et al. | Mar 2018 | B2 |
9924314 | Solinsky et al. | Mar 2018 | B2 |
9980463 | Milner et al. | May 2018 | B2 |
10045512 | Mainini et al. | Aug 2018 | B2 |
10151843 | McFarland et al. | Dec 2018 | B2 |
10154651 | Goetzl et al. | Dec 2018 | B2 |
10228447 | Rich et al. | Mar 2019 | B2 |
10231440 | Seltzer et al. | Mar 2019 | B2 |
D851339 | Solinsky et al. | Jun 2019 | S |
10356585 | Ling et al. | Jul 2019 | B2 |
10444374 | Park et al. | Oct 2019 | B2 |
10674709 | Goetzl et al. | Jun 2020 | B2 |
10806126 | Loewke et al. | Oct 2020 | B1 |
10842129 | Anderton et al. | Nov 2020 | B1 |
10863718 | Lazarevic | Dec 2020 | B1 |
10955521 | Seltzer | Mar 2021 | B2 |
10986813 | Seltzer et al. | Apr 2021 | B2 |
11013214 | Anderton et al. | May 2021 | B2 |
11166435 | Anderton et al. | Nov 2021 | B2 |
11470814 | Goetzl et al. | Oct 2022 | B2 |
11553692 | Goetzl et al. | Jan 2023 | B2 |
20050009376 | Gotz et al. | Jan 2005 | A1 |
20050066912 | Korbitz et al. | Mar 2005 | A1 |
20060197672 | Talamas, Jr. et al. | Sep 2006 | A1 |
20060247847 | Carter et al. | Nov 2006 | A1 |
20070204804 | Swanson et al. | Sep 2007 | A1 |
20080035072 | Lee | Feb 2008 | A1 |
20080036610 | Hokuf et al. | Feb 2008 | A1 |
20080272920 | Brown | Nov 2008 | A1 |
20090309789 | Verechtchiagine | Dec 2009 | A1 |
20100097208 | Rosing et al. | Apr 2010 | A1 |
20100111359 | Bai | May 2010 | A1 |
20100139576 | Kim et al. | Jun 2010 | A1 |
20100161271 | Shah et al. | Jun 2010 | A1 |
20110140884 | Santiago et al. | Jun 2011 | A1 |
20110298615 | Rich et al. | Dec 2011 | A1 |
20120206454 | Alasaarela | Aug 2012 | A1 |
20120209730 | Garrett | Aug 2012 | A1 |
20130044025 | Chiu | Feb 2013 | A1 |
20130127658 | McFarland et al. | May 2013 | A1 |
20130157628 | Kim et al. | Jun 2013 | A1 |
20130179204 | Sabarez, II | Jul 2013 | A1 |
20130225282 | Williams et al. | Aug 2013 | A1 |
20130271281 | Jessop | Oct 2013 | A1 |
20130332064 | Funk et al. | Dec 2013 | A1 |
20140002239 | Rayner | Jan 2014 | A1 |
20140002307 | Mole et al. | Jan 2014 | A1 |
20140012094 | Das et al. | Jan 2014 | A1 |
20140048019 | So | Feb 2014 | A1 |
20140230755 | Trenkle | Aug 2014 | A1 |
20140261235 | Rich et al. | Sep 2014 | A1 |
20140320347 | Rochelle et al. | Oct 2014 | A1 |
20140335887 | Liu et al. | Nov 2014 | A1 |
20140352632 | McLaughlin | Dec 2014 | A1 |
20150065167 | Scalisi | Mar 2015 | A1 |
20150107531 | Golden | Apr 2015 | A1 |
20150181840 | Tupin, Jr. et al. | Jul 2015 | A1 |
20150219767 | Humphreys et al. | Aug 2015 | A1 |
20150269624 | Cheng et al. | Sep 2015 | A1 |
20150373951 | Kelly | Dec 2015 | A1 |
20160007888 | Nieminen et al. | Jan 2016 | A1 |
20160097861 | Li et al. | Apr 2016 | A1 |
20160150362 | Shaprio et al. | May 2016 | A1 |
20160178392 | Goldfain | Jun 2016 | A1 |
20160259061 | Carter | Sep 2016 | A1 |
20160278346 | Golden et al. | Sep 2016 | A1 |
20170066464 | Carter et al. | Mar 2017 | A1 |
20170265432 | Anderton et al. | Sep 2017 | A1 |
20170372580 | Vivathana | Dec 2017 | A1 |
20180125038 | Hord | May 2018 | A1 |
20190029221 | Anderton et al. | Jan 2019 | A1 |
20200267941 | Seltzer et al. | Aug 2020 | A1 |
20210045353 | Ehrman | Feb 2021 | A1 |
20210274754 | Talley | Sep 2021 | A1 |
20220068142 | Anderton | Mar 2022 | A1 |
20220236367 | Seltzer et al. | Jul 2022 | A1 |
20220256812 | Huber et al. | Aug 2022 | A1 |
20220257132 | Huber et al. | Aug 2022 | A1 |
20220287577 | Huber et al. | Sep 2022 | A1 |
20230039951 | Seltzer et al. | Feb 2023 | A1 |
20230240269 | Mainini et al. | Aug 2023 | A1 |
20230301532 | Huber et al. | Sep 2023 | A1 |
Number | Date | Country |
---|---|---|
2865966 | Jul 2021 | CA |
101713822 | Aug 2013 | CN |
105393139 | Jul 2019 | CN |
2869691 | Nov 2005 | FR |
10260055 | Sep 1998 | JP |
10295212 | Nov 1998 | JP |
4787762 | Oct 2011 | JP |
2015139667 | Aug 2015 | JP |
102265654 | Jun 2021 | KR |
1997024577 | Oct 1997 | WO |
2000057692 | Oct 2000 | WO |
WO-2007015186 | Feb 2007 | WO |
2014151064 | Sep 2014 | WO |
2014182420 | Nov 2014 | WO |
2015173712 | Nov 2015 | WO |
WO-2016010906 | Jan 2016 | WO |
2016067116 | May 2016 | WO |
Entry |
---|
Alvaro Llaria; Geolocation and Monitoring Platform for Extensive Farming in Mountain Pastures (“Llaria”); IEEE International Conference on Industrial Technology (ICIT), pp. 2420-2425; Mar. 19, 2015. |
D. M. Anderson; Virtual Fencing—Past, Present and Future (“Anderson II”); The Rangeland Journal vol. 26, pp. 65-78. |
Johnathan Chang, et al.; Wireless Pet Containment (“Chang”); Rutgers University, Electrical and Computer Engineering Department, Capstone Design Projects, Team Project No. SP16-002; Feb. 22, 2016. |
W. Randolph Franklin, PNPOLY—Point Inclusion in Polygon Test W. Randolph Franklin (WRF), May 18, 2005, https://web.archive.org/web/20050518083531/http://www.ecse.rpi.edu/Homepages/wrf/Research/Short_Notes/pnpoly.html (“Franklin”). |
Kai Hormann, et al.; The Point in Polygon Problem for Arbitrary Polygons (“Hormann”); Computational Geometry, vol. 20 Issue 3; Nov. 2001. |
Alejandro Weinstein; Distance from a Point to a Polygon (“Weinstein”); Matlab Central File Exchange; Apr. 1, 2008. |
Donald H. House, Chapter 9—Raycasting Polygonal Models, Dec. 28, 2013, https://web.archive.org/web/20131228085233/http://people.cs.clemson.edu/~dhouse/courses/405/notes/raypolygon.pdf ("House"). |
Andrea Antonia Serra, et al.; A Low-Profile Linearly Polarized 3D PIFA for Handheld GPS Terminals (“Serra”); IEEE Transactions on Antennas and Propagation, vol. 58, No. 4, pp. 1060-1066; Jan. 26, 2010. |
Paul D Groves, et al.; Context Detection, Categorization and Connectivity for Advanced Adaptive Integrated Navigation (“Groves”); Institute of Navigation GNSS+ 2013, Sep. 16-20, 2013, Nashville, TN, USA; Sep. 20, 2013. |
Jeffrey David Miller; A Maximum Effort Control System for the Tracking and Control of a Guided Canine; A dissertation submitted to the Graduate Faculty of Auburn University in partial fulfillment of the requirements for the Degree of Doctor of Philosophy; Dec. 13, 2010; 216 pages. |
Arun Vydhyanathan, et al.; The Next Generation Xsens Motion Trackers for Industrial Applications ("Xsens"); 2015 Whitepaper published by Xsens regarding Motion Trackers (Version 2.0.1); 2015. |
SiRF Technology Inc.; SiRF Demo UserGuide (“SiRF”); Mar. 2016. |
Stefan Schirra; How Reliable are Practical Point-in-Polygon Strategies? ("Schirra"); 16th European Symposium on Algorithms (ESA); 2008. |
LMU User's Guide (“LMU”); CalAmp DataCom Inc.; Dec. 2009. |
Salvatore John Giunta; Garmin DC50 Dog Collar (“Garmin DC50”); 2013. |
The Whistle GPS Pet Tracker & Activity Monitor (“Whistle”); Whistle Labs Inc.; 2015. |
Tractive GPS Pet Tracker (“Tractive”); Tractive GmbH; 2013. |
Tagg—The Pet Tracker (“Tagg”); Snaptracs Inc./Qualcomm Incorporated; 2011. |
NoFence; Nofence AS; 2016. |
Directional Virtual Fencing (DVF) Devices (“DVF”); United States Department of Agriculture Research / Massachusetts Institute of Technology; 2007. |
Invisible Fence Brand GPS 2.0 (“Invisible Fence”); Invisible Fence, Inc.; 2015. |
PeTrak Electric Fence (“PeTrak”); PeTrak, LLC; 2012. |
Wolf-Tek Pet Collar ("Wolf-Tek"); Wolf-Tek, LLC; 2015. |
Mastrack Tracking System (“MasTrack”); MasTrack, LLC; 2015. |
GPS Trackit (“GPS Trackit”); GPS Trackit, LLC; 2013. |
Life 360 Family Safe Assist and Driver Protect (“Life360”); Life360 Inc.; May 25, 2016. |
Geozilla Family Locator (“GeoZilla”); GeoZilla, Inc.; Apr. 2, 2016. |
Fleetsat GPS Tracking Solutions (“Fleetsat”); Fleetsat Inc.; Jan. 3, 2015. |
Trimble Aardvark DR + GPS (“Aardvark”); Trimble Navigation Limited; Jun. 29, 2012. |
Round Solutions Nano Tracker Tracking Device (“NanoTracker”); Round Solutions GMBH & Co Kg; Apr. 18, 2016. |
Zack Butler, Peter Corke, Ron Peterson, Daniela Rus; Virtual Fences for Controlling Cows; Dartmouth College Department of Computer Science; CSIRO Manufacturing & Infrastructure Technology Brisbane; MIT Computer Science and Artificial Intelligence Laboratory; New Orleans, LA; Apr. 2004; 9 pages. |
Yilmaz Kemal Yüce, et al.; An alternative approach to overcome ethical issues of geotracking patients with Alzheimer's disease; 7th International Symposium on Health Informatics and Bioinformatics; 2012. |
Bogdan Târnaucǎ, et al.; Using Complex Event Processing for implementing a geofencing service; IEEE 11th International Symposium on Intelligent Systems and Informatics (SISY); Nov. 14, 2013. |
Berbakov et al.; Smart-Phone Application for Autonomous Indoor Positioning; Proceedings of the IEEE International Instrumentation and Measurement Technology Conference; May 14, 2015; pp. 670-674. |
How to check if a given point lies inside or outside a polygon?—Geeks for Geeks, Jul. 11, 2013, https://web.archive.org/web/20130715200034/http://www.geeksforgeeks.org/how-to-check-if-a-given-point-lies-inside-a-polygon ("Geeks"). |
Vivek Shah, When is a Point Inside a Polygon?, May 17, 2013, https://web.archive.org/web/20130517010213/http://cgatglance.blogspot.com/ (“Shah”). |
Collision Course II: Ray-Polygon-Intersection, Apr. 28, 2016, https://web.archive.org/web/20161021075420/http://phys.ik.cx/ (“Claus”). |
A. Bahga and V. Madisetti, “Cloud-Based Information Technology Framework for Data Driven Intelligent Transportation Systems,” Journal of Transportation Technologies, vol. 3 No. 2, 2013, pp. 131-141. doi: 10.4236/jtts.2013.32013. |
Prasad, S., Weeks, M., Zhang, Y., Zelikovsky, A., Belkasim, S., Sunderraman, R., & Madisetti, V. (2002). Mobile Fleet Application using Soap and System on Devices (SYD) Middleware Technologies. Communications, Internet, and Information Technology. |
Madisetti, Vijay, et al. (2004). SyD: A Middleware Testbed for Collaborative Applications over Small Heterogeneous Devices and Data Stores. 3231. 352-371. 10.1007/978-3-540-30229-2_19. |
Number | Date | Country | |
---|---|---|---|
63082739 | Sep 2020 | US |