The present disclosure relates generally to automated access points, such as security gates, and, more particularly, to an in-vehicle intelligent access control system and method configured to automate access of a vehicle to an access point.
Access to facilities can be controlled using manual intervention. For example, a security gate controlling access can be opened or closed to a vehicle based on manual inspection of an identification card, such as a driver's license of the driver of the vehicle. Such manual inspection can be performed by a security guard at the gate who views the identification card and views the driver for subjective matching of the identification card and the driver. Further identification of the vehicle and the contents of the vehicle is often performed by subjective visual matching by a security guard at the gate.
With newly introduced autonomous driving systems, access of a vehicle to an access point can be automated. For example, a driver of a vehicle can insert or swipe an identification card in or near a card reader, such as a card having a magnetic stripe or a radio frequency identification (RFID) chip or device. In response to the card reader positively determining that the identification card corresponds to a grant of access, a controller of the security gate can respond to the positive determination to automatically grant access to the vehicle. However, such insertion or swiping of the identification card still requires manual activity of the driver. Accordingly, such gate control is only semi-automated.
According to an embodiment consistent with the present disclosure, an in-vehicle intelligent access control system and method are configured to automatically control access of a vehicle to an access point. Prior to approach of the vehicle to the access point, an enrollment system stores, in a database, first data identifying a non-human object in a vehicle, second data identifying the vehicle, third data identifying a human occupant of the vehicle, and fourth data identifying an itinerary of the vehicle. An access control management system receives at least the first data from the database. When the vehicle approaches the access point, the system detects the vehicle within a predetermined range of an access point, receives fifth data transmitted by the vehicle, determines a match of the fifth data with at least the first data within a prescribed tolerance, and controls the access point to allow access of the vehicle to a facility if the fifth data matches at least the first data.
In an embodiment, a system comprises an enrollment system and an access control management system. The enrollment system has a first processor configured by code executing therein to store, in a database, first data identifying a non-human object in a vehicle, second data identifying the vehicle, third data identifying a human occupant of the vehicle, and fourth data identifying an itinerary of the vehicle. The access control management system has a second processor configured by code executing therein to receive at least the first data from the database, to detect the vehicle entering within a first predetermined range of an access point, to receive fifth data transmitted by the vehicle, to determine a match of the fifth data with at least the first data within a prescribed tolerance, and to automate the access point to allow access of the vehicle to a facility if the fifth data matches at least the first data.
The access control management system determines the match when the vehicle is within a second predetermined distance of the access point, with the second predetermined distance being less than the first predetermined distance. The access control management system is operatively connected to the database by a network, and the access control management system receives at least the first data from the database through the network. The vehicle includes a sensor configured to detect an identification corresponding to at least one of the first data, second data, third data, and fourth data. The sensor detects the identification selected from the group consisting of: identification of the non-human object, identification of the vehicle, identification of the human occupant, and identification of a current time and a current location of the vehicle. The sensor is selected from the group consisting of: a bar code reader, a thermal sensor, an ultrasonic sensor, an optical sensor, a card reader, a biometric scanner, an electronic scanner configured to communicate with a smartphone, a clock, a global positioning device, an accelerometer, a gyroscope, and a magnetometer. The sensor is disposed on the vehicle. Alternatively, the sensor is disposed within the interior of the vehicle.
In another embodiment, a system is configured to automate access of a vehicle to a facility. The system includes a network, a database, an enrollment system, and an access control management system. The enrollment system is operatively connected to the database through the network, wherein the enrollment system has a first processor configured by code executing therein to store, in the database, first data identifying a non-human object in the vehicle, second data identifying the vehicle, third data identifying a human occupant of the vehicle, and fourth data identifying an itinerary of the vehicle. The access control management system is operatively connected to the database through the network, wherein the access control management system has a second processor configured by code executing therein to receive at least the first data from the database, to detect the vehicle entering within a first predetermined range of an access point, to receive fifth data transmitted by the vehicle, to determine a match of the fifth data with at least the first data within a prescribed tolerance, and to automate the access point to allow access of the vehicle to the facility if the fifth data matches at least the first data.
An output device is configured to output an error message if the fifth data does not match at least the first data. The access control management system receives at least the first data from the database through the network when the vehicle is within a first predetermined distance from the access point. The vehicle includes a sensor configured to detect an identification corresponding to at least one of the first data, second data, third data, and fourth data. The sensor detects the identification selected from the group consisting of: identification of the non-human object, identification of the vehicle, identification of the human occupant, and identification of a current time and a current location of the vehicle. The sensor is selected from the group consisting of: a bar code reader, a thermal sensor, an ultrasonic sensor, an optical sensor, a card reader, a biometric scanner, an electronic scanner configured to communicate with a smartphone, a clock, a global positioning device, an accelerometer, a gyroscope, and a magnetometer. The sensor is disposed on the vehicle. Alternatively, the sensor is disposed within the interior of the vehicle. The access control management system determines the match when the vehicle is within a second predetermined distance of the access point, with the second predetermined distance being less than the first predetermined distance.
In a further embodiment, a method comprises storing, in a database, first data identifying a non-human object in a vehicle, second data identifying the vehicle, third data identifying a human occupant of the vehicle, and fourth data identifying an itinerary of the vehicle; detecting the vehicle entering within a first predetermined range of an access point; receiving fifth data transmitted by the vehicle; determining whether a match of the fifth data with at least the first data within a prescribed tolerance exists; and controlling the access point to allow access of the vehicle to a facility if the fifth data matches at least the first data.
The method further includes outputting an error message from an output device if the fifth data does not match at least the first data. The vehicle includes a sensor configured to detect an identification corresponding to at least one of the first data, second data, third data, and fourth data, wherein the identification is selected from the group consisting of: identification of the non-human object, identification of the vehicle, identification of the human occupant, and identification of a current time and a current location of the vehicle.
Any combinations of the various embodiments and implementations disclosed herein can be used in a further embodiment, consistent with the disclosure. These and other aspects and features can be appreciated from the following description of certain embodiments presented herein in accordance with the disclosure and the accompanying drawings and claims.
It is noted that the drawings are illustrative and are not necessarily to scale.
Example embodiments consistent with the teachings included in the present disclosure are directed to an in-vehicle intelligent access control system and method configured to automate access of a vehicle to an access point.
As shown in
As described in greater detail below, an in-vehicle intelligent access control device 26 is disposed within the vehicle 12. The in-vehicle intelligent access control device 26 has a processor 46 configured by code, provided from a memory 48 and executing in the processor, to receive data from at least one sensor in real time. The received data includes first data identifying a non-human object in the vehicle 12, second data identifying the vehicle 12, third data identifying a human occupant of the vehicle 12, and fourth data identifying an itinerary of the vehicle 12. The non-human objects identified by the first data can be items contained in the vehicle 12. The non-human objects can also be items loaded upon the vehicle 12. The non-human objects can also be goods. In addition, the non-human objects can be living animals. Alternatively, the non-human objects can be living plants.
The identification of the vehicle 12 by the second data can be a vehicle identification number (VIN). The identification of the vehicle 12 can also include a vehicle plate number. The identification of the vehicle 12 can also include a city, state, province, or country identifier, such as “SA” for Saudi Arabia. The identification of the vehicle 12 can comply with the ISO 3166 standard, which defines codes for countries (ISO 3166-1) and their subdivisions (ISO 3166-2). The identification of the vehicle 12 can also include the make of the vehicle 12. The identification of the vehicle 12 can also include the model of the vehicle 12. The identification of the vehicle 12 can also include the year of manufacture associated with the vehicle 12. The identification of the vehicle 12 can also include a bar code placed on the vehicle 12. The bar code can be uniquely associated with the vehicle 12. Each item of data comprising the identification of the vehicle 12 is storable in the database 22 or another storage device which can communicate with the processor of the in-vehicle intelligent access control device 26.
The human occupant of the vehicle 12 identified by the third data can be a driver of the vehicle 12. The human occupant of the vehicle 12 can also be a passenger of the vehicle 12. The human occupant can be associated with a name of the occupant. The name of the occupant can be based on an identification document of the occupant, such as a driver's license. The identification document can also be a passport of the occupant. The third data can also be a value representing the count of the number of occupants of the vehicle 12. The human occupant can also be associated with biometric data of the occupant. The biometric data can be a fingerprint of the occupant. The biometric data can also be an image of the face of the occupant. The biometric data can also be any identifier uniquely associated with the occupant, such as a retinal scan. The biometric data can be verified using fingerprint analysis, facial recognition, and other biometric verification methods, as described below, using conventional devices that obtain such data from the occupant and provide inputs to the in-vehicle intelligent access control device 26.
The itinerary of the vehicle 12 identified by the fourth data can include a location of the vehicle 12. The location of the vehicle 12 can include Global Positioning System (GPS) coordinates of the destination of the vehicle 12, such as the geographic location of the access point 14. The itinerary of the vehicle 12 can also include a time associated with the vehicle 12, such as a projected arrival time of the vehicle 12 in a specified time zone. The itinerary of the vehicle 12 can also include a date associated with the vehicle 12, such as a projected arrival date of the vehicle 12 in a specified time zone. The itinerary of the vehicle 12 can also include combinations of the location, time, and date associated with geographic travel of the vehicle 12. The fourth data is storable in the database 22 or another storage device which can communicate with the processor of the in-vehicle intelligent access control device 26.
Referring again to
In an alternative embodiment, the enrollment system 24 receives as inputs any one of the first, second, third, and fourth data from a third party, such as a vehicle loading service. The vehicle loading service can load the vehicle with objects and occupants. The vehicle loading service can also associate a value representing the count of the number of objects. The vehicle loading service can also associate a value representing the count of the number of occupants. The vehicle loading service can also associate the vehicle and its identification information with the objects and occupants. The vehicle loading service can also associate the vehicle, objects, and occupants with an itinerary of the vehicle. Each of the associations can be stored in the database 22 or in another data storage device.
The third party can also be a vehicle dispatching service. The vehicle dispatching service can check the vehicle and its objects and occupants at any point on the itinerary of the vehicle, using signals provided by the sensors and readers included in the device 26, as described further below. The vehicle dispatching service can also associate the vehicle and its identification information with the objects and occupants. The vehicle dispatching service can also associate a value representing the count of the number of objects. The vehicle dispatching service can also associate a value representing the count of the number of occupants. The vehicle dispatching service can also associate the vehicle, objects, and occupants with the itinerary of the vehicle. Each of the associations can be stored in the database 22 or in another data storage device.
The enrollment system 24 has a processor configured by code executing therein to store, in the database 22, first data identifying a non-human object in the vehicle 12, second data identifying the vehicle 12, third data identifying a human occupant of the vehicle 12, and fourth data identifying an itinerary of the vehicle 12. The database 22 can be a structured query language (SQL) database configured to store data in relational tables. Alternatively, the database 22 can store data in any known format. For example, the first data identifying a non-human object in the vehicle 12 can list the non-human objects as text entries. In another example, the second data identifying the vehicle 12 can list the vehicle identifiers as a record or a collection of text entries, such as the make, model, and year of manufacture associated with the vehicle 12. In a further example, the third data identifying a human occupant of the vehicle 12 can include a set of data points associated with fingerprint analysis, facial recognition, retinal analysis, or other biometric data formats. In another example, the fourth data identifying an itinerary of the vehicle 12 can list the itinerary as a record or a collection of text entries, such as the time, date, and location of the vehicle 12 as the vehicle travels geographically.
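By way of a non-limiting illustration only, the sketch below shows one possible relational layout for the first through fourth data described above. The table name, column names, and the use of SQLite are assumptions made for the example and are not required by the present disclosure; biometric templates or images could equally be held in separate tables or in another storage device.

```python
# Minimal sketch, for illustration only: one possible relational layout for the
# enrollment data (first through fourth data). All table and column names are
# hypothetical; an actual deployment could use any schema or storage format.
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for database 22
conn.executescript("""
CREATE TABLE enrollments (
    enrollment_id    INTEGER PRIMARY KEY,
    -- first data: non-human objects, stored here as a count plus text entries
    object_count     INTEGER,
    object_labels    TEXT,          -- e.g. "3 containers; 2 crates of goods"
    -- second data: vehicle identification
    vin              TEXT,
    plate_number     TEXT,
    country_code     TEXT,          -- e.g. "SA" per ISO 3166-1
    make             TEXT,
    model            TEXT,
    model_year       INTEGER,
    -- third data: human occupants
    occupant_count   INTEGER,
    occupant_names   TEXT,
    fingerprint_pts  BLOB,          -- serialized biometric feature points
    -- fourth data: itinerary (projected arrival at the access point)
    arrival_time_utc TEXT,          -- ISO 8601 timestamp
    dest_latitude    REAL,
    dest_longitude   REAL
);
""")

conn.execute(
    "INSERT INTO enrollments (object_count, vin, plate_number, country_code, "
    "occupant_count, arrival_time_utc, dest_latitude, dest_longitude) "
    "VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    (3, "1HGBH41JXMN109186", "ABC-1234", "SA", 2,
     "2022-07-28T14:00:00Z", 24.7136, 46.6753),
)
conn.commit()
```

An enrollment row such as the one inserted above corresponds to the previously stored data that the access control management system 18 later retrieves for comparison against the fifth data.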
The access control management system 18 has a processor configured by code executing therein to receive at least the first data from the database 22, to detect the vehicle 12 entering within a first predetermined range of the access point 14, to receive fifth data transmitted by the vehicle 12, to determine a match of the fifth data with at least the first data within a prescribed tolerance, and to automate the access point 14 to allow access of the vehicle 12 to a facility if the fifth data matches at least the first data. The prescribed tolerance for determining a match can include a criterion depending upon the type of data being compared. For example, the first data identifying a non-human object in the vehicle 12 can include a number of containers storing goods, a number of living animals, or a number of living plants without necessarily specifying the contents of the containers, or the types of living animals or plants. If the container, animal, or plant numbers in the first data are, as one example of an acceptable tolerance and not by way of limitation, within plus one or minus one of the corresponding numbers in the fifth data, then a match can be established.
In another example, the second data identifying the vehicle 12 can include the year of manufacture of the vehicle. If the year of manufacture in the second data, for example, matches the corresponding year in the fifth data exactly, that is, with no tolerance permitted for that data point, then a match can be established. In a further example, the third data identifying a human occupant of the vehicle 12 can include twenty fingerprint identification points, such as ridges and whorls. If, for example, at least sixteen of the fingerprint identification points in the third data match corresponding points in the fifth data, then a match can be established. In another example, the third data identifying a human occupant of the vehicle 12 can include a thermal scan from which a count of human shapes is automatically performed by thermal image recognition techniques. If the count of human shapes is within zero or minus one of the count in the fifth data, then a match can be established. In a still further example, the fourth data identifying an itinerary of the vehicle 12 can include a time, date, and location of the vehicle 12. If the numbers in the fourth data are, for example, within plus one hour or minus one hour of the time in the fifth data, or within plus one day or minus one day of the date in the fifth data, or within ten miles of the location in the fifth data, then a match can be established.
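By way of a non-limiting sketch of how the example tolerances above could be evaluated, the Python functions below compare enrolled values against the corresponding values reported by the vehicle. The function names, argument names, and threshold defaults are illustrative assumptions; an implementation could assign a different criterion to each type of data.

```python
# Minimal sketch, for illustration only: evaluating the example tolerances
# described above. Field names and thresholds are hypothetical assumptions.
from datetime import datetime


def counts_match(enrolled: int, reported: int, tol: int = 1) -> bool:
    """Container/animal/plant counts: match within plus or minus `tol`."""
    return abs(enrolled - reported) <= tol


def year_matches(enrolled_year: int, reported_year: int) -> bool:
    """Year of manufacture: exact match, no tolerance for this data point."""
    return enrolled_year == reported_year


def fingerprints_match(enrolled_pts: set, reported_pts: set,
                       required: int = 16) -> bool:
    """Fingerprint identification points: at least `required` of the enrolled
    points (e.g. sixteen of twenty) must also appear in the reported set."""
    return len(enrolled_pts & reported_pts) >= required


def occupant_count_matches(thermal_count: int, reported: int) -> bool:
    """Thermal-scan count of human shapes: equal to, or one less than, the
    reported occupant count (within zero or minus one)."""
    return 0 <= reported - thermal_count <= 1


def itinerary_matches(enrolled_arrival: datetime, actual_arrival: datetime,
                      distance_miles: float) -> bool:
    """Itinerary: within one hour of the projected time, one day of the
    projected date, and ten miles of the projected location."""
    delta = actual_arrival - enrolled_arrival
    within_hour = abs(delta.total_seconds()) <= 3600
    within_day = abs((actual_arrival.date() - enrolled_arrival.date()).days) <= 1
    return within_hour and within_day and distance_miles <= 10.0
```

For example, counts_match(3, 4) returns True under the plus-one or minus-one tolerance, while year_matches(2019, 2020) returns False because no tolerance is permitted for the year of manufacture.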
The access control management system 18 determines the match, within the prescribed tolerance, when the vehicle 12 is within a second predetermined distance of the access point 14, with the second predetermined distance being less than the first predetermined distance. The access control management system 18 receives at least the first data from the database 22 through the network 20. As shown in
The vehicle 12 includes at least one sensor configured to detect an identification corresponding to at least one of the first data, second data, third data, and fourth data. As shown in
Referring to
As described above, the at least one sensor includes a card reader 28 having a slot configured to receive an identification card associated with the vehicle 12. For example, the identification card can be a vehicle registration card. Alternatively, the identification card can be associated with an occupant of the vehicle 12, such as a driver's license or a passport. The at least one sensor can also include an antenna 30 operatively connected to at least one of the sensors 38, 40, 42. The sensor 38 can be an ultrasonic sensor configured to emit ultrasonic waves in the interior of the vehicle 12. Reflections of the ultrasonic waves can be detected by the sensor 38 and used by the in-vehicle intelligent access control device 26 to count the number of non-human objects in the vehicle 12. Alternatively, reflections of the ultrasonic waves can be detected by the sensor 38 and used by the in-vehicle intelligent access control device 26 to count the number of occupants in the vehicle 12. The sensor 40 can be a thermal sensor configured to detect heat in the interior of the vehicle 12. The heat detected by the sensor 40 can be used by the in-vehicle intelligent access control device 26 to count the number of occupants in the vehicle 12. The sensor 42 can be any other type of sensor configured to detect the occupants of the vehicle 12. For example, the sensor 42 can be an optical sensor such as a camera configured to generate an image of the occupants. Based on the image, the in-vehicle intelligent access control device 26 can recognize the shapes of people to count the number of occupants in the vehicle 12. Alternatively, the in-vehicle intelligent access control device 26 can perform facial recognition of the occupants based on the image. Using such facial recognition, the in-vehicle intelligent access control device 26 can perform biometric verification of the occupants, including the driver of the vehicle 12.
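The present disclosure does not prescribe a particular counting algorithm for the thermal sensor 40. Purely as an assumed sketch, a thermal image could be thresholded and its warm connected regions counted as candidate occupants by the device 26; the temperature threshold, minimum region size, and use of the NumPy and SciPy libraries below are illustrative assumptions.

```python
# Minimal sketch, for illustration only: counting occupants from a thermal
# image by thresholding warm pixels and labeling connected regions. The
# threshold, minimum region size, and libraries used are assumptions, not
# requirements of the disclosure.
import numpy as np
from scipy import ndimage


def count_warm_regions(thermal_frame: np.ndarray,
                       temp_threshold_c: float = 30.0,
                       min_pixels: int = 50) -> int:
    """Return the number of connected warm regions large enough to be a person."""
    warm_mask = thermal_frame > temp_threshold_c        # boolean mask of warm pixels
    labeled, num_regions = ndimage.label(warm_mask)     # connected components
    sizes = ndimage.sum(warm_mask, labeled, range(1, num_regions + 1))
    return int(np.count_nonzero(np.asarray(sizes) >= min_pixels))


# Example: a synthetic frame with ambient ~22 C and two warm blobs ~34 C.
frame = np.full((60, 80), 22.0)
frame[10:25, 10:22] = 34.0
frame[30:48, 50:64] = 34.0
print(count_warm_regions(frame))  # -> 2
```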
The in-vehicle intelligent access control device 26 can also include an antenna 32 connected to a geographic locating system configured to perform geographic location of the vehicle 12 using the Global Positioning System (GPS) or any other known geographic locating system, such as GLONASS, BEIDOU, and GALILEO. The in-vehicle intelligent access control device 26 can also include a biometric sensor 34, such as a fingerprint scanner, configured to detect a unique physical property of an occupant of the vehicle 12. For example, each occupant of the vehicle 12 can insert a finger into the biometric sensor 34 to be scanned and processed by the in-vehicle intelligent access control device 26 to identify each respective occupant.
As shown in
Other sensors can be included in the in-vehicle intelligent access control device 26. Alternatively, the other sensors can be in communication with the in-vehicle intelligent access control device 26, such as by a wired or wireless connection. The other sensors can include a bar code reader configured to read a bar code. The bar code can be on a label placed on an object in or on the vehicle 12, with the bar code identifying the object, such as its nature, contents, point of origin, etc. Alternatively, the bar code can be on a label placed on or in the vehicle 12, with the bar code identifying the vehicle 12, such as the make, model, license plate, year of manufacture, etc. of the vehicle 12. In addition, the bar code can be on a card associated with an occupant of the vehicle 12, such as a card in the form of a driver's license, a passport, an identification (ID) card, etc.
Another sensor can be an electronic scanner configured to communicate with a smartphone using a known communication protocol, such as BLUETOOTH. The smartphone can be associated with an occupant of the vehicle 12. The smartphone can include a display for displaying a bar code such as a quick response (QR) code, with the bar code identifying the occupant of the vehicle 12. Alternatively, the smartphone can emit an identification signal to identify the occupant associated with the smartphone. Accordingly, the electronic scanner can acquire identification information from the smartphone to identify an occupant of the vehicle 12.
In addition, another sensor can include a clock configured to track the time, date, and time zone associated with the vehicle 12, in order to identify the temporal location of the vehicle 12 in relation to a previously stored itinerary. Another set of sensors can include one or more of an accelerometer, a gyroscope, and a magnetometer configured to track the speed, direction, and orientation of the vehicle 12, in order to identify the spatial or geographic location of the vehicle 12 in relation to the previously stored itinerary.
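As a hedged, non-limiting sketch of how readings from the clock and the geographic locating system could be compared against the previously stored itinerary, the example below computes the great-circle distance to the enrolled destination and the offset from the projected arrival time. The haversine approximation and all names used here are assumptions for illustration only.

```python
# Minimal sketch, for illustration only: comparing the current clock and GPS
# readings against a previously stored itinerary waypoint. The haversine
# great-circle approximation and all names here are illustrative assumptions.
from datetime import datetime, timezone
from math import asin, cos, radians, sin, sqrt

EARTH_RADIUS_MILES = 3958.8


def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points, in miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))


def itinerary_deviation(now, here_lat, here_lon,
                        planned_time, dest_lat, dest_lon):
    """Return (hours off schedule, miles from the enrolled destination)."""
    hours_off = abs((now - planned_time).total_seconds()) / 3600.0
    miles_off = haversine_miles(here_lat, here_lon, dest_lat, dest_lon)
    return hours_off, miles_off


# Example: vehicle near the destination, 20 minutes after the projected arrival.
hours_off, miles_off = itinerary_deviation(
    datetime(2022, 7, 28, 14, 20, tzinfo=timezone.utc), 24.70, 46.68,
    datetime(2022, 7, 28, 14, 0, tzinfo=timezone.utc), 24.7136, 46.6753,
)
print(round(hours_off, 2), round(miles_off, 1))  # within the 1-hour / 10-mile tolerance
```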
Referring to
Referring to
The access control management system 18 can determine if the vehicle 12 is at a first predetermined distance from the access point 14, such as the security gate 16, as shown in
In step 114, if the actual count of the occupants does not match the count of the number of occupants listed by the enrollment system 24 in the database 22, the method 100 can try a predetermined number of times in step 116 to activate the thermal sensor 40 to count the occupants of the vehicle 12. The predetermined number of times in step 116 can be three times. The predetermined number of times can default to three times. The predetermined number of times can be set by a system administrator of the access control management system 18. In step 114, if the actual count of the occupants does not match the count of the number of occupants listed by the enrollment system 24 in the database 22 after the predetermined number of times, the method 100 proceeds to step 118 to output an error message. The outputted error message can be displayed on the display 52 to the system operator of the access control management system 18. Alternatively, the outputted error message can be sent to the vehicle 12 using the WIFI signals 60. The outputted error message can then be conveyed to at least one occupant of the vehicle 12. The at least one occupant can be the driver of the vehicle 12. The outputted error message can be an audio message generated through a sound speaker of the vehicle 12 to be heard by the at least one occupant. The audio message can indicate that the count of occupants does not match the count in the database 22 previously stored by the enrollment system 24. The audio message can also indicate that access of the vehicle 12 is denied due to the mismatch of the number of occupants in the vehicle 12 compared to the enrolled or pre-registered number previously stored in the database 22. The outputted error message can be a visual message on a display in the vehicle 12.
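A non-limiting sketch of the retry behavior of steps 114 through 118 follows; the default of three attempts follows the description above, while the callable interfaces are hypothetical stand-ins. The exact-equality comparison could equally be replaced by the prescribed-tolerance check sketched earlier.

```python
# Minimal sketch, for illustration only: retrying the occupant count a
# configurable number of times (default three, per the description above)
# before reporting an error. The callable passed in is a hypothetical stand-in
# for reading the thermal sensor.
from typing import Callable


def verify_occupant_count(read_thermal_count: Callable[[], int],
                          enrolled_count: int,
                          max_attempts: int = 3) -> bool:
    """Return True if any of up to `max_attempts` scans matches the enrolled count."""
    for _ in range(max_attempts):
        if read_thermal_count() == enrolled_count:
            return True          # proceed to the in-vehicle object inspection
    return False                 # caller outputs an error message (step 118)


# Example with a fake sensor whose reading stabilizes on the second scan.
readings = iter([1, 2, 2])
print(verify_occupant_count(lambda: next(readings), enrolled_count=2))  # -> True
```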
However, in step 114, if there is a match of the count of the number of occupants in the vehicle 12, within the prescribed tolerance described above, then the method 100 proceeds to step 120. In step 120, an in-vehicle object inspection is performed. For example, an occupant of the vehicle 12 can use a bar code scanner to scan bar codes on non-human objects in the vehicle 12. The bar codes can be on a container of goods or living plants. The bar codes can also be on a container such as a cage for living animals.
In step 122, if the actual count of the non-human objects does not match the count of the number of non-human objects listed by the enrollment system 24 in the database 22, the method 100 proceeds to step 118 to output an error message. The outputted error message can be displayed on the display 52 to the system operator of the access control management system 18. Alternatively, the outputted error message can be sent to the vehicle 12 using the WIFI signals 60. The outputted error message can then be conveyed to at least one occupant of the vehicle 12. The at least one occupant can be the driver of the vehicle 12. The outputted error message can be an audio message generated through a sound speaker of the vehicle 12 to be heard by the at least one occupant. The audio message can indicate that the count of non-human objects does not match the count in the database 22 previously stored by the enrollment system 24. The audio message can also indicate that access of the vehicle 12 is denied due to the mismatch of the number of non-human objects in the vehicle 12 compared to the enrolled or pre-registered number previously stored in the database 22. The outputted error message can be a visual message on a display in the vehicle 12.
However, in step 122, if there is a match of the count of the number of non-human objects in the vehicle 12, within the prescribed tolerance described above, then the method 100 proceeds to step 124. In step 124, the method 100 checks whether the distance D of the vehicle 12 from the access point 14 is at a second predetermined distance less than the first predetermined distance. For example, the second predetermined distance can be 20 m. If the distance D has not yet decreased to the second predetermined distance, then the method 100 has the access control management system 18, in step 126, send a hold instruction to the controller 50 to continue to deny the vehicle 12 access to the access point 14. Otherwise, once the vehicle 12 is at the second predetermined distance in step 124, the access control management system 18 instructs the structure 58 to activate and transmit the WIFI signals 60 in step 128. The WIFI signals can comply with a wireless communication standard such as WIFI Direct or IEEE 802.11p.
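One possible, non-limiting sketch of the proximity gating of steps 124 through 128 is given below. The 20 m threshold follows the example above; the controller and transmitter objects are hypothetical stand-ins for the controller 50 and the structure 58.

```python
# Minimal sketch, for illustration only: gating the wireless handshake on the
# vehicle's distance D from the access point (steps 124-128). The 20 m value
# follows the example above; the controller/transmitter objects are hypothetical.
SECOND_PREDETERMINED_DISTANCE_M = 20.0


def on_distance_update(distance_m: float, controller, transmitter) -> str:
    """Hold the gate until the vehicle closes to the second predetermined
    distance, then activate the WIFI transmitter to begin the data exchange."""
    if distance_m > SECOND_PREDETERMINED_DISTANCE_M:
        controller.hold()              # step 126: keep denying access for now
        return "hold"
    transmitter.activate()             # step 128: begin transmitting signals 60
    return "connect"


class _Stub:                           # stand-in objects for the example below
    def hold(self): print("gate held closed")
    def activate(self): print("WIFI transmitter activated")


print(on_distance_update(35.0, _Stub(), _Stub()))  # -> hold
print(on_distance_update(18.5, _Stub(), _Stub()))  # -> connect
```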
The access control management system 18 starts a connection with the vehicle 12 in step 130 using the transmitted signals 60, 62. In particular, a connection for data transfer is established between the in-vehicle intelligent access control device 26 and the access control management system 18. After the connection is established in step 130, the access control management system 18 reads the fifth data gathered and transmitted from the in-vehicle intelligent access control device 26 in step 132. The fifth data can be gathered in response to the establishment of the connection in step 130. Alternatively, the fifth data can be gathered by the in-vehicle intelligent access control device 26 and stored in a memory of the in-vehicle intelligent access control device 26 for later retrieval and transmission from the in-vehicle intelligent access control device 26 to the access control management system 18. The access control management system 18 then determines if the fifth data matches at least the first data within a prescribed tolerance in step 134, as described above. Alternatively or in addition, the management system 18 determines if the fifth data matches the first data, the second data, the third data, and the fourth data, each within a prescribed tolerance, as described above.
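The present disclosure does not specify a wire format for the fifth data. Purely as an assumed example, the payload read in step 132 could be a JSON document that the access control management system 18 decodes before performing the comparisons of step 134; the field names below are hypothetical.

```python
# Minimal sketch, for illustration only: decoding a fifth-data payload received
# from the in-vehicle device in step 132. JSON and these field names are
# assumptions; the disclosure does not prescribe a particular wire format.
import json

raw_payload = b'''{
    "object_count": 3,
    "vin": "1HGBH41JXMN109186",
    "plate_number": "ABC-1234",
    "occupant_count": 2,
    "arrival_time_utc": "2022-07-28T14:20:00Z",
    "latitude": 24.70,
    "longitude": 46.68
}'''

fifth_data = json.loads(raw_payload)
print(fifth_data["occupant_count"])   # value compared against the enrolled third data
```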
In step 134, if any of the fifth data does not match, within a prescribed tolerance described above, the first data, second data, third data, and fourth data listed by the enrollment system 24 in the database 22, the method 100 proceeds to step 118 to output an error message. The outputted error message can be displayed on the display 52 to the system operator of the access control management system 18. Alternatively, the outputted error message can be sent to the vehicle 12 using the WIFI signals 60. The outputted error message can then be conveyed to at least one occupant of the vehicle 12. The at least one occupant can be the driver of the vehicle 12. The outputted error message can be an audio message generated through a sound speaker of the vehicle 12 to be heard by the at least one occupant. The audio message can indicate that the situation in the vehicle 12, corresponding to the non-human objects in the vehicle 12, the identification of the vehicle 12, the human occupants of the vehicle 12, and the itinerary of the vehicle 12, does not match the corresponding data, respectively, in the database 22 previously stored by the enrollment system 24. The audio message can also indicate that access of the vehicle 12 is denied due to the mismatch of the situation in the vehicle 12 compared to the enrolled or pre-registered data previously stored in the database 22. The outputted error message can be a visual message on a display in the vehicle 12.
However, in step 134, if there is a match of the fifth data with all of the first data, second data, third data, and fourth data, within the respective prescribed tolerances of each type of data, as described above, then access is granted automatically in real-time to the vehicle 12 in step 136. For example, access is granted if the actual number of non-human objects in the vehicle 12 matches the number of non-human objects previously stored in the database 22, if the actual identity of the vehicle 12 matches the identification of the vehicle 12 previously stored in the database 22, if the actual number of occupants in the vehicle 12 matches the number of occupants previously stored in the database 22, and if the actual time and date of arrival of the vehicle 12 at the access point 14 matches the time and date of arrival according to an itinerary previously stored in the database 22.
Such granted access can include opening or raising the security gate 16 at the access point 14. The granted access can also include marking the vehicle 12 to travel into the facility beyond the access point 14. The marking of the vehicle 12 can be electronic such that the vehicle 12 is recognized as having access throughout the facility. Alternatively, the marking of the vehicle 12 can include affixing a bar code to the vehicle 12 indicating that the vehicle 12 is recognized as having access throughout the facility.
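A compact, non-limiting sketch of the final decision of steps 134 and 136 is shown below, under the assumption that each per-category comparison yields a Boolean result as in the earlier sketches; the grant and deny outcomes stand in for commanding the controller 50 and marking the vehicle 12.

```python
# Minimal sketch, for illustration only: the final decision of steps 134-136.
# Each *_ok value is assumed to come from per-category comparisons such as
# those sketched earlier; the gate and marking actions are hypothetical.
def decide_access(objects_ok: bool, vehicle_ok: bool,
                  occupants_ok: bool, itinerary_ok: bool) -> str:
    """Grant access only when every category matches within its tolerance."""
    if objects_ok and vehicle_ok and occupants_ok and itinerary_ok:
        # step 136: open or raise the security gate and mark the vehicle as
        # having access throughout the facility (electronically or via bar code)
        return "grant"
    # step 118: output an error message identifying the mismatched category
    return "deny"


print(decide_access(True, True, True, True))    # -> grant
print(decide_access(True, True, False, True))   # -> deny (occupant mismatch)
```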
Portions of the methods described herein can be performed by software or firmware in machine readable form on a tangible (e.g., non-transitory) storage medium. For example, the software or firmware can be in the form of a computer program including computer program code adapted to cause the in-vehicle intelligent access control system and method to perform various actions described herein when the program is run on a computer or suitable hardware device, and where the computer program can be embodied on a computer readable medium. Examples of tangible storage media include computer storage devices having computer-readable media such as disks, thumb drives, flash memory, and the like, and do not include propagated signals. Propagated signals can be present in a tangible storage medium, but propagated signals per se are not examples of tangible storage media. The software can be suitable for execution on a parallel processor or a serial processor such that various actions described herein can be carried out in any suitable order, or simultaneously.
It is to be further understood that like or similar numerals in the drawings represent like or similar elements through the several figures, and that not all components or steps described and illustrated with reference to the figures are required for all embodiments or arrangements.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “contains”, “containing”, “includes”, “including,” “comprises”, and/or “comprising,” and variations thereof, when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Terms of orientation are used herein merely for purposes of convention and referencing and are not to be construed as limiting. However, it is recognized these terms could be used with reference to an operator or user. Accordingly, no limitations are implied or to be inferred. In addition, the use of ordinal numbers (e.g., first, second, third) is for distinction and not counting. For example, the use of “third” does not imply there is a corresponding “first” or “second.” Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof herein, is meant to encompass the items listed thereafter and equivalents thereof as well as additional items.
While the disclosure has described several exemplary embodiments, it will be understood by those skilled in the art that various changes can be made, and equivalents can be substituted for elements thereof, without departing from the spirit and scope of the invention. In addition, many modifications will be appreciated by those skilled in the art to adapt a particular instrument, situation, or material to embodiments of the disclosure without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed, or to the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims.
The subject matter described above is provided by way of illustration only and should not be construed as limiting. Various modifications and changes can be made to the subject matter described herein without following the example embodiments and applications illustrated and described, and without departing from the true spirit and scope of the invention encompassed by the present disclosure, which is defined by the set of recitations in the following claims and by structures and functions or steps which are equivalent to these recitations.