The present disclosure relates to systems and methods for monitoring body temperature and more particularly to managing one or more thermal scanners including a mobile temperature device, and systems and methods for monitoring body temperature using one or more thermal scanners.
Under a virus pandemic situation, like COVID-19, where there is no vaccine or specific antiviral treatment for the virus, preventive measures such as wearing a face mask in public settings, and monitoring and self-isolation for people who suspect they are infected, are strongly recommended. Ongoing monitoring in a workplace can be performed by developing and implementing procedures to check employees daily upon arrival for signs and symptoms, to encourage anyone who is sick to stay home, and to monitor employee absences. Such ongoing monitoring also can be performed at home. A virus pandemic situation has increased demand for thermal scanners or temperature scanners, which can screen the skin to infer body temperature in a contactless manner. Under this situation, improvements in a system for monitoring body temperature using thermal scanners remain desired.
Implementations of the present disclosure relate to a system and a method for monitoring body temperature and more particularly to one or more thermal scanners including a mobile temperature device, and a system and a method for monitoring body temperature using one or more thermal scanners.
In some implementations of the present disclosure, a method may include monitoring, by at least one processor, a body temperature of a first person using a first device. The method may include determining, by the at least one processor responsive to the monitoring, as a first determination result, whether the monitored body temperature exceeds a predetermined threshold. The method may include performing, by the at least one processor, image processing on an image of the first person. The method may include determining, by the at least one processor based on a result of the image processing, as a second determination result, whether the first person wears a face mask. The method may include controlling, by the at least one processor, a second device based on at least one of the first determination result or the second determination result.
In some implementations of the present disclosure, a system may include at least one processor. The at least one processor may be configured to cause a first device to monitor a body temperature of a first person, determine, responsive to the monitoring, as a first determination result, whether the monitored body temperature exceeds a predetermined threshold, perform image processing on an image of the first person, determine, based on a result of the image processing, as a second determination result, whether the first person wears a face mask, and control a second device based on at least one of the first determination result or the second determination result.
These and other aspects and features of the present implementations will become apparent to those ordinarily skilled in the art upon review of the following description of specific implementations in conjunction with the accompanying figures, wherein:
According to certain aspects, implementations in the present disclosure relate to a system and a method for monitoring body temperature and more particularly to one or more thermal scanners including a mobile temperature device, and a system and a method for monitoring body temperature using one or more thermal scanners.
Under a virus pandemic situation, like COVID-19, where there is no vaccine or specific antiviral treatment for the virus, workplace screening may be required to restrict individuals who are infected with the virus, or at higher risk for serious illness from it, from accessing workplace facilities. Workplace screenings can be implemented by asking a set of questions upon entry, performing temperature checks or visual inspection, checking whether a person wears a face covering or a face mask, or the like. Such screenings may need to be performed in an efficient and contactless manner. Also, such screenings may need to be easily and efficiently integrated into workplace entry management systems (hardware or software, for example) such as systems for personnel management, pass management, or attendance management.
Face recognition techniques can be utilized to identify or classify a person for workplace screenings. However, wearing face masks may make face recognition difficult through conventional facial detection programs.
In using a plurality of thermal scanners for workplace screenings, different thermal scanners may produce different data outputs, and their different applications and hardware may store and transmit data in proprietary databases and different formats. Without an efficient and flexible scheme to access such data, it would be difficult to use a plurality of thermal scanners and manage them using an integrated management application.
To solve the above-noted problems, according to certain aspects, a middleware or monitoring system (software, hardware, etc.) is provided to interface with thermal scanners for workplace temperature monitoring of employees, visitors, or strangers. In some implementations, a temperature monitoring system may provide a dashboard or console interface to display real-time data obtained from one or more thermal scanners, providing information showing statistical data and trend data related to body temperature of people in an organization.
In some implementations, thermal scanners may include infrared thermometers, laser thermometers, non-contact thermometers or temperature guns, infrared pyrometers, thermographic cameras, infrared cameras, thermal imaging cameras, ambient temperature sensors, thermal imagers, a combination thereof, or the like. In some implementations, a mobile/portable device may be coupled with a thermal scanner. In some implementations, a thermal scanner may be wirelessly paired or coupled with a mobile device (e.g., smartphone) or a fixed mount on a door (e.g., a front door of a house), providing information showing statistical data and trend data related to body temperature of a person or family members at home. In some implementations, the thermal scanner may communicate with a mobile device (e.g., smartphone) using Bluetooth or Wi-Fi so that a portable temperature monitoring system can provide a result of temperature monitoring quickly and accurately.
In some implementations, a mobile temperature device or a mobile device (e.g., a smartphone) coupled with the mobile temperature device may be configured to apply artificial intelligence (e.g., machine learning using neural networks) calibrated for a specific emergency situation (e.g., COVID-19) to enforce preventive measures for the specific emergency situation. In some implementations, the mobile temperature device or the mobile device coupled with the mobile temperature device may be configured to perform a combination of at least one of face recognition, temperature sensing, geospatial positioning, proximity sensing, environmental sensing, biometric sensing, motion detection, and/or short distance communication (e.g., RFID, near field communication (NFC) or Bluetooth, among others). Performing such a combination can have the advantage of mutually reinforcing the effectiveness of the key components by applying artificial intelligence calibrated for a specific emergency.
In some implementations, a workplace entry management system may identify and/or classify a person upon entry using face recognition. In some implementations, the system may perform image processing on a face image using a plurality of landmarks each of which is a map of points that surround a feature of the face, e.g., eyes, mouth, nose, etc. Masks worn by people may obscure more than half of these landmarks, making face recognition difficult through conventional facial detection programs (e.g., DLib tools). In some implementations, the system may adjust or modify the landmarks used to primarily use landmarks around the eyes and brows to detect and recognize faces. In some implementations, the system may use such adjusted or modified landmarks to determine whether a person wears a mask. In some implementations, the system may use a machine learning algorithm to detect such features of the face from the face image. In another implementation, an image processor and thermal scanner are combined in a single device.
In some implementations, in order to access different data outputs produced by different thermal scanners and stored in different databases and different formats, a workplace entry management system may connect to a server via a network. In some implementations, an application programming interface (API) may be provided in the server so that the server can connect to different databases to access different data outputs produced by different thermal scanners. In some implementations, the server may include a local database so that data stored at the local database can be accessed through the API. In some implementations, the server may include an API that can access a plurality of remote databases. In some implementations, the server may include a local database and an API that can access both the local database and a plurality of remote databases.
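As a non-limiting illustration of this interface-based approach, the following Python sketch shows one way an API layer could normalize records coming from different scanner databases and formats into a single record shape. The class names, field names, and the Fahrenheit-to-Celsius conversion for a hypothetical vendor are assumptions for illustration only and do not describe any particular scanner's actual schema.

```python
# Sketch of an adapter layer: one adapter per scanner database/format, and a
# single normalized record type exposed by the server-side API. All names and
# formats below are illustrative assumptions.

from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class TemperatureRecord:
    device_id: str
    person_id: str
    temperature_c: float
    timestamp: str


class ScannerAdapter:
    """Base adapter; one subclass per scanner database or export format."""

    def fetch_records(self) -> Iterable[TemperatureRecord]:
        raise NotImplementedError


class VendorACsvAdapter(ScannerAdapter):
    """Hypothetical vendor that exports Fahrenheit readings in CSV rows."""

    def __init__(self, rows: List[dict]):
        self.rows = rows  # e.g., rows parsed from a vendor-specific CSV export

    def fetch_records(self) -> Iterable[TemperatureRecord]:
        for row in self.rows:
            yield TemperatureRecord(
                device_id=row["dev"],
                person_id=row["uid"],
                temperature_c=(float(row["temp_f"]) - 32.0) * 5.0 / 9.0,
                timestamp=row["time"],
            )


def collect_all_records(adapters: Iterable[ScannerAdapter]) -> List[TemperatureRecord]:
    """What an API endpoint could return after merging every configured source."""
    records: List[TemperatureRecord] = []
    for adapter in adapters:
        records.extend(adapter.fetch_records())
    return records
```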
According to certain aspects, implementations in the present disclosure relate to a method including monitoring, by at least one processor, a body temperature of a first person using a first device. The method may include determining, by the at least one processor responsive to the monitoring, as a first determination result, whether the monitored body temperature exceeds a predetermined threshold. The method may include performing, by the at least one processor, image processing on an image of the first person. The method may include determining, by the at least one processor based on a result of the image processing, as a second determination result, whether the first person wears a face mask. The method may include controlling, by the at least one processor, a second device based on at least one of the first determination result or the second determination result.
According to certain aspects, implementations in the present disclosure relate to a system including at least one processor. The at least one processor may be configured to cause a first device to monitor a body temperature of a first person, determine, responsive to the monitoring, as a first determination result, whether the monitored body temperature exceeds a predetermined threshold, perform image processing on an image of the first person, determine, based on a result of the image processing, as a second determination result, whether the first person wears a face mask, and control a second device based on at least one of the first determination result or the second determination result.
Various implementations in the present disclosure have one or more of the following advantages and benefits.
First, implementations in the present disclosure can perform a combination of at least one of face recognition, temperature sensing, motion detection, and/or short distance communication (e.g., near field communication (NFC) or Bluetooth, among others). Performing such a combination can have the advantage of mutually reinforcing the effectiveness of the key components by applying artificial intelligence calibrated for a specific emergency.
Second, implementations in the present disclosure can display real-time data obtained from a thermal scanner, providing real-time information related to body temperature of people in an organization, and control a physical device (e.g., a gate) using the real-time data. With this configuration, the organization can efficiently and effectively restrict individuals who are infected with the virus, or at higher risk for serious illness from it, from accessing workplace facilities in an automated and contactless manner.
Third, implementations in the present disclosure can perform image processing to determine whether a person wears a face mask or to recognize the face of a person by adjusting or modifying landmarks of the face. This configuration can provide more reliable face recognition even when a person wears a face mask or a face covering.
Fourth, implementations in the present disclosure can provide an application programming interface (API) in a server so that a workplace entry management system coupled with a plurality of different thermal scanners can connect to different databases to access different data outputs produced by the different thermal scanners. This configuration can provide an efficient and flexible scheme to access different data outputs that are produced by different thermal scanners and stored in different databases and different formats.
Referring to
In some implementations, the system 100 may be coupled or paired with one or more gates 130-1 to 130-N. In some implementations, the system 100 may be connected to one or more gates 130-1 to 130-N via a network. In some implementations, the system 100 may be coupled or paired with one or more cameras 140-1 to 140-N. In some implementations, the system 100 may be connected to one or more cameras 140-1 to 140-N via a network. In some implementations, the system 100 may be connected to a server 500 via a network. Here, the network may be a local area network (“LAN”), a wide area network (“WAN”), a wireless network, and/or the Internet, among others. The wireless network may use the IEEE 802.11 protocols, near field communication (NFC), Bluetooth, ANT, or any other wireless protocol, among others.
In some implementations, the system 100 may include one or more of a console manager 106, a device manager 104, a personnel manager 108, a pass manager 110, a system manager 112, an attendance manager 114, or an application manager 116, which perform console management, device management, personnel management, pass management, system management, attendance management, or application management, respectively, which will be described below with reference to
Referring to
In more detail, the processor(s) 210 may be any logic circuitry that processes instructions, e.g., instructions fetched from the memory 260 or cache 220. In some implementations, the processor(s) 210 are microprocessor units or special purpose processors. The computing device 200 may be based on any processor, or set of processors, capable of operating as described herein. The processor(s) 210 may be single core or multi-core processor(s). The processor(s) 210 may be multiple distinct processors.
The memory 260 may be any device suitable for storing computer readable data. The memory 260 may be a device with fixed storage or a device for reading removable storage media. Examples include all forms of non-volatile memory, media and memory devices, semiconductor memory devices (e.g., EPROM, EEPROM, SDRAM, and flash memory devices), magnetic disks, magneto optical disks, and optical discs (e.g., CD ROM, DVD-ROM, or Blu-Ray® discs). A computing system 172 may have any number of memory devices as the memory 260.
The cache memory 220 is generally a form of computer memory placed in close proximity to the processor(s) 210 for fast read times. In some implementations, the cache memory 220 is part of, or on the same chip as, the processor(s) 210. In some implementations, there are multiple levels of cache 220, e.g., L2 and L3 cache layers.
The network interface controller 230 manages data exchanges via the network interface (sometimes referred to as network interface ports). The network interface controller 230 handles the physical and data link layers of the OSI model for network communication. In some implementations, some of the network interface controller's tasks are handled by one or more of the processor(s) 210. In some implementations, the network interface controller 230 is part of a processor 210. In some implementations, a computing system 172 has multiple network interfaces controlled by a single controller 230. In some implementations, a computing system 172 has multiple network interface controllers 230. In some implementations, each network interface is a connection point for a physical network link (e.g., a cat-5 Ethernet link). In some implementations, the network interface controller 230 supports wireless network connections and an interface port is a wireless (e.g., radio) receiver/transmitter (e.g., for any of the IEEE 802.11 protocols, near field communication “NFC”, Bluetooth, ANT, or any other wireless protocol). In some implementations, the network interface controller 230 implements one or more network protocols such as Ethernet. Generally, a computing device 172 exchanges data with other computing devices via physical or wireless links through a network interface. The network interface may link directly to another device or to another device via an intermediary device, e.g., a network device such as a hub, a bridge, a switch, or a router, connecting the computing device 172 to a data network such as the Internet.
The computing system 172 may include, or provide interfaces for, one or more input or output (“I/O”) devices 250. Input devices include, without limitation, keyboards, microphones, touch screens, foot pedals, sensors, MIDI devices, and pointing devices such as a mouse or trackball. Output devices include, without limitation, video displays, speakers, refreshable Braille terminal, lights, MIDI devices, and 2-D or 3-D printers.
Other components may include an I/O interface, external serial device ports, and any additional co-processors. For example, a computing system 172 may include an interface (e.g., a universal serial bus (USB) interface) for connecting input devices, output devices, or additional memory devices (e.g., portable flash drive or external media drive). In some implementations, a computing device 172 includes an additional device such as a co-processor; e.g., a math co-processor can assist the processor 210 with high precision or complex calculations.
A specification of the mobile temperature device 310 is shown in Table 1 as an example. That is, the present disclosure is not limited to the specification shown in Table 1.
In some implementations, the computing system 423 may have configurations similar to those of the computing system 200 (see
In some implementations, a camera 402 may be configured to capture video and static images with Super Video Graphics Array (SVGA) 1280×720 HD resolution. The camera 402 or the camera assembly may be configured to perform network communication (e.g., Wi-Fi connectivity), and perform facial recognition (see
In some implementations, a distance sensor 404 may be a distance sensor or a time-of-flight sensor for providing accurate distance measurements to the subject. The distance sensor can improve the accuracy of temperature measurement because more accurate temperature readings are obtained as the subject approaches the scanner. For example, STMicroelectronics VL53L3CX time-of-flight proximity sensor may be used as the distance sensor 404.
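As a non-limiting illustration of using distance to qualify a temperature reading, the following Python sketch gates a measurement on the reported subject distance. The helper functions passed in and the 500 mm cutoff are hypothetical placeholders, not the actual sensor firmware interface.

```python
# Minimal sketch: accept a temperature reading only when the subject is close
# enough to the scanner for the reading to be reliable. The sensor-access
# helpers (read_distance_mm, read_object_temperature_c) are hypothetical.

MAX_MEASUREMENT_DISTANCE_MM = 500  # assumed usable range; tune per sensor


def read_gated_temperature(read_distance_mm, read_object_temperature_c):
    """Return a temperature reading only if the subject is close enough."""
    distance_mm = read_distance_mm()
    if distance_mm > MAX_MEASUREMENT_DISTANCE_MM:
        # Subject too far away; a reading here would be unreliable.
        return None
    return read_object_temperature_c()
```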
In some implementations, a temperature sensor 406 may have at least 35° of field of view (FOV) 423 (see
In some implementations, the temperature sensor 406 can support (1) an outdoor temperature range of 60-80 degrees Fahrenheit (° F.); (2) splash-proof or submersible construction for liquid ingress protection; (3) a sustained drop height of 5 ft; (4) a system weight of less than or equal to 3 oz; and/or (5) dimensions (L×W×H) of less than or equal to 1.67″×2.8″×0.75″.
In some implementations, a presence sensor 426 may be a presence sensor or a proximity sensor or an occupancy sensor. In some implementations, the presence sensor may have at least 100° FOV (FOV 427 in
In some implementations, a battery 428 may be a 650 mAh battery or a 1200 mAh battery for an extended battery life. For example, a 650 mAh battery with dimension (L×W×H) of 0.9″×1.9″×0.24″ or a 1200 mAh battery with dimension (L×W×H) of 1.3″×2.4″×0.2″ or 1.1″×2.4″×0.24″ may be used as the battery 428.
In some configurations in which the scanner 400 is mounted on the door 444 (see
In some configurations, as shown in
In some configurations in which the user holds the scanner 400 (see
In some implementations, one or more thermal scanners or a temperature monitoring system may be configured to apply artificial intelligence (e.g., machine learning using neural networks) calibrated for a specific emergency situation (e.g., COVID-19) to enforce preventive measures for the specific emergency situation. For example, a thermal scanner may be a thermal scanner 120-1, . . . , or 120-N (see
In some implementations, one or more thermal scanners or a temperature monitoring system may estimate the age of a person based on normal body temperature (see Table 2 below) since the normal body temperature range is different for various age groups. In some implementations, one or more thermal scanners or a temperature monitoring system may perform face recognition combined with age estimation so as to provide more personalized fever thresholds. For example, when the scanner or system detects a person's age as falling within 3-10 years based on his or her normal temperature, the scanner or system may determine a fever threshold of 37.8° C. (according to Table 2, for example) to be higher than fever thresholds for older people.
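As a non-limiting illustration of age-dependent fever thresholds, the following Python sketch maps estimated age groups to thresholds. Only the 37.8° C. value for the 3-10 year group comes from the example above; the remaining values are hypothetical placeholders standing in for Table 2.

```python
# Illustrative sketch of personalized fever thresholds by estimated age group.

AGE_GROUP_FEVER_THRESHOLDS_C = [
    ((0, 2), 38.0),     # hypothetical
    ((3, 10), 37.8),    # per the example above
    ((11, 64), 37.5),   # hypothetical
    ((65, 120), 37.3),  # hypothetical
]


def fever_threshold_for_age(age_years):
    """Look up a personalized fever threshold for an estimated age."""
    for (low, high), threshold_c in AGE_GROUP_FEVER_THRESHOLDS_C:
        if low <= age_years <= high:
            return threshold_c
    return 37.3  # conservative default


def is_feverish(body_temperature_c, estimated_age_years):
    return body_temperature_c > fever_threshold_for_age(estimated_age_years)


print(is_feverish(37.6, 5))    # False: below the 37.8 threshold for ages 3-10
print(is_feverish(37.6, 40))   # True: above the (hypothetical) adult threshold
```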
In some implementations, one or more thermal scanners or a temperature monitoring system can improve accuracy and detection of abnormal temperatures while minimizing false positives, which feature is critical in settings with young children or seniors. In some implementations, the scanner or system can use a result of age estimation to improve face recognition even for mask-wearers using a generative adversarial network (GAN) model, for example. Thus, accurate detection of personalized abnormal temperatures is significant for other purposes, e.g., accurate face recognition.
In some implementations, one or more thermal scanners or a temperature monitoring system can utilize features of ears for face recognition because ears are an effective biometric trait. Ear images have been utilized in many different works for the purpose of person identification, age estimation, and gender classification. In some implementations, one or more thermal scanners or a temperature monitoring system may use multiple visible light cameras (e.g., using multiple visible light lenses 312) for depth and facial detection. With this configuration, the scanner or the system can detect an object of interest more accurately in the view, particularly if a field of view is crowded (for instance, in urban environments, in a line, or in a crowded lobby). In some implementations, the scanner or the system can perform multi-object recognition and track multiple objects even in low-frame-rate recordings.
In some implementations, one or more thermal scanners or a temperature monitoring system can produce high-resolution images which enable high accuracy. For example, the scanner or the system may include a visible light camera having at least one or more of (1) a resolution of 3840×2160 pixels, (2) a focal length of 40 mm, (3) a field of view of 100°, or (4) f/1.2 and f/2.2 apertures for the wide and normal visible light cameras, respectively. In some implementations, the visible light lens 312 may have 5-6 elements and may be coated to be scratch-resistant.
In some implementations, one or more thermal scanners may use one or more edge AI chips to perform data collection and analysis within the scanner itself. In some implementations, only categorically defined data points and metadata are transferred via Wi-Fi and Bluetooth, for example, thereby reducing latency and improving battery life. The scanner's use of edge AI chips, combined with a plurality of sensors (e.g., temperature sensors and motion sensors) built into the scanner, can provide intelligent monitoring, privacy, and peace of mind to the users.
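As a non-limiting illustration of this edge-side filtering, the following Python sketch keeps only categorically defined fields of an event before it is queued for transmission; the field names are assumptions for illustration.

```python
# Sketch: raw frames and full sensor streams stay on the device; only
# whitelisted, categorically defined fields and metadata leave the scanner.

ALLOWED_FIELDS = {"timestamp", "device_id", "temperature_c", "mask_detected",
                  "person_category"}


def to_transmittable_record(raw_event):
    """Keep only the whitelisted fields of an on-device event."""
    return {key: value for key, value in raw_event.items() if key in ALLOWED_FIELDS}


# Example: the thermal frame is dropped before anything is transmitted.
event = {"timestamp": "2020-07-17T09:00:00Z", "device_id": "scanner-01",
         "temperature_c": 36.4, "mask_detected": True,
         "person_category": "employee", "thermal_frame": [[30.1, 30.4]]}
print(to_transmittable_record(event))
```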
In some implementations, one or more thermal scanners may automatically detect a face of a person, find the most reliable spot to measure, and send temperature readings to a computer or mobile device (e.g., mobile device 350 in
In some implementations, one or more thermal scanners may include one or more temperature sensors (e.g., one or more temperature sensors 317 in
In some implementations, one or more thermal scanners may calibrate hardware of a temperature sensor (e.g., temperature sensor 317 in
In some implementations, one or more thermal scanners can detect motion with a wide field of view (120°) and a distance of around five feet, for example, using a motion sensor (e.g., motion sensor 319 in
In some implementations, one or more thermal scanners can communicate at ranges of up to 10 meters using a short distance communication (e.g., Bluetooth or NFC) or Wi-Fi. Bluetooth devices do not need to be in direct sight of each other, making Bluetooth communication much more flexible and robust. Since a thermal scanner is Bluetooth-enabled, it can excel at low-bandwidth data transfer, while it is not intended as a replacement for high-bandwidth cabled peripherals. The thermal scanner may use Bluetooth 5.0 and have BLE (Bluetooth Low Energy) technology. For high-bandwidth information transfer, such as that to and from external hard drives or video cameras, the thermal scanner may enable Wi-Fi, for example 802.11ax or Wi-Fi 6 with 2×2 multiple-input and multiple-output (MIMO). This means most users do not need to touch the thermal scanner often. In some implementations, users can just access the dashboards and data (see
In some implementations, in order to access different data outputs produced by different thermal scanners (e.g., thermal scanners 120-1 to 120-N in
Referring to
In some implementations, the APIs shown in
In some implementations, a workplace entry management system (e.g., the system 100 in
When a person wears a face mask or a face covering, this may obscure more than half of these landmarks, making face recognition difficult through conventional facial detection programs (e.g., DLib tools). For example, referring to
In some implementations, the system may use a machine learning algorithm to detect such features of the face from the face image. Such machine learning models or techniques may include, but are not limited to, supervised learning, unsupervised learning, semi-supervised learning, regression algorithms, instance-based algorithms, regularization algorithms, decision tree algorithms, Bayesian algorithms, clustering algorithms, artificial neural networks, deep learning algorithms, dimension reduction algorithms (e.g., PCA), ensemble algorithms, support vector machines (SVM), and so on.
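As a non-limiting illustration of restricting landmarks to the eye and brow region when the lower face may be covered by a mask, the following Python sketch uses the publicly available dlib 68-point landmark model (brow points 17-26 and eye points 36-47). The model file path is an assumption, and the sketch is one possible implementation rather than the system's actual detection pipeline.

```python
# Sketch: detect faces with dlib and keep only the upper-face landmarks
# (eyebrows and eyes), which generally remain visible when a mask is worn.

import dlib

EYE_AND_BROW_INDICES = list(range(17, 27)) + list(range(36, 48))  # brows + eyes

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")  # assumed path


def upper_face_landmarks(image_path):
    """Return (x, y) landmark points around the eyes and brows for each detected face."""
    image = dlib.load_rgb_image(image_path)
    results = []
    for face_rect in detector(image, 1):
        shape = predictor(image, face_rect)
        points = [(shape.part(i).x, shape.part(i).y) for i in EYE_AND_BROW_INDICES]
        results.append(points)
    return results
```

The resulting upper-face points could then feed a recognizer or a mask-presence check, consistent with the adjusted-landmark approach described above.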
In some implementations, the system may use deep learning algorithms to predict and recognize individuals based on images of their ears to supplement landmark-based detections (particularly when a clear face picture is uploaded as the second picture so that a machine learning model can sufficiently learn from the clear face picture). In some implementations, such ear-based face recognition can be implemented using a large image database and a generative adversarial network (GAN)-based model which is constructed based on accepted biometric practices. Ear images can be effectively utilized for person identification, age estimation, and gender classification, because the use of ear images for such purposes is an accepted biometric practice.
Referring to
The user interface 800 may display device statistics 821 which indicates the number of devices online and the number of devices offline using a pie chart, for example. The user interface 800 may display attendance statistics 822 which indicates the number of on-time attendances, the number of late attendances, the number of people leaving early, the number of over-timers, and the time off duty, using a pie chart, for example. The user interface 800 may display (temperature) pass statistics 823 which indicates the number of people whose body temperature exceeds a threshold temperature and the number of people having a normal body temperature, using a pie chart, for example. The user interface 800 may display a real-time monitoring status 831 of a plurality of persons scanned, which includes information about each person, such as (1) time of being scanned, (2) whether each person is an employee, a visitor, blacklisted, or a stranger, using corresponding color codes 832, (3) body temperature of each person, (4) whether each person wears a face mask/covering, and/or (5) quick access to view details.
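As a non-limiting illustration of how the pass statistics could be aggregated for such a chart, the following Python sketch counts normal versus over-threshold scans; the record fields and the 37.3° C. threshold are illustrative assumptions.

```python
# Sketch: aggregate real-time monitoring records into the counts shown in a
# pass-statistics pie chart.

from collections import Counter

THRESHOLD_C = 37.3  # example alarm threshold used elsewhere in this disclosure


def pass_statistics(records):
    """Count normal-temperature versus over-threshold scans for the pie chart."""
    counts = Counter()
    for record in records:
        counts["abnormal" if record["temperature_c"] > THRESHOLD_C else "normal"] += 1
    return counts


print(pass_statistics([
    {"temperature_c": 36.5}, {"temperature_c": 37.6}, {"temperature_c": 36.9},
]))  # Counter({'normal': 2, 'abnormal': 1})
```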
In some implementations, the system may perform body temperature settings including settings for a body temperature detection switch (e.g., temperature detection switch 901), a compensation temperature (e.g., compensation temperature 904), an alarm threshold (e.g., alarm threshold 902), and/or a body temperature alarm (e.g., alarm switch 903). The body temperature detection switch setting may enable a user to control a body temperature detection function by turning the function on or off, with the default being on. Setting the body temperature detection switch to a value of “on” may indicate that, during identification or recognition of personnel traffic, the system displays (in a user interface) and/or broadcasts a body temperature value of a person after the face of the person is recognized. Setting the body temperature detection switch to a value of “off” may indicate that, during identification or recognition of personnel traffic, the system automatically hides the outline of the face of a person in a user interface and does not detect a body temperature of the person after the face is recognized.
The compensation temperature setting may enable a user to set a compensation temperature such that, when the ambient temperature may affect a detected body temperature, the detected body temperature can be automatically adjusted using the compensation temperature. In some implementations, the compensation temperature setting may enable a user to select an addition or a subtraction. For example, if a default compensation temperature value is 0.3° C. and a default selection is an addition, during identification or recognition of personnel traffic, a detected body temperature (e.g., 36.1° C.) can be automatically adjusted by adding 0.3° C. and the adjusted body temperature (e.g., 36.4° C.) can be displayed. In some implementations, the compensation temperature setting may enable a user to set a range of the compensation temperature. For example, if the range is set from 0° C. to 1° C., a maximum of one decimal place can be used.
The alarm threshold setting may enable a user to set an alarm threshold to control body temperature detection. In some implementations, the alarm threshold setting may enable a user to set a range of the alarm threshold. For example, if a default alarm threshold is 37.3° C. and the range is between 30.0° C. and 45.0° C., only numbers between 30.0° C. and 45.0° C. can be entered, and up to one decimal place can be used. The body temperature alarm setting may enable a user to select on or off to turn the body temperature alarm on or off, so that when the body temperature alarm is turned on, if an identified body temperature exceeds a threshold, an alarm may be issued. In some implementations, the body temperature alarm setting may enable a user to control a body temperature alarm function by turning the function on or off, with the default being on. For example, when the body temperature alarm is on, if a detected body temperature is higher than the threshold, the system may display the body temperature on a user interface and emit or issue an alarm sound, for example. On the other hand, when the body temperature alarm is on, if the body temperature is lower than the threshold, the system may not emit or issue an alarm. When the body temperature alarm is off, whether the body temperature is higher or lower than the threshold, the system may not emit or issue an alarm.
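As a non-limiting illustration of the compensation and alarm settings described above, the following Python sketch applies the example defaults (a 0.3° C. additive compensation and a 37.3° C. alarm threshold) and reproduces the example in which 36.1° C. is adjusted to 36.4° C.

```python
# Sketch of the compensation-temperature adjustment and alarm-threshold check.

def apply_compensation(detected_c, compensation_c=0.3, add=True):
    """Adjust a detected temperature by the configured compensation value."""
    if not 0.0 <= compensation_c <= 1.0:
        raise ValueError("compensation must be within the 0-1 degree C range")
    adjusted = detected_c + compensation_c if add else detected_c - compensation_c
    return round(adjusted, 1)  # up to one decimal place is kept


def should_alarm(adjusted_c, threshold_c=37.3, alarm_enabled=True):
    """Alarm only when the alarm is enabled and the adjusted temperature exceeds the threshold."""
    if not 30.0 <= threshold_c <= 45.0:
        raise ValueError("threshold must be between 30.0 and 45.0 degrees C")
    return alarm_enabled and adjusted_c > threshold_c


# Example from the text: 36.1 detected, +0.3 compensation -> 36.4, no alarm.
print(apply_compensation(36.1))   # 36.4
print(should_alarm(36.4))         # False
```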
In some implementations, body temperature settings may include settings for mask detection (e.g., mask settings 907 in
In some implementations, a personnel manager (e.g., the personnel manager 108 in
Using (1) the user interface to perform the employee creation operation, the system may enable a user to add employee information individually. For example, the system may enable a user to add a single employee to an employee list (not shown) by entering the personnel ID, name, gender, belonging group, phone number, ID card number, IC card number, place of birth, date of birth, contact address, notes, etc. The system may enable a user to add a face recognition photo for face recognition (e.g., the photos shown in
Using (2) the user interface to perform the visitor creation operation, the system may enable a user to add visitor information individually. For example, the system may enable a user to add a single visitor by entering a visitor ID, name, gender, affiliation group, mobile phone number, ID card number, IC card number, ethnicity, nationality, date of birth, contact address and remarks, etc. The system may enable a user to add a face recognition photo for face recognition (e.g., the photos shown in
In some implementations, using a user interface to perform the blacklist creation operation, the system may enable a user to add individual information to a blacklist. For example, the system may enable a user to add a single person to the blacklist by entering a blacklist ID, name, gender, affiliation group, mobile phone number, ID card number, IC card number, ethnicity, nationality, date of birth, contact address and remarks, etc. The system may enable a user to add a face recognition photo for face recognition (e.g., the photos shown in
Referring to
In some implementations, the system may enable a user to query data of the pass records and/or export the data by day. For example, the data of the current day is displayed by default (as shown in
In some implementations, the system may display pass records (per device, for example) which are identification records on the device. For example, as shown in
In some implementations, the system (e.g., the system 100 in
Referring to
In some implementations, the user interface 1100 may include a refresh button 1104 to display pass permission records as updated at the current time. In some implementations, the system may enable a user to query data of the pass permission records. For example, the system may enable a user to query data of the pass permission records with information 1107 such as a personal ID, name or phone number of a particular person. In response to a user submitting the information 1107, the system may display pass permission records satisfying or matching with the submitted information.
In some implementations, as shown in
In some implementations, the user interface 1100 may include an employee permission button 1105 to display a user interface (not shown) to give pass permission to, or revoke pass permission from, a particular employee, and to update the pass permission records accordingly. Similarly, the user interface 1100 may include a visitor permission button 1106 to display a user interface (not shown) to give pass permission to, or revoke pass permission from, a particular visitor, and to update the pass permission records accordingly.
In some implementations, a system manager (e.g., the system manager 112 in
Using (1) the user interface to manage the group structure, the system may enable a user to manage a group structure and organization user information (of an organization). For example, the user may be able to manage a group structure of an enterprise (e.g., hierarchical relationship) and manage enterprise user information in the enterprise.
Using (2) the user interface to perform role management, the system may enable a user to control various business function operations of users in the system (e.g., system administrators or other users that can log in to the system). For example, using the user interface to perform role management, the user can set access rights to system resources or data (e.g., device list, pass records, pass permission records, etc.) for a particular user in the system.
Using (3) the user interface to perform business management, the system may enable a user (e.g., a system administrator or a super administrator) to create and manage corporate accounts in the system. In some implementations, business management may be operated only by a system administrator (or super administrator). For example, each corporate account may have corporate administrator rights to log in to the system. After logging in to the system, the account can manage the organizational structure, users, and roles within the enterprise, and can view and manage all business data created by the enterprise users.
Using (4) the user interface to display system logs, the system may enable a user to display a system log list that contains the user's operation date, function modules, log details, operation results, operator and other information records during the use of the system.
Using (5) the user interface to perform system settings, the system may enable a user to set system parameters such as background server port, message service port, and/or database service port configuration.
In some implementations, an attendance manager (e.g., the attendance manager 114 in
Using (1) the user interface to manage attendance rules, the system may enable a user to add, modify and/or delete related rules including shifts, holidays, public holidays, and device groups, and the like.
Using (2) the user interface to display attendance records, the system may enable a user to query the attendance records of all employees by time period and group, track employees by attendance status, query the daily attendance within a custom time period by employee name and ID, and/or query the attendance of employees by date record and export the query result list file to download locally. In some implementations, each attendance record may include name of a person, date, employee group, employee ID, body temperature, face mask (e.g., whether the person wears a mask on that day), and/or attendance status (e.g., attendance or absence), etc.
Using (3) the user interface to display attendance statistics, the system may enable a user to query or export the data of normal and abnormal attendance of employees at all times or within a specified range of time, working days, public holidays, and overtime data on holidays. In some implementations, the user can query the data of normal and abnormal attendance of employees with pass status (e.g., whether body temperature is normal or abnormal, and/or whether the person wears a mask or not). In some implementations, the system may enable a user to select all groups of employees or a particular group of employees to display attendance statistics of all employees or attendance statistics of employees of that particular group. In some implementations, attendance statistics of an employee may include at least one or more of the name of the employee, employee ID, number of days with normal attendance, number of days with late attendance, number of days with absence, number of days with leaving early, number of days with overtime on working days, number of days with overtime on holidays, number of days with abnormal temperatures, number of days without a face mask, etc. In some implementations, the system may provide a search bar including a search box and an enter button (not shown) so that a user can enter an employee name or employee ID in the search box and click the enter button to query the employee's attendance data. In some implementations, the system may provide a search bar (not shown) for performing a range search of the employee's attendance data.
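As a non-limiting illustration of how such per-employee statistics could be aggregated from daily records, the following Python sketch counts attendance status, abnormal-temperature days, and days without a face mask; the record field names and the 37.3° C. threshold are illustrative assumptions.

```python
# Sketch: roll up daily attendance records into per-employee statistics.

from collections import defaultdict


def attendance_statistics(daily_records):
    """Summarize per-employee counts from daily attendance records."""
    stats = defaultdict(lambda: defaultdict(int))
    for record in daily_records:
        employee = stats[record["employee_id"]]
        employee[record["status"]] += 1              # e.g., "normal", "late", "absent"
        if record["temperature_c"] > 37.3:           # example threshold
            employee["abnormal_temperature_days"] += 1
        if not record["wore_mask"]:
            employee["days_without_mask"] += 1
    return stats


records = [
    {"employee_id": "E001", "status": "normal", "temperature_c": 36.6, "wore_mask": True},
    {"employee_id": "E001", "status": "late", "temperature_c": 37.8, "wore_mask": False},
]
print(dict(attendance_statistics(records)["E001"]))
```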
In some implementations, an application manager (e.g., the application manager 116 in
Using (1) the user interface to perform screen saver settings, the system may enable a user to set whether a face recognition is required, and/or set brightness of screen saver, for example. In some implementations, if a face recognition is not required, the system may require a screen saver to be displayed. If a face recognition is required, the system may cause the screen to jump to a face recognition page of the person (see
In some implementations, a system (e.g., temperature monitoring system 100 in
In some implementations, the first device may be a thermal scanner (e.g., one or more thermal scanners 120-1 to 120-N). In some implementations, the second device may be a gate configured to open or close (e.g., one or more gates 130-1 to 130-N in
In some implementations, the second device may be a speaker (e.g., I/O components 250 in
In some implementations, the second device may be a display (e.g., I/O components 250 in
In some implementations, the at least one processor may be further configured to set a compensation temperature (e.g., compensation temperature 0.3 in
In some implementations, the first device may be one thermal scanner of a plurality of different kinds of thermal scanners (e.g., one or more thermal scanners 120-1 to 120-N). In monitoring the body temperature of the first person, the at least one processor may be configured to connect one database of a plurality of databases (e.g., databases 510B, 520B, 530B in
In some implementations, an application programming interface (API) that integrates the plurality of databases (e.g., APIs 501A, API in the server 500B, API 501C in
In some implementations, in performing the image processing on the image of the first person, the at least one processor may be configured to select, based on a shape of a face mask, a set of landmarks (e.g., landmarks 702, 703, 705 in
In some implementations, the selected set of landmarks may include at least one of eyes, eyebrows, or ears (e.g., landmarks 703, 702, 706 in
In some implementations, the first device may be one thermal scanner of a plurality of different kinds of thermal scanners (e.g., one or more thermal scanners 120-1 to 120-N). In monitoring the body temperature of the first person, the at least one processor may connect one database of a plurality of databases (e.g., databases 510B, 520B, 530B in
In some implementations, an application programming interface (API) that integrates the plurality of databases (e.g., APIs 501A, API in the server 500B, API 501C in
At step 1604, in some implementations, the at least one processor may determine, responsive to the monitoring, as a first determination result, whether the monitored body temperature exceeds a predetermined threshold (e.g., alarm threshold 902 in
At step 1606, the at least one processor may perform image processing (see
At step 1608, the at least one processor may determine, based on a result of the image processing, as a second determination result, whether the first person wears a face mask. In determining whether the first person wears a face mask, the at least one processor may determine, based on the identified features of the face of the first person, whether the first person wears a face mask. In some implementations, the selected set of landmarks may include at least one of eyes, eyebrows, or ears (e.g., landmarks 703, 702, 706 in
At step 1610, the at least one processor may control a second device (e.g., one or more gates 130-1 to 130-N in
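As a non-limiting illustration of steps 1602-1610 taken together, the following Python sketch evaluates the two determination results and controls a second device accordingly. The gate, speaker, and display objects are hypothetical placeholders, and the policy of opening the gate only when both determinations pass is an assumption for illustration.

```python
# Sketch of the overall flow: evaluate the first determination result
# (temperature vs. threshold) and the second determination result (mask worn),
# then control the second device(s). Device interfaces are hypothetical.

def screen_person(temperature_c, wears_mask, threshold_c, gate, speaker, display):
    """Control the gate/speaker/display from the two determination results."""
    temperature_ok = temperature_c <= threshold_c   # first determination result
    mask_ok = wears_mask                            # second determination result

    display.show(f"Body temperature: {temperature_c:.1f} C")
    if not mask_ok:
        speaker.say("Please wear a face mask.")

    # Assumed policy: open only when both determinations pass.
    if temperature_ok and mask_ok:
        gate.open()
    else:
        gate.close()
    return temperature_ok and mask_ok
```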
In some implementations, the second device may be a gate configured to open or close (e.g., one or more gates 130-1 to 130-N in
In some implementations, the second device may be a speaker (e.g., I/O components 250 in
In some implementations, the second device may be a display e.g., I/O components 250 in
In some implementations, the method may include setting, by the at least one processor, a compensation temperature (e.g., compensation temperature 0.3 in
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. All structural and functional equivalents to the elements of the various aspects described throughout the previous description that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed as a means plus function unless the element is expressly recited using the phrase “means for.”
It is understood that the specific order or hierarchy of blocks in the processes disclosed is an example of illustrative approaches. Based upon design preferences, it is understood that the specific order or hierarchy of blocks in the processes may be rearranged while remaining within the scope of the previous description. The accompanying method claims present elements of the various blocks in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
The previous description of the disclosed implementations is provided to enable any person skilled in the art to make or use the disclosed subject matter. Various modifications to these implementations will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of the previous description. Thus, the previous description is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The various examples illustrated and described are provided merely as examples to illustrate various features of the claims. However, features shown and described with respect to any given example are not necessarily limited to the associated example and may be used or combined with other examples that are shown and described. Further, the claims are not intended to be limited by any one example.
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the blocks of various examples must be performed in the order presented. As will be appreciated by one of skill in the art the order of blocks in the foregoing examples may be performed in any order. Words such as “thereafter,” “then,” “next,” etc. are not intended to limit the order of the blocks; these words are simply used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles “a,” “an” or “the” is not to be construed as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits, and algorithm blocks described in connection with the examples disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and blocks have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the examples disclosed herein may be implemented or performed with a general purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some blocks or methods may be performed by circuitry that is specific to a given function.
In some exemplary examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable storage medium or non-transitory processor-readable storage medium. The blocks of a method or algorithm disclosed herein may be embodied in a processor-executable software module which may reside on a non-transitory computer-readable or processor-readable storage medium. Non-transitory computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor. By way of example but not limitation, such non-transitory computer-readable or processor-readable storage media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory computer-readable and processor-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory processor-readable storage medium and/or computer-readable storage medium, which may be incorporated into a computer program product.
The preceding description of the disclosed examples is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these examples will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to some examples without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the examples shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.
This application claims priority to U.S. Provisional Patent Application No. 63/053,136, filed Jul. 17, 2020, which is incorporated by reference in its entirety for all purposes.