Embodiments of the subject matter disclosed herein generally relate to a system and method for controlling a mechanical valve based on computer vision and machine learning, and more particularly, to a mechanical valve that is controlled by a human operator without touching the mechanical valve.
Control valves (e.g., mechanical valves) form an important building block of industrial control systems. These valves control the flow of liquids and gases between different parts of an industrial plant. An example of such a plant is a petrochemical plant that purifies crude oil. In such a plant, the crude oil flows through different stages while being purified, and the flow must be controlled for productive and safe operation of the whole plant. Historically, the flow of liquids and gases was controlled by manually operated mechanical valves. Because the valves are often installed at hard-to-reach locations, and they are exposed to the harsh environments of the plants (e.g., high-temperature locations), controlling them is a tedious task.
As the digital era started, digitally controllable valves were designed that were not only more convenient to use, but also helped revolutionize the design of industrial plants. However, to test and calibrate some of these valves, it is often desirable that the valves still be manually controlled. In this case, the valves are actuated by a control wheel until the desired flow rates are achieved. As highlighted above, these valves must be physically reached in order to be controlled, which requires proper planning that takes into consideration the safety risks involved. Therefore, it is desirable that these valves can still be controlled manually without physically reaching them.
A different problem exists for the water taps that people use several times a day, but this problem may be solved with a novel system that also solves the valve problems in the industrial environment discussed above. The water taps are also controlled by mechanical control valves. In a particular household, the control of the mechanical taps is linked to the efficient usage of water as well as the overall user experience. As water is one of the most valuable resources on earth, redefining how the taps are used could have a significant impact on preserving this precious resource. In fact, water consumption is an increasing worldwide concern, especially in countries with limited water resources. The Middle East and North Africa region has 6% of the world's population and less than 2% of the world's renewable water resources. It is the driest region in the world, containing the 12 most water-scarce countries in the world: Algeria, Bahrain, Kuwait, Jordan, Libya, Oman, the Palestinian Territories, Qatar, Saudi Arabia, Tunisia, the UAE and Yemen. For example, the Saudi Ministry of Water and Electricity indicated that individual water consumption in Saudi Arabia is 246 liters per day, which is three times the recommended individual rate defined by the World Health Organization as 83 liters per individual per day. This rate has made Saudi Arabia one of the highest water consumers in the world. Several studies have shown that more than 40% of the water is wasted during human-tap interaction for daily activities such as washing the hands, washing the face, brushing the teeth, etc.
As humans interact with these taps regularly, there is an imperative need to make them more efficient and thus reduce the water waste. Thus, there is a need for a novel intelligent mechanical valve that addresses these problems.
According to an embodiment, there is a smart-valve system for controlling a fluid. The smart-valve system includes a hybrid valve configured to control a flow of the fluid, wherein the hybrid valve includes electronics for controlling the flow of the fluid, and the hybrid valve also includes a manual handle for controlling the flow of the fluid, a controller connected to the electronics and configured to control the electronics to close or open the hybrid valve, a camera oriented to capture visual data about a user, and an artificial intelligence, AI, algorithm configured to receive the visual data from the camera, extract a user action or gesture from the visual data, generate a command associated with the user action or gesture, and send the command to the hybrid valve to control the flow of the fluid.
According to another embodiment, there is a method for controlling a hybrid valve, and the method includes collecting visual data associated with a user and a hybrid valve, transmitting the visual data from a camera to a controller that hosts an artificial intelligence, AI, algorithm, processing the visual data with the AI algorithm to extract a user action or gesture, generating a command with the AI algorithm based on the extracted user action or gesture, and sending the command to the hybrid valve to control a parameter of a fluid flowing through the hybrid valve.
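The method steps above (collect, extract, generate, send) can be sketched as a simple pipeline. This is a minimal illustrative sketch only; the extractor is stubbed, and all function names and the gesture-to-command table are hypothetical assumptions, not part of the disclosure.

```python
def extract_gesture(frame):
    """Stand-in for the AI algorithm's gesture-extraction step."""
    return frame.get("gesture")

def generate_command(gesture):
    """Map an extracted gesture to a valve command (hypothetical table)."""
    table = {"raise_hand": "open", "cover_face": "close"}
    return table.get(gesture, "no_op")

def control_valve(frame, send):
    """One pass of the method: visual data in, command sent to the valve."""
    send(generate_command(extract_gesture(frame)))

sent = []
control_valve({"gesture": "raise_hand"}, sent.append)
print(sent)  # ['open']
```

In a real deployment, `extract_gesture` would be replaced by a trained vision model and `send` by the link to the valve electronics.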
For a more complete understanding of the present invention, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
The following description of the embodiments refers to the accompanying drawings. The same reference numbers in different drawings identify the same or similar elements. The following detailed description does not limit the invention. Instead, the scope of the invention is defined by the appended claims. The following embodiments are discussed, for simplicity, with regard to a mechanical valve that controls the flow of a fluid in an industrial or household environment. However, the embodiments to be discussed next are not limited to a mechanical valve, or an industrial or household environment, but may be applied to other valves or in other environments.
Reference throughout the specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with an embodiment is included in at least one embodiment of the subject matter disclosed. Thus, the appearance of the phrases “in one embodiment” or “in an embodiment” in various places throughout the specification is not necessarily referring to the same embodiment. Further, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
According to an embodiment, there is a novel system that is configured to control at least one of the flow rate, pressure, flux, temperature, flow duration and direction of the fluid flowing through a mechanical valve, by analyzing human actions and/or gestures associated with the user of the mechanical valve. In one application, it is possible to use a machine learning/deep learning based computer model to identify in real-time the human actions and/or gestures for the purpose of controlling the mechanical valve. In this or another application, it is also possible to use a machine learning/deep learning-based computer model that is capable of predicting future actions/gestures of humans interacting with the mechanical valve to increase the overall system efficiency. In yet another application, it is possible to have a hygiene compliance system for the healthcare industry that also uses a machine learning/deep learning-based computer model to predict a reaction time of the human using the hygiene system and thus, to turn on and off the mechanical valve based on the predicted reaction time. These various applications are now discussed in more detail with regard to the figures.
According to an embodiment, as illustrated in
The controller 120 may be configured as a neural network that runs an artificial intelligence (AI) algorithm 122 for processing data. Alternatively, the AI algorithm 122 may be stored at a remote, central, server 124, that is linked by a wired or wireless communication link 126 to the controller 120. The AI algorithm 122 is configured and trained to receive visual data 132, from a camera 130 or equivalent sensor. The camera 130 may operate in the visible, infrared, ultraviolet, or any other spectrum as long as the camera is able to detect a movement of the user 114. The camera 130 is located in such a way that a gesture or equivalent indicia made by the user 114, with regard to the hybrid valve 110, can be recorded and transmitted to the AI algorithm 122. In one application, the camera 130 is oriented to capture both the user and the hybrid valve.
The AI algorithm 122 processes visual data 134 recorded by the camera 130 and determines an action to be taken by the hybrid valve 110, for example, to increase or decrease a rate or flux of the fluid through the pipe 112, to control a flow duration of the fluid through the pipe, to regulate a flow direction of the fluid, and/or to adjust a temperature of the fluid. For one or more of these actions, it is possible that plural pipes 112A to 112C and corresponding plural hybrid valves 110A to 110C are used to combine various fluid streams 113A to 113C into a single fluid stream 113, as illustrated in
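The combining of plural fluid streams described above can be illustrated with a simple mass balance: the combined flow rate is the sum of the branch flow rates, and the mixed temperature is approximately the flow-weighted average of the branch temperatures. This is an illustrative sketch assuming ideal mixing and equal heat capacities; the variable names and numbers are not from the disclosure.

```python
def mix_streams(flows, temps):
    """Combine branch streams into one stream.

    flows: branch flow rates (e.g., L/min); temps: branch temperatures (C).
    Returns (total flow, flow-weighted mixed temperature).
    """
    total = sum(flows)
    if total == 0:
        return 0.0, None  # no flow, temperature undefined
    mixed_temp = sum(f * t for f, t in zip(flows, temps)) / total
    return total, mixed_temp

# Three branches (e.g., streams 113A-113C) merging into stream 113:
total, temp = mix_streams([2.0, 1.0, 1.0], [60.0, 20.0, 20.0])
print(total, temp)  # 4.0 40.0
```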
To be able to adjust one or more of these parameters of the final stream 113, the AI algorithm 122 needs to receive visual data 134 that includes at least one of grayscale or RGB or video images of the user 114, or depth information regarding the user, or human gestures, or non-human visual signals. In one application, there are additional sensors 140 located around the user 114, as shown in
The user profile 115 may include, but is not limited to, a photo of the user, a photo of the face of the user so that the AI algorithm can automatically determine which user is using the system, the age, height, weight, sex, and/or any preference of the user (e.g., the user prefers water at 25 C for washing his/her hands, water at 31 C for washing his/her face, etc.). In one embodiment, if the system 100 is used in an industrial environment, a user may be associated with oil analysis, another user may be associated with gas testing, etc. In other words, any user may be associated with any desired characteristic or feature related to the system to be used. The user profile may be modified by the user or another user through a computing device that is connected to the controller 120, or by using an input/output interface 121 associated with the controller 120, as shown in
The visual data 134 may include, as discussed above, human gestures, human actions, visual clues, and any other visual indication that might not be related to humans. A human gesture may include different pre-defined human gestures, for example, raising a hand, stretching an arm, bending the arm, bending the body at a certain angle, smiling, covering his or her face, etc. Different mechanical valve operations can be conducted by performing actions such as raising the right hand, stretching both arms, etc. In other words, each possible action of the hybrid valve can be associated with a unique gesture of the user, and all this information may be stored in the user's profile 115. A human action may include, but is not limited to, hand washing, face washing, moving hands in a circular fashion, bowing, etc. A visual clue may include an image that records at least one parameter, e.g., the amount of light in a given place, the presence or absence of one or more persons in a given place, a particular behavior of one or more persons, etc.
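The per-user association between gestures and valve operations, stored in the profile 115, can be sketched as a nested lookup. The field names, gesture labels, and preference values below are illustrative assumptions only.

```python
# Hypothetical in-memory representation of user profiles 115.
profiles = {
    "user_a": {
        "preferred_temp_c": {"hands": 25, "face": 31},
        "gestures": {
            "raise_right_hand": "increase_flow",
            "stretch_both_arms": "stop_flow",
        },
    },
}

def command_for(user_id, gesture):
    """Look up the valve operation a given user assigned to a gesture."""
    profile = profiles.get(user_id, {})
    return profile.get("gestures", {}).get(gesture)  # None if unmapped

print(command_for("user_a", "raise_right_hand"))  # increase_flow
```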
In this way, the AI algorithm receives an input from the camera 130 and/or sensors 140, an input from the storage device 123, which stores the profiles 115 of the users 114, and also may receive an input from a remote server 124. Based on this information, the AI algorithm 122 is trained to provide a certain instruction to the controller 120, to control the hybrid valve 110.
While
While the embodiments discussed above have been discussed mainly with regard to an industrial environment, the next embodiment is adapted for use in a public or household washing area, e.g., the rest room in an office or the bathroom in a personal home, when the privacy of the user is not of concern or it is kept confidential. The system 100 may be used in this embodiment, with some additional features as now discussed with regard to
The user 114 interacts with the hybrid water tap 110 to perform daily operations, like hand washing, face washing, brushing teeth, shaving, etc. The controller 120 is configured to switch the hybrid water tap 110 on and off by interacting with its electronics 118, based on live commands generated by the AI algorithm 122. The AI algorithm 122 is configured and trained to analyze the hand movements (or head movements or body movements) of the user, based on the visual data 134, and to turn the water tap on or off. The AI algorithm 122 may also be configured to recognize the action undertaken by the person using the hybrid water tap and adjust the pressure of the water stream according to the requirements of the action. For example, collecting water to wash the face requires more water than soaking a toothbrush, and thus, the water pressure should be different for these two scenarios. The AI algorithm is trained to recognize and distinguish these two actions and provide the appropriate water pressure associated with each action. For this embodiment, a list 420 may be stored by the controller 120, and the list maps each action recognized by the AI algorithm to a corresponding water pressure or temperature or flow, etc.
In another embodiment, which may be combined with the above embodiment, the AI algorithm is configured and trained to recognize the action of the person using the tap and adjust the amount of water dispensed according to the requirements of the action. For example, if a person wants to wash his/her face, the amount of water dispensed should not exceed that requirement, which is previously calculated and stored in the list 420. Similarly, if a person intends to soak his/her toothbrush, only a small amount of water should be dispensed, and so on. All these amounts are previously researched and calculated and stored in the list 420. In one application, depending on the age or height or mass of the user, the AI algorithm may further adjust these amounts up or down according to one or more of these characteristics.
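The dispensing logic above can be sketched as a baseline amount per recognized action (the role of list 420), optionally scaled by a user characteristic such as height. All numeric values and names below are illustrative placeholders, not figures from the disclosure.

```python
# Hypothetical contents of list 420: baseline amount per action (mL).
BASE_AMOUNT_ML = {
    "wash_face": 500,
    "soak_toothbrush": 30,
    "wash_hands": 250,
}

def dispense_amount(action, height_cm=170, ref_height_cm=170):
    """Baseline amount for the action, scaled linearly by user height."""
    base = BASE_AMOUNT_ML.get(action)
    if base is None:
        return None  # unrecognized action: no preset amount
    return base * height_cm / ref_height_cm

print(dispense_amount("wash_face"))           # 500.0
print(dispense_amount("wash_face", 85, 170))  # 250.0 (e.g., for a child)
```

A linear scaling is only one possible adjustment rule; the disclosure leaves the exact adjustment unspecified.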
Further, the AI algorithm 122 may be trained to recognize the action undertaken by the person using the hybrid water tap and adjust the shape of the water flow accordingly. For example, collecting water in the hands can be done conveniently if the water flow is solid, as shown in
The AI algorithm 122, which is schematically illustrated in
A specific implementation of the AI algorithm 122 is now discussed with regard to
The AI algorithm may also be configured and trained to dispense water at a temperature that is comfortable for the person. In order to achieve this functionality, the system 100 in
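Dispensing water at a comfortable temperature can be framed as an inverse mixing problem: given the hot and cold supply temperatures, compute the hot fraction needed to reach the user's preferred temperature (e.g., from profile 115). This sketch assumes ideal mixing and equal heat capacities; the supply temperatures are illustrative assumptions.

```python
def hot_fraction(target_c, cold_c=15.0, hot_c=55.0):
    """Fraction of the total flow drawn from the hot supply.

    Derived from the mixing balance:
        target = hot_c * x + cold_c * (1 - x)  =>  x = (target - cold) / (hot - cold)
    """
    if not cold_c <= target_c <= hot_c:
        raise ValueError("target temperature outside the achievable range")
    return (target_c - cold_c) / (hot_c - cold_c)

# A user preferring 25 C for hand washing (with 15 C cold, 55 C hot supplies):
print(hot_fraction(25.0))  # 0.25
```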
In another embodiment, which may be combined with any of the above discussed embodiments, the spout of the hybrid water tap 110 may be controlled by the controller 120 to change its orientation, and thus, to direct the water stream in a different direction. For this embodiment, which is shown in
While all the embodiments discussed above use the camera 130 to either identify the user and use its profile 115 and/or to identify the user's gesture or action to control the hybrid valve 110, in one embodiment, as illustrated in
The profiles 115 of the users may be especially applicable for the taps in a household. The hybrid water taps can learn the personal preferences and styles of washing for each person in a household and save them to the personal profile. These profiles can be continuously updated to keep adapting to the changing styles of the users. These profiles can be stored in the local database 123 or on the cloud, in the server 124.
In yet another embodiment, which may be combined with any of the above discussed embodiments, the hybrid water tap may be configured to switch off when the hands of the user move away from the water tap. This action is implemented by the AI algorithm 122 based on an estimated distance between the hands and the hybrid water tap, as recorded by the camera 130. However, as the hybrid water tap includes a mechanical moving part (solenoid valve), which needs some time to be actuated/stopped, there is a delay between the moment when the hybrid water tap's stopping signal is generated by the controller 120 and the moment when the water actually stops flowing from the hybrid water tap 110. Although the amount of water wasted at each instance of stopping the hybrid water tap is small, the total amount wasted, when aggregated over the large number of users in a city, could be very large when considering the city's daily water usage. To address this issue, in this embodiment, the smart-tap system 100 is provided with a prediction mechanism (e.g., the configuration shown in
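The delay compensation described above can be sketched as issuing the stop command earlier than the predicted moment the hands leave the tap, offset by the solenoid's actuation delay. The predictor itself is stubbed here; the delay value and all names are illustrative assumptions.

```python
def stop_command_time(predicted_leave_s, actuation_delay_s=0.3):
    """Time (seconds, on a shared clock) at which to issue the stop command.

    If the hands are predicted to leave at t = predicted_leave_s and the
    solenoid valve needs actuation_delay_s to close, the command must be
    issued that much earlier so the water stops exactly on time.
    """
    return max(0.0, predicted_leave_s - actuation_delay_s)

# Hands predicted to leave at t = 5.0 s, valve closes in 0.3 s:
# the controller should issue the stop command at about t = 4.7 s.
print(stop_command_time(5.0))
```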
While the embodiments discussed above show the controller 120 being located behind a wall, opposite to the hybrid water tap 110, in one embodiment, as illustrated in
Another aspect of the prediction characteristic of the AI algorithm 122 is related to the desire to offer a smooth user experience. Consider the example of a person washing his/her face as discussed above. When the person, after washing his/her face, moves his/her hands again towards the hybrid water tap, the hybrid water tap may predict the exact moment when the user will expect the hybrid water tap to dispense the water. This prediction (i.e., the time at which the water should start flowing) may then be used to switch on the hybrid water tap at the right moment, i.e., when the hands of the user just reach under the spout, thus enhancing the overall user experience. The AI algorithm 122 would be trained to learn, recognize and predict the future movements of the person using the hybrid water tap, as discussed above with regard to
In yet another embodiment, which is illustrated in
In this way, the modified system 100 is capable of determining in real time whether a certain germ is still present on the user's hands, even after the user has finished using the hybrid water tap. For this situation, the AI algorithm 122 is configured to generate an alarm and emit a sound at a speaker 1020 or generate a warning light with a light source 1020, so that the user is made aware that he or she needs to continue washing the hands. This modified smart-tap-hygiene system would be very useful in the healthcare or food handling industries, as the workers there need to have clean hands before touching the patients or the food. These are also environments where the privacy of the users has a lower threshold, i.e., it is expected that their hands are monitored by such systems to ensure clean hands.
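The hygiene-compliance behavior above reduces to a threshold check followed by a warning (via the speaker or light source 1020). This is a minimal sketch; the germ-level scale, threshold, and function names are illustrative assumptions.

```python
def hygiene_check(germ_level, threshold=0.1, warn=print):
    """Return True if the hands are considered clean; otherwise warn.

    germ_level: hypothetical normalized reading from the germ sensor.
    warn: callable used to alert the user (e.g., drives speaker/light 1020).
    """
    if germ_level > threshold:
        warn("Germs still detected: please continue washing.")
        return False
    return True

print(hygiene_check(0.05))  # True
```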
While some of the above embodiments were discussed with regard to a hybrid water tap provided next to a sink, it is possible to use the novel features disclosed herein with a hybrid water tap with no sink, for example, when used with a public shower or an industrial shower or any other situation where the privacy of the user is not of concern.
According to the embodiment illustrated in
The method may further include a step of storing in the controller profiles associated with various users of the hybrid valve, and a step of generating the command based on a given profile of the profiles, which is associated with a given user that is currently using the hybrid valve. The method may also include a step of receiving germ information at the AI algorithm, and a step of generating a warning for the user when germs are detected on the user's hands. The method may also include a step of delaying a turning-on state of the hybrid valve based on a profile of the user stored in the controller. In one application, the AI algorithm is configured to learn from each user to predict a next movement or a time until the movement takes place.
The disclosed embodiments provide a smart-valve system that is capable of being operated by visual data associated with a user without the user practically touching the valve. It should be understood that this description is not intended to limit the invention. On the contrary, the embodiments are intended to cover alternatives, modifications and equivalents, which are included in the spirit and scope of the invention as defined by the appended claims. Further, in the detailed description of the embodiments, numerous specific details are set forth in order to provide a comprehensive understanding of the claimed invention. However, one skilled in the art would understand that various embodiments may be practiced without such specific details.
Although the features and elements of the present embodiments are described in the embodiments in particular combinations, each feature or element can be used alone without the other features and elements of the embodiments or in various combinations with or without other features and elements disclosed herein.
This written description uses examples of the subject matter disclosed to enable any person skilled in the art to practice the same, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the subject matter is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims.
This application claims priority to U.S. Provisional Patent Application No. 63/067,625, filed on Aug. 19, 2020, entitled “REAL-TIME CONTROL OF MECHANICAL VALVES USING COMPUTER VISION AND MACHINE LEARNING,” the disclosure of which is incorporated herein by reference in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/IB2021/057589 | 8/18/2021 | WO |

Number | Date | Country
---|---|---
63067625 | Aug 2020 | US