The present disclosure relates generally to faucets. More specifically, the present disclosure relates to faucets including object sensing and automatic dispensing features.
Generally, faucets provide or dispense fluids (e.g., water) to a sink or basin. Many faucets are manually controlled, for example, by rotating one or more handle(s) proximate to the faucet. Faucets are versatile fixtures used to perform many distinct functions; for example, faucets are used for dishwashing, food preparation, hand washing, and the like. Each of these applications may require different water temperatures, volumes of water, and the like. Manual control of these characteristics is burdensome, inconveniencing users.
Objects, features, and advantages of the present disclosure should become more apparent upon reading the following detailed description in conjunction with the drawing figures, in which:
The figures illustrate certain examples of the present disclosure in detail. It should be understood that the present disclosure is not limited to the details and methodology set forth in the detailed description or illustrated in the figures. It should be understood that the terminology used herein is for the purposes of description only and should not be regarded as limiting.
Described herein are devices, systems, and methods for controlling one or more characteristics of a flow of water dispensed from a faucet based on an object or a type of object disposed below the faucet. The faucet may control a type of water dispensed, a flow type of the water dispensed, a duration for which the water is dispensed, a temperature of water dispensed, a flow rate of the water dispensed, and the like. Specifically, the devices, systems, and methods described herein may control a flow attribute of water directed into a sink (e.g., dispensed), the flow attribute comprising a type of fluid dispensed and/or a flow type of fluid dispensed.
Generally, it may be beneficial to use different water types and/or dispense water having different flow types based on the object disposed below the faucet and/or the application for which the water is being used. For example, when a drinking glass is disposed below the faucet, the faucet may dispense filtered water having a laminar flow type, filling the glass with drinking water. In another example, when a dish is disposed below the faucet, the faucet may dispense tap water having a sweeping flow, for cleaning the dish. In some examples, a flow rate of the water dispensed when a dish is disposed below the faucet may vary based on a level of cleanliness or contamination of the dish. In yet another example, when a fruit and/or a vegetable is placed below the faucet, the faucet may dispense ozone water, electro chlorinated water, or electrolyzed water having a soft spray flow type for cleaning the fruits and vegetables.
In some examples, in addition to varying a type of water dispensed and/or a flow type of the water dispensed, the faucet may control a flow rate, a duration during which water is dispensed, and/or a temperature of water dispensed. For example, when a drinking glass is disposed below the faucet, the faucet may dispense cold water having a controlled flow rate for a controlled duration required to fill the glass. In another example, when a dish is disposed below the faucet, the faucet may dispense hot water for cleaning the dish. Similarly, when a clean dish or glass is disposed below the faucet, the faucet may dispense cold water having a flow rate and/or duration required to fill the glass, and when a dirty dish or glass is disposed below the faucet, the faucet may dispense hot water for cleaning the dish.
Referring generally to the various devices, systems, and methods described herein, a soft spray may be dispensed for cleaning fruits and/or vegetables, a sweeping flow may be used for cleaning dishes, and a laminar flow may be used for filling objects with water. Additionally, in some examples, the faucet may be configured to sense or detect and classify a cleanliness of any of the above identified objects (e.g., fruits, vegetables, dishes) and vary a flow rate of water dispensed based on the cleanliness (e.g., cleanliness classification or ranking) of the object. As used herein, the term dish includes any object or utensil used for food preparation and/or consumption, for example, pots, pans, plates, bowls, silverware, cooking spoons, spatulas, and the like.
Referring generally to the figures, the devices and systems described herein may include a range sensor (e.g., an ultrasonic sensor, a radar sensor) and/or an image sensor or camera configured to sense an object disposed below the faucet. For example, a field of view of the range sensor and image sensor may be directed toward a sink or basin disposed below the faucet. Specifically, the range sensor may be configured to detect the presence of an object below the faucet and/or a distance between the faucet and the object disposed below the faucet. The image sensor may be configured to capture images of an object below the faucet, for example, as a user places or holds the object below the faucet. In some examples, the image sensor may capture an image in response to the range sensor detecting an object below the faucet. Objects placed below the faucet may include, for example, a dish, a glass, a bottle, fruits, hands, a toothbrush, and the like.
A faucet control system may receive the image or image data from the camera. Specifically, in some examples, a processor may receive the image from the camera. The faucet control system or processor may process the image to detect and classify the object included in the image. The faucet control system may cause the faucet (e.g., a spout of the faucet) to dispense a flow of fluid with characteristics (e.g., a water type, flow type, temperature, flow rate, etc.) which correspond to the classification of the object in the image. Accordingly, in the devices, systems, and methods described herein, fluid flow characteristics including a type of water dispensed and a flow type (e.g., soft spray, sweeping flow, laminar flow) may be dynamically changed and controlled according to the object or objects disposed below the faucet. Specifically, a flow attribute comprising one of a fluid type and a fluid flow type may be dynamically changed (e.g., selected) based on an object or objects disposed below the faucet. Accordingly, the faucet may be automatically controlled to be adapted to different uses based on the object or type of object positioned below the faucet.
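The control flow described above can be sketched as a lookup from an object classification to a flow attribute. The following is a minimal illustration in Python; all class names and attribute values are hypothetical placeholders, not part of the disclosure:

```python
# Hypothetical sketch: map an object classification to a flow attribute
# (fluid type + flow type), as described for the faucet control system.
# All class names and attribute values are illustrative assumptions.

FLOW_ATTRIBUTES = {
    "drinking_glass": {"fluid_type": "filtered", "flow_type": "laminar"},
    "dish":           {"fluid_type": "tap",      "flow_type": "sweeping"},
    "produce":        {"fluid_type": "ozone",    "flow_type": "soft_spray"},
}

# Assumed fallback when the classifier reports an unknown object.
DEFAULT = {"fluid_type": "tap", "flow_type": "laminar"}

def select_flow_attribute(classification: str) -> dict:
    """Return the flow attribute corresponding to a classified object."""
    return FLOW_ATTRIBUTES.get(classification, DEFAULT)

print(select_flow_attribute("drinking_glass"))
```

In a full system, the selected attribute would be translated into control signals for the water selection valve and flow controller rather than printed.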
Referring generally to
In operation, fluid (e.g., water) flows into the faucet 100, 200, 300, 400 from one or more water sources, up the extension portion 111 of the spout 110, through the arched portion 112, and out of the opening 114 in the end portion 113 into the sink 120. In some examples, a user may manually turn the faucet 100, 200, 300, 400 on or off via rotation or manipulation of one or more handles 116 coupled to or located proximate to the faucet 100, 200, 300, 400. Additionally, a temperature and/or flow rate of fluid dispensed from the faucet may be changed by manipulating the one or more handles 116. In some touchless applications, a user turns on the faucet 100, 200, 300, 400 by positioning their hands or an object beneath the spout 110. A sensor detects the object 170 (e.g., a user's hands) below the spout 110 and activates a valve (e.g., solenoid valve) to induce flow into the sink 120. In such applications, fluid flow is binary (e.g., “on” and “off” based on the presence or absence of an object 170 below the spout 110). According to the examples of the present disclosure described herein, a fluid type and/or a fluid flow type (i.e., a flow attribute) are adapted (e.g., selected, determined) based on a type of object 170 and/or characteristics (e.g., cleanliness) of the object placed beneath the spout 110.
Each of the faucets 100, 200, 300, 400 described herein may include one or more sensors. Specifically, each of the faucets 100, 200, 300, 400 may include an image sensor 130. In some examples, each of the faucets 100, 200, 300, 400 may further include a range sensor 140. The image sensor 130 and/or the range sensor 140 are mounted to the faucet 100, 200, 300, 400 so as to have a field of view that includes the sink 120. In some examples, the image sensor 130 and the range sensor 140 may be mounted so as to have a field of view that includes a portion of the surface 121 (e.g., countertop) adjacent to the sink 120. Specifically, in some examples, the image sensor 130 may have a field of view 131. Generally, the image sensor 130 and the range sensor 140 may face outward from the faucet 100, 200, 300, 400 toward the sink 120.
As illustrated in
Referring to
Returning to
The mixing valve 510 may be configured to control or regulate flow from a hot water source and a cold water source. Referring to
In some examples, as illustrated in
The water selection valve 520 may be configured to control or regulate a flow of water provided to a faucet 100, 200, 300, 400, 700 from two or more water sources. For example, referring to
In some examples, the water selection valve 520 may be configured to control or regulate a flow of water from both an alkaline water tank or reservoir and an acidic water tank storing alkaline and acidic water, respectively. In some examples, the water selection valve 520 may be configured to control or regulate a flow of filtered water from a filtered water tank or reservoir to the faucet 100. In some examples, the water selection valve 520 may comprise one or more valves, each of which is configured to control or regulate a flow of a respective water type or from a respective water source. For example, the water selection valve 520 may control or regulate a flow of water or fluid from one or more of the ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593. In some examples, the water selection valve 520 may include one or more valve drivers configured to control the water selection valve 520.
The water selection valve 520 may be configured to control or regulate a type of fluid (e.g., tap water, pH controlled water, filtered water) dispensed from a faucet 100, 200, 300, 400, 700. Specifically, in some examples, the water selection valve 520 may control or regulate a type of fluid provided to the faucet 100, 200, 300, 400, 700. The faucet control system 500 may control the water selection valve 520 at least in part based on image data received from the image sensor 130 and/or the range data received from the range sensor 140. In some examples, the water selection valve 520 may include one or more valve drivers configured to control the water selection valve 520.
The flow controller 580 may be configured to control or regulate a flow type dispensed or conveyed from a faucet 100, 200, 300, 400, 700. Specifically, the flow controller 580 may control which outlet or outlets fluid flows through as it is dispensed from a faucet 100, 200, 300, 400, 700. In some examples, the location of an outlet or an arrangement (e.g., pattern) of outlets through which fluid is dispensed may dictate or control a flow type of fluid dispensed from the faucet 100, 200, 300, 400, 700. In some examples, one or more outlets may include a fluidic device, such as a nozzle or fluidic oscillator, configured to condition a flow of fluid dispensed from the faucet 100, 200, 300, 400, 700 to have a specific flow type.
In some examples, as illustrated in
Specifically, in one example, as illustrated in
Further, in one example, as illustrated in
In other examples, the first and/or second set of one or more outlets 730, 750 may be arranged (e.g., in a pattern) so as to create a spray (e.g., conical) flow type, a soft (e.g., relatively low velocity) spray flow type, and/or an oscillating flow type. In some examples, the first and/or second set of one or more outlets 730, 750 may include one or more fluidic devices, such as a fluidic oscillator or a nozzle configured to condition a flow of fluid dispensed from the faucet 700. For example, one or more fluidic oscillators may be used to create an oscillating (e.g., back and forth) flow type. Further, one or more nozzles may be used to vary a velocity and/or flow rate of fluid dispensed from the faucet 700, for example, creating spray and/or soft spray patterns.
In other examples, a faucet 100, 200, 300, 400 may include a single channel extending therethrough, e.g., from an inlet 710 to one or more outlets (e.g., sets of outlets 730, 750). In these examples, the flow controller 580 may include a plurality of nozzles and/or fluidic devices, for example, arranged radially around a disc. The flow controller 580 may be configured to change a nozzle and/or fluidic device in line with the single channel of the faucet 100, 200, 300, 400, thus, changing a nozzle and/or fluidic device through which fluid is dispensed into the sink (e.g., 120) and a flow type of the fluid dispensed by the faucet 100, 200, 300, 400.
Returning to
The ozone generator 530 may be configured to selectively generate ozone (O3) and/or introduce or entrain (e.g., generated) ozone into water. Specifically, as illustrated in
The ozone generator 530 may be configured to control or regulate a quantity (e.g., volume, concentration) of ozone in a flow of water supplied to a faucet 100, 200, 300, 400, 700. Specifically, the ozone generator 530 may selectively generate ozone and/or selectively introduce or entrain ozone in a flow of water supplied to a faucet 100, 200, 300, 400, 700. The faucet control system 500 may control the ozone generator 530 at least in part based on image data received from the image sensor 130 and/or the range data received from the range sensor 140. According to some examples, as described hereinafter in greater detail, the faucet control system 500 may be configured to identify one or more gestures performed by a user, for example, gestures for adjusting a concentration of ozone in a flow of water dispensed by the faucet 100, 200, 300, 400, 700. Accordingly, the faucet control system 500 may control the ozone generator 530 to increase or decrease a quantity or concentration of ozone entrained in a flow of water dispensed from the faucet 100, 200, 300, 400, 700.
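As a hypothetical sketch of the gesture-based adjustment described above, recognized gestures may step an ozone setpoint up or down within limits; the gesture names, step size, and limits below are illustrative assumptions, not values from the disclosure:

```python
# Hypothetical sketch: gesture-driven ozone setpoint adjustment.
# Gesture names, step size, and limits are illustrative assumptions.

MIN_PPM, MAX_PPM, STEP_PPM = 0.0, 1.0, 0.1

def adjust_ozone(current_ppm: float, gesture: str) -> float:
    """Return the new ozone setpoint after an 'up' or 'down' gesture."""
    if gesture == "up":
        return min(MAX_PPM, round(current_ppm + STEP_PPM, 2))
    if gesture == "down":
        return max(MIN_PPM, round(current_ppm - STEP_PPM, 2))
    return current_ppm  # unrecognized gestures leave the setpoint unchanged

print(adjust_ozone(0.5, "up"))
```

Clamping to the configured limits keeps the setpoint in range no matter how many gestures are performed.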
As illustrated in
The water electrolyzer 590 may include two plates (e.g., a cathode plate and an anode plate). Each plate may be formed of a metal or conductive material. Each plate may be electrically connected to a power source directly or through the processor 560 and/or valve control module 550. The anode may be disposable. For example, the user may remove and reinstall the anode at a time interval or as needed. Vinegar or peracetic acid may be added to the water as a reactant.
The processor 560 and/or valve control module 550 may provide one or more control signals and/or power (e.g., electric current) to the water electrolyzer 590. The water electrolyzer 590 may be configured to control or regulate a flow of electrolyzed water to a faucet 100, 200, 300, 400, 700. Specifically, the water electrolyzer 590 may selectively electrolyze water and/or supply electrolyzed water to the faucet 100, 200, 300, 400, 700 at least in part based on the image data received from the image sensor 130 and/or range data received from the range sensor 140. In some examples, the water selection valve 520 may be disposed between the water electrolyzer 590 and the faucet 100, 200, 300, 400, 700 such that the water selection valve 520 controls or regulates a flow of electrolyzed water to the faucet 100, 200, 300, 400, 700. For example, the water selection valve 520 may control or regulate a flow of water from a tank storing electrolyzed water to the faucet 100, 200, 300, 400, 700.
In some examples, as illustrated in
The copper ionization system 591 may include two plates (e.g., a cathode plate and an anode plate). Each plate may be formed of a copper material (e.g., solid copper, copper alloy). Each plate may be electrically connected to a power source directly or through the processor 560 and/or valve control module 550. The processor 560 and/or valve control module 550 may provide one or more control signals and/or power to the copper ionization system 591. The copper ionization system 591 may be configured to control or regulate a flow of water including copper ions (e.g., copper ionized water). Specifically, the copper ionization system 591 may selectively generate copper ions and/or supply copper ionized water to the faucet 100, 200, 300, 400, 700 at least in part based on the image or image data received from the image sensor 130 and/or the range data received from the range sensor 140. In some examples, the water selection valve 520 may be disposed between the copper ionization system 591 and the faucet 100, 200, 300, 400, 700 such that the water selection valve 520 controls or regulates a flow of copper ionized water to the faucet 100, 200, 300, 400, 700.
In some examples, the faucet control system 500 may further include a beverage printer 592. The beverage printer 592 may be configured to print or make one or more beverages in-situ at the faucet 100, 200, 300, 400, 700 or faucet control system 500. For example, the beverage printer 592 may be configured to print or make flavored water, carbonated flavored water, soda, and the like in-situ. In some examples, the beverage printer may receive tap water from a first water source. In other examples, the beverage printer may receive filtered water. In some examples, the beverage printer 592 may be configured to receive one or more containers or cartridges including a substance (e.g., liquid, powder, syrup, etc.) configured to make flavored water, soda, or another beverage in-situ. Specifically, the beverage printer 592 may mix the substance for flavoring the beverage with water supplied to the beverage printer 592. Additionally or alternatively, in some examples, the beverage printer 592 may include a carbonation system including a carbon dioxide (CO2) tank and a regulator configured to make carbonated beverages such as soda, carbonated water, and the like.
The processor 560 and/or valve control module 550 may provide one or more control signals and/or power to the beverage printer 592. The beverage printer 592 may be configured to make a plurality of different beverages based on a control signal or control signals received from the processor 560 and/or valve control module 550. The beverage printer 592 may selectively print or make beverages based at least in part on the image or image data received from the image sensor 130 and/or the range data received from the range sensor 140. For example, the beverage printer 592 may be configured to dispense a specific beverage based on a gesture performed by a user when a glass is disposed below the faucet 100, 200, 300, 400, 700.
In some examples, as illustrated in
The electro chlorine generator 593 may include two plates (e.g., a cathode plate and an anode plate). Each plate may be formed of a metal or conductive material. Each plate may be electrically connected to a power source directly or through the processor 560 and/or valve control module 550. The processor 560 and/or valve control module 550 may provide one or more control signals and/or power (e.g., electric current) to the electro chlorine generator 593. The electro chlorine generator 593 may be configured to control or regulate a flow of electro chlorinated water to the faucet 100, 200, 300, 400, 700. Specifically, the electro chlorine generator 593 may selectively generate electro chlorinated water and/or supply the electro chlorinated water to the faucet 100, 200, 300, 400, 700 at least in part based on the image data received from the image sensor 130 and/or range data received from the range sensor 140. In some examples, the water selection valve 520 may be disposed between the electro chlorine generator 593 and the faucet 100, 200, 300, 400, 700, such that the water selection valve 520 controls or regulates a flow of electro chlorinated water to the faucet 100, 200, 300, 400, 700.
The data processing module 540 may include an image processing module 541 and an object classifier 542. The image processing module 541 may be configured to process images or other data received from the image sensor 130. The image processing module 541 may be configured to identify objects included in images. The object classifier 542 may be configured to classify the objects in the images. The faucet control system 500 is configured to generate control signals for one or more of the mixing valve 510, water selection valve 520, ozone generator 530, flow controller 580, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 to control fluid flow characteristics based on the classification of the objects in the images. Specifically, the faucet control system may generate control signals for one or more of the water selection valve 520, the ozone generator 530, the flow controller 580, the water electrolyzer 590, the copper ionization system 591, the beverage printer 592, and the electro chlorine generator 593 to control a flow attribute comprising a fluid type and a flow type based on the classification of an object or objects in the images.
In some examples, the image sensor 130 may be configured to collect images (e.g., image data) and/or other data in response to the range sensor 140 detecting the presence of an object below the faucet 100, 200, 300, 400, 700. The data processing module 540 may be configured to process data (e.g., ranging data) received from the range sensor 140. Specifically, the data processing module 540 may be configured to identify the presence and/or absence of an object below the faucet 100, 200, 300, 400, 700 based on the data received from the range sensor 140. The data processing module 540 may be configured to generate one or more control signals, for example, turning on or causing the image sensor to collect images or image data in response to a determination that an object is disposed below the faucet 100, 200, 300, 400, 700.
The processor 560 and memory 570 may form a processing circuit. The processor 560 may be configured to execute instructions stored in the memory 570 or may execute instructions otherwise accessible to the processor 560. In some examples, the one or more processors 560 may be embodied in various ways. The processor 560 may be constructed in a manner sufficient to perform at least the operations described herein. In some examples, the processor 560 may be shared by multiple circuits (e.g., circuit A and circuit B may comprise or share the same processor which, in some examples, may execute instructions stored, or otherwise accessed, via different areas of memory 570). Alternatively, or additionally, the processor 560 may be structured to perform or otherwise execute certain operations independent of one or more co-processors. In other examples, two or more processors 560 may be coupled via a bus to enable independent, parallel, pipelined, or multi-threaded instruction execution. Each processor 560 may be implemented as one or more general-purpose processors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), digital signal processors (DSPs), or other suitable electronic data processing components structured to execute instructions provided by memory. The processor 560 may take the form of a single core processor, multi-core processor (e.g., a dual core processor, triple core processor, quad core processor, etc.), microprocessor, etc.
The memory 570 may include non-transient volatile storage media, non-volatile storage media, non-transitory storage media (e.g., one or more volatile and/or non-volatile memories), etc. In some examples, the non-volatile media may take the form of ROM, flash memory (e.g., flash memory such as NAND, 3D NAND, NOR, 3D NOR, etc.), EEPROM, MRAM, magnetic storage, hard discs, optical discs, etc. In other examples, the volatile storage media may take the form of RAM, TRAM, ZRAM, etc. Combinations of the above are also included within the scope of machine-readable media. The memory 570 may store machine-readable executable instructions which are executed by the processor 560. In this regard, machine-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions. The memory 570 may be operable to maintain or otherwise store information relating to the operations performed by one or more associated circuits, including processor instructions and related data (e.g., database components, object code components, script components, etc.), in accordance with the examples described herein.
In some examples, at least some of the components of the faucet control system 500 may execute locally. For instance, the processor 560 and memory 570 may be implemented, incorporated, or otherwise execute on a local computing device which is located at or near the faucet 100, 200, 300, 400, 700. For example, the local computing device may be mounted at or near a sink (e.g., 120) and/or a faucet 100, 200, 300, 400, 700. The faucet control system 500 is communicably coupled to the image sensor 130 and/or range sensor 140. The faucet control system 500 may be configured to execute locally to process data received from the image sensor 130 and/or range sensor 140 and control one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 based on, at least, the data from the image sensor 130 and/or range sensor 140.
The image sensor 130 may include a camera. The camera may include a lens and an image capture element. The image capture element can be any suitable type of imaging capture device or system, including, for example, an area array sensor, a Charge Coupled Device (CCD) sensor, a Complementary Metal Oxide Semiconductor (CMOS) sensor, or a linear array sensor, and the like. The image capture element may capture images in any suitable wavelength on the electromagnetic spectrum. The image capture element may capture color images and/or greyscale images. In other examples, the image sensor 130 may include other types of sensors configured to generate image data. For instance, the image sensor 130 may include an infrared sensor, a plurality of range sensors (e.g., 140), and the like.
The range sensor 140 may include a signal transmitter, a signal detector, and a timer. The range sensor 140 may be configured to transmit a signal into or toward a medium, for example, via the signal transmitter. The range sensor 140 may be configured to detect the signal reflected off of the medium, for example, using the signal detector. The range sensor 140 may be configured to calculate a distance or range between the signal transmitter and the medium based on the duration between the time at which the signal is transmitted and the time at which the reflected signal is detected. For example, the range sensor 140 may be an ultrasonic sensor, a radar sensor, a light detection and ranging (LIDAR) sensor, a sonar sensor, and the like. In some examples, the range sensor 140 is configured to detect or otherwise generate data corresponding to relative distances or depths. The range sensor 140 may be configured to transmit multiple signals toward a medium simultaneously or near simultaneously. The range sensor 140 may be configured to detect reflected signals. The range sensor 140 may be configured to detect or otherwise generate data corresponding to relative distances or depths based on the times at which the multiple signals are detected (e.g., following reflection off of the medium). The range sensor 140 may be configured to transmit multiple signals with a resolution sufficient to detect, for instance, a depth differential of various surfaces of an object. As one example, where a cup is positioned beneath the faucets 100, 200, 300, 400, 700, the range sensor 140 may be configured to detect a depth differential between a rim of the cup and a bottom of the cup. As the cup is filled with water, the depth differential may change.
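The time-of-flight calculation described above can be sketched as follows; the distance is half the round-trip time multiplied by the propagation speed of the signal. The speed values are standard physical constants, and the function name is illustrative:

```python
# Sketch of time-of-flight ranging: distance is half the round-trip
# time multiplied by the propagation speed of the transmitted signal.
# Speed values are standard physical constants, not from the disclosure.

SPEED_OF_SOUND_AIR = 343.0      # m/s, ultrasonic sensor at ~20 degrees C
SPEED_OF_LIGHT = 299_792_458.0  # m/s, radar / LIDAR

def range_from_round_trip(elapsed_s: float, speed_m_s: float) -> float:
    """Distance to the reflecting surface from a round-trip time."""
    return speed_m_s * elapsed_s / 2.0

# An ultrasonic echo arriving 2 ms after transmission implies ~0.343 m.
print(range_from_round_trip(0.002, SPEED_OF_SOUND_AIR))
```

The same relation applies to the depth-differential example: the rim and bottom of a cup return echoes at slightly different times, and the difference of the two computed ranges is the depth differential.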
According to some examples, the range sensor 140 may be a microwave or millimeter wave radar sensor. For example, the microwave radar sensor may emit electromagnetic wave signals and receive electromagnetic wave echo signals reflected by targets. A millimeter wave radar sensor using FMCW (Frequency Modulated Continuous Wave) technology provides high-precision radar ranging: the transmitted microwave is mixed with the wave reflected by the target through a radio frequency (RF) circuit to generate an intermediate frequency signal whose characteristics correspond to the target distance and signal strength. The intermediate frequency signal is processed to obtain the distance, intensity, and speed of the targets. Based on these signal characteristics, the sensor may identify the presence of an object, the location of an object, and/or the size and shape of an object.
According to some examples, the millimeter wave sensor may generate a plurality of measurements indicating a location at which the object is disposed, and the millimeter wave sensor or the data processing module 540 may generate a point cloud for determining the size and shape of the object. According to some examples, the object may be classified based on a point cloud generated using a millimeter wave sensor.
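As a hypothetical sketch of the size-estimation step described above, the extents of a point cloud can be taken from an axis-aligned bounding box; the simple bounding-box approach and the point values below are illustrative assumptions:

```python
# Hypothetical sketch: estimate an object's size from a point cloud of
# (x, y, z) measurements, as a precursor to classification. The simple
# axis-aligned bounding box used here is an illustrative assumption.

def bounding_box_size(points):
    """Return the (dx, dy, dz) extents of a point cloud, in the same units."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# A few points that might be returned for a cup-sized object (meters).
cup_like = [(0.00, 0.00, 0.00), (0.08, 0.00, 0.00), (0.04, 0.08, 0.10)]
print(bounding_box_size(cup_like))
```

A classifier could then compare these extents against typical dimensions of glasses, dishes, or produce as one input among several.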
In one example, the microwave or millimeter wave sensor is configured to detect the presence of one or more objects or motion of one or more objects. According to some examples, the microwave or millimeter wave sensor may be provided with a millimeter wave sensor module and a microcontroller unit (MCU). According to other examples, the processor 560 and/or the data processing module 540 may perform the functionality of the millimeter wave sensor. In one example of the millimeter wave control unit, the microwave operating frequency may be selected as, for example, 24 GHz or 60/77 GHz, though the operating frequency is not restricted to these values. According to some examples, the microwave operating frequency may be in the range of 20 GHz to 80 GHz.
The basic process of microwave ranging and speed measurement is summarized as follows. The transmission antenna of the millimeter wave sensor module transmits a millimeter wave signal (Tx chirp). The receiving antenna of the millimeter wave sensor module receives the reflected wave (Rx chirp) when there is an object or user in range (e.g., disposed below the faucet 100, 200, 300, 400, 700). The emitted wave and reflected wave are mixed in a mixer to generate an intermediate frequency signal in the millimeter wave sensor. The MCU of the millimeter wave sensor performs a fast Fourier transform (FFT) operation on the intermediate frequency signal to obtain the distance, intensity, and velocity information of the targets (e.g., objects disposed below the faucet 100, 200, 300, 400, 700). Based on the characteristics of the radar signals, the presence of an object, a location of the object, and/or a size and shape of an object below the faucet 100, 200, 300, 400, 700 may be identified.
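The chirp-mixing and Fourier-transform steps above can be sketched numerically. The chirp bandwidth, duration, and sample count below are illustrative assumptions, and the standard FMCW relation R = c * f_beat * T / (2 * B) converts the recovered beat (intermediate) frequency to range; a naive DFT stands in for the FFT:

```python
import cmath
import math

# Sketch of FMCW range extraction: the beat frequency of the mixed
# (intermediate frequency) signal is found with a Fourier transform and
# converted to range via R = c * f_beat * T / (2 * B). Chirp parameters
# below are illustrative assumptions, not values from the disclosure.

BANDWIDTH = 4e9        # Hz, swept bandwidth of the chirp (assumed)
CHIRP_TIME = 40e-6     # s, chirp duration (assumed)
C = 299_792_458.0      # m/s, speed of light
N = 256                # IF samples per chirp (assumed)
FS = N / CHIRP_TIME    # sampling rate of the IF signal

def range_from_if_signal(samples):
    """Locate the dominant beat frequency with a naive DFT; return range in m."""
    n = len(samples)
    spectrum = [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                        for t in range(n))) for k in range(n // 2)]
    k_peak = max(range(len(spectrum)), key=spectrum.__getitem__)
    f_beat = k_peak * FS / n
    return C * f_beat * CHIRP_TIME / (2 * BANDWIDTH)

# Synthesize an IF tone exactly on DFT bin 10 and recover the range.
f_true = 10 * FS / N
samples = [math.cos(2 * math.pi * f_true * t / FS) for t in range(N)]
print(round(range_from_if_signal(samples), 3))
```

The range resolution of this scheme is c / (2 * B) per DFT bin, which is why wide-bandwidth millimeter wave chirps resolve small depth differentials such as the rim and bottom of a cup.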
The faucet control system 500 includes a data processing module 540. The data processing module 540 is configured to process, interpret, or otherwise analyze data received from the image sensor 130 and/or range sensor 140. The data processing module 540 may include the image processing module 541 and the object classifier 542.
The image processing module 541 and the object classifier 542 may be or include any software, instructions, or other digital commands which are configured to process images received from the image sensor 130. In some examples, the image processing module 541 and/or object classifier 542 may be or include a neural network. The neural network may be a series of input layers, hidden layers, and output layer(s) which are configured to receive an input (e.g., an image), process the image to detect various characteristics within the image (e.g., at the hidden layers), and provide an output. The neural network may be trained prior to deployment. Hence, the neural network may be static at deployment (e.g., when processing images from the image sensor 130 in real-time).
The image processing module 541 may be or include software and/or hardware generally configured to identify objects within an image received from the image sensor 130. Since the image sensor 130 is mounted such that the image sensor's 130 field of view 131 includes a sink (e.g., 120), each of the images or the image data received from the image sensor 130 typically has fixed portions corresponding to the sink (e.g., 120). Hence, the images typically have a fixed (or set) background. The image processing module 541 may be configured to identify objects within images based on the difference between the background (e.g., the sink 120) and the images or image data received from the image sensor 130. The image processing module 541 may be configured to compare the images or image data received from the image sensor 130 to a static image of the sink (e.g., 120), for example, an image captured when no objects are present in the foreground of the sink. The static image of the sink (e.g., 120) may be stored locally (e.g., within memory 570). The image processing module 541 may be configured to identify objects within the images based on the comparison (e.g., when there is a difference between the static image of the sink 120 and the images received from the image sensor 130). The image processing module 541 may be capable of distinguishing an object or objects placed in (e.g., on a bottom surface of) a sink (e.g., 120) from those held below the faucet 100, 200, 300, 400, 700 by a user.
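The background-differencing approach described above can be sketched as a simple pixel comparison against the stored empty-sink image. The threshold, minimum pixel count, and tiny grayscale "images" below are illustrative assumptions, not parameters from the disclosure:

```python
def detect_foreground(background, frame, threshold=30):
    """Return the set of (row, col) pixels where the current frame
    differs from the stored empty-sink background image."""
    return {
        (r, c)
        for r, row in enumerate(frame)
        for c, pixel in enumerate(row)
        if abs(pixel - background[r][c]) > threshold
    }

def object_present(background, frame, min_pixels=4, threshold=30):
    """Report an object only when enough pixels changed, which filters
    out isolated sensor noise."""
    return len(detect_foreground(background, frame, threshold)) >= min_pixels

# Tiny 4x4 grayscale "images": the empty-sink background is uniform.
background = [[50] * 4 for _ in range(4)]
empty = [[52] * 4 for _ in range(4)]        # minor noise, no object
with_glass = [row[:] for row in background]
for r in range(2):                          # a bright 2x2 object region
    for c in range(2):
        with_glass[r][c] = 200
```

A production system would operate on full camera frames and add morphological filtering, but the comparison against a locally stored static background is the same.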
The object classifier 542 may be or include software and/or hardware configured to assign a classification to objects in the images or the image data. The object classifier 542 may be configured to assign the classification by, for instance, identifying various features within the portion of the image corresponding to the object, based on object matching, object edge detection and matching, model matching, interpretation trees, and the like. The object classifier 542 may include, incorporate, or otherwise use algorithms corresponding to the above-mentioned methods for classifying objects. In some examples, the object classifier 542 may be configured to use data (e.g., range data) from the range sensor 140 in conjunction with the image or image data to assign a classification to the object. For instance, the object classifier 542 may be configured to use the range sensor 140 in conjunction with the image data to identify scale or size of the object, which may provide a further input to classify the object. According to some examples, as noted above, the object classifier 542 may be configured to classify an object based on a point cloud generated using sensor data collected by a microwave or millimeter wave sensor.
Additionally, in some examples, the images or image data generated by the image sensor 130 and/or the range data generated by the range sensor 140 may be indicative of one or more gestures performed by a user for controlling the faucet 100, 200, 300, 400, 700. The image sensor 130 and/or range sensor 140 may detect one or more gestures performed below or in front of the faucet 100, 200, 300, 400, 700. Specifically, in some examples, the data processing module 540 may identify (e.g., classify) a user's hands in the image, image data, or range data. Further, the data processing module 540 may monitor or track a position of the user's hands over time to determine if one or more gestures for controlling the faucet 100, 200, 300, 400, 700 are performed. The gesture may be a wave or another specific gesture action (e.g., physical movement). In some examples, the data processing module 540 may determine a change in distance between the image sensor 130 and/or the range sensor 140 and a user's hands when determining whether or not a user has performed one or more gestures for controlling the faucet 100, 200, 300, 400, 700. The faucet control system 500 may be configured to direct a flow of fluid having a specific flow attribute into the sink based on one or more gestures detected by the faucet control system 500.
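One simple way to recognize a wave from tracked hand-to-sensor distances is to count direction reversals in the distance sequence. The reversal count and jitter threshold below are heuristic assumptions for illustration, not values from the disclosure:

```python
def is_wave_gesture(distances, min_reversals=2, min_delta=0.02):
    """Classify a sequence of hand-to-sensor distances (meters) as a
    wave when the direction of hand motion reverses enough times."""
    reversals = 0
    prev_dir = 0
    for a, b in zip(distances, distances[1:]):
        delta = b - a
        if abs(delta) < min_delta:   # ignore small measurement jitter
            continue
        direction = 1 if delta > 0 else -1
        if prev_dir and direction != prev_dir:
            reversals += 1
        prev_dir = direction
    return reversals >= min_reversals
```

An oscillating distance trace (hand moving back and forth) is classified as a wave, while a hand steadily approaching the sensor is not.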
According to some examples, the image processing module 541 and/or the object classifier 542 may be configured to identify both an object disposed below the faucet 100, 200, 300, 400, 700 and a gesture performed by a user, and the faucet control system 500 may control a flow attribute of water dispensed by the faucet 100, 200, 300, 400, 700 based on both the identified object and the gesture.
In one example, the data processing module 540 may be configured to identify (e.g., classify) a user's hand and/or a user's finger and a drinking glass disposed below the faucet 100, 200, 300, 400, 700. Specifically, the data processing module 540 may identify a user's hand and/or finger based on images or image data received from the image sensor 130 and/or range data received from the range sensor 140. Further, in some examples, the data processing module 540 may identify a position of the user's hand or finger relative to the drinking glass and fill the drinking glass to a height (e.g., volume) corresponding to the position of the user's hand or finger.
The valve control module 550 may be configured to generate control signals for one or more of the mixing valve 510, the water selection valve 520, the flow controller 580, the ozone generator 530, the water electrolyzer 590, the copper ionization system 591, the beverage printer 592, and the electro chlorine generator 593. In some examples, the valve control module 550 may be configured to generate control signals for a valve driver or valve drivers associated with one or more of the mixing valve 510, water selection valve 520, the flow controller 580, the ozone generator 530, the water electrolyzer 590, the copper ionization system 591, the beverage printer 592, and the electro chlorine generator 593.
The valve control module 550 and/or memory 570 may be configured to store fluid settings (e.g., temperature, volume, flow rate, fluid type, flow types) corresponding to various object classifications. Specifically, the valve control module 550 and/or the memory 570 may be configured to store a flow attribute comprising one of a fluid type or a flow type corresponding to various object classifications. In some examples, the valve control module 550 and/or memory 570 may store fluid settings and/or flow attributes as a look-up table. Each entry may correspond to an object classification. Each entry may include, for example, valve positions, durations for which valves are opened, a position of a flow controller 580, an on/off state of an ozone generator 530, and the like. In some examples, the fluid flow settings may be prestored settings (e.g., factory default settings). In some examples, a user may modify the fluid settings by updating an entry in the look-up table. The user may update various entries in the look-up table using, for example, an application executing on the user's mobile device. The user may update an entry in the look-up table via the mobile application, and the updated entry may be communicated from the mobile device to, for example, the valve control module 550 or memory 570 to update the look-up table.
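The look-up table described above can be sketched as a dictionary keyed by object classification, with factory defaults plus user modification and new-entry support. The specific entries and field names below are hypothetical, loosely based on the examples given later in this description:

```python
# Hypothetical factory-default fluid settings keyed by classification.
DEFAULT_SETTINGS = {
    "hands":          {"fluid_type": "tap",      "flow_type": "laminar",
                       "temperature_f": 100, "duration_s": 120},
    "drinking glass": {"fluid_type": "filtered", "flow_type": "laminar",
                       "temperature_f": 50,  "duration_s": None},
    "dish":           {"fluid_type": "tap",      "flow_type": "sweeping",
                       "temperature_f": 120, "duration_s": None},
    "fruit":          {"fluid_type": "ozone",    "flow_type": "soft spray",
                       "temperature_f": 70,  "duration_s": 30},
}

class SettingsTable:
    """Dynamic look-up table: ships with prestored (factory default)
    entries, and a user may modify existing entries or add new ones
    (e.g., via a mobile application)."""

    def __init__(self, defaults=DEFAULT_SETTINGS):
        self._table = {k: dict(v) for k, v in defaults.items()}

    def lookup(self, classification):
        """Return the fluid settings for a classification, or None."""
        return self._table.get(classification)

    def update_entry(self, classification, **settings):
        """Update an entry; creates it if absent (a "new entry")."""
        self._table.setdefault(classification, {}).update(settings)
```

A mobile-app update would then reduce to a single `update_entry` call communicated to the valve control module 550 or memory 570.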
The valve control module 550 may be configured to receive or add new entries to the look-up table. For instance, the user may select a “new entry” option on their mobile device. The user may capture and upload (e.g., via their mobile device) one or more photographs of the object, for instance, while the object is positioned in the sink 120 or otherwise beneath the faucet 100, 200, 300, 400, 700. The valve control module 550 may be configured to receive the photographs from the user's mobile device. The user may select fluid settings on the mobile device to provide when the object in the photographs is detected. The valve control module 550 may be configured to receive the fluid settings. The valve control module 550 may be configured to update the look-up table to the new entry from the user. Hence, the look-up table may be dynamic, in that a user may modify existing entries, add new entries, and so forth (e.g., including flow attributes).
The valve control module 550 may be configured to cross-reference a classification assigned to an object in an image or image data from the image sensor 130 with the look-up table to identify fluid settings, for example, a flow attribute, for the object below the faucet 100, 200, 300, 400, 700. The fluid settings may include temperature, volume, flow rate, fluid type, flow types, and the like. The fluid settings may correspond to fluid characteristics of fluid flowing from the faucet 100, 200, 300, 400, 700. The valve control module 550 may control the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and/or electro chlorine generator 593 in accordance with the fluid settings to achieve the one or more fluid characteristics. Various examples of objects and corresponding fluid settings are provided below; however, the present disclosure is not limited thereto.
According to one example, a user may position their hands beneath a faucet 100, 200, 300, 400, 700. In some examples, the range sensor 140 and/or data processing module 540 may determine the presence of an object below the faucet, and the image sensor 130 may begin capturing images (e.g., image data). The image sensor 130 may then capture an image of the user's hands below the faucet. In other examples, the image sensor 130 may continually (e.g., periodically) capture images of the contents of the sink 120. The image sensor 130 may transmit the image to the faucet control system 500. The image processing module 541 may identify the user's hands in the image based on a difference (or differences) between the background, for example, the sink 120, and the foreground, for example, the user's hands in the image. The object classifier 542 may identify features of the object within the image and assign a classification of “hands” to the object. The valve control module 550 may identify the classification of the object assigned by the object classifier 542. The valve control module 550 may identify fluid settings or characteristics, specifically, a flow attribute, based on the object classification (e.g., by cross-referencing the assigned classification of the object with data in the look-up table). The valve control module 550 may generate control signals to one or more of the mixing valve 510, water selection valve 520, flow controller 580, the ozone generator 530, the water electrolyzer 590, the copper ionization system 591, the beverage printer 592, and electro chlorine generator 593 to adjust fluid settings based on the classification of the object positioned beneath the faucet.
In this example, the flow attributes may provide for tap water to be dispensed from the faucet with a laminar flow type. Further, the fluid settings may provide for warm water to be dispensed from the faucet for a predetermined period of time (e.g., 2 minutes). The valve control module 550 may control the water selection valve 520 to provide tap water (e.g., from a potable water plumbing network) to the faucet and may control the flow controller 580 such that water is dispensed through an outlet or outlets configured to provide a laminar flow.
In another example, a user may position a drinking glass or a water bottle beneath a faucet (e.g., 100, 200, 300, 400, 700). The image sensor 130 may capture an image of the drinking glass or the water bottle (e.g., in response to the range sensor 140 and/or data processing module 540 determining the presence of an object below the faucet). The image sensor 130 may transmit the image to the faucet control system 500. The image processing module 541 may identify the drinking glass or the water bottle in the image (e.g., based on a difference between the background and the drinking glass or water bottle). The object classifier 542 may assign a classification of “drinking glass” or “water bottle,” respectively, to the object. The valve control module 550 may identify the classification of the object assigned by the object classifier 542. The valve control module 550 may identify fluid settings or characteristics, specifically, the flow attributes, based on the object classification and generate control signals to one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 to adjust fluid settings or characteristics based on the classification of the object placed below the faucet.
In this example, the flow attributes may provide for filtered water or pH controlled (e.g., alkaline) water to be dispensed from the faucet with a laminar flow type. Further, in some examples, the fluid settings may provide for cold water to be dispensed from the faucet. The valve control module 550 may control the water selection valve 520 to provide filtered water or pH controlled (e.g., alkaline) water to the faucet and control the flow controller 580 such that water is dispensed through an outlet or outlets configured to provide a laminar flow. In some examples, the range data generated by the range sensor 140 may be used by the data processing module 540 to determine a depth of the drinking glass or the water bottle and fill the drinking glass or the water bottle to a predetermined volume (e.g., 70% full, 85% full, etc.).
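Converting range data into a fill target could work roughly as follows: two range readings (to the bottom of the empty glass and to its rim) give the glass height, and a simple cylinder model converts the predetermined fill fraction into a dispense volume. The cylinder assumption, the example dimensions, and the function name are all illustrative:

```python
import math

def fill_volume_ml(bottom_range_m, rim_range_m, inner_diameter_m,
                   fill_fraction=0.85):
    """Estimate how much water to dispense into a glass modeled as a
    cylinder.

    bottom_range_m: range reading to the bottom of the empty glass
    rim_range_m:    range reading to the glass rim
    Their difference is the glass height; the target fill fraction
    (e.g., 85% full) scales the cylinder volume, returned in mL."""
    height = bottom_range_m - rim_range_m
    radius = inner_diameter_m / 2
    volume_m3 = math.pi * radius ** 2 * height * fill_fraction
    return volume_m3 * 1e6  # m^3 -> mL
```

For a glass 0.12 m tall and 0.07 m across, filling to 85% works out to roughly 390 mL.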
In another example, a user may position a dish (e.g., a dirty dish) beneath the faucet (e.g., 100, 200, 300, 400, 700) for cleaning. The image sensor 130 may capture an image of the dish (e.g., in response to the range sensor 140 and/or data processing module 540 determining the presence of an object below the faucet). The image sensor 130 may transmit the image to the faucet control system 500. The image processing module 541 may identify the dish in the image (e.g., based on a difference between the background and the dish). The object classifier 542 may assign a classification of “dish” to the object. The valve control module 550 may identify the classification of the object assigned by the object classifier 542. The valve control module 550 may identify fluid settings or characteristics, specifically, flow attributes, based on the object classification and generate control signals to one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 to adjust fluid settings or characteristics based on the classification of the object placed below the faucet.
In this example, the flow attributes may provide for tap water to be dispensed from the faucet with sweeping or blade (e.g., narrow and wide) flow type. Further, in some examples, the fluid settings may provide for hot water to be dispensed from the faucet. The valve control module 550 may control the water selection valve 520 to provide tap water to the faucet and control the flow controller 580 such that water is dispensed through an outlet or outlets configured to provide a sweeping or blade flow type.
In some examples, the data processing module 540 may determine a cleanliness classification or a level of contamination of the dish or another object beneath the faucet. For example, the image processing module 541 may identify and determine a quantity (e.g., area, volume) of contaminant (e.g., food, etc.) on the dish. Further, in some examples, the object classifier 542 may assign a cleanliness classification to the dish (or another object) beneath the faucet. In some examples, the valve control module 550 may identify flow settings, specifically, flow attributes, corresponding to the cleanliness classification and control one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 at least in part based on the cleanliness classification level of the object beneath the faucet. For example, a flow rate of fluid dispensed from the faucet may increase as the cleanliness classification decreases (as the object is more contaminated) and/or a flow type (e.g., flow attribute) may be selected in part based on the contamination level of the object.
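The cleanliness-to-flow mapping above can be sketched as a small function: flow rate rises as the cleanliness score falls, and a more aggressive flow type kicks in below a threshold. The 0-to-1 cleanliness scale, the rate bounds, and the 0.5 threshold are assumed values for illustration:

```python
def flow_settings_for_cleanliness(cleanliness, base_rate_lpm=4.0,
                                  max_rate_lpm=8.0):
    """Map a cleanliness score (1.0 = clean, 0.0 = heavily
    contaminated) to a flow rate and flow type: dirtier objects get a
    higher flow rate and a more aggressive flow type."""
    cleanliness = max(0.0, min(1.0, cleanliness))  # clamp to [0, 1]
    rate = base_rate_lpm + (max_rate_lpm - base_rate_lpm) * (1.0 - cleanliness)
    flow_type = "blade" if cleanliness < 0.5 else "sweeping"
    return {"flow_rate_lpm": round(rate, 2), "flow_type": flow_type}
```

A lightly soiled dish thus receives a gentle sweeping flow at a modest rate, while a heavily contaminated one receives the maximum rate with a blade flow type.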
In another example, a user may position a fruit or vegetable beneath a faucet (e.g., 100, 200, 300, 400, 700). The image sensor 130 may capture an image of the fruit or vegetable (e.g., in response to the range sensor 140 and/or data processing module 540 determining the presence of an object below the faucet). The image sensor 130 may transmit the image to the faucet control system 500. The image processing module 541 may identify the fruit or vegetable in the image (e.g., based on a difference between the background and the fruit or vegetable). The object classifier 542 may assign a classification of “fruit” or “vegetable”, respectively, to the object. The valve control module 550 may identify the classification of the object assigned by the object classifier 542. The valve control module 550 may identify fluid settings or characteristics, specifically, a flow attribute, based on the object classification and generate control signals to one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 to adjust fluid settings or characteristics based on the classification of the object placed below the faucet.
In this example, the flow attributes may provide for ozone water or pH controlled water to be dispensed from the faucet with a soft (e.g., relatively low velocity) spray type for cleaning the fruit or vegetable. The valve control module 550 may control the ozone generator to produce and/or entrain ozone in a flow of water provided (e.g., from a potable water plumbing network) to the faucet. Further, the valve control module 550 may control the flow controller 580 such that water is dispensed through an outlet or outlets configured to provide a soft spray flow type.
In some examples, a field of view 131 of the image sensor 130 may include a portion or portions of a surface 121 (e.g., countertop) adjacent to the sink 120 and faucet 100, 200, 300, 400, 700. In these examples, the data processing module 540 (e.g., image processing module 541, object classifier 542) may be configured to identify or classify a surface adjacent to the faucet 100, 200, 300, 400, 700. Additionally, the data processing module 540 may be configured to identify a volume of fluid (e.g., water) on the surface 121 proximate to the faucet. In these examples, when the data processing module 540, specifically, the image processing module 541 and/or object classifier 542, identifies a volume of water on the surface 121 adjacent to the sink, the valve control module 550 may reduce a flow rate and/or change a flow type of water dispensed from the faucet to prevent additional water from traveling (e.g., splashing) outside of the sink 120. In some examples, when the data processing module 540 identifies a volume of water on the surface 121 adjacent to the sink 120, the valve control module 550 may send one or more control signals to a drain valve 125, actuating the drain valve 125 to an open state.
According to some examples, a field of view 131 of the image sensor 130 and/or the range sensor 140 may include an area in which a user would stand at or in front of the faucet 100, 200, 300, 400, 700 when using the faucet 100, 200, 300, 400, 700. In these examples, the data processing module 540 may determine when a user is standing at or in front of the faucet 100, 200, 300, 400, 700. According to some examples, the faucet control system 500 may control the faucet 100, 200, 300, 400, 700 to automatically dispense a flow of water when a user is standing at the faucet 100, 200, 300, 400, 700. According to some examples, the image sensor 130 and/or the range sensor 140, for example, a millimeter wave radar sensor, may be configured to identify when a user steps or moves away from the faucet 100, 200, 300, 400, 700, and the faucet control system 500 may control the faucet 100, 200, 300, 400, 700 to stop dispensing a flow of water and/or reduce a flow rate of water dispensed from the faucet 100, 200, 300, 400, 700 in response.
In some examples, the faucet control system 500 may be configured to indicate when an object is placed below a faucet 100, 200, 300, 400, 700 and the processor 560 and/or image processing module 541 is not able to identify an object placed below the faucet. For example, the faucet control system 500 may be configured to cause an indicator light included in the faucet (e.g., 100, 200, 300, 400, 700) to illuminate or cause a speaker included in the faucet to produce a noise.
In some examples, the faucet control system 500 may be configured to indicate when no object is identified below the faucet 100, 200, 300, 400, 700. In one example, the faucet 100, 200, 300, 400, 700 may include an indicator light and the indicator light may illuminate when no object is disposed below the faucet 100, 200, 300, 400, 700. In another example, the indicator light may illuminate when an object is detected below the faucet 100, 200, 300, 400, 700, and remain off when no object is detected below the faucet 100, 200, 300, 400, 700. In some examples, the faucet 100, 200, 300, 400, 700 may include an indicator light configured to illuminate and/or a speaker configured to generate a noise when a faucet 100, 200, 300, 400, 700 is dispensing fluid and no object is detected or disposed below the faucet 100, 200, 300, 400, 700.
According to some examples, the faucet control system 500 may include a display or one or more indicator lights configured to display a type of water and/or a flow type of water dispensed by the faucet 100, 200, 300, 400, 700.
In some examples, the faucet control system 500 may receive an ambient temperature from a temperature sensor (e.g., thermometer) included in the faucet (e.g., 100, 200, 300, 400, 700) and adjust a temperature of fluid (e.g., using the mixing valve 510) based on the ambient temperature proximate to the faucet. For example, when high temperature fluid is required, the faucet control system 500 may cause the mixing valve 510 to provide a fluid that is thirty degrees (e.g., Fahrenheit, Celsius) warmer than the ambient air proximate to the faucet.
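The ambient-relative temperature rule above reduces to a one-line computation. The safety clamp added here (and its limits) is an assumption for the sketch, not part of the disclosure; the thirty-degree offset follows the example in the text:

```python
def target_temperature_f(ambient_f, offset_f=30.0,
                         min_f=60.0, max_f=120.0):
    """Compute a hot-water target as a fixed offset above ambient,
    clamped to an assumed safe band (the clamp limits are
    illustrative, not from the disclosure)."""
    return max(min_f, min(max_f, ambient_f + offset_f))
```

At a 70 degree ambient this targets 100 degrees, while very hot or very cold rooms hit the clamp rather than producing scalding or tepid water.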
In some examples, the faucet control system 500 (e.g., processor 560, image processing module 541) may be configured to determine a quantity of fluid in a sink 120 disposed below the faucet (e.g., 100, 200, 300, 400, 700) and may cause the faucet to stop dispensing fluid into the sink 120 when the sink is nearly (e.g., 85%, 90%, 95%) full to prevent the sink 120 from overflowing. In some examples, the faucet control system 500 may further illuminate an indicator light or cause a speaker to generate a sound when the sink is nearly full. In some examples, the faucet control system 500 may cause a drain valve disposed in the sink or a drain pipe proximate to the sink to open when the sink 120 is nearly full.
In some examples, the processor 560 and/or the valve control module 550 may control the faucet 100, 200, 300, 400, 700 according to one or more learned models or neural networks. For example, one or more learned models or neural networks may be used to analyze (e.g., classify) an object below the faucet. Additionally, one or more learned models or neural networks may be used to select or identify a fluid setting based on the classified object. In some examples, a single learned model or neural network may classify an object disposed below the faucet 100, 200, 300, 400, 700 and select or identify a fluid setting based on the identified object.
In some examples, a learned model for classifying an object and/or a learned model for selecting a fluid setting based on the classified object may be a multi-layered learned model or neural network wherein each layer comprises a different learned model or neural network for classifying an object and/or selecting a fluid setting. For example, a different learned model or neural network may be used to identify an object below the faucet, analyze raw image data including other objects besides the object of interest (e.g., objects disposed in the sink below the object of interest), and/or analyze a cropped image that is tight around the object of interest. In some examples, another learned model or neural network may receive an output from two or more constituent layers for classifying an object and/or selecting a fluid setting. In some examples, the image data may be provided to multiple neural networks at once (e.g., in parallel) and/or in series.
An image or image data (e.g., collected, generated by the image sensor 130) may be input into a learned model for identifying or classifying an object and the learned model may output a classification of an object included in the image or image data. The learned model for identifying or classifying objects may be trained with objects having known classifications. In some examples, a ground truth set of images including objects having known classifications may be input into the learned model. In some examples, manual (e.g., human) verification of object classifications may be used to train the learned model.
In some examples, different learned models may be developed (e.g., trained) for different applications. For example, a different learned model may be trained for each of faucets disposed in a kitchen and faucets disposed in a bathroom. Example classifications, as noted above, include hands, dishes, a drinking glass, a water bottle, a fruit, a vegetable, a toothbrush, and the like.
In some examples, a learned model or learned models may be used to identify or classify a cleanliness of an object disposed below the faucet 100, 200, 300, 400, 700. The learned model for classifying a cleanliness of an object may be trained with objects having known cleanliness classifications. In some examples, a fluid setting (e.g., fluid type, fluid flow type) may be selected at least in part, based on a cleanliness classification of the object.
In some examples, a learned model or neural network may be configured to adjust a fluid setting corresponding to a classified object. Specifically, a learned model or neural network may be configured to adjust a (e.g., stored) fluid setting corresponding to a classified object when a user manually adjusts a fluid setting while the classified object is disposed below the faucet 100, 200, 300, 400, 700. For example, when it is determined that a drinking glass is disposed below the faucet, the faucet dispenses fluid having a temperature of 50° Fahrenheit; if a user manually adjusts the faucet to dispense a fluid having a temperature of 40° Fahrenheit, the learned model may decrease a (e.g., stored) fluid temperature associated with the drinking glass. In one example, the learned model may decrease the fluid temperature associated with the drinking glass by a difference between the stored temperature and the manually set temperature times a multiplier (e.g., 0.3). One or more equations may be used to adjust a fluid setting, for example, Stored Temperature ± Multiplier × |Stored Temperature − Manually Set Temperature|, with the sign chosen so that the stored setting moves toward the manually set value. In other examples, the learned model may be configured to change another fluid setting, such as fluid type and/or flow type, based on one or more manual adjustments by a user. In some examples, the learned model may continuously adjust a fluid setting associated with the classified object. Similarly, if a user manually changes a water type and/or a flow type of water dispensed by the faucet 100, 200, 300, 400, 700 (e.g., a threshold number of times) when a classified object is disposed below the faucet 100, 200, 300, 400, 700, the learned model may change a water type and/or flow type associated with the classified object.
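The adjustment equation above can be implemented directly; each manual override nudges the stored setting a fraction of the way toward the user's chosen value, so repeated overrides converge on the user's preference. The function name is illustrative; the 0.3 multiplier follows the example in the text:

```python
def adjust_setting(stored, manual, multiplier=0.3):
    """Apply Stored ± Multiplier × |Stored − Manual|, with the sign
    chosen so the stored fluid setting moves toward the user's
    manually set value."""
    if manual == stored:
        return stored
    step = multiplier * abs(stored - manual)
    return stored + step if manual > stored else stored - step
```

Using the drinking-glass example: a stored 50°F setting with a manual override to 40°F becomes 50 − 0.3 × 10 = 47°F after one adjustment.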
Referring to the flowchart 800, a method for controlling a faucet (e.g., 100, 200, 300, 400, 700) is described according to the following acts.
In a first act S101, a faucet control system 500 receives an image or image data from an image sensor 130. Specifically, a processor 560 and/or image processing module 541 may receive the image or image data from the image sensor 130.
In a second act S103, a faucet control system 500 may analyze the image or image data received from the image sensor 130 to assign a classification to an object included in the image or the image data. Specifically, the processor 560 and/or the image processing module 541 and the object classifier 542 may analyze the image or the image data to assign a classification. For example, the processor 560 or image processing module 541 may identify the object in the image based on a difference or differences between the background (e.g., sink 120) and the foreground or object below the faucet. Further, the processor 560 or object classifier 542 may assign a classification to the object. For example, the processor 560 or object classifier 542 may identify features of the object and classify the object based on the identified features.
In a third act S105, at least one of a fluid type and a fluid flow type (e.g., flow type) may be selected based on the assigned classification. Specifically, the valve control module 550 may identify the classification of the object assigned by the object classifier 542 and may identify fluid settings or characteristics, for example, fluid type and flow type, based on the object classification, for example, by cross-referencing the assigned classification of the object with data in a look-up table.
In a fourth act S107, a flow of fluid including the selected fluid type or having the selected flow type is dispensed from the faucet. Specifically, the valve control module 550 may generate one or more control signals for one or more of the mixing valve 510, water selection valve 520, flow controller 580, ozone generator 530, water electrolyzer 590, copper ionization system 591, beverage printer 592, and electro chlorine generator 593 to adjust fluid settings (e.g., fluid type, fluid flow type) based on the classification of the object positioned beneath the faucet. Further, the valve control module 550 may control or cause the faucet to dispense a flow of fluid having the selected or assigned fluid settings (e.g., fluid type, fluid flow type) into the sink 120.
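Acts S101 through S107 can be condensed into a single control function: given the classification assigned to the imaged object, look up the fluid type and flow type and emit a dispense command. The command dictionary, the fallback action, and the two-entry table below are illustrative assumptions:

```python
def control_cycle(classification, table):
    """Condensed S101-S107 flow: the classification stands in for the
    captured and analyzed image (acts S101/S103); the table lookup
    selects fluid and flow type (act S105); the returned command
    represents dispensing (act S107)."""
    settings = table.get(classification)
    if settings is None:
        # No matching entry: signal the user (e.g., indicator light).
        return {"action": "indicate_unrecognized"}
    return {"action": "dispense",
            "fluid_type": settings["fluid_type"],
            "flow_type": settings["flow_type"]}

# Minimal hypothetical look-up table for the examples above.
LOOKUP = {
    "drinking glass": {"fluid_type": "filtered", "flow_type": "laminar"},
    "dish":           {"fluid_type": "tap",      "flow_type": "sweeping"},
}
```

A classified dish thus yields a tap-water sweeping flow, while an unrecognized object yields an indication rather than a dispense command.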
In some examples, the flowchart 800 may further include an act of receiving, by the faucet control system 500 (e.g., processor 560, data processing module 540) range data from the range sensor 140, the range data indicating a presence of an object below the faucet.
Referring to the apparatus 900 for controlling a faucet (e.g., 100, 200, 300, 400, 700), the apparatus 900 may include a controller 950, a database 903, a memory 904, a computer readable medium 905, a display 912, a user input device 913, and a communication interface 914.
The contents of the database 903 may include, for example, a look-up table including object classifications and one or more flow settings associated with or corresponding to each object classification. The memory 904 may be a volatile memory or a non-volatile memory. The memory 904 may include one or more read only memory (ROM), random access memory (RAM), a flash memory, an electronic erasable program read only memory (EEPROM), or other type of memory. The memory 904 may be removable from the apparatus 900, such as a secure digital (SD) memory card.
The memory 904 and/or the computer readable medium 905 may include a set of instructions that can be executed to cause the controller to perform any one or more of the methods or computer-based functions disclosed herein. For example, the controller 950 may send one or more controller signals and/or electric current to one or more of the mixing valve 510, water selection valve 520, flow controller 580, and ozone generator 530, for example, performing various acts of the flowchart 800.
A user may enter a new object classification and/or update a look-up table including object classifications and associated fluid settings using the display 912 and/or user input device 913. The display 912 may comprise a screen and the user input device 913 may comprise one or more buttons on the apparatus 900. In some examples, the display 912 and user input device 913 may comprise a touch sensitive surface (i.e., a touch screen). In some examples, the user input device 913 may include a microphone configured to receive one or more verbal or voice activation commands for controlling the faucet.
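The update operation described above can be sketched briefly. The table layout and the function name `upsert_classification` are assumptions for illustration only; the disclosure does not specify a data format for the look-up table in the database 903.

```python
# Illustrative sketch of a user adding or updating an entry in the
# look-up table of object classifications and fluid settings, as
# might occur via the display 912 / user input device 913.

# Hypothetical in-memory representation of the look-up table.
lookup_table = {
    "drinking_glass": {"fluid_type": "filtered", "flow_type": "laminar"},
    "dish": {"fluid_type": "tap", "flow_type": "sweep"},
}


def upsert_classification(table, name, fluid_type, flow_type):
    """Add a new object classification or update an existing one."""
    table[name] = {"fluid_type": fluid_type, "flow_type": flow_type}
    return table


# Example: a user registers a new classification for baby bottles.
upsert_classification(lookup_table, "baby_bottle", "filtered", "laminar")
```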
The communication interface 914 may be connected to the network 920, which may be the internet. In some examples, the network 920 may be connected to one or more mobile devices 922. The one or more mobile devices may be configured to send a signal to the communication interface 914 via the network 920. For example, one or more mobile devices may send a signal to the communication interface to enter a new object classification and/or update a look-up table including object classifications and associated fluid settings.
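A message from a mobile device 922 to the communication interface 914 might be serialized as shown below. The JSON wire format, field names, and command string are purely illustrative assumptions; the disclosure does not define a protocol for these signals.

```python
import json

# Hypothetical payload a mobile device 922 might send over the
# network 920 to update the look-up table via the communication
# interface 914.
message = {
    "command": "upsert_classification",          # illustrative command
    "classification": "coffee_mug",              # illustrative object
    "fluid_settings": {"fluid_type": "filtered",
                       "flow_type": "laminar"},
}
payload = json.dumps(message)

# On receipt, the apparatus 900 would parse and apply the update.
received = json.loads(payload)
```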
The communication interface 914 may include any operable connection. An operable connection may be one in which signals, physical connections and/or logical communications may be sent and/or received. An operable connection may include a physical interface, an electrical interface, and/or a data interface. The communication interface 914 provides for wireless and/or wired communications in any known or later developed format.
According to some examples, the apparatus 900 for controlling a faucet 100, 200, 300, 400, 700 may be implemented on a stand-alone, battery-operated device (e.g., a wireless puck). For example, a wireless puck may be disposed proximate to the faucet 100, 200, 300, 400, 700, for example, in a sink 120, or on a surface 121 (e.g., countertop) proximate to the faucet 100, 200, 300, 400, 700.
When a component, device, element, or the like of the present disclosure is described as having a purpose or performing an operation, function, or the like, the component, device, or element should be considered herein as being “configured to” meet that purpose or to perform that operation or function.
As utilized herein, the terms “approximately,” “about,” “substantially,” and similar terms are intended to have a broad meaning in harmony with the common and accepted usage by those of ordinary skill in the art to which the subject matter of this disclosure pertains. It should be understood by those of skill in the art who review this disclosure that these terms are intended to allow a description of certain features described and claimed without restricting the scope of these features to the precise numerical ranges provided. Accordingly, these terms should be interpreted as indicating that insubstantial or inconsequential modifications or alterations of the subject matter described and claimed are considered to be within the scope of the disclosure as recited in the appended claims.
It should be noted that the term “exemplary” and variations thereof, as used herein to describe various embodiments, are intended to indicate that such embodiments are possible examples, representations, or illustrations of possible embodiments (and such terms are not intended to connote that such embodiments are necessarily extraordinary or superlative examples).
The term “coupled” and variations thereof, as used herein, means the joining of two members directly or indirectly to one another. Such joining may be stationary (e.g., permanent or fixed) or moveable (e.g., removable or releasable). Such joining may be achieved with the two members coupled directly to each other, with the two members coupled to each other using a separate intervening member and any additional intermediate members coupled with one another, or with the two members coupled to each other using an intervening member that is integrally formed as a single unitary body with one of the two members. If “coupled” or variations thereof are modified by an additional term (e.g., directly coupled), the generic definition of “coupled” provided above is modified by the plain language meaning of the additional term (e.g., “directly coupled” means the joining of two members without any separate intervening member), resulting in a narrower definition than the generic definition of “coupled” provided above. Such coupling may be mechanical, electrical, or fluidic.
The term “or,” as used herein, is used in its inclusive sense (and not in its exclusive sense) so that when used to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless specifically stated otherwise, is understood to convey that an element may be either X, Y, Z; X and Y; X and Z; Y and Z; or X, Y, and Z (i.e., any combination of X, Y, and Z). Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present, unless otherwise indicated.
References herein to the positions of elements (e.g., “top,” “bottom,” “above,” “below”) are merely used to describe the orientation of various elements in the FIGURES. It should be noted that the orientation of various elements may differ according to other exemplary embodiments, and that such variations are intended to be encompassed by the present disclosure.
Although the figures and description may illustrate a specific order of method steps, the order of such steps may differ from what is depicted and described, unless specified differently above. Also, two or more steps may be performed concurrently or with partial concurrence, unless specified differently above. Such variation may depend, for example, on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations of the described methods could be accomplished with standard programming techniques with rule-based logic and other logic to accomplish the various connection steps, processing steps, comparison steps, and decision steps.
It is important to note that the construction and arrangement of the system as shown in the various exemplary embodiments is illustrative only. Additionally, any element disclosed in one embodiment may be incorporated or utilized with any other embodiment disclosed herein. Although only one example of an element from one embodiment that can be incorporated or utilized in another embodiment has been described above, it should be appreciated that other elements of the various embodiments may be incorporated or utilized with any of the other embodiments disclosed herein.
This application claims priority benefit of Provisional Application No. 63/595,106 (Docket No. 10222-23009A) filed on Nov. 1, 2023, which is hereby incorporated by reference in its entirety.
Number | Date | Country
---|---|---
63595106 | Nov 2023 | US