The present disclosure relates to quantitative systems. More particularly, the present disclosure relates to a method and a system for real-time determination of the quantity of food consumed by users.
Consider a scenario where a group of people having lunch at a restaurant would like to share the bill for the food ordered and consumed. Currently, there are numerous applications where a user inputs the types of food ordered, the number of people sharing the bill, the amount of food consumed, etc. The application then generates a bill for each user sharing the bill. Here, users have to manually key in the inputs to the application. Also, the inputs may not be accurate, and hence the bill generated may not be according to the actual amount of food consumed by each user. Hence, the users will not contribute accurately towards the bill generated by the conventional devices and systems. Thus, existing systems do not provide an itemized bill according to the food consumed by users.
In an embodiment, the present disclosure discloses a method for determining quantity of food consumed by users. The method comprises receiving, by a determination unit, one or more inputs from a first set of sensors and a second set of sensors associated with the determination unit, where the first set of sensors and the second set of sensors monitor one or more users and food served to the one or more users respectively, identifying each type of food from the food served, identifying each of the one or more users and actions performed by each of the one or more users to consume the food and determining quantity of each type of food consumed by each of the one or more users.
In an embodiment of the present disclosure, a determination unit for determining quantity of food consumed by users is disclosed. The determination unit comprises a processor and a memory communicatively coupled to the processor, storing processor executable instructions. The processor is configured to receive one or more inputs from a first set of sensors and a second set of sensors, associated with the determination unit, where the first set of sensors and the second set of sensors monitor one or more users and food served to the one or more users respectively, identify each type of food from the food served, identify each of the one or more users and actions performed by each of the one or more users to consume the food and determine quantity of food consumed by each of the one or more users.
In an embodiment, the present disclosure provides a system for determining quantity of food consumed by users. The system comprises a first set of sensors to monitor one or more users, a second set of sensors to monitor food served to the one or more users and a determination unit to receive one or more inputs from the first set of sensors and the second set of sensors, associated with the determination unit, identify each type of food from the food served, identify each of the one or more users and actions performed by each of the one or more users to consume the food and determine quantity of food consumed by each of the one or more users.
In another embodiment, a non-transitory computer-readable storage medium for determining quantity of food consumed by users is disclosed, storing instructions which, when executed by a computing device, cause the computing device to perform operations comprising receiving one or more inputs from a first set of sensors and a second set of sensors, where the first set of sensors and the second set of sensors monitor one or more users and food served to the one or more users respectively, identifying each type of food from the food served, identifying each of the one or more users and actions performed by each of the one or more users to consume the food and determining quantity of each type of food consumed by each of the one or more users.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
The novel features and characteristic of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures wherein like reference numerals represent like elements and in which:
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
In the present document, the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or implementation of the present subject matter described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises . . . a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
Embodiments of the present disclosure relate to a method and a system for determining quantity of food consumed by users. The system comprises one or more sets of sensors to monitor users, food served, actions of the users, etc. The system then determines quantity of each type of food consumed by each user by monitoring one or more actions of the user. Thereby, the system determines an itemized bill according to the quantity of food consumed by each user.
The first set of sensors 103 may monitor users 107. In an embodiment, the users 107 may be referred to as one or more users 107 hereinafter in the present disclosure. In an embodiment, the one or more users 107 may refer to persons consuming food 108. In an embodiment, the first set of sensors may include but are not limited to one or more of at least one Red, Green, Blue (RGB) camera, at least one RGB-D (Red, Green, Blue-Depth) camera, at least one spectral camera, at least one Infra-Red (IR) camera or at least one hyperspectral camera.
The second set of sensors 104 monitor the food 108. Particularly, the second set of sensors 104 monitor the food 108 served to the one or more users 107. In an embodiment, the second set of sensors may include but are not limited to one or more of at least one biosensor, at least one image sensor, at least one thermal sensor or at least one laser sensor.
The determination unit 101 receives one or more inputs from the first set of sensors 103 and the second set of sensors 104 respectively. Further, the determination unit 101 determines each type of food 108 served based on the one or more inputs received from the second set of sensors 104. Likewise, the determination unit 101 identifies each of the one or more users 107 and actions performed by each of the one or more users 107 to consume the food 108 based on the one or more inputs received from the first set of sensors 103. The determination unit 101 determines quantity of each type of food 108 consumed by each of the one or more users 107 based on identified actions performed by each of the one or more users 107. In an embodiment, the determination unit 101 retrieves data from the database 106, regarding amount of food 108 ordered by the one or more users 107. Then, the determination unit 101 calculates a bill for each of the one or more users based on the identified type of food 108, the quantity of food consumed by respective one or more users and the amount of food 108 ordered by the one or more users.
In an embodiment, the database 106 may be associated with a server (not shown in figure) of a service provider providing food service to the one or more users 107. The database 106 is connected to the determination unit by at least one of a wired interface and a wireless interface.
In an embodiment, the display unit 105 displays results generated by the determination unit 101. The display unit 105 can be connected to the determination unit through wired interface or wireless interface.
In an embodiment, one or more data 204 may be stored within the memory 202. The one or more data 204 may include, for example, first set of sensors data 205, second set of sensors data 206, food ordered data 207 and other data 208. The first set of sensors data 205 includes parameters related to the one or more users 107. The parameters may comprise user Identity (ID), facial recognition data, action recognition data, etc. The second set of sensors data 206 includes parameters related to food 108 served to the one or more users 107. The parameters may comprise type of food 108 served, amount of food 108 served, etc.
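The data 204 described above can be sketched as simple record types. The field names below are illustrative assumptions based on the parameters listed in this embodiment, not a prescribed schema:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the data 204 held in the memory 202.
@dataclass
class FirstSetSensorsData:      # data 205: parameters related to the users 107
    user_id: str
    facial_features: list = field(default_factory=list)
    recognized_actions: list = field(default_factory=list)

@dataclass
class SecondSetSensorsData:     # data 206: parameters related to the food 108 served
    food_type: str
    amount_served: float        # e.g., unit count or grams

@dataclass
class FoodOrderedData:          # data 207: what the users 107 ordered
    food_type: str
    amount_ordered: float

record = FirstSetSensorsData(user_id="U1")
print(record.user_id)  # U1
```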
In an embodiment, the food ordered data 207 includes amount of food 108 ordered, type of food 108 ordered, etc.
The other data 208 may be used to store data, including temporary data and temporary files, generated by the modules 209 for performing various functions of the determination unit 101.
In an embodiment, the one or more data 204 in the memory 202 is processed by modules 209 of the determination unit 101. As used herein, the term module refers to an algorithm running on an application specific integrated circuit (ASIC), an electronic circuit, a field-programmable gate array (FPGA), a Programmable System-on-Chip (PSoC), a combinational logic circuit, and/or other suitable components that provide the described functionality. The said modules 209, when configured with the functionality defined in the present disclosure, will result in novel hardware.
In one implementation, the modules 209 may include, for example, food identification module 210, user identification module 211, quantity determination module 212, bill generation module 213 and other modules 214. It will be appreciated that such aforementioned modules 209 may be represented as a single module or a combination of different modules.
In an embodiment, the food identification module 210 identifies the type of food 108 served to the one or more users 107. The food identification module 210 receives the one or more inputs from the second set of sensors 104. The food identification module 210 may receive the one or more inputs from the second set of sensors at predefined intervals of time. The food identification module 210 uses image processing techniques to identify the type of food 108. For example, the food identification module may receive images of food 108 served as inputs. The images can be compared with reference images stored in the database 106 to identify the type of food 108.
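The comparison of sensed images against reference images in the database 106 can be illustrated with a minimal nearest-neighbour sketch. The feature vectors (e.g., colour histograms) and food labels below are hypothetical examples, not data from the disclosure:

```python
import math

# Hypothetical reference data: food type -> a precomputed feature vector
# (e.g., a colour histogram) as might be stored in the database 106.
REFERENCE_FEATURES = {
    "noodles": [0.1, 0.7, 0.2],
    "egg":     [0.6, 0.3, 0.1],
    "chicken": [0.3, 0.2, 0.5],
}

def identify_food(feature_vector):
    """Return the reference food type whose feature vector is nearest
    (by Euclidean distance) to the sensed feature vector."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(REFERENCE_FEATURES,
               key=lambda food: distance(REFERENCE_FEATURES[food], feature_vector))

print(identify_food([0.58, 0.31, 0.12]))  # egg
```

In practice the features would come from the image processing applied to the second set of sensors' inputs; the nearest-neighbour match stands in for that comparison step.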
In an embodiment, the user identification module 211 identifies each of the one or more users 107 based on the one or more inputs received from the first set of sensors 103. Here, the user identification module 211 may use image processing techniques to identify each of the one or more users 107. Also, actions performed by each of the one or more users to consume the food 108 are identified by the user identification module 211. The actions identified by the user identification module 211 are mapped with the respective one or more users 107.
In an embodiment, the quantity determination module 212 determines amount of food 108 consumed by each of the one or more users 107. The quantity determination module 212 receives inputs from the food identification module 210 and the user identification module 211. Then, the quantity determination module 212 maps the actions performed by each of the one or more users to consume the food 108 with each type of food 108 identified. Further, the quantity determination module 212 determines the quantity of each type of food 108 consumed by each of the one or more users 107 based on the actions performed by each of the one or more users 107 to consume the food 108.
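The mapping of per-user consumption actions to per-food quantities can be sketched as a simple aggregation. The event records below are an assumed intermediate format (user ID, food type, unit count per action), not one defined by the disclosure:

```python
from collections import defaultdict

# Hypothetical event stream: each event pairs a user ID and an identified
# consumption action with a food type and a unit quantity (one spoon, one piece).
events = [
    {"user": "U1", "food": "noodles", "units": 2},
    {"user": "U1", "food": "egg",     "units": 1},
    {"user": "U2", "food": "noodles", "units": 1},
]

def tally_consumption(events):
    """Aggregate the units of each food type consumed by each user."""
    totals = defaultdict(lambda: defaultdict(int))
    for e in events:
        totals[e["user"]][e["food"]] += e["units"]
    return {user: dict(foods) for user, foods in totals.items()}

print(tally_consumption(events))
# {'U1': {'noodles': 2, 'egg': 1}, 'U2': {'noodles': 1}}
```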
In an embodiment, the bill generation module 213 generates a bill for each of the one or more user based on identified type of food 108, the quantity of food 108 consumed by respective one or more users and amount of food 108 ordered by the one or more users. The bill generation module 213 considers the amount of food 108 ordered by the one or more users 107, type of food 108 ordered by the one or more users 107, amount of food 108 consumed by each of the one or more users 107 to generate an itemized bill for each of the one or more users 107.
In an embodiment, the other modules 214 may include a notification module to notify a staff when the one or more users 107 require attention, communication module to communicate with similar systems for determining quantity of each type of food 108 consumed by the one or more users 107, etc.
As illustrated in
The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the spirit and scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At step 401, the user identification module 211 and the food identification module 210 receive the one or more inputs from the first set of sensors 103 and the second set of sensors 104 respectively. The first set of sensors 103 monitor the one or more users 107 and actions performed by the one or more users 107 to consume the food 108. The second set of sensors 104 monitor the food 108 served to the one or more users 107.
At step 402, the food identification module 210 identifies each type of food 108 served to the one or more users 107 based on the one or more inputs received from the second set of sensors 104. The food identification module 210 uses image processing techniques to identify each type of food 108. The food identification module 210 may compare the one or more inputs received from the second set of sensors 104 with reference data to identify the type of food 108. The reference data may be stored in the database 106.
At step 403, the user identification module 211 identifies each of the one or more users 107 and actions performed by each of the one or more users 107 to consume the food 108 based on the one or more inputs received from the first set of sensors 103. Here, the user identification module 211 uses methods pertaining to user recognition to identify each of the one or more users 107. Further, the user identification module 211 tracks motion of each of the one or more users 107 to identify when the position of the respective one or more users changes. Further, the user identification module 211 identifies actions performed by each of the one or more users 107 to consume the food 108. Here, the user identification module 211 identifies only certain actions performed by each of the one or more users 107 to consume the food 108. Such actions trigger a signal to indicate that the respective one or more users 107 have consumed the food 108.
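The filtering of observed actions down to the certain actions that trigger a consumption signal can be sketched as follows. The action labels are illustrative assumptions; the disclosure does not fix a taxonomy:

```python
# Only certain actions (e.g., picking food to a plate, placing food in the
# mouth) signal actual consumption; others (reaching, stirring) are ignored.
CONSUMPTION_ACTIONS = {"pick_to_plate", "place_in_mouth"}

def consumption_signals(observed_actions):
    """Keep only the actions that indicate food was consumed."""
    return [a for a in observed_actions if a["action"] in CONSUMPTION_ACTIONS]

raw = [
    {"user": "U1", "action": "reach"},
    {"user": "U1", "action": "place_in_mouth"},
    {"user": "U2", "action": "stir"},
]
print(consumption_signals(raw))
# [{'user': 'U1', 'action': 'place_in_mouth'}]
```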
At step 404, the quantity determination module 212 determines quantity of each type of food 108 consumed by each of the one or more users 107 based on the actions performed by each of the one or more users 107. The quantity determination module 212 receives inputs from the food identification module 210 and the user identification module 211. Further, the quantity determination module 212 determines quantity of each type of food 108 consumed by each of the one or more users 107 based on the actions performed by each of the one or more users 107 to consume each type of food 108. The actions performed by the one or more users 107 may include hand movements to pick food 108 from a container to a plate, hand movements to place the food 108 into the mouth of a user 107, etc.
In an embodiment, the bill generation module 213 generates a bill for each of the one or more user 107 based on the quantity of food 108 consumed by each of the one or more users 107. The bill generation module receives inputs from the quantity determination module 212 indicating quantity of each type of food 108 consumed by each of the one or more users 107. The bill generation module 213 calculates price for the food 108 consumed by each of the one or more users based on the quantity of food 108 consumed by each of the one or more users 107 and amount of food 108 ordered by the one or more users 107. In an embodiment, the bill generation module 213 may retrieve the food ordered data 207 from the database 106. Also, the bill generation module 213 calculates the price based on predefined price for a particular type of food for predefined quantity.
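The pricing step, based on a predefined price for a predefined quantity of each food type, can be sketched as below. The price list and unit counts are hypothetical values standing in for the food ordered data 207:

```python
# Hypothetical price per predefined unit of each food type, as might be
# retrieved with the food ordered data 207 from the database 106.
UNIT_PRICES = {"noodles": 3.0, "egg": 1.5, "chicken": 2.0}

def itemized_bill(consumed):
    """Build an itemized bill from one user's per-food unit counts."""
    items = {food: units * UNIT_PRICES[food] for food, units in consumed.items()}
    return {"items": items, "total": round(sum(items.values()), 2)}

print(itemized_bill({"noodles": 2, "egg": 1}))
# {'items': {'noodles': 6.0, 'egg': 1.5}, 'total': 7.5}
```

Each user's bill would then reflect only the quantities attributed to that user by the quantity determination module, rather than an even split of the table's total.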
In an embodiment, the determination unit 101 can be placed in a restaurant server. Further, the determination unit 101 can receive the first set of sensors data 205 and the second set of sensors data 206 from the first set of sensors 103 and the second set of sensors 104, respectively, embedded in the napkin holder 500. The determination unit 101 then performs the method steps as described in method steps 401 to 404 to determine the quantity of each type of food 108 consumed by the one or more users 107.
In an embodiment, the one or more napkin holders 500 are placed on a table of a restaurant in such a way that each of the one or more users 107 and the food 108 ordered are within Field of View (FOV) of the first set of sensors 103 and the second set of sensors 104.
In an embodiment, the first set of sensors 103 and the second set of sensors 104 can be used interchangeably. Consider six users 107 to be seated at the table. Let the users 107 order three different types of food 108 from a menu displayed to the six users 107. Here, the food 108 ordered is retrieved from the database 106 associated with the restaurant server. Also, the price associated with a predefined amount of the food ordered is retrieved from the database 106. In an embodiment, the food ordered can be manually updated into the napkin holder 500 by a concerned personnel. Here, the determination unit 101 implemented by the napkin holder 500 is initiated by the concerned personnel. Once the determination unit 101 is initiated, the first set of sensors 103 and the second set of sensors 104 begin to monitor the one or more users 107 and the food 108 served. Let a first user consume two spoons of a first type of food, one spoon of a second type of food and three pieces of a third type of food. Likewise, let each user among the six users consume a portion of each type of food. Here, the actions performed by each user to consume each type of food are monitored by the first set of sensors 103. The action performed by each of the six users is mapped to the respective user ID. Also, the type of food served to the six users is monitored by the second set of sensors 104. The determination unit 101 receives the one or more inputs from the first set of sensors 103 and the second set of sensors 104. Further, the determination unit 101 determines the quantity of each type of food consumed by each of the six users by mapping the actions performed by each of the six users to consume each type of food. For example, the action performed by the first user to consume the first type of food is determined by the determination unit 101. Likewise, actions performed by the first user to consume the second type of food and the third type of food are determined by the determination unit 101.
The action performed to consume the first type of food may be picking an egg from a container. The action performed to consume the second type of food may be picking up two spoons of noodles. The action performed to consume the third type of food may be picking up chicken pieces from a container. Here, the determination unit 101 determines quantity of first type of food consumed based on number of egg pieces picked up by the first user. The determination unit 101 also maps action of the first user eating the egg pieces. The action of picking up the egg piece and the action of eating the egg piece is considered as an action performed by the first user to consume the first type of food. Similarly, the determination unit identifies action of the first user to consume the second type and third type of food respectively. Based on the actions performed by the first user, the total quantity of food consumed by the first user is determined by the determination unit 101. Similarly, the determination unit 101 determines quantity of each type of food consumed by each of the six users.
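The pairing described above, where a pick-up action and a subsequent eating action by the same user together count as one completed consumption, can be sketched as follows. The action names and event format are illustrative assumptions:

```python
# A pick-up followed by an eat by the same user for the same food is
# counted as one completed consumption of that food.
def completed_consumptions(actions):
    """Count pick_up/eat pairs per (user, food)."""
    pending = set()   # (user, food) pairs with a pick-up awaiting an eat
    counts = {}
    for a in actions:
        key = (a["user"], a["food"])
        if a["action"] == "pick_up":
            pending.add(key)
        elif a["action"] == "eat" and key in pending:
            pending.discard(key)
            counts[key] = counts.get(key, 0) + 1
    return counts

seq = [
    {"user": "U1", "food": "egg", "action": "pick_up"},
    {"user": "U1", "food": "egg", "action": "eat"},
    {"user": "U1", "food": "egg", "action": "pick_up"},  # picked but not yet eaten
]
print(completed_consumptions(seq))
# {('U1', 'egg'): 1}
```

A piece picked up but not eaten is not counted, matching the requirement that both actions together constitute consumption.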
The determination unit 101 further generates an itemized bill for each of the six users based on the amount of food consumed by each of the six users and amount of food ordered by the six users. The generated itemized bill is then displayed to the six users by a display unit 105 associated with the napkin holder 500.
In an embodiment, the determination unit 101 identifies a person serving the food to the user and differentiates the person from the user consuming the food. For example, in a restaurant, a server may serve food to users. Here, the action of the server is identified as serving. Since the server has not consumed the food, the action of the server is not considered for determining quantity of each type of food consumed by the users.
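The exclusion of serving actions can be sketched as a role-based filter applied before quantities are tallied. The role labels and record format are illustrative assumptions:

```python
# Actions by a person identified as serving staff are dropped before the
# quantity determination step, since the server has not consumed the food.
def consumer_actions(actions, roles):
    """Keep only actions performed by persons whose role is not 'server'."""
    return [a for a in actions if roles.get(a["user"]) != "server"]

roles = {"U1": "diner", "S1": "server"}
acts = [
    {"user": "U1", "action": "eat"},
    {"user": "S1", "action": "serve"},
]
print(consumer_actions(acts, roles))
# [{'user': 'U1', 'action': 'eat'}]
```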
In an embodiment, the determination unit 101 can be integrated with mobile applications. Further, the itemized bill can be displayed to users on one or more user devices associated with the determination unit 101.
The napkin holder 500 described above can be considered as an example for implementing the determination unit 101. In an embodiment, the determination unit 101 can be integrated into any system capable of monitoring the users and the food served to the users.
In an embodiment, the bill generated for each of the user can be printed using a printer connected to the determination unit 101. The printer can be connected by at least one of wired interface and wireless interface. Likewise, the database 106 is connected to the determination unit by at least one of a wired interface and a wireless interface.
The processor 702 may be disposed in communication with one or more input/output (I/O) devices (not shown) via I/O interface 701. The I/O interface 701 may employ communication protocols/methods such as, without limitation, audio, analog, digital, monoaural, RCA, stereo, IEEE-1394, serial bus, universal serial bus (USB), infrared, PS/2, BNC, coaxial, component, composite, digital visual interface (DVI), high-definition multimedia interface (HDMI), RF antennas, S-Video, VGA, IEEE 802.11a/b/g/n/x, Bluetooth, cellular (e.g., code-division multiple access (CDMA), high-speed packet access (HSPA+), global system for mobile communications (GSM), long-term evolution (LTE), WiMax, or the like), etc.
Using the I/O interface 701, the computer system 700 may communicate with one or more I/O devices. For example, the input device 710 may be an antenna, keyboard, mouse, joystick, (infrared) remote control, camera, card reader, fax machine, dongle, biometric reader, microphone, touch screen, touchpad, trackball, stylus, scanner, storage device, transceiver, video device/source, etc. The output device 711 may be a printer, fax machine, video display (e.g., cathode ray tube (CRT), liquid crystal display (LCD), light-emitting diode (LED), plasma display panel (PDP), organic light-emitting diode display (OLED) or the like), audio speaker, etc.
In some embodiments, the computer system 700 is connected to the service operator through a communication network 709. The processor 702 may be disposed in communication with the communication network 709 via a network interface 703. The network interface 703 may communicate with the communication network 709. The network interface 703 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), transmission control protocol/Internet protocol (TCP/IP), token ring, IEEE 802.11a/b/g/n/x, etc. The communication network 709 may include, without limitation, a direct interconnection, e-commerce network, a peer to peer (P2P) network, local area network (LAN), wide area network (WAN), wireless network (e.g., using Wireless Application Protocol), the Internet, Wi-Fi, etc. Using the network interface 703 and the communication network 709, the computer system 700 may communicate with the one or more service operators.
In some embodiments, the processor 702 may be disposed in communication with a memory 705 (e.g., RAM, ROM, etc. not shown in
The memory 705 may store a collection of program or database components, including, without limitation, user interface 706, an operating system 707, web server 708 etc. In some embodiments, computer system 700 may store user/application data locally in user interface 706, such as the data, variables, records, etc. as described in this disclosure. Such databases may be implemented as fault-tolerant, relational, scalable, secure databases such as Oracle or Sybase.
The operating system 707 may facilitate resource management and operation of the computer system 700. Examples of operating systems include, without limitation, Apple Macintosh OS X, Unix, Unix-like system distributions (e.g., Berkeley Software Distribution (BSD), FreeBSD, NetBSD, OpenBSD, etc.), Linux distributions (e.g., Red Hat, Ubuntu, Kubuntu, etc.), IBM OS/2, Microsoft Windows (XP, Vista/7/8, 10 etc.), Apple iOS, Google Android, Blackberry OS, or the like.
In some embodiments, the computer system 700 may implement a web browser 708 stored program component. The web browser 708 may be a hypertext viewing application, such as Microsoft Internet Explorer, Google Chrome, Mozilla Firefox, Apple Safari, etc. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), etc. Web browsers 708 may utilize facilities such as AJAX, DHTML, Adobe Flash, JavaScript, Java, Application Programming Interfaces (APIs), etc. In some embodiments, the computer system 700 may implement a mail server stored program component. The mail server may be an Internet mail server such as Microsoft Exchange, or the like. The mail server may utilize facilities such as ASP, ActiveX, ANSI C++/C#, Microsoft .NET, CGI scripts, Java, JavaScript, PERL, PHP, Python, WebObjects, etc. The mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), Microsoft Exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like. In some embodiments, the computer system 700 may implement a mail client stored program component. The mail client may be a mail viewing application, such as Apple Mail, Microsoft Entourage, Microsoft Outlook, Mozilla Thunderbird, etc.
In an embodiment, the determination unit 101 may receive order details through the user devices. The user devices may be indicated by input devices 710. In an embodiment, the determination unit 101 may be associated with a restaurant server 712. The restaurant server 712 may provide the determination unit 101 with the food ordered data 207.
The terms “an embodiment”, “embodiment”, “embodiments”, “the embodiment”, “the embodiments”, “one or more embodiments”, “some embodiments”, and “one embodiment” mean “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.
The terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise. The terms “a”, “an” and “the” mean “one or more”, unless expressly specified otherwise.
A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
The illustrated operations of
In an embodiment, the present disclosure discloses a method to provide users an itemized and quantified bill based on the amount of food consumed.
In an embodiment, the present disclosure discloses a method to improve accuracy of bill distribution between users.
In an embodiment, the present disclosure discloses a method and system for accurately determining amount of food consumed by the users.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Number | Date | Country | Kind |
---|---|---|---|
201641028071 | Aug 2016 | IN | national |