The present subject matter relates to electronic devices, including smart devices and communications between the smart devices.
Olfactory families allow individual perfumes to be classified according to their key olfactory characteristics. The olfactory families are created either by grouping together raw materials (like flowers, woods, aromatics or citrus fruits) or by taking inspiration from traditional accords (oriental or chypre, for example).
The drawing figures depict one or more implementations, by way of example only, not by way of limitations. In the figures, like reference numerals refer to the same or similar elements.
Features of the various implementations disclosed will be readily understood from the following detailed description, in which reference is made to the appended drawing figures. A reference numeral is used with each element in the description and throughout the several views of the drawing. When a plurality of similar elements is present, a single reference numeral may be assigned to like elements, with an added letter referring to a specific element.
The various elements shown in the figures are not drawn to scale unless otherwise indicated. The dimensions of the various elements may be enlarged or reduced in the interest of clarity. The several figures depict one or more implementations and are presented by way of example only and should not be construed as limiting. Included in the drawing are the following figures:
An electronic device having an olfactory detector including an array of olfactory sensors for determining scents proximate the electronic device, such as a smartphone, smart watch, or smart eyewear. Each sensor in the array of olfactory sensors is tuned to detect and measure the concentration of one or more specific chemical compounds or molecules. Specific combinations and concentrations of chemical compounds or molecules as measured by the olfactory detector represent distinct scents as perceived by the human olfactory system. A fan creates airflow of ambient air across the olfactory sensors. An analog to digital (A/D) converter receives and processes the sensor outputs of the olfactory sensors and provides the processed sensor output to a processor for further processing. The processor may display information of the determined scents on a display of the smart device, and the processor can also send information indicative of the determined scents to another device.
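By way of illustration only, the scent model described above (a distinct scent as a specific combination and concentration of chemical compounds) can be sketched as a vector of compound concentrations, with scent determination reduced to nearest-signature matching. The compound names and signature values below are hypothetical and not part of the disclosure.

```python
import math

# Hypothetical reference signatures: each scent is represented as a
# vector of compound concentrations (ppm) as measured by the sensor array.
SIGNATURES = {
    "fresh pine": {"alpha-pinene": 12.0, "limonene": 3.0, "ethanol": 0.5},
    "citrus":     {"alpha-pinene": 1.0,  "limonene": 15.0, "ethanol": 0.2},
}

def distance(a, b):
    """Euclidean distance over the union of measured compounds."""
    keys = set(a) | set(b)
    return math.sqrt(sum((a.get(k, 0.0) - b.get(k, 0.0)) ** 2 for k in keys))

def identify(measured):
    """Return the stored scent whose signature is closest to the measurement."""
    return min(SIGNATURES, key=lambda name: distance(SIGNATURES[name], measured))
```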
The terms “connect,” “connected,” “couple,” and “coupled” as used herein refer to any logical, optical, physical, or electrical connection, including a link or the like by which the electrical or magnetic signals produced or supplied by one system element are imparted to another coupled or connected system element. Unless described otherwise, coupled or connected elements or devices are not necessarily directly connected to one another and may be separated by intermediate components, elements, or communication media, one or more of which may modify, manipulate, or carry the electrical signals. The term “on” means directly supported by an element or indirectly supported by the element through another element integrated into or supported by the element.
Additional objects, advantages and novel features of the examples will be set forth in part in the following description, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The objects and advantages of the present subject matter may be realized and attained by means of the methodologies, instrumentalities and combinations particularly pointed out herein.
Reference now is made in detail to the examples illustrated in the accompanying drawings and discussed below.
As shown in
To generate location coordinates for positioning of the smartphone 100, smartphone 100 also may include a global positioning system (GPS) receiver. Alternatively, or additionally, the smartphone 100 can utilize either or both the short range XCVRs 170 and WWAN XCVRs 165 for generating location coordinates for positioning. For example, cellular network, WI-FI®, or BLUETOOTH® based positioning systems can generate very accurate location coordinates, particularly when used in combination.
The transceivers 165, 170 (i.e., the network communication interface) may conform to one or more of the various digital wireless communication standards utilized by modern mobile networks. Examples of WWAN transceivers 165 include (but are not limited to) transceivers configured to operate in accordance with Code Division Multiple Access (CDMA) and 3rd Generation Partnership Project (3GPP) network technologies including, for example and without limitation, 3GPP type 2 (or 3GPP2) and LTE, at times referred to as “4G,” or 5G New Radio, referred to as “5G.” For example, the transceivers 165, 170 provide two-way wireless communication of information including digitized audio signals, still image and video signals, web page information for display as well as web-related inputs, and various types of mobile message communications to/from the smartphone 100.
The smartphone 100 further includes a microprocessor that functions as a central processing unit (CPU) shown as CPU 120 in
The CPU 120 serves as a programmable host controller for the smartphone 100 by configuring the smartphone 100 to perform various operations in accordance with instructions or programming executable by CPU 120. For example, such operations may include various general operations of the smartphone 100, as well as operations related to the programming for applications on the smartphone 100. Although a microprocessor may be configured by use of hardwired logic, typical microprocessors in mobile devices are general processing circuits configured by execution of programming.
The smartphone 100 further includes a memory or storage system, for storing programming and data. In the example illustrated in
Hence, in the example of smartphone 100, the flash memory 110 is used to store programming or instructions for execution by the CPU 120. Depending on the type of device, the smartphone 100 stores and runs a mobile operating system through which specific applications are executed. Examples of mobile operating systems include Google Android, Apple iOS (for iPhone or iPad devices), Windows Mobile, Amazon Fire OS, RIM BlackBerry OS, or the like.
In sample configurations, the CPU 120 may construct a map of the environment surrounding the smartphone 100, determine a location of the smartphone 100 within the mapped environment, and determine a relative position of the smartphone 100 to one or more objects in the mapped environment. The CPU 120 may construct the map and determine location and position information using a simultaneous localization and mapping (SLAM) algorithm applied to data received from one or more sensors.
Sensor data may include images received from cameras 155 and 160, distance(s) received from a laser range finder, position information received from a GPS unit, motion and acceleration data received from an inertial measurement unit (IMU) 190, or a combination of data from such sensors, or from other sensors that provide data useful in determining positional information. In the context of augmented reality, a SLAM algorithm is used to construct and update a map of an environment, while simultaneously tracking and updating the location of a device (or a user) within the mapped environment. The mathematical solution can be approximated using various statistical methods, such as particle filters, Kalman filters, extended Kalman filters, and covariance intersection. In a system that includes a high-definition (HD) video camera that captures video at a high frame rate (e.g., thirty frames per second), the SLAM algorithm updates the map and the location of objects at least as frequently as the frame rate, in other words, calculating and updating the mapping and localization thirty times per second.
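As an illustration of the statistical methods mentioned above, the following is a minimal sketch of one predict/update cycle of a one-dimensional Kalman filter. A full SLAM implementation tracks a multi-dimensional state and a landmark map, but the per-frame predict/update structure is the same. The noise variances q and r below are assumed values, not parameters of the disclosure.

```python
def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle of a 1-D Kalman filter.

    x, p: prior state estimate and its variance
    z:    new sensor measurement (e.g., a range reading)
    q, r: process and measurement noise variances (assumed values)
    """
    # Predict: a static-motion model, so the state carries over
    # while its uncertainty grows by the process noise.
    p = p + q
    # Update: blend prediction and measurement by the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Capturing video at thirty frames per second implies thirty such
# predict/update cycles per second, one per camera frame.
```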
The smartphone 100 is configured to send messages to, and receive messages from, a remote smartphone device via the short range XCVRs 170 and the WWAN XCVRs 165. The messaging supports traditional chat-based messages, augmented reality (AR) messages, and hybrid chat-AR based messages. The smartphone 100 further includes an olfactory transducer 200 configured to store and disperse a small amount of one or more stored scents to be perceived by a user in proximity to the olfactory transducer 200. The CPU 120 is configured to send control signals to the olfactory transducer 200 to control the dispersion of the scents. In one example, the olfactory transducer 200 is built into the smartphone 100. In other examples, a separate device containing the olfactory transducer 200 is coupled to the smartphone 100, such as a cartridge coupled to the smartphone 100.
The smartphone 100 also includes an olfactory detector 220 that is configured to both sense and determine a scent proximate the smartphone 100. The olfactory detector 220 includes a plurality of olfactory sensors 230 for sensing and determining a plurality of different scents and their intensity as will be described further with respect to
The olfactory sticker 204 may also include text instructions 208 indicating how to engage and activate the olfactory sticker 204 to release the scent via the olfactory transducer 200. In the example shown in
The olfactory sticker 204 instructs the olfactory transducer 200 to emit the corresponding scent when the olfactory sticker 204 is engaged by a user of the receiver smartphone 100B, such as by tapping or rubbing the displayed olfactory sticker 204 as previously described in reference to
In one example, the user of the sender smartphone 100A may select which olfactory sticker 204 to send from a displayed set of predefined olfactory stickers 204 that correlate to predefined scents that are stored in the memory 110 of the sender smartphone 100A. The receiver smartphone 100B has access to an identical, or a different, set of predefined olfactory stickers 204 to easily preview what particular scent has been received without activating the olfactory sticker 204. For example, an image of a pine tree corresponds to a fresh pine scent for the users of both the sender and receiver smartphones. The sender smartphone 100A sends an olfactory sticker 204 representing a pine tree to the receiver smartphone 100B, and the receiver smartphone 100B knows from the predefined olfactory stickers 204 that a fresh pine scent olfactory sticker 204 was sent without activating the olfactory sticker 204. In another example, the user of the sender smartphone 100A may select a predetermined scent to accompany a custom olfactory sticker. For example, the user selects an image of a candle to be a visual preview 206 of the olfactory sticker 204 and selects the scent of the olfactory sticker 204 to be a citrus scent. The receiver smartphone 100B receives this custom olfactory sticker 204 and does not know the scent of the olfactory sticker 204 until the olfactory sticker 204 is activated and the scent is released by the olfactory transducer 200.
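The predefined versus custom sticker behavior described above can be sketched as follows: a predefined sticker's scent is previewable from a shared set, while a custom sticker's scent remains unknown to the receiver until activation. The sticker identifiers and scent names are hypothetical.

```python
# Hypothetical predefined sticker set shared by sender and receiver.
PREDEFINED_STICKERS = {
    "pine_tree": "fresh pine",
    "lemon": "citrus",
    "rose": "floral",
}

def preview_scent(sticker):
    """Return the scent name if the sticker is predefined; a custom
    sticker's scent reveals nothing until the transducer releases it."""
    if sticker["id"] in PREDEFINED_STICKERS:
        return PREDEFINED_STICKERS[sticker["id"]]
    return None  # unknown until activated

def activate(sticker):
    """Decode the scent carried in the sticker's olfactory payload,
    falling back to the predefined set for known sticker IDs."""
    return sticker.get("scent") or PREDEFINED_STICKERS.get(sticker["id"])
```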
At block 402, the user of the sender smartphone 100A creates and sends a chat message with an olfactory sticker 204 to the receiver smartphone 100B. The processor 120 of the sender smartphone 100A sends the message via the short range XCVRs 170 or WWAN XCVRs 165. The message sent may be a traditional chat-based message or an AR message. In one example, the sender smartphone 100A sends an olfactory sticker 204 selected from a predetermined set of olfactory stickers 204 that have a predetermined scent. In another example, the sender selects a scent to accompany the olfactory sticker 204. In another example, the sender sends an identified scent determined by the olfactory detector 220 proximate the smartphone 100A as detailed with respect to
At block 404, the receiver smartphone 100B receives the message forwarded from the sender smartphone 100A via respective short range XCVRs 170 or WWAN XCVRs 165. The message contains the olfactory sticker 204 with the encoded olfactory information. The receiver smartphone 100B displays the olfactory sticker 204 on the display 145B via the receiver smartphone 100B user interface 202B.
At block 406, the user of the receiver smartphone 100B activates the olfactory sticker 204 by interacting with the displayed olfactory sticker 204. For example, the user taps or rubs the displayed olfactory sticker 204. The user can determine the intensity of the released scent by controlling the intensity of the interaction with the displayed olfactory sticker 204.
At block 408, the processor 120 of the receiver smartphone 100B sends a signal to the olfactory transducer 200 to release a scent corresponding to the encoded olfactory information in the olfactory sticker 204 and the intensity of the user interaction with the olfactory sticker 204. The user of the receiver smartphone 100B is then able to perceive the released scent.
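Blocks 406 and 408 above can be sketched as a mapping from the intensity of the user's interaction with the displayed olfactory sticker 204 to a release command for the olfactory transducer 200. The normalized intensity scale and the command fields below are assumptions for illustration only.

```python
def release_command(scent_id, interaction_intensity, max_duration_ms=2000):
    """Build a hypothetical control signal for the olfactory transducer.

    interaction_intensity: 0.0-1.0, e.g., normalized rub speed or tap
    pressure reported by the touch screen (an assumed scale).
    """
    # Clamp to the valid range, then scale the release duration so a
    # more vigorous interaction releases scent for longer.
    level = max(0.0, min(1.0, interaction_intensity))
    return {"scent": scent_id, "duration_ms": int(level * max_duration_ms)}
```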
The human olfactory system uses about 300 olfactory receptors to sample scents inhaled through the nose. Individual human olfactory receptors are sensitive to multiple chemical compounds present in the inhaled scent. These receptors' responses are transmitted to the brain, which combines and processes the “signals” of the olfactory receptors to determine the scent.
In the example shown, the olfactory detector 220 includes a manifold 222 having an airflow input 224, an airflow output 226, and a fan 228 that draws ambient air proximate the manifold 222 across an array of olfactory sensors 230. Each olfactory sensor 230 generates an analog signal indicative of a chemical compound or molecule and its concentration in the ambient air, where each olfactory sensor 230 has a different response curve as illustrated in
In an example, the olfactory sensors 230 are metal oxide semiconductor (MOS) sensors, as they are well-characterized, inexpensive, widely available and, in contrast to quartz crystal microbalance (QCM) sensors, are less sensitive to interference from environmental conditions such as airflow effects and electromagnetic effects from other QCM sensors operating in the vicinity. Table 1 shows an example list of sensors and the specific compounds each is configured to respond to. Examples of the MOS sensors are MQ Series sensors available from Figaro USA Inc. and Hanwei Electronics Co., Ltd. An example A/D converter 232 is an Arduino Mega 2560 that converts the analog outputs of the olfactory sensors 230 to digital signals. The digital signals are transmitted via a serial connection to the CPU 120 for further processing.
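Because the Arduino Mega 2560 digitizes with 10-bit resolution (counts of 0-1023 over its reference voltage), the digital signals received over the serial connection can be converted back to sensor voltages on the host. The comma-separated frame format below is an assumption for illustration, not the disclosed protocol.

```python
def counts_to_volts(raw, vref=5.0, bits=10):
    """Convert a 10-bit ADC reading (0-1023) to a sensor voltage,
    assuming the Mega 2560's default 5 V reference."""
    return raw * vref / ((1 << bits) - 1)

def parse_frame(line):
    """Parse one comma-separated serial frame of raw ADC counts,
    e.g. b'512,308,771' for a three-sensor array, into voltages."""
    return [counts_to_volts(int(v)) for v in line.decode().strip().split(",")]
```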
In an example of a machine learning model, as shown in
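One way to realize such a machine learning model is a k-nearest-neighbor classifier over sensor response vectors: each training example is one digitized reading across the olfactory sensor array, labeled with the scent present when it was captured. This is a minimal sketch under those assumptions, not the specific model of the disclosure.

```python
from collections import Counter

def knn_classify(sample, training_data, k=3):
    """Classify a sensor response vector by majority vote among its
    k nearest labeled training vectors (squared Euclidean distance).

    training_data: list of (vector, label) pairs, where each vector
    is one reading across the olfactory sensor array.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    nearest = sorted(training_data, key=lambda t: sq_dist(sample, t[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```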
The olfactory detector 220 can be used as a scent-sensing Internet of Things (IoT) appliance that is always on and classifies the current scent in a space. Using the scent classification, a server can provide a web endpoint that displays the currently detected scent and a history of recently detected scents. Furthermore, a simple Slack Bot (Sniff Bot) can be implemented that posts the currently detected scent to a Slack channel every hour and on every change of detected scent. Such updates can augment awareness of what is happening in the environment, such as for a remote co-worker, without being as intrusive as, e.g., a camera feed.
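The Sniff Bot posting policy described above (post every hour and on every change of detected scent) separates cleanly from message delivery. The sketch below assumes delivery via a Slack incoming webhook; the webhook URL shown is a placeholder, not a real endpoint.

```python
import json
import urllib.request

WEBHOOK_URL = "https://hooks.slack.com/services/EXAMPLE"  # placeholder, not real

def should_post(now, last_post_time, current_scent, last_scent,
                interval_s=3600):
    """Post when the detected scent changes, or hourly otherwise."""
    return current_scent != last_scent or now - last_post_time >= interval_s

def post_scent(scent):
    """Send the detected scent to the Slack channel via an incoming
    webhook (a network call; requires a real webhook URL)."""
    body = json.dumps({"text": f"Sniff Bot: currently smelling {scent}"})
    req = urllib.request.Request(
        WEBHOOK_URL, data=body.encode(),
        headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)
```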
At block 702, the CPU 120 of the smartphone 100 enables the olfactory sensors 230 to each detect a scent that the respective olfactory sensor 230 is tuned to sense. This may be done by the CPU 120 powering the olfactory sensors 230.
At block 704, the CPU 120 enables the fan 228 to draw ambient air into the airflow input 224, across the olfactory sensors 230, and out through the airflow output 226. The speed of the fan, and thus the airflow, is controlled by the CPU 120.
At block 706, the olfactory sensors 230 each detect the scent in the airflow that the respective olfactory sensor 230 is tuned to sense. Each olfactory sensor 230 generates an analog electrical signal that is indicative of the detected scent and the intensity of the detected scent. Depending on the selected olfactory sensor 230, each olfactory sensor 230 can detect a single chemical compound or a combination of compounds.
At block 708, the A/D converter 232 processes the received analog signals from each of the olfactory sensors 230. The A/D converter 232 generates digital signals indicative of the detected scents and communicates the digital signals to the CPU 120 for further processing.
At block 710, the CPU 120 processes the digital signals from the A/D converter 232, including identifying the detected scents and their intensity. The CPU 120 displays the identified detected scents on the display 145 of the smartphone 100 as shown in
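Blocks 702 through 710 can be strung together as a single detection pass. In the sketch below, the hardware-facing callables (read_sensors, fan_on, display) are hypothetical stubs standing in for the olfactory sensors 230, the fan 228, and the display 145.

```python
def detect_scent(read_sensors, classify, fan_on, display):
    """One pass of blocks 702-710: power the sensor array and fan,
    sample the array, classify the reading, and display the result.

    read_sensors, fan_on, display: hypothetical hardware interfaces.
    classify: a function mapping a digitized reading to a scent name.
    """
    fan_on(True)                # block 704: draw ambient air across the sensors
    raw = read_sensors()        # blocks 702/706/708: digitized sensor readings
    scent = classify(raw)       # block 710: identify the detected scent
    display(f"Detected: {scent}")
    fan_on(False)
    return scent
```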
It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” “includes,” “including,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises or includes a list of elements or steps does not include only those elements or steps but may include other elements or steps not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. Such amounts are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain. For example, unless expressly stated otherwise, a parameter value or the like may vary by as much as ±10% from the stated amount.
In addition, in the foregoing Detailed Description, various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed examples require more features than are expressly recited in each claim. Rather, as the following claims reflect, the subject matter to be protected lies in less than all features of any single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
While the foregoing has described what are considered to be the best mode and other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that they may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all modifications and variations that fall within the true scope of the present concepts.