This invention relates generally to virtual reality systems that recreate an object or experience, or an aspect thereof, for a user.
In the retail setting, one area of increasing significance is the virtual shopping environment. Some retailers have established an online shopping experience for customers in which the customer may participate in a simulated shopping experience without actually going to a store. Instead, the customer may access the retailer's website and navigate through a realistic-appearing store that may have various departments and various types of products for sale. Some shopping experiences also incorporate augmented reality technology, wherein a live direct or indirect view of a real-world environment is augmented or modified by computer-generated perceptual information. In most cases, augmented reality technology adds visual elements to a real-world environment or masks visual elements in order to alter the user's visual perception of a three-dimensional space.
In some virtual reality or augmented reality shopping experiences, head-mounted displays are provided in order to permit the user to change the field of view within a surrounding three-dimensional environment. For instance, new objects may be added to the real-world environment, or the appearance of specific objects within the environment may be altered by overlaying of new or altered images. Additional sensory inputs may be supplied to provide the user with information about the real-world, augmented, or virtual environment. Haptic gloves are often provided to assist the user in manipulating or interacting with the environment.
However, customer immersion is generally limited in such systems, which do not convey information regarding numerous aspects of products that would be apparent to a user in real life.
Disclosed herein are embodiments of systems, apparatuses and methods pertaining to simulating haptic characteristics of objects. This description includes drawings, wherein:
Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention. Certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein.
Various embodiments, systems, apparatuses and methods are provided herein useful to replicate or represent the sensation of touching an object by providing the user with a physical structure that can act as a virtual object and can be interacted with or manipulated to give a user the impression that a real-world object is being touched or manipulated. In some forms, a haptic device is provided that replicates firmness, texture, and/or resilience of an object across its surface to provide a user with a physical virtual object that replicates a remote object or a portion thereof. In some embodiments, the haptic device has a generally planar surface that adjusts haptic characteristics during use to replicate different areas of a three-dimensional surface. The portion of the object that is replicated may change as the user focuses on or views different portions of the three-dimensional surface. In other embodiments, the haptic device is shaped similarly to the detected object, or a portion thereof, or may be reconfigured to replicate at least a portion of the shape of the detected object. For instance, a spherical or partially spherical haptic device may be provided to mimic detected objects that are generally spherical or rounded, a pyramid-shaped haptic device may be provided to mimic detected objects that are generally pyramidal or tapered, and a rectangular or cube-shaped haptic device may be provided to mimic objects that are generally rectangular or angled. 
In some embodiments, a system for presenting a user with haptic sensations representative of an object may be provided, the system comprising a sensor device for detecting physical characteristics of an object at multiple areas across at least one surface of the object, a computer memory for receiving and storing the information detected by the sensor device, a haptic feedback device comprising a plurality of movable members that may be adjusted to replicate the physical characteristics of the object as detected by the sensor device, and a control circuit configured to receive information from the sensor device and transmit signals to the haptic feedback device for causing biasing of the movable members in response to the information from the sensor device so that a force exerted by the movable members correlates with the information received by the sensor device in order to replicate the firmness of the object or other detected tactile properties.
In some forms, the system further includes a stereoscopic camera for obtaining images comprising the at least one surface of the object and mapping the images to a three-dimensional coordinate system, and in some embodiments further includes a head mounted display for displaying images from the stereoscopic camera in virtual reality or augmented reality. In some aspects, head-mounted displays may present to the user an artificial environment or an augmented real-world environment (e.g. recorded images of an environment, real-time images of an environment, or a direct view of the environment through transparent portions of the display) with the appearance of specific objects within the environment altered by overlaying of new or altered images. For instance, the head-mounted display may show images of a real-world object superimposed over a haptic feedback device to provide the impression that the user is touching the real-world object when interacting with the haptic feedback device. In some embodiments, the control circuit is configured to cause the head mounted display to present the images from the stereoscopic camera over the haptic feedback device, the images oriented so that portions of the images corresponding to portions of the surface detected by specific portions of the sensor device are aligned with corresponding moveable members from the haptic feedback device. One or more symbols may be placed on the haptic feedback device and an optical sensor associated with the head mounted display may be provided for determining the orientation of the haptic feedback device based on positions of the one or more symbols in order to accurately map the image from the camera to the haptic feedback device within the view provided by the head mounted display. The display may further play recorded sounds (such as audio files recorded during interaction with the object) to enhance simulation of interacting with the virtual object.
In some embodiments, the sensor device is configured for detecting the firmness of an object at multiple areas across at least one surface of the object. In some embodiments the sensor device comprises a support member; a plurality of probes extending from the support member, the probes forming an array of probes and each probe having at least a distal end thereof moveable along a linear path between a proximal position toward the support member and a distal position away from the support member; a drive mechanism for linearly shifting the distal ends of the plurality of probes toward their respective distal positions; at least one regulator for applying a set amount of force from the drive mechanism to the plurality of probes; one or more sensors to receive information relating to the amount of movement of the distal ends of the plurality of probes; and a computer memory for receiving and storing the information relating to the amount of movement of the distal ends of the plurality of probes. In some embodiments, the sensor device includes a box structure that may be positioned around an object to be scanned. In some embodiments, a conveyor is provided to deliver a series of objects to the sensor device. In some forms, parts of the sensor device that interact with the object may be driven by pneumatic or hydraulic power. In some forms, the set amount of force applied by the drive mechanism that shifts the probes may be adjusted by a user, such as by manually inputting one or more values or information regarding an object to be sensed, or may be automatically adjusted based on characteristics of an object to be sensed.
For instance, in some forms a user may provide to a control circuit via an input device (such as a keyboard or numerical pad) information regarding characteristics of an object to be sensed, and the control circuit references a database to determine preset force values associated with the information provided by the user and correspondingly adjusts the force applied to the probes.
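For illustration only, the preset-force lookup described above may be sketched as follows. The category names, force values, and function names are assumptions invented for this sketch, not values from the specification; the database is stood in for by a plain dictionary.

```python
# Hypothetical sketch: the control circuit receives a product category from an
# input device and consults a database (here a dict) for the probe force to
# apply. All categories and psi values below are illustrative assumptions.
PRESET_FORCES_PSI = {
    "apple": 8.0,
    "tomato": 3.5,
    "watermelon": 12.0,
}

DEFAULT_FORCE_PSI = 5.0  # fallback when the entered category is unknown

def probe_force_for(category: str) -> float:
    """Return the preset probe force (psi) for a user-entered category."""
    return PRESET_FORCES_PSI.get(category.lower().strip(), DEFAULT_FORCE_PSI)
```

Under these assumptions, entering "Apple" would yield a preset force of 8.0 psi, while an unrecognized category would fall back to the default.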
The sensor device may also include in some embodiments optical sensors, such as laser scanners, color sensors, infrared sensors, and/or ultraviolet sensors, for measuring visual characteristics such as shape, surface contours, color, or other responses to irradiation with specific wavelengths of light. The data from such optical sensors can be compared to and correlated with physical measurements from probes that contact the object, and in cases where a strong correlation between visual characteristics and haptic characteristics exists, the optical sensors may be used to predict texture, firmness, resilience, and other characteristics based on visual cues. For instance, where the sensor device is used to analyze fruit and vegetables, soft areas such as bruises may be recognized based on color. Other sensors may also be employed to assist in predicting haptic characteristics of the object. For instance, ethylene detectors or other chemical sensors may be included in the sensor device to assist in determining how ripe fruit and vegetables are, and the amount of ethylene detected may be used to predict hardness or softness. Machine learning may be employed in some embodiments in order to predict haptic characteristics based on optical scanning or other gathered data by referencing previously-measured correlations between such data and haptic characteristics. In some forms, data from previous operations may be stored in a database for reference during later operations. In some embodiments, machine learning may also be employed to filter data presented to a user or prioritize objects scanned with the sensor device in order to first present users with objects they are most likely to purchase, as determined by past purchases or selections. Machine learning may be of various types, including supervised or unsupervised learning, reinforcement learning, rule-based learning, and clustering.
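As a minimal sketch of the prediction step described above, a nearest-neighbour lookup over previously measured (feature, firmness) pairs may stand in for the machine learning component. The feature choices (mean redness, detected ethylene) and all data values are assumptions for illustration, not measurements from the specification.

```python
import math

# Illustrative stand-in for the machine-learning step: predict firmness from
# non-contact measurements by comparing against stored historical examples.
# Each feature vector is (mean_redness, ethylene_ppm); values are assumptions.
HISTORY = [
    ((0.9, 0.2), 7.5),   # firm, unripe
    ((0.5, 1.5), 4.0),   # ripening
    ((0.3, 3.0), 1.5),   # soft, overripe
]

def predict_firmness(features, k=1):
    """Predict firmness as the mean over the k nearest stored examples."""
    ranked = sorted(HISTORY, key=lambda rec: math.dist(rec[0], features))
    nearest = ranked[:k]
    return sum(f for _, f in nearest) / len(nearest)
```

A production system would instead train on the correlations stored in the database; the k-nearest-neighbour form is chosen here only because it keeps the example self-contained.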
In some forms, the haptic feedback device may comprise a base member; a flexible membrane coupled to the base member; a plurality of buttress structures arranged in an array of buttress structures corresponding to the array of probes from the sensor device, the array of buttress structures disposed between the base member and flexible membrane, at least a portion of each buttress structure biased toward the membrane, each buttress structure further comprising a solenoid coupled to a power source for modifying a force exerted upon the flexible membrane by the buttress structure.
Generally, the control circuit 110 can include fixed-purpose hard-wired platforms or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. The control circuit can be configured (for example, by using corresponding programming as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein, and can store instructions, code, and the like that are implemented by the control circuit and/or processors to implement intended functionality. In some applications, the control circuit and/or memory may be distributed over a communications network (e.g. LAN, WAN, Internet) providing distributed and/or redundant processing and functionality. In some implementations, the control circuit can comprise a processor 112 and a memory module 114, which may be integrated together, such as in a microcontroller, application-specific integrated circuit, field programmable gate array or other such device, or may be separate devices coupled together.
In some embodiments, the control circuit may be locally connected to, or incorporated into, the sensor device 130 and/or haptic feedback device 140, or may be remotely located from both. The control circuit 110 may include one or more cloud-based software components in some embodiments. The power source 120 in communication with the control circuit may be of any conventional type, such as a direct current electrical circuit, an alternating current electrical circuit, a battery, or a solar (photovoltaic) cell. In some embodiments, the control circuit may communicate with auxiliary devices only intermittently and/or as necessary in order to conserve power. The database 170 shown may be contained in the memory module 114 of the control circuit, may exist in a separate computer memory coupled to the control circuit, or may be located on a remote server accessed by the control circuit.
The sensor device 130 is provided for detecting and/or quantifying one or more haptic characteristics of an object, such as shape, firmness, resilience, and texture. In some embodiments, the sensor device 130 includes an array of sensors or structures for analyzing haptic characteristics at multiple areas across at least one surface of the object. The sensor device 130 may comprise, in some embodiments, at least one probe extending from a support member to be placed in contact with a three dimensional object in order to detect and/or quantify one or more haptic characteristics of the object. In some embodiments, a plurality of probes form an array, each probe having at least a distal end thereof moveable along a linear path between a proximal position toward the support member and a distal position away from the support member. The array of probes may be physically connected to a support member that holds the plurality of probes in a predetermined array.
In some forms, the sensor device contains or is coupled to a drive mechanism 131 controlled by the control circuit 110. The drive mechanism 131 may be of any type capable of physically moving the probes or portions thereof, and in some forms may be pneumatic, hydraulic, electric, or a combination thereof. In some embodiments, the drive mechanism may apply a predetermined amount of force uniformly across an array of probes for linearly shifting the distal ends of the probes toward their respective distal positions. The drive mechanism 131 and/or sensor device 130 may further include at least one regulator for maintaining the force applied from the drive mechanism to the plurality of probes at a set amount. The probes may be associated with one or more sensors to receive information relating to the amount of movement of the probes or portions thereof. For instance, optical or magnetic sensors may measure the movement of the probes.
In some embodiments, the probes may be initially gently positioned against an object at a contact position and then driven with a greater amount of force distally to a second pressing position, with the amount of movement between the initial contact position and the pressing position used to determine firmness and/or resilience of the object's surface. In some forms the sensor device may rely on gravity to position the probes into the initial contact position rather than actively moving the probes into initial contact with the object. In some forms, the initial contact position of a plurality of probes is also used to determine the object's shape and/or size. Movement of the probe or probe portions may be determined in any manner, for instance by generation of an electrical current or by movement of physical, optical, magnetic, or other markers relative to sensors. In some embodiments, movement into the initial contact position and subsequent pressing position may be accomplished using the same drive system, such as where two pulses of differing magnitude are provided from a single hydraulic, pneumatic, or electric drive system, or may involve separate drive systems. Alternatively, the probes may be placed in the initial position manually. The amount of force exerted upon the probes will vary depending on the information desired by the user and the nature of the object being detected. In some embodiments, for example, the drive system may apply, without limitation, an amount of force in the range of 0.01 psi to 50 psi, or 1 psi to 20 psi, or 2 psi to 15 psi, or 5 psi to 10 psi.
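The per-probe measurement logic described above may be sketched as follows, under the assumption that each probe reports three linear positions in millimeters: where it first touches the object, where it stops under the pressing force, and where it settles once the force is released. The specific metrics (inverse travel for firmness, rebound fraction for resilience) are illustrative conventions, not definitions from the specification.

```python
# Sketch: larger travel under a fixed force indicates a softer surface;
# fuller rebound after unloading indicates a more resilient surface.
def firmness_and_resilience(contact_mm, pressed_mm, released_mm):
    """Derive simple firmness/resilience metrics from three probe positions."""
    travel = pressed_mm - contact_mm       # indentation under the pressing force
    rebound = pressed_mm - released_mm     # recovery once the force is removed
    firmness = 1.0 / travel if travel > 0 else float("inf")
    resilience = rebound / travel if travel > 0 else 1.0
    return firmness, resilience
```

For example, a probe that sinks 2 mm under load and recovers 1.5 mm of that travel would score a resilience of 0.75 under this convention.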
A stereoscopic camera 150, laser scanner, and/or other optical scanning device may be provided or coupled to the system for obtaining images comprising at least one surface of the object scanned using the sensor device 130. The images may be captured before or after interaction of the sensor device 130 with the object, and in some embodiments the images are captured while the object is in the same orientation in which the object interacts with the sensor device. Optical data may be used to construct a point cloud or other polygon mesh for visualizing the object remotely. In some embodiments, images of the object or portions thereof may be mapped to a three dimensional coordinate system, for instance a point cloud, either by the camera device 150, the control circuit 110, or another device. In some embodiments, optical or laser scanning may be used to determine the overall three-dimensional shape of the object scanned in order to later replicate or reconstruct the same.
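The mapping of optical data to a point cloud mentioned above may be sketched as follows, assuming a pinhole-camera model with illustrative focal length and principal point; a real system would obtain these from camera calibration rather than the default values used here.

```python
# Sketch: convert a 2-D grid of depth readings (meters) into (x, y, z) points
# using an assumed pinhole model. fx/fy/cx/cy are illustrative defaults.
def depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=None, cy=None):
    """Return a list of (x, y, z) points for all valid depth pixels."""
    rows, cols = len(depth), len(depth[0])
    cx = cx if cx is not None else cols / 2.0
    cy = cy if cy is not None else rows / 2.0
    cloud = []
    for v in range(rows):
        for u in range(cols):
            z = depth[v][u]
            if z <= 0:
                continue  # no return / invalid pixel
            cloud.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return cloud
```

The resulting points could then serve as vertices for the polygon mesh used to visualize the object remotely.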
The haptic feedback device 140 of the system 100 may comprise a plurality of movable structures controllable by the control circuit 110 in order to represent or imitate haptic characteristics of the object detected by the sensor device 130. In some embodiments, the haptic feedback device 140 comprises a flexible membrane. In some embodiments, the device includes one or more moveable elements, which may be buttress structures that support a flexible membrane. In some embodiments, the haptic feedback device 140 includes a plurality of buttress structures arranged in an array of buttress structures corresponding to the array of probes, and the array of buttress structures may be disposed between a base member to which they are mounted and a flexible membrane that the buttress structures support. In some such embodiments, at least a portion of each buttress structure is biased toward the membrane. The buttress structure may further comprise, for example, a solenoid coupled to a power source for modifying a force exerted upon the flexible membrane by the buttress structure. In some embodiments the haptic feedback device 140 may be shaped generally similarly to the objects detected by the sensor device, for instance generally spherical where the detected objects are apples, oranges, or other round produce. In other embodiments, the haptic device may be relatively flat. If the haptic feedback device includes a flexible membrane or outer covering, the membrane or covering may in some embodiments have a texture representative of the objects detected by the sensor device 130.
The head mounted display 160 may be, for instance, a virtual reality or augmented reality headset or goggle device that displays the images of the object captured by the stereoscopic camera. Alternatively in some embodiments the head mounted display may show the user stock images generally representative of an object instead of captured images of a specific object. In some embodiments, the images are mapped over the haptic feedback device 140 so that it appears to the user that they are manipulating the object. The images may be oriented relative to the haptic feedback device 140 in the head mounted display by reference to visual indicators on the haptic device's surface, such as symbols, letters, numbers, QR codes, shapes, or other visual elements, or by other methods (such as radiofrequency markers embedded in the haptic device). In some forms, the control circuit 110 is configured to transmit images (for instance images from the stereoscopic camera 150) to the head mounted display 160 and cause the head mounted display to present the images overlaying or superimposed over the haptic device and oriented so that portions of the images corresponding to portions of the surface detected by specific probes of an array are aligned with corresponding buttress structures of the haptic feedback device 140.
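The marker-based orientation step described above can be illustrated with a simple two-dimensional case: given the detected display positions of two known symbols on the haptic device, recover the rotation and translation that map device coordinates into display coordinates. All coordinates and names are assumptions for this sketch; a real implementation would typically solve a full homography from four or more markers.

```python
import math

# Sketch: align overlay imagery to the haptic device from two fiducial
# markers. dev_* are known marker positions on the device surface; img_*
# are where those markers were detected in the display image.
def device_to_display_transform(dev_a, dev_b, img_a, img_b):
    """Return (angle_radians, (tx, ty)) mapping device points to image points."""
    dev_angle = math.atan2(dev_b[1] - dev_a[1], dev_b[0] - dev_a[0])
    img_angle = math.atan2(img_b[1] - img_a[1], img_b[0] - img_a[0])
    theta = img_angle - dev_angle
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # Translation chosen so the rotated first marker lands on its image position.
    tx = img_a[0] - (cos_t * dev_a[0] - sin_t * dev_a[1])
    ty = img_a[1] - (sin_t * dev_a[0] + cos_t * dev_a[1])
    return theta, (tx, ty)
```

Applying the returned rotation and translation to any point on the device surface then gives the display position where the corresponding image region should be drawn.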
A first amount of force may be applied to the movable portions 222 of the probes 220, preferably simultaneously, so that the distal ends 222b contact the object 205. At this point, the positions of the various movable portions 222 relative to the casings 221 of the probe array may be recorded to determine the general shape of at least a portion of the object 205. Alternatively, the probes 220 may slide into position and be brought into contact with the object 205 solely by gravity in some embodiments. A second amount of force, greater than the first amount of force, may then be applied to the movable portions 222, preferably simultaneously, and the amount of movement of each individual movable portion 222 in the distal direction is recorded to determine the firmness of various regions of the object 205. In the instance depicted, a bruised or soft area 206 leads to greater movement of the adjacent probe 220a than in other probes of the array. Movement of the movable portions 222 in the proximal direction when the pressure applied to the probes by the drive mechanism ceases may be recorded and used to represent resilience of the object 205. In many cases movement of probes will be relatively small, and therefore sensors detecting movement of the probes in many embodiments should be capable of measuring linear movement of 1 mm or less.
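The identification of soft regions such as bruised area 206 from the second-pass readings may be sketched as follows: probes whose movable portions travel noticeably farther than the array average under the same applied force mark a softer area. The threshold factor is an assumption chosen for illustration.

```python
# Sketch: flag probes whose second-pass travel exceeds the array mean by a
# chosen factor, indicating a locally soft region of the object surface.
def soft_regions(displacements_mm, factor=1.5):
    """Return indices of probes whose travel exceeds factor * array mean."""
    mean = sum(displacements_mm) / len(displacements_mm)
    return [i for i, d in enumerate(displacements_mm) if d > factor * mean]
```

In the depicted instance, the probe adjacent the bruise (probe 220a) would report markedly greater travel and be flagged by such a comparison.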
The sound made by impacting the object 205, such as the “thunk” sound generated by tapping or flicking produce with one's finger, may be imitated by applying a third amount of force to one or more of the probes or, alternatively, a separate impact member 240. The impact member 240 may be configured to linearly shift to drive a blunt end 241 into contact with the object 205, or alternatively may be configured to be pivotable and swing into contact with the object 205 in order to better imitate the flicking motion of a human finger. The impact member 240 may be driven by a spring, a pneumatic, hydraulic, or electric drive system, or other means. An audio recording device may be provided for recording sounds generated by contact between the impact member and the object, and for storing the recorded sounds as an audio file. In some embodiments the recording device is located on the impact member or an interior surface of the sensor device for capturing sounds near the point of impact. In alternative embodiments, the recording device may be located at a distance from the impact member to replicate the distance of a user's ear from the object as the object is tapped or flicked with a finger. The audio file recorded during impact of the object may be linked to the identity of the object or other information relating to the object that is stored in a database. A speaker associated with a head mounted display may be provided in some embodiments in order to play the audio file when the user requests sounds relating to the object, and in some embodiments the user requests sounds by making gestures normally associated with generating those sounds, such as tapping or flicking motions.
A cross-sectional view of one example of a haptic feedback device 400 is illustrated in the figures.
The buttress members 420 of the haptic feedback device 400 support the flexible membrane and may be individually adjusted to vary the force each exerts upon the membrane.
The membrane 430 of the haptic feedback device may be textured in order to more accurately reflect the surface of the types of objects scanned. For instance, if produce such as apples, bananas, or watermelon is being scanned by the system, the membrane 430 may be provided with a smooth waxy coating. A slightly bumpy or roughened surface may be provided to replicate oranges, grapefruit, and the like. A cloth covering may be provided if furniture, pillows, or similar items are simulated. In some embodiments, a plurality of interchangeable membranes or a membrane with differing texture regions may be provided, and in some embodiments may be relocated or interchanged automatically by the haptic feedback device according to incoming signals relating to haptic characteristics of a detected object. For instance, several membranes may be stored in a sleeve connected to the feedback device, so that a membrane best approximating the object surface may be slid from the sleeve over the surface of the feedback device. The membrane also may include heating and/or cooling elements in some embodiments to replicate the warmth or coolness of objects, with the temperature of the membrane adjusted by a control circuit based on values from a database associated with the type of item simulated or based on actual temperature measurements taken from the sensed object.
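The automatic membrane selection described above may be sketched as a closest-match lookup: from the membranes stored in the sleeve, choose the texture nearest the roughness value reported for the detected object. The membrane names and roughness scores below are illustrative assumptions only.

```python
# Sketch: stored membranes and an assumed roughness score (0 = smooth,
# 1 = coarse) for each. Values are illustrative, not from the specification.
MEMBRANES = {
    "waxy_smooth": 0.1,    # e.g. apples, watermelon
    "bumpy": 0.5,          # e.g. oranges, grapefruit
    "cloth": 0.9,          # e.g. furniture, pillows
}

def select_membrane(object_roughness: float) -> str:
    """Pick the stored membrane whose roughness best matches the object."""
    return min(MEMBRANES, key=lambda name: abs(MEMBRANES[name] - object_roughness))
```

The feedback device would then slide the selected membrane from the sleeve over its surface before simulation begins.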
The membrane may also incorporate vibratable components to alter the haptic sensations provided, or incorporate electrodes to provide electromagnetic pulses or electrostatic fields capable of replicating a variety of surface textures. For instance, electrostatic haptic feedback touchscreen technology from Senseg OY (Espoo, Finland) or electromagnetic touchscreen technology from Tanvas Corp. (Chicago, Ill.) may be used to create variable haptic surface effects at the surface of the membrane 430. These touchscreens provide impulses to a user's fingertips to provide sensations that simulate subtle surface textures. In particular, US Published Patent Application Numbers 2014/0375580, 2016/0124548, 2016/0349880, 2016/0357342, and 2017/0168572, all of which are hereby incorporated by reference in their entirety, disclose electrostatic haptics in which frictional force along the surface is modulated via an electrical field at the point of contact between the fingertip and the touch surface by placing one or more electrodes on the touch surface of the substrate and insulating those electrodes from the user's fingertip with a dielectric layer. A circuit through the user's finger creates an electrical field, and the circuit may be closed through a second contact at some other part of the body (e.g. a second fingertip) or by two separate electrodes placed under a single contact location to close the circuit through only a single fingertip. In some embodiments, the dielectric layer is about 0.1-50 microns thick, allowing a relatively large electric field to be produced without extremely high voltages. Arrays of electrode pairs on the touch screen surface permit the electric field to be varied from electrode pair to electrode pair, providing a sensation of varying friction across the screen surface that can be patterned to simulate roughened or smooth surfaces, or a combination thereof.
For instance, in some embodiments pairs of electrodes may be positioned in a lattice. By appropriately patterning and stimulating the electrodes, the screen may be configured to mimic cloth, wood, stone, plastic, or various features of organic matter (such as bumps, microabrasions, or surface roughness of fruits and vegetables). In some embodiments, the sensations transmitted to the user may change as the user examines another portion of the object or zooms in on the object. In some embodiments, the screen may be used to examine the texture of only a portion of the object at any given time.
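The patterning step described above may be sketched as a simple mapping from a small texture map to per-electrode drive levels: rougher regions of the map receive stronger stimulation. The linear voltage mapping and the maximum value are assumptions for illustration; real electrostatic displays modulate waveform frequency and amplitude in ways this sketch does not attempt to capture.

```python
# Sketch: scale each electrode pair's drive amplitude by the local roughness
# of a 2-D texture map (values 0..1). The linear mapping is an assumption.
def electrode_amplitudes(texture, v_max=100.0):
    """Map a 2-D roughness grid to per-electrode drive amplitudes."""
    return [[round(v_max * cell, 1) for cell in row] for row in texture]
```

Re-generating the grid as the user pans or zooms across the object would produce the changing sensations described above.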
In some embodiments, the haptic feedback device may be configured to vary both the shape and the firmness/resilience of a membrane in order to simulate a wide variety of different objects.
As shown in the figures, the method proceeds as follows.
A control circuit receives information regarding haptic characteristics of the object based on signals from the sensor device relating to the amount of movement of the probes, and then the control circuit transmits one or more control signals to a haptic feedback device sufficient to cause a plurality of buttress structures to exert an amount of pressure on a flexible membrane (Step 705). The buttress structures may be arranged in an array with each buttress structure corresponding to a different one of the probes of the sensor device so that the pressure exerted on the flexible membrane in a given region corresponds to the firmness of a corresponding region of the object. Optionally, if images of the object have been captured, the control circuit may also transmit the three-dimensional image from the stereoscopic camera to a head mounted display so that the three-dimensional image is mapped to the haptic device. The regions of the three-dimensional image corresponding to portions of the object sensed by specific probes are overlaid on the haptic device at buttress members simulating a firmness level of the corresponding probe.
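The probe-to-buttress mapping of Step 705 may be sketched as follows: for each probe reading, the control circuit computes a force for the corresponding buttress solenoid so that regions where probes sank deeper (a softer object surface) press on the membrane with less force. The linear mapping, force range, and maximum travel are assumptions for this sketch, not values from the specification.

```python
# Sketch: convert per-probe travel (mm) into per-buttress solenoid forces (N),
# one force per buttress in the corresponding array position. Constants are
# illustrative assumptions.
def buttress_forces(probe_travel_mm, max_travel_mm=10.0,
                    f_min=0.5, f_max=5.0):
    """Softer regions (more probe travel) get weaker membrane support."""
    forces = []
    for travel in probe_travel_mm:
        softness = min(max(travel / max_travel_mm, 0.0), 1.0)  # clamp to [0, 1]
        forces.append(f_max - softness * (f_max - f_min))
    return forces
```

The control circuit would transmit each computed force as a drive signal to the solenoid of the corresponding buttress structure.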
In some embodiments, an apparatus and a corresponding method performed by the apparatus, comprises a virtual reality system for presenting a user with haptic sensations representative of the firmness of an object for wholesale or retail purchase, the system having (i) a sensor device for detecting the firmness of an object at multiple areas across at least one surface of the object, the sensor device comprising a support member, a plurality of probes extending from the support member, the probes forming an array of probes and each probe having at least a distal end thereof moveable along a linear path between a proximal position toward the support member and a distal position away from the support member, a pneumatic or hydraulic drive mechanism for linearly shifting the distal ends of the plurality of probes toward their respective distal positions, at least one regulator for applying a set amount of force from the drive mechanism to the plurality of probes, one or more sensors to receive information relating to the amount of movement of the distal ends of the plurality of probes, and a computer memory for receiving and storing the information relating to the amount of movement of the distal ends of the plurality of probes; (ii) a stereoscopic camera for obtaining images comprising the at least one surface of the object and mapping the images to a three dimensional coordinate system; (iii) a haptic feedback device comprising a base member, a flexible membrane coupled to the base member, a plurality of buttress structures arranged in an array of buttress structures corresponding to the array of probes, the array of buttress structures disposed between the base member and flexible membrane, at least a portion of each buttress structure biased toward the membrane, each buttress structure further comprising a solenoid coupled to a power source for modifying a force exerted upon the flexible membrane by the buttress structure; (iv) a head mounted display for displaying 
the images from the stereoscopic camera mapped to the haptic feedback device; and (v) a control circuit configured to receive from the sensor device the information relating to the amount of movement of the distal ends of the plurality of probes, transmit signals to the haptic feedback device for causing the solenoids of the plurality of buttress structures to shift at least a portion of the buttress structures in response to the information relating to the amount of movement of distal ends of corresponding probes so that the force exerted upon the membrane by each buttress member is correlated with the information relating to the amount of movement of the distal end of a corresponding probe of the sensing device so that the firmness of the flexible membrane as supported by the buttress structures is representative of the firmness of the object, and cause the head mounted display to present the images from the stereoscopic camera over the haptic device and oriented so that portions of the images corresponding to portions of the surface detected by specific probes of the array of probes are aligned with corresponding buttress structures from the array of buttress structures.
In some forms, the user of a haptic feedback device may purchase the object being simulated, or another representative object, for instance by manipulating the haptic feedback device in a defined manner, by providing input via a separate device (e.g., a mouse or keyboard), or by issuing voice commands that are detected by a microphone and transmitted to the control circuit. In one example, the user may touch the haptic feedback device to determine its firmness, and then shake or move the haptic feedback device in a predetermined manner in order to initiate a purchase. In some forms, the system may track the user's purchases and determine correlations among the characteristics of purchased items.
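One way the "shake in a predetermined manner" trigger could be implemented is a simple reversal counter over an accelerometer trace from the haptic feedback device. The function name, threshold, and reversal count below are illustrative assumptions only:

```python
# Hypothetical sketch: treat a run of high-magnitude direction reversals in
# one accelerometer axis as a deliberate "purchase" shake gesture.

def detect_shake(accel_samples, threshold=2.0, min_reversals=4):
    """Return True when the trace shows at least `min_reversals` sign
    changes whose magnitude exceeds `threshold` (e.g., in g units)."""
    reversals = 0
    prev_sign = 0
    for a in accel_samples:
        if abs(a) < threshold:
            continue  # Ignore small motions such as ordinary handling.
        sign = 1 if a > 0 else -1
        if prev_sign and sign != prev_sign:
            reversals += 1
        prev_sign = sign
    return reversals >= min_reversals
```

A control circuit could poll this on a sliding window of recent samples and, on a positive result, initiate the purchase flow for the currently simulated object.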
In some embodiments, a plurality of objects are measured as in steps 701 through 704 and information regarding each object is maintained in a database for later use. A user, who may or may not have been involved in the measuring process, may then be presented with the option of selecting one of the previously measured objects, whereupon the information from the database will cause application of forces to buttress structures of a haptic feedback device as in step 705 in order to simulate the particular selected object. In some forms, the apparatus and/or method may employ machine learning in order to narrow the options presented to the user, prioritize the options, and/or make suggestions to the user based on previous selections by the user. For instance, if the system is used to simulate fruits or vegetables, in some embodiments the system will allow the user to rate the fruits and vegetables or provide other input regarding the desirability of certain characteristics. The system may then use this input to select the fruits and vegetables most likely to be purchased by that particular user during subsequent operations. In some embodiments, unsupervised learning may be used to select one or more items for presentation to the user based on past input or correlations between characteristics of purchased items. Information regarding preferences or past purchases may also be used to offer the user incentives.
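The prioritization of previously measured objects described above could be sketched as a similarity ranking against the user's purchase history, where each stored object carries a vector of normalized characteristics (firmness, ratings, and so on). The data layout and scoring rule below are illustrative assumptions, not the disclosed machine-learning method:

```python
# Hypothetical sketch: rank candidate objects by similarity of their
# characteristic vectors (values normalized to 0..1) to past purchases.

def rank_items(items, purchase_history):
    """Sort candidate items best-first by their closest match to any
    previously purchased item's characteristics."""
    def similarity(a, b):
        keys = set(a) & set(b)
        if not keys:
            return 0.0
        # Average per-characteristic closeness over shared keys.
        return sum(1.0 - abs(a[k] - b[k]) for k in keys) / len(keys)

    def score(item):
        if not purchase_history:
            return 0.0
        return max(similarity(item["features"], past["features"])
                   for past in purchase_history)

    return sorted(items, key=score, reverse=True)
```

Under this sketch, a user who repeatedly purchases firm fruit would see firm candidates prioritized; ratings or other desirability input could be folded in as additional keys in the feature dictionaries.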
The system may in some embodiments also allow the user to import preferences of other users. For instance, family members may share preferences in various items, especially food items, due to shared experiences and culture. In some embodiments individual users may have individual user accounts that track individual preferences, purchases, and/or other personal information. Users may link their account to that of another user in order to receive information about the other user's preferences, and may import the other user's preferences and/or purchase history information in whole or in part. In this way, families or other groups may link to one another's preferences to allow machine learning to benefit from a larger data set and make suggestions, prioritize items, and offer incentives based on family member input and/or purchases.
In some embodiments, machine learning may also be used to establish correlations between visual characteristics of items and texture, firmness, resilience, and/or other haptic characteristics, in order to reduce the need to apply a physical sensor device to the objects or to reduce the time needed to use the sensor device. For instance, supervised learning may be employed wherein, when the sensor device is used on a piece of fruit in steps 701-703, optical sensors also measure visual characteristics of the fruit and pair them with the haptic measurements in a database. During future operations on the same type of fruit, the system uses data from prior operations to predict haptic characteristics based on detected visual characteristics, and compares the prediction to actual measurements from the sensor device.
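A minimal form of the supervised visual-to-haptic prediction described above is a least-squares fit of a haptic measurement against a single visual feature (for example, a color value against measured firmness). The one-dimensional linear model below is an illustrative assumption; the disclosure does not specify a particular learning algorithm:

```python
# Hypothetical sketch: fit firmness = a * visual_feature + b from paired
# (visual, haptic) training data, then predict firmness for new items.

def fit_linear(xs, ys):
    """Ordinary least-squares fit over paired training samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    var = sum((x - mx) ** 2 for x in xs)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = cov / var if var else 0.0
    return a, my - a * mx

def predict(model, visual_feature):
    a, b = model
    return a * visual_feature + b
```

In use, the residual between `predict(...)` and a subsequent sensor-device measurement would indicate when the model is reliable enough to skip or shorten physical probing for that item type.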
Those skilled in the art will recognize that a wide variety of other modifications, alterations, and combinations can also be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
This application claims the benefit of U.S. Provisional Application Number 62/640,650, filed Mar. 9, 2018, which is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
62/640,650 | Mar. 9, 2018 | US