Interactive 3D image projection systems and methods

Information

  • Patent Grant
  • Patent Number
    11,288,733
  • Date Filed
    Wednesday, November 14, 2018
  • Date Issued
    Tuesday, March 29, 2022
Abstract
A 3D interface generator computing device (“3D device”) is provided. The 3D device includes at least one sensor, a projector, and a memory that stores (i) a 3D image including 3D elements, (ii) identification information for individuals, each individual associated with a 3D element, and (iii) list data associated with each individual and identifying purchase options. The 3D device includes at least one processor coupled to the memory, the projector, and the sensor. The processor is configured to project the 3D image into a real-world space. The 3D device also receives a first interaction with a 3D element and retrieves the individual associated with the 3D element and a set of list data associated with that individual. The 3D device overlays purchase option images onto the 3D image, each representing an option in the list data, receives a second interaction with a purchase option image, and generates a purchase transaction request for the selected purchase option.
Description
BACKGROUND

The field of the disclosure relates generally to three-dimensional (3D) images and, more specifically, to network-based systems and methods for providing user interactions with 3D images.


Holidays can be a stressful time for many people. Winter holidays, for example, often involve activities such as arranging decorations with ornaments, purchasing gifts, hosting parties with friends and family, and preparing elaborate meals. Other events, such as weddings and birthday parties, may require extensive planning by both the hosts and the guests. A host of such an event may wish to acquire decorations that are difficult and expensive to obtain. Removal of the decorations afterwards may also be time-consuming and costly. For example, in many cultures, decorating a Christmas tree during the winter holidays may include purchasing a large tree, transporting the tree to the display site, decorating the tree with ornaments, maintaining the tree over the course of the holiday season, and finally removing and packing the ornaments and transporting the tree for trash collection afterwards.


Furthermore, guests attending holiday events or parties to celebrate special occasions may wish to bring a gift to present to the host or other attendees of the event. However, shopping for gifts can be a time-consuming activity. Predicting an appropriate gift for a recipient may be a particularly difficult task, especially if the recipient's interests are not well known. In some cases, a registry of gift ideas may be made available by the recipient. However, many people do not maintain a list of desirable items for all of the holiday events that occur throughout a calendar year.


BRIEF DESCRIPTION

In one aspect, a 3D interface generator computing device is provided. The 3D interface generator includes at least one sensor, a projector, and a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) identification information for a plurality of individuals, wherein each of the plurality of individuals is associated with a respective one of the 3D elements, and (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options. The 3D interface generator also includes at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor. The at least one processor is configured to command the projector to project the 3D image into a real-world space. The at least one processor is also configured to receive, from the at least one sensor, an indication of a first physical interaction by a user with a first of the 3D elements in the real-world space. The at least one processor is further configured to retrieve, from the memory device in response to the first physical interaction, a first of the individuals associated with the first 3D element, and a first of the sets of list data associated with the first individual. The at least one processor is also configured to command the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data. The at least one processor is further configured to receive, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space. The at least one processor is also configured to generate, in response to the second physical interaction, a purchase transaction request by the user for the one of the purchase options.


In another aspect, a computer-implemented method is provided. The method is executed by a 3D interface generator computing device that includes at least one sensor, a projector, and a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) identification information for a plurality of individuals, wherein each of the plurality of individuals is associated with a respective one of the 3D elements, and (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options. The 3D interface generator computing device also includes at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor. The method includes commanding the projector to project the 3D image into a real-world space. The method also includes receiving, from the at least one sensor, an indication of a first physical interaction by a user with a first of the 3D elements in the real-world space. The method further includes retrieving, from the memory device in response to the first physical interaction, a first of the individuals associated with the first 3D element, and a first of the sets of list data associated with the first individual. The method also includes commanding the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data. The method further includes receiving, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space. The method also includes generating, in response to the second physical interaction, a purchase transaction request by the user for the one of the purchase options.


In yet another aspect, at least one non-transitory computer-readable storage medium having computer-executable instructions is provided. When executed by a 3D interface generator computing device, including at least one sensor, a projector, a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) identification information for a plurality of individuals, wherein each of the plurality of individuals is associated with a respective one of the 3D elements, and (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options, and at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor, the instructions cause the at least one processor to command the projector to project the 3D image into a real-world space. The computer-executable instructions further cause the at least one processor to receive, from the at least one sensor, an indication of a first physical interaction by a user with a first of the 3D elements in the real-world space. The computer-executable instructions further cause the at least one processor to retrieve, from the memory device in response to the first physical interaction, a first of the individuals associated with the first 3D element, and a first of the sets of list data associated with the first individual. The computer-executable instructions further cause the at least one processor to command the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data. The computer-executable instructions further cause the at least one processor to receive, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space. The computer-executable instructions further cause the at least one processor to generate, in response to the second physical interaction, a purchase transaction request by the user for the one of the purchase options.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic diagram of an example 3D image projection system in accordance with one embodiment of the present disclosure.



FIG. 2 is a simplified block diagram of a 3D interface generator device shown in FIG. 1.



FIG. 3 is a schematic diagram of an example 3D user interface for the 3D image projection system shown in FIG. 1.



FIG. 4 is another schematic diagram of the example 3D user interface shown in FIG. 3.



FIG. 5 is a simplified block diagram of an example server configuration of the 3D image projection system shown in FIG. 1.



FIG. 6 is a simplified block diagram of an example user computing device that may be used with the system shown in FIG. 1.



FIG. 7 is a simplified block diagram of an example server system that may be used with the system shown in FIG. 5.



FIG. 8 illustrates a flow chart of an example process for user interaction with the 3D interface generator device shown in FIG. 1.



FIG. 9 is a diagram of components of one or more example computer devices that are used in the computing system shown in FIG. 1.





DETAILED DESCRIPTION

The systems and methods described herein are related to 3D image projection systems, and more specifically, to interactive user interface systems for selecting and executing transactions for desired items via interaction with a projected three-dimensional (3D) display.


An interactive 3D image projection system and associated devices and methods are described herein. The interactive 3D image projection system includes a 3D interface generator device that is configured to project a 3D image in real-world space. For example, during the winter holidays, the 3D image may be of a 3D Christmas tree or other 3D holiday decoration. As such, the 3D Christmas tree is projected into a real-world space (e.g., as a volumetric image near the projection device) and is visible to people located near the device.


The 3D image projection system may be reprogrammed to display other 3D images for other occasions or for general decorative purposes. For example, the 3D image projection system may be programmed to display a birthday cake during a child's birthday party, a ghost or ghoul during Halloween, or a sports team's emblem or other symbolic emblem during major sporting events or national holidays. In some embodiments, the 3D image projection system may be complemented by other devices to enhance the experience, such as a scent-diffusing system for dispersing a particular holiday fragrance near the 3D image or an audio output system for playing holiday music.


The 3D image projection system also includes sensors to detect and identify nearby users and to facilitate user interactions with the 3D image. For example, a motion sensing input device may be configured to detect hand gestures or other physical interaction performed by a user near the 3D image, thereby allowing the user to interact with the 3D image in various ways (e.g., via the user pointing at, or “pressing,” virtual objects within the 3D image). The user may be allowed to customize aspects of the 3D image, such as selecting from various types of trees, changing the size of the tree to fit the real-world space, displaying virtual ornaments or other virtual decorations on the tree, or customizing certain virtual ornaments (e.g., assigning a particular virtual ornament to a friend or relative and uploading an image of that friend or relative to be displayed on the projected 3D ornament).


In one example embodiment, the interactive 3D image projection system assists the user with gift purchasing for a holiday or other event associated with the projected 3D display. For example, the interactive 3D image projection system may allow the user to interact with a virtual ornament associated with a particular friend on the 3D Christmas tree. Interacting with the virtual ornament enables the user to see gift ideas for that friend. When the user presses the virtual ornament, the interactive 3D image projection system displays a list of potential gifts for that friend (e.g., from a gift registry of the friend, based on a profile of the friend). Upon selection of a particular gift, the interactive 3D image projection system executes a purchase transaction of the identified gift on behalf of the user. As such, the interactive 3D image projection system provides both a virtual replacement for the underlying physical object (e.g., in the form of the displayed 3D Christmas tree) and interactive capabilities that allow the user to perform gift purchases through a natural user interface presented in conjunction with the 3D image.


In some embodiments, the registration of users includes opt-in informed consent of users to data usage by the interactive 3D image projection system consistent with consumer protection laws and privacy regulations. In some embodiments, the enrollment data and/or other collected data may be anonymized and/or aggregated prior to receipt such that no personally identifiable information (PII) is received. In other embodiments, the system may be configured to receive enrollment data and/or other collected data that is not yet anonymized and/or aggregated, and thus may be configured to anonymize and aggregate the data. In such embodiments, any PII received by the system is received and processed in an encrypted format, or is received with the consent of the individual with which the PII is associated. In situations in which the systems discussed herein collect personal information about individuals, or may make use of such personal information, the individuals may be provided with an opportunity to control whether such information is collected or to control whether and/or how such information is used. In addition, certain data may be processed in one or more ways before it is stored or used, so that personally identifiable information is removed.


The technical problems addressed by the systems and methods described herein include at least one of the following: (i) storing calendar event data and user information; (ii) displaying a 3D image in a real-world space relevant to the calendar event data; (iii) detecting and identifying nearby users; (iv) adapting the 3D image to user interaction via a natural user interface; (v) displaying identifiers associated with gift recipients; (vi) identifying potential purchases associated with an identified gift recipient; and (vii) executing a transaction based on user interaction via the natural user interface.


The resulting technical effect achieved by the systems and methods described herein is at least one of: (i) replacing a real-world object with a 3D image representing the replaced object; (ii) allowing users to configure presentational aspects of the 3D image; (iii) allowing users to interact with the 3D image via a natural user interface; (iv) automatically presenting gift purchase options for recipients; and (v) allowing users to view gift lists and execute purchase transactions via the natural user interface.


In one embodiment, a computer program is provided, and the program is embodied on a computer-readable medium. In an example embodiment, the system is executed on a single computer system, without requiring a connection to a server computer. In a further example embodiment, the system is run in a Windows® environment (Windows is a registered trademark of Microsoft Corporation, Redmond, Wash.). In yet another embodiment, the system is run on a mainframe environment and a UNIX® server environment (UNIX is a registered trademark of X/Open Company Limited located in Reading, Berkshire, United Kingdom). In a further embodiment, the system is run on an iOS® environment (iOS is a registered trademark of Cisco Systems, Inc. located in San Jose, Calif.). In yet a further embodiment, the system is run on a Mac OS® environment (Mac OS is a registered trademark of Apple Inc. located in Cupertino, Calif.). In still yet a further embodiment, the system is run on Android® OS (Android is a registered trademark of Google, Inc. of Mountain View, Calif.). In another embodiment, the system is run on Linux® OS (Linux is a registered trademark of Linus Torvalds of Boston, Mass.). The application is flexible and designed to run in various different environments without compromising any major functionality. The following detailed description illustrates embodiments of the disclosure by way of example and not by way of limitation. It is contemplated that the disclosure has general application to providing an on-demand ecosystem in industrial, commercial, and residential applications.


As used herein, an element or step recited in the singular and preceded with the word “a” or “an” should be understood as not excluding plural elements or steps, unless such exclusion is explicitly recited. Furthermore, references to “example embodiment” or “one embodiment” of the present disclosure are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features.



FIG. 1 is a schematic diagram of an example interactive 3D image projection system (or just “projection system”) 100. In the example embodiment, projection system 100 includes a 3D interface generator device 120 (or just “3D device”) and a server system 170. 3D device 120 is configured to project a 3D image 160 into a real-world space 122 near to (e.g., above) 3D device 120. In the example embodiment, 3D image 160 is a 3D representation of a Christmas tree 130 (e.g., a virtual Christmas tree). Further, a user 110 may interact (e.g., wirelessly) with 3D device 120 using a user computing device 150 (e.g., for configuring 3D image 160, sharing profile information, detecting proximity of user 110 with 3D device 120, and so forth).


In the example embodiment, 3D image 160 is a three-dimensional image (e.g., a volumetric image). In some embodiments, 3D image 160 is primarily a two-dimensional “planar” image. For example, a “flat” image may be projected vertically, perpendicular to a horizontal plane of 3D device 120, such that a projected virtual screen is displayed to user 110. Additionally or alternatively, a 3D image may be projected on a horizontal plane from 3D device 120 such that user 110 may view downwards from a standing position and interact with the 3D image similar to items on a table.


In the example embodiment, 3D image 160 includes 3D elements 140. In the example embodiment, 3D elements 140 are ornaments associated with a holiday theme and are used to decorate 3D Christmas tree 130. In some embodiments, 3D elements 140 are user configurable. For example, user 110 may record and upload video or images to be displayed as 3D elements 140 or embedded on or integrated with 3D elements 140. In one embodiment, user 110 may purchase additional 3D elements 140 to further customize the decoration of 3D Christmas tree 130. In alternative embodiments, 3D elements 140 are any suitable, separately recognizable visual elements overlaid on 3D image 160.


In some embodiments, 3D image 160 may be animated (e.g., include aspects of motion over time). For example, 3D image 160 may rotate about a vertical axis of 3D image 160 (not shown, e.g., the vertical center line of 3D Christmas tree 130), such that every part of the circumference of 3D image 160 passes a stationary viewing position. Further, 3D elements 140 may rotate about the same vertical axis at the same rate of rotation as 3D Christmas tree 130, such that 3D elements 140 remain stationary with respect to 3D Christmas tree 130 during the rotation. In other words, as user 110 views 3D Christmas tree 130 from a stationary perspective, user 110 is able to view different 3D elements 140 as 3D image 160 rotates. 3D image 160 may be configured to display other animations, videos, or images. Additionally or alternatively, 3D device 120 is configured to modify 3D image 160 in other aspects such as changes in color, lighting, depth, size, shape, intensity, etc. In some embodiments, 3D device 120 is configured to allow user 110 to add 3D elements 140 to, or remove them from, 3D Christmas tree 130. In some embodiments, 3D elements 140 are animated to move relative to 3D Christmas tree 130. For example, 3D elements 140 may be snowflakes descending around 3D Christmas tree 130.
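
By way of illustration, the following minimal Python sketch shows one way such a shared rotation might be computed per animation frame; the function and parameter names are hypothetical and not taken from the disclosure:

    import math

    def rotate_about_vertical_axis(point, angle_rad):
        """Rotate an (x, y, z) point about the vertical (z) axis through the
        origin; height is unchanged, modeling rotation about the tree's
        vertical center line."""
        x, y, z = point
        cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
        return (x * cos_a - y * sin_a, x * sin_a + y * cos_a, z)

    def animate_frame(tree_points, element_points, t, angular_speed=0.5):
        """Advance the scene to time t (seconds). The tree and the 3D elements
        share one rotation rate, so the elements stay stationary relative to
        the tree while both rotate past a fixed viewer."""
        angle = angular_speed * t  # radians
        return ([rotate_about_vertical_axis(p, angle) for p in tree_points],
                [rotate_about_vertical_axis(p, angle) for p in element_points])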


In the example embodiment, 3D device 120 determines which 3D image 160 to display based on calendar events stored in a memory. More specifically, 3D image 160 may be an image associated with holidays, birthdays, or other events associated with specific dates in a calendar. For example, 3D image 160 may be a birthday cake if the current date is near the stored date of a person's birthday.
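
A minimal sketch of this date-driven selection, assuming calendar events are stored as recurring month/day entries tied to image keys (the names, the seven-day window, and the event record format are illustrative assumptions):

    import datetime

    def select_image_key(calendar_events, today, window_days=7, default="christmas_tree"):
        """Choose which stored 3D image to project: the image tied to the
        calendar event closest to today's date within window_days, else a
        default image. Events recur yearly (month/day); year-end wraparound
        is ignored for brevity."""
        best_key, best_gap = default, window_days + 1
        for event in calendar_events:
            event_date = datetime.date(today.year, event["month"], event["day"])
            gap = abs((event_date - today).days)
            if gap < best_gap:
                best_key, best_gap = event["image"], gap
        return best_key

    # Example: a birthday three days away wins over the default image.
    events = [{"month": 12, "day": 25, "image": "christmas_tree"},
              {"month": 7, "day": 14, "image": "birthday_cake"}]
    print(select_image_key(events, datetime.date(2018, 7, 11)))  # birthday_cake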


In the example embodiment, 3D image 160 is a 3D natural user interface. For example, user 110 interacts with 3D elements 140 to configure the 3D image 160 displayed. Interaction with the 3D user interface is described further below. User 110 may also use user computing device 150 to communicate with 3D device 120. In one embodiment, user computing device 150 communicates wirelessly with 3D device 120 using an app installed on user computing device 150. The app is configured to perform control functions at least similar to those performed by interacting with 3D image 160, as described below.



FIG. 2 is a simplified block diagram illustrating an example embodiment of 3D device 120. Components of 3D device 120 may include, for example, modules such as a 3D image projector 210, a processor 220, a memory 230, an audio component 240, at least one sensor 250, a scent diffuser 260, a database 270, a communication module 280, and a user interface module 290. Additional or fewer components may be used in projection system 100.


In the example embodiment, audio component 240 includes at least one speaker to play pre-recorded audio files such as holiday music. Audio component 240 may be configured to play other sounds such as ambient sounds, alerts, vocalized speech, sounds associated with interaction with the 3D natural user interface, and sounds configured for additional accessibility. Additionally or alternatively, audio component 240 may be configured to stream live music, sounds, or speech. Multiple speaker configurations, including speakers of varying type and size, may be used. In one embodiment, additional speakers in a wireless configuration may be used where 3D device 120 transmits audio data to the additional speakers.


In the example embodiment, audio component 240 includes a microphone or other sound detecting and/or recording equipment. The microphone may be used for detection and identification purposes. In a multi-microphone system, triangulation of sounds may be used to determine the location of individuals nearby. User 110 speaking near 3D device 120 may be identified through vocal speech patterns. In some embodiments, user 110 (shown in FIG. 1) may issue oral commands to 3D device 120 via audio input systems such as a microphone. In other embodiments, user 110 may speak or sing into the microphone for amplification through audio component 240.
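
One possible approach to the multi-microphone localization mentioned above is a coarse grid search over room coordinates that minimizes the mismatch in times of arrival; the room dimensions, grid step, and function names below are assumptions for illustration only:

    import itertools
    import math

    SPEED_OF_SOUND = 343.0  # meters per second in room-temperature air

    def locate_sound_source(mic_positions, arrival_times, room=(5.0, 5.0), step=0.05):
        """Estimate a sound source's (x, y) position from arrival times at
        three or more microphones by minimizing the mismatch between predicted
        and observed time differences of arrival (TDOA) over a grid of
        candidate room positions."""
        pairs = list(itertools.combinations(range(len(mic_positions)), 2))

        def tdoa_error(p):
            delays = [math.dist(p, m) / SPEED_OF_SOUND for m in mic_positions]
            return sum(((delays[i] - delays[j]) - (arrival_times[i] - arrival_times[j])) ** 2
                       for i, j in pairs)

        candidates = [(x * step, y * step)
                      for x in range(int(room[0] / step) + 1)
                      for y in range(int(room[1] / step) + 1)]
        return min(candidates, key=tdoa_error)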


The at least one sensor 250 may include any type of sensor including, for example, one or more of a camera, an infrared sensor, a proximity sensor, a motion sensor, a magnetic sensor, an optical sensor, a fingerprint and/or thumbprint scanner, a GPS sensor, or an electromagnetic detection sensor. 3D device 120 may include any number and/or type of sensors 250 to further enhance the functionality of projection system 100.


In one embodiment, the at least one sensor 250 is configured to detect motions near (e.g., above) 3D device 120. For example, the at least one sensor 250 detects motion within a virtual boundary of 3D image 160 and determines that user 110 is interacting with 3D image 160. 3D device 120 determines, based on data received from the at least one sensor 250, a pattern of motion (e.g., a gesture). 3D device 120 compares the gesture to a library of gestures stored in database 270 to determine a corresponding command to execute. For example, if the command is associated with a purchase transaction, 3D device 120 determines that an item represented in the currently displayed 3D image is the desired gift item that user 110 wishes to purchase. 3D device 120 executes a purchase transaction for that gift item.
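
A minimal Python sketch of one way such a gesture lookup might work; the resampling-by-index shortcut, the 0.15 threshold, and the function names are assumptions rather than details from the disclosure:

    import math

    def resample(trace, n=32):
        """Reduce a gesture trace (a list of (x, y, z) points) to n points by
        index; a production system would resample by arc length instead."""
        step = (len(trace) - 1) / (n - 1)
        return [trace[round(i * step)] for i in range(n)]

    def match_gesture(trace, gesture_library, threshold=0.15):
        """Compare a sensed motion trace against stored templates, mirroring
        the lookup against the gesture library in database 270, and return the
        command of the closest template, or None when nothing matches within
        the threshold."""
        best_command, best_score = None, float("inf")
        for command, template in gesture_library.items():
            a, b = resample(trace), resample(template)
            score = sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)
            if score < best_score:
                best_command, best_score = command, score
        return best_command if best_score <= threshold else None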


In one embodiment, a GPS sensor may be used to determine the presence of nearby people. For example, if user computing device 150 is equipped with a GPS module, projection system 100 may determine, based on the location of 3D device 120 and the location of user computing device 150, that user 110 is nearby. Additionally or alternatively, if known users 110 are not present, the 3D image may be adapted to display information relevant to those users 110 who are present. Additionally or alternatively, the at least one sensor 250 may also include a proximity sensor acting in conjunction with the GPS module.
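
For illustration, a minimal sketch of such a proximity test using the haversine great-circle distance between the two reported GPS positions; the 25-meter presence radius is an assumed value:

    import math

    def is_user_nearby(device_coords, user_coords, radius_m=25.0):
        """Compare the positions of 3D device 120 and user computing device
        150. Coordinates are (latitude, longitude) in degrees."""
        (lat1, lon1), (lat2, lon2) = device_coords, user_coords
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi, dlam = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        # Haversine formula; 6371000.0 is the mean Earth radius in meters.
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        distance_m = 2 * 6371000.0 * math.asin(math.sqrt(a))
        return distance_m <= radius_m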


The at least one sensor 250 may also be used by user 110 to engage in a payment transaction. In another embodiment, the at least one sensor 250 includes an infrared sensor configured to detect the presence of people in a low-light setting. Additionally or alternatively, the infrared sensor may be used for receiving data transmissions from user computing device 150. In the example embodiment, the at least one sensor 250 includes at least one camera to detect the presence of user 110 and to identify user 110 using technologies such as facial recognition. Additionally or alternatively, user 110 may identify themselves via other sensors such as a fingerprint scanner, a keyboard, a touchscreen, an NFC identification card, or a payment card reader, or by using user computing device 150.


In the example embodiment, scent diffuser 260 is configured to enhance the experience of user 110 by releasing into the surrounding air an aroma that is associated with, for example, 3D Christmas tree 130. For example, if 3D Christmas tree 130 is actively displayed, scent diffuser 260 may emit a pine tree aroma. In one embodiment, user 110 may configure 3D device 120 to emit additional or alternative aromas such as cinnamon, pumpkin, spices, or baked goods such as apple pie. In some embodiments, scent diffuser 260 may also include functionality to humidify the surrounding air.


3D image projector 210 may include various technologies configured to generate a 3D holographic image, volumetric image, stereoscopic image, or other visualization designed to simulate or display object depth. In one embodiment, 3D image projector 210 includes at least one mirror and at least one light source such as a laser. In some embodiments, a beam splitter may be used to generate a scattered beam and combined with a reference beam to generate the appearance of a 3D image. In some embodiments, the at least one sensor 250 may be used to receive emitted light and to identify or determine emitted light patterns.


In the example embodiment, 3D image projector 210 receives, from processor 220, commands or instructions regarding patterns or images to generate. Images and/or patterns may be stored in memory 230. Processor 220 retrieves image pattern data from memory 230 for generation using 3D image projector 210. In some embodiments, the particular 3D image selected for generation may be stored in database 270. Database 270 also may store user profiles, and certain images may be associated with specific user profiles stored in database 270. For example, if user 110 has a birthday within a number of days, processor 220 retrieves the date information and determines that the 3D image to be generated is a birthday cake. Processor 220 retrieves 3D birthday cake image data from memory 230 and determines instructions to cause 3D image projector 210 to project the 3D cake. In the example embodiment, processor 220 determines that the current date is around the winter holidays and retrieves from memory 230 3D Christmas tree data. Processor 220 causes 3D image projector 210 to generate the 3D Christmas tree.


Processor 220 may also query database 270 to determine a scent associated with 3D image 160. If a scent is determined, processor 220 causes scent diffuser 260 to emit the associated scent. Multiple scents may be associated with 3D image 160. In some embodiments, multiple scents may be emitted by scent diffuser 260 simultaneously. In other embodiments, scents may be emitted by scent diffuser 260 in a predetermined order. Additionally or alternatively, processor 220 may randomize the order or combination of scents to emit.


Processor 220 may also query database 270 to determine if any sounds are associated with 3D image 160. Processor 220 causes audio component 240 to play the determined sounds. In some embodiments, multiple sounds are associated with 3D image 160. Database 270 may include associated rank or identifiers for the sounds that are associated with 3D image 160. In one embodiment, processor 220 randomizes the order of sounds to be played by audio component 240. In another embodiment, processor 220 synchronizes the audio output with changes in the visual elements of the 3D image generated by 3D image projector 210. For example, songs with a specific tempo may accompany corresponding alterations in the colors, intensity, or size of 3D image 160 at the rate of the tempo of the song.
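
A small sketch of such tempo synchronization, assuming the visual change is a brightness pulse with one cycle per beat (the depth parameter and names are illustrative):

    import math

    def intensity_at(t_seconds, bpm, base_intensity=1.0, depth=0.2):
        """Pulse the projected image's intensity in time with a song's tempo:
        one brightness cycle per beat; depth sets how pronounced the pulse is."""
        beats_per_second = bpm / 60.0
        return base_intensity * (1.0 + depth * math.sin(2 * math.pi * beats_per_second * t_seconds))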


In one embodiment, communication module 280 is an NFC device. In other embodiments, communication module 280 may be another device capable of transmission across the electromagnetic frequency spectrum, such as a Wi-Fi device capable of connecting to the Internet. In the example embodiment, communication module 280 may communicatively couple 3D device 120 to the Internet through many interfaces including, but not limited to, at least one of a network, such as a local area network (LAN), a wide area network (WAN), or an integrated services digital network (ISDN), a dial-up connection, a digital subscriber line (DSL), a cellular phone connection, and a cable modem.


User interface module 290 implements a 3D natural user interface and is described further below.



FIG. 3 illustrates an example 3D natural user interface that enables user 110 to engage projection system 100. In the example embodiment, 3D device 120 generates 3D image 160. The at least one sensor 250 detects movements performed by user 110. For example, sensor 250 may be configured to detect physical motion performed by user 110, such as waving of hands and/or fingers. In some embodiments, the at least one sensor 250 may be configured to detect facial movements. In the example embodiment, user 110 interacts with the 3D natural user interface by physical interaction via a gesture, or a combination of gestures and/or vocalizations, that is matched to stored patterns in database 270. For example, user 110 may use a pointing gesture or a swiping gesture within a proximate distance of a particular 3D element 140. The at least one sensor 250 may determine the location of the gesture and determine that the gesture is near the particular 3D element 140. The at least one sensor 250 may include a motion detection device, such as a plurality of cameras or lasers, to identify the three-coordinate location of the gesture in real-world space 122. The location information is compared to the projection information stored in memory 230 for 3D image 160. Each 3D element 140 may be associated with a particular person.
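
For illustration, a minimal sketch of mapping a sensed gesture location to a projected element, under the assumption that element positions are kept in projected-space coordinates; the 0.10-meter threshold and names are assumptions:

    import math

    def select_3d_element(gesture_location, element_positions, threshold_m=0.10):
        """Map a sensed gesture location (x, y, z) in real-world space 122 to
        the nearest projected 3D element, if one lies within a threshold
        distance. element_positions maps element identifiers to projected
        coordinates, as stored with the 3D image data in memory 230."""
        best_id, best_dist = None, float("inf")
        for element_id, position in element_positions.items():
            d = math.dist(gesture_location, position)
            if d < best_dist:
                best_id, best_dist = element_id, d
        return best_id if best_dist <= threshold_m else None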


In some embodiments, interactions with the 3D natural user interface may be supplemented by data received from user computing device 150. In some embodiments, user computing device 150 is communicatively coupled to 3D device 120. For example, user 110 may activate a 3D element 140 by using a gesture. User 110 is then prompted for additional information such as a name, password, code, or confirmation. User 110 may then use user computing device 150 to input the additional information and transmit the data to 3D device 120 by, for example, using a keyboard, touchscreen, or other input device associated with user computing device 150. Additionally or alternatively, user computing device 150 may store in memory information about persons associated with user 110, which may be transmitted to 3D device 120. In other embodiments, an app installed on user computing device 150 may be used to select a person associated with user 110, and information about the selected person may be transmitted to 3D device 120.


In the example embodiment, the at least one sensor 250 is configured to identify user 110. For example, user 110 may be identified through image recognition. 3D device 120 retrieves from a database, such as database 270 (shown in FIG. 2), information regarding people associated with user 110. 3D device 120 displays on 3D image 160 identification information 310 for the associated people. In the example embodiment, identification information 310 for each associated person is displayed on a respective 3D element 140. In the example embodiment, a photographic image is overlaid on the 3D elements 140. In some embodiments, identification information 310 may be a 3D image. For example, identification information 310 may be displayed as a 3D image enclosed within a 3D space inside a 3D element 140. In other embodiments, identification information 310 may be text or symbolic identifiers. In some embodiments, identification information 310 may be animated. In some embodiments, identification information 310 is a continuously looping video previously uploaded to 3D device 120, the video identifying a particular person. In some embodiments, audio aspects may also be synchronized to the display of identification information 310.



FIG. 4 illustrates an example activation of one of 3D elements 140 on 3D image 160 generated by 3D device 120. In the example embodiment, user 110 activates a 3D element 140 through a gesture and/or via user computing device 150. User 110 visually determines the identity of the person associated with identification information 310 by examining the overlaid identification information 310. In the example embodiment, user 110 selects a person as a designated recipient of a gift. Upon selection of a 3D element 140, 3D device 120 generates a 3D gift list 410, which appears as a pop-up and/or dropdown menu list. In the example embodiment, gift list 410 is populated by icons of preferred gift items.


In the example embodiment, other users 110 register with system 100 to define the list of preferred gift items. The at least one sensor 250 on 3D device 120, which may include, for example, a camera, may capture an identification image. A photographic image and/or a video recording of each user 110, captured by a corresponding 3D device 120, may be stored in database 270 (shown in FIG. 2) and associated with the respective user 110. Each user 110 may also record audio greetings that may be played when the user is selected. Registration may include providing personal information such as name, address, and payment account information. Each user 110 may upload personal identifying information from a corresponding user computing device 150. In some embodiments, the camera may be used to scan text-based information presented by user 110, which is then stored in database 270.


The list of preferred gift items may be defined by each registered user 110. For example, upon registration, 3D device 120 displays a 3D list of possible items. User 110 selects from the predetermined list of possible items to determine a list of preferred items. User 110 may use various gestures to navigate the 3D user interface. For example, a swipe down may scroll a list down. Multi-point gestures involving multiple hands or fingers may also be used. For example, initiating a gesture with two points in close proximity and ending the gesture with the two points farther apart, in a widening gesture, may invoke a specific response from 3D device 120, such as displaying additional options. In some embodiments, the 3D user interface may display the potential items in other formats. For example, potential items may be represented by a category, and selection of the category may expand the list into a tree. Additionally or alternatively, user 110 may have an account with a merchant. Projection system 100 may be configured to communicate with third-party merchants to retrieve a list of preferred items stored with the merchant. Additionally or alternatively, user 110 may use user computing device 150 to upload a predetermined list of preferred gift items to 3D device 120.
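
A small sketch of one way the two-point widening gesture described above might be classified; the 1.5 spread ratio and return labels are illustrative assumptions:

    import math

    def classify_two_point_gesture(start_pair, end_pair, spread_ratio=1.5):
        """Classify a two-point (e.g., two-finger) gesture from its start and
        end contact positions: a widening gesture expands the interface (for
        example, to display additional options); a narrowing gesture collapses
        it (for example, folding a category tree back up)."""
        d_start = math.dist(start_pair[0], start_pair[1])
        d_end = math.dist(end_pair[0], end_pair[1])
        if d_end >= spread_ratio * d_start:
            return "expand"
        if d_end <= d_start / spread_ratio:
            return "collapse"
        return None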


In the example embodiment, user 110 selects a gift for another person by interacting with an icon 420 in gift list 410. Icon 420 represents a preferred gift for a designated recipient represented by identification information 310 on element 140. In some embodiments, icon 420 may be an animated 3D image, a static 3D image, or a text or symbolic identifier. In the example embodiment, activation of one icon 420 automatically transmits a request for a purchase transaction for the associated gift item. The request may include gift information, account information, and gift recipient information. In some embodiments, the request may be transmitted to a merchant. In other embodiments, the request may be transmitted to a payment processor via a payment processing network. In alternative embodiments, activation of an icon 420 causes 3D device 120 to transmit a purchase request visually, such as, for example, by displaying a code such as a QR code. User 110 may use user computing device 150 to scan the code. User computing device 150 may then automatically execute a purchase transaction based on the scanned code. In yet other embodiments, the request may be transmitted to user computing device 150 as a push notification for execution of a purchase transaction via user computing device 150. In some embodiments, the request may include account information associated with user 110. In other embodiments, account information may be retained by a third party such as, for example, merchant 520 or payment processor 510 (both shown in FIG. 5).
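
A minimal sketch of assembling such a request as a JSON payload; the field names and record layout are illustrative, and an actual deployment would follow the message format of the merchant or payment processor receiving the request:

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class PurchaseTransactionRequest:
        purchaser_id: str    # registered user 110 making the gift purchase
        recipient_id: str    # individual represented by identification information 310
        item_id: str         # gift item behind the activated icon 420
        account_token: str   # tokenized payment credentials, never a raw account number

    def build_purchase_request(user, recipient, item_id):
        """Assemble the request generated on activation of an icon 420, ready
        for transmission to a merchant or payment processor."""
        return json.dumps(asdict(PurchaseTransactionRequest(
            purchaser_id=user["id"],
            recipient_id=recipient["id"],
            item_id=item_id,
            account_token=user["account_token"],
        )))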



FIG. 5 is a simplified block diagram of an example server configuration of server system 170 that may be used with projection system 100 (shown in FIG. 1) in an example environment 500. In the example embodiment, 3D device 120 is communicatively coupled to a server computer device 530 of server system 170. Server computer device 530 may include a database server 540 in remote communication with a database 550. In some embodiments, server computer device 530 locally includes database 550. In the example embodiment, server computer device 530 is in communication with a payment processor 510. Server computer device 530 receives the purchased gift selection from 3D device 120, receives user 110 information from 3D device 120 and/or retrieves user 110 account information from database 550, and transmits to payment processor 510 transaction data for purchase transactions engaged in by user 110. Alternatively, server computer device 530 receives user 110 account information from user computing device 150. Payment processor 510 then communicates with a merchant 520 to complete the gift purchase transaction.
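
A sketch of the server-side flow just described, where database and payment_processor stand in for database 550 and payment processor 510; the get_account and submit_transaction interfaces are placeholders, not a real API:

    def handle_gift_selection(selection, database, payment_processor):
        """Look up the purchaser's stored account, assemble transaction data,
        and forward it to the payment processor, which then settles with the
        merchant."""
        account = database.get_account(selection["purchaser_id"])
        transaction_data = {
            "account_token": account["token"],
            "item_id": selection["item_id"],
            "recipient_id": selection["recipient_id"],
        }
        return payment_processor.submit_transaction(transaction_data)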


Server computer device 530 may also be in direct communication with merchant 520. Merchant 520 may, for example, provide account information for registered users. In some embodiments, merchant 520 may upload and store gift data in database 550. In some embodiments, merchant 520 may access database 550 to retrieve transaction data.


In the example embodiment, server computer device 530 is also in direct communication with user computing device 150. User 110 may use user computing device 150 to provide registration information. Additionally or alternatively, user 110 interacts with 3D device 120 to access database 550. For example, user 110 may wish to add additional preferred gift items associated with the user's account. Using the 3D natural user interface generated by 3D device 120, user 110 may select preferred gift items from a list of gifts presented as described above.



FIG. 6 illustrates an example configuration of a client system 602 that may be used to implement user computing device 150 (shown in FIG. 1) in accordance with one embodiment of the present disclosure. Client system 602 may also be used to implement a merchant computing device, a POS system, or another user computing device in communication with 3D device 120 (shown in FIG. 1). Client system 602 is operated by a user 604 that may be, for example, user 110 or merchant 520 (shown in FIG. 5). Client system 602 includes a processor 605 for executing instructions. In some embodiments, executable instructions are stored in a memory area 610. Processor 605 includes one or more processing units (e.g., in a multi-core configuration). Memory area 610 is any device allowing information such as executable instructions and/or transaction data to be stored and retrieved. Memory area 610 may include one or more computer-readable media.


Client system 602 also includes at least one media output component 615 for presenting information to user 604. Media output component 615 is any component capable of conveying information to user 604. In some embodiments, media output component 615 includes an output adapter (not shown) such as a video adapter and/or an audio adapter. An output adapter is operatively coupled to processor 605 and operatively coupleable to an output device such as a display device (e.g., a cathode ray tube (CRT), liquid crystal display (LCD), light emitting diode (LED) display, or “electronic ink” display) or an audio output device (e.g., a speaker or headphones). In some embodiments, media output component 615 is configured to present a graphical user interface (e.g., a web browser and/or a client application) to user 604. A graphical user interface includes, for example, an online store interface for viewing and/or purchasing items, and/or a wallet application for managing payment information. In some embodiments, client system 602 includes an input device 620 for receiving input from user 604. User 604 may use input device 620 to, without limitation, select and/or enter one or more items to purchase and/or a purchase request, or to access credential information and/or payment information. Input device 620 includes, for example, a keyboard, a pointing device, a mouse, a stylus, a touch sensitive panel (e.g., a touch pad or a touch screen), a gyroscope, an accelerometer, a position detector, a biometric input device, and/or an audio input device. A single component, such as a touch screen, may function as both an output device of media output component 615 and input device 620.


Client system 602 also includes a communication interface 625, communicatively coupled to a remote device such as 3D device 120. Communication interface 625 includes, for example, a wired or wireless network adapter and/or a wireless data transceiver for use with a mobile telecommunications network.


Stored in memory area 610 are, for example, computer-readable instructions for providing a user interface to user 604 via media output component 615 and, optionally, receiving and processing input from input device 620. The user interface includes, among other possibilities, a web browser and/or a client application capable of generating a user interface transmitted by, for example, 3D device 120. A client application allows user 604 to interact with, for example, 3D device 120. For example, instructions may be stored by a cloud service and the output of the execution of the instructions sent to the media output component 615.



FIG. 7 illustrates an example configuration of server computer device 530 (shown in FIG. 5), in accordance with one embodiment of the present disclosure. Server computer device 530 includes a processor 705 for executing instructions. Instructions may be stored in a memory area 710. Processor 705 may include one or more processing units (e.g., in a multi-core configuration).


Processor 705 is operatively coupled to a communication interface 715 such that server computer device 530 is capable of communicating with a remote device such as another server computer device 530, user computing device 150, merchant 520, or 3D device 120. For example, communication interface 715 receives requests from user computing device 150 via the Internet.


Processor 705 may also be operatively coupled to a storage device 734. Storage device 734 is any computer-operated hardware suitable for storing and/or retrieving data, such as, but not limited to, data associated with database 550. In some embodiments, storage device 734 is integrated in server computer device 530. For example, server computer device 530 may include one or more hard disk drives as storage device 734. In other embodiments, storage device 734 is external to server computer device 530 and may be accessed by a plurality of server computer devices 530. For example, storage device 734 may include a storage area network (SAN), a network attached storage (NAS) system, and/or multiple storage units such as hard disks and/or solid state disks in a redundant array of inexpensive disks (RAID) configuration.


In some embodiments, processor 705 is operatively coupled to storage device 734 via storage interface 740. For example, storage interface 740 is used to implement database server 540 (shown in FIG. 5). Storage interface 740 is any component capable of providing processor 705 with access to storage device 734. Storage interface 740 may include, for example, an Advanced Technology Attachment (ATA) adapter, a Serial ATA (SATA) adapter, a Small Computer System Interface (SCSI) adapter, a RAID controller, a SAN adapter, a network adapter, and/or any component providing processor 705 with access to storage device 734.


Processor 705 executes computer-executable instructions for implementing aspects of the disclosure. In some embodiments, processor 705 is transformed into a special purpose microprocessor by executing computer-executable instructions or by otherwise being programmed. For example, processor 705 is programmed with the instructions such as those illustrated in FIG. 8.



FIG. 8 illustrates a flow chart of an example method 800 of generating a 3D user interface and generating a request for a purchase transaction based on user input. In the example embodiment, method 800 is implemented by a 3D interface generator device 120 including at least one sensor 250, 3D image projector 210, memory device 230, and the at least one processor 220 coupled to memory device 230, projector 210, and the at least one sensor 250 (see FIG. 2). The memory device is configured to store (i) 3D image data corresponding to a 3D image where the 3D image includes a plurality of 3D elements, (ii) identification information for a plurality of individuals, where each of the plurality of individuals is associated with a respective one of the 3D elements, and (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options.


In the example embodiment, method 800 includes commanding 802 the projector to project the 3D image into a real-world space. Method 800 also includes receiving 804, from the at least one sensor, an indication of a first physical interaction by a user with a first of the 3D elements in the real-world space. Method 800 also includes retrieving 806, from the memory device in response to the first physical interaction, a first of the individuals associated with the first 3D element, and a first of the sets of list data associated with the first individual. Method 800 also includes commanding 808 the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data. Method 800 also includes receiving 810, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space. Method 800 further includes generating 820, in response to the second physical interaction, a purchase transaction request by the user for the one of the purchase options. Method 800 may also include additional or alternative steps based on the functionality of system 100 as described above.
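
For illustration, a minimal sketch of the orchestration of method 800; projector, sensor, and memory are hypothetical interfaces standing in for 3D image projector 210, the at least one sensor 250, and memory device 230, and the method names on them are assumptions:

    def run_method_800(projector, sensor, memory):
        """Walk the steps of method 800 end to end and return the purchase
        transaction request produced by the final step."""
        projector.project(memory.image_3d)                        # commanding 802
        element_id = sensor.await_element_interaction()           # receiving 804
        individual = memory.individual_for_element(element_id)    # retrieving 806
        list_data = memory.list_data_for(individual)
        projector.overlay(list_data.purchase_option_images)       # commanding 808
        option = sensor.await_option_interaction()                # receiving 810
        return {                                                  # generating 820
            "recipient_id": individual.id,
            "item_id": option.item_id,
        }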



FIG. 9 depicts a diagram 900 of components of one or more example computing devices 910 that may be used to implement a 3D image projection system such as projection system 100 (shown in FIG. 1). In some embodiments, computing device 910 may be used to implement 3D device 120 (shown in FIG. 1). A memory component 920 is coupled with several separate components within computing device 910 that perform specific tasks. In the example embodiment, memory 920 includes 3D image data 924 for displaying a 3D image 160. 3D image data 924 may also include calendar data for associating certain images with a particular calendar date. Memory 920 may also include 3D element data 922 associated with images of additional 3D elements (e.g., ornaments). Memory 920 may also include identification information 925 for a plurality of individuals, including an association of each individual with one of the 3D elements. For example, 3D element data 922 may include images, videos, and audio associated with gift recipients. Memory 920 may also include a user profile 926 associated with users that may use projection system 100 to select and initiate purchase transactions. Memory 920 may also include sets of list data 928, such as lists of purchase options for the individuals and/or data related to the purchase options, such as pricing and the like.
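
A sketch of how these stored records might be modeled as data structures, mirroring identification information 925 and list data 928; the field names and types are assumptions for illustration:

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class Individual:
        """Mirrors identification information 925: who a 3D element represents."""
        individual_id: str
        display_name: str
        element_id: str                  # the 3D element this person is tied to
        photo_uri: Optional[str] = None  # image or looping video shown as identification

    @dataclass
    class PurchaseOption:
        item_id: str
        description: str
        price_cents: int

    @dataclass
    class GiftList:
        """One entry in the sets of list data 928: preferred gifts for one individual."""
        individual_id: str
        options: List[PurchaseOption] = field(default_factory=list)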


Computing device 910 also includes communicating component 930 for transmitting purchase requests to, for example, merchant 520, and/or receiving data from, for example, a user computing device such as user computing device 150. Computing device 910 may also include a sensor data processing component 940 for processing data received from the at least one sensor 250 (shown in FIG. 2). For example, sensor data processing component 940 may analyze gesture data received from the at least one sensor 250 and compare the gesture data with stored gestures that correspond to function calls and/or commands for interacting with projection system 100 as part of user interface module 290 (shown in FIG. 2). Computing device 910 may also include a projector command component 950 for transmitting commands to 3D image projector 210 based on, for example, user gestures. Computing device 910 may also include a purchase request generating component 960 for generating purchase requests. Computing device 910 may also include a retrieving component 970 for retrieving, for example, identification information 925, user profile data 926, and/or list data 928 from memory 920.


Although various elements of the computer system are described herein as including general processing and memory devices, it should be understood that the computer system is a specialized computer configured to perform the steps described herein for facilitating purchase transactions for items via interaction with a projected three-dimensional display.


As will be appreciated based on the foregoing specification, the above-discussed embodiments of the disclosure may be implemented using computer programming or engineering techniques including computer software, firmware, hardware or any combination or subset thereof. Any such resulting computer program, having computer-readable and/or computer-executable instructions, may be embodied or provided within one or more computer-readable media, thereby making a computer program product, i.e., an article of manufacture, according to the discussed embodiments of the disclosure. These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium,” “computer-readable medium,” and “computer-readable media” refer to any computer program product, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The “machine-readable medium,” “computer-readable medium,” and “computer-readable media,” however, do not include transitory signals (i.e., they are “non-transitory”). The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.


This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims
  • 1. A 3D interface generator computing device comprising:
    at least one sensor;
    a projector;
    a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) a plurality of user profiles each including identification information for a respective one of a plurality of individuals, wherein each of the plurality of user profiles is associated with a respective one of the 3D elements, (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options; and (iv) a library of gestures indicating a pattern of physical interactions performed by a user; and
    at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor, the at least one processor configured to:
    collect registration data from the plurality of individuals, wherein the registration data for each of the individuals defines gift items that the respective individual wishes to receive;
    generate the sets of list data for the plurality of user profiles based on the gift items in the registration data collected from the plurality of individuals;
    store the generated sets of list data in the memory device in association with the user profiles for the plurality of individuals;
    command the projector to project the 3D image into a real-world space;
    detect, using the at least one sensor, a physical interaction by the user in the real-world space;
    compare, in response to the detection, a location of the physical interaction with respect to the 3D image data to determine that the location is within a threshold distance of a first of the 3D elements within the 3D image, the first of the 3D elements associated with a first user profile of the plurality of user profiles;
    perform a lookup in the library of gestures stored in the memory device in response to the detection to determine, based on the determination that the location of the physical interaction is within the threshold distance of the first of the 3D elements, that the physical interaction corresponds to a selection of the first of the 3D elements;
    retrieve, from the memory device in response to the selection of the first of the 3D elements, a first of the sets of list data associated with the first user profile corresponding to the first of the 3D elements;
    command, in response to the selection of the first of the 3D elements, the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data;
    receive, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space; and
    generate, in response to the second physical interaction, a purchase transaction request by the user for a gift item defined in the registration data of the first individual, the gift item associated with the one of the purchase options.
  • 2. The computing device of claim 1, wherein the at least one processor is configured to generate the purchase transaction request by commanding the projector to project a QR code associated with the one of the purchase options.
  • 3. The computing device of claim 1, wherein the at least one processor is configured to generate the purchase transaction request by transmitting a push notification for the one of the purchase options to a user computing device and causing the user computing device to execute a payment transaction for the one of the purchase options.
  • 4. The computing device of claim 1, wherein the at least one processor is configured to generate the purchase transaction request by transmitting account information and the one of the purchase options to a merchant.
  • 5. The computing device of claim 1, wherein the at least one processor is further configured to identify the user based on input from the at least one sensor.
  • 6. The computing device of claim 1, wherein the at least one processor is further configured to retrieve, from a merchant, account information associated with the user.
  • 7. The computing device of claim 1, wherein the 3D image data further includes a reference date associated with a calendar event, and the at least one processor is further configured to select the 3D image data by comparing the reference date to a current date.
  • 8. A computer-implemented method, the method executed by a 3D interface generator computing device that includes:
      at least one sensor;
      a projector;
      a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) a plurality of user profiles each including identification information for a respective one of a plurality of individuals, wherein each of the plurality of user profiles is associated with a respective one of the 3D elements, (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and a respective one of the plurality of user profiles, each set of list data identifying a plurality of purchase options, and (iv) a library of gestures indicating a pattern of physical interactions performed by a user; and
      at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor, the method comprising:
        collecting registration data from the plurality of individuals, wherein the registration data for each of the individuals defines gift items that the respective individual wishes to receive;
        generating the sets of list data for the plurality of user profiles based on the gift items in the registration data collected from the plurality of individuals;
        storing the generated sets of list data in the memory device in association with the user profiles for the plurality of individuals;
        commanding the projector to project the 3D image into a real-world space;
        detecting, using the at least one sensor, a physical interaction by a user in the real-world space;
        comparing, in response to the detection, a location of the physical interaction with respect to the 3D image data to determine that the location is within a threshold distance of a first of the 3D elements within the 3D image, the first of the 3D elements associated with a first user profile of the plurality of user profiles;
        performing a lookup in the library of gestures stored in the memory device in response to the detection to determine, based on the determination that the location of the physical interaction is within the threshold distance of the first of the 3D elements, that the physical interaction corresponds to a selection of the first of the 3D elements;
        retrieving, from the memory device in response to the selection of the first of the 3D elements, a first of the sets of list data associated with the first user profile corresponding to the first of the 3D elements;
        commanding, in response to the selection of the first of the 3D elements, the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data;
        receiving, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space; and
        generating, in response to the second physical interaction, a purchase transaction request by the user for a gift item defined in the registration data of the individual associated with the first user profile, the gift item associated with the one of the purchase options.
  • 9. The computer-implemented method of claim 8, wherein generating the purchase transaction request comprises commanding the projector to project a QR code associated with the one of the purchase options.
  • 10. The computer-implemented method of claim 8, wherein generating the purchase transaction request comprises transmitting a push notification for the one of the purchase options to a user computing device and causing the user computing device to execute a payment transaction for the one of the purchase options.
  • 11. The computer-implemented method of claim 8, wherein generating the purchase transaction request comprises transmitting account information and the one of the purchase options to a merchant.
  • 12. The computer-implemented method of claim 8, further comprising identifying the user based on input from the at least one sensor.
  • 13. The computer-implemented method of claim 8, further comprising retrieving, from a merchant, account information associated with the user.
  • 14. The computer-implemented method of claim 8, wherein the 3D image data further includes a reference date associated with a calendar event, said method further comprising selecting the 3D image data by comparing the reference date to a current date.
  • 15. At least one non-transitory computer-readable storage medium having computer-executable instructions, wherein, when executed by a 3D interface generator computing device including:
      at least one sensor;
      a projector;
      a memory device configured to store (i) 3D image data corresponding to a 3D image, wherein the 3D image includes a plurality of 3D elements, (ii) a plurality of user profiles each including identification information for a respective one of a plurality of individuals, wherein each of the plurality of user profiles is associated with a respective one of the 3D elements, (iii) sets of list data, each set of list data associated with a respective one of the plurality of individuals and identifying a plurality of purchase options, and (iv) a library of gestures indicating a pattern of physical interactions performed by a user; and
      at least one processor communicatively coupled to the memory device, the projector, and the at least one sensor,
      the computer-executable instructions cause the at least one processor to:
        collect registration data from the plurality of individuals, wherein the registration data for each of the individuals defines gift items that the respective individual wishes to receive;
        generate the sets of list data for the plurality of user profiles based on the gift items in the registration data collected from the plurality of individuals;
        store the generated sets of list data in the memory device in association with the user profiles for the plurality of individuals;
        command the projector to project the 3D image into a real-world space;
        detect, using the at least one sensor, a physical interaction by a user in the real-world space;
        compare, in response to the detection, a location of the physical interaction with respect to the 3D image data to determine that the location is within a threshold distance of a first of the 3D elements within the 3D image, the first of the 3D elements associated with a first user profile of the plurality of user profiles;
        perform a lookup in the library of gestures stored in the memory device in response to the detection to determine, based on the determination that the location of the physical interaction is within the threshold distance of the first of the 3D elements, that the physical interaction corresponds to a selection of the first of the 3D elements;
        retrieve, from the memory device in response to the selection of the first of the 3D elements, a first of the sets of list data associated with the first user profile corresponding to the first of the 3D elements;
        command, in response to the selection of the first of the 3D elements, the projector to overlay a plurality of purchase option images onto the 3D image in the real-world space, each of the purchase option images representing a respective one of the purchase options in the first set of list data;
        receive, from the at least one sensor, an indication of a second physical interaction by the user with one of the purchase option images in the real-world space; and
        generate, in response to the second physical interaction, a purchase transaction request by the user for a gift item defined in the registration data of the individual associated with the first user profile, the gift item associated with the one of the purchase options.
  • 16. The at least one non-transitory computer-readable storage medium of claim 15, wherein the instructions further cause the at least one processor to generate the purchase transaction request by commanding the projector to project a QR code associated with the one of the purchase options.
  • 17. The at least one non-transitory computer-readable storage medium of claim 15, wherein the instructions further cause the at least one processor to generate the purchase transaction request by transmitting a push notification for the one of the purchase options to a user computing device and causing the user computing device to execute a payment transaction for the one of the purchase options.
  • 18. The at least one non-transitory computer-readable storage medium of claim 15, wherein the instructions further cause the at least one processor to generate the purchase transaction request by transmitting account information and the one of the purchase options to a merchant.
  • 19. The at least one non-transitory computer-readable storage medium of claim 15, wherein the instructions further cause the at least one processor to identify the user based on input from the at least one sensor.
  • 20. The at least one non-transitory computer-readable storage medium of claim 15, wherein the instructions further cause the at least one processor to retrieve, from a merchant, account information associated with the user.
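The sketches below illustrate selected operations recited in the claims above. They are not taken from the patent itself: every identifier, endpoint, and data shape is a hypothetical chosen for illustration. The first sketch covers the registration steps common to claims 1, 8, and 15: collecting registration data that defines the gift items each individual wishes to receive, and generating per-profile sets of list data from it. The `registrations` structure and its field names are assumptions.

```python
# Hypothetical sketch of the registration steps in claims 1, 8, and 15:
# registration data (gift items each individual wishes to receive) is
# grouped into per-profile list data to be stored against user profiles.
def build_list_data(registrations):
    """Map each user profile ID to a list of purchase options.

    `registrations` is assumed to map profile_id -> iterable of gift-item
    dicts with at least a "name" key; the claims do not fix this shape.
    """
    list_data = {}
    for profile_id, gift_items in registrations.items():
        list_data[profile_id] = [
            {"item": item["name"], "price": item.get("price")}
            for item in gift_items
        ]
    return list_data

# Example: two individuals register gift items against their profiles.
stored_list_data = build_list_data({
    "profile-alice": [{"name": "scarf", "price": 25.00}],
    "profile-bob": [{"name": "chess set"}],
})
```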
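Next, a minimal sketch of the interaction-detection flow at the heart of claims 1, 8, and 15: the location of a sensed physical interaction is compared against the projected 3D elements under a threshold distance, and a lookup in a stored gesture library decides whether the interaction is a selection. The Euclidean hit test, the 5 cm threshold, and the dictionary-backed gesture library are all assumptions; the claims do not specify them.

```python
import math
from dataclasses import dataclass

SELECTION_THRESHOLD = 0.05  # assumed selection radius, in metres

@dataclass
class Element3D:
    element_id: str
    profile_id: str  # user profile tied to this 3D element (e.g., an ornament)
    position: tuple  # real-world (x, y, z) coordinates

# Assumed gesture library: sensed motion pattern -> semantic gesture.
GESTURE_LIBRARY = {"single_tap": "select", "swipe_left": "dismiss"}

def _distance(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def resolve_selection(touch_position, motion_pattern, elements):
    """Return the 3D element selected by the interaction, or None.

    Step 1 mirrors the threshold-distance comparison in the claims;
    step 2 mirrors the lookup in the stored library of gestures.
    """
    near = next(
        (e for e in elements
         if _distance(touch_position, e.position) <= SELECTION_THRESHOLD),
        None,
    )
    if near is not None and GESTURE_LIBRARY.get(motion_pattern) == "select":
        return near
    return None

# Example: a tap 2 cm from an ornament selects it.
tree = [Element3D("ornament-1", "profile-alice", (0.0, 1.2, 0.5))]
picked = resolve_selection((0.0, 1.2, 0.52), "single_tap", tree)
```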
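Claims 2, 9, and 16 generate the purchase transaction request by projecting a QR code for the selected purchase option. A sketch using the open-source `qrcode` package follows; the payload URL format is an invented example, as the claims say nothing about what the code encodes.

```python
import qrcode  # open-source "qrcode" package; any QR encoder would serve

def make_purchase_qr(option):
    """Render a purchase option as a QR code image for the projector to
    overlay (claims 2, 9, 16). The payload URL is a made-up format."""
    payload = f"https://checkout.example.com/buy?item={option['sku']}"
    return qrcode.make(payload)  # PIL-backed image object for the projector

# Example: generate and save a code for one purchase option.
image = make_purchase_qr({"sku": "SCARF-25"})
image.save("option.png")
```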
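For claims 3, 10, and 17, a sketch of generating the purchase transaction request as a push notification that prompts the user's device to execute the payment. The endpoint, token, and payload schema are hypothetical; a production device would use a real push service.

```python
import requests  # generic HTTP client; the actual push service is unspecified

PUSH_ENDPOINT = "https://push.example.com/v1/send"  # hypothetical

def push_purchase_option(device_token, option):
    """Notify the user's device of the chosen purchase option so the device
    can execute a payment transaction (claims 3, 10, 17)."""
    payload = {
        "to": device_token,
        "title": "Confirm your gift purchase",
        "data": {"sku": option["sku"], "price": option.get("price")},
    }
    response = requests.post(PUSH_ENDPOINT, json=payload, timeout=5)
    response.raise_for_status()  # surface delivery failures to the caller
```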
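Claims 4, 11, and 18 instead have the device transmit account information and the selected purchase option directly to a merchant. A sketch under the same caveats (the URL and field names are invented):

```python
import requests  # generic HTTP client; the merchant API is unspecified

MERCHANT_API = "https://merchant.example.com/api/orders"  # hypothetical

def request_purchase(account_info, option):
    """Send the user's account information and the chosen purchase option
    to a merchant (claims 4, 11, 18) and return the merchant's response."""
    order = {"account": account_info, "sku": option["sku"], "quantity": 1}
    response = requests.post(MERCHANT_API, json=order, timeout=5)
    response.raise_for_status()
    return response.json()  # e.g., an order confirmation record
```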
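Finally, claims 7 and 14 select the 3D image data by comparing a stored reference date (tied to a calendar event) against the current date, so that, for instance, a Christmas-tree image is projected in December. The 30-day matching window below is an assumed rule; the claims require only the date comparison itself.

```python
from datetime import date, timedelta

def select_seasonal_image(image_catalog, today=None):
    """Choose the 3D image whose reference date is near the current date
    (claims 7 and 14). Each catalog entry is assumed to carry a
    "reference_date" field; the window size is illustrative."""
    today = today or date.today()
    window = timedelta(days=30)
    for image in image_catalog:
        if abs(image["reference_date"] - today) <= window:
            return image
    return None

# Example: a tree image stored with a December 25 reference date is
# selected when the current date is December 10 of the same year.
catalog = [{"name": "tree", "reference_date": date(2024, 12, 25)}]
print(select_seasonal_image(catalog, today=date(2024, 12, 10))["name"])
```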