The present invention relates generally to the field of interactive virtual reality advertising and, more specifically, to real-time personalization of interactive advertisements for a user within a virtual environment where the advertisement is being served.
In the field of virtual reality, a set of users may be targeted by advertisers to deliver a personalized experience. Personalized advertising may be a powerful tool that improves advertising relevance for users and increases the return on investment (ROI) for advertisers in a virtual reality environment. In such virtual reality environments, users may engage with content without any external disruptions or interruptions, and by providing interactive ads, advertising may blend the real world with simulated elements. Therefore, the opportunity for personalization is further enhanced in a virtual environment because the system may place users in a virtual and controlled advertising environment.
A method embodiment may include: generating, by a computing device having a processor and addressable memory, a virtual environment where an advertisement may be provided to a user of a set of one or more users; determining a personalized advertisement for the user of the set of one or more users within the generated virtual environment based on whether a set of publisher preferences may be available from a publisher of an advertisement within the generated virtual environment, where the set of publisher preferences comprises: a set of group unique advertisements, a group singular advertisement, and a non-player character (NPC) advertisement; determining a 2D/3D interactive advertisement as an experience based on the generated virtual environment and availability of the set of publisher preferences, where determining the 2D/3D interactive advertisement as an experience may be based on receiving data from a set of components and executing at least one of: a custom 2D/3D advertisement in group setting component, where the custom 2D/3D advertisement in group setting component may be executed based on receiving data that the publisher has provided group unique advertisements; a custom group 2D/3D advertisement component, where the custom group 2D/3D advertisement component may be executed based on receiving data that the publisher has provided group singular advertisements; and an NPC as an advertisement component, where the NPC as an advertisement component may be executed based on receiving data that the publisher has provided advertisements as NPCs; determining a real time personalization for interactive advertisements based on the determined 2D/3D interactive advertisement as an experience, where determining the real time personalization for interactive advertisements may be via executing a real time personalization for interactive advertisements component based on whether a user interacts with an advertisement, where the real time personalization for interactive advertisements component may be configured to continuously check for user interaction data; and determining an interactive advertisement within the virtual environment for each user of the set of one or more users based on the received data from at least one of the components from the set of components.
In additional method embodiments, the method determines the personalized advertisement for the user of the set of one or more users within the generated virtual environment as an initial advertisement offering. In additional method embodiments, the method determines the real time personalization for interactive advertisements after the determination of the personalized advertisement for the user, so that the interactive advertisement evolves to be more personalized for the user as the user makes interactive choices with the advertisement.
In additional method embodiments, the custom 2D/3D advertisement in group setting component may be configured to select advertisements for different users to show within the virtual environment. In additional method embodiments, the provided group unique advertisements in the custom 2D/3D advertisement in group setting component may be a set of unique advertisements for each user of the set of one or more users.
In additional method embodiments, the custom group 2D/3D advertisement component may be configured to select the same advertisement for different users within the virtual environment, based on the aggregation of all user preferences within the virtual environment. In additional method embodiments, the provided group singular advertisements in the custom group 2D/3D advertisement component may be the same advertisements for each user.
In additional method embodiments, the NPC as an advertisement component may be configured to provide advertisements that may be interactive. In additional method embodiments, the user may be able to interact with the advertisements within the virtual environment, giving input to the advertisement as it evolves.
In additional method embodiments, the real time personalization for interactive advertisements component may be further configured to: determine an initial advertisement state where the user may be shown an interactive advertisement; update the determined initial advertisement state to a next state based on receiving input that the user has taken an advertisement action; and update the advertisement with the received user actions based on whether the updated advertisement state allows new actions.
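By way of a hedged, illustrative sketch only (the class, method, state, and action names below are hypothetical and not part of the disclosed embodiments), the state-update behavior described above may be modeled as a small state machine, where an advertisement action advances the ad to a next state only if the current state allows that action:

```python
# Hypothetical sketch of the real-time personalization loop: names are
# illustrative, not drawn from the disclosure.

class InteractiveAdStateMachine:
    """Tracks an interactive ad's state as the user takes actions."""

    def __init__(self, initial_state, transitions):
        # transitions: {(state, action): next_state}
        self.state = initial_state
        self.transitions = transitions

    def available_actions(self):
        """Actions the current ad state allows."""
        return [a for (s, a) in self.transitions if s == self.state]

    def apply_action(self, action):
        """Advance the ad to its next state if the action is allowed."""
        key = (self.state, action)
        if key in self.transitions:
            self.state = self.transitions[key]
            return True
        return False  # the current state allows no such action; ad unchanged

# Example: a car ad that lets the user pick a color, then a trim.
ad = InteractiveAdStateMachine(
    "initial",
    {("initial", "pick_red"): "red_car",
     ("red_car", "pick_sport_trim"): "red_sport_car"},
)
ad.apply_action("pick_red")         # user takes an advertisement action
ad.apply_action("pick_sport_trim")  # ad evolves based on the new state
print(ad.state)  # red_sport_car
```

Under this sketch, continuously checking for user interaction data amounts to polling for new actions and calling apply_action whenever one arrives.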
A computing device embodiment may include a processor and addressable memory, the processor configured to execute a set of components comprising: a custom 2D/3D advertisement in group setting component, where the custom 2D/3D advertisement in group setting component may be executed based on receiving data that a publisher has provided group unique advertisements; a custom group 2D/3D advertisement component, where the custom group 2D/3D advertisement component may be executed based on receiving data that the publisher has provided group singular advertisements; a non-player character (NPC) as an advertisement component, where the NPC as an advertisement component may be executed based on receiving data that the publisher has provided advertisements as NPCs; and a real time personalization for interactive advertisements component, where the real time personalization for interactive advertisements component may be executed based on whether a user interacts with an advertisement; where the computing device may be further configured to: generate a virtual environment where an advertisement may be provided to a user of a set of one or more users; determine a personalized advertisement for the user of the set of one or more users within the generated virtual environment based on whether a set of publisher preferences may be available from the publisher of an advertisement within the generated virtual environment, where the set of publisher preferences comprises: a set of group unique advertisements, a group singular advertisement, and an NPC advertisement; determine a 2D/3D interactive advertisement as an experience based on the generated virtual environment and availability of the set of publisher preferences; determine a real time personalization for interactive advertisements based on the determined 2D/3D interactive advertisement as an experience; and determine an interactive advertisement within the virtual environment for each user of the set of one or more users based on the received data from at least one of the components from the set of components, where the set of components may be continuously executed in real time, thereby checking for user interaction data.
The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Like reference numerals designate corresponding parts throughout the different views. Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which:
The following detailed description describes the present embodiments with reference to the drawings. In the drawings, reference numbers label elements of the present embodiments. These reference numbers are reproduced below in connection with the discussion of the corresponding drawing features. The described technology concerns one or more methods, systems, apparatuses, and mediums storing processor-executable process steps to execute a virtual reality interactive advertisement environment that provides experiences between a virtual world and a physical world.
Virtual reality (VR) describes a computer-generated three-dimensional environment where users interact with objects or other users. In some VR-related scenarios, users are placed inside an experience, where during the experience the system stimulates multiple senses, such as vision, hearing, and touch. Virtual reality may be experienced using headsets, which take over the user's vision to simulate the computer-generated three-dimensional environment, replacing the real world with a virtual one. VR headsets may communicate with the system via a cable or wirelessly and include motion tracking sensors to track user movement, thus enabling a 360-degree world. VR headsets may also connect to smartphones, which now provide an even more real-world experience using the smartphone's motion sensors and other built-in sensors in conjunction with the VR headset.
Additionally, augmented reality is a subset of virtual reality that simulates artificial objects within the real world, meaning the virtual objects interact with real-world objects. Using a smartphone camera, the system may superimpose additional information on top of the user's real-world environment. This process may also be experienced on a computer screen having the ability to display 3D objects or other such similar devices. Augmented reality may be as immersive as a virtual reality experience given that augmented reality builds an experience based on live surroundings. Augmented reality provides an enhanced version of the real physical world that is achieved through the use of digital visual elements, sound, or other sensory stimuli and oftentimes uses mobile computing power for executing the code. Augmented reality and virtual reality systems execute applications in which users immerse themselves into an alternate reality environment when wearing, for example, a head-mounted display that displays virtual and/or augmented reality user experiences. Accordingly, a computerized method for viewing an augmented reality environment comprises generating a unique environment corresponding to a user and rendering the unique environment on the user device for the user to interact with. Such systems and methods utilize broadcasting of information formatted for reception by a user device. The broadcast information is based at least in part on the unique experiences for that user and certain preferences.
Embodiments of the present application disclose a set of components to execute a series of steps to provide direct advertisement (“ad”) or advertisements (“ads”) in the virtual world where the ad is in a virtual space, for example, a virtual space for a given point of interest (POI). Accordingly, such interactive advertising experiences communicate with consumers to promote products, brands, services, etc., where, in the disclosed embodiments, such experiences are initiated by physical triggers. The interaction is not limited to, for example, a banner ad or a billboard in a virtual space, but extends to an object in the virtual space, where the object provides an interactive experience by way of using the 3D space to experience the product being advertised. In one embodiment, the physical trigger may be the user's avatar moving up to the object and touching it.
The embodiments surrounding interactive advertising may involve paid and/or unpaid presentation and promotion of products and services by an identified sponsor, with interactions between consumers and products. This interaction may be performed in the 3D virtual space environment through the use of the disclosed system embodiments, where the system is configured to deliver a variety of interactive advertising units as the user interacts with objects displayed within the virtual space. Such advertising may be uniquely placed in the 3D virtual space environment, where different users may see different ads on the same billboard due to their user preferences. Additionally, the introduction of objects that provide a transformation of the user from one virtual space to another virtual space related to that object may be implemented by the disclosed systems and processes.
That is, in such embodiments, the Real Time Personalization for Interactive Ads component 900 may be in communication with the Custom 2D/3D Ad in group setting component 1000 (unique ads per user), the Custom Group 2D/3D Ad component 1100 (same ad), and/or Non-player character (NPC) as an Advertisement component 1200 where the Real Time Personalization for Interactive Ads component 900 may be configured to execute and make real time updates to the ad based on current actions taken by the user, in addition to or instead of the personal history. Accordingly, the personalization may be determined for the initial ad offering (for example, in components 800, 1000, 1100, and 1200), while the Real Time Personalization for Interactive Ad component 900 determines advertisement that itself may evolve to be more personalized for the user as they make interactive choices with the ad.
In some embodiments, the custom 2D/3D Ad in group setting component (unique ads per user) 212, the Custom Group 2D/3D Ad component (same ad) 222, and the NPC as an Advertisement component 232, may each be in communication with a 2D/3D Interactive Ad As Experience Component 240 (described in further detail below and in
In some examples, the computing apparatus 302 detects voice input, user gestures or other user actions and provides a natural user interface (NUI). This user input may be used to author electronic ink, view content, select ink controls, play videos with electronic ink overlays, and for other purposes. The input/output controller 318 outputs data 322 to devices other than a display device in some examples, e.g., a locally connected printing device. NUI technology enables a user to interact with the computing apparatus 302 in a natural manner, free from artificial constraints imposed by input devices 320 such as mice, keyboards, remote controls, and the like. Examples of NUI technology that are provided in some examples include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that are used in some examples include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, red green blue (RGB) camera systems, and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, three-dimensional (3D) displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems, and technologies for sensing brain activity using electric field sensing electrodes (electroencephalogram (EEG) and related methods).
The techniques introduced below may be implemented by programmable circuitry programmed or configured by software and/or firmware, or entirely by special-purpose circuitry, or in a combination of such forms. Such special-purpose circuitry (if any) can be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
The described technology may also be practiced in distributed computing environments where tasks or modules are performed by remote processing devices, which are linked through a communications network, such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), or the Internet. In a distributed computing environment, program modules or subroutines may be located in both local and remote memory storage devices. Those skilled in the relevant art will recognize that portions of the described technology may reside on a server computer, while corresponding portions may reside on a client computer (e.g., PC, mobile computer, tablet, or smart phone). Data structures and transmission of data particular to aspects of the technology are also encompassed within the scope of the described technology.
Information transferred via communications interface 512 may be in the form of signals such as electronic, electromagnetic, optical, or other signals capable of being received by a communications interface 512, via a communication link 516 that carries signals and may be implemented using wire or cable, fiber optics, a phone line, a cellular/mobile phone link, a radio frequency (RF) link, and/or other communication channels. Computer program instructions representing the block diagram and/or flowcharts herein may be loaded onto a computer, programmable data processing apparatus, or processing devices to cause a series of operations performed thereon to produce a computer-implemented process.
Embodiments have been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments. Each block of such illustrations/diagrams, or combinations thereof, can be implemented by computer program instructions. The computer program instructions when provided to a processor produce a machine, such that the instructions, which execute via the processor, create means for implementing the functions/operations specified in the flowchart and/or block diagram. Each block in the flowchart/block diagrams may represent a hardware and/or software module or logic, implementing embodiments. In alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures, concurrently, etc.
Computer programs (i.e., computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface 512. Such computer programs, when executed, enable the computer system to perform the features of the embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor and/or multi-core processor to perform the features of the computer system. Such computer programs represent controllers of the computer system.
The server 630 may be coupled via the bus 602 to a display 612 for displaying information to a computer user. An input device 614, including alphanumeric and other keys, is coupled to the bus 602 for communicating information and command selections to the processor 604. Another type of user input device comprises cursor control 616, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to the processor 604 and for controlling cursor movement on the display 612.
According to one embodiment, the functions are performed by the processor 604 executing one or more sequences of one or more instructions contained in the main memory 606. Such instructions may be read into the main memory 606 from another computer-readable medium, such as the storage device 610. Execution of the sequences of instructions contained in the main memory 606 causes the processor 604 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in the main memory 606. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the embodiments. Thus, embodiments are not limited to any specific combination of hardware circuitry and software.
The terms “computer program medium,” “computer usable medium,” “computer readable medium”, and “computer program product,” are used to generally refer to media such as main memory, secondary memory, removable storage drive, a hard disk installed in hard disk drive, and signals. These computer program products are means for providing software to the computer system. The computer readable medium allows the computer system to read data, instructions, messages or message packets, and other computer readable information from the computer readable medium. The computer readable medium, for example, may include non-volatile memory, such as a floppy disk, ROM, flash memory, disk drive memory, a CD-ROM, and other permanent storage. It is useful, for example, for transporting information, such as data and computer instructions, between computer systems. Furthermore, the computer readable medium may comprise computer readable information in a transitory state medium such as a network link and/or a network interface, including a wired network or a wireless network that allow a computer to read such computer readable information. Computer programs (also called computer control logic) are stored in main memory and/or secondary memory. Computer programs may also be received via a communications interface. Such computer programs, when executed, enable the computer system to perform the features of the embodiments as discussed herein. In particular, the computer programs, when executed, enable the processor or multi-core processor to perform the features of the computer system. Accordingly, such computer programs represent controllers of the computer system.
Generally, the term “computer-readable medium” as used herein refers to any medium that participates in providing instructions to the processor 604 for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, optical or magnetic disks, such as the storage device 610. Volatile media includes dynamic memory, such as the main memory 606. Transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 602. Transmission media can also take the form of acoustic or light waves, such as those generated during radio wave and infrared data communications.
Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read.
Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to the processor 604 for execution. For example, the instructions may initially be carried on a magnetic disk of a remote computer. The remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to the server 630 can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal. An infrared detector coupled to the bus 602 can receive the data carried in the infrared signal and place the data on the bus 602. The bus 602 carries the data to the main memory 606, from which the processor 604 retrieves and executes the instructions. The instructions received from the main memory 606 may optionally be stored on the storage device 610 either before or after execution by the processor 604.
The server 630 also includes a communication interface 618 coupled to the bus 602. The communication interface 618 provides a two-way data communication coupling to a network link 620 that is connected to the world wide packet data communication network now commonly referred to as the Internet 628. The Internet 628 uses electrical, electromagnetic, or optical signals that carry digital data streams. The signals through the various networks and the signals on the network link 620 and through the communication interface 618, which carry the digital data to and from the server 630, are exemplary forms of carrier waves transporting the information.
In another embodiment of the server 630, the communication interface 618 is connected to a network 622 via a communication link 620. For example, the communication interface 618 may be an integrated services digital network (ISDN) card or a modem to provide a data communication connection to a corresponding type of telephone line, which can comprise part of the network link 620. As another example, the communication interface 618 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN. Wireless links may also be implemented. In any such implementation, the communication interface 618 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
The network link 620 typically provides data communication through one or more networks to other data devices. For example, the network link 620 may provide a connection through the local network 622 to a host computer 624 or to data equipment operated by an Internet Service Provider (ISP). The ISP in turn provides data communication services through the Internet 628. The local network 622 and the Internet 628 both use electrical, electromagnetic or optical signals that carry digital data streams.
The server 630 can send/receive messages and data, including e-mail, and/or program code, through the network, the network link 620 and the communication interface 618. Further, the communication interface 618 can comprise a USB/Tuner and the network link 620 may be an antenna or cable for connecting the server 630 to a cable provider, satellite provider, or other terrestrial transmission system for receiving messages, data, and program code from another source.
The example versions of the embodiments described herein may be implemented as logical operations in a distributed processing system such as the system 600 including the servers 630. The logical operations of the embodiments may be implemented as a sequence of steps executing in the server 630, and as interconnected machine modules within the system 600. The implementation is a matter of choice and can depend on performance of the system 600 implementing the embodiments. As such, the logical operations constituting said example versions of the embodiments are referred to for example, as operations, steps or modules.
Similar to a server 630 described above, a client device 601 can include a processor, memory, storage device, display, input device and communication interface (e.g., e-mail interface) for connecting the client device to the Internet 628, the ISP, or LAN 622, for communication with the servers 630. The system 600 can further include computers (e.g., personal computers, computing nodes) 605 operating in the same manner as client devices 601, where a user can utilize one or more computers 605 to manage data in the server 630.
Referring now to
In one embodiment, 2D/3D interactive advertisements may be presented as experiences to the user via the system being configured to implement a User Personalization Fuser component 820, where the User Personalization Fuser component 820 may be configured to take user personalization data from preferences and third-party data, along with past interactions, to aggregate meaningful data about that user, for example, what type of topics a user likes. Accordingly, the User Personalization Fuser component 820 may receive as input a set of User Preferences 822, Third Party Data 824, and data related to User Past Ad Interactions 826, where the User Preferences 822 may include preferences a user has stored as a mapping of defined fields, for example, a user might prefer educational content over gaming content. Additionally, the Third Party Data 824 may be data from outside of the system, collected from third parties, that may be used to give meaningful personalization information such as likes and dislikes. The embodiments may also include a User Ads Database 827, where the User Past Ad Interactions 826 may be in communication with the User Ads Database 827 and the User Ads Database 827 may be configured to store the ad interaction history for all users and all available ads. User Past Ad Interactions 826 may be information from previous ads, for example, a user had selected “Red” when prompted for the color of a specific model car interactive advertisement. With all of the above information, the disclosed embodiments provide a method for a new advertisement to assume the user likes sports cars and prefers the color red. The representation of this data may be vector representations from a deep learning model or a knowledge graph formed about the user. In one embodiment, the data may be stored within a database or some other representation as known in the technical field.
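As an illustrative, non-limiting sketch of the aggregation described above (the function name, source weights, and topic labels below are hypothetical assumptions, not part of the disclosure), the three inputs may be fused into a single interest-weight mapping:

```python
# Hypothetical sketch of a User Personalization Fuser: merge explicit
# preferences, third-party data, and past ad interactions into one
# topic-weight profile. Weights and field names are assumptions.

def fuse_user_profile(preferences, third_party_data, past_interactions):
    """Combine the three input sources into a {topic: weight} mapping."""
    profile = {}
    for source, weight in ((preferences, 3.0),        # explicit, weighted highest
                           (past_interactions, 2.0),  # observed behavior
                           (third_party_data, 1.0)):  # external likes/dislikes
        for topic in source:
            profile[topic] = profile.get(topic, 0.0) + weight
    return profile

profile = fuse_user_profile(
    preferences=["educational"],
    third_party_data=["sports_cars", "educational"],
    past_interactions=["sports_cars", "color_red"],
)
# Topics with the highest fused weights would drive the initial ad offering.
print(profile)
```

A production system, as the passage notes, might instead represent this profile as a learned embedding vector or a knowledge graph rather than a flat mapping.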
The disclosed embodiment may further include a 2D/3D Ad Matcher component 830 configured to match users with relevant ads based on at least one of: the 2D/3D virtual information, constraints for a virtual publisher space, and the user profile information (received from the User Personalization Fuser component 820). The 2D/3D Ad Matcher component 830 may be in communication with the User Personalization Fuser component 820 to receive the user profile information and, in addition, with a Virtual Publisher Environment component 840 configured to provide the virtual environment in which the ads will be shown, for example, a virtual room where the ad will be displayed. This is equivalent to today's ad publishers, where the publisher's website is served ads to show to users. The Virtual Publisher Environment component 840 may receive input from an Ad Virtual Volume Allocator 842 and a Virtual Space Content Database 844, where the Virtual Publisher Environment component 840 may be configured to collect the volume constraints on the space an ad may take up; for example, a 2D/3D interactive ad may take up a 5×5×10 volume in the virtual space to display an object. The Virtual Space Content Database 844 may be configured to store all the advertisements with their corresponding content descriptions; for example, a 2D/3D model of a car may have content tags of “luxury car”, “car”, “vehicle”, and other searchable features as well.
The 2D/3D Ad Matcher component 830 may also be in communication with a 2D/3D Ad Virtual Constraints Fuser component 850 configured to use publisher constraints and suggested content to adjust which ads are being matched. The 2D/3D Ad Virtual Constraints Fuser component 850 may receive as input Publisher Ad Content Restrictions 852 that may include the constraints a publisher may set on a space; for example, the publisher may not want any R-rated ads and may instead allow only ads that are rated PG-13. The 2D/3D Ad Virtual Constraints Fuser component 850 may also receive as input Publisher Suggested Content 854 that provides content that the ad publisher may think is relevant and may suggest recommending to the users who see the ad.
The 2D/3D Ad Matcher component 830 may further be in communication with an Available Ads component 860 that may be configured to retrieve information about the availability of the ads from an Ads Database 862. Accordingly, the 2D/3D Ad Matcher component 830 may determine and output a selected ad 870 based on the input data received from the User Personalization Fuser component 820, the Virtual Publisher Environment component 840, the 2D/3D Ad Virtual Constraints Fuser component 850, and/or the Available Ads component 860 to create a 2D/3D interactive ad as an experience to the user.
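Putting the preceding components together, a hedged sketch of the matching step might filter the available ads by publisher content restrictions and then score the remainder against the fused user profile; the data layout, rating scheme, and scoring rule below are illustrative assumptions, not the disclosed implementation:

```python
# Hypothetical sketch of a 2D/3D Ad Matcher: drop ads the publisher
# restricts, then pick the ad whose content tags best match the user's
# fused profile weights.

def match_ad(user_profile, available_ads, content_restrictions):
    """Return the highest-scoring permitted ad, or None if none qualify."""
    best_ad, best_score = None, float("-inf")
    for ad in available_ads:
        if ad["rating"] in content_restrictions:  # e.g., publisher bans "R"
            continue
        # Score = sum of the user's profile weights over the ad's tags.
        score = sum(user_profile.get(tag, 0.0) for tag in ad["tags"])
        if score > best_score:
            best_ad, best_score = ad, score
    return best_ad

selected = match_ad(
    user_profile={"car": 2.0, "luxury car": 1.5},
    available_ads=[
        {"name": "luxury_sedan", "tags": ["luxury car", "car"], "rating": "PG-13"},
        {"name": "horror_game", "tags": ["game"], "rating": "R"},
    ],
    content_restrictions={"R"},
)
print(selected["name"])  # luxury_sedan
```

The returned ad plays the role of the selected ad 870, which is then rendered as a 2D/3D interactive experience in the allocated virtual volume.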
The Online Feedback System component 910 may be in communication with an Ad Actions Predictor component 920, where the Ad Actions Predictor component 920 may be configured to predict the next action to present to the user, where the next action may be based on: current user interactions, past user interactions, and the available actions to choose from. The Ad Actions Predictor component 920 may receive input from an Ad Actions Database 922, the input being the available actions that an ad can take to give a user more choices. One example may be the use case where the ability to add a wrist jewelry item is considered an ad action that can be taken along with adding a watch, or not adding anything at all. The Ad Actions Predictor component 920 may be configured to communicate its predictions to a User Interaction Predictor 940, where the User Interaction Predictor 940 may be configured to take the user's interaction history with other ads and the demographic the user belongs to in order to serve the actions predictor. The User Interaction Predictor 940 may communicate with a User Actions Database 942 that may be configured to store all user actions with interactive ads, for example, whether the user has interacted with a clothing advertisement in the past. In addition to the User Actions Database 942, the User Interaction Predictor 940 may receive input from a User Demographic Matcher 944 that matches a user with a demographic they belong to. This matching may be accomplished by communicating with a User Database 946 that stores the user information along with any groups that have common interaction patterns; one example is a clothing-shopper demographic or, more specifically, a jewelry-shopper demographic. Accordingly, the Real Time Personalization for Interactive Ads component 900, using the Online Feedback System component 910, may provide a set of interactive Ad Choices 916 which may be personalized for the user as they make interactive choices with the ad.
user: no actions taken;
ad state: mannequin with shoes, shirt, pants, and necklace;
possible actions: interact with shoe, interact with shirt, interact with pants, interact with necklace.
user: interacts with necklace;
ad state: mannequin with shoes, shirt, pants, necklace, and bracelet;
possible actions: interact with shoe, interact with shirt, interact with pants, interact with bracelet.
That is, the system may be configured to execute the following method steps: show a user an interactive advertisement (step 952); determine whether the user takes an Ad action by interacting with the shown interactive advertisement (step 954); if no action is taken, then the system may continue to check for user action (step 956); and if the user interacts with the advertisement, then update the ad state (step 958). This step provides the addition, or in some embodiments the subtraction, of possible actions an advertisement offers to the user. Therefore, after the ad state has been updated (in step 958), the system may be configured to determine whether the new Ad state allows or leads to new user actions (step 960). In the disclosed embodiments, this determination may be based on the types of actions possible and whether previous user interaction with the ad amounts to a change in possible actions. If, for example, the ad state does not allow for a new user action, then the system may continue checking for user action (step 956 described above). If the system determines that the ad state does allow new user actions, then the system may execute an update to the ad with the new user actions (step 962). As to the above example, in this state, the bracelet was added, showing the ad has been personalized to someone with a stronger interest in jewelry. Additionally, interacting with the bracelet is now a new action the user may take, further adding more personalization options.
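The method steps above (952 through 962) may be sketched as a simple state-update loop. The `UNLOCKS` table and the function below are hypothetical illustrations rather than the disclosed implementation; they reproduce the mannequin example, where interacting with the necklace adds a bracelet to the ad state and exposes a corresponding new action.

```python
# Hypothetical table mapping a taken action to the item it adds and the
# new action it unlocks (the step 960 determination, as data).
UNLOCKS = {
    "interact with necklace": ("bracelet", "interact with bracelet"),
}

def update_ad_state(ad_state, possible_actions, user_action):
    """Steps 954-962: apply a user action and expose any newly unlocked actions."""
    if user_action not in possible_actions:
        return ad_state, possible_actions          # step 956: keep checking
    # Step 958: the taken action is consumed, updating the ad state.
    possible_actions = [a for a in possible_actions if a != user_action]
    if user_action in UNLOCKS:                     # step 960: new actions allowed?
        item, new_action = UNLOCKS[user_action]
        ad_state = ad_state + [item]               # the ad now displays the item
        possible_actions = possible_actions + [new_action]   # step 962
    return ad_state, possible_actions

state = ["shoes", "shirt", "pants", "necklace"]
actions = ["interact with shoe", "interact with shirt",
           "interact with pants", "interact with necklace"]
state, actions = update_ad_state(state, actions, "interact with necklace")
print(state)    # -> ['shoes', 'shirt', 'pants', 'necklace', 'bracelet']
print(actions)  # necklace action consumed, bracelet action unlocked
```

Each interaction thus personalizes both the ad state and the menu of possible next actions, matching the two-step trace given above.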
In one embodiment, once the Virtual Publisher Environment 1010 is determined, such information may be communicated to a Group Splitter component 1020. The Group Splitter component 1020 may be configured to take the pool of users to whom the advertisements are to be shown and split it into individual users to show ads to. The Group Splitter component 1020 may then output a set of Users 1, 2, . . . N, each providing the relevant user information for the selected user. The component may then transmit the Users 1, 2, . . . N identification and execute the 2D/3D Interactive Ad component 1030 (also see
In one embodiment, once the Virtual Publisher Environment 1110 is determined, such information may be communicated to a Group Splitter component 1120, where the Group Splitter component 1120 may be configured to take the pool of users to whom the advertisements are to be shown and split it into individual users to show ads to. The Group Splitter component 1120 may then output a set of Users 1, 2, . . . N, each providing the relevant user information for the selected user. The system may then transmit the Users 1, 2, . . . N identification and execute the 2D/3D Interactive Ad component 1130 (also see
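The Group Splitter behavior may be sketched as follows. The dictionary-based user records are a hypothetical representation of the per-user information output as Users 1, 2, . . . N; the disclosed embodiments do not prescribe a particular data format.

```python
def split_group(user_pool):
    """Group Splitter: break the pool into individual users (User 1..N) so a
    personalized 2D/3D interactive ad can be executed for each one."""
    return [{"user_id": i + 1, "profile": profile}
            for i, profile in enumerate(user_pool)]

# Hypothetical user pool sharing one Virtual Publisher Environment.
pool = [{"interests": ["cars"]}, {"interests": ["jewelry"]}]
users = split_group(pool)
for user in users:
    # Each record would be passed to the 2D/3D Interactive Ad component.
    print(user["user_id"], user["profile"]["interests"])
```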
Embodiments also include a Persona Generator 1220 configured to generate a persona by taking into account the best-matched persona from the User Persona Matcher 1210. The Persona Generator 1220 may then provide input to an NPC Creator 1230, where the NPC Creator 1230 may be configured to take all aspects of the NPC communication, personality, and virtual embodiment to create the full NPC. Referring now to an Avatar Generator 1240, the system may use the Avatar Generator 1240 to generate the 2D/3D model for the avatar of the persona. For example, if the persona generated is a pushy salesperson, the avatar may have a suit and tie on. The NPC Creator 1230 may also receive input from a Language Module 1250 that provides a language model allowing the persona to communicate via text or speech audio.
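The assembly of persona, avatar, and language module into a full NPC may be sketched as follows. The `NPC` structure, the persona-to-avatar lookup, and the model name are hypothetical illustrations; the sketch only shows how the NPC Creator might combine the outputs of the Avatar Generator and the Language Module.

```python
from dataclasses import dataclass

@dataclass
class NPC:
    persona: str         # e.g. "pushy salesperson"
    avatar: str          # description of the generated 2D/3D model
    language_model: str  # module enabling text or speech communication

def generate_avatar(persona):
    """Avatar Generator: pick a 2D/3D model that fits the persona."""
    looks = {
        "pushy salesperson": "suit-and-tie model",
        "friendly guide": "casual-outfit model",
    }
    return looks.get(persona, "default model")

def create_npc(persona, language_model="hypothetical-dialogue-model"):
    """NPC Creator: combine persona, avatar, and language module into one NPC."""
    return NPC(persona, generate_avatar(persona), language_model)

npc = create_npc("pushy salesperson")
print(npc.avatar)  # -> suit-and-tie model
```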
In one embodiment, a User ID 1232, which is a unique identification number of the user, may be provided to a 2D/3D Interactive Ad component 1230 (also see
In the above disclosed embodiments, the user device may be any type of display system providing a view through optics so that the generated image being displayed to the user is overlaid onto a real-world view. Thus, as a wearable display system, an augmented reality device can incorporate components such as processing unit(s), computer interface(s) that provide network connectivity, camera(s), etc. These components can be housed in the headset or in a separate housing connected to the headset by wireless or wired means. The user device may also include an imaging application implemented to generate holograms for display. The imaging application may be implemented as a software application or components, such as computer-executable software instructions that are executable with the processing system. The imaging application may be stored on computer-readable storage memory (e.g., the memory), such as any suitable memory device or electronic data storage implemented in the augmented reality device.
It is contemplated that various combinations and/or sub-combinations of the specific features and aspects of the above embodiments may be made and still fall within the scope of the invention. Accordingly, it should be understood that various features and aspects of the disclosed embodiments may be combined with or substituted for one another in order to form varying modes of the disclosed invention. Further, it is intended that the scope of the present invention is herein disclosed by way of examples and should not be limited by the particular disclosed embodiments described above. The present embodiments are, therefore, susceptible to modifications and alternate constructions from those discussed above that are fully equivalent. Consequently, the present invention is not limited to the particular embodiments disclosed. On the contrary, the present invention covers all modifications and alternate constructions coming within the spirit and scope of the present disclosure. For example, the steps in the processes described herein need not be performed in the same order as they have been presented, and may be performed in any order(s). Further, steps that have been presented as being performed separately may in alternative embodiments be performed concurrently. Likewise, steps that have been presented as being performed concurrently may in alternative embodiments be performed separately.
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 63/348,159, filed Jun. 2, 2022, the contents of which are hereby incorporated by reference herein for all purposes.
| Number | Date | Country |
|---|---|---|
| 63348159 | Jun 2022 | US |