The present disclosure is directed to systems and methods for nonplayer character (NPC) personalization and interactive entertainment control, including machine learning model operations, electronic game control, game content rendering, gaming device operations, and gaming device processes.
Computer and console games titles have been developed in many styles for different gaming systems and platforms. As device processing increases and game play environments become more immersive, there is a desire for enhancement of content and customization of content for a user. Conventional games allow for preprogrammed settings, however, these settings may not be suitable for all users. In addition, preprogrammed settings may be based on a fixed set of parameters and may have a limited ability to personalize content. There is a need and a desire to provide customization for a user that is based on user input. There also exists a desire to accommodate users in different locations and with different communication styles.
Disclosed and described herein are systems, methods and device configurations for nonplayer character (NPC) personalization and electronic game control. In one embodiment, a method includes receiving, by a device, a user input to personalize at least one nonplayer character (NPC) of an electronic game, and generating, by the device, at least one NPC seed element for the at least one NPC character. The method also includes controlling, by the device, output of the electronic game, wherein the at least one NPC character is presented including the at least one NPC seed element.
In one embodiment, receiving the user input includes receiving an identification of a NPC seed element source.
In one embodiment, generating the at least one NPC seed includes detecting at least one of audio, text, and image data on a user device.
In one embodiment, generating the at least one NPC seed includes controlling a machine learning model to identify at least one of voice communication content and message communication content of an NPC seed source.
In one embodiment, generating the at least one NPC seed includes filtering personal data of an NPC seed source.
In one embodiment, controlling output of the electronic game includes controlling at least one of an NPC action and an NPC appearance in the electronic game.
In one embodiment, controlling output of the electronic game includes controlling an NPC output communication with at least one detected communication phrase of an NPC seed source.
In one embodiment, controlling output of the electronic game includes controlling the NPC character to output a humorous communication.
In one embodiment, controlling output of the electronic game includes modifying a communication style of the NPC character and including an NPC game instruction message.
In one embodiment, the method includes receiving game data identifying NPC communications for the electronic game, and wherein generating the at least one NPC seed element for the at least one NPC character includes modifying at least one received NPC communication for the electronic game.
Another embodiment is directed to a device configured for nonplayer character (NPC) personalization and electronic game control. The device includes an interface configured to output gaming content, a memory storing executable instructions and a controller, coupled to the interface and memory. The controller is configured to receive a user input to personalize at least one nonplayer character (NPC) of an electronic game, and generate at least one NPC seed element for the at least one NPC character. The controller is also configured to control output of the electronic game, wherein the at least one NPC character is presented including the at least one NPC seed element.
Other aspects, features, and techniques will be apparent to one skilled in the relevant art in view of the following detailed description of the embodiments.
The features, objects, and advantages of the present disclosure will become more apparent from the detailed description set forth below when taken in conjunction with the drawings in which like reference characters identify correspondingly throughout and wherein:
One aspect of the disclosure is directed to nonplayer character (NPC) personalization for electronic devices and electronic games, including gaming consoles, network games and network based applications for interactive and entertainment devices. Nonplayer character (NPC) personalization, which may also be referred to as non-player character personalization, may include modification of at least one of text, audio, appearance and characteristics in general of one or more NPCs in an electronic game. NPCs, also described herein as NPC characters, provide useful information to players in games. NPCs can provide instructions, such as directions, to advance within a game. In other games, NPCs may interact with a player, or a player controlled character or game element, in one or more stages of the game. According to embodiments, operations are provided to identify features for personalization of NPCs. According to embodiments, one or more seed elements may be determined to personalize an NPC of one or more games. According to other embodiments, seed elements may be based on a real person and/or user input to create an NPC with at least one of the mannerisms and personality of a friend or a show, movie, or video character. Personalization may be provided based on people of interest, such as a favorite professor, comedian, celebrity, etc. By personalizing an NPC based on user input, the user may customize one or more aspects of an NPC to improve a gaming experience. In addition, customization based on user input does not require preprogramming of all individuals. Rather, personalization may be used to modify one or more existing NPCs and/or create an NPC for an electronic game.
Embodiments are directed to systems, processes, and device configurations for nonplayer character (NPC) personalization and electronic game control. Methods can include receiving user input to personalize at least one nonplayer character (NPC) of an electronic game, generating at least one NPC seed element for the at least one NPC character, and controlling output of the electronic game with the at least one NPC character presented including an NPC seed element. NPC seed elements may be determined from one or more communication streams, including messaging data, in-game chat and voice communications. In addition, systems and methods include training a machine learning model for generatively producing NPC communications, NPC interactions, and NPC appearance. Operations also allow for detecting electronic game data to identify NPC output (e.g., messaging) for a game, including modification of a communication style of NPC characters while maintaining NPC game messaging.
Embodiments are directed to gaming systems which may include consoles, processors or servers that generate game media and interactive entertainment devices configured to provide output and receive user input. Personalization of NPCs can be applied to applications associated with gaming systems, including game engine processes for output of gaming content and game communication functions. Embodiments may also be applied to one or more game functions, such as character dialogue and NPC control within a game engine. A user, as used herein, may relate to a user of a gaming system, such as a console configured to control or operate one or more of a game element, game character and game world. The user may interact with one or more game NPCs while playing the game.
Embodiments are also directed to generating an NPC seed bank. Personalization of NPC elements in an electronic game may include identifying user data and detecting seed elements. According to embodiments, users may acknowledge and agree to use of user communications, such as messaging, voice communication messages and use of user data to identify characteristics of one or more seed users. Using detected data, personal information of seed users may be removed and one or more NPC seeds may be generated. According to embodiments, communications may be received for a first user of an electronic game from a seed user, including one or more of voice, text and audio data. The communications may be received on a user device, which may be a personal communication device (e.g., phone, tablet, etc.). Using a machine learning model for language processing, one or more seed elements may be detected for the seed user. Generating NPC seeds may include generating one or more of scripts, responses, and/or humor. NPC seeds may be based on a seed user or for generating NPC content in general. In addition, NPC seeds may be generated consistent with one or more game functions and game scripts. Embodiments include one or more operations and device configurations for using and updating machine learning models. Processes may use machine learning models including communication databases for audio, graphical, text and language models to detect and convert communications.
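By way of a non-limiting illustration, one possible arrangement for assembling a portion of an NPC seed bank from consented, already-scrubbed text messages of a seed user is sketched below in Python. The function name build_seed_bank and the returned dictionary shape are assumptions made only for this sketch and are not required structures of any embodiment.

```python
from collections import Counter

def build_seed_bank(messages: list[str], min_count: int = 3) -> dict:
    """Assemble a minimal NPC seed bank from consented, already-scrubbed messages.

    Short messages that recur frequently are kept as candidate greetings or
    catch phrases; the full corpus is retained for later model training.
    """
    normalized = [m.strip().lower() for m in messages if m.strip()]
    short_counts = Counter(m for m in normalized if len(m.split()) <= 6)
    phrases = [p for p, c in short_counts.items() if c >= min_count]
    return {"phrases": phrases, "corpus": normalized}

# Example usage with a consented seed user's messages.
sample = ["yo what's good", "yo what's good", "yo what's good",
          "see you at practice", "running late, be there soon"]
print(build_seed_bank(sample))
```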
As used herein, the terms “a” or “an” shall mean one or more than one. The term “plurality” shall mean two or more than two. The term “another” is defined as a second or more. The terms “including” and/or “having” are open ended (e.g., comprising). The term “or” as used herein is to be interpreted as inclusive or meaning any one or any combination. Therefore, “A, B or C” means “any of the following: A; B; C; A and B; A and C; B and C; A, B and C”. An exception to this definition will occur only when a combination of elements, functions, steps or acts are in some way inherently mutually exclusive.
Reference throughout this document to “one embodiment,” “certain embodiments,” “an embodiment,” or similar term means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of such phrases in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments without limitation.
According to embodiments, control device 105 may be configured to control presentation of electronic game content on display 110. When gaming content is a network game, server 115 may be configured to control gaming content output for a user. According to embodiments, operations for presentation of game content and personalization performed by control device 105 may similarly be performed by server 115. Personalization may be performed by a device, control device 105 and/or server 115, controlling presentation of game content. According to other embodiments, control device 105 and server 115 may interoperate with each other to perform operations described herein.
According to embodiments, a user of a game, such as a game player, may want to personalize one or more NPCs of the game. Control device 105 may be configured to receive a user input to personalize at least one nonplayer character (NPC) of an electronic game. Personalization may be based on the user input provided, which may include one or more instructions or inputs for personalization. According to embodiments, personalization may be based on communications of a game player with one or more other individuals or contacts, such as a seed user. According to other embodiments, personalization may be based on voice or image data. Personalization may be based on graphics or memes (e.g., graphical elements, text elements, video elements, etc.). Embodiments can detect one or more inputs as seeds for personalization and/or detect seeds for personalization of an NPC from elements of a user input. Seed elements as used herein may allow for personalizing one or more NPCs based on a real person, such as a contact, friend, favorite professor, favorite comedian, or celebrity. Similarly, seed elements may allow for personalizing one or more NPCs based on characters, such as a movie or show character. As an example, when communications with a contact or seed user include a recurring joke, phrase and/or one or more memes, seed elements may be based on the content of the communications and this content may be used to personalize features of an NPC. By detecting seed elements, one or more mannerisms and personality parameters may be detected and used for personalization of an NPC.
Control device 105 may be configured to receive a user input to personalize one or more NPC characters of an electronic game and present game content to include a personalized NPC character. A personalized NPC character may include elements of a game NPC with one or more features modified. According to embodiments, a user input to personalize an NPC may include at least one of an identification of an NPC to be modified, an identification of a seed element, an identification of a seed user and an instruction in general. User input may be provided by way of a game controller for control device 105, by using user device 125 and by way of commands to the control device, such as menu selections, voice commands and game commands in general.
According to embodiments, personalization of an NPC may be based on one or more communications of a user with another user. By way of example, a game player of control device 105 may have one or more communications with another individual, and the communications may include one or more of text, messaging, graphical elements and voice communications. According to embodiments, one or more conversations may be detected from a device associated with the game player, such as user device 125. User device 125 may be a personal device, such as a mobile phone, personal communication device or tablet.
According to embodiments, personalization of an NPC may be based on one or more mannerisms, personality parameters and appearance of graphical elements and/or identified persons. A user input identifying a movie character, for example, may result in at least one seed element for each of an appearance, hair style, body style, face type, and appearance parameter in general. It should be appreciated that seed elements may be determined from a variety of different sources. When a user input indicates using a personal contact, such as a favorite professor, seed elements for personalization of the NPC may be based on one or more of communications, appearance and common phrases uttered by the personal contact.
Based on a user input identifying at least one of a person and a source of seed elements, control device 105 may be configured to generate at least one NPC seed element for the at least one NPC character. Using the NPC seed element, control device 105 may control output of an electronic game, wherein the at least one NPC character is presented including the at least one NPC seed element.
According to embodiments, a user may link or provide data from a user device 125 to provide data for one or more seed elements. It should be appreciated that control device 105 and/or server 115 may provide a user interface for receiving user input including one or more of an application interface, online browsing and network interface in general. According to embodiments, functions of user device 125 described herein may be performed by control device 105.
Display 110 in FIG. illustrates an example of game content 135. According to embodiments, game engine operations may be performed by control device 105 to allow players to engage with an NPC. A player controlled element of a game, such as a character, may engage with an NPC when in close proximity. The game may use NPCs to provide information, such as instructions. Game content 135 includes a user character 140. According to embodiments, control device 105 may control presentation of NPC dialogue 145 personalized for game NPC 150. According to embodiments, control device 105 may control the presented appearance of game NPC 150. Appearance may be personalized to include one or more of a mannerism, personality trait and appearance trait. Seed elements may be detected for one or more of an appearance item, hair color, hair style, and eye color. Mannerisms may include physical gestures, facial expressions, and body movements in general.
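As one hedged illustration, appearance-related seed elements could be overlaid onto a game's default NPC appearance settings as sketched below; the field names (hair_color, mannerism, etc.) are assumptions made for this sketch and are not dictated by any particular game engine.

```python
def apply_appearance_seeds(npc_config: dict, seeds: dict) -> dict:
    """Overlay detected appearance and mannerism seeds onto default NPC settings.

    Only fields present in the seed data are overridden; all other fields keep
    the game's default values.
    """
    updated = dict(npc_config)
    for key in ("hair_color", "hair_style", "eye_color", "mannerism"):
        if key in seeds:
            updated[key] = seeds[key]
    return updated

# Example: personalize only the fields for which seeds were detected.
default_npc = {"hair_color": "brown", "hair_style": "short", "eye_color": "green"}
print(apply_appearance_seeds(default_npc, {"hair_color": "silver", "mannerism": "head_tilt"}))
```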
According to embodiments, personalization of an NPC character in an electronic game may be based on an archetype or character trait (e.g., spicy bartender, professor, etc.) received in a user input. According to embodiments, control device 105 may generate NPC dialogue to include at least one of a script and a communication style based on the character and/or character trait. Control device 105 may use a machine learning model to generate a script for a character and/or character trait. Generation of character scripts may be based on predetermined reactions of an NPC character based on game data, including NPC comments and sample conversations.
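A possible sketch of archetype-driven dialogue generation follows; generate_text stands in for whichever machine learning model endpoint a control device or server uses, and the prompt wording is only an assumption for purposes of illustration.

```python
def build_archetype_prompt(archetype: str, game_instruction: str,
                           sample_lines: list[str]) -> str:
    """Compose a prompt asking a language model to deliver a required game
    instruction in the voice of a given archetype, without changing its meaning."""
    examples = "\n".join(f"- {line}" for line in sample_lines)
    return (
        f"You are writing dialogue for a game NPC with the archetype: {archetype}.\n"
        f"Example lines in this style:\n{examples}\n"
        f'Rewrite the following instruction in that style, keeping its meaning: '
        f'"{game_instruction}"'
    )

def personalize_npc_line(generate_text, archetype, instruction, samples):
    # `generate_text` is a placeholder for the deployed model's text-generation call.
    return generate_text(build_archetype_prompt(archetype, instruction, samples))
```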
One or more devices of system 100 may perform operations of process 200 described with reference to
According to embodiments, generating at least one seed element and controlling output of an electronic game may be performed using a machine learning language model. Processes for conversion are discussed with reference to
According to embodiments, personalization of an NPC may include modifying at least one of a communication format and appearance of an NPC. In-game NPCs may provide instructions that are necessary for gameplay. Controlling output of the electronic game may include modifying the style of an NPC message while maintaining the information to be provided to a user. According to embodiments, personalization of an NPC by system 100 may be performed using a machine learning model. One or more devices of system 100, such as control device 105, server 115 and display 110 may perform operations of a machine learning model.
System 100 may provide features to improve user experience, wherein functions and operations described herein are performed following user consent, with express notice to a user, and/or in alignment with one or more user settings for user privacy. It should be appreciated that embodiments may be applied to interactive entertainment with one or more users. Processes described herein are not limited to gaming content.
Process 200 may also optionally include controlling presentation of an electronic game at optional block 206. Process 200 may be performed on electronic games (e.g., video games, interactive games, etc.). Electronic games may include one or more player controlled elements, a visual display, one or more levels or areas, and the ability of the user to control a game element or character relative to the one or more levels and areas. Electronic games may also include non-player elements such as non-player characters and various game elements and characters. Games may include a storyline including character information, such as a character backstory and one or more narrative components describing and/or associated with gameplay. Controlling presentation of an electronic game at block 206 may include controlling presentation of and interaction with one or more NPC characters.
Process 200 may also optionally include receiving game data at optional block 207. The game data may include an identification of NPC characters in the electronic game and one or more parameters related to presentation of the characters. In some instances, a user selects an NPC character to be modified as part of presentation of electronic game data, using the game data received at block 207. Receiving game data at block 207 may be used for identifying NPC communications for the electronic game. Generating at least one NPC seed element for the at least one NPC character may include modifying at least one received NPC communication for the electronic game.
At block 210, process 200 includes generating at least one NPC seed element for at least one NPC character. According to embodiments, user input received at block 205, including an identification of a person or source of data, may be used to generate one or more NPC seeds. NPC seeds directed to appearance may be used to modify the appearance of an NPC character. NPC seeds directed to communication style may be used to modify the communication style of an NPC. According to embodiments, generating the at least one NPC seed includes detecting at least one of audio, text, and image data on a user device. Text conversations between a user (e.g., player) and the identified seed user or seed source may include one or more of text messages, interviews, emails, etc. Text conversations can capture non-text data, including memes sent on one or more network services, as an approximation of humor. Audio data may be sourced from one or more of phone calls, voice notes, podcasts, etc. Video data may be sourced from one or more of recorded video calls, interviews, video clips, and downloaded video data. NPC seeds may be generated consistent with one or more game functions and game scripts. For example, a game limitation on NPC ability may limit generation of content based on an NPC seed.
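A simplified sketch of routing consented device data by media type into seed elements is shown below; the SeedElement fields and detector behavior are illustrative assumptions only, and real detectors (speech-to-text, image analysis) would replace the labeled placeholders.

```python
from dataclasses import dataclass

@dataclass
class SeedElement:
    """One detected characteristic usable for NPC personalization."""
    category: str      # e.g. "phrase", "voice_sample", "appearance"
    value: str
    source_type: str   # "text", "audio", or "image"
    weight: float = 1.0

def detect_seed_elements(device_data: dict) -> list[SeedElement]:
    """Route consented user-device data by media type to simple detectors."""
    seeds: list[SeedElement] = []
    for message in device_data.get("text", []):
        seeds.append(SeedElement("phrase", message, "text"))
    for clip in device_data.get("audio", []):
        # A speech-to-text or voice-profile pass would normally run here.
        seeds.append(SeedElement("voice_sample", clip, "audio"))
    for image in device_data.get("image", []):
        seeds.append(SeedElement("appearance", image, "image"))
    return seeds
```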
Generating the at least one NPC seed at block 210 can include controlling a machine learning model to identify at least one of voice communication content and message communication content of an NPC seed source. By way of example, a machine learning model, such as a standard large language model, may feed an NPC with generatively produced interactions. The machine learning model may additionally be trained on data to customize at least one of personality, speech patterns, and tendencies of a real person. According to embodiments, machine learning models may use one or more conversation models based on conversation, salutations, greetings and personal conversations. Humor models may include parameters to impart humor to one or more communications.
Generating the at least one NPC seed at block 210 can include filtering personal data of an NPC seed source. According to embodiments, input data may be scrubbed or cleaned of any personally identifying information. Scrubbing may be performed on communications, for example, to remove any contact information. According to embodiments, personal information may be removed from user input including NPC seed material prior to using seed data in training and NPC generation.
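A minimal sketch of scrubbing personally identifying information from seed text is shown below; the regular expressions are simple assumptions for illustration, and a production scrubber would be considerably more thorough.

```python
import re

def scrub_personal_data(text: str) -> str:
    """Replace simple personally identifying tokens before seed data is stored."""
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[email]", text)          # email addresses
    text = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[phone]", text)    # phone numbers
    text = re.sub(r"\b\d{1,5}\s+\w+\s+(Street|St|Ave|Avenue|Rd|Road)\b",
                  "[address]", text, flags=re.IGNORECASE)                   # street addresses
    return text

print(scrub_personal_data("Call me at 555-123-4567 or mail jane.doe@example.com"))
```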
According to embodiments, generating nonplayer character (NPC) seed elements at block 210 includes generating example communication exchanges based on a seed source. For example, if the seed source uses particular phraseology as a greeting, a seed element may be generated for a greeting based on or similar to communications used by the seed source. Additional communication styles and phrases may be detected and encoded as seed elements to be used for personalizing an NPC character.
At block 215, process 200 includes controlling output of the electronic game to present at least one NPC character including at least one NPC seed element. Output of the NPC character may be in the form of at least one of voice, audio and text of the communication. According to embodiments, output of the NPC may include outputting control information to an electronic game to control game elements, such as output of an NPC. Output of the NPC may be updated relative to standard output of the NPC.
According to embodiments, controlling output of the electronic game at block 215 includes controlling at least one of an NPC action and an NPC appearance in the electronic game. At least one of a communication style, appearance and mannerism may be imparted to the NPC character. By way of example, if a user input requests a communication style to be associated with a particular person or character, the NPC may be personalized across a number of characteristics, including reactivity. NPC actions may include abilities or restrictions within the game and movements of the in-game NPC character (e.g., facial expression, arm/leg movements, dancing, body position, etc.).
Controlling output of the electronic game at block 215 may include modifying a communication style of the NPC character while including an NPC game instruction message. According to embodiments, controlling output of the electronic game at block 215 includes controlling an NPC output communication with at least one detected communication phrase of an NPC seed source. By way of example, if a user input requests a communication style to be associated with one or more communication messages and/or a seed user, the NPC character may include one or more phrases, terms and speech patterns based on at least one generated NPC seed element. Output of the NPC may be controlled to include at least one of scripts, responses, and humor consistent with the text conversations used as input data. Control of the NPC character may include speech synthesis based on audio clips available from video, calls, and/or voice notes. Control of the NPC character may include physical mannerisms based on video and/or graphic data. According to embodiments, controlling output of the electronic game at block 215 includes controlling the NPC character to output a humorous communication.
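One simplified way of keeping the gameplay instruction intact while personalizing the delivery is sketched below; in practice a language model, as noted above, rather than simple string wrapping would likely produce the final line, and the seed dictionary shape is an assumption.

```python
import random

def style_npc_message(instruction: str, seed_phrases: dict, rng=random) -> str:
    """Wrap a required game instruction with seed-derived phrasing.

    The instruction text itself is preserved so the gameplay information is not
    lost; only the surrounding delivery reflects the seed source.
    """
    greeting = rng.choice(seed_phrases.get("greetings", ["Hey."]))
    signoff = rng.choice(seed_phrases.get("signoffs", ["Good luck."]))
    return f"{greeting} {instruction} {signoff}"

# Example: the instruction is preserved, only the delivery is personalized.
seeds = {"greetings": ["Yo, what's good?"], "signoffs": ["Catch you later."]}
print(style_npc_message("Take the north gate to reach the castle.", seeds))
```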
Process 200 may optionally include detecting seed data at optional block 208. According to embodiments, process 200 may detect seed data from one or more seed users and/or data sources at block 208. Seed data may be detected to determine one or more modifications to an existing NPC character. According to embodiments, seed data detected at block 208 may be based on game data received at block 207. In a game with particular characteristics for NPC characters, seed data may need to be detected at block 208 to fulfill a user input at block 205.
Seed data may provide one or more of text, audio, communications and voicings that may be used to modify existing output of an NPC and/or to generate new output. Seed data may include portions or elements of NPC output or appearance. For example, seed data may describe a hair style and/or hair color. Seed data may include responses to questions, conversation scripts and/or selected phrases or terms that may be used to personalize an NPC character.
At block 260, process 250 includes detecting seed elements based on user data identified at block 255. Using data identified at block 255, process 250 may detect one or more of text, audio, video and graphical content. By way of example, seed elements may be detected for a communication style, including word selection, salutations and greetings, and frequently used terms or phrases. Seed elements may be detected for elements a user finds humorous. By way of example, a running joke or repeated text content may be identified as including humorous content. Similarly, content may be responded to and labeled as being funny or humorous. Seed elements may be detected for an appearance of an individual. For example, hair color or body movement styles may be detected. For a detected seed element identifying a particular body movement, such as a head tilt for example, an NPC can be configured to include body mechanics with a similar head tilt. Seed elements detected may be examples of content or characteristics of a seed user that may be used to generate one or more NPC seeds.
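A small sketch of flagging humor seed candidates from a consented chat log follows; the laugh-marker heuristic and the conversation data shape are assumptions used only for illustration.

```python
from collections import Counter

LAUGH_MARKERS = ("lol", "lmao", "haha")  # reaction cues treated as humor labels

def detect_humor_seeds(conversation: list[tuple[str, str]], min_repeats: int = 2) -> list[str]:
    """Flag messages that recur (a running joke) or that draw laugh reactions.

    `conversation` is a list of (sender, text) pairs from a consented chat log.
    """
    texts = [text.lower().strip() for _, text in conversation]
    repeated = {t for t, c in Counter(texts).items() if c >= min_repeats}
    reacted = {
        texts[i] for i in range(len(texts) - 1)
        if any(marker in texts[i + 1] for marker in LAUGH_MARKERS)
    }
    return sorted(repeated | reacted)
```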
At block 265, process 250 includes generating at least one NPC seed. The NPC seed may be a characteristic or element that may be used for personalizing an NPC of electronic game content. To provide a robust experience, personalization of an NPC may be performed using a plurality of NPC seeds. An NPC seed may be a data unit that may be read and used by a processing device for electronic game data. According to embodiments, one or more NPC seeds may be generated to personalize at least one nonplayer character (NPC) of an electronic game at block 205.
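By way of illustration, a generated seed could be packaged as a small JSON unit for a game process to read, as in the sketch below; the schema and field names shown are assumptions, not a required format.

```python
import json

def package_npc_seed(seed_elements: list[dict], npc_id: str) -> str:
    """Serialize generated seed elements into one unit a game process can read."""
    return json.dumps({"npc_id": npc_id, "version": 1, "elements": seed_elements}, indent=2)

print(package_npc_seed([{"category": "greeting", "value": "Yo, what's good?"}],
                       npc_id="npc_guide_01"))
```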
Controller 310 may relate to a processor or control device configured to execute one or more operations (e.g., executable instructions) stored in memory 315, such as processes for NPC personalization. Memory 315 may be non-transitory memory configured to provide data storage and working memory operations for device 300. Memory 315 may be configured to store computer readable instructions for execution by controller 310 for one or more processes described herein. Interface 320 may be a communications module configured to receive and transmit network communication data.
Device 300 may be configured to receive gaming media (e.g., card, cartridge, disk, etc.) and output visual and audio content of the gaming media to a display. For network games, device 300 may receive game data from a network source. Device 300 may be configured to receive input from one or more peripheral devices, such as user controller 305.
Controller 310 may be configured to control presentation of electronic gaming content, detect NPC seed elements and present gaming content. Controller 310 may also be configured to convert communications and control one or more machine learning model operations for language processing. Controller 310 may also be configured to output updated gaming content including at least one personalized NPC.
According to embodiments, users acknowledge and agree to provide access to personal devices to provide data that may be used for training and/or to provide seed elements. By way of example, phone communications can be mined, with identified data scrubbed of personal or sensitive information. One or more users, or seed users, may be linked to mined data. Similarly, in-game communications may be mined to detect seed elements. Using mined data, such as phone images for appearance or user provided images, one or more seed elements may be determined by a control device for the appearance of an NPC. Communication characteristics may also be used to determine seed elements.
According to embodiments, training process 400 and controller 410 may be configured to use one or more machine learning models (e.g., artificial intelligence, iterative models, etc.) to identify communications and communication style. Training process 400 and controller 410 may use one or more libraries of common user responses. According to embodiments, output 415 may include output of communications with at least one modified segment.
While this disclosure has been particularly shown and described with references to exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the claimed embodiments.