Content orchestration, management and programming system

Information

  • Patent Grant
  • 11717756
  • Patent Number
    11,717,756
  • Date Filed
    Wednesday, September 8, 2021
  • Date Issued
    Tuesday, August 8, 2023
Abstract
Procedurally generating live experiences for a virtualized music-themed world, including: providing a headless content management system supporting management of a back end; coupling a virtual world client to the back end of the content management system using an interface that provides a content packaging framework to enable customized event scheduling and destination management; and procedurally generating a plurality of content elements in a real-time game engine for live experiences in the virtualized music-themed world.
Description
BACKGROUND
Field

The present disclosure relates to content management, and more specifically, to back-end content management for a virtual music-themed world.


Background

Computer-implemented games, such as PC, gaming console, tablet, mobile phone, or any other analogous device-based games, are increasingly popular, and new uses for the technology are constantly being found. The software may also be tied to certain real or virtual-world systems. However, since the content of the games is built into the executable, when something in the game needs to be changed, a large amount of content, including binaries, needs to be downloaded again.


SUMMARY

The present disclosure provides for leveraging the content management system in a social metaverse where content is used to procedurally generate live experiences in a real-time 3-D game engine for a virtualized music-themed world.


In one implementation, a method for procedurally generating live experiences for a virtualized music-themed world is disclosed. The method includes: providing a headless content management system supporting management of a back end; coupling a virtual world client to the back end of the content management system using an interface that provides a content packaging framework to enable customized event scheduling and destination management; and procedurally generating a plurality of content elements in a real-time game engine for live experiences in the virtualized music-themed world.


In one implementation, procedurally generating includes providing a server-side control over how the plurality of content elements is rendered. In one implementation, the server-side control comprises control over what content element is rendered. In one implementation, the server-side control comprises control over where the content element is rendered. In one implementation, the server-side control comprises control over when the content element is rendered and for how long. In one implementation, the server-side control comprises control over what the content element costs. In one implementation, the method further includes rendering the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.


In another implementation, a system for procedurally generating live experiences for a 3-D virtualized music world is disclosed. The system includes: a headless content management system to provide management of a back end; and a virtual world client to connect to the back end using an interface that provides a content packaging framework to enable customized event scheduling and destination management, wherein the virtual-world client procedurally generates a plurality of content elements in a real-time game engine for live experiences in a virtualized music-themed world.


In one implementation, the virtual-world client includes a server-side controller which provides server-side control over how the plurality of content elements is rendered. In one implementation, the server-side control comprises control over which content element of the plurality of content elements is rendered. In one implementation, the server-side control comprises control over where the content element is rendered. In one implementation, the server-side control comprises control over when the content element is rendered and for how long. In one implementation, the server-side control comprises control over what the content element costs. In one implementation, the virtual-world client renders the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.


In another implementation, a non-transitory computer-readable storage medium storing a computer program to procedurally generate live experiences for a virtualized music-themed world is disclosed. The computer program includes executable instructions that cause a computer to: provide a headless content management system supporting management of a back end; couple a virtual world client to the back end of the content management system using an interface that provides a content packaging framework to enable customized event scheduling and destination management; and procedurally generate a plurality of content elements in a real-time game engine for live experiences in the virtualized music-themed world.


In one implementation, the executable instructions that cause the computer to procedurally generate include executable instructions that cause the computer to provide a server-side control over how the plurality of content elements is rendered. In one implementation, the server-side control comprises control over what content element is rendered. In one implementation, the non-transitory computer-readable storage medium further includes executable instructions that cause the computer to render the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.


Other features and advantages should be apparent from the present description which illustrates, by way of example, aspects of the disclosure.





BRIEF DESCRIPTION OF THE DRAWINGS

The details of the present disclosure, both as to its structure and operation, may be gleaned in part by study of the appended drawings, in which like reference numerals refer to like parts, and in which:



FIG. 1 is a diagram of a scheduling layer operating within a content orchestration, management and programming (COMP) system in accordance with one implementation of the present disclosure;



FIG. 2 is an example of metadata that is stored and managed under the content models and media/metadata management;



FIG. 3 is a block diagram of the architecture and design of the COMP system in accordance with one implementation of the present disclosure;



FIG. 4 is a system data flow diagram including a list of systems, subsystems and logical components and how the COMP system will interact with them;



FIG. 5 is a data entity relationship diagram (ERD) in accordance with one implementation of the present disclosure;



FIG. 6 is a UE4 client interface application in accordance with one implementation of the present disclosure;



FIG. 7 shows a login page that will need to integrate with Sony Music's Azure Active Directory for identity management of users;



FIG. 8 shows a schedule view;



FIG. 9 shows All Users page;



FIG. 10 shows Edit Users page;



FIG. 11 shows All Schedules page;



FIG. 12 shows Add Schedules page;



FIG. 13 shows Edit Schedule page;



FIG. 14 shows All Locations page;



FIG. 15 shows Edit Location page;



FIG. 16 shows All Content Types page;



FIG. 17 shows Edit Content Type page;



FIG. 18 shows All Attribute Types page;



FIG. 19 shows Edit Attribute Type page;



FIG. 20 shows All Venues page;



FIG. 21 shows Add Content Programming page;



FIG. 22 shows Add Content Attributes page;



FIG. 23 is a flow diagram of a method for procedurally generating live experiences for a 3-D virtualized music world in accordance with one implementation of the present disclosure;



FIG. 24 is a block diagram of a system for procedurally generating live experiences for a 3-D virtualized music world in accordance with one implementation of the present disclosure;



FIG. 25A is a representation of a computer system and a user in accordance with an implementation of the present disclosure; and



FIG. 25B is a functional block diagram illustrating the computer system hosting the procedural generation application in accordance with an implementation of the present disclosure.





DETAILED DESCRIPTION

As described above, computer-implemented games residing in relevant gaming device systems, such as those referred to above, are increasingly popular, and new uses for the technology are constantly being found. The software may also be tied to certain virtualized music world systems. However, since the content of the games is built into the executable, when something in the game needs to be changed, a large amount of content, including binaries, needs to be downloaded again. Accordingly, a need exists for more efficient real-time rendering of the virtual music-themed world.


Certain implementations of the present disclosure provide for leveraging the content management system in a social metaverse where content is used to procedurally generate live experiences in a real-time 3-D game engine for a virtualized music-themed world. That is, the procedural generation of the content (i.e., orchestration of multiple content elements) in the content management system of the present disclosure enables the content to be updated and rendered dynamically by leveraging the content management system. After reading the descriptions below, it will become apparent how to implement the disclosure in various implementations and applications. Although various implementations of the present disclosure will be described herein, it is understood that these implementations are presented by way of example only, and not limitation. As such, the detailed description of various implementations should not be construed to limit the scope or breadth of the present disclosure.


In one implementation, a computer system provides a headless content management system, supporting back-end management without providing a front end for data presentation. The computer system supports designing and implementing content orchestration, management, and programming in a virtual music-themed world by providing a mechanism for real-time procedural virtual world generation. In one example, a virtual world client connects to the content orchestration, management, and programming back end through a GraphQL interface that provides a content packaging framework to enable customized and efficient event scheduling and destination management. The GraphQL interface is neither the front end nor the back end; rather, it is a language spoken between the back end and the front end to exchange information.
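The disclosure does not specify the GraphQL schema itself; the following is a minimal sketch of the kind of query a virtual world client might send to the back end, written as a C++ string constant that a client could POST to a /graphql endpoint. The field names (scheduledContent, ue4ClassName, attributes, resource, schedule) are illustrative assumptions, not the actual COMP schema.

#include <iostream>

// Hypothetical GraphQL query a virtual world client could POST to the COMP
// back end. All field names below are assumptions for illustration only.
static const char* kScheduledContentQuery = R"gql(
query ScheduledContent($locationId: Int!, $at: String!) {
  scheduledContent(locationId: $locationId, at: $at) {
    contentId
    name
    ue4ClassName
    attributes {
      ue4AttrName
      attributeType
      resource
      schedule { frequency startDatetime duration }
    }
  }
}
)gql";

int main() {
  // In a real client, this string would be the "query" field of a JSON body
  // sent in an HTTP POST to the back end's GraphQL endpoint.
  std::cout << kScheduledContentQuery << std::endl;
  return 0;
}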


Features provided in implementations can include, but are not limited to, one or more of the following items: server-side control over one or more of (a) what content appears; (b) where content appears; (c) when content appears (and for how long); and (d) what content costs. Using this information, the client can render out various content from text, audio, music, and video source files at specific choreographed times within a virtual setting (e.g., a virtual concert venue). Content producers may define content packs and program event schedules through a web interface; the data is stored in both a relational database and a cache, and the back-end programming in Java uses reflection to define content models.
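As a rough illustration of the server-driven fields listed above, the sketch below models a content element whose identity, placement, timing, and cost all arrive from the back end, with the client only checking whether the element should be drawn right now. The struct and function names are assumptions for illustration and are not part of the disclosed COMP implementation.

#include <chrono>
#include <iostream>
#include <string>

// Illustrative only: a content element whose rendering is fully described by
// server-supplied data (what, where, when, for how long, and at what cost).
struct ContentElement {
    std::string ue4ClassName;   // what content appears (e.g., "theater01")
    std::string locationName;   // where it appears (e.g., "Hip-Hop Alley")
    std::chrono::system_clock::time_point start;  // when it appears
    std::chrono::minutes duration;                // for how long
    double costUsd;                               // what it costs
};

// The client only decides *whether* to draw the element right now; every
// parameter of that decision came from the back end.
bool shouldRender(const ContentElement& e,
                  std::chrono::system_clock::time_point now) {
    return now >= e.start && now < e.start + e.duration;
}

int main() {
    ContentElement billboard{"hub_billboard02", "Hub",
                             std::chrono::system_clock::now(),
                             std::chrono::minutes(125), 0.0};
    std::cout << std::boolalpha
              << shouldRender(billboard, std::chrono::system_clock::now())
              << std::endl;
    return 0;
}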



FIG. 1 is a diagram of a scheduling layer operating within a content orchestration, management and programming (COMP) system 100 in accordance with one implementation of the present disclosure. In one example, when shows 110 are scheduled, there is metadata under shows 110 that describes what makes a show possible, such as which venue 112, what performance 114 (e.g., Singer1), and what kind of video 116 and music 118. Once a show is scheduled, the COMP system procedurally generates, renders, and performs the show.



FIG. 2 is an example of metadata that is stored and managed under the content models and media/metadata management.



FIG. 3 is a block diagram of the architecture and design 300 of the COMP system in accordance with one implementation of the present disclosure. In one implementation, the architecture and design 300 of the COMP system shows the details of the overall architecture, design, dependencies, and requirements to fulfill the product requirements.



FIG. 4 is a system data flow diagram 400 including a list of systems, subsystems and logical components and how the COMP system will interact with them. The data flow diagram 400 also includes a presenter system 410 which is a client-side/server-side system built in C++ on the Unreal Engine. The presenter system 410 is responsible for requesting programmed content data from the backend COMP system. Upon receiving the serialized data, the presenter system 410 deserializes and maps data to appropriate elements within the application. In one example, an item of audio is mapped to an Unreal Engine Audio Element. As the event's schedule begins, the event scheduler of the presenter system 410 notifies all clients to begin playing. That is, the presenter system 410 procedurally generates and activates the music artist's performance with audio, lights, effects and imagery.
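The presenter system's source code is not reproduced in the disclosure; the sketch below uses plain C++ containers and a stub type in place of Unreal Engine elements to illustrate the described flow of deserializing attribute data, mapping it to an audio element, and playing it when the event scheduler notifies the client. Names such as AudioElementStub, mapAttribute, and onScheduleStart are hypothetical.

#include <iostream>
#include <map>
#include <string>

// Stand-in for an Unreal Engine audio element; in the actual presenter system
// this would be a UE4 object populated from the deserialized COMP data.
struct AudioElementStub {
    std::string name;
    std::string resourceUrl;  // e.g., a CDN URL supplied by the back end
    void play() const { std::cout << "playing " << resourceUrl << "\n"; }
};

// Map one deserialized attribute record (name -> value pairs) onto the element.
AudioElementStub mapAttribute(const std::map<std::string, std::string>& attr) {
    AudioElementStub element;
    element.name = attr.at("name");
    element.resourceUrl = attr.at("resource");
    return element;
}

// Called when the event scheduler notifies clients that the show has started.
void onScheduleStart(const AudioElementStub& element) { element.play(); }

int main() {
    std::map<std::string, std::string> attr{
        {"name", "Main Screen Audio"},
        {"resource", "http://cdn.monterey.com/Singer1_2d_video.mov"}};
    onScheduleStart(mapAttribute(attr));
    return 0;
}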



FIG. 5 is a data entity relationship diagram (ERD) 500 in accordance with one implementation of the present disclosure. In the illustrated implementation of FIG. 5, the data ERD describes the database tables and the entity relationships between tables. In one example, a show that is scheduled is referred to as an event. The ERD 500 describes the event as stored in the database. The sections below describe the database schema.


User (admin) Data Model: This table defines the list of privileged users who have access to view or administer content within the COMPS system.

Attribute | Data Type | Required | Description | Potential Values
user_id | Integer | Y | Primary key |
name | String | Y | Used to display user name | First Name, Last Name
profile_image | String | N | Image of user which can be uploaded or pulled down from Azure AD | URL string to image file
email | String | Y | Email to be used to authenticate | This email will be used to send an invite to the user to validate their email
access_type | Integer | Y | Entitlement level | 0 = View-only or 1 = Admin
auth_type | Integer | Y | How the user will be authenticated into COMPS's web portal | Default: 0 = SSO Azure AD
deactivated | Boolean | Y | Is the object deactivated? | Default: false
last_login_date | datetime | N | Last time user logged in | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


User Session Data Model: This table represents the active session for any user authenticated through Azure Active Directory (AD). The session will remain active unless it expires, in which case the user will need to re-authenticate using Single Sign-On (SSO).

Attribute | Data Type | Required | Description | Potential Values
session_id | Integer | Y | Primary key |
user_id | Integer | Y | User ID foreign key |
token | String | Y | Azure AD session token |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


User Logs Data Model

This table represents all the logged user events, such as but not limited to the following: logging in; creating content; and changing a schedule.

Attribute | Data Type | Required | Description | Potential Values
log_id | Integer | Y | Primary key |
user_id | Integer | Y | User ID foreign key |
changes | String | Y | A textual description of what data the user changed in the COMPS system | Example: User A changed the schedule for Content A
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Content Location Data Model

This table represents where in the Music World content can be located.

Attribute | Data Type | Required | Description | Potential Values
location_id | Integer | Y | Primary key |
name | String | Y | Display name for the location of the content in the Music World | Examples: "Hub", "Hip-Hop Alley"
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_by | Integer | Y | Foreign key of user who created this content |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Content Type Data Model

This table represents the various types of content that COMPS and the UE4 client are designed for.

Attribute | Data Type | Required | Description | Potential Values
content_type_id | Integer | Y | Primary key |
name | String | Y | Category of content types that can be managed by COMPS. Each content type will enforce UI for related data attributes | Ad, NPC, Venue, Item, Playlist
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_by | Integer | Y | Foreign key of user who created this content |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Content Data Model


This table represents the dynamic content in the Music World mapped to the parent UE4 object that contains attributes.


Design Consideration


In some cases, a piece of Music World content may contain a single attribute; however, designing the data hierarchy so that a single UE4 Object (content) can have many attributes provides flexibility as the Music World grows. For example, early on, the Music World may contain a theater showing Singer1's video; in the future, this same theater may contain exterior posters, a marquee, and multiple theaters inside the building.

Attribute | Data Type | Required | Description | Potential Values
content_id | Integer | Y | Primary key |
location_id | Integer | Y | Foreign key of the location in the Music World where the content will live |
name | String | Y | Display name for the content | Examples: "Madison Beer Concert", "ACDC T-Shirt"
ue4_class_name | String | Y | The actual UE4 class name representing this UObject in Unreal | Examples: "theater01", "joe_npc"
created_by | Integer | Y | Foreign key of user who created this content |
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Attribute Type Data Model

This table represents the various types of data (audio, video, etc.) of an attribute.

Attribute | Data Type | Required | Description | Potential Values
attr_type_id | Integer | Y | Primary key |
name | String | Y | Category of attribute types that can be managed by COMPS. Each type will provide details on where the resources can be located | Following but not limited to: audio/url, image/url, video/url, json/url, text/text, text/json, playfab/url, playfab/quest_name, playfab/item_id, napster/artist_id, napster/playlist_id
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_by | Integer | Y | Foreign key of user who created this content |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Content Attribute Data Model

This table references all the attribute and data resources that a single UE4 Object can contain.

Attribute | Data Type | Required | Description | Potential Values
attribute_id | Integer | Y | Primary key |
attribute_type_id | Integer | Y | Foreign key |
content_id | Integer | Y | Foreign key |
schedule_id | Integer | Y | Foreign key |
name | String | Y | Display name of the attribute | Examples: "Exterior Poster", "Main Theater Video Player"
ue4_attr_name | String | Y | The actual UE4 attribute name representing this UObject in Unreal | UE4 class attribute name, i.e., video1, poster2, artist3456, songTrack2234, playlist987, billboard1
resource | String | Y | UE4 class attribute data that is scheduled | For most attribute types, the value will be a URL that points to a CDN or to the Music Metadata API (MMA), i.e., http://cdn.monterey.com/Singer1_2d_video.mov, http://cdn.monterey.com/travis_scott_poster.jpg, http://mma.monterey.com/artist/3456, http://mma.monterey.com/artist/3456/tracks/2234, http://api.napster.com/v2/playlist/987, http://cdn.monterey.com/hub_billboard_verizon.jpg, http://cdn.monterey.com/blimp texture.file. This attribute can also contain text, i.e., "Singer1, The Movie brought to you by SK Telecom"
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_by | Integer | Y | Foreign key of user who created this content |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z


Content Schedule Data Model

This table represents all the available schedules that can be associated with content and their attributes. Each schedule is used to determine, for a given timeframe, what content to display inside the Music World.

Attribute | Data Type | Required | Description | Potential Values
schedule_id | Integer | Y | Primary key |
name | String | N | Display name for the schedule | Examples: "Taco Tuesdays", "Daily News"
frequency | String | Y | Interval at which the content should be activated | Once (beginning on the start_date, once at the start_time for the duration); Daily (beginning on the start_date, every day at the start_time for the duration, ending on the end_date); Weekly (beginning on the start_date, every week on the same weekday as the start_date at the start_time for the duration, ending on the end_date); Monthly (beginning on the start_date, every month on the same numerical day of the month as the start_date at the start_time for the duration, ending on the end_date); Yearly (beginning on the start_date, every year on the same numerical day of the month as the start_date at the start_time for the duration, ending on the end_date); On-Demand (every day, all the time); Custom Weekly (beginning on the start_date, every week on the weekdays selected, at the start_time for the duration, ending on the end_date)
start_datetime | datetime | Y | Content activation start date | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
end_datetime | datetime | N | Not included = indefinitely | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
weekdays | String | N | If Custom Weekly is selected, lists the weekdays | Examples: mo, tu, we, th, fr, sa, su; mo, we, fr
duration | int | N | Optional end time of the event, in minutes. For content such as videos, the end time should be disregarded, as a video can extend past the end time | minutes
deactivated | Boolean | Y | Is the object deactivated? | Default: false
created_by | Integer | Y | Foreign key of user who created this content |
created_at | datetime | Y | When the record was created | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z
updated_at | datetime | Y | Last time the record was updated | The date-time notation as defined by RFC 3339, section 5.6, for example, 2017-07-21T17:32:28Z






Example of Event Content


A theater in the Music World playing Singer1's video weekly at 7 PM UTC starting on 05/09/2020.

{
  "contentId": 122,
  "name": "Main Street Theater",
  "UE4ClassName": "theater01",
  "locationId": 12,          # R&B Alley
  "contentTypeId": 2,        # venue
  "createdBy": 1,            # created by Creator
  "deactivated": false,
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}

{
  "attrId": 11223,
  "attributeTypeId": 3,      # video/url
  "contentId": 122,          # Main Street Theater
  "scheduleId": 3,           # Main Street Theater Show Times
  "name": "Main Screen",
  "UE4AttributeName": "video1",
  "resource": "http://cdn.monterey.com/Singer1_2d_video.mov",
  "deactivated": false,
  "createdBy": 1,            # created by Creator
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}

{
  "scheduleId": 3,
  "name": "Main Street Theater Show Times",
  "frequency": "weekly",
  "start_datetime": "2020-05-09T19:00:00Z",
  "duration": 125,
  "deactivated": false,
  "createdBy": 1,            # created by Creator
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}
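Given a schedule record like the one above (frequency "weekly", a start datetime, and a 125-minute duration), a client or the back end needs to decide whether the content is currently active. The sketch below is one way to express that check for the "once", "daily", "weekly", and "on-demand" frequencies, working in UTC epoch seconds; it only illustrates the rule described in the Content Schedule Data Model and omits the monthly, yearly, and custom-weekly cases.

#include <ctime>
#include <iostream>
#include <string>

// Returns true if a schedule with the given frequency, start time (UTC epoch
// seconds), and duration (minutes) is active at 'now'. Only a subset of the
// frequencies from the Content Schedule Data Model is sketched here.
bool isScheduleActive(const std::string& frequency, std::time_t start,
                      int durationMinutes, std::time_t now) {
    if (now < start) return false;
    if (frequency == "on-demand") return true;  // every day, all the time
    const std::time_t duration = static_cast<std::time_t>(durationMinutes) * 60;
    if (frequency == "once") return now < start + duration;
    std::time_t period = 0;
    if (frequency == "daily") period = 24 * 60 * 60;
    else if (frequency == "weekly") period = 7 * 24 * 60 * 60;
    else return false;  // frequency not handled in this sketch
    return (now - start) % period < duration;
}

int main() {
    // 2020-05-09T19:00:00Z as a Unix timestamp (UTC).
    const std::time_t start = 1589050800;
    // One week later, 30 minutes into the show: should be active.
    const std::time_t now = start + 7 * 24 * 60 * 60 + 30 * 60;
    std::cout << std::boolalpha
              << isScheduleActive("weekly", start, 125, now) << std::endl;
    return 0;
}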









Example of Ad Content


A Verizon billboard in the Music World Hub appears all the time starting on 11/01/2020.

{
  "contentId": 4,
  "name": "Billboard on top of Main Theater",
  "UE4ClassName": "hub_billboard02",
  "locationId": 1,           # Hub
  "contentTypeId": 1,        # Ad
  "createdBy": 1,            # created by Creator
  "deactivated": false,
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}

{
  "attrId": 478,
  "attributeTypeId": 2,      # image/url
  "contentId": 4,            # Billboard on top of Main Theater
  "scheduleId": 5,           # feature billboard schedule
  "name": "Feature billboard",
  "UE4AttributeName": "image01",
  "resource": "http://cdn.monterey.com/verizon_featuring_madison_beer_billboard.jpg",
  "deactivated": false,
  "createdBy": 1,            # created by Creator
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}

{
  "scheduleId": 5,
  "name": "feature billboard schedule",
  "frequency": "on-demand",
  "start_datetime": "2020-11-01T00:00:00Z",
  "deactivated": false,
  "createdBy": 1,            # created by Creator
  "createdAt": "2020-04-09T19:00:00Z",
  "updatedAt": "2020-04-09T19:00:00Z"
}










FIG. 6 is a UE4 client interface application 600 in accordance with one implementation of the present disclosure. In FIG. 6, the UE4 client interface application 600 loads serialized JSON data received from the API endpoints. Upon receiving the requested data, the client code is designed such that the Unreal Engine base UObject and AActor objects either dynamically encapsulate the JSON data (extensible design) or map directly to object attributes (low LOE, preferred). In this example, there is a Theater01 object that inherits from an ABaseVenue class. Upon calling LoadFromCOMPS( ), the Theater01 object contains data retrieved from the COMP system.
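The UE4 class itself is not listed in the disclosure; the sketch below mirrors the described pattern using plain C++ classes, with a string map standing in for the deserialized JSON and stub names (BaseVenueStub, Theater01Stub, loadFromComps) in place of the actual Unreal UObject/AActor types and the LoadFromCOMPS( ) member.

#include <iostream>
#include <map>
#include <string>

// Plain-C++ stand-ins for the UE4 classes described above. In the real client,
// ABaseVenue would derive from Unreal's AActor and the data would arrive as
// serialized JSON from the COMP API endpoints.
class BaseVenueStub {
public:
    virtual ~BaseVenueStub() = default;
    // Maps already-deserialized COMP data onto this object's attributes.
    virtual void loadFromComps(const std::map<std::string, std::string>& data) {
        name_ = data.at("name");
    }
    const std::string& name() const { return name_; }
protected:
    std::string name_;
};

class Theater01Stub : public BaseVenueStub {
public:
    void loadFromComps(const std::map<std::string, std::string>& data) override {
        BaseVenueStub::loadFromComps(data);
        // "video1" corresponds to the ue4_attr_name of the scheduled attribute.
        video1Url_ = data.at("video1");
    }
    const std::string& video1Url() const { return video1Url_; }
private:
    std::string video1Url_;
};

int main() {
    Theater01Stub theater;
    theater.loadFromComps({{"name", "Main Street Theater"},
                           {"video1", "http://cdn.monterey.com/Singer1_2d_video.mov"}});
    std::cout << theater.name() << " -> " << theater.video1Url() << std::endl;
    return 0;
}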



FIGS. 7 through 22 show user interface/UX design and functionality requirements. COMP system users use a web portal to program content for the Music World.



FIG. 7 shows a login page that will need to integrate with Sony Music's Azure Active Directory for identity management of users.


Main Dashboard (Landing Page)


Once logged in, the user will be directed to the main dashboard, which contains Today's and This Week's content programming schedule, adjustable by time ranges. All time-based content will be displayed in the user's local time. The main dashboard needs calendar viewing options. Further, through the dashboard, users can: (1) search for any content; (2) log out, which takes them back to the login page; (3) follow breadcrumb-style links to individual types of content; (4) use tabs that take the user to the content-type-specific page where, based on permissions, they can view, edit, or create a new content program schedule; (5) view tabs that are procedurally generated based on the content types in the database; and (6) access system settings that allow them to define new content types, attribute types, locations, and calendars.



FIG. 8 shows a schedule view.


System Settings Page


Based on permissions, a user will be able to perform the following functions: invite/deactivate COMPS users; review user logs; add/edit/delete locations; add/edit/delete content types; and add/edit/delete attribute types.



FIG. 9 shows All Users page. Admin users can deactivate users. Sending an invite will send a verification email so the account is ‘Pending’. If an Admin resends an invite to a ‘Pending’ user, a new verification code will be sent. The verification code can be stored in the Sessions table, or a new attribute can be added to the Users table for the verification code.



FIG. 10 shows Edit Users page. Clicking logs will show all information for the particular user from the logs table.



FIG. 11 shows All Schedules page. This page depicts a screenshot where a user viewing the admin web portal can view all scheduled content that would be procedurally generated in the Music World.



FIG. 12 shows Add Schedules page. This page depicts a screenshot where a user can create a new schedule for content.



FIG. 13 shows Edit Schedule page. This page depicts a screenshot where a user can edit a schedule for content.



FIG. 14 shows All Locations page. This page depicts a screenshot where a user can view all available locations within the Music World where content can be scheduled to be viewed by users.



FIG. 15 shows Edit Location page. This page depicts a screenshot where a user can edit a location within the Music World client.



FIG. 16 shows All Content Types page. This page depicts a screenshot where a user can view all available content models within the Music World.



FIG. 17 shows Edit Content Type page. This page depicts a screenshot where a user can edit a specific content model.



FIG. 18 shows All Attribute Types page. This page depicts a screenshot where a user can view all available attribute types of data that can exist in a content model/type.



FIG. 19 shows Edit Attribute Type page. This page depicts a screenshot where a user can edit a specific data attribute type.



FIG. 20 shows All Venues page. This page depicts a screenshot where a user can view all available venues within the Music World where content can be scheduled to be viewed by users (e.g., a concert or musical performance).



FIG. 21 shows Add Content Programming page. This page depicts a screenshot where a user can create a specific schedule content program within the Music World.



FIG. 22 shows Add Content Attributes page. This page depicts a screenshot where a user adds a new attribute made up of an attribute type and actual data.


Attribute Type Resource Mapping

Attribute Type | Data Description
audio/url, image/url, video/url | Allow user to upload media asset to COMPS CDN and output resulting CDN URL
json/url | Input URL
text/text, text/json | Input Text
playfab/url, playfab/quest_name, playfab/item_id | Depending on attribute type, will use PlayFab Admin API to populate pull-down with associated type values, storing resulting attribute IDs
napster/artist_id, napster/playlist_id | Depending on attribute type, will use Napster API to populate pull-down with associated type values, storing resulting attribute IDs
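As a rough companion to the mapping above, the sketch below dispatches on an attribute type string to decide how its resource would be resolved. The type strings and categories come from the table; the enum and function names are assumptions, and the real COMPS web portal logic is not disclosed.

#include <iostream>
#include <string>

// How a given attribute type's resource is resolved, per the Attribute Type
// Resource Mapping table. Enum and function names are illustrative only.
enum class ResourceSource {
    CdnUpload,   // upload media asset to the COMPS CDN, store resulting URL
    InputUrl,    // user enters a URL directly
    InputText,   // user enters text (or JSON text) directly
    PlayFabApi,  // pull-down populated via the PlayFab Admin API
    NapsterApi,  // pull-down populated via the Napster API
    Unknown
};

ResourceSource resolveSource(const std::string& attributeType) {
    if (attributeType == "audio/url" || attributeType == "image/url" ||
        attributeType == "video/url")
        return ResourceSource::CdnUpload;
    if (attributeType == "json/url")
        return ResourceSource::InputUrl;
    if (attributeType == "text/text" || attributeType == "text/json")
        return ResourceSource::InputText;
    if (attributeType == "playfab/url" || attributeType == "playfab/quest_name" ||
        attributeType == "playfab/item_id")
        return ResourceSource::PlayFabApi;
    if (attributeType == "napster/artist_id" || attributeType == "napster/playlist_id")
        return ResourceSource::NapsterApi;
    return ResourceSource::Unknown;
}

int main() {
    std::cout << std::boolalpha
              << (resolveSource("video/url") == ResourceSource::CdnUpload)
              << std::endl;
    return 0;
}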










FIG. 23 is a flow diagram of a method 2300 for procedurally generating live experiences for a 3-D virtualized music world in accordance with one implementation of the present disclosure. In the illustrated implementation of FIG. 23, a headless content management system (CMS) supporting management of a back end is provided, at step 2310. A virtual world client is coupled to the back end of the CMS, at step 2320, using an interface that provides a content packaging framework to enable customized event scheduling and destination management. In one implementation, the interface is the GraphQL interface, which is a language spoken between the back end and the front end to exchange information. A plurality of content elements is then procedurally generated, at step 2330, by the virtual world client in a real-time game engine for live experiences in a virtualized music-themed world. In one implementation, the procedural generation of the content elements includes server-side control over one or more of: what content is rendered; where the content is rendered; when the content is rendered (and for how long); and what the content costs.


In one implementation, the client then renders out the content elements, at step 2340, from text, audio, music, and video source files at specific choreographed times within a virtual setting (e.g., a virtual concert venue). Content producers may define content packs and program event schedules through a web interface; the data is stored in both a relational database and a cache, and the back-end programming in Java uses reflection to define content models.



FIG. 24 is a block diagram of a system 2400 for procedurally generating live experiences for a 3-D virtualized music world in accordance with one implementation of the present disclosure. In the illustrated implementation of FIG. 24, the system 2400 includes a headless content management system (CMS) 2410 and a virtual world client 2420. In one implementation, the headless content management system 2410 supports management of back end 2412 of the CMS. In one implementation, the virtual world client 2420 couples to the back end using an interface that provides a content packaging framework to enable customized event scheduling and destination management. In one implementation, the virtual-world client 2420 procedurally generates a plurality of content elements in a real-time game engine for live experiences in a virtualized music-themed world.


In one implementation, the virtual-world client 2420 includes server-side controller 2422 which provides control over one or more of: what content is rendered; where the content is rendered; when the content is rendered (and for how long); and what the content costs. In one implementation, the virtual-world client 2420 also renders out the content elements from text, audio, music, and video source files at specific choreographed times within a virtual setting (e.g., a virtual concert venue).



FIG. 25A is a representation of a computer system 2500 and a user 2502 in accordance with an implementation of the present disclosure. The user 2502 uses the computer system 2500 to implement an application 2590 for procedurally generating live experiences for a 3-D virtualized music world as illustrated and described with respect to the method 2300 illustrated in FIG. 23 and the system 2400 illustrated in FIG. 24.


The computer system 2500 stores and executes the procedural generation application 2590 of FIG. 25B. In addition, the computer system 2500 may be in communication with a software program 2504. Software program 2504 may include the software code for the procedural generation application 2590. Software program 2504 may be loaded on an external medium such as a CD, DVD, or a storage drive, as will be explained further below.


Furthermore, computer system 2500 may be connected to a network 2580. The network 2580 can be connected in various different architectures, for example, client-server architecture, a Peer-to-Peer network architecture, or other type of architectures. For example, network 2580 can be in communication with a server 2585 that coordinates engines and data used within the procedural generation application 2590. Also, the network can be different types of networks. For example, the network 2580 can be the Internet, a Local Area Network or any variations of Local Area Network, a Wide Area Network, a Metropolitan Area Network, an Intranet or Extranet, or a wireless network.



FIG. 25B is a functional block diagram illustrating the computer system 2500 hosting the procedural generation application 2590 in accordance with an implementation of the present disclosure. A controller 2510 is a programmable processor and controls the operation of the computer system 2500 and its components. The controller 2510 loads instructions (e.g., in the form of a computer program) from the memory 2520 or an embedded controller memory (not shown) and executes these instructions to control the system. In its execution, the controller 2510 provides the procedural generation application 2590 with a software system, such as to enable the creation and configuration of engines and data generators within the procedural generation application 2590. Alternatively, this service can be implemented as separate hardware components in the controller 2510 or the computer system 2500.


Memory 2520 stores data temporarily for use by the other components of the computer system 2500. In one implementation, memory 2520 is implemented as RAM. In one implementation, memory 2520 also includes long-term or permanent memory, such as flash memory and/or ROM.


Storage 2530 stores data either temporarily or for long periods of time for use by the other components of the computer system 2500. For example, storage 2530 stores data used by the procedural generation application 2590. In one implementation, storage 2530 is a hard disk drive.


The media device 2540 receives removable media and reads and/or writes data to the inserted media. In one implementation, for example, the media device 2540 is an optical disc drive.


The user interface 2550 includes components for accepting user input from the user of the computer system 2500 and presenting information to the user 2502. In one implementation, the user interface 2550 includes a keyboard, a mouse, audio speakers, and a display. The controller 2510 uses input from the user 2502 to adjust the operation of the computer system 2500.


The I/O interface 2560 includes one or more I/O ports to connect to corresponding I/O devices, such as external storage or supplemental devices (e.g., a printer or a PDA). In one implementation, the ports of the I/O interface 2560 include ports such as: USB ports, PCMCIA ports, serial ports, and/or parallel ports. In another implementation, the I/O interface 2560 includes a wireless interface for communication with external devices wirelessly.


The network interface 2570 includes a wired and/or wireless network connection, such as an RJ-45 or “Wi-Fi” interface (including, but not limited to 802.11) supporting an Ethernet connection.


The computer system 2500 includes additional hardware and software typical of computer systems (e.g., power, cooling, operating system), though these components are not specifically shown in FIG. 25B for simplicity. In other implementations, different configurations of the computer system can be used (e.g., different bus or storage configurations or a multi-processor configuration).


In one implementation, the system 2400 is a system configured entirely with hardware including one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate/logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. In another implementation, the system 2400 is configured with a combination of hardware and software.


The description herein of the disclosed implementations is provided to enable any person skilled in the art to make or use the present disclosure. Numerous modifications to these implementations would be readily apparent to those skilled in the art, and the principles defined herein can be applied to other implementations without departing from the spirit or scope of the present disclosure. Thus, the present disclosure is not intended to be limited to the implementations shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.


Various implementations of the present disclosure are realized in electronic hardware, computer software, or combinations of these technologies. Some implementations include one or more computer programs executed by one or more computing devices. In general, the computing device includes one or more processors, one or more data-storage components (e.g., volatile or non-volatile memory modules and persistent optical and magnetic storage devices, such as hard and floppy disk drives, CD-ROM drives, and magnetic tape drives), one or more input devices (e.g., game controllers, mice and keyboards), and one or more output devices (e.g., display devices).


The computer programs include executable code that is usually stored in a persistent storage medium and then copied into memory at run-time. At least one processor executes the code by retrieving program instructions from memory in a prescribed order. When executing the program code, the computer receives data from the input and/or storage devices, performs operations on the data, and then delivers the resulting data to the output and/or storage devices.


Those of skill in the art will appreciate that the various illustrative modules and method steps described herein can be implemented as electronic hardware, software, firmware or combinations of the foregoing. To clearly illustrate this interchangeability of hardware and software, various illustrative modules and method steps have been described herein generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled persons can implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. In addition, the grouping of functions within a module or step is for ease of description. Specific functions can be moved from one module or step to another without departing from the present disclosure.


All features of each above-discussed example are not necessarily required in a particular implementation of the present disclosure. Further, it is to be understood that the description and drawings presented herein are representative of the subject matter that is broadly contemplated by the present disclosure. It is further understood that the scope of the present disclosure fully encompasses other implementations that may become obvious to those skilled in the art and that the scope of the present disclosure is accordingly limited by nothing other than the appended claims.

Claims
  • 1. A method for procedurally generating live experiences for a virtualized music-themed world, the method comprising: providing a headless content management system supporting management of back end; coupling a virtual world client to the back end of the content management system using an interface that provides a content packaging framework to enable customized event scheduling and destination management; and procedurally generating a plurality of content elements in a real-time game engine for live experiences in the virtualized music-themed world.
  • 2. The method of claim 1, wherein procedurally generating comprises providing a server-side control over how the plurality of content elements is rendered.
  • 3. The method of claim 2, wherein the server-side control comprises control over what content element is rendered.
  • 4. The method of claim 2, wherein the server-side control comprises control over where the content element is rendered.
  • 5. The method of claim 2, wherein the server-side control comprises control over when the content element is rendered and for how long.
  • 6. The method of claim 2, wherein the server-side control comprises control over what the content element costs.
  • 7. The method of claim 1, further comprising rendering the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.
  • 8. A system for procedurally generating live experiences for a 3-D virtualized music world, the system comprising: a headless content management system to provide management of a back end; a virtual world client to connect to the back end using an interface that provides a content packaging framework to enable customized event scheduling and destination management, wherein the virtual-world client procedurally generates a plurality of content elements in a real-time game engine for live experiences in a virtualized music-themed world.
  • 9. The system of claim 8, wherein the virtual-world client comprises a server-side controller which provides server-side control over how the plurality of content elements is rendered.
  • 10. The system of claim 9, wherein the server-side control comprises control over which content element of the plurality of content elements is rendered.
  • 11. The system of claim 9, wherein the server-side control comprises control over where the content element is rendered.
  • 12. The system of claim 9, wherein the server-side control comprises control over when the content element is rendered and for how long.
  • 13. The system of claim 9, wherein the server-side control comprises control over what the content element costs.
  • 14. The system of claim 8, wherein the virtual-world client renders the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.
  • 15. A non-transitory computer-readable storage medium storing a computer program to procedurally generate live experiences for a virtualized music-themed world, the computer program comprising executable instructions that cause a computer to: provide a headless content management system supporting management of back end; couple a virtual world client to the back end of the content management system using an interface that provides a content packaging framework to enable customized event scheduling and destination management; and procedurally generate a plurality of content elements in a real-time game engine for live experiences in the virtualized music-themed world.
  • 16. The computer-readable storage medium of claim 15, wherein the executable instructions that cause the computer to procedurally generate comprises executable instructions that cause the computer to provide a server-side control over how the plurality of content elements is rendered.
  • 17. The computer-readable storage medium of claim 16, wherein the server-side control comprises control over what content element is rendered.
  • 18. The computer-readable storage medium of claim 15, further comprising executable instructions that cause the computer to render the plurality of content elements from at least one of text, audio, music, and video source files at specific choreographed times within a virtual setting.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 63/077,414, filed Sep. 11, 2020, entitled “Content Orchestration, Management and Programming System”. The disclosure of the above-referenced application is incorporated herein by reference.

US Referenced Citations (24)
Number Name Date Kind
6961055 Doak Nov 2005 B2
8990140 Michelstein Mar 2015 B2
9009092 Michelstein Apr 2015 B2
9208216 Michelstein Dec 2015 B2
9503770 Elm et al. Nov 2016 B2
10666997 Roberts et al. May 2020 B2
11452940 Baughman Sep 2022 B2
20030058238 Doak Mar 2003 A1
20060111188 Winkler May 2006 A1
20090253517 Bererton Oct 2009 A1
20100304869 Lee et al. Dec 2010 A1
20120150577 Berg Jun 2012 A1
20130332475 Michelstein Dec 2013 A1
20140025619 Michelstein Jan 2014 A1
20140025650 Lee Jan 2014 A1
20140067958 Bradley et al. Mar 2014 A1
20150178376 Michelstein Jun 2015 A1
20150199605 Michelstein Jul 2015 A1
20160085786 Michelstein Halberstam Mar 2016 A1
20210379492 Baughman Dec 2021 A1
20220075591 Cardenas Gasca Mar 2022 A1
20220080319 Lee Mar 2022 A1
20220080320 Sachson Mar 2022 A1
20230071892 Liang Mar 2023 A1
Non-Patent Literature Citations (3)
Entry
Waihi, Joshua. “What Are the Benefits of a Headless CMS?” Published Aug. 22, 2022. Acquia. Accessed Mar. 10, 2023. 7 pages. <https://www.acquia.com/blog/benefits-of-headless-cms#:˜:text=A%20headless%20CMS%20makes%20it,republishing%20content%20in%20multiple%20places.> (Year: 2022).
International Search Report in PCT/US2021/049711, dated Nov. 11, 2021, p. 2, ISA/RU.
R. Ranjan, et al., “MediaWiseCloud Content Orchestrator”, Journal of Internet Services and Applications, Jan. 2013, 14 pages.
Related Publications (1)
Number Date Country
20220080319 A1 Mar 2022 US
Provisional Applications (1)
Number Date Country
63077414 Sep 2020 US