The present invention relates generally to electrical and electronic hardware, computer software, wired and wireless network communications, and computing devices. More specifically, techniques related to a sensor-enabled media device are described.
Conventional devices and techniques for providing media content are limited in a number of ways. Conventional media devices (e.g., media players such as speakers, televisions, computers, e-readers, and smartphones) typically are not well-suited for selecting targeted media content for a particular user. While some conventional media devices are capable of operating applications or websites that provide targeted media content services, such services typically provide media content only on a device capable of downloading or running that media service application or website. Such applications or websites typically are unable to select or control other media devices in a user's ecosystem of media devices for providing media content.
Conventional media services and devices also typically do not automatically select media content in view of environmental or physiological factors associated with a user. Conventional media devices also typically are not well-suited for determining environmental states, and controlling media and output devices in response to environmental factors. Nor are they typically configured to identify and cross-reference local data with remote data, either for targeting media content or for providing notifications and feedback. Conventional media devices also typically are not configured to target media content for a user based on media preferences specified by a user across multiple media services.
Thus, what is needed is a solution for a sensor-enabled media device without the limitations of conventional techniques.
Various embodiments or examples (“examples”) are disclosed in the following detailed description and the accompanying drawings.
Although the above-described drawings depict various examples of the invention, the invention is not limited by the depicted examples. It is to be understood that, in the drawings, like reference numerals designate like structural elements. Also, it is understood that the drawings are not necessarily to scale.
Various embodiments or examples may be implemented in numerous ways, including as a system, a process, an apparatus, a user interface, or a series of program instructions on a computer readable medium such as a computer readable storage medium or a computer network where the program instructions are sent over optical, electronic, or wireless communication links. In general, operations of disclosed processes may be performed in an arbitrary order, unless otherwise provided in the claims.
A detailed description of one or more examples is provided below along with accompanying figures. The detailed description is provided in connection with such examples, but is not limited to any particular example. The scope is limited only by the claims and numerous alternatives, modifications, and equivalents are encompassed. Numerous specific details are set forth in the following description in order to provide a thorough understanding. These details are provided for the purpose of example and the described techniques may be practiced according to the claims without some or all of these specific details. For clarity, technical material that is known in the technical fields related to the examples has not been described in detail to avoid unnecessarily obscuring the description.
In some examples, the described techniques may be implemented as a computer program or application (“application”) or as a plug-in, module, or sub-component of another application. The described techniques may be implemented as software, hardware, firmware, circuitry, or a combination thereof. If implemented as software, then the described techniques may be implemented using various types of programming, development, scripting, or formatting languages, frameworks, syntax, applications, protocols, objects, or techniques, including ASP, ASP.net, .Net framework, Ruby, Ruby on Rails, C, Objective C, C++, C#, Adobe® Integrated Runtime™ (Adobe® AIR™), ActionScript™, Flex™, Lingo™, Java™, Javascript™, Ajax, Perl, COBOL, Fortran, Ada, XML, MXML, HTML, DHTML, XHTML, HTTP, XMPP, PHP, and others. Software and/or firmware implementations may be embodied in a non-transitory computer readable medium configured for execution by a general purpose computing system or the like. The described techniques may be varied and are not limited to the examples or descriptions provided.
In some examples, smart media device 102 may be configured to generate and store user-specific media preferences, for example, in an account profile, which may be associated with a user or group of users (i.e., “user group”). In some examples, a user group may include a family, a household, an office, a team, a group of specified individuals, or the like. In some examples, said media preferences may encompass local data associated with, for example, a user's or user group's environment, locally stored media content, direct media preference inputs, media preferences provided by other local sources, and the like. In other examples, said media preferences also may encompass remote data associated with a user's or user group's media service accounts and social network accounts, including previously selected media content, genres, types, and other preferences.
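By way of illustration only, an account profile aggregating local and remote media preference data, as described above, might be sketched as follows (all names and fields are hypothetical; the specification does not mandate any particular data layout):

```python
from dataclasses import dataclass, field

@dataclass
class MediaPreferences:
    """Hypothetical container for a user's or user group's media preferences."""
    genres: list = field(default_factory=list)            # direct preference inputs
    local_content: list = field(default_factory=list)     # locally stored media content
    service_accounts: dict = field(default_factory=dict)  # remote media service accounts
    social_accounts: dict = field(default_factory=dict)   # social network accounts

@dataclass
class AccountProfile:
    """Profile associated with a user or a user group (e.g., a household)."""
    members: list
    preferences: MediaPreferences = field(default_factory=MediaPreferences)

# A profile for a two-member user group with one learned genre preference.
profile = AccountProfile(members=["user_a", "user_b"])
profile.preferences.genres.append("jazz")
```

A single profile of this shape can back either an individual account or a user group account, with local and remote preference data accumulating in the same structure.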
In some examples, wearable device 104 may be configured to be worn or carried. In some examples, wearable device 104 may be implemented as a data-capable strapband, as described in co-pending U.S. patent application Ser. No. 13/158,372, co-pending U.S. patent application Ser. No. 13/180,320, co-pending U.S. patent application Ser. No. 13/492,857, and co-pending U.S. patent application Ser. No. 13/181,495, all of which are herein incorporated by reference in their entirety for all purposes. In some examples, wearable device 104 may include one or more sensors (i.e., a sensor array) configured to collect local sensor data. Said sensor array may include, without limitation, an accelerometer, an altimeter/barometer, a light/infrared (“IR”) sensor, a pulse/heart rate (“HR”) monitor, an audio sensor (e.g., microphone, transducer, or others), a pedometer, a velocimeter, a global positioning system (GPS) receiver, a location-based service sensor (e.g., a sensor for determining location within a cellular or micro-cellular network, which may or may not use GPS or other satellite constellations for fixing a position), a motion detection sensor, an environmental sensor, a chemical sensor, an electrical sensor, a mechanical sensor, and the like, installed, integrated, or otherwise implemented on wearable device 104. In other examples, wearable device 104 also may capture data from distributed sources (e.g., by communicating with mobile computing devices, mobile communications devices, computers, laptops, distributed sensors, GPS satellites, or the like) for processing with sensor data. In still other examples, the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
In some examples, mobile device 106 may be implemented as a smartphone, a tablet, laptop, or other mobile communication or mobile computing device. In some examples, mobile device 106 may include, without limitation, a touchscreen, a display, one or more buttons, or other user interface capabilities. In some examples, mobile device 106 also may be implemented with various audio and visual/video output capabilities (e.g., speakers, video display, graphic display, and the like). In some examples, mobile device 106 may be configured to operate various types of applications associated with media, social networking, phone calls, video conferencing, calendars, games, data communications, and the like. For example, mobile device 106 may be implemented as a media device configured to store, access and play media content.
In some examples, wearable device 104 and/or mobile device 106 may be configured to provide sensor data, including environmental and physiological data, to smart media device 102. In some examples, wearable device 104 and/or mobile device 106 also may be configured to provide derived data generated by processing the sensor data using one or more algorithms to determine, for example, advanced environmental data (e.g., whether a location is favored or frequented, whether a location is indoor or outdoor, home or office, public or private, whether other people are present, whether other compatible devices are present, weather, location-related services (e.g., stores, landmarks, restaurants, and the like), air quality, news, and the like) from said environmental data, and activity, mood, behavior, medical condition and the like from physiological data. In some examples, smart media device 102 may be configured to cross-correlate said sensor data and said derived data with other local data, as well as remote data (e.g., social, demographic, or other third-party proprietary or public media data from remote sources) to select media content for smart media device 102, or other media player, to play or provide. In some examples, smart media device 102 may select media content from a local source, a remote source, or both. In other examples, the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
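The cross-correlation of sensor data, derived data, and remote data to select media content might be sketched as a simple scoring pass over candidate items. The scoring terms below are purely illustrative assumptions; the specification does not fix any particular correlation logic:

```python
def select_media(candidates, sensor_data, derived_data, remote_data):
    """Score candidate media items against local sensor data, derived data
    (e.g., an inferred activity or mood), and remote data (e.g., social
    popularity), then return the best-matching item. Illustrative only."""
    def score(item):
        s = 0
        # Favor items whose tags match the derived activity/mood tags.
        s += len(set(item["tags"]) & set(derived_data.get("tags", [])))
        # Favor items popular among remote (e.g., social network) sources.
        s += remote_data.get("popularity", {}).get(item["id"], 0)
        # Favor a tempo matching the sensed energy level of the environment.
        if item.get("tempo") == sensor_data.get("energy"):
            s += 1
        return s
    return max(candidates, key=score)

candidates = [
    {"id": "a", "tags": ["calm"], "tempo": "low"},
    {"id": "b", "tags": ["workout", "upbeat"], "tempo": "high"},
]
choice = select_media(
    candidates,
    sensor_data={"energy": "high"},
    derived_data={"tags": ["workout"]},
    remote_data={"popularity": {"b": 1}},
)
```

Here each data source contributes one additive term, so local sensor data, derived data, and remote data can each shift which candidate wins without any single source dominating.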
In some examples, smart media device 202 also may include storage 206, which may be configured to store various types of data, including profile data 220 and content data 222. In some examples, profile data 220 may include data associated with a user's or user group's stored account information, media preferences, historical data (i.e., prior user activity, account or media-related), and the like. In some examples, historical data may include local sensor data previously collected (e.g., by sensor array 208, wearable device 214, mobile device 212, or the like) and associated with a user account (i.e., stored in an account profile). For example, historical data may include environmental data previously captured using sensor array 208 and associated with a media preference and a user account. In another example, historical data may include activity, physiological, behavioral, environmental and other information determined using local sensor data previously collected by wearable device 214 being worn by a user identified with an account by smart media device 202. In some examples, historical data may include metrics correlating various types of pre-calculated sensor data. Such metrics may provide insights into a user's media preferences in relation to certain environments (e.g., location, time, setting, weather, and the like), and such insights may be used by smart media modules 204 to automatically select media content for a present user in a present environment. In some examples, content data 222 may include data associated with stored media content previously downloaded (e.g., from local sources such as mobile device 212, display 216 or speaker 218, or from remote sources, such as remote databases (e.g., databases 112a-114a in
In some examples, smart media device 202 also may include sensor array 208 configured to provide sensor data, including data associated with an environment in which smart media device 202 is located. In some examples, smart media modules 204 may be configured to use such sensor data to customize a selection of media content for said environment. For example, sensor data provided by sensor array 208 may indicate noise levels, heat levels, light levels, and a number of compatible devices congruent with a lively, public atmosphere, and smart media modules 204 thus may automatically select an up-tempo playlist associated with a present user or user group, or other media content matching such an environment. In some examples, smart media modules 204 may be configured to process said sensor data to derive more advanced environmental data (e.g., public or private/alone setting, home or office setting, indoor or outdoor setting, and the like) or behavioral data (i.e., through a user's interactions with smart media device 202). In some examples, smart media device 202 may be configured to use sensor array 208 or a separate communications facility (e.g., including an antenna, short range communications controller, or the like) to detect a presence, proximity, and/or location of compatible devices (i.e., devices with communication and operational capabilities in common with smart media device 202) (e.g., mobile device 212, wearable device 214, display 216, speaker 218, or the like).
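The environment-driven selection just described might be sketched in two steps: classify the environment from raw sensor levels, then map the resulting label to a playlist style. The thresholds and labels below are hypothetical, not taken from the specification:

```python
def classify_environment(noise, light, device_count):
    """Hypothetical heuristic mapping normalized sensor levels (0.0-1.0)
    and a count of nearby compatible devices to a coarse environment label."""
    if noise > 0.6 and device_count >= 3:
        return "lively_public"   # loud, many compatible devices present
    if noise < 0.3 and light < 0.3:
        return "quiet_private"   # quiet and dim
    return "neutral"

def playlist_for(environment):
    """Map an environment label to a playlist style (illustrative choices)."""
    return {"lively_public": "up_tempo",
            "quiet_private": "ambient"}.get(environment, "default_mix")

# A loud room with several compatible devices detected nearby.
env = classify_environment(noise=0.8, light=0.7, device_count=4)
```

Splitting classification from selection mirrors the two roles described above: sensor array 208 supplies the raw levels, and smart media modules 204 turn the derived environment into a content choice.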
In some examples, smart media device 202 also may include logic (not shown) implemented as firmware or application software that is installed in a memory (e.g., memory 302 in
In some examples, learning algorithm 304 may be configured to learn media tastes and preferences of a user or user group (i.e., associated with an account created and maintained by account profile generator 306). In some examples, learning algorithm 304 may use environmental and behavioral data from sensor array 318, remote data (e.g., social, demographic, or other third-party proprietary or public media data from remote sources) obtained using communication facility 314, stored data (e.g., historical and other profile data from storage 316, and the like), and other local data (e.g., from other media devices associated with a user's or user group's account profile) to generate data pertaining to a user's or user group's media tastes and preferences, both general (e.g., genres, types, styles, media services, social networks, and the like) and specific (e.g., identified playlists, songs, movies, videos, articles, books, advertisements and other media content, as well as environments associated highly, positively, or otherwise, with said identified media content).
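One minimal way to sketch such a learning algorithm is an incremental counter over (genre, environment) consumption events, so later selections can weight genres by the current environment. This is an assumption for illustration; the specification does not describe a specific learning method:

```python
from collections import Counter

def update_preferences(history, events):
    """Hypothetical incremental learner: count how often each
    (genre, environment) pair is positively consumed."""
    for event in events:
        if event["liked"]:
            history[(event["genre"], event["environment"])] += 1
    return history

history = Counter()
update_preferences(history, [
    {"genre": "jazz",  "environment": "home_evening", "liked": True},
    {"genre": "jazz",  "environment": "home_evening", "liked": True},
    {"genre": "metal", "environment": "home_evening", "liked": False},
])
# The strongest learned association so far.
best = history.most_common(1)[0][0]
```

Because the counter keys pair a media attribute with an environment, the same structure captures both general tastes (sum over environments) and environment-specific preferences (a single key).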
In some examples, account profile generator 306 may be configured to create accounts and account profiles to identify individual users or user groups and to associate the users and user groups with media preference data (e.g., learned tastes and preferences, favored or frequented environments, correlations between media content consumption and an environment, or the like). In some examples, an account may be associated with an individual user. In other examples, an account may be associated with a user group, including, without limitation, a family, a household, a household member's social network, or other social graphs. In some examples, account data (e.g., user identification data, device identification data, metadata, and the like) and media preference data may be stored in one or more profiles associated with an account (e.g., using storage 316 or the like).
In some examples, rules engine 308 may be configured to prioritize media preference data (i.e., indicating media tastes and preferences of a user) associated with an account profile, as well as to mix or combine media preference data associated with multiple users or user groups, in order to provide media content module 310 with data with which to select media content. In some examples, rules engine 308 may comprise a set of rules configured to prioritize both general and specific media preference data according to various conditions, including environment (e.g., time, location, and the like), available devices (i.e., for playing media content), presence of a user, and the like. In some examples, rules engine 308 also may be configured to prioritize among different available media devices, for providing media content to a user, considering type of media content, a user's preferences, available devices, and the like. In some examples, rules engine 308 also may be configured to prioritize accounts and account profiles according to whether an associated user or user group is a primary or frequent user (e.g., a registered owner of a smart media device, a sole member of a household, a member of a family of registered owners and frequent users, or the like) or a lesser-priority user (e.g., a friend of an owner, an unknown user, or the like).
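A fragment of such a rules engine might be sketched as an ordering over present accounts, so that higher-priority users dominate when preferences are mixed for playback. The role names and ranks below are hypothetical:

```python
def prioritize_accounts(accounts):
    """Hypothetical rules-engine fragment: order present accounts so that
    a registered owner outranks household members, who outrank guests,
    when combining media preferences for playback."""
    rank = {"owner": 0, "household": 1, "guest": 2, "unknown": 3}
    return sorted(accounts, key=lambda a: rank.get(a["role"], 3))

accounts = [
    {"user": "visitor", "role": "guest"},
    {"user": "alice",   "role": "owner"},
    {"user": "bob",     "role": "household"},
]
ordered = prioritize_accounts(accounts)
```

A downstream media content module could then weight each account's preference data by its position in this ordering, or simply defer to the highest-priority account present.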
In some examples, data interface 312 may be configured to receive and send data associated with functions provided by smart media modules 301, sensor array 318, storage 316, and communication facility 314. For example, data interface 312 may be configured to receive remote data from communication facility 314 for use by account profile generator 306 to create or update a profile stored in storage 316, or for use by media content module 310 to select or customize media content to be played using media player 320. In another example, data interface 312 may be configured to receive sensor data from sensor array 318 for use by learning algorithm 304 to inform media tastes and preferences with environmental data, or for use by media content module 310 to select or customize media content. In other examples, the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
In some examples, a profile may be associated with more than one account. For example, profile 410 may be associated with multiple accounts identifying users 422, 430 and 436, and their respective associated devices. In this example, profile 410 may be associated with an account identifying user 422, as well as user 422's associated devices, including wearable device 424, mobile device 426 and headset 428. Profile 410 also may include data identifying user 430 and associated devices, including mobile device 432. Profile 410 also may include data from media service 434, to which user 430 may have an account. In some examples, remote data from media service 434 may be accessed using mobile device 432. In other examples, mobile device 432 may be configured to operate an application associated with media service 434, and may locally store data associated with user 430's account with media service 434. Profile 410 also may include data identifying user 436 and associated devices, including wearable device 438 and mobile device 440. Profile 410 may be created and updated with data from one or more of said devices identified in accounts for users 422, 430 and 436. In other examples, profile 410 may be associated with a single account generated for a user group including users 422, 430 and 436, for example, if user 422, 430 and 436 were members of a household, a family, a work group, an office, or other group or social graph.
In some examples, a profile may be associated with a user's social network. For example, profile 412 may be associated with an account identifying user 442, as well as with social network 446 associated with user 442. In some examples, media preference data associated with social network 446, as may be indicated using a social networking service (e.g., Facebook®, Twitter®, LinkedIn®, Yelp®, Google+®, Instagram®, and the like), may be stored in profile 412 in association with user 442. In some examples, data associated with media preferences of social network 446 (e.g., media content being consumed by members of social network 446, genres and types of media being consumed by members of social network 446, associated trends, media services being used by members of social network 446, and the like) may be obtained using mobile device 444 (e.g., implementing an application, accessing remote data using a network and long range communication protocol, as described herein, and the like). In other examples, the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
According to some examples, computing platform 600 performs specific operations by processor 604 executing one or more sequences of one or more instructions stored in system memory 606, and computing platform 600 can be implemented in a client-server arrangement, peer-to-peer arrangement, or as any mobile computing device, including smart phones and the like. Such instructions or data may be read into system memory 606 from another computer readable medium, such as storage device 608. In some examples, hard-wired circuitry may be used in place of or in combination with software instructions for implementation. Instructions may be embedded in software or firmware. The term “computer readable medium” refers to any non-transitory medium that participates in providing instructions to processor 604 for execution. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Non-volatile media includes, for example, optical or magnetic disks and the like. Volatile media includes dynamic memory, such as system memory 606.
Common forms of computer readable media include, for example, floppy disk, flexible disk, hard disk, magnetic tape, any other magnetic medium, CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, RAM, PROM, EPROM, FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read. Instructions may further be transmitted or received using a transmission medium. The term “transmission medium” may include any tangible or intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine, and includes digital or analog communications signals or other intangible medium to facilitate communication of such instructions. Transmission media includes coaxial cables, copper wire, and fiber optics, including wires that comprise bus 602 for transmitting a computer data signal.
In some examples, execution of the sequences of instructions may be performed by computing platform 600. According to some examples, computing platform 600 can be coupled by communication link 621 (e.g., a wired network, such as LAN, PSTN, or any wireless network) to any other processor to perform the sequence of instructions in coordination with (or asynchronous to) one another. Computing platform 600 may transmit and receive messages, data, and instructions, including program code (e.g., application code) through communication link 621 and communication interface 613. Received program code may be executed by processor 604 as it is received, and/or stored in memory 606 or other non-volatile storage for later execution.
In the example shown, system memory 606 can include various modules that include executable instructions to implement functionalities described herein. In the example shown, system memory 606 includes account profiles module 610 configured to create and modify profiles, as described herein. System memory 606 also may include learning module 612, which may be configured to learn media tastes and preferences of one or more users, as described herein. System memory 606 also may include rules module 614, which may be configured to operate a rules engine, as described herein.
In some embodiments, various devices described herein may communicate (e.g., wired or wirelessly) with each other, or with other compatible devices, using computing platform 600. As depicted in
As hardware and/or firmware, the above-described structures and techniques can be implemented using various types of programming or integrated circuit design languages, including hardware description languages, such as any register transfer language (“RTL”) configured to design field-programmable gate arrays (“FPGAs”), application-specific integrated circuits (“ASICs”), multi-chip modules, or any other type of integrated circuit. For example, smart media devices 102, 202 and 402, including one or more components, can be implemented in one or more computing devices that include one or more circuits. Thus, at least one of the elements in
According to some embodiments, the term “circuit” can refer, for example, to any system including a number of components through which current flows to perform one or more functions, the components including discrete and complex components. Examples of discrete components include transistors, resistors, capacitors, inductors, diodes, and the like, and examples of complex components include memory, processors, analog circuits, and digital circuits, including field-programmable gate arrays (“FPGAs”) and application-specific integrated circuits (“ASICs”). Therefore, a circuit can include a system of electronic components and logic components (e.g., logic configured to execute instructions, such that a group of executable instructions of an algorithm, for example, is a component of a circuit). According to some embodiments, the term “module” can refer, for example, to an algorithm or a portion thereof, and/or logic implemented in either hardware circuitry or software, or a combination thereof (i.e., a module can be implemented as a circuit). In some embodiments, algorithms and/or the memory in which the algorithms are stored are “components” of a circuit. Thus, the term “circuit” can also refer, for example, to a system of components, including algorithms. These can be varied and are not limited to the examples or descriptions provided.
In some examples, display 708 may be implemented as a light panel using a variety of available display technologies, including lights, light-emitting diodes (LEDs), interferometric modulator display (IMOD), electrophoretic ink (E Ink), organic light-emitting diode (OLED), or the like, without limitation. In other examples, display 708 may be implemented as a touchscreen, another type of interactive screen, a video display, or the like. In some examples, smart media devices 702 may include software, hardware, firmware, or other circuitry (not shown), configured to implement a program (i.e., application) configured to cause control signals to be sent to display 708, for example, to cause display 708 to present a light pattern, a graphic or symbol, a message or other text (e.g., a notification, information regarding audio being played, information regarding characteristics of smart media devices 104 and 124, or the like), a video, or the like.
In some examples, controller 720 may be configured to generate a plurality of control signals to cause a device (e.g., smart media device 702, audio/video output device 722, mobile device 724, light 726, wearable device 728, or the like) to provide an output, for example, a notification (e.g., using light, acoustic output, visual/video output, vibrational output, or the like), in response to environmental state data 736 provided by environmental state determinator 718. For example, controller 720 may send a control signal to smart media device 702 to cause display 708 to provide or modify a visual output (e.g., light up, lower a light, display a light pattern, display a graphic or video, or the like), or to cause speakers 702-706 to provide or modify an audio output (e.g., output a sound, increase/decrease volume, output an audible alarm, or the like). In another example, controller 720 may send a control signal to audio/video output device 722 to provide or modify an audio/visual output (e.g., display a message with an audible alarm, play a video, increase/decrease volume or brightness associated with media content being played, or the like). In still other examples, controller 720 may send one or more control signals to mobile device 724, light 726, and wearable device 728 to provide or modify an audio, visual, or vibrational notification based on environmental state information generated by environmental state determinator 718. In yet other examples, the quantity, type, function, structure, and configuration of the elements shown may be varied and are not limited to the examples provided.
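The controller behavior described above might be sketched as a mapping from an environmental state to one notification control signal per available output device, with the output modality chosen by device type. The state shape, device types, and signal names are hypothetical:

```python
def control_signals(state, devices):
    """Hypothetical controller fragment: when the environmental state
    carries an alert, emit one notification control signal per output
    device, choosing a modality by device type (illustrative only)."""
    if not state.get("alert"):
        return []  # nothing to notify; issue no control signals
    modality = {"display": "show_notification",
                "speaker": "play_alert",
                "wearable": "vibrate",
                "light": "flash"}
    return [(device["id"], modality[device["type"]])
            for device in devices if device["type"] in modality]

signals = control_signals(
    state={"alert": "doorbell"},
    devices=[{"id": "lamp1", "type": "light"},
             {"id": "band1", "type": "wearable"}],
)
```

Keeping the modality table per device type lets the same environmental state fan out to whatever mix of displays, speakers, lights, and wearables happens to be present.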
The foregoing description, for purposes of explanation, uses specific nomenclature to provide a thorough understanding of the invention. However, it will be apparent to one skilled in the art that specific details are not required in order to practice the invention. In fact, this description should not be read to limit any feature or aspect of the present invention to any embodiment; rather, features and aspects of one embodiment can readily be interchanged with other embodiments. Notably, not every benefit described herein need be realized by each embodiment of the present invention; rather, any specific embodiment can provide one or more of the advantages discussed above. In the claims, elements and/or operations do not imply any particular order of operation, unless explicitly stated in the claims. It is intended that the following claims and their equivalents define the scope of the invention. Although the foregoing examples have been described in some detail for purposes of clarity of understanding, the above-described inventive techniques are not limited to the details provided. There are many alternative ways of implementing the above-described inventive techniques. The disclosed examples are illustrative and not restrictive.