Messaging applications have become a popular means of communication that can provide real-time or near real-time exchange of messages between participants of a messaging session. Messaging applications may be standalone applications, such as a text messaging application for a mobile device that is configured to exchange Short Message Service (“SMS”) or Multimedia Messaging Service (“MMS”) messages from a mobile phone or other mobile device. Messaging applications may also be in-application services provided as one component of an application or suite of applications. In-application messaging services may be used in social media platforms, collaborative work environment software, and numerous other types of applications to facilitate communications between groups of users. Text alone has a limited ability to convey or express nuanced feelings or moods. Thus, there are significant areas for new and improved mechanisms for providing more engaging and interactive user experiences in messaging applications.
A computing device according to one aspect of this disclosure includes a processor and a computer-readable medium storing executable instructions for causing the processor to perform operations that include: receiving a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device, determining an animation point at a preset location specified relative to dimensions of the user interface or the display and not based on a location of the message, establishing an animation path that includes the animation point and a target display location relative to a location of the message, such that at least a portion of the animation is rendered from the animation point to the target display location relative to the message, and rendering the animation on the user interface according to the established animation path.
A method, executed by a data processing system for rendering animations on a display, according to a second aspect of this disclosure includes receiving a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device; determining via a processor an animation point at a preset location specified relative to dimensions of the user interface or the display and not based on a location of the message; establishing via the processor an animation path that includes the animation point and a target display location relative to the location of the message, such that at least a portion of the animation is rendered from the animation point to the target display location relative to the message; and rendering the animation on the user interface according to the animation path.
A memory device according to a third aspect of this disclosure stores instructions that, when executed on a processor of a computing device, cause the computing device to render an animation on a user interface, by: receiving a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of the computing device; determining an animation point at a preset location specified relative to dimensions of the user interface or the display and not based on a location of the message; establishing an animation path that includes the animation point and a target display location relative to the location of the message, such that at least a portion of the animation is rendered from the animation point to the target display location relative to the message; and rendering the animation on the user interface according to the animation path.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
The drawing figures depict one or more implementations in accord with the present teachings, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements. Furthermore, it should be understood that the drawings are not necessarily to scale.
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent that the present teachings may be practiced without such details. In other instances, well known methods, procedures, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present teachings. In the following material, indications of direction, such as “top” or “left,” are merely to provide a frame of reference during the following discussion, and are not intended to indicate a required, desired, or intended orientation of the described articles.
The user devices 105a-105c each comprise a computing device that includes at least one display that can display a messaging user interface, a wired and/or wireless interface for exchanging messaging data with other user devices, and user interface means for controlling the messaging interface. The user interface means can include a keyboard, touch screen, a microphone, a camera, and/or other user interface components for receiving message content input from a user and for controlling the sending of message content to other users. The messaging interface may be implemented as a messaging application executable by a processor of the user device. The messaging interface may be a component of another application, such as a social media application, a collaborative software platform, or other application where users may exchange messages. Messages may typically include text content, but may also include non-textual content, such as multimedia content, files, data, and/or other information that may be shared by users of such software.
The wired and/or wireless interface for exchanging messaging data may comprise a wired or wireless network interface to facilitate establishing a network connection to the network 110 to communicate with other network-enabled devices. The network 110 may comprise one or more public or private networks and may be the collection of networks referred to as the Internet. The user devices 105a-105c may be configured to communicate using various communication protocols over wired and/or wireless network connections, such as but not limited to Wireless Local Area Network (WLAN), Wireless Wide Area Network (WWAN), or other types of wireless communications protocols. The user devices 105a-105c may also be configured to communicate directly with other suitably equipped user devices, such as the other user devices 105a-105c, without any intervening network by using BLUETOOTH and/or other wireless communication protocols.
Some implementations of the messaging application may be implemented for the Android operating system, and the messaging user interface 205 may be implemented using ListView. ListView displays a vertically scrollable collection of views. A view is a basic building block for the creation of user interface components in the Android operating system. Each view occupies a rectangular area of the display and is responsible for drawing and event handling. The ListView can be used to implement the messaging user interface 205 by rendering a view of a list of message objects. Each message object can provide a view associated with that message. For example, a message object can be configured to provide a view that renders a “message bubble” around the text of the message as illustrated in
The messaging user interface 205 may include a message entry field 220, in which a user may enter a new message to add to the messaging session. For example, in Android implementations, the message entry field 220 may be implemented as an EditText user interface element, and a new message object is added to the ListView in response to the message being entered by the user. The ListView can be refreshed and the newly added message can provide a view that includes the message entered by the user rendered in a chat bubble, such as those illustrated in
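By way of illustration only, a minimal Kotlin sketch of such an Android implementation is shown below. The adapter class, layout resource, and view identifiers (MessageAdapter, R.layout.message_bubble, R.id.message_text) are hypothetical names chosen for this example and are not required by the implementations described herein.

```kotlin
import android.content.Context
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.ArrayAdapter
import android.widget.EditText
import android.widget.TextView

// Hypothetical adapter backing the ListView: each message supplies a view that
// renders a "message bubble" around the message text.
class MessageAdapter(context: Context, private val messages: MutableList<String>) :
    ArrayAdapter<String>(context, 0, messages) {

    override fun getView(position: Int, convertView: View?, parent: ViewGroup): View {
        val bubble = convertView ?: LayoutInflater.from(context)
            .inflate(R.layout.message_bubble, parent, false)  // hypothetical layout resource
        bubble.findViewById<TextView>(R.id.message_text).text = messages[position]
        return bubble
    }

    // Adds a message entered in the EditText entry field and refreshes the ListView.
    fun submitMessage(entryField: EditText) {
        val text = entryField.text.toString()
        if (text.isNotBlank()) {
            messages.add(text)        // new message object added to the backing list
            notifyDataSetChanged()    // ListView re-renders, showing the new chat bubble
            entryField.text.clear()
        }
    }
}
```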
The messaging user interface may implement one or more event listeners configured to listen for events associated with a message and to perform one or more actions in response to detecting such an event. The types of listeners utilized may depend upon the particular implementation used. The message objects may include a listener that responds to touch event when the display of the user device 105 is a touchscreen. The message objects may include a listener that responds to “on-click” events where the user device 105 includes a pointer device, such as a mouse or trackpad, that the user may use to interact with the messaging user interface 205. The messaging user interface 205 and/or the message objects may also include a listener that responds to voice-command events where the messaging user interface 205 is configured to provide a voice-based command interface. Other types of listeners may be associated with the message objects for processing other types of events.
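As a non-limiting sketch, listeners may be attached to a message view as shown below; showReactionsToolbar is a hypothetical helper that displays the reactions toolbar for the identified message.

```kotlin
import android.view.View

// Hypothetical helper that displays the reactions toolbar 250 for the given message.
fun showReactionsToolbar(messageId: Int) { /* display the toolbar near the message */ }

fun attachMessageListeners(messageView: View, messageId: Int) {
    // "On-click" events from a pointer device or a tap on a touchscreen.
    messageView.setOnClickListener { showReactionsToolbar(messageId) }
    // Long-press touch events; returning true marks the event as consumed.
    messageView.setOnLongClickListener {
        showReactionsToolbar(messageId)
        true
    }
}
```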
In the example illustrated in
When a user selects an emoticon from the reactions toolbar 250, an animation associated with the selected emoticon may be rendered. Each emoticon included in the reactions toolbar 250 is associated with an emoticon identifier value. The emoticon identifier value may be an integer value associated with each emoticon and may be used by the messaging user interface 205 to identify an animation associated with the selected emoticon. The emoticon identifier may be an index value representing the position of the respective emoticon in the list of emoticons displayed by the reactions toolbar 250 or may be another value associated with the emoticon that can be used to identify which emoticon the user selected. Each message included in the list of messages is also associated with a message identifier. The message identifier may be an index value representing the position of that message in the list of messages associated with the messaging session being displayed by the messaging user interface 205 or may be another value assigned to each message such that each message of the messaging session can be uniquely identified.
The emoticon identifier and the message identifier can be passed to a render animation object that is configured to render a reaction animation over the contents of the messaging user interface 205. The render animation object can use the emoticon identifier value to determine which reaction animation(s) should be rendered in response to the indication that a user has selected a particular reaction emoticon. The messaging application may maintain a mapping of which animation(s) are associated with each reaction emoticon. More than one animation may be associated with a particular reaction emoticon, and timing information can also be associated with the reaction emoticon that indicates the order in which the animations should be rendered. The animations may be rendered sequentially, or one or more of the animations may be rendered concurrently. Rendering parameters may also be associated with each animation. The rendering parameters may include transparency or opacity control information, color information, and size information for the animation. The rendering parameters may be expressed as a function of time or be associated with a frame identifier of frames of the animation and can be changed to alter the color(s), size, opacity, and/or other parameters as the animation is rendered.
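For purposes of illustration, the mapping from a reaction emoticon identifier to its animation(s), timing information, and rendering parameters could be represented as sketched below; the identifiers, frame lists, and parameter curves are placeholders rather than a required data model.

```kotlin
// Placeholder rendering parameters expressed as functions of normalized time (0..1).
data class RenderingParams(
    val opacityAt: (Float) -> Float,   // transparency/opacity control information
    val scaleAt: (Float) -> Float,     // size information
    val colorTint: Int? = null         // optional color information (ARGB)
)

// A single animation associated with a reaction emoticon, with timing information.
data class ReactionAnimation(
    val frameResourceIds: List<Int>,   // frames rendered sequentially
    val startOffsetMs: Long,           // when this animation begins relative to the reaction
    val params: RenderingParams
)

// Placeholder identifiers and frame lists for this example.
const val HEART_EMOTICON_ID = 0
val heartFrames = listOf(1, 2, 3)
val sparkleFrames = listOf(4, 5, 6)

// Mapping maintained by the messaging application: one emoticon identifier may be
// associated with more than one animation, rendered sequentially or concurrently.
val reactionAnimations: Map<Int, List<ReactionAnimation>> = mapOf(
    HEART_EMOTICON_ID to listOf(
        ReactionAnimation(heartFrames, startOffsetMs = 0L, params = RenderingParams({ 1f }, { 1f })),
        ReactionAnimation(sparkleFrames, startOffsetMs = 150L, params = RenderingParams({ 1f - it }, { 0.5f + it }))
    )
)
```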
An animation path for the reaction animation may then be established. The animation path is a path on the messaging user interface 205 along which the reaction animation is to be rendered. An initial display location, an animation point (also referred to herein as an intermediate point), and a target display location may be determined. The initial display location represents a starting location on the messaging user interface 205 at which the animation is initially rendered. The animation point represents an intermediate point along the animation path that is a preset location specified relative to dimensions of the messaging user interface 205 or the display of the user device 105 on which the messaging user interface 205 is displayed. The target display location represents an end point for the animation path. The target display location may be defined relative to a location of the message on the messaging user interface 205 or a representation of such a message, such as a chat bubble or other user interface component that may be rendered for the message. The animation path is established such that at least a portion of the animation is rendered from the animation point to the target display location.
The animation path may be rendered from the initial display location to the target display location as a straight line, a curve, or a set of linear or curvilinear segments in which the animation point is an intermediate point on the animation path that falls between the initial display location and the target display location. In some implementations, the initial display location and the animation point are the same point, and the initial display location is a preset location specified relative to dimensions of the messaging user interface 205 or the display of the user device 105 on which the messaging user interface 205 is displayed. Furthermore, it is possible that a first animation or set of animations may be rendered up until the animation path reaches the animation point, and a second animation or set of animations may be rendered along the animation path between the animation point and the target display location.
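As one non-limiting example, on Android (API level 21 or later) a multi-segment animation path of this kind could be built with a graphics Path and animated with an ObjectAnimator, as sketched below; the coordinates and the animated view are placeholders.

```kotlin
import android.animation.ObjectAnimator
import android.graphics.Path
import android.view.View
import android.view.animation.AccelerateDecelerateInterpolator

// Moves the reaction view from the initial display location, through the animation
// point (intermediate point), to the target display location near the message.
fun animateAlongPath(
    reactionView: View,
    initial: Pair<Float, Float>,
    animationPoint: Pair<Float, Float>,
    target: Pair<Float, Float>
) {
    val path = Path().apply {
        moveTo(initial.first, initial.second)               // initial display location
        lineTo(animationPoint.first, animationPoint.second) // intermediate animation point
        lineTo(target.first, target.second)                 // target display location
    }
    ObjectAnimator.ofFloat(reactionView, View.X, View.Y, path).apply {
        duration = 600L
        interpolator = AccelerateDecelerateInterpolator()
        start()
    }
}
```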
The animation point 260 may be determined based on the dimensions of the messaging user interface 205. The height (H_UI) along the y-axis and width (W_UI) along the x-axis of a viewable area of the messaging user interface 205 may be determined. This viewable area may match the dimensions of the display upon which the messaging user interface 205 is rendered or may be smaller than the dimensions of the display if the messaging user interface 205 does not occupy the entire viewable area of the display. The height and width may be determined in pixels, millimeters, inches, or another unit of measurement. In some implementations, the animation point 260 is determined as a center point of the messaging user interface 205, in which the x and y coordinates of the animation point 260 are ½W_UI and ½H_UI, respectively. In other implementations, the animation point 260 may be offset from the center point of the messaging user interface 205. The animation point 260 may be located at the top left of the messaging user interface 205 (x=0, y=H_UI), the top center of the messaging user interface 205 (x=W_UI/2, y=H_UI), the top right of the messaging user interface 205 (x=W_UI, y=H_UI), the bottom left of the messaging user interface 205 (x=0, y=0), the bottom center (x=W_UI/2, y=0), or the bottom right of the messaging user interface 205 (x=W_UI, y=0). These examples are meant to illustrate that the animation point 260 may be determined based on the dimensions of the messaging user interface 205 or the display of the user device 105.
The animation point examples discussed above may be offset by the dimensions of the animation so that the animation does not exceed the boundaries of the viewable area of the messaging user interface 205. The width of the animation may be expressed as W_A and the height of the animation may be expressed as H_A. The animation point 260 may be located at the top left of the messaging user interface 205 (x=0+W_A, y=H_UI−H_A), the top center of the messaging user interface 205 (x=W_UI/2, y=H_UI−H_A), the top right of the messaging user interface 205 (x=W_UI−W_A, y=H_UI−H_A), the bottom left of the messaging user interface 205 (x=0+W_A, y=0+H_A), the bottom center (x=W_UI/2, y=0+H_A), or the bottom right of the messaging user interface 205 (x=W_UI−W_A, y=0+H_A). These examples are meant to illustrate that the animation point 260 may be determined based on the dimensions of the messaging user interface 205 or the display of the user device 105 and the dimensions of the animation to be rendered and do not limit the scope of the invention or the claims to these specific examples.
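By way of illustration only, the calculation described above may be expressed as sketched below, using the center point as the default animation point and insetting it by the animation's dimensions so the animation stays within the viewable area; the function and parameter names are placeholders.

```kotlin
// W_UI/H_UI are the width/height of the viewable area of the messaging user interface;
// W_A/H_A are the width/height of the animation to be rendered.
fun computeAnimationPoint(
    uiWidth: Float, uiHeight: Float,
    animWidth: Float, animHeight: Float
): Pair<Float, Float> {
    // Default: the center point of the messaging user interface.
    val centerX = uiWidth / 2f
    val centerY = uiHeight / 2f
    // Inset so an animation centered on the point does not exceed the viewable area.
    val x = centerX.coerceIn(animWidth / 2f, uiWidth - animWidth / 2f)
    val y = centerY.coerceIn(animHeight / 2f, uiHeight - animHeight / 2f)
    return x to y
}
```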
The target display location 265 in this example falls within the message bubble associated with message 210d. The coordinates of the message bubble associated with the message 210d can be determined based on the location of the message in the list of messages. The index of the message can be used to determine where the message bubble associated with message 210d appears relative to the visible portion of the messaging user interface 205 and to determine the coordinates of the message bubble associated with the message. The coordinates may be associated with a predetermined location (e.g., the bottom left corner, the upper right corner, or the center point) of the message bubble associated with the message. In Android-based implementations, the dimensions of the message bubble rendered for a message and the coordinates of one or more reference points of the message bubble (e.g., the top left corner or the bottom right corner) may be obtained from the view associated with that message (which may in turn be contained by a ListView as discussed above).
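As a non-limiting sketch, in an Android-based implementation the bubble coordinates could be obtained as shown below; the function name is a placeholder.

```kotlin
import android.widget.ListView

// Returns the on-screen (x, y) of the top-left corner of the message bubble at the
// given message index, together with the bubble's width and height, or null if the
// message is not currently laid out in the ListView.
fun bubbleBounds(listView: ListView, messageIndex: Int): IntArray? {
    val child = listView.getChildAt(messageIndex - listView.firstVisiblePosition) ?: return null
    val location = IntArray(2)
    child.getLocationInWindow(location)
    return intArrayOf(location[0], location[1], child.width, child.height)
}
```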
The target display location 265 can be determined based on the location of a reaction icon placement area 230 denoted by the dashed line in the message bubble associated with the message 210d. The reaction icon placement area 230 is rendered at a predetermined location within the message bubble, such as but not limited to below the message text as illustrated in the example of
In
The messaging application may obtain the coordinates of the message from the message object associated with that message. The message object may provide coordinates of the message within the messaging user interface 205. These coordinates can be used by the messaging application to determine whether the message falls within a visible portion of the contents of the messaging user interface 205. As discussed in the preceding examples, the messaging user interface 205 may be implemented as a ListView and each message object may provide its respective coordinates to the messaging user interface 205.
In
The messaging application can scroll the contents of the messaging user interface 205 such that the selected message becomes visible. The messaging application may scroll the contents of the user interface so that the selected message aligns with a predetermined location within the visible portion of the messaging user interface 205. The messaging user interface may determine a top visible message location at which a message at that location would be fully visible at a location proximate to the top of the messaging user interface 205. The messaging user interface may determine a center visible message location at which the message would be approximately centered within the messaging user interface 205. The messaging user interface may also determine a bottom visible message location at which the message would be located at a location proximate to the bottom of the messaging user interface 205. The messaging user interface 205 may select whichever of these locations is closest to the current location of the message in order to minimize the movement of the contents of the messaging user interface 205.
As discussed in the preceding examples, the message objects may each provide a view that is included in a ListView that provides a scrollable user interface that contains the messages, and the ListView object can provide methods for controlling the scrolling of the contents of the user interface to a particular message object included in the list.
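For purposes of illustration, scrolling to a selected message in such an implementation may use the ListView's standard scrolling methods, as sketched below.

```kotlin
import android.widget.ListView

// Scrolls the messaging user interface so the message at messageIndex becomes visible.
fun revealMessage(listView: ListView, messageIndex: Int, animated: Boolean = true) {
    if (animated) {
        listView.smoothScrollToPosition(messageIndex)  // animated scroll toward the message
    } else {
        listView.setSelection(messageIndex)            // jump directly to the message
    }
}
```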
The animation point 260 of the animation path is established based on the dimensions of the messaging user interface 205 using the process discussed above with respect to
The location of the animation point 260 and the target display location 265 may be determined as discussed in the preceding examples, such as
The process 300 may include a first operation 310 in which a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device is received. The user may touch the message in implementations where the display of the data processing system is a touchscreen, click on the message using a pointer controlled by a touchpad or mouse, use voice commands to select the message, or use any of the other various means discussed in the preceding examples.
The process 300 may also include a second operation 320 in which an animation point is determined at a preset location specified relative to dimensions of the user interface or the display and not based on a location of the message. The animation point represents an intermediate point along an animation path that is a preset location specified relative to dimensions of the user interface on which the animation is to be rendered, such as the messaging user interface 205, or the display of the device, such as the user device 105, on which the user interface is displayed. The location of the animation point may be predetermined for all animations, or a different animation point may be specified for specific animations. The location of the animation point may also be determined, at least in part, based on an application in which the user interface is displayed. For example, a first messaging application may use a first animation point that is determined based on a center point of the user interface or display on which the user interface is rendered, while a second messaging application may use a second animation point that is determined from a different point relative to the user interface or display on which the user interface is rendered (e.g. top left, top center, top right, bottom left, bottom center, bottom right). Both the first and second animation points are determined relative to the dimensions of the user interface or the display but are at different locations within the user interface.
The location of the animation point may also be based at least in part on a size, shape, or both the size and shape of the animation. The animation may be centered over the animation point along at least a portion of the animation path of the animation. A determination can be made as to the size and shape of the animation at the animation point. The dimensions of the frame or frames of animation to be rendered at the animation point may be determined. If these dimensions indicate that the animation would exceed the bounds of the visible area of the user interface at the animation point, the location of the animation point may be shifted horizontally or vertically on the user interface such that the animation would no longer exceed the bounds of the visible area of the user interface. Alternatively, the horizontal and vertical dimensions of the animation may be reduced to ensure that the animation remains fully visible within the user interface.
The process 300 may also include a third operation 330 in which an animation path that includes the animation point and a target display location relative to the location of the message is established. At least a portion of the animation path is rendered from the animation point to the target display location relative to the message. The target display location may be a point relative to the selected message or may be determined relative to the dimensions of the user interface or display, as discussed in the preceding examples. The animation path may render an animation that traverses the user interface in a linear path, a curvilinear path, or a path comprising more than one linear or curvilinear segment.
The process 300 may include a fourth operation 340 in which the animation is rendered on the user interface according to the established animation path. The messaging application may access the animation indicated in the signal reflecting the user input in a memory of the user device 105. The animation may comprise a series of images or frames of animation that are to be rendered sequentially over a predetermined period. The animation may also comprise information identifying attributes of how the animation is to be rendered, such as opacity of the rendered animation, size of the rendered animation, color(s) of the rendered animation, and/or other attributes. The messaging application may render the animation along the animation path according to these attributes. The messaging application may have access to various functions of the operating system (OS), libraries, frameworks, and a presentation layer of software operating on the data processing system in order to render the user interface of the messaging application.
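As one non-limiting example, rendering attributes such as opacity and size may be varied as a function of time while the animation is rendered along the path, as sketched below; the specific attribute curves are placeholders.

```kotlin
import android.animation.ValueAnimator
import android.view.View

// Drives opacity and size of the reaction view as functions of normalized time (0..1)
// over the duration of the animation.
fun applyRenderingAttributes(reactionView: View, durationMs: Long) {
    ValueAnimator.ofFloat(0f, 1f).apply {
        duration = durationMs
        addUpdateListener { animator ->
            val t = animator.animatedValue as Float
            reactionView.alpha = 1f - 0.5f * t     // fade toward half opacity
            reactionView.scaleX = 1f + 0.25f * t   // grow slightly over time
            reactionView.scaleY = 1f + 0.25f * t
        }
        start()
    }
}
```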
The process 400 may include a first operation 410 in which a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device is received. The user may touch the message in implementations where the display of the data processing system is a touchscreen, click on the message using a pointer controlled by a touchpad or mouse, use voice commands to select the message, or use any of the other various means discussed in the preceding examples.
The process 400 may include a second operation 420 in which a first location of the message is determined responsive to receiving the signal. The messaging application may determine the location of the selected message using the techniques discussed in the preceding examples. The location of the message can be determined based on the message's position in the list of messages included in the messaging session being displayed by the user interface. The location of the message may also be determined based on the location of a user interface component, such as a message bubble, in which the message is rendered on the user interface of the messaging application.
The process 400 may include a third operation 430 in which a determination is made that the message is not fully visible on the user interface based on the location of the message on the user interface. The message associated with the user input may not be fully visible because the dimensions of the contents of the user interface exceed the dimensions of the user interface. As messages are added by participants of the messaging session, previously entered messages may begin to scroll offscreen as the contents of the user interface exceed the dimensions of the user interface. The messaging application can determine whether the contents of the user interface have exceeded the display dimensions and can determine whether the location of the message indicates that the message falls either partly or completely outside of the visible region of the contents. The messaging application can make this determination using the techniques discussed in the preceding examples.
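By way of illustration only, the visibility determination could be made as sketched below for a ListView-based implementation.

```kotlin
import android.widget.ListView

// True only if the message at messageIndex is laid out and its bubble lies entirely
// within the ListView's viewport.
fun isMessageFullyVisible(listView: ListView, messageIndex: Int): Boolean {
    if (messageIndex < listView.firstVisiblePosition ||
        messageIndex > listView.lastVisiblePosition
    ) {
        return false  // scrolled offscreen entirely
    }
    val child = listView.getChildAt(messageIndex - listView.firstVisiblePosition) ?: return false
    return child.top >= 0 && child.bottom <= listView.height
}
```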
The process 400 may include a fourth operation 440 in which a second location is determined at which the message would be fully visible on the user interface. The messaging application can select a predetermined location on the user interface based on dimensions of the user interface. The messaging application may alternatively select from more than one predetermined location for the message in which the message would be fully visible on the user interface. The messaging application may select the predetermined location from the plurality of predetermined locations that is the shortest distance from the current position of the message to minimize the movement of the contents of the user interface when the contents are scrolled to bring the message fully into view.
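For purposes of illustration, selecting the nearest of several predetermined fully visible positions (top, center, and bottom of the viewport in this sketch) could be expressed as shown below; the positions are vertical offsets in pixels and the names are placeholders.

```kotlin
import kotlin.math.abs

// Chooses, from the top, center, and bottom fully visible positions, the one nearest
// the message's current vertical offset so that scrolling is minimized.
fun chooseSecondLocation(currentTop: Float, uiHeight: Float, messageHeight: Float): Float {
    val candidates = listOf(
        0f,                               // top visible message location
        (uiHeight - messageHeight) / 2f,  // center visible message location
        uiHeight - messageHeight          // bottom visible message location
    )
    return candidates.minByOrNull { abs(it - currentTop) } ?: candidates.first()
}
```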
The process 400 may include a fifth operation 450 in which content of the user interface is scrolled to move the message to the second location. The messaging application may have access to various functions of the operating system (OS), libraries, frameworks, and a presentation layer of software operating on the data processing system in order to render the user interface of the messaging application and may use one or more of these elements to control which portion of the contents of the messaging session are visible in the user interface. The messaging application can determine how far the contents of the user interface need to be scrolled based on the first and second locations determined in the preceding operations.
The process 400 may include a sixth operation 460 in which an animation path is established relative to the second position of the message. The animation path may include an initial display location, an animation point, and a target display location as discussed in the preceding examples. The target display location may be determined based on the second location of the message. An emoticon or other graphical representation of the reaction animation may be added to the message. The location where the emoticon or other graphical representation is rendered depends on the configuration of the message bubble or other graphical representation which is used to represent the message in the user interface. The messaging application may determine the location of a reaction icon placement area 230, and an offset within the reaction icon placement area 230 based on how many reactions have already been added to the message.
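As a non-limiting sketch, the target display location within the reaction icon placement area 230 could be offset by the number of reactions already attached to the message, as shown below; the measurements and names are placeholders.

```kotlin
// Computes the target display location for a new reaction icon inside the reaction
// icon placement area 230, offset horizontally past any reactions already added.
fun targetDisplayLocation(
    placementAreaX: Float, placementAreaY: Float,  // top-left corner of the placement area
    existingReactionCount: Int,
    iconWidth: Float, iconSpacing: Float
): Pair<Float, Float> {
    val x = placementAreaX + existingReactionCount * (iconWidth + iconSpacing)
    return x to placementAreaY
}
```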
The process 400 may include a seventh operation 470 in which the animation is rendered on the user interface according to the established animation path. The messaging application may render the animation on the user interface according to the established animation path as discussed in the preceding examples.
The process 500 may include a first operation 510 in which a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device is received. The user may touch the message in implementations where the display of the data processing system is a touchscreen, click on the message using a pointer controlled by a touchpad or mouse, use voice commands to select the message, or use any of the other various means discussed in the preceding examples.
The process 500 may include a second operation 520 in which a first location of the message on the user interface is determined responsive to receiving the signal. The messaging application may determine the location of the selected message using the techniques discussed in the preceding examples. The location of the message can be determined based on the message's position in the list of messages included in the messaging session being displayed by the user interface. The location of the message may also be determined based on the location of a user interface component, such as a message bubble, in which the message is rendered on the user interface of the messaging application.
The process 500 may include a third operation 530 in which a determination can be made that the message is not fully visible on the user interface. The messaging application can determine whether the contents of the user interface have exceeded the display dimensions and determine whether the location of the message indicates that the message falls either partly or completely outside of the visible region of the contents. The messaging application can make this determination using the techniques discussed in the preceding examples.
The process 500 may include a fourth operation 540 in which the contents of the user interface can be scrolled to move the message to a second location in which the message is fully visible on the user interface responsive to determining that the message is not fully visible. The contents of the user interface may be scrolled such that the message is moved to a predetermined location on the screen. The predetermined location may coincide with an initial display location, an animation point, or a target display location associated with the animation path along which the animation will be rendered. The messaging application can cause the contents of the messaging user interface 205 to scroll as discussed in the preceding examples to cause the portion of the content visible in the messaging user interface 205 to change.
The process 500 may include a fifth operation 550 in which the animation is rendered on the user interface such that the animation interacts with the message responsive to scrolling the contents of the user interface. For example, at least a portion of the animation may be rendered on or proximate to a representation of the message in the user interface, and an emoticon or other graphical representation of the reaction animation may be added to the message as discussed in the preceding examples.
The process 600 may include a first operation 610 in which a signal reflecting a user input to invoke an animation associated with a message displayed on a user interface on a display of a computing device is received. The user may touch the message in implementations where the display of the data processing system is a touchscreen, click on the message using a pointer controlled by a touchpad or mouse, use voice commands to select the message, or use any of the other various means discussed in the preceding examples.
The process 600 may include a second operation 620 in which a determination is made that a location of the message on the user interface is different from a preset location specified relative to dimensions of the user interface or the display responsive to receiving the signal. The messaging application can determine the current location of the message within the user interface using the techniques discussed in the preceding examples. The messaging application can compare the current location of the message with the predetermined location to determine if the two locations are different. This approach may be used by the messaging application where the animations to be rendered have an animation path that may be at least partly preset. For example, one or more of the initial display location, the animation point, or the target display location may be preset points, and the message may be positioned at the preset location so that the message aligns with the preset animation path.
The process 600 may include a third operation 630 in which the contents of the user interface are scrolled to move the message to the preset location, at which the message is fully visible on the user interface, responsive to determining that the location of the message is different from the preset location. The messaging application can scroll the contents of the user interface of the messaging application as discussed in the preceding examples.
The process 600 may include a fourth operation 640 in which the animation is rendered on the user interface such that the animation interacts with the message. For example, at least a portion of the animation may be rendered on or proximate to a representation of the message in the user interface, and an emoticon or other graphical representation of the reaction animation may be added to the message as discussed in the preceding examples.
Examples of the operations illustrated in the flow charts shown in
The detailed examples of systems, devices, and techniques described in connection with
In some examples, a hardware module may be implemented mechanically, electronically, or with any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is configured to perform certain operations. For example, a hardware module may include a special-purpose processor, such as a field-programmable gate array (FPGA) or an Application Specific Integrated Circuit (ASIC). A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations and may include a portion of machine-readable medium data and/or instructions for such configuration. For example, a hardware module may include software encompassed within a programmable processor configured to execute a set of software instructions. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (for example, configured by software) may be driven by cost, time, support, and engineering considerations.
Accordingly, the phrase “hardware module” should be understood to encompass a tangible entity capable of performing certain operations and may be configured or arranged in a certain physical manner, be that an entity that is physically constructed, permanently configured (for example, hardwired), and/or temporarily configured (for example, programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering examples in which hardware modules are temporarily configured (for example, programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module includes a programmable processor configured by software to become a special-purpose processor, the programmable processor may be configured as respectively different special-purpose processors (for example, including different hardware modules) at different times. Software may accordingly configure a processor or processors, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time. A hardware module implemented using one or more processors may be referred to as being “processor implemented” or “computer implemented.”
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (for example, over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory devices to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output in a memory device, and another hardware module may then access the memory device to retrieve and process the stored output.
In some examples, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by, and/or among, multiple computers (as examples of machines including processors), with these operations being accessible via a network (for example, the Internet) and/or via one or more software interfaces (for example, an application program interface (API)). The performance of certain of the operations may be distributed among the processors, not only residing within a single machine, but deployed across several machines. Processors or processor-implemented modules may be in a single geographic location (for example, within a home or office environment, or a server farm), or may be distributed across multiple geographic locations.
The example software architecture 702 may be conceptualized as layers, each providing various functionality. For example, the software architecture 702 may include layers and components such as an operating system (OS) 714, libraries 716, frameworks 718, applications 720, and a presentation layer 744. Operationally, the applications 720 and/or other components within the layers may invoke API calls 724 to other layers and receive corresponding results 726. The layers illustrated are representative in nature and other software architectures may include additional or different layers. For example, some mobile or special purpose operating systems may not provide the frameworks/middleware 718.
The OS 714 may manage hardware resources and provide common services. The OS 714 may include, for example, a kernel 728, services 730, and drivers 732. The kernel 728 may act as an abstraction layer between the hardware layer 704 and other software layers. For example, the kernel 728 may be responsible for memory management, processor management (for example, scheduling), component management, networking, security settings, and so on. The services 730 may provide other common services for the other software layers. The drivers 732 may be responsible for controlling or interfacing with the underlying hardware layer 704. For instance, the drivers 732 may include display drivers, camera drivers, memory/storage drivers, peripheral device drivers (for example, via Universal Serial Bus (USB)), network and/or wireless communication drivers, audio drivers, and so forth depending on the hardware and/or software configuration.
The libraries 716 may provide a common infrastructure that may be used by the applications 720 and/or other components and/or layers. The libraries 716 typically provide functionality for use by other software modules to perform tasks, rather than interacting directly with the OS 714. The libraries 716 may include system libraries 734 (for example, C standard library) that may provide functions such as memory allocation, string manipulation, and file operations. In addition, the libraries 716 may include API libraries 736 such as media libraries (for example, supporting presentation and manipulation of image, sound, and/or video data formats), graphics libraries (for example, an OpenGL library for rendering 2D and 3D graphics on a display), database libraries (for example, SQLite or other relational database functions), and web libraries (for example, WebKit that may provide web browsing functionality). The libraries 716 may also include a wide variety of other libraries 738 to provide many functions for applications 720 and other software modules.
The frameworks 718 (also sometimes referred to as middleware) provide a higher-level common infrastructure that may be used by the applications 720 and/or other software modules. For example, the frameworks 718 may provide various graphic user interface (GUI) functions, high-level resource management, or high-level location services. The frameworks 718 may provide a broad spectrum of other APIs for applications 720 and/or other software modules.
The applications 720 include built-in applications 740 and/or third-party applications 742. Examples of built-in applications 740 may include, but are not limited to, a contacts application, a browser application, a location application, a media application, a messaging application, and/or a game application. Third-party applications 742 may include any applications developed by an entity other than the vendor of the particular platform. The applications 720 may use functions available via OS 714, libraries 716, frameworks 718, and presentation layer 744 to create user interfaces to interact with users.
Some software architectures use virtual machines, as illustrated by a virtual machine 748. The virtual machine 748 provides an execution environment where applications/modules can execute as if they were executing on a hardware machine (such as the machine 800 of
The machine 800 may include processors 810, memory 830, and I/O components 850, which may be communicatively coupled via, for example, a bus 802. The bus 802 may include multiple buses coupling various elements of machine 800 via various bus technologies and protocols. In an example, the processors 810 (including, for example, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an ASIC, or a suitable combination thereof) may include one or more processors 812a to 812n that may execute the instructions 816 and process data. In some examples, one or more processors 810 may execute instructions provided or identified by one or more other processors 810. The term “processor” includes a multi-core processor including cores that may execute instructions contemporaneously. Although
The memory/storage 830 may include a main memory 832, a static memory 834, or other memory, and a storage unit 836, each accessible to the processors 810 such as via the bus 802. The storage unit 836 and memory 832, 834 store instructions 816 embodying any one or more of the functions described herein. The memory/storage 830 may also store temporary, intermediate, and/or long-term data for processors 810. The instructions 816 may also reside, completely or partially, within the memory 832, 834, within the storage unit 836, within at least one of the processors 810 (for example, within a command buffer or cache memory), within memory of at least one of the I/O components 850, or any suitable combination thereof, during execution thereof. Accordingly, the memory 832, 834, the storage unit 836, memory in processors 810, and memory in I/O components 850 are examples of machine-readable media.
As used herein, “machine-readable medium” refers to a device able to temporarily or permanently store instructions and data that cause machine 800 to operate in a specific fashion, and may include, but is not limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, optical storage media, magnetic storage media and devices, cache memory, network-accessible or cloud storage, other types of storage and/or any suitable combination thereof. The term “machine-readable medium” applies to a single medium, or combination of multiple media, used to store instructions (for example, instructions 816) for execution by a machine 800 such that the instructions, when executed by one or more processors 810 of the machine 800, cause the machine 800 to perform any one or more of the features described herein. Accordingly, a “machine-readable medium” may refer to a single storage device, as well as “cloud-based” storage systems or storage networks that include multiple storage apparatus or devices. The term “machine-readable medium” excludes signals per se.
The I/O components 850 may include a wide variety of hardware components adapted to receive input, provide output, produce output, transmit information, exchange information, capture measurements, and so on. The specific I/O components 850 included in a particular machine will depend on the type and/or function of the machine. For example, mobile devices such as mobile phones may include a touch input device, whereas a headless server or IoT device may not include such a touch input device. The particular examples of I/O components illustrated in
In some examples, the I/O components 850 may include biometric components 856, motion components 858, environmental components 860, and/or position components 862, among a wide array of other physical sensor components. The biometric components 856 may include, for example, components to detect body expressions (for example, facial expressions, vocal expressions, hand or body gestures, or eye tracking), measure biosignals (for example, heart rate or brain waves), and identify a person (for example, via voice-, retina-, fingerprint-, and/or facial-based identification). The motion components 858 may include, for example, acceleration sensors (for example, an accelerometer) and rotation sensors (for example, a gyroscope). The environmental components 860 may include, for example, illumination sensors, temperature sensors, humidity sensors, pressure sensors (for example, a barometer), acoustic sensors (for example, a microphone used to detect ambient noise), proximity sensors (for example, infrared sensing of nearby objects), and/or other components that may provide indications, measurements, or signals corresponding to a surrounding physical environment. The position components 862 may include, for example, location sensors (for example, a Global Positioning System (GPS) receiver), altitude sensors (for example, an air pressure sensor from which altitude may be derived), and/or orientation sensors (for example, magnetometers).
The I/O components 850 may include communication components 864, implementing a wide variety of technologies operable to couple the machine 800 to network(s) 870 and/or device(s) 880 via respective communicative couplings 872 and 882. The communication components 864 may include one or more network interface components or other suitable devices to interface with the network(s) 870. The communication components 864 may include, for example, components adapted to provide wired communication, wireless communication, cellular communication, Near Field Communication (NFC), Bluetooth communication, Wi-Fi, and/or communication via other modalities. The device(s) 880 may include other machines or various peripheral devices (for example, coupled via USB).
In some examples, the communication components 864 may detect identifiers or include components adapted to detect identifiers. For example, the communication components 864 may include Radio Frequency Identification (RFID) tag readers, NFC detectors, optical sensors (for example, for detecting one- or multi-dimensional bar codes or other optical codes), and/or acoustic detectors (for example, microphones to identify tagged audio signals). In some examples, location information may be determined based on information from the communication components 864, such as, but not limited to, geo-location via Internet Protocol (IP) address, location via Wi-Fi, cellular, NFC, Bluetooth, or other wireless station identification and/or signal triangulation.
While various embodiments have been described, the description is intended to be exemplary, rather than limiting, and it is understood that many more embodiments and implementations are possible that are within the scope of the embodiments. Although many possible combinations of features are shown in the accompanying figures and discussed in this detailed description, many other combinations of the disclosed features are possible. Any feature of any embodiment may be used in combination with or substituted for any other feature or element in any other embodiment unless specifically restricted. Therefore, it will be understood that any of the features shown and/or discussed in the present disclosure may be implemented together in any suitable combination. Accordingly, the embodiments are not to be restricted except in light of the attached claims and their equivalents. Also, various modifications and changes may be made within the scope of the attached claims.
While the foregoing has described what are considered to be the best mode and/or other examples, it is understood that various modifications may be made therein and that the subject matter disclosed herein may be implemented in various forms and examples, and that the teachings may be applied in numerous applications, only some of which have been described herein. It is intended by the following claims to claim any and all applications, modifications and variations that fall within the true scope of the present teachings.
Unless otherwise stated, all measurements, values, ratings, positions, magnitudes, sizes, and other specifications that are set forth in this specification, including in the claims that follow, are approximate, not exact. They are intended to have a reasonable range that is consistent with the functions to which they relate and with what is customary in the art to which they pertain.
The scope of protection is limited solely by the claims that now follow. That scope is intended and should be interpreted to be as broad as is consistent with the ordinary meaning of the language that is used in the claims when interpreted in light of this specification and the prosecution history that follows and to encompass all structural and functional equivalents. Notwithstanding, none of the claims are intended to embrace subject matter that fails to satisfy the requirement of Sections 101, 102, or 103 of the Patent Act, nor should they be interpreted in such a way. Any unintended embracement of such subject matter is hereby disclaimed.
Except as stated immediately above, nothing that has been stated or illustrated is intended or should be interpreted to cause a dedication of any component, step, feature, object, benefit, advantage, or equivalent to the public, regardless of whether it is or is not recited in the claims.
It will be understood that the terms and expressions used herein have the ordinary meaning as is accorded to such terms and expressions with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein. Relational terms such as first and second and the like may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “a” or “an” does not, without further constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various examples for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed example. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.