Artists have conventionally utilized brushes and paints to create a work of art on a canvas. An artist has the freedom to choose a type and size of canvas, a type and size of brush, and types and colors of paint to create a work of art. Different canvases, brushes, and paints can give the artist freedom in generating the work of art.
As computers have become more popular and readily accessible, paint simulation programs have been created that are configured to simulate artistic painting on a computer. These paint simulation programs have traditionally not been particularly robust or realistic. For example, many paint simulation programs utilize two-dimensional stamps of a fixed size and shape, such as a circle or square. A user can select the stamp, select the color, and then utilize an input device (e.g., a mouse or touchpad) to stamp the shape repeatedly on a computer screen in accordance with user input. It can be readily ascertained, however, that real-world paintbrushes have several degrees of freedom, such that the size and shape of a footprint of the paintbrush changes as the user handles the paintbrush.
More recently, traditional paint simulation programs have attempted to mimic use of physical media through various brushes and paint effects. For instance, in some paint simulation programs, the brushes can be styled to represent depositing various types of paint such as oils, acrylics, pastels, charcoals, pens, or the like. Moreover, the paint simulation programs can provide various effects for each type of paint, which attempt to portray realistic effects for the differing types of paint. However, many of these conventional paint simulation programs commonly lack realistic modeling of paint interaction that resembles characteristics of physical painting on a physical canvas.
Described herein are various technologies that pertain to simulating oil painting. Data can be received from a sensor that indicates a desired orientation of an image editing tool with respect to a computer-implemented canvas. The computer-implemented canvas can include a paint map, which includes color values and height values of pixels representative of oil paint deposited on the computer-implemented canvas. Moreover, a footprint of the image editing tool upon the computer-implemented canvas can be computed based upon the data from the sensor. Further, an oil paint ridge model can be generated by modulating height values of a subset of the pixels from the paint map that are outside the footprint and less than or equal to a predetermined distance from an edge of the footprint. A display screen of a computing device can be caused to update an image rendered thereupon based upon the oil paint ridge model.
In accordance with various embodiments, oil paint deposited on the computer-implemented canvas can be dried. For instance, the paint map of the computer-implemented canvas can include a wet layer and a dry layer. The oil paint deposited on the computer-implemented canvas can be dried by merging the wet layer into the dry layer to create an updated dry layer. Merging the wet layer into the dry layer can include combining height values from the wet layer and the dry layer per pixel in the updated dry layer. Further, merging the wet layer into the dry layer can include replacing color values from the dry layer with color values from the wet layer in the updated dry layer for pixels having color values in the wet layer. Moreover, merging the wet layer into the dry layer can include maintaining color values from the dry layer in the updated dry layer for pixels lacking color values in the wet layer.
According to various embodiments set forth herein, the footprint of the image editing tool upon the computer-implemented canvas can be computed based at least in part upon geometry of an image editing tool specified by a model of the image editing tool, the data received from the sensor that indicates a desired orientation of the image editing tool with respect to the computer-implemented canvas, and the height values of the pixels from the paint map. The footprint can further be computed based upon a texture of the computer-implemented canvas. Moreover, the footprint can be computed based upon data received from the sensor that indicates a pressure applied to the image editing tool. The footprint can further be computed based upon data received from the sensor that indicates movement of the image editing tool.
The above summary presents a simplified summary in order to provide a basic understanding of some aspects of the systems and/or methods discussed herein. This summary is not an extensive overview of the systems and/or methods discussed herein. It is not intended to identify key/critical elements or to delineate the scope of such systems and/or methods. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
Various technologies pertaining to simulation of painting with oil paint on a computer-implemented canvas are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It may be evident, however, that such aspect(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects. Further, it is to be understood that functionality that is described as being carried out by certain system components may be performed by multiple components. Similarly, for instance, a component may be configured to perform functionality that is described as being carried out by multiple components.
Moreover, the term “or” is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from the context, the phrase “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, the phrase “X employs A or B” is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be construed to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form.
As set forth herein, painting with oil paint on a computer-implemented canvas is simulated. Fluid flow and interaction of oil paint can be modeled to provide dynamism to the oil paint. Accordingly, oil paint previously deposited on the computer-implemented canvas can be manipulated to enhance dynamism of the oil paint. For instance, an oil paint ridge model can be generated by modulating height values of a subset of pixels from a paint map that are outside a footprint of an image editing tool. Moreover, the oil paint deposited on the computer-implemented canvas can have various states of dryness. Further, the oil paint deposited on the computer-implemented canvas can have a three-dimensional height (e.g., height value for the pixels in the paint map), which can affect subsequent painting operations.
Referring now to the drawings,
The sensor 106 can output data that is indicative of the movement and/or position of the input mechanism, and such data can be received by the computing device 102. A display screen 110 can be in communication with the computing device 102, such that the computing device 102 can cause the display screen 110 to display graphical data. More particularly, the oil paint simulation system 104 can be configured to cause the display screen 110 to display an image editing user interface (UI) 112. The image editing UI 112 can include a graphical depiction of a canvas, paint selections such that the user 108 can select a type and/or color of paint to employ, image editing tool (e.g., brush) selections such that the user 108 can select a type and/or size of image editing tool to employ, amongst other graphical data. Thus, the oil paint simulation system 104 is configured to provide a computer-implemented simulation environment, wherein images caused to be displayed in the image editing UI 112 are based at least in part upon input data detected by the sensor 106.
It is contemplated that the sensor 106 can be substantially any type of sensor that can accept input from the user 108. For example, the sensor 106 can be a gesture enabled trackpad, touch sensitive display screen (e.g., the sensor 106 can be integrated into the display screen 110), mouse, camera, microphone, remote control, keyboard, combination thereof, or the like. According to various examples, it is contemplated that the sensor 106 can rely on speech recognition, touch and stylus recognition, gesture recognition both on the display screen 110 and adjacent to the display screen 110, air gestures, head and eye tracking, voice and speech, vision, touch, gestures and the like. Moreover, the sensor 106 can be configured to output data that indicates position and/or movement. The output data can be utilized by the oil paint simulation system 104, which can cause images to be displayed in the image editing UI 112 that are reflective of an artistic intent of the user 108.
In accordance with an example, the sensor 106 can be a gesture enabled trackpad. Following this example, the gesture enabled trackpad can output data that indicates position and/or movement of a finger (or fingers) of the user 108, including height field(s) corresponding to the finger (or fingers) of the user 108. Such data can be indicative of movement of a paintbrush or other suitable image editing tool over a computer-implemented canvas. Additionally or alternatively, the gesture enabled trackpad can output data that specifies pressure applied by the finger (or fingers) of the user 108. The oil paint simulation system 104 can obtain such data and cause updated images to be displayed in the image editing UI 112 on the display screen 110.
By way of another example, the sensor 106 can be a mouse. Following this example, through movement of the mouse and selection of one or more buttons on the mouse, the user 108 can cause the mouse to output data that is indicative of movement of a paintbrush or other suitable image editing tool over the computer-implemented canvas. The oil paint simulation system 104 can receive such data and can cause updated images to be displayed in the image editing UI 112 on the display screen 110.
According to a further example, the sensor 106 can be integrated into the display screen 110, such that the display screen 110 is a touch sensitive display screen. The user 108 can move his finger (or fingers) or stylus over the display screen 110 as if the user 108 were painting on a canvas. The oil paint simulation system 104 can cause the image editing UI 112 to update images on the computer display screen 110 based at least in part upon sensed movement of the finger (or fingers) of the user 108 or stylus utilized by the user 108 over the display screen 110.
While several examples have been set forth above, it is to be understood that the oil paint simulation system 104 can be configured to simulate real-world artistic painting/image editing based at least in part upon any suitable input device that is representative of position/movement of a three-dimensional image editing tool (e.g., paintbrush, etc.) over a canvas. For instance, the sensor 106 can be configured to detect orientation of an input mechanism (e.g., in three-dimensions), pressure applied by the user 108 with respect thereto, amongst other data that is indicative of intent of the user 108 with respect to painting on a canvas. Operation of the oil paint simulation system 104 will now be described in greater detail below.
Now turning to
The computer-implemented canvas 202 can have various properties. Examples of the properties include texture (e.g., surface ridges), absorption, color, and the like. According to an illustration, a type of medium (e.g., type of paper, etc.) for the computer-implemented canvas 202 can be selected by the user, and the selected medium type can have a set of properties (e.g., texture, absorption, color, etc.). By way of another illustration, it is contemplated that the computer-implemented canvas 202 can be a still image (e.g., selected by a user, captured by a camera, etc.). The computer-implemented canvas 202 further includes a paint map 204. The paint map 204 includes color values and height values of pixels representative of oil paint deposited on the computer-implemented canvas 202. The height values of the pixels in the paint map 204 can enhance realism by providing a thickness to the oil paint deposited on the computer-implemented canvas 202, which can be visually observable when rendered on the display screen 110 and can affect subsequent painting operations.
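The paint map described above can be sketched as a pair of per-pixel arrays. The class name, the RGBA color convention (an alpha of 0 meaning "no paint"), and the `has_paint` helper below are illustrative assumptions, not structures recited herein.

```python
import numpy as np

# Illustrative sketch of a paint map: per-pixel color values plus a scalar
# height value representing the thickness of deposited oil paint. The RGBA
# convention (alpha of 0 meaning "no paint at this pixel") is an assumption.
class PaintMap:
    def __init__(self, width, height):
        self.color = np.zeros((height, width, 4), dtype=np.float32)
        self.height = np.zeros((height, width), dtype=np.float32)

    def has_paint(self, x, y):
        # A pixel "has paint" once any color has been deposited there.
        return bool(self.color[y, x, 3] > 0.0)
```

Storing the height value alongside the color value is what allows the thickness of previously deposited paint to influence subsequent footprint computation and ridge formation.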
Moreover, the oil paint simulation system 104 includes a tool component 206 that generates a model of the image editing tool. The model of the image editing tool can specify geometry of the image editing tool. For instance, the image editing tool can be a paintbrush; however, the claimed subject matter is not so limited. The model of the image editing tool generated by the tool component 206 can be based at least in part upon the input data from the user (e.g., the input data from the user can indicate a selection of the type and/or size of the image editing tool).
The tool component 206 can provide rendering algorithms for various image editing tools that are selectable responsive to the input data received from the user. For instance, image editing tools can be provided for oil, pastels, custom paint tools, and the like. Further, the tool component 206 can initiate simulation and/or rendering of paint on the computer-implemented canvas 202. The tool component 206 can also initiate rendering of the image editing tool, including deformations of the brush bristles, etc. For instance, each image editing tool provided by the tool component 206 can have properties that define rendering of such image editing tool on the display screen 110. The tool component 206 can cause the image editing tool to be rendered in real-time as such image editing tool is being used. Thus, the tool component 206 can cause the image editing tool to be rendered per properties of such image editing tool and the received input data. Moreover, the tool component 206 can update the image editing tool over time.
The oil paint simulation system 104 further includes an effector component 208 that computes a footprint of the image editing tool upon the computer-implemented canvas 202 based at least in part upon the geometry of the image editing tool and the input data received from the user. The footprint of the image editing tool can further be computed by the effector component 208 based upon texture of the computer-implemented canvas 202, height values of the pixels specified in the paint map 204, a combination thereof, and so forth. Moreover, the tool component 206 can instantiate the effector component 208 (or a plurality of effector components similar to the effector component 208).
The oil paint simulation system 104 can further include a paint component 210 that can update the pixels from the paint map 204 within the footprint based at least in part upon the color values and the height values of the pixels within the footprint as specified in the paint map 204 as well as a pickup map. The pickup map can include color values and height values representative of a disparate oil paint on the image editing tool. The pixels from the paint map 204 within the footprint can be updated by the paint component 210 to deposit the disparate oil paint from the image editing tool onto the computer-implemented canvas 202. Such deposition of the disparate oil paint can be effectuated by mixing (e.g., through linear interpolation, etc.) the disparate oil paint from the image editing tool (e.g., represented by the color values and height values from the pickup map) with the oil paint previously deposited on the computer-implemented canvas 202 (e.g., represented by the color values and height values from the paint map 204).
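The deposition by mixing described above can be sketched as a per-pixel linear interpolation between the canvas paint (paint map) and the tool paint (pickup map). The function name and the `mix` weight are assumptions for illustration; only the use of linear interpolation and the combining of heights follow the description.

```python
import numpy as np

def deposit(canvas_color, canvas_height, pickup_color, pickup_height, mix=0.5):
    """Mix paint from the tool's pickup map into the canvas paint map.

    Linear interpolation blends the previously deposited color with the
    color carried on the tool; heights accumulate. The `mix` weight of 0.5
    is an assumed default, not a value taken from the description.
    """
    new_color = (1.0 - mix) * canvas_color + mix * pickup_color
    new_height = canvas_height + pickup_height
    return new_color, new_height
```

In practice the mix weight could itself depend on pressure or paint wetness; here it is held fixed for clarity.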
The paint component 210 can include logic to simulate and render oil paint on the computer-implemented canvas 202. For instance, the paint component 210 can provide paint simulation algorithms. Moreover, the paint component 210 can include shaders. Further, it is contemplated that the paint component 210 can continuously update the image rendered on the display screen 110 (e.g., to animate painting, etc.).
The oil paint simulation system 104 can also include a render component 212 that is configured to render an artistic work on the computer-implemented canvas 202 displayed on the display screen 110. The render component 212 can utilize any suitable rendering technique to render the artistic work on the display screen 110. Moreover, the render component 212 can be configured to render paint such that it appears as a thin layer on the display screen 110, control glossiness of paint as it appears on the display screen 110, control lighting and/or height of paint as it appears on the display screen 110, etc.
As noted above, it is contemplated that the tool component 206 can instantiate multiple effector components similar to the effector component 208. For instance, multiple effector components can be instantiated by the tool component 206 when using the image editing tool to paint by sprinkling the oil paint on the computer-implemented canvas 202 or airbrushing the oil paint on the computer-implemented canvas 202. Thus, each paint drop can be considered as a source with a respective footprint, where the respective footprints can be computed by differing effector components. According to an example, the tool component 206 can instantiate two effector components. Following this example, the first effector component can compute a first footprint of the image editing tool upon the computer-implemented canvas 202, and the second effector component can compute a second footprint of the image editing tool upon the computer-implemented canvas 202. Such effector components can compute the footprints based at least in part upon the data received from the sensor. Accordingly, the paint component 210 can deposit disparate oil paint from the image editing tool onto the computer-implemented canvas 202 by updating pixels from the paint map 204 within the first footprint and the second footprint based at least in part upon the color values and the height values of the pixels within the first footprint and the second footprint and the pickup map. Again, the pickup map can include color values and height values representative of the disparate oil paint on the image editing tool. It is to be appreciated, however, that the claimed subject matter is not limited to the foregoing example as it is contemplated that substantially any number of effector components can be instantiated by the tool component 206.
Turning now to
Moreover, the effector component 208 can include a physics simulation component 312 that simulates physics of the interaction between the image editing tool and the computer-implemented canvas when computing the footprint 302. For instance, if the image editing tool is a brush tuft, then the physics simulation component 312 can provide for deformation of such brush. According to another example, if the image editing tool is a paint drop, then the physics simulation component 312 can simulate the physics of such paint drop as though it is thrown through the air onto the computer-implemented canvas, slides on a paint surface, etc. Yet, it is to be appreciated that the claimed subject matter is not limited to the foregoing examples.
With reference to
Moreover, the paint component 210 includes a ridge creation component 404 that generates an oil paint ridge model. The ridge creation component 404 can model fluid flow and interaction to allow the user to create rich artwork that resembles physical painting. When physically pushing oil paint, a ridge is often observed. For example, when physically painting on wet oil paint, the oil paint underneath a brush can be pushed towards the sides of the brush during a brush stroke, and the oil paint pushed away can pile up around a footprint, resulting in a ridge that can remain after the brush stroke. Such ridge can be modeled by the ridge creation component 404 to simulate dynamism of depositing oil paint and interacting with previously deposited oil paint.
Turning briefly to
Again reference is made to
According to an example, the predefined distance from the edge of the footprint 302 can be three pixels. Many of the examples set forth herein describe the predefined distance being three pixels; however, it is to be appreciated that other predefined distances from the edge of the footprint 302 are intended to fall within the scope of the hereto appended claims. For instance, predefined distances of one pixel, two pixels, four pixels, five pixels, or more than five pixels are intended to fall within the scope of the hereto appended claims.
The edge detection component 406 can determine respective distances of the subset of the pixels from the edge of the footprint 302. Moreover, the modulation component 408 can modulate the height values of the subset of the pixels as a function of the respective distances from the edge of the footprint 302. According to an example, the modulation component 408 can increase a first height value of a first pixel in the subset by a first increment, where the first pixel is at a distance of one pixel from the edge of the footprint 302 as determined by the edge detection component 406. Following this example, the modulation component 408 can increase a second height value of a second pixel in the subset by a second increment, where the second pixel is at a distance of two pixels from the edge of the footprint 302 as determined by the edge detection component 406, and where the first increment is greater than the second increment. Further following this example, the modulation component 408 can increase a third height value of a third pixel in the subset by a third increment, where the third pixel is at a distance of three pixels from the edge of the footprint 302 as determined by the edge detection component 406, and where the second increment is greater than the third increment.
With reference to
Turning to
Reference is again made to
Moreover, the ridge creation component 404 can include a movement track component 412 that computes a movement direction of the footprint 302. The subset of the pixels from the paint map 204 identified by the edge detection component 406 and modulated by the modulation component 408 can further be a function of the movement direction of the footprint 302 computed by the movement track component 412. Thus, a ridge can be formed ahead of a leading edge of the footprint 302 as the footprint 302 moves across the computer-implemented canvas 202. By way of example, it is contemplated that a ridge can be formed around the footprint 302 if the footprint 302 is formed based upon a drop of paint being splattered onto the computer-implemented canvas 202 (e.g., the movement direction of the footprint 302 can be considered to be in a direction into the computer-implemented canvas 202 by the movement track component 412); yet, the claimed subject matter is not so limited.
The following example illustrates operation of the ridge creation component 404. The ridge creation component 404 can obtain the footprint 302. Around the edge of the footprint 302, the ridge creation component 404 can modify color values and height values of a subset of pixels in the paint map 204 to simulate the ridge effect. More particularly, a height value and/or a color value of a current pixel (e.g., a pixel from the paint map 204) can be modified at least as a function of a distance from the edge of the footprint 302 determined by the edge detection component 406 as set forth below.
Pursuant to this example, if the edge detection component 406 determines that the current pixel is a level one edge (e.g., the current pixel is at a distance of one pixel from the edge of the footprint 302), then the ridge creation component 404 can evaluate whether the current pixel has paint in the paint map 204. If the current pixel is determined to lack paint in the paint map 204, then the blend component 410 can set a color value of the current pixel to a color value from the footprint 302 and the modulation component 408 can linearly interpolate a height value for the current pixel. Alternatively, if the ridge creation component 404 determines that the current pixel includes paint in the paint map 204, then the modulation component 408 can increase the height value for the current pixel specified in the paint map 204 by a first increment. For instance, the first increment can be 0.06, and a maximum height value can be a footprint height plus 0.4; yet, the claimed subject matter is not so limited. Moreover, the blend component 410 can mix the footprint color with the color value of the current pixel specified in the paint map 204 when the current pixel is determined to include paint in the paint map 204.
Further following the aforementioned example, if the edge detection component 406 determines that the current pixel is a level two edge (e.g., the current pixel is at a distance of two pixels from the edge of the footprint 302), then the ridge creation component 404 can evaluate whether the current pixel has paint in the paint map 204. If the current pixel is determined to lack paint in the paint map 204, then the blend component 410 can set a color value of the current pixel to a color value from the footprint 302 and the modulation component 408 can linearly interpolate a height value for the current pixel. Alternatively, if the ridge creation component 404 determines that the current pixel includes paint in the paint map 204, then the modulation component 408 can increase the height value of the current pixel specified in the paint map 204 by a second increment. For instance, the second increment can be 0.04, and a maximum height value can be a footprint height plus 0.2; however, it is to be appreciated that the claimed subject matter is not so limited. Further, the blend component 410 need not change the color for the current pixel as specified in the paint map 204 when the current pixel is determined to include paint in the paint map 204.
Moreover, in accordance with the continuing example, if the edge detection component 406 determines that the current pixel is a level three edge (e.g., the current pixel is at a distance of three pixels from the edge of the footprint 302), then the ridge creation component 404 can evaluate whether the current pixel has paint in the paint map 204. If the current pixel is determined to lack paint in the paint map 204, then the blend component 410 can set a color value of the current pixel to a color value from the footprint 302 and the modulation component 408 can linearly interpolate a height value for the current pixel. Alternatively, if the ridge creation component 404 determines that the current pixel includes paint in the paint map 204, then the modulation component 408 can increase the height value of the current pixel as specified in the paint map 204 by a third increment. The third increment can be 0.02, and a maximum height value can be a footprint height plus 0.1, for instance; yet, the claimed subject matter is not so limited. Further, the blend component 410 can be inhibited from changing the color value of the current pixel in the paint map 204 when the current pixel is determined to include paint in the paint map 204. It is to be appreciated, however, that the claimed subject matter is not limited to the foregoing example.
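The three-level per-pixel update in the preceding example can be sketched as follows. The increments (0.06, 0.04, 0.02) and the maximum-height offsets (0.4, 0.2, 0.1) come from the example above; the midpoint height interpolation for unpainted pixels, the 50/50 color blend at level one, and the scalar (grayscale) color representation are assumptions for illustration.

```python
# Per-level constants from the example: height increments and caps above
# the footprint height. (The specific numbers are illustrative, per the text.)
INCREMENTS = {1: 0.06, 2: 0.04, 3: 0.02}
MAX_OFFSETS = {1: 0.4, 2: 0.2, 3: 0.1}

def apply_ridge(level, has_paint, pixel_height, footprint_height,
                pixel_color, footprint_color):
    """Sketch of the per-pixel ridge update for a level 1/2/3 edge pixel.

    Returns (new_height, new_color). Colors are scalars here for brevity;
    the midpoint interpolation and 50/50 blend are assumed choices.
    """
    if not has_paint:
        # Pixel lacks paint: take the footprint color and linearly
        # interpolate a height (simple midpoint toward zero here).
        return 0.5 * footprint_height, footprint_color
    # Pixel already has paint: raise its height by the level's increment,
    # capped at the footprint height plus the level's maximum offset.
    cap = footprint_height + MAX_OFFSETS[level]
    new_height = min(pixel_height + INCREMENTS[level], cap)
    if level == 1:
        # Only level-one edge pixels blend in the footprint color.
        new_color = 0.5 * (pixel_color + footprint_color)
    else:
        # Levels two and three leave the existing color unchanged.
        new_color = pixel_color
    return new_height, new_color
```

The decreasing increments with distance are what give the ridge its tapered profile, highest immediately outside the footprint and falling off over the predefined distance.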
As noted above, the movement track component 412 can compute a movement direction of the footprint 302. The movement direction of the footprint 302 can be employed by the edge detection component 406 to mitigate an impact on performance associated with detecting an edge of the footprint 302. For example, the edge detection component 406 can evaluate three directions from a pixel when determining whether the pixel is less than or equal to the predefined distance from the edge of the footprint 302 as opposed to checking a full circular range around the pixel. By way of example, the modulation component 408 can modulate a height value of a pixel that is outside the footprint 302 if and only if the pixel is less than or equal to the predefined distance from the edge of the footprint 302 as determined by the edge detection component 406 in at least one of a first direction opposite the movement direction of the footprint 302 as determined by the movement track component 412, a second direction that is rotated clockwise by a predefined angle from the first direction, or a third direction that is rotated counterclockwise by the predefined angle from the first direction.
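The directional check described above can be sketched as follows: from a candidate pixel outside the footprint, step along the direction opposite the movement direction and along that direction rotated clockwise and counterclockwise by a predefined angle, looking for a footprint pixel within the predefined distance. The 45-degree angle, the rounding of stepped coordinates, and the function name are assumptions for illustration.

```python
import math
import numpy as np

def near_leading_edge(footprint, x, y, move_dir, max_dist=3,
                      angle=math.radians(45)):
    """Check whether pixel (x, y), outside the boolean `footprint` array,
    lies within `max_dist` pixels of the footprint edge along one of three
    directions: opposite the movement direction `move_dir` (a unit (dx, dy)
    vector), or that direction rotated +/- `angle`.
    """
    if footprint[y, x]:
        return False  # only pixels outside the footprint form the ridge
    # Base direction points from the pixel back toward the footprint,
    # i.e., opposite the movement direction.
    ox, oy = -move_dir[0], -move_dir[1]
    for rot in (0.0, angle, -angle):
        c, s = math.cos(rot), math.sin(rot)
        dx, dy = c * ox - s * oy, s * ox + c * oy  # rotate base direction
        for step in range(1, max_dist + 1):
            px = int(round(x + dx * step))
            py = int(round(y + dy * step))
            if (0 <= py < footprint.shape[0]
                    and 0 <= px < footprint.shape[1]
                    and footprint[py, px]):
                return True
    return False
```

Checking three rays instead of a full circular neighborhood keeps the per-pixel cost constant, which is the performance benefit the movement direction provides.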
Reference is again made to
By way of another example, the ridge creation component 404 can inhibit processing the ridge effect when the movement track component 412 detects that the footprint 302 is moving at a speed above a threshold. For instance, when the speed is above the threshold, the oil paint ridge model can be inhibited from being generated; alternatively, when the speed is below the threshold, the oil paint ridge model can be generated by the ridge creation component 404. Yet, it is to be appreciated that the claimed subject matter is not limited to the foregoing example.
In accordance with yet a further example, the movement track component 412 can filter redundant move input data. By way of illustration, in a brush stroke, oftentimes move input data can indicate that a move distance of zero occurred. The movement track component 412 can filter the input data with zero move distance, such that the ridge creation component 404 can be inhibited from generating the oil paint ridge model in response to such input data.
The paint component 210 further includes a drying component 414. The drying component 414 dries the oil paint deposited on the computer-implemented canvas 202. Thus, interaction between the oil paint deposited on the computer-implemented canvas 202 and subsequent oil paint deposits can be dependent upon a state of dryness of the oil paint deposited on the computer-implemented canvas 202 as controlled by the drying component 414. For instance, the ridge creation component 404 can form a ridge in wet oil paint deposited on the computer-implemented canvas 202 (e.g., manipulating such wet oil paint), while being inhibited from forming a ridge in dry oil paint. Moreover, wet oil paint deposited on the computer-implemented canvas 202 can be mixed with oil paint subsequently deposited with the image editing tool, while the subsequently deposited oil paint can be inhibited from being mixed with dry oil paint deposited on the computer-implemented canvas 202. Further, wet oil paint on the computer-implemented canvas 202 can be picked up onto the image editing tool (e.g., causing the pickup map to be altered), while dry oil paint on the computer-implemented canvas 202 can be inhibited from being picked up. While the foregoing examples describe two states of dryness (e.g., wet oil paint, dry oil paint), it is to be appreciated that more than two states of dryness can be employed in accordance with various embodiments.
The drying component 414 can dry the oil paint deposited on the computer-implemented canvas 202 in response to user input (e.g., user selection to dry the oil paint, etc.), for example. Additionally or alternatively, the drying component 414 can dry the oil paint deposited on the computer-implemented canvas 202 after a threshold length of time (e.g., subsequent to depositing the oil paint on the computer-implemented canvas 202). It is to be appreciated, however, that the claimed subject matter is not limited to the foregoing examples.
Turning to
With reference again to
It is further contemplated that disparate oil paint from the image editing tool can be deposited onto the computer-implemented canvas 202 subsequent to the drying component 414 drying the oil paint deposited on the computer-implemented canvas 202. The disparate oil paint can be deposited into the wet layer of the paint map 204 above the updated dry layer based at least in part upon the height values from the updated dry layer. Moreover, the display screen of the computing device can be caused to update an image rendered thereupon with the wet layer (e.g., including the disparate oil paint deposited subsequent to drying) covering the updated dry layer.
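One simple way to model depositing disparate paint into the wet layer above the updated dry layer is sketched below. The additive height model and all names are assumptions for illustration: new paint adds height within the tool footprint, and the composite surface (dry plus wet heights) is what the display renders, so the dry layer's height values shape the rendered result as described above.

```python
import numpy as np

def deposit_above_dry_layer(dry_height, wet_height, wet_color, wet_mask,
                            footprint, paint_color, amount=1.0):
    """Hypothetical sketch: deposit fresh oil paint into the wet layer
    above the updated dry layer. footprint is a boolean per-pixel mask of
    the image editing tool; paint_color is an RGB triple."""
    wet_height = wet_height + np.where(footprint, amount, 0.0)
    wet_color = np.where(footprint[..., None], paint_color, wet_color)
    wet_mask = wet_mask | footprint
    # Composite height: the wet layer covers the updated dry layer.
    surface = dry_height + wet_height
    return wet_height, wet_color, wet_mask, surface
```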
With reference to
As shown in
Moreover, as depicted in
Further, the color values from the dry layer of
Moreover, the color values from the dry layer of
Moreover, the acts described herein may be computer-executable instructions that can be implemented by one or more processors and/or stored on a computer-readable medium or media. The computer-executable instructions can include a routine, a sub-routine, programs, a thread of execution, and/or the like. Still further, results of acts of the methodologies can be stored in a computer-readable medium, displayed on a display device, and/or the like.
Turning to
At 1402, height values from the wet layer and the dry layer can be combined per pixel in the updated dry layer. At 1404, color values from the dry layer can be replaced with color values from the wet layer in the updated dry layer for pixels having color values in the wet layer. At 1406, color values from the dry layer can be maintained in the updated dry layer for pixels lacking color values in the wet layer.
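The acts at 1402 through 1406 can be sketched per pixel as follows. This is a non-limiting illustration: "combined" is taken here to mean summed, which is an assumption, and `wet_mask` is assumed to mark the pixels having color values in the wet layer.

```python
import numpy as np

def combine_into_dry_layer(dry_height, dry_color, wet_height, wet_color,
                           wet_mask):
    """Hypothetical sketch of producing the updated dry layer:
    1402: combine (here, sum) height values per pixel;
    1404: replace dry colors with wet colors where the wet layer has color;
    1406: maintain dry colors where the wet layer lacks color."""
    new_height = dry_height + wet_height
    new_color = np.where(wet_mask[..., None], wet_color, dry_color)
    return new_height, new_color
```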
Referring now to
The computing device 1500 additionally includes a data store 1508 that is accessible by the processor 1502 by way of the system bus 1506. The data store 1508 may include executable instructions, a computer-implemented canvas, a paint map, a pickup map, a footprint, etc. The computing device 1500 also includes an input interface 1510 that allows external devices to communicate with the computing device 1500. For instance, the input interface 1510 may be used to receive instructions from an external computer device, from a user, etc. The computing device 1500 also includes an output interface 1512 that interfaces the computing device 1500 with one or more external devices. For example, the computing device 1500 may display text, images, etc. by way of the output interface 1512.
It is contemplated that the external devices that communicate with the computing device 1500 via the input interface 1510 and the output interface 1512 can be included in an environment that provides substantially any type of user interface with which a user can interact. Examples of user interface types include graphical user interfaces, natural user interfaces, and so forth. For instance, a graphical user interface may accept input from a user employing input device(s) such as a keyboard, mouse, remote control, or the like and provide output on an output device such as a display. Further, a natural user interface may enable a user to interact with the computing device 1500 in a manner free from constraints imposed by input devices such as keyboards, mice, remote controls, and the like. Rather, a natural user interface can rely on speech recognition, touch and stylus recognition, gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, machine intelligence, and so forth.
Additionally, while illustrated as a single system, it is to be understood that the computing device 1500 may be a distributed system. Thus, for instance, several devices may be in communication by way of a network connection and may collectively perform tasks described as being performed by the computing device 1500.
As used herein, the terms “component” and “system” are intended to encompass computer-readable data storage that is configured with computer-executable instructions that cause certain functionality to be performed when executed by a processor. The computer-executable instructions may include a routine, a function, or the like. It is also to be understood that a component or system may be localized on a single device or distributed across several devices.
Further, as used herein, the term “exemplary” is intended to mean “serving as an illustration or example of something.”
Various functions described herein can be implemented in hardware, software, or any combination thereof. If implemented in software, the functions can be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes computer-readable storage media. Computer-readable storage media can be any available storage media that can be accessed by a computer. By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc (BD), where disks usually reproduce data magnetically and discs usually reproduce data optically with lasers. Further, a propagated signal is not included within the scope of computer-readable storage media. Computer-readable media also includes communication media including any medium that facilitates transfer of a computer program from one place to another. A connection, for instance, can be a communication medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio and microwave are included in the definition of communication medium. Combinations of the above should also be included within the scope of computer-readable media.
Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable modification and alteration of the above devices or methodologies for purposes of describing the aforementioned aspects, but one of ordinary skill in the art can recognize that many further modifications and permutations of various aspects are possible. Accordingly, the described aspects are intended to embrace all such alterations, modifications, and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.