Electronic ink is typically drawn or written on a touch screen computing device such as an electronic white board, a tablet computer, a smart phone or other touch screen device. Facilitating input of electronic ink to such touch screen computing devices is an ongoing issue.
The embodiments described below are not limited to implementations which solve any or all of the disadvantages of known electronic ink systems.
The following presents a simplified summary of the disclosure in order to provide a basic understanding to the reader. This summary is not intended to identify key features or essential features of the claimed subject matter nor is it intended to be used to limit the scope of the claimed subject matter. Its sole purpose is to present a selection of concepts disclosed herein in a simplified form as a prelude to the more detailed description that is presented later.
A computing device is described which has a media content viewer configured to render content comprising an image or a video at a display associated with the computing device. The computing device has a processor configured to operate in at least an ink control mode and an ink authoring mode. In the ink control mode at least one ink control element is rendered over a display of the content, the ink control element being selectable by a user to change settings for electronic inking over the content. In the ink authoring mode user input generates electronic ink over the content. An ink mode controller is configured to monitor for and detect specified conditions, and when the specified conditions are detected, to trigger transitioning of the processor between the ink control mode and the ink authoring mode.
Many of the attendant features will be more readily appreciated as the same becomes better understood by reference to the following detailed description considered in connection with the accompanying drawings.
The present description will be better understood from the following detailed description read in light of the accompanying drawings, wherein:
Like reference numerals are used to designate like parts in the accompanying drawings.
The detailed description provided below in connection with the appended drawings is intended as a description of the present examples and is not intended to represent the only forms in which the present examples are constructed or utilized. The description sets forth the functions of the examples and the sequence of operations for constructing and operating the examples. However, the same or equivalent functions and sequences may be accomplished by different examples.
Electronic ink is a form of digital writing or drawing, which is akin to conventional ink writing or drawing, and which is made by manual user input at a user interface of a computing device. The term “ink stroke” is used herein to refer to an electronic ink stroke. A non-exhaustive list of examples of types of manual user input for electronic inking is: mouse input, touch screen input, pointing user input, stylus input. In the case of mouse input a user operates a relative position indicator such as a conventional computer mouse to control a position indicator such as a cursor on a graphical user interface of a computer. As the user clicks and drags the mouse an electronic ink stroke is input to the computer. The electronic ink stroke ends when the user stops the click action. As the user repeats the clicking and dragging operation, further electronic ink strokes are created.
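The click-and-drag behaviour described above can be sketched as a small event-grouping routine. This is an illustrative sketch only: the event names (`press`, `drag`, `release`) and the `Stroke` class are assumptions, not part of the original description.

```python
# Minimal sketch: grouping mouse events into electronic ink strokes.
# Event tuples (kind, x, y) and the Stroke class are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Stroke:
    points: list = field(default_factory=list)

def group_into_strokes(events):
    """Group (kind, x, y) mouse events into strokes.

    A stroke begins on 'press', grows on 'drag', and ends on 'release',
    mirroring the click-and-drag behaviour described above.
    """
    strokes = []
    current = None
    for kind, x, y in events:
        if kind == "press":
            current = Stroke(points=[(x, y)])
        elif kind == "drag" and current is not None:
            current.points.append((x, y))
        elif kind == "release" and current is not None:
            strokes.append(current)
            current = None
    return strokes
```

Repeating the press-drag-release sequence yields further strokes, one per release, as the paragraph above describes.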
In the case of touch screen input, a user touches a touch screen with his or her finger and drags his or her finger along the touch screen to create an electronic ink stroke. When the finger loses contact with the touch screen the electronic ink stroke ends. Where the touch screen is hover sensitive, the finger creates electronic ink when it is within a hover sensitive range of the touch screen. The term “touch screen” is used herein to include hover sensitive screens.
In the case of pointing user input, cameras capture images of a user and a computing system computes from the image data either an absolute pointing direction of the user's finger or a relative pointing position. Using the captured images, tracked pointing direction and/or relative pointing position are computed and used to input electronic ink strokes. A stroke ends when pointing is not detected. Pointing user input for electronic ink is described in more detail below with reference to
In the case of stylus input a user holds a conventional pencil or pen, or a passive stylus (which is powered parasitically by a touch sensor panel), or an active stylus which has its own power supply. The stylus touches or hovers over a display and is used to input electronic ink strokes to an associated computing device. A touch sensor panel detects the conventional pencil and computes positions on the touch sensor panel where the panel is activated by the pencil. The touch sensor panel is a hover sensitive panel in some cases. The positions are stored as an electronic ink stroke where there are a plurality of positions in a substantially continuous sequence and the stroke ends when input at the sensor panel is no longer detected. In the case of a stylus, the positions are the position of a tip of the stylus computed by the stylus itself or by a combination of the stylus and the touch sensor panel. The stylus is either passive or active.
In various examples described herein, input of electronic ink to a computing device in order to annotate videos or images is facilitated by using an ink mode controller. The ink mode controller controls automatic transitions between at least two modes of the computing device. A mode of a computing device is a state of the computing device in which the computing device is configured to operate in a specified manner, such as to react to user input or other events according to specified rules or operations. In various examples described herein a computing device has at least two modes which are an ink authoring mode and an ink control mode. In some examples, there are three modes which are an ink authoring mode, an ink control mode and a viewing mode. More than three modes are used in some cases.
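The three modes named above can be modelled, purely for illustration, as an enumeration. The enum and its member names are assumptions for the sketch, not terms fixed by the description.

```python
# Illustrative only: the three modes described above, modelled as an enum.
from enum import Enum, auto

class InkMode(Enum):
    VIEWING = auto()        # content plays; viewing controls are shown
    INK_CONTROL = auto()    # ink control elements rendered over the content
    INK_AUTHORING = auto()  # user input generates electronic ink over the content

# On start-up of the media content viewer the device enters viewing mode.
current_mode = InkMode.VIEWING
```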
By automatically transitioning between the modes an end user does not need to manually switch the modes and this reduces burden on the user. The user is able to maintain focus and attention on the drawing of the electronic ink rather than having to manually switch modes. The risk of error in inputting electronic ink to the computing device is also reduced, because there is less risk of user input being wrongly detected as something other than electronic ink, or being detected as electronic ink when it is in fact intended as other user input. In some examples electronic ink is detectable over a greater proportion of the area of a display associated with the computing device in at least one of the modes as compared with at least one other of the modes.
The computing device 100 comprises an ink mode controller 102, media content viewer 104, one or more ink applications 106, at least one processor 110, a memory 112 and a store 108 holding one or more overlays for videos or images. The computing device 100 is in communication with at least one user input device 114 such as stylus 118, computer mouse, camera for detecting pointing user input, touch panel sensor or other user input device 114.
The media content viewer 104 is functionality to view media content such as videos or images. In examples using video, the media content viewer 104 comprises a video player which is computer-implemented functionality to receive a video and render the video at a display associated with the computing device 100. The video player may comprise a video decoder which decodes compressed, encoded video prior to rendering of the video at the display. The video player has an associated user interface to enable an end user to control playing of videos. The media content viewer 104 comprises an image viewer in examples where images are annotated with electronic ink. An image viewer is computer implemented functionality to render an image on a display.
The one or more ink applications 106 comprise software to enable authoring of electronic ink.
The ink mode controller 102 is functionality to control which of a plurality of modes the computing device is in at any one time. A method of operation at the ink mode controller 102 is described below with reference to
In the example of
In the example of
In some examples, some or all of the functionality of the ink mode controller 102, video/image overlay store 108, media content viewer 104 and ink applications 106 is located at a remote computing entity in communication with the computing device 100 via a communications network.
Alternatively, or in addition, the functionality of the ink mode controller 102 and media content viewer 104 described herein is performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that are optionally used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), Graphics Processing Units (GPUs).
Operation of the computing device 100 is now described with respect to particular examples depicted in
In the example of
When the media content viewer 104 is started up the computing device 100 enters a viewing mode 900. The double lines around the viewing mode state in
A processor of the computing device 100 is configured to operate in a viewing mode 900 in which the content is played by the media content viewer 104 according to user input received at viewing controls rendered over the content by the processor. The ink mode controller 102 is configured to monitor for and detect specified conditions, and when the specified conditions are detected, to trigger transitioning of the processor between the viewing mode 900 and the ink control mode 902.
The ink mode controller 102 is configured to trigger transitioning of the processor from the viewing mode 900 to the ink control mode 902 if the specified conditions comprise detection of an event comprising at least one of: a stylus touch on the display, a finger touch on a touch screen associated with the display, a mouse click, a pointing user input computed from images of a user operating the computing device, a voice command, selection of an ink control mode icon.
The detected event has an associated location on the display which is anywhere on the display outside the viewing controls.
The ink mode controller 102 is configured to trigger transitioning of the processor from the ink control mode 902 to the viewing mode 900 if the specified conditions comprise detection of an event comprising at least one of: a voice command, selection of a viewing mode icon.
The ink mode controller 102 is configured to trigger transitioning of the processor of the computing device 100 between ink control mode 902 and ink authoring mode 904 when it detects conditions which indicate commencement of electronic inking or cessation of electronic inking.
The ink mode controller 102 is configured to trigger transitioning of the processor from the ink control mode 902 to the ink authoring mode 904 on detection of an event comprising at least one of: a stylus touch on the display, a finger touch on a touch screen associated with the display, a mouse click, a pointing user input computed from images of a user operating the computing device, a voice command, selection of an ink authoring mode icon.
The ink mode controller 102 is configured to retain the ink control mode 902 in the case that the detected event occurs in the ink control element, such as where a user touches an ink control element.
In some cases the detected event comprises a location on the display which is outside the ink control element and so the detected event is able to trigger transitioning from the ink control mode 902 to the ink authoring mode 904.
In some cases the ink mode controller 102 is configured to trigger transitioning of the processor from the ink authoring mode 904 to the ink control mode 902 on detection of a time interval during which there is an absence of electronic ink input.
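The transition rules set out in the preceding paragraphs can be sketched as a single transition function. This is a non-authoritative sketch under stated assumptions: the event strings, the `in_viewing_controls` and `in_ink_control_element` flags, and the `ink_idle_timeout` event are illustrative names standing in for the detected events and display locations described above.

```python
# Sketch of the mode transitions described above. Event names and the
# location flags are illustrative assumptions, not fixed terminology.
from enum import Enum, auto

class Mode(Enum):
    VIEWING = auto()
    INK_CONTROL = auto()
    INK_AUTHORING = auto()

# Events such as a stylus touch, finger touch, mouse click or pointing input.
INPUT_EVENTS = {"stylus_touch", "finger_touch", "mouse_click", "pointing_input"}

def next_mode(mode, event, *, in_viewing_controls=False, in_ink_control_element=False):
    """Return the next mode given the current mode and a detected event."""
    if mode is Mode.VIEWING:
        # Input anywhere outside the viewing controls enters ink control mode.
        if event in INPUT_EVENTS and not in_viewing_controls:
            return Mode.INK_CONTROL
    elif mode is Mode.INK_CONTROL:
        # A voice command or viewing mode icon selection returns to viewing.
        if event == "viewing_mode_command":
            return Mode.VIEWING
        # Input outside the ink control element commences inking; input
        # inside the ink control element retains ink control mode.
        if event in INPUT_EVENTS and not in_ink_control_element:
            return Mode.INK_AUTHORING
    elif mode is Mode.INK_AUTHORING:
        # A time interval with no electronic ink input returns to ink control.
        if event == "ink_idle_timeout":
            return Mode.INK_CONTROL
    return mode  # otherwise the current mode is retained
```

For example, a finger touch on an ink control element retains ink control mode, whereas the same touch elsewhere on the display triggers ink authoring mode.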
In some cases, in the ink control mode, a processor of the computing device 100 is configured to render at least one ink control element over the content such that part of the content is obscured.
In some cases, the processor is configured, in the ink control mode 902, to render a video timeline control over the content, where the content comprises a video.
In some cases the processor is configured, in the ink authoring mode 904, to render electronic ink over the locations on the display which are occupied by the ink control element(s) during the ink control mode 902.
The ink mode controller 102 monitors 1012 conditions such as times, user input events, processor events, operating system events or other events and checks 1014 whether to enter ink authoring mode or not. If the ink mode controller 102 decides not to enter ink authoring mode it checks 1024 whether to return to viewing mode. If so, the process returns to operation 1000. If not, the process moves to operation 1012.
If the ink mode controller decides at check 1014 to move to ink authoring mode, the processor is configured to stop 1016 rendering the ink controls, and where applicable, the video timeline. In ink authoring mode the processor detects user input and applies 1018 electronic ink as a result. The electronic ink is stored in an overlay to be superimposed or rendered over the content. The ink mode controller monitors conditions at operation 1020 and according to checks 1022 on those conditions it decides whether to return to ink control mode or to remain in ink authoring mode. If there is a return to ink control mode the process moves to operation 1008. If there is a decision to remain in ink authoring mode the process moves to operation 1016.
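The overlay described above, in which electronic ink is stored for rendering over the content, might be structured as a simple per-content store. The class and method names below are illustrative assumptions about one possible shape of such an overlay store.

```python
# Illustrative sketch of an overlay store holding ink strokes per content
# item, an assumption about how the overlays described above might be kept.
from collections import defaultdict

class OverlayStore:
    def __init__(self):
        self._overlays = defaultdict(list)  # content id -> list of strokes

    def add_stroke(self, content_id, stroke):
        """Record one electronic ink stroke against a video or image."""
        self._overlays[content_id].append(stroke)

    def overlay_for(self, content_id):
        """Strokes to superimpose when the content is rendered."""
        return list(self._overlays[content_id])
```

Keeping the ink in a separate overlay, rather than modifying the video or image itself, means the same content can be rendered with or without its annotations.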
Computing-based device 1100 comprises one or more processors 1102 which are microprocessors, controllers or any other suitable type of processors for processing computer executable instructions to control the operation of the device in order to control which mode the computing-based device is in and to control transitions between modes, in order to facilitate input of electronic ink to the computing-based device 1100. In some examples, for example where a system on a chip architecture is used, the processors 1102 include one or more fixed function blocks (also referred to as accelerators) which implement a part of the method of
The computer executable instructions are provided using any computer-readable media that is accessible by computing based device 1100. Computer-readable media includes, for example, computer storage media such as memory 1112 and communications media. Computer storage media, such as memory 1112, includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or the like. Computer storage media includes, but is not limited to, random access memory (RAM), read only memory (ROM), erasable programmable read only memory (EPROM), electronic erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that is used to store information for access by a computing device. In contrast, communication media embody computer readable instructions, data structures, program modules, or the like in a modulated data signal, such as a carrier wave, or other transport mechanism. As defined herein, computer storage media does not include communication media. Therefore, a computer storage medium should not be interpreted to be a propagating signal per se. Although the computer storage media (memory 1112) is shown within the computing-based device 1100 it will be appreciated that the storage is, in some examples, distributed or located remotely and accessed via a network or other communication link (e.g. using communication interface 1104). Memory 1112 includes an overlay store 1124 for video/image overlays where the overlays comprise electronic ink to be superimposed on a video or image.
The computing-based device 1100 also comprises an input/output controller 1106 arranged to output display information to a display device 1108 which may be separate from or integral to the computing-based device 1100. The display information may provide a graphical user interface. The input/output controller 1106 is also arranged to receive and process input from one or more devices, such as a user input device 1110 (e.g. a touch panel sensor, stylus, mouse, keyboard, camera, microphone or other sensor). In some examples the user input device 1110 detects voice input, user gestures or other user actions and provides a natural user interface (NUI). This user input may be used to author electronic ink, view content, select ink controls, play videos with electronic ink overlays and for other purposes. In an embodiment the display device 1108 also acts as the user input device 1110 if it is a touch sensitive display device. The input/output controller 1106 outputs data to devices other than the display device in some examples, e.g. a locally connected printing device.
Any of the input/output controller 1106, display device 1108 and the user input device 1110 may comprise natural user interface (NUI) technology which enables a user to interact with the computing-based device in a natural manner, free from artificial constraints imposed by input devices such as mice, keyboards, remote controls and the like. Examples of NUI technology that are provided in some examples include but are not limited to those relying on voice and/or speech recognition, touch and/or stylus recognition (touch sensitive displays), gesture recognition both on screen and adjacent to the screen, air gestures, head and eye tracking, voice and speech, vision, touch, gestures, and machine intelligence. Other examples of NUI technology that are used in some examples include intention and goal understanding systems, motion gesture detection systems using depth cameras (such as stereoscopic camera systems, infrared camera systems, red green blue (rgb) camera systems and combinations of these), motion gesture detection using accelerometers/gyroscopes, facial recognition, three dimensional (3D) displays, head, eye and gaze tracking, immersive augmented reality and virtual reality systems and technologies for sensing brain activity using electric field sensing electrodes (electro encephalogram (EEG) and related methods).
Alternatively or in addition to the other examples described herein, examples include any combination of the following:
A computing device comprising:
The computing device described above wherein the specified conditions comprise conditions which indicate commencement of electronic inking or cessation of electronic inking.
The computing device described above wherein the ink mode controller is configured to trigger transitioning of the processor from the ink control mode to the ink authoring mode on detection of an event comprising at least one of: a stylus touch on the display, a finger touch on a touch screen associated with the display, a mouse click, a pointing user input computed from images of a user operating the computing device, a voice command.
The computing device described above wherein the ink mode controller is configured to retain the ink control mode in the case that the detected event occurs in the ink control element.
The computing device described above wherein the detected event comprises a location on the display which is outside the ink control element.
The computing device described above wherein the ink mode controller is configured to trigger transitioning of the processor from the ink authoring mode to the ink control mode on detection of a time interval during which there is an absence of electronic ink input.
The computing device described above wherein the processor is configured, in the ink control mode, to render the at least one ink control element over the content such that part of the content is obscured.
The computing device described above wherein the processor is configured, in the ink control mode, to render a video timeline control over the content, where the content comprises a video.
The computing device described above wherein the processor is configured, in the ink authoring mode, to render electronic ink over the locations on the display which are occupied by the ink control element(s) during the ink control mode.
The computing device described above wherein the processor is configured to operate in a viewing mode in which the content is played by the media content viewer according to user input received at viewing controls rendered over the content by the processor.
The computing device described above wherein the ink mode controller is configured to monitor for and detect specified conditions, and when the specified conditions are detected, to trigger transitioning of the processor between the viewing mode and the ink control mode.
The computing device described above wherein the ink mode controller is configured to trigger transitioning of the processor from the viewing mode to the ink control mode if the specified conditions comprise detection of an event comprising at least one of: a stylus touch on the display, a finger touch on a touch screen associated with the display, a mouse click, a pointing user input computed from images of a user operating the computing device, a voice command.
The computing device described above wherein the detected event has an associated location on the display which is anywhere on the display outside the viewing controls.
The computing device described above wherein the ink mode controller is configured to trigger transitioning of the processor from the ink control mode to the viewing mode if the specified conditions comprise detection of an event comprising at least one of: a voice command, a user input event associated with a specified location on the display.
A computer-implemented method comprising:
The method described above comprising triggering transitioning of the processor from the ink control mode to the ink authoring mode on detection of an event comprising at least one of: a stylus touch on the display, a finger touch on a touch screen associated with the display, a mouse click, a pointing user input computed from images of a user operating the computing device, a voice command.
The method described above comprising retaining the ink control mode in the case that the detected event occurs in the ink control element.
The method described above comprising, in the ink control mode, rendering the at least one ink control element over the content such that part of the content is obscured.
The method described above comprising, in the ink authoring mode, rendering electronic ink over the locations on the display which are occupied by the ink control element(s) during the ink control mode.
A computing device comprising:
A computing device comprising:
For example, the means for rendering content is the media content viewer 104, the means for operating a processor is the ink mode controller 102 and the means for monitoring and detecting specified conditions is the ink mode controller.
The term ‘computer’ or ‘computing-based device’ is used herein to refer to any device with processing capability such that it executes instructions. Those skilled in the art will realize that such processing capabilities are incorporated into many different devices and therefore the terms ‘computer’ and ‘computing-based device’ each include personal computers (PCs), servers, mobile telephones (including smart phones), tablet computers, set-top boxes, media players, games consoles, personal digital assistants, wearable computers, and many other devices.
The methods described herein are performed, in some examples, by software in machine readable form on a tangible storage medium e.g. in the form of a computer program comprising computer program code means adapted to perform all the operations of one or more of the methods described herein when the program is run on a computer and where the computer program may be embodied on a computer readable medium. The software is suitable for execution on a parallel processor or a serial processor such that the method operations may be carried out in any suitable order, or simultaneously.
This acknowledges that software is a valuable, separately tradable commodity. It is intended to encompass software, which runs on or controls “dumb” or standard hardware, to carry out the desired functions. It is also intended to encompass software which “describes” or defines the configuration of hardware, such as HDL (hardware description language) software, as is used for designing silicon chips, or for configuring universal programmable chips, to carry out desired functions.
Those skilled in the art will realize that storage devices utilized to store program instructions are optionally distributed across a network. For example, a remote computer is able to store an example of the process described as software. A local or terminal computer is able to access the remote computer and download a part or all of the software to run the program. Alternatively, the local computer may download pieces of the software as needed, or execute some software instructions at the local terminal and some at the remote computer (or computer network). Those skilled in the art will also realize that by utilizing conventional techniques known to those skilled in the art that all, or a portion of the software instructions may be carried out by a dedicated circuit, such as a digital signal processor (DSP), programmable logic array, or the like.
Any range or device value given herein may be extended or altered without losing the effect sought, as will be apparent to the skilled person.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
It will be understood that the benefits and advantages described above may relate to one embodiment or may relate to several embodiments. The embodiments are not limited to those that solve any or all of the stated problems or those that have any or all of the stated benefits and advantages. It will further be understood that reference to ‘an’ item refers to one or more of those items.
The operations of the methods described herein may be carried out in any suitable order, or simultaneously where appropriate. Additionally, individual blocks may be deleted from any of the methods without departing from the scope of the subject matter described herein. Aspects of any of the examples described above may be combined with aspects of any of the other examples described to form further examples without losing the effect sought.
The term ‘comprising’ is used herein to mean including the method blocks or elements identified, but that such blocks or elements do not comprise an exclusive list and a method or apparatus may contain additional blocks or elements.
The term ‘subset’ is used herein to refer to a proper subset such that a subset of a set does not comprise all the elements of the set (i.e. at least one of the elements of the set is missing from the subset).
It will be understood that the above description is given by way of example only and that various modifications may be made by those skilled in the art. The above specification, examples and data provide a complete description of the structure and use of exemplary embodiments. Although various embodiments have been described above with a certain degree of particularity, or with reference to one or more individual embodiments, those skilled in the art could make numerous alterations to the disclosed embodiments without departing from the scope of this specification.