This disclosure relates generally to video post-processing and, more particularly, to a method, a device and/or a system of enhancement of a portion of video data rendered on a display unit associated with a data processing device based on tracking movement of an eye of a user thereof.
A data processing device (e.g., a desktop computer, a laptop computer, a notebook computer, a smart television, a smart display, a netbook, a mobile device such as a mobile phone) may render video data on a display unit (e.g., Liquid Crystal Display (LCD), Light Emitting Diode (LED) display) associated therewith. A user of the data processing device may wish to modify a video parameter (e.g., a resolution) associated with the video data in order to enhance a viewing experience thereof. For the aforementioned purpose, the user may have to manually modify the video parameter associated with the video data through a physical intervention on the data processing device. Repeated manual modifications may frustrate the user. Further, as data associated with all onscreen portions of the display unit may be enhanced, the user may suffer eye strain during prolonged onscreen viewing.
Disclosed are a method, a device and/or a system of enhancement of a portion of video data rendered on a display unit associated with a data processing device based on tracking movement of an eye of a user thereof.
In one aspect, a method includes tracking, through a processor of a data processing device in conjunction with a number of sensors, a movement of an eye of a user of the data processing device onscreen on a display unit associated therewith. The processor is communicatively coupled to a memory. The method also includes determining, through the processor, a portion of a video data being rendered onscreen on the display unit on which the eye of the user is focused based on the sensed movement of the eye. Further, the method includes rendering, through the processor, the portion of the video data on the display unit at an enhanced level compared to other portions thereof following the determination of the portion of the video data.
In another aspect, a non-transitory medium, readable through a data processing device and including instructions embodied therein that are executable through the data processing device, is disclosed. The non-transitory medium includes instructions to track, through a processor of the data processing device in conjunction with a number of sensors, a movement of an eye of a user of the data processing device onscreen on a display unit associated therewith. The processor is communicatively coupled to a memory. The non-transitory medium also includes instructions to determine, through the processor, a portion of a video data being rendered onscreen on the display unit on which the eye of the user is focused based on the sensed movement of the eye. Further, the non-transitory medium includes instructions to render, through the processor, the portion of the video data on the display unit at an enhanced level compared to other portions thereof following the determination of the portion of the video data.
In yet another aspect, a data processing system is disclosed. The data processing system includes a data processing device. The data processing device includes a memory, a processor communicatively coupled to the memory, and a number of sensors. The processor is configured to execute instructions to track a movement of an eye of a user of the data processing device onscreen on a display unit associated therewith in conjunction with the number of sensors. The processor is further configured to execute instructions to: determine a portion of a video data being rendered onscreen on the display unit on which the eye of the user is focused based on the sensed movement of the eye, and render the portion of the video data on the display unit at an enhanced level compared to other portions thereof following the determination of the portion of the video data.
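By way of illustration only, the following sketch (in Python) outlines the above sequence of tracking, determining and rendering; the names Region, estimate_gaze_point and enhance_region are hypothetical placeholders introduced for this sketch and are not elements of the disclosure.

```python
# Illustrative sketch only; SensorArray-style objects and the helper names
# estimate_gaze_point() and enhance_region() are hypothetical.

from dataclasses import dataclass


@dataclass
class Region:
    x: int
    y: int
    width: int
    height: int


def track_and_enhance(sensors, display, video_frame, region_size=(320, 240)):
    """Track the user's eye, find the focused portion of the frame, and
    render that portion at an enhanced level relative to the rest."""
    # 1. Track eye movement through the sensors (e.g., image/distance sensors).
    gaze_x, gaze_y = sensors.estimate_gaze_point()

    # 2. Determine the onscreen portion of the video data the eye is focused on.
    w, h = region_size
    portion = Region(x=gaze_x - w // 2, y=gaze_y - h // 2, width=w, height=h)

    # 3. Render the focused portion at an enhanced level (e.g., higher
    #    resolution/contrast) compared to the other portions of the frame.
    enhanced = display.enhance_region(video_frame, portion)
    display.render(enhanced)
```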
The methods and systems disclosed herein may be implemented in any means for achieving various aspects, and may be executed in a form of a non-transitory machine-readable medium embodying a set of instructions that, when executed by a machine, cause the machine to perform any of the operations disclosed herein.
Other features will be apparent from the accompanying drawings and from the detailed description that follows.
The embodiments of this invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
Other features of the present embodiments will be apparent from the accompanying drawings and from the detailed description that follows.
Example embodiments, as described below, may be used to provide a method, a device and/or a system of enhancement of a portion of video data rendered on a display unit associated with a data processing device based on tracking movement of an eye of a user thereof. Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments.
In one or more embodiments, memory 104 of data processing device 100 may include video data 116 therein (e.g., video data 116 may be downloaded and locally stored in memory 104, or video data 116, such as a video stream or a file including video, audio and/or text content therein, may be transmitted from a data source). In one or more embodiments, processor 102 may perform appropriate processing (e.g., data conversion) on video data 116 to enable rendering thereof on a display unit 112 associated with data processing device 100.
In one or more alternate embodiments, post-processing engine 130 may be part of decoder engine 120.
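Purely as an illustrative sketch, the following outlines one way decoder engine 120 and post-processing engine 130 could be arranged in software; the class and method names are assumptions made for this sketch and do not reflect an actual implementation of the embodiments.

```python
# Hypothetical arrangement of decoder engine 120 and post-processing engine 130;
# all names and methods here are illustrative only.

class PostProcessingEngine:
    def enhance(self, frame, portion):
        # e.g., upscale/sharpen only the focused portion of the frame
        return frame  # placeholder for actual per-region processing


class DecoderEngine:
    def __init__(self, post_processor=None):
        # In one arrangement the post-processing engine is a separate stage;
        # in an alternate arrangement it is embedded in the decoder engine.
        self.post_processor = post_processor or PostProcessingEngine()

    def decode(self, encoded_frame):
        ...  # data conversion appropriate for rendering on display unit 112

    def decode_and_render(self, encoded_frame, portion, display):
        frame = self.decode(encoded_frame)
        frame = self.post_processor.enhance(frame, portion)
        display.render(frame)
```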
In one or more embodiments, as mentioned above, data processing device 100 may include a number of sensors 124₁₋₉ associated therewith to track an eye movement of a user 150 thereof.
In one or more embodiments, sensors 124₁₋₉ may be configured to track the eye movement of user 150 as discussed above.
Also, it should be noted that other forms of sensors 124₁₋₅ and/or sensors 124₆₋₉ (e.g., sensors based on heat mapping, other forms of distance sensing/eye movement tracking) are within the scope of the exemplary embodiments discussed herein.
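The following is a simplified, assumed sketch of how readings from camera-based sensors (e.g., sensors 124₁₋₅) and distance sensors (e.g., sensors 124₆₋₉) might be fused into an onscreen gaze estimate; the equal-weight averaging shown is illustrative only and is not the disclosed tracking method.

```python
# Illustrative fusion of camera-based sensors (e.g., 124₁₋₅) and distance
# sensors (e.g., 124₆₋₉); the averaging scheme is an assumption for
# demonstration, not the disclosed tracking method.

def estimate_gaze(camera_sensors, distance_sensors):
    """Return an (x, y) onscreen gaze estimate in pixels plus viewing distance."""
    # Camera-based sensors report candidate pupil positions mapped to screen space.
    candidates = [s.read_gaze_candidate() for s in camera_sensors]

    # Distance sensors help scale/validate the mapping (e.g., viewing distance).
    distance_mm = sum(s.read_distance_mm() for s in distance_sensors) / len(distance_sensors)

    # Simple equal-weight average of candidates; a real tracker would
    # calibrate per user and reject outliers.
    x = sum(c[0] for c in candidates) / len(candidates)
    y = sum(c[1] for c in candidates) / len(candidates)
    return (x, y), distance_mm
```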
In one or more embodiments, once portion 252 is determined, processor 102 may be configured to adjust/enhance one or more video parameter(s) 140 (e.g., a resolution, color/contrast adjustment) associated with portion 252. In one or more embodiments, processor 102 may then be configured to enable rendering video data 116 on display unit 112 with adjusted/enhanced portion 252 thereon. It should be noted that the scope of the exemplary embodiments discussed herein is not limited to enhancing/adjusting video parameter(s) 140 as discussed above. In an example embodiment, portion 252 may be rendered in a normal mode of operation and other portions of video data 116 may be rendered at a reduced level. Such variations are within the scope of the exemplary embodiments discussed herein. Further, rendering portion 252 at an enhanced level may include processing associated with increasing an intensity level of a backlight 164 of display unit 112 on the corresponding area/portion of the "screen" thereof.
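As an illustrative sketch only, the following shows per-region adjustment of a frame in which portion 252 is rendered at an enhanced level while other portions are rendered at a reduced level; the gain values and the NumPy frame representation are assumptions made for this sketch.

```python
import numpy as np

# Sketch of per-region parameter adjustment; gain values and the uint8
# NumPy frame representation are assumptions for illustration only.

def render_with_enhanced_portion(frame, portion, enhance_gain=1.25, dim_gain=0.85):
    """Boost brightness/contrast inside `portion` and reduce the rest.

    frame: HxWx3 uint8 array; portion: (x, y, w, h) in pixels.
    """
    x, y, w, h = portion
    out = frame.astype(np.float32) * dim_gain            # other portions at a reduced level
    region = frame[y:y + h, x:x + w].astype(np.float32)
    out[y:y + h, x:x + w] = region * enhance_gain        # focused portion at an enhanced level
    return np.clip(out, 0, 255).astype(np.uint8)
```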
In one or more embodiments, the eye movement tracking, the determination of portion 252 and/or the rendering of portion 252 at an adjusted/enhanced level may be initiated through a driver component 302 (e.g., a set of instructions) associated with processor 102, display unit 112 and/or sensors 124₁₋₉.
Also, user 150 may initiate the abovementioned processes through a physical button provided on data processing device 100 and/or a user interface of an application (e.g., multimedia application 196 shown as being part of memory 104) executing on data processing device 100. In one or more embodiments, driver component 302 may be packaged with operating system 188 (e.g., again, shown as being part of memory 104) executing on data processing device 100 and/or multimedia application 196. Further, instructions associated with driver component 302 and/or the processes discussed above may be tangibly embodied on a non-transitory medium (e.g., a Compact Disc (CD), a Digital Video Disc (DVD), a Blu-ray Disc®, a hard drive; appropriate instructions may be downloaded to the hard drive) readable through data processing device 100.
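A minimal, assumed sketch of an interface that driver component 302 might expose so that a physical button handler, operating system 188 and/or multimedia application 196 could initiate or stop the processes discussed above is shown below; all names are illustrative and not drawn from the disclosure.

```python
# Hypothetical interface for driver component 302; names are illustrative only.

class EyeTrackingDriver:
    def __init__(self, sensors, processor_callback):
        self.sensors = sensors
        self._on_gaze = processor_callback   # invoked with (x, y) gaze samples
        self._enabled = False

    def enable(self):
        """Called, e.g., from a physical button handler or an application UI."""
        self._enabled = True
        self.sensors.start(self._on_gaze)

    def disable(self):
        self._enabled = False
        self.sensors.stop()
```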
An example scenario in which concepts discussed herein may be applicable includes utilizing sensors 124₁₋₉ for eye movement tracking and dynamically increasing pixel brightness solely for the pixels corresponding to the portion onscreen on which eye 202 of user 150 is focused (the brightness of other portions may be maintained/reduced). It should be noted that rendering portion 252 on display unit 112 at an enhanced level as discussed above may also include automatically providing user 150 a capability to perform operations on onscreen portion 274 (e.g., selecting a text corresponding to onscreen portion 274, cut/copy/paste actions associated therewith, modifying font size) based on tracking the eye movement thereof, without a requirement on the part of user 150 to physically intervene on data processing device 100. For example, user 150 may stare at the screen of display unit 112 for a time exceeding a threshold (e.g., 5 seconds); the aforementioned action may be predefined (e.g., through processor 102) as corresponding to an operation on onscreen portion 274 (e.g., selecting a text corresponding to onscreen portion 274). Once the abovementioned eye movement is tracked, the corresponding onscreen portion 274 may be determined and the operation performed thereon automatically without the requirement of physical intervention on the part of user 150.
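The following sketch illustrates, under assumed names, how a dwell-time threshold (e.g., the 5 seconds mentioned above) could be detected and mapped to an operation on onscreen portion 274; the mapping shown is an example for illustration, not the disclosed mechanism.

```python
import time

# Dwell-based trigger sketch; the 5-second threshold follows the example
# above, while the class and the action mapping are assumptions.

class DwellTrigger:
    def __init__(self, threshold_s=5.0, radius_px=40):
        self.threshold_s = threshold_s
        self.radius_px = radius_px
        self._anchor = None
        self._start = None

    def update(self, gaze_xy, on_dwell):
        """Call once per gaze sample; invokes on_dwell(center) after a steady stare."""
        now = time.monotonic()
        if self._anchor is None or self._moved(gaze_xy):
            self._anchor, self._start = gaze_xy, now
        elif now - self._start >= self.threshold_s:
            on_dwell(self._anchor)          # e.g., select text in onscreen portion 274
            self._start = now               # re-arm for the next dwell

    def _moved(self, gaze_xy):
        ax, ay = self._anchor
        gx, gy = gaze_xy
        return (gx - ax) ** 2 + (gy - ay) ** 2 > self.radius_px ** 2
```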
Although the present embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the various embodiments. For example, the various devices and modules described herein may be enabled and operated using hardware circuitry (e.g., CMOS based logic circuitry), firmware, software or any combination of hardware, firmware, and software (e.g., embodied in a non-transitory machine-readable medium). For example, the various electrical structure and methods may be embodied using transistors, logic gates, and electrical circuits (e.g., application specific integrated (ASIC) circuitry and/or Digital Signal Processor (DSP) circuitry).
In addition, it will be appreciated that the various operations, processes and methods disclosed herein may be embodied in a non-transitory machine-readable medium and/or a machine-accessible medium compatible with a data processing system (e.g., data processing device 100). Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.