Embodiments described herein relate generally to an image processing apparatus.
Digital televisions (DTVs) with curved displays have recently been developed. DTVs with curved displays can provide more realistic images than DTVs with flat displays. However, DTVs with curved displays require a larger installation area than DTVs with flat displays, and are difficult to hang on a wall. Further, since DTVs with curved displays are fixed in shape, the position that realizes optimal viewing is also fixed.
A general architecture that implements the various features of the embodiments will now be described with reference to the drawings. The drawings and the associated descriptions are provided to illustrate the embodiments and not to limit the scope of the invention.
In general, according to one embodiment, an image processing apparatus comprises: a display formed of a flat panel and configured to display a video image; and a display controller configured to generate a curved image and output, to the display, a display signal for displaying the curved image. The curved image is obtained by reducing and deforming an input image included in an input video signal in accordance with a horizontal position in the input image, so as to curve the input image vertically. The display controller reduces the input image by a maximum reduction ratio at a predetermined horizontal position, and by a smaller reduction ratio at a horizontal position remoter from the predetermined horizontal position.
Embodiments will be described with reference to the accompanying drawings.
The image processing apparatus 10 receives TV programs of terrestrial, cable, BS and CS broadcasts, etc. The image processing apparatus 10 is connected to an external device wirelessly or via a fixed line to transmit and receive information. For instance, the image processing apparatus 10 may be connected to the Internet.
The image processing apparatus 10 of the first embodiment comprises an input unit 111, a signal processor 112, a system controller (controller) 113, a video processor 114, a display 115, a voice processor 116, a voice output unit 117, an operation unit 119, a receiver 120, a communication interface 121, a network controller 122, a USB interface 123, an HDMI interface 124, a storage unit 125 and a detector 300. The image processing apparatus 10 is communicable with a remote controller 302 and connected to a communication unit 304.
The input unit 111 comprises an antenna for receiving broadcasts, tuners for selecting received signals, a descrambler for pre-processing programs, etc. The input unit 111 is connected to the antenna to receive programs from broadcast enterprises via airwaves. Further, the input unit 111 receives programs from delivery enterprises via a network. The input unit 111 receives a broadcast stream (broadcast signal), selects one or more broadcast programs, and converts the stream into a broadcast stream usable in the signal processor 112. The input unit 111 sends all received programs of predetermined channels to the signal processor 112.
The signal processor 112 separates program attendant information multiplexed in the received broadcast signal, and outputs the separated program attendant information to the video processor 114 (video decoder 241). The signal processor 112 also outputs a recording stream to the controller 113 described later. The recording stream is information obtained by separating, in the signal processor 112, the program attendant information from the broadcast stream received by the input unit 111. The signal processor 112 separates the broadcast signal sent from the input unit 111 into a video signal, an audio signal and control information. The signal processor 112 outputs the video signal to the video processor 114, and outputs the audio signal to the voice processor 116. At this time, if externally input signals received from the receiver 120 are, for example, video and audio signals output from a video camera, separation by the signal processor 112 may not be needed.
The system controller (controller) 113 controls each element of the image processing apparatus 10. Namely, the controller 113 controls the input unit 111, the signal processor 112, the video processor 114, the display 115, the voice processor 116, the voice output unit 117, the operation unit 119, the receiver 120, the communication interface (I/F) 121, the network controller 122, the USB interface (I/F) 123, the HDMI interface 124, the storage unit 125 and the detector 300. The controller 113 outputs various control commands corresponding to input signals (operation instruction signals) received by the receiver 120, described later, from the remote controller 302 or a mobile terminal, such as a smartphone, a mobile phone, a tablet or a notebook PC. The control commands instruct, for example, recording of a TV broadcast (program) and replay of recorded content (a program).
The controller 113 comprises a position detector 131, an observation distance measuring unit 132, a ROM 134, a RAM 135 and an NVM 136.
The position detector 131 detects the position of an object existing in a predetermined area, identifies the type and/or state of the detected object, and outputs, as position information, information indicative of the detected and identified object.
The position detector 131 automatically detects the position of an object based on a signal from the controller 113. Further, the position detector 131 can detect the position of an object at arbitrary timing. The predetermined area means a preset area or an automatically set area. The preset area is, for example, an area that can be detected by the detector 300 described later. Based on detection data obtained from the detector 300, a person existing in the predetermined area is detected, and it is determined whether this person is observing (viewing) the display 115.
The position detector 131 arbitrarily sets an observer based on a signal from the controller 113. For instance, the position detector 131 causes the display 115 to display the detected person, and a user sets an observer using, for example, the remote controller 302. Further, the position detector 131 sets the detected person as an observer (viewer). In this case, the position detector 131 may set, as the observer, a person who is observing the display 115 for a predetermined period of time in the area detected by the detector 300.
The position detector 131 may detect the position of an object based on data differing from the data provided by the detector 300, and identify the type or state of the object. For instance, the position detector 131 may detect an object based on data acquired by a sensor.
The observation (viewing) distance measuring unit 132 calculates the distance between a preset position and the position of the display 115. The preset position may be an arbitrarily set position or a position detected automatically at regular intervals. The preset position is, for example, the position (observation position) of an observer who observes the display 115, such as the position of the head or eyes of the observer. The preset position may be obtained from detection data obtained by the position detector 131, or be set by the user. Further, the position of the display 115 is preset as, for example, the position of the screen surface of the display 115, the center of the thickness of the display 115, or the surface of the display 115 to which the visual axis of the observer is directed.
The observation distance measuring unit 132 calculates an observation distance, and stores the calculated observation distance as distance information in the storage unit 125.
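A minimal sketch of the observation-distance calculation, assuming the observer position and the display reference position are given as 3-D coordinates in a shared room coordinate system (the coordinate convention and the function name are assumptions of this sketch):

```python
import math


def observation_distance(observer_pos, display_pos):
    """Euclidean distance between the preset observer position (e.g. the
    observer's head or eyes, obtained from the position detector) and the
    preset reference point of the display (e.g. the center of its screen
    surface). Both positions are (x, y, z) tuples in the same coordinate
    system; the units are whatever the detector reports (e.g. meters).
    """
    return math.dist(observer_pos, display_pos)
```

The result would then be stored in the storage unit 125 as the distance information described above.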
The ROM (read only memory) 134 holds a control program executed by the controller 113. The RAM (random access memory (work memory)) 135 provides a work area for the system controller 113.
The NVM (nonvolatile memory) 136 holds various types of setting information and control information in the image processing apparatus 10. The NVM 136 can also hold the content structure information of a program table.
The video processor 114 comprises a video decoder 241, a video converter 242, a frame memory 243 and a display controller 244.
The video decoder 241 decodes a video signal separated by the signal processor 112, and outputs the decoded signal as a digital video signal (video output) to the video converter 242.
The video converter 242 converts the video signal decoded by the video decoder 241 into a signal of a predetermined resolution and an output scheme that can be displayed on the display 115. The video converter 242 stores the converted video signal in the frame memory 243 in accordance with a signal from the controller 113, and outputs it to the display controller 244.
The frame memory 243 stores, for processing, video data (signal) received from the input unit 111, and video data received from the video converter 242.
The display controller 244 converts an input video signal, i.e., images included in a single video image stream, into a display signal that can be appropriately displayed (video reproduction) by the display 115. The display controller 244 arbitrarily or automatically converts, into a display signal, a video signal output from the video converter 242 in accordance with a signal supplied from the controller 113, and outputs the resultant signal to the display 115.
Further, the display controller 244 extracts an arbitrary image from the images included in the video signal sent from the frame memory 243, and divides the extracted image into images (primary images) belonging to a plurality of zones. The display controller 244 executes image processing on each of the primary images defined in the respective zones, and appropriately rearranges the secondary images resulting from the image processing on the primary images. The display controller 244 defines the set of rearranged secondary images as a curved image. When outputting a set of thus-defined curved images as a video image, the display controller 244 arranges, in time sequence, the plurality of images generated by the image processing, and converts them into a display signal indicative of a video image. The image processing includes, for example, expansion, reduction and deformation of images, and changes in the brightness, hue, contrast, quality, resolution, etc., of the images. The zones defined for each image by the display controller 244 may have the same size or different sizes. In the description below, the parts of an image for which the display controller 244 defines zones are referred to as primary images, and the parts of the image for which zones are defined after processing the primary images are referred to as secondary images.
When rearranging the secondary images, the display controller 244 performs image processing so that the boundaries of the zone of each secondary image will contact opposing boundaries of the zones of corresponding adjacent secondary images. At this time, adjustment is performed to suppress image distortion at the boundaries of the respective pairs of adjacent secondary images.
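The zone definition and gap-free rearrangement can be sketched as follows. For brevity, the sketch uses rectangular vertical strips in place of the triangular polygons named in the text, and all function names are hypothetical:

```python
def partition_zones(width, n_zones):
    """Split the horizontal range [0, width) into n_zones roughly equal
    vertical strips. Returns a list of (x_start, x_end) pairs. Rectangular
    strips stand in for the triangular polygons of the text to keep the
    sketch short.
    """
    edges = [round(i * width / n_zones) for i in range(n_zones + 1)]
    return list(zip(edges[:-1], edges[1:]))


def rearrange(zone_widths, scales):
    """Place horizontally scaled zones side by side so that the boundary of
    each secondary image contacts the boundary of its neighbor.

    zone_widths: original width of each primary-image zone.
    scales: horizontal scale factor applied to each zone (the secondary
            image of zone i is scales[i] * zone_widths[i] wide).
    Returns the x offset of each secondary image. Because each offset is
    the cumulative sum of the scaled widths before it, no gaps or overlaps
    appear at the boundaries between adjacent secondary images.
    """
    offsets, x = [], 0.0
    for w, s in zip(zone_widths, scales):
        offsets.append(x)
        x += w * s
    return offsets
```

The cumulative-offset placement is one simple way to satisfy the boundary-contact condition described above; suppressing distortion at the boundaries would additionally require blending or resampling, which is omitted here.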
The display controller 244 can arbitrarily set, for example, a to-be-processed image in the images indicated by a video signal, defined (divided) zones of the image, the type of image processing to execute, adjustment of images, and an image to output, in accordance with an instruction signal, or can automatically set them in accordance with an instruction signal from the controller 113. Further, the display controller 244 can control the content of image processing for each image whose zones are defined.
In the embodiment, each image is displayed so as to fit in the rectangular display 115, and has a major axis and a minor axis. Each image may instead be square. In this case, in the display area of the display 115 having the major and minor axes, an arbitrary background image, such as a black band, is displayed in the area in which no image is displayed.
In image processing A301, the display controller 244 acquires image G11 from the frame memory 243.
In subsequent image processing A302, the display controller 244 divides image G11 into a plurality of zones. For instance, the display controller 244 defines the image zones using triangular polygons, as indicated by the broken lines.
In image processing A303, the display controller 244 executes predetermined processing on each of the primary images defined in image G11 by image processing A302. Namely, the display controller 244 provides a curved image of a predetermined curvature by processing each primary image partitioned by triangular polygons and rearranging the resultant secondary images.
At the time of rearrangement, the display controller 244 rearranges the secondary images so as to obtain an image that is curved like hyperbolic curves along the longitudinal axis, with reference to the secondary image located at the point at which the substantially hyperbolic curves most approach each other. This reference secondary image is reduced and deformed by the maximum reduction ratio among the primary images defined in one image; the other secondary images are reduced by smaller reduction ratios the farther they are located, along the longitudinal (horizontal) axis, from the reference point. The display controller 244 rearranges the secondary images to form a curved image whose lower end projects upward at a position along the longitudinal (horizontal) axis of the display area, and whose upper end projects downward at the same position, the upper and lower curves having the same curvature variation. For instance, the display controller 244 generates a curved image whose curves project toward each other at a center portion of the image. In the image, the axis perpendicular to the longitudinal axis is set as the transverse (vertical) axis. Further, the reduction ratio is a deformation ratio indicative of the ratio by which each secondary image is reduced with respect to the corresponding primary image.
When forming a curved image curved along a certain axis, the display controller 244 can output a display signal indicative of a smoothly curved video image by setting (defining) a larger number of polygons for secondary images that are remoter, along the longitudinal axis, from the secondary image serving as a reference for curving.
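The polygon-density rule can be sketched as a simple function; the linear growth rule and its constants are illustrative assumptions, chosen only to show that remoter zones receive more polygons:

```python
def polygons_for_zone(zone_index, ref_index, base=2, step=2):
    """Number of polygons to define for the zone at zone_index.

    Assumed density rule: a small base count at the reference zone (where
    the image is flattest), growing linearly with distance from the
    reference, so that the strongly curved outer zones are tessellated
    more finely and the curve appears smooth.
    """
    return base + step * abs(zone_index - ref_index)
```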
Further, in image processing A303, the display controller 244 rearranges the secondary images so that they curve toward the opposite ends of image G21, which are farthest from the center portion located at the above-mentioned reference point, and sets the set of rearranged secondary images as a curved image. For instance, in image G21, the secondary image located at the center of the curved image is most reduced and deformed, while the two secondary images located at the opposite ends along the longitudinal axis of the display area are maintained at the original size they had when input. In image G21, a black band BB is provided in, for example, an extra portion where no image is provided. After finishing the image processing, the display controller 244 converts processed image G21 into a display signal, and outputs the resultant signal to the display 115.
The display 115 displays, as a video image, the display signal received from the display controller 244. When the display signal output to the display 115 constructs a curved video image (curved image), the display 115 displays the curved video image (curved image). For instance, if a display signal corresponding to a video image formed of a curved image like image G21 is supplied, the display 115 displays the curved video image on its flat screen.
The voice processor 116 decodes an audio signal in a program received by the input unit 111, and outputs the resultant signal to the voice output unit 117.
The voice output unit 117 outputs an audio signal decoded by the voice processor 116. The voice output unit 117 is, for example, a loud speaker.
The operation unit 119 inputs, to the controller 113, a control command corresponding to a direct operation by a user.
The receiver 120 inputs, to the controller 113, a control command corresponding to a signal received from an external device, such as the remote controller 302 and a mobile terminal. The receiver 120 inputs an operation instruction, received from the remote controller 302, to the controller 113.
The communication interface 121 realizes wireless communication with a short-range wireless communication device based on, for example, WiFi (Wireless Fidelity). As the short-range wireless communication standard, Bluetooth (trademark) or Near Field Communication (NFC) is usable. The communication interface 121 may be of either a wired scheme or a wireless scheme, and is connected to, for example, a communication unit capable of transmitting and receiving signals to and from, for example, a wireless keyboard or a mouse. Further, the communication interface 121 communicates with a short-range wireless communication device via, for example, the communication unit 304. The communication unit 304 is a terminal for performing wireless communication based on, for example, WiFi. Specifically, the communication unit 304 is, for example, a card reader capable of communicating with a noncontact card medium.
The network controller 122 controls access to an external network, such as the Internet. The network controller 122 transmits and receives information through the Internet.
The USB interface 123 is connected to an external device, such as a keyboard 306, compliant with the USB standards.
The HDMI interface 124 enables wired communication between a plurality of devices based on the HDMI or MHL standards.
The storage unit 125 stores information associated with various types of settings and data, various set values, information indicative of curved video images, and data associated with, for example, settings for curved video images corresponding to various types of content. The storage unit 125 is, for example, a hard disk drive (HDD).
The detector 300 includes, for example, various types of sensors. For instance, the detector 300 is a camera with an image sensor. When the detector 300 is, for example, a small camera, it detects an object (or objects) around the image processing apparatus 10. The detector 300 is installed such that it can detect a predetermined area (detection area). The detector 300 detects, for example, an area in which an image on the display 115 is observed. A plurality of detectors 300 may be installed. The detector 300 acquires detection data based on a signal from the controller 113. The detector 300 stores the detected data in the storage unit 125. The detection data includes data indicative of the position of a target (e.g., an observer), data indicative of the state of the target, data indicative of a person identification result, data associated with ambient video images and/or ambient images, etc.
The remote controller 302 sends an operation instruction from the user to the controller 113 via the receiver 120. The remote controller 302 accepts an operation instruction via, for example, a button. The remote controller 302 outputs various operation instructions input by the user, such as an instruction to set the curvature of a video image (the curvature of a curved image), an instruction to set the position of the observer or the display 115, an instruction to adjust a video image, and an instruction to start detection.
The curvature of a curved video image suited to an observation position will now be described. In general, in order to obtain a wide view angle and enhance presence, it is preferable to set the curvature of a curved video image small when the observer is away from the display 115, and to set it large when the observer is near the display 115. It should be noted that although in the figures some curved video images (curved images) are drawn with curved outlines in order to show their curvatures, they are actually displayed on the display 115, which has a flat surface. Further, the broken lines superposed on each displayed image merely indicate that the image was partitioned into zones and then processed; they are not actually displayed on the image.
Further, since in this case, the longitudinal position of the observer is changed, the display controller 244 resets the position of the vertically smallest portion of a curved image observed by the observer, in order to provide a realistic curved image of a wide view angle suitable for the observation position. Namely, the display controller 244 constructs curved image G24 so that the vertically smallest portion of the image is positioned at a longitudinal end close to the observer.
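Repositioning the vertically smallest portion toward the observer can be sketched as picking the zone nearest the observer's longitudinal position; the shared coordinate origin at the display's left edge, and all names, are assumptions of this sketch:

```python
def reference_zone(observer_x, display_width, n_zones):
    """Index of the zone that becomes the vertically smallest portion of
    the curved image, chosen as the zone nearest the observer's horizontal
    (longitudinal) position. observer_x and display_width are assumed to
    share an origin at the left edge of the display; positions off the
    display are clamped onto it.
    """
    x = min(max(observer_x, 0.0), display_width)  # clamp onto the display
    index = int(x / display_width * n_zones)
    return min(index, n_zones - 1)
```

When the observer stands near one longitudinal end, the returned index is an end zone, so the deepest part of the curve faces the observer, as described above.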
Referring then to the flowchart, the image processing operation of the image processing apparatus 10 according to the first embodiment will be described.
In B801, the position detector 131 of the controller 113 detects an object in a detection area, using the detector 300. The position detector 131 detects an object in, for example, the detection area detected in real time by the detector 300.
If, in B802, the position detector 131 detects a person among the detected objects (Yes in B802), subsequent processing (B803) is performed. In contrast, if no person is detected (No in B802), the position detector 131 again attempts to detect a person among the objects detected in the detection area.
In B803, the position detector 131 detects whether the detected person is observing the display 115, and sets, as an observer, a person who is observing the display 115.
Subsequently, in B804, the observation distance measuring unit 132 of the controller 113 sets an observation position from information acquired by the position detector 131, and calculates an observation distance from information acquired by the detector 300.
In B805, the display controller 244 defines, as primary images, the images acquired from the frame memory 243 in respective predetermined zones under the control of the controller 113, and performs predetermined processing on each of the primary images. The display controller 244 appropriately rearranges the secondary images obtained by processing the primary images of the respective zones, and sets the set of rearranged secondary images as a curved image.
In B806, the display controller 244 outputs a display signal indicative of the processed image to the display 115.
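The flow of B801 to B806 can be summarized in a sketch in which each block of the apparatus is modeled as a callable; the signatures are assumptions of this sketch, not the apparatus's real interfaces:

```python
def run_pipeline(detect, find_person, measure_distance, render_curved, show):
    """One pass of steps B801-B806.

    detect:           returns detected objects in the detection area (B801)
    find_person:      returns an observer from those objects, or None (B802/B803)
    measure_distance: returns the observation distance for the observer (B804)
    render_curved:    builds a curved image for that distance (B805)
    show:             outputs the display signal to the display (B806)
    Returns True when a frame was displayed, False when no person was found
    (the caller then retries, as in the No branch of B802).
    """
    objects = detect()                      # B801: detect objects in the area
    observer = find_person(objects)         # B802/B803: person -> observer
    if observer is None:
        return False                        # B802 No: caller retries
    distance = measure_distance(observer)   # B804: observation distance
    frame = render_curved(distance)         # B805: build the curved image
    show(frame)                             # B806: output the display signal
    return True
```

A usage example with trivial stand-in callables:

```python
shown = []
ok = run_pipeline(lambda: ["obj"],
                  lambda objs: "viewer",
                  lambda obs: 2.0,
                  lambda d: f"curved@{d}",
                  shown.append)
```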
In the first embodiment, the image processing apparatus 10 displays a curved video image on the flat display 115. Further, the image processing apparatus 10 performs predetermined processing, under the control of the controller 113, on each of the primary images obtained by division using polygons, appropriately rearranges the processed images, and provides the rearranged images as a smooth curved image. The image processing apparatus 10 displays the thus-formed curved image on the display 115. As a result, the image processing apparatus 10 can provide realistic images of a wide view angle.
Moreover, the image processing apparatus 10 can arbitrarily set the position of the vertically smallest portion of the curved image and the curvature of the curved image in accordance with an instruction from, for example, the remote controller 302. As a result, the curvature can be set in accordance with the taste of the observer, regardless of the observation distance.
The image processing apparatus 10 (display controller 244) can arbitrarily change the curvature of a video image displayed on the display 115 in accordance with a signal from the remote controller 302. Namely, a user including the observer can manually change the curvature of a curved image displayed on the display 115, using the remote controller 302.
Furthermore, the image processing apparatus 10 can automatically set the position of the vertically smallest portion of the curved image and the curvature of the curved image, using the controller 113. As a result, when the observer is observing the display 115, the image processing apparatus 10 automatically detects the observation position and executes image processing appropriate for the observation position. This also enables the image processing apparatus 10 to provide a realistic curved video image and/or curved image of a wide view angle even when the user, including the observer, performs no setting using, for example, the remote controller 302.
Yet further, although in the first embodiment, the image processing apparatus 10 incorporates the display 115, they may be separate units. In the latter case, the image processing apparatus 10 is connected to the display 115, and outputs a resultant curved image to the display 115.
Although in the first embodiment the detector 300 is incorporated in the image processing apparatus 10, it may be a unit separate from the image processing apparatus 10.
Also, the display controller 244 may further incorporate image processing A304 for performing processing on each secondary image of image processing A303 so as not to display the blank spaces of a rearranged image.
An image processing apparatus according to a second embodiment will be described. In the second embodiment, elements similar to those of the first embodiment are denoted by corresponding reference numbers, and no detailed description will be given thereof.
The second embodiment will be described with reference to the accompanying drawings.
In the image processing apparatus 10 of the second embodiment, the controller 113 further comprises an observation angle measuring unit 133.
The observation angle measuring unit 133 detects the orientation of an arbitrarily or automatically set object. The observation angle measuring unit 133 sets an observed portion (observation point) based on the detected orientation of the object. Further, the observation angle measuring unit 133 can determine whether the observation point is on the display 115. If the observation point is on the display 115 for a predetermined period or more, the observation angle measuring unit 133 determines that the display is being observed. The predetermined period may be predetermined, or be arbitrarily set by the user. The controller 113 outputs, to the display controller 244, a signal indicative of the position of a curved image including the observation point.
For instance, the observation angle measuring unit 133 detects the orientation of the face of the observer and the eyes (line of sight) of the observer to thereby estimate the position of the line of sight in the display area of the display 115. The observation angle measuring unit 133 sets, as the observation point, a point in the display area of the display 115, which the observer is supposed to be observing. The observation angle measuring unit 133 outputs the area of a curved image located at the set observation point. The controller 113 outputs, to the display controller 244, a signal indicative of the position of the curved image including the set observation point.
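The dwell-time test, in which the observer is judged to be observing only if the observation point stays on the display 115 for the predetermined period, can be sketched as follows; the time-stamped sampling scheme is an assumption of this sketch:

```python
def is_observing(samples, period=1.0):
    """Decide whether the observer is observing the display.

    samples: list of (timestamp, on_display) pairs in time order, where
    on_display says whether the estimated observation point fell inside
    the display area at that instant. Returns True if the observation
    point has stayed on the display continuously for at least `period`
    seconds (the "predetermined period" of the text); any sample off the
    display resets the dwell timer.
    """
    start = None
    for t, on in samples:
        if on:
            if start is None:
                start = t
            if t - start >= period:
                return True
        else:
            start = None
    return False
```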
Referring then to the flowchart, the image processing operation of the image processing apparatus 10 according to the second embodiment will be described.
In B1401, the position detector 131 of the controller 113 detects an object in a detection area, using the detector 300. The position detector 131 detects an object in, for example, a detection area detected in real time by the detector 300.
If, in B1402, the position detector 131 identifies (detects) a person among the detected objects (Yes in B1402), it proceeds to subsequent processing (B1403). If no person is detected (No in B1402), the position detector 131 re-attempts to identify a person among the objects detected in the detection area.
In B1403, the position detector 131 detects whether the detected person is observing the display 115. If the person is observing the display 115, the position detector 131 sets this person as an observer.
Subsequently, in B1404, the observation distance measuring unit 132 of the controller 113 sets the observation position based on the information acquired by the position detector 131, and calculates, as the observation distance, the distance between the observation position and the display 115 from the detection data acquired by the detector 300.
In B1405, the observation angle measuring unit 133 of the controller 113 detects the orientation of the face of the observer and the line of sight of the observer from the detection data of the detector 300, thereby setting an observation point.
In B1406, the observation angle measuring unit 133 determines whether the observation point is kept on the display 115 for a predetermined period of time, thereby determining whether the observer is observing the display 115. If it is determined that the observer is observing the display 115 (Yes in B1406), the observation angle measuring unit 133 proceeds to subsequent processing (B1407). In contrast, if it is not determined that the observer is observing the display 115 (No in B1406), the observation angle measuring unit 133 returns to processing of B1401.
In B1407, the display controller 244 defines, as primary images, images acquired from the frame memory 243 in respective zones under the control of the controller 113, and executes predetermined processing on each primary image. Further, the display controller 244 appropriately rearranges secondary images resulting from predetermined processing, thereby providing a set of rearranged secondary images as a curved image. At this time, the display controller 244 provides a secondary image including the set observation point as the vertically smallest portion of the curved image.
In B1408, the display controller 244 outputs a display signal indicative of the processed image to the display 115.
In the second embodiment, the image processing apparatus 10 can set the curvature of a curved image and the vertically smallest portion of the curved image in accordance with the orientation of the face of the observer and the line of sight of the observer. When the observer is observing the display 115, the image processing apparatus 10 can detect an observation point on the displayed image, and reset the position of the vertically smallest portion of the curved image, which enables an appropriate curved video image to be output. As a result, the image processing apparatus 10 can provide a realistic curved video image or curved image of a wide view angle, regardless of the orientation of the face of the observer or the line of sight of the observer.
An image processing apparatus according to a third embodiment will be described. In the third embodiment, elements similar to those of the second embodiment are denoted by corresponding reference numbers, and no detailed description will be given thereof.
A description will be given of the third embodiment with reference to the accompanying drawings.
In an image processing apparatus 10 according to the third embodiment, the controller 113 further comprises a content identification unit 134.
The content identification unit 134 identifies the type of an arbitrarily or previously set scene or content. The content identification unit 134 outputs, to the display controller 244, curvature information indicative of the curvature of a curved image suitable for viewing the scene or content. The curvature information is preset and stored in the storage unit 125. Alternatively, curvature information can be added arbitrarily; if added, it is also stored in the storage unit 125.
The content identification unit 134 identifies the type of content (variety shows, sports, movies, news, etc.) based on, for example, program attendant information acquired from, for example, an electronic program table, and outputs curvature information corresponding to the identified type. At this time, the controller 113 arbitrarily or automatically changes the ON/OFF state of the curved video image, or the curvature of the image, in accordance with the content or scene.
In the case of an externally input video image, the content identification unit 134 identifies an input system and outputs curvature information corresponding to the identified input system. In this case, the controller 113 can set the ON/OFF state of a curved video image or the curvature of the image in accordance with the identified input system, for example, the input system of a player. The curvature information corresponding to the input system is preset and stored in the storage unit 125.
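The content-type-to-curvature lookup can be sketched as a table; the genres come from the text, while the numeric curvature values (and the use of 0.0 to mean the curved mode is switched off) are illustrative assumptions:

```python
# Illustrative curvature presets per content type. In the apparatus these
# would be the curvature information preset in the storage unit 125.
CURVATURE_BY_GENRE = {
    "movies": 0.8,   # immersive content: strong curve
    "sports": 0.6,
    "variety": 0.3,
    "news": 0.0,     # 0.0 here stands for "curved mode OFF"
}


def curvature_for_content(genre, default=0.0):
    """Curvature setting looked up from the identified content type,
    falling back to a flat (curve off) display for unknown genres.
    """
    return CURVATURE_BY_GENRE.get(genre, default)
```

A parallel table keyed by the identified input system could serve the externally input case in the same way.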
In
For instance, when the display scene is changed from image G23 to image G26 as shown in
In
Referring then to
In B1701, the position detector 131 of the controller 113 detects an object in the detection area covered by the detector 300, for example, in an image captured in real time by the detector 300.
In B1702, if the position detector 131 identifies (detects) a person in the detected object (Yes in B1702), it proceeds to subsequent processing (B1703). If no person is detected (No in B1702), the position detector 131 re-attempts to identify a person in the detection area.
In B1703, the position detector 131 detects whether the detected person is observing the display 115. If the person is observing the display 115, the position detector 131 sets this person as an observer.
Subsequently, in B1704, the observation distance measuring unit 132 of the controller 113 calculates, from the detection data acquired by the detector 300 and the information acquired by the position detector 131, the distance between the observation position and the display 115, and sets it as the observation distance.
In B1705, the observation angle measuring unit 133 of the controller 113 detects the orientation of the face of the observer and the line of sight of the observer from the detection data of the detector 300, thereby setting an observation point.
In B1706, the observation angle measuring unit 133 determines whether the observation point is on the display 115, thereby determining whether the observer is observing the display 115. If it is determined that the observer is observing the display 115 (Yes in B1706), the observation angle measuring unit 133 proceeds to subsequent processing (B1707). In contrast, if it is determined that the observer is not observing the display 115 (No in B1706), the observation angle measuring unit 133 returns to the processing of B1701.
In B1707, the display controller 244 defines, as primary images, images acquired from the frame memory 243 in respective zones under the control of the controller 113, and executes predetermined processing on each primary image. Further, the display controller 244 appropriately rearranges secondary images resulting from predetermined processing, thereby providing a set of rearranged secondary images as a curved image.
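As stated in the overview, the input image is reduced by a maximum reduction ratio at a predetermined horizontal position and by smaller ratios in positions farther from it. A minimal sketch of such a per-zone vertical scale factor, assuming a linear falloff (the function name, falloff shape, and values are hypothetical, not the claimed method):

```python
# Illustrative per-zone vertical reduction ratio for the curved image:
# maximum reduction at a chosen horizontal position x0, progressively
# less reduction toward the far edges. Linear falloff is an assumption.

def reduction_ratio(x: float, x0: float, width: float,
                    max_reduction: float = 0.2) -> float:
    """Vertical scale factor for the zone at horizontal position x.

    Returns 1.0 - max_reduction at x == x0 (vertically smallest zone)
    and approaches 1.0 (no reduction) at the edge farthest from x0.
    """
    # Normalized distance from the most-reduced position, in [0, 1].
    d = min(abs(x - x0) / max(x0, width - x0), 1.0)
    return 1.0 - max_reduction * (1.0 - d)

# Example: a 1920-pixel-wide display curved about its center (x0 = 960).
scales = [reduction_ratio(x, x0=960.0, width=1920.0)
          for x in (0.0, 960.0, 1920.0)]
```

Applying such a factor to each zone's primary image, then rearranging the resulting secondary images, yields the curved image described above.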
In B1708, the display controller 244 outputs a display signal indicative of the processed image to the display 115.
In B1709, the controller 113 determines whether the display scene displayed on the display 115 has been changed. If it is determined that the display scene has been changed (Yes in B1709), the controller 113 returns to B1707, thereby changing, for example, the curvature of the image displayed by the display controller 244. If it is determined that the display scene is unchanged (No in B1709), the controller 113 finishes the processing.
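The branch structure of B1701 through B1709 can be summarized as a small event-driven sketch. Each detection event is modeled as a plain dictionary and each step is recorded as a label; the function name, event fields, and the simplification of the B1709 loop into a repeat count are all hypothetical:

```python
# Hypothetical trace of the control flow of B1701-B1709 over a sequence
# of detection events. Each event dictionary stands in for one pass of
# the detector 300; executed steps are returned as labels.

def process_detections(events):
    """Walk detection events through the B1701-B1709 branch structure,
    returning the labels of the steps executed."""
    actions = []
    for ev in events:
        actions.append("B1701:detect_object")
        if not ev.get("person"):
            continue                      # B1702 No: re-attempt detection
        actions.append("B1703:set_observer")
        actions.append("B1704:observation_distance")
        actions.append("B1705:observation_point")
        if not ev.get("on_display"):
            continue                      # B1706 No: return to B1701
        # B1707/B1708, repeated once more per scene change (B1709 Yes)
        for _ in range(ev.get("scene_changes", 0) + 1):
            actions.append("B1707:generate_curved_image")
            actions.append("B1708:output_display_signal")
        actions.append("B1709:finish")    # B1709 No: processing ends
        return actions
    return actions
```

For example, an event with no person re-runs detection, an observer not looking at the display restarts from B1701, and a single scene change triggers one extra render/output pass before finishing.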
In the third embodiment, the image processing apparatus 10 can set the curvature of the curved image in accordance with a display scene or display content. As a result, the image processing apparatus 10 can provide a realistic curved video image or curved image of a wide view angle suitable for the display scene or content.
In the third embodiment, the image processing apparatus 10 displays a curved video image on the flat display 115. Further, the image processing apparatus 10 executes predetermined processing on each of the partial images obtained by dividing the input image using polygons, in accordance with an instruction from the controller 113, and appropriately rearranges the processed images to form a smooth curved image. The image processing apparatus 10 displays the curved image as a curved video image on the display 115. As a result, the image processing apparatus 10 can provide a realistic video image of a wide view angle.
Further, the image processing apparatus 10 can arbitrarily set the curvature of a curved image and the vertically smallest portion of the image, which are referred to for curving. This enables an observer to set a curvature in accordance with their taste, regardless of the observation distance.
The image processing apparatus 10 (display controller 244) can arbitrarily change the curvature of a video image displayed on the display 115 in accordance with a signal from the remote controller 302. Namely, a user including the observer can manually change the curvature of a video image displayed on the display 115, using the remote controller 302.
Yet further, the image processing apparatus 10 can appropriately set the curvature of a curved image and the vertically smallest portion of the image, which are referred to for curving. This enables the image processing apparatus 10 to automatically detect an observer who is observing the display 115, and to automatically execute appropriate processing on a video image in accordance with the detected observer. As a result, the image processing apparatus 10 can also provide a realistic curved video image or a curved image of a wide view angle, even if a user, including the observer, does not perform setting using, for example, the remote controller 302.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
This application claims the benefit of U.S. Provisional Application No. 62/072,248, filed Oct. 29, 2014, the entire contents of which are incorporated herein by reference.