The present description generally relates to video standard determination. More particularly, an embodiment relates to determining a correct video standard for a display device (e.g., a television monitor).
Different video standards are used around the world for broadcasting television signals. For example, television broadcasts in the United States of America are in NTSC (National Television System Committee) format. In parts of Europe, television programming is broadcast in PAL (phase alternating line) format. Other parts of the world (such as France and portions of the Middle East) may use SECAM (sequential color with memory) format to broadcast television programs.
Some digital cameras are capable of displaying captured images on a regular television set. For example, a digital camera may be connected to a television set via a cable. However, to correctly display an image on a television set, a user needs to determine the video standard of the television set and then configure the camera appropriately. These steps may require the user to search through the manual and/or camera menus to try to find the correct setting.
One approach attempts to determine the video standard of the television receiver based on region and/or language settings provided by a user. However, such an approach may result in an incorrect video standard setting. For example, a user may pick Spanish as the language for the camera menus and South America as the region where the user is located. Since Spanish-speaking countries in South America use various video standards, these inputs may result in an incorrect video standard setting. Accordingly, the user may need to search through the manual and/or camera menus to try to find the correct setting.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
Various embodiments for determining a correct video standard for a display device (e.g., a television monitor) are described. In one embodiment, a test signal is transmitted to the display device in accordance with a video standard (such as NTSC, PAL, SECAM, or their varieties). The display device generates one or more displays in response to the test signal. A generated display may be thought of as a signal transmitted by the display device. Thus, it may be said that the display device transmits one or more signals in response to receiving a test signal. The signal(s) transmitted by the display device are analyzed to determine whether the video standard corresponding to the test signal is a correct video standard for the display device.
Moreover, the device that is transmitting the signals to the display device may be any suitable device capable of accessing data such as image, audio, or video data. For example, the device may be a digital camera (e.g., capable of capturing still images or video streams in digital format), personal digital assistant (PDA), cellular phone, MPEG (Moving Picture Experts Group) player (such as MPEG4 for compressed video streams), MP3 (MPEG layer-3 audio) player, or combinations thereof.
As will be further discussed herein, for example with reference to
The sensor 204 may be any suitable image capture sensor such as a complementary metal-oxide semiconductor (CMOS) or a charge-coupled device (CCD). In an embodiment, the sensor 204 may be selectively activated or exposed to light rays without utilizing a physical barrier (such as the shutter 203). Moreover, a simpler mechanism (such as a sensor cover) may be utilized to protect the lens 202 and/or the sensor 204 from environmental elements (e.g., sun rays, dust, water, humidity, or the like).
The digital camera 102 further includes a digital camera processing unit 206 that is coupled to the sensor 204. The processing unit 206 may process various data and/or signals within the camera 102. The processing unit 206 may include one or more processors (208) coupled to a volatile memory 210. The volatile memory 210 may be accessed by the processors 208 to fetch or store data such as configuration data utilized during the operation of the digital camera 102 (such as camera settings) or the like. The volatile memory 210 may further be utilized to temporarily store and/or process data such as images captured by the sensor 204. The volatile memory 210 may include any suitable types of memory such as random access memory (RAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), static RAM (SRAM), combinations thereof, or the like.
As shown in
The processing unit 206 may also include an external display signal generator 215 that is coupled to the processor(s) 208. For example, the processor(s) 208 may retrieve data from the nonvolatile memory 212 and/or volatile memory 210 regarding test signals (e.g., test data 214) and forward the test data to the external display signal generator 215 for conversion into a signal format that is appropriate for the display device 106. In an embodiment, the signal generator 215 may include a digital-to-analog (D/A) signal converter. Also, the signal generated by the signal generator 215 may be in accordance with various standards such as S-Video (separated video), composite video, component video (e.g., utilized with high definition television (HDTV)), VGA (video graphics array, or its varieties such as SVGA (super VGA), etc.), or the like.
The camera 102 may also include a display (such as a liquid crystal display (LCD)) or viewfinder 216 (or both), e.g., to display various information for a user such as camera menus, camera settings, instructions, captured images, vision field, or the like. The camera 102 may further include various input devices (not shown) such as buttons, dials, touch pads, keypad, touch screens, or the like to enable a user to provide input data, for example, in response to the displayed information on the display/viewfinder 216.
Furthermore, the digital camera 102 may include other removable/non-removable, volatile/nonvolatile computer storage media (not shown). By way of example, the nonvolatile memory 212 may include one or more of the following: a floppy disk, an optical disk drive (such as a compact disc ROM (CD-ROM) and/or digital versatile disk (DVD)), a tape (e.g., in the case of digital video cameras), or the like.
In an embodiment, the digital camera 102 may utilize one or more external facilities (such as the computing device discussed with reference to
Referring to
At an operation 305, the camera user may be instructed to point the camera 102 towards the display device 106, e.g., by displaying a message on the display/viewfinder 216. In an embodiment, the user may be asked to confirm, e.g., via pressing a button, that the camera 102 is in fact pointed towards the display device 106. Also, the camera 102 may fire a strobe to get the user's attention. In one embodiment, the camera 102 may transmit a strobe signal and capture one or more images to confirm whether the camera is pointed towards the display device 106, e.g., by considering the number of saturated (e.g., white) pixels coming back to the camera 102 to determine how much of the display device fills the sensor 204.
At an operation 306, the test signal 104 is transmitted to the display device 106. In response to the transmitted test signal 104, the display device 106 may transmit one or more signals (e.g., one or more video frames or images) in accordance with the test signal 104 received by the display device 106. At an operation 308, the signals transmitted by the display device 106 are analyzed. For example, images (108) transmitted by the display device 106 may be captured by the sensor 204 and converted into electrical signals for processing by the processing unit 206. Various techniques may be utilized to perform the operation 308, such as analyzing one or more of a normalized histogram, a frequency response, and/or a contrast of the one or more signals transmitted by the display device, which will be further discussed below.
At an operation 310, it is determined whether the video standard corresponding to the transmitted test signal of the operation 306 is correct, e.g., based on the analysis of the operation 308 and comparison of the test signal 104 and the signals transmitted by the display device 106. If the video standard setting is correct, the method 300 terminates. Optionally, the user may be asked to confirm that the video standard setting is correct. Also, the video standard setting 213 may be stored in nonvolatile memory 212 (which is optional in one embodiment, as the operation 302 may have performed this task before). Otherwise, if the operation 310 determines that the video standard setting is incorrect, the video standard setting is modified (312) and the method 300 resumes at the operation 306. In one embodiment, the video standard setting may be modified to the remaining varieties of the current video standard setting prior to switching to a totally different video standard. For example, the processor(s) may modify the video standard setting 213 at the operation 312 from PAL to PAL-M and/or PAL-N, prior to switching to NTSC or SECAM.
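The ordering applied at the operation 312 (exhausting the remaining varieties of the current standard family before switching to a different family) can be sketched as follows. This is a minimal illustration; the family groupings, function names, and the particular variety lists are assumptions for the sketch, not part of the described embodiment:

```python
# Hypothetical sketch of the candidate ordering used by operation 312:
# try the current setting, then the remaining varieties of its family,
# then the other standard families.
STANDARD_FAMILIES = {
    "PAL": ["PAL", "PAL-M", "PAL-N"],
    "NTSC": ["NTSC"],
    "SECAM": ["SECAM"],
}

def candidate_order(current):
    """Return video standard settings in the order they would be tried."""
    # Find the family that contains the current setting.
    family = next(f for f, v in STANDARD_FAMILIES.items() if current in v)
    # Current setting first, then its sibling varieties.
    order = [current] + [s for s in STANDARD_FAMILIES[family] if s != current]
    # Finally, the other families.
    for f, variants in STANDARD_FAMILIES.items():
        if f != family:
            order.extend(variants)
    return order
```

For example, starting from PAL the sketch would try PAL-M and PAL-N before switching to NTSC or SECAM, matching the ordering described above.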
As discussed above, various techniques may be utilized to perform the operation 308, such as analyzing one or more of a normalized histogram, a frequency response, and/or a contrast of the one or more signals transmitted by the display device and captured by the camera 102. Each of these techniques will now be discussed in further detail.
The camera 102 may capture data in a prescribed form. Typically, the captured data conforms to a known color filter pattern (e.g., a Bayer pattern), which allows red, green, and blue triplets (or an alternate form such as YCbCr) to be developed for the imaging array. In an embodiment, the Bayer pattern may be transformed into an array of x mega pixels based on the image captured by the camera 102. A captured image may or may not contain the test pattern or signal; when present, the test signal may be recognized. If the captured images of the display device 106 are not recognizable or do not include sufficient color information, an alternate signal may be generated by the processing unit 206.
For example, the pixels may be read off of the sensor 204 into memory (210). The test pattern (214), which is being generated from memory, may be compared to this captured image. In an embodiment, the color space may be irrelevant (YCbCr, RGB, or other). The captured image(s) (108) and the test data (214) may be made similar enough for more efficient comparison. Moreover, in order to determine statistically that a pattern has been matched, the color for the image or a subset of the image may be developed and statistically analyzed. In one embodiment, a normalized histogram may be used to determine whether the image has the prescribed color. Further, the entire resolution of the sensor 204 need not be used. In fact, much like the camera's preview mode, a down-sampled version that is smaller than the maximum pixel count of the sensor may be used (e.g., VGA resolution instead of 5 megapixels).
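A minimal sketch of the normalized-histogram comparison described above, assuming the down-sampled pixels have already been reduced to coarse color labels; the helper names and the tolerance value are hypothetical:

```python
from collections import Counter

def normalized_histogram(pixels):
    """Fraction of pixels of each color label in a (down-sampled) image."""
    counts = Counter(pixels)
    total = len(pixels)
    return {color: n / total for color, n in counts.items()}

def histogram_matches(captured, reference, tol=0.1):
    """Declare a match if every color fraction in the captured histogram is
    within an assumed tolerance of the test pattern's histogram."""
    colors = set(captured) | set(reference)
    return all(abs(captured.get(c, 0.0) - reference.get(c, 0.0)) <= tol
               for c in colors)
```

The tolerance absorbs minor deviations (hand shake, monitor variation) while still rejecting a histogram distorted by an incorrect standard.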
In another embodiment, a relatively saturated blue background may be placed behind a saturated red “+” sign or other shape (geometric or otherwise). Statistically, a certain ratio of blue pixels to red pixels may be detected. This data may be normalized. Generally, normalizing means that, based on the number of blue pixels detected, a certain number of red pixels should also be detected. Both a lack of color and “scrolling” caused by the differing field frequencies of the video standards may cause this measurement to be in error.
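The blue-to-red ratio test might be expressed as follows; the expected ratio would come from the known test pattern, and the tolerance value here is an assumed placeholder for the normalization threshold discussed above:

```python
def ratio_ok(blue_count, red_count, expected_ratio, tol=0.25):
    """Check the observed blue:red pixel ratio against the ratio implied by
    the test pattern; a lack of color or scrolling from a frequency mismatch
    pushes the observed ratio outside the assumed tolerance."""
    if red_count == 0:
        return False
    observed = blue_count / red_count
    return abs(observed - expected_ratio) / expected_ratio <= tol
```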
Furthermore, the test signals (104), instead of being a single pattern, may be full screens of red, green, and/or blue. Each screen may be displayed in succession on the display device 106. The captured pixels may be analyzed at the operation 308 to determine whether the captures have high counts of red, green, and/or blue, respectively. In an embodiment, the Bayer pattern may not need to be demosaiced for the techniques discussed with reference to the operation 308, since the pattern on a sensor (204) typically resembles the following (where R=red, G=green, B=blue):

R G R G
G B G B
R G R G
G B G B
A full-color pixel is generated by interpolating the “other” colors for each pixel. For example, the upper left R pixel will have captured only the R information. In order to generate the G and the B information for the final image, the surrounding pixels may be analyzed and interpolated. This process is generally referred to as “demosaicing.” Skipping it may speed up the analysis at the operation 308.
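Skipping demosaicing, the per-color counts for a full red, green, or blue test screen can be accumulated directly from the raw mosaic. This sketch assumes an RGGB layout (R in the upper left, as above) and a flat row-major list of raw sensor values; the function name is illustrative:

```python
def bayer_channel_sums(raw, width):
    """Sum raw sensor values per color site for an RGGB Bayer mosaic,
    without demosaicing. `raw` is a flat row-major list of values."""
    sums = {"R": 0, "G": 0, "B": 0}
    for i, value in enumerate(raw):
        row, col = divmod(i, width)
        if row % 2 == 0:
            # Even rows alternate R, G, R, G, ...
            sums["R" if col % 2 == 0 else "G"] += value
        else:
            # Odd rows alternate G, B, G, B, ...
            sums["G" if col % 2 == 0 else "B"] += value
    return sums
```

For a full-screen red test frame, the R sum would dominate the G and B sums; a low R sum would suggest an incorrect standard.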
In one embodiment, the camera 102 may be used as an oscilloscope to analyze the frequency response of the signals transmitted by the display device 106. For example, the camera 102 may be set to PAL and coupled to an NTSC television (106), with the test signal generating a white screen on the display device 106. Because the PAL pattern has a field frequency of 50 Hz, if the camera 102 is used to sample the displayed image(s) 108 at 50 Hz, the same normalized histogram should be perceived by the analysis over time. If the normalized histogram varies (e.g., a high white count/low white count/high white count sequence occurs), the sampling frequency may be switched to 60 Hz. This may provide a captured sequence of n white count/n white count/n white count, etc., which indicates that NTSC will provide a correct video standard for this example. Some predefined variation due to monitor flicker, hand shake, and other factors may be factored out by building in threshold factors. Because the two field rates differ considerably, the threshold factor may be an estimate.
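The sampling-rate test above can be sketched as checking whether the white-pixel count stays steady from frame to frame at a given sampling rate; the threshold value stands in for the built-in threshold factors mentioned above and is an assumption:

```python
def stable_at_rate(white_counts, threshold):
    """True if the white-pixel count is steady from frame to frame, i.e.
    the sampling rate matches the display's field rate."""
    return max(white_counts) - min(white_counts) <= threshold

def pick_field_rate(counts_50hz, counts_60hz, threshold=50):
    """Return 50 or 60 depending on which sampling rate yields a steady
    white count, or None if neither does."""
    if stable_at_rate(counts_50hz, threshold):
        return 50
    if stable_at_rate(counts_60hz, threshold):
        return 60
    return None
```

A steady count at 60 Hz but not at 50 Hz would indicate NTSC as the correct standard in the example above.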
In a further embodiment, text recognition (or pattern recognition) may be utilized, e.g., by displaying a test image that includes certain text (or pattern). The captured images may be analyzed at the operation 308 to determine the presence of the text in the displayed images 108.
In yet another embodiment, the operation 308 may be performed by contrast analysis. For example, a test signal (104) that generates a grid of two colors may be transmitted at the operation 306, e.g., saturated blue lines on a red background. The contrast analysis may then be performed by a summation of the transitions between reds and blues. If the signal transmitted by the display device (108) is incorrect, the counts and values will be degraded; e.g., fewer transitions would be present and the sum would be much less than the sum corresponding to the test signal 104. Several images may be captured and averaged to help with deviations.
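The contrast analysis (summing red/blue transitions and comparing against the count the test pattern itself would produce) might look like the following sketch; the pixels are assumed to be pre-classified into color labels, and the 0.8 acceptance fraction is an assumed threshold:

```python
def count_transitions(row, a="blue", b="red"):
    """Count transitions between the two grid colors along one scan line."""
    return sum(1 for x, y in zip(row, row[1:]) if {x, y} == {a, b})

def contrast_ok(captured_rows, expected_per_row, frac=0.8):
    """Average transition counts over several captured rows and deem the
    standard correct if the average reaches an assumed fraction of what
    the test pattern itself would produce."""
    avg = sum(count_transitions(r) for r in captured_rows) / len(captured_rows)
    return avg >= frac * expected_per_row
```

Averaging over several captured images, as noted above, helps absorb frame-to-frame deviations.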
As shown in
The input/output interfaces 404 may include serial, parallel, and/or network interfaces. A network interface allows devices coupled to a common data communication network to communicate information with the computing device 400. Similarly, a communication interface, such as a serial and/or parallel interface, a universal serial bus (USB) interface, an Ethernet interface, an Institute of Electrical & Electronics Engineers (IEEE) 802.11 interface, and/or any combination of wireless or wired communication interfaces provides a data communication path directly (or through intermediate computing device(s) or network component(s)) between the computing device 400 and another electronic or computing device.
The computing device 400 may also include a volatile memory 408 and/or nonvolatile memory 410 (such as discussed with reference to
The computing device 400 may also include one or more application program(s) 414 and an operating system 416 which may be stored in volatile/nonvolatile memory (e.g., the memory 408 or 410) and executed on the processor(s) 402 to provide a runtime environment in which the application program(s) 414 may run or execute. The computing device 400 may also include an integrated display device 418, e.g., in embodiments where the computing device 400 is included in a suitable device, such as a PDA, a cellular phone, a digital camera, or other portable/mobile computing devices.
Some embodiments discussed herein (such as those discussed with reference to
Moreover, some embodiments may be provided as computer program products, which may include a machine-readable or computer-readable medium having stored thereon instructions used to program a computer (or other electronic devices) to perform a process discussed herein. The machine-readable medium may include, but is not limited to, those discussed with reference to memories 210, 212, 408, and/or 410, such as floppy diskettes, hard disks, optical disks, CD-ROMs, and magneto-optical disks, ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, flash memory, or other suitable types of media or machine-readable media that are capable of storing electronic instructions and/or data.
Additionally, some embodiments discussed herein may be downloaded as a computer program product, wherein the program may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection). Accordingly, herein, a carrier wave shall be regarded as comprising a machine-readable medium.
Reference in the specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation. The appearances of the phrase “in one embodiment” in various places in the specification may or may not be all referring to the same embodiment.
Also, in the description and claims, the terms “coupled” and “connected,” along with their derivatives, may be used. In some embodiments of the invention, “connected” may be used to indicate that two or more elements are in direct physical or electrical contact with each other. “Coupled” may mean that two or more elements are in direct physical or electrical contact. However, “coupled” may also mean that two or more elements may not be in direct contact with each other, but may still cooperate or interact with each other.
Thus, although embodiments have been described in language specific to structural features and/or methodological acts, it is to be understood that claimed subject matter may not be limited to the specific features or acts described. Rather, the specific features and acts are disclosed as sample forms of implementing the claimed subject matter.