Many consumer devices are now constructed to generate and/or use digital data in increasingly large quantities. Portable digital cameras for still and/or moving pictures, for example, generate large amounts of digital data representing still images, video clips, and, for some devices, audio tracks. For this type of data storage application, the storage memory should be relatively low in cost at capacities of around 10 MB to 1 gigabyte (GB). The storage memory should also be low in power consumption (e.g., <<1 Watt) and have relatively rugged physical characteristics to cope with the portable, battery-powered operating environment. The memory should preferably have a short access time (less than one millisecond) and a moderate transfer rate (e.g., 20 Mb/s).
One form of storage currently used in portable devices such as digital cameras is flash memory. Flash memory meets the desired mechanical robustness, power consumption, transfer rate, and access time characteristics mentioned above. However, the read/write speeds of flash memory cards vary greatly from card to card and vendor to vendor, and the read/write speed of an individual card can degrade with the age and/or use of the card. This variance in data transfer rates to/from a particular external memory medium can make certain features unavailable in such applications, for example, streaming video at the highest resolution, frame rate, and image quality acquirable on a digital camera.
One embodiment of a digital camera comprises an image acquisition system configured to generate a data stream, a data processing system configured to receive and transform the data stream to generate a compressed data stream, a control configured to generate a variable user input, and a memory interface coupled to the data processing system. The memory interface is configured to feed back a sustainable data transfer rate to configuration logic. The configuration logic selects a value associated with at least one operational parameter of the digital camera in response to the variable user input and the sustainable data transfer rate.
Another embodiment is a method for dynamically processing data. The method comprises the following: determining a sustainable data transfer rate between a data appliance and an auxiliary memory medium, determining a user preference for a smooth representation versus a sharp representation, selecting a value for at least one operational parameter within the data appliance in response to the sustainable data transfer rate and the user preference, and processing data in accordance with the at least one operational parameter.
The present systems and methods for responding to a data transfer, as defined in the claims, can be better understood with reference to the following drawings. The components within the drawings are not necessarily to scale relative to each other; emphasis instead is placed upon clearly illustrating the principles of the systems and methods.
Present systems and methods for responding to a data transfer measure or otherwise determine a sustainable data transfer rate between a data appliance, such as a digital camera, and an external memory medium. Data transfers to/from a particular auxiliary/external memory medium are optimized when data is transferred to/from the data appliance at the sustainable data transfer rate. By measuring and responding to actual data transfer rates, improved data quality characteristics can be achieved while still streaming data to/from the memory medium. Data transfer calibration can be implemented at system start up and/or at other unobtrusive times during system operation as may be desired.
Sustained data transfer rates can be determined by forwarding a test file via a memory interface to an auxiliary/external memory medium. The test file contains a digital representation of video data. Any of a number of methods may be used to determine a sustainable data transfer rate for data write or data read operations. Described methods read or write the test file at an initial bit rate that matches the maximum rate supportable by the data appliance. If a data transfer error is detected, an interim bit rate less than the initial bit rate by a predetermined amount is used for the remainder of the data transfer and/or subsequent data transfers. After the bit rate has been decreased, the data transfer resumes until another data transfer error condition occurs or the data transfer is completed. Data transfers and bit rate adjustments repeat until no data error is detected during a transfer of the test file.
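As a minimal sketch of this first approach, the loop below starts at the appliance's maximum bit rate and backs off by a fixed step after each detected error; the transfer_test_file callable, the TransferError exception, and the step size are placeholders assumed for illustration rather than part of any actual camera firmware.

```python
class TransferError(Exception):
    """Raised by the placeholder transfer routine when the medium cannot keep up."""

def calibrate_downward(transfer_test_file, max_bit_rate_bps, step_bps=1_000_000):
    """Return a sustainable bit rate by stepping down from the appliance maximum."""
    bit_rate = max_bit_rate_bps
    while bit_rate > 0:
        try:
            transfer_test_file(bit_rate)    # write or read the test file at this rate
            return bit_rate                 # transfer completed without a data error
        except TransferError:
            bit_rate -= step_bps            # back off by a predetermined amount and retry
    raise RuntimeError("memory medium could not sustain any tested bit rate")
```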
An alternative method starts with an initial bit rate that is slower than that required to support the transfer of the desired data stream directly to/from the external memory medium for a set of desired operational parameters. The test file is written to or read from the auxiliary/external memory medium at the initial bit rate. If a data transfer error is detected, a suitable error message indicating that the memory medium cannot support the desired data quality is communicated to an operator of the data appliance via a user interface. If no error condition is detected, the initial bit rate is increased by a predetermined amount and the test file is transferred again. The test file transfer, error condition monitoring, and bit rate adjustment steps are repeated until a data transfer error is detected or the data appliance reaches its maximum data transfer rate. When a data transfer error is encountered, the data appliance may use the last bit rate associated with a successful file transfer or may reduce the last bit rate by some other predetermined amount or by a predetermined percentage of the bit rate that produced the data transfer error. When the alternative bit rate adjustment is contemplated, the data appliance will be configured to confirm that the test file can be successfully transferred to/from the external memory medium at the final bit rate.
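A corresponding sketch of the alternative "start slow and step up" calibration follows, reusing the transfer_test_file and TransferError placeholders from the previous sketch; the step size and derating fraction are likewise illustrative assumptions.

```python
def calibrate_upward(transfer_test_file, initial_bit_rate_bps, max_bit_rate_bps,
                     step_bps=1_000_000, derate_fraction=0.1):
    """Return a sustainable bit rate by stepping up from a conservative starting rate."""
    last_good = None
    bit_rate = initial_bit_rate_bps
    while bit_rate <= max_bit_rate_bps:
        try:
            transfer_test_file(bit_rate)        # write or read the test file at this rate
            last_good = bit_rate                # remember the last successful rate
            bit_rate += step_bps                # increase by a predetermined amount and retry
        except TransferError:
            if last_good is None:
                # Even the conservative starting rate failed; report the error condition.
                raise RuntimeError("medium cannot support the desired data quality")
            # Optionally derate below the rate that failed, then confirm it still works.
            candidate = min(last_good, int(bit_rate * (1 - derate_fraction)))
            transfer_test_file(candidate)       # confirmation pass at the final bit rate
            return candidate
    return last_good                            # device maximum reached without error
```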
Additional methods for determining a sustainable data transfer rate between a data appliance and an auxiliary/external memory medium may be implemented within contemplated systems for responding to a data transfer. Select additional and previously described methods for determining a sustainable data transfer rate may be implemented for monitoring data write operations (i.e., data transfers to an external memory medium) with the same or different methods used for monitoring data read operations (i.e., data transfers from an auxiliary/external memory medium) as may be desired.
In addition to responding to a sustainable data transfer rate, present systems and methods retrieve and respond to an operator preference. In the example embodiment, the data appliance is a digital camera and the operator preference concerns whether the operator desires to stream video data that is smoother (i.e., at a higher frame rate, so that objects in motion appear to move in a continuous manner) or sharper (i.e., at a higher spatial resolution), as these video quality characteristics may be limited by the sustainable data transfer rate associated with the memory medium presently coupled to the digital camera.
After a sustainable data transfer rate and the operator preference are determined, the systems and methods select a value for an operational parameter within the data appliance to maximize the quality of data that can be streamed to the memory medium. Operational parameters include data acquisition parameters and data processing parameters. Data acquisition parameters are those variables that determine the nature of the acquired data stream. Data acquisition parameters include spatial resolution and/or frame rate. Data processing parameters are those variables that determine the nature of data compression performed on the acquired data stream. Data processing parameters include bit rate, frame type, and search area for motion vectors. After determining the sustainable read/write speed of the current external memory card or other auxiliary memory medium in use and the operator preference, one or more values associated with operational parameters can be selected to dynamically match the data rate generated on the data appliance to the sustainable data transfer speed. Selection of values associated with operational parameters includes the selection of a predetermined set of operational parameters for a range of sustainable data transfer rates.
For example, if an operator of a digital camera selects a preference of “smoothest,” a high frame rate will be prioritized over high frame resolution and low (i.e., fine) quantization. If an operator selects a preference of “sharpest,” higher frame resolution and low quantization will be prioritized over a high frame rate. Thus, for a given sustained data transfer rate and user preference, the digital camera applies a set of values associated with operational parameters (e.g., data acquisition and data processing settings) to controllably maintain its ability to stream video data to a select memory medium.
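As a minimal sketch of this tradeoff, the following ranks a handful of candidate settings against a bit-rate budget; the candidate values, the crude bit-rate estimate, and the budget are all invented for illustration and are not taken from this document.

```python
CANDIDATES = [
    # (width, height, frames/s, quantization level); lower quantization = higher quality
    (1280, 720, 15, 4),
    (640, 480, 30, 4),
    (640, 480, 15, 6),
    (320, 240, 30, 8),
]

def estimated_mbps(width, height, fps, quant):
    """Crude compressed bit-rate estimate used only to rank the candidates."""
    return width * height * fps * 0.1 / quant / 1_000_000

def choose_settings(preference, budget_mbps):
    """Pick the candidate that best matches the preference within the rate budget."""
    feasible = [c for c in CANDIDATES if estimated_mbps(*c) <= budget_mbps]
    if not feasible:
        return min(CANDIDATES, key=lambda c: estimated_mbps(*c))
    if preference == "smoothest":
        # Favor a high frame rate first, then resolution, then finer quantization.
        return max(feasible, key=lambda c: (c[2], c[0] * c[1], -c[3]))
    # "sharpest": favor resolution and fine quantization first, then frame rate.
    return max(feasible, key=lambda c: (c[0] * c[1], -c[3], c[2]))
```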
The drawings illustrate various embodiments of systems and methods for responding to a data transfer.
Operator interface 102 is coupled to control 105 via connection 103 and coupled to ASIC 110 via connection 112. Operator interface 102 coordinates the application of control inputs applied via control 105 at appropriate times under the direction of executable instructions stored in internal memory 120 and processed by ASIC 110. Control 105 includes a set of positional pushbuttons. The positional pushbuttons include right pushbutton 107 and left pushbutton 109.
ASIC 110 coordinates and controls the functions of the remaining functional items via various connections with each of the internal memory 120, display 130, data acquisition system 140, data processing system 150, and memory interface 160. As illustrated in
ASIC 110 processes or otherwise executes executable instructions provided in firmware (not shown) within ASIC 110 or within software provided in internal memory 120. ASIC 110 further coordinates the transfer of display data via connection 119 to display 130.
Display 130 can be, for example, a liquid crystal display (LCD) or other display. Display data can include frames of image data. In alternative modes, display 130 presents camera configuration information, configuration menus, and other information. Display data can be formatted in ASIC 110 or in a display controller (not shown). The display data can be formatted using any of a number of standards, including VGA and SVGA, among others.
Data acquisition system 140 is coupled to ASIC 110 via connection 115. Data acquisition system 140 is configured to obtain and forward data either via connection 115 to ASIC 110 or in alternative modes of operation to data processing system 150 via connection 145. Data acquisition system 140 captures or otherwise obtains data and forwards the acquired data in accordance with one or more data acquisition parameters. Data acquisition parameters may be stored internally within data acquisition system 140 or communicated to data acquisition system 140 at select times from ASIC 110 via connection 115.
For example, in one embodiment data acquisition system 140 is configured to capture image information. In this embodiment data acquisition system 140 includes an image sensor. The image sensor may comprise a charge coupled device (CCD) array or an array of complementary metal-oxide semiconductor (CMOS) sensors. Regardless of whether the image sensor comprises an array of individual CCD elements or CMOS sensors, each of the elements in the array comprises a picture element or pixel of the image sensor. The individual pixels of the image sensor are typically arranged in a two-dimensional array. For example, an array may comprise 2272 pixels in length and 1712 pixels in height.
The image sensor captures an image of a subject-of-interest by converting light incident upon the individual elements of the array into electrical signals. The electrical signals are forwarded to an analog-to-digital converter for converting the analog signal received from the image sensor into a digital signal. When data acquisition system 140 is configured to acquire image information over time, the data is acquired in accordance with a controllable spatial resolution and frame rate. Spatial resolution determines the number of pixels that will be used when forming a representation (e.g., a frame) of the captured image. A desired spatial resolution may or may not match the two-dimensional array of sensing elements in the image sensor. When the spatial resolution defines an array size that is lower than that provided by the image sensor, the data acquisition system 140 or ASIC 110 will drop some of the information provided by the image sensor. When the desired spatial resolution defines an array size that is higher than that provided by the image sensor, the data acquisition system 140 or ASIC 110 will insert data interpolated from closely located pixels to expand the size of the array. Frame rate determines the number of two-dimensional images provided over a fixed period of time (e.g., 30 frames/second).
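To make the relationship concrete, the short sketch below estimates the raw (uncompressed) data rate implied by a given spatial resolution and frame rate; the 2272 x 1712 array size comes from the example above, while the two-bytes-per-pixel figure is an assumption used only for illustration.

```python
def raw_data_rate_bytes_per_s(width, height, frame_rate, bytes_per_pixel=2):
    """Uncompressed data rate implied by a spatial resolution and frame rate."""
    return width * height * frame_rate * bytes_per_pixel

full_sensor_video = raw_data_rate_bytes_per_s(2272, 1712, 30)   # ~233 MB/s before compression
vga_video = raw_data_rate_bytes_per_s(640, 480, 30)             # ~18.4 MB/s before compression
print(full_sensor_video, vga_video)
```

The comparison illustrates why spatial resolution and frame rate are the acquisition parameters worth trading against each other before compression even begins.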
A movie is usually filmed at a rate of 24 frames per second; that is, every second, 24 complete images are displayed on the movie screen. American and Japanese television use the National Television System Committee (NTSC) format, which displays a total of 30 frames per second in a sequence of 60 fields, each of which contains alternating lines of the picture. Other countries use the Phase Alternating Line (PAL) format, which displays 50 fields per second but at a higher resolution. Because of these differences in frame rate and resolution, video data must be formatted for either the NTSC or the PAL system.
Data processing system 150 is coupled to ASIC 110 via connection 117. Data processing system 150 is configured to receive, format, or otherwise compress data from data acquisition system 140 via connection 145 or ASIC 110 via connection 117. Data processing system 150 formats and/or compresses data in accordance with one or more data processing parameters. Data processing parameters may be stored internally within data processing system 150 or communicated to data processing system 150 at select times from ASIC 110 via connection 117.
The compression and transmission of digital video draws on a series of different disciplines of digital signal processing, each of which can be applied independently. Video data compression systems typically employ a variety of mechanisms to efficiently encode video frames. Some well-known compression standards (e.g., MPEG) utilize transform coding (e.g., the discrete cosine transform), quantization, entropy coding, predictive coding, and control theory. Furthermore, these video compression standards contain a variety of different coding parameters and/or algorithms which may result in different performance depending on their values and/or implementation, respectively. When digital camera 100 is configured to acquire and process image information over time, the data is processed by data processing system 150 in accordance with a controllable bit rate, frame type, and search area for motion vectors. The bit rate reflects the amount of data transferred over a specific time period (e.g., 20 MB/second). The frame type defines how the image data for a specific frame is encoded. The search area defines the maximum displacement of matching blocks of information from one frame to the next, i.e., how far objects can move between frames and still be coded effectively.
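The sketch below is a minimal exhaustive block-matching search illustrating how the search-area parameter bounds the motion-vector search (and therefore the computation spent per block); it assumes two grayscale frames held as NumPy arrays and a simple sum-of-absolute-differences cost, which is a simplification of what a real encoder does.

```python
import numpy as np

def best_motion_vector(ref_frame, cur_frame, block_y, block_x, block=16, search=8):
    """Exhaustively search a +/- `search` pixel window for the best-matching block."""
    target = cur_frame[block_y:block_y + block, block_x:block_x + block]
    best = (0, 0)
    best_cost = np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = block_y + dy, block_x + dx
            if y < 0 or x < 0 or y + block > ref_frame.shape[0] or x + block > ref_frame.shape[1]:
                continue                                     # candidate falls outside the frame
            candidate = ref_frame[y:y + block, x:x + block]
            cost = np.abs(candidate.astype(int) - target.astype(int)).sum()  # SAD metric
            if cost < best_cost:
                best_cost, best = cost, (dy, dx)
    return best   # motion vector (dy, dx); a larger `search` area costs more computation
```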
While most video compression techniques use some of the techniques used in compressing still image representations to eliminate redundant data, they also use information from other frames to reduce the overall size of a file or video clip. Each frame can be encoded in one of three ways: as an intraframe, a predicted frame, or a bidirectional frame. An intraframe contains the complete image data for that frame. This method of encoding provides the least compression. A predicted frame contains just enough information to display the frame based on the most recently displayed intraframe or predicted frame; that is, it contains only the data describing how the picture has changed from the previous frame. A bidirectional frame relies on information from the surrounding intraframes or predicted frames. Using data from the closest surrounding frames, it interpolates the position and color of each pixel.
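As a small illustration of how the three frame types are typically arranged, the sketch below emits one common group-of-pictures (GOP) pattern; the GOP length and the number of bidirectional frames between anchor frames are illustrative assumptions, not values taken from this document.

```python
def gop_pattern(gop_length=12, b_frames=2):
    """Return a frame-type letter (I, P, or B) for each frame in one GOP."""
    types = []
    for i in range(gop_length):
        if i == 0:
            types.append("I")               # intraframe: complete image data
        elif i % (b_frames + 1) == 0:
            types.append("P")               # predicted from the previous I or P frame
        else:
            types.append("B")               # interpolated from surrounding frames
    return types

print("".join(gop_pattern()))   # e.g. IBBPBBPBBPBB
```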
Data processed in accordance with processing parameters is forwarded via connection 155 to memory interface 160. Memory interface 160 can store and retrieve data from an auxiliary memory medium 170 via connection 165. As illustrated in
Memory interface 160 is further coupled to internal memory 120 via connection 125 and to ASIC 110 via connection 167. During a data transfer calibration operation, memory interface 160 retrieves test file 122 from internal memory 120 via connection 125. When a sustainable data write speed is desired, memory interface 160 forwards test file 122 to the auxiliary memory medium 170 at a predetermined bit rate. An internal system clock and monitoring logic (both not shown) associated with digital camera 100 are used to confirm the bit rates associated with transfers of the test file 122. If a data transfer error occurs, memory interface 160 adjusts the bit rate until a sustainable data transfer rate is confirmed. Once confirmed, the sustainable data transfer rate for data write operations to the presently coupled auxiliary memory medium 170 is forwarded via connection 167 to ASIC 110.
When a sustainable data read speed is desired, memory interface 160 retrieves and forwards test file 122 to the auxiliary memory medium 170. Once the data transfer is complete, memory interface 160 begins to retrieve the test file 122 from auxiliary memory medium 170 at a predetermined bit rate. If a data transfer error occurs, memory interface 160 adjusts the bit rate until a sustainable data transfer rate is confirmed. The sustainable data transfer rate for data read operations from the presently coupled auxiliary memory medium 170 is forwarded via connection 167 to ASIC 110.
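One simple way to use the system clock to confirm an achieved read rate is sketched below, under the assumption of a placeholder card object with write() and read() methods (not any real memory-interface API); the measured rate could then feed the bit-rate adjustment loops sketched earlier.

```python
import time

def measure_read_back_rate(card, test_file_bytes, chunk_size=64 * 1024):
    """Write the test file to the medium, then time a full read-back to confirm the rate."""
    card.write("calibration.tmp", test_file_bytes)              # placeholder write call
    start = time.monotonic()
    data = bytearray()
    while len(data) < len(test_file_bytes):
        chunk = card.read("calibration.tmp", offset=len(data), length=chunk_size)
        if not chunk:
            break
        data.extend(chunk)
    elapsed = time.monotonic() - start
    if bytes(data) != test_file_bytes:
        raise IOError("read-back data did not match the test file")
    return len(data) * 8 / elapsed          # achieved read rate in bits per second
```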
The systems and methods for responding to a data transfer can be implemented using combinations of hardware, software, or firmware. In the illustrated embodiment(s), the systems and methods are implemented using a combination of hardware and software that is stored in an internal memory and that is executed by a suitable instruction execution system provided within an ASIC.
Hardware components of the systems for responding to a data transfer can be implemented with any or a combination of the following alternative technologies, which are all well known in the art: discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates (as described in the illustrated embodiment), a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
Software or firmware components of the systems for responding to a data transfer can be stored in one or more memory elements and executed by a suitable general purpose or application specific processor. Software or firmware for determining a sustainable data transfer rate and/or for selecting a value for at least one operational parameter associated with a digital appliance and/or a digital camera, which comprises an ordered listing of executable instructions and data for implementing logical functions, can be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus, or device, such as an appropriately configured processor-containing camera or other system that can fetch the instructions from the instruction execution system and execute the instructions. While illustrated embodiments of the present systems and methods do not include operation with a computer, those of ordinary skill will understand that software or firmware components of the systems for responding to a data transfer can be stored on and later read from a computer-readable medium. In the context of this document, a “computer-readable medium” can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system.
Reference is directed to
Similarly, rightward movement, i.e., movement towards right-side limit 214 of sliding bar 210, is responsive to an operator depressing right pushbutton 107 associated with control 105. Relative rightward movement of sliding bar 210 records an operator preference that smoother video data be streamed to the external memory medium presently coupled to the digital camera 100.
As further illustrated in
Once a preference has been entered and recorded (or a default preference is retrieved from internal memory 120) and a sustainable data transfer rate has been established and forwarded to ASIC 110, the configuration logic 114 within ASIC 110 determines a suitable set of values to apply to the operational parameters of digital camera 100. Configuration logic 114 compares the sustainable data transfer rate, by type (i.e., data write, data read) and by speed, against sets of predetermined operational parameter values suitable for operating the digital camera 100 across a range of operator preference levels for sharp/smooth representations and sustainable data rates. Once configuration logic 114 identifies the appropriate set of operational parameter values for the presently selected operator preference and sustainable data transfer rate, ASIC 110 forwards the values to the appropriate system. Thereafter, the digital camera 100 is configured to acquire, process, and stream video data to external memory medium 170. Note that one or more values associated with respective operational parameters may not change as a result of the application of a specific operator preference level for smooth/sharp video representations and a data transfer calibration.
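A hedged sketch of how configuration logic 114 might perform this comparison follows; the rate breakpoints, the binning of a 0.0 to 1.0 slider position into a preference level, and the parameter values themselves are invented for illustration and are not taken from this document.

```python
from bisect import bisect_right

RATE_BREAKPOINTS_MBPS = [2, 4, 8]            # rate ranges: <2, 2-4, 4-8, >=8 Mb/s
PARAMETER_SETS = {
    # (preference_bin, rate_bin) -> predetermined operational parameter values
    ("smooth", 3): {"resolution": (640, 480),  "frame_rate": 30, "quantization": 4},
    ("smooth", 2): {"resolution": (320, 240),  "frame_rate": 30, "quantization": 6},
    ("sharp",  3): {"resolution": (1280, 720), "frame_rate": 15, "quantization": 4},
    ("sharp",  2): {"resolution": (640, 480),  "frame_rate": 15, "quantization": 6},
}

def select_parameter_set(slider_position, write_rate_mbps, read_rate_mbps):
    """Bin the slider preference and the limiting transfer rate, then look up a preset."""
    preference = "smooth" if slider_position >= 0.5 else "sharp"
    limiting_rate = min(write_rate_mbps, read_rate_mbps)    # streaming must satisfy both types
    rate_bin = bisect_right(RATE_BREAKPOINTS_MBPS, limiting_rate)
    while rate_bin > 0:
        params = PARAMETER_SETS.get((preference, rate_bin))
        if params is not None:
            return params
        rate_bin -= 1                        # fall back to the next lower rate range
    raise ValueError("no parameter set defined for such a low sustainable rate")
```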
Any process descriptions or blocks in the flow diagrams illustrated in
After a sustainable data transfer rate is determined for the contemplated data transfer operation (e.g., a data write operation when transferring data from the respective device to the auxiliary memory medium) and a user preference is determined, a value for at least one operational parameter within the data appliance is selected in response to the sustainable data transfer rate and the user preference, as indicated in block 306. Thereafter, as illustrated in block 308, video data is processed in accordance with the at least one operational parameter. Note that video data may or may not be processed in accordance with an operational parameter that was adjusted as a result of the application of the function associated with block 306. That is, video data may be acquired using one or more predetermined or default values for data acquisition parameters that match the selected value(s). In addition, video data may be compressed using one or more predetermined or default values for data compression parameters that match the selected value(s).
After a sustainable data transfer rate is determined for the contemplated data transfer operation (e.g., a data write operation when transferring data from the respective device to the auxiliary memory medium) and the user preference is recorded, a value for at least one operational parameter is selected in response to the sustainable data transfer rate and the user preference, as indicated in block 406. Thereafter, as illustrated in block 408, data is streamed in accordance with the at least one operational parameter and perhaps other operational parameters associated with digital camera 100. Note that data may or may not be streamed in accordance with an operational parameter that was adjusted as a result of the application of the function associated with block 406. That is, data may be acquired using one or more predetermined or default values for data acquisition parameters that match the selected value(s). In addition, data may be compressed using one or more predetermined or default values for data compression parameters that match the selected value(s).
After a sustainable data transfer rate is determined for the contemplated data transfer operation (e.g., a data write operation when transferring data from the respective device to the external memory medium) and the operator preference is retrieved, a set of operational parameters responsive to the operator preference and the sustainable data transfer rate is selected, as indicated in block 506. Thereafter, the set of operational parameters is applied to generate a video data stream, as illustrated in block 508. Note that the video data stream may or may not be generated in accordance with an operational parameter that was adjusted as a result of the application of the function associated with block 506. That is, data may be acquired using one or more predetermined or default values for data acquisition parameters that match the selected value(s). In addition, data may be compressed using one or more predetermined or default values for data compression parameters that match the selected value(s).
Once the preference and the sustainable data transfer rate have been established, a set of operational parameters responsive to the preference and the sustainable data transfer rate is identified, as indicated in block 608. Thereafter, as illustrated in block 610, the data appliance is configured in response to an identified set of values associated with the operational parameters. Note that the data appliance configuration may or may not be adjusted as a result of the application of the function associated with block 608. That is, the set of operational parameters selected may match predetermined or default values.
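Tying the pieces together, the sketch below walks the same overall flow (read the preference, calibrate write and read rates, select a parameter set, configure the appliance), reusing the placeholder helpers from the earlier sketches; the camera and card objects and all of their methods are hypothetical and stand in for the appliance's firmware interfaces.

```python
def configure_data_appliance(camera, card):
    """Sketch of the overall flow: preference, calibration, parameter selection, configuration."""
    slider = camera.read_preference_slider()                    # 0.0 (sharpest) .. 1.0 (smoothest)
    write_bps = calibrate_downward(card.write_test_file, camera.max_bit_rate_bps)
    read_bps = calibrate_downward(card.read_test_file, camera.max_bit_rate_bps)
    params = select_parameter_set(slider, write_bps / 1e6, read_bps / 1e6)
    camera.apply_operational_parameters(params)                 # may simply match the current defaults
    return params
```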
It should be emphasized that the above-described embodiments are merely examples of implementations of the systems and methods for responding to a data transfer. Many variations and modifications may be made to the above-described embodiments. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.