Exemplary embodiments of the present invention relate, generally, to motion and image quality monitoring and, in particular, to a technique for improving image matching and/or providing power savings through motion and image quality monitoring.
With the widespread use of camera-equipped mobile phones, camera applications are becoming increasingly popular among mobile phone users. As a result, mobile applications based on image matching or recognition, including, for example, what is referred to as a mobile visual search, are emerging. One such application is the visual search system described in U.S. application Ser. No. 11/592,460, entitled “Scalable Visual Search System Simplifying Access to Network and Device Functionality,” the contents of which are hereby incorporated herein by reference in their entirety.
Unlike keyword searches, visual search systems are typically based on analyzing the perceptual content of a media object or media content, such as images or video data (e.g., video clips), using an input sample image as the query. A visual search system is thus different from the so-called image search commonly employed on the Internet, where keywords entered by users are matched to relevant image files on the Internet. Visual search systems are typically based on sophisticated algorithms that analyze a media object, such as an input image (e.g., an image captured by a user using a camera operating on his or her mobile phone), with respect to a variety of image features or properties, such as color, texture, shape, complexity, and objects and regions within the image. To facilitate efficient visual searches, the images, along with their properties and other metadata associated with the images, are usually indexed and stored in a visual database, such as a centralized database that stores predefined point-of-interest (“POI”) images together with their corresponding features and related metadata (e.g., textual tags). In a mobile visual search, the mobile device takes advantage of this large visual database to match against input images. After matching an input image with an image stored in the visual database, the mobile visual search application can transmit to the user the context information tagged to the stored image. Based on the foregoing, it is clear that the robustness of the image matching engine used to match the input image to an image of the visual database plays a critical role in a mobile visual search system.
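By way of a non-limiting illustration only, the following Python sketch shows how a rudimentary visual database might match a query image against stored POI images using a single image feature (a global color histogram) and nearest-neighbor distance. The names used (extract_histogram, VisualDatabase) are hypothetical and are not taken from the system described above, which may combine many features and a far larger index.

```python
# Toy visual-search matching sketch (assumes NumPy is available).
import numpy as np

def extract_histogram(image: np.ndarray, bins: int = 16) -> np.ndarray:
    """Compute a normalized per-channel color histogram as a feature vector."""
    channels = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
                for c in range(image.shape[-1])]
    feature = np.concatenate(channels).astype(np.float64)
    return feature / (feature.sum() + 1e-12)

class VisualDatabase:
    """Maps stored POI images (as feature vectors) to their metadata tags."""
    def __init__(self):
        self.entries = []  # list of (feature_vector, metadata) pairs

    def add(self, image: np.ndarray, metadata: dict) -> None:
        self.entries.append((extract_histogram(image), metadata))

    def query(self, image: np.ndarray) -> dict:
        """Return the metadata of the stored image whose feature is closest."""
        q = extract_histogram(image)
        best = min(self.entries, key=lambda e: np.linalg.norm(e[0] - q))
        return best[1]
```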
There are special problems, however, with using the camera on a mobile phone for applications based on image matching, such as mobile visual searching. One of the major problems is the quality of the input images. Due to the form factor and the spontaneous nature of imaging applications on a mobile device, motion is a significant problem and can substantially reduce input image quality by introducing what is referred to as “motion blur.” This, in turn, degrades the performance of image matching applications. Experimental results show that motion blurring is one of the major factors limiting image matching performance on a mobile device.
Another problem relates to the user experience. Due to hand motion and other sources of image noise, image matching results can “flip over,” or change, repeatedly, thus providing a poor user experience. In particular, when a phone is moving, a recognition engine may return incorrect results due to motion blur and other artifacts.
A need, therefore, exists for a way to ensure that poor image quality caused, for example, by movement or changes in environmental conditions does not detrimentally affect the operation of various image matching applications, such as mobile visual search applications.
In general, exemplary embodiments of the present invention provide an improvement over the known prior art by, among other things, providing a way to monitor the motion and/or image quality associated with a captured image being used, for example, in conjunction with various image matching or recognition applications, such as a mobile visual search application. According to exemplary embodiments of the present invention, a monitor can detect changes in image quality and, for example, only allow the captured image to be used in conjunction with an image matching application (e.g., a visual search application) when the image features have stabilized. One result of requiring that only stabilized images be used in a visual search application is that the user's experience is greatly improved by reducing the number of times the application “flips over” or provides a different result. According to other exemplary embodiments, detected changes in motion and/or image quality may be used for energy saving purposes, for example, by switching on and off various applications and/or components operating on the mobile device depending upon the amount of motion detected and/or the quality of the image captured.
According to one aspect, a method is provided of monitoring motion and image quality of a captured image. In one embodiment, the method may include: (1) detecting motion in a captured image; and (2) taking an action in response to the motion detected, wherein the action includes either stabilizing the captured image prior to using the captured image in an image matching application or conserving power in response to the motion detected exceeding a predetermined threshold.
In one exemplary embodiment, detecting motion in a captured image involves comparing one or more features of two or more consecutive frames of the captured image. Comparing the features may, in turn, involve: (1) sampling two or more frames of the captured image; (2) filtering the two or more sampled frames to remove noise; (3) extracting the one or more features from the sampled frames; and (4) computing a difference between the extracted features of the sampled frames. In one exemplary embodiment, comparing the one or more features of the two or more consecutive frames of the captured image may further involve dividing respective sampled frames into two or more sub-regions, wherein filtering the two or more sampled frames comprises filtering respective sub-regions of the sampled frames, extracting one or more features from the sampled frames comprises extracting one or more features from respective sub-regions of the sampled frames, and computing a difference between the extracted features comprises computing the difference between extracted features for respective sub-regions of the sampled frames. The method of this exemplary embodiment may further include accumulating the computed difference between extracted features for respective sub-regions and integrating the accumulated differences of the two or more sub-regions.
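By way of a non-limiting illustration of steps (1)-(4) above, the following Python sketch assumes grayscale frames represented as NumPy arrays (with SciPy available), uses a Gaussian filter for noise removal and a normalized intensity histogram as a stand-in feature, and acts on the difference between the features of two sampled frames. The threshold constant and the matcher and power_manager objects are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # simple denoising filter

def extract_features(frame: np.ndarray, bins: int = 32) -> np.ndarray:
    """A stand-in feature: the normalized intensity histogram of the frame."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 255))
    return hist / (hist.sum() + 1e-12)

def frame_difference(prev_frame: np.ndarray, curr_frame: np.ndarray) -> float:
    """Steps (2)-(4): filter both sampled frames, extract features, diff."""
    prev_f = extract_features(gaussian_filter(prev_frame, sigma=1.0))
    curr_f = extract_features(gaussian_filter(curr_frame, sigma=1.0))
    return float(np.abs(prev_f - curr_f).sum())  # L1 distance between features

MOTION_THRESHOLD = 0.10  # hypothetical tuning constant

def on_new_frame(prev_frame, curr_frame, matcher, power_manager):
    """Act on the detected motion: stabilize the input or conserve power."""
    if frame_difference(prev_frame, curr_frame) > MOTION_THRESHOLD:
        power_manager.conserve()      # e.g., dim the backlight, pause matching
    else:
        matcher.submit(curr_frame)    # frame is stable enough to match
```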
According to another aspect, an apparatus is provided for monitoring motion and image quality of a captured image. In one exemplary embodiment, the apparatus includes a processor and a memory in communication with the processor and storing an application executable by the processor. The application may, in one exemplary embodiment, be configured, upon execution, to detect motion in a captured image and to cause an action to be taken in response to the motion detected, wherein the action includes either stabilizing the captured image prior to using the captured image in an image matching application or conserving power in response to the motion detected exceeding a predetermined threshold.
According to yet another aspect, a computer program product is provided for monitoring motion and image quality of a captured image. The computer program product may include at least one computer-readable storage medium having computer-readable program code portions stored therein. In one exemplary embodiment, the computer-readable program code portions include: (1) a first executable portion for detecting motion in a captured image; and (2) a second executable portion for causing an action to be taken in response to the motion detected, wherein the action includes either stabilizing the captured image prior to using the captured image in an image matching application or conserving power in response to the motion detected exceeding a predetermined threshold.
In accordance with another aspect, an apparatus is provided for monitoring motion and image quality of a captured image. In one exemplary embodiment, the apparatus includes: (1) means for detecting motion in a captured image; and (2) means for taking an action in response to the motion detected, wherein the action includes either stabilizing the captured image prior to using the captured image in an image matching application or conserving power in response to the motion detected exceeding a predetermined threshold.
Having thus described exemplary embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale.
Exemplary embodiments of the present invention now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments of the inventions are shown. Indeed, exemplary embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like numbers refer to like elements throughout.
In general, exemplary embodiments of the present invention provide a technique for monitoring the motion and image quality of a captured image. Where poor image quality and/or a high degree of motion is detected, various steps or actions can be taken by the mobile device in response. For example, in one exemplary embodiment, where a substantial amount of change is detected between frames of a captured image, indicating, for example, that the quality of the captured image is low, a visual search system of the kind discussed above may be instructed not to update a search query based on the new image frame. In other words, the motion and image quality monitor of exemplary embodiments may be used to ensure that the image used by the visual search, or similar image matching, application is stabilized prior to use. Exemplary embodiments, therefore, reduce the number of times such an application “flips over” or provides new results; thus improving a user's overall experience. In another exemplary embodiment, the motion and image quality monitor may be used for power savings purposes by, for example, causing one or more components of the mobile device, or the device itself, to be turned off in response to motion detected.
The change detected by the motion and image quality monitor may be a result of motion, for example caused by user hand movements, and/or an environmental change, such as lighting. In particular, the motion and image quality monitor may use the same image features as used in image matching to compare sampled frames. As a result, the motion and image quality monitor of exemplary embodiments may not only be used to monitor motions, but also as a general input image quality monitor.
In addition, as discussed in more detail below, the motion and image quality monitor of one exemplary embodiment may be designed to work together with an image matching system in order to minimize the additional computations, and corresponding overhead, needed to perform the motion and image quality monitoring.
The motion and image quality monitor of exemplary embodiments may be implemented on a one-camera or multiple-camera mobile device, as well as on any other mobile device with any kind of sensor, including, but not limited to, motion sensors.
In addition, while several embodiments of the method of the present invention are performed or used by a mobile terminal 10, the method may be employed by devices other than a mobile terminal. Moreover, the system and method of exemplary embodiments of the present invention will be primarily described in conjunction with mobile communications applications. It should be understood, however, that the system and method of exemplary embodiments of the present invention can be utilized in conjunction with a variety of other applications, both within and outside of the mobile communications industries.
As shown in the accompanying drawings, the mobile terminal 10 of one exemplary embodiment may include, among the components described below, a transmitter 14, a receiver 16, and a controller 20 that provides signals to and receives signals from the transmitter and receiver, respectively.
It is understood that the controller 20 includes circuitry required for implementing audio and logic functions of the mobile terminal 10. For example, the controller 20 may be comprised of a digital signal processor device, a microprocessor device, and various analog to digital converters, digital to analog converters, and other support circuits. Control and signal processing functions of the mobile terminal 10 are allocated between these devices according to their respective capabilities. The controller 20 thus may also include the functionality to convolutionally encode and interleave message and data prior to modulation and transmission. The controller 20 can additionally include an internal voice coder, and may include an internal data modem. Further, the controller 20 may include functionality to operate one or more software programs, which may be stored in memory. For example, the controller 20 may be capable of operating a connectivity program, such as a conventional Web browser. The connectivity program may then allow the mobile terminal 10 to transmit and receive Web content, such as location-based content, according to a Wireless Application Protocol (WAP), for example.
The mobile terminal 10 may also comprise a user interface including an output device such as a conventional earphone or speaker 24, a ringer 22, a microphone 26, a display 28, and a user input interface, all of which are coupled to the controller 20. The user input interface, which allows the mobile terminal 10 to receive data, may include any of a number of devices allowing the mobile terminal 10 to receive data, such as a keypad 30, a touch display (not shown) or other input device. In embodiments including the keypad 30, the keypad 30 may include the conventional numeric (0-9) and related keys (#, *), and other keys used for operating the mobile terminal 10. Alternatively, the keypad 30 may include a conventional QWERTY keypad. The mobile terminal 10 further includes a battery 34, such as a vibrating battery pack, for powering various circuits that are required to operate the mobile terminal 10, as well as optionally providing mechanical vibration as a detectable output.
In an exemplary embodiment, the mobile terminal 10 may include a camera module 36 in communication with the controller 20. The camera module 36 may be any means for capturing an image or a video clip or video stream for storage, display or transmission. For example, the camera module 36 may include a digital camera capable of forming a digital image file from an object in view, a captured image or a video stream from recorded video data. As such, the camera module 36 may include all hardware, such as a lens or other optical device, and software necessary for creating a digital image file from a captured image or a video stream from recorded video data. Alternatively, the camera module 36 may include only the hardware needed to view an image, or video stream while a memory device of the mobile terminal 10 stores instructions for execution by the controller 20 in the form of software necessary to create a digital image file from a captured image or a video stream from recorded video data. In an exemplary embodiment, the camera module 36 may further include a processing element such as a co-processor which assists the controller 20 in processing image data or a video stream and an encoder and/or decoder for compressing and/or decompressing image data or a video stream. The encoder and/or decoder may encode and/or decode according to a JPEG standard format, and the like.
The mobile terminal 10 may further include a location module 70, such as a GPS module, in communication with the controller 20. The location module 70 may be any means for locating the position of the mobile terminal 10. Additionally, the location module 70 may be any means for locating the position of points-of-interest (POIs) in images captured by the camera module 36, such as, for example, shops, bookstores, restaurants, coffee shops, department stores and other businesses and the like, as described more fully in U.S. Provisional Application No. 60/913,733, entitled “Method, Device, Mobile Terminal and Computer Program Product for a Point of Interest-Based Scheme for Improving Mobile Visual Searching Functionalities” (“the '733 application”), the contents of which are hereby incorporated herein by reference. As such, points-of-interest as used herein may include any entity of interest to a user, such as products and other objects and the like. The location module 70 may include all hardware for locating the position of the mobile terminal or of a POI in an image. Alternatively or additionally, the location module 70 may utilize a memory device of the mobile terminal 10 to store instructions for execution by the controller 20 in the form of software necessary to determine the position of the mobile terminal or an image of a POI. Additionally, the location module 70 may be capable of utilizing the controller 20 to transmit/receive, via the transmitter 14/receiver 16, locational information, such as the position of the mobile terminal 10 and a position of one or more POIs, to a server, such as the visual map server 54 (also referred to herein as a visual search server) and the point-of-interest shop server 51 (also referred to herein as a visual search database), described more fully below.
The mobile terminal 10 of one exemplary embodiment may also include a unified mobile visual search/mapping client 68 (also referred to herein as the visual search client) for the purpose of implementing a mobile visual search, for example, of the kind discussed above. The unified visual search client 68 may include a mapping module 99 and a mobile visual search engine 97 (also referred to herein as the mobile visual search module). The unified mobile visual search/mapping client 68 may include any means of hardware and/or software, executed by the controller 20, capable of recognizing points-of-interest when the mobile terminal 10 is pointed at POIs, when the POIs are in the line of sight of the camera module 36, or when the POIs are captured in an image by the camera module, as described more fully in the '733 application. The mobile visual search engine 97 may also be capable of receiving location and position information of the mobile terminal 10 as well as the position of POIs. The mobile visual search engine 97 may further be capable of recognizing or identifying POIs and enabling a user of the mobile terminal 10 to select from a list of several actions that are relevant to a respective POI. For example, one of the actions may include, but is not limited to, searching for other similar POIs (i.e., candidates) within a geographic area. These similar POIs may be stored in a user profile in the mapping module 99. Additionally, in one exemplary embodiment, the mapping module 99 may launch a third-person map view and a first-person camera view of the camera module 36. The camera view, when executed, shows the surrounding area of the mobile terminal 10 and superimposes a set of visual tags that correspond to a set of POIs.
According to one exemplary embodiment, the visual search client 68 may further include a motion and/or image quality monitor 92 for monitoring the quality of an image captured by the camera module 36 as determined, for example, by the relative change in image features resulting from motion and/or other environmental changes. Where, for example, a substantial amount of change (e.g., motion) is detected, causing the image quality to be poor, the captured image may not be used by the visual search engine 97 to locate POIs and provide the user with feedback associated with those POIs. Alternatively, or in addition, as discussed in more detail below, a determination that a significant amount (or some predetermined amount) of motion or change has occurred may result in some other action being taken with respect to the mobile terminal 10 and/or the camera module 36 (e.g., turning off the camera module 36, turning off a backlight, switching the input method for the visual search client, etc.). The motion and/or image quality monitor 92 of exemplary embodiments may, therefore, include any means of hardware and/or software, executed by the controller 20, capable of determining the relative motion and/or image quality of a captured image and responding accordingly.
The mobile terminal 10 may further include a user identity module (UIM) 38. The UIM 38 is typically a memory device having a processor built in. The UIM 38 may include, for example, a subscriber identity module (SIM), a universal integrated circuit card (UICC), a universal subscriber identity module (USIM), a removable user identity module (R-UIM), etc. The UIM 38 typically stores information elements related to a mobile subscriber. In addition to the UIM 38, the mobile terminal 10 may be equipped with memory. For example, the mobile terminal 10 may include volatile memory 40, such as volatile Random Access Memory (RAM) including a cache area for the temporary storage of data. The mobile terminal 10 may also include other non-volatile memory 42, which can be embedded and/or may be removable. The non-volatile memory 42 can additionally or alternatively comprise an EEPROM, flash memory or the like, such as that available from the SanDisk Corporation of Sunnyvale, Calif., or Lexar Media Inc. of Fremont, Calif. The memories can store any of a number of pieces of information, and data, used by the mobile terminal 10 to implement the functions of the mobile terminal 10. For example, the memories can include an identifier, such as an international mobile equipment identification (IMEI) code, capable of uniquely identifying the mobile terminal 10.
Referring now to the system of one exemplary embodiment, the mobile terminal 10 may be coupled to a base station (BS) 44 which, in turn, may be coupled to a mobile switching center (MSC) 46.
The MSC 46 can be coupled to a data network, such as a local area network (LAN), a metropolitan area network (MAN), and/or a wide area network (WAN). The MSC 46 can be directly coupled to the data network. In one typical embodiment, however, the MSC 46 is coupled to a GTW 48, and the GTW 48 is coupled to a WAN, such as the Internet 50. In turn, devices such as processing elements (e.g., personal computers, server computers or the like) can be coupled to the mobile terminal 10 via the Internet 50. For example, as explained below, the processing elements can include one or more processing elements associated with a computing system 52, visual map server 54, point-of-interest shop server 51, or the like, as described below.
The BS 44 can also be coupled to a serving GPRS (General Packet Radio Service) support node (SGSN) 56. As known to those skilled in the art, the SGSN 56 is typically capable of performing functions similar to the MSC 46 for packet-switched services. The SGSN 56, like the MSC 46, can be coupled to a data network, such as the Internet 50. The SGSN 56 can be directly coupled to the data network. In a more typical embodiment, however, the SGSN 56 is coupled to a packet-switched core network, such as a GPRS core network 58. The packet-switched core network is then coupled to another GTW 48, such as a GTW GPRS support node (GGSN) 60, and the GGSN 60 is coupled to the Internet 50. In addition to the GGSN 60, the packet-switched core network can also be coupled to a GTW 48. Also, the GGSN 60 can be coupled to a messaging center. In this regard, the GGSN 60 and the SGSN 56, like the MSC 46, may be capable of controlling the forwarding of messages, such as MMS messages. The GGSN 60 and SGSN 56 may also be capable of controlling the forwarding of messages for the mobile terminal 10 to and from the messaging center.
In addition, by coupling the SGSN 56 to the GPRS core network 58 and the GGSN 60, devices such as a computing system 52 and/or visual map server 54 may be coupled to the mobile terminal 10 via the Internet 50, SGSN 56 and GGSN 60. In this regard, devices such as the computing system 52 and/or visual map server 54 may communicate with the mobile terminal 10 across the SGSN 56, GPRS core network 58 and the GGSN 60. By directly or indirectly connecting mobile terminals 10 and the other devices (e.g., computing system 52, visual map server 54, etc.) to the Internet 50, the mobile terminals 10 may communicate with the other devices and with one another, such as according to the Hypertext Transfer Protocol (HTTP), to thereby carry out various functions of the mobile terminals 10.
Although not every element of every possible mobile network is shown and described herein, it should be appreciated that the mobile terminal 10 may be coupled to one or more of any of a number of different networks through the BS 44. In this regard, the network(s) can be capable of supporting communication in accordance with any one or more of a number of first-generation (1G), second-generation (2G), 2.5G, third-generation (3G) and/or future mobile communication protocols or the like. For example, one or more of the network(s) can be capable of supporting communication in accordance with 2G wireless communication protocols IS-136 (TDMA), GSM, and IS-95 (CDMA). Also, for example, one or more of the network(s) can be capable of supporting communication in accordance with 2.5G wireless communication protocols GPRS, Enhanced Data GSM Environment (EDGE), or the like. Further, for example, one or more of the network(s) can be capable of supporting communication in accordance with 3G wireless communication protocols, such as a Universal Mobile Telecommunications System (UMTS) network employing Wideband Code Division Multiple Access (WCDMA) radio access technology. Some narrow-band AMPS (NAMPS), as well as TACS, network(s) may also benefit from embodiments of the present invention, as should dual or higher mode mobile stations (e.g., digital/analog or TDMA/CDMA/analog phones).
The mobile terminal 10 can further be coupled to one or more wireless access points (APs) 62. The APs 62 may comprise access points configured to communicate with the mobile terminal 10 in accordance with techniques such as, for example, radio frequency (RF), Bluetooth (BT), Wibree, infrared (IrDA) or any of a number of different wireless networking techniques, including wireless LAN (WLAN) techniques such as IEEE 802.11 (e.g., 802.11a, 802.11b, 802.11g, 802.11n, etc.), WiMAX techniques such as IEEE 802.16, and/or ultra wideband (UWB) techniques such as IEEE 802.15 or the like. The APs 62 may be coupled to the Internet 50. As with the MSC 46, the APs 62 can be directly coupled to the Internet 50. In one embodiment, however, the APs 62 are indirectly coupled to the Internet 50 via a GTW 48. Furthermore, in one embodiment, the BS 44 may be considered as another AP 62. As will be appreciated, by directly or indirectly connecting the mobile terminals 10, the computing system 52, the visual map server 54, and/or any of a number of other devices to the Internet 50, the mobile terminals 10 can communicate with one another, the computing system 52, the visual map server 54, the POI shop server 51, or other devices, to thereby carry out various functions of the mobile terminals 10, such as to transmit data, content or the like to, and/or receive content, data or the like from, the computing system 52, visual map server 54 and/or POI shop server 51. For example, the visual map server 54 may provide map data by way of a map server 96.
The information relating to one or more POIs may be linked to one or more visual tags which may be transmitted to a mobile terminal 10 for display. Moreover, the point-of-interest shop server 51 may store data regarding the geographic location of one or more POI shops and may store data pertaining to various points-of-interest including, but not limited to, the location of a POI, the category of a POI (e.g., coffee shops or restaurants, sporting venues, concerts, etc.), product information relative to a POI, and the like. The visual map server 54 may transmit and receive information from the point-of-interest shop server 51 and communicate with a mobile terminal 10 via the Internet 50. Likewise, the point-of-interest shop server 51 may communicate with the visual map server 54 and alternatively, or additionally, may communicate with the mobile terminal 10 directly via a WLAN, Bluetooth, Wibree or the like transmission or via the Internet 50. As used herein, the terms “images,” “video clips,” “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of the present invention.
An exemplary mobile visual search application implemented by a visual search system will now be described with reference to the accompanying drawings.
Referring now to the visual search system of one exemplary embodiment, the visual map server 54 may be any device or means, such as hardware or software, capable of storing map data as well as information pertaining to points-of-interest.
Similarly, the point-of-interest shop server 51 may be any device or means, such as hardware or software, capable of storing information pertaining to points-of-interest. The point-of-interest shop server 51 may include a processor for carrying out or executing functions or software instructions.
As discussed above, exemplary embodiments of the present invention provide a motion and image quality monitor for monitoring the quality of images captured by the camera module 36 and used, for example, in the mobile visual search, or similar image matching or recognition, application discussed above. In one exemplary embodiment, the motion and image quality monitor may be implemented as the motion and/or image quality monitor 92 of the visual search client 68 described above.
The following describes one exemplary method for performing the above-described monitoring. According to this exemplary method, two or more frames of the captured image or video stream may first be sampled and then filtered in order to remove noise.
Each input image or video frame sampled may then be divided into a grid including a plurality of sub-regions, and the one or more features used for the comparison may be extracted from respective sub-regions of the sampled frames.
Turning now to the remainder of the exemplary method, a difference between the features extracted from corresponding sub-regions of consecutive sampled frames may be computed, the computed differences for respective sub-regions may be accumulated, and the accumulated differences of the plurality of sub-regions may be integrated in order to derive an overall measure of the motion and image quality associated with the captured image.
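A non-limiting Python sketch of this sub-region variant follows, again assuming 2-D grayscale frames as NumPy arrays. The 4x4 grid, the mean-intensity stand-in feature, and the exponential accumulation are illustrative choices rather than requirements of the method.

```python
import numpy as np

GRID = (4, 4)  # 4x4 grid of sub-regions (hypothetical)

def subregion_features(frame: np.ndarray) -> np.ndarray:
    """Extract one feature (here, mean intensity) per grid sub-region."""
    rows, cols = GRID
    h, w = frame.shape[0] // rows, frame.shape[1] // cols
    return np.array([[frame[r*h:(r+1)*h, c*w:(c+1)*w].mean()
                      for c in range(cols)] for r in range(rows)])

class SubregionMonitor:
    def __init__(self, decay: float = 0.8):
        self.accumulated = np.zeros(GRID)  # per-sub-region running differences
        self.prev = None
        self.decay = decay                 # exponential forgetting factor

    def update(self, frame: np.ndarray) -> float:
        feats = subregion_features(frame)
        if self.prev is not None:
            diff = np.abs(feats - self.prev)
            self.accumulated = self.decay * self.accumulated + diff
        self.prev = feats
        # Integrate the accumulated per-sub-region differences into one score.
        return float(self.accumulated.mean())
```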
As noted above, the foregoing is just one method that may be used to detect motion and ascertain image quality, and other, similar methods may likewise be used without departing from the spirit and scope of exemplary embodiments of the present invention. For example, according to one exemplary embodiment, the mobile device may include an acceleration sensor capable of detecting acceleration along a certain axis (e.g., the x, y or z axis). Motion may be detected based on a threshold of acceptable versus unacceptable acceleration, as detected by the acceleration sensor. In this exemplary embodiment, consecutive frames need not be analyzed; instead, a threshold of maximum allowed motion may be set.
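By way of a non-limiting illustration, the acceleration-based alternative might be sketched as follows, where read_acceleration() is a hypothetical stand-in for a platform sensor API and the threshold value is purely illustrative.

```python
import math

MAX_ALLOWED_ACCELERATION = 1.5  # m/s^2 above rest; hypothetical tuning value
GRAVITY = 9.81

def read_acceleration() -> tuple[float, float, float]:
    """Placeholder for the device's acceleration sensor readout."""
    raise NotImplementedError("supplied by the platform sensor API")

def motion_detected() -> bool:
    ax, ay, az = read_acceleration()
    magnitude = math.sqrt(ax*ax + ay*ay + az*az)
    # Compare the deviation from gravity against the maximum allowed motion.
    return abs(magnitude - GRAVITY) > MAX_ALLOWED_ACCELERATION
```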
As discussed above, the mobile device of exemplary embodiments may take one or more of several actions in response to the detection of poor image quality or significant amounts of change or motion between image frames.
Where, for example, the image matching application is a mobile visual search application of the kind discussed above, in one exemplary embodiment the visual search client 68 may be instructed not to update the search query based on a new image frame until the motion and image quality monitor determines that the image features have stabilized, thereby reducing the number of times the search results “flip over.”
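One non-limiting way such gating might be sketched is shown below: the search query is updated only after the change score reported by the monitor has remained below a stability threshold for several consecutive frames. Both constants are hypothetical tuning values.

```python
STABILITY_THRESHOLD = 0.05   # maximum change score considered "stable"
REQUIRED_STABLE_FRAMES = 5   # consecutive stable frames before re-querying

class QueryGate:
    def __init__(self):
        self.stable_count = 0

    def should_update_query(self, change_score: float) -> bool:
        if change_score < STABILITY_THRESHOLD:
            self.stable_count += 1
        else:
            self.stable_count = 0  # motion detected; restart the count
        return self.stable_count >= REQUIRED_STABLE_FRAMES
```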
As discussed above, detected changes in motion and/or image quality may further be used for power saving purposes. Actions that may be taken in accordance with this exemplary embodiment may include, for example, turning off the camera module 36, turning off a backlight, switching the input method for the visual search client, or switching off one or more other components or applications of the mobile device, depending upon the amount of motion detected and/or the quality of the image captured.
As will be understood by one of ordinary skill in the art, where a decision is made to turn off a particular device, or to switch to a particular application or input method, when the motion detected is high and/or the image quality is determined to be low (e.g., as compared to some predefined threshold value), an opposite decision may likewise be made when the motion and image quality monitor determines that the motion detected is low and/or the image quality is high.
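A non-limiting sketch of this symmetric behavior follows, using two thresholds (i.e., hysteresis) so that components are switched off when the change score is high and back on only once the score has dropped well below the switch-off level, thereby avoiding rapid toggling near a single cut-off. The component interfaces and threshold values are hypothetical.

```python
HIGH_MOTION = 0.30  # above this, switch power-hungry components off
LOW_MOTION = 0.10   # below this, switch them back on

class PowerPolicy:
    def __init__(self, camera, backlight):
        self.camera, self.backlight = camera, backlight
        self.conserving = False

    def apply(self, change_score: float) -> None:
        if not self.conserving and change_score > HIGH_MOTION:
            self.camera.off()       # high motion: conserve power
            self.backlight.off()
            self.conserving = True
        elif self.conserving and change_score < LOW_MOTION:
            self.camera.on()        # motion has settled: restore components
            self.backlight.on()
            self.conserving = False
```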
According to another exemplary embodiment, the mobile device may further be capable of detecting when the mobile device has been put away, for example, in a pocket or a handbag. In particular, according to this exemplary embodiment, the mobile device may be configured to analyze the level of ambient light that the camera module is receiving. Where, for example, there is an insufficient amount of light to recognize objects in the line of sight of the camera module, the mobile device may assume that the device is in a pocket or handbag and go to sleep. The mobile device may, thereafter, wake up at intervals to determine whether the camera can see something meaningful. The foregoing is beneficial since placing a mobile device in one's pocket and forgetting to turn it off can drain the battery of the mobile device sooner than expected.
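By way of a non-limiting illustration, the pocket-detection behavior might be sketched as follows, where the mean brightness of a captured frame stands in for a true ambient light measurement; the darkness threshold and wake interval are hypothetical.

```python
import time
import numpy as np

DARK_THRESHOLD = 10.0    # mean pixel intensity below which we assume "pocket"
WAKE_INTERVAL_S = 30.0   # how often to wake up and re-check

def mean_brightness(frame: np.ndarray) -> float:
    return float(frame.mean())

def pocket_watchdog(capture_frame, enter_sleep):
    """Sleep while the camera sees only darkness; wake at intervals to check."""
    while True:
        frame = capture_frame()
        if mean_brightness(frame) < DARK_THRESHOLD:
            enter_sleep()            # power down camera, display, etc.
            time.sleep(WAKE_INTERVAL_S)
        else:
            return                   # meaningful light detected; resume
```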
The system, method, electronic device and computer program product of exemplary embodiments of the present invention are primarily described in conjunction with mobile communications applications. It should be understood, however, that the system, method, electronic device and computer program product of embodiments of the present invention can be utilized in conjunction with a variety of other applications, both in the mobile communications industries and outside of the mobile communications industries. For example, the system, method, electronic device and computer program product of exemplary embodiments of the present invention can be utilized in conjunction with wireline and/or wireless network (e.g., Internet) applications.
As described above and as will be appreciated by one skilled in the art, embodiments of the present invention may be configured as a system, method, or electronic device. Accordingly, embodiments of the present invention may be comprised of various means including entirely of hardware, entirely of software, or any combination of software and hardware. Furthermore, embodiments of the present invention may take the form of a computer program product on a computer-readable storage medium having computer-readable program instructions (e.g., computer software) embodied in the storage medium. Any suitable computer-readable storage medium may be utilized including hard disks, CD-ROMs, optical storage devices, or magnetic storage devices.
Exemplary embodiments of the present invention have been described above with reference to block diagrams and flowchart illustrations of methods, apparatuses (i.e., systems) and computer program products. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by various means including computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create a means for implementing the functions specified in the flowchart block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including computer-readable instructions for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these exemplary embodiments of the invention pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
This application claims priority to U.S. Provisional Patent Application Ser. No. 60/913,761 filed Apr. 24, 2007, which is hereby incorporated by reference.