This disclosure relates to the medical industry, more particularly to devices and methods for uniformly acquiring, replicating, sharing, and comparing still and motion images.
In the medical field, diagnostic imaging techniques have become a critical technology relied upon by doctors around the world. Many imaging technologies like Video EEGs, X-rays, CT scanning, mammography and magnetic resonance imaging have greatly increased the standard of care and capability of doctors by enabling them to view structures within the human body that were not previously viewable except through invasive means.
These previously developed non-invasive imaging techniques have been instrumental in improving clinical safety and patient outcomes. High cost, however, is a major limitation in the application of the previously developed non-invasive imaging technologies, restricting their use by doctors and facilities as well as the patients who can benefit from them. These previously developed non-invasive imaging techniques have contributed to the rising costs of healthcare and many have long term detrimental effects on patients due to the high levels of radiation used in connection with such techniques.
Further, these previously developed non-invasive imaging techniques are not useful in medical arts such as dermatology where external visual imaging and analysis are the main diagnostic techniques. Skin ailments such as melanoma, for example, are often characterized through a doctor's visual analysis of the patient's skin. A doctor treating melanoma is primarily concerned with the size, color, and shape of the melanoma at a given time as well as how the size and shape of the melanoma are changing over time.
The previously developed non-invasive imaging techniques further fail to provide useful information in the cosmetics industry where research scientists must visually study how make-up, creams such as wrinkle and cellulite treatments, and other products affect the appearance of subjects over a period of time or course of treatment using such cosmetic products.
Additionally, these previously developed non-invasive imaging techniques fail to provide useful information for researchers involved in clinical trials who must visually study certain experimental topical therapeutics to determine the efficacy of such therapeutics on patients suffering from various skin ailments. The results of such visual studies are then used to support regulatory filings with the goal of having such therapeutics approved for sale to consumers.
Since external visual imaging in the medical arts is primarily concerned with how certain structures on the human body are changing over time, both still and motion photography have become vital tools for image acquisition and storage. Such still and motion photography allows doctors and clinical researchers to compare images taken at one time with images taken at a later time to assess how a patient's condition is changing as a function of time. However, the use of still and motion photography in the medical arts presents a unique set of challenges.
A primary challenge inherent in the use of still and motion photography is a potential lack of consistency during the acquisition and analysis of images. For example, non-uniform lighting conditions may make image comparison between two different photographs or video difficult. Another challenge arises during studies when pre-defined image acquisition protocols depend on correct and consistent patient position or posture. Image analysis and comparison is made more difficult when even slight position changes of the camera with respect to the subject occur between two different images.
Another challenge involves the photographic equipment itself. Bulky cameras, video cameras and lighting setups are expensive and difficult for medical practitioners (who in most cases are not trained photographers) to use in doctor's offices and other healthcare settings. Such equipment setups are also difficult to deploy and use consistently at multiple investigator sites when clinical trials are being performed. Still other challenges involve the lack of efficient systems and methods to store, retrieve and analyze images for the purposes of patient care.
These problems are compounded when untrained patients are tasked with taking subsequent pictures of the relevant limb, wound, rash, or any other physical presentation on their person in a uniform manner that allows effective analysis and diagnosis. Patients also have been found to encounter extreme difficulty when attempting to capture, on video, a movement accurately reproduced in timing and position so as to allow effective analysis and diagnosis.
Solutions have been long sought but prior developments have not taught or suggested any complete solutions, and solutions to these problems have long eluded those skilled in the art. Thus there remains a considerable need for devices and methods enabling users to accurately acquire uniform images in accordance with a protocol allowing effective analysis and diagnosis.
Contemplated embodiments of the imaging uniformity system can include methods and devices for acquiring an initial image based on a protocol; generating a protocol guide based on the initial image; displaying the protocol guide overlaying an actual image; and acquiring a subsequent image of the actual image, the subsequent image being in alignment with the protocol guide.
The present disclosure includes contemplated steps of providing to the user, via a user interface, instructions in the form of a protocol guide for posing a subject according to a protocol; receiving, via an imaging apparatus, a first set of images and a second set of images taken in accordance with the protocol; storing, on a computer readable memory, the first set of images and the second set of images; and providing for display, via the user interface, the first set of images and the second set of images for comparison purposes.
The present disclosure further includes the steps of providing to the user a translucent image and contour tracing of an initial image placed on the user interface of the imaging apparatus to enable the user to achieve similar positioning, orientation and placement of the relevant subject matter in subsequent images of the patient. Additionally, a timeline comparative feature is disclosed enabling images within a protocol to be compared to one another using a slider feature.
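The steps summarized above can be sketched as a minimal Python data flow. This is an illustrative sketch only; the names `ImagingSession`, `add_image`, and `comparison_pairs` are assumptions for explanation and do not appear in the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class ImagingSession:
    # Hypothetical container for images captured under one protocol.
    protocol_id: str
    first_set: list = field(default_factory=list)   # images from the first visit
    second_set: list = field(default_factory=list)  # images from a later visit

    def add_image(self, visit: int, image: object) -> None:
        # Store an image under the protocol for the given visit (1 or 2).
        (self.first_set if visit == 1 else self.second_set).append(image)

    def comparison_pairs(self) -> list:
        # Pair the two sets index-by-index for side-by-side display.
        return list(zip(self.first_set, self.second_set))
```

For example, two images captured a week apart under the same protocol would be paired for display in adjacent comparison panes.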
Accordingly it has been discovered that one or more embodiments described herein can provide research organizations, hospitals, universities, pharmaceutical companies, and medical device manufacturers a system to accurately acquire uniform images in accordance with a protocol allowing effective analysis and diagnosis.
Other contemplated embodiments can include objects, features, aspects, and advantages in addition to or in place of those mentioned above. These objects, features, aspects, and advantages of the embodiments will become more apparent from the following detailed description, along with the accompanying drawings.
The imaging uniformity system is illustrated in the figures of the accompanying drawings which are meant to be exemplary and not limiting, in which like reference numerals are intended to refer to like components, and in which:
In the following description, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration, embodiments in which the imaging uniformity system may be practiced. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the imaging uniformity system.
The imaging uniformity system is described in sufficient detail to enable those skilled in the art to make and use the imaging uniformity system and numerous specific details are provided to give a thorough understanding of the imaging uniformity system; however, it will be apparent that the imaging uniformity system may be practiced without these specific details.
In order to avoid obscuring the imaging uniformity system, some well-known system configurations are not disclosed in detail. Likewise, the drawings showing embodiments of the system are semi-diagrammatic and not to scale and, particularly, some of the dimensions are for the clarity of presentation and are shown greatly exaggerated in the drawing FIGS. Generally, the imaging uniformity system can be operated in any orientation.
As used herein, the term system is defined as a device or method depending on the context in which it is used. The term image is generally used herein to describe a still image for descriptive clarity; however, the term image is intended to encompass a series of images as is found in a video or an image and changes thereto as is found in a compressed or encoded video.
The term protocol refers to a requirement for image acquisition. Example protocols can include: position of a patient's body or body part with reference to an image-capturing device, such as posture, position, pose, angle, or view; acquisition of an image according to a schedule, such as day or time of day; lighting; subject; movement for video acquisition; acquisition of a secondary object like a Pantone screen or a white piece of paper; the surface condition of a patient; or image-capturing device specifications like focal distance or shutter speed. These example protocols are not intended to be an exhaustive list. It is contemplated that these protocols can be used individually or in combination. It is further contemplated that other protocols not mentioned here can be implemented individually or in combination with the above listed example protocols.
The term parameter includes data about the image or the acquisition of the image. Example parameters can include time and date of acquisition, corrections for color balance of the image, aberrations between the image captured and the protocol, information about an image-capturing device, subject identification, user identification, provider, condition, or number of attempts by the user to capture the image within the required protocol. The term series as used herein refers to a group of images taken using a specific protocol.
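The protocol and parameter definitions above can be made concrete as simple data structures. The field names below are illustrative assumptions chosen to mirror the prose definitions; the disclosure does not prescribe any particular schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Protocol:
    # Hypothetical image-acquisition requirements (see prose definition).
    pose: str = "unspecified"              # posture, position, angle, or view
    schedule_days: int = 7                 # days between subsequent images
    lighting: str = "unspecified"
    requires_control_image: bool = False   # e.g. a Pantone screen or white sheet
    focal_distance_mm: Optional[float] = None
    shutter_speed_s: Optional[float] = None

@dataclass
class Parameters:
    # Hypothetical data about an image or its acquisition.
    captured_at: str = ""                  # time and date of acquisition
    color_correction: tuple = (1.0, 1.0, 1.0)  # per-channel balance factors
    attempts: int = 1                      # tries needed to satisfy the protocol
```

A series, in these terms, would simply be the list of images captured under one `Protocol` instance.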
Referring now to
The distributed computing system 102 can include the Internet, a wide area network (WAN), a metropolitan area network (MAN), a local area network (LAN), a telephone network, a cellular data network (e.g., 3G, 4G) and/or a combination of these and other networks (wired, wireless, public, private or otherwise).
The servers 104 can function both to process and store data for use on user devices 108 including laptops 110, cellular phones 112, tablet computers 114, and cameras 116. It is contemplated that the servers 104 and the user devices 108 can individually comprise a central processing unit, memory, storage and input/output units and other constituent components configured to execute applications including software suitable for displaying user interfaces, the interfaces optionally being generated by a remote server, interfacing with the cloud network, and managing or performing capture, transmission, storage, analysis, display, or other processing of data and/or images.
The servers 104 and the user devices 108 of the imaging uniformity system 100 can further include a web browser operative for, by way of example, retrieving web pages or other markup language streams, presenting those pages or streams, executing scripts, controls and other code on those pages or streams, accepting user input with respect to those pages or streams, and issuing HTTP requests with respect to those pages or streams. The web pages or other markup language can be in HAML, CSS, HTML, Ruby on Rails or other conventional forms, including embedded XML, scripts, controls, and so forth as adapted in accord with the teachings hereof. The user devices 108 and the servers 104 can be used individually or in combination to store and process information from the imaging uniformity system 100 in the form of protocol, parameters, images, protocol instructions and protocol guides.
The user devices 108 can also be image-capturing devices 118, such as the cellular phone 112, the camera 116, the laptop 110, or the tablet computer 114. It is contemplated that the image-capturing device 118 can be any device suitable for acquiring images and communicating the images to the distributed computing system 102.
The image-capturing devices 118 can be used to capture and display images 120 of a subject 122. It is contemplated that the subject 122 can be a patient 124, a user (not shown), an object 126, pictorial representations such as photographs or drawings, images including DICOM images or X-ray images, and models. The object 126 is depicted as a Pantone screen which can include many sample colors, including black and white, which can be used for color balance and lighting correction.
Referring now to
The initial image 202 is depicted here, and in the FIGS. that follow as a hand of the patient 124 of
It is further contemplated the protocol 204 can be defined in part by the initial image 202, that is the alignment, position, movement, angle, view, lighting, and size of the patient 124 in relation to the image-capturing device 118, when the initial image 202 was captured, can be incorporated in the protocol 204.
The protocol 204 can also be a standard set of requirements that will be implemented for each single patient 124 within a broader study or practice. For portions of the protocol 204 that are not generated from the initial image 202, the initial image 202 can be taken in compliance with the protocol 204 as determined before the initial image 202 is captured.
As an illustrative example, the protocol 204 can be provided by a physician after examining a patient's 124 condition and determining the proper protocol 204 in terms of position with reference to the image-capturing device 118, and period between images for an individual patient 124. Other contemplated methods of determining the protocol 204 can include determining a standard position with reference to the image-capturing device 118, and period between images for multiple patients 124, which can be helpful when comparing many series of images for multiple patients 124 in situations like clinical trials.
Other elements of the protocol 204 can include a schedule for taking subsequent images, taking of control images, and movement. The image-capturing device 118 can further depict parameters 206. The parameters 206 can include metrics under which the initial image 202 was taken such as focal metrics, shutter metrics, lighting metrics, contrast metrics, and color metrics.
Referring now to
The control image 302 will have been taken in the same environment and with the same image-capturing device 118 used to capture the patient 124 in
The control image 302 can be taken of the object 126 and provide information about the environment and image-capturing device 118 used to capture the initial image 202 for analysis and color balance. The image 120 of
Referring now to
The protocol guide 402 can include a translucent image 406, a contour 408, or a combination thereof. The contour 408 and the translucent image 406 are contemplated to act as a guide to a user for conforming to the protocol 204 of
The protocol guide 402 can further include instructions 410. The instructions 410 are contemplated to be communications to the user or the patient 124 of
The instructions 410 are depicted as displayed on the user interface 404 along with the translucent image 406 and the contour 408 but without an actual image 502 of
The contour 408 is depicted as an outline of the hand from the initial image 202 of
In other contemplated embodiments the contour 408 can be generated by subsequent images rather than the initial image 202 to maintain an up-to-date protocol guide 402 for the subject 122 even if the subject's outline changes over time. As an illustrative example, a user or healthcare professional might be presented with the option to select any of the images 120 of
In the present depiction of the contour 408 in
The contour 408 can be emphasized, highlighted, magnified, or accentuated differently from the rest of the initial image 202 or any subsequent image that it is superimposed on. The translucent image 406 can be created from the initial image 202 as a translucent reproduction or a semitransparent reproduction.
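One minimal way to derive a contour and a translucent reproduction from an initial image is sketched below, assuming the subject has already been segmented from the background into a binary mask. The four-neighbor edge test and the function names are illustrative assumptions, not the disclosed implementation:

```python
def contour_from_mask(mask):
    # mask: 2-D list of 0/1 values segmenting the subject from the background.
    # A foreground cell belongs to the contour when it touches the frame edge
    # or has at least one background neighbor.
    h, w = len(mask), len(mask[0])
    contour = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if not mask[y][x]:
                continue
            neighbors = [
                mask[ny][nx]
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                if 0 <= ny < h and 0 <= nx < w
            ]
            if len(neighbors) < 4 or not all(neighbors):
                contour[y][x] = 1
    return contour

def translucent(pixel, alpha=0.4):
    # Attach an alpha value to an (r, g, b) pixel for semitransparent overlay.
    r, g, b = pixel
    return (r, g, b, alpha)
```

In practice an edge detector or segmentation model would produce the mask; this sketch only shows how an outline and a translucent copy can both be derived from the same initial image.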
In the present illustrative example, the translucent image 406 is the hand of the subject 122, while the contour 408 provides an indication of the outer edge of the hand of the subject 122. The contour 408 can be displayed lighter or darker than the translucent image 406 of the initial image 202. It is contemplated that the contour 408 or the translucent image 406 can be used to signal a match or alignment between the subsequent image and the protocol 204 by flashing, changing color, or other suitable means.
Referring now to
The cellular phone 112 is shown having the protocol guide 402 displayed on the user interface 404. The protocol guide 402 is depicted having the contour 408 as a darker bolded outline of the initial image 202 of
The contour 408 and the translucent image 406 overlay an actual image 502 of the subject 122. The actual image 502 will be used to describe the image of the subject 122 displayed on the user interface 404 before a subsequent image is saved. The subsequent image will be used to describe an image that is saved in the series of a protocol after the initial image 202.
It is contemplated that the translucent image 406 and the contour 408 of the protocol guide 402 can be dynamically adjusted based on the lighting, color, background, or other parameters 206 of the actual image 502. The dynamic adjustment of the translucent image 406 or the contour 408 can include making the translucent image 406 or contour 408 lighter or darker, more or less transparent, colored or highlighted a contrasting color, or even pulsating.
It is contemplated that the translucent image 406 and the contour 408 can be dynamically adjusted independent of each other. It has been discovered that dynamically adjusting the translucent image 406 or the contour 408 ensures that the protocol guide 402 will act as an overlay always allowing the actual image 502 of the subject 122 to appear on the user interface 404 without being obstructed by the protocol guide 402.
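One simple heuristic for such dynamic adjustment is to vary the guide's opacity and color with the mean luminance of the live preview. The specific coefficients, clamps, and function names below are assumptions for illustration, not prescribed by the disclosure:

```python
def adjust_overlay_alpha(frame_luma_mean, base_alpha=0.4):
    # frame_luma_mean: mean luminance of the live preview, 0.0 (dark) to 1.0 (bright).
    # Fade the guide over dark scenes so it never obscures the subject;
    # strengthen it over bright scenes so it stays visible.
    alpha = base_alpha + 0.3 * (frame_luma_mean - 0.5)
    return max(0.15, min(0.7, alpha))

def contour_color(frame_luma_mean):
    # Pick a contrasting guide color against the preview's brightness.
    return (255, 255, 255) if frame_luma_mean < 0.5 else (0, 0, 0)
```

A real implementation might instead sample luminance only in the region under the guide, or pulsate the contour as the disclosure also contemplates; this sketch shows just the brightness-driven variant.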
The actual image 502 is depicted as misaligned and too far away from the image-capturing device 118, as represented by the actual image 502 being misaligned with the contour 408 of the protocol guide 402 representing the positional protocol 204 of
The misaligned actual image 502 could indicate that the image-capturing device 118 or the subject should be moved horizontally, vertically, rotated, angled, or posed in a different way so as to conform to the protocol 204. The actual image 502 is further depicted as too small relative to the contour 408 of the protocol guide 402.
When the actual image 502 is smaller than the contour 408 the image-capturing device 118 should be repositioned closer to the subject 122. It is contemplated in some embodiments that the contour 408 would be slightly larger than the subject 122 to avoid obscuring the outline of the actual image 502 or that the contour 408 would be slightly transparent so that the outline of the actual image 502 can be seen through the contour 408.
It has been discovered that projecting or overlaying the protocol guide 402 including the contour 408 or the translucent image 406 on the user interface 404 of the image-capturing device 118 enables the users to intuitively and accurately take an image of their relevant body part, symptom, or presentation with a high degree of compliance with the protocol 204, providing a similar positioning, alignment, orientation and presentation to the initial image 202. It has further been discovered that implementing the protocol guide 402 on the user interface 404 of a device allows a user or the subject 122 to progressively align their body part in compliance with the protocol 204.
Referring now to
Within the contour 408, the translucent image 406 is shown allowing the actual image 502 to show through the protocol guide 402 for ease of alignment. Further, the actual image 502 is shown as misaligned with the contour 408 of the protocol guide 402 indicating a vertical or horizontal change between the image-capturing device 118 and the subject 122 is required to more closely conform with the protocol 204 of
Referring now to
Specifically, the actual image 502 that depicts the hand of the subject 122 is shown within the contour 408 of the protocol guide 402. In the present exemplary embodiment and for ease of description, the actual image 502 is shown slightly within the contour 408 signaling compliance with the protocol 204 of
The actual image 502 is shown appearing through the translucent image 406 of the protocol guide 402 for ease of alignment with the protocol guide 402. It is contemplated that the contour 408 or the translucent image 406 can be used to signal a match or alignment between the actual image 502 and the protocol guide 402 by flashing, changing color, or other suitable means.
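One plausible way to decide when to signal such a match is to compare the live silhouette against the guide, for example with an intersection-over-union score on binary masks. The 0.9 threshold and the function names are illustrative assumptions:

```python
def iou(mask_a, mask_b):
    # Intersection-over-union of two same-sized binary masks (2-D lists of 0/1).
    inter = union = 0
    for row_a, row_b in zip(mask_a, mask_b):
        for a, b in zip(row_a, row_b):
            inter += a and b
            union += a or b
    return inter / union if union else 0.0

def is_aligned(live_mask, guide_mask, threshold=0.9):
    # Once the live silhouette overlaps the guide closely enough, the UI
    # can flash or recolor the contour to signal a match.
    return iou(live_mask, guide_mask) >= threshold
```

The threshold trades strictness against usability: a higher value enforces tighter protocol compliance but makes the match harder for an untrained patient to achieve.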
Once compliance between the actual image 502 and the protocol guide 402 is obtained by the user, the user can capture the actual image 502 as a subsequent image 702. The subsequent image 702 will be compliant with the protocol 204 dictating position, posture, and pose, so long as the actual image 502 is in alignment with the protocol guide 402. It is contemplated that the subsequent image 702 can be captured by the patient 124, a user, or a healthcare professional and then acquired by the imaging uniformity system 100 of
It has been discovered that aligning the actual image 502 with the protocol guide 402 results in the ability to easily capture the subsequent image 702 that is compliant with the protocol 204; when the subsequent image 702 is compliant with the protocol 204, the subsequent image 702 and the initial image 202 of
It is contemplated that the subsequent image 702 can be used to further refine the protocol guide 402 if, for example, the subject 122 is changing size during the period of time the protocol 204 requires the subsequent images 702 to be taken, such as during weight loss or during swelling or the reduction thereof. It is also contemplated that the protocol guide 402 based on the initial image 202 can continue to be used.
It is contemplated that before the subsequent image 702, or after the subsequent image 702 is taken, the control image 302 of
Capturing the control image 302 enables the imaging uniformity system 100 to adjust the subsequent image 702 for color, lighting, contrast, and other image characteristics providing a high degree of similarity between the subsequent image 702 and the initial image 202. It has been discovered that adjusting the subsequent image 702 using the control image 302 provides a fast, intuitive, and accurate method of analysis and diagnosis.
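A minimal sketch of such a control-image correction, assuming the control image contains a white patch of known reference value: per-channel gains are computed from the measured patch and applied to each pixel of the subsequent image. The function names and the simple gain model are assumptions for illustration:

```python
def channel_gains(measured_white, reference_white=(255, 255, 255)):
    # Compare the white patch as captured in the control image against its
    # known reference value to obtain per-channel correction gains.
    return tuple(r / m for r, m in zip(reference_white, measured_white))

def correct_pixel(pixel, gains):
    # Apply the per-channel gains, clamped to the displayable range.
    return tuple(min(255, round(p * g)) for p, g in zip(pixel, gains))
```

A fuller implementation would also use the black and other sample colors of the Pantone screen to correct contrast and hue, not just white balance; this sketch shows only the single-patch case.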
Referring now to
In the present depiction, however, the protocol guide 402 includes a further attribute of movement 802. The protocol guide 402 can provide a guide for the movement 802 of the subject 122 that is captured as the subsequent image 702 in the form of a video.
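For video protocols, one simple scheme is to store a contour per frame of the initial video and overlay the contour nearest the current playback time, so the guide moves with the required movement. This timestamp-matching helper is an illustrative assumption, not the disclosed mechanism:

```python
def guide_frame_at(guide_frames, t):
    # guide_frames: list of (timestamp_seconds, contour) pairs derived from
    # the initial video; return the contour to overlay at playback time t.
    return min(guide_frames, key=lambda f: abs(f[0] - t))[1]
```

During capture, the live preview would call this once per displayed frame so the moving contour leads the subject through the protocol's required movement and timing.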
It is contemplated that the initial image 202 could also be in the form of a video and the contour 408 could be created from the initial image 202 similar to that of
In the same way the subsequent image 702 of
Referring now to
The comparison panes 904 can include the initial image 202 or any of the subsequent images 702 from a series taken within the protocol 204 of
The comparison panes 904 are shown without the protocol guide 402 of
It has been further discovered that implementing the comparison panes 904 providing either the initial image 202 or the subsequent images 702 side-by-side enables physicians to quickly identify differences or changes in the condition of the patient 124 over time. Enabling a physician to identify changes over time greatly increases the ability of physicians to identify patterns or trends and take meaningful corrective action or make meaningful predictions.
It is contemplated that the sliders 908 could interact intuitively enabling a user to quickly display before and after images 120 of the subject 122 of
It is contemplated that the sliders 908 could be locked in a plus or minus one configuration or could move independently of each other. It is further contemplated that the first time 912 and the second time 916 could always be different meaning the sliders 908 would jump over each other to the next image 120.
Further it is contemplated that the comparison panes 904 could operate as before or after panes. In this contemplated function one of the panes, such as the left pane 910 could always display either the initial image 202 or the subsequent images 702 of the first time 912 that is before any of the subsequent images 702 of the second time 916 displayed on the right pane 914.
It is contemplated that when the comparison panes 904 operate as before or after panes, when the slider associated with the left pane 910 reaches the slider associated with the right pane 914, the slider associated with the right pane 914 could be locked in place or could move ahead to the next subsequent image 702 in the series of the protocol 204. Below the comparison panes 904, information 918 about the images 120 displayed on the comparison panes 904 can be viewed.
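The before/after slider constraint described above can be sketched as a small state class in which the left ("before") slider may never pass the right ("after") slider. The class and method names are illustrative assumptions:

```python
class ComparisonSliders:
    # Sketch of paired timeline sliders over a series of n images: the left
    # slider selects the "before" image and is clamped so it always precedes
    # the right ("after") slider.
    def __init__(self, n_images, left=0, right=1):
        self.n = n_images
        self.left, self.right = left, right

    def move_left(self, index):
        # Clamp so the before image always precedes the after image.
        self.left = max(0, min(index, self.right - 1))

    def move_right(self, index):
        # Clamp so the after image always follows the before image.
        self.right = min(self.n - 1, max(index, self.left + 1))
```

The alternative behaviors the disclosure contemplates (locking in place, or jumping over to the next image) would replace the clamping in `move_left` and `move_right` accordingly.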
The information 918 can include the date and time the image 120 was taken in accordance with the protocol 204. Further the information 918 can include the position, pose, subject's 122 identity, previous diagnosis information, treatments or corrective actions taken after the image 120 was taken, along with any other information 918 that would be important to understanding the series of images 120 taken of the patient 124 and providing analyses and diagnoses.
It has been discovered that implementing the comparison panes 904 with the ability to easily display the information 918 and the images 120 from the first time 912 and the second time 916 allow a user to make diagnoses, assess healing or growth, assess beauty characteristics, or perform one or more analyses on the captured image data. This can also allow the user to view any combination of before and after images, provided that the images being compared were taken at different times.
The timeline 906 can further include time marks 920 or other indicators that one of the images 120 was taken at a date or time along the timeline 906. As noted above, the images 120 displayed within the comparison panes 904 could include still images or video.
When the comparison panes 904 are used to display video, it is contemplated that the video on the left pane 910 will play at the same time as the video on the right pane 914. It is contemplated that when the videos are played simultaneously on the comparison panes 904, the movement captured in accordance with the protocol 204 will be synchronized.
That is, when the video taken at the first time 912 displayed in the left pane 910 depicts the subject 122 moving to the left then to the right, the video taken at the second time 916 displayed in the right pane 914 will depict the subject 122 moving to the left then to the right at the same time. It is contemplated that the user will be able to pause, slow the video down, fast forward, rewind, zoom, or perform other video control functions on the video displayed on both of the comparison panes 904 identically.
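The synchronized playback described above amounts to routing every transport command to both panes at once so the two videos stay frame-locked. This minimal sketch uses assumed names (`SyncedPlayers`, `seek`) and tracks only playhead state, not actual decoding:

```python
class SyncedPlayers:
    # Apply every transport command to both comparison panes identically
    # so movements captured under the same protocol replay in lockstep.
    def __init__(self):
        self.left_t = self.right_t = 0.0   # playhead times in seconds
        self.playing = False

    def seek(self, t):
        # Seeking moves both playheads to the same time.
        self.left_t = self.right_t = t

    def play(self):
        self.playing = True

    def pause(self):
        self.playing = False
```

A production version would wrap two real video players and fan each command out to both; the invariant shown here (equal playheads after every command) is what keeps the compared movements synchronized.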
Referring now to
The comparison panes 1004 can include the initial image 202 or any of the subsequent images 702 from a series taken within the protocol 204 of
The image 120 shown in the comparison panes 1004 is shown much larger for analysis and diagnosis purposes than the images 120 depicted above and below the comparison panes 1004. When a user wishes to advance the image 120 shown in the comparison panes 1004, the user can scroll through the timeline 1006 and display other images 120 of
It is contemplated that the timeline 1006 could be reduced or expanded providing more or fewer images 120 above and below the comparison panes 1004. That is, the timeline 1006 could be eliminated completely, leaving only the comparison panes 1004. A user would then simply swipe up, down, left, or right on the comparison panes 1004 to advance the image 120 displayed in the comparison panes 1004.
The comparison panes 1004 are shown without the protocol guide 402 of
It has been further discovered that implementing the comparison panes 1004 providing either the initial image 202 or the subsequent images 702 side-by-side enables physicians to quickly identify differences or changes in the condition of the patient 124 over time. Enabling a physician to identify changes over time greatly increases the ability of physicians to identify patterns or trends and take meaningful corrective action or make meaningful predictions.
It is contemplated that changing the images 120 displayed in the comparison panes 1004 could intuitively enable a user to quickly display before and after images 120 of the subject 122 of
It is contemplated that the comparison panes 1004 could be locked in a plus or minus one configuration or could be changed independently of each other. It is further contemplated that the first time 1012 and the second time 1016 could always be different meaning the right pane 1014 and the left pane 1010 would not display the images 120 having the same first time 1012 or second time 1016 but instead would advance past to the next image 120 in the timeline 1006.
Further it is contemplated that the comparison panes 1004 could operate as before or after panes. In this contemplated function one of the panes, such as the left pane 1010 could always display either the initial image 202 or the subsequent images 702 of the first time 1012 that is before any of the subsequent images 702 of the second time 1016 displayed on the right pane 1014.
It is contemplated that when the comparison panes 1004 operate as before or after panes, the first time 1012 of the image 120 on the left pane 1010 will not be allowed to advance beyond the second time 1016 of the image 120 associated with the right pane 1014. Instead, the image 120 on the right pane 1014 would need to be advanced first before the image 120 on the left pane 1010 would be allowed to advance or move ahead to the next subsequent image 702 in the series of the protocol 204.
It is contemplated that either the left pane 1010 or the right pane 1014 might be operated as the before pane and the other pane operated as the after pane depending on the needs of the reviewer. Below the comparison panes 1004, information 1018 about the images 120 displayed on the comparison panes 1004 can be viewed.
The information 1018 can include the date and time the image 120 was taken in accordance with the protocol 204. Further the information 1018 can include the position, pose, subject's 122 identity, previous diagnosis information, treatments or corrective actions taken after the image 120 was taken, along with any other information 1018 that would be important to understanding the series of images 120 taken of the patient 124 and providing analyses and diagnoses.
It has been discovered that implementing the comparison panes 1004 with the ability to easily display the information 1018 and the images 120 from the first time 1012 and the second time 1016 allow a user to make diagnoses, assess healing or growth, assess beauty characteristics, or perform one or more analyses on the captured image data. This can also allow the user to view any combination of before and after images, provided that the images being compared were taken at different times.
As noted above, the images 120 displayed within the comparison panes 1004 could include still images or video. When the comparison panes 1004 are used to display video, it is contemplated that the video on the left pane 1010 will play at the same time as the video on the right pane 1014. It is contemplated that when the videos are played simultaneously on the comparison panes 1004, the movement captured in accordance with the protocol 204 will be synchronized.
That is, when the video taken at the first time 1012 displayed in the left pane 1010 depicts the subject 122 moving to the left then to the right, the video taken at the second time 1016 displayed in the right pane 1014 will depict the subject 122 moving to the left then to the right at the same time. It is contemplated that the user will be able to pause, slow the video down, fast forward, rewind, zoom, or perform other video control functions on the video displayed on both of the comparison panes 1004 identically.
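The identical control of both videos could be sketched as a controller that forwards every control action to each pane; the class names and the minimal video stub below are illustrative assumptions:

```python
class Video:
    """Minimal stand-in for a playable video in one comparison pane."""
    def __init__(self):
        self.position = 0.0   # playback position in seconds
        self.rate = 1.0       # playback speed multiplier
        self.paused = False

    def pause(self):
        self.paused = True

    def seek(self, seconds):
        self.position = seconds

    def set_rate(self, rate):
        # e.g. 0.5 for slow motion, 2.0 for fast forward
        self.rate = rate


class SyncedPlayback:
    """Applies every control action identically to both comparison panes,
    keeping the two videos synchronized."""
    def __init__(self, left_video, right_video):
        self.videos = (left_video, right_video)

    def apply(self, action, *args):
        # action: method name such as "pause", "seek", or "set_rate"
        for video in self.videos:
            getattr(video, action)(*args)
```

A single call such as `player.apply("seek", 12.5)` then moves both panes to the same moment in the protocol's choreographed movement.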
Referring now to
The computer program of the imaging uniformity system 100 typically comprises a multitude of instructions that will be translated by the native computer into a machine-readable format and, hence, into executable instructions. Also, programs are comprised of variables and data structures that either reside locally to the program or are found in memory or on storage devices.
In addition, various programs described hereinafter may be identified based upon the application for which they are implemented in a specific embodiment of the invention; however, it should be appreciated that any particular program nomenclature that follows is used merely for convenience, and thus the imaging uniformity system 100 should not be limited to use solely in any specific application identified or implied by such nomenclature.
Embodiments of the imaging uniformity system 100 may also be practiced in distributed computing environments in which tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices and data processing can be accomplished with both local and remote devices.
The following description includes the term module which is intended to include, but is not limited to, one or more computers configured to execute one or more software programs configured to perform one or more functions, operations or actions. It is contemplated that modules of the control flow 1100 could be deleted, combined, or rearranged without departing from the imaging uniformity system 100.
The control flow 1100 is depicted having a protocol definition module 1102. The protocol definition module 1102 includes the steps of defining the protocol 204 of
It is contemplated that the protocol definition module 1102 and the initial image capture module 1104 could overlap in some cases where the protocol 204 is defined in part by the initial image 202. Once the protocol 204 is defined in the protocol definition module 1102, the protocol 204 is stored within the distributed computing system 102 either on the user device 108 or the servers 104.
The initial image 202 can be stored on the distributed computing system 102 either in the servers 104 or on the user device 108. Coupled to the initial image capture module 1104 is an initial control image module 1106. The initial control image module 1106 includes acquiring the control image 302 of
It is contemplated that the initial control image module 1106 can be placed before the initial image capture module 1104 or after. The initial control image module 1106 can capture the control image 302 for color balancing the initial image 202. The corrections made to the initial image 202 with the control image 302 can be stored on the distributed computing system 102 either in the servers 104 or on the user device 108 for later analysis and manual color balancing.
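One simple way the control image 302 could drive color balancing is a gray-world correction: if the control image depicts a neutral reference, each channel is scaled so the reference averages to gray. This is a minimal sketch under that assumption, operating on lists of RGB tuples rather than full image arrays:

```python
def color_balance(image, control):
    """Scale each color channel so that the control (neutral reference)
    pixels average to an equal gray, then apply those gains to the image.

    image, control: lists of (r, g, b) tuples with values in 0..255.
    """
    n = len(control)
    # Per-channel average of the control image.
    avg = [sum(pixel[c] for pixel in control) / n for c in range(3)]
    # Target gray level: mean of the three channel averages.
    gray = sum(avg) / 3.0
    gains = [gray / a for a in avg]
    # Apply the gains, clamping to the valid range.
    return [tuple(min(255.0, pixel[c] * gains[c]) for c in range(3))
            for pixel in image]
```

With a blue-tinted control image, for example, the blue channel gain comes out below 1.0 and the cast is removed, so that images taken at the first time 1012 and the second time 1016 can be compared under a common neutral reference.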
Coupled to the initial control image module 1106 is a contour creation module 1108. In the contour creation module 1108 the contour 408 of
It is contemplated that the contour creation module 1108 can be implemented before the initial control image module 1106 or after the initial control image module 1106. Coupled to the contour creation module 1108 is a translucent image creation module 1110. During the translucent image creation module 1110 the translucent image 406 of
The translucent image 406 can be stored on the distributed computing system 102 either in the servers 104 or on the user device 108. It is contemplated that the translucent image creation module 1110 can be implemented before, during, or after the contour creation module 1108. It is further contemplated that the contour creation module 1108 and the translucent image creation module 1110 can be accomplished using processors on either the servers 104 or the user devices 108.
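The translucent image 406 described above amounts to attaching an alpha channel to a prior image and compositing it over the live camera view. A minimal sketch, with the function names and the fixed opacity value being assumptions of this example:

```python
def make_translucent(pixels, alpha=0.4):
    """Attach an alpha channel so a prior image can be overlaid,
    partially see-through, on the live capture screen.

    pixels: list of (r, g, b) tuples; alpha: opacity in 0.0..1.0.
    """
    a = int(alpha * 255)
    return [(r, g, b, a) for (r, g, b) in pixels]


def composite(overlay_pixel, live_pixel):
    """Standard 'over' alpha compositing of one RGBA overlay pixel
    onto one RGB live-view pixel."""
    r, g, b, a = overlay_pixel
    alpha = a / 255.0
    return tuple(round(alpha * o + (1 - alpha) * l)
                 for o, l in zip((r, g, b), live_pixel))
```

Compositing every pixel of the translucent image 406 over the live view in this way lets the user line up the actual image 502 with the earlier pose.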
The protocol guide 402 of
Coupled to the translucent image creation module 1110 is a display protocol guide module 1112. The display protocol guide module 1112 can be activated once the user is required to take one of the subsequent images 702 of
When a user is prompted to take one of the subsequent images 702 in accordance with the protocol 204, the display protocol guide module 1112 will retrieve the contour 408 and the translucent image 406 stored on the distributed computing system 102 either in the servers 104 or on the user device 108 and display the contour 408 and the translucent image 406 on the user interface 404 of
It is contemplated that the protocol definition module 1102 can also display the actual image 502. The acquire subsequent image module 1114 can be triggered by the user to save the subsequent image 702 when the user believes the actual image 502 is in accordance with the protocol 204 displayed by the protocol guide 402.
The protocol guide 402 can be used to signal compliance with the protocol 204 by flashing the contour 408, changing the contour's 408 color, or other suitable means. Alternatively, the subsequent image 702 could be acquired and stored automatically once the user aligns the actual image 502 with the protocol guide 402. The subsequent image 702 is first acquired by the imaging uniformity system 100 according to the protocol 204 defined in the protocol definition module 1102 and stored in memory on the distributed computing system 102 either in the servers 104 or on the user device 108.
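The automatic-acquisition trigger could, for example, test alignment by comparing the live contour against the stored guide contour 408. This sketch assumes the two contours are sampled at corresponding points; the tolerance value and point pairing are illustrative assumptions:

```python
def in_compliance(live_contour, guide_contour, tolerance=5.0):
    """Return True when the live contour lies within `tolerance`
    (mean point-to-point distance, in pixels) of the guide contour.

    live_contour, guide_contour: equal-length lists of (x, y) points
    sampled at corresponding positions along each contour.
    """
    distances = [((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
                 for (x1, y1), (x2, y2) in zip(live_contour, guide_contour)]
    return sum(distances) / len(distances) <= tolerance
```

When `in_compliance` becomes true, the system could flash or recolor the contour 408, or trigger the automatic capture of the subsequent image 702.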
Coupled to the acquire subsequent image module 1114 is a subsequent control image module 1116. During the subsequent control image module 1116 the user will be instructed to acquire the control image 302 for the subsequent image 702.
The control image 302 for the subsequent image 702 will enable the subsequent image 702 to be color balanced. The subsequent image 702 can be color balanced using a processor in the distributed computing system 102. The protocol guide 402 can be generated by processors either in the servers 104 or on the user device 108. It is contemplated that the contour creation module 1108 and the translucent image creation module 1110 could be invoked again after the acquire subsequent image module 1114 to prepare the contour 408 and the translucent image 406 based on the subsequent image 702.
Once the subsequent image 702 is acquired the subsequent image 702 can be stored on the distributed computing system 102 either in the servers 104 or on the user device 108. The acquire subsequent image module 1114 can be invoked along with the display protocol guide module 1112 and the subsequent control image module 1116 as required to form the series dictated by the protocol and described with respect to
Referring now to
During the protocol selection module 1202, a user may select one of the protocols 204 for a specific subject 122 corresponding to a predetermined set of poses or movements. It is contemplated that the protocol selection module 1202 can include various security features, such as requiring a username and password, to preserve the subject's 122 of
For instance, doctors who have not been given permission to view a subject's data cannot access subject or image data associated with that subject 122. Permissions can be based on roles, which can include doctor, health professional, care team member, or subject.
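The role-based access check described above can be sketched as follows; the data shapes (a user record with a role, and a per-subject grant table) are assumptions of this example, not a prescribed schema:

```python
ROLES = {"doctor", "health_professional", "care_team", "subject"}

def can_view(user, subject_id, grants):
    """Return True when `user` may view the images of `subject_id`.

    user: dict with at least "name" and "role" (and "subject_id"
          when the role is "subject").
    grants: mapping of subject_id -> set of usernames explicitly
            permitted to view that subject's data.
    """
    if user.get("role") not in ROLES:
        return False
    # A subject can always view his or her own images.
    if user.get("role") == "subject" and user.get("subject_id") == subject_id:
        return True
    # Everyone else needs an explicit grant for this subject.
    return user.get("name") in grants.get(subject_id, set())
```

Under this scheme a doctor without a grant for a given subject 122 is denied access, while the subject always retains access to his or her own series.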
It is contemplated that the imaging uniformity system 100 of
Once the protocol 204 is selected a series retrieval module 1204 will gather all the images 120 of
The comparison module 1206 can display the images 120 as still images or as video as described with regard to
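The series retrieval module 1204 and the handoff to the comparison module 1206 can be sketched as a filter-and-sort over stored image records; the record fields used here are illustrative assumptions:

```python
def retrieve_series(images, protocol_id):
    """Gather every stored image captured under one protocol and
    return them in capture order, oldest first, ready for the
    comparison panes.

    images: list of dicts with at least "protocol" and "taken_at"
            (an ISO-8601 date string, which sorts chronologically).
    """
    series = [img for img in images if img["protocol"] == protocol_id]
    return sorted(series, key=lambda img: img["taken_at"])
```

The oldest image of the returned series would typically seed the before pane and any later image the after pane, with the advance restriction of the comparison panes 1004 enforced between them.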
Thus, it has been discovered that the imaging uniformity system furnishes important and heretofore unknown and unavailable solutions, capabilities, and functional aspects.
The resulting configurations are straightforward, cost-effective, uncomplicated, highly versatile, accurate, sensitive, and effective, and can be implemented by adapting known components for ready, efficient, and economical manufacturing, application, and utilization.
While the imaging uniformity system has been described in conjunction with a specific best mode, it is to be understood that many alternatives, modifications, and variations will be apparent to those skilled in the art in light of the preceding description.
Accordingly, it is intended to embrace all such alternatives, modifications, and variations, which fall within the scope of the included claims. All matters set forth herein or shown in the accompanying drawings are to be interpreted in an illustrative and non-limiting sense.
This application claims the priority benefit, with regard to all common subject matter, of U.S. Provisional Patent Application Ser. No. 61/877,288, titled Image Acquisition, Replication and Comparison Methods using the Marking of Contour Tracing and Translucent Image Placement on Image Capturing Screen, of Previously Captured Images and Video for Mobile Applications and Phones and a Timeline Comparative Feature System and Method for Image Acquisition, Replication and Comparison, and filed on Sep. 13, 2013; this application is incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
61877288 | Sep 2013 | US