Example embodiments generally relate to heat exchangers and, in particular, relate to heat exchanger inspections.
Heat exchangers, such as steam generators, may be periodically inspected to identify degradation between a heat source side and a heat sink side of the heat exchanger. A heated fluid may flow through a tube sheet and a plurality of tubes that maximize a heat transfer area to a fluid on the heat sink side. These tubes and the tube sheet may be susceptible to corrosion and chemical build up due to their geometry. Heat exchangers may be inspected to detect and address corrosion and chemical build up, thereby extending the heat exchanger's lifetime and preventing leaks from the heat source side to the heat sink side.
Accordingly, some example embodiments may enable heat exchanger inspection, as described below. In one example embodiment, a robot for heat exchanger inspection is provided including a mobility system configured to move the robot relative to the heat exchanger, a camera configured to capture image data including at least a portion of the heat exchanger, and processing circuitry. The processing circuitry is configured to receive the image data from the camera, determine a plurality of heat exchanger characteristics in the image data, compare the plurality of heat exchanger characteristics to heat exchanger data, determine a current location and an orientation angle of the robot based on the comparison of the plurality of heat exchanger characteristics to the heat exchanger data, identify the plurality of heat exchanger characteristics based on the current location, adjust the orientation angle based on a calculation of a plurality of angles between the plurality of heat exchanger characteristics, and determine an end effector position based on the current location and the orientation angle.
In another example embodiment, an apparatus for heat exchanger inspections is provided including processing circuitry. The processing circuitry is configured to receive image data from a camera associated with a robot, determine a plurality of heat exchanger characteristics in the image data, compare the plurality of heat exchanger characteristics to heat exchanger data, determine a current location and an orientation angle of the robot based on the comparison of the plurality of heat exchanger characteristics to the heat exchanger data, identify the plurality of heat exchanger characteristics based on the current location, adjust the orientation angle based on a calculation of a plurality of angles between the plurality of heat exchanger characteristics, and determine an end effector position based on the current location and the orientation angle.
Having thus described the heat exchanger inspection in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
Some example embodiments now will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, example embodiments are shown. Indeed, the examples described and pictured herein should not be construed as being limiting as to the scope, applicability or configuration of the present disclosure. It will be apparent to those skilled in the art that modifications and variations can be made in such example embodiments without departing from the scope or spirit thereof. For instance, features illustrated or described in one embodiment may be used on another embodiment to yield a still further embodiment. Thus, it is intended that the present invention covers such modifications and variations as come within the scope of the appended claims and their equivalents. Like reference numerals refer to like elements throughout.
As used herein, terms referring to a direction or a position relative to the orientation of a robot, such as but not limited to “vertical,” “horizontal,” “above,” or “below,” refer to directions and relative positions with respect to the robot's orientation in its normal intended operation on a tube sheet, as indicated in
Further, the term “or” as used in this application and the appended claims is intended to mean an inclusive “or” rather than an exclusive “or.” That is, unless specified otherwise, or clear from the context, the phrase “X employs A or B” is intended to mean any of the natural inclusive permutations. That is, the phrase “X employs A or B” is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B. In addition, the articles “a” and “an” as used in this application and the appended claims should generally be understood to mean “one or more” unless specified otherwise or clear from the context to be directed to a singular form. Throughout the specification and claims, the following terms take at least the meanings explicitly associated therein, unless the context dictates otherwise. The meanings identified below do not necessarily limit the terms, but merely provide illustrative examples for the terms. The meaning of “a,” “an,” and “the” may include plural references, and the meaning of “in” may include “in” and “on.” The phrase “in one embodiment” or other similar phrase, as used herein, does not necessarily refer to the same embodiment, although it may. The phrase “one of A and B” means A or B, not “one of A and one of B.”
In the past, a robotic crawler has been utilized to inspect and/or repair the tube sheet of a heat exchanger. The robotic crawler, e.g. a robot, may include one or more tools, such as an eddy current probe, to test the physical integrity of the tubes. The tools may be coupled to the robot via an arm and end effector. As should be understood, a robotic arm is a projection from the robot, e.g. a metal or rigid plastic bar that the robot may drive over a range of positions, or that may be held rigidly with respect to the remainder of the robot so that the end effector's position changes with the robot's position, to locate an end effector at the arm's end. As should also be understood, an end effector is a device at the end of the arm that is movable under the control of the robot to interact with the robot's environment, e.g. a rotatable gripper (which may grip a tool) or a directly-connected tool. The end effector may be, e.g., a gripper that grips the tool and aligns the tool with the tube sheet or may have an insertion tube that is inserted into the tube sheet tube and through which the tool is inserted into the tube sheet tube. A computer system in communication with the end effector tracked the physical position of the end effector, and correspondingly the tool coupled thereto, with respect to the robot and the tube sheet during the inspection to ensure that the data from the tool or repair work performed was associated with the correct tube.
Generally, robots in the past, e.g. the robot manufactured under the name ZR-100 by Zetec, Inc., of Snoqualmie, Wash., have used a hardware-based (in that computer programming was provided as firmware operable only on dedicated circuitry) position solution to determine the position of the robot and/or end effector/tool by identifying incremental movements of the robot from a known starting location, e.g. a predetermined or manually entered start position and orientation of the robot on the tube sheet, and from prior incremental positions, through analysis of sequential images acquired from a camera located on the robot. The robot may also have included one or more motor encoders that provided data to the computer system so that the computer system, in conjunction with direction data based on a known directional orientation of the robot's mobility system, tracked the robot's change in position based upon the motor encoder's movement, but the camera-based determination of position was independent of the encoder-based determination of position. The robot was initially placed at a predetermined position and orientation in the heat exchanger (e.g. at a predetermined position on the tube sheet), such that the tool on the end effector was aligned with one or more predetermined tubes. For camera-based location, the robot acquired a sequential series of images from a robot-mounted camera as the robot moved across a tube sheet surface. At each image, the robot processor identified circles present in the image and the centers of such circles. Having identified the positions of each identified circle of the immediately previous image in tube sheet space, the processor compared the image-space positions of the circle centers of the present image to the image-space positions of the previous image and, for those present-image centers falling within a predetermined threshold distance of the previous-image centers, identified such present-image centers with the known tube sheet positions of their respective corresponding previous-image centers. By locating multiple centers of the present image in tube sheet space, the processor was then able to locate the positions (in tube sheet space) of any present-image centers that were not paired with a previous-image center through triangulation, assuming each such remaining present-image center was within a predetermined threshold of a tube sheet feature center in the tube sheet map.
Having identified the position of all present-image feature centers in tube sheet space (excepting any thereof that failed the threshold test), the processor determined the position of the tool at the end of the end effector. As the end effector was at a fixed position with respect to the robot and, therefore, the camera image center, the processor, having located the present image in tube sheet space, also located the tool position in tube sheet space through triangulation. When the robot thereafter moved and acquired subsequent images, the process repeated, thereby maintaining knowledge of the tool's position as the robot moved over the tube sheet.
As noted, encoder accumulation systems were also known and were used to track the position of the robot and, thereby, the end effector tool. As the robot moved across the tube sheet in response to operator instructions, the computer received data from the encoder(s) that were driven by the motors that drove the robot's movement across the tube sheet and from one or more sensors, e.g. encoders, that outputted data corresponding to the mobility system's direction and updated the robot's/end effector's position and orientation in a tube sheet tracker. Such methods were used independently of the image-based method and could be used, e.g., as a backup confirmation of the result produced by the image-based method.
In another prior system, a robot includes a proximity sensor disposed on the robot so that the proximity sensor is always carried at a predetermined position above the tube sheet surface, such that the sensor switches between two operative states depending on whether the sensor is above a solid section of the tube sheet surface or over a tube opening. The processor uses the alternating states to track the robot's position as it moves over the tube sheet.
The prior hardware-based position solutions resulted in only semi-reliable results in the tube sheet tracker. For example, if the end effector was placed in an incorrect initial position, or if the robot moved to an unintended position, the tube sheet tracker correlations could be incorrect, and the inspection results may be correspondingly incorrect from that point on. With regard to image-based tracking, the limited number of tube sheet features, and corresponding centers thereof, gave rise to error in locating the end effector tool position. Further, it could be difficult or impossible to directly verify the robot's/end effector's position outside of the installation area due to dimensional constraints of the heat exchanger.
In some embodiments of systems and methods as described herein, a robot uses a camera and machine vision to capture images of the tube sheet, which may then be presented by a computer system display to an operator for visual analysis and for automatic analysis by the computer system to track and update the robot's position and orientation on the tube sheet. In one or more embodiments, motor encoders are omitted, but in others, motor encoders are used in parallel with machine vision methods. The tube sheet may include hundreds of tube penetrations, which a computer system correlates with tube sheet penetrations in images captured by the robot. The computer system captures and saves these correlations in tracking the robot's movement on the tube sheet, e.g. based on a truth table that maps the tube sheet surface. The computer system uses the correlation of the tube penetrations/openings in the image to the tube penetrations in the tube sheet to determine a position and/or orientation of the robot and associated end effector and/or tool. Particularly, the end effector may be positioned a known distance from the camera in a known direction, such that application of the known distance and direction to the determined position and orientation of the image identifies the end effector's position relative to the tube sheet.
For example, the computer system may initially align the image of the tube sheet surface with a tube sheet map based on two or more tube penetrations or other characteristics that are capable of unique identification in the image and that are also specifically and distinctly identifiable in tube sheet space. As discussed herein, the surface of tube sheet 118 may define a plurality of tube sheet characteristics, such as tube penetration locations 202 (
As discussed above, the robot's initial placement may encompass an area such that the initial captured image includes one or more predetermined tube penetrations or other tube sheet characteristics having known positions on the tube sheet surface. The operator, when placing the robot in an initial position on the tube sheet, may do so based upon observation of one or more markers made or placed on the tube sheet in proximity to the predetermined tube sheet characteristics for this purpose. Relying on identification of the predetermined tube sheet characteristics, which the operator identifies in the image through the user interface, the computing system correlates the image-space tube sheet characteristics to tube sheet space, initializing a procedure that is repeatable in each subsequent image frame.
The camera-based position and orientation determination may be limited due to the environmental conditions, such as poor lighting, tight camera clearances, or the like. The limiting conditions may cause the number of tube penetrations which are identifiable in each image to be relatively small, such as three tubes, two tubes, one tube, or, in some instances, zero tubes. The determination of the position and/or orientation on the tube sheet may be limited due to the small number of identifiable tube penetrations. As noted above, in one or more embodiments discussed herein, the processing circuitry may correlate an image into tube sheet space if the processing circuitry identifies at least two tube sheet characteristics in the image that have known positions in tube sheet space. If, in a given image, the processing circuitry is unable to locate at least two known tube sheet characteristics (e.g. because poor lighting prevents identification of tube sheet characteristics or their centers), the processing circuitry discards the image and repeats the process for the next subsequent image, as if the discarded image had not occurred. If the processor is still unable to identify at least two known tube sheet characteristics in the next image, this process repeats and will so repeat until either successfully identifying two known tube sheet characteristics in a subsequent image or assessing a predetermined number of images without identifying two such tube sheet characteristics. The predetermined number is selected by the operator, based on the robot's known top rate of travel on the tube sheet surface and the camera's known rate of image acquisition, to correspond to a distance traveled by the robot at its top speed that would preclude the system from correlating tube sheet characteristics in a new image with tube sheet characteristics in the most recent readable image. At this point, if the robot does not concurrently accumulate its position through the use of motor encoders, the processing circuitry determines that the tracking process cannot continue and provides such notice to the operator at the display of user interface 60 (
The processes for determining the robot's position and orientation in tube sheet space identify that position and orientation based on the robot camera's optical center position. The end effector, and the tool it secures, are offset from that robot center position by the robot arm, such that any error that might occur in the determination of the robot's orientation, for example due to distortion in the image, increases in magnitude when translated out to the tool. Particularly where the tube openings are relatively closely spaced and relatively few tube openings are used to determine the robot's orientation (which can occur, e.g., when using a non-wide angle lens in the camera) so that the tube openings used are relatively close to the camera center, such error can result in a misidentification of the tube to which the tool is applied.
To counter potential effects of such errors, one or more embodiments of apparatus and methods as described herein adjust the determination of the robot's orientation (and, thus, the position of the end effector and its tool) based upon a quantification of distortion present in the image. In particular, the system compares the alignment of certain heat exchanger characteristics (e.g. tube opening centers) with respect to each other in the image with the known alignment of the same heat exchanger characteristics in tube sheet space and, to the extent the comparison indicates that such alignment is distorted in the image, adjusts the determination of the robot's orientation to counteract or accommodate the measured distortion. In certain embodiments, the system bases the distortion measurement upon a first plurality of heat exchanger characteristics visible in the image and a second plurality of heat exchanger characteristics detectable in the image disposed with respect to each other at an expected orientation based on tube sheet space, where confidence in the distortion adjustment increases directly with the number of heat exchanger characteristics in each plurality. The robot and associated processing circuitry may therefore include features to increase the number of detectable heat exchanger characteristics in the image, such as a wide angle lens and image processing techniques that may provide a clearer or more detailed image for determining the heat exchanger characteristics. As should be understood, a non-wide angle, or normal, lens is one that produces a field of view that appears natural to a human observer, i.e. with a focal length approximately equal to or greater than the image frame diagonal. A wide angle lens, by contrast, has a focal length smaller than that of a normal lens for a given film plane, for example less than the approximate image plane diameter or less than half the approximate image plane diameter. The processing circuitry may, for example, apply an undistort filter to the image data to compensate for lens curvature of the wide angle lens. In some embodiments, the processing circuitry may apply light compensation, such as a high and/or low gamma compensation, which may maximize distinguishable details of the image data.
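By way of non-limiting illustration, the undistort filter and gamma compensation described above might be implemented along the lines of the following Python sketch using the OpenCV and NumPy libraries; the camera matrix and distortion coefficients shown are hypothetical placeholders that would, in practice, come from a calibration of the wide angle lens.

    import cv2
    import numpy as np

    def preprocess_frame(frame, camera_matrix, dist_coeffs, gammas=(0.5, 2.0)):
        """Undistort a wide angle frame, then build low/high gamma variants."""
        # Compensate for the lens curvature of the wide angle lens.
        undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
        variants = []
        for gamma in gammas:
            # Gamma compensation via lookup table: out = (in/255)^(1/gamma)*255.
            lut = np.array([((i / 255.0) ** (1.0 / gamma)) * 255
                            for i in range(256)], dtype=np.uint8)
            variants.append(cv2.LUT(undistorted, lut))
        return undistorted, variants

    # Hypothetical intrinsics; real values come from calibrating the lens.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    D = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

The low gamma variant tends to recover detail in washed-out regions and the high gamma variant in dark regions, which is one way of maximizing distinguishable details in the image data.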
The heat exchanger characteristics may include tube locations, or identification of plugged tubes, stay tubes, or the like, e.g. as identified by the centers thereof. The processing circuitry compares the heat exchanger characteristics from the image to predetermined data locating the characteristic on the tube sheet, to thereby determine the image's current location with respect to the tube sheet map and to identify other tube sheet characteristics in the image with respect to the tube sheet map. The processing circuitry may compare an unknown heat exchanger characteristic, such as a tube location, in a given image to a known heat exchanger characteristic in a prior image to determine or confirm the identity of the heat exchanger characteristic in the present image. In some example embodiments, the processing circuitry may confirm the identity of the heat exchanger characteristic based on two or more image frames at two or more locations.
Turning to the robot's orientation with respect to the tube sheet, the processing circuitry determines a rotation angle of the image, with respect to a given orientation in tube sheet space, based on alignment of the heat exchanger characteristics to the tube sheet map. The processing circuitry then calculates one or more angles between respective pluralities of aligned heat exchanger characteristics, such as tube locations, in image space to determine an offset of heat exchanger characteristics from an expected orientation based on the actual positions of those heat exchanger characteristics in tube sheet space. Relying on this offset, the system adjusts the orientation of the robot and/or camera within a display presented to the operator that identifies the image's location in tube sheet space.
In operation, heated fluid, such as water, flows into heat exchanger 100 through inlet piping 108 to hot leg 102. The fluid enters tubes 110 through tube sheet 118, transfers heat to fluid flowing through heat sink side 106, discharges into cold leg 104, and exits heat exchanger 100 through outlet piping 112. On heat sink side 106, cooler fluid (relative to the hot water passing through tubes 110), such as water, enters heat exchanger 100 through a feed ring 114 and passes downward over a thermal shroud 115 to tube sheet 118. Thermal shroud 115 separates the feed water from direct contact with the tubes as the water flows downward from feed ring 114, thereby allowing the feed water to be first warmed by heat from the fluid within thermal shroud 115 as the fluid on the outside of thermal shroud 115 passes to tube sheet 118, thereby reducing or preventing thermal shock to tubes 110. The fluid then passes under thermal shroud 115 into a volume defined by the shroud and containing tubes 110 and flows upward, receiving heat energy from tubes 110, thereby generating steam. The steam exits the heat exchanger 100 through a steam pipe 116 to be utilized by steam systems, such as turbine generators.
Heat exchanger 100 may be inspected periodically to monitor for corrosion and/or chemical build up that may degrade the normal operation of heat exchanger 100 and/or result in a leak from the heat source side, e.g. hot leg 102 and cold leg 104, to heat sink side 106. Due to the geometry of tubes 110 and their proximity to each other, tubes 110 and, correspondingly, tube sheet 118 can be susceptible to corrosion and chemical build up. Additionally, due to space constraints and/or other hazards, such as radiation and contamination in nuclear applications, the inspections are typically performed by a robot 120 inserted into heat exchanger 100 through a manway 121. The depicted heat exchanger 100 is a vertical steam generator, which is described merely for illustrative purposes. One of ordinary skill in the art would immediately appreciate from the present disclosure that the systems and methods described herein may be employed on various types of heat exchangers and in various heat exchanger orientations.
As should be understood, a motor encoder is an electro-mechanical device driven by the output (e.g. a shaft) of a motor to which it is attached, or of which it is otherwise a part, and that outputs a signal corresponding to the shaft's angular position and/or continuing rotation. Each of the robot's electric motor(s) that drives the relative position between the two housing parts or rotation of the rotatable section has an encoder that outputs a signal to the processor, which in turn receives and collects the signals. The processor is calibrated to translate the signal from the encoder for the motor that drives relative movement between the housing parts, and therefore linear movement of the robot housing, into distance data reflecting the distance traveled by the robot from a known initial position. The processor is also calibrated to translate the signal from the encoder for the motor that drives the rotatable segment into angular rotation of the robot from an initial orientation. By accumulating such distances and angle changes in sequence, the processing circuitry thereby tracks the robot's movement and positions over the tube sheet surface from a known initial position.
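A minimal sketch of such encoder accumulation, assuming hypothetical calibration constants that convert encoder counts into linear and angular travel:

    import math

    class DeadReckoningTracker:
        """Accumulates encoder-derived moves from a known initial pose."""

        def __init__(self, x0, y0, heading0_deg, mm_per_count, deg_per_count):
            self.x, self.y = x0, y0             # tube sheet coordinates (mm)
            self.heading = heading0_deg         # orientation (degrees)
            self.mm_per_count = mm_per_count    # linear calibration
            self.deg_per_count = deg_per_count  # rotational calibration

        def on_drive_encoder(self, counts):
            # Translate along the current heading.
            d = counts * self.mm_per_count
            self.x += d * math.cos(math.radians(self.heading))
            self.y += d * math.sin(math.radians(self.heading))

        def on_rotation_encoder(self, counts):
            # Rotate about the robot's axis.
            self.heading = (self.heading + counts * self.deg_per_count) % 360.0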
For instance, and continuing with the discussion of the robot's operation based on encoder data, processing circuitry 50 includes a memory storage 54 (
End effector 122 may secure one or more tools 123, such as an eddy current probe, for inspection and/or repair of the tubes 110. Robots for traversing and imaging tube sheets as described herein, having such effectors and cameras, are known, for example manufactured under the model name ZR-100 by Zetec, Inc. of Snoqualmie, Wash.
The construction and operation of such robots with respect to engaging and traversing the tube sheet, being understood in the art, are not discussed in detail herein. Generally, however, and for example in an embodiment in which processing circuitry 50 (
The operator operates the robot via a user interface 60 (
Camera 124 may be a digital camera having a processor and executable code stored in memory at the camera that is executable by the processor so that the camera is configured to capture image data, including fixed images or moving images. In some example embodiments, camera 124 captures images at a frame rate of 30 Hz (images per second), 60 Hz, or the like. In some example embodiments, camera 124 includes a wide angle lens, such as a fish eye lens, to broaden the camera's field of view and thereby maximize the viewable area of the tube sheet within the image data of a given acquired image, which may increase the number of tube sheet characteristics in each frame of the image data. Camera 124 is mounted on the robot so that the camera's field of view is directed downward, relative to robot 120, to capture image data that encompasses a portion of tube sheet 118. In some example embodiments, robot 120 may include or be associated with one or more tools, which may be disposed at and gripped by a distal end of the end effector or elsewhere on robot 120. Each tool may be disposed a predetermined and known distance from camera 124.
In certain embodiments, the end effector includes a rotatable unit at the end of the end effector's boom, with the tool being disposed on the rotatable unit. A motor is disposed at the boom end, under the control of the robot processor, to rotationally drive the rotatable unit in response to control signals issued by the robot's mobility control processor. The motor may include an encoder disposed on the motor so that the encoder outputs a signal to the system processing circuitry that corresponds to the rotatable unit's, and therefore the tool's, rotational position (with respect to a predetermined rotational position) about a vertical axis passing through the rotational unit's rotatable attachment to the boom end. Since the rotational unit's length from the boom end, and therefore the tool's distance from the boom end, is known and stored in memory accessible by the processor, and the encoder's data indicates the rotational unit's angular position at the boom end with respect to a predetermined orientation, the system processing circuitry knows (a) the horizontal distance from the camera's vertical field of view axis to the vertical axis of rotation between the rotatable unit and the boom end (stored in system memory), (b) the horizontal distance between the vertical axis of rotation between the rotatable unit and the boom end and a vertical axis passing through the tool, and (c) the angle (in the horizontal plane) between those two distance vectors. Thus, for any given angular position of the rotatable unit with respect to the boom end, this data defines two sides of a triangle (the two distances) and the angle therebetween. Accordingly, for each such angular position, the system processing circuitry determines the third side to the triangle through side-angle-side triangulation, thus identifying the horizontal distance between the camera's vertical field of view axis and the vertical axis passing through the tool and the angular offset (in the horizontal plane) between the distance vector from the camera's vertical field of view axis and the vertical rotational axis between the boom end and the rotatable unit and the distance vector from the camera's vertical field of view axis and the vertical axis passing through the tool. It will be understood, in view of the present disclosure, that the latter distance vector is the relevant vector for use in the heat exchanger inspection method described below. For ease of explanation, this description assumes that the system has rotationally positioned the rotatable unit so that the horizontal distance vector between the vertical axis of rotation between the rotatable unit and the boom end and the vertical axis through the tool is aligned with the distance vector between the camera's vertical field of view axis and the vertical axis of rotation between the rotatable unit and the boom end, such that the horizontal distance from the camera's vertical field of view axis and the vertical axis through the tool is the sum of these two distances and that the angle between a vector from the camera's vertical field of view axis to the vertical axis through the tool and a vector from the camera's vertical field of view axis to the vertical axis of rotation between the rotatable unit and the boom end is zero. 
It should be understood in view of the present disclosure, however, that the system may control the rotatable unit to be positioned at various angular positions and, in such event, the system processing circuitry will determine the distance vector from the camera's vertical field of view axis to the vertical axis through the tool based on encoder data as described above and adjust the vector's angular orientation accordingly.
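The side-angle-side determination described above may be sketched as follows, with hypothetical names; the two distances are horizontal distances in a common unit, and the rotatable unit's angle is the encoder-reported angle measured from the aligned position described above, at which the offset angle is zero and the camera-to-tool distance is simply the sum of the two distances.

    import math

    def tool_offset_from_camera(a, b, unit_angle_deg):
        """Locate the tool relative to the camera's vertical field-of-view axis.

        a: horizontal distance, camera axis to the rotatable unit's rotation axis.
        b: horizontal distance, rotation axis to the vertical axis through the tool.
        unit_angle_deg: rotatable unit angle from the aligned position (encoder),
                        zero meaning the tool lies on the extension of the boom.
        """
        # Included angle of the side-angle-side triangle at the rotation axis.
        included = math.radians(180.0 - unit_angle_deg)
        # Law of cosines yields the camera-to-tool distance (the third side).
        c = math.sqrt(a * a + b * b - 2.0 * a * b * math.cos(included))
        # Angular offset of the camera-to-tool vector from the camera-to-boom
        # vector, from the tool's planar coordinates relative to the camera.
        tx = a + b * math.cos(math.radians(unit_angle_deg))
        ty = b * math.sin(math.radians(unit_angle_deg))
        return c, math.degrees(math.atan2(ty, tx))

    # Aligned case from the description: distance a + b, offset angle zero.
    dist, ang = tool_offset_from_camera(150.0, 80.0, 0.0)  # -> (230.0, 0.0)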
Robot 120 may be utilized with its associated camera 124 and end effector 122/tool 123 to perform an inspection of heat exchanger 100 (
The present discussion refers to a tube sheet space. Tube sheet space is a two-dimensional coordinate system that can be considered to overlay a tube sheet surface, such as depicted at
Processing circuitry 50 (
The discussion below, with reference to the steps provided at
In one or more embodiments, the tube sheet position at which the operator initially places the robot is not necessarily the initial position at which the tracking operation begins. For example, as described above, the tube sheet may be marked so that the operator, positioned at the manway, may locate the robot in the operational starting position such that two or more of the four pins of housing part 117 (
Once the robot reaches the predetermined initial tracking position on the tube sheet so that the main robot camera's field of view faces downward toward the tube sheet surface and encompasses the at least two predetermined tube sheet features, for example tube penetrations, plug tubes, stay tubes, or the like, the robot camera acquires an initial image. The camera outputs the image data to the system processor, which receives the image data at 704. As discussed in more detail below, where the camera includes a wide angle lens, the processing circuitry may apply an undistort filter at 706 and may apply light compensation to the acquired image data at 708. At 710, the processor assesses the image data to identify any circular feature that meets certain predetermined criteria for defining a normal tube opening (see
To detect the heat exchanger characteristics at 710, processing circuitry 50 (
Thus, the processor, at this point, knows the pixel position in the initial image of each of a predetermined type of tube sheet characteristic. At 712, the processor compares this information to known data that describes the heat exchanger surface to thereby, at 714, locate the acquired image, and therefore the robot's position and orientation, on the heat exchanger (in this instance, the tube sheet) surface.
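Although the detection criteria are implementation-specific, one plausible sketch of the circular-feature detection at 710 uses the Hough circle transformation referenced later in this description, here via OpenCV with illustrative parameter values only:

    import cv2
    import numpy as np

    def detect_tube_centers(gray_image, min_r_px, max_r_px):
        """Find circle centers meeting radius criteria for a normal tube opening."""
        blurred = cv2.medianBlur(gray_image, 5)  # suppress speckle before voting
        circles = cv2.HoughCircles(
            blurred, cv2.HOUGH_GRADIENT, dp=1.2,
            minDist=2 * min_r_px,       # adjacent openings cannot overlap
            param1=100, param2=30,      # edge and accumulator thresholds
            minRadius=min_r_px, maxRadius=max_r_px)
        if circles is None:
            return []
        # Each detection is (x_pixel, y_pixel, radius_pixels).
        return [(x, y, r) for x, y, r in np.round(circles[0]).astype(int)]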
The comparison of the image data with the tube sheet data for the initial tracking image is based on the operator's identification of at least two predetermined tube sheet characteristics in the image. The processor drives user interface 60 (
When the processor displays the image at user interface 60 (
At this point, the processor knows the locations of the two characteristics within the image space, as defined by their pixel locations. The processor also knows the pixel locations of all other tube characteristics identified in the image, as described above. The processor then identifies the relative positions of the two identified predetermined tube sheet characteristics with respect to each other and the other tube centers identified in the image, e.g. whether the selected and identified predetermined tube sheet characteristics are adjacent each other in the image, with respect to the other tube sheet characteristics identified in the image, or if there are other tube sheet characteristics in the image disposed between the two identified predetermined tube sheet characteristics and, if so, how many.
As discussed above, the system program may be calibrated so that the processor can translate distances in image space into distances in tube sheet space. For example, prior to locating the robot onto the tube sheet, the operator may place the robot onto a surface upon which are marked at least two surface characteristics, the size of, or distance between which, is known. The surface is at the same position with respect to the camera as will be the tube sheet surface when the robot is placed on the tube sheet. The robot camera acquires an image of the calibration surface and outputs the image to a calibration system that displays the image on an operator screen. An operator locates the two characteristics on the screen using an input device such as a mouse or a keyboard, in a manner similar to that discussed above with regard to location of the predetermined tube sheet characteristics, and the calibration program determines the pixel location of the two characteristics in the image. The operator enters the actual surface distance between the two image characteristics or a dimension of the characteristic (e.g. the diameter of a tube opening). Since the calibration system knows the distance between the two characteristics, or the size/diameter of the characteristic, in terms of image pixels, this establishes a correlation between distances in image space and distances in tube sheet space. That is, for example, since the system can determine the diameter of the circular tube openings, and therefore the pixel distance across the opening, this establishes a correlation between the tube opening diameter in image space and distances in tube sheet space. The operator interacts with the program at the system processor to enter this correlation, which the system processor stores in system memory.
Returning to the first tube sheet image, the system processor, executing the program, determines the distance, in pixels, between the center of the first predetermined tube sheet characteristic and the center of the second predetermined tube sheet characteristic. Because the system processor knows the correlation between image pixel distance and tube sheet space distance, the processor applies the ratio of tube sheet space distance/image pixel distance to the determined pixel distance between the first and second predetermined tube sheet characteristics, thereby determining the tube sheet distance that corresponds to the image distance between the first and second predetermined tube sheet characteristics.
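A minimal sketch of this correlation and its application, with hypothetical values:

    def calibrate_scale(pixel_distance, actual_distance_mm):
        """Correlate image pixels to tube sheet distance from known features."""
        return actual_distance_mm / pixel_distance  # mm per pixel

    def image_to_sheet_distance(pixel_distance, mm_per_pixel):
        """Apply the stored correlation to any measured pixel distance."""
        return pixel_distance * mm_per_pixel

    # Hypothetical values: a 19.05 mm tube opening spans 42 px in the image.
    mm_per_px = calibrate_scale(42.0, 19.05)
    # Tube sheet distance between the two predetermined characteristics.
    d_sheet = image_to_sheet_distance(310.0, mm_per_px)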
Knowing the relative disposition of the two predetermined tube sheet characteristics in image space, and the tube sheet space distance between those characteristics' image positions, the program queries the truth table for all tube numbers (e.g. row/column indicator) corresponding to this tube sheet. The program selects the two tube numbers corresponding to the two identified predetermined tube sheet characteristics and, thereby, the two tube sheet space locations (in this example, in terms of the two-dimensional distance coordinates) for those centers of those tube sheet characteristics. The program determines the tube sheet space distance between those two characteristics, compares that distance with the tube sheet space distance between those two characteristics' image positions, and determines whether the two distances are within a predetermined error threshold. If so, the program determines whether the truth table data reflects the same relative tube sheet orientation between the two predetermined tube sheet characteristics in tube sheet space as the image indicates in image space. For example, if the truth table indicates that the two tube sheet characteristics are adjacent to each other, without any intervening tube sheet characteristics, is that also true of the two identified tube sheet characteristics in image space? If the truth table indicates that the two tube sheet characteristics are separated in tube sheet space by a third tube sheet characteristic whose center is linearly aligned with the two tube sheet characteristic centers, is that also true of the two identified tube sheet characteristics in image space?
If the distance check and the orientation check return positively, the processor has determined that the operator-selected tube sheet characteristics in image space can correspond to the two selected tube sheet characteristics in tube sheet space. If either confirmation check returns negatively, the program determines that the likelihood that the operator-selected tube sheet characteristics in image space can correspond to the two selected tube characteristics in tube sheet space is low. If the check is negative, the processor provides the operator with an instruction at the user interface display that the identification of the two predetermined characteristics has failed and to re-enter the data and then ceases progress of the data analysis until receiving data that matches the criteria. If, however, the check is positive, the processor provides a success notification to the operator at the user interface and moves to the next step.
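The distance check may be sketched as follows, assuming, hypothetically, that the truth table is held as a mapping from tube number (row, column) to tube sheet space center; the relative-orientation check would follow the same pattern.

    import math

    def verify_selection(img_centers_px, tube_numbers, truth_table,
                         mm_per_px, tol_mm):
        """Check operator-identified characteristics against the truth table.

        img_centers_px: pixel centers of the two selected image characteristics.
        tube_numbers: the two (row, col) numbers the operator assigned to them.
        truth_table: {(row, col): (x_mm, y_mm)} for every tube on this sheet.
        """
        (u1, v1), (u2, v2) = img_centers_px
        image_dist_mm = math.hypot(u2 - u1, v2 - v1) * mm_per_px
        (x1, y1), (x2, y2) = (truth_table[n] for n in tube_numbers)
        sheet_dist_mm = math.hypot(x2 - x1, y2 - y1)
        # Distance check: the two spacings must agree within the threshold.
        if abs(image_dist_mm - sheet_dist_mm) > tol_mm:
            return False  # prompt the operator to re-enter the identification
        # The relative-orientation check (adjacency, intervening centers)
        # would follow here; it is omitted from this sketch.
        return True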
Because the processor now knows the image space positions of at least two tube sheet characteristics, for which the processor also knows (from the truth table) the tube sheet space locations, the processor can find the tube sheet positions of the other tube sheet characteristics present in the image. To do this, the processor locates each tube opening identified in the image with respect to the two predetermined tube sheet tube openings and then identifies the closest tube opening having the same relationship to those two tube openings in tube sheet space. Based on the image space information, the processor determines each of a plurality of triangles in image space, where each triangle's corners are the pixel locations of the two identified predetermined tube sheet characteristic centers and the pixel position of a respective one of the remaining tube opening centers. Based on the pixel position for each of the three triangle corners in the image (assuming a two dimensional coordinate system in image space in which the position of each pixel is defined), the processor determines the distance in image space between each pair of corners in the triangle. To the resulting known side-side-side triangle, the processor applies the Law of Sines and/or the Law of Cosines at each corner of the triangle defined by the two identified predetermined tube sheet characteristic centers to thereby solve for the triangle's angles at those two corners. Of course, these angles should remain the same for the corresponding triangle in tube sheet space. Thus, the processor determines a line in tube sheet space connecting the two predetermined tube sheet characteristic centers and defines a respective line extending from each of the two tube sheet characteristic centers as defined by its corner angle in the corresponding image space triangle. Projection of these two lines in tube sheet space from the two predetermined tube sheet characteristic centers defines, at the lines' intersection, where the center of the tube opening corresponding to the third corner in the image space triangle should be. The processor then finds, in tube sheet space, the tube opening center closest to this expected point. If the so-identified tube sheet space tube opening center is within a predetermined threshold distance (defined in tube sheet space) from the expected point, and if there is only one tube sheet space tube opening center within that threshold, the processor considers the so-identified tube opening center in tube sheet space as corresponding to the tube opening center from image space that comprised the third point in the triangle. The processor then acquires the row/column number of the corresponding tube from the truth table, based upon the tube's center location in tube sheet space. If more than one tube opening center in tube sheet space, or if none, is found to fall within the predetermined threshold, the processor does not associate any of the tube sheet-space tube centers with the tube opening center from image space that corresponds to the third point in the triangle. The processor repeats this analysis for every other tube center in the image until all image tube centers are correlated with a tube sheet space tube center or there is a failure to do so. The processor creates a table entry in memory 54 (
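One way to realize this construction is sketched below. Rather than solving the side-side-side triangle explicitly by the Law of Sines or Cosines, the sketch computes the equivalent signed corner angles directly, which preserves the side of the connecting line on which the third center lies; it assumes image space and tube sheet space share the same handedness.

    import math

    def signed_angle(base, other):
        """Signed angle (radians) from vector base to vector other."""
        return math.atan2(base[0] * other[1] - base[1] * other[0],
                          base[0] * other[0] + base[1] * other[1])

    def rotate(v, ang):
        c, s = math.cos(ang), math.sin(ang)
        return (v[0] * c - v[1] * s, v[0] * s + v[1] * c)

    def expected_third_point(p1, p2, p3, s1, s2):
        """Project an unidentified image center p3 into tube sheet space.

        p1, p2: pixel centers of the two identified predetermined characteristics.
        p3: pixel center of a remaining, unidentified tube opening.
        s1, s2: the first two centers in tube sheet space (from the truth table).
        """
        # Corner angles of the image space triangle at the two known corners.
        a1 = signed_angle((p2[0] - p1[0], p2[1] - p1[1]),
                          (p3[0] - p1[0], p3[1] - p1[1]))
        a2 = signed_angle((p1[0] - p2[0], p1[1] - p2[1]),
                          (p3[0] - p2[0], p3[1] - p2[1]))
        # Rays from s1 and s2 at the same corner angles; their intersection
        # is where the third tube opening center should lie.
        d1 = rotate((s2[0] - s1[0], s2[1] - s1[1]), a1)
        d2 = rotate((s1[0] - s2[0], s1[1] - s2[1]), a2)
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        t = ((s2[0] - s1[0]) * d2[1] - (s2[1] - s1[1]) * d2[0]) / denom
        return (s1[0] + t * d1[0], s1[1] + t * d1[1])

    def match_center(expected, sheet_centers, threshold_mm):
        """Accept the single tube sheet center within the threshold, if unique."""
        near = [c for c in sheet_centers
                if math.hypot(c[0] - expected[0],
                              c[1] - expected[1]) <= threshold_mm]
        return near[0] if len(near) == 1 else None  # none or ambiguous: skip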
The analysis above correlates the tube features in the image with the tube features in the tube sheet map and truth table. The processor also knows the robot's location in the image and can, therefore, identify the robot's location on the tube sheet. Referring to
Thus, through the comparison of image data to the tube sheet data at 712, the processor thereby identifies at 714 the image's, and therefore the robot's, location in tube sheet space. To determine the robot's orientation about axis 401, also at 714, the processor relies on information relating that orientation to the image. Robot orientation may be important, for example in some embodiments, in order to provide the operator an indication of the robot's heading, so that the operator may more accurately control (remotely, through the user interface and the processor) the robot's movements, and/or to identify the location of the end effector and the tool it carries so that the operator may deploy the tool into a tube in the tube sheet with confidence in the tube's identity. The operator, or the robot's manufacturer, may determine the end effector's position with respect to the camera's optical axis 401 in image space prior to the robot's deployment on the tube sheet and store this information at memory 54 (
This general knowledge of the robot's orientation, however, can include error, e.g. due to distortion in the image caused by a variety of sources, including environmental effects and use of a wide angle lens. Thus, also at 716, the processing circuitry corrects for this error based on a determination of such distortion. The processing circuitry identifies in the image a first line defined by two or more tube sheet characteristics, e.g. tube opening centers, an intersecting second line defined by two or more tube sheet characteristics (i.e. at least three total tube sheet characteristics), and the angle in image space between those two lines. The processing circuitry identifies the same corresponding tube sheet characteristics in tube sheet space based on the truth table and determines the corresponding tube sheet space angle between the intersecting lines they define in tube sheet space. In the absence of image distortion, the two angles should be equal. Thus, the processing circuitry compares the angles, determines any difference between them, and adjusts the robot's previously-determined tube sheet space orientation based on the determined angle error. As described below, each of the two lines defining the angle can be determined based on more than two tube sheet characteristics. The more tube sheet characteristics upon which the error correction algorithm relies, the more precise the alignment of the image to the tube sheet space, resulting in a more accurate orientation angle of the robot. Thus, the use of a wide angle lens at the camera for acquiring the images, as discussed herein, may enable the processing circuitry to include more heat exchanger characteristics in each frame of image data than previous image based location processes, and thereby increase the number of points for reducing error in the orientation angle.
Having correlated the robot's position and orientation in the image to the robot's position and orientation on the tube sheet surface, and determined error arising from image distortion, the processor adjusts the location of the end effector and, thereby, the tool it carries, in the tube sheet representation at
Once the tool operation is complete, or if the tool is not disposed over a tube opening for which use of the tool is desired, then at 718 the operator issues an instruction to the processor, via the user interface, to move the robot on the tube sheet surface in a direction desired by the operator's review of the tube sheet image at the user interface display, as discussed above, or the processor continues a previously-entered instruction that has not yet been completed. The processor receives (or continues) the instruction and responsively sends control signals (e.g. through appropriate relays) to mobility system 127 (
Upon receiving the subsequent image's data from the camera, the processor, at 722, repeats steps 702-720 for the new image. For the subsequent image received at 704, the existing heat exchanger data at 702 is provided via the prior image data. The processor again locates circular and linear tube sheet characteristics in the new image, in the same manner as it had for the initial image. Upon locating each such identifiable tube sheet characteristic in the subsequent image (in terms of pixel position), the processing circuitry compares the characteristics' pixel positions in the subsequent image with the tube sheet characteristic pixel positions in the immediately preceding acquired image. The robot's actual, average, expected, or maximum speed being known, and the camera's frame rate being known, dividing the latter into the former provides the distance the robot can be expected to travel from image to image. The addition of a tolerance, e.g. 5%, 10%, 15% or the like, to the expected distance produces a threshold distance by which tube sheet characteristics in the subsequent image are correlated to tube sheet characteristics in the prior image. This predetermined threshold is programmed into the program executed by the processor. The processor thus compares the pixel location of each identified tube sheet characteristic in the subsequent image to the pixel positions of each tube sheet characteristic in the initial image. Where a tube sheet characteristic is at a pixel position in the subsequent image that is within the predetermined threshold of the pixel position of the same type of tube sheet characteristic (e.g. open tube or plug tube, as the case may be) in the prior image, but is not within the predetermined threshold with respect to any other tube sheet characteristic of the same type in the prior image, the processor determines that the tube sheet characteristic in the subsequent image is the tube sheet characteristic from the prior image that is within the threshold distance. If multiple tube sheet characteristics from the prior image are within the threshold distance of the characteristic in the subsequent image, the characteristic in the subsequent frame is recorded but is not used to locate the subsequent image. Returning to a characteristic in the subsequent image for which there is only one characteristic within the threshold from the prior image, since the processor has identified the tube sheet characteristic from the prior image in tube sheet space, the processor assigns the tube sheet characteristic in the subsequent image the same identity and stores that tube sheet space identity in association with the image pixel location of the subsequent image in the data stored for this image, as discussed herein. The processor repeats this process for each tube sheet characteristic identified in the subsequent image. Where the processor is able to so identify the tube sheet space identity of at least two tube sheet characteristics of the subsequent image, this locates the subsequent image in tube sheet space, as discussed herein, where these two tube sheet characteristics are the predetermined tube sheet characteristics. If the subsequent image contains any tube sheet characteristics that were not present in or successfully identified within the prior image, the processor attempts to identify those characteristics based on at least two of the identified characteristics, as discussed herein.
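A sketch of the frame-to-frame correlation threshold and matching, with hypothetical parameter names:

    import math

    def frame_threshold_px(speed_mm_s, frame_rate_hz, mm_per_px, tol=0.10):
        """Greatest plausible pixel travel between consecutive frames."""
        per_frame_mm = speed_mm_s / frame_rate_hz  # speed divided by frame rate
        return (per_frame_mm * (1.0 + tol)) / mm_per_px

    def carry_identities(prev, current, threshold_px):
        """Carry tube sheet identities from the prior frame to the new frame.

        prev: list of ((x_px, y_px), kind, tube_number) from the prior image.
        current: list of ((x_px, y_px), kind) detected in the subsequent image.
        """
        identified = []
        for (cx, cy), kind in current:
            hits = [t for (px, py), k, t in prev
                    if k == kind and math.hypot(cx - px, cy - py) <= threshold_px]
            if len(hits) == 1:  # unique match: same tube as in the prior frame
                identified.append(((cx, cy), kind, hits[0]))
            # zero or multiple candidates: record, but do not use for locating
        return identified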
Having located the subsequent image, and therefore the robot, in tube sheet space, the processor identifies the image's, and therefore the robot's, general orientation in tube sheet space and corrects that orientation for image distortion, as discussed herein. The processor determines the tube sheet space position of the end effector and its tool, as described above and further below, and updates the representation of the image and the end effector/tool in the tube sheet representation presented at the user interface 60 (
The operator repeats this process until the operator has deployed the tool in all tube openings of interest, with the result that the processing circuitry has stored at 54 (
As described above, camera 124 may send image data 200 to processing circuitry 50 (
In some example embodiments, camera 124 may include a wide angle or ultra-wide angle lens 125 (
In some example embodiments, the image quality within the interior of heat exchanger 100 may be poor or unreliable due to the harsh environment in which the tube sheet is disposed. Thus, for example, lighting in image data 200 (
As discussed above, image space distortions can create error in the processor's location of the robot's orientation in tube sheet space. As described above, the processor determines the locations of the tube centers in the image through triangulation based on the positions of two known tube characteristics. The translation of each triangle into tube sheet space assumes that the representation of the tube sheet surface in the image is undistorted, so that the relationships among the features in the image are the same as the relationships among those same features on the tube sheet surface. Distortion in the image, however, can impart differences in those relationships, as between the image and the tube sheet, with the result that the correlation between one or more tube sheet characteristics in the image to tube sheet characteristics in tube sheet space may be incorrect, and there may be error in the identification of line 502 in tube sheet space. This, in turn, can translate into error in identifying the location of the end effector and tool at 503. Since the distance between robot center axis 401 and the end effector/tool 503 can be large relative to the dimensions of image 400, this error can propagate to such a degree that the end effector/tool position illustrated in
To resolve such error, the processor adjusts the position of reference line 502, and therefore of ray 504, in the display 500 of
In
The processor then selects one of the two tube opening centers adjacent the selected center 404 in the selected center's same row (i.e. among those tube characteristic centers having the same row number). The choice of which adjacent center is immaterial, but in this example the direction chosen results in the selection of the tube opening center immediately to the right and below the selected center 404. The processor defines a line 405 in image space extending through the two column centers and a line 407 in image space through the two row centers. The processor then measures the angle Δ between these two lines. Angle Δ could be measured directly between lines 405 and 407, or, e.g., by measuring the angle Θcol between lines 406 and 405, and the angle Θrow between lines 406 and 407, and determining the difference between Θcol and Θrow. Since the tube opening row lines and column lines in tube sheet space are always offset by 90° (or 270°, depending on the measurement direction, but in either event the “expected angle”), deviation from a 90° offset between lines 405 and 407 in image 400 is due to distortion in the image. In one or more embodiments, the processor directly measures the angle between lines 405 and 407 in the same direction as the expected angle is measured, compares the measured angle to the expected angle, and defines an offset adjustment angle, as discussed below, to be equal to one-half the difference between the expected angle and the measured angle. The 0.5 weighting factor was determined by trial and error to provide a desired distortion resolution, but it should be understood that this factor may be adjusted if desired. If the measured angle is less than the expected angle, the offset adjustment angle is negative, indicating a clockwise shift in lines 502 and 504 in
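For the simplest case, in which each line is defined by two tube centers, the measurement and weighting may be sketched as follows; the sign convention follows the description above, a measured angle short of the expected angle yielding a negative (clockwise) adjustment:

    import math

    def line_angle_deg(c1, c2):
        """Orientation in image space of the line through two tube centers."""
        return math.degrees(math.atan2(c2[1] - c1[1], c2[0] - c1[0]))

    def offset_adjustment_deg(col_a, col_b, row_a, row_b,
                              expected_deg=90.0, weight=0.5):
        """One-half the deviation of the measured column-to-row angle from the
        expected angle (90 or 270 degrees, per the measurement direction)."""
        measured = (line_angle_deg(col_a, col_b)
                    - line_angle_deg(row_a, row_b)) % 360.0
        # Negative when the measured angle is less than the expected angle,
        # i.e. a clockwise shift of the reference lines.
        return weight * (measured - expected_deg)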
In other embodiments, the distortion measurement, and its compensatory offset adjustment angle, are determined based on an approximation of column line 405 that incorporates the additional tube centers for the selected column that are visible in image 400 and an approximation of row line 407 that incorporates the additional tube centers for the selected row that are visible in image 400. In one or more such embodiments, the processor defines line 405 by applying a best fit algorithm to all such visible column tube centers in the selected column (in image space) and defines line 407 by applying a best fit algorithm to all such visible row tube centers in the selected row (in image space). The processor then directly measures the angle between lines 405 and 407 in the same direction as the expected angle is measured (or by determining Θcol and Θrow and the difference between those angles, as discussed above), compares the measured difference angle to the expected angle, and defines an offset adjustment angle, similarly as discussed above and below, to be equal to one-half the difference between the expected angle and the measured angle, weighted by a factor that depends on a ratio of the number of tube sheet characteristic points that contributed to the definition of row line 407 to the number of points that contributed to column line 405. Again, the default factor of 0.5 was determined upon trial and error to provide a desirable resolution of distortion when the column and row points contributed evenly. The sign of the offset adjustment angle, again, determines the direction by which lines 502 and 504 are rotated in
In other embodiments, the angle between lines 405 and 407 is not measured directly between the two lines but is, instead, measured as the difference between angles Θcol and Θrow measured between line 405 and a line 406 parallel to line 502 that passes through selected tube opening center point 404 and between line 407 and line 406, respectively, where the angle Θcol between line 405 and line 406 is the result of a replication and accumulation of such angles for multiple tube centers in the selected column, and the angle Θrow between line 407 and line 406 is the result of a replication and accumulation of such angles for multiple tube centers in the selected row. An accumulation of offset errors among the tube sheet centers in the selected column and in the selected row increases the confidence in the error determination. For each tube opening in the selected column in image 400 for which a center is within image 400 or is projectable to a determinable position with respect to image 400 (e.g. if the Hough circle transformation determines a circle center outside the image for a circle only partially visible in the image) for the tube openings above and below the selected tube opening in the selected opening's column (for purposes of this discussion, there are four such tube opening centers in image 400: the selected tube opening center 404 and the respective tube opening centers above and below the selected center 404 as indicated by column line 405), the processor defines a line 406 parallel to line 502 and extending through that tube opening center, a line 405 as a best fit line defined by the selected tube center point 404 and all other (in this instance, three) tube center points in the column in or projected from the image (as discussed above), and an angle (ΘCol) extending from that line 406 in the clockwise direction to that line 405. For each tube opening in the selected row in image 400 for which a center is within or projectable from the image for the tube openings to the right and left of that tube opening (there are five such tube opening centers in or projectable from image 400 along row 407: the selected tube opening center 404 and the two tube opening centers in the row both to the left and the right of the selected tube opening center 404), the processor defines a line 406 parallel to line 502 and extending through the selected tube opening center, a line 407 as a best fit line defined by the selected tube center point 404 and all other (in this instance, four) tube center points in the row in or projected from the image (as discussed above), and an angle (ΘRow) extending from that line 406 in the clockwise direction to that line 407. For each of the three other column tube centers within the same column as the originally selected column tube center, the processor determines an angle ΘCol specific to that tube opening as the now-selected opening, in the manner as described above. For each of the two other row tube centers within the same row as the originally selected row tube center, the processor determines an angle ΘRow specific to that tube opening as the now-selected opening, in the manner as described above. The processor averages the four values of ΘCol and averages the two values of ΘRow, where the average function is represented at Equation 1.
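The average function of Equation 1, reconstructed here from the surrounding description, is the arithmetic mean over the n accumulated angles, applied separately to the ΘCol set and the ΘRow set:

Θ̄=(Θ₁+Θ₂+ . . . +Θₙ)/n EQN. 1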
Processing circuitry 50 removes outliers in Θcol and Θrow, such as by applying Chauvenet's criterion. In some examples, Θcol and Θrow angles that deviate from the mean by more than a predetermined threshold, such as two standard deviations, may be removed from processing as outliers not indicative of a true heat exchanger characteristic location.
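The outlier rejection, followed by the Equation 1 averaging, may be sketched as below. Chauvenet's criterion and the two-standard-deviation variant follow the description above; Equation 1 itself is not reproduced in this excerpt, so a plain arithmetic mean is assumed.

    import numpy as np
    from scipy.special import erfc

    def reject_chauvenet(angles_deg):
        # Chauvenet's criterion: reject a sample when the expected number
        # of samples at least as extreme, under a normal model, is below 0.5.
        a = np.asarray(angles_deg, dtype=float)
        if len(a) < 3:
            return a
        mean, std = a.mean(), a.std(ddof=1)
        if std == 0.0:
            return a
        expected_count = len(a) * erfc(np.abs(a - mean) / (std * np.sqrt(2.0)))
        return a[expected_count >= 0.5]

    def reject_beyond_sigma(angles_deg, n_sigma=2.0):
        # Simpler variant from the text: drop angles deviating from the
        # mean by more than, e.g., two standard deviations.
        a = np.asarray(angles_deg, dtype=float)
        if len(a) < 2:
            return a
        mean, std = a.mean(), a.std(ddof=1)
        return a if std == 0.0 else a[np.abs(a - mean) <= n_sigma * std]

    def equation_1_average(angles_deg):
        # EQN. 1 (assumed here to be a plain mean), applied after rejection.
        return float(np.mean(angles_deg))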
The processor executes Equation 2 to thereby determine the absolute value of the difference, Δ, between the average column angle, Θcol, and the average row angle, Θrow.

Δ = |Θcol − Θrow| EQN. 2
Since Δ can range between 0° and 360°, if Δ is greater than 180°, the processor converts Δ to its corresponding angle below 180° according to Equations 3a and 3b, resulting in the angle σ. As will be apparent below, these steps are not needed for Equation 4, but for Equations 5a and 5b, the results of which indicate whether the angle adjustment should be additive or subtractive, the angle should be converted to a value less than 180°.
Δ > 180° ⇒ σ = 360° − Δ EQN. 3a

Δ ≤ 180° ⇒ σ = Δ EQN. 3b
The processor executes Equation 4 to determine the remainder, λ, from the numerical division of σ by the expected angle, ∠tubesheet (in this example, 90°).

λ = σ mod ∠tubesheet EQN. 4
The modulo operation result (λ) describes the angle by which the angular offset between the column and row in the image differs from the expected angular offset between the same column and row on the tube sheet. In this example, then, λ describes the angle by which the angular offset between the column and row in the image differs from 90°. It does not, however, indicate whether that difference from 90° is positive or negative. The processor, executing Equations 5a/5b, introduces the proper sign (i.e. indicating the direction of offset from the expected angle) and halves the result. This result is the offset adjustment angle.
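Assembled end to end, Equations 2 through 5 may be sketched as follows. Equations 5a and 5b are not reproduced in this excerpt; the sketch assumes their effect is to attach a sign indicating the direction of offset from the expected angle and to halve the result, per the description above.

    def offset_from_average_angles(avg_theta_col, avg_theta_row, expected_deg=90.0):
        delta = abs(avg_theta_col - avg_theta_row)           # EQN. 2
        sigma = 360.0 - delta if delta > 180.0 else delta    # EQNs. 3a/3b
        lam = sigma % expected_deg                           # EQN. 4
        # EQNs. 5a/5b (assumed form): fold lam into a signed deviation from
        # the expected angle, then halve it (the default 0.5 weighting).
        signed = lam if lam <= expected_deg / 2.0 else lam - expected_deg
        return 0.5 * signed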
As in the first two embodiments, the default weighting factor is 0.5. As in the second embodiment, the weighting factor can be modified based on the ratio of the number of tube characteristic column center points used to determine line 405 in the best fit analysis to the number of tube characteristic row center points used to determine line 407. The processor determines an error factor equal to one-half the ratio of the number of ΘCol angles utilized in the above analysis to the number of ΘRow angles utilized in the above analysis. If the error factor is less than 1, the processor keeps the offset adjustment angle unchanged. If the error factor is greater than 1, the processor multiplies the offset adjustment angle by the error factor.
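The count-based correction of this embodiment is direct enough to state as code; the sketch below follows the error-factor rule as written.

    def apply_error_factor(offset_deg, n_theta_col, n_theta_row):
        # Error factor: one-half the ratio of the number of ThetaCol angles
        # to the number of ThetaRow angles used in the analysis. The offset
        # adjustment angle is scaled only when the factor exceeds 1.
        error_factor = 0.5 * (n_theta_col / n_theta_row)
        return offset_deg * error_factor if error_factor > 1.0 else offset_deg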
Returning to
As discussed above, the adjustment to the rotation angle and/or orientation angle may cause the determination of the end effector position in
As discussed above, and referring to
Processing circuitry 50 identifies one or more heat exchanger characteristics by comparing the unknown heat exchanger characteristics and the known heat exchanger characteristics in image 400 of
As discussed above, processing circuitry 50 tracks, by storing to memory, heat exchanger characteristics from a previous frame and uses the locations of the previously identified heat exchanger characteristics to determine unknown or unidentified heat exchanger characteristics. In an example embodiment, processing circuitry 50 compares the current image frame to one or more previous frames. The known heat exchanger characteristics in a current image frame may be determined by being within a predetermined threshold, such as one radius, two radii, or the like, for circle detection of a tube location 402, or within a width of a tube 110 for line detection in tube sheet 118. The threshold may be selected based on the frame rate of the image data and/or the speed at which robot 120 (
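The frame-to-frame tracking may be sketched as a nearest-neighbor match under the radius threshold described above. The dictionary of labeled previous-frame positions and the helper names are illustrative assumptions.

    import numpy as np

    def match_to_previous_frame(prev_centers, curr_centers, tube_radius_px,
                                threshold_radii=2.0):
        # prev_centers: dict mapping a characteristic label (e.g. a
        # row/column index) to its (x, y) image position in the previous
        # frame. A current detection inherits the nearest previous label
        # when it falls within the threshold (e.g. one or two tube radii);
        # otherwise it remains unknown.
        threshold = threshold_radii * tube_radius_px
        known, unknown = {}, []
        for c in (np.asarray(p, dtype=float) for p in curr_centers):
            if prev_centers:
                label, pos = min(prev_centers.items(),
                                 key=lambda kv: np.linalg.norm(np.asarray(kv[1]) - c))
                if np.linalg.norm(np.asarray(pos) - c) <= threshold:
                    known[label] = tuple(c)
                    continue
            unknown.append(tuple(c))
        return known, unknown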
In an example embodiment in which few tubes, e.g. three tubes, four tubes, or the like, are found in the current frame, the theta angle analysis from a predetermined number of previous frames, such as one frame, three frames, or the like, may be averaged and used for adjustment of the rotation angle of image 400. In some example embodiments, processing circuitry 50 verifies the identified heat exchanger characteristics throughout the inspection. Processing circuitry 50 is configured to relabel any heat exchanger characteristic that is determined to be mis-identified. For example, if the heat exchanger characteristic is identified differently than its current identification a predetermined number of times, such as three times, five times, a majority of the time, or the like, the heat exchanger characteristic is relabeled with the new identification.
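The fallback averaging and the relabeling rule may likewise be sketched as below; the history length and vote threshold stand in for the predetermined numbers mentioned above.

    from collections import Counter, deque

    def fallback_theta(theta_history, current=None):
        # With few tubes in the current frame (e.g. three or four), average
        # the theta results of a predetermined number of previous frames,
        # e.g. theta_history = deque(maxlen=3).
        if current is not None:
            theta_history.append(current)
        return sum(theta_history) / len(theta_history) if theta_history else None

    def maybe_relabel(current_label, observed_labels, times=3):
        # Relabel a characteristic once a different identification has been
        # observed a predetermined number of times (three here); a
        # majority-of-observations rule could be substituted.
        if not observed_labels:
            return current_label
        label, count = Counter(observed_labels).most_common(1)[0]
        return label if label != current_label and count >= times else current_label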
Although the above process is discussed primarily in the context of inspecting tube sheet 118, other aspects of heat exchanger 100 may also be inspected using this process, such as the interface of tubes 110 and tube sheet 118 on the heat sink side 106 of the heat exchanger, such as depicted in
As depicted in
An example embodiment of the invention will now be described with reference to
An apparatus configured for heat exchanger inspection is provided. The apparatus may be an embodiment of inspection module 44 or a device hosting inspection module 44. In an example embodiment, the apparatus may include or otherwise be in communication with processing circuitry 50 that is configured to perform data processing, application execution and other processing and management services. In one embodiment, processing circuitry 50 may include a storage device 54 and a processor 52 that are in communication with or otherwise control a user interface 60 and a device interface 62. As such, processing circuitry 50 may be embodied as a circuit chip (e.g. an integrated circuit chip) configured (e.g. with hardware, software or a combination of hardware and software) to perform operations described herein. However, in some embodiments, processing circuitry 50 may be embodied as a portion of a server, computer, laptop, workstation or even one of various mobile computing devices. In situations where processing circuitry 50 is embodied as a server or at a remotely located computing device, user interface 60 may be disposed at another device (e.g. at a computer terminal or client device) in communication with processing circuitry 50 via device interface 62 and/or a network (e.g. network 30).
User interface 60 is in communication with processing circuitry 50 to receive an indication of a user input at user interface 60 and/or to provide an audible, visual, mechanical or other output to the user. As such, user interface 60 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen, a microphone, a speaker, mobile device, or other input/output mechanisms. In embodiments where the apparatus is embodied at a server or other network entity, user interface 60 may be limited or even eliminated in some cases. Alternatively, as indicated above, user interface 60 may be remotely located.
Device interface 62 may include one or more interface mechanisms for enabling communication with other devices and/or networks. In some cases, device interface 62 may be any means such as a device or circuitry embodied in hardware, software, or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with processing circuitry 50. In this regard, device interface 62 may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network and/or a communication modem or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB), Ethernet or other methods. In situations where device interface 62 communicates with a network, the network may be any of various examples of wireless or wired communication networks such as, for example, data networks like a Local Area Network (LAN), a Metropolitan Area Network (MAN), and/or a Wide Area Network (WAN), such as the Internet.
In an example embodiment, storage device 54 may include one or more non-transitory storage or memory devices such as, for example, volatile and/or non-volatile memory that may be either fixed or removable. Storage device 54 may be configured to store information, data, applications, instructions or the like for enabling the apparatus to carry out various functions in accordance with example embodiments of the present invention. For example, storage device 54 could be configured to buffer input data for processing by processor 52. Additionally or alternatively, storage device 54 could be configured to store instructions for execution by processor 52. As yet another alternative, storage device 54 may include one of a plurality of databases (e.g. database server 42) that may store a variety of files, contents or data sets. Among contents of the storage device 54, applications (e.g. client application 22 or server application 44) may be stored for execution by processor 52 in order to carry out the functionality associated with each respective application.
Processor 52 may be embodied in a number of different ways. For example, processor 52 may be embodied as various processing means such as a microprocessor or other processing element, a coprocessor, a controller or various other computing or processing devices including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a hardware accelerator, or the like. In an example embodiment, processor 52 may be configured to execute instructions stored in storage device 54 or otherwise accessible to processor 52. As such, whether configured by hardware or software methods, or by a combination thereof, processor 52 may represent an entity (e.g. physically embodied in circuitry) capable of performing operations according to embodiments of the present invention while configured accordingly. Thus, for example, when processor 52 is embodied as an ASIC, FPGA or the like, processor 52 may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when processor 52 is embodied as an executor of software instructions, the instructions may specifically configure processor 52 to perform the operations described herein.
In an example embodiment, processor 52 (or processing circuitry 50) may be embodied as, include or otherwise control the inspection module 44, which may be any means, such as, a device or circuitry operating in accordance with software or otherwise embodied in hardware or a combination of hardware and software (e.g. processor 52 operating under software control, processor 52 embodied as an ASIC or FPGA specifically configured to perform the operations described herein, or a combination thereof) thereby configuring the device or circuitry to perform the corresponding functions of inspection module 44 as described below.
In an example embodiment, processing circuitry 50 may include or otherwise be in communication with camera 124. The camera 124 may be a digital camera configured to capture image data associated with the surrounding environment. The image data may be one or more fixed images or a moving image.
Inspection module 44 may include tools to facilitate distributed heat exchanger inspections via network 30. In an example embodiment, inspection module 44 is configured to receive the image data from the camera, determine one or more heat exchanger characteristics in the image data, compare the one or more heat exchanger characteristics to heat exchanger data, determine a current location of the robot based on the comparison of the one or more heat exchanger characteristics to the heat exchanger data, and identify the heat exchanger characteristic based on the current location.
From a technical perspective, inspection module 44 described above may be used to support some or all of the operations discussed herein. As such, the platform described in
Accordingly, blocks of the flowchart support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flowchart, and combinations of blocks in the flowchart, can be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe exemplary embodiments in the context of certain exemplary combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. In cases where advantages, benefits or solutions to problems are described herein, it should be appreciated that such advantages, benefits and/or solutions may be applicable to some example embodiments, but not necessarily all example embodiments. Thus, any advantages, benefits or solutions described herein should not be thought of as being critical, required or essential to all embodiments or to that which is claimed herein. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.