The claimed subject matter relates generally to industrial control systems and more particularly to systems and methods that utilize time of flight sensing to control industrial equipment in the performance of work in industrial environments.
To date, human-machine collaboration has been based on a master-slave relationship where the human user operates industrial machinery or programs industrial machinery while it is off-line, allowing only static tasks to be performed. Moreover, to ensure safety, the workspaces of humans and industrial equipment are typically separated in time or in space. As will be appreciated, the foregoing approach fails to take advantage of potential human and industrial equipment/machinery collaboration where each member, human and industrial equipment/machinery, can actively assume control and contribute to the solution of tasks based on their respective capabilities.
Sign language has been utilized extensively in society, particularly amongst the hearing impaired, for the purposes of communication. Moreover, sign language and/or body gestures/language have been employed in noisy environments and/or environments where distance is a factor to convey commands and/or directions. For example, at industrial worksites, such as an aircraft manufacturing facility, it is not atypical to see personnel using hand and/or arm signals to direct crane operators in the maneuvering of heavy components, such as wings for attachment to the body of an aircraft under manufacture. Further, certain sign language and/or body gestures/expressions, regardless of region of the world and/or culture, can have universality and can convey substantially similar connotations.
As will be appreciated, industrial environments, or work areas within these industrial environments, can pose significant dangers and hazards to personnel who unwittingly enter them. In industrial environments there can be numerous machines that can spin and/or move at considerable speed and/or with tremendous force, such that should a human come in the way of these machines, serious injury or even death could result.
Touch screen monitors employed in industrial applications as human-machine interfaces (HMIs), despite constant cleaning (e.g., with wet wipes), can over time become encrusted with grime and/or detritus (e.g., dust, oils from contact with fingers, oils from industrial processes, particulate from latex or rubber gloves, etc.) even under the most sterile and/or sanitary conditions. Build-up of such grime and/or detritus layers can cause the sensitivity of touch screen monitors to deteriorate over time. Moreover, some touch screen monitors require that actual physical contact be made between a body part (e.g., a finger) and the screen. For instance, there are touch screen monitors that do not function when one is wearing gloves. As can be imagined, this can be a problem where the touch screen monitor is situated in a chemically corrosive industrial environment where exposure of skin in order to manipulate objects displayed on the screen can have hazardous consequences.
Further, touch screens manipulable using a stylus or other scribing means can also be subject to drawbacks since scribing or drawing the stylus over the surface of the touch screen can ultimately indelibly scratch or etch the surface, making subsequent viewing of the screen difficult or problematic. Additionally, many working areas in an industrial plant can be situated within environments where the atmosphere is saturated with airborne abrasive particulate matter and/or oils that can settle on touch screens. This abrasive particulate matter, alone and/or in conjunction with any settled oils acting as a lubricant, can ineffaceably incise the touch screen were a stylus or other scribing means to be drawn over the touch screen. Moreover, use of light pens, light wands, or light guns, rather than a stylus, is typically not compatible with current industry trends away from cathode ray tube (CRT) monitors toward flat screen technologies for reasons of space savings, and further, use of light pens, light wands, or light guns requires that the user be relatively proximate to the CRT monitor.
In order to demarcate or circumscribe and/or monitor hazardous regions in an industrial automation environment that can include various machines moving and/or rotating with great rapidity and/or force, it has been common practice to employ fences, light curtains, and the like, to immediately halt the machines in the controlled or bounded area should persons unwittingly stumble into and/or limbs accidentally enter such dangerous areas during operation of these machines. A further widespread practice that has also been employed to prevent inadvertent entry into restricted and/or supervised zones posing risk to life and/or limb in industrial automation environments has been the use of position marking points, wherein cameras detect and ascertain the position of the position marking points and generate the boundaries of the protected areas, which can thereafter be monitored for intentional and/or inadvertent/accidental entry.
The following summary presents a simplified overview to provide a basic understanding of certain aspects described herein. This summary is not an extensive overview nor is it intended to identify critical elements or delineate the scope of the aspects described herein. The sole purpose of this summary is to present some features in a simplified form as a prelude to a more detailed description presented later.
In accordance with various aspects and/or embodiments of the subject disclosure, a method for utilizing a user's body movement in an industrial automation environment is provided. The method includes employing a time-of-flight sensor to detect movement of a body part of the user, ascertaining whether or not the movement of the body part conforms to a recognized movement of the body part, interpreting the recognized movement of the body part as a performable action, and actuating industrial machinery to perform the performable action based on the recognized movement of the body part.
In accordance with further aspects or embodiments, a system that employs body movement to control industrial machinery in an industrial automation environment is disclosed. The system can include a time-of-flight sensor that detects movement of a body part of a user positioned proximate to the time-of-flight sensor, an industrial controller that establishes whether or not the movement of the body part conforms with a recognized movement of the body part, and an industrial machine that performs an action based at least in part on instructions received from the industrial controller.
In accordance with yet further aspects or embodiments, a system that utilizes movement performed by a user to actuate actions on industrial equipment is described. The system can include means for constantly monitoring the movement performed by the user, means for detecting an appropriate movement performed by the user, means for demarcating, on a generated or persisted map of an industrial factory environment, a safety zone around the industrial equipment described by the appropriate movement performed by the user, and means for actuating the industrial equipment to monitor the safety zone for inadvertent intrusion.
To the accomplishment of the foregoing and related ends, the following description and annexed drawings set forth in detail certain illustrative aspects. These aspects are indicative of but a few of the various ways in which the principles described herein may be employed. Other advantages and novel features may become apparent from the following detailed description when considered in conjunction with the drawings.
Systems and methods are described that employ a user's body movements, gestures, or gesticulations to control industrial equipment in industrial automation environments. In one embodiment, a method is provided that employs a time-of-flight sensor to detect movement of a body part of the user, ascertains whether or not the movement of the body part conforms to a recognized movement of the body part, interprets the recognized movement of the body part as a performable action, and thereafter actuates industrial machinery to perform the performable action. In a further embodiment, a system is provided that utilizes body movement to control industrial machinery in an industrial automation environment, wherein a time-of-flight sensor can be employed to detect movement of a body part of a user positioned proximate to the time-of-flight sensor, an industrial controller can be used to establish whether or not the movement of the body part conforms with a recognized movement (or pattern of movements) of the body part, and an industrial machine can perform actions in response to instructions received from the industrial controller.
Referring initially to
In the same manner that a human observer can understand consistently repeatable body motion or movement to convey secondary meaning, system 100 can also utilize human body movement, body gestures, and/or finger gesticulations to convey meaningful information in the form of commands, and can therefore perform subsequent actions based at least in part on the interpreted body movement and the underlying command. Thus, as stated earlier, time-of-flight sensor 102 can monitor body motion of a user positioned within its line of sight. Time-of-flight sensor 102 can monitor or detect any motion associated with the human body. In accordance with one embodiment, time-of-flight sensor 102 can monitor or detect motion associated with the torso of the user located proximate time-of-flight sensor 102. In accordance with another embodiment, time-of-flight sensor 102 can detect or monitor motion associated with the hands and/or arms of the user situated within the line of sight of time-of-flight sensor 102. In accordance with yet a further embodiment, time-of-flight sensor 102 can detect or monitor eye movements associated with the user situated within the working ambit of time-of-flight sensor 102. In accordance with another embodiment, time-of-flight sensor 102 can detect or monitor movement associated with the hand and/or digits (e.g., fingers) of the user positioned proximate to the optimal operating zone of time-of-flight sensor 102.
At this juncture, it should be noted, without limitation or loss of generality, that time-of-flight sensor 102, in conjunction or cooperation with other components (e.g., controller 104 and logic component 106), can perceive motion in at least three dimensions. In accordance with an embodiment, therefore, time-of-flight sensor 102 can perceive not only lateral body movement (e.g., movement in the x-y plane) taking place within its line of sight, but can also discern body movement along the z-axis.
Additionally, in cooperation with further components, such as controller 104 and/or associated logic component 106, time-of-flight sensor 102 can gauge the velocity with which a body movement, gesticulation, or gesture is performed. For example, where the user positioned proximate to time-of-flight sensor 102 is moving their hands with great vigor or velocity, time-of-flight sensor 102, in conjunction with controller 104 and/or logic component 106, can comprehend the velocity and/or vigor with which the user is moving their hands to connote urgency or aggressiveness. Accordingly, in one embodiment, time-of-flight sensor 102 (in concert with other components) can perceive the vigor and/or velocity of the body movement as a modifier to a previously perceived body motion. For instance, in an industrial automation environment where a forklift operator is receiving directions from a colleague, the colleague can initially commence his/her directions by gently waving his/her arm back and forth (indicating to the operator of the forklift that he/she is clear to move the forklift in reverse). The colleague, on perceiving that the forklift operator is reversing too rapidly and/or that there is a possibility of a collision with on-coming traffic, can either start waving his/her arm back and forth with great velocity (e.g., informing the forklift operator to hurry up) or hold up his/her arm with great emphasis (e.g., informing the forklift operator to come to an abrupt halt) in order to avoid the impending collision.
Conversely, in a further embodiment, time-of-flight sensor 102, in conjunction with controller 104 and/or logic component 106, can detect the sluggishness or cautiousness with which the user, situated proximate to time-of-flight sensor 102, is moving their hands. Such sluggishness, cautiousness, or lack of emphasis can convey uncertainty, warning, or caution, and once again can act as a modifier to previously perceived body movements or future body movements. Thus, to continue the foregoing forklift operator example, the colleague, after having waved his/her arm back and forth with great velocity, vigor, and/or emphasis, can commence moving his/her arm in a much more languid or tentative manner, indicating to the forklift operator that caution should be used in reversing the forklift.
On perceiving (e.g., detecting or monitoring) motion or movement associated with a user positioned within its line of sight, time-of-flight sensor 102 can communicate with controller 104. It should be appreciated without limitation or loss of generality that time-of-flight sensor 102, controller 104 (and associated logic component 106), and industrial machinery 108 can be located at disparate ends of an automated industrial environment. For instance, in accordance with an embodiment, time-of-flight sensor 102 and industrial machinery 108 can be situated in close proximity to one another, while controller 104 and associated logic component 106 can be located in an environmentally controlled (e.g., air-conditioned, dust free, etc.) environment. In accordance with a further embodiment, time-of-flight sensor 102, controller 104, and logic component 106 can be located in an environmentally controlled safe environment (e.g., a safety control room) while industrial machinery 108 can be positioned in an environmentally hazardous or inhospitable environment (e.g., industrial environments where airborne caustic or corrosive reagents are utilized). In still yet a further embodiment, time-of-flight sensor 102, controller 104, logic component 106, and industrial equipment or industrial machinery 108 can each be situated at geographically disparate ends of the industrial automation environment (e.g., for multinational corporations, disparate ends of the industrial automation environment can imply components of manufacture located in different cities and/or countries). Needless to say, in order to facilitate communication between the various and disparately located component parts of system 100, a network topology or network infrastructure will usually be utilized. Typically the network topology and/or network infrastructure can include any viable communication and/or broadcast technology; for example, wired and/or wireless modalities and/or technologies can be utilized to effectuate the subject application. Moreover, the network topology and/or network infrastructure can include utilization of Personal Area Networks (PANs), Local Area Networks (LANs), Campus Area Networks (CANs), Metropolitan Area Networks (MANs), extranets, intranets, the Internet, Wide Area Networks (WANs)—both centralized and/or distributed—and/or any combination, permutation, and/or aggregation thereof.
Time-of-flight sensor 102 can communicate to controller 104 a detected movement or motion or a perceived pattern of movements or motions that are being performed by the user located in proximity to time-of-flight sensor 102. In accordance with one embodiment, an individual movement, single motion, signal, or gesture (e.g., holding the palm of the hand up in a static manner) performed by the user can be detected by time-of-flight sensor 102 and conveyed to controller 104 for analysis. In accordance with a further embodiment, a single repetitive motion, signal, movement or gesture (e.g., moving the arm in a side to side motion) can be detected by time-of-flight sensor 102 and thereafter communicated to controller 104. In accordance with yet a further embodiment, a series or sequence of body motions/movements, signals, gestures, or gesticulations comprising a complex command structure, sequence, or set of commands (e.g., initially moving the arm in a side-to-side manner, subsequently utilizing an extended thumb providing indication to move up, and finally using the palm of the hand facing toward the time-of-flight sensor 102 providing indication to halt), for example, can be identified by time-of-flight sensor 102 and passed on to controller 104 for contemporaneous and/or subsequent interpretation, analysis and/or conversion into commands (or sequences or sets of commands) to be actuated or effectuated by industrial machinery 108.
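By way of a non-limiting illustration, the following sketch shows how such a sequence of recognized gestures could be translated into a command sequence. The gesture labels, command names, and mapping are hypothetical placeholders rather than any established correlation; in practice the labels would be produced by a classification stage fed by time-of-flight sensor 102.

```python
# Minimal sketch: translating a recognized gesture sequence into a command
# sequence. Gesture labels and command names are hypothetical placeholders.
GESTURE_TO_COMMAND = {
    "wave_side_to_side": "REVERSE",    # arm moved side to side
    "thumb_extended_up": "MOVE_UP",    # extended thumb providing indication to move up
    "palm_toward_sensor": "HALT",      # palm of the hand facing toward the sensor
}

def translate_sequence(gestures):
    """Map each recognized gesture to a command; skip unrecognized gestures."""
    return [GESTURE_TO_COMMAND[g] for g in gestures if g in GESTURE_TO_COMMAND]

if __name__ == "__main__":
    observed = ["wave_side_to_side", "thumb_extended_up", "palm_toward_sensor"]
    print(translate_sequence(observed))  # ['REVERSE', 'MOVE_UP', 'HALT']
```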
As might have been observed and/or will be appreciated from the foregoing, the sequences and/or series of body movements, signals, gestures, or gesticulations utilized by the subject application can be virtually limitless, and as such a complex command structure or set of commands can be developed for use with industrial machinery 108. Moreover, one need only contemplate established human sign language (e.g., American Sign Language) to realize that a great deal of complex information can be conveyed merely through use of sign language. Accordingly, as will have been observed in connection with the foregoing, in particular contexts, certain gestures, movements, motions, etc., in a sequence or set of commands can act as modifiers to previous or prospective gestures, movements, motions, gesticulations, etc.
Thus, valid body movement (or patterns of body movement) intended to convey meaning must be distinguished from invalid body movement (or patterns of body movement) not intended to communicate information; recognized and/or valid body movement (or patterns of body movement) must then be parsed and/or interpreted, and translated into a command or sequence of commands or instructions necessary to actuate or effectuate industrial machinery to perform tasks. To these ends, time-of-flight sensor 102 can be coupled to controller 104 which, in concert with an associated logic component 106, can differentiate valid body movements (or patterns of body movement) from invalid body movements (or patterns of body movement), and can thereafter translate recognized body movement (or patterns of body movement) into a command or sequence or set of commands to activate industrial machinery 108 to perform the actions indicated by the recognized and valid body movements (or patterns of body movement).
To aid controller 104 and/or associated logic component 106 in differentiating valid body movement from invalid or unrecognized body movement, controller 104 and/or logic component 106 can consult a persisted library or dictionary of pre-established or recognized body movements (e.g., individual hand gestures, finger movement sequences, etc.) in order to ascertain or correlate the body movement supplied by, and received from, time-of-flight sensor 102 with recognized body movement, and thereafter to utilize the recognized body movement to determine whether or not the recognized body movement corresponds to one or more performable actions on industrial machinery 108. Controller 104 and/or associated logic component 106 can thereafter supply a command or sequence of commands that can actuate performance of the action on industrial machinery 108.
It should be noted without limitation or loss of generality that the library or dictionary of pre-established or recognized body movements as well as translations or correlations of recognized body movement to commands or sequences of command can be persisted to memory or storage media. Thus, while the persistence devices (e.g., memory, storage media, and the like) are not depicted, typical examples of these devices include computer readable media including, but not limited to, an ASIC (application specific integrated circuit), CD (compact disc), DVD (digital video disk), read only memory (ROM), random access memory (RAM), programmable ROM (PROM), floppy disk, hard disk, EEPROM (electrically erasable programmable read only memory), memory stick, and the like.
Additionally, as will also be appreciated by those conversant in this field of endeavor, while body movements can be repeatable they nevertheless can be subject to slight variation over time and between different users. Thus, for instance, a user might one day use his/her whole forearm and hand to indicate an instruction or command (e.g., reverse the forklift) but on the next day the same user might use only his/her hand flexing at the wrist to indicate the same instruction or command. Accordingly, controller 104 and/or logic component 106 can also utilize fuzzy logic (or other artificial intelligence mechanisms) to discern slight variations or modifications in patterns of body movement between the same or different users of system 100, and/or to identify homologous body movements performed by the same or different users of system 100.
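As one non-limiting illustration of such tolerant matching, the following sketch uses dynamic time warping (DTW), one possible stand-in for the fuzzy logic mentioned above, to accept a wrist-only wave against a whole-forearm template. The trajectories, normalization, and threshold are illustrative assumptions.

```python
# Sketch: tolerant gesture matching via dynamic time warping (DTW).
# Trajectories are lists of (x, y, z) samples; values are illustrative.
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW over 3-D points, normalized by length."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = sum((p - q) ** 2 for p, q in zip(a[i - 1], b[j - 1])) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m] / max(n, m)

def matches(trajectory, template, threshold=0.5):
    """Accept a movement if it is 'close enough' to a stored template."""
    return dtw_distance(trajectory, template) <= threshold

# A whole-forearm wave and a smaller wrist-only wave match the same template.
template = [(0, 0, 0), (1, 0, 0), (0, 0, 0), (-1, 0, 0), (0, 0, 0)]
wrist_only = [(0, 0, 0), (0.5, 0, 0), (0, 0, 0), (-0.5, 0, 0), (0, 0, 0)]
print(matches(wrist_only, template))  # True
```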
In connection with the aforementioned library or dictionary of established or recognized body movements, it should be appreciated that the established or recognized body movements are generally correlative to sets of industrial automation commands universally comprehended or understood by diverse and/or disparate industrial automation equipment in the industrial automation environment. The sets of commands therefore are typically unique to industrial automation environments and generally can include body movement to command correlations for commands to stop, start, slow down, speed up, etc. Additionally, the correlation of body movements to industrial automation commands can include utilization of established sign language (e.g., American Sign Language) wherein sign language gestures or finger movements can be employed to input alphanumeric symbols. Thus, in accordance with an aspect, letters (or characters) and/or numerals can be input by way of time of flight sensor 102 to correlate to applicable industrial automation commands.
The sets of commands and correlative body gestures and/or movements can be pre-established or installed during manufacture of time of flight sensor 102, and/or can be taught to time of flight sensor 102 during installation, configuration, and/or set up of time of flight sensor 102 in an industrial automation environment. In the case of teaching time of flight sensor 102 correlations or correspondences between gestures or signs and commands operable to cause industrial automation machinery to perform actions, this can be accomplished through use of a video input facility associated with time of flight sensor 102. In accordance with this aspect, time of flight sensor 102 can be placed in a learning mode wherein a user can perform gestures or finger movements which can be correlated with commands that cause industrial automation machinery or equipment to perform actions, and these correlations can subsequently be persisted to memory. As will be appreciated by those of moderate comprehension in this field of endeavor, selected body gestures and command correlations can be specific to particular types of industrial automation equipment, while other body gestures and command correlations can have wider or universal application to all industrial automation equipment. Thus, body gesture/command correlations specific or particular to certain types of industrial automation equipment or machinery can form a sub-set of the body gesture/command correlations pertinent and universal to all industrial automation equipment or machinery. Once time of flight sensor 102 has been configured (either through installation and persistence of pre-established sets of commands and body gesture correspondences or through the aforementioned learning mode) with sets of command and body gesture correlations, time of flight sensor 102 can be switched to a run time mode wherein the sets of body gesture/command correlations can be utilized to actuate industrial equipment or machinery.
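The teaching workflow described above might be sketched as follows, where a demonstrated gesture is recorded against a command in learning mode and persisted for later use in run time mode. The file name, gesture labels, and the machine-specific/universal split are hypothetical assumptions.

```python
# Sketch: learning mode records gesture/command correlations and persists
# them; run-time mode looks them up. Names and structure are illustrative.
import json

LIBRARY_PATH = "gesture_command_library.json"  # hypothetical persistence file

def teach(library, gesture_label, command, machine_type="universal"):
    """Learning mode: record a gesture/command correlation and persist it."""
    library.setdefault(machine_type, {})[gesture_label] = command
    with open(LIBRARY_PATH, "w") as f:
        json.dump(library, f, indent=2)  # persist to storage media

def lookup(library, gesture_label, machine_type):
    """Run-time mode: machine-specific correlations form a sub-set that is
    consulted first, falling back to the universal correlations."""
    specific = library.get(machine_type, {})
    universal = library.get("universal", {})
    return specific.get(gesture_label, universal.get(gesture_label))

library = {}
teach(library, "palm_toward_sensor", "HALT")             # universal correlation
teach(library, "circular_wave", "JOG_SPINDLE", "lathe")  # machine-specific
print(lookup(library, "palm_toward_sensor", "lathe"))    # HALT
```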
In an additional embodiment, time of flight sensor 102, whilst in a run time mode or in a user training mode, can be utilized to provide dynamic training wherein time of flight sensor 102, through an associated video output facility, can demonstrate to a user the various body gesture/command correspondences persisted and utilizable on specific industrial machinery or equipment or universally applicable to industrial machinery situated in industrial automation environments in general. Further, where a user or operator of industrial machinery is unable to recall a body gesture/command correspondence or sequence of body gesture/command correspondences, time of flight sensor 102, once again through an associated video output functionality, can provide a tutorial to refresh the operator or user's memory regarding the body gesture/command correspondence(s). Additionally, during run time mode, time of flight sensor 102 can further provide a predictive feature wherein plausible or possible body gesture/command correspondences can be displayed through the video output feature associated with time of flight sensor 102. Thus, for instance, where the user or operator has commenced, through body gestures, inputting commands to operate industrial automation equipment, time of flight sensor 102 can predictively display on a video screen possible alternative body gestures that can be undertaken by the user to further the task being performed by the industrial machinery.
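A minimal sketch of the predictive feature might look like the following, which suggests plausible next gestures by matching the gestures entered so far against known command sequences. The sequences themselves are illustrative placeholders.

```python
# Sketch: predictive display of plausible next gestures based on known
# command sequences. Sequences and labels are illustrative placeholders.
KNOWN_SEQUENCES = [
    ["wave_side_to_side", "thumb_extended_up", "palm_toward_sensor"],
    ["wave_side_to_side", "palm_toward_sensor"],
    ["circular_wave", "thumb_extended_up"],
]

def suggest_next(entered):
    """Return the gestures that follow the entered prefix in any known sequence."""
    n = len(entered)
    return {seq[n] for seq in KNOWN_SEQUENCES if len(seq) > n and seq[:n] == entered}

print(suggest_next(["wave_side_to_side"]))
# -> {'thumb_extended_up', 'palm_toward_sensor'}  (set order may vary)
```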
With reference to
Human machine interface component 202, in concert with time-of-flight sensor 102 (or a plurality of time-of-flight sensors disposed in various locations), can be utilized to provide a touchless touch screen interface wherein motions of the fingers and/or hands can be utilized to interact with industrial machinery 108. Such a touchless touch screen interface can be especially applicable in environments (e.g., food processing) where a user or operator of a touch screen interface comes in contact with oily contaminants (e.g., cooking oils/fats/greases) and yet needs to access the touch screen. As will be comprehended by those cognizant in this field of endeavor, touching touch sensitive devices with hands contaminated with oils and/or greases can diminish the visibility of displayed content associated with the screen and significantly attenuate the sensitivity of the touch sensitive device.
Further, a touchless touch screen interface can be utilized from a distance by an operator or user. For instance, the operator or user can be performing tasks at a distance (e.g., beyond reach) from the touch screen and through the facilities provided by human machine interface component 202 and time-of-flight sensor 102 the operator or user can interact with the touchless touch screen and thereby actuate work to be performed by industrial machinery 108. Such a facility can be especially useful where industrial machinery 108 is located in environmentally hazardous areas while the user can be controlling the industrial machinery 108, via the touchless touch screen provided by human machine interface component 202, from an environmentally controlled safe zone, for example.
As has been discussed above, time-of-flight sensor 102 can detect body movement, and in particular, can detect hand and/or finger movement to a resolution such that motion can be translated by controller 104 and associated logic component 106 into actions performed by industrial machinery 108. In one embodiment, human machine interface component 202 can be utilized to present a touchless touch screen interface that can interpret physical input (e.g., hand and/or finger movement perceived by time-of-flight sensor 102) performed in multiple dimensions by a user or operator and translate these movements into instructions or commands that can be acted upon by industrial machinery 108.
In accordance with at least one embodiment, typical physical input that can be performed by the user can include utilization of pre-defined sets of hand signals that can be translated into instructions or commands (or sequences of instructions or commands) that can be employed to effectuate or actuate tasks on industrial machinery 108. Moreover, in accordance with further embodiments, physical input performed by the user or operator can include finger and/or hand movements in a single plane (e.g., in the x-y plane) such that horizontal, vertical, or diagonal movement can be detected and translated. For instance, keeping in mind that the operator or user is interacting touchlessly with a touchless touch display generated by human machine interface component 202, the operator or user can, without touching the display, in three-dimensional space, simulate a flicking motion in order to actuate a moving slide bar projected onto the touchless touch display by human machine interface component 202.
Further, still bearing in mind that the user or operator is interacting touchlessly with a touchless touch display projected by human machine interface component 202, the user or operator can simulate touching a button generated by human machine interface component 202 and projected onto the touchless touch display. In accordance with this aspect, the user can simulate movement of a cursor/pointer onto a pre-defined location of the projected touchless touch screen (e.g., the user can cause movement of the cursor/pointer to the pre-defined location by moving his/her hand or finger in a first plane) and thereafter simulate pressing the button (e.g., the user can activate/deactivate the button by moving his/her hand or finger in a second plane). Further, the user or operator can simulate releasing and/or depressing the button multiple times (e.g., by repeatedly moving his/her hand/finger in the second plane), thereby simulating the effect of jogging. It should be noted, without limitation or loss of generality, that while the foregoing illustration describes employment of a first and second plane, human machine interface component 202, in concert with time-of-flight sensor 102, controller 104, and associated logic component 106, can monitor and track movement by the user or operator in multiple planes or dimensions.
Additionally, in accordance with a further embodiment, human machine interface component 202 can recognize and translate movement (or the lack thereof) as corresponding to pressure (and degrees of pressure) exerted. Thus, in continuation of the foregoing example, the user or operator may wish to continue pressing the button. Accordingly, human machine interface component 202 can recognize that the user or operator has not only positioned his/her hand or finger over the button to simulate pressing the button, but has also left his/her hand or finger in the same position to signify that he/she wishes to continue pressing the button. Further, human machine interface component 202 can also detect degrees of pressure intended by the user or operator to be exerted on the button. For instance, the user or operator, having left his/her hand in the same relative position over the button (signifying application of constant pressure on the button), can move his/her hand or finger into or out of the second plane to indicate either an increase or diminution of pressure to be applied to the button. The amount of relative movement of the hand or finger into or out of the second plane can also be utilized to assess the magnitude with which the button is to be released or depressed, thereby providing indication as to an increase or decrease in the degree of pressure intended to be applied by the user or operator. For example, where the user or operator moves, from a previously established static position, his/her hand or finger substantially into the second plane, a greater amount of pressure on the button can be intended. Similarly, where the user or operator moves his/her hand or finger out of the second plane, a lesser amount of pressure on the button can be intended. Based at least in part on these hand or finger movements, human machine interface component 202 can commensurately adjust the pressure on the button.
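One possible sketch of the touchless button interaction just described follows: hand position in the x-y plane determines whether the pointer is over the button, while displacement into the second plane (the z-axis) is read as degrees of pressure. The geometry, thresholds, and units are illustrative assumptions.

```python
# Sketch: touchless button with pressure derived from z-axis displacement.
# All regions, thresholds, and units are illustrative assumptions.
BUTTON_REGION = (0.4, 0.6, 0.4, 0.6)  # x_min, x_max, y_min, y_max
PRESS_Z = 0.10                        # z displacement that counts as a press
MAX_Z = 0.30                          # z displacement read as full pressure

def over_button(x, y):
    """True when the hand/finger position maps onto the projected button."""
    x_min, x_max, y_min, y_max = BUTTON_REGION
    return x_min <= x <= x_max and y_min <= y <= y_max

def pressure(z_displacement):
    """Map z displacement past the press threshold to a 0..1 pressure value."""
    if z_displacement < PRESS_Z:
        return 0.0
    return min((z_displacement - PRESS_Z) / (MAX_Z - PRESS_Z), 1.0)

# Hand hovers over the button, then pushes further into the second plane.
for x, y, z in [(0.5, 0.5, 0.05), (0.5, 0.5, 0.12), (0.5, 0.5, 0.30)]:
    if over_button(x, y):
        print(f"z={z:.2f} -> pressure {pressure(z):.2f}")
# z=0.05 -> pressure 0.00; z=0.12 -> pressure 0.10; z=0.30 -> pressure 1.00
```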
In accordance with yet a further embodiment, human machine interface component 202 can recognize and translate velocity of movement (or the lack thereof) as corresponding to an urgency or lack of urgency with which the button is pressed or released. Thus, for example, where the user or operator moves his/her hand or finger with great rapidity (or velocity) into or out of the second plane, this motion can signify pressing or releasing the button abruptly. For instance, where time-of-flight sensor 102, in conjunction with controller 104, logic component 106, and human machine interface component 202, ascertains that the user or operator moves his/her hand/finger with great velocity (e.g., measured as a rate of change of distance (d) over time (t) (Δd/Δt)), this can be translated as pressing the button with great pressure or releasing the button abruptly. It should be noted that rapid movements of hands or fingers can also be translated to infer that the operator or user wishes that the actions be performed with greater speed or more swiftly, and conversely, slower movements can be interpreted as inferring that the operator wants the actions to be performed at a slower or more measured pace.
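A sketch of this velocity-based interpretation, with speed measured as Δd/Δt over a window of tracked positions, might look like the following; the threshold values are illustrative assumptions.

```python
# Sketch: classifying urgency from hand speed (Δd/Δt). Thresholds are
# illustrative assumptions, not calibrated values.
def hand_speed(positions, timestamps):
    """Average speed over a window of tracked 3-D hand positions."""
    total = 0.0
    for p0, p1 in zip(positions, positions[1:]):
        total += sum((a - b) ** 2 for a, b in zip(p0, p1)) ** 0.5
    return total / (timestamps[-1] - timestamps[0])

def urgency(speed, slow=0.2, fast=1.5):  # metres/second, illustrative
    if speed >= fast:
        return "abrupt"    # e.g., press/release the button abruptly
    if speed <= slow:
        return "measured"  # e.g., perform the action at a slower pace
    return "normal"

pts = [(0.0, 0, 0), (0.4, 0, 0), (0.8, 0, 0)]
print(urgency(hand_speed(pts, [0.0, 0.2, 0.4])))  # 'abrupt' (2.0 m/s)
```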
In accordance with still a further embodiment, and as has been discussed supra, time-of-flight sensor 102 can also interpret, comprehend, and translate common sign language or acceptable hand signals that can describe a pattern of instructions and/or commands that can be utilized by industrial machinery to perform tasks. For example, if human machine interface component 202 projects a simulacrum of a wheel associated with a piece of industrial equipment (e.g., industrial machinery 108) onto a touchless touch screen display, time-of-flight sensor 102 can detect or ascertain whether the operator or user's movements can be interpreted as motions indicative of rotating the wheel as well as the velocity at which the wheel should be spun. In yet a further example, human machine interface component 202 can project a simulacrum of a lever associated with a piece of industrial equipment onto the touchless touch screen display; in this instance, time-of-flight sensor 102 can ascertain whether or not the user's movements simulate moving the lever and/or the amount of force that should be utilized to manipulate the lever.
It should be noted, without limitation or loss of generality, in connection with the foregoing, that while human machine interface component 202 projects simulacrums of levers, buttons, wheels, console displays, etc., associated with various industrial machinery 108, and users or operators of these various machines touchlessly interact (through the displays projected by human machine interface component 202 onto a projection surface) with the various simulacrums, the users' movements actuate work to be performed by physical industrial machinery 108 located in the industrial environment.
Turning now to
Dynamic learning component 302 can be utilized to learn individual movements, sequences of movement, and variations of movements (since typically no two individuals can perform the same actions in precisely the same manner) performed by users and translate these user actions into commands or instructions performable by industrial machinery 108. Additionally, since the manner and movement of a user in performing the various actions or movements can vary over time, dynamic learning component 302 can dynamically and continually modify previously learned movement to reflect these changes without deleteriously changing the underlying significance, meaning, or translation of the movements into performable commands or instructions, or requiring the system to be re-trained or re-programmed every time that a slight variation in user movement is discovered or detected.
Dynamic learning component 302 can also be employed to perform tasks that for safety reasons do not lend themselves to dynamic adjustment. For example, dynamic learning component 302 can be utilized to demarcate safety boundaries around industrial equipment (e.g., industrial machinery 108). This can be accomplished by using time-of-flight sensor 102 and dynamic learning component 302 to monitor or track the movement of a user (e.g., a safety supervisor) as he/she walks around an area intended to mark the safety boundary around industrial machinery 108. In accordance with one embodiment, time-of-flight sensor 102 and dynamic learning component 302 can focus on the user and track and monitor the user as he/she perambulates around the industrial equipment thereby creating a safety boundary that can be vigilantly monitored by dynamic learning component 302 in concert with time-of-flight sensor 102, controller 104, and logic component 106. Thus, where there is accidental, perceived, or impending ingress by persons into the demarcated safety zone, a panoply of safety counter measures can be effectuated (e.g., the industrial machinery can be powered down, sirens can be actuated, gates around the industrial equipment can be closed or lowered, etc.), and in so doing injury or death can be averted.
In accordance with a further embodiment, time-of-flight sensor 102 and dynamic learning component 302, rather than particularly focusing on the user, can focus on an item or sensor (e.g., light emitter, ultra-sonic beacon, piece of clothing, etc.) being carried or worn by the user while the user circumscribes the periphery of the intended safety zone to provide a demarcation boundary. As in the foregoing example, time-of-flight sensor 102 and dynamic learning component 302 can track the item or sensor as the user moves around the intended boundary, generating and/or updating a persisted boundary map identifying safety zones around various industrial machinery 108 within the industrial facility, thereby creating safety boundaries that can be continually monitored by dynamic learning component 302 in concert with time-of-flight sensor 102, controller 104, and logic component 106, to prevent unintended entrance of persons within the circumscribed safety areas.
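A minimal sketch of this zone demarcation and monitoring might look like the following: tracked positions recorded during the walk-around form a polygon, and subsequent detections are tested against it by standard ray casting. The coordinates and the counter-measure action are illustrative.

```python
# Sketch: safety-zone demarcation and intrusion monitoring. Positions
# tracked as the supervisor (or carried beacon) walks the boundary become
# a polygon; intrusions are detected by ray casting. Values illustrative.
def point_in_polygon(point, polygon):
    """Standard ray-casting test: True if point lies inside the polygon."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Boundary recorded while the supervisor perambulated around the machine.
safety_zone = [(0, 0), (10, 0), (10, 8), (0, 8)]

def on_person_detected(position):
    if point_in_polygon(position, safety_zone):
        # Effectuate counter-measures: power down, sound sirens, close gates.
        print(f"INTRUSION at {position}: halting industrial machinery 108")

on_person_detected((4.0, 3.0))   # inside -> counter-measures triggered
on_person_detected((12.0, 3.0))  # outside -> no action
```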
It should be noted that time-of-flight sensor 102 and dynamic learning component 302 can also effectuate demarcation and/or monitoring of safety or warning zones that might have been previously, but temporarily, demarcated using tape integrated with light emitting diodes (LEDs), luminescent or fluorescing tape, triangulating beacons, and the like. Similarly, where safety or warning zones are of significant scale, time-of-flight sensor 102 and dynamic learning component 302 can also effectuate demarcation of warning or safety areas where a moving machine (e.g., a robot) circumnavigates a path (or paths) to indicate danger zones.
It is noted that components associated with the industrial control systems 100, 200, and 300 can include various computer or network components such as servers, clients, controllers, industrial controllers, programmable logic controllers (PLCs), energy monitors, batch controllers or servers, distributed control systems (DCS), communications modules, mobile computers, wireless components, control components, and so forth that are capable of interacting across a network. Similarly, the term controller or PLC as used herein can include functionality that can be shared across multiple components, systems, or networks. For example, one or more controllers can communicate and cooperate with various network devices across the network. This can include substantially any type of control, communications module, computer, I/O device, sensor, or Human Machine Interface (HMI) that communicates via the network, which can include control, automation, or public networks. The controller can also communicate to and control various other devices such as Input/Output modules including Analog, Digital, Programmed/Intelligent I/O modules, other programmable controllers, communications modules, sensors, output devices, and the like.
The network can include public networks such as the Internet, Intranets, and automation networks such as Control and Information Protocol (CIP) networks including DeviceNet and ControlNet. Other networks include Ethernet, DH/DH+, Remote I/O, Fieldbus, Modbus, Profibus, wireless networks, serial protocols, and so forth. In addition, the network devices can include various possibilities (hardware or software components). These include components such as switches with virtual local area network (VLAN) capability, LANs, WANs, proxies, gateways, routers, firewalls, virtual private network (VPN) devices, servers, clients, computers, configuration tools, monitoring tools, or other devices.
The techniques and processes described herein may be implemented by various means. For example, these techniques may be implemented in hardware, software, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof. With software, implementation can be through modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory unit and executed by processors.
Turning to
As will be appreciated by those of moderate comprehension in this field of endeavor, the logical grouping 602 of electrical components can in accordance with an embodiment be a means for performing various actions. Accordingly, logical grouping 602 of electrical components can comprise means for constantly monitoring a user's movement 604. Additionally, logical grouping 602 can further comprise means for detecting an appropriate movement performed by the user 606. Moreover, logical grouping 602 can also include means for interpreting the movement as a performable action 608. Furthermore, logical grouping 602 can additionally include means for actuating industrial machinery to perform the action 610.
Turning to
Once again, as will be comprehended by those of reasonable skill, logical grouping 702 of electrical components can, in accordance with various embodiments, act as a means for accomplishing various actions or tasks. Thus, logical grouping 702 can include means for constantly monitoring a user's movements 704. Further, logical grouping 702 can include means for detecting an appropriate movement performed by the user 706. Moreover, logical grouping 702 can include means for demarcating on a persisted map a boundary described by the user's movements 708. Furthermore, logical grouping 702 can include means for actuating or causing industrial machinery to monitor the demarcated boundary for accidental or inadvertent intrusion into the demarcated area 710.
Proceeding to
Proceeding to
It is noted that, as used herein, various forms of Time of Flight (TOF) sensors can be employed to control industrial equipment in the performance of various industrial activities based on the detected body movement as described herein. These include a variety of methods that measure the time that it takes for an object, particle, or acoustic, electromagnetic, or other wave to travel a distance through a medium. This measurement can be used for a time standard (such as an atomic fountain), as a way to measure velocity or path length through a given medium, or as a manner in which to learn about the particle or medium (such as composition or flow rate). The traveling object may be detected directly (e.g., ion detector in mass spectrometry) or indirectly (e.g., light scattered from an object in laser Doppler velocimetry).
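For the optical ranging case underlying the sensors described herein, the basic computation reduces to halving the round-trip time multiplied by the speed of light, as the following sketch shows.

```python
# Sketch: basic optical time-of-flight range computation. Light travels to
# the target and back, so distance is half the round trip times c.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds):
    return C * round_trip_seconds / 2.0

# A round trip of ~6.67 nanoseconds corresponds to roughly 1 metre.
print(f"{tof_distance(6.67e-9):.3f} m")  # ~1.000 m
```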
In time-of-flight mass spectrometry, ions are accelerated by an electrical field to the same kinetic energy with the velocity of the ion depending on the mass-to-charge ratio. Thus the time-of-flight is used to measure velocity, from which the mass-to-charge ratio can be determined. The time-of-flight of electrons is used to measure their kinetic energy. In near infrared spectroscopy, the TOF method is used to measure the media-dependent optical path length over a range of optical wavelengths, from which composition and properties of the media can be analyzed. In ultrasonic flow meter measurement, TOF is used to measure speed of signal propagation upstream and downstream of flow of a media, in order to estimate total flow velocity. This measurement is made in a collinear direction with the flow.
In planar Doppler velocimetry (optical flow meter measurement), TOF measurements are made perpendicular to the flow by timing when individual particles cross two or more locations along the flow (collinear measurements would require generally high flow velocities and extremely narrow-band optical filters). In optical interferometry, the path length difference between sample and reference arms can be measured by TOF methods, such as frequency modulation followed by phase shift measurement or cross correlation of signals. Such methods are used in laser radar and laser tracker systems for medium-long range distance measurement. In kinematics, TOF is the duration in which a projectile is traveling through the air: given the initial velocity u of a particle launched from the ground, the downward (i.e., gravitational) acceleration g, and the projectile's angle of projection θ, the time of flight is t = 2u sin(θ)/g.
An ultrasonic flow meter measures the velocity of a liquid or gas through a pipe using acoustic sensors. This has some advantages over other measurement techniques. The results are only slightly affected by temperature, density, or conductivity. Maintenance is inexpensive because there are no moving parts. Ultrasonic flow meters come in three different types: transmission (contrapropagating transit time) flow meters, reflection (Doppler) flow meters, and open-channel flow meters. Transit time flow meters work by measuring the time difference between an ultrasonic pulse sent in the flow direction and an ultrasound pulse sent opposite the flow direction. Doppler flow meters measure the Doppler shift resulting from reflecting an ultrasonic beam off either small particles in the fluid, air bubbles in the fluid, or the flowing fluid's turbulence. Open channel flow meters measure upstream levels in front of flumes or weirs.
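For the transmission (contrapropagating transit time) case, flow velocity follows directly from the upstream and downstream transit times and the acoustic path length; the following sketch illustrates the collinear-path computation with assumed, illustrative numbers.

```python
# Sketch: transit-time flow velocity for a collinear acoustic path of
# length L. Downstream pulses arrive sooner than upstream pulses.
def flow_velocity(L, t_down, t_up):
    """v = L * (t_up - t_down) / (2 * t_up * t_down) for a collinear path."""
    return L * (t_up - t_down) / (2.0 * t_up * t_down)

# Illustrative numbers: 0.5 m path, ~1480 m/s sound speed in water, 2 m/s flow.
t_down = 0.5 / (1480 + 2)  # pulse travelling with the flow
t_up = 0.5 / (1480 - 2)    # pulse travelling against the flow
print(f"{flow_velocity(0.5, t_down, t_up):.3f} m/s")  # ~2.000 m/s
```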
Optical time-of-flight sensors consist of two light beams projected into the medium (e.g., fluid or air) whose detection is either interrupted or instigated by the passage of small particles (which are assumed to be following the flow). This is not dissimilar from the optical beams used as safety devices in motorized garage doors or as triggers in alarm systems. The speed of the particles is calculated by knowing the spacing between the two beams. If there is only one detector, then the time difference can be measured via autocorrelation. If there are two detectors, one for each beam, then direction can also be known. Since the location of the beams is relatively easy to determine, the precision of the measurement depends primarily on how small the setup can be made. If the beams are too far apart, the flow could change substantially between them, thus the measurement becomes an average over that space. Moreover, multiple particles could reside between them at any given time, and this would corrupt the signal since the particles are indistinguishable. For such a sensor to provide valid data, it must be small relative to the scale of the flow and the seeding density.
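A sketch of the beam-pair timing estimate via cross-correlation (closely related to the autocorrelation approach mentioned above) might look like the following; the sampling rate, beam spacing, and synthetic detector signals are illustrative assumptions.

```python
# Sketch: particle transit speed from two beam detectors. The lag that
# maximizes the cross-correlation of the two detector signals gives the
# travel time between the beams. All parameters are illustrative.
import numpy as np

FS = 10_000.0        # samples per second
BEAM_SPACING = 0.01  # metres between the two beams

def transit_speed(sig_a, sig_b):
    """Speed = spacing / lag, with the lag found by cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag_samples = corr.argmax() - (len(sig_a) - 1)
    return BEAM_SPACING / (lag_samples / FS)

# Synthetic test: a particle pulse crosses beam A, then beam B 50 samples later.
t = np.arange(1000)
sig_a = np.exp(-0.5 * ((t - 400) / 5.0) ** 2)
sig_b = np.exp(-0.5 * ((t - 450) / 5.0) ** 2)
print(f"{transit_speed(sig_a, sig_b):.2f} m/s")  # ~2.00 m/s
```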
Referring now to
Generally, program modules include routines, programs, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Moreover, those skilled in the art will appreciate that the inventive methods can be practiced with other computer system configurations, including single-processor or multiprocessor computer systems, minicomputers, mainframe computers, as well as personal computers, hand-held computing devices, microprocessor-based or programmable consumer electronics, and the like, each of which can be operatively coupled to one or more associated devices.
The illustrated aspects of the claimed subject matter may also be practiced in distributed computing environments where certain tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote memory storage devices.
A computer typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by the computer and includes both volatile and non-volatile media, removable and non-removable media. By way of example, and not limitation, computer-readable media can comprise computer storage media and communication media. Computer storage media includes both volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital video disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the computer.
With reference again to
The system bus 1108 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. The system memory 1106 includes read-only memory (ROM) 1110 and random access memory (RAM) 1112. A basic input/output system (BIOS) is stored in a non-volatile memory 1110 such as ROM, EPROM, EEPROM, which BIOS contains the basic routines that help to transfer information between elements within the computer 1102, such as during start-up. The RAM 1112 can also include a high-speed RAM such as static RAM for caching data.
The computer 1102 further includes an internal hard disk drive (HDD) 1114 (e.g., EIDE, SATA), which internal hard disk drive 1114 may also be configured for external use in a suitable chassis (not shown), a magnetic floppy disk drive (FDD) 1116 (e.g., to read from or write to a removable diskette 1118), and an optical disk drive 1120 (e.g., to read a CD-ROM disk 1122, or to read from or write to other high capacity optical media such as a DVD). The hard disk drive 1114, magnetic disk drive 1116, and optical disk drive 1120 can be connected to the system bus 1108 by a hard disk drive interface 1124, a magnetic disk drive interface 1126, and an optical drive interface 1128, respectively. The interface 1124 for external drive implementations includes at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies. Other external drive connection technologies are within contemplation of the claimed subject matter.
The drives and their associated computer-readable media provide nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For the computer 1102, the drives and media accommodate the storage of any data in a suitable digital format. Although the description of computer-readable media above refers to a HDD, a removable magnetic diskette, and a removable optical media such as a CD or DVD, it should be appreciated by those skilled in the art that other types of media which are readable by a computer, such as zip drives, magnetic cassettes, flash memory cards, cartridges, and the like, may also be used in the illustrative operating environment, and further, that any such media may contain computer-executable instructions for performing the methods of the disclosed and claimed subject matter.
A number of program modules can be stored in the drives and RAM 1112, including an operating system 1130, one or more application programs 1132, other program modules 1134 and program data 1136. All or portions of the operating system, applications, modules, and/or data can also be cached in the RAM 1112. It is to be appreciated that the claimed subject matter can be implemented with various commercially available operating systems or combinations of operating systems.
A user can enter commands and information into the computer 1102 through one or more wired/wireless input devices, e.g., a keyboard 1138 and a pointing device, such as a mouse 1140. Other input devices (not shown) may include a microphone, an IR remote control, a joystick, a game pad, a stylus pen, touch screen, or the like. These and other input devices are often connected to the processing unit 1104 through an input device interface 1142 that is coupled to the system bus 1108, but can be connected by other interfaces, such as a parallel port, an IEEE 1394 serial port, a game port, a USB port, an IR interface, etc.
A monitor 1144 or other type of display device is also connected to the system bus 1108 via an interface, such as a video adapter 1146. In addition to the monitor 1144, a computer typically includes other peripheral output devices (not shown), such as speakers, printers, etc.
The computer 1102 may operate in a networked environment using logical connections via wired and/or wireless communications to one or more remote computers, such as a remote computer(s) 1148. The remote computer(s) 1148 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1102, although, for purposes of brevity, only a memory/storage device 1150 is illustrated. The logical connections depicted include wired/wireless connectivity to a local area network (LAN) 1152 and/or larger networks, e.g., a wide area network (WAN) 1154. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, e.g., the Internet.
When used in a LAN networking environment, the computer 1102 is connected to the local network 1152 through a wired and/or wireless communication network interface or adapter 1156. The adapter 1156 may facilitate wired or wireless communication to the LAN 1152, which may also include a wireless access point disposed thereon for communicating with the wireless adapter 1156.
When used in a WAN networking environment, the computer 1102 can include a modem 1158, or is connected to a communications server on the WAN 1154, or has other means for establishing communications over the WAN 1154, such as by way of the Internet. The modem 1158, which can be internal or external and a wired or wireless device, is connected to the system bus 1108 via the serial port interface 1142. In a networked environment, program modules depicted relative to the computer 1102, or portions thereof, can be stored in the remote memory/storage device 1150. It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computers can be used.
The computer 1102 is operable to communicate with any wireless devices or entities operatively disposed in wireless communication, e.g., a printer, scanner, desktop and/or portable computer, portable data assistant, communications satellite, any piece of equipment or location associated with a wirelessly detectable tag (e.g., a kiosk, news stand, restroom), and telephone. This includes at least Wi-Fi and Bluetooth™ wireless technologies. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
Wi-Fi, or Wireless Fidelity, allows connection to the Internet from a couch at home, a bed in a hotel room, or a conference room at work, without wires. Wi-Fi is a wireless technology similar to that used in a cell phone that enables such devices, e.g., computers, to send and receive data indoors and out; anywhere within the range of a base station. Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wired networks (which use IEEE 802.3 or Ethernet).
Wi-Fi networks can operate in the unlicensed 2.4 and 5 GHz radio bands. IEEE 802.11 applies generally to wireless LANs and provides 1 or 2 Mbps transmission in the 2.4 GHz band using either frequency hopping spread spectrum (FHSS) or direct sequence spread spectrum (DSSS). IEEE 802.11a is an extension to IEEE 802.11 that applies to wireless LANs and provides up to 54 Mbps in the 5 GHz band. IEEE 802.11a uses an orthogonal frequency division multiplexing (OFDM) encoding scheme rather than FHSS or DSSS. IEEE 802.11b (also referred to as 802.11 High Rate DSSS or Wi-Fi) is an extension to 802.11 that applies to wireless LANs and provides 11 Mbps transmission (with a fallback to 5.5, 2, and 1 Mbps) in the 2.4 GHz band. IEEE 802.11g applies to wireless LANs and provides 20+ Mbps in the 2.4 GHz band. Products can contain more than one band (e.g., dual band), so the networks can provide real-world performance similar to the basic 10BaseT wired Ethernet networks used in many offices.
Referring now to
The system 1200 also includes one or more server(s) 1204. The server(s) 1204 can also be hardware and/or software (e.g., threads, processes, computing devices). The servers 1204 can house threads to perform transformations by employing the claimed subject matter, for example. One possible communication between a client 1202 and a server 1204 can be in the form of a data packet adapted to be transmitted between two or more computer processes. The data packet may include a cookie and/or associated contextual information, for example. The system 1200 includes a communication framework 1206 (e.g., a global communication network such as the Internet) that can be employed to facilitate communications between the client(s) 1202 and the server(s) 1204.
Communications can be facilitated via a wired (including optical fiber) and/or wireless technology. The client(s) 1202 are operatively connected to one or more client data store(s) 1208 that can be employed to store information local to the client(s) 1202 (e.g., cookie(s) and/or associated contextual information). Similarly, the server(s) 1204 are operatively connected to one or more server data store(s) 1210 that can be employed to store information local to the servers 1204.
It is noted that, as used in this application, terms such as "component," "module," "system," and the like are intended to refer to a computer-related or electro-mechanical entity, either hardware, a combination of hardware and software, software, or software in execution, as applied to an automation system for industrial control. For example, a component may be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, or a computer. By way of illustration, both an application running on a server and the server can be components. One or more components may reside within a process or thread of execution, and a component may be localized on one computer or distributed between two or more computers, industrial controllers, or modules communicating therewith.
The subject matter as described above includes various exemplary aspects. However, it should be appreciated that it is not possible to describe every conceivable component or methodology for purposes of describing these aspects. One of ordinary skill in the art may recognize that further combinations or permutations may be possible. Various methodologies or architectures may be employed to implement the subject invention, modifications, variations, or equivalents thereof. Accordingly, all such implementations of the aspects described herein are intended to be embraced within the scope and spirit of the subject claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.