1. Technical Field
This invention relates to an apparatus for remote communication. More specifically, the apparatus is adapted to convey to the operator information pertaining to the locale of the apparatus and/or a remote device in communication with the apparatus.
2. Description of the Prior Art
Portable computing apparatus, such as laptop computers and personal digital apparatus, are commonly used for remote computing needs and for communication with computer systems and networks. A person utilizing such apparatus can enter data into the apparatus as long as the apparatus has an input device and a source of power.
Many known portable computing apparatus contain communication electronics, such as a modem, to enable the operator to send and receive data between the apparatus and other computer systems or networks. Most modems require the operator to physically connect the apparatus to a telecommunication link. However, recent developments in communication apparatus capable of transmitting data to and receiving data from a remote device through a wireless connection include radio frequency transceivers. Accordingly, portable computing apparatus that enable operators to remotely communicate with other devices, and to transmit data to and receive data from those devices, are common in the art.
There are several apparatus that enable remote communication. For example, laptop computers enable people to do computing from a relatively compact personal computer and to transmit data through a connection to a network or other computer system. Similarly, personal digital apparatus with communications hardware enable users to do remote computing on a more limited basis and to transmit files to a remote device through a communications connection to a computer network. However, neither the laptop nor the personal digital apparatus is designed to account for the physical environment of the unit in which the embedded processor is housed, or to communicate that physical environment to the operator. In addition, laptops, personal digital apparatus, and similar computing apparatus are not generally designed to enable wireless communication with a remote device other than computer apparatus, or to enable bidirectional communication with such a device. Accordingly, what is desired is an embedded processor, which can be worn on a body part of the user, that enables remote wireless communication with a remote device while accounting for the physical environment and positioning of the processor.
This invention comprises a control unit for remote communication.
In one aspect of the invention, an operator control apparatus is provided with digital camera optics in communication with a visual display, with the optics configured to provide a digital video signal. An embedded processor is provided in communication with the optics. The processor tracks changes to the orientation and position of the apparatus and recalculates the data to be displayed based on those changes. A remote device is provided in communication with and separate from the apparatus. The remote device has an actuator that is configured to be controlled by an input device of the apparatus. The remote device communicates global position data of an object of interest to the processor; the data is then refined by one or more machine vision algorithms. The processor is configured to re-calculate the location of the object of interest relative to the apparatus. In addition, the visual display employs an overlay to show a combination of data received from the optics local to the visual display and from the remote device. This overlay provides the position and orientation, in three-dimensional space, of the object relative to the location and orientation of the apparatus.
In another aspect of the invention, a method is provided for remote communication. A digital video feed is provided to a visual display through optics. Orientation and position change of an apparatus in communication with said visual display are tracked. The orientation of a portion of a device remote from said apparatus is controlled through orientation of the apparatus. The remote device includes a global positioning sensor. Both global position and orientation data of an object of interest are communicated to the visual display. This communication includes refining the global position data with one or more machine vision algorithms. The visual display shows a combination of data received from the optics and the remote device, with the combination providing location data of an object relative to location of the apparatus.
In yet another aspect of the invention, an article is provided with optics in communication with a visual display. The optics are configured to provide a digital video signal. The article includes a computer-readable medium encoded with instructions to be executed by a computer. Instructions are provided to provide a digital video feed to a visual display through optics local to the visual display and remote from the visual display, and to track orientation and position of an apparatus in communication with the visual display. In addition, instructions are provided to control orientation of a portion of a device remote from the apparatus through orientation of the apparatus. Instructions are also provided to communicate global position of an object of interest as detected from the remote device to the visual display, including instructions to refine the global position and orientation data by employment of a machine vision algorithm. Finally, instructions are provided to present a combination of data collected by the local optics and the remote device, this combination providing location of an object relative to location of the apparatus.
Other features and advantages of this invention will become apparent from the following detailed description of the presently preferred embodiment of the invention, taken in conjunction with the accompanying drawings.
The drawings referenced herein form a part of the specification. Features shown in the drawings are meant as illustrative of only some embodiments of the invention, and not of all embodiments of the invention unless otherwise explicitly indicated.
It will be readily understood that the components of the present invention, as generally described and illustrated in the Figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the apparatus, system, and method of the present invention, as presented in the Figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention.
Reference throughout this specification to “a select embodiment,” “one embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “a select embodiment,” “in one embodiment,” or “in an embodiment” in various places throughout this specification are not necessarily referring to the same embodiment.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided, such as examples of vision and vision techniques, to provide a thorough understanding of embodiments of the invention. One skilled in the relevant art will recognize, however, that the invention can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring aspects of the invention.
The illustrated embodiments of the invention will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The following description is intended only by way of example, and simply illustrates certain selected embodiments of devices, systems, and processes that are consistent with the invention as claimed herein.
An apparatus for conveying local and/or remote information to an operator is provided. The positioning of the apparatus may control the information conveyed to the operator. An embedded processor of a control unit computes the position and orientation of the apparatus and gathers data associated therewith. One or more machine vision algorithms are employed to accurately refine the position and orientation calculations of the apparatus. Machine vision algorithms are used to recognize objects in three-dimensional space. In addition, the apparatus may communicate with a remote device. The orientation of the apparatus may be used to control the orientation of the remote device, and associated data may be gathered from the remote device and transmitted to the apparatus. Accordingly, the position and orientation of the apparatus control the data gathered and conveyed to the operator.
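By way of a non-limiting illustration, the re-calculation of an object's location relative to the apparatus may be sketched as follows. This sketch assumes GPS coordinates for both the unit and the object and a compass heading for the unit; the function name, the equirectangular approximation, and the forward/right frame are illustrative choices, not limitations of the specification.

```python
import math

def relative_position(unit_lat, unit_lon, unit_heading_deg, obj_lat, obj_lon):
    """Approximate position of an object relative to the control unit.

    Uses a local equirectangular approximation (adequate over short
    ranges) to convert GPS coordinates into metres east/north, then
    rotates the result into the unit's forward/right frame using the
    compass heading (degrees clockwise from north).
    """
    R = 6_371_000.0  # mean Earth radius, metres
    d_north = math.radians(obj_lat - unit_lat) * R
    d_east = math.radians(obj_lon - unit_lon) * R * math.cos(math.radians(unit_lat))
    # Rotate world east/north into the unit's forward/right frame.
    h = math.radians(unit_heading_deg)
    forward = d_north * math.cos(h) + d_east * math.sin(h)
    right = d_east * math.cos(h) - d_north * math.sin(h)
    return forward, right

# Object roughly 100 m due north of the unit, unit facing north:
f, r = relative_position(45.0, -75.0, 0.0, 45.0 + 100.0 / 111_194.9, -75.0)
```

In practice this coarse estimate would be further refined by the machine vision algorithms described above.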
As shown in
Preferably, the embedded processor includes a wireless communication apparatus to enable communication between the embedded processor and a remote device. The case 12 has a proximal end 14 and a distal end 16. A set of ear pieces 32, 36 are mounted adjacent to the proximal end 14 for receipt of auditory data. External sound sources are damped by pliable material 34, 38 on the earpieces 32, 36, respectively, resulting in enhanced clarity of presentation of the auditory data to the operator. The control unit 10 has a directional microphone 40 to detect auditory data conveyed to the earpiece. Similarly, a set of eyepieces 42, 46 are mounted adjacent to the proximal end 14 for receipt and presentation of visual data to the operator. External light sources are shielded from the display using pliable material 44, 48 that conforms to the operator's face. Within the pliable material 44, 48 of eyepieces 42, 46 are pressure sensors (not shown) indicating proximity of the operators face with respect to the control unit. Both the ear and eye pieces are adapted to receive data in stereo format. In addition, the control unit 10 includes a light sensor 50, a light amplification sensor array 52, digital video camera optics (not shown), an infra-red amplification sensor array 54 to convey visual data to the operator through the eyepieces 42, 46, and lens optics 82 and 84 to provide a magnified analog display to the operator. Accordingly, the control unit 10 includes apparatus for conveying auditory and visual information to an operator of the unit.
In addition to conveying information to the operator of the unit, input apparatus are provided to collect data as well as enable communication between the operator and the unit, and/or between the operator and a remote device. A set of input devices 60 and 70 are provided on each lateral side of the control unit 10. The input devices preferably include additional input devices 62, 64, and 66, and 72, 74, and 76, shown in the form of tactile pushbuttons. Each of the input devices is mapped to a set of corresponding logical states in the control unit and/or a remote device. A logical state may correspond to activation of one or more actuators on the remote device. One or more of the input devices may be in the form of a proportional input device, such as a proportional input grip, as shown in
Each proportional input grip 60, 70 has a proximal end 61, 71 and a distal end 69, 79, respectively. The distal ends of the proportional input grips extend from a surface of the case 12 and may be actuated by the operator. Similarly, the proximal ends 61, 71 of the proportional input grips 60, 70 are connected to electronic circuits that reside within an interior section of the case 12. As the proportional input grip is revolved around its center axis, a signal is produced that corresponds to the degree of actuation. The signal is preferably a voltage output ranging from 0 to 5 volts, but may be calibrated for a lesser or greater output. As the proportional input grip 60, 70 is rotated about its axis, a proportional voltage is output to the associated electronic circuit. Alternatively, the proportional input grip may use optical motion detection, wherein an optical signal would be digitized at an analog-to-digital converter, bypassing any electronic circuits. Actuation of the proportional input grip 60, 70 may be communicated to a respective logical state or motor of the remote device controlling direction, velocity, and/or illumination for any apparatus adapted to receive the variable input. The signal from the circuit board associated with the proportional input device 60, 70 is processed by an analog-to-digital converter to digitize the data into a computer readable format. Following the digitizing process, the processed data is streamed to a communication port of the embedded processor. The radial proportional input grip motion described for the proportional input devices 60, 70 may be replaced by any other proportional movement that would be necessary to control the remote device. However, actuation of the proportional input grip is not limited to communication with a remote device. The proportional input grip may also be used to communicate with the visual display.
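The digitizing process may be illustrated, in a non-limiting way, by the quantization step below. The 10-bit resolution is an assumption for the sake of example; the specification states only a preferred 0 to 5 volt range that may be calibrated higher or lower.

```python
def digitize_grip_voltage(voltage, v_min=0.0, v_max=5.0, bits=10):
    """Quantize the grip's analog voltage output into an ADC count.

    A 10-bit converter is assumed for illustration only.
    """
    voltage = max(v_min, min(v_max, voltage))      # clamp out-of-range input
    levels = (1 << bits) - 1                       # 1023 for 10 bits
    return round((voltage - v_min) / (v_max - v_min) * levels)

# A mid-travel rotation producing 2.5 V maps to roughly half scale:
count = digitize_grip_voltage(2.5)
```

The resulting count is the computer-readable value that would be streamed to a communication port of the embedded processor.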
Accordingly, the proportional input device functions as an input device in communication with the control unit 10 to provide a proportional signal to the embedded processor of the control unit and/or a remote device.
As with the proportional input devices 60, 70, the tactile buttons 62, 64, 66, 72, 74, 76 convey information from the operator to a circuit board associated therewith, which transmits the data to an analog-to-digital converter. Wired communication electronics are integrated into the analog-to-digital converter to digitize the data into a computer readable format and to communicate data received from the input device to the embedded processor, or to stream it to a communication port of the embedded processor. The tactile buttons may be used to communicate with the visual display, the remote device, or both. Functionality associated with the tactile pushbuttons may include switching modes of operation, switching proximity sensors, and navigation within a graphical user interface. Pressure sensors in the proportional input device, known in the art as “dead man” switches, control communication signals between the control unit 10 and the remote device. For example, a release of one of the pressure sensors sends a communication signal to the remote device to enter a safe state, whereas, when the pressure sensor is engaged, communication between the control unit 10 and the remote device can be achieved. In a preferred embodiment, the tactile pushbuttons are separated by a silicone rubber membrane to prevent moisture and dust from entering the case 12. However, the membrane may be comprised of an alternative material that protects the interior section of the case and the associated circuit board(s) from damage due to dust, moisture, and environmental weather conditions. Accordingly, actuation of the tactile pushbuttons enables an operator of the unit to communicate a variety of signals to the embedded processor for local or remote communication.
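The “dead man” switch behaviour described above can be sketched as a small state machine. This is a minimal, non-limiting sketch; the class and message names are hypothetical and are not taken from the specification.

```python
class DeadManSwitch:
    """Sketch of the 'dead man' pressure-sensor behaviour.

    Commands pass to the remote device only while the sensor is
    engaged; releasing the sensor emits a single safe-state signal.
    """

    def __init__(self, send):
        self.send = send          # callback representing the wireless link
        self.engaged = False

    def set_pressure(self, engaged):
        if self.engaged and not engaged:
            self.send("ENTER_SAFE_STATE")   # release -> safe the remote device
        self.engaged = engaged

    def command(self, msg):
        if self.engaged:          # commands only pass while engaged
            self.send(msg)

sent = []
sw = DeadManSwitch(sent.append)
sw.command("FORWARD")      # ignored: sensor not engaged
sw.set_pressure(True)
sw.command("FORWARD")      # delivered to the remote device
sw.set_pressure(False)     # release triggers the safe-state signal
```

After this sequence, only the engaged command and the safe-state signal have reached the remote device.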
The hardware components of the control unit 10 may be used to visually convey data from a remote device to an operator of the unit 10. Visual data are displayed to the operator on the visual display as seen through the eyepieces 42 and 46. There are four modes of operation for the visual display: a local situational awareness mode (LSAM), a remote situational awareness mode (RSAM), a first person map mode (FPMM), and a bird's eye map mode (BEMM). The control unit 10 includes several apparatus to operate in each of these modes. For example, a global positioning system (GPS) sensor (not shown) is provided to convey the location of the control unit 10 to the embedded processor of the control unit. An electronic compass (not shown) and an electronic accelerometer (not shown) are provided to convey direction with respect to north and angle with respect to the horizon, respectively, to the embedded processor of the control unit 10. Similarly, all position and orientation information gathered by the remote device is conveyed to the embedded processor of the control unit. In addition, a rangefinder 56 is provided both on the control unit 10 and the remote device. The rangefinder conveys distance to a specific object or location by calculating a range to objects of interest. In one embodiment, the rangefinder may be in the form of an electromagnetic signal. Accordingly, the apparatus of the control unit includes tools to collect appropriate data to enable the four modes of operation.
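The sensor readings the four modes draw on can be bundled as shown below. This is an illustrative sketch only; the field names and units are assumptions, not terms of the specification.

```python
from dataclasses import dataclass

@dataclass
class UnitPose:
    """Bundle of sensor readings available to the display modes.

    Field names and units are illustrative, not from the specification.
    """
    lat: float        # GPS latitude, degrees
    lon: float        # GPS longitude, degrees
    heading: float    # electronic compass, degrees from north
    pitch: float      # accelerometer-derived angle to the horizon, degrees
    range_m: float    # rangefinder distance to the object of interest, metres

# The four display modes named in the text:
MODES = ("LSAM", "RSAM", "FPMM", "BEMM")

pose = UnitPose(lat=45.0, lon=-75.0, heading=90.0, pitch=-5.0, range_m=250.0)
```

A comparable bundle gathered by the remote device would be conveyed to the embedded processor over the wireless link.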
Following step 114, an object data update is received 116. The location of the object(s) is re-calculated relative to the control unit 118. Thereafter, infra-red sensor array data is collected and received 120, and the locations of the infra-red sources are calculated relative to the location of the control unit 122. Information gathered by the remote device or any other source(s) relative to the object(s) is displayed in a transparent overlay form relative to the actual position of the object(s) with respect to the position and orientation of the control unit 124. Such information may include infra-red source data. Accordingly, the local situational awareness mode (LSAM), together with the machine vision algorithm, calculates position and orientation in three-dimensional space to accurately represent graphical overlays of environmental information to an operator of the control unit.
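The placement of an overlay marker relative to the control unit's orientation may be illustrated, in a non-limiting way, by the mapping below. The 60-degree field of view and 640-pixel display width are assumptions for the sake of example.

```python
def overlay_x(unit_heading, bearing_to_obj, fov_deg=60.0, width_px=640):
    """Horizontal overlay pixel for an object at a given absolute bearing.

    Maps the bearing offset from the unit's heading onto a display of
    the given horizontal field of view. Returns None when the object is
    outside the view (where an edge indicator might be drawn instead).
    Field of view and display width are illustrative assumptions.
    """
    # Wrap the offset into [-180, 180) degrees.
    offset = (bearing_to_obj - unit_heading + 180.0) % 360.0 - 180.0
    if abs(offset) > fov_deg / 2:
        return None
    return round((offset / fov_deg + 0.5) * (width_px - 1))

x = overlay_x(90.0, 90.0)    # object dead ahead -> screen centre
```

As the operator turns the unit, the offset changes and the overlay marker tracks the object's actual position, consistent with step 124 above.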
The overlay information gathered can indicate the location of objects which are not directly visible to the operator. In addition, the overlay information provides information about objects which are visible to the operator. The execution of machine vision algorithms, as discussed above, provides accurate refinement of the absolute position of objects in three dimensions as well as accurate representation of graphical overlays of environmental information to the operator.
Embodiments within the scope of the present invention also include articles of manufacture comprising program storage means having encoded therein program code to communicate data between the input device and data presented on the visual display. Such program storage means can be any available media which can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such program storage means can include RAM, ROM, EPROM, CD-ROM, or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired program code means and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included in the scope of the program storage means.
In one embodiment, the invention is implemented in software, which includes but is not limited to firmware, resident software, microcode, etc. The invention can take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. For the purposes of this description, a computer-usable or computer readable medium can be any apparatus that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
The medium can be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system (or apparatus or device) or propagation medium. Examples of a computer-readable medium include but are not limited to a semiconductor or solid state memory, magnetic tape, a removable computer diskette, random access memory (RAM), read-only memory (ROM), a rigid magnetic disk, and an optical disk. Current examples of optical disks include compact disk read-only memory (CD-ROM), compact disk read/write (CD-R/W), and DVD.
A data processing system suitable for storing and/or executing program code includes at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories which provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution.
Input/output or I/O devices (including but not limited to keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening I/O controllers. Network adapters may also be coupled to the system to enable the data processing system to become coupled to other data processing systems or remote printers or storage devices through intervening private or public networks. The software implementation can take the form of a computer program product accessible from a computer-useable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system.
It will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without departing from the spirit and scope of the invention. In particular, the input devices may come in different forms, including a proportional input device, such as a joystick, a rocker pad, a touch pad, a track ball, and alternative input devices. Additionally, the invention should not be limited to the mappings of the input devices to the described movement and communication with the image on the visual display. In one embodiment, there may be different mappings of the input devices to the image, or even additional mappings for different image movements. Furthermore, the invention should not be limited to a fixed set of mappings. In one embodiment, an interface may be provided to modify the mappings of the input devices. Accordingly, the scope of protection of this invention is limited only by the following claims and their equivalents.
The embedded processor of the control unit tracks the orientation and position of the control unit 10. Positioning of the control apparatus is conveyed to the digital camera optics in communication with the embedded processor. Since the control unit 10 is adapted to be placed against the eyes and/or ears of the operator during use, the position and orientation of the control unit 10 are directly related to the orientation and position of the head of the operator of the control unit 10. The orientation and position information of the control unit may be projected onto the visual display of the control unit. In addition, the orientation and position of the control unit 10 may be conveyed to the remote device and the associated digital camera optics to position the camera associated with the remote device in accordance with the orientation and position of the control unit 10. Communication of orientation and position data enhances interactivity between the control unit and the remote device, irrespective of the environment of the remote device. In addition, the embedded processor may create a wireframe to give shape to the terrain and synthetic graphics to represent physical items in the noted relative locations, thus producing synthetic vision. The use of a wireframe and/or synthetic graphics conveys map, terrain, and shape data to the visual display in a timely manner.
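The slaving of the remote camera to the control unit's orientation may be sketched as below. This is a minimal, non-limiting sketch: the pan and tilt travel limits are hypothetical actuator constraints, not values from the specification.

```python
def head_to_camera(unit_heading, unit_pitch, pan_limit=170.0, tilt_limit=60.0):
    """Map the control unit's orientation onto remote camera pan/tilt.

    Since the unit is worn against the operator's face, its heading and
    pitch stand in for the head's; the remote camera is simply slaved
    to them, clamped to hypothetical actuator travel limits (degrees).
    """
    # Wrap heading into [-180, 180) so pan is an offset from straight ahead.
    pan = ((unit_heading + 180.0) % 360.0) - 180.0
    pan = max(-pan_limit, min(pan_limit, pan))
    tilt = max(-tilt_limit, min(tilt_limit, unit_pitch))
    return pan, tilt

# Operator looks 10 degrees left of north and steeply down:
pan, tilt = head_to_camera(350.0, -75.0)
```

The resulting pan/tilt pair would be transmitted over the wireless link so the remote camera follows the operator's head movement.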
It will be appreciated that, although specific embodiments of the invention have been described herein for purposes of illustration, various modifications may be made without departing from the spirit and scope of the invention. In particular, the control unit may be designed to communicate with a variety of remote device. For example, the remote device may be in an electronic or mechanical form with logical states mapped to corresponding input devices and motors of the control unit. The remote device may include a camera that captures live video to provide live video feedback to the control unit. In addition, the control unit may be used to download topographical and/or geographical data independent of or in conjunction with the various modes of operation. The visual display may be in the form of a liquid crystal display, or an alternative medium that enables viewing by the operator while maintaining the integrity of the control unit. Similarly, the wireless communication electronics may be in the form of wireless communication electronics in communication with the embedded processor of the control unit, or an alternative communication electronics that enables wireless communication of data between the embedded processor and a corresponding wireless communication apparatus remote from the control unit. In addition, the scope of the invention should not be limited to the input devices described together with the control unit. Alternative input devices that enable communication of data between the control unit and the remote device may be employed. Accordingly, the scope of protection of this invention is limited only by the following claims and their equivalents.
The present application is a continuation-in-part utility application claiming the benefit of U.S. patent application Ser. No. 10/739,603, filed on Dec. 18, 2003, and titled “Operator Control Unit with Tracking,” now pending, which is hereby incorporated by reference.
| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 10739603 | Dec 2003 | US |
| Child | 12510835 | | US |