The present invention relates to the fields of electro-optics and computer vision. More specifically, the invention relates to the field of observation and imaging systems for mobile platforms.
Moving platforms usually include windows through which the operator observes the surroundings and operates the platform in relation to the events taking place around him. The simplest example is that of the windows found in the everyday vehicles that we drive. Most vehicles have windows at the front, on the sides and at the back, through which we observe our surroundings whilst driving in order to keep ourselves and others safe.
Despite the presence of windows surrounding the driver, situations frequently occur, e.g. darkness, inadequate lighting, fog, or stormy weather, in which it is difficult to understand everything that is taking place around the vehicle by observation through a window alone. For example, when looking out of a side window of a vehicle at surroundings that are not properly lit, it will be hard to see everything that is taking place alongside the vehicle.
As opposed to the types of mobile platform described above, there are many types of mobile platform that are designed with few, very small, or no windows at all. These platforms are usually used for security and defense applications, such as armored personnel carriers, tanks, vehicles for transporting valuable commodities such as money, prisoner transport vehicles, etc. The restricted field of view makes it hard for the operator of these platforms to become familiar with his surroundings and to operate his vehicle safely. In order to enable the operator to become familiar with the surroundings, these platforms can be equipped with various means, e.g. mirrors, direction/range sensors, and cameras.
A variety of vision and imaging systems for manned mobile platforms exist today. A sample of typical prior art solutions follows:
It is an object of the present invention to supply a vision and imaging system for manned mobile platforms for intuitive orientation in the surroundings via a screen-based interactive virtual window, on which the operator can see the occurrences outside the platform as if looking through a transparent glass window.
It is an object of the present invention to supply a vision and imaging system for remote control of unmanned/robotic mobile platforms, which intuitively simulates the surroundings of the platform as if the remote operator were looking through a window from inside the platform.
It is another object of the present invention to provide a system that is simple to integrate into mobile platforms in a manner that saves space, is easily accessible, and can be stowed away when required.
It is another object of the present invention to provide a system that enables intuitive omnidirectional (360°) observation of the surroundings and orientation of the observed features relative to the mobile platform, even under unfavorable environmental conditions, such as insufficient lighting, darkness, stormy weather, or a lack of windows, in which human orientation abilities are severely limited.
It is another object of the present invention to supply a system that allows integration and synchronization with other systems located on the platform, thus making it easier for the operator of the system to control the platform and its devices, as well as improving the capabilities of the system itself by combining/fusing information with other systems.
It is yet another object of the present invention to provide a system with image processing capacity that supports the operator's decision-making process by using image understanding algorithms, which designate relevant information from the information obtained by the system sensors as a whole, based on predefined parameters known to and/or defined by the operator.
Further objects and abilities of the system will become apparent as the description proceeds.
In a first aspect the invention is a vision and image capture system for manned mobile platforms. The system comprises:
The system of the invention is characterized in that the at least one imaging sub-sensor is mounted approximately at the height of the eyes of the operator of the system and at a predefined angle relative to the mobile platform matching the preferred viewing angle of the operator from his seat within the platform. This location and orientation of the imaging sub-sensor allow a life-like simulation and presentation of the images on the screen to the operator, as if he were looking at the scene through a transparent window.
In embodiments of the system of the invention that comprise more than one imaging sub-sensor integrated in the outside walls of the platform, the system may comprise more than one display screen. In one embodiment a display screen is located at the position corresponding to the location of each of the imaging sub-sensors. In another embodiment only one display screen is provided and it is physically or virtually moved alternately between positions corresponding to the positions of the imaging sub-sensors. The display screen can be moved along a curved track and, as the display screen moves along the track, the processing unit updates the images on the screen to display the view that would be seen through a window in the wall of the mobile platform corresponding to the direction in which the operator is looking at every point along the track.
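By way of illustration only, the following Python sketch shows one possible way of realizing this mapping, assuming the processing unit holds a stitched 360° panorama as a NumPy image and the screen's position along the track is expressed as an azimuth in degrees; the function name and the default field-of-view value are illustrative and not part of the invention.

```python
import numpy as np

def window_view(panorama: np.ndarray, azimuth_deg: float, fov_deg: float = 60.0) -> np.ndarray:
    """Return the sector of a 360-degree panorama that a window facing
    `azimuth_deg` (0 = straight ahead, clockwise positive) would show."""
    h, w = panorama.shape[:2]
    px_per_deg = w / 360.0
    center = int((azimuth_deg % 360.0) * px_per_deg)
    half = int(fov_deg * px_per_deg / 2)
    # np.take with mode="wrap" handles the seam at 0/360 degrees.
    cols = np.arange(center - half, center + half)
    return panorama.take(cols, axis=1, mode="wrap")

# Example: the screen has reached the position of a window facing 90 degrees (right side).
# view = window_view(panorama, azimuth_deg=90.0)
```

The same helper can serve the seat-mounted screen and head-tracking embodiments described further on, with the azimuth taken from a seat rotation encoder or from the operator's measured viewing direction instead of the track position.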
In embodiments of the system of the invention the imaging sensor comprises a plurality of stationary imaging sub-sensors that are integrated in the outside walls of the platform. The imaging sub-sensors are positioned such that the fields of view of adjacent sub-sensors overlap and that together they capture images of all objects and events surrounding the mobile platform. In these embodiments the processing unit comprises hardware and software components that are configured to seamlessly stitch the images from the stationary imaging sub-sensors into a panoramic 360 degree view of the area surrounding the mobile platform.
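The seamless stitching described above can be prototyped, for example, with the stitching module of the OpenCV library; the sketch below is illustrative only and does not represent the specific implementation of the invention. In a production system the fixed, calibrated geometry of the sub-sensors would normally be exploited with precomputed warping maps rather than feature-based registration on every frame.

```python
import cv2

def stitch_panorama(frames):
    """Stitch overlapping frames from the fixed sub-sensors into one panorama.
    `frames` is a list of BGR images taken at the same instant, ordered around
    the platform so that adjacent images overlap."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```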
Any of the embodiments of the system of the invention can comprise at least one pan-tilt-zoom (PTZ) imaging sub-sensor installed on the roof of the mobile platform. The PTZ sensor can be moved laterally, raised or zoomed in according to instructions given by the operator of the system in order to capture enlarged images of a selected region of interest (ROI).
The system may comprise one or more additional sensors, e.g. distance measuring sensors, microphones, space detectors, temperature detectors, and ABC (atomic, biological and chemical) sensors. The information gathered from the additional sensors can be integrated by the processing unit into the images displayed to the operator of the system.
The imaging sub-sensors can gather images in one or more of the ultra-violet, visible, near infra-red, or infra-red spectral regions. In embodiments of the system each sub-sensor is equipped with illumination means that enhance the ambient light in the appropriate spectral range. The illumination means can be based on Light Emitting Diodes (LEDs). One or more pairs of imaging sub-sensors can be attached to the mobile platform side by side in such a way that allows for the images obtained from both sub-sensors to be processed to create a three-dimensional image.
In embodiments of the system of the invention the display screen is a graphical user interface (GUI) on which are displayed the images gathered by the imaging sub-sensors and other information selected to improve the spatial orientation of the operator of the system, to familiarize him with his surroundings, and to assist him in intuitively analyzing the events which are taking place around him and in decision making.
In embodiments of the system the display screen is a touch screen. In embodiments of the system the default screen display may comprise a panoramic 360 degree view of the area surrounding the mobile platform and an enlarged image of a selected region of interest (ROI). The display screen can be curved and relatively wide in order to provide the user with a life-like panoramic view of the matching view outside of the mobile platform.
In embodiments of the system the processing unit comprises a movement detection algorithm to perform Video Motion Detection (VMD) for tracking objects in motion within the obtained images. The objects determined to be moving can be indicated on the display screen using tactical markings that can be intuitively understood by the operator of the system.
In embodiments of the system the processing unit and display screen are adapted to allow interfacing with existing systems on the mobile platform.
The system of the invention can comprise an additional means of imaging located inside the mobile platform and directed at the platform operator. The processing unit of the system is configured to process the information obtained from this interior imaging means to allow determination of the operator's direction of observation by measuring the orientation of the pupils in his eyes relative to some fixed reference frame and/or by measuring the angle of the tilt of his head. The processing unit uses these measurements to synchronize the information displayed on the display screen to always provide the view that would be seen through a window in the wall of the mobile platform corresponding to the direction the operator is looking.
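By way of illustration, the tilt of the operator's head can be measured with classical head-pose estimation from facial landmarks; the sketch below assumes that a separate facial-landmark detector (not shown) supplies the pixel locations of six standard landmarks and that the intrinsic matrix of the interior camera is known from calibration. The generic 3D face model values are illustrative, not part of the invention.

```python
import numpy as np
import cv2

# Generic 3D face model points (nose tip, chin, eye corners, mouth corners),
# in arbitrary model units; widely used illustrative values.
MODEL_POINTS = np.array([
    (0.0,      0.0,    0.0),   # nose tip
    (0.0,   -330.0,  -65.0),   # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0,  170.0, -135.0),   # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0,  -150.0, -125.0),  # right mouth corner
], dtype=np.float64)

def head_yaw_deg(image_points: np.ndarray, camera_matrix: np.ndarray) -> float:
    """Estimate the operator's head yaw (left/right viewing direction) from the
    2D pixel locations of the six landmarks above (a 6x2 float array, same order),
    as reported by any facial-landmark detector."""
    dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("head pose estimation failed")
    rot, _ = cv2.Rodrigues(rvec)
    # RQDecomp3x3 returns three Euler angles in degrees, commonly read as
    # (pitch, yaw, roll); the exact sign convention depends on camera mounting.
    angles, *_ = cv2.RQDecomp3x3(rot)
    return float(angles[1])
```

The resulting yaw angle can then be fed to a helper such as the window_view function sketched earlier, so that the displayed sector stays aligned with the operator's viewing direction.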
Sensors that indicate the operator's viewing direction can be attached to his head, and the information from these sensors is utilized by the processing unit of the system to continually update the view on the display screen to show the information that the operator would see if there were a window located in his present viewing direction.
In embodiments of the system of the invention the operator of the system sits on a seat that can rotate. The display screen is mechanically coupled to the seat such that the screen moves along a track together with the seat as a single unit. The rotation mechanism for the seat has a sensor attached to it that measures the angle of rotation and transmits this information to the processing unit of the system. The processing unit displays images on the display screen that simulate the scene that would be seen through an actual window at the position of the display screen at any given time.
In embodiments of the system of the invention the display screen is replaced with a miniature screen attached to the helmet of the operator of the system.
Embodiments of the system comprise communication means that enable the transmission of information displayed on the display screen to additional locations in the mobile platform and/or communication means based on a wireless transmitter to transmit the obtained information to entities outside of the platform.
Embodiments of the system of the invention comprise a remote control station. In these embodiments the mobile platform comprises a transceiver for transmitting information from the mobile platform to the remote control station. The mobile platform also comprises an electro-mechanical mechanism that executes control signals wirelessly transmitted from the control station to the mobile platform by a remote operator. Additional information from sensors installed on the mobile platform may also be wirelessly transmitted to the remote control station to assist the operator in controlling the mobile platform.
In a second aspect the invention is a manned mobile platform that comprises one or more vision and image capture systems according to the first aspect of the invention.
All the above and other characteristics and advantages of the invention will be further understood through the following illustrative and non-limitative description of preferred embodiments thereof, with reference to the appended drawings.
The present invention describes a vision and image capture system for mobile platforms, whether manned, unmanned, or robotic, including any type of motorized land vehicle, aircraft, and ship. The system of the invention is configured to assist the individual or the team using or controlling the platform to become familiarized with their surroundings and to assist them in intuitively analyzing the events which are taking place around them. This ability is provided to the operator of the platform by means of a virtual window through which the operator can observe the events outside as if he were looking through a transparent window from inside the mobile platform. The system of the invention can be usefully employed with mobile platforms having very few or no windows, as is the case with mobile platforms for security and defense applications, or with mobile platforms having sufficient windows through which visibility is impaired as a result of unfavorable external environmental conditions such as bad lighting, darkness, stormy weather, dirt, or dust. Especially when these and similar environmental conditions are present, the use of imaging sensors with spectral sensitivity outside of the visible range, together with matching illumination means, will allow all the information needed to operate the mobile platform to be displayed on the screen.
Imaging sensor 12 is any type of electronic camera known in the art that is installed on the outside of the mobile platform. In the simplest case imaging sensor 12 is rigidly attached to the side of the platform pointing in a fixed direction that allows imaging of objects and events taking place within a fixed field of view 6. In order to provide as life-like a simulation as possible and to present the images to the operator as if he were looking at the scene through a window in the platform, it is an essential feature of the invention that imaging device 12 be mounted at the exact height of the eyes 4 of the operator of the system and that the screen 20 be positioned such that, as shown in
In order to meet the conditions of the invention, the cameras must be integrated into the outside of the tank at the height above ground at which an "average sized" crew member, e.g. driver, weapons system operator, or commander, who will be operating the system normally sits or stands, and at a position on the side of the mobile platform matching the average viewing angle of the driver from his seat within the platform. Matching a specific angle for a specific operator, which may differ from the average setting, can be done by synchronizing the components of the system manually, i.e. by moving the touch screen or moving the imaging sensor laterally, or electronically, by adjusting the information presented on the touch screen interface to that specific operator. In addition, "fine tuning" must be provided to adjust the height of the operator's eyes to the height of the cameras, for example by raising or lowering the seat on which the operator sits.
The synchronization methods described above can be applied mutatis mutandis to the case of a remote operator in order to enable intuitive remote control over an unmanned/robotic platform. In such embodiments the images captured by the sub-sensors are wirelessly transmitted to a remote control station, where they are displayed on a screen to the remote operator. The remote operator can synchronize the system using the methods described above in order to enable a life-like simulation of the surroundings. In other words, the system provides a remote life-like simulator to enhance the operator's ability to understand the surroundings of the platform and to control it from a distance.
At least one additional imaging sub-sensor 16 of the Pan, Tilt, Zoom (PTZ) type is installed on the roof of the platform 10. This sensor can be moved laterally, raised, or zoomed in according to instructions given by the platform's operator in order to capture enlarged images of a selected region of interest (ROI) 18 in the panoramic scene 14. The images obtained from ROI 18 can also be displayed on the screen in the platform to resemble looking through a window (hereinafter "Natural Presentation").
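By way of illustration, the following sketch converts the pixel the operator selects in the panoramic scene 14 into pan and tilt angles for PTZ sub-sensor 16; the assumed vertical field of view of the panorama and the command protocol of the actual PTZ head (serial, ONVIF, etc.) are hardware-specific and outside this sketch.

```python
def roi_to_pan_tilt(roi_center_px, panorama_width_px, panorama_height_px,
                    vertical_fov_deg=40.0):
    """Convert the pixel selected in the 360-degree panorama into pan/tilt
    angles for the roof-mounted PTZ sub-sensor."""
    x, y = roi_center_px
    pan_deg = (x / panorama_width_px) * 360.0 - 180.0             # -180..+180 around the platform
    tilt_deg = (0.5 - y / panorama_height_px) * vertical_fov_deg  # positive = up, negative = down
    return pan_deg, tilt_deg
```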
The imaging sensor can comprise sub-sensors that gather information in different spectral regions, e.g. ultraviolet, visible, near infra-red, and infra-red, to allow information gathering both during the day and at night. Additionally each sub-sensor can be equipped with illumination means that enhance the ambient light in the appropriate spectral range. Preferably the illumination means are based on Light Emitting Diodes (LEDs).
In an embodiment of the invention one or more pairs of imaging sub-sensors are attached to the mobile platform side by side in such a way that allows for the images obtained from both sub-sensors to be processed to create a three-dimensional image.
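As an illustrative sketch of how such a side-by-side pair could be processed, the block-matching stereo routine of OpenCV can produce a disparity map from which depth follows as depth = f·B/d; it is assumed here that the two images are already rectified and that the focal length (in pixels) and baseline of the pair are known from calibration. This is one possible processing chain, not the specific implementation of the invention.

```python
import cv2
import numpy as np

def depth_map(left_gray, right_gray, focal_px, baseline_m):
    """Coarse depth map from a rectified side-by-side sub-sensor pair.
    left_gray/right_gray are single-channel images from the two sub-sensors."""
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]  # depth = f * B / d
    return depth
```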
The system of the invention also preferably includes additional sensors, e.g. distance measuring sensors, microphones, space detectors, that are attached to the exterior of the mobile platform. The information gathered from these additional sensors can be integrated into the images displayed to the operator as will be described hereinbelow.
On top of the Natural Presentation images of the surroundings of the mobile platform that are obtained by the imaging sensors, image processing and other techniques can be used to add additional information in order to assist the operator in decision making and/or to provide advanced warnings of obstacles etc. All of this information and more is made available to the operator by means of a processing unit comprising various hardware and software components, memory components, communication components, and a physical device on which the images and information are displayed to the operator.
In an embodiment of the invention the display screen is curved and relatively wide in order to provide the user with a life-like panoramic view of the matching view outside of the mobile platform.
In embodiments of the invention, the system includes a movement detection algorithm to perform Video Motion Detection (VMD) for tracking objects in motion within the obtained images. The objects determined to be moving are then indicated on interface 20 using tactical markings that can be intuitively understood by the operator. For example, arrow 42 indicates an object located outside of the presently selected ROI that is moving away from the mobile platform. The operator can choose to shift the ROI to determine the nature of the moving object or to ignore it.
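A minimal Video Motion Detection sketch, assuming a standard OpenCV background-subtraction approach rather than the specific algorithm of the invention, is given below; each returned bounding box would be rendered on interface 20 as a tactical marking such as arrow 42.

```python
import cv2
import numpy as np

class MotionDetector:
    """Minimal VMD: background subtraction followed by contour extraction;
    each returned box can be rendered as a tactical marking on the interface."""
    def __init__(self, min_area: int = 500):
        self.subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
        self.min_area = min_area

    def detect(self, frame):
        mask = self.subtractor.apply(frame)
        # Remove single-pixel noise before looking for moving blobs.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours
                if cv2.contourArea(c) >= self.min_area]
```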
Embodiments of the system are configured to allow interfacing with existing systems on the platform, e.g. weapons systems or GPS navigation systems, and integrating the information between them. This processed information can be displayed on the Natural Presentation for the convenience of the operator, to enable him to control all the different systems that exist within the platform via the interface on the screen.
As an example of the type of information that can be determined from the images supplied by the imaging sensor, tactical markings 11, 12, and 13 on interface 20 respectively show the operator the direction in which he is looking in relation to the platform as well as the location of a moving object (the man in the doorway) in relation to his direction of observation. In the embodiment shown a range measurement detector is incorporated. The operator simply points at (touches) the object on the screen to which he wishes to measure the range and the range is displayed beside that object 40. The interface then allows directing weapons systems towards the object designated by the operator.
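The touch-to-range interaction can be sketched as follows; query_range_fn stands in for the hardware-specific rangefinder driver, and the pixel and width values in the usage line are hypothetical.

```python
def range_at_touch(touch_px, panorama_width_px, query_range_fn):
    """Map a touch on the panoramic interface to a bearing around the platform
    and ask the range sensor for the distance along that bearing."""
    x, _ = touch_px
    bearing_deg = (x / panorama_width_px) * 360.0
    distance_m = query_range_fn(bearing_deg)  # hardware-specific rangefinder call
    return bearing_deg, distance_m

# Hypothetical usage: bearing, dist = range_at_touch((1520, 310), 4096, rangefinder.measure)
# The returned distance is then drawn beside the touched object on the screen.
```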
The operator can also request (button 26) that the system display indications obtained from complementary external sensors, e.g. microphones, temperature detectors, space detectors, or ABC (atomic, biological and chemical) sensors.
The system also includes a memory component that enables saving information obtained from the imaging and other sensors fully or selectively. The information can be retrieved and used by the processing unit in applications such as those that analyze images for navigation purposes or issue warnings about threats or suspicious objects.
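A minimal sketch of such selective retention, assuming a fixed-length in-memory history (persisting to disk would follow the same pattern), is given below; the class and parameter names are illustrative.

```python
from collections import deque
import time

class SensorRecorder:
    """Keep the last `seconds` of timestamped frames or sensor readings so the
    processing unit can retrieve them for navigation analysis or threat warnings."""
    def __init__(self, seconds: float = 60.0):
        self.seconds = seconds
        self.buffer = deque()

    def record(self, sample):
        now = time.time()
        self.buffer.append((now, sample))
        # Discard anything older than the retention window.
        while self.buffer and now - self.buffer[0][0] > self.seconds:
            self.buffer.popleft()

    def retrieve(self, since_seconds_ago: float):
        cutoff = time.time() - since_seconds_ago
        return [sample for t, sample in self.buffer if t >= cutoff]
```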
In order to accomplish the main goal of the invention, i.e. to supply a vision and imaging system for mobile platforms for intuitive orientation in the surroundings via a screen-based interactive virtual window, on which the operator can see the occurrences outside the platform as if looking through a transparent glass window, it is necessary to synchronize the Natural Presentation with the direction in which the operator is looking or would like to look. This can be accomplished in a number of ways, a few of which are now described:
The synchronization methods described above can be applied mutatis mutandis to the case of a remote operator in order to enable intuitive remote control over an unmanned/robotic platform. In such embodiments other information gathered by sensors installed in the unmanned/robotic platform, such as azimuth and acceleration sensors, can be wirelessly transmitted to the remote control station in order to enhance the life-like simulation presented to the remote operator during maneuvers, e.g. curves, turns, and changes of speed, which are performed by the mobile platform. The control signals transmitted by the remote operator are received and processed at the mobile platform using an electro-mechanical mechanism.
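By way of illustration only, the telemetry and control exchange could look like the following sketch; plain UDP with JSON payloads and the placeholder station address are assumptions made for clarity, whereas a real platform would use a secure, latency-bounded datalink.

```python
import json
import socket

# Placeholder address of the remote control station (illustrative only).
STATION_ADDR = ("192.0.2.10", 5005)

def send_telemetry(sock: socket.socket, azimuth_deg: float, accel_mps2: tuple):
    """Platform side: push azimuth and acceleration readings so the station can
    reproduce the feel of curves, turns, and speed changes in the remote display."""
    packet = {"type": "telemetry", "azimuth_deg": azimuth_deg, "accel_mps2": accel_mps2}
    sock.sendto(json.dumps(packet).encode(), STATION_ADDR)

def receive_control(sock: socket.socket):
    """Platform side: receive a steering/throttle command and hand it to the
    electro-mechanical actuation layer."""
    data, _ = sock.recvfrom(1024)
    return json.loads(data)  # e.g. {"steer_deg": -5.0, "throttle": 0.3}
```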
The system of the invention comprises a connection to a source of electrical power to enable operation of its various components. The power source can be an independent power pack but is preferably the electrical circuit of the mobile platform.
Embodiments of the system comprise communication means that enable the transmission of information displayed on the display screen to additional locations in the platform and/or communication means based on a wireless transmitter to transmit the obtained information to entities outside of the platform.
The system of the invention can obviously be built into new mobile platforms, but virtually no structural changes are required to retrofit existing platforms with the added capabilities provided by the invention. A given manned mobile platform can be equipped with two or more touch screen graphical user interfaces, all of which display images gathered by the same imaging sensor and optionally share a common processing unit.
Each of the interfaces can be adapted to provide only the information relative to the specific task of the crew member that is using it.
Although embodiments of the invention have been described in relation to a specific type of mobile platform by way of illustration, it will be understood that the invention may be carried out with many variations, modifications, and adaptations, without exceeding the scope of the claims.
Number | Date | Country | Kind
---|---|---|---
189251 | Feb 2008 | IL | national