This application claims priority to French Patent Application No. 11 60884 filed Nov. 29, 2011, the entire disclosure of which is herein incorporated by reference.
The present invention relates to a dialog device for an aircraft, notably a transport airplane, enabling a dialog between an operator of the aircraft, in particular a pilot, and a guidance system of said aircraft.
The airplanes that are provided with a guidance system, namely either a flight director which computes piloting setpoints on the basis of guidance setpoints or an automatic piloting system which makes it possible to follow guidance setpoints automatically, are provided with an item of equipment, called FCU (Flight Control Unit) on airplanes of AIRBUS type and MCP (Mode Control Panel) on airplanes of BOEING type, which enables a pilot of the airplane to enter guidance setpoints into the guidance system.
Generally, the pilot chooses a guidance setpoint, then he or she controls the engagement (activation) of the associated guidance mode, so that it takes into account either the value entered (in a so-called “selected” mode), or a value computed by the system according to various criteria (in a so-called “managed” mode).
More particularly, the pilot can notably enter setpoints on the speed axis, on the lateral axis and on the vertical axis.
These setpoints are taken into account by the guidance system, either directly as soon as their value is modified if the associated mode is active, or after validation (engagement of the associated mode) in the case where another guidance mode is initially engaged. In the latter case, the setpoint is said to be preset before its validation.
For each selection of a setpoint to be reached or to be maintained there is a corresponding guidance mode of the airplane. Exactly one mode is engaged on each axis (speed, lateral, vertical). As an illustration, modes can be cited on the lateral axis and on the vertical axis.
A synthetic summary of the behavior of the guidance system (flight director or automatic piloting system, whether or not associated with an automatic thrust control) is generally produced on the screens displaying the primary flight parameters, of PFD (Primary Flight Display) type, on a panel of FMA (Flight Mode Annunciator) type. This synthetic summary generally reviews the guidance modes that are engaged (active) on each axis (speed, lateral, vertical), as well as the guidance modes that are armed, that is to say those which have been requested by the pilot and which will be engaged automatically once the conditions for engaging the mode are satisfied. As an example, when the aircraft is off the flight-plan trajectory, in maintain-heading mode converging toward that trajectory with the "join or maintain the flight-plan trajectory" mode armed, the latter mode engages automatically as the aircraft approaches the flight plan.
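The armed-to-engaged behavior just described can be sketched in a few lines (a purely illustrative model; the mode names, the cross-track capture condition and the threshold are hypothetical assumptions, not the actual avionics logic):

```python
# Minimal sketch of an armed guidance mode that engages automatically
# once its capture condition is met (hypothetical logic for illustration).

def update_lateral_mode(engaged, armed, cross_track_nm, capture_threshold_nm=1.0):
    """Return (engaged, armed) after one update cycle.

    The armed "join flight plan" mode engages automatically when the
    aircraft's cross-track deviation falls below a capture threshold.
    """
    if armed == "NAV" and abs(cross_track_nm) < capture_threshold_nm:
        return "NAV", None          # the armed mode becomes the engaged mode
    return engaged, armed           # otherwise nothing changes

# Far from the flight plan: heading mode stays engaged, NAV stays armed.
assert update_lateral_mode("HDG", "NAV", 5.2) == ("HDG", "NAV")
# Approaching the flight plan: NAV engages automatically.
assert update_lateral_mode("HDG", "NAV", 0.4) == ("NAV", None)
```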
In most airplanes with two pilots, the control unit of the guidance system is situated in the center of the cockpit (above the screens showing the flight parameters) so that both pilots can access it.
This control unit, for example of FCU type, makes it possible:
to select guidance setpoints;
to engage the modes associated with a guidance setpoint (render the mode active), or to request the arming of the mode; and
to change reference (for example heading rather than route) for a guidance setpoint.
The task of the pilot responsible for the guidance of the airplane is to select the guidance setpoints and modes. Currently, he or she performs this task through the dedicated control unit (FCU or MCP) which is located between the two pilots, then he or she has to check the selection of his or her setpoints (values) on the primary flight screen which is located facing him or her (PFD, standing for Primary Flight Display) and/or on the navigation screens (ND, standing for Navigation Display in the lateral plane; VD, standing for Vertical Display in the vertical plane). Then, the guidance is monitored on these screens which indicate the behavior of the guidance:
summary of the behavior via the synthesis of the modes that are armed and engaged: FMA panel;
guidance setpoints (speed CAS, heading/route, altitude, vertical speed/gradient) and deviations in relation to the current parameters of the airplane: PFD screen;
margins in relation to the limits (for example, margin in relation to the minimum operational speed and stall speed): PFD screen.
This standard solution presents drawbacks, and in particular:
the pilot has to select the guidance setpoints and modes in one place (control unit FCU), then check and monitor the behavior of the airplane in another place (on the playback screens). This involves visual toing and froing and a dispersion of the guidance elements between the control and the playback of the behavior of the system;
the control unit is a physical item of equipment that is costly and difficult to modify (because it is of hardware type); and
this control unit is bulky in the cockpit.
The present invention relates to a dialog device between an operator, notably a pilot, of an aircraft and a guidance system of said aircraft, which makes it possible to remedy the abovementioned drawbacks.
To this end, according to the invention, said dialog device which is installed on the aircraft and which comprises at least one screen capable of restoring guidance information, is noteworthy in that said screen comprises at least one graphic object which is produced in the form of an interaction means which is associated with at least one guidance setpoint of said guidance system and which represents:
on the one hand, a playback element which indicates the value of the associated guidance setpoint of said guidance system; and
on the other hand, a control element which can be grasped and moved along a curve by an operator so as to modify the value of said guidance setpoint.
Thus, by virtue of the invention, there is on the screen, for example of PFD, ND or VD type, at least one interaction means which is associated with a guidance setpoint of said guidance system and which not only makes it possible to restore the value of this guidance setpoint with which it is associated, but also enables an operator to modify this value on the screen. The control and the monitoring are thus combined and colocated.
The present invention can be applied to any guidance setpoint used by a guidance system and in particular to the following guidance setpoints: speed/mach, heading/route, altitude, vertical speed/gradient.
An interaction function (direct) is thus obtained on a screen (which was hitherto dedicated only to the playback of the flight parameters and guidance), through an interaction means (namely a graphic object allowing an interaction) associated with a guidance setpoint.
This interaction means can be grasped and moved by an operator along a curve (on a scale for example, which can appear dynamically and contextually when modifying a setpoint) so as to modify the associated guidance setpoint. By way of example, the invention can make it possible to grasp an interaction means indicating a heading setpoint, move it along a heading scale (a heading rose for example) to modify the heading setpoint so that the new heading setpoint is taken into account by the guidance system of the aircraft.
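As a purely illustrative sketch of such an interaction, the position of a pointer dragged along a heading rose can be mapped to a heading setpoint as follows (the screen-coordinate convention and the rounding rule are hypothetical assumptions, not part of the claimed implementation):

```python
import math

def heading_from_drag(x, y, cx, cy):
    """Convert a pointer position on a heading rose (center cx, cy,
    screen y axis pointing down) to a heading setpoint in degrees,
    measured clockwise from north (the top of the rose)."""
    dx, dy = x - cx, y - cy
    # atan2 with swapped arguments gives the clockwise angle from "up".
    heading = math.degrees(math.atan2(dx, -dy)) % 360.0
    return round(heading)

# Pointer straight above the center: heading 0 (north).
assert heading_from_drag(100, 0, 100, 100) == 0
# Pointer to the right of the center: heading 90 (east).
assert heading_from_drag(200, 100, 100, 100) == 90
```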
Said curve which is predefined can be a scale of values displayed by default or an independent curve on which a scale of values can appear dynamically and contextually.
The dialog device according to the invention, of interactive type, thus makes it possible:
for the pilot to select guidance setpoints (as well as guidance modes, as specified below) in the same place (screen) where he or she can check and monitor the behavior of the aircraft. This avoids the visual toing and froing and a dispersion of the guidance elements, which exist on the standard dialog devices; and
in circumstances specified below, to do away with a control unit, for example of FCU type, which is an item of equipment that is costly, difficult to modify and bulky.
In a preferred embodiment, said interaction means comprises a plurality of states which allow different actions to be implemented. In this case, advantageously, said interaction means comprises states which allow at least some of the following different actions to be implemented:
modifying a guidance setpoint, called selected, which is directly applied by the guidance system;
modifying a preset guidance setpoint, which will be applied by the guidance system after validation;
engaging a capture or maintain mode for a selected guidance setpoint; and
engaging a capture or maintain mode for a computed guidance setpoint (called “managed”).
Furthermore, advantageously, the transition from one state to another of the interaction means is generated by a corresponding movement thereof.
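By way of illustration, such movement-driven state transitions can be modeled with a simple transition table (the state and gesture names below are hypothetical, chosen only to make the mechanism concrete):

```python
# Illustrative mapping from a movement gesture on the interaction
# object to a state transition (all names are hypothetical).

TRANSITIONS = {
    ("active", "drag_along_curve"): "active",      # setpoint applied directly
    ("active", "pull_outward"):     "preset",      # enter presetting state
    ("preset", "drag_along_curve"): "preset",      # adjust the preset value
    ("preset", "push_inward"):      "active",      # validate the preset
    ("active", "push_far_inward"):  "managed",     # hand the setpoint to the system
}

def next_state(state, gesture):
    # An unrecognized gesture leaves the state unchanged.
    return TRANSITIONS.get((state, gesture), state)

assert next_state("active", "pull_outward") == "preset"
assert next_state("preset", "push_inward") == "active"
assert next_state("active", "push_far_inward") == "managed"
```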
Moreover, in a preferred embodiment, said dialog device comprises a plurality of interaction means, each of which is intended for a given guidance setpoint (speed/mach, heading/route, altitude, vertical speed/gradient) of said guidance system. The use of a plurality of interaction means, namely an interaction means for each guidance setpoint, on the screens dedicated to the playback of the flight parameters and of the guidance (PFD, ND, VD), makes it possible to directly implement on these screens all the functions of a standard physical control unit, for example of FCU type, and therefore to do away with such a control unit, which represents a significant saving in particular in terms of cost, weight and bulk.
In a particular embodiment, said dialog device comprises at least one interaction means, which is capable of controlling at least two different references (speed/mach, heading/route, vertical speed/gradient) of a guidance setpoint of said guidance system. This interaction means is capable of controlling only one reference at a time, and the selection of one of said references to be controlled depends on the movement of said interaction means (or on the action carried out to make it appear).
Moreover, advantageously, said interaction means is not displayed continuously on the screen, and it appears by placing a pointer (finger or cursor in particular) on the corresponding graphic object.
In the context of the present invention, said interaction means can be moved by a direct action. It is however also possible to envisage moving said interaction means by a so-called “lever arm” effect specified below.
In a preferred embodiment, said screen generates a dynamic visual feedback on a predicted trajectory associated with the guidance setpoint, which makes it possible to have directly on the same screen both a means for selecting the guidance setpoint, for restoring its value, and an indication of the effect generated on the trajectory of the aircraft. This embodiment is particularly advantageous operationally, since the pilot can immediately interpret the impact of his or her guidance setpoint modifications on the trajectory, and do so without the need for any visual toing and froing between a control panel and a playback screen.
Furthermore, in this case, advantageously:
said screen can automatically display at least one characteristic point of said predicted trajectory; and
said interaction means is capable of acting on the characteristic point(s), thus displayed, of said predicted trajectory to modify them.
The present invention is applied to one or more screens, and preferably, to the abovementioned PFD, ND and VD screens.
In a first embodiment of the dialog device, said screen is a touch screen, and a graphic object is controlled by a direct contact, preferably finger contact, on the part of the operator on this touch screen.
Furthermore, in a second embodiment, the dialog device comprises, in addition to the screen, control means, such as a trackball or a touchpad in particular (of the multi-touch type or not), which are linked to the screen and which enable an operator to control the movement of a cursor on the screen, intended to act on the interaction means provided.
The present invention also relates to a guidance system of an aircraft, namely a flight director or an automatic piloting system, which comprises a dialog device such as that mentioned above, to enable a dialog between said guidance system and an operator, notably a pilot, of said aircraft.
The present invention also relates to an aircraft, in particular a transport airplane, which is equipped:
with such a dialog device; and/or
with such a guidance system.
The figures of the appended drawing will give a good understanding of how the invention can be produced. In these figures, identical references designate similar elements.
The dialog device 1 according to the invention and schematically represented in
For this, said dialog device 1 which is installed on the aircraft comprises a display system 2 which comprises at least one screen 3 capable of restoring guidance information of said guidance system 4.
Said dialog device 1 may comprise one or more screens 3 and, preferably, at least one of the following screens:
a piloting screen of PFD (Primary Flight Display) type;
a navigation screen of ND (Navigation Display) type in relation to the lateral plane;
a navigation screen of VD (Vertical Display) type in relation to the vertical plane.
According to the invention, the screen 3 comprises at least one graphic object which is produced in the form of an interaction means 8. This interaction means 8 is associated with at least one guidance setpoint of the guidance system 4 and represents:
on the one hand, a playback element which indicates the value of this guidance setpoint of said guidance system 4, in conjunction with a scale of values; and
on the other hand, a control element which can be grasped and moved along a curve by an operator, in particular the pilot of the aircraft, so as to modify the value of said guidance setpoint (of said guidance system 4).
To do this, the display system 2 comprising the screen 3 is linked via a link 5 to guidance means 4A and 4B of said guidance system 4, so as to be able to provide a communication of information between the two assemblies. Said guidance system 4 may comprise, as guidance means:
a standard flight director 4A, which computes piloting setpoints on the basis of guidance setpoints; and/or
a standard automatic piloting system 4B, which makes it possible to follow guidance setpoints automatically.
Thus, by virtue of the dialog device 1 according to the invention, the operator has on the screen 3 at least one interaction means 8 which is associated with a guidance setpoint of said guidance system 4 and which not only makes it possible to restore the value of this guidance setpoint with which it is associated, but also enables this value to be modified on the screen 3.
The dialog device 1 according to the invention therefore allows a direct interaction on a screen 3 (which was hitherto dedicated solely to the playback of the flight parameters and guidance), through an interaction means 8 (namely a graphic object allowing an interaction) associated with a guidance setpoint.
In a first embodiment of the dialog device, said screen 3 is a touch screen, as represented in
Furthermore, in a second embodiment, the dialog device 1 also comprises control means 6, represented by broken lines in
These control means 6 may notably comprise:
a trackball;
a computer mouse; and/or
a touchpad (of multi-touch type or not).
The interaction means 8 can therefore be grasped and moved by an operator along a predefined curve (on a scale for example, which may appear dynamically and contextually when modifying a setpoint) so as to modify the associated guidance setpoint. Said curve may be a scale of values which is displayed by default, as represented in
As an illustration, in
a symbol AC1 representing the current position of the aircraft equipped with the device 1;
symbols A1, A2, A3 representing the current positions of surrounding aircraft;
a distance scale 11 (in relation to the current position AC1 of the aircraft);
a heading scale 12 (a heading rose) with a symbol 13 indicating on the scale 12 the value of the current heading; and
a continuous line plot 10 which illustrates the lateral trajectory followed by the aircraft.
in
the operator then moves the interaction means 8 with his or her finger 9, as illustrated by an arrow 16 in
in
the aircraft will then progressively modify its heading (as illustrated in
The dialog device 1 according to the invention thus enables the pilot to select guidance setpoints (as well as guidance modes, as specified below) in the same place (screen 3) where he or she can check and monitor the behavior of the aircraft. This avoids the visual toing and froing and a dispersion of the guidance elements, which exist on the standard dialog devices. These comments also apply to the second embodiment using control means 6 since, in this case, the pilot visually follows, on the screen 3, the commands produced using these control means 6 (which are likely to be located separately from the screen 3).
The present invention also relates to a guidance system 4 of an aircraft, namely a flight director 4A or an automatic piloting system 4B, which comprises a dialog device 1 such as that mentioned above, to enable a dialog between said guidance system 4 and a pilot of said aircraft.
Moreover, in a preferred embodiment, said dialog device 1 comprises a plurality of interaction means 8, each of which is intended for a given guidance setpoint (speed/mach, heading/route, altitude, vertical speed/gradient) of said guidance system 4. The use of a plurality of interaction means 8, namely one interaction means for each guidance setpoint, on the screens 3 dedicated to the playback of the flight parameters and guidance (PFD, ND, VD), makes it possible to implement, directly on these screens 3, all the functions of a standard physical control unit, for example of FCU type, and therefore to dispense with such a control unit, which represents a significant saving, notably in terms of cost, weight and bulk.
In a preferred embodiment, said interaction means 8 comprises a plurality of states which allow different actions to be implemented. The transition from one state to another of the interaction means 8 is generated by a corresponding movement thereof. In this case, said interaction means 8 comprises states which allow at least some of the following different actions to be implemented:
modifying a guidance setpoint, called selected, which is directly applied by the guidance system 4;
modifying a preset guidance setpoint, which will be applied by the guidance system 4 after validation;
arming or engaging a capture or maintain mode for a selected guidance setpoint (selected mode); and
engaging a capture or maintain mode for a guidance setpoint computed automatically in the usual manner (managed mode).
In a preferred embodiment, the interaction means 8 thus makes it possible to control the engagement (activation) of the associated guidance mode on the defined value (so-called selected mode) or on a value computed by the system according to certain criteria (so-called managed mode), and also the arming of a guidance mode.
In a particular embodiment, said interaction means 8 is not displayed continuously on the screen 3, and it appears on request by placing a pointing element on the corresponding graphic object (by a direct contact or by the positioning of a cursor), as illustrated in
Furthermore, preferably, each interaction means 8 has the abovementioned states (not visible, modification directly taken into account for guidance, preset, request to arm or engage the managed mode) which can be accessed by a cursor movement (or by contact in touch mode). Preferably, the management of the interaction means 8 exhibits the following characteristics:
by default, the state of the interaction means 8 is invisible (only the playback of the setpoint value is displayed in the case where a setpoint exists);
the interaction means 8 appears on request, by placing the cursor (or a finger 9) on the graphic object representing the value of the guidance setpoint or the current value of the parameter;
consequently, the modification of the setpoint is possible by moving the interaction means 8 along a predefined curve. The guidance setpoint is then taken into account immediately;
if the pilot wants to preset the guidance setpoint (namely choose a value without activating it, and activate it only later, for example after validation of his or her request by air traffic control), he or she can access the presetting state by positioning the pointer on the interaction means 8, grasping it and moving it appropriately, preferably backward (away from the scale or from the curve of movement used for the modification), so as to cause a different graphic state associated with the presetting to appear (highlighted by an appropriate color, for example yellow). Then, he or she can modify the presetting value by moving the interaction means 8 along the predefined curve (as for the guidance setpoint);
to actually activate a presetting, an appropriate movement of the interaction means 8, preferably toward the interior this time (toward the scale), brings it over the graphic object associated with the presetting, thus validating the value for the actual guidance of the aircraft; and
to engage or arm the managed mode of the axis concerned (mode for which the guidance setpoint is computed automatically by the system according to predefined criteria), the interaction means 8 is pushed further toward the interior of the interface, giving control to the system and temporarily causing a graphic object to appear which must be covered to validate the command. In a particular embodiment, the release of the interaction means 8 should take effect at the end of the travel of the movement required to validate the action. In this case, releasing the interaction means 8 before the end of the required movement has no effect.
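The last rule above, taking the release of the interaction means into account only at the end of the required travel, can be sketched as follows (a purely illustrative model; the travel threshold in pixels is a hypothetical assumption):

```python
# Sketch of the "release at end of travel" rule for engaging the
# managed mode (the threshold value is a hypothetical illustration).

class ManagedModeGesture:
    """The managed mode is engaged only if the operator releases the
    interaction object at (or past) the end of the required travel;
    releasing it earlier cancels the action."""

    def __init__(self, required_travel_px=40):
        self.required = required_travel_px
        self.travel = 0.0
        self.engaged = False

    def drag(self, delta_px):
        self.travel += delta_px

    def release(self):
        if self.travel >= self.required:
            self.engaged = True     # command validated
        self.travel = 0.0           # an early release has no effect
        return self.engaged

g = ManagedModeGesture()
g.drag(25)
assert g.release() is False   # released before the end of travel: no effect
g.drag(45)
assert g.release() is True    # full travel reached: managed mode engaged
```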
In the context of the present invention, said interaction means 8 is preferably moved by a direct action. It is, however, also possible to envisage moving said interaction means by a so-called “lever arm” effect. In the latter case, an operator interacts with the graphic object representing the guidance setpoint (for example heading/route), not by a direct interaction on this object, but with a lever arm located diametrically opposite this setpoint representation, along the scale, notably in heading rose form, as illustrated by a dashed line 17 in
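By way of illustration, with the pointer held diametrically opposite the setpoint symbol on the scale, the lever-arm mapping reduces to adding a half turn to the pointer's angular position (a hypothetical sketch, assuming angles are expressed as headings in degrees):

```python
# Illustrative "lever arm" interaction: the operator drags a handle
# diametrically opposite the setpoint symbol on the heading rose, so
# the setpoint is the pointer's heading plus 180 degrees (hypothetical).

def setpoint_from_lever_arm(pointer_heading_deg):
    return (pointer_heading_deg + 180.0) % 360.0

# Grasping the lever arm at heading 250 places the setpoint at heading 70.
assert setpoint_from_lever_arm(250.0) == 70.0
assert setpoint_from_lever_arm(70.0) == 250.0
```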
Moreover, in a particular embodiment, said dialog device 1 comprises at least one interaction means, which is capable of controlling at least two different references (speed/mach, heading/route, vertical speed/gradient) of a guidance setpoint of said guidance system 4. In this case, it is capable of controlling only one reference at a time, and the selection of one of said references to be controlled depends on the way in which the interaction means 8 is made to appear.
In the latter embodiment, the manner in which the interaction means 8 is made to appear therefore makes it possible to select the setpoint reference. For example, by bringing the interaction means over the heading scale 12 (
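This reference selection can be sketched as a simple lookup from the zone in which the interaction means is summoned to the reference it will control (the zone names below are hypothetical, chosen only for illustration):

```python
# Sketch of reference selection: the reference controlled (heading vs.
# route/track) depends on where the interaction object is summoned.

def select_reference(summon_zone):
    zones = {
        "heading_scale": "heading",   # appearing over the heading rose
        "track_line":    "route",     # appearing over the current track symbol
    }
    if summon_zone not in zones:
        raise ValueError(f"no reference associated with zone {summon_zone!r}")
    return zones[summon_zone]

assert select_reference("heading_scale") == "heading"
assert select_reference("track_line") == "route"
```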
Moreover, by way of illustration, in
a symbol AC2 representing the current position of the aircraft equipped with the device 1; and
an altitude scale 22.
in
in
this modification is made in a presetting mode so that the flight level to be set (which is represented by a broken line plot 24 in
the new altitude setpoint (to reach a flight level FL2 according to a trajectory 27) is taken into account by the guidance system 4 after the engagement of a climb mode (maintain speed CAS without altitude constraint), which is controlled by an appropriate movement (illustrated by an arrow 26) of the interaction means 8, as shown in
in
in
It is also possible to implement a climb mode to a setpoint altitude by observing a particular constraint, for example an altitude or geometrical profile constraint. As an illustration, in the example of
under the altitude highlighted by the symbol P1;
through the point highlighted by the symbol P2; and
over the altitude highlighted by the symbol P3.
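By way of illustration only, checking that a predicted climb profile observes these three kinds of constraint (under, through, over) can be sketched as follows (the waypoint names, altitudes and tolerance are hypothetical sample values):

```python
# Illustrative check that a predicted climb profile satisfies the three
# constraint types mentioned above (sample values are hypothetical).

def check_constraints(profile, constraints, tol_ft=50.0):
    """profile: dict mapping waypoint name -> predicted altitude (ft).
    constraints: list of (waypoint, kind, altitude_ft) with kind one of
    'below', 'at', 'above'. Returns the list of violated constraints."""
    violations = []
    for wpt, kind, alt in constraints:
        z = profile[wpt]
        ok = (kind == "below" and z <= alt) or \
             (kind == "above" and z >= alt) or \
             (kind == "at" and abs(z - alt) <= tol_ft)
        if not ok:
            violations.append((wpt, kind, alt))
    return violations

profile = {"P1": 9500.0, "P2": 12000.0, "P3": 16500.0}
constraints = [("P1", "below", 10000.0),  # pass under 10,000 ft at P1
               ("P2", "at", 12000.0),     # pass through 12,000 ft at P2
               ("P3", "above", 16000.0)]  # pass over 16,000 ft at P3
assert check_constraints(profile, constraints) == []
```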
Moreover, preferably, said screen 3 generates a dynamic visual feedback on a predicted trajectory associated with the guidance setpoint, which makes it possible to have directly on the same screen 3 both a means for modifying the guidance setpoint, for restoring the current value of the guidance setpoint, and an indication of the effect generated on the trajectory of the aircraft by a modification of the guidance setpoint. This is particularly advantageous operationally, since the pilot can immediately interpret the impact of his or her guidance setpoint modifications on the trajectory, and do so without requiring any visual toing and froing between a control panel and a playback screen.
Furthermore, in the latter embodiment, said screen 3 may also display, automatically, at least one characteristic point 31 of said predicted trajectory 30 (
the point of intersection of its predicted heading/route trajectory with the flight plan;
the point of intersection of its predicted heading/route trajectory with the axis of the runway used for a landing;
the horizontal distance (in NM), relative to the aircraft, of the point of capture of the setpoint altitude.
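The last characteristic point above can be illustrated by a back-of-the-envelope computation, assuming constant ground speed and vertical speed (a hypothetical sketch, not the system's actual trajectory prediction):

```python
# Illustrative computation of the horizontal distance at which the
# setpoint altitude is captured, assuming constant ground speed and
# vertical speed (all sample values are hypothetical).

def capture_distance_nm(current_alt_ft, target_alt_ft,
                        vertical_speed_fpm, ground_speed_kt):
    """Distance (NM) flown before reaching the target altitude."""
    time_min = abs(target_alt_ft - current_alt_ft) / abs(vertical_speed_fpm)
    return ground_speed_kt * time_min / 60.0

# Climbing 6,000 ft at 1,500 ft/min and 300 kt takes 4 min: 20 NM.
assert capture_distance_nm(10000, 16000, 1500, 300) == 20.0
```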
In a particular embodiment, the interactions are extended to the characteristic points of the display of the predicted trajectory of the preceding preferred embodiment. Thus, said interaction means is capable of acting on the displayed characteristic point or points of said predicted trajectory to modify them.
As an illustration, it is thus notably possible to carry out the following operations:
on the heading presetting, it is possible to delay the start of the turn by pushing back, for example along the predicted trajectory, the representation (on the ND screen) of the point at which the heading presetting setpoint begins to be taken into account;
similarly, on the gradient/speed presetting, it is possible to delay the descent/climb start point by an interaction on the graphic representation of this point (on the VD screen);
it is possible to modify the vertical speed/gradient setpoint by an interaction on the end-of-climb/descent graphic representation.
As an illustration, in
Published as US 2013/0135202 A1, May 2013.