Inflight entertainment (IFE) systems have evolved significantly over the last 25 years. Prior to 1978, IFE systems were audio only. In 1978, Bell and Howell (Avicom Division) introduced a group viewing video system based on VHS tapes. In 1988, Airvision introduced the first in-seat video system allowing passengers to choose between several channels of broadcast video. In 1997, Swissair installed the first interactive video on demand (VOD) system. Currently, several IFE systems provide VOD with full digital video disc (DVD)-like passenger controls.
In some systems, IFE passenger controls are provided on a passenger control unit (PCU) mounted in a seat armrest. Passenger controls on a legacy PCU consist of mechanically actuated buttons having predetermined functions. The PCU communicates with a seatback video display unit (VDU) over a cable.
In other systems, IFE passenger controls are provided on a touch screen of a seatback VDU.
Moreover, legacy PCU and touch screen seatback VDU IFE passenger controls are not well suited to allowing selections to be made remotely from the passenger's seat, such as by a flight attendant or parent, and reflected on the passenger's seatback VDU; nor are such conventional passenger controls amenable to advanced IFE applications, such as interactive gaming.
The present invention, in a basic feature, provides an improved IFE system with remote control capability and video display synchronization. In one aspect, the present invention uses remote control capability and video display synchronization to extend the advantages of touch screen IFE passenger controls to airline passengers who cannot easily reach their seatback VDU due to, for example, cabin seating arrangements, age or disabilities. These advantages are realized in this aspect through the expedient of a PCU having a touch screen video display that is synchronized with a VDU video display. For example, when the passenger makes a selection by touching the PCU touch screen video display, the VDU video display reflects the selection at a latency level below human perception. In another aspect, the present invention's remote control capability and video display synchronization are used to allow selections to be made remotely from the passenger's seat, such as by a flight attendant, parent, or interactive game competitor, and reflected on both a remote video display and the passenger's seatback VDU and/or PCU video display at latency levels below human perception.
In one aspect of the invention, therefore, an IFE system comprises a VDU having a first video display; and a PCU communicatively coupled with the VDU and having a second video display, wherein user screens rendered on the first and second video displays are synchronized using output selection information communicated between the PCU and the VDU.
In some embodiments, the VDU is mounted to a seatback in front of a seat used by a passenger and the PCU is detachably mounted in an armrest on the seat.
In some embodiments, the PCU and the VDU are communicatively coupled via an IFE distribution network.
In some embodiments, the second video display comprises a touch screen.
In some embodiments, the first and second video displays comprise touch screens.
In some embodiments, the user screens are GUI pages.
In some embodiments, the user screens are full motion video frames.
In some embodiments, the user screens rendered on the first and second video displays are differently formatted to account for size differences between the first and second video displays.
In some embodiments, the orientation of a user screen rendered on the second video display changes upon detachment of the PCU from the armrest.
In some embodiments, the change in orientation is from portrait to landscape.
In some embodiments, the orientation of a user screen rendered on the second video display changes upon attachment of the PCU to the armrest.
In some embodiments, the change in orientation is from landscape to portrait.
In some embodiments, the PCU transmits the output selection information to the VDU, and the VDU selects output for the first video display using the output selection information.
In some embodiments, the output selection information is logged by the PCU on a network storage device, and the VDU polls the network storage device for the output selection information and selects output for the first video display using the output selection information.
In some embodiments, the VDU polls the PCU for the output selection information and selects output for the first video display using the output selection information.
In some embodiments, the VDU transmits the output selection information to the PCU and the PCU selects output for the second video display using the output selection information.
In some embodiments, the output selection information comprises a GUI page identifier and the VDU selects output for the first video display using the GUI page identifier.
In some embodiments, the output selection information comprises an event identifier associated with an event received on the PCU from a peripheral controller device, the PCU transmits the event identifier to the VDU, and the VDU selects output for the first video display using the event identifier.
In some embodiments, the output selection information comprises VDU video data packets reformatted from VDU video data packets received on the PCU from the IFE distribution network, and the PCU transmits the reformatted video data packets to the VDU.
In some embodiments, the video data packets are User Datagram Protocol (UDP) packets.
In some embodiments, the first and second video displays are synchronized at a latency level below human perception.
In another aspect of the invention, an IFE system comprises a first unit having a first video display; and a second unit communicatively coupled with the first unit over an IFE distribution network and having a second video display, wherein the first unit is assigned to a first passenger and the second unit is not assigned to the first passenger, and wherein user screens rendered on the first and second video displays are synchronized at least in part by transmitting from the second unit to the first unit output selection information and having the first unit select output for the first video display using the output selection information.
In some embodiments, the output selection information comprises a GUI page identifier.
In some embodiments, the second unit is assigned to a second passenger.
In some embodiments, the IFE system further comprises a third unit communicatively coupled with the second unit over an IFE distribution network and having a third video display, wherein user screens rendered on the second and third video displays are synchronized at least in part by transmitting from the second unit to the third unit the output selection information and having the third unit select output for the third video display using the output selection information.
In some embodiments, the third unit is assigned to the first passenger.
In some embodiments, the first unit is a VDU and the third unit is a PCU.
In some embodiments, the third unit is assigned to a second passenger and the second unit is not assigned to the second passenger.
In some embodiments, the first and second video displays are synchronized at a latency level below human perception.
These and other aspects will be better understood by reference to the following detailed description taken in conjunction with the drawings that are briefly described below. Of course, the invention is defined by the appended claims.
User screens rendered on video displays 301, 311 may be identically formatted, or may be differently formatted to account for size differences between VDU video display 301 and PCU video display 311. VDU video display 301 may be a touch screen video display or a standard video display. Even if VDU video display 301 is unreachable by the passenger, implementing it as a touch screen video display has advantages in, for example, allowing a flight attendant to assist the passenger in using the IFE system by making touch-based selections on VDU video display 301. User screens are synchronized using synchronization software executed on VDU 300 and PCU 310 under processor control.
After the synchronization master launches (700), the master loads the root GUI page for the master video display based on configuration information for the master video display (701). The master saves to a slave page ID log in local storage or remote storage (e.g., on an IFE network storage device) an identifier of a slave GUI page corresponding to the root GUI page (702). The master then waits for a new page event (703), such as a selection made by touching a button on the root GUI page on the master video display. After receiving a new page event, the master opens the new GUI page (704) on the master video display. The master then saves in local or remote storage an identifier of a slave GUI page corresponding to the new GUI page (705) and transmits to the synchronization slave a new page event message having the identifier of the slave GUI page (706). The flow then returns to Step 703.
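The master loop described above can be sketched as follows. This is a minimal illustration only, not the specification's implementation: the page log path, slave address, event queue, and message format are all assumptions introduced for the sketch.

```python
import json
import queue
import socket

# Illustrative stand-ins (assumptions, not names from the specification):
PAGE_LOG_PATH = "/tmp/slave_page_id.log"   # the "slave page ID log"
SLAVE_ADDR = ("127.0.0.1", 5005)           # address of the synchronization slave

def save_slave_page_id(page_id: str) -> None:
    """Steps 702/705: log the slave GUI page identifier to storage."""
    with open(PAGE_LOG_PATH, "w") as f:
        f.write(page_id)

def send_new_page_event(sock: socket.socket, page_id: str) -> None:
    """Step 706: transmit a new page event message to the slave."""
    msg = json.dumps({"type": "new_page", "page_id": page_id}).encode()
    sock.sendto(msg, SLAVE_ADDR)

def master_loop(events: "queue.Queue[str]") -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    save_slave_page_id("root")              # steps 701-702: root GUI page
    while True:
        page_id = events.get()              # step 703: wait for a new page event
        # step 704: the new GUI page would be rendered on the master display here
        save_slave_page_id(page_id)         # step 705
        send_new_page_event(sock, page_id)  # step 706
```

The log written in steps 702/705 is what lets a slave that missed the event message (step 706) recover the correct page by polling, as described next.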
After the synchronization slave launches (710), the slave loads the root GUI page for the slave video display based on configuration information for the slave video display (711). The slave then retrieves from a slave page ID log on local or remote storage an identifier of a slave GUI page (712). The slave then compares the identifier of the slave GUI page retrieved from the slave page ID log with the identifier of the currently loaded GUI page for the slave video display (713). If the identifiers do not match, the slave loads the slave GUI page associated with the identifier retrieved from the slave page ID log (715) and the flow returns to Step 712. If the identifiers match, the slave waits T seconds (where T is a predetermined number) for a new page event message from the synchronization master (714). If T seconds elapse without receiving a new page event message from the master, the flow returns to Step 712 (i.e., the slave periodically polls the slave page ID log). If a new page event message is received before T seconds elapse, the slave loads the new page associated with the identifier in the new page event message (716) and returns to Step 714.
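The complementary slave loop can be sketched in the same style. Again, the page log path, listen address, wait interval T, and message format are illustrative assumptions paired with the master sketch above, not details from the specification.

```python
import json
import socket

# Illustrative stand-ins (assumptions, not names from the specification):
PAGE_LOG_PATH = "/tmp/slave_page_id.log"
LISTEN_ADDR = ("127.0.0.1", 5005)
T = 2.0  # predetermined wait, in seconds

def read_slave_page_id() -> str:
    """Step 712: retrieve the slave GUI page identifier from the page ID log."""
    try:
        with open(PAGE_LOG_PATH) as f:
            return f.read().strip()
    except FileNotFoundError:
        return ""

def slave_loop() -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(LISTEN_ADDR)
    sock.settimeout(T)                      # step 714 waits at most T seconds
    current = "root"                        # step 711: load the root GUI page
    while True:
        logged = read_slave_page_id()       # step 712
        if logged and logged != current:    # step 713: identifiers differ
            current = logged                # step 715: load the logged page
            continue
        try:
            data, _ = sock.recvfrom(4096)   # step 714: await master's message
        except socket.timeout:
            continue                        # no message: poll the log again
        msg = json.loads(data)
        if msg.get("type") == "new_page":
            current = msg["page_id"]        # step 716: load the new page
```

Note how the two paths reinforce each other: the event message gives low latency, while the periodic log poll makes the mechanism robust to a lost message.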
Working together, the synchronization master and synchronization slave provide a robust mechanism for synchronizing GUI user screens on a VDU/PCU pair at a level below human perception. Moreover, the above method may be extended to one-to-many scenarios (i.e., one synchronization master and multiple synchronization slaves) to synchronize user screens presented on three or more associated video displays.
Moreover, it may be desirable to allow selections to be made remotely from the passenger's seat, such as by a flight attendant or parent, and reflected on both a remote video display and the passenger's seatback VDU and/or PCU video display at latency levels below human perception. For example, it may be desirable to have a flight attendant or a parent make IFE selections on a VDU and/or PCU touch screen associated with a first seat and have the IFE selections reflected on the VDU and/or PCU associated with a second seat at a latency level below human perception. Moreover, it may be desirable to have IFE selections from one VDU and/or PCU touch screen reflected on a large number of other VDUs and/or PCUs for purposes of IFE system testing. To achieve these and other ends, the method of
Turning now to
After the device redirector launches (800), the redirector obtains redirection configuration information for a peripheral controller device (801). This information indicates whether activity events received from the peripheral controller device are to be routed to a local application (i.e., an application running on the same VDU or PCU on which the redirector is running), redirected to a remote device (i.e., a VDU or PCU other than the one on which the redirector is running), or both. This information may be retrieved from local or remote storage (e.g., an IFE network storage device). The redirector then waits for an event and branches according to the event type (802). If the redirector receives a configuration change event, the redirector updates the redirection configuration information for the peripheral controller device (810) and returns to Step 802. If the redirector receives an activity event message from a remote device in relation to an activity event originating on a peripheral controller device, the redirector extracts the activity event and routes it to a local application, which reflects the event on the local user screen (820). If the redirector receives an activity event from a local peripheral controller device, the redirector consults the redirection configuration information stored in local or remote storage (803). If the redirection configuration information indicates to only route activity events from the local peripheral controller device to a local application, the redirector passes the activity event to the local application, which reflects the event on the local user screen (811). If the redirection configuration information indicates to only redirect activity events from the local peripheral controller device to a remote device, the redirector packages the activity event in an activity event message and routes the event message to the remote device, whereupon an application running on the remote device reflects the event on a remote user screen (821).
If the redirection configuration information indicates to both route activity events from the local peripheral controller device to the local application and redirect such activity events to the remote device, the redirector does both, whereupon the event is reflected on both the local user screen and a remote user screen (804). After handling the activity event, the flow returns to Step 802, where the redirector awaits the next event.
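The device-redirector dispatch described above can be sketched as a small class. The mode names and handler callables are illustrative assumptions; the specification stores the corresponding configuration in local or remote (IFE network) storage.

```python
from typing import Callable, Dict

# Illustrative routing modes (assumptions, not names from the specification):
LOCAL, REMOTE, BOTH = "local", "remote", "both"

class DeviceRedirector:
    def __init__(self, mode: str,
                 local_handler: Callable[[Dict], None],
                 remote_sender: Callable[[Dict], None]) -> None:
        self.mode = mode                  # step 801: redirection configuration
        self.local_handler = local_handler
        self.remote_sender = remote_sender

    def on_config_change(self, mode: str) -> None:
        """Step 810: update the redirection configuration."""
        self.mode = mode

    def on_remote_event(self, event: Dict) -> None:
        """Step 820: reflect an event from a remote device on the local screen."""
        self.local_handler(event)

    def on_local_event(self, event: Dict) -> None:
        """Steps 803, 811, 821, 804: route a local peripheral controller event."""
        if self.mode in (LOCAL, BOTH):
            self.local_handler(event)     # reflect on the local user screen
        if self.mode in (REMOTE, BOTH):
            self.remote_sender(event)     # reflect on a remote user screen
```

For example, configuring the redirector at a parent's seat with the `BOTH` mode would reflect each button press on the parent's own screen while also forwarding it to the child's VDU.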
Finally,
After the network manager launches (900), the manager obtains UDP video packet processing information (901). This information indicates whether full motion video in UDP video packets should be redirected to a remote device (e.g., a PCU paired with the VDU on which the manager is running) and, if so, how the redirected full motion video should be formatted for compatibility with the remote device, as well as how the full motion video should be formatted for compatibility when routed to a local application (e.g., an application running on the VDU on which the manager is running). This information may be retrieved from local or remote storage (e.g., an IFE network storage device). The manager then receives a UDP video packet destined for the local application and consults the UDP video packet processing information (902). The manager packages the full motion video in a format compatible with the local application and routes it to the local application, which locally renders user screens depicting full motion video frames (903). The manager then determines whether a remote device and format are defined in the UDP video packet processing information (904). If not, the flow returns to Step 902. If so, the manager also packages the full motion video in a format compatible with the remote device and redirects it to the remote device, whereupon user screens depicting full motion video frames are rendered remotely (905), whereafter the flow returns to Step 902.
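The per-packet flow above can be sketched as follows. Here "formatting" is reduced to tagging each payload; a real network manager would transcode or re-encapsulate the full motion video. The class and sink names are assumptions introduced for the sketch.

```python
from typing import Callable, Optional

class NetworkManager:
    def __init__(self,
                 local_sink: Callable[[bytes], None],
                 remote_sink: Optional[Callable[[bytes], None]] = None) -> None:
        # Step 901: processing information defines the local format and,
        # optionally, a remote device and its format.
        self.local_sink = local_sink
        self.remote_sink = remote_sink

    def on_udp_video_packet(self, payload: bytes) -> None:
        """Steps 902-905: route locally and, if configured, redirect remotely."""
        self.local_sink(b"LOCAL:" + payload)        # step 903: local format
        if self.remote_sink is not None:            # step 904: remote defined?
            self.remote_sink(b"REMOTE:" + payload)  # step 905: remote format
```

When no remote device is configured (step 904 answers no), the manager degenerates to an ordinary local video path, so the same code serves both paired and unpaired seats.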
In some embodiments, the received UDP video packets are controlled using Real-Time Streaming Protocol (RTSP). The manager generates Internet Group Management Protocol (IGMP) packets from the received UDP video packets and sends them to the remote device for rendering on a selected IGMP channel. The remote device video display replicates the video being shown on the local device with no human-perceptible latency.
It will be appreciated by those of ordinary skill in the art that the invention can be embodied in other specific forms without departing from the spirit or essential character hereof. The present description is therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims, and all changes that come within the meaning and range of equivalents thereof are intended to be embraced therein.
This application claims the benefit of U.S. provisional application No. 61/335,084 entitled “INFLIGHT ENTERTAINMENT SYSTEM WITH SYNCHRONIZED SEATBACK AND PASSENGER CONTROL UNIT VIDEO DISPLAYS,” filed on Dec. 31, 2009, the contents of which are incorporated herein by reference.
Number | Date | Country
---|---|---
61335084 | Dec 2009 | US