1. Field
This document relates to a display device and a method of controlling the same, and more particularly, to a display device and a method of controlling the same, capable of controlling the presentation (i.e., display) of an image in response to a distance and an approach direction with respect to a stereoscopic image.
2. Related Art
The functional diversification of terminals, such as personal computers, laptop computers, cellular phones or the like, has led to the implementation thereof as multimedia player type terminals equipped with complex functions of, for example, capturing pictures or videos, reproducing music or video files, providing game services, receiving broadcasting signals or the like.
Terminals, as multimedia devices, may also be called display devices as they are generally configured to display a variety of image information.
Such display devices may be classified into portable and stationary type according to the mobility thereof. Examples of portable display devices may include laptop computers, cellular phones and the like, while examples of stationary display devices may include televisions, monitors for desktop computers and the like.
It is, therefore, an object of the present invention to provide a display device and a method of controlling the same, capable of efficiently controlling the presentation of an image in response to a distance and an approach direction with respect to a stereoscopic image.
According to an aspect of the present invention, there is provided a display device including: a camera capturing a gesture of a user; a display displaying a stereoscopic image; and a controller controlling presentation of the stereoscopic image in response to a distance between the gesture and the stereoscopic image in a virtual space and an approach direction of the gesture with respect to the stereoscopic image.
According to another aspect of the present invention, there is provided a display device including: a camera capturing a gesture of a user; a display displaying a stereoscopic image having a plurality of sides; and a controller executing a function assigned to at least one of the plurality of sides in response to an approach direction of the gesture with respect to the at least one of the plurality of sides in a virtual space.
According to another exemplary embodiment of the present invention, there is provided a method of controlling the display device, including: displaying a stereoscopic image; acquiring a gesture with respect to the displayed stereoscopic image; and controlling presentation of the stereoscopic image in response to a distance between the gesture and the stereoscopic image in a virtual space, and an approach direction of the gesture with respect to the stereoscopic image.
According to another exemplary embodiment of the present invention, there is provided a method of controlling a display device, including: displaying a stereoscopic image having a plurality of sides; acquiring a gesture with respect to the displayed stereoscopic image; and executing a function assigned to at least one of the plurality of sides in response to an approach direction of the gesture with respect to the at least one of the plurality of sides in a virtual space.
The above and other objects and features of the present invention will become apparent from the following description of preferred embodiments given in conjunction with the accompanying drawings, in which:
This document will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of this document are shown. This document may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of this document to those skilled in the art.
Hereinafter, a mobile terminal relating to this document will be described in more detail with reference to the accompanying drawings. In the following description, the suffixes “module” and “unit” are given to components of the mobile terminal merely to facilitate the description and do not have meanings or functions distinguished from each other.
The mobile terminal described in the specification can include a cellular phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation system and so on.
As shown, the display device 100 may include a communication unit 110, a user input unit 120, an output unit 150, a memory 160, an interface 170, a controller 180, and a power supply 190. Not all of the components shown are essential, and the display device 100 may alternatively be implemented with more or fewer components.
The communication unit 110 may include at least one module that enables communication between the display device 100 and a communication system or between the display device 100 and another device. For example, the communication unit 110 may include a broadcasting receiving module 111, an Internet module 113, and a near field communication module 114.
The broadcasting receiving module 111 may receive broadcasting signals and/or broadcasting related information from an external broadcasting management server through a broadcasting channel.
The broadcasting channel may include a satellite channel and a terrestrial channel, and the broadcasting management server may be a server that generates and transmits broadcasting signals and/or broadcasting related information or a server that receives previously created broadcasting signals and/or broadcasting related information and transmits the broadcasting signals and/or broadcasting related information to a terminal. The broadcasting signals may include not only TV broadcasting signals, radio broadcasting signals, and data broadcasting signals, but also signals in the form of a combination of a TV broadcasting signal or a radio broadcasting signal and a data broadcasting signal.
The broadcasting related information may be information on a broadcasting channel, a broadcasting program or a broadcasting service provider, and may be provided even through a communication network.
The broadcasting related information may exist in various forms. For example, the broadcasting related information may exist in the form of an electronic program guide (EPG) of a digital multimedia broadcasting (DMB) system or in the form of an electronic service guide (ESG) of a digital video broadcast-handheld (DVB-H) system.
The broadcasting receiving module 111 may receive broadcasting signals using various broadcasting systems. The broadcasting signals and/or broadcasting related information received through the broadcasting receiving module 111 may be stored in the memory 160.
The Internet module 113 may correspond to a module for Internet access and may be included in the display device 100 or may be externally attached to the display device 100.
The near field communication module 114 may correspond to a module for near field communication. Further, Bluetooth®, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB) and/or ZigBee® may be used as a near field communication technique.
The user input unit 120 is used to input an audio signal or a video signal and may include a camera 121 and a microphone 122.
The camera 121 may process image frames of still images or moving images obtained by an image sensor in a video telephony mode or a photographing mode. The processed image frames may be displayed on a display 151. The camera 121 may be a 2D or 3D camera. In addition, the camera 121 may be configured in the form of a single 2D or 3D camera or in the form of a combination of the 2D and 3D cameras.
The image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the communication unit 110. The display device 100 may include at least two cameras 121.
The microphone 122 may receive an external audio signal in a call mode, a recording mode or a speech recognition mode and process the received audio signal into electric audio data. The microphone 122 may employ various noise removal algorithms for removing or reducing noise generated when the external audio signal is received.
The output unit 150 may include the display 151 and an audio output module 152.
The display 151 may display information processed by the display device 100. The display 151 may display a user interface (UI) or a graphic user interface (GUI) relating to the display device 100. In addition, the display 151 may include at least one of a liquid crystal display, a thin film transistor liquid crystal display, an organic light-emitting diode display, a flexible display and a three-dimensional display. Some of these displays may be of a transparent type or a light transmissive type. That is, the display 151 may include a transparent display. The transparent display may include a transparent liquid crystal display. The rear structure of the display 151 may also be of a light transmissive type. Accordingly, a user may see an object located behind the terminal body through the transparent area of the terminal body occupied by the display 151.
The display device 100 may include at least two displays 151. For example, the display device 100 may include a plurality of displays 151 arranged on a single plane and spaced apart at a predetermined distance, or integrated with one another. The plurality of displays 151 may also be arranged on different planes.
Further, when the display 151 and a sensor sensing touch (hereafter referred to as a touch sensor) form a layered structure that is referred to as a touch screen, the display 151 may be used as an input device in addition to an output device. The touch sensor may be in the form of a touch film, a touch sheet, or a touch pad, for example.
The touch sensor may convert a variation in pressure applied to a specific portion of the display 151 or a variation in capacitance generated at a specific portion of the display 151 into an electric input signal. The touch sensor may sense pressure of touch as well as position and area of the touch.
When the user applies a touch input to the touch sensor, a signal corresponding to the touch input may be transmitted to a touch controller. The touch controller may then process the signal and transmit data corresponding to the processed signal to the controller 180. Accordingly, the controller 180 can detect a touched portion of the display 151.
The audio output module 152 may output audio data received from the communication unit 110 or stored in the memory 160. The audio output module 152 may output audio signals related to functions, such as a call signal incoming tone and a message incoming tone, performed in the display device 100.
The memory 160 may store a program for operation of the controller 180 and temporarily store input/output data such as a phone book, messages, still images, and/or moving images. The memory 160 may also store data about vibrations and sounds in various patterns that are output when a touch input is applied to the touch screen.
The memory 160 may include at least one of a flash memory, a hard disk type memory, a multimedia card micro type memory, a card type memory such as SD or XD memory, a random access memory (RAM), a static RAM (SRAM), a read-only memory (ROM), an electrically erasable programmable ROM (EEPROM), a programmable ROM (PROM), a magnetic memory, a magnetic disk, and an optical disk. The display device 100 may also operate in relation to a web storage performing the storing function of the memory 160 on the Internet.
The interface 170 may serve as a path to all external devices connected to the display device 100. The interface 170 may receive data or power from the external devices and transmit the data or power to internal components of the display device 100, or transmit data of the display device 100 to the external devices. For example, the interface 170 may include a wired/wireless headset port, an external charger port, a wired/wireless data port, a memory card port, a port for connecting a device having a user identification module, an audio I/O port, a video I/O port, and/or an earphone port.
The controller 180 may control overall operations of the display device 100. For example, the controller 180 may perform control and processing for voice communication. The controller 180 may also include an image processor 182 for processing images, which will be explained later.
The power supply 190 receives external power and internal power and provides power required for each of the components of the display device 100 to operate under the control of the controller 180.
Various embodiments described in this document can be implemented in software, hardware or a computer readable recording medium. According to hardware implementation, embodiments of this document may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and/or electrical units for executing functions. The embodiments may be implemented by the controller 180 in some cases.
According to software implementation, embodiments such as procedures or functions may be implemented with a separate software module executing at least one function or operation. Software code may be implemented as a software application written in an appropriate programming language. The software code may be stored in the memory 160 and executed by the controller 180.
The proximity sensor can be constructed such that it outputs a proximity signal according to the distance between the pointer approaching the touch screen and the touch screen (referred to as the “proximity depth”).
The distance within which the proximity signal is output when the pointer approaches the touch screen is referred to as a detection distance. The proximity depth can be determined by using a plurality of proximity sensors having different detection distances and comparing the proximity signals respectively output from the proximity sensors.
Specifically, when the pointer completely comes into contact with the touch screen (D0), the contact is recognized as a contact touch. When the pointer is located within a distance D1 from the touch screen, it is recognized as a proximity touch of a first proximity depth. When the pointer is located in a range between the distance D1 and a distance D2 from the touch screen, it is recognized as a proximity touch of a second proximity depth. When the pointer is located in a range between the distance D2 and a distance D3 from the touch screen, it is recognized as a proximity touch of a third proximity depth. When the pointer is located at a distance greater than D3 from the touch screen, the proximity touch is recognized as cancelled.
Accordingly, the controller 180 can recognize the proximity touch as various input signals according to the proximity distance and proximity position of the pointer with respect to the touch screen and perform various operation controls according to the input signals.
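By way of illustration only, the proximity-depth classification described above might be sketched as follows. The threshold values, units, and state names are assumptions made for the sketch and are not taken from this document.

```python
# Sketch of proximity-depth classification; thresholds and enum names are
# illustrative assumptions, not values defined by this document.
from enum import Enum

D1, D2, D3 = 1.0, 2.0, 3.0  # hypothetical detection distances in centimeters

class Touch(Enum):
    CONTACT = 0            # pointer touches the screen (distance D0 = 0)
    PROXIMITY_DEPTH_1 = 1
    PROXIMITY_DEPTH_2 = 2
    PROXIMITY_DEPTH_3 = 3
    CANCELLED = 4          # pointer is farther away than D3

def classify(distance_cm: float) -> Touch:
    """Map the pointer-to-screen distance onto a touch state."""
    if distance_cm <= 0.0:
        return Touch.CONTACT
    if distance_cm <= D1:
        return Touch.PROXIMITY_DEPTH_1
    if distance_cm <= D2:
        return Touch.PROXIMITY_DEPTH_2
    if distance_cm <= D3:
        return Touch.PROXIMITY_DEPTH_3
    return Touch.CANCELLED

print(classify(1.5))  # Touch.PROXIMITY_DEPTH_2
```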
Binocular parallax (or stereo disparity) refers to the difference between the views of an object seen by a person's (user's or observer's) left and right eyes. When the user's brain combines the image viewed by the left eye with that viewed by the right eye, the combined image gives the user a sense of depth. Hereinafter, the phenomenon in which the user perceives depth according to binocular parallax will be referred to as a ‘stereoscopic vision’, and an image causing a stereoscopic vision will be referred to as a ‘stereoscopic image’. Also, when a particular object included in an image causes the stereoscopic vision, the corresponding object will be referred to as a ‘stereoscopic object’.
A method for displaying a stereoscopic image according to binocular parallax may be classified as a glass type method or a glassless type method. The glass type method may include a scheme using tinted glasses having wavelength selectivity, a polarization glass scheme using a light blocking effect according to a polarization difference, and a time-division glass scheme alternately providing left and right images within the afterimage time of the eyes. Besides these, the glass type method may further include a scheme in which filters having different transmittances are mounted over the left and right eyes, and a cubic effect with respect to a horizontal movement is obtained from the time difference in the visual system caused by the difference in transmittance.
The glassless type method, in which a cubic effect is generated from an image display surface, rather than from an observer, includes a parallax barrier scheme, a lenticular lens scheme, a microlens array scheme, and the like.
Meanwhile, the foregoing methods for displaying a stereoscopic image are merely for explaining exemplary embodiments of the present invention, and the present invention is not limited thereto. Besides the foregoing methods, a stereoscopic image using binocular parallax may be displayed by using various other methods.
Hereinafter, concrete embodiments of the present invention will be described.
As shown therein, the controller 180 of the display device 100, according to an exemplary embodiment of the present invention, may display a stereoscopic image in operation S10.
As described above, the stereoscopic image may be an image displayed by using binocular disparity, that is, stereo disparity. By presenting an image using the stereo disparity, a stereoscopic image with depth or perspective may be displayed. For example, in such a manner, an image may look as if protruding from, or receding behind, the display surface of the display 151. The stereoscopic image using the stereo disparity is different from a related-art two-dimensional (2D) display that gives merely a 3D-like impression. The method of displaying a stereoscopic image by using the stereo disparity will be described later in more detail.
When the stereoscopic image is displayed, a user's gesture may be acquired in operation S30.
The user's gesture may be captured by the camera 121 provided in the display device 100. For example, assuming that the display device 100 is a fixed TV, the camera 121 may capture a motion made by a user in front of the TV. Also, assuming that the display device 100 is a mobile terminal, the camera 121 may capture a hand motion of the user in front or at the back of the mobile terminal.
When the user's gesture is acquired, the presentation of the stereoscopic image may be controlled according to a distance and a location relationship between the stereoscopic image and the gesture in operation S50.
The controller 180 may learn (i.e., determine) the location of the gesture made by the user. That is, an image captured by the camera 121 may be analyzed to determine the location of the gesture in the virtual space. The location of the gesture may be a relative distance with respect to the body of a user or the display surface of the display 151. In this case, the distance may refer to a location within a 3D space. For example, the distance may indicate a specific spot having x-y-z components measured from an origin such as a specific point on the body of the user.
The controller 180 may determine the location of the displayed stereoscopic image in the virtual space. That is, the controller 180 may determine the location of the stereoscopic image in the virtual space giving the user an impression that an image is displayed therein due to the effect of the stereo disparity. For example, this means that in the case where an image has positive (+) depth to look as if protruding toward the user from the display surface of the display 151, the controller 180 may determine the extent to which the image protrudes, and the location thereof.
The controller 180 may determine a direction in which the gesture approaches the stereoscopic image, that is, an approach direction of the gesture with respect to the stereoscopic image. That is, since the controller 180 learns the location of the gesture and the location of the stereoscopic image in the virtual space, it can be determined which side (or face) of the stereoscopic image the gesture is made for. For example, in the case in which the stereoscopic image in the form of a polyhedron is displayed in the virtual space, the controller 180 may determine whether the user's gesture is directed toward the front side of the stereoscopic image or the lateral or rear side of the stereoscopic image.
Since the controller 180 may determine the approach direction of the gesture with respect to the stereoscopic image, a function corresponding to the approach direction may be executed. For example, in the case in which the stereoscopic image is approached from the front side thereof and touched, a function of activating the stereoscopic image may be executed. Also, in the case in which the stereoscopic image is approached from the rear side thereof and touched, a specific function corresponding to the stereoscopic image may be executed.
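A minimal sketch of such direction-dependent dispatch follows; the side names and handler functions are assumptions for illustration and are not defined by this document.

```python
# Illustrative dispatch of functions by approach direction; side names and
# handlers are assumptions for the sketch, not an API from the source.
def activate_image():
    print("stereoscopic image activated")      # e.g. front-side touch

def run_assigned_function():
    print("side-specific function executed")   # e.g. rear-side touch

HANDLERS = {
    "front": activate_image,
    "rear": run_assigned_function,
}

def on_gesture(approach_side: str) -> None:
    handler = HANDLERS.get(approach_side)
    if handler is not None:
        handler()

on_gesture("front")  # -> stereoscopic image activated
```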
The controller 180 may display an image acquired in real time by the camera 121 on the display 151 in the form of a preview.
The controller 180 may acquire one or more stereo disparities respectively corresponding to one or more of the image objects in operation S110.
In the case where the camera 121 is a 3D camera capable of acquiring an image for the left eye (hereinafter, referred to as “a left-eye image”) and an image for the right eye (hereinafter, referred to as “a right-eye image”), the controller 180 may use the acquired left-eye and right-eye images to acquire the stereo disparity of each of the first image object 10 and the second image object 11.
The controller 180 may acquire a stereo disparity d1 corresponding to the first image object 10 on the basis of the left-eye image 10a and the right-eye image 10b.
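As an illustration of what acquiring a stereo disparity might amount to, the following sketch treats the disparity as the horizontal offset between an image object's positions in the left-eye and right-eye images. The coordinate values are made up for the example.

```python
# Sketch: stereo disparity of an image object as the horizontal offset
# between its centers in the left-eye and right-eye images (in pixels).
# The center coordinates below are hypothetical example values.
def stereo_disparity(center_left_x: float, center_right_x: float) -> float:
    return center_left_x - center_right_x

d1 = stereo_disparity(412.0, 398.0)  # hypothetical centers of image object 10
print(d1)  # 14.0 pixels; a larger disparity generally corresponds to a nearer object
```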
In the case where the camera 121 is a 2D camera, the controller 180 may convert a 2D image, acquired by the camera 121, into a stereoscopic image by using a predetermined algorithm for converting a 2D image into a 3D image, and display the converted image on the display 151.
Furthermore, by using left-eye and right-eye images created by the above image conversion algorithm, the controller 180 may acquire the respective stereo disparities of the first image object 10 and the second image object 11.
The controller 180 may acquire one or more graphic objects respectively corresponding to one or more of the image objects. The controller 180 may display the acquired one or more graphic objects on the display 151 so as to have a stereo disparity.
The controller 180 may give the user the perception of various types of depth by displaying a stereoscopic image having positive (+) or negative (−) depth as needed.
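One hedged sketch of how a signed depth could be turned into left-eye and right-eye offsets during rendering; the disparity scale and the sign convention are assumptions, not parameters defined by this document.

```python
# Sketch of rendering a graphic object with a signed depth: positive depth
# shifts the left/right views so the object appears to protrude, negative
# depth so it appears to recede. The scale factor is an illustrative assumption.
PIXELS_PER_DEPTH_UNIT = 4.0  # hypothetical disparity scale

def eye_offsets(depth: float) -> tuple[float, float]:
    """Horizontal pixel offsets for the left-eye and right-eye views."""
    half = 0.5 * PIXELS_PER_DEPTH_UNIT * depth
    return (+half, -half)  # swap the signs for the opposite convention

print(eye_offsets(+2.0))  # protruding object: (4.0, -4.0)
print(eye_offsets(-2.0))  # receding object: (-4.0, 4.0)
```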
As shown in those drawings, the acquiring of the user's gesture by the controller 180 of the display device in operation S30 may be performed through the following operations.
The controller 180 may activate the camera 121. When the camera 121 is activated, the controller 180 may capture an image of the surroundings of the display device 100.
It is determined whether a user having control is found in operation S32, and the user having control may be tracked in operation S33.
The controller 180 may control the display device 100 on the basis of a gesture made by a specific user having control. For example, this means that in the case where a plurality of people are located in front of the display device 100, the controller 180 may allow a specific function of the display device 100 to be performed on the basis of only a gesture made by a specific person having acquired control among those in front of the display device 100.
When a user having control is found, the user with control may be tracked. The granting and tracking of the control may be performed on the basis of an image captured by the camera 121 provided in the display device 100. That is, this means that the controller 180 analyzes the captured image to thereby continuously determine whether or not a specific user U exists, the specific user U performs a gesture required for control acquisition, the specific user U is moving, and the like.
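A rough sketch of granting and tracking control, assuming a hypothetical per-frame detector output; the gesture label and the data format are illustrative, not taken from this document.

```python
# Sketch of granting and keeping control with one user among several.
# `frame_users` stands in for the result of analyzing a camera frame.
controlling_user = None

def update_control(frame_users: dict[int, str]) -> None:
    """frame_users maps a tracked user ID to the gesture seen this frame."""
    global controlling_user
    if controlling_user is None:
        # grant control to the first user performing the acquisition gesture
        for user_id, gesture in frame_users.items():
            if gesture == "raise_hand":  # hypothetical control-acquisition gesture
                controlling_user = user_id
                break
    elif controlling_user not in frame_users:
        controlling_user = None  # the tracked user left the scene

update_control({1: "idle", 2: "raise_hand"})
print(controlling_user)  # 2
```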
While the user having control is being tracked, it may be determined whether or not a specific gesture of the user is captured in operation S34.
The specific gesture of the user may be a gesture for executing a specific function of the display device 100 or terminating the specific function being performed. For example, the specific gesture may be a gesture to select various menus displayed as stereoscopic images by the display device 100. Hereinafter, the operation in which the presentation of the stereoscopic image is controlled according to the user's gesture in operation S50 will be described in more detail.
As shown in those drawings, the display device 100 according to an exemplary embodiment of the present invention may appropriately control the presentation of the stereoscopic image in response to the specific gesture made by the user U with respect to the stereoscopic image.
The controller 180 may acquire the location of the stereoscopic image in the virtual space VS in operation S51.
When each of the objects O1 to O3 is displayed in the virtual space VS, the user U may have an impression that he can take hold (grip) of the displayed objects O1 to O3 with his hand. This effect is more pronounced for an object that appears to be located near the user U.
The controller 180 may learn the location of the stereoscopic image displayed in the virtual space VS. That is, based on the locations of the left-eye and right-eye images 10a and 10b, the controller 180 may determine the location in the virtual space VS at which the stereoscopic image appears to the user.
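For a sense of the geometry involved, the following sketch computes where a fused image point appears to a viewer using similar triangles over the viewing distance, eye separation, and on-screen parallax. This is standard stereoscopic-display geometry rather than a method stated in this document, and all numeric values are illustrative assumptions.

```python
# Sketch of the perceived distance of a fused stereoscopic point.
# e = eye separation, D = viewing distance, p = on-screen parallax
# (x_right - x_left, same units as e); values are illustrative.
def perceived_distance(D: float, e: float = 0.065, p: float = 0.0) -> float:
    """Distance from the viewer (meters) at which the point appears."""
    if p >= e:
        raise ValueError("parallax at or beyond eye separation cannot be fused")
    return D * e / (e - p)

print(perceived_distance(D=2.0, p=-0.02))  # crossed parallax: ~1.53 m, in front of the screen
print(perceived_distance(D=2.0, p=+0.02))  # uncrossed parallax: ~2.89 m, behind the screen
```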
The location of the gesture in the virtual space VS may be acquired in operation S52.
The location of the gesture in the virtual space VS may be acquired by using the camera 121 provided in the display device 100. That is, the controller 180 may analyze an image acquired as the camera 121 continuously tracks an image of the user U.
It may be determined whether the gesture is made within a predetermined distance to the stereoscopic image in operation S53.
For the execution of a specific function on the basis of the user U's gesture, the user U may make a specific gesture within the predetermined distance. For example, the user U may put out his hand H toward the first object O1 to make a motion associated with the first object O1.
The controller 180 may determine a direction V in which the user U's hand H approaches, through an image analysis. That is, the controller 180 may determine whether the hand H approaches the first object O1 or another object adjacent to the first object O1, by using the trace of the gesture made by the hand H.
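A minimal sketch of estimating the approach direction from a short trace of tracked 3D hand positions; the coordinate format and the example values are assumptions.

```python
# Sketch: estimating the approach direction V of a hand from the recent
# trace of its tracked 3D positions.
import math

def approach_direction(trace: list[tuple[float, float, float]]) -> tuple[float, float, float]:
    """Unit vector from the oldest to the newest position in the trace."""
    (x0, y0, z0), (x1, y1, z1) = trace[0], trace[-1]
    dx, dy, dz = x1 - x0, y1 - y0, z1 - z0
    norm = math.sqrt(dx * dx + dy * dy + dz * dz) or 1.0
    return (dx / norm, dy / norm, dz / norm)

# a hand moving straight toward the screen along -z (coordinates are made up)
print(approach_direction([(0.0, 0.0, 0.5), (0.0, 0.0, 0.3), (0.0, 0.0, 0.1)]))
```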
When it is determined that the gesture is made within the predetermined distance with respect to the stereoscopic image, an approach direction of the gesture with respect to the stereoscopic image may be acquired in operation S54.
When the gesture is made by the hand H of the user U, the controller 180 may determine which direction (i.e., side) of the hand H faces the first object O1. For example, the controller 180 may determine whether a palm side P or a back side B of the palm faces the first object O1.
It may be determined which one of the palm side P and the back side B approaches the first object O1, by analyzing the image acquired by the camera 121 and/or tracking the trace of the hand H.
When the palm side P moves in a first direction A1 as in the case of the first hand H1, the controller 180 may determine that the user U moves to take a grip on (i.e., hold) the first object O1. When the back side B moves in a second direction A2 as in the case of the second hand H2, the controller 180 may determine that the user U is not moving to take a grip on the first object O1. That is, it may be determined which motion the user intends to make with respect to a specific stereoscopic image, on the basis of the gesture of the user U, in particular, a hand motion. Accordingly, the controller 180 may enable the execution of a specific function on the basis of a specific hand motion. That is, the case of the first hand H1 may be linked to a motion of grabbing the first object O1, and the case of the second hand H2 may be linked to a motion of pushing the first object O1.
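The grip-versus-push distinction might be sketched as a comparison between the palm's facing direction and the motion direction; the vectors and the dot-product threshold below are illustrative assumptions.

```python
# Sketch of the grip-versus-push decision: compare the palm's facing
# direction with the hand's motion direction.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify_motion(palm_normal, motion_dir, threshold: float = 0.7) -> str:
    """palm_normal points out of the palm; motion_dir is the approach vector."""
    alignment = dot(palm_normal, motion_dir)
    if alignment > threshold:
        return "grab"   # palm leads the motion, as with the first hand H1
    if alignment < -threshold:
        return "push"   # back of the hand leads, as with the second hand H2
    return "undetermined"

print(classify_motion((0.0, 0.0, -1.0), (0.0, 0.0, -1.0)))  # grab
print(classify_motion((0.0, 0.0, 1.0), (0.0, 0.0, -1.0)))   # push
```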
It may be determined whether the approach is appropriate for the external shape and properties of the stereoscopic image in operation S55, and the presentation of the stereoscopic image may be controlled in operation S56.
The controller 180 may allow stereoscopic images to have different characteristics according to shapes of the stereoscopic images and/or properties of entities respectively corresponding to the stereoscopic images. That is, a stereoscopic image representing a rigid object such as an iron bar, and a stereoscopic image representing an elastic object such as a rubber bar may have different responses to a user's gesture. In the case in which a stereoscopic image represents an entity such as an iron bar, the shape of the stereoscopic image may be maintained as it is even when a user makes a motion of holding the corresponding image. In contrast, in the case in which a stereoscopic image represents an entity such as a rubber bar, the shape thereof may be changed when a user makes a motion of holding the same.
If the first object O1 is set to have rigidity, its shape may be maintained even when the user makes a motion of holding it.
When the first hand H1 of the user U makes a holding motion after approaching the first object O1 within a predetermined distance, the controller 180 may cause the stereoscopic image to look as if the first object O1 is held by the hand. Accordingly, the user can perform a function of moving the first object O1 in the third direction A3.
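A small sketch of such material-dependent responses, assuming a hypothetical rigidity flag and a toy deformation model; neither is defined by this document.

```python
# Sketch of material-dependent responses to a holding gesture.
from dataclasses import dataclass

@dataclass
class StereoObject:
    name: str
    rigid: bool
    scale: float = 1.0  # 1.0 = undeformed shape

def apply_grip(obj: StereoObject, grip_strength: float) -> None:
    """Deform elastic objects under grip; leave rigid objects unchanged."""
    if not obj.rigid:
        obj.scale = max(0.5, 1.0 - 0.3 * grip_strength)

iron_bar = StereoObject("iron bar", rigid=True)
rubber_bar = StereoObject("rubber bar", rigid=False)
apply_grip(iron_bar, 1.0)
apply_grip(rubber_bar, 1.0)
print(iron_bar.scale, rubber_bar.scale)  # 1.0 0.7
```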
As shown in those drawings, the controller 180 of the display device 100 according to an exemplary embodiment of the present invention may perform a specific function in response to a user's gesture with respect to a specific side of a stereoscopic image in the form of a polyhedron with a plurality of sides (i.e., faces).
The user may make a gesture of touching the rear side of the object O, that is, the third side S3 thereof, in an eighth direction A8.
When the user makes a gesture of holding the object O, a different function from that in the case of a gesture of separately touching each side may be executed. For example, assuming that a first function is executed by a gesture with respect to the first side S1 and a second function is executed by a gesture with respect to the third side S3, a gesture of simultaneously touching the first and third sides S1 and S3 may execute a third function. In this case, the third function may be totally different from the first and second functions or may be the simultaneous execution of the first and second functions.
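One way to picture the side-to-function assignment, including the combined (holding) case, is a lookup keyed by the set of touched sides; the side names and function labels are assumptions for the sketch.

```python
# Sketch: touching one side runs that side's function, while touching two
# sides at once (a holding gesture) may run a distinct third function.
SIDE_FUNCTIONS = {
    frozenset({"S1"}): "first function",
    frozenset({"S3"}): "second function",
    frozenset({"S1", "S3"}): "third function",  # could also run both functions
}

def execute(touched_sides: set[str]) -> str:
    return SIDE_FUNCTIONS.get(frozenset(touched_sides), "no function assigned")

print(execute({"S1"}))        # first function
print(execute({"S1", "S3"}))  # third function
```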
As shown in those drawings, the display device according to an exemplary embodiment of the present invention may display a pointer P corresponding to a gesture of a user U. In this case, the pointer P is displayed so as to give the user an impression of 3D distance.
The pointer P having the size of the first pointer P1 at the reference location may become bigger to have the size of a second pointer P2 as the hand H moves in a twelfth direction A12. That is, the pointer P increases in size as it approaches the user. Furthermore, the pointer P having the size of the first pointer P1 at the reference location may become smaller to have the size of a third pointer P3 as the hand H moves in a thirteenth direction A13. That is, the pointer P may decrease in size as it moves farther away from the user.
Since the pointer changes in size according to the distance from the user, the perspective caused by a stereo disparity may be more clearly presented. Also, this may provide a guide to the depth of an object selectable by the user's gesture.
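A sketch of such depth-dependent pointer sizing follows; the reference size and the growth rate are illustrative assumptions rather than values from this document.

```python
# Sketch of depth-dependent pointer sizing: the pointer grows as the hand
# moves toward the user and shrinks as it moves away.
REFERENCE_SIZE = 24.0   # pointer diameter in pixels at the reference depth
SCALE_PER_METER = 0.5   # hypothetical growth rate toward the user

def pointer_size(depth_offset_m: float) -> float:
    """depth_offset_m > 0: hand closer to the user than the reference
    location; depth_offset_m < 0: hand farther away."""
    return max(4.0, REFERENCE_SIZE * (1.0 + SCALE_PER_METER * depth_offset_m))

print(pointer_size(+0.3))  # hand pulled toward the user -> larger pointer
print(pointer_size(-0.3))  # hand pushed away -> smaller pointer
```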
As shown in those drawings, when a gesture for selecting a specific image from among the plurality of stereoscopic images is input, the controller 180 of the display device 100 according to an exemplary embodiment of the present invention may change the presentation of the other stereoscopic images. Accordingly, the selection of the specific stereoscopic image can be facilitated.
As shown in those drawings, the display device 100 according to an exemplary embodiment of the present invention may give the user U feedback on a gesture.
The feedback may be recognized through an auditory sense and/or a tactile sense. For example, when the user makes a gesture of touching or holding a stereoscopic image, the controller 180 may provide the user with corresponding sounds or vibrations. The feedback for the user may be provided by using a directional speaker SP or an ultrasonic generator US, which may serve as a feedback unit provided in the display device 100.
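A hedged sketch of the feedback dispatch, in which play_sound and emit_ultrasound are hypothetical stand-ins for driving the directional speaker SP and the ultrasonic generator US; neither function is defined by this document.

```python
# Sketch of gesture feedback dispatch; the event names and both output
# functions are illustrative assumptions.
def play_sound(name: str) -> None:
    print(f"speaker SP: {name}")

def emit_ultrasound(intensity: float) -> None:
    print(f"ultrasound US at intensity {intensity}")  # tactile feedback

def give_feedback(event: str) -> None:
    if event == "touch":
        play_sound("tap")
        emit_ultrasound(0.3)
    elif event == "hold":
        play_sound("grip")
        emit_ultrasound(0.8)

give_feedback("touch")
```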
As shown in those drawings, the display device 100 according to another exemplary embodiment of the present invention may be a mobile terminal which can be carried by a user. In the case in which the display device 100 is a portable device, a user's gesture with respect to not only the front side of the display device 100 but also the rear side thereof may be acquired and corresponding functions may be performed accordingly.
The body of the display device 100 may be provided with a first camera 121a facing the front side and a second camera 121b facing the rear side.
Since the display device 100 can be controlled upon acquiring not only a gesture made in front of the display device 100 but also a gesture made at the rear of the display device 100, various operations can be performed according to the depth of a stereoscopic image.
As set forth herein, in the display device and the method of controlling the same, according to exemplary embodiments of the present invention, the presentation of an image can be controlled in response to a distance and an approach direction with respect to a stereoscopic image.
While the present invention has been shown and described in connection with the exemplary embodiments, it will be apparent to those skilled in the art that modifications and variations can be made without departing from the spirit and scope of the invention as defined by the appended claims.