This application claims the priority benefit of Taiwan application serial no. 99108929, filed Mar. 25, 2010. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
1. Field of the Invention
The invention generally relates to a sensing method and a system using the same, and more particularly, to a touch sensing method and a system using the same.
2. Description of Related Art
In this information era, reliance on electronic products is increasing day by day. Electronic products such as notebook computers, mobile phones, personal digital assistants (PDAs), and digital music players are indispensable in our daily lives. Each of the aforesaid electronic products has an input interface through which a user inputs his or her command, such that an internal system of each of the electronic products executes the command.
In order to provide a more intuitive operation mode, electronic product manufacturers have started to dispose touch input interfaces, such as touch pads and touch panels, on electronic products so that users can input instructions through these touch pads and touch panels. An existing touch input interface usually works by detecting a touching or sensing action between a finger (or a stylus) and the touch input interface, so that the electronic apparatus can define a touch gesture of the user according to the change of the coordinates of the touch point or the change of the number of touch points and perform a corresponding operation according to the touch gesture.
Accordingly, the invention is directed to a touch sensing method in which a touch gesture is defined according to an edge change of a region on a touch interface corresponding to an object, and a corresponding touch operation is performed according to the touch gesture.
The invention is directed to a touch sensing system in which a touch gesture is defined according to an edge change of a region on a touch interface corresponding to an object, and a corresponding touch operation is performed according to the touch gesture.
The invention provides a touch sensing method adapted to a touch sensing system, wherein the touch sensing system includes a touch interface. The touch sensing method includes the following steps. At least one edge change of at least one region on the touch interface corresponding to at least one object is sensed within a timing tolerance. A touch gesture corresponding to the object is defined according to the edge change.
According to an embodiment of the invention, the step of sensing the edge change includes the following steps. A first edge of the region in a direction is sensed at a first time within the timing tolerance. A second edge of the region in the direction is sensed at a second time within the timing tolerance. Whether a distance between the first edge and the second edge is greater than a change threshold is determined. The distance is defined as the edge change if the distance is greater than the change threshold, wherein the value of the edge change is the distance, and the direction of the edge change is the aforementioned direction.
According to an embodiment of the invention, the touch sensing method further includes the following steps. The touch gesture is defined as a moving gesture towards the aforementioned direction according to the direction of the edge change.
According to an embodiment of the invention, the step of sensing the edge change includes the following steps. A first edge change of the region corresponding to the object is sensed during a first period within the timing tolerance. A second edge change of the region corresponding to the object is sensed during a second period within the timing tolerance. The direction of the first edge change is a first direction, and the direction of the second edge change is a second direction.
According to an embodiment of the invention, the touch sensing method further includes the following steps. A rotation judgment sequence is performed according to an area change of the region. Whether an angle between the first direction and the second direction is greater than a first angle threshold is determined. The touch gesture is defined as a rotation gesture if the angle is greater than the first angle threshold.
According to an embodiment of the invention, the touch sensing method further includes the following steps. A flip judgment sequence is performed according to an area change of the region. Whether an angle between the first direction and the second direction is greater than a second angle threshold is determined. The touch gesture is defined as a flip gesture if the angle is greater than the second angle threshold.
According to an embodiment of the invention, the step of sensing the edge change includes the following steps. A third edge change of a first region corresponding to the object is sensed during a third period within the timing tolerance. A fourth edge change of a second region corresponding to the object is sensed during the third period within the timing tolerance.
According to an embodiment of the invention, the touch sensing method further includes the following steps. A zoom-in judgment sequence is performed according to area changes of the first region and the second region. Whether a distance between the first region and the second region is smaller than a first distance threshold is determined. The touch gesture is defined as a zoom-in gesture if the distance is smaller than the first distance threshold.
According to an embodiment of the invention, the touch sensing method further includes the following steps. A zoom-out judgment sequence is performed according to area changes of the first region and the second region. Whether a distance between the first region and the second region is greater than a second distance threshold is determined. The touch gesture is defined as a zoom-out gesture if the distance is greater than the second distance threshold.
According to an embodiment of the invention, the touch sensing method further includes performing a touch operation according to the touch gesture.
The invention provides a touch sensing system including a touch interface and a control unit. The touch interface senses at least one edge change of at least one region on the touch interface corresponding to at least one object within a timing tolerance. The control unit defines a touch gesture corresponding to the object according to the edge change.
According to an embodiment of the invention, when the touch interface senses the edge change, the touch interface senses a first edge of the region in a direction at a first time within the timing tolerance, and the touch interface senses a second edge of the region in the direction at a second time within the timing tolerance, and the control unit determines whether a distance between the first edge and the second edge is greater than a change threshold. The control unit defines the distance as the edge change if the distance is greater than the change threshold, wherein the value of the edge change is the distance, and the direction of the edge change is the aforementioned direction.
According to an embodiment of the invention, the control unit defines the touch gesture as a moving gesture towards the aforementioned direction according to the direction of the edge change.
According to an embodiment of the invention, when the touch interface senses the edge change, the touch interface senses a first edge change of the region corresponding to the object during a first period within the timing tolerance, and the touch interface senses a second edge change of the region corresponding to the object during a second period within the timing tolerance, wherein the direction of the first edge change is a first direction, and the direction of the second edge change is a second direction.
According to an embodiment of the invention, the control unit performs a rotation judgment sequence according to an area change of the region and determines whether an angle between the first direction and the second direction is greater than a first angle threshold. The control unit defines the touch gesture as a rotation gesture if the angle is greater than the first angle threshold.
According to an embodiment of the invention, the control unit performs a flip judgment sequence according to an area change of the region and determines whether an angle between the first direction and the second direction is greater than a second angle threshold. The control unit defines the touch gesture as a flip gesture if the angle is greater than the second angle threshold.
According to an embodiment of the invention, when the touch interface senses the edge change, the touch interface senses a third edge change of a first region corresponding to the object during a third period within the timing tolerance, and the touch interface senses a fourth edge change of a second region corresponding to the object during the third period within the timing tolerance.
According to an embodiment of the invention, the control unit performs a zoom-in judgment sequence according to area changes of the first region and the second region and determines whether a distance between the first region and the second region is smaller than a first distance threshold. The control unit defines the touch gesture as a zoom-in gesture if the distance is smaller than the first distance threshold.
According to an embodiment of the invention, the control unit performs a zoom-out judgment sequence according to area changes of the first region and the second region and determines whether a distance between the first region and the second region is greater than a second distance threshold. The control unit defines the touch gesture as a zoom-out gesture if the distance is greater than the second distance threshold.
According to an embodiment of the invention, the control unit performs a touch operation according to the touch gesture.
As described above, in a touch sensing system provided by an embodiment of the invention, a touch gesture is defined according to an edge change of a region on the touch interface corresponding to an object, and a corresponding operation is performed according to the touch gesture. Additionally, in a touch sensing method provided by an embodiment of the invention, whether a touch gesture is a moving gesture, a rotation gesture, a flip gesture, a zoom-in gesture, or a zoom-out gesture is determined according to any change of a touch region. Thereby, applications of the touch sensing technique are made more diversified.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
In the embodiments provided hereinafter, touch panels and users' fingers exemplarily act as touch interfaces and touching objects, while people having ordinary skill in the art are aware that the touch panels and the users' fingers do not pose a limitation on the touch interfaces and the touching objects of the invention, and any input interface capable of sensing touching objects does not depart from the protection scope of the invention.
Accordingly, at time t12, the distance between the region A1 and the region A0 in the +X-direction, for example, is ΔX1, and the distance between the region A1 and the region A0 in the +Y-direction, for example, is ΔY1. Namely, during the period t11-t12, the touch region between the finger 110 and the touch interface 120 changes from the region A0 to the region A1, the edge change in the +X-direction is ΔX1, and the edge change in the +Y-direction is ΔY1. Herein the edge change ΔX1, for example, is defined as the shortest distance between the edge tangents of the regions A0 and A1 in the +X-direction, and the edge change ΔY1, for example, is defined as the shortest distance between the edge tangents of the regions A0 and A1 in the +Y-direction.
It should be noted that in
Similarly, in
It should be noted that in the present embodiment, even though the touch region between the finger 110 and the touch interface 120 changes during the period t13-t14, the finger 110 remains in contact with the touch interface 120 and does not leave the surface of the touch interface 120 during the course of the change. Thus, the region A0 is substantially the same as the region A2, and the two regions are regions on the touch interface 120 touched by the same object at different times.
Additionally, in
In the following exemplary embodiments, when a round region or an elliptical region serves as an example of the touch region between the finger 110 and the touch interface 120 (for example, the one illustrated in
To be specific, referring to
In the present embodiment, when the touch interface 120 senses the edge change, the touch interface 120 senses a first edge of the touch region in a direction at a first time within the specific timing tolerance, and the touch interface 120 senses a second edge of the touch region in the direction at a second time within the specific timing tolerance.
Taking the +X-direction in
Next, when the touch region between the finger 110 and the touch interface 120 changes, the touch interface 120 senses an edge tangent LX1 of the region A1 on the touch interface 120 touched by the finger 110 in the +X-direction. Namely, if the finger 110 touches the region A1 on the touch interface 120 at time t12 (as shown in
After that, the control unit 130 determines whether the distance between the edge tangent LX0 and the edge tangent LX1 is greater than a change threshold. If the distance is greater than the change threshold, the control unit 130 defines the distance as the edge change ΔX1 in the +X-direction and continues to execute the touch sensing method in the present embodiment. In the present embodiment, the edge change ΔX1 is a vector, the value thereof is the distance between the edge tangent LX0 and the edge tangent LX1, and the direction thereof is the +X-direction.
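By way of illustration only, the threshold determination described above may be sketched in Python as follows. This is a sketch under assumptions, not the claimed implementation; the function and variable names are hypothetical and do not appear in the specification.

```python
# Illustrative sketch: deciding whether the displacement between two edge
# tangents sensed at the first and second times constitutes an edge change.
# first_edge / second_edge model the coordinates of the tangents LX0 / LX1
# along one axis (e.g., the X-axis); the threshold value is hypothetical.

def detect_edge_change(first_edge, second_edge, change_threshold):
    """Return the signed edge change along one axis, or None.

    The returned value models a vector: its magnitude is the distance
    between the two edge tangents, and its sign encodes the direction
    (+X or -X).
    """
    displacement = second_edge - first_edge
    if abs(displacement) > change_threshold:
        return displacement  # distance exceeds the change threshold
    return None  # below threshold: no edge change is defined
```

For example, with a hypothetical threshold of 2.0, a tangent moving from 10.0 to 13.5 yields an edge change of 3.5 in the positive direction, while a move of only 1.0 yields no edge change.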
Thus, when the control unit 130 determines the distance between the edge tangent LX0 and the edge tangent LX1 to be the edge change ΔX1 in the +X-direction, the control unit 130 defines a touch gesture corresponding to the finger 110 according to the edge change ΔX1. For example, if the touch region between the finger 110 and the touch interface 120 changes in the +X-direction as illustrated in
Similarly, when the touch region between the finger 110 and the touch interface 120 changes in the +Y-direction as illustrated in
It should be noted that in the embodiment illustrated in
Thus, in the present embodiment, the control unit 130 defines a touch gesture corresponding to the finger 110 according to the edge change of the region A0 and further defines the touch gesture as a moving gesture towards a specific direction, so as to perform a corresponding touch operation.
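The mapping from sensed edge changes to a moving gesture direction may be sketched as follows. This Python sketch is illustrative only; the names are hypothetical, and representing the moving direction as an angle in degrees is an assumption made for the example.

```python
import math

def define_moving_gesture(dx, dy):
    """Map edge changes along X and Y (either may be None when no edge
    change was sensed on that axis) to a moving-gesture direction in
    degrees, or None if no edge change occurred at all.

    0 degrees models the +X-direction, 90 degrees the +Y-direction.
    """
    if dx is None and dy is None:
        return None  # no edge change: no moving gesture is defined
    vx = dx if dx is not None else 0.0
    vy = dy if dy is not None else 0.0
    return math.degrees(math.atan2(vy, vx))
```

Under these assumptions, an edge change only in +X yields a moving gesture towards 0 degrees, and an edge change only in +Y yields one towards 90 degrees.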
Similarly, in the present embodiment, if the edge change of the touch region between the finger 110 and the touch interface 120 is as illustrated in
Additionally, in the present embodiment, the first edge refers to the edge tangent of the region on the touch interface 120 corresponding to the finger 110 in a specific direction when the finger 110 touches the touch interface 120 for the first time, and the second edge refers to another edge tangent of the region on the touch interface 120 corresponding to the finger 110 in the specific direction when the touch region between the finger 110 and the touch interface 120 changes.
For example, in
Similarly, in
It should be noted that in the embodiment described above, even though the touch region between the user's finger and the touch interface changes within the specific timing tolerance, the user's finger remains in contact with the touch interface and does not leave the surface of the touch interface during the course of the change. Thus, in the embodiment described above, unlike in conventional techniques, the touch sensing system defines a moving gesture according to an edge change of the touch region, so that the application of the moving gesture is made more diversified.
Referring to
Besides, the touch sensing method described in this embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in
To be specific, the edge change sensed by the touch interface 120 during the period t1-t2 may be in the +Y-direction. The edge change sensed by the touch interface 120 during the period t2-t3 may be in a specific direction between the directions of +Y and +X. The edge change sensed by the touch interface 120 during the period t3-t4 may be in the +X-direction. Herein the direction of the edge change sensed during the period t2-t3 is the direction of an edge change between the regions A (t3) and A (t1), and the direction of the edge change sensed during the period t3-t4 is the direction of an edge change between the regions B(t4) and B(t1).
Thus, the control unit 130 defines the touch gesture of the user's finger as a rotation gesture in the clockwise direction (as shown in
To be specific, in order to prevent the control unit 130 from defining the touch gesture as a moving gesture in the +Y-direction according to the direction of the edge change during the period t1-t2, the control unit 130 first determines whether an area change of the region A (t1) satisfies a condition of performing a rotation judgment sequence. The control unit 130 performs the rotation judgment sequence only if the area change of the region A (t1) satisfies the condition of performing the rotation judgment sequence. For example, if the area change of the region A (t1) tallies with the area changes of the regions A (t2) and A (t3) during the period t2-t3, the control unit 130 performs the rotation judgment sequence to define the touch gesture of the user's finger as a rotation gesture in the clockwise direction.
In addition, before defining the touch gesture of the user's finger as a rotation gesture in the clockwise direction, the control unit 130 first determines whether an angle between the directions of the edge changes at time t2 and time t4 is greater than a first angle threshold. If the angle is greater than the first angle threshold, the control unit 130 defines the touch gesture as a rotation gesture. For example, in the present embodiment, the angle between the directions of the edge changes at time t2 and time t3 is assumed to be the first angle threshold. If the angle between the directions of the edge changes at time t2 and time t4 is greater than the angle between the directions of the edge changes at time t2 and time t3, the control unit 130 defines the touch gesture as a rotation gesture. In the present embodiment, the directions of the edge changes at time t2 and time t4 are substantially perpendicular to each other. Thus, the control unit 130 defines the touch gesture as a rotation gesture.
In other words, the control unit 130 performs a rotation judgment sequence according to an area change of the touch region and determines whether the angle between the first direction (the direction of the edge change at time t2) and the second direction (the direction of edge change at time t4) is greater than a first angle threshold (the angle between the directions of the edge changes at time t2 and time t3). If the angle between the first direction and the second direction is greater than the first angle threshold, the control unit 130 defines the touch gesture as a rotation gesture. In the present embodiment, the first direction and the second direction are substantially perpendicular to each other.
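The angle comparison at the heart of the rotation judgment sequence may be sketched as follows. This Python sketch is illustrative only; it assumes the rotation judgment sequence has already been entered (i.e., the area-change condition is satisfied), and the direction vectors and threshold value are hypothetical.

```python
import math

def angle_between(d1, d2):
    """Angle in degrees between two 2-D direction vectors."""
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    norm = math.hypot(d1[0], d1[1]) * math.hypot(d2[0], d2[1])
    # Clamp to [-1, 1] to guard against floating-point drift.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def is_rotation_gesture(first_dir, second_dir, first_angle_threshold):
    """Define a rotation gesture when the angle between the first and
    second edge-change directions exceeds the first angle threshold."""
    return angle_between(first_dir, second_dir) > first_angle_threshold
```

For example, with a +Y first direction, a +X second direction (substantially perpendicular), and a hypothetical 45-degree threshold, a rotation gesture is defined; two parallel directions are not classified as a rotation.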
It should be noted that in
To be specific, the edge change sensed by the touch interface 120 during the period t1-t2 is in the +Y-direction. However, unlike in the embodiment illustrated in
Thus, the control unit 130 defines the touch gesture of the user's finger as a rotation gesture in the anticlockwise direction (as shown in
It should be noted that the value of the first angle threshold (i.e., the angle between the directions of the edge changes at time t2 and time t3) in the present embodiment may be the same as or different from that in the embodiment illustrated in
Referring to
Besides, the touch sensing method described in this embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in
To be specific, the edge change sensed by the touch interface 120 during the period t1-t2 may be in the −X-direction. The edge change sensed by the touch interface 120 during the period t2-t3 may be in the +Y-direction. Then, the edge change sensed by the touch interface 120 during the period t3-t4 may be in the +X-direction. Herein the direction of the edge change sensed during the period t2-t3 is the direction of an edge change between the regions B (t3) and B (t1), and the direction of the edge change sensed during the period t3-t4 is the direction of an edge change between the regions B (t4) and B (t1).
Thus, the control unit 130 defines the touch gesture of the user's finger as a flip gesture (as shown in
To be specific, in order to prevent the control unit 130 from defining the touch gesture as a moving gesture in the −X-direction according to the direction of the edge change during the period t1-t2, the control unit 130 first determines whether an area change of the region B (t1) satisfies a condition of performing the flip judgment sequence. The control unit 130 performs the flip judgment sequence only if the area change of the region B (t1) satisfies the condition of performing the flip judgment sequence. For example, if the area change of the region B (t1) tallies with the area changes of the regions B (t2) and B (t3) during the period t2-t3, the control unit 130 performs the flip judgment sequence to define the touch gesture of the user's finger as a flip gesture in the horizontal direction.
Additionally, before defining the touch gesture of the user's finger as a flip gesture in the horizontal direction, the control unit 130 first determines whether an angle between the directions of the edge changes sensed at time t2 and time t4 is greater than a second angle threshold. If the angle is greater than the second angle threshold, the control unit 130 defines the touch gesture as a flip gesture. For example, in the present embodiment, the angle between the directions of the edge changes sensed at time t2 and time t3 is assumed to be the second angle threshold. If the angle between the directions of the edge changes sensed at time t2 and time t4 is greater than the angle between the directions of the edge changes sensed at time t2 and time t3, the control unit 130 defines the touch gesture as a flip gesture. In the present embodiment, the directions of the edge changes sensed at time t2 and time t4 are substantially reverse to each other. Thus, the control unit 130 defines the touch gesture as a flip gesture.
In other words, the control unit 130 performs a flip judgment sequence according to the area change of the touch region and determines whether the angle between the first direction (the direction of the edge change sensed at time t2) and the second direction (the direction of the edge change sensed at time t4) is greater than the second angle threshold (the angle between the directions of the edge changes sensed at time t2 and time t3). If the angle between the first direction and the second direction is greater than the second angle threshold, the control unit 130 defines the touch gesture as a flip gesture. In the present embodiment, the first direction and the second direction are substantially reverse to each other.
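The flip judgment sequence differs from the rotation judgment mainly in that the two directions are substantially reverse, so the angle threshold is larger. A hedged Python sketch, again assuming the sequence has already been entered and using hypothetical names and threshold values:

```python
import math

def is_flip_gesture(first_dir, second_dir, second_angle_threshold):
    """Define a flip gesture when the two edge-change directions are
    (nearly) reverse, i.e., the angle between them exceeds the second
    angle threshold (larger than the rotation threshold)."""
    dot = first_dir[0] * second_dir[0] + first_dir[1] * second_dir[1]
    norm = math.hypot(first_dir[0], first_dir[1]) * \
        math.hypot(second_dir[0], second_dir[1])
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle > second_angle_threshold
```

With a +Y first direction, a -Y second direction (substantially reverse, 180 degrees apart), and a hypothetical 150-degree threshold, a flip gesture is defined; perpendicular directions fall below this threshold.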
It should be noted that in
To be specific, the edge change sensed by the touch interface 120 during the period t1-t2 may be in the +Y-direction. The edge change sensed by the touch interface 120 during the period t2-t3 may be in the −X-direction. The edge change sensed by the touch interface 120 during the period t3-t4 may be in the −Y-direction. Herein the direction of the edge change during the period t2-t3 is the direction of an edge change between the regions B′ (t3) and B′ (t1), and the direction of the edge change during the period t3-t4 is the direction of an edge change between the regions B′ (t4) and B′ (t1).
Thus, the control unit 130 defines the touch gesture of the user's finger as a flip gesture in the vertical direction (as shown in
It should be noted that the value of the second angle threshold (i.e., the angle between the directions of the edge changes sensed at time t2 and time t3) in the present embodiment may be the same as or different from that in the embodiment illustrated in
Referring to
Besides, the touch sensing method described in this embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in
It should be noted that in the embodiments illustrated in
Thus, the touch interface 120 senses the edge changes of the regions AL (t1) and AR (t1) over time within the specific timing tolerance Δt. The control unit 130 defines the touch gesture of the user's finger according to the edge changes over time of the regions AL (t1) and AR (t1). Herein the timings satisfy t1<t2.
To be specific, at time t1, the touch interface 120 senses the edge tangent of the region AL (t1) to be L, and at time t2, the touch interface 120 senses the edge tangent of the region AL (t2) to be L′. Thus, during the period t1-t2, the touch interface 120 senses the edge change of the region AL (t1) to be an edge change in the +X-direction. Similarly, at time t1, the touch interface 120 senses the edge tangent of the region AR (t1) to be R, and at time t2, the touch interface 120 senses the edge tangent of the region AR (t2) to be R′. Thus, during the period t1-t2, the touch interface 120 senses the edge change of the region AR (t1) to be an edge change in the −X-direction.
In other words, during the period t1-t2, the directions of the edge changes of the regions AL (t1) and AR (t1) substantially point to the same target (not shown). Thus, the control unit 130 defines the touch gesture of the user's finger as a zoom-in gesture according to the direction of the edge changes of the regions AL (t1) and AR (t1).
On the other hand, at time t1, the distance between the edge tangents L and R is Δd (t1), and at time t2, the distance between the edge tangents L′ and R′ is Δd (t2). Thus, if the distance Δd (t1) is greater than the distance Δd (t2), the control unit 130 defines the touch gesture of the user's finger as a zoom-in gesture according to the directions of the edge changes of the regions AL (t1) and AR (t1).
To be specific, in order to prevent the control unit 130 from defining the touch gesture as a moving gesture in the +X-direction according to the direction of the edge change of the region AL (t1) during the period t1-t2, the control unit 130 first determines whether area changes of the regions AL (t1) and AR (t1) satisfy a condition of performing a zoom-in judgment sequence. The control unit 130 performs the zoom-in judgment sequence only if the area changes of the regions AL (t1) and AR (t1) satisfy the condition of performing the zoom-in judgment sequence. For example, if the area changes of the regions AL (t1) and AR (t1) are as illustrated in
Additionally, before defining the touch gesture of the user's finger as the zoom-in gesture, the control unit 130 first determines whether the distance Δd (t2) between the edge tangents L′ and R′ at time t2 is smaller than a first distance threshold. If the distance Δd (t2) is smaller than the first distance threshold, the control unit 130 defines the touch gesture as a zoom-in gesture.
In other words, the control unit 130 performs a zoom-in judgment sequence according to area changes of a first region (i.e., the region AL (t1)) and a second region (i.e., the region AR (t1)) and determines whether the distance (i.e., the distance Δd (t2)) between the first region and the second region is smaller than a first distance threshold. If the distance between the first region and the second region is smaller than the first distance threshold, the control unit 130 defines the touch gesture as a zoom-in gesture. In the present embodiment, the directions of the edge changes of the regions AL (t1) and AR (t1) substantially point to the same target (not shown).
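The zoom-in judgment sequence above may be sketched as follows. This Python sketch is illustrative only; the left and right regions are reduced to the X coordinates of their facing edge tangents (L, L′, R, R′), and all names and threshold values are hypothetical.

```python
def is_zoom_in_gesture(l0, l1, r0, r1, first_distance_threshold):
    """Define a zoom-in gesture from two regions' edge tangents.

    l0, l1: X coordinates of the left region's edge tangent at times
    t1 and t2 (modeling L and L'); r0, r1: likewise for the right
    region (modeling R and R').

    The edge changes must point towards a common target (left edge
    moves in +X, right edge in -X), and the distance between the
    facing edge tangents at t2 must be smaller than the first
    distance threshold.
    """
    moving_together = (l1 - l0) > 0 and (r1 - r0) < 0
    return moving_together and (r1 - l1) < first_distance_threshold
```

For example, if the left tangent moves from 0.0 to 2.0 and the right tangent from 10.0 to 8.0, the gap shrinks to 6.0; with a hypothetical threshold of 7.0 a zoom-in gesture is defined, while a stricter threshold of 5.0 rejects it.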
Thus, the touch interface 120 senses the edge changes over time of the regions AL′ (t1) and AR′ (t1) within a specific timing tolerance Δt. The control unit 130 defines the touch gesture of the user's finger according to the edge changes over time of the regions AL′ (t1) and AR′ (t1). Herein the timings satisfy t1<t2.
In the present embodiment, the directions of the edge changes of the regions AL′ (t1) and AR′ (t1) substantially point away from the same target (not shown) during the period t1-t2. Thus, the control unit 130 defines the touch gesture of the user's finger as a zoom-out gesture according to the directions of the edge changes of the regions AL′ (t1) and AR′ (t1).
On the other hand, at time t1, the distance between the edge tangents of the regions AL′ (t1) and AR′ (t1) is Δd′ (t1), and at time t2, the distance between the edge tangents of the regions AL′ (t2) and AR′ (t2) is Δd′ (t2). Thus, if the distance Δd′ (t1) is smaller than the distance Δd′ (t2), the control unit 130 defines the touch gesture of the user's finger as a zoom-out gesture according to the directions of the edge changes of the regions AL′ (t1) and AR′ (t1).
Similarly, in order to prevent the control unit 130 from defining the touch gesture as a moving gesture in the −X-direction according to the direction of the edge change of the region AL′ (t1) during the period t1-t2, the control unit 130 first determines whether the area changes of the regions AL′ (t1) and AR′ (t1) satisfy a condition of performing a zoom-out judgment sequence. The control unit 130 performs the zoom-out judgment sequence only if the area changes of the regions AL′ (t1) and AR′ (t1) satisfy the condition of performing the zoom-out judgment sequence. For example, if the area changes of the regions AL′ (t1) and AR′ (t1) are as illustrated in
Additionally, before defining the touch gesture of the user's finger as the zoom-out gesture, the control unit 130 first determines whether the distance Δd′ (t2) between the edge tangents of the regions AL′ (t2) and AR′ (t2) at time t2 is greater than a second distance threshold. If the distance Δd′ (t2) is greater than the second distance threshold, the control unit 130 defines the touch gesture as a zoom-out gesture.
In other words, the control unit 130 performs a zoom-out judgment sequence according to the area changes of a first region (i.e., the region AL′ (t1)) and a second region (i.e., the region AR′ (t1)) and determines whether the distance (i.e., the distance Δd′ (t2)) between the first region and the second region is greater than a second distance threshold. If the distance between the first region and the second region is greater than the second distance threshold, the control unit 130 defines the touch gesture as a zoom-out gesture. In the present embodiment, the directions of the edge changes of the regions AL′ (t1) and AR′ (t1) substantially point away from the same target (not shown).
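The zoom-out judgment mirrors the zoom-in judgment with the directions and the distance comparison reversed. A hedged Python sketch under the same hypothetical modeling (regions reduced to the X coordinates of their facing edge tangents):

```python
def is_zoom_out_gesture(l0, l1, r0, r1, second_distance_threshold):
    """Define a zoom-out gesture from two regions' edge tangents.

    The edge changes must point away from a common target (left edge
    moves in -X, right edge in +X), and the distance between the
    facing edge tangents at t2 must be greater than the second
    distance threshold.
    """
    moving_apart = (l1 - l0) < 0 and (r1 - r0) > 0
    return moving_apart and (r1 - l1) > second_distance_threshold
```

For example, if the left tangent moves from 2.0 to 0.0 and the right tangent from 8.0 to 10.0, the gap grows to 10.0; with a hypothetical threshold of 9.0 a zoom-out gesture is defined.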
It should be noted that in
Referring to
Besides, the touch sensing method described in this embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in
In the embodiment described above, the touch regions between the finger 110 and the touch interface 120 are round regions or elliptical regions. However, it should be understood by those having ordinary knowledge in the art that the invention is not limited thereto, and the touch regions may be in any shape without departing from the scope of the invention.
Besides, the touch sensing method described in this embodiment of the invention is sufficiently taught, suggested, and embodied in the embodiments illustrated in
As described above, in a touch sensing system provided by an embodiment of the invention, a touch gesture is defined according to an edge change of a region on the touch interface corresponding to an object, and a corresponding touch operation is performed according to the touch gesture. Additionally, in a touch sensing method provided by an embodiment of the invention, whether a touch gesture is a moving gesture, a rotation gesture, a flip gesture, a zoom-in gesture, or a zoom-out gesture is determined according to any change of a touch region. Thereby, applications of the touch sensing technique are made more diversified. As to the users, the touch sensing method in embodiments of the invention offers a pseudo three-dimensional touch sensing mode, further diversifying the application of the touch sensing technique.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
99108929 | Mar. 25, 2010 | TW | national