Exemplary embodiments of the present disclosure relate to vehicle operating systems using image and/or gesture capture techniques, related methods and vehicles using such systems.
When operating a traditional vehicle, turning on or off an air-conditioning system, turning on or off a multimedia system, or the like, is typically achieved by directly touching buttons or screen selections. However, if a driver uses such operating methods while driving the vehicle, the operations can prove inconvenient, contribute to driver distraction, and have an overall adverse effect on safe operation of the vehicle.
Exemplary embodiments of the present disclosure may address at least some of the above-noted problems. For example, according to first aspects of the disclosure, vehicle operating systems, related vehicle operating methods and vehicles using such vehicle operating systems, may provide a relatively safe, reliable and convenient operating atmosphere, with reduced driver distraction and less focused attention requirements.
In embodiments, a vehicle operating system (for operating a vehicle including a driving seat for a vehicle driver and at least one passenger seat for passengers) is provided. The vehicle operating system may include one or more camera devices for capturing and/or “shooting” images of gestures, such as head, arm, hand and/or finger actions and the like, of the driver and/or images of gestures of a passenger. Embodiments may include a storage device for storing operating signals corresponding to gestures and/or indicia of a plurality of gestures, such as shape, motion, etc.
A processing device may be included that is configured, for example, to select one of the driver and the passengers as a gesture command operator, and to control the camera device so as to capture hand or other action images of the selected gesture command operator. This may include, for example, changing a direction or other parameter of the camera to better capture gestures of the selected gesture command operator. In embodiments, the processing device may be configured to convert captured images of gestures, such as hand actions, etc., into corresponding operating signals based at least in part on the operating signals stored in the storage device, and to send out the operating signals to associated execution devices, such as navigation, entertainment, climate control, or other vehicle systems.
According to further aspects of the disclosure, an electric vehicle including a vehicle operating system as described herein may also be provided.
According to further aspects of the disclosure, a vehicle operating method may include one or more of storing operating signals corresponding to gestures; selecting one of a driver and a passenger as a gesture command operator; shooting a gesture action image of the gesture command operator; converting the gesture action image into a corresponding operating signal according to the stored operating signals; and executing, by an execution device, the corresponding operation according to the operating signal.
According to further aspects of the disclosure, a gesture command operator can be automatically (or manually) selected from the driver or the passengers, and the gesture command operator can be automatically (or manually) switched from the driver to the passenger, e.g. when the driver needs to focus their attention, so that the vehicle can be operated according to the hand actions of the passenger. This may involve, for example, reorienting a camera, switching between different cameras, etc. Therefore, interference with the driver may be avoided, and driving safety improved. Meanwhile, vehicle components may be operated by gesture actions, avoiding direct button touch or screen selection, so that the operation is made more convenient, less intensely focused, and overall safety improved.
Additional features, advantages, and embodiments of the invention may be set forth or apparent from consideration of the following detailed description, drawings, and claims. Moreover, it is to be understood that both the foregoing summary of the invention and the following detailed description are exemplary and intended to provide further explanation without limiting the scope of the invention claimed. The detailed description and the specific examples, however, indicate only preferred embodiments of the invention. Various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the detailed description serve to explain the principles of the invention. No attempt is made to show structural details of the invention in more detail than may be necessary for a fundamental understanding of the invention and various ways in which it may be practiced. In the drawings:
Various example embodiments of the present disclosure will be described below with reference to the drawings constituting a part of the description. It should be understood that, although terms representing directions are used in the present disclosure, such as “front”, “rear”, “upper”, “lower”, “left”, “right”, and the like, for describing various exemplary structural parts and elements of the present disclosure, these terms are used herein only for the purpose of convenience of explanation and are determined based on the exemplary orientations shown in the drawings. Since the embodiments disclosed by the present disclosure can be arranged according to different directions, these terms representing directions are merely used for illustration and should not be regarded as limiting. Wherever possible, the same or similar reference marks used in the present disclosure refer to the same components.
Unless defined otherwise, all technical terms used herein have the same meanings as commonly understood by one of ordinary skill in the art to which the invention pertains. The embodiments of the invention and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and examples that are described and/or illustrated in the accompanying drawings and detailed in the following description. It should be noted that the features illustrated in the drawings are not necessarily drawn to scale, and features of one embodiment may be employed with other embodiments as the skilled artisan would recognize, even if not explicitly stated herein. Descriptions of well-known components and processing techniques may be omitted so as to not unnecessarily obscure the embodiments of the invention. The examples used herein are intended merely to facilitate an understanding of ways in which the invention may be practiced and to further enable those of skill in the art to practice the embodiments of the invention. Accordingly, the examples and embodiments herein should not be construed as limiting the scope of the invention, which is defined solely by the appended claims and applicable law. Moreover, it is noted that like reference numerals reference similar parts throughout the several views of the drawings.
The embodiments of a vehicle operating system of the present disclosure will be described below with reference to the accompanying drawings.
When the vehicle is running, the processing device 202 may select one of the driver or the passengers as a gesture command operator, who has the right to operate the above execution devices through gestures such as hand actions, etc. That is to say, the gesture command operator can be switched between the driver and the passenger(s). In some examples, the system may allow a user to set a default for the gesture command operator, the system may respond to specific commands that set the gesture command operator, and/or the system can infer the gesture command operator based on a number of inputs. One or more camera device(s) 101 may be arranged in the cabin for shooting images of gestures from the gesture command operator and transmitting the gesture images to the processing device 202.
Operating signals corresponding to various gestures may be pre-stored in the storage device 203. The processing device 202 may convert the gesture action images captured by camera(s) 101 into corresponding operating signals according to the operating signals corresponding to the hand actions stored in the storage device 203, and send the operating signals to the corresponding execution devices. The execution devices may then execute the corresponding operations according to the operating signals.
In some examples, gesture action images from multiple camera devices 101 may be interpreted to determine the corresponding operating signals. For example, a hand moving toward one camera may result in images that differ only slightly from frame to frame, whereas images from a camera pointing perpendicular to the direction of motion may show significant, easily discernible changes. By evaluating images from multiple cameras, the system may be able to better judge 3-dimensional gestures, more easily detect movement from desired angles, etc.
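Purely by way of illustration, the following minimal Python sketch shows one way images from multiple cameras might be compared; the camera identifiers, tracked coordinates and motion threshold are hypothetical rather than taken from the embodiments, and a real system would operate on hand positions supplied by an upstream tracker.

```python
import math

def apparent_motion(track):
    """Total 2-D displacement (in pixels) of a tracked hand across frames."""
    return sum(math.dist(track[i], track[i + 1]) for i in range(len(track) - 1))

def pick_best_view(tracks_by_camera, min_pixels=15.0):
    """Return the camera whose view shows the most discernible motion.

    tracks_by_camera maps a camera id to a list of (x, y) hand positions,
    one per frame. A camera looking along the motion direction sees little
    displacement; a camera looking across it sees a large displacement.
    """
    best = max(tracks_by_camera, key=lambda cam: apparent_motion(tracks_by_camera[cam]))
    if apparent_motion(tracks_by_camera[best]) < min_pixels:
        return None  # motion too small to judge reliably from any view
    return best

# Example: camera "front" sees almost no motion, camera "side" sees a clear sweep.
tracks = {
    "front": [(320, 240), (322, 241), (323, 241)],
    "side": [(100, 240), (160, 238), (230, 239)],
}
print(pick_best_view(tracks))  # -> "side"
```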
According to methods described herein, one of the driver or the passengers can be selected as the gesture command operator, and when the driver needs to focus their attention to driving, the right of operating the execution devices can be transferred to the passenger, so as to avoid distraction of the driver and improve vehicle driving safety.
An exemplary structure for a camera device 101 is shown in the accompanying drawings.
Generally, systems may be set such that the passenger sitting in the front passenger seat 103 (“front passenger” hereinafter) acts as the passenger who can become the gesture command operator. When the front passenger acts as the passenger who can become the gesture command operator, only one camera device 101 need be arranged in front of the driving seat 102. The front of the driving seat 102 herein is not limited to the area directly in front of the driving seat 102; it may also be, for example, the area between the driving seat 102 and the front passenger seat 103, as shown in the accompanying drawings.
In some embodiments, the system may also allow the passenger sitting in the rear passenger seat 104 or the rear passenger seat 105 to act as the gesture command operator. In such cases, another camera device may be arranged in addition to the camera device 101. Such additional camera(s) may be arranged, for example, at the upper part of the front part of the vehicle cabin (the front part of the cabin herein refers to the part of the cabin ahead of the front passenger seat 103 and the driving seat 102, and the upper part refers to the part higher than the seats in the cabin), in a door frame between the front and rear seats, in a roof of the vehicle, etc. When the gesture command operator is the driver, the camera device 101 in front of the driving seat 102 may be used, and when the gesture command operator is switched to the passenger in the rear passenger seat 104 or the rear passenger seat 105, the active camera device may be switched to another camera better positioned to capture gestures of the rear passenger(s) (e.g. if the camera device 101 in front of the driving seat 102 cannot clearly and completely capture hand actions or other gestures of the passenger in the rear passenger seat 104 or the rear passenger seat 105).
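As a simple, hypothetical sketch of this camera switching (the seat names and camera identifiers below are illustrative only, not part of the disclosed embodiments), the processing device might keep a table mapping the current gesture command operator to the camera best positioned to observe that operator:

```python
# Which camera best covers each possible gesture command operator. The seat
# names and camera ids here are illustrative, not taken from the disclosure.
CAMERA_FOR_OPERATOR = {
    "driver": "cam_front_console",
    "front_passenger": "cam_front_console",
    "rear_left": "cam_cabin_front_upper",
    "rear_right": "cam_cabin_front_upper",
}

def select_active_camera(operator_seat):
    """Return the camera to activate for the current gesture command operator."""
    try:
        return CAMERA_FOR_OPERATOR[operator_seat]
    except KeyError:
        raise ValueError(f"no camera configured for seat {operator_seat!r}")

print(select_active_camera("rear_left"))  # -> "cam_cabin_front_upper"
```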
As shown in 401, before a vehicle is driven, operating signals corresponding to various gestures, such as hand actions, may be stored, e.g. in the storage device 203. This may also include storing any number of indicia used to determine whether a particular gesture is being performed based on captured gesture action images. For example, various shapes, motions, or other parameters may be stored, and used by the command system to perform various gesture recognition techniques based on captured images. This step may be preset in the vehicle by a manufacturer before the vehicle leaves the factory, or set by a user through a bootstrap program after the vehicle is sold and before first (or subsequent) driving. In some examples, the user may be allowed to associate specific gestures with certain operating signals, e.g. by entering a “record mode” and/or by selecting from a menu of preconfigured gestures that the system can recognize.
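The following sketch illustrates, under assumed names and a deliberately simplified representation of gesture indicia, how such a store of gestures and operating signals might be organized, including a user-defined association added through a “record mode”; it is a sketch of one possible arrangement, not the disclosed storage format.

```python
from dataclasses import dataclass, field

@dataclass
class GestureEntry:
    """One stored gesture and the operating signal it maps to.

    `indicia` stands in for whatever shape/motion features the recognizer
    stores for the gesture; the exact representation is not specified here.
    """
    name: str
    indicia: dict
    operating_signal: str  # e.g. "AC_POWER_ON" (illustrative name)

@dataclass
class GestureStore:
    entries: list = field(default_factory=list)

    def preload_factory_defaults(self):
        """Gestures preset by the manufacturer before the vehicle leaves the factory."""
        self.entries.append(GestureEntry("numeral_2", {"shape": "two_fingers"}, "SELECT_AC"))
        self.entries.append(GestureEntry("index_circle_cw", {"motion": "cw_circle"}, "RAISE_PARAMETER"))

    def record_user_gesture(self, name, indicia, operating_signal):
        """User associates a new gesture with an operating signal ('record mode')."""
        self.entries.append(GestureEntry(name, indicia, operating_signal))

store = GestureStore()
store.preload_factory_defaults()
store.record_user_gesture("palm_wave", {"motion": "wave"}, "TOGGLE_INTERIOR_LIGHT")
```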
The processing device 202 selects one of the driver and the passengers as the gesture command operator. In this embodiment, as in 402, each time the vehicle is started, the processing device 202 selects the driver as the gesture command operator by default, and controls the camera device 101 to be directed at the driver, so as to capture the hand actions or other pre-defined gestures of the driver. Then, the processing device 202 determines whether to switch the gesture command operator, e.g. according to whether a switch or other specified command is received.
Proceeding with 403 to 406, a subroutine for switching the gesture command operator is shown. The subroutine depicted in 403 to 406 describes not only how the gesture command operator may be switched to a passenger when the current gesture command operator is the driver, but also how the gesture command operator may be switched to the driver when the current gesture command operator is a passenger.
In 403, a processing device (such as 202) may determine whether a switch instruction is received, and if so, 404 to 406 are carried out to switch the gesture command operator. Otherwise, the flow may proceed with 407 to continue capturing the gesture actions of the current gesture command operator. In some examples, the switch instruction can be given by pressing a button. In other embodiments, the switch instruction may (additionally or alternatively) be a hand action of the current gesture command operator. For example, when the hand action of the current gesture command operator is that the five fingers are spread with the palm roughly perpendicular to the forearm, then a fist is made, then the five fingers are spread again and the palm rocks lightly leftwards and rightwards, this may be regarded as the gesture command operator sending out a switch instruction. Of course, any number of other gestures are possible, including gestures using various finger, hand, arm, head, and torso motions, etc. In some examples, multiple frames from the camera(s) may be stored, for example, so that earlier gesture motions can be recognized after a particular part of an action is recognized.
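For illustration only, the sketch below shows one way such a multi-pose switch gesture might be detected, assuming an upstream classifier that labels each frame with a hand pose such as "spread", "fist" or "rock"; the pose names and restart behavior are assumptions rather than details of the disclosure.

```python
SWITCH_SEQUENCE = ("spread", "fist", "spread", "rock")

class SwitchGestureDetector:
    """Tracks progress through the ordered pose sequence frame by frame."""

    def __init__(self):
        self._progress = 0

    def feed(self, pose):
        """Feed one classified pose per frame; return True once the whole
        switch sequence has been observed."""
        if pose == SWITCH_SEQUENCE[self._progress]:
            self._progress += 1
        elif self._progress and pose == SWITCH_SEQUENCE[self._progress - 1]:
            pass                        # same pose held over several frames
        elif pose == SWITCH_SEQUENCE[0]:
            self._progress = 1          # allow the sequence to restart
        else:
            self._progress = 0
        if self._progress == len(SWITCH_SEQUENCE):
            self._progress = 0
            return True
        return False

detector = SwitchGestureDetector()
for frame_pose in ["spread", "fist", "spread", "rock"]:
    if detector.feed(frame_pose):
        print("switch instruction recognized")
```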
When the switch instruction is received, the flow may proceed with 404, to judge whether the gesture command operator is the driver, and if so, the flow may continue with 406 to switch the gesture command operator to a passenger. Otherwise, the gesture command operator is a passenger, and the flow may proceed with 405 to switch the gesture command operator to the driver. In some examples, each time the vehicle is started, the processing device 202 may select the driver as the gesture command operator by default, and during the first switching the processing device 202 can switch the gesture command operator from the driver to a passenger, and during the second switching, the processing device 202 can switch the gesture command operator from the passenger to the driver.
If no switch instruction is received in 403, or after switching operations 405/406 are complete, the flow may continue with 407 to 409, in which the gesture(s) (such as hand actions) shot by the camera device 101 are converted into the corresponding operating signals, e.g. by the processing device 202.
In some examples, a checking function such as shown in 407 and 408 may be adopted, e.g. to prevent interference with the operation from people other than the gesture command operator or from other factors. In this case, a captured hand action may be checked to determine whether the hand action comes from the gesture command operator. Of course, other gestures may also be checked using similar techniques.
As shown in 407, the camera device 101 not only shoots the relevant hand actions, but also shoots an image extending from a body locating point of the gesture command operator to his or her hand. The body locating point discussed herein may be, for example, the head or the chest, i.e. the image of the body part from the head or the chest of the gesture command operator to the hand may be completely captured. Next, in 408, the processing device 202 judges whether the hand image and the body locating point image are continuous within the shot image, namely whether the hand in the image extends continuously to the body locating point so as to form part of the operator's body. If so, it is indicated that the shot hand image comes from the gesture command operator; otherwise, the shot hand image may come from interference by people other than the gesture command operator or from other factors, and step 407 is carried out again to re-capture an image. To improve the accuracy of this judgment, the body locating point may be located at least above the upper arm of the arm making the action.
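One simplified way to perform such an image-continuity check, assuming an upstream segmentation step that yields a binary foreground mask of the cabin occupants (the mask, pixel coordinates and tiny grid below are purely illustrative), is to verify that the hand location and the body locating point fall within the same connected foreground region:

```python
from collections import deque

def same_connected_region(mask, hand_px, body_px):
    """Check whether the hand pixel and the body locating point pixel belong
    to the same connected foreground region of a binary person mask.

    mask is a list of rows of 0/1; hand_px and body_px are (row, col) tuples.
    This stands in for the continuity test: if the hand blob does not connect
    to the operator's head/chest blob, the hand likely belongs to someone else
    and the command can be ignored.
    """
    rows, cols = len(mask), len(mask[0])
    if not (mask[hand_px[0]][hand_px[1]] and mask[body_px[0]][body_px[1]]):
        return False
    seen = {hand_px}
    queue = deque([hand_px])
    while queue:
        r, c = queue.popleft()
        if (r, c) == body_px:
            return True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and mask[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return False

# Tiny example: the hand at (0, 4) connects to the head/chest at (2, 0) through the arm.
mask = [
    [0, 0, 1, 1, 1],   # forearm and hand
    [1, 1, 1, 0, 0],   # upper arm / shoulder
    [1, 1, 0, 0, 0],   # head/chest area
]
print(same_connected_region(mask, hand_px=(0, 4), body_px=(2, 0)))  # -> True
```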
After the hand action (or other gesture) is checked, the flow may continue with 409, in which the hand action may be converted into the corresponding operating signal. Since the operating signals corresponding to various gestures have been stored in the storage device 203 in 401, the processing device 202 converts the gesture action image into the corresponding operating signal, e.g. based on comparisons between gesture image information associated with the stored operating signals and the captured gesture action image(s). This may include various types of pattern, anatomical, and/or motion recognition techniques.
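By way of a hedged illustration, the conversion step might be sketched as a lookup that compares observed gesture indicia against the stored entries; the indicia keys and signal names below are hypothetical, and a practical recognizer would rely on trained shape/motion classifiers rather than exact key matching.

```python
def match_operating_signal(observed_indicia, stored_entries):
    """Return the operating signal whose stored indicia best match the
    observed gesture, or None if nothing matches well enough.

    Each entry is (indicia_dict, operating_signal). Matching here is a naive
    count of agreeing key/value pairs, requiring all stored indicia to agree.
    """
    def score(entry_indicia):
        return sum(1 for k, v in entry_indicia.items() if observed_indicia.get(k) == v)

    best_signal, best_score = None, 0
    for indicia, signal in stored_entries:
        s = score(indicia)
        if s > best_score and s == len(indicia):
            best_signal, best_score = signal, s
    return best_signal

stored = [
    ({"shape": "two_fingers"}, "SELECT_AC"),
    ({"motion": "cw_circle"}, "RAISE_PARAMETER"),
]
observed = {"shape": "two_fingers", "hand": "right"}
print(match_operating_signal(observed, stored))  # -> "SELECT_AC"
```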
In an exemplary embodiment, each operating signal may correspond to a two-step hand action, the first step of the hand action being to select an execution device and the second step of the hand action being to instruct the selected execution device to carry out a specific operation. The first step of the hand action for selecting an execution device may be an Arabic numeral indicated by a gesture, e.g. the Arabic numeral 1 indicated by a gesture expresses selecting the center console 205, the Arabic numeral 2 indicated by a gesture expresses selecting the air-conditioning system 206, and the like. The second step of the hand action for instructing the selected execution device to carry out a specific operation may include an action for instructing the execution device to be turned on or turned off (e.g. the index finger touches the thumb in a ring shape and the other three fingers are upright to express turn-on), an action for instructing raising or lowering parameters (temperature, volume, etc.) of the execution device (e.g. the index finger rotates clockwise or anticlockwise), an action for instructing switching between options of the execution device (e.g. the five fingers point upward and move up to represent moving to the previous option, the five fingers point downward and move down to represent moving to the next option), and the like. Taking the operation of raising the temperature of the air-conditioner as an example, the hand action corresponding to the operating signal may include: selecting the air-conditioning system 206 through the Arabic numeral 2 indicated by a gesture, and then raising the temperature through the gesture expressing raising the temperature. Any number of other gestures and combinations of gestures are possible for identifying the execution device and function, as are other command techniques, such as function-driven commands that cause execution by multiple devices.
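A minimal sketch of such a two-step command, with illustrative (not disclosed) numeral-to-device and gesture-to-operation tables, might combine the two recognized gestures into a single operating signal as follows:

```python
DEVICE_BY_NUMERAL = {1: "center_console", 2: "air_conditioning"}   # illustrative
OPERATION_BY_GESTURE = {                                            # illustrative
    "ring_ok": "turn_on",
    "index_cw_circle": "raise_parameter",
    "index_ccw_circle": "lower_parameter",
    "five_up_move_up": "previous_option",
    "five_down_move_down": "next_option",
}

def parse_two_step_command(first_gesture_numeral, second_gesture):
    """Combine the device-selection gesture (an indicated numeral) with the
    operation gesture into one operating signal, e.g. numeral 2 followed by a
    clockwise index-finger circle -> raise the air-conditioner temperature."""
    device = DEVICE_BY_NUMERAL.get(first_gesture_numeral)
    operation = OPERATION_BY_GESTURE.get(second_gesture)
    if device is None or operation is None:
        return None
    return {"device": device, "operation": operation}

print(parse_two_step_command(2, "index_cw_circle"))
# -> {'device': 'air_conditioning', 'operation': 'raise_parameter'}
```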
In some examples, the system may transition between different modes, in which similar gestures may be associated with different execution commands. For example, a first mode may be an audio control mode, in which up and down hand gestures are associated with volume control, whereas a second mode may be an environmental control mode in which the same up and down hand gestures are associated with temperature control. Different mode indicators associated with the current gesture command mode may be displayed on a central console or the like, or projected on a windshield or other transparent surface, to inform the user of the current gesture command mode; additionally or alternatively, audible cues may be provided that inform the user of the current mode without requiring them to focus on a display.
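The mode-dependent interpretation could be sketched roughly as a per-mode binding table, where the mode names, gesture labels and command names below are assumptions for illustration only:

```python
# One gesture maps to different commands depending on the active mode.
MODE_BINDINGS = {
    "audio": {"swipe_up": "VOLUME_UP", "swipe_down": "VOLUME_DOWN"},
    "climate": {"swipe_up": "TEMP_UP", "swipe_down": "TEMP_DOWN"},
}

class ModalGestureMapper:
    def __init__(self, mode="audio"):
        self.mode = mode

    def set_mode(self, mode):
        if mode not in MODE_BINDINGS:
            raise ValueError(f"unknown mode {mode!r}")
        self.mode = mode
        # A mode indicator could be shown on the console, projected on the
        # windshield, or announced audibly at this point.

    def map(self, gesture):
        return MODE_BINDINGS[self.mode].get(gesture)

mapper = ModalGestureMapper()
print(mapper.map("swipe_up"))   # -> "VOLUME_UP"
mapper.set_mode("climate")
print(mapper.map("swipe_up"))   # -> "TEMP_UP"
```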
After the hand action is converted into the corresponding operating signal, the flow may proceed with 410, in which a processing device (e.g. 202) may determine the execution device corresponding to the operating signal, and send the operating signal to the corresponding execution device. This may be done in various ways, including sending commands to integrated devices via wiring in the vehicle, sending wireless commands to Bluetooth or other peripheral devices, etc. In some examples, commands may also be sent to execution devices outside of the vehicle, such as automatic garage doors, home door locks, home lighting, etc.
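As a rough, non-authoritative sketch of this dispatch step (the device names and transport functions are placeholders and do not drive any real vehicle bus or radio), the processing device might route each operating signal through a registered transport for its target execution device:

```python
class Dispatcher:
    """Routes an operating signal to whichever transport reaches its target
    device: in-vehicle wiring, a Bluetooth peripheral, or a remote endpoint."""

    def __init__(self):
        self._routes = {}

    def register(self, device, send_fn):
        self._routes[device] = send_fn

    def dispatch(self, device, operating_signal):
        send = self._routes.get(device)
        if send is None:
            raise KeyError(f"no route to execution device {device!r}")
        send(operating_signal)

def send_over_vehicle_bus(signal):
    print(f"[bus] {signal}")          # placeholder for wired, integrated devices

def send_over_bluetooth(signal):
    print(f"[bluetooth] {signal}")    # placeholder for wireless or external devices

dispatcher = Dispatcher()
dispatcher.register("air_conditioning", send_over_vehicle_bus)
dispatcher.register("garage_door", send_over_bluetooth)
dispatcher.dispatch("air_conditioning", {"operation": "raise_parameter"})
```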
The flow may proceed with 411, in which the corresponding execution device(s) receive the operating signal and execute the corresponding operation according to the operating signal. As mentioned above, the execution devices may include at least one of a center console 205, a multimedia system 207, a door switch 210, a refill opening switch 208, Bluetooth equipment 209, an interior light switch 211, an air-conditioning system 206, as well as any other onboard, peripheral or remote device with which the control system is configured to communicate.
Taking the action for moving a touch screen operating interface 502 in the center console 205 as an example, a process of converting a hand gesture action into the corresponding operating signal, sending the operating signal to the execution device and executing the corresponding operation by the execution device, is described below.
As shown in the accompanying drawings, the touch screen operating interface 502 of the center console 205 in this embodiment can move on the touch screen 501. The touch screen 501 extends at least from the area in front of the space between the driving seat 102 and the front passenger seat 103 to the area in front of the front passenger seat 103, so when the driver needs to use the center console 205, the touch screen operating interface 502 can be moved to a part of the touch screen 501 closer to the driver.
In some examples, the movement of the touch screen operating interface 502 on the touch screen 501 may be based on a gesture action of the gesture command operator, as described herein, and/or it may be automatically initiated based on switching the gesture command operator.
In one example, the gesture command operator makes a gesture indicating the Arabic numeral 1 and then makes an action in which the five fingers are spread and the palm slides rightwards. The processing device 202 converts the gesture and the action into an operating signal, the operation corresponding to the operating signal being that the touch screen operating interface 502 on the center console 205 moves rightwards on the touch screen 501. The processing device 202 sends the operating signal to the center console 205, and finally the center console 205 receives the operating signal and moves the touch screen operating interface 502 rightwards on the touch screen 501.
In some examples, vehicle command systems such as described herein may also be configured to permit certain operating signals that are normally disabled during operation of the vehicle, provided they are initiated via a gesture command. For example, certain Bluetooth phone commands may be disabled when a vehicle is in motion, but a gesture command may be allowed to initiate the command, thereby allowing the driver or passenger to utilize the function in a less distracting way than conventional methods. Likewise, certain functions may be disabled when the driver is the gesture command operator, and certain functions may be enabled when a passenger is the gesture command operator (or vice versa). One example of this might be allowing the driver to easily adjust the climate control temperature on their side of the vehicle using gesture commands when they are the gesture command operator, and reconfiguring the command system to allow the passenger to control the climate control temperature on their side of the vehicle using gesture commands when they are the gesture command operator. It should further be appreciated that, in some instances, multiple cameras, or cameras with wide viewing angles, may allow multiple users to act as gesture command operators simultaneously. For example, in the case of non-contradictory commands (such as commands addressing separate climate control regions), different cameras may be configured to receive simultaneous commands from the driver and the passenger, and the command system may implement both sets of commands. In cases with contradictory commands, the system may also be configured to apply a hierarchy, e.g. whereby the driver's gesture commands are given precedence.
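A simplified sketch of such operator-dependent permissions and conflict resolution, using invented signal names and an assumed driver-first precedence policy, might look like the following:

```python
# Which operating signals each operator role may issue, and which operator
# wins when two simultaneous commands contradict each other. The specific
# signal names and policy are illustrative only.
ALLOWED = {
    "driver": {"CLIMATE_DRIVER_SIDE", "NAV_ZOOM", "BT_CALL_ANSWER"},
    "passenger": {"CLIMATE_PASSENGER_SIDE", "MEDIA_NEXT_TRACK"},
}
PRECEDENCE = ["driver", "passenger"]   # driver commands win on conflict

def resolve(commands):
    """commands: list of (operator, signal, target). Drop signals the operator
    is not allowed to issue, then keep only the highest-precedence command for
    each target when several operators address the same target."""
    permitted = [c for c in commands if c[1] in ALLOWED.get(c[0], set())]
    chosen = {}
    for operator, signal, target in permitted:
        current = chosen.get(target)
        if current is None or PRECEDENCE.index(operator) < PRECEDENCE.index(current[0]):
            chosen[target] = (operator, signal)
    return chosen

simultaneous = [
    ("driver", "CLIMATE_DRIVER_SIDE", "climate_left"),
    ("passenger", "CLIMATE_PASSENGER_SIDE", "climate_right"),
    ("passenger", "NAV_ZOOM", "navigation"),   # not permitted for a passenger
]
print(resolve(simultaneous))
```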
The present disclosure further provides a vehicle (such as an electric automobile) using a vehicle operating system as described herein, with other parts of the vehicle using the framework of the existing vehicle. The vehicle operating system may be substantially the same as described herein, and is therefore not redundantly described.
Although the present disclosure has been described with reference to the specific embodiments shown in the drawings, it should be understood that the vehicle operating systems and methods provided by the present disclosure can have a variety of variations without departing from the spirit, scope and background of the present disclosure. The description given above is merely illustrative and is not meant to be an exhaustive list of all possible embodiments, applications or modifications of the invention. Those of ordinary skill in the art should also be aware that parameters in the embodiments disclosed by the present disclosure can be changed in different manners, and these changes shall fall within the spirit and scope of the present disclosure and the claims. Thus, various modifications and variations of the described methods and systems of the invention will be apparent to those skilled in the art without departing from the scope and spirit of the invention.
The present application is a continuation of U.S. Nonprovisional Application No. 14/967,368, filed Dec. 14, 2015, which is a continuation of U.S. Nonprovisional Application No. 14/883,621, filed Oct. 15, 2015, now U.S. Pat. No. 9,547,373, issued Jan. 17, 2017, which claims the benefit of U.S. Provisional Application No. 62/150,848, filed Apr. 22, 2015, and U.S. Provisional Application No. 62/133,991, filed Mar. 16, 2015, the disclosures of which are hereby incorporated by reference in their entirety for all purposes.