This application is the national phase entry of International Application No. PCT/CN2021/075521, filed on Feb. 5, 2021, the entire contents of which are incorporated herein by reference.
The present disclosure relates to the field of medical instrument technology, and more particularly, to a soft-bodied apparatus and a method for opening an eyelid.
To prevent postoperative infection, the eyes need to be cleaned and disinfected before an ophthalmologic operation. In routine preoperative eye cleaning, the upper and lower eyelids must first be opened to fully expose the lacrimal passages, the conjunctival sac and other parts for cleaning. At present, medical staff in clinical practice mainly use their fingers to open patients' eyelids so as to clean the inner parts of the eyes.
In the existing technologies, there are some auxiliary instruments and equipment for eye cleaning. For example, patent application CN201710253440.1 discloses an eye opening device for opening an upper eyelid and a lower eyelid; an operator can use this device to open his or her own upper and lower eyelids or those of others. Patent application CN201780043837.6 discloses a device for cleaning an eye, which specifically includes a container and a locator. Patent application CN201711199743.6 discloses a comfortable ophthalmic eye washer. However, all of these auxiliary instruments and equipment still require the operator to open the upper and lower eyelids directly with fingers or hand-held instruments; there is no technical solution for automatically opening the eyelids.
An objective of the present disclosure is to overcome the above-mentioned defects of the existing technologies by providing a soft-bodied apparatus and a method for opening an eyelid, which recognize and locate a patient's eyelids in real time using visual sensors, and then automatically open the eyelids with a robot based on the positioning information.
According to a first aspect of the present disclosure, there is provided a soft-bodied apparatus for opening an eyelid. The apparatus includes a head support module, a real-time eyelid positioning module, a robot end-effector real-time positioning module, and an automatic eyelid opening operation module. The head support module is configured to support a head of a user. The real-time eyelid positioning module is configured to recognize and locate poses of upper and lower eyelids of the user in real time. The robot end-effector real-time positioning module is configured to reconstruct a real-time shape and a pose of a robot soft-bodied end-effector. The automatic eyelid opening operation module includes a robot body and a robot control system. The robot body is provided with a multi-axis rigid body mechanical arm and a soft-bodied end-effector. The robot control system takes the real-time poses of the upper and lower eyelids of the user as a motion target, and takes the real-time shape and pose of the soft-bodied end-effector as feedback information, to control motion of the robot body to complete an action of automatically opening the eyelid of the user.
According to a second aspect of the present disclosure, there is provided a method for opening an eyelid using the above apparatus. The method includes: supporting a head of a user; recognizing and locating poses of upper and lower eyelids of the user in real time; reconstructing a real-time shape and a pose of a robot soft-bodied end-effector; and controlling motion of a robot body, with the real-time poses of the upper and lower eyelids as a motion target and the real-time shape and pose of the soft-bodied end-effector as feedback information, to complete an action of automatically opening the eyelid of the user.
Compared with the existing technologies and with manual operation, the present disclosure can open patients' eyelids quickly and accurately, effectively reduce the burden on ophthalmologists and the cost of treatment, improve the efficiency of ophthalmological treatment, and reduce the probability of cross infection for both doctors and patients.
Other features and advantages of the present disclosure will become apparent from the following detailed description of exemplary embodiments of the present disclosure with reference to accompanying drawings.
The accompanying drawings herein are incorporated in and constitute a part of this specification, illustrate embodiments of the present disclosure and, together with the specification, serve to explain the principles of the present disclosure.
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It is to be noted that the relative arrangement, numerical expressions, and numerical values of the components and steps set forth in these embodiments do not limit the scope of the present disclosure unless otherwise specifically stated.
The following description of at least one exemplary embodiment is actually merely illustrative, and in no way serves as any limitation on the present disclosure and application or use thereof.
Technologies, methods and equipment known to those of ordinary skill in the related art may not be discussed in detail, but where appropriate, the technologies, methods and equipment should be considered as part of the specification.
In all examples shown and discussed herein, any specific values should be interpreted as merely exemplary and not limiting. Therefore, other exemplary embodiments may have different values.
It is to be noted that similar reference numerals and letters indicate similar items in the following accompanying drawings. Therefore, once an item is defined in one drawing, there is no need to discuss this item further in subsequent drawings.
With reference to the accompanying drawings, the soft-bodied apparatus for opening an eyelid according to an embodiment of the present disclosure includes a head support module 110, a real-time eyelid positioning module 120, a robot end-effector real-time positioning module 130, and an automatic eyelid opening operation module 140.
The head support module 110 is configured to support a head of a patient to prevent the head of the patient from substantially moving during eye washing.
The real-time eyelid positioning module 120 is configured to recognize and locate the poses of the upper and lower eyelids of the patient in real time using multiocular visual information.
The robot end-effector real-time positioning module 130 is configured to reconstruct a shape and a pose of a soft-bodied robot end-effector in real time using the multiocular visual information.
The automatic eyelid opening operation module 140 includes a robot body and a robot control system. The robot body includes a multi-axis rigid body mechanical arm and a soft-bodied end-effector. The robot control system takes the real-time pose of the eyelid of the patient as a motion target, and takes the real-time shape and pose information of the soft-bodied end-effector as feedback information to control motion of the robot body to complete the action of automatically opening the eyelid of the patient.
Specifically, the head support module supports and fixes the head of the patient through an underjaw support plate and a forehead rest, to prevent the head of the patient from substantially moving during eye washing, as shown in the accompanying drawings.
It is to be noted that the head support module may adopt integrated frameworks of various shapes. For example, the integrated framework may be roughly square, rectangular or trapezoidal, or may have another regular or irregular polygonal shape.
The real-time eyelid positioning module 120 uses a visual detection method to detect the pose of the eyelid in real time; an overall solution is shown in the accompanying drawings.
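By way of a non-limiting illustration of such a visual detection method, the sketch below locates the lid margins in each frame with an off-the-shelf 68-point facial landmark model. This is only one possible realization, not the method prescribed by the disclosure; the dlib model file and the landmark index convention are assumptions.

```python
# Illustrative sketch only: per-frame eyelid localization with a
# pre-trained 68-point facial landmark model (dlib). The model file and
# the index convention are assumptions, not part of the disclosure.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def locate_eyelids(frame_bgr):
    """Return landmark points tracing the eye openings (lid margins)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for face in detector(gray):
        pts = np.array([(p.x, p.y) for p in predictor(gray, face).parts()])
        # In the common 68-point convention, indices 36-41 and 42-47
        # trace the two eye contours, i.e., the upper and lower lid margins.
        results.append({"eye_1": pts[36:42], "eye_2": pts[42:48]})
    return results
```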
The robot end-effector real-time positioning module 130 is configured to locate the soft-bodied robot end-effector in real time to ensure accuracy and safety of the operation of the soft-bodied robot in the process of automatically opening the eyelid, as shown in the accompanying drawings.
In one embodiment, to extract the contour of the robot end-effector, a marker with distinct color or shape features may be affixed to the end-effector, such that the contour of the end-effector can be determined by extracting the features of the marker.
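As a minimal sketch of this marker-based contour extraction, assuming a solid-colored marker and OpenCV, the contour can be recovered by color thresholding; the HSV bounds below are placeholders to be tuned to the actual marker color.

```python
# Illustrative sketch: extract the contour of a colored marker affixed to
# the end-effector by HSV color thresholding. The HSV bounds are
# placeholders, to be tuned to the actual marker color.
import cv2
import numpy as np

LOWER_HSV = np.array([50, 100, 100])  # placeholder: a green marker
UPPER_HSV = np.array([70, 255, 255])

def extract_marker_contour(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    # Remove small speckles before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # The largest connected region is taken as the marker.
    return max(contours, key=cv2.contourArea) if contours else None
```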
The automatic eyelid opening operation module 140 includes a robot body and a robot control system. The robot body is composed of a multi-axis rigid body mechanical arm and a soft-bodied end-effector arranged at an end of the mechanical arm. The multi-axis rigid body mechanical arm may be fixedly mounted on a treatment couch, a treatment chair or a movable platform. The soft-bodied end-effector is made of a flexible, deformable soft material that conforms to the eyelid structure, and its motion is controlled by, for example, a pneumatic drive. Further, a flexible force sensor is integrated on the soft-bodied end-effector to feed back force information of the soft-bodied end-effector in real time.
In the embodiments of the present disclosure, controlling the motion of the multi-axis rigid body mechanical arm ensures that the robot body moves to the location of the target eyelid as quickly as possible. Arranging the soft-bodied end-effector at the end of the mechanical arm, in combination with the pneumatic drive, ensures the accuracy of eyelid opening and the comfort of the patient. Furthermore, by means of the flexible force sensor arranged on the soft-bodied end-effector and the feedback of its force information, the robot control system can timely and accurately adjust the force and the angle of the operation on the eyelid, providing maximum comfort for the patient and ensuring operation safety while quickly completing the eyelid opening.
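The disclosure does not fix a particular control law; the following sketch only illustrates how the force feedback described above could regulate contact with the eyelid. The robot and sensor interfaces, the gains and the force set-point are all hypothetical placeholders, not a real API.

```python
# Illustrative sketch of force-regulated contact: advance or retreat along
# the approach axis in proportion to the force error. `robot` and `sensor`
# are hypothetical interfaces; all numeric values are placeholders.
import time

TARGET_FORCE_N = 0.3    # placeholder contact-force set-point
TOLERANCE_N = 0.02      # placeholder dead band
GAIN_M_PER_N = 0.002    # placeholder proportional gain
DT_S = 0.01             # 100 Hz control loop (assumption)

def force_regulated_contact(robot, sensor, max_steps=1000):
    for _ in range(max_steps):
        error = TARGET_FORCE_N - sensor.read_normal_force()  # hypothetical
        if abs(error) < TOLERANCE_N:
            break                                   # gentle contact reached
        robot.jog_along_approach_axis(GAIN_M_PER_N * error)  # hypothetical
        time.sleep(DT_S)
```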
In one embodiment, the end-effector is configured to simulate the operation of opening the eyelid with a finger. For example, the end-effector is provided with a flexible finger-like protrusion and a tooth-like protrusion, and the eyelid is opened by controlling the engagement time, the turning angle, and the turning force of the finger-like protrusion and the tooth-like protrusion. Because the turning angle and the turning speed are easy to control accurately, the probability of damaging the eyeball is reduced, work efficiency is improved, and the patient's pain is decreased. For another example, the end-effector is provided with an arc-shaped eyelid opener connected to a rotating shaft; by controlling the rotation of the shaft, the eyelid opener is driven to perform the eyelid opening operation.
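As a further hedged illustration of the first example above, the engagement-and-turn sequence of the finger-like and tooth-like protrusions might be commanded as follows; the pneumatic drive interface and every angle, rate and hold value are hypothetical placeholders.

```python
# Illustrative action sequence for the finger-like/tooth-like end-effector:
# engage the lid margin, then turn slowly through a limited angle and hold.
# The `drive` interface and all numeric limits are hypothetical.
import time

MAX_TURN_DEG = 35.0      # placeholder safety limit on the turning angle
TURN_RATE_DEG_S = 10.0   # placeholder rate, slow enough for comfort
DT_S = 0.05

def open_eyelid(drive, hold_s=3.0):
    drive.engage_protrusions()            # hypothetical: grip the lid margin
    angle = 0.0
    while angle < MAX_TURN_DEG:
        angle = min(MAX_TURN_DEG, angle + TURN_RATE_DEG_S * DT_S)
        drive.set_turn_angle_deg(angle)   # hypothetical API
        time.sleep(DT_S)
    time.sleep(hold_s)                    # hold open while cleaning proceeds
    drive.release()                       # hypothetical API
```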
It is to be understood that to control the robot to complete the task of eyelid opening as required, it is necessary to determine the relative positional relationship between the robot and the external environment, especially the location and pose of the robot end-effector. Once the configuration of a robot is determined, the geometric information and joint information of its constituent parts are known. Therefore, after the pose of the robot end-effector with respect to a reference coordinate system is obtained, the desired operation may be achieved by controlling the robot. A specific process of driving the robot to move is omitted herein.
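For concreteness, one standard way to obtain the end-effector pose with respect to the reference coordinate system is to compose per-joint homogeneous transforms, e.g., from Denavit-Hartenberg parameters; the sketch below uses placeholder link parameters rather than the geometry of any particular arm.

```python
# Illustrative forward-kinematics sketch: compose homogeneous transforms to
# obtain the base-to-end-effector pose from joint angles. The DH parameters
# supplied by the caller are placeholders for a real arm's geometry.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (4x4)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def end_effector_pose(joint_angles, dh_params):
    """dh_params: one (d, a, alpha) tuple per joint; returns the 4x4 pose."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T
```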
It is to be noted that those skilled in the art may make appropriate changes and modifications to the above-mentioned embodiments without departing from the spirit and scope of the present disclosure. For example, visual sensors used in the real-time eyelid positioning module and the robot end-effector real-time positioning module may be depth cameras or RGB-D cameras (color-depth cameras), etc. The extraction of the real-time location of the eyelid, the extraction of the contour of the end-effector, and calibration of a multiocular camera may also be implemented by using other image processing algorithms.
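As one conventional option for the camera calibration mentioned above (the disclosure expressly permits other algorithms), the intrinsics of each camera of the multiocular rig can be estimated from chessboard images with OpenCV; the board dimensions, square size and image path below are assumptions.

```python
# Illustrative sketch: intrinsic calibration of one camera from chessboard
# images. Board size, square size and the image path are assumptions.
import glob
import cv2
import numpy as np

BOARD = (9, 6)      # inner corners per row/column (assumption)
SQUARE_M = 0.025    # chessboard square edge in meters (assumption)

# 3-D corner coordinates of the board in its own plane (z = 0).
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_M

obj_points, img_points, size = [], [], None
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD, None)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)  # sanity check of the calibration
```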
The robot control system in the present disclosure may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. The computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may include copper transmission cables, optical fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, Python or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In a scenario involved with the remote computer, the remote computer may be coupled to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or may be coupled to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described with reference to flowcharts and/or block diagrams according to the method, apparatus (system) and a computer program product of the embodiments of the present disclosure. It is to be understood that each block of the flowcharts and/or block diagrams, and combinations of blocks in the flowcharts and/or block diagrams, can be implemented by the computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that these instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks in the flowcharts and/or block diagrams. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having instructions stored therein includes an article of manufacture including instructions which implement aspects of the function/act specified in one or more blocks in the flowcharts and/or block diagrams.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks in the flowcharts and/or block diagrams.
The flowcharts and block diagrams in the accompanying drawings illustrate architectures, functions and operations of possible implementations of systems, methods, and computer program products according to a plurality of embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of instructions, which includes one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions denoted by the blocks may occur in a sequence different from the sequences shown in the accompanying drawings. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in a reverse sequence, depending upon the functions involved. It is also to be noted that each block in the block diagrams and/or flowcharts, and any combination of the blocks in the block diagrams and/or flowcharts, may be implemented by a special-purpose hardware-based system executing the specified functions or acts, or by a combination of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementations by means of hardware, implementations by means of software and implementations by means of software in combination with hardware are equivalent.
The descriptions of the various embodiments of the present disclosure have been presented above for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. It is apparent to a person of ordinary skill in the art that modifications and variations could be made without departing from the scope and spirit of the embodiments. The terminology used herein is chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/CN2021/075521 | Feb. 5, 2021 | WO |

Publishing Document | Publishing Date | Country | Kind
---|---|---|---
WO2022/165753 | Aug. 11, 2022 | WO | A
Number | Date | Country |
---|---|---|
107303240 | Oct 2017 | CN |
107753272 | Mar 2018 | CN |
109843231 | Jun 2019 | CN |
111588469 | Aug 2020 | CN |
112971877 | Jun 2021 | CN |
109431452 | Aug 2021 | CN |
115533900 | Dec 2022 | CN |
2080494 | Jul 2009 | EP |
4066749 | Oct 2022 | EP |
2006277293 | Oct 2006 | JP |
2009539509 | Nov 2009 | JP |
2011200943 | Oct 2011 | JP |
5391070 | Jan 2014 | JP |
WO-2012088471 | Jun 2012 | WO |
WO-2020160097 | Aug 2020 | WO |
WO-2020215121 | Oct 2020 | WO |
WO-2020237939 | Dec 2020 | WO |
Number | Date | Country | Kind
---|---|---|---
20230100638 | Mar 2023 | US | A1