This application relates to the field of internet receivers that are capable of receiving internet radio signals or audio and/or visual signals from a stored playlist or database. More specifically, this application relates to speakers and video screens that reproduce a specific station or database received through these devices by BLUETOOTH®, Wi-Fi or WiMAX. It is also noted that a single BLUETOOTH®, Wi-Fi or WiMAX internet receiver may be used to supply signals to multiple speakers and screens.
The combination of internet-based audio transmissions with digital images has led to great improvements in entertainment, information distribution and healthcare. The present invention relates to the use of audio and visual transmissions through the internet to accomplish a refractive eye exam, thus eliminating the need for a patient to visit an optometrist in person.
It is therefore an object of the invention to configure a system in which a refractive eye examination can be conducted using audio and video transmissions through the internet and/or other systems, such as BLUETOOTH®. This object is accomplished according to the invention by a system configured for conducting a refractive examination of an eye of a patient who has a communication device, such as a smartphone. The communication device comprises a communication module that connects to the internet or other information highway, preferably wirelessly, and is configured to transmit and receive information over the internet or other information highway, and a processor that is programmed to connect to a remote computer via the communication module. The remote computer has a data storage device that stores images of eye charts. The communication device further has a user interface that is configured to allow a user to control processes of the processor, a display screen, at least one speaker connected to the processor to play signals received by the processor, and at least one microphone connected to the processor.
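By way of illustration only, the following Python sketch models the patient-side communication device recited above: a communication module, a processor, a display screen, a speaker and a microphone, with the communication module fetching stored eye-chart images from the remote computer. The class and method names, and the assumption that the remote computer exposes the charts over HTTP, are hypothetical and are not taken from the specification.

```python
# Hypothetical sketch of the patient-side communication device described above.
# Class/method names and the HTTP interface are illustrative assumptions.
from dataclasses import dataclass
from urllib.request import urlopen


@dataclass
class CommunicationDevice:
    """Smartphone-like device: communication module, processor, display, speaker, microphone."""
    remote_url: str               # address of the optometrist's remote computer
    displayed_chart: bytes = b""  # image currently shown on the display screen

    def fetch_eye_chart(self, chart_id: str) -> bytes:
        """Communication module: download a stored eye-chart image from the remote computer."""
        with urlopen(f"{self.remote_url}/charts/{chart_id}") as response:
            return response.read()

    def show_on_display(self, image: bytes) -> None:
        """Processor routes the received image to the display screen in the headset."""
        self.displayed_chart = image

    def play_audio(self, signal: bytes) -> None:
        """Speaker: play audio signals received by the processor (stub)."""

    def capture_patient_response(self) -> bytes:
        """Microphone: capture the patient's spoken response (stub)."""
        return b""


if __name__ == "__main__":
    device = CommunicationDevice(remote_url="http://example-remote-computer.test")
    # chart = device.fetch_eye_chart("snellen_plano")  # requires the remote computer to be running
    # device.show_on_display(chart)
```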
The communication device is mounted in a headset configured to be worn by the patient, so that the patient can view the display screen of the communication device while wearing the headset. The headset can be a standard, commercially available virtual reality ("VR") headset or can be custom designed for this purpose. The communication device is wirelessly connected to a remote computer having a database of several eye charts that can be displayed on the display screen of the communication device. An application program on the communication device allows the eye charts to be displayed on the display screen. The general format of the eye charts can be configured in any standard way that optometrists and ophthalmologists use to determine the patient's refractive needs. Assorted letters, numbers, symbols and/or shapes, in varying sizes, can be displayed.
In a preferred embodiment, the headset has two separate viewing sections, so that when the headset is worn by the patient, one part of the display screen is visible to only one eye of the patient and the other part of the display screen is visible only to the other eye of the patient. The eye charts can be displayed on the display screen so that an entire eye chart is shown to a single eye of the wearer. Alternatively, both sides of the display screen can display the same eye chart, for viewing with both eyes. This way, the optometrist can display the eye chart to each eye individually, or to both eyes at the same time, to check the refraction.
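As a minimal sketch of the split-screen display described in this embodiment, the following illustrative code composes a full-display frame in which the eye chart is shown to the left eye only, the right eye only, or both eyes. The use of the Pillow imaging library, the display resolution and the function names are assumptions made for illustration.

```python
# Hypothetical sketch: routing an eye-chart image to one eye, the other, or both,
# on a split VR display. Pillow and the sizes below are used purely for illustration.
from PIL import Image

DISPLAY_W, DISPLAY_H = 2880, 1440   # assumed headset display resolution
HALF_W = DISPLAY_W // 2


def compose_frame(chart: Image.Image, target: str = "both") -> Image.Image:
    """Return a full-display frame with the chart in the left half, right half, or both.

    target: "left", "right", or "both", i.e. which eye is shown the chart.
    """
    frame = Image.new("RGB", (DISPLAY_W, DISPLAY_H), "white")
    scaled = chart.resize((HALF_W, DISPLAY_H))
    if target in ("left", "both"):
        frame.paste(scaled, (0, 0))          # half visible only to the left eye
    if target in ("right", "both"):
        frame.paste(scaled, (HALF_W, 0))     # half visible only to the right eye
    return frame


if __name__ == "__main__":
    chart = Image.new("RGB", (600, 800), "white")   # placeholder chart image
    left_only = compose_frame(chart, "left")
    both_eyes = compose_frame(chart, "both")
```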
The optometrist is located at a computer in a remote location. The computer has a database that stores the images of the eye charts, and a processor that connects to the communication device to transmit the eye charts for display to the patient.
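The following is a hedged sketch, using only the Python standard library, of the remote computer's role: a data storage device holding eye-chart images and a processor that transmits a requested chart to the communication device. The in-memory dictionary, URL scheme and placeholder image bytes are illustrative assumptions rather than the actual implementation.

```python
# Hypothetical sketch of the optometrist's remote computer: an in-memory "database"
# of eye-chart images served to the communication device over HTTP. Storage layout
# and URL scheme are illustrative assumptions only.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Data storage device: chart id -> image bytes (placeholder payloads here).
CHART_DATABASE = {
    "snellen_plano": b"<image bytes for the uncorrected chart>",
    "snellen_minus_1_00": b"<image bytes simulating a -1.00 D correction>",
}


class ChartHandler(BaseHTTPRequestHandler):
    def do_GET(self):                              # e.g. GET /charts/snellen_plano
        chart_id = self.path.rsplit("/", 1)[-1]
        image = CHART_DATABASE.get(chart_id)
        if image is None:
            self.send_error(404, "unknown chart")
            return
        self.send_response(200)
        self.send_header("Content-Type", "image/png")
        self.end_headers()
        self.wfile.write(image)


if __name__ == "__main__":
    # Blocks and serves chart requests until interrupted.
    HTTPServer(("0.0.0.0", 8080), ChartHandler).serve_forever()
```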
The database of eye charts contains a wide selection of eye charts that have been modified to simulate the various refractions that patients with different degrees of myopia, hyperopia, presbyopia, astigmatism or other refractive conditions would need. For example, if the patient can read only the top two rows of letters or symbols on a standard eye chart, the optometrist can choose a chart that is configured with a certain degree of correction and load that chart onto the communication device for viewing by the patient. The optometrist continues loading different eye charts onto the communication device until the patient indicates that they can see the eye chart sufficiently clearly.
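The selection procedure described above can be summarized as a simple loop: present charts that simulate successively different corrections and stop when the patient reports that the chart is sufficiently clear. The sketch below illustrates that loop; the chart identifiers, dioptre values and callback names are hypothetical.

```python
# Hypothetical sketch of the examination loop: step through chart versions simulating
# different corrections until the patient reports the chart is sufficiently clear.
from typing import Callable, Optional

# Candidate charts for one eye, ordered by the spherical correction they simulate.
CANDIDATE_CHARTS = [
    ("snellen_plano", 0.00),
    ("snellen_minus_0_50", -0.50),
    ("snellen_minus_1_00", -1.00),
    ("snellen_minus_1_50", -1.50),
]


def run_refraction(eye: str,
                   show_chart: Callable[[str, str], None],
                   patient_sees_clearly: Callable[[], bool]) -> Optional[float]:
    """Present charts to one eye until the patient reports a clear one.

    show_chart(chart_id, eye) loads a chart onto the communication device;
    patient_sees_clearly() returns the patient's confirmed answer.
    Returns the simulated correction (in dioptres) of the accepted chart, or None.
    """
    for chart_id, correction in CANDIDATE_CHARTS:
        show_chart(chart_id, eye)
        if patient_sees_clearly():
            return correction
    return None   # no candidate accepted; a live exam would continue with other charts


if __name__ == "__main__":
    # Toy run: pretend the patient accepts the -1.00 D chart for the right eye.
    answers = iter([False, False, True])
    result = run_refraction("right",
                            show_chart=lambda cid, eye: print(f"showing {cid} to {eye} eye"),
                            patient_sees_clearly=lambda: next(answers))
    print("accepted correction:", result)   # -> -1.0
```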
Once the correct version of the eye chart is documented for each eye, the optometrist can prepare the prescription for the patient. Each eye chart in the database is correlated with a specific prescription. The processor can also be programmed with software that automatically prepares the prescription based on the selected corrected eye charts. The prescription can then be sent manually or automatically to the patient via email or text message, or can be loaded into an online account of the patient for later use.
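Because each eye chart in the database is correlated with a specific prescription, the automatic preparation step can be illustrated as a lookup from the accepted chart for each eye to its associated correction, followed by delivery to the patient. The mapping values, class names and delivery stub below are illustrative assumptions only.

```python
# Hypothetical sketch of automatic prescription preparation: each accepted chart id is
# correlated with a prescription value, and the result is formatted for delivery by
# email/text or posting to the patient's online account. All values are illustrative.
from dataclasses import dataclass

# Correlation of chart versions with the prescription (sphere, in dioptres) they imply.
CHART_TO_SPHERE = {
    "snellen_plano": 0.00,
    "snellen_minus_0_50": -0.50,
    "snellen_minus_1_00": -1.00,
}


@dataclass
class Prescription:
    right_sphere: float
    left_sphere: float

    def as_text(self) -> str:
        return (f"Rx  OD (right): {self.right_sphere:+.2f} D   "
                f"OS (left): {self.left_sphere:+.2f} D")


def prepare_prescription(accepted_right: str, accepted_left: str) -> Prescription:
    """Look up the corrections implied by the accepted chart for each eye."""
    return Prescription(CHART_TO_SPHERE[accepted_right], CHART_TO_SPHERE[accepted_left])


def send_to_patient(rx: Prescription, address: str) -> None:
    """Delivery stub: a real system might email/text this or post it to an online account."""
    print(f"sending to {address}: {rx.as_text()}")


if __name__ == "__main__":
    rx = prepare_prescription("snellen_minus_1_00", "snellen_minus_0_50")
    send_to_patient(rx, "patient@example.com")
```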
The communication between the patient and the optometrist can take place audibly using the communication device, communicating via standard cellular signals or over the internet. As each eye chart is presented, the patient reads it aloud and the response is transmitted from the microphone of the communication device to the remote computer of the optometrist, or directly to a telephone being used by the optometrist.
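As an illustrative sketch of this audio path, the following code forwards a recorded reading from the device microphone to the remote computer, tagged with the chart being read. The endpoint URL, header names and the use of a pre-recorded file instead of a live stream are assumptions made for the example.

```python
# Hypothetical sketch: forwarding the patient's spoken chart reading, captured by the
# device microphone, to the optometrist's remote computer. File name, URL and upload
# endpoint are illustrative assumptions; a real device would stream live audio.
from urllib.request import Request, urlopen

REMOTE_URL = "http://example-remote-computer.test/patient-response"   # assumed endpoint


def send_patient_response(audio_path: str, chart_id: str) -> int:
    """Upload a recorded response (e.g. a WAV file) tagged with the chart being read."""
    with open(audio_path, "rb") as f:
        audio_bytes = f.read()
    request = Request(
        REMOTE_URL,
        data=audio_bytes,
        headers={"Content-Type": "audio/wav", "X-Chart-Id": chart_id},
        method="POST",
    )
    with urlopen(request) as response:   # requires the remote server to be reachable
        return response.status


if __name__ == "__main__":
    # send_patient_response("reading_right_eye.wav", "snellen_minus_1_00")
    pass
```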
As used in this specification and the appended claims, the singular forms “a”, “an” and “the” include plural referents unless the context clearly indicates otherwise.
As used in this specification and the appended claims, “internet” refers not only to the internet, but also to any wide area network or local area network. Use of the term “internet” is not intended to limit the present invention to communications received via the world wide web.
As used in this specification and the appended claims, a “speaker” means any sound emitting device and is not limited to standard electromechanical transducer type speakers. Non-limiting examples of suitable speakers are piezoelectric speakers, electrostatic speakers, flat panel speakers and digital speakers.
As used in the specification and the appended claims, a “smartphone” is a mobile telephone equipped with internet capability.
As used in the specification and the appended claims, an “application” or “app” is a software program installed on a smartphone, which can perform certain functions directly or is used to directly connect the smartphone to an internet-based program via a link on the display screen of the smartphone.
The various embodiments and aspects of the invention described here can be employed individually or in conjunction with other embodiments and aspects. Descriptions of individual aspects and embodiments do not preclude the inclusion of other aspects, embodiments or additional structural components.
As shown in the figures, smartphone 30 is equipped with speakers 32, a microphone 33, a display screen 39, a processor 34 and a transmitter/receiver 35 for cellular and/or wireless internet communication with a processor in a remote computer 40.
It is to be understood that the invention is not limited to the details of construction or process steps set forth in the following description. The invention is capable of other embodiments and of being practiced or being carried out in various ways.
While there have been shown, described and pointed out fundamental novel features of the invention as applied to preferred embodiments or aspects thereof, it will be understood that various omissions and substitutions and changes in the form and details of the device illustrated and in its operation may be made by those skilled in the art without departing from the spirit of the invention. It is the intention, therefore, to be limited only as indicated by the scope of the claims appended hereto.
This application is a continuation-in-part of U.S. patent application Ser. No. 16/702,892, filed on Dec. 4, 2019, which is a continuation-in-part of U.S. patent application Ser. No. 15/787,813, filed on Oct. 19, 2017, which is a continuation-in-part of U.S. patent application Ser. No. 15/581,209, filed on Apr. 28, 2017 (now U.S. Pat. No. 9,936,316, issued Apr. 3, 2018), which is a continuation-in-part of U.S. patent application Ser. No. 15/401,773, filed on Jan. 9, 2017 (now U.S. Pat. No. 9,693,140, issued Jun. 27, 2017), which is a continuation under 35 U.S.C. § 120 of U.S. patent application Ser. No. 15/161,658, filed on May 23, 2016 (now U.S. Pat. No. 9,584,913, issued Feb. 28, 2017), which is a continuation-in-part of U.S. patent application Ser. No. 14/710,707, filed on May 13, 2015 (now U.S. Pat. No. 9,367,285, issued Jun. 14, 2016), which is a continuation-in-part of U.S. patent application Ser. No. 13/856,795, filed on Apr. 4, 2013 (now U.S. Pat. No. 9,060,040, issued Jun. 16, 2015), which is a continuation-in-part of U.S. patent application Ser. No. 13/331,469, filed on Dec. 20, 2011 (now U.S. Pat. No. 8,467,722, issued Jun. 18, 2013), which is a continuation-in-part of U.S. patent application Ser. No. 12/180,901, filed Jul. 28, 2008 (now U.S. Pat. No. 8,099,039, issued Jan. 17, 2012), which claims the benefit of priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 60/954,879, filed Aug. 9, 2007, the entirety of each of which is hereby incorporated by reference. This application also claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application Ser. No. 63/063,166, filed on Aug. 7, 2020.
References Cited (U.S. Patent Documents):

| Number | Name | Date | Kind |
|---|---|---|---|
| 6389463 | Bolas | May 2002 | B2 |
| 7003515 | Glaser et al. | Feb 2006 | B1 |
| 7058356 | Slotznick | Jun 2006 | B2 |
| 7059728 | Alasaarela et al. | Jun 2006 | B2 |
| 7065342 | Rolf | Jun 2006 | B1 |
| 7738151 | Garner et al. | Jun 2010 | B2 |
| 7817591 | Cooley | Oct 2010 | B2 |
| 7873040 | Karlsgodt | Jan 2011 | B2 |
| 8260230 | Zigler et al. | Sep 2012 | B2 |
| 8467722 | Spector | Jun 2013 | B2 |
| 8472866 | Spector | Jun 2013 | B1 |
| 8543095 | Brown et al. | Sep 2013 | B2 |
| 8725065 | Spector | May 2014 | B2 |
| 9060040 | Spector | Jun 2015 | B2 |
| 10714217 | Seriani | Jul 2020 | B2 |
| 10762994 | Seriani | Sep 2020 | B2 |
| 10888222 | Monhart | Jan 2021 | B2 |
| 10916347 | Seriani | Feb 2021 | B2 |
| 10983351 | Samec | Apr 2021 | B2 |
| 11256096 | Samec | Feb 2022 | B2 |
| 20040046783 | Montebovi | Mar 2004 | A1 |
| 20040198175 | Shively et al. | Oct 2004 | A1 |
| 20050248233 | Pompei | Nov 2005 | A1 |
| 20060168097 | Pittelli | Jul 2006 | A1 |
| 20080086687 | Sakai et al. | Apr 2008 | A1 |
| 20080194175 | Last et al. | Aug 2008 | A1 |
| 20100042920 | Sigal | Feb 2010 | A1 |
| 20120019883 | Chae et al. | Jan 2012 | A1 |
| 20130021579 | Husain | Jan 2013 | A1 |
| 20130230179 | Beaty et al. | Sep 2013 | A1 |
| 20140133664 | Beaty et al. | May 2014 | A1 |
| 20140342660 | Fullam | Nov 2014 | A1 |
| 20140375771 | Gabara | Dec 2014 | A1 |
| 20150142536 | Marlow et al. | May 2015 | A1 |
| 20150256564 | Reynolds | Sep 2015 | A1 |
| 20170231487 | Carrafa | Aug 2017 | A1 |
| 20180263488 | Pamplona | Sep 2018 | A1 |
| 20190125181 | Lindig | May 2019 | A1 |
| 20190142270 | Monhart | May 2019 | A1 |
| 20200320770 | Charlson | Oct 2020 | A1 |
| 20200397288 | Zidan | Dec 2020 | A1 |
| 20210030270 | Goyal | Feb 2021 | A1 |
Prior Publication Data:

| Number | Date | Country |
|---|---|---|
| 20210051407 A1 | Feb 2021 | US |
Provisional Applications:

| Number | Date | Country |
|---|---|---|
| 63063166 | Aug 2020 | US |
| 60954879 | Aug 2007 | US |
Continuations:

| | Number | Date | Country |
|---|---|---|---|
| Parent | 15161658 | May 2016 | US |
| Child | 15401773 | | US |
Continuations in Part:

| | Number | Date | Country |
|---|---|---|---|
| Parent | 16702892 | Dec 2019 | US |
| Child | 17084816 | | US |
| Parent | 15787813 | Oct 2017 | US |
| Child | 16702892 | | US |
| Parent | 15581209 | Apr 2017 | US |
| Child | 15787813 | | US |
| Parent | 15401773 | Jan 2017 | US |
| Child | 15581209 | | US |
| Parent | 14710707 | May 2015 | US |
| Child | 15161658 | | US |
| Parent | 13856795 | Apr 2013 | US |
| Child | 14710707 | | US |
| Parent | 13331469 | Dec 2011 | US |
| Child | 13856795 | | US |
| Parent | 12180901 | Jul 2008 | US |
| Child | 13331469 | | US |