Rearview mirrors have traditionally been the preferred device for allowing a vehicle driver to observe activity occurring behind the vehicle. A need exists to expand the traditional function of the rearview mirror through easy substitution of a rearview display system that may function over, or in place of, a conventional rearview mirror.
A system is provided that includes a housing, a display attached to the housing, and a camera attached to the housing located proximate to the display. In some embodiments, the camera is configured to capture images over a 360-degree field of view in at least a first plane through the housing and capture images over a 220-degree field of view in a second plane perpendicular to the first plane.
In some embodiments, the camera is attached to an edge of the housing. In some embodiments, the system includes a processor coupled to the camera. In some embodiments, the system includes a night-vision system coupled to the camera.
The processor may be programmed by a computer-readable, non-transitory, programmable product, comprising code, executable by the processor, for causing the processor to do the following: recognize objects captured by the camera according to a predetermined characteristic and provide an alert in connection with recognition of an object.
The objects may include one or more weapons and one or more hazards. The objects recognized by the processor may include one or more human faces.
In some embodiments, the system includes supplementary code, executable by the processor, for causing the processor to cause the camera to zoom in on objects in the field of view of the camera recognized as a hazard by the processor.
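By way of illustration only, the following Python sketch shows one way a processor could be programmed to recognize objects according to a predetermined characteristic and provide an alert upon recognition. The detection labels, the Detection structure, and the alert callable are hypothetical placeholders assumed for this sketch and do not describe any particular implementation or library.

    from dataclasses import dataclass
    from typing import Callable, Iterable, List

    # Hypothetical detection result; a real system would obtain these from an
    # image-recognition model running on the camera frames.
    @dataclass
    class Detection:
        label: str          # e.g. "weapon", "hazard", "face_of_interest"
        confidence: float   # 0.0 .. 1.0
        box: tuple          # (x, y, width, height) in pixels

    # Predetermined characteristics that should trigger an alert (assumed labels).
    ALERT_LABELS = {"weapon", "hazard", "face_of_interest"}

    def objects_requiring_alert(detections: Iterable[Detection],
                                threshold: float = 0.8) -> List[Detection]:
        """Filter detections down to those matching a predetermined characteristic."""
        return [d for d in detections
                if d.label in ALERT_LABELS and d.confidence >= threshold]

    def process_frame(detections: Iterable[Detection],
                      alert: Callable[[Detection], None]) -> None:
        """Provide an alert for every recognized object of interest in one frame."""
        for detection in objects_requiring_alert(detections):
            alert(detection)

    if __name__ == "__main__":
        # Simulated detections standing in for the output of the recognition model.
        frame = [Detection("face_of_interest", 0.91, (40, 30, 64, 64)),
                 Detection("vehicle", 0.99, (200, 120, 180, 90))]
        process_frame(frame, alert=lambda d: print(f"ALERT: {d.label} at {d.box}"))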
The disclosure also includes a detachable rearview driving system, comprising: a housing; an attachment system for detachably affixing the housing over a rearview mirror in a vehicle; a display attached to the housing; a camera attached to the housing located proximate the display, the camera being configured to capture images over a 360-degree field of view in at least a first plane through the housing and to capture images over a 220-degree field of view in a second plane perpendicular to the first plane; a processor coupled to the camera; and a communication system coupled to the processor.
In some embodiments, the camera is attached to an edge of the housing, and the system includes a night-vision system coupled to the camera. The system may also include one or more additional cameras coupled to the display and the processor.
The foregoing and other features and advantages of the invention will be apparent from the following, more particular description of the preferred embodiments of the invention, the accompanying drawings, and the claims.
For a complete understanding of the present invention, the objects, and advantages thereof, reference is now made to the ensuing descriptions taken in connection with the accompanying drawings briefly described as follows.
Reference numbers/symbols have been carried forward.
Preferred embodiments of the present invention and their advantages may be understood by referring to the accompanying drawings.
Although certain embodiments and examples are disclosed below, inventive subject matter extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses and to modifications and equivalents thereof. Thus, the scope of the claims appended hereto is not limited by any of the particular embodiments described below. For example, in any method or process disclosed herein, the acts or operations of the method or process may be performed in any suitable sequence and are not necessarily limited to any particular disclosed sequence. Various operations may be described as multiple discrete operations, in turn, in a manner that may be helpful in understanding certain embodiments; however, the order of description should not be construed to imply that these operations are order-dependent. Additionally, the structures, systems, and/or devices described herein may be embodied as integrated or separate components.
To compare various embodiments, certain aspects and advantages of these embodiments are described. Not necessarily all such aspects or advantages are achieved by any particular embodiment. Thus, for example, various embodiments may be carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other aspects or advantages as may also be taught or suggested herein.
In other embodiments, memory/database 204 may be located remotely. Memory/database 204 may store images and information that may be accessed by processor 202 and/or sent from viewing system 200 in connection with processor 202 and images appearing on display 100. In some embodiments, processor 202 may be an onboard artificial intelligence (AI)/neural net processor.
Auxiliary/additional camera system 206 may also be coupled to processor 202. Auxiliary/additional camera system 206 may include one or more cameras positioned within or outside a vehicle. Display 100 may receive information from processor 202 and camera 104 that enables display 100 to show images/video on screen 106. In some embodiments, camera 104 may be an infrared camera. In other embodiments, camera 104 and auxiliary/additional camera system 206 may include night vision capability, such as an infrared imaging system. In some embodiments, the night vision system may be an uncooled infrared (IR) imaging system 210. Additionally or alternatively, in some embodiments the night vision system is a passive color night vision system. A passive night vision system generally relies on ambient light and often operates at longer infrared wavelengths than an active system.
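Purely as an illustrative sketch, and not as a description of any specific night-vision hardware, the following Python example shows one possible control strategy for selecting among ordinary imaging, a passive color night-vision mode, and an uncooled IR mode based on a measured ambient light level. The lux thresholds and mode names are assumptions made only for this example.

    from enum import Enum, auto

    class NightVisionMode(Enum):
        DAYLIGHT = auto()          # ordinary visible-light imaging
        PASSIVE_COLOR = auto()     # passive system relying on ambient light
        UNCOOLED_IR = auto()       # uncooled infrared imaging system 210

    # Hypothetical lux thresholds chosen only for illustration.
    PASSIVE_THRESHOLD_LUX = 10.0   # below this, ordinary imaging degrades
    IR_THRESHOLD_LUX = 0.5         # below this, too little ambient light for passive mode

    def select_mode(ambient_lux: float) -> NightVisionMode:
        """Pick an imaging mode from the ambient light level reported by a sensor."""
        if ambient_lux >= PASSIVE_THRESHOLD_LUX:
            return NightVisionMode.DAYLIGHT
        if ambient_lux >= IR_THRESHOLD_LUX:
            return NightVisionMode.PASSIVE_COLOR
        return NightVisionMode.UNCOOLED_IR

    if __name__ == "__main__":
        for lux in (120.0, 3.0, 0.1):
            print(lux, "lux ->", select_mode(lux).name)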
In some embodiments, screen 106 may be a plasma screen that interacts with camera 104 and/or auxiliary/additional camera system 206. In some embodiments, screen 106 may display different viewing modes, which may be selected by touching various positions on the plasma screen embodiment of screen 106. Some of these viewing modes may include the sequential or simultaneous display of multiple images from different cameras. In some embodiments, auxiliary/additional camera system 206 may include an 80°×180° camera mounted at the base of the display 100, for instance, within or near camera 104. In some embodiments, auxiliary/additional camera system 206 may include an optional second camera, wired or wireless, whose feed is provided to display 100 along with that of camera 104 to provide a dual-camera view.
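As a non-limiting sketch of how viewing modes might be selected by touching positions on screen 106, the following Python example maps a touch coordinate to a viewing mode and lists the camera feeds that mode would display. The screen geometry, mode names, and camera identifiers are assumptions made only for illustration.

    # Hypothetical screen geometry and viewing modes, for illustration only.
    SCREEN_WIDTH = 1280   # pixels
    VIEW_MODES = {
        "rear_only": ["camera_104"],
        "dual_view": ["camera_104", "aux_camera_1"],
        "quad_view": ["camera_104", "aux_camera_1", "aux_camera_2", "aux_camera_3"],
    }

    def mode_from_touch(x: int) -> str:
        """Map a horizontal touch position on screen 106 to a viewing mode.

        Touching the left, middle, or right third of the screen selects the
        rear-only, dual, or quad view respectively (an assumed convention).
        """
        third = SCREEN_WIDTH // 3
        if x < third:
            return "rear_only"
        if x < 2 * third:
            return "dual_view"
        return "quad_view"

    def feeds_to_display(x: int) -> list:
        """Return the camera feeds shown for the selected viewing mode."""
        return VIEW_MODES[mode_from_touch(x)]

    if __name__ == "__main__":
        for touch_x in (100, 700, 1200):
            print(touch_x, "->", feeds_to_display(touch_x))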
In some embodiments, camera 104 and/or auxiliary/additional camera system 206 may operate with light assist. Light assist is a light provided with a camera or camera system, mounted near a lens, that may assist in focusing the camera during low-light applications. In some embodiments, the range of the camera may not be significantly affected by whether light assist is used, and as such light assist may be optional, although it may be applied to multiple cameras in auxiliary/additional camera system 206.
Objects on the screen that may prompt an alert include one or more weapons, one or more hazards, and one or more human faces. In some embodiments, processor 202 may be programmed with supplementary software to provide enhanced functionality for viewing system 200. One example of enhanced functionality is processor 202 causing camera 104 to zoom in on objects in the field of view of camera 104 that processor 202 recognizes as a hazard, person of interest, weapon, etc.
Processor 202 may be programmed with image recognition software to detect certain human faces, conditions, events, and/or circumstances in connection with images appearing on screen 106. Alerts may be dispatched in connection with detection by showing an icon, color, blinking light, or other indication appearing on screen 106. Alternatively, or additionally, a sound may be emitted from a speaker of audio interface 118 in connection with an alert. Further, processor 202 may be connected to communications system 220, having a transmitter and/or receiver. In some embodiments, communications system 220 may be a transceiver. Images detected by processor 202 through its image processing programming may result in an alert being dispatched to a remote location through communication system 220. Face recognition data may be accessed locally from memory/database 204 and/or from a memory/database located remotely through communication system 220. In some embodiments, one or more images (including video) prompting an alert may be stored locally in memory/database 204. Alternatively, communication system 220 may access a memory/database 204 located remotely from display 100. In some embodiments, a facial identification (ID) system may be implemented with memory/database 204 in conjunction with programming of processor 202 and communication system 220 to produce a security verification alarm system. With such a system, an alert may be produced near viewing system 200 or at a remote location in conjunction with display information from viewing system 200. In some embodiments, an object recognition system may be implemented with memory/database 204 in conjunction with programming of processor 202 and communication system 220. Consequently, with such an object recognition system, hazardous objects may be recognized in display 100, producing an alert. The object recognition system may be further expanded to implement a collision avoidance and/or vehicle navigation system, such as a vehicle parking system.
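The following Python sketch illustrates, under assumed interfaces, how such an alert might be indicated locally on screen 106 and through audio interface 118 and dispatched to a remote location through communication system 220. The message format and the show_icon, play_tone, and send callables are placeholders assumed for this sketch and do not correspond to any particular transport or library.

    import json
    import time

    def build_alert(kind: str, detail: str) -> dict:
        """Package an alert raised by the image-recognition programming of processor 202."""
        return {"kind": kind, "detail": detail, "timestamp": time.time()}

    def raise_local_alert(alert: dict, show_icon, play_tone) -> None:
        """Indicate the alert on screen 106 and through audio interface 118."""
        show_icon(alert["kind"])
        play_tone()

    def dispatch_remote_alert(alert: dict, send) -> None:
        """Forward the alert to a remote location through communication system 220."""
        send(json.dumps(alert).encode("utf-8"))

    if __name__ == "__main__":
        alert = build_alert("face_match", "match against watch-list entry 17")
        raise_local_alert(alert,
                          show_icon=lambda kind: print("icon:", kind),
                          play_tone=lambda: print("tone"))
        dispatch_remote_alert(alert, send=lambda payload: print("sent", len(payload), "bytes"))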
Additional embodiments follow that may be combined with or stand apart from the foregoing. In some embodiments, the term "rearview mirror" references a display screen that functions as a rearview mirror in, for instance, a vehicle. In particular, by virtue of their intended positioning largely within a vehicle, the systems disclosed herein may operate in adverse weather conditions.
Similar to
In some embodiments, the camera 308 is coupled to display screen 400 via a wired connection. In other embodiments, camera 308 may be wirelessly coupled to display screen 400, such as via Bluetooth, WiFi, cellular, or any other wireless communication protocol.
Combination mirror and display screen 300 may include a storage medium configured to store at least one recording captured by camera 308. In some embodiments, the storage medium and camera 308 are configured to loop the recording after a predetermined amount of time. For example, the recording may loop every 10 minutes, every 20 minutes, every 30 minutes, every hour, or any other amount of time. In some embodiments, the combination mirror and display screen 300 includes a button configured to save a recording. The ability to save a recording may enable a user to capture an important event, like a car accident, animal sighting, or other notable event, and watch it again later. In some embodiments, the storage medium comprises a secure digital (“SD”) card.
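As one possible illustration of looping storage, the following Python sketch keeps only the most recent frames in a fixed-size buffer, so that older footage is overwritten, and copies the current loop aside when the save button is pressed. The loop length, frame rate, and in-memory storage are assumptions made only for this example; an actual implementation may write to an SD card or other storage medium.

    from collections import deque

    class LoopingRecorder:
        """Keep only the most recent N seconds of video, overwriting older frames.

        A minimal sketch: frames are held in memory here, whereas the actual
        storage medium may be an SD card or other non-volatile memory.
        """

        def __init__(self, loop_seconds: int = 600, frames_per_second: int = 30):
            self.frames = deque(maxlen=loop_seconds * frames_per_second)
            self.saved_clips = []

        def add_frame(self, frame) -> None:
            # Once the deque is full, appending automatically discards the oldest
            # frame, which is what makes the recording "loop".
            self.frames.append(frame)

        def save_button_pressed(self) -> None:
            # Copy the current loop aside so it survives further overwriting,
            # e.g. to preserve footage of an accident or other notable event.
            self.saved_clips.append(list(self.frames))

    if __name__ == "__main__":
        recorder = LoopingRecorder(loop_seconds=2, frames_per_second=3)  # tiny loop for demo
        for i in range(10):
            recorder.add_frame(f"frame-{i}")
        recorder.save_button_pressed()
        print(recorder.saved_clips[0])   # only the 6 most recent frames are kept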
None of the steps described herein is essential or indispensable. Any of the steps can be adjusted or modified, and other or additional steps can be used. Any portion of any of the steps, processes, structures, and/or devices disclosed or illustrated in one embodiment, flowchart, or example in this specification can be combined or used with or instead of any other portion of any of the steps, processes, structures, and/or devices disclosed or illustrated in a different embodiment, flowchart, or example. The embodiments and examples provided herein are not intended to be discrete and separate from each other.
The section headings and subheadings provided herein are non-limiting. The section headings and subheadings do not represent or limit the full scope of the embodiments described in the sections to which the headings and subheadings pertain. For example, a section titled “Topic 1” may include embodiments that do not pertain to Topic 1, and embodiments described in other sections may apply to and be combined with embodiments described within the “Topic 1” section.
To increase the clarity of various features, other features are not labeled in each figure.
The various features and processes described above may be used independently of one another or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method, event, state, or process blocks may be omitted in some implementations. The methods, steps, and processes described herein are also not limited to any particular sequence, and the blocks, steps, or states relating thereto can be performed in other sequences that are appropriate. For example, described tasks or events may be performed in an order other than the order specifically disclosed. Multiple steps may be combined in a single block or state. The example tasks or events may be performed in serial, parallel, or other ways. Tasks or events may be added or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.
The conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless expressly stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without author input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment. The terms “comprising,” “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. Conjunctive language such as the phrase “at least one of X, Y, and Z,” unless expressly stated otherwise, is otherwise understood with the context as used in general to convey that an item, term, etc., may be either X, Y, or Z. Thus, such conjunctive language is not generally intended to imply that certain embodiments require at least one of X, at least one of Y, and at least one of Z to each be present.
The term “and/or” means that “and” applies to some embodiments, and “or” applies to some embodiments. Thus, A, B, and/or C can be replaced with A, B, and C written in one sentence and A, B, or C written in another sentence. A, B, and/or C means that some embodiments can include A and B, some embodiments can include A and C, some embodiments can include B and C, some embodiments can only include A, some embodiments can include only B, some embodiments can include only C, and some embodiments can include A, B, and C. The term “and/or” is used to avoid unnecessary redundancy.
The term “substantially” is used to mean “completely” or “nearly completely.” For example, the disclosure includes, “The display screen 2400 may be configured to display the second field of view 4600b in substantially real-time.” In this context, “substantially real-time” is used to mean that the display screen 2400 may display the second field of view 4600b in real-time or with a slight delay, such as a few seconds.
The foregoing may be accomplished through software code running in one or more processors on a communication device in conjunction with a processor in a server running complementary software code.
Some of the devices, systems, embodiments, and processes use computers. Each of the routines, processes, methods, and algorithms described in the preceding sections may be embodied in and fully or partially automated by code modules executed by one or more computers, computer processors, or machines configured to execute computer instructions. The code modules may be stored on any type of non-transitory computer-readable storage medium or tangible computer storage device, such as hard drives, solid-state memory, flash memory, optical disc, and/or the like. The processes and algorithms may be implemented partially or wholly in application-specific circuitry. The results of the disclosed processes and process steps may be stored, persistently or otherwise, in any non-transitory computer storage, such as, e.g., volatile or non-volatile storage.
It is appreciated that to practice the method of the foregoing as described above, it is not necessary that the processors and/or the memories of the processing machine be physically located in the same geographical place. That is, each of the processors and the memory (or memories) used by the processing machine may be located in geographically distinct locations and connected to communicate in any suitable manner. Additionally, it is appreciated that each of the processor and/or the memory may be composed of different physical pieces of equipment. Accordingly, it is not necessary that the processor be one single piece of equipment in one location and that the memory be another single piece of equipment in another location. That is, it is contemplated that the processor may be two pieces of equipment in two different physical locations. The two distinct pieces of equipment may be connected in any suitable manner. Additionally, the memory may include two or more portions of memory in two or more physical locations.
To explain further, processing, as described above, is performed by various components and various memories. However, it is appreciated that the processing performed by two distinct components as described above may, in accordance with a further embodiment of the foregoing, be performed by a single component. Further, as described above, the processing performed by one distinct component may be performed by two distinct components. Similarly, the memory storage performed by two particular memory portions, as described above, may, in accordance with a further embodiment of the foregoing, be performed by a single memory portion. Further, the memory storage, performed by one distinct memory portion described above, may be performed by two memory portions.
Further, various technologies may be used to provide communication between the various processors and/or memories, as well as to allow the processors and/or the memories of the foregoing to communicate with any other entity, i.e., to obtain further instructions or to access and use remote memory stores, for example. Such technologies used to provide such communication might include a network, the Internet, Intranet, Extranet, LAN, an Ethernet, wireless communication via cell tower or satellite, or any client server system that provides communication, for example. Such communications technologies may use any suitable protocol, such as TCP/IP, UDP, or OSI, for example.
As described above, a set of instructions may be used to process the foregoing. The collection of instructions may be in the form of a program or software. The software may be in the form of system software or application software, for example. The software might also be a collection of separate programs, a program module within a larger program, or a portion of a program module, for example. The software used might also include modular programming in the form of object-oriented programming. The software may instruct the processing machine on what to do with the data being processed.
Further, it is appreciated that the instructions or set of instructions used in the implementation and operation of the foregoing may be in a suitable form such that the processing machine may read the instructions. For example, the instructions that form a program may be in the form of a suitable programming language, which is converted to machine language or object code to allow the processor or processors to read the instructions. That is, written lines of programming code or source code in a particular programming language are converted to machine language using a compiler, assembler, or interpreter. The machine language is binary-coded machine instructions specific to a particular type of processing machine, i.e., to a particular type of computer, for example. The computer understands the machine language.
Any suitable programming language may be used in accordance with the various embodiments of the foregoing. Illustratively, the programming language used may include assembly language, Ada, APL, Basic, C, C++, COBOL, dBase, Forth, Fortran, Java, Modula-2, Pascal, Prolog, Python, REXX, Visual Basic, and/or JavaScript, for example. Further, it is not necessary that a single type of instruction or single programming language be utilized in conjunction with the operation of the system and method of the foregoing. Rather, any number of different programming languages may be utilized as necessary and/or desirable.
Also, the instructions and/or data used in the practice of the foregoing may utilize any compression or encryption technique or algorithm as may be desired. An encryption module might be used to encrypt data. Further, files or other data may be decrypted using a suitable decryption module, for example.
As described above, the foregoing may illustratively be embodied in the form of a processing machine, including a computer or computer system, for example, that includes at least one memory. It is to be appreciated that the set of instructions, i.e., the software, for example, that enables the computer operating system to perform the operations described above, may be contained on any of a wide variety of media or medium, as desired. Further, the data processed by the set of instructions might also be contained in a wide variety of media or medium. That is, the particular medium, i.e., the memory in the processing machine, utilized to hold the set of instructions and/or the data used in the foregoing may take on any of a variety of physical forms or transmissions, for example. Illustratively, the medium may be in the form of paper, paper transparencies, a compact disk, a DVD, an integrated circuit, a hard disk, a floppy disk, an optical disk, a magnetic tape, a RAM, a ROM, a PROM, an EPROM, a wire, a cable, a fiber, a communications channel, a satellite transmission, a memory card, a SIM card, or other remote transmissions, as well as any other medium or source of data that may be read by the processors of the foregoing.
Further, the memory or memories used in the processing machine that implements the foregoing may be in various forms to allow the memory to hold instructions, data, or other information, as desired. Thus, the memory might be in the form of a database to hold data. The database might use any desired arrangement of files, such as a flat file arrangement or a relational database arrangement, for example.
In the system and method of the foregoing, a variety of “user interfaces” may be utilized to allow a user to interface with the processing machine or machines used to implement the foregoing. As used herein, a user interface includes any hardware, software, or combination of hardware and software used by the processing machine that allows a user to interact with the processing machine. A user interface may be in the form of a dialogue screen, for example. A user interface may also include any of a mouse, touch screen, keyboard, keypad, voice reader, voice recognizer, dialogue screen, menu box, list, checkbox, toggle switch, a pushbutton, or any other device that allows a user to receive information regarding the operation of the processing machine as it processes a set of instructions and/or provides the processing machine with information. Accordingly, the user interface is any device that provides communication between a user and a processing machine. The information provided by the user to the processing machine through the user interface may be in the form of a command, a selection of data, or some other input, for example.
As discussed above, a user interface is utilized by the processing machine that performs a set of instructions such that the processing machine processes data for a user. The user interface is typically used by the processing machine for interacting with a user to convey or receive information from the user. However, it should be appreciated that in accordance with some embodiments of the system and method of the foregoing, it is not necessary that a human user actually interact with a user interface used by the processing machine of the foregoing. Rather, it is also contemplated that the user interface of the foregoing might interact, i.e., convey and receive information, with another processing machine rather than a human user. Accordingly, the other processing machine might be characterized as a user. Further, it is contemplated that a user interface utilized in the system and method of the foregoing may interact partially with another processing machine or processing machines while also interacting partially with a human user.
While certain example embodiments have been described, these embodiments have been presented by way of example only and are not intended to limit the scope of the inventions disclosed herein. Thus, nothing in the foregoing description is intended to imply that any particular feature, characteristic, step, module, or block is necessary or indispensable. Indeed, the novel methods and systems described herein may be embodied in various forms; furthermore, various omissions, substitutions, and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions disclosed herein.
The present application claims priority to U.S. Provisional Pat. Application No. 63/300,545, filed on Jan. 18, 2022, entitled “Combination Mirror and Display Screen,” the entire disclosure of which is incorporated by reference herein.