The present disclosure generally relates to systems and methods for color calibration using artificial reality devices with color tuned exteriors.
Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user. Artificial reality can include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. AR, VR, MR, and hybrid reality devices often receive information through cameras or other optical modules on a headset, e.g., glasses, and provide content through visual means.
Since artificial reality devices heavily rely on accurate optical information to provide seamless and realistic output for users, the devices rarely have any color cosmetics added due to the stringent optical requirements of any cameras and/or optical modules behind the cover windows, e.g., lenses. Moreover, colored cover windows act as a color filter and create significant challenges to the complex operations of cameras and other optical modules utilizing received and transmitted light.
In meeting the described challenges, the present disclosure provides systems and methods for color tuning optical modules and executing color calibration methods on artificial reality systems and devices. Exemplary embodiments include artificial reality systems with colored lenses specifically tuned to the optical modules of the system. The optical modules can be cameras, such as infrared cameras, visible spectrum cameras, and the like.
In one exemplary embodiment, a device includes a lens, a plurality of cameras positioned behind the lens, a colored coating on the lens, and a processor and non-transitory memory including computer-executable instructions. The plurality of cameras can include a first camera for processing visible light and a second camera for processing infrared light. The colored coating includes a plurality of regions, with each region having a color profile for selectively transmitting light. A first region is positioned in front of the first camera and a second region is positioned in front of the second camera.
The computer-executable instructions, when executed by the processor, cause the device to receive light information indicative of at least one of: visible light received at the first camera or infrared light received at the second camera, wherein the received light information provides environmental information for executing an operation on the device; identify wavelengths reflected by the color profile positioned in front of each camera; determine a color calibration for the light information based on the color profile, wherein the color calibration amplifies the wavelengths reflected by the color profile; update the environmental information based on the color calibration; and execute the operation on the device based on the updated environmental information.
Additional embodiments include a laser emitter positioned behind the lens and a third region on the colored coating having a color profile for selectively transmitting infrared light. Embodiments can include two regions for transmitting visible light, with a region for transmitting infrared light positioned between the two visible light regions.
The colored coating can include a first plurality of layers on an inner face of the lens, and a second plurality of layers on an outer face of the lens. The first plurality of layers can include an inner ink layer, a middle high-contrast layer, and an outer anti-reflective layer. The second plurality of layers can include an inner hard-coat (HC) layer, a middle anti-reflective (AR) layer, and an outer anti-fingerprint (AF) layer. The HC layer increases adhesion between the substrate material and the AR layer and improves performance against scratches and abrasion. The colored coating can also include a plurality of ink layers, with each ink layer reflecting a range of wavelengths.
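By way of illustration only, the layer ordering described above can be captured in software as a simple data structure. The following Python sketch is a hypothetical representation assumed for this description (the class and field names are not part of the disclosure); it merely mirrors the inner-face and outer-face layer order listed above.

    from dataclasses import dataclass

    @dataclass
    class CoatingLayer:
        name: str        # layer type, e.g., "ink", "anti-reflective (AR)"
        face: str        # "inner" or "outer" face of the lens
        position: str    # "inner", "middle", or "outer" within that face, as described above

    # Hypothetical stack mirroring the layer description above.
    COATING_STACK = [
        CoatingLayer("ink", "inner", "inner"),
        CoatingLayer("high-contrast", "inner", "middle"),
        CoatingLayer("anti-reflective", "inner", "outer"),
        CoatingLayer("hard-coat (HC)", "outer", "inner"),
        CoatingLayer("anti-reflective (AR)", "outer", "middle"),
        CoatingLayer("anti-fingerprint (AF)", "outer", "outer"),
    ]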
In embodiments, the second region has less than a 20% transmission rate for wavelengths below 750 nm. In another embodiment, the second region has less than a 10% transmission rate for wavelengths below 730 nm. In another embodiment, the second region has less than a 5% transmission rate for wavelengths below 700 nm. In another embodiment, the second region has greater than a 60% transmission rate for wavelengths above 850 nm.
Each region can include an on-axis color profile, and an off-axis color profile. The on-axis color profile can have greater than a 90% transmission rate for wavelengths above 500 nm. In other embodiments, the on-axis color profile can have greater than a 96% transmission rate for wavelengths between 500-700 nm. The off-axis color profile can have a transmission rate of greater than 64% for wavelengths above 500 nm. The off-axis color profile can have a transmission rate of greater than 73% for wavelengths between 500-700 nm.
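By way of a non-limiting illustration, the transmission targets recited above can be expressed as a simple check against a measured spectrum. The Python sketch below assumes a spectrum given as wavelength (nm) mapped to a transmission fraction, and the example values are invented for illustration; it tests only the infrared-region targets listed above.

    def meets_ir_region_targets(spectrum):
        """Check a measured transmission spectrum against the example
        infrared-region targets recited above (fractions, not percent)."""
        below_750 = [t for wl, t in spectrum.items() if wl < 750]
        below_730 = [t for wl, t in spectrum.items() if wl < 730]
        below_700 = [t for wl, t in spectrum.items() if wl < 700]
        above_850 = [t for wl, t in spectrum.items() if wl > 850]
        return (all(t < 0.20 for t in below_750)       # <20% below 750 nm
                and all(t < 0.10 for t in below_730)   # <10% below 730 nm
                and all(t < 0.05 for t in below_700)   # <5% below 700 nm
                and all(t > 0.60 for t in above_850))  # >60% above 850 nm

    # Hypothetical measurement: strong visible blocking, high IR transmission.
    example = {650: 0.03, 720: 0.08, 745: 0.15, 860: 0.72, 900: 0.75}
    print(meets_ir_region_targets(example))  # True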
In embodiments, the first camera identifies red, green, blue, and yellow wavelength values. Each color profile can further include a transmission profile and reflection profile. The operations on the device can include one or more of generating an image on a display or executing a simultaneous location and mapping (SLAM) function. In some embodiments, the colored coating can be applied to the lens using a pad printing technique. The lens formation and colored coating can also be performed using a thermoformed or injection molding technique. In other embodiments, a colored coating can be applied to the lens using at least one of sputtering and e-beam evaporation techniques.
Exemplary embodiments of the present invention can utilize a variety of hardware, such as glasses, headsets, controllers, peripherals, mobile computing devices, displays, and user interfaces to effectuate the methods and operations discussed herein. Embodiments can further communicate with local and/or remote servers, databases, and computing systems. In various embodiments, the artificial reality device can include glasses, a headset, a display, a microphone, a speaker, and any of a combination of peripherals, and computing systems.
The summary, as well as the following detailed description, is further understood when read in conjunction with the appended drawings. For the purpose of illustrating the disclosed subject matter, there are shown in the drawings exemplary embodiments of the disclosed subject matter; however, the disclosed subject matter is not limited to the specific methods, compositions, and devices disclosed. In addition, the drawings are not necessarily drawn to scale. In the drawings:
The present disclosure provides systems and methods for color tuning optical modules and executing color calibration methods. Embodiments discussed herein allow for cosmetic color tuning on a lens or cover window, with the colored coating having specific optical functionality for the camera and optical modules adjacent to it. As applied to artificial reality devices, lenses of such devices can be tuned to reflect a particular color in non-camera areas and transmit a known but different color in camera areas. As such, any color effects due to lenses in front of the cameras can be calibrated to promote optimal camera performance.
Embodiments include unique optical stack combinations, using anti-reflective, infrared, and/or opaque ink, for example. Such devices and color calibration techniques can be applied to cameras for specified wavelengths, such as infrared in the 840-860 nm range, the visible spectrum range, and others.
The present disclosure can be understood more readily by reference to the following detailed description taken in connection with the accompanying figures and examples, which form a part of this disclosure. It is to be understood that this disclosure is not limited to the specific devices, methods, applications, conditions or parameters described and/or shown herein, and that the terminology used herein is for the purpose of describing particular embodiments by way of example only and is not intended to be limiting of the claimed subject matter.
Also, as used in the specification including the appended claims, the singular forms “a,” “an,” and “the” include the plural, and reference to a particular numerical value includes at least that particular value, unless the context clearly dictates otherwise. The term “plurality”, as used herein, means more than one. When a range of values is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms another embodiment. All ranges are inclusive and combinable. It is to be understood that the terminology used herein is for the purpose of describing particular aspects only and is not intended to be limiting.
It is to be appreciated that certain features of the disclosed subject matter which are, for clarity, described herein in the context of separate embodiments, can also be provided in combination in a single embodiment. Conversely, various features of the disclosed subject matter that are, for brevity, described in the context of a single embodiment, can also be provided separately or in any sub-combination. Further, any reference to values stated in ranges includes each and every value within that range. Any documents cited herein are incorporated herein by reference in their entireties for any and all purposes.
In various embodiments, lenses can appear to be a single, uniform color, despite a plurality of regions with unique optical characteristics. The coating on the lens can comprise a plurality of regions, each of which serves to simultaneously reflect a particular color, while transmitting a known but different color. Camera modules and other hardware behind each lens region can calibrate out the known color distortion to enable normal functionality. Accordingly, the coating, with its distinct color regions, enables the creation and use of lenses in a plurality of colors and designs for a variety of devices, such as AR headsets, other head-mounted devices, and technologies utilizing light filtered through a lens.
An artificial reality device, for example, can comprise a plurality of cameras configured to receive light transmitted through the lens. The lens coating, such as a colored coating, affects the transmission of light through the lens. For example, a lens with a black color coating will typically have a much lower transmission rate than a clear lens. Similarly, the colored lens can act as a filter to light passing through. Cameras receiving light through the lens can be tuned to the unique color characteristics of the lens to ensure accuracy in various operations executed in response to the received image(s).
In embodiments, a lens can comprise a plurality of regions tuned to provide specific optical characteristics based on the hardware, e.g., cameras, light emitters, etc., behind the lens. Lens 100 comprises a plurality of regions tuned to optimize operations related to visible light and infrared light. In particular, region 110 can optimize operations utilizing light in the infrared spectrum 150, and regions 120, 130a, and 130b can optimize operations utilizing light in the visible spectrum 140. The size of each region 110, 120, 130a, 130b can vary, and may be the same or different, depending on the optical requirements of the cameras and artificial reality system. Each region can likewise be positioned anywhere on the lens.
In addition, an artificial reality device can comprise one or more cameras or laser emitters behind each lens region. The lens, which can be a colored lens, can affect operations by the device, such as displaying images, executing location functions, and general operations on a virtual reality device.
In various embodiments, the coating, such as a colored coating, comprises a plurality of regions each comprising a color profile. The color profiles selectively transmit light and can comprise on-axis and off-axis color profiles that transmit light differently, based on the angle of transmission through the lens.
In embodiments, one or more cameras positioned behind each lens region receive transmitted light. A computing system, comprising a processor and non-transitory memory comprising computer-executable instructions, operates with the camera to receive information associated with the received light wavelengths, determine a color calibration, and update the received information to perform one or more operations. In various embodiments, such operations can be artificial reality functions.
In embodiments, the processor and memory can comprise instructions that receive light at one or more cameras. The received light can provide environmental information, such as scene information, that can be used to execute one or more operations on the device. Since the color profiles are known, the computing system can identify, among other things, wavelengths of light reflected by the color profile positioned in front of each camera. The computing system can further determine a color calibration based on the known color profile. In examples, as discussed herein, the color calibration amplifies wavelengths of light reflected by the color profile. The computing system can then update environmental information obtained from the received light, based on the color calibration. The device can then execute one or more operations based on the updated environmental information.
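As a minimal, hypothetical Python sketch of this flow (the region reflection values, gain model, and array shapes are assumptions for illustration and do not limit the embodiments):

    import numpy as np

    # Assumed per-channel reflection fractions for the known color profile in
    # front of a visible-light camera (e.g., a blue-tinted region reflects
    # more blue than red or green).
    REGION_REFLECTION = {"R": 0.05, "G": 0.08, "B": 0.35}

    def determine_color_calibration(reflection):
        """Derive per-channel gains that amplify the wavelengths the coating
        reflects (and therefore attenuates before they reach the camera)."""
        return {ch: 1.0 / (1.0 - r) for ch, r in reflection.items()}

    def update_environmental_information(image_rgb, gains):
        """Apply the color calibration to the received light information."""
        g = np.array([gains["R"], gains["G"], gains["B"]])
        return np.clip(image_rgb * g, 0.0, 1.0)

    # Received light information: a tiny RGB frame captured behind the region.
    raw = np.array([[[0.60, 0.58, 0.40],   # white object; blue channel attenuated
                     [0.20, 0.30, 0.10]]])
    gains = determine_color_calibration(REGION_REFLECTION)
    calibrated = update_environmental_information(raw, gains)
    # The calibrated frame would then feed an operation such as display or SLAM.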
In various embodiments, the executed operation can be a display and/or projection of the environmental information via one or more light emitting devices. The display can occur on a plurality of display devices, such as a monitor, external display, mobile device, AR/VR headset, and the like. In other embodiments, the operation can relate to one or more functions of an AR device, such as a user interaction, processing of visual data, simultaneous location and mapping (SLAM) functions, capturing a picture, obtaining environmental information, or emitting light, e.g., through a laser emitter, light emitting diode (LED), etc., through the lens, and any of a plurality of features and functions utilizing the received light.
In various embodiments, as illustrated in
As illustrated in
In some embodiments, a lens can further include at least one region 110 optimized to enhance operations utilizing infrared light. Like region 120, region 110 can be centrally positioned. In embodiments, region 110 can be placed above other regions, e.g., regions 130a, 130b, 120. Moreover, one or more hardware devices, such as a camera and/or laser emitter, can be positioned behind region 110. A camera behind infrared region 110 can receive light filtered by the color profile of region 110. A laser emitter behind infrared region 110 can emit light through region 110. In any or all cases, a computing system associated with the lens and associated hardware devices can enhance, tune, and/or optimize operations associated with light being received and/or emitted through the color profiles of each region 110, 120, 130a, 130b on the lens.
It will be appreciated that the position of the various regions can be adjusted based on the particular camera, emitter, and/or computing system components behind the lens. For example, visible wavelength regions 130a, 130b can be tuned to enhance operations utilizing visible wavelengths. Such regions 130a, 130b can further enhance user experience and visibility through placement in front of the user's line of sight and by providing greater transmission of wavelengths within the visible spectrum.
In a system utilizing a lens or cover window without a tint, such as a clear lens, and/or in systems that do not utilize any lens or cover window, light 205a does not change as it travels to the camera. To a camera or other light receiving device behind the lens or cover window, the object 210 appears with its natural color. In other words, the lens or cover window does not filter, distort, or otherwise alter the appearance of the object 210.
However, in a system utilizing a colored window 240, such as a lens with a blue tint, light 205b traveling through the colored window 240 becomes distorted, as the colored window selectively reflects certain wavelengths of light and transmits other wavelengths of light. An object 220 viewed through the colored window 240 becomes distorted and can appear to have an inaccurate color. In one example, if the object is a white cup, viewing the object through a blue-tinted lens can make the object appear blue.
To correct this color distortion, image signal processing (ISP) tuning 225 can compensate for the impact of the colored window 240. In embodiments, the ISP tuning 225 provides a color calibration and/or white balance adjustment to compensate for the known color distortion caused by the colored window 240. A computing device in communication with the one or more cameras or light receptors receiving the filtered light can apply ISP tuning 225 techniques to color correct the object 230. Continuing the above example, the lens reflects blue light, causing the user to see a blue tint. The camera behind the colored window 240 accordingly receives less transmitted blue light and needs to color correct for the discrepancy, since the object's color is distorted by the colored window 240. The ISP tuning 225 can color correct this distortion to account for the blue color profile of the colored window and cause the object to appear white, i.e., its natural color.
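To make the white-cup example concrete, the following Python sketch applies a simple gray-world white balance as a stand-in for the ISP tuning 225; the gray-world heuristic and the pixel values are assumptions for illustration, not the disclosed tuning algorithm.

    import numpy as np

    def gray_world_gains(image_rgb):
        """Estimate per-channel gains so the scene average becomes neutral,
        compensating for a known tint such as a blue cover window."""
        means = image_rgb.reshape(-1, 3).mean(axis=0)
        return means.mean() / means

    # A white cup seen through a blue-tinted window: the blue channel reads
    # low at the camera because the coating reflects blue toward the viewer.
    cup = np.full((4, 4, 3), [0.80, 0.78, 0.55])
    corrected = np.clip(cup * gray_world_gains(cup), 0.0, 1.0)
    print(corrected[0, 0])  # roughly equal R, G, B: the cup renders as white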
As discussed herein, many AR/VR devices and headsets utilize a plurality of cameras behind the lens, and execute operations based on the images received. The images are often reflective of environmental information, such as the scene a user sees through the lens. Since the received images typically serve as the foundation for many operations on the artificial reality device, it is essential that the computing system and its processor accurately identify and detect the view through the lens. Accordingly, the ISP tuning operations 225 help ensure that the received light is color corrected, based on the color profile of the lens in front of the camera and/or light receptor device. By knowing the color profile in front of the cameras and/or light receptor device and having the ability to tune and color correct the received light, devices and systems can effectively and accurately function despite the color of the lens. This enables a plurality of lens colors, designs, and configurations, that could not previously be implemented, due to color distortions and inaccuracies caused by filtered light.
Moreover, such systems, methods, and devices can be applied to windows comprising a variety of shapes and sizes, such as flat lenses, curved lenses, and other 2D and 3D lens shapes.
Various embodiments can utilize colored lenses comprising one or more regions comprising a color profile. In embodiments comprising a plurality of regions, two or more regions can have the same or different color profiles. Any of a variety of lens designs and color profiles can be utilized in accordance with embodiments.
In embodiments, a system can receive visible light transmitted through a first region configured to selectively transmit visible light and receive infrared light through a second region configured to selectively transmit infrared light 305. Such regions can be on a colored lens, for example, on an AR/VR headset and/or in accordance with other device embodiments discussed herein. Accordingly, the first region's color profile allows for the selective transmission of visible light and the second region's color profile allows for the selective transmission of infrared light. It will be appreciated that more or fewer regions can be present on systems, and that the particular color profiles defined in step 305 are but one example.
Regardless of various color profiles and the number of regions, exemplary embodiments receive light at a plurality of cameras positioned behind a lens comprising a color coating 310. The light can be indicative of environmental information, such as scenery, a view through the lens, and the like.
In embodiments, received light provides environmental information for executing an operation on the device. In one example, on an artificial reality headset, cameras positioned behind the lens can execute an operation to capture an image intended to reflect a snapshot of the environment beyond the lens. Since the colored lens and the regions in front of the camera distort the light, a color calibration, based on the color profile of the region in front of the camera, can help generate an image with realistic colors (see, e.g.,
When light is first received at the plurality of cameras 310, systems can further identify wavelengths of light reflected by the color profile of a first region positioned in front of a first camera and a second region positioned in front of a second camera 320. In various embodiments, one or more cameras can be positioned behind a region, and the colored lens can comprise a plurality of regions. The design of the lens, with regard to placement and number of regions can vary based on the system's purpose, function, use, and design, among other factors.
The color calibration for light received at each camera can be based on the color profile of the region through which the light travels. A color profile can further comprise a transmission profile and a reflection profile, indicative of wavelengths selectively transmitted and reflected, respectively. In an example, in a region having a color profile tuned to selectively transmit visible light, a computing system can calibrate the received light information based on the wavelengths filtered, reflected, and/or transmitted.
In particular, systems and methods determine a color calibration for light received at each camera based on the color profile, wherein the color calibration amplifies wavelengths of light reflected by the color profile 330. For example, a color profile can comprise a reflection profile, indicative of wavelengths that are reflected. In embodiments, reflection profiles can indicate a percentage, ratio, or other indication of an amount of light reflected per wavelength and/or wavelength range. Similarly, embodiments can utilize reflection profiles associated with the color profile to assist in the determination of the color calibration and the determination of wavelengths of light for amplification.
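By way of illustration, a reflection profile expressed per wavelength band can drive the amplification directly. In the hypothetical Python sketch below, the band edges and reflection fractions are assumed values; the calibration boosts each band in proportion to the fraction the coating reflects.

    # Assumed reflection profile: fraction of light reflected per band (nm).
    reflection_profile = {
        (400, 500): 0.30,   # coating reflects strongly in blue
        (500, 600): 0.10,
        (600, 700): 0.05,
    }

    def band_gains(profile):
        """If a band loses a fraction r to reflection, only (1 - r) reaches
        the camera, so the calibration scales that band by 1 / (1 - r)."""
        return {band: 1.0 / (1.0 - r) for band, r in profile.items()}

    def amplify(band_energy, profile):
        """Apply per-band gains to received light energy keyed by band."""
        gains = band_gains(profile)
        return {band: e * gains[band] for band, e in band_energy.items()}

    measured = {(400, 500): 0.42, (500, 600): 0.63, (600, 700): 0.66}
    print(amplify(measured, reflection_profile))
    # blue band restored to ~0.60; the other bands change only slightly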
Systems and methods can further update the environmental information based on the color calibration 340. As discussed herein, the environmental information can be indicative of a view through the lens, from the perspective of a user or other viewer or viewing device. In other examples, environmental information can comprise one or more objects, colors, and features. Systems and methods execute one or more operations on the device based on the updated environmental information 350.
An example of an operation can be an execution of a simultaneous location and mapping (SLAM) function 360a. Other possible operations include light transmission through the colored coating 360b. Such light transmissions can utilize one or more of a laser emitter, a light emitting diode (LED), or other light emitting device. An operation can comprise generating, projecting, and/or displaying an image on a display 360c. The display can be, for example, one or more monitors, computing devices, screens, or mobile devices in communication with the devices and computing systems utilized herein. Tracking operations, auto-focus, and AR/VR functions, among many other operations, can utilize environmental information.
Systems and devices can determine a color calibration based on the colored coating 420. The color calibration amplifies a reflection color profile associated with the colored coating. Systems and devices update received images based on the color calibration.
In certain embodiments, based on the color calibration, a third color profile can be applied to received images to tune the view through the lens and compensate for the colored coating 440. The system can dynamically adjust the color calibration when the received images indicate a change in the view through the lens 450.
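A hypothetical Python sketch of the dynamic adjustment follows; the mean-absolute-difference trigger, threshold, and gain estimator are assumptions used only to illustrate recalibrating when the view changes.

    import numpy as np

    def scene_changed(prev_frame, new_frame, threshold=0.10):
        """Flag a change in the view through the lens with a simple
        mean-absolute-difference test (an assumed heuristic)."""
        return float(np.mean(np.abs(new_frame - prev_frame))) > threshold

    def maybe_recalibrate(prev_frame, new_frame, current_gains, estimate_gains):
        """Recompute the calibration only when the received images indicate
        the view has changed; otherwise reuse the current gains."""
        if scene_changed(prev_frame, new_frame):
            return estimate_gains(new_frame)
        return current_gains

    # Usage with placeholder frames and a placeholder gain estimator.
    prev, new = np.zeros((2, 2, 3)), np.full((2, 2, 3), 0.5)
    gains = maybe_recalibrate(prev, new, {"R": 1.0, "G": 1.0, "B": 1.0},
                              lambda frame: {"R": 1.0, "G": 1.05, "B": 1.3})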
Such operations can be helpful when utilizing the received images for one or more operations, as discussed herein. In an example, the view through the lens, as observed by the one or more cameras, may be displayed on one or more displays, such as a local display, on a backside of the colored lens, or on one or more external devices. Since the cameras' view through the lens becomes distorted by the colored coating on the lens, the color calibration amplifies the reflected wavelengths to compensate for the effect of the colored coating. Such operations aid in generating accurate images with realistic colors, despite colored coatings. Such operations further enable various colored coatings and designs to be applied to devices, without affecting the function and operation of the device, e.g., AR/VR devices.
The two lenses without any infrared ink coating, corresponding to the curves for Sample A 510 and Sample D 540, demonstrated a consistent transmission rate of over 90% for wavelengths between 400 nm and 900 nm. The lens corresponding to the curve for Sample C 530 may include infrared ink on polycarbonate/polymethylmethacrylate (PC/PMMA). In other embodiments, the sample may include any such ink and substrate combination, such as an ink and transparent polymer combination. For the three curves relating to lenses with infrared coating, i.e., Sample B 520, Sample C 530, and Sample E 550, the lenses have a less than 20% transmission rate for wavelengths below 750 nm, and a less than 10% transmission rate for wavelengths below 730 nm. Above 850 nm, transmission rates increase to at least 60%. In some examples, as with the curve for Sample E 550, transmission rates can increase to 70% or greater for wavelengths of 800 nm and above. While the tested coatings demonstrate transmission rates for infrared inks, it will be appreciated that various types of coatings, directed toward particular wavelengths, can be applied in a similar manner. Likewise, such coatings can include discrete regions on a lens, as discussed herein, and such transmission data can be applicable for determining color profiles, transmission profiles, and reflection profiles for such regions.
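By way of illustration, spectra such as those described above can be reduced to per-band transmission profiles for use in calibration. In the Python sketch below, the spectrum values are invented and the band choices are assumptions; only the reduction step is illustrated.

    def band_average(spectrum, low_nm, high_nm):
        """Average transmission over a wavelength band, given a spectrum
        mapping wavelength (nm) to a transmission fraction."""
        values = [t for wl, t in spectrum.items() if low_nm <= wl <= high_nm]
        return sum(values) / len(values) if values else None

    # Hypothetical spectrum for an infrared-ink-coated lens region.
    ir_ink = {500: 0.04, 600: 0.05, 700: 0.06, 750: 0.18, 850: 0.62, 900: 0.71}
    transmission_profile = {
        "visible (400-700 nm)": band_average(ir_ink, 400, 700),
        "near-IR (840-860 nm)": band_average(ir_ink, 840, 860),
    }
    print(transmission_profile)  # roughly 0.05 visible, 0.62 near-IR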
In embodiments, the lens can be a curved lens, such that an outer portion comprises a convex shape. In the example illustrated in
It will be appreciated that lens designs can comprise more or fewer material layers than illustrated in
Table 1 illustrates data related to transmission profiles for a plurality of lens types and colors, including green, red, blue, clear, and combinations of such colors. The following table provides transmission spectra data for various lens configurations and examples. Transmission profiles, comprising transmission data for a plurality of wavelengths and/or ranges of wavelengths, can provide a basis for color calibration operations. The coloration discussed in the following table is relevant to custom ink meant for near-infrared usage.
As discussed above, embodiments of the present invention comprise lenses having one or more regions, with each region comprising one or more color profiles. A particular region can comprise differing on-axis and off-axis color profiles, each with a transmission profile and a reflection profile. On-axis and off-axis refer to the angle of incidence (AOI) of light received at a particular region. On-axis indicates light received directly, with little to no AOI, while off-axis indicates light received at an angle. Different color profiles can exist for different AOIs and/or ranges of AOIs.
Table 2 illustrates specific transmission requirements for embodiments of camera regions as a function of wavelength. With respect to Table 2, camera regions represent lens regions, e.g., on a lens of an artificial reality device, behind which a camera is positioned and receives light. Based on camera needs for optimal functionality, minimum transmission requirements can be defined to optimize one or more cameras. Table 2 indicates specific requirements for on-axis and off-axis light, e.g., at a 70-degree AOI, for ranges of wavelengths.
In embodiments, an on-axis color profile can transmit over 77% of light between 400-860 nm, with the greatest transmission between 500-700 nm. The on-axis color profile for at least one region can have greater than a 90% transmission rate for wavelengths above 500 nm, and an on-axis color profile for at least one region on a lens can provide over a 96% transmission rate for wavelengths between 500-700 nm. Off-axis color profiles in embodiments can comprise a transmission rate of greater than 64% for wavelengths above 500 nm and/or a transmission rate of greater than 73% for wavelengths between 500-700 nm.
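By way of illustration, the on-axis and off-axis minimums above can be selected programmatically from the angle of incidence. In the Python sketch below, the 30-degree cut-over is an assumption; the 96% and 73% figures mirror the 500-700 nm minimums recited above.

    # Minimum transmission targets for the 500-700 nm band (fractions).
    ON_AXIS_MIN_500_700 = 0.96
    OFF_AXIS_MIN_500_700 = 0.73

    def required_transmission(aoi_degrees, cutover_deg=30.0):
        """Pick which minimum applies to light reaching a camera region,
        based on its angle of incidence (0 degrees = on-axis)."""
        if aoi_degrees < cutover_deg:
            return ON_AXIS_MIN_500_700
        return OFF_AXIS_MIN_500_700

    print(required_transmission(0.0))   # 0.96 (on-axis)
    print(required_transmission(70.0))  # 0.73 (off-axis, e.g., 70-degree AOI)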
Table 3 illustrates color calibration data utilizing on-axis and off-axis color profile information for a blue colored lens. The color calibration identifies the signal to noise ratio (SNR) for red (R), green (G), blue (B), and yellow (Y) wavelengths, both on-axis and off-axis, with regard to a point of reference (Cool White, CW) and Blue. The delta values for the on-axis measurements indicate a drop in SNR, which can be compensated during a color calibration operation. The delta values for the off-axis measurements indicate an SNR enhancement, which can also be compensated during color calibration operations. It will be appreciated that while SNR can serve as a basis for the color calibration operations discussed herein, it is but one example of color profile data and measurements applicable for color calibration operations. Exemplary embodiments can utilize other measurements and values instead of or in addition to the SNR measurements, and each is in accordance with the various embodiments discussed herein.
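Since Table 3 is not reproduced here, the following Python sketch uses invented SNR delta values and an assumed 20*log10 amplitude convention purely to illustrate how an SNR drop could be converted into a compensating gain during calibration.

    # Hypothetical on-axis SNR deltas in dB for a blue lens versus the cool
    # white (CW) reference; negative values indicate the SNR drop to offset.
    snr_delta_db = {"R": -1.2, "G": -0.8, "B": -3.5, "Y": -1.0}

    def compensation_gain(delta_db):
        """Convert an SNR drop in dB into a linear gain that offsets it,
        assuming a 20*log10 amplitude convention."""
        return 10 ** (-delta_db / 20.0)

    gains = {ch: compensation_gain(d) for ch, d in snr_delta_db.items()}
    print(gains)  # the blue channel receives the largest boost (about 1.5x)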
It will be appreciated that while PVD processing methods can form products, devices, and lenses in accordance with embodiments, formation of such embodiments is not limited to such processing methods. A plurality of processing methods, systems, devices, and apparatuses can generate one or more layers and aspects of products and devices in accordance with embodiments.
The processor 32 may be a special purpose processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGAs) circuits, any other type of integrated circuit (IC), a state machine, and the like. In general, the processor 32 may execute computer-executable instructions stored in the memory (e.g., memory 44 and/or memory 46) of the node 30 in order to perform the various required functions of the node. For example, the processor 32 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the node 30 to operate in a wireless or wired environment. The processor 32 may run application-layer programs (e.g., browsers) and/or radio access-layer (RAN) programs and/or other communications programs. The processor 32 may also perform security operations such as authentication, security key agreement, and/or cryptographic operations, such as at the access-layer and/or application layer for example.
The processor 32 is coupled to its communication circuitry (e.g., transceiver 34 and transmit/receive element 36). The processor 32, through the execution of computer executable instructions, may control the communication circuitry in order to cause the node 30 to communicate with other nodes via the network to which it is connected.
The transmit/receive element 36 may be configured to transmit signals to, or receive signals from, other nodes or networking equipment. For example, in an embodiment, the transmit/receive element 36 may be an antenna configured to transmit and/or receive radio frequency (RF) signals. The transmit/receive element 36 may support various networks and air interfaces, such as wireless local area network (WLAN), wireless personal area network (WPAN), cellular, and the like. In yet another embodiment, the transmit/receive element 36 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 36 may be configured to transmit and/or receive any combination of wireless or wired signals.
The transceiver 34 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 36 and to demodulate the signals that are received by the transmit/receive element 36. As noted above, the node 30 may have multi-mode capabilities. Thus, the transceiver 34 may include multiple transceivers for enabling the node 30 to communicate via multiple radio access technologies (RATs), such as universal terrestrial radio access (UTRA) and Institute of Electrical and Electronics Engineers (IEEE 802.11), for example.
The processor 32 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 44 and/or the removable memory 46. For example, the processor 32 may store session context in its memory, as described above. The non-removable memory 44 may include RAM, ROM, a hard disk, or any other type of memory storage device. The removable memory 46 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 32 may access information from, and store data in, memory that is not physically located on the node 30, such as on a server or a home computer.
The processor 32 may receive power from the power source 48, and may be configured to distribute and/or control the power to the other components in the node 30. The power source 48 may be any suitable device for powering the node 30. For example, the power source 48 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
The processor 32 may also be coupled to the GPS chipset 50, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the node 30. It will be appreciated that the node 30 may acquire location information by way of any suitable location-determination method while remaining consistent with an exemplary embodiment.
In operation, CPU 91 fetches, decodes, and executes instructions, and transfers information to and from other resources via the computer's main data-transfer path, system bus 80. Such a system bus connects the components in computing system 200 and defines the medium for data exchange. System bus 80 typically includes data lines for sending data, address lines for sending addresses, and control lines for sending interrupts and for operating the system bus. An example of such a system bus 80 is the Peripheral Component Interconnect (PCI) bus.
Memories coupled to system bus 80 include RAM 82 and ROM 93. Such memories may include circuitry that allows information to be stored and retrieved. ROMs 93 generally contain stored data that cannot easily be modified. Data stored in RAM 82 may be read or changed by CPU 91 or other hardware devices. Access to RAM 82 and/or ROM 93 may be controlled by memory controller 92. Memory controller 92 may provide an address translation function that translates virtual addresses into physical addresses as instructions are executed. Memory controller 92 may also provide a memory protection function that isolates processes within the system and isolates system processes from user processes. Thus, a program running in a first mode may access only memory mapped by its own process virtual address space; it cannot access memory within another process's virtual address space unless memory sharing between the processes has been set up.
In addition, computing system 200 may contain peripherals controller 83 responsible for communicating instructions from CPU 91 to peripherals, such as printer 94, keyboard 84, mouse 95, and disk drive 85.
Display 86, which is controlled by display controller 96, is used to display visual output generated by computing system 200. Such visual output may include text, graphics, animated graphics, and video. Display 86 may be implemented with a cathode-ray tube (CRT)-based video display, a liquid-crystal display (LCD)-based flat-panel display, gas plasma-based flat-panel display, or a touch-panel. Display controller 96 includes electronic components required to generate a video signal that is sent to display 86.
Further, computing system 200 may contain communication circuitry, such as for example a network adaptor 97, that may be used to connect computing system 200 to an external communications network, such as network 12 of
This disclosure contemplates any suitable number of computer systems 1700. This disclosure contemplates computer system 1700 taking any suitable physical form. As example and not by way of limitation, computer system 1700 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, or a combination of two or more of these. Where appropriate, computer system 1700 may include one or more computer systems 1700; be unitary or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 1700 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 1700 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 1700 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In exemplary embodiments, computer system 1700 includes a processor 1702, memory 1704, storage 1706, an input/output (I/O) interface 1708, a communication interface 1710, and a bus 1712. Although this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
In exemplary embodiments, processor 1702 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, processor 1702 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 1704, or storage 1706; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 1704, or storage 1706. In particular embodiments, processor 1702 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 1702 including any suitable number of any suitable internal caches, where appropriate. As an example and not by way of limitation, processor 1702 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 1704 or storage 1706, and the instruction caches may speed up retrieval of those instructions by processor 1702. Data in the data caches may be copies of data in memory 1704 or storage 1706 for instructions executing at processor 1702 to operate on; the results of previous instructions executed at processor 1702 for access by subsequent instructions executing at processor 1702 or for writing to memory 1704 or storage 1706; or other suitable data. The data caches may speed up read or write operations by processor 1702. The TLBs may speed up virtual-address translation for processor 1702. In particular embodiments, processor 1702 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 1702 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 1702 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 1702. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In exemplary embodiments, memory 1704 includes main memory for storing instructions for processor 1702 to execute or data for processor 1702 to operate on. As an example and not by way of limitation, computer system 1700 may load instructions from storage 1706 or another source (such as, for example, another computer system 1700) to memory 1704. Processor 1702 may then load the instructions from memory 1704 to an internal register or internal cache. To execute the instructions, processor 1702 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 1702 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 1702 may then write one or more of those results to memory 1704. In particular embodiments, processor 1702 executes only instructions in one or more internal registers or internal caches or in memory 1704 (as opposed to storage 1706 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 1704 (as opposed to storage 1706 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 1702 to memory 1704. Bus 1712 may include one or more memory buses, as described below. In exemplary embodiments, one or more memory management units (MMUs) reside between processor 1702 and memory 1704 and facilitate accesses to memory 1704 requested by processor 1702. In particular embodiments, memory 1704 includes random access memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 1704 may include one or more memories 1704, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In exemplary embodiments, storage 1706 includes mass storage for data or instructions. As an example and not by way of limitation, storage 1706 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 1706 may include removable or non-removable (or fixed) media, where appropriate. Storage 1706 may be internal or external to computer system 1700, where appropriate. In exemplary embodiments, storage 1706 is non-volatile, solid-state memory. In particular embodiments, storage 1706 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 1706 taking any suitable physical form. Storage 1706 may include one or more storage control units facilitating communication between processor 1702 and storage 1706, where appropriate. Where appropriate, storage 1706 may include one or more storages 1706. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
In exemplary embodiments, I/O interface 1708 includes hardware, software, or both, providing one or more interfaces for communication between computer system 1700 and one or more I/O devices. Computer system 1700 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 1700. As an example and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 1708 for them. Where appropriate, I/O interface 1708 may include one or more device or software drivers enabling processor 1702 to drive one or more of these I/O devices. I/O interface 1708 may include one or more I/O interfaces 1708, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In exemplary embodiments, communication interface 1710 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 1700 and one or more other computer systems 1700 or one or more networks. As an example and not by way of limitation, communication interface 1710 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 1710 for it. As an example and not by way of limitation, computer system 1700 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 1700 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 1700 may include any suitable communication interface 1710 for any of these networks, where appropriate. Communication interface 1710 may include one or more communication interfaces 1710, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
In particular embodiments, bus 1712 includes hardware, software, or both coupling components of computer system 1700 to each other. As an example and not by way of limitation, bus 1712 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 1712 may include one or more buses 1712, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such, as for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.
The scope of this disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of this disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, features, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, features, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Additionally, although this disclosure describes or illustrates particular embodiments as providing particular advantages, particular embodiments may provide none, some, or all of these advantages.