This invention relates generally to computer systems, and more particularly to computer systems utilizing graphical user interfaces.
Graphical user interfaces (GUIs) are becoming increasingly popular with computer users. It is generally accepted that computers having graphical user interfaces are easier to use, and that it is quicker to learn an application program in a GUI environment than in a non-GUI environment.
A relatively new type of computer which is well suited for graphical user environments is the pen-based or pen-aware computer system, hereinafter generically referred to as a “pen computer system,” “pen computer,” or the like. A pen-based computer system is typically a small, hand-held computer where the primary method for inputting data includes a “pen” or stylus. A pen-aware computer system is one which has been modified to accept pen inputs in addition to traditional input methods.
A pen computer system is often housed in a relatively flat enclosure, and has a dual-function display assembly which serves as both an input device and an output device. When operating as an input device, the display assembly senses the position of the tip of a stylus on the viewing screen and provides this positional information to the computer's central processing unit (CPU). Some display assemblies can also sense the pressure of the stylus on the screen to provide further information to the CPU. When operating as an output device, the display assembly presents computer-generated images on the screen.
Typically, graphical images can be input into the pen computer systems by merely moving the stylus across the surface of the screen, i.e. making a “stroke” on the screen. A stroke can be defined as the engagement of the screen with a stylus, the movement of the stylus across the screen (if any), and its subsequent disengagement from the screen. As the CPU senses the position and movement of the stylus, it can generate a corresponding image on the screen to create the illusion that the stylus is drawing the image directly upon the screen, i.e., that the stylus is “inking” an image on the screen. With suitable recognition software, text and numeric information can also be entered into the pen-based computer system in a similar fashion. Methods for recognizing the meaning of “ink” are well known to those skilled in the art.
Pen computer systems tend to discourage the use of a keyboard as an input device. Most of the software written for pen computers is designed to function well with pen strokes and by “tapping” the stylus against the computer screen in defined areas. A “tap” is a stroke which does not move substantially across the screen. In addition, a primary feature of many pen computer systems is their portability, which a keyboard, if included with the pen system, would seriously degrade.
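By way of illustration only, a stroke may be represented as the set of points sampled between pen-down and pen-up, and a tap distinguished from a drawing stroke by the total excursion of those points. The following C sketch is a minimal illustration of such a classification; the structure names and the threshold value are assumptions, not part of the described embodiments.

```c
#include <stdlib.h>

/* Illustrative representation of a stroke: the points sampled between the
 * engagement of the stylus with the screen and its disengagement. */
typedef struct { int x, y; } Point;

typedef struct {
    Point *points;   /* samples reported while the stylus engages the screen */
    int    count;
} Stroke;

#define TAP_THRESHOLD 4   /* assumed excursion limit, in pixels */

/* A "tap" is a stroke which does not move substantially across the screen. */
int is_tap(const Stroke *s)
{
    int i;
    for (i = 1; i < s->count; i++) {
        if (abs(s->points[i].x - s->points[0].x) > TAP_THRESHOLD ||
            abs(s->points[i].y - s->points[0].y) > TAP_THRESHOLD)
            return 0;    /* moved substantially: a drawing stroke, not a tap */
    }
    return 1;
}
```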
In some instances, however, the need arises on a pen-based computer for data entry in a keyboard-like fashion. For example, the pen-based computer might be running a non-pen aware program that normally accepts characters from a keyboard. Also, in some cases, the only way to enter data efficiently might be to use a keyboard-like input device.
In particular, a need might arise on a pen computer to enter a command or character that is normally or most efficiently executed with keystrokes on a keyboard-based system. In some pen computer systems, such keyboard-like entry of commands can be accomplished using a keyboard image displayed on the screen of the pen computer. The keyboard image resembles a standard keyboard, and keys are selected using a stylus. Most keyboard commands and characters can be entered in this fashion. Another alternative is to provide a recognition window for inputting handwritten data which is then recognized and sent to an application program as if it were typed from a keyboard. A problem with all such input approaches is that they occupy valuable screen space, which is often very limited on pen computer systems.
The efficient use of the available display screen space for observation of images and windows containing images, while particularly pronounced for pen computer systems, is common to all computer systems which display information or images to the user. No matter how large a particular display may be, a particular user will often attempt to display more information on the screen than can effectively be handled.
Images or information presented on a display screen are typically presented as opaque images, i.e., images “behind” a displayed image are obscured. This is the case with display windows which are layered on a particular screen, with the uppermost window image partially or completely blocking the view of the lower windows. For two windows to be capable of interaction, it is preferable that the user be able to observe both images at the same time, or at close to the same time.
The present invention provides for the selective creation, establishment, and processing of opaque and translucent images and opaque and translucent windows independently or in connection with other translucent images or a base opaque image provided on a display screen of a computer system. The provision of the translucent image of the present invention makes it possible to optimize space usage of the computer screen itself. Further, the invention also advantageously allows a translucent image to be formed proximate to and with specific reference to particular elements of opaque application images beneath it.
The invention further includes a method for providing a translucent image on the screen of a computer system including the steps of: 1) displaying a translucent image on the screen such that at least one opaque image can be seen through the translucent image, and 2) conducting operations with respect to either the translucent image or upon opaque images on the screen of the computer system. Both translucent and opaque image fields can be employed, which can each be completely blank without any features or elements. Particular operations upon images are considered to be image operations in regions or domains which are defined to be either translucent or opaque regions. Further, the translucent image involved may be a so-called “overlay” image produced by a computer implemented process of the present invention referred to herein as the “overlay utility.”
The present invention additionally provides a translucent overlay image over a base image provided on a screen of a pen computer system. The overlay image can serve as an input device for application programs while only partially obscuring images made on the screen by those application programs. The provision of the translucent overlay image of the present invention makes it possible to use much or all of the screen of the pen computer system for input. It also advantageously allows controls in the overlay image to be formed proximate to specific elements of application images beneath it.
A method for providing an overlay image on the screen of a computer system in accordance with the present invention includes the steps of: 1) displaying a base image on the screen of the computer system; and 2) displaying an overlay image on the screen such that overlapped portions of the application image can be seen through the overlay image. Preferably, the base image is produced by an unmodified application program running on the computer system, and the overlay image is produced by a computer implemented process of the present invention referred to herein as the “overlay utility”.
A method for displaying images on a screen of a selected computer system in accordance with the present invention includes the steps of: 1) running an application program on a central processing unit (CPU) of a computer system to produce a base opaque image on a screen coupled to the CPU; and 2) running an overlay program on the CPU to produce a translucent image on the screen such that portions of an opaque base image which are overlapped by the overlay image are at least partially visible through the overlay image. Preferably, the step of running the overlay program includes the steps of: 1) displaying a translucent image on the screen; 2) intercepting screen inputs which contact the overlay image; 3) processing the intercepted screen inputs in the CPU; and 4) updating the application program based upon the processed screen inputs. The step of displaying a translucent image preferably involves the blending of a translucent image with the base image. In one embodiment of the present invention, the blending is accomplished within the CPU, and in another embodiment of the present invention, the blending is accomplished externally to the CPU in specialized video driver circuitry.
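A minimal sketch of these four steps, in C, is given below. The helper routines are hypothetical stand-ins; as described later, the actual overlay utility performs these steps through operating system patches and a blending engine.

```c
#include <stdio.h>

/* Hypothetical stand-ins for the four steps of running the overlay program;
 * the actual overlay utility performs them through operating system patches
 * and a blending engine, as described later. */
typedef struct { int x, y; } PenEvent;

static void display_translucent_image(void)        { /* step 1: blend overlay with base */ }
static int  overlay_contains(int x, int y)          { return x < 200 && y < 100; }   /* toy bounds */
static char process_intercepted_input(int x, int y) { (void)x; (void)y; return 'r'; } /* toy result */
static void update_application(char c)              { printf("sent '%c' to application\n", c); }

int main(void)
{
    PenEvent e = { 50, 40 };                  /* a tap that lands on the overlay */
    display_translucent_image();              /* step 1: display the translucent image */
    if (overlay_contains(e.x, e.y)) {         /* step 2: intercept inputs contacting the overlay */
        char c = process_intercepted_input(e.x, e.y);  /* step 3: process the intercepted input */
        update_application(c);                         /* step 4: update the application */
    }
    return 0;
}
```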
A computer system in accordance with the present invention includes a central processing unit (CPU), a screen assembly coupled to the CPU, a mechanism coupled to the screen assembly for displaying a base image on the screen assembly, and a mechanism coupled to the screen assembly for displaying a translucent image on the screen assembly such that portions of the base image which are overlapped by the overlay image are at least partially visible through the overlay image. Preferably, the screen assembly includes an LCD matrix display provided with input from a stylus, a pen, a trackball, a mouse, or a keyboard, as the case may be.
In the computer system of the present invention, the mechanism for displaying the opaque base image preferably includes a first computer implemented process running on the CPU to produce first video data, and video driver circuitry coupled between the CPU and the screen assembly, which is receptive to the first video data. Also preferably, the mechanism for displaying the translucent image includes a second computer implemented process running on the CPU producing second video data, wherein the video driver circuitry is also receptive to the second video data. The computer system blends the first video data and the second video data to produce a blended image on the screen assembly. In one embodiment of the present invention, the blending is part of the second computer implemented process running on the CPU. In another embodiment of the present invention, the blending is accomplished within the hardware of the video driver circuitry.
The computer system according to the invention includes a central processing unit (CPU), a screen for displaying images, the screen being coupled to said CPU, a display coupled to the screen for displaying a translucent image, and an arrangement for conducting image operations beneath the level of a translucent image produced by the display. The computer system may, for example, according to one embodiment, be effective to perform image operations with reference to a translucent image on the screen. The computer system according to the invention may further include a screen coupled to the CPU, a display coupled to the screen for displaying a translucent image on the screen, and an arrangement for conducting image operations with reference to a translucent image or an opaque image on the display. The computer system may further include an arrangement effective for conducting selectable image operations with reference to a translucent image or an opaque image on a display screen.
An advantage of the present invention is that a translucent overlay can be provided which permits a user to input data into an active application program without obscuring the user's view of the program's display window. The overlay image of the present invention is therefore well suited for computer systems having limited display areas, including for example pen computer systems.
Another advantage of the overlay image of the present invention is that it works with both pen-aware and non-pen-aware application programs. Therefore, the overlay image of the present invention can be used with the many thousands of application programs which are not designed to be used in pen computer systems.
These and other advantages of the present invention will become apparent upon reading the following detailed descriptions and studying the various figures of the drawings.
FIG. 18 is a view of a Macintosh computer screen showing a desktop, a window produced by an application program called “AppleShare” and a utility program known as “PenBoard”;
FIG. 19 illustrates a non-transparent overlay which mostly obscures the desktop and window of the AppleShare application program;
FIG. 20 illustrates the overlay keyboard after it has been made translucent by the method and apparatus of the present invention;
FIGS. 21a-21c illustrate the entry of data to the active window of the AppleShare program;
FIG. 22 is a diagram illustrating the “Display an Overlay Image” step 138 of FIG. 6B;
FIG. 23 illustrates an alternate embodiment of the “Display an Overlay Image” step 138 of FIG. 6B;
FIG. 24 illustrates the operation of the “Blending Engine” 1190 of FIG. 23;
FIG. 25 illustrates video driver circuitry of a prior art Macintosh computer system produced by Apple Computer, Inc. of Cupertino, Calif.; and
FIG. 26 illustrates video driver circuitry in accordance with the present invention which provides overlay VRAM and blending capabilities.
As shown in
The CPU 12 is preferably a commercially available, single chip microprocessor, and is preferably a complex instruction set computer (CISC) chip such as the 68040 microprocessor available from Motorola, Inc. CPU 12 is coupled to ROM 14 by a data bus 28, control bus 29, and address bus 31. ROM 14 contains the basic operating system for the computer system 10. CPU 12 is also connected to RAM 16 by busses 28, 29, and 31 to permit the use of RAM 16 as scratch pad memory. Expansion RAM 17 is optionally coupled to RAM 16 for use by CPU 12. CPU 12 is also coupled to the I/O circuitry 18 by data bus 28, control bus 29, and address bus 31 to permit data transfers with peripheral devices.
I/O circuitry 18 typically includes a number of latches, registers and direct memory access (DMA) controllers. The purpose of I/O circuitry 18 is to provide an interface between CPU 12 and such peripheral devices as display screen assembly 20 and mass storage 24.
Display assembly 20 of computer system 10 is both an input and an output device. Accordingly, it is coupled to I/O circuitry 18 by a bi-directional data bus 36. When operating as an output device, the display assembly 20 receives data from I/O circuitry 18 via bus 36 and displays that data on a suitable screen. The screen for display assembly 20 can be a liquid crystal display (LCD) of the type commercially available from a variety of manufacturers. The input device (“tablet”) of a preferred display assembly 20 in accordance with the invention can be a thin, clear membrane which covers the LCD display and which is sensitive to the position of a stylus 38 on its surface. Alternatively, the tablet can be an embedded RF digitizer activated by an “active” RF stylus. Combination display assemblies are available from a variety of vendors.
Other types of user inputs can also be used in conjunction with the present invention. While the method of the present invention is described in the context of a pen system, other pointing devices such as a computer mouse, a track ball, or a tablet can be used to manipulate a pointer or a cursor 39 on a screen of a general purpose computer. Therefore, as used herein, the terms “pointer,” “pointing device,” “pointer inputs” and the like will refer to any mechanism or device for pointing to a particular location on a screen of a computer display.
Some type of mass storage 24 is generally considered desirable. However, the mass storage 24 can be eliminated by providing a sufficient amount of RAM 16 and expansion RAM 17 to store user application programs and data. In that case, RAMs 16 and 17 can be provided with a backup battery to prevent the loss of data even when the computer system 10 is turned off. However, it is generally desirable to have some type of long term storage 24 such as a commercially available miniature hard disk drive, nonvolatile memory such as flash memory, battery-backed RAM, PC-data cards, or the like.
In operation, information is input into the computer system 10 by “writing” on the screen of display assembly 20 with stylus 38. Information concerning the location of the stylus 38 on the screen of the display assembly 20 is input into the CPU 12 via I/O circuitry 18. Typically, this information comprises the Cartesian (i.e., x & y) coordinates of a pixel of the screen of display assembly 20 over which the tip of the stylus is positioned. Commercially available combination display assemblies include appropriate circuitry to provide the stylus location information as digitally encoded data to the I/O circuitry of the present invention. The CPU 12 then processes the data under control of an operating system and possibly an application program stored in ROM 14 and/or RAM 16. The CPU 12 then produces data which is output to the display assembly 20 to produce appropriate images on its screen.
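Purely for illustration, the positional information delivered to the CPU might take a form such as the following C structure; the field names, widths, and the optional pressure field are assumptions rather than a description of any particular tablet.

```c
#include <stdint.h>

/* Illustrative form of the positional information a combination display
 * assembly might report to the I/O circuitry: the Cartesian (x, y) pixel
 * coordinates under the stylus tip, an engagement flag, and an optional
 * pressure value. Field names and widths are assumptions. */
typedef struct {
    uint16_t x;          /* column of the pixel under the stylus tip       */
    uint16_t y;          /* row of the pixel under the stylus tip          */
    uint8_t  down;       /* non-zero while the stylus engages the screen   */
    uint8_t  pressure;   /* stylus pressure, 0 if the tablet lacks support */
} StylusSample;
```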
Expansion bus 22 is coupled to the data bus 28, the control bus 29 and the address bus 31, similar to the other components in system 10. Expansion bus 22 provides extra ports to couple devices such as modems, display switches, microphones, speakers, etc., to the CPU 12.
Next, a “process cursor” operation is undertaken, according to step 46. According to this step, as will be noted in greater detail below, particularly with reference to
Next, according to step 48, it is determined whether or not an overlay task is requested. If not, process control returns to point A preceding the process cursor step 46, and system operation cycles through the process cursor step 46 and decision step 48 repeatedly until an overlay task is requested in step 48.
Two overlay tasks in accordance with the present invention include a “translucent request” and an “opaque request.” If there is a translucent request, then step 50 undertakes the operation of rendering a desired image translucent. Similarly, if there is an opaque request, then step 52 is undertaken to render a desired image opaque. After completing either step 50 or 52, control returns to point A with a subsequent process cursor operation being conducted according to step 46. The essential functions of the process cursor operation are as expressed with reference to
To indicate the implementation of the invention in greater detail,
It should be noted that, in this preferred embodiment, wand icon 66 is used to designate the overlay task which is tested in step 48 of
Accordingly, by following the steps of
Translucency and opaqueness can be selected in a variety of manners, such as by express keyboard commands. Furthermore, a user may perform a number of image activities in the translucent window with reference to underlying opaque window 62. In this case, the user has selected a simple tracing operation to duplicate the image of underlying circle 68, albeit with a slightly smaller radius. The process of the invention accordingly permits the accomplishment of any of a range of desired tasks. For example, if instead of circle 68, a complex image of a photograph of a house were displayed in opaque window 62, according to the process of the invention, a translucent overlay window could be suitably positioned thereover, permitting the user to make a sketch of selected features of the house on the overlying translucent window.
An alternate version of the invention is shown with reference to
With respect to the question of precisely how the image operation outlined in
The coordinate space 80 defined for the particular computer system 10 ranges from coordinates (−32,767; −32,767) to (+32,767; +32,767), thereby defining the space in terms of a selected pair of diagonal corner points. The top left corner coordinate points of the operating system screen 81 and the translucent or overlay screen 82 are, for example, (0,0) and (0′,0′), respectively. The blending process to be discussed below essentially blends the domains of the respective coordinate image screens 81 and 82 together for display on screen 60. According to a preferred version of the invention, the blended or overlapping regions are displayed on screen 60 as 50% half-tone images, whether in color or otherwise.
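A minimal sketch of such a 50% half-tone blend, assuming for illustration an 8-bit grayscale pixel format and simple linear buffers, is shown below; real pixel formats and buffer geometry depend on the display hardware.

```c
#include <stdint.h>

/* Illustrative 50% blend of an overlay buffer onto a base buffer of 8-bit
 * grayscale pixels: each output pixel is the average of the base pixel and
 * the overlay pixel, giving the 50% half-tone appearance described above. */
void blend_half_tone(const uint8_t *base, const uint8_t *overlay,
                     uint8_t *out, int width, int height)
{
    long i, n = (long)width * height;
    for (i = 0; i < n; i++)
        out[i] = (uint8_t)(((unsigned)base[i] + overlay[i]) / 2);
}
```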
The implementation of computer process 133, as will be seen with reference to
When the Frame Rect routine 104 goes to pointer table 106 in an attempt to call Show Cursor Routine 108, process control is instead diverted to a process 112 known as “Overlay Show Cursor Patch.” The Overlay Show Cursor Patch process 112 interacts with a Blending Engine process 114 to blend a first screen image 116 (see
In
In
By way of additional detail, process step 138 of
In
Much of the operation of the process illustrated in
The operating system 172, as part of its functioning, will make periodic calls to various system task processes. The system task 198 performs such functions as executing “Device Driver Code” and “Desk Accessory Code.” The process of the present invention opportunistically takes advantage of these periodic system task calls by modifying a pointer table 200 to turn over process control to an Overlay System Task Patch 202. This Overlay System Task Patch, along with the Overlay Shield Cursor Patch 186, the Overlay Show Cursor Patch 188, and the Blending Engine 190, comprises the overlay utility 133 of
The MMU modification of
In
In
The blocks are chained together as indicated in
The purpose of the header “padding” is for page alignment. The pages 288 are aligned in memory so that the MMU can properly map onto them. The number of bytes in the header padding 298 depends on where the header happens to be allocated in memory. If it is only a few bytes from a page boundary, then the header padding is only a few bytes in length. In some cases, the header padding may approach a full page in size (4K in this instance). Trailer “padding” 290 contains the remaining bytes in the block, which is allocated at a fixed size. Again, this fixed size in the preferred embodiment is 4K.
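The amount of header padding can be sketched as a simple calculation against the 4K page size; the function below is an illustrative assumption of how such a calculation might be expressed.

```c
#include <stddef.h>
#include <stdint.h>

#define PAGE_SIZE 4096u   /* page size used in the preferred embodiment */

/* Number of padding bytes needed after a header ending at address header_end
 * so that the data which follows begins on a page boundary. If the header
 * already ends on a boundary, no padding is required. */
size_t header_padding_bytes(uintptr_t header_end)
{
    size_t offset = (size_t)(header_end % PAGE_SIZE);
    return offset ? PAGE_SIZE - offset : 0;
}
```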
In FIG. 18, a screen 1040 of a Macintosh computer system made by Apple Computer, Inc., of Cupertino, Calif., includes a desktop image 1042 produced by a Macintosh operating system, a window 1044 produced by an “AppleShare” application program made by Apple Computer, Inc., and a palette 1046 produced by a small application program or “utility” known as “PenBoard” made by Apple Computer, Inc. The desktop 1042, which includes a menu bar 1048 and a desk area 1050, often displays a number of icons 1052, 1054 and 1056, which represent different objects or functions. For example, the icon 1052 represents a hard disk drive; icon 1054 represents the “trash can” in which files can be deleted; and icon 1056 represents a folder which can contain applications and documents of various types. The menu bar 1048 preferably includes a number of labels 1058, 1060, and 1062 for pull-down menus, as is well known to Macintosh users.
As mentioned previously, the desktop 1042 is created by the operating system (sometimes referred to as the “Finder”). The Finder can be considered to be a specialized form of application program which displays an image on the entirety of the screen 1040. In other words, the “window” size of the desktop 1042 is the same size as the screen 1040. The application program AppleShare which creates the window 1044 typically does not take over the entire screen 1040. Similarly, the palette 1046 (which is just a specialized form of window) is produced by the PenBoard application, and does not occupy the entire space of the screen 1040.
As is apparent by studying FIG. 18, the screen 1040 can quickly become occupied with icons, windows and palettes. This is not a major problem in traditional computer systems wherein the primary forms of input comprise keyboards and pointer devices, such as mice. However, in pen computer systems where these more traditional forms of input devices are not always available, the limitations of screen size become readily apparent.
In FIG. 19, a keyboard image 1064 has been provided on screen 1040 to aid in the input of data to the AppleShare application program described previously. Preferably, this keyboard image 1064 is provided by dragging a keyboard icon 1066 off of the PenBoard palette 1046 in a fashion more fully described in copending U.S. patent application Ser. No. 08/060,458, filed May 10, 1993, on behalf of Gough et al., entitled “Method and Apparatus for Interfacing With a Computer System”, and assigned to the assignee of the present application, the disclosure of which is hereby incorporated herein by reference in its entirety. As can be seen in this FIG. 19, the keyboard image 1064 completely obscures the icons 1052, 1054 and 1056 of FIG. 18, and almost totally obscures the window 1044 of the AppleShare application program. Information can be entered into the window 1044 of the application program from the keyboard image 1064 by “tapping” on a “key” with the stylus 38. For example, arrow 1068 on the keyboard image 1064 represents the “tapping” on the key “R” with the stylus 38. This tapping action will send an “R” to be displayed in the window 1044 of the AppleShare application just as if an “R” had been typed on a physical keyboard. Again, the functioning of the keyboard image 1064 is discussed in the aforementioned copending U.S. patent application of Gough et al.
While the keyboard image 1064 can be used to input data into a currently active application program (such as AppleShare), the keyboard image prevents the user from seeing feedback for the information being entered into application windows obscured by the keyboard image. Therefore, it is difficult for the user to determine whether data has been properly entered into the application program. This, in turn, slows down the data entry process and greatly increases the chances for errors.
The present invention solves this problem, as illustrated in FIG. 20. A user taps on a “translucency” icon 1069 on the keyboard image 1064 of FIG. 19 with the stylus 38 to cause the keyboard 1064 to become translucent. By translucent it is meant herein that the overlay image can be seen, but it can also be seen through. Tapping on the translucency icon 1069 of the keyboard image 1064′ of FIG. 20 would cause the “solid” keyboard image 1064 of FIG. 19 to reappear.
As can be seen, the translucent keyboard image 1064′ allows the window 1044 and icons 1052, 1054, and 1056, to be seen through the translucent keyboard image 1064′. In other words, portions of base images which are overlapped by the keyboard image 1064′, can still be seen (with some loss in resolution) through the translucent keyboard image 1064′.
The functioning of the keyboard image 1064′ will be explained in greater detail with reference to FIGS. 21a-21c. In FIG. 21a, the stylus 38 is used to “tap” on the “r” key as indicated by the arrow 1068 and the shading of the “r” key. The keyboard image 1064′ “intercepts” the tap 1068 which would otherwise fall on the window 1044, and instead causes an “r” to be sent to the AppleShare program and be displayed in a password field of the window 1044. (Actually, AppleShare would display a “bullet” instead of the “r” to maintain the security of the password, but it will be assumed in this example that the typed password will remain visible). The “r” within the password field of window 1044 can be seen through the translucent window 1064′ in this figure. In FIG. 21b, a second tap 1068 on the “i” key will cause the keyboard image 1064′ to “intercept” the tap which would otherwise fall on the window 1044, and to send an “i” character to the AppleShare application program which then displays an “i” after the “r” in the password field of window 1044. Next, as seen in FIG. 21c, the “p” key is tapped at 1068, causing the keyboard 1064′ to intercept the tap which would otherwise fall on the window 1044 and to send the “p” character to the AppleShare program which displays the character in the password field after the characters “r” and “i.” Other characters and control characters (such as the “return” button 1070) can be sent to the application program controlling window 1044 in a similar fashion.
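The interception step can be sketched in C as a hit test of the tap coordinates against the key rectangles of the translucent keyboard; the key map and the routine that posts a keystroke to the active application are hypothetical stand-ins for the actual Toolbox mechanisms.

```c
#include <stdio.h>

/* Hypothetical key map: each key of the translucent keyboard image occupies
 * a rectangle on the screen and maps to a character code. Coordinates are
 * toy values for illustration. */
typedef struct { int left, top, right, bottom; } Rect;
typedef struct { Rect bounds; char ch; } OverlayKey;

static const OverlayKey overlay_keys[] = {
    { { 100, 200, 130, 230 }, 'r' },
    { { 130, 200, 160, 230 }, 'i' },
    { { 160, 200, 190, 230 }, 'p' },
};

/* Stand-in for posting a keystroke to the currently active application. */
static void send_char_to_active_application(char c)
{
    printf("forwarding '%c' to the active application\n", c);
}

/* Returns 1 if the tap was intercepted by the overlay keyboard (and a
 * keystroke forwarded), or 0 if it should fall through to the window beneath. */
int overlay_intercept_tap(int x, int y)
{
    int i;
    for (i = 0; i < (int)(sizeof overlay_keys / sizeof overlay_keys[0]); i++) {
        const Rect *r = &overlay_keys[i].bounds;
        if (x >= r->left && x < r->right && y >= r->top && y < r->bottom) {
            send_char_to_active_application(overlay_keys[i].ch);
            return 1;
        }
    }
    return 0;
}
```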
It will be apparent with a study of FIGS. 20 and 21a-21c that the translucent keyboard image 1064′ is a distinctly superior user interface for situations in which screen area is at a premium. Since images “beneath” the translucent keyboard image 1064′ can be seen through the keyboard image, the user has immediate feedback as to the accuracy of his or her input to the active application program. For example, if a key were “tapped” in error, the backspace key 1072 can be tapped on the translucent keyboard 1064′ so that the correct character can be reentered. The translucent keyboard 1064′ therefore effectively expands the useful area of screen 1040 by providing multiple, usable, overlapped images.
A preferred method in accordance with the present invention for implementing the process 133 on a Macintosh computer system is illustrated with reference to FIG. 22. The illustrated method of FIG. 22 is fairly specific to the Macintosh computer system. It will therefore be apparent to those skilled in the art that when the process 133 is implemented on other computer systems, such as MS-DOS compatible computer systems and UNIX computer systems, the methodology of FIG. 22 will have to be modified. However, such modifications will become readily apparent to those skilled in the art after studying the following descriptions of how the process 133 is implemented on the Macintosh computer system.
In FIG. 22, the operating system, application program, overlay utility, system routines, etc., are shown in a somewhat hierarchical fashion. At the highest level is the operating system 1096 of the computer system 10 of FIG. 1. Running under the operating system 1096 is an application program 1098, such as the aforementioned AppleShare application program. Application program 1098, when it wants to open a window such as window 1044 of FIG. 18, calls a set of routines 1100 provided by the operating system 1096. More specifically, in the Macintosh operating system, application program 1098 calls a “New Window” routine 1102 which, in turn, calls a “Frame Rect” routine 1104. The Frame Rect routine uses a pointer table 1106 to call a “Shield Cursor” routine 1107 and a “Show Cursor” routine 1108. If the application program 1098 were running on system 1096 without the process 133 of the present invention, this would be the entirety of the calls to open up the window 1044 of FIG. 18. This process is extensively documented in the multi-volume reference set, Inside Macintosh, by C. Rose et al., Addison-Wesley Publishing Company, Inc., July 1988, and is well known to those skilled in the art of programming on the Macintosh operating system.
The implementation of computer implemented process 133 modifies this normal flow of routine calls in the following way. When the application program 1098 calls the New Window routine 1102, which calls the Frame Rect routine 1104, and the Frame Rect routine 1104 attempts to call the Shield Cursor Routine, the Frame Rect routine 1104 instead calls a portion of the process of step 138 of FIG. 6B known as the Overlay Shield Cursor Patch 1110. This is accomplished by having the process 138 modify the pointer table 1106 such that when the Frame Rect routine 1104 is trying to call the Shield Cursor Routine 1107 it, instead, calls the Overlay Shield Cursor Patch 1110. After the Overlay Shield Cursor Patch 1110 completes its process, the Shield Cursor Routine 1107 is then called. As far as the Frame Rect routine 1104 is concerned, it does not know of the diversion of process control to the Overlay Shield Cursor Patch process 1110, and instead believes that it directly called the Shield Cursor Routine 1107.
The process step 138 of FIG. 6B similarly “tricks” the Frame Rect routine 1104 when it attempts to call the Show Cursor Routine 1108. In that instance, when the Frame Rect routine 1104 goes to the pointer table 1106 in an attempt to call the Show Cursor Routine 1108, process control is instead diverted to a process 1112 known as “Overlay Show Cursor Patch”. The Overlay Show Cursor Patch process 1112 interacts with a Blending Engine process 1114 to blend a first screen image 1116 generated by the Macintosh operating system and the application program, with a second image 1118 (in this case, the keyboard image) to form the blended image 1120. The operation of the Blending Engine will be discussed in greater detail subsequently. After the completion of the blending process of 1114, the Overlay Show Cursor Patch process 1112 turns over process control to the “Show Cursor Routine” process 1108. Again, as far as the Frame Rect routine 1104 is concerned, it made a direct call to the “Show Cursor Routine” 1108 and was ignorant of the diversion of the process control to the Overlay Show Cursor Patch 1112 and the Blending Engine 1114.
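The pointer-table diversion described above can be sketched, in simplified form, as saving the original routine address and installing a patch that chains to it; the table layout below is an illustrative model, not the actual Macintosh trap dispatch mechanism.

```c
/* Simplified model of the diversion: a pointer table holds the address of
 * each system routine; the overlay utility records the original entry and
 * substitutes its own patch, which chains to the original so that the
 * calling routine never notices the diversion. */
typedef void (*Routine)(void);

static void show_cursor_routine(void)   { /* stand-in for the original system routine */ }
static void blending_engine_blend(void) { /* stand-in: blends base and overlay images  */ }

static Routine pointer_table[] = { show_cursor_routine };
static Routine original_show_cursor;

static void overlay_show_cursor_patch(void)
{
    blending_engine_blend();     /* blend the images before the cursor is redrawn */
    original_show_cursor();      /* then chain to the routine the caller expected */
}

void install_overlay_patches(void)
{
    original_show_cursor = pointer_table[0];    /* remember the original entry */
    pointer_table[0]     = overlay_show_cursor_patch;
}
```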
FIG. 23 illustrates an alternate embodiment of the present invention which has been optimized for screen-writing speed. While the process of FIG. 22 works very well, it requires that the entirety of the base screen 1116 be rewritten whenever the blended image 1120 is to be refreshed. The alternative process of FIG. 23 only refreshes the portions of the blended image that need to be refreshed, thereby greatly increasing the writing speed to the screen 1040.
Much of the operation of the process illustrated in FIG. 23 is similar to that described in FIG. 22. An operating system 1172 supports an application program 1174 which, when it wants to open a window, calls a set of routines 1176 including a “New Window routine” 1178 and Frame Rect routine 1180. The Frame Rect routine 1180 then, as before, attempts to call the Shield Cursor Routine 1182 first and then the Show Cursor Routine 1184. Again, as before, the pointer table is modified such that when the Frame Rect routine tries to call the Shield Cursor Routine 1182, it instead calls the Overlay Shield Cursor Patch 1186 of the present invention, and when the Frame Rect routine 1180 attempts to call the Show Cursor Routine 1184 it instead calls the Overlay Show Cursor Patch 1188. The Overlay Show Cursor Patch calls a Blending Engine 1190 which blends a partial base image 1192 with an overlay image 1194 to create a blended image 1196.
The operating system 1172, as part of its functioning, will make periodic calls to various system task processes 1198. The system task 1198 performs such functions as executing “Device Driver Code” and “Desk Accessory Code.” The process of the present invention opportunistically takes advantage of these periodic system task calls by modifying a pointer table 1200 to turn over process control to an Overlay System Task Patch 1202. This Overlay System Task Patch, along with the Overlay Shield Cursor Patch 1186, the Overlay Show Cursor Patch 1188, and the Blending Engine 1190, comprises the overlay utility 133 of FIGS. 6A and 6B in this second preferred embodiment.
FIG. 24 is used to illustrate the operation of the Blending Engine 1190 of FIG. 23 in greater detail. The process 138 of FIG. 6B remaps certain pages of VRAM to the RAM screen buffer when an overlay image contains objects that overlap these pages. The RAM overlay screen buffer 1194 is then merged with the RAM screen buffer 1192 in the Blending Engine 1190 by a process similar to that previously described, and the blended image is inserted into a “hole” 1204 of the VRAM screen buffer 1196. The portions 1206 and 1208 of the VRAM screen buffer remain in VRAM since the overlay image of the present invention does not overlap pages comprising these portions of the screen.
Since portions 1206 and 1208 are pages of VRAM screen buffer memory which are not overlapped, at least in part, by an overlay image of the present invention, these portions 1206 and 1208 can remain in the VRAM screen buffer. The VRAM screen buffer is much faster memory for video purposes than the RAM screen buffer 1192. Also, changes made to the RAM screen buffer 1192 or to the RAM overlay screen buffer 1194 that do not cause a change in portions 1206 and 1208 do not require that the system blend the portions 1206 and 1208. The combination of these factors substantially increases the blending speed of the VRAM screen buffer and therefore of the display on screen 1040.
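A sketch of this page-selective refresh is shown below: only pages overlapped by the overlay are blended from the RAM buffers into the VRAM “hole,” while the remaining pages are left untouched in VRAM. The buffer layout, page bookkeeping, and 50/50 blend are illustrative assumptions.

```c
#include <stdint.h>

#define PAGE_BYTES 4096   /* page size, as in the memory layout described above */

/* Illustrative refresh: page p of the screen is blended from the RAM buffers
 * into VRAM only when the overlay overlaps it; untouched pages (such as the
 * portions corresponding to 1206 and 1208) stay in fast VRAM as-is. */
void refresh_blended_screen(const uint8_t *ram_base,        /* RAM screen buffer          */
                            const uint8_t *ram_overlay,     /* RAM overlay screen buffer  */
                            uint8_t       *vram,            /* VRAM screen buffer         */
                            const int     *page_overlapped, /* 1 if overlay touches page  */
                            int            page_count)
{
    int p;
    long i;
    for (p = 0; p < page_count; p++) {
        if (!page_overlapped[p])
            continue;                         /* leave this page in VRAM untouched */
        for (i = 0; i < PAGE_BYTES; i++) {
            long k = (long)p * PAGE_BYTES + i;
            vram[k] = (uint8_t)(((unsigned)ram_base[k] + ram_overlay[k]) / 2);
        }
    }
}
```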
FIGS. 25 and 26 are used to illustrate an alternate embodiment of the present invention wherein the blending of the base image and the overlay image is performed in the video driver hardware rather than within a computer implemented process on the CPU. In FIG. 25, a prior art video driver system of a Macintosh computer system is illustrated. In this prior art example, the video driver circuit 1302 is coupled to an address bus 1304 and a data bus 1306 connected to a Motorola 68030 microprocessor. The video driver circuit 1302 includes a color screen controller CSC 1307, and two banks of VRAM 1308 and 1310. The CSC 1307 produces LCD control and data on a bus 1312 which control a black and white or color liquid crystal display (LCD). For example, the video driver circuit 1302 can drive an Esher LCD circuit for a 640 by 400 bit display, with eight bits of information per pixel.
In FIG. 26, a modified video driver circuit 1302′ is coupled to the same Motorola 68030 address bus 1304 and data bus 1306, and includes the same CSC 1307, VRAM 1308, and VRAM 1310. However, the data and address connections have been modified as indicated. In this implementation, data from the screen buffer and the overlay screen buffer are input into the VRAM of modified video driver circuit 1302′, and combined therein to provide LCD control and blended data on the bus 1312. Again, the video driver circuit 1302′ can control a black and white or color LCD, except this time, instead of having eight bits per pixel, there are four bits allocated to the base image and four bits allocated to the overlay image. A color look-up table (CLUT), not shown, of CSC 1307 is loaded with 256 entries which detail each possible combination of bits from the 4-bit screen and the 4-bit overlay, and the resultant blended value for each combination. The color capability of the CSC 1307 is therefore no longer used for color look-up, and is instead used for the blending values. This technique makes it possible to use off-the-shelf integrated circuits, such as the CSC 1307 which is available from Chips & Technologies, Inc. of San Jose, Calif., to perform an entirely new operation.
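The 256-entry table can be sketched as follows: each entry is indexed by the 4 bits of base pixel value and 4 bits of overlay pixel value packed into one byte, and holds the blended output value. The packing order and the equal-weight blend below are assumptions for illustration; the actual values are hardware and application dependent.

```c
#include <stdint.h>

/* Illustrative construction of the 256-entry blending table: the table index
 * packs 4 bits of base pixel value with 4 bits of overlay pixel value, and
 * the entry holds the blended output value the controller should produce.
 * The packing order and the equal-weight blend are assumptions. */
void build_blend_clut(uint8_t clut[256])
{
    int base, overlay;
    for (base = 0; base < 16; base++)
        for (overlay = 0; overlay < 16; overlay++)
            clut[(base << 4) | overlay] = (uint8_t)((base + overlay) / 2);
}
```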
In summary, the method of the invention includes establishing translucent images on a display screen, including displaying a translucent image and conducting image operations enabled by the translucent image. Image operations can be any kind of operation conducted on an image or window. Drawing an image, placing an image, or for that matter modifying, moving, expanding, or changing an image or a window, are considered to be image operations. A reference image could be provided by a selected first application program. The translucent image could be produced by a selected second application program. The user is thus enabled to make sketches on the translucent image or window based upon what he or she sees on the base image produced by the first application program. This is made possible without any direct intervention in the operations of the first application program. In short, the features of the first application program are advantageously employed, without any modification of the first application program itself. The technical enablement of this cooperative screen is found in a feature of the invention according to which the second application program intercepts certain screen inputs of the first application program and uses them to supply the screen input needed by the second application program.
The image operations enabled by the concurrent interoperability of the two applications can be implemented by user selected intervention at any of a number of screen operational levels. The base image or window is considered to operate at a lower level, or below the level of the translucent image or window. Thus, the translucent image or window is known as the “overlay” image or window. Typically, the cursor is active at the particular level at which the user can operate. In any case, according to the invention, it may be useful to operate at either the base level, i.e., the level of the base image or window, or at the translucent or overlay level. In other words, user input is permitted at either the base image or the translucent image. By a particular user input with respect to an image, the user invokes a selected computer implemented process, and that process receives screen inputs which contact or are otherwise associated with a particular window and is effective for processing those screen inputs. These various inputs are controllable selectively by the user, in that users can take specific actions to determine which of the levels will be active for them. This can, for example, be accomplished by clicking or activating a pen or stylus, or by another well-known user action. A particular window just opened is automatically active, as the newest window created or activated. Another window or image can be activated merely by user selection in positioning the cursor over the window or image and clicking on the mouse, trackball or another applicable interface device.
While this invention has been described in terms of several preferred embodiments, it is contemplated that many alterations, permutations, and equivalents will be apparent to those skilled in the art. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
This application is a broadening reissue of U.S. Pat. No. 6,072,489, issued on Jun. 6, 2000, and a continuation application of copending broadening reissue U.S. application Ser. No. 12/437,500, filed on May 7, 2009, which is a continuation application of broadening reissue Ser. No. 10/163,748, filed on Jun. 5, 2002, now U.S. Pat. No. Re. 41,922. U.S. Pat. No. 6,072,489 is a continuation-in-part of patent application Ser. No. 08/060,572, filed May 10, 1993 under the title “Method and Apparatus for Displaying an Overlay Image,” now U.S. Pat. No. 5,638,501 on behalf of Gough et al. and assigned to the same assignee as herein, the disclosure of which is hereby incorporated herein by reference in its entirety. Priority rights and claims of benefit based upon this earlier-filed patent application are claimed.
Number | Name | Date | Kind |
---|---|---|---|
4555775 | Pike | Nov 1985 | A |
4686522 | Hernandez et al. | Aug 1987 | A |
4783648 | Homma et al. | Nov 1988 | A |
4823281 | Evangelisti et al. | Apr 1989 | A |
4827253 | Maltz | May 1989 | A |
4868765 | Dieffendorff | Sep 1989 | A |
4914607 | Takanashi et al. | Apr 1990 | A |
4954970 | Walker et al. | Sep 1990 | A |
4959803 | Kiyohara et al. | Sep 1990 | A |
4974196 | Iwami et al. | Nov 1990 | A |
4992781 | Iwasaki et al. | Feb 1991 | A |
5119476 | Texier | Jun 1992 | A |
5124691 | Sakamoto et al. | Jun 1992 | A |
5157384 | Greanias et al. | Oct 1992 | A |
5185808 | Cok | Feb 1993 | A |
5233686 | Rickenbach et al. | Aug 1993 | A |
5252951 | Tannenbaum et al. | Oct 1993 | A |
5260697 | Barrett et al. | Nov 1993 | A |
5265202 | Krueger et al. | Nov 1993 | A |
5283560 | Bartlett | Feb 1994 | A |
5283867 | Bayley et al. | Feb 1994 | A |
5307452 | Hahn et al. | Apr 1994 | A |
5313227 | Aoki et al. | May 1994 | A |
5313571 | Hirose et al. | May 1994 | A |
5333255 | Damouth | Jul 1994 | A |
5351067 | Lumelsky et al. | Sep 1994 | A |
5367453 | Capps et al. | Nov 1994 | A |
5398309 | Atkins et al. | Mar 1995 | A |
5425137 | Mohan et al. | Jun 1995 | A |
5425141 | Gedye | Jun 1995 | A |
5463726 | Price | Oct 1995 | A |
5463728 | Blahut et al. | Oct 1995 | A |
5467441 | Stone et al. | Nov 1995 | A |
5467443 | Johnson et al. | Nov 1995 | A |
5469540 | Powers, III et al. | Nov 1995 | A |
5469541 | Kingman et al. | Nov 1995 | A |
5475812 | Corona et al. | Dec 1995 | A |
5491495 | Ward et al. | Feb 1996 | A |
5528738 | Sfarti et al. | Jun 1996 | A |
5581243 | Ouellette et al. | Dec 1996 | A |
5581670 | Bier et al. | Dec 1996 | A |
5590265 | Nakazawa | Dec 1996 | A |
5596690 | Stone et al. | Jan 1997 | A |
5613050 | Hochmuth et al. | Mar 1997 | A |
5617114 | Bier et al. | Apr 1997 | A |
5638501 | Gough et al. | Jun 1997 | A |
5651107 | Frank et al. | Jul 1997 | A |
5652851 | Stone et al. | Jul 1997 | A |
5684939 | Foran et al. | Nov 1997 | A |
5729704 | Stone et al. | Mar 1998 | A |
5798752 | Buxton et al. | Aug 1998 | A |
5818455 | Stone et al. | Oct 1998 | A |
5831615 | Drews et al. | Nov 1998 | A |
5949432 | Gough et al. | Sep 1999 | A |
6072489 | Gough et al. | Jun 2000 | A |
7505046 | Louveaux | Mar 2009 | B1 |
RE41922 | Gough et al. | Nov 2010 | E |
Number | Date | Country |
---|---|---|
0280582 | Aug 1988 | EP |
0635779 | Jan 1995 | EP |
0635780 | Jan 1995 | EP |
0280582 | Jul 1995 | EP |
0635779 | Oct 2000 | EP |
0635780 | Jan 2001 | EP |
H2-114319 | Apr 1990 | JP |
1991288891 | Dec 1991 | JP |
H3-288891 | Dec 1991 | JP |
Entry |
---|
Greenberg, An Interdisciplinary Laboratory for Graphics Research and Applications, Siggraph, Jul. 1977, pp. 90-97. |
Snyder, An Interactive Tool for Placing Curved Surfaces without Interpenetration, Microsoft Corporation, 1995, pp. 209-218. |
Joe Abernathy, “Power in your palm?: Apple's Newton MessagePad and Tandy's Tandy Z-PDA personal digital assistants (Hardware Review) (Evaluation)”, PC World, vol. 11, No. 11, Nov. 1993, p. 84(2), PC World Communications Inc., San Francisco. |
Apple News Release, “Apple Announces Major Agreements for Newton Technology At Live '93,” Apple Computer Inc., Cupertino, CA, Sep. 16, 1993, 2 pages, London, UK. |
Apple News Release, “Apple Showcases Newton Family Features in CES Progress Report,” Apple Computer Inc., Cupertino, CA, Jan. 8, 1993, pp. 29-30, Las Vegas, NV. |
Abstract of Product Announcement, “Big hit? Or Sculley's folly? (the Apple Newton personal digital assistant) (Product Announcement),” Fortune, vol. 128, No. 2, Jul. 26, 1993, p. 52(2), Fortune Magazine published by Time Inc., New York, NY. |
Henry Bortman, “PDAs: Newton Talks (Apple's personal digital assistant gaining connectivity options),” MacUser, vol. 10, No. 2, Feb. 1994, p. 173(1), Ziff-Davis Publishing Company, New York, NY. |
Henry Bortman, “The Newton Generation (Hardware Review) (Apple Newton MessagePad),” MacUser, vol. 9, No. 10, Oct. 1993, p. 101(8), Ziff-Davis Publishing Company, New York, NY. |
Henry Bortman, “Apple's Newton grows up (Newton MessagePad 110)” MacUser, vol. 10, No. 5, May 1994, p. 37 (1), Ziff-Davis Publishing Company, New York, NY. |
Cameron Crotty, “Sneak Peak: new Newton OS,” Macworld, vol. 12, No. 11, Nov. 1995, p. 139(1), Macworld Communications, Inc., St Framingham, MA. |
Apple News Release, “First Newton—The MessagePad—Hits The Market,” Apple Computer Inc., Cupertino, CA, Jul. 30, 1993, 4 pages, Cupertino, CA. |
Beth Freedman, “Third Parties, Apple form Newton consortium (Newton Industry Association),” PC Week, vol. 10, No. 49, Dec. 13, 1993, p. 3(1), Ziff-Davis Publishing Company, New York, NY. |
David Hallerman, “Newton falls short. (Hardware Review),” Home Office Computing, vol. 11, No. 11, Nov. 1993, p. 54(2), Scholastic Inc., New York, NY. |
Apple News Release, “Large Corporations Line Up Behind Newton,” Apple Computer Inc., Cupertino, CA, Aug. 2, 1993, 2 pages, Cupertino, CA. |
Apple News Release, “Launch of Newton Industry Association,” Apple Computer Inc., Cupertino, CA, Dec. 7, 1993, 1 page, Santa Clara, CA. |
Apple News Release, “Apple Computer, Inc. Reports Record Revenue and Unit Shipments for Fourth Fiscal Quarter,” Apple Computer Inc., Cupertino, CA, Oct. 14, 1993, 2 pages, Cupertino, CA. |
Cary Lu, “A Small Revelation: Newton has arrived-at long last,” MacWorld, Sep. 1993, pp. cover, 102 and 104-106, Macworld Communications, Inc., St Framingham, MA. |
John Markoff, “Apple's Newton Poised for a Rebirth,” The New York Times, Sep. 18, 1995, 2 pages, The New York Times Company, New York, NY. |
Mike McGuire, “Newton PDA sales cause traffic jams,” PC Week, vol. 10, No. 31, Aug. 9, 1993, p. 20(1), Ziff-Davis Publishing Company, New York, NY. |
Mike McGuire, “Apple Newton demand strong despite glitches,” PC Week, vol. 10, No. 38, Sep. 27, 1993, p. 146(1), Ziff-Davis Publishing Company, New York, NY. |
Michael J. Miller, “Design: Apple Newton MessagePad,” PC Magazine, vol. 12, No. 22, Dec. 21, 1993, p. 142(1), Ziff-Davis Publishing Company, New York, NY. |
Michael Moeller, “Apple exec gives blueprint For Newton, PDA features,” PC Week, vol. 11, No. 44, Nov. 7, 1994, p. 57(2), Ziff-Davis Publishing Company, New York, NY. |
Mark Moore, “Updated Newton OS given boost in handwriting support,” PC Week, vol. 12, No. 47, Nov. 27, 1995, p. 35(1), Ziff-Davis Publishing Company, New York, NY. |
Jane Morrissey, “Apple stock gains on Newton debut, more price cuts.” PC Week, vol. 10, No. 31, Aug. 9, 1993, p. 143(1), Ziff-Davis Publishing Company, New York, NY. |
Apple News Release, “Newton MessagePad Hits The Ground Running,” Apple Computer Inc., Cupertino, CA, Aug. 18, 1993, 2 pages, Cupertino, CA. |
Apple News Release, “Newton MessagePad Sales Exceed 50,000,” Apple Computer Inc., Cupertino, CA, Sep. 30, 1993, 2 pages, Cupertino, CA. |
Robin Raskin, “Apple Newton: the journey continues,” PC Week, vol. 13, No. 1, Jan. 11, 1994, p. 31(1), Ziff-Davis Publishing Company, New York, NY. |
Charles Seiter, Abstract of “Apple Newton MessagePad (Hardware Review)” MacWorld, vol. 10, No. 12, Dec. 1993, p. 52(2), Macworld Communications, Inc., St Framingham, MA. |
Charles Seiter, “Apple Newton MessagePad, ” MacWorld, vol. 10, No. 12, Dec. 1993, p. cover, 52-53, Macworld Communications, Inc., St Framingham, MA. |
Alison L. Sprout, Abstract of “Getting the most out of Newton.” Fortune, vol. 130, No. 2, Jul. 25, 1994, p. 237(1), Fortune Magazine published by Time Inc., New York, NY. |
Mitzi Waltz, Abstract of “The great (little) communicator. (Apple Newton MessagePad),” MacWorld, vol. 10, No. 12, Dec. 1993, p. 188(1), Macworld Communications, Inc., St Framingham, MA. |
Scheifler, R.W. & James W. Gettys., “The X Window System,” ACM Transactions on Graphics, vol. 6, No. 2, Apr. 1986 pp. 79-109. |
Henry, Tyson R., et al., “Integrating Gesture and Snapping into a User Interface Toolkit,” Dept. of Computer Science, University of Arizona, 1990, ACM 089791-4104/90/0010/0112m, pp. 112-122. |
Hiroshi Ishii et al., “Toward an Open Shared Workspace: Computer and Video Fusion Approach of Teamworkstation,” Communications of the ACM, Dec. 1991, pp. 37-50, vol. 34, No. 12. |
Bartlett, Joel F., “Transparent Controls for Interactive Graphics,” Jul. 1992, WRL Technical Note TN-3, Published by Digital Equipment Corporation. |
Bier, Eric A., et al., “A Taxonomy of See-Through Tools,” 1994, Proceedings of CHI, pp. 358-364. |
Roberts, W.T., et al., NeWS and X, Beauty and the Beast?, Department of Computer Science, Jul. 25, 1988, pp. 1-50, Queen Mary College, United Kingdom, London. |
The Green Door, “Business Opportunities in Consumer Electronics,” Aug. 1992, pp. 1-4, Sun Confidential and Proprietary. |
Behind the Green Door, “Deep Thoughts on Business Opportunities in Consumer Electronics,” Presented by The Members of the Green Team, Aug. 1991, pp. 1-46, Sun Confidential and Proprietary. |
Beyond the Green Door, “Further Thoughts on Business Opportunities in Consumer Electronics,” Edward Frank, Michael Sheridan, Jun. 8, 1992, pp. 1-19, Sun Confidential and Proprietary. |
FirstPerson Inc. “A Wholly-Owned Subsidiary of Sun Microsystems, Inc. Business Plan,” Edward Frank, Michael Sheridan, with David Lehman, Jun. 8, 1992, pp. 1-61, Sun Confidential and Proprietary. |
Akeley et al, High-performance Polygon Rendering, Computer Graphics, vol. 22, No. 4, Aug. 1988, pp. 239-246. |
Hiroshi Ishii and Kazuho Arita, “ClearFace: Translucent Multiuser Interface for TeamWorkStation,” in ACM SIGCHI Bulletin, Oct. 1991, pp. 67-68, vol. 23, No. 4, ACM, New York, New York. |
Douglas C. Engelbart and William K. English, “A Research Center for Augmenting Human Intellect,” AFIPS Conference Proceedings of the 1968 Fall Joint Computer Conference, Dec. 1968, pp. 395-410, vol. 33, San Francisco, California. Reprinted by Thompson Book Company, Washington D.C. |
Hiroshi Ishii and Kazuho Arita, “ClearFace: Translucent Multiuser Interface for TeamWorkStation,” Proceedings of ECSCW-91, Sep. 1991, pp. 163-174, Amsterdam, The Netherlands, Editors L. Bannon, M. Robinson and K. Schmidt. |
Foley, J.D., Van Dam, A., Feiner, S.K., Hughes, J.F., Computer Graphics: Principles and Practice, 1990, pp. 754-758, 909-910, Second Edition, Addison-Wesley Publishing Company, Reading, Massachusetts. |
Hearn, Donald and Baker, M. Pauline, Computer Graphics, 1994, pp. 508-511, Second Edition, Prentice Hall, Inc., Englewood Cliffs, New Jersey. |
Vince, John, Computer Animation, 1992, pp. 134, 314, Addison-Wesley Publishing Company, Reading, Massachusetts. |
Angel, Edward, Interactive Computer Graphics: A Top-Down Approach with OpenGL, 1997, pp. 57-58, 214-215, 412-414, Addison-Wesley Longman, Inc., Reading, Massachusetts. |
Glassner, Andrew S., Editor, Graphics Gems, 1990, pp. 397-399, Academic Press, Inc., San Diego, California. |
IBM Technical Disclosure Bulletin, “Transparent Window Selection”, vol. 30, No. 11, Apr. 1988, pp. 268-270. |
Anonymous, “Method to Allow Users to Select Transparent Color for Windows”, Mar. 1993, Research Disclosure. |
Bier et al., “Toolglass and Magic Lenses: The See-Through Interface,” 1993, Computer Graphics Proceedings, Annual Conference Series. |
GUI, “Method allowing user to select transparent color for windows”, Research Disclosure, Mar. 1993. |
Number | Date | Country | |
---|---|---|---|
Parent | 12437500 | May 2009 | US |
Child | 08130079 | US | |
Parent | 10163748 | Jun 2002 | US |
Child | 08130079 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 08060572 | May 1993 | US |
Child | 08130079 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 08130079 | Sep 1993 | US |
Child | 13874286 | US | |
Parent | 08130079 | Sep 1993 | US |
Child | 12437500 | US |