The present invention relates to providing content to an output device and, in particular, to providing universal output in which an information apparatus can pervasively output content to an output device without the need to install a dedicated, device-dependent driver or application for each output device.
The present invention relates to universal data output and, in particular, to providing a new data output method and a new raster image process for information apparatuses and output devices.
As described herein, information apparatuses refer generally to computing devices, which include both stationary computers and mobile computing devices (pervasive devices). Examples of such information apparatuses include, without limitation, desktop computers, laptop computers, networked computers, palmtop computers (hand-held computers), personal digital assistants (PDAs), Internet enabled mobile phones, smart phones, pagers, digital capturing devices (e.g., digital cameras and video cameras), Internet appliances, e-books, information pads, and digital or web pads. Output devices may include, without limitation, fax machines, printers, copiers, image and/or video display devices (e.g., televisions, monitors and projectors), and audio output devices.
For simplicity and convenience, the following descriptions may hereafter refer to an output device as a printer and to an output process as printing. However, it should be understood that the terms printer and printing used in the discussion of the present invention refer to one embodiment used as a specific example to simplify the description of the invention. The references to printer and printing used here are intended to be applied or extended to the larger scope and definition of output devices and should not be construed as restricting the scope and practice of the present invention.
Fueled by ever-increasing bandwidth, processing power, wireless mobile devices, and wireless software applications, millions of users are or will be creating, downloading, and transmitting content and information using their pervasive or mobile computing devices. As a result, there is a need to allow users to conveniently output content and information from their pervasive computing devices to any output device. As an example, people need to output directly and conveniently from their pervasive information apparatuses, without depending on synchronization with a stationary computer (e.g., a desktop personal computer) for printing.
To illustrate, a mobile worker at an airport receiving e-mail on his hand-held computer may want to walk up to a nearby printer or fax machine to have his e-mail printed. In addition, the mobile worker may also want to print a copy of his to-do list, appointment book, business card, and flight schedule from his mobile device. As another example, a user visiting an e-commerce site using his mobile device may want to print out a transaction confirmation. In still another example, a user who takes a picture with a digital camera may want to easily print it out to a nearby printer. In any of the above cases, the mobile user may want to simply walk up to a printer and conveniently print a file (word processing document, PDF, HTML, etc.) that is stored on the mobile device or downloaded from a network (e.g., the Internet or a corporate network).
Conventionally, an output device (e.g., a printer) is connected to an information apparatus via a wired connection such as a cable. A wireless connection is also possible using, for example, radio or infrared communication. Regardless of whether the connection is wired or wireless, a user must first install in the information apparatus an output device driver (e.g., a printer driver in the case where the output device is a printer) corresponding to a particular output device model and make. Using a device-dependent or device-specific driver, the information apparatus may process output content or a digital document into a specific output device's input requirements (e.g., printer input requirements). The output device's input requirements correspond to the type of input that the output device (e.g., a printer) understands. For example, a printer's input requirements may include a printer-specific input format (e.g., one or more of an image, graphics or text format or language). Therefore, output data (or print data in the case where the output device is a printer) herein refers to data that is acceptable for input to an associated output device. Examples of input requirements may include, without limitation, audio format, video format, file format, data format, encoding, language (e.g., page description language, markup language, etc.), instructions, protocols or data that can be understood or used by a particular output device make and model.
Input requirements may be based on proprietary or published standards or a combination of the two. An output device's input requirements are, therefore, in general, device dependent. Different output device models may have their own input requirements specified, designed or adopted by the output device manufacturer (e.g., the printer manufacturer) according to a specification for optimal operation. Consequently, different output devices usually require the use of specific output device drivers (e.g., printer drivers) for accurate output (e.g., printing). Sometimes, instead of using a device driver (e.g., a printer driver), the device driving feature may be included as part of application software.
Installation of a device driver (e.g., a printer driver) or application may be accomplished by, for example, manual installation using a CD or floppy disk supplied by the printer manufacturer. Alternatively, a user may download a particular driver or application from a network. For a home or office user, this installation process may take anywhere from several minutes to several hours depending on the type of driver and the user's level of sophistication with computing devices and networks. Even with plug-and-play driver installation, the user is still required to execute a multi-step process for each printer or output device.
This installation and configuration process adds a degree of complexity and work to end-users who may otherwise spend their time doing other productive or enjoyable work. Moreover, many unsophisticated users may be discouraged from adding new peripherals (e.g., printers, scanners, etc.) to their home computers or networks to avoid the inconvenience of installation and configuration. It is therefore desirable that an information apparatus be able to output to more than one output device without the inconvenience of installing multiple dedicated, device-dependent drivers.
In addition, conventional output or printing methods may pose significantly higher challenges and difficulties for mobile device users than for home and office users. The requirement to pre-install a device-dependent driver diminishes the benefit and concept of mobile (pervasive) computing and output. For example, a mobile user may want to print or output e-mail, PowerPoint® presentation documents, web pages, or other documents at an airport, gas station, convenience store, kiosk, hotel, conference room, office, home, etc. It is highly unlikely that the user would find at any of these locations a printer of the same make and model as the one at the user's base station. As a consequence, under the conventional printing method, the user would have to install and configure a printer driver each time at each such remote location before printing. This is usually not a viable option given the hundreds or even thousands of printer models in use and the limited storage, memory space, and processing power of the information apparatus.
Moreover, the user may not want to be bothered with looking for a driver, downloading it, and installing it just to print out or display one page of e-mail at the airport. Such a process is undesirable and discourages pervasive or mobile computing. Therefore, a more convenient printing method is needed in support of the pervasive computing paradigm, in which a user can simply walk up to an output device (e.g., a printer or display device) and easily output a digital document without having to install or pre-install a particular output device driver (e.g., a printer driver).
Another challenge for mobile users is that many mobile information apparatuses have limited memory space, processing capacity and power. These limitations are more apparent for small and low-cost mobile devices including, for example, PDAs, mobile phones, screen phones, pagers, e-books, Internet Pads, Internet appliances etc. Limited memory space poses difficulties in installing and running large or complex printer or device drivers, not to mention multiple drivers for a variety of printers and output devices. Slow processing speed and limited power supply create difficulties driving an output device. For example, processing or converting a digital document into output data by a small mobile information apparatus may be so slow that it is not suitable for productive output. Intensive processing may also drain or consume power or battery resources. Therefore, a method is needed so that a small mobile device, with limited processing capabilities, can still reasonably output content to various output devices.
To output or render content (e.g., a digital document) to an output device, a raster image processing (RIP) operation on the content is usually required. The RIP operation can be computationally intensive and may include (1) a rasterization operation, (2) a color space conversion, and (3) a halftoning operation. RIP may also include other operations such as scaling, segmentation, color matching, color correction, gray component replacement (GCR), black generation, image enhancement, compression/decompression, encoding/decoding, and encryption/decryption, among others.
The rasterization operation in RIP involves converting objects and descriptions (e.g., graphics, text, etc.) included in the content into an image form suitable for output. Rasterization may include additional operations such as scaling and interpolation for matching a specific output size and resolution. Color space conversion in RIP includes converting an input color space description into a suitable color space required for rendering at an output device (e.g., RGB to CMYK conversion). Digital halftoning is an imaging technique for rendering continuous tone images using fewer luminance and chrominance levels. Halftoning operations such as error diffusion can be computationally intensive and are included when the output device's bit depth (e.g., bits per pixel) is smaller than the input raster image bit depth.
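By way of illustration only, the following Python sketch shows a simple, device-naive RGB to CMYK conversion of the kind a color space conversion stage might perform. The formula and function name are illustrative assumptions; an actual RIP would typically rely on ICC profiles and device-specific lookup tables rather than this simplification.

```python
# Illustrative sketch only: a naive RGB -> CMYK conversion. A production
# RIP would use ICC profiles and device-specific lookup tables instead
# of this simple formula.
def rgb_to_cmyk(r, g, b):
    """Convert 8-bit RGB values to CMYK fractions in [0, 1]."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(r, g, b)
    if k >= 1.0:                       # pure black pixel
        return 0.0, 0.0, 0.0, 1.0
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return c, m, y, k

print(rgb_to_cmyk(255, 128, 0))        # e.g., an orange pixel
```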
Conventionally, RIP operations are included either in an information apparatus, or as part of an output device or output system (e.g. in a printer controller).
One drawback for the data output method 102 of
Another drawback for the conventional data output method 102 of
It will be understood that a reference to print data or output data including a language, such as PDL, should be interpreted as meaning that the print data or output data is encoded using that language. Correspondingly, a reference to a data output process generating a language, such as PDL, should be interpreted as meaning that the data output process encodes data using that language.
There are many drawbacks in the conventional data output method 104 shown in
Another drawback is that output data that includes PDL can create a very large file, which increases memory and storage requirements for the information apparatus, the output device and/or the printer controller. A large file size may also increase the bandwidth required on the communication link between the information apparatus and the output device.
Finally, to rasterize text in an output device, a printer controller may need to include multiple fonts. When a special font or international character set is missing from the printer controller, the rendering or output can become inaccurate or inconsistent.
Accordingly, this invention provides a convenient universal data output method in which an information apparatus and an output device or system share the raster image processing operations. Moreover, the new data output method eliminates the need to install a plurality of device-dependent dedicated drivers or applications in the information apparatus in order to output to a plurality of output devices.
In accordance with the present invention, an electronic system and method of pervasive and universal output allow an information apparatus to output content conveniently to virtually any output device. The information apparatus may be equipped with a central processing unit, an input/output control unit, a storage unit, a memory unit, and wired or wireless communication units or adapters. The information apparatus preferably includes a client application that may be implemented as a software application, a helper application, or a device driver (a printer driver in the case of a printer). The client application may include management and control capabilities over hardware and software components including, for example, one or more communication chipsets residing in its host information apparatus.
The client application in the information apparatus may be capable of communicating with, managing, and synchronizing data or software components with an output device equipped with an output controller of the present invention.
Rendering content in an output device refers to printing an image of the content onto a substrate in the case of a printing device; displaying an image of the content in the case of a display device; and playing an audio representation of the content in the case of a voice or sound output device or system.
An output controller may be a circuit board, card or software component residing in an output device. Alternatively, the output controller may be connected externally to an output device as an external component or "box." The output controller may be implemented with one or a combination of an embedded processor, software, firmware, an ASIC, a DSP, an FPGA, a system on a chip, or special chipsets, among others. In another embodiment, the functionality of the output controller may be provided by application software running on a PC, workstation or server connected externally to an output device.
In conventional data output method 102 as described with reference to
In one implementation of this invention, the intermediate output data includes mixed raster content (MRC) format, encoding and compression techniques, which provide improved image quality and compression ratio compared to conventional image encoding and compression techniques.
In an example of the raster image process and data output method of the present invention, a client application such as a printer driver is included in an information apparatus and performs part of the raster image processing operations, such as rasterization, on the content. The information apparatus generates intermediate output data that includes an output image corresponding to the content and sends the intermediate output data to an output device or an output system for rendering. An output controller application or component included in the output device or output system implements the remaining raster image processing operations, such as digital halftoning and color correction, among others.
Unlike conventional raster image processing methods, this invention provides a more balanced distribution of the raster image processing computational load between the information apparatus and the output device or the output system. Computationally intensive image processing operations such as digital halftoning and color space conversion can be implemented in the output device or output system. Consequently, this new raster image processing method reduces the processing and memory requirements for the information apparatus when compared to the conventional data output methods described with reference to
In another implementation, the present invention provides an information apparatus with output capability that is more universally accepted by a plurality of output devices. The information apparatus, which includes a client application, generates intermediate output data that may include device-independent attributes. An output controller includes components to interpret and process the intermediate output data. The information apparatus can output content to different output devices or output systems that include the output controller even when those output devices are of different brands, makes, and models and have different output engines and input data requirements. Unlike conventional output methods, a user does not need to preinstall in the information apparatus multiple dedicated, device-dependent drivers or applications for each output device.
The combination of a smaller client application, a reduced computational requirement in the information apparatus, and a more universal data output method acceptable for rendering at a plurality of output devices enables mobile devices with less memory space and processing capability to implement data output functions that would otherwise be difficult to implement with conventional output methods.
In addition, this invention can reduce the cost of an output device or an output system compared to conventional output methods 104 that include a page description language (PDL) printer controller. In the present invention, an information apparatus generates and sends intermediate output data to an output device or system. The intermediate output data in one preferred embodiment includes a rasterized output image corresponding to the content intended for output. An output controller included in an output device or an output system decodes and processes the intermediate output data for output, without performing the complex interpretation and rasterization of the conventional methods described in process 104. In comparison, the conventional data output process 104 generates complex PDL and sends this PDL from an information apparatus to an output device that includes a printer controller (e.g., a PostScript controller or a PCL5 controller, among others). Interpretation and raster image processing of a PDL have much higher computational requirements than decoding and processing the intermediate output data of this invention, which includes one or more rasterized output images. Implementing a conventional printer controller with, for example, PDL increases component cost (e.g., memory, storage, ICs, software and processors) when compared to using the output controller included in the data output method of the present invention.
Furthermore, output data that includes PDL can create a large file compared to intermediate output data that includes a rasterized output image. The data output method of this invention therefore transmits smaller output data from an information apparatus to an output device. A smaller output data size can speed up transmission, lower communication bandwidth, and reduce memory requirements. Finally, this invention can provide a convenient method to render content at an output device with or without a connection to a static network. In conventional network printing, both the information apparatus and the output device must be connected to a static network. In this invention, through local communication and synchronization between an information apparatus and an output device, installation of hardware and software to maintain static network connectivity may not be necessary to enable the rendering of content at an output device.
According to the several aspects of the present invention there is provided the subject matter defined in the appended independent claims.
Additional objects and advantages of the present invention will be apparent from the detailed description of the preferred embodiment thereof, which proceeds with reference to the accompanying drawings.
Set forth below are definitions of terms that are used in describing implementations of the present invention. These definitions are provided to facilitate understanding and illustration of implementations of the present invention and should in no way be construed as limiting the scope of the invention to a particular example, class, or category.
Output Device Profile (Or Object)
An output device profile (or object) is a software and data entity that encapsulates both data and attributes describing an output device and instructions for operating on that data and those attributes. An output device profile may reside in different hardware environments, platforms or applications, and may be transported in the form of a file, a message, a software object or a component, among other forms and techniques. For simplicity of discussion, a profile or object may also include, for example, the concept of software components that may have varying granularity and can consist of one class, a composite of classes, or an entire application.
The term profile or object used herein is not limited to software or data as its media. Any entity containing information, descriptions, attributes, data, instructions etc. in any computer-readable form or medium such as hardware, software, files based on or including voice, text, graphics, image, or video information, electronic signals in analog or digital form, etc., are all valid forms of profile and object definition.
A profile or object may also contain in one of its fields or attributes a reference or pointer to another profile or object, or a reference or pointer to data and or content. A reference to a profile or object may include one or more, or a combination of pointers, identifiers, names, paths, addresses or any descriptions relating to a location where an object, profile, data, or content can be found.
An output device profile may contain one or more attributes that may identify and describe, for example, the capabilities and functionalities of a particular output device such as a printer. An output device profile may be stored in the memory component of an output device, an information apparatus or in a network node. A network node includes any device, server or storage location that is connected to the network. As described below in greater detail, an information apparatus requesting output service may communicate with an output device. During such local service negotiation, at least a partial output device profile may be uploaded to the information apparatus from the output device. By obtaining the output device profile (or printer profile in the case of a printer), the information apparatus may learn about the capability, compatibility, identification, and service provided by the output device.
As an example, an output device profile may contain one or more of the following fields and/or attribute descriptions. Each of the following fields is optional, and furthermore, each of the following fields or attributes may or may not exist in a particular implementation (e.g., it may be empty or NULL):
Identification of an output device (e.g., brand, model, registration, IP address etc.)
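By way of illustration only, the following Python sketch shows one hypothetical way an output device profile might be represented as a data structure. The field names and values below are assumptions for discussion and are not prescribed by the present invention; an actual profile may be a file, a message, or a software object with different or additional attributes.

```python
# Hypothetical sketch of an output device profile. Field names and
# values are illustrative only.
output_device_profile = {
    "identification": {                # brand, model, address, etc.
        "brand": "ExamplePrinterCo",
        "model": "XYZ-100",
        "ip_address": "192.168.1.50",
    },
    "capabilities": {                  # what the output engine accepts
        "resolution_dpi": [300, 600],
        "bit_depth": 1,
        "color_space": "CMYK",
        "page_sizes": ["A4", "Letter"],
    },
    "services": ["print"],             # services offered to clients
}
```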
Content
Content (or data content, digital content, output content) is the data intended for output, which may include text, graphics, images, forms, video, and audio, among other content types. Content may include the data itself or a reference to that data. Content may be in any format, language, encoding or combination, and it can be in a format, language or encoding that is partially or totally proprietary. A digital document is an example of content; it may include attributes and fields that describe the digital document itself and/or a reference or references to the digital document or documents. Examples of a digital document may be any one or a combination of the following file types: HTML, VHTML, PostScript, PCL, XML, PDF, MS Word, PowerPoint, JPEG, MPEG, GIF, PNG, WML, VWML, CHTML, HDML, ASCII, 2-byte international coded characters, etc. Content may be used interchangeably with the terms data content, output content or digital content in the descriptions of the present invention.
Intermediate Output Data
Output data (or print data in the case of a printer) is the electronic data sent from an information apparatus to an output device. Output data is related to the content intended for output and may be encoded in a variety of formats and languages (e.g., PostScript, PCL, XML), which may include compressed or encrypted data. Some output device manufacturers may also include in the output data (or print data) a combination of proprietary or non-proprietary languages, formats, encodings, compressions, encryptions, etc.
Intermediate output data is the output data of the present invention, and it includes the broader definition of an output file or data generated by an information apparatus, or by a client application or device driver included in the information apparatus. Intermediate output data may contain text, vector graphics, images, video, audio, symbols, forms or a combination thereof, and it can be encoded with one or more of a page description language, a markup language, a graphics format, an imaging format, or a metafile, among others. Intermediate output data may also contain instructions (e.g., output preferences) and descriptions (e.g., data layout), among others. Part or all of the intermediate output data may be compressed, encrypted or tagged.
In a preferred embodiment of this invention, the intermediate output data contains rasterized image data. For example, vector graphics and text information or objects included in the content that are not in image form can be rasterized or conformed into image data in an information apparatus and included in the intermediate output data. Device-dependent image processing operations of a RIP, such as digital halftoning and color space conversion, can be implemented at an output device or an output system.
The intermediate output data can be device dependent or device independent. In one implementation, the rasterized output image is device dependent if the rasterization parameters used, such as resolution, scale factor, bit depth, output size and/or color space, are device dependent. In another implementation of this invention, the rasterized image may be device independent if the rasterization parameters used are device independent. Rasterization parameters become device independent when they conform to a set of predetermined or predefined rasterization parameters based on a standard or a specification. With predefined or device-independent rasterization parameters, a client application of the present invention can rasterize at least a portion of the content and generate a device-independent image or images included in the intermediate output data. By doing so, the intermediate output data may become device independent and, therefore, universally acceptable to output devices that have been pre-configured to accept the intermediate output data.
One advantage of rasterizing or converting text and graphics information into image data at the information apparatus is that the output device or printer controller no longer needs to perform a complex rasterization operation or include multiple fonts. Therefore, employing the intermediate output data and the data output method described herein can potentially reduce the cost and complexity of an output controller, printer controller and/or output device.
One form of image data encoding is known as mixed raster content, or MRC. Typically, an image stored in MRC includes more than one image or bitmap layer. In MRC, an image can be segmented into different layers based on segmentation criteria such as background and foreground, or luminance and chrominance, among others. For example, an MRC may include three layers: a background layer, a foreground layer and a toggle or selector layer. The three layers are coextensive and may have different resolutions, encodings and compressions. The foreground and background layers may each contain additional layers, depending on the manner in which the respective part of the image is segmented based on the segmentation criteria, the components or channels of a color model, or the image encoding representation (HLS, RGB, CMYK, YCC, LAB, etc.), among others. The toggle layer may designate, for each point, whether the foreground or background layer is effective. Each layer in an MRC can have a different bit depth, resolution and color space, which allows, for example, the foreground layer to be compressed differently from the background layer. The MRC form of image data has previously been used to minimize storage requirements. Further, an MRC format has been proposed for use in color image fax transmission.
In one embodiment of the present invention, the intermediate output data includes one or more rasterized output images that employ the MRC format, encoding and/or related compression methods. In this implementation, different layers in the output image can have different resolutions and may use different compression techniques. Different information, such as chrominance and luminance or foreground and background information in the original content (e.g., a digital document), can be segmented and compressed with different compression or encoding techniques. Segmented elements or object information in the original content can also be stored in different image layers and at different resolutions. Therefore, with MRC, there is an opportunity to reduce output data file size, retain greater image information, increase compression ratio, and improve image quality when compared to other conventional image encoding and compression techniques. Implementations of rasterization, raster image processing and intermediate output data that include MRC encoding in the present invention are described in more detail below.
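By way of illustration only, the following Python sketch (assuming the NumPy library) shows a toy three-layer, MRC-style decomposition and recomposition using a simple brightness threshold as the segmentation criterion. The function names and threshold are illustrative assumptions; real MRC encoders use far more sophisticated segmentation and apply different compression to each layer (e.g., a binary codec for the selector layer and a continuous-tone codec for the others).

```python
import numpy as np

# Toy three-layer MRC-style decomposition: a binary selector layer
# toggles, per pixel, between a foreground layer (e.g., text/line art)
# and a background layer (e.g., continuous-tone image).
def mrc_decompose(image, threshold=128):
    """Split a grayscale image into selector, foreground, background."""
    selector = image < threshold                 # True where "text-like" dark pixels
    foreground = np.where(selector, image, 0)    # keep only selected pixels
    background = np.where(selector, 255, image)  # keep only unselected pixels
    return selector, foreground, background

def mrc_recompose(selector, foreground, background):
    """Reassemble a single image from the three coextensive layers."""
    return np.where(selector, foreground, background)

page = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
sel, fg, bg = mrc_decompose(page)
assert np.array_equal(mrc_recompose(sel, fg, bg), page)
```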
Rasterization
Rasterization is an operation by which graphics and text in a digital document are converted to image data. For image data included in the digital document, rasterization may include scaling and interpolation. The rasterization operation is characterized by rasterization parameters including, among others, bit depth and resolution. A given rasterization operation may be characterized by several more rasterization parameters, including output size, color space, color channels, etc. Values of one or more of the rasterization parameters employed in a rasterization operation may be specified by default; values of one or more of the rasterization parameters may be supplied to the information apparatus as components of a rasterization vector. In a given application, the rasterization vector may specify a value of only one rasterization parameter, default values being employed for the other rasterization parameters used in the rasterization operation. In another application the rasterization vector may specify values of more than one, but less than all, rasterization parameters, default values being employed for at least one other rasterization parameter used in the rasterization operation. And in yet another application the rasterization vector may specify values of all the rasterization parameters used in the rasterization operation.
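By way of illustration only, the following Python sketch shows how a partial rasterization vector might be merged with default rasterization parameters before a rasterization operation. The parameter names and default values are hypothetical assumptions for discussion.

```python
# Hypothetical default rasterization parameters; any parameter not
# supplied in the rasterization vector falls back to these values.
DEFAULT_RASTERIZATION_PARAMETERS = {
    "resolution_dpi": 300,
    "bit_depth": 8,
    "color_space": "RGB",
    "output_size": (2480, 3508),   # roughly A4 at 300 dpi, width x height in pixels
    "color_channels": 3,
}

def resolve_rasterization_parameters(rasterization_vector=None):
    """Apply defaults for any parameter the vector does not specify."""
    params = dict(DEFAULT_RASTERIZATION_PARAMETERS)
    params.update(rasterization_vector or {})
    return params

# A vector that specifies only resolution; all other parameters default.
print(resolve_rasterization_parameters({"resolution_dpi": 600}))
```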
Information apparatus 200 is a computing device with processing capability. In one embodiment, information apparatus 200 may be a mobile computing device such as palmtop computer, handheld device, laptop computer, personal digital assistant (PDA), smart phone, screen phone, e-book, Internet pad, communication pad, Internet appliance, pager, digital camera, etc. It is possible that information apparatus 200 may also include a static computing device such as a desktop computer, workstation, server, etc.
Information apparatus 200 may contain components such as a processing unit 380, a memory unit 370, an optional storage unit 360 and an input/output control unit (e.g. communication manager 330). Information apparatus 200 may include an interface (not shown) for interaction with users. The interface may be implemented with software or hardware or a combination. Examples of such interfaces include, without limitation, one or more of a mouse, a keyboard, a touch-sensitive or non-touch-sensitive screen, push buttons, soft keys, a stylus, a speaker, a microphone, etc.
Information apparatus 200 typically contains one or more communication units 350 that interface with other electronic devices such as a network node (not shown), output device 220, and output system 250. The communication unit may be implemented with hardware (e.g., silicon chipsets, an antenna), software (e.g., protocol stacks, applications) or a combination.
In one embodiment of the present invention, communication interface 240 between information apparatus 200 and output device 220 or output system 250 is a wireless communication interface, such as a short-range radio interface including those implemented according to the Bluetooth or IEEE 802.11 standards. The communication interface may also be realized by other standards and/or means of wireless communication, which may include radio, infrared, cellular, ultrasonic, or hydrophonic communication, among others, for accessing one or more network nodes and/or devices. Wired connections, such as serial or parallel interfaces, USB interfaces and FireWire (IEEE 1394) interfaces, among others, may also be included. Connection to a local network, such as an Ethernet or Token Ring network, among others, may also be implemented in the present invention for local communication between information apparatus 200 and output device 220. Examples of hardware/software components of communication units 350 that may be used to implement the wireless interface between the information apparatus 200 and the output device 220 are described in more detail with reference to
For simplicity,
Information apparatus 200 may be a dedicated device (e.g., email terminal, web terminal, digital camera, e-book, web pads, Internet appliances etc.) with functionalities that are pre-configured by manufacturers. Alternatively, information apparatus 200 may allow users to install additional hardware components and or application software 205 to expand its functionality.
Information apparatus 200 may contain a plurality of applications 205 to implement its feature sets and functionalities. As an example, a document browsing or editing application may be implemented to help a user view and perhaps edit, partially or entirely, digital documents written in a certain format or language (e.g., page description language, markup language, etc.). Digital documents may be stored locally in the information apparatus 200 or in a network node (e.g., in a content server). An example of a document browsing application is an Internet browser such as Internet Explorer, Netscape Navigator, or a WAP browser. Such browsers may retrieve and display content (e.g., digital content) written in markup languages such as HTML, WML, XML, CHTML, HDML, among others. Other examples of software applications in the information apparatus 200 may include document editing software such as Microsoft Word™, which also allows users to view and/or edit digital documents that have various file extensions (e.g., doc, rtf, html, XML, etc.) whether stored locally in the information apparatus 200 or in a network node. Still other examples of software applications 205 may include image acquisition and editing software.
As illustrated previously with reference to
Client application 210 includes a rasterization component 310 to conform content into one or more raster output images according to one or more rasterization parameters; an intermediate output data generator component 320 that generates and/or encodes intermediate output data that includes the one or more output images; and a communications manager 330 that manages the communication and interaction with an output device 220 or system 250 or output controller 230. Communications manager can be implemented as part of the client application 210 (shown in
The client application 210 may also optionally include or utilize one or more of the following components or operations:
The above functionalities and processes of client application 210 of the present invention are described in further detail in the client application process with reference to
Output device 220 is an electronic system capable of outputting digital content regardless of whether the output medium is substrate (e.g., paper), display, projection, or sound. A typical example of output device 220 is a printer, which outputs digital documents containing text, graphics, image or any combination onto a substrate. Output device 220 may also be a display device capable of displaying still images or video, such as, without limitation, televisions, monitors, and projectors. Output device 220 can also be a device capable of outputting sound. Any device capable of playing or reading digital content in audio (e.g., music) or data (e.g., text or document) formats is also a possible output device 220.
A printer is frequently referred to herein as an example of an output device to simplify discussion or as the primary output device 220 in a particular implementation. However, it should be recognized that the present invention also applies to other output devices 220 such as fax machines, digital copiers, display screens, monitors, televisions, projectors, and voice output devices, among others.
Rendering content with an output device 220 refers to outputting the content on a specific output medium (e.g., papers, display screens etc). For example, rendering content with a printer generates an image on a substrate; rendering content with a display device generates an image on a screen; and rendering content with an audio output device generates sound.
A conventional printing system in general includes a raster image processor and a printer engine. A printer engine includes a memory buffer and a marking engine, among other components. The raster image processor converts content into an image form suitable for printing; the memory buffer holds the rasterized image ready for printing; and the marking engine transfers colorant to the substrate (e.g., paper).
The raster image processor may be located within an output device (e.g., included in a printer controller 410) or implemented externally (e.g., in an information apparatus 200, an external controller, or a server). A raster image processor can be implemented as hardware, software, or a combination (not shown). As an example, the raster image processor may be implemented in a software application or device driver in the information apparatus 200. Examples of raster image processing operations include image and graphics interpretation, rasterization, scaling, segmentation, color space transformation, image enhancement, color correction, halftoning, and compression, etc.
Marking engine may use any of a variety of different technologies to transfer a rasterized image to paper or other media or, in other words, to transfer colorant to a substrate. The different marking or printing technologies that may be used include both impact and non-impact printing. Examples of impact printing may include dot matrix, teletype, daisywheel, etc. Non-impact printing technologies may include inkjet, laser, electrostatic, thermal, dye sublimation, etc.
The marking engine 426 and memory buffer 424 of a printer form its printer engine 420, which may also include additional circuitry and components, such as firmware, software or chips or chipsets for decoding and signal conversion, etc. Input to a printer engine 420 is usually a final rasterized printer-engine print data generated by a raster image processor 406. Such input is usually device dependent and printer or printer engine specific. The printer engine 420 may take this device dependent input and generate or render output pages (e.g. with ink on a substrate).
When a raster image processor is located inside an output device 220, it is usually included in a printer controller 410 (as shown in
Print data sent to a printer with printer controller 410 is usually in a form (e.g., PostScript) that requires further interpretation, processing or conversion. A printer controller 410 receives the print data and interprets, processes, and converts it into a form that can be understood by the printer engine 420A. Regardless of the type of print data, conventionally, a user may need a device-specific driver in his or her information apparatus 200 in order to output the proper language, format, or file that can be accepted by a specific printer or output device 220.
Regardless of type or sophistication level, different output devices 220 conventionally need different printer drivers or output management applications in an information apparatus 200 to provide output capability. Some mobile devices with limited memory and processing power may have difficulty storing multiple device drivers or performing computationally intensive RIP operations. It may also be infeasible to install a new device-dependent or device-specific printer driver each time there is a need to print to a new printer. To overcome these difficulties, the present invention provides several improvements to output device 220 or output system 250, as described in detail next.
In the present invention, output device 220 may include an output controller 230 to help manage communication and negotiation processes with an information apparatus 200 and to process output data. Output controller 230 may include dedicated hardware or software or a combination of both for at least one output device 220. Output controller 230 may be internally installed or externally connected to one or more output devices 220. The output controller 230 is sometimes referred to as a print server or output server.
In one implementation, output device 220 may include a communication unit 550 or adapter to interface with information apparatus 200. Output device 220 may sometimes include more than one communication unit 550 in order to support different interfaces, protocols, or communication standards with different devices. For example, output device 220 may communicate with a first information apparatus 200 through a Bluetooth interface while communicating with a second information apparatus 200 through a parallel interface. Examples of hardware components of a wireless communication unit are described in greater detail below with reference to
In one embodiment, output controller 230 does not include a communication unit, but rather utilizes or manages a communication unit residing in the associated output device 220 such as the illustration in
Output controller 230 may contain an embedded operating system 680. With an operating system, some or all functionalities and feature sets of the output controller 230 may be provided by application software managed by the operating system. Additional application software may be installed or upgraded to newer versions in order to, for example, provide additional functionalities or bug fixes.
Output controller 230 typically includes a memory unit 640, or may share a memory unit with, for example, printer controller 410. The memory unit and storage unit, such as ROM, RAM, flash memory and disk drive among others, may provide persistent or volatile storage. The memory unit or storage unit may store output device profiles, objects, codes, instructions or data (collectively referred to as software components) that implement the functionalities of the output controller 230. Part of the software components (e.g., output device profile) may be uploaded to information apparatus 200 during or before a data output operation.
An output controller 230 may include a processor component 670A and 670C, a memory component 650, an optional storage component 640, and an optional operating system component 680.
The output controller 230 may be connected externally to an output device 220 or integrated internally into the output device 220.
Other possible implementations of output controller 230 may include, for example, a conventional personal computer (PC), a workstation, and an output server or print server. In these cases, the functionalities of output controller 230 may be implemented using application software installed in a computer (e.g., PC, server, or workstation), with the computer connected with a wired or wireless connection to an output device 220. Using a PC, server, workstation, or other computer to implement the feature sets of output controller 230 with application software is just another possible embodiment of the output controller 230 and in no way departs from the spirit, scope and process of the present invention.
The difference between output controller 230 and printer controller 410 should be noted. Printer controller 410 and output controller 230 are both controllers, and both are dedicated hardware and/or software for at least one output device 220. Output controller 230 refers to a controller with the feature sets, capabilities, and functionalities of the present invention. A printer controller 410 may contain functions such as interpreting an input page description language, raster image processing, and queuing, among others. An output controller 230 may include part or all of the features of a printer controller 410 in addition to the feature sets, functionalities, capabilities, and processes of the present invention.
Functionalities and components of output controller 230 for the purpose of providing universal data output may include or utilize:
When associated with an output device 220 that includes a printer controller 410, the output controller of present invention may further include or utilize:
In addition to the above components and functionalities, output controller 230 may further include one or more of the following:
When output controller 230 is implemented as firmware or an embedded application, the configuration and management of the functionalities of output controller 230 may optionally be accomplished by, for example, using controller management software in a host computer. A host computer may be a desktop personal computer (PC), workstation, or server. The host computer may be connected locally or through a network to the output device 220 or the controller 230. Communication between the host computer and the output controller 230 can be accomplished through wired or wireless communication. The management application software in the host computer can manage the settings, configurations, and feature sets of the output controller 230. Furthermore, the host computer's configuration application may download and/or install application software, software components and/or data to the output controller 230 for the purpose of upgrading, updating, and/or modifying the features and capabilities of the output controller 230.
Output device 220 in one implementation includes or is connected to output controller 230 described above. Therefore, functionalities and feature sets provided by output controller 230 are automatically included in the functionalities of output device 220. The output device 220 may, however, implement or include other controllers and/or applications that provide at least partially the features and functionalities of the output controller 230.
Therefore, the output device 220 may include some or all of the following functionalities:
An output device 220 may optionally further comprise one or more of the following functionalities:
As described with reference to
Some printers do not include a raster image processor or printer controller 410, as illustrated in
Another implementation of the combined controller 230F shown in
The above are examples of different implementations and configurations of output controller 230. Other implementations are also possible. For example, partial functionalities of output controller 230 may be implemented in an external box or station while the remaining functionalities reside inside an output device 220 as a separate board or are integrated with a printer controller 410. As another example, the functionalities of output controller 230 may be implemented in a plurality of external boxes or stations connected to the same output device 220. As a further example, the same output controller 230 may be connected to service a plurality of output devices 220.
RF link controller 810 implements real-time lower layer (e.g., physical layer) protocol processing that enables the hosts (e.g., information apparatus 200, output controller 230, output device 220, etc.) to communicate over a radio link. Functions performed by the link controller 810 may include, without limitation, error detection/correction, power control, data packet processing, data encryption/decryption and other data processing functions.
A variety of radio links may be utilized. A group of competing technologies operating in the 2.4 GHz unlicensed frequency band is of particular interest. This group currently includes Bluetooth, home radio frequency (HomeRF) and implementations based on the IEEE 802.11 standard. Each of these technologies has a different set of protocols, and they all provide solutions for wireless local area networks (LANs). Interference among these technologies could limit simultaneous deployment of these protocols. It is anticipated that new local area wireless technologies may emerge or that existing ones may converge. Nevertheless, all these existing and future wireless technologies may be implemented in the present invention without limitation, and therefore in no way depart from the scope of the present invention.
Among the currently available wireless technologies, Bluetooth may be advantageous because it requires relatively lower power consumption and Bluetooth-enabled devices operate in piconets, in which several devices are connected in a point-to-multipoint system. Referring to
Configuration of infrared adapters 820 may vary depending on the intended rate of data transfer.
A client application 210 in the information apparatus may be in the form of a device driver invoked by other applications residing in the information apparatus 200 to provide output service. Alternatively, the client application 210 of the present invention may be an application that includes a data output and management component in addition to other functionalities such as content acquisition, viewing, browsing, and/or editing. For example, a client application 210 in an information apparatus 200 may itself include components and functions for a user to download, view and/or edit a digital document 900 in addition to the output management function described herein.
Raster image process method 902 allows an information apparatus 200 such as a mobile device to pervasively and conveniently output content (e.g. a digital document) to an output device 220 or system 250 that includes an output controller 230. A client application 210 in an information apparatus 200 may perform part of raster image processing operations (e.g. rasterization operation). Other operations of raster image processing such as halftoning can be completed by the output device 220 or by the output controller 230. In conventional data output methods, raster image processing is either implemented entirely in an information apparatus (e.g. a printer that does not include a printer controller with reference to
In step 910, the rasterization operation, content (e.g., a digital document), which may include text, graphics, and image objects, is conformed or rasterized into image form according to one or more rasterization parameters such as output size, bit depth, color space, resolution, and number of color channels, etc. During the rasterization operation, text and vector graphics information in the content is rasterized or converted into image or bitmap information according to a given set of rasterization parameters. Image information in the content or digital document may be scaled and/or interpolated to fit a particular output size, resolution and bit depth, etc. The rasterization parameters are in general device dependent, and therefore may vary according to different requirements and attributes of an output device 220 and its output engine. There are many ways to obtain device-dependent rasterization parameters, as described in more detail below with reference to
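By way of illustration only, the following Python sketch outlines a step-910-style rasterization using the Pillow imaging library (an assumed choice; any rasterizer could be used). Text is converted to pixels and an embedded image is scaled to fit the requested output size, producing a single raster page. The function name, coordinates, and file name are hypothetical.

```python
# Illustrative sketch of step 910 (rasterization), assuming Pillow.
from PIL import Image, ImageDraw

def rasterize_page(text, photo_path, output_size=(2480, 3508), color_space="RGB"):
    page = Image.new(color_space, output_size, "white")   # blank raster page
    draw = ImageDraw.Draw(page)
    draw.text((100, 100), text, fill="black")             # rasterize text to pixels
    photo = Image.open(photo_path)                        # embedded image object
    photo = photo.resize((output_size[0] // 2, output_size[1] // 4))  # scale/interpolate
    page.paste(photo, (100, 400))
    return page

# Example usage (the file name is hypothetical):
# page = rasterize_page("Hello from a mobile device", "photo.jpg")
```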
In an alternative implementation, rasterization parameters may be predetermined by a standard or specification. In this implementation, in step 910 the content 900 is rasterized to fit or match these predefined or standard rasterization parameters. The rasterized output image therefore becomes device independent. One advantage of being device independent is that the rasterized output image is acceptable to controllers and/or output devices implemented or created with knowledge of such a standard or specification. A rasterized image with predefined or standardized attributes is usually more portable. For example, both the client application 210 and the output device 220 or its output controller 230 may be preprogrammed to receive, interpret, and/or output raster images based on a predefined standard and/or specification.
Occasionally, a predefined standard or specification for rasterization parameters may require change or update. One possible implementation for providing an easy update or upgrade is to store information and related rasterization parameters in a file or a profile instead of hard coding these parameters into programs, components or applications. Client application 210, output controller 230, and/or the output device 220 can read the file or profile to obtain information related to the rasterization parameters. Upgrading or updating the standard, specification or defaults then requires only replacing or editing the file or profile instead of replacing a software application or component such as the client application 210.
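By way of illustration only, the following Python sketch shows standardized rasterization parameters stored in a profile file (here assumed to be JSON) that both the client application 210 and the output controller 230 could read; updating the standard then amounts to replacing the file. The file name, keys, and values are hypothetical.

```python
# Illustrative sketch: rasterization parameters kept in a profile file
# rather than hard coded into the client application or controller.
import json

PROFILE_TEXT = """
{
  "specification_version": "1.0",
  "resolution_dpi": 300,
  "bit_depth": 8,
  "color_space": "sRGB",
  "output_size": [2480, 3508]
}
"""

def load_rasterization_profile(path="rasterization_profile.json"):
    """Both the client application and the output controller can call this."""
    with open(path) as f:
        return json.load(f)

# For this sketch, parse the embedded text instead of a real file.
print(json.loads(PROFILE_TEXT)["color_space"])
```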
In step 920, the rasterized content in image form is encoded into intermediate output data. The intermediate output data, which describes the output content, may include image information, instructions, descriptions, and data (e.g., a color profile). The rasterized output image may require further processing, including one or more of compression, encoding, encryption, smoothing, image enhancement, segmentation, and color correction, among others, before being stored in the intermediate output data. The output image in the intermediate output data may be encoded in any image format and with any compression technique, such as JPEG, BMP, TIFF, or JBIG, etc. In one preferred embodiment, a mixed raster content (MRC) format and its related encoding and/or compression methods are used to generate the output image. The advantages of using MRC over other image formats and techniques may include, for example, a better compression ratio, better retention of data information, smaller file size, and/or relatively better image quality, among others.
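By way of illustration only, the following Python sketch shows one hypothetical way step 920 might package a compressed rasterized output image together with simple descriptions and instructions into intermediate output data. The layout, header fields, and compression choice (zlib) are illustrative assumptions, not a format prescribed by the invention.

```python
# Illustrative sketch of step 920: packaging an output image plus
# descriptions/instructions into an intermediate output data stream.
import io
import json
import zlib

def build_intermediate_output_data(raster_bytes, width, height, copies=1):
    header = {
        "format": "raster-image",        # could also indicate MRC layers
        "width": width,
        "height": height,
        "bit_depth": 8,
        "color_space": "sRGB",
        "compression": "zlib",
        "instructions": {"copies": copies},   # e.g., output preferences
    }
    payload = zlib.compress(raster_bytes)             # compress the output image
    header_bytes = json.dumps(header).encode("utf-8")
    stream = io.BytesIO()
    stream.write(len(header_bytes).to_bytes(4, "big"))  # header length prefix
    stream.write(header_bytes)
    stream.write(payload)
    return stream.getvalue()
```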
In step 930, the intermediate output data is transmitted to the output device 220 or output system 250 for further processing and final output. The transmission of the intermediate output data may occur over wireless or wired communication links between the information apparatus 200 and the output device 220 and can be accomplished in one or multiple sessions.
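By way of illustration only, the following Python sketch shows a step-930-style transmission of the intermediate output data over a TCP connection in fixed-size chunks. The address, port, and framing are hypothetical; Bluetooth, IEEE 802.11, infrared, or another wired or wireless link could equally be used.

```python
# Illustrative sketch of step 930: chunked transmission over TCP.
import socket

def send_intermediate_output_data(data, host="192.168.1.50", port=9100,
                                  chunk_size=4096):
    with socket.create_connection((host, port)) as conn:
        conn.sendall(len(data).to_bytes(8, "big"))       # announce total size
        for offset in range(0, len(data), chunk_size):   # one or more chunks/sessions
            conn.sendall(data[offset:offset + chunk_size])
```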
In step 940, the output device 220 or output system 250 receives the transmitted intermediate output data. The output device 220 or output system 250 may include an output controller 230 to assist communicating with the information apparatus 200 and/or processing the intermediate output data. Output controller 230 may have a variety of configurations and implementations with respect to output device 220 as shown in
If the intermediate output data includes components with MRC format or encoding techniques, it may contain additional segmented information (e.g., foreground and background), which can be used to enhance image quality. For example, different techniques or algorithms for scaling, color correction, color matching, image enhancement, anti-aliasing and/or digital halftoning, among others, may be applied to different segments or layers of the image information to improve output quality or maximize retention or recovery of image information. Multiple layers may later be combined or mapped into a single layer. These image processing and conversion components and/or operations can be included in the output controller 230 of the present invention.
In step 950, the output image decoded or retrieved from the intermediate output data may require further processing or conversion. This may include one or more of scaling, segmentation, interpolation, color correction, GCR, black generation, color matching, color space transformation, anti-aliasing, image enhancement, image smoothing and/or digital halftoning operations, among others.
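By way of illustration only, the following Python sketch (assuming NumPy) shows one such step-950 operation, Floyd-Steinberg error diffusion, which reduces an 8-bit grayscale output image to the 1-bit depth of a binary print engine. Other halftoning methods, such as ordered dithering or stochastic screening, could be used instead.

```python
import numpy as np

# Illustrative sketch: Floyd-Steinberg error diffusion from 8-bit
# grayscale to a 1-bit output suitable for a binary marking engine.
def floyd_steinberg_halftone(gray):
    img = gray.astype(np.float32).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = img[y, x]
            new = 255.0 if old >= 128 else 0.0
            out[y, x] = 1 if new > 0 else 0
            err = old - new                       # diffuse quantization error
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out
```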
In an embodiment where the output device 220 does not include a printer controller, the output controller 230 (or an output device 220 that includes the output controller), after performing the remaining portion of the RIP operations (e.g., color space conversion and halftoning) on the output image, may further convert the output data in step 950 into a form that is acceptable for input to a printer engine for rendering.
In an alternative embodiment where the output device 220 or the output system 250 includes a conventional printer controller, the output controller may simply decode and/or convert the intermediate output data (print data in this example) into a format or language acceptable to the printer controller. For example, a printer controller may require as input a page description language (e.g., PostScript, PCL, PDF), a markup language (e.g., HTML, XML), or another graphics or document format. In these cases, the output controller 230 may interpret, decompress, and convert the intermediate print data into an output image that has the optimal output resolution, bit depth, color space, and output size for the printer controller's input requirements. The output image is then encoded or embedded into a printer-controller print data (e.g., a page description language) and sent to the printer controller. A printer-controller print data is a print data that is acceptable or compatible for input to the printer controller. After the printer controller receives the printer-controller print data, the printer controller may further perform operations such as parsing, rasterization, scaling, color correction, image enhancement, and halftoning on the output image and generate an appropriate printer-engine print data suitable for input to the printer engine.
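As an illustration of this embedding step, the following sketch wraps an 8-bit grayscale raster page in a minimal PostScript program. It is only a toy stand-in for printer-controller print data, assumes Python, and a real implementation would follow the exact PDL dialect and job-control requirements of the particular printer controller:

```python
import binascii

def wrap_in_postscript(gray_pixels, width, height, dpi=300):
    """Embed an 8-bit grayscale raster page into a minimal PostScript
    program (a hypothetical printer-controller print data)."""
    w_pts = width * 72.0 / dpi            # placed size on the page, in points
    h_pts = height * 72.0 / dpi
    hexdata = binascii.hexlify(gray_pixels).decode("ascii")
    body = [
        "%!PS-Adobe-3.0",
        f"/picstr {width} string def",    # buffer for one image row
        "gsave",
        f"{w_pts:.2f} {h_pts:.2f} scale",
        # width height bits/sample, matrix mapping image space to the unit square
        f"{width} {height} 8 [{width} 0 0 -{height} 0 {height}]",
        "{ currentfile picstr readhexstring pop } image",
        hexdata,
        "grestore",
        "showpage",
    ]
    return "\n".join(body) + "\n"

if __name__ == "__main__":
    # A 4x4 gradient test page; a real page would come from the decoded output image.
    pixels = bytes(range(0, 256, 16))
    with open("printer_controller_job.ps", "w") as f:
        f.write(wrap_in_postscript(pixels, 4, 4))
```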
In step 960, the output-engine output data or printer-engine print data generated by the output controller 230 or the printer controller in step 950 is sent to the output engine or printer engine of the output device for final output.
With reference to
During output process 1002, a user may need to select one or more output devices 220 for output service. An optional discovery process step 1020 may be implemented to help the user select an output device 220. During the discovery process step 1020, a user's information apparatus 200 may (1) search for available output devices 220; (2) provide the user with a list of available output devices 220; and (3) provide means for the user to choose one or more output devices 220 to take the output job. An example of a discovery process 1020 is described below in greater detail with reference to
The optional discovery process 1020 may sometimes be unnecessary. For example, a user may skip the discovery process 1020 if he or she already knows the output device (e.g., printer) 220 to which the output is to be directed. In this case, the user may simply connect the information apparatus 200 to that output device 220 by a wired connection, or directly point to that output device 220 in close proximity, such as in the case of infrared connectivity. As another example, a user may pre-select or set the output device or devices 220 that are used frequently as preferred defaults. As a result, the discovery process 1020 may be partially or completely skipped if the default output device 220 or printer is found to be available.
In step 1030, the client application may interact with the output device 220, the user, and/or other applications 205 residing in the same information apparatus 200 to (1) obtain the necessary output device profile and/or user preferences, (2) perform functions or part of the raster image processing operations such as rasterization, scaling, and color correction, and/or (3) convert or encode at least partially the rasterized content (e.g., digital document) into an intermediate output data. The processing and generation of the intermediate output data may reflect, in part, the output device profile and/or user preferences obtained, if any. The intermediate output data generated by the client application 210 is then transmitted through wired or wireless local communication link(s) 240 to the output controller 230 included in or associated with the selected output device 220 or output system 250. An exemplary client application process is described in greater detail with reference to
In step 1040, the output controller 230 of the present invention receives the intermediate output data. In the case where the selected output device 220 does not include a printer controller, the output controller 230 of the present invention may further perform processing functions such as parsing, interpreting, decompressing, decoding, color correction, image enhancement, GCR, black generation, and halftoning, among others. In addition, the output controller 230 may further convert or conform the intermediate output data into a form or format suitable for the output engine (e.g., the printer engine in the case of a printer). The generated output-engine output data from the output controller is therefore, in general, device dependent and acceptable for final output with the output engine (or the printer engine in the case of a printer) included in the selected output device 220 or output system 250.
In the case where the selected output device 220 is a printer, and when the printer includes or is connected to a printer controller, the output controller 230 may generate the proper language or input format required to interface with the printer controller (referred to as printer-controller print data). The printer controller may, for example, require a specific input such as a page description language (PDL), a markup language, or a special image or graphics format. In these cases, the output controller 230 in step 1040 may interpret and decode the intermediate output data, and then convert the intermediate output data into the required printer-controller print data (e.g., a PDL such as PostScript or PCL). The printer-controller print data generated by the output controller is then sent to the printer controller for further processing. The printer controller may perform interpretation and raster image processing operations, among other operations. After processing, the printer controller generates a printer-engine print data suitable for rendering at the printer engine.
In either case, the output controller 230 or printer controller generates an output-engine output data that is suitable for sending to or interfacing with the output engine or the printer engine included in the output device for rendering. The output data may be temporarily buffered in components of the output device 220. An implementation of the output device process 1040 is described in greater detail with reference to
The steps included in the universal pervasive output process 1002 may proceed automatically when a user requests output service. Alternatively, a user may be provided with options to proceed, cancel, or input information at each and every step. For example, a user may cancel the output service at any time by, for example, indicating a cancellation signal or command or by terminating the client application 210 or by shutting down the information apparatus 200 etc.
Various protocols and/or standards may be used during the discovery process 1020. Wireless communication protocols are preferred; wired communication, on the other hand, may also be implemented. Examples of applicable protocols or standards may include, without limitation, Bluetooth, HAVi, Jini, Salutation, Service Location Protocol, and Universal Plug and Play, among others. Standard protocols, proprietary protocols, or a combination may be implemented in the discovery process 1020. The use of these different protocols, standards, or combinations does not depart from the spirit and scope of the present invention.
In one implementation an application (referred here for simplicity of discussion as a “communication manager,” not shown) residing in the information apparatus 200 helps communicate with output device 220 and manages service requests and the discovery process 1020. The communication manager may be a part of or a feature of the client application 210. Alternatively or in combination, the communication manager may also be a separate application. When the communication manager is a separate application, the client application 210 may have the ability to communicate, manage or access functionalities of the communication manager.
The discovery process 1020 may be initiated manually by a user or automatically by a communication manager when the user requests an output service with information apparatus 200.
In the optional step 1100, a user may specify searching or matching criteria. For example, a user may indicate a search for color printers and/or printers that provide free service. The user may manually specify such criteria each time for the discovery process 1020. Alternatively or in combination, a user may set default preferences that can be applied to a plurality of discovery processes 1020. Sometimes, however, no searching criteria are required: the information apparatus 200 may simply search for all available output devices 220 that can provide output service.
In step 1101, information apparatus 200 searches for available output devices 220. The searching process may be implemented by, for example, an information apparatus 200 (e.g. with the assistance of a communication manager) multi-casting or broadcasting or advertising its service requests and waiting for available output devices 220 to respond. Alternatively or in combination, an information apparatus 200 may “listen to” service broadcasts from one or more output devices 220 and then identify the one or more output devices 220 that are needed or acceptable. It is also possible that multiple output devices 220 of the same network (e.g., LAN) register their services with a control point (not shown). A control point is a computing system (e.g., a server) that maintains records on all service devices within the same network. An information apparatus 200 may contact the control point and search or query for the needed services.
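The broadcast-and-listen variant of this searching step might look roughly like the following sketch, assuming Python; the port number, request format, and reply format are hypothetical and do not correspond to any particular discovery standard (e.g., SLP or UPnP):

```python
import json
import socket

DISCOVERY_PORT = 5678          # hypothetical port used only for this sketch
REQUEST = json.dumps({"service": "print", "criteria": {"color": True}})

def discover_output_devices(timeout=2.0):
    """Broadcast a service request and collect replies from output devices
    within a short time window (a toy stand-in for real discovery protocols)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.settimeout(timeout)
    sock.sendto(REQUEST.encode("utf-8"), ("255.255.255.255", DISCOVERY_PORT))

    devices = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)   # each reply: a device profile summary
            devices.append((addr[0], json.loads(data.decode("utf-8"))))
    except socket.timeout:
        pass                                   # no more replies within the window
    finally:
        sock.close()
    return devices

if __name__ == "__main__":
    for host, info in discover_output_devices():
        print(host, info.get("name"), info.get("color"))
```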
In step 1102, if no available output device 220 is found, the communication manager or the client application 210 may provide the user with alternatives 1104. Such alternatives may include, for example, aborting the discovery process 1020, trying discovery process 1020 again, temporarily halting the discovery process 1020, or being notified when an available output device 220 is found. As an example, the discovery process 1020 may not detect any available output device 220 in the current wired/wireless network. The specified searching criteria (if any) are then saved or registered in the communication manager. When the user enters a new network having available output devices 220, or when new compatible output devices 220 are added to the current network, or when an output device 220 becomes available for any reason, the communication manager may notify the user of such availability.
In step 1106, if available output devices 220 are discovered, the communication manager may obtain some basic information, or part of or the entire output device profile, from each discovered output device 220. Examples of such information may include, but are not limited to, device identity, service charge, subscription, service features, device capabilities, operating instructions, etc. Such information is preferably provided to the user through the user interface (e.g., display screen, speaker, etc.) of the information apparatus 200.
In step 1108, the user may select one or more output devices 220, based on the information provided, if any, to take the output job. If the user is not satisfied with any of the available output devices 220, the user may decline the service. In this case, the user may be provided with alternatives, such as trying again in step 1110 with some changes made to the searching criteria. The user may choose to terminate the service request at any time. In step 1112, with one or more output devices 220 selected or determined, the communication link between the information apparatus 200 and the selected output device or devices 220 may be “locked”. Other output devices 220 that are not selected may be dropped. The output process may then proceed to the client application process of step 1030 of
A client application 210 may obtain content (e.g., a digital document) 900, or a pointer or reference to the content, in many ways. In a preferred embodiment, the client application 210 is in the form of a device driver or an independent application, and the content or its reference can be obtained by the client application 210 from other applications 205 in the same information apparatus 200. To illustrate, a user may first view, download, or create a digital document by using a document browsing, viewing, and/or editing application 205 in his or her information apparatus 200, and then request output service by launching the client application 210 as a device driver or helper application. The client application 210 communicates with the document browsing or editing application to obtain the digital document or a reference to the digital document. As another example, the client application 210 is an independent application and it launches another application to help locate and obtain the digital document for output. In this case, a user may first launch the client application 210, and then invoke another application 205 (e.g., a document editing and/or browsing application) residing in the same information apparatus 200 to view or download a digital document. The client application 210 then communicates with the document browsing or editing application to obtain the digital document for output.
In another embodiment, the client application 210 itself provides multiple functionalities or feature sets including the ability for a user to select the content (e.g. digital document) for output. For example, the client application 210 of present invention may provide a GUI where a user can directly input or select the reference or path of a digital document that the user wants to output.
In order to perform a rasterization operation on the content (e.g., digital document) 900, the client application 210 in step 1210 needs to obtain device dependent parameters of an output device 220, such as the rasterization parameters. Device dependent parameters may be included in an output device profile. A client application 210 may obtain an output device profile or rasterization parameters in various ways, using one or a combination of the approaches described below.
It is important to note that step 1210 is an optional step. In some instances, part of or the entire output device profile or related device dependent information may have been already obtained by the client application 210 during the prior optional discovery process (step 1020 in
In one implementation, the client application 210 communicates with one or more output devices 220 to upload output device profiles stored in the memory or storage components of those one or more output devices 220 or their associated one or more output controllers 230. In some instances, the uploaded output device profile may contain, partially or entirely, references or pointers to device parameters instead of the device parameters themselves. The actual output device parameters may be stored in a network node or in the information apparatus 200, where they can be retrieved by the client application 210 or by other applications 205 using the references or pointers. It should be noted that a plurality of information apparatuses 200 may request to obtain output device profile or profiles from the same output device 220 at the same time or at least during overlapping periods. The output device 220 or its associated output controller 230 may have components or systems to manage multiple communication links and provide the output device profile or profiles concurrently or in an alternating manner to multiple information apparatuses 200. Alternatively, an output device 220 may provide components or systems to queue the requests from different information apparatuses 200 and serve them in a sequential fashion according to a scheme such as first come, first served, or quality of service. Multi-user communication and service management capability, with or without queuing or spooling functions, may be implemented by, for example, the output controller 230 as optional feature sets.
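A minimal sketch of the queued, first-come-first-served variant, assuming Python; the worker thread, profile contents, and callback interface are hypothetical simplifications of whatever request handling an output controller actually implements:

```python
import queue
import threading

DEVICE_PROFILE = {"name": "Example Printer", "resolution_dpi": 600}  # hypothetical

profile_requests = queue.Queue()     # requests served first come, first served

def profile_server():
    """Single worker that answers profile requests one at a time, one way an
    output controller might serialize requests from several apparatuses."""
    while True:
        reply = profile_requests.get()
        if reply is None:            # shutdown sentinel
            break
        reply(DEVICE_PROFILE)        # hand the profile back to the requester
        profile_requests.task_done()

if __name__ == "__main__":
    threading.Thread(target=profile_server, daemon=True).start()
    # Two "information apparatuses" requesting the profile at overlapping times.
    for apparatus in ("tablet", "phone"):
        profile_requests.put(lambda p, a=apparatus: print(a, "received", p))
    profile_requests.join()
    profile_requests.put(None)
```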
In another implementation, one or more output device profiles may be stored locally in the information apparatus 200. The client application 210 may provide a GUI where a user can select a profile from a list of pre-stored profiles. As an example, the GUI may provide the user with a list of output device names (e.g. makes and models), each corresponding to an output device profile stored locally. When the user selects an output device 220, the client application 210 can then retrieve the output device profile corresponding to the name selected by the user.
In certain cases, during a discovery or communication process described earlier, the client application 210 may have already obtained the output device ID, name, or reference or other information in a variety of ways described previously. In this case, the client application 210 may automatically activate or retrieve an output device profile stored in the information apparatus 200 based on the output device ID, name, or reference obtained without user intervention.
In yet another implementation, the client application 210 may use a set of pre-defined default values stored locally in a user's information apparatus 200. Such defaults can be stored in one or more files or tables. The client application 210 may access a file or table to obtain these default values. The client application 210 may also create or calculate certain default values based on the information it has obtained during previous steps (e.g. in optional discovery process, based on partial or incomplete printer profile information obtained, etc). A user may or may not have an opportunity to change or overwrite some or all defaults.
Finally, if, for any reason, no device dependent information is available, the client application 210 may use standard output and rasterization parameters or pre-defined default parameters. The above illustrates many examples and variations of implementation; these and other possible variations in implementation do not depart from the scope of the present invention.
In step 1220, the client application 210 may optionally obtain user preferences. In one exemplary implementation, the client application 210 may obtain user preferences with a GUI (graphical user interface). For simplicity, a standard GUI form can be presented to the user independent of the make and model of the output device 220 involved in the output process. Through such an interface, the user may specify some device independent output parameters such as page range, number of cards per page, number of copies, etc. Alternatively or in combination, the client application 210 may also incorporate output device-dependent features and preferences into the GUI presented to the user. The device-dependent portion of the GUI may be supported partly or entirely by information contained in the output device profile obtained through components and processes described in previous steps. To illustrate, device dependent features and capabilities may include print quality, color or grayscale, duplex or single sided, output page size among others.
It is preferred that some or all components, attributes, or fields of the user preferences have default values. Some or all default values may be hard-coded in a software program in the client application 210 or in hardware components. Alternatively, the client application 210 may also access a file to obtain default values, or it may calculate certain default values based on the information it has obtained during previous steps or components (e.g., from an output device profile). A user may or may not have the ability to pre-configure, change, or overwrite some or all defaults. The client application 210 may obtain and use some or all defaults with or without user intervention or knowledge.
In step 1230, the client application 210 of the present invention performs a rasterization operation to conform content (e.g., a digital document), which may include objects and information in vector graphics, text, and images, into one or more output images in accordance with the rasterization parameters obtained in previous steps. During the rasterization process, text and vector graphics objects or information in the content are rasterized or converted into image or bitmap form according to the given set of rasterization parameters. Image information in the content may require scaling and interpolation operations to conform to the rasterization parameters. The rasterization process may further include operations such as scaling, interpolation, segmentation, image transformation, image encoding, and color space transformation to fit or conform the one or more output images to the given set of rasterization parameters, such as target output size, resolution, bit depth, color space, and image format.
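As a toy illustration of this step, the following sketch rasterizes a text-only "document" into an output image that conforms to a given set of rasterization parameters. It assumes Python with the Pillow imaging library, and the parameter names match the hypothetical profile sketch shown earlier; a real client application would rasterize full vector, text, and image content:

```python
from PIL import Image, ImageDraw

def rasterize_text_page(text, params):
    """Rasterize a trivial text-only document into an output image sized
    according to the given rasterization parameters."""
    dpi = params["resolution_dpi"]
    width = int(params["page_size_inches"][0] * dpi)
    height = int(params["page_size_inches"][1] * dpi)
    mode = "L" if params["color_space"] == "GRAY" else "RGB"

    page = Image.new(mode, (width, height), "white")
    draw = ImageDraw.Draw(page)
    margin = dpi // 2                                  # half-inch margin
    draw.text((margin, margin), text, fill="black")    # default bitmap font
    return page

if __name__ == "__main__":
    params = {"resolution_dpi": 150, "page_size_inches": (8.5, 11.0),
              "color_space": "RGB"}
    image = rasterize_text_page("Hello from the information apparatus", params)
    image.save("page.png")   # would next be encoded into intermediate output data
```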
In step 1240, the client application 210 generates an intermediate output data that includes the rasterized one or more output images. The intermediate output data of the present invention may contain image information, instructions, descriptions, and data such as a color profile, among others. Creating and generating the intermediate output data may further include operations such as compression, encoding, encryption, smoothing, segmentation, scaling, and/or color correction, among others. The image or images contained in an intermediate output data may be variously encoded and/or implemented with different image formats and/or compression methods (e.g., JPEG, BMP, TIFF, JBIG, or a combination). One preferred implementation is to generate or encode the output image in the intermediate output data with a mixed raster content (MRC) description. The use of MRC in the data output process of the present invention provides opportunities to improve the compression ratio by applying different compression techniques to segmented elements in the content. In addition, MRC provides opportunities to maintain more original content information during the encoding process of the output image and, therefore, potentially improve output quality.
In step 1250, the client application 210 transmits intermediate output data to an output device 220 through local communication link 240. The communication link may be implemented with wired or wireless technologies and the transmission may include one or multiple sessions.
It should be recognized that
Another optional process is that a user may be asked to provide payment, a deposit, or escrow before, during, or after the output service, such as at step 1210 or 1250, with reference to
The process illustrated in
In client output process 1204, since the rasterization parameters are predefined, the client application 210 may not need to upload printer profiles from the selected output device 220. Consequently, no two-way communication between the information apparatus 200 and the output device or devices 220 is necessary in this process 1204 when compared with process 1202 illustrated in
The standard or predefined rasterization parameters may be hard-coded or programmed into the client application 210 and/or the output controller 230. However, instead of hard-coding those parameters, one technique to facilitate updates or changes is to store those standard parameters in a default file or profile. The standard or predefined parameters contained in the file or profile can be retrieved and utilized by applications in an information apparatus 200 (e.g., the client application 210) and/or by applications or components in an output device 220 or the output controller 230. In this way, any necessary updates, upgrades, or required changes to those predefined or standard parameters can be easily accomplished by replacing or modifying the file or profile instead of modifying or updating the program, application, or components in the information apparatus 200, output device 220, and/or output controller 230.
A client application process 1204 providing universal output capability to information apparatus 200 may include or utilize:
One advantage of the client output process 1204 of
The output device 220 or output system 250 may include an output controller 230, internally or externally, to assist in the management and operation of the output process 1302. As shown in
In step 1300, output device process 1302 is initiated by client application 210 transmitting an intermediate output data to output device 220 or output system 250. In step 1310, the output device 220 reads and interprets the intermediate output data, containing at least one raster output image relating to the content intended for output. During the reading and interpretation process 1310, the output device 220 may include components that parse the intermediate output data and perform operations such as decompression, decoding, and decryption among others. The output image may be variously encoded and may include one or more compression methods.
In the event that the method of image encoding includes an MRC format, then, in one example implementation, during decoding and mapping of the output image in step 1310, the lower-resolution layer and information in an image that includes MRC may be mapped, scaled, or interpolated to a higher-resolution output image to produce better image quality. Therefore, in step 1310, in the event that the intermediate output data includes an MRC component, each layer in an MRC image can be decompressed, processed, mapped, and combined into a single combined output image layer. Step 1310 may also include scaling, color space transformation, and/or interpolation, among others. In addition to the possibility of mapping methods using different scaling and interpolation ratios for different layers, another advantage of using MRC is that the segmentation information contained in MRC can be utilized to apply different image processing and enhancement techniques to data in different layers of an MRC image in step 1320.
In step 1320, the output device 220 may further perform image processing operations on the decoded output image. These image processing operations may include, for example, color correction, color matching, image segmentation, image enhancement, anti-aliasing, image smoothing, digital watermarking, scaling, interpolation, and halftoning, among others. The image processing operations 1320 may be combined or operated concurrently with step 1310. For example, while each row, pixel, or portion of the image is being decoded and/or decompressed, the image processing operations 1320 are applied. In another implementation, the image processing 1320 may occur after the entire output image or a large portion of the image has been decoded or decompressed.
If the intermediate output data includes an MRC component, then in step 1320 there are additional opportunities to improve image quality. An image encoded in MRC contains segmented information that a traditional single-layer image format does not usually have. As an example, the foreground can be in one layer and the background in another. As another example, chrominance information may be in one layer and luminance in another. This segmented information in MRC may be used to apply different or selective image processing methods and algorithms to different layers or segments to enhance image quality or to retain or recover image information. Different image processing techniques or algorithms may include color matching, color correction, black generation, halftoning, scaling, interpolation, anti-aliasing, smoothing, digital watermarking, etc. For example, one can apply colorimetric color matching to foreground information and perceptual color matching to background information, or vice versa. As another example, error diffusion halftoning can be applied to the foreground and stochastic halftoning to the background, or vice versa. As yet another example, bi-cubic interpolation can be applied to one layer and bi-linear or minimum-distance interpolation to a different layer.
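The following sketch illustrates this layer-selective processing on a hypothetical three-layer (foreground, background, mask) MRC-style image, assuming Python with the Pillow library. Here the per-layer choice is simply a different resampling filter when scaling, standing in for the richer per-layer color matching or halftoning choices described above:

```python
from PIL import Image

def combine_mrc_layers(foreground, background, mask, out_size):
    """Scale each layer with a different resampling choice, then select
    foreground or background per pixel using the mask layer."""
    # Text-like foreground: nearest-neighbour keeps edges crisp.
    fg = foreground.resize(out_size, Image.NEAREST)
    # Continuous-tone background: bicubic gives smoother interpolation.
    bg = background.resize(out_size, Image.BICUBIC)
    selector = mask.convert("L").resize(out_size, Image.NEAREST)
    return Image.composite(fg, bg, selector)

if __name__ == "__main__":
    fg = Image.new("RGB", (50, 50), "black")        # stand-in foreground layer
    bg = Image.new("RGB", (25, 25), "lightblue")    # lower-resolution background
    mask = Image.new("L", (50, 50), 0)
    mask.paste(255, (10, 10, 40, 40))               # region where foreground shows
    combined = combine_mrc_layers(fg, bg, mask, (200, 200))
    combined.save("combined_page.png")
```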
In step 1330, the output device 220 or the output controller 230 may convert the processed image (e.g., halftoned) into a form acceptable to the output engine of the output device 220. This conversion step is optional, depending on the type, format, and input requirements of a particular output device engine (e.g., the printer engine in the case of a printer). Different output engines may have different raster image input requirements. As an example, different output engines may require different input image formats, numbers of bits or bytes per pixel, compressed or uncompressed form, or different color spaces (e.g., RGB, CMY, CMYK, or any combination of Hi-Fi colors such as green, orange, purple, and red). Incoming raster image data may be sent to the output engine encoded as a row, a column, multiple rows, multiple columns, a chunk, a segment, or a combination thereof at a time. In some cases, step 1330 may be skipped if the result of step 1320 is already in a form acceptable to the output device engine. In other cases, however, further conversion and/or processing may be required to satisfy the specific input requirements of a particular output device engine.
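One simple example of such a conversion, assuming Python and a hypothetical bilevel print engine that accepts 1-bit-per-pixel rows with the most significant bit first; real engines each define their own pixel packing, byte order, and framing:

```python
def pack_rows_1bpp(gray_pixels, width, height, threshold=128):
    """Convert an 8-bit grayscale page into 1-bit-per-pixel rows, MSB first,
    as a hypothetical bilevel print engine might require."""
    rows = []
    for y in range(height):
        row = bytearray((width + 7) // 8)
        for x in range(width):
            if gray_pixels[y * width + x] < threshold:   # dark pixel prints
                row[x // 8] |= 0x80 >> (x % 8)
            # light pixels stay 0 (no ink)
        rows.append(bytes(row))
    return rows

if __name__ == "__main__":
    page = bytes([0, 255] * 8)          # a 4x4 alternating test page
    for row in pack_rows_1bpp(page, 4, 4):
        print(row.hex())
```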
It is important to note that the above-described processing from step 1310 to step 1330 may require one or more memory buffers to temporarily store processed results. The memory buffer can store or hold a row, a column, a portion, or a chunk of the output image in any of the steps described above. Storing and retrieving information into and from the memory buffer may be done sequentially, in an alternating fashion, or in an interlaced or interleaved fashion, among other possible combinations. The operations of step 1310 to step 1330 can be partially or completely implemented with the output controller 230.
In step 1370, the output device engine included in the output device 220 or output system 250 receives the output-engine output data generated in step 1330 or step 1320. The output-engine output data is in a form that satisfies the input requirements and attributes of the output engine, such as color space, color channel, bit depth, output size, resolution, etc. The output engine then takes this output-engine output data and outputs or renders the data content through its marking engine or display engine.
One advantage of the data output method 1002 that includes output device process 1302 is that it places lower processing requirements on an information apparatus 200 compared to the conventional process with reference to
For example, some image processing functions, such as halftoning (e.g., error diffusion), may require substantial processing and computing power. In data output process 1002 that includes output device process 1302, halftoning is performed in step 1320 by an output device component (e.g., the output controller 230) included in the output device 220 or the output system 250, not in the information apparatus 200, thereby reducing the computational requirements for the information apparatus 200. Another advantage of data output process 1302 is that the intermediate output data is less device dependent than the output data generated by conventional output method 102 with reference to
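To make the computational burden concrete, the following is a minimal sketch of classic Floyd-Steinberg error-diffusion halftoning written in Python. It illustrates the kind of per-pixel work that step 1320 moves onto the output controller; it is not presented as the particular halftoning algorithm any given output device uses:

```python
def floyd_steinberg_halftone(gray, width, height):
    """Quantize an 8-bit grayscale page to black and white while diffusing
    each pixel's quantization error to its unprocessed neighbours."""
    pixels = [float(v) for v in gray]            # work in floating point
    out = bytearray(width * height)
    for y in range(height):
        for x in range(width):
            i = y * width + x
            old = pixels[i]
            new = 255.0 if old >= 128 else 0.0
            out[i] = int(new)
            err = old - new
            # Classic 7/16, 3/16, 5/16, 1/16 error distribution.
            if x + 1 < width:
                pixels[i + 1] += err * 7 / 16
            if y + 1 < height:
                if x > 0:
                    pixels[i + width - 1] += err * 3 / 16
                pixels[i + width] += err * 5 / 16
                if x + 1 < width:
                    pixels[i + width + 1] += err * 1 / 16
    return bytes(out)

if __name__ == "__main__":
    ramp = bytes(range(0, 256, 4))               # an 8x8 gray ramp test page
    print(floyd_steinberg_halftone(ramp, 8, 8).hex())
```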
Some output devices 220 may contain a printer controller 410. An example of this type of output device or printer is a PostScript printer or PCL printer among others.
There are many printing system configurations for providing the data output capability and process to a printer or a printing system that includes a printer controller. In one example, the existing printer controller in the output device 220 may incorporate the feature sets provided by the output controller to form a “combined controller” as described previously with reference to
An output device process 1304 and operations for an output device 220 or system 250 that includes a printer controller 410 may include or utilize:
In output device process 1304, step 1300 (receiving intermediate output data) and step 1310 (interpret intermediate output data) are identical to step 1300 and step 1310 in output device process 1302, which have been described in previous sections with reference to
In step 1350, the output controller 230 converts the intermediate print data into a printer-controller print data that is in a form compatible or acceptable for input to a printer controller. For example, a printer controller may require as input a specific page description language (PDL) such as PostScript. The output controller 230 then creates a PostScript file and embeds the output image generated or retrieved in step 1310 into the PostScript file. The output controller 230 can also create and embed the output image from step 1310 into other printer controller print data formats, instructions or languages.
In step 1360, the printer controller receives printer-controller print data generated in step 1350 that includes an acceptable input language or format to the printer controller. The printer controller may parse, interpret, and decode the input printer-controller print data. The printer controller may further perform raster image processing operations such as rasterization, color correction, black generation, GCR, anti-aliasing, scaling, image enhancement, and halftoning among others on the output image. The printer controller may then generate a printer-engine print data that is suitable for input to the printer engine. The type and or format of printer-engine print data may vary according to the requirement of a particular printer engine.
It is important to note that the above-described process from step 1310 to step 1360 may require one or more memory buffers to temporarily store processed results. The memory buffer can store or hold a row, a column, a portion, or a chunk of the output image in any of the steps described above. Storing and retrieving information into and from the memory buffer may be done sequentially, in an alternating fashion, or in an interlaced or interleaved fashion, among other possible combinations. The process and operations of step 1310 to step 1360 can be implemented with the output controller 230.
In step 1370, the printer engine included in the output device 220 or output system 250 generates or renders the final output based on the printer-engine print data generated in step 1360. For example, the printer-engine print data may be in CMY, CMYK, RGB, etc., and may be in a one-or-more-bits-per-pixel format, satisfying the size and resolution requirements of the printer engine. The printer engine included in the output device 220 may take this print data and generate or render an output page through its marking engine.
Having described and illustrated the principles of our invention with reference to an illustrated embodiment, it will be recognized that the illustrated embodiment can be modified in arrangement and detail without departing from such principles. In view of the many possible embodiments to which the principles of our invention may be applied, it should be recognized that the detailed embodiments are illustrative only and should not be taken as limiting the scope of our invention. Rather, I claim as my invention all such embodiments as may come within the scope of the following claims and equivalents thereto.
Unless the context indicates otherwise, a reference in a claim to the number of instances of an element, be it a reference to one instance or more than one instance, requires at least the stated number of instances of the element but is not intended to exclude from the scope of the claim a structure or method having more instances of that element than stated. Specifically, but without limitation, a reference in a claim to an or one output device or system, to an or one image, or to a or one rasterization parameter is not intended to exclude from the scope of the claim a structure or method having, including, employing or supplying two or more output devices or system, images or rasterization parameters.
This application claims benefit of Provisional Application No. 60/262,764 filed Jan. 19, 2001, the entire disclosure of which is hereby incorporated by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
3629493 | Morgenfruh | Dec 1971 | A |
3833297 | Swartz | Sep 1974 | A |
3848856 | Reeber et al. | Nov 1974 | A |
4262301 | Erlichman | Apr 1981 | A |
4266863 | Hollingsworth et al. | May 1981 | A |
4291956 | Vogelgesang | Sep 1981 | A |
4291957 | Hollingsworth | Sep 1981 | A |
4301599 | Leay | Nov 1981 | A |
4335955 | Lopata | Jun 1982 | A |
4340905 | Balding | Jul 1982 | A |
4360264 | Baker et al. | Nov 1982 | A |
4417792 | Martin | Nov 1983 | A |
4428001 | Yamamura et al. | Jan 1984 | A |
4431282 | Martin | Feb 1984 | A |
4435059 | Gerber | Mar 1984 | A |
4495490 | Hopper et al. | Jan 1985 | A |
4539585 | Spackova et al. | Sep 1985 | A |
4541010 | Alston | Sep 1985 | A |
4553835 | Morgan, Jr. | Nov 1985 | A |
4580880 | Watson | Apr 1986 | A |
4602280 | Maloomian | Jul 1986 | A |
4603330 | Horne et al. | Jul 1986 | A |
4758881 | Laspada | Jul 1988 | A |
4956665 | Niles | Sep 1990 | A |
4958220 | Alessi et al. | Sep 1990 | A |
4979032 | Alessi et al. | Dec 1990 | A |
5048057 | Saleh | Sep 1991 | A |
5166809 | Surbrook | Nov 1992 | A |
5220674 | Morgan et al. | Jun 1993 | A |
5228118 | Sasaki | Jul 1993 | A |
5257097 | Pineau et al. | Oct 1993 | A |
5270773 | Sklut et al. | Dec 1993 | A |
5287194 | Lobiondo | Feb 1994 | A |
5303342 | Edge | Apr 1994 | A |
5319711 | Servi | Jun 1994 | A |
5337258 | Dennis | Aug 1994 | A |
5353388 | Motoyama | Oct 1994 | A |
5404433 | Hosugai | Apr 1995 | A |
5412798 | Garney | May 1995 | A |
5467434 | Hower, Jr. et al. | Nov 1995 | A |
5475507 | Suzuki et al. | Dec 1995 | A |
5479206 | Ueno et al. | Dec 1995 | A |
5485634 | Weiser et al. | Jan 1996 | A |
5490287 | Itoh | Feb 1996 | A |
5515480 | Frazier | May 1996 | A |
5519641 | Beers et al. | May 1996 | A |
5524185 | Na | Jun 1996 | A |
5537107 | Funado | Jul 1996 | A |
5537517 | Wakabayashi et al. | Jul 1996 | A |
5564109 | Snyder et al. | Oct 1996 | A |
5566278 | Patel et al. | Oct 1996 | A |
5568595 | Yosefi et al. | Oct 1996 | A |
5580177 | Gase et al. | Dec 1996 | A |
5596697 | Foster et al. | Jan 1997 | A |
5604843 | Shaw et al. | Feb 1997 | A |
5613123 | Tsang et al. | Mar 1997 | A |
5613124 | Atkinson et al. | Mar 1997 | A |
5619257 | Reele et al. | Apr 1997 | A |
5619649 | Kovnat et al. | Apr 1997 | A |
5625757 | Kageyama et al. | Apr 1997 | A |
5636211 | Newlin | Jun 1997 | A |
5644662 | Vuylsteke | Jul 1997 | A |
5664243 | Okada et al. | Sep 1997 | A |
5675717 | Yamamoto | Oct 1997 | A |
5687332 | Kurahashi | Nov 1997 | A |
5699495 | Snipp | Dec 1997 | A |
5710557 | Schuette | Jan 1998 | A |
5717742 | Hyde-Thomson | Feb 1998 | A |
5724106 | Autry et al. | Mar 1998 | A |
5737501 | Tsunekawa | Apr 1998 | A |
5739928 | Scott | Apr 1998 | A |
5748859 | Takayangi et al. | May 1998 | A |
5754655 | Hughes | May 1998 | A |
5757952 | Buytaert et al. | May 1998 | A |
5761480 | Fukada et al. | Jun 1998 | A |
5796394 | Wicks et al. | Aug 1998 | A |
5802314 | Tullis et al. | Sep 1998 | A |
5822230 | Kikinis | Oct 1998 | A |
5826244 | Huberman | Oct 1998 | A |
5831664 | Wharton | Nov 1998 | A |
5832191 | Thorne | Nov 1998 | A |
5838320 | Matthews, III et al. | Nov 1998 | A |
5838926 | Yamagishi | Nov 1998 | A |
5845078 | Tezuka et al. | Dec 1998 | A |
5852721 | Dillon et al. | Dec 1998 | A |
5859970 | Pleso | Jan 1999 | A |
5862321 | Lamming et al. | Jan 1999 | A |
5867633 | Taylor, III et al. | Feb 1999 | A |
5870723 | Pare, Jr. et al. | Feb 1999 | A |
5880858 | Jin | Mar 1999 | A |
5881213 | Shaw et al. | Mar 1999 | A |
5903832 | Seppanen et al. | May 1999 | A |
5907831 | Lotvin et al. | May 1999 | A |
5911044 | Lo et al. | Jun 1999 | A |
5916309 | Brown et al. | Jun 1999 | A |
5917542 | Moghadam | Jun 1999 | A |
5926104 | Robinson et al. | Jul 1999 | A |
5930466 | Rademacher | Jul 1999 | A |
5931919 | Thomas et al. | Aug 1999 | A |
5933498 | Schneck et al. | Aug 1999 | A |
5937112 | Herregods et al. | Aug 1999 | A |
5940843 | Zucknovich et al. | Aug 1999 | A |
5946031 | Douglas | Aug 1999 | A |
5946110 | Hu et al. | Aug 1999 | A |
5953546 | Okada et al. | Sep 1999 | A |
5960162 | Yamamoto | Sep 1999 | A |
5968176 | Nessett et al. | Oct 1999 | A |
5974401 | Enomoto et al. | Oct 1999 | A |
5978560 | Tan et al. | Nov 1999 | A |
5983200 | Slotznick | Nov 1999 | A |
5987454 | Hobbs | Nov 1999 | A |
5993047 | Novogrod et al. | Nov 1999 | A |
6006265 | Rangan et al. | Dec 1999 | A |
6009464 | Hamilton et al. | Dec 1999 | A |
6020973 | Levine et al. | Feb 2000 | A |
6023715 | Burkes et al. | Feb 2000 | A |
6034621 | Kaufman | Mar 2000 | A |
6035214 | Henderson | Mar 2000 | A |
6043898 | Jacobs | Mar 2000 | A |
6046820 | Konishi | Apr 2000 | A |
6061142 | Shim | May 2000 | A |
6069707 | Pekelman | May 2000 | A |
6070185 | Anupam et al. | May 2000 | A |
6072595 | Yoshiura | Jun 2000 | A |
6076076 | Gottfreid | Jun 2000 | A |
6076109 | Kikinis | Jun 2000 | A |
6078906 | Huberman | Jun 2000 | A |
6087060 | Chase et al. | Jul 2000 | A |
6088450 | Davis et al. | Jul 2000 | A |
6101291 | Arney et al. | Aug 2000 | A |
6138178 | Watanabe | Oct 2000 | A |
6141659 | Barker et al. | Oct 2000 | A |
6144997 | Lamming et al. | Nov 2000 | A |
6145031 | Mastie et al. | Nov 2000 | A |
6148346 | Hanson | Nov 2000 | A |
6167514 | Matsui et al. | Dec 2000 | A |
6173407 | Yoon et al. | Jan 2001 | B1 |
6184996 | Gase | Feb 2001 | B1 |
6189148 | Clark et al. | Feb 2001 | B1 |
6189993 | Mantell | Feb 2001 | B1 |
6192407 | Smith et al. | Feb 2001 | B1 |
6195564 | Rydbeck et al. | Feb 2001 | B1 |
6199099 | Gershman et al. | Mar 2001 | B1 |
6199106 | Shaw et al. | Mar 2001 | B1 |
6201611 | Carter et al. | Mar 2001 | B1 |
6205495 | Gilbert et al. | Mar 2001 | B1 |
6211858 | Moon et al. | Apr 2001 | B1 |
6215483 | Zigmond | Apr 2001 | B1 |
6215494 | Teo | Apr 2001 | B1 |
6223059 | Haestrup | Apr 2001 | B1 |
6225993 | Lindblad et al. | May 2001 | B1 |
6226098 | Kulakowski et al. | May 2001 | B1 |
6233611 | Ludtke et al. | May 2001 | B1 |
6236971 | Stefik et al. | May 2001 | B1 |
6246486 | Takahashi | Jun 2001 | B1 |
6252964 | Wasilewski et al. | Jun 2001 | B1 |
6255961 | Van Ryzin | Jul 2001 | B1 |
6256666 | Singhal | Jul 2001 | B1 |
6263503 | Margulis | Jul 2001 | B1 |
6285357 | Kushiro et al. | Sep 2001 | B1 |
6285889 | Nykanen et al. | Sep 2001 | B1 |
6288790 | Yellepeddy et al. | Sep 2001 | B1 |
6292283 | Grandbois | Sep 2001 | B1 |
6324521 | Shiota et al. | Nov 2001 | B1 |
6330611 | Itoh et al. | Dec 2001 | B1 |
6360252 | Rudy et al. | Mar 2002 | B1 |
6363149 | Candelore | Mar 2002 | B1 |
6363452 | Lach | Mar 2002 | B1 |
6366912 | Wallent et al. | Apr 2002 | B1 |
6366965 | Binford et al. | Apr 2002 | B1 |
6369909 | Shima | Apr 2002 | B1 |
6379058 | Petteruti et al. | Apr 2002 | B1 |
6385305 | Gerszberg et al. | May 2002 | B1 |
6389010 | Kubler et al. | May 2002 | B1 |
6396598 | Kashiwagi et al. | May 2002 | B1 |
6418439 | Papierniak et al. | Jul 2002 | B1 |
6421716 | Eldridge et al. | Jul 2002 | B1 |
6421748 | Lin et al. | Jul 2002 | B1 |
6430599 | Baker et al. | Aug 2002 | B1 |
6434535 | Kupka et al. | Aug 2002 | B1 |
6437786 | Yasukawa | Aug 2002 | B1 |
6442375 | Parmentier | Aug 2002 | B1 |
6449052 | Sherer et al. | Sep 2002 | B1 |
6452692 | Yacoub | Sep 2002 | B1 |
6453127 | Wood et al. | Sep 2002 | B2 |
6467688 | Goldman et al. | Oct 2002 | B1 |
6473070 | Mishra et al. | Oct 2002 | B2 |
6473800 | Jerger et al. | Oct 2002 | B1 |
6477575 | Koeppel et al. | Nov 2002 | B1 |
6480292 | Sugiyama | Nov 2002 | B1 |
6487587 | Dubey | Nov 2002 | B1 |
6487599 | Smith et al. | Nov 2002 | B1 |
6489934 | Klausner | Dec 2002 | B1 |
6493104 | Cromer et al. | Dec 2002 | B1 |
6496855 | Hunt et al. | Dec 2002 | B1 |
6510235 | Shin et al. | Jan 2003 | B1 |
6510515 | Raith | Jan 2003 | B1 |
6515988 | Eldridge et al. | Feb 2003 | B1 |
6526129 | Beaton et al. | Feb 2003 | B1 |
6529522 | Ito et al. | Mar 2003 | B1 |
6540722 | Boyle et al. | Apr 2003 | B1 |
6542173 | Buckley | Apr 2003 | B1 |
6542491 | Tari | Apr 2003 | B1 |
6545722 | Schultheiss et al. | Apr 2003 | B1 |
6546387 | Triggs | Apr 2003 | B1 |
6546419 | Humpleman et al. | Apr 2003 | B1 |
6553240 | Dervarics | Apr 2003 | B1 |
6553431 | Yamamoto et al. | Apr 2003 | B1 |
6556313 | Chang et al. | Apr 2003 | B1 |
6577861 | Ogasawara | Jun 2003 | B2 |
6578072 | Watanabe et al. | Jun 2003 | B2 |
6584903 | Jacobs | Jul 2003 | B2 |
6587835 | Treyz | Jul 2003 | B1 |
6600569 | Osada et al. | Jul 2003 | B1 |
6601108 | Marmor | Jul 2003 | B1 |
6604135 | Rogers et al. | Aug 2003 | B1 |
6604148 | Dennison | Aug 2003 | B1 |
6607314 | McCannon et al. | Aug 2003 | B1 |
6608928 | Queiroz | Aug 2003 | B1 |
6618039 | Grant et al. | Sep 2003 | B1 |
6621589 | Al-Kazily et al. | Sep 2003 | B1 |
6622015 | Himmel et al. | Sep 2003 | B1 |
6623527 | Hamzy | Sep 2003 | B1 |
6628302 | White et al. | Sep 2003 | B2 |
6628417 | Naito et al. | Sep 2003 | B1 |
6633346 | Yamamoto | Oct 2003 | B1 |
6633395 | Tuchitoi et al. | Oct 2003 | B1 |
6643650 | Slaughter et al. | Nov 2003 | B1 |
6654135 | Mintani | Nov 2003 | B2 |
6658625 | Allen | Dec 2003 | B1 |
6670982 | Clough et al. | Dec 2003 | B2 |
6671068 | Chang et al. | Dec 2003 | B1 |
6678004 | Schultheiss et al. | Jan 2004 | B1 |
6678751 | Hays et al. | Jan 2004 | B1 |
6690918 | Evans | Feb 2004 | B2 |
6694371 | Sanai | Feb 2004 | B1 |
6697848 | Hamilton et al. | Feb 2004 | B2 |
6701009 | Makoto et al. | Mar 2004 | B1 |
6705781 | Iwazaki | Mar 2004 | B2 |
6707581 | Browning | Mar 2004 | B1 |
6711677 | Wiegley | Mar 2004 | B1 |
6725281 | Zintel et al. | Apr 2004 | B1 |
6735616 | Thompson et al. | May 2004 | B1 |
6738841 | Wolff | May 2004 | B1 |
6741871 | Siverbrook et al. | May 2004 | B1 |
6745229 | Gobin et al. | Jun 2004 | B1 |
6748195 | Phillips | Jun 2004 | B1 |
6750978 | Maarggraff et al. | Jun 2004 | B1 |
6751732 | Strobel et al. | Jun 2004 | B2 |
6753978 | Chang | Jun 2004 | B1 |
6757070 | Lin et al. | Jun 2004 | B1 |
6760745 | Tan et al. | Jul 2004 | B1 |
6775407 | Gindele et al. | Aug 2004 | B1 |
6778289 | Iwata | Aug 2004 | B1 |
6785727 | Yamazaki | Aug 2004 | B1 |
6788332 | Cook | Sep 2004 | B1 |
6788428 | Shimokawa | Sep 2004 | B1 |
6789228 | Merril et al. | Sep 2004 | B1 |
6798530 | Buckley | Sep 2004 | B1 |
6801692 | Nishimura et al. | Oct 2004 | B2 |
6801962 | Taniguchi | Oct 2004 | B2 |
6813039 | Silverbrook et al. | Nov 2004 | B1 |
6816724 | Asikainen | Nov 2004 | B1 |
6819919 | Tanaka | Nov 2004 | B1 |
6826632 | Wugofski | Nov 2004 | B1 |
6839775 | Kao et al. | Jan 2005 | B1 |
6840441 | Monaghan et al. | Jan 2005 | B2 |
6856430 | Gase | Feb 2005 | B1 |
6857716 | Nagahashi | Feb 2005 | B1 |
6859197 | Klein et al. | Feb 2005 | B2 |
6859228 | Chang et al. | Feb 2005 | B1 |
6859937 | Narayan et al. | Feb 2005 | B1 |
6889385 | Rakib et al. | May 2005 | B1 |
6892251 | Anderson et al. | May 2005 | B2 |
6895444 | Weisshaar et al. | May 2005 | B1 |
6915124 | Kiessling et al. | Jul 2005 | B1 |
6922258 | Pineau | Jul 2005 | B2 |
6941014 | Lin et al. | Sep 2005 | B2 |
6947067 | Halttunen | Sep 2005 | B2 |
6947995 | Chang et al. | Sep 2005 | B2 |
6952414 | Willig | Oct 2005 | B1 |
6957194 | Steflk et al. | Oct 2005 | B2 |
6958821 | McIntyre | Oct 2005 | B1 |
6980319 | Ohta | Dec 2005 | B2 |
6983310 | Rouse | Jan 2006 | B2 |
6990548 | Kaylor | Jan 2006 | B1 |
6996555 | Muto et al. | Feb 2006 | B2 |
7016062 | Ishizuka | Mar 2006 | B2 |
7024200 | McKenna | Apr 2006 | B2 |
7028102 | Larsson et al. | Apr 2006 | B1 |
7039445 | Yoshizawa | May 2006 | B1 |
7058356 | Slotznick | Jun 2006 | B2 |
7076534 | Cleron et al. | Jul 2006 | B1 |
7088691 | Fujita | Aug 2006 | B2 |
7099304 | Liu et al. | Aug 2006 | B2 |
7133845 | Ginter et al. | Nov 2006 | B1 |
7133846 | Ginter et al. | Nov 2006 | B1 |
7143356 | Shafrir et al. | Nov 2006 | B1 |
7149726 | Lingle et al. | Dec 2006 | B1 |
7155163 | Cannon et al. | Dec 2006 | B2 |
7164885 | Jonsson et al. | Jan 2007 | B2 |
7180614 | Senoo et al. | Feb 2007 | B1 |
7197531 | Anderson | Mar 2007 | B2 |
7237253 | Blackketter et al. | Jun 2007 | B1 |
7239346 | Priddy | Jul 2007 | B1 |
7263270 | Lapstun et al. | Aug 2007 | B1 |
7272788 | Anderson et al. | Sep 2007 | B2 |
7318086 | Chang et al. | Jan 2008 | B2 |
7346374 | Witkowski et al. | Mar 2008 | B2 |
7348961 | Shneidman | Mar 2008 | B1 |
7359714 | Parupudi et al. | Apr 2008 | B2 |
7360230 | Paz et al. | Apr 2008 | B1 |
7366468 | Yoshida | Apr 2008 | B2 |
7370090 | Nakaoka et al. | May 2008 | B2 |
7403510 | Miyake | Jul 2008 | B1 |
7454796 | Mazzagatte et al. | Nov 2008 | B2 |
7460853 | Toyoshima | Dec 2008 | B2 |
7477890 | Narayanaswami | Jan 2009 | B1 |
7478403 | Allavarpu et al. | Jan 2009 | B1 |
7554684 | Senoo et al. | Jun 2009 | B1 |
7593123 | Sugahara | Sep 2009 | B2 |
7609402 | Chang et al. | Oct 2009 | B2 |
7660460 | Wu et al. | Feb 2010 | B2 |
7743133 | Motoyama et al. | Jun 2010 | B1 |
RE41416 | Liu et al. | Jul 2010 | E |
RE41487 | Liu et al. | Aug 2010 | E |
RE41532 | Liu et al. | Aug 2010 | E |
RE41689 | Liu et al. | Sep 2010 | E |
7805720 | Chang et al. | Sep 2010 | B2 |
RE41882 | Liu et al. | Oct 2010 | E |
7908401 | Chang et al. | Mar 2011 | B2 |
7929950 | Rao et al. | Apr 2011 | B1 |
7941541 | Chang et al. | May 2011 | B2 |
7944577 | Chang et al. | May 2011 | B2 |
7949223 | Shiohara | May 2011 | B2 |
7953818 | Chang et al. | May 2011 | B2 |
7986298 | Dulaney et al. | Jul 2011 | B1 |
RE42725 | Chang et al. | Sep 2011 | E |
RE42828 | Liu et al. | Oct 2011 | E |
8086961 | Saeki et al. | Dec 2011 | B2 |
RE43181 | Liu et al. | Feb 2012 | E |
8169649 | Chang et al. | May 2012 | B2 |
8184324 | Chang et al. | May 2012 | B2 |
8285802 | Chang | Oct 2012 | B2 |
8296757 | Chang | Oct 2012 | B2 |
8332521 | Chang et al. | Dec 2012 | B2 |
8533352 | Chang | Sep 2013 | B2 |
8595717 | Chang et al. | Nov 2013 | B2 |
8630000 | Chang et al. | Jan 2014 | B2 |
8705097 | Chang et al. | Apr 2014 | B2 |
8711408 | Chang et al. | Apr 2014 | B2 |
8964220 | Chang et al. | Feb 2015 | B2 |
8972610 | Chang | Mar 2015 | B2 |
8989064 | Chang | Mar 2015 | B2 |
9015329 | Chang et al. | Apr 2015 | B2 |
9036181 | Chang et al. | May 2015 | B2 |
9037088 | Chang et al. | May 2015 | B2 |
9042811 | Chang et al. | May 2015 | B2 |
9043482 | Chang | May 2015 | B2 |
9069510 | Chang et al. | Jun 2015 | B2 |
9092177 | Chang et al. | Jul 2015 | B2 |
9110622 | Chang et al. | Aug 2015 | B2 |
9164718 | Chang et al. | Oct 2015 | B2 |
9298407 | Chang et al. | Mar 2016 | B2 |
9383956 | Chang et al. | Jul 2016 | B2 |
9389822 | Chang et al. | Jul 2016 | B2 |
20010011302 | Son | Aug 2001 | A1 |
20010012281 | Hall et al. | Aug 2001 | A1 |
20010015717 | Mishra et al. | Aug 2001 | A1 |
20010029531 | Ohta | Oct 2001 | A1 |
20010032254 | Hawkins | Oct 2001 | A1 |
20010034222 | Roustaei et al. | Oct 2001 | A1 |
20010055951 | Slotznick | Dec 2001 | A1 |
20020012329 | Atkinson et al. | Jan 2002 | A1 |
20020017827 | Zuppero et al. | Feb 2002 | A1 |
20020026492 | Fujita | Feb 2002 | A1 |
20020038612 | Iwazaki | Apr 2002 | A1 |
20020042263 | Ishikawa | Apr 2002 | A1 |
20020049839 | Miida et al. | Apr 2002 | A1 |
20020057452 | Yoshino | May 2002 | A1 |
20020059489 | Davis et al. | May 2002 | A1 |
20020062398 | Chang et al. | May 2002 | A1 |
20020062406 | Chang et al. | May 2002 | A1 |
20020065873 | Ishizuka | May 2002 | A1 |
20020077980 | Chang | Jun 2002 | A1 |
20020078101 | Chang et al. | Jun 2002 | A1 |
20020081993 | Toyoshima | Jun 2002 | A1 |
20020087622 | Anderson | Jul 2002 | A1 |
20020090912 | Cannon et al. | Jul 2002 | A1 |
20020092029 | Smith | Jul 2002 | A1 |
20020097408 | Chang et al. | Jul 2002 | A1 |
20020097415 | Chang et al. | Jul 2002 | A1 |
20020097416 | Chang et al. | Jul 2002 | A1 |
20020097417 | Chang et al. | Jul 2002 | A1 |
20020097418 | Chang et al. | Jul 2002 | A1 |
20020097419 | Chang et al. | Jul 2002 | A1 |
20020097433 | Chang | Jul 2002 | A1 |
20020099884 | Chang et al. | Jul 2002 | A1 |
20020178272 | Igarashi et al. | Nov 2002 | A1 |
20020194302 | Blumberg | Dec 2002 | A1 |
20030002072 | Berkema et al. | Jan 2003 | A1 |
20030013483 | Ausems et al. | Jan 2003 | A1 |
20030013484 | Nishimura et al. | Jan 2003 | A1 |
20030061606 | Hartwig | Mar 2003 | A1 |
20030120754 | Muto et al. | Jun 2003 | A1 |
20030122934 | Shiohara | Jul 2003 | A1 |
20030128272 | Clough | Jul 2003 | A1 |
20030160993 | Kang | Aug 2003 | A1 |
20040057075 | Stewart et al. | Mar 2004 | A1 |
20050125664 | Berkema et al. | Jun 2005 | A1 |
20050204176 | Togawa | Sep 2005 | A1 |
20050210120 | Yukie et al. | Sep 2005 | A1 |
20050222963 | Johnson | Oct 2005 | A1 |
20070125860 | Lapstun et al. | Jun 2007 | A1 |
20070129109 | Silverbrook et al. | Jun 2007 | A1 |
20070133073 | Shida et al. | Jun 2007 | A1 |
20080007482 | Morioka | Jan 2008 | A1 |
20080049253 | Chang et al. | Feb 2008 | A1 |
20080049651 | Chang | Feb 2008 | A1 |
20080201236 | Field et al. | Aug 2008 | A1 |
20080218776 | Takami et al. | Sep 2008 | A1 |
20080318602 | Chang et al. | Dec 2008 | A1 |
20090002760 | Chang et al. | Jan 2009 | A1 |
20090070411 | Chang et al. | Mar 2009 | A1 |
20090094457 | Lapstun et al. | Apr 2009 | A1 |
20090180142 | Suzuki et al. | Jul 2009 | A1 |
20090290182 | Hashimoto et al. | Nov 2009 | A1 |
20100039660 | Chang et al. | Feb 2010 | A1 |
20100039669 | Chang et al. | Feb 2010 | A1 |
20100201996 | Chang et al. | Aug 2010 | A1 |
20100203824 | Chang et al. | Aug 2010 | A1 |
20100227550 | Chang et al. | Sep 2010 | A1 |
20110016280 | Chang et al. | Jan 2011 | A1 |
20110034150 | Chang et al. | Feb 2011 | A1 |
20110035682 | Chang et al. | Feb 2011 | A1 |
20110138378 | Chang et al. | Jun 2011 | A1 |
20110167166 | Chang | Jul 2011 | A1 |
20110167175 | Chang | Jul 2011 | A1 |
20110197159 | Chaganti et al. | Aug 2011 | A1 |
20110211226 | Chang et al. | Sep 2011 | A1 |
20110279829 | Chang et al. | Nov 2011 | A1 |
20110279863 | Chang et al. | Nov 2011 | A1 |
20120226777 | Shanahan | Sep 2012 | A1 |
20120230315 | Chang et al. | Sep 2012 | A1 |
20120258700 | Chang et al. | Oct 2012 | A1 |
20130095887 | Chang et al. | Apr 2013 | A1 |
20130103775 | Chang et al. | Apr 2013 | A1 |
20130104052 | Chang et al. | Apr 2013 | A1 |
20130109353 | Chang et al. | May 2013 | A1 |
20140018130 | Chang | Jan 2014 | A1 |
20140082604 | Chang et al. | Mar 2014 | A1 |
20150356561 | Chang et al. | Dec 2015 | A1 |
20150356564 | Chang et al. | Dec 2015 | A1 |
20150356565 | Chang et al. | Dec 2015 | A1 |
20150363763 | Chang et al. | Dec 2015 | A1 |
20150381612 | Chang et al. | Dec 2015 | A1 |
20160011836 | Chang et al. | Jan 2016 | A1 |
20160174068 | Chang et al. | Jun 2016 | A1 |
20160239232 | Chang et al. | Aug 2016 | A1 |
20160239243 | Chang et al. | Aug 2016 | A1 |
20170039009 | Chang et al. | Feb 2017 | A1 |
20170064746 | Chang et al. | Mar 2017 | A1 |
20170075636 | Chang et al. | Mar 2017 | A1 |
20170078521 | Chang et al. | Mar 2017 | A1 |
20170185376 | Chang et al. | Jun 2017 | A1 |
20170228202 | Chang et al. | Aug 2017 | A1 |
20170249116 | Chang et al. | Aug 2017 | A1 |
Number | Date | Country |
---|---|---|
1217503 | May 1999 | CN |
01821101.1 | Apr 2004 | CN |
02806907 | Oct 2004 | CN |
100334577 | Aug 2007 | CN |
20101044809.3 | Sep 2010 | CN |
201010144167.7 | Sep 2010 | CN |
201010444174 | Sep 2010 | CN |
0691619 | Oct 1996 | EP |
0738949 | Oct 1996 | EP |
0738979 | Oct 1996 | EP |
0952513 | Oct 1999 | EP |
2332764 | Jun 1999 | GB |
11316658 | Nov 1999 | JP |
200195096 | Dec 2001 | WO |
200195097 | Dec 2001 | WO |
0284928 | Oct 2002 | WO |
Entry |
---|
Bettstedder, Christian “A Comparison of Service Discovery Protocols and Implementation of the Service Location”, Technische Universitat Munchen (TUM), Sep. 13, 2000, D-80290, Munich Germany. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 10/053,651, dated Jun. 15, 2009, 11 pages. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/606,178, dated Sep. 7, 2011, 15 pages. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/581,868, dated Jan. 20, 2012, 35 pages. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 11/933,031, dated May 9, 2012, 60 pages. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/907,865, dated Jun. 11, 2012, 37 pages. |
U.S Patent and Trademark Office, Notice of Allowance and Fee(s) Due for U.S. Appl. No. 12/903,048, dated Jul. 17, 2012, 43 pages. |
USPTO; Office Action, U.S. Appl. No. 10/053,654; dated Nov. 29, 2005; 23 pages. |
USPTO; Office Action, U.S. Appl. No. 10/053,654; dated Sep. 8, 2006; 26 pages. |
USPTO; Office Action, U.S. Appl. No. 11/929,445; dated Nov. 21, 2012; 21 pages. |
USPTO; Office Action, U.S. Appl. No. 11/929,445; dated Apr. 25, 2012; 17 pages. |
USPTO; Office Action, U.S. Appl. No. 11/929,445; dated Feb. 15, 2011; 12 pages. |
USPTO; Office Action, U.S. Appl. No. 11/929,445; dated Jul. 20, 2010; 14 pages. |
USPTO; Office Action, U.S. Appl. No. 11/929,445; dated Dec. 24, 2009; 14 pages. |
USPTO; Office Action, U.S. Appl. No. 12/783,504; dated Nov. 21, 2012; 25 pages. |
USPTO; Office Action, U.S. Appl. No. 12/783,504; dated Apr. 15, 2011; 17 pages. |
USPTO; Office Action, U.S. Appl. No. 12/783,504; dated Oct. 4, 2010; 14 pages. |
USPTO; Office Action, U.S. Appl. No. 12/581,868; dated Dec. 20, 2010; 16 pages. |
USPTO; Office Action, U.S. Appl. No. 12/606,178; dated Jan. 20, 2011; 19 pages. |
USPTO; Notice of Allowance, U.S. Appl. No. 12/903,048; dated Jul. 17, 2012; 11 pages. |
USPTO; Office Action, U.S. Appl. No. 13/108,922; dated Nov. 5, 2012; 46 pages. |
USPTO; Office Action, U.S. Appl. No. 13/103,958; dated Nov. 16, 2012; 6 pages. |
House, C. and Quon, D.; “An on-line communication print service for the demanding client”; Proceedings of the 11th Annual International Conference on Systems Documentation (Waterloo, Ontario, Canada); Oct. 5-8, 1993; SIGDOC '93, ACM, New York, NY; pp. 135-139. |
Bisdikian, C., et al.; “WISAP: A Wireless Personal Access Network for Handheld Computing Devices”; Personal Communications, IEEE [see also IEEE Wireless Communications], vol. 5, No. 6; Dec. 1998; pp. 18-25. |
European Patent Office, Examination Report for EP Application No. 01985549.3, dated Oct. 26, 2010, 4 pages. |
Haynie, Dave, The Zorro III Bus Specification, Mar. 20, 1991, 60 pages, Document Revision 1.10, Commodore-Amiga Inc. |
Miller, “Mapping Salutation Architecture APIs to Bluetooth Service Discovery Layer,” Jul. 1, 1999, Version 1.0. |
Schulyer et al., “Solutions to Sharing Local Printers: LAN Systems Inc., LANSpool, $395 per Server”, PC Week, 1989, vol. 6, No. 39, pp. 75(2), see entire document. |
Screenshots from Microsoft® NT™, Figures 5-7, 1998, 3 pages. |
Screenshots from Microsoft® Word 2000, Figures 1-4, 1999, 4 pages. |
State Intellectual Property Office of China; Office Action for CN028069072; dated Jul. 3, 2009; 5 pages including English-language summary. |
State Intellectual Property Office of China; First Office Action for CN1217503 (publication No. 201010144809.3); dated Nov. 30, 2010; 18 pages including English-language summary and claims. |
State Intellectual Property Office of China; Decision to Grant for CN1217503 (publication No. 201010144809.3); dated Jan. 12, 2012; 3 pages including English-language summary. |
State Intellectual Property Office of China; First Office Action for CN201010144174.7; dated Feb. 23, 2011; 8 pages including English-language summary. |
State Intellectual Property Office of China; Second Office Action for CN201010144174.7; dated Apr. 5, 2012; 6 pages including English-language summary. |
State Intellectual Property Office of China; Third Office Action for CN201010144174.7; dated Nov. 5, 2012; 9 pages including English-language summary and claims. |
State Intellectual Property Office of China; First Office Action for CN201010144167.7; dated Apr. 13, 2011; 22 pages including English-language summary and claims. |
State Intellectual Property Office of China; Second Office Action for CN201010144167.7; dated May 3, 2012; 10 pages including English-language summary. |
USPTO; “Notice of Allowance” for U.S. Appl. No. 10/016,223; dated Dec. 29, 2010; 28 pages. |
USPTO, Notice of Allowance regarding U.S. Appl. No. 11/933,005, dated Jan. 12, 2011, 19 pages. |
USPTO, Notice of Allowance regarding U.S. Appl. No. 09/992,183, dated Jan. 25, 2011, 34 pages. |
USPTO; “Notice of Allowance” for U.S. Appl. No. 12/907,865; dated Mar. 18, 2011; 22 pages. |
USPTO, Office Action regarding U.S. Appl. No. 09/992,413, dated Sep. 14, 2011, 29 pages. |
USPTO, Office Action regarding U.S. Appl. No. 12/204,695, dated Sep. 15, 2011, 29 pages. |
USPTO; Office Action, U.S. Appl. No. 12/783,504; dated Mar. 30, 2012; 17 pages. |
USPTO; Office Action, U.S. Appl. No. 12/764,015; dated Apr. 24, 2012; 17 pages. |
USPTO; Office Action, U.S. Appl. No. 12/764,032; dated Apr. 25, 2012; 18 pages. |
USPTO; Office Action, U.S. Appl. No. 12/764,032; dated Nov. 21, 2012; 22 pages. |
USPTO; Office Action, U.S. Appl. No. 12/783,504; dated Nov. 21, 2012; 17 pages. |
USPTO; Office Action, U.S. Appl. No. 12/764,015; dated Dec. 18, 2012; 22 pages. |
U.S. Receiving Office, International Search Report regarding International Application No. PCT/US01/43796, Mar. 20, 2002, 3 pages. |
U.S. Receiving Office, International Preliminary Examination Report regarding International Application No. PCT/US01/43796, Jan. 15, 2003, 5 pages. |
World Intellectual Property Organization (King Y. Poon, authorized officer); “International Search Report” for application No. PCT/US2001/48057 (publication No. WO 2002/041118); dated Jan. 6, 2003; 3 pages. The '057 and '223 applications both claim priority to U.S. Appl. No. 60/245,101. |
World Intellectual Property Organization (King Y. Poon, authorized officer); “Corrected International Preliminary Examination Report” for application No. PCT/US2001/48057 (publication No. WO 2002/041118); dated Aug. 24, 2004; 11 pages. The '057 and '223 applications both claim priority to U.S. Appl. No. 60/245,101. |
WIPO (Tod Kupstas, authorized officer); “International Search Report” for application No. PCT/US01/46247 (publication No. WO 2002/046867); dated Jun. 7, 2002; 4 pages. The '247 and '223 applications both claim priority to U.S. Appl. No. 60/245,101. |
WIPO (Glenton Burgess, authorized officer); “International Search Report” for application No. PCT/US01/46247 (publication No. WO 2002/046867); dated Jul. 24, 2002; 3 pages. The '247 and '223 applications both claim priority to U.S. Appl. No. 60/245,101. |
U.S. Patent and Trademark Office; Notice of Allowance regarding U.S. Appl. No. 11/929,501; dated Apr. 12, 2013; 38 pages. |
U.S. Patent and Trademark Office; Office Action regarding U.S. Appl. No. 12/783,504; dated Jul. 8, 2013; 39 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 11/929,501, dated Dec. 24, 2009, 36 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 11/929,501, dated Aug. 18, 2010, 30 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,015, dated Nov. 12, 2010, 30 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,032, dated Dec. 9, 2010, 28 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 11/929,501, dated Feb. 17, 2011, 26 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,032, dated Jun. 27, 2011, 28 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,015, dated Jul. 12, 2011, 31 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 11/929,501, dated Jun. 13, 2012, 46 pages. |
U.S. Patent and Trademark Office, Notice of Allowance regarding U.S. Appl. No. 11/929,501, dated Apr. 12, 2013, 51 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,015, dated Aug. 23, 2013, 49 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 11/929,445, dated Sep. 23, 2013, 34 pages. |
U.S. Patent and Trademark Office, Office action regarding U.S. Appl. No. 12/764,032, dated Oct. 1, 2013, 60 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 13/476,947, dated Dec. 23, 2013, 20 pages. |
State Intellectual Property Office, Office Action for Chinese Application No. 201010144167.7, dated Feb. 4, 2013, 7 pages. |
State Intellectual Property Office, Office Action for Chinese Application No. 201010144174.7, dated Jul. 24, 2013, 5 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 11/929,445, dated Feb. 11, 2014, 31 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 12/783,504, dated Mar. 6, 2014, 29 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 10/053,651, dated Aug. 23, 2006, 5 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 10/053,651, dated Dec. 8, 2008, 11 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 10/053,651, dated Mar. 10, 2009, 20 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/581,868, dated Sep. 2, 2011, 6 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/606,178, dated Jan. 27, 2012, 50 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 11/929,501, dated Jun. 13, 2012, 38 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 12/764,015, dated Dec. 18, 2012, 23 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 11/929,501, dated Jan. 3, 2013, 20 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 11/929,501, dated Sep. 3, 2013, 19 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,015, dated Apr. 4, 2014, 54 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 13/477,987, dated Feb. 25, 2014, 104 pages. |
U.S. Appl. No. 60/224,701 dated Aug. 11, 2000. |
U.S. Appl. No. 60/227,878 dated Aug. 25, 2000. |
U.S. Appl. No. 60/243,654 dated Oct. 26, 2000. |
U.S. Appl. No. 60/208,967 dated Jun. 2, 2000. |
U.S. Appl. No. 60/220,047 dated Jul. 21, 2000. |
U.S. Appl. No. 60/239,320 dated Oct. 10, 2000. |
State Intellectual Property Office, Office Action for Chinese Application No. 201010144167.7, dated Feb. 8, 2014, 4 pages. |
State Intellectual Property Office of China, Fifth Office Action for CN201010144174.7, dated Apr. 16, 2014, 7 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,015, dated Apr. 28, 2014, 51 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 11/929,445, dated Jun. 25, 2014, 61 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 13/476,947, dated Jun. 26, 2014, 85 pages. |
State Intellectual Property Office of China, Notice of Allowance for Chinese Patent Application No. 201010144167.7, dated Jul. 24, 2014, 3 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 13/477,987, dated Jun. 6, 2014, 26 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,015, dated Sep. 11, 2014, 29 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,032, dated Sep. 12, 2014, 29 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 13/477,987, dated Sep. 26, 2014, 42 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 11/929,445, dated Oct. 24, 2014, 33 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/783,504, dated Nov. 7, 2014, 45 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/783,504, dated Aug. 29, 2014, 78 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,015, dated Dec. 18, 2014, 29 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/764,032, dated Dec. 31, 2014, 38 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 13/476,947, dated Jan. 8, 2015, 55 pages. |
U.S. Patent and Trademark Office, Supplemental Notice of Allowability for U.S. Appl. No. 13/477,987, dated Jan. 21, 2015, 7 pages. |
U.S. Patent and Trademark Office, Corrected Notice of Allowability for U.S. Appl. No. 12/783,504, dated Mar. 5, 2015, 26 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 12/783,504, dated Apr. 24, 2015, 36 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 15/051,534, dated Sep. 15, 2016, 77 pages. |
U.S. Patent and Trademark Office, Office Action for U.S. Appl. No. 15/359,147, dated Dec. 30, 2016, 76 pages. |
European Patent Office, Examination Report for European Patent Application No. 01985549.3, dated Jan. 10, 2017, 10 pages. |
U.S. Patent and Trademark Office, Final Office Action for U.S. Appl. No. 15/051,534, dated May 4, 2017, 12 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 15/359,147, dated Jul. 25, 2017, 13 pages. |
U.S. Patent and Trademark Office, Notice of Allowance for U.S. Appl. No. 15/051,534, dated Jul. 27, 2017, 18 pages. |
Number | Date | Country | |
---|---|---|---|
20020097433 A1 | Jul 2002 | US |
Number | Date | Country | |
---|---|---|---|
60262764 | Jan 2001 | US |