Despite the tremendous diversity of electronic and electromechanical devices such as telephones, microwave ovens, automated teller machines (ATMs), etc., there has been little innovation in the user interfaces associated with these devices in recent years. For example, these devices are conventionally preconfigured at a factory with a particular user interface and a predefined user interface functionality. The user interface of a device is a portion of the device that enables a user to interact with or control the device. As an example, a user interface often associated with telephones, microwave ovens, and ATMs is a keypad. User interface functionality defines how the device responds to the user's interactions with the user interface. As an example, the user interface functionality associated with pressing a key on a keypad may include displaying information on a display component associated with the user interface and taking an action.
User interfaces can include buttons, switches, dials, displays, and so forth. The user interface typically receives input from users and/or provides output to users. As examples, telephones and ATMs have buttons that enable users to make selections. Some ATMs and telephones, such as Voice over Internet Protocol (VoIP) telephones, may additionally have a display that provides visual output to the user, such as on a liquid crystal display (LCD), cathode ray tube (CRT), or other type of display. Such displays may include “touchscreen” functionality that enables users to make selections by touching an appropriate region of the display.
Manufacturers sometimes build different product models that are based on a common underlying platform. As an example, a manufacturer may produce multiple product models that implement different user interfaces. When these user interfaces enable different functionality, a portion of the electronic device corresponding to the implemented user interface may also need to change. As an example, when the user interface provides no physical buttons to enable entry of numbers (e.g., phone numbers), the display may need to enable selection of numbers, such as by providing virtual buttons with numbers. To change the functionality provided by electronic devices, manufacturers conventionally modify the electronic device, such as by changing configuration switches, replacing or reprogramming firmware or software, and so forth. Thus, customization of electronic devices can become expensive or time-consuming.
Facilities are provided for adapting the user interface functionality of devices based on a configuration of the user interface components. A device can receive one or more user interface components, detect the type of the received components, and adapt the user interface functionality based on the received components. The user interface component can be a portion of the user interface that is easily installed, such as during manufacture, distribution, or sales. Even a user can install a user interface component in some embodiments. As an example, the user interface component can be attached to a portion of the device that users can employ.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Facilities are provided for adapting user interface functionality of devices based on a configuration of the user interface components. In various embodiments, a device can receive one or more user interface components, detect the type of the received components, and adapt the user interface functionality based on the received components. The user interface component can be a portion of the user interface that is easily installed, such as during manufacture, distribution, or sales. Even a user can install a user interface component in some embodiments. As an example, the user interface component can be attached to a portion of the device that users can employ. Examples of devices that can receive such user interface components include communications devices, ATMs, and indeed many if not all electronic or electromechanical devices that have user interfaces.
In various embodiments, a user interface component is a thin “faceplate” that can be removably attached to the device, such as by sliding the faceplate into position, snapping it into position, and so forth. The faceplate may cover a portion of the device or the entire device. In some embodiments, the faceplate covers a user interface portion of the device. In some embodiments, the faceplate can be the housing for the device.
Each user interface component can be manufactured to provide a different configuration of keys, buttons, dials, etc. Each such configuration is a “type” of user interface component and the user interface component is identified by its type. The user interface component is able to communicate its type to the device with which it operates. As examples, the user interface component can communicate its type to the device via a wired connector or wirelessly, such as by using radio frequency identification (“RFID”) or other wireless communications means. In some embodiments, the user interface component communicates its type to the device with which it operates when queried by the device. As an example, the device can send a signal to a user interface component that is attached to the device and, in response, the user interface component reports its type.
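By way of illustration only, the query-and-respond exchange described above might be sketched as follows. The names used here (UIComponent, Device, handle_query, query_component_type) are hypothetical and stand in for whatever wired or wireless signaling a particular device and component actually employ.

```python
# Minimal sketch (hypothetical names): a device sends a type query to the
# attached user interface component, and the component answers with its type.

class UIComponent:
    """A faceplate that reports its type when queried by the device."""

    def __init__(self, component_type: str):
        self._component_type = component_type

    def handle_query(self, message: str) -> str:
        # Answer a type query; ignore anything else.
        if message == "QUERY_TYPE":
            return self._component_type
        return "UNSUPPORTED"


class Device:
    """Holds a reference to whichever component is currently attached."""

    def __init__(self):
        self.attached_component = None

    def attach(self, component: UIComponent) -> None:
        self.attached_component = component

    def query_component_type(self):
        if self.attached_component is None:
            return None
        # In practice this signal would travel over a wired connector,
        # RFID, or another wireless link, as described above.
        return self.attached_component.handle_query("QUERY_TYPE")


if __name__ == "__main__":
    phone = Device()
    phone.attach(UIComponent("TYPE_A"))   # e.g., a faceplate with 4x3 keys
    print(phone.query_component_type())   # -> "TYPE_A"
```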
The device detects the type of user interface components it receives, such as through a wired port or by wireless communications. Upon detecting the type, the device can load user interface functionality associated with that particular type of user interface component. As an example, the device can load the functionality from a memory associated with the facility. The memory can be located within the device housing, or it can be located at another device with which the device is capable of communicating. In various embodiments, the device may determine whether the user interface functionality associated with the type of user interface component it receives can be loaded from memory and, if not, attempt to load the user interface functionality from a networked device via, for example, an intranet or the Internet. The device can then operate according to the loaded user interface functionality.
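To make the loading step concrete, the sketch below assumes a simple lookup in device memory with a fallback fetch from a networked device; the table contents and the fetch_from_network placeholder are illustrative assumptions rather than any particular implementation.

```python
# Sketch (illustrative data and names): load user interface functionality for
# a detected component type from local memory, falling back to a networked
# device when the type is not found locally.

LOCAL_FUNCTIONALITY = {
    "TYPE_B": {"soft_keys": True, "physical_4x3": False},
}

def fetch_from_network(component_type):
    # Placeholder for retrieving functionality over an intranet or the
    # Internet; a real device might use HTTP or another protocol here.
    REMOTE_FUNCTIONALITY = {
        "TYPE_A": {"soft_keys": False, "physical_4x3": True},
    }
    return REMOTE_FUNCTIONALITY.get(component_type)

def load_functionality(component_type):
    functionality = LOCAL_FUNCTIONALITY.get(component_type)
    if functionality is None:
        functionality = fetch_from_network(component_type)
    return functionality

print(load_functionality("TYPE_B"))  # found in device memory
print(load_functionality("TYPE_A"))  # fetched from the "network"
```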
In some embodiments, the device is a Voice over Internet Protocol phone that can exchange VoIP messages with other devices. The VoIP phone can have a user interface with a display area and a keyboard area. The display area can include a “touchscreen.” The keyboard area can accommodate different user interface components. Examples of user interface components for a VoIP phone can include standard 4×3 telephone keys (e.g., indicating numerals 0-9 and symbols # and *), as commonly employed on conventional telephones, and programmable “soft keys” for storing commonly dialed locations and enabling features such as hold, conference calling, speakerphone, and so forth. As an example, a manufacturer may provide two types of user interface components: user interface component type A may provide standard 4×3 keys, whereas user interface component type B may not provide standard 4×3 keys. When the VoIP phone receives a user interface component (type A or B), it loads user interface functionality relating to that user interface component. The VoIP phone may also be preconfigured with the functionality associated with user interface component type B. This can be referred to as a default configuration. With this functionality, the touchscreen may provide “soft” 4×3 keys whether or not the VoIP phone is configured with standard 4×3 keys. When a user touches locations of the touchscreen corresponding to these keys, the device acts as if the user had pressed similar physical keys. Thus, this default behavior may be provided whether the VoIP phone receives user interface component type B or not. When the VoIP phone receives user interface component type A, it may load the corresponding user interface functionality from its memory or from the memory of another device located on the network, such as a server. Because user interface component type A provides the 4×3 keys, its user interface would not need to also provide the 4×3 keys on the touchscreen. Instead, it may provide access to other phone functionality. Alternatively, it may provide 4×3 keys both physically in the keyboard area as well as virtually in the display area. The manufacturer may thus market two different models of phones without the expense of actually building two different types of phones.
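The type A/type B example can be summarized as a small configuration table, as in the illustrative sketch below; the field names (physical_4x3, touchscreen_4x3, touchscreen_extras) are assumptions, but the sketch shows how the default (type B) behavior applies whenever no recognized component is detected.

```python
# Illustrative sketch of the VoIP phone example: type B behavior is also the
# default, so soft 4x3 keys appear on the touchscreen unless a type A
# component (with physical 4x3 keys) is detected.

DEFAULT_TYPE = "TYPE_B"

UI_FUNCTIONALITY = {
    "TYPE_A": {"physical_4x3": True, "touchscreen_4x3": False,
               "touchscreen_extras": ["hold", "conference", "speakerphone"]},
    "TYPE_B": {"physical_4x3": False, "touchscreen_4x3": True,
               "touchscreen_extras": []},
}

def configure_phone(detected_type):
    # Unknown or missing component -> behave like type B (the default).
    return UI_FUNCTIONALITY.get(detected_type, UI_FUNCTIONALITY[DEFAULT_TYPE])

print(configure_phone("TYPE_A"))  # physical keys; touchscreen used for extras
print(configure_phone(None))      # default: soft 4x3 keys on the touchscreen
```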
Thus, in various embodiments the facility enables a manufacturer to easily adapt devices according to different needs. A manufacturer of a device can provide different user interfaces for the device. As an example, a manufacturer of a VoIP phone may customize the user interface to provide multiple models without bearing the expense of manufacturing several different housings, user interfaces, and so forth. The manufacturer can provide customers with their choice of user interfaces. For example, a customer may purchase a single type of phone for use by engineers and executives but change the functionality provided by the phones by installing a first user interface component type for use by the engineers and a second, different user interface component type for use by the executives.
The facilities will now be described with reference to the figures.
The device 102 can be compatible with, and receive, a plurality of different user interface components, such as components 112a-112n. Each user interface component 112 may have various keys, buttons, dials, etc., as is illustrated. As an example, user interface component 112b has a set of 4×3 keys 130. The user interface components can also have apertures 128a-128n to enable a user to view or access the display area 108. The apertures 128a-128n may optionally be covered by a clear plastic or another transparent material.
The device 102 may have one or more connectors 124 that interface with corresponding connectors associated with user interface components 112a-112n. As examples, user interface component 112a has connector 118a, user interface component 112b has connector 118b, and user interface component 112n has connector 118n. Although the illustrated embodiment shows a female connector 124 on the device 102 and corresponding male connectors on the user interface components 112a-112n, the respective orientations of the connectors could be reversed. In various embodiments, the connectors may make contact without male or female ends. In various embodiments, the connection can be a wireless connection. For example, the user interface components 112a-112n may have RFID chips and the device may have an RFID transponder or vice versa. The device 102 can receive type information about the user interface component and user inputs (e.g., key selections or other input) via the connectors 124/118a-118n. In various embodiments, the device 102 and user interface components 112a-112n may have multiple connections. As an example, the device 102 may query the user interface component for its type via a first connection but receive user input via a second connection.
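The multiple-connection arrangement might be sketched as two independent links, one answering the type query and one delivering key selections; the classes below are hypothetical stand-ins for whatever pins, buses, or wireless channels a given device provides.

```python
# Sketch (hypothetical classes): one connection answers type queries while a
# second connection delivers user input such as key selections.

class TypeConnection:
    def __init__(self, component_type):
        self._component_type = component_type

    def query(self):
        return self._component_type


class InputConnection:
    def __init__(self):
        self._events = []

    def press(self, key):
        self._events.append(key)

    def read_events(self):
        events, self._events = self._events, []
        return events


type_link = TypeConnection("TYPE_A")
input_link = InputConnection()
input_link.press("5")
input_link.press("#")

print(type_link.query())         # -> "TYPE_A"
print(input_link.read_events())  # -> ['5', '#']
```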
The device 102 may connect to one or more networks (not shown) via, e.g., a network connection cable 126. The network connection can employ digital or analog networks, such as Ethernet, telephone, etc. The network connection can also be wireless, such as over IEEE 802.11, infrared, Bluetooth, etc. The device 102 may use the network connection to load user interface functionality. The device 102 can also use the network connection to enable communications. As an example, a VoIP telephone may use an Ethernet or IEEE 802.11 connection to enable voice or video conversations. The device may also connect to the network via a computer (not shown), such as by using a universal serial bus (USB), serial communications port, parallel communications port, wireless network adapter, Ethernet network adapter, and so forth.
Although a telephone-type device is shown in
In the embodiment illustrated in
The mobile device 202 may have one or more connectors 208 to interface and/or communicate with the user interface components 204a-204b when they are attached to the device 202. In operation, the connector 208 can identify the type of interface component, receive user input, and so forth.
The devices may also have one or more display areas 210, such as to provide output to a user. The display areas 210 may also receive user input via, e.g., a touchscreen. Examples of touchscreens include those employed by handheld computing devices (e.g., MICROSOFT POCKET PC), tablet computing devices, etc. Such touchscreens may receive input via a finger or electromechanical device, such as a stylus.
In the embodiment of
At block 508, the routine 500 detects the type of user interface component that is attached. As an example, the routine can detect the type of user interface component by querying the user interface component via, e.g., a physical or wireless connector. At decision block 510, the routine 500 determines whether the user interface component type provided by the user interface component is recognizable. As an example, the routine may check a table of user interface component types stored in a memory associated with the device. When the type is recognizable, the routine 500 continues at block 512. Otherwise, the routine 500 continues at block 514. In some embodiments, such as when the user interface component type is not stored in memory directly associated with the device, the routine 500 may check a different memory, e.g., memory associated with a server computing device or other repository, to determine whether information about the type is stored in the server's memory. The routine may perform this step so that a default user interface functionality is not loaded, such as at block 514. The server's memory can be primary or secondary storage.
At block 512, the routine 500 loads user interface functionality associated with the detected type of user interface component. The routine 500 may load user interface functionality from a memory associated with the device or a remote memory. As an example, when the user interface functionality is not stored in device memory, the routine may load it from a server computing device. At block 514, the routine 500 loads a default user interface functionality, such as from the device memory. The default user interface functionality may provide device features when a user interface component is not installed or when the device does not recognize the installed user interface component. The routine 500 then continues at block 516, where it returns.
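The decision flow of blocks 508-516 can be traced in the brief sketch below; the detection result, the table contents, and the helper name are assumptions used only to illustrate the branches described above.

```python
# Sketch of blocks 508-516 (illustrative tables and names): detect the type,
# check the device's table, optionally consult a server's table, and fall
# back to a default user interface functionality when the type is unknown.

DEVICE_TYPE_TABLE = {"TYPE_B": "functionality stored in device memory"}
SERVER_TYPE_TABLE = {"TYPE_A": "functionality stored on a server"}
DEFAULT_FUNCTIONALITY = "default user interface functionality"

def handle_attached_component(detected_type):
    if detected_type in DEVICE_TYPE_TABLE:       # block 510: recognized locally
        return DEVICE_TYPE_TABLE[detected_type]  # block 512: load from device
    if detected_type in SERVER_TYPE_TABLE:       # check a server's memory
        return SERVER_TYPE_TABLE[detected_type]  # block 512: load remotely
    return DEFAULT_FUNCTIONALITY                 # block 514: default

print(handle_attached_component("TYPE_B"))  # from device memory
print(handle_attached_component("TYPE_A"))  # from the server
print(handle_attached_component("TYPE_X"))  # unrecognized -> default
```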
Those skilled in the art will appreciate that the logic illustrated in
At block 606, the device is configured, such as by adding the assigned user interface component type to a location, such as a table, in device memory. The user interface component type may also be added to memory in a server instead of or in addition to the device memory.
At block 608, the user interface functionality associated with the user interface component is added, such as to the device memory or a server memory. At block 610, the routine returns.
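Blocks 606-610 amount to registering the new type and its associated functionality in one or more tables; the sketch below uses hypothetical dictionaries standing in for device memory and server memory.

```python
# Sketch of blocks 606-610 (hypothetical names): record a newly assigned
# component type and its functionality in device memory, and optionally in a
# server's memory as well.

device_type_table = {}
server_type_table = {}

def register_component_type(type_id, functionality, also_on_server=False):
    device_type_table[type_id] = functionality      # blocks 606 and 608
    if also_on_server:
        server_type_table[type_id] = functionality  # optional server copy

register_component_type("TYPE_C", "functionality for a new faceplate",
                        also_on_server=True)
print(device_type_table)
print(server_type_table)
```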
In various embodiments, the connectors between user interface components and devices may also provide output from the devices to the user interface components, such as to provide power to a light (such as a key backlight), change labels on keys, buttons, dials, and so forth.
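Such device-to-component output could be sketched as a small command set carried over the connector; the command names below (BACKLIGHT_ON, SET_LABEL) are purely illustrative.

```python
# Sketch (illustrative command names): the device sends commands to the
# attached component, e.g., to power a key backlight or relabel a key.

class Faceplate:
    def __init__(self):
        self.backlight_on = False
        self.key_labels = {}

    def handle_command(self, command, payload=None):
        if command == "BACKLIGHT_ON":
            self.backlight_on = True
        elif command == "SET_LABEL" and payload is not None:
            key, label = payload
            self.key_labels[key] = label


plate = Faceplate()
plate.handle_command("BACKLIGHT_ON")
plate.handle_command("SET_LABEL", ("soft_key_1", "Conference"))
print(plate.backlight_on, plate.key_labels)
```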
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. Accordingly, the invention is not limited, except as by the appended claims.