Buxton, W. (1994). Combined keyboard / touch tablet input device. XEROX Disclosure Journal, 19(2), March/April 1994, 109-111.



William A. S. Buxton

Proposed Classification
U.S. CL. 341/022
Intl. CL. H03k 17/94

Systems with graphical user interfaces (GUIs) are often called direct manipulation systems. One way to improve such systems is to provide more versatile input mechanisms in order to improve the range and directness of the manipulations the user can perform. Touch tablets have been shown to be very useful in increasing the range of direct manipulation actions a user is able to perform when interacting with a GUI.[2] In particular, two techniques have been developed for expanding the versatility of touch tablets as direct manipulation input devices. One such technique involves partitioning the touch-sensitive surface of the tablet into multiple locations, or input windows, analogous to windows on a display screen, in order to create multiple virtual input devices. Touch actions on the surface of the tablet are interpreted differently depending on where each touch occurs, and thus a single touch tablet input device can function as a number of virtual input devices.[1] A second technique involves providing touch tablets with the ability to sense the location (and possibly pressure) of each of many simultaneous points of contact.[3] Such a multi-touch tablet supports nearly simultaneous direct manipulation inputs, and when coupled with the technique of multiple input windows just described, enables two-handed or multi-fingered input for a variety of closely coupled manipulations. For example, a user would be able to scroll through a window's contents with one hand while zooming or scaling with the other, or would be able to simultaneously adjust multiple parameters using linear controllers (called sliders). Thus, a relatively large, multi-touch receptive touch surface that supports multiple input windows would be a useful additional input device in a direct manipulation system.
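The two techniques described above can be sketched together in code. The following is an illustrative sketch only, not an implementation from the disclosure or its references: the tablet surface is partitioned into rectangular input windows, and each of several simultaneous touch points (with pressure) is routed to the virtual device whose region it falls in. All names (`Region`, `dispatch`, the region layout) are hypothetical.

```python
# Hypothetical sketch of the "input windows" technique: the touch
# surface is divided into regions, each acting as a virtual input
# device, and every simultaneous touch is routed to the region it hits.
from dataclasses import dataclass

@dataclass(frozen=True)
class Region:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x, y):
        return self.x0 <= x < self.x1 and self.y0 <= y < self.y1

def dispatch(regions, touches):
    """Map each simultaneous touch (x, y, pressure) to the virtual
    device whose region contains it, reporting region-local
    coordinates; touches outside every region are ignored."""
    events = []
    for (x, y, pressure) in touches:
        for r in regions:
            if r.contains(x, y):
                events.append((r.name, x - r.x0, y - r.y0, pressure))
                break
    return events

# Two-handed use: scroll with one hand, zoom with the other.
regions = [Region("scroll", 0.0, 0.0, 0.5, 1.0),
           Region("zoom",   0.5, 0.0, 1.0, 1.0)]
touches = [(0.25, 0.5, 0.8), (0.75, 0.5, 0.6)]  # two simultaneous contacts
print(dispatch(regions, touches))
# → [('scroll', 0.25, 0.5, 0.8), ('zoom', 0.25, 0.5, 0.6)]
```

Because dispatch depends only on where each contact lands, a single multi-touch surface supports as many concurrent virtual devices as there are regions.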
However, adding such a device to a conventional direct manipulation system that already includes a pointing device, such as a mouse, and a keyboard would add to the desktop space required for the system (called the system's footprint).

The combined input device disclosed here is designed to permit the addition of a multifunctional touch tablet input device to a conventional direct manipulation system while maintaining the system's existing footprint. As shown in Figures 1A and 1B, this is accomplished by redesigning the conventional keyboard 20 to be the top surface of a two-sided (top and bottom) input device 10 and mounting the touch tablet 30 on the bottom surface. Keyboard 20 is a conventional QWERTY keyboard that has small raised feet 22, 24, 26 and 28 (or some similar mechanism) that prevent the keys from being depressed when the touch tablet is being used and the keyboard is the bottom surface of device 10. Keyboard 20 also preferably includes a sensor (not shown), such as a mercury switch, which communicates the state of the keyboard (active or inactive) to the system. The touch tablet side 30 of device 10, shown more completely in Figure 1B, may be designed so as to permit a physical template 50 to be easily mounted thereon. Such a template is used to provide tactile feedback about the mapping of the touch surface to specific functions or virtual devices that are operational thereon.[2]
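The host system's use of the orientation sensor can be illustrated with a minimal sketch. This is an assumption about how a driver might behave, not part of the disclosure: when the mercury switch reports the keyboard facing up, keystrokes are accepted and tablet touches are ignored, and the reverse after the device is flipped. The class and method names are hypothetical.

```python
# Illustrative sketch: the mercury-switch state gates which side of
# the two-sided device is active at any moment.
class CombinedDevice:
    def __init__(self):
        self.keyboard_up = True  # state reported by the mercury switch

    def flip(self):
        # The user physically turns the device over; the switch toggles.
        self.keyboard_up = not self.keyboard_up

    def handle_keystroke(self, key):
        # Keys are mechanically protected by the feet when face-down;
        # the driver also ignores them.
        return key if self.keyboard_up else None

    def handle_touch(self, point):
        return point if not self.keyboard_up else None

dev = CombinedDevice()
assert dev.handle_keystroke("a") == "a"      # keyboard side up
dev.flip()                                   # device turned over
assert dev.handle_keystroke("a") is None
assert dev.handle_touch((0.3, 0.3)) == (0.3, 0.3)
```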

To use the touch tablet side 30 of input device 10, the user simply flips input device 10 over from the keyboard side 20 to the touch tablet side 30.

The design of the combined keyboard/touch tablet device recognizes that direct manipulation actions such as those supported by the touch tablet are generally not tightly coupled with keyboard actions; that is, it is not very likely that an application will require a user to frequently switch between entering keyboard strokes, which are typically in the character domain, and entering direct actions, which are typically in the graphical domain.

Numerous applications may be supported by the addition of a touch tablet as an input device to a direct manipulation system. Three applications are described here briefly, to illustrate the device's versatility. In the context of a financial modeling application that includes one or more spreadsheets, the keyboard side of the device may be used in a conventional manner to create the spreadsheet and to enter financial data and formulas. For exploring "what if" scenarios, however, the touch tablet surface of the device could be used with multiple virtual sliders to control the values of the data in several key cells. The user could then investigate the relationship among those cells by manipulating their values in much the same manner as sound variables are manipulated on a physical audio mixing console. In applications for the design of computer user interfaces that include touch-sensitive surfaces, such as those found in some photocopiers or duplicators, prototyping and testing of such interfaces could be significantly improved with the addition of the touch tablet device to the design system, since direct testing could be easily accomplished while designs were being constructed, without the need to build costly physical prototypes. In the domain of computer-supported keyboard music composition and education, a touch tablet having a template of an instrument's keyboard could be easily used to teach music theory, performance, or composition without the need for connecting a separate physical device to the system.
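The spreadsheet "what if" example above can be sketched as follows. This is a hedged illustration, not part of the disclosure: each virtual slider maps a finger's normalized vertical position to a value in a bound cell, like a fader on a mixing console, and a multi-touch tablet lets several sliders move at once. The cell names and value ranges are invented for the example.

```python
# Illustrative sketch: virtual sliders on the tablet drive values in
# "key cells" of a spreadsheet for exploring what-if scenarios.
def slider_value(y, lo, hi):
    """Map a normalized vertical touch position y in [0, 1] to a
    cell value in the range [lo, hi]."""
    return lo + y * (hi - lo)

# Three sliders bound to three hypothetical key cells, adjusted
# simultaneously with three fingers on a multi-touch surface.
bindings = {"B2": (0.0, 100.0), "B3": (0.0, 50.0), "B4": (-10.0, 10.0)}
touch_ys = {"B2": 0.5, "B3": 1.0, "B4": 0.25}   # current finger heights

cells = {cell: slider_value(touch_ys[cell], lo, hi)
         for cell, (lo, hi) in bindings.items()}
print(cells)
# → {'B2': 50.0, 'B3': 50.0, 'B4': -5.0}
```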


[1] Brown, E., Buxton, W. & Murtagh, K. (1990). Windows on tablets as a means of achieving virtual input devices. In D. Diaper et al. (Eds.), Human-Computer Interaction - INTERACT '90, Elsevier Science Publishers B.V. (North-Holland), 675-681.

[2] Buxton, W., Hill, R. & Rowley, P. (1985). Issues and techniques in touch-sensitive tablet input. Computer Graphics, 19(3), 215-224.

[3] Lee, S.K., Buxton, W. & Smith, K.C. (1985). A multi-touch three dimensional touch-sensitive tablet. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI’85), 21-27.