Multi-Touch Screens and Mouse-Driven Screens – Comparison Essay

Digital devices rely on input and output devices that allow humans to interact with the system. Early input devices such as keyboards, mice, joysticks, and trackballs were usually built separately from the main device. Later, the keyboard and a mouse sensor were integrated into the laptop. This technology then advanced to the touch screen. Today, most modern devices, including smartphones, tablets, mini computers, ATM screens, and TVs, allow the user to point, select, drag, click, and type by tapping the screen. Mouse-driven screens are mostly found on laptops and desktops used for intensive work. This paper compares multi-touch screens and mouse-driven screens, noting their similarities and differences.

Metaphors Used in the Design of Applications that Run on Each Type of Screen

A touchscreen is an electronic visual display that can detect the presence and location of a touch within the display area. The term normally refers to touching the display with a hand or finger, though touch screens can also sense passive objects such as a stylus. The term covers any cathode ray tube (CRT) or liquid crystal display (LCD) monitor technology that accepts direct onscreen input. The main metaphors commonly used on these screens include the touch sensor, the touch controller, and gestures such as tapping, dragging, and sliding. In this technology, the sensor registers the touch and passes the command to the controller, which forwards it to the system (Bhalla & Bhalla, 1). A mouse-driven screen is operated with a mouse interfaced with the device. The mouse contains two buttons for left-clicking and right-clicking. The main metaphors in this case include clicking, and clicking and dragging to select or highlight. The left click is mostly used to position the cursor, while the right click invokes a command drop-down menu. Dragging while holding the left button highlights content or moves objects. Here, the mouse click transmits the command to the system through the communication bus. In both cases, commands are executed and the output is displayed on the screen.
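The shared metaphor described above — a physical gesture resolving to an abstract command before the system executes it — can be sketched in a few lines. This is a hypothetical illustration; the event names, the `resolve_command` function, and the gesture-to-command mappings are illustrative assumptions, not a real platform API.

```python
# Hypothetical sketch: both a mouse event and a touch gesture resolve to
# the same abstract command before the system executes it. All names and
# mappings below are illustrative assumptions.

MOUSE_COMMANDS = {
    "left_click": "select",        # position the cursor / activate an item
    "right_click": "context_menu", # invoke a command drop-down menu
    "left_drag": "highlight",      # hold and drag to highlight or move
}

TOUCH_COMMANDS = {
    "tap": "select",               # the touch equivalent of a left click
    "long_press": "context_menu",  # the touch equivalent of a right click
    "slide": "highlight",          # drag a finger to highlight or scroll
}

def resolve_command(device: str, event: str) -> str:
    """Map a raw input event to an abstract command, ignoring unknown events."""
    table = MOUSE_COMMANDS if device == "mouse" else TOUCH_COMMANDS
    return table.get(event, "ignore")
```

With this mapping, `resolve_command("mouse", "left_click")` and `resolve_command("touch", "tap")` both yield the same `"select"` command, which is the equivalence the essay develops below.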

Differences in Interaction Types and Styles that Apply for these Screens and Application Running on Them

The touch screen involves two main technologies: resistive and capacitive. Resistive touch screens contain acrylic panels coated with conductive and resistive layers of indium tin oxide separated by a small gap; they register user input through pressure on the screen. Capacitive touch screens contain a conductive laminate covering a glass screen; input is registered when the user touches the screen and current flows from the laminate to the user's fingers (Travis & Murano, 3). Although multi-touch and mouse-driven screens use different hardware, they employ a similar approach to initiating and communicating commands. Both interfaces rest on the same basic cognitive functions of seeing, positioning, and acting, whether the action is a tap or a click. In both cases, the user identifies the needed item on the screen, moves a finger or the cursor to that position, and then clicks (with a mouse) or taps (with a touch screen). A tap on a touch screen is equivalent to a left click. However, positioning on a touch screen is more direct than on a mouse-driven screen: the hand moves straight to the target without mediating cursor movement, whereas a mouse must be dragged across a surface to move the cursor from one position to another. The mouse lets the system distinguish commands by whether the right or left button was clicked. Similarly, a touch screen lets users issue different commands with different touches; for instance, a long press can invoke the copy command when the option is not readily available (Forlines et al., 2).
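The long-press example above — one contact point standing in for two mouse buttons — usually comes down to timing. A minimal sketch, assuming a duration threshold (the 500 ms value here is an illustrative choice, not a platform standard):

```python
# Hypothetical sketch: distinguishing a tap (left-click equivalent) from a
# long press (right-click / copy-menu equivalent) by how long the finger
# stays down. The threshold value is an illustrative assumption.

LONG_PRESS_THRESHOLD_MS = 500

def classify_touch(press_time_ms: int, release_time_ms: int) -> str:
    """Return 'long_press' if the finger stayed down past the threshold,
    otherwise 'tap'."""
    duration = release_time_ms - press_time_ms
    return "long_press" if duration >= LONG_PRESS_THRESHOLD_MS else "tap"
```

A quick 100 ms touch classifies as a `"tap"`, while holding for 600 ms classifies as a `"long_press"`, mirroring how a mouse distinguishes left from right clicks with separate buttons rather than timing.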

Conceptual Model Employed in the Design of Multi-Touch Screens and Mouse-Driven Screens

According to Bhalla and Bhalla (1), a basic touchscreen contains three primary components: a touch sensor, a controller, and software drivers. Being an input device, a touchscreen must be integrated with a PC or other display device to form a complete touch input system. The touch sensor uses a voltage change to determine the touch location on the screen. The controller takes information from the touch sensor and converts it into a form the PC can understand; it also establishes the kind of connection or interface the PC needs. The software drivers are the PC system software that lets the computer and the touchscreen operate together, helping the operating system interpret the touch event messages sent by the controller.
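The three-stage pipeline described above can be sketched as three small functions. This is a hypothetical simplification under stated assumptions: the sensor is assumed to report normalized voltages in the 0.0-1.0 range, and all function names and values are illustrative, not taken from any real driver API.

```python
# Hypothetical sketch of the three touchscreen components: the sensor
# registers a voltage change, the controller converts it into coordinates
# the PC understands, and the software driver reports a touch event to
# the operating system. Names and numbers are illustrative assumptions.

def sensor_read(raw_voltage_x: float, raw_voltage_y: float) -> dict:
    """Touch sensor: registers a normalized (0.0-1.0) voltage change
    at the touch location."""
    return {"vx": raw_voltage_x, "vy": raw_voltage_y}

def controller_translate(reading: dict, width: int, height: int) -> dict:
    """Controller: converts raw voltages into pixel coordinates."""
    return {"x": int(reading["vx"] * width), "y": int(reading["vy"] * height)}

def driver_dispatch(coords: dict) -> str:
    """Software driver: describes the touch event for the operating system."""
    return f"touch_event at ({coords['x']}, {coords['y']})"

# A touch at mid-width, quarter-height on a 1920x1080 display:
event = driver_dispatch(controller_translate(sensor_read(0.5, 0.25), 1920, 1080))
```

Each stage knows nothing about the others beyond its input format, which reflects why the components can be sourced and replaced independently.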

In a mouse-driven screen, the mouse can be built in, as on laptops, or external and attached via a USB interface; others connect wirelessly, for example over Bluetooth. The command given by the mouse varies depending on whether it was left-clicked, right-clicked, or double-clicked. The mouse command is transmitted to the PC through a communication bus that allows the input devices, the processor, and the output devices to communicate. The main difference between the two interfaces is that the multi-touch screen has no separate left and right clicks; both are executed using taps (Forlines et al., 2).
