KCU Flashcards
What is HCI?
- Human–computer interaction (HCI) is the study of interaction between people and computers
- But Internet-of-Things (ubiquitous computing) means that HCI is becoming…
- The study of interaction between people and machines
Goal of HCI
- Major Goal
- To improve the interactions between people and computers (machines?)
- Making computers more usable and receptive to the user’s needs.
- Long Term Goal
- To design systems that minimize the barrier between the human's cognitive model of what they want to accomplish and the computer's understanding of the user's task.
Usability, User Experience, User Centered Design
- Three important sub-domains of HCI
- Usability
- User Experience (UX)
- User-Centered Design
- Each of these sub-domains is concerned with the design and evaluation of the user interface.
- However, each has a different scope and focus.
Usability
- Usability is an essential concept in HCI, concerned with making systems easy to learn and easy to use, and with limiting the number and severity of errors.
- Usability refers to how successfully a user can use a system to accomplish a specific goal.
- Uses terms like error rates, time to complete tasks, task failures, number of lookups made, etc.
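As a rough illustration of how such quantitative usability measures might be computed from logged test sessions, here is a minimal Python sketch. The record structure, field names, and values are invented assumptions for the example, not part of any standard tool.

```python
# Hypothetical usability-test log: one record per task attempt.
# All field names and values below are illustrative assumptions.
sessions = [
    {"user": "P1", "completed": True,  "time_s": 42.0, "errors": 1},
    {"user": "P2", "completed": True,  "time_s": 55.5, "errors": 0},
    {"user": "P3", "completed": False, "time_s": 90.0, "errors": 4},
]

# Task success rate: fraction of attempts that reached the goal.
success_rate = sum(s["completed"] for s in sessions) / len(sessions)

# Mean time on task, computed over successful attempts only.
completed_times = [s["time_s"] for s in sessions if s["completed"]]
mean_time_on_task = sum(completed_times) / len(completed_times)

# Error rate: average number of errors per attempt.
error_rate = sum(s["errors"] for s in sessions) / len(sessions)

print(f"Success rate:      {success_rate:.0%}")
print(f"Mean time on task: {mean_time_on_task:.1f} s")
print(f"Errors per task:   {error_rate:.2f}")
```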
User Experience
- User experience encompasses an end user's entire experience with an interface: not just how well the interface worked, but how they expected it to work, how they feel about using it, and how they feel about the system overall.
- Uses terms like satisfaction, intuitive, frustration, good experience, difficult, confusing.
User Centered Design
- User-centered design (UCD) is an iterative design process in which designers focus on the users and their needs in each phase of the design process.
- In UCD, design teams involve users throughout the design process via a variety of research and design techniques, to create highly usable and accessible products for them.
Why is HCI Important?
- Why do we so often see user interfaces that are inefficient, confusing, or difficult to use (sometimes barely usable)?
- Badly designed interfaces can waste the user's time, cause frustration, and lead to errors.
- Users often leave websites or apps with bad interfaces in frustration.
- Is it because developers:
- Don’t care?
- Don’t have the time?
- Don’t really know what makes good design?
- For web apps and desktop apps, the principles of good UI design have been well understood since the 1990s.
- However, the application of these design principles has often been neglected. Why?
- Prioritizing functionality over usability/user experience
- Budgetary and time constraints (overruns are common)
- Where does usability/user experience fit into the software development lifecycle?
Is HCI a Solved Problem?
- Adoption of NUI methods and the Metaverse is changing everything.
- Applying HCI to NUI and the Metaverse is far from a solved problem.
- NUI technologies are now past the threshold for a paradigm shift.
- This can lead to a redesign of interfaces around NUI methods.
- Currently, NUI methods often just simulate mouse point-and-click.
- NUI methods are the foundation of the Metaverse.
Paradigms in HCI
- Command Line Interfaces (CLI): the user types commands to the computer in the form of text.
- Graphical User Interfaces (GUI): the user directly manipulates graphical representations on a computer screen with a pointing device.
- Natural User Interfaces (NUI): simulate more natural, real-world interactions.
- Metaverse: immersive interfaces.
Command Line Interfaces (CLI)
- CLIs are text-based: users control the computer by typing in commands.
- CLIs require little processing power and are extremely powerful, but they can take longer to learn than a GUI.
- Originally, most interfaces were CLIs.
- They still exist within modern operating systems, for example the Command Prompt app in Windows and Terminal in macOS. They are often used in IoT devices.
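To make the CLI paradigm concrete, here is a minimal read-eval loop in Python. The command names (`time`, `echo`, `quit`) are invented for the example and do not correspond to any real shell.

```python
import datetime

# A minimal command-line interface: read a typed command, act on it, repeat.
# The command set here is invented purely for illustration.
def cli() -> None:
    while True:
        line = input("> ").strip()
        if not line:
            continue
        command, _, argument = line.partition(" ")
        if command == "quit":
            break
        elif command == "time":
            # Print the current time in ISO format.
            print(datetime.datetime.now().isoformat(timespec="seconds"))
        elif command == "echo":
            # Repeat whatever the user typed after the command.
            print(argument)
        else:
            print(f"Unknown command: {command}")

if __name__ == "__main__":
    cli()
```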
Graphical User Interfaces (GUI)
- Desktop Metaphor
- Based on Point-and-Click
- Not adapted to NUI input modalities
- Often, NUI inputs simulate point-and-click.
- So not much has changed since 1984
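For contrast with the CLI sketch above, here is a minimal point-and-click GUI using Python's standard tkinter toolkit; the window title and button label are arbitrary choices for the example.

```python
import tkinter as tk

# A minimal GUI in the point-and-click paradigm: the user manipulates an
# on-screen widget (a button) with the pointing device instead of typing.
root = tk.Tk()
root.title("GUI sketch")

label = tk.Label(root, text="Click the button")
label.pack(padx=20, pady=10)

def on_click() -> None:
    # Handler invoked when the button is clicked with the mouse.
    label.config(text="Button clicked")

button = tk.Button(root, text="Click me", command=on_click)
button.pack(padx=20, pady=10)

root.mainloop()
```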
Natural User Interfaces (NUI)
- Natural User Interfaces
- Mimic real-world interactions
- Not yet fully developed
- Often used to simulate point-and-click in a GUI
- Potential to redesign the user interface around the NUI method
- Rather than restricting the NUI method to simulating point-and-click
Natural User Interfaces (NUI) examples
- Speech Recognition
- Voice assistants such as Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Assistant are the most popular examples and are embedded in smartphones or dedicated home speakers. Users can ask their assistants questions, control home-automation devices and media playback by voice, and manage other basic tasks such as email, to-do lists, and calendars with verbal commands.
- Brain-machine Interface
- The developing field of brain-machine interfaces reads brain signals and translates them into actions within a computer system.
- Brain-machine interfaces have many possible applications in the health sector, as they allow paralysed patients to communicate via a computer, control their environment (smart home), or control a wheelchair.
- Touch Screen
- Touch screen interfaces allow users to interact with a machine or device with the touch of a finger.
- Currently, this is the most common form of NUI application and is a natural and intuitive way to interact with computing devices.
- Gesture recognition
- Gesture tracking involves tracking user motions and physical actions and using these as input to computing devices.
- For example, the Nintendo Wii and PlayStation have controllers with accelerometers and gyroscopes that sense rotation, acceleration, and tilt, from which gestures and actions can be inferred (see the sketch after this list).
- Gaze Tracking
- Gaze tracking is an NUI method that estimates the gaze point on a display screen based on the user's eye movements.
- Windows 10 and 11 come with the Windows Eye Control API.
- Users can attach an eye tracker to their Windows PC and control the PC through eye-gaze.
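To make the gesture-recognition idea concrete, here is a very simplified Python sketch that infers a "shake" gesture by thresholding the magnitude of accelerometer readings. The sample data, threshold, and gesture definition are all assumptions for illustration and are not taken from any real controller API.

```python
import math

# Hypothetical accelerometer samples (x, y, z) in units of g, e.g. polled
# from a motion controller. The values below are made up for illustration.
samples = [
    (0.02, 0.01, 0.98),   # roughly at rest
    (0.05, -0.03, 1.02),
    (1.80, -0.40, 0.60),  # sudden movement
    (-1.60, 0.90, 1.40),
    (0.04, 0.00, 1.01),
]

SHAKE_THRESHOLD_G = 1.5   # magnitude above which a reading counts as a "jolt"
MIN_JOLTS_FOR_SHAKE = 2   # how many jolts before we infer a shake gesture

def magnitude(sample: tuple) -> float:
    """Euclidean magnitude of one accelerometer reading."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

# Count readings whose magnitude exceeds the threshold.
jolts = sum(1 for s in samples if magnitude(s) > SHAKE_THRESHOLD_G)

if jolts >= MIN_JOLTS_FOR_SHAKE:
    print("Gesture inferred: shake")
else:
    print("No gesture detected")
```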
Metaverse
- Immersion in VR
- Simulates an immersive environment
- Immersion in AR
- Augments the real environment
Defining the User Interface
- Proper interface design will provide a mix of well-designed input and output mechanisms that satisfy the user’s needs, capabilities, and limitations in the most effective way possible.
- The best interface is one that is not noticed, one that permits the user to focus on the information and the task at hand, not on the mechanisms used to present the information and perform the task.
- User interface design is a subset of HCI.
- HCI designers must consider a variety of factors:
* What people want and expect, and the physical limitations and abilities people possess.
* What people find enjoyable and attractive.
* How information processing systems work.
* Technical characteristics and limitations of the computer hardware and software must also be considered.
- The user interface is the part of a computer and its software that people can see, hear, touch, talk to, or otherwise understand or direct.
- Input and Output
- Input is some form of communication of requirements
- Predominant input method is Point-and-Click.
- Output is the results of processing user’s requirements
- Predominant output method is Display Screens
USER INTERFACE DESIGN LIFECYCLE
- Requirements Capture
* How are users currently completing their tasks?
* Observation
* Questioning
* However, users sometimes don't know what they need.
- Design Alternatives
* Develop user-interface designs to fulfil the requirements from requirements capture.
* Can draft various design options.
- Prototyping
* Create prototypes for the various design alternatives.
- Evaluation
* We take one or more of the prototypes and test the usability of the system.
* Test with users or usability experts.
REQUIREMENTS CAPTURE
- Aim is to better understand the problem space
- We start by analysing the users
- We collect information about how the user currently achieves their tasks.
- Talk to your clients; use market research and tools from qualitative research (interviews, observations, case studies, etc.).
- Methods used:
- Direct observation e.g., watch user conducting tasks.
- Survey e.g., questionnaire.
- Focus group e.g., meet with small group of users who discuss tasks.
- Interviews e.g., one-to-one interview with user
Types of Users
- Primary Users
- Use the design directly e.g., ‘end users’.
- Secondary Users
- Do not use the design directly.
- May provide input to design.
- May receive output from design.
- Tertiary Users
- Do not use the design at all.
- Affected by the design in some way.
- Direct users: These users actively use the system to carry out their tasks or duties. They directly interact with the system’s features.
- Indirect users: Indirect users receive data or information from the system but do not directly operate it. For example, bank customers who rely on system-generated account statements fall into this category.
- Remote users: Remote users do not directly interact with the system themselves. Instead, they depend on the system to provide output or results. Bank customers who check their account balances online without physically visiting a branch are remote users.
- Support users: Support users are part of the administrative or technical team responsible for maintaining and assisting other system users. They ensure the smooth functioning of the system and provide support to novices, intermediates, and experts.