Having discussed machines, let's move on to a more important topic: those who interact with machines, namely users, operators, and, ultimately, humans.
From the perspective of the interaction loop, humans are quite similar to machines. Both have "processors" (CPUs and brains), and both have input and output "devices" (for humans, sense organs and muscles). Their nature, however, is very different: machines are formal and discrete, while humans are extremely flexible (ranging from rational to irrational) and essentially "analog".
We will discuss these differences later, but let's first look at human "inputs" and "outputs".
Humans perceive information from the external world via sense organs and some additional senses that are not localized. Let's review these "inputs":
- Vision (eyes)
- Hearing (ears)
- Touch (skin)
- Smell (nose receptors)
- Taste (tongue)
- Pain (nerves throughout most of the body)
- Orientation of the body in space (inner ear)
- Temperature (skin)
- Proprioception, the sense of relative limb position (muscle spindles)
Some of these senses are used frequently in HMI, while others see only limited use.
On the other side, human "outputs" are more limited, but some of them can be used by interactive devices to receive information from a human:
- Direct muscle activity
- Voice (strictly speaking, also a result of muscle activity, but quite different in purpose and result)
- Limb and body position/location (indirect muscle activity)
- Body parameters (temperature, heart rate, electrodermal activity)
- Brain activity
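The two lists above can be captured as a small data model, which is handy when reasoning about which channels a given interactive device covers. Below is a minimal Python sketch; the channel names, the `common_in_hmi` judgments, and all identifiers are illustrative assumptions for this chapter, not an established taxonomy:

```python
from dataclasses import dataclass
from enum import Enum


class Direction(Enum):
    INPUT = "input"    # human perceives: machine/world -> human
    OUTPUT = "output"  # human acts: human -> machine


@dataclass(frozen=True)
class Channel:
    name: str
    organ: str
    direction: Direction
    common_in_hmi: bool  # rough, illustrative judgment of HMI usage


# Human "inputs" and "outputs" from the lists above.
CHANNELS = [
    Channel("vision", "eyes", Direction.INPUT, True),
    Channel("hearing", "ears", Direction.INPUT, True),
    Channel("touch", "skin", Direction.INPUT, True),
    Channel("smell", "nose receptors", Direction.INPUT, False),
    Channel("taste", "tongue", Direction.INPUT, False),
    Channel("pain", "nerves", Direction.INPUT, False),
    Channel("orientation", "inner ear", Direction.INPUT, False),
    Channel("temperature", "skin", Direction.INPUT, False),
    Channel("proprioception", "muscle spindles", Direction.INPUT, False),
    Channel("muscle activity", "muscles", Direction.OUTPUT, True),
    Channel("voice", "vocal tract", Direction.OUTPUT, True),
    Channel("body position", "limbs", Direction.OUTPUT, True),
    Channel("body parameters", "heart, skin, etc.", Direction.OUTPUT, False),
    Channel("brain activity", "brain", Direction.OUTPUT, False),
]


def channels(direction: Direction, hmi_only: bool = False) -> list[str]:
    """List channel names for one direction, optionally only those common in HMI."""
    return [
        c.name
        for c in CHANNELS
        if c.direction is direction and (c.common_in_hmi or not hmi_only)
    ]


print(channels(Direction.INPUT, hmi_only=True))
```

A device designer could query this table to check, for example, which human input channels a purely visual interface leaves unused.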