
We often assess physical workload (how much a person can lift, carry, or sustain), but we rarely do the same for mental work.
Yet every job places demands on our minds.

  • How busy is the operator?
  • How complex are their tasks? Should we divide tasks into smaller, more manageable parts?
  • Will the operator be able to respond to unexpected events?
  • Can they handle additional responsibilities?
  • Should we simplify the interface? 

These questions all point to one key concept: mental workload, the invisible cognitive effort behind every decision, reaction, and task. Just as we recognize the limits of muscular strength, we must acknowledge that a person is not merely a source of physical force, but a complex information processor. In modern work environments, the brain is constantly taxed by the need to interpret sensory data, recall instructions, and make decisions under pressure. This ‘mental workload’ acts like an invisible weight; when it exceeds our mental capacity, the result is not physical strain, but a breakdown in attention, decision-making, and ultimately, safety.
Understanding and measuring mental workload is the first step toward designing safer, more efficient, and more human-centered workplaces.

What Is Mental Workload?

Mental workload (MWL) refers to the amount of cognitive effort required to perform a task [1,2]. It depends on two factors:

  1. The demands of the task, and
  2. The individual’s available mental capacity to meet those demands.

When a task is too easy (low mental workload), we often feel bored or disengaged. Over time, this can reduce alertness and delay reactions. For example, imagine monitoring an automated system for hours: the monotony can dull your awareness, making it easier to miss a critical alert.

When a task is too difficult (high mental workload), it can exceed your abilities and lead to frustration or mistakes. In the same way that prolonged weightlifting strains the muscles, engaging in complex tasks over time results in mental fatigue.

The goal is to reach the “sweet spot” where the challenge matches your skills, the so-called flow state, where you’re fully immersed and performing at your best.

Think of Your Brain Like a Computer:

  • Low Workload: The task uses very little of your RAM. You’re bored, and performance might actually suffer from a lack of engagement.
  • Optimal Workload: The task uses an appropriate, comfortable amount of RAM. This is where you are most focused and productive.
  • High Workload (Overload): The task demands more RAM than you have. Your system freezes, slows down, or makes errors. 
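The computer analogy above can be sketched as a simple demand-to-capacity ratio. The function and cut-offs below are purely illustrative assumptions, not measured values:

```python
def classify_workload(demand: float, capacity: float) -> str:
    """Classify mental workload from the ratio of task demand to
    available capacity. The 0.4 and 0.9 thresholds are illustrative
    assumptions, not empirically derived values."""
    if capacity <= 0:
        raise ValueError("capacity must be positive")
    ratio = demand / capacity
    if ratio < 0.4:
        return "underload"   # boredom, vigilance decrement
    elif ratio <= 0.9:
        return "optimal"     # the productive 'flow' band
    else:
        return "overload"    # errors, freezing, slowdowns
```

On this toy scale, `classify_workload(7.0, 10.0)` lands in the optimal band, while `classify_workload(11.0, 10.0)` signals overload.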

What Affects Mental Workload?

The demands placed on our minds depend on two main areas:
the task and the individual.

Task Factors:

  • Complexity [3] – How intricate is the task?
    Driving on a straight, empty highway in a driving simulator may feel easy. In contrast, navigating dense traffic, unpredictable obstacles, and dynamic weather conditions demands sustained attention and rapid reasoning, contributing to high mental workload.
  • Time Pressure [4] – Deadlines amplify cognitive load.
    Assembling routine parts at a steady pace may feel manageable, but when production quotas loom and the conveyor belt speeds up under tight shift deadlines, the ticking clock heightens stress, demands quicker decisions to avoid errors, and escalates the assembly line worker's overall mental workload.
  • Multitasking [5] – Switching between activities forces the brain to constantly refocus, increasing fatigue and errors. Consider an air traffic controller who juggles monitoring radar screens for multiple aircraft, communicating clearance instructions to pilots, coordinating with adjacent sectors or ground crews, and responding to sudden weather alerts or emergencies; each demands a rapid context shift that heightens cognitive strain and the risk of oversight in this high-stakes environment.
  • Uncertainty or Novelty [4] – Unclear instructions or new procedures demand extra processing.
    Adapting to the unfamiliar consumes more mental resources than routine work.

Individual Factors:

  • Experience & Skill [6] – Experts use efficient mental shortcuts, while novices must consciously think through each step, increasing their workload.
    Flight and driving simulators make this difference vivid: a veteran pilot handles turbulence instinctively and an experienced driver navigates traffic fluidly, keeping cognitive load low, while a beginner pilot verifies every control and a novice driver hesitates at each turn, consciously processing every step and risking overload.
  • Stress Level [7] – External stress drains mental capacity.
    When you’re already tense or anxious, even small tasks can feel overwhelming. Driving in high-traffic conditions might feel manageable on a good day, yet become intensely demanding when you’re under stress.

Why Mental Workload Matters in Industry

Maintaining optimal human performance under pressure is a universal challenge across all sectors.

As presented below, both mental overload and underload negatively affect common industrial roles. By using real-time brain activity monitoring, we can detect when a person’s mental workload is hitting a danger zone, either too high or too low, and automatically adjust the level of automation to keep the operator in the sweet spot of peak performance [3, 8, 9, 10].
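The adaptive-automation idea can be sketched as a minimal control step. Everything here is hypothetical: the normalized workload index, the thresholds, and the discrete automation levels are assumptions for illustration, not the method of the cited studies:

```python
def adjust_automation(mwl_index: float, automation_level: int,
                      low: float = 0.3, high: float = 0.7,
                      max_level: int = 3) -> int:
    """One step of a workload-adaptive automation policy.
    mwl_index: hypothetical normalized (0-1) workload estimate.
    Thresholds and level range are illustrative assumptions."""
    if mwl_index > high and automation_level < max_level:
        return automation_level + 1   # overload: offload work to automation
    if mwl_index < low and automation_level > 0:
        return automation_level - 1   # underload: hand tasks back to operator
    return automation_level           # in the sweet spot: leave unchanged
```

Called once per measurement cycle, this keeps the operator between the `low` and `high` bounds: high workload raises automation support, low workload hands work back.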


1) Training & Simulation (Aviation, Automotive, and Beyond)

The challenge: Training is often evaluated by outcomes (time, errors, pass/fail), and methods usually follow a one-size-fits-all approach, but these do not reveal how hard the task was for each trainee. Two trainees can achieve the same score at very different cognitive cost. In simulators, overload can lead to breakdowns, poor learning, and unsafe strategies, while underload indicates the training is not challenging enough to build robust skills [11, 12, 13]. Without an objective MWL signal, it is difficult to calibrate difficulty, pacing, and instructor interventions consistently across trainees and sessions.

Improvement potential with InnoBrain (real-time MWL tracker):

  • Objective training dosage: quantify whether scenarios actually reach the intended cognitive demand zone.
  • Personalized progression: adapt scenario difficulty and pacing to keep trainees in a productive learning band.
  • Evidence-based debriefing: pinpoint when and where overload/underload occurred to explain performance and guide coaching.

Operational actions enabled (examples):

  • If MWL is high: adjust scenario intensity (traffic density, secondary tasks), slow pacing, or provide stepwise guidance.
  • If MWL is low: add controlled complexity (interruptions, additional decision points), or increase variability to prevent “training by rote.”
  • Instructor dashboard: timeline view of MWL with scenario events to support targeted feedback and standardized assessment.
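The first two actions amount to a feedback rule: raise scenario difficulty when measured workload falls below a target band, lower it when workload rises above. A minimal proportional sketch, in which the target value, gain, and 0-1 scales are all assumed for illustration:

```python
def next_difficulty(current: float, mwl: float,
                    target: float = 0.6, gain: float = 0.5) -> float:
    """Proportional difficulty update for a training scenario.
    If measured workload is below the target, difficulty goes up;
    if above, it goes down. target/gain values are hypothetical."""
    updated = current + gain * (target - mwl)
    return min(max(updated, 0.0), 1.0)  # clamp to the assumed [0, 1] scale
```

For example, with a target of 0.6, a trainee measured at 0.8 gets an easier next scenario, while one measured at 0.3 gets a harder one.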

2) Operator Monitoring

The challenge: In live operations, performance can drift before it is visible in incidents, defects, or near-misses. Operators may be overloaded during peak demand (multi-tasking, alarms, time pressure) or underloaded during monotony (long monitoring periods) [8, 14, 15, 16, 17, 18]. Self-assessment is unreliable, and traditional monitoring relies on lagging indicators and supervisor observation.

Teleoperation example (mobility/remote operations): A teleoperator can become underloaded during long, uneventful monitoring (reduced vigilance, slower hazard detection) and overloaded when multiple feeds, communications, and rapid maneuvers converge (cognitive tunneling, delayed decisions, higher error risk) [19].

Manufacturing & manual assembly example: Repetitive stations can drive low MWL (mind-wandering, missed anomalies), while variant-heavy, high-precision assembly and changeovers can create high MWL (sequence errors, wrong-part selection, rework, and safety risk) [20, 21, 22].

Improvement potential with InnoBrain (real-time MWL tracker):

  • Early warning for cognitive risk by detecting sustained overload/underload before failure modes manifest.
  • Workload-aware decision support that prioritizes information and interventions based on current operator capacity.
  • Operational optimization via aggregated MWL trends by shift/station/task to identify systemic hotspots and training needs.

Operational actions enabled (examples):

  • If MWL is high: simplify UI/HMI, mute non-essential alerts, prioritize only safety-critical cues; trigger structured support escalation (peer assist, micro-break, task deferral where permitted).
  • If MWL is low: introduce vigilance prompts and active check-ins, adjust task rotation or engagement routines, and flag extended underload periods for supervisory awareness.
  • Supervisor dashboard: MWL timelines and heatmaps mapped to events (e.g., remote interventions, changeovers) to guide staffing, SOP tuning, and interface/process redesign.
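An early-warning rule of this kind needs to distinguish sustained drift from momentary spikes. One simple approach is a sliding-window average over the recent MWL samples; the window length and thresholds below are illustrative assumptions:

```python
from collections import deque

class SustainedLoadDetector:
    """Flag sustained over/underload: fire only when the mean of the
    last `window` MWL samples crosses a threshold, so brief spikes
    are ignored. Window size and thresholds are hypothetical."""

    def __init__(self, window: int = 30, low: float = 0.25, high: float = 0.75):
        self.samples = deque(maxlen=window)
        self.window = window
        self.low, self.high = low, high

    def update(self, mwl: float) -> str:
        self.samples.append(mwl)
        if len(self.samples) < self.window:
            return "warming_up"          # not enough history yet
        mean = sum(self.samples) / len(self.samples)
        if mean > self.high:
            return "sustained_overload"  # trigger support escalation
        if mean < self.low:
            return "sustained_underload" # trigger vigilance prompts
        return "normal"
```

Feeding the detector one sample per cycle yields a state label that a supervisor dashboard could map onto shift events.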

3) Gaming (Live Gameplay, Competitive Play, and Training Modes)

The challenge: Games routinely create rapid swings between MWL overload (high-intensity fights, multitasking, time pressure, information density) and MWL underload (downtime, repetitive loops, long sessions, “autopilot” play). The risk is two-fold: overload leads to avoidable errors and frustration; underload leads to disengagement, sloppy play, and churn. Traditional telemetry (K/D, accuracy, APM, time-on-task, quits) is largely lagging and cannot reliably distinguish “low skill” from “low capacity” in the moment.

Improvement potential with InnoBrain (real-time MWL tracker):

  • Earlier detection of frustration and disengagement drivers than outcome metrics alone.
  • Better tuning of difficulty and pacing using an objective “cognitive cost” signal, not just performance.
  • Player-specific personalization (accessibility and skill development) without making the experience feel arbitrary.

Operational actions enabled (examples):

  • If MWL is high: reduce non-critical HUD clutter, lower notification density, simplify decision surfaces (e.g., fewer simultaneous prompts), smooth pacing (slightly longer recovery windows, clearer objective cues), and tailor coaching tips to essentials.
  • If MWL is low: increase engagement through controlled challenge injection (micro-objectives, slightly higher enemy pressure, more varied tasks), richer feedback loops, or optional “skill drills” to keep players in a productive arousal zone.
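The “reduce non-critical HUD clutter” action can be sketched as a priority gate on notifications. The notification schema, priority labels, and threshold here are hypothetical:

```python
def filter_hud(notifications: list, mwl: float,
               threshold: float = 0.7) -> list:
    """Under high estimated workload, pass only critical notifications;
    otherwise pass everything. The 0.7 threshold and the 'priority'
    field are illustrative assumptions."""
    if mwl > threshold:
        return [n for n in notifications if n["priority"] == "critical"]
    return list(notifications)
```

A relaxed player sees the full HUD; under overload, only safety- or objective-critical cues remain visible.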

For esports/training: MWL timelines aligned with match events to separate “strategy issues” from overload, enabling targeted coaching and practice design.

Why We Can’t Just “Ask”

Traditionally, researchers have relied on three main methods: subjective questionnaires (like NASA-TLX), monitoring behavioral performance, and using secondary tasks [2].

However, relying solely on these traditional methods presents significant challenges. Here is why subjective reports and behavioral checks aren’t enough, and why physiological data is the future.

  1. The Disconnect Between Perception and Reality

A major limitation of questionnaires is that they rely on human perception. Research indicates that subjective reports can be misleading. For instance, individuals may report low workload on a questionnaire, yet their physiological data, such as brain activity and heart rate, reveal high levels of workload [23]. This “underestimation” is particularly common during passive tasks, such as monitoring an automated system rather than actively controlling it.

  2. Performance Monitoring is Too Late

Monitoring behavioral performance (e.g., checking for errors) is an objective way to measure workload, but it suffers from being a “lagging indicator.” By the time someone makes a mistake or their performance drops, they are already mentally overloaded. In high-stakes environments, waiting for an error is not a viable strategy. Attempting to solve this by adding a “secondary task” to test mental capacity often creates a new problem: the assessment tool itself becomes a source of distraction [2], potentially altering the user’s natural performance.

The Future of Measuring Mental Workload

To overcome current limitations in estimating the MWL, we need methods that are objective, non-intrusive, and immediate. This is where physiological biomarkers come in. 

Among the various physiological measurements, EEG stands out. Unlike other metrics, EEG can provide time-resolved correlates of workload that are difficult to capture with self-report alone. It allows for second-by-second measurement, enabling us to not only detect when workload spikes but also assess the underlying cognitive sources of that strain [2].
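As a concrete illustration, one widely studied family of EEG workload correlates compares theta-band power (which tends to rise with workload, especially frontally) to alpha-band power (which tends to fall). The single-channel sketch below is a deliberately simplified assumption; real systems use multiple channels, artifact rejection, and calibrated classifiers:

```python
import numpy as np

def band_power(signal: np.ndarray, fs: float,
               f_lo: float, f_hi: float) -> float:
    """Mean power in [f_lo, f_hi) Hz via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return float(psd[mask].mean())

def workload_index(eeg: np.ndarray, fs: float = 256.0) -> float:
    """Theta/alpha power ratio on a single channel, using the
    conventional 4-8 Hz (theta) and 8-12 Hz (alpha) band edges.
    A rising ratio is a commonly reported workload correlate;
    this toy version omits all preprocessing."""
    theta = band_power(eeg, fs, 4.0, 8.0)
    alpha = band_power(eeg, fs, 8.0, 12.0)
    return theta / alpha
```

Computed second by second on sliding windows, such an index gives the time-resolved signal the text describes, though a deployed system would validate it per user and per task.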

As automation grows and tasks become more cognitively demanding, mental workload is no longer optional to understand. Modern approaches, combining behavior, context, and physiological signals, make it increasingly feasible to quantify cognitive effort in real time.

When organizations can translate invisible cognitive processes into measurable signals, they can better:

  • design safer and more efficient systems,
  • reduce disengagement and cognitive fatigue, and
  • build environments that support sustainable human performance.

Balancing mental workload is not only about productivity. It is a safety, well-being, and systems-design imperative in the modern workplace.

References

  1. Eggemeier, F. T., Wilson, G. F., Kramer, A. F., & Damos, D. L. (1991). Workload assessment in multi-task environments. In Multiple task performance (pp. 207-216). CRC Press.
  2. Meshkati, N., Hancock, P. A., Rahimi, M., & Dawes, S. M. (1995). Techniques in mental workload assessment.
  3. Osia, A., Tahamtan, Z., Zhao, L., Davari, M., & Nybacka, M. (2025). A Real-time Unconstrained EEG-Classifier for Mental Workload Monitoring. Presented at the DSC 2025 Europe – Driving Simulation Conference Europe 2025, Stuttgart, Germany, Sep 24-26.
  4. Liu, D., Peterson, T., Vincenzi, D., & Doherty, S. (2016). Effect of time pressure and target uncertainty on human operator performance and workload for autonomous unmanned aerial system. International Journal of Industrial Ergonomics, 51, 52-58.
  5. Salvucci, D. D., & Bogunovich, P. (2010, April). Multitasking and monotasking: The effects of mental workload on deferred task interruptions. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 85-88).
  6. Haufler, A. J., Spalding, T. W., Santa Maria, D. L., & Hatfield, B. D. (2000). Neuro-cognitive activity during a self-paced visuospatial task: comparative EEG profiles in marksmen and novice shooters. Biological psychology, 53(2-3), 131-160.
  7. Gaillard, A. W. (1993). Comparing the concepts of mental load and stress. Ergonomics, 36(9), 991-1005.
  8. Aricò, P., Borghini, G., Di Flumeri, G., Colosimo, A., Bonelli, S., Golfetti, A., … & Babiloni, F. (2016). Adaptive automation triggered by EEG-based mental workload index: a passive brain-computer interface application in a realistic air traffic control environment. Frontiers in human neuroscience, 10, 539.
  9. Schneider, T., Weyhe, D., Schlender, M., Cetin, T., Tabriz, N., & Uslar, V. N. (2025). Evaluating the efficiency and ergonomics of a novel smart surgical lighting system: a passive oddball experiment with EEG measurements to assess workplace strain in clinical settings. Frontiers in Medical Technology, 7, 1584606.
  10. Di Flumeri, G., De Crescenzio, F., Berberian, B., Ohneiser, O., Kramer, J., Aricò, P., … & Piastra, S. (2019). Brain–computer interface-based adaptive automation to prevent out-of-the-loop phenomenon in air traffic controllers dealing with highly automated systems. Frontiers in human neuroscience, 13, 296.
  11. Luong, T., Argelaguet, F., Martin, N., & Lécuyer, A. (2020, March). Introducing mental workload assessment for the design of virtual reality training scenarios. In 2020 IEEE conference on virtual reality and 3D user interfaces (VR) (pp. 662-671). IEEE.
  12. Dahlstrom, N., & Nahlinder, S. (2009). Mental workload in aircraft and simulator during basic civil aviation training. The International journal of aviation psychology, 19(4), 309-325.
  13. Felton, E. A., Williams, J. C., Vanderheiden, G. C., & Radwin, R. G. (2012). Mental workload during brain–computer interface training. Ergonomics, 55(5), 526-537.
  14. Kaber, D. B., Onal, E., & Endsley, M. R. (2000). Design of automation for telerobots and the effect on performance, operator situation awareness, and subjective workload. Human factors and ergonomics in manufacturing & service industries, 10(4), 409-430.
  15. Di Flumeri, G., De Crescenzio, F., Berberian, B., Ohneiser, O., Kramer, J., Aricò, P., … & Piastra, S. (2019). Brain–computer interface-based adaptive automation to prevent out-of-the-loop phenomenon in air traffic controllers dealing with highly automated systems. Frontiers in Human Neuroscience, 13, 296.
  16. Di Flumeri, G., Borghini, G., Aricò, P., Sciaraffa, N., Lanzi, P., Pozzi, S., … & Babiloni, F. (2018). EEG-based mental workload neurometric to evaluate the impact of different traffic and road conditions in real driving settings. Frontiers in human neuroscience, 12, 509.
  17. Borghini, G., Astolfi, L., Vecchiato, G., Mattia, D., & Babiloni, F. (2014). Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev., 44, 58–75. doi: 10.1016/j.neubiorev.2012.10.00.
  18. Wilson, G. F. (2002). An analysis of mental workload in pilots during flight using multiple psychophysiological measures. The International Journal of Aviation Psychology, 12(1), 3-18.
  19. Zhang, T., Zhang, X., Lu, Z., Zhang, Y., Jiang, Z., & Zhang, Y. (2022). Feasibility study of personalized speed adaptation method based on mental state for teleoperated robots. Frontiers in Neuroscience, 16, 976437.
  20. Ma, Q. G., Shang, Q., Fu, H. J., & Chen, F. Z. (2012). Mental workload analysis during the production process: EEG and GSR activity. Applied Mechanics and Materials, 220, 193-197.
  21. Berlin, C., Bergman, M. W., Chafi, M. B., Falck, A. C., & Örtengren, R. (2021, May). A systemic overview of factors affecting the cognitive performance of industrial manual assembly workers. In Congress of the International Ergonomics Association (pp. 371-381). Cham: Springer International Publishing.
  22. Puspawardhani, E. H., Suryoputro, M. R., Sari, A. D., Kurnia, R. D., & Purnomo, H. (2016, July). Mental workload analysis using NASA-TLX method between various levels of work in the plastic injection division of a manufacturing company. In Advances in Safety Management and Human Factors: Proceedings of the AHFE 2016 International Conference on Safety Management and Human Factors, July 27-31, 2016, Walt Disney World®, Florida, USA (pp. 311-319). Cham: Springer International Publishing.
  23. Hancock, P. A., & Matthews, G. (2019). Workload and performance: Associations, insensitivities, and dissociations. Human factors, 61(3), 374-392.