Performance

Decision makers frequently operate in environments without perfect information, where both the distribution of missing information and the way that information is estimated significantly affect decision accuracy and speed. This work presents an experiment that manipulates a decision environment's missing information along three dimensions (total information, option imbalance, cue balance) and examines how users estimate the missing information, in order to understand how accuracy and decision speed respond under time pressure. Results indicate that, regardless of how missing information is estimated, certain distributions of missing information reduce decision accuracy. Results also indicate that, beyond information distribution and estimation strategy, differences in the decision strategies participants adopt may explain significant differences in decision performance: high performers tend to ignore a greater percentage of missing information rather than attempt to estimate it, adopting a more heuristic strategy.
Recently, research in academia, industry, and government has shifted toward developing AI and machine learning tools that advise human decision-making in complex, dynamic problems. Within this collaborative environment, humans alone carry the burden of managing team strategy, because the AI-agent relies on an unrealistic model of the human-agent's decision-making process. This work investigates an unsupervised machine learning method that enables AI systems to differentiate between human decision-making strategies, supporting improved team collaboration and decision support. An interactive experiment is designed in which human-agents face a complex decision-making environment (a storm tracking interface) whose visual data sources change over time. Behavioral data from each human-agent is collected, and a k-means clustering algorithm is used to identify individual decision strategies. This approach yields evidence of three distinct decision strategies that achieved similar levels of task performance. One cluster took a more analytic approach, spending more time observing and interacting with each data source, while the other two clusters adopted more heuristic decision-making strategies. These findings suggest that if AI-based decision support systems used this approach to distinguish between human-agents' decision strategies in real time, the AI could develop an improved "awareness" of team strategy, enabling better collaboration with human teammates.
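The clustering step described above can be sketched in a minimal form. The feature names and values below are hypothetical stand-ins for the behavioral data (e.g., dwell time on each data source, interaction rate), and the k-means implementation is a bare-bones illustration rather than the authors' actual pipeline:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal k-means: assign each point to its nearest centroid,
    then recompute centroids as cluster means, for a fixed number of iterations."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Index of the centroid with the smallest squared Euclidean distance.
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[i].append(p)
        # Recompute each centroid; keep the old one if its cluster is empty.
        centroids = [
            tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical per-participant behavioral features:
# (mean dwell time per data source in seconds, interactions per minute)
behavior = [(9.5, 4.0), (10.2, 3.6), (9.8, 4.2),    # long dwell, analytic
            (2.1, 11.0), (1.8, 12.5), (2.4, 10.2),  # rapid switching, heuristic
            (3.0, 2.0), (2.6, 1.7), (3.3, 2.3)]     # sparse interaction, heuristic

centroids, clusters = kmeans(behavior, k=3)
print([len(c) for c in clusters])
```

In practice one would standardize the features and use an established implementation (e.g., scikit-learn's `KMeans`), choosing k via a criterion such as silhouette score rather than fixing it at three in advance.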