Personality and Initiative in AI Teammates: Effects on Human Collaboration
Abstract
Beyond technical capability, people care about how an AI behaves in team settings. Prior research has emphasized performance, reliability, and ease of use, with comparatively little attention to social traits such as conversational tone or willingness to take initiative. This work examines two such traits: whether the AI communicates in a formal or casual style, and whether it waits its turn or proactively joins the conversation. A study with 60 participants tests how these traits, alone and in combination, influence perceived rapport, comfort, interaction fluency, and sense of shared presence. Using a structured design that assigns groups by trait combination, the study aims to reveal which pairing makes collaboration feel most natural. The findings could inform the design of AI teammates that adapt their behavior to social cues and integrate more smoothly into joint tasks.