When Groups Stop Listening
- Written by Mark J. Chironna, PhD.

Arrogance, Hubris, and the Art of Knowing What You Don’t Know
We have all been in that room. The one where the air itself seems arranged around a particular person or a particular circle, and where an unspoken message is transmitted with perfect clarity: outsiders are tolerated here, but they are not truly welcome. You might not be able to point to a single thing that was said. You might leave uncertain about what exactly happened. But you know what you felt.
What you encountered was the social architecture of arrogance. And if the community you walked into has been at it long enough, you may have encountered something more serious: hubris.
These two words are often used interchangeably. They are not the same thing, and the difference matters considerably.
Aristotle understood arrogance as a vice of excess in self-assessment, a habitual overestimation of one’s own importance or capacity communicated through how one relates to others.1 It is a posture, a way of holding oneself in relation to the people around you that consistently signals you consider yourself above them.
Hubris is something older and darker. The ancient Greeks used the word hybris to describe something that went well beyond personal pride. It was the deliberate humiliation of another for the pleasure of demonstrating superiority, a transgression against proper limits, an offense against right order.2 Here is one way to feel the difference: arrogance is a character flaw. Hubris is arrogance that has stopped acknowledging any authority above itself, including reality.
The neurologist and former cabinet minister David Owen spent years studying what he calls “Hubris Syndrome” in political leaders, arguing that hubris is not simply a personality trait but a condition that certain kinds of power reliably produce.3 Remove accountability, add unchecked authority, let success go unexamined long enough, and something shifts. The person, or the community, stops being correctable. That is the threshold. And once crossed, it is very hard to walk back.
What I find most sobering about all of this is that neither arrogance nor hubris is simply a matter of individual character. They are reproduced socially. Tribes make arrogance. And they do it whether they intend to or not.
The social psychologist Agnieszka Golec de Zavala has spent years researching what she calls “collective narcissism,” the shared belief that one’s group is exceptional and insufficiently appreciated by the wider world.4 Her research reveals something important: collective narcissism is not the same as healthy group pride. It is, at its root, a form of group fragility. The grandiosity and the insecurity are not opposites. They are the same thing wearing different clothes. Groups that perform superiority loudest are often groups most anxious about whether their claims can bear scrutiny.
Dacher Keltner’s research on power deepens this picture.5 Status reduces empathy. Groups that believe they hold special access, special knowledge, or special authority tend to institutionalize that belief through who they let close, who they acknowledge warmly, and who they quietly pass over.
Here is what I find most instructive, though, and it is something the research does not always address directly. We tend to assume the problem lives in the loud, dominant, performance-oriented leader. That person is relatively easy to identify. What is much harder to see is the quiet leader who generates exactly the same tribal pathology through subtler means.
The philosopher René Girard spent his career arguing that human beings are primarily imitative creatures.6 We learn what to want, how to carry ourselves, how to treat others, not through explicit instruction, but by absorbing the models in front of us. A leader who embodies condescension quietly, without ever stating it, installs that condescension in their followers through imitation alone. No announcement is required. The community simply becomes what it has absorbed.
The anthropologist Edward Hall showed that physical and spatial behavior communicates hierarchy far more powerfully than words ever can.7 Who gets close to the center? Who is greeted warmly, and who is merely acknowledged? Who receives the private conversation after the public event? These distributions of proximity and access are the real constitution of a community, regardless of whatever values are printed in its materials.
And when something goes wrong, when the group’s behavior contradicts its stated values, the sociologist Diane Vaughan’s concept of “structural secrecy” describes what tends to follow.8 Inconvenient information simply never surfaces to the people who could act on it. Not through conspiracy. Through the normal operation of a culture that has learned, without anyone needing to say so, what questions are welcome and what questions are not.
The psychologist Leon Festinger called what follows cognitive dissonance reduction.9 Groups, like individuals, are motivated to resolve the gap between what they believe about themselves and what the evidence suggests. When the quiet leader declines to name the contradiction, the group papers over it. Silence at the top grants permission.
So what do we do with any of this?
I want to suggest that the answer is not another posture. It is not the performance of openness or the theater of humility. The philosopher Linda Zagzebski, writing in the tradition of virtue epistemology, argues that genuine epistemic humility is a character disposition, not a technique.10 It begins with an honest reckoning with the conditions under which your own reasoning is most likely to go wrong.
And Robert Roberts and Jay Wood have made the case that epistemic humility without epistemic courage is just passivity.11 The person genuinely committed to knowing well is willing to say both “I do not know” and “here is what I believe, and here is why,” and to hold both without the anxiety that makes premature certainty so tempting.
The philosopher C. Thi Nguyen draws a distinction I find clarifying. An epistemic bubble, he argues, is an environment where you mostly hear what you already believe. An echo chamber is something worse: it is an environment that has inoculated you against the credibility of outside sources in advance.12 A bubble can be burst by new information. An echo chamber has already decided that new information from certain directions cannot be trusted.
That distinction applies everywhere. It applies to political movements, professional guilds, academic communities, online subcultures, and religious traditions of every variety. The temptation to build communities that validate rather than challenge is not the property of any single ideology. It is a structural temptation available to any group that has developed a strong enough sense of its own distinctiveness.
The room I described at the beginning of this piece exists in many forms. The question worth sitting with is not only whether we have been in it. It is whether we have been among the ones who built it.
Notes
1. Aristotle, Nicomachean Ethics, trans. Terence Irwin (Indianapolis: Hackett, 1999), IV.3, 1123b–1125a.
2. Nick Fisher, Hybris: A Study in the Values of Honour and Shame in Ancient Greece (Warminster: Aris and Phillips, 1992); Douglas MacDowell, “Hybris in Athens,” Greece and Rome 23, no. 1 (1976): 14–31.
3. David Owen and Jonathan Davidson, “Hubris Syndrome: An Acquired Personality Disorder?” Brain 132, no. 5 (2009): 1396–1406.
4. Agnieszka Golec de Zavala et al., “Collective Narcissism and Its Social Consequences,” Journal of Personality and Social Psychology 97, no. 6 (2009): 1074–1096.
5. Dacher Keltner, The Power Paradox (New York: Penguin Press, 2016).
6. René Girard, Deceit, Desire, and the Novel, trans. Yvonne Freccero (Baltimore: Johns Hopkins University Press, 1965).
7. Edward T. Hall, The Hidden Dimension (New York: Doubleday, 1966).
8. Diane Vaughan, The Challenger Launch Decision (Chicago: University of Chicago Press, 1996), 238–278.
9. Leon Festinger, A Theory of Cognitive Dissonance (Stanford: Stanford University Press, 1957).
10. Linda Trinkaus Zagzebski, Virtues of the Mind (Cambridge: Cambridge University Press, 1996).
11. Robert C. Roberts and W. Jay Wood, Intellectual Virtues (Oxford: Oxford University Press, 2007), 233–264.
12. C. Thi Nguyen, “Echo Chambers and Epistemic Bubbles,” Episteme 17, no. 2 (2020): 141–161.
Dr. Mark Chironna is a public scholar, executive and personal coach, and thought leader with five decades of experience in leadership development, cultural analysis, and future-focused strategies. With advanced degrees in Psychology, Applied Semiotics and Futures Studies, and Theology, he brings a unique interdisciplinary approach to helping individuals and organizations navigate complexity, unlock potential, and craft innovative solutions.
As a Board Certified Coach with over 30,000 hours of experience, he empowers leaders and teams to thrive through resilience, foresight, and actionable strategies. Passionate about human flourishing, he integrates psychological insight and cultural trends to inspire growth and transformation.
WEBSITE: www.markchironna.com