    ‘AI is just amplifying that weakness’: The dangers of having AI draft difficult conversations for you


“No more reading emails, OK?” says tech founder and content creator Jason Yeager’s satirical boss character MyTechCeo in a recent TikTok skit. “I want your AI reading my AI-generated email—and answering my email.”

    It’s a parody, but only just. 

AI emails are proliferating across industries. In October, LinkedIn’s CEO Ryan Roslansky said he uses AI for almost every “super high-stakes” email he sends. And a recent survey from the email verification software company ZeroBounce found that one in four respondents admits to using AI daily for drafting or editing their own emails.

On Reddit, employees swap stories about bosses who use AI “to answer every email at work and thinks no one notices” or who “only communicate through AI-generated emails and it’s giving me anxiety.” For many, the path of least resistance is to respond in kind: plug the message into a chatbot, tweak what comes out, and send it back.

But when you receive a message that was likely written by AI, especially in the midst of a disagreement, you can usually tell: something is off.

    It sounds a little too well drafted. The tone is reasonable and balanced. And while the problems are addressed, there’s something missing: the voice of the person you’re communicating with. (A dead giveaway, of course, is when the prompt is left in.)

    Emails may sound smoother this way, but experts worry that outsourcing difficult conversations also bypasses the relationship-building that makes workplaces function. When you ask a chatbot to rewrite your message to be more “concise” or “professional,” it can also strip away the emotional substance of the exchange—an act that may be shaping the future of work for the worse, incubating a generation of professionals who can’t talk to one another.

    The great social offloading

    There is some reported benefit to “dry-chatting” with AI—practicing tricky topics with a bot first so you can tackle the issue directly and clearly with someone afterward. Used as rehearsal, AI can be an effective tool in building confidence. 

    But when used as a substitute, it does the opposite. Filling the gap entirely, with one person’s ChatGPT effectively talking to another person’s Claude, can create distance. This runs counter to what companies say they want when bringing colleagues back into the office: creativity, collaboration, and stronger working relationships.

    “When it handles the hard conversation, the human never builds the muscle of doing that,” Leena Rinne, vice president of leadership, business, and coaching at the workplace skills management platform Skillsoft, tells Fast Company. “It’s not just that the interaction risks feeling like AI—because it does—but you’re actually compromising trust with the person.”

Rinne calls this outsourcing of difficult conversations “social offloading.” It’s particularly problematic when leaders resort to it, she says, because it “almost regresses their ability to have the hard conversations.”

    “Now you’re less in the moment and less able to do this thing that leaders need to be able to do,” she says. It’s a problem for everyone involved: The boss isn’t developing the skill of communicating more clearly, and the employee isn’t figuring out how to effectively push back and ask for clarity. 

    Carla Bevins, associate teaching professor of business management communication at Carnegie Mellon University’s Tepper School of Business, tells Fast Company she’s increasingly seeing people rely on AI-generated language in high-stakes moments.

    “In some cases, both parties are doing this, which means the exchange is technically happening, but the relational work is not,” she says. From a business communication perspective, this distinction matters because difficult conversations are about so much more than just clarity or tone. 

    “They are where leaders signal judgment, accountability, and intent in real time,” Bevins says. 

    The temptation makes sense

    The appeal is understandable. Sarah Wittman, an assistant professor of management at George Mason University’s School of Business, tells Fast Company that a lot of people have never been formally trained in how to have difficult conversations or resolve conflict constructively.

    She points to social media and short-form content shrinking attention spans, along with the perfunctory exchanges that are familiar to many workplaces. At the same time, employees are busy and often anxious about getting laid off.

    “We’re on the clock, messaging on Slack or Teams, or in meetings where, in the best of cases, there might be some social chit-chat,” Wittman says. “In this world, it seems logical that people are turning to a tool that can give them quick answers to solve problems that they may not know how to solve.”

    For people navigating power imbalances or tense workplaces, AI can also feel like a way to protect themselves from saying the wrong thing or escalating a conflict.

    Caitlin Collins, an organizational psychologist at the performance management software platform BetterWorks, tells Fast Company this signals that a workplace isn’t providing psychological safety for its workers. “AI is just amplifying that weakness,” she says.

    Over time, the concern is that more and more conflict avoidance will reshape workplace culture for the worse.

    Send the messy draft

Communication skills are especially important to develop early in a career. Those who spent their university years, and even their first few professional years, behind a laptop are in particular need of strengthening this muscle.

In organizations that are flattening and removing middle managers, leaders already have less time to dedicate to mentoring and nurturing early-career employees.

    “When this layer is compressed and AI fills the gap, employees at both levels lose the chance to observe and practice,” Bevins says. 

    Instead, Rinne argues, leaders should set the tone by sending the messy first draft. It’s more honest, and conveys what they really mean.

    “There is an element of authenticity that shows up when I make a mistake—when I flub the conversation,” she says.

“Me going back and saying, ‘Hey, I’m really sorry,’ or ‘I wish I would’ve handled that differently,’ builds trust,” she adds. “It can’t be my AI apologizing for me.”




