- An Xbox executive suggested that laid-off employees use AI for emotional support and career guidance
- The advice sparked a backlash and led the executive to delete his LinkedIn post
- Microsoft has laid off 9,000 employees in recent months while pouring money into AI
Microsoft has been promoting its AI ambitions for the past several years, but for recently laid-off employees, one executive's pitch about the power of AI came with an unsettling chill.
Amid the company's largest round of layoffs in two years, Xbox Game Studios Publishing Executive Producer Matt Turnbull suggested that AI chatbots could help affected people process their grief, craft résumés, and rebuild their confidence.
The gesture was meant to be supportive, but it left many game developers angry.
Turnbull took his perhaps well-meaning but certainly poorly timed message to LinkedIn, sharing AI chatbot prompt ideas that he claimed could help colleagues navigate career uncertainty and emotional turmoil.
The reaction was swift and angry, prompting him to delete the post, but you can still read it below via Brandon Sheffield's Bluesky post.
Xbox Game Studios Publishing Executive Producer Matt Turnbull, after Microsoft's sixth round of layoffs, suggests on LinkedIn that people who have been laid off may turn to AI for help. He seriously thought it would be a good idea to post this.
– @brandon.insertcredit.com (@brandon.insertcredit.com.bsky.social) 2025-07-07T07:54:06.534Z
Turnbull urged colleagues to lean on AI to reduce the "emotional and cognitive load of job loss," alongside prompt ideas for 30-day recovery plans and LinkedIn messages. Perhaps the biggest eyebrow-raiser was a suggested prompt to help reframe impostor syndrome after a layoff.
"No AI tool is a replacement for your voice or your lived experience," Turnbull wrote. "But at a time when mental energy is scarce, these tools can help you get unstuck faster, calmer, and with more clarity."
Even the most charitable interpretation of his position cannot ignore how tone-deaf and poorly timed the advice was. Angry game developers flooded the comments, which likely contributed to the post's removal.
To put it lightly, they did not agree that a layoff is an emotional puzzle to be solved with an algorithm. Losing a career and having one's life upended calls for human empathy, support networks, and concrete help, like an introduction to someone who can help you land a new job.
AI therapy
The incident looks even worse in the context of Microsoft spending billions to build AI infrastructure while dramatically shrinking its gaming teams. Urging developers to lean on AI after they have lost their jobs to it borders on hypocrisy. It amounts to telling people to use the very technology that contributed to their job cuts.
To be fair to Turnbull, AI can help with some mental health concerns and can be useful for polishing a résumé or preparing for a job interview. Making AI part of outplacement services is not a terrible idea in itself; it could augment Microsoft's internal coaching and career-transition offerings, such as recruiter access, résumé workshops, and interview advice. But it cannot and should not replace those human services. And being told by one of the people who let you go to use AI to find a new job is the opposite of helpful. It is simply insult added to injury.
Microsoft's dual approach of laying people off while doubling down on AI infrastructure is as much a test of the company's culture as of its technical ability. Will we see a new standard where layoffs come with AI prompt packs instead of counseling and severance? If the message is "use chatbots freely to help yourself after you're dismissed," expect plenty more provocative, tone-deaf nonsense from executives.
Perhaps they should ask those chatbots how to communicate with humans without angering them, because that is a lesson they have not learned well.