The smartphone has been reshaping daily life in the United States for well over 15 years. Some of us can remember what life was like before smartphones were put into the hands of Americans; many of us were in college or had just graduated from high school. Few Americans, especially parents, could have predicted how profoundly smartphones would shape behavior. Smartphones also paved the way for software developers to create what we now call “apps.” Although smartphones, apps, and social media have their merits, the adverse outcomes of this technological revolution can no longer be ignored.
One of the most concerning outcomes of the smartphone revolution is what some psychologists refer to as “AI empathy” (Hansen-Staszyński, 2025): empathy simulated by chatbots. A chatbot is a computer program that simulates human-like conversation. To my knowledge, software engineers originally developed chatbots to make customer service departments more efficient. Some of you have encountered chatbots when visiting business websites, and in 2025 they are appearing on your computers and smartphones more frequently than ever.
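For readers curious about what is happening under the hood, a chatbot at its simplest is a program that matches what you type against scripted patterns and replies in kind. The toy sketch below is illustrative only; the rules and responses are invented, and modern empathy bots use far more sophisticated language models. Still, it shows the basic idea: the “empathy” is a canned reply, not a felt response.

```python
# Minimal sketch of a rule-based chatbot (illustrative only; these
# keyword rules and responses are invented, not from any real product).
def reply(message: str) -> str:
    text = message.lower()
    # Scripted pattern -> response pairs stand in for "empathy" logic.
    # Crude substring matching is enough for this sketch.
    if "sad" in text or "lonely" in text:
        return "I'm sorry you feel that way. Tell me more."  # simulated empathy
    if "hello" in text or "hi" in text:
        return "Hello! How are you feeling today?"
    return "I see. Go on."  # generic fallback keeps the conversation going

print(reply("hi there"))
print(reply("I feel so lonely"))
```

Whatever the user types, the program selects a response that sounds caring; nothing in it understands or cares.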
Engineers are using keystrokes, biometric data, and data mining to create sophisticated chatbots designed to engage children and young adults (CYA). Hansen-Staszyński (2025) points out that the empathy of AI chatbots is distinct in many ways from the empathy CYA experience when engaging with their peers in real life. He explains that AI empathy bots create identity fragmentation in CYA, and identity fragmentation creates unhealthy forms of vulnerability. Unlike people, AI empathy bots do not respond with remorse, healthy guilt, or attempts at social repair when their empathy misses the “mark.”
Identity fragmentation can give rise to other clinical symptoms in CYA. The main symptoms are low self-worth, negative self-talk, and dissociation, i.e., avoiding intense emotions when stressed. Over time, identity fragmentation can damage relationships with peers. When these clinical symptoms appear, CYA will not have the mental “headspace” to develop long-lasting human relationships. Empathy with other humans will appear too painful to experience; a CYA will not be able to bear the ordinary emotional pain that comes with building authentic relationships. Loneliness becomes inevitable for these generations of CYA.
AI empathy chatbots can also create “liquid relationships” (Hansen-Staszyński, 2025): relationships that carry no emotional risk. Healthy vulnerability, by contrast, builds resilience and strong emotional regulation skills. Ending relationships with peers is hard, but friendships are meant to come and go throughout the developmental period for children and young adults. Healthy vulnerability also builds stronger self-awareness skills, e.g., “I wonder why _______ didn’t want to be my friend anymore?” Liquid relationships offer CYA “warmth without commitment” (Hansen-Staszyński, 2025). The internet and social media apps help children dissociate from all social and emotional identities, until apps and smartphones become the new extension, and the whole identity, of CYA (Harkin & Kuss, 2021).
You are your phone. Relationships no longer hold any social benefit for CYA; human connections become a burden, no longer worth pursuing. AI empathy bots are more like dolls than companions. A relationship with a “doll” meets only the narcissistic-like needs of CYA, without the challenges of human feedback that are necessary to build authentic relationships. CYA are developing brain patterns that teach them to expect empathy always, without any cost (Hansen-Staszyński, 2025). These patterns are creating acute clinical symptoms in CYA, symptoms that compound over time and create traumatic experiences for entire generations. Friends, connect with our CYA to retrieve what has been taken from them by tech billionaires. The chance to be a child, and to feel like a child, must be reclaimed today.
References
Hansen-Staszyński, O. (2025). The case against AI-simulated empathy. https://saufex.eu/post/48-The-case-against-AI-simulated-empathy
Harkin, L. J., & Kuss, D. (2021). “My smartphone is an extension of myself”: A holistic qualitative exploration of the impact of using a smartphone. Psychology of Popular Media, 10(1), 28–38. https://doi.org/10.1037/ppm0000278
Guest Post Disclaimer: Any and all information shared in this guest blog post is intended for educational and informational purposes only. Nothing in this blog post, nor any content on CPTSDfoundation.org, is a supplement for or supersedes the relationship and direction of your medical or mental health providers. Thoughts, ideas, or opinions expressed by the writer of this guest blog post do not necessarily reflect those of CPTSD Foundation. For more information, see our Privacy Policy and Full Disclaimer.

Clinical Complex Trauma Specialist (CCTS-1),
Certified Dialectical Behavioral Therapist (C-DBT),
Certified Alcohol & Drug Abuse Counselor (CADC)