What are Americans like as a nation? How does American culture affect self-esteem? Americans have a complicated relationship with self-esteem: American culture includes some values that support self-esteem and others that undermine it. According to psychotherapist and self-esteem expert Nathaniel Branden, the harmful values predominate today. Here's how American culture erodes the self-esteem of the nation.
How the USA Culture Impacts Self-Esteem
