How DEI became the latest battleground in the right’s ‘war on woke’
As the term “diversity, equity and inclusion” (DEI) is weaponised by right-wing politicians and pundits, American businesses are beginning to back away from their commitments to provide inclusive working environments.
The origins of DEI as a workplace concept date back to the 1960s, amid a cultural and societal shift towards a more equitable society. The Civil Rights Act of 1964 was the first federal law in the US to prohibit discrimination on the grounds of race, colour, sex or national origin, making it illegal for employers to discriminate in hiring, firing and the terms and conditions of employment.
It’s worth noting that federal protection for LGBTQ+ employees came later, through executive orders and court rulings: an executive order barring sexual-orientation discrimination in the federal civilian workforce arrived in the late 1990s, and protection for private-sector workers came only with the Supreme Court’s 2020 decision in Bostock v. Clayton County.
The cultural shift brought on by