The Importance of Liberal Social Justice
Liberal social justice has long been a key part of American society, using the tools of liberalism to challenge unjust laws and promote equality. While "wokeness" and social justice draw criticism, it is important to recognize that these ideas stem from the proud tradition of liberal social justice in the US.