When you work in Corporate America, it takes things from you.
You know that going into it, or at least you should. It’s been the case for quite a while, but any pretense that corporations cared about their employees was stripped away for good in the 1980s and 1990s. Companies changed the names of their personnel departments to “Human Resources.” Layoffs…