Are your bosses / managers acting more aggressive lately?
Wondering if it's just me and my own legitimate fuck-ups at work, or if this is part of a broader trend. I've been at my current blue collar job for almost two years. For the first few months I was just training and not touching anything, but I've been on my own for about a year and a half. Starting maybe six months ago, my managers have called me into meeting after meeting, usually every few weeks, to talk about how I suck and how they're going to fire me. In their defense, I was legitimately fucking up. And before you tell me to unionize: I already talked about it with my coworkers (white males on the older side) and they aren't interested. I'm on my own 99% of the time I'm out there, so I barely talk with them anyway.
One manager called me recently to thank me for my hard work. Two meetings ago, he basically said I had nothing to worry about with regard to one of my recent fuck-ups; then yesterday he and another manager held yet another meeting where they threatened to fire me over the same fuck-up. We also have regular safety meetings with my coworkers. A few safety meetings ago, the managers gave us a list of items we all needed to have in our work vehicles; one meeting later, they told us we had too many items in our work vehicles. They're just kind of all over the place, and I'm wondering: is this because they're under pressure from their superiors, because of market trends, or because they feel emboldened by Trump? What do you think?