I am ashamed of my job. I went to a "higher" school (it's called a Gymnasium in my country; you attend for two more years than those who "only" go to middle school, and you get a more valuable degree that lets you go to university). All my former classmates went to uni and study some awesome things. It's generally seen as a "must" to go to uni if you went to the Gymnasium; getting a regular job is just for the "dumb" people who went to regular school. I know this sounds harsh, but that's really how people see it.

Now, I tried uni but failed miserably, and I figured I'd rather take a regular job that pays less than waste years of my life at uni, where I possibly wouldn't even get a degree. It's a really simple office job, but I like it a lot. I'm not smart, and there I feel like I'm good at what I'm doing, so I'm happy about it.

But whenever I meet someone new or catch up with old classmates, and they tell me how they're becoming doctors and lawyers and then ask what I'm doing, I'm so ashamed to tell them. They always look at me strangely, ask "What? Why?", and try to convince me that it's a stupid choice and that I should go to uni. At this point I'm avoiding meeting people altogether because I hate having these conversations.