When did people start feeling like they're entitled to so much stuff? When I was growing up, I was taught that if you wanted something, you earned it, either by working for money or just by being helpful. The only times I didn't have to earn what I wanted were birthdays and Christmases. Seriously, when did people start expecting something for nothing?
I know there are people out there who have always felt that way, but it just seems that more and more people in this country expect handouts. Maybe it’s a product of the people who have always felt that way having kids. If they weren’t taught to work for what they want, why would they teach that lesson to their kids?
This generation has become a generation of greedy kids who grow up into greedy adults…and maybe it's my generation's fault for teaching them that it's okay. I don't know. I just hope that MAYBE there are enough of us out there to flip this around. Otherwise, our society as we know it might be screwed.