Bad things happen to everyone; that, people say, is the only certain truth. So my question is: why does it keep getting worse when I'm trying my best to move forward, to work on myself, to finally get past all the traumas and the self-loathing, and simply learn to be myself and respect who I am, body and soul? I've learned to enjoy myself and to see my own value, and according to everyone, things are supposed to get better from that point on.

Well, I'm starting to think that some of us are just born to be extras in this world; some people just don't get a happy ending, even when they try hard for one. I guess life isn't always fair, and hope doesn't always make things better. Sometimes it just makes the fall harder.
What is the point?