This article will go beyond the borders of the TWD universe, so if you're not interested in anything like that, consider yourself warned ;)
When I first watched TWD I was amazed. Not just because of the actors' performances and such, but more importantly because I saw a glimpse of what I would want to achieve in life: reuniting the individuals we all are in a natural setting. And by that I mean a "natural" life. You might say: "Natural life? Isn't everything we experience natural in a way? Is this gonna be some philosophical BS from a guy who doesn't know anything?" To be honest: that may be the case. But then again: who on this planet really knows anything?
Anyway, here's my point: the setting of this series made me aware of the dangers that lie ahead of humanity. I basically think we all have a lifestyle that will lead us to extinction, or in the best case to a scenario like the one the characters in TWD have to deal with: only a few humans left, scarce resources due to environmental damage, violence ruling the land. I bet you all know what I'm talking about: don't you sometimes get the feeling, when watching the news or simply walking down the street and looking at your surroundings, that something is really wrong?
We have evolved immensely over the past few hundred years. Or so we think. The price we paid, I think, is that we no longer even know what to live for. From my point of view, many values of social interaction have been destroyed, but even more alarming is the fact that a general sense of responsibility has been lost among the majority of people on this planet. Our generation might not face the consequences of our actions, but the ones after us certainly will. And once natural resources are close to depleted, there will be either complete destruction or a general state of mind resembling the one zombies tend to have.
All I'm saying is: we can take responsibility for our lives NOW or simply wait for this planet to do it for us.
Call me depress(ed/ive) and/or out of touch with the real world, but whenever I think about the current state of this "society" it makes me kind of sad. The Walking Dead surprisingly gave me hope, and I'm now more willing than ever to CHANGE something (for real, not like Obama ;D) for good, to make this life an experience that gives us all we truly desire again.
Please share your thoughts on this with me, and maybe tell me where this "idea" could go if done right!
With best regards