There has always been debate about whether television is a good or bad influence on people. Does it make us more violent? Or does it shed light on subjects we might otherwise ignore?
Well, the debate will go on as long as there is a television to watch. I am on the side that believes these things are better talked about.
Of course, most of the time we watch fictionalized versions of the issues we are living through. Violence against zombies, or zombie violence, is probably not a real social issue; but The Walking Dead still tackles plenty of social issues in the show.
In other cases, the issues are portrayed much more blatantly, as was the case with Breaking Bad, in which the drug providers were the “heroes” of the show. Perhaps some people learned more about drug addiction by watching the show than by getting information from government sources or nonprofit organizations.
Even so, we are getting a fictionalized version. Many TV shows that tackle a critical issue like this one, or alcohol abuse, or anorexia, or depression, or a long list of etceteras, usually include a link to a nonprofit organization where viewers can find more information. I really recommend following those links. (In this article I added one such link.) TV shows for kids and teenagers also delve into hard matters; a show like Degrassi comes to mind instantly.
I believe that joining the conversation is always good, especially when you provide the tools for gaining more knowledge. But I can understand those who prefer to avoid the subject. It's not easy at all. And oftentimes, the mere idea of having to sit down and talk about it is enough to make people cringe.
Let me know your thoughts.