So the movie "The Hunger Games" came out in theaters last night, and apparently it was very good (or so they tell me). People all over the internet and radio are saying the movie was amazing and so true to the book, and that they can't wait for the next installment. Even adults are raving about the books, talking about how beautifully written they are and how inspired they feel reading about the adventures of Katniss (the heroine).
But I have a question. And before some of you start groaning, let me assure you I am asking this in all seriousness and sincerity.
What is the message of these books?
Yes, I have read them, and I found them to have entertainment value, but since everyone seems so taken with them, I was wondering if perhaps I'm missing something deeper that makes them so attractive.
So! If you like, I would appreciate it if y'all would comment and let me know your thoughts. :) I'm not trying to start an argument; I am genuinely curious about the series and why it is so popular, even among grown men and women.
~Grace