Every Christmas thread I have ever read, and every Christmas-related topic I hear about, always seems to come back to gifts.
Why does it matter what you get for Christmas, and why brag about what you got? Christmas has become a holiday for showing off your financial and social status, rather than what it really is: a celebration. Not a celebration of religion (although it has been morphed into one), but a celebration of the last solstice of the year.
The original midwinter celebrations, thousands of years ago, marked the shortest day of the year, when the sun sat lowest in the sky. It symbolized that the days would grow longer, the weather warmer, and that new life was just around the corner. How it got from there to where it is now is a tangle of human history: much early religion grew out of sun worship, and the birth of Christ was conveniently placed on this day.
But anyway, my original point is this.
Why do you think Christmas has become about gifts? And if it isn't about gifts for you, what is important to you at Christmas?