The power of marketing in the 21st century is astounding. Since the first teeth whitening treatments appeared in the early '90s, society has come to accept white teeth as an indicator of good oral health and wellness. While there's nothing wrong with wanting a brighter smile, it shouldn't come at the expense of everything else.
If you want to learn the truth behind dental health and teeth whitening, continue reading. Not everything is as it seems.