Winter is undoubtedly beautiful, with its snow, mist, and fog, depending on where you live. However, it also takes a toll on us in many ways. Along with your comfort and warmth, your skin is strongly affected by the harsh cold. Here are the ways winter damages your skin and how you can take care of it:
Infographic by WishingUwell.com