You don't speak for everyone, anon.
I've been depressed more times than I can count. Extremely depressed. When I get told to smile, I don't feel attacked. I feel they're genuinely trying to help.
Also, being told to smile is probably good advice, since smiling has been shown to release dopamine and serotonin, which are linked to reduced stress and increased happiness. Ironically, even when you're down in the dumps and don't feel like smiling, smiling will still improve your mood at least a little bit, and it may even give your immune system a boost.
So, smiling even if you don't feel happy actually can turn your mood around. I don't know why people think it's linked to 'muh soggy knees' when scientifically it just makes you feel better.
Telling people to smile is literally saying, "I want you to feel better and have less stress." How that counts as hating someone, I can't even begin to fathom.