I actually specifically don't. There's something almost fatalist about it to me.
I get the same feeling when a story with a transformed human ends with them leaving society forever to join a bunch of mutants like them.
As far as I can tell the -only- thing they have in common is that they were both humans who turned into animal-monster-people, and not to say that that can't be the basis for a relationship, but something about the whole idea of "well, they're both monsters, they HAVE to be together!"
Just strikes this weirdly depressing chord for me. Like, I prefer it when they DO find confidence, rejoin society, and reattain some semblance of normality, instead of embracing what the dicks antagonizing them want them to be: outcasts.
Does that make any sense?