1: Women have babies. It's what they are supposed to do.
2: A far better question is "Why do female celebrities have to ruin themselves by having pointless dreadful plastic surgery?"
3: The world turns and people come and go. Enjoy them while they are here.
Agreed. And why do they so often disfigure themselves with tattoos? I've never seen one woman who looks better with them.
Fact is, women are good to look at in order to attract a mate and further the species.