We should stop telling women to do whatever they want when it comes to plastic surgery.

I'm sorry, but I need to rant a bit about this.
In the past, we shamed women for getting plastic surgery to "fix" something, because beauty was supposed to be all natural. That era was wrong too: there was only one type of beauty, which celebrities had achieved through minor or major surgeries that just weren't visible. Take Beyoncé's nose job (the only example I can think of at the moment). Beyoncé is considered a natural beauty but was still criticized and ridiculed for it.

The rise of the Kardashian family and their multiple plastic surgeries brought an age of "plastic surgery/filler/Botox is okay. Do whatever you need to do to make yourself happy." For me, this has personally meant seeing heinous results: 17-year-olds getting too much filler, 25-year-olds looking uncanny or identical to one another, 40-year-olds unable to move a muscle in their face because of Botox. Too many women are getting multiple things done to look like an edited and filtered Kylie Jenner picture.
I'm not here to say we should shame women for getting things done, but I'm tired of the "if it makes you happy, do it." Instead, I feel like we should stop encouraging people to edit their facial features to make themselves happy. I understand some people being sensitive about their nose, lips, chin, breasts, etc. Heck, if someone had asked 18-year-old me if I wanted to get my nose done, it would have been a "fuck yeah" from me. I'm not as insecure about it anymore; sometimes I like it, sometimes I hate it.
Some have the money and can get a little done here and there so that it's unnoticeable, and we say they did it the right way. But that's still encouraging unnatural change.

My point is, I feel like we should put a different narrative on it instead of fueling a whole industry that profits off your insecurities. Have you seen all the plastic surgeons on TikTok talking about or promoting the stuff they do?! We should instead encourage doing things that can make you happy about your looks, or maybe just be honest and say, "Hey, no one is ever going to be completely happy about their looks, and that's fine." Maybe even encourage fewer surgeries and fewer filters that make you look a certain way.

I know this is just a ridiculous post where I'm ranting about something that probably won't change. But it's saddening that we're as far as ever from accepting different kinds of looks and the idea that everyone can be beautiful.