The GC imposes a non-trivial overhead, but this is usually OK because hardware is so crazy overpowered nowadays.
And the overhead of GC can be minimized (to near zero) in the common case, namely a variable that is not shared, but only constructed, used, and then destructed, in such a way that the compiler can see the variable's whole life cycle. (Rust's ownership and borrowing rules are built around exactly this case.) In modern C++, that's the use case of unique_ptr.
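A minimal C++ sketch of that common case (the function and buffer size are made up for illustration): the pointer is uniquely owned, its entire life cycle is visible to the compiler, and cleanup is deterministic, with no collector involved.

    #include <memory>

    void process() {
        // buffer is uniquely owned: constructed here, used here,
        // and destroyed deterministically at the closing brace.
        auto buffer = std::make_unique<int[]>(1024);
        buffer[0] = 42;
        // ... use buffer ...
    }   // ~unique_ptr frees the memory; no GC ever runs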
Reference counting is somewhere in the middle. I like this description of it: basically, it's a lightweight GC whose cost is spread across your entire program.
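In C++ terms, that's roughly what shared_ptr does: every copy and destruction pays a small (usually atomic) bookkeeping cost, so the reclamation work is smeared across normal execution instead of happening in GC pauses. A minimal sketch:

    #include <iostream>
    #include <memory>

    int main() {
        auto p = std::make_shared<int>(7);       // ref count == 1
        {
            auto q = p;                          // copy: atomic increment, count == 2
            std::cout << p.use_count() << '\n';  // prints 2
        }                                        // q destroyed: atomic decrement
        std::cout << p.use_count() << '\n';      // prints 1
    }   // p destroyed: count hits 0, the int is freed on the spot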
And sadly, pure reference counting does not work: it leads to memory leaks. Classic example: you create two data structures A and B and hold references to them (in local variables). Now you cross-link them, so A has a reference (pointer) to B, and B a reference to A. Finally, you release your two local references (for example, when the variables go out of scope). A and B will never be destroyed, because each still has a ref count of 1. Even in modern C++ using shared_ptr, this memory leak exists.
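Here is that leak spelled out in modern C++ (the Node type is made up for illustration; the standard workaround, not shown here, is to break the cycle with a non-owning weak_ptr):

    #include <memory>

    struct Node {
        std::shared_ptr<Node> other;  // owning link to the peer
    };

    int main() {
        auto a = std::make_shared<Node>();  // count(A) == 1
        auto b = std::make_shared<Node>();  // count(B) == 1
        a->other = b;                       // count(B) == 2
        b->other = a;                       // count(A) == 2
    }   // a and b go out of scope: each count drops to 1, never to 0,
        // so neither Node's destructor runs -- the memory leaks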
So in my (not at all humble) opinion, there still is no perfect solution to memory management. It's always tradeoffs. The traditional C way (rely on the programmer to be a super-genius who never makes mistakes) works in certain areas, but does not scale to the population at large. Because of C++'s origin as "C with classes", it has never been able to completely close those gaps; but at least in modern C++ we have idioms that are rich enough to prevent memory management problems (stale pointers, leaks) if we apply enough bondage and discipline. It's not the language itself that solves the problem, it is the conventions built on top of it: coding rules, style guides, reviews, unit tests, and autopsies after crashes (valgrind is in that category).

At the other extreme, languages such as Java and Python solve the memory management problem, but that solution comes at a cost. The cost of CPU cycles spent on GC is minor; the real cost is forcing the programmer into a certain way of doing things. And then there are various newer languages, such as Go, Rust, Kotlin, and Carbon, none of which has yet stood the test of writing large systems programs.
If our industry were sane, we'd trade off these factors whenever considering a new project or platform. Sadly it is not, and what we have is a silver-bullet culture.
Actually, I contend that our industry is very, very sane. Whenever a big (commercial, not hobbyist) project starts, it evaluates what tools to use. And it picks the optimal one, looking at tradeoffs: What is our biggest bottleneck? Programmer skill? Wall-clock time of development? CPU cycles? Reliability once deployed? A language designed for readable, long-term maintainable, and enhanceable code? Different projects find different answers, because their tradeoffs are so radically different. Sometimes they make a mistake and settle on the wrong answer; c'est la vie. Sometimes the answer they find is painful, for example: "we'd really like to code in Y because it will save CPU power in the long run, but we need to use X because we don't have enough manpower to rewrite existing libraries that are in X".
My personal answer has been (for the last ~5 years) to do MOST of my coding in Python and SQL, using C++ only where necessary for performance and to interoperate with existing code. Your mileage WILL vary.
Every so often there's a shiny new thing that is going to solve every problem for ever and ever amen. Rust is just the latest silver bullet.
And now we're switching from good engineering practice to psychology, sociology, and mass hysteria. While in my experience good (commercial) engineering practice looks at all available tools (which explicitly includes Rust as an option), the internet-based culture chases the "shiny object" and often prematurely declares it to be the silver bullet. We had that happen with Java about 25 years ago (and I spent 5 years of my life promoting and coding in Java): it is a very fine language, and a good engineering solution to many problems. Alas, it was over-hyped as the "solution to everything", and it couldn't satisfy those exaggerated expectations. Then the frustration set in. And the language and infrastructure both ossified and accumulated more and more (too many?) features, and today Java is used less than it should be.