In the final analysis, you have a microprocessor executing machine code. Any rules that the production process of the target machine's software tried to enforce by using a particular high-level language translator, whether Rust or any other high-level language, can always be subverted if the attacker can deliver his exploit code to the target machine and then kick the CPU into running that exploit at a sufficiently high privilege level. Processor designers have tried to provide various ways of locking down the hardware (e.g., Intel SGX, Intel "Hardware Shield", the AMD/ARM equivalents, and so on), which in turn have been rapidly broken, or are in the process of being cracked, by security researchers and hackers.
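To make that concrete, here is a minimal sketch (my own illustration, assuming Linux on x86-64 and that the system's W^X policy permits an RWX mapping) of the basic fact the whole argument rests on: the CPU executes whatever bytes it is pointed at, with no idea what source language, if any, produced them.

```c
/* Sketch only: hand a few raw machine-code bytes to the CPU and run them.
 * Assumes Linux on x86-64; hardened systems may refuse the RWX mapping. */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 encoding of: mov eax, 42 ; ret */
    unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };

    /* Ask the OS for a page we may both write to and execute. */
    void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) { perror("mmap"); return 1; }

    /* Copy the raw bytes in and jump to them as if they were a function. */
    memcpy(page, code, sizeof code);
    int (*fn)(void) = (int (*)(void))page;
    printf("the CPU happily ran our bytes: %d\n", fn());

    munmap(page, 4096);
    return 0;
}
```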
It's a never-ending arms race. Whether software written in Rust is really any more secure than software written in C is open to question; I suspect that as one category of exploits is closed off, others will be opened. The programming techniques that mitigate well-known exploits in C code, such as buffer overflows, have been known for many years; they only need to be taught and actually used. Will considerably more complex translators like Rust really make it impossible for the programmers companies typically want to hire (fresh graduates, H-1Bs, or offshore) to write code that can be exploited? I doubt it. So I remain skeptical of the claims of the people selling "safe" high-level language translators; it sounds like snake oil to me. It very much remains to be seen whether real software written in Rust is actually any "safer" than software written in C.
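For anyone who hasn't seen the classic case, a minimal sketch is below (the function names and buffer size are my own, purely illustrative): the same copy of attacker-controlled input written with the well-known bug, and then with the decades-old fix of a bounds-checked copy.

```c
/* Sketch of the classic C buffer overflow and its long-known mitigation. */
#include <stdio.h>
#include <string.h>

#define NAME_LEN 16

/* Vulnerable: strcpy() keeps writing past 'name' if the input is longer
 * than 15 characters plus the NUL, smashing whatever lives next to it. */
void greet_unsafe(const char *input) {
    char name[NAME_LEN];
    strcpy(name, input);               /* no bounds check: the bug */
    printf("hello, %s\n", name);
}

/* Mitigated: snprintf() (standard since C99) truncates at the buffer
 * size and always NUL-terminates, so the overflow cannot happen. */
void greet_safe(const char *input) {
    char name[NAME_LEN];
    snprintf(name, sizeof name, "%s", input);
    printf("hello, %s\n", name);
}

int main(void) {
    const char *long_input = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";
    greet_safe(long_input);            /* truncated, but well-defined */
    /* greet_unsafe(long_input) would be undefined behaviour */
    return 0;
}
```

Nothing in the fix needs a new language; it needs the second version to be the one that gets taught, reviewed, and shipped.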
The real reason for many security vulnerabilities in software lies much more in the process of commercial software production than in the choice of any particular language compiler. Who in a programming shop has not heard: "Have you finished coding it yet?" "I'm about halfway through. I've got most of the basic good path I showed the team at yesterday's early demo, but a lot of it is still a prototype; it's not ready yet. I need to add all the error handling." "OK, well, check it in right now anyway; we'll start testing it tomorrow." And then, a few days later, round the corner in the test shop: "How far through the test plan are you?" "I'm halfway through. There's another 20 pages of tests to go, say another two weeks?" "OK, stop testing right now anyway; we're freezing the build this afternoon and shipping tomorrow." Big bonus payments all round for the management team for "meeting the dates".
And then, one year later: "Holy crap!! Major hack of the software, thousands of customer records stolen at the client's site. We've debugged the code and found it's all the fault of the programmer; he didn't write any code to check for error cases. And the testers don't appear to have tested that part either; they must have screwed up the test plan." Yep, "this is the way."