Can C++ Be As Safe As Rust?
Written by Harry Fairhead
Wednesday, 10 April 2024
Herb Sutter is a well-known and respected C++ champion and he thinks that the language only needs a few tweaks to make it as safe as Rust. Can this be true?

Herb Sutter knows C++ and is in a position to make things happen. He is chair of the ISO C++ standards committee and hence should be taken seriously when he comments on its present or its future. In a recent blog post he makes the case that C++ isn't fundamentally flawed and that only a few changes are needed to make it as safe as Rust, or as any of the other challengers to its position as the top systems programming language.

The key problem with languages such as C++, which leave memory management entirely in the programmer's hands with no enforced safety, is that they allow out-of-bounds reads and writes, use-after-free and NULL pointer dereferences. Essentially all of these come down to accessing memory that the program doesn't own, and these are the areas where C++ needs improvement.

Sutter argues that the problem is that C++ allows these things to happen by default. That is, it has no safety barrier to stop you from making a mess of memory management. He doesn't want to restrict C++ so that you can't do things that you might want to do, but to make good practice the default. If you want to do things the old way then you should be able to opt out. Memory-safe languages like Rust have restrictions that you have to step outside of to complete some tasks. For example, Rust's ownership rules restrict you to data structures that are essentially tree-like, with no cycles. To write anything more complex you have to lower your standards.

The idea of extending C++ and then restricting it is not new. As Sutter points out:

"Since at least 2014, Bjarne Stroustrup has advocated addressing safety in C++ via a 'subset of a superset': That is, first 'superset' to add essential items not available in C++14, then 'subset' to exclude the unsafe constructs that now all have replacements."
And, as I have commented before, there is no-one who can make C++ seem like a rational language better than Stroustrup. After reading anything by him you are exposed to a rational view of C++ that makes everything seem elegant. A few minutes away from his vision and you are back to the personal dialects of C++ that make it so difficult to find your way into other people's code.

Sutter suggests that it is just a short road to a much improved C++ and that perfection isn't economically attainable, nor necessary. A very reasonable view. He makes a case for default enforcement of existing restrictions on programming in C++. For example:

"Enforce the Pro.Type safety profile by default. That includes either banning or checking all unsafe casts and conversions (e.g., static_cast pointer downcasts, reinterpret_cast), including implicit unsafe type punning via C union and vararg."

I'm not at all sure I know any longer what unsafe type punning is, and I think some of this has to do with compilers wanting to optimize code in ways that go against the programmer's intentions. This is complex. Less complex is his second suggestion:

"Enforce the Pro.Bounds safety profile by default, and guarantee bounds checking."

I can't understand why this isn't already the case. It might be something to do with the inefficiency of having to make such checks. There is also an addition to the idea:

"Pointer arithmetic is banned (use std::span instead); this enforces that a pointer refers to a single object. Array-to-pointer decay, if allowed, will point to only the first object in the array."

This one is harder to swallow, but it can be turned off. The remaining suggestions are mostly obvious and make you wonder why this isn't already so. I'm very pleased to read the statement on Undefined Behavior, UB:

"Not all UB is bad; any performance-oriented language needs some. But we know there is low-hanging fruit where the programmer's intent is clear and any UB or pitfall is a definite bug ..."
I think I would argue with the claim that UB is necessary. A good language should never result in UB - just in defined default behavior.

Would these suggestions make C++ a better language? The answer is obviously yes, but in a limited sense. Any language, no matter how bad, can be made good, or at least better, by the addition of tools that enforce good practice and help the programmer generate good code. Taking this one stage further, and making the tools part of the standard, is also good sense.

However, I am still of the opinion that C++ is flawed because there are simply too many ways of achieving the same result. If you are a C++ expert then you will have no problem with this as you are on top of the language, but experts are rare. The average or casual C++ programmer has a lot of trouble understanding existing C++ code because of the possible range of expression.

To give you an example, and this is one I most recently encountered rather than the worst, consider the treatment of return values. In C you can always discard a return value. In C++17 you can add [[nodiscard]] and the compiler will warn if you ignore the return value. Seems simple enough, but there is std::ignore, which can be used to override the [[nodiscard]]. Using it this way isn't standard yet, but it is proposed for C++26 and it is recommended practice, along with a proposal for a [[discard]] attribute to formally discard a return value that has been explicitly marked as [[nodiscard]]. What can one say? C++ is a high-level language that simply has not left its low-level roots behind and the more it tries, the messier it gets.

Read the rest of Sutter's proposals. They are an insight into C++ thinking, if nothing else.
More Information

Related Articles

Rust Twice As Productive As C++
C Is Number One Language Again
C Undefined Behavior - Depressing and Terrifying (Updated)
GCC Gets An Award From ACM And A Blast From Linus
Last Updated ( Wednesday, 10 April 2024 )