
> that violate the very complicated 3/5/0 rule

Is it actually complicated? There’s only the rule of 0 - either your class isn’t managing resources directly & has none of the 5 default methods defined explicitly (destructor, copy constructor/assignment, move constructor/assignment), or it manages one and exactly one resource and defines all 5. Following that simple rule gives you exception safety & perfect RAII behavior. Of all the things in C++, it seemed like the most straightforward rule to follow mechanically.

BTW, the rule of 3 is from pre-C++11 - the addition of move construct/move assignment makes it the rule of 5 which basically says if you define any of those default ones you must define all of them. But the rule of 0 is far stronger in that it gives you prescriptive mechanical rules to follow for resource management.
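A minimal sketch of the two cases (the class names are illustrative, not from any real codebase): `Buffer` directly owns one resource and so defines all 5, while `Record` owns nothing directly and so defines none of them.

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <utility>

// Rule of 5: Buffer directly owns exactly one resource (a heap array),
// so it defines all five special member functions explicitly.
class Buffer {
    std::size_t n_ = 0;
    int* data_ = nullptr;
public:
    explicit Buffer(std::size_t n) : n_(n), data_(new int[n]()) {}
    ~Buffer() { delete[] data_; }
    Buffer(const Buffer& o) : n_(o.n_), data_(new int[o.n_]) {
        for (std::size_t i = 0; i < n_; ++i) data_[i] = o.data_[i];
    }
    Buffer& operator=(const Buffer& o) {
        Buffer tmp(o);               // copy-and-swap keeps this exception safe
        std::swap(n_, tmp.n_);
        std::swap(data_, tmp.data_);
        return *this;                // tmp's destructor frees the old array
    }
    Buffer(Buffer&& o) noexcept : n_(o.n_), data_(o.data_) {
        o.n_ = 0;
        o.data_ = nullptr;
    }
    Buffer& operator=(Buffer&& o) noexcept {
        std::swap(n_, o.n_);
        std::swap(data_, o.data_);
        return *this;
    }
    std::size_t size() const { return n_; }
};

// Rule of 0: Record manages no resource directly; its members do.
// It declares none of the 5 and composes correctly for free.
struct Record {
    std::string name;
    Buffer payload;
};
```

Copying a `Record` automatically uses `Buffer`'s copy constructor for the `payload` member, which is the composition the rule of 0 buys you.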

It’s much easier to do RAII correctly in Rust because of the ecosystem of the language + certain language features that make it more ergonomic (e.g. Borrow/AsRef/Deref) + some ownership guarantees around moves unless you make the type trivially copyable which won’t be the case when you own a resource.



> Is it actually complicated?

It is. There is no point in arguing otherwise.

To understand the problem, you need to understand why it is also a solution to much bigger problems.

C++ started as C with classes, and by design aimed at being perfectly compatible with C. But you want to improve developer experience, and bring to the table major architectural traits such as RAII. This in turn meant you add support for custom constructors, and customize how your instances are copied and destroyed. But you also want to be able to have everything just work out of the box without forcing developers to write boilerplate code. So you come up with the concept of special member functions which are automatically added by the compiler if they are trivial. However, forcing that upon every single situation can cause problems, so you have to come up with a strategy that suits all use cases and prevents serious bugs.

Consequently, you add a bunch of rules which boil down to: a) if the class/struct is trivial then compilers simply add trivial definitions of all special member functions so that you don't have to, but b) once you define any of those special member functions yourself the compiler steps back and lets you do all the work.
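A small sketch of that boundary, using type traits to observe it (the class names here are made up for illustration):

```cpp
#include <cassert>
#include <type_traits>

// Declares none of the special member functions: the compiler generates
// all of them, and here they are all trivial.
struct Plain {
    int x;
    double y;
};

// Directly owns a resource and declares a destructor. The compiler still
// generates a copy constructor (deprecated since C++11), but it is a
// shallow pointer copy that would double-delete -- exactly the bug the
// rule of 3 exists to prevent, and the class is no longer trivial.
struct Owner {
    int* p;
    ~Owner() { delete p; }
};

static_assert(std::is_trivially_copyable<Plain>::value,
              "all special members generated, trivially");
static_assert(!std::is_trivially_copyable<Owner>::value,
              "user destructor: the compiler steps back");
```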

Then C++ introduced move semantics. This refreshes the same problem as before. You need to retain compatibility with C, and you need to avoid boilerplate code, and on top of that you need to support all cases that originated the need for C++'s special member functions. But now you need to support move constructors and move assignment operators. Again, it's fine if the compiler adds those automatically if it's a trivial class/struct, but if the class has custom constructors and destructors then surely you also need to handle moves in a special way, so the compiler steps back and lets you do all the work. On top of that, you add the fact that if you need custom code to copy your objects around, surely you need custom code to move them too, and thus the compiler steps back to let you do all the work.
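The "moves silently become copies" case can be observed directly. A sketch with a made-up member that counts its copies: a user-declared destructor suppresses the implicit move constructor, so `std::move` falls back to the (implicitly generated) copy constructor.

```cpp
#include <cassert>
#include <utility>

static int copies = 0;

// A member type that counts copies but not moves.
struct CopyCounter {
    CopyCounter() = default;
    CopyCounter(const CopyCounter&) { ++copies; }
    CopyCounter(CopyCounter&&) noexcept {}                  // moves are free
    CopyCounter& operator=(const CopyCounter&) { ++copies; return *this; }
    CopyCounter& operator=(CopyCounter&&) noexcept { return *this; }
};

// Declares nothing: the compiler generates a move constructor,
// so std::move really moves the member.
struct Movable {
    CopyCounter c;
};

// Declares a destructor: the implicit move constructor is suppressed,
// and "moving" silently invokes the copy constructor instead.
struct HasDtor {
    CopyCounter c;
    ~HasDtor() {}
};
```

Moving a `Movable` leaves the counter untouched; "moving" a `HasDtor` bumps it by one, which is the silent performance regression described above.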

On top of this, there are also some specific combinations of custom constructors/destructors/copy constructors/copy assignment operators which let the compiler define move constructors/move assignment operators.

It all makes absolute sense if you are mindful of the design requirements. But if you just start to onboard onto C++ and barely know what a copy constructor is, all these aspects are arcane and sadistic. If you declare nothing then your class instances are copied and moved automatically, but once you add a constructor everything suddenly blows up and your code doesn't even compile anymore. You spotted a bug where an instance of a child class isn't being destroyed properly, and once you add a virtual destructor you suddenly have an unrelated function call throw compiler errors. You add a snazzy copy constructor that's very performant and your performance tests suddenly start to blow up because of the performance hit of suddenly having to copy all instances instead of the compiler simply moving them. How do you sort out this nonsense?

The rule of 5 is a nice rule of thumb to allow developers to have a simple mental model over what they need to do to avoid a long list of issues, but you still have no control over what you're doing. Things work, but work by sheer coincidence.


The need to define all 5 has basically nothing to do with C++'s heritage. If you allow those operations to be defined, they all must be defined when you define one of them.

There is a neater design in rust with its own tradeoffs: destructors are the only special function, move is always possible and has a fixed approach, copying is instead .clone(), assignment is always just a move, and constructors are just a convention with static methods, optionally with a Default trait. But that does constrain you: especially move being fixed to a specific definition means there's a lot you can't model well (self-referential structures), and that's a core part of why rust can have a neater model. And it still has the distinction you are complaining about with Copy, where 'trivial' structures can be copied implicitly but lose that as soon as they contain anything with a destructor or non-trivial .clone().

And in C++ it's pretty easy to avoid this mess in most cases: I rarely ever fully define all 5. If I have a custom constructor and destructor I just delete the other cases and use a wrapper class which handles those semantics for me.
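A sketch of that wrapper approach (`widget_open`/`widget_close` are a made-up C-style handle API, just to have something to wrap): `std::unique_ptr` with a custom deleter owns the handle, so the wrapper itself declares none of the 5 and gets move-only semantics for free.

```cpp
#include <cassert>
#include <memory>
#include <type_traits>

// Hypothetical C handle API, purely for illustration.
struct widget { int id; };
widget* widget_open(int id) { return new widget{id}; }
void widget_close(widget* w) { delete w; }

// Rule-of-0-style wrapper: the unique_ptr member handles destruction,
// deletes copying, and provides moves. Widget declares no special
// member functions at all.
class Widget {
    struct Closer {
        void operator()(widget* w) const { widget_close(w); }
    };
    std::unique_ptr<widget, Closer> h_;
public:
    explicit Widget(int id) : h_(widget_open(id)) {}
    int id() const { return h_->id; }
};

static_assert(!std::is_copy_constructible<Widget>::value,
              "copying deleted via the unique_ptr member");
static_assert(std::is_move_constructible<Widget>::value,
              "moves generated via the unique_ptr member");
```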


> The need to define all 5 has basically nothing to do with C++'s heritage. If you allow those operations to be defined, they all must be defined when you define one of them.

I'm sorry, that is not true at all.

Nothing forces you to add implementations, at least not for all cases. That's only a simplistic rule of thumb that helps developers not well versed in the rules of special member functions (i.e., most) get stuff to work by coincidence. You only need to add, say, a custom move constructor when you need it and when the C++ rules state the compiler should not generate one for you. There's even a popular table from a presentation at ACCU 2014 stating exactly under which conditions you need to fill in your custom definition.
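One row of that table can be checked mechanically; a sketch: a user-declared move constructor causes the copy constructor and copy assignment operator to be implicitly deleted.

```cpp
#include <cassert>
#include <type_traits>

// Declaring only a move constructor: per the special member function
// rules, the copy operations are implicitly deleted.
struct MoveOnly {
    MoveOnly() = default;
    MoveOnly(MoveOnly&&) = default;
};

static_assert(std::is_move_constructible<MoveOnly>::value, "");
static_assert(!std::is_copy_constructible<MoveOnly>::value,
              "declaring move deletes copy");
static_assert(!std::is_copy_assignable<MoveOnly>::value, "");
```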

https://i.sstatic.net/b2VBV.png

You are also wrong when you assert this has nothing to do with C++'s heritage. It's the root cause of each and every single little detail. Special member functions were added with traits and tradeoffs for compatibility and ease of use, and with move semantics the committee had to revisit everything over again but with an additional layer of requirements. The rules involving default move constructors and move assignment operators are famously nuanced and even arbitrary. There is no way around it.

> There is a neater design in rust (...)

What Rust does and does not do is irrelevant. Rust was a greenfield project that had no requirement to respect any sort of backward compatibility and stability. If there is any remotely relevant comparison that would be Objective-C, which also took a minimalist approach based on custom factory methods and initializers that rely on conventions, and it is a big boilerplate mess.


It would be more user-friendly if non-defined members of the 5 were automatically deleted, IMO.
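That behavior can be opted into manually today with `= delete`; a sketch (the class name is made up): any accidental copy or move then becomes a hard error at the call site instead of a silently generated member.

```cpp
#include <cassert>
#include <type_traits>

// Explicitly delete every special member you did not define, so the
// compiler never fills one in behind your back.
struct Handle {
    Handle() = default;
    ~Handle() {}                               // stand-in for custom cleanup
    Handle(const Handle&) = delete;
    Handle& operator=(const Handle&) = delete;
    Handle(Handle&&) = delete;
    Handle& operator=(Handle&&) = delete;
};

static_assert(!std::is_copy_constructible<Handle>::value, "");
static_assert(!std::is_move_constructible<Handle>::value, "");
```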


> It is. There is no point in arguing otherwise.

Well, I don’t know how to respond to this. I clarified what the rules actually are (< 1 paragraph) and following them blindly leads to correct results. You’ve brought in a whole bunch of nonsense about why C++ has become complex as a language - it’s not wrong but I’m failing to connect the dots as to how the rule of 0 itself is hard to follow or complex. I’m kind of taking as a given that whoever is writing the code is mildly familiar enough with C++ to understand RAII & is trying to apply it correctly.

> The rule of 5 is a nice rule of thumb to allow developers to have a simple mental model over what they need to do to avoid a long list of issues, but you still have no control over what you’re doing. Things work, but work by sheer coincidence.

First, as I’ve said multiple times, it’s the rule of 0. That’s the rule to follow to get correct composition of resource ownership & it’s super simple. As for not having control, I really fail to see how that is - C++ famously gives you too much control and that’s the problem. As for things working by sheer coincidence, that’s like your opinion. To me “coincidence” wouldn’t explain how many lines of C++ code are running in production.

Look, I think C++ has a lot of warts which is why I prefer Rust these days. But the rule of 0 is not where I’d say C++’s complexity lies - if you think that is the case, I’d recommend you use another language because if you can’t grok the rule of 0, the other footguns that lie in wait will blow you away to smithereens.


In addition, it's actually pretty easy in most cases where you do want a non-trivial constructor and destructor to just delete the other 3, and wrap it in unique_ptr or similar to manage the hard parts. I think I've defined all 5 approximately once, and mostly for the fun of it in a side project.


> nonsense ... not wrong

So it's not nonsense?

I think GP clearly laid out the base principles that lead to emergent complexity. GP calls this "coincidence" to convey the feeling of lots of complexity just narrowly avoiding catastrophe in a process that is hard to grok for someone getting into C++. GP also gave some scenarios in which the rule of 0 no longer applies and you now simply have to follow some other rule. "Just follow the rule" is not very intuitive advice. The rule may be simple to follow but the foundations on which it rests are pretty complicated, which makes the entire rule complicated in my worldview and also that of GP. In your view, the rule is easy to follow and therefore simple. Let's agree to disagree on that. Again, being told "you need to just follow this arbitrary rule to fix all these sudden compiler errors" doesn't inspire confidence in one's code, hence (I think) the usage of "coincidence". If I were using such a language, I'd certainly feel a bit nervous and unsure.


> GP calls this "coincidence" to convey the feeling of lots of complexity just narrowly avoiding catastrophe in a process that is hard to grok for someone getting into C++

I think that's what they said themselves:

>> It all makes absolute sense if you are mindful of the design requirements. But if you just start to onboard onto C++ and barely know what a copy constructor is, all these aspects are arcane and sadistic

IMO not knowing why something works (in any language) is an unpleasant feeling. Then if you have the chance you can look under the hood, read things - it's exactly why I'm reading this thread - and little by little get a better understanding. That's called gaining experience.

> Again, being told "you need to just follow this arbitrary rule to fix all these sudden compiler errors" doesn't inspire confidence in one's code, hence (I think) the usage of "coincidence"

That's exactly what other languages like Haskell or Rust are praised for. Why does C++ receive a different treatment when it tries to do the same thing instead of crashing on you at runtime, for once?


> That's exactly what other languages like Haskell or Rust are praised for.

Making a trivial change and suddenly having entire new classes of bugs all over your code is not an aspect that receives any praise. People using those two languages work hard on avoiding that situation, and it clearly feels like a failure when it happens.

The part about pointing out problems at compile time, so the developer knows about them sooner, is great. And I imagine that is the part you are talking about. But the GP was talking about the other part of the issue.


> Things work, but work by sheer coincidence

I wouldn't be so dramatic. Houses of cards don't stay put by coincidence!



