Where to find 'Real' programmers online?

In many cases, if a function is fully successful, you don't want cleanup: you return the newly allocated memory/resource along with the other allocations underpinning it. A defer would incorrectly clean it up even on success, leaving a dangling pointer.

So defer could be nice... but for it to be really useful, it would need to be a "cancellable defer". And that is rare in programming languages.
Defer is useful for dealing with temporary stuff, e.g. "sharedVar.Lock(); defer sharedVar.Unlock()". It is not for effects that must persist past a function return. If, for example, you must return three open files from a function, or none in case of error, don't use defer (consider the case where the first two opens succeed and the third fails).
 
Several decades ago, I was working on a language translator for a DSL that described electrical wiring diagrams of vehicles. It was based on the XPL compiler from “A Compiler Generator”, by McKeeman, Horning, and Wortman. The DSL was described by a bunch of BNF rules, which were used to generate parsing tables. When the compiler encountered a syntax error, it didn’t have a good way of dealing with it; all it knew was that none of the rules applied. The default behavior was to assume that there was an erroneous token and delete it. It usually ended up deleting the rest of the program. This was particularly bad, because you could batch up several compiles, and it would delete the subsequent programs.

There was a divider line that separated the different programs in the batch. If the compiler encountered a divider line while it was deleting erroneous tokens, I had it do a GOTO to a global label, to code that would restart the compiler, because it was too difficult to unwind the state of things when it was several subroutines deep. (It could have been done, but the architecture I was dealing with did not make it easy.)

It felt really slimy, not only doing a GOTO, but a GOTO to a global label, two sins at once. (Glenford J. Myers would have been so disappointed in me.)

But it fixed the problem, and I didn’t have to completely rewrite the compiler framework.
 
It's funny, we keep talking about GOTO and ignoring the C++ elephant in the room: exceptions. I personally LOVE exceptions, but again, only when used correctly.
I'm a big fan of exceptions. And it's often annoying that a lot of "safety" subsets tend to assume the language is magically safer without them (sometimes it is, but not always).

That said, I have seen some crappy parsers throw exceptions as part of their normal success flow or to escape nested loops. That stuff is pretty gross!

But why does every minor feature in C++ have to have a 33-page explanation?
C++ is the favored language of pedantic language lawyers.

Agreed. The C++ community tends to be extremely noisy, with ideas pulling in many different directions. The language is so (overly) complex these days that maybe it *does* need every feature to be painstakingly considered? I have a little talk here, and I am potentially quite terrified of the questions. I have spoken at software engineering conferences but never at a C++-focused one, and I am sure the experts there will pick apart good chunks of the proposed tool / library :).
 
The key is predictability. When GOTOs are used without discipline, the code path can go anywhere. For example, I have seen code that GOTOs into the middle of a DO loop because it wants to reuse some of the code in there, and then has an IF statement that jumps out if it is not doing iterations. (That's why they invented subroutines, BTW.) Dijkstra preferred languages that helped you not shoot yourself in the foot.

I spent about 35 years coding in mainframe assembly language. I liked to pretend I was a compiler: my loops and IF statements were always coded the same way, so I always knew where the code was going to exit. (That is a problem with, for example, doing GOTOs to jump out of loops. If I am testing, I want to be able to put a breakpoint at the end of a loop and know that it will always hit that breakpoint however the loop exits. [Sometimes that means I have to set a flag to indicate how the loop was exited, and then use an IF statement on the flag to figure out what to do next. But this is better than jumping out of the loop, because I know it will always hit my breakpoint.])

I should probably add that I seem to be one of the very few people who do not like the various "structured programming" macros that are available. I use the standard constructs, but I generate them by hand. Why? The structured programming macros abstract you from the details, but you end up coding something like IF (A,GT,B,CH). The details have been hidden, but then you have to tell it to use a Compare Halfword instruction, because the fields are halfwords. Now you are down in the details again. To me, it's like when pilots are flying a plane on autopilot, and suddenly it has a problem and turns control back to the pilots. They have not been paying attention, and now they have to fly the plane without a good feel for where they are or what has been going on. (BTW, I always felt that the macros ought to be able to figure out which compare instruction to use, based on how the fields were defined.)

As always, YMMV.
 
I'm a big fan of exceptions.
So am I. They allow writing much clearer code, because correct error raising and handling becomes either very concise (just say throw Exception('foo bar went wrong')) or even invisible (most functions don't need try/catch blocks, because most exceptions filter way up the stack).

EXCEPT: (ha ha, pun)

In languages where memory management and move semantics are explicit (the C family, Rust), exceptions create the havoc of having to get out of lots of blocks, run destructors, and make sure everything is destroyed/released/unallocated/closed/... correctly. The standard C++ RAII idiom, where we rely on destructors, seems to work if you don't look too carefully. The problem arises when the act of destroying/closing/releasing can itself fail and raise more exceptions. That's why one of the coding rules is "never allow a destructor to throw". But what if the destructor has to perform an action that can actually fail, such as closing a file, which in turn requires writing the last buffered content first? C++ cannot handle that in a reasonable, human-readable fashion.

Second, this only works well if exceptions are used solely for things that are truly exceptional (ha ha, another pun) and don't need to be handled except at very few places (ideally one, in a single-threaded program). Actions that commonly fail and need good diagnostics or error recovery (like retries) should not use exceptions, which mostly rules them out for file or disk I/O (file systems can be full, disks will have errors) and for network communication (networks are famously fickle). If you use them for these common cases, the code will be littered with try/catch blocks. Honestly, that's not a big deal, since using an error return and an if/else block is just as bad.

But where I really get upset is when exceptions are used for common and expected situations. My favorite (anti-favorite?) example is that Python uses an exception (StopIteration) to signal that an iterator has finished. This means my preconception (if I see try/catch or try/except blocks, there must be error handling in there) is wrong, and I can no longer visually distinguish normal flow from bad situations.

Still, on balance exceptions are better than the alternatives, albeit not perfect.
 
It's funny, we keep talking about GOTO and ignoring the C++ elephant in the room: exceptions. I personally LOVE exception, but again, only when used correctly.

Just don't use exceptions for regular flow control. Apart from being bad style, it is also horribly slow to actually throw exceptions (as opposed to just setting them up). This is also fairly unique to C++: unthrown exceptions are very fast.

Code:
       1.2 nsec/call        1.2 user        0.0 sys: atoi
      75.1 nsec/call       75.1 user        0.0 sys: snprintf
     153.4 nsec/call      153.4 user        0.0 sys: snprintf_float
     134.8 nsec/call      134.8 user        0.0 sys: fnmatch
      20.1 nsec/call       20.1 user        0.0 sys: condvar_signal
      15.7 nsec/call       15.7 user        0.0 sys: mutex_lock_unlock
       6.0 nsec/call        6.0 user        0.0 sys: pthread_mutex_trylock
      26.8 nsec/call       26.8 user        0.0 sys: gettimeofday
     195.9 nsec/call      195.9 user        0.0 sys: strncpy
       5.3 nsec/call        5.3 user        0.0 sys: strchr
     216.1 nsec/call       15.5 user      201.8 sys: getrusage
      90.7 nsec/call       23.1 user       67.1 sys: read
     142.4 nsec/call       38.9 user      103.5 sys: read1bdevzero
     241.8 nsec/call       32.9 user      208.9 sys: read8kdevzero
      56.9 usec/call      298.4 user    56554.5 sys: read2mdevzero
       4.1 nsec/call        4.1 user        0.0 sys: rand
       2.7 nsec/call        2.7 user        0.0 sys: random
       5.1 nsec/call        5.1 user        0.0 sys: floatrand
    5729.5 nsec/call     5720.0 user        0.0 sys: cpp_testhrow_throw_48
    5362.4 nsec/call     5373.1 user        0.0 sys: cpp_testhrow_throw_24
    5244.3 nsec/call     5236.3 user        0.0 sys: cpp_testhrow_throw_12
    5100.3 nsec/call     5094.5 user        0.0 sys: cpp_testhrow_throw_4
    5055.9 nsec/call     5063.4 user        0.0 sys: cpp_testhrow_throw
       6.0 nsec/call        6.0 user        0.0 sys: cpp_testhrow_no_throw
       4.8 nsec/call        4.8 user        0.0 sys: cpp_testhrow_no_possible_throw
       2.2 nsec/call        2.2 user        0.0 sys: cpp_testhrow_no_cleanup_no_throw
       1.2 nsec/call        1.2 user        0.0 sys: cpp_testhrow_no_cleanup_no_possible_throw
From https://github.com/cracauer/ulmbenchmarks

So, actually throwing an exception in C++ is 1000x slower than not throwing it.

This is clang on FreeBSD, gcc is not significantly different.
 
That said, I have seen some crappy parsers throw exceptions as part of their normal success flow or to escape nested loops. That stuff is pretty gross!
I'm not sure how common knowledge it is, but Stroustrup himself defined exceptions as "just another generic mechanism of program execution", not for the explicit and exclusive use of catching errors/exceptions. It seems that the restrictions on their more generic use have grown out of several generations of "best practices".
 
Just don't use exceptions for regular flow control. Apart from being bad style, it is also horribly slow to actually throw exceptions (as opposed to just setting them up). This is also fairly unique to C++: unthrown exceptions are very fast.
Well, I mean, that does make sense, and is expected, if you are thinking like a full-stack computer scientist. Of course exceptions are going to be "expensive".
 
Agreed. The C++ community tends to be extremely noisy, with ideas pulling in many different directions. The language is so (overly) complex these days that maybe it *does* need every feature to be painstakingly considered? I have a little talk here, and I am potentially quite terrified of the questions. I have spoken at software engineering conferences but never at a C++-focused one.

My beef with C++ is the amount of writing you have to consume just to use many of these features. Obviously for this you don't read the lawyerisms. The point being that even the purely practical side is large.
 
I'm not sure how common knowledge it is, but Stroutrup himself defined exceptions as "just another generic mechanism of program execution", not for the explicit and exclusive use of catching errors/exceptions. It seems that the restrictions on their more generic use has grown out of several generations of "best practices".
Interesting. I recall that in one of his books(?) he mentioned that using an exception to escape a deeply nested loop was "cute", but then went on to discuss other approaches to doing so. I have no idea which book, though... It was certainly an earlier one.

Makes sense. "catch" is quite generic, rather than try/fail or try/fault, which I recall seeing in the C++ /clr-generated IL much later on, once exceptions-for-error-handling-only became a more established idea.
 
Thus is the shame. It goes to how they teach computer science these days.
There are universities that begin / began with a functional language.

I am not a computer scientist. I first learned to program a pocket calculator, later FORTRAN, and
took a lecture on computer architecture (a real computer, without a microprocessor). It is a different curriculum.
 