OK, I will give you my uninformed opinion. I have edited this to be more specific based on comments on the original post and on the comments I received below.
As for the choice of language, storage tends not to be an issue with current systems.
One of the nice features of C is that it has a very small runtime library. If you compiled “Hello World” in either K&R C or ANSI C (I think I was using either the “Power C” or “Turbo C” compiler under DOS), it would build an executable that was approximately 4K. This fit nicely under the storage limits at the time (C++ was being developed, but was not widely available).
When C++ came out (I started playing with it around 1993), the “Hello World” exe grew to 20M because the runtime library was that much larger. Switch back to C, and you would get a 4K exe again. So yes, C++ executables are larger; however, with current servers, desktops, and even smartphones having gigabytes of memory, the compiled size of the executable is not as important.
My original assertion “With current systems, storage is not an issue.” was too ambiguous.
When you are dealing with an 8085 and 2 KB of memory with no external memory, then storage is a huge issue and you will probably choose assembly over any other language. It was a huge issue in 1975; I would guess it is not a current issue in embedded systems, given the memory options that now exist (disclaimer: I don’t work with embedded systems).
Based on what I learned in college and from job interviews (around 1990), embedded processors didn’t have a large amount of memory, and you really didn’t have the option to get more. You might have had the choice of C or assembly, but you certainly didn’t have the 4M available for C++.
That said, now you have options. Granted, it is $0.50 more per chip (see the informed opinion below), and if you are in the business of shaving a fraction of a cent off of your consumer products, then it may be a design issue.