The 2024 Embedded Online Conference

Embedded platforms for beginners?

Started by Elof Huang September 1, 2017
On 06/09/17 12:42, rickman wrote:
> Clifford Heath wrote on 9/5/2017 10:21 PM:
>> On 06/09/17 12:06, rickman wrote:
>>> Paul Rubin wrote on 9/4/2017 1:20 AM:
>>>> rickman <gnuarm@gmail.com> writes:
>>>>> It's not like "GC'd" is a commonly used abbreviation.
>>>>
>>>> At least among language geeks, it's definitely commonly used.
>>>
>>> Exactly, when you are talking to people who already know what you are
>>> talking about.
>>
>> Most people would just Google "memory gc" to educate themselves,
>> rather than bleating about how ignorant they are. "Garbage
>> Collection" is the very first hit on Google. But instead of
>> you spending 30 seconds learning something, a hundred or more
>> people have had to spend 30 seconds reading unnecessary posts.
>
> Exactly. I see dozens of un-useful abbreviations in newsgroups every
> day. They waste people's time. In this case it wasn't just mine. The
> rule for use of abbreviations in documents is to explain the
> abbreviation the first time it is used in a doc. It would be a good
> rule for usage in a thread.
This industry is characterized by lots of jargon and acronyms. The most desirable quality in selecting a new hire is that they show a propensity and enthusiasm for finding things out by themselves, rather than asking other people.

By the way, how's the job hunt going? Keep going, you're sure to make it some year.
Clifford Heath wrote on 9/6/2017 12:26 AM:
> On 06/09/17 12:42, rickman wrote:
>> [snip]
>> The rule for use of abbreviations in documents is to explain the
>> abbreviation the first time it is used in a doc. It would be a good
>> rule for usage in a thread.
>
> This industry is characterized by lots of jargon and acronyms.
> The most desirable quality in selecting a new hire is that they
> show a propensity and enthusiasm for finding things out by
> themselves, rather than asking other people.
>
> By the way, how's the job hunt going? Keep going, you're sure
> to make it some year.
Obviously you don't understand newsgroups if you think this is a job interview. This is a discussion group. Many people come from different backgrounds with different levels of familiarity with the jargon of any particular micro-area of knowledge. If you want to be able to communicate, you won't express every thought with the minimum number of keystrokes.

The term is garbage collection. It's not that hard to type. Just like *many* other abbreviations, it is often not worth using since it just isn't that much harder to type the name.

--

Rick C

Viewed the eclipse at Wintercrest Farms,
on the centerline of totality since 1998
On 05/09/17 22:09, Phil Hobbs wrote:
> On 09/04/2017 05:46 AM, Tom Gardner wrote:
>> On 04/09/17 00:13, Phil Hobbs wrote:
>>> On 09/03/2017 06:00 PM, Clifford Heath wrote:
>>>> On 04/09/17 05:58, George Neuner wrote:
>>>>> On Sun, 3 Sep 2017 17:52:42 +1000, Clifford Heath <no.spam@please.net>
>>>>> wrote:
>>>>>
>>>>>> On 03/09/17 16:14, rickman wrote:
>>>>>>> What is GC?
>>>>>>
>>>>>> Garbage collection. Make new objects and just lose them,
>>>>>> the GC will pick them up. You never know when the GC will
>>>>>> run, and you need a *lot* more memory,
>>>>>
>>>>> Those are misconceptions: there are systems that are completely
>>>>> deterministic, others that are very predictable (e.g., GC is
>>>>> schedulable in particular time slots, won't exceed given percentage of
>>>>> CPU time, etc.), and systems that don't need much additional memory.
>>>>>
>>>>> There even are systems that combine predictability with low memory
>>>>> overhead.
>>>>
>>>> All that is true. None of it invalidates what I said.
>>>>
>>>>>> so it's not a good way to get the most out of an MCU.
>>>>>
>>>>> I would say it depends on the system.
>>>>> My position is that GC has no place in a small system.
>>>>
>>>> Thanks for confirming my position.
>>>
>>> Well, given that dynamic allocation has no place in small systems
>>> either, GC is a bit of a non-issue. ;)
>>
>> "Small" is getting larger all the time :)
>>
>> Nowadays it isn't completely ridiculous to consider
>> having GC on a small system - although I've never
>> done it.
>>
>>>>> OTOH, if you're talking about a complex
>>>>> system with 32-bit controller and 100s of KB (or more) of RAM ...
>>>>
>>>> Then you have resources in such excess that *you don't need*
>>>> to get the best out of it. Exactly what I said, in other words.
>>>
>>> Until it's been running long enough that the heap is fragmented into
>>> tiny, tiny bits.
>>
>> If you roll your own general purpose GC then at best you
>> will be reinventing a wheel. In almost all cases such a
>> wheel will be /far/ from circular. Even special purpose GCs
>> can be a pig to get working.
>>
>> C/C++ as a language is, of course, a very bad starting point
>> for a general purpose GC. Boehm made heroic efforts, and
>> due to having to make pessimising assumptions, only managed
>> to get it "frequently correct".
>
> I've never felt the need for GC in an embedded system myself, and I'm
> sufficiently averse to midnight phone calls that I don't use dynamic
> allocation there either.
>
> There are probably situations where it's helpful, but I'm an instruments
> builder, and the MCU and memory are very rarely the cost driver. Field
> failures--even transient ones--are what costs money, not hardware.
> Having said that, I don't gold-plate the hardware either. It usually
> wouldn't matter to the customer, but one does have one's professional
> standards. ;)
No disagreement there, but the boring stuff is becoming more commonplace. By "boring" I'm thinking of fancy bitmapped GUIs and USB/networking/etc, and GC is more likely to creep in through that kind of bling.

As for MCU/memory not being a cost driver, that means they no longer preclude GC.
On 06/09/17 05:42, Phil Hobbs wrote:
> A big-ass phone switch is _not_ my idea of an embedded system.
Typical "definitions" of "embedded system" run along the lines of "An embedded system is a computer system with a dedicated function within a larger mechanical or electrical system, often with real-time computing constraints." So it is reasonable to argue that telco stuff /is/ an embedded system. Doubly so when you look at how bits of software from different companies are cobbled and patch-panelled together.
On 05/09/17 18:35, George Neuner wrote:
> On Tue, 05 Sep 2017 14:15:23 +0200, David Brown
> <david.brown@hesbynett.no> wrote:
>
>> On 04/09/17 07:55, Clifford Heath wrote:
>>
>>> Please, why don't you copy/paste all the content from
>>> http://www.memorymanagement.org/
>>> There might be some people here who have not read it all yet.
>>> Sigh... Yes, there is a whole domain of science here.
>>> Far, far more than you have reminded us of yet.
>>
>> I had a little look at that site. It had some useful explanations, but
>> is /seriously/ biased. It gives the impression that the only reason you
>> would /not/ want garbage collection is if you want your program to be
>> bloated, slow, inefficient, waste memory - and leak resources, have
>> buffer overflows and other nasal daemons.
>
> Unfortunately, the pro-GC bias is one that is firmly rooted in
> reality.
Garbage collection certainly has its advantages - but equally it is not a magic wand that solves all memory problems, nor is it the right tool in all situations.
> In this forum we get a distorted view because the general skill level
> of embedded developers is relatively high.
True. And we also tend to avoid dynamic memory as much as possible!
> In the broader programming world, the average skill level is just
> slightly above "script kiddy". The headlong march to "managed"
> languages such as Java, C#, Python, etc. is industry acknowledging
> that the average programmer simply can't write a complex program
> without hand-holding.
On PCs, I usually write in Python - and the automatic memory management is one of the nice things about the language. In that context, I absolutely agree that automatic memory management and garbage collection is the right way to go. My point is just that there are other situations, other types of programming - and other solutions.

The website there was written like a marketing campaign rather than an unbiased reference. As well as dismissing C-style memory management out of hand, it described C++'s methods as "even harder" with no basis whatsoever (RAII and synchronous destructors make memory management far easier) and pretty much ignored smart pointers altogether.

It also ignores the problems of garbage collection. In particular, it is very easy to get circular references in complex structures, and it is easy to get memory leaks in garbage collected systems. Garbage collection makes the issues of memory management different - it is not a magic fix-all solution. It lets you avoid the micro-managing and write simpler code, but it does not fix everything. It just teaches a generation of programmers that it does not matter if your Python program leaks 100K of memory an hour - that's only a GB a year, and everyone has gigabytes to spare.
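To make the circular-reference point concrete, here is a small CPython sketch (the `Node` class is invented purely for illustration): reference counting alone never reclaims a cycle, so that garbage hangs around until the cycle collector gets a turn.

```python
import gc

class Node:
    """Hypothetical example type; any object holding references will do."""
    def __init__(self):
        self.other = None

gc.disable()  # rely on reference counting alone, no automatic cycle collector
for _ in range(1000):
    a, b = Node(), Node()
    a.other, b.other = b, a   # a cycle: neither refcount ever reaches zero
del a, b
# All 2000 nodes are now unreachable, but still allocated.
found = gc.collect()          # run the cycle collector explicitly;
gc.enable()                   # it returns the number of unreachable objects
assert found >= 2000          # at least the 2000 Node instances
```

In a long-running GC'd program the same pattern is usually harmless, because the collector runs periodically - the leak only becomes real when something (a cache, a global, a registered callback) keeps a reference to the cycle alive.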
> And it isn't just memory management ... the runtimes of these
> languages also - in whole or in part - manage files, network
> connections, threads, synchronize object access, etc.
They do indeed - but that is a process that is mostly independent of garbage collection. The only direct connection is that by making your clean-up happen asynchronously and behind the scenes, you have lost control of where your tidy-up is done.

With C++, your file will be closed by the file object's destructor. That means your code that /uses/ the file object does not need to have the close-file logic, but you know the file will be properly closed when the using function exits by return or exception.

With a garbage collected language, you have the choice of manually putting the "close" call in your using function, carefully catering for try/except/finally blocks to make sure it is handled in exceptions, or you put it in the object's finaliser to be run eventually, at an unknown time in an unknown order. For Python (as the gc'ed language I use most), you just assume it will all be tidied up sooner or later. It works because garbage collection runs regularly, so it won't be /too/ long before the destructor is called.
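A quick Python sketch of that choice (the scratch file is created just for the demo): a `with` block gives the deterministic, destructor-like cleanup, while a bare `open` leaves closing to the finaliser at some unspecified later time.

```python
import os
import tempfile

fd, path = tempfile.mkstemp()   # scratch file just for this demo
os.close(fd)

# Deterministic cleanup: the context manager closes the file on block
# exit, whether we leave by falling off the end, return, or exception -
# the closest Python gets to a C++ destructor.
with open(path, "w") as f:
    f.write("hello")
assert f.closed

# Relying on the collector instead: nothing closes this until the
# object is finalised.
g = open(path)
assert not g.closed
del g   # CPython's refcounting finalises it promptly; other runtimes
        # may keep the file open until some later GC pass

os.remove(path)
```

The `with` form is effectively the manual-close option with the try/finally bookkeeping generated for you, which is why it is the idiomatic choice when the resource matters.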
> C isn't even being taught in a lot of schools now, so even those who
> have a (relatively) recent CS/IS degree may have no experience of
> programming that requires manual resource management.
>
> All in all, general software development is a sorry state of affairs
> that is getting worse with every passing year. I have stated in the
> past my belief that the majority of software "developers" would be
> doing the world a favor by finding another occupation.
I certainly think that most people who program in C should be using other languages, and most programs that are written in C would be better written in other languages. (Embedded development is a major exception, though even here there are alternatives.) The memory management in C is a big reason for that opinion.

But telling people that garbage collection is always the best choice, and that it solves all memory and resource problems, is not the answer. It's like saying that if you buy a car with automatic gears, collision detection with automatic braking, etc., then you don't need to learn to drive properly.
On 05/09/17 19:02, Rob Gaddi wrote:
> On 09/05/2017 09:35 AM, George Neuner wrote:
>> Unfortunately, the pro-GC bias is one that is firmly rooted in
>> reality.
>>
>> In this forum we get a distorted view because the general skill level
>> of embedded developers is relatively high.
>>
>> In the broader programming world, the average skill level is just
>> slightly above "script kiddy". The headlong march to "managed"
>> languages such as Java, C#, Python, etc. is industry acknowledging
>> that the average programmer simply can't write a complex program
>> without hand-holding.
>
> I used to write a lot of assembly. Then I wrote a lot of C. These days
> I write a lot of Python. And you know what? I get more done faster
> with fewer bugs.
>
> Just the same as going from ASM to C, there are things that the machine
> is just better at doing than you are. Here and there, yeah, I can sit
> there and hand-tweak better ASM than the C compiler will give me. But
> for an entire project of any scope? Better to let the optimizing
> compiler do 90% of the job over 100% of the project than for me to do
> 100% of the job over only 20% of the project.
Agreed.
> GC is the same thing. Think how complicated the simple function "Create
> a new string from the concatenation of two existing strings" becomes
> when you start having to worry about malloc/free. Either:
> A) the function has to accept a pre-allocated buffer, in which case
> you've moved all the complexity of adding strlens and not getting off by
> one out of the function, making the function pointless, or
> B) the function does its own malloc, which you've now hidden under the
> hood while still forcing the caller to remember to free it at some point.
Anyone working in C with malloc/free on code that does a lot of string work is working in the wrong language. Even though you can hide the mallocs and frees in wrappers that make them easier to track, it is still immensely painful - at best, you are using function-call syntax instead of infix operators and writing code that is hard to follow.
> With GC if you need the new string, you magically create it on the heap,
> and then you walk away, comfortable in the knowledge that it'll get
> taken care of.
Garbage collection is /not/ needed for this - though it is certainly one way to do it. C++'s model works fine:

#include <iostream>
#include <string>

using namespace std::string_literals;

int main(void)
{
    auto s = "Equation: "s;
    auto t = u8"πr²"s;
    std::cout << (s + t + "\n");
}

No leaks, no problem using strings just like in Python, and no garbage collection. Of course, the C++ model has its own advantages and disadvantages, just like C's malloc/free, and just like any given garbage collection method.
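For comparison, a sketch of the same concatenation on the garbage-collected side of the trade-off (Python, since it comes up throughout the thread) - the temporaries are reclaimed whenever the runtime finds them unreachable, with no allocation bookkeeping in sight:

```python
# String concatenation with no manual allocation: the intermediate and
# final strings are garbage-collected (in CPython, usually freed promptly
# by reference counting) once nothing refers to them.
s = "Equation: "
t = "\u03c0r\u00b2"        # "πr²"
line = s + t + "\n"
assert line == "Equation: \u03c0r\u00b2\n"
print(line, end="")
```

The surface convenience is the same as the C++ version above; the difference is in who pays for reclamation, and when.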
>> And it isn't just memory management ... the runtimes of these
>> languages also - in whole or in part - manage files, network
>> connections, threads, synchronize object access, etc.
>
> Having tools to make the routine parts of the job easier so that you can
> concentrate on the larger problem is a good thing.
>
>> C isn't even being taught in a lot of schools now, so even those who
>> have a (relatively) recent CS/IS degree may have no experience of
>> programming that requires manual resource management.
>>
>> All in all, general software development is a sorry state of affairs
>> that is getting worse with every passing year. I have stated in the
>> past my belief that the majority of software "developers" would be
>> doing the world a favor by finding another occupation.
>>
>> YMMV,
>> George
On 09/06/2017 03:45 AM, Tom Gardner wrote:
> On 05/09/17 22:09, Phil Hobbs wrote:
> [snip]
>> There are probably situations where it's helpful, but I'm an instruments
>> builder, and the MCU and memory are very rarely the cost driver. Field
>> failures--even transient ones--are what costs money, not hardware.
>> Having said that, I don't gold-plate the hardware either. It usually
>> wouldn't matter to the customer, but one does have one's professional
>> standards. ;)
>
> No disagreement there, but the boring stuff is becoming
> more commonplace. By "boring" I'm thinking of fancy bitmapped
> GUIs and USB/networking/etc, and GC is more likely to
> creep in through that kind of bling.
>
> As for MCU/memory not being a cost driver, that means
> they no longer preclude GC.
The primary thing that precludes GC in my embedded projects is me.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
On 08/09/17 01:23, Phil Hobbs wrote:
> On 09/06/2017 03:45 AM, Tom Gardner wrote:
> [snip]
>> As for MCU/memory not being a cost driver, that means
>> they no longer preclude GC.
>
> The primary thing that precludes GC in my embedded projects is me.
And that's an incontestable reason :)

Personally I've never included GC in my embedded projects (although I use it everywhere else). But my reason is different: I'm interested in the /hard/ realtime behaviour.

That frequently makes it difficult for me to include things like caches and interrupts. Fortunately there /are/ alternatives.
On 08/09/17 09:42, Tom Gardner wrote:
> On 08/09/17 01:23, Phil Hobbs wrote:
>> On 09/06/2017 03:45 AM, Tom Gardner wrote:
>>> As for MCU/memory not being a cost driver, that means
>>> they no longer preclude GC.
>>
>> The primary thing that precludes GC in my embedded projects is me.
My main reason for not using garbage collection is that my code does not generate garbage :-)
> And that's an incontestable reason :)
>
> Personally I've never included GC in my embedded projects
> (although I use it everywhere else). But my reason is
> different: I'm interested in the /hard/ realtime behaviour.
>
> That frequently makes it difficult for me to include things
> like caches and interrupts. Fortunately there /are/ alternatives.
Usually in that kind of project, you are avoiding dynamic memory and not using any kind of malloc/free equivalent.
On 08/09/17 11:07, David Brown wrote:
> On 08/09/17 09:42, Tom Gardner wrote:
>> On 08/09/17 01:23, Phil Hobbs wrote:
>>> On 09/06/2017 03:45 AM, Tom Gardner wrote:
>>>> As for MCU/memory not being a cost driver, that means
>>>> they no longer preclude GC.
>>>
>>> The primary thing that precludes GC in my embedded projects is me.
>
> My main reasoning for not using garbage collection, is that my code does
> not generate garbage :-)
Lucky you; usually my code /is/ garbage :)
>> And that's an incontestable reason :)
>>
>> Personally I've never included GC in my embedded projects
>> (although I use it everywhere else). But my reason is
>> different: I'm interested in the /hard/ realtime behaviour.
>>
>> That frequently makes it difficult for me to include things
>> like caches and interrupts. Fortunately there /are/ alternatives.
>
> Usually in that kind of project, you are avoiding dynamic memory and not
> using any kind of malloc/free equivalent.
Agreed, although I don't object to using malloc during startup.

I've always liked (and usually been able to achieve) almost complete separation between the hard realtime code and the "other stuff". Frequently the "other stuff" is coded in whatever is convenient, which usually includes GC.
