EmbeddedRelated.com

Embedded platforms for beginners?

Started by Elof Huang September 1, 2017
On 04/09/17 15:24, Paul Rubin wrote:
> Clifford Heath <no.spam@please.net> writes:
>> Generational GC relies on being able to move things between heaps,
>> so object handles cannot be pointers (or rather, must be indirect).
>
> The handles are usually still pointers. First you copy the object to
> the other heap, then later you scan it, copy the objects that the first
> object points to, and update the pointers in the first object to point
> to the new locations. This is described pretty well in the SICP chapter
> on metacircular evaluators (mitpress.mit.edu/sicp).
Paul,

Please, why don't you copy/paste all the content from
http://www.memorymanagement.org/
There might be some people here who have not read it all yet.
Sigh... Yes, there is a whole domain of science here.
Far, far more than you have reminded us of yet.

No, it's *still* not the way to get the best from
most embedded hardware. It's an acceptable compromise,
some of the time, is all that can really be said.
On 04/09/17 00:13, Phil Hobbs wrote:
> On 09/03/2017 06:00 PM, Clifford Heath wrote:
>> On 04/09/17 05:58, George Neuner wrote:
>>> On Sun, 3 Sep 2017 17:52:42 +1000, Clifford Heath <no.spam@please.net>
>>> wrote:
>>>
>>>> On 03/09/17 16:14, rickman wrote:
>>>>> What is GC?
>>>>
>>>> Garbage collection. Make new objects and just lose them,
>>>> the GC will pick them up. You never know when the GC will
>>>> run, and you need a *lot* more memory,
>>>
>>> Those are misconceptions: there are systems that are completely
>>> deterministic, others that are very predictable (e.g., GC is
>>> schedulable in particular time slots, won't exceed given percentage of
>>> CPU time, etc.), and systems that don't need much additional memory.
>>>
>>> There even are systems that combine predictability with low memory
>>> overhead.
>>
>> All that is true. None of it invalidates what I said.
>>
>>>> so it's not a good way to get the most out of an MCU.
>>>
>>> I would say it depends on the system.
>>
>> My position is that GC has no place in a small system.
>>
>> Thanks for confirming my position.
>
> Well, given that dynamic allocation has no place in small systems either, GC is
> a bit of a non-issue. ;)
"Small" is getting larger all the time :) Nowadays it isn't completely ridiculous to consider having GC on a small system - although I've never done it.
>>> OTOH, if you're talking about a complex
>>> system with 32-bit controller and 100s of KB (or more) of RAM ...
>>
>> Then you have resources in such excess that *you don't need*
>> to get the best out of it. Exactly what I said, in other words.
>
> Until it's been running long enough that the heap is fragmented into tiny, tiny
> bits.
If you roll your own general-purpose GC then at best you will be reinventing a wheel. In almost all cases such a wheel will be /far/ from circular. Even special-purpose GCs can be a pig to get working. C/C++ as languages are, of course, a very bad starting point for a general-purpose GC. Boehm made heroic efforts, and due to having to make pessimistic assumptions, only managed to get it "frequently correct".
On 04/09/17 07:24, Paul Rubin wrote:
> Clifford Heath <no.spam@please.net> writes:
>> Generational GC relies on being able to move things between heaps,
>> so object handles cannot be pointers (or rather, must be indirect).
>
> The handles are usually still pointers. First you copy the object to
> the other heap, then later you scan it, copy the objects that the first
> object points to, and update the pointers in the first object to point
> to the new locations. This is described pretty well in the SICP chapter
> on metacircular evaluators (mitpress.mit.edu/sicp).
That is /one/ way to handle it. There are many others. For example, when you have pools of memory that have blocks of the same size, you might use an array and have indexes into that array as your handles. Or you might have something that is basically an address, but specifically manipulated in order to make it easily distinguishable to a garbage collector, and to avoid user code accidentally using it as a pointer. This could be as simple as adding an offset to put the value outside the valid range in the device - software bugs would then cause hard faults that can quickly be identified.
On 04/09/17 07:55, Clifford Heath wrote:
> On 04/09/17 15:24, Paul Rubin wrote:
>> Clifford Heath <no.spam@please.net> writes:
>>> Generational GC relies on being able to move things between heaps,
>>> so object handles cannot be pointers (or rather, must be indirect).
>>
>> The handles are usually still pointers. First you copy the object to
>> the other heap, then later you scan it, copy the objects that the first
>> object points to, and update the pointers in the first object to point
>> to the new locations. This is described pretty well in the SICP chapter
>> on metacircular evaluators (mitpress.mit.edu/sicp).
>
> Paul,
>
> Please, why don't you copy/paste all the content from
> http://www.memorymanagement.org/
> There might be some people here who have not read it all yet.
> Sigh... Yes, there is a whole domain of science here.
> Far, far more than you have reminded us of yet.
I had a little look at that site. It had some useful explanations, but is /seriously/ biased. It gives the impression that the only reason you would /not/ want garbage collection is if you want your program to be bloated, slow, inefficient, waste memory - and leak resources, have buffer overflows and other nasal daemons.
> No, it's *still* not the way to get the best from
> most embedded hardware. It's an acceptable compromise,
> some of the time, is all that can really be said.
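The copy-then-scan-then-update scheme Paul quotes from SICP is essentially Cheney's algorithm. As a toy sketch, assuming two-slot cells and a forwarding pointer stored in the old copy (all names and sizes here are invented for illustration, not from any real collector):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical two-slot cell for a toy semispace collector. */
typedef struct Cell {
    struct Cell *slot[2];   /* child pointers (NULL = empty) */
    struct Cell *forward;   /* forwarding pointer, set once copied */
    int value;
} Cell;

#define HEAP_CELLS 64
static Cell from_space[HEAP_CELLS], to_space[HEAP_CELLS];
static size_t alloc_top;    /* next free cell in to_space during GC */

/* Copy one cell into to_space (unless already copied) and return its
 * new address; leave a forwarding pointer in the old copy so shared
 * and cyclic structures are copied exactly once. */
static Cell *evacuate(Cell *c)
{
    if (c == NULL) return NULL;
    if (c->forward) return c->forward;        /* already moved */
    Cell *copy = &to_space[alloc_top++];
    *copy = *c;
    copy->forward = NULL;
    c->forward = copy;
    return copy;
}

/* Cheney scan: copy the roots first, then walk to_space breadth-first,
 * updating each copied cell's children to their new locations. */
static size_t collect(Cell **roots, size_t nroots)
{
    size_t scan = 0;
    alloc_top = 0;
    for (size_t i = 0; i < nroots; i++)
        roots[i] = evacuate(roots[i]);
    while (scan < alloc_top) {
        Cell *c = &to_space[scan++];
        c->slot[0] = evacuate(c->slot[0]);
        c->slot[1] = evacuate(c->slot[1]);
    }
    return alloc_top;                         /* number of live cells */
}
```

Anything in from_space never reached by the scan is garbage by definition - it is simply left behind when the spaces are swapped, which is why allocation cost is independent of the amount of dead data.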
Phil,


On 02-09-17 17:41, Phil Hobbs wrote:
> On 09/01/2017 07:49 AM, Clifford Heath wrote:
>> On 01/09/17 16:07, Elof Huang wrote:
>>> I want to learn Embedded systems, but don't know how to start it,
>>> do you recommend any platform to start?
>>
>> If you're really a beginner, there is really nothing better than
>> the Arduino system.
> Unless you actually want to learn to build systems. Arduino is sort of
> embedded-Python-lite, and (last time I looked) has horribly archaic
> development tools. I've had source-level debugging since Microsoft 5.2
> for DOS, circa 1989. Arduino? Not last time I checked.
I agree, but the question is whether this is really such an issue for a beginner. Let's keep in mind that the Arduino project started out at an art university, to allow people to create interactive art (like a painting that changes color when you move your hand over it). For that purpose -as this is a tool to be used by artists, not engineers- its main goal is to hide the complexity of the low-level hardware interfacing in libraries, and it does do that very well.

I agree, once you start writing code that uses interrupt service routines, timers, DMA channels, etc., debugging with just print commands soon hits its limits, but that's not what a beginner does, is it?

My advice usually is: start with one of these Arduino starter kits to get your feet wet, and -once you have done that- you can then decide what your next step will be: native AVR + gcc, mbed, STM32 + libopencm3, LPCXpresso (indeed, also very nice), PIC, MSP430, ... Or you can go "up" in the hardware stack and opt for (say) MicroPython on an STM32F4 (I think, the ideal tool for fast development talking to some new SPI/I2C/CAN-bus chip), or go down in the hardware stack: FPGAs/VHDL/Verilog/MyHDL/...
> Cheers
> Phil Hobbs
Cheerio! Kr. Bonne.
On Tue, 05 Sep 2017 14:15:23 +0200, David Brown
<david.brown@hesbynett.no> wrote:

>On 04/09/17 07:55, Clifford Heath wrote:
>
>> Please, why don't you copy/paste all the content from
>> http://www.memorymanagement.org/
>> There might be some people here who have not read it all yet.
>> Sigh... Yes, there is a whole domain of science here.
>> Far, far more than you have reminded us of yet.
>
>I had a little look at that site. It had some useful explanations, but
>is /seriously/ biased. It gives the impression that the only reason you
>would /not/ want garbage collection is if you want your program to be
>bloated, slow, inefficient, waste memory - and leak resources, have
>buffer overflows and other nasal daemons.
Unfortunately, the pro-GC bias is one that is firmly rooted in reality.

In this forum we get a distorted view because the general skill level of embedded developers is relatively high. In the broader programming world, the average skill level is just slightly above "script kiddy". The headlong march to "managed" languages such as Java, C#, Python, etc. is industry acknowledging that the average programmer simply can't write a complex program without hand-holding.

And it isn't just memory management ... the runtimes of these languages also - in whole or in part - manage files, network connections, threads, synchronize object access, etc.

C isn't even being taught in a lot of schools now, so even those who have a (relatively) recent CS/IS degree may have no experience of programming that requires manual resource management.

All in all, general software development is a sorry state of affairs that is getting worse with every passing year. I have stated in the past my belief that the majority of software "developers" would be doing the world a favor by finding another occupation.

YMMV,
George
On 09/05/2017 09:35 AM, George Neuner wrote:
> Unfortunately, the pro-GC bias is one that is firmly rooted in
> reality.
>
> In this forum we get a distorted view because the general skill level
> of embedded developers is relatively high.
>
> In the broader programming world, the average skill level is just
> slightly above "script kiddy". The headlong march to "managed"
> languages such as Java, C#, Python, etc. is industry acknowledging
> that the average programmer simply can't write a complex program
> without hand-holding.
I used to write a lot of assembly. Then I wrote a lot of C. These days I write a lot of Python. And you know what? I get more done faster with fewer bugs.

Just the same as going from ASM to C, there are things that the machine is just better at doing than you are. Here and there, yeah, I can sit there and hand-tweak better ASM than the C compiler will give me. But for an entire project of any scope? Better to let the optimizing compiler do 90% of the job over 100% of the project than for me to do 100% of the job over only 20% of the project.

GC is the same thing. Think how complicated the simple function "Create a new string from the concatenation of two existing strings." becomes when you start having to worry about malloc/free. Either: A) the function has to accept a pre-allocated buffer, in which case you've moved all the complexity of adding strlens and not getting off by one out of the function, making the function pointless, or B) the function does its own malloc, which you've now hidden under the hood while still forcing the caller to remember to free it at some point. With GC, if you need the new string, you magically create it on the heap, and then you walk away, comfortable in the knowledge that it'll get taken care of.
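Rob's two options might be sketched like this in C (the function names are invented for the example):

```c
#include <assert.h>
#include <stdlib.h>
#include <string.h>

/* Option A: the caller supplies the buffer - and must size it correctly,
 * which pushes the strlen arithmetic (and the off-by-one risk on the
 * terminating NUL) out to every call site. */
static int concat_into(char *dst, size_t dstsize, const char *a, const char *b)
{
    size_t need = strlen(a) + strlen(b) + 1;   /* +1 for the NUL terminator */
    if (need > dstsize)
        return -1;                             /* buffer too small */
    strcpy(dst, a);
    strcat(dst, b);
    return 0;
}

/* Option B: the function allocates - the caller must now remember to
 * free() the result on every path, or the program leaks. */
static char *concat_alloc(const char *a, const char *b)
{
    char *s = malloc(strlen(a) + strlen(b) + 1);
    if (s) {
        strcpy(s, a);
        strcat(s, b);
    }
    return s;
}
```

In a garbage-collected language the entire function collapses to `a + b`, with neither the caller-side sizing of option A nor the ownership obligation of option B.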
> And it isn't just memory management ... the runtimes of these
> languages also - in whole or in part - manage files, network
> connections, threads, synchronize object access, etc.
Having tools to make the routine parts of the job easier so that you can concentrate on the larger problem is a good thing.
> C isn't even being taught in a lot of schools now, so even those who
> have a (relatively) recent CS/IS degree may have no experience of
> programming that requires manual resource management.
>
> All in all, general software development is a sorry state of affairs
> that is getting worse with every passing year. I have stated in the
> past my belief that the majority of software "developers" would be
> doing the world a favor by finding another occupation.
>
> YMMV,
> George
--
Rob Gaddi, Highland Technology -- www.highlandtechnology.com
Email address domain is currently out of order. See above to fix.
On 05/09/17 18:02, Rob Gaddi wrote:
> I used to write a lot of assembly. Then I wrote a lot of C. These days I write
> a lot of Python. And you know what? I get more done faster with fewer bugs.
>
> Just the same as going from ASM to C, there are things that the machine is just
> better at doing than you are. Here and there, yeah, I can sit there and
> hand-tweak better ASM than the C compiler will give me. But for an entire
> project of any scope? Better to let the optimizing compiler do 90% of the job
> over 100% of the project than for me to do 100% of the job over only 20% of the
> project.
...
> Having tools to make the routine parts of the job easier so that you can
> concentrate on the larger problem is a good thing.
Yes indeed. I don't see your statements as being contentious.

Does GC solve all problems?
No. (But it does solve many.)

Does GC have disadvantages?
Yes. (But not as many as some people like to believe.)

Are there situations where GC is contra-indicated?
Yes. (But that's boring, since it is true of all technologies.)
Clifford Heath <no.spam@please.net> writes:
> No, [GC is] *still* not the way to get the best from most embedded
> hardware. It's an acceptable compromise, some of the time, is all that
> can really be said.
Sure, that's absolutely true, for big hardware as well as embedded, but we live in an era where "getting the best from the hardware" isn't an issue for most projects. It's been like that on desktops for decades, and it's now getting like that even for small embedded stuff.

Right now I'm building what amounts to a modified LED blinky. In the old days that would have been a transistor circuit or a 555 timer. Or it could straightforwardly be done with the smallest of PIC 10Fxxxx's or whatever. But instead I'm using the Gemma M0 board that I mentioned, a 32-bit ARM CPU programmed in Python, just because it's cheap enough and it seems like the easiest approach. Sure, if I was making millions of them I'd have to go down the cost-reduction road, but I'm making just a handful. I'm sure the same is true of most things built by contributors here.

Being able to get down to the machine level used to be the starting point of embedded programming, but now I'd consider it further down the path.
David Brown <david.brown@hesbynett.no> writes:
>> http://www.memorymanagement.org/
>
> I had a little look at that site. It had some useful explanations, but
> is /seriously/ biased. ...
You might look at https://www.cs.kent.ac.uk/people/staff/rej/gc.html I've seen the 2011 book and it is great. Hans Boehm also has some good GC pages.
