
Embedded platforms for beginners?

Started by Elof Huang September 1, 2017
On 09/03/2017 06:00 PM, Clifford Heath wrote:
> On 04/09/17 05:58, George Neuner wrote:
>> On Sun, 3 Sep 2017 17:52:42 +1000, Clifford Heath <no.spam@please.net>
>> wrote:
>>
>>> On 03/09/17 16:14, rickman wrote:
>>>> What is GC?
>>>
>>> Garbage collection. Make new objects and just lose them,
>>> the GC will pick them up. You never know when the GC will
>>> run, and you need a *lot* more memory,
>>
>> Those are misconceptions: there are systems that are completely
>> deterministic, others that are very predictable (e.g., GC is
>> schedulable in particular time slots, won't exceed a given percentage of
>> CPU time, etc.), and systems that don't need much additional memory.
>>
>> There even are systems that combine predictability with low memory
>> overhead.
>
> All that is true. None of it invalidates what I said.
>
>>> so it's not a good way to get the most out of an MCU.
>>
>> I would say it depends on the system.
>> My position is that GC has no place in a small system.
>
> Thanks for confirming my position.
Well, given that dynamic allocation has no place in small systems either, GC is a bit of a non-issue. ;)
>
>> OTOH, if you're talking about a complex
>> system with 32-bit controller and 100s of KB (or more) of RAM ...
>
> Then you have resources in such excess that *you don't need*
> to get the best out of it. Exactly what I said, in other words.
Until it's been running long enough that the heap is fragmented into
tiny, tiny bits.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
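[For context, the usual small-system alternative to a general heap is a fixed-block pool allocator: because every block is the same size, the free list can never fragment the way Phil describes. Here is a minimal sketch in C; the names (pool_alloc and friends) are illustrative, not from any particular library.]

#include <stddef.h>
#include <stdint.h>

#define BLOCK_SIZE 32   /* payload bytes per block */
#define NUM_BLOCKS 64

typedef union block {
    union block *next;            /* link, used while the block is free */
    uint8_t payload[BLOCK_SIZE];  /* data, used while the block is allocated */
} block_t;

static block_t pool[NUM_BLOCKS];
static block_t *free_list;

void pool_init(void)
{
    /* Chain every block onto the free list. */
    for (size_t i = 0; i < NUM_BLOCKS; i++) {
        pool[i].next = free_list;
        free_list = &pool[i];
    }
}

void *pool_alloc(void)
{
    block_t *b = free_list;
    if (b)
        free_list = b->next;      /* pop the head block */
    return b;                     /* NULL when the pool is exhausted */
}

void pool_free(void *p)
{
    block_t *b = p;               /* must be a pointer from pool_alloc() */
    b->next = free_list;          /* push the block back on the free list */
    free_list = b;
}

[Because allocation and free are both a couple of pointer operations, timing is deterministic, which matters as much as fragmentation on a small MCU.]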
On 04/09/17 09:13, Phil Hobbs wrote:
> On 09/03/2017 06:00 PM, Clifford Heath wrote:
[snip]
>>> OTOH, if you're talking about a complex
>>> system with 32-bit controller and 100s of KB (or more) of RAM ...
>>
>> Then you have resources in such excess that *you don't need*
>> to get the best out of it. Exactly what I said, in other words.
>
> Until it's been running long enough that the heap is fragmented into
> tiny, tiny bits.
Modern generational GC compacts the heap quite effectively. The only real difficulties are (a) doing that unobtrusively and (b) being certain that you won't ever run out of memory due to leaks.
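[As a rough illustration of (a), incremental collectors are often driven in bounded slices from the idle loop, so GC work never exceeds a fixed time budget. This C sketch is purely illustrative; now_us() and gc_step() are hypothetical stand-ins, not any real runtime's API.]

#include <stdbool.h>
#include <stdint.h>

/* Hypothetical hooks: stand-ins for a real runtime's microsecond timer
   and incremental collector step. */
static uint32_t fake_clock;
static uint32_t now_us(void) { return fake_clock++; }
static bool gc_step(void) { /* do one bounded unit of GC work */ return false; }

/* Run collector work for at most budget_us microseconds, then return
   control to the application. Called from the idle loop. */
void gc_slice(uint32_t budget_us)
{
    uint32_t start = now_us();
    while ((uint32_t)(now_us() - start) < budget_us) {
        if (gc_step())
            break;                /* a full collection cycle just finished */
    }
}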
On 09/03/2017 07:58 PM, Clifford Heath wrote:
> On 04/09/17 09:13, Phil Hobbs wrote:
[snip]
>> Until it's been running long enough that the heap is fragmented into
>> tiny, tiny bits.
>
> Modern generational GC compacts the heap quite effectively.
> The only real difficulties are (a) doing that unobtrusively
> and (b) being certain that you won't ever run out of memory
> due to leaks.
I assume that you haven't got a lot of aliased pointers running round
your code!

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
Clifford Heath <no.spam@please.net> writes:
> * No need to allocate or manage memory (it's GC'd)
> * API access to all port pins, audio, touch, etc
> * No need to touch a peripheral device directly
> * No need to initialize the chip or devices
>
> In other words, it's fairly highly-managed. That's a fine thing, but
> the skills are not easily transferable to other MCUs.
Aren't #2-#4 the same with Arduino and Mbed? Does that super-low-level stuff really matter for most embedded programming any more?

Does the OP have reasonable experience with desktop-level programming? If not, I'd say embedded in general isn't a good place to start; everything is easier on desktop systems.
On 04/09/17 10:16, Phil Hobbs wrote:
> On 09/03/2017 07:58 PM, Clifford Heath wrote:
[snip]
>> Modern generational GC compacts the heap quite effectively.
>> The only real difficulties are (a) doing that unobtrusively
>> and (b) being certain that you won't ever run out of memory
>> due to leaks.
>
> I assume that you haven't got a lot of aliased pointers running round
> your code!
Generational GC relies on being able to move things between heaps, so object handles cannot be pointers (or rather, must be indirect). Languages that do it don't allow you to see the pointer as a number. ... another reason why you can't get the most out of the hardware.
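[To make the indirection concrete, here is a minimal C sketch of a handle table, assuming a collector that calls back when it moves an object; all the names (handle_new, gc_object_moved, etc.) are hypothetical, not from any real runtime.]

#include <stddef.h>

#define MAX_HANDLES 256

typedef size_t handle_t;          /* opaque to user code: an index, not an address */

static void *handle_table[MAX_HANDLES];
static size_t next_handle;

/* Register an object and hand out an index, never the raw address.
   (No bounds checking or slot reuse in this sketch.) */
handle_t handle_new(void *obj)
{
    handle_table[next_handle] = obj;
    return next_handle++;
}

/* Every access pays this extra load -- one reason a moving GC can't
   quite get the most out of the hardware. */
void *handle_deref(handle_t h)
{
    return handle_table[h];
}

/* Called by the (hypothetical) collector after it copies an object:
   only the one table slot needs fixing, no matter how many places
   hold the handle. */
void gc_object_moved(handle_t h, void *new_addr)
{
    handle_table[h] = new_addr;
}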
rickman <gnuarm@gmail.com> writes:
> It's not like "GC'd" is a commonly used abbreviation.
At least among language geeks, it's definitely commonly used.
Clifford Heath <no.spam@please.net> writes:
> Generational GC relies on being able to move things between heaps, > so object handles cannot be pointers (or rather, must be indirect).
The handles are usually still pointers. First you copy the object to the other heap; then later you scan it, copy the objects that the first object points to, and update the pointers in the first object to point to the new locations. This is described pretty well in the SICP chapter on metacircular evaluators (mitpress.mit.edu/sicp).
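[The copy-then-scan scheme described above is essentially Cheney's stop-and-copy algorithm. Below is a minimal C sketch of it, under the simplifying assumptions that every object field is a pointer and that a big-enough to-space has already been set aside; a real collector also needs type maps, root enumeration, and, for the generational case, write barriers.]

#include <stddef.h>
#include <string.h>

typedef struct obj {
    struct obj *forward;     /* set once the object has been copied */
    size_t nfields;
    struct obj *field[];     /* every field assumed to be a pointer */
} obj_t;

static char *to_space;       /* assumed to point at a big-enough semispace */
static char *alloc_ptr;      /* bump allocator within to_space */

static size_t obj_size(const obj_t *o)
{
    return sizeof(obj_t) + o->nfields * sizeof(obj_t *);
}

/* Copy one object to to-space (unless already copied) and return its
   new address. The forwarding pointer is what keeps shared -- aliased --
   references consistent: the second time an object is reached, the
   address of the existing copy is returned. */
static obj_t *evacuate(obj_t *o)
{
    if (o == NULL)
        return NULL;
    if (o->forward)
        return o->forward;
    obj_t *copy = (obj_t *)alloc_ptr;
    memcpy(copy, o, obj_size(o));
    alloc_ptr += obj_size(o);
    o->forward = copy;
    return copy;
}

/* Copy the roots, then scan to-space left to right, copying referents
   and updating pointers until the scan pointer catches the alloc
   pointer. Everything live ends up contiguous: the heap is compacted. */
void gc_collect(obj_t **roots, size_t nroots)
{
    char *scan = alloc_ptr = to_space;
    for (size_t i = 0; i < nroots; i++)
        roots[i] = evacuate(roots[i]);
    while (scan < alloc_ptr) {
        obj_t *o = (obj_t *)scan;
        for (size_t i = 0; i < o->nfields; i++)
            o->field[i] = evacuate(o->field[i]);
        scan += obj_size(o);
    }
}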
George Neuner <gneuner2@comcast.net> writes:
> My position is that GC has no place in a small system. OTOH, if > you're talking about a complex system with 32-bit controller and 100s > of KB (or more) of RAM ...
MicroPython works with some contortions on the BBC Micro:bit, which has 16k of RAM, and much more nicely on the recent Adafruit SAMD21 boards, which have 32k of RAM. I don't know how much Luanode uses.

Camelforth/4e4th (non-GC) runs OK on the MSP430G2553, which has 16k of flash and 512 bytes of RAM. That might be a good way to start, or maybe Amforth on the AVR-based Arduinos. Who really cares about these any more, though, except for legacy products? Everything is 32-bit now.

I bought an Adafruit Gemma M0 board last week and had hoped to try it out this weekend, but have been distracted by other things. Real soon now, though. If it's what it's cracked up to be, it basically puts the nail into the Arduino.
On 04/09/17 15:30, Paul Rubin wrote:
> If it's what it's cracked up to be, it basically puts > the nail into the Arduino.
Lots of things could do that. None of them have. The problem is, too many could; which to choose? I'm not supporting Arduino, I'm just saying that the best solution does not necessarily win mindshare. The hardware is almost immaterial; most people go where there's community.
