Forums

Embedded Basic interpreter recommendations?

Started by John Speth March 9, 2009
On Sun, 15 Mar 2009 18:30:27 GMT, Jon Kirwan
<jonk@infinitefactors.org> wrote:

>scarse
scarce. I guess my 'c's were scarce a little while ago, so I had to use some spare 's's. I've bought a few extra 'c's now. ;)

Jon
On Sun, 15 Mar 2009 06:37:20 GMT, Jon Kirwan
<jonk@infinitefactors.org> wrote:

>On Sun, 15 Mar 2009 07:57:13 +0200, Paul Keinanen <keinanen@sci.fi>
>wrote:
>
>>However, a microcontroller with 16-64 KiB RAM and even some slow
>>serial flash memory could be used to run quite complicated systems
>>with overlay loading. In such systems 16 bit addresses are more than
>>enough.
>
>Which brings up the ancient concept of named and unnamed COMMON blocks
>and CHAINing -- à la mode FORTRAN and BASIC. I can see some definite
>utility here.
If the HW manufacturers would make microcontrollers with sufficient loadable (10-100 KiB) "core" code space in RAM, it might be possible to reduce the total system power consumption, since the permanent "disk" storage only needs to be powered up to load the overlay/chained segment.

Paul
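Paul's scheme can be sketched in a few lines of C. This is a toy simulation, not real hardware code: the flash image, segment table, and the `flash_read`/`overlay_load` names are all inventions for illustration, and the power rail is modelled as a flag (a real design would toggle a regulator or issue the flash's deep-power-down command). The point is just that the "disk" only wakes for the copy into the RAM "core".

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

#define RAM_CORE_SIZE 64            /* loadable code space in RAM (toy size) */

static uint8_t ram_core[RAM_CORE_SIZE];
static int flash_powered;           /* stand-in for the real power rail */
static int flash_powerups;          /* how often the "disk" had to wake */
static int resident_segment = -1;   /* which overlay is currently in core */

/* Simulated serial-flash image holding two 4-byte overlay segments. */
static const uint8_t flash_image[] = {
    0xA0, 0xA0, 0xA0, 0xA0,         /* segment 0 */
    0xB1, 0xB1, 0xB1, 0xB1,         /* segment 1 */
};

struct segment { uint32_t offset, length; };
static const struct segment seg_table[] = { { 0, 4 }, { 4, 4 } };

static void flash_read(uint32_t off, void *dst, uint32_t len)
{
    assert(flash_powered);          /* reading while powered down is a bug */
    memcpy(dst, flash_image + off, len);
}

/* CHAIN to a segment: the flash is powered only for the copy itself. */
void overlay_load(int seg)
{
    if (seg == resident_segment)
        return;                     /* already in core: flash stays asleep */
    flash_powered = 1;
    flash_powerups++;
    flash_read(seg_table[seg].offset, ram_core, seg_table[seg].length);
    flash_powered = 0;              /* "disk" sleeps between overlays */
    resident_segment = seg;
}
```

Note that re-entering the resident segment costs nothing, so the power win depends on how often the application actually chains between segments.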
On Mar 15, 11:15 pm, "Chris Burrows" <cfbsoftw...@hotmail.com> wrote:
> The PC implementation of the Lilith M-Code execution engine occupies 22Kb.
> The *total* M-code files of the full Modula-2 compiler occupy about 60kb.
> However, this comprises five passes each of which is loaded in separately
> (like an overlay), the largest being 16kb.
Interesting stats, getting closer to microcontroller space, but still more than Turbo Pascal ...
> To put everything into perspective the original Lilith only had 256k of RAM
> (organised as 16-bit words) and 44k of that was used for the bit-mapped
> graphics display. The CPU clock speed was about 6MHz, and the hardware
> supported by the operating system included a hard disk drive, laser printer,
> local area network etc.
What was the Lilith debug support like? Single step and breakpoints? Or do we just rely on the silicon debug support and use a host PC for serious debug?

-jg
"Jon Kirwan" <jonk@infinitefactors.org> wrote in message
news:dkgqr4ppgdg2u98ogcsn9pjksd5p2vgrtj@4ax.com...
> > I'm sure that seems like "only" to you.
It is "only" when read in context - only 600K out of the total of 150MB.
> Times have changed. I remember developing applications for fairly
> complex rental management, sitting on an IMSAI 8080 with a TOTAL of
> 16k RAM.
Luxury! Oh - the good old days! Weren't they so much fun! This is 2009 - I also like to revisit the past but I sure would not like to be back there permanently.

My first programming experience was with an Elliott 903 in the UK in the late 1960's. We had to load the ALGOL60 compiler in from paper tape every time we wanted to switch from SIR, the assembler.

A decade or so later my first microcomputer was a Signetics S2650 with a whopping 4k of RAM. Programs had to be typed in in hex and stored to cassette tape. After many weeks of programming I still only managed to fill 1k. Several months ago I was bitten by the nostalgia bug and rebuilt it from scratch. The fond memories didn't take long to vanish once I started to recall how much of a pain it used to be ....

Yet again, later in the 1980's, I can recall spending several weeks just working on a menu system. These days I can concentrate on writing the actual applications and get things done. I am still as aware of memory usage and performance as I always was but don't get hung up about it.

When programming for Windows, if a particular approach takes an extra 50kb but means I can get the job done in a day instead of a month then I'm not going to lose any sleep worrying about it. When programming embedded systems I'd agree - our 1980's strategies and experience really come in handy.

--
Chris Burrows
CFB Software
Armaide: ARM Oberon-07 Development System for Windows
http://www.cfbsoftware.com/armaide
On Mar 16, 6:59 am, -jg <Jim.Granvi...@gmail.com> wrote:
> Interesting stats, getting closer to microcontroller space, but still
> more than Turbo Pascal ...
Yes - but Modula-2 had a lot more than the Turbo Pascal of the same period. Turbo Pascal's two main competitive attributes were that it was fast and it was cheap. As a serious software development tool for medium to large applications it had some severe limitations. It was limited to creating 64K COM files, had no linker, no overlay capabilities etc. etc. Most TP programs I saw at the time were hardly recognisable as Pascal - they looked more like assembler because of all the tricks needed to run on the IBM PC.
> What was the Lilith Debug support like ? - Single Step, and Break
> points ?
It had an interactive multi-window 'post mortem' source debugger which enabled you to drill down the call stacks, identifying source lines and inspecting the values of all of the variables.
> Or do we just rely on the Silicon Debug and use a Host PC for serious
> debug ?
The Lilith was a standalone personal workstation - there was no 'Host PC'.

--
Chris Burrows
CFB Software
Armaide: ARM Oberon-07
http://www.cfbsoftware.com/armaide
On Mon, 16 Mar 2009 11:28:29 +1030, "Chris Burrows"
<cfbsoftware@hotmail.com> wrote:

>"Jon Kirwan" <jonk@infinitefactors.org> wrote in message
>news:dkgqr4ppgdg2u98ogcsn9pjksd5p2vgrtj@4ax.com...
>
>[snip]
>
>When programming for Windows, if a particular approach takes an extra 50kb
>but means I can get the job done in a day instead of a month then I'm not
>going to lose any sleep worrying about it. When programming embedded systems
>I'd agree - our 1980's strategies and experience really come in handy.
Yes, Chris. Similar perspective here. The one thing I'd add to all this is that _if_ BASIC is being placed in resource-starved microcontrollers (and many of them look an awful lot like the resource-starved equipment of yester-year), then the techniques applied back then could have good purchase here. Just a thought.

Jon
"Jon Kirwan" <jonk@infinitefactors.org> wrote in message
news:lifqr4h4rkjsdmo68e7hc2093lao4auu1p@4ax.com...
> On Sun, 15 Mar 2009 21:45:44 +1030, "Chris Burrows"
>
> Well, Lilith had what could only have been considered "HEAVEN" when we
> were working on the timesharing system. 256k of RAM? My gosh! 6MHz?!
> Jeesh, darn! If only.
Are you sure we're talking about the same timeframe? In 1983 my home PC had 512KB of RAM and a 6MHz CPU. Mind you, it was a Sage, which was a great little system compared with what else was around at the time. However, I still would have given my right arm to have had a Lilith to work on.

Chris
On Mar 16, 3:23 pm, cfb <cfbsoftw...@hotmail.com> wrote:
> On Mar 16, 6:59 am, -jg <Jim.Granvi...@gmail.com> wrote:
> > What was the Lilith Debug support like ? - Single Step, and Break
> > points ?
>
> It had an interactive multi-window 'post mortem' source debugger which
> enable you to drill down the call stacks identifying source lines and
> inspecting the value of all of the variables.
But could you also single-step, set breakpoints and watch variables? Any code-size indication for this portion?
> > Or do we just rely on the Silicon Debug and use a Host PC for serious
> > debug ?
>
> The Lilith was a standalone personal workstation - there was no 'Host
> PC'.
I was talking in the context of this thread, and relative to an M-Code implementation on a uC. With a modern uC, you have a reasonable chunk of silicon support for debug, but often intended to be driven from a remote JTAG master. ROM is very cheap, and there are uC solutions with largish stable code in ROM (Maxim's DS80C40x, for example, have 64K ROM).

-jg
On Mon, 16 Mar 2009 13:56:43 +1030, "Chris Burrows"
<cfbsoftware@hotmail.com> wrote:

>"Jon Kirwan" <jonk@infinitefactors.org> wrote in message
>news:lifqr4h4rkjsdmo68e7hc2093lao4auu1p@4ax.com...
>> On Sun, 15 Mar 2009 21:45:44 +1030, "Chris Burrows"
>>
>> Well, Lilith had what could only have been considered "HEAVEN" when we
>> were working on the timesharing system. 256k of RAM? My gosh! 6MHz?!
>> Jeesh, darn! If only.
>
>Are you sure we're talking about the same timeframe? In 1983 my home PC had
>512Kb of RAM and a 6Mhz CPU. Mind you it was a Sage which was a great little
>system compared with what else was around at the time. However, I still
>would have given my right arm to have had a Lilith to work on.
Probably not the same timeframe. I'm talking about machines of 1969 to 1972, not 1983! And if you had 512kb of RAM on your IBM AT in 1983 (which would have had to have been after August or so, memory serving) you would have had to pay the same price my business did for the AT, which was $5,995, I think. About 6 grand. I'm not sure, but it might have been with 512kb. Sounds about right.

(No way I could afford that as a personal machine, though. Not even then and certainly no way a decade before that, which is the period I was discussing about the timeshared BASIC.)

Jon
On Mar 16, 1:56 pm, -jg <Jim.Granvi...@gmail.com> wrote:
> But could you also single step, set break points and watch variables ?
No - it was a 'post mortem' debugger. Typically, with this sort of system you would use defensive-programming techniques, e.g. using assertions to test pre-/post-conditions at strategic points in your code. The intention is to trap any unexpected conditions that occur during testing and then use the PMD to identify the offending code.
> Any code-size indication on this portion ?
A directory listing of debug.obj using Emulith shows 31k (words). However, it would also require the existence of the non-trivial bit-mapped multi-window display management software to function.

Chris
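The assertion style Chris describes can be sketched in C. This is a hypothetical illustration, not the Lilith mechanism: the `REQUIRE` macro and `pm_record` structure are inventions. The idea is that a failed check records the offending location somewhere a post-mortem debugger (or a human on a serial port) can find it, then halts so the state can be examined.

```c
#include <assert.h>
#include <stdio.h>
#include <stdlib.h>

/* Last-gasp record for a post-mortem debugger to inspect after the halt. */
struct pm_record {
    const char *file;
    int line;
    const char *expr;
};
static struct pm_record pm;

/* On failure, save the offending source location and halt; the PMD then
 * identifies the line of code from this record. */
#define REQUIRE(cond)                                            \
    do {                                                         \
        if (!(cond)) {                                           \
            pm.file = __FILE__;                                  \
            pm.line = __LINE__;                                  \
            pm.expr = #cond;                                     \
            fprintf(stderr, "precondition failed: %s (%s:%d)\n", \
                    pm.expr, pm.file, pm.line);                  \
            abort();  /* halt so the core can be examined */     \
        }                                                        \
    } while (0)

/* Example: a routine that states its preconditions up front. */
int mean(const int *v, int n)
{
    REQUIRE(v != NULL);
    REQUIRE(n > 0);
    long sum = 0;
    for (int i = 0; i < n; i++)
        sum += v[i];
    return (int)(sum / n);
}
```

On a real target the `abort()` would be replaced by whatever halts the machine in a state the post-mortem tool can read.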