EmbeddedRelated.com
Forums

Low level programming

Started by aniruddhpr4 5 years ago · 25 replies · latest reply 5 years ago · 2197 views

I am a newbie when it comes to embedded programming. I can program simple stuff if I am given the header files and all the startup code for ARM or AVR, etc. But I want to dive deeper; my goal is to write every piece of code for a given piece of hardware.

I have researched a bit and understood how a program is executed from the ground up. For writing low-level code, such as initialization code, many people suggested reading the datasheet, but I am unable to grasp from the datasheet what exactly needs to be done.

For example, I chose the ATmega328P for this purpose and tried reading through the datasheet to learn how to write the code for the reset vector and all that low-level programming, but in vain; I don't know where to even start.

In theory it seems simple enough (I referred to O'Reilly's Programming Embedded Systems in C and C++) and certainly possible. But I cannot find a proper source anywhere that pairs the theory with practical application.

Can anybody point me in the right direction? Where should I begin in a datasheet? What should I look for? If you can give me a complete road map, it would be extremely helpful.

Reply by QL, March 18, 2019

One resource you might want to try is the "Modern Embedded Systems Programming" video course on YouTube. This course uses a modern ARM Cortex-M microcontroller to teach low-level programming in the first 20 lessons or so, covering everything you ask for (machine code, register access, function calls, startup code, interrupts, etc.). The later lessons move on to explain the architecture of embedded systems.

Reply by Bob11, March 18, 2019

In high school, a long, long time ago, I wire-wrapped my own computer using a 6800 with front-panel toggle switches and a couple of 7-segment displays, and toggled the assembly code in by hand. I actually got it all working. That's the REALLY REALLY low-level way to do it!

Presuming you're taking the sane approach these days and starting with a development board for your chosen micro, the steps roughly look like this:

1) Start by learning how to read and write the memory. These days that typically means learning how to load the code into FLASH. Decades ago that meant EPROM burners and UV erasing; today it means you have the appropriate debugging/programming probe. Verify you can write/read memory at the reset vector.

2) Carefully read up on the reset vector. Does it have to be a branch instruction? Where do you put the code in memory? Don't worry about interrupts or anything else at this stage, except for any watchdog timer that might be enabled by default and bark at you unexpectedly. Usually they're not enabled by default, which is preferable; see the datasheet.

3) Your development board should have a port driving an LED or something similar to monitor code execution. (It also helps to have an oscilloscope to probe pins when you're exploring micros at this level.)

4) Write SIMPLE assembly code to do something VERY simple, like branch from the reset vector to a simple routine that toggles the LED on/off every second. This will involve learning how to get the loader to put the code and the reset vector in the right spots in memory, how to enable the port the LED is on and set it to output mode, and how to turn off any watchdogs or enable any internal clocks the processor might need. Don't try to use the stack or anything else at this stage. Just set the LED port bit, load a register, count down, clear the LED port bit, load a register, count down, and jump back to the start. (A C rendering of this step is sketched after step 6.) The datasheet is your friend here, along with any app notes the vendor provides on how to bring up your microprocessor. I would recommend using command line tools if possible at this stage, because most vendors today ship GUI tools that will "boot you" far beyond these initial learning steps and hide all the implementation details you are trying to learn.

5) Once you are at the point where you have an LED flashing, you're halfway home. The next step is to bring up some more peripherals. This usually involves writing some simple functions to access the ports involved, which means adding some code to set the stack pointer up in memory properly so you can jump and return from subroutines. Once you've got that done, you can try enabling a simple interrupt handler or two and checking that they work as expected.

6) If you're ready to move on to C, it's time to learn about crt0, the C startup code that runs before main(). There's not too much to it: the reset vector jumps to code that sets up the stack pointer, loads a few registers with the addresses of the data areas in memory, clears out the static (.bss) area, may copy some initialized data from FLASH to RAM, and does whatever other setup the ABI of the C compiler for that processor requires. The vendor GUI tools often hide this stuff, so you may have to dig a bit to find the requirements for your board/compiler combo. This is typically done in assembly, with a final jump to main(). You also have to write some code to decide what to do if main() returns.
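To make steps 4 and 6 concrete, here are two sketches. First, the step-4 blinker rendered in C for the ATmega328P the original poster picked (assuming an Arduino-Uno-style board with the LED on PB5 and default fuses; the loop counts are rough, so tune them by eye or scope - and, as said above, doing it first in assembly teaches even more):

    #include <avr/io.h>
    #include <stdint.h>

    int main(void)
    {
        DDRB |= (1 << DDB5);                    /* make PB5 an output */
        for (;;) {
            PORTB |= (1 << PORTB5);             /* set the LED port bit */
            for (volatile uint32_t i = 0; i < 200000UL; i++) ;  /* count down */
            PORTB &= ~(1 << PORTB5);            /* clear the LED port bit */
            for (volatile uint32_t i = 0; i < 200000UL; i++) ;  /* count down */
        }
    }

Second, a minimal hand-rolled crt0 of the kind step 6 describes, sketched in C for a GNU toolchain on a Cortex-M part. The symbol names (_estack, _sidata, _sdata, _edata, _sbss, _ebss) and the section name .isr_vector follow the common GNU/CubeMX linker-script conventions; yours may differ, and on an AVR the details change although the job is the same:

    #include <stdint.h>

    extern uint32_t _estack;            /* top of stack (initial SP)      */
    extern uint32_t _sidata;            /* flash image of .data           */
    extern uint32_t _sdata, _edata;     /* .data bounds in RAM            */
    extern uint32_t _sbss, _ebss;       /* .bss bounds in RAM             */
    extern int main(void);

    void Reset_Handler(void)
    {
        /* Copy initialized data from flash to RAM. */
        const uint32_t *src = &_sidata;
        for (uint32_t *dst = &_sdata; dst < &_edata; )
            *dst++ = *src++;

        /* Zero .bss so uninitialized statics start at 0, as C requires. */
        for (uint32_t *dst = &_sbss; dst < &_ebss; )
            *dst++ = 0;

        main();
        for (;;) ;                      /* if main() returns: here, just hang */
    }

    /* Vector table: word 0 is the initial stack pointer (the Cortex-M
       hardware loads SP from it at reset), word 1 is the reset vector. */
    __attribute__((section(".isr_vector"), used))
    void (* const g_vectors[])(void) = {
        (void (*)(void))&_estack,
        Reset_Handler,
    };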

If you've gotten the above working, you've basically accomplished your goal. You've now gotten your processor from the reset vector, to the minimum processor setup necessary to begin executing code, to setting up the various address spaces, stack and so forth needed for interrupt handlers to work and C to run properly, and now you're in main(). The world is yours!


Reply by aniruddhpr4, March 18, 2019

Thank you so much! This was really helpful!

Reply by JackCrens, March 18, 2019

Bob11 nailed it!!!

Jack

Reply by dnj, March 18, 2019

I would heartily agree with QL on preparation. Diving into bare-metal programming takes some baby steps to keep from drowning in the deep end of the pool.

There was a time when processors, even on business machines, were rather simple to program in assembly language. I cut my programming teeth (in the 1960s) on course after course that required some form of assembler. FORTRAN just didn't cut it for everything. Those machines are extinct, so you should look for a semi-modern alternative for getting the feel of assembler or even embedded C.

I had a project thrust upon me which used an 8051 core. I had to refresh all of the things I had forgotten over the years, because I was entrenched in C and C++. The register structure is simple, the instructions are simple, and the capacity is limited. But these little processors are the core of millions of devices. Maybe billions.

Money well invested would be to find a development kit using an 8051 and see if you can get it started and booted, and do some simple things as suggested in other replies. Once you have these basics in place, you can expand your knowledge to all of the peripheral registers, addressing modes, interrupt vectors, etc., which you will find in a far more complex ARM Cortex processor.

The knowledge is good as you really get to see what the processor is doing as opposed to calling some high level API to do everything for you. You will start to think like a computer and understand what is going on.

Good Luck. It is an enlightening experience.

If you can find copies of Dr. Knuth's programming books, you will find the MIX assembly language, which he invented to illustrate the algorithms in his books. They are old, but they are classics.

Reply by aniruddhpr4, March 18, 2019

Thank you. I had a course on the 8051 in an earlier semester and experimented with it, but not in the way you have suggested here. So I will certainly give it a try.


Yes, I will certainly look into Dr. Knuth's books. Thank you very much!

Reply by rtomkins, March 18, 2019
First, my qualifier. I love free shit!


STMicroelectronics makes a number of low cost modules called Discovery Boards. I have some experience working with a number of STM32 based Discovery Boards.

STM has built and actively maintains a software development infrastructure in support of the STM32 product family. This includes programming applications, tracing applications, and something they call STM32CubeMX. Along with the Cube are downloadable libraries of sample source code (initialization, drivers, example peripheral code, and more) as well as an integrated RTOS (FreeRTOS) and STemWin, a professional graphics stack library. In addition, they recently purchased Atollic to acquire TrueSTUDIO, an Eclipse-based C/C++ integrated development tool.

All of the above, with the exception of the Discovery Boards, is FREE SHIT!

This Discovery Board, the 32F746GDISCOVERY, is available from DigiKey for $57.00 CAD.

The sample source code available through STM32CubeMX is extensive, and for anyone studying 32-bit MCU development based on ARM technology, this is the cat's meow.

The STM32CubeMX source code has examples for many of the popular development tools - Keil, IAR, and others - as well as TrueSTUDIO, and TrueSTUDIO will translate the source code examples that are in OpenSTM32 format.

I believe that writing all the code yourself can be an admirable endeavour, but given the resources available through all the free tools, if you are developing a product for the marketplace you get a much better ROI (return on investment, a.k.a. your time and effort) by using the canned code provided by the manufacturer and getting your product in front of the consumer before your competitor does.

In closing, the STMicroelectronics infrastructure includes extensive documentation and descriptions for the various libraries and MCUs; you can't go wrong using their product set.


Reply by aniruddhpr4, March 18, 2019

Haha, free shit is what made me want to dive deeper. I have looked for alternatives to all the paid stuff ever since I started using Linux. I have access to an STM32 Discovery board. I am programming it using Eclipse CDT and the GNU MCU toolchain, but I haven't tried CubeMX. So I will definitely give whatever you are suggesting a good try. Thank you.

Reply by Rallygoer, March 18, 2019

The ATmega328 does not really need any init code to get it going. You just need to know the few basics of where the jump vectors are. In its simplest form, the reset vector is at address 0x0000, followed by the interrupt jump addresses; you just need to make sure that at address 0x0000 you put jmp START. Nothing else needs initializing to get you going. There is a nice little tutorial here showing just how little assembler you need to get going. It expands enough to keep you interested, and actually creates the universal embedded "hello world".

https://medium.com/jungletronics/meeting-assembly-...
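For when you later move from assembler to C on the same part: avr-gcc and avr-libc build that same jump table for you, and the ISR() macro is how you hang a handler on one of its interrupt slots. A minimal sketch (register and vector names are from the ATmega328P datasheet and avr-libc):

    #include <avr/io.h>
    #include <avr/interrupt.h>

    volatile uint8_t ticks;         /* shared with main(), hence volatile */

    ISR(TIMER0_OVF_vect)            /* fills the Timer0-overflow slot in the table */
    {
        ticks++;
    }

    int main(void)
    {
        TCCR0B = (1 << CS02) | (1 << CS00);  /* run Timer0 from clk/1024 */
        TIMSK0 = (1 << TOIE0);               /* enable the overflow interrupt */
        sei();                               /* global interrupt enable */
        for (;;)
            ;                                /* ticks advances in the background */
    }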


But as other posters have said, it is always a good idea to have not only the datasheet but the instruction-set documentation too, and always keep an eye out for the errata sheets....

Reply by mr_bandit, March 18, 2019

First, LOTS of good stuff in this thread!

Second - my father, who at one point (late '60s, early '70s) was one of the top operating-systems designers, taught me how to program - at the time, in 8080 assembly. Debugging meant using DBG to apply "patches": put a JMP to an unused memory area, hand-assemble the mnemonics into binary in that area, then JMP back. (The call stack was very small.) Obviously, keeping good notes was critical. Initial programming was done in ASM on coding sheets, which were put onto punch cards by a punch-card service. Running the assembler was non-trivial and slow.

He used a format that I have been using for ASM programming since then. ASM is as low-level as "normal" humans can (usually) handle, and the code can get obscure. This format helps with understanding what is going on.

(He also gave me this advice about commenting: assume that your code will be updated/debugged by a serial killer with a hair trigger, and your name is in the header. Keep in mind you will be a different person in 6 months - or next week.)

The format is:

A comment block, with ALL used registers and what they contain (e.g. "R5: number of times to loop"), followed by a comment on WHAT you are going to do (e.g. "Run through the foo[] array calling bar() with the loop counter and the reading from device baz, setting foo[loop_counter] to the bar() results"). You can also lay out what to do with pseudocode (google "pseudo code"). Some examples: http://www.unf.edu/~broggio/cop2221/2221pseu.htm

End the comment block with ALL registers you will use (R5 = number of times to loop; R6 = loop counter; R7 = parameter to bar() and return value from bar(); R8 = results from device baz; R9 = scratch). Each of these should be on its own line, formatted cleanly for easy understanding - this editor's formatting is too simple for that.

THEN - 10..12 lines of ASM code. If you have more than that, your code is too complex - refactor to simplify. Keep the format cleanly lined up and readable!! Any comment on a line should explain something obscure ("this is a volatile device register", "read MS byte of U16"), not state the obvious ("add 1 to R5").

THEN - the next comment block

A similar method works for high-level languages - a comment block with the variables you will use, the algorithm, and the results. (A sketch in C follows.)
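For instance, carrying that format into C might look like this (a made-up averaging routine, purely to show the shape):

    #include <stdint.h>

    #define N_SAMPLES 16

    static uint16_t sample[N_SAMPLES];   /* raw ADC readings */

    /*
     * Average the readings in sample[].
     *
     * Variables used:
     *   sum -- running total; wide enough that 16 x 16-bit values cannot overflow
     *   i   -- loop counter
     *
     * Algorithm: accumulate all N_SAMPLES readings, divide once at the end.
     */
    static uint16_t average(void)
    {
        uint32_t sum = 0;
        for (uint8_t i = 0; i < N_SAMPLES; i++)
            sum += sample[i];
        return (uint16_t)(sum / N_SAMPLES);
    }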

Remember KISS: "Keep It Simple, Stupid." The cleverness is in the algorithm, not the code. The code should read like "run, Dick, run".

Look at the listing. See how the ASM mnemonics are converted to machine code.

Reply by CustomSarge, March 18, 2019

Off the top of my head:

> Try to find an assembler book / guide for the processor you chose.

> Get very familiar with whatever IDE (integrated development environment) goes with that processor.

> Build a uC circuit with a gate / transistor driver for an LED and write the code to make it flash.

> Change the build to an LED driver like an STP04CM05 and code it to flash, then sequence.

> Keep incrementally building more complex stuff and coding more things for it to do.

> Don't get caught up trying to learn advanced techniques before the basics are a solid foundation. Learn I2C, LCDs, keypad input, and other necessary devices. You'll get there with patience and perseverance.

> Coding interrupts, the when and why, will happen when you cross into applications that have multiple concurrent functions.

A rough outline, nothing more... Good Hunting  <<<)))


Reply by Jim_255, March 18, 2019

aniruddhpr4,

My first question is: why? Your processor of choice is a 20 MHz, 8-bit device with a 50 ns instruction cycle time. Honestly, the C language tools and libraries for that device should generate sufficiently fast code for even the tightest interrupt-service timing in nearly anything you would use that part for.

I am a seasoned embedded systems engineer with over 40 years of experience. I've been working on various embedded systems since before the C language. In the embedded environment, the only programming language we had was assembler. I actually wrote a great many things in MACHINE CODE before the assembler application existed. We did EVERYTHING from scratch. If that's really where you want to go, throw out the C compiler and use the assembler. Each processor - AVR, ARM, PIC, Z-80, 8086, ... - has its own assembly language, and because of this you will have to learn a new language for each processor you use. In a nutshell, there is no "complete roadmap" because each processor is different.


-Jim

Reply by JackCrens, March 18, 2019

Jim, like you I'm a "seasoned" computer guy.  Like you, I started writing code long before C was available.  But I'm puzzled by your comment that you had to write MACHINE code because no assembler was available. AFAIK, assemblers have been around almost as long as computers.  Aside from outliers like the EDSAC, or even, for that matter, the Babbage engine, the first assembler was written for the IBM 701, ca. 1951.  I myself was writing IBM 650 code ca. 1956.

So I'm sure that you don't mean "before assemblers existed." I'm guessing that you must mean you were writing for a specific processor before there was an assembler for that processor. A flight computer, perhaps?

But even then, _NOBODY_ has to write programs in "machine code." AFAIK, no "modern" digital computer (meaning, since 1948) has ever been built that didn't have mnemonics associated with the instruction set.

I don't remember all the 8080 mnemonics anymore, but I do recall that the hex code for CALL was CD, RET C9, JMP C3, etc. If you truly don't have an assembler, it's an easy thing to write down your program as mnemonics, then hand-assemble to the equivalent hex codes.

BTW, I have met a few folks that prided themselves on doing things the hard way. In 1961, I actually had a blowhard systems programmer declare proudly (and I paraphrase),

"Assembly language is for wimps! REAL programmers code in absolute binary!"

He was an idiot, of course. Which I'm quite sure you aren't.

So, again, I'm puzzled by your assertion that you wrote MACHINE code from scratch.  Would you care to elaborate?






Reply by Jim_255, March 18, 2019

Ok I'd be happy to elaborate.

I was designing embedded systems with the 8080/8088, Z-80, 6502, and several other processors as early as 1977, while I was in college. Neither the Z-80 nor 8080/8088 assemblers were commercially available until the early 1980s. We had a trainer in college called a Bi-Tran Six. We programmed it in octal; that's where I cut my teeth on programming as an Electronics Engineering major. https://imgur.com/gallery/INabLxA


Reply by JackCrens, March 18, 2019

Jim, thanks so much for the extra details.  I see that you & I have been down many of the same roads.

I'm still a little confused about the dates, though. Are you 100% CERTAIN of those dates?

In 1974, I was manager of a company called Comp-Sultants in Huntsville, AL. We were building embedded systems around the Intel 4004, then the 4040.  We even sold a computer kit: http://www.oldcomputermuseum.com/micro-440.html

Now, Intel had been marketing the 4004 since 1971, and you just have to know that Intel would never sell chips without a development system so we could actually use it. We had one, a single-board computer with a 4004 chip and some RAM, and an assembler and debugger in ROM. Also an EPROM reader/burner.  I/O was via a Teletype ASR-33.  Later on, we upgraded the ROM so as to assemble to the 4040. I myself wrote many programs on that board -- all before 1976.

In 1975, we got a contract to develop 8080 software for Scientific Atlanta.  By that time, Intel had a very nice computer, the Intellec 8, with 64K RAM and the assembler/debugger in ROM. I wrote all the software for that project.

Bottom line: We seem to have done a lot of the same things with the 8080 and Z-80, but I still can't reconcile your comment that no commercial assemblers were available until almost 1980.

Jack


Reply by aniruddhpr4, March 18, 2019

The answer to why is, I simply want to learn what happens under the hood.

I am learning assembly as well, but even assembly has abstractions - for example, the Keil assembler requires a startup file. So a lot of abstraction is presented, which I don't like. I want to be as close to the device as possible. Anyway, thank you for the reply.

Reply by dbrion06, March 18, 2019

Well, I agree with Jim:

Forty years ago, before coding in ASM, I wrote out (with pencil, paper, and an eraser, for typos and errors) what the code should do, using Algol/Pascal; both languages existed and were well known, though there were no compilers for our machines.

Now there are optimizing cross compilers and code checkers (uninitialized variables are detected, and such checks are CPU-agnostic): both gcc (for AVRs, ARMs, and ESPs) and sdcc (for Z80- and 8051-derived parts) offer some amount of optimization.

You should ask yourself the question: can I code in ASM better than optimized C(++)? Both sdcc (trivially) and gcc can emit their generated assembly, so the question can be answered for any given problem. And to understand the (optimizer-)generated code, a Google search for "avr assembly instructions" will bring back documentation that was hard to find a millennium ago, such as https://www.microchip.com/webdoc/avrassembler/avra...


There is another issue when self-teaching assembly: unless one finds books and the time to read them (one should), one reads one's own errors and copies them (generated assembly has fewer horrors). Forums try to correct the main issues, but cannot fix bad habits.

You should begin with a good knowledge of C(++) and have a look at the generated assembly. Perhaps a simulator (I have never used one) can be of great help.


I agree there are (rare) cases where inline assembly is useful (every C compiler copes with inline assembly, AFAIK): I know of the recent NeoPixel library, where cycle-accurate timings are needed (see https://github.com/adafruit/Adafruit_NeoPixel/blob... for a lot of ASM), and avr-libc (used by Arduino, but hidden). But the vast majority of code uses little assembly, if any at all.


And the means by which programs are downloaded ("bootloaders") exists before your init code runs (otherwise, how could one install the reset vector or the interrupt handlers?).

Reply by jms_nh, March 18, 2019

For PIC microcontrollers, try using MCC -- it sets up the registers for the different peripherals and will give you a good starting point, so you know what to look for in the datasheet when you need to fine-tune things. (Although theoretically MCC is supposed to support something like 90-99% of use cases, depending on the peripheral.)

Also, the idea of "low-level programming" via processor registers is completely independent of the choice of assembly vs. C/C++. Some answers here mention assembly, but that's not what you're asking about. You can access CPU registers in C or C++ just fine; you just need the right #include files and linker scripts so that the compiler and linker generate the appropriate code. (Fine print: the compiler owns certain primary registers that you shouldn't touch in C, like the program counter, stack pointer, and any ALU registers.)
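As a sketch of the mechanism (the register name and address below are invented for illustration; real ones come from the vendor header and datasheet):

    #include <stdint.h>

    /* Normally the device header (<xc.h>, <avr/io.h>, ...) defines this
       for you; it is expanded here to make the mechanism visible. The
       address 0x0400 is made up. */
    #define LATB  (*(volatile uint16_t *)0x0400u)

    void led_on(void)
    {
        /* volatile forces a real bus access instead of a cached value */
        LATB |= (1u << 3);
    }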

Reply by JackCrens, March 18, 2019

Hi, ani!  I've read your inquiry and the replies, and have a couple of bits of advice. Some of this is going to rub other folks the wrong way, and I apologize in advance for any offense.

First, pay no attention to all the folks telling you not to do what you want to do. It's very true that the easiest way to build AVR-based systems is to buy an Arduino and write C++ code in the IDE, making heavy use of the many libraries available. 


But you made it clear that you want a deeper understanding of the CPU core. IMHO, nobody has the right to tell you that you're foolish to want to learn it.

Second, pay no attention to advice from people who mention the 8051, or writing code before assemblers existed (that would have to be before 1954, maybe earlier).

The thing is, early microprocessors like the 8080, Z80, 8051, etc. were from a different era, when the CPU was one chip on a motherboard. RAM, ROM, I/O devices, etc. were hard-wired and supported by discrete chip-select logic.


Modern microcontrollers are different beasts altogether.  They're Systems on Chip (SoC) devices, and all the logic is programmable.


Back in 1994, I had a contract to build a system around the Motorola 68332. It was not exactly SoC -- RAM and ROM were still separate -- but the counters, timers, I/O devices, and even the chip-select logic were all on the chip.  Because of this, the first thing you had to do was to write the initialization code.

The Good News is, Motorola had a HUGE User's Manual that described everything that needed to be done to initialize the chip.


The Bad News was, the manual was inscrutable. It tended to say "you could do this, or then again you could do that," but never explained the pros and cons of the alternate approaches.


Fortunately, Moto also had an expert -- read ONE (1) expert -- whom I only knew as Charlie.  I'm guessing that Charlie was spread pretty thin, but he would always be accessible within 24 hours, and would always take the time to explain the pros and cons of the different approaches.


Many thanks to Charlie, I was able to create robust startup code that properly initialized the memory space, the counter-timers, the watchdog timer, the UARTs, etc., etc., and the interrupt vectors. The project was a success and the product was delivered on time and on budget.

The other folks are quite right when they say that you don't have to do those kinds of things anymore, because libraries exist.

Even so, I found it eminently satisfying to know _EXACTLY_ what the startup code was doing, and how I could change it myself, if I wanted to.


I'm betting that you'd like that, also.





Reply by aniruddhpr4, March 18, 2019

Yes! You are on target with what I want to do. I guess the harder we toil, the better the satisfaction. Anyway, thank you for the reply.

Reply by mr_bandit, March 18, 2019

Reading datasheets is an art and a science. And sometimes they lie. Be aware of the errata sheets - they record what went wrong, and can save you a lot of time.

You can write "raw" C for Arduinos. Go to the Atmel (now Microchip) site for details. The free C IDE is https://www.microchip.com/design-centers/8-bit/avr...

To make things a little easier, get a Teensy 2.0 or 2.0++, because the bootloader is really easy. You would write C (either in the IDE or in an external editor like vim) and set the IDE to generate a linked file in a particular location. The Teensy bootloader is set to the same location. To load the new code, you push the button on the Teensy. (All of this is explained on pjrc.com. Be aware the Teensy 3.x boards use an ARM, so the toolchain is different.)

The very first thing to do is blink an LED. Look at the code for the Arduino example ("Blink", if I recall). Look at the #include files and the code, then look up the registers that are used in the ATmega328 datasheet. You need to do a few things with any I/O register: set the direction (in/out) and, for outputs, the initial value of each output pin, etc.

It really helps to understand Boolean operations: AND and OR (in C, & and |), inversion (~), and logical NOT (!). This is how bits are manipulated in general, and in registers in particular.
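A sketch of those operators at work on the ATmega328P's port B registers:

    #include <avr/io.h>

    void bit_demo(void)
    {
        PORTB |=  (1 << 5);        /* OR: set bit 5, leave the others alone */
        PORTB &= ~(1 << 5);        /* AND with inverted mask: clear bit 5   */
        PORTB ^=  (1 << 5);        /* XOR: toggle bit 5                     */

        if (PINB & (1 << 4))       /* AND: test whether input bit 4 is high */
            PORTB |= (1 << 5);
        if (!(PINB & (1 << 4)))    /* !: the logical "is it low" test       */
            PORTB &= ~(1 << 5);
    }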

QL gives you a good link. If you go with an Arduino, you will need to compare/contrast Atmel and ARM.

BUT - if you go with QL's lesson link, use the Teensy 3.0 - the bootloader sequence should be about the same.

When you get into peripherals, start with SPI and get an SPI chip breakout from Adafruit or SparkFun. (You could also use I2C, but it is harder to bit-bang.) You could also use this as a chance to use the built-in peripherals (SPI, I2C, etc.) - write polled code first.
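To see why SPI is the friendly one to bit-bang, here is a sketch of a mode-0, MSB-first master using the ATmega328P's port B pins (PB3 = MOSI, PB4 = MISO, PB5 = SCK; it assumes PB3/PB5 were already set as outputs and PB4 as input, and any real slave also needs its chip-select line handled):

    #include <avr/io.h>
    #include <stdint.h>

    uint8_t spi_transfer(uint8_t out)
    {
        uint8_t in = 0;
        for (uint8_t bit = 0; bit < 8; bit++) {
            if (out & 0x80)                /* present the next data bit on MOSI */
                PORTB |= (1 << PB3);
            else
                PORTB &= ~(1 << PB3);
            out <<= 1;

            PORTB |= (1 << PB5);           /* rising clock edge: both ends sample */
            in = (uint8_t)((in << 1) | ((PINB >> PB4) & 1));
            PORTB &= ~(1 << PB5);          /* falling edge: shift the next bit */
        }
        return in;
    }

Once the polled version works, the hardware SPI peripheral does the same job with far less fuss.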

And get a Rigol oscilloscope (http://www.saelig.com/product/ds1054z.htm, under $400). Far be it from me to mention a simple hack, found on the interwebs, that will convert this scope from 50 MHz to 100 MHz by changing one byte in EEPROM. You will first think 4 channels is way too many; then you will be glad for 4 channels...

Good luck and success!

Reply by Solderdot, March 18, 2019

You want to write every piece of code for a given piece of hardware. I assume most of that code will be C and C++; doing everything in assembly language is... possible, but why would one want to throw away all the things that make life a bit easier?

If I understand correctly, your trouble starts with reset and the code that is executed right afterwards, up to the point where your program has established an environment capable of running the binary the compiler created.

This part is indeed not trivial to write on your own, because you must cover not only the things the processor requires but also the things the compiler expects to be done.

Examples:

- Some processors have a ROM overlay at the boot address for executing a boot loader. This overlay needs to be switched off, which is done by programming some registers.

- Some processors have built-in memories which must be enabled before being accessed; again, you most likely need to write some register correctly for that.

- The stack memory must be set up, so that when calling a subroutine the return address can be pushed to a meaningful address - and popped upon return.

- A decent C compiler will expect all global variables to be initialized to zero during startup. There must be some code that does this, since the reset value of memory is normally unspecified and surely not zero. Other memory areas need to be initialized to other values, e.g. global variables whose initialization value is not zero. (See the sketch after this list.)
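A tiny example of that last point, assuming the usual GNU section names (an illustration, not tied to any particular part): the compiler parks these two variables in different sections and silently relies on the startup code to make both correct before main() runs.

    int mode = 3;     /* goes in .data: initial value copied from flash to RAM  */
    int error_count;  /* goes in .bss: startup code must zero it, as C requires */

    int main(void)
    {
        /* error_count == 0 here only because the startup code cleared .bss */
        return mode + error_count;
    }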

Only a little of that will you find in a datasheet. You may find the location and structure of the reset vector there, but the rest is about establishing the C environment and interworking assembly language with C.

My proposal: Write a little application in C - a simple application which does nothing beyond incrementing a counter - using all the stuff delivered with your development environment. Download that executable into the processor and single-step through the disassembled code with a debugger, from the reset vector on, until you reach main(). You will find a surprising amount of stuff being executed there, much of it coming from the compiler's runtime library.

In parallel, get acquainted with the ASM-C interworking standard (in the ARM world that would be the "procedure call standard"), so you understand how functions are invoked and how parameters and return values are passed.
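As a taste of what such a standard pins down: under the ARM procedure call standard (AAPCS), the first four integer arguments travel in r0-r3, the result comes back in r0, and r4-r11 must be preserved by the callee. So for a small C function the register traffic is fully predictable:

    /* Under AAPCS: value arrives in r0, factor in r1, product leaves in r0.
       A leaf function this small needs no stack at all. */
    int scale(int value, int factor)
    {
        return value * factor;
    }

Single-stepping the disassembly of a call to scale(), as proposed above, is a quick way to watch those rules in action.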

The trouble is: those things vary from processor family to processor family. Not even in the ARM world do the reset vectors look the same on all devices, and subroutine calling on an Intel machine is completely different from what you will see in the ARM world. So first decide on which HW platform you want to educate yourself, and only then do the deep dive. I've done everything you're intending to do in the ARM world, and even for me it is not a trivial task to map my knowledge onto other processor architectures, such as ARC or Tensilica.

Good luck and have fun!

Reply by aniruddhpr4, March 18, 2019

Thank you so much! That was a lot of valuable information. I was going through some articles and blogs where a lot of linker tweaking is done, so I guessed that I had to do a bit of compiler-environment setup too. Anyway, I am willing to go to any length to know it all. At least for a single device.

Reply by atomq, March 18, 2019

Around 20 years ago I was in the same state of mind. My MSc thesis was based on the 8052 and fortunately - for various reasons - I was pushed to use assembler. I suppose it was a good starting point. The good thing about it was controlling all the hardware the way I needed, plus the experience; the downside was that it took time to grasp the whole system. My next step was moving to the AVR family and my first taste of C coding. 8-bit AVRs are not too complicated in terms of architecture, with pretty intuitively organized SFRs and memory map. C gives you all the overhead required to build "an application"; you can even compose your own objects to mimic C++ object-oriented programming, with full control over your hardware.

I would like to point out that you can still use assembly code whenever and wherever it is required while programming in C, once you know how a function is invoked and how parameters are passed to it.

Knowing the device is a must. That is why you need to have the datasheet of the micro you use at hand. The first thing is to learn how a particular piece of hardware works (the interfaces, ADC, watchdog, etc.) and then how to control or configure it with SFR bits. You can find all the required information in the datasheets.

A lot of effort, I'd say, but the reward is the fun of controlling the whole engine the way you want. The next step I would suggest is to move to a more sophisticated architecture like the STM32 and use Cube, where all the init code can be written for you and you get a pre-configured engine to drive with your own code.

I hope it helps. Best of luck. Adam.

Reply by waydan, March 18, 2019

This thread is a little old, but I think I’m in a similar place to you. You’ve already gotten lots of good advice from the experienced members of the forum; I'm adding my two cents precisely because I am not in that group. Last year I got an ARM-based NXP development board (FRDM-KV11Z) and I hope some notes from my naïve endeavor can help you.

After getting my development board, I downloaded the trial IAR tool chain and wrote a program to flash some LEDs. Dr. Miro Samek’s YouTube videos were a great help in getting me up to this point, and I’d recommend you watch the first 5-10 videos for a crash course.

The next step, I decided, was to write some software using the vendor-supplied API, but after reading through the documentation, I found their structure to be confusing and messy. I started to consider writing my own hardware abstraction layer (HAL) and read about implementing one in C. During my experimentation, I got a lot of mileage from compiling with the -S flag in LLVM and GCC to generate assembly code. By reading this compiler-generated assembly, I started to appreciate what the chip is doing to execute code at a low level. It’s useful to have the ARM user guide on hand to help in reading the mnemonics.
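For anyone who wants to repeat that trick, the recipe is just a small C file and the -S flag (the compiler name here assumes the GNU ARM toolchain; adjust for yours):

    /* Compile with:  arm-none-eabi-gcc -O2 -S bits.c
       then read bits.s to see what the optimizer made of this. */
    #include <stdint.h>

    uint32_t set_bit(uint32_t word, uint32_t n)
    {
        return word | (UINT32_C(1) << n);
    }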

Shortly after I started writing a HAL, I got fed up with C. I was also learning Python at the time and wanted a more object-oriented language. I tried out some programming in C++, which was fun, but eventually I got bit by the Ada bug. I really like Ada, primarily for its strong type system which makes it easier for me to abstract memory-mapped hardware registers. While looking for Ada support on this site, I found a great tutorial by Maciej Sobczak. Even if you’re not planning on using Ada, I think the first four sections in this tutorial are useful in understanding how to compile, link, and generate a binary file.

Now I’ve come full-circle and am back to writing assembly code. This is really just practice for my own benefit. I’d like to understand the startup and linking needs of my processor better. I have a github repository where I’m keeping my work under an MIT license so that anyone can use my code to augment their learning. I also recommend a series of articles by Dr. Miro Samek called Building Bare-Metal ARM Systems with GNU.

Here are a few personal opinions I have in summary:

  • I want to use a FREE tool chain (as opposed to a trial version) so that I can scale up in complexity while using the same tools
  • I want to avoid using an IDE because it hides a lot of implementation details behind configuration menus
  • But I MUST have a way to connect a probe or else debug is impossible
  • I also want to know how to read assembly so I fundamentally understand what the processor is doing

I wish you all the best in your learning. I've found studying embedded systems to be fun, challenging, and very rewarding.

Cheers,

Daniel