
Will SoC completely replace generalized microcontrollers?

Started by Telenochek November 28, 2005
  I am wondering if SoCs (ARM/AMBA architecture), where a
whole system with upgradeable hardware modules/IP cores can be stuffed
inside a single chip, will make all kinds of generalized
microcontrollers (like PIC) obsolete.
When PIC microcontrollers are used, they often need external hardware
to help them, and DSP blocks cannot be integrated into the chip at will
(it's all up to Microchip, whatever they decide to include in a chip).
    It just seems to me like SoCs will eventually replace every MCU-based
system, because of the processing power and the application-specific
flexibility in hardware. And almost all systems can use extra
processing power, capabilities, etc.
    Maybe a traffic light with a video camera and remote alerts for
speeders, plus array radar sensing of speeding cars and reporting of their
positions via GPS. I'm not saying that developing a supercomputing
traffic light is a very high-priority task; I'm just using it as an
illustration of stuffing more capability into a simple system.

What would the problem be with replacing almost all MCU-based
systems with SoCs?

Telenochek wrote:

> I am wondering if SoCs (ARM/AMBA architecture), where a
> whole system with upgradeable hardware modules/IP cores can be stuffed
> inside a single chip, will make all kinds of generalized
> microcontrollers (like PIC) obsolete.
You're talking apples and oranges. The PIC excels at simple jobs where the development time is measured in days. Hang some stuff on the I/O's and write some quick code. The parts cost is measured in cents. Moving IP cores around in an FPGA then converting it to a custom design takes months and tens or hundreds of thousands of dollars.
> When PIC microcontrollers are used, they often need external hardware
> to help them, and DSP blocks cannot be integrated into the chip at will
> (it's all up to Microchip, whatever they decide to include in a chip).
> It just seems to me like SoCs will eventually replace every MCU-based
> system, because of the processing power and the application-specific
> flexibility in hardware. And almost all systems can use extra
> processing power, capabilities, etc.
No. Sometimes it's best to keep things as simple as possible. Sometimes a PIC is the right thing, sometimes an ARM is the right thing and sometimes an op-amp is the right thing. I don't see that changing in the next 5 years.
> Maybe a traffic light with a video camera and remote alerts for
> speeders, plus array radar sensing of speeding cars and reporting of their
> positions via GPS. I'm not saying that developing a supercomputing
> traffic light is a very high-priority task; I'm just using it as an
> illustration of stuffing more capability into a simple system.
>
> What would the problem be with replacing almost all MCU-based
> systems with SoCs?
Money, time, retraining.
Sure, I understand that right now FPGA dev tools put a heavy burden on
the developer. The same goes for ARM development.

But when the development tools become advanced enough that you can
say:
I want a 16-bit 30-MIPS processor in the center, a CAN controller
over here, an RF transmitter over here, and an FFT core right here.
Then write some code in software (Java/C++, not VHDL/Verilog HDLs) or,
better yet, just draw it in a visual GUI in the form of a block diagram,
click a block, set some settings, and it works, with all the hardware
correctly configured and clocking/synchronization issues automatically
taken care of.
Like C++ application development, only in hardware.
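
To make that wish a little more concrete, here is a purely hypothetical
sketch of what the front end of such a tool might look like in ordinary
C++. Nothing here is a real vendor API; the SocBuilder class and every
block name in it are invented just to illustrate the idea of describing
hardware the way we describe software:

// Purely hypothetical sketch: no such SocBuilder tool or library exists.
// Everything below is invented to illustrate "C++ application development,
// only in hardware", not to show any real vendor API.
#include <initializer_list>
#include <iostream>
#include <string>
#include <vector>

// One named block with a few human-readable settings, standing in for an IP core.
struct Block {
    std::string name;
    std::vector<std::string> settings;
};

// Collects blocks the way the post imagines: pick a block, set some settings,
// and it works.  This sketch only prints the resulting system description.
class SocBuilder {
public:
    SocBuilder& add(const std::string& name,
                    std::initializer_list<std::string> settings) {
        blocks_.push_back(Block{name, settings});
        return *this;                // chainable, fluent-style C++ API
    }

    void generate() const {
        std::cout << "SoC with " << blocks_.size() << " blocks:\n";
        for (const Block& b : blocks_) {
            std::cout << "  " << b.name << ":";
            for (const std::string& s : b.settings)
                std::cout << " [" << s << "]";
            std::cout << "\n";
        }
        // A real tool would synthesize RTL, place the blocks, and sort out
        // clocking and synchronization here; the sketch stops at reporting.
    }

private:
    std::vector<Block> blocks_;
};

int main() {
    SocBuilder soc;
    soc.add("cpu16", {"16-bit", "30 MIPS"})
       .add("can",   {"CAN controller"})
       .add("rf_tx", {"RF transmitter"})
       .add("fft",   {"FFT core"});
    soc.generate();
    return 0;
}

The hard part, of course, is everything that generate() hand-waves away:
turning that list of blocks into synthesized, placed, and correctly clocked
hardware, which is exactly where today's tools still fall short.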

How far away are we from the above scenario?

In article <1133203186.281807.245400@z14g2000cwz.googlegroups.com>, 
interpasha@hotmail.com says...
> Sure, I understand that right now FPGA dev tools put a heavy burden on
> the developer. The same goes for ARM development.
>
> But when the development tools become advanced enough that you can
> say:
> I want a 16-bit 30-MIPS processor in the center, a CAN controller
> over here, an RF transmitter over here, and an FFT core right here.
> Then write some code in software (Java/C++, not VHDL/Verilog HDLs) or,
> better yet, just draw it in a visual GUI in the form of a block diagram,
> click a block, set some settings, and it works, with all the hardware
> correctly configured and clocking/synchronization issues automatically
> taken care of.
> Like C++ application development, only in hardware.
>
> How far away are we from the above scenario?
Probably not too far, if one of the bigger software vendors thinks they can sell enough units at about $10,000 each to pay for a few million dollars in development cost. I wouldn't hold my breath waiting for it, though! ;-)

When you get that package, you can then figure out how to pay the license fees for the IP that will be included. I suspect that will involve lawyers! ;-(

Mark Borgerson
> Probably not too far, if one of the bigger software vendors thinks
> they can sell enough units at about $10,000 each to pay for
> a few million dollars in development cost. I wouldn't hold
> my breath waiting for it, though! ;-)
>
> When you get that package, you can then figure out how to pay
> the license fees for the IP that will be included. I suspect
> that will involve lawyers! ;-(
Xilinx development tools don't cost that much if you don't buy "the latest & the greatest". In fact, the WebPack costs nothing, and the Spartan3 Starter Kit costs only $100. Just the Spartan3 Starter Kit alone with the WebPack has capabilities unthinkable just a decade ago: 200K reprogrammable gates of completely custom logic!

IP cores and software are just bits on a piece of plastic; they cost nothing to reproduce apart from the development costs (which can be a lot, of course). Provided the software sells enough copies, the price can easily come down. Of course, putting the design in silicon will be expensive...
Telenochek wrote:
<snip>
> What would the problem be with replacing almost all MCU-based
> systems with SoCs?
Let's see: price, and operational details like battery life, size, etc. Nothing really important :)

-jg
> Price, and operational details like battery life, size, etc.
Actually, an SoC is much smaller than a system composed of individual hardware pieces put together on a board.
> I expect a continuing trend for
> more and better peripherals to be included on-chip.
So in your opinion, instead of having highly customizable chips and systems, the trend is to offer a wider variety of chips for developers to choose from?
On 28 Nov 2005 10:39:46 -0800, the renowned "Telenochek"
<interpasha@hotmail.com> wrote:

> Sure, I understand that right now FPGA dev tools put a heavy burden on
> the developer. The same goes for ARM development.
>
> But when the development tools become advanced enough that you can
> say:
> I want a 16-bit 30-MIPS processor in the center, a CAN controller
> over here, an RF transmitter over here, and an FFT core right here.
> Then write some code in software (Java/C++, not VHDL/Verilog HDLs) or,
> better yet, just draw it in a visual GUI in the form of a block diagram,
> click a block, set some settings, and it works, with all the hardware
> correctly configured and clocking/synchronization issues automatically
> taken care of.
> Like C++ application development, only in hardware.
>
> How far away are we from the above scenario?
Synthesizable cores take more silicon, use more power, and deliver lower performance than hard cores, all other things being equal. One trend is to put hard processors onto FPGA chips, where, like unused peripherals on a microcontroller, they can be ignored if not used, since they only add a few percent to the chip area.

I don't think we'll see anything like you suggest anytime soon (at least not at a popular price point). I expect a continuing trend for more and better peripherals to be included on-chip (and you can use or not use them) and for processors to be included on FPGAs, but the latter would have to have a lot of flash ROM included to start to look very attractive, and the chips are already *way* too big and expensive for many volume applications (even with state-of-the-art 90nm processes etc.).

Some things are limited by process, too: it may not be economical to put a large flash array on the same chip as an RF peripheral, an FPGA RAM-based structure, or high-precision analog circuitry like delta-sigma converters and voltage references.

Best regards,
Spehro Pefhany

--
"it's the network..."                "The Journey is the reward"
speff@interlog.com                   Info for manufacturers: http://www.trexon.com
Embedded software/hardware/analog    Info for designers:     http://www.speff.com
Telenochek wrote:
> Sure, I understand that right now FPGA dev tools put a heavy burden on
> the developer. The same goes for ARM development.
>
> But when the development tools become advanced enough that you can
> say:
> I want a 16-bit 30-MIPS processor in the center, a CAN controller
> over here, an RF transmitter over here, and an FFT core right here.
> Then write some code in software (Java/C++, not VHDL/Verilog HDLs) or,
> better yet, just draw it in a visual GUI in the form of a block diagram,
> click a block, set some settings, and it works, with all the hardware
> correctly configured and clocking/synchronization issues automatically
> taken care of.
> Like C++ application development, only in hardware.
>
> How far away are we from the above scenario?
If you want to look at trends, the problem is not going to be software; it will be power consumption. Yes, SW has been getting steadily better, but the static Icc of FPGAs has been getting worse for a number of generations. Take something like an MSP430 and try to get even close to that electrical performance in an FPGA.

Another flaw of the SoC mindset is the resource wastage. For some designs it looks like a good solution, but for many others the components the SoC vendor has chosen have hard ceilings, and you end up more constrained in your design than when using more conventional components.

The number of generalized microcontrollers has been growing, not declining, over the last decade, and that is not about to reverse.

-jg