EmbeddedRelated.com
Forums

AREF bypass capacitance on ATMega2560?

Started by Joerg August 19, 2013
In comp.arch.embedded,
Joerg <invalid@invalid.invalid> wrote:
> Nope. Not if it's in the worlds of medical or aerospace. There you have
> a huge re-cert effort on your hands for changes. New layout? Back to the
> end of the line.
That is not always true (at least for medical equipment, no experience with aerospace). If the change is minor enough, it may be enough to write a rationale that explains the change and how it does not impact the function of the equipment. If the notified body agrees with the rationale, only a limited effort is required to re-cert.
> Sometimes changing is very time consuming. I recently learned that this
> is even the case for alarm systems. "If we even add as much as one
> capacitor for EMC we have to go through the whole insurer certification
> process again".
Weird, I would expect a similar approach with a rationale or something would be enough.

--
Stef (remove caps, dashes and .invalid from e-mail address to reply by mail)

Antonym, n.: The opposite of the word you're trying to think of.
On 9/7/2013 6:23 PM, Joerg wrote:
> rickman wrote:
>> On 9/7/2013 4:46 PM, Joerg wrote:
>>> Paul Rubin wrote:
>>>> Joerg<invalid@invalid.invalid> writes:
>>>>> I don't see how the equivalent of a TMS320 or a big MSP430 could fit
>>>>> into one of these small Lattice devices.
>>>>
>>>> I had thought the parts of those processors that would bloat up badly
>>>> (instruction decode etc.) are pretty simple so the overall effect of the
>>>> bloat is ok in the scheme of things. The parts doing the most work
>>>> (memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
>>>> adders connected to the LUT's somehow) as efficiently as on the MCU's.
>>>>
>>>> I do think softcores seem like a silly idea a lot of the time, and am
>>>> looking forward to more low end FPGA's with MCU blocks.
>>>
>>> Much of it has to do with legacy code. Yes, some things could even be
>>> done more efficiently in the FPGA because you can actually streamline
>>> the HW to the task, something neither uC nor DSP allow. For example, why
>>> have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
>>> But legacy code won't run anymore and you need FPGA specialists to make
>>> it all work.
>>
>> No, you would need a DSP specialist. The FPGA designer only needs to
>> know how to code the FPGA.
>
> So for this kind of solution in an FPGA you need a DSP specialist and an
> FPGA specialist? That would be a problem.
You can do it anyway you want. I'm just making the distinction between DSP knowledge and FPGA knowledge. They aren't very much the same. I also make a distinction between a DSP designer and a DSP coder. Again, not much in common. Coding doesn't really require a lot of DSP knowledge and DSP designers often aren't experts at coding the finicky chips.
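Joerg's multiplier-width point quoted above can be made concrete: a fixed-point product only needs as many bits as its operands imply, which is exactly the degree of freedom an FPGA gives you and a fixed DSP datapath does not. A minimal sketch in Python (the bit widths are illustrative, not taken from any particular part):

```python
def product_bits(a_bits: int, b_bits: int) -> int:
    """Bits needed to hold a full-precision a*b product.

    For two's-complement operands the product always fits in
    a_bits + b_bits bits (even the one extreme case, min*min).
    """
    return a_bits + b_bits

# A DSP chip gives you its fixed multiplier width whether you need it
# or not.  In an FPGA you instantiate only what the data requires.
sample_bits = 24  # hypothetical signal width
coeff_bits = 18   # hypothetical coefficient width

print(f"{sample_bits}x{coeff_bits} multiply: {product_bits(sample_bits, coeff_bits)} product bits")

# If the algorithm guarantees the signal never exceeds 23 bits, the
# multiplier (and all the routing feeding it) shrinks accordingly:
print(f"23x{coeff_bits} multiply: {product_bits(23, coeff_bits)} product bits")
```

The saving compounds: every adder, register and wire downstream of the multiplier also shrinks with the narrower product.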
>> But that is exactly the point of the FPGA in DSP apps. You code to the
>> app, not to a processor.
>
> How long do the usual FPGA stay in the market? Meaning plop-in
> replaceable, same footprint, same code, no changes.
Life span is typically *much* better than MCUs.

First, there are *no* second sources, so whatever chip family you select is the only one that will fit your layout. There *may* be more than one member of that family that will fit the same socket; that is common, but not guaranteed. So you often get a choice of two, three or four sizes and an upgrade path from your first selection. In case you are familiar with the compilation process, in an FPGA you *always* have to recompile for the target. Even if they are pin compatible, you can't load a design for a whatever-02 chip into a whatever-03 part. Those are the limitations.

As to the market life, it is typically well over 10 years. Spartan 3 was introduced some 10 years ago and it is not yet the oldest chip Xilinx has in current full production. I'm still considering using it for new designs. Similar situation for Altera. I was just burned by Lattice announcing EOL of their XP line. This was because they got a new guy in at the top with a new broom, I suppose.

I'm sure you can find various MCUs which have been in production for 10 years, but I know Atmel likes to replace products from time to time with similar "pin compatible" devices which are 99.9% compatible. I expect for the 8 bit parts life span is not such an issue. For the larger parts I expect life span is a bit more limited, and for the top end chips I'm pretty sure their life span is measured in double digit months. Can you still buy any of the Pentium 4s that were all over the place seven or eight years ago? I can't even find a Core 2 Duo.

What lifespan have you seen for MCUs?

--
Rick
On 9/7/2013 5:33 PM, Paul Rubin wrote:
> rickman<gnuarm@gmail.com> writes:
>> How about an MCU array instead? http://www.greenarraychips.com/
>
> Yes, we've had many discussions about that part ;-).
>
>> considering a softcore "silly" is not a useful engineering analysis.
>
> The engineering analysis is implied: it takes far more silicon to
> implement a microprocessor in LUTs than directly in silicon, plus you
> lose a lot of speed because of all the additional layers and lookups.
That is a pointless comparison. I have never once opened up a chip to see how much silicon it used. I compare the things I can see from the outside: cost, power consumption, etc... You can infer anything you wish. The proof of the pudding is in the eating. This is exactly the type of bias I'd like to overcome.
>> Bernd Paysan rolled his own small processor design for an ASIC
>
> Yes, the ASIC bypassed the relative inefficiency of doing the same thing
> in FPGA's. It would be cool to have some tiny processors like that
> available as hard cells on small FPGA's.
Ok, but your "efficiency" rating is not of any real value in a design. Stop limiting yourself with pointless metrics. If you like the idea of a lot of processors on a chip, then design one on an FPGA and see how it works. Do you see what I'm trying to say?

--
Rick
rickman <gnuarm@gmail.com> writes:
> That is a pointless comparison. I have never once opened up a chip to
> see how much silicon it used. I compare the things I can see from the
> outside, cost, power consumption, etc...
Yes, and those are quite closely dependent on the amount of silicon used.
> You can infer anything you wish. The proof of the pudding is in the
> eating.
OK. That GA144 you mentioned has 144 CPU nodes made in a rather old process technology (0.18 micron, I guess 1990's vintage). They still manage to run the thing at 700+ MHz, keep power consumption to around 0.5W with all CPUs running full speed, and sell it for $20 in small quantity. Can you do anything like that with an FPGA? What will it cost? How much power will it use?

I'll accept the b16 as a comparable processor to a GA144 node. Bernd's paper mentions the b16 ran at 25 MHz in a Flex10K30E, a roughly 30-to-1 slowdown, power consumption not mentioned. But I don't know how the underlying silicon processes compare.
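The back-of-envelope numbers in that comparison are easy to reproduce (figures as quoted in the post, not independently verified):

```python
# Figures as quoted: 144 nodes at 700+ MHz for ~0.5 W and $20,
# versus a b16 softcore clocking at 25 MHz in a Flex10K30E.
ga144_nodes = 144
ga144_mhz = 700
b16_fpga_mhz = 25

slowdown = ga144_mhz / b16_fpga_mhz
print(f"per-node clock ratio: {slowdown:.0f}-to-1")  # ~ the 30-to-1 figure cited

aggregate_mhz = ga144_nodes * ga144_mhz
print(f"aggregate: ~{aggregate_mhz / 1000:.0f} GHz of node clocks in ~0.5 W")
```

The exact ratio from these numbers is 28-to-1; the post's "30-to-1" is evidently rounded, and neither figure accounts for the different process nodes involved.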
On 9/7/2013 5:48 PM, Joerg wrote:
> rickman wrote:
>> On 9/7/2013 3:39 PM, Joerg wrote:
>>> rickman wrote:
>>>> On 9/7/2013 1:59 PM, Joerg wrote:
>>>>> rickman wrote:
>>>>>> On 9/7/2013 11:10 AM, Joerg wrote:
>>>>>>> For example, this:
>>>>>>>
>>>>>>> http://www.ti.com/lit/ds/symlink/tms320c5535.pdf
>>>>>>
>>>>>> I don't see it for $3. Did you get a quote for your project? TI says
>>>>>> it is $5 to $8 at qty 1k depending on the flavor. You still need to
>>>>>> add Flash.
>>>>>
>>>>> 1k qty is $3.67 at Digikey:
>>>>>
>>>>> http://www.digikey.com/product-detail/en/TMS320C5532AZHH10/296-32741-ND/2749713
>>>>
>>>> Not the same part pal. You're trying to pull a fast one on me? Are we
>>>> talking about the TMS320C5535 with "tons" of memory or the TMS320C5532
>>>> with *much* less memory?
>>>
>>> It doesn't have the single access RAM but it does have 64k dual access
>>> RAM. That's a lot of RAM in embedded.
>>
>> You do this often. Start talking about one thing and shift the context
>> to another. ...
>
> I didn't. I said it is a DSP with large memory, which it is.
You first give a part, the C5535, as the chip with big memory; then it becomes the C5532, which has less memory and is less expensive. I can't tell what you are talking about when the subject changes.
>> ... Projects have design requirements. I often am able to meet
>> my design requirements with an FPGA and no MCU. I often can't say the
>> opposite, being able to use an MCU without the FPGA.
>>
>>>>> $3.02 with 12wks leadtime at Arrow:
>>>>>
>>>>> http://components.arrow.com/part/detail/51425505S8988412N7713?region=na
>>>>>
>>>>> ROM is included.
>>>>
>>>> ROM is not Flash.. is it? Are you thinking in terms of a mask ROM?
>>>
>>> You can use the bootloader or OTP your own bootloader if you don't want
>>> to store your programming in ROM. In most situations this is part of a
>>> larger computerized system from where it can download its programming.
>>
>> That's a different wrinkle. It is common to have a micro load an FPGA,
>> many don't contain their own Flash. But I haven't seen this done with
>> DSPs as often and almost never with MCUs. But if that works for your
>> project, great. You certainly wouldn't have a problem loading an FPGA
>> then.
>
> The fact that most FPGA don't have flash is fine ... but ... there must
> be a decent bootloader inside. In one of the upcoming projects it must
> be able to bootload via USB. So the device must wake up with a certain
> minimum in brain functionality to handle the USB stuff. With FPGA that
> can become a challenge unless you provide a serial memory device (which
> adds cost).
No, you won't find any FPGAs which can wake up talking over USB. But you will find FPGAs with internal Flash if you wish to design a USB bootloader.
>>>>>>>> ... It has been a while since I looked
>>>>>>>> hard at DSP chips, but I don't recall any I would call remotely "big"
>>>>>>>> for $3. The TI chips that would be "big" are the TMS6xxx line which
>>>>>>>> start somewhere around $20 the last time I looked and that requires
>>>>>>>> all memory and I/O to be separate. The smaller DSP chips that you
>>>>>>>> can get for the $3 range are not "big" in any sense and only a very
>>>>>>>> few of them include Flash memory. So you still need another chip.
>>>>>>>
>>>>>>> It has tons of memory on board.
>>>>>>
>>>>>> Yes, and many FPGAs have "tons" of memory on board although not for
>>>>>> $3... but then this isn't a $3 part either...
>>>>>
>>>>> It is a $3 part. See above.
>>>>
>>>> No, you need to pick a part number and stick with it.
>>>
>>> I gave a part number. Still waiting for your $3 FPGA part number :-)
>>
>> Actually you gave me two part numbers, one for $5 and one for just over
>> $3. What's your point? I gave you info to find the iCE40 line. Xilinx
>> also makes FPGAs that are very affordable, and so do Altera and Lattice.
>
> Both of the ones I gave you are $3. The DSP costs $3.02 and the MSP430
> is $3.09. These are over-the-counter no-haggle prices. Can a $3 iCE40
> device emulate a TMS320 or a big MSP430? I can't tell because I don't
> know this Lattice series and I am not an FPGA expert. But it sure looks
> like they'd have a hard time.
No, the C5535 part is not $3. That is what I mean by two part numbers.
>> I have already explained that I would never do a design in an FPGA to
>> wholly incorporate a DSP or MCU. That would be absurd. So why do you
>> keep asking about that?
>
> Because you wrote yesterday, quote "For $3 however, you can get a chip
> large enough for a CPU (with math) and room for your special logic".
I said "a CPU" not "any CPU". I never said it would duplicate a commercial device. I'm talking about function.
>> Depending on your design requirements there are any number of FPGAs that
>> will do the job and some may be $3. What are your design requirements?
>
> As I said, I do not have any hammered out ones yet but it'll come. This
> was just about your $3 claim. So I gave some examples of devices that
> cost $3.
Yes, and there are FPGAs in that price range which can be used to implement a CPU plus other logic.
>> I understand the concept of work that can't be moved. You don't need to
>> continue to explain that. I was asking why you said most of your work
>> didn't have that requirement and yet you still were debating the point.
>> Now I get it, you are talking about two different things, work that can
>> be moved and work that can't be moved.
>
> Yup. Hence the need for availability of local programmer talent. Less
> local availability means potential problems. That is because (where
> possible) I like to use architectures I am familiar with.
>
> Programmer talent means longterm. For example, if a client has an issue
> with an 8051 design long after the original programmer has moved on I
> could find programmers within a 10-mile radius. Try that with an FPGA.
> In San Jose it may be possible but not out here.
I can't speak to your environment. I know my friend of many years stayed away from FPGAs in spite of the fact that he is a very capable designer. He finally paid me for a week of FPGA design work which I then turned over to him and helped him get started with HDL. It's not hard at all. You don't really need anyone special. That is the sort of thinking I am trying to dispel.

Another example: a software designer came to a newsgroup looking for info on programming FPGAs. He used the mindset of a software guy and wanted to do a "hello world" program. We tried to explain to him that hardware isn't software and HDL isn't C. But he persisted and I gave him advice over a week or so. I tried to turn it into a consulting gig but his bosses didn't want to pay the bucks. He ended up doing just fine with his software mindset and convinced his boss to pay me $500 over my protests. I cashed the check when it came.

The point is that FPGAs are not so hard that you need a unique talent to design them. That may have been true 10+ years ago, but they are very mainstream now and much easier to work with. I bet even *you* could do an FPGA design, lol. I don't care where you are located; if you can't find an FPGA designer, you aren't looking very hard.
>>>> Not sure what the requirements are for your CODEC, but I have been using
>>>> the AKM parts with good results. Minimal or *no* programming,
>>>> configuration is done by a small number of select pins, very simple. I
>>>> have yet to find another one as small, AK4552, AK4556.
>>>
>>> Plus their prices are quite good.
>>
>> Which, AKM or the other? I'd like to think I can get a CD quality CODEC
>> for $3 from nearly anyone. I mainly picked AKM because of the size, 6x6
>> mm without going to a microBGA.
>
> AKM has good prices.
Ok. I have no complaints on prices. Their lead time can be a problem. I had a three-way conversation: the disti, the manufacturer's guy and me. I was complaining about a 14 week lead time and he bragged that a 14 week lead time was *good*. I give my customers a 10 week lead time... see the problem? Digikey sells them now so it is not such an issue. I even ended up speaking with a buyer or planner who was coordinating the shipment of an order last spring. Once you reach them they are very nice.
>> High 10's or low 10's. Up to say, 20 or 30 ksps is easy to do in an
>> FPGA with decent resolution, 12 bits. Getting 16 bits is harder, I've
>> not tried it, but should be possible.
>
> Mostly I need 40-50 ksps. But 20 is often ok.
I haven't done 12 bits at 50 ksps, but I expect it is doable. Just cross the t's and dot the i's.
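Whether 12 bits at 50 ksps is doable with a simple on-FPGA converter can be sanity-checked with the textbook sigma-delta noise-shaping formula. A sketch in Python, assuming an ideal 1-bit modulator (real substrate noise and component tolerances will eat into these margins):

```python
import math

def osr_for_enob(enob, order=1, mod_bits=1):
    """Oversampling ratio needed for a target ENOB with an ideal
    sigma-delta modulator of the given order (textbook formula)."""
    target_snr = 6.02 * enob + 1.76
    L = order
    # In-band SNR of an L-th order modulator at oversampling ratio M:
    #   SNR = 6.02*mod_bits + 1.76 + 10*log10((2L+1)/pi**(2L)) + (2L+1)*10*log10(M)
    base = 6.02 * mod_bits + 1.76 + 10 * math.log10((2 * L + 1) / math.pi ** (2 * L))
    return 10 ** ((target_snr - base) / ((2 * L + 1) * 10))

fs_out = 50e3  # the output rate under discussion
for order in (1, 2):
    m = osr_for_enob(12, order)
    print(f"order {order}: OSR ~{m:.0f}, modulator clock ~{m * fs_out / 1e6:.1f} MHz")
```

For 12 bits at 50 ksps this lands at an OSR around 240 for a first-order loop (a ~12 MHz modulator clock) or around 38 for a second-order loop (~1.9 MHz), both trivially within reach of any current FPGA fabric, so the limiting factor really is analog noise, not speed.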
>> I was looking at using a LVDS input for a comparator and Xilinx did it
>> in a demo of a SDR. They are very short on details, but they talk about
>> 1 mV on the input. I know that's not anything special, I'm hoping to do
>> better, much better.
>
> If you can keep substrate noise in check it could work. Try to remain
> fully differential as much as you can. Not sure if FPGA design suites
> still let you hand-place blocks so you can avoid making a big racket
> right next to the ADC area.
*Everything* in an FPGA makes noise, it's all digital. Yes you can hand place logic if you want. That is the sort of thing best done at the end if possible when you are ready to finalize the chip. But what would you have in an FPGA design that makes more noise than anything else? Each logic block is very small and has a pretty low power consumption. It would be the I/O that has significant power spikes and you have total control over that.
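The LVDS-comparator trick being discussed is, in effect, a 1-bit first-order sigma-delta converter: the LVDS input pin acts as the comparator, an output pin plus an external RC network closes the feedback loop, and only the decimation filter is fabric logic. A behavioral sketch in Python with ideal components (no substrate noise, and a plain boxcar decimator standing in for the CIC filter a real design would use):

```python
def sigma_delta(signal, osr=256):
    """First-order 1-bit sigma-delta modulator plus boxcar decimator.

    `signal`: samples in [-1, 1) at the modulator rate.
    Returns decimated multi-bit samples.  In the FPGA version the
    integrator is the external RC, the comparator is an LVDS input
    pin, and only the decimator is actual fabric logic.
    """
    integrator = 0.0
    bits = []
    for x in signal:
        bit = 1 if integrator >= 0 else -1  # comparator decision
        integrator += x - bit               # feedback through the RC
        bits.append(bit)
    # Average each block of `osr` bits down to one output sample.
    return [sum(bits[i:i + osr]) / osr
            for i in range(0, len(bits) - osr + 1, osr)]

# A DC input of 0.25 should come back as ~0.25 after decimation:
out = sigma_delta([0.25] * 4096, osr=256)
print(out[:3])
```

The first-order loop tracks the input's average exactly (the decimated values sit within a couple of LSBs of 0.25 here); the analog battle is keeping the comparator's effective threshold quiet while the surrounding fabric switches.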
>>> I wasn't referring to a specific project, just your claim that FPGA can
>>> do the same job as processors at the same price.
>>
>> Yes, that is my claim. The obvious exception is when some feature is
>> needed that just isn't available in an FPGA. I'm not saying *every*
>> project can be done better in an FPGA. I'm saying that designers tend
>> to just not consider FPGAs when they are often viable solutions.
>
> In most of my apps I need much of the functionality that a decent uC
> affords, like the $3 device from the MSP430 series I mentioned.
If you need 256 kB of memory then you won't reach a $3 price tag. If you need something more like the low end processor you mentioned that might be doable in the low end FPGAs. They have block RAM, but it scales with the size of the chip. When you have a specific requirement we can look and see what matches.
>>> One project will probably require something of the caliber of a
>>> MSP430F6733. Whether this kind or a DSP, what is key is that we are able
>>> to use pre-cooked library routines. In my case for complex (I/Q) signal
>>> processing, FFT, non-linear filtering and so on. Sometimes legacy
>>> routines must be kept and in an FPGA that would require an IP block that
>>> can emulate the respective processor well enough.
>>
>> Ok, that is likely a no-go. If you really want to emulate a DSP chip
>> then an FPGA is not likely to be a useful way to proceed. Wanting to
>> run DSP precompiled library code is a bit of an extreme requirement. If
>> the customer wants a DSP, then by all means give them a DSP. But don't
>> automatically exclude an FPGA from the task.
>
> Sometimes it would also be ok if there were similar pre-cooked FPGA
> routines (I/Q signal processing, non-linear filters et cetera).
There are design tools that will generate function blocks, filters, etc. I have not had to deal with them. The DSP stuff I have done I just coded up in HDL.
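For reference, the kind of "pre-cooked" routine under discussion is often only a few lines of arithmetic. Here is a behavioral sketch in Python of complex (I/Q) downconversion followed by a simple non-linear (median) filter; the tone frequency and sample rate are hypothetical (the 50 ksps figure comes from the thread), and a real design would express the same arithmetic in HDL or DSP library calls:

```python
import cmath
import math
import statistics

def iq_downconvert(samples, f_carrier, fs):
    """Mix a real signal to baseband I/Q with a complex oscillator."""
    return [s * cmath.exp(-2j * math.pi * f_carrier * n / fs)
            for n, s in enumerate(samples)]

def median_filter(xs, width=5):
    """A simple non-linear filter: sliding-window median (edges truncated)."""
    half = width // 2
    return [statistics.median(xs[max(0, i - half):i + half + 1])
            for i in range(len(xs))]

# Hypothetical numbers: a 10 kHz tone sampled at 50 ksps, mixed to DC.
fs, f0 = 50e3, 10e3
sig = [math.cos(2 * math.pi * f0 * n / fs) for n in range(500)]
iq = iq_downconvert(sig, f0, fs)  # I carries a 0.5 DC term plus a 2*f0 ripple
i_filt = median_filter([z.real for z in iq])
print(sum(z.real for z in iq) / len(iq))  # ~0.5, the recovered DC level
```

Each of these maps naturally onto FPGA fabric: the mixer is one complex multiply per sample, and the median filter is a small sorting network, which is exactly the kind of streamlining to the task mentioned earlier in the thread.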
>>> But what I see most is this: The respective client has in-house talent
>>> for writing code. They are familiar with a particular architecture, have
>>> a vast arsenal of re-usable code built up, and naturally they do not
>>> wish to give this up. If it's a dsPIC like last year, then that goes in
>>> there. If it's Atmel and MSP430 like this year, then that is used. Has
>>> to be, the customer is king.
>>
>> Yeah, well that is a deal killer for *any* other alternative. That is
>> not related to what I was saying. My point is that if you don't have
>> any specific requirement that dictates the use of a given chip, an FPGA
>> has as good a chance at meeting the requirements as an MCU or DSP. In
>> fact, FPGAs are what get used when DSPs aren't fast enough. My point is
>> you don't have to limit them to the high end. They also do very well at
>> the low end.
>
> No disagreement there, programmables have come a long way since the days
> of GALs. Which I never used because they were expensive power guzzlers.
>
> One other challenge that needs to be met in most of my cases is
> longevity of the design. A FPGA would have to remain available for more
> than just a few years. For example, one of my uC-based designs from the
> mid 90's is still in production. Since I kind of had a hunch that this
> would happen I used an 8051 family uC. Is there something similar in the
> world of FPGA?
That is typically not a problem, but pick a device that is relatively new to start with. The vendors are *all* about their latest and greatest products. I guess they need a critical mass of design wins up front which they get revenue from over the life of the part. So they push the newest stuff and let you ask about the older parts.

I don't think there is anything like the 8051 other than the 22V10 perhaps. The 8051 is an anomaly in the MCU world. You won't see a DSP equivalent for example. So far users typically want more, more, more from FPGAs, so a stationary design would not have a market. Even though there are ever larger markets for low end parts, they keep redesigning them to make them cheaper. When they do that they add incompatibility because it doesn't affect the bulk of the users; recompile and you are good to go. But pin compatibility, no, that just doesn't exist other than within a single family. Fortunately product life is typically not an issue.

--
Rick
rickman wrote:
> On 9/7/2013 6:23 PM, Joerg wrote:
>> rickman wrote:
>>> On 9/7/2013 4:46 PM, Joerg wrote:
>>>> Paul Rubin wrote:
>>>>> Joerg<invalid@invalid.invalid> writes:
>>>>>> I don't see how the equivalent of a TMS320 or a big MSP430 could fit
>>>>>> into one of these small Lattice devices.
>>>>>
>>>>> I had thought the parts of those processors that would bloat up badly
>>>>> (instruction decode etc.) are pretty simple so the overall effect of the
>>>>> bloat is ok in the scheme of things. The parts doing the most work
>>>>> (memory, arithmetic) are done in the FPGA hardware (RAM and DSP blocks,
>>>>> adders connected to the LUT's somehow) as efficiently as on the MCU's.
>>>>>
>>>>> I do think softcores seem like a silly idea a lot of the time, and am
>>>>> looking forward to more low end FPGA's with MCU blocks.
>>>>
>>>> Much of it has to do with legacy code. Yes, some things could even be
>>>> done more efficiently in the FPGA because you can actually streamline
>>>> the HW to the task, something neither uC nor DSP allow. For example, why
>>>> have a 32-bit HW multiplier when you know you'll never exceed 23 bits?
>>>> But legacy code won't run anymore and you need FPGA specialists to make
>>>> it all work.
>>>
>>> No, you would need a DSP specialist. The FPGA designer only needs to
>>> know how to code the FPGA.
>>
>> So for this kind of solution in an FPGA you need a DSP specialist and an
>> FPGA specialist? That would be a problem.
>
> You can do it anyway you want. I'm just making the distinction between
> DSP knowledge and FPGA knowledge. They aren't very much the same. I
> also make a distinction between a DSP designer and a DSP coder. Again,
> not much in common. Coding doesn't really require a lot of DSP
> knowledge and DSP designers often aren't experts at coding the finicky
> chips.
I have learned that with uC as well. There are lots of programmers but not too many who can lay down a realtime program architecture. So I do that a lot. And I am a guy who cannot really program uCs very easily; I don't speak much C.

With DSP it's the same and I would expect that also from FPGA. I propose an architecture, what needs to be calculated, and when. Then a programmer takes over. But I can't justify more than one person for that. So if some or a lot of uC or DSP core has to be poured into an FPGA then I guess the FPGA guy has to take over both.
>>> But that is exactly the point of the FPGA in DSP apps. You code to the
>>> app, not to a processor.
>>
>> How long do the usual FPGA stay in the market? Meaning plop-in
>> replaceable, same footprint, same code, no changes.
>
> Life span is typically *much* better than MCUs.
>
> First, there are *no* second sources so whatever chip family you select
> is the only one that will fit your layout. ...
That is one of my concerns. With 8051 uCs you have multiple sources as long as you stick to customary packages such as a 44-pin flat-pack.
> ... There *may* be more than one
> member of that family that will fit the same socket, that is common, but
> not guaranteed. So you often will get a choice of two, three or four
> sizes and you often get an upgrade path from your first selection. Just
> in case you are familiar with the compilation process, in an FPGA you
> *always* have to recompile for the target. Even if they are pin
> compatible you can't load a design for a whatever-02 chip into a
> whatever-03 part. Those are the limitations.
Yeah, that I was aware of. And changing to a whatever-03 would be a major headache in many of my cases. Because it's medical, aerospace and similar, where that can trigger a complete re-cert.
> As to the market life, it is typically well over 10 years. Spartan 3
> was introduced some 10 years ago and it is not yet the oldest chip
> Xilinx has in current full production. I'm still considering using it
> for new designs. Similar situation for Altera. ...
Well over 10 years is good. But only if that means no change to any new versions that require a re-compile. Early on in my career that happened and one guy promptly got busy with three months of regression testing. Oh what fun.
> ... I was just burned by
> Lattice announcing EOL of their XP line. This was because they got a
> new guy in at the top with a new broom I suppose.
Not so cool :-(
> I'm sure you can find various MCUs which have been in production for 10
> years, but I know Atmel likes to replace products from time to time with
> similar "pin compatible" devices which are 99.9% compatible. I expect
> for the 8 bit parts life span is not such an issue. For the larger
> parts I expect life span is a bit more limited and for the top end
> chips, I'm pretty sure their life span is measured in double digit
> months. Can you still buy any of the Pentium 4s that were all over the
> place seven or eight years ago?
Yup: http://components.arrow.com/part/detail/41596500S6440784N2936?region=na
> ... I can't even find a Core 2 Duo.
No problem either: http://components.arrow.com/part/detail/42952225S9497728N2936?region=na
> What lifespan have you seen for MCUs?
The 89C51 I designed in in the mid-90's is still living. Not sure how long it was in production when I designed it in. The nice thing is that these are made by several companies, even Asian ones such as Winbond. So it was no surprise when I took apart our pellet stove for maintenance and found one of those in there as well.

2nd source is important to me, and my clients.

--
Regards, Joerg

http://www.analogconsultants.com/
Stef wrote:
> In comp.arch.embedded,
> Joerg <invalid@invalid.invalid> wrote:
>> Nope. Not if it's in the worlds of medical or aerospace. There you have
>> a huge re-cert effort on your hands for changes. New layout? Back to the
>> end of the line.
>
> That is not always true (at least for medical equipment, no experience
> with aerospace). If the change is minor enough, it may be enough to
> write a rationale that explains the change and how it does not impact
> the function of the equipment. If the notified body agrees with the
> rationale, only a limited effort is required to re-cert.
I really doubt they would agree if a code re-compilation was required to make this work. With code and firmware they have become very careful because there have been too many mishaps. Most of the time the notified bodies or even the FDA do not care much about the code; they care about your process. So the onus is on the company, and there mostly on the VP of Quality Control. He or she will normally not take a re-compile lightly, or as something that can be brushed under the carpet as "not too risky".

It is the same with some hardware. I went through a whole re-cert once just because we had to switch the manufacturer for one little transformer.

The bottom line is that in the unlikely but possible situation where something bad happens you need to be prepared. Then there will be a barrage of requests for documents from the regression testing and all that. Woe to those who then don't have them.
>> Sometimes changing is very time consuming. I recently learned that this
>> is even the case for alarm systems. "If we even add as much as one
>> capacitor for EMC we have to go through the whole insurer certification
>> process again".
>
> Weird, I would expect a similar approach with a rationale or something
> would be enough.
There are many other markets with similar requirements. One of them is railroad electronics, especially for countries like Germany.

--
Regards, Joerg

http://www.analogconsultants.com/
rickman <gnuarm@gmail.com> writes:
> The point is that FPGAs are not so hard that you need a unique talent
> to design them. That may have been true 10+ years ago, but they are
> very mainstream now and much easier to work with.
There still appears to be a complete absence of FOSS toolchains, at least for any current interesting parts.
>> like the $3 device from the MSP430 series I mentioned.
>
> If you need 256 kB of memory then you won't reach a $3 price tag.
That part (MSP430F6733) has 64k of flash and 4k of ram, not out of reach. It does have some other nice features that may be hard to duplicate with an FPGA, like quite low power consumption: http://www.ti.com/product/msp430f6733
On 9/7/2013 7:45 PM, Joerg wrote:
> rickman wrote:
>> On 9/7/2013 6:23 PM, Joerg wrote:
>>> How long do the usual FPGA stay in the market? Meaning plop-in
>>> replaceable, same footprint, same code, no changes.
>>
>> Life span is typically *much* better than MCUs.
>>
>> First, there are *no* second sources so whatever chip family you select
>> is the only one that will fit your layout. ...
>
> That is one of my concerns. With 8051 uCs you have multiple sources as
> long as you stick to customary packages such as a 44-pin flat-pack.
Yes, but 8051s aren't DSPs either, are they? You seem to be switching gears again. I can't keep up. I know you do different designs, but can the FPGA be wrong for *all* of them? You seem to apply every requirement to every design.
>> ... There *may* be more than one
>> member of that family that will fit the same socket, that is common, but
>> not guaranteed. So you often will get a choice of two, three or four
>> sizes and you often get an upgrade path from your first selection. Just
>> in case you are familiar with the compilation process, in an FPGA you
>> *always* have to recompile for the target. Even if they are pin
>> compatible you can't load a design for a whatever-02 chip into a
>> whatever-03 part. Those are the limitations.
>
> Yeah, that I was aware of. And changing to a whatever-03 would be a
> major headache in many of my cases. Because it's medical, aerospace and
> similar, where that can trigger a complete re-cert.
Why would you need to change to the whatever-03. Once it is qualified you can stick with it. My point is that you have flexibility in the device, no one is making you switch.
>> As to the market life, it is typically well over 10 years. Spartan 3
>> was introduced some 10 years ago and it is not yet the oldest chip
>> Xilinx has in current full production. I'm still considering using it
>> for new designs. Similar situation for Altera. ...
>
> Well over 10 years is good. But only if that means no change to any new
> versions that require a re-compile. Early on in my career that happened
> and one guy promptly got busy with three months of regression testing.
> Oh what fun.
Why not talk to the vendors?
>> ... I was just burned by
>> Lattice announcing EOL of their XP line. This was because they got a
>> new guy in at the top with a new broom I suppose.
>
> Not so cool :-(
Yeah, I'm unhappy about it. I thought I could get more development funds for a redo but the division reselling this board in their product doesn't want to spend any cash on it. I've been asked to spend my dime and I likely will. I make good money on this product.
>> I'm sure you can find various MCUs which have been in production for 10
>> years, but I know Atmel likes to replace products from time to time with
>> similar "pin compatible" devices which are 99.9% compatible. I expect
>> for the 8 bit parts life span is not such an issue. For the larger
>> parts I expect life span is a bit more limited and for the top end
>> chips, I'm pretty sure their life span is measured in double digit
>> months. Can you still buy any of the Pentium 4s that were all over the
>> place seven or eight years ago?
>
> Yup:
>
> http://components.arrow.com/part/detail/41596500S6440784N2936?region=na
>
>> ... I can't even find a Core 2 Duo.
>
> No problem either:
>
> http://components.arrow.com/part/detail/42952225S9497728N2936?region=na
How do you know these are the parts that were designed in the system of interest? They made a huge number of variants and I know I have seen EOL notices for Pentium 4s.
>> What lifespan have you seen for MCUs? >> > > The 89C51 I designed in in the mid-90's is still living. Not sure how > long it was in production when I designed it in. The nice thing is that > these are made by several companies, even Asian ones such as Winbond. So > it was no surprise when I took apart our pellet stove for maintenance > and found one of those in there as well. > > 2nd source is important to me, and my clients.
If you really need that level of consistency, then you will be using nothing but 8051s all your career. I don't know of any digital component that has lived as long as the 8051 other than perhaps LS-TTL. I also don't know of any other MCU that is second sourced. If the 8051 does what you need, then go for it.

But again you are mixing conversations. That's why it is so frustrating to have a conversation with you. You talk about not being able to use a part unless it has a product life as long as the 8051, and then you talk about using various DSP chips in the same context. I *know* you won't be able to buy those DSP chips 10 years from now. TI just doesn't provide that level of support unless they have a special program for long-lived parts I'm not aware of. I've seen slow-selling DSPs drop from the marketplace after less than 5 years.

The DSP market was just a tiny exploration by TI initially. Then they saw cell phones as a way to utilize that capability. They actually reorganized the entire company to take full advantage of it. As a result they ended up with four segments for DSPs:

1) Cell phone devices - small, low power and cheap in large quantities. Not much need for longevity at all... basically the C5xxx line.

2) Cell base stations - powerful devices that can handle multiple channels; power consumption is not important and cost is secondary. This is the C6xxx line. Again, they focus on new parts, not longevity.

3) Scientific DSP - floating point, the C67xx lines. Relatively low volumes compared to the other two, but they seem to think it is an important market. New designs are not as frequent. Longevity might be better than the other two, but no promises.

4) Motor control, white goods, etc. - fixed point with price the major factor. These have appeared in a range of variations, some with flash, some with ADCs, etc. These are almost MCUs with similar performance, slow compared to segments 1 and 2. Intended for high volume apps, but again, longevity is not important.
So if you are going to consider DSPs for your apps, I expect you would be looking at the last category. I'm pretty sure I wouldn't be designing from this group if I wanted to be building this board 10 years from now though. Have you talked to TI about longevity? -- Rick
On 9/7/2013 8:39 PM, Paul Rubin wrote:
> rickman<gnuarm@gmail.com> writes: >> The point is that FPGAs are not so hard that you need a unique talent >> to design them. That may have been true 10+ years ago, but they are >> very mainstream now and much easier to work with. > > There still appears to be a complete absence of FOSS toolchains, at > least for any current interesting parts.
There are no FOSS bit stream generators and there never will be. If that is a no-go for you, then you will never use FPGAs from any of the existing companies.
>>> like the $3 device from the MSP430 series I mentioned. >> If you need 256 kB of memory then you won't reach a $3 price tag. > > That part (MSP430F6733) has 64k of flash and 4k of ram, not out of > reach. It does have some nice other features that may be hard to > duplicate with an fpga, like quite low power consumption: > http://www.ti.com/product/msp430f6733
You have been reading old books. Not all FPGAs are power hungry. Check the Lattice site for the iCE40 line. Very low power. -- Rick