
FPGA size and tools

Started by drjohnsmith · 8 years ago · 1 reply (latest reply 8 years ago) · 180 views

Is the gap between the biggest and the smallest FPGAs adversely affecting tools?

For instance, the way I design a CPLD or small FPGA for a GPIO / processor interface is very different from how I design, say, a video processing overlay engine.

One would take a few hours, the other MUCH longer.

One would have a simple test bench, the other a full regression test strategy.

One would be almost self-documenting, the other would need a lot more documentation.

I'm not going to write UML or SystemVerilog to prove processor decode / address logic in a small FPGA or CPLD; it's just not cost-effective.
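
(To make that concrete, here is a rough sketch of the sort of decode logic I mean, in plain Verilog with a few-line test bench. The address map, module name, and select signals are all made up for illustration.)

    // Hypothetical processor address decode: the kind of logic that
    // doesn't justify UVM. Address map invented for this example.
    module addr_decode (
        input  wire [15:0] addr,
        input  wire        cs_n,       // chip select from the processor, active low
        output wire        uart_sel,
        output wire        gpio_sel,
        output wire        timer_sel
    );
        // One-hot peripheral selects from the top address nibble
        assign uart_sel  = !cs_n && (addr[15:12] == 4'h1);
        assign gpio_sel  = !cs_n && (addr[15:12] == 4'h2);
        assign timer_sel = !cs_n && (addr[15:12] == 4'h3);
    endmodule

    // The "simple test bench": poke a few addresses, eyeball the waves.
    module tb_addr_decode;
        reg  [15:0] addr;
        reg         cs_n;
        wire        uart_sel, gpio_sel, timer_sel;

        addr_decode dut (.addr(addr), .cs_n(cs_n), .uart_sel(uart_sel),
                         .gpio_sel(gpio_sel), .timer_sel(timer_sel));

        initial begin
            cs_n = 1'b1; addr = 16'h0000; #10;
            cs_n = 1'b0;
            addr = 16'h1000; #10;   // expect uart_sel high
            addr = 16'h2004; #10;   // expect gpio_sel high
            addr = 16'h3008; #10;   // expect timer_sel high
            $finish;
        end
    endmodule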

Yet I have to use the same tools for both, and the tools are optimized for the high-cost, big devices.

Do you find that the tools are actually hindering your productivity on small designs?



Reply by cfelton, February 23, 2016

My experience has been in the opposite direction. The latest industry tools are considerably lacking when it comes to building large, complex systems. As you outlined, the tools make it more difficult (rather than easier) to develop a large system, in my opinion. For example, SystemVerilog (SV) support is inconsistent across vendors, and the language has become a "dumping ground" for all kinds of "features" (e.g., if I want to abstract a bus, do I use a struct, a class, an interface ... and which vendors actually support interfaces?).
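
To make the bus example concrete, here is a rough sketch of the interface flavour (all names and widths invented for illustration); whether a given synthesis tool accepts it is exactly the inconsistency I mean:

    // Hypothetical simple bus abstracted as an SV interface. The same idea
    // could be attempted with a packed struct, or a class in a testbench;
    // which of the three your tools actually support is the open question.
    interface simple_bus_if #(parameter ADDR_W = 16, DATA_W = 32);
        logic [ADDR_W-1:0] addr;
        logic [DATA_W-1:0] wdata;
        logic [DATA_W-1:0] rdata;
        logic              write;

        modport master (output addr, wdata, write, input  rdata);
        modport slave  (input  addr, wdata, write, output rdata);
    endinterface

    // A slave that takes the bundled bus instead of a flat port list.
    module reg_slave (simple_bus_if.slave bus, input logic clk);
        logic [31:0] reg0;

        always_ff @(posedge clk)
            if (bus.write && bus.addr == 16'h0000)
                reg0 <= bus.wdata;

        assign bus.rdata = reg0;
    endmodule

A struct gets you the bundling but not the modports; a class is simulation-only.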

With a small design, obviously skip all the complicated UVM, etc. Some of the SV features for design are nice, even in small designs. As you state, stick to a straightforward test bench and minimalistic code.
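
For example (a made-up fragment, not from any real project), typed enum states, always_ff, and unique case are worth having even in a tiny state machine:

    // A few SV design niceties in a small FSM: logic, a typed enum for the
    // state, always_ff for the register, unique case for coverage checking.
    module tiny_fsm (
        input  logic clk,
        input  logic rst_n,
        input  logic start,
        output logic busy
    );
        typedef enum logic [1:0] {IDLE, LOAD, DONE} state_t;
        state_t state;

        always_ff @(posedge clk or negedge rst_n) begin
            if (!rst_n)
                state <= IDLE;
            else
                unique case (state)
                    IDLE: if (start) state <= LOAD;
                    LOAD:            state <= DONE;
                    DONE:            state <= IDLE;
                endcase
        end

        assign busy = (state != IDLE);
    endmodule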

The only downside I have seen is that, with some flows, small designs take as long to build as larger designs, though not enough to make it unbearable (just painful).