
What's more important, optimisations or debugging?

Started by Unknown May 30, 2007
I'm trying to get a feel for what people now consider more important,
optimisations or debugging ability. In the past, with such tight memory
limits, I would have said optimisations, but now, with expanded-memory
parts becoming cheaper, I would think that debugging ability is more
important in a development tool. It is not exactly a black-and-white
question, debug or small code, but more a ratio, e.g. 50% debug/50%
optimised or 70% debug/30% optimised, etc.

Any feedback would be greatly appreciated.

On 30 May 2007 15:15:07 -0700, rhapgood@gmail.com wrote:

>I'm trying to get a feel for what people now consider more important,
>optimisations or debugging ability.
>... snip ...

I don't understand the question. Debugging features in a development tool or extra debugging information in executable code? Programming is an art. Not only is it not a black and white question, but percentages don't make sense either. The style of a product depends on the circumstances and the arbitrary preferences of the a-holes involved in its development or its use. There is no right or wrong.

rhapgood@gmail.com wrote:

> I'm trying to get a feel for what people now consider more important,
> optimisations or debugging ability.
> ... snip ...
Whatever you do, you will be screwed.

VLV
On May 31, 9:21 am, BubbaGump <BubbaGump@localhost> wrote:

> I don't understand the question.  Debugging features in a development
> tool or extra debugging information in executable code?
The question refers to debugging features in a toolsuite. When choosing a set of tools for a particular project, would you place more emphasis on finding a compiler/IDE combination that makes for easy, accurate debugging, or on finding one [compiler] that can produce the most efficient code? Alternatively, you may go for tools that do neither exceptionally well but do both in an 'OK' manner.

I know it's more complicated than it seems, but sometimes you have to step away from the details and just look at the big picture; that's what I'm trying to do here.
On May 30, 5:15 pm, rhapg...@gmail.com wrote:
> I'm trying to get a feel for what people now consider more important,
> optimisations or debugging ability.
> ... snip ...
The rule is "Make it right, _then_ make it fast." Fast enough is fast enough. If the optimizer makes your code undebuggable, and you need the debugger, don't use the optimizer.

That said, I generally set my compiler to optimize for space. It hasn't really caused me any debugging troubles in at least 5 or 10 years. Of course, most of my debug activity resembles inserting printf statements rather than stepping through code in an emulator. YMMV.

Regards,

-=Dave
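The printf-style approach Dave mentions is commonly wrapped in a macro so the tracing disappears from release builds. A minimal sketch in C, assuming a C99 compiler (for the variadic macro), a DEBUG flag supplied by the build (e.g. cc -DDEBUG), and a printf retargeted to a UART or similar; the names here are illustrative, not from any particular toolchain:

    #include <stdio.h>

    /* Expands to printf in debug builds and to nothing in release
       builds, so the tracing costs no code space once compiled out. */
    #ifdef DEBUG
    #define DBG_PRINT(...)  printf(__VA_ARGS__)
    #else
    #define DBG_PRINT(...)  ((void)0)
    #endif

    /* Illustrative use: trace intermediate values without a debugger. */
    int scale_sample(int raw)
    {
        int scaled = (raw * 9) / 10;
        DBG_PRINT("scale_sample: raw=%d scaled=%d\n", raw, scaled);
        return scaled;
    }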
>> I'm trying to get a feel for what people now consider more important,
>> optimisations or debugging ability.
>> ... snip ...
Depends. What's more important, time-to-market or development cost?

If it's a totally new class of widget, then the most important thing might be getting first to market, in which case go for easier debugging and don't spare the engineering costs.

If all you have to offer is a cheaper version of the same old widget, then engineering and manufacturing costs matter, so write it in assembler for speed and size. On the other hand, figure that the cost of memory is going down too, so your competitors can reduce their memory costs simply by waiting.

--
mac the naïf
BubbaGump wrote:
> rhapgood@gmail.com wrote:
>> I'm trying to get a feel for what people now consider more important,
>> optimisations or debugging ability.
>> ... snip ...
... snip ...
> Programming is an art. Not only is it not a black and white question,
> but percentages don't make sense either. The style of a product
> depends on the circumstances and the arbitrary preferences of the
> a-holes involved in its development or its use. There is no right or
> wrong.
Besides which, an application with bugs in it is unusable until the bugs are removed. You can always change the optimization level applied.

--
<http://www.cs.auckland.ac.nz/~pgut001/pubs/vista_cost.txt>
<http://www.securityfocus.com/columnists/423>
<http://www.aaxnet.com/editor/edit043.html>
<http://kadaitcha.cx/vista/dogsbreakfast/index.html>
cbfalconer at maineline dot net
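To make "change the optimization level applied" concrete: the level need not be global. One GCC-specific possibility, sketched below, is to build the whole project at -Os but drop a single troublesome function to -O0 while it is under the debugger. The optimize attribute is a GCC extension (GCC 4.4 and later); other toolchains have comparable pragmas, and the function here is invented for illustration:

    /* Whole project built with -Os; just this function compiled at -O0
       so single-stepping and variable inspection behave predictably.
       GCC extension (4.4+); remove the attribute when debugging is done. */
    __attribute__((optimize("O0")))
    int filter_step(int sample, int *state)
    {
        *state = (*state * 7 + sample) / 8;  /* easy to watch at -O0 */
        return *state;
    }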
Ryan H wrote:
> On May 31, 9:21 am, BubbaGump <BubbaGump@localhost> wrote:
>> I don't understand the question.  Debugging features in a development
>> tool or extra debugging information in executable code?
>
> The question refers to debugging features in a toolsuite. When
> choosing a set of tools for a particular project, would you place
> more emphasis on finding a compiler/IDE combination that makes for
> easy, accurate debugging, or on finding one [compiler] that can
> produce the most efficient code?
> ... snip ...
That's suggesting a strange trade-off. Compiler quality and debug quality are not on a trade-off see-saw; indeed, they can sometimes be chosen quite independently.

*Key Point*: there is no point optimising that which does not work.

Certainly, code ceilings are less of a problem today than in the past. Most modern microcontrollers have quite good on-chip debug support, so someone starting a 'white sheet' new design should look for devices with this level of debug support. [See a recent thread about 'live' access to memory during debug.]

As your project matures and cash flow improves, you can also afford better compilers - if you find you really do need them.

-jg
Dave Hansen wrote:
> On May 30, 5:15 pm, rhapg...@gmail.com wrote:
>> I'm trying to get a feel for what people now consider more important,
>> optimisations or debugging ability.
>> ... snip ...
>
> The rule is "Make it right, _then_ make it fast."  Fast enough is fast
> enough.  If the optimizer makes your code undebuggable, and you need
> the debugger, don't use the optimizer.
Remember Knuth's golden rules about optimisation:

1. Don't do it.
2. (For experts only) Don't do it yet.

That applies to hand-tuning of the source code, rather than automatic optimisations in a compiler, but it's important to remember that the speed of the code is irrelevant if it does not work.
> That said, I generally set my compiler to optimize for space.  It
> hasn't really caused me any debugging troubles in at least 5 or 10
> years.  Of course, most of my debug activity resembles inserting
> printf statements rather than stepping through code in an emulator.
> YMMV.
In my experience, it is often much easier to debug code when you have at least some optimisation enabled on the compiler. Code generated with all optimisations off is often hard to read: for example, local variables may end up on the stack, whereas register-based variables can be easier to follow.

mvh.,

David
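A small illustration of the interplay David describes: under optimisation a local variable may live only in a register, or be eliminated entirely, so the debugger cannot show it. A common temporary workaround, sketched in C below, is to qualify the variable volatile while hunting the bug; that forces every update out to memory (at some cost in speed and size), and the qualifier should be removed once the bug is found:

    /* At higher optimisation levels 'acc' may exist only in a register,
       making it invisible or stale in the debugger.  'volatile' forces
       each update to memory so it stays observable while stepping.
       Debugging aid only; remove the qualifier afterwards. */
    int checksum(const unsigned char *buf, int len)
    {
        volatile int acc = 0;
        for (int i = 0; i < len; i++)   /* C99-style loop declaration */
            acc += buf[i];
        return acc;
    }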
In news:1180571919.025883.219930@i13g2000prf.googlegroups.com
timestamped 30 May 2007 17:38:39 -0700, Ryan H <rhapgood@gmail.com>
posted:
     "On May 31, 9:21 am, BubbaGump <BubbaGump@localhost> wrote:
     
     > I don't understand the question.  Debugging features in a development
     > tool or extra debugging information in executable code?
     
     The question refers to debugging features in a toolsuite. [..]
     
     [..]"


Has anyone experience of, or impressions of, debuggers which allow
stepping backwards in time through program flow, such as apparently
provided for desktops/workstations by UndoDB (WWW.Undo-Software.com)
and by Java (Virtual Machine?) debuggers? If so, for which processors
and with which tools? I imagine it would be possible to pay Undo
Limited to port a version of its debugger to be compatible with any of
the targets supported by the GNU debugger, GDB, as UndoDB is a wrapper
for GDB.

Curious,
Colin Paul Gloster
