EmbeddedRelated.com
Forums

Makefile or IDE?

Started by pozz December 2, 2021
On 12/02/2021 12:46 PM, pozz wrote:
> Do you use IDE or Makefile? Is there a recent and much better alternative to make (such as cmake or SCons)?
Whatever will run on your box (usually that's make/automake and nothing else).
On 09.12.2021 at 11:54, pozz wrote:

> I'd prefer to have the same tree in source and build dirs:
What on earth for?

Subdirectories for sources are necessary to organize our work, because humans can't deal too well with folders filled with hundreds of files, and because we fare better with the project's top-down structure tangibly represented as a tree of subfolders.

But let's face it: we very rarely even look at object files, much less work on them in any meaningful fashion. They just have to be somewhere, but it's no particular burden at all if they're all in a single folder, per primary build target. They're for the compiler and make alone to work on, not for humans. So they don't have to be organized for human consumption.

That's why virtually all hand-written Makefiles I've ever seen, and a large portion of the auto-generated ones, too, keep all of a target's object, list and dependency files in a single folder. Mechanisms like VPATH exist for the express purpose of easing this approach, and the built-in rules and macros also largely rely on it.

The major exception in this regard is CMake, which does indeed mirror the source tree layout --- but that's manageable for them only because their Makefiles, being fully machine-generated, can become almost arbitrarily complex, for no extra cost. Nobody in full possession of their mental capabilities would ever write Makefiles the way CMake does it, by hand.
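To make that concrete, a minimal hand-written sketch of that single-object-folder layout could look like this (GNU make and GCC assumed; src/, src/drivers/, obj/ and app.elf are just example names, and recipe lines must be indented with a tab):

SRCDIRS := src src/drivers
VPATH   := $(SRCDIRS)
SRCS    := $(notdir $(foreach d,$(SRCDIRS),$(wildcard $(d)/*.c)))
OBJS    := $(addprefix obj/,$(SRCS:.c=.o))

app.elf: $(OBJS)
        $(CC) $(LDFLAGS) -o $@ $^

# VPATH lets the pattern rule find each %.c in whichever source dir it lives in
obj/%.o: %.c | obj
        $(CC) $(CFLAGS) -MMD -MP -c -o $@ $<

obj:
        mkdir -p obj

-include $(OBJS:.o=.d)

Note that this sketch relies on no two source files sharing a basename.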
On 10/12/2021 18:35, Hans-Bernhard Bröker wrote:
> [...]
>
> That's why virtually all hand-written Makefiles I've ever seen, and a large portion of the auto-generated ones, too, keep all of a target's object, list and dependency files in a single folder. Mechanisms like VPATH exist for the express purpose of easing this approach, and the built-in rules and macros also largely rely on it.
>
> The major exception in this regard is CMake, which does indeed mirror the source tree layout --- but that's manageable for them only because their Makefiles, being fully machine-generated, can become almost arbitrarily complex, for no extra cost. Nobody in full possession of their mental capabilities would ever write Makefiles the way CMake does it, by hand.
There are other automatic systems that mirror the structure of the source tree for object files, dependency files and list files (yes, some people still like these). Eclipse does it, for example, and therefore so do the majority of vendor-supplied toolkits, since most are Eclipse based. (I don't know if NetBeans and Visual Studio / Visual Studio Code do so - these are the other two IDE's commonly used by manufacturer tools).

The big advantage of having object directories that copy source directories is that it all works even if you have more than one file with the same name. Usually, of course, you want to avoid name conflicts - there are risks of other issues or complications such as header guard symbols that are not unique (they /can/ include directory information and not just the filename, but they don't always do so) and you have to be careful that you #include the files you meant. But with big projects containing SDK files, third-party libraries, RTOS's, network stacks, and perhaps files written by many people working directly on the project, conflicts happen. "timers.c" and "utils.c" sound great to start with, but there is a real possibility of more than one turning up in a project.

It is not at all hard to make object files mirror the source tree, and it adds nothing to the build time. For large projects, it is clearly worth the effort. (For small projects, it is probably not necessary.)
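For illustration, the mirrored layout needs little more than a mkdir in the compile rule; a minimal sketch (GNU make and a Unix-style shell with find assumed; src/, build/ and app.elf are just example names, recipe lines indented with a tab):

SRCS := $(shell find src -name '*.c')
OBJS := $(patsubst src/%.c,build/%.o,$(SRCS))

app.elf: $(OBJS)
        $(CC) $(LDFLAGS) -o $@ $^

# each object keeps its source's subdirectory, created on demand
build/%.o: src/%.c
        mkdir -p $(dir $@)
        $(CC) $(CFLAGS) -MMD -MP -c -o $@ $<

-include $(OBJS:.o=.d)

Because every object keeps its source's relative path, two files called timers.c in different subdirectories no longer collide.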
On 10.12.2021 at 18:35, Hans-Bernhard Bröker wrote:
> On 09.12.2021 at 11:54, pozz wrote:
>> I'd prefer to have the same tree in source and build dirs:
>
> What on earth for?
>
> Subdirectories for sources are necessary to organize our work, because humans can't deal too well with folders filled with hundreds of files, and because we fare better with the project's top-down structure tangibly represented as a tree of subfolders.
>
> But let's face it: we very rarely even look at object files, much less work on them in any meaningful fashion. They just have to be somewhere, but it's no particular burden at all if they're all in a single folder, per primary build target.
But sometimes, we do look at them. Especially in an embedded context. One example could be things like stack consumption analysis. Or to answer the question "how much code size do I pay for using this C++ feature?". "Did the compiler correctly inline this function I expected it to inline?". And if the linker gives me a "duplicate definition" error, I prefer that it is located in 'editor.o', not '3d3901cdeade62df1565f9616e607f89.o'.
> The major exception in this regard is CMake, which does indeed mirror the source tree layout --- but that's manageable for them only because their Makefiles, being fully machine-generated, can become almost arbitrarily complex, for no extra cost. Nobody in full possession of their mental capabilities would ever write Makefiles the way CMake does it, by hand.
The main reason I'd never write Makefiles the way CMake does it is that CMake's makefiles are horribly inefficient... But otherwise, once you've got infrastructure to place object files in SOME subdirectory in your build system, mirroring the source structure is easy and gives a usability win.

Stefan
On 10/12/2021 19:44, David Brown wrote:
> On 10/12/2021 18:35, Hans-Bernhard Bröker wrote:
>> [...]
>
> There are other automatic systems that mirror the structure of the source tree for object files, dependency files and list files (yes, some people still like these). Eclipse does it, for example, and therefore so do the majority of vendor-supplied toolkits, since most are Eclipse based. (I don't know if NetBeans and Visual Studio / Visual Studio Code do so - these are the other two IDE's commonly used by manufacturer tools).
Atmel Studio, now Microchip Studio, which is based on Visual Studio, mirrors the source tree exactly in the build dir.
> The big advantage of having object directories that copy source directories is that it all works even if you have more than one file with the same name. Usually, of course, you want to avoid name conflicts - there are risks of other issues or complications such as header guard symbols that are not unique (they /can/ include directory information and not just the filename, but they don't always do so) and you have to be careful that you #include the files you meant. But with big projects containing SDK files, third-party libraries, RTOS's, network stacks, and perhaps files written by many people working directly on the project, conflicts happen. "timers.c" and "utils.c" sound great to start with, but there is a real possibility of more than one turning up in a project.
Yes, these are the reasons why I'd like to put object files in subdirectories.
> It is not at all hard to make object files mirror the source tree, and it adds nothing to the build time. For large projects, it is clearly worth the effort. (For small projects, it is probably not necessary.)
Ok, it's not too hard (nothing is hard when you know how to do it), but it's not that simple either.
On 11/12/2021 16:18, pozz wrote:

> Ok, it's not too hard (nothing is hard when you know how to do it), but it's not that simple either.
Of course. And once you've got a makefile you like for one project, you copy it for the next. I don't think I have started writing a new makefile in 25 years!
On 2021-12-11, David Brown <david.brown@hesbynett.no> wrote:
> On 11/12/2021 16:18, pozz wrote:
>> Ok, it's not too hard (nothing is hard when you know how to do it), but it's not that simple either.
>
> Of course.
>
> And once you've got a makefile you like for one project, you copy it for the next. I don't think I have started writing a new makefile in 25 years!
Too true. You don't write a Makefile from scratch any more than you sit down with some carbon, water, nitrogen, phosphorus and whatnot and make an apple tree. You look around and find a nice existing one that's closest to what you want, copy it, and start tweaking.

--
Grant
On 11.12.2021 at 10:01, Stefan Reuther wrote:
> On 10.12.2021 at 18:35, Hans-Bernhard Bröker wrote:
>> But let's face it: we very rarely even look at object files, much less work on them in any meaningful fashion. They just have to be somewhere, but it's no particular burden at all if they're all in a single folder, per primary build target.
>
> But sometimes, we do look at them. Especially in an embedded context.
In my experience, looking at individual object files does not occur in an embedded context any more often than in others.
> One example could be things like stack consumption analysis.
That one's actually easier if you have the object files all in a single folder, as the tool will have to look at all of them anyway, so it helps if you can just pass it objdir/*.o.
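For instance, with GCC's -fstack-usage each translation unit gets a companion .su file next to its object, so a rough report is a one-liner over that single folder (a sketch only; obj/ and the target name are made up, and the plain sort assumes function names without embedded spaces; the recipe line needs a leading tab):

CFLAGS += -fstack-usage    # GCC writes one .su file next to each object

.PHONY: stack-report
stack-report:
        cat obj/*.su | sort -k2 -n -r | head -n 20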
> Or to answer the question "how much code size do I pay for using this C++ feature?". "Did the compiler correctly inline this function I expected it to inline?".
Both of those are way easier to check in the debugger or in the mapfile than by inspecting individual object files.
> And if the linker gives me a "duplicate definition" error, I prefer that it is located in 'editor.o', not '3d3901cdeade62df1565f9616e607f89.o'.
Both are equally useless. You want to know which source file they're in, not which object file. Do you actually use a tool that obfuscates the .o file names like that?
> But otherwise, once you've got infrastructure to place object files in SOME subdirectory in your build system, mirroring the source structure is easy and gives a usability win.
I don't think you've actually mentioned a single one, so far. None of the things you mentioned had anything to do with _where_ the object files are.
On 10.12.2021 at 19:44, David Brown wrote:

> The big advantage of having object directories that copy source directories is that it all works even if you have more than one file with the same name.
Setting aside the issue of whether the build can actually handle that ("module names" in the code tend to be based only on the basename of the source, not its full path, so they would clash anyway), that should remain an exceptional mishap. I don't subscribe to the idea of making my everyday life harder to account for (usually) avoidable exceptions like that.
On 11.12.2021 at 18:47, Hans-Bernhard Bröker wrote:
> On 11.12.2021 at 10:01, Stefan Reuther wrote:
>> On 10.12.2021 at 18:35, Hans-Bernhard Bröker wrote:
>>> But let's face it: we very rarely even look at object files, much less work on them in any meaningful fashion. They just have to be somewhere, but it's no particular burden at all if they're all in a single folder, per primary build target.
>>
>> But sometimes, we do look at them. Especially in an embedded context.
>
> In my experience, looking at individual object files does not occur in an embedded context any more often than in others.
Neither in my experience, but this is because I look at individual object files even for desktop/server applications, but I don't expect that to be the rule :)
>> Or to answer the question "how much code size do I pay for using this C++ feature?". "Did the compiler correctly inline this function I expected it to inline?".
>
> Both of those are way easier to check in the debugger or in the mapfile than by inspecting individual object files.
For me, 'objdump -dr blah.o | less' or 'nm blah.o | awk ...' is the easiest way to answer such questions. The output of 'objdump | less' is much easier to handle than gdb's 'disas'. And how do you even get function sizes with a debugger?
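For example (GNU binutils assumed; blah.o is the placeholder from above and app.elf stands for the linked image):

# per-function sizes in one object, largest last
nm --print-size --size-sort --radix=d blah.o
# the same for the whole linked image, text symbols only
nm --print-size --size-sort --radix=d app.elf | grep ' [tT] '
# disassembly of a single translation unit, with relocations shown
objdump -dr blah.o | less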
>> And if the linker gives me a "duplicate definition" error, I prefer that it is located in 'editor.o', not '3d3901cdeade62df1565f9616e607f89.o'.
>
> Both are equally useless. You want to know which source file they're in, not which object file.
I want to know what translation unit they are in. It doesn't help to know that the duplicate definition comes from 'keys.inc', which is supposed to be included exactly once. I want to know which two translation units included it, and for that it helps to have the name of the translation unit - the initial *.c/cpp file - encoded in the object file name.
> Do you actually use a tool that obfuscates the .o file names like that?
Encoding the command line that generates a file into the file name (as a cryptographic hash) is a super-easy way to implement rebuild-on-rule-change. I use that for a number of temporary files. I do not use it for actual object files, for the reasons given, but it would technically make sense.
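For illustration, such a scheme could look roughly like this in a hand-written GNU makefile (a sketch only; sha1sum, obj/ and the stamp name are inventions for the example, it assumes the flags contain no single quotes, and recipe lines need a leading tab):

# the stamp's name encodes a hash of the compile command; change the
# flags and the old stamp no longer matches, so everything is rebuilt
COMPILE  := $(CC) $(CFLAGS) -c
CMDHASH  := $(shell printf '%s' '$(COMPILE)' | sha1sum | cut -c1-12)
CMDSTAMP := obj/cmd-$(CMDHASH).stamp

$(CMDSTAMP):
        mkdir -p obj
        rm -f obj/cmd-*.stamp
        touch $@

obj/%.o: src/%.c $(CMDSTAMP)
        $(COMPILE) -o $@ $<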
> I don't think you've actually mentioned a single one, so far. None of the things you mentioned had anything to do with _where_ the object files are.
There are no hard technical reasons. It's all about usability, and that's about the things you actually do. If you've got a GUI that takes you to the assembler code of a function with a right-click in the editor, you don't need 'objdump'. I don't have such a GUI and don't want it most of the time.

Stefan