EmbeddedRelated.com

ARM IDE

Started by flash011 November 12, 2008
Chris H wrote:
> In message <C5idnS6fWt4aS7jUnZ2dnUVZ8r6dnZ2d@lyse.net>, David Brown
> <david.brown@hesbynett.removethisbit.no> writes
>>> Statistically I have found it worse amongst those who say "If I had
>>> the source I could fix it" which is why I am very unconvinced when
>>> people say having the course of a compiler is useful
>>
>> You do realise he is talking about the source to the *library* here,
>> not the compiler? An experienced developer of a high level is going
>> to be just as capable of understanding, debugging and correcting a
>> toolkit's library as the library developers are - assuming there is
>> decent information available about things like calling and naming
>> conventions in the library.
>
> Possibly, maybe and not normally. That said several commercial
> compilers do make the library source available its just not FOSS.
There is no need for the library to be under any kind of open source license (although it can be an advantage). I dislike any tools which don't include source for their libraries - the source is the final word in the library documentation, and it is sometimes very useful when debugging my own code.
>> They are not going to be as skilled at specific tasks such as writing
>> the library code in a way that it compiles properly for all its
>> targets, and perhaps they are not going to write code as optimally or
>> portably. And they are certainly not going to be able to run all the
>> tests and qualifications.
>
> So you have an untested and unqualified non-standard library....
Would you rather use a tested, qualified and buggy library or an untested, unqualified but correct library? Personally, I am far more interested in making systems that work than trying to make sure I have someone else to blame when they don't work.
>> But in situations like this one (and cases I have seen myself), the
>> user is perfectly capable of finding and correcting a bug in the
>> library source code if he has access to it.
>
> The problem is that without the full regression tests the user has no
> idea if he has fixed the bug without causing any other problems. See
> the definition of debugging :-)
Personally, I use a concept called "modular" programming. I think it's quite popular among both professional and non-professional programmers - it's even used by commercial closed-source developers. The idea is that you write parts of your code that do specific things, and you test and document these parts so that you know what they do. That way, you can change one part of your code without breaking everything else. Done well, you don't even have to *test* everything else again until you are doing final validation tests.
>> (it can be useful for other things, such as for changing development
>> platform,
>
> I don't see how. See above.
Yes, I can see how you've clearly explained how simple it is to change development platform without source code for the tools. You gave clear and specific instructions for how a user can simply move their binary-only tools between Windows on x86, Linux on a PPC, and Solaris on a sparc. Having source code for the compiler tools is obviously a frivolous extravagance in such circumstances.
>> archiving tools,
>
> Not really. You need to archive the actual binary you are using.
> (Possibly with a PC to run it on.)
It may be *possible* to archive the binaries (depending on the licenses, and any PITA node-locking restrictions), and it is often *useful* to archive the binaries, but it is undoubtedly useful to be able to archive the sources as well.
>> or as a safety net in case the compiler developers stop development on
>> the tools).
>
> Why would you need the source here?
If a compiler company stops development on tools that I use, how can I ensure that I have access to these tools in the future? Perhaps I need to run them on another computer, and can no longer buy a license. Perhaps the tools need a small fix, but are no longer being updated. If I have the source code with an appropriate license (not necessarily open source), I have a way out - I can make these changes myself, or pay someone to make the changes. I am not locked in at the mercy of the tool developers. Many companies consider open source licenses to be a big advantage as a safety net in this way, regardless of the price they pay to get it.
>> But no one is talking about compiler source code here.
>
> ? What were you talking about?
I'm replying to your post.
Chris H wrote:
> In message <gg7h7h$tcn$1@aioe.org>,
> Anders.Montonen@kapsi.spam.stop.fi.invalid writes
>> Chris H <chris@phaedsys.org> wrote:
>>> In message <IuudncF2mMiaQ7jUnZ2dnUVZ8umdnZ2d@lyse.net>, David Brown
>>> <david.brown@hesbynett.removethisbit.no> writes
>>>> If you are getting your gcc binaries ready-made (such as from
>>>> CodeSourcery) rather than checking out and building your own copy, you
>>>> might want to ask your particular supplier about their source code
>>>> tracking and logging.
>>> This would be essential. I am sure Code Sorcery can supply this. However
>>> you would only be validating that part ticular version of the binary
>>> form CS. Not GCC or the source.
>>
>> So the really awesome thing about closed-source compilers is that you
>> buy a validated binary but the really awful thing about open-source
>> compilers is that you buy a validated binary?
>
> This is the sort of problem with discussions like this they get clouded
> by religious bigotry
>
> I did not mention closed or open source.
>
> We were discussing the problems of validating GCC compilers. GCC being
> a very large collection of compilers from many sources that has minimal
> control and trace ability.
>
> The other point was that if you validate for example a Byte craft
> compiler (since Walter is partaking of this thread) you have the
> complete compiler development and history in one place and the binary
> only comes from one place and all the Byte Craft compilers of that
> version for that target are validated.
>
> For GCC you can not validate "GCC compilers for a specific target" just
> a specific variant from a specific supplier if you can get the full
> history and the version you are validation is under similar control to
> the Bytecraft compiler. Ie if you validated the GCC arm compiler from
> Code Sourcery it would have no impact or relevance to any other GCC ARM
> compiler. OR for that matter any Code Sorcery compiler built from
> source by some one else (even if it was the same version as the one tested)
>
> SOME GCC suppliers can do this and validate their compilers others can't
> but the vast majority of GCC compilers are virtually impossible to
> validate.
>
> Recently I was talking to a company doing a safety critical project who
> decided after some investigation that it would be far more cost
> effective both in time and money to use a commercial compiler at about
> 4K USD per seat for their developers rather than a "supported" GCC
> compiler as the cost of validating the GCC would be far in excess of
> validating the commercial compiler.
>
> This has nothing to do with open or closed source.
Are you trying to say that "validation" of a compiler is dependent on having complete histories of all code that has ever been used in any versions of that compiler? That would seem to contradict the idea that the only relevant factor is exactly how a particular binary build of the compiler handles the tests. If I am correct in assuming that the Plum Hall tests only check the binary, then the history of the code is totally irrelevant for such testing and validation.

Additionally, few software projects have such clear control of the history of their code and the contributions to them as large open source projects. Collaborative open source projects are carried out in public, and the people who have write access to the source code trees are all vetted by their peers around the world. All commits are discussed and reviewed. Contrary to your beliefs, there is excellent control and traceability in such projects - much more so than in many closed source projects (though there are no rules - both open source and closed source have their share of well-managed and badly-managed projects).
In message <svydndrinO67urXUnZ2dnUVZ8qDinZ2d@lyse.net>, David Brown 
<david.brown@hesbynett.removethisbit.no> writes
>>> They are not going to be as skilled at specific tasks such as
>>> writing the library code in a way that it compiles properly for all
>>> its targets, and perhaps they are not going to write code as
>>> optimally or portably. And they are certainly not going to be able
>>> to run all the tests and qualifications.
>>
>> So you have an untested and unqualified non-standard library....
>
> Would you rather use a tested, qualified and buggy library or an
> untestet, unqualified but correct library?
How do you know it is correct without testing it? A tested and validated system is better as you know EXACTLY what it will do.
> Personally, I am far more interested in making systems that work than
> trying to make sure I have someone else to blame when they don't work.
With an untested/unvalidated system you have no idea if or how it works.
> Personally, I use a concept called "modular" programming. I think it's
> quite popular among both professional and non-professional programmers
> - it's even used by commercial closed-source developers. The idea is
> that you write parts of your code that do specific things, and you test
> and document these parts so that you know what they do. That way, you
> can change one part of your code without breaking everything else.
> Done well, you don't even have to *test* everything else again until
> you are doing final validation tests.
:-) Of course you can write modular systems, all the best ones are. However I recall a presentation by someone who had the first validated ISO C compiler and I can tell you it is not as simple as you are suggesting.
>>> (it can be useful for other things, such as for changing development
>>> platform,
>> I don t see how. See above.
>
> Yes, I can see how you've clearly explained how simple it is to change
> development platform without source code for the tools. You gave clear
> and specific instructions for how a user can simply move their
> binary-only tools between Windows on x86, Linux on a PPC, and Solaris
> on a sparc.
They can't. If you have tools for x86 that are validated you can not simply move them to PPC or SPARC without a complete retest.
> Having source code for the compiler tools is obviously a frivolous
> extravagance in such circumstances.
Pointless unless you are going to do a full and complete re-test.
>>> archiving tools,
>>
>> Not really. You need to archive the actual binary you are using.
>> (Possibly with a PC to run it on.)
>
> It may be *possible* to archive the binaries (depending on the
> licenses, and any PITA node-locking restrictions), and it is often
> *useful* to archive the binaries,
It is not only possible, I know a lot of companies that do this. Also quite often they archive at least one PC/workstation with the system (this is for safety critical developments).
> but it is undoubtedly useful to be able to archive the sources as well.
Not really because if you rebuild the sources you need to do a complete re-test of the new system you build. Using any compiler (from the binary) other than the original one used on the original hardware is going to give you a different compiler that needs retesting.
>>> or as a safety net in case the compiler developers stop development
>>> on the tools).
>> Why would you need the source here?
>
> If a compiler company stops development on tools that I use, how can I
> ensure that I have access to these tools in the future?
You have the binary
> Perhaps I need to run them on another computer, and can no longer buy
> a license.
> Perhaps the tools need a small fix, but are no longer being updated.
Archive the system.
> If I have the source code with an appropriate license (not necessarily
> open source), I have a way out - I can make these changes myself, or
> pay someone to make the changes
And fully re-test or validate the compiler... I forgot you don't test your compilers and have no real idea if they are performing correctly
> . I am not locked in at the mercy of the tool developers. Many
> companies consider open source licenses to be a big advantage as a
> safety net in this way, regardless of the price they pay to get it.
This is a bit of a myth really. In 30 years I have never come across this as a real problem. It happens that compiler companies disappear but I have never seen where it has been a major problem. The times where it does occur it is usually easily got around.

Often porting the code to a more modern compiler is a lot less hassle than trying to rebuild a compiler that is that old.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
In message <AMSdncUbs-bt-rXUnZ2dnUVZ8tninZ2d@lyse.net>, David Brown 
<david.brown@hesbynett.removethisbit.no> writes
> Chris H wrote:
>> In message <gg7h7h$tcn$1@aioe.org>,
>> Anders.Montonen@kapsi.spam.stop.fi.invalid writes
>>> Chris H <chris@phaedsys.org> wrote:
>>>> In message <IuudncF2mMiaQ7jUnZ2dnUVZ8umdnZ2d@lyse.net>, David Brown
>>>> <david.brown@hesbynett.removethisbit.no> writes
>>>>> If you are getting your gcc binaries ready-made (such as from
>>>>> CodeSourcery) rather than checking out and building your own copy, you
>>>>> might want to ask your particular supplier about their source code
>>>>> tracking and logging.
>>>> This would be essential. I am sure Code Sorcery can supply this. However
>>>> you would only be validating that part ticular version of the binary
>>>> form CS. Not GCC or the source.
>>>
>>> So the really awesome thing about closed-source compilers is that you
>>> buy a validated binary but the really awful thing about open-source
>>> compilers is that you buy a validated binary?
>> This is the sort of problem with discussions like this they get
>> clouded by religious bigotry
>> I did not mention closed or open source.
>> We were discussing the problems of validating GCC compilers. GCC
>> being a very large collection of compilers from many sources that has
>> minimal control and trace ability.
>> The other point was that if you validate for example a Byte craft
>> compiler (since Walter is partaking of this thread) you have the
>> complete compiler development and history in one place and the binary
>> only comes from one place and all the Byte Craft compilers of that
>> version for that target are validated.
>> For GCC you can not validate "GCC compilers for a specific target"
>> just a specific variant from a specific supplier if you can get the
>> full history and the version you are validation is under similar
>> control to the Bytecraft compiler. Ie if you validated the GCC arm
>> compiler from Code Sourcery it would have no impact or relevance to
>> any other GCC ARM compiler. OR for that matter any Code Sorcery
>> compiler built from source by some one else (even if it was the same
>> version as the one tested)
>> SOME GCC suppliers can do this and validate their compilers others
>> can't but the vast majority of GCC compilers are virtually impossible
>> to validate.
>> Recently I was talking to a company doing a safety critical project
>> who decided after some investigation that it would be far more cost
>> effective both in time and money to use a commercial compiler at about
>> 4K USD per seat for their developers rather than a "supported" GCC
>> compiler as the cost of validating the GCC would be far in excess of
>> validating the commercial compiler.
>> This has nothing to do with open or closed source.
>
> Are you trying to say that "validation" of a compiler is dependent on
> having complete histories of all code that has ever been used in any
> versions of that compiler? That would seem to contradict the idea that
> the only relevant factor is exactly how a particular binary build of
> the compiler handles the tests. If I am correct in assuming that the
> Plum Hall tests only check the binary, then the history of the code is
> totally irrelevant for such testing and validation.
You test the binary but that is only part of the validation. You have to show the bug history and look at the other tests, the development process and the documentation etc... It is a two part process.
> Additionally, few software projects have such a clear control of the
> history of their code and the contributions to them as large open
> source projects. Collaborative open source projects are carried out in
> public, and the people who have write access to the source code trees
> are all vetted by their peers around the world.
For safety critical systems you require qualified and experienced people.....
> All commits are discussed and reviewed. Contrary to your beliefs,
> there is excellent control and traceability in such projects - much
> more so than in many closed source projects
I do have some evidence to the contrary for GCC but it is not in a public document (it comes from one of the GCC development places).
Chris H wrote:
> In message <svydndrinO67urXUnZ2dnUVZ8qDinZ2d@lyse.net>, David Brown
>>>> They are not going to be as skilled at specific tasks such as
>>>> writing the library code in a way that it compiles properly for all
>>>> its targets, and perhaps they are not going to write code as
>>>> optimally or portably. And they are certainly not going to be able
>>>> to run all the tests and qualifications.
>>>
>>> So you have an untested and unqualified non-standard library....
>>
>> Would you rather use a tested, qualified and buggy library or an
>> untestet, unqualified but correct library?
>
> How do you know it is correct without testing it?
> A tested and validated system is better as you know EXACTLY what it will
> do.
Errrrm... Sorry to wake you up in your Ivory Tower: We started with the assumption that the developer has found a *bug*. In other words, he has just proven that the "tested and validated" thing is *wrong*. Don't tell me now that the developer is incompetent and doesn't know how to use the thing. I get my salary for knowing what I'm doing, and I think I'm not too bad in that. I wouldn't have a number of confirmed compiler and library bugs on my credit side otherwise (on the other hand, I have several times convinced coworkers that their observation is not a bug. I just happen to have a little experience in compiler construction as well as C++ standardese.). And don't tell me either that bugs in "tested and validated" things don't happen. They happen.
>> Personally, I am far more interested in making systems that work than
>> trying to make sure I have someone else to blame when they don't work.
>
> With an untested/validated system you have no idea if or how it works.
I can read and write C code quite well. If I can test whether my selfmade function works, I can also test someone else's function. Or someone else's function after my modification.
>> but it is undoubtedly useful to be able to archive the sources as well.
>
> Not really because if you rebuild the sources you need to do a complete
> re-test of the new system you build. Using any compiler (from the
> binary) other than the original one used on the original hardware is
> going to give you a different compiler that needs retesting.
Here's a quite simple test: compile your project with the official & approved binary. Compile your project with the compiler you built from the source. Compare. (Of course you don't compare the raw ELF files, just the sections that matter, i.e. the stuff that's ultimately loaded to the target.) You can do this at any time in advance, and you can repeat it as often as you want. Now you know that your self-made compiler produces the same thing as the offical & approved one, and from your own tests, you know that the produced binary is sufficiently correct. What better qualification test for a compiler can one imagine? Of course, the two compilers might deviate at some time in the future, but then you only know they differ, you don't know which one is right. It might be the official & approved one, or it might not be.
>>>> or as a safety net in case the compiler developers stop development
>>>> on the tools).
>>>
>>> Why would you need the source here?
>>
>> If a compiler company stops development on tools that I use, how can I
>> ensure that I have access to these tools in the future?
>
> You have the binary
...plus a dongle plugging into an interface no-one produces any longer. Or, a binary which doesn't run on any computer one can buy (Win16 or DOS, anyone?)
>> If I have the source code with an appropriate license (not necessarily
>> open source), I have a way out - I can make these changes myself, or
>> pay someone to make the changes
>
> And fully re-test or validate the compiler... I forgot you don't test
> your compilers and have no real idea if they are performing correctly
Because the compiler maker's test suite doesn't guarantee that it performs correctly, that's a moot point. I have outlined a test above.
>> . I am not locked in at the mercy of the tool developers. Many
>> companies consider open source licenses to be a big advantage as a
>> safety net in this way, regardless of the price they pay to get it.
>
> This is a bit of a myth really. In 30 years I have never come across
> this as a real problem. It happens that compiler companies disappear
> but I have never seen where it has been a major problem. The times
> where it does occur it is usually easily got around.
>
> Often porting the code to a more modern compiler is a lot less hassle
> than trying to rebuild a compiler that is that old.
Maybe if you buy "100% ANSI C" products. In reality, system vendors sell integrated environments, where you can use the full product only when you also use their compiler. Just take things like inline-assembly. Everyone does it in a different way. Or system building tools: how do you generate the system image containing all your tasks, drivers, processes, etc. Stefan
In message <ggbs9u.uk.1@stefan.msgid.phost.de>, Stefan Reuther 
<stefan.news@arcor.de> writes
> Chris H wrote:
>> In message <svydndrinO67urXUnZ2dnUVZ8qDinZ2d@lyse.net>, David Brown
>>>>> They are not going to be as skilled at specific tasks such as
>>>>> writing the library code in a way that it compiles properly for all
>>>>> its targets, and perhaps they are not going to write code as
>>>>> optimally or portably. And they are certainly not going to be able
>>>>> to run all the tests and qualifications.
>>>>
>>>> So you have an untested and unqualified non-standard library....
>>>
>>> Would you rather use a tested, qualified and buggy library or an
>>> untestet, unqualified but correct library?
>>
>> How do you know it is correct without testing it?
>> A tested and validated system is better as you know EXACTLY what it will
>> do.
>
> Errrrm... Sorry to wake you up in your Ivory Tower: We started with the
> assumption that the developer has found a *bug*. In other words, he has
> just proven that the "tested and validated" thing is *wrong*.
This can happen. Then you fix it.

The choice was a bug in a validated system
OR
an untested and unqualified compiler which may or may not be correct.
You won't know whether it is correct until you test and validate it.
>>> Personally, I am far more interested in making systems that work than
>>> trying to make sure I have someone else to blame when they don't work.
>>
>> With an untested/validated system you have no idea if or how it works.
>
> I can read and write C code quite well. If I can test whether my
> selfmade function works, I can also test someone else's function. Or
> someone else's function after my modification.

> Maybe if you buy "100% ANSI C" products.
I think you mean ISO 9899 not ANSI C...... but we don't need to be precise :-)
Chris H wrote:
> In message <ggbs9u.uk.1@stefan.msgid.phost.de>, Stefan Reuther
>> Chris H wrote:
>>> How do you know it is correct without testing it?
>>> A tested and validated system is better as you know EXACTLY what it will
>>> do.
>>
>> Errrrm... Sorry to wake you up in your Ivory Tower: We started with the
>> assumption that the developer has found a *bug*. In other words, he has
>> just proven that the "tested and validated" thing is *wrong*.
>
> This can happen. Then you fix it.
>
> The choice was a bug in a validated system
> OR
> an untested and unqualified compiler which may or may not be correct.
> You won't know until you test and validate it if it is correct.
Maybe we're talking about different meanings of "testing". You said a commercial software vendor had the big advantage of testing against Plum Hall / Perennial, and everything else isn't worth anything. But of course I test my fixes. I test those on *my* test workloads until I'm confident that they work. But then I'm back to "go", trying to convince the tech support guys that I'm not a clueless first-semester student, but that this source code patch actually fixes a problem...
>> I can read and write C code quite well. If I can test whether my
>> selfmade function works, I can also test someone else's function. Or
>> someone else's function after my modification.

>> Maybe if you buy "100% ANSI C" products.
>
> I think you mean ISO 9899 not ANSI C...... but we don't need to be
> precise :-)
ISO/IEC 9899:1999, to be even more nitpicking. Still you read "100% ANSI C" quite often. Or just "Standard C", whatever that means... Stefan
In message <ggc1g9.13c.1@stefan.msgid.phost.de>, Stefan Reuther 
<stefan.news@arcor.de> writes
> Chris H wrote:
>> In message <ggbs9u.uk.1@stefan.msgid.phost.de>, Stefan Reuther
>>> Chris H wrote:
>>>> How do you know it is correct without testing it?
>>>> A tested and validated system is better as you know EXACTLY what it will
>>>> do.
>>>
>>> Errrrm... Sorry to wake you up in your Ivory Tower: We started with the
>>> assumption that the developer has found a *bug*. In other words, he has
>>> just proven that the "tested and validated" thing is *wrong*.
>>
>> This can happen. Then you fix it.
>>
>> The choice was a bug in a validated system
>> OR
>> an untested and unqualified compiler which may or may not be correct.
>> You won't know until you test and validate it if it is correct.
>
> Maybe we're talking about different meanings of "testing". You said a
> commercial software vendor had the big advantage of testing against Plum
> Hall / Perennial, and everything else isn't worth anything.
I did not say that. I said that Plum Hall and Perennial were two test suites that are widely recognised for testing the language. As has also been pointed out (several times), they are part of a test regime. Not all of it.
> But of course I test my fixes. I test those on *my* test workloads
> until I'm confident that they work.
The authors of Plum-Hall and Perennial have a provenance. Also the commercial compiler will have all the documentation, documented processes (and this will have to be a suitable process), full histories and bug fixes by named, qualified and experienced people. Also the validated compiler will be independently tested by qualified and experienced people.

Your "confident" assurances amount to what? Not a lot really? You expect me to bet my life and the lives of others on your say so?
> But then I'm back to "go", trying to convince the tech support guys that
> I'm not a clueless first-semester student, but that this source code
> patch actually fixes a problem...
Not convinced me so far.
>>> I can read and write C code quite well. If I can test whether my
>>> selfmade function works, I can also test someone else's function. Or
>>> someone else's function after my modification.

>>> Maybe if you buy "100% ANSI C" products.
>>
>> I think you mean ISO 9899 not ANSI C...... but we don't need to be
>> precise :-)
> ISO/IEC 9899:1999, to be even more nitpicking.
Not at all. You are wrong. It will more probably be 9899:1990 + A1 + TC1 + TC2 + TC3.
> Still you read "100% ANSI C" quite often. Or just "Standard C",
> whatever that means...
Quite so, what are you testing to? You seem quite slipshod on some points. We are discussing validation of compilers and you are vague about the standard you are testing to. It hardly fills me with any confidence.
Chris H wrote:
<snip>
> Your "confident" assurances amount to what? Not a lot really? You
> expect me to bet my life and the lives of others on your say so?
Yes, that is *exactly* what people developing safety critical systems expect you to do. The compiler is just one of the many tools used by one of the many developers during the design of one of the many parts of any given safety critical system. Whether the particular compiler is Plum Hall tested or not is a tiny drop in the ocean of the required careful development procedures and the required testing. You seem to be of the impression that it is a critical part, and that Plum Hall testing makes the final product safer even if the compiler and/or library has bugs! If the safety-critical software developers are doing their jobs, then bugs in the compiler and library will be spotted during *their* testing.
Chris H wrote:
> In message <ggc1g9.13c.1@stefan.msgid.phost.de>, Stefan Reuther
>> But of course I test my fixes. I test those on *my* test workloads
>> until I'm confident that they work.
>
> The authors of Plum-Hall and Perennial have a provenance. Also the
> commercial compiler will have all the documentation, documented
> processes (and this will have to be a suitable process), full histories
> and bug fixes by named qualified and experienced people. Also the
> validated compiler will be Independently tested by qualified and
> experienced people
>
> Your "confident" assurances amount to what? Not a lot really? You
> expect me to bet my life and the lives of others on your say so?
Okay, let's make it short: What you're saying is that I should better stop finding, fixing and reporting bugs, because I'm too incompetent for it, and cannot test the fixes in a documented, proven, whatever way as Plum Hall and Perennial do. I should better ship a system of which I know it misbehaves, but which has P&P's blessing. Right? Guess why I'm writing to tech support. To tell them there is a problem, propose a fix, and have them review and approve it! But this does not work if the 1st level supporter bounces back the report for silly unreasonable formal reasons without even reading it.
>> But then I'm back to "go", trying to convince the tech support guys that
>> I'm not a clueless first-semester student, but that this source code
>> patch actually fixes a problem...
>
> Not convinced me so far.
I had posted a real example. You have chosen to ignore it. Like the tech support guy. I have then chosen to ignore the bogus product. Stefan