On 18.6.14 01:56, Tom Gardner wrote:
> On 17/06/14 21:31, Tauno Voipio wrote:
>> On 5 channel tape all holes was letter shift, where extra
>> ones were ignored, so it could be used as rubout.
>
> There were, of course, multiple incompatible 5
> channel encodings. My first machine code program,
> in which I unknowingly reinvented the concept of
> a simple FSM, converted between two of them (the
> teleprinter we had at school and the Elliott
> teleprinter/computer we used at the local
> Tech College)

So did I, at Helsinki University of Technology. Even the Elliott
5 track code used the same letter and figure shift characters as
the common teleprinter.

In the 60's, it was customary that each computer manufacturer felt
itself obliged to have its own character coding. Elliott was one of
them, as well as e.g. IBM and Sperry Rand Univac.

--
-TV
filling remaining array elements with fixed value
Started by ●June 12, 2014
Reply by ●June 18, 2014
Reply by ●June 19, 2014
On 2014-06-13, Niklas Holsti <niklas.holsti@tidorum.invalid> wrote:
> How sure are you that your virtual machine snapshot, taken in 2014 on
> your current PC and hypervisor, will run on your brand-new PC in the
> year 2029?
>
> As I understand them, what are called "virtual machines" on PCs only
> virtualize as little of the machine as is necessary to support multiple
> OS's on the same hardware, but are not full emulations of the PC
> processor and I/O. I have not seen any promises from hypervisor vendors
> to support 15-year-old VM snapshots on future PC architectures, which
> may be quite different.
>
> This question is of interest to me because I am working on projects with
> maintenance foreseen until the 2040's. Some people have suggested
> virtual machines as the solution for keeping the development tools
> operational so long, but I am doubtful.

I have a 32-bit VMWare virtual machine that was created nine years ago
that is still in use today. It has transitioned from VMWare version 5
through versions 6, 7, 8, 9, and 10 with no issues, and its host PC has
transitioned from a 32-bit processor and 32-bit OS to a 64-bit processor
and 64-bit OS with no issues.

--
John W. Temples, III
Reply by ●June 20, 2014
On 14-06-20 03:28, John Temples wrote:
> On 2014-06-13, Niklas Holsti <niklas.holsti@tidorum.invalid> wrote:
>
>> How sure are you that your virtual machine snapshot, taken in 2014 on
>> your current PC and hypervisor, will run on your brand-new PC in the
>> year 2029?
>>
>> As I understand them, what are called "virtual machines" on PCs only
>> virtualize as little of the machine as is necessary to support multiple
>> OS's on the same hardware, but are not full emulations of the PC
>> processor and I/O. I have not seen any promises from hypervisor vendors
>> to support 15-year-old VM snapshots on future PC architectures, which
>> may be quite different.
>
>> This question is of interest to me because I am working on projects with
>> maintenance foreseen until the 2040's. Some people have suggested
>> virtual machines as the solution for keeping the development tools
>> operational so long, but I am doubtful.
>
> I have a 32-bit VMWare virtual machine that was created nine years ago
> that is still in use today. It has transitioned from VMWare version 5
> through versions 6, 7, 8, 9, and 10 with no issues, and its host PC
> has transitioned from a 32-bit processor and 32-bit OS to a 64-bit
> processor and 64-bit OS with no issues.

That is an impressive example of the upwards compatibility of the x86
architecture -- but still only within that architecture family. With
Moore's law slowing down or perhaps stagnating, I'm not at all sure that
x86 will still be in common use in the 2040's. Maybe it will, maybe it
won't.

--
Niklas Holsti
Tidorum Ltd

niklas holsti tidorum fi
      .      @       .
Reply by ●June 20, 2014
Op 20-Jun-14 14:55, Niklas Holsti schreef:
> On 14-06-20 03:28, John Temples wrote:
>> On 2014-06-13, Niklas Holsti <niklas.holsti@tidorum.invalid> wrote:
>>
>>> How sure are you that your virtual machine snapshot, taken in 2014 on
>>> your current PC and hypervisor, will run on your brand-new PC in the
>>> year 2029?
>>>
>>> As I understand them, what are called "virtual machines" on PCs only
>>> virtualize as little of the machine as is necessary to support multiple
>>> OS's on the same hardware, but are not full emulations of the PC
>>> processor and I/O. I have not seen any promises from hypervisor vendors
>>> to support 15-year-old VM snapshots on future PC architectures, which
>>> may be quite different.
>>
>>> This question is of interest to me because I am working on projects with
>>> maintenance foreseen until the 2040's. Some people have suggested
>>> virtual machines as the solution for keeping the development tools
>>> operational so long, but I am doubtful.
>>
>> I have a 32-bit VMWare virtual machine that was created nine years ago
>> that is still in use today. It has transitioned from VMWare version 5
>> through versions 6, 7, 8, 9, and 10 with no issues, and its host PC
>> has transitioned from a 32-bit processor and 32-bit OS to a 64-bit
>> processor and 64-bit OS with no issues.
>
> That is an impressive example of the upwards compatibility of the x86
> architecture -- but still only within that architecture family. With
> Moore's law slowing down or perhaps stagnating, I'm not at all sure that
> x86 will still be in common use in the 2040's. Maybe it will, maybe it
> won't.

Even if it won't, chances are that there will be x86 emulators (and
associated PC hardware) for whatever platform is mainstream in 2040,
just like today where you have (very accurate) emulators for computers
that became obsolete decades ago (like the Commodore 64).
Reply by ●June 20, 2014
On Fri, 20 Jun 2014 15:55:49 +0300, Niklas Holsti
<niklas.holsti@tidorum.invalid> wrote:
> On 14-06-20 03:28, John Temples wrote:
>
>> I have a 32-bit VMWare virtual machine that was created nine years ago
>> that is still in use today. It has transitioned from VMWare version 5
>> through versions 6, 7, 8, 9, and 10 with no issues, and its host PC
>> has transitioned from a 32-bit processor and 32-bit OS to a 64-bit
>> processor and 64-bit OS with no issues.
>
> That is an impressive example of the upwards compatibility of the x86
> architecture -- but still only within that architecture family. With
> Moore's law slowing down or perhaps stagnating, I'm not at all sure that
> x86 will still be in common use in the 2040's. Maybe it will, maybe it
> won't.

I wouldn't worry about it too much - you'll just get wrinkles.

Technically, the x86 "architecture" is smoke and mirrors now. Current
x86 CPUs don't execute x86 instructions at all ... they dynamically
translate x86 code into an internal RISC-like instruction set [which
varies by generation]. In terms of their micro-architecture, current
x86 CPUs are massively OoO, load/store, complex RISC machines having
many hundreds of registers. For the past 20 years - since the Pentium
Pro - x86 CPUs haven't implemented the x86 programming model, they have
*emulated* it.

I doubt x86 will ever completely disappear. If we do get to the point
that there aren't OTS CPUs that natively "run" x86 code, there will be
FPGA cores, software emulators and x86->? binary converters for
migrating software to whatever is the new dominant architecture.

George
Reply by ●June 25, 2014
Hi David,

[No time for a "thorough" reply -- instead, I'll just address the
main points. Therefore, *much* elided...]

On 6/16/2014 1:34 AM, David Brown wrote:
>>> Or you and he can continue to keep one foot nailed to the floor by
>>
>> -----^^^
>>
>>> thinking that anything free or open source must be so amateurish that no
>>> sane company would use them, and that every problem must be beaten to
>>> death by a hammer because no other tools are allowed.
>>
>> FOR THE RECORD, I probably use and *deploy* more FOSS than anything
>> *you* use/deploy. All of *my* software development work is done with
>> FOSS tools (I rely on Solaris and Windows based "proprietary" tools
>> for certain documentation, test, and hardware design tasks for which
>> the available FOSS tools *pale*).
>>
>> The fact that *EVERYTHING* I am currently writing (software and docs)
>> and designing (hardware) will be available under an unencumbered
>> (*non*-GPL) license demonstrates my commitment to the concept of
>> "Open" hardware/software.
>>
>> How many *thousands* of hours of *your* time are you GIVING AWAY?
>>
>> "Nailed to the floor"? Hardly!
>
> That's great - but why do you continually refer to FOSS so disparagingly?
>
> "Look at some of the FOSS projects and the hodgepodge of tools
> they (somewhat arbitrarily) rely upon for proof of this. I.e.,
> it's *fine* -- if you want to drink the koolade..."

Here, I was referring to the environments they put in place to
develop/support the "product's development", not the actual resulting
products.

E.g., I recall one project released as bz2 archives -- before bz2 was
in "widespread" use. So, before I can even get started *looking* at
it, I have to build and install a bz2 executable. Why? Would a
gzip'ed tarball have been *that* much bigger? Would the bz2 tarball
save hours/minutes of download time? Gobs of disk space (assuming you
*don't* unpack it)? Some reason you can't release a gzip tarball
alongside the bz2? (this is what they did -- AFTER I complained)

Is there a reason you have to use odemake, pmake, bmake, etc.? Or,
worse yet, some completely *hacked* build system (look, for example,
at Jaluna's build system :< )? If so, do you bother to share the
reasons *why* you REQUIRE it? Or, was it just a whim? NIH? Or,
worse, an "experiment"?

Is there a reason your documentation can't be a regular man page? Why
an info file? Why a set of web pages? A LaTeX document? I.e., what
*else* do I have to build in order to build the entire complement of
your deliverables? Or, a Perl script that emulates what two lines of
sed(1) could accomplish?

Granted, the "younger generation" is more geared towards newer tools
(e.g., Perl in lieu of awk) than those of us with longer "histories"
would have employed in many similar roles. But, often the choices
appear arbitrary -- as if someone was using this "project" as a chance
to "play" (because he didn't have to answer to a PHB!)

[I suspect "play"/experimentation is the underlying basis for many of
these "decisions". And, even if they prove to have been *bad*
choices, there's little inclination to go back and "fix" things:
"Gee, we should have just used make instead of this bizarre set of
hand-crafted scripts..."]

The other aspect of FOSS that is annoying is its "incompleteness".
Many are simply "ideas" and not full-fledged "products". Like adding
"reverse" to a vehicle but never quite sorting out the fact that
velocity backwards should be at a much slower rate (gearbox/tranny)
than *forward*! "We'll fix that in the next release! (apologies to
anyone who backed into a wall due to our current oversight)"

How did you *test* your "product"? (Ooops!) Did you forget that
little detail? Or, was it all just ad hoc testing "on the fly" as you
were writing the code? How can *I* verify that it does what you
*think* it does? How do I know *how* to test it? How do I know what
the weaknesses in its algorithms are likely to be (without studying
them in detail -- with the COPIOUS documentation that you've probably
forgotten to include!)?

Which features have you incompletely implemented (before being lured
off to add yet more *incomplete* features)? Which things should I
*not* expect to work? Which should I avoid *using* (if I am just a
CONSUMER of your product) and which should I concentrate on testing
and improving (if I am a fellow developer)? How do I even know what
it is *supposed* to do (formally) if you haven't documented that?
(and, how did YOU know how to *test* it if you didn't have a formal
statement of how it was supposed to work?)

Are the "default options" for the application accurately indicated in
the documentation? Or, do they actually *differ* from what the
documentation states? Are there *other* options that are not
documented? Is this omission intentional (perhaps they are
obsolescent or not completely implemented) or accidental?

IME, there are very few FOSS projects that satisfy these criteria.

> Now, I certainly won't contest that /some/ FOSS projects are a
> hodgepodge, or "amateurish" or "hobbyist", as you also describe them.
>
> I feel I am getting a lot of mixed messages here. I am hearing that you
> wouldn't recommend Python (or other tools) because they are a
> "hodgepodge of tools" written by amateur hobbyists, and not suitable for
> serious use - and I am also hearing that you are a strong FOSS supporter.

Where did I claim they were written by hobbyists? My only reference
to that term was in the context of CPU emulators:

  "Can you find a 68040 emulator? 99000? 32000? Z380? etc.
  (*other* than "hobbyist" attempts)"

So, *can* you find one that is actively maintained by a "capable"
community? One on whose efforts you would be willing to rest the
future of your project? Can you find a formal definition of the
contract that it *claims* to make with the "hosted applications"?

[20+ years after MS introduced Windows, can Wine *yet* emulate
EVERYTHING that you could do on a "286"? Is there a list that lets a
potential user evaluate whether it is even worth *trying* their
favorite app? Any guarantee that even if it *looks* like it is
working that they won't later discover some feature in the app that
*won't* work properly?]

What annoys me about most FOSS is that most don't treat their "output"
as a genuine (supportable) *product*.

[You hear developers lament that all these "problems" IN THEIR DAY
JOBS are the result of pressures from PHB's, marketing, etc. Yet,
when these same developers operate in an environment where those
pressures are nonexistent, the same problems manifest! I.e., the
common issue is the very developers who tried to lay the blame on
their bosses! :< Yeah, I get it. Documentation and testing aren't
fun. It should only be required for medical/safety -- let Wild West
Software prevail everywhere else in the name of "time to market"...]

Yeah, I know... documentation and testing are "no fun". But,
presumably, you *want* people to use your "product" (even if it is
FREE) so wouldn't you want to facilitate that? I'm pretty sure folks
don't want to throw lots of time and effort into something only to see
it *not* used!

Granted, the "development" issue that I initially discussed is a tough
one -- how can I *expect* all FOSS developers to "settle" on a
common/shared set of tools/approach? While this is common *within* an
organization, it would be ridiculous to expect Company A to use the
same tools and process that Company B uses! And, griping about it
would be presumptuous. OTOH, it's fair to gripe when Company
(Organization) X does things in a way that is needlessly more
complicated or dependent (on a larger base of tools) than it needs to
be.

> Clearly you /are/ a strong FOSS supporter - but, at least in this
> thread, you are coming across as being against it in principle.
>
> So I am sorry if I misread or misinterpreted here (and even sorrier if I
> mixed things up somehow), but that is how your posts appeared to me.

When I started my current set of (FOSS) projects, I was almost in a
state of panic over the "requirements" it would impose on others. Too
many tools, too much specialized equipment, skillsets, etc. After
fretting about this for quite some time -- constantly trying to
eliminate another "dependency" -- I finally realized the "minimum" is
"a lot" and that's just the way it is!

OTOH, it is highly unlikely that some *one* person will need to
undertake all of these activities. So, the requirements can be spread
out over a *group* -- even if it is a "local" group -- instead of a
single developer. *You* don't have to be fluent in all four of the
languages used in the system (not counting application-specific
languages). Nor do you have to be savvy with the various hardware
design, fab and test tools. Or, the tools used to prepare developer,
user and run-time documentation. Just some subset of those.

And, *I* will endeavor to pick good tools that adequately address
their respective needs so you aren't forced to use two *different*
tools (e.g., languages) for the same class of "task". As such, you
only have to invest (time and/or money) in a *minimal* set of tools.

[Which is very different from "chasing FOSS" in general]
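One concrete instance of the "Perl script that emulates what two lines of sed(1) could accomplish" complaint earlier in this post: the sort of task meant is trivial line filtering. A hypothetical example (file names invented) where two sed expressions do the whole job:

```shell
# Create a sample input file (hypothetical content).
printf 'keep this line   \n# a comment\nanother line\t\n' > sample.txt

# Two sed expressions: strip trailing whitespace, drop comment lines.
sed -e 's/[[:space:]]*$//' -e '/^#/d' sample.txt > cleaned.txt

cat cleaned.txt
# keep this line
# another line
```

No interpreter startup, no extra runtime dependency for whoever builds the package; sed(1) is available on effectively every POSIX system.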
Reply by ●June 25, 2014
On 6/16/2014 5:27 AM, Simon Clubley wrote:
> On 2014-06-16, David Brown <david.brown@hesbynett.no> wrote:
>> On 16/06/14 00:34, Don Y wrote:
>>>
>>> And the same could be said for C++, Java, Perl, etc. What will be the
>>> language du jour *next* year?
>>
>> That's a valid point - but Python has been a common "language du jour"
>> for such uses for the last ten years, and shows no sign of being replaced.
>
> No, but it's been through an incompatible language revision.

This is true of other languages, as well. E.g., foo =- 2;

Every language (tool) you rely upon in a project creates a dependency
on that tool and the staff who must be able to *use* it. Try taking
someone accustomed to C11 (i.e., a recent grad) and sit them down in
front of a C89 codebase WITH C89 TOOLS (because your industry requires
tools to be formally certified prior to use). They will spend
countless hours wondering why their "perfect" code is throwing
compiler errors. And, once you've clued them in to the reason,
they'll be fuming /sotto voce/ each time they stumble on another
"gotcha". I.e., the tool is "fighting them" -- they're not a "happy
camper".

> (Which is probably a sign of language maturity. :-))
>
> Don't forget as well that a decade ago Perl was very popular and now
> it's a legacy language.

Where will C++, Java, Python, Perl, Ruby, etc. be "a few years hence"?
Where will the folks who can efficiently *develop* with them be?

(e.g., 6502 ASM programmers are still in demand -- because no one
*writes* 6502 ASM nowadays, "mainstream")
Reply by ●June 25, 2014
On 2014-06-25, Don Y <this@is.not.me.com> wrote:
> On 6/16/2014 5:27 AM, Simon Clubley wrote:
>>
>> No, but it's been through an incompatible language revision.
>
> This is true of other languages, as well. E.g., foo =- 2;
>
> Every language (tool) you rely upon in a project creates a dependency
> on that tool and the staff who must be able to *use* it. Try taking
> someone accustomed to (i.e., recent grad) C11 and sit them down in
> front of a C89 codebase WITH C89 TOOLS (because your industry requires
> tools to be formally certified prior to use). They will spend
> countless hours wondering why their "perfect" code is throwing
> compiler errors.

Oh, that could be interesting. :-)

I've actually used those early C compilers (a long time ago).

> And, once you've clued them in to the reason, they'll be fuming /sotto
> voce/ each time they stumble on another "gotcha". I.e., the tool
> is "fighting them" -- they're not a "happy camper".
>
>> (Which is probably a sign of language maturity. :-))
>>
>> Don't forget as well that a decade ago Perl was very popular and now
>> it's a legacy language.
>
> Where will C++, Java, Python, Perl, Ruby, etc. be "a few years hence"?
> Where will the folks who can efficiently *develop* with them be?

This is why it's so important to learn the concepts first and the
language second. With that grounding, it's a lot easier to adapt to
new languages.

> (e.g., 6502 ASM programmers are still in demand -- because no one
> *writes* 6502 ASM nowadays, "mainstream")

I'll admit I didn't realise the 6502 was still around - I last used it
back in the BBC Model B days (ie: my school days). At the time I seem
to remember preferring the Z80 (can't remember why) although some
classmates seemed to prefer the 6502 architecture. [*]

What is the 6502 still used in?

Simon.

[*] Of course, these days, schoolchildren are more likely to be
comparing social media platforms instead of computer architectures. :-)

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
Reply by ●June 25, 2014
On 2014-06-25, Simon Clubley <clubley@remove_me.eisner.decus.org-Earth.UFP> wrote:
> What is the 6502 still used in?

I think it's still in use as a core in ultra high-volume, low-cost,
full-custom chips used in toys...

--
Grant Edwards               grant.b.edwards        Yow! My EARS are GONE!!
                                  at gmail.com
Reply by ●June 25, 2014
Hi Simon,

[apologies for forgetting this greeting on my previous post :< ]

On 6/25/2014 12:55 PM, Simon Clubley wrote:
>>> No, but it's been through an incompatible language revision.
>>
>> This is true of other languages, as well. E.g., foo =- 2;
>>
>> Every language (tool) you rely upon in a project creates a dependency
>> on that tool and the staff who must be able to *use* it. Try taking
>> someone accustomed to (i.e., recent grad) C11 and sit them down in
>> front of a C89 codebase WITH C89 TOOLS (because your industry requires
>> tools to be formally certified prior to use). They will spend
>> countless hours wondering why their "perfect" code is throwing
>> compiler errors.
>
> Oh, that could be interesting. :-)

You have an odd definition of "interesting"! :> I would use a phrase
more along the lines of "<Fexpletive> annoying!"

> I've actually used those early C compilers (a long time ago).

Granted, I picked on an obscure feature. But, there are lots of
assumptions that (we all) make, subtly, in our development
environments. E.g., nowadays, a "kid" (new grad) might be startled to
encounter 16-bit int's. And, be at a complete *loss* to understand
why his code is compiling yet *crashing* -- until this is made evident
to him.

[I've had to support lots of legacy codebases over the years so have
learned not to take anything for granted from "the" tools :< ]

>> And, once you've clued them in to the reason, they'll be fuming /sotto
>> voce/ each time they stumble on another "gotcha". I.e., the tool
>> is "fighting them" -- they're not a "happy camper".
>
>> Where will C++, Java, Python, Perl, Ruby, etc. be "a few years hence"?
>> Where will the folks who can efficiently *develop* with them be?
>
> This is why it's so important to learn the concepts first and the
> language second. With that grounding, it's a lot easier to adapt
> to new languages.

Yup. But, even there, we exhibit "biases" that we aren't even aware
of in how we think things "should be". E.g., I often write
self-modifying code. And, when I find myself working in an
architecture (or a particular implementation) that doesn't allow me to
do that, I feel stymied. As if I should be *entitled* to do that...

>> (e.g., 6502 ASM programmers are still in demand -- because no one
>> *writes* 6502 ASM nowadays, "mainstream")
>
> I'll admit I didn't realise the 6502 was still around - I last used
> it back in the BBC Model B days (ie: my school days). At the time I
> seem to remember preferring the Z80 (can't remember why) although
> some classmates seemed to prefer the 6502 architecture. [*]

It's the essence of the Moto v. Intel argument (in that time frame):
lots of registers, or *few* registers (with a "fast access" bank of
memory). I have a particular fondness for the Z80/180/etc. but admit
it is a helluva kludge. For its generation, it seemed to have "the
right number" of registers -- you weren't continually doing loads and
stores from A/B, etc. The 6502, OTOH, is a tiny, relatively clean
architecture (if you're discussing "programmable calculators" :> )

> What is the 6502 still used in ?

The folks that have inquired most in recent memory have been split
between military and (very) high-volume consumer goods. E.g., put the
processor on a tiny die with whatever other mixed-mode stuff you need
and have a true "single chip" solution to <whatever> problem.

> [*] Of course, these days, schoolchildren are more likely to be
> comparing social media platforms instead of computer architectures. :-)

Sadly, probably true!