On 6/10/2017 6:01 PM, George Neuner wrote:
> On Fri, 9 Jun 2017 22:50:31 -0700, Don Y <blockedofcourse@foo.invalid>
> wrote:
>
>> On 6/9/2017 7:14 PM, George Neuner wrote:
>>> On Fri, 9 Jun 2017 00:06:05 -0700, Don Y wrote:
>>>
>>>> ..., if you'd had a formal education in CS, it would be trivial to
>>>> semantically map the mechanisms to value and reference concepts.
>>>> And, thinking of "reference" in terms of an indication of WHERE
>>>> it is! etc.
>>>
>>> But only a small fraction of "developers" have any formal CS, CE, or
>>> CSE education. In general, the best you can expect is that some of
>>> them may have a certificate from a programming course.
>>
>> You've said that in the past, but I can't wrap my head around it.
>> It's like claiming very few doctors have taken any BIOLOGY courses!
>> Or, that a baker doesn't understand the basic chemistries involved.
>
> Comparatively few bakers actually can tell you the reason why yeast
> makes dough rise, or why you need to add salt to make things taste
> sweet. It's enough for many people to know that something works -
> they don't have a need to know how or why.
I guess different experiences. Growing up, I learned these sorts
of things by asking countless questions of the vendors we frequented.
Yeast vs. baking soda as leavening agent; baking soda vs. powder;
vs. adding cream of tartar; cake flour vs. bread flour; white sugar
vs. brown sugar; vege shortening vs. butter (vs. oleo/oil); sugar
as a "wet" ingredient; etc.
Our favorite baker was a weekly visit. He'd take me in the back room
(much to the chagrin of other customers) and show me the various bits
of equipment, what he was making at the time, his "tricks" to eke a
bit more life out of something approaching its "best by" date, etc.
[I wish I'd pestered him, more, to learn about donuts and, esp, bagels
as he made the *best* of both! OTOH, probably too many details for
a youngster to commit to memory...]
The unfortunate thing (re: the US style of measurement by volume) is that
you don't have as fine a control over some of the ingredients (e.g.,
what proportion of "other ingredients" per "egg unit").
[I've debated purchasing a scale just to weigh eggs! Not to tweak
the amount of other ingredients proportionately but, rather, to
select a "set" of eggs closest to a target weight for a particular
set of "other ingredients". Instead, I do that "by feel", presently
(one of the aspects of my Rx's that makes them "non-portable" -- the
other being my deliberate failure to upgrade the written Rx's as I
improve upon them). Leaves folks wondering why things never come
out "as good" when THEY make them... <grin>]
> WRT "developers":
>
> A whole lot of "applications" are written by people in professions
> unrelated to software development. They become "developers" de facto
> when their programs get passed around and used by others.
>
> Consider all the scientists, mathematicians, statisticians, etc., who
> write data analysis programs in the course of their work.
>
> Consider all the data entry clerks / "accidental" database admins who
> end up having to learn SQL and form coding to do their jobs.
>
> Consider the frustrated office workers who study VBscript or
> Powershell on their lunch hour and start automating their manual
> processes to be more productive.
>
> : < more examples elided - use your imagination >
>
> Some of these "non-professional" programs end up being very effective
> and reliable. The better ones frequently are passed around, modified,
> extended, and eventually are coaxed into new uses that the original
> developer never dreamed of.
>
> Then consider the legions of (semi)professional coders who maybe took
> a few programming courses, or who learned on their own, and went to
> work writing, e.g., web applications, Android apps, etc.
>
> It has been estimated that over 90% of all software today is written
> by people who have no formal CS/CE/CSE or IS/IT education, and 40% of
> all programmers are employed primarily to do something other than
> software development.
>
> Note: programming courses != CS/CE/CSE education
And these folks tend to use languages (and tools) that are tailored to
those sorts of "applications". Hence the scripting language in my
design; I've no desire to force folks to understand data types,
overflow, mathematical precision, etc.
"I have a room that is 13 ft, 2-1/4 inches by 18 ft, 3-3/8 inches.
Roughly how many 10cm x 10cm tiles will it take to cover the floor?"
Why should the user have to normalize to some particular unit of measure?
All he wants, at the end, is a dimensionless *count*.
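A back-of-the-envelope version of what the scripting layer should do
behind the user's back (a minimal C sketch, for illustration only;
grout lines, cuts, and waste are all ignored):

  #include <stdio.h>

  /* Mixed US units in, dimensionless count out. */
  int main(void)
  {
      const double M_PER_FT = 0.3048, M_PER_IN = 0.0254;

      double width  = 13 * M_PER_FT + 2.25  * M_PER_IN;  /* 13' 2-1/4" */
      double length = 18 * M_PER_FT + 3.375 * M_PER_IN;  /* 18' 3-3/8" */
      double tile   = 0.10 * 0.10;                       /* m^2 each   */

      printf("roughly %.0f tiles\n", width * length / tile);
      return 0;
  }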
[I was recently musing over the number of SOIC8 devices that could fit
on the surface of a sphere having a radius equal to the average distance
of Pluto from the Sun (idea came from a novel I was reading). And, how
much that SOIC8 collection would *weigh*...]
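The musing itself is a one-liner once the units are normalized (a
sketch; the SOIC8 footprint and mass are my guesses -- ~6mm x 5mm with
leads, ~75mg -- and Pluto's mean distance is taken as 39.5 AU):

  #include <stdio.h>

  int main(void)
  {
      const double AU = 1.496e11;                  /* meters           */
      double r    = 39.5 * AU;                     /* sphere radius    */
      double area = 4.0 * 3.14159265358979 * r * r;
      double n    = area / (0.006 * 0.005);        /* devices that fit */

      printf("~%.1e devices, ~%.1e kg\n", n, n * 75e-6);
      return 0;
  }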
>>> I knew someone who was taking a C programming course, 2 nights a week
>>> at a local college. After (almost) every class, he would come to me
>>> with questions and confusions about the subject matter. He remarked
>>> on several occasions that I was able to teach him more in 10 minutes
>>> than he learned in a 90 minute lecture.
>>
>> But I suspect you had a previous relationship with said individual.
>> So, knew how to "relate" concepts to him/her.
>
> In this case, yes. But I also had some prior teaching experience.
>
> I rarely have much trouble explaining complicated subjects to others.
> As you have noted in the past, it is largely a matter of finding
> common ground with a student and drawing appropriate analogies.
Exactly. I had a lady friend many years ago to whom I'd always explain
computer-related issues (more typically operational than theoretical ones)
using "kitchen analogies". In a playful mood, one day, she chided me for
the misogynistic examples. So, I started explaining things in terms
of salacious "bedroom activities". Didn't take long for her to request
a return to the kitchen analogies! :>
>>> CS programs don't teach programming - they teach "computer science".
>>> For the most part CS students simply are expected to know.
>>
>> I guess I don't understand the difference.
>>
>> In my mind, "programming" is the plebian skillset.
>
> Only sort of. Programming is fundamental to computer *engineering*,
> but that is a different discipline.
>
> Computer "science" is concerned with
> - computational methods,
> - language semantics,
> - ways to bridge the semantic gap between languages and methods,
> - design and study of algorithms,
> - design of better programming languages [for some "better"]
> - ...
> Programming per se really is not a requirement for a lot of it. A
> good foundation of math and logic is more necessary.
Petri nets, lambda calculus, S-machines, etc.
But, to become *practical*, these ideas have to eventually be bound
to concrete representations. You need ways of recording algorithms
and verifying that they do, in fact, meet their desired goals.
I know no one who makes a living dealing in abstractions, entirely.
Even my physics friends have lives beyond a blackboard.
>>> CSE programs are somewhat better because they [purport to] teach
>>> project management: selection and use of tool chains, etc. But that
>>> can be approached largely in the abstract as well.
>>
>> This was an aspect of "software development" that was NOT stressed
>> in my curriculum. Nor was "how to use a soldering iron" in the
>> EE portion thereof (the focus was more towards theory with the
>> understanding that you could "pick up" the practical skills relatively
>> easily, outside of the classroom)
>
> Exactly! If you can't learn to solder on your own, you don't belong
> here. CS regards programming in the same way.
But you can't examine algorithms and characterize their behaviors,
costs, etc. without being able to reify them. You can't just
magically invent an abstract language that supports:
solve_homework_problem(identifier)
>> I just can't imagine how you could explain "programming" a machine to a
>> person without that person first understanding how the machine works.
>
> Take a browse through some classics:
>
> - Abelson, Sussman & Sussman, "Structure and Interpretation of
> Computer Programs" aka SICP
>
> - Friedman, Wand & Haynes, "Essentials of Programming Languages"
> aka EOPL
All written long after I'd graduated. :> Most (all?) of my college
CS courses didn't have "bound textbooks". Instead, we had collections
of handouts coupled with notes that formed our "texts". In some cases,
the handouts were "bound" (e.g., a cheap "perfect binding" paperback)
for convenience as the instructors were writing the texts *from*
their teachings.
Sussman taught one of my favorite courses and I'm chagrined that
all I have to show for it are the handouts and my notes -- it would
have been nicer to have a lengthier text that I could explore at
my leisure (esp after the fact).
The books that I have on the subject predate my time in college
(I attended classes at local colleges at night and on weekends
while I was in Jr High and High School). Many of the terms used
in them have long since gone out of style (e.g., DASD, VTOC, etc.)
I still have my flowcharting template and some FORTRAN coding forms
for punched cards... I suspect *somewhere* these are still used! :>
Other texts from that period are amusing to examine to see how
terminology and approaches to problems have changed. "Real-time"
being one of the most maligned terms! (e.g., Caxton's book)
> There are many printings of each of these. I happen to have SICP 2nd
> Ed and EOPL 8th Ed on my shelf.
>
> Both were - and are still - widely used in undergrad CS programs.
>
> SICP doesn't mention any concrete machine representation until page
> 491, and then a hypothetical machine is considered with respect to
> emulating its behavior.
>
> EOPL doesn't refer to any concrete machine at all.
>
>>> What would you have done differently if C were not available for
>>> writing your applications? How exactly would that have impacted your
>>> development?
>>
>> The applications are written in Limbo. I'd considered other scripting
>> languages for that role -- LOTS of other languages! -- but Limbo already
>> had much of the support I needed to layer onto the "structure" of my
>> system. Did I want to invent a language and a hosting VM (to make it
>> easy to migrate applications at run-time)? Add multithreading hooks
>> to an existing language? etc.
>>
>> [I was disappointed with most language choices as they all tend to
>> rely heavily on punctuation and other symbols that aren't "voiced"
>> when reading the code]
>
> Write in BrainF_ck ... that'll fix them.
>
> Very few languages have been deliberately designed to be read. The
> very idea has negative connotations because the example everyone jumps
> to is COBOL - which was too verbose.
Janus (Consistent System) was equally verbose. It's what I think of when
I'm writing SQL :< An 80-column display was dreadfully inadequate!
> It's also true that reading and writing effort are inversely related,
> and programmers always seem to want to type fewer characters - hence
> the proliferation of languages whose code looks suspiciously like line
> noise.
Yes, but if you're expecting to exchange code snippets with folks
who can't *see*, the imprecision of "speaking" a program's contents
is fraught with opportunity for screwups -- even among "professionals"
who know where certain punctuation is *implied*.
Try dictating "Hello World" to a newbie over the phone...
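Even the canonical example is dense with symbols that must be voiced --
or correctly *implied*:

  /* every non-letter below has to be spoken: hash, angle brackets,
     dot, parens, braces, quotes, backslash, semicolons... */
  #include <stdio.h>

  int main(void)
  {
      printf("Hello World\n");
      return 0;
  }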
I actually considered altering the expression syntax to deliberately
render parens unnecessary (and illegal). I.e., if an expression
can have two different meanings with/without parens, then ONLY the
meaning without parens would be supported.
But, this added lots of superfluous statements just to meet that
goal *and* quickly overloads STM as you try to keep track of
which "component statements" you've already encountered:
area = (width_feet+(width_inches/12))*(length_feet+(length_inches/12))
becomes:
width = width_feet + width_inches/12
length = length_feet + length_inches/12
area = length * width
[Imagine you were, instead, computing the *perimeter* of a 6 walled room!]
> I don't know about you, but I haven't seen a teletype connected to a
> computer since about 1972.
Actually, I have one :>
>> You obviously have to understand the CONCEPT of "multiplication" in
>> order to avail yourself of it. But, do you care if it's implemented
>> in a purely combinatorial fashion? Or, iteratively with a bunch of CSA's?
>> Or, by tiny elves living in a hollow tree?
>
> Rabbits are best for multiplication.
Or, Adders and log tables! (bad childhood joke)
>> In my case, you have to understand that each function/subroutine invocation
>> just *appears* to be a subroutine/function invocation. That, in reality,
>> it can be running code on another processor in another building -- concurrent
>> with what you are NOW doing (this is a significant conceptual difference
>> between traditional "programming" where you consider everything to be a
>> series of operations -- even in a multithreaded environment!).
>>
>> You also have to understand that your "program" can abend or be aborted
>> at any time. And, that persistent data has *structure* (imposed by
>> the DBMS) instead of being just BLOBs. And, that agents/clients have
>> capabilities that are finer-grained than "permissions" in conventional
>> systems.
>>
>> But, you don't have to understand how any of these things are implemented
>> in order to use them correctly.
>
> Which is one of the unspoken points of those books I mentioned above:
> that (quite a lot of) programming is an exercise in logic that is
> machine independent.
>
> Obviously I am extrapolating and paraphrasing, and the authors did not
> have device programming in mind when they wrote the books.
>
> Nevertheless, there is lot of truth in it: identifying required
> functionality, designing program logic, evaluating and choosing
> algorithms, etc. ... all may be *guided* in situ by specific knowledge
> of the target machine, but they are skills which are independent of
> it.
But I see programming (C.A.E) as having moved far beyond the sorts of
algorithms you would run on a desktop, mainframe, etc. It's no longer
just "this operator, in combination with these arguments, yields
this result."
When I was younger, I'd frequently use "changing a flat tire" as an
example to coax folks into describing a "familiar" algorithm. It
was especially helpful at pointing out all the little details that
are so easy to forget (omit) that can render an implementation
ineffective, buggy, etc.
"Wonderful! Where did you get the spare tire from?"
"The trunk!"
"And, how did you get it out of the trunk?"
"Ah, I see... 'I *opened* the trunk!'"
"And, you did this while seated behind the wheel?"
"Oh, OK. 'I got out of the car and OPENED the trunk'"
"While you were driving down the road?"
"Grrr... 'I pulled over to the shoulder and stopped the car; then got out'"
"And got hit by a passing vehicle?"
Now, it's not just about the language and the target hardware but, also,
the execution environment, OS, etc.
Why are people surprised to discover that it's possible for <something> to
see partial results of <something else's> actions? (i.e., the need for
atomic operations) Or, to be frustrated that such problems are so hard
to track down?
(In a multithreaded environment,) we all know that the time between
execution of instruction N and instruction N+1 can vary -- from whatever
the "instruction rate" of the underlying machine happens to be up to
the time it takes to service all threads at this, and higher, priority...
up to "indefinite". Yet, how many folks are consciously aware of that
as they write code?
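The classic read-modify-write race makes both points concrete (a
minimal POSIX threads sketch; build with -pthread):

  #include <pthread.h>
  #include <stdio.h>

  static long counter;               /* shared, unprotected */

  static void *bump(void *arg)
  {
      for (int i = 0; i < 1000000; i++)
          counter++;                 /* NOT atomic: load, add, store.
                                        Anything -- or "indefinitely"
                                        much -- can happen in between */
      return NULL;
  }

  int main(void)
  {
      pthread_t a, b;
      pthread_create(&a, NULL, bump, NULL);
      pthread_create(&b, NULL, bump, NULL);
      pthread_join(a, NULL);
      pthread_join(b, NULL);
      printf("counter = %ld\n", counter);   /* rarely 2000000 */
      return 0;
  }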
A "programmer" can beat on a printf() statement until he manages to stumble
on the correct combination of format specifiers, flags, arguments, etc.
But, will it ever occur to him that the printf() can fail, at RUNtime?
Or, the NEXT printf() might fail while this one didn't?
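It can. printf() returns a negative value on failure -- a full disk
under a redirected stdout, a closed pipe, a dropped console -- and
nothing says the next call fails just because this one succeeded, or
vice versa. A sketch of what acknowledging that looks like:

  #include <stdio.h>

  int main(void)
  {
      if (printf("status: %d\n", 42) < 0) {
          /* stdout is gone; have a policy, even if it's
             just noticing */
          return 1;
      }
      return 0;
  }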
How many "programmers" know how much stack to allocate to each thread?
How do they decide -- wait for a stack fence to be breached, then
increase the allocation and try again? Are they ever *sure* that they've
got the correct, "worst case" value?
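With POSIX threads the size is an explicit choice, and "how big?" is
exactly the question (a sketch; the 64 KiB here is a guess, not an
analysis):

  #include <pthread.h>

  static void *worker(void *arg)
  {
      /* deep call chains, big locals, library appetites... */
      return NULL;
  }

  int main(void)
  {
      pthread_attr_t attr;
      pthread_t tid;

      pthread_attr_init(&attr);
      /* Too small: a fault (or silent corruption) when the fence
         is breached.  Too big: wasted RAM, multiplied by N threads. */
      pthread_attr_setstacksize(&attr, 64 * 1024);

      pthread_create(&tid, &attr, worker, NULL);
      pthread_join(tid, NULL);
      pthread_attr_destroy(&attr);
      return 0;
  }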
I.e., there are just too many details of successful program deployment
that never surface in the rich and tame "classroom environment".
This is especially true as we move towards scenarios
where things "talk to" each other, more (for folks who aren't prepared
to deal with a malloc/printf *failing*, how do they address "network
programming"? Or, RPC/RMI? etc.)
It's easy to see how someone can coax a piece of code to work in a
desktop setting -- and fall flat on their face when exposed to a
less friendly environment (i.e., The Real World).
[Cookies tonight (while its below 100F) and build a new machine to replace
this one. Replace toilets tomorrow (replaced flange in master bath today).]