On 02/11/2022 19:45, Rick C wrote:
> On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
>> On 02/11/2022 04:04, Rick C wrote:
>>> I recall in one of the early computer science classes I took, a
>>> professor defined an algorithm in a mathematical definition. She
>>> gave a list of properties an algorithm had to have to qualify as an
>>> algorithm. She also listed some features that were not required,
>>> such as being a computer program.
>>>
>> Just remember that any such definition will be a "lie-to-children" - an
>> oversimplification that covers what you need at the time, but not the
>> full picture.
>
> Sorry, I have no idea what you are talking about.
When you are explaining a complicated subject to someone who does not
know about it (and that's typically the case at school or university),
you simplify - you are not telling the truth, you are telling a "lie to
children". If the professor had tried to tell you everything she
understood by "algorithm", it would have taken the full course rather
than a single class. You were given a "lie to children" - and that is
entirely appropriate.
You appear to be mistaking it for a complete definition, which it is not.
>
>
>> (That is not a bad thing in any way - it is essential to
>> all learning. And it's the progress from oversimplifications upwards
>> that keeps things fun. If you are not familiar with the term
>> "lie-to-children", I recommend "The Science of the Discworld" that
>> popularised it.)
>>> I recall these features:
>>>
>>> 1) Output - without output a procedure is pointless.
>>>
>>> 2) Input - ??? I want to say input is optional. So a set of steps
>>> to calculate some number of digits of pi would qualify as an
>>> algorithm, in spite of not having inputs.
>> You could argue that there is always some kind of input. For example,
>> you could say that the digit index or the number of digits is the input
>> to the "calculate pi" algorithm.
>
> You can argue anything. A procedure to calculate pi without
> specifying how many digits would not be an algorithm because of the
> "finite" requirement. It doesn't need an input to set the number of
> digits if that is built into the procedure.
>
It is most certainly an algorithm - just not a finite algorithm. Your
arguments here are circular.
>
>> (As an aside, I find it fascinating that there is an algorithm to
>> calculate the nth digit of the hexadecimal expansion of pi that does not
>> need to calculate all the preceding digits.)
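(That aside deserves a footnote: the trick is the Bailey-Borwein-Plouffe
formula. Here is a minimal Python sketch - the function name is my own,
and the plain float arithmetic limits it to the first few million digits:

```python
def pi_hex_digit(n):
    # Hex digit of pi at 0-based position n after the point, via the
    # Bailey-Borwein-Plouffe formula - no earlier digits are computed.
    def series(j):
        # Fractional part of the sum over k of 16^(n-k) / (8k + j).
        s = 0.0
        for k in range(n + 1):
            # Modular exponentiation keeps the large powers of 16 exact.
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        # Tail terms with k > n shrink geometrically; a few suffice.
        t, k = 0.0, n + 1
        while True:
            term = 16.0 ** (n - k) / (8 * k + j)
            if term < 1e-17:
                break
            t += term
            k += 1
        return s + t

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(frac * 16)

# pi = 3.243F6A88... in hex, so the digits after the point start 2, 4, 3, F.
print([pi_hex_digit(i) for i in range(4)])
```

Note how position n is an input here whether you like it or not - that is
rather the point of the algorithm.)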
>>>
>>> 3) Steps - she used qualifiers that indicated the steps had to be
>>> clear and unambiguous.
>>>
>>> 4) Finite - the steps must come to an end, i.e. at some point the
>>> algorithm has to produce the result, it can't be infinite.
>>>
>> Is that really true?
>>
>> If you can accept a "calculate pi" algorithm that does not have an
>> input, then it is not finite.
>
> Not true.
Yes, true - the decimal expansion of pi never ends.
It is actually pretty irrelevant whether you consider a way to calculate
the first ten digits of pi as a function "calculate_pi_10()" or as
"calculate_pi(10)". It doesn't matter if the input is fixed in the
algorithm or not.
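To make that concrete, here is a sketch in Python - "machin_pi" is a name
of my own invention, and Machin's arctangent formula with scaled integer
arithmetic is just one of many ways to do the sum:

```python
def machin_pi(digits):
    # floor(pi * 10**digits) as a big integer (e.g. 314159 for digits=5),
    # using Machin's formula pi/4 = 4*arctan(1/5) - arctan(1/239).
    scale = 10 ** (digits + 10)          # ten guard digits

    def arctan_inv(x):
        # arctan(1/x) * scale, from the alternating Taylor series.
        total = term = scale // x
        n, sign = 3, -1
        while term:
            term //= x * x
            total += sign * (term // n)
            n, sign = n + 2, -sign
        return total

    pi = 4 * (4 * arctan_inv(5) - arctan_inv(239))
    return pi // 10 ** 10                # drop the guard digits

def calculate_pi_10():
    # The "input baked in" variant - the same algorithm underneath.
    return machin_pi(10)
```

Whether the 10 arrives as a parameter or is baked into the wrapper
changes nothing about the steps actually performed.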
(Can I assume you have never done any functional programming? You would
find it very enlightening - it is much more appropriate for appreciating
computation theory than imperative languages like C or Forth, or
hardware design languages.)
>
> If it has no definite end, it is not an algorithm. That's why an
> operating system is not typically an algorithm, it has no specific
> termination unless you have a command to shut down the computer, I suppose.
>
Again, you are giving a circular argument. Your objection to unending
algorithms is merely that they don't fit your remembered lesson that
talked about finite algorithms. (An OS would probably be better
described as a collection of algorithms rather than a single algorithm.)
>
>> I remember a programming task from my days at university (doing
>> mathematics and computation), writing a function that calculated the
>> digits of pi. The language in question was a functional programming
>> language similar to Haskell. The end result was a function that
>> returned an infinite list - you could then print out as many digits from
>> that list as you wanted.
>
> How did it return an infinite list? You mean it returned as many
> digits as you specified or waited to have calculated? If you have an
> infinite storage system, that's a pretty remarkable achievement. But I
> expect it would be powered by an infinite improbability drive.
>
No, it returned an infinite list.
I know you are familiar with Python, but I don't know at what level of
detail - let me give a comparison here.
When you write "range(1000000)" in Python 2, it builds a list of a
million numbers from 0 to 999,999. This takes time and memory. But if
you write "xrange(1000000)", you get back a lazy range object, quickly
and cheaply - it only produces the numbers as they are needed. (In
Python 3, "range" itself behaves this way.)
In Haskell, you can write:
numbers = 1 : [x + 1 | x <- numbers]
This would be like "numbers = [1] + [x + 1 for x in numbers]" in Python,
if such recursive statements were allowed.
This is an infinite list - it behaves in every way like an unending list
of integers, with each element computed only when something consumes it.
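The nearest thing in everyday Python is a generator, which is lazy in
the same way (the names here are my own choosing):

```python
from itertools import islice

def numbers():
    # Lazy "infinite list" of integers: each value is produced only
    # when the consumer asks for it, like the Haskell definition.
    n = 1
    while True:
        yield n
        n += 1

print(list(islice(numbers(), 5)))  # prints [1, 2, 3, 4, 5]
```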
>
>> So what was the output? What kind of output do you allow from your
>> algorithms? Does the algorithm have to stop and produce an output, or
>> can it produce a stream of output underway? Can the output be an
>> algorithm itself? Is it acceptable (according to your definition) for
>> the output of a "calculate pi" algorithm to be a tuple of a digit of pi
>> and an algorithm to calculate the next digit-algorithm tuple?
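That last shape is perfectly realisable, by the way. A sketch in Python -
the digit source is Gibbons' unbounded spigot algorithm, and the wrapper
names are my own invention:

```python
def pi_stream():
    # Gibbons' unbounded spigot: yields the decimal digits of pi
    # (3, 1, 4, 1, 5, ...) one at a time, forever.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

def next_pi(gen=None):
    # An algorithm whose output is a (digit, algorithm) tuple: calling
    # the second element produces the next such tuple.
    if gen is None:
        gen = pi_stream()
    return next(gen), lambda: next_pi(gen)

digit, rest = next_pi()   # digit is 3
digit, rest = rest()      # digit is 1
```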
>>> I don't recall other qualifications, but I think there is at least
>>> one I'm missing. It was not a long list, and, like I've said, I
>>> don't think Input was on the list.
>>>
>> Was it "deterministic" ? Not all algorithms are deterministic and
>> repeatable, but it is often a very useful characteristic, and it might
>> have been relevant for the course you were doing at the time.
>
> If it's not deterministic, it's not an algorithm. Flipping a coin is not an algorithm.
>
Non-deterministic does not mean random. A non-deterministic algorithm is
one whose behaviour is not fully fixed by its inputs - a parallel search
that returns whichever match it happens to find first, say - which is
quite different from flipping a coin.
>
>>> The web searches seem to produce some pretty vague, garbage
>>> definitions, some even saying it is a computer program, which I'm
>>> certain it does not have to be.
>>>
>>> Anyone familiar with this?
>>>
>> I think it is unlikely that you'll get a perfect match for your
>> professor's definition, except by luck - because I don't think there
>> really is a single definition. I don't think there is even a single
>> consistent definition for your features "output", "input", "steps", or
>> "finite" (in this context).
>
> Ok, that explains a lot about software development.
>
Software development is a pretty big field.
>
>> That's why Turing went to such effort to define his Turing Machine (and
>> some other mathematicians of the time came up with alternative
>> "universal computers"). If you want a rigorous definition, you have to
>> go back to 0's and 1's and something mathematically precise such as a TM
>> or universal register machine (which is easier to program than a TM).
>> Of course, you are then restricting your algorithms - it no longer
>> includes a recipe for brownies, for example, but it does give you
>> something you can define and reason about.
>
> Nothing useful here. Thanks anyway.
>
I can't imagine anything in this thread being "useful" - surely it is
just an interesting discussion?