
Definition of an Algorithm

Started by Rick C November 2, 2022
I recall that in one of the early computer science classes I took, a professor gave a mathematical definition of an algorithm.  She gave a list of properties a procedure had to have to qualify as an algorithm.  She also listed some features that were not required, such as being a computer program.

I recall these features: 

1) Output - without output a procedure is pointless. 

2) Input - ???  I want to say input is optional.  So a set of steps to calculate some number of digits of pi would qualify as an algorithm, in spite of not having inputs. 

3) Steps - she used qualifiers that indicated the steps had to be clear and unambiguous. 

4) Finite - the steps must come to an end, i.e. at some point the algorithm has to produce the result; it can't be infinite. 

I don't recall other qualifications, but I think there is at least one I'm missing.  It was not a long list, and, like I've said, I don't think Input was on the list.  

The web searches seem to produce some pretty vague, garbage definitions, some even saying it is a computer program, which I'm certain it does not have to be.  

Anyone familiar with this? 

-- 

Rick C.

Rick C <gnuarm.deletethisbit@gmail.com> writes:
> Anyone familiar with this?
Meh, there is not really an accepted formal definition, and to some extent it is a philosophy question.  The word algorithm comes from Arabic, but the modern notion of Turing machines is from the 1930s.

There are a bunch of viewpoints on the topic here:

https://en.wikipedia.org/wiki/Algorithm_characterizations

A further take is here:

https://plato.stanford.edu/entries/computer-science/#Algo
On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin wrote:
> Rick C <gnuarm.del...@gmail.com> writes:
> > Anyone familiar with this?
>
> Meh, there is not really an accepted formal definition, and to some
> extent it is a philosophy question.  The word algorithm comes from
> Arabic, but the modern notion of Turing machines is from the
> 1930s.
>
> There are a bunch of viewpoints on the topic here:
>
> https://en.wikipedia.org/wiki/Algorithm_characterizations
>
> A further take is here:
>
> https://plato.stanford.edu/entries/computer-science/#Algo
If there's no definition, I guess I need to stop using the term.

The Wikipedia article starts off as a poor substitute for what it's supposed to be, with no reference anywhere in the section.  This is one of the worst examples I've ever seen, even for Wikipedia.  I'm amazed it was ever allowed.  Probably a matter of the camel getting its nose under the tent.

But the section that actually tried to define an algorithm ("Features of a good algorithm") seems to hit the nail on the head, as far as I can see.  It mentions input, though, which I'm pretty sure can be a null set without affecting whether the process is an algorithm.  So, factually, input is not required and not part of the definition.

-- 

Rick C.
On 02/11/2022 06:42, Rick C wrote:
> On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
> wrote:
>> Rick C <gnuarm.del...@gmail.com> writes:
>>> Anyone familiar with this?
>>
>> Meh, there is not really an accepted formal definition, and to
>> some extent it is a philosophy question.  The word algorithm comes
>> from Arabic, but the modern notion of Turing machines is from the
>> 1930s.
>>
>> There are a bunch of viewpoints on the topic here:
>>
>> https://en.wikipedia.org/wiki/Algorithm_characterizations
>>
>> A further take is here:
>>
>> https://plato.stanford.edu/entries/computer-science/#Algo
>
> If there's no definition, I guess I need to stop using the term.
If you stopped using a term just because there is no single formal definition, you'd have to stop saying anything at all.  After all, there's no definition for the terms "definition" or "term".

It just means that if you are writing something that is supposed to be rigorous, you have to define what /you/ mean by a given term in the context of the discussion.
On 02/11/2022 04:04, Rick C wrote:
> I recall that in one of the early computer science classes I took, a
> professor gave a mathematical definition of an algorithm.  She gave
> a list of properties a procedure had to have to qualify as an
> algorithm.  She also listed some features that were not required,
> such as being a computer program.
>
Just remember that any such definition will be a "lie-to-children" - an oversimplification that covers what you need at the time, but not the full picture.

(That is not a bad thing in any way - it is essential to all learning.  And it's the progress from oversimplifications upwards that keeps things fun.  If you are not familiar with the term "lie-to-children", I recommend "The Science of the Discworld" that popularised it.)
> I recall these features:
>
> 1) Output - without output a procedure is pointless.
>
> 2) Input - ???  I want to say input is optional.  So a set of steps
> to calculate some number of digits of pi would qualify as an
> algorithm, in spite of not having inputs.
You could argue that there is always some kind of input.  For example, you could say that the digit index or the number of digits is the input to the "calculate pi" algorithm.

(As an aside, I find it fascinating that there is an algorithm to calculate the nth digit of the hexadecimal expansion of pi that does not need to calculate all the preceding digits.)
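To make that aside a bit more concrete, here is a rough Python sketch of the digit-extraction idea (the Bailey-Borwein-Plouffe approach).  It uses plain floating point, so it is only trustworthy for modest positions, and the function and variable names are just mine for illustration, not from any library:

def pi_hex_digit(n):
    # Hex digit of pi at 0-based position n after the hexadecimal point,
    # computed without generating the earlier digits (BBP digit extraction).
    def frac_sum(j):
        # Fractional part of the sum over k of 16^(n-k) / (8k + j).
        s = 0.0
        for k in range(n + 1):            # terms where 16^(n-k) is an integer
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        for k in range(n + 1, n + 20):    # a few of the fractional tail terms
            s += 16.0 ** (n - k) / (8 * k + j)
        return s % 1.0

    x = (4 * frac_sum(1) - 2 * frac_sum(4) - frac_sum(5) - frac_sum(6)) % 1.0
    return "0123456789abcdef"[int(x * 16)]

print([pi_hex_digit(i) for i in range(6)])  # pi = 3.243f6a..., so this gives 2, 4, 3, f, 6, a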
>
> 3) Steps - she used qualifiers that indicated the steps had to be
> clear and unambiguous.
>
> 4) Finite - the steps must come to an end, i.e. at some point the
> algorithm has to produce the result; it can't be infinite.
>
Is that really true?

If you can accept a "calculate pi" algorithm that does not have an input, then it is not finite.

I remember a programming task from my days at university (doing mathematics and computation), writing a function that calculated the digits of pi.  The language in question was a functional programming language similar to Haskell.  The end result was a function that returned an infinite list - you could then print out as many digits from that list as you wanted.

So what was the output?  What kind of output do you allow from your algorithms?  Does the algorithm have to stop and produce an output, or can it produce a stream of output along the way?  Can the output be an algorithm itself?  Is it acceptable (according to your definition) for the output of a "calculate pi" algorithm to be a tuple of a digit of pi and an algorithm to calculate the next digit-algorithm tuple?
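To show what I mean outside a lazy language, here is a small Python sketch of that "digit plus suspended rest of the computation" idea: a generator hands out one decimal digit per call and keeps the remaining work frozen in between.  The state-update rules follow Gibbons' unbounded spigot algorithm, written out from memory, so treat it as illustrative rather than definitive:

import itertools

def pi_digits():
    # Yields the decimal digits of pi one at a time, without any preset limit.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n                      # the next digit is certain; emit it
            q, r, t, k, n, l = (10 * q, 10 * (r - n * t), t, k,
                                (10 * (3 * q + r)) // t - 10 * n, l)
        else:                            # not enough information yet; refine the state
            q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                (q * (7 * k + 2) + r * l) // (t * l), l + 2)

print(list(itertools.islice(pi_digits(), 10)))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]

Only the ten digits that are actually asked for ever get computed; the rest of the "infinite output" stays suspended.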
> I don't recall other qualifications, but I think there is at least
> one I'm missing.  It was not a long list, and, like I've said, I
> don't think Input was on the list.
>
Was it "deterministic" ? Not all algorithms are deterministic and repeatable, but it is often a very useful characteristic, and it might have been relevant for the course you were doing at the time.
> The web searches seem to produce some pretty vague, garbage
> definitions, some even saying it is a computer program, which I'm
> certain it does not have to be.
>
> Anyone familiar with this?
>
I think it is unlikely that you'll get a perfect match for your professor's definition, except by luck - because I don't think there really is a single definition.  I don't think there is even a single consistent definition for your features "output", "input", "steps", or "finite" (in this context).

That's why Turing went to such effort to define his Turing Machine (and some other mathematicians of the time came up with alternative "universal computers").  If you want a rigorous definition, you have to go back to 0's and 1's and something mathematically precise such as a TM or universal register machine (which is easier to program than a TM).  Of course, you are then restricting your algorithms - it no longer includes a recipe for brownies, for example, but it does give you something you can define and reason about.
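To give a flavour of the register-machine option, here is a toy Python interpreter for an Unlimited Register Machine, using the usual Z/S/T/J instruction set from the textbooks; the interpreter and the little example program are my own sketch, so take them as illustrative:

def run_urm(program, registers):
    # ('Z', n) sets Rn to 0; ('S', n) increments Rn; ('T', m, n) copies Rm to Rn;
    # ('J', m, n, q) jumps to instruction q if Rm == Rn.  Execution halts when
    # the program counter runs off the end; the result is whatever is left in R0.
    pc = 0
    while 0 <= pc < len(program):
        op, *args = program[pc]
        if op == 'Z':
            registers[args[0]] = 0
        elif op == 'S':
            registers[args[0]] += 1
        elif op == 'T':
            registers[args[1]] = registers[args[0]]
        elif op == 'J':
            m, n, q = args
            if registers[m] == registers[n]:
                pc = q
                continue
        pc += 1
    return registers[0]

# Addition as a URM program: keep incrementing R0 until R2 has counted up to R1.
add = [('J', 2, 1, 4), ('S', 0), ('S', 2), ('J', 0, 0, 0)]
print(run_urm(add, {0: 2, 1: 3, 2: 0}))  # 5

It is as painful to program as a TM, but every step is completely unambiguous - which is the point.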
On Wednesday, November 2, 2022 at 4:18:00 AM UTC-4, David Brown wrote:
> On 02/11/2022 06:42, Rick C wrote:
> > On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
> > wrote:
> >> Rick C <gnuarm.del...@gmail.com> writes:
> >>> Anyone familiar with this?
> >>
> >> Meh, there is not really an accepted formal definition, and to
> >> some extent it is a philosophy question.  The word algorithm comes
> >> from Arabic, but the modern notion of Turing machines is from the
> >> 1930s.
> >>
> >> There are a bunch of viewpoints on the topic here:
> >>
> >> https://en.wikipedia.org/wiki/Algorithm_characterizations
> >>
> >> A further take is here:
> >>
> >> https://plato.stanford.edu/entries/computer-science/#Algo
> >
> > If there's no definition, I guess I need to stop using the term.
> If you stopped using a term just because there is no single formal
> definition, you'd have to stop saying anything at all.  After all,
> there's no definition for the terms "definition" or "term".
>
> It just means that if you are writing something that is supposed to be
> rigorous, you have to define what /you/ mean by a given term in the
> context of the discussion.
Your idea that language is not defined is not valid.  Words are defined in a dictionary - actually in many dictionaries, not unlike standards, lol.  Terms with a jargon meaning need their specific definition in that context, or they have little value.

Much of the engineering and computer science I learned in college was taught like math.  Terms were defined, so that you knew what was being said.  Look at math: it's as much about the definitions as about the various rules.  If a term like "algorithm" is not well defined, then I won't use it in engineering unless I define it first.

-- 

Rick C.
On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
> On 02/11/2022 04:04, Rick C wrote:
> > I recall that in one of the early computer science classes I took, a
> > professor gave a mathematical definition of an algorithm.  She gave
> > a list of properties a procedure had to have to qualify as an
> > algorithm.  She also listed some features that were not required,
> > such as being a computer program.
> >
> Just remember that any such definition will be a "lie-to-children" - an
> oversimplification that covers what you need at the time, but not the
> full picture.
Sorry, I have no idea what you are talking about.
> (That is not a bad thing in any way - it is essential to
> all learning.  And it's the progress from oversimplifications upwards
> that keeps things fun.  If you are not familiar with the term
> "lie-to-children", I recommend "The Science of the Discworld" that
> popularised it.)
> > I recall these features:
> >
> > 1) Output - without output a procedure is pointless.
> >
> > 2) Input - ???  I want to say input is optional.  So a set of steps
> > to calculate some number of digits of pi would qualify as an
> > algorithm, in spite of not having inputs.
> You could argue that there is always some kind of input.  For example,
> you could say that the digit index or the number of digits is the input
> to the "calculate pi" algorithm.
You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
> (As an aside, I find it fascinating that there is an algorithm to
> calculate the nth digit of the hexadecimal expansion of pi that does not
> need to calculate all the preceding digits.)
> >
> > 3) Steps - she used qualifiers that indicated the steps had to be
> > clear and unambiguous.
> >
> > 4) Finite - the steps must come to an end, i.e. at some point the
> > algorithm has to produce the result; it can't be infinite.
> >
> Is that really true?
>
> If you can accept a "calculate pi" algorithm that does not have an
> input, then it is not finite.
Not true.  If it has no definite end, it is not an algorithm.  That's why an operating system is not typically an algorithm: it has no specific termination, unless you have a command to shut down the computer, I suppose.
> I remember a programming task from my days at university (doing
> mathematics and computation), writing a function that calculated the
> digits of pi.  The language in question was a functional programming
> language similar to Haskell.  The end result was a function that
> returned an infinite list - you could then print out as many digits from
> that list as you wanted.
How did it return an infinite list?  You mean it returned as many digits as you specified, or as many as you waited for it to calculate?  If you have an infinite storage system, that's a pretty remarkable achievement.  But I expect it would be powered by an infinite improbability drive.
> So what was the output?  What kind of output do you allow from your
> algorithms?  Does the algorithm have to stop and produce an output, or
> can it produce a stream of output along the way?  Can the output be an
> algorithm itself?  Is it acceptable (according to your definition) for
> the output of a "calculate pi" algorithm to be a tuple of a digit of pi
> and an algorithm to calculate the next digit-algorithm tuple?
> > I don't recall other qualifications, but I think there is at least
> > one I'm missing.  It was not a long list, and, like I've said, I
> > don't think Input was on the list.
> >
> Was it "deterministic"?  Not all algorithms are deterministic and
> repeatable, but it is often a very useful characteristic, and it might
> have been relevant for the course you were doing at the time.
If it's not deterministic, it's not an algorithm. Flipping a coin is not an algorithm.
> > The web searches seem to produce some pretty vague, garbage
> > definitions, some even saying it is a computer program, which I'm
> > certain it does not have to be.
> >
> > Anyone familiar with this?
> >
> I think it is unlikely that you'll get a perfect match for your
> professor's definition, except by luck - because I don't think there
> really is a single definition.  I don't think there is even a single
> consistent definition for your features "output", "input", "steps", or
> "finite" (in this context).
Ok, that explains a lot about software development.
> That's why Turing went to such effort to define his Turing Machine (and
> some other mathematicians of the time came up with alternative
> "universal computers").  If you want a rigorous definition, you have to
> go back to 0's and 1's and something mathematically precise such as a TM
> or universal register machine (which is easier to program than a TM).
> Of course, you are then restricting your algorithms - it no longer
> includes a recipe for brownies, for example, but it does give you
> something you can define and reason about.
Nothing useful here.  Thanks anyway.

-- 

Rick C.
On 11/2/2022 20:45, Rick C wrote:
> ...
>>>
>>> 2) Input - ???  I want to say input is optional.  So a set of steps
>>> to calculate some number of digits of pi would qualify as an
>>> algorithm, in spite of not having inputs.
>> You could argue that there is always some kind of input.  For example,
>> you could say that the digit index or the number of digits is the input
>> to the "calculate pi" algorithm.
>
> You can argue anything.  A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement.  It doesn't need an input to set the number of digits if that is built into the procedure.
Of course it is input. You change the number and the output changes.
> .....
>>>
>>> 4) Finite - the steps must come to an end, i.e. at some point the
>>> algorithm has to produce the result; it can't be infinite.
>>>
>> Is that really true?
>>
>> If you can accept a "calculate pi" algorithm that does not have an
>> input, then it is not finite.
>
> Not true.
Of course it is true.  Unless you specify the limits for an operation which is infinite in nature, the calculating algorithm runs forever.  No need to go as far as calculating pi: divide 1 by 3 using the long-division algorithm taught at primary school, with a pencil.

Without giving it much thought, I'd say an algorithm is an unambiguous flowchart made up in order to achieve some goal.  Anything more specific than that would need some qualifier on the word "algorithm".
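To spell out the long-division example, here are a few lines of Python (my own sketch): without a limit on the number of digits, the schoolbook procedure for 1/3 never reaches a zero remainder and so never stops.

def long_division_digits(numerator, denominator, max_digits=None):
    # Schoolbook long division after the decimal point: multiply the remainder
    # by 10, take the quotient digit, keep the new remainder.  For 1/3 the
    # remainder never becomes 0, so without max_digits this loops forever.
    digits = []
    remainder = numerator % denominator
    while remainder != 0 and (max_digits is None or len(digits) < max_digits):
        remainder *= 10
        digits.append(remainder // denominator)
        remainder %= denominator
    return digits

print(long_division_digits(1, 3, max_digits=8))  # [3, 3, 3, 3, 3, 3, 3, 3]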
On 2022-11-02 20:45, Rick C wrote:
> On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
[snip]
>> I remember a programming task from my days at university (doing
>> mathematics and computation), writing a function that calculated the
>> digits of pi.  The language in question was a functional programming
>> language similar to Haskell.  The end result was a function that
>> returned an infinite list - you could then print out as many digits from
>> that list as you wanted.
>
> How did it return an infinite list?
Probably by using a non-strict evaluation order (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation), where a data object can be "unbounded" or "potentially infinite" -- only the parts of the data structure that are later accessed become "real" data held in memory.

You could of course say that the part (function) of the program that produced the potentially infinite list is not an "algorithm" by itself, and becomes an algorithm only when combined with the part of the program that outputs a finite part of the list.  But that seems too limited, because that function is clearly independent of the rest of the program and implements a well-defined computation.
>> Was it "deterministic"?  Not all algorithms are deterministic and
>> repeatable, but it is often a very useful characteristic, and it might
>> have been relevant for the course you were doing at the time.
>
> If it's not deterministic, it's not an algorithm.  Flipping a coin is
> not an algorithm.
Randomized or probabilistic algorithms are a huge research subject with many practical applications, for example for the approximate solution of hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).

A very simple example is the "random sample consensus" method (RANSAC), which is often called an "algorithm" although it is not deterministic (https://en.wikipedia.org/wiki/Random_sample_consensus).
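RANSAC itself takes a page of code, but a much smaller randomized procedure that everyone happily calls an algorithm - and which even fits the pi theme of this thread - is the classic Monte Carlo estimate.  This is just a toy sketch of my own, not anyone's reference implementation:

import random

def estimate_pi(samples, seed=None):
    # Throw random points at the unit square and count how many land inside
    # the quarter circle of radius 1; the ratio approaches pi/4.
    rng = random.Random(seed)
    inside = 0
    for _ in range(samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(100000))  # close to 3.14, but different on every run

The steps are perfectly well defined and it terminates, yet two runs give different answers unless you pin the seed.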
On 02/11/2022 20:53, Niklas Holsti wrote:
> On 2022-11-02 20:45, Rick C wrote:
>> On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown
>> wrote:
>
> [snip]
>
>>> I remember a programming task from my days at university (doing
>>> mathematics and computation), writing a function that calculated
>>> the digits of pi.  The language in question was a functional
>>> programming language similar to Haskell.  The end result was a
>>> function that returned an infinite list - you could then print
>>> out as many digits from that list as you wanted.
>>
>> How did it return an infinite list?
>
> Probably by using a non-strict evaluation order
> (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation)
> where a data object can be "unbounded" or "potentially infinite" --
> only the parts of the data structure that are later accessed become
> "real" data held in memory.
>
The more relevant term is "lazy evaluation". It is very common in functional programming languages, and allows you to work with infinite lists, infinite mutually recursive functions, and all kinds of fun.
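In a strict language you can imitate the same trick with explicit thunks.  Here is a small Python sketch of the idea (the cons/naturals/take names are just illustrative, not from any library): a list cell holds a value and a zero-argument function that builds the rest of the list only when it is actually asked for.

def cons(head, tail_thunk):
    # A lazy list cell: the tail is a function, not a value.
    return (head, tail_thunk)

def naturals(start=0):
    # The "infinite list" start, start+1, start+2, ... in constant memory.
    return cons(start, lambda: naturals(start + 1))

def take(n, lazy_list):
    # Force only the first n cells of a lazy list.
    out = []
    while n > 0:
        head, tail_thunk = lazy_list
        out.append(head)
        lazy_list = tail_thunk()
        n -= 1
    return out

print(take(5, naturals()))  # [0, 1, 2, 3, 4]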
> You could of course say that the part (function) of the program that
> produced the potentially infinite list is not an "algorithm" by
> itself, and becomes an algorithm only when combined with the part of
> the program that outputs a finite part of the list.  But that seems
> too limited, because that function is clearly independent of the
> rest of the program and implements a well-defined computation.
>
The function returns a function as part of the output.
>
>>> Was it "deterministic"?  Not all algorithms are deterministic
>>> and repeatable, but it is often a very useful characteristic, and
>>> it might have been relevant for the course you were doing at the
>>> time.
>>
>> If it's not deterministic, it's not an algorithm.  Flipping a coin
>> is not an algorithm.
>
> Randomized or probabilistic algorithms are a huge research subject
> with many practical applications, for example for the approximate
> solution of hard optimization problems
> (https://en.wikipedia.org/wiki/Randomized_algorithm).
>
> A very simple example is the "random sample consensus" method
> (RANSAC), which is often called an "algorithm" although it is not
> deterministic
> (https://en.wikipedia.org/wiki/Random_sample_consensus).
>
All sorts of algorithms are non-deterministic. Pretty much any algorithm that is executed in parallel parts is going to have non-deterministic execution, which can result in non-deterministic outputs.
