
Saturday, January 16, 2016

The Law of Primacy

I consider the absolute worst programming construct to be the subroutine or the function. (Cleo Saulnier)
Hello?!?! Why pay attention to some random crank on the Internet?
Because I think he is onto something important...

Followers of this blog will note that it is often of and about functional programming. And talking of FP, people like to cite, for instance…
  • that Haskell has become real world 
  • that Scala is making the Tiobe index
  • that Lisp is resurging as Clojure 
  • that WhatsApp is written in Erlang
Yeah, I have written about this some... But what exercises me more is the place of FP in CS education; in particular, the fact that FP has entered primary CS education is (for me) a bigger deal than all the above.

So does Cleo Saulnier's quote above go against the grain of this blog??
No, on the contrary: I believe he is onto something big and has been surprisingly prescient. But first let me quote him in more detail:

I consider the absolute worst programming construct to be the subroutine or the function. We've used it for over 50 years now and we're still having difficulty reducing complexity. What if the unthinkable were true? What if the subroutine was the cause of all our problems? I can prove how the subroutine causes all parts of our software to become coupled and as such cannot support it as the basic building block of software.

I find the subroutine and all the technology around it (OOP, functional, AOP, etc.) are like a sinking ship where you keep throwing more lifeboats when what you really need is a new boat. (The ship being the subroutine and the lifeboats are OOP, functional, AOP, etc.)

I posit a fourth "condition" for being able to produce better software and that is being able to recognise what specifically isn't working and be ready to ditch it. I see no indication of this for at least another 20 or 70 years give or take 100 years.

Computing Industry's Best Kept Secret: The function is NOT necessary to build software and may in fact be a bad tool. 
Q: So What is the alternative to functions?

With functions, it's stack based. You have to wait until the function returns in order to process the next function unless it's an internal function. This is the first in, last out rule, just like a stack. I don't mean that functions are bad in and of themselves for certain things. I mean that maybe they're not the *only* way. Just like different data structures have different uses. Right now, everyone is using the function. Imagine if the stack was the only data structure you could use. What a horrible world. Yet this is what we have with the function.

Unix has pipes. These are queues. As data is piped, software on both ends of the queue can execute at the same time. As data passes from one end of the queue to the other, there is no concept of execution point. Only data transformations and transfers. Another example is the Internet, where messages (real ones) get passed back and forth. Simple idea that scales and is in active use.

We've looked into the stack-based way of programming to death. Maybe the queue or some other data-processing model can be looked at, especially to solve concurrency. I feel it's a shame that there are perfectly acceptable tools available that get sidelined for the status quo. BTW, history dictates that the function is not reusable. Well, maybe it's reusable like sand is reusable to build glass. Once it's actually used, it is forever transformed into something else that can no longer be separated from the whole.
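To make the pipe/queue contrast concrete, here is a minimal sketch of my own (not Saulnier's code) in Python: two stages connected by a queue, both running at the same time, with data flowing through rather than a single execution point nesting on a call stack.

    import threading
    from queue import Queue

    def producer(out):
        for x in range(5):
            out.put(x * x)       # push each result downstream as soon as it is ready
        out.put(None)            # sentinel marking end of stream

    def consumer(inp):
        while True:
            x = inp.get()
            if x is None:
                break
            print("got", x)

    pipe = Queue()               # the 'pipe': a queue between the two stages
    t1 = threading.Thread(target=producer, args=(pipe,))
    t2 = threading.Thread(target=consumer, args=(pipe,))
    t1.start(); t2.start()
    t1.join(); t2.join()

Note that neither stage calls the other; each sees only the queue, which is the decoupling Saulnier is pointing at.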


Most presentations of functional programming explicitly or implicitly start by dissing imperative programming to justify FP.

That's fine, but what does that dissing usually consist of?
  • Referential transparency
  • Danger of assignment
  • Expression is better than statement
  • etc
All of these I will call problems-in-the-small.

The fundamental abstraction units — call them function, procedure, subroutine or whatever — don't get much attention (or flak), because all languages, from Fortran and C right up to Haskell, have functions!

And this should make us sit up and take notice:

If C and Fortran have functions, why are Haskell/ML/Erlang/Clojure/etc. more functional?

A few people in the FP world have noted that the word 'functional' is rather a misnomer.

And once we see that, we may also see that FP is successful exactly to the extent that it does not tie itself down to the standard notion of function. Consider:
  • polymorphic values
  • lazy evaluation
  • infinite data structures
  • And the most visible nowadays — monads
All these break the imperative notion of function, which in Cleo Saulnier's definition is simply stack-based linkage. A small demonstration follows.
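Of the four, laziness is the easiest to demonstrate outside Haskell. Here is a sketch in Python, with a generator standing in for lazy evaluation: the 'function' below does not run to completion before its caller proceeds, so the first-in-last-out stack discipline is gone.

    def numbers():
        print("producing 1"); yield 1
        print("producing 2"); yield 2

    for n in numbers():          # the caller resumes between the callee's steps
        print("consuming", n)

    # Output interleaves producer and consumer:
    #   producing 1
    #   consuming 1
    #   producing 2
    #   consuming 2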

Now it will be argued that
We also have generators, and coroutines, and threads, and other forms of parallel processing. The unsurprising reality is that such kinds of code are *harder* to get right than functions with their simple-minded stack-based sequential execution model.
Yes, this 'unsurprising' (uncommon?) common sense looks so natural that the opposite view — functions are the root of the problem — seems downright cranky.

I happen to think that Cleo Saulnier is onto something big and I want to spend some time on why.  Let me start with some examples:

Python-Range vs C-For

A fundamental building block for iteration in Python is the for-loop, which itself uses as a component iterators like range. Typically, the C programmer finds this advanced/difficult/strange etc. IOW, for him
for (i = 0; i < N; i++) ...
is easier than range(N)

The Pythonista will (hopefully!) disagree, but that does not make range any easier or more natural to one who started with C.
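What the inline comparison doesn't show is why range qualifies as a component at all: range(N) is a first-class value that can be stored, measured and queried, not a piece of loop syntax. For instance:

    r = range(2, 10, 2)     # a value in its own right, not loop syntax
    print(list(r))          # [2, 4, 6, 8]
    print(len(r), r[1])     # 4 4 -- ranges support len() and indexing
    print(8 in r, 5 in r)   # True False -- and membership tests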

Print Hello World vs 'Hello World'

Likewise, even many Python programmers who have not learned to play around at the interpreter prompt think
print "Hello World"
is program no. 1, and get surprised that it can be shortened to just
'Hello World'
Maybe related to the fact that the first program(s) they wrote were not in an interpreter?
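A transcript makes the surprise visible (Python 2 here, matching the snippet above): the prompt echoes the repr of the expression's value, quotes and all.

    >>> print "Hello World"
    Hello World
    >>> "Hello World"
    'Hello World'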

Boxes or Sequences 

When C programmers think of lists they visualize box-and-arrow diagrams like

[figure: boxes-and-arrows picture of a linked list]

And interminable hours debugging segfault errors!

The Python programmer instead sees the far simpler [2,4,6,8], something closer to the math notion of a sequence and further from machine memory.
Ironically, the C underbelly of Python would show a picture considerably messier than the one above... and that's Greenspun's tenth rule inevitably kicking in.
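For comparability (and staying in one language), here is a hedged Python sketch of what the box-and-arrow picture amounts to, next to the literal the Python programmer actually writes:

    class Cell:                              # one box of the diagram
        def __init__(self, head, tail=None):
            self.head, self.tail = head, tail

    xs = Cell(2, Cell(4, Cell(6, Cell(8))))  # boxes and arrows, spelled out
    ys = [2, 4, 6, 8]                        # the literal the Pythonista sees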

Generators or Lists

The Haskell programmer is even closer to math because, in addition to [2,4,6], he can also write [2,4..], i.e. infinite lists are as easy as finite ones, something that seems unnatural to the Python programmer because he has to use the advanced concept of generators/iterators.
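Python can say [2,4..] too, but only through that 'advanced' machinery; a small sketch:

    from itertools import count, islice

    evens = count(2, 2)                # Python's [2,4..]: 2, 4, 6, ...
    print(list(islice(evens, 5)))      # [2, 4, 6, 8, 10]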

Are generators advanced or is it the usual sequence of learning of the typical pythonista that makes it so?

What people miss from this example, even though (because?) it stares us in the face, is that in the much-vaunted first-classness of data structures, the availability of a concrete literal syntax is probably as big a deal as garbage collection, managed memory etc.

Are lists as easy for the Java programmer as for the Python programmer? I think not, and that's related to the principle:
More syntactic baggage ⇒ Less clarity
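To see the principle at work, compare what it takes to say [2,4,6,8] in the two languages (the Java lines appear as comments here for comparison; the exact idiom varies with the Java version):

    # Java, classic:   List<Integer> xs = new ArrayList<>();
    #                   xs.add(2); xs.add(4); xs.add(6); xs.add(8);
    # Java 9 onwards:   List<Integer> xs = List.of(2, 4, 6, 8);

    xs = [2, 4, 6, 8]    # Python: a literal, no ceremony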

Lexical Clarity or Gunk

And APL beats Python and Haskell as well:
    ⍝ Scalars (0-dim)
    1 + 2
3


    ⍝ Vectors (1-dim)
    1 2 + 3 4
4 6


    ⍝ Matrices (2-dim); a, b are assumed defined
    a
1 2 3
4 5 6
    b
10 20 30
40 50 60
    a + b
11 22 33
44 55 66



Personally I find the lack of ",[]" noise liberating but also slightly unsettling. That a prompt is just a 6-space indent is a bit much. But then I guess I am not an acute APLer.

Expression or Statement

There is a basic duality in computer programming between these two columns:

Declarate                   Imperate
timeless                    time
value                       object
immutable                   mutable
constant                    variable
expression                  statement
function                    procedure
conditional expression      if statement
list comprehension          for loop
recursive function          while loop
function composition (∘)    statement sequencing (;)

It's important to understand that this table is a dual relation.
If we give the standard 'denotational' semantics to the elements in the 'Imperate' column, we get the corresponding ones in the 'Declarate' column.
Complementarily, if we try to implement the elements in the 'Declarate' column, we get the corresponding ones in the 'Imperate' column.
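To make the duality concrete, here are a few rows of the table written out in Python, declarate form first, imperate form second:

    x = -3

    # conditional expression  vs  if statement
    sign = "neg" if x < 0 else "non-neg"

    if x < 0:
        sign = "neg"
    else:
        sign = "non-neg"

    # list comprehension  vs  for loop
    squares = [i * i for i in range(10)]

    squares = []
    for i in range(10):
        squares.append(i * i)

    # recursive function  vs  while loop
    def fact(n):
        return 1 if n == 0 else n * fact(n - 1)

    def fact_loop(n):
        r = 1
        while n > 0:
            r, n = r * n, n - 1
        return r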

And all too often in imperative circles, e.g. the Python mailing list, one finds the first column treated as harder than the second.

You are teaching list-comprehensions to a noob?
When he cannot write a basic for-loop?!?!
Unfair! Thoughtless! Cruel! Sadistic!

To me this is backwards because we cannot write an assignment statement without first writing an expression (on the rhs). The reverse is possible to an arbitrary degree [FPLs are proofs by example].
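The asymmetry fits in two lines: the assignment below cannot be written without the expression on its rhs, whereas that same expression stands alone with no statement in sight.

    total = sum(i * i for i in range(10))    # statement, wrapping an expression
    print(sum(i * i for i in range(10)))     # the expression by itself: 285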

IOW I guess that those who argue thus were brought up on imperative programming.

For the record:

Historically
Imperative programming precedes Functional programming [1]
Logically
Functional Programming is prior to Imperative programming
Pedagogically
Jury's still out on that one. I suspect that different brains are differently structured; this becomes more plausible if we see that the above table can be replicated at the more…

Conceptual/Psychological level



Is            Do
concept       realize
semantics     implement
science       technology
analysis      design
noun          verb
scientist     engineer
program       process


Or more bluntly, if you regard process as more fundamental than program you are more likely to be a machine than a programmer!


Herbert Simon and Allen Newell in their Turing Award lecture highlighted the two complementing and contrasting sides of CS, viz. CS as empirical inquiry. This is an old yet bold juxtaposition. That it is old is seen in the date of the Turing Award (1975, i.e. 40+ years ago). And yet, that it is still futuristic 40 years on is seen in the fact that almost all CS education is (dis)organized along the pernicious dichotomy:

Theory vs. Practice

Fortran or Assembly

[A personal example going back 3 generations of programming teachers]: In the 60s learning assembly was (considered) natural and easy, whereas Fortran was hard. It's quite easy, as my dear colleague Abhijat would say, to practice the exact science of hindsight and write off this 1960s outlook as wrong. But what if it was right then, and changed in the interim? When between the 60s and now did this change?

Likewise...

  • Threads are easier in Erlang than in C — Erlang calls them processes
  • Generators are easier in Haskell than in Python — Haskell calls them (lazy) lists
  • Backtracking in Prolog — that's logic (backwards?)
  • Generalized data structures are easier in Lisp than in Java — Lisp calls them Lists, Java calls them XML!
And all these point to the Law of Primacy:  What is learned first will seem most natural, easy, obvious…


And to verify the objective truth behind this subjective-seem is next to impossible...

The state of being first, often creates a strong, almost unshakable, impression.

Things learned first create a strong impression in the mind that is difficult to erase. For the instructor, this means that what is taught must be right the first time.

Law of Primacy on Wikipedia

... rather a frightening situation when you consider that the software industry costs (and pays) billions of dollars more than it should, because teachers of programming teach as though the equation were:

    programming = imperative or OO programming

How long will it take to realize that:
  • Although "truth" is what (and how) I was taught
  • Truth is that Truth is what is

Acknowledgement

I guess I (re?)learnt about the "Law of Primacy" from Roy Smith on the Python list.

Footnotes

[1] For a bizarre side-light on the historical side, see the Fortran part of CS history.
