
Sunday, March 13, 2022

Pugofer → Pug → ?

It was a rainy July day in 1993. My just-graduated student Anuradha strode into the (at that time) ramshackle PU building with a mischievous twinkle in her eyes.

A: Sir!!!
Me: Yeah??
A: I've got a functional language for you!!! And she waved one 360 K floppy (or was it two?) in front of my face.
Me: Awww... We're a poor department; we don't have fancy machines like Sun workstations to run these. In short, FPLs are beautiful but only for the rich.
A: It runs on PCs – come see!!

Pug

Sunday, March 6, 2022

A Fairy Tale And a Bridge

Momma: Dr. Einstein, what should my lil boy read so that he becomes like you?

Einstein: Read him fairy tales

Momma: ! 😯!!😯!!!

In this post I shall channel 'the late Dr. Einstein' to make a case for Pugofer as a fairy tale.

Wednesday, December 27, 2017

The ‘User’ and Technology

1 Introduction

[This post is mostly for my students]

I had mentioned in class that you are graded along three dimensions.
By the nature of things, it is mostly concepts that get evaluated, plus a bit of perspective.
If you have done some coding, your technology is satisfactory: i.e. you know how to turn on your machine, log in and enter code. If that's OK with you, don't bother with this post.
If you want to go beyond that grade, you need to read this and implement some of the suggestions.

2 The Neologism called ‘User’

Thinking is our most intimate activity, and a lot of it is revealed by the way in which we use (and misuse) our language…

Wednesday, July 27, 2016

Mechanism, Romanticism and the Origins of the Computer

Guest Article: Reposted with thanks


The story of

The origins of the electronic computer…

as it is most frequently told, is an engaging tale of intellectual turbulence in the early decades of the twentieth century. The computer grew out of dramatic upheaval in the fields of mathematics and logic, not unlike what was happening at the same time in physics, politics, and the arts.  In this paper, we shall examine the origins of the computer from the perspectives of two competing world views, which we will call “Mechanism” and “Romanticism”, after Dahlbom and Mathiassen (1993). Although the computer is considered the crowning achievement of the former of these, we shall see that, ironically, it was inspired by a discovery that represented, in a sense, a major setback for the Mechanistic mode of thinking.

Tuesday, July 5, 2016

Tips for Emacs Beginners

Emacs outshines all other editing software in approximately the same way that the noonday sun does the stars. It is not just bigger and brighter; it simply makes everything else vanish. — Neal Stephenson
Q: Why should I learn emacs? I've heard 'real programmers' use emacs…
A: Real programmers use their brain
Real programmers program their brain
Often, real programmers' brain-programming program is emacs. — Thien-Thi Nguyen
Well, that's the advertisement…
Let's get down to some details – how to start using emacs.

Baby Steps 0

You can start the tutorial with C-h t.
Unfortunately it wastes about 200 lines saying that C-f C-b C-p C-n will do the work of ↑ ↓ ← →.
I suggest you ignore this archaism.

Monday, July 4, 2016

A Little 25-Year-Old Functional Parser Gem

1 Intro

When I was playing around inside Mark Jones' gofer sources in the early 90s, I saw this piece of commented C code. The code was impenetrable… at first. But the comment was elegant and beautiful and, on careful reading, cleared up the code nicely.

Why is it needed?

Wednesday, June 15, 2016

Break is goto in disguise

On the python list there was a discussion about break.  I made the comment that break is just a euphemism for goto.  I thought this would be a commonplace; however, google does not give many useful hits for it.

So I thought I'd cook up an example.
Here is Apple's famous SSL bug in short form:
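(The snippet below is my minimal, self-contained paraphrase of the pattern, not Apple's actual source; the real code is in SecureTransport's SSLVerifySignedServerKeyExchange, CVE-2014-1266.)

#include <stdio.h>

/* Condensed sketch of the "goto fail" bug: the if guards only the
   first goto; the duplicated goto always fires, skipping the
   signature check while err still holds 0 ("success"). */
static int verify(int hash_ok, int sig_ok)
{
    int err = 0;
    if ((err = hash_ok ? 0 : -1) != 0)
        goto fail;
        goto fail;                      /* the bug: always taken */
    if ((err = sig_ok ? 0 : -1) != 0)   /* never reached */
        goto fail;
fail:
    return err;
}

int main(void)
{
    printf("%d\n", verify(1, 0));       /* prints 0: bad signature accepted */
    return 0;
}

Replace the gotos with breaks out of a do { … } while (0) and nothing essential changes – which is the point.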

Saturday, January 16, 2016

The Law of Primacy

I consider the absolute worst programming construct to be subroutine or the function. — Cleo Saulnier
Hello?!?! Why pay attention to some random crank on the Internet?
Because I think he is onto something important...

Wednesday, January 13, 2016

Celebration of Obsolescence

Today is 13 Jan 2016, 150 years since the birth of G I Gurdjieff.

Here's something I had written for a 13th of Jan some years ago, as my understanding of the message of Gurdjieff's magnum opus, Beelzebub's Tales [Mr. B!]:



We celebrate patriotism yet from the plane we see no lines
We celebrate humanity but humans are killing all life — including themselves
We celebrate art — For the neurotics by the psychotics
We celebrate religion — as an institutionalized way of hating 'others'
We celebrate technology to cure each of our problems — but technology
is our biggest problem
We celebrate science — Oh the vast aggrandisement of ignorance
We celebrate spirituality — The hysteria of the hypnotized



On this 13th of Jan, we celebrate our OBSOLETE WORLD!


Help us Mr B…


To laugh without cynicism
To weep without sentimentality
To live love
And to die free

Friday, January 1, 2016

How Long?

It takes 100 years for an idea to go from inception to general acceptance.
Dijkstra[1]
When I first read this I thought it alarming.

Then I started collecting some historical tit-bits[2]…

Friday, July 31, 2015

Faith and Rats, Gödel and Computer Science

Computer scientists dismiss Gödel as mathematics
Mathematicians classify Gödel as logic
Logicians slot Gödel into meta-mathematics
Meta-mathematicians know the truth of the matter…
…and have been dead for a century

I would like to suggest that this misunderstanding (or rather non-understanding) does not make it non-true.¹ Many educated people do know that Gödel's theorem(s) is important, even portentous. But somehow – like war – it's "Yeah, it's bad but not my problem."

Let’s use the services of
A fever is raging in the town.
People are dying.

And I happen to find…
In the closet…
A dead rat

“What do rats have to do with…”

The Plague?

Do we need to start having a fever and swelling in the armpits to change our minds?
To my mind, the mathematicians and CSists who think of Gödel as irrelevant are like people with a dead rat in their closet who are now beginning to run a fever and who still keep insisting:
"What's a dead rat got to do with the plague? Why should I bother?"
Gödel's theorem is a dead rat in a plague-infested town. In the 1930s, people understood this. Somehow now everyone has forgotten. This post is to remind us of these well-known and even more well-forgotten facts.

The Terrible Theorem

Starting with the cute paradoxical statement

This statement is false

which is true if it's false and false if it's true, Gödel's theorem maps out the large gulf between what is provable and what is true.
Now on the face of it this seems like a ridiculous thing to make a song-n-dance about. Surely there are truths that we don't know (yet)? What of it? Then science studies harder… then some more truths are revealed… etc…

To understand why it's a big deal, we need to understand the difference between

Analytic and Synthetic Truths

Monday, June 15, 2015

Richard O'Keefe's responses to FP Timeline

Richard O'Keefe of Otago, whose quote I started the FP Timeline with, wrote me some rather detailed comments about history which have interesting titbits of info.

Tuesday, June 9, 2015

Functional Programming: A Moving Target

In my last post, I gave a functional programming timeline for the last 50 years. Now I'll look at two things: the place of 'functional' in ACM Curriculum 2013, and how C has messed up the notion of 'functional'.

ACM Curriculum 2013


Wednesday, April 29, 2015

Functional Programming: A Timeline

Rob Hagan at Monash had shown that you could teach students more Cobol with one semester of Scheme and one semester of Cobol than you could with three semesters of Cobol.
Richard O'Keefe on Erlang list
Well that was before Functional Programming hit the headlines.
These days FP is quite a buzzword. Is this for good or bad?
If what is good is 'real world', well then Scala and Clojure and Erlang and Haskell becoming more and more 'real world' is a wonderful thing.
If what is good is understanding, then I am not so sure. Many things about programming, pedagogy and programming-pedagogy that were widely understood in the 1970s and 80s have mysteriously become un-understood today.
However, in this darkening of the age there are some glimmers… e.g. ACM's 2013 curriculum.
In this post I would like to delineate a timeline of the semantics and significance of 'functional' over the last 50 years. In subsequent posts I'll try to deconstruct how the semantics has shifted around in this time.

Timeline

1957
The first programming language – Fortran
1957
The first functional programming language – For(mula)Tran(slator)

Why? Whoa! How?

Read on…

Tuesday, April 21, 2015

Between Poverty and Universality lies Structure

Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself           —  Eric Raymond

In ancient times people set each other puzzles such as:

       Can God make a stone so heavy that he can't lift it?

These puzzles-of-omnipotence can be rephrased in theory-of-computation lingo:

       Can God compute the uncomputable?
       If he can, how is it uncomputable?
       If he can't, how is he God?

So what are the limits of/by structure?  Unsurprisingly, they are related to God-el's theorem:
God-el's Theorem says that for any record player, there are records which it cannot play because they will cause it to self-destruct
Gödel-Escher-Bach

And, like record players, what about programming languages whose abstractions can be arranged to break the language?

Structure is good because it reduces breakage; it's bad because it imprisons us in precooked forms.

In what follows I explore the space between poverty and universality; a space which, for want of a better word, I will simply call structure – the most elusive being the structure of syntax.

Thursday, March 26, 2015

CS History 0

Are real numbers real?

Wait!! What does this have to do with programming? Or even computer science??

Sounds like angels-on-the-head-of-a-pin philosophy, no??
NO!  CS came into existence because of this question!

Monday, March 2, 2015

Unicode: Universal or Whimsical?

Unicode Classification

In my last post, I wrote about two sides to unicode – a universal side and a babel side. Some readers, while agreeing with this classification, were jarred by a passing reference to 'gibberish' in unicode⁵.
Since I learnt some things from those comments, this post expands that classification into these¹:
  1. Babel
  2. Universal
  3. Legacy
  4. Unavoidable mess
  5. Political mess
  6. Whimsical

Thursday, February 26, 2015

Universal Unicode

What is the 'uni-' in unicode? According to the official records it comes from Unique, Uniform and Universal.

Unicode starts out with the realization that ASCII is ridiculously restrictive – or rather, that the world is larger than the two sides of the Atlantic¹. This gives rise to all the blocks from Arabic to Zhuang.

However, the greatest promise of unicode lies not in catering to this tower of babel but rather in those areas that are more universal. Yeah, I know: technically this distinction between universal and international will not stand up to scrutiny.

Tuesday, January 6, 2015

Unicode and the Universe

If you're trilingual you speak three languages, if you're bilingual you speak two languages, if you're monolingual you're American.

Mark Harris on the python list
Well, if one reads that thread, one finds that people were rather uptight with Mark Harris for that statement. And yet they have the same insular attitude towards ASCII-in-programming that Mark describes in Americans towards English (or more correctly, Americanese); to wit, they consider that programming with ASCII (alone) is natural, easy, convenient, obvious, universal, inevitable, etc.

Is it mere coincidence that the 'A' of ASCII is short for American?

Not so long ago the world lay from a few kilometers east of the Garden of Eden to a few hundred kilometers west.  And then it stretched to a spherical globe of 40,000 km circumference.  At that time the gods used to light lamps at night called 'stars'.

And then things changed a wee little bit: the stars and our world – suddenly grown quite small – became more 'similar', and the wider world now stretches to a few billion light-years across.

In many respects the story of ASCII to Unicode is similar. Pragmatically both represent a 0 → ∞ jump, in the sense that it was natural to use the whole of the (printable) part of ASCII.  [Many of us even used to know the code-points of ASCII quite well!] With unicode, not only is it unrealistic for any one person to know all 1,114,112 code points (17 planes × 65,536), even knowing what blocks exist is infeasible.

At base this is

The problem of meaning

The smaller world is naturally more meaningful than the larger one.  Just as one can have a warmer, fuzzier feeling about Momma than about womankind, one can at least imagine a God who selects a chosen people and is solicitous and possessive about them – as long as the world is comprehensible on one's own scale. When it becomes so large that life itself looks like a freak accident, such beliefs are harder to maintain.

As an example, consider Amerigo Vespucci:
We saw more wild animals—such as wild hogs, kids, deer, hares, and rabbits—than could ever have entered the ark of Noah; but we saw no domestic animals whatever… I fancied myself near the terrestrial paradise…
Vespucci was an adventurer, not a religious man.  By contrast, today even a committed religious person would not ask whether a specific animal of the mundane world is found in the scripture of his choice. And I dare say Vespucci talks of paradise with a literalness that is not possible for a modern.

In effect our world has become so large it is difficult to give it meaning.

Likewise, even considering only extant languages…

Unicode is too large

People want to stick to ASCII because of the unending, terrifying swathes of undecipherable characters.  An argument I often hear is:
Given that I have only ten fingers and a hundred or so keys in front of me, how am I to invoke a specific symbol from the hundred thousand or so that are available in Unicode?
Well… Dunno what to say… If I can go from 100 characters to 200, I am twice as rich. Why worry about the million I have no use for?

But it is really much worse

Unicode has plain gibberish

You dont play with Mahjong characters? How crude!
You dont know about cuneiform? How illiterate!
You dont compose poetry with Egyptian hieroglyphs? How rude!
Shavian has not reformed you? How backward!
In short, to make effective use of unicode, it may be worthwhile to distinguish the international blocks (also called the tower of babel) from the universal parts of unicode, viz. math.

That is,

Unicode is like the universe

in the sense that in the pre-unicode era the universe was so small that parochialism was unavoidable, while today it is so big that meaninglessness is inevitable.

In the medieval ASCII world one could choose between being one of:
1 Dummy
To sell one's computer and work (and soul?) to a proprietary format and word-processing software
2 Wizard
To master something intricate and complicated such as latex (or mathml, lilypond, troff…)
3 Programmer
Everything that is worth expressing can be expressed in ASCII.

IOW…

God made ASCII. All the rest is the work of man.
And so we had before us a delicious à la carte offering:
  1. idiocy of ignorance
  2. slavery to savantery
  3. prison of penury
Now, while we are not completely free from these 'blessings' yet, we are better off than before, thanks to Unicode.

To see why 1 and 2 need not be the case any more, see some suggestions made in the context of python.  While the suggestions are not quite serious and are unlikely to be taken seriously, they become more realistic as we go from established/old languages towards the bleeding edge.  Here's Julia and Agda.

As for not having to choose between 2 and 3, here's something I recently asked on the (la)tex list:

Here is the wikipedia page on the ε-δ definition of limit, where we see the well-known definition.


Editing it produces this excerpt [note: this is input text]:
(\forall \varepsilon > 0)(\exists \ \delta > 0) (\forall x \in D)
(0 < |x − c | < \delta \ \Rightarrow \ |f(x) - L| < \varepsilon)


Now compare it with the following – also input text:

(∀ ε > 0) (∃ δ > 0) (∀ x ∈ D) (0 < |x − c| < δ  ⇒  |f(x) - L| < ε)

[Note particularly the real minus between x and c and the ASCII hyphen minus between f(x) and L]

In this age of unicode, when we have xetex/luatex, why do we use the first when the second is so much closer to the desired result?
Hopefully most people would agree the latter is more readable than the former.
The questions that remain are
  1. Typing it in.
  2. Is it close to luatex/xetex? 
For 2. I'd welcome help/suggestions ;-)

For 1., I've just recently discovered pointless-xcompose, which goes a good way towards solving this, at least on linux¹.

And I suggest we distinguish these

Levels of Input Methods

  1. Cut paste a character after searching with google
  2. Select a character from a local app like gucharmap (emacs: C-x 8 Ret)
  3. Use an editor abbrev(iation)
  4. Use an editor input method, e.g. emacs' TeX input-method (toggle with C-\) converts \forall into ∀, etc.
  5. Use the compose-key (Windows users may try this – dunno…) 
  6. Switch keyboard layouts in software with something like ibus
  7. Use a special purpose hardware keyboard
As we go from 1 to 7, expertise and efficiency increase, but so do the expense of setup, hardware, etc. and, most importantly, learning. The cost of assuming that only the extreme choices – 1 and 6 – are available, and not all the other interim possibilities, is the binary choice between meaninglessness and parochialism.

IOW placing the slider effectively along this spectrum represents an efficient…

Huffman coding

applied to keystrokes and mouse gestures (in analogy to bits)

For a while now I've used 1 and 3.

Combining 3 and 4 thanks to pointless-xcompose is, I expect, going to be more convenient and effective, especially when it is tailored to the subset of characters one needs frequently.

The one thing not clear is how to set up the compose key. I'm a complete noob myself, but on a recent linux¹ this may work:

$ setxkbmap -option compose:menu

to make the menu key behave like compose.  Replace the 'menu' by 'rwin' or 'ralt' to get the same behavior out of the right-windows or right-alt keys.
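Once a compose key works, compose sequences can be defined in ~/.XCompose – which is, I assume, the mechanism pointless-xcompose builds on. A hand-rolled sketch (the include line and the syntax are standard XCompose; the particular bindings are just my own choices):

include "%L"

<Multi_key> <f> <a>           : "∀" U2200 # FOR ALL
<Multi_key> <e> <x>           : "∃" U2203 # THERE EXISTS
<Multi_key> <i> <n>           : "∈" U2208 # ELEMENT OF
<Multi_key> <equal> <greater> : "⇒" U21D2 # RIGHTWARDS DOUBLE ARROW

X applications generally read this file when they start.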

Acknowledgements

  1. Thomas Reuben for writing pointless-xcompose
  2. David de la Harpe Golden for introducing me to xkb (setxkbmap)


¹ Thomas Reuben, author of pointless-xcompose, points out to me that saying 'linux' is inappropriate where 'X-windows' would be more correct. He is right.
I've left the 'linux' there because more people are likely to know they are using linux than that they are using X-windows  ☺

Friday, September 26, 2014

Pugofer

In the early 90s I used gofer to teach FP in the introductory programming class at the University of Pune.  At first I used Miranda/Scheme, then gofer. I was also impressed with Dijkstra's philosophy of making function application explicit with a dot ('.') and decided to incorporate this into gofer.  This changed gofer was called pugofer.
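For a flavour of the change, here is a sketch inferred from that description (not taken from the pugofer sources):

-- gofer (as in Haskell): application by juxtaposition
twice f x = f (f x)

-- pugofer: application written with Dijkstra's explicit dot
twice.f.x = f.(f.x)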

The philosophy of these changes is here. A summary of the changes: