This probably won't be well received, but I think posts like this, with lines like "a list of objects of type X is an element from the free monoid generated by the set of elements of X where the monoid structure (a unital associative binary operation) is given by concatenation", go some way towards explaining why functional programming won't ever become mainstream.
Not that you're saying it is, but it probably isn't ideal to use a programming language that is able to become mainstream. It seems likely that the ideal language should be (at least, at first) very difficult for an average mind to grasp. A language that inherently forces its users to be either 'naturally smart' or 'doggedly learned' will produce a better signal-to-noise ratio as a by-product: in open-source offerings, in engineer-employees, and, at a micro level, within the engineer's code.
With that said, whenever readability is lost for the sake of ego, that is a loss. I'm not accusing the author of that though.
I don't think the problem here is functional programming; it's the author's over-eager use of category theory to describe something simple. In particular, we're just talking about all the possible lists you can create in LISP, so do we really need to resort to free monoids? The language of category theory is great for describing complicated structures like tensors and sheaves and what have you, but in this example it's not really helping.
Epsilon-delta proofs are beyond me, but we still teach calculus in high school. Just because some parts of a field are esoteric without study doesn't mean the field is doomed to obscurity.
I know jargon looks insurmountable at first, but if you pay close attention to the definitions, they are usually extremely simple. To wit, the example you quote is just the set of strings on X.
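Concretely, the definition unpacks to something any Lisp programmer already knows: finite sequences under concatenation, with the empty sequence as the identity. A small Python sketch of the monoid laws (Python lists standing in for Lisp lists):

```python
# The free monoid on a set X: finite sequences of elements of X,
# with concatenation as the operation and the empty sequence as the unit.
a, b, c = [1], [2, 3], [4]

# Associativity: (a · b) · c == a · (b · c)
assert (a + b) + c == a + (b + c)

# Unit laws: the empty list is a two-sided identity for concatenation.
assert [] + a == a and a + [] == a
```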
I can't speak authoritatively on this subject, but I'm pretty sure that telling Guy Steele "don't get rid of [Cons], use it correctly!" is a very ballsy move ;)
What you need is something slightly better than the CONS: the lazy CONS: LCONS.
The LCONS is a binary cell which has an associated function. It begins life in a "promised" state. When the CAR or CDR of a LCONS is accessed for the first time, the LCONS is "forced": the function is called, and stuffs values into the CAR and CDR. Then it is dissociated from the LCONS, which behaves like a CONS after that.
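Not TXR's actual implementation — just a minimal Python sketch of the promised/forced life cycle described above (the class and helper names are invented for illustration):

```python
class LazyCons:
    """A cons cell whose car/cdr are filled in on first access ("forced")."""
    def __init__(self, thunk):
        self._thunk = thunk           # the associated function; None once forced
        self._car = self._cdr = None  # "promised" state

    def _force(self):
        if self._thunk is not None:
            self._car, self._cdr = self._thunk()
            self._thunk = None        # dissociate the function; behaves like a plain cons now

    @property
    def car(self):
        self._force()
        return self._car

    @property
    def cdr(self):
        self._force()
        return self._cdr

def integers(n):
    """Infinite lazy list n, n+1, n+2, ... — nothing is computed until accessed."""
    return LazyCons(lambda: (n, integers(n + 1)))
```

For example, `integers(0).cdr.cdr.car` forces only the first three cells and yields `2`.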
In the Lisp dialect built into the TXR tool, I have conses as well as lazy conses. Lazy conses satisfy the consp function, and behave like conses for most intents and purposes.
Lazy conses give you structures that you can feed to functions that expect ordinary conses. Many of those functions instantly become empowered for lazy processing; suddenly they can handle infinite lists.
For some functions, you can easily provide lazy equivalents. A lazy mapcar returns a lazy list, built by applying a function to tuples of elements drawn from the input lazy lists.
Add the infinite sequences {0, 1, 2, ...} and {10, 11, 12, ...}, then take the first ten elements of the result:
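The original TXR snippet isn't reproduced here, but the same shape can be sketched in Python, with generators standing in for lazy lists (`lazy_mapcar` is an invented name for the illustration):

```python
from itertools import islice, count

def lazy_mapcar(fn, *seqs):
    # Lazily apply fn across tuples drawn from the input sequences;
    # the result is itself a lazy sequence.
    return map(fn, *seqs)

# {0, 1, 2, ...} added pointwise to {10, 11, 12, ...}:
sums = lazy_mapcar(lambda a, b: a + b, count(0), count(10))

print(list(islice(sums, 10)))  # → [10, 12, 14, 16, 18, 20, 22, 24, 26, 28]
```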
Adding all numbers in a strict cons-based list is O(n). Adding all numbers in a lazy cons-based list is O(n). Adding all numbers in a strict cons-based tree can take O(log n) parallel steps, and the same goes for a lazy cons-based tree. Laziness is an entirely different problem from the one the author is talking about.
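The tree case can be sketched with a divide-and-conquer sum: total work is still O(n), but the recursion is only O(log n) deep, so each level's additions could run in parallel (sequential Python here, for illustration only):

```python
def tree_sum(xs):
    """Sum by halving: O(n) total additions, O(log n) recursion depth.
    On a parallel machine the two halves at each level are independent."""
    if not xs:
        return 0
    if len(xs) == 1:
        return xs[0]
    mid = len(xs) // 2
    return tree_sum(xs[:mid]) + tree_sum(xs[mid:])

print(tree_sum(list(range(100))))  # → 4950
```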
Many people here in the sibling replies appear to have so thoroughly internalized that "cons is for a list" that they have entirely missed the point of the article.
The author of the post here. It seems Steele might be vindicated in the end, not on the technical level but on the social level: get rid of cons, and all the mental assumptions that come with it, in favor of truly parallel, solid data structures and the algorithms that would go with them.
I get the point about getting rid of cons. You don't need the actual cons data structure; just the accessor abstraction. In TXR Lisp, I define car and cdr for objects other than conses. They are defined in a way that obeys "nil correctness".
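A sketch of that accessor abstraction in Python, using `None` to stand in for nil (this only approximates the TXR behavior described above):

```python
def car(seq):
    # "nil correctness": car of nil or of an empty sequence is nil, not an error.
    return seq[0] if seq else None

def cdr(seq):
    # cdr of nil, an empty sequence, or a one-element sequence is nil.
    return seq[1:] if seq and len(seq) > 1 else None

# The same accessors work on strings, lists, and nil alike.
assert car("abc") == "a" and cdr("abc") == "bc"
assert car([1, 2]) == 1 and cdr([1, 2]) == [2]
assert car([]) is None and cdr([1]) is None
assert car(None) is None and cdr(None) is None
```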
The mapcar function was modified only very slightly to make this happen correctly for all inputs: its inputs have to be normalized so they are "nil correct": empty vectors and strings have to be mapped to nil. That case does not occur here.
Now that required a bit more of a hack inside mappend: mappend looks at the type of the leftmost list operand and notices that it is a string. Therefore it accumulates a string.
Wait, mapcar has that hack too!
$ txr -p '[mapcar + "abcd" #(0 1 2 3)]'
"aceg"
It didn't kick in in the first example because the function being mapped did not return characters. It returned strings. Strings are not characters and so they cannot be accumulated to make a string. (But of course they can be appended to make a string, which happened in mappend).
Actually, generic protocols and parametric polymorphism are my friends. Representation is quite important, but that doesn't mean I want to hang out with it every day.
AFAIR one of the lessons from the Fortress development was in fact that choosing a suitable fundamental representation led to better protocols: if you assume that "the list" is implemented with conc (i.e. it's a tree), it's more natural to come up with "the loop" as parallel by default, whereas starting with a cons list you're more likely to end up with a sequential one.