Hacker News | oisdk's comments

Well, adoption rates of programming languages have almost nothing to do with the features and qualities of the language itself. Swift is big because it's backed by Apple, Go is big because it's backed by Google, JavaScript because it's on the web, etc. Python might actually be the only language that succeeded "on its merits" to some extent, and even then it seems to have more to do with some early success and libraries than the core nature of the language itself.

As the author of the post, I don't actually think I can make a compelling argument for why someone should switch to using Haskell in their day job. I don't have real experience in the software engineering industry, and from what little I do know language choice doesn't make a huge difference.

That said, I think it's valid to say that a given pattern is bad, or another pattern is better. I was trying to argue for that in the post in a couple cases that I think Haskell does well.


I wasn't really trying to convince anyone to use Haskell at their day job: I am just a college student, after all, so I would have no idea what I was talking about!

I wrote the article a while ago after being frustrated using a bunch of Go and Python at an internship. Often I really wanted simple algebraic data types and pattern-matching, but when I looked up why Go didn't have them I saw a lot of justifications that amounted to "functional features are too complex and we're making a simple language. Haskell is notoriously complex". In my opinion, the `res, err := fun(); if err != nil` (for example) pattern was much more complex than the alternative with pattern-matching. So I wanted to write an article demonstrating that, while Haskell has a lot of out-there stuff in it, there's a bunch of simple ideas which really shouldn't be missing from any modern general-purpose language.

As to why I used a binary tree as the example, I thought it was pretty self-contained, and I find skew heaps quite interesting.


> > functional features are too complex and we're making a simple language. Haskell is notoriously complex

This is a true statement. (Opinion yada objective yada experience yada)

> In my opinion, the `res, err := fun(); if err != nil` (for example) pattern was much more complex than the alternative with pattern-matching.

This is also a true statement. (yada yada)

The insight I think you're missing is this piece right here: `we're making a simple language`. Their goal is not necessarily to make simple application code. That's your job, and you start that process by selecting your tools.

For certain tasks, pattern matching is a godsend. I'm usually very happy to have it available to me when it is. And I do often curse not having it available in other languages to be honest.

But Go users typically have different criteria for what makes simple/reliable/maintainable/debuggable/"good" code than Haskell users have. Which is why the two languages are selected by different groups of people handling different tasks. You're making a tradeoff between features and limitations of various languages.

And the language designers have yet another set of criteria for those things. In this case, adding pattern matching would absolutely make the language itself more complex, and they apparently don't believe that language complexity is worth the benefits of pattern matching. I think that's a perfectly reasonable stance to take.


I'm not sure if I understand you: the `res, err := fun(); if err != nil` pattern shows up everywhere in most Go code, and I think that pattern-matching would be a better fit for it. Swift does it pretty well, as does Rust, both of which occupy a similar space to Go.

I get that there's a tradeoff with including certain features, I suppose I disagree that the tradeoff is a negative one when it comes to things as simple as pattern-matching, and I think it should be included in languages like Go.


I'm not arguing against pattern matching. Like I said, I prefer it where possible. I'm also not arguing in favor of multiple return with mandatory checked err values. (Though I prefer either over the collective insanity that went into making exceptions the default approach to handling errors in most languages.) I'm just pointing out that I think you're missing a key word in the stance of the go language developers.

They're not saying that `if err != nil` is better or worse, simpler or more complex, etc... than pattern matching for application code.

They're saying that supporting pattern matching makes the language itself more complex, and they're not in favor of that tradeoff. You're focusing on application complexity, and that's a very different thing.

Both the Go language authors and the kind of developers who choose to use Go think of the relative simplicity of the language itself as a feature, even if it causes the application code to be slightly more complex. It's just another dimension that can be used when comparing programming languages, and one that this group tends to value more than others.


Oh ok, I understand. I don't really buy the idea that Go is a simple language, I have to say. A lot of Go's design choices read (to me) as needlessly complex, like features were added to address corner cases one by one instead of implementing the fundamental feature in the first place: "multiple return values" instead of tuples; several weird variations on the C-style for-loop; special-cased nil values instead of `Maybe` or `Optional`; `interface {}` and special built-in types instead of generics; etc. ADTs and pattern-matching would obviate "multiple return values" and nils, and greatly simplify error handling.


A very instructive exercise for anyone who is or intends to be a software developer is to write some sort of interpreter and/or compiler. (As well as a virtual machine and/or emulator) Depending on your approach this can take a weekend, a few months, or the rest of your life.

For instance, and amusingly enough written in golang, one of the most respected recent books on this topic is `Writing an Interpreter in Go` and its sequel `Writing a Compiler in Go`. https://interpreterbook.com/ and https://compilerbook.com/ Both of these books are reasonably short, and have the reader make meaningful progress within a weekend.

Going through the motions of actually making your own programming language (or reimplementing an existing one) teaches you a lot of things you wouldn't otherwise expect about how to write general code, how to use existing languages effectively, and how things work under the hood. It's also one of the best ways to really get a practical feel for how to approach unit testing.

It's an exercise I'd recommend if you haven't gone through it already. It might make it really click for you why some features that seem like a no brainer and should be in every language aren't, and why some undesirable "features" are so prevalent.


> It might make it really click for you why some features that seem like a no brainer and should be in every language aren't, and why some undesirable "features" are so prevalent.

I hate this kind of "I have secret knowledge, why don't you spend T amount of your time on some big project to maybe arrive at the same secret insights I have in mind". If you have an opinion on why pattern matching is so complex and undesirable, just come out and say it please. Otherwise I'll just call you out as not really having an argument.


> I have secret knowledge, why don't you spend T amount of your time on some big project to maybe come to the same secret insights I mean

Alternate interpretation, I learned something valuable from doing this thing, perhaps you'd be interested in doing so as well since the book that took months or years to write will do a better job teaching it than I will in a five minute break while typing on HN.

It's always impressive when freely sharing knowledge and tips is somehow taken as being insular and exclusive.

> If you have an opinion on why pattern matching is so complex and undesirable

Where did I say pattern matching is undesirable? It sounds more like you just want a fight here.

Remember the HN guidelines:

> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith.


> It's always impressive when freely sharing knowledge and tips is somehow taken as being insular and exclusive.

But you didn't share knowledge. You suggested that you had knowledge that was pertinent to the topic at hand. But you didn't share it. You did share tips for resources where one can learn more, and that's great. But you didn't add something like "... and that's where I learned that pattern matching is undesirable because <technical reason>".

> Where did I say pattern matching is undesirable?

This whole thread was about you saying that pattern matching was undesirable from the point of view of Go's designers or implementors due to their design goal of simplicity. Then you mentioned those compiler resources. The only reasonable interpretation for me is that you wanted to say that you did indeed know concrete technical reasons why pattern matching in Go would be complicated and therefore undesirable.


> This whole thread was about you saying that pattern matching was undesirable from the point of view of Go's designers or implementors due to their design goal of simplicity.

The only use of "undesirable" in any of my comments was in regard to features that are prevalent across languages today. If you must know I was thinking of inheritance and exceptions specifically.

As far as pattern matching goes, I was making no arguments except to say that I like it, adding a feature like pattern matching adds some non-zero amount of complexity, and that the go authors are apparently uncomfortable with that complexity. As I am not a go author, I am unsure of their exact reasoning and would not think to say why they believe that. My implication was not that I have an exact concrete reason for why the go authors feel the way they do. It was merely that I don't inherently disbelieve them when they say they have a reason.

In fact my exact wording was "I think that's a perfectly reasonable stance to take", which does not imply agreement, only a lack of strong disagreement. In other words I don't think they're ignorant of the matter or misrepresenting the situation.

> But you didn't share knowledge. You suggested that you had knowledge that was pertinent to the topic at hand. But you didn't share it. You did share tips for resources where one can learn more, and that's great. But you didn't add something like "... and that's where I learned that pattern matching is undesirable because <technical reason>".

The comment that appears to have gotten you riled up came after the person I was talking to said they understood. After a discussion about language complexity, I thought it would be appropriate to suggest some resources on a "quick" project that can help build an intuition on that topic. And to be honest, it's a project I like to find excuses to suggest. I find people tend to be surprised at how easy and fun it can be to make some meaningful progress.

I understand that you would like for me to somehow short circuit that process, but I don't believe I am capable of building someone else's intuition by posting a throwaway comment on HN. Intuitions are typically built on experience and tinkering, not reading someone else's experiences.

That you view that project suggestion as a continued argument is unfortunate, I can assure you that was not my intent. Again referencing the HN guidelines, I encourage you in future to try to read people's posts first with the assumption that they are being genuine and only fall back to an assumption of malice when you absolutely have to. Long drawn out arguments over semantics don't help anyone.


Ah... in religion we call that "gnosticism". (Not really important, it just struck me as something weird to find in a HN thread.)


> A very instructive exercise for anyone who is or intends to be a software developer is to write some sort of interpreter and/or compiler.

Another exercise, perhaps less demanding in this regard, is to explore using Free Monads[0] to implement an EDSL[1] for a problem domain. Of course, the approachability of this varies based on the person involved.

> For instance, and amusingly enough written in golang, one of the most respected recent books on this topic is `Writing an Interpreter in Go` and its sequel `Writing a Compiler in Go`.

Cue the obligatory reference to "the dragon book":

  Compilers: Principles, Techniques, and Tools[2]

0 - https://softwareengineering.stackexchange.com/questions/2427...

1 - https://www.quora.com/What-is-an-embedded-domain-specific-la...

2 - https://suif.stanford.edu/dragonbook/


Yeah I'm definitely not saying anything bad about the Dragon book here.

But I know there's a recency bias when people are evaluating tech books, so if there's a good book from the last five years I'll recommend that over a great book from the last 15, just so there's a higher chance of the recommendation actually being used.


No worries mate.

I mentioned the dragon book by obligation, not in comparison to the works you referenced.


If anyone is curious about an updated resource, I've found Modern Compiler Design much more approachable than the Dragon Book. Published in 2012, it includes chapters on designing object-oriented, functional, and logic-language compilers.

https://www.springer.com/gp/book/9781461446989


Hadn't heard about that one, thanks!


> one of the most respected recent books on this topic is `Writing an Interpreter in Go`

Is 'recent' the key word here? ;) cause that is a very bold claim to make.



> Multiple return values instead of tuples

I remember having some bugs in Python due to one-element tuples; I don't think I would have had the same issue if Python had multiple return values instead.


You keep missing the point entirely. Go was created to solve a very specific Google scenario: offer a valid alternative to C++ and Java for whatever they do at Google. It's not a language created to make college students or language hippies happy; if you are looking for that, look somewhere else. Go can be picked up by any dev with minimal experience in C/C++/Java in 1-2 weeks, and that was one of the main design targets. Another one was fast compile times; adding all those nice features you'd like would also make the language more complex to parse and compile. I think you can talk about how much you like Haskell all day long, but if you keep using Go as a comparison you simply show you have no clue of what you are talking about. It's literally apples to oranges.


Maybe I am missing the point! It certainly wouldn't be the first time in an argument about programming languages.

I do understand, though, that the purpose of Go is not necessarily to push the boundaries of language design. I also understand that it's important the language is easy to pick up, compiles quickly, etc.

I think that some of Go's design decisions are bad, even with those stated goals in mind. Again, I don't want to overstate my experience or knowledge of language design (although I do know a little about Google's attitude towards Go, since that's where I spent my internship learning it), but some features (like "multiple return values" instead of tuples) seem to me to be just bad. Tuples are more familiar to a broader range of programmers, aren't a strange special case, are extremely useful, and have a straightforward implementation. Also, I don't want a bunch of fancy features added to Go: ideally several features would be removed, in favour of simpler, more straightforward ones.


I do agree, I would prefer tuples to multiple return in go.

Perhaps they find it easier to teach to users coming from languages with little or no type inference? Java and C++ programmers in my experience don't tend to be familiar with tuples, despite there being a tuple in the C++ stdlib. My purely uninformed guess is that it's because of how verbose declarations can get in Java, or in C++ without auto/decltype from C++11.


I really wasn't trying to compare Python to Haskell, rather I was trying to show a few example features in Haskell with the Python code as a reference for the "standard" way to do a binary tree type thing. Other than the (admittedly awful) `__dict__` stuff, the rest of it is pretty standard. In contrast, the code you've written here is non-mutating, and uses tuples to represent a tree. If you were to google, say, "BST in Python" I'd wager almost none of the implementations would follow that style. If I was to write a skew heap in Python (that I intended to use), I would likely do it in a non-mutating way (although I certainly wouldn't use tuples and `leaf = object()`).

The point of the post was really to argue that simple features like pattern matching, ADTs, and so on, should be in languages like Python and Go. Also I wanted to make the point that functional non-mutating APIs could be simple and tend to compose well: the `unfoldr` example was all about that. In that vein, it was important that I compare the Haskell code to an imperative version.
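As a reference for what I mean by composable non-mutating APIs, an `unfoldr`-style helper is easy to sketch in Python as a generator (the signature here is my own; Python has no built-in `unfoldr`):

```python
def unfoldr(step, seed):
    # step(seed) returns None to stop, or (value, new_seed) to continue.
    while (pair := step(seed)) is not None:
        value, seed = pair
        yield value

# Usage: count down from 5 without any mutation outside the generator.
countdown = list(unfoldr(lambda n: (n, n - 1) if n > 0 else None, 5))
```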

For instance, with your `reduce` improvement: I agree that the `reduce` version is better! It's simpler, cleaner, and easier to read. But Python these days is moving away from that sort of thing: `reduce` has been removed from the top-level available functions, and you're discouraged from using it as much as possible. The point I was making is that I think that move is a bad one.

Finally, while the Python code here is shorter, you still don't get any of the benefits of pattern-matching and ADTs.

* You can only deal with 2 cases cleanly (what if you wanted a separate case for the singleton tree?).

* You are not prevented from accessing unavailable fields.

* You don't get any exhaustiveness checking.


Python has some basic pattern matching. ADTs are alright, but if you notice that MLs implement them as tagged unions, then really this is a request for syntax and ergonomics, not semantics.

Python is untyped. This fundamental separation between Python and Haskell is non-trivial, and can't be papered over. Your complaints about exhaustiveness, field existence, and case analysis are all ultimately about the fact that Python's type system is open for modification, while Haskell's is closed; in Haskell, we can put our foot down and insist that whatever we see is an instance of something that we've heard of, but in Python, this is simply not possible.

I agree, when it comes to Python's moves. I am about ready to leave Python 2, but I'm not going to Python 3.


While I am all for stronger type systems, I don't agree that you need it to do sum types. We can already do one half of ADTs (classes ~= product types), I just want the other half!

In my mind, the syntax would be something like this:

    sum_class Tree:
        case Leaf:
            pass
        case Node:
            data: Any
            left: Tree
            right: Tree

    def size(tree):
        case(Tree) tree of:
            Leaf:
                return 0
            Node(_, left, right):
                return 1 + size(left) + size(right)

A combination of data classes and pattern matching.


If you write out the coordinates for a straight line from the origin they look like this:

    3, 5; 6, 10; 9, 15...

So now, instead of having to test 6, 10 and 9, 15 when we see them, we can just test 3, 5 once and we'll know the answer for all of the points on that line. Then, to answer the question "which line is this point on?" we simplify the fraction (i.e. 2, 6 -> 1, 3).
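That simplification step can be sketched in a few lines of Python using `math.gcd` (the function name `canonical_line` is my own):

```python
from math import gcd

def canonical_line(x, y):
    # Reduce (x, y) so that every point on the same line through
    # the origin maps to the same representative pair.
    g = gcd(x, y)
    return (x // g, y // g)

# (3, 5), (6, 10) and (9, 15) all collapse to one representative:
points = [(3, 5), (6, 10), (9, 15)]
reps = {canonical_line(x, y) for x, y in points}
```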


What's the simpler pattern?


Presumably multiplying by two and adding one? That's the pattern that jumped out to me, unless they're talking about something more subtle.


Oh, of course! Looking at it now that's pretty obvious, lol.

Although I think even if I had spotted that I still would have put it into oeis to see if there was a quick membership test.


Which is to say, 2^n-1.
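Assuming the sequence really is 2^n - 1, there's a one-line membership test: such numbers are exactly a run of set bits, so adding one clears them all. A quick Python sketch:

```python
def is_pow2_minus_1(x):
    # 2^n - 1 in binary is all ones, so x & (x + 1) == 0.
    # (This also admits 0 and 1, i.e. n = 0 and n = 1.)
    return x >= 0 and x & (x + 1) == 0

members = [x for x in range(20) if is_pow2_minus_1(x)]
```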


Other than the ability to put a predicate in the pattern position (which you can do in several languages, including Haskell with ViewPatterns), I still don't see what Perl 6 adds here. Of course it's more difficult to model the problem using sum types, but there's an asymmetry here: whereas `given` can't do powerful pattern-matching features like exhaustiveness checking, GADTs, auto case-split in an IDE, efficient case trees, etc., the opposite isn't true. Haskell can do everything Perl does here, just by using strings! The other features (multiple variables to split on) are pretty standard these days.

    vosotros = ["Spain", "EquatorialGuinea", "WesternSahara"]
    
    message = case (number, gender, formality, country) of
      (1            , "masculine"                           , "informal", _                        ) -> "¿Cómo estás mi amigo?"
      (1            , "masculine"                           , "formal"  , _                        ) -> "¿Cómo está el señor?"
      (1            , "feminine"                            , "informal", _                        ) -> "¿Cómo estás mi amiga?"
      (1            , "feminine"                            , "formal"  , _                        ) -> "¿Cómo está la señora?"
      ((> 1) -> True, (`elem` ["mixed","masculine"]) -> True, "informal", (`elem` vosotros) -> True) -> "¿Cómo estáis mis amigos?"
      ((> 1) -> True, "feminine"                            , "informal", (`elem` vosotros) -> True) -> "¿Cómo estáis mis amigas?"
      ((> 1) -> True,  _                                    , _         , _                        ) -> "¿Cómo están ustedes?"


Author here! It's my fault the code and report are a little disorganized, but there is a related work section in the full report (https://github.com/oisdk/agda-ring-solver-report/blob/master...) which mentions that paper.

Also, if you click through to the section on the algorithm itself (https://oisdk.github.io/agda-ring-solver/Polynomial.NormalFo...) you'll see a link to the same paper.


This is a bizarre article. The lede is buried a little: towards the end we get a lengthy defence of a guy called Gary Null. It starts with this section:

> It was Dr. Gary Null’s investigation into this phenomenon that first alerted me that Wikipedia was playing fast and loose with the facts beyond of the political realm ... Null is a board-certified clinical nutritionist who has conducted over 40 clinical studies on lifestyle and diet, more than anyone else in his field. ... He has published over 700 articles, many in peer-reviewed journals, and has been invited to present his findings at scientific conferences.

To me, this reads as if Gary Null has a medical degree. He does not. His qualification is a PhD from an online university (Union Institute & University). The graduate school has since been dissolved following an investigation by the Ohio Board of Regents.

Moving on, the author seems to have a bone to pick with Stephen Barrett (who runs quackwatch.org, which regularly criticises Null). This is how he's described:

> Stephen Barrett, a discredited former psychiatrist

Again, to me, that seems to imply he lost his license or something. After a quick google, though, it turns out Stephen Barrett holds a medical degree and is retired. I couldn't find any evidence of disciplinary action of any sort.

Anyway, the author's main issue with the wikipedia page on Null seems to be that it portrays him as unscientific.

> I reviewed the scientific literature on five topics where Null and Barrett disagree – sugar, alcohol, mercury, fluoridation, and the safety of vitamins and minerals – and after scanning thousands of abstracts, found Barrett to be wrong on every issue.

This is pretty ridiculous. [Barrett's write-up on Null](http://www.quackwatch.org/04ConsumerEducation/null.html) specifically mentions homeopathy, AIDS denialism, anti-vaccination, chelation therapy, among other things. Strangely the author makes no mention of those.

Hilariously, I think "the safety of vitamins and minerals" is referring to an incident where Null sued the manufacturer of his line of diet supplements (because yes, of course, he sells a line of diet supplements), after it allegedly hospitalised Null along with six other individuals. That story in itself is incredibly surreal (his website said he was "completely and totally healthy", for instance), so I recommend anyone interested to go read up on it.

I was a little disappointed at first that there wasn't a decent discussion of wikipedia's policies, but Gary Null is such a weird and wonderful character it more than made up for it!


Not really - like Norvig said, that's one interpretation, but there is another valid one.

Say you had a bunch of families with two children each. The children are evenly distributed in terms of gender and the days of the week on which they were born. If you pick one parent from the crowd, the chance that they have at least one boy is 3 / 4, the chance that they have two is 1 / 4, and the chance that they have none is also 1 / 4:

      | B| G|
    --|--|--|
    B |BB|BG|
    --|--|--|
    G |GB|GG|
    
Each of the four squares on the above table is equiprobable. However, if the person says they have at least one boy, they must be in either the left column or the top row, so in one of three squares. There is only one square in those three with both boys, so the chance of that parent having two boys is 1 / 3.

Now, for the day-of-the-week problem. If you ask a parent if they have a male child born on Tuesday, it is not equiprobable that they're in any of those three squares. In the BG group, each parent has exactly one male child, so the chance that any parent chosen from it has a male child born on Tuesday is 1 / 7. Similarly in the GB group. However, in the BB group, either one of their children may be born on a Tuesday to satisfy the condition. The chance that either child is born on a Tuesday is the inverse of neither child being born on a Tuesday, or

    1 - (6/7 * 6/7) = 13/49

So the fraction of parents in the top-left group (BB) who satisfy the condition is 13 / 49, whereas the fraction in the top-right (BG) is 1 / 7, and in the bottom-left (GB) it is also 1 / 7. You're looking for the probability that a given parent who satisfies the condition is in the top-left group, which is

    (13/49) / (1/7 + 1/7 + 13/49) = 13/27
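The 13/27 figure can also be checked by brute-force enumeration over the equiprobable (gender, weekday) pairs, using exact fractions:

```python
from fractions import Fraction
from itertools import product

DAYS = range(7)
TUESDAY = 1  # arbitrary label; any fixed day works by symmetry

# Every (gender1, day1, gender2, day2) combination is equiprobable.
cases = list(product("BG", DAYS, "BG", DAYS))

def has_tuesday_boy(g1, d1, g2, d2):
    return (g1 == "B" and d1 == TUESDAY) or (g2 == "B" and d2 == TUESDAY)

conditioned = [c for c in cases if has_tuesday_boy(*c)]
both_boys = [c for c in conditioned if c[0] == "B" and c[2] == "B"]
prob = Fraction(len(both_boys), len(conditioned))
```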


In Swift:

    (0...100)
      .filter { $0 % 2 == 0 }
      .map    { $0 * 2 }

will pass through the sequence multiple times, yes. But avoiding that is as easy as:

    (0...100)
      .lazy
      .filter { $0 % 2 == 0 }
      .map    { $0 * 2 }
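For comparison (the semantics differ slightly, so take this as a loose analogy rather than an exact equivalent): in Python 3, `map` and `filter` already return lazy iterators, so the pipeline runs in a single pass only when consumed:

```python
# Nothing is computed when the pipeline is built...
evens_doubled = map(lambda x: x * 2,
                    filter(lambda x: x % 2 == 0, range(101)))

# ...the single pass happens here, on demand.
result = list(evens_doubled)
```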

