Hacker News | le0n's comments

Haskell/GHC will soon have an extension for linear types, which will bring the two languages much closer: http://blog.tweag.io/posts/2017-03-13-linear-types.html


Yeah, right. If only the standard library (the base package) could immediately be converted to use linear types so we could all benefit from that! (Hint: it won't. Even the fairly no-brainer AMP and BBP took an absurdly long time, which is extremely slow by Rust standards. The Haskell community might have a move-fast-and-break-things attitude, but the stewards of the standard libraries don't.)


> Even the fairly no-brainer AMP and BBP took an absurdly long time

This is because, believe it or not, Haskell is actually used in the real world, and people care about their code not breaking.


Did Rust ever have a large-scale breaking change? Anyway, the linear types proposal is designed to be completely backwards compatible, so that shouldn't be as big an issue.


> Did rust ever have a large scale breaking change?

Pre 1.0: every day

Post 1.0: not really, though I guess it depends on your definition of "large scale"; we've made some soundness fixes, but made sure that they were warnings for a long time first, so most people experienced no breakage.


Yes, I know it's completely backwards compatible. But in practice, if the standard libraries don't change, (a) there won't be a ready source of inspiration and examples to copy from; and (b) the feature is really geared more towards libraries than applications, so it's less useful if the base library doesn't adopt it wholeheartedly.


Fair enough, although it would be easy to have an alternative prelude with linear types that libraries could depend on. It's just that the extra dependency would really suck.

To be honest, I haven't sunk a lot of time into it yet, but I think the biggest beneficiaries will be streaming libraries and non-GC'd data structures like long-lived queues?


From statements I've seen on the subject, Rust uses affine types rather than linear types. I wonder why they wouldn't use affine types in this proposal too. Either way, it's good they're adding it.
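To illustrate the distinction: an affine value may be used at most once (silently dropping it is fine), while a linear value must be used exactly once. Rust's ownership rules give you the affine version. A toy sketch (the `Token` type is made up for illustration):

```rust
// A non-Copy resource. Rust's ownership rules make it *affine*:
// it can be moved (used) at most once, but never using it is allowed.
struct Token(String);

fn consume(t: Token) -> String {
    t.0 // takes ownership; the caller cannot use `t` again
}

fn main() {
    let a = Token("used".into());
    let msg = consume(a);
    // consume(a); // error[E0382]: use of moved value -- "at most once"

    let _b = Token("dropped".into());
    // `_b` is never used. Affine typing permits this; a *linear* type
    // system would reject it, demanding exactly one use.
    println!("{}", msg);
}
```

In a linear discipline, forgetting to close a file or free a buffer becomes a type error rather than a leak, which is roughly what the GHC proposal is after.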


A confidence interval won't adjust the point estimates, but it will give the points with a lower sample size wide confidence intervals (often covering zero).

Using an (empirical) Bayesian multilevel model can both attach uncertainty intervals to the point estimates and appropriately "shrink" the estimates towards zero at the low-sample-size end.

The latter is more directly interpretable, at the cost of slightly more complex modelling (/assumptions).


Thanks! I think the shrinking you mention is what I was trying to say :)

Looking for an explanation of multilevel models, I found http://mc-stan.org/documentation/case-studies/radon.html which seems to do exactly that in its "Partial pooling model" section (see graph).


I wonder whether this is the same sense in which the brain also uses logic to reason about stuff, and physics to interact with the world.


It seems to me more bent on buttressing a pro-scientific religious viewpoint.


Panama papers -- <anger> OpenBazaar -- cool!


AFAIK decentralised Bitcoin transactions are already illegal in the US if none of the parties to the transaction have a money transmitter license - correct me if I'm wrong.


You're COMPLETELY wrong... like extremely wrong. Bitcoin transactions between peers have always been legal.

Conversion between USD and Bitcoin over something like $5000/year needs an MSL.


If you're trying to make a point it is lost on us, but if you elaborated I'd be interested.


Just so people know, there is a competing/complementary approach to causality in statistics, called the potential outcomes or (Neyman-)Rubin causal model, which, as I understand it, is currently more popular than Pearl's graphical/do-calculus approach.
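The core of the framework fits in a toy simulation (made-up numbers): every unit has two potential outcomes, one under treatment and one under control, and the individual effect is their difference. In reality only one of the two is ever observed per unit (the "fundamental problem of causal inference"); here both are known because it's simulated:

```rust
// Toy illustration of the (Neyman-)Rubin potential-outcomes framework.
// Each unit carries both potential outcomes: y1 (treated), y0 (untreated).
struct Unit {
    y1: f64,
    y0: f64,
}

fn average_treatment_effect(units: &[Unit]) -> f64 {
    // ATE = E[Y(1) - Y(0)]; computable directly here only because this
    // is a simulation where both potential outcomes are known.
    units.iter().map(|u| u.y1 - u.y0).sum::<f64>() / units.len() as f64
}

fn main() {
    let units = vec![
        Unit { y1: 3.0, y0: 1.0 },
        Unit { y1: 2.0, y0: 2.0 },
        Unit { y1: 4.0, y0: 1.0 },
    ];
    println!("ATE = {}", average_treatment_effect(&units));
}
```

The whole apparatus of randomization and matching exists to estimate this quantity when only one of y1/y0 is observable per unit.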


There is also a time-series-related body of knowledge about estimating causal impact using the notion of counterfactuals. Google has sponsored research in the field [1] and has also released an R package [2].

[1] http://research.google.com/pubs/pub41854.html

[2] https://google.github.io/CausalImpact/CausalImpact.html


I don't think machine learning vs Bayes vs sampling theory has much to do with the content of the article, which is more about causality than interpretations of probability.

> it's far more likely that coffee causes cancer if it can accurately predict it, even when you control for all other variables

I don't know about this: prediction is not equivalent to explanation in general. The "all the other variables" bit is also a bit of a kicker (what counts as "all"?) -- hence randomization, and, well, pretty much everything else the article discusses.


Possibly related simultaneous discussion: https://news.ycombinator.com/item?id=10328699


+1. A fantastic article. Definitely one of the best popular-level stats pieces I've read.


Is anyone really an AGI expert?


I would argue the people publishing in the AGI journal:

http://www.degruyter.com/view/j/jagi

