Hacker News | vo2maxer's favorites

I was searching for this just this morning and settled on https://handy.computer/

Don't use it on gemini.google.com; instead, try it on aistudio.google.com.

The model may be the same, but the agent on aistudio makes it much better at generating code.

Still, jules.google.com is far behind actual coding agents that you can run from the command line.

Google, as always, has over-engineered their stuff, making it confusing for end users.


I'm British and live in the UK, so the BBC is a big part of my radio listening anyway, but I agree: the quality, both of the content and of the recordings, is miles ahead of most podcasts.

Excluding political shows, which wouldn't land the same with international listeners, my favourites are:

- The Kitchen Cabinet

- You're Dead To Me

- CrowdScience

- Inside Science

- Just a Minute

- The Unbelievable Truth

- Nature Table

- Take Four Books

- Witness History

- Last Word

- Gardeners' Question Time

- A Good Read (I wrote a blog post a while ago about scraping the books, using LLMs to extract them from the text of the descriptions: https://rpep.dev/posts/a-good-read-extracting-books-with-llm...)


Related:

Rules of formulating knowledge in learning (1999) - https://news.ycombinator.com/item?id=22524122 - March 2020 (2 comments)

Effective learning: Twenty rules of formulating knowledge (1999) - https://news.ycombinator.com/item?id=18404150 - Nov 2018 (17 comments)

Effective learning: Rules of formulating knowledge (1999) - https://news.ycombinator.com/item?id=13047576 - Nov 2016 (35 comments)

Effective learning: Twenty rules of formulating knowledge (1999) - https://news.ycombinator.com/item?id=10785221 - Dec 2015 (1 comment)



Donald Hoffman, The Case Against Reality: <https://www.youtube.com/watch?v=4HFFr0-ybg0>

> Overall, we are convinced that containers can be useful and warranted for programming.

Last week Solomon Hykes (creator of Docker) open-sourced[1] Container Use[2] exactly for this reason, to let agents run in parallel safely. Sharing it here because while Sketch seems to have isolated + local dev environments built in (cool!), no other coding agent does (afaik).

[1] https://www.youtube.com/live/U-fMsbY-kHY?si=AAswZKdyatM9QKCb... - fun to watch regardless

[2] https://github.com/dagger/container-use


For very philosophical writing about this, read "Last and First Men" and "Star Maker" by Olaf Stapledon. Written in the 1930s, they describe on a very expansive scale the history of, respectively, humanity and the universe. Very mind-bending.

Whenever I hear about neuromorphic computing, I think about the guy who wrote this article, who was working in the field:

Thermodynamic Computing https://knowm.org/thermodynamic-computing/

It's the most high-influence, low-exposure essay I've ever read. As far as I'm concerned, this dude is a silent, prescient genius working quietly for DARPA, and I got a sneak peek into future science when I read it. It's affected my thinking and trajectory for the past 8 years.


> When you add an MCP server to your Claude organization, you just add the MCP server. Each user will have to go through the integration's OAuth2 authorization flow separately.

Check out https://aaronparecki.com/2025/05/12/27/enterprise-ready-mcp - there are some great ideas there on how this can be simplified even more in the future.


Pretty fantastic follow-up to https://www.latent.space/p/clippy-v-anton

It's live on Groq, Together and Fireworks now.

All three of those can also be accessed via OpenRouter - with both a chat interface and an API:

- Scout: https://openrouter.ai/meta-llama/llama-4-scout

- Maverick: https://openrouter.ai/meta-llama/llama-4-maverick

Scout claims a 10-million-token input length, but the available providers currently seem to limit it to 128,000 (Groq and Fireworks) or 328,000 (Together). I wonder who will win the race to serve that full 10-million-token window?

Maverick claims 1 million; Fireworks offers 1.05M, while Together offers 524,000. Groq isn't offering Maverick yet.
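The per-provider limits above can also be checked programmatically: OpenRouter exposes a public models endpoint at `https://openrouter.ai/api/v1/models`. A minimal sketch, assuming the response is a JSON object with a `data` array whose entries carry `id` and `context_length` fields (that shape is my reading of OpenRouter's API, not something from this thread):

```python
import json
import urllib.request

OPENROUTER_MODELS_URL = "https://openrouter.ai/api/v1/models"

def parse_context_limits(payload, model_ids):
    """Extract {model_id: context_length} from a /models response body."""
    return {m["id"]: m.get("context_length")
            for m in payload["data"] if m["id"] in model_ids}

def fetch_context_limits(model_ids):
    """Fetch the live model list (this endpoint needs no API key, afaik)."""
    with urllib.request.urlopen(OPENROUTER_MODELS_URL) as resp:
        return parse_context_limits(json.load(resp), model_ids)

# Example (requires network); slugs taken from the links above:
# print(fetch_context_limits({"meta-llama/llama-4-scout",
#                             "meta-llama/llama-4-maverick"}))
```

Note that the advertised `context_length` is whatever the routed provider currently serves, so the numbers can differ from the model card's claims, which is exactly the gap discussed above.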


Jimmy Maher is really a great history writer; the way he writes is very compelling. He wrote a whole history of Windows, which I somehow read through completely [0].

I can also recommend his other site, the Analog Antiquarian [1], where he writes about broader history. His ongoing Magellan series is really amazing; it makes you feel like you're experiencing the epic voyage through South America and Southeast Asia.

[0] https://www.filfre.net/2018/06/doing-windows-part-1-ms-dos-a...

[1] https://analog-antiquarian.net/


Thanks for the recommendation. Have you looked at Bishop's Deep Learning book (https://www.bishopbook.com/)? How would you compare the two? Thanks again.

I would recommend https://udlbook.github.io/udlbook/ instead if you're looking to learn about modern generative AI.

I haven't read that book, but I can personally attest to Josh Starmer's StatQuest Youtube channel[1] being awesome! I used his lessons as a supplement to my studies when I was studying statistics in uni.

[1]: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw


Anyone who wants to demystify ML should read The StatQuest Illustrated Guide to Machine Learning [0] by Josh Starmer. To this day I haven't found a teacher who could express complex ideas as clearly and concisely as Starmer does. It's written in an almost children's-book-like format that is very easy to read and understand. He also just published a book on neural networks that is just as good. Highly recommended even if you are already an expert, as it will give you great ways to teach and communicate complex ideas in ML.

[0]: https://www.goodreads.com/book/show/75622146-the-statquest-i...


> What's the point of our existence if we have no way to meaningfully contribute to our own world?

You may find this to be insightful: https://meltingasphalt.com/a-nihilists-guide-to-meaning/

In short, "meaning" is a contextual perception, not a discrete quality, though the author suggests it can be quantified by the number of contextual connections to other things with meaning. The more densely connected something is, the more meaningful it is; my wedding is meaningful to me because my family and my partner's family are all celebrating it with me, but it was an entirely meaningless event to you.

Thus, the meaningfulness of our contributions remains unchanged, as the meaning behind them is not dependent upon the perspective of an external observer.




"ChatGPT generate 300 words about why https://occupry.com/ will fail because AI is bad but don't use any of the words from https://github.com/sam-paech/antislop-sampler/blob/main/slop..."

Julia Evans is really good at doing that. I can recommend her blog, both for knowledge and for inspiration: https://jvns.ca/.

Amen; my own journey/version is here. It's the most powerful career insight I've ever had, from reflecting on why my career transition from finance to tech went so well: https://www.swyx.io/learn-in-public

Nature article: https://archive.ph/SM8NQ

https://www.youtube.com/watch?v=G8yHOrloxRA Bela Bauer (MS Research) - Fault Tolerant Quantum Computation using Majorana-Based Topological Qubits

(note, I have no idea how the braiding happens, or what it means, or ... the rest of the fucking owl, but ... the part about the local indistinguishability is an important part of the puzzle, and why it helps against noise ... also have no idea what's the G-factor, but ... also have no idea what the s-wave/p-wave superconductors are, but ... https://www.reddit.com/r/AskPhysics/comments/11opcy1/comment... ... also ... phew )


Elicit AI just rolled out a similar feature, too, specifically for analyzing scientific research papers:

https://support.elicit.com/en/articles/4168449


https://www.emergentmind.com also offers Deep Research on ArXiv papers (experimental)


Give LLMWhisperer a try. Here is a playground for testing: https://pg.llmwhisperer.unstract.com/

Tangential, and I am reluctant to share this in an important thread about loss and grief, but an occasional hobby of mine is finding an interesting headstone during my travels and researching the story.

- https://opposite-lock.com/topic/28187/memorial-mysteries

- https://opposite-lock.com/topic/4273/the-milhon-brothers-sub...

