bdbenton's comments | Hacker News

Only slightly related, but "pawpaw" is often used in the Southern United States as a nickname for grandpa.

Reading this article was pretty surreal and funny with this in mind: everyone rushing to buy and eat their pawpaws for their mango- and banana-like flavor.

I wonder if you could cultivate it on a larger scale; the demand is there. Or maybe they are more like truffles and need to be foraged.

Interesting article, now I have to try it. On a similar note, people take for granted the fact that tomatoes, potatoes, chocolate, vanilla, and blueberries all originated in the Americas.

Imagine ancient Rome without the tomato. Some things are so commonplace that we forget their origins.


And peppers!

Imagine Szechwan, Thailand, Indonesia, India, the Philippines, and Europe waiting millennia for peppers to reach their shores.

Likewise, peanuts to Indonesia and Africa.

Somehow sweet potatoes got from the Amazon to Polynesia centuries before Europeans did.


Wow, that completely changes my perception of history.

There are two competing theories about the initial peopling of the Americas. The first is the land bridge theory, which suggests hunter-gatherers tracked mammoths across the Bering land bridge between modern Russia and Alaska; a newer and more popular theory is that Polynesian peoples arrived by boat.

Apparently, sweet potatoes arrived in Polynesia between 1200 and 1300 AD, which lends a lot of weight to the idea that the first human inhabitants of the Americas arrived by boat from Polynesia.


Genetic evidence suggests people got to the Americas about the same time as to Australia. That would have been by boat along Japan, the Aleutians, and down the west coast, maybe 50,000 years ago.

Polynesians came along tens of thousands of years later, starting from Taiwan and spreading out to southeast Asia, the Pacific, and Madagascar. They would have sailed from Pacific islands to Peru or Chile, and picked up sweet potato there. (Surprisingly, not potatoes or peanuts, although they could have taken and then lost those.)

We have some evidence of hominins in California 130,000 years ago -- H. erectus, Neanderthal, H. sapiens, or "other" -- although no evidence that they left descendants.


I firmly believe that people should take at least one day of the week to not work at all. My code is very buggy and sloppy when I'm completely exhausted; even my grammar and my ability to organize ideas fall apart.

For many people it's traditionally Sunday or Saturday, but even if you don't observe a day of rest religiously, it is still very practical. If you are involved with intellectual work like coding, switching your brain to just relax and enjoy life at least one day a week will refresh your focus and drive.

Alan Perlis said that to understand a program you must become both the machine and the program, and he also admitted that programming is an unnatural act. Programming essentially forces your mind to think like a machine, and you can't do that all the time without burning out.

Don't neglect your mental wellbeing; take care of yourself! If you want to be productive, this will help you in the long run. The servers will keep running if you take a bit of time off; people aren't built for 24/7 uptime.


Having a real weekend shouldn't be a novel idea... I assume most people in our industry have one? I know it would be a luxury for some, but I couldn't live without it.


I agree, but with programmers in particular it's seen as a badge of honor, and sometimes even an expectation from employers, to just "eat, sleep, and code" all the time. We have things like "the crunch" and hackathons, and the dark side is the lack of labor protections around computer work.

In fact, computer programmers have special exemptions under the Fair Labor Standards Act (FLSA).

https://www.dol.gov/agencies/whd/fact-sheets/17e-overtime-co...

The FLSA guarantees minimum wage and 1.5x overtime pay for most workers in the USA, but computer employees who meet certain criteria are exempt. In other words, if you are a computer programmer in the US, your employer can legally work you overtime and weekends without the 1.5x pay other workers get.
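To make the pay difference concrete, here is a toy Python sketch (the numbers and the hourly framing are made up for illustration; exempt employees are usually salaried, and none of this is legal advice):

    def weekly_pay(hourly_rate: float, hours: float, exempt: bool) -> float:
        """Gross weekly pay: non-exempt workers get 1.5x past 40 hours."""
        if exempt or hours <= 40:
            return hourly_rate * hours
        return hourly_rate * 40 + hourly_rate * 1.5 * (hours - 40)

    rate, hours = 50.0, 60.0  # hypothetical crunch week
    print(weekly_pay(rate, hours, exempt=False))  # 2000 + 1500 = 3500.0
    print(weekly_pay(rate, hours, exempt=True))   # flat 50*60  = 3000.0

Same sixty-hour week, $500 less take-home for the exempt worker, and nothing stopping the employer from asking for it again next week.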

Of course, the US also has the highest-paying programming jobs, and it has the most software developers by a very wide margin; America invented software engineering, after all. That unique position in the labor market has meant the workplace rights of computer programmers are largely overlooked.


I think the most practical consumer use would be a portable replacement for desktop and mobile displays: running the everyday desktop and mobile apps that are already established, along with interesting AR apps for entertainment and productivity.

As with a lot of cutting-edge tech, the most immediately useful applications seem to lie beyond consumer tech, in the world of military tech. Sending critical information to soldiers in the field and integrating with weapons systems has been a major focus of AR since its inception; early HUDs have their roots in fighter jets.

If you think about it, a lot of things we credit to the high-tech industry had their beginnings in low-key and often confidential military research, from the internet to computers themselves. That said, an app for motorcyclists that displays speed, engine, and navigation data in your field of view would be seriously bad*ss, and successful if it worked in a clean and functional way.

As with a lot of emerging tech, its usefulness often depends on the ingenuity of early pioneers. People thought the internet was a useless, passing trend as late as the 1990s.

Imagine silently controlling a fully featured AR display system with a brain-computer interface and zero peripherals, like some Ghost in the Shell futuristic tech. That would be pretty awesome and legitimately useful.


Long reply from a young programmer who recognizes this trend and what it implies:

Alan Perlis said you can measure a programmer's perspective by their thoughts on the continued vitality of Fortran. Many universities are now going full Java and lowering their standards on fundamentals for financial reasons, which is a bit alarming.

If you really want to stand out, spend serious time on fundamentals. Learn to develop and deploy clean, efficient software from scratch, without leaning heavily on external dependencies that add bloat.

A huge trend in software right now is low-code/no-code tools. These tools can replace programmers who don't have a deep enough grasp of fundamentals to compete. A very small percentage of the world's population knows how to code, so these tools are becoming massively popular.

This is what separates the programmer who can quickly throw together a basic CRUD app from the programmer who can develop a successful programming language from scratch. If you can make sense of the Linux kernel and have the drive to constantly learn new tech, it doesn't matter much what trends come and go.

If you just want to ride the wave and think that a CS degree or knowing how to code will guarantee you an easy life, you're in for a rude awakening when the industry keeps moving forward and economic crises trim the fat from the cyclically overvalued, monopolistic world of software development.

This is the duality of being a programmer. Yes, you can teach yourself many useful things and acquire powerful skills relatively quickly, but if you stop learning and challenging yourself, then you are liable to be left behind.


A lot of popular web apps seem to decay this way. The founding developers create an excellent site, it becomes a huge cash cow, and then the organization grows bloated and more focused on extracting value from what already exists than on creating new value or building on the original principles.

Medium didn't always have a paywall, and I'm thankful to the random users here who post archive links to bypass paywalls; I even installed a browser extension for it thanks to this site. Facebook is starting to lose users for the first time in its history, despite a rebranding.

This website really doesn't have that issue as far as I can tell, and people seem receptive to you promoting your software if it is actually useful. There doesn't appear to be heavy censorship of hacker-related material that conflicts with corporate interests, like the aforementioned paywall-bypass links.

In other words, if it ain't broke don't fix it lol


None of the companies mentioned were cash cows; Medium and Quora barely made any money. In general, it's really hard to monetize blogging.


> if it ain't broke don't fix it

It's a difficult line, because sometimes a website becomes commercially irrelevant over time precisely because functionality is not added. And I guess some sites need to follow visual trends to retain customers.

There are also long-term changes that can force your hand: going mobile-friendly required changes, or you would lose those customers.

Some usability trends are improvements, and some are bad: Apple seems to prefer pretty looks at the expense of usability, despite once having a fantastic HIG (human interface guidelines). https://developer.apple.com/design/human-interface-guideline...


They're not adding functionality; they're putting paywalls up in front of old functionality. Look at Quora: certain answers are now paywalled, and answers are supposed to be the core of the service.


Medium was broken: it didn't bring in enough revenue.

Just because it was working from the user's perspective doesn't mean the underlying model wasn't broken.


And even if it broke, don't (can't) fix it.


Most of the world's oceans remain unexplored. Underwater cave systems in particular have a lot of uncharted territory. It's dangerous work, but explorers are still finding new caves and naming them.

Underwater robotics makes this task a lot easier. It's interesting to think that most of the planet's surface is abyssal plain, the largest habitat on Earth, flat surfaces on the sea floor beyond the reach of sunlight.

Ocean exploration really sparks curiosity. We are in such a rush to leave our beloved blue planet and send people to uncharted frontiers, but we still have an unfinished frontier right below our noses. Marine exploration also coincides with environmental preservation, as oceans are made of the most vital, life-giving resource on the planet.

Something interesting you learn in SCUBA diving is that sound waves travel faster in water than in air. That makes it difficult for humans to discern the source of a sound underwater, but machines like this seem to emulate the sonar capabilities of underwater creatures.
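To put rough numbers on that, here is a quick Python sketch (the sound speeds and ear spacing are approximate textbook figures, not from the article). The brain localizes sound partly by the tiny arrival-time gap between the two ears, and that gap shrinks as sound speeds up:

    SPEED_AIR = 343.0     # m/s, speed of sound in air at ~20 C
    SPEED_WATER = 1480.0  # m/s, speed of sound in seawater, ~4.3x faster
    EAR_SPACING = 0.2     # m, approximate distance between human ears

    def max_itd_us(speed: float) -> float:
        """Largest possible arrival-time gap between the ears, in microseconds."""
        return EAR_SPACING / speed * 1e6

    print(f"air:   {max_itd_us(SPEED_AIR):.0f} us")    # ~583 us
    print(f"water: {max_itd_us(SPEED_WATER):.0f} us")  # ~135 us

With only about a quarter of the usual time difference to work with, the directional cues we rely on in air mostly disappear underwater.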

Always fascinated by what the minds at MIT are working on.


The Great Train Robbery (1903) ends with one of the bandits pointing a pistol at the camera and shooting. Apparently, this also disturbed audiences so deeply that they got up and ran out of the movie theater. This is more of an urban legend than anything, as mentioned in the article, but it goes to show the emotional power of new technology.

It reminds you of Clarke's third law, "Any sufficiently advanced technology is indistinguishable from magic." I still remember the ominous feeling of receiving an unsolicited, location-based mobile notification from Google requesting a review immediately after leaving a local restaurant. Digital surveillance works by some unseen "magic" to most users, which is more dread than horror, like a hacker posting your IP and location out of nowhere just to freak you out.


>I still remember the ominous feeling of receiving an unsolicited, location-based mobile notification from Google requesting a review immediately after leaving a local restaurant. Digital surveillance works by some unseen "magic" to most users, which is more dread than horror, like a hacker posting your IP and location out of nowhere just to freak you out.

Vernor Vinge's A Deepness in the Sky depicts a human interstellar civilization thousands of years in the future in which superluminal travel is impossible (for the humans), so travelers use hibernation to pass the decades while their ships travel between star systems. Merchants often revisit systems after a century or two, so they see great changes on each visit.

The merchants repeatedly find that once smart dust (tiny swarms of nanomachines) is developed, governments inevitably use it for ubiquitous surveillance, which inevitably causes societal collapse. <https://blog.regehr.org/archives/255>


> This is more of an urban legend than anything (…) but it goes to show the emotional power of new technology.

What does it show if it didn't actually happen?


Exactly. It's only mind games.


Near/far/wherever you are...


Clarke's third law or whatever just reminds me of how "progress" is a religion in the liberal West. It's not an observation but a declaration of some silly standards that are nonsensical to any mind not indoctrinated into the delusions of elite Western liberalism. On top of that, it's also outright pompous.


Clarke's Third Law is... pompous?


I suppose if you read the inverse of the law, there's a tenuous thread connecting it to the notion that progress is a necessary step in disabusing ourselves of our illusions about nature and how the world works. The risk is both assuming you know more than you do because you "believe in science" and devaluing things in the natural world because they don't fit whatever notion of advancement is in vogue.

We often see valuable discoveries come from areas that were previously overlooked. There's not enough there to demand any sort of cute phrase, but just enough to discourage the notion that a society with some advancements over another has nothing to learn from it.


Seriously, trying to "capture" devs with closed ecosystems just stifles creativity and limits potential with corporate gatekeepers and walled-off technologies.

The open source mindset, like with Android, is a big reason why it dominates the global mobile software market. Security is one thing, but blocking functionality and self-publishing only drives away good developers and innovative apps.


> like with Android

When most people think of Android or the Google Play Store, they think of a cesspool of ugly, inconsistent, amateurish trashware.


Very strong feelings lol; coincidentally, that's how many Westerners think of every country outside the USA and Europe.

Doesn't change the fact that Android has the largest market share worldwide in both hardware and software.

Expensive doesn't mean quality, and a lot of developers don't want to work with a company that makes billions on child labor and similar practices. Apple has plenty of useless apps too, but as anywhere else, users surface the best ones. Besides, developers today often deploy identical apps on multiple platforms.

This pretentious attitude is very isolating; I'm just glad that ad blocking and privacy are options I have as a user. And I don't have to shell out huge margins on overpriced gadgets to feed a tech monopoly that violates basic human rights for profit.

That's the last comment I'll write, because there are already enough Android/iOS flame wars polluting the internet. You had your word, I had mine, that's as civil as I can be.


> lot of developers don't want to work with a company that makes billions on child labor and similar practices

Wait, which company is using child labor?


Exactly my thoughts on the subject of the iOS store. Disclaimer: I use iOS.


I think the problem is that phone makers mix up two concepts: the content filter and the app store. They are orthogonal things and should be separate.
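As a minimal Python sketch of what that separation could look like (all the names here are hypothetical), any store composes with any filter, which is exactly what makes them orthogonal:

    from typing import Protocol

    class ContentFilter(Protocol):
        def allows(self, app_id: str) -> bool: ...

    class AppStore(Protocol):
        def install(self, app_id: str) -> None: ...

    class ParentalFilter:
        """One possible filter: blocks an explicit denylist of apps."""
        def __init__(self, blocked: set) -> None:
            self.blocked = blocked
        def allows(self, app_id: str) -> bool:
            return app_id not in self.blocked

    class ThirdPartyStore:
        """One possible store: installation only, no content policy."""
        def install(self, app_id: str) -> None:
            print(f"installing {app_id}")

    def guarded_install(store: AppStore, flt: ContentFilter, app_id: str) -> None:
        # Any store pairs with any filter; neither needs to know the other.
        if flt.allows(app_id):
            store.install(app_id)

    guarded_install(ThirdPartyStore(), ParentalFilter({"casino-app"}), "notes-app")

A platform could then swap in a stricter filter, or a different store, without touching the other half.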


I've never seen more creativity unleashed more quickly than in the Apple App Store. The Play Store is fine if you stick to known apps, but it's just full of shit otherwise. Then again, that's the point of an open ecosystem.

