I don't buy it, and honestly I think hardly anyone else does either.
I remember when I tried the first iPhone in 2007. Sure, it wasn't super fast, the screen was small and pixelated by today's standards, but I remember going "Holy shit, this is amazing." It was easy to see its utility on the first use of it.
Contrast that with a VR headset. I think "This is a cool game to play for 20 mins or so", but I have absolutely no desire at all to spend any portion of my time in "the metaverse". This is fundamentally solving a problem that people don't have, and it's a solution they don't want. I have yet to see anyone (who isn't somehow paid to shill it) be genuinely excited about the potential of "the metaverse".
What about when you tried the first touchscreen smartphone, which was probably resistive and came with a stylus?
Seems unfair to compare an emerging tech with the iPhone (which took an emerging tech to the next level) off the bat. It's almost an argument in bad faith.
When I tried my first PDA, which was before anyone had stuck a SIM card in one, I was completely blown away. I wrote an entire book on the thing, and emailed it to myself for final print template tweaks, before sending to a publisher. After writing nearly a million words on it, I was just as excited about it as when I first got it.
Probably helped that the latency of the device back then was completely unnoticeable, in contrast with today's smart devices that can't keep up with my typing on their touchscreens.
On the contrary, I remember trying my first VR headset in the 90s when it was "Max Headroom-esque", so it's a bit rich to demand that I only consider the earliest smartphones by comparison, when VR has been progressing for 30 years.
And, to clarify, I think the current crop of VR headsets is quite cool. But again, I see them as cool in small doses. I also think many, many people have come to the realization that there is too much ever-present technology in their lives, and the last thing they want is to become the humans from WALL-E.
When it comes to crypto, given how antiquated our current financial settlement systems are, I can see its utility as a backend settlement layer, but 95% of what I see peddled in the crypto space is just another get-rich-quick scheme.
I have an OG Vive, Oculus Quest, and a Cosmos with the wireless adapter. Several hundred hours of SteamVR over the course of the past few years. All that said, I can do 1-2 hours of VR every other day or so at most. The idea that this will be as popular as smartphones in the next 5, even 10 years seems crazy to me. Being "in VR" is pretty exhausting no matter how lightweight the headset is.
> I remember when I tried the first iPhone in 2007. Sure, it wasn't super fast, the screen was small and pixelated by today's standards, but I remember going "Holy shit, this is amazing." It was easy to see its utility on the first use of it.
For the dumpster fire that it was (mostly imo due to hardware decisions / public perception), I had that same reaction with Google Glass.
It was super nice being able to see messages as they came in while doing various tasks. Being able to ask random questions and get visual responses, rather than an assistant reading you a random snippet from the result that it thinks is appropriate. Or being able to almost-instantly snap a pic or record a quick video.
All of it felt like a game changer, since I didn't have to take my hands away from whatever I was doing or crane my neck and twist my wrist to get my watch to display.
I think that in the coming years Apple/Google will come out with another similar device. I don't think phones as we know them now will go away overnight in favor of these devices, though; instead, they will essentially be a different smartwatch form factor for a while.
Agreed. Time may prove us wrong, but I don't see how it is in any way going to be the next big thing. The cell phone was something most everyone already had and needed, so a smartphone that can do more was an obvious next step with tangible utility, and it didn't require you to change anything fundamental about yourself or your habits. Requiring special, expensive, basically single-use kit just to get on the platform is a nonstarter for wide adoption. Yeah, people will use it and a subset will love it, but it's not the next Facebook or smartphone, barring some giant leap in something that I can't imagine.
I was playing Far Cry today and was chatting with my wife about it. The only thing I want a VR headset for is so I can have fast peripheral vision playing games that don't have a 3rd-person perspective.
Other than that, I cannot see a real utility for it outside of AR-specific uses like utility mapping or architectural work. Whereas with the phone, like you, I was pretty amazed by a rectangle that had all the knowledge of the world on it.
Yeah, but the big difference is that with VR you have to build a completely different experience.
The iPhone already had a massive number of apps, in effect, because you could visit any existing website on it.
You don't have that with VR. It has to be built, which is what Facebook is focused on: building tools and AI to help creators build "worlds" that can be used in VR.
Imagine your favorite band is playing live but they don't visit your town. You can put on a VR headset, join in on the event, and experience it like you are there. Your friends can come along without any of you being in the same country.
You're a Premier League fan in the US? You can still experience the game like you are there.
All these experiences will come as the tech to build them gets better, but these are WAY harder problems than what the iPhone needed to solve when it was first released.
> Imagine your favorite band is playing live but they don't visit your town. You can put on a VR headset, join in on the event, and experience it like you are there. Your friends can come along without any of you being in the same country.
That just sounds so contrived. You’d be paying money for that too… and it wouldn’t be much cheaper than going to see them live.
3D television was supposed to be the next big thing. I've used a VR headset before and it was neat but gimmicky and disorienting. People don't like wearing things on their heads, over their eyes, for extended periods of time for entertainment. I hear arguments that it's a generational thing or that it'll catch on the more you use it, but those have been said before about wearables. Ears seem fine, but I think there's something very different about covering your eyes that will be a blocker for mass-market appeal.
None of the mass market metaverse things people talk about are actual problems people have. Anything that is an actual problem (remote VR surgery) isn’t mass market.
I think that while VR is limited in many ways, AR does have the potential to overtake the phone if it becomes a truly seamless extension, akin to donning a pair of glasses. Seamless AR might be 10 or 20 years out, I don't know, but I can see the utility.
There's a lot of 1960s sci-fi where future people are planning out interstellar travel routes with slide rules. Even the "Foundation" series posits that basically all cool technology in the future is going to be based on mastery of nuclear power.
We do a bad job of extrapolating where the future will be from current trends. The frequent breakthroughs in physics and in industrial precision manufacturing of heavy machinery around the mid-20th century eventually slowed down. The remaining problems got too hard to make major leaps-and-bounds improvements on at the same pace, and the desire/value to keep pushing on them slowly faded. It's still there, but a computing revolution happened instead.
There's no reason to think the future revolving around improvements in networked computing is going to go on forever; we just don't know what the next "thing" is going to be. I'd argue most of the benefits of having the internet readily available at all times in all places are basically already realized by smartphones, and anything else is mostly just going to be a refinement or intensification of stuff we already know. It will not be transformational in the way smartphones were. It's on a similar arc where the leaps-and-bounds improvements are harder to come by and it's not clear what we'd do with them when they happen.

Gone are the days when anyone would notice a performance improvement in their day-to-day computing tasks from having a cutting-edge CPU or GPU. Advancements are increasingly only useful for more niche use cases that refine and improve what we already have rather than dramatically altering the way we work. Faster mobile data is cool, but it doesn't change the way you use your phone, and neither will a more immersive computing interface markedly change how you do the tasks you use a networked computer for.
I don't know what the next field that undergoes such revolutionary change will be. If I were a betting man I'd put my money on major breakthroughs in biomechanics and life sciences, but who can say for sure?
>It is the obvious extension of existing trends in technology.
I'm not so sure, but perhaps... although there are some really big problems that need to get solved to get there, and some of them aren't technological problems.
1. The technology isn't there yet. To its credit, it might be getting close, though. A not-insignificant number of people still get sick from VR goggles. The resolutions just aren't sufficient at present, and available portable computing power might not be enough either. IMO, the cartoonized VR environments we can currently produce on the fly aren't going to be of as much interest as more realistic environments and avatars. The spatial resolution of sensors and the lack of tactile feedback aren't gonna cut it for even the most basic tasks.
2. People wearing VR goggles look dorky AF. In fact, they ARE dorky AF.
I don't think Facebook's upper management fully appreciates this issue. Social stigma is a hell of a thing to try and fight. It's not impossible to do, but it will be way more challenging than I believe they think it is.
3. No one wants to wear a blindfold in public, and VR goggles are basically that and worse. AR might be able to address that, but for whatever reason, having a screen in your hand doesn't trigger the same vulnerability response people have to VR goggles. (It should, because it steals our attention like nothing else, but it doesn't.) People don't feel comfortable if they can't read the social cues of people around them and see danger approaching.
I think one of their strategies is the most promising: turn AR/VR into a common workspace standard. Monitors are big, clunky, hard to move, inflexible, etc. If you could use VR to create a really clean, functional workspace to replace the monitor, that might be a real winner and help address comfort and adoption issues.
This is still a really hard domain to solve, though. In video calls, we still have a lot of people who call in on their phones; that technology is compatible with lots of devices. One person wearing a VR headset isn't going to be compatible with everyone else, though. If I'm meeting you, in general and especially for anything with higher stakes, I don't want to talk to a fucking cartoon, I want to see your face.
It’s one of dozens and dozens of these technology paths branching out before us that seems somehow both obvious and not yet ready. Any day now we’ll have self driving cars, the metaverse, cryptocurrencies, robots doing chores…there are all these things that feel “obvious” and right here…yet so far away.
The HN crowd is, surprisingly, quite myopic about new technologies. Ethereum & VR headsets are great examples. Both of these are incredible and will definitely change the world in a very drastic way; yet both are hated and ridiculed here.
> Both of these are incredible and will definitely change the world in a very drastic way;
I'd easily take the other side of that bet.
More fundamentally, though, I take issue with the claim that "the HN crowd is myopic about new technologies". I'll just speak for myself, but I was easily excited about smartphones and the potential of SaaS businesses, and I remember taking my first Uber thinking "I'm glad I'm not a taxi driver", etc. etc.
The fact that many on HN look at hucksterism and bullshit with a critical eye is something I think we should collectively be proud of.
The gripe HN has with these technologies is the hype and speculation, and pointing those out is the responsible thing to do.
You see yourself as non-myopic, but if you look at it historically, technology has always panned out differently and in unexpected ways. Why do you presume you're not going to make the same mistakes?
Who on HN hates VR? That tech has been breathlessly praised here, at least in years past, but it's since fallen into the trough of disillusionment for the time being. Eventually it'll get to the plateau.
Remember Google Glass? Besides, I think new technologies are generally thought of conservatively by the majority and will only have a couple of supporters in the initial stages.
Both crypto and VR have been thoroughly debunked, many, many times, and yet people like you keep bringing them up as if you have not read any of the previous criticism. I'm not sure if there is anything that can be said to you that will help you see how deeply stupid these technologies are. Some of the critiques already posted to Hacker News have raised every valid point against these technologies. I can't think of anything more that can be said. At this point, we simply have to wait 20 years, and then you'll be able to see that these technologies never developed into anything much more than games.
This is oft-repeated but it's not as simple as that.
There's no sharp line dividing the two. You already have "VR + passthrough" which is pseudo-AR. And AR turns into VR as soon as you cover up enough of the real world.
There are some experiences/applications that suit AR more than VR and vice versa. Most devices will offer a mix of the two.
I think a ballpark to aim for is 95% reality, 5% overlay. Whether that's achieved using passthrough or not is a minor detail.
Since the benefits will be pretty subtle QoL things, the form factor has to be very ergonomic to make it worth it, so widespread use could be decades away.
I recently read for the first time about what happened to the Maya at the hands of the Spanish, and I was shocked at how bad it was. Even just in the context of the destruction of knowledge, it was terrible. There were thousands of books recording centuries of pre-contact American history that were all destroyed by the Spanish. Only four examples out of the many thousands of pieces of Maya literature have survived to the present. A few quotes from https://en.wikipedia.org/wiki/Maya_codices:
"Our knowledge of ancient Maya thought must represent only a tiny fraction of the whole picture, for of the thousands of books in which the full extent of their learning and ritual was recorded, only four have survived to modern times (as though all that posterity knew of ourselves were to be based upon three prayer books and Pilgrim's Progress)."
— Michael D. Coe
"We found a large number of books in these characters and, as they contained nothing in which were not to be seen as superstition and lies of the devil, we burned them all, which they regretted to an amazing degree, and which caused them much affliction."
— Bishop Diego de Landa
Care to elaborate on your comment for those who are not in the know? Did the Aztecs not burn scripture? I don't know much about Mayan and Aztec history and culture, but I visited Yucatán recently, and it sounded from the guides like a lot of original Mayan culture was lost in the mixing of the two. I was also told, but don't know if it's true, that human sacrifice was actually an Aztec practice that got introduced in the mixing and not an original Mayan practice.
They are attempting to imply that I am trying to whitewash the Spaniards. I am not; I am pointing out that the Aztecs erased history to create their own imperial narrative too, although not on the scale of what the Spaniards did to the Maya.
Why the hell would you put a virology lab in the middle of a huge city with millions of people? I am not a virologist, but it seems to me that if COVID was a lab leak, then it could have been avoided very easily just by having the lab in the middle of nowhere, like in the desert or the Arctic. Instead of the whole world having to go into lockdown, with millions dying, just put the lab in a remote location and make it so that before anyone leaves they have to quarantine for 2 weeks. This seems obvious; why isn't it standard protocol?
> This seems obvious; why isn't it standard protocol?
Would you take a job that had that requirement? Want a few days off? Two-week quarantine. Would investors be OK with this? Want to pay an expensive PhD for vacation plus quarantine time (which could mean days off plus 10 working days of quarantine every time)? The problem is that the workers in the lab are just as human, have the same needs for relationships and contact, and some have enterprises outside of their day job. Want to speak at a conference? Two-week quarantine...
Sure, that makes sense, but you could crank their pay way up to compensate, build places for their families to stay if they desired (like the families of the scientists working on the Manhattan Project), and let them present at conferences through teleconferencing. There are scientists who stay in Antarctica for months, and on the International Space Station. Yes, it would be inconvenient, but look at what (maybe) happens if you don't.
> I'm not going to take a job that makes it impossible to see my family
Literally addressed in the very next part of that same sentence you quoted:
> build places for their family to stay if they desired
Cool for you though if you aren’t for sale. There are many people in this world with quite varied interests and motivations. I wouldn’t do it either in my current position in life, but might have considered it when I was younger.
Even if the price is the lives of millions and the whole world in lockdown? We might be better off if you don't take the job, then. Also, you ignored the rest of the sentence you are quoting and my examples of highly skilled scientists who are willing to make that sacrifice in Antarctica and on the ISS.
> Even if the price is the lives of millions and the whole world in lockdown?
I don't have to research gain of function in viruses. I can choose to research something else. Yes, some people are willing to do extreme things in pursuit of their careers, but most people, even highly educated people, have things they value more than their job.
These questions, while valid, should not matter. People who had to work in Los Alamos would perhaps have preferred a location in Manhattan, too.
There is an oversupply of scientists who would do anything to work in their fields. One could also question whether this sort of research has ever produced anything valuable (I'm not in the field; genuine question).
> There is an oversupply of scientists who would do anything to work in their fields.
I'm pretty sure that's not true, and it assumes that people will rationally accept a bad deal. Personal experience is that the better educated a worker is, the higher their expectations are for how they are treated. Example: tenured faculty at a college.
Without knowing much about the topic, when I hear about code being proven to be correct it makes me think of the Curry-Howard correspondence, which states that proofs and programs are isomorphic to each other. Is this related at all? If programs and proofs are the same thing and you have a proof that a program is correct, is that like having a proof that a proof is correct? In which case it seems like you are getting into the domain of meta-logic.
No, not really. Under Curry-Howard, if you have a total function that returns an integer, you’ve proven that an integer exists. But we knew that already.
To prove non-trivial things, you need more sophisticated types that make interesting assertions about their values, where it’s not immediately obvious how to construct a value of that type. Special proof languages are used for this.
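To make that concrete, here's a minimal sketch in Lean (my own illustration; the names are made up):

```lean
-- Under Curry–Howard, a term is evidence that its type is inhabited.
-- A total function returning a Nat only "proves" that a Nat exists:
def someNat : Nat := 42

-- Richer types state richer propositions. This type reads
-- "for every n, n + 0 = n"; the term inhabiting it is the proof.
-- `rfl` works here because `n + 0` reduces to `n` by the
-- definition of addition on Nat:
theorem addZeroRight (n : Nat) : n + 0 = n := rfl
```

So a correctness proof isn't really a "proof of a proof"; it's a term inhabiting a type that encodes a proposition about the program's behavior.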
Frankly, I don't give a shit about the national interest of the US or the national interest of China. As a user of tiktok, I enjoy the content and I will be saddened if it shuts down. We like to think of our country as our team, and to root for it in international struggles, but the reality is that it's the average consumer that gets harmed by these power struggles between nations. These economic proxy wars are just as much of a racket as conventional wars, and the sooner we stop allowing the few to rule the many, and play out their egoic power struggles with our lives, the better.
For some time I have been thinking about why scientific progress seems to have slowed down compared to a century ago. Then one day I came across a clip of an interview with Professor Russell where he was saying that if technological progress continued at the current rate, humanity would very soon drive itself extinct through the creation and use of weapons that were too powerful and too easily produced. I wonder if the slowing of scientific progress might in some ways be a saving grace: if technology that was too advanced were developed too quickly, society would be unable to evolve in time to handle it responsibly without major potential for calamity.
Would just like to point out that Vaclav Smil says the period from about 1870 to 1910 is unmatched by any other time period before or since, excepting a sliver of the Han dynasty, as measured by the rapid surge in invention and innovation and the rapidity with which those new ideas were employed in the broader society (rate of adoption). He goes on to say that even in today's seemingly endless cycle of new products and legitimate advancement, we're making only point solutions in comparison. For instance, the whole-house AC power system would be easily recognizable by Edison today. Improved, yes, but compared to going from nothing to the direct-to-home and whole-home solution set, it's illustrative of relative stagnation. To be clear though, he said that it's unrealistic to expect society to perpetuate those periods, as they are flukes - again, occurring only twice in recorded history. I'm not doing the depth of his reasoning justice by any means, but I think his perspective would bring another interesting angle to the idea that you relate from Russell.
I wonder what Edison would think of the internet, self driving cars, speech recognition, image recognition, social manipulation at scale, VR, modern medicine, etc.
Self-driving cars in his age were called horses. Those (and donkeys and mules) still remain the preferred mode of autonomous all-terrain mobility in some regions.
The US Army was actively considering resurrecting pack mules for use in Afghanistan as of 2011, though I don't believe that actually occurred.
Otherwise, E.M. Forster's "The Machine Stops" (1909) envisions much of what you describe.
It seems to me the balance is between massive state power and massive individual power. I'm pretty sure about which one I fear more. That said, I think Nick's suggestions are pretty sound.
That said, I think about Freeman Dyson's call to teach children genetic engineering skills so they could create warm-blooded plants to bring life to the asteroid belt... Well, maybe at some point.
Not the OP, but they can both be pretty bad: Nazi Germany or the USSR on one side, or Somali warlords on the other. Perhaps the warlords are better, but that's not obvious to me.
A common theme amongst optimists is how much technological advance we have seen and how these advances seem to be accelerating. I am of the opposite camp and believe, roughly, that the rate of progress has been steadily declining since we landed on the moon in 1969. While I can't explain the 1970s and 1980s, the modern Gilded Age's lack of technological progress is easier to understand. The intellectual lobotomization of smart, young STEM talent has been an explicit strategy by Big Tech, fueled by unseemly profitability and high-flying stock prices. If you can't innovate but are wildly profitable, wouldn't you also just pay the incremental talented engineer to work for you on anything rather than have them work for a competitor or, worse still, work for themselves and invent something disruptive that could impact your monopoly? Of course you would... and they have.
If you were a bright, ambitious engineer graduating in STEM in the early 1960s, this wasn't the case. You went to work on something meaningful. The momentum toward spaceflight was a call to arms for the smartest and hardest-working amongst us. These bright men and women invented new capabilities and entire ecosystems in fuel cells, gas storage systems, thermodynamic materials, engines, mechanical timers and clocks, and control systems, to name a few. The cost of the entire Apollo program was $25 billion, or $150 billion in today's dollars.
Big Tech spent $75B on R&D in 2018 alone. Put another way, in two years of 2018-level R&D spending, Big Tech could have funded sending people to the moon and back. It's fair to say, however, that what we have witnessed instead can charitably be described as something less ambitious and impactful than that.
This misallocation of capital won’t end until we demand it - every government, regulator and individual now has a role to play whether you know it or not.
I second stevenwliao's recommendation of Superforecasting by Philip Tetlock. It's a good intro. It might seem unsatisfying in that there is no single thing that makes one good, and he basically refers to it (appropriately) as good judgement. But it highlights a lot of basic areas.
Practice is also good. If you're not used to probabilistic thinking you'll need to develop that intuition and calibration.
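As a toy illustration of what tracking calibration can look like (my own sketch, not from the book): log your probability forecasts alongside outcomes, score them with something like a Brier score, and check whether your "80% confident" calls come true about 80% of the time.

```python
# Toy calibration tracker: log (probability, outcome) pairs for your
# forecasts, then check the Brier score (lower is better; always
# guessing 50% scores 0.25).

def brier_score(forecasts: list[tuple[float, bool]]) -> float:
    """Mean squared error between stated probabilities and outcomes."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in forecasts) / len(forecasts)

# Hypothetical forecast log: (stated probability, did it happen?)
log = [(0.9, True), (0.8, True), (0.8, False), (0.6, True), (0.3, False)]

print(f"Brier score: {brier_score(log):.3f}")
```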
Anything about how to think about things better is going to be useful. There's a Coursera course called Model Thinking that might be useful. Just being curious about things in general and pushing yourself outside of your normal areas of competency/interest helps too.
It might seem weird, but I find Twitter to be pretty essential these days. There are a lot of smart people freely sharing information and some don't mind answering questions.
"We are about to study the idea of a computational process. Computational processes are abstract beings that inhabit computers. As they evolve, processes manipulate other abstract things called data. The evolution of a process is directed by a pattern of rules called a program. People create programs to direct processes. In effect, we conjure the spirits of the computer with our spells.
A computational process is indeed much like a sorcerer's idea of a spirit. It cannot be seen or touched. It is not composed of matter at all. However, it is very real. It can perform intellectual work. It can answer questions. It can affect the world by disbursing money at a bank or by controlling a robot arm in a factory. The programs we use to conjure processes are like a sorcerer's spells. They are carefully composed from symbolic expressions in arcane and esoteric programming languages that prescribe the tasks we want our processes to perform.
A computational process, in a correctly working computer, executes programs precisely and accurately. Thus, like the sorcerer's apprentice, novice programmers must learn to understand and to anticipate the consequences of their conjuring. Even small errors (usually called bugs or glitches) in programs can have complex and unanticipated consequences.
Fortunately, learning to program is considerably less dangerous than learning sorcery, because the spirits we deal with are conveniently contained in a secure way. Real-world programming, however, requires care, expertise, and wisdom. A small bug in a computer-aided design program, for example, can lead to the catastrophic collapse of an airplane or a dam or the self-destruction of an industrial robot."
It bothers me that the quotes in this article are all cut up, in some cases ending when a sentence clearly wasn't finished. It makes it hard to judge what they are really saying here, and I wish the full interview would be published.
15 years from now everyone will say "remember when we didn't have these computers literally strapped to our face all the time?"
It is the obvious extension of existing trends in technology.