Ever is a strong word. I shouldn't rule out a future like that. But I don't see the through line from our current AI to one that has entirely supplanted all human creation.
That consideration seems more along the lines of worrying about the eventual need to escape earth than a future on a closer horizon worth worrying about.
I'm more concerned about how every facet of our children's lives will become inundated with shoddy AI being used to extract maximum profit at the cost of any humanness, and the death of all genuine communication on the internet.
AI has, in a few short years, gone from hardly being able to string words together, to writing coherent grammatical sentences, to being more proficient than an untrained human in many cases. ChatGPT is way better at writing poems than me, for example. Its style transfer capabilities are out of this world.
Thinking that the progression is going to slow down is just wishful thinking.
It is more visible with image synthesis. Sure, the style can be switched freely in a few seconds, and fitting compositions from trained concepts is very impressive.
But there still is no real creativity: no emerging concepts, aside from complete accidents that cannot be replicated. The same is true in the other direction, where models like CLIP can produce an interpretation of generated images. It is impressive, but there are still clear limitations. You cannot expect linear growth here; it could be that the current AI approaches are wrong, that we hit a plateau and need entirely new approaches for significant improvements. What we have now is an insane amount of data and more powerful hardware, so it could be that we face years of slow, iterative improvement while people fine-tune their models.
I think LLMs have the same problem overall, it is just more difficult to notice.
Your response to "by that metric, essentially almost everything is disallowed" is "well yeah, just don't do it". I don't think that stance would sit well with most.
ehh, it's a choice to buy into any of that bullshit.
we eloped 3,000 km away to the tropics to get married. it was more of a holiday with a wedding day thrown in. we loosely invited a few friends and family, no expectations, and were pleasantly surprised to find 30 people met us there.
The whole thing, holiday and all, cost us around $10k and was a total blast.
I found the traditional wedding day routine to be super claustrophobic and rigid.
The industry says weddings are "all about the bride", and IME the marketing seems to target women and their anxiety almost exclusively. So it may be challenging to find a partner who is similarly minded.
>No need to risk half of my current assets and future income
life is risk. marriage is a great way to de-risk many aspects of your life. 2x income for only 1.3x increased expenses. Easier debt/mortgage access. Support (financial/emotional) in case of illness or income loss.
>other person, who probably doesn't like me very much anyways
Marrying the right person is the key (hanging out with your best friend forever is cool), and that for sure is risky/not easy.
>marriage is a great way to de-risk many aspects of your life. 2x income for only 1.3x increased expenses
Then what happens when you get married, and your spouse soon quits their job or is fired because they're a terrible employee, and you didn't see signs of this while dating? Now you have 1.3x expenses on 1x income. Worse, what if your spouse is terrible at money management? Now you have 2-3x expenses on 1-2x income.
>Support (financial/emotional) in case of illness or income loss.
What if your spouse only seemed to be emotionally supportive before you got married, but afterwards (or when you got sick) the facade was dropped? What if they drop you like a rock when you get sick?
>Marrying the right person is the key
Of course. Similarly, if someone loses the lottery, you can just tell them, "that's your own fault: you should have picked the winning ticket".
we don't live in the 50s any more. My wife and I were living together for 8 years before we got married. That sure would have been one doozy of a long con if she turned around and suddenly became a totally different person once the ink dried on the marriage certificate.
Picking the right person is not the same as buying a lottery ticket, lol.
Now you are moving the goalposts. There are many valid reasons to get married that do not involve having children, and with those you can take your time.
Honestly, if you want to have children, have them with someone you can see yourself co-parenting with, not someone you love. Children place a lot of stress on a marriage, and even if you are personally compatible but have different attitudes on child-rearing it will ruin the relationship. Having kids within a romantic relationship is a terrible idea, IMHO. Be practical about it.
So many what-ifs in life. What if I'm successful, what if I'm a failure? Will my partner still love me? What if I'm ugly, what if my health problems are too much? What if, what if... What if I divorce and lose everything? So what? You could have lost everything a thousand other ways too: a car crash, a violent attack, a random event or a calculated one.
Too much success can easily lead to failure; too much failure could actually lead to success, if you learn.
Life is about ups and downs. I've lost so much over and over through bad circumstances (complicated deaths, parents with problems, etc.) that I find it mind-boggling how much control people actually think they have in life. Sometimes you just get bad hands, but that can be good sometimes, as it can give you perspectives that others may miss out on. Empathy is something often hard-earned.
So many people think if they just play life safe they will always be safe, and yeah, I guess maybe. But for me, I see life as a best-hope situation in general. Get lucky and good stuff might happen to you; do your best to get lucky. I'm so used to instability in life that I personally find it amazing to realize how many people are in a blessed position to make/have friends, health, wealth and education, a happy family and such. And I'm sure many would see my life as stable lol.
I think the best thing is to be thankful for everything you get, because it might not last. But is something worse because it doesn't last, or is it better? A hot piece of bread is much better than one that is always cold; because it's so short-lived, it's a simple pleasure. Life itself is short-lived too, but I've been thankful for the things I've learned.
Even if bad stuff happens, it's not the end of the world, because you are still alive. Though I know plenty wish they weren't alive, some being successful at getting their wish. It's understandable; a lot of life is painful and disappointing. I'm not married or anything, but I can tell you from my own experience: money is a tool, sometimes you will have it and sometimes you won't, but you can make do. Health is a blessing; sometimes you will have it, but when you don't, you wish so much you did. Friends are such a joy when you have them, but they too might not last forever, and family can be the same. When it comes to love, it's really a question of sacrifice. If you love someone, sacrifice is much, much easier.
But I think it's hard for people to truly love. It's often painful, and sometimes the rewards don't seem to be there. How much is their life really worth? But would you really leave people worse off? I think the movie It's a Wonderful Life is a good example of this. I think knowing people from other cultures and countries also makes you realize how wildly different life can be for people.
I do plan to get married one day, and I personally believe sharing everything is important, because loss is to be expected, so enjoy it while you can. I know many would find me a fool, but everyone is a fool in their own way. I don't think of marriage as winning the lottery, but then again I think you are quite lucky to marry in the first place, because quite a lot of people may never see the opportunity come up at all. Even though my parents' marriage went up in flames and they both lost everything in it, I think I've learned a lot of what not to do lol. I don't blame marriage for their self-destruction, only their own expectations of what to expect out of life.
It was their attempts at control that caused such a spiral.
Her home city of Hanoi doesn't have tap water that the locals like to drink. Even after boiling I think it can taste bad. So she grew up with heavily filtered water or bottled water. I would guess her diet was lacking in many trace minerals from this.
Just because a treatment works doesn't mean it's the best treatment, or is treating the "cause." To say that a lack of fluoride is the cause of their bad teeth is silly. Not eating sugar/acidic foods, Novamin, or a different set of bacteria in her mouth would also improve her teeth.
Two years ago, a machine that understands natural language and is capable of arbitrary, free-form logic or problem solving was pure science fiction. I'm baffled by this kind of dismissal tbh.
>but LLMs are never going to go into a recursive loop of self-improvement
Two years ago we already had GPT-2, which was capable of some problem solving and logic following. It was archaic, sure, and it produced a lot of gibberish, yes, but if you followed OpenAI releases closely, you wouldn't think that something like GPT-3.5 was "pure science fiction"; it would just look like the inevitable evolution of GPT-2 over a couple of years, given the right conditions.
No it isn't. Even before transformers, people were doing cool things with LSTMs, and RNNs before that. People following this space haven't really been surprised by any of these advancements. It's a straightforward path imo.
Few people saw it coming in just two years, sure. But most people following this space were already expecting a big evolution like the one we saw in 5-ish years.
It's a text RPG built on top of GPT-2 that could follow arbitrary instructions. It was a full project with custom training for something you can get with a single prompt on ChatGPT nowadays, but it clearly showcased what LLMs were capable of and things we take for granted now. It was clear, back then, that at some point something like ChatGPT would happen.
>I often wonder if the filter, is more to hide its true capabilities.
right now we're all sharing a slice of GPT. I wouldn't be at all surprised if there's some uber GPT (which requires a lot more processing per response) running in a lab somewhere that blows what's publicly available out of the water.
>Maybe it'll turn out to be a distinction that doesn't matter but I personally still think we're a ways away from an actual AGI.
if you had described GPT to me 2 years ago I would have said no way, we're still a long way away from a machine that can fluidly and naturally converse in natural language and perform arbitrary logic and problem solving, and yet here we are.
I very much doubt that in 5 years time we'll be talking about how GPT peaked in 2023.
It has not been solved. Even GPT-4, as impressive as it is for some use cases, is dumb and I can tell the difference between it and a human in a dozen sentences just by demanding sufficient precision.
In some contexts, will some people be caught out? Absolutely. But that's been happening for a while now.
"Dumb" isn't why the Turing Test isn't solved. (Have you seen unmoderated chat with normal people? Heck, even smart people outside the domain of expertise; my mum was smart enough to get into university in the UK in the early 60s, back when that wasn't the default, but still believed in the healing power of crystals, homeopathic sodium chloride and silicon dioxide, and Bach flower remedies…)
ChatGPT (I've not got v4) deliberately fails the test by spewing out "as a large language model…", but also fails incidentally by having an attention span similar to my mother's shortly after her dementia diagnosis.
The problem with 3.5 is that it's simultaneously not mastered anything, and yet also beats everyone in whatever they've not mastered — an extremely drunk 50,000-year-old Sherlock Holmes who speaks every language and has read every book just isn't going to pass itself off as Max Mustermann in a blind hour-long trial.
The lack of an ability to take in new information is maybe the crux of my issues with the LLM-to-AGI evolution. To my understanding, the only way to have it even kind of learn something is to include it in a preamble it reprocesses every time, which is maybe workable for small facts but breaks down for updating it from the 202X corpus it was trained on.
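A minimal sketch of that preamble workaround (all names here — `teach`, `build_prompt` — are hypothetical illustrations, not any real API): "learning" is just appending facts to a prompt that the model must reprocess in full on every call.

```python
# Hypothetical sketch: an LLM with no persistent memory can only "learn"
# via a preamble of facts re-sent and reprocessed with every query.

facts: list[str] = []

def teach(fact: str) -> None:
    """'Learning' is just appending to the prompt preamble."""
    facts.append(fact)

def build_prompt(question: str) -> str:
    """Prepend every known fact to the question, every single time."""
    preamble = "\n".join(f"Fact: {f}" for f in facts)
    return f"{preamble}\nQuestion: {question}\nAnswer:"

teach("The project was renamed to Foo in 2023.")
teach("The old API endpoint is deprecated.")

print(build_prompt("What is the project called?"))
```

The cost of each call grows with everything the model has ever been "taught", which is workable for a handful of small facts but hopeless as a substitute for retraining on an updated corpus.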
On the one hand, what I was saying here was more about the Turing Test than about AGI. Sometimes it gets called AGI, sometimes "autocomplete on steroids", but even if it is fancy autocomplete, I think 3.5 has the skill to pass a short Turing Test, but not the personality, and it would need a longer "short-term memory" equivalent than 3.5 has for a full Turing Test.
On the other hand, as I (sadly) don't get paid to create LLMs, I've only got the kind of superficial awareness of how they work that comes from podcasts and the occasional blog post, which means ChatGPT might very well understand ChatGPT better than I do.
As impressive as GPT-4 is, it still makes mistakes humans would not, so I wouldn't say it passes the Turing Test.
I generated this just now:
Me: Jack and Jill are sitting next to each other in a room. There is no one else in the room. The person sitting next to Jack is sad. The person sitting next to Jill is angry. Who is sad, Jack or Jill?
GPT4: Based on the information provided, it is not possible to determine who is sad and who is angry, as there is a contradiction in the given statements. If Jack and Jill are the only ones in the room and sitting next to each other, then the person sitting next to Jack would be Jill and the person sitting next to Jill would be Jack. The statements about their emotions conflict with each other, so it is not possible to accurately determine who is sad or angry.
Me: Jack and Jill are sitting next to each other in a room. There is no one else in the room. The person sitting next to Jack is sad. The person sitting next to Jill is angry. Who is sad, Jack or Jill?
ChatGPT: Since Jack and Jill are the only two people in the room and they are sitting next to each other, the person sitting next to Jack is Jill and the person sitting next to Jill is Jack. Given the conditions you provided:
Jill is the one who is sad because she is sitting next to Jack.
Jack is the one who is angry because he is sitting next to Jill.
>And yet, the general reaction of most people seems to be, "That's kind of cool, but why can't it also order me a cheeseburger?"
Well yeah.
Imagine you tell a small child who knows about calculators, "Hey, can you work out 18763 + 38284 for me?" They might struggle and then maybe fetch a calculator.
The LLMs attempt to predict the answer. WTF? It's a computer, and it can see that that is a plus sign. Just understand it's addition, use the rest of your computer brain, and do the sum. Hell, it is connected to the internet and we just taught you everything from before 2021. Just call out to Wolfram and give me the answer.
But that’s not how computers work. And we keep saying “AI” but that I is doing a lot of heavy lifting.
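The calculator-fetching idea can be sketched as a tool-dispatch layer: if the query is plain arithmetic, evaluate it deterministically; otherwise, fall back to the model. This is only an illustration — `answer` and `llm_predict` are hypothetical stand-ins, not any real LLM API.

```python
import ast
import operator

# Supported binary operations for the toy "calculator tool".
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def eval_arithmetic(expr: str):
    """Safely evaluate a plain arithmetic expression, or return None."""
    try:
        tree = ast.parse(expr, mode="eval")
    except SyntaxError:
        return None

    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("not arithmetic")

    try:
        return walk(tree)
    except (ValueError, ZeroDivisionError):
        return None

def llm_predict(query: str) -> str:
    """Hypothetical stand-in for an LLM call."""
    return "(model-generated answer)"

def answer(query: str) -> str:
    result = eval_arithmetic(query)
    if result is not None:
        return str(result)        # deterministic calculator path
    return llm_predict(query)     # everything else goes to the model

print(answer("18763 + 38284"))    # → 57047
```

Tool-use plugins for LLMs work on roughly this principle: the model (or a router in front of it) recognizes that a sub-task belongs to a calculator, search engine, or API, and hands it off instead of predicting the answer token by token.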
Again, I'm not really saying GPT has peaked; I'm saying there's a categorical difference between GPT and AGI. A good enough fake might perform well enough to function like one, but I have my doubts that it will. Without a way to deal with, and in some sense of the word understand, facts, I don't think LLMs are suitable for use as anything beyond an aide for humans (for starters, because they can't determine internally what is a fact versus a hallucination, so you have to constantly check their work).