What Turing got wrong about computers | Neil Gershenfeld and Lex Fridman
- Added: May 30, 2023
- Lex Fridman Podcast full episode: • Neil Gershenfeld: Self...
Please support this podcast by checking out our sponsors:
- LMNT: drinkLMNT.com/lex to get free sample pack
- NetSuite: netsuite.com/lex to get free product tour
- BetterHelp: betterhelp.com/lex to get 10% off
GUEST BIO:
Neil Gershenfeld is the director of the MIT Center for Bits and Atoms.
PODCAST INFO:
Podcast website: lexfridman.com/podcast
Apple Podcasts: apple.co/2lwqZIr
Spotify: spoti.fi/2nEwCF8
RSS: lexfridman.com/feed/podcast/
Full episodes playlist: • Lex Fridman Podcast
Clips playlist: • Lex Fridman Podcast Clips
SOCIAL:
- Twitter: / lexfridman
- LinkedIn: / lexfridman
- Facebook: / lexfridman
- Instagram: / lexfridman
- Medium: / lexfridman
- Reddit: / lexfridman
- Support on Patreon: / lexfridman
Category: Science & Technology
In the end, everything, including physics, is an abstraction structured so that humans can think about things. It's not an "error" to separate hardware and software - it's just a particular abstraction that makes it easy for humans to think about and design computation. Hardware design today is really a software activity using design tools. The fact that implementation of a hardware design has performance advantages but long lead times compared to pure software is not an essential conceptual difference from the standpoint of computation.
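The "hardware design today is really a software activity" point can be made concrete with a toy sketch (my own illustration, not from the video): describe a circuit as ordinary code and exhaustively simulate it, which is essentially what HDL toolchains do before anything is fabricated. The `half_adder` function and its gate decomposition are assumptions chosen for the example.

```python
# Illustrative sketch: a piece of "hardware" expressed purely as software,
# in the spirit of HDLs like Verilog. A half adder is just gates we can
# evaluate with ordinary code.

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Return (sum, carry) for two 1-bit inputs, built from gate primitives."""
    s = a ^ b          # sum bit: XOR gate
    c = a & b          # carry bit: AND gate
    return s, c

# Exhaustively "simulate" the circuit, as a hardware testbench would.
truth_table = {(a, b): half_adder(a, b) for a in (0, 1) for b in (0, 1)}
print(truth_table)   # {(0, 0): (0, 0), (0, 1): (1, 0), (1, 0): (1, 0), (1, 1): (0, 1)}
```

Only after a design like this passes simulation does it get turned into physical gates, which is where the long lead times mentioned above come in.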
I wouldn't call it an error either, but this separation led to the von Neumann bottleneck: keeping the CPU and memory apart makes things slower. Novel ideas like quantum computing try to get around it, and distributed systems address it as well.
Carl von Clausewitz said, "War is politics by other means." He could have said, "Hardware is software by other means," if he had known hardware description languages like Verilog.
It's insane that Lex came out of nowhere as the Rogan replacement. I haven't enjoyed consistently good episodes of long podcasts like this in years.
It is insane given that the shows are so similar, yet Lex seems so much more rational even when he can be just as far outside his area of expertise (not necessarily here) as Rogan ever is. It makes me look at the "just a guy asking questions" defense of Rogan again. Maybe the problem isn't with Rogan, or with how seriously he takes himself, but rather with how seriously other people take him - that if people like Joe, they take what he says with fewer questions and grains of salt.
@@Malt454 Such a great point, man. Rogan got so big that guests put him on a pedestal. In the golden days he was just the normal guy asking questions.
@@Malt454 They're not that similar. Joe is far more chatty and fun, plus they're very different personalities. I'm not saying there are no similarities, but that's like saying all interviews are similar because they share the same format.
@@edwardbradshaw6850 - At their best, the formats of the shows are essentially the same - relative laymen talking to (assumed) experts in their fields. The main difference is that, even as a layman, Lex has a scientific background and so takes a more serious and responsible outlook. I don't think it's so much a case of personalities as approach. Joe often doesn't believe in prepping for his shows, so it's questionable whether he gets all he can out of his interviews.
It's neither saying that all interviews are similar nor a judgment as to who is more chatty or fun. Joe will also interview people on the basis of their notoriety/popularity just to discuss their opinions, which I find hit and miss in terms of personal interest, when the goal is lowered from what is true to just what some people think or will say just for the sake of argument.
Everyone is welcome to their own views as to which is more entertaining; personally, I think responsible commentary is more important than the entertainment factor. Others will disagree.
It feels like the point he's making is right, but he also doesn't care about why things are the way they are. We don't build computers for physics, we build computers for humans. All these things are just abstractions to make it easier for humans to learn, and that's arguably the reason computers have achieved the mass adoption they have.
Can you please explain this to me a bit more? I'm sorry, I'm not completely literate in all this ❤
@@MelissaJean143 computers are made to be sold to the average person
I didn't get that takeaway at all. Rather, in computer science there are lots of models for how computers could theoretically work; in reality, however, there is only one physical way they do, due to the fact that information is physically stored in one location and processed in another. Nothing to do with building computers for physics.
Edit: by "models" I mean computer architectures.
@@MelissaJean143 Gershenfeld is critical of a fundamental aspect of our models of computation. To perform computation we need to build devices that actually work. The Turing machine, and the architectures/devices inspired by it, are so far our best attempt to perform computation. We haven't really figured out anything better yet, so his criticisms are kind of silly...
@@MelissaJean143 He's making the point that an architecture that separates the part that transforms data (adds, multiplies, branches ...) from the part that stores data (the memory) is a mistake in the sense that moving data from the memory to the calculating units and back takes time and energy, and a lot of the last 40 years of progress in computer architectures has been finding ways to reduce this time or to hide it by doing more work in parallel, usually at the cost of still more energy.
He then points out that all interaction in the physical world is local and that many (maybe all?) physical systems involve the interaction of parts each of which has some state (information that persists over time), can transform that state in some way (computes), and interacts with nearby components. If you want to design a real (physical) computer that comes anywhere close to the theoretical maximum performance, it has to lean into this and work the same way.
Making a provocative statement like "Turing made a mistake" is a common technique for professors. I don't know him, but I'm sure he's well aware that many of the models of computation in the taxonomy he calls ridiculous are useful in simplifying how to think about computation and design systems. Models that depend on local interaction, like cellular automata, have been known for decades and they haven't taken over yet. For many tasks, it's hard for us humans to organize how lots of components work in parallel to accomplish them. If we were having a conversation with him, we'd point this out, and he'd have thought about it and have a response. For instance, he might say that cellular automata do inspire a lot of recent advances. The Graphics Processing Unit in an Xbox or in a supercomputer training AI models has characteristics of both a Turing machine and a cellular automaton, and it's what comes from the latter that really gives it its power. The industry is moving incrementally toward the approach he's advocating here, and his job is to jump as far out in front as he and his lab can and solve the key problems.
I hope this adds to the other replies you've gotten and helps some.
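The "local interaction" idea in the replies above can be sketched in a few lines (my own illustration, not from the video): an elementary cellular automaton, Rule 110, where each cell's next state depends only on itself and its two immediate neighbors. Rule 110 is known to be Turing complete, so purely local update rules are enough for universal computation.

```python
# Computation via purely local interaction: elementary cellular automaton
# Rule 110. Each cell reads only itself and its two neighbors.

RULE = 110

def step(cells):
    """One synchronous update of a circular row of 0/1 cells."""
    n = len(cells)
    out = []
    for i in range(n):
        left, me, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (me << 1) | right   # 3-bit neighborhood code
        out.append((RULE >> idx) & 1)           # look up that bit of the rule
    return out

cells = [0] * 31 + [1]                          # single live cell on the right
for _ in range(16):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)
```

Every cell has state, transforms it, and talks only to its neighbors, which is exactly the structure Gershenfeld says physics rewards.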
So, if you were to build a grid of lookup tables and alternately clock each cell, you could have a massively parallel compute fabric... something tinkered with, but never really done at scale. The temptation was always to "save transistors" and make connected modules instead.
Who is this??? This guy is genius!! All the clips from him absolutely blew my mind!!
“Turing is credited with the modern architecture of computing.” “von Neumann is credited with the modern architecture of computing.” Who else? Maybe Santa Claus? My next door neighbor?
Turing provided the mathematical foundation for modern digital, stored program computing. His machine with the sadly deficient head that doesn’t contact the tape is actually a mathematical analogy, realized extraordinarily as an imaginary mechanism, to how humans compute computable things and thus how all computable functions can be computed (and reveals the existence of uncomputable functions as well). There is no physics involved in a mental concept and thus no physics mistake to be made. The importance of Turing’s concept to the science of computing (if not Computer Science) cannot be overstated.
They'll all learn to love Babbage again, once they realise.
Who else? Konrad Zuse of course. He had built a working general purpose programmable computer while Turing was still musing about it. Zuse based his machine on what he learned from Frege's Predicate Logic. His Z3 computer had an input/output console, a punch-tape reader (using 35mm film) even floating point logic, all in the early 1940s. He even conceived a programming language called Plankalkül, not assembly-like, nor Fortran-like, nor even Algol-like, it was a high-level language which was way ahead of its time.
If it hadn't been for WWII, we'd not be talking about Turing and Von Neumann as the fathers of modern computers, but Frege and Zuse. Due to the war, Zuse's work wasn't known outside Germany (and Switzerland) for many years after the war. Further, Zuse was an engineer, not a mathematician. He wrote a thesis describing his computer and the principles on which it was based, but that was more of a technical paper, not something you would find in a peer review journal. Even so, the paper didn't become widely known until the late 1950s and early 1960s.
People who get credited are very often not the ones who first came up with the inventions, but the ones who got first published and promoted to a wider audience. The real pioneers are often forgotten simply because their work wasn't published where it got the large audience.
Exactly what I thought. He's confusing an all-encompassing physical-biological thinking machine, what he thinks Turing should have proposed, with what Turing actually did propose.
My mom saw the Mark I in 1945 because her father had been helping von Neumann. I didn't hear Neil or Lex mention Paul Erdős.
Neil was my uncle he told me otherwise
They think they're German
2:23 I once had a college professor try and tell me computers don't use binary anymore because they have programming languages. I tried to explain that transistors still use two state logic, so technically, every computation is still binary, and he didn't understand this. It was a fun class.
Sounds like he's never heard of assembly.
That's true; but as a side note, analog computers do exist.
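The professor story above has a quick demonstration (my own, purely illustrative): no matter how high-level the source language, every value a digital computer handles bottoms out in a fixed-width two-state pattern.

```python
# Whatever the source language, values end up as binary patterns in hardware.

x = 42
print(format(x, "08b"))           # the int 42 as bits: 00101010
print(format(ord("A"), "08b"))    # the char 'A' as bits: 01000001

# Even a float is just a fixed-width bit pattern (IEEE 754 single precision):
import struct
bits = struct.unpack(">I", struct.pack(">f", 1.5))[0]
print(format(bits, "032b"))       # 1.5 as 32 bits: 00111111110000000000000000000000
```

Programming languages change how we write the patterns down, not the two-state logic that stores and transforms them.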
So computing is like common conscious perception: a process of noticing something and then putting it into a computational process. Wouldn't superposition in quantum computation nullify that? Or is that still just a faster way of getting to the same conclusion, and is it still computation? Is it the question that dictates a computation's process? Or does skipping the process to get to the answer supersede process, with instantaneous computation answering the question before it can be asked? Oh, and am I real? Thanks.
I'm just waiting for them to crash at every contradiction.
the only thing I understood was that I knew most of these words that were spoken.
I like the cut of this fellow’s jib. Definitely going to go through the full one
Guy is tilting at windmills
@@Nobddy nicely done!
In a universe where information is the fundamental building block, the greatest staggering implication is the potential for an 'Information Singularity'. This concept extrapolates the trend of exponentially increasing data generation and consumption. With advancements like quantum computing and artificial general intelligence, we may reach a point where information is created, processed, and consumed at such an unfathomable rate that our present understandings of society, consciousness, and even reality, are completely transformed.
Of all the people responsible for the roadmaps behind the way we live our modern life with technology, von Neumann is maybe one of the most forgotten, despite his incredible genius.
You are incorrect, sir. Yes, forgettable to the masses, but not to those who understand.
So the real question is... does Hilbert Space supersede Ruliad Space, or does Ruliad Space contain the set of Hilbert Space?
Pauly Shore has aged well
One of the Sweathogs, like Horshack or Epstein
And gotten smarter
nice, never been first to watch a lex clip before.
Nicely done!
I have so ner 🥸🥳
Strings Are Always trying to catch themselves 😂
0:18 Jeff Hardy
"I could never understand the difference between computer science and physical science".
I think I can see the point that physical processes can be seen as computations, or that computation can be interwoven with the physical system that implements it. However, the question is whether that gives rise to a notion of computation that is richer than Turing machines. Neil mentions that there are hundreds of models, but AFAIK they are all equivalent to, or weaker than, Turing machines. Even quantum computers can only perform what a Turing machine can, just faster for certain classes of problems. So is he talking about speed or computability?
Sorry, this snippet does not convince me that watching the whole interview would be a good use of my time.
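The equivalence claim in the comment above refers to the Turing machine itself, which is small enough to write down (a minimal simulator of my own; the rule encoding and the binary-increment program are illustrative choices): a finite state control, a tape of symbols, and a head that reads, writes, and moves one cell at a time.

```python
# A minimal Turing machine simulator. Rules map (state, symbol) to
# (next_state, symbol_to_write, head_move). This program increments
# a binary number, e.g. 1011 -> 1100.

def run_tm(tape, rules, state="start", blank="_"):
    tape = dict(enumerate(tape))                  # sparse, two-way-infinite tape
    pos = 0
    while state != "halt":
        sym = tape.get(pos, blank)
        state, write, move = rules[(state, sym)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Binary increment: walk right to the end, then carry leftward.
rules = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("halt",  "1", "R"),   # absorb the carry
    ("carry", "_"): ("halt",  "1", "R"),   # overflow: new leading 1
}

print(run_tm("1011", rules))   # -> 1100
```

Every model in the "hundreds of models" taxonomy can be simulated by a machine of this shape, which is why the commenter's speed-versus-computability distinction is the right question to ask.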
I get the idea, but it's pointless. Software focuses on information, that's the point; it doesn't matter what it's made of, the usage is the key.
📍3:29
you must be fun at parties
You only have 3 likes because everyone watching is personally offended
Jokes live beyond the NP-Hard set of problems, can't compute
I don't want to impeach or imply anything about this man's competence or authority; he's way beyond anything I am. But, von Neumann's agreed-upon contribution--watch contemporary documentaries on the subject, they're available on YouTube--is the stored-program concept: the computer stores programs and data in the same place and they are fetched in the same way. This was not understood--the power of it--before his suggestion, but after it everyone agreed "duh [face palm], yeah, that's a great way--the way--to do it." We separate data from instructions with protection mechanisms today--the "no execute" bit in page tables, e.g.--but data and programs are in the same memory space. It was intuitively obvious once elucidated but no one else thought of it. At least, no one on this side of the pond.... This claimed "separating the head from the tape" or whatever is his point (I lack the ability to understand it) is beside the point of von Neumann's main contribution to "the von Neumann architecture."
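The stored-program concept described above fits in a few lines of code (a toy machine of my own design; the instruction set and memory layout are invented for the example): instructions and data live in the same memory array and are fetched the same way.

```python
# A toy stored-program machine: code and data share ONE memory array.
# This program adds two numbers stored right after the code.

mem = [
    ("LOAD", 6),    # 0: acc = mem[6]
    ("ADD", 7),     # 1: acc += mem[7]
    ("STORE", 8),   # 2: mem[8] = acc
    ("HALT", 0),    # 3:
    0, 0,           # 4-5: unused
    20, 22,         # 6-7: the data, in the same memory as the code
    0,              # 8: result goes here
]

acc, pc = 0, 0
while True:
    op, arg = mem[pc]          # fetch an instruction from the shared memory
    pc += 1
    if op == "LOAD":
        acc = mem[arg]
    elif op == "ADD":
        acc += mem[arg]
    elif op == "STORE":
        mem[arg] = acc
    elif op == "HALT":
        break

print(mem[8])   # -> 42
```

Because the program sits in ordinary memory, a program can in principle read or rewrite other programs, which is what made compilers, loaders, and operating systems practical.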
Hmm, if hardware and software are both integrated as one, then all it needs is energy to make it work. :)
No energy no computation.
Why such a discussion at all? Neil as a person, and me too, are even more unstructured than a computer, so how can we listen to him or me??????!!!!!!
He fails to realize that information is only a pattern, i.e., programs are patterns which need hardware in which the patterns can be formed. In the brain, though, nerve impulses are the information, so it depends on whether he sees nerve impulses as hardware.
Sounds to me like the title is either clickbait, or whoever created the title isn't a native English speaker. Which is it?
Brilliant guest. His voice reminds me of a particular cartoon character but I just can't put my finger on it. Perhaps Droopy after chugging a Red Bull 🤔
gym tan laundry baby
What does he mean, mistake?
Technically right, practically wrong.
What?
On some talk shows that get into Bletchley Circle stuff, they mentioned their opinion of why you'll hear about Turing but not Gleeson or another early computer adopter. It comes down to who has more in their closet and who will be celebrated more in this day and age.
IBM is the shit.
What just happened there? He said a lot but didn't say anything, just a mix of loosely connected ideas.
This guy seems to have completely missed the point in Turing's paper.
Smart guy but that voice does not make for easy listening...
He says software and hardware are not different, they are the same: atoms arranged logically to do certain things. At the end of the day it's all physical 🤞
PS: computer science is more of an art where we use atoms to paint the picture we want