Pure Information Gives Off Heat

  • Uploaded 28 Jul 2022
  • Sign up to Brilliant to receive a 20% discount with this link! brilliant.org/upandatom/
    Hi! I'm Jade. If you'd like to consider supporting Up and Atom, head over to my Patreon page :)
    Visit the Up and Atom store
    store.nebula.app/collections/...
    Subscribe to Up and Atom for physics, math and computer science videos
    Why Time Actually Flows Both Ways
    • The Time-Reversibility...
    Follow me @upndatom
    Up and Atom on Twitter: twitter.com/upndatom
    Up and Atom on Instagram: instagram.com/upndatom
    For a one-time donation, head over to my PayPal :) www.paypal.me/upandatomshows
    A big thank you to my AMAZING PATRONS!
    Michael Seydel, Cy 'kkm' K'Nelson, Rick DeWitt, Thorsten Auth, Purple Penguin, AndrewA, Izzy Ca, bpatb, Michael Martin, Scott Ready, John H. Austin, Jr., Brian Wilkins, Thomas V Lohmeier, David Johnston, Thomas Krause, Yana Chernobilsky, Lynn Shackelford, Ave Eva Thornton, Andrew Pann, Anne Tan, James Mahoney, Jim Felich, Fabio Manzini, Jeremy, Sam Richardson, Robin High, KiYun Roe, Christopher Rhoades, DONALD McLeod, Ron Hochsprung, OnlineBookClub.org, Aria Bend, James Matheson, Robert A Sandberg, Kevin Anderson, Tim Ludwig, Alexander Del Toro Barba, Corey Girard, Justin Smith, Emily, A. Duncan, Mark Littlehale, Lucas Alexander, Jan Gallo, Tony T Flores, Jeffrey Smith, Alex Hackman, Joel Becane, Michael Hunter, Paul Barclay, 12tone, Zhong Cheng Wang, Sergey Ten, Damien Holloway, Mikely Whiplash, John Lakeman, Jana Christine Saout, Jeff Schwarz, George Fletcher, Louis Mashado, Michael Dean, Chris Amaris, Matt G, Broos Nemanic, Dag-Erling Smørgrav, John Shioli, Joe Court, Todd Loreman, Susan Jones, Richard Vallender, jacques magraith, William Toffey, Michel Speiser, Rigid Designator, James Horsley, Bryan Williams, Craig Tumblison, Rickey Estes, Cameron Tacklind, 之元 丁, Kevin Chi, Paul Blanchard, Lance Ahmu, Tim Cheseborough, Nico Papanicolaou, keine, Markus Lindström, Jeffrey Melvin, Midnight Skeptic, Kyle Higgins, aeidolos, Mike Jepson, Dexter Scott, Potch, Thomas P Taft, Indrajeet Sagar, Markus Herrmann (trekkie22), Gil Chesterton, Alipasha Sadri, Pablo de Caffe, Alexander230, Taylor Hornby, Eric Van Oeveren, Mark Fisher, Phizz, Rudy Nyhoff, Colin Byrne, Nick H, Jesper de Jong, Loren Hart, Ari Prasetyo, Sofia Fredriksson, Phat Hoang, Spuddy, Sascha Bohemia, tesseract, Stephen Britt, KG, Dagmawi Elehu, Hansjuerg Widmer, John Sigwald, Carlos Gonzalez, Jonathan Ansell, Thomas Kägi, James Palermo, Gary Leo Welz, Chris Teubert, Fran, Joe, Robert J Frey, The Doom Merchant, Wolfgang Ripken, Jeremy Bowkett, Vincent Karpinski, Nicolas Frias, Louis M, kadhonn, Moose Thompson, Andrew, Sam Ross, Garrett Chomka, Bobby Butler, Rebecca Lashua, Pat Gunn, Elze Kool, RobF, Vincent Seguin, Shawn, Israel Shirk, Jesse Clark, Steven Wheeler, Philip Freeman, KhAnubis, Jareth Arnold, Simon Barker, Dennis Haupt, Lou, amcnea, Simon Dargaville, and Magesh.
    Creator - Jade Tan-Holmes
    Script - Jack Johnson
    Animations - Standard Productions
    Music - Epidemic Sound
  • Science & Technology

Comments • 1.7K

  • @bobdiclson4173 • a year ago • +435

    I love Jade's vibe of having a secret to share.

    • @vigilantcosmicpenguin8721 • a year ago • +32

      Yeah, that's a perfect description. The way she grins when she gets to the juicy part of the secret.

    • @Blue.star1 • a year ago

      She looks like a meson

    • @PhysioAl1 • a year ago • +4

      You're right!

    • @sabouedcleek611 • a year ago

      @@user-bt1hf9cr5n Looks a bit like Modified Newtonian Dynamics, though I don't think √(1 + (GM/(Rc²))²) satisfies the interpolation requirement in the overview of the wiki.

    • @communitycollegegenius9684 • a year ago

      Too bad. No one would watch or take her seriously without her vibe. Science and humanity lose.

  • @patrickhanft • a year ago • +110

    I have a degree in computer science, and again and again I find you to be one of my best CS teachers on topics that were never discussed, or were badly explained, during my studies!

    • @zen1647 • a year ago • +2

      Great video! You're awesome!

    • @hyperduality2838 • a year ago • +1

      There is a 4th law of thermodynamics:-
      Equivalence, similarity = duality (isomorphism).
      An "equivalence gate" measures similarity, sameness or dualness.
      Homology is dual to co-homology -- topology.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics.
      All observers have a syntropic perspective according to the 2nd law.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Convergence (syntropy, homology) is dual to divergence (entropy, co-homology).
      The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science.
      The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion.
      Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism.
      Syntropy is dual to entropy.
      "Science without religion is lame, religion without science is blind" -- Einstein.
      Science is dual to religion -- the mind duality of Albert Einstein.
      Your mind uses information to converge or create optimized predictions -- a syntropic process.
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Duality creates reality.
      "Always two there are" -- Yoda.
      The 4th law of thermodynamics is hardwired into mathematics.

    • @sherpajones • a year ago

      16:23 What if the logic gate operated by interpreting the interference pattern of light? If you only had one light source, there would be no pattern. If you had two, there would be a pattern. The presence or absence of a pattern could be your 0 or 1. This should easily be reversible.

    • @moth5799 • 11 months ago

      @@hyperduality2838 Christ, mate, you really think quoting Yoda makes you look smart lmfao? There are 4 laws of thermodynamics, but that's only because we call one of them the zeroth law. Yours is just made up.

    • @hyperduality2838 • 11 months ago

      @@moth5799 Subgroups are dual to subfields -- the Galois correspondence.
      The Galois correspondence in group theory is based upon duality.
      There are new laws of physics -- Yoda is correct.
      Energy is dual to matter -- Einstein.
      Dark energy is dual to dark matter.
      Energy is duality, duality is energy -- the 5th law of thermodynamics!
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Energy is measured in Joules (duals, jewels) in physics.

  • @compuholic82 • a year ago • +441

    Fun fact: Reversible logic is really important in quantum computing. Since all state changes can be represented as unitary matrices, quantum gates are always reversible.
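
    A quick numpy check of that claim, using the two-qubit CNOT gate as the example (an illustrative sketch; any unitary behaves the same way):

    import numpy as np

    # Two-qubit CNOT: flips the target bit exactly when the control bit is 1.
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype=complex)

    # Unitarity: U†U = I, so applying U† undoes U and no input state is lost.
    assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

    # Reversibility in action: CNOT is its own inverse on any state.
    state = np.full(4, 0.5, dtype=complex)  # an equal superposition
    assert np.allclose(CNOT @ (CNOT @ state), state)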

    • @hyperduality2838 • a year ago • +36

      There is a 4th law of thermodynamics:-
      Equivalence, similarity = duality (isomorphism).
      An "equivalence gate" measures similarity, sameness or dualness.
      Homology is dual to co-homology -- topology.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics.
      All observers have a syntropic perspective according to the 2nd law.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Syntropy (prediction, convergence) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Convergence (syntropy, homology) is dual to divergence (entropy, co-homology).
      The word entropy means "a tendency to diverge" or differentiate into new states, reductionism -- science.
      The word syntropy means "a tendency to converge" or integrate into a single whole state, holism -- religion.
      Differentiation is dual to integration, division is dual to unity, reductionism is dual to holism.
      Syntropy is dual to entropy.
      "Science without religion is lame, religion without science is blind" -- Einstein.
      Science is dual to religion -- the mind duality of Albert Einstein.
      Your mind uses information to converge or create optimized predictions -- a syntropic process.
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Duality creates reality.
      "Always two there are" -- Yoda.
      The 4th law of thermodynamics is hardwired into mathematics.

    • @laughingone3728 • a year ago • +4

      @@hyperduality2838
      Nicely stated. Thanks for that.

    • @Snowflake_tv • a year ago • +2

      Really?

    • @hyperduality2838 • a year ago • +12

      @@laughingone3728 You're welcome, it gets better:-
      There is also a 5th law of thermodynamics, energy is duality, duality is energy!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter.
      Action is dual to reaction -- Sir Isaac Newton (the duality of force).
      Attraction is dual to repulsion, push is dual to pull -- forces are dual.
      If forces are dual then energy must be dual.
      Energy = force * distance.
      Electro is dual to magnetic -- Maxwell's equations
      Positive is dual to negative -- electric charge.
      North poles are dual to south poles -- magnetic fields.
      Electro-magnetic energy is dual.
      "May the force (duality) be with you" -- Jedi teaching.
      "The force (duality) is strong in this one" -- Jedi teaching.
      There are new laws of physics! Your mind creates or synthesizes syntropy!
      Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic.
      Duality creates reality!
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Everything in physics is made from energy hence duality.
      Concepts are dual to percepts -- the mind duality of Immanuel Kant.

    • @hyperduality2838 • a year ago

      @@Snowflake_tv There is also a 5th law of thermodynamics, see my next comment.

  • @demetrius235 • a year ago • +91

    I worked in the semiconductor industry (DRAM) for a few years and now one of the courses I teach is Thermodynamics. I had no idea about the Landauer limit so thanks for teaching me something new! Also, good work pointing out that a completely reversible process is not possible as there is always some energy loss (collisions in your billiard ball case). This was an excellent video!
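
    For scale, a minimal sketch of the limit itself in Python (the Boltzmann constant is exact in SI; 300 K room temperature is the assumption):

    import math

    k_B = 1.380649e-23             # Boltzmann constant, J/K (exact in SI)
    T = 300.0                      # assumed room temperature, K

    E_bit = k_B * T * math.log(2)  # Landauer limit: minimum heat per erased bit
    print(f"{E_bit:.3e} J")        # ~2.871e-21 J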

    • @anntakamaki1960 • a year ago

      Is it in Russia?

    • @demetrius235 • a year ago • +3

      @@anntakamaki1960 "it"?? I did not work in Russia and I have never been to Russia. I have no desire to set foot in Russia.

    • @mathslestan3323 • a year ago • +4

      @@demetrius235 Wow so much love for Russia 😑

    • @Nehmo • a year ago • +2

      How can you be associated with semiconductors in any way and not know about the Landauer limit? Sorry to be critical, but it's rather basic.
      Now that you know about it, you will recognize encountering it again and again.

    • @katrinabryce • a year ago • +5

      @@Nehmo Possibly because it is so small that for all practical purposes it is zero? The Landauer limit of a typical modern 35W CPU is about 35 nanowatts. And for DRAM it is actually 0W, because the whole point of RAM is that you get back out what you put in, so it is reversible.
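
      A sketch of where a figure like that comes from; the erasure rate below is an assumed round number chosen to land at CPU-like scales, not a measured property of any real chip:

      import math

      k_B = 1.380649e-23              # J/K
      T = 350.0                       # assumed die temperature of a busy chip, K
      E_bit = k_B * T * math.log(2)   # ~3.3e-21 J per erased bit

      erasures_per_second = 1e13      # assumed; the big unknown in such estimates
      print(f"{erasures_per_second * E_bit:.1e} W")  # ~3.3e-8 W: tens of nanowatts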

  • @domotheus • a year ago • +438

    Good stuff! If you're interested there's an even weirder (and very theoretical) application of reversible computing called a "Szilard engine" where you can go back and forth between waste data and waste energy. Using the wasted bits of reversible computing you can theoretically extract energy out of a system that's at an equilibrium state, basically meaning you can convert energy into data and data into energy

    • @zdlax • a year ago • +20

      mass = energy = information

    • @landspide • a year ago • +19

      Isn't this like Maxwell's demon?

    • @abdobelbida7170 • a year ago • +2

      How so?

    • @MasterHigure • a year ago • +25

      @@abdobelbida7170 Maxwell's demon can separate a fluid into fast / slow atoms, or a fluid mix into its constituent parts. You can extract energy from the recombining. The cost is that the demon's knowledge of the state of the system becomes less and less useful as it does its work.

    • @lubricustheslippery5028 • a year ago • +10

      I am confused. I thought that information needs a physical representation according to information theory, and thus that something like pure information should not be a thing, similar to how pure energy is not a thing. So is it something like being able to convert between entropy and enthalpy? In chemistry you calculate with Gibbs free energy, which combines entropy and enthalpy, to see if a reaction can occur.

  • @fmeshna • 7 months ago • +12

    Jade, your ability to explain complex quantitative concepts so clearly is exceptional. We need more teachers like you.

  • @gotbread2 • a year ago • +97

    While the second law gives a mathematical justification for that energy loss, it does not give a deeper "why" for it. The fundamental issue is information erasure itself. This comes down to collapsing a state to a single value. Imagine the 2 bits getting reduced to 1 bit. This means we force one bit from having a variable state (either 0 or 1) to a fixed state. It can be any value, it does not matter, but it is now a constant and no longer a variable. This is where the loss occurs.
    One helpful visual is a ball in a potential with 2 valleys (as a stand-in for a particle in a bipotential). Now this ball can be in either of the 2 valleys initially. By definition we don't know which, else it would be a known constant and not a variable. Let's say we want to move this ball into the left valley, from any starting valley. The issue here is that whatever we come up with needs to work for both starting valleys. Similarly, the bit erasure must be able to set a 0 to a 0, but also a 1 to a 0. In the ball case, you can move it over to the other valley, but then it will have some speed left, which you need to dissipate in order for it to come to a rest at the bottom and keep this state. This is exactly where the loss happens. You can add some kind of reversible damping to "catch" this energy, but then it won't work for the case that the ball was already in the correct valley. Whatever case you design it for will always cause an energy loss for the other case, since you need to move from a case with potentially "some" kinetic energy to a state with "zero" kinetic energy, without knowing the direction of the motion. (This is similar to Maxwell's demon.)
    Now how much energy do we need to dissipate? Also easy to see. In order to differentiate between the 2 bit states, there needs to be a potential barrier between them. This barrier needs to be high enough to prevent thermal movement from flipping the bit on its own. The energy you need to dissipate while "catching" the bit and bringing it to rest comes directly from the energy you need to expend to cross this barrier. Since the barrier is temperature related (more temperature -> more thermal energy -> higher barrier needed to avoid flips), the energy loss is also temperature dependent. This is where the "T" in the equation comes from. The Boltzmann constant is in a way mandatory to match the units. The last piece of the puzzle is the ln(2). We can be satisfied with using the second law as a shortcut here, but the ln(2) can also be derived directly from the "geometry" of this "information bit in 2 potential wells" problem.
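
    The ln(2) can indeed be made explicit with Boltzmann's state counting; a compact version of the standard argument, in LaTeX notation:

    S = k_B \ln \Omega                                            % Boltzmann entropy; \Omega = microstate count
    \Delta S_{\mathrm{bit}} = k_B \ln 1 - k_B \ln 2 = -k_B \ln 2  % erasure: 2 possible states collapse to 1
    Q \ge -T \, \Delta S_{\mathrm{bit}} = k_B T \ln 2             % 2nd law: heat the environment must absorb at temperature T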

    • @dtkedtyjrtyj • a year ago • +11

      Wow. I actually think I understood some of that. It makes intuitive sense...?

    • @rewe3536 • a year ago • +6

      Thank you! The video makes it seem like it's just magic, like it just happens.

    • @garyw.9628 • a year ago • +9

      Really nice analysis of why erasing information necessitates a loss of energy. Also very appropriate to mention Maxwell's demon, since his thought experiment cleverly demonstrated the important link between information and energy. But in the derivation of the Landauer limit, and of the Boltzmann constant itself, there seems to be the assumption of a system consisting of the atoms and molecules of a gas. What if the computing device consisted of something smaller than atoms, like photons, or neutrinos, or quarks? Would the corresponding Landauer limit then, by necessity, have a much lower value?

    • @gotbread2 • a year ago • +4

      @@garyw.9628 We did not make any assumptions about what the system is made of. All we need to assume is "some element" (here a particle, but even a wave function works too), which can be in different states, and that the 2 states are separated by an energy barrier. We need this barrier in order to preserve the state (else it would evolve between the states over time), and the height of the barrier is based on the expected disturbances (thermal energy). Further, we assume that to cross this barrier of potential energy, a certain kinetic energy is needed, which we eventually need to dissipate again. Notice how abstract that setup is: it makes no mention of the particle type, or even a particle at all (even a field configuration would work here).
      You are correct with the Boltzmann constant however. This carries some assumptions with it. Since this constant relates the temperature of a gas to its average kinetic energy, it all comes down to how you define "temperature" for your system. If you use different particles, or something different entirely, your definition of what a temperature is may change, thus changing the Boltzmann constant.

  • @stufarnham • a year ago • +6

    This has become my favorite YouTube channel. These short, digestible discussions of deep topics are endlessly fascinating. I especially enjoy the discussions of paradoxes. Also, you are a great presenter - clear and engaging. Keep it up, please!❤

  • @JanStrojil • a year ago • +178

    The fact that information contains energy always boggles my mind.

    • @goldenwarrior1186 • a year ago • +3

      It makes sense. There’s nothing to really think of (don’t mean to be mean)

    • @MyMy-tv7fd • a year ago • +33

      That is because it does not. There is no necessary information in a logic gate switching; it could switch randomly, or just be set to oscillate. No information is involved in mere switching, but energy obviously is. Either this is clickbait, or she does not understand that information is the intentional switching of logic gates to produce a certain storage pattern, which may or may not be volatile RAM or non-volatile like an SSD.

    • @dominobuilder100 • a year ago • +4

      @@MyMy-tv7fd Give an example, then, of any sort of information that does not contain energy.

    • @MyMy-tv7fd • a year ago • +38

      @@dominobuilder100 - no information whatsoever contains energy; it is conceptual. But there is always a physical substrate, whether it be the page of a book or a RAM stick. The change in the substrate could contain information, or just be random, but the change itself will always require energy. The information is the 'ghost in the machine'.

    • @paulthompson9668 • a year ago • +2

      @@goldenwarrior1186 Hindsight is always 20/20

  • @pedromartins1474 • a year ago • +56

    This took me back to my statistical physics classes! Wonderfully explained! Thank you so much!

  • @ericmedlock • a year ago • +6

    Great video! I learned a bunch of this in university a million years ago but you do a super job of simplifying a really complex set of concepts. Kudos!

  • @Alestrix76 • a year ago • +27

    I was wondering about the "many million more" at 10:00 and did some math, as this sounded like a little too much. But it turns out it's about right: a somewhat modern 64-bit x86 CPU has around 5*10^8 logic gates. Let's say with each cycle 1% of those gates get flipped and there are 2*10^9 cycles per second (2 GHz); then we end up with around 1 µW. Modern power-efficient x86 processors need roughly 10 W, which is 10 million times this. Not sure what the numbers are like with an ARM processor in a smartphone. Of course this is just ballpark math.
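
    The same ballpark spelled out in Python (the gate count, flip fraction, and clock are the comment's assumptions, and the result is very sensitive to all three):

    import math

    k_B, T = 1.380649e-23, 300.0       # J/K, assumed room temperature
    E_bit = k_B * T * math.log(2)      # Landauer minimum per erased bit, ~2.9e-21 J

    gates = 5e8                        # assumed logic-gate count
    flip_fraction = 0.01               # assumed fraction flipping per cycle
    clock_hz = 2e9                     # 2 GHz

    print(f"{gates * flip_fraction * clock_hz * E_bit:.1e} W")
    # ~2.9e-05 W with these inputs, still orders of magnitude below a ~10 W chip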

    • @greenaum • 9 months ago • +3

      The latest AMD CPUs have 8.2*10^10 transistors, or 82 billion. A logic gate might have maybe 5 transistors, so you're off by a factor of about 100.
      A lot more than 1% of the gates get flipped. CPUs are designed to use as much of the hardware as possible, all the time, to get more processing done. You don't want bits of the chip sitting around idle. This is why you might have "4 cores 8 threads". Each core runs two "threads". That is, if, say, the chip's multiplier unit is being used by one instruction, there might be a memory access it can make, using its memory access hardware, for another instruction. It runs two instructions at once, but it's not two entire processors. Instead, it's two front-ends that work out what instructions can be run with the currently unused parts of the CPU. So it's like you get, say, 1.8 effective processors per core, with just the addition of a second front-end and a bit of logic to figure out what's compatible.
      There's also pipelining, where an instruction might take, say, 5 stages of operations to complete. The 5 stages are built separately, and as an instruction leaves the first stage, a new one is brought in, so all 5 stages are busy, all the time, with 5 instructions.
      Then there's out-of-order execution and all sorts of other mad tricks to try and get processing done in almost no time at all. CPUs will have multiple adder units and other parts. It's not just having multiple cores, each core does multiple things.
      So, they're busy, by design, to produce the most throughput of calculation for the silicon you buy. To have circuits sitting idle is wasteful, and indeed they analyse that at the factory, and if a part isn't pulling its weight, they'll replace it with something else that does. It's all about getting the most processing possible done, because that's what they compete on, and what they set their prices by.
      In power-saving chips, like for phones, the opposite is true. They try and switch off as many sub-units as possible while still providing just enough CPU power to do whatever you're doing at that moment. Entire cores will usually be shut down, but they can wake up quickly. Plus modern phones might have 4 high-power, fast processors, and 4 slower, very low-power ones, designed with different techniques, switched on and off as the operating system decides it needs them.

  • @E4tHam • a year ago • +87

    I'm currently pursuing a master's in VLSI, so thanks for introducing these concepts to people! That said, the built-in impedances in metals and semiconductors will always overshadow the Landauer limit by several orders of magnitude. But this is an interesting thought experiment.

    • @adamnevraumont4027 • a year ago • +10

      "Always" is a long time to make a promise for.

    • @adivp7 • a year ago • +12

      @@adamnevraumont4027 Even if you eliminate metal impedances with superconduction, you definitely need semiconducting material for a transistor. And semiconductors will always produce heat.

    • @adamnevraumont4027 • a year ago • +6

      @@adivp7 Which you then follow with a formal proof that all computation requires transistors made of semiconductors? No?
      Well then, making a promise for "always" is beyond your pay grade. Always is a very long time. Always is not just 10 years; it is 100 years, it is 10000 years, it is 100,000,000 years, it is 10^16 years, it is 10^256 years, it is 10^10^10^10^10 years, it is G64 years, it is TREE(3) years, it is BB(50) years.
      It is a really long time.

    • @dot32 • a year ago • +9

      @@adamnevraumont4027 lmao, it's physics. You need semiconductors for transistors. If you found something other than a transistor, you may not need semiconductors, but semiconductors are what transistors are made of, afaik.

    • @adivp7 • a year ago • +13

      @@adamnevraumont4027 Technically, none of those are "always", but fair point. What I meant to say is you need energy for switching. I can't see how you can have switching that doesn't use or release any energy.

  • @danielschein6845 • a year ago • +77

    Amazing to think about. I spent 10 years designing actual microprocessors and always thought of energy in terms of the electrical current flowing through the device.

    • @ralfbaechle • a year ago • +15

      To be honest, the Landauer limit is so low that we can spend another century without reaching it. Figuratively speaking, that is. I've not done an estimate of how long we'd probably need to reach the Landauer limit from where technology is now. Because, let's face it, such estimates usually are wrong :-) So it's perfectly OK to concentrate on all the other losses.

    • @DrewNorthup • a year ago • +10

      And even that is an oversimplification… The need to understand the impact of both the resistance and the reactance escapes a good many people. Switching losses are so large compared to Landauer I'd not expect the latter to factor in meaningfully for quite some time.

    • @triffid0hunter • a year ago • +9

      Sure, but the Landauer limit says that a modern CPU must use at least a nanowatt or so, and since they _actually_ use about a hundred watts, we've got a _long_ way to go before having to deal with the limit - Wikipedia's article says maybe 2080 if Koomey's law holds, although it doesn't mention which Koomey's law figure (there are two in the relevant article) was used to derive that figure.

    • @_ninthRing_ • a year ago • +1

      @@ralfbaechle Perhaps at the scale of a single CPU in a home computer, but at the scale of the massive data processing centers for internet servers, or the supercomputers used to simulate the climate/weather patterns of the entire Earth, the need to reduce the tremendous heat being generated (both in terms of efficiency and especially the cost of cooling these mega-systems) makes current efforts to overcome the Landauer Effect financially viable.
      Just look at how much energy is estimated at being diverted from electrical power grids towards Cryptocurrency farming & try to estimate the vast amount of waste heat being generated (especially in hotter regions close to the equator, like Texas, where there's already a profound energy cost in maintaining functional computer environments) due to all this data processing.
      Any tweak to the design of computer processors which either overcomes, or more likely reduces (given our current incomplete understanding of Quantum Physics), the Landauer Effect is going to be a worthwhile achievement.

  • @esquilax5563 • a year ago • +2

    This is one of my favourite videos of yours. Most of them cover topics I have a fair bit of familiarity with, but your "where's the mystery?" intro made me realise I've barely thought about this at all

  • @itsawonderfullife4802 • a year ago • +1

    Great video as always. One little reminder though:
    At 7:53 you also have to consider the power plant (and include it in the "system") that makes the electricity used to produce the work (the energy for the refrigerator's compressor) needed for the refrigerator to reverse and decrease the entropy of its interior; only then can you say that the 2nd law is maintained overall and that the entropy of a closed system (most often) increases.

  • @fios4528 • a year ago • +61

    Thank you for making this video. I've genuinely spent many a sleepless night thinking about the lifespan and logistics of a minimum energy computer for a future civilization that lives in a simulation.

    • @berniv7375 • a year ago • +1

      Could you possibly do a video about data banks and how they are taking over the world? I was astonished to learn how much energy is required just to take a digital photograph and how much energy is required to cool down data banks. 🌱

    • @cate01a • a year ago • +1

      If the universe is a simulation, why care about energy? The people controlling the simulation could either supply it with nuclear or better energy that would last many universe lifetimes, or, more likely, they'd speed up the simulation speed to like tree(10^^^^^^^^^^^^^^^^^^^^^^^^^10), so that maintenance is a non-issue, because they (might, haven't done the maths, can you even calculate that?? no, probably not, since iirc we only know a COUPLE of digits of tree(3), so fuck that bigass number) would have already simulated more universes than we could literally comprehend in less than a microsecond.
      Though that'd take fucking insane technology, probably impossible for the laws of physics, so then the simulator controllers would need to exist in a much more advanced, cooler universe, but then the persons controlling THEM would need even IMMENSELY more fucking power, like holy shit, not even a trillion bajillion shit tonne quatrillion lifecycles of the entire universe/existence could even come CLOSE to the amount of damn energy needed for that to happen.
      So uh, I guess simulation is outta the question, unless like god is real and on his impossible computer he's playing The Sims 10 or some shit, though y'know that makes zero fucking sense too.

    • @anywallsocket • a year ago

      @@cate01a A^^A = A^A

    • @paulmichaelfreedman8334 • a year ago

      Funny thing is, if it turns out we're bound to our solar system (FTL completely impossible for complexly arranged matter), at least part of the planet's population will choose to live in a simulation created by ourselves by means of VR immersion, with whatever technology is invented for this purpose in the future. Games like EVE, WoW and Elite Dangerous are examples of current-day escape to a universe in which much more is possible than the real one. If you have seen the movie Surrogates or Ready Player One you'll catch my drift.

    • @cate01a • a year ago

      @@paulmichaelfreedman8334 Faster-than-light travel for complex matter (humans, ships, plants, etc.) should be possible (not feasible (yet)) by manipulating spacetime, which is possible: by making the spacetime go faster than light rather than the matter, the matter is still travelling whilst not going against the laws of physics: youtube.com/watch?v=HUMGc8hEkpc (PBS Space Time)
      Haven't watched that specific vid, but it probably explains the same concept.

  • @heartofdawn2341 • a year ago • +81

    The question then is, where does that second bit of output information go? How is it stored and used? If you simply discard it later, all that happens is that you push the source of the Landauer limit further downstream.

    • @JB52520 • a year ago • +32

      No one really knows because there's no design for a reversible computer yet. The billiard ball example shows how the balls might be returned to the correct place with a logic gate that doesn't expend energy, but it doesn't show where they're stored or how they'll return at the precise time (as far as I remember; it's been a while since I read about this).
      I'm just guessing, but a useful metaphor might be to picture a mechanical computer where each of the waste bits is stored in a tiny spring, such that computing would be like winding a clock. Once the result is obtained, the program runs in reverse to unwind the system and return the stored energy. (How it would actually work, I have no idea.) It's also like the difference between standard car brakes and regenerative braking. The former just radiates heat, and the latter runs one or more generators, storing energy to accelerate the car later.
      As far as I can tell, reversible computing doesn't have to be perfect, just like regenerative brakes. Even if a program can only run backward part way before releasing the remainder of its stored energy as heat, that's still better than releasing all of it, and it might be enough for a computer of the distant future to sidestep the Landauer limit.
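
      For reference, the billiard-ball machine's logic is usually modeled with the Fredkin (controlled-swap) gate; a minimal Python sketch of why it is reversible while AND is not:

      from itertools import product

      def fredkin(c, a, b):
          # Controlled swap: if c is 1, swap a and b. The number of 1s is
          # conserved, like billiard balls rerouted but never created or lost.
          return (c, b, a) if c else (c, a, b)

      triples = list(product((0, 1), repeat=3))
      assert len({fredkin(*t) for t in triples}) == 8           # 8 in, 8 distinct out: a bijection
      assert all(fredkin(*fredkin(*t)) == t for t in triples)   # its own inverse: it "uncomputes" itself

      # Contrast with AND: four inputs collapse onto two outputs, so at least
      # one bit is discarded -- the erasure Landauer's limit charges for.
      assert len({a & b for a, b in product((0, 1), repeat=2)}) == 2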

    • @erkinalp • a year ago • +15

      @@JB52520 All quantum computers use reversible computational elements to prevent immediate collapses of superposition.

    • @glenncurry3041 • a year ago • +4

      @@erkinalp Qubits are not single binary bits.

    • @brandonklein1 • a year ago • +3

      @@erkinalp but you're still subject to the Landauer limit once you make a measurement.

  • @0ptikGhost • a year ago • +33

    I love this video as it pertains to theoretical computation. Realizable computers today generally use electric flow through substrates that always have some level of impedance. The real reason our current-day computers generate heat has nothing to do with the Landauer limit but rather with the technology we use to build said computers. We don't stand a chance of getting anywhere near the Landauer limit, regardless of the temperature we run the computer at, unless we figure out how to make computers using superconductors.

    • @david203 • a year ago

      Yes, that is today's understanding. But the theory itself doesn't require superconductors. And CMOS does actually use less energy than, say, TTL.

    • @TheDavidlloydjones • a year ago

      @@david203
      You rather seem to miss Ghost's sensible point. Try reading it over again. Or look for E4tHam's sensible post, below in my feed, making Ghost's point in slightly different form.

    • @david203 • a year ago

      @@TheDavidlloydjones I fully agree with Ghost's point, thanks. The current required to activate computer logic produces vastly more heat than the actual processing of information does. That's why it's called a Limit.

    • @greenaum • 9 months ago • +1

      Right. Charging and discharging the gates of umpty-billion transistors requires dumping the energy as heat. Resistors make heat, though CMOS is usually quite high-impedance.
      There's reversible logic, but frankly it looks like a pain in the arse and it seems like most of it happens only on paper and the rest might be that someone strings together a couple of logic gates, not an entire CPU. Even then it's probably made of snooker balls!

  • @Kimwilliams45 • a year ago • +2

    Thank you. I had never heard about the Landauer limit even though I was a physics student. Very good explanation.

  • @jesuss.c.8869 • a year ago • +11

    Great video, Jade. Thank you for introducing such a complex topic in an easy and fun way. 👍

  • @sachamm • a year ago • +23

    The thought experiment I was given when learning about reversible computing referred to the elasticity of atomic bonds and how energy could be returned when a molecule returned to its original conformation.

    • @slevinchannel7589 • a year ago

      Not many have the intellectual integrity to watch the harsh history coverage that Some More News did in the video "Our Fake Thanksgiving".

    • @david203 • a year ago

      I don't see it. You would have to identify where the "atomic bonds" are in the logic circuits. Read Gotbread's analysis in another comment here. It makes more sense.

  • @maxmustsleep • a year ago • +1

    This is so fascinating! Great job explaining this complex concept. I think even without much CS knowledge you'll be able to understand it from this video.

  • @saggezza-artificiale • a year ago • +6

    Very nice video. I'd just suggest clarifying that, if we consider real use cases, we'll probably never get around Landauer's principle, even if we developed perfect reversible computing. This is because reversible computing allows us to handle information without erasing it, but unless we rely on a device that can store an infinite quantity of information, sooner or later we'll need to delete it, and that can't be done in a reversible way.

  • @trewaldo • a year ago • +8

    I went into this video with a preconceived notion about entropy's connection with energy and information. But this is different. The explanation made it better! Thanks, Jade. Cheers! 🥰🤓😍

    • @pandoorloki1232 • a year ago

      Shannon entropy is not thermodynamic entropy--they are fundamentally different things.

  • @hoptanglishalive4156 • a year ago • +21

    Disinformation gives me the chills but information warms my soul. Like Prometheus giving the fire of knowledge to humanity, science educators like shining Jade are doing vital work.

  • @lensman7519 • a year ago

    Great content, ty.
    Nice convergences with Planck length, spooky connectivity, entropy and quantum computers. Thx for the headache.

  • @zscriptwriter • 2 months ago

    Back in college in 1984 I created a circuit gate simulator on my TI99 computer that allowed the user to place circuit gates onto the screen, connect them, input values, and then compute the output. Given enough memory, the simulator could reverse-lookup the initial values.
    Thank you Jade for reminding me how much fun I had in college.
    You are an awesome person with an unlimited imagination.
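
    A minimal sketch of that reverse-lookup idea on a toy circuit (a hypothetical example, not the TI99 program); note that an irreversible net returns a set of candidate initial states rather than a unique one, which is exactly the discarded information the video is about:

    from itertools import product

    def circuit(a, b, c):
        # A toy gate net: the kind of thing such a simulator evaluates forward.
        return (a and b) or c

    # "Given enough memory": tabulate every input -> output, then invert the table.
    reverse = {}
    for bits in product((0, 1), repeat=3):
        reverse.setdefault(circuit(*bits), []).append(bits)

    print(reverse[1])  # every initial state consistent with observing output 1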

  • @cykkm • a year ago • +8

    There is a little big problem with the reversible billiard ball (RBB) computer (tangentially, the problem was noticed by Landauer himself in the original paper, but I'll translate it into the RBB language). Suppose you place ideal compressive springs at all outputs of the RBB logical circuit to _actually_ reverse the computation. This indeed works, but then you _don't know the result of the computation!_ You can adiabatically _uncompute_ any computation as long as you don't read its result. If you want to know the result, i.e. whether or not a ball popped out from an output and bounced back, you have no option but to touch the system. Even a single photon interacting with a ball, such that you can detect whether there was a ball bouncing off the spring or not, transfers momentum to the ball, breaking the time symmetry of the RBB. A reversible computation is possible, _as long as you uncompute it all without reading out the result!_ The act of reading a result must increase the computer's entropy, even if the computer is reversible. This was one of Landauer's main results. His paper connected Shannon and Boltzmann entropy so clearly.

    • @JB52520 • a year ago

      I hadn't read that Landauer was working from a time symmetry perspective. If you're going to run the universe backwards, there's no need for a special computer. Heat will flow back into a normal computer, cooling it and pushing electricity back into the wall with perfect efficiency.
      If time symmetry must actually remain unbroken and that's not a joke, there's nothing clear about this concept. Can you have a system that's not influenced by the probabilistic nature of quantum effects? Even if that's not a problem, reversible computing couldn't give off photons, including infrared, because they'd have to turn around and hit the computer precisely where and when they left. Any irreversible interaction with the environment would also be forbidden.
      This means reversible computing would require impossibly perfect efficiency in perfect isolation, ignoring quantum effects and spontaneously traveling backward in time, while being theoretically guaranteed to produce no results.
      I don't know anything anymore. This explains nothing "so clearly". At least you easily understand the incomprehensible and see utility in the useless. Screw it, everyone else is awesome and I'm wrong about everything. The more I try to learn or think, the more I realize it's a miracle I ever learned to tie my shoes. Being born was a terrible idea. Hey, I finally understand something.

    • @michaelharrison1093 • a year ago

      This is along the same lines as the argument that I made in a comment I submitted.
      Reversibility does not eliminate the change in entropy.

    • @mihailmilev9909 • a year ago

      @@JB52520 lmao well fucking said dude

    • @mihailmilev9909 • a year ago

      @@JB52520 one of my favorite comments ever

    • @mihailmilev9909 • a year ago

      @@michaelharrison1093 what was the context? Who's in ur pfp btw, some professor?

  • @IllIl • a year ago • +5

    Thank you very much for the video! This is by far the best explanation of the Landauer limit that I've heard, it actually makes sense to me now. One question I still have is that this equivalence between information and entropy seems to be a purely theoretical limit that comes out of the math when we take the 2nd law as axiomatic and then "balance the entropy books" of a gate that has less information on the output than the input. But this seems to make an assumption that "information" obeys the 2nd law. In reality, the information of a logic gate is something that a human interprets. The reality is just physical conductors and semi-conductors. The physical reality of _any_ implementation of _any_ logic gate should always be sufficient to preserve the 2nd law. Or is that the point? That if we found the ultimate physical implementation of a logic gate, its waste heat would be equal to (or greater than) that limit?

  • @muthukumaranl • a year ago

    Wonderful! Thank you! I am glad the recommender bubbled this up to me. I will now check out your other ones as well!

  • @deslomator • a year ago

    I didn't know this concept and it's really well presented, thank you.

  • @bntagkas • a year ago • +3

    I'm just a stupid high-school dropout, but to me it seems all of these energies are really kinetic energy. Whether it moves a car, manipulates information, or produces heat, if you zoom in you move atoms in a way that benefits you: you move atoms to move the car (kinetic), you move atoms/electrons/photons to manipulate information, you move atoms etc. to produce heat. So it seems to me all kinds are really one: kinetic.

  • @atrus3823 • a year ago • +11

    Great video, as always! Based on my absolutely zero research, my first thoughts are that though the entropy change (assume lossless collisions) of the gate is 0, half of the output energy is not used in the final calculation. There would need to be a way in the system as a whole to make use of the waste balls, or else you're right back where you started.

    • @waylonbarrett3456 • a year ago • +4

      This has been considered. See the Szilard engine.

    • @atrus3823 • 9 months ago

      Thanks for the replies. Someday, I'll hopefully find time to follow up on this.

  • @Elear. • a year ago

    I have been wondering about this for years. Finally. Thank you

  • @pablosartor6715 • a year ago

    Great video Jade!!
    There are other information paradoxes in physics, all related to entropy.
    This is a very interesting topic.

  • @Zeero3846
    @Zeero3846 Před rokem +4

    I got so excited that I actually managed to guess your topic of reversible computing. I had only recently learned about it through some crazy crypto proposal for an energy-backed stablecoin, which fundamentally relied on the idea of reversible computing. In particular, it suggested a way to transfer energy over the internet. A computation doesn't necessarily have to be located on the same computer. It could happen over the network if you imagine the network as one giant computer. In performing the computation, the input side would generate heat, while the output side absorbed an equivalent amount, and you essentially have a new kind of heat pump. Of course, this sounds way too easy to be true, and that's because it is, but it's still definitely cool to think about. It's somewhat reminiscent of Nikola Tesla's wireless energy tower, except perhaps it's a little less scary, as it's not trying to transmit energy directly through some physical medium, like the earth and sky, but rather over our telecommunications infrastructure as data.

    • @josephvanname3377
      @josephvanname3377 Před rokem +1

      That sounds like a complete and total snake oil scam. You cannot transfer energy over the internet. It is really sad how people buy into scams hook, line, and sinker while failing to realize how cryptocurrency technologies could genuinely benefit reversible computation. You have also failed to even provide the name of this 'energy-backed stablecoin', which makes me even more suspicious about you.

  • @AdrianBoyko
    @AdrianBoyko Před rokem +4

    What a great presentation of these topics. This is Carl Sagan level exposition!

  • @jonathandawson3091
    @jonathandawson3091 Před rokem

    This is an incredible video. Huge respect to you Up and Atom, I learned a lot from you - thank you!

  • @c1dv1c1ous
    @c1dv1c1ous Před rokem +1

    Jade is a treasure. Thanks for sharing your passion for science!

  • @equesdeventusoccasus
    @equesdeventusoccasus Před rokem +3

    I am glad to see a new video from you. Excellent video as always.
    I think that the true energy limit will always be tied to the efficiency of the system. For instance, build a computer that channels the heat back into electricity, and you have automatically made the system more efficient, thus decreasing the amount of energy required to run it.
    As far as logic gate reversibility goes, a checksum could be incorporated into the gate to provide the reversibility being sought. It would require 3 bits rather than 1, but it would be completely reversible (sketched below).
    The first rule of computer journaling is that it is resource intensive. However, it can be worth it.
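
    A minimal Python sketch of that 3-bit idea, read as a Toffoli (controlled-controlled-NOT) gate; the function name and layout here are just illustrative:

    def toffoli(a: int, b: int, c: int) -> tuple:
        """Reversible 3-bit gate: (a, b, c) -> (a, b, c XOR (a AND b))."""
        return a, b, c ^ (a & b)

    # With c = 0 the third output is just a AND b, but unlike a plain AND
    # the map is a bijection on all 8 states, so nothing is ever erased.
    states = [(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)]
    outputs = [toffoli(*s) for s in states]
    assert len(set(outputs)) == len(states)                  # no two states merge
    assert all(toffoli(*toffoli(*s)) == s for s in states)   # its own inverse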

    • @paulthompson9668
      @paulthompson9668 Před rokem +1

      That sounds like the beginning of a journal article. I'm looking forward to seeing you published.

    • @equesdeventusoccasus
      @equesdeventusoccasus Před rokem +1

      @@paulthompson9668 I retired 15 years ago, my article writing days are long past.

    • @paulthompson9668
      @paulthompson9668 Před rokem

      @@equesdeventusoccasus But if you have a way of building a computer that channels the heat back into electricity, then you'll be doing society a huge favor by explaining how. Likewise for finding a more efficient way of providing logic gate reversibility.

  • @nbooth
    @nbooth Před rokem +3

    Minor quibble, but the second law doesn't imply that entropy can only increase, only that it is more likely to. You can always get lucky and have total entropy go down. I'm sure it happens all the time.
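
    A quick simulation sketch of that point (the particle count and step count are picked arbitrarily): the Boltzmann-style entropy of a toy two-sided box trends upward but visibly fluctuates downward along the way.

    import random
    from math import comb, log2

    N = 20
    left = N   # start with every particle in the left half: 1 microstate
    for step in range(30):
        # pick a particle at random and move it to the other side
        if random.randrange(N) < left:
            left -= 1
        else:
            left += 1
        entropy = log2(comb(N, left))   # log2 of the number of microstates
        print(f"step {step:2d}  left={left:2d}  S = {entropy:.2f} bits")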

    • @nmarbletoe8210
      @nmarbletoe8210 Před rokem +1

      Indeed! And at the maximum entropy state, it is more likely that entropy will decrease, since it has nowhere to go but down. Randomness is fun

  • @jayantchoudhary1495
    @jayantchoudhary1495 Před rokem

    new to this channel, stuff here is awesome, keep up the good work

  • @firstolasto1518
    @firstolasto1518 Před rokem

    Brilliant!! Thank you. Amazing video. We’ll have to check out your others

  • @havenbastion
    @havenbastion Před rokem +7

    "Pure information" still exists as a pattern in a physical substrate and the movement of physical things is heat. There's no mystery unless you imagine pure information to be non-physical, which is an existential metaphysical category error.

    • @thelocalsage
      @thelocalsage Před rokem

      yeah yeah yeah and really there are no exact circles in nature and technically pi is a rational number because there aren’t *really* uncountably many things to construct pi from and blah blah blah we’re dealing in mathematical abstraction here you don’t need to “well, actually” a really good piece of science communication by telling mathematicians they’re performing errors of metaphysics when it takes only a modicum of emotional intelligence to see these ideas are a platform for discussing purely mathematical systems

    • @pandoorloki1232
      @pandoorloki1232 Před rokem

      "unless you imagine pure information to be non-physical, which is an existential metaphysical category error."
      That is precisely backwards. Confusing information with physical implementations is a category mistake.
      P.S. The response is more of the same conceptually confused nonsense. Information is a mathematical abstraction ... it doesn't need a physical instantiation or a "physical need"--that confuses the colloquial meaning of information with the formal meaning.

    • @havenbastion
      @havenbastion Před rokem +1

      @@pandoorloki1232 The information always has a physical instantiation or it couldn't be information. In order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.

    • @thelocalsage
      @thelocalsage Před rokem

      @@havenbastion a circle always has a physical instantiation or it couldn’t be a circle. in order for it to exist in any meaningful sense there must be a physical need by which it may be acquired and manipulated.

  • @itsawonderfullife4802
    @itsawonderfullife4802 Před rokem +4

    Insightful video.
    The "reversible logic gate" (referred to near the end of the video) is simply using (controlled and confined) classical scattering to compute. And we already know that scattering is a reversible process because it is directly dictated by the laws of mechanics (and conservation laws) such as Newton's 2nd law. And Newton's 2nd law is evidently time reversible (as are other fundamental laws of nature) because they are expressed as 2nd order differential equations (involving acceleration) which are invariant under a time-reversal transformation (=keep positions and reverse velocities).
    The question then goes back to this: given that the fundamental laws of nature are time-reversible, how come we have a thermodynamic arrow of time and irreversible macro processes (such as a traditional irreversible logic gate operating)? A common answer is that irreversibility (=the thermodynamic arrow of time) is an emergent feature of an ensemble of many particles (it's simply mathematics and probability).
    So the model "reversible logic gate" really solves nothing. It's just a toy model of controlled Newtonian scattering, which we have known about for hundreds of years. That does not tell us how to build computers which do not increase entropy.
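
    For concreteness, a small sketch of the usual idealization of the billiard-ball scheme, the Fredkin-Toffoli interaction gate; the truth table is the standard one, everything else is illustrative:

    def interaction_gate(a: int, b: int) -> tuple:
        return (a & b,          # both balls present: deflected track 1
                a & (1 - b),    # only a present: passes straight through
                (1 - a) & b,    # only b present: passes straight through
                a & b)          # both balls present: deflected track 2

    inputs = [(a, b) for a in (0, 1) for b in (0, 1)]
    outs = [interaction_gate(*i) for i in inputs]
    assert len(set(outs)) == len(inputs)                        # injective: reversible
    assert all(sum(o) == sum(i) for i, o in zip(inputs, outs))  # balls conserved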

  • @allanwrobel6607
    @allanwrobel6607 Před 10 měsíci +1

    A fundamental concept explained so clearly. You have a rare talent; keep going with these.

  • @sajidrahman8613
    @sajidrahman8613 Před rokem

    This is brilliant! Thank you so much for doing this.

  • @NotHPotter
    @NotHPotter Před rokem +10

    Makes sense. In order to maintain a system of any kind of order (and thus store useful information), the system internally needs to expend energy to resist the natural entropy that seeks to degrade that information. Some kind of input is gonna be necessary to resist that decay.

    • @heinzerbrew
      @heinzerbrew Před rokem

      we have lots of methods of storing information that don't require energy to maintain. Sure eventually entropy will destroy those storage devices, but we don't operate on that time scale.

    • @NotHPotter
      @NotHPotter Před rokem +1

      @@heinzerbrew Those more stable methods are also a lot slower, although even early CDs and DVDs are approaching the point where they're no longer readable. Books weather and wear. Ultimately, it doesn't matter, though, because even if you're going to quibble over time scales, there is still some necessary effort made to preserve data in a useful state.

  • @davestopforth
    @davestopforth Před rokem +10

    I'm struggling with the definition of information here. Surely information is based on our perception of states and conditions. Of course for information to exist requires a transfer of energy at some point, and the information literally exists because it is a state, but it doesn't become information until we begin to perceive it.
    For example, a particle travelling through space can have its direction and velocity determined, but the particle doesn't care; it just exists, and it just weighs X whilst travelling at Y towards Z. That's it. For that to become meaningful information requires some perception.
    Would it not be better to say it requires energy for the creation, transfer or manipulation of information?

    • @heinzerbrew
      @heinzerbrew Před rokem +4

      Yeah, it would have been nice if she had defined "pure information", because the information I know about doesn't need energy to simply exist. You are correct: it is the creation, changing, and refreshing that requires energy. Not really sure how she doesn't get that.

    • @DrewNorthup
      @DrewNorthup Před rokem

      Information at rest does actually have an energy component, but she'd be here all week explaining it.

    • @compuholic82
      @compuholic82 Před rokem

      @Dave "Would it not be better to say it requires energy for the creation, transfer or manipulation of information?"
      But that also means that information itself must be associated with an energy state, does it not? If the manipulation (i.e. change of state) requires or releases energy there must have been an energy level before the manipulation and an energy level after the manipulation. The difference in these energy levels is the amount of energy needed for the manipulation.
      In that way it is no different than any other measurement of energy. Take gravitational potential energy. If you drop a ball you can assign an energy level to the ball before and after the drop and the difference in energy levels is released as kinetic energy.

    • @hyperduality2838
      @hyperduality2838 Před rokem +1

      @@compuholic82 There is also a 5th law of thermodynamics, energy is duality, duality is energy!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter.
      Action is dual to reaction -- Sir Isaac Newton (the duality of force).
      Attraction is dual to repulsion, push is dual to pull -- forces are dual.
      If forces are dual then energy must be dual.
      Energy = force * distance.
      Electro is dual to magnetic -- Maxwell's equations
      Positive is dual to negative -- electric charge.
      North poles are dual to south poles -- magnetic fields.
      Electro-magnetic energy is dual.
      "May the force (duality) be with you" -- Jedi teaching.
      "The force (duality) is strong in this one" -- Jedi teaching.
      There are new laws of physics! Your mind creates or synthesizes syntropy!
      Thesis is dual to anti-thesis creates the converging thesis or synthesis -- the time independent Hegelian dialectic.
      Duality creates reality!

  • @user-vb3xq9pm8t
    @user-vb3xq9pm8t Před rokem

    Wow, you are amazing! The way you present information is great and charming! Everything is very clear. I am totally in love! Thanks for your work)

  • @techslfink9722
    @techslfink9722 Před rokem +1

    Information is data that induces a change. If the second output of the gate is not observed, will it still have its function?
    I love your channel - I've had very few teachers who explained things as well as you do!

  • @Mike500912
    @Mike500912 Před rokem +4

    1. Microprocessors do output energy via driving external ports, used for driving other subcircuits.
    2. A logic gate isn't a fundamental building block. It itself is made up of transistors, a more fundamental building block.

  • @elroyfudbucker6806
    @elroyfudbucker6806 Před rokem +14

    Practically speaking, most of the heat generated within a microprocessor comes during the extremely short time that the MOSFETs making up the logic gates change state. When they are either fully conducting or not conducting current, they generate almost no heat. Multiply this tiny amount of heat by the millions of MOSFETs changing state millions of times a second and you see why a cooling fan sits on a heatsink mounted directly on the microprocessor, and why it's not a good idea for your daughter to have her laptop on the bed.

    • @htomerif
      @htomerif Před rokem +3

      Minor correction: what you said was exactly true 20 years ago. Right now, with the size of junctions in gates, a good chunk of the heat generated by processors is from quantum tunneling (i.e. leakage current intrinsic to the gate). I say "gate" because in modern processors you don't have individual FETs. You have compound devices that serve the purpose of a logic gate without actually having a recognizable FET.
      The only way to stop that kind of power drain is to kill the power entirely to large sections of a processor, otherwise they constantly drain power, flipping bits or not.
      I realize I used the word "gate" both to mean the insulated physical gate that controls current flow in a semiconductor device and to mean "logic gate" here. Hopefully it's relatively clear which is which.

    • @KirbyZhang
      @KirbyZhang Před rokem

      @@htomerif does this mean older FET processes can produce more power efficient chips?

    • @htomerif
      @htomerif Před rokem +1

      @@KirbyZhang It's not so much about "older". It's about what they were designed for. OP isn't wrong that FETs changing state *used* to be the primary power drain, or that it is *still* a significant contribution to TDP. It's just that CPUs have now been optimized for minimum power per computation, and that means gate leakage outweighs state changes.
      Unfortunately, this gets complicated. You probably know that past a certain point (I don't know, about 100nm?) the process nodes have nothing to do with size anymore. For example, the 7 "nm" node can't make features that are 7 nanometers. Its minimum feature size depends on the type of feature, but it's up around 50 actual nanometers. It's kind of a big fat lie.
      Ultimately, the answer to your question is: using current process technology, you can optimize for power efficiency by increasing gate sizes and increasing gate dielectric thickness, but it comes at the cost of more silicon real estate. They do use older processes and equipment for microcontrollers like TI's MSP430 series or Microchip's ATmega series, but it's older equipment that's been continuously developed for making these slow, extremely power efficient processors.
      I guess it really depends what you mean by "power efficient". If you mean "minimum standby TDP" then yes. If you mean "minimum power clocked at 100MHz, with a modern design" then also yes. If you mean "absolute minimum cost per floating point operation" then no. And all of those are important considerations. A rackmount server might be going all-out for 5 years straight. A desktop processor will probably spend most of its life at a much lower clock rate than maximum, and a mobile (i.e. phone) processor will probably spend a whole lot of its life in standby at or near absolute minimum power.
      I hope that helped. It's hard to pack a whole lot of information into a small space that someone would theoretically read.
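
      A toy sketch of that dynamic-versus-leakage split; every number below is invented purely to show the shape of the tradeoff:

      N = 1e9        # transistors (invented)
      ALPHA = 0.1    # fraction switching each cycle (invented)
      C = 5e-16      # F of switched capacitance per device (invented)
      V = 1.0        # V supply (invented)
      F = 3e9        # Hz clock (invented)
      I_LEAK = 3e-8  # A of tunneling/subthreshold leakage per device (invented)

      p_dyn = ALPHA * N * C * V**2 * F   # flows only when bits flip
      p_leak = N * I_LEAK * V            # flows all the time, flipping or not
      print(f"dynamic: {p_dyn:.0f} W   leakage: {p_leak:.0f} W")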

  • @mmicoski
    @mmicoski Před rokem

    Very nice explanation with the billiard balls: each input has to go somewhere, if not as output information, then as lost energy (heat).

  • @vaderbase
    @vaderbase Před 9 měsíci

    Thank you so much for this informative video. ❤❤❤

  • @douggale5962
    @douggale5962 Před rokem +5

    The fundamental component of modern computers is the field effect transistor. They are metal oxide semiconductors, so they are known as MOS transistors. There are positive and negative constructions, suited for pulling the voltage up and pulling it down respectively. Those two constructions are "complementary", so the fundamental component is, more specifically, the CMOS FET. You switch FETs by charging or discharging the FET gate layer. The power used by a processor is the dissipation due to the resistance of conductors transferring charge in and out of the gate. Holding a 1 or a 0 state does not consume power; only the changes dissipate significant power.

    • @abdobelbida7170
      @abdobelbida7170 Před rokem

      Is that what "switch losses" means?

    • @douggale5962
      @douggale5962 Před rokem +2

      @@abdobelbida7170 Yes. Usually, switching losses primarily refer to the time when a transistor is partially on, and a voltage drop across the source and drain (the switched path) dissipates power. In a switching power supply with large currents, the partially-on pulse of losses is large. That's one huge transistor with one huge pulse of losses.
      In a CPU, there are no large currents going through any individual transistor, but an enormous number of gates being charged or discharged simultaneously, so an enormous number of small losses sum up to a large loss that would be correctly called switching loss.
      All of the power that goes into a CPU is lost, all of it becomes waste heat, eventually. Another way of looking at it is, CPUs use no power, other than the power they need for the losses, once they get going.
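
      A back-of-envelope sketch of that picture next to the video's limit; the capacitance and voltage are invented round numbers, not figures for any real process:

      from math import log

      K_B = 1.380649e-23   # J/K
      T = 300.0            # K, room temperature
      C_GATE = 1e-15       # F, ~1 fF of switched capacitance (assumed)
      V = 1.0              # V, supply voltage (assumed)

      e_switch = 0.5 * C_GATE * V**2   # energy dissipated charging one gate
      e_landauer = K_B * T * log(2)    # Landauer minimum per erased bit

      print(f"per-switch energy : {e_switch:.1e} J")
      print(f"Landauer limit    : {e_landauer:.1e} J")
      print(f"ratio             : {e_switch / e_landauer:.0f}x")   # ~10^5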

  • @dwightk.schrute8696
    @dwightk.schrute8696 Před rokem +3

    To illustrate the scale at play here - imagine you have an "optimal counter" that operates at the Landauer limit and simply increments a number by 1. Now you're going to do something spectacular - let's convert all the mass in our Hubble volume to energy and feed it to our optimal counter, using Einstein's famous mass-energy equivalence (roughly 2x10^69 J). Assuming you and the counter survived this event, you'd see a number around 2^341 (given the current temperature of space, 2.72 K). If you have a modern CPU, it probably supports the AVX-512 instructions, which let it easily operate on 512-bit numbers. The vast majority of numbers between 0 and 2^512 will never exist in any shape or form. This is sometimes used in computer security to inform us about the strength of systems against certain classes of attacks.
    I have a pet theory that the computational density of the universe is constant at any point in time - with the expansion of spacetime our Hubble volume shrinks and we have less available energy to use; on the other hand, thanks to the expansion everything gets cooler, and I think the numbers even out.
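
    Re-running that arithmetic with textbook constants (and taking the comment's 2x10^69 J and 2.72 K at face value) lands near 2^305 rather than 2^341, so the original figure presumably used somewhat different inputs; the conclusion about 2^512 is unaffected:

    from math import log, log2

    K_B = 1.380649e-23        # J/K, Boltzmann constant
    T = 2.72                  # K, temperature of deep space (from the comment)
    E = 2e69                  # J, mass-energy figure used in the comment

    counts = E / (K_B * T * log(2))     # Landauer cost per increment
    print(f"{counts:.2e} increments ~= 2^{log2(counts):.0f}")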

    • @gnanay8555
      @gnanay8555 Před rokem

      Reading this with Dwight's voice was longer than i expected 🤔

    • @vigilantcosmicpenguin8721
      @vigilantcosmicpenguin8721 Před rokem

      @@gnanay8555 It took a second for my brain to determine what tone of voice Dwight would say it in. I read it as condescending, with a few digressions into matter-of-factly.

  • @williamlangley1610
    @williamlangley1610 Před 11 měsíci

    I think your explanations are very easily understood...thanks.

  • @stillme4084
    @stillme4084 Před rokem

    Great video, amazing info and fun. Thanks & be well.

  • @landsgevaer
    @landsgevaer Před rokem +4

    After having spent so much time thinking how to carefully explain an unintuitive topic and managing to produce such a wonderful video, it must be so disappointing to read so many of the comments here that don't show anything has landed.
    Fortunately, there are a few exceptions too.
    Rest assured that it is the YT audience that is lacking, not the producer...
    😉

    • @En_theo
      @En_theo Před rokem

      Well, she kinda misunderstood some concepts herself, since she said @6:24 that entropy "never" goes down, which is false. It tends to increase, but there is still a possibility that it goes down. If the universe is eternal, then that possibility becomes a certainty and you find yourself with renewed energy again (after a veryyyy long time though).

    • @landsgevaer
      @landsgevaer Před rokem

      @@En_theo That may be a theoretical note to make, but I wouldn't say that makes the statement "wrong" for this audience.
      We can hopefully agree that many comments are much worse... 😉

    • @En_theo
      @En_theo Před rokem

      @@landsgevaer
      People are entitled to their comments, as long as they don't pretend to be physicists bringing new knowledge. I'm not blaming her; it's a common misconception that entropy "always" increases. A simple sentence like "it tends to, but the details are for another video" would be nice.

  • @danielclv97
    @danielclv97 Před rokem +7

    I love the video, but I'd like a follow up, maybe on Maxwell's demon. Like, if this can solve the need for energy consumption during information manipulation, what stops Maxwell's demon from breaking the laws of thermodynamics? Or is it still physically impossible to store information with no expense of energy? And if yes, then isn't the billiard-ball computer useless, in the sense that it will consume the same minimum amount of energy? You can't use information if you don't store it to interpret it, after all. You'll have to at least read the positions of the balls after they pass through the computer.

    • @geraldsnodd
      @geraldsnodd Před rokem +1

      Maxwell's Demon is a cool topic in itself :)
      Check out Eugene Khutoryansky's video.

    • @christopherknight4908
      @christopherknight4908 Před rokem +6

      If I'm not mistaken, Maxwell's demon has to expend energy to erase information. Would reversible computing eliminate this requirement? Perhaps information storage is just a different problem that would need to be solved, in addition to the energy usage of computing.

    • @_kopcsi_
      @_kopcsi_ Před rokem +1

      @@christopherknight4908 yes, information storage is totally different from computing, but interestingly the two problems have the same source:
      "Landauer's principle is a physical principle pertaining to the lower theoretical limit of energy consumption of computation. It holds that "any logically irreversible manipulation of information, such as the erasure of a bit or the merging of two computation paths, must be accompanied by a corresponding entropy increase in non-information-bearing degrees of freedom of the information-processing apparatus or its environment"." -- en.wikipedia.org/wiki/Landauer%27s_principle
      storage of information is static, manipulation of information is dynamic. these two have totally different nature. but when we erase information, we actually manipulate it. erasure is dynamic. that's why the same principle can provide a solution for the problem of Maxwell's demon and an explanation for the fundamental energy cost of irreversible logical operators.

  • @Olaf_Schwandt
    @Olaf_Schwandt Před 9 měsíci

    thank you, very interesting.
    Seen in connection with computer gates, it is very clear. I had heard about entropy and energy of information changes in connection with Maxwell's demon. Someone used it to prove that Maxwell's demon does not violate the second law of thermodynamics.
    I had no idea about the Landauer limit, but the formula is very close to Boltzmann's entropy formula (of course).

  • @granduniversal
    @granduniversal Před rokem

    This approach to entropy is like making a database for an application. You make that based upon the business rules. When you are trying to find out the business rules, you will have to deal with a lot of uncertainty. That uncertainty is not unlike the challenges that surround trying to figure out chaos math. Anyway, the more uncertainty, the more entropy. And the natural state for most people to be in is that of ignorance. When you are figuring out the business rules you will find varying interpretations, and varying ideas about what kind of a world those can go in. That part is not unlike how consciousness is not a broad spectrum thing, but rather about what we can focus on at any one point in time. In the end you hope to get to the truth. The truth is the only "state" that can answer all of the questions, logically lead to the end result desired for a particular circumstance.
    This thing about getting at the business rules, it is important as a metaphor. It speaks about how you can try to do math between apples and oranges sometimes, but we want there to be an equivalency. Well, there will be, if you borrow from the outside. I don't mean using things like big data to help you find the business rules. That will most likely result in scope creep, as it points out to you more ways to make differentiations, flinging you into what could be before you understand what is. I mean wrestling with it by using as little information to deduce what you have to as possible. If there is an equivalency, then the correct answer should be surrounded by a lot of falsehood, that is interest in remaining ignorant, or potential other path traces, or heat. Because this is hard work, and nobody should go around saying it isn't. If you are right about knowledge, or wisdom, there will always be a lot of people going around telling you that you are wrong, in other words. That's no reason to give up.
    The example also points out something about linguistics that I find fascinating. It only affects the consistency of one side of the two sides that should be active to gain effective communication. But we see from things like animal training that is really all that is necessary to achieve success, for the master. It is about reward. And reward is about what an object needs or wants, not what the master demands it needs or wants. Making databases, however, is about making it easier for everybody to do things repeatedly, with less error. Does this lesson tell us that is achievable with only one side being consistent? You know, with the system being built correctly? I guess we find that out when we use a good app versus a bad one?

  • @davidh.4649
    @davidh.4649 Před rokem +8

    So Jade ... does the Landauer Limit apply to synapses in the human brain as well? 😁 Good thought provoking video as always!

    • @gotbread2
      @gotbread2 Před rokem

      It does. It's a fundamental consequence of information erasure.

  • @paulthompson9668
    @paulthompson9668 Před rokem +6

    Hi Jade, this video seems like a wonderful lead-in to a video on how quantum computers will require less energy due to the reversibility of their gates.

    • @dodokgp
      @dodokgp Před rokem +1

      Not unless the cryogenic operation is obviated.

    • @paulthompson9668
      @paulthompson9668 Před rokem +1

      @@dodokgp So let's get cracking on quantum computing at room temperature!

  • @glenncurry3041
    @glenncurry3041 Před rokem +1

    What an interesting approach, strictly from information physics! Being old, my formal electronics education was discrete components. ICs were first coming out and the CPU did not exist. IOW, what you show as a symbol for a logic gate was, to me, originally a batch of steering diodes and mono- or bi-stable multivibrators. To me the total energy loss for that gate is based on having to overcome the barrier voltages at each of the solid-state devices' internal junctions, e.g. 0.7 V for a silicon PN junction. Plus additional components, wiring,.... Now cascade them... and suddenly multiply that by billions on one chip!
    Even with today's abilities to cram massive numbers of these onto one chip, they are each still discrete devices and junctions with those needs. While you are working at a completely different level. And I see a great gap between them yet.

  • @mystyle43
    @mystyle43 Před rokem

    This is a very important phenomenon. It even has a connection to metaphysics: the equivalence between energy, matter and information.

  • @MrAttilaRozgonyi
    @MrAttilaRozgonyi Před rokem +3

    I remember reading somewhere that information actually has weight. In other words, a hard drive full of data is measurably a tiny bit heavier than that same drive when erased. I wish I could remember the actual reason why. Maybe that could make an interesting video? All the best! ☺️☺️

    • @Snowflake_tv
      @Snowflake_tv Před rokem

      I'm interested in this, too.

    • @markricher2065
      @markricher2065 Před rokem

      I was thinking the very same thing 🤔

    • @Snowflake_tv
      @Snowflake_tv Před rokem

      Is information then mass???

    • @MrAttilaRozgonyi
      @MrAttilaRozgonyi Před rokem +1

      I have a vague recollection that it *may* be limited to the older style HDD’s which have magnetic platters and it may not be the case in SSD drives… but that’s all I can remember. :) I’d love to know more about this.

    • @RobBCactive
      @RobBCactive Před rokem +2

      As the drive heads magnetise the platter coating when setting bits, and the platters sit in the Earth's magnetic field, it's possible the measured weight changes a little due to the added magnetic force, but that is not gaining or losing mass.
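
      For scale, if one (contestably) prices each stored bit at its Landauer energy and divides by c^2, the implied mass difference is absurdly far below anything a scale could see; a quick sketch assuming a hypothetical 1 TB of data at room temperature:

      from math import log

      K_B, T, C = 1.380649e-23, 300.0, 2.998e8   # J/K, K, m/s
      bits = 8e12                                # 1 TB, hypothetical
      e_per_bit = K_B * T * log(2)               # Landauer energy per bit
      mass = bits * e_per_bit / C**2             # E = m c^2, rearranged
      print(f"{mass:.1e} kg")                    # ~2.6e-25 kg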

  • @micro2743
    @micro2743 Před rokem +3

    "Pure Information" (Data) can be stored on a Floppy Drive/Hard Drive/CD/DVD/Thumb Drive/SSD/etc... and it requires no energy and gives off no heat. Moving information requires Energy and produces Heat! Data stored in active memory must be refreshed and also requires Energy and produces Heat, and of coarse physical hard drives spin for a while even when they are idle. You covered logic gates, but it should also be noted that often computers are just moving data i.e. copying an image from a web server to your local hard drive, then to active memory, and eventually to your graphics card so you can see it on your screen.

    • @StefanNoack
      @StefanNoack Před rokem +1

      Well, the floppy media requires mass, and that is equivalent to energy ;-)

    • @Rithmy
      @Rithmy Před rokem

      And the wear and tear of the floppy media can be said to equal energy loss. Almost like radiation in big.

    • @micro2743
      @micro2743 Před rokem

      @@StefanNoack A mass at rest is potential energy and I don't think it produces heat.

    • @micro2743
      @micro2743 Před rokem

      @@Rithmy I am talking about cold storage, and I know a floppy can last 20+ years. I don't think a CDR/DVDR lasts that long. Retail CDs/DVDs are usually pressed and degrade much more slowly. Since energy cannot be destroyed, you are correct: that magnetic field has to be going somewhere else as the floppy degrades, theoretically creating a small amount of heat. Would we even be able to measure it?

    • @Rithmy
      @Rithmy Před rokem

      @@micro2743 Isn't mass alone creating heat by pressure?

  • @badwolf8995
    @badwolf8995 Před rokem

    got a new sub

  • @commandershepard6189
    @commandershepard6189 Před rokem

    Cool! Good video... Some people don't understand that 0 is an open circuit while 1 is a closed circuit, meaning 0 is the absence of a bit while 1 is a bit. In microchips, a 0 still has work or force being applied, due to heat transfer from the nearby transistors and the opening of that transistor... this equates to energy loss in the application. Yeah, work applied but no work output. Cool stuff. The problem: we'll never get around thermodynamics.

  • @rohanlorange3660
    @rohanlorange3660 Před rokem

    Wow you're nearly at 500K subs! Well done!

  • @MrAttilaRozgonyi
    @MrAttilaRozgonyi Před rokem

    What an awesome video. Thoroughly engaging. I couldn’t stop watching. Fantastic!! ☺️👍

  • @richardleeskinneriii9640

    This is interesting. In cybernetics, variety is the number of possible states in a system. The larger the variety your system has, the more entropy it has.
    It's this link between thermodynamics and cybernetics, which is something I'm very interested in.

  • @TimJSwan
    @TimJSwan Před rokem

    You killed it again, Jade.
    I love the topic you covered, as usual, and I had never seen the billiard ball example before.
    You should check out the Hashlife algorithm, which can give a 10^800 speedup. Another entropy saver right there.

    • @Snowflake_tv
      @Snowflake_tv Před rokem

      So did I. Anyway, Hashlife algorithm???

  • @sunnybeach4025
    @sunnybeach4025 Před rokem +1

    I find it especially captivating that you seem to specialize in language, yet use the strong physical foundation with which you conduct this display. I believe my time is best spent learning more about modern language and...

  • @stevesloan6775
    @stevesloan6775 Před rokem

    That was really awesome.
    Reminds me of how the universe is created.
    It knows what it is, so it can run through its program and know all of its outcomes without any real effort at all.

  • @dougieh9676
    @dougieh9676 Před rokem

    Love your videos ❤️❤️❤️
    Keep em up please. ♥️♥️♥️

  • @AlexSchendel
    @AlexSchendel Před rokem

    Extreme overclockers use liquid nitrogen to massively increase the clock speed of processors. Since cooling the transistor gate oxide greatly increases its stability, they can raise the voltage they run. And as you cover in the video, this amount of cooling drastically reduces the power consumed by the processor, so it all still ends up stable, and the higher voltage allows a much higher frequency because the time taken for the transistors to flip is effectively reduced. Very cool to learn about the Landauer limit!

  • @RonaldoWeigand
    @RonaldoWeigand Před rokem

    This is so interesting. I had never thought about this !

  • @markricher2065
    @markricher2065 Před rokem

    Fascinating
    A clear and concise explanation of an interesting phenomenon

  • @KipIngram
    @KipIngram Před 8 měsíci

    But you see, changing the internal state of the information storage elements in a computer involves moving electrical charge, literally. You can think of it a bit like all the info elements are little capacitors. To get a high voltage (a "1") onto that capacitor, you have to charge it up. To do this, you take charge from the power supply's positive rail and route it onto the capacitor. Later, when you want to change that element to a "0", you have to get rid of that charge, so you discharge it to ground. So, for each 0 -> 1 -> 0 cycle, you've moved a sip of charge from power to ground. That is energy: Q*dV. Outside the processor you use some kind of power supply to pump the charge back up from ground to the positive voltage.
    So, there's no mystery about where the energy is being used.

  • @LMProduction
    @LMProduction Před rokem

    The Landauer limit can be circumvented by noting that the entropy can be dumped into reservoirs of other conserved quantities, not just energy. A paper from 2009 (and a few more papers in the years since) by J. Vaccaro and others showed that in principle, you can transfer the k_B ln(2) of entropy to an angular momentum reservoir, thereby inducing no energy loss. We're currently working on a qubit operation to show, experimentally, a single-bit transfer of thermal energy from the environment into a spin bath.

  • @dninjaman
    @dninjaman Před rokem

    Hi Jade, I'd like to see an interview or series with you and Professor Michelle Yvonne Simmons at UNSW. I would like to follow the progression of the amazing work they are doing with quantum computing and think you could do a great deep dive into the history of Prof Simmons work and where UNSW is going into the future. Thanks Dennis

  • @TechnoMageB5
    @TechnoMageB5 Před rokem

    2 things:
    1) With current electronic computers, there is no way even with reversible logic gates to make a net zero energy use computer. The architecture itself requires energy to operate, thus the laws of entropy cannot be circumvented. Quantum computing still needs to solve the architecture problem, but at least the computing itself could theoretically work.
    2) This video reminds me of a discussion at work I had with a girl years ago, who was a college student working there at the time, about logic states. She was curious as to how computers "think". I started with a similar example, the light being on or off (while switching the light in the room on and off as an illustration), that by storing and comparing millions of on/off states, computers evaluate conditions for us, and that those "light switches" are represented by the presence or absence of voltage - more basic than this video. Next time I visit that location, she tells me that literally a few days after our talk, her professor starts talking about the same thing, and here she is in class smiling to herself the whole time thinking "I know where this is going..."

  • @kozmizm
    @kozmizm Před rokem

    In classical computing some information is destroyed (lost) when doing a calculation, but if you do things cleverly, you can create an odd symmetry where the calculation done forward in time is essentially the same as the calculation done backward in time. This is referred to as reversible computing, in which no information is lost, even though it requires that many more states are preserved, and the result is sort of the same thing as the initial state, adding quite a lot of difficulty to the design of the machine. Yet, in a superconducting environment, this is 100% efficient, theoretically, leading to zero energy wasted. That is, if my understanding of all that is correct. Hence, it is theoretically possible to have a special quantum computer that does not waste any electricity.
    Instead of destroying information by starting with multiple pieces of information that combine down into fewer bits, we preserve all the information during the transformation using our special knowledge of physics; the number of inputs equals the number of outputs and everything can be deterministically reversed. Even though we only care about a portion of those outputs, no information was destroyed. Every bit has been transformed but preserved. It's hard to wrap your head around, but it seems to be a sound idea. Kudos to whoever came up with it! Essentially it preserves everything, so there is basically zero entropy generated and therefore no loss. Even if there are minuscule amounts of entropy, the circuit is so efficient that the energy loss will be almost zero compared to classical computing.
    You can't do this using regular transistors. You have to build the symmetry into the physical design of the system itself, using the natural properties of nature to outsmart entropy. It's not a simple problem, but with quantum gates in quantum computing it is theoretically possible. I'm using that word again, "theoretical". Honestly, from a purely theoretical point of view it makes sense. Just look at superconductors and we see that entropy can be overcome in certain circumstances. Now, imagine if we can combine superconducting materials with a form of super-symmetrical logic gates to overcome entropic waste. It's like taking superconducting to a meta level. It's the supersymmetry hypercube, man. Like, totally dude! Far out!

  • @jwjarro73
    @jwjarro73 Před 5 měsíci

    This is one of the most intuitive descriptions of entropy I've heard

  • @jakobthomsen1595
    @jakobthomsen1595 Před 3 měsíci

    Nice explanation of reversible computers!
    An additional challenge implementing them seems to be that the extra outputs can't be thrown away (that would produce heat again), so they must somehow be "recycled".
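
    One standard answer is Bennett's uncompute trick: run forward, copy out only the answer, then run the same gates backward so the garbage returns to zeros. A minimal sketch (the dictionary-of-bits representation is just for illustration):

    def toffoli(s, a, b, c):
        s[c] ^= s[a] & s[b]   # reversible AND written onto a zeroed bit

    state = {"x": 1, "y": 1, "g": 0, "out": 0}   # g is the garbage bit

    toffoli(state, "x", "y", "g")   # forward:  g = x AND y
    state["out"] ^= state["g"]      # CNOT: copy just the answer to 'out'
    toffoli(state, "x", "y", "g")   # backward: same gate undoes g

    print(state)                    # {'x': 1, 'y': 1, 'g': 0, 'out': 1}
    assert state["g"] == 0          # garbage back to zero, nothing erased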

  • @JohnSmith-ut5th
    @JohnSmith-ut5th Před rokem

    You made a logical leap from defeating the Landauer limit for a single gate to defeating it for a whole computer. Adding multiple layers of the gates you described *exponentially* increases the information carried along, unless you discard information, which is irreversible. In 2001, I developed the mathematics of a reversible computer that only needs 1 additional bit of information per gate layer (unpublished), so only linear growth.

  • @majorfallacy5926
    @majorfallacy5926 Před rokem

    I'm quite familiar with thermodynamics but not so much with computer science, so this completely blew my mind

  • @BrianHeilig1
    @BrianHeilig1 Před rokem +1

    That was cool! I found myself hoping you would dive into the energy loss at the logic gate, e.g. what happens if we have superconductors and cool the gate to near absolute zero? Where specifically is the energy lost in this logic gate?

  • @garyfilmer382
    @garyfilmer382 Před rokem

    Fascinating! I was thinking of quantum computers, time crystals, reversibility. Energy? There must be some, but the mystery is that there's no discernible change in temperature. There must be one! It's just so small that we haven't found the means to measure it. All intriguing!

  • @vimicito
    @vimicito Před rokem

    Regarding the amount of heat generated by a logic gate, there are the following considerations that - to the best of my knowledge - we tend to apply in practice.
    1. The most fundamental part in a processor is the transistor. Multiple transistors can make up a logic gate, and multiple logic gates can make up an instruction. Instructions can be as simple as comparing two registers, or as complex as hardware-accelerated cryptography.
    2. Instructions don't always take a single cycle. They may take more than one, especially in CISC computers, which combine simpler operations to do something more complex. Something like, say, reducing a power to a number of multiplications, but in base-2 systems things get far simpler than this.
    3. A processor is filled to the brim with transistors. These transistors are like electrical switches, where the push of a button is replaced by a voltage on a logic gate (the transistor's actual.. well, gate!). That gate has a certain time it needs to fill up its charge, depending on the amount of current the driver can supply (this is why MOSFET drivers are a thing), the amount of equivalent capacitance the gate has (capacitors are like electrical reservoirs), and finally the voltage that is needed to saturate the gate (i.e. turn it on). As an engineer, you can play with all of these to get the desired result.
    4. The processor will run at a certain frequency, and lower is better for energy savings. This is because of that time to charge/discharge the transistor. In the trade, we call these rise and fall times respectively. In these charging and discharging states, the transistor will emit the energy put into it as heat (acting as a resistor). This is why you want these rise and fall times to be low, but too low creates ringing, so don't take it too far. Increasing the processor's frequency will cause a larger percentage of time to be spent in these states, where heat is generated.
    So in practice, processors tend to prefer running at a lower frequency, both to save (battery) power and to generate less heat when a high frequency is not necessary. Manufacturers like Intel, AMD and ARM do this through stepping, which can be controlled either by the BIOS or by the operating system kernel. Most of the time will be spent at the low idling frequency, which then ramps up quickly when you want to do stuff (say, loading a web page), and then usually steps gradually back down. Performance improvements could be made by making instructions take fewer cycles, at the cost of CPU complexity (Intel is known for this, and its avoidance is what underpins RISC platforms). There are other ways to do these things in practice too, but this comment is already quite long. For the handful of people that got here, thank you for reading this. You're awesome!
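
    A rough sketch of the payoff from that stepping, using the usual dynamic-power scaling P ~ C*V^2*f; the two voltage/frequency operating points are made up for illustration:

    def rel_power(v, f, v0=1.2, f0=3.5e9):
        # dynamic power scales as C * V^2 * f; C cancels in the ratio
        return (v / v0) ** 2 * (f / f0)

    # hypothetical idle step vs. peak (0.7 V @ 800 MHz vs. 1.2 V @ 3.5 GHz)
    print(f"{rel_power(0.7, 0.8e9):.1%} of peak dynamic power")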

  • @robertbarta2793
    @robertbarta2793 Před rokem

    Super explanation! Thanks!

  • @CarlosRomeroFilosofia

    What a great video! Thanks for this

  • @johnclark8359
    @johnclark8359 Před rokem

    That was a first class video! Thanks a lot.

  • @C.O._Jones
    @C.O._Jones Před 8 měsíci

    Information entropy isn’t *quite* the same as thermodynamic entropy. Information entropy by itself does not provide any means to estimate the probability distribution; typically in information theory, we assume a uniform distribution, which makes entropy a trivial function measuring the multiplicity of the different symbols, just like its counterpart, the Boltzmann-Gibbs entropy, which counts the number of possible micro-states of particles in a given volume of space.
    I (computer scientist) love discussing this and other related topics with my physicist husband!
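
    A tiny sketch of the uniform-distribution point: under a uniform distribution Shannon entropy collapses to log2(n), and any skew pulls it below that ceiling.

    from math import log2

    def shannon(probs):
        return -sum(p * log2(p) for p in probs if p > 0)

    print(shannon([0.25] * 4))            # uniform over 4 symbols -> 2.0 bits
    print(shannon([0.7, 0.1, 0.1, 0.1]))  # skewed -> ~1.36 bits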

  • @jahnicalj8992
    @jahnicalj8992 Před rokem

    Finally the video is here,, thank you Jade...🙏

  • @paulhofmann3798
    @paulhofmann3798 Před rokem +1

    Cool video! Fun fact: when transforming string x into string y using a reversible computer, one generates K(y|x) of entropy, K(y|x) being the conditional Kolmogorov complexity. I consider this the most concise connection between physics and information theory.
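
    K(y|x) itself is uncomputable, but for a feel of the idea a compressor gives a crude upper-bound proxy in the same spirit, roughly C(x+y) - C(x); this is only a toy stand-in, not the Kolmogorov quantity:

    import zlib

    def c(s: bytes) -> int:
        return len(zlib.compress(s, 9))   # compressed size as a complexity proxy

    x = b"abracadabra" * 40
    y = x[::-1]                  # y is a trivial transform of x
    z = bytes(range(256)) * 2    # data unrelated to x

    print("proxy K(y|x):", c(x + y) - c(x))   # small: y adds little news
    print("proxy K(z|x):", c(x + z) - c(x))   # larger: z is genuinely new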

    • @david203
      @david203 Před rokem +1

      Yes, here you are simply using entropy = complexity.

    • @paulhofmann3798
      @paulhofmann3798 Před rokem

      @@david203 The interesting part of defining complexity as a generalization of entropy is that one can then show that most strings are random strings while the regular strings are of measure 0. This in turn helps to get a grip on causality, and is the ultimate answer to Laplace asking why we assume that the string ‘Constantinople’ in the Hagia Sophia was written by a human rather than being one of all the other possible combinations of the same number of letters.

    • @david203
      @david203 Před rokem

      @@paulhofmann3798 Cellular automata can be automatically generated, with each possible automaton associated with a unique integer. I believe Stephen Wolfram has a nice website about this. Some of these automata generate uninteresting simple-looking low-entropy patterns. Others generate a rich variety of chaotic patterns. The really interesting relationship with human intelligence happens along the random border between low-entropy and high-entropy generating automata. There is no rule for distinguishing automata that generate more intelligent patterns from those that generate either low-entropy or chaotic patterns. It is clear that there is much more to be learned about this random edge that locates what we judge as intelligent patterns.
      In addition to automata, certain simple algebraic systems, such as the logistic map, have a parameter that, when varied, generates either simple patterns or chaotic ones. Again, the border where chaos starts is not yet well understood. In this case, the cause of the chaos is very clear and can be visualized as a characteristic of the "reflection" between two curves, which is either contained (low-entropy) or escapes containment (chaotic).
      Sorry I'm waving my hands a bit here, since this is from memory.

    • @paulhofmann3798
      @paulhofmann3798 Před rokem

      @@david203 I know all that; my PhD thesis was on order out of chaos. Kolmogorov and his students made big contributions to this subject. Take e.g. the KAM theorem, which explains why there is empty space between Jupiter's rings. Remember, the solar system may be chaotic. I agree there are still many open questions in chaos theory.

    • @david203
      @david203 Před rokem

      @@paulhofmann3798 There are. It's a rich start to a field that may take off someday. Is your thesis online? Sorry to reply with such elementary information.