Why the Future of AI & Computers Will Be Analog

  • Published April 8, 2024
  • Why the Future of AI & Computers Will Be Analog. Secure your privacy with Surfshark! Enter coupon code UNDECIDED for an extra 3 months free at surfshark.deals/undecided Since digital took over the world, analog has been sidelined into what seems like a niche interest at best. But this retro approach to computing, much like space operas, is making a comeback because of its power and speed. What I really wanted to know, though, was this: How can analog computing impact our daily lives? Why might it be the future of AI? And what will that look like?
    Veritasium’s video on analog computers: • Future Computers Will ...
    The Reserve Bank of New Zealand’s (RBNZ) MONIAC display: • Making Money Flow: The...
    Watch How We Solved The Home Wind Turbine Problem • How We Solved The Home...
    Video script and citations:
    undecidedmf.com/why-the-futur...
    Get my Achieve Energy Security with Solar guide:
    link.undecidedmf.com/solar-guide
    Follow-up podcast:
    Video version - / @stilltbd
    Audio version - bit.ly/stilltbdfm
    Join the Undecided Discord server:
    link.undecidedmf.com/discord
    👋 Support Undecided on Patreon!
    / mattferrell
    ⚙️ Gear & Products I Like
    undecidedmf.com/shop/
    Visit my Energysage Portal (US):
    Research solar panels and get quotes for free!
    link.undecidedmf.com/energysage
    And find heat pump installers near you (US):
    link.undecidedmf.com/energysa...
    Or find community solar near you (US):
    link.undecidedmf.com/communit...
    For a curated solar buying experience (Canada)
    EnergyPal's free personalized quotes:
    energypal.com/undecided
    Tesla Referral Code:
    Get 1,000 free supercharging miles
    or a discount on Tesla Solar & Powerwalls
    ts.la/matthew84515
    👉 Follow Me
    Mastodon
    mastodon.social/@mattferrell
    X
    / mattferrell
    / undecidedmf
    Instagram
    / mattferrell
    / undecidedmf
    Facebook
    / undecidedmf
    Website
    undecidedmf.com
    📺 YouTube Tools I Recommend
    Audio file(s) provided by Epidemic Sound
    bit.ly/UndecidedEpidemic
    TubeBuddy
    www.tubebuddy.com/undecided
    VidIQ
    vidiq.com/undecided
    I may earn a small commission for my endorsement or recommendation to products or services linked above, but I wouldn't put them here if I didn't like them. Your purchase helps support the channel and the videos I produce. Thank you.
  • Science & Technology

Comments • 1.3K

  • @UndecidedMF
    @UndecidedMF  Před 20 dny +55

    Should we take the word of analog evangelists as gospel? Or are we better off waiting for flying cars? Secure your privacy with Surfshark! Enter coupon code UNDECIDED for an extra 3 months free at surfshark.deals/undecided
    If you liked this, check out How We Solved The Home Wind Turbine Problem czcams.com/video/SQKHJm7vd4E/video.html

    • @Paulftate
      @Paulftate Před 20 dny

      Here's food for thought for you .. not my opinion just a thought ... Remember The Jetsons? Don't remember seeing any non-white .. it is what it is 👋

    • @Paulftate
      @Paulftate Před 20 dny

      When I think analog? I think vacuum tubes

    • @Sekhmmett
      @Sekhmmett Před 20 dny +3

      No

    • @d1skel452
      @d1skel452 Před 20 dny

      You should look into BrainChip and their neuromorphic chip based on a spiking neural network. It has just been launched into space with a satellite called ANT61. It uses microwatts to a few watts of power.

    • @Paulftate
      @Paulftate Před 20 dny

      @@Sekhmmett no, what?

  • @ShawnHCorey
    @ShawnHCorey Před 20 dny +741

    (circa 1960) "It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years." - John von Neumann

    • @jollygoodfellow3957
      @jollygoodfellow3957 Před 20 dny +75

      The current time is *always* at the limit of computing power. That's how time works.

    • @dianapennepacker6854
      @dianapennepacker6854 Před 20 dny +24

      We just won't be able to brute force it, at worst. There are still a myriad of technologies that could make it faster; better chip architectures are one of them.
      Maybe we will figure out how to program quantum computers for more standard things. As of now they are only better for specific tasks.

    • @jackinthebox301
      @jackinthebox301 Před 20 dny +45

      The difference here, of course, is that Von Neumann wasn't dealing with the limits of physics, just the technology of his time. We're making silicon transistors that are approaching the hard limit of their atomic structure. We'll have to figure out a different way to scale computing power beyond shrinking silicon-based transistors.

    • @magnemoe1
      @magnemoe1 Před 20 dny +5

      @@jollygoodfellow3957 The same is true for lots of things, like car engines: we make lots of them and there's a great incentive to improve them.
      Make a car engine that is more powerful, uses less fuel and pollutes much less. We've done it. It's just much more complex than an engine from 50 years ago.
      Make it a hybrid to increase all three benefits again.
      And analogue computers are amazing: the Iowa-class battleships continued to use their 1940s mechanical analogue computers until deactivation. They had digital fire control computers for the 5" guns, but the newer guns have higher muzzle velocity, and the 1980s digital units just replicated the mechanical ones cheaper and smaller, so you could use them even on towed guns. I imagine modern fire control systems are much more advanced, and there are even apps for snipers to use :)

    • @ahmetmutlu348
      @ahmetmutlu348 Před 20 dny

      @@dianapennepacker6854 I think in the far future there will be smaller-than-an-atom supercomputers that work faster than the speed of light, since they'll fit inside a single atomic particle ;P Other than that I don't see much future for improvements...

  • @billmiller4800
    @billmiller4800 Před 20 dny +222

    The biggest drawback of analogue circuits is that they are quite specific to a problem, so making something that will work for most/generic problems is difficult, where a digital computer is trivially easy in comparison. But, when you need something specific, the analogue computer can be significantly faster and more energy efficient. I look forward to the hybrid components that will be coming out in the future.

    • @giannismentz3570
      @giannismentz3570 Před 19 dny +16

      Exactly. I don't find small analog components in devices bad; they are useful, cheap to make, etc. But you can't have generic computing on anything analog, and really all the progress happened through generic computing. Abandoning generic digital computing would be incredibly stupid. Like, what, we'll get back into mainframes to program something?
      They are kind of pushing for this with the cloud; it seems like they don't want people to have access to computing, they want to take it away. Returning to an analog world would be really, really bad. Not to mention that overly complicated analog circuits are really difficult to design, so if you are, say, running a digital AI in some datacenter, it'd be a big help for you against others who don't have access. They could go further and design analog computers for AI - more efficient, designed by AI, for certain parts of their models. Maybe AI itself will be adjusting knobs for node weight values or connecting and disconnecting circuitry, who knows.
      Taking all those generic computing possibilities away from people is just wrong, and it seems like that is what they are after. Governments like to take things away until they realise that what they take from their people eventually hits them back when it's too late, and those things get taken from them too. They might be allowed to just exist, while others continue their progress, albeit at a slower rate since the majority of people don't participate; the rest move backwards. You can see where this is going - if it hasn't happened already.

    • @billmiller4800
      @billmiller4800 Před 19 dny +25

      @@giannismentz3570 You reminded me of a useful point: Analogue circuits need to be calibrated regularly because of component aging. This is a problem that was bypassed with digital circuits and a prime reason they are so common now, lower TCO!

    • @leegeorg08
      @leegeorg08 Před 18 dny +9

      @@billmiller4800 I wonder how feasible it would be with some combined analogue-digital circuitry to embed regular self calibration into packages themselves

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny +4

      No need for hybrids. Digital is king for anything that is not very trivial.

    • @billmiller4800
      @billmiller4800 Před 18 dny +7

      @@rogerphelps9939 True, in most cases, but the entire point of the video is that there are cases where analogue is much faster and energy efficient, so we should keep an eye open for things that can make our battery powered devices last longer.

  • @brucefay5126
    @brucefay5126 Před 20 dny +268

    I studied analog computers/computing in the 1970s as part of my electrical engineering education. At one time (after that) I worked for a company in Ann Arbor, Michigan that made some of the most powerful analog computers in the world. (I was in marketing by then.) They were used, among other things, to model nuclear reactors and power plants. Incredibly powerful.

    • @shapelessed
      @shapelessed Před 20 dny +25

      They may be powerful, but they still lose to digital computers whenever true accuracy is needed.
      I would not trust any analog system in designing such high-risk objects as nuclear plants.
      Digital computers have already established a plethora of error-correcting mechanisms, many of which only have to jump in a couple of times a week, whereas analog will inherently produce ever bigger errors with every single calculation. It all just depends on what tolerance we'll agree to accept.
      I only see analog in really specific applications. Maybe some ASIC chips or contraptions, but digital is here to stay and will govern the analog.

    • @cweaver4080
      @cweaver4080 Před 20 dny +8

      Same. We used them to solve differential equations. I'd been into digital computers for a decade by that point. I thought analog computers were nifty, but at the end of the course, big shrug and move on. It opened my eyes though, and I've never forgotten that.

    • @astr010
      @astr010 Před 20 dny +16

      @@shapelessed But much of modern AI is probabilistic in nature, so the use of analog systems in this environment would have a negligible impact on performance.

    • @kliersheed
      @kliersheed Před 20 dny +4

      ​@@astr010 "probabilistic in nature" yes, but the probability is based on accurate math. its just "probable" because we either lack the data or the computing power to make it exact. in theory it COULD be exact.

    • @sebastianbauer4768
      @sebastianbauer4768 Před 20 dny +4

      @@shapelessed are they inherently more inaccurate for their purpose though? For example the abacus and slide computers shown in the beginning seem to be fairly accurate and produce reproducible solutions. I’m not sure where the inaccuracy would be introduced in more complex systems and certainly not with *every* calculation.

  • @mikesheahan6906
    @mikesheahan6906 Před 20 dny +614

    My analog computer has ten fingers and ten toes.

  • @olhoTron
    @olhoTron Před 20 dny +171

    1:32 No, it may seem like an infinite set, but in practice it's limited by the signal-to-noise ratio. The SNR is effectively the number of "bits" of an analog computer; the rule of thumb is 6 dB ~ 1 bit. Also, each component adds its own noise on top of the signal, so you lose "bits" as your computation becomes more complex. BTW, that is also kinda true on digital computers: if you use floating point numbers you lose some precision with each rounding. However, on digital it's easier to just use more bits if you need them; on analog, decreasing noise is not so trivial.
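
To make the rule of thumb in the comment above concrete, here is a small Python sketch with purely illustrative numbers: for an ideal quantizer, SNR ≈ 6.02·N + 1.76 dB, which is where the "6 dB ≈ 1 bit" figure comes from.

```python
# Rough sketch of the "6 dB per bit" rule of thumb from the comment above.
# For an ideal N-bit quantizer, SNR ≈ 6.02*N + 1.76 dB, so a measured SNR
# can be turned into an "effective number of bits" (ENOB).

def enob_from_snr(snr_db: float) -> float:
    """Effective number of bits implied by a signal-to-noise ratio in dB."""
    return (snr_db - 1.76) / 6.02

for snr in (40, 60, 80, 100):
    print(f"{snr} dB SNR  ->  ~{enob_from_snr(snr):.1f} effective bits")
# 40 dB (a fairly noisy analog path) is only ~6 bits of useful precision;
# every extra 6 dB of noise performance buys roughly one more "bit".
```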

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny +5

      Absolutely right.

    • @marsovac
      @marsovac Před 18 dny +21

      In analog you can also increase the distance between the minimum and the maximum, which does a similar thing, but that is also not so trivial... you would not want to work with 20,000 volts or 1,000 psi :D
      Another problem is keeping noise stable in order to have a working calibration. Every change in a physical property can affect it. You would not want it to give a wrong result just because summer is too hot this year.

    • @joseeduardobolisfortes
      @joseeduardobolisfortes Před 17 dny +3

      Precision is not the point here; on the contrary, the goal of an analog computer is to deal with imprecise values in a more subjective way. It's closer to fuzzy logic than to precise math.

    • @solsystem1342
      @solsystem1342 Před 17 dny +10

      @@joseeduardobolisfortes
      Um, if you don't care about precision, we can just use low-precision approximations on a digital computer though?

    • @Bertie_Ahern
      @Bertie_Ahern Před 17 dny +2

      Sort of, but not really, and not practically, which is kinda the point

  • @Legg99
    @Legg99 Před 20 dny +141

    I built and used analogue computers (although I didn't call them that) to control the temperatures in a multi-layered cryostat for my Ph.D. work in the mid '70s. I did the number crunching on a digital computer that filled the basement of the maths dept building, using punch card input. 😮 Twenty-odd years later I found myself working with an engine management computer for a helicopter that was pure analogue. When I approached the system engineer at a large, well-known aerospace company who had design control authority for the system to ask some fundamental questions about it, he didn't have a clue - he was purely digital. I'm retired now, but if I drag my knowledge out of the closet along with my flared jeans and tie-dyed T-shirts, perhaps I'll come back into fashion. 😁

    • @your_utube
      @your_utube Před 14 dny +6

      It is precisely people like you who need to start channels, so that purely digital people can see how you approach a problem, and so we capture your invaluable decades of extremely useful experience before the world loses it. What a pity it would be to have too few people who can help us remain adept at such things and continue to reap their benefits, and who can show everyone how the world got here in the first place.
      I might not be an analog guy, but I would watch your channel any day.

    • @keepcreationprocess
      @keepcreationprocess Před 14 dny +3

      Come back - do not let that knowledge disappear

    • @RickBeacham
      @RickBeacham Před 14 dny

      I think automatic transmissions are essentially analog computers?
      In contrast to a digital one, I think an AT is a perfect use case for an analog computer, where a digital one might be more prone to error?

    • @Cyba_IT
      @Cyba_IT Před 13 dny +3

      @@RickBeacham I guess so, because it uses hydraulics. And then there's my Alfa Romeo 147, which has a Selespeed gearbox - an automated manual transmission. It consists of the standard (manual) 6-speed gearbox with the standard clutch and adds an electronically controlled hydraulic (robotic) actuator that operates both gears and clutch. I guess that makes it a hybrid analogue/digital computer.

  • @keresztesbotond740
    @keresztesbotond740 Před 18 dny +22

    I think the best way to sum it up is: analog is fast and efficient, but hard to design (or at least hard to formulate the problem for). Once you build it, it is only good at solving that specific problem.
    Digital, on the other hand, is much more flexible. You have an instruction set and can solve any problem you can write an algorithm for using those instructions, so you can solve multiple problems with the same machine. The tradeoff is the slower speed and lower efficiency (compared to analog).
    My favourite story about digital vs. analog is the Iowa-class battleships: they were built in the 1940s and were reactivated in the '80s. The fire control computers (electromechanical analog computers using cams, differentials and whatnot) were state of the art back in the day, but given the invention of the transistor and everything since, the navy did look at upgrading them to digital. What they found is that the digital system did not offer greater accuracy than the analog one. While technology advanced quite a bit over 40 years, the laws of physics remained the same, so the old analog computers worked just as well.

  • @icedreamer9629
    @icedreamer9629 Před 20 dny +96

    Analogue computing is analogous to the P vs NP problem in pure mathematics: it is fantastic at anything which is hard to calculate but quick to check. In this case, that means anything that is hard to figure out how to express, but that has solidly defined parameters.
    It works by shunting a good deal of the difficulty of solving the problem up front, onto the designers of the machine. It can take years of incredibly hard work to figure out how to express a single problem in analogue form, but once you DO, computing the answer for any combination or variant of the problem is virtually instantaneous.

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny +5

      No it is not. Offset and bias currents together with component tolerances will ensure that any analog computer can be easily outmatched by a digital computer.

  • @tbix1963
    @tbix1963 Před 20 dny +65

    Great video, thanks for sharing. The biggest problem with analog computers is that there are so few people who know how to work on them. I'm reminded of a hydroelectric plant I toured once that had an electromechanical analog computer controlling the units. At the time I visited it was already considered ancient, and they were actively attempting to replace it simply because nobody knew how it worked. They only knew how to turn it on, turn it off, and wind the clock spring once a shift the exact number of spins it needed to keep running. They had been trying to replace it with a new computer, but none of the many attempts could match its precision in operating the plant and maintaining proper water flows. They were in constant fear that it might break. I checked back maybe 20 years later to ask how it was, and no one working there knew what I was talking about. Sad that it was long forgotten by everyone at the plant. I thought it should have been retired to a museum, and still hope that possibly it was.

    • @bzuidgeest
      @bzuidgeest Před 20 dny +9

      Analogue computers are single purpose, where digital computers are general purpose.
      They belong in museums.

    • @mandrakejake
      @mandrakejake Před 20 dny +10

      ​@@bzuidgeest perhaps you are right, but some applications can be very specific like the one mentioned above at the power plant. Analog computers can be very power efficient, so very cheap to run. They have their merits

    • @nickhuynh6321
      @nickhuynh6321 Před 20 dny +5

      Many people don't know how things work and are still able to use them... such as a car, or a kid playing a video game... So the goal for "analog computing" is to somehow implement it for the masses such that they won't even need to concern themselves with the underlying tech... My point is that it's not the users who are the problem, but rather the implementation of the product, or lack thereof, that's the issue.

    • @Tsudico
      @Tsudico Před 20 dny +8

      @@bzuidgeest Isn't that more an issue with implementation? Digital computers have been designed to be general purpose for decades, but the first ones were single purpose machines. Modular Synths and The Analog Thing show how you can alter analog computing to do different things based on how you patch them together although they connect modular functions using wires. It seems that the companies looking into analog computing are trying to do a similar thing on the scale of integrated circuits.

    • @bzuidgeest
      @bzuidgeest Před 20 dny +1

      @@Tsudico Maybe, but a digital computer does not have to be rewired, and the same block of transistors can do many calculations. Analogue computers, indeed like synths, need to be rewired every time. You could add some analogue switching controlled by a digital controller... a hybrid. These things are difficult to define.

  • @johnmiglautsch4587
    @johnmiglautsch4587 Před 20 dny +87

    I gave a talk at DataVersity on context. I explained that in the '70s we could build a nuclear power plant instrument panel with gauges of different ranges - this allowed the techs to scan the whole wall and immediately see if anything looked out of normal range (usually straight vertical). However, everyone wanted digital, not realizing that with that change, each number had to be individually read, consciously scaled and then thought about (compared with 'normal'). With digital came the necessity for alarms, because it took too much mental effort to scan the wall. Something few consider to this day...

    • @Wil_Liam1
      @Wil_Liam1 Před 19 dny +9

      So few even slow down to consider anything in these days and times..

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny +14

      @@Wil_Liam1 You can easily create analog-looking displays with digital tech.

    • @jearlblah5169
      @jearlblah5169 Před 17 dny +1

      @@rogerphelps9939 Nowadays, yeah, it's trivial, but back then I imagine they only had text interfaces.

    • @perrymaskell3508
      @perrymaskell3508 Před 16 dny +5

      Basic HMI stuff. With an analogue gauge, you get 3 things straight away:
      1. The current value.
      2. The range of the values (and where the current value sits within that).
      3. The rate of change.
      You have to work that out each time for each digital display. If you want an accurate reading - digital, if you want to monitor a whole lot - analogue.

    • @aafjeyakubu5124
      @aafjeyakubu5124 Před 15 dny +1

      I use analog displays that are digitally sourced at my job. One nice feature that has been programmed in is that when a limit is exceeded, the color of the display changes. Depending on the severity, the display may be yellow or red. I realize this was readily available in the 1970s though. We started seeing it in our products in the late 1990s/early 2000s.

  • @DantalionNl
    @DantalionNl Před 18 dny +71

    An analog clock is not continuous; its movement is a function of the internal gear ratios, and the arm moves in discrete, quantifiable steps.

    • @dalelamm
      @dalelamm Před 18 dny +19

      That's true for some clocks (mechanical with an escapement) but not a 1950's wall clock that runs off a 60 Hz synchronous motor. It's an interesting thought problem though... is an hourglass digital or analog? The answer is digital.

    • @TorutheRedFox
      @TorutheRedFox Před 17 dny +5

      it moves _to_ discrete, quantifiable steps, but it still moves instead of snapping

    • @esunisen3862
      @esunisen3862 Před 16 dny +3

      Gears are not true analog. Use a belt instead.

    • @tripplefives1402
      @tripplefives1402 Před 15 dny +11

      The gears move continuously; gears do not have discrete steps. The tolerances between gear teeth only create inaccuracies, not complete pauses.

    • @thomaslindell5448
      @thomaslindell5448 Před 14 dny

      Just like an emulated sine wave is not a true sine wave - it's stepped and smoothed, but it will never be a pure sine wave because the switching causes steps.
      Digital systems can approximate analog movements but never emulate all possible steps, because it's literally incalculable: you can never fully break an analog signal into a digital signal, since as accuracy goes up the size of each discrete unit goes down, to the point where it is operating at a microscopic level.

  • @offbeatinstruments
    @offbeatinstruments Před 20 dny +18

    I started my career with analogue computers in the 1970s, as they were still being used in industrial automation for motor control (PID: Proportional, Integral and Derivative). I worked in a repair centre and built some test gear to allow me to calibrate them. It's no surprise to me that they have come back; within certain niche applications they are very powerful, although not particularly programmable, unless you count circuit design as programming :-)
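
For readers who have only seen PID as software, here is a minimal discrete-time sketch of the same proportional/integral/derivative idea the analog op-amp circuits computed continuously. The gains, time step and toy motor model below are made-up illustration values, not anything from the comment or the video.

```python
# Minimal discrete-time PID controller, i.e. the digital counterpart of the
# proportional/integral/derivative terms analog op-amp circuits compute
# continuously. Gains and setpoint here are arbitrary illustration values.

def make_pid(kp, ki, kd, dt):
    integral = 0.0
    prev_error = 0.0

    def step(setpoint, measurement):
        nonlocal integral, prev_error
        error = setpoint - measurement
        integral += error * dt                  # analog equivalent: integrator stage
        derivative = (error - prev_error) / dt  # analog equivalent: differentiator stage
        prev_error = error
        return kp * error + ki * integral + kd * derivative

    return step

pid = make_pid(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
speed = 0.0
for _ in range(5):
    drive = pid(setpoint=100.0, measurement=speed)
    speed += 0.02 * drive   # toy first-order motor response
    print(round(speed, 2))
```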

    • @Kartoffelsuppe_m_Wursteinlage
      @Kartoffelsuppe_m_Wursteinlage Před 15 dny

      And there's nothing really new under the sun. I studied computer science in the '80s and we learned analog computing in a similar way. The '80s saw the operational amplifier as an integrated circuit, but that's about all. OK, these days there are pretty fast ADCs (e.g. delta-sigma), so hybrid systems could be faster. The main caveat is the mathematical understanding of the problem. There's no way to just start programming Python and hope to solve the problem with tons of iterations. ;) Math is king here.

  • @BenGrimm977
    @BenGrimm977 Před 20 dny +30

    I'm skeptical about the broad assertion - 'Why the Future of AI & Computers Will Be Analog' - that analog computing will dominate in the future. Analog computing clearly has its niche, particularly with tasks involving continuous, real-world signals, such as audio and visual processing or interpreting sensor data, where this technology presents a clear advantage. However, framing this niche strength as 'the future' and implying a universal superiority over digital computing seems a bit overstated to me.

    • @Controllerhead
      @Controllerhead Před 18 dny +8

      Man, I get the clickbait game, but it's nauseatingly naive and insultingly ignorant to assert that any form of analog computing can match or surpass the real teraflops and petaflops companies like OpenAI / Microsoft Azure are doing on the real machines that make LLMs possible and scalable today. Pardon me whilst I vomit.

    • @dieSpinnt
      @dieSpinnt Před 10 dny

      These are just pointless forecasts and guesses. The journalists are (or act like they are) literally on drugs! On a hype. You can't take them seriously when talking about computer science. They've become the extended arm of the marketing departments and the willing barkers of the pseudo-tech bros. You are right to be skeptical :) Just take that sloppy remark about cryptocurrency and energy consumption at the beginning. Matt didn't even question that some of the use cases are really just circle-wanking and/or money-printing scams that are very bad for the environment AND for the people. Let's call it A.S.: Artificial Stupidity! :) (In contrast to A.I., a thing that actually exists in the real world, hehehe)
      Have a good one, Ben! :)
      Edit: Later in the video, the part about AI (around 15:00): "(for an analog computer:) Rather than ones and zeros the data is retained in the form of voltages." ... which is also true of digital systems and is the basic principle of DRAM, which has to be refreshed because the voltage of the storage element (the capacitor in the (C)MOS cell) drops due to naturally existing leakage. That will be THE SAME for analog computing. This is possibly information from the "Mythic" company that Matt repeats.
      So, I doubt that they do not know the basic physical and electronic principles that make their prototypes tick. This is most likely a straight-up lie to deceive the public or investors. That's what we're up against ... Nice guys, nice industry ... don't you think?! :)))

  • @debrainwasher
    @debrainwasher Před 20 dny +15

    Whenever I design electronics, I often use analog preprocessing, since it takes much less energy to amplify, integrate or filter signals with op-amps (using summing stages, PT1/PT2 models, integrators and differentiators) than to use convolutions or FFTs to build FIR or IIR filters, which need a lot of processing power.
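
A rough sketch of the digital counterparts being compared in the comment above: a PT1 (first-order lag) costs one multiply-accumulate per sample as an IIR step, while an FIR filter done by convolution costs one per tap per sample; the analog version of the PT1 is a single RC/op-amp stage running continuously. Sample rate, time constant and tap count below are arbitrary illustration values.

```python
import numpy as np

# Sketch only: a PT1 (first-order lag) realized as a one-multiply IIR step,
# versus an FIR low-pass realized by convolution.
fs = 1000.0                      # sample rate, Hz (illustrative)
tau = 0.05                       # PT1 time constant, s (illustrative)
alpha = (1.0 / fs) / (tau + 1.0 / fs)

x = np.random.randn(2000)        # some input signal

# PT1 as IIR: one multiply-accumulate per sample
y = np.zeros_like(x)
for n in range(1, len(x)):
    y[n] = y[n - 1] + alpha * (x[n] - y[n - 1])

# Comparable FIR low-pass: taps * samples multiply-accumulates
taps = np.ones(64) / 64.0        # crude 64-tap moving-average FIR
y_fir = np.convolve(x, taps, mode="same")

print("IIR MACs:", len(x), " FIR MACs:", len(x) * len(taps))
```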

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny

      That may be the case but the results of using analog will still be inferior.

    • @myer3511
      @myer3511 Před 18 dny +1

      Every time I hear the word convolution my soul dies a little..

    • @dalelamm
      @dalelamm Před 18 dny +4

      @@rogerphelps9939 Any good engineer knows that the quality of "superior" or "inferior" is multi-dimensional. The writer said that analog used less power, and that is very true most of the time. Analog will be less granular than digital, if accuracy is important. Analog is generally faster, if speed is important. If cost to perform a calculation is important, then digital is a better choice most of the time. What led you to your conclusion of "inferior"?

    • @johnhorner5711
      @johnhorner5711 Před 12 dny +1

      @@rogerphelps9939 Clearly you are not a well-informed engineer. Superior or inferior all depends on the intended purpose and the parameters of interest. Example parameters might include required accuracy, expected lifetime, running costs, maintainability, energy use, environmental consequences, ease of use, human factors and so on.

  • @JohnSostrom
    @JohnSostrom Před 20 dny +28

    When I was in the Navy, the ships I was stationed on had a single gun on the bow. This was a 5"/54 and was operated by an analog computer. If you watch the SyFy movie Battleship you can see this type of system aiming large gun rounds at a given target.
    One of the big advantages of analog computers is that they are not subject to EMP.

    • @esunisen3862
      @esunisen3862 Před 16 dny +7

      Well they are but they recover immediately unless the pulse was strong enough to blow a component.

    • @rockpadstudios
      @rockpadstudios Před 15 dny +5

      @@esunisen3862 exactly - most electronics will just reboot. If you are under the blast there is enough energy to burn out circuits - even analog. I did EMP Testing on the B1-B in the late 80's.

    • @canobenitez
      @canobenitez Před 12 dny

      @@rockpadstudios what about diesel motors

    • @KMCA779
      @KMCA779 Před 8 dny

      @@canobenitez How basic is it? Any diesel car/truck/whatever will still get fried; if it's got digital controls, fried... anything with any electronics will probably just be dead.

  • @VivekYadav-ds8oz
    @VivekYadav-ds8oz Před 15 dny +5

    It's important to understand in the clock example - that even though the clock hand (second hand) was moving discretely (moving only every 1 second, and not continuously), it still is analog. Because it is not necessarily the continuous nature that makes something analog, it's if the physical process happening is _analogous_ to the real event. The twist of the second-hand is directly proportional to how much of a minute has passed. However, in a digital clock, the voltages in the bits (0-5V) are not analogous or directly proportional to the time that has passed.
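
A tiny illustration of the distinction this comment draws, with made-up numbers: the "analog" representation (the hand's angle) is directly proportional to the quantity, while the digital bit pattern encoding the same value is not.

```python
# Illustration of the comment's point: an "analog" representation is
# proportional to the quantity itself, a digital one is not.

elapsed_seconds = 37

# Analog-style representation: the second hand's angle IS the time,
# scaled by a constant (6 degrees per second).
hand_angle_deg = 6.0 * elapsed_seconds          # 222.0, proportional to time

# Digital representation: the same value as bits; no single bit's voltage
# is proportional to the elapsed time.
bits = format(elapsed_seconds, "08b")           # '00100101'

print(hand_angle_deg, bits)
```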

  • @circattle
    @circattle Před 11 dny +4

    The large hall-sized computers you show at the start of the video are actually digital computers. They just used vacuum tubes instead of transistors so took up a lot of space and used a lot of electricity.

  • @BunkerSquirrel
    @BunkerSquirrel Před 11 dny +3

    One of the biggest hurdles to analog ASICs has been the extreme difficulty of designing the IC circuitry. In fact, the most expensive and labor intensive part of a processor has always been the analog circuitry to regulate and supply power, among other things. With the help of AI we’re going to see a massive explosion in this field, since it can easily pull from natural laws, fab constraints and desired outcomes to take much of the guesswork and iteration out of analog ASIC engineering. Literally machines building the next generation of machines.

  • @sebastianalegre7148
    @sebastianalegre7148 Před 20 dny +11

    It's all gonna be weird af crystal megastructures powering supercomputers in the end

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny +4

      Ya dude, I'm seeing crystals and photonic computing as big part of the future.

    • @NeostormXLMAX
      @NeostormXLMAX Před 15 dny +5

      Im going to incorporate this in my sci fi novels

    • @NeostormXLMAX
      @NeostormXLMAX Před 15 dny +4

      Also makes me think of the pylons from star craft

  • @user-wk4ee4bf8g
    @user-wk4ee4bf8g Před 20 dny +6

    Playgrounds in the 80s were amazing. My favorite was the spinning metal platform of death. That was still safer than sharing a seesaw with a betrayer, that thing was coming for your chin or they tried to smash your spine on the ground. Good way to learn which kids couldn't be trusted. Flying off the swings to land onto gravel and dirt and get cool scratches. I was always bleeding a little bit as a kid, a result of good times :)

  • @maxmyzer9172
    @maxmyzer9172 Před 20 dny +5

    6:43 So far I feel like this video isn't about analog vs. digital computing so much as mechanical vs. digital (not even mechanical vs. electronic)... Both digital and analog computers can be built in both electrical and mechanical ways. Hybrid computers are something that will happen, though.

  • @lady_draguliana784
    @lady_draguliana784 Před 20 dny +19

    a general purpose, multi-function device is great, but it'll never be as good at a specific task as a device built to do that ONE task as efficiently and effectively as possible.
    As for analog physics, think of what's faster, pushing a button that activates an actuator that pushes a ball, or just pushing the ball yourself?

  • @RWho-bb9qg
    @RWho-bb9qg Před 18 dny

    As an undergrad student in 1968 at Sydney Uni, I saw a demo of analog computing. The target system was a pendulum, which was modeled in a hand-built electrical analog computer. The output was a voltage. It was configured to run on the same time scale as a physical pendulum. Both the pendulum and a very large voltmeter were set up side by side, and the pendulum and analog computer were set running at the same time. The pendulum and voltmeter needle mirrored each other for perhaps a minute and were still doing so when the lecturer shut them down.
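
For contrast with that demo, here is a rough digital sketch of what the analog machine was doing: numerically stepping the pendulum equation that its op-amp integrators solved continuously. The constants and damping are made-up illustration values, not details from the original demo.

```python
import math

# Rough digital counterpart of the demo described above: the analog machine
# integrated the pendulum equation continuously; here we step it numerically.
# theta'' = -(g/L) * sin(theta), with a little damping so it slowly decays.
g, L, damping = 9.81, 1.0, 0.02
theta, omega = math.radians(20), 0.0   # initial swing of 20 degrees
dt = 0.001

for step in range(5000):               # 5 simulated seconds
    alpha = -(g / L) * math.sin(theta) - damping * omega
    omega += alpha * dt                # analog equivalent: first integrator
    theta += omega * dt                # analog equivalent: second integrator
    if step % 1000 == 0:
        print(f"t={step * dt:4.1f}s  theta={math.degrees(theta):6.2f} deg")
```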

  • @wb8ert
    @wb8ert Před 15 dny +1

    While working for Texas Instruments, I spent six months learning about the SEL 32/55 hybrid (analog and digital) computer. Ultimately, I worked on the digital side as a systems programmer. I left TI when they planned on transferring me to work on the Ada compiler. I wanted to be a VM/370 (BSEPP, SEPP, SP, etc.) systems administrator/programmer. I took a job as the lead systems programmer at the Owens Corning Fiberglas Research Center in Newark, Ohio. I loved the job and working with the IBM user group SHARE.

  • @Hippida
    @Hippida Před 12 dny +11

    Like, the first 75% of this video talks about what analog computers are not. Add some snappy comments and a twist, and you still don't come close to what is promised in the video headline.

  • @Number_Free
    @Number_Free Před 20 dny +15

    I invented an analogue computer myself, back in the 70's. I proposed using the pulse frequency of a rectangular (digital) waveform as the physical variable. Summation would correspond to integration, driving a pulse generator. Differentiation would be via subtraction, again driving the pulse frequency.
    I'm not sure whether it would work or not, but it could be very useful if it does.
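
Just to make the idea tangible as I read it (this is my own toy simulation, not the commenter's actual design): if a variable is encoded as the rate of a pulse train, then counting pulses over time amounts to integrating it.

```python
import random

# Toy simulation of the general idea as I read it (not the commenter's design):
# if a variable is encoded as the *rate* of a pulse train, then simply counting
# pulses over time integrates that variable.
dt = 1e-3            # seconds per tick
rate_hz = 250.0      # the "analog" variable, encoded as pulse frequency
count = 0

for _ in range(2000):                      # simulate 2 seconds
    if random.random() < rate_hz * dt:     # a pulse occurs this tick
        count += 1

# Expected count ≈ rate * time = 250 * 2 = 500, i.e. the integral of the rate.
print("pulses counted:", count)
```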

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny +3

      Oh yeah totally, differentiating the pulse, I totally understand what that means and can see the potential ;)
      Smart people like you confuse me, how are we the same species? :)

    • @GordonSeal
      @GordonSeal Před 20 dny

      Why not build a test model? It could be a big thing if it works.

    • @Number_Free
      @Number_Free Před 20 dny +1

      @@GordonSeal I tried, in Python, but I am not convinced that I got the model correct. It needs to be asynchronous, but wasn't. I'm unable to do much now.
      I also suspect that A/D and D/A components will be required for the pulse generator interface, to ensure the asynchronous nature of the entire system.
      After further thought, the best approach may simply be to charge or discharge a capacitor via the pulse charges, to control a pulse generator.
      That may actually work, but may not offer any advantages.

  • @retirednobaddays456
    @retirednobaddays456 Před 12 dny +1

    In 1971 analog computing was one of the engineering classes I was required to take to get an engineering degree. Generally the process was similar to what you show at 10:27 in your video. It was very helpful when looking at dynamic systems. You programmed the system by turning dials and moving jumper wires, so there was no easy way to store your program and reprogram the system. To create a practical analog computer you would need to develop digitally controllable capacitors, resistors and transistors.

  • @Softskills-hiddenPersonalPower

    Hi Matt. I asked myself the same analog-or-digital question when I met with the Boeing engineers in 1967. I had been hired to go to the USA because they believed me to be the guy with answers. They showed me their best analogue computer and then the fastest scientific digital computer available at that time (an IBM 360/65). I wrote a few algebraic/calculus equations (using Fourier analysis) based on the physics of the natural world, plus a few mathematical modelling scenarios. I tried to run them on the analogue system first, because the natural world we live in is not black or white, on or off. Our world is naturally a continuous analogue, with a morphing of possibilities like transitions in a graphic as you shade from one color to another.
    It turned out that one set of equations generated an even more complex set of equations that were beyond the understanding of even the best theoretical mathematicians.
    When I used Fortran (and matrix theory, like upper Hessenberg forms) on a digital machine, I could at least see a way forward to modelling the actual physical world, even if it was an approximation. By using "quad precision/4 word" floating point variables with around 32 decimal digits, I could get close enough for the Boeing aircraft designers and space scientists to get the answers they needed.
    Digital computing is an approximation, and faster computing has meant we can get some kind of results faster. However, it is not a true representation of physical reality, and especially not of how the neural networks within our human minds work to solve physical problems.
    So to come back to the question, analogue or digital: the answer is eventually analogue, but certainly not with the current elemental and chip-based technology. It will happen when we develop "organic computing". There are some early signs of research into the cellular structure of neurons and the associated storage of information in simulated brain cells and memory. But we are at the early stages, and I am personally doubtful whether it is a scientific journey we should take at this time of such cultural ignorance.
    So as you ponder what AI is, you should perhaps consider the nature of human intelligence at all, and the glory of what it is to have a mind generations ahead of, and already infinitely more capable than, any we are likely to fabricate in this so-called "modern world". Party on.

  • @BLH130
    @BLH130 Před 16 dny +6

    As one who has programmed and used analog computers in the past, I can only say "I told you so." They are the most powerful computers in the world. It took a while, but I knew they would be back. Welcome back, my old friends. Do great work.

  • @ericstromquist9458
    @ericstromquist9458 Před 20 dny +38

    I think analog has a great future in large language models and other AI, which is fortunate because generative AI is currently the fastest growing use of digital computation and the energy it requires. The evidence for this is that LLMs continue to function even when the weights and computations are quantized down to, say, 4 bits of precision for models designed to run on small systems. Four bits corresponds to a need for voltage accuracy of only 6.25% in an analog implementation, which should be easy to achieve. I don’t know how easy it would be to implement algorithms like the back propagation, which is used in training, in an analog form, but my guess is that it shouldn’t be difficult.
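
A small sketch of where the 6.25% figure in the comment comes from and what 4-bit quantization roughly looks like; the weights and the symmetric rounding scheme below are made up for illustration and are not any particular model's quantization method.

```python
import numpy as np

# Where the 6.25% comes from: 4 bits = 16 levels, so one level spans
# 1/16 = 6.25% of the full-scale range. Below, a toy symmetric 4-bit
# quantizer applied to some made-up weights.
levels = 2 ** 4
print("step as fraction of full scale:", 1 / levels)   # 0.0625

weights = np.random.uniform(-1.0, 1.0, size=8)
step = 2.0 / levels                                     # range [-1, 1] split into 16 bins
quantized = np.round(weights / step) * step

print(np.round(weights, 3))
print(np.round(quantized, 3))
print("max error:", np.max(np.abs(weights - quantized)))  # at most half a step
```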

    • @ahmetmutlu348
      @ahmetmutlu348 Před 20 dny

      In the future AI will have to reduce and simplify its main code :D otherwise it will start to become confused and then act like old people under the influence of dementia .. just a guess :D if there is any code at least :P An LLM is a complex structure, and in lots of models there is no actual code but instead a database used as a kind of script - which is still code, but its logical meaning can't be converted to machine code directly... well, that's what LLMs are indirectly designed to do :D Or maybe they can translate it to standard code themselves before it becomes impossible to translate/simplify :D

    • @darlokt51
      @darlokt51 Před 20 dny +8

      Sadly, that's exactly where analogue processors fail. Analogue circuits are way less scalable than transistors; shrinking an analogue chip is incredibly difficult (see Apple's modem division). Also, since AI is at its core a hugely repetitive series of operations, it's sensitive to accumulated errors, as introduced by an analogue chip. Mythic, as far as I know, gave up on their big AI dreams and is focusing on smaller applications, some still AI-adjacent, that are not as impacted by the problems of analogue error. Analogue has its place, but general analogue computing is very far away, if ever reached. ASICs are the digital answer to analogue: still digital, with digital's benefits in scaling, without the harrowing complexity of analogue, and with the efficiency benefits of fixed-function hardware. Analogue chip design is the dream, but as the laws of scaling punish the transistor, they hit analogue many times over, while the math of the chain rule pummels it to the ground in most cases.

    • @rogerphelps9939
      @rogerphelps9939 Před 18 dny +2

      Digital still wins. No need to worry about bias currents and offsets or component tolerances, all of which scupper anything but the most trivial of applications.

    • @ericstromquist9458
      @ericstromquist9458 Před 17 dny +1

      @@darlokt51 I see your point that backprop may not be able to be quantized down to as little as 4 bits because of the chain rule. But the real reason I'm back in this thread is to flag a public lecture I just watched by Geoffrey Hinton (who anyone involved in AI will immediately recognize), posted a month ago on the University of Oxford YouTube channel. Starting at minute 27 he talks about digital vs. analog implementations for almost ten minutes - his conclusion being that digital is better for future AI despite the potential advantages of analog.

    • @spark5558
      @spark5558 Před 15 dny +1

      @@darlokt51 Another problem is translating analogue over to digital as that requires bandwidth and processing to do

  • @danchadwick1495
    @danchadwick1495 Před 11 dny +2

    It is a truism, "you can't keep reducing the size of components forever."
    ... however, with the ever increasing speeds, I suspect you can upgrade the materials for those components... rather than silicon (semiconductor), use silver at the speed and temperature at which it acts like a semiconductor...
    Or switch to magnetic processing to spin electrons around the component rather than through them.
    Also, a conformal coating of diamond (vapor deposit) would go a long way to ejecting unwanted heat.

  • @JH-pe3ro
    @JH-pe3ro Před 19 dny +2

    A good complementary development to the modern analog computer is the ongoing "second age of microcomputers", which has had a watershed moment lately through gradual consumerization in custom PCB design, low-cost microcontrollers and FPGAs, making it possible for hobbyists to manufacture dream machines deviating from the mainstream PC architecture. There is even a project called "Tiny Tapeout" that allows you to publish an ASIC design and get it anthologized with hundreds of other hobby designs on one chip.
    There's a whole menagerie of retrocomputer-styled designs being made using these components, from the little ESP32-VGA, Agon Light and Neo6502 to more ambitious FPGA boards like the C256 Foenix and Commander X16. At first glance, they're nostalgic toys, but they signal a further decoupling of personal computing from industrial applications, an important counterbalance in an era where everyone complains about how much our devices and data are not really under our control.

  • @nitehawk86
    @nitehawk86 Před 14 dny +11

    I wish there was a betting pool for every time Matt Ferrell was proven incorrect because he believed some marketing hype. I would be a billionaire.

  • @BigTimeRushFan2112
    @BigTimeRushFan2112 Před 20 dny +3

    Your quick shout out to Kiffness video Hold onto my fur (I like it) was awesome! More cat video references please!

  • @SG_01
    @SG_01 Před 20 dny +1

    I think as part of the digital computers portion, the sequential bit CAN be eliminated if needed. We actually do this in a number of cases, and it's called hardware acceleration. It requires making a physical piece of hardware that can do the calculation in one operation that would normally take multiple steps. For analogue computing this hardware acceleration is in many cases a mandatory step.

  • @robinpettit7827
    @robinpettit7827 Před 19 dny

    I agree. I had almost thought of starting a program myself to create a programmable analog neural network, due to the development of memristors and other analog components.

  • @Ynhockey
    @Ynhockey Před 16 dny +4

    Analog computing certainly has its uses, but not even hybrid devices are going to replace the main use cases of supercomputers. For one, measurement is only as precise as the measurement device, which is basically always discrete. For example, if it can measure mass in nanograms, then any difference smaller than that cannot be measured. More importantly, analog computers are task-specific machines. Making task-specific digital chips has historically been incredibly expensive, but that's changing now, with Intel, TSMC, Tower and others making custom chips. Custom digital chips built to solve specific problems can also save 95% and more of the energy compared to general-purpose digital computers - it depends on the problem.

  • @chalermako
    @chalermako Před 14 dny +1

    There is so much analogue in industry. Instrumentation devices send 4-20 mA signals, and control valves are driven by 4-20 mA outputs sent by PLCs. I'm sure the cooling in data centres is controlled by analogue valves and PID.
    Most electrical networks still use analogue measuring transformers, and generator excitation and other complex feedback loops use PID. Full digitalisation has only occurred in cyber and the internet.
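
For anyone unfamiliar with 4-20 mA loops, here is a small sketch of how such a signal is typically scaled to an engineering value, with the "live zero" below 4 mA usable for fault detection; the 0-10 bar range and fault threshold are illustrative, not from the comment.

```python
# Sketch of how a 4-20 mA instrumentation signal (mentioned above) maps onto an
# engineering range. The 0-10 bar range and fault threshold are illustrative.

def scale_4_20ma(current_ma: float, lo: float, hi: float) -> float:
    """Linearly map 4-20 mA onto [lo, hi]; below ~3.6 mA treat as a broken loop."""
    if current_ma < 3.6:
        raise ValueError("loop fault: current below live zero")
    return lo + (current_ma - 4.0) * (hi - lo) / 16.0

print(scale_4_20ma(4.0, 0.0, 10.0))    # 0.0 bar
print(scale_4_20ma(12.0, 0.0, 10.0))   # 5.0 bar (midscale)
print(scale_4_20ma(20.0, 0.0, 10.0))   # 10.0 bar
```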

  • @4unkb0y
    @4unkb0y Před 17 dny +1

    Thanks Matt, this is a great follow-up to the Veritasium video on Mythic.

  • @PalimpsestProd
    @PalimpsestProd Před 20 dny +6

    Large analogue system's will probably be better at predicting / controlling chaotic events. The whole point of "expert systems" is to digitize more analogue like actions and observations.

    • @pesqair
      @pesqair Před 20 dny

      why are you using an apostrophe to make words plural?

    • @PalimpsestProd
      @PalimpsestProd Před 20 dny

      @@pesqair oops, edited. thanks.

  • @benmcreynolds8581
    @benmcreynolds8581 Před 20 dny +8

    I was most fascinated by the analog chip's ability to run really effective and efficient image recognition. *BTW, did you hear about the lost Disney sodium vapor method, and how it's been rediscovered? The Corridor Crew YT channel just came out with a video where they collabed with a guy who figured out how to do this lost art of filming. It's actually a game-changing technique, and I recommend looking into it. It means you don't have to stress and mess with all the green screen effects only to have them still look off. I had no idea how fascinating this lost art was, or the effect it had on some amazing classic movies that had such a great, unique look that modern movies just don't produce. I'm really hoping we see this method skyrocket in use throughout the film industry, along with a return of 2D animation and other forms of creative media.

    • @esteban8681
      @esteban8681 Před 20 dny +1

      I watched that over the weekend. It was a great breakdown. It's crazy how simple the tech was back then and how much more efficient it is in comparison to today's tech.

    • @Tsudico
      @Tsudico Před 20 dny

      I hear that sodium vapor has its own unique issues like requiring a very controlled environment (which you can get a sense of in the CC video). It may be better to use the knowledge of how sodium vapor achieved its effect while using a different frequency that may not have the same issue although there are limits to which frequency of light we can choose. I hear IR doesn't work due to objects giving off heat and I'm not sure it would be safe for anyone to stand in front of UV lighting for hours on end although it may allow using sun lit environments.

    • @ML-dk7bf
      @ML-dk7bf Před 20 dny

      @@Tsudico They could probably recreate the effect with LED's programmed to output the color at that wavelength. Add in some basic software and light sensors to adjust the light to the room, and bam there's your solution.

    • @Tsudico
      @Tsudico Před 20 dny

      @@ML-dk7bf One of the benefits of using other options like green/blue screen is that it can be used in natural light or outdoor environments. From what I've seen that doesn't work with sodium vapor which was what I was indicating. There might be alternative wavelengths of light that occur in natural light but that aren't used for film that could provide the benefits of the sodium vapor process in a greater variety of lighting conditions but only testing would know. Otherwise the sodium vapor technique, no matter which wavelength of light, just might be limited to specific use cases.

  • @jeffreysmith2223
    @jeffreysmith2223 Před 12 dny +1

    Although I'm now retired, I well remember studying analog computers in the '70s and having to perform the math in order to correctly program the system. It is true that the accuracy of an analog computer is far greater than digital; chip/computer designers elected to choose speed and convenience over accuracy. It is only today that society is beginning to appreciate the advantages of analog systems; a great example is the rediscovery of music from turntables and albums. There are many aficionados and music lovers, like me, who still enjoy the sound of an analog album far more than its digital counterpart, regardless of the digital format used to play back the digitized performance. I'm typing this on a regular laptop, and I recognize the convenience of it; I'm just unconvinced that it's the be-all and end-all. I believe there is a huge future market for analog computers/circuits that has been awaiting the need for greater accuracy and dramatically reduced power availability - measuring the health of bridges comes to mind: thousands of sensors reporting on stresses within individual components, all done in real time. At any rate, thanks for this insightful video; as always it was thought-provoking and time well spent!

  • @jaredeiesland
    @jaredeiesland Před 20 dny +2

    Dallas OR and The Dalles OR are about 150 miles from each other.
    One is rain forest and the other a desert, that's a pretty huge distinction when talking about water usage.

    • @esunisen3862
      @esunisen3862 Před 16 dny +1

      It doesn't matter anyway, the water is not destroyed.

  • @dloman77
    @dloman77 Před 20 dny +9

    So how does this analog revival mesh with the emergence of quantum computing? I'd love to see a follow on episode talking about this. Love your content!

    • @thewiirocks
      @thewiirocks Před 20 dny +1

      I've often felt that Quantum Computing was just a really fancy method of performing analog computations. Yes, you get more direct control over effects like quantum entanglement, but that also comes with the downside of making the computer hard to scale. I expect we'll have powerful analog components in wide usage long before the GA quantum computer is cracked.

    • @TheMrCougarful
      @TheMrCougarful Před 20 dny

      Quantum computing is about making digital act more like analog.

  • @dennistucker1153
    @dennistucker1153 Před 20 dny +4

    @Matt, very good video. FYI: standard digital microprocessors actually operate in an analog world. The only real differences are comparators that separate signals into high and low voltages (1s and 0s) and storage as a high or low voltage. We assign meaning to patterns of these stored voltages, like the ASCII table.

    • @GregConquest
      @GregConquest Před 19 dny

      This sounds analogous to a CD or DVD being digital, with each pit being open or solid (reflective), but each pit is actually a physical area with a hole either blasted out by a laser or not. That hole (or solid place) is quite clearly analog. It can be measured, and its size and placement vary by nanometers, but the digitization of the reading laser changes all those 0.000002, 0.000034, 0.000001 reflectivity measurements to 0, and the 0.9999997, 0.999999, and 1.0000002 measurements to 1.
      I guess they never get far at all from the 0 or 1. There will never be a .1 or .9, much less a .5, but with analog, the data/signal can be anywhere between 0 and 1, and it still gets processed by the next operator in the chain.

  • @urnoob5528
    @urnoob5528 Před 13 dny +2

    Analog clocks actually also move in steps.
    They are controlled by a crystal oscillator, much the same as a digital clock.
    And even a pure analog device will have a resolution (a minimum step) determined by our instruments, designs and components.
    There is no infinity; that's only in theory.
    Still, the range is going to be much larger than any digital resolution.

    • @piggyware
      @piggyware Před 9 dny

      interesting notion, but would you say I can move my hand analog?

  • @waylonk2453
    @waylonk2453 Před 14 dny

    Dang, what a neat topic! This is my favorite video of yours I've seen in a while. I'm excited to see which components of compute are better suited to analog hardware. The power savings sure are enticing for applications where one thing needs to do a very specific job for a long time.

  • @justinsorenson9170
    @justinsorenson9170 Před 20 dny +3

    Thank you for making this video!

  • @jks234
    @jks234 Před 20 dny +16

    I personally view analog computing as simply "unprotected digital computing".
    Correct me if I'm wrong.
    Digital computing is essentially gating electricity such that you can move electricity along logic corridors to create predictable outcomes by compressing a whole range of analog signals into 2 (zero and one).
    The compression is an engineering solution that allows us to concern ourselves much less with signal clarity and correction, and deal instead in these coarse bits.
    Of course, the compression and gating also is a huge waste of the actual electrical signal. The analogy is like trying to watch TV through a white blanket sheet.
    Instead of the high fidelity of the electrical signal itself, you get 0 and 1.
    Taking off the protective layer provides us with a ton of untapped signal to play with... but it also comes with its own share of hairy issues. Noise and such.
    Signal correction?
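
A toy illustration of the "compression" described above, with made-up voltages: a noisy analog level is snapped to one of two logic values by a threshold, which is exactly where the fine-grained analog information is discarded in exchange for noise immunity.

```python
import random

# Toy illustration of the "compression" described above: a noisy analog voltage
# is snapped to one of two logic levels, and everything in between is thrown away.
VDD = 1.0
THRESHOLD = 0.5 * VDD

def to_bit(analog_voltage: float) -> int:
    return 1 if analog_voltage > THRESHOLD else 0

# An ideal 0.8 V "high" plus noise still reads as a clean 1 almost every time;
# that robustness is what the fine-grained analog information is traded for.
samples = [0.8 + random.gauss(0, 0.05) for _ in range(10)]
print([round(v, 3) for v in samples])
print([to_bit(v) for v in samples])
```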

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny +2

      Don't know, just commenting to boost your comment so someone smart notices it and has a chat with you about it, so I can glean some dumb-people nuggets from the chat :)

    • @olhoTron
      @olhoTron Před 20 dny +3

      Some semiconductor engineer (don't remember his name) said that the advances in semiconductor technology can be boiled down to discovering new techniques for making the circuits run at higher frequencies while preventing them from falling back into the analog domain.

    • @fluffy_tail4365
      @fluffy_tail4365 Před 20 dny +4

      Not really. The difference between digital and analog in terms of computing is that in digital computers you use symbolic representations composed of discrete bits, transformed during discrete clock cycles (which in math roughly maps to the realm of integers, sequences, and so on), while in theory an analogue system expresses values on a full continuous spectrum and operates on them in a continuous way (the real number line, differential equations, and so on). What you're describing is that we implement digital logic using what is an analogue system (voltages that gate transistors), so we discretize values (binary, ternary), make multiple parallel channels (bits), and clock the system. You could, however, implement a digital computer in any system that has finite discrete states; it's just that transistors are very fast, cheap, and scalable, and actually do not consume any power when fully on or off (which is the ideal state of a digital computer).
      In general, digital gives you the best precision but also the highest versatility, because symbolic calculations can represent anything you want to a given precision. Analog systems need to be tailored to every specific problem by rewiring their underlying system to represent the problem (or just being a fixed accelerator), and while they have advantages in calculating certain problems (those that describe physical systems) efficiently, they are not efficient for other problems (the ones that map to discrete quantities). In fact, while we know that every Turing-complete digital system is equivalent, there have also been studies showing that you can map any analogue problem to a digital version and vice versa... it just says nothing about the complexity of the transformed calculation.
      TL;DR: you're confusing the physical layer with the conceptual layer a bit, but indeed, since voltages are an analogue quantity, when you restrict them to representing discrete values you lose something the analogue system could do very naturally and efficiently. That is a quirk of the specific system, though.

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny

      @@fluffy_tail4365 The physical reworking issue got me wondering about modular analog systems that can be quickly transformed into different forms. Seems like a digital system could manage the transformation of analog systems into whatever form suited the need. I am obviously out of my depth here, just an idea.

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny

      @@fluffy_tail4365 Like the way Alphafold figured out protein folding. A digital system can be used to alter a physical form of coding information that does stuff in the real world.
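
    A minimal Python sketch of the "compression" idea from the top comment (all voltages, thresholds, and noise levels here are made-up illustrative numbers, not anything from the video): a comparator re-snaps each noisy voltage back to a clean 0 or 1, so noise never accumulates, while a raw analog copy drifts further from the original at every stage.

    # Sketch: why compressing analog voltages into 0/1 buys noise immunity.
    # All numbers are illustrative assumptions.
    import random

    V_HIGH, V_LOW = 1.0, 0.0      # nominal logic levels
    THRESHOLD = 0.5               # comparator decision point

    def comparator(v):
        """Restore a noisy voltage to a clean logic level."""
        return V_HIGH if v >= THRESHOLD else V_LOW

    def add_noise(v, sigma=0.1):
        return v + random.gauss(0.0, sigma)

    random.seed(1)
    bits = [1, 0, 1, 1, 0, 0, 1, 0]
    digital = [V_HIGH if b else V_LOW for b in bits]
    analog = digital[:]

    # Pass both copies through 20 noisy "stages"; only the digital copy
    # gets re-thresholded after each one.
    for _ in range(20):
        digital = [comparator(add_noise(v)) for v in digital]  # noise wiped out
        analog = [add_noise(v) for v in analog]                # noise accumulates

    print("original bits          :", bits)
    print("digital after 20 stages:", [int(v) for v in digital])
    print("analog after 20 stages :", [round(v, 2) for v in analog])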

  • @steveschritz1823
    @steveschritz1823 Před 19 dny +1

    Saw a video about a battleship targeting computer from WWII. You’d set the distance, wind speed, etc. It would also consider the speed and tilt of the ship. It was 100% mechanical with all kinds of cams and linkages to calculate the proper firing solution.

  • @ZonamaPrime
    @ZonamaPrime Před 18 dny

    Best playground accident memory: 5th grade, I went up one of those yellow enclosed tube slides, and my (bigger than me) friend Jackson was coming down like a Power Ranger. His foot connected with my nose. It didn't hurt so much as sting and then go numb; the main issue was the blood pouring out, and I had to end recess right there and go to the nurse. I saw my sister's class on the way (they were lined up next to the wall for music class) and I waved to her, completely forgetting how much blood I was coated in...

  • @ligerstripe99
    @ligerstripe99 Před 14 dny +7

    This video takes the cake for the most buzzwords and the least substance from a video that isn't solely about "AI".

  • @ChrisBullock1978
    @ChrisBullock1978 Před 20 dny +3

    Funny, look at our bodies: they are made of multiple subsystems. Why not combine both digital and analog?

  • @smallduck1001001
    @smallduck1001001 Před 19 dny

    For impact on the world of processors, more than analog computing I'm looking at backside power delivery, which finally separates a chip's power circuits from its computing circuits, and more advanced designs like the Mill CPU (still coming after >10 years), which re-attempts VLIW and promises GPU/DSP-class efficiency brought to a general-purpose instruction set.

  • @pachot1525
    @pachot1525 Před 16 dny

    This material deserves a preceding video just explaining the key concepts behind the final message; it is not easy to capture the essence of analog computers. Anyway, today I discovered your site. Good, insightful material. Thanks for sharing this.

  • @ExiledPrince-us5zr
    @ExiledPrince-us5zr Před 20 dny +4

    I don't mind the idea of combining the best of digital and analog computing, but it feels a bit short-sighted. The future isn't either one but bio-chemical computing, where we don't use 1s and 0s but a breadth of options, in the same way our brains work.

    • @mpart_woodlathe-stuff
      @mpart_woodlathe-stuff Před 20 dny +2

      Some things to keep in mind... an individual neuron can operate at up to ~1 kHz, and most systems (muscles, heart, etc.) behave as chaotic systems. Side effects include reaction time, massive parallelism, synchronization, and de-synchronization. Keep in mind it takes years and decades to learn how to use our brain.

    • @user-wk4ee4bf8g
      @user-wk4ee4bf8g Před 20 dny

      I wonder a lot about the potential of photonic computing. Each frequency of light is its own data stream, so one beam can hold many streams, super cool. I suspect you are correct in thinking the overall structure of computing has to change. Just a gut feeling, I have no expertise in this stuff.
      Copying living systems sounds chaotic, harder to make specific and purposeful. I'm seeing crystals and light computers, take the idea of structure to higher levels. No reason we couldn't fuse photonic and biochemical computing together, perhaps like some sort of future copy of the brain hemispheres. Seems like we want to create gods to rule over us. Makes sense, our entire history is filled with myths of beyond-human powers, we're just fulfilling the prophecies we wrote. After we experience real gods, we'll probably get over the whole idea and wish we could go back to them being imaginary.

  • @BlueUltima9999
    @BlueUltima9999 Před 20 dny +4

    Honestly, the idea of marrying analog sensors, which can detect change without needing constant direct power, with an always-on digital hub of information just made me think we're not far from cyborg/android bodies. The digital mind, requiring the more immediate power source (similar to a regular brain), would be able to read feedback from sensor points that don't require as much power until triggers occur that activate them. Maybe I'm thinking too far out on it, but this seems closer than it has in the past.
    If analog memory can be created, where previous outputs can be retained at significantly lower energy cost rather than constantly having to be regenerated, then yeah, I can see prior referenceable memory becoming a real thing. Maybe then Assembly Intelligence (what I call AI lately) can start to store past "experiences", and then we're not far off from actual AI (Artificial Intelligence).
    Interesting but chilling stuff.

  • @ericschumann4213
    @ericschumann4213 Před 16 dny +1

    Be very careful with the desuperheater on your home HVAC system. These systems use a simple heat exchanger to transfer the heat from the heat pump working fluid to the domestic hot water, but the catch is that this happens at high temperatures (duh). If you have hard water (many, if not most, municipal water supplies are hard water to varying degrees), the heat exchanger will quickly cause the minerals to plate out of the water onto the heat exchanger surfaces. The result is that the heat exchanger becomes less efficient over time and will eventually clog up completely. To combat this, you would need to add a water softener with a high enough capacity to feed the hot water supply. Such a water softener is very expensive to buy and maintain. The alternative is to have your heat exchanger cleaned out every year, which is time-consuming and messy. To be fair, electric and gas water heaters also have this problem with their heating elements, but electric hot water heaters have cheap and easily replaceable heating elements, and gas water heaters typically heat the outside of the tank directly, so there is nothing that can really clog up.

  • @user990077
    @user990077 Před dnem

    You mentioned using analog computers to calculate heat flow in objects. I remember using nonlinear partial differential equations to model heat flow in a metal bar from a point source. Bessel functions were used in the process of calculating the heat flow.

  • @toddhill1465
    @toddhill1465 Před 19 dny

    In the spirit of AI and computers using lots of power, I asked Microsoft Copilot the following questions:
    1) Do volcanoes exist in Antarctica?
    2) Do thermal pools and/or geysers exist in Antarctica?
    3) Given this information, is it possible to put a very large scale geothermal power plant in Antarctica?
    4) Assuming that a very large-scale geothermal power plant is established in Antarctica, would it then be feasible to put a very large scale data center in Antarctica to take advantage of the year-round low temperatures and the local power plant?
    The answers it gave were exceedingly optimistic, and each answer did come with pros and cons. Nothing concrete, however it is thought provoking and fantastical. I do think we need to acknowledge that we are ignoring the potential that is locked in the ice of the forgotten continent. Could it become an AI haven? Especially if we manage to get a lot of high-speed fiber run to where we'd need it there.

  • @VitezGonye
    @VitezGonye Před 20 dny +2

    9:52 That's not truly an analog clock either; it only shows the time every second. As in the temperature example at the start, if it were analog it should be able to show fractions of a second too and change continuously, but it only moves in "bits" (seconds).

  • @kobester03
    @kobester03 Před 19 dny

    This brought to mind how, earlier this year, researchers were starting to look at simultaneous and heterogeneous multithreading (SHMT), which showed about a 1.96× speed increase while using 15% less energy.

  • @nilsdock
    @nilsdock Před 15 dny

    What we need is a programming language to create digital blueprints of analog circuits.
    Let's say I've got a specific task that an analog computer would be great for, but I only have the skill set of a digital programmer.
    I would write software that describes the function of the analog computer, then a compiler would run its algorithm to model my code as a circuit. Then I could simulate the operation of the analog chip using modern digital components, and I would tweak my code until it performs as intended. Then I would send it off to a manufacturer (either a simple one that creates PCBs or a more advanced one that makes silicon-wafer chips). Then I get it in the mail and it works (hopefully).
    This kind of language would enable more people to understand what analog computing is and what it could do.
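
    A toy Python sketch of roughly what that flow could look like (the "netlist" format, block names, and numbers are invented here for illustration; this is not an existing tool): describe an analog patch as data, then step it digitally to check the behaviour before committing to hardware. The patch below is the classic one-integrator circuit for dy/dt = -k*y.

    # Toy sketch: describe an analog patch in code, then simulate it digitally.
    # Everything here is invented for illustration; it is not a real analog HDL.
    import math

    patch = [
        {"block": "gain",       "k": -2.0, "in": "y",    "out": "dydt"},
        {"block": "integrator", "ic": 1.0, "in": "dydt", "out": "y"},
    ]

    def simulate(patch, t_end, dt=1e-3):
        # signal values and integrator states, keyed by net name
        nets = {b["out"]: b.get("ic", 0.0) for b in patch}
        for _ in range(int(t_end / dt)):
            for b in patch:
                if b["block"] == "gain":
                    nets[b["out"]] = b["k"] * nets[b["in"]]
                elif b["block"] == "integrator":
                    nets[b["out"]] += nets[b["in"]] * dt   # Euler step
        return nets

    result = simulate(patch, t_end=1.0)
    print("simulated y(1):", round(result["y"], 4))
    print("exact     y(1):", round(math.exp(-2.0), 4))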

  • @tjrowe
    @tjrowe Před 20 dny

    The videos on this channel are so polished and interesting. Well done.

  • @ModiPooja
    @ModiPooja Před 5 dny

    I work within the energy sector and have been informed that a single data centre catering for AI alone is expected to consume as much power as two US cities. I can't recall which cities they were. This alone leads me to appreciate the idea of shifting away from digital technologies and returning to analog systems.

  • @Flameboar
    @Flameboar Před 12 dny

    We are approaching the limit of digital computing built on a silicon substrate using MOS transistors. Heat generation is a limiting factor. Solutions might be improved cooling, materials with lower resistance, and shifting energy-intensive functionality to analog. The first two are being implemented, so hybrid computing may be the next step.

  • @seanreynoldscs
    @seanreynoldscs Před 19 dny

    I suspect that we could use analog computing for machine learning tasks, because a voltage can stand in for a floating-point number and the operations in ML could be performed directly by voltage-manipulation circuitry.
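
    A rough sketch of how that might look, simulated digitally (weights, inputs, and the noise level are arbitrary illustrative values): store each weight as a conductance G, apply each input as a voltage V, let Ohm's law (I = G*V) do the multiplies, and let the summing of currents on a shared column wire do the adds.

    # Sketch: a digital simulation of an analog multiply-accumulate crossbar.
    # Numbers below are arbitrary illustrative values.
    import random

    random.seed(0)
    rows, cols = 4, 3
    G = [[random.uniform(0.0, 1.0) for _ in range(cols)] for _ in range(rows)]  # "weights"
    V = [0.2, 0.8, 0.5, 0.1]                                                    # "inputs"

    def analog_mac(G, V, noise=0.0):
        """Column currents = sum over rows of G[r][c] * V[r] (+ optional device noise)."""
        out = []
        for c in range(len(G[0])):
            i = sum(G[r][c] * V[r] for r in range(len(V)))
            out.append(i + random.gauss(0.0, noise))
        return out

    ideal = analog_mac(G, V)                 # what a perfect digital MAC would give
    noisy = analog_mac(G, V, noise=0.01)     # what a real analog array might give
    print("ideal :", [round(x, 3) for x in ideal])
    print("noisy :", [round(x, 3) for x in noisy])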

  • @antonsrandomstuff
    @antonsrandomstuff Před 20 dny

    Cheaper SSDs with multiple bits per cell are analog devices, right now in our current computers. The error rates are very high, but they include a lot of recovery info and decode a digital signal from what are basically analog readings of the cells' charge state.
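
    A toy sketch of that decoding step (the levels, thresholds, and noise figure are made-up numbers, and the bit mapping is simplified; real flash adds Gray coding and heavy ECC on top): write one of four nominal voltages per cell, read it back with noise, and recover the 2-bit value by counting threshold crossings.

    # Sketch: decoding a 2-bit-per-cell flash read from an essentially analog voltage.
    # Levels, thresholds, and noise are made-up numbers for illustration only.
    import random

    LEVELS = [0.0, 1.0, 2.0, 3.0]        # nominal cell voltages for the four 2-bit values
    THRESHOLDS = [0.5, 1.5, 2.5]         # read comparator trip points

    def write(symbol):                   # symbol is an int 0..3
        return LEVELS[symbol]

    def read(voltage):
        return sum(1 for t in THRESHOLDS if voltage > t)   # count thresholds crossed

    random.seed(3)
    data = [random.randrange(4) for _ in range(16)]
    cells = [write(s) for s in data]
    noisy = [v + random.gauss(0.0, 0.25) for v in cells]   # charge leakage / read noise
    decoded = [read(v) for v in noisy]

    errors = sum(1 for a, b in zip(data, decoded) if a != b)
    print("written :", data)
    print("decoded :", decoded)
    print("raw symbol errors:", errors, "(this is why SSDs wrap the cells in ECC)")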

  • @nicholaspetit4313
    @nicholaspetit4313 Před 20 dny

    The auto industry used to use breadboards with resistors/capacitors to model the ride frequency of different suspension setups.

  • @cyberpunk.386
    @cyberpunk.386 Před 14 dny

    What I like about your videos, Matt, is that they're so very well explained and well spoken.

  • @MysteriousSoulreaper
    @MysteriousSoulreaper Před 20 dny +1

    I'm curious what the rare-earth demands of analog computers are vs. their digital counterparts. Though any reduction in energy use is a net reduction in rare-earth use, because it means fewer solar panels are required.

  • @jaqhass
    @jaqhass Před 13 dny

    If you set Surfshark's kill switch to strict, the kill switch will only work after it's connected once, which means you either never turn off the PC or update Surfshark, or you need a router with Surfshark built in, in addition to your other devices.

  • @ransonhall4834
    @ransonhall4834 Před 20 dny +1

    I think the analogy between analog and digital could be extended, similarly to interpretations of quantum theory: there are empirically equivalent interpretations built on discrete versus non-discrete mathematical models.
    Now I will go as far as to say that analog is analogous to nature. While we may be able to mathematically model certain features discretely, they always fail to take into account features “outside” of their defined system. This is why we can theorize with mathematics about something 50 years before we can prove it with physics. It also demonstrates the systematic way in which science goes about discovering versus how humans or conscious agents do.

  • @jerryodell1168
    @jerryodell1168 Před 20 dny +1

    A hybrid digital/analog system seems like the next logical step, with small incremental changes working toward the best design. That way it gets there without causing major headaches: put the parts that already work well together now, then find the right ways to make the more difficult pieces work later.

  • @jed-henrywitkowski6470

    In one of its iterations, the mining company that has held a presence in my state longer than it's been a state used a beautifully made European mechanical computer, about the size of a three-drawer dresser, to calculate payroll in one particular city. It was made around 1904.

  • @kenschestok9515
    @kenschestok9515 Před 15 dny

    Interesting video! Somewhat oddly, a plumber who also happens to have a BFA in sculpture and only started playing with modular synths a handful of years ago ends up on a video about analog computing and kind of understands a lot of what you're talking about, from my experience with modular synths and my art background. I was even intrigued by that old water-run computer... forgot where I was going with this... ??? Thumbs up!!

  • @trojnara
    @trojnara Před 11 dny +1

    The fun fact is that both clocks were digital, not analog. You could argue that the mechanical oscillator of a traditional clock is analog, but the quartz oscillator of the electronic clock is also analog. All those gears constitute digital (though not binary) dividers. The display only seems analog because the frequency of the mechanical oscillator may be too high and the movements may be too tiny to observe.
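
    A tiny sketch of the divider point (assuming the common 32768 Hz watch-crystal frequency, which is 2^15): fifteen divide-by-two stages, whether flip-flops or gear ratios, turn the oscillator into the one pulse per second that steps the hand.

    # Sketch: a chain of divide-by-two stages, which is what the gears (or the
    # watch chip's flip-flops) amount to. 32768 = 2**15.
    crystal_hz = 32768
    stages = 15

    freq = float(crystal_hz)
    for stage in range(1, stages + 1):
        freq /= 2
        print(f"after divider {stage:2d}: {freq:8.2f} Hz")

    # The final 1 Hz edge is what advances the second hand one discrete step.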

  • @FragFrog01
    @FragFrog01 Před 20 dny +1

    Analog computing is huge in computational neuroscience. A whole subfield, neuromorphic computing, is dedicated to predicting how neurons behave using analog computers. The way neurons fire, ion channels open, and signals propagate is all governed by differential equations, so doing that on a chip is much more efficient and quicker. (A rough leaky integrate-and-fire sketch follows at the end of this thread.)

    • @maythesciencebewithyou
      @maythesciencebewithyou Před 12 dny

      They are not governed by differential equations, they can be described by differential equations. Any system that changes can be modeled with differential equations.
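
    A minimal sketch of the kind of differential equation involved (a leaky integrate-and-fire neuron, with arbitrary illustrative parameters), stepped digitally here; the appeal of neuromorphic analog hardware is that a capacitor and a leak resistor evaluate the same equation continuously instead of step by step.

    # Sketch: a leaky integrate-and-fire neuron. Membrane equation:
    # dV/dt = (-(V - V_rest) + R*I) / tau. Parameters are illustrative.
    V_REST, V_THRESH, V_RESET = -65.0, -50.0, -65.0   # mV
    TAU, R = 10.0, 1.0                                # ms, arbitrary units
    DT = 0.1                                          # ms

    def simulate(input_current, t_end=100.0):
        v, t, spikes = V_REST, 0.0, []
        while t < t_end:
            dv = (-(v - V_REST) + R * input_current) / TAU
            v += dv * DT                    # Euler step of the membrane equation
            if v >= V_THRESH:               # threshold crossing -> spike
                spikes.append(round(t, 1))
                v = V_RESET                 # reset after the spike
            t += DT
        return spikes

    print("spike times (ms) at I=20:", simulate(20.0)[:5], "...")
    print("spike times (ms) at I=30:", simulate(30.0)[:5], "...")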

  • @itzdono
    @itzdono Před 15 dny

    I was waiting for you to talk about near lossless data storage for music & photography because of the recent analog chip breakthroughs. Storing analog on a chip will be a real game changer in those fields. Being able to record & copy perfect analog signals will make vinyl records, tape, & film redundant. That's going to have such a huge impact.

    • @lumifaza
      @lumifaza Před 14 dny

      In the audio field, you have just reinvented the BBD (Bucket Brigade Delay). Having a non-mechanical memory still requires sampling, and thus anti-aliasing filters. Even if audio quality is improved beyond what BBD offered, I don't see any quality benefit over digital storage. Digital audio is transparent already. The quantization you want to eliminate doesn't degrade quality, thanks to dithering. It may, however, be useful for more efficient near-lossless (not truly lossless, because you can't make perfect copies in analog) storage.
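
      A tiny illustration of the dithering point (using simple rectangular dither and a one-step quantizer; real audio dither is usually TPDF, so this is only a sketch of the principle): without dither, a value sitting between two levels always snaps to the same level, a systematic error; with dither, the long-run average lands on the true value and the error behaves like benign noise.

      # Sketch: quantizing a value that sits between two levels,
      # with and without dither added before the quantizer.
      import random

      random.seed(7)
      true_value = 0.3            # sits between quantizer levels 0 and 1
      N = 100000

      def quantize(x):
          return round(x)         # crude one-step quantizer

      plain = [quantize(true_value) for _ in range(N)]
      dithered = [quantize(true_value + random.uniform(-0.5, 0.5)) for _ in range(N)]

      print("mean without dither:", sum(plain) / N)               # always 0 -> biased
      print("mean with dither   :", round(sum(dithered) / N, 3))  # ~0.3 -> unbiased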

  • @TwZlr.
    @TwZlr. Před 19 dny

    I have been working with computers since MS-DOS and 28 kb/s dial-up internet in the 1990s.
    Following that I pursued an education in computer science simply because it was the only thing nobody could explain to me.
    I've programmed ray tracing from scratch using linear algebra in 7 dimensions.
    Now I fully understand ones and zeros.
    In my last year of university I had to complete machine learning and artificial intelligence. It was my first child. I had to give it a goal, provide it with experience and it would give the best answer it could given the size of its brain. That was 7 years ago in 0/1 computing.
    Analog computing is fascinating.
    In theory, I believe that analog computing is like quantum computing.
    The 0/1 scream of traditional computation vs. the hum of analog and quantum computing.
    Let it hum, and your answer will be at its point of harmonization.

  • @ianclark9598
    @ianclark9598 Před 19 dny

    I think what is happening is that digital chips will have analogue components for solving specific problems. One chip currently uses an onboard capacitor to add various numbers together; the total is then output by the capacitor when needed. The capacitor is the analogue part and, unlike previous attempts at this, it is accurate in its result. This has been done for all the reasons you have said: speed and energy efficiency.

  • @Zkpe02
    @Zkpe02 Před 12 dny

    The discovery of a room-temperature superconductor would have enormous technological significance. It has the potential to address global energy challenges, enhance computing speed, enable innovative memory-storage devices, and create highly sensitive sensors, among a multitude of other possibilities.

  • @dus10dnd
    @dus10dnd Před 17 dny

    He also mentioned, years in advance, the significant power consumption improvements that we've been experiencing. I have one of the most powerful laptops that has ever existed... 17 W of power. Oh, also... cooling is super easy with it and the fan hardly ever even kicks on. The analog clock that you had in the B-roll... was stepping for each second. Most do.

  • @gsestream
    @gsestream Před 20 dny

    Maybe not, or maybe yes, but DVD-ROMs could be recorded using track trench height (differences), instead of just on/off dots, with laser-interferometry precision depth measurement.

  • @urbanstarship
    @urbanstarship Před 11 dny

    Not an expert on this, but my first thought watching this was that music synthesizers seem to be an area where hybrid computers have matured for decades. They started out analogue, then went hybrid, then some went purely digital, and now they're back to hybrid. Many aficionados prefer the sound of the analogue circuitry because there's a slight randomness to it, whereas it's more convenient to store the sounds in a digital memory. There's even the recent electro-mechanical desktop synth called the "Audio Motor Synth MKII" that uses spinning motors to generate the sound, which I suppose echoes the many electric organs invented in the early 20th century that used spinning tone wheels (which were invented to be miniature versions of fully mechanical church pipe organs).

  • @stevenschmidt
    @stevenschmidt Před 18 dny +1

    Imagine an analog computer AI assistant that was nothing but an analog microphone, an analog computer, and an analog speaker, but you could hold a conversation with it just like ChatGPT. That would be crazy.

  • @Joe_VanCleave
    @Joe_VanCleave Před 20 dny

    I remember seeing the room-sized analog computer used on the USS Constellation’s Terrier missile battery running a real-time differential equation of the target intercept, as the two stage missile was in flight; hundreds of servos, synchros and gears moving simultaneously. It would frequently hit the target drone. In actual combat they were armed with 5kt nuclear warheads!

  • @cloudy_days2321
    @cloudy_days2321 Před 8 dny +1

    Differential equations are just algebra with some calculus thrown into the mix.
    You get to solve these types of equations a lot when designing the control systems for robots. Spring equations are especially annoying to solve because they oscillate, and you need to predict that oscillation or your robot won't be able to take two steps without falling over.
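
    A quick numerical sketch of such a spring equation, m*x'' + c*x' + k*x = 0, with made-up mass, damping, and stiffness values: with light damping the displacement keeps oscillating as it decays, which is the behaviour a walking controller has to predict.

    # Sketch: the classic spring equation m*x'' + c*x' + k*x = 0, stepped numerically.
    # Numbers are illustrative.
    import math

    m, c, k = 1.0, 0.4, 20.0        # mass, damping, stiffness (made-up units)
    x, v = 0.05, 0.0                # initial displacement and velocity
    dt = 0.001

    samples = []
    for step in range(int(3.0 / dt)):           # simulate 3 seconds
        a = (-c * v - k * x) / m                # acceleration from the spring law
        v += a * dt                             # semi-implicit Euler: velocity first,
        x += v * dt                             # then position (more stable)
        if step % 250 == 0:
            samples.append(round(x, 4))

    print("displacement every 0.25 s:", samples)
    print("natural frequency (Hz):", round(math.sqrt(k / m) / (2 * math.pi), 2))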

  • @EdwinaTS
    @EdwinaTS Před 15 dny

    I've been looking out for a book on hybrid computing for years and couldn't find any! The nearest to that were books on control engineering. I did a short course on hybrid computing in year 3 of my computer science degree. It was really, really easy!
    Gosh, I just did a Google search and so many book references came up 😮. The results of this search are vastly superior to the many searches I did over the last few years!

  • @suggesttwo
    @suggesttwo Před 13 dny

    9:07 We will get down to one molecule, then smaller molecules, and then integration begins, like files being compressed: smaller, with less resistance, allowing lower voltage.

  • @girthquake1413
    @girthquake1413 Před 18 dny

    Some of these analog chips work the same way as old OBD1 car ECUs: my Toyota reports all issues and sensor problems as voltage measurements, with altered readings indicating something that needs cleaning or is actually breaking.
    After seeing how far some car companies are going for efficient step-down from 48 to 12 volts, and how smartphones really haven't changed or evolved for several years... honestly, I think we're closer to that maximum than we think. I don't think it's exactly Moore's idea, but I think that heat generation and a mix of battery and size constraints are going to be the next real limitation.

  • @Thorhian
    @Thorhian Před 18 dny

    Amdahl’s Law is good to know and keep in mind, but its conclusions only apply to a fixed workload. The world is constantly finding more and more ways to handle and crunch more and more data. As the workloads increase, so will the need for more and more parallel computing.
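
    For reference, the law itself, with a few illustrative parallel fractions: the speedup of a fixed workload on n processors is S(n) = 1 / ((1 - p) + p/n), where p is the fraction of the work that can be parallelized.

    # Sketch: Amdahl's law for a fixed workload.
    def amdahl(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    for p in (0.50, 0.90, 0.99):
        row = [round(amdahl(p, n), 1) for n in (1, 8, 64, 1024)]
        print(f"p = {p:.2f}: speedup at n = 1, 8, 64, 1024 ->", row)
    # Even at p = 0.99 the speedup caps near 100x, no matter how many cores.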

  • @justinhong7638
    @justinhong7638 Před 20 dny

    Would it be possible to integrate analog computing into digital computing systems? I know there is some music hardware these days that uses digital control of analog parts, but I don't know if that's applicable to analog computing.

  • @PixelsLaboratory
    @PixelsLaboratory Před 17 dny

    14:48 - that's basically the difference between volatile and non-volatile memory, but it's still digital computing
    14:58 - that's again the same thing, it's still digital, as 1s and 0s are also stored as voltages in current digital computing

  • @timonsku
    @timonsku Před 6 dny

    Most of what you describe is also true for digital ASICs. The question is not CPUs vs. analog circuits but analog vs. digital ASICs. A digital ASIC can also perform a specific function "instantly" and will do so massively more efficiently than a general-purpose computer.
    It's currently completely speculative whether an analog implementation could be more efficient.

  • @Sandman755
    @Sandman755 Před 19 dny

    I foresee a computer that will mix analogue, digital, and quantum computing in such a way as to play to each one's strengths. In particular, you could use the digital chip to calculate how to program the analogue chip. I have no idea if this would use less power than simply doing it with the digital chip, though. Perhaps you would lose power for a single calculation but gain it if the solution only had to be slightly modified for each of a large number of calculations.

  • @papepcool
    @papepcool Před 17 dny

    The parallel in electronic music would be a good demonstration for non-math people, or physical painting vs. digital art.

  • @MrChemluthHavanna
    @MrChemluthHavanna Před 14 dny

    I've seen a demonstration of differential equations solved by basically some op-amps. The problems are also analog: noise, manufacturing tolerances, imperfections, temperature, and that kind of stuff.
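
    A small sketch of the tolerance point (part values and the 5% tolerance figure are illustrative assumptions): in a standard two-integrator op-amp loop the oscillation frequency is set by 1/(2*pi*R*C), so scatter in R and C shows up directly as scatter in the answer, which is exactly the kind of analog imperfection mentioned above.

    # Sketch: how component tolerance shifts an op-amp oscillator's frequency.
    # Values are nominal, made-up parts.
    import math, random

    R_NOM, C_NOM = 10e3, 100e-9            # 10 kOhm, 100 nF -> ~159 Hz nominal
    TOLERANCE = 0.05                        # 5% parts

    def oscillator_freq(r, c):
        return 1.0 / (2 * math.pi * r * c)

    random.seed(42)
    print("nominal frequency:", round(oscillator_freq(R_NOM, C_NOM), 1), "Hz")
    for trial in range(5):
        r = R_NOM * (1 + random.uniform(-TOLERANCE, TOLERANCE))
        c = C_NOM * (1 + random.uniform(-TOLERANCE, TOLERANCE))
        print(f"build {trial}: {oscillator_freq(r, c):6.1f} Hz")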