Inventing liquid neural networks

  • Published May 8, 2023
  • Paper: www.science.org/doi/10.1126/s...
    Publication/Event: Science Robotics
    Key authors: Makram Chahine, Ramin Hasani, Mathias Lechner, Alexander Amini, and Daniela Rus
    MIT News article: news.mit.edu/2023/drones-navi...
    Video Director: Rachel Gordon
    Videographer: Mike Grimmett
  • Science & Technology

Comments • 74

  • @notallm
    @notallm 1 year ago +44

    Going from thousands of neurons in conventional models to just 19 liquid neurons that perform this well is really commendable! I can't wait to see more developments!

  • @musicMan11537
    @musicMan11537 4 months ago +17

    The “liquid neural network” idea in this work is a nice (re-)popularization of neural ODEs (which have been around for some time), with the modification that the integration time constant “tau” is a function of the input (as opposed to being a constant). The use of “liquid” in the model name is not the best choice of word, as historically (as early as 2004 and before) there has been a class of neural models called “liquid state machines” (LSMs): en.m.wikipedia.org/wiki/Liquid_state_machine
    (In fact, an LSM is a kind of spiking neural model, which is actually more brain-like than the neural ODEs in the work of this video.)
    It's important to be clear that these authors' work is a nice little innovation on neural ODEs, but it is a far cry from biological neurons: in the actual paper they even use backpropagation through time, which is clearly biologically implausible (the brain does not unroll itself backwards through time). It's also important to know the historical context, as works like this one are being a bit disingenuous by not making it clearer that they are popularizing good classic ideas (e.g., from ODEs and neural ODEs). (A sketch of the “tau” modification follows this thread.)

    • @musicMan11537
      @musicMan11537 4 months ago +4

      ODE = ordinary differential equation
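
For concreteness, the “tau” modification described above can be written out. This is a minimal sketch following the liquid time-constant (LTC) line of work, with x the hidden state, I the input, θ the parameters, τ the base time constant, and A a learned bias vector; the notation is reconstructed from that literature, not quoted from the video:

```latex
% Plain neural ODE: the hidden state evolves under a learned vector field f.
\frac{dx(t)}{dt} = f\big(x(t), I(t), t; \theta\big)

% LTC network: the same learned f also gates the leak term, so the
% effective time constant depends on the current input and state.
\frac{dx(t)}{dt} = -\left[\frac{1}{\tau} + f\big(x(t), I(t), t; \theta\big)\right] \odot x(t)
                   + f\big(x(t), I(t), t; \theta\big) \odot A

% Resulting input-dependent ("liquid") time constant of the system:
\tau_{\mathrm{sys}} = \frac{\tau}{1 + \tau \, f\big(x(t), I(t), t; \theta\big)}
```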

  • @SureshKumar-qi7cy
    @SureshKumar-qi7cy 1 year ago +18

    Inspirational conversation

  • @alexjbriiones
    @alexjbriiones 10 months ago +14

    I was just reading that the main bottleneck for AI is the need for huge, specialized datasets. This next-level, genius invention is going to be revolutionary.

  • @-www.chapters.video-
    @-www.chapters.video- 10 months ago +11

    00:01 Exciting times and the start of the project
    01:00 Implementing smaller neural networks for driving
    02:05 Revolutionary results in different environments
    03:03 Creating abstract models for learning systems
    04:00 Properties and applications of liquid neural networks
    05:17 Challenges in implementing the models
    06:32 Testing and pushing the models to their limits
    07:12 Expanding to drone navigation and tasks
    08:05 Extracting tasks and achieving reasoning
    09:11 Surprising and powerful properties of liquid networks
    10:06 Zero-shot learning and adaptability to different environments
    11:27 Extraordinary performance in urban and dynamic environments
    12:31 Resiliency maps provide visual and concise answers to model decision-making
    13:14 Interpretable and explainable machine learning for safety-critical applications
    14:23 Liquid networks as a counter to the scaling law of generative AI
    15:42 Underparametrized neural networks like liquid networks for future generative models
    17:10 Exploring multiple agents and composing solutions for different applications
    18:02 Extending liquid networks to perform well on static data and new types of data sources
    18:41 Embedding intelligence into embodied agents and society

  • @berbudy
    @berbudy 10 months ago +5

    Thank you worm for leading us to this

  • @koustubhavachat
    @koustubhavachat 10 months ago +4

    Is there a PyTorch implementation available for liquid neural networks? How can one get started with this?

  • @atakante
    @atakante 10 months ago +3

    Very important algorithmic advance, kudos to the team!

  • @benealbrook1544
    @benealbrook1544 10 months ago +5

    Amazing job; this is a fundamental shift and the way forward. I am interested in seeing the memory footprint and CPU demands. Looking forward to applications in other fields, and perhaps the replacement of traditional state-of-the-art models.

  • @ajithboralugoda8906
    @ajithboralugoda8906 1 month ago

    Wow! Are liquid neural networks the signal in the LLM sea of noise? So exciting! Great job, folks. Waiting to hear more breakthroughs from this great research!

  • @bvdlio
    @bvdlio 10 months ago

    Great, exciting work!

  • @erikdong
    @erikdong 10 months ago

    Bravo! 👏🏼

  • @palfers1
    @palfers1 9 months ago

    Thank you Mr. or Mrs. Worm!

  • @Niamato_inc
    @Niamato_inc 10 months ago +3

    What a time to be alive.

    • @prolamer7
      @prolamer7 10 months ago

      The mental virus is in you.

  • @adamgm84
    @adamgm84 29 days ago

    My wig always melts when we get into composing algorithms.

  • @bobtivnan
    @bobtivnan 10 months ago +19

    I wonder how much of this work makes other progress in this field obsolete? I hope Lex Fridman, who also works on autonomous vehicles at MIT, invites them to his podcast.

    • @sb_dunk
      @sb_dunk 10 months ago +2

      Does he work on autonomous vehicles? I get the impression he's not as much of an AI expert as many would lead you to believe.

    • @bobtivnan
      @bobtivnan 10 months ago

      @@sb_dunk He has mentioned many times on his podcast that he has worked in the autonomous-vehicles field. I found this lecture of his at MIT: czcams.com/video/1L0TKZQcUtA/video.html

    • @sb_dunk
      @sb_dunk 10 months ago +2

      @@bobtivnan I wouldn't say that lecturing at MIT is equivalent to working on autonomous vehicles at MIT, the latter implies you're at the forefront of the research. The papers that I can see that he's published don't appear to be massively cutting edge either, nor even directly related to autonomous driving - the closest are about traffic systems and how humans interact with autonomous vehicles.
      My point is that we should take the supposed expertise of these people with a pinch of salt.

    • @bobtivnan
      @bobtivnan 10 months ago +3

      @@sb_dunk let it go dude

    • @sb_dunk
      @sb_dunk 10 months ago +3

      @@bobtivnan Oh I'm sorry, I didn't realize I wasn't allowed to question someone's credentials or expertise.

  • @qhansen123
    @qhansen123 10 months ago +2

    Why does this say “Inventing liquid neural networks” when there have been papers on liquid neural networks, under that concept/terminology, since before 2004?

  • @LoanwordEggcorn
    @LoanwordEggcorn 10 months ago +4

    Thanks for the talk. Would it be accurate to say the approach models nonlinearities of natural (biological) neural networks?

    • @SynaTek240
      @SynaTek240 9 months ago +1

      Kinda, but it doesn't use spikes to communicate like biological networks do; rather, it uses real-valued magnitudes, just like normal ANNs. However, the way it does mimic biological networks is in being free-running in continuous time rather than clocked, if that says anything to you :D (See the sketch after this thread.)

    • @LoanwordEggcorn
      @LoanwordEggcorn 9 months ago

      @@SynaTek240 Thanks; that seems like an important part of how it works. Biological networks are not clocked.
      The details of how the values propagate may be less relevant, though spikes can have higher-order effects, from interference or other interactions, that magnitudes don't.
      Sound right?
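
To illustrate the “free-running in continuous time rather than clocked” point from this thread, here is a minimal NumPy sketch. The explicit-Euler discretization, the weights, and all names are illustrative assumptions, not the paper's released code:

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.1   # recurrent weights
U = rng.normal(size=(4, 3)) * 0.1   # input weights
tau = 1.0                           # base time constant
A = np.ones(4)                      # LTC-style target bias

def clocked_step(x, u):
    """Ordinary RNN: one update per tick, no notion of elapsed time."""
    return np.tanh(W @ x + U @ u)

def liquid_step(x, u, dt):
    """Continuous-time update: dt can be any elapsed time.
    Explicit Euler on dx/dt = -(1/tau + f) * x + f * A."""
    f = np.tanh(W @ x + U @ u)      # gate depends on state and input
    dxdt = -(1.0 / tau + f) * x + f * A
    return x + dt * dxdt

x = np.zeros(4)
# Irregularly sampled inputs: the continuous-time cell simply
# integrates over each gap, while a clocked RNN has no use for dt.
for dt in (0.1, 0.03, 0.7):
    x = liquid_step(x, rng.normal(size=3), dt)
print(x)
```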

  • @sapienspace8814
    @sapienspace8814 1 year ago +4

    Outstanding work. It is fascinating to look at the worm's neurons and figure out how to practically apply them to autonomous navigation systems, with an even smaller number of neuron-like modules!
    The "liquid" aspect I'm still trying to understand, though it is fascinating to me, as it might be a natural characteristic of electromagnetic, inductive coupling between nearby synapses (the same effect often perceived as a problem in human-made wiring, as interference "noise"/fuzziness, though applied here as a benefit, providing an opportunity for de-fuzzification between synapses or overlapping/liquid states).
    It seems intriguingly similar to overlapping membership (probability/statistical distribution) functions over fuzzy states (such as those used in Fuzzy Logic, e.g., a room is "hot", "warm", "cool", "cold"), using a kind of K-means clustering, or similar, to focus attention on the most frequently used regions of the state-space classification.
    One might perceive the "liquid time constant", just as in Fuzzy Logic, as a method of merging knowledge (abstract, qualitative, fuzzy, noisy symbolism) with mathematics (through interpolation or extrapolation), but it seems the self-deprecating nomenclature of "Fuzzy" has been lost in machine intelligence (maybe via the Dunning-Kruger effect, the paradox of humility).
    Merging the "liquid time constant" (or Fuzzy Logic) with Reinforcement Learning could help naturally generate simpler inference rules over a vast state space and allow the machine to learn efficiently on its own, without an expert human creating the inference rules. I have seen this done in a neuro-fuzzy reinforcement learning experiment with inverted pendulum control.
    Lately, I have been reading a book titled "The Hedonistic Neuron", written in 1982, to try to understand how these RL systems work; they seem quite profoundly incredible.
    Thank you for sharing your incredible work!

    • @keep-ukraine-free528
      @keep-ukraine-free528 10 months ago +1

      Your idea that the "liquidity" (or the "liquid" nature of information flow) "might be a natural characteristic of electro-magnetic, inductive coupling" is incorrect. It can't be so, since information between real neurons (across any synapse) passes not electromagnetically but via molecules (commonly called neurotransmitters), which act as a key to a lock.
      To help you understand: the NN model described here uses differential equations, which give the model its "liquid" nature.

    • @sapienspace8814
      @sapienspace8814 10 months ago

      @@keep-ukraine-free528 The molecules are going to have electron orbits, and the orbits will interact inductively with each other via Maxwell's equations. Whether you like it or not, this is natural; it is physics.

    • @Theodorus5
      @Theodorus5 10 months ago +1

      I think you mean 'ephaptic coupling' between neurons

  • @realjx313
    @realjx313 1 month ago

    The attention focus, isn't that about labels?

  • @HitAndMissLab
    @HitAndMissLab 10 months ago +1

    How are liquid neural networks performing in language models, where differential equations are of very little use?

    • @magi-1
      @magi-1 10 months ago

      A transformer is a fully connected graph neural network, and each layer is essentially a cross-section of a continuous process. In the same way that an RNN is a discrete autoregressive model, you can reformulate LLMs as a continuous process and sample words via a fixed integration step using Euler integration.
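
A toy version of that reformulation, with a stand-in vector field in place of a real transformer block; everything here is an illustrative assumption, not an actual LLM:

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(8, 8)) / np.sqrt(8)

def f(h):
    """Continuous-depth vector field (toy stand-in for one layer/block)."""
    return np.tanh(h @ W) - h

def euler_unroll(h, depth=1.0, dt=0.25):
    """Fixed-step Euler integration: each step plays the role of one
    discrete layer, so depth/dt steps behave like a stack of layers."""
    for _ in range(int(depth / dt)):
        h = h + dt * f(h)
    return h

h0 = rng.normal(size=8)
print(euler_unroll(h0))   # 4 Euler steps ~ a 4-layer stack
```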

  • @hansadler6716
    @hansadler6716 10 months ago +2

    I would like to hear a better explanation of how an image could be input to such a small network.

    • @antoruby
      @antoruby 10 months ago +7

      The first layers are still regular convolutional neural networks. The decision-making layers (the last ones), which are traditionally fully connected, were replaced by the 19 liquid neurons.
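
A minimal PyTorch sketch of that shape: a small convolutional backbone feeding a 19-unit recurrent head in place of the usual fully connected decision layers. The layer sizes and the simplified LTC-style cell are assumptions for illustration, not the paper's exact model:

```python
import torch
import torch.nn as nn

class TinyLiquidCell(nn.Module):
    """Very simplified LTC-style cell: a learned gate makes the leak
    (and hence the time constant) depend on the current input."""
    def __init__(self, in_dim, hidden=19, tau=1.0, dt=0.1):
        super().__init__()
        self.gate = nn.Linear(in_dim + hidden, hidden)
        self.A = nn.Parameter(torch.ones(hidden))
        self.tau, self.dt = tau, dt

    def forward(self, u, x):
        f = torch.tanh(self.gate(torch.cat([u, x], dim=-1)))
        dxdt = -(1.0 / self.tau + f) * x + f * self.A
        return x + self.dt * dxdt    # one explicit-Euler step

class ConvLiquidPolicy(nn.Module):
    def __init__(self, hidden=19, out_dim=1):
        super().__init__()
        self.backbone = nn.Sequential(             # regular CNN feature extractor
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.hidden = hidden
        self.head = TinyLiquidCell(32, hidden)     # the 19 "liquid" neurons
        self.readout = nn.Linear(hidden, out_dim)  # e.g. a steering command

    def forward(self, frames, x=None):
        # frames: (batch, time, 3, H, W); the head carries state over time.
        B, T = frames.shape[:2]
        if x is None:
            x = torch.zeros(B, self.hidden, device=frames.device)
        outs = []
        for t in range(T):
            x = self.head(self.backbone(frames[:, t]), x)
            outs.append(self.readout(x))
        return torch.stack(outs, dim=1), x

model = ConvLiquidPolicy()
y, _ = model(torch.randn(2, 5, 3, 64, 64))  # two clips of five frames
print(y.shape)                              # torch.Size([2, 5, 1])
```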

  • @computerconcepts3352
    @computerconcepts3352 1 year ago +1

    Interesting 🤔

  • @Michallote
    @Michallote 2 months ago

    Why has this tech not had any impact? I have seen it twice already in blogs and videos, but nothing reflects that in the number of articles published or the references from other authors, and, most of all... there is no replicable code or pretrained models to actually test those claims. Has anyone here been able to corroborate their results?

  • @77sanskrit
    @77sanskrit 10 months ago +1

    9:01 One environment that would be interesting to train it in would be those wind tunnel things they use to test the aerodynamics of plane parts and shit. Get it trained on turbulence and loop-de-loops; that would be awesome!!! Just a thought🤔👍 You guys are amazing!!!! Absolutely genius!!!!🙏🙏🫀🧠🤖

  • @arowindahouse
    @arowindahouse 10 months ago

    How can liquid neural networks have the necessary inductive biases for computer vision? If I'm not wrong, you need to add classical convolutional neural networks before the liquid layers for the model to be usable.

  • @nikhilshingadiya7798
    @nikhilshingadiya7798 10 months ago +1

    Now the LLM competition is rising 😂😂😂 Love you guys for your great efforts 🎉🎉

    • @DarkWizardGG
      @DarkWizardGG 10 months ago

      In the near future, one of those LLM models will secretly integrate into that liquid neural network. Guys, let us all welcome the "T1000" model... TAAADDDDAAAA. It's hunting time. lol😁😉😄😅😂😂😂🤦‍♂️🤦‍♂️🤦‍♂️🤖🤖🤖🤖🤖🤖🤖🤖🤖

  • @tachoblade2071
    @tachoblade2071 9 months ago +1

    If the network can understand the reason behind tasks, or their causality... could it "understand" language rather than just being an auto-completer like GPT?

  • @spockfan2000
    @spockfan2000 6 months ago

    Is this tech available to the public? Where can I learn how to implement it? Thanks.

  • @kesav1985
    @kesav1985 5 months ago +1

    (Re-)inventing time-integration schemes would have been dubbed crappy if they had been "invented" by some unknown academic at a non-elite university.
    But hey, ML hype and the MIT brand work wonders in selling ordinary stuff!

  • @vallab19
    @vallab19 10 months ago +1

    Liquid neural networks are another level in the AI revolution.

    • @DarkWizardGG
      @DarkWizardGG 10 months ago

      Yeah, T1000 in the making. Lol😁😉😄🤖🤖🤖🤖

  • @shivakumarv301
    @shivakumarv301 10 months ago

    Would it not be wise to do a SWOT analysis of the new technology and understand its consequences before jumping into it?

  • @lesmathsparexemplesatoukou3454

    CSAIL, I'm coming to you

  • @DanielSanchez-jl2vf
    @DanielSanchez-jl2vf 10 months ago

    Guys, let's show this to Demis Hassabis, Yann LeCun, Ilya Sutskever, and Yoshua Bengio.

  • @joaopedrorocha4790
    @joaopedrorocha4790 10 months ago

    This is exciting... Would this kind of network be able to forget data that doesn't fit its needs? For example... I give it time-series data in the first training, it learns some pattern from it and acts accordingly, but then things change in the world and the pattern it learned is no longer that useful... Could this kind of network just forget that pattern gradually and adapt based on its new input?

  • @wiliamscoronadoescobar8113

    About this... Tell the people and the university around ...

  • @and_I_am_Life_the_fixer_of_all

    Wow, I'm the 4th comment! What a time to be alive! Welcome to history, key authors :D

  • @Lolleka
    @Lolleka 10 months ago

    And here I was, thinking that the video was about liquid-phase computing. Silly me.

  • @arowindahouse
    @arowindahouse 10 months ago

    I thought Liquid State Machines had already been invented in the 90s by Maass

  • @RoyMustang.
    @RoyMustang. 10 months ago

    ❤❤❤

  • @randomsitisee7113
    @randomsitisee7113 10 months ago +2

    Sounds like a bunch of mumbo jumbo trying to cash in on the AI train.

  • @francisdelacruz6439
    @francisdelacruz6439 10 months ago

    Once you have collision detection, which can be a separate system, you have a certifiable self-driving ecosystem. Maybe it's time to go start-up and have the resources to get this to an actual product; raising USD 100mn would be easy and a manageable equity exposure. The drone example is a game changer in the new type of Ukraine war: you could use a Raspberry Pi-equivalent board in drones, and the implications there will change how wars are fought; it will likely be harder to invade other countries with this tech add-on.

  • @amarnathmutyala1335
    @amarnathmutyala1335 10 months ago +1

    So worms can mentally drive cars?

  • @DarkWizardGG
    @DarkWizardGG 10 months ago

    This is T1000 in the making. Liquid shapeshifter AI. lol😁😉😄🤖🤖🤖🤖🤖

  • @Stopinvadingmyhardware
    @Stopinvadingmyhardware 10 months ago

    Grokked.

  • @BradKittelTTH
    @BradKittelTTH 10 months ago +1

    This means that the mature neurons we humans can produce starting in our late 40s and after, if in good health, which operate at 10-100 times the speed of juvenile neurons, suggest that elders could far out-think, grow new ideas, abilities, and potential, just being unleashed after our 60s, when we have accumulated a host of higher operating systems, synapses, and neural networks that are superior to quantum computers. Given there is evidence that we can produce 700 neurons a night, what is our potential into our 80s to get smarter too? This is the potential of humans once we understand our potential, if we master the vessel "wii", all the "I"s that understand the bio-computers "wii" communicate through, the avatars that form the "mii", with eyes watching you. A fabulous new discovery, and it is amazing you have been able to tap these incredible liquid neural networks, which also suggest that all beings have the potential to understand and navigate more of reality than humans ever imagined before these discoveries. If a worm can do so well with 302 neurons, what is the limitation, if any, of a billion-neuron network comprised of millions of tinier networks that intermingle at optical-cable speeds? Thank you for this interview.

  • @ibrremote
    @ibrremote 10 months ago

    Task-centric, not context-centric…

  • @wiliamscoronadoescobar8113

    Here... and, for the good of education, I want to introduce a field of research about information in the world and the contamination within it, surely... touching topics like the photons that my investigations can reach, on the web. For references, see the data discussed live in a conference by my friend Andrés Manuel López Obrador, President of Mexico.

  • @ethanwei5060
    @ethanwei5060 10 months ago

    Another application for this solution is banning bad social media posts and keeping bad and unwanted content away from children. Current human moderators require counselling after a day of moderating, and if these liquid neural networks can focus on the task instead of the context, unlike traditional AI, it could be game-changing.

  • @MuscleTeamOfficial
    @MuscleTeamOfficial 10 months ago +1

    Elongated muskrat is comin🎉

  • @Jediluvs2kill
    @Jediluvs2kill 10 months ago +1

    This Ramin guy is a waste of time

  • @alexjbriiones
    @alexjbriiones 10 months ago +1

    I am sure Elon Musk is paying attention to this group and would probably try to hire them to complement Tesla's autonomous driving. Even more ominous is that China and Russia are probably setting their engineers to duplicate this invention.

    • @bobtivnan
      @bobtivnan 10 months ago

      I also thought that Musk would pursue this, for Tesla and for more general application with his new company xAI.
