Neuromorphic Computing from the Computer Science Perspective: Algorithms and Applications

  • Uploaded 8 Jun 2024
  • Speaker's Bio: Catherine (Katie) Schuman is a research scientist at Oak Ridge National Laboratory (ORNL). She received her Ph.D. in Computer Science from the University of Tennessee (UT) in 2015, where she completed her dissertation on the use of evolutionary algorithms to train spiking neural networks for neuromorphic systems. She is continuing her study of algorithms for neuromorphic computing at ORNL. Katie has an adjunct faculty appointment with the Department of Electrical Engineering and Computer Science at UT, where she co-leads the TENNLab neuromorphic computing research group. Katie received the U.S. Department of Energy Early Career Award in 2019.
    Talk Abstract: Neuromorphic computing is a popular technology for the future of computing. Much of the research and development effort in neuromorphic computing has focused on new architectures, devices, and materials rather than on the software, algorithms, and applications of these systems. In this talk, I will overview the field of neuromorphic computing from the computer science perspective. I will give an introduction to spiking neural networks, as well as some of the most common algorithms used in the field. Finally, I will discuss the potential for using neuromorphic systems in real-world applications, from scientific data analysis to autonomous vehicles.
    Organizer: Dr. Ramtin Zand
    Seminar Webpage: www.icaslab.com/Seminar

Comments • 15

  • @megatronDelaMusa
    @megatronDelaMusa 1 year ago

    Kamasutra has evolved beautifully. A neuromorphic internet infrastructure would set the world ablaze. Our ability to build synthetic artificial synapses and dendritic branching would give a whole new meaning to the future. AGI has arrived on our blind side.

  • @ashwanikumar-nk7gd
    @ashwanikumar-nk7gd 1 year ago

    Wow....amazing work...great talk

  • @alexandrsoldiernetizen162

    22:24 The problem is stated as if either crossover or mutation is performed when generating the new population. Crossover is always performed for new population generation, and in the process of crossover, mutation can be introduced, typically at a very small rate on the order of 0.1 to 1% (see the sketch after this thread).

    • @ronsnow402
      @ronsnow402 2 years ago

      Super great point, & also the (0.1 to 1%) is a great point, because if the crossover is too large, it will create loss of functionality or incongruency.
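
A minimal sketch of the generation loop this thread describes: every child is produced by crossover, and mutation is applied inside that process at a small per-gene rate. The bit-string genomes, single-point crossover, and 1% rate are illustrative assumptions, not details from the talk.

```python
import random

MUTATION_RATE = 0.01  # on the order of 0.1-1%, as the thread suggests (assumed)

def crossover(parent_a, parent_b):
    """Single-point crossover; performed for every new child."""
    point = random.randrange(1, len(parent_a))
    return parent_a[:point] + parent_b[point:]

def mutate(genome):
    """Flip each bit with a small probability, introduced during reproduction."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def next_generation(parents, population_size):
    """Crossover is always performed; mutation rides along at a low rate."""
    return [mutate(crossover(*random.sample(parents, 2)))
            for _ in range(population_size)]
```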

  • @user-lk1jv9qv4s
    @user-lk1jv9qv4s 3 months ago +1

    Where can I get the presentation PDF?

  • @SystemsMedicine
    @SystemsMedicine 10 months ago

    Hi iCAS. Presumably the toy race car steering back and forth is because the neuromorphic circuit is implementing a controller that is too ‘low order’: meaning the controller needs to take into account higher order difference (differential) equations. [There are 60 years of modern development in control theory, which is largely ignored by neural net people… which will lead to such problems.]
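
One way to make the "low order" point concrete: a proportional-only steering controller on a toy car model oscillates around the lane center, while adding a derivative term (a higher-order controller) damps the swaying. This is a hedged illustration; the gains and the double-integrator plant below are invented for the sketch, not taken from the talk.

```python
def steer(error, prev_error, dt, kp=1.0, kd=0.0):
    """PD steering command; kd=0 reduces to the oscillation-prone P controller."""
    return -kp * error - kd * (error - prev_error) / dt

def simulate(kd, steps=200, dt=0.1):
    """Toy double-integrator 'car': steering acts as lateral acceleration."""
    pos, vel, prev_err = 1.0, 0.0, 1.0  # start one unit off the lane center
    trace = []
    for _ in range(steps):
        u = steer(pos, prev_err, dt, kd=kd)
        prev_err = pos
        vel += u * dt    # steering command changes lateral velocity
        pos += vel * dt  # velocity changes lateral position
        trace.append(pos)
    return trace

# simulate(0.0) sways back and forth indefinitely; simulate(1.5) settles.
```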

  • @ronsnow402
    @ronsnow402 2 years ago

    We are still using 2D ICs, but monolithic 3D ICs with integrated memory created by DARPA exist; they just haven't reached the market yet. So even if transistor scaling stops, we will still improve at an exponential rate in digital computation. Neuromorphic will still be very important, as we find better training methods & architectures to tune permutations of weight values.

  • @celestialmedia2280
    @celestialmedia2280 2 years ago

    Autonomous application 👍🔥
    E.g.: using an autonomous robot like a four-limbed horse to navigate and explore an environment like Mars, as our perishable bodies are limited by nature

  • @vtrandal
    @vtrandal 1 year ago +1

    Awesome! The number of people claiming Moore’s Law has ended doubles every two years! Lol

  • @ravinderpaul6267
    @ravinderpaul6267 2 years ago

    PCB and PCM OS also do the computer science.

  • @alexandrsoldiernetizen162

    21:40 The description of selection for fitness is flawed. The selection process doesn't pick 'from best to worst', and out of the numerous algorithms for selection, none of them is a basic 'from best to worst', for instance fitness-proportionate, truncation, and tournament selection (see the sketch after this thread). Simply selecting by best fit risks getting quagmired in local maxima.

    • @ronsnow402
      @ronsnow402 2 years ago +1

      Exactly, because what's "best" is sometimes unreachable; gradient descent only finds the most efficient way to get downhill. Gradient descent cannot leap hills to find better strategies outside the local adaptation landscape. Although, you can add some chance that the system can leave its local gradient-descent environment, using crossover to displace the system on the adaptation landscape... it's very inefficient though, because of the loss generated by crossover.

    • @alexandrsoldiernetizen162
      @alexandrsoldiernetizen162 2 years ago +1

      @@ronsnow402 Right, except in the case of evolutionary algorithms it's usually stochastic hill climbing that gets you to the goal. Basically the same thing by a different name and a slightly different statistical process.

    • @ronsnow402
      @ronsnow402 2 years ago

      @@alexandrsoldiernetizen162 Good point, so crossover & mutation have some of the stochastic hill-climbing properties that get you over hills & closer to the goal. Then selection comes in & forces gradient-descent optimization. Is that what you mean?
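
For reference, a minimal sketch of tournament selection, one of the schemes named at the top of this thread. The tournament size and the fitness-function interface are assumptions for illustration.

```python
import random

def tournament_select(population, fitness, k=3):
    """Draw k individuals at random and return the fittest of them.

    Because weaker individuals still win tournaments occasionally, the
    population keeps diversity, which is what helps the search escape
    the local maxima that a strict best-to-worst ranking gets stuck in.
    """
    contenders = random.sample(population, k)
    return max(contenders, key=fitness)

# Usage sketch: pick one parent per child slot.
# parents = [tournament_select(pop, fitness_fn) for _ in range(len(pop))]
```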

  • @RavNivara
    @RavNivara 2 years ago

    Sign me up; I will donate what is left of a once-brilliant brain and body to the cause.