Stanford Seminar - Robot Learning in the Era of Large Pretrained Models

  • Added 12 Mar 2024
  • February 23, 2024
    Dorsa Sadigh, Stanford University
    In this talk, I will discuss how interactive robot learning can benefit from the rise of large pretrained models such as foundation models. I will introduce two perspectives. First, I will discuss the role of pretraining when learning visual representations, and how language can guide the learning of grounded visual representations useful for downstream robotics tasks. I will then discuss the choice of datasets during pretraining — specifically, how we can guide large-scale data collection and what constitutes high-quality data for imitation learning. I will discuss some recent work on guiding data collection to enable compositional generalization of learned policies. Finally, I will end the talk by discussing a few creative ways of tapping into the rich context of large language models and vision-language models for robotics.
    About the speaker: dorsa.fyi/
    More about the course can be found here: stanfordasl.github.io/robotic...
    View the entire AA289 Stanford Robotics and Autonomous Systems Seminar playlist: • Stanford AA289 - Robot...
    ► Check out the entire catalog of courses and programs available through Stanford Online: online.stanford.edu/explore

Comments • 4

  • @cardianlfan8827 · 2 months ago · +1

    The Age of Robots has arrived. With a unified interface between large models and robot hardware, we can undoubtedly achieve results as astonishing as those of GPT language systems.

  • @jaiberjohn · a month ago

    It'd be good to listen to the skipped content, including reinforcement learning, and more details on the Take 2 items. Great presentation!

  • @forheuristiclifeksh7836 · a month ago · +1

    15:00

  • @TV19933 · 2 months ago

    robot learning is 76% understanding education level i assume so
    artificial intelligence use
    study education quickly 😅