Cosyne 2024 Tutorial - Talmo Pereira - SLEAP: Auto behavior quantification using DL (Part 1)

  • Uploaded March 10, 2024
  • Get some SLEAP: Automating behavior quantification using deep learning
    Speaker: Talmo Pereira, Salk Institute for Biological Studies
    In this tutorial, attendees will learn about the advances in computer vision and deep learning that have led to the recent development of tools for markerless motion capture ("pose tracking") in typical lab settings, as well as how to put these methods into practice to analyze complex, freely moving animal behavior.
    In the first part of the tutorial, we will cover: (i) the recent history of advances in pose estimation and tracking using deep learning (in humans and animals); (ii) the challenges in adapting these methods to the lab setting and how current tools approach them; (iii) the landscape of tooling for motion capture and downstream behavioral analysis; and (iv) an outlook on the remaining challenges and opportunities in the broader field of behavior quantification for neuroscience.
    In the second part of the tutorial, participants will learn how to use SLEAP (sleap.ai), an open-source tool for multi-animal pose tracking designed for accessibility to practitioners with no technical background in deep learning or software engineering. SLEAP is in use by thousands of researchers across the world, ranging from middle schoolers to faculty, to study everything from subcellular organelles to whale sharks across diverse experimental settings.
    Tutorial participants will receive exclusive access to cloud-based resources, requiring nothing but a laptop and web browser to complete the tutorial. By the end of the session, participants will know how to go from a raw video of freely behaving lab animals to annotating poses, training SLEAP models, tracking unlabeled data, and running basic analyses to extract behavioral sequences and other relevant phenotypic information (a minimal code sketch of this workflow follows below the description).
    Presented at Cosyne 2024 (www.cosyne.org/), Feb 29th, 2024
  • Science & Technology
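
    The following is a minimal, hypothetical sketch of the workflow described in the tutorial abstract, using SLEAP's command-line tools for labeling, training, and tracking, followed by a basic analysis step in Python. The file name "predictions.slp" and the node name "thorax" are placeholders, and the exact array shape returned by Labels.numpy() should be checked against the installed SLEAP version.

        # Prior steps (run from a terminal), as covered in the tutorial:
        #   sleap-label                           # launch the GUI to annotate poses
        #   sleap-train <config.json> <labels.slp>  # train models on the annotations
        #   sleap-track new_video.mp4 -m <model_dir> -o predictions.slp
        #
        # Basic analysis of the tracked poses in Python:
        import numpy as np
        import sleap

        labels = sleap.load_file("predictions.slp")

        # Dense array of tracked poses: (n_frames, n_tracks, n_nodes, 2).
        locations = labels.numpy()

        # Example feature: per-frame speed of one body part for each animal.
        node_names = [node.name for node in labels.skeletons[0].nodes]
        node_idx = node_names.index("thorax")       # placeholder node name

        xy = locations[:, :, node_idx, :]           # (n_frames, n_tracks, 2)
        velocity = np.diff(xy, axis=0)              # frame-to-frame displacement
        speed = np.linalg.norm(velocity, axis=-1)   # pixels per frame

        print("Mean speed per track (px/frame):", np.nanmean(speed, axis=0))

    From arrays like "speed" above, simple behavioral measures (e.g., locomotion bouts or interaction distances) can be derived, which is the kind of downstream analysis the second part of the tutorial covers.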
