What is GPT-3 and how does it work? | A Quick Review

  • Published 22. 07. 2024
  • You have probably heard of GPT-3 and that it is a fascinating development. But do you know why, and how, GPT-3 managed to impress so many people?
    In this video, we will learn why GPT-3 is so unique and how it helped bring in a new wave of excitement for AI. On top of this, we will also briefly look under the hood of GPT-3 to understand its architecture and some of its potential dangers.
    Want to give AssemblyAI’s automatic speech-to-text transcription API a try? Get your free API token here 👇
    www.assemblyai.com/?...
    Apps made with GPT-3: gpt3demo.com/
    B-roll credits:
    Video by Julia M Cameron (www.pexels.com/@julia-m-cameron) from Pexels
    Video by Jack Sparrow (www.pexels.com/@jack-sparrow) from Pexels

Comments • 14

  • @lilstar3705 • 2 years ago

    What is the software used to build GPT-3?

  • @spokewheeldiaries2724 • 2 years ago +1

    Thank you, useful video

  • @andersonsystem2 • 2 years ago +2

    Good video thanks

  • @karthikeyanak9460 • 2 years ago

    Could you make a video about GPT-J?

    • @AssemblyAI • 2 years ago +1

      Will add to the list of topics to consider. Thank you for your suggestion! - Mısra

  • @amitvyas7905 • 1 year ago

    I am a little confused. You mention 175B parameters; are those not words? What are those parameters? Are they things like gender, masculinity, etc.?

    • @AssemblyAI • 1 year ago

      Those are the values the model uses to calculate its output. In a neural network, they are the weight and bias values.

    • @amitvyas7905 • 1 year ago

      @@AssemblyAI Then how are those parameters/values generated? I've read that all LLMs produce word embeddings, but don't they need values for things like royalty, gender, etc. to get king - man + woman = queen as output?

    • @amitvyas7905 • 1 year ago

      @@AssemblyAI My question is: all LLMs use some kind of n-dimensional input vector. How is it calculated, and how is the "n" in n-dimensional chosen? Is it calculated by feeding a neural network a dictionary or n-gram words?

    • @pathikghugare • 1 year ago

      @@amitvyas7905 Those values are initialised randomly at the start; the network then adjusts them using the gradients computed in the backpropagation step (see the sketch just below this thread).
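
    A minimal sketch tying this thread together: what the "175B parameters" are and how they get their values. It uses plain NumPy with made-up sizes; the layer shape, learning rate, and tiny 3-d example embeddings are toy values for illustration, not anything taken from GPT-3 itself.

      import numpy as np

      rng = np.random.default_rng(0)

      # One tiny fully connected layer: 4 inputs -> 3 outputs.
      # Its parameters are just the entries of W and b.
      W = rng.normal(scale=0.1, size=(4, 3))   # weights, initialised randomly
      b = np.zeros(3)                          # biases

      print("parameters in this layer:", W.size + b.size)   # 4*3 + 3 = 15
      # GPT-3 is the same idea repeated until the count reaches ~175 billion.

      # One gradient-descent step on a toy squared-error loss.
      x = rng.normal(size=4)                   # a single input vector
      target = np.array([1.0, 0.0, 0.0])

      y = x @ W + b                            # forward pass
      error = y - target                       # dLoss/dy for 0.5*||y - target||^2

      # Backpropagation: gradients of the loss w.r.t. each parameter.
      grad_W = np.outer(x, error)
      grad_b = error

      lr = 0.1
      W -= lr * grad_W                         # the parameters move; nothing is hand-set
      b -= lr * grad_b

      # Word embeddings are rows of a learned parameter matrix like W.
      # No one programs in "royalty" or "gender"; directions like those
      # emerge during training. With toy 3-d vectors:
      emb = {
          "king":  np.array([0.9, 0.8, 0.1]),
          "man":   np.array([0.1, 0.8, 0.1]),
          "woman": np.array([0.1, 0.8, 0.9]),
          "queen": np.array([0.9, 0.8, 0.9]),
      }
      analogy = emb["king"] - emb["man"] + emb["woman"]
      closest = max(emb, key=lambda w: float(analogy @ emb[w])
                    / (np.linalg.norm(analogy) * np.linalg.norm(emb[w])))
      print("king - man + woman is closest to:", closest)   # queen

    The dimensionality "n" of those embedding vectors is not calculated; it is a design choice fixed before training (about 12,288 in GPT-3's largest configuration), and the values inside the vectors are then learned exactly like the weights above.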

  • @msg3030 • 2 years ago

    As good as GPT-3 is, it still has little to no memory. It can hardly remember anything over the course of a conversation (see the sketch after this thread). From what I know about AI, it is extremely difficult to program a memory into the system that even comes close to the memory of the human brain.

    • @paulden3158 • 1 year ago

      Why is it difficult to program a memory model?
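
    A minimal sketch of why the "memory" complaint above comes up: the model itself is stateless, so an application has to resend the running conversation on every turn and trim it to the model's fixed context window (roughly 2,048 tokens for the original GPT-3). Both generate and count_tokens below are hypothetical stand-ins, not a real API.

      from typing import List

      MAX_CONTEXT_TOKENS = 2048          # approximate GPT-3 context limit

      def count_tokens(text: str) -> int:
          # Crude stand-in for a real tokenizer: ~1 token per word.
          return len(text.split())

      def generate(prompt: str) -> str:
          # Hypothetical placeholder for a call to a text-completion model.
          return "(model reply to: ..." + prompt[-40:] + ")"

      def chat_turn(history: List[str], user_message: str) -> str:
          history.append("User: " + user_message)

          # Drop the oldest turns until the whole transcript fits in the window.
          while count_tokens("\n".join(history)) > MAX_CONTEXT_TOKENS:
              history.pop(0)             # whatever is dropped here is "forgotten"

          reply = generate("\n".join(history))
          history.append("AI: " + reply)
          return reply

      conversation: List[str] = []
      print(chat_turn(conversation, "My name is Ada."))
      print(chat_turn(conversation, "What is my name?"))
      # The model can only "recall" the name if that first turn still fits
      # inside the window being resent; there is no memory beyond that.

    Anything pushed out of that window is simply never seen again, which is why longer-term memory needs extra machinery around the model rather than a simple in-model fix.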