Data Validation with Pyspark || Real Time Scenario

  • Uploaded 16. 12. 2023
  • In this video we will discuss how to perform data validation with PySpark dynamically.
    Data Sources Link:
    drive.google.com/drive/folder...
    #pyspark #databricks #dataanalytics #data #dataengineering

Comments • 13

  • @mohitupadhayay1439 • 1 month ago

    Amazing content. Keep a playlist of real-time scenarios for industry.

  • @ajaykiranchundi9979 • 2 months ago

    Very helpful! Thank you

  • @listentoyourheart45 • 6 months ago

    Nice

  • @vamshimerugu6184 • 2 months ago

    Great explanation ❤. Keep uploading more content on PySpark.

  • @ArabindaMohapatra • 1 month ago

    I just started watching this playlist. I'm hoping to learn how to deal with schema-related issues in real time. Thanks!

  • @skateforlife3679 • 6 months ago

    Cool, but is it like this every time? Like, you have a reference df containing all the columns and the file name / path, and you have to iterate over it to see if they match?
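
    A minimal sketch of the pattern described in the comment above: a reference df lists each source file's name, path, and expected columns, and we iterate over it to check that every file's schema matches. The layout and column names (file_name, file_path, expected_columns) are assumptions for illustration, not the exact code from the video.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("data-validation").getOrCreate()

        # Control/reference table: one row per source file with its expected columns.
        reference_df = spark.read.csv("reference.csv", header=True)

        results = []
        for row in reference_df.collect():  # small control table, so collect() is fine
            expected = [c.strip() for c in row["expected_columns"].split(",")]
            actual_df = spark.read.csv(row["file_path"], header=True)
            missing = sorted(set(expected) - set(actual_df.columns))
            extra = sorted(set(actual_df.columns) - set(expected))
            results.append((row["file_name"],
                            len(missing) == 0 and len(extra) == 0,
                            ",".join(missing),
                            ",".join(extra)))

        validation_df = spark.createDataFrame(
            results, ["file_name", "is_valid", "missing_columns", "extra_columns"])
        validation_df.show(truncate=False)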

  • @World_Exploror • 4 months ago

    How did you define reference_df and control_df?

    • @DataSpark45 • 4 months ago

      We defined them as a table in a database; as of now, I used them as CSV files.
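
      A rough sketch of this reply (not the video's exact code): the same control metadata can be read from CSV files now and switched to a database table later. The paths, table name, and JDBC URL below are placeholders.

          # Current setup: reference/control data kept as CSV files.
          reference_df = spark.read.csv("/path/to/reference.csv", header=True)
          control_df = spark.read.csv("/path/to/control.csv", header=True)

          # Later, the same frames could come from a database table via JDBC, e.g.:
          # reference_df = (spark.read.format("jdbc")
          #                 .option("url", "jdbc:mysql://host:3306/metadata_db")
          #                 .option("dbtable", "reference_table")
          #                 .option("user", "<user>")
          #                 .option("password", "<password>")
          #                 .load())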

  • @OmkarGurme • 3 months ago

    While working with Databricks, we don't need to start a Spark session, right?

    • @DataSpark45 • 3 months ago +1

      No need, brother; we can continue without defining a Spark session. I just kept it for practice.
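
      Illustration of this point (assumed, not shown in the video): a Databricks notebook already provides a SparkSession named spark, so calling getOrCreate() simply returns that existing session instead of building a new one, which is why keeping the line is harmless.

          from pyspark.sql import SparkSession

          # On Databricks this returns the notebook's pre-created session;
          # outside Databricks it creates one, so the code stays portable.
          spark = SparkSession.builder.getOrCreate()
          print(spark.version)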