Big Data Engineering Mock Interview | Big Data Pipeline | AWS Cloud Services | Project Architecture

  • ฤas pล™idรกn 24. 03. 2024
  • To enhance your career as a Cloud Data Engineer, check trendytech.in/?src=youtube&su... for curated courses developed by me.
    I have trained over 20,000 professionals in the field of Data Engineering in the last 5 years.
    Want to Master SQL? Learn SQL the right way through the most sought-after course - the SQL Champions Program!
    "An 8-week program designed to help you crack the interviews of top product-based companies by developing a thought process and an approach to solving an unseen problem."
    Here is how you can register for the Program -
    Registration Link (Course Access from India): rzp.io/l/SQLINR
    Registration Link (Course Access from outside India): rzp.io/l/SQLUSD
    30 INTERVIEWS IN 30 DAYS- BIG DATA INTERVIEW SERIES
    This mock interview series is launched as a community initiative under the Data Engineers Club, aimed at aiding the community's growth and development.
    Our highly experienced guest interviewer, Satinder, / satinder-singh-699aab2b, shares invaluable insights and practical advice drawn from his extensive experience.
    Our talented guest interviewee, Aditya Patil, / ap-patil, answers the interview questions in a well-articulated manner.
    Links to the free SQL & Python series developed by me are given below -
    SQL Playlist - • SQL tutorial for every...
    Python Playlist - • Complete Python By Sum...
    Don't miss out - Subscribe to the channel for more such informative interviews and unlock the secrets to success in this thriving field!
    Social Media Links :
    LinkedIn - / bigdatabysumit
    Twitter - / bigdatasumit
    Instagram - / bigdatabysumit
    Student Testimonials - trendytech.in/#testimonials
    Discussed Questions (with timestamps):
    2:34 Brief overview of projects.
    3:19 Describe your data pipeline flow and architecture.
    5:10 What transformations do you use, and in which format do you write data to Redshift?
    6:44 How do you handle null values?
    9:03 Which file format do you use for end-user data?
    9:50 Why is Parquet preferred over ORC?
    11:10 What are the join types in Hive?
    12:07 Which types of joins are used to avoid shuffling in Hive and PySpark? Do you know the specific term?
    12:53 Explain how broadcast join avoids shuffling.
    14:07 Which property controls broadcast join in Spark?
    14:40 How do you start a Spark application in PySpark?
    16:09 What does the builder do in Spark session creation?
    17:43 What are the partitioning types in Hive?
    18:36 Difference between managed and external tables in Hive.
    19:16 Have you performed Spark performance tuning?
    19:36 Difference between repartition and coalesce in Spark?
    20:25 Have you used NoSQL databases?
    21:02 SQL coding question
    Tags
    #mockinterview #bigdata #career #dataengineering #data #datascience #dataanalysis #productbasedcompanies #interviewquestions #apachespark #google #interview #faang #companies #amazon #walmart #flipkart #microsoft #azure #databricks #jobs

Comments • 53

  • @imranhossain1660
    @imranhossain1660 4 months ago +5

    Parquet is a columnar storage format, so it is a very good file format for retrieving data through queries. It definitely reduces I/O reads and network bandwidth usage. Besides that, it has built-in support for compression in the form of the Snappy format, so it reduces space usage. Another one I can think of is that Parquet files come with a structure of three components: header, body, and footer. The header is actually the name of the file (part001, part002). The body is the actual data content being stored, and the footer is basically for the metadata. This metadata includes the minimum and maximum values of the columns, so whenever we query data stored in Parquet format, this metadata helps with data skipping, which in turn speeds up our query execution. Hope it helps.
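    The data-skipping idea in the comment above can be sketched in plain Python. This is not a real Parquet reader; the row groups and their footer min/max statistics are made-up illustrative data, simulating how a query engine consults footer metadata before reading any rows.

    ```python
    # Sketch: per-row-group min/max stats (as stored in a Parquet footer)
    # let a scan skip entire row groups without reading their rows.
    # All data below is illustrative, not a real Parquet file.

    row_groups = [
        {"rows": [5, 9, 12],   "stats": {"min": 5,  "max": 12}},
        {"rows": [40, 55, 61], "stats": {"min": 40, "max": 61}},
        {"rows": [90, 95, 99], "stats": {"min": 90, "max": 99}},
    ]

    def scan_greater_than(threshold):
        """Return values > threshold, skipping groups whose max <= threshold."""
        matches, groups_read = [], 0
        for rg in row_groups:
            if rg["stats"]["max"] <= threshold:
                continue  # skipped using footer metadata alone; no rows read
            groups_read += 1
            matches.extend(v for v in rg["rows"] if v > threshold)
        return matches, groups_read

    values, groups_read = scan_greater_than(85)
    print(values, groups_read)  # only the last row group is actually read
    ```

    A real engine does the same pruning per column chunk, which is why selective filters on sorted or clustered columns benefit most.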

  • @mojibshaikh4092
    @mojibshaikh4092 2 months ago

    Informative and Excellent interview.

  • @sruthiselvakumar9817
    @sruthiselvakumar9817 4 months ago +2

    This interview is really great, as Satinder explained some concepts, like the property for broadcast join, more clearly. Thanks Sumit Sir!! Expecting more videos like this..

    • @sumitmittal07
      @sumitmittal07 4 months ago +1

      Satinder will be conducting more interviews.

  • @tanujarora4906
    @tanujarora4906 2 months ago

    Satinder sir is awesome, there is always something to learn from his questions.

  • @KiyanshLife
    @KiyanshLife 4 months ago +4

    Best interview I have ever seen. Both of you are too good.

    • @sumitmittal07
      @sumitmittal07 4 months ago +1

      yes this interview was next level

  • @sauravroy9889
    @sauravroy9889 3 months ago

    Really nice interview sir. ❤

  • @mohammedalikhan9819
    @mohammedalikhan9819 4 months ago +9

    The interview was more focused on PySpark and SQL; we expected the interviewer to ask more questions on AWS cloud as well, because in most of the interview videos posted, PySpark has been asked a lot. If questions on AWS had been asked, it would have been very helpful.

    • @sumitmittal07
      @sumitmittal07 4 months ago +1

      Hi Mohammed, we will definitely have some interviews planned specifically for AWS in the upcoming days.

    • @mohammedalikhan9819
      @mohammedalikhan9819 4 months ago

      Thank you sir 😊

    • @avinash7003
      @avinash7003 3 months ago

      I see mostly 70% PySpark and SQL asked, the rest cloud @@mohammedalikhan9819

  • @abhishekmodak8496
    @abhishekmodak8496 3 months ago

    This was a good interview and Satinder has good experience as an interviewer.

  • @user-im6ui9zd8v
    @user-im6ui9zd8v 4 months ago +2

    This was a good interview. Different from the earlier ones. Satinder's questions and advice were very good.

    • @sumitmittal07
      @sumitmittal07 4 months ago

      this interview has really gone well

  • @goldykarn5922
    @goldykarn5922 3 months ago

    Best interview session so far.

  • @akshaykumarverma8644
    @akshaykumarverma8644 3 months ago

    This was a very good video

  • @safarnama65
    @safarnama65 3 months ago

    Very informative, one of the best mock interviews, with proper answers and details.

    • @sumitmittal07
      @sumitmittal07 3 months ago

      Keep watching for more such insightful interviews

  • @grim_rreaperr
    @grim_rreaperr 3 months ago

    Hi Sumit Sir,
    In the first SQL problem, where we are required to find subject-wise toppers, one case where row_number() will fail is when we have two top scorers with the same marks in a specific subject. Please check the example below:
    student_name, subject, marks (-- derived column)
    stud_1, maths, 90 -- 1
    stud_2, maths, 90 -- 1
    stud_1, economics, 95 -- 1
    stud_2, economics, 90 -- 2
    stud_3, economics, 88 -- 3
    Instead of row_number(), we can choose either rank() or dense_rank(), as we just need the first rankers (based on the highest marks scored in each subject). My approach would be as follows:
    WITH top_scorers AS (
        SELECT student_name,
               subject,
               marks,
               DENSE_RANK() OVER (PARTITION BY subject ORDER BY marks DESC) AS rnk
        FROM student_marks
    )
    SELECT student_name,
           subject,
           marks
    FROM top_scorers
    WHERE rnk = 1;
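    The DENSE_RANK approach above can be checked end to end with SQLite, whose window functions (available since SQLite 3.25) accept the same query. The table name and sample rows mirror the commenter's example; note that both tied maths toppers survive, which ROW_NUMBER() would not guarantee.

    ```python
    import sqlite3

    # In-memory database loaded with the commenter's sample rows.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE student_marks (student_name TEXT, subject TEXT, marks INT);
    INSERT INTO student_marks VALUES
      ('stud_1', 'maths', 90), ('stud_2', 'maths', 90),
      ('stud_1', 'economics', 95), ('stud_2', 'economics', 90),
      ('stud_3', 'economics', 88);
    """)

    # Same query as above; ORDER BY added only for a deterministic result.
    rows = conn.execute("""
    WITH top_scorers AS (
        SELECT student_name, subject, marks,
               DENSE_RANK() OVER (PARTITION BY subject ORDER BY marks DESC) AS rnk
        FROM student_marks
    )
    SELECT student_name, subject, marks
    FROM top_scorers
    WHERE rnk = 1
    ORDER BY subject, student_name
    """).fetchall()
    print(rows)  # both maths toppers with tied marks are kept
    ```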

  • @Sagar0155
    @Sagar0155 4 months ago

    Interview was insightful. Learnt core concepts of Spark from Satinder.

  • @DataJourneyHuub
    @DataJourneyHuub 4 months ago

    It's really helpful sir. Thank you so much

  • @abhishekkmalik4399
    @abhishekkmalik4399 4 months ago

    Very informative video, liked the point of view by Satinder Sir.

    • @sumitmittal07
      @sumitmittal07 4 months ago

      satinder is a very knowledgeable person

  • @ashwenkumar
    @ashwenkumar 2 months ago

    Aditya - you need to be strong in the basics and always answer straightforwardly and crisply, on point. Don't beat around the bush.

  • @DesireIsIrrelevant
    @DesireIsIrrelevant 3 months ago

    Thanks for uploading such a great Interview video Sir!

    • @sumitmittal07
      @sumitmittal07 3 months ago

      Glad you found the interview informative!

  • @sabyspeaksonline
    @sabyspeaksonline 3 months ago

    What's the difference between parquet and delta format?

  • @AliKhanLuckky
    @AliKhanLuckky 4 months ago +1

    Sir, I personally want to see more interviews by Satinder sir 😊

    • @sumitmittal07
      @sumitmittal07 4 months ago +1

      yes definitely, he will be conducting more interviews

  • @Abhishek-14
    @Abhishek-14 4 months ago +3

    Sir, please continue the Python course along with this 🙏

    • @sumitmittal07
      @sumitmittal07 4 months ago +2

      yes, one video coming tomorrow at 7 pm

    • @Abhishek-14
      @Abhishek-14 4 months ago

      @@sumitmittal07 Thank you so much sir, that's a relief to hear.

  • @zaffer2024
    @zaffer2024 4 months ago

    Excellent

  • @amritmanash7950
    @amritmanash7950 4 months ago

    Very nice interview

  • @doyouwanttoknow3366
    @doyouwanttoknow3366 4 months ago

    Please upload a gcp data engineer interview video sir

  • @mohitbutola1140
    @mohitbutola1140 4 months ago

    Has anyone taken the course?

    • @sumitmittal07
      @sumitmittal07 4 months ago

      Please share your contact number if you would like to know more about the courses that I offer

  • @ameygoesgaming8793
    @ameygoesgaming8793 3 months ago

    My SQL would be:
    SELECT student_id, max(marks)
    FROM class
    GROUP BY subject

    • @grim_rreaperr
      @grim_rreaperr 3 months ago

      Every non-aggregated column in your SELECT statement must be included in the GROUP BY clause. (Here student_id is a non-aggregated column and should be in your GROUP BY clause, and the same applies to the subject column, which is not being selected at all.)

    • @ameygoesgaming8793
      @ameygoesgaming8793 3 months ago

      @@grim_rreaperr Oh yes, it's a typo.
      It should be:
      SELECT subject, max(marks)
      FROM class
      GROUP BY subject
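      The corrected query can be sanity-checked in SQLite; the table name `class` comes from the comment, while the sample rows are assumptions for illustration. Note this aggregate returns the per-subject maximum but, unlike the DENSE_RANK approach, cannot tell you which student scored it.

      ```python
      import sqlite3

      # In-memory table matching the comment's schema, with made-up rows.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE class (student_name TEXT, subject TEXT, marks INT);
      INSERT INTO class VALUES
        ('stud_1', 'maths', 90), ('stud_2', 'maths', 85),
        ('stud_1', 'economics', 95), ('stud_2', 'economics', 90);
      """)

      # The corrected GROUP BY query; ORDER BY added for a stable result.
      rows = conn.execute(
          "SELECT subject, MAX(marks) FROM class GROUP BY subject ORDER BY subject"
      ).fetchall()
      print(rows)  # per-subject max, but no student names
      ```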

  • @ameygoesgaming8793
    @ameygoesgaming8793 3 months ago

    what is NC SQL way?

    • @SB-ix7db
      @SB-ix7db 3 months ago

      ANSI

    • @ameygoesgaming8793
      @ameygoesgaming8793 3 months ago

      So ANSI SQL is the normal SQL syntax which we write, right? @@SB-ix7db

  • @zaffer2024
    @zaffer2024 4 months ago

    Why do data engineer roles have very easy questions?

    • @sumitmittal07
      @sumitmittal07 4 months ago +1

      We make it look easy, else it's complex.. haha

  • @akhilsingh3801
    @akhilsingh3801 11 days ago

    Bro is cheating in a mock interview with zero fundamental knowledge of Spark or Hadoop 😂😂😂. At least the interviewer asked questions to get something out of this video.