Data Engineer Mock Interview | ADF | Medallion Architecture | BRONZE, SILVER & GOLD Layer| ADLS GEN2
- Added on 25. 03. 2024
- To enhance your career as a Cloud Data Engineer, check trendytech.in/?src=youtube&su... for curated courses developed by me.
I have trained over 20,000 professionals in the field of Data Engineering in the last 5 years.
Want to Master SQL? Learn SQL the right way through the most sought-after Data Course - SQL Champions Program!
"An 8-week program designed to help you crack the interviews of top product-based companies by developing a thought process and an approach to solve an unseen problem."
Here is how you can register for the Program -
Registration Link (Course Access from India): rzp.io/l/SQLINR
Registration Link (Course Access from outside India): rzp.io/l/SQLUSD
30 INTERVIEWS IN 30 DAYS- BIG DATA INTERVIEW SERIES
This mock interview series is launched as a community initiative under the Data Engineers Club, aimed at aiding the community's growth and development.
Our highly experienced guest interviewer, Ankur Bhattacharya, / ankur-bhattacharya-100... shares invaluable insights and practical advice drawn from his extensive experience, catering to aspiring data engineers and seasoned professionals alike.
Our talented guest interviewee, Sasiram Kolisetty / kolisetty-sasiram-0285... answers the interview questions in a remarkably well-articulated manner. He walks through the complete Medallion Architecture followed in the projects he worked on in the Insurance and Green Energy domains. Raw data was first landed in the Bronze layer tables; this data was then cleaned and transformed and written to the Silver layer tables; finally, based on the business requirements, the data was aggregated and the results were stored in the Gold layer.
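A minimal sketch of that Bronze → Silver → Gold flow. This is illustrative plain Python only; a real project like the one discussed would use Spark/Delta tables in ADLS Gen2, and the record layout and field names here are hypothetical:

```python
# Bronze: raw records as ingested, including duplicates and bad rows.
# (Hypothetical insurance-style data for illustration.)
bronze = [
    {"policy_id": 1, "premium": "1000"},
    {"policy_id": 1, "premium": "1000"},  # duplicate record
    {"policy_id": 2, "premium": None},    # bad row: missing premium
    {"policy_id": 3, "premium": "2500"},
]

# Silver: cleaned and conformed - drop bad rows, deduplicate, cast types.
seen, silver = set(), []
for row in bronze:
    if row["premium"] is None or row["policy_id"] in seen:
        continue
    seen.add(row["policy_id"])
    silver.append({"policy_id": row["policy_id"],
                   "premium": int(row["premium"])})

# Gold: business-level aggregate built from the clean Silver data.
gold = {"total_premium": sum(r["premium"] for r in silver)}
print(gold)  # {'total_premium': 3500}
```

The same shape carries over to Spark: Bronze is an append-only landing zone, Silver applies dedup/typing/quality rules, and Gold holds the aggregates that dashboards and reports read.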
Links to the free SQL & Python series developed by me are given below -
SQL Playlist - • SQL tutorial for every...
Python Playlist - • Complete Python By Sum...
Don't miss out - Subscribe to the channel for more such informative interviews and unlock the secrets to success in this thriving field!
Social Media Links :
LinkedIn - / bigdatabysumit
Twitter - / bigdatasumit
Instagram - / bigdatabysumit
Student Testimonials - trendytech.in/#testimonials
Discussed Questions with Timestamp
3:51 Brief discussion about the project.
9:06 What business problem are you solving in your project?
9:56 What work have you done to secure your data?
11:20 Do you use Unity Catalog for role-based policy?
11:33 How do you resolve bottlenecks in terms of latency in your pipeline?
15:14 If you have 2 large tables, what kind of join strategy will you follow?
16:12 Difference between bucketing and Z-ordering.
16:48 How do you perform data ingestion?
18:20 How would you create a pipeline for real-time data updates, aggregation, and live dashboard creation?
20:13 What triggers are available in ADF?
21:18 How does lazy evaluation help in Spark optimization?
22:45 What approach would you take for cluster configuration?
24:53 SQL coding question
Tags
#mockinterview #bigdata #career #dataengineering #data #datascience #dataanalysis #productbasedcompanies #interviewquestions #apachespark #google #interview #faang #companies #amazon #walmart #flipkart #microsoft #azure #databricks #jobs
This is the best interview till now. He worked on security, CDC concepts, etc. Thanks Sumit sir for conducting these interviews. Really helpful for anyone preparing for interviews.
Ram is one of the best candidates of the series.
All the best Ram.
Yes definitely, he has great exposure.
These videos have been very helpful. I have gone over many, and this will be beneficial for a long time. Great work and initiative, and thanks for the efforts.
I am preparing for an interview next week. As you said, a failed interview is better than not attending an interview, and these mock interviews are giving me the confidence to face it. Great initiative, learning and help. Thanks!
You are doing great work. Lot of respect
This is one of the best interviews and I got good insights, Sumit sir. Learnt the new concepts of thin executor and fat executor, and his optimisation techniques are also unique, like join reorder, AQE, etc…
Definitely. It is a great help sir.
This series is great and gives me the confidence to sit in the interview.
Thanks for conducting these mock interviews, Sumit sir. It is really helpful.
It is just a wonderful task... It's very much helpful ❤
Sumit sir you are a legend in BIGDATA...
You are the best trainer I've seen so far... Thanks for making big data enthusiasts' lives easier....
Always happy to help!
Thank you sumit sir and team, your hard work really means a lot
The interview series is really helpful, Thank you
Sumit - these mock interviews are worth their weight in gold.
Very helpful video.... gr8 help ...
Thank you sir, these videos are very helpful.
best interview till now
absolutely
Been watching the interviews every day. Definitely helpful. IMO, feedback at the end based on the interviewee's performance would be great.
we will add it in upcoming ones
What an amazing interview. This series is so helpful. Thanks sumit sir
Keep watching for more such engaging interviews
The interviews are very helpful…
Thank you Sumit and Team
Glad to hear that you found the mock interviews insightful. More such sessions are scheduled for release in the upcoming days.
this is very helpful series
Great one
The series is amazing , I am learning a lot. Please keep making it
Happy to hear that!
Thanks for sharing. Very helpful
Glad it was helpful! Keep watching for more such interesting content
Really helpful for all data engineers.
Happy to know that you are finding the interview series helpful
Gr8 effort
Best interview; every interviewer should give feedback at the end. It would be even more helpful.
noted the suggestion, will incorporate it from the upcoming interviews
Thanks for the interview series!!
glad that you liked it
Good one!!
Thank you as always 🙏🏻
Always happy to share good content for all the data enthusiasts
It was really effective for us, very very helpful.
Happy to know that you found the interview helpful
Very informative.
thank you
Realistic interview ❤
glad that you enjoyed it
Too technical, man!
Very helpful ❤
Glad it was helpful!
It was awesome
Glad that you are finding the mock interviews informative.
Best
What is the common salary range for a Big Data developer (4 yrs of experience)?
Most of the interviews seem to focus on junior and mid-level DE roles. I'd also appreciate some videos for the Senior DE role.
Let me know if you would like to attend.
Sumit sir, please share any mock interview that covers the Azure cloud alone, if possible.
Noted, will share one as soon as possible.
Very nice interview, sir. Can we have an interview for someone who has more than 5 years of experience and is looking to join at an Architect, Team Lead, or Manager level in the Data Engineering world?
Definitely, there will be many more such videos released as part of this interview series. We will try to cover all possible scenarios.
I would like to know the answer to the question: "Why in Spark do we get a duplicate of the column on which the join is performed, whereas in SQL we do not get that duplicate?"
From my limited knowledge, in both Spark and SQL joins we get the duplicate columns when the join is performed on two tables.
I would really appreciate it if you could let me know the correct answer.
The difference lies in the join syntax rather than in the engines themselves. In Spark, an expression-based join condition such as df1.join(df2, df1.id == df2.id) keeps the join column from both DataFrames, so selecting all columns shows it twice. If you instead pass the column name, df1.join(df2, "id"), Spark collapses it into a single column. SQL behaves analogously: SELECT * with JOIN ... ON returns the join column from both tables, while JOIN ... USING (or a NATURAL JOIN) returns it only once.
@sumitmittal07 Thank you so much! That really makes sense now.
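A quick way to see the ON-versus-USING behaviour discussed in this thread, using SQLite from Python as a stand-in for a SQL engine (the Spark DataFrame calls mentioned above are the analogous forms; the table and column names here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE a(id INTEGER, x TEXT);
CREATE TABLE b(id INTEGER, y TEXT);
INSERT INTO a VALUES (1, 'x1');
INSERT INTO b VALUES (1, 'y1');
""")

# JOIN ... ON keeps the join column from both tables, like
# df1.join(df2, df1.id == df2.id) in Spark: 'id' appears twice.
on_cols = [d[0] for d in conn.execute(
    "SELECT * FROM a JOIN b ON a.id = b.id").description]

# JOIN ... USING collapses the join column into one, like
# df1.join(df2, "id") in Spark: 'id' appears once.
using_cols = [d[0] for d in conn.execute(
    "SELECT * FROM a JOIN b USING (id)").description]

print(on_cols)     # ['id', 'x', 'id', 'y']
print(using_cols)  # ['id', 'x', 'y']
```

So the duplicate column is not a Spark quirk; it is what any engine does when the join condition does not tell it the two columns are the same logical column.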
I would like to attend an interview. Could you let me know the process?
Please fill this form - forms.gle/UMpNCZvAHgoLvvuJ6