What is Snowflake Time Travel | Chapter-14 | Snowflake Hands-on Tutorial

  • Published 6 Aug 2024
  • Snowflake's Time Travel feature is a powerful and impactful utility which, used correctly and designed appropriately, saves a lot of time and effort and simplifies many complex operational activities in data projects. Time Travel is not limited to viewing the past state of your data; it opens up a whole new set of possibilities for the way you design and operate data projects. This episode is a quick 30-minute introduction to the Time Travel feature and helps you answer the following questions:
    1. What is Time Travel and how does it work (conceptually)?
    2. Do all Snowflake editions support Time Travel?
    3. Are Time Travel and Fail-safe the same thing?
    4. How does Time Travel work for transient and temporary tables?
    5. Does Time Travel cost a lot?
    6. Are Time Travel and the data retention parameter the same?
    7. What is the extended SQL for Time Travel?
    8. Is Time Travel applicable only to tables, or to databases and schemas too?
    So enjoy this video series and provide your valuable feedback.
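    For readers who want a preview of the extended SQL covered in the episode, here is a minimal sketch of the main Time Travel forms (table name, timestamp and query ID are illustrative, not from the video):

```sql
-- Query a table as it existed at a point in the past
SELECT * FROM orders AT(TIMESTAMP => '2024-08-01 09:00:00'::TIMESTAMP_LTZ);

-- Query a table as it existed 5 minutes ago (offset is in seconds)
SELECT * FROM orders AT(OFFSET => -60 * 5);

-- Query a table as it existed just before a given statement ran
SELECT * FROM orders BEFORE(STATEMENT => '8e5d0ca9-005e-44e6-b858-a8f5b37c5726');

-- Recover a dropped object within its retention period
UNDROP TABLE orders;
```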
    🌐 Site: toppertips.com
    🌐 SQL Scripts www.toppertips.com/snowflake-...
    🚀 🚀 Chapters 🚀 🚀
    -----------------------------------------
    ➥ 00:00 What problem does time-travel solve?
    ➥ 01:08 Episode introduction
    ➥ 01:53 Course Tree Map
    ➥ 02:50 Why subscribe to this channel
    ➥ 03:44 What is time travel?
    ➥ 07:59 Time Travel for Transient & Temp Tables
    ➥ 08:15 Time Travel vs Fail-safe
    ➥ 12:01 Time Travel Hands-on - How to enable time travel
    ➥ 19:36 Time Travel Hands-on - Drop & Undrop Table, Database, Schema
    ➥ 22:56 Time Travel Hands-on - Clone (Create Table) with Time Travel
    ➥ 26:31 Time Travel Hands-on - Create Table As Select [CTAS with Time Travel]
    ➥ 27:17 Time Travel Hands-on - Table Storage Cost for Time Travel
    ➥ 29:28 Time Travel Hands-on - Time Travel + SnowSight
    ➥ 30:40 Thank you note
    🚀 🚀 Snowflake Tutorials (Beginners) All Episodes 🚀🚀
    ---------------------------------------------------------------------------------------------------
    ➥ Chapter 1-to-25 "Complete Playlist" 🌐 bit.ly/3iNTVGI
    ➥ Chapter-1 Snowflake Introduction & History 🌐 bit.ly/3xKHrna
    ➥ Chapter-2 Snowflake Free Trial Registration 🌐 bit.ly/3m6uiCL
    ➥ Chapter-3 Snowflake Architecture 🌐 bit.ly/3sk2fB2
    ➥ Chapter-4 Snowflake Classic or Legacy WebUI 🌐 bit.ly/3stSyzS
    ➥ Chapter-5 Snowflake SnowSight Modern WebUI 🌐 • Snowsight - Snowflake ...
    ➥ Chapter-6 Snowflake Unique Features 🌐 • Snowflake Unique Featu...
    ➥ Chapter-7 Snowflake DB/Schema/Table & Data Loading 🌐 • Snowflake Database/Sch...
    ➥ Chapter-8 Snowflake Must Know Database Objects 🌐 • Snowflake Must Know Ne...
    ➥ Chapter-9 Snowflake Fast Data Loading/Ingestion 🌐 • Fast Data Loading & Bu...
    ➥ Chapter-10 Snowflake Continuous Data Loading 🌐 • Continuous Data Loadin...
    ➥ Chapter-11 Snowflake External Table 🌐 • How to work with exter...
    ➥ Chapter-12 Snowflake Virtual Warehouse 🌐 • How To Work With Snowf...
    ➥ Chapter-13 Snowflake Micro Partition 🌐 • #13 | Micro Partitions...
    ➥ Chapter-14 Snowflake Time Travel 🌐 • What is Snowflake Time...
    ➥ Chapter-15 Snowflake Clone Feature 🌐 • Snowflake Zero Copy Cl...
    ➥ Chapter-16 Snowflake Secure Data Sharing 🌐 • Snowflake Data Sharing...
    ➥ Chapter-17 Snowflake Streams & Change Data Capture 🌐 • Snowflake Stream & Cha...
    ➥ Chapter-18 Snowflake Task & Task Tree 🌐 • Snowflake Tasks & Task...
    ➥ Chapter-19 ETL (Data Pipeline) in Snowflake 🌐 • ETL Workflow In Snowfl...
    ➥ Chapter-20 Role, Grants & Role Hierarchy in Snowflake 🌐 • Role, Grants & Role Hi...
    ➥ Chapter-21 (Part-1) Stored Procedure in Snowflake 🌐 • Stored Procedure In Sn...
    ➥ Chapter-21 (Part-2) User Defined Function in Snowflake 🌐 • User Defined Functions...
    ➥ Chapter-21 (Part-3) Snowflake Views (Standard, Secure & Materialized) 🌐 • What are views in Snow...
    ➥ Chapter-22 Snowflake Information Schema 🌐 • What Is Information Sc...
    ➥ Chapter-23 Snowflake Resource Monitor 🌐 • What Is Resource Monit...
    ➥ Chapter-24 (Part-1) Snowflake JDBC Driver 🌐 • How to Use Snowflake J...
    ➥ Chapter-24 (Part-2) Snowflake ODBC Driver 🌐 • Microsoft Excel & Snow...
    ➥ Chapter-24 (Part-3) Snowflake Python Connector 🌐 • Snowflake Python Conne...
    ➥ Chapter-25 (Part-1) Snowflake & PowerBI Reporting 🌐 coming-soon
    ➥ Chapter-25 (Part-2) Snowflake & Tableau Reporting 🌐 coming-soon
    🚀 🚀 Snowflake Certification Complete Guide & Question Dump 🚀 🚀
    -----------------------------------------------------------------------------------------------------------------
    ➥ Revised Sep 2020 Syllabus: • Video
    ➥ SnowPro Guide: bit.ly/35S7Rcb
    ➥ SnowPro Practice Test (60 Questions): bit.ly/2Ubernv
    #timetravel #dropundrop #snowflakecomputing #snowflakeclustering #snowflakecompute #snowflaketimetravel #snowflakeedition

Comments • 63

  • @kirankumar2650
    @kirankumar2650 1 month ago +3

    Wonderful explanation. You are helping so many people without them having to spend money on course training. Your efforts are immeasurable.

  • @ketanrehpade9150
    @ketanrehpade9150 2 years ago +1

    This is my favourite topic, so easily explained, and that too in 30 minutes with all the demos. Excellent... Thanks!

    • @DataEngineering
      @DataEngineering  2 years ago

      Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.

  • @nishavkrishnan4271
    @nishavkrishnan4271 2 years ago +2

    I just finished going through Chapter-1 to Chapter-14. The Time Travel one was so cool, and this chapter was interesting with all the hands-on tutorials. Thank you so much!

  • @peterscalise1176
    @peterscalise1176 2 years ago +1

    Just finished chapters 1-14...GREAT TUTORIALS! THANK YOU!!!!
    😀

  • @neerajtiwari1695
    @neerajtiwari1695 2 years ago

    I have seen all 14 of your videos. Now I am waiting for the rest of the videos you mentioned in the course curriculum. Thanks again for your wonderful video content.

    • @DataEngineering
      @DataEngineering  2 years ago

      Chapter-15 is out: czcams.com/video/cAyM-Nj9WOc/video.html. I hope you like it and learn from it.

  • @moh.7777
    @moh.7777 10 months ago +1

    Thanks for your excellent videos.
    Can you please share the SQL scripts used for the time travel tutorial?

  • @dt0229
    @dt0229 2 years ago +1

    great content, thank you!

  • @alertforfalsecase2299
    @alertforfalsecase2299 2 years ago

    Excellent explanation. Really enjoyed it a lot.

    • @DataEngineering
      @DataEngineering  2 years ago

      Glad it was helpful!
      ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡
      I have already published other knowledge series and Snowflake certification videos; if you are interested, you can refer to them.
      🌐 Snowflake Complete Guide Playlist ➥ bit.ly/3iNTVGI
      🌐 SnowPro Guide ➥ bit.ly/35S7Rcb
      🌐 Snowflake SQL Series Playlist ➥ bit.ly/3AH6kCq
      🌐 SnowPro Question Dump (300 questions) ➥ bit.ly/2ZLQm9E
      ⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡⚡

  • @tejaswinerella5223
    @tejaswinerella5223 8 months ago

    Hi,
    Thanks for your excellent videos.
    Can you please share the SQL scripts used for the time travel tutorial?

  • @vedantshirodkar
    @vedantshirodkar 9 months ago

    Nicely explained. Thank you

    • @DataEngineering
      @DataEngineering  9 months ago

      Glad it was helpful!
      And yes, if you want to manage Snowflake more programmatically, you can watch my paid content. Many folks don't know the power of Snowpark; these 2 videos will help you broaden your knowledge.
      This content is available at a discounted price for a limited time (one for JSON and one for CSV). It can automatically create DDL and DML and also run the COPY command:
      1. www.udemy.com/course/snowpark-python-ingest-json-data-automatically-in-snowflake/?couponCode=SPECIAL50
      2. www.udemy.com/course/automatic-data-ingestion-using-snowflake-snowpark-python-api/?couponCode=SPECIAL35

  • @geoffreyhibon2651
    @geoffreyhibon2651 2 years ago +1

    Thanks a lot for your videos :)

    • @DataEngineering
      @DataEngineering  2 years ago

      Thanks a lot. Good to know that these videos are adding value.

  • @rociogonzalez3577
    @rociogonzalez3577 1 year ago +1

    Great tutorial :) would you be able to share the SQL scripts used? I'm not able to see them in the link you provided. Thanks :)

  • @ramum4684
    @ramum4684 2 years ago

    I can see the excellent hard work from this team in making us experts in Snowflake.
    Now I would like to get all the SQL scripts used in all the videos for practice.
    I would be thankful to get all the scripts, including all the Python scripts, for practice and complete understanding.
    Thanks in advance
    Ramu M

    • @DataEngineering
      @DataEngineering  2 years ago

      Yes, sure. Let me see how I can make all of them available.

  • @RK-wf7re
    @RK-wf7re 2 years ago +2

    Nice content and your efforts are superb. Eagerly waiting for the 15th part; when can we expect it?

  • @tanayamandal111
    @tanayamandal111 2 years ago

    Another excellent video. Thank you so much.
    Please make a video on the advanced Snowflake certifications if possible. I am planning to go for another certification after SnowPro, but there is very little information on the internet.

    • @DataEngineering
      @DataEngineering  2 years ago

      You are welcome...
      It is on my list, but it takes time as I want to make sure I cover as much as possible for my audience.
      Let me share a quick experience from my advanced certification:
      1. Questions are very descriptive, and it takes time to read them.
      2. You will feel a time crunch, as two of the options will look exactly the same.
      3. Lots of scenario-based questions from every topic.
      4. 3-4 questions will come from the basics, so make sure you read the basics too.
      5. Lots of questions on data loading, micro-partitions, stored procedures (especially JavaScript), transactions and performance, most of them scenario-based.
      6. You will not get time to go back and review, so don't jump around; go in sequence.
      7. Read the documentation and the new features as well, like masking, API calls, etc.
      4-6 weeks of preparation should be enough. Since you have the SnowPro certification, you can appear for the Advanced Data Engineer or Architect exam.
      I will surely make a detailed video, but you will have to wait.

    • @tanayamandal111
      @tanayamandal111 2 years ago

      @@DataEngineering thank you so much for your inputs. Your videos helped a lot in clearing SnowPro.

  • @kannanarjun6569
    @kannanarjun6569 1 year ago

    Every time, I need to capture the statement ID while updating or deleting rows.
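    [Editor's note: the statement-based form of Time Travel the commenter refers to can be sketched like this; table and column names are illustrative, and `LAST_QUERY_ID()` is used to capture the ID without copying it from the query history.]

```sql
-- Run the DML, then capture its query ID into a session variable
UPDATE orders SET o_orderstatus = 'F' WHERE o_orderkey = 1;
SET qid = (SELECT LAST_QUERY_ID());

-- Read the rows as they were just before that update ran
SELECT * FROM orders BEFORE(STATEMENT => $qid);
```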

  • @govardhanyadav2684
    @govardhanyadav2684 1 year ago

    Thanks for the nice content; would love to see more!
    I have a question: I set time travel of 4 days at the schema level, but for a table inside it I changed it to 10 days. If the schema gets dropped beyond its time travel period, can I still get the data for the table that has a 10-day time travel period?
    Thanks!

    • @akashsharma4769
      @akashsharma4769 8 months ago

      Hi Govardhan Yadav, this was a good question, and it helped me clear up the concept, as I was also confused. I would like to share my learnings with you.
      Say you have defined a schema with a time travel retention period of 2 days. This means any tables in this schema will, by default, keep historical data for 2 days.
      However, for a specific table in that schema, you have overridden the retention period and set it to 10 days instead.
      In this case, the table-level setting takes precedence. So even though the schema retention is 2 days, that table will keep historical data for 10 days.
      The schema-level retention period acts as a default that can be overridden at the table level. Any other tables in the schema that do not override it will still follow the 2-day retention.
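      [Editor's note: the precedence described above can be expressed in SQL; the database, schema and table names are illustrative.]

```sql
-- Schema-level default: 2 days of time travel
CREATE SCHEMA mydb.sales DATA_RETENTION_TIME_IN_DAYS = 2;

-- This table inherits the schema default (2 days)
CREATE TABLE mydb.sales.orders (id NUMBER);

-- This table overrides it to 10 days (requires Enterprise edition or higher)
CREATE TABLE mydb.sales.order_history (id NUMBER) DATA_RETENTION_TIME_IN_DAYS = 10;

-- Verify the effective retention for a table
SHOW PARAMETERS LIKE 'DATA_RETENTION_TIME_IN_DAYS' IN TABLE mydb.sales.order_history;
```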

  • @srinivasp6579
    @srinivasp6579 11 months ago

    Thank you for the detailed tutorial; I really appreciate it. It looks like the SQL scripts link is not working. It gives an error: "You don't have permission to access the resource."

  • @arumughamthiagarajan4910

    Sessions are great and worth spending time on. Can you please share the link for the sample tests?

    • @DataEngineering
      @DataEngineering  1 year ago

      Here is the entire playlist; look for Ch-14 in it (czcams.com/play/PLba2xJ7yxHB5X2CMe7qZZu-V4LxNE1HbF.html)

  • @arumughamthiagarajan4910

    I have completed the SnowPro Core certification. These videos are great and helped a lot. I am planning to do the Snowflake Architect certification. Are there separate videos for that?

    • @DataEngineering
      @DataEngineering  1 year ago

      Great to hear! I have not yet published anything for the Architect level, but will do it soon.

  • @harshaaaditya2430
    @harshaaaditya2430 1 year ago +1

    Hi, I have a question regarding "data_retention_time_in_days". Is this attribute inheritable? If a database is created with this attribute set to, say, 10 days, would any schema or table created within the database without this attribute defined get the value 10, or the default value of 1?

  • @ankitsoni5286
    @ankitsoni5286 1 year ago

    Hi, it would be great if you could mention which of your quiz videos we should go through after finishing each one in this series.

  • @narenkrishh7412
    @narenkrishh7412 1 year ago

    Hi bro. Please help us by providing the SQL scripts; the link is not working.

  • @niveditharaokulakarni4193

    Hi, thanks for making such detailed videos; they are really helpful in understanding Snowflake. I have a question: I created a table on day 1, and from day 1 onwards it has been receiving updates. I can recover the old data up to 90 days back using time travel. What if the updates happened beyond 90 days ago (beyond the fail-safe stage)? How can I recover the data after fail-safe? And why is it only 90 days? Please explain. Thanks in advance.

    • @DataEngineering
      @DataEngineering  2 years ago

      Thank you 🙏 for watching my video and your word of appreciation really means a lot to me.
      Time travel costs a lot for a churning table (if enabled for 90 days), and if you could keep it forever, it would cost a fortune on your Snowflake account. If you need to store historical data for a longer period, time travel is probably not the right feature.

    • @niveditharaokulakarni4193
      @niveditharaokulakarni4193 2 years ago

      Thank you for taking the time to respond. But what happens to the old micro-partitions of a table for which updates have been received, and to those micro-partitions whose retention period and fail-safe are complete? Thank you.

  • @dhirajgrover8664
    @dhirajgrover8664 2 years ago

    All your videos are great. Where can I find the scripts used in your videos?

  • @swethakalidoss6578
    @swethakalidoss6578 11 months ago

    Will fail-safe be enabled after 1 day for a standard Snowflake account, since the Standard edition has 1-day time travel?

    • @DataEngineering
      @DataEngineering  11 months ago

      Fail-safe is a table-level feature, not an edition-level feature... so yes.

  • @sharathbunty891
    @sharathbunty891 11 months ago

    We reduced the retention period from 3 days to 2, so the changes made on the first day will be lost. Isn't it possible to bring those changes back using fail-safe?

    • @DataEngineering
      @DataEngineering  11 months ago

      Yes, it can be done, but it is too much effort unless the data is very critical to you and your team. If it is a permanent table and you have lost data due to the time-travel configuration, you can contact the Snowflake team and they will recover it. I assume you are not using the free trial edition and your company has a contract with Snowflake.

  • @VenkataVamsiVardhineni

    Unable to open the SQL scripts.

  • @asilbek_sanoqulov
    @asilbek_sanoqulov 9 months ago

    Guys, can anybody share the scripts? The site given by the author says I don't have permission to access them.

  • @gokulnavamaniphotosbygokul1072

    I have a question: when I create a table, by default it is created as a transient table. Is there any way to change it to a permanent table?

    • @DataEngineering
      @DataEngineering  2 years ago

      The default table type is permanent, so you might be making a mistake somewhere.
      Could you share the DDL script? That will help me understand the problem.
      I generally use transient tables in my SQL, as they cost less compared to permanent tables.

    • @gokulnavamaniphotosbygokul1072
      @gokulnavamaniphotosbygokul1072 2 years ago

      @@DataEngineering Here's the same Table Script that you've used in this tutorial -
      CREATE OR REPLACE TABLE POPWAREHOUSE.ORDER_DETAILS (
      O_ORDERKEY NUMBER(38),
      O_CUSTKEY NUMBER(38,0),
      O_ORDERSTATUS VARCHAR(1),
      O_TOTALPRICE NUMBER(12,2),
      O_ORDERDATE DATE,
      O_ORDERPRIORITY VARCHAR(15),
      O_CLERK VARCHAR(15),
      O_SHIPPRIORITY NUMBER(38,0),
      O_COMMENT VARCHAR(79)
      )
      data_retention_time_in_days = 50;
      Actually, I'm using a Free Trial Account (Enterprise Edition); could that be a reason?
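      [Editor's note: for reference, the table kinds in this thread can be made explicit at creation time. A sketch with illustrative names; a plain CREATE TABLE produces a permanent table unless the enclosing database or schema is itself transient.]

```sql
-- Permanent table (the default): time travel plus a 7-day fail-safe period
CREATE TABLE t_perm (id NUMBER);

-- Transient table: at most 1 day of time travel, no fail-safe
CREATE TRANSIENT TABLE t_trans (id NUMBER);

-- Tables created inside a transient database or schema are transient themselves
CREATE TRANSIENT SCHEMA scratch;
CREATE TABLE scratch.t (id NUMBER);  -- this one is transient
```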

  • @rohitmanderwad6034
    @rohitmanderwad6034 9 months ago

    What happens to time travel if a table is renamed? What if I alter the table and add a new column? Or create or replace the table?

    • @DataEngineering
      @DataEngineering  9 months ago

      Good question... but I would suggest it is easy to try that out yourself; please share the behaviour.

    • @rohitmanderwad6034
      @rohitmanderwad6034 9 months ago

      I would be more than happy to try it out, but I use an individual personal account and unfortunately I have consumed all the available credits. The billing has emptied my pockets 😅

  • @raghumajji2326
    @raghumajji2326 2 years ago

    Hi, I have one question about time travel. Assume we created a table with time travel of 90 days, and no transactions happened in between. If I drop the table on the 92nd day, will I be able to undrop it?

    • @DataEngineering
      @DataEngineering  2 years ago +1

      Yes. The 90 days is a moving window; it is not fixed from the day the table is created. So if you created the table on 01-Jan and dropped it 91 days later, you can still UNDROP it, because UNDROP works within the retention period counted from the drop, not from creation. What you lose is history older than the window: any transactions done in the first couple of days fall outside the 90 days preceding the drop and can no longer be queried, but everything within that window can be recovered.
      I hope this is clear.
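      [Editor's note: the rolling-window behaviour described above can be checked hands-on; table name is illustrative, and the 90-day maximum applies to Enterprise edition and higher.]

```sql
-- Create a table with the maximum retention
CREATE TABLE audit_log (id NUMBER) DATA_RETENTION_TIME_IN_DAYS = 90;

-- ... 92 days later ...
DROP TABLE audit_log;

-- UNDROP works within the retention window counted from the drop
UNDROP TABLE audit_log;

-- History older than the moving 90-day window is no longer queryable;
-- versions inside the window can still be read, e.g. 30 days back:
SELECT * FROM audit_log AT(OFFSET => -60*60*24*30);
```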

    • @raghumajji2326
      @raghumajji2326 2 years ago +1

      @@DataEngineering Thank you so much... your videos are awesome. Thank you so much for your efforts to help others learn.

    • @DataEngineering
      @DataEngineering  2 years ago

      @@raghumajji2326 thanks..

  • @ushakiran3870
    @ushakiran3870 2 years ago

    How does Matillion work?

    • @DataEngineering
      @DataEngineering  2 years ago

      It is a big topic; do you have any specific requirement or learning expectation? Matillion has its own learning university.