Import Multiple Pages from Web with One Query in Excel

  • Added 30. 07. 2024
  • In this video, we import multiple pages of data from a website with a single Excel query using Power Query.
    We create a table of URLs on a sheet that can update automatically, import data from the web, and then turn that import into a reusable function with a little M code.
    We then create a custom column that feeds the URLs from the table to this function, importing as many pages as we want (a minimal M code sketch follows at the end of this description).
    Find more great free tutorials at:
    www.computergaga.com
    ** Online Excel Courses **
    The Ultimate Excel Course - Learn Everything ► bit.ly/UltimateExcel
    Excel VBA for Beginners ► bit.ly/2JvnnRv
    Advanced Excel Tricks ► bit.ly/3CGCm3M
    Excel Formulas Made Easy ► bit.ly/2t3netw
    Creating Sports League Tables and Tournaments in Excel ► bit.ly/2viGg3J
    Connect with us!
    LinkedIn ► / 18737946
    Instagram ► / computergaga1
    Twitter ► / computergaga1
  • How-to & Style
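
  • M code sketch (for reference)
    A minimal sketch of the approach in Power Query M. The function name fnGetTable, the workbook table name URLs, its URL column, and the added column name PageData are placeholder names chosen here for illustration rather than taken from the video, and Web.Page assumes the legacy web connector, so adjust them to match your own workbook.

    // Custom function query (saved, for example, as fnGetTable):
    // takes a page URL and returns the first HTML table found on that page.
    (pageUrl as text) as table =>
    let
        Source = Web.Page(Web.Contents(pageUrl)),
        FirstTable = Source{0}[Data]
    in
        FirstTable

    // Main query: read the URLs table from the workbook, call the function
    // once per row, and expand the nested page tables into one combined table.
    let
        UrlTable = Excel.CurrentWorkbook(){[Name = "URLs"]}[Content],
        AddedData = Table.AddColumn(UrlTable, "PageData", each fnGetTable([URL])),
        Expanded = Table.ExpandTableColumn(
            AddedData,
            "PageData",
            Table.ColumnNames(AddedData{0}[PageData])
        )
    in
        Expanded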

Comments • 604

  • @Xack03
    @Xack03 4 years ago +49

    Not only this is incredibly useful, but is also wonderfully explained. Thanks man!

    • @Computergaga
      @Computergaga  4 years ago +4

      My pleasure. Thank you very much.

    • @vv9730
      @vv9730 3 years ago +1

      Same here... but your reaction while deleting the URLs in the table!!

  • @GoodlyRogue
    @GoodlyRogue 2 years ago +2

    I'm overwhelmed by the incredible level of clarity this tutorial displays! Kudos

  • @cruzinsweetsntreats
    @cruzinsweetsntreats 2 years ago +2

    This video tutorial is now about 16 months old, but it's going to help me in pulling in the needed data from my UberEats Deliveries. I realized I needed to automate the data to help maximize my taxes, and the Power Query as a function to bring in multiple pages is really what I need. So thank you very much. I look forward to getting it implemented, so that I can link the Excel files in MS Access for further data analysis.

  • @ishthaish
    @ishthaish 2 years ago +7

    Searched for this all over. This has been the best example I've seen, and it's very well explained. Thank you very much.

  • @nevermind2240
    @nevermind2240 4 years ago +4

    I can't thank you enough, Computergaga! My work project just became a breeze.

  • @amarnadhgunakala2901
    @amarnadhgunakala2901 3 years ago +4

    You made my day, arriving right on time when I was looking for an immediate resolution. Thanks!

  • @subantaneogi
    @subantaneogi 4 years ago +1

    This tutorial is pure Gold! Thanks! Looking forward to more videos from you. :)

  • @isparoz
    @isparoz 4 years ago +1

    You've become one of my top instructors and YouTube channels with just one single video! Thanks a million!

  • @VimalaKamandjaja
    @VimalaKamandjaja 1 year ago +1

    I couldn't be more grateful to have found this channel. Dear Computergaga, you explain everything so clearly and understandably :) A real life saver!

  • @ytlfv
    @ytlfv 1 year ago +1

    I can't thank you enough for this detailed video (step by step). It's 3 years old but still valid and valuable. God bless you!

  • @kurtl8660
    @kurtl8660 3 years ago +2

    You, sir, ARE A LEGEND. I searched for ages for a way to accomplish what this allows and came up empty. When I landed on your video, I was home free. Thank YOU!!!

  • @MatthewFoxX
    @MatthewFoxX 2 years ago +1

    This is an excellent tutorial - clear and easy to follow, thank you! It's saved me a ton of time.

  • @gauravkaul4
    @gauravkaul4 3 years ago

    Worked like a charm!! No tutorial has ever worked for me in one go before!! Thanks!

  • @JohannAltmann-hu4cx
    @JohannAltmann-hu4cx 7 months ago

    This video is still the best when it comes to Excel web scraping! For me it was a breakthrough. Many thanks!

    • @Computergaga
      @Computergaga  7 months ago

      You're very welcome! Thank you for your comment, Johann.

  • @user-iv2zx9ej2n
    @user-iv2zx9ej2n 3 years ago

    Hello.
    I had difficulty studying on my own because there are not many Korean books that cover Power Query editing or parameters.
    Thanks to your video, I have integrated multiple APIs into a single query.
    I clicked Like and Subscribe.
    Have a nice day!!

  • @szymonrutkowski5957
    @szymonrutkowski5957 4 years ago +1

    This is so easy that it seems impossible. I've tried so many solutions from the internet that were not even close to what you've demonstrated. Huge credit! Thx!

  • @brockgroesbeck751
    @brockgroesbeck751 4 years ago

    This was a huge help! Needed to download over 5000 rows of data. You saved me so much time. Thank you!

  • @bzb23
    @bzb23 2 years ago

    You guys are awesome. This video is what I needed, and it completely blew my mind. I got quite confused about the URLs table and the URL column, but I handled it and applied it exactly as you did to the example that concerned me. Big THX.

  • @ajaykaturi965
    @ajaykaturi965 4 years ago +2

    Great help, Alan.
    I have a query. If my data is behind a login (I have a username and password), how do I access it? The data preview lands on the login page and doesn't return any values (as expected). Is there a way to pass these credentials as part of the URL too? Thanks!

  • @romeop2345
    @romeop2345 4 years ago +1

    Hi Alan, what would you do if one column is different for each URL page? For example, I tried to apply this to currency exchange rates, but it couldn't work because the currency column differs for each webpage (Swedish krona, US dollar, etc.). All other columns are the same.

  • @bookworms77
    @bookworms77 4 years ago

    Thank you so much! Was exactly what I needed. I will be using this method again and again!

  • @Hak-An
    @Hak-An 3 years ago

    I was desperately in need of this, and your video incredibly came my way. Besides, your presentation is perfect... thank you.

  • @boza-fp5mw
    @boza-fp5mw 4 years ago +1

    Brilliant, these are great. How do you get data that doesn't import with the URL? For example, a home and away table in football. Any ideas would be great. I am stuck.

  • @gcanelon2000
    @gcanelon2000 1 year ago

    Excellent video. I used it to download 150 tabs of info on the WSJ website with all public companies listed in the USA, and it worked like a charm. Thanks a lot.

  • @TauseefKazi
    @TauseefKazi 5 months ago

    Awesome tutorial, breaking down this complex process into easy-to-follow steps along with a good example. Thank you, sir.

  • @andresalizaga1
    @andresalizaga1 4 years ago

    This is magnificent. I just implemented it for my case and it works perfectly. One question though: what if my dataset has 72,000+ pages? I've calculated that it would take roughly 4.1 days to fetch! Any ideas? I tried using the Ajax URL to speed things up, but it took the same time.

  • @shreyaskoushik4146
    @shreyaskoushik4146 1 year ago

    This is incredibly helpful and useful. It really helped me solve a two-year headache. Thank you so much. Appreciate the detailed guide.

  • @tomstrack42
    @tomstrack42 3 years ago

    Exactly what I was looking for and perfectly explained, thank you!

  • @shariftanzim
    @shariftanzim 2 years ago

    Great video. Exactly what I was looking for and you covered it all. Thank you so much.

  • @niranjankumarcoimbatore5842

    Wow, excellent information through simple teaching... Even a layman can understand this module... It creates more interest in learning... Great work... From a non-computer-literate man from Coimbatore, Tamil Nadu, South India... slowly learning computers.

  • @DKthegameboy
    @DKthegameboy 4 years ago +1

    I never see videos with this many views and 0 dislikes, but after trying this solution, I understand why. Great video!

  • @johnhawkins7117
    @johnhawkins7117 4 years ago

    Excellent - ¡muy bueno! Q: In my query, I'm trying to take information from one of the columns in the table and get a STDEV of the range, but it just returns the #DIV/0! error. For example, I'm using =STDEV.S($D$2:$D$5) as my formula (and entering it as an array formula as well). No luck... and in fact no other range formulas are working either... any ideas for a workaround? Many thanks!

  • @billostrove2016
    @billostrove2016 3 years ago +1

    Very helpful video! Useful technique, and explained in an easy to understand way.

  • @hazelyip316
    @hazelyip316 1 year ago

    Excellent & smart stuff, saving tons of work. Grateful for sharing!!!

  • @TonyFeld
    @TonyFeld 3 years ago

    Great tutorial. Exactly what I was looking for. With a little change to the URL table, I was able to include TO and FROM dates and also page numbers. In my case I always know there are going to be 3 pages (most of the time). I'm not sure how to ignore the error and still load the query if one day, for example, there are only 2 pages. I'd still want it to load those 2 pages and ignore the error for page 3, kind of like the IFERROR function in Excel. Ideally it would still load, maybe with a message box saying that page 3 was not found, or something like that. Can anybody suggest anything?

  • @m_dhifan_n
    @m_dhifan_n 1 year ago

    One of the best tutorials I've ever seen! Thanks a lot!

  • @dhananjayv.8237
    @dhananjayv.8237 2 years ago

    Excellent!!!! This is what I was looking for. Very well explained, step by step.

  • @rymask9256
    @rymask9256 4 years ago +2

    Thank you! Took me way too long to find this gem

  • @lordmarioh
    @lordmarioh 7 months ago

    Love it ! Thank you so much! With the new Web Connector, this became so much better!

    • @Computergaga
      @Computergaga  7 months ago

      Oh yes! 👍Thank you. You're welcome.

  • @koeg70
    @koeg70 3 years ago

    Thank you. Well explained and saved a lot of my time. This was exactly what I was looking for.

  • @anubhavsapra2853
    @anubhavsapra2853 2 years ago

    This is brilliant mate. Many thanks

  • @chriss5438
    @chriss5438 1 year ago

    Thank you - if I want to review daily in Excel, and the website URLs are dynamic and change daily (i.e. date-centric), what would you suggest doing to ensure the most recent pages are being populated in Excel?

  • @HienNguyen89
    @HienNguyen89 4 years ago

    Love it, it's so helpful for me, thank you so much!

  • @chirag1803
    @chirag1803 2 years ago

    One of the best tutorials. Does the job in a matter of minutes.

  • @nelsonrioux5555
    @nelsonrioux5555 1 year ago

    WOW. It works perfectly. It is exactly what I needed. Such a perfect example. You saved me hours and hours of a boring task. Many thanks.

  • @JoaoSantos-ih9qk
    @JoaoSantos-ih9qk 1 year ago

    Hi, I can only say thank you!! You're an amazing teacher. Cheers.

  • @santalvares
    @santalvares 3 months ago

    Thank you! Very useful and it changes the structure of my work a lot. More efficient.

  • @serkanalic
    @serkanalic 3 years ago

    Life saver.... Can't thank you enough... Been using this for months....

  • @robertreed4337
    @robertreed4337 2 years ago

    Hi Alan, happy new year, I hope you have a good one. You've been a great help so far, but on this I'm stuck. I want to get certain pages of a paper sent to my Excel sheet, but all I'm getting is the hyperlink to the paper - can you help? Also, I have 6 sets of 4 figures that I want to check against each other and then back-track 1,000 times. Do you think I'm being over-optimistic in trying this exercise?

  • @allieone8518
    @allieone8518 8 months ago

    Thanks so much for your excellent tutorial!!! It really helped get the job done!!! 👍👍👍

  • @iankr
    @iankr 3 months ago

    Many thanks, Mr Computergaga - awesome!

  • @elbadlis
    @elbadlis 1 year ago

    Thanks, Alan. You are awesome!

  • @folasadeadeyanju8070
    @folasadeadeyanju8070 3 years ago

    I am grateful for this... I had a dataset of 40,000+ rows across 82 pages and it worked perfectly. Thank you for sharing the knowledge. I'm gonna try this in Power BI too and see if it works. Thanks again!

    • @sindhup6239
      @sindhup6239 5 months ago

      How much time does it take to fetch the data?

  • @chrislovett2541
    @chrislovett2541 3 years ago

    Fantastic video. However, is it possible to grab underlying URL links in a web table using this method? Or would that require VBA?

  • @muratsahin163
    @muratsahin163 4 years ago

    Great work! Many thanks, that helped a lot.

  • @wayneedmondson1065
    @wayneedmondson1065 5 years ago

    Hi Alan.. excellent video. Before, I was doing this one by one :( Thanks for showing these great time saving tips and tricks :) Thumbs up!

  • @philaman1972
    @philaman1972 3 years ago

    This was a great video. I think the main use case for this functionality is to automate web scraping of changeable data such as stocks, commodities, and prices on goods and services. Otherwise, one could simply cut and paste stagnant data.

  • @FF_Goose
    @FF_Goose 2 years ago

    This is an excellent video. It helped me overcome a major hurdle today. Thank You!

  • @scaponyx
    @scaponyx 2 years ago

    I am also REALLY happy, as it helped me with the following use case.
    I have addresses I want to scrape off the web. They are nicely in a table, but there is a hierarchy like this: umbrella company -> daughter companies -> branch addresses. I got the URLs from a sitemap and needed to load everything at once... and this helped me do exactly that. Hopefully it helps someone else too!

    • @Computergaga
      @Computergaga  1 year ago

      Awesome! Thank you for leaving that comment.

  • @sambrooks5050
    @sambrooks5050 3 years ago

    This video was amazing and so easy to follow. Thanks!!!

  • @paulbate5916
    @paulbate5916 1 year ago

    Thank you very much, just spent hours trying to get this to work on other sites.

  • @johnadair4979
    @johnadair4979 4 years ago

    Great video! Thanks for posting.

  • @projectrevelation1633
    @projectrevelation1633 11 months ago

    Helped me a lot mate. Thanks a lot.

  • @marcovincenzoguarrera1933

    great job! very helpful! Thanks

  • @pavelbursa9247
    @pavelbursa9247 1 month ago

    Cool tutorial! Thank you. Bombastic! I have learned how to make a function out of a procedure!

  • @oodaikiran
    @oodaikiran 2 years ago

    Thank you so very much...beautifully explained, Highly underrated video. Just Brilliant

  • @chensonggao9997
    @chensonggao9997 1 year ago

    This is gold! Thank you so much

  • @rddslt
    @rddslt 2 years ago

    You have saved me a huge amount of time - thank you so much!

  • @kennethvela2784
    @kennethvela2784 1 year ago

    This is gold, man, thank you. I was able to fetch currencies from the last 7 years from the xe site, and they love me now. I was also able to get data from steamship lines and so on. Thank you so much, man. God bless you.

    • @Computergaga
      @Computergaga  1 year ago

      You're very welcome. Nice work Kenneth.

    • @kennethvela2784
      @kennethvela2784 1 year ago

      @Computergaga Can you show us how to do the same thing but with PDF files? Meaning, the source would be the folder where the PDFs are located.

  • @mattquigley6842
    @mattquigley6842 4 years ago

    Excellent tutorial. Thank you!

  • @abhisekgupta_
    @abhisekgupta_ 3 years ago

    A very big thumbs up for you man. You made my work easier

  • @AMadle11
    @AMadle11 2 years ago

    This is so amazing. I've been trying to accomplish this for months. I'm having one small hiccup though - my import only repeats the scrape of the FIRST URL. I seem to be following the steps exactly, and I see the different URLs in the query, but the output is consistently the first URL - any idea why this might be off? Amazing video!

  • @MeArcides
    @MeArcides 10 months ago

    What a piece of knowledge...very useful for me..thank you very much

    • @Computergaga
      @Computergaga  10 months ago

      You are most welcome. Glad that it was useful.

  • @psychicEgg
    @psychicEgg 4 years ago +3

    You're an Excel legend! Thanks so much for the tutorial. Quick question: is there any way to add a short delay (e.g. 2 seconds) between the import of each table page?
    Some of the tables I'm importing from the web are around 100 pages deep, and I don't want to put stress on their system and risk getting IP-blocked. Thanks very much.

    • @Computergaga
      @Computergaga  4 years ago

      Thank you. I've not used a delay before.

    • @timmeeyh6523
      @timmeeyh6523 3 years ago

      @Computergaga I was wondering the same thing - is that just layman's thinking?

  • @dmertz6538
    @dmertz6538 3 years ago

    This just helped me A LOT - thanks so much!

  • @MayilvelRagavel
    @MayilvelRagavel 4 years ago

    Great Effort. Nicely presented ... Thanks a ton

  • @acemcjack
    @acemcjack 2 years ago

    Thank you so much! You really saved me here (And taught me how to use a great and powerful tool)!

  • @allabout1135
    @allabout1135 5 years ago +1

    Amazing. This was a masterpiece.

  • @tiendungnguyen5174
    @tiendungnguyen5174 2 years ago

    This was a great video. Thanks so much for the tutorial

  • @lfszekely
    @lfszekely 4 years ago

    Fantastic!! It saved me so much time. Very well explained

  • @reachhari
    @reachhari 3 years ago

    Fantabulous... Great explanation & it worked on the first go... Thanks a ton.

  • @seanreidy3401
    @seanreidy3401 11 months ago

    Thanks, this solved a data input problem I was having.
    Very well explained.

  • @Luciano_mp
    @Luciano_mp 5 years ago

    Very helpful. Thanks Alan.

  • @vitorspinelli8782
    @vitorspinelli8782 4 years ago

    THANKS FOR THIS INCREDIBLE TIP!

  • @DangrPR
    @DangrPR 2 years ago

    Thanks, you helped me finish an important part of my Business Intelligence internship project.

  • @galagalago
    @galagalago 2 years ago

    This is very helpful, thank you for sharing this knowledge!

  • @xavierculebras3266
    @xavierculebras3266 2 years ago

    Great video and great explanation !!! Congratulations

  • @rupamroychoudhury8078
    @rupamroychoudhury8078 4 years ago

    Saved My day...thanks a lot buddy..loads of love

  • @richparnold
    @richparnold 4 years ago

    Fantastic, this is amazing!!

  • @herbert633
    @herbert633 3 years ago

    Mahalo! You solved my whole problem just a few minutes into looking into how this works.

  • @dasrotrad
    @dasrotrad 3 years ago

    Thank you Computergaga. This was a superb explanation and tutorial.

  • @sohailmoosa
    @sohailmoosa 3 years ago

    Hey, this is an incredible tutorial, very simplified and well explained. Thanks a million.
    I tried the above steps; however, it gives me an error from the third URL onwards. I don't know why...
    Can you share this workbook? Then maybe I can add my URLs there and check whether it works.

  • @-Tharos-
    @-Tharos- 1 year ago

    This is brilliant! I love how you go through the steps - it seems like you're commentating on a football match! :) Don't get me wrong, it's really well explained! Thanks for that.

  • @edgarortiz9566
    @edgarortiz9566 2 years ago +2

    This is great, thank you!
    What if the URL doesn't have a unique identifier for the different pages? I'm trying to download data from more than 50 pages but every time I go to the next page and see a different spreadsheet, it has the same URL. There is no equivalent to the year numbers in your URL.

  • @bradleybeauclair8282
    @bradleybeauclair8282 2 years ago

    This was my first fx table. Thank you so much. Please follow this video up with a 5-minute time intelligence table video if you want. I'm going to watch your other videos.

  • @jxtrakk
    @jxtrakk 3 years ago +1

    You saved me several hours of manual operational work

  • @massimocotrozzi9156
    @massimocotrozzi9156 1 year ago

    Great stuff! thanks a lot mate. made my work a lot easier :)

  • @neo778
    @neo778 3 years ago

    Great tutorial! I want to build a tool to scrape the first 10 webpage URLs for different keywords. The problem is that the import does not fetch the href="" part, where the URLs of the listed webpages are. How do I get this URL information?

  • @JustYouAvatar
    @JustYouAvatar 3 years ago

    Thank you for this video!

  • @luigisileno8459
    @luigisileno8459 4 years ago

    I normally don't leave comments and just hit the "like" (or not) button. But this video, honestly, is very well made and helped me a lot. Thank you, Computergaga.

    • @Computergaga
      @Computergaga  4 years ago

      You're very welcome. Thank you for the comment Luigi.