Import Multiple Pages from Web with One Query in Excel
- Added 30 Jul 2024
- In this video, we import multiple pages of data from a website with one Excel query using Power Query.
We create a table of URLs that can automatically update on a sheet, import data from the web, and then turn the query into a function with some M code.
We then create a custom column to use this function and feed it the URLs from the table to import as many pages as we want.
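The steps above can be sketched in M code. This is a minimal outline, not the video's exact code; the URLs sheet table, its URL column, and the assumption that the data sits in the first HTML table on each page are all illustrative:

```powerquery
// The web-import query turned into a function (e.g. named GetPage):
// it takes a URL and returns the first table found on that page.
(url as text) as table =>
let
    Source = Web.Page(Web.Contents(url)),
    // Web.Page returns a table of page elements; {0}[Data]
    // picks out the data of the first table it found.
    FirstTable = Source{0}[Data]
in
    FirstTable
```

A second query then loads the URLs table from the sheet, adds a custom column with `=GetPage([URL])`, and expands that column so every page's rows stack into one result.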
Find more great free tutorials at:
www.computergaga.com
** Online Excel Courses **
The Ultimate Excel Course - Learn Everything ► bit.ly/UltimateExcel
Excel VBA for Beginners ► bit.ly/2JvnnRv
Advanced Excel Tricks ► bit.ly/3CGCm3M
Excel Formulas Made Easy ► bit.ly/2t3netw
Creating Sports League Tables and Tournaments in Excel ► bit.ly/2viGg3J
Connect with us!
LinkedIn ► / 18737946
Instagram ► / computergaga1
Twitter ► / computergaga1
Not only is this incredibly useful, it is also wonderfully explained. Thanks man!
My pleasure. Thank you very much.
Same here... but your reaction while deleting the URLs in the table!!
I'm overwhelmed by the incredible level of clarity this tutorial displays! Kudos
Wow, thanks!
This video tutorial is now about 16 months old, but it's going to help me in pulling in the needed data from my UberEats Deliveries. I realized I needed to automate the data to help maximize my taxes, and the Power Query as a function to bring in multiple pages is really what I need. So thank you very much. I look forward to getting it implemented, so that I can link the Excel files in MS Access for further data analysis.
Excellent! Thank you.
Searched for this all over . This has been the best example I've seen and very well explained. Thank you very much
Glad it was helpful! Thank you 😊
I can't thank you enough, Computergaga! My work project just became a breeze.
You're very welcome.
You made my day, right on time when I was looking for an immediate resolution. Thanks!
This tutorial is pure Gold! Thanks! Looking forward to more videos from you. :)
Thank you very much Subanta.
You've become one of my top instructors and YouTube channels with just one single video! Thanks a million!
Thank you Isparoz.
I couldn't be more grateful to have found this channel. Dear Computergaga, you explain everything so clearly and understandably :) A real life saver!
Thank you very much 😊 That's great!
I can't thank you enough for this detailed (step-by-step) video. It's 3 years old but still valid and valuable. God bless you!
Glad it helped! Thank you 💓
You, sir, ARE A LEGEND. Searched for ages to accomplish what this allows me and came up empty. When I landed on you, I was home free. Thank YOU!!!
Great to hear, Kurt. Happy to help.
This is an excellent tutorial - clear and easy to follow, thank you! It's saved me a ton of time.
You're welcome, Matthew. Thank you.
Worked like a charm!! No tutorial has ever worked for me in 1-go!! Thanks!
Happy to help, Gaurav. Excellent!
This Video is still the best when it comes to Excel Web Scraping! For me it was a breakthrough. Many thanks!
You're very welcome! Thank you for your comments Johann.
Hello.
I had difficulty self-studying because there are not many Korean books that cover Power Query editing or parameters.
Thanks to your video, I have integrated multiple APIs into a single query.
I clicked Like and Subscribe.
Have a nice day!!
This is so easy that it seems impossible. I've tried so many solutions from the internet that were not even close to what you've demonstrated. Huge credit! Thx!
You're welcome Szymon.
This was a huge help! Needed to download over 5000 rows of data. You saved me so much time. Thank you!
Awesome to hear. Happy to help Brock.
You guys are awesome. This video is what I needed, and it completely blew my mind. I got quite confused about the URLs table and the URL column, but I handled it and applied it exactly as you did to the example that concerned me. Big THX.
Great help, Alan.
I have a query. If my data is behind a login (I have a username and password), how do I access it? The data preview lands on the login page and doesn't return any values (as expected). Is there a way to pass these credentials as part of the URL? Thanks!
Hi Alan, what would you do if one column differs for each URL page? For example, I tried to apply this video to currency exchange rates, but it couldn't work because the currency column differs for each webpage (Swedish krona, US dollar, etc.). All other columns are the same.
Thank you so much! Was exactly what I needed. I will be using this method again and again!
You're welcome. Thank you Julia.
I was desperately in need of this and your video incredibly turned up for me. Besides, your presentation is perfect... thank you.
You're very welcome 😊
Brilliant, these are great. How do you get data that doesn't import with the URL? For example, a home and away table in football. Any ideas would be great; I am stuck.
Excellent video. I used it to download 150 tabs of info on the WSJ website with all public companies listed in USA and it worked like a charm. Thanks a lot.
Awesome! Thanks for sharing.
Awesome tutorial breaking down this complex process in easy to follow steps along with a good example. Thank you sir.
This is magnificent. I just implemented it for my case and it works perfectly. One question though: what if my dataset has 72,000+ pages? I've calculated that it would take roughly 4.1 days to fetch! Any ideas? I tried using the Ajax URL to speed things up, but it took the same time.
This is incredibly helpful and useful. Really helped me solve a 2 year headache. Thank you so much. Appreciate the detailed guide.
You're very welcome!
Exactly what I was looking for and perfectly explained, thank you!
You're welcome. Thanks 😊
Great video. Exactly what I was looking for and you covered it all. Thank you so much.
No worries. Good to hear, Sharif.
Wow, excellent information through simple teaching... even a layman can understand this module... it creates more interest to learn... Great work! From a non-computer-literate man from Coimbatore, Tamil Nadu, South India... slowly learning computers.
So nice of you. Thank you Niranjan.
I never see videos with this many views that have 0 dislikes but after trying this solution, I understand why. Great video!
Thank you very much.
Excellent - ¡muy bueno! Q: In my query, I'm trying to take information from one of the columns in the table and get a STDEV of the range, but it just returns the #DIV/0! error. For example, I'm using =STDEV.S($D$2:$D$5) as my formula (entering it as an array formula as well). No luck... and in fact no other range formulas are working either... any ideas for a workaround? Many thanks!
Very helpful video! Useful technique, and explained in an easy to understand way.
Thank you, Bill.
Excellent & smart stuff, saving tons of work. Grateful for sharing !!!
Very welcome! Thank you.
Great tutorial. Exactly what I was looking for. With a little change to the URL table I was able to include TO and FROM dates and also page numbers. In my case I always know there are going to be 3 pages (most of the time). I'm not sure how to ignore the error and still load the query if one day there are only 2 pages. I'd still want it to load those 2 pages and ignore the error for page 3, like the IFERROR function in Excel, perhaps with a message box saying that page 3 was not found. Can anybody suggest anything?
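For anyone wanting IFERROR-like behaviour, one option (an assumption on my part, not something shown in the video) is M's `try ... otherwise` expression, which swaps a failed page load for an empty table so the remaining pages still load. The function shape mirrors the one built in the video; the names are illustrative:

```powerquery
// A page-fetching function that returns an empty table
// instead of failing when a page does not exist.
(url as text) as table =>
let
    Result = try Web.Page(Web.Contents(url)){0}[Data]
             otherwise #table({}, {})
in
    Result
```

As far as I know, Power Query can't show a message box on refresh, so a missing page would be skipped silently rather than reported.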
One of the best tutorials I've ever seen! Thanks a lot!
You're welcome. Thank you.
Excellent !!!! This is what I was looking for. Very well explained step by step.
Thank you, Dhananjay
Thank you! Took me way too long to find this gem
You're welcome.
Love it ! Thank you so much! With the new Web Connector, this became so much better!
Oh yes! 👍Thank you. You're welcome.
Thank you. Well explained and saved a lot of my time. This was exactly what I was looking for.
Glad it helped!
This is brilliant mate. Many thanks
Thank you - if I want to review daily in Excel, and the website URLs are dynamic and change daily (i.e. date-centric), what would you suggest to ensure the most recent pages are populated in Excel?
Love it, it's so helpful for me, thank you so much!
You're very welcome, thank you.
One of the best tutorials. Does the job in matter of minutes.
Thank you, Chirag.
WOW. It works perfectly and is exactly what I needed. Such a perfect example. You saved me hours and hours of a boring task. Many thanks.
Great to hear. Thank you 👍
Hi can only say thank you !! you're an amazing teacher. Cheers
Thank you! 😃
Thank you! Very useful and it changes the structure of my work a lot. More efficient.
Great to hear. Thank you.
Life saver.... Can't thank you enough... Been using this for months....
You're welcome, Serkan 👍
Hi Alan, happy new year, I hope you have a good one. You've been a great help so far, but on this one I'm stuck. I want to get certain pages of a paper sent to my Excel sheet, but all I'm getting is the hyperlink to the paper - can you help? Also, I have 6 sets of 4 figures that I want to check against each other and then backtrack 1,000 times. Do you think I'm being over-optimistic in trying this exercise?
Thanks so much for your excellent tutorial!!! It really helped get the job done!!! 👍👍👍
Many thanks, Mr Computergaga - awesome!
Thank you so much, Ian.
Thanks, Alan. You are awesome!
Thank you 😊 you're welcome
I am grateful for this... I had a dataset of 40,000+ rows across 82 pages and it worked perfectly. Thank you for sharing the knowledge. I'm going to try this in Power BI too and see if it works. Thanks again!
How much time does it take to fetch the data?
Fantastic video. However, is it possible to grab underlying URL links in a web table using this method? Or would that require VBA?
Great work! Many thanks, that helped much.
Thank you. You're welcome.
Hi Alan.. excellent video. Before, I was doing this one by one :( Thanks for showing these great time saving tips and tricks :) Thumbs up!
You're welcome Wayne.
This was a great video. I think the main use case for this functionality is to automate web scraping of changeable data such as stocks, commodities, and prices on goods and services. Otherwise, one could simply cut and paste stagnant data.
Thank you.
This is an excellent video. It helped me overcome a major hurdle today. Thank You!
Great to hear. Thank you, James 😊
I am also REALLY happy, as it helped me with the following use case.
I have addresses I want to scrape off the web. They are nicely in a table, but there is a hierarchy: umbrella company -> daughter companies -> branch addresses. I got the URLs from a sitemap and needed to load everything at once... and this helped me do exactly that. Hopefully it helps someone else too!
Awesome! Thank you for leaving that comment.
This video was amazing and so easy to follow. Thanks!!!
You're welcome, Sam. Thank you 😊
Thank you very much, just spent hours trying to get this to work on other sites.
No worries, Paul. Very glad to help.
Great video! Thanks for posting.
You're welcome. Thank you John.
Helped me a lot mate. Thanks a lot.
No problem 👍
great job! very helpful! Thanks
My pleasure. Thank you Marco.
Cool tutorial! Thank you. Bombastic ! I have learned how to make a function out of a procedure!
Excellent! No worries, buddy.
Thank you so very much...beautifully explained, Highly underrated video. Just Brilliant
Thank you 😊
This is gold! Thank you so much
You're so welcome!
You have saved me a huge amount of time - thank you so much!
You're very welcome.
This is gold, man, thank you. I was able to fetch currencies from the last 7 years from the XE site and they love me now. I was also able to get data from steamship lines and so on. Thank you so much, man. God bless you.
You're very welcome. Nice work Kenneth.
@@Computergaga Can you show us how to do the same thing but with PDF files? Meaning the source would be the folder where the PDFs are located.
Excellent tutorial. Thank you!
You're welcome Matt. Thank you.
A very big thumbs up for you man. You made my work easier
Awesome!! 👍
This is so amazing. I've been trying to accomplish this for months. I'm having one small hiccup though - my import is only repeating the scrape for the FIRST URL. I seem to be following the steps exactly, and I see the different URLs in the query, but the output is consistently just the first URL. Any idea why this might be off? Amazing video!
What a piece of knowledge...very useful for me..thank you very much
You are most welcome. Glad that it was useful.
You're an Excel legend! Thanks so much for the tutorial. Quick question: is there any way to add a short delay (e.g. 2 seconds) between the import of each table page?
Some of the tables I'm importing from the web are around 100 pages deep, and I don't want to put stress on their system and risk getting IP blocked. Thanks very much
Thank you. I've not used a delay before.
@@Computergaga I was wondering the same thing - is that just layman's thinking?
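A delay is possible in M via `Function.InvokeAfter`, which runs a function after a given duration. A hedged sketch, assuming a `GetPage` function like the one built in the video (the wrapper itself is illustrative):

```powerquery
// Wrap GetPage so each call waits 2 seconds before firing,
// easing the load on the source server.
(url as text) as table =>
    Function.InvokeAfter(
        () => GetPage(url),
        #duration(0, 0, 0, 2)  // days, hours, minutes, seconds
    )
```

One caveat: Power Query's evaluation order isn't guaranteed, so this spaces the calls out but isn't a hard rate limit.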
This just helped me A LOT - thanks so much!
Great! Happy to help 👍
Great Effort. Nicely presented ... Thanks a ton
Thank you, Mayilvel.
Thank you so much! You really saved me here (And taught me how to use a great and powerful tool)!
Awesome! Good to hear.
Amazing. This was masterpiece.
Thank you Aleksandrs.
This was a great video. Thanks so much for the tutorial
You are so welcome!
Fantastic!! It saved me so much time. Very well explained
Glad it helped! Thank you, Luis.
Fantabulous.......Great explanation & worked at the first go....Thanks a ton
Most welcome, Hari 👍 Thank you.
Thanks, solved a data input I was having.
Very well explained.
Thank you. Glad to be able to help.
Very helpful. Thanks Alan.
You're welcome Luciano.
THANKS FOR THIS INCREDIBLE TIP!
You're very welcome Vitor. Thank you.
Thanks, you helped me finish important part of my Business Intelligence Internship Project
Happy to help 👍
This is very helpful, thank you for sharing this knowledge!
You're welcome. Thank you.
Great video and great explanation !!! Congratulations
Thank you, Xavier.
Saved my day... thanks a lot buddy... loads of love.
Thank you Rupam.
Fantastic, this is amazing!!
Thank you Richard.
Mahalo! You solved my whole problem just a few minutes into looking into how this works.
👍
Thank you Computergaga. This was a superb explanation and tutorial.
You are welcome, Robert.
Hey, this is an incredible tutorial, very simplified and well explained. Thanks a million.
I tried the above steps, however it gives me an error from the third URL onwards. I don't know why...
Can you share this workbook? Then maybe I can add my URLs there and check if it works.
This is brilliant! I love how you go through the steps, seems like you're commenting a football match! :) Don't get me wrong it's really well explained! Thanks for that.
Thanks! 😃
This is great, thank you!
What if the URL doesn't have a unique identifier for the different pages? I'm trying to download data from more than 50 pages but every time I go to the next page and see a different spreadsheet, it has the same URL. There is no equivalent to the year numbers in your URL.
This was my first fx table. Thank you so much. Please follow this video up with a 5min time intelligence table video if you want. I'm going to watch your other videos.
You saved me several hours of manual operational work
Great! 👍
Great stuff! thanks a lot mate. made my work a lot easier :)
No problem 👍
Great tutorial! I want to build a tool to scrape the first 10 webpage URLs for different keywords. The problem is that the import does not fetch the href="" part, where the URLs of the listed webpages are. How can I get this URL information?
Thank you for this video!
You are so welcome!
I normally don't leave comments but just hit the "like" (or not) button. But this video honestly, is very well made and helped me a lot. Thank you Computergaga.
You're very welcome. Thank you for the comment Luigi.