The Definitive Guide to Buying a Computer for Data Science in 2023

  • Uploaded 8 Sep 2024

Comments • 39

  • @IAMD8REAL
    @IAMD8REAL 4 months ago +1

    This is the exact information that I was looking for. Thank you so much. 😊😊

    • @GregHogg
      @GregHogg  3 months ago

      Amazing, glad to hear it!

  • @Sosassy228
    @Sosassy228 5 months ago

    In the market for a laptop for my daughter entering college soon, and this information has helped tremendously!!! Thank you for NOT comparing anything to a MacBook. My daughter is interested in civil engineering and data science. Do you have any laptop recommendations I can look into? I was all ready to purchase the Dell XPS 14, but then I saw Just Josh's review.

  • @brentcos9370
    @brentcos9370 6 months ago +1

    The GPU really needs to support CUDA, so yes, an Nvidia GeForce RTX series card (e.g. RTX 4060, RTX 4070 ... up to the RTX 4090). 👍😎
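
    For anyone double-checking their setup, here is a minimal sketch in Python (assuming a CUDA build of PyTorch is installed, which is not something the video covers) that verifies whether a CUDA-capable Nvidia GPU is actually visible:

    # Minimal check that PyTorch can see a CUDA-capable Nvidia GPU.
    import torch

    if torch.cuda.is_available():
        # Reports the first visible device, e.g. "NVIDIA GeForce RTX 4070"
        print("CUDA available:", torch.cuda.get_device_name(0))
    else:
        print("No CUDA device found; training would fall back to the CPU.")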

  • @PimpMyYugioh
    @PimpMyYugioh 18 hours ago

    Is Wattage important?

  • @D3nz13
    @D3nz13 a year ago +4

    I don't quite agree with the statement that 16GB of RAM is fine and 32GB is overkill. For now, maybe. But considering that RAM prices are low, it's worth getting 32GB now so you don't have to worry about compatibility issues if you upgrade later (especially if you want to overclock your RAM or even just use the XMP profile). By the way, RAM frequency is also a very important parameter - why was it skipped in the video?
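
    For a sense of scale on the RAM question, here is a rough sketch (assuming pandas and NumPy; the row counts are made up purely for illustration) of how quickly in-memory data eats RAM:

    # Rough illustration of why RAM size matters for data work.
    import numpy as np
    import pandas as pd

    rows, cols = 1_000_000, 30
    df = pd.DataFrame(np.random.rand(rows, cols))  # 1M rows of float64

    # memory_usage(deep=True) reports the actual bytes held per column.
    gb = df.memory_usage(deep=True).sum() / 1024**3
    print(f"In-memory size: {gb:.2f} GB")
    # ~0.23 GB here; a 100-million-row table of the same width would be
    # ~22 GB, which is where 16GB vs 32GB starts to matter, especially
    # once joins and intermediate copies multiply that footprint.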

  • @Martinz_Place
    @Martinz_Place 2 months ago

    It's a bit funny and a bit shallow, as the M3 Pro will outperform any Intel processor on the market right now. I use both Windows and Mac: a Windows desktop and a Mac laptop. I have to say, the Mac M3 Pro is an absolute beast. For my son, who is studying Data Science and AI at university, a Mac is just the better option. macOS will always work better (I've been using Windows since DOS 3.1). So, to summarize: Mac is better for the average user or student, Windows is for gamers, and Linux is for those willing to invest a lot of time and effort into their setup.

  • @TheCsePower
    @TheCsePower a year ago +1

    1. A GPU is good to have but not necessary. A Ryzen 5 processor will give you decent performance, and costs way less.
    2. Whatever you do, don't get an HDD. The fastest HDD is slower than the slowest SSD. It doesn't matter how powerful your setup is; if you have an HDD, your computer will feel painfully slow.

    • @GregHogg
      @GregHogg  a year ago +1

      Would highly recommend a GPU. It's essentially impossible to train deep learning models without one.

    • @brentcos9370
      @brentcos9370 6 months ago

      It's absolutely necessary to do damn near anything related to ML/DL. And get an Nvidia GPU that supports CUDA!
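
      To make the CPU-vs-GPU gap concrete, here is a quick, unscientific timing sketch (assuming PyTorch; the GPU half only runs if a CUDA device is present) of the large matrix multiply at the heart of neural-network training:

      # Time a 4096x4096 matrix multiply on CPU, and on GPU if available.
      import time
      import torch

      def timed_matmul(device: str) -> float:
          a = torch.randn(4096, 4096, device=device)
          b = torch.randn(4096, 4096, device=device)
          if device == "cuda":
              torch.cuda.synchronize()  # finish setup work before timing
          start = time.perf_counter()
          _ = a @ b
          if device == "cuda":
              torch.cuda.synchronize()  # wait for the kernel to complete
          return time.perf_counter() - start

      print(f"CPU: {timed_matmul('cpu'):.3f} s")
      if torch.cuda.is_available():
          print(f"GPU: {timed_matmul('cuda'):.3f} s")  # typically 10-100x faster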

  • @_mutheumusyoka
    @_mutheumusyoka 3 months ago

    Beautiful video, thank you

    • @GregHogg
      @GregHogg  3 months ago

      You're very welcome!

    • @_mutheumusyoka
      @_mutheumusyoka 3 months ago

      Hey @GregHogg, is it advisable to go for a laptop without an Nvidia GPU? (Nvidia is pricey in Kenya.)
      If yes, which GPU would you recommend?

  • @juanolano2818
    @juanolano2818 a year ago

    Maybe I missed a discussion of GPU memory vs. what we are doing. For example, to train or even to do inference with a 7B-parameter model, a GPU with 24GB is desirable; maybe a 16GB GPU can work too. But if we are targeting models with 13B or more parameters, then we may want to understand the GPU requirements to work with this type of model locally. Perhaps 40GB? A 4090 GPU?

    • @GregHogg
      @GregHogg  a year ago

      Agreed, VRAM is an important discussion. It was more technical than I wanted to get into. For sure, higher memory is essential for training big models on big datasets.
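
      As a rough check on the numbers above, here is a back-of-the-envelope sketch in plain Python (the figures cover weights only and ignore activations, KV cache and optimizer state, so real training needs several times more):

      # Approximate VRAM needed just to hold a model's weights.
      def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
          """Weights only, in GB. fp16/bf16 = 2 bytes per parameter, fp32 = 4."""
          return params_billion * 1e9 * bytes_per_param / 1e9

      for size_b in (7, 13, 70):
          print(f"{size_b}B params, fp16 weights: ~{weight_memory_gb(size_b):.0f} GB")
      # 7B  -> ~14 GB: tight on a 16 GB card, comfortable on 24 GB for inference.
      # 13B -> ~26 GB: beyond a single 24 GB RTX 4090 without quantization.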

  • @Deandre-gq6mg
    @Deandre-gq6mg a month ago

    Is Intel the only option for data science, or would a Ryzen work, like the 7800X3D?

  • @benfarrar8426
    @benfarrar8426 a year ago

    Awesome vid! I start a degree apprenticeship in data science in September. Any advice?

    • @GregHogg
      @GregHogg  a year ago +1

      Try as hard as you can and I wish you the best of luck! Thank you for the support 😊

  • @allbymyself4927
    @allbymyself4927 7 months ago

    Hi, I wish anyone could help me quickly. What do you mean by "lower than 11th Gen and they are scamming you"? I am looking at an 8th Gen Core i7. PLEASE HELP!!!!

  • @heyniches8238
    @heyniches8238 9 months ago

    Hey, is an M2 Mac good for data science?
    And please tell me whether SQL Server and Power BI will work on a Mac or not.

    • @GregHogg
      @GregHogg  9 months ago

      Yeah, Macs are finally good for DS

    • @diazjubairy1729
      @diazjubairy1729 8 months ago

      To be able to run Power BI and SQL Server smoothly you need to run them from a VM like Parallels, which is not free.

    • @OffersAlone
      @OffersAlone 3 months ago

      Macs are not good for data science. Period. I returned my MBP within a month, a few months ago.

  • @slliks4
    @slliks4 6 months ago

    An i3-12100F smokes an i5-10400 🐥
    The generation matters more than the i3/i5/i7 tier

  • @SoyeBoy
    @SoyeBoy a year ago +2

    Wow, I really needed this video yesterday, as I bought a desktop for Data Science 1 hour before this video was posted! haha
    Luckily, my research led me down much the same path, EXCEPT I didn't include a GPU just yet, as I don't know how big my deep learning models will be, and if required I can always add one to the desktop later (another advantage of a desktop over a laptop).
    As a point of reference, I live in Ireland and picked up a Dell Inspiron 3020 desktop that meets all of your specifications. I got it for just under €900 without the GPU, but it does have the i7 processor, 16GB of RAM and a 1TB SSD. The GPU would have added over €350 for the NVIDIA 3050 and over €450 for the NVIDIA 3060.
    Can you comment on my decision not to include a GPU? For example, would a CPU be sufficient for machine learning models only (so no neural nets)? And would it make sense to offload some of the computation to GPUs in the cloud instead? (A CPU-only sketch follows this thread.)

    • @GregHogg
      @GregHogg  a year ago +1

      Loool sorry I couldn't have done it sooner! Glad you found something:)
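
      On the CPU-only question above: classical (non-neural-network) models do train comfortably without a GPU. A minimal sketch, assuming scikit-learn and a synthetic dataset chosen purely for illustration:

      # Train a random forest on CPU only; no GPU is involved at any point.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=50_000, n_features=30, random_state=0)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      model = RandomForestClassifier(n_estimators=200, n_jobs=-1)  # all CPU cores
      model.fit(X_train, y_train)
      print("Test accuracy:", model.score(X_test, y_test))

      For deep learning specifically, renting a cloud GPU (Colab, Kaggle or a cloud VM) while keeping a CPU-only desktop is a common and reasonable split.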

  • @stevesutton772
    @stevesutton772 7 months ago +11

    Oh come on dude, HDD vs SSD in 2023? Get to the point.

  • @GregHogg
    @GregHogg  a year ago

    Take my courses at mlnow.ai/!

  • @PayneMaximus
    @PayneMaximus a year ago +1

    Macs are overpriced gadgets that you can only lease. Do not use them at all, let alone for DS.

    • @GregHogg
      @GregHogg  a year ago +1

      This is a great comment (because I'm a Windows fanboy)

    • @PayneMaximus
      @PayneMaximus a year ago

      @GregHogg I'm glad you recognize wisdom! Still, there are too many fanboys out there who don't realize they are being used by the company...

    • @GregHogg
      @GregHogg  a year ago +1

      @PayneMaximus Let's be honest, I'm a Windows fanboy lol

    • @PayneMaximus
      @PayneMaximus a year ago +1

      @GregHogg I've used Windows for most of my life, ever since my second incarnation (before that I used DOS for quite some time), so I like it a lot, but I wouldn't call myself a Windows fanboy. I've gotten into Linux quite a bit for DS, and I must say modern distros are easy to use, unlike in the old days.

    • @daegrun
      @daegrun 11 months ago

      Idk. I still have my mid-2012 Apple laptop. Sadly the latest macOS it supports is Catalina, so no longevity, but Boot Camp with Windows 10 is still solid for now.
      But I think that Apple laptop was the last of its kind for do-it-yourself upgrades/replacements, which I did to the max after some years passed.
      😅 I'm nervous paying so much for a Windows laptop just for the minimum this laptop has already had since 2012. :/ I hope the trackpads are better these days.

  • @xerxer9251
    @xerxer9251 11 months ago

    Is Nvidia just better for this?

    • @GregHogg
      @GregHogg  11 months ago +1

      For deep learning yeah pretty much

    • @brentcos9370
      @brentcos9370 6 months ago

      @GregHogg 100% agree. CUDA!