Apple's Silicon Magic Is Over!

  • Added May 3, 2024
  • Apple silicon and the M1 took the computing world by storm. But others are catching up, and Apple is hitting the limits of physics. It's time to focus on what Apple does best: hardware.
    Follow me on Threads - threads.net/snazzyq
    Follow me on Instagram - / snazzyq
    Video Summary (this is just for SEO lol): Discover how Apple's M1 revolutionized MacBook performance, transforming from problematic designs to industry-leading innovations. Dive deep into the M3, comparing its capabilities and drawbacks against the celebrated M1 and slightly improved M2. Explore how Apple silicon, such as the M1 and M3, fares against upcoming threats like Qualcomm's Snapdragon X Elite. Get insights on the future of MacBook, Apple silicon, and the potential shifts with M4 in the fiercely competitive tech landscape.
    0:00 Things used to be HOT (in a bad way)
    0:39 The M1 changed everything
    1:40 ...but changes need to happen again!
    2:12 Why has M1/M2/M3 been such a success?
    3:28 Let's talk transistor density.
    5:27 M2 is hotter than M1-but it's complicated
    6:38 Sand, man. Sand is the problem!
    8:35 Stop being a nerd. Silicon ain't the problem.
    9:18 Competition is coming. Like, now.
    11:03 What is this "competition?"
    12:00 Apple will still dominate high-end Arm, for now
    13:29 The M3 Max is HOT and throttles HARD
    14:54 Apple Silicon CAN enable BETTER hardware
    15:56 Can't Innovate Anymore My @ss
    17:02 It's time to be bold again
  • Science & Technology

Comments • 2.8K

  • @katieadams5860
    @katieadams5860 13 days ago +4484

    For me, the biggest problem with Apple Silicon Macs is the complete lack of upgradability, and also the criminal pricing of RAM and SSD upgrades

    • @barkingsheltie
      @barkingsheltie 13 days ago +135

      If I'm not mistaken, the video may have suggested Qualcomm's ASIC may allow for user upgradeable storage.

    • @snazzy
      @snazzy 13 days ago +1199

      They are gonna have to change that eventually. They’ve had the same upgrade pricing tiers since 2015, which is insane. Their base models are such a great deal, but the min spec is so lousy that almost everyone should upgrade, and by the time you’ve doubled the RAM and storage you’re at basically double the price. Sucks lol

    • @SkepticalRaptor
      @SkepticalRaptor 13 days ago +122

      A boring trope. No one cares, except a handful of nerds that think they’re smarter than everyone else. Zzzzzzzz

    • @GlobalWave1
      @GlobalWave1 13 days ago +82

      With these new chips coming out from Qualcomm, I think it will be time for Apple to reconsider their pricing to compete.

    • @OanKnight
      @OanKnight 13 days ago +160

      @@snazzy I'm honestly surprised that the EU hasn't run riot on that yet. The change is inevitable, as they're going in hard on right to repair, to the point that it's a matter of time before the right people notice the problems with the lack of upgradeability.

  • @tayjn
    @tayjn 13 days ago +1446

    The 2021 MacBook Pros are aging beautifully. I have no desire to upgrade, or any feeling of FOMO at all

    • @thekeepr
      @thekeepr 13 days ago +38

      His point lol

    • @snazzy
      @snazzy 13 days ago +454

      Yep. I’m still using an M1 Pro MBP and M1 iMac as my daily drivers. Zero desire to upgrade.

    • @joh2434
      @joh2434 13 days ago +23

      M1 Mini and M1 Air here, and still going strong. The Air will need a new battery soon, but otherwise it's good

    • @PaulLembo
      @PaulLembo 13 days ago +23

      @@thekeepr I would say not really. The point was to work in a Qualcomm ad, they paid him to visit after all, and then make a “call” for bold form factors, which was largely pointless, thus the clickbait splash page.
      I think the Vision Pro is that form factor, which Quinn knows.
      The really slim Mac is effectively an iPad with a key case. The issue there is software, not hardware, which Quinn knows.
      This nets out to a Qualcomm ad and some M chip data that is sorta news and sorta not, depending on whether you follow the channel.
      It’s useful for irregular viewers of Snazzy but meh for the regulars. Wish Quinn well of course

    • @barkingsheltie
      @barkingsheltie 13 days ago +8

      I still have a 2019 Intel laptop, but do have an M1 Ultra. After watching the video, I think I would be well served by the 2021 MacBook Pro for travel and couch surfing too.

  • @IraQNid
    @IraQNid 11 days ago +78

    I'm sticking with x86 via Ryzen 9 computers: 16 cores/32 threads, support for 128 GB of DDR5 RAM, and all the hard drives I could ever want.

    • @mikekelly6331
      @mikekelly6331 6 days ago +16

      For a desktop, I agree. But put that in a laptop and you'd have to carry around a car battery to get decent battery life.
      I'd love the Arm option for mobile

    • @MJSGamingSanctuary
      @MJSGamingSanctuary 2 days ago +5

      @@mikekelly6331 Not actually true; a lot of the newer Ryzen AM4 CPUs actually draw less power than their Intel equivalents. They just run a little hot, most of the time.

    • @marcuskissinger3842
      @marcuskissinger3842 2 days ago +2

      @@MJSGamingSanctuary Cool story bro, but nobody even mentioned Intel

    • @cheekoandtheman
      @cheekoandtheman 2 days ago +1

      As a lifelong PC guy, I recently started long-term camping in a beautiful rainforest, and Apple is the only way to go. I’ve wasted too much money on badly built PC laptops, and to get a PC laptop of comparable build quality I need to pay more than Apple charges and then get less performance

    • @Lauren_C
      @Lauren_C 2 days ago

      @@cheekoandtheman I play a fair bit of games on my laptop, so while I’d certainly kill for the efficiency of an Apple Silicon Mac, my use case rules them out wholly.

  • @Daekar3
    @Daekar3 11 days ago +240

    As someone who has built their own desktop PCs since before a lot of YouTube viewers were born, my biggest anxiety about this transition is whether or not I'll be able to keep building and customizing my own hardware. I don't particularly mind the idea of SoCs, but I really value the modularity of the existing x86 platform. It has saved me a boatload of money over the years and lets me get exactly what I care about without paying for what I don't. The thought of being stuck with only a laptop, with all the horrible tradeoffs and e-waste that come with that, is just appalling.

    • @ethernet01
      @ethernet01 10 days ago +9

      Hoping for the same socketed systems we're used to from Intel/AMD to come to Arm: just motherboards with a new chipset, or maybe no chipset in the case of an SoC.
      They would have great iGPUs and maybe free included RAM, while keeping PCIe and DDR5 the same, or a new standardized DDR DIMM for Arm systems.
      Apple themselves have proved that PCIe is possible at all, although no current GPUs work with it.
      Making a new socket? It's nothing new.
      Storage is too easy to implement, and it's another opportunity for CPU makers to offer some small on-board boot drive.
      That leaves modular GPU and RAM as the big main questions

    • @crabapple1974
      @crabapple1974 9 days ago +2

      I agree with you, but I have my PC at home for games, 3D printing, hobby programming, but also professional heavy programs. For professional stuff on the go my MacBook M1 has been awesome. It just works, syncs easily with my phone, etc. It is way more reliable than any PC laptop (and I have had many). Both systems have their strengths and weaknesses, both in software and hardware. PC definitely needs to modernize itself on many fronts, but I hope it stays open and adaptable.

    • @MarcTelesha
      @MarcTelesha 9 days ago +5

      The GPU is the hero for PC desktops. The future is more and more GPU-focused for everything creator- and AI-oriented, and having an SoC plus a GPU will be super powerful.
      Old Amiga guy here. Amigas were great because they had multiple chips. If we can figure out the internal I/O of that in a motherboard layout, we will all have custom supercomputers in our future.

    • @crabapple1974
      @crabapple1974 9 days ago

      @@MarcTelesha The future will be interposers with logic built into them. They can be designed with topologies that enable far better communication between chips than we have currently. How the overall architecture will look is hard to forecast. Will it be similar to the newest Nvidia AI GPUs, with CPU and GPU on the same interposer with shared memory? If the performance advantage is high enough, that will make upgradeability of the individual components worth sacrificing.

    • @veljko100able
      @veljko100able 9 days ago +1

      Unfortunately, in the future everything will move toward even larger integration, so everything will need to be as close as possible, on the same die.

  • @steveftoth
    @steveftoth 13 days ago +558

    It took Apple more than a decade to produce the M1, and people can’t expect a magic reveal every year, let alone even every 5 years. Tech moves much slower than the marketing.

    • @bombombalu
      @bombombalu 12 days ago +19

      I just wish they had the courage to try different form factors.

    • @petarprokopenko645
      @petarprokopenko645 12 days ago +3

      @@bombombalu like what? What would be a good idea?

    • @bombombalu
      @bombombalu 12 days ago +20

      @@petarprokopenko645 I would immediately buy a 12-inch MacBook made out of matte plastic with more rounded edges. Even more so if it was a convertible with pencil support.
      I'm genuinely tired of aluminum. It's cold and hard and not comfortable at all. I loved my 2006 MacBook...

    • @fixups6536
      @fixups6536 12 days ago

      @@petarprokopenko645 An M1, M2, or even M3 12-inch MacBook would be awesome. I would purchase it on the spot.

    • @JohnSmith-ro8hk
      @JohnSmith-ro8hk 12 days ago +11

      It took Jim Keller to make the M1 magic; he's long gone and his legacy is now over.

  • @makatron
    @makatron 11 days ago +251

    As soon as Apple began soldering RAM and storage to the boards, they lost me as a customer. RAM prices are borderline criminal.

    • @MelroyvandenBerg
      @MelroyvandenBerg 9 days ago +10

      Agreed

    • @Useruseruser32121
      @Useruseruser32121 9 days ago +5

      Nowadays even gaming laptops have soldered-in RAM.

    • @veljko100able
      @veljko100able 9 days ago +26

      Apple's RAM is not soldered to the board. That's the big difference, and everything in the industry is moving that way: the RAM is mounted in the same package as the CPU, GPU, and other modules. And believe me, in the future everything will be that way. That sort of tight integration enables huge performance and power savings, something that is absolutely needed...

    • @DownUnder43
      @DownUnder43 8 days ago +7

      ​@@Useruseruser32121 Don't know what you have been buying, but most of the good ones don't have soldered RAM, unless you're clueless about what to buy.

    • @DownUnder43
      @DownUnder43 8 days ago +5

      ​@@veljko100able It is not upgradable, period 🤷🏻

  • @Sulphur_67
    @Sulphur_67 11 days ago +57

    Not only are the SSDs unupgradeable, they're also unreplaceable. If the SSD chips die, the Mac is a brick. I hate that.

    • @albundy3929
      @albundy3929 7 days ago

      link?

    • @miketkong2
      @miketkong2 6 days ago +2

      Have you ever had a Mac with an SSD that died?

    • @lukasmauer230
      @lukasmauer230 6 days ago +3

      ​@@miketkong2 Dude 😆, it can die easily from swapping.

    • @miketkong2
      @miketkong2 6 days ago +3

      @@lukasmauer230 What’s swapping? It’s an internal drive.

    • @Anthony-kp7sf
      @Anthony-kp7sf 6 days ago +2

      @@lukasmauer230 No it can't, lmfao. The swapping thing is a meme. It's virtually unheard of for modern SSDs to fail due to swap; it almost never happens, only under experimental stress testing. It's not a thing anybody should ever worry about with a laptop lol.

  • @delarageaz
    @delarageaz 9 days ago +8

    I just love that the original M1 chip had a media engine 1.5x faster than the dedicated Afterburner card released a year earlier, which Apple was still selling for $2,000 at the time.

  • @krasserTerror
    @krasserTerror 13 days ago +124

    One mistake: at the beginning of the PowerPC era, Mac owners could be braggadocious for the first time. They were faster and more efficient than Intel's chips. That advantage faded away later, until the switch to Intel was a big step up.

    • @morphoist
      @morphoist 13 days ago +9

      This... but to be honest, even a 68k was zippy if you tweaked the OS right. That's why I moved away from Mac: even though Classic OS was at times buggy, if you knew how to arrange your extensions, it was fast and just did its job. None of this bloated OS with a kiddy interface that dumbs it down.

    • @joelv4495
      @joelv4495 12 days ago +16

      It's pretty ironic actually. ARM is the spiritual successor to the RISC arch in the PowerPCs.

    • @computernerd8157
      @computernerd8157 12 days ago +16

      They were never faster than Intel; that was a marketing lie. I loved the pre-Intel Macs, the pre-System 10 days. Apple will eventually fix whatever problems they encounter, but some things are by design. Apple computers were never built to be customizable since the Apple II lost to the Mac. If Woz had had his way, Apple computers would have been more like custom PCs are today.

    • @themadoneplays7842
      @themadoneplays7842 12 days ago

      @@joelv4495 Yes and no, as ARM kind of predated PowerPC, thanks to Acorn's work on the BBC Micro and all that.

    • @krasserTerror
      @krasserTerror 12 days ago +3

      PowerPC was more efficient (in the beginning). I remember a test where the notebooks were a lot faster than Intel notebooks. I don't know about desktops.

  • @applesushi
    @applesushi 13 days ago +659

    I liked the Quinn beardvolution shot.

    • @eafortson
      @eafortson 13 days ago +4

      Came here to say exactly this ✊🏽

    • @gregmach8230
      @gregmach8230 12 days ago +3

      The biggest issue is that it's not a PC.
      Its greatest achievement is that it's not a Chromebook.
      The Acorn PC used RISC chips... in the 90s. RISC chips are great for using half the power and taking 3 times as long to do the same task.

    • @ICanDoThatToo2
      @ICanDoThatToo2 12 days ago +1

      2:00 clicky clicky

    • @LoFiAxolotl
      @LoFiAxolotl 11 days ago +1

      slow decline into becoming a hermit

    • @nursultannazarov8379
      @nursultannazarov8379 11 days ago

      He wanted simpleton brains to comment about it. You got manipulated.

  • @ROberrto522
    @ROberrto522 10 days ago +9

    What Apple does best? Hardware? Seriously? Louis Rossmann would like a word. What Apple does best is user experience, essentially: their design process starts with things people need to do, then they build the hardware and software around that. That's why people love their Apple gear so much.

    • @joshuamusser8893
      @joshuamusser8893 19 hours ago

      Comparatively speaking it’s true though. Maybe not what Apple does best, but it’s one thing Apple typically does better than the competition.

  • @JoeStuffzAlt
    @JoeStuffzAlt 12 days ago +28

    What you said is what AMD realized: "Wait... if we manufacture this part on the new node, but put the parts that won't get as much of a performance boost on the older node, we can save money!"

    • @Alice_Fumo
      @Alice_Fumo 8 days ago +4

      It's not just saving money; shrinking certain parts (like I/O) just didn't scale to the new nodes...
      With chiplets, they have very small dies, and thus the chances of major defects in them are much lower, resulting in way better yield, or feasibility to make them to begin with. Meanwhile the huge-ass I/O die gets made on a node which is mature enough to reliably make a die of that size.
      Well, I guess it is saving money in the end after all, by making the production yield of their chips not prohibitively bad.
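      The small-die-better-yield intuition above can be sketched with the classic Poisson yield model, where yield falls off exponentially with die area. (The defect density and die areas below are made-up illustrative numbers, not figures from the video or comment.)

      ```python
      import math

      def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
          """Classic Poisson yield model: Y = exp(-D * A)."""
          return math.exp(-defects_per_mm2 * die_area_mm2)

      D = 0.001  # assumed defect density, defects per mm^2

      # One big 600 mm^2 monolithic die vs. a small 75 mm^2 chiplet
      big = poisson_yield(600, D)    # exp(-0.6)  ≈ 0.549
      small = poisson_yield(75, D)   # exp(-0.075) ≈ 0.928

      print(f"600 mm^2 die yield: {big:.1%}")
      print(f"75 mm^2 chiplet yield: {small:.1%}")
      ```

      Under the same defect density, splitting a large die into chiplets raises the fraction of defect-free dies dramatically, which is the yield economics the comment describes.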

    • @yarost12
      @yarost12 5 days ago

      Yeah and then those chips alone idle at ~20W...

    • @JoeStuffzAlt
      @JoeStuffzAlt 5 days ago +1

      @@yarost12 There's Ryzen NUCs that idle under 10W.

  • @tommyv010
    @tommyv010 13 days ago +275

    OK, I understand if you don’t offer upgradable RAM, a lot of other brands are doing the same, but no upgradable storage? Insane.

    • @andyH_England
      @andyH_England 13 days ago +13

      With Thunderbolt, superfast external storage with a small portable drive is easy. I have bought base models for many years and used external drives. Even when my 8-year-old iMac's HD died many years ago, I ran the OS off an external FireWire drive, which ran faster than the old mechanical HD. This is really not a problem in 2024.

    • @HVDynamo
      @HVDynamo 13 days ago +48

      @@andyH_England I really want the ability to upgrade internal storage though. The 14" and 16" Pro should absolutely have an extra M.2 slot that I can drop an SSD into later. Technically they should have upgradable RAM too, but with the way it's integrated into the CPU it does provide some benefit there, so it's a trade-off. But my general take is that the Pro versions should have upgradable RAM/storage, and the Air can be soldered to allow a sleeker design (like the 12" MacBook). It really astounds me that they didn't re-release the super-thin MacBook with Apple Silicon as the new MacBook Air, and then the laptops they call the Air now should just be a MacBook. It's like whoever is in charge of naming these products is asleep at the wheel.

    • @qwertzy121212
      @qwertzy121212 13 days ago +56

      @@andyH_England external???? gross!

    • @charlespeters1480
      @charlespeters1480 13 days ago +46

      @@andyH_England Maybe for you, but a lot of us don't want portable drives hanging off our small, sleek, highly portable, and light laptops. In 2024, having only 256 GB (the base configuration of a MacBook Air) feels a bit cramped to me, especially if you want to edit large 4K video files.

    • @arihantplaying
      @arihantplaying 13 days ago +19

      I mean, an external SSD is OK, but
      not giving an M.2 slot for the SSD is criminal. If I have an M.2 SSD from my previous laptop or want to upgrade storage, there should be one extra free M.2 slot in every laptop.
      (And you can save the planet too, if you care)

  • @PhilfreezeCH
    @PhilfreezeCH 13 days ago +259

    3:10 There is a fourth point that's also important:
    Apple has continuously operated on a node/technology advantage. Their lower volume, high average selling price, and good relations with TSMC made it possible to always be ahead of AMD or Intel in that regard.
    This is basically only possible because they sell the complete laptop and can use the higher profit margins to actually buy significant volume from TSMC, while still buying less silicon than AMD or Intel do.

    • @robertp457
      @robertp457 13 days ago +29

      They use their higher profit margins to fill their bank account. If Apple made slightly less money they could sell a lot more computers, but just like Harley-Davidson, Apple would rather sell fewer products at much higher profit margins than put an Apple computer into everyone’s house.

    • @sihamhamda47
      @sihamhamda47 13 days ago +16

      They have even booked all the early batches of TSMC's 2nm silicon already, so other brands need to wait a bit longer

    • @sushimshah2896
      @sushimshah2896 13 days ago +8

      Also, I think it was a previous video of his (or some other creator's) which looked into the actual transistor count (mostly because of Apple's node advantage, as you pointed out) versus performance compared with AMD/Intel chips, and of course it's not all rosy after all!

    • @slamhk1648
      @slamhk1648 12 days ago

      @@sihamhamda47 That’s the thing about being a key funder: you provide all the funds for TSMC to set up and go forward with a new node. It’s because of big customers like Apple that the industry is moving forward to these new nodes. Also, going by the reports, many have opted not to design their chips for TSMC N3B, as there are very high costs associated with the node.

    • @sflxn
      @sflxn 12 days ago +6

      You’re so misinformed on the volume. They are the largest user of silicon in the world and also TSMC’s highest-volume customer. There are a few little-known products called the iPhone, iPad, AirPods, and watches. At one point, Apple accounted for 1/3 of all silicon used in the world in a given year.

  • @hctiBelttiL
    @hctiBelttiL 11 days ago +32

    You're discounting the x86 architecture too much. AMD's 0504 custom APU, made for the Steam Deck, is an x86 chip which can run most modern PC games with a 15W TDP. In desktop mode it draws only about 8W in typical usage scenarios (disclaimer: I was unable to isolate the APU power draw in desktop mode, so this figure represents total power draw from the battery, with the screen off).
    Personally, I'd be hyped to see powerful and energy-efficient chips like this one in laptops. It has all the compatibility benefits of the x86 architecture and it's pretty efficient. AMD does offer slightly more powerful chips for laptops, but they have roughly double the power draw, which makes them less desirable for people who are on the go a lot.

    • @darekmistrz4364
      @darekmistrz4364 9 days ago +5

      And the Steam Deck's APU is pretty old now. There are newer APU designs that are pretty astounding

    • @matejjustus4573
      @matejjustus4573 8 days ago +2

      I think the reason is that the Steam Deck is 720p, if I'm not wrong. We already have processors like that in laptops, but because of things like higher resolutions, and Windows instead of an optimized Linux distro, they seem less powerful/impressive.

    • @hctiBelttiL
      @hctiBelttiL 8 days ago +3

      @@matejjustus4573 yup, I agree. But what I'm interested in is a lightweight Linux laptop that can run for many hours on battery, for hassle-free dev work on the go. It just needs to run an IDE, a web browser and, in a pinch, compile some basic modules for testing. If I want to do anything fancy I can delegate the workload to my PC at home.
      And yes, I very much prefer an x86 machine to natively test my x86 binaries on.

    • @matejjustus4573
      @matejjustus4573 8 days ago

      @@hctiBelttiL I have an old ThinkPad T480 for my studies, YouTube, and programming (CLion, PyCharm, GitLab) and it lasts without problems.
      Right now I have 83% and 46% battery (two batteries) and it reports ~7:30h remaining on the balanced power setting.

  • @michaljurkovic
    @michaljurkovic 12 days ago +2

    The M3 is built on N3, not N3E. N3E is coming in products later this year; it has slightly relaxed specs and is cheaper than N3.
    Design of a chip takes 3-4 years (see M1, see Meteor Lake, etc.), so if Apple only has a small improvement ready, then they have probably lost their lead and momentum.
    Scaling will start again with 2nm-class nodes (N2, 20A, 18A, and toward 14A) thanks to several advances: GAA FETs, backside power delivery, and later forksheet transistors.

  • @colemanbecker1392
    @colemanbecker1392 13 days ago +118

    "Please clap" 😂😂😂

    • @Carter1214
      @Carter1214 10 days ago +1

      I laughed out loud!

    • @lahmyaj
      @lahmyaj 10 days ago +2

      Poor ol’ low energy Jeb lol

    • @jeffsydor9430
      @jeffsydor9430 10 days ago +2

      I met Jeb once after that comment. He's so ashamed of it because it was never meant to sound so desperate. But at least he can laugh at himself about it. Really nice guy, but not my type of politician.

    • @TheUnderMasked
      @TheUnderMasked 10 days ago

      👏 👏 👏

  • @jadenamber8378
    @jadenamber8378 13 days ago +169

    No company can sit at the top forever. But consider that the Snapdragon chip wouldn't be readying for release and mass adoption without Apple's M1. Now the pressure from Qualcomm and MS will keep price pressure on Apple. Competition is good.

    • @JoeStuffzAlt
      @JoeStuffzAlt 12 days ago +7

      I'm pro x64, but I do like that there's a shake-up. I hope Apple silicon gets Intel and AMD to start taming their TDPs (how many watts they burn)

    • @slipoch6635
      @slipoch6635 12 days ago +4

      @@JoeStuffzAlt AMD is pretty good for processing per watt atm

    • @Aashishkebab
      @Aashishkebab 11 days ago +2

      ​@@JoeStuffzAlt Do you mean x86?

    • @LoFiAxolotl
      @LoFiAxolotl 11 days ago +6

      @@JoeStuffzAlt How can you be pro x86? Even Intel and AMD would love to switch away from it... there's no good argument for x86 other than that basically every piece of software out there is written for it... adopting any other architecture would require either really good emulation/translation, or every program getting rewritten for that architecture... the industry decided to go with emulation/translation, and if at some point it's good enough to work for corporate clients there will be mass adoption of it... what the private end user runs is irrelevant and depends entirely on what the manufacturers provide

    • @hobosnake1
      @hobosnake1 11 days ago +3

      I agree. It's always great when competitors leapfrog each other. This new chip wouldn't exist without Apple, and the next amazing innovation by Apple will be spurred on by this chip.

  • @ClaudioAguileraMunoz
    @ClaudioAguileraMunoz 1 day ago +1

    I have a 14-inch MacBook Pro M1 and it has aged beautifully. It still runs amazingly well, everything is smooth, everything is flawless, and the machine is silent af.

  • @thudang3039
    @thudang3039 10 days ago

    Thank you as always! Appreciate your work. I'm still on a 2020 M1 Air, and having been a "computer" person for years - this thing just won't quit! It just keeps doing the job, never lagged/throttled. It's ridiculous how good it is. 4+ years later, and I'm still waiting for any of the competition to give me something to make me feel like switching back. I can see this M1 lasting for years to come for what we do with it, and its battery life is still fine, more than a single day's casual use.

  • @fVNzO
    @fVNzO 13 days ago +428

    I am continually amazed at how quickly these "long" videos fly by. You are an elite presenter. Dare I say the best tech product communicator on YouTube.

    • @snazzy
      @snazzy 13 days ago +74

      Thanks so much for your kind words!

    • @pingozingo
      @pingozingo 13 days ago +9

      @@snazzy Went by in a blaze 😉

    • @snazzy
      @snazzy 13 days ago +17

      ☘️

    • @marvinjohn
      @marvinjohn 13 days ago +1

      Truly, dude, you're funny and entertaining, you have a superb outlook on technology, and you pick interesting topics too. You should have way more subscribers! @@snazzy

    • @rawdez_
      @rawdez_ 13 days ago

      @@snazzy You totally missed the point of WHY it isn't much faster. The reason is milking the tech the corporations already have. Improving too fast drops the prices of older, cheaper-to-produce products too fast, so corporations don't like fast progress, which they deliver only when forced by dropping sales. Now, with what they have, they only compete with themselves; making faster chips will kill M1/M2/M3 product sales.
      That's the actual reason why your napkin math didn't work; everything else is just marketing fairytales from corporations to justify milking the market.

  • @carylittleford8980
    @carylittleford8980 13 days ago +40

    It took me ages to stumble on why Apple M-series machines had such limited external screen support, hugely reducing the previous number: they are using the iPhone display controller instead of a high-end laptop one. As a projection artist, multiple outputs (3 are really the minimum) are what I need for the installations I do, and that's evidently common on other hardware. My 5-year-old midrange AMD-based HP supports 5 external outputs (internal touch screen still on, so 6 screens in total), so for arts work it's a dream. It would be great if Apple worked out this design limitation.

    • @fidelisitor8953
      @fidelisitor8953 11 days ago +11

      Makes sense, as the M-series chips are just a scaled-up version of the A-series iPhone chips. They just threw in more cores, rebranded them as the M series, and called it a day.

    • @jerryw1608
      @jerryw1608 10 days ago +4

      I use a DisplayLink dock to drive triple 4K displays with a MacBook Air M2 and I'm pretty satisfied with the results.

    • @darekmistrz4364
      @darekmistrz4364 9 days ago +1

      @@jerryw1608 I've been using the built-in display since 2017, but I'm a weird guy who values portability and being able to work any place on earth.

    • @carylittleford8980
      @carylittleford8980 4 days ago +1

      @@jerryw1608 That's interesting. Is it the base M2? Apple's site says just one external screen on the base M2 or M3, and you need the highest-end M2 chip to do more than 2. There is an update that lets you run 2 external screens on some MacBook Airs, but only if you shut the internal screen off, which is quite a 'unique' solution. The only Apple dealer store in my area also tells me it's just one screen on base models and you have to have bought a higher-end M series to get over that limitation.

    • @noostroi
      @noostroi 3 days ago

      @@carylittleford8980 DisplayLink is analogous to a 'software defined graphics card'. I've got a M2 Pro MBP, and a few USB C hubs that have display port connectors on them, but have only ever been able to drive 'up to 2' external displays of any resolution at the same time natively. I've got 2x 1080p monitors, and a samsung g9 dqhd screen (5120x1440), but only 2 screens will ever work at the same time, no matter which display connector they're connected to, but if I put both 1080p screens on a dual displayport DisplayLink adapter, and plug that into a USB A socket on one of my dongles, then I can get all 3 monitors (plus the laptop's own screen, so 4 in total) working. This does need the 'DisplayLink Manager' app running to get the DisplayLink adapter to function.

  • @yusuf9356
    @yusuf9356 11 days ago +1

    Hello, do you still use LTO tapes for storage? If so, are you satisfied?

  • @azenyr
    @azenyr 10 days ago +16

    The problem is that the M1 finally made Intel "wake up" and resume investing R&D in the x86 architecture, and AMD followed suit. This brought innovation back into x86, which had basically been stuck for years, and that innovation quickly outran anything Apple was trying to do with Apple Silicon. The new Intel Core Ultra CPUs and Ryzen now get more performance and better efficiency from x86 than Apple Silicon does with ARM64. And x86 is still a better architecture for a lot of things. Now that Intel and AMD are back in the fighting ring, x86 will just continue to get better at the higher-end performance points

    • @thegeekno72
      @thegeekno72 9 days ago +4

      Apple's R&D and launch cycle for the M1 happened during the same time span in which Intel and AMD did their own R&D. AMD had already planned a new socket and the Zen 4 architecture, and Intel was catching up to AMD's chiplet lead with its efficiency cores. Blue and Red didn't wake up following Moneybags' M1; they just happened to parallel it

    • @sprockkets
      @sprockkets 8 days ago +2

      Well, AMD being able to do anything is special in and of itself. I'd still rather have AMD in a Framework laptop, but I'll take a Snapdragon X Elite in a Framework if they make one and it's easy to run my Linux stuff on it.
      Meanwhile, my Samsung Galaxy S24 with DeX works great too.

    • @kjeldkaj
      @kjeldkaj 8 days ago +1

      I agree. Let Apple keep their ARM, and let some Windows laptops have it too, because why not. But that does not mean that x86 is obsolete and shouldn’t be improved further.

  • @eduardofusion
    @eduardofusion Před 13 dny +391

    So... When can we expect a Snapdragon hackintosh ?

    • @asinglefrenchfry
      @asinglefrenchfry Před 13 dny +43

I don't think that's possible, unfortunately. ARM chips work differently in that the OS needs to be coded specifically for the chip, or something along those lines

    • @lucasrem
      @lucasrem Před 13 dny +7

Run the Virtualization Framework in a VM on Snapdragon, why not?

    • @livemadseason
      @livemadseason Před 13 dny +33

@@lucasrem I do not think it would even be possible, because of Apple's tight software/hardware binding. Consider why no one has managed to run iOS on an Android phone.

    • @1DwtEaUn
      @1DwtEaUn Před 13 dny +7

@@asinglefrenchfry The bigger issue would be the SEP (Secure Enclave Processor), I would think, since the M-series are ARM chips

    • @alexrosenberg_tube
      @alexrosenberg_tube Před 13 dny +26

      Apple Silicon has a ton of custom features not found in any Qualcomm chip. The OS expects to interact with those features.

  • @Johnnyprc
    @Johnnyprc Před 13 dny +162

    Apple needs to solve the GPU issue for architects, 3d modelers, cinema production companies etc. The onboard GPU doesn't work the way they suggest and i've had to go to Windows PCs with dedicated GPUs....whereas I used to be able to use Mac products just fine.

    • @lllongreen
      @lllongreen Před 13 dny +5

      What exactly with the Apple GPU is it that does not work as they say ?

    • @Johnnyprc
      @Johnnyprc Před 13 dny +70

@@lllongreen The integrated GPU aspect. When it comes to 3D modeling and the like, it's nowhere near as powerful as dedicated GPUs, not even close. There's no ray tracing either. My point about Apple's claims is that their GPU doesn't come close to living up to what their marketing suggests.

    • @MVEProducties
      @MVEProducties Před 13 dny +11

@@Johnnyprc The M3 Macs DO support hardware ray tracing!

    • @Johnnyprc
      @Johnnyprc Před 13 dny +47

@@MVEProducties Apple worked with the Unreal devs to make it kinda work with their software. However, other software titles don't yet work with Apple's ray tracing; basically, Apple has to get together with each title's developers and help make it work. It's not natively supported "out of the box" in many apps that support ray tracing, and even when it is, you can see on the Unreal threads that it's awful.

    • @93CamiSS
      @93CamiSS Před 13 dny +26

I don't know about the later models, but I have an M1 Pro (8-core CPU, 14-core GPU, 16GB RAM) and Adobe After Effects performance doesn't come even close to my old-af 7th-gen i5 / 8GB / GTX 1070 Ti system. Same for Unreal Engine. I tried switching all the performance modes within the software, but it was a shocking revelation. Luckily, GPU-intensive tasks aren't my primary concern.

  • @chrisslaunwhite9097
    @chrisslaunwhite9097 Před 11 dny

First video I have ever seen from this channel, and wow... so well done! Loved the in-depth dive into CPU design. I know you have 1.15M subs, but you just earned another! Cheers!

  • @SDX9000
    @SDX9000 Před 10 dny +2

    Silence is all I want.
PC or Mac computers that can run 99.5% silent in everyday use. I want a button to allow or NOT ALLOW fans to run depending on my needs. So I ended up choosing the MacBook Air, because it is the only computer that is silent, while all the others can spin up their fans whenever they feel like it!

    • @timetraveler_0
      @timetraveler_0 Před 6 dny +1

      Same, still rocking m1. But would trade it for an equivalent x86 device in a heartbeat.

    • @JuanDiegoElViejoNino
      @JuanDiegoElViejoNino Před dnem

You do know... that most laptops... have a built-in option to adjust your fan curve, right? That there's this thing called "quiet mode" on Windows? I prefer having the option to ACTUALLY have active cooling (who drives an air-cooled car nowadays?) that I can adjust to my needs

    • @SDX9000
      @SDX9000 Před 13 hodinami

      @@JuanDiegoElViejoNino No I don't? I had HP and Dell notebooks at work and one way or another they refuse to run without a fan. If you can point out to some reviewer resource that shows how to practically use and achieve "no-fan" operation on Windows notebook I would be very interested.
      On desktop PC I literally installed crazy overkill CPU coolers on 6-core CPU to ensure that fan remains mostly idle, but on notebooks you are stuck with some micro-buzzer that is there - except for MacBook Air that I have now - love the hardware in it.

    • @JuanDiegoElViejoNino
      @JuanDiegoElViejoNino Před 5 hodinami

      @@SDX9000 Just look up Asus quiet mode

  • @snakeplissken8887
    @snakeplissken8887 Před 13 dny +62

    Fun fact about the Qualcomm Elite X / Nuvia team.
    "The founding trio of Bruno, Gulati and Williams were key high-level architects at Apple whose expertise brought fruition to many generations of Apple’s SoCs and CPU microarchitectures. Williams was the chief architect on all of Apple’s CPU designs, including the recent Lightning core in the A13"

    • @yagerq
      @yagerq Před 12 dny +20

      But before working in ARM & Texas instruments.

    • @p.chuckmoralesesquire3965
      @p.chuckmoralesesquire3965 Před 10 dny +1

So yeah, put better: it has nothing to do with Apple hype, it's just that ARM is coming of age. I could do all of this on a Raspberry Pi 5 years ago for $35, woot.

    • @FVBmovies
      @FVBmovies Před 5 dny

      I disregard 'fun fact' comments and skip them unread, just like yours. :)

    • @snakeplissken8887
      @snakeplissken8887 Před 5 dny

      @@FVBmovies You're clearly replying. I win.

    • @FVBmovies
      @FVBmovies Před 5 dny

      @@snakeplissken8887 Still didn't read tho. Have fun with your facts.

  • @haakon_b
    @haakon_b Před 13 dny +99

I do my graphics work on an M1 MacBook Pro and I don't think I'll need to upgrade within the next 4 years. We'll see what's going on then.
I don't need more compute power; I need good software and efficiency.
The plague of modern computing is bad software. Programs that were 1MB in the 90s and ran on a 486 are now 500MB and need a high-end CPU.

    • @DarkPa1adin
      @DarkPa1adin Před 13 dny +1

You don't have to; video rendering times are within 10%, no big deal

    • @jani0077
      @jani0077 Před 13 dny +25

Optimization became absolute garbage in the last 10 years. Almost no application is optimized nowadays, as software devs can brute-force their apps thanks to the huge leap in processing power.

    • @NathanBrownisawesome
      @NathanBrownisawesome Před 13 dny +13

@@jani0077 I 100% agree on this. Gaming is a sore example of how much devs stopped caring about optimization; they just push products out with barely any QA/QC

    • @ColinBrown33
      @ColinBrown33 Před 13 dny +30

      Yes, we need to stop turning every app into an Electron Chromium-bundled JS bloatware

    • @Dave102693
      @Dave102693 Před 13 dny +4

@@jani0077 It's why apps on Windows (especially games) are bloated as fuck and make laptops last like an hour or 2 on idle alone.

  • @petersvan7880
    @petersvan7880 Před 12 dny

    Excellent analysis, a joy watching this video. We'll see how things pan out :) Greetings from Sweden!

  • @bitessun
    @bitessun Před 12 dny

Grabbed a 16" M1 Max 2TB last December and it is so beautiful! Did some research beforehand and saw that the M2s & M3s were swapping out performance cores for efficiency cores. The keys still aren't quite as nice as on my now 11+ year old mid-2012 13", but maybe I'm just super used to that one for typing.

  • @thor.mukbang
    @thor.mukbang Před 13 dny +88

    Your point about fan speed is exactly why I love the M1, especially the 16". This thing is cool and silent no matter how many consecutive hours I work or play games on it.

    • @snazzy
      @snazzy  Před 13 dny +29

      It truly is remarkable.

    • @Gerhard_Schroeder
      @Gerhard_Schroeder Před 13 dny +6

      @@snazzy I use for ME always an M1 MacBook Air, even when my whole team uses better Mx Macs. For Keynote Pages and 1x a month an short video export... I see no reason for an upgrade in years.

    • @bosmanka
      @bosmanka Před 13 dny +6

      Like my Mac mini M1. It’s on 24/7 and nobody ever heard the thing, no matter what I did on it. Going to my son’s room and hearing the noise there of his windows PC always makes me laugh

    • @syntex9673
      @syntex9673 Před 13 dny

      What’s it’s temperature and fan speed when you’re playing games, if you don’t mind me asking?

    • @estiennetaylor1260
      @estiennetaylor1260 Před 13 dny

      @@syntex9673 None when crapple OS can't play any real games lol.

  • @chumbawumba1959
    @chumbawumba1959 Před 13 dny +87

The 12-inch MacBook - fondly called the **MacBook Nothing** - was my ATF of Apple laptops. Its thinness and lightness were insane. I used it primarily for business travel, where I would be in meetings, so the use case was simply MS Office, MS Teams, Visio, web browsing, and corporate email. It was great for all of that, and particularly brilliant with the Retina Display. The KB was a bit janky, as I was constantly missing characters as I typed (pretty fast, probably not enough pressure), but the KB never failed or had HW issues like it did for many. I would love to see a MacBook built on that same fanless and thin platform; even with just an M2 it would be very successful IMHO.

    • @CosmicTeapot
      @CosmicTeapot Před 13 dny +13

      That single lonely usb-c port will never not make me wheeze laughing.

    • @orobiodecastro
      @orobiodecastro Před 12 dny

      *MacBook Adorable*

    • @iali00
      @iali00 Před 11 dny +6

      I owned one and LOVED that thing. I don’t know why Apple doesn’t bring that back. The single port was fine by me. Maybe add a second port and you’re done. It was awesome for business travel. MS office and work emails. More functional than an iPad.

    • @nyps
      @nyps Před 11 dny +1

      i had one too and i so hoped they would bring it back with one of the M processors. would be an instant buy for me over any of the current airs.

  • @BigTylt
    @BigTylt Před 12 dny +26

    The summary of this video is basically that engineering is (un)surprisingly more complicated than Internet expectations suggest.

  • @jasonboles1526
    @jasonboles1526 Před 8 dny +1

    I've used Macs since 1990, but gotta say nowadays it feels like they only have a handful of macOS developers but hundreds on the iOS team... recently had to downgrade from Sonoma back to Ventura (on 2020 intel MacBook Air) because after the upgrade the CPU was running hot and the fan was constantly on. (Activity monitor not showing anything crazy). There's an apple forum support thread with at least 300 other people who had same problem. After the downgrade (not easy- required full wipe and install even with a Time Machine backup), fan was back to normal. The good news about these Qualcomm chips is maybe there'll be ARM hackintosh macs possible.

  • @lubricustheslippery5028
    @lubricustheslippery5028 Před 13 dny +31

ARM is 39 years old; it started with Acorn Computers. So it's not a modern ISA, and it has been drastically added to over time, in a similar way to x86.

    • @anthonycotter1493
      @anthonycotter1493 Před 9 dny +9

Not to mention that different instruction sets make very little difference on something as powerful as a PC or a smartphone. It's not the ISA, it's the whole chip. The ISA myth has been disproven by the University of Wisconsin, and by Intel back when they made x86 chips for Android phones

    • @Jabjabs
      @Jabjabs Před 7 dny

@@anthonycotter1493 ISA used to make a huge difference in the 80s; by the 90s it was down to a few percent, and in the 2000s the difference disappeared completely. If ISA were the be-all and end-all, we would all be using PowerPC right now. A bigger difference comes from the memory controller than from the ISA.

    • @vanCaldenborgh
      @vanCaldenborgh Před 5 dny +1

@@anthonycotter1493 I think it is both, and performance per watt counts; that's the main reason development of Intel x86 Android phones stopped. But yeah, I am also sure that if a brand-new modern instruction set were developed, ARM would be defeated, including on performance per watt.

    • @patrickday4206
      @patrickday4206 Před 5 dny

      Stop making me feel so old 😂

  • @renchesandsords
    @renchesandsords Před 13 dny +19

One thing to note, though: when Apple transitioned to M1, in the previous-gen Intel MacBook the heatsink LITERALLY didn't make contact with the die. This was found when LTT tried to improve the cooling with better paste and discovered they had to drill out some of the heatsink in order for it to touch the CPU

    • @BigTylt
      @BigTylt Před 12 dny +8

      And in the Early 2020 MacBook Air models, they added that ridiculously underpowered semi-passive heatsink that could only handle about 5-10 Watts before the CPU started to throttle...

    • @ZachariahConnor
      @ZachariahConnor Před 12 dny +16

      I honestly wonder if apple made some of the pre-m1 computers worse on purpose to make m1 look good.

    • @octav7438
      @octav7438 Před 12 dny +14

      @@ZachariahConnor it sounds like an apple thing to do honestly. I wouldnt be surprised

    • @IgorTimarac
      @IgorTimarac Před 11 dny

      @@ZachariahConnor Considering that they _do_ have some engineers employed, I can't think of any other explanation for this. But that leaves me with a feeling that they were actually caught by M1 surprise themselves--had they known that the Apple Silicon would be _so_ good, I don't think they'd have gone through the trouble.

  • @yomerosoy5852
    @yomerosoy5852 Před 12 dny +13

So the original team that designed the M1 now works at Qualcomm; I think Tim Apple will keep overclocking chips to compete.

    • @MaxPrehl
      @MaxPrehl Před 10 dny +4

      Hilarious if true

    • @helloukw
      @helloukw Před 9 dny +4

The thing is, Apple sold well even when they were using Intel chips that were hitting thermal limits fast, so I doubt people buy Apple hardware for performance first. I think nothing much will change with Apple; they rarely take any risks.

  • @stultuses
    @stultuses Před 11 dny +5

What about the so-called unpatchable security flaw recently found in the M1, M2 and M3 chips?
Tell us more about this, please

    • @maynardburger
      @maynardburger Před 11 dny

      These things are almost never relevant to general users.

  • @dzalejandro
    @dzalejandro Před 13 dny +59

Age is making Quinn do his videos from his chair 😀
Love your videos, keep up the work man!!!

    • @arihantplaying
      @arihantplaying Před 13 dny +3

When I saw their review of the Apple Vision Pro I became a fan of this channel, and this video is great.
I am loving this channel's videos

    • @2rx_bni
      @2rx_bni Před 13 dny

      Counter suggestion: he's recording enough of them at once that standing is just impractical to do in long bouts. I'm sure it's way more comfortable. Glad he can sit down.

  • @abb0tt
    @abb0tt Před 13 dny +20

    After 30 years as an Apple user, I think my 2021 M1 MacBook Pro Max might be my last Apple computer purchase. I am excited about Framework’s roadmap.
    My biggest regret: thinking 32 GB of RAM was sufficient for my needs and realizing a swap for the 64 GB would require me to take a huge loss. My servers and PC have 128 GB…I must have been drunk when I configured that MBP. 🤦‍♂️

  • @defooster2757
    @defooster2757 Před 10 dny

    Thanks for the detailed breakdown and the glimpse into the future competition.

  • @mikekelly6331
    @mikekelly6331 Před 6 dny

    Great video going as deep as I always want these videos to go.
    Excited to FINALLY get an M1 competitor. I'm a Linux user who has been waiting for 4 years for this!

  • @cosmiccuttlefish5765
    @cosmiccuttlefish5765 Před 13 dny +96

    16:15 I love Quin’s obsession with the 12” MacBook.

    • @snazzy
      @snazzy  Před 13 dny +49

      It’s unhealthy

    • @cosmiccuttlefish5765
      @cosmiccuttlefish5765 Před 13 dny +2

      @@snazzy I have a similar relationship with the 2009 13” MacBook Pro…

    • @nateo200
      @nateo200 Před 13 dny +6

      I miss that thing. I remember when the OG 11" MacBook Air came out and I was in love with it. Still am tbh

    • @wynq
      @wynq Před 13 dny +3

      @@snazzy I'd argue it is the perfect amount of healthy!

    • @darekmistrz4364
      @darekmistrz4364 Před 9 dny

      @@nateo200 I remember when I saw it first time. I couldn't belive that something so elegant could be produced as working computer

  • @matthewcarlson5885
    @matthewcarlson5885 Před 13 dny +92

I’m an engineer working on Apple products and I enjoyed this take. Obviously my opinion is my own, but I think there are way too many SKUs of product. It makes way more work for me and, as you pointed out, they really aren’t that different.

    • @ChrisAljoudi
      @ChrisAljoudi Před 13 dny +29

      If you really work for Apple, you should delete this comment before you get in trouble

    • @SpicysaucedHD
      @SpicysaucedHD Před 13 dny +7

      @@ChrisAljoudiHe is entitled to have his own opinion, or does Apple control minds too? :)

    • @ChrisAljoudi
      @ChrisAljoudi Před 13 dny +14

      @@SpicysaucedHD openly critiquing your employer is a bad idea in general, and even moreso at Apple where they prioritize and enforce secrecy. In a perfect world things would be different but that’s the truth

    • @SpicysaucedHD
      @SpicysaucedHD Před 13 dny +9

      @@ChrisAljoudi Self-censoring in the land of the Free cause of corporate interest? Yeah no I wouldnt do that :) Im also sure that the guy knows what he's doing.

    • @sebastienauger4068
      @sebastienauger4068 Před 13 dny

@@ChrisAljoudi Just wow 😂, he never said he worked for Apple. Learn to read

  • @jeffsydor9430
    @jeffsydor9430 Před 10 dny

    This is exactly the video comparison I've been looking for! Thank you for the great explanations. And no I didn't turn it off when it got geeky! haha. I'm still using the 5,1 Mac Pro (Hackenmac) and the 2017 MBP. I've been waiting for the M3 Mac Studio and maybe an M4 MBP to replace my current setup. As a web/graphic designer, video editor and someone getting into Blender modeling for animation and printing, I tend to need a processor that can handle a large volume and I don't really want something that's getting ridiculously hot.
    After this video I'm not sure if I should wait for that anymore. Should I just go for a top of the line M1 or M2? I know that I'll still see insane performance gains from them considering where I'm at. But it almost sounds like the M3+ chips might not be worth it considering the stagnated form factors.

  • @LaczPro
    @LaczPro Před 10 dny

    I always thought of Apple Silicon as a threat to my recently bought AMD Ryzen back in 2019. But I can't deny how interesting the ARM architecture is at this moment... And this is just starting. Having competition is awesome

  • @BeefIngot
    @BeefIngot Před 13 dny +17

    I dread these snapdragon chips being locked down and inaccessible to linux users. The transition to arm would spell trouble for people who want literally any freedom over their hardware whatsoever.

    • @ULUnLoco
      @ULUnLoco Před 13 dny +1

Asahi Linux works well, at least it did last time I tried it.

    • @fionnlanghans
      @fionnlanghans Před 13 dny +1

@@ULUnLoco They could do some verified boot without an option to change it. So maybe Windows-only.

    • @kushagranayyar3960
      @kushagranayyar3960 Před 13 dny +1

That would be very ironic: moving from a proprietary to an open-source architecture and losing the option.

    • @giornikitop5373
      @giornikitop5373 Před 12 dny

      @@kushagranayyar3960 what architecture is open source?

    • @JHSaxa
      @JHSaxa Před 10 dny

      Isn't RISC-V an open source ARM, basically?

  • @shanemshort
    @shanemshort Před 13 dny +8

    if people are going to say that the nanometer nomenclature doesn't make sense, don't use it. Call it N5/N3/N2 etc which is the actual product name TSMC uses.

  • @douglasgoodall3612
    @douglasgoodall3612 Před 10 dny +1

That was a fantastic review of the Apple M phenomenon. I am grateful for having been brought up to speed on the tech involved in Apple's recent offerings. I still have a real problem with how fast their machines sunset, OS-wise. In a decade, I burned well over $20K on my Mac Pro and MacBook Pro notebooks. Lack of upgradability is what forced me into hand-building my own development machines. My 12-core Ryzen 9 with 128GB of RAM and 10TB of Gen5 NVMe SSD is how I get my work done these days. I also just feel more comfortable running Linux than macOS. I made the transition off Windows and life is better now.

  • @mdkooter
    @mdkooter Před 11 dny

    Great video. One minor detail as a not-apple-fanatic : Asus Zenbook 14" Oled basically already does everything you asked for. Powerful and economic-when-idle CPU? Check. Powerful GPU? Check. Insane battery life? check. Great formfactor, build quality? Check. Thin and light casing? check. (responding to the end of your video where you comment that the non-apple gaming laptops are stuck at 3.5 hours battery life). There's also other 17" laptops such as LG's gram that have decent 4050 videocards, good CPU's, weigh 1.35kg and are a joy to use (Albeit at the expense of some strength in the chassis).

  • @Vr00mf0ndel
    @Vr00mf0ndel Před 13 dny +139

    M1 Air is still the best laptop for everyday use

    • @snazzy
      @snazzy  Před 13 dny +42

      It’s superb, agree.

    • @williambillyshears9129
      @williambillyshears9129 Před 13 dny +3

      @@snazzy I'm on my 2nd M1 MacBook Air, base unit. (Love your channel)! 💗

    • @menumenu287
      @menumenu287 Před 13 dny +8

I use a base model M1 Air for software development work: 2 Docker containers, 2 IDEs and 3 browsers with about 30 tabs altogether, and it works fine. Faster and quieter than my 13th-gen Intel work machine.

    • @just_joseph157
      @just_joseph157 Před 13 dny +8

      I have one, and I disagree. macOS has had a choppy scrolling bug since 2020. It's very noticeable and annoying for people who are very sensitive to lag/stutters, such as myself. macOS is also far too limited compared to Windows. The Air's screen is okay but has a slow response time, so there's a lot of ghosting on darker backgrounds. It was a €1100+ laptop at launch, at least in the EU. I've seen cheaper Windows laptops with better screens. The M1 Air is a good laptop with fantastic battery life, but that's about it; the rest of the machine is either mid or average at best. It's not as good as you all make it out to be, and I certainly wouldn't have bought one if I had known that it would stutter just as much as a cheap Chromebook.

    • @sadcat6256
      @sadcat6256 Před 13 dny

      @@just_joseph157 might be some software you installed... the only time my own Air stutters is when I am editing 16K+ photos in pixelmator with a bunch of other apps open. and even then, the stutter only happens with the stage manager app switch animation. other people i know who have Airs and use them for much lighter workloads than i do, never have stutters happen to them. at least not that i've seen. so you're either lying about having a mac, installed some dubious software, or your unit is defective.

  • @gljames24
    @gljames24 Před 13 dny +41

    A thing that should be mentioned is that it wasn't really an x86 vs ARM issue. Intel basically just got stuck at a bad node size for a long time as they fab inhouse. The Steam deck uses x86, but on a TSMC AMD chip so the tdp is just better.

    • @tom_marsden
      @tom_marsden Před 13 dny +5

Yeah, Intel was stuck on 14nm for over 6 years and seems to be stuck on 10nm now.

    • @heroofjustice3349
      @heroofjustice3349 Před 13 dny +2

Yup. In multicore performance, Ryzens are actually almost on par with Apple chips. It's so strange AMD didn't try to add efficiency cores to their processors; it would be a game changer.

    • @Demopans5990
      @Demopans5990 Před 13 dny +2

      @heroofjustice3349
AMD gets pretty close to Apple in terms of perf per watt, but only in the high-end server stuff, and only if you turn off boosting. 128 cores of pure compute, if I recall, that only sip 300W

    • @kuriaspaul
      @kuriaspaul Před 12 dny

@@heroofjustice3349 They will be. AMD will be releasing a power-efficient chip with efficiency cores (regular cores with less cache) by CES next year. So Qualcomm, AMD and, to a lesser extent, Intel will all offer high-performance, long-battery-life laptops in 2025. Apple's advantage will be negligible next year.

    • @heroofjustice3349
      @heroofjustice3349 Před 12 dny

@@Demopans5990 OK, I will write this again because the link was caught by YT and my comment was not posted. Instead, I ask you to put 'AMD Ryzen 9 7940HS analysis' into Google and check the first link, from NotebookCheck.
As you can see, your comment is wrong. Consumer-grade AMD laptop chips like the 7x40U/HS or 6800U are almost as efficient as Apple Silicon under load, even on a worse manufacturing node (6800U). The advantage of Apple Silicon lies in its E-cores. Ryzens have only performance cores, so under low load or at idle those chips need more power. It's just the wonder of the big.LITTLE architecture: power-hungry P-cores are activated and powered only when there is need for more processing power than the E-cores can handle. However, AMD is already planning to introduce their own E-cores, and that should put Ryzen-powered laptops very close to Apple chips in all aspects of efficiency (including battery life).
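The big.LITTLE scheduling behavior described in this thread can be sketched in a few lines. All core counts, throughput, and wattage numbers below are made up for illustration; they are not real Apple or AMD figures:

```python
# Toy model of big.LITTLE scheduling: light work stays on the
# efficiency cores; performance cores power up only when demand
# exceeds E-core capacity. All numbers are illustrative.

E_CORES, P_CORES = 4, 4
E_CAPACITY, P_CAPACITY = 1.0, 3.0  # relative throughput per core
E_POWER, P_POWER = 0.5, 5.0        # watts per fully active core (made up)

def power_draw(demand):
    """Watts needed to serve `demand` units of throughput."""
    served_by_e = min(demand, E_CORES * E_CAPACITY)
    e_active = served_by_e / E_CAPACITY      # fractional cores in this toy model
    remaining = demand - served_by_e
    p_active = min(remaining / P_CAPACITY, P_CORES)
    return e_active * E_POWER + p_active * P_POWER

light = power_draw(2.0)   # browsing-level load fits on E-cores: 1.0 W
heavy = power_draw(10.0)  # a heavy build wakes the P-cores: 12.0 W
```

The asymmetry is the whole point: doubling throughput demand past the E-core ceiling costs an order of magnitude more power, which is why an idle-heavy laptop workload benefits so much from having E-cores at all.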

  • @vanilkharbanda
    @vanilkharbanda Před 12 dny

    I don't think you should ask or care what people think in the comments given how comprehensive your video essays are. Excellent as usual.

  • @Josephe01
    @Josephe01 Před 4 dny

    I’m gonna buy a new iPad straight away as my old one is really old. Thanks for the update. M4 sounds amazing. 🎉

  • @livemadseason
    @livemadseason Před 13 dny +5

    As usual, title makes me think it will be boring video, but the I see your beard, and I start watching, and cannot stop till the end. That's magic. I do not even have any apple device 😅

  • @TheVoiceofReason4ya
    @TheVoiceofReason4ya Před 13 dny +3

    I have an m1 pro base 16” and an upgraded model m1 imac, both are basically interchangeable for pretty much all tasks including editing 4k 60 timelines in fcpx. The export times are similar enough to not be noticeable, although the pro has more headroom and stays cooler.
That being said, the products are so good that I have zero desire to “upgrade” (in some cases downgrade, with fewer P-cores) via iterative chip releases. I think this is part of what you are saying here, and it's a point well taken.

  • @jasonhemphill8525
    @jasonhemphill8525 Před 12 dny +1

The last thing you forgot is architecture improvements. Changes to a chip's cache and layout can create a huge improvement in performance even on the same node.

    • @jasonhemphill8525
      @jasonhemphill8525 Před 12 dny

Allowing the CPUs better access to system memory, via more lanes or faster RAM, also contributes heavily to speed gains.
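The cache-and-layout point above can be illustrated with a toy traversal-order experiment: the same data summed in a cache-friendly versus cache-hostile order. In CPython the effect is muted (lists store pointers), so treat this as a sketch of the idea rather than a real benchmark; in a compiled language the same access-pattern change is dramatic:

```python
# Same data, same amount of work, different memory-access order.
# Row-major traversal walks memory contiguously (cache-friendly);
# column-major strides across rows (cache-hostile). The results are
# identical; only the memory behavior differs.
import time

N = 500
grid = [[1] * N for _ in range(N)]

def sum_rows(g):
    return sum(g[i][j] for i in range(N) for j in range(N))

def sum_cols(g):
    return sum(g[i][j] for j in range(N) for i in range(N))

t0 = time.perf_counter(); a = sum_rows(grid)
t1 = time.perf_counter(); b = sum_cols(grid)
t2 = time.perf_counter()

assert a == b == N * N  # identical answer, different access pattern
# (t1 - t0) vs (t2 - t1) hints at the traversal-order cost
```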

  • @tristanortiz-duchesne633

    That's a very comprehensive take on the current state of Apple laptops. Very enjoyable, information packed video. These keep me subbed as I get older and spec dumps and reviews lose their interest. Keep up the great work!

  • @blechuk
    @blechuk Před 13 dny +83

The main hope I have from competition isn’t new MacBook form factors (though it’d be nice); it’s more realistic (read: cheaper) storage and RAM upgrades. The fact the Snapdragon X Elite uses NVMe over PCIe should prove that Apple’s soldered-on SSDs don’t have any advantage; maybe they’ll use standard connectors, at least in Pro machines.

    • @jimcabezola3051
      @jimcabezola3051 Před 13 dny +6

      I agree with you! I also hope Snapdragon Elite X chips won't be hamstrung with non-upgradable RAM and storage.
      Why does that leave me with the sinking feeling...that that's EXACTLY what Qualcomm will do?
      Heaven forfend! 🤣🤣

    • @kushagranayyar3960
      @kushagranayyar3960 Před 13 dny +2

Doesn't Apple already support PCIe on their desktop Mac Pro with ARM chips? Also, Thunderbolt runs over PCIe, if I'm not wrong. They just chose to hinder any upgrades.

    • @kushagranayyar3960
      @kushagranayyar3960 Před 13 dny +14

@@jimcabezola3051 I think RAM will stay soldered due to the current speed and trace-length limitations of DDR5; let's hope that new Dell standard for RAM gets better adoption.
But there's no reason storage needs to be soldered.

    • @jimcabezola3051
      @jimcabezola3051 Před 13 dny +5

      @@kushagranayyar3960 Yes, you're right... I'm afraid you're right. The traces to a DIMM would be prohibitively long and slow. But...let's not make a base system contain LESS than 32GB, okay? 🤣🤣

    • @jimcabezola3051
      @jimcabezola3051 Před 13 dny +1

      @@kushagranayyar3960 The present Mac Pro's PCIe support is only a baby-step in the right direction, yes.
      They need to go much further, though.

  • @astrit
    @astrit Před 13 dny +21

    My man explaining chips in a Castro style 😂

  • @pennyzee1176
    @pennyzee1176 Před 11 dny

I’m a 3D motion artist, and for the past few years I have been rocking an M1 Max MBP for most of my Adobe-type work while maintaining a high-end Intel/Nvidia workstation for 3D/GPU rendering. I would LOVE a beefy M4 Ultra laptop with adequate airflow and a higher TDP. My stupid 13900KS + 3090 Ti has awful power consumption, 4090s were mostly not worth the price tag and would have required a bigger case than my already large one, and keeping up two machines is annoying. And I like having a portable workstation.
    I would like it if I could add more internal secondary NVMe storage, of course.

  • @trimonmusic
    @trimonmusic Před 11 dny

    One of the big problems for Apple is how well their M1 laptops still hold up today. Bought an M1 Max MacBook Pro (64GB RAM, 2TB SSD) in November 2023. At release, it would have cost over £4K ($5K), whereas I paid only £2K ($2.5K) for it. Nothing but positives from me so far.
    I've seen first-hand how poorly managed thermals can cause a laptop to degrade extremely quickly (cough... Razer), but with how efficient this device is I expect it to last for many years to come.

  • @aeiplanner
    @aeiplanner Před 13 dny +103

    Is today Fidel Castro's birthday?

    • @trainingtheworld5093
      @trainingtheworld5093 Před 13 dny +11

      Give him a cigar lol

    • @snazzy
      @snazzy  Před 13 dny +105

      I dunno, let me ask my Humane Ai Pin. I’ll get back to you next week.

    • @alandiegovillalobos
      @alandiegovillalobos Před 13 dny +5

      @@snazzythey should just connect it to people like the Amazon cashier. 😅

    • @omjareno9111
      @omjareno9111 Před 12 dny +5

      As a Cuban i have to say: is not funny 😢😅

    • @The_Ballo
      @The_Ballo Před 12 dny +8

      Trudeau celebrates it every year.
      ¡Papi!

  • @danielhalachev4714
    @danielhalachev4714 Před 13 dny +15

    It's refreshing to see an Apple fan, who isn't brainwashed by brand loyalty!

    • @JohnSmith-pn2vl
      @JohnSmith-pn2vl Před 11 dny

      that is not true since there are no brainwashed apple fans.
      that is a total myth created by haters

  • @attilavs2
    @attilavs2 Před 11 dny

The thing about switching to ARM is that it is, and always would be, much easier to do on Linux... even with minimal support, you can have a full desktop experience running on a phone with a bit of tinkering and a while spent compiling.

  • @tipoomaster
    @tipoomaster Před 11 dny +1

    Johny Srouji gave a very rare forward-looking hint a few months back in an interview, mentioning that with node shrinks slowing down, advanced packaging was the future of advancement. Supply chain research out of Taiwan also said there were higher orders for advanced packaging expected to come from Apple. And then add the last clue: the M3 Max didn't have the UltraFusion bridge to make an Ultra out of two of them.
    I wonder if the M4 line will introduce tiles, allowing different parts to be made on different nodes. Particularly IO and caches haven't been shrinking well and there's benefits to this approach, and also different nodes are better suited to either CPUs or GPUs or low power IO etc.

    • @maynardburger
      @maynardburger Před 11 dny

      The benefits there mainly come down to costs, and I don't think Apple is super concerned with that just yet. They can just raise the price and people will pay it. Plus the base M dies are still quite small (~150mm²), so continuing to more or less just glue dies together with their developed interlinks seems to still be good enough for a while. Though it's possible that, for the Max/Ultra designs, they could start thinking about a more chiplet-style approach sooner.

  • @s8x.
    @s8x. Před 13 dny +19

    Switched to a Windows laptop with a 64GB RAM upgrade after using an 8GB M1 Mac.

  • @WestUCoog
    @WestUCoog Před 13 dny +28

    M1 MacBook Air is good enough for 90% of users. Incredible snappiness and crazy battery life.

    • @DavidHuffTexas
      @DavidHuffTexas Před 13 dny +2

      Agreed. We got my high school daughter one of these after the price went down when the M2 Air was released. Fully expect the machine to last her thru college.

    • @timbambantiki
      @timbambantiki Před 13 dny +2

      my dad has one and its nice

    • @Jacked2theTs
      @Jacked2theTs Před 13 dny

      For these users, an iPad Pro/Air would be a better and more portable friendly option… I use my phone and tablet, WAY more than my laptop or desktop.

    • @hyperadapted
      @hyperadapted Před 12 dny +5

      @@Jacked2theTs I am one of those users and no, an iPad would not be "a better and more portable friendly option", since the iPad is locked down at the OS level and highly limited in terms of workflow compared to the MBA.

    • @just_joseph157
      @just_joseph157 Před 10 dny

      lol no. The M1 MacBook Air is filled with microstutters, due to a four-year-old bug that hasn't been fixed yet. So there's choppy scrolling everywhere. It does have really great battery life, but that's about it.

  • @HanmaHeiro
    @HanmaHeiro Před 21 hodinou

    Never seen your channel before, but this was an awesome video. Thank you.

  • @jaymesridel2652
    @jaymesridel2652 Před 11 dny

    I feel like what you mentioned at the end of the video is like what they're doing with Vision Pro, because it has their silicon in it, so they're already taking that kind of risk. Makes sense.

  • @rene291
    @rene291 Před 12 dny +2

    I am convinced that when Windows fully moves to ARM with a chip that is even as powerful as the M2, Apple will be screwed.

  • @joeysanchez6777
    @joeysanchez6777 Před 13 dny +24

    So we're just going to ignore that JLC Reverso on the wrist 🔥🔥

    • @danieldavis6703
      @danieldavis6703 Před 13 dny

      I’m glad I wasn’t the only one that noticed it! 😅

    • @theshadowman1398
      @theshadowman1398 Před 13 dny

      I guess the Apple Watch started to make him feel cheap.

    • @-johnny-deep-
      @-johnny-deep- Před 12 dny

      Good spot! Do those things really start at about $7000 USD? Yikes! Any idea what model he's wearing?

    • @jeffcullen6573
      @jeffcullen6573 Před 11 dny

      ...snazzy!

  • @gufo10games.74
    @gufo10games.74 Před 11 dny +1

    Ngl, that fan audio made me start looking around my room going "where's that fan sound coming from? My phone doesn't have a fan and my computer isn't turned on."

  • @CriscOnSmootH
    @CriscOnSmootH Před 11 dny

    @Snazzy Labs Hey man, I ran into a bit of an issue after typing in a terminal command I got from one of your other videos, wondering if you might know the fix? In Terminal, I typed:
    % defaults write com.apple.dock autohide-delay -float 0; defaults write com.apple.dock autohide-time-modifier -int 0; killall Dock
    in order to make the dock show/hide faster. I liked it, but decided I liked the default transition animation better, so I tried to set it back to default. I set it to 0.5 as you suggested. What happens now is the mouse pointer will be in the dock area for a few seconds before the dock appears, and the transition animation is gone. It now just appears instantly after 2-3 seconds rather than smoothly coming up from the bottom. Do you know how I could fix this? I restarted the laptop and it's still doing it; the sliding-up animation is gone and the dock takes a couple of seconds to appear even at 0.1-0.5.
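    A likely fix, assuming the leftover behavior comes from the two Dock preference keys the earlier command overrode: instead of writing new values, delete those keys so macOS falls back to its built-in defaults. This is a sketch of the usual remedy, not a confirmed fix for this specific machine:

    ```shell
    # Delete the two overridden Dock preference keys so macOS reverts to its
    # built-in defaults for the auto-hide delay and animation speed,
    # then restart the Dock process so the change takes effect.
    defaults delete com.apple.dock autohide-delay
    defaults delete com.apple.dock autohide-time-modifier
    killall Dock
    ```

    If one of the keys was never set, `defaults delete` simply reports that the pair does not exist, which is harmless.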

  • @plotfi1
    @plotfi1 Před 13 dny +20

    Darn, it would be so nice if M1+ machines supported eGPUs for local ML development workflows.

    • @Demopans5990
      @Demopans5990 Před 13 dny +3

      Yeah, PyTorch and TensorFlow are still very flaky on Apple silicon.

  • @whatsupchicken
    @whatsupchicken Před 12 dny +3

    For me, the otherwise boring MacBook hardware is ok, I just wish they developed macOS!

  • @parthmehra8630
    @parthmehra8630 Před 3 dny

    I use a MacBook with M1 Max for music production, and even with 30-40 tracks in Logic Pro X, I don't hear the fan noise. This is a great feature for someone in music production.

  • @KubuntuYou
    @KubuntuYou Před 12 dny

    @snazzy How does one go about acquiring one of those Snapdragon X Elites in the acrylic case?

  • @rob8969
    @rob8969 Před 13 dny +90

    I’m offended I’m not a normal person 😅

  • @sylvershadow1247
    @sylvershadow1247 Před 13 dny +10

    My M1 Max Macbook Pro is still kicking ass as my work laptop

    • @snazzy
      @snazzy  Před 13 dny +4

      One of the best!

  • @user-hk3ej4hk7m
    @user-hk3ej4hk7m Před 11 dny +1

    I frequently see ARM touted as an efficiency king, but the ISA is not the biggest contributor at all. The M series has RAM soldered right beside the CPU, which decreases energy use significantly. Just look at the efficiency improvements of the AMD chips using 3D V-Cache: adding a substantial amount of cache means fewer memory fetches and less memory-bus use for the same workload.

  • @mohammadalhusaini1115
    @mohammadalhusaini1115 Před 11 dny

    I think the correct market economics is what Apple is already heading for:
    A high-value ecosystem that we pay for means that we are not pushed to continually upgrade the hardware. This way, Apple can continuously make great new hardware without worrying too much about planned obsolescence. iCloud keeps us there and happy, and ensures that Apple doesn't need us to switch once a year. That's why these M1s (which I am typing from) and the subsequent chips are incredible in their own right and last a long time without the need to constantly change.

  • @nhansgoofyvideos7581
    @nhansgoofyvideos7581 Před 13 dny +10

    Finally someone pointed out that Apple has been paying to win with TSMC the whole time, and that it isn't pixie dust that makes the Apple M-series significantly faster than the competition.

    • @maynardburger
      @maynardburger Před 11 dny +1

      That's not at all what was said, though. It's *an* advantage, but far from the only thing making these processors as good as they are.

  • @slizgi86
    @slizgi86 Před 13 dny +18

    They hit the thermal wall faster than we were able to forget that Intel has this problem too.

    • @granttaylor8179
      @granttaylor8179 Před 13 dny +8

      It took Intel 40 years to hit a thermal wall. It has taken Apple 4 years to hit one.

    • @definingslawek4731
      @definingslawek4731 Před 12 dny +3

      @@granttaylor8179 In what way did it take Intel 40 years? For longer than I have been alive, processors have needed cooling and have been hitting temperatures upwards of 80°C.

    • @granttaylor8179
      @granttaylor8179 Před 12 dny

      @@definingslawek4731 It was only in 1992 that any form of passive cooling was needed.

    • @granttaylor8179
      @granttaylor8179 Před 12 dny

      @@definingslawek4731 It has only been since the 486DX2-66 of 1992 that any sort of passive cooling was needed on a processor.
      Intel has been making processors for over 50 years. Apple has run into similar issues a lot sooner than Intel did.
      They cannot keep shrinking the process node, as that has been done for the last 50 years and is reaching its current limits.

    • @octav7438
      @octav7438 Před 12 dny +1

      @@definingslawek4731 truth

  • @kirtmanwaring3629
    @kirtmanwaring3629 Před 11 dny

    I’m with you, the new form factors enabled by good Arm chips will be sick! Down the line I could have an actual Windows or Linux phone with a nice slide-out keyboard; one device will do it all. Add to that the plateau TSMC is bumping up against, and it reminds me of Intel five years ago.

  • @tavasoli
    @tavasoli Před 10 dny

    These are the videos that make snazzy labs a unique channel.
    Keep going!

  • @vpham92688
    @vpham92688 Před 7 dny +3

    “Napkin” math needs to be on a cocktail napkin from your local bar 😊

  • @trainingtheworld5093
    @trainingtheworld5093 Před 13 dny +8

    I own an M1 iMac and it sits mostly unused, not because of the M1 but because of the awful macOS Sonoma.

    • @exzld
      @exzld Před 13 dny +1

      whats your gripes about the OS?

    • @whatsupchicken
      @whatsupchicken Před 12 dny

      You can downgrade!

    • @baronvonschnellenstein2811
      @baronvonschnellenstein2811 Před 12 dny +2

      @@exzld Beyond OG OSX and perhaps the next couple of major releases, each release takes a retrograde step in terms of usability.
      Originally: A lovely, clean, intuitive user experience for daily driving. Administrative features were all in sensible places - where you'd expect them to be.
      More recent iterations have had changes for the sake of change, made worse by nobbling some administrative features, as well as arbitrarily moving other admin features to random places for no good reason.
      Perversely, the biggest competition ("Windoze") have been progressively making their offering somewhat easier to administer.

    • @exzld
      @exzld Před 11 dny

      @@baronvonschnellenstein2811 Windows, easier? No, they've been making the same mistakes for years now.

    • @raghuveer8137
      @raghuveer8137 Před 9 dny

      @@baronvonschnellenstein2811agreed. Imo High Sierra was the peak. Everything is downhill from there

  • @alastairleith8612
    @alastairleith8612 Před 11 dny

    If your MBP is running that hot all day long, then consider using a Studio or a better-spec'd MBP, because constant use at that heat is going to impact the lifespan of the laptop's components, and not just the SoCs themselves, either.

  • @ialrakis5173
    @ialrakis5173 Před 11 dny

    My iMac i3, i5(?) was getting slow and as soon as the M1 Mini was announced I ordered one with 16 GB of RAM. It's still a brilliant little machine. Even ok for basic gaming. Now that we're on the topic of gaming... With Apple trying to get back into that market I'm pretty sure we can still expect some exciting announcements. From a tech point of view I'm curious of course but right now I don't need more power for anything that I'm doing.

  • @msc_1974
    @msc_1974 Před 10 dny +4

    I don't think this video is going to age well 🤣😂🤣

  • @stevens7409
    @stevens7409 Před 13 dny +14

    The real change will be when SSDs reach the speed of RAM, so machines no longer have to include RAM at all. True unified memory would be data on the SSD that is directly accessible from the CPU/GPU.

    • @andyH_England
      @andyH_England Před 13 dny +18

      That is unlikely to happen as the rate of improvement in RAM speed matches the rate for UFS speeds. Therefore, RAM is always likely to be much faster. I did some maths, and RAM was about 8X faster than NAND last year.

    • @Demopans5990
      @Demopans5990 Před 13 dny +4

      The only way this will ever happen is with improvements in Optane, which Intel canned.

    • @BrunodeSouzaLino
      @BrunodeSouzaLino Před 11 dny

      The real change is already happening in the server space with the X3D EPYC SKUs. They have up to 1 GB of L3 cache. L3 cache is much faster than RAM and closer to the metal as well, since it's part of the CPU die. With that much cache, an EPYC CPU can keep most of its working set in L3 instead of relying on RAM.

  • @BenMorse0
    @BenMorse0 Před 10 dny

    4:47 There is a fourth option: you can improve the algorithms inside the chip for more performance, at the cost of more transistors.

  • @atifismail4659
    @atifismail4659 Před dnem

    A 12" MacBook with an M2, or even an M1, would be the perfect laptop for most. I am still using the 2017 model (highest configuration) for daily tasks and have never felt left behind.

  • @dansaghin1
    @dansaghin1 Před 8 dny +3

    The fact that the other brands are catching up with Apple is a good thing...
    Can't wait.

  • @DannyMexen9
    @DannyMexen9 Před 13 dny +9

    I own M1 MBA and MBP.
    They run beautifully.

  • @axelotl86
    @axelotl86 Před 2 hodinami

    My M1 Pro is still enough for every project I've done. It was a revolution for me in a notebook form factor.

  • @MegaUpstairs
    @MegaUpstairs Před 11 dny

    Since around 2011 the industry changed the meaning of "x nm process". Before, it meant the gate size, but since then it has only been an abstracted measure that is not linear with the actual feature sizes. Gate sizes have been stuck at around 30-40 nm since 2011.