How Apple Just Changed the Entire Industry (M1 Chip)
- Published 2 May 2024
- Sign up for Morning Brew today for FREE: cen.yt/morningbrewcoldfusion3
ColdFusion Merch:
INTERNATIONAL: store.coldfusioncollective.com/
AUSTRALIA: shop.coldfusioncollective.com/
A quick note - The A14 comparison vs Intel i9 was normalised for single core performance.
-- About ColdFusion --
ColdFusion is an Australian-based online media company independently run by Dagogo Altraide since 2009. Topics cover anything in science, technology, history and business in a calm and relaxed environment. In this video we take a look at Apple’s M1 chip, or Apple Silicon, in the new MacBook.
Interview with ARM founder: • ARM Processor - Sowing...
If you enjoy my content, please consider subscribing!
I'm also on Patreon: / coldfusion_tv
Bitcoin address: 13SjyCXPB9o3iN4LitYQ2wYKeqYTShPub8
-- New Thinking Book written by Dagogo Altraide --
This book was rated the 9th best technology history book by book authority.
In the book you’ll learn the stories of those who invented the things we use everyday and how it all fits together to form our modern world.
Get the book on Amazon: bit.ly/NewThinkingbook
Get the book on Google Play: bit.ly/NewThinkingGooglePlay
newthinkingbook.squarespace.c...
-- ColdFusion Social Media --
» Twitter | @ColdFusion_TV
» Instagram | coldfusiontv
» Facebook | / coldfusiontv
First track: Burn Water - Nostalgia Dreams • Burn Water - Nostalgia...
Other Tracks:
Shallou - Love
No Spirit - Careless
Montell Fish - Jam Session
Cody G - With You
Emancipator - Greenland
Oma and Amberflame - Tropical Capricorn
Young American Primitive - Sunrise
WMD - Sentimental
Edward Sharpe and the Magnetic Zeros - Life is Hard (Teen Daze Remix)
Aerocity - And Our Hearts Beat Together
Abhi and Dijon - work.pool.bed
Burn Water - Burning Love
» Music I produce | burnwater.bandcamp.com or
» / burnwater
» / coldfusion_tv
» Collection of music used in videos: • ColdFusion's 2 Hour Me...
Producer: Dagogo Altraide - Science & Technology
I remember you saying like 2 years ago, that Apple should include ARM processor, and now here we are, it happened
we always knew it would happen since x86 used too much power.... and unlike 10 years ago.. more devs are now building ARM apps...
What about RISC V processor??
Is it better than ARM Risc
Many of us predicted this years ago. I thought it might happen the moment I saw the iPhone 4S, which was maybe 2014? The jump in processor speed was so significant, I knew this day would come.
@@shrin210 any update on this?
AMD and Intel are counting their final years, unless they rejoin the ARM club. And Nvidia is a smart, yet greedy guy. He owns ARM now.
You can say that Apple took the RISC.
They ARMed themselves.
🥵🥵🥵
@@jorge091167 Damn you guys. Haha
INTELligent move
AMDone with this
I feel like every Processor manufacturers is just killing Intel right now
But i kinda have this fear that with the transition for computers from x86 to ARM, that everything would become very locked down and we would have less control of our devices...
Especially how Apple is now leading the transition, and they are well known to love locking down hardware and make it difficult for their users to do whatever they want with their hardware.
I love it! They got complacent and put the bean counters in charge. It's time for the engineers to lead the charge again. Any competition is a win for the consumer. I'm hoping they come back swinging, because it just means that AMD's utter domination and Apple's next round of silicon will have to compete even harder, giving all us nerds the wonderful excitement of having nice leaps and innovations again!
True Intel is getting trampled right now
honestly they kinda had it coming
Lack of adaptation kills even the mightiest of companies. It's unfortunate that companies that have risen to the stars drop like flies just as quickly.
Though I retired from the semiconductor industry 20 years ago, I still follow the technology and when I first got my hands on an M1 white paper, I must admit I didn't believe Apple could pull it off. Kudos to the ASIC Engineers who did this...absolutely unbelievable!!!
What would have been the most challenging hurdle they faced in designing the M1 chip?
Would love to hear more of your thoughts on this!
Very late to the party.
Only found this channel a month ago.
Such great work. I’ve almost finished all content and look forward to what’s coming next.
Amazing channel, describing amazing people and events.
The quality of this and other similar channels makes you wonder about the future of classic TV programs.
same
The production of this blows away a lot of the crap on free-to-air lately
TV is dead. There are so many world-class documentary-style channels on YouTube! I just recently discovered a new channel, History of Earth. The production quality is as good as TV! czcams.com/channels/_aOteuWIY8ITg7DQQspG1g.html
what is a "TV program" lol?
Seriously, the last time I watched a TV program was in 2015... I can't even watch it if I wanted to; I excluded it from my cable package to save money...
"regular TV" has been dead to me for several years now. If I do need one of the major networks for sports, that is free over the air. I only pay for internet now.
This channel has been consistently at the top of tech news for such a long time! Cheers!
Apple started an
ARMs race.
🥁
I love how in the 80s they're talking mad tech on the news/television.
But now it's all politics
Those were clips from Tech News shows. That's all they talked about. Do a search for "Computer Chronicles."
A more advanced society in some ways
Mad tech? What’s that
@@BeachLookingGuy a language millennials wouldn’t understand.
The M1 is colored black because it just came back from Intel’s funeral
God this cracked me up
LMAO😂😂😂
savage
lol. Teardowns show silver. IDK why they show black in the promo material
Nice!
Thank you for creating such informative and high-quality content videos each time.
you keep popping up in my autoplay. Your videos are good for listening to while I'm at work. Thanks for the upload. Also, subbed
I can imagine the very tense high level meetings that are happening at Intel right now 😂
Nah, they're pretty chill for some reason. They're still acting like they're on top.
@@thelonelyowl2654 Fake it until you make it
I think most of them are polishing their résumés right now.
They have MBAs running the show, that's usually a disaster for anyone other than themselves
I bought an AMD CPU for the first time this week 😂
I am taking a moment to appreciate the free knowledge that exists on the Internet
It isn’t really free.
@@dwyk321 you pay with your soul?
@@dwyk321 you only need to pay for the internet service and maybe view some ads every now and then; the amount of value you gain in exchange is mind-blowing
@@batmaneo your time is free? wanna come over later and bag up my leaves? ;)
@@batmaneo If the product is free, YOU are the product.
Where do you source all your research from? Your level of detail in your videos is awesome!! Thanks, and keep up the great work!!
That was a REALLY well put together video! Great job, & thanks for teaching me a few of the “in the weeds” details.
An Intel CPU would make a great heater during the winter season
Lol
*Puts hands near my 9900k and rubs hands together*
I'm dying, how the turntables eh
@@qualtrox7137 Yes, vinyl records are making a comeback though... ;)
I would use an AMD Bulldozer CPU for that, to be honest. Way more heat.
TSMC also deserves some credit.
Without them the ARM and AMD chips couldn’t dominate as much.
And ASML deserves credit in turn, for their EUV lithography tools. Without that, TSMC couldn't manufacture at 5 nm.
What's TSMC?
@@m.design TSMC Is a manufacturer
@@m.design The world's largest pure-play semiconductor manufacturer. It's no exaggeration to say that they are one of the major driving forces behind computation of the modern world. They don't design products themselves, but instead take orders from fabless companies and make it for them. Throughout the process, TSMC's engineers provide their input on the designs of their clients' products. You can imagine it as a back and forth group project.
@@m.design TSMC is the Taiwanese Foundry that manufactures the M1 & A-series chips for Apple using the 5 nm process. They are essentially a contract manufacturer that makes chips, and account for nearly 50% of the global chip-making market share at the moment. Along with Samsung, they are the only other manufacturer capable of producing 5 nm chips commercially today.
This video is so great and detailed. Thank you so much! :) Great value!
9:28 - "Simplified CPU's, such as ARM, generally will do one single instruction per clock cycle. While desktop chips may use many cycles to complete one complex instruction. This means more power consumption, less efficiency, and more heat produced."
...No. Both types of CPUs mostly use simple instructions that take one clock cycle. However, desktop CPUs offer many extra instructions that the ARM does not. For an ARM (RISC) to perform equivalent complex tasks, it must string together long sequences of instructions. Not only that, but a desktop CPU is loaded with specialized hardware acceleration to make its complex instructions execute even faster. This means the RISC requires far more clock cycles for complex tasks.
RISCs are successful because those complex instructions are rarely needed. Usually, CPUs just shovel data as fast as they can. And in that contest, a RISC wins easily.
Thanks for pointing this out. There’s another benefit, equally important. In order to explore instruction level parallelism, pipelining fixed length instructions (such as the ARM ISA provides) is much easier than the x86 variable length instructions. That’s the reason intel started decomposing those in smaller fixed length ones (microops) and reassembling them back into a normal x86 instruction ever since the Pentium Pro era.
My understanding (limited) is that each Intel core actually does between 2 and 4 (and in rare cases 8) double-precision floating-point operations per cycle, because each core actually has more than one execution unit.
I think the thing that is really telling in this video is the fact that they say they can't even tell a speed difference between software compiled for different hardware. Such a thing is not a sign of amazing hardware; it's a sign of really bad compiling and/or programming....
A lot of this is just silly. I mean, my laptop is 8 years old, still has more than 10h of battery life, is a fanless i7, and is plenty fast enough for me. Granted, it was like 6k new, so this is really a question of affordability, not ability... and affordability and Apple are not synonymous; neither is reliability
So you're saying Intel is "inefficient". (..That works on so many levels 😆)
That's not really true for AArch64. Its instructions are similarly complex to x86 instructions, except it lacks some of the legacy parts. The main difference is the lack of memory operands, but that's more than compensated for by the larger register file. Both designs do in fact look very similar internally, and both the number of instructions needed to achieve common tasks and the number of instructions available in the instruction set are comparable.
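The micro-op discussion in this thread can be sketched in a few lines. This is a toy model with assumed cycle costs (one cycle per simple operation), not figures from any real CPU: it just shows how a single CISC-style memory-operand add and the equivalent RISC load/add/store sequence end up doing the same three units of work.

```python
# Toy model of the CISC-vs-RISC decomposition discussed above.
# Cycle counts are illustrative assumptions, not real hardware figures.

memory = {0x10: 5}   # a tiny "RAM"
regs = {"r0": 7}     # a tiny register file

def cisc_add_mem(addr, reg):
    """CISC style: one architectural instruction, 'add [addr], reg'.
    Since the Pentium Pro era, the decoder splits it into fixed-length
    micro-ops (load, add, store) before execution."""
    micro_ops = ["load", "add", "store"]
    memory[addr] += regs[reg]
    return len(micro_ops)  # assumed cycles: 1 per micro-op

def risc_add_mem(addr, reg):
    """RISC style: the same work is three architectural instructions,
    each fixed-length and assumed to take a single cycle."""
    tmp = memory[addr]      # LDR tmp, [addr]
    tmp += regs[reg]        # ADD tmp, tmp, reg
    memory[addr] = tmp      # STR tmp, [addr]
    return 3

cycles_cisc = cisc_add_mem(0x10, "r0")  # memory[0x10]: 5 -> 12
cycles_risc = risc_add_mem(0x10, "r0")  # memory[0x10]: 12 -> 19
print(cycles_cisc, cycles_risc)         # both paths: 3 operations
```

Either way the back end sees the same three operations, which is why the two designs look similar internally; the practical difference is whether the split happens in the compiler (RISC) or in the instruction decoder (CISC).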
Thank you for including some clips from our channel. We are witnessing a computing revolution!
Is this Vadim or Max?(sorry if I spelt any name wrong,)
you’re welcome, no problem. any time. :)
And he didn't give you credit for it
Your channel rocks, guys, Vadim and Max!!! ✊🏼
@Coldfusion y u no give credit?
I bet "Intel's inside" is now on fire.
Intel is attacked from 360 degrees.. 2020 is the worst for them. AMD stabbed them, now AAPL killed them.. dang... 😅
yup, I have an intel distributor code and they give us "points" if the importer sells more (for years now)
their new strategy? They doubled the points.
lmao what are they thinking?
More like Intel Outside
intel has been AMD's punching bag for the last couple of years. Now it's Apple's turn :D
@Amit Ezuthachan yeah, it seems they can't be so stupid as to completely ignore their surroundings
Great production telling the story of these chips. I'm excited about M1!
Hey Dagogo, I am thoroughly enjoying your channel. I love the new knowledge you are giving me, and in such an enjoyable and captivating way. Thank you, Dagogo.
Cheers Kev
Apple: *Launches M1*
Intel: Dead Inside
This is too underrated.
I am sure others are about to launch new chips too
apple morons believe this ridiculous video: PC Builders:
@@terrobert you don't need to be some sort of smartass to know how to build a PC. It's nothing hard; it just takes, you know, time
first amd and now apple
Not investing in R&D is investing in self-destruction
It is really cool to be able to witness this kind of development in the industry. I'm excited to see what comes next
I remember watching ColdFusion back in freshman year 2012. Happy to still see the channel's growth and success!!! Keep it up, Dagogo
The issue with ColdFusion is that every video topic is so damn amazing, I waste most of my time deciding which one to watch first
And I thought I was the only one like that.
Lmao
Exactly. I try to do 3 videos I like and end up seeing more I want to see before the other 2
😂
This made me subscribe
I’ve peeled my Intel i7 sticker off my laptop out of shame.
Same...
hahaha nice one
lol noice
Hahahahaha 🤣🤣
**hides in a hole with my i3**
Top-class research. After watching the full video, I truly felt that this should be a lesson in an electronics class
Compatibility: to be fair, there are _some_ apps that have trouble running on M1. But their number is very limited and it's very specialized software (e.g. some images for docker)
Not anymore, I have everything I need installed right now
A developer issue..... they exist on Windows too; it's not just Apple. If anyone played Mahjong on PC, you'll know there are graphics issues if you don't change the resolution. Experienced that when we upgraded to Windows 7 back in the day. It IS an XP game, after all, so probably expected.
This feels like the first actual big innovation from Apple in a long time. I want one
Apple has been making industry-leading chips for quite some time now. The only thing is, they were in iPhones so far where you can only do so much on the small screen limited by iOS. Now that they are on laptops, the full potential is finally being unleashed.
I agree... I am PURE APPLE.... and I keep watch on it...
Pretty sure AirPods and the Apple Watch were also revolutionary and changed the entire market
@@well7885 but they can never reach the power of a desktop chip. x86 will continue to rule the roost because of the graphics card rollercoaster it rides upon. Windows PCs will continue to crush Macs for the foreseeable future because they are the cutting edge. Not ARM, which is only good for mobile applications.
@@ag3ntorange164 this comment is gonna age so badly. Go check out the reviews of the M1 chip, which are the lowest-end laptop chips and already have the best single-core score ever. I request just one thing: come back to see this comment in 2 years and you will realize how awfully wrong you were.
“As we set about designing the ARM, we didn’t really expect.. to pull it off.”
🤣 what a legend.
To be honest, every innovator or inventor before the 90's is a legend, because NO ONE expected them to pull it off.
They don't mention Roger (now Sophie) Wilson who was the real brains behind ARM architecture.
The best moment was when he realized the chip wasn't even connected to the power supply and still working 🤣
Honestly, that was the same thing my project group thought when we took on the assignment of making our own 16-bit RISC CPU, but it was surprisingly simple! And we all got the second-highest grade on our project exam. But I assure you Apple's chip design is much more complex than ours was 😛
My first computer was an Acorn Electron. My next computer will be a 14 inch Macbook Pro with 4 ports.
This is very insightful. What a brilliant video. 👍.
Thanks for doing this vid. Very interesting tech/history lesson.
Not to be confused with “AARM” or “the assistant to the assistant to the regional manager”.
Hilarious bro
HAHAHAHAHA U GOT ME THERE BUD
dwight!
I don't get it, care to explain low earth creature?
@@dealerovski82 The Office tv show quote.
Intel : We don't think it's worth putting our chips in Iphones
Apple : So, you have chosen death.
Yes, because they'd rather put it in servers and supercomputers.
There are some Intel based android smartphones.
@@ArunG273 - I’m sure all 600 android users are happy.
Made in China XD
They RISCed it but they ARMed it
Well made content. Thanks for making this
You are awesome at what you do. Thank you for these videos.
People, including me, went from passionately hating on Apple to seriously considering buying an Air or Pro M1, that's how much M1 changed the market
I kept thinking that Apple users are snobs but damn... I'm impressed as hell with this.
Yup bought my first macbook a few months ago. Got an iphone too.
@@emp437 The sheep says what?
@@emp437 good choice. going for a same
@@newguy3588 look, it's one thing to have a brand that shows your status.
It's another to have a good piece of hardware.
To have both should impress you.
Even I used to dismiss Apple as just a fashion brand.
About time they made me look in their direction without disgust on my face.
**INTEL, thank you so much for the severe f&*k-up on your part otherwise we likely wouldn't be here...**
Intel saved Apple from being the f-upper.
When companies get egos and think they're untouchable, this is what happens historically.
@@bjkina
When Apple tried to return the favor for the iPhone chip, intel said no.
Intel isn't going anywhere. Even though they're still on their ancient 14nm process while AMD, Apple, Qualcomm and basically everyone else use 6nm to 8nm processes from TSMC and Samsung, Intel is still competitive and manages to squeeze more and more performance per core. Imagine what would happen if Intel just moved what they have to TSMC's 7nm node. Well, they are actually considering doing just that. At the same time, typical Mac workloads don't require the single-core performance Intel was chasing. So it's not a new paradigm of PC hardware; it's a new type of PC, which was actually pioneered not by Apple but by Surfaces and Chromebooks, which used ARM long before Apple. Yes, this new type of PC will be better at those types of workloads and will take some of Intel's market share, but it's just a better option for that kind of tradeoff, sacrificing single-core performance for power efficiency. But good luck playing Guild Wars 2 with its heavy world thread on those new M1 MacBooks.
@@AntonMochalin What are you talking about? These chips are beating everything hard in single core. They are the fastest single-core CPUs you can get. Only high-end Intel/AMD can beat them in multicore (because of a ton of cores, lol). This type of PC will be better for 99% of users. There may be an application for CISC elsewhere, but it will not be mainstream computing.
It's the future, I say this with confidence, especially now that we've got the M2 chip. I can't wait to see the day we get native ARM-based chips from AMD and Intel, perhaps even custom chips direct from Microsoft and other companies who have also licensed from ARM.
I love your storytelling. Have recommended this video to so many of my friends.
How about making a follow up video on intel’s new strategy in lakefield chips and the future? In competition to m1?
Intel hasn't really launched a direct competitor to the M1 yet in the lower-power categories, so we won't know until their later chips come out; they said those are supposed to be more of a direct competitor.
@@quantuminfinity4260 they just released, you guessed it, an incredibly power-hungry, hot chip! Big surprise! Barely beating the M1 while drawing 130 watts: they just won’t learn
"More power with less power"
- ARM
Now we need a battery revolution. Still waiting on graphene
Honestly yeah, where's all the graphene tech? It was all the hype 5-6 years ago and then nothing. I know they were having issues producing it at scale, but I thought they'd have figured it out by now
@@MrUltimateX graphene is so good, the companies will start losing profits.
and the next battery is more likely some other metal component, not graphene
I think with batteries it's more difficult because, by the laws of physics, we are reaching an energy density that is just too risky to be portable. So I think batteries will stay at the same-ish levels they are now, but as we see with the M1, the components still have much optimization potential. Once we can build screens, processing units etc. which run with minimal heat generation, a laptop with a present-day battery could run 40-50h easily, I think. I hope I am wrong, though, and there is a way of making a high-energy-density battery that just by the laws of nature can't discharge in any explosive manner
@@amarug There's probably a way to make a high-energy-density battery unlikely to fail catastrophically, at least unlikely enough to make it an acceptable risk for the convenience; people still use cars even though they are sometimes deadly.
Basically you don't need to reach 100% safety, just close enough.
Another way to reach a similar amount of convenience would be a fast-charging battery: it wouldn't really matter if you only had a few hours of peak performance if you could just stop near a power source for less than a minute. Maybe if you were going off the grid it would matter. Then again, transferring a large amount of power in a relatively short time probably has its own safety issues, but at least you are unlikely to be holding the device in your hand when it explodes.
Or maybe some sort of universal, easily swappable battery, so you can just switch when depleted. A little less convenient than the fast charge since you probably can't bring spare batteries everywhere, but I do have a friend who already does that.
But yeah, it might be easier to just make stuff more energy efficient.
Very interesting and informative. Thanks 👍
If i was Intel i would be in serious panic mode right now.
First AMD, now Apple blowing Intel out of the water in a short amount of time.
They really need to freaking step up. They've gotten lazy af
Their processor manufacturing expertise is something to be positive about. Have to see how investors value Intel over the next few years
And Nvidia bought ARM, but they're not making any new hardware yet, lol.
coz Intel is a Goliath, such a big giant monster but kinda slow and stupid. AMD, on the other side, is David, tiny yet deadly. And ARM is an impostor.
@@mayurgianchandani1184 Just gonna throw this out there but Intel is in their current position for being overambitious LMAO.
Thanks for remembering Acorn. I am not the least surprised by the M1's speed; I had an Acorn Archimedes. At last the world is catching up!
Very informative. Thanks for sharing.
such quality documentation well done
I've been a PC/Android fan all my life, and I take my hat off to Apple for achieving this.
I've been a PC enthusiast all my life, till I wanted to know for myself and bought my first Apple device. Expensive, overpriced, greedy, slow: that's what I thought.
Well, I was wrong. Best decision of my life; it changed everything.
@@JohnSmith-pn2vl Similar story with me. I lived in Seattle, a couple of buddies were Microsoft engineers that worked on Windows. Loyalty required staying on Windows until I was forced at a new job to use a MacBook Pro. After complaining for a week or two learning to use the Mac, I realized that for me it was much better. That was 15 years ago. Since then I only use windows when a specific application requires it.
I work with both a MBP and a Windows PC. For the same price, the Windows machine won out in performance for what I was programming. However, this M1 chip has blown me away, and the new MBA, I believe, can beat 99% of Windows laptops. This new chip is powerful.
Same. Loved Android hated apple, made the shift to iPhone 7 because one plus 6 wasn’t available and I needed the device urgently. It’s a really good phone IMO. I’m also interested to shift from Windows PC/ Linux Laptop to a Mac Mini.
until you see the price tag
When it's coldfusion content, only one thing comes in mind. *Quality Content*
There is no quality in just references to other YouTube channels: no unknown footage or stories, no new ambient music, no nothing. Where is the quality you are talking about?? For me, what I can see is that Dagogo is running out of money to pay for writers and content creators, just like most influencers out there. Welcome to the real world.
This video is kind of wrong.. the CPU is not a desktop replacement, and isn't changing anything but laptop battery life for a few people
@@dertythegrower it's... Faster.
@@thisisntsergio1352 no it's not.. AMD beats it in Cinebench benchmarks, kiddo.... also any new Ryzen CPU laptop with an Nvidia 3080 completely stomps Apple everything
@@dertythegrower Apps right now are running under emulation, but the M1 beats Intel. After a year or so the M2 will be released and it will be faster than AMD. You can tell by the graph it is increasing 300% each year.
I know of one difference in architecture between the early Motorola chips (used by Apple) and Intel: the Intel chip had to outsource math functions to an external coprocessor chip, then wait for the answer. This was time-consuming and generated extra heat.
The Motorola (Mac) chip had this function built in. Just this one difference made Mac graphics way better than Intel's.
I also heard that originally IBM wanted to use the Motorola chip in their first PC release, but Motorola couldn't manufacture enough chips to satisfy the order, and Intel could. Can you imagine how that would have changed history?
That was like a movie. I really enjoyed every second of it including the Brew ad. Wow.
I can see the scams now:
"How to download and install ARM chips for PC FREE Download"
“Is your computer getting slow? CLICK this button to install an extra ARM”
Lol!
If I had an extra ARM for every one of those scam ads I've seen...
@@danielcitrin-ozan9722 You'd have 2 ARMs and 3 legs
Please post a link to the video of "How to download and install ARM chips for PC FREE Download" I need more speed for my laptop....
LOL jk
In my computer architecture class, a common question was: which is more recent, CISC or RISC? And almost everyone got that one wrong. People naturally assume whatever's more complicated must be more "advanced" or newer, when no, it turns out the innovation isn't making it more complicated, but doing more with simpler instructions. That's what the innovation is. That lesson ought to be applied to lots of things :)
Yes, yes and yes. I always say the truly wise man knows how to take complicated, abstract concepts and simplify them to the point where others can understand them.
At first, RISC instructions were simpler than CISC: with simple instructions you do the same job, but the job itself is fragmented into many instructions, each cheap to execute. CISC, in contrast, does the job step by step with complex commands that overload the processor and consume much more power. Paradoxically, nowadays RISC instruction sets are themselves quite complicated, so that a job can be fragmented into the fewest commands to the processor possible.
RISC vs CISC is irrelevant now
IOW, “Less Is More”!
CISC made sense back when compiler optimization was primitive and it was assumed that all complex programs and OSes would be written in assembly. Back in those days, an advanced CPU was expected to minimize the "semantic gap" between human programmers and assembly. So the designers of the day weren't dumb or lacking in innovation; they were designing for a different goal.
Best cost-benefit in computers, at least for me. 100% approved. Nice design, fast, smooth to use, battery lasts more than 15 hours of work. As practical as a phone: you take it from the bag and just start using it from where you were, with absolutely no lag. It's really open-and-use. Nice nice nice
How incredibly well explained this is
I tip my hat to your professionalism. You provide content which is very interesting, and the way you present it is very academic and professional. There should be more content like yours nowadays. Thank you, Mr Altraide
I agree. This seems like documentary making rather than a YouTube tech review.
You can say that Apple took the RISC.
Apple played an even more fundamental role in the development of ARM. In 1990 a joint venture between Acorn, Apple and VLSI gave birth to Advanced RISC Machines Ltd (later renamed ARM Holdings). The first Apple product featuring an ARM processor wasn't the iPod as claimed in this video, but the Apple Newton in 1993, more than 8 years before the iPod actually.
Because of my work I still can't do without a real PC, but I recognize, and this is what has changed a lot compared to the beginning of smartphones and tablets, that I use both devices more and more to replace my Windows computers.
Mobile systems are fast, reliable and simple to use; moreover, they are used with the fingers, they are portable, and I can take my tablet or my smartphone everywhere, whereas for my PC it's a little more complicated. And as he says very rightly in the video, they are more energy efficient: my tablet and my smartphone can easily last a day on battery.
And it all seems to evolve very quickly, so I can't wait to see what the future holds for mobile systems in general
Thank you Dagogo
I don't press the SUBSCRIBE button often, and almost never click the bell icon. Before this video ended, it made me click both of them. An engaging, research-packed video. 👍
Competition means innovation. We can finally see PCs and notebooks getting better
Nah maybe for you... lol
And in Apples case it means top notch marketing propaganda, sheeple and fanbois 😄😄
Don't you love it when someone does something good, there's always that one person who keeps putting them down
also means unethical marketing strategies, slave labour, releasing multiple models which are basically the same thing but with ever-increasing prices, and damaging the environment for greed. Etc
@@mhtbfecsq1 no, that's capitalism xD
Imagine if these laptop ARM chips became mainstream and a gaming laptop was built out of them.
I give it 10 years max and we will be running ARM Linux desktops from Microsoft.
@@mauriciosl Microsoft has been playing around with some of their components on Linux, like DirectX. Putting DirectX on Linux is one huge leap toward portability across platforms without having to rewrite everything just for OpenGL, which went stagnant.
@Jaskaran Singh dev notes from who?
You can game on those M1-powered MacBooks anyway, lol, just not with that much GPU power; apart from that they're basically the same.
Give it about 5 years
This has convinced me to look at the ARM (vs. x86) for my next laptop.
Great analysis! It is crystal clear that RISC is the future. I can't wait to build my first ARM-based desktop PC
The tech community : Moore's law is dead
Apple : Hold my M1
@Bokang Sepinare Bullshit venturebeat.com/2020/01/07/a-bright-future-for-moores-law/
Don't believe Apple's claims lol. Leaks have shown that the M1 performs well compared with Intel-based Macs, but only performs half as well as the Ryzen 7 4800HS and Ryzen 5 3600X
@Bokang Sepinare In actual real-world head-to-head tests the M1 is a "middle of the road" capable CPU, not underpowered by any meaning of the word, but also nowhere near the fastest or most powerful.
It will be dead once it hits 1nm. I think 3nm EUV will be its limit
Can you tell me what Moore's law is?
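For the question above: Moore's law is the observation that the number of transistors on a chip roughly doubles every two years. As a quick back-of-the-envelope sketch, here it is as a formula in Python, seeded with the M1's published figure of about 16 billion transistors; the projection is just the doubling formula, not a prediction:

```python
# Moore's law as a formula: N(t) = N0 * 2 ** ((t - t0) / T), with T ~ 2 years.

def projected_transistors(n0, year0, year, doubling_period=2):
    """Project a transistor count forward (or backward) from a known point."""
    return n0 * 2 ** ((year - year0) / doubling_period)

m1_2020 = 16e9  # Apple's published transistor count for the M1 (2020)

# Ten years of doubling every two years is a 2**5 = 32x increase.
print(projected_transistors(m1_2020, 2020, 2030) / 1e9)  # 512.0 (billions)
```

Whether the doubling can continue at 3nm and below, as debated in the replies above, is exactly the open question.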
I think it's safe for Apple to include stickers in their MacBooks with "Intel not inside"
Except their highest end model is equipped with an i7... Doh
@@DurzoBlunts i9
@@DurzoBlunts i7 oh no no no the i9 cooking machine
"Intel: Dead Inside"
There is still a lot of Intel-licensed stuff in each MacBook and any other personal computer, no matter which CPU is in it.
Got my M3 Pro MBP a couple of weeks ago, but still in love with my M1 Air; going to keep it as long as possible
Awesome work, once again
This single video is far better than all the other reaction videos combined. It gives more insight and an in-depth analysis of what actually happened and how it happened.
Intel got too confident and laid back before the 6th generation.
Comparing the 7700K with the 2700K is just hilarious. Both are quad-core processors.
The 7700K (2017) offers a mere 30% more performance than the 2700K (2011). Intel needed almost 6 years and 5 generations of processors for such a small increase!
The 7700K is furthermore famous for its heat generation and power consumption, which limit its overclocking potential even with a strong cooling system.
The 2700K on the other hand had a lot of overclocking potential and easily reached the same level as a stock 7700K with decent cooling.
Then Ryzen came out of nowhere and Intel was finally forced to stop its policy of milking its processors, which had been unrivaled for many years.
And as far as we've observed in the past few years, they weren't even prepared for this case.
@@JackoBanon1 Yeah, the 2700K was my last Intel CPU. I skipped a few generations, then upgraded to AMD last year. And I've been complaining that there's been no innovation in the desktop world for the past decade.
I hope servers will use some ARM chips soon.
I have a 6700K. For sure my next CPU will be an AMD. Intel is just a greedy shit company.
@@JackoBanon1 This is why competition is good. Intel has purposely held back their big changes (such as shrinking from 14nm) since they've had no real reason to make big advances. People buy their stuff every year even with 5% increases, so they just cruised along. AMD gave a solid push in competition, but Apple is gonna be the big change needed to get Intel to actually improve.
@@nikkjcrespo Yeah, Apple's latest M1 processors are scary and could change the processor market drastically in the near future.
Awesome vid. Kudos! 👏🏻
*and here it's extremely smooth*
proceeds to only zoom in and out a bit
I used to write ARM assembly 20 years ago, making my software about 5 times faster than using regular C++ compilers at the time. It seems like this is a skill I might pick up again for even more screaming performance. The ARM instruction set is one of the most beautiful instruction sets I ever encountered on a CPU. So much nicer than Intel 386+, and even more elegant than Motorola 680x0. It's a great future ahead for computing!
Assembly was always screamingly fast, be it ARM or any other. There has never been anything as good as direct assembly code. It has just become less necessary as chips got faster and there was more memory for everyday applications. It has also been complicated by needing your software to allow for different people's machines with different software and hardware setups, especially where there are multiple manufacturers, i.e. non-Apple. But I agree re native power being unleashed by assembly.
I had done a bit of C64 assembly back in the day, and a bit of AVR 8-bit code during my studies. But when I came to my first ARM7 project, I remember being stunned by the elegance of the instructions, e.g. the bit shifting. I felt that this was a completely higher philosophy of thinking, and I liked their approach from the first moment.
Now I have just bought my first Apple Computer, and as a Linux guy, this was entirely because it's just great technology.
Well of course no widely-produced Instruction Set Architecture (ISA) is as ugly and outright *klugey* as Intel's x86, but... is even the latest ARM ISA as sweet, and so completely *orthogonal,* as that of the ADSP-2100 family? Admittedly a fixed-point DSP, NOT a general purpose CPU, it can be used as a microcontroller in many cases, and almost makes you want to eschew the C compiler and do it all in assembly.
@@tarunarya1780 That's not the whole truth. C compilers will take the CPU architecture into account, while you have to be extremely knowledgeable and spend a shit ton of time to get to that point. Your average programmer has no chance of getting close to a good C compiler.
@@domenickeller2564 I think you meant that comment for sci fifaan. I think compilers are a good thing, and lots of projects would be too difficult and big to manage in assembly, never mind coping with the vast amount of hardware out there. Even programming in VBA can seem overly tedious. Roll on AI. There is a role for selectively doing some parts of programs in pure assembly, depending on the speed of operation and the bottlenecks faced.
Can you do a video "The Rise and Fall of Flash Player"?
Steve Jobs put the kibosh on them
About Rosetta 2
"If Rosetta 2 works, you won't even notice it ever existed" -Angela
HTML5 killed it lol
Incredible video coldfusion :]
Amazing video Dagogo !
Clayton Christensen died earlier this year and didn't get a chance to see one of the largest companies completely disrupted - something unthinkable when he wrote the book and set out exactly how it would happen 20+ years ago.
He probably knew
@Nida Margarette Most of them focus on the same synthetic benchmarks and the video-export use case. Does e.g. Zoom use Apple Silicon-specific hardware extensions for video decoding/encoding?
I hate the way apple does business and never bought their products but I love this new chip. Amazing.
why? You hate that they make all-encompassing tech that anyone can pick up and use without needing to optimize or change it? Or that they have an iPhone for everyone at just about every price point? Or that they literally changed the PC landscape forever a few weeks ago yet didn't raise prices by even a DOLLAR on any of the new Macs? I don't understand.
That's exactly why I hate the fact that it's an _Apple_ chip...
Nevan Hoffman is part of the problem.
I feel you man. I just hate Apple as a business but damn are they good at what they do.
The fact that we have to make it clear that we hate them before acknowledging or appreciating something about them shows that WE are afraid of fully admitting the particular thing we like about Apple, otherwise we'll be classed as "sheep". It speaks volumes.
Windows and Android user here, and the M1 chip is on a different level. I bought my wife an iPad Air with the M1 chip and was so impressed I bought myself one. It's amazing having a tablet with the performance of a £2k PC for £900, and it's the best for cheap video editing... It was only after watching this video that I got interested, and it genuinely works. I can't believe how fast the M1 chip is...
Fantastic video explaining M1 chips, thank you
I use both Windows and Macintosh; I don't believe in "brand loyalty."
Me too 👍🏻👍🏻
Same. Using iPhone + Linux + Windows. Keen to replace my Linux laptop with a Mac, but that's it. I'm not replacing my Windows PC because I'm comfortable with it.
I believe in better at a sensible price... A $1,000 wheel is ridiculous, while this year's iPhone is kind of a must-buy because Apple finally gave preference to ruggedness over fragility for the sake of aesthetics. They raised the frame so the screen doesn't come into direct contact with surfaces, making it longer lasting, while Android OEMs put the screen above the frame and call it 2.5D glass. If you drop it, the glass is done.
I Boot Camp Windows on my Mac. Don't know if that will still be possible in the future though.
Me too. Just got my M1 Mac mini for work and everything, and a gaming rig with an AMD Ryzen 7 for gaming.
Joke. People still use intel lol
New Apple tech now costs an ARM and a leg.
lmao XD
Sounds a bit RISCy
They made a faster chip and didn’t increase the price of the computer..?
It's much cheaper than any Windows laptop with similar performance. Now there is really no point complaining about Apple's prices.
ROFL
I love the quality of editing and research that you guys do. That's the second time I'm watching this video because this topic is so interesting.
This feels like YouTube Premium, always the best quality videos
Apple is confused about whether to increase or decrease the prices of their products 😂
@Anurag LOL... do you even know Apple, bro?
@@atiqshahriarshourav2958 Do you? You clearly don't understand how Apple operates now
@@fanban2926 yes, they make each piece separate so you buy more 😂
@@atiqshahriarshourav2958 You don't understand Apple. They literally released the best budget phone, the iPhone SE, in 2020 for $400, and the iPhone 12 mini for $700, and also reduced the iPhone 11 to $599 and the iPhone X to $499.
@@rafalb692 They overprice their stuff
This might be revolutionary. Like Apple's ways or not, they're where they are for this kind of reason. I wonder how it will inspire other manufacturers.
*Apple does nothing for 10 years* : I sleep
*Apple does 1 thing in 10 years* : Apple is revolutionary
@@RaskaTheFurry Android in 2013 be like: we don't need 64-bit processors for these small little phones. It makes no sense.
@@RaskaTheFurry Dude, if you watch the video you can see Apple's been doing the whole ARM CPU thing for a while; we just didn't pay attention because Intel was better. But now that's not the case, so it's a big deal.
@@BloodSprite-tan Then it wasn't Apple, but the manufacturers that made the architecture possible.
@@RaskaTheFurry you need to research the product to perfection before you can release it
Not everyone hates Apple these days.
I'm watching this on a fully base-model MacBook Air, hooked up to a 4K monitor, with multiple tabs open, and I'm playing Borderlands 3 at high resolution and high FPS. Madness. Never thought it possible before.
This dude could make a video about plumbing sound all sinister and secretive
How One Plumber Revolutionized Pipes
I own myself. Have privacy... And life has never been better
Welcome to 2030
www.tamperproof.earth/post/welcome-to-2030-i-own-myself-have-privacy-and-life-has-never-been-better
Never thought I'd see the day when the word "Grandma" is synonymous with "INTEL".......
Burrrrrnnnnn
Lol facts
Facebook is up next
I was surprised no mention of Apple's Newton came up, or of the importance of ARM in the PDA market of the '90s.
I think it's important to explain why at 8:30 he says the chip was still running. It was running on the extremely low voltage introduced by the measuring device; it wasn't just running off God's will.
This is why it's important to have competition. Intel has been at the top of the food chain for many years, but recently AMD has caught up, and you can even argue that they're more performant/efficient. And then there's ARM, spanking both chips.
I'm pretty sure Intel is shitting their pants right now
Argue? We are past that point by half a decade. Intel is just shit.
Well, the M1 is only spanking 'em on efficiency, but AMD rules the brute-force market.
ARM is owned by Nvidia…….
@@Ryang170 Not yet....
@@Yeet42069 lol my first thought exactly. AMD is crushing Intel and it's showing in every test & benchmark. Intel's been great, but AMD cleaned em out this last year.
Love your content and quality. I wish you an awesome 2021.
I love your videos I love everything about it we need more
My first project, as engineer, included a 8080- software and hardware design. Swr on MDS.
It was a success. Since then I have a fondness for Intel.
I wish them a great success!
I just bought an ARM 64-bit 4-core computer for $100. The Raspberry Pi 400.
You should've bought the Model B, it overclocks like a champ with a puny heatsink
i was thinking of raspberry pi as well.
I have a Banana Pi with a dual core, and it is good for Excel, Word, light browsing, etc.
@@fwefhwe4232 ARM devices should be like this: a sidekick for light loads, not a $1000 machine that's powerful but fully locked down by the company
$5 PiZero is good enough for me.
@@yudhobaskoro8033 The RPi 400 is overclocked and it can go up to 2.2 GHz