Now we know the SCORE | X Elite
- added Feb 29, 2024
- Qualcomm's new Snapdragon X Elite benchmarks are out! Dive into the evolving ARM-based processor landscape, the promising performance of the Snapdragon X Elite, and what this could mean for the future of tech.
Run Windows on a Mac: prf.hn/click/camref:1100libNI (affiliate)
Use COUPON: ZISKIND10
🛒 Gear Links 🛒
* 🍏💥 New MacBook Air M1 Deal: amzn.to/3S59ID8
* 💻🔄 Refurb MacBook Air M1 Deal: amzn.to/45K1Gmk
* 🎧⚡ Great 40Gbps T4 enclosure: amzn.to/3JNwBGW
* 🛠️🚀 My NVMe SSD: amzn.to/3YLEySo
* 📦🎮 My gear: www.amazon.com/shop/alexziskind
🎥 Related Videos 🎥
* 🤖 The end of Apple Silicon’s reign - • The end of Apple Silic...
* 🌗 Volterra vs M1 Mac Mini | Visual Studio - • Volterra vs M1 Mac Min...
* 👨💻 15" MacBook Air | developer's dream - • 15" MacBook Air | deve...
* 🤖 INSANE Machine Learning on Neural Engine - • INSANE Machine Learnin...
* 🛠️ Developer productivity Playlist - • Developer Productivity
Geekbench X Elite - browser.geekbench.com/search?...
- - - - - - - - -
❤️ SUBSCRIBE TO MY YouTube CHANNEL 📺
Click here to subscribe: www.youtube.com/@azisk?sub_co...
Join this channel to get access to perks:
/ @azisk
- - - - - - - - -
📱LET'S CONNECT ON SOCIAL MEDIA
ALEX ON TWITTER: / digitalix
- - - - - - - - -
#AppleSilicon #SnapdragonXElite #ARMProcessors - Science & Technology
JOIN: youtube.com/@azisk/join
Looks like the start of an ARMs race.
War, war never changes.
What a handful. I hope these don't cost an arm and a leg
and it would cost a leg....
Nailed it! 😂
I see what you did there 😅
Apple has dominated the ARM chips market. It would be great to see some competition in the market.
exactly. I like my MacBook, but I want to see more options too.
Hopefully on Linux it scores even better results; Windows on ARM is very slow and not optimized
I have high hopes for Qualcomm. They finally match Apple in CPU performance on phone SoCs and absolutely destroy them in GPU.
Edit: my bad, the SD 8 Gen 3 is similar in multicore, but lacks in single core. Thanks Alex for correcting me :)
@@pikagamer9676 Windows on ARM is actually fantastic; the problem isn't the software, as Apple Silicon proves on Parallels, it's the hardware.
@@eulehund99 Yeah, competition is great; hopefully the next Snapdragon will get a better single-core score though. But then Apple's A18 will be ahead again, so yeah 😅. Right now the A17 Pro single-core is about 20-30% ahead of the SD 8 Gen 3, which is a lot.
The perf/watt will be what really matters.
Another real-world issue is that this Snapdragon is ALL performance cores, so power draw in low-demand situations will be inferior.
It is likely inferior to Apple, and possibly even inferior to x86, even Intel efficiency cores running Windows.
I was thinking the same thing. The numbers look great, but when you factor in the number of cores & power draw this processor isn’t as good as the M3
@@skyak4493 Intel efficiency cores aren't more efficient than their big cores; they're just more area-efficient, so you can cram more cores into a given space. And until Core Ultra, where Intel closed the gap, AMD still had the efficiency lead in x86 compared to Intel, despite having no e-cores.
@@skyak4493 AMD's R7 7000 U-series chips can achieve better performance per watt than the M-series can, & they only have performance cores...
@@Lee.S321 Except that's not true.
1:30 I love how he carries the 5 and still gets the right answer.
😂
it's as if....
arithmetic is just different on ARM
Hi, thanks, this is a very interesting video, but I would like to clarify the actual x86 competition a bit: the 14900HX is in the same class as the 7945HX (they can draw 100W+), so they're not comparable to an M3 or X Elite. The more accurate comparison would be the freshly released Core Ultra 185H, which scores about ~2500 points ST and ~13500 points MT.
Anyway, we'll need to wait for more tests of the X Elite, specifically on battery life, as the performance is in line with the current generation.
I disagree; I think it's better to look at x86 at its best, even though it munches super hard on power, because then we will truly know what the tradeoff for power is. I'd rather know what I'm sacrificing in order to get better battery life. What I really want to know is, assuming this ARM chip has a GPU, what is it like? Does it get smoked by Apple's M-series GPU cores? Can it even compare to a low-end, power-hungry AMD or NVIDIA card?
@@nathanacreman632 The 14900HX and 7945HX are literally desktop chips slapped into a laptop with a more reasonable power target; they're designed for the guys who want a portable desktop more than a laptop. If you want actual mobile-first designs you need to look at Meteor Lake and Hawk Point; it doesn't matter if they perform worse relative to the M3 or whatever. Also, the Snapdragon is ARM, and not everything runs on ARM, so we gotta see how it ends up running emulated software. Also keep in mind it will release closer to the new AMD and Intel designs, so it will have to compete with those too.
@@nathanacreman632 Looks like my previous comment didn't go through, so I'll try again: no, the 14900HX and 7945HX are desktop chips slapped into a laptop for the guys who want a portable desktop; they do not have the same V/F curve. That's why you look at mobile chips such as the 8845HS or Core Ultra 185H. About Snapdragon GPU perf, looks like it's our lucky day, since a test has been conducted and it runs Baldur's Gate at 30fps in 1080p, quality settings not disclosed; if it's low, an 8845HS would get about 45fps for comparison.
Also, I'd like to point out that the X Elite is going to compete with new AMD and Intel designs, and both of them are going to bring significant perf/efficiency increases on mobile.
Alex, just wanted to say I love your videos. As both a Swift/SwiftUI and C#/MAUI dev, your videos check all of my boxes, and you do far more thorough videos than the average "export video wow look how fast" YouTuber.
I hope we also get mini PCs with the X Elite too. 😮
Certainly we will
I hope so too
A Raspberry Pi sized one, even!
There is no reason for a mini PC yet; the scaling is horrible (compare the 23W result with the 80W one: 10% extra performance for 250% extra power consumption; even if you pump 200W into this chip, the result won't be much higher)
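For what it's worth, the arithmetic behind that scaling claim checks out. Here's a quick sketch; the 23 W multi-core score and the ~10% uplift are taken from this thread, so treat them as illustrative, not measured:

```python
# Sanity-check the scaling claim: ~10% more performance going from
# 23 W to 80 W (~250% more power). Numbers come from this comment
# thread, not from measurements.
base_watts, high_watts = 23, 80
base_score = 12562                  # X Elite multi-core at the 23 W config
high_score = base_score * 1.10      # assumed ~10% uplift at 80 W

extra_power = (high_watts - base_watts) / base_watts   # 57/23, about 2.48x extra
eff_ratio = (high_score / high_watts) / (base_score / base_watts)

print(f"extra power: {extra_power:.0%}")     # roughly +250%
print(f"efficiency kept: {eff_ratio:.0%}")   # well under half the pts/W
```

In other words, under these assumed numbers the 80 W config would keep less than a third of the 23 W config's points per watt, which is why the scaling looks so bad.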
It's not just for personal computers. Right now ARM chips from Qualcomm are being used for AR/VR headsets, just like the Apple Vision Pro is using the M2 chip. Having just started to use the Quest 3, it's exciting to see where this tech leads eventually.
I love these videos, great job!
Glad you like them!
Great quick comparison. Thank you ! You are awesome ! (Another Chip Geek)
I'd love to see a core-for-core and clock-for-clock comparison of the processors.
Watt-hour for watt-hour is the most relevant
that's more or less what single-core is
@@schizofennec It's not. I'm pretty sure one core of the i9-14900HX draws much more power than any ARM chip, yet the performance is similar
What about performance per watt?
Yeah, this is the main focus in the mobile market.
@@urid1556 It never was; it's only an Apple buzzword. That is why they moved from the G4 to Intel, and from Intel to Apple Silicon. It's almost impossible to figure out what "performance" stands for in the context of what you are doing on your device. In one normal use case the display is what consumes the most power, and in another it's more important how fast you can complete a huge task rather than how efficient you are while doing it. I'm pretty sure that in terms of GPU workload Apple is actually behind MediaTek; their GPU is a lot more powerful, probably because they have better software behind it, but who knows...
according to Qualcomm's numbers, it should be pretty much in line with the M series in terms of power efficiency
Qualcomm claims 23W for the X Elite.
We'll wait for reviews over the course of the year.
So AMD's HX chips are usually used with a discrete GPU, while HS (or U) chips are used as-is (called APUs) as they combine a stronger iGPU. As far as I know, AMD has the overall lead on the iGPU side (if we exclude the Pro and Max versions). There is already a newer model (8940HS) that has only upgraded AI capabilities.
thanks for the clarification!
@@AZisk That's not exactly the case. HX CPUs are desktop CPUs in a laptop. Because they use desktop silicon they also have worse iGPUs (the 13980HX for example has the 770 desktop graphics, just slightly better; this might change next gen because of Arc graphics, or maybe not). Because these are desktop chips they perform the best, but also draw the most power, which is why laptops with them are always so big and have discrete GPUs on the 4080/4090 level.
Has AMD improved the video ASICs in their iGPUs at all? For creatives such as myself, Intel has long been the only option considering how great the media engines inside Intel iGPUs are. As far as I'm aware they support way more codecs than AMD/NVIDIA and are faster.
But apple silicon includes efficiency cores. So how is the power efficiency of the snapdragon?
Really hope these make their way to mini PCs; I'd love to try one as a little low-power home server.
I only bought my MacBook M1 Pro because of the battery life and display; for the price at the time it was by far the best deal.
Hopefully Windows/Linux laptops catch up especially in software.
It would be really nice to have a good functioning linux arm laptop.
Let's see and wait
Really wanted to get an M2 MacBook Air, but even the M1 Air was too expensive for 16GB RAM where I live. Ended up getting a Zenbook, which I really like except for the terrible MediaTek WiFi card.
If you are using a Mac, you're already mostly covered on the Linux part.
I'm kinda expecting Nvidia to unexpectedly release their SoC for ARM Windows.
haven’t heard of any concrete plans for that
nvidia is partnering with mediatek for that
but it's early
@@AZisk They have been at it for a decade; guess why they wanted to buy ARM. They too know the future... AMD and Intel are in for rough weather
If you haven't noticed yet, AMD and Intel have been replacing a lot of CISC components with RISC components under the hood.
Wow, it's interesting how the X Elite's performance is that close to the M2, even with test drivers.
Its gen 2 is already in the works. That said, I'm actively considering getting an X Elite powered laptop this year, especially considering the significant efficiency gains obviously expected.
Hardly, when you realise they need 12 performance cores to get to the level of a basic M3 with just 4 performance cores.
And by Qualcomm's initial claims, it's going to eat about 80W of energy, with a much slower GPU.
Be aware that Nuvia, the author of this chip, initially designed it for servers (hence the lack of efficiency cores at all).
I don't think they come even remotely close to the efficiency of Apple Silicon.
They just picked a clever time to show their first chip (right before the M3 announcement). But reverse this: when the first SD X Elite machines are available on the market, Apple can make the same kind of hype video with the M4 and likewise say you can buy those machines about 6 months later. So in this environment an M2-competitor chip is not that great; it's already beaten by miles by the M3 (about the same multi-core score with only 4 performance cores and 4 efficiency cores; even if you want to call it an 8-core chip, that's still way fewer than the 12 performance cores in the SD X Elite, and the performance-core difference is actually 3x), and Apple will already be halfway to releasing the M4 by the time these are available in any device. It's just like the phone market, where they are a few years behind in performance: even if you compare the SD8G3 with the A17, the Apple chip still beats it, and remember that if the S24U were still called the Note 24, it would have released alongside the A18 (iPhone 16) about 6 months later, so you wouldn't even be comparing it with the already year-old A17 (iPhone 15).
@@ZhuJo99 The X Elite is rated 23W by Qualcomm, and the reference devices that were tested were passively cooled, just like the MacBook Airs. 80W is the thermal power budget for the ones that come with a dGPU (if I remember correctly).
@@TamasKiss-yk4st Who cares, even if it's slightly worse than the M2, M3 and a potential M4 at release. What matters is that on the market we finally get laptops with good battery life and without the macOS bullshit.
@@aimmlegate It's gonna be Windows on ARM bull$hit... and that's some serious bull$hit!!
I love this kind of stuff. I like comparing specs and benchmarks and reading articles on this stuff. Great video
Glad you enjoyed!
Trouble with ARM systems is most seem to have locked bootloaders
Hey Alex, next time an in-depth Linux performance analysis on ARM, please.
Especially the current status of the software ecosystem around Linux on ARM, mostly highlighting major software development.
the software ecosystem for ARM is basically the same as for x86, especially on the open source side
@@bigpod In my experience, it's much harder to find Linux aarch64 support with common apps.
@@mairhart I think major Linux desktop distros will definitely work with ARM64-based processors, but it will take some time for the whole ecosystem to adapt to the new environment. It also depends on how much the manufacturer reveals about their processor.
I have confidence in the Linux community; they can pretty much reverse engineer anything.
@@bigpod Let's see; the X Elite will definitely open a new arena of computing.
My common thought was that if Linux already works on servers based on ARM processors, then it should not be a problem. But then again, server and desktop are quite different.
@@mairhart Have you tried it already? Would love to know more about your experiences.
I wonder whether ARM chips will replace the low-end, low-performance segments. I assume the Qualcomm GPU is not there yet compared to the CPU performance.
we need Boot Camp on our M1 chips
Thanks for the video ❤❤❤
Thanks for watching!
@@AZisk You do lot of research, which I like the most.
Problem: where do I get this in a box in a server rack with 4x NVMe drives and 4x 10GigE SFP+?
Super interested in pricing. I'm kinda wary that this is going to be priced like the Apple Pro chips even though Qualcomm is comparing to the base M2 chip. Still exciting to have another competitor in the space. Also, Intel with their LPE cores is trying to close the gap in battery life on the x86 side. Exciting year!
You are probably right, especially because the 12 performance cores cost way more than just the 4 performance cores and 4 efficiency cores in the basic M2.
Cannot wait to see a desktop-grade ARM CPU and mobo. Pretty sure we are very close to seeing that switch in tech for every device.
I just want the quiet, fast apple experience…without apple
AWS already has ARM based offerings available using their Graviton processors (they recently announced their 4th generation of them), and those are usually the best value (performance per dollar).
Hope they will include the Frore AirJet in laptops for active cooling
I'm curious what wattage the SnapDragon is running at.
Single-core score is what really matters; CPU manufacturers have always been able to increase multi-core performance, but not single-core. Programs are sequential; most tasks are done by a single core.
Any news on when these will release? The rumor I heard is a Microsoft-exclusive laptop in July, so likely even more expensive than a Mac
but the wattage is incomparable, correct? M3 vs Elite
Competition is always good, because when a company is the best by far at something, they get lazy or lose passion, and then the product just gets small improvements. But when there's competition, everyone wants to be the best, so the final products get better and better, and of course it helps the software really catch up too. Best
What about power consumption of each cpu?
Great vid, though one thing to mention: the M-series chips have a higher single-core score mainly due to their higher clock speeds, while the X Elite chips have lower clock speeds and make up for it with their higher core count.
All results in one place:
SoC                Single    Multi   OpenCL   Metal/Vulkan
Apple M2             2577     9649    25290          41390
Apple M2 Pro         2644    14233    50524          81443
Apple M3             3084    11564    30254          47358
AMD 7940HS           2664    12327    32085          31976
Intel 14900HX        2941    18279        -              -
Qualcomm X Elite     2574    12562        -              -
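Since half the thread is asking about performance per watt, here's a minimal sketch that ranks the multi-core scores above by points per watt. The wattages are assumptions (Qualcomm's claimed 23 W config for the X Elite; rough ballpark sustained package power for the others), not measured values:

```python
# Points-per-watt sketch using the multi-core scores above.
# Wattages are assumptions, not measurements: 23 W is Qualcomm's
# claimed config for the X Elite; the rest are rough ballparks.
chips = {
    "Apple M2":         (9649, 20),    # assumed ~20 W sustained
    "Apple M3":         (11564, 20),   # assumed ~20 W sustained
    "Qualcomm X Elite": (12562, 23),   # Qualcomm's claimed 23 W config
    "Intel 14900HX":    (18279, 100),  # assumed ~100 W sustained
}

def points_per_watt(score: int, watts: int) -> float:
    """Multi-core Geekbench points per watt of package power."""
    return score / watts

for name, (score, watts) in sorted(
        chips.items(), key=lambda kv: -points_per_watt(*kv[1])):
    print(f"{name:18s} {points_per_watt(score, watts):6.0f} pts/W")
```

Under these assumed wattages the ARM chips land at roughly three times the points per watt of the 14900HX, which is the point the efficiency comments in this thread keep making.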
Hi Alex
Raptare just means "to seize away", so a raptor is just the one who snatches something, but I think it lacks any "professional" connotation.
In fact, in Spanish "raptar" means to kidnap, so the meaning might be broader.
Thief would be "latro" (ladrón in Spanish) and theft would be "furtum" (hurto in Spanish).
Why would you use m2 instead of m3 for comparison?
Wonder how the X Elite handles x86 emulation. One thing Apple Silicon does remarkably well is emulating the memory sync for multicore operation. That's how Apple's Rosetta can run x86 and x64 programs so quickly. Seamless support for legacy binaries is how Apple really pulled ahead of the rest of the industry.
Apple's killer ARM advantage is Rosetta. Without a Rosetta equivalent they are going to have a hard time.
Box64/Box86 and Fex Emu exist :P
Windows 11 already has a Rosetta like emulation/compatibility layer to run 32/64 Windows apps on ARM64 and it works well enough. More and more software is ARM64 native though. The only things that don’t work are emulated games that require GPU acceleration. Otherwise, the WOA world on current devices like Surface Pro 9 5g or Lenovo X13s are good enough. I can’t wait to see what the next gen of QC chips can do.
Win11 can already run x86/x64 binaries emulated. If the upcoming Qualcomm chip also has the strong memory ordering feature needed for "hardware accelerated" x86/x64 emulation, like Rosetta uses, then Rosetta is not a killer advantage anymore.
Will the X Elite be only for laptops, I wonder?
I'M DEMANDING IT!
I've always been a RISC fan and now it's happening, good news :)
Is the X Elite the top-of-the-line chip or the base chip?
I'm really keen to see the energy consumption differences as well!
Thinking of getting a used 16 inch m1 pro for around £1200, as they sell around that price. Not sure whether I should get that or wait for these or get an intel core ultra with a dedicated gpu...
Does Linux do unified memory like macOS? Also, the Snapdragon X Elite doesn't have comparable memory bandwidth. I don't think you can run Mixtral 8x7B on a 96GB Snapdragon X Elite, at least not at similar speeds.
@user-ej9nl1ng9d I think M3 unified memory is different from PC UMA. GPT-4 says the GPU cannot access system memory, or at least it's a hard partition where the GPU gets a dedicated part of RAM.
Maybe AMD hUMA is similar?
Short version: if a Snapdragon X Elite has 96GB of system RAM, can the iGPU access all/most of it to run 70B models? Not a max of 16GB (32GB?) like the 8700G, or half of system memory like Windows shared memory. I believe on the M3 the iGPU can access at least 75% of the system RAM.
RISC-V looks like it will be even better for custom chips, so I expect that the competition will help keep costs down. This is what the industry needed; x86 had sat at similar performance for a few generations with only small improvements. This is how it was from the inception of desktop computing until the early 2000s, when chip manufacturers started dropping development of new chips and we were stuck with Intel and AMD as the only options in computers.
RISC-V is many years behind ARM. There is no real high-performance alternative, and even in low-powered chips, the performance/watt is quite a bit better on ARM.
@@FlorinArjocu The improvements to RISC-V are impressive, and since it is just a reference specification, there's a chance that a company can make a chip that is just as performant or better. I would love for SBCs, like the Raspberry Pi, to build their own chips using RISC-V; Raspberry already makes their own microcontrollers, so it isn't too far-fetched an idea. Remember that AMD and Intel have swapped places in performance scores for their respective processors, and that was due to trying different designs; the same can happen with RISC-V. AMD was said to be years behind Intel for about a decade, and they are now pushing the performance standards forward.
Just saying that one design is years behind doesn't mean a change in the market isn't possible. RISC-V has the advantage of being able to quickly iterate and change the design, as right now it is mainly just a bunch of reference chips being made and not much for retail customers.
You didn't mention that Apple chips also have efficiency cores. Like the M3 has six efficiency cores and only two performance cores, compared to the 12 performance cores from the competitor
Is there an affordable and powerful ARM Linux computer?
I’d like to see your guys’ opinions regarding the licensing side of ARM versus RISC-V, for example as I understand it, the ARM intellectual property has been shuffled around and sold to various parties, while RISC-V might have an advantage (?) of being an open standard. In particular there was some hoopla about a mainland China company possibly owning IP rights to ARM? I wonder how all this plays out in the coming years. It won’t be the first time that business details have affected and even compromised the progression of good engineering (*cough* IBM PC *cough* operating system…)
What might be the tentative time we will see laptops with the X Elite chip?
Good to see some real competition. You did skip one important metric though: power consumption. It's all very well matching performance, but if this is achieved by doing whatever it takes power-wise, at the expense of battery and cooling, then it's not really competitive. Yet!
Other, albeit early reviews I have seen indicate the power usage of the Snapdragon is significant in order to match the M series.
Thanks for posting this. Looking forward to seeing where it goes.
Why begin the comparison with the M2 instead of the M3, since the M3 has been in production for some time and the QC chip isn’t launched yet?
Non-developer here: are things developed for macOS Apple Silicon also easily usable on any Qualcomm PC?
No, unless emulated. Qualcomm will run Windows and Linux.
Will it make porting macOS easier?
It will be great to see some competition, I’m just not sure the optimisation on windows will get anywhere close.
In Fall 2024 we get the M4 Max.
Will there be something like Rosetta 2 on the Mac, but for Windows? For ARM Windows?
It's already there. Win11 can run x86/x64 binaries on ARM.
Very interesting, thanks! I'd love to see a comparison with a high-end x86 chip such as the AMD Ryzen 9 7950X3D
Super interesting, but I do wish that in addition to the raw numbers, you (and everyone else) would also offer Performance/Watt scores since that's actually interesting for laptops which are what the majority of people use now.
What about the power draw?
2:07 It's not "if" they move to ARM; AWS and Azure are already providing ARM compute, which is cheaper than their Intel or AMD counterparts. Many are still using Windows to run their workloads, and even though Windows on paper supports ARM, too much software and too many SDKs are optimized for x86 (32- or 64-bit)
Need to compare the power draw behind the single-core and multi-core numbers. That's a big reason behind the popularity of the Macs.
Windows on ARM's biggest problem is Windows itself.
My subnotebook is an N100 + 12GB RAM toy. When it was new, it came preinstalled with W11 Home.
It was slow: the Start menu was slow, web pages opened slowly, and the CPU was at 80 to 100% most of the time.
After reinstalling W10 from a flash drive, I spent a night updating everything.
The problem was still there, so I used W10Debloater to remove a lot of things I don't use. Battery runtime went from 2 hours up to 6 hours, and now it can sleep correctly.
I also have an MBA M1 and a 13th-gen i5. When I can, I choose the MBA for on-site jobs most of the time.
try Linux, might fix everything wrong with the N100
fr lol
Performance-wise, the N100 isn't actually slow. It's around 1200/3400 on Geekbench. For comparison, that's the same as the AMD 2400GE 4-core CPU.
I'm looking forward to a future where these chips can be used for gaming consoles or UMPCs.
How did you carry that five?
I'm curious if the Snapdragon will have a unified memory architecture like the Mac does?
I hope not. It's better to have the ability to swap and upgrade.
Plus, PCs have the ability to come with dGPUs
@@ehenningsen Only 16-inch laptops have upgradable RAM in Windows land now. In 14-inch there is only one that fits my needs, and I'll have to return it because the display is super warped (Asus Zenbook Pro 14 OLED).
The craziest thing is that the laptops with non-upgradable RAM won't even sell you enough RAM soldered in; most of them max out at 16/32, which is so little. They don't offer 48/64.
@@definingslawek4731 I'm looking at MSI, Dell and others - all of which, have upgradable RAM.
If you're referring to Snapdragon laptops, I haven't looked recently. If true, I won't buy them.
Yeah, I was planning to buy a new laptop, currently looking at the M1 Air with 16GB RAM, but seeing how the Snapdragon X Elite is releasing officially in June... I think I'll wait until it releases and get some benchmarks before making a choice
So why weren't the 8000-gen Ryzen APUs compared?
Also, the ARM chip will release on the Germanium build of Windows (build 26xxx), which regular consumers won't see until Win 11 24H2.
The Snapdragon X was supposed to use much more power than the M series, wasn't it?? Correct me if I am wrong or missing something (information from Max Tech's Snapdragon X video)
They have two versions, 80W and 23W. The lower one is comparable to the M series
1:57 Doesn't AWS have cheaper instances for ARM? Same for Lambda; I think Azure is the same too, so hopefully we'll get even cheaper cloud
AWS runs on ARM (not sure if all servers; they might have a mixture). They're on their 4th generation of ARM hardware.
I have always said that RISC-based architectures are the future, ever since the Pentium processors came out; I just didn't realise it would take so long. Rosetta 2 has really given this a shot in the arm (no pun intended). Do the new Qualcomm chips have any x86 or x64 emulation features?
This ARM competition is good af!
By the time it's released, x86 will have new chips too 😅
The problem with ARM is that since it has fewer features than x86, it suffers from unoptimized code at runtime; hence we saw glitches on Android and lag when all the RAM was used up.
Not sure about the Apple side, but the problem will persist, especially for translated x86 apps.
What's missing is the wattage for the Snapdragon chip. Qualcomm mentioned four possible TDPs: 12W for fanless devices, 23W for ultrabooks, 45W for bigger laptops, and an inefficient max of 80W. If the numbers are for the 12W config, then we have real competition.
There will only be one comparison left for these chips: efficiency (power draw). If Qualcomm wins that, they still need to compete and/or cooperate with software vendors to persuade them to make their software (e.g. games) available or compatible with the ARM chips; otherwise, a Windows on ARM machine will be completely useless ._. However, the development process would be almost as difficult as making another copy for Mac, so we're actually likely to see new software come to Macs from Windows.
Ooh sweet I really want this for linux
We have been here before. When the Lenovo X13s was finally out, I bought one and returned it within a day... Look at the results of the Snapdragon 8cx Gen 3 and remember: while it might be a new Snapdragon, it is a generational upgrade of a *new floor plan*, since I don't see anything that talks about anything new Qualcomm has done for ARMv9... Until this is in a functional product, they have been flops in performance when released. I hope they do make a jump and provide options!
Do we have the power at which the processors got those results? I mean, if raw numbers meant something, Intel would be a good choice for a laptop 😅
Apple M chips can run macOS; does anyone think Windows on ARM will be anywhere close?
HS is an APU, a monolithic design, while HX is the desktop chiplet design on the mobile platform
This is very exciting! Arm can easily be more than enough for some people's needs, even when a lot of software is not ready for arm. If it’s powerful enough, that is, and that hasn’t been the case for Windows or *nix machines aside from MacBooks so far.
There's one big advantage for Apple here: they can _force_ change. They introduce a new architecture with all the tools for transition ready at launch, then they give a few years' warning of when they will pull the plug on the old architecture. So any software developer who wants access to the Mac market has _no choice_ but to support the new system.
Windows, on the other hand, is all about backward compatibility. So even if there are ARM PCs available and Windows on ARM works nicely, nothing forces application developers to support it, especially if x64 systems have similar or better performance for the same or similar price.
100% Correct.......
The real question is if Windows / Linux can match Rosetta 2. This is another HUGE piece of the puzzle. You talked about this with WoA, but it can't be good for just devs (and I am one); it needs to feel seamless to general users. Linux I'm less sure about, but I'd like to know!
But in the end I do want to see this succeed because it makes everything better if they do!
If you watch Gary Explains' video, the Windows emulator is not far behind Rosetta 2; it's just that Windows is lacking capable hardware as of now.
As a gamer, ARM simply isn't an option if games don't run at least within in the same ballpark as on x86 🤷♂
I use a Linux-on-ARM device every day and have for years. I am pretty sure Linux will do fine in this area since it is already doing very well, but I doubt it will be as easy for Windows.
Linux has been running on ARM for a while (actually, a long while).
By the way, your Android phone also uses the Linux kernel.
Do these ARM chips run macOS? If they don't, then I have no use for them, regardless of how fast they run.
Take that up with Apple, they’re the ones who prevent you from (legally) running MacOS on non Apple hardware.
Hi Alex. I agree: ARM is the future. ARM + Linux is my favorite combo 🙂
Linux is good for servers but terrible for PCs.
Er, why? You like being limited to arm apps?
>Not using riscv
@@sprockkets if there is no aarch version, just run it through box86
But what is the power consumption?!
Demanding it!
Potentially, we will see the Apple M4 at the end of the year. I wonder how much of a leap that will provide? Assuming another ~10% increase in single core, and ~15% in multi.
Why is the m2 the comparison apple chip and not the m3?
that’s what happens when you don’t finish the video.
Make a video about which M3 / Pro / Max for which kind of programmer… like your M1 / Pro / Max video 🔥
The biggest implication of this is that there may be more games on ARM than ever. If developers are going to have a build system for ARM anyway, they might also consider adding support for macOS, since the marginal maintenance cost has decreased.
You appear to have been a little wrong about the "12 performance cores". The newer benchmarks for Oryon on Geekbench show that there are two clusters of cores, one 8-core and one 4-core. Not sure which is which though, but probably 8P+4E.
I may have been, but got my info directly from Qualcomm
@@AZisk Sorry, I did not realize that. I'll wait for future announcements from Qualcomm before jumping to any further conclusions about the architecture.
Yes, but software is needed too... it's a long way until we have stable, optimized software for this CPU. The GPU will also be a problem.
Cannot wait for arm thinkpads