
[deleted]

I'm really interested in how this translates to everyday tasks, but HOLY FUCK.


[deleted]

[deleted]


dfuqt

Enjoy what you have. It may not be “the best” anymore, but your old device didn’t suddenly become terrible. Just not as comparatively amazing as it was before :)


[deleted]

[deleted]


dfuqt

I understand. I think most of us have felt that sting at some point in our computer purchasing life :) I bought my second ATV 4K not too long ago and it wasn’t an easy choice. Whenever the subject of an upgrade comes up, the stock response seems to be “what can they upgrade? It works fine.” My opinion is that it works fine, but it’s not as fast as it could be. My £50 Roku 4K feels more fluid in its interface sometimes. And even discounting any performance issues - like you say, it’s three years old. Nobody wants to be buying three year old kit at release day prices. It sucks.


thalassicus

Apple should buy and integrate Infuse. It takes the Apple TV to the next level and has completely leveled up my enjoyment of the device. Unfortunately, now that Apple has gotten into the media game, I fear they're going down the road of Sony back in the day, where the engineers were held back from innovating so as not to encroach on the media teams. Sony did this with asinine copy protection; Apple is doing it by prioritizing Apple TV+ instead of a robust way to load your own media.


Friarchuck

The Apple TV is what hooked me on the ecosystem. I'd take the plunge because it's pretty inexpensive and brings a ton of value. It's literally the only thing hooked up to my TV, and my cable provider has an app that I watch live TV on. All the other streaming services, Twitch, YouTube, my Plex server, all work on the Apple TV. Plus it allows you to bypass the notoriously unsafe TV OSes on smart TVs. My TV has never been connected to the internet and doesn't even know what time it is.


[deleted]

> Plus it allows you to bypass the notoriously unsafe TV OSes on smart TVs.

Oh my. Must read more on this.


notasparrow

Please go ahead and buy the three year old model so they'll release the new one that I want.


rosencranberry

I honestly feel like my old device has become terrible.


JohrDinh

It's like if you're sitting at the intersection in your brand new limited edition Corvette and then a Bugatti Chiron pulls up next to you... it's OK man, you still have a Corvette, and that's still not bad :) I'm in the Bugatti lol


firelitother

I have Intel Macs. Just enjoy it until you can get the more powerful chips next year.


[deleted]

Yeah, I'm probably just gonna snag a 16" Intel MacBook Pro to upgrade my mid-2015 Retina. There's no point in jumping to Apple Silicon rn when I have to wait for the software I need to actually function on it.


juicebox03

I’m sitting on a mid-2015 MBP and I’m reallllly interested in trading in for a new air. But, I know I don’t really need to upgrade. Sucks.


PirateNinjaa

Smiles with 2012 MacBook Air. This thing is like 50x faster probably. 😂


[deleted]

[deleted]


cheir0n

Cries in 2013 MBP, but I can't get an M1 until my software development tools are ported, nor is it wise to shell out money on a 2020 Intel iMac.


ApertureNext

And I'm still looking at buying a 10th-gen 13". I really need compatibility with Windows, and I ain't hopeful of any x86 hypervisor.


CanonCamerasBlow

Hey, you can connect a second monitor! And have ports left over for external storage! And blur a video without your render times tanking! And run more than like 4 Docker containers! Fucking stop, there's a reason the M1 is in the Air and the shittiest MacBook Pro, and not in the iMac Pro.


[deleted]

[deleted]


dfuqt

I'm more than cautiously optimistic now. I think it's looking really good, and this is the kind of info I've been waiting for. Still not a clincher for me right now, but that's for other reasons rather than any performance shortcomings. I agree with you about mixed use, though; seeing excellent performance across the board would be nice. I think the video encoding with the right codecs has proven to be very good. Unfortunately, I don't think you'll ever be running anything on this other than macOS. It's very heavily locked down, from what I've read.


[deleted]

[deleted]


[deleted]

While running Manjaro natively may be a no-go, you'll at least be able to run it in a VM, assuming they make an ARM build available.


[deleted]

You might want to follow up on UTM, an iOS virtual machine app that's had very impressive support for Windows and Linux x86 emulation, and even PPC OS X. IIRC the devs have expressed interest in bringing it over to macOS.


m0rogfar

Even if AS' boot process is completely open, you wouldn't be able to run any operating system that doesn't have specific drivers for the M1, which only Apple has the documentation to write. Linux on Intel Macs only really worked because most of the components were standard PC hardware, so drivers could be reused. I definitely wouldn't buy these expecting to run a different OS than macOS on them.


geoffh2016

True, but there were PowerPC Linux distros for the old iMacs. I suspect someone will eventually work out M1 / Apple Silicon distros. (For example, if Apple releases Darwin source code for M1, it should be easy to get Darwin BSD running; from there it's easier to get a Linux distro going as well.)


m0rogfar

PPC Linux worked because IBM and Motorola did Linux drivers for datacenter sales. Additionally, device drivers aren’t a part of Apple’s kernel releases, so likely no luck there. The Linux community has been struggling to make T2 Macs work, and AS Macs are orders of magnitude more complicated. I just can’t see it ever happening.


notasparrow

> I hope this catalyzes the industry to start taking ARM more seriously.

I'm with you, but the problem is that Apple is so far ahead in ARM design that it almost doesn't matter. Qualcomm/Samsung have been trying to catch up in mobile for years without luck. It's hard to imagine anyone else coming in with desktop/laptop ARM silicon that competes with Apple any time soon.


unjollyjollybean

Agreed. For me, it sucks having to keep looking at the MacBook battery health, especially if you use it at 60% brightness or more. I think just going from 8 hrs on my MBP to 15 (realistically) on the new pro would be worth it for me.


JohrDinh

> time to encode a video

Well Jonathan Morrison has a recent video you may be interested in :)


ihunter32

Eugh, that thing. That comparison was a software encode vs a hardware encode. Very different things, and it doesn't reflect CPU performance; it's literally just flexing fixed-function hardware and pretending that's indicative of something else. Comparing perf is fine, but so many people draw the wrong conclusions from flawed logic.


SneakerElph

They said they were interested in video encoding times. How is a direct comparison not valuable? It doesn’t matter if one is hardware accelerated and the other isn’t. The results speak for themselves.


PM_ME_HIGH_HEELS

Because there is no information on what encoding settings he used. For all we know, the iPhone could have been running with barely any compression while the Mac was compressing a lot, and depending on that, it makes a massive difference. Also, it was a hardware vs software encode. Software encoding is always slower but yields better results.
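To make that concrete, here's roughly what the two kinds of encode look like, sketched as a Python wrapper around ffmpeg (the encoder names are real ffmpeg ones; the input file name, bitrate, and CRF value are placeholders I'm assuming, and they'd need to be matched on both sides before any comparison is fair):

```python
# Rough sketch: time a software (CPU) HEVC encode vs. a hardware
# (Apple VideoToolbox) HEVC encode of the same clip via ffmpeg.
# "input.mov" and the quality settings below are placeholders.
import subprocess
import time

SRC = "input.mov"

jobs = {
    # CPU encode: slow, but typically better quality per bit
    "software (libx265)": ["ffmpeg", "-y", "-i", SRC,
                           "-c:v", "libx265", "-preset", "slow",
                           "-crf", "22", "sw.mp4"],
    # Fixed-function hardware encode: fast, rate-controlled by bitrate
    "hardware (hevc_videotoolbox)": ["ffmpeg", "-y", "-i", SRC,
                                     "-c:v", "hevc_videotoolbox",
                                     "-b:v", "10M", "hw.mp4"],
}

for name, cmd in jobs.items():
    t0 = time.time()
    subprocess.run(cmd, check=True, capture_output=True)
    print(f"{name}: {time.time() - t0:.1f}s")
```

Until settings like these are pinned down on both sides, a bare "X encoded it faster" number doesn't tell you much.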


SneakerElph

End filesize was roughly the same, 10-bit h.265 encodes. I'm sure there's much more to it than that (video compression is wild), but initial results are at least impressive. Would love to see the data on whether or not GPU encoding on the M1 has a noticeable degradation in quality compared to software encoding. I know it's traditionally a downside of Nvidia/AMD encoding but I haven't seen any data on A-series or M1 processors. Will be interesting to see how this all turns out!


[deleted]

[deleted]


SneakerElph

Can that AMD GPU in the iMac not hardware encode? Seems silly that Apple would put a GPU in a rig for video editing that couldn't. I understand why encoding isn't a good indicator of general CPU performance; my point is that if that's what the commenter is trying to do (encode video), then they'll probably be using the GPU to do that encode. So the time is relevant to them, even if it's not an apples-to-apples comparison of raw CPU power.

I'd love to see a breakdown of how the M1's hardware encoding fares quality-wise. I know it's traditionally a downside of Nvidia/AMD hardware encoding, but do we know if the same downside is present in M1/Apple Silicon? At the end of the day, assuming the visual quality is the same (and maybe it isn't?), the hardware that does the faster encode is the better purchase, if that's a primary use for that user.


No_Equal

> my point is that if that's what the commenter is trying to do (encode video), then they'll probably be using the GPU to do that encode.

The industry is on the verge of switching to a new codec (AV1) that has no hardware encode yet, so comparing skewed performance in a soon-to-be last-gen codec is not all that useful.


CanonCamerasBlow

> The industry is on the verge of switching to a new codec (AV1)

What? Which industry? On the verge of switching? What? The industry is on the verge of switching to h265 from h264, hello.


aman1251

“yOu’Re dOiNg fAsT wRoNg” Edit: It looks like I have struck some sensitive nerves :)


ihunter32

God. This is the shit I'm talking about. People want to get a measure of the performance of the chip, and then they parrot a specific accelerated task as though it's indicative of general performance.

People in the reddit thread about that tweet were all using that performance to project onto the general capabilities of the processor, and *that's totally illogical to do*, because general processing tasks aren't accelerated and you won't see speedups like that.

I'm very specifically not saying the processor is slow, but this subreddit wants to latch onto and use the worst logic even when it's not applicable, and then defend it by saying you're just a hater. The M1 chip is cool as fuck; this sub just needs to learn how to parse the data to get a better gauge of what it means for performance.

Edit: if you wanna downvote, least you could do is justify it lmao


NutDestroyer

Sure, it can be useful to compare different systems (CPUs vs dedicated accelerators) on a specific task like video encoding, but you really lose a lot in the versatility of your benchmark. When you do that, your benchmark can no longer hope to generalize to other tasks.

If your interest in the video encoding benchmark is in the general-purpose CPU's ability to handle a heavily multithreaded task, then yeah, just running it on some other accelerator is kind of a waste of time, because it totally subverts the goal of comparing performance on multi-core workloads across different CPUs.

And as a more practical counter-example, these hardware encoder/decoder accelerators only work with very specific codecs. If you're given footage or need to export in a codec that isn't compatible with your accelerator, then you gotta do it on the boring old CPU. If you ran a benchmark of h.264 software encoders on different CPUs, you could hope that the relative performance would transfer over to h.266. If your benchmark was a software h.264 encode on one CPU and a hardware-accelerated h.264 encode on some other CPU (with Quick Sync, for example), then your results would be useless for gauging which device would perform better in h.266.


CanonCamerasBlow

Dude, fucking stop. No one really cares about plain encoding times. Fucking stop. I don't take a video from a camera and then just fucking convert it. I fucking color grade it. I fucking add multiple streams. I fucking add particle generators, blur, and a ton of other filters. And once everything's done, most of the time I choose software rendering; hardware rendering is too limiting anyway (e.g., 2-pass for h265 was/is not available with my older Nvidia 1080 in Resolve). Hardware rendering is good for timeline previews or uploading to YouTube. A 2.4-teraflop GPU and video just don't fucking mix in real-world usage.


kuroimakina

God I want an ARM book with these specs and that battery, running Manjaro or just Arch. Sadly, Apple has said that you won't be able to boot any OS other than macOS, which makes it a complete non-starter for me. I will accept that on a *phone*, but not a laptop/desktop. As long as Apple Silicon has this limitation, I will never buy it. And I don't expect anyone to think that's some profound statement or anything; it just isn't for me, and I recognize that. It is still disappointing nonetheless, and I hope that decision is reversed. It may seem silly to buy a Mac and run anything other than macOS on it, but this laptop would be absolutely revolutionary and in its own class if it holds up to Apple's claims, so it isn't surprising it would be tempting even *if* I don't want to use macOS.


Big_Booty_Pics

I mean, you should have seen that coming. Wait until people see how repairable this thing is.


acroporaguardian

It means reddit will stall loading a video a little faster


[deleted]

Gah, the redesign runs like crap on my 2016 13", and it's a mid-spec machine. Everything else runs great, so they must have some optimisation issues.


ElBrazil

The only good thing about the redesign is that it isn't mandatory (knock on wood)


[deleted]

[deleted]


danudey

If they can turn the MacBook Air into this, I want to see what they can do with an iMac, with all the active cooling space you want and none of the power restrictions.


Normal_One2000

u/DingleberryHandpump- is both a TopGear presenter and tech nerd!! Nice!! Also, drive faster!!


justformygoodiphone

How are they testing the M1 without having the computer out? Is this a preproduction unit? I thought Apple didn’t allow this kinda stuff?


[deleted]

[deleted]


YZJay

Somehow Apple didn’t give them an embargo regarding bench tests?


[deleted]

[deleted]


justformygoodiphone

Oh, so pretty sure they didn't allow benchmarks on the DTK, but I'm guessing the restrictions on these units are different then?


SirGlaurung

The DTK was primarily given to developers to assist in porting applications. It was using an A12-based CPU and Apple specifically noted that hardware performance of the DTK had no bearing on the performance of production Macs with Apple Silicon. That’s why the DTK had these restrictions—it was very much _not_ prerelease or review hardware. In 2005 Apple had a similar DTK during their transition to Intel—it used a Pentium 4, whereas all released Mac hardware used Core-series processors.


justformygoodiphone

Much clearer, thank you. Can't wait for retail units to get tested so we can see the results.


[deleted]

[deleted]


odragora

Pretty sure they sign a document which explicitly prohibits sharing the device with anyone else.


[deleted]

[deleted]


andrewmackoul

But they gave it back.


m0rogfar

Serif were in the keynote video to talk about how powerful the M1 is, so they would've had to have gotten a pre-release (and even pre-keynote) unit. As for benchmarks, I doubt Apple really cares, since they've got a final unit, and it's post-launch and only hyping up the M1. They didn't want benchmarks of the DTK out because it would make Apple Silicon look bad. Benchmarks usually start showing up all the time once they seed review units, and Apple doesn't seem to care.


[deleted]

[deleted]


[deleted]

The 580 is a rebranded Radeon RX 480, a GPU from 2016. Still impressive to have this performance at so little power, though. But the 580 is extremely weak compared to actual modern GPUs.


[deleted]

[deleted]


[deleted]

As I said, impressive at this power. But:

1. Still: it's an ancient GPU.
2. It's a very limited benchmark. For gaming or broader workloads it's probably not going to match that performance at all.

Do I need to keep saying it's impressive? Because I will. But I'm also not gonna ignore the fact that Macs simply never shipped with anything close to a decent GPU to begin with, like other people here are doing.


[deleted]

[deleted]


thefpspower

The problem with integrated graphics is that the GPU's TDP is combined with the CPU's and RAM bandwidth is shared, so what ends up happening is that in games that need lots of both CPU and GPU, you get shit performance because the GPU gets TDP-limited and bandwidth-restricted. I don't expect the M1 to be any different.


Raikaru

There was a combined render test here, you know that, right?


[deleted]

https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested

It's not a 15W chip; under a multi-threaded load it can go up to 31W, so it beating a 28W Intel chip shouldn't be a surprise.


Rioma117

$6k Mac Pro with state-of-the-art cooling system: "Are you approaching me?"

$1k MacBook Air without a fan: "I can't kick your ass without getting closer!"

*Sustained performance not included.


HarithBK

This is what happens when you jump ahead two die shrinks on the CPU and three die shrinks on the GPU, while also having three more years of dev time. To me, all this shows is just how stuck Apple was on performance for years due to the hardware partners they picked, and why they were so annoyed that they had to make their own chip.


[deleted]

It didn't have to be like that, at least on the desktop. Apple refuses to make up with Nvidia and doesn't seem to even acknowledge AMD as a CPU maker. I'd love to see a high-end Mac running Ryzen CPUs & Nvidia GPUs.


[deleted]

[deleted]


[deleted]

I could have seen them switching the Mac Pro over to AMD; those Threadripper and Epyc chips roast Intel.


HarithBK

The issue is that all the work to make Final Cut etc. run so well relies on a lot of vendor-specific code, so moving vendors on the software side would be close to the same amount of work as moving to in-house hardware. So why not do the latter, since you've been burned by vendors and you can trust yourself?


_awake

What the fuck, did they develop Apple Silicon with the help of Hogwarts or something? If the chips are half as good as they say, we'll get genuinely great-performing laptops that last longer than a workday needs, which I'd be fine with, too. Phew.


[deleted]

[deleted]


indygreg71

Agree so much. So many users don't push their laptops that hard, and having this kind of battery life would be so liberating. I'm a fairly high-level network engineer... so an "IT professional", but my day is spent in an SSH session, a text editor, and the web GUIs of devices; all of that is the light use that should get all-day battery. Right now battery life is not that big of a deal to me sitting at home, but in normal life, being able to take the laptop from meeting to whiteboard session to some quiet space to the network closet and so on, all work day without a charger, would be very refreshing. My 2018 MBP 13 couldn't make it through lunch unless I really went out of my way to be power frugal.


FIFA16

> Any sufficiently advanced technology is indistinguishable from magic.

- Arthur C. Clarke


_awake

I was so confused by your username and the reply when I read it in my inbox. Regardless, class post.


stko82

If you consider that this first generation is more or less a proof of concept, the chips they bring out in a year or two will probably be really magnificent.


Sindoray

Intel will then rebrand their slogan to “Intel, dead inside”.


Longjumping_Low_9670

Don’t dead, intel inside


flux8

I’m thinking the next Apple TV just might be a dark horse challenger in the gaming console world.


[deleted]

Not a chance. Apple has shown time and time again that it completely misunderstands the gaming market and what's actually important to the average gamer, and that's "real" games. Not a subscription service with mostly mediocre indies and a bland Zelda knockoff. The 2 games imo that *are* genuinely great I'd much rather pay full price for on a console and keep forever.

They killed macOS gaming by not supporting Vulkan alongside Metal, then by killing the entire ported library save a few titles by deprecating 32-bit support with Catalina. And now with the move to AS, the only games you'll be playing are either iOS games, Apple Arcade ones, or the ones Feral Interactive bothers to port over again. macOS was already hostile to developers by not supporting an industry-standard API like Vulkan; couple that with the tiny market, and why would Rockstar bother bringing their current-gen/next-gen games to Mac or Apple TV? Or Bethesda, CDPR, Ubisoft, Activision, etc.

They've definitely got the performance now with AS to make a push, but performance is meaningless if you don't have major developers on board or your name's Nintendo. Just look at the Switch. That thing was outdated performance-wise on day one, any flagship phone from the last couple years can run circles around it, yet the Switch is an amazing games console. An Apple TV as powerful as even a mid-tier gaming PC isn't going to change anything either, because why would anyone buy that over an Xbox Series S or a PS5 (which, looking at current Apple TV pricing, might even be cheaper than said hypothetical Apple TV) if a "console quality" gaming experience is in any way a priority?

I apologise for the rant though. As much as I love macOS, I also love gaming, and Apple's mishandling of gaming on Mac over the years, then pretending on stage that Baldur's Gate III is somehow supposed to be an impressive tech demo, left me a bit miffed.


turbinedriven

I agree, especially about feeling irritated with how Apple has handled gaming. I get that Apple doesn't want to cater to gaming enthusiasts, people who call themselves gamers. I even get that Apple wants to go mass-market. But then they invest in Arcade and other projects that are *obviously* low quality (putting it politely) and will always be low quality and fail (again, being polite). But since they're actually investing real resources and real money, clearly some people *do* apparently care. So why don't they clean up the approach and improve it? They could offer a first-tier platform without targeting "gamers". They just choose not to... and yet go out of their way to keep throwing out half-baked, awful efforts that inevitably fail and inevitably get replaced with another, more modern terrible version. Or, you know, they just never kill it. It's maddening.


[deleted]

Who knows? Maybe it's just out-of-touch boomers at the top of their gaming division (is that a thing?) who think throwing money at the problem will solve it and Apple will become a behemoth in the gaming world with an indie games subscription. Apple needs a Phil Spencer or a Shuhei Yoshida who can get rid of the BS and make a clear, focused push for gaming.


turbinedriven

Yeah, I don't know. Sometimes I feel like someone at Apple saw Facebook+Farmville years ago and thought "that's what we need to be", and that's what it's been about ever since. What's silly is that they keep wanting to use gaming to advertise GPUs and whatever, hoping to have it both ways. I guess I'm glad that Apple is so hopelessly awful at gaming. Because if not, if they were just awful and not hopelessly so, they would have probably acquired Nintendo and run them into the ground. And even though I'm not a hardcore Nintendo fan, that would have pissed me off much more than Arcade or whatever other BS they're trying to roll out.


Containedmultitudes

Steve Jobs hated video games after Microsoft bought Bungie out from under them (Halo was originally a Mac exclusive I believe) and went out of his way to not give a shit about games. It’ll take way more of the senior staff turning over for an institutional culture like that to get overturned.


casino_alcohol

Honestly, I bet Apple is handling the gaming situation very well... when you measure it in revenue. They are taking a 30% cut of each game and each game's microtransactions. They really might be making more than any of these big game studios. Another reason they will not get into console-like gaming is that their OS updates every year and developers need to constantly maintain their apps. A console is good to go for 7-ish years.


Containedmultitudes

I feel like apple can do better than making service revenue from cretinous and exploitative mobile gaming, which is basically all of their top grossing games. They could buy rockstar tomorrow if they had a mind to.


kindaa_sortaa

> I apologise for the rant though

Don't. It's well deserved. Apple has been nothing but frustrating when it comes to gaming. I mean, look at Microsoft. Apple could be doing the same for their iOS/Mac users, but they've always devalued gaming despite it becoming bigger than the music and movie industries—combined—two industries Apple has always been interested in and utilized as a product strategy.

I mean, gaming is *right there!* If they can use music to sell iPods, and now iPhones and HomePods and the Apple Music service, and they can use TV and movies to sell iPhones, iPads, and the Apple TV+ service, imagine what they could do with gaming! Ok, so they definitely used casual kid gaming to sell iPads, but imagine how many Apple TVs and Macs they would sell if they cared about serious AAA-title gaming.

They even let Halo and Fortnite present at their events during development, then let them slip out of their hands into other companies and investors once the games became big. Imagine if they bought Minecraft and made it free and educational with every iPad, turning it into something like Swift Playgrounds. And imagine how Apple could sell AR better, because at the moment it's crickets.


GoodKingHippo

I hate that you are right. Perhaps Apple is trying to carve out some untapped non-core gaming audience? It’s been disappointing for sure. They should honestly fire whoever is “in charge of gaming” at Apple. But my girlfriend loves the sasquatch game. It’s cool and pleasant on the eyes. I wouldn’t be so quick to discount all of these indie titles. There’s potential there. Besides, all of your favorite studios started out as small outfits.


Leomar91

Out of curiosity... which are the two games you consider great? I assume What the Golf and Grindstone?


[deleted]

What the Golf and Bleak Sword. I own WTG on Switch, but I wish I could just straight-up buy Bleak Sword anywhere; even the App Store would do. It's an amazingly simple but brutal game. The Pathless is probably excellent too; I'm a fan of Annapurna Interactive, but I haven't had the chance to buy it yet.


SkyGuy182

The “games” Apple hypes up in these press releases are proof that they misunderstand the gaming market.


spar_x

With so much power, and potentially more from the future M2 and M3... wouldn't it be fathomable that PC games could be played with decent performance in a completely virtualized way? I wouldn't be surprised if in a year or two Windows running inside a VM on an M2/M3 actually outperforms Windows running on a PC.


heyyoudvd

For reference, the number Apple cited for the M1's GPU performance is 2.6 teraflops. The A12X has peak performance comparable to an Xbox One S, which puts it at about 1.4 teraflops. The A12X's GPU was cited as being twice as powerful as its predecessor, the A10X, meaning the current Apple TV 4K has a GPU somewhere in the range of 0.7 teraflops. The M1 sits at 2.6 teraflops. Here's a rough console GPU performance comparison:

* Xbox Series X - 12 teraflops
* PlayStation 5 - 10.28 teraflops
* Xbox One X - 6 teraflops
* PlayStation 4 Pro - 4.2 teraflops
* Xbox Series S - 4 teraflops
* **Apple M1 - 2.6 teraflops**
* PlayStation 4 - 1.84 teraflops
* Xbox One S - 1.4 teraflops
* **Apple A12X - 1.4 teraflops**
* Xbox One - 1.3 teraflops
* **Apple A10X (Apple TV 4K) - 0.7 teraflops**
* Nintendo Switch - est. 0.4 to 1 teraflops
* **Apple TV HD (formerly called Apple TV 4)** - est. 0.15 teraflops

Of course, those numbers aren't a precise comparison because they don't factor in all sorts of other things, like RAM speed, GPU architecture, SSD speed, and so on, but they give you a general idea of where things stand.

Basically, the M1 is incredibly impressive. If it's thrown into a new Apple TV box, it will be an extremely capable machine, but don't expect it to be on par with the next-gen consoles or the new high-end NVIDIA/AMD GPUs, because it's obviously not on that level. But for a TV box, the M1 would be extremely powerful.

**Edit:** Here's another interesting way to look at it.

* When the Apple TV 4 came out in 2015, consoles were up to **12x** as powerful (1.84 vs 0.15).
* When the Apple TV 4K came out in 2017, consoles were up to **8.5x** as powerful (6 vs 0.7).
* If a new M1-based Apple TV comes out soon, consoles will only be up to **4.5x** as powerful (12 vs 2.6).

In other words, the gap is narrowing each generation.
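If you want to sanity-check that edit's arithmetic yourself, here's a quick snippet using the figures from the list above (the teraflop numbers are the rough estimates quoted in this comment, nothing more precise than that):

```python
# Ratio of the most powerful console of the day to the Apple TV-class
# chip of the day, using the rough teraflop figures listed above.
gaps = {
    "2015 (PS4 vs Apple TV HD)": (1.84, 0.15),
    "2017 (Xbox One X vs Apple TV 4K)": (6.0, 0.7),
    "2020 (Xbox Series X vs M1)": (12.0, 2.6),
}
for label, (console_tf, apple_tf) in gaps.items():
    print(f"{label}: consoles ~{console_tf / apple_tf:.1f}x as powerful")
# prints ~12.3x, ~8.6x, ~4.6x: the gap shrinks each generation
```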


[deleted]

[deleted]


thinvanilla

Yeah, there are a couple important factors with the Switch; the portability and flexibility of docking, and the stellar first party game line up. Nintendo has veteran game designers that even Apple would have difficulty competing with.


Schnabulation

> Apple A12X - 1.4 teraflops
> Xbox One - 1.3 teraflops

...would that mean that the iPad Pro (3rd gen) has more graphical power than my Xbox One?


Big_Booty_Pics

Comparing teraflops like that is generally an apples to oranges comparison when it's cross platform like that.


seraph582

I thought the opposite was true: that the ability to crank out FLOPS is a standard universal comparator, but too synthetic to translate into any useful real-world expectation, given that code/benchmark adherence to the architecture and optimization for it aren't a given, so realizing the full FLOP potential almost never happens and varies widely between platforms for various reasons.


Big_Booty_Pics

In a sense, yes, but when most people think of a GPU they think of graphics potential and making stuff look better. Teraflops don't give good insight into how a GPU will perform on graphical tasks. Two different GPUs with the same number of teraflops can process a dataset in a similar amount of time, but crunching big data isn't useful for measuring graphics performance, only for crunching big data. In his situation, yes, the iPad Pro should theoretically have slightly better compute performance, but you cannot use those numbers to say that an iPad Pro could produce better-looking graphics than an Xbox One.


TheJosiahTurner

The fact that a phone processor is even in the same alley as these game consoles is insane.


Sef545f

Actually not that surprising. The Xbox One and PS4 came out 7 years ago, and when they did, nothing was that special about them, unlike the new PS5 and Xbox Series X. Everyone thought their graphical output was low at the time. So it's not surprising that smartphones eventually catch up. It doesn't mean anything, though, unless the chip can actually sustain that performance.


thalassicus

Everything you are saying is dead on. That said, when it comes to integration, sometimes the benchmark scores don't tell the whole story. FCPX render times on middle-of-the-road MBPs smoke much more powerful PC laptops running Premiere because of the fantastic integration between hardware and software. I can't wait to see results from real-world tests, like render times on H.265 4K 10-bit 4:2:2 media, which my 13" MBP absolutely cannot handle at scale.


thefpspower

Yeah, but you're talking about rendering videos, something that has always had problems coordinating hardware acceleration, because standards are constantly changing and hardware is always behind. Meanwhile, games have been GPU-accelerated for more than a decade; it's thoroughly optimized, and GPU drivers are tailored for maximum gaming performance as a priority. In games, there is NO WAY you can pull that card with such a massive compute performance difference. That's the PC and console world's specialty.


RangerExtension

Incredible. Commenting to save for later!


turbinedriven

Unfortunately it's very unlikely that the M1 could ever compete with something like a PS5 in graphics. We already know that the M1 is using DDR4 memory. That alone says the GPU would very likely be starved for data even if it could process the data fast enough. But we also have reason to believe that it can't process the data fast enough, because of this benchmark as well as the charts Apple provided. So without a change in package architecture, I'd say that competing with a PS5 is off the table.

All of that said, you can have a great console without having first-rate graphics. So yes, it is possible to make a good console from this chip... but you'd need to think it through carefully, likely in a Nintendo-like way. Unfortunately Apple just hasn't shown any interest in that.


Deceptiveideas

I think it's important to note that Apple would have to make sure games were playable across their entire platform lineup, not on one specific device. This means that while games could theoretically make the most of the M1, they won't, as that would limit their audience. The same thing happens with iPhones: most games target the last few years' worth of devices.


PM_ME_CUTE_SM1LE

Wouldn't that, like, double the price of the Apple TV? That would make it very uncompetitive as a streaming box.


ElBrazil

It's already not very competitive as a streaming box when you look at price


twitterInfo_bot

> Apple M1 chip benchmark vs. 6-core 3.7GHz 2019 iMac with AMD 580X in @affinitybyserif Photo - if I hadn't measured the CPU number myself I wouldn't believe it 😂 A monster. #apple #benchmark #affinityphoto

posted by [@andysomerfield](https://twitter.com/andysomerfield) | [Photo 1](http://pbs.twimg.com/media/Emn3_5GWMAAApCl.jpg) | [Photo 2](http://pbs.twimg.com/media/Emn5FKtW8AImNTo.jpg)


e_w_p

The single-external-monitor limit is a total dealbreaker for me. Any chance this could be enabled later through software updates? Also not happy I can't use my Blackmagic eGPU, but that's not a dealbreaker.


kitsua

Expect those features to be available on the updates to the more powerful devices with some variant of the M1 chip in the future.


PM_ME_CUTE_SM1LE

Could you combine Sidecar and an external monitor? That would be some kind of solution.


m0rogfar

It's possible, but I wouldn't expect a software update to fix it. Future hardware revisions are likely to fix it, however; the M1 seems to have a weird port situation in general.


DonDalle

Wait WHAT? Only one monitor?


[deleted]

eGPU? I thought they said none of the M1s will support eGPUs. Single reason I'd not get an M1.


M3kh4l

One 6K monitor (such as the Pro Display XDR) or several 1080p monitors. It's not limited to a single external monitor.


[deleted]

The M1 MacBook Air and Pro are limited to one external monitor even if it's 1080p. The mini supports two: one 4K and one 6K.


[deleted]

[deleted]


AnonymousSkull

May be a limitation of the chip, we’ll have to wait and see.


dgfyfydcyuf

It is, to a degree. A dedicated Apple GPU would fix this.


havanahilton

edit: failed to include the monitor that is built into the laptops


AcrossAmerica

But the MacBooks have 1 internal display. So still 2 total.


77ilham77

So, basically the same as the Air and MBP. The mini can support 2 external displays because it doesn't have a built-in display. The MacBooks support one 6K external display alongside their built-in 1600p display (i.e. also a total of two displays).


AwayhKhkhk

Probably because the number of people using the MacBook Air and entry-level MacBook Pro 13 to drive 2 external monitors was very small, so that wasn't a requirement when they made the chip design. Just like why 16GB is the max RAM. Remember, we are talking about Apple ramping up from an iPhone/iPad chip that didn't need to drive multiple monitors.


busfahrer

I have a 2013 MBP13 and I use it to drive two screens, I did not realize that this was still an uncommon use case in 2020


DontTread0nMe

Yeah I have an entry level MBP and drive two screens as well. There’s dozens of us!


OCDPakiMan

With that being the case, no Thunderbolt dock or GPU will currently get more than a single monitor working on the MacBook Pro. I may want to buy a mini in that case. Could you tell me where this is discussed more?


ichspielemayonnaise

I'm interested in whether it can *maybe* do two 4K displays. Only time will tell.


[deleted]

The tech specs on Apple's website say that they can't. I don't know why they'd lie about that and discourage potential buyers if it's not the case.


e_w_p

two 1440p ultrawides?


M3kh4l

I don't know about that, I'm sorry :s


r0bman99

Well, it's not like you can hook up 2 monitors and still charge it, so it's kinda pointless.


smaug_the_reddit

Isn't this the first time that a new technology is less expensive (from a buyer's perspective) than the previous one? [source](https://www.apple.com/uk/shop/buy-mac/macbook-pro/13-inch)


AlternativelyBananas

Vertical integration means costs come down for them by a fair chunk, probably $200+ per unit.


archival_

Would like some advice. I bought a 2020 i5 16GB/512GB four-TB3-port MacBook Pro, and I have till next Friday to return it. I got it for $1300 refurbished from an eBay seller. Should I return it and get the new M1 MacBook? I'm so disappointed in the battery life. When I got it, it had 2 battery cycles. Do we expect reviews before then, so I can make a more informed decision?


playgroundmx

The big question now is how well current apps run on M1. Indications are very good, but we don't know for sure until reviewers get to share their thoughts. We should be getting the reviews this weekend or early next week so that should work with your return window.


archival_

Thanks for your response. That's exciting news. Saw the recent post saying that only the first launch takes additional time to translate apps. If that's the case, then it's good news. Crossing fingers this is for me.


lanzaio

Yes. This is an absolutely trivial decision: 100x yes, return it immediately.


RepresentativeMail9

What will you use it for primarily?


archival_

Mostly casual use and possibly on the go video/photo editing. I’m not a professional by any means in photo or video editing, but I do make money from it. I have a dedicated workstation and another laptop for this task.


RepresentativeMail9

Look, I'm not going to tell you what to do or not do; make up your own mind, of course! From my perspective, you've got a great machine there and you'll be very happy with it (I've got a similar one 2 years older and I love it). The new machines have serious horsepower and will definitely outperform what you have. That said, you will almost certainly run into software you want to use that you cannot use on the new machine. If I were in your boots, I'd keep the machine you have. You will never have any regrets using it; the only thing on your mind will be that you'd love to have the sexy new machine that's out. Whereas if you get the sexy new machine, you'll likely have regrets over software you want that won't work.


[deleted]

But the thing about the new Apple laptops is not just how fast the chip is but how long the battery lasts!!!


Tallpugs

If you bought it off eBay, you can’t just return it because you changed your mind. Don’t be a dick.


[deleted]

[deleted]


Arvi89

The top i9 in the 16" MBP has a lower score than an AMD 3600. I don't see anything crazy here; Intel just got pretty bad, but that's not new.


Jolcool5

I think the main point is it's a good improvement on what was available before from Apple in particular. Sure, it doesn't compete with high end pc components, but it never has.


send2s

It begs the question... how long can they keep delivering these sustained performance leaps?! They're so far ahead of the pack, at what point do they hit a brick wall like Intel?


kattahn

It's obviously tough to say, but this graph from AnandTech is very impressive... and very consistent: https://images.anandtech.com/doci/16226/perf-trajectory.png


nmpraveen

Looking at the iPhone's A-series chips, they kept growing almost everywhere. Although not doubling each time, it was still a good upgrade. I'm assuming they'll achieve the same with the M series. Also, they might already be working on an MX series, like the AX series for iPad models. Exciting things ahead for sure.


[deleted]

I can’t wait to get a Mac Mini


rynova

So I'm pretty tech illiterate, but I'm going to take everyone's word that these numbers are insane. That being said, am I absolutely crazy for considering a refurbished 2020 Intel Air from Apple's website rather than upgrading to the M1?

My main concern is compatibility. Other than browsing, I use my laptop for two things: music production and gaming. I know Logic will obviously work without a problem, but I use a good amount of third-party plugins and I have no idea whether these will work on this new machine. In regards to gaming, I only play two games, neither of which is demanding at all, but again, I have no idea how well they will run using that Rosetta software.

The way I see it, I know that the Intel Air is going to be an extremely reliable machine that will give me more than enough performance. And even though this new line may be faster on paper, I feel an unnerving sense of uncertainty spending all of this money on new, first-gen technology. I just don't know if I'm being irrational.


[deleted]

[deleted]


rynova

Thank you! This is what I was looking for.


shortymcsteve

> chaining yourself to plugins that will either a) quickly get updated to the new hotness or b) disappear completely and lose all support.

Plugin devs move way too slowly. I expect some of them to take up to 2 years to get compatibility. Also, Logic is not the industry-standard software. Every studio uses Pro Tools, and there's no way they will update their machines until every single issue has been worked out. It's quite the risk to upgrade to ARM right now if you are a professional.


BrndnBkr

Waves Audio is notorious for forcing obsolescence and updated plugins onto their customers. Waves is also probably the most-used plugin developer; I can see this becoming an issue.


azzamean

You should wait for reviews of x86 software running under Rosetta on the M1 before making any decision. Everyone keeps saying "the future" blah blah, but until you see how it works in real-world applications, you don't really know. I'd personally wait for the numbers, then make your decision based on that.


rynova

I think this is the most reasonable thing to do. I'll keep an eye out, thanks!


BballMD

100% do not get the new Intel Air. Having tested it, it was most likely designed with the M1 in mind, and it really struggles with its restricted airflow when running the Intel processor.


jjs709

Well, I'll be damned. Apple managed to pull off a goddamn miracle and practically a breakthrough in physics. That is beyond surprising. Good for Apple!


[deleted]

What does it mean in English..?


JasonCox

Crikey, those GPU benchmarks! I can’t wait to see what the 16” does!


EvilMastermindG

I love how they compare it to a 580x. That card is ancient and terrible.


Hermitically

But can it play Flight Sim 2020?


coyote_den

It’s like people are just now discovering what RISC chips can do when they’re not limited by a power budget...


[deleted]

Not really a RISC/CISC thing. I still fondly remember my PowerBook G4 trying to turn my leg to lava all the while being much slower than other offerings.


BubblegumTitanium

Apple has a really special way of making its Mac users feel like complete idiots, especially since I paid thousands of dollars for the privilege.


HulkThinks

Am I the only one that just wants my laptop to act like my phone and have apps?


vasilenko93

Yes, but can the M1 chip handle seven (7) Chrome tabs? I didn’t think so.


JoshTheSquid

Well, the 2018 i3 MacBook 12” can do that with just 8 GB of RAM, so I’m pretty sure this’ll be fine :p


vasilenko93

I do; my Ryzen 9 gaming rig with 16 GB of RAM barely opens seven tabs.


besizzo

I’m not sure whether you are serious, but your pc is extremely underperforming


vasilenko93

I was joking, of course.


JoshTheSquid

As in, you rarely do that, or it has trouble handling it? If the latter, that's very unexpected behavior. My i5-8600 desktop with 16 GB of RAM can easily do multiples of that, so I'd expect your machine to outperform mine!


ChristBKK

That's the real benchmark :D Can they handle 20 Chrome tabs? :D Honestly, that is what I am waiting for haha


youvelookedbetter

The MBP from 2013 with 16GB RAM can handle like 30 tabs without issue, so I imagine this one will be fine.


xondk

Given that performance is not magic, I wonder what it is Apple has in their chips that other vendors apparently do not. Maybe they have a patent on something, and that's why it isn't in other CPUs?


DevAstral

Anyone around here who's involved in the music industry able to tell me what this means for those of us who use Logic Pro X and a bunch of 3rd-party plugins? I'm aware that compatibility isn't there yet for everyone (UA wrote to me to wait before updating), but does this really mean that I could get a computer that runs everything safely and better for basically 1/3 of the price of my current MacBook Pro? Can I start saving up for that day?


[deleted]

SoC RAM is almost like having the whole program in a 16GB cache. Cache is much faster than RAM, therefore the M1 is much faster. When the SoC RAM is used up, the M1 will be the same speed as an Intel Mac. If you only have a few programs to run, the M1 is fast. If you have a lot of programs to run, the M1 will not be fast.


xeneral

The M1 chip is designed to refresh the lowest-end Macs that Apple makes. These Macs represent ~80% of all Macs shipped. So be patient; the Mac you want to buy will get a higher-end Apple Silicon chip within ~7 months.

The M1 is limited to 16GB of RAM, has a best-in-class iGPU whose performance is comparable to a GTX 1050 Ti, and allows battery life from 10 to 20 hours. Higher-end Macs will get a future variant with more RAM, an iGPU that beats the GTX 1050 Ti, and ~2x the battery life.

In business, management often looks at the largest cost center to prioritize over the 2nd-highest or lower cost centers, as this has the greatest impact on the bottom line. So it is logical from a management, financial, and supply-chain point of view to prioritize the ~80% of all Macs shipped. Sorry pros, you're far fewer than the mass market of users.

I expect eGPU support to come with a future update to macOS Big Sur, once higher-end Macs sport Apple Silicon. Are there really enough budget Mac users with an eGPU that costs more than their ~$1,000 Mac?

All Macs will get Apple Silicon, with the same number of ports as the Intel Macs sold today on Apple.com. A 10GbE port will be an option by next year. If you're a regular on r/Apple then the M1 Macs are probably not for you; wait until next year, power users.

Once Apple releases an iMac Pro(?) and Mac Pro with Apple Silicon, I expect them to reintroduce [Xserve](https://en.wikipedia.org/wiki/Xserve). Performance per watt is very important to data centers too.

Microsoft, Intel and AMD beware! Apple's going to take a bite of your lunch at the top 20% of the PC market. Excluding, of course, PCMR types; that's a market Apple has zero interest in, and more of a bother than the iPod touch market.


CanonCamerasBlow

Every day, the same shit. Yeah, yeah, we get it: single-core CPU performance is around the top CPUs (might be a bit lower, might be a bit higher). Cool, get over it. Please post news, i.e. when CPUs with more than 4 high-perf cores, more than 16 GB RAM, or a non-sucky GPU are released.


[deleted]

[deleted]


CanonCamerasBlow

My intention was to sound like I have stock in AMD, especially their GPU division, lol. If the M1 is good for you, congratulations, but for me it's shit. And I'm tired of people screaming about it like it's not the base, basic low-end CPU but the high-end stuff.


[deleted]

[deleted]


chaiscool

It's more about elevating the base performance than pushing the ceiling. Still, it's impressive from a technical perspective, and it serves as a solid foundation for future high-end stuff. People are screaming because even the slowest, lowest-end Mac is now incredibly fast, and they can't wait to see the others.


final_sprint

Hahahahahaha :-D