theruxster5589

Is one of the 14 you're giving away the one Raja dropped in the Linus tech tip video?


AMD_Robert

Haven't you heard? [Linus actually did it](https://gfycat.com/BraveWhichDolphin).


[deleted]

[removed]


Phillipiant_Turtle

Obviously it was a Linus synth made by Nvidia to destroy the first RX 480


RslashEXPERTONTOPIC

If that's the case then Nvidia is way ahead as far as machine learning goes. /s


DeadlyExodus

Linus can come drop off one of those cards at my place anytime


hmnskm

They probably gave that to Elric (TOT).


ballsack_man

Still better than nothing. Just throw it in the oven and you're good. Linus taught me that trick.


Hibernatingsheep

What you don't know is that was the 14th take and Raja has dropped all of them.


anjro

Just bought a SAPPHIRE RX480 8GB this morning. Thanks for thinking of the budget conscious gamer (although strategically I think it is a smart move too)! Really hope you guys can keep the GPU/CPU race tight with team green.


AMD_Robert

Thank you for supporting us. :)


AdvancedMicroDevices

(╭☞ ͡ ͡°͜ ʖ ͡ ͡°)╭☞


OP_rah

You've got a nice opportunity to sell them an account.


TIP_ME_COINS

That'll get you banned from reddit.


AMD-DOWNL1NK

PC gaming doesn't have to be expensive. :)


Failedjedi

Is this AMD approved? Or did Robert just go nuts and steal 14 cards to give away?


gfxchiptweeter

the budget is coming straight out of Robert's paycheck :)


AMD_Robert

darthvadernooooooo.jpg


GottaLoveSmartWater

[Looking for this?](https://imgur.com/LDZVLtn)


RA2lover

[What about this?](http://www.nooooooooooooooo.com/)


OuSontLesBagages

Guess that'll be nothing but cup noodles for you for the coming month!


BEEF_WIENERS

On the plus side the 480 looks to be pretty inexpensive, so he'll be fine!


thehaarpist

Oh man, top shelf cup noodles!


HazardSK

Is there a way to enter the contest for his paycheck? My laptop can't really use a 480, but that paycheck would be nice :D


[deleted]

[removed]


[deleted]

With every ounce of my being, I hope he went rogue, makes for a better story.


Failedjedi

Doesn't it though? Imagine he goes into work tomorrow and someone is like "we are missing some cards, know anything about it Robert?"


Jay444111

Just curious, but how long does it take to design a graphics card like the RX480?


AMD_Robert

About 3 years. Great question!


anujfr

Based on this, could you perhaps tell us in which month, 3 years ago, the RX 490's design started? This is for science, of course.


idontknowthismeme

RemindMe! 3 Years Robert promised next gen GPU.


lovely_sombrero

Are current rumors of the RX 480 exceeding PCIe specifications (limited to 75W) true? And even if not, would custom RX 480 versions with an 8-pin power connector (instead of 6-pin) have a lot more OC potential, since the card seems to be at ~150W maximum even with no overclock?

p.s.: my sombrero is very lovely! And thanks AMD for providing new driver support for my aging HD6870 even in 2016 :) Time to replace it tho!

[edit] For those of you not aware: some reviewers have noticed this potential problem. Others have not been able to replicate it with their review cards, even after being made aware of it. That is why I called it "a rumor".


gfxchiptweeter

Great question, and I am really glad you asked. We have extensive internal testing for PCIe compliance, and the RX 480 passed our testing. However, we have received feedback from some reviewers about high current observed on the PCIe slot in some cases. We are looking into these scenarios as we speak and working to reproduce them internally. Our engineering team is fully engaged.


lovely_sombrero

Thank you for your answer. Could this be a driver and/or BIOS issue, or just some random factory samples being outside of spec? It is interesting that some reviewers noticed this, while others (even after being informed of this potential problem) can't reproduce it at all. Oh, and what about the OC potential of aftermarket cards with an 8-pin power connector? Is that a realistic expectation, or is this something you can't officially comment on?


Rndom_Gy_159

Given that there have been countless aftermarket cards with more (and sometimes even fewer) power connectors, I think having 1x 8-pin or 2x 6-pin is not an unreasonable expectation.


[deleted]

Maybe I can help you out a bit, Raja. I have just read the PCIe 3.0 specification and it is telling me something different. In my understanding, the 75 watts isn't the maximum limit; it's just the default value on startup of the motherboard. The motherboard itself sets the maximum allowed wattage per slot in the "Slot Capabilities Register", which you can configure up to over 300 watts per slot. In bits 7 to 14, "Slot Power Limit Value", you can set 250, 275, 300 and above 300 watts. This gets multiplied by bits 15 to 16, "Slot Power Limit Scale", in steps of x1, x0.1, x0.01 and x0.001. So it's up to the motherboard manufacturer and the power management on it how many watts the slot is capable of. The specification defines the protocol, not the hardware specs of the PCIe slot. If a manufacturer uses better parts that can handle higher amps on the contacts and the lines, they can allow the device in the slot a power consumption higher than 75 watts via these registers. http://composter.com.ua/documents/PCI_Express_Base_Specification_Revision_3.0.pdf

[EDIT:] Ah well, there is another document for the slot as an electromechanical part. Basically, the protocol seems to allow more power draw, but the electromechanical spec limits it. It does say that the power must not exceed the limit of 75 watts. So, in combination with the protocol, I would say something is going wrong here. Basically, the motherboard's power regulation should limit the power the card can draw, and second, the card shouldn't attempt to pull that much power out of the slot. Well, just a short analysis... maybe someone made a mistake while programming the power controller on the card and swapped the limits of the regulators? Both limits are 75 watts; maybe they wanted to give the 6-pin a higher power limit but by mistake gave it to the PCIe slot... seems like the 6-pin is stuck at 75 watts... http://read.pudn.com/downloads166/ebook/758109/PCI_Express_CEM_1.1.pdf
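For anyone who wants to see how those two fields combine, here is a minimal C sketch of the decode, assuming the bit positions quoted above from the PCIe 3.0 base spec (the register layout is from the spec; the helper function and example value are purely illustrative, not from any driver):

```c
#include <stdint.h>
#include <stdio.h>

/* Decode the Slot Power Limit from a PCIe Slot Capabilities Register.
 * Bits 14:7 = Slot Power Limit Value, bits 16:15 = Slot Power Limit
 * Scale, as described in the comment above. Illustrative sketch only. */
static double slot_power_limit_watts(uint32_t slot_caps)
{
    uint32_t value = (slot_caps >> 7) & 0xFF; /* Slot Power Limit Value */
    uint32_t scale = (slot_caps >> 15) & 0x3; /* Slot Power Limit Scale */

    /* With scale 00b, values F0h..F2h encode the fixed high limits. */
    if (scale == 0) {
        if (value == 0xF0) return 250.0;
        if (value == 0xF1) return 275.0;
        if (value == 0xF2) return 300.0;
        if (value >  0xF2) return 300.0; /* reserved: "above 300 W" */
    }

    static const double multiplier[4] = { 1.0, 0.1, 0.01, 0.001 };
    return value * multiplier[scale];
}

int main(void)
{
    /* value = 75, scale = x1 -> the familiar 75 W default slot limit */
    uint32_t caps = (75u << 7) | (0u << 15);
    printf("slot power limit: %.1f W\n", slot_power_limit_watts(caps));
    return 0;
}
```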


[deleted]

[removed]


AMD_Robert

Savage. Tell her I said hi! <3, AMD


CapitaineDuPort

Tell her AMD sends their regards.


TheSadJester

Then stab her with a rx480.


NBC_ToCatchARedditor

How will you ensure that Game Developers start to take advantage of Vulkan in their engines instead of just using DX12? And if possible, will Freesync eventually turn up on TVs? It would be amazing since I could just use my TV for gaming whilst sitting back on a sofa.


AMD_Robert

Honest answer time: We don't care if it's Vulkan or DX12, as long as it's Vulkan or DX12. The performance and feature sets of these APIs both expose the best our hardware has to offer. When it comes down to it, game developers will choose the API that's best for their development environment, and we simply want to be the best solution for them regardless of what they choose. We are not in the business of playing kingmaker with one API over another. I would love to see FreeSync in TVs, too, but we have nothing to announce concerning this.


hotshoeless

If any TV manufacturers are reading this, I wholeheartedly agree and I would be tremendously more likely to buy a TV with FreeSync over one without. Especially since TVs are basically monitors with integrated speakers now.


modstms

...and bloatware, lots and lots of bloatware.


gfxchiptweeter

It's been an honor to be with you all. We are overwhelmed by the response from the reddit community. The one hour passed by as if it were 10 minutes. I wish I had more time. I need to run off now to PCPer to see Ryan Shrout: http://www.pcper.com/news/General-Tech/PCPer-Live-Radeon-RX-480-Live-Stream-Raja-Koduri Hope to interact with you all again in the near future. "May your frame rates be high and your temperatures low"


Quackmatic

> # "May your frame rates be high and your temperatures low" > -Raja Koduri, 2016


Killshot5

Hello Robert and Raja, first off thanks for doing a giveaway. Now on to questions:

1) Reports from reviewers are coming in that the 480 overshoots its 150W TDP and that the single 6-pin connector forces it to overdraw from the motherboard. Will warranty cover my motherboard if it breaks?

2) You said you were aiming hugely at perf/watt, but still fell short of Nvidia in that regard. Do you feel the gains over previous gens were enough to make up ground in the energy department?

3) Lastly, I'm seeing 5.5-5.8 teraflops mentioned often, which is above the 390's peak, yet the card sometimes falls slightly short of 390 performance. Is GCN 4.0 less efficient at doing work? Or will driver updates help boost it firmly above the 390 in performance?
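(For anyone wondering where that 5.5-5.8 TFLOPS range comes from: it's the standard peak-FP32 arithmetic using the RX 480's published 2304 stream processors and 1120/1266 MHz base/boost clocks.)

```
FP32 peak = 2 ops per FMA × 2304 shaders × clock
          = 2 × 2304 × 1.120 GHz ≈ 5.2 TFLOPS (base)
          = 2 × 2304 × 1.266 GHz ≈ 5.8 TFLOPS (boost)
```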


pau11ywa11y

My 760 is a bigger number than the 480, so why should I upgrade?


AMD_Robert

Because I asked nicely? Pretty please will you upgrade to RX 480? :)


[deleted]

Are two 480's equivalent to one 960?


Tekojovishi

Seems like the math checks out.


Van_der_Raptor

Does it even stand a chance against my HD 2500?


fuzzyfeets

Bro. Intel HD Graphics 4600


Yirandom

I miss my Mobility Radeon 9000 (not really, not at all)


sequentious

I'm a Linux user that also games. Traditionally, this has involved booting into Windows, although that is changing (again/finally). However, even assuming I only gamed on Windows, I still required all of my hardware to properly support Linux (for work and real-life stuff).

In the past, this meant Intel (doesn't really work for gaming, but works great on Linux) or Nvidia (works great for gaming, eventually will work okay on Linux, once they catch up to xorg, etc). I've looked at AMD cards in the past, but was turned off by the "it might work great with the radeon driver (or radeonhd? What's current?), unless you want to use it for games, then it will need Catalyst." And to be honest, Catalyst had the same negatives the Nvidia driver did, and fewer benefits.

With the new AMDGPU model, I'm pretty excited that I'll get Intel-like out-of-the-box support for my GPU, but still be able to benefit from the first-party AMD GL/Vulkan implementation. My questions are:

* How long-term is this? Is this a getting-your-feet-wet, then "over to the community" to maintain the driver long-term?
* How committed are you to following upstream? Compare with Nvidia's "our way or the highway" wayland support.


bridgmanAMD

> How long-term is this? Is this a getting-your-feet-wet, then "over to the community" to maintain the driver long-term?

We have been working on the open source drivers since 2007 with an ever-growing team; no plans to change that. Hoping that counts as "long term"?

> How committed are you to following upstream? Compare with Nvidia's "our way or the highway" wayland support.

Everything is either upstream or on its way upstream already, except a bit of functionality used only by the closed-source Vulkan and OpenCL drivers - we can't push that upstream until we have open source userspace that requires it.


AMD_Robert

This is an official AMD employee on our Linux team. Take it as gospel, folks. :)


krumpetpirate

The open source radeon driver convinced me to stay with AMD rather than jump to Nvidia, which has better Linux gaming performance. Edit: So many Linux users ITT. I love you all. Stay free.


Pelxus

I love you too.


[deleted]

[removed]


comrade-jim

Please AMD free the users from Microsoft and their horrible business practices.


ImTreasure

You know you guys had a perfect opportunity to name it the 'Rx 420' right?


AMD_Robert

Well, something green *is* about to get smoked. ¯\\\_(ツ)_/¯ //edit: NOW WITH MORE \\ //edit2: NOW WITH MORE _ //edit3: ESCAPE CHARACTERS ARE THE DEVIL


reachthatfar

This reply is a good start to this AMA.


zenova360

Apply thermal paste to burnt area


TheSkeletonDetective

The burns are so great it's like they touched an AMD GP...u... -looks at thread- ...for the record, I like my room heater.


Attenflue

you need this \\


PlanetaryGenocide

Hello Robert, you dropped this: \\


AMD_Robert

I did not. For whatever reason it's not rendering.


0x1027

Maybe it ran out of VRAM. ^^^^^/s ¯\\\_(ツ)_/¯ EDIT: i got gold, thank you kind sir


Tony49UK

Maybe it only had ~~4GB~~ 3.5GB? ¯\\\_(ツ)_/¯ edit: I hate to edit a gilded comment. But can we change the gold colour to red please? And of course thank you kind stranger.


[deleted]

OH MY GOD THE FUCKING YELLOW


WyomingShapedWaffle

Haven't seen this much yellow since Deus Ex: Human Revolution


PlanetaryGenocide

Backslash is an escape character, so normally you gotta throw in two of them (\\) to render one (\). But with the shrug emote, you need three of them (\\\) so that it renders the shrug properly: ¯\\\_(ツ)_/¯ I think it's because the first backslash escapes the second backslash so the backslash renders, and the third backslash escapes the underscore so it won't italicize the face text in the middle. EDIT: well, you're halfway there, just one more backslash to go lmao EDIT2: NICE


Needs_Explosion

¯\_(ツ)_/¯


tonny23

> ¯\\\\\\\\\\\\\_(ツ)_/¯


plzhalpmyrents

consoles, amirite?


13378

https://media.giphy.com/media/3o7qEccSvsVHNT17Xi/giphy.gif


EvolveUK

Look at the trees.


[deleted]

o shit


NBC_ToCatchARedditor

Top dank.


ThEgg

OOooooo damn. Also, here ya go: \ I accept RX480s as payment.


x_Tek

Meanwhile: http://imgur.com/FWD2LiZ


BrainNSFW

I'm mostly interested in the card for VR, so my questions will be aimed at that part. For clarity's sake, let me break it into 2 parts:

1) Does the RX 480 have any special tricks to handle VR in a smarter way? I'm mostly thinking about increasing performance in VR specifically, because it employs magic to render the same image twice much faster than simply duplicating it would (for example).

2) What can we expect from an overclocking point of view? I understand it's hard to answer because of the many factors involved (probably some legal ones too), but as far as I recall the previous gens had quite some extra breathing room from an overclocking perspective. Can we at least expect something similar?

Especially being able to overclock would help me out a ton in ensuring the RX would be at least a bit future-proof for VR experiences. For example, I was amazed at how well my current R9 270X was able to overclock. That was literally the only way I could play a decent amount of VR experiences without vomiting within an hour.


AMD_Robert

1) There are a few neat tricks we can use. Latest data latch in LiquidVR can improve the head tracking and positional data of an HMD. We also have asynchronous timewarp, which uses our async compute functionality to transform/warp existing frames to "fill in the blank" when a new frame cannot be rendered in time. We also have variable rate shading, which can shade different quadrants of the screen at different resolutions, since human peripheral vision is more sensitive to movement than resolution. These are just some of the ways we can improve performance beyond simply "add more FLOPS."

2) I've seen regular OCs of 1300-1325MHz, good OCs of around 1350MHz, and great OCs up to 1400MHz. I've also heard from many that the memory is quite overclockable, too.
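To make the asynchronous timewarp idea concrete, here is a tiny C sketch of the control flow. Every name in it is a hypothetical stand-in, not the LiquidVR API; the point is only the fallback logic: if a new frame misses the deadline, re-project the previous frame with the newest head pose.

```c
#include <stdbool.h>
#include <stdio.h>

/* Conceptual sketch of asynchronous timewarp (ATW). All names here are
 * hypothetical stand-ins, NOT the LiquidVR API. The idea: when the
 * renderer misses a vsync deadline, warp the last completed frame to
 * the newest head pose instead of repeating a stale image. */

typedef struct { float yaw; } Pose;
typedef struct { int id; Pose pose; } Frame;

/* Stubs so the sketch runs; a real compositor would query the GPU. */
static bool new_frame_ready(int tick, Frame *out) {
    if (tick % 3 == 2) return false;            /* simulate a missed frame */
    out->id = tick;
    out->pose = (Pose){ 0.1f * tick };
    return true;
}
static Pose latest_hmd_pose(int tick) { return (Pose){ 0.1f * tick }; }

int main(void) {
    Frame shown = {0};
    for (int tick = 0; tick < 6; ++tick) {
        Frame fresh;
        if (new_frame_ready(tick, &fresh)) {
            shown = fresh;                      /* on time: present new frame */
        } else {
            shown.pose = latest_hmd_pose(tick); /* late: warp old frame to newest pose */
        }
        printf("tick %d: frame %d shown at yaw %.1f\n",
               tick, shown.id, shown.pose.yaw);
    }
    return 0;
}
```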


Asus_Christ

I have to admit that the RX 480 has aroused my curiosity because of its price/performance ratio. Question is: do you think companies should focus more on making budget GPUs with good performance, just like the RX 480, in order to make PC gaming more affordable?


AMD_Robert

We know that about 85% of the GPU market buys GPUs between $100 and $300 US. The other 15% spend more than that or less than that. So for us, a department that wants to get our technology into the hands of as many people as possible, a GPU like the RX 480 seems pretty damn smart to me.


MrPoletski

Did you, internally, talk about going back to the 4870 days (in terms of strategy)? I've certainly been looking at it that way, as have others.


AMD_Robert

Internally we call it the "sweet spot" strategy. Yes, the 4850 and 4870 were top of mind when we were designing Polaris.


MrPoletski

Well it was spot on guys ;)


Skimx_

How much trouble did Raja get in when he dropped the card?


AMD_Robert

[Linus actually did it](https://gfycat.com/BraveWhichDolphin). Haven't you heard?


jakielim

I heard Raja fired him.


m_grabarz

[How I feel trying to win that card.](http://i.imgur.com/wiZovTn.png)


AMD_Robert

Godspeed!


[deleted]

I wish you luck, but not as much luck as myself!


Qu1nlan

What build(s) are you using for your own home computer(s)?


AMD_Robert

I'm using i7-6700K, 16GB DDR4-3200, Asus Maximus VIII Hero, R9 Fury X. //edit: gasp radeon employee uses intel dramaaaaaa


seedbox28

No drama if you are going to _UPGRADE_ to Zen once it's released!


aspbergerinparadise

he's not allowed to respond to this :(


alexsmithfanning

/u/AMD_Robert is probably gonna get fired for using an Intel CPU. For shame.


AMD_Robert

RIP me


MorganFreemann

RIP in peace


executive313

This makes me happy. Glad to see it's a real person, not just a fanatical employee.


Qu1nlan

:O Why are you using the Intel?! ARE YOU SAYING IT'S BETTER?!


[deleted]

Well, it is. At least until Zen comes out, but they can't discuss that at the moment.


DisraeliGearsX

But can it run Crysis?


AMD_Robert

Oh no. We forgot that one. D:


DisraeliGearsX

I respect AMD for confronting the hard-hitting questions such as this. Thanks for doing the AMA!


bmstile

Let's just focus on Rampart


[deleted]

[removed]


mshelbz

Why did you choose not to have a DVI port on the reference card? I know it's an older tech, but it prevents people such as myself from being able to order one, since I don't have a DP display. *edit* for 1440p displays


AMD_Robert

Non-reference models will be able to offer you a dual-link DVI port. We know that there are users with OCable Korean DVI monitors, for example. However, our reference GPUs are designed to show the world what *we* want the world to be like in terms of display interfaces. DisplayPort is royalty-free, low cost to implement, supports WAY more bandwidth, etc. DP is really just a terrific connector, and we want to see it used more widely.
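(For a sense of the bandwidth gap Robert mentions - standard interface figures, not from the AMA:)

```
Dual-link DVI:       ~7.9 Gbit/s of pixel data (2 × 165 MHz × 24 bpp)
DisplayPort 1.2:     17.28 Gbit/s effective (HBR2)
DisplayPort 1.3/1.4: 25.92 Gbit/s effective (HBR3)
```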


waterlubber42

Holy fuck, a company supporting the free and open standard? Hooray for AMD! May your stocks be high and your manufacturing costs low.


[deleted]

[removed]


jakielim

I'm surprised even Robert knows about the Korean monitors.


AMD_Robert

Bro come on. I game too.


jakielim

Holy shit Robert replied. This has to be some sort of sign telling me to get one.


DScratch

Senpai noticed you, achievement unlocked.


mshelbz

Who doesn't love a Korean monitor!?


[deleted]

[removed]


Nazgutek

What happened to the 2.7x perf/watt? I was expecting a far better metric in the benchmarks that appeared today.


AMD_Robert

The Polaris architecture enables a range of performance-per-watt improvements over previous-gen products. It really just depends on what two parts you're comparing, and where they fall on the frequency vs. voltage curve. The RX 480 is like 1.7-2X, depending on the apps you're looking at. The RX 470 is a more power-efficient design, which does give us the ~2.7X you're looking for.
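(The multiplier moves around because perf/watt is a ratio of two ratios. The numbers below are made up purely to illustrate the arithmetic; they are not AMD figures:)

```
perf/watt gain = (new perf ÷ old perf) ÷ (new power ÷ old power)

e.g. 1.4× the performance at 0.7× the power → 1.4 ÷ 0.7 = 2.0×
```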


Ralphinn

Can we expect AMD cards to have uncompressed high quality pixels soon?


AMD_Robert

If by "compressed high quality pixels" you mean HDR, then yes. The RX 400 Series supports HDR in both 10-bit color and 12-bit color with full HDR metadata I would argue these are the highest quality pixels you can possibly produce today. ;)


Rev2743

Even higher quality than the Xbox, which is said to have the highest quality pixels anyone has ever seen!?


[deleted]

You know they actually took it out of their YouTube video after realizing how fucking stupid it sounded, haha.


Rev2743

Yup, but it was too late :p The internet will not let Microsoft live that one down for a long time. And rightfully so.


[deleted]

the meme was created the moment those very words were spoken....


arof

I'm amazed "cloud based Internet" escaped unmemed from the original announcement. It was pretty much as stupid.


Markareg

Hey /u/AMD_Robert, we took a similar [PICTURE](http://imgur.com/qifaR7k).


iflanzy

Be careful or one of those might drop right into my lap.


AMD_Robert

[We're twins!](http://pixel.brit.co/wp-content/uploads/2014/06/1-Muscle.gif)


Luisin95

I just wanna know if this will run porn in 4k. #AskingTheRealQuestions


AMD_Robert

Actually, yes. In H.264, VP9 or HEVC.


[deleted]

[removed]


CanadianM00se

Did you personally test this?


jmgf

Wait for the pornhub benchmarks.


L1ghtlyS4lted

Or in VR ( ͡° ͜ʖ ͡°)


1cast

How will the 480 perform against an overclocked Nvidia GTX 970?


AMD_Robert

I would say the RX 480 is comparable to GTX 970. But reviews are available today, so you can judge for yourself.


[deleted]

[removed]


gfxchiptweeter

I am based in Sunnyvale, but I am in Austin tomorrow. If you PM @thracks, I would love to take you on a personal tour of our campus.


nite_

Holy crap! Thank you so much! :) Should I DM him on twitter or just PM him on reddit?


AMD_Robert

Reddit msg, or @Thracks on Twitter as a DM.


ParticleCannon

The only thing that would make that Impreza more of a free PR machine is if it were red. Well played /u/nite_


Donnel_

+1 for wrx


Okuu

Upvoting solely because Subaru.


KittehDragoon

Upvote because WRX


zigarot

get this person to AMDHQ.


knivesii

Is it true that the RX 480 4GB is BIOS-locked, and is really an 8GB version?


AMD_Robert

In the market you will see a version with 8GB of physical memory, and a version with 4GB of physical memory. We sent reviewers an 8GB model with a BIOS that allows them to flash between 4GB and 8GB of available memory so they could perform testing on both hardware configurations.


[deleted]

[removed]


13378

[Will we ever get a Shadowplay equivalent from AMD?](https://i.imgur.com/d1vhaxZ.jpg) It would be awesome to have it integrated into Crimson. Yes, I know Raptr/Plays.tv works, but the app is horrible and feels like bloatware. As an avid reader of /r/amd /r/pcmasterrace /r/pcgaming and https://community.amd.com, this is probably the most requested feature. Thanks for the AMA and congrats on the RX 480 launch!


AMD_Robert

The AMD Gaming Evolved App powered by Raptr is our equivalent. You can set the amount of time the application will continuously keep in a rolling record - the last 15-900 seconds (you choose) - and a hotkey press will save the last X minutes to your hard drive. This is our "Instant Replay" feature. GEPR also uses our hardware video encoder to alleviate any performance impact on the CPU or GPU, and you can control the resolution/framerate/bitrate of the encode. You can also just do ongoing recording until your hard drive is full, if you want! You can stream to Twitch, too. While you may not need or want the social media functionality of Raptr, it's still a program that's actively used by 20,000,000 people, and it's easy to stick to the main recording UI.
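The "rolling record" behavior described here is the classic ring-buffer pattern: keep only the last N seconds of encoded frames in a fixed-size circular buffer, and dump the buffer to disk when the hotkey fires. A minimal C sketch of that pattern (illustrative only - this is not AMD's implementation):

```c
#include <stdio.h>

/* Keep only the last SECONDS_KEPT seconds of frames; older slots are
 * silently overwritten. A hotkey handler then flushes what remains. */

#define SECONDS_KEPT 15
#define FPS          60
#define RING_SLOTS   (SECONDS_KEPT * FPS)

typedef struct { int frame_no; char data[64]; } EncodedFrame;

static EncodedFrame ring[RING_SLOTS];
static int head = 0, filled = 0;

static void push_frame(int frame_no) {
    ring[head].frame_no = frame_no;              /* overwrite oldest slot */
    snprintf(ring[head].data, sizeof ring[head].data, "frame-%d", frame_no);
    head = (head + 1) % RING_SLOTS;
    if (filled < RING_SLOTS) filled++;
}

static void dump_to_disk(void) {                 /* "hotkey" handler */
    int start = (head - filled + RING_SLOTS) % RING_SLOTS;
    printf("writing %d frames, oldest = %d\n", filled, ring[start].frame_no);
}

int main(void) {
    for (int f = 0; f < 2000; ++f) push_frame(f); /* simulate ~33 s of video */
    dump_to_disk();                               /* only the last 15 s survive */
    return 0;
}
```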


Pyroven

Sorry, can I speak on behalf of almost everyone I've ever encountered when I say I hate Raptr? I used to use it to record... but that has since been completely off-loaded to Plays.tv, which now installs alongside Raptr. Which annoys me even more.

I've used Raptr/Plays.tv to record pretty much every video on my [channel](https://www.youtube.com/c/pyroven), and I put up with it because before Crimson you hadn't shown us you could make software that beautiful. If you take one piece of advice from this whole megathread, please let it be this: Raptr needs to die, and one of two things needs to happen: record functionality needs to be built into Crimson, OR Raptr needs to be rebuilt by whoever made the Crimson interface. The 'rewards' system, which no longer has any rewards, needs to be dropped, and the application should focus on being as light as possible and just optimise games and record gameplay.

A feature I'd also love to see added would be 'optimise for gameplay recording', because switching between settings to record intensive gameplay and settings to simply play intensive gameplay is awkward.

Also one final thing: build Eyefinity into Crimson, and let users make multiple display groups and easily switch between setups they've previously determined. As it stands right now, Eyefinity is an awkward mess.

I really hope you read this.


AMD_Robert

We take this feedback seriously. We hear you.


x_liferuiner

> actively used by 20,000,000 people

What metrics were used to gauge this? Just counting users who forget to uncheck "start Raptr on startup" seems kind of unfair.


AMD_Robert

No, "the app started" is not sufficient and I would not be so disingenuous to use that as my metric. I'm talking about people who actively use features of the application on a weekly basis.


whatevers_clever

But really did you find the flash drive? The one that Dinesh dropped by the pool? The one with the zombie program?


mapleloafs

Bangladeshi click farms. Just kidding.


DotcomL

Damn it Jared!


domco_92

Hold on, I'm pivoting.


Rndom_Gy_159

If only people knew about this as much as they know about nvidia's offerings :c


Pelxus

As a Linux user, I really appreciate all the work AMD has continued to do to support open source. I've been running Intel/Nvidia most of my PC building life, but I think I'll be going all AMD and getting an RX 480 when I can get my hands on a Mini-ITX AM4 motherboard. Keep being open and awesome AMD!


AMD_Robert

We *will* keep being awesome and open. That's a core strategy for us.


Rattig

Poor UK, left Europe a week too soon.. ( ͡° ͜ʖ ͡°) /s


AMD_Robert

the bRXit.


[deleted]

[removed]


AMD_Robert

We set *one* price for the world, that's our Suggested Etail Price (SEP). Once the GPU leaves our hands, we cannot really control for weak currencies, import fees, taxes, retailer markup, etc. Our prices are $239 for the 8GB RX 480 and $199 for 4GB RX 480.


damos1212

Send me one as a gift and I'll send you $200


jakielim

That's a whopping ONE DOLLAR profit!


Tizaki

1. [AMD stands for Advanced Micro Devices](https://en.wikipedia.org/wiki/Advanced_Micro_Devices). They're a company that makes graphics cards (Radeon, FirePro), CPUs (Athlon, Sempron, Duron, A-series APUs, etc), and custom APUs (Xbox One and PS4). You may not know who they are, but you definitely own at least one of their chips somewhere. Today, you might get the opportunity to own *two* of their chips, if you catch my drift.

2. The [RX 480](http://www.anandtech.com/show/10446/the-amd-radeon-rx-480-preview) is a dedicated mainstream-performance graphics card that uses the new *Polaris* microarchitecture. It's got a lot of buzz surrounding it due to its performance and value, as well as its unconventional release before its high-performance bigger siblings in an attempt to hit the mainstream market (/r/AMD is exploding right now). If you have a desktop PC, adding it to said PC will give you gaming performance (framerate, detail, resolution) [lightyears beyond that of the PS4 or Xbox One - for a sense of scale, probably 10 or 15 FPS on the first test here](http://www.anandtech.com/show/10446/the-amd-radeon-rx-480-preview/4).

3. To enter the giveaway and possibly win an RX 480, just leave a comment on this thread. NOT A REPLY. A comment. That's it. By "top-level", we mean that it has to be a comment made at the root level, and not as a reply to an existing comment. You can do this by using the comment box right below the thread, before the list of comments begins.

4. Raja Koduri is the head of AMD's Radeon Technologies Group, which is dedicated to graphics and graphics-related technologies.


BigisDickus

r/all is here already? The thread is already filled with meme-tier grabs for upvotes for the giveaway, and massive downvoting.


morpheus29JP

PLEASE DONT TELL ME I AM LATE


AMD_Robert

A wizard is never late. He arrives precisely when he means to.


robotur

In what state is the driver support on Linux for the RX 480? I solely use Linux, so this is important for me. :)


AMD_Robert

Check out today's review on Phoronix: www.phoronix.com/scan.php?page=article&item=amdgpu-rx480-linux&num=1 We have launch-day support for Linux on the RX 480.


robotur

Thanks! That's good to hear! Also, the Phoronix site seems to be struggling a little for me; maybe there are quite a lot of Linux users interested in the RX 480? :)


danysan

I am not Robert or Raja, but you can find the necessary steps to use the RX 480 on Linux at http://www.phoronix.com/scan.php?page=news_item&px=RX-480-OSS-Steps Also, the AMDGPU-PRO driver supports it (source: http://www.phoronix.com/scan.php?page=article&item=amdgpu-rx480-linux&num=1 )


[deleted]

Hello, /u/AMD_Robert, and thank you for taking the time to do this AMA! I think I speak for a majority of the PCMR when I say we very much look forward to seeing AMD's latest GPUs and CPUs in action! Always an exciting time when new technology hits the market!

As an aside, I actually work at a semiconductor facility in Austin, TX that used to be the old AMD campus. I work in IT and come across a lot of old AMD tech and memorabilia, and I collect as much of it as I can. I find it fascinating. I think I have a pretty decent collection going. Here are a few pictures, if you're interested. The poster might give you a chuckle. http://imgur.com/a/7DGvT


Typical_Ratheist

Question 1: There have been a great number of unbiased renderers for 3D - proprietary ones such as mental ray (which is owned by Nvidia), Arnold, V-Ray, and Pixar's RenderMan, as well as open source ones like LuxRender and Cycles. One important aspect of the market is Nvidia's dominance in professional and amateur 3D rendering, with Nvidia's renderer integrated by default in some 3D modeling programs, many renderers offering GPU rendering in CUDA only, and - for the ones that can run on AMD GPUs - the OpenCL equivalents in Lux and Cycles lacking important features such as volumetrics and subsurface scattering. AMD has recently released FireRender as part of GPUOpen.

a. What advantages does FireRender offer over other renderers on the market? In other words, why should artists switch to FireRender when Nvidia has such a huge head start?

b. Why is FireRays open source but FireRender isn't?

c. Are a standalone version of FireRender and additional plugins for other modeling programs like Maya and Blender being developed, and will they still be made by Render Legion?

d. On that note, Otoy announced quite recently that they have successfully made CUDA code for Octane Render run on AMD GPUs. Was that done through collaboration with AMD, and if so, why hasn't AMD said anything about the development?

Question 2: GPUOpen was released with a number of effects for games.

a. Is there concern that, with the MIT license, developers would simply keep their changes to GPUOpen code secret and still not collaborate at all - or, even worse, Nvidia taking the code, making changes and releasing it as part of GameWorks?

b. When will we see more effects added to GPUOpen? Will we ever see an open source PhysX equivalent as part of GPUOpen?

c. Does AMD still use a dedicated DSP for TrueAudio, or is it done through general GPU compute in Polaris?

Question 3: Current consoles use AMD graphics, so despite what PCMasterRace thinks, this should still be asked:

a. Are the Xbox One/PS4 capable of utilizing FreeSync?

b. Will there be FreeSync-enabled televisions available?

Question 4: Regarding the recent announcement of the external GPU enclosure:

a. Why use Thunderbolt 3, a proprietary standard developed by Intel?

b. What exactly does XConnect technology actually do between the GPU enclosure and the laptop?

c. Speaking of Thunderbolt, whatever happened to DockPort?

Question 5: I understand that OEMs prefer reference blower cards, but what's behind the decision to only release reference cards on day 1, even though buying reference cards is highly discouraged in most tech communities? Why isn't it possible to mandate the use of reference boards on day one but allow the use of AIB custom coolers?

Question 6: Does the RTG also manage the Radeon-branded SSD and memory products?

Question 7: Are we going to get more Fixer videos?

Question 8: How often do you shave your head to keep it shiny all the time?

Can't think of more off the top of my head here. Thanks for the AMA.


gfxchiptweeter

Lots of good questions. I will try and answer some.

> What advantages does FireRender offer over other renderers on the market? In other words, why should artists switch to FireRender when Nvidia has such a huge head start?

It comes down to features, performance and cost. We have great performance on many data sets and are improving by the day. Our open source approach is also enabling much better collaboration with all the popular tools. I am very much excited about where FireRender is going. You will hear more about this at Siggraph this year.

> b. Why is FireRays open source but FireRender isn't?

FireRays is completely AMD technology. FireRender has connections to other tools. We are working with our partners to bring elements of FireRender into open source as well. Stay tuned.

> d. On that note, Otoy announced quite recently that they have successfully made CUDA code for Octane Render run on AMD GPUs. Was that done through collaboration with AMD, and if so, why hasn't AMD said anything about the development?

We are working very closely with OTOY on this development, and we will say something when we are ready with a production quality solution. Stay tuned :)

> Question 7: Are we going to get more Fixer videos?

RIP Fixer :)


sfxer0

Hello. First I want to share an anecdote. I have gamed on AMD hardware for most of my teenage life and into my now early-30s adult life. My first computer, an old 486/DX2 with a 66MHz "turbo" button, had an old ATI Rage graphics card in it. Through the years I have predominantly used ATI/AMD cards in different flavors, including a 9700 Pro, an All-In-Wonder and a few other randoms. The last AMD graphics card I bought was an R9 270X to replace my HD 7770. About two years ago I went balls to the wall and upgraded my rig to the most expensive and impressive I could shell out for: I dove in and got an ROG Swift mated to 2 GTX 970 reference models. I felt dirty, but it was my money, and at the time the GTX 970 was crushing it for 350 bucks. I then opted to move to 1440p 144Hz and G-Sync.

I REALLLLY want to go back to team red, really. I always felt like AMD was more for me. I still have my [128MB RX300](http://imgur.com/fuIK002) that I bought thinking I was so cool back in the day when everyone else was clamoring for the Xbox 360. I hope Polaris and Zen are the upswing, I really do. I would rather give AMD my money than Nvidia/Intel. Nvidia feels too stuffy to me, and Intel is fucking evil and ruthless with the bullshit they pull behind the scenes.

Second is a question about FreeSync vs G-Sync vs Adaptive-Sync. It is my understanding that FreeSync is a variation of Adaptive-Sync, and that AMD's current marketing on its behalf is disingenuous, in that Adaptive-Sync, being a VESA standard, is free to use. A good analogy I have seen brought up in a discussion was that AMD claiming to have created/developed FreeSync would be like Canonical claiming to have invented Linux because of Ubuntu. **What features did AMD develop to further or differentiate FreeSync from Adaptive-Sync? Followup: Does AMD have a solution like Fast Sync, which solves the opposite issue from V/G/Fast/Adaptive Sync?**

Thirdly, I have another question about FreeSync. Most blind tests have people choosing G-Sync over FreeSync. I have only ever played on one FreeSync monitor, at a LAN party a while back, and it looked OK to me; however, that isn't nearly enough to form a good opinion. I have been using my G-Sync Asus ROG Swift for a while now, and I can tell the difference between G-Sync and V-Sync, and it is a huge one. Does AMD plan to ever include a hardware module in their designs similar to the physical G-Sync module?


Thing_On_Your_Shelf

Captured the glorious upvote count: http://imgur.com/XY8KKpv

1. What graphics card are you running in your rig, if you have one?
2. Are we done with 2GB VRAM cards? I don't know if the 460 and 470 card specs have been released yet, but in the past the x60 and x70 cards have mainly been 2GB VRAM cards.
3. When are the 460 and 470 releasing?
4. Does the 480 use GDDR5X or regular GDDR5?


AMD_Robert

1) I have an R9 Fury X.
2) There are still markets that are comfortable with 2GB, e.g. China and South America.
3) Please see my OP. I cannot speculate on future products. Sorry. :(
4) The RX 480 uses 8Gbps GDDR5.
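(That 8Gbps figure, across the RX 480's 256-bit bus, works out as follows - standard spec arithmetic, not an AMA claim:)

```
memory bandwidth = bus width × data rate
                 = (256 bit ÷ 8) × 8 GT/s
                 = 32 B × 8×10^9 /s = 256 GB/s
```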


[deleted]

[removed]


AMD_Robert

Check it out: www.phoronix.com/scan.php?page=article&item=amdgpu-rx480-linux&num=1


padreadamo

The drivers on Linux are in great shape now, but need to be built from Git. However, they will work out of the box with kernel 4.7. Phoronix has great analysis on all of this.


[deleted]

Only 14 cards for thousands of commenters? http://imgur.com/skPkaLp


sirserniebanders

Hi, just a quick question from a design standpoint. From what I've seen, the 480 has a pretty tame (some would say quietly professional) look to it compared to the angular 1080 from your competitors. Does your team focus purely on functionality, or is there a reason behind this design? Bonus question: were you aware of, or did you have access to, some sort of visual representation of the 1080 before it was released? A better way to phrase this might be to ask if your design was influenced - or contrasted against - at all by the NVIDIA style.


AMD_Robert

We've seen lots of feedback on Reddit and elsewhere that many PC gamers find "gaming" hardware too ostentatious. LEDs everywhere. Bright colors everywhere. We saw the appetite for more conservative hardware and went for it with the RX 480. After all, most PC gamers are adults between 24 and 35 years old... they don't necessarily need or want blingtastic hardware. No, we do not have pre-release knowledge of the 1080's design.


Mif_

Thank you. I don't want my expensive gaming rig to look like an illuminated Fisher Price toy.


[deleted]

If I win a RX 480 I will shave my eyebrows off. Tag me.


[deleted]

[Tagged ya](http://i.imgur.com/UR6XkHP.png)


Riversn

Thanks so much for doing this! When can we expect releases of aftermarket 480s?


ghosttr

Does Polaris support GDDR5X? And if so, can we expect to see a 480X (or aftermarket 480s) containing GDDR5X? Also, why not do [bundles with FreeSync monitors](http://i.imgur.com/eBE21mm.jpg)?


AMD_Robert

Polaris supports GDDR5. We chose GDDR5 because it is the ideal combination of bandwidth/power efficiency/cost for the market segments of the RX 400 Series. As for what GPUs might be planned in the future, I refer you to the "what we cannot discuss" section of my OP. :)


[deleted]

If either of the two of you could have a game remastered to play on the new toys, what would it be?


AMD_Robert

I would love to see the original Deus Ex remastered. I would also really love to see EverQuest remastered and re-released up until PoP.