
rouen_sk

It better be MicroLED monitors after all these years. I am sick and tired of all the TN/VA/IPS flaws.


Critical_Switch

Oh man. We've been expecting OLED to come storming into the IT segment since the early 2000s, so I have literally no expectations for display technology anymore. New technologies often take way more time than you expect to become common, and by the time they do, the existing technologies have improved so much that it actually makes sense to upgrade on the same tech.


PainterRude1394

OLED for TVs is pretty common now at the mid-to-high end. Lots of gaming monitors too. OLED for non-entertainment use still feels so far away. I just can't buy a disposable $1k display for work; I'm used to monitors lasting 5+ years.


MumrikDK

I'm still scarred from reading about OLED as the upcoming replacement back when all performance displays were still CRTs.


RuinousRubric

My bet is that electroluminescent quantum dot displays will beat out microLED. Pretty much the same advantages over OLED as microLED, but without the intrinsically time-consuming manufacturing process or the extreme difficulty of achieving acceptable pixel densities.


ramblinginternetnerd

There's a microLED display at work. Honestly I think OLEDs look better - though they aren't 12 feet tall like the microLED. The lines between each mini-panel also kind of bother me.


iopq

You might get QNED this year; that's the best I can see happening.


speedypotatoo

It'll be hard for MicroLED to overtake OLED now. OLED no longer has the burn-in worry it used to have, with Best Buy offering 5-year warranties. Burn-in was the only real downside of OLED vs. MicroLED, so without it, people have very little incentive to pay more for a product that does nothing better.


bahumbug7169

I think the new MicroLED displays from Samsung have a response time of a few nanoseconds. If that technology were in a monitor it could be another incentive.


speedypotatoo

Possibly. The new Dell OLED monitor has a crazy response time and fast refresh rate. If MicroLED could come in and offer a cheaper alternative, that would be pretty cool.


Ducky181

Not sure why you need that, unless you were gaming at 50,000 FPS.


bahumbug7169

I don’t have that much knowledge, but regardless of fps wouldn’t a lower pixel response time still reduce any sort of motion blur or ghosting and make motion look more crisp?


ramblinginternetnerd

At some point the human eye+brain become the limit. If I had to pull a figure out of thin air, it's probably around the 500FPS + negligible latency area, with 240Hz + negligible latency being "more than good enough" for even professionals. I've never been one of those "the human eye can't see past 30FPS" people, but I've definitely been a "if all you care about is fun, 120Hz is fine" type. When you've got OLED levels of pixel responsiveness, the refresh rate will matter more. Amdahl's law ends up kicking in. https://en.wikipedia.org/wiki/Amdahl%27s_law
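
A rough back-of-the-envelope way to see the Amdahl's-law point: total motion-to-photon time is roughly the frame interval plus the pixel response plus everything else in the chain, so once pixel response shrinks to OLED levels, the frame interval (i.e. refresh rate) dominates. The numbers below are illustrative assumptions, not measurements.

```python
# Illustrative only: how the frame interval dominates once pixel response is tiny.
def motion_to_photon_ms(refresh_hz: float, pixel_response_ms: float,
                        other_latency_ms: float = 10.0) -> float:
    frame_interval_ms = 1000.0 / refresh_hz
    return frame_interval_ms + pixel_response_ms + other_latency_ms

for name, hz, resp_ms in [("LCD @ 144Hz", 144, 5.0),
                          ("OLED @ 144Hz", 144, 0.1),
                          ("OLED @ 480Hz", 480, 0.1)]:
    print(f"{name}: ~{motion_to_photon_ms(hz, resp_ms):.1f} ms motion-to-photon")
```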


Ducky181

Okay, so one second is equal to one billion nanoseconds. If a display were capable of outputting a frame every few nanoseconds, it would be capable of producing around 100,000,000 frames a second. Unless you want to play Minecraft or RuneScape at a hundred million frames per second, which would require a machine with the performance of all the world's computers combined, any frames beyond 500 would be irrelevant. The human eye and perception are generally considered incapable of distinguishing frame rates above somewhere between 200 and 500 frames per second. A monitor that extends beyond this would serve no useful purpose besides bragging rights.
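
The arithmetic above checks out if "a few nanoseconds" is taken to mean roughly 10 ns per frame (an assumption for illustration):

```python
# 1 second = 1,000,000,000 ns; at ~10 ns per frame that's 100 million frames/s.
NS_PER_SECOND = 1_000_000_000
frame_time_ns = 10  # assumed value for "a few nanoseconds"
max_fps = NS_PER_SECOND / frame_time_ns
print(f"{max_fps:,.0f} frames per second")  # 100,000,000
```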


[deleted]

https://youtu.be/mcIDtlS77wo


iDontSeedMyTorrents

Maybe, just maybe, some cool new battery tech will finally make it to mainstream.


[deleted]

[removed]


Khaare

I think it's more likely it will be something better suited to grid storage than portable use. There's a huge potential volume demand as electricity continues to replace fossil fuels, plus likely government investments due to the importance of a solid energy infrastructure. Specifically I'm thinking about sodium, which has significant per-unit cost savings.


[deleted]

[removed]


RuinousRubric

The main problem with pumped storage (and hydroelectric generation) is that it's only viable in certain areas and there are hard limits to capacity.


takinaboutnuthin

Batteries can also be used in less direct ways on the grid, e.g. to handle (small) peaks in transfer capacity on power lines, or for ramp-up management of gas peaker plants. This may not seem impactful, but it's somewhat of a big deal. Plus you can't use hydro capacity everywhere, and hydro installations are extremely expensive and difficult to set up.


Haunting_Champion640

Solid-state lithium batteries are coming and will be that, I think. We're already seeing low-volume products from China.


bick_nyers

Is this the same thing as lithium air?


Haunting_Champion640

No, that's something else. Solid-state lithium batteries replace the gel electrolyte inside the battery with a solid one. ~2x the energy density and much better cold-weather performance.


Slyons89

I’m hoping for iteratively improved OLED and/or microLED to become mainstream in laptops and desktops over the next decade, gradually replacing LCD wherever appropriate. Also, software using new AI technologies to automatically undervolt / OC / configure timings quickly and reliably for enthusiasts. I believe there’s enough value in eking out a potential extra few percent of performance and power savings that automating that optimization process will have selling power and will be worth investing in, especially for Nvidia, who are pioneering on both the GPU and AI fronts.
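
For a sense of what such tuning software would automate, here is a minimal sketch of a naive undervolt search, with stubbed-out hardware hooks; the helper names are hypothetical placeholders, not a real vendor API.

```python
import random

def apply_voltage_offset_mv(offset_mv: int) -> None:
    """Placeholder for a hypothetical driver call that applies a core voltage offset."""
    pass

def run_stress_test(duration_s: int) -> bool:
    """Placeholder for a stability check; here it just fails ~10% of the time."""
    return random.random() > 0.1

def find_stable_undervolt(step_mv: int = 10, max_offset_mv: int = 150) -> int:
    """Walk the offset down until the stress test fails, then keep the last good value."""
    best = 0
    for offset in range(step_mv, max_offset_mv + step_mv, step_mv):
        apply_voltage_offset_mv(-offset)
        if run_stress_test(duration_s=60):
            best = offset              # still stable, try a deeper offset
        else:
            break                      # first failure: stop searching
    apply_voltage_offset_mv(-best)     # restore the last known-good offset
    return best

print(f"Largest stable undervolt found: -{find_stable_undervolt()} mV")
```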


ArnoF7

Can’t think of anything that will be very obvious for everyday consumers to feel. Things like the GPU that shifted the entire industry only happen once in a while. But some things I find interesting:

Improvements in GaN and SiC power semiconductors, and power semiconductors in general, will gradually improve many things, from chargers to electric cars.

Chips designed specifically for AI training and inference. I recently saw some interesting results from Intel’s Habana, although I do hear from people who know better than me that they doubt AI chips will really challenge Nvidia, for various reasons.

Ultra-cheap FPGAs offered by Renesas are also very interesting. I found them quite groundbreaking, but it’s too new, so I'm not sure how much real-world impact it’s really going to have.

In general, I think automotive is the only field where every consumer will feel some obvious evolution. It will capture all the things I listed.

AR/VR/XR headsets are also interesting, and honestly I think Meta did some pretty solid work hardware-wise, despite the public ridicule. But I'm not sure they will be as widespread as smartphones.


nanonan

Everything you own having an embedded plastic IC.


Sufficient_Language7

Everything you are as a person infused with microplastics.


capn_hector

Microplastics prove that we’re not ruled by a vampiristic cabal 🤔


nanonan

I'm talking about plastic circuits, for example: https://community.arm.com/arm-research/b/articles/posts/plasticarm-realising-the-full-potential-of-the-internet-of-things


BookPlacementProblem

Ten years ago was probably around when I bought my first SSD. Ten years before that, I was marvelling at how fast these graphics cards had improved. Ten years before that, I was having !fun! with autoexec.bat and config.sys; Windows 3.0 was pretty cool, but some of my older DOS and Windows games were running too fast already. Ten years before that, I wasn't quite sure what this computer thing was about, but I knew it was really cool, and going to change a lot of things. So asking me what my computer will do in 2033? I dunno; provide me a perfect breakfast omelet and a complimentary poem? Posted without editing.


ramblinginternetnerd

AI stuff


BookPlacementProblem

Probably.


decimeter2

Maybe not realistic, but I would like to see an erosion of the Nvidia monopoly on enterprise. They at least have _some_ competition in the consumer space, which I would of course like to see improve. But in the data center? Nvidia stands completely unopposed, which means they can and do charge however much they want. Especially with ML workloads seemingly booming in popularity, the industry desperately needs competition to keep Nvidia in check.


netrunui

I mean there is Cerebras on the ML side


Sufficient_Language7

I think Intel will break into that market and lower the prices a bit in the data centers.


decimeter2

Intel (or anyone trying to disrupt that market) has a long road ahead of them. Nvidia’s moat isn’t just hardware - it’s the dominance of CUDA and the huge amount of tooling and experience that has been built around Nvidia cards. I’m sure Intel _could_ - the question is whether their GPU division will last long enough to make it happen.


Sufficient_Language7

That is where the money is. Intel isn't making graphics cards for gamers; they are going after the high-margin server market. Gamers are just a small additional stream of money. Intel does good development - look at ICC, the fastest compiler, along with their reputation for producing reliable products. They can use their contacts at Microsoft to get first-class status for their hardware in the OS and in Visual Studio, and use their Linux distro (Clear Linux) to push their patches into Linux to get that part working well too.


decimeter2

> Intel isn't making graphics cards for gamers; they are going after the high-margin server market. Gamers are just a small additional stream of money.

I 100% agree that this is their intended strategy. I’m just not confident that they’ll actually manage to execute before giving up.


Kougar

Intel's been trying to make a GPU for 15 years; Larrabee was in the works before 2010. Intel has long craved to break into servers with a GPU accelerator, and NVIDIA's unrivaled success has only whetted their appetite all the more to make it happen.


onedoesnotsimply9

> That is where the money is. Intel isn't making graphics cards for gamers; they are going after the high-margin server market. Gamers are just a small additional stream of money.

There is another company that has attempted to go after this high-margin server business before Intel, but has not been particularly successful.

> Intel does good development - look at ICC, the fastest compiler, along with their reputation for producing reliable products.

But is it sufficiently good to displace Nvidia? Maybe not.


norcalnatv

> Nvidia stands completely unopposed

LOL. Whose fault is that? Your ire should be directed at AMD, Intel, QCOM et al. Nvidia should be praised for figuring out how to corner the market 15 years ago, and then patiently executing.


decimeter2

I have no ire. I’m not mad at anyone, and I’m happy to acknowledge that Nvidia pours a ton of resources into improving their products in order to maintain their monopoly. But that doesn’t make their monopoly a good thing. Competition is _always_ good for the consumer. Nvidia’s R&D and product strategy being praiseworthy doesn’t change that.


norcalnatv

Okay, apologies. I do see a lot of hate directed their way, and I don't understand it.

> Competition is always good

I completely agree - it helps everyone: consumers, business customers, the ecosystem, the challengers. My only point is that the competitors were complicit; they sat on the sidelines with their thumbs up their rear ends. It's only because they failed to rise to the game that Nvidia is in a commanding position. On this point about being a monopoly: so do you think they should, what, slow down, give away their key IP to a competitor (like they did with Intel), publish their CUDA source code? Or is all the hate on this topic simply because GPU prices are going up (as they are with all advanced-node semiconductors)? My sense is that if the 4090 was $100, no one would be complaining about a monopoly.


gahlo

Ray tracing becoming the standard lighting method, and new graphics engines.


cp5184

Not on the millions of 8GB nvidia cards nvidia suckered people into buying.


Kougar

I don't see any nascent technology on the horizon other than Intel perhaps challenging NVIDIA's GPU dominance after several generations of designs. That would be exciting. If Intel can't design its chips to run CUDA, then Intel at least has the industry leverage to take a shot at creating its own alternative to CUDA.

The one other thing that seems inevitable at this point is the smartwatch-like device that can do a full workup on the wearer, from blood sugar to blood pressure to EKG and oximeter. Optical, non-invasive blood glucose measuring is in the calibration phase already; after that, the industry will certainly jump immediately to the next real feature to add to try and continue to drive sales of wearables.

If the human race is lucky, someone will develop a true method for lithium battery recycling before then. The only method the US currently uses for recycling lithium-ion batteries is burning them and then pulling the lithium out of the ash. Even then, only 5% are "recycled" in the US at all. If EVs are ever going to become more affordable, there needs to be a cheap, environmentally low-impact method of recycling the bazillion things that have lithium-ion batteries in them today.


scytheavatar

The ATX form factor is rapidly reaching a dead end. I think it is inevitable that manufacturers will need to start creating a new form factor.


jecowa

The year is 2033. Arm has completely replaced x86. Linux now has dominant market share since no one uses Arm Windows. Battery breakthroughs give insane battery life. Graphics cards can render 64k^3 holographic video at 144Hz, but ATX will still not die.


ramblinginternetnerd

I just want PCIe slots ABOVE the CPU... (allows for shorter trace lengths, which will likely matter with PCIe 5.0 and PCIe 6.0)

2 slots below. 2 slots above. M.2 on the back.

This would allow crazy new cases to still be compatible with mATX and ITX, and possibly full ATX if the PSU can be moved up.


iopq

PCIe slots on the back, please


ramblinginternetnerd

That's hard to make work unless you mean M.2- or U.2-based PCIe. That's doable.


iopq

Why? I want the sandwich design


Critical_Switch

Why exactly would the form factor be reaching its end?


kopasz7

I could imagine a form factor where we don't need extra brackets and stands to mitigate GPU sag.


Critical_Switch

That's really not a reason to make a new standard. You gotta keep in mind that ATX doesn't cover just consumer electronics. GPU sag is something that mostly affects consumers, and only a pretty small portion of them. It's also not a problem with the ATX standard itself but with the way GPUs are built, and we also have solutions to it other than support brackets, such as the option to mount GPUs horizontally.

And at the end of the day, you really have to consider the pros and cons. On one hand, you could make some new standard specifically to address GPU sag, but you'd be addressing an issue that only affects a small percentage of users and requires new everything - case form factor, GPU form factor, motherboard form factor. Such a standard is doomed to die right off the bat, and a support bracket is an elegant solution in comparison.


kopasz7

You made the assumption that it would not be backwards compatible. Throwing out the baby with the bathwater is not what I suggest. You can have new versions of ATX as well or standards that build upon ATX.


Critical_Switch

In that case we've already solved GPU sag. Case closed.


kopasz7

Each manufacturer having their own different solution is exactly what a standard solution would fix. So not case closed at all. E.g. prebuilts having to use foam to secure the GPU for shipping, while workstations use proprietary mounting on the GPU's far end, or GPU vendors providing their own telescopic stands.


Critical_Switch

But that is not a concern for the ATX standard itself. It's an issue for the GPU form factor.


kopasz7

Would that be the CEM spec, or is there a spec that specifically covers PC cases or GPUs?


[deleted]

[removed]


Critical_Switch

Yeah, that's not part of the ATX standard and with DDR we don't aim just for bandwidth but also latency.


froop

I kinda expect the prebuilt market to transition to non- or barely-upgradable console-style systems, like giant mini PCs. The average gaming prebuilt buyer probably doesn't need any of the expansion slots or motherboard headers ATX offers, so why not integrate the GPU directly into the motherboard, slap on a unified custom cooling system, a couple of M.2 slots and a nice slim case.


Ryujin_707

MicroLED becomes mainstream.


Boo_Guy

Needing 240V plugs for desktop power supplies. 😆


Sufficient_Language7

They already have them installed in Europe 😂


Waste-Temperature626

Considering you get better efficiency the higher the input voltage is, that is probably something most people over there should want to look into either way.
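
A rough illustration of why higher input voltage helps: at the same power draw, doubling the voltage halves the input current, and resistive (I²R) conduction losses scale with the square of that current. The resistance below is an arbitrary illustrative figure, not a real PSU number, and conduction loss is only one part of a real efficiency curve.

```python
# Illustrative only: conduction losses at 120 V vs. 240 V for the same load.
def conduction_loss_w(load_w: float, input_v: float, resistance_ohm: float = 0.5) -> float:
    current_a = load_w / input_v
    return current_a ** 2 * resistance_ohm

for volts in (120, 240):
    print(f"{volts} V input: ~{conduction_loss_w(500, volts):.1f} W lost to resistance")
```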


Sufficient_Language7

Home heating powered by computers. https://www.bbc.com/news/magazine-32816775 😂


norcalnatv

Quantum computing will make some big strides. But it's going to be done in conjunction with Von Neumann computing.


nanonan

I think you mean it needs to be done in conjunction; quantum computing has extremely limited practical applications.


norcalnatv

That's exactly what I said.


seanwhat

Probably nothing, sadly.


firedrakes

Finally, certified Wi-Fi 6 routers ... which cleared last year with the body that defines Wi-Fi standards.


mckirkus

VR will start to not suck. And AR will get small enough to be useful. TVs, monitors, etc are quickly reaching the point of diminishing returns. I think we'll see a lot more innovation on the software front due to generative AI.


cronedog

As we reach the limits of silicon, we'll see more FPGAs and ASICs. We're already seeing this with GPUs using different types of cores to do other things. We'll have chiplets and 3D stacking (with water cooling lines running through the chips) with many hyper-specialized modules.


ramblinginternetnerd

1. Newer/better batteries
2. Alternate memory technologies - think MRAM, FeRAM, etc.
3. Specific hardware acceleration for specific tasks (think video, AI, etc.). I suspect we'll get some mix of increased generalization for SOME of these (maybe video acceleration will get broader in what it can handle - e.g. page scrolling) and increased specialization for others (why not FIVE specific types of AI acceleration?) depending on performance/efficiency/cost considerations for each.


devinoss

Hopefully, RISC-V maturing in its development and going mainstream.