Oh boy, if it's not the consequences of their own actions.
Good! Motherboard prices have been wildly out of control for a while now. Asus selling $1k motherboards when that same tier of motherboard used to top out at $400 max. Let these greedy fucks have their lunch.
Don’t forget gutting the connectivity on lower end motherboards. Apparently PS/2 ports and LED error lights are premium features now.
Do people still use PS/2 ports?
Me running old Asus B350 AM4 Board from 2017…
Hey, it works! I’m running a 5800x and don’t plan on touching a thing unless I need to.
The wet dream for big tech has been to get people to pay subscription fees for compute, just like businesses do for cloud hosting. They tried with Stadia to get people to play games hosted in the cloud, but that was never going to fly.
With the compute demands of AI (which are comparable to an AAA game except for the largest models), they don't want to make the same mistake of letting you have the compute. They see this as an opportunity to charge subscription fees to the entire earth.
The fact that we can't get hardware for a reasonable price is an added bonus to this plan.
All of this only works if everyone subscribes to this shit. Businesses will, because it's just easier to manage. Consumers, though, should not give in. If you want to run an agent, use a small local model.
The best thing that can be done is to make local, open-source agents and models approachable for regular users. Right now, they aren't.
I work for a large retailer that you’ve definitely heard of. We are pulling away from our cloud hosted presence and are building out a self-managed virtual data center in one of our own physical data centers.
Even enterprise knows that paying a monthly uncontrolled cost is shit.
The wet dream for big tech has been to get people to pay subscription fees for compute, just like businesses do for cloud hosting.
Imagine the mental health benefits when AI datacentres make computers unaffordable, so we all have to go outside more, and then the AI datacentres shrivel because they have no customers, because we can’t access anything with no computers. So the AI companies die off.
I can dream, ok?
I would love this to be an unintended outcome from all this. However, I don’t think that’s where we’re headed.
I, for one, think there’s a lot of slop in and around the engineering of phones. We might see a lot more software, storage, and overall activity crunched, compressed, and crammed into our portable devices instead. And with more stuff in the cloud/SaaS realm, they can also become (even) thinner clients at the same time. :(
It’s “heavier” gear like laptops and desktops that’ll probably get pushed into the pro and “prosumer” market.
The wet dream for big tech has been to get people to pay subscription fees for compute, just like businesses do for cloud hosting.
Thankfully there's a growing number of businesses that have been burned by this, and it seems like companies are starting to bring their critical systems back in-house again.
There is a tipping point where it becomes more cost-effective to bring it in-house, even with the staffing requirements. For small to medium-sized businesses though, cloud all the way.
where it becomes more cost effective
Reliability and risk are also factors. What do you do when a vendor tries to lock you into a walled garden before cranking up prices? What about storage of sensitive information? Sometimes the additional cost of doing it in-house pays off in ways that are difficult to track.
I still see fewer NAS motherboard options on Aliexpress than a year ago.
When is the best time to buy pc parts? Should I still wait?
About 2019.
If it still works, keep using it.
If you need it for work/self now - best time is now, if you don’t need it for now - later.
Speculating on necessary items (even with rental bullshit) most likely won't help you, and would just add mental pain.
Can always buy used, and older, if that works for you (though the prices are also ridiculously high).
I know you’re just a person but when do you think prices might go back down? I feel like we’re looking at a decade out honestly
We’re mainly just waiting for the AI bubble to pop, and it’s looking more and more likely every day as companies are slowly realising the only people that like AI are the companies selling the solutions.
You're assuming it's going to break. Even if it gets no better, it's already useful, and it may well get much cheaper; after all, the dotcom bubble bursting didn't eliminate computing.
Prices are never going to go down. Never in the history of ever has a capitalist reduced their prices after a “crisis” causes an extended period of higher prices. What might happen is that they don’t keep increasing prices for a while to allow consumer purchasing power to rise (basically you have to start earning more money).
Yep, especially because chip manufacturing has such a massive barrier to entry.
Idk man. Shit's so wild, they might never go down (fascism, WW3, complete meltdown of capitalist markets due to French-Revolution levels of wealth inequality and incompetence in some specific third-world country out in the west, or some wild interpolation of them all).
Or they might go down next week when some specific big AI company starts selling off datacenter parts (get ready for dirt-cheap racks).
Though with datacenter parts, the problem is they're mostly useless for consumers, most likely because of hardware vendor lock-in.
I guess in the end, I am happy I bought more storage space, a GPU, and a replacement CPU when I needed them, even if the prices seemed a bit high (now they're cosmic).
So there’s more supply than demand, meaning they should become really cheap. Right? That’s fucking good news.
What collapse are you talking about? Profit collapse for greedy corporations?
Sure, you may be able to buy a cheaper motherboard for a while. But you’ll pay through the nose to populate it, hence the falling motherboard sales.
If they get dirt cheap, I might just replace my old one using old parts.
I have a home server waiting around for an SSD for a year now. I have the money, but I don’t like feeling like I’m getting scammed. So I’d rather wait for this market to collapse than give them my money.
Can we create a fund with gamers, and then buy the manufacturers that go under?
We need just one of each: MB, PSU, GPU; hopefully something for CPU will be buyable as well.
I literally don’t think we will have enough money. These are near trillion dollar factories we’re talking about. Whole countries can’t even afford to make more.
The sheer amount of global cooperation necessary to make these things is baffling.
For CPU and GPU, our options are mostly TSMC, Samsung and Intel. Nvidia and AMD don't really have fabs. Any of those can also help with the other chips on a motherboard. Samsung also does NAND, so they'd be the best to acquire.
I think Samsung is going to get hit by helium shortages (helium is needed for fabrication); outside of Hormuz, the US is the only one with a locked-in supply.
How has this whole saga not been an obvious indictment of ‘the free market?’
Big players shouldn’t be allowed to gobble up all the resources needed by small ones. How is it not obvious that they need to wait until production increases to meet their needs before embarking on their little project?
The free market means the market is free to fuck you. Yes you in particular 😅
mankind's oldest trade, but it doesn't compare to the fact that in 1998, The Undertaker threw Mankind off Hell in a Cell, and plummeted 16 ft through an announcer's table.
There is no Free Market, that’s just a myth they tell us to allow them to do whatever they want. They don’t really want a true Free Market because it leads to Luigis.
No this is exactly what the free market is. Regulations that make things fairer for small people is communism.
Then give me communism, because this free-market shiz ain't it. Giving us the scraps that AI doesn't want is just nuts, if we get anything at all. Especially since no one beyond billionaires wants this AI crap.
It's like a dozen people gaming everyone else for the rest of the chairs on the Titanic.
They want to buy the dip
What about regulations that make things fairer for tall people?
I apologise, but I am very anti-beanpole.
You mean tallmunism?
That’s a whataboutism.
AI is an amazing tool for fascists.
Annihilate private access to computing, censor and rewrite all comms, destroy free software and the last remnants of education…
Every single decision made for evil.
And all these vendors who are locking themselves into one customer are about to learn why that’s a bad idea.
The worst thing is that when used for good AI is fantastic! Scientific progress with purpose built AI to find planets, predict the weather, and tons of pattern matching has been in use for decades with positive benefits!
Even LLMs can be a useful tool in the right situations, where output that looks like words people would say matters but accuracy is NOT important.
The problem is trying to use LLMs to do everything and failing while running the tech industry, the environment, and soon the economy into the ground. They took something positive, ruined it and coopted the terminology while shoving it down everyone’s throats.
Sorry but I’m so done with hearing about how cool and useful the knife that they’re plunging into our necks is.
Maybe try reading my comment again?
AI is used as a propaganda tool to spread it, and you can see this on YouTube and social media quite readily. Plus it sexualizes victims, like CSAM, and fetishizes unattainable "women" for conservatives.
Conservatives pretty much buy into/believe in anything scammy.
This article talks exactly about that: https://tante.cc/2026/04/21/ai-as-a-fascist-artifact/
How are they about to learn why that’s a bad idea? Like, when the bubble pops?
I think most likely they're going to be outcompeted by China.
https://www.bloomberg.com/news/articles/2026-04-27/why-china-s-deepseek-qwen-and-moonshot-are-a-worry-for-us-ai-rivals
I don’t know if spending unfathomable amounts of money buying up the entire global supply of computing products can be considered a “tool”.
I'm not here to argue for or against LLMs in general, but self-hostable AI is a thing. Actually-open AI is a thing.
A blanket statement about AI as a whole technology being good for fascism is about as useful as saying "roads are good for fascism" (they're great for troop movement, after all).
You can’t self host anything when the hardware is no longer affordable.
I recently saw a Hacker News article on people running some Deepseek V4 Flash variant on macOS, which is a surprisingly capable model for its size
Self-hosting an LLM ain't the same thing as self-hosting Nextcloud for your docs and calendar. Yes, there are small models, but their output is laughable.
Small models are improving and becoming more capable. The quality of local LLMs is basically unbounded. The context size of local LLMs is bounded by hardware. So local LLMs can be very capable for small, self-contained tasks.
qwen 3.6 35b running locally:
Prompt: "Write a Python script that can pull weather data from public sources and provide the high and low temperature for the current day in Miami, FL."

Single shot. No tool/internet use, so it didn't pull this script from elsewhere.

```python
import requests

def get_miami_weather():
    # Miami, FL coordinates
    LATITUDE = 25.7617
    LONGITUDE = -80.1918

    # Open-Meteo API URL (free, no API key required)
    url = (
        f"https://api.open-meteo.com/v1/forecast?"
        f"latitude={LATITUDE}&longitude={LONGITUDE}"
        f"&daily=temperature_2m_max,temperature_2m_min"
        f"&timezone=auto"
    )

    try:
        response = requests.get(url, timeout=10)
        response.raise_for_status()  # Raises error for 4xx/5xx HTTP status codes
        data = response.json()

        # Index 0 corresponds to the current day
        high_c = data["daily"]["temperature_2m_max"][0]
        low_c = data["daily"]["temperature_2m_min"][0]

        # Convert to Fahrenheit (commonly used in the US)
        high_f = (high_c * 9/5) + 32
        low_f = (low_c * 9/5) + 32

        print("🌤️ Miami, FL Weather for Today:")
        print(f"High: {high_f:.1f}°F ({high_c:.1f}°C)")
        print(f"Low: {low_f:.1f}°F ({low_c:.1f}°C)")

    except requests.exceptions.HTTPError as http_err:
        print(f"❌ HTTP error occurred: {http_err}")
    except requests.exceptions.ConnectionError:
        print("❌ Error: Could not connect to the weather API.")
    except requests.exceptions.Timeout:
        print("❌ Error: Request timed out.")
    except requests.exceptions.RequestException as err:
        print(f"❌ An error occurred: {err}")
    except KeyError as key_err:
        print(f"❌ Error parsing data: Missing expected key {key_err}")
    except Exception as err:
        print(f"❌ Unexpected error: {err}")

if __name__ == "__main__":
    get_miami_weather()
```

Output:

```
% python3 ./m_weather.py
🌤️ Miami, FL Weather for Today:
High: 88.0°F (31.1°C)
Low: 73.2°F (22.9°C)
```

I tried to keep the size and scope within something that would reasonably fit in a comment. Looks pretty decent to me, but I can't write Python myself; never learned. I double-checked the lat & lon of Miami, and it's spot on.
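For what it's worth, the Celsius-to-Fahrenheit conversion the model used checks out against the run above; here's a quick sanity check of those two values (my own snippet, not part of the model's output):

```python
def c_to_f(c):
    # Same formula the generated script uses: F = C * 9/5 + 32
    return (c * 9 / 5) + 32

# The two Celsius values from the run above, rounded the same way
print(f"{c_to_f(31.1):.1f}")  # high -> 88.0
print(f"{c_to_f(22.9):.1f}")  # low  -> 73.2
```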
It did take 47 seconds, while a cloud LLM would probably take 5 or less.
All I’m saying is local LLM isn’t garbage and it is getting better all the time.
How much ram and what gpu do you have?
Now show the output for an 8b model. The only one I’m capable of running
Gemma 4 e2b is pretty impressive for its size.
This area of computing is improving very fast. I truly believe the future of this is locally installed open models.
That’s interesting.
How much ram did it use while running?
If you used a GPU, how much does it cost in today’s prices?
It’s a MacBook Pro. 36GB of ram. I am sure Macs have some kind of gpu and I understand it somehow combines GPU ram with system ram, but I don’t really know Mac hardware very well.
It’s beefy for a laptop, but the desktop I built for myself several years ago had 32 GB of ram and a GTX 1660, so I’m guessing they are similar in capability. I gave that to my daughter, so I can’t run a comparison right now.
EDIT: After doing just a bit of research, I’ve learned the unified memory architecture that Macs use, while not ideal for many purposes, is actually a big advantage for running larger inference models. So it’s possible that this particular model wouldn’t run at all on my Linux box or would run much slower because the full model wouldn’t fit in the 6GB of VRAM and create a lot of memory thrashing.
Yup, you want memory accessible to the GPU for local AI. AMD Strix Point and Mac devices are popular options. CPU can run LLMs but very slowly. I’ve got 32 GB of RAM and 8 VRAM and it’s borderline useless for models that don’t fit in the VRAM.
You can use something like KoboldCPP on Linux, which allows RAM and VRAM to be combined to run a model. O'course, not as fast compared to pure VRAM or the Mac approach, but it is an option. I use my 128 GB of RAM with some GPUs for running models.
decent performance on a 6GB GPU without quantization: https://www.youtube.com/watch?v=8F_5pdcD3HY&t=9s
Qwen 3.6 is awesome, but 48-64 GB is still real money these days (though 32 GB on a dedicated separate machine is also more money). It benchmarks at Sonnet 3.5 to Opus 4.5 level. And the online cost metrics for the 27b and 35b are way off considering the overall usefulness of a 48-64 GB machine (inclusive of GPU VRAM for the 35b), which even in single, non-batching use could displace $5-$7/day of usage.
Local costs are much lower than the online costs in the linked chart, but if you're going online anyway, there are better models.
Depends on if you even need a better model though. Can you run a good enough model is what matters for the most part.
Or available. Companies have pre-sold years worth of inventory to AI companies.
You see how that's tangential to what you're replying to?
Ai is evil
LOCAL AI is not all evil
Computers are expensive
Your point is completely valid, but in another discussion.
Sorry, but I think the point about local AI not necessarily being evil is the tangent here.
The OP is about motherboard shortages, which is being driven by the big AI companies and is making hardware unaffordable for normal users
The top level reply to that is about how that’s bad because it removes the ability for people to be in control of their own computing
Then someone comes in, saying “yeah, but you can host your own AI so that it’s not evil so not all AI is bad”
Then someone points out that you can only host your AI if you can afford the hardware to do so which, as the OP and the comment you replied to pointed out, is getting really hard to do.
Only when you ignore what was literally the first premise and conclusion.
If you didn't understand the comment above, that's fine, but splitting hairs like you're doing is silly (everybody knows it's not the case that 100% of AI is 100% evil)…
Your comment is exactly the same as when people say "guns don't kill people, people kill people"… yes, we all know guns aren't autonomously killing people; the point is that guns, as a tool, are remarkably good at doing something we do not want, which is killing people.
Not to go on a separate tangent, but that’s the entire point of guns. They are supposed to kill. That’s not meant to be some crazy conservative defense of them or opposition to regulating them. Just pointing out something that seems to get lost in conversations.
Correct… so when I tell you "guns DON'T kill people, people kill people," you are right to assume I am just an idiot trying to jingle keys in front of you to distract you from the fact that guns do in fact kill people.
Corps want to privatize roads and make them all toll roads too
They are succeeding in my area.
Roads were also useful for random citizens and people who happened to be in the area.
LLMs are overwhelmingly more useful to bad actors.
I’ve looked into self-hosted AI and decided it’s not worth the cost - both in terms of hardware and energy - when compared to the relative value to be gotten out of it. YMMV.
Same, pretty much. It is possible though, which makes LLMs a more democratic technology than, say, nuclear reactors.
The models you can run on consumer hardware are still nowhere near the stuff that runs in corporate data centers. To stick with your metaphor, it's like running a little steam engine at home while the big guys get to operate nuclear reactors…
You can get pretty far with a stack of 5090s and llama.cpp with split mode graph (or so I’ve heard, I’ve never tried), or AMD’s unified memory CPU thing.
It’s not as good as data centre grade stuff, but it’s not nothing either.
That’s kinda my point. Roads are a useful technology, but they can be used by fascists.
The US government is already setting down the legal framework to make self-hostable AI illegal, so good luck with that. Also, self-hostable AI is still being trained on stolen material, so still fascist.
right before mother’s day as well! smh.
Who knew cartels were bad for markets? I am sure economists and regulators were screaming at legislators for years but I guess they couldn’t hear them over the sound of the bags of dirty lobbyist money landing on their desks.
And then there are the financial “irregularities” funding the AI boom which are also not getting any attention.
Who knew cartels were bad for markets? I am sure economists and regulators were screaming at legislators for years
Unfortunately there’s plenty of economists that still believe in hardcore libertarianism, which is what led to our current situation
I was originally thinking of grabbing parts as they go on sale finally (this PC is from 2018ish, I think, and I guess I could upgrade some bits), but I think I’m just going to wait and get a laptop. In part because I do less gaming, the gaming is less intense, and I’m thinking about trying to spend part of the year living outside of Japan which would make the logistics of shipping a heavy full tower around (or even mid if I downsized) just too much of a headache. Still not 100% sure, though.
If we can’t make a time machine to go backwards, can we at least pause time? The future absolutely fucking sucks, let’s just avoid it altogether lol
Death to Chronos?
May shadows conceal you.
TIME CANNOT BE STOPPED
Science is not yet clear. I think quantum theories say time might not even exist
Pausing time would be worse than living forever. I could not imagine a greater torture.
deleted by creator
Seed debris in the orbit to destroy current satellites and then prevent new ones from being launched for several decades
Honestly I quite like space.
Better solution is to just kill like six people. Because that’s all the AI industry is, it’s like six really annoying rich guys who are having the most extravagant pissing contest in history.
Six? There’s Jensen Huang, Lisa Su, Alex Karp, Peter Thiel, Sam Altman, Dario Amodei, Mark Zuckerberg, Elon Musk, Satya Nadella and Sundar Pichai at a minimum to start with, but there are many others too. I’m including the CEOs of Nvidia and AMD because they’ve been pushing AI for years to get more valuable customers than we ever were to them.
Just in case anyone needs a list.
This is our fucking insane reality:
Who wins? 6 rich guys or 8 billion people?
Easy. 6 rich guys.
…
Honestly I wouldn’t mind so much if someone created an AI that was actually useful.