Because LLMs are just about all that actually exists as a product, even if an inconsistent one.
Maybe some day a completely different approach could actually make AI, but that's vapor at the moment. IF it happens, there will be something to talk about.
Don't worry that much about 'AI' specifically. LLMs are an impressive piece of technology, but at the end of the day they're just language predictors - and bad ones a lot of the time. They can reassemble and remix what's already been written but with no understanding of it.
It can be an accelerator - it gets extremely common boilerplate text work out of the way. But it can't replace any job that requires a functioning brain, since LLMs do not have one - nor ever will.
But in the end it doesn't matter. Companies do whatever they can to slash their labor requirements, pay people less, dodge regulations, etc. If not 'AI' it'll just be something else.
Text is an LLM's input and output, but, under the hood, the transformer network is capable of far more than mere reassembly and remixing of text. Transformers can approximate Turing completeness as their size scales, and they can encode entire algorithms in their weights. Therefore, I'd argue they can do far more than reassemble and remix. These aren't just Markov models anymore.
(I'd also argue that "understanding" and "functional brain" are unfalsifiable comparisons. What exactly distinguishes a functional brain from a Turing machine? Chess once required a functional brain to play, but has now been surpassed by computation. Saying "jobs that require a human brain" is tautological without any further distinction.)
Of course, LLMs are definitely missing plenty of brain skills like working in continuous time, with persistent state, with agency, in physical space, etc. But to say that an LLM "never will" is either semantic (you might call it something other than an LLM once next-generation capabilities are integrated), tautological (once it can do a human job, it's no longer a job that requires a human), or anthropocentric hubris.
That said, who knows what the time scale looks like for realizing such improvements – (decades, centuries, millennia).
I'd imagine most people aren't 100% positive about what they want to get beforehand. Sometimes you only realize you want something after passing by it. Maybe it's something you haven't gotten in a while and hadn't considered beforehand.
And even if you are completely sure of what you want in advance, having someone else do it is not always great. At least with Instacart, the person doing the shopping frequently didn't know where something was, just assumed it was 'out', and tried to substitute it (badly). There was all this awful delay and back-and-forth, with crappy pictures from the shopper, trying to get the right thing.
Doing it yourself doesn't have that problem. You know what you want, why you want it, and what you're willing to bend on. No, X brand cheese is not a substitute for the Y brand I wanted, never do that. But yes, Z brand and type of milk is fine compared to what I wanted, and I know they are frequently out.
Grocery store employees aren't any better at this, btw. Especially since the stores like to re-arrange on a monthly basis.
Bitcoin, and really all crypto 'currencies' were never meant to be currencies at all. Maybe a couple naive people who created them originally believed that, but it was never the goal.
They are speculative assets for gambling with. They have been since day 1.
> Bitcoin, and really all crypto 'currencies' were never meant to be currencies at all.
To be fair, there is significant disagreement about what a "currency" is supposed to be. There is a large subset of people who believe the desirable traits in a currency are exactly those that make it function well as a speculative asset (notably: on average, over a long time, its value with respect to goods is at least flat and preferably increasing), while simultaneously not considering important the things another large group sees as desirable in a currency (e.g., lack of extreme short-term volatility).
I can't speak to the original designer of Bitcoin, but I wouldn't be surprised if it and most cryptocurrencies were designed to be currencies, just by people who have a very specific (and, IMV, wrong) idea of what a currency ought to be.
A currency is fungible, easily accessible, tradable and convertible with little overhead. And in order to function, above all else a currency must have stability and trust.
If people lose faith in a currency's future, then it has no real value.
If people believe a currency (or the government/system which supports it) is unstable, then it has no real value. Real-world global trade and investment is done on long timetables. You can't develop a product that won't start selling for 6+ years if you can't predict how the currency will behave over those 6 years.
No one had a 'wrong' idea of what currency should be. They saw an opportunity to scam people out of all their money by convincing them that gambling was an investment, that they were much smarter and more clever, and that they were sticking it to 'the man' or 'the system' - when in fact they were just being used.
There were only two notable groups in crypto: The scammers and the suckers.
16GB is more than fine if you're not doing high-end gaming, or heavy production workloads. No need for debloating.
But it doesn't matter either way, because both 16 and 32GB have what, doubled, tripled? It's nuts. Even if you say "just buy less memory", now is a horrible time to be building a system.
Looking Glass displays (not the "hololuminescent" ones) solve many of the same things (multiple viewers, no glasses) while looking good, and in principle you could build a cube out of them, although the display can't be seen from the full 180 degrees.
I am sooooooooooooooooooooooo glad I bought a 6000MHz 2x16 kit before all this nonsense started.
I'll be honest, I have 0 confidence that this is a transient event. Once the AI hype cools off, Nvidia will just come up with something else that suddenly needs all their highest end products. Tech companies will all hype it up, and suddenly hardware will be expensive again.
The hardware manufacturers and chip designers have gotten a taste of inflated prices and they are NOT going to let it go. Do not expect a 'return to normal'.
Even if demand goes back to exactly what it was, expect prices to stay >30% higher than before for no reason - or, as they'd call it, 'market conditions'.
When it eventually does, they'll just come up with something else. Nvidia got a taste of inflated prices from the Crypto and then AI, and they're not going to just let that go. If nothing exists they'll make something and hype it endlessly to try to keep this going.
Unlikely. Unless some new technology comes around that completely invalidates existing GPUs and Nvidia cannot pivot to it quickly enough, there's just no way. They're too big, too rich, too powerful. They basically own the dedicated GPU market, with AMD holding maybe a piddly 10% at best.
I run into a similar problem. I have a power-hungry GPU (3080) and CPU (9800X3D).
All my audio equipment was on the same UPS (and therefore outlet) as my gaming PC.
The result is that any time a particularly stressful game would be open, I'd get buzzing in the speakers. (Especially if the framerate was at 360) If you ask audiophiles online they will swear up and down that a cheater plug, balanced cables, or optical isolation will fix it - that will not fix it. It's not a ground problem. It's not coming from the connection from the PC to the DAC - it's a power issue.
It seemed almost inconceivable to them that the problem was EMI from the computer making it into the equipment.
I temporarily got a double-conversion UPS (converts AC to DC to AC again) and housed the audio equipment on that instead (separate from the PC). Lo and behold, the noise was completely gone.
However, those UPSes are extremely expensive, and far worse, they're very loud because the fans run constantly.
So, I went with a simpler alternative. Just get a power strip and plug all the audio equipment into that on a different outlet. That reduces it massively. You can also get some strips that are designed to reduce EMI, but I haven't felt the need as of yet.
If you're a bit handy, you can assemble a line filter using a part like this https://enerdoor.com/products/fin27/ for a heck of a lot cheaper than you can buy a filtered power strip.
Even if there's very little audio-frequency attenuation, it's possible for higher frequencies to produce audio-frequency intermodulation distortion, and filtering could reduce this. This is one reason "high definition" (ultrasound sampling rate) audio is a bad idea as a listening format.
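To make that concrete, here's a rough numerical sketch (all values are assumed toy numbers, not measurements from any real gear): pass two inaudible ultrasonic tones through a weakly nonlinear stage, and a difference tone pops out right in the middle of the audio band.

```python
import numpy as np

# Assumed toy numbers: two ultrasonic tones through a weak 2nd-order
# nonlinearity (e.g. a slightly non-linear amplifier stage).
fs = 192_000                      # sample rate high enough to carry ultrasonics
t = np.arange(fs) / fs            # one second of signal
f1, f2 = 24_000, 27_000           # both tones above human hearing
x = 0.5 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

y = x + 0.1 * x**2                # mild nonlinearity adds an x^2 term

spectrum = np.abs(np.fft.rfft(y))
freqs = np.fft.rfftfreq(len(y), 1 / fs)

# Look only inside the audible band: the strongest component there is
# the f2 - f1 = 3 kHz intermodulation product, even though neither
# input tone is audible on its own.
band = (freqs > 20) & (freqs < 20_000)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(int(peak_hz))  # 3000
```

Neither 24 kHz nor 27 kHz shows up in the audible window, but their difference does - which is exactly why keeping ultrasonics out of the chain in the first place (by filtering, or by not feeding it "high definition" content) helps.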
In 2013 I bought out two Radio Shacks' worth of ferrite beads when I was hunting down signal noise in my senior design project (a CNC mill rebuild and update).
All else fails, add more beads.
Also, I learned that you can make your own shielded flat cables with aluminum duct tape.
Who knew that they had a really good reason for using 48V signaling in the original machine controls from 1986?
Maybe you’re right. My experience is with radios, where it’s possible that high frequency noise is conducted into the RF section rather than into the audio amplifier. I know that in one case, both my transmitted signal and received audio output were absolute garbage (edit: because it was picking up noise from the vehicle ignition) until I added a choke to the power input wiring.
Well the OP’s electrical noise almost certainly is coming through the USB connection as their DAC has no external power supply. Extremely common.
Your problem of an AC power supply not sufficiently filtering out high-frequency noise from mains is exceptionally rare to the point that yes, I also don’t believe that was the correct diagnosis of your issue.
Pure sine wave UPSes are not that expensive anymore, man. I think the biggest "desktop" pure sine wave unit CyberPower sells (1500VA/1000W, CP1500PFCLCD) is <$300 now. I have a couple of them, they are great.
It's not about pure sine wave - it's double conversion. Only double-conversion would actually isolate the equipment from EMI on the line. Without that pure sine wave won't do squat for EMI.
And one of those, even the cheapest ones, runs for about $900. And they are LOUD.
Are they loud because they're double-conversion or are they loud because they're designed for server racks? When I search for double-conversion online I can practically only find rack-mount solutions.
They're loud because unlike a regular UPS they need to run continually to convert the power back and forth. That generates a lot of waste heat, which fans must remove.
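Back-of-envelope, with assumed figures (not from any specific model), the continuous heat load is easy to estimate:

```python
# Assumed figures: a double-conversion UPS converting AC->DC->AC at
# roughly 90% round-trip efficiency while carrying a 1 kW load.
load_w = 1000
efficiency = 0.90

input_w = load_w / efficiency          # power drawn from the wall
waste_heat_w = input_w - load_w        # dissipated inside the unit

print(round(waste_heat_w))  # 111 W of heat the fans must remove, nonstop
```

That's a small space heater running 24/7 inside the chassis, which is why the fans never get to spin down - unlike a standby UPS that only converts during an outage.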
I wonder if modern motor and power control tech could be adapted to make a desk-side motor-generator set that is efficient enough to rival an always-active, dual conversion AC-DC-AC UPS.
How efficient could a small AC->motor->generator->AC chain be with a modest flywheel mass to provide cycle-to-cycle stability?
Could it ever make sense to put one of these after a standby UPS so the output is always filtered by the motor-generator but the UPS only has to kick in for outages?
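For a rough sense of scale (every number below is an assumption, not a real product spec), even a small flywheel stores far more than one AC cycle's worth of energy:

```python
import math

# Assumed desk-side flywheel: 10 kg solid steel disc, 15 cm radius,
# spinning at 3600 RPM (synchronous with 60 Hz for a 2-pole machine).
mass_kg = 10.0
radius_m = 0.15
omega = 3600 * 2 * math.pi / 60          # angular speed in rad/s

inertia = 0.5 * mass_kg * radius_m**2    # solid disc: I = 1/2 * m * r^2
energy_j = 0.5 * inertia * omega**2      # stored kinetic energy

# Allow a 10% speed droop, which releases 1 - 0.9^2 = 19% of the energy.
usable_j = energy_j * (1 - 0.9**2)
load_w = 1000
ride_through_s = usable_j / load_w

# One 60 Hz cycle is ~16.7 ms, so roughly 1.5 s of ride-through at 1 kW
# covers cycle-to-cycle smoothing with enormous margin.
print(round(ride_through_s, 2))
```

So cycle-to-cycle stability is the easy part; the hard part of the question is the conversion efficiency of the two machines plus bearing and windage losses, which run continuously just like the double-conversion electronics.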
One advantage of a motor-generator set is that it's relatively easy to get high-voltage isolation using an insulating shaft. It might be possible to build something that could survive nearby lightning strikes to the incoming AC line. I don't think any standard UPS can do this.
The portable lithium battery "powerstations" double as great double-conversion UPSes, in addition to their intended outdoor (camping, beach, etc.) use, and depending on capacity go for less than 900 USD. They're only noisy when fast charging or providing high currents.
Definitely had various computer equipment plugged in to ours and it was great (I didn't specifically test for EMI).
> If you ask audiophiles online they will swear up and down that a cheater plug, balanced cables, or optical isolation will fix it - that will not fix it.
Lifting the ground on my studio monitors absolutely fixed my noise problems. I run them off a MiniDSP 2x4HD, so other sources like EMI aren't really a factor.
The problem I have with a double-conversion UPS is that it isn't an ideal sinusoidal source. The tin implies it is, but when you've got protected loads with PWM power delivery slamming around 1+ kilowatts, there's no way to guarantee a smooth waveform from a typical ~2500VA unit. Passing directly through to the grid can provide cleaner power under the most transient conditions.
Reminds me of my friend who has bought a shit load of 1.5v AA lithium batteries. The buck converters in those little bastards wreak havoc with every speaker around. His TV remote disconnects my Bluetooth headphones every time.
It's not about isolation though - it's about line noise. An isolation transformer will let that pass straight through (minus the usual filtering due to any transformer being an inductor), whereas the AC->DC->AC conversion gets rid of all of it by effectively acting as a perfect low-pass filter.
Sigh, it's almost like I had this conversation before.
My audio equipment is not connected by USB. It's connected by optical (TOSLINK) to an external DAC. TOSLINK is not great, but it shows that it is not a USB noise problem.
I dealt with the noise via the method I described in the original comment: moving my audio equipment to a separate power strip on a separate outlet. It didn't totally remove it, but made it quiet enough to deal with.
Don't mind them. I've had a similar thing happen, but with power line Ethernet. In your case however, I'd be at least a little concerned about the building wiring.
In many analog pro audio applications, it's actually recommended that a shield be connected at one side only, for this reason. By convention but not necessarily necessity, the bond is typically kept at the receiving end, as that's almost always a device with a grounded power cord (such as a mixer). Many DI boxes feature a ground lift switch as a convenient way to achieve this. But you wouldn't want to disconnect it at both ends, as then the shield has no effect at all.
Anyway, if you had problems with your unshielded cables that would be solved by a shield, but your shielded cables caused a different problem due to the bond at both ends, this technique of using shielded cables but severing the shield at one end of them would get you the best of both worlds.
Huh, I had no idea that cables would have their shield grounded at both ends... Single point ground is such a standard in electrical design that the guidance is generally "do otherwise only if you have the ability to make many prototypes to nail RFI issues".
If you're building an audio cable your signal will peak out at a few kHz, so the cable acting as an antenna and picking up a signal in the MHz range isn't an issue. Similarly, you're not transmitting anything significant either. But a ground loop can easily ruin your day.
If you're building a cable for multi-gbps data transmission, that ground loop noise might as well not exist - it's basically DC. But ground your shielding at only one end, and suddenly you're ruining everyone's wifi!
Building a device which needs high-speed data on one side, and analog audio on the other side? Good luck...
Ruled out the monitor(s)? There've been cases where they backfed power, and they certainly backfeed EMI as well. And it could also be tied to FPS - assuming G-Sync/FreeSync.
If you have a multimeter, it's probably worth double-checking that the case is grounded with low resistance to the end of the cord. I'm assuming you've checked already, but as a shock hazard it bears repeating.
I also have a splitter which lets you power a USB device from a separate power supply (i.e. the D+/D- lines are connected to a host and +5V comes from a separate plug; ground is shared though). And optical TOSLINK is a nice option where available.
When I upgraded my PC to the same CPU, I had the same problem of crackling/buzzing speakers on my USB DAC (externally powered, but from the same strip/outlet) when the system was under load.
I had a hunch it was power related because my PSU was nearly 10 years old and probably with just barely enough wattage. I bought a new one and all the buzzing went away.
IIRC when I was researching possible causes, beefy Ryzen CPUs were the most commonly mentioned in various forums and reddit threads.
It depends; some setups generate noise when you move a high-resolution wired mouse. I never measured it, but I'd assume it's because of the high-frequency signals the mouse generates when actively used.
In that case, better-shielded cables and connections would probably help.