Cord failure is definitely a problem, but if you’re moderately capable with a soldering iron, it’s easy to repair the cord if the failure is away from the headphone side. It’s even fairly easy to replace a 3.5mm or 0.25” jack.
Your soldering skill (and sense of adventure) would have to be far better than mine to even consider doing that for wireless earbuds.
The few times I've tried to solder headphone wire I've been defeated, because the wire isn't really wire: it's some kind of copper-and-synthetic-fiber weave that solder just won't adhere to.
You’re right, the kind of cable often used is not easy to solder. That makes it hard to solder a broken cable back together, or to replace a broken or bent plug. So it’s best to replace the entire cable and its plug; it’s still an inexpensive part.
You’ll need to solder it to the contacts inside the can, but that’s quite straightforward.
In case the internal cable that goes from one can to the other breaks, you can replace it with any bit of audio cable so you can use one that’s easy to solder.
I've repaired a few headphone wires; they're usually thin copper strands coated with enamel insulation. Burn the insulation off with a blob of solder, or sand it off, and the solder will stick.
It’s probably N.Y.T. style requirements; a lot of style guides (e.g., the Chicago Manual of Style, Strunk & White, etc.) prescribe a standard form for abbreviations and acronyms. A paper like the N.Y.T. does too, and probably still employs copy editors who ensure that every article conforms to it.
> But once you start adding mouse clickable tabs, buttons, checkboxes etc. you left the UX for TUIs behind and applied the UX expected for GUIs, it has become a GUI larping as a TUI.
Hard disagree. Borland TurboVision [0] was one of the greatest TUI toolkits of the DOS era, had all of these:
> Turbo Vision applications replicate the look and feel of these IDEs, including edit controls, list boxes, check boxes, radio buttons and menus, all of which have built-in mouse support.
Oh man, Turbo Pascal was my first "real" programming language -- it was all various flavors of BASIC before, and mostly toy projects. The developer experience with Turbo Pascal (by which I guess I mostly mean Turbo Vision) was honestly pretty great
Vacillating. TurboVision was awesome, but it was pushing the boundary of TUI, which in my mind was great for the moving-hard-copy-onto-the-computer data-entry use case. To wit: hard copy on your right side, and you transfer data into the app without looking at the screen, just at the hard copy, remembering when and where to hit the return key, maybe tab back to a prior field, stuff like that.
But hey, if the screen is drawn 80 x 25 with extended ASCII, it's a TUI. And man, I loved the "absolute" keyword in Turbo Pascal. Instant screen writes when writing to a two-dimensional array.
It was something like: Screen: array[1..25,1..80] of Word absolute $B800:0000; (each cell is a character byte plus an attribute byte). So just use the extended ASCII characters to assign chars to cells and draw boxes for screens, buttons, tables, whatever. Instant update.
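In spirit (sketched here in Python, since real-mode video memory needs DOS-era Pascal or assembly): text mode was just a 25 x 80 grid of (character, attribute) cells mapped at B800:0000, and storing into that array *was* the draw call:

```python
# Toy model of DOS text-mode video memory: 25 rows x 80 columns,
# each cell a (character code, attribute byte) pair. On real hardware
# the array was simply mapped at segment B800h, so assignments drew
# on screen instantly with no API in between.
ROWS, COLS = 25, 80
screen = [[(ord(' '), 0x07) for _ in range(COLS)] for _ in range(ROWS)]

def put(row, col, ch, attr=0x07):
    """Write one cell; on the real machine this was a single memory store."""
    screen[row][col] = (ord(ch), attr)

def hline(row, col, width, ch='\u2500'):
    """Draw a horizontal box edge (U+2500 stands in for extended ASCII 196)."""
    for c in range(col, col + width):
        put(row, c, ch)

hline(0, 0, 10)  # top edge of a box, drawn by plain array writes
```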
> He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
This hits home. Not because I did it as a kid; I'm a bit old for that. But because I've done this exact thing two or three times. You stare and know, just know, that somewhere in this byzantine interface there is the raw power to do lots of cool 3D stuff. But damn. It's quite an interface.
> That is not a bug in how he’s using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
Yeah. For me it was an old, beat-up 286 that I couldn't get anyone to upgrade, and a loving devotion to MS-DOS, old EGA Sierra games, TSR programs, TUIs, GeoWorks, and just not being able to get enough of it.
When I finally saved up enough to buy a 486 motherboard, I installed Linux because it seemed cool (and was cool) and never looked back. But that 286 sparked my obsession with computers that has influenced almost every aspect of my life.
I still love to revive old hardware and push it beyond its limits, mostly because I think it's fun, but also because it's dirt cheap or free. Back then I made an old GPS unit play Monkey Island, play MP3s, and read e-books. I've reinstalled lots of old Android phones and tablets, turned them into photo frames and webcams, transformed old laptops into Chromebooks, and gotten lots of old NAS devices working again. Stuff like that.
I got a cracked copy of 3ds Max at a LAN party (back when it was "Discreet 3dsmax"), and immediately dragged dozens of cubes, spheres, and cones into the scene.
Then I closed it for a year. Opened it up again one day, followed a box-modeling tutorial (from the documentation PDF linked in the Help menu!), and I was hooked. Spline modeling, rigging, walk cycles, texturing, lighting experiments, every spare minute for the rest of high school.
I still remember the whole-body panic of accidentally turning on "adaptive degradation", which replaced all meshes with their bounding cubes when rotating the viewport camera, and thinking I had broken my video card.
> In most cases an IPO isn't worth it for founders because an IPO means you lose operational control.
This is counterintuitive to me.
If you’re acquired, you’re giving up ownership and you tend to lose operational control unless you have agreements in place that say otherwise.
With an IPO it seems like you have a better chance to retain control: you can control the share allocations going into an IPO to give you solid voting power. While you’re accountable to a board of directors and theoretically accountable to stockholders, in reality management often runs the show, at least until the board runs out of patience with bad earnings.
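To make the "control the share allocations" point concrete (all numbers and the 10:1 ratio below are made up for illustration, though they're similar in spirit to real dual-class setups): a founder holding a minority of the equity as super-voting shares can still control a majority of the votes:

```python
# Illustrative dual-class voting math; share counts and the 10:1
# votes-per-share ratio are hypothetical, not any real company's terms.
def voting_power(founder_b, public_a, votes_per_b=10, votes_per_a=1):
    """Fraction of total votes the founder controls."""
    founder_votes = founder_b * votes_per_b
    total_votes = founder_votes + public_a * votes_per_a
    return founder_votes / total_votes

# A founder with only 15% of the shares outstanding, all Class B:
share_of_votes = voting_power(founder_b=15_000_000, public_a=85_000_000)
# 150M founder votes out of 235M total: roughly 64% of the vote.
```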
The problem is that if you go public as a small company, it can be hard to survive. You need to meet expectations every earnings call or watch your stock get crushed, and the market won't give you another chance. The compliance burdens and costs are also a lot higher.
You don’t really see companies under $10 billion going public anymore. That may continue to be the case, but it’s terrible for entrepreneurs.
That’s just not true. At the end of the 90s the US had a budget surplus and there was a discussion of how we were going to handle it.
Then George W. Bush enacted a big tax cut in 2001 that no one remembers because it was heavily weighted toward the top 1%. Suddenly we didn’t have a surplus problem anymore.
That wasn't necessarily Bush personally (who was never the sharpest knife in the drawer), but his strategists. The Republicans had convinced themselves that surpluses encouraged government spending, so by wiping them out and "starving the beast," as they put it, the resulting financial crunch would create a need to slash spending, cut social welfare, and reduce the size of government.
Actually now that it's set out like that, the strategists were just as much in la-la land as Bush was.
But they never did starve anything. Even DOGE didn't cut a significant fraction of the budget. They just eliminated a bunch of ideological enemies and quit early.
They didn't even pretend to reduce the entitlement programs that they claim to hate but that are fiercely defended by the elderly, who overwhelmingly vote for them.
Here they show the debt increasing through the '90s, though by less than in most other decades. I don't know whether those figures are inflation-adjusted; if not, inflation would have eroded the real value of the debt over the decade. Either way, it seems like they didn't use any of the surplus to pay down the debt.
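The inflation-adjustment question matters because deflating the nominal figures can change the picture (the numbers below are invented for illustration, not actual Treasury data):

```python
# Toy real-vs-nominal comparison with made-up figures, just to show
# why an inflation adjustment changes how debt growth reads.
def real_value(nominal, cumulative_inflation):
    """Deflate a nominal amount into start-of-period dollars."""
    return nominal / (1 + cumulative_inflation)

# Say nominal debt grew from $4.0T to $5.6T over a decade (+40%),
# while cumulative inflation over the same decade was 30%. In
# start-of-decade dollars the debt only grew to about $4.3T,
# i.e. roughly +8% in real terms rather than +40%.
end_in_real_terms = real_value(5.6e12, 0.30)
```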
It’s basically a royalty model. That’s common in some industries and with some products. I haven’t looked lately but both Unity and Unreal Engine had royalty models; game devs would pay either a fixed per-unit fee or a percentage of revenue after a certain volume of sales.
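The two shapes mentioned above might be sketched like this (the thresholds and rates are invented placeholders, not any engine's actual terms):

```python
# Illustrative royalty calculation; all constants are made-up examples.
def royalty_owed(gross_revenue, units_sold, model="revenue_share"):
    """Royalty due under one of two common shapes:
    a revenue share above a threshold, or a per-unit fee after N units."""
    if model == "revenue_share":
        THRESHOLD = 1_000_000   # hypothetical lifetime revenue floor
        RATE = 0.05             # hypothetical percentage above the floor
        return max(0.0, (gross_revenue - THRESHOLD) * RATE)
    elif model == "per_unit":
        FREE_UNITS = 10_000     # hypothetical royalty-free unit count
        FEE = 0.50              # hypothetical flat fee per unit after that
        return max(0, units_sold - FREE_UNITS) * FEE
    raise ValueError(f"unknown model: {model}")
```

Note how both shapes charge nothing until the customer has some success, which is exactly why the vendor needs a certain fraction of customers to ship viable products.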
To be viable as a business plan, this requires that a certain percentage of your customers have viable products.
Here’s the thing, though: anyone who reaches a high volume of sales will want to shed the royalty, either by negotiating different terms or by rewriting to avoid the component or service that demands it.
For Unity and Unreal, it’s pretty common knowledge that AAA studios have separately negotiated licenses, presumably to reduce or eliminate the per-unit royalty. Some studios write their own engine, though that has its own costs.
For vibe coding I have real doubts about this model. There’s effectively no moat and no defensive IP (i.e., patents), so anyone making enough revenue to pay $$$ in royalties will probably end up hiring SWEs to rewrite their software to avoid them.
The difference is obvious: it doesn't cost Epic anything if you download their engine, flail around for five years, and release a buggy bomb. Five years of tokens would cost a lot.
> Senior engineer looks under the hood, sees 500k lines of incomprehensible spaghetti mess with emoji comments everywhere, runs out the door and never looks back.
Senior engineering _consultant_ looks at those 500k lines of incomprehensible spaghetti mess and sees $$$: months or years of contracts and likely very dysfunctional management that is willing to pay multiple times the cost of full time employees to keep the burn on a non-payroll line and/or keep the “AI first” story rolling on.
> Senior engineering _consultant_ looks at those 500k lines of incomprehensible spaghetti mess and sees $$$: months or years of contracts and likely very dysfunctional management that is willing to pay multiple times the cost of full time employees to keep the burn on a non-payroll line and/or keep the “AI first” story rolling on.
That's not been my experience. Even pre-AI, when I was asked to find a bug in some hacked-together codebase, sticker shock was often the result.
"What do you mean, billing for a week? The guy who created this is an actual software engineer and you're billing just as much as he did!"
I've got a list of small ex-clients who won't get work from me anymore, unless they are happy with "Here's my weekly rate, 1 week minimum".
Hourly rates don't work on a client who considers $200/m to be overpaying for software development services.
I suspect there’s a middle ground that involves keeping tests more proprietary, a copyright license that bars using the work for AI reimplementation, or both.
I think it’s entirely reasonable to release a test suite under a license that bars using it for AI reimplementation purposes. If someone wants to reimplement your work with a more permissive license, they can certainly do so, but maybe they should put the legwork in to write their own test suite.
Management often has a perverse short-term incentive to make labor feel insecure. It’s a quick way to get people to work harder ... for a while.
Also, “AI makes us more productive so we can cut our labor costs” sounds so much better to investors than some variation of “layoffs because we fucked up / business is down / etc”