Anyone have a theory why Apple hasn't done this yet? They release an 'iBook' which is basically a wired or even wireless lapdock for your iPhone running OSX in a partition. Seems like that would decimate the entire Windows laptop and even desktop market in short order.
Everyone with an iPhone no longer needs their laptop/desktop. Just buy a cheap iBook and there's a good chance it'll already be better than most consumer PCs.
There isn't much demand for using a phone as a computer. If you are at home or work, you can buy a desktop computer for cheap. If you are traveling, you need to find a monitor and keyboard. You could carry a small monitor and a wireless keyboard, but then you are carrying as much as a laptop. People who need to work on the road get a laptop. People who need to send email get an iPad and a keyboard.
A good example of the economics is that a MacBook Neo or an iPad Air is cheaper than a new iPhone.
The iPhone should support display out, but more for showing videos or presentations. My Pixel 10 has USB-C display out and I haven't used it, but I have computers for all purposes.
Apple should spend more effort making the iPad usable for work. It would be a good candidate for USB-C display out, but with iPadOS.
Imagine an executive placing their phone on a magnetic dock as they sit down, which automagically connects to the screen and gives them access to everything they were doing before. Also easy to imagine a university computer lab where everyone brings their own compute and IT doesn't have to manage physical desktops.
I'm skeptical that there's "no demand" for that kind of functionality rather than a lack of good implementations. Look at how popular wireless CarPlay and Android Auto are. They're essentially the same functionality, but tailored to an in-car experience instead of a desktop.
Imagine an executive tapping their phone down on a reader, and it pops up everything they were doing, and they get to keep using their phone.
The first flaw in the idea is that computing is cheap. You can make a computer the size of a phone for people to carry around; that has been tried and failed. The second flaw is that everything is in the cloud; only developers and offline use cases need local access to files. The cloud also means you can run your desktop in the cloud.
You can make a computer the size of a phone. That's what the latest MacBook Neo is. The rest of the space inside is battery and peripherals. I'm not sure what the cloud has to do with this discussion.
Re: keep using phone, that's exactly what's already possible with CarPlay and AA.
FWIW, you can plug your iPhone into an external monitor to do a Keynote presentation. You need a USB-C (or Lightning) to HDMI dongle in most cases, but it works fine.
I'm always reluctant to do non-standard stuff for presentations. There's enough that can go wrong even with a direct HDMI out. I've done it in a pinch but pretty much always carry a laptop with me when I'm presenting, along with local copies of my presentations. I've actually gotten a text in the middle of the night asking me if I can fill in for another speaker who forgot and is in a different country :-)
On an upcoming trip I'm actually going to give the iPad with magnetic keyboard I bought a couple years back (assuming different travel patterns than I've had) a try. It seems to work fine. An iPad is also great for plane/train entertainment without a keyboard. But, honestly, it's no lighter than a MacBook Air would be, and if my ancient MacBook Pro dies (I have a newer one up in my office) that's what I'll probably buy.
I have traveled with just my iPhone and can get by but don't really love it.
How can there be demand for something that doesn't exist?
If Apple releases a $300 lapdock tomorrow, basically a screen, keyboard, battery, that allows using your iPhone as a normal general purpose computer with OSX - why would anyone buy a laptop/desktop?
HNers are significantly more technical than the median consumer and are used to text and keyboard interfaces - a large portion of humanity isn't. You see this with Foundation Models as well - most have started to shift away from only concentrating on text to TTS and STT use cases.
Also, DeX style monitor screen share with a Bluetooth keyboard has been supported since iOS 15.
Additionally, a major portion of Apple's desktop revenue comes from power-user and specialist demand - IT departments bulk purchasing developer laptops, designers having their entire design workflow within the macOS environment, and video editors heavily dependent on macOS.
Furthermore, arguments about how Apple has an incentive not to cannibalize revenue are dumb, given how open Apple is to cannibalizing revenue where PMF exists (e.g. the iPad Pro versus lower-tier MacBooks or the MacBook Neo versus lower-tier iPads).
The entire Mac line is a teeny tiny slice of revenue compared to iPhone. Allowing OSX on iPhone would increase the utility of iPhone, leading to more sales.
> Allowing OSX on iPhone would increase the utility of iPhone, leading to more sales
That assumption is not necessarily true.
What this implies is that there is a market of existing consumers that would not buy an iPhone because it lacks OSX support.
The iPhone portion of Apple's business generates around $144B in YoY revenue in Q1FY27 [0].
Whenever an organization contemplates building a net new capability like the one you mentioned, a quick test is whether it would be able to generate and sustain at minimum the equivalent of 1% of yearly revenue.
If this was a $1B revenue opportunity it would have been implemented, but it's not.
Nor is it a feature that can actively or dramatically increase Apple's market share in most markets.
A good proxy of such demand would have been a sudden increase in iOS users using USB-C screen share and a Bluetooth keyboard to interface with an iPhone in a desktop form factor (something which has been enabled since iOS 15), but such an increase has not happened.
Consumers haven't been told they can do that, though. It's not ergonomic to do that. There's no Belkin plastic dock to support that use case, so I don't find that a good proxy for it.
Consumers don't need to be "told" what to do. If there truly was demand, Samsung or other third-party vendors would have created an ergonomic dock for DeX enabled Samsung phones and it would have been a killer app.
Other than UI and other surface differences, the fundamental distinction between a Mac and an iDevice comes down to what it is.
A Mac is a real computer. I can run any code I want on it. I have root.
An iDevice is like a game console. I can only run App Store apps (without jumping through a lot of hoops). I do not have root (without again jumping through many hoops or ugly hacks).
If Apple wanted to unify the platform they have two choices. The first is to abandon the "real computer" market entirely. The second is to make iDevices real computers by unlocking them.
I suspect they'd rather keep two platforms.
Under the hood they both share a lot of code, so it's not two totally distinct platforms. It's more like two sets of defaults and two "skins."
I think the friction of using a keyboard/pointing device with a touchscreen, or fingers with a desktop interface, is too high to unify them. I know it's been done, I'm unconvinced it's been done well.
That’s the difference though. Put macOS on a phone chip and it’s now a “real computer,” just a smaller one.
The M chips are mostly just roided out A chips: higher clocks, better cooling, more P cores, big GPU, and I think deeper pipelines and more cache. The ALU and many other sections are, I think, identical.
Thermal throttling is actually a non-trivial limit on phones. Put a heat sink and a fan on an A chip and sustained compute is faster.
The OS and its restrictiveness determine the class of device, not the hardware.
That was already the case with the M-series chips, which are shared between Macs and higher-end iPads. The Neo just extends it to the A-series as well.
Yep I know, and now using a last gen A chip, I feel they are really rubbing our faces in it.
Like Apple is saying, "Nice iPhone 17 Pro w/ A19 w/ vapor cooling chip you have there; you know you could run a full general purpose OS on it, but we're not gonna let you, nanananana :p"
Exactly, Apple is playing in our faces, all while people continue to defend the “differences” of device categories and the subsequent justification of shipping iPhones and iPads with locked bootloaders.
The belief that people only hold opposing opinions to yours because they have money on the line is such conspiracy theory nonsense. Some random teenager in middle America couldn't just really like Apple products? It's gotta be some grand conspiracy against you?
I think Apple is just really careful about how they segment their product line for each use case, and would never go for a "jack of all trades" solution like this.
Why would it decimate the Windows market? From my experience, there's a strong correlation between iPhone and Mac usage.
Looking at the stats, the Win:Mac ratio is 4:1 but Android:iPhone only 2:1 so it might hurt Windows. But if iPhone users are more likely to use Mac or don't use computers much already, then expanding iPhone capabilities would cannibalize Apple business.
Because then most people with an iPhone wouldn't need to buy a separate laptop/desktop. I'm sure Android as well would follow in short order (not the half hearted attempts they've made so far). Sales would plummet. Windows decimated.
Why would Apple want to sell a lapdock when they could instead sell you the same thing + a redundant SoC (aka a MacBook) and then high-margin cloud services to sync all of your data between your two differently-shaped computers?
Because most people with iPhones are buying Windows computers, but give them a cheap entry lapdock into the Mac ecosystem and maybe their next more powerful system will be a Mac.
Mac is a niche right now, iPhone with OSX could level the playing field.
If Apple could bring themselves to sell a lapdock, it'd have to cost at least $500. We know this because the Magic Keyboard for iPad, just a keyboard and trackpad, is priced at $349 (and it was introduced at that price way back in 2020, so at the time, Apple believed that keyboard was worth $440 adjusted for inflation). A screen to Apple's quality standards, even a 12-13" one, cannot possibly increase that price point by less than $150. So, the Apple in our universe could not produce a lapdock, because in our universe they have a whole laptop at that price point.
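Spelling out that pricing arithmetic as a quick sketch (the ~26% cumulative inflation factor and the $150 screen premium are this comment's assumptions, not Apple figures):

```python
# Sketch of the lapdock pricing-floor argument above.
# All inputs are assumptions from the comment, not Apple data.

keyboard_2020 = 349                  # Magic Keyboard for iPad, 2020 launch price
cumulative_inflation = 0.26          # assumed price-level change since 2020
keyboard_today = keyboard_2020 * (1 + cumulative_inflation)

screen_premium = 150                 # assumed minimum for an Apple-quality panel
lapdock_floor = keyboard_2020 + screen_premium

print(f"keyboard adjusted for inflation: ~${keyboard_today:.0f}")  # ~$440
print(f"implied lapdock floor: ~${lapdock_floor}")                 # ~$499
```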
On second thought, the reality distortion field is real, so I suppose if they told people their new $600 lapdock was a good value even though it costs as much as the entry-level Mac, they'd still find willing buyers.
This. The more locked down, the less in control we are, the higher margins they command. This is why app stores exist - it has nothing to do with safety or security, and everything to do with monopolizing the distribution supply chain from soup to nuts. Don’t like it? Too bad, it’s fully locked down and cracking it is a (potentially) criminal offense, so whaddayagonnadoaboutit?!
A little computer board is only a fraction of the BOM of a laptop, so a 'lapdock' of equivalent quality couldn't be very much cheaper than a whole laptop.
If you use cloud storage, your laptop already has all the stuff on your phone anyway.
The general public thinks phones and computers are fundamentally different. Heck, I remember arguing this point even on HN back when smart phones were first coming out and being generally on the losing side as people got very excited about "app stores" and such. I see no practical path to getting to the point that enough of us realize that there is simply no reason for our phones to be locked down the way they are that the companies are forced to undo it, especially with our elites pushing with all they are worth to lock things down harder.
The companies take that confusion to the bank.
There have been numerous attempts at making phone/laptop crossovers, where you can plug your phone into a dock and get a computer, or slide your phone into a laptop case, etc. Some of them are even still around, but they're all definitely second-class citizens. There's a variety of problems that I think they've had in the market, not least of which is the fact that the average person still sees "phones" and "computers" as fundamentally different, so the product makes no sense to them. But another issue that I think has held them back is that the product inevitably works by porting the limitations of the phone into the computer, rather than porting the freedom of the computer into the phone.
In the USB-C era, there is no excuse for every phone not having a mode where you can plug it into any ol' USB-C hub/dock and be able to get a desktop environment, even down to the "middle-of-the-line" phones. It would require in most cases no extra hardware. They just don't.
Money? You don't think Apple would make a killing on OSX licenses and lapdock sales if they allowed OSX on iPhone tomorrow?
Mac is a tiny slice of revenue for apple. OSX on iPhone would blow it out of the water. Apple would turn the PC market upside down, taking a sizeable chunk from Windows. As there'd be no point for most people to have a separate laptop/desktop at that point.
People also thought that phones needed keyboards before Apple showed them a better way. This is all on Apple to make a reality, no one else can bring general purpose computing to iPhone except them. It's their choice to make.
It would explode sales of Mac. OSX on iPhone, people wouldn't need the separate Windows laptops they're used to. OSX on iPhone is the gateway for consumers into the OSX ecosystem.
And when those consumers want more powerful hardware, instead of buying a more powerful Windows laptop/desktop - they buy a Mac instead.
I feel like Apple knows this as well, so I can't figure out why they haven't pulled the trigger. Anti-trust risk? lol
I don’t understand the argument for why allowing it would mean more Apple computer hardware sales though. Could you explain why you think that would happen?
I think there are a number of reasons why Apple specifically hasn't done this. In addition to what others have already mentioned (demand, segmentation, profitability, etc), another factor would probably be difficulty with the overall design.
Part of why Apple's products are often praised for their design is that they do a few things really well and focus on those things, instead of trying to do absolutely everything. Consider the iPod, the iPhone, Apple TV, etc -- they're all pretty focused on doing certain things and Apple's really polished the experience. The Mac desktops and laptops kind of stretch this by allowing more things, but they still largely try to focus the user into certain workflows, via the plethora of apps that come standard with macOS and the vendor lock-in that they push.
Making a phone that can also be a full computer goes against these design principles. Apple's closed the gap a bit in recent years by making macOS and iOS a bit more similar than they used to be, but they're still pretty different. If you're on a M1/2/3/4/etc processor laptop and you've tried using an iOS-specific app (not one that's designed for both phone and desktop) on it, you'll see some of those differences (interfaces tuned for touch are weird with a mouse, things are sized wrong for desktop, file restrictions can be weird, keyboard input can be lacking, etc etc etc), and it's not enjoyable. Going the other direction, the first thing that pops into my head is: how in the world would the Mac desktop be represented on iOS? I'm someone who keeps a lot of files on his desktop, grouped in different sections of the screen for different reasons, and I have no idea how that would be represented on a relatively tiny phone screen (at least in a way that didn't destroy my intentional groups). There are other aspects of macOS that would prove tricky to have analogs on a phone screen, too, but this reply is already getting so long that very few will read it...
Now that's not to say that it's impossible. In fact it probably isn't. But there would be compromises (and those compromises would be on top of the compromises already present in iOS/macOS). To do it well, it'd be a much bigger project than most people realize. It's not just changing a few options and letting us use our phone that way. It'd be more akin to designing the first iPhone. Note that it's not just Apple who hasn't done this yet. Literally _no one_ has done it well yet. I truly hope one day Apple (or someone else, even) does it well, since that'll be a glorious day. But it'd be a huge project, so I'm not holding my breath.
It's scary, without the em dashes, and the rapid fire commenting of the account - who would ever realize this is a bot? Two easy to fix things, and after that it'd be very difficult to tell that this is a bot.
It's not a question of if there are other bots out there, but only what % of comments on HN right now and elsewhere are bot generated. That number is only going to increase if nothing is done.
The vast majority of websites you visit don’t have usable APIs and very poor discovery of those APIs.
Screenshots, on the other hand, are documentation, API, and discovery all in one. And you’d be surprised how little context/tokens screenshots consume compared to all the back-and-forth verbose JSON payloads of APIs.
>The vast majority of websites you visit don’t have usable APIs and very poor discovery of those APIs.
I think an important thing here is that a lot of websites/platforms don't want AIs to have direct API access, because they are afraid that AIs would take the customer "away" from the website/platform, making the consumer a customer of the AI rather than a customer of the website/platform. Therefore for AIs to be able to do what customers want them to do, they need their browsing to look just like the customer's browsing/browser.
Also the fact that they don't want automated abuse. At this point a lot of services might just go app only so they can have a verified compute environment that is difficult to bot.
That's true, and it's always been like that, which is why the comment that AI should be using APIs is already dead in the water. In terms of gating websites to humans by not providing APIs, that is quickly coming to a close.
Their eyesight gets worse as they get older; that's why they like making fonts bigger on their phones and computers. The new Mac is so small. I feel like it'd be uncomfortable for older people to use.
Oh. Good point. I put her Air and iPhone SE in big font. So far she's been OK. By the time her eyesight deteriorates that much she will probably be retired anyway and not really need a computer anymore.
If they wanted people in the Mac ecosystem, they could dominate the market tomorrow by enabling a DeX like experience for iPhone. This new laptop running on an out-of-date A series chip proves that it can be done.
iPhones are more expensive than this laptop anyway, and Apple could upsell all the docks and accessories with sky-high margins. It’s a mystery why they haven’t done this already.
No one would use their phone for general purpose computing if given the chance? You're saying they would rather spend hundreds, thousands even on separate laptops/desktops even if it was possible to use their phone with a display to do the same thing? I disagree.
Yep this is the biggest news. We’re one step closer to a DeX like experience for iPhone. If Apple did that they would stomp the entire Windows laptop AND desktop market.
The phone in my hand is powerful enough to handle all the general purpose computing I already do, so let me do it Apple!
iPad is the most absurd device ever. It is fully capable of running a full blown general purpose OS, but artificially restricted to be a YouTube machine. Something you give kids in a restaurant to be quiet. Putting an M4 in it is like Apple rubbing our faces in it. Look at this device that could do everything, but can't do anything.
Comments like yours just go on to show how narrow the worldview of many HN users is. Just because you don't know how people are using their iPads doesn't mean iPads "can't do anything". It defies common sense, too. If iPads couldn't do anything, why would people buy them consistently? I can imagine people buying them once because they don't know any better. But iPad is more than 15 years old now.
I know exactly how it's used. I said in my comment it's used by kids to watch YouTube. By age 4, 58% of children have their own tablet. And YouTube is the #1 app for iPad. This is the majority use case, next to collecting dust on a shelf, or gifts for people's aging parents.
You don't think an M4 chip, amazing screen, form factor, quality - all for children to watch YouTube videos - is absurd? TSMC all busy making 3nm chips to be used for watching CoComelon. An amazingly powerful, affordable device that is totally locked out of being used for general purpose computing. That doesn't irritate you?
The complaint isn't that iPad is useless, but that it would be equally useful to nearly every happy iPad user if it had a few generations older CPU.
iPad works for lots of people, but the things that iPad is best for don't really need a powerful CPU.
There are a few "Pro" apps you can run to prove it's possible to run them (minus plugins, OS-level helper apps, extra hardware, background processing that doesn't randomly die, scripting more fine-grained than Shortcuts, a competent file browser, etc.), but you can max out the CPU for a few minutes and then go back to a MacBook for real work.
We all knew AI had the potential to be extremely powerful, and we all pursued it anyway. What did we think would happen? The government/military always takes control of the most powerful/dangerous systems. If you work for a defense contractor or under ITAR then you already know this.
The right way to deal with this is political - corporate campaign contributions and lobbying. You're not going to be able to fight the military if they think they need something for national security.
Scaling has hit a wall and will not get us to AGI. Open-source models are only a couple of months behind closed models, and the same level of capability will require smaller and smaller models in the future. This is where open research can help: make the models smaller ASAP. I think it's likely that we'll be able to get something human-level to run on a single 16GB GPU before the end of the decade.
For the weights and temporary state, yes. It doesn't sound like a lot until you remember that your DNA is about 600 books worth of data by the same metric.
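As a rough back-of-envelope (all figures here are assumptions: genome length, 2 bits per base, and a plain-text book of about 1.3 MB), the "600 books" figure roughly checks out:

```python
# Back-of-envelope: raw information content of one human genome in "books".
# All inputs are rough assumptions, not measured values.

base_pairs = 3.1e9           # approximate human genome length
bits_per_base = 2            # 4 possible bases -> 2 bits each
genome_bytes = base_pairs * bits_per_base / 8    # ~775 MB

book_bytes = 1.3e6           # ~500 pages of plain ASCII text per book

books = genome_bytes / book_bytes
print(f"genome ~ {genome_bytes / 1e6:.0f} MB ~ {books:.0f} books")
# prints: genome ~ 775 MB ~ 596 books
```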
> Open-source models are only a couple of months behind closed models
Oh, come on, surely not just a couple months.
Benchmarks may boast some fancy numbers, but I just tried to save some money by trying out Qwen3-Next 80B and Qwen3.5 35B-A3B (since I've recently got a machine that can run those at a tolerable speed) to generate some documentation from a messy legacy codebase. It was nowhere close in either output quality or performance to any of the current models the SaaS LLM behemoths offer. Just an anecdote, of course, but that's all I have.
Not every use case is a cloud provider or tech giant.
Newer Blackwell does 200+ tokens per second on the largest models and tens of thousands on the smaller models. Most military applications require fast smaller models, I'd imagine.
Also, custom chips are reportedly approaching an order of magnitude better price-performance. It's a matter of availability right now, but that will be solved at some point.
Charitable interpretation: Local AI (unclear; maybe gpt-oss-120b) isn't nearly as good as SoTA (unstated; perhaps Claude Opus 4.6). Unstated use case(s).
> I run local models on Mac studios and they are more than capable. Don't spread fud.
Charitable interpretation: On their Mac studio (could be a cluster or single machine: unclear), local models (unclear; maybe gpt-oss-120b, maybe not) are capable for their needs. Unstated use case(s). / The "Don't spread fud." advocates for accurate information, which is a useful goal in general. However, it was uncharitable and brusque. An alternative approach would have been to ask a clarification question.
> Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize. Assume good faith. - HN Guidelines
I promise I wrote this by hand. If you confidently thought otherwise, then I would kindly ask you to read my about page.
Incorrect as of a couple of days ago, when Qwen 3.5 came out. It's a GPT 5-class model that you can run at full strength on a small DGX Spark or Mac cluster, and it still works pretty well after quantization.
I don't think vibe coders know the difference, but often when I ask AI to add a feature to a large code base, I already know how I'd do it myself, and the answer that Claude comes up with is more often the one I would have done. Codex and Gemini have burned me too many times, and I keep going back to Claude. I trust its judgment. Anthropic models have always been a step above OpenAI and Google; even 2 years ago it was like that, so it must be something fundamental.
For me, Codex does well at pure-coding based tasks, but the moment it involves product judgement, design, or writing – which a lot of my tasks do – I need to pull in Claude. It is like Claude is trained on product management and design, not just coding.
I'm there with you, but only been using it a couple months now. I find that as long as I spend a fair amount of time with Claude specifying the work before starting the work, it tends to go really well. I have a general approach on how I want to run/build the software in development and it goes pretty smoothly with Claude. I do have to review what it does and sanity check things... I've tended to find bugs where I expect to see bugs, just from experience.
I keep using the analogy of working with a disconnected overseas dev team over email... since I've had to do this before. The difference is turn around in minutes instead of the next day.
On a current project, I just have it keep expanding on the TODO.md as working through the details... I'd say it's going well so far... Deno driver for MS-SQL using a Rust+FFI library. Still have some sanity checks around pooling, and need to test a couple windows only features (SSPI/Windows Auth and FILESTREAM) in a Windows environment, and I'll be ready to publish... About 3-4 hours of initial planning, 3 hours of initial iteration, then another 1:1:1:1 hours of planning/iteration working through features, etc.
Aside, I have noticed a few times a day, particularly west coast afternoon and early evening, the entire system seems to go 1/3 the speed... I'm guessing it's the biggest load on Anthropic's network as a whole.