Author here, thanks for reading. Yes, naming is tricky. By mono-environment, I mean that there is one _long-lived_ environment to which we deploy software.
I'm constantly surprised by developers who like LLMs because "it's great for boilerplate". Why on earth were you wasting your time writing boilerplate before? These people are supposed to be programmers. Write code to generate the boilerplate or abstract it away.
I suppose the path of least resistance is to ignore the complexity, let the LLM deal with it, instead of stepping back and questioning why the complexity is even there.
> Write code to generate the boilerplate or abstract it away.
That doesn’t make any sense. I want to consider what you’re saying here but I can’t relate to this idea at all. Every project has boilerplate. It gets written once. I don’t know what code you’d write to generate that boilerplate that would be less effort than writing the boilerplate itself…
>Every project has boilerplate. It gets written once.
Agree with you - I think when my colleagues have talked about boilerplate they really mean two kinds of boilerplate: code written once for project setup, like you describe, and then repetitive code. And in the context of LLMs, they're talking about repetitive code.
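For the second kind - repetitive code - "write code to generate the boilerplate" can be as small as a template loop. A minimal sketch (all names here are hypothetical, just to illustrate the idea of stamping out near-identical handlers from one template):

```python
# Hypothetical sketch: generate repetitive per-entity handlers from one
# template instead of hand-writing each variant. Names are illustrative.
from string import Template

HANDLER = Template('''\
def get_${name}(id: int) -> dict:
    """Fetch a ${name} record by id."""
    return db.fetch("${table}", id)
''')

entities = [("user", "users"), ("order", "orders"), ("invoice", "invoices")]
code = "\n".join(HANDLER.substitute(name=n, table=t) for n, t in entities)
print(code)  # write to a module, or feed to your codegen step
```

The same pattern scales up to real code generators; the point is just that the repetition lives in data, not in hand-typed source.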
Interesting. I don't recall this happening during my studies. In my exams you didn't have the time to cheat, leaving for the toilet was strongly discouraged, and if you did leave, an invigilator would stand behind you at the urinal or outside the cubicle.
The article is about accountancy. Accountancy is not academia.
>In the end of the day academia in general should stop relying on exams based on memorization of random facts and start using real world examples of what kind of work student would be working with as an employee.
I have an undergraduate degree and PhD in chemistry and I don't really think "blind memorisation" had much to do with my success. It will only get you so far.
I think there is also substantial disagreement within academia about the purpose of academia. I think a lot of academics might disagree that it is about preparing people to be employees.
Statistically, how many people will become private employees and how many people will stay in academia to do science? Disagreement is just denying reality.
I absolutely loathe when someone is looking over my shoulder. That said, the software for pairing is so good that being able to help and annotate while the main "driver" focuses on solving the problem is immensely fruitful - I'd argue it's objectively more engaging and helpful when you can highlight and demonstrate directly what you're communicating.
I work remote and wouldn't have it any other way, but I'll admit there are culture things I miss since I really love my team. As an employee? My employer gets way more value from me without the bullshit of going to an office.
I dislike it in general, I want to work on my own terms and I have to interrupt my way of thinking constantly because now I have to communicate my ideas to some other person.
Sometimes I like to work in the middle of the night, sometimes early in the morning. Having another person to code with me is just a blockade.
In my opinion, the openings to pair are harder to find when remote.
In person, it's usually easy to see when a junior is struggling, and pull up a chair. That same junior, in a remote environment, might not proactively ask for help. And me sending a Slack message to them asking how they're doing might get an "all good" even when they are not. And sending a huddle request to them because I suspect they're struggling is just a VERY different thing than looking over at them to read their body language, or swinging by their desk to check in.
Maybe some would say that it's on that junior to know they need to ask for help. Sure, great. That does nothing to resolve the reality of this very real situation. Of course part of coaching this junior (in person or remote) would be encouraging them to be more proactive about asking for help, but if you have fewer opportunities to offer help and coaching in the first place, their growth is slower. Significantly slower in my opinion.
This is a hard difference to convey to someone who has never experienced a good in-person culture. But I know I would not be where I am if I had spent the first decade of my career working remotely.
What is your setup when pairing remotely? (Full disclosure I am building an OSS app for this purpose and just want to learn what is working for people)
Our team uses IntelliJ so we use Code with Me with Slack for AV. Specifically on Slack we have channels called pairing-room-1, pairing-room-2, etc. Allows colleagues to drop in and out.
I would try Tuple but to be honest we are fine on Code with Me.
Indeed Tuple is a great product. The goal is to match their quality and make it the OSS alternative, it's still early though, and I am trying to get some feedback.
In my penultimate year I was struggling under a huge lecture and lab workload, working at the weekends, and partying too.
I developed stomach pain and went to the campus doctor, who sent me off to A&E (ER) with suspected appendicitis. My appendix was fine, but the doctors thought I might have a stomach ulcer. I discharged myself early to get back to the library, so never actually got a diagnosis!
I was drinking 3-4 cups of coffee and ~500mL knock-off Red Bull a day, eating a poor diet, and then drinking alcohol in the evenings. I went cold turkey and spent an entire weekend in bed with the worst headache imaginable.
I think I managed a few months before cracking and getting back on coffee. I quit cold turkey again a couple of times, but in recent years I titrated off it by gradually changing the ratio of caffeinated to decaffeinated beans in my coffee at home. I drink 1-2 small cups a day now.
Author here. Indeed - it would be just as fantastical to deny there has been any value from deep learning, transformers, etc.
Yesterday I heard Cory Doctorow talk about a bunch of pro bono lawyers using LLMs to mine paperwork and help exonerate innocent people. Also a big win.
There's good stuff - engineering - that can be done with the underlying tech without the hyperscaling.
Hey Simon, author here (and reader of your blog!).
I used to share your view, but what changed my mind was reading Hao's book. I don't have it to hand, but if my memory serves, she writes about a community in Chile opposing Google building a data centre in their city. The city already suffers from drought, and the data centre, according to Google's own assessment, would abstract ~169 litres of water a second from local supplies - about the same as the entire city's consumption.
If I also remember correctly, Hao also reported on another town where salt water was being added to municipal drinking water because the drought, exacerbated by local data centres, was so severe.
It is indeed hard to imagine these quantities of water but for me, anything on the order of a town or city's consumption is a lot. Coupled with droughts, it's a problem, in my view.
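To make the ~169 litres/second figure cited above easier to picture, here's the straightforward arithmetic (only the 169 L/s rate comes from the source; the rest is unit conversion):

```python
# Back-of-envelope conversion of the cited abstraction rate.
litres_per_second = 169
per_day = litres_per_second * 60 * 60 * 24   # litres per day
per_year = per_day * 365                     # litres per year
print(f"{per_day:,.0f} L/day")               # ~14.6 million litres/day
print(f"{per_year / 1e9:.1f} billion L/year")
```

That is, on the order of fifteen million litres a day - which is why "about the same as the entire city's consumption" is plausible for a mid-sized city.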
The fact that certain specific data centres are being proposed or built in areas with water issues may be bad, but it does not imply that all AI data centres are water guzzling drain holes that are killing Earth, which is the point you were (semi-implicitly) making in the article.
Just because it doesn’t leave the cycle doesn’t mean it’s not an issue. Where it comes back down matters, and as climate change makes wet places wetter and dry places drier, the water becomes less evenly distributed.
That said, the water issue is overblown. Most of the water calculation comes from power generation (which uses a ton) and is non-potable water.
The potable water consumed is not zero, but it’s like 15% or something
The big issue is power and the fact that most of it comes from fossil fuels
>The big issue is power and the fact that most of it comes from fossil fuels
This has almost zero to do with the data centers themselves and instead the politicians we vote in.
Simply put, we need more 'clean' power generation any way you go about it. Economic growth and production is based on one's ability to produce power. We've been coasting on increasing efficiency for a long time, but we've been in need of larger sources of power and distribution for decades, data centers or not.
With that said, DCs shouldn't be built in West Texas, where it's dry. East of the Mississippi gets enough rain that if you build a reservoir you'll have more than enough water to feed the DC for decades.
Is it more of a pricing problem? If data centers paid the same for water as I do they’d be far more efficient with it. Likewise for power. Giving sweet deals to these things based on promises of jobs and tax revenue seems a bad idea at this point.
The way they measure water consumption is genuinely unbelievably misleading at best - for example, counting the water evaporated from a dam's basin if any hydroelectric power is used.
Counting water is genuinely just asinine double-counting ridiculousness that makes downstream things look completely insane. Like making a pound of beef look like it consumes 10,000L of water.
In reality, of course, running your shower for 10 to 15 hours is nowhere near equivalent to eating beef lasagna for dinner, and we would actually have a crisis if people started applying any optimization pressure on these useless metrics.
My favourite example is the complaints about “e-waste” because AirPod batteries are not replaceable. Never mind that the batteries make up about half of the weight of the things to begin with, the entire annual production of AirPods and their cases would barely fill a single shipping container.
No, not a container ship, or anything substantial like that, just one container (1).
Yet, it makes people lose their rational minds and start foaming at the mouth about Apple’s “wasteful practices”.
Apologies, looks like I mis-remembered the numbers from the original debate, the actual number is about 110x 40-foot containers. It can be calculated easily enough from the # shipped, the volume of a charging case, and the volume of a shipping container.
It doesn’t alter the conclusion that much: in the grand scheme of human industrial activity, this is nothing.
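The calculation described above is easy to reproduce. A rough version, where every input is an assumption of mine (shipment count, case dimensions, container volume), not an official figure:

```python
# Back-of-envelope version of the container calculation described above.
# ALL inputs are rough assumptions, not official figures.
units_shipped = 140e6                 # assumed annual AirPods shipments
case_volume_cm3 = 5.4 * 4.5 * 2.2    # charging case, ~54 cm^3
container_m3 = 67.7                  # internal volume of a 40-foot container
containers = units_shipped * case_volume_cm3 / (container_m3 * 1e6)
print(f"~{containers:.0f} forty-foot containers")
```

With these assumed inputs the result lands around 110 containers, matching the corrected figure - and the conclusion holds either way: in the grand scheme of industrial output, it's a rounding error.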
I.e.: if AirPods were made by an independent company, that firm would be in the top 50 of the largest corporations in the world!
For example, a comparable sized company would be Coca Cola, which goes through 300,000 tons of aluminium annually for their cans, not to mention oil used for plastic bottles!
Drinking water does not magically appear in the water cycle the next day.
[0] - "And what we found is that up to 43% of data centers, and this is our largest data centers, are located in areas of high or extremely high water stress. And that's really shocking because data centers require huge amounts of drinking water to be able to cool their servers."
"I took all your money and gambled it away, but don't worry because it wasn't destroyed, it's still circulating in the economy and will come back someday."
A reduction in what can be used from the summer snowmelt is a problem, regardless of whether equivalent atoms are redeposited in the winter snowfall.
I’m sure there was some planning commission process involved in the development of these sites. I’m curious whether anyone has bothered to look at those meeting minutes to see if there were material misrepresentations of the water and power needs. I’m going to guess the answer is no.
The mistake you are making is letting the author choose your points of comparison, without having a high-level picture of where water usage goes. Comparing water usage to a city is misleading because cities don't use much water; large-scale water use is entirely dominated by agriculture.
I'm conflicted. Zooming out, the problem isn't with AI specifically but economic development in general. Everything has a side effect.
For decades we've been told we shouldn't develop urban centers because of how development affects local communities, but really it just benefited another class of elites (anonymous foreign investors), and now housing prices are impoverishing younger generations and driving homelessness.
Obviously that's not a perfect comparison to AI, which isn't as necessary, but I think the anti-growth argument isn't a good one. Democracies need to keep growing, or authoritarian states that don't care so much about human rights will take over. (Or authoritarian governments will take over democracies.)
There needs to be a political movement that's both pro-growth and pro-humanity, one capable of making hard or disruptive decisions that actually benefit the poor. Maybe that's a fantasy, but again, I think we should find ways to grow sustainably.
Not just the poor, how about the bottom 99%? This is what's so frustrating to me about the culture wars and identity politics. Regardless of one's views on the hot-button cultural issue du jour, at best they are a distraction, and at worst they are actively exploited by moneyed interests as a political smokescreen to prevent changes that would be obvious wins for super-majorities of the population if analyzed through a more sober and objective lens of the net effects.
IIRC Google chose to pull out altogether to punish the locals for standing up to them— even though they happily built air-cooled data centers elsewhere.
I mean yes, almost all corporations that have a choice do this. Walmart is one of the better known ones that will put a store right on the edge of a municipality that doesn't want one and cause all kinds of issues for the city at hand.
Nestle is and has been 10000x worse for global water security than all other companies and countries combined because nobody in the value chain cares about someone else’s aquifer.
It’s a socioeconomic problem of externalities being ignored, which transcends any narrow technological use case.
What you describe has been true for all exported manufacturing forever.
I think the point is: where does this end? Do we continue to build orders-of-magnitude bigger models guzzling orders-of-magnitude more water and other resources, in pursuit of the elusive AGI?
At some point we need to end this "AGI" rat race and focus on realizing practical benefits from the models we currently have.
My interpretation was "If an industry that actively works to harm the global health of humanity through their addictive and unhealthy food products is using way more water and we're OK with it, maybe we should give a pass to the industry using a fraction of that water to improve human productivity."
Ton of nuance in my characterizations of both industries, of course, but to a first approximation they are accurate.
Well, the food industry continues churning billions in profits at the expense of our health, so statistically speaking, it looks like "we" are OK with it!
Totally agreed that criticism should be directed where it's due. But what this thread is saying is that criticism of GenAI is misdirected. I haven't seen nearly as much consternation over e.g. the food industry as I'm seeing over AI -- an industry that increasingly looks like its utility exceeds its costs.
(If the last part sounds hypothetical, in past comments I've linked a number of reports, including government-affiliated sources, finding noticeable benefits from GenAI adoption.)
I should have been more precise with my terms, but there is a difference between "food" and the "food industry" indicated by the likes of Nestle. Yes, everybody needs food. No, nobody needs the ultraprocessed junk Nestle produces.
I didn't see the OP's point as whataboutism, but rather putting things into perspective. We are debating the water usage of a powerful new technology that a large fraction of the world is finding useful [1], which is a fraction of what other, much more frivolous (golf courses!) or even actively harmful (Nestle!) industries use.
[1] https://news.ycombinator.com/item?id=45794907 -- Recent thread with very rough numbers which could well be wrong, but the productivity impact is becoming detectable at a national level.
Attempt generosity. Can you think of another way to interpret the comment above yours? Is it more likely they are calling their own argument a red herring, or the one they are responding to?
If something looks like a "weird stance", consider trying harder to understand it. It's better for everyone else in the conversation.