> Bellwether, a moonshot at Alphabet's X, is using Earth AI to provide hurricane prediction insights for global insurance broker McGill and Partners. This enables McGill's clients to pay claims faster so homeowners can start rebuilding sooner.
Could be a nice, expensive contractor option for replacing the NOAA public data we lost. But it probably won't be picked up, because it has to study the climate, which is a bad word now.
You can totally create a private version of NOAA so long as you keep the messaging about insurance intelligence and never, ever speculate out loud about the causes of hurricanes. And if that's not enough, just do what Meta did and hire some shmuck like Robby Starbuck to signal that you're on the right team.
What they want is for the government to run the satellites and provide the data on the taxpayers' dime, but only let private companies interpret that data so they can sell their forecasts.
Google (with partner companies) launched a climate-monitoring satellite last year. Thanks to SpaceX, it’s cheaper than ever for private organizations to launch satellites.
Seems plausible to me. It would allow them to start contracting CAT adjusters as soon as a hurricane is expected, before other insurers start bidding for them.
Will this actually pay off for them? Who knows. But insurers are quite into ML for claims/underwriting these days, so I'd believe they're giving it a try.
You need a large workforce of adjusters to handle big events like a hurricane, but you don’t need them all the time. So catastrophe adjusters are often independent contractors.
Pay is good but hours are long, and you are often deployed far away from home.
Sorry, but our AI said your home destroyed in the hurricane was not, in fact, destroyed by a hurricane. Claim denied. We accept no further inquiries on the matter.
100% of claims paid out instantly, so it's kinda true.
I have found that using LLMs to generate queries for Overpass (the OpenStreetMap query language) works really well. Great alternative if you don't care to deal with corporate nonsense.
I've been able to do really cool stuff that I would never have otherwise bothered with by having an LLM generate Overpass queries + walk me through complex setup steps with QGIS.
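For anyone curious, here's a minimal sketch of what running one of those LLM-drafted Overpass queries looks like from Python. The tag filter and bounding box are made-up examples, not from my actual workflow:

    import requests

    # An example Overpass QL query of the kind an LLM can draft for you:
    # drinking-water fountains inside a bounding box (south, west, north, east).
    query = """
    [out:json][timeout:25];
    node["amenity"="drinking_water"](48.19,16.33,48.22,16.40);
    out body;
    """

    resp = requests.post(
        "https://overpass-api.de/api/interpreter",  # public Overpass endpoint
        data={"data": query},
        timeout=30,
    )
    resp.raise_for_status()

    # Each matching node comes back with coordinates and its OSM tags.
    for element in resp.json()["elements"]:
        print(element["lat"], element["lon"], element.get("tags", {}))

The nice part is you describe the feature in plain English and let the model worry about Overpass QL syntax, then pull the result into QGIS (e.g. via the QuickOSM plugin or a GeoJSON export) for the mapping side.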
Just tested, and while it seems interesting, there doesn't (yet) seem to be any intelligence about the imagery itself from what I can tell. For example, it can give me insights about vegetation data overlaid on a map (or try to), but it can't "find the most fertile grassland in this radius".
When there is a way to actually "search" satellite images with an LLM, it will be a game changer for businesses (and likely not to the ultimate benefit of consumers, unfortunately).
How would you even define “most fertile grassland”? What does “fertile” mean - soil nutrients, water availability, or productivity for a specific crop? And what counts as “grassland”? Are you talking about a 1 acre parcel, something for sale, or land next to a road?
There’s already data for all of this: SSURGO soil maps, vegetation indices, climatology datasets, and more, all of which could help you find the “most something” in a given radius. But there are too many variables for a single AI to guess your intent. That’s not how people who actually farm, conserve, or monitor land tend to search; they start from a goal and combine the relevant data layers accordingly.
In fact, crop-specific fertility maps have existed for decades, based on soil and climate averages, and they’re still good enough for most practical uses today.
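To make the "combine the relevant data layers" point concrete, here's a rough sketch of one such combination: score only the pixels a land-cover raster calls grassland by a simple greenness index (NDVI), then take the max. File names, band assignments, and the land-cover code are all assumptions, and NDVI is a greenness proxy rather than "fertility", which is exactly the ambiguity above:

    import numpy as np
    import rasterio
    from rasterio.transform import xy

    # Hypothetical inputs: red and near-infrared bands plus a land-cover raster,
    # all pre-aligned to the same grid (file names are placeholders).
    with rasterio.open("red.tif") as r, rasterio.open("nir.tif") as n, rasterio.open("landcover.tif") as lc:
        red = r.read(1).astype("float32")
        nir = n.read(1).astype("float32")
        cover = lc.read(1)
        transform = r.transform

    GRASSLAND = 71  # e.g. NLCD "Grassland/Herbaceous"; check your dataset's legend

    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)  # simple greenness index
    ndvi[cover != GRASSLAND] = np.nan                 # keep grassland pixels only

    # "Most something" = the grassland pixel with the highest NDVI.
    row, col = np.unravel_index(np.nanargmax(ndvi), ndvi.shape)
    lon, lat = xy(transform, row, col)
    print(f"Greenest grassland pixel near ({lat:.5f}, {lon:.5f}), NDVI = {ndvi[row, col]:.2f}")

Swap NDVI for a soil-nutrient layer or a crop-suitability map and the answer changes completely, which is the "too many variables to guess your intent" problem in a nutshell.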
It was just an example, but you are correct. A more "imagery required" example would be "find all the houses with roofs that have been damaged in the last 6 months," or something like that, which could be used by salespeople or insurers.
That's a good example, yes. I think this one could actually be handled by multiple AI agents searching over existing algorithms, or even training a model and then running it. How amazing would it be if this could all happen from a few prompts :)
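Not how Earth AI actually works, I'm sure, but as a toy illustration of the roof-damage idea: take pre-aligned before/after tiles of the same rooftop and flag large pixel-level change. All file names and thresholds below are made up; a real pipeline would need georegistration, building footprints, and a trained damage classifier:

    import numpy as np
    from PIL import Image

    # Hypothetical, pre-aligned grayscale tiles of the same rooftop, six months apart.
    before = np.asarray(Image.open("roof_2024.png").convert("L"), dtype="float32")
    after = np.asarray(Image.open("roof_2025.png").convert("L"), dtype="float32")

    diff = np.abs(after - before)
    changed = diff > 40            # per-pixel brightness-change threshold (arbitrary)
    changed_fraction = changed.mean()

    # Flag the parcel for a human (or a better model) if enough pixels changed.
    if changed_fraction > 0.15:    # arbitrary cutoff for "worth a look"
        print(f"Possible roof change: {changed_fraction:.0%} of pixels differ")
    else:
        print("No significant change detected")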
In 2001 we used ERDAS Imagine to do this type of work. It required humans to train the software using heads-up digitizing. Dare I say machine learning on Pentium workstations?
Edit: looks like they have AI too now. Could be neat to play with again; how long has it been? Jeesh.
I have some old screenshots of interesting locations from Google Earth circa 2006-2012 that I've never been able to track down. I wonder if something like this would be capable of geolocating them somehow -- like reverse image search for landscapes.
There's a whole community (with world tournaments [1]) around finding places from pictures: geoguessers. The top people are absolutely incredible [2]. There are also AIs trained for this purpose, although the perspective they use is usually from street level.
A few people recommended GeoGuessr (and people like Rainbolt are definitely amazing), but yeah, I reckon they're hyperspecialized in reading clues in actual Street View imagery, not satellite imagery like this.
Rainbolt often finds locations for people who have old photos of friends/family who have passed away etc., so the skills definitely seem to extend past just street view.
Out of interest: have you already tried using GPT-5 (reasoning/thinking) for that? I've had quite a bit of success in the past using it to track down such places.
Yeah, that and Gemini 2.5. They actually were able to help identify a handful based on context clues, or at least narrow it down enough that I could find it myself. But there were three I couldn't crack -- even a forum dedicated to solving GE puzzles came up empty:
I gave it a try and looked for the locations, especially the 3rd one, which does indeed look like it could be in Chile.
For the 2nd picture I found an island in French Polynesia that has very similar colors and characteristics, so it might be around that area: 8°56'10.8"S 139°34'41.2"W (-8.936304553977038, -139.57811272908305).
For the 3rd picture I found many locations that look like your picture but couldn't pin one down. The first is around Mexico, though it probably isn't it: 27°32'39.0"N 114°45'00.5"W (27.544166, -114.750130). The second is a group of islands close to Morocco, 28°01'54.4"N 17°16'22.9"W (28.03198233652239, -17.27306308433365), though the angle is not the same... As a bonus for the 3rd picture, I did find something in the Andes mountains that looks like your picture: 33°38'11.7"S 70°07'01.4"W (-33.636446, -70.116968). So maybe you should also look around mountains.
At least from what I've seen in Chile, the coast is usually very rocky and the water usually has a lot of waves, and in the picture it looks really smooth. (Though I don't know how zoomed out the picture is.)
"This looks almost certainly like a satellite view of a region in Western Australia, such as the Pilbara or the Hamersley Range. The dark areas are likely ancient, iron-rich rock formations (ironstone), and the surrounding soil is iconic of what's known as Australia's "Red Centre."
Guess which corporation just announced they're profiting off of the government shutdown of vital environmental and climate agencies? I wonder why they failed to mention any of that in this press release.
Instead of all this stuff, I'd like to see Google use their ML chops to "solve" weather forecasting and deliver ultra accurate predictions a few days ahead.
Hm, I'm quite skeptical about this claim.