Hand-held devices for testing concrete properties would be more useful. Most concrete problems come from a bad mix - too much water, not enough cement, etc. Concrete testing usually involves cutting a core out of the poured slab and sending it to a lab. Something where you stick a probe in the mix and can reject it before pouring would help. Here are some on-site concrete testers.[1] They're heavy and a pain to use.
There should be an app for this. But that's so last-decade.
I wanted to mention that concrete is far more complex and regional than folks might imagine. The quality of gravel and sand, local impurities - these all contribute massively. It's probably best to think of it like a wine's terroir - except, unlike a bottle of wine, it's prohibitively expensive to ship either the components or the finished mixture to other areas. If a region's limestone has a major clay impurity, it may simply be unsuitable for large structures, or require so much processing that it becomes uneconomical.
It's important to be aware of just how much the local geological mix can affect the viability of building with concrete: while in theory we could use perfect concrete for every project, at that point most projects would simply be too expensive to undertake. There is a large field of engineering around settling for what you've got at the price you can afford. It can absolutely mean that the materials for a high-rise in Philly are priced starkly differently from the same structure planned in Milan, even after adjusting for labor costs.
Hopefully there's good empirical data powering the model here, which just added slump prediction:
> Alongside the event, Meta is releasing a new AI model for designing concrete mixes, Bayesian Optimization for Concrete (BOxCrete). BOxCrete improves over Meta’s previous models with more robustness to noisy data as well as new features including the ability to predict concrete slump (an important indicator of concrete workability).
It seems hard to imagine skipping the slump test and trusting AI on a multimillion-dollar build-out for something so important. But perhaps it's still useful for planning, as a starting place?
The predictions of the model are used as recommendations for onsite testing to accelerate finding mixtures with optimal strength-speed-sustainability trade-offs. We are not replacing canonical testing with the model.
No - it's actually local variance in materials coupled with the difficulty in moving materials between markets economically. Some areas just have better suited limestone or gravel or sand and can afford to build resilient structures for a fraction of the price that it'd cost in other areas.
The issue here is mainly that it's very expensive to ship all the components of concrete, in the volumes necessary, economically. Some areas of the world just lost the lottery when it comes to having resilient building materials.
Corruption absolutely is an issue as well - I don't mean to downplay it - but even if we remove it as a factor, there are just a lot of variables involved in making a reliable concrete... finding a good mix is an art form, and if, for instance, your limestone quarry suddenly hits a more clay-laden seam, then the concrete that reliably lasted three decades under certain conditions might suddenly lose a decade off its expected lifetime. That change in material quality can also be difficult to detect, so there are real quality-assurance issues in concrete mixtures beyond corruption and cutting corners.
Working with multiple tons of material that dries out as you move it around is hard. There are a lot of steps between the concrete being mixed and when it finally reaches the pour.
Cutting out a piece of a slab and sending it to a lab is for post-pour validation in serious construction. There are pre-pour tests that are much simpler depending on the seriousness of what you’re building.
They are standardized for a given mix. A mix design based on a trial batch is submitted to the SEOR (structural engineer of record) prior to pouring anything. The mix design shows the ratios of the ingredients (cementitious materials, fine and coarse aggregates, water, air, admixtures). But concrete is still a non-homogeneous material with lots of variation. Take aggregates: if it rained the last two weeks, the moisture content will be higher, but it may only be a layer on the pile. Same goes for gradation (particle size of the rock) - sometimes you get a batch with smaller rock. There are a hundred things that can go wrong and give you bad mud.
But yeah, there are concrete plants that cut corners and try to save on cement (the most expensive part of the mix), which, depending on the project, may bite them in the ass when they have to pay to fix it.
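The moisture variation described above is exactly what batch-water corrections handle: water the aggregate carries beyond its saturated-surface-dry absorption has to come out of the design water. A minimal sketch of that adjustment - the function name and every quantity here are invented for illustration, not from any real mix design:

```python
# Illustrative batch-water correction for aggregate surface moisture.

def adjusted_batch_water(design_water_kg, aggregates):
    """Correct the free batch water for aggregate surface moisture.

    aggregates: list of (dry_mass_kg, total_moisture_frac, absorption_frac).
    Surface (free) moisture = total moisture - absorption; water the
    aggregate carries beyond its saturated-surface-dry state must be
    subtracted from the water added at the batch plant.
    """
    water = design_water_kg
    for dry_mass, moisture, absorption in aggregates:
        free_water = dry_mass * (moisture - absorption)
        water -= free_water
    return water

# Sand much wetter than usual after two weeks of rain:
water = adjusted_batch_water(
    design_water_kg=180.0,
    aggregates=[
        (750.0, 0.06, 0.01),    # sand: 6% total moisture, 1% absorption
        (1100.0, 0.015, 0.005), # coarse: 1.5% moisture, 0.5% absorption
    ],
)
print(round(water, 1))  # 180 - 37.5 - 11 = 131.5 kg of water to add
```

If the plant doesn't correct for that wet layer on the pile, the extra ~48 kg of water ends up in the mix and the water-cement ratio (and strength) drifts.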
Yikes, what a flippant comment. The mix composition (what Meta's AI is helping with) is separate from the wet concrete product. The parent is suggesting a way to test that the mix is properly mixed before pouring, not a way for construction workers to determine on site that the chemical properties of the mix will be correct. Furthermore, they're not even using LLMs, so it's not "AGI".
Do you actually see construction workers being replaced? We need more stuff built than we have people or time. We have spent a century improving processes and tools, and if, ten years from now, we could build three times as much with the same people, we would find a use for them all.
Awesome. People take concrete for granted. Even at small scales (e.g. your patio) with formulas provided on the cement bag, concrete can go wrong (crazing, scaling, cracks). There's a lot of unappreciated craft in the work, not only in the composition and mixing, which is what this research seems dedicated to, but also in the placing, leveling, curing, finishing.
Civil engineering is hard, and concrete is a perfect example of how something seemingly "simple" in reality requires significant interdisciplinary collaboration with domain experts in ChemE, MatSE, Physics, Applied Math, and CS.
Some of the most robust HPC applications I saw back when I was an undergrad were done by Civil and Structural Engineers in the ONG space.
I hate the civil-engineer glazing that goes on on the white-collar internet.
Civil engineers are the MDs of construction. Their relevance, pay, and gatekeeper status in their industry are less a reflection of what they bring to the table and more a reflection of how successful their professional organizations have been in getting the government to distort the market to their benefit.
I'm not saying they don't crunch their numbers just fine, but they are massively over-worshipped.
> As a result, producers need a way to rapidly explore and validate new formulations without spending months in the lab.
How do you bypass the normal process of pouring test articles and testing them months and years after cure? This is fundamentally a research activity that needs to conduct verifiable science. Not something you can guess at with an LLM.
Hi, I developed the model. We are not bypassing the regular testing process, and are not using LLMs, but Gaussian processes with vetted test data. The predictions are used as recommendations for onsite testing, to accelerate finding mixtures with optimal strength-speed-sustainability trade-offs.
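For readers curious what "Gaussian processes recommending mixes to test" looks like mechanically, here is a toy Bayesian-optimization loop in plain NumPy. This is only the general idea, not Meta's BOxCrete model: the RBF kernel, noise level, and one-knob "strength" function are all invented for illustration.

```python
import numpy as np
from math import erf

def rbf(a, b, ls=0.15):
    """Squared-exponential kernel between two 1-D arrays of inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-4):
    """Posterior mean and stddev of a zero-mean GP at x_query."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_train, x_query)
    mu = Ks.T @ np.linalg.solve(K, y_train)
    cov = rbf(x_query, x_query) - Ks.T @ np.linalg.solve(K, Ks)
    return mu, np.sqrt(np.clip(np.diag(cov), 1e-12, None))

def expected_improvement(mu, sigma, best):
    """EI acquisition: how much each candidate is expected to beat `best`."""
    z = (mu - best) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(erf)(z / np.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (mu - best) * Phi + sigma * phi

# Pretend 28-day strength depends on a single mix knob (e.g. an SCM fraction).
def strength(x):
    return 40 + 10 * np.sin(6 * x) - 15 * (x - 0.5) ** 2

x_grid = np.linspace(0.0, 1.0, 200)
x_obs = np.array([0.1, 0.5, 0.9])   # three initial lab-tested mixes
y_obs = strength(x_obs)
for _ in range(5):                  # five "recommended" follow-up tests
    y_mean = y_obs.mean()           # center data so the zero-mean GP fits
    mu, sd = gp_posterior(x_obs, y_obs - y_mean, x_grid)
    ei = expected_improvement(mu + y_mean, sd, y_obs.max())
    x_next = x_grid[np.argmax(ei)]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, strength(x_next))

print(f"best mix knob ~ {x_obs[np.argmax(y_obs)]:.2f}, strength ~ {y_obs.max():.1f}")
```

The point of the loop matches the comment above: every "observation" is a real test result, and the model's only job is to pick which mix gets tested next, so fewer lab batches are wasted on unpromising corners of the design space.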
Somebody needs to coin a new term for the scattershot zero-thought AI griping that is pervasive in online comments these days. Meatslop?
Obviously it's going to be more productive for a manufacturer to do a years-long curing test on 100 likely candidates instead of 100 random mixes. They obviously already screen candidates through traditional methods, but if this AI technique improves accuracy, all the better.
The current strategy of the AI hype machine is to exhaust people's reserves of attention by presenting a never-ending stream of hard-to-verify "positive" claims. It's a Gish gallop at Internet scale, with a never-ending parade of tech influencers, proxy "journalists", and low-value accounts. The whole strategy aims for saturation and demoralized acceptance.
It's no surprise that people readjust their immediate reactions by expressing hostility and skepticism toward anything AI-related without spending much time on analysis. In fact, it's an entirely rational response.
Complaining about it without acknowledging the larger picture is disingenuous.
In this particular case, using the term "machine learning" would likely avoid the immediate negative reaction.
There was no such paradigm shift. LLMs still suck just as much as they did before, in the exact same ways they did before. In 6 months you'll be trying to BS us about the "great paradigm shift of summer 2026".
> Meta’s AI for concrete model can help suppliers more quickly incorporate U.S. materials into their mixes through an approach called adaptive experimentation.
> Proposes high-potential candidates: The AI suggests new mixes most likely to meet target specifications and can compare performance between U.S.-made and foreign materials
US imports 22% of its cement
> In 2024, Portland and blended cement were produced in 99 plants in 34 U.S. states, led by Texas, Missouri, California, and Florida. Nevertheless, there was significant import reliance. Net imports were 22% of total consumption, with the major source countries being Turkey (32%), Canada (22%), and Vietnam (10%). U.S. exports of cement last year were negligible.
I'm assuming this isn't for national security reasons, probably more to help the domestic industry deal with tariffs. I hope Meta used their extensive connections to the government.
Tangentially related, but there is a new generation of trucks that mix the concrete on-site. They can output small batches and change the mix on the fly. They solve a lot of headaches!
This may work at a small scale, but not in most commercial use cases. A typical deck pour (400 cy) will pour at 70-80 cy/hr. You get 9-10 cy/truck, meaning you have 7 to 8 minutes to back in the truck, empty it into the hopper, and leave. You barely have time to add water to the mix.
Most high-volume concrete plants are "dry-batch", which means all the ingredients get dumped into the drum and the concrete will get mixed while driving to the project site. Also, changing mixes on the fly will not "fly". No one is going to authorize the adjustment, because what happens when the mix doesn't meet specs... It will need to get chipped out.
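The cycle-time arithmetic above checks out; spelled out with the midpoints of the quoted ranges:

```python
# Quick check of the parent comment's pour numbers (midpoints of given ranges).
pour_total_cy = 400        # typical deck pour size
pour_rate_cy_per_hr = 75   # quoted 70-80 cy/hr
truck_load_cy = 9.5        # quoted 9-10 cy per truck

trucks_needed = pour_total_cy / truck_load_cy          # ~42 truckloads total
trucks_per_hour = pour_rate_cy_per_hr / truck_load_cy  # ~7.9 trucks/hr
minutes_per_truck = 60 / trucks_per_hour               # time to back in, dump, leave

print(round(trucks_needed), round(minutes_per_truck, 1))  # 42 trucks, 7.6 min each
```

At ~7.6 minutes per truck across ~42 trucks, there is indeed no slack in the schedule for reformulating a mix at the chute.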
Traditional trucks pick up concrete from a facility and rotate the drum to keep it from setting. They don't mix it on the fly. Any extra is considered waste and is poured out.
The website talks about making cement, but only describes making concrete. Making concrete involves mixing cement and fillers with water under controlled conditions. Making cement involves heating calcium carbonates and oxides with silicon dioxide or calcium silicate to form alite at a temperature of (so far as we understand) no less than 1250 C. Usually this is done with fossil fuels, and any impurities in the raw materials (which are cost-constrained) go up the flue, making cement plants rather polluting. Carbon dioxide is a nearly inevitable byproduct (CaCO3 + SiO2 → CaSiO3 + CO2) and is either captured at source (not implemented at most facilities) or released.
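A back-of-the-envelope figure for the calcination CO2 implied by that reaction follows from standard molar masses alone (process emissions only; the fuel burned to heat the kiln adds more on top):

```python
# CO2 released per kg of limestone calcined, from the reaction stoichiometry:
# every mole of CaCO3 consumed releases one mole of CO2.
M_CaCO3 = 100.09  # g/mol, standard molar mass
M_CO2 = 44.01     # g/mol

co2_per_kg_caco3 = M_CO2 / M_CaCO3
print(round(co2_per_kg_caco3, 2))  # ~0.44 kg CO2 per kg CaCO3
```

That ~44% mass fraction is chemically fixed, which is why calcination CO2 is so hard to engineer away without capture.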
There is plenty of room for improvement in cement production. I'm not sure exactly how to apply AI to it but I guess I was hoping for more than this. If we are going to have the infrastructure renaissance that keeps being talked up by reformists of various stripes, we need more cement.
South America is also a surprising laggard in cement production, which is odd considering they have the materials and they need the roads. I think that environmental concerns and a continental aversion to coal might contribute.
First there was the RAMpocalypse. Then there was the cementpocalypse. Let's just hope the AI datacenters don't latch on to biofuel to supplement their energy requirements - it's just more profitable for farmers to sell calories to the AI overlords, and the consumer food market is a low-margin grind.
Apologies for the sarcasm. I appreciate the drive for renewables the current AI DC buildout brings with it.
I have real fears that building materials will experience the same inflationary pressures computer memory is currently experiencing. The U.S. TSMC and Intel fab construction alone in the last couple years has had an outsized impact on building costs.
Jesus, I hope they do proper testing for these experimental mixes and don't trust whatever random garbage the AI decides you should mix in. This is exactly the kind of thing AI is absolutely terrible at, because it has no logical skills, direct experience, or ability to test. If your AI-coded stuff goes belly up, you get to try again. If your multi-million-dollar cement foundation turns out to be sub-par, that's multi-million dollars to tear it out and millions more to do it again right, and that is the best-case scenario. The alternative is people dying when their apartment building collapses.
We use Gaussian processes trained on vetted test data from academic and industry partners. We use these predictions to recommend mixes for onsite testing to accelerate finding mixtures with optimal strength-speed-sustainability trade-offs. None of the data and predictions go untested. The blog post goes into this in more detail.
Can you at least read the article before criticizing them? They explicitly call out that they use Bayesian optimization (a Gaussian-process-based method) for this. It is "AI" but not an "LLM" like you think it is.
[1] https://store.forneyonline.com/concrete-testing-equipment/fr...
The slump test is rather simple, for example: https://en.wikipedia.org/wiki/Concrete_slump_test
It’s basically a cone with handles and a procedure that’s easy to learn.
It’s really exhausting to feel negative all the time when faced with the cavalcade of terribly weak claims.
https://dailygalaxy.com/2026/03/rubber-used-in-undersea-tunn...
https://www.constructconnect.com/construction-economic-news/....
https://cementech.com/volumetric-technology/
Looking more closely though, this looks a lot like the Google "AI Cookie" from 2017, which also used Bayesian Optimization: https://blog.google/innovation-and-ai/technology/research/ma...
Our work on concrete here differs in that the problem is both 1) inherently time-varying and 2) multi-objective. See our write-up here for details: https://arxiv.org/pdf/2310.18288
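A hedged illustration of what "multi-objective" means in practice - not the paper's algorithm, just the standard notion of Pareto dominance over the strength-speed-sustainability trade-off, with invented candidate numbers:

```python
# Keep only candidate mixes that no other mix beats on every objective at once.

def pareto_front(candidates):
    """candidates: list of (name, strength_MPa, days_to_spec, kgCO2_per_m3).
    Higher strength is better; fewer days and less CO2 are better."""
    def dominates(a, b):
        # a dominates b: at least as good everywhere, strictly better somewhere.
        return (a[1] >= b[1] and a[2] <= b[2] and a[3] <= b[3]
                and (a[1] > b[1] or a[2] < b[2] or a[3] < b[3]))
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates)]

mixes = [
    ("A", 45, 7, 300),
    ("B", 40, 5, 250),
    ("C", 38, 9, 320),   # worse than A on all three objectives
    ("D", 50, 10, 400),
]
print([m[0] for m in pareto_front(mixes)])  # ['A', 'B', 'D'] - C is dominated
```

No single "best" mix exists here: A, B, and D each win on a different objective, which is why the optimization has to surface a front of trade-offs rather than one answer.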