IBM releases a new mainframe built for the age of AI
The hardware and consulting company on Monday announced IBM z17, the latest version of its mainframe computer hardware. This fully encrypted mainframe is powered by an IBM Telum II processor and is designed for more than 250 AI use cases, the company says, including AI agents and generative AI.
Mainframes might seem like old hat, but they're used by 71% of Fortune 500 companies today, according to one source. In 2024, the mainframe market was worth an estimated $5.3 billion, per consulting firm Market Research Future.
The z17 can process 450 billion inference operations a day, a 50% increase over its predecessor, the IBM z16, which was released in 2022 and ran on the company's original Telum processor. The system is designed to integrate fully with other hardware, software, and open-source tools.
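For readers who want to check the math, here is a minimal sketch that simply restates the article's figures; the implied z16 number follows from the stated 50% increase, and the per-second rate is our own conversion, not an IBM-published figure.

```python
# Back-of-the-envelope arithmetic from the stated z17 figures; purely illustrative.

z17_daily_inferences = 450e9          # 450 billion inference operations per day
increase_over_z16 = 0.50              # stated 50% increase over the z16

# Implied z16 throughput if the 50% figure is taken at face value.
z16_daily_inferences = z17_daily_inferences / (1 + increase_over_z16)
print(f"Implied z16 throughput: {z16_daily_inferences / 1e9:.0f}B inferences/day")   # ~300B

# The same z17 figure expressed per second.
seconds_per_day = 24 * 60 * 60
print(f"z17 rate: {z17_daily_inferences / seconds_per_day / 1e6:.1f}M inferences/sec")  # ~5.2M
```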
Tina Tarquinio, VP of product management and design for IBM Z, told TechCrunch that this mainframe upgrade has been in the works for five years — well before the current AI frenzy that started with the release of OpenAI's ChatGPT in November 2022.
IBM spent more than 2,000 research hours getting feedback from over 100 customers as it built the z17, Tarquinio said. She finds it striking that, five years later, that feedback turned out to align with where the market ended up heading.
"It has been wild knowing that we're introducing an AI accelerator, and then seeing, especially in the later half of 2022, all of the changes in the industry regarding AI," Tarquinio told TechCrunch. "It's been really exciting. I think the biggest point has been [that] we don't know what we don't know about what's coming, right? So the possibilities are really unlimited in terms of what AI can help us do."
The z17 is set up to adapt and accommodate where the AI market heads, Tarquinio said. The mainframe will support 48 IBM Spyre AI accelerator chips upon release, with the plan to bring that number up to 96 within 12 months.
"We are purposely building in headroom," Tarquinio said. "We're purposely building in AI agility. So as new models are introduced, [we're] making sure that we've built in the headroom for bigger, larger models — models that maybe need more local memory to talk to each other. We've built in that because we know it's really the approach that will change, right? The new models will come and go."
Tarquinio said that one of the highlights of this latest hardware — although she joked it was like being asked to pick her favorite child — is that the z17 is more energy-efficient than its predecessor and, she claims, than competitors, too.
"On-chip, we're increasing the AI acceleration by seven and a half times, but that's five and a half times less energy than you would need to do, like, multi-model on another type of accelerator or platform in the industry," Tarquinio said.
The z17 mainframes will become generally available on June 8.

Related Articles


TechCrunch
Google's data center energy use doubled in four years
No wonder Google is desperate for more power: the company's data centers more than doubled their electricity use in just four years. The eye-popping stat comes from Google's most recent sustainability report, which it released late last week. In 2024, Google data centers used 30.8 million megawatt-hours of electricity. That's up from 14.4 million megawatt-hours in 2020, the earliest year Google broke out data center consumption.

Google has pledged to use only carbon-free sources of electricity to power its operations, a task made more challenging by its breakneck pace of data center growth. And the company's electricity woes are almost entirely a data center problem. In 2024, data centers accounted for 95.8% of the entire company's electron budget.

The company's ratio of data-center-to-everything-else has been remarkably consistent over the last four years. Though 2020 is the earliest year Google has made data center electricity consumption figures available, it's possible to use that ratio to extrapolate back in time. Some quick math reveals that Google's data centers likely used just over 4 million megawatt-hours of electricity in 2014. That's seven-fold growth in just a decade.

The tech company has already picked most of the low-hanging fruit by improving the efficiency of its data centers. Those efforts have paid off, and the company is frequently lauded for being at the leading edge. But as the company's power usage effectiveness (PUE) has approached the theoretical ideal of 1.0, progress has slowed. Last year, Google's company-wide PUE dropped to 1.09, a 0.01 improvement over 2023 but only 0.02 better than a decade ago.

It's clear that Google needs more electricity, and to keep to its carbon-free pledge, the company has been investing heavily in a range of energy sources, including geothermal, both flavors of nuclear power, and renewables.

Geothermal shows promise for data center operations. By tapping into the Earth's heat, enhanced geothermal power plants can consistently generate electricity regardless of the weather. And many startups, including Google-backed Fervo Energy, are making it possible to drill profitable wells in more places.

On the nuclear fusion side, Google last week announced it would invest in Commonwealth Fusion Systems and buy 200 megawatts of electricity from its forthcoming Arc power plant, scheduled to come online in the early 2030s. In the nuclear fission world, Google has pledged to buy 500 megawatts of electricity from Kairos Power, a small modular reactor startup.

The nuclear deals have yet to deliver power — and they won't for five years or more. In the meantime, the company has been on a renewable energy buying spree. In May, the company bought 600 megawatts of solar capacity in South Carolina, and in January, it announced a deal for 700 megawatts of solar in Oklahoma.
Google said in 2024 it was working with Intersect Power and TPG Rise Climate to build several gigawatts' worth of carbon-free power plants, a $20 billion investment. The outlay isn't surprising given that solar and (to a lesser extent) wind are the only two sources of power that are readily available before the end of the decade. New nuclear power plants take years to permit and build, and even the most optimistic timelines don't see them connecting to the grid or a data center before the end of the decade. Natural gas, which the U.S. has plenty of, is hamstrung by five-plus-year waitlists for new turbines. That leaves renewables paired with battery storage.

Google has contracted with enough renewables to match its total consumption, though those sources don't always deliver electrons when and where the company needs them. 'When we announced to the world that we achieved that 100% annual matching goal, we were very clear that wasn't the end state,' Michael Terrell, Google's head of advanced energy, told reporters last week. 'The end game was 24/7 carbon free energy around the clock everywhere we operate at all times.'

Google has some work to do. Worldwide, the company has about 66% of its data center consumption, matched to the hour, powered by carbon-free electricity. But that average papers over some regional challenges. While its Latin American data centers hit 92% last year, its Middle East and Africa facilities are only at 5%. Those hurdles are part of why Google is investing in stable, carbon-free sources like fission and fusion, Terrell said. 'In order for us to eventually reach this goal, we are going to have to have these technologies,' he said.
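The "quick math" extrapolation above can be reproduced in a few lines. This is a minimal sketch, assuming a company-wide 2014 figure of roughly 4.4 million MWh (an illustrative value the article does not state) and that data centers' 95.8% share held back then.

```python
# A rough check of the article's extrapolation, not Google's own methodology.
# Figures from the article: 30.8M MWh of data center use in 2024 (95.8% of the
# company total) and 14.4M MWh in 2020. The 2014 company-wide total below is an
# assumed illustrative value; the article does not state it.

dc_2024_mwh = 30.8e6          # data center electricity use, 2024
dc_share = 0.958              # data centers' share of company-wide use, 2024
dc_2020_mwh = 14.4e6          # data center electricity use, 2020

total_2014_mwh = 4.4e6        # ASSUMPTION: company-wide use in 2014 (illustrative)

# Apply the (roughly constant) data center share to the assumed 2014 total.
dc_2014_est = total_2014_mwh * dc_share
print(f"Estimated 2014 data center use: {dc_2014_est / 1e6:.1f}M MWh")   # ~4.2M MWh

# Growth factors implied by the article's numbers.
print(f"2020 -> 2024 growth: {dc_2024_mwh / dc_2020_mwh:.1f}x")          # ~2.1x
print(f"2014 -> 2024 growth: {dc_2024_mwh / dc_2014_est:.1f}x")          # ~7.3x
```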
Yahoo
Exclusive: Ambrook raises $26.1 million Series A to provide farmers and ranchers with better accounting software
When I meet Chase Crandall, he's putting up a barbed wire fence. Even on Zoom, the Wyoming sky is clearly in the background, with an expanse of ranch land stretching out into the distance. Crandall is a sixth-generation rancher who's tending to land that was originally settled by his ancestors in the 1850s—but he is thinking proactively about technology in 2025. 'Typically, ranchers are fairly cheap and careful about spending money,' said Crandall. 'And we try to be the same. We try to be as low-cost and run as efficiently as possible.'

Farming has historically been technologically forward, from the invention of the plow in ancient Mesopotamia to the implementation of the mechanical reaper in the Industrial Revolution. So Ambrook—the accounting software startup that Crandall uses and the reason we're talking—is part of a long tradition of farming-focused tech. Ambrook was built expressly with American agriculture and industry in mind, filling a gap that existing software like QuickBooks leaves on the table, said Mackenzie Burnett, CEO and cofounder at Ambrook.

'Farms are in this delta where they are unusually complex for their size and revenue profile,' said Burnett, whose parents are both plant health specialists at the USDA. 'The mom and pop shops are generally doing pretty thin margins, and they are running on software that's built for less complex businesses. They tend to be what we kind of call multi-P&L, dealing with multiple revenue lines. Oftentimes a lot of farms have to vertically integrate in order to just make the margin profiles work.'

Founded in 2021, Ambrook has raised a $26.1 million Series A, Fortune has exclusively learned. Thrive Capital and Figma's Dylan Field (via Field Ventures) led the round, with participation from existing investor Homebrew. BoxGroup, Designer Fund, Mischief, and Not Boring are among Ambrook's new investors in this round, bringing the company's total funding to $29 million.

Farming is complex, Burnett says, because inventory (like livestock) has a literal lifecycle—it is born, grows, and dies. 'Most folks in tech have an idea of how complex a factory floor is in manufacturing,' said Burnett. 'But farming involves biological factors, in the sense you have inventory that is born, grows and dies. That is a huge complication, and understanding how to manage that isn't something that QuickBooks—or even a lot of SaaS software—is built for. It's built with a different kind of professional in mind.'

Ambrook's rise comes at a 'challenging' time for farming in America, one that's trending towards 'fewer farmers who are larger, more sophisticated, more specialized, and more consolidated,' said Jonathan Coppess, former administrator of the Department of Agriculture's Farm Service Agency. 'Farming, like the rest of the economy, is facing real uncertainty around technological advancements,' Coppess wrote to Fortune via email. 'What will be the impact of artificial intelligence, advanced robotics, etc.? Add to that the political and geopolitical uncertainties (from tariff conflicts to wars) and those of climate change. The risks and uncertainties are magnifying in real time, making it very difficult to make any predictions, or to have confidence that trends of the past are indicative of the future.'

So, what does it take to keep a family farm a thriving business?
Generational planning, for one—Calvin Crandall, the ranching family's patriarch, placed the land in a perpetual trust a few years ago, ensuring the land would stay together (and, by extension, remain a viable business) across future generations. Efficiency and margins matter, too, especially when it comes to tracking inventory that grazes, gives birth, and dies.

'What Ambrook solves for me is time management,' said Chase. 'Everything's live, and I don't have to manually update anything. Our margins matter, because there's so much you can't control. Can't control the cattle market, which can get crazy and sometimes you just have to take the price you can.'

How many cows do the Crandalls have, anyway? Calvin drives by while Chase and I are on a Zoom, and stops to chat. I ask. The two tell me with mirth and kindness that the question is quite a faux pas. 'That's like me asking you, 'how much money do you have in your bank account right now?'' Chase laughs. His dad jumps in: 'You've got some money in your bank account, and we've got some cows on our land!'
Yahoo
Mark Zuckerberg has a literal list of AI all-stars to whom he's offering fantastic sums of money. An expert explains why he's hiring so aggressively
Mark Zuckerberg has something called 'The List'—top AI experts whom he's targeting to join Meta's Superintelligence Labs. He revealed the first 11 members in an internal memo to staff. Fortune spoke to an expert who explains Zuckerberg's logic.

In an internal memo, Mark Zuckerberg revealed 11 members of his new AI Superintelligence Labs, whom he poached from OpenAI and other tech giants. The memo, obtained by CNBC and published Monday, lists top engineers and researchers hired to bolster Meta's AI efforts. It comes months after Zuckerberg began his personal AI talent crusade—one that has blindsided competitors and sparked reports of 'The List,' a compilation of the Facebook founder's recruits he hopes will staff the Meta Superintelligence Labs. The document names five ex–OpenAI staff who left the company just weeks after The OpenAI Files report discussed deep leadership concerns.

Zuckerberg has been personally reaching out to potential recruits, the Wall Street Journal reported. In his pitch to OpenAI staffers, the Meta CEO offered a $100 million signing bonus and one year's compensation, OpenAI CEO Sam Altman said on a recent podcast hosted by his brother, Jack. The lucrative, pro-athlete-level job offers come after Meta's latest AI model, Llama 4, received a cool reaction from consumers and critics. Experts say Zuckerberg's moves might be a sign of desperation to keep up with competitors. 'Last time Meta released an AI model, it wasn't as successful as they expected,' Edgar Perez, a corporate trainer specializing in AI and other cutting-edge technologies, told Fortune.

Zuckerberg has made good on his promise to bolster Meta's AI efforts in recent months, hiring Scale AI CEO Alexandr Wang and former GitHub CEO Nat Friedman to help lead the superintelligence team he's assembling. Perez said the next challenge for Meta is to make AI reasoning models that can contend with products like DeepSeek's R1, Google's Gemini 2.0 Flash Thinking, and OpenAI's o1 series.

'At the end of the day, what Mark Zuckerberg wants to incorporate in Meta is AI agents,' Perez said. 'To be able to be successful, [the AI agent] needs to reason. Let's say, if you want to develop a task in a company, or if a customer would like to do some task through Meta, they need to decompose a number of steps. And those steps will need to be managed through a reasoning model. That's not something that Llama, at the moment, can do, and that's why they need to refine.'

But even with the allure of a nine-figure salary, Perez said Zuckerberg's offer may still be hard to swallow for someone working at a place with a strong company culture. 'What happens when somebody quits [the superintelligence team]?' Perez said. 'Let's say, in a week: Will they return the bonuses? … Having that type of money, people might just decide to stay for a year or two and then leave and start their own companies later.'

Even before Zuckerberg's memo, the Meta chief's aggressive talent recruitment strategies have appeared to irk OpenAI's leadership. 'I feel a visceral feeling right now, as if someone has broken into our home and stolen something,' Mark Chen, the chief research officer at OpenAI, wrote in an internal memo obtained and published by Wired on Saturday. 'Please trust that we haven't been sitting idly by.'
Chen told employees he was working with Altman and other leaders at the company 'around the clock to talk to those with offers,' adding, 'we've been more proactive than ever before, we're recalibrating comp, and we're scoping out creative ways to recognize and reward top talent.' Though Chen wrote he would fight to retain all of OpenAI's talent, he 'won't do so at the price of fairness to others.'

With the announcement of a new organization within Meta, experts question whether Zuckerberg's personnel investments will turn a profit in the end, or whether they will lead to more AI-generated woes. In a post on LinkedIn, Vineet Agrawal, an investor in health-tech startups, called Zuckerberg's hiring spree 'desperate.' 'When you're winning with vision, you don't need to win with money,' he wrote. 'Real talent follows challenge and purpose. The moment someone chooses you purely for $100 million, they'll leave you for $101 million.'