
Data Center Capacity is Shifting Rapidly to Hyperscale Operators
The number of large data centers operated by hyperscale companies reached 1,189 at the end of the first quarter, and new data from Synergy Research Group shows that they now account for 44% of the worldwide capacity of all data centers. Over half of that hyperscale capacity is now in own-built, owned data centers, with the balance in leased facilities. With non-hyperscale colocation accounting for another 22%, that leaves on-premise data centers with just 34% of the total. This is in stark contrast to six years ago, when almost 56% of data center capacity was in on-premise facilities.
Looking ahead to 2030, hyperscale operators will account for 61% of all capacity, while on-premise will drop to just 22%. Over that period, the total capacity of all data centers will continue to rise rapidly, driven primarily by hyperscale capacity growing threefold over the next six years. While colocation share of total capacity will slowly decrease, colocation capacity will continue to increase each year at near double-digit rates. After a sustained period of essentially no growth, on-premise data center capacity is receiving something of a boost thanks to GenAI applications and GPU infrastructure. Nonetheless, on-premise share of the total will drop by around two percentage points per year over the forecast period.
'Cloud and other key digital services have been the prime drivers behind data center capacity expansion, and the dramatic rise of AI technology and applications is now providing an added impetus,' said John Dinsdale, a Chief Analyst at Synergy Research Group. 'However, the mix of data center capacity is quite different region by region, an example being that hyperscale owned data center capacity is much more prevalent in the US than in either the EMEA or APAC regions. Overall though, the trends are all heading in the same direction. All regions will see double-digit annual growth rates in overall data center capacity over the forecast period, and all regions will see the hyperscale owned portion of that capacity growing by at least 20% per year.'
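As a rough consistency check, the share and growth figures quoted above can be combined: if hyperscale capacity triples over six years while its share rises from 44% to 61%, the implied annual growth rates for total, colocation, and on-premise capacity line up with the article's claims. A minimal sketch, using only the numbers quoted in the text and assuming the six-year 2024–2030 forecast window:

```python
# Sanity check of the Synergy forecast figures quoted in the article.
# Inputs: current shares (44/22/34), forecast 2030 shares (61/17/22),
# and a threefold hyperscale capacity increase over six years.
h_now, colo_now, onprem_now = 0.44, 0.22, 0.34   # current capacity shares
h_2030, onprem_2030 = 0.61, 0.22                 # forecast 2030 shares
colo_2030 = 1 - h_2030 - onprem_2030             # remaining 17% is colocation
years = 6

# Hyperscale tripling while reaching a 61% share implies total capacity
# grows by 3 * 0.44 / 0.61, roughly 2.16x over the period.
total_growth = 3.0 * h_now / h_2030

def cagr(multiplier, n=years):
    """Compound annual growth rate implied by an n-year multiplier."""
    return multiplier ** (1 / n) - 1

print(f"total capacity:     {cagr(total_growth):.1%}/yr")                         # double digit
print(f"colocation:         {cagr(total_growth * colo_2030 / colo_now):.1%}/yr")  # near double digit
print(f"on-premise:         {cagr(total_growth * onprem_2030 / onprem_now):.1%}/yr")
print(f"on-prem share drop: {(onprem_now - onprem_2030) / years:.1%}/yr")         # ~2 points/yr
```

Running the sketch gives roughly 13.7% total, 8.9% colocation, and 5.8% on-premise annual capacity growth, consistent with the "double-digit", "near double-digit", and two-percentage-points-per-year figures in the text.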

Related Articles


TECHx
4 hours ago
Why Generative AI Isn't a Silver Bullet for Healthcare
Generative AI is revolutionizing healthcare, but challenges remain. Ashley Woodbridge, Lenovo CTO META, explores why it's no silver bullet.

The healthcare sector is on the brink of a significant surge in AI investment, projected to rise by an astounding 169% over the next year, the highest increase among all industries surveyed in EMEA. The uptick follows the promising early results of AI projects: a remarkable 74% of healthcare AI pilot projects have met or exceeded expectations, with 13% surpassing them. Yet despite this positive momentum, only 2% of healthcare organisations have actively deployed AI at scale, with the majority of initiatives remaining in the pilot or planning stages. This raises the question: how can the industry and its partners move beyond this plateau?

What is Generative AI Doing?

For most people, their experience of Generative AI (GenAI) has been through conversations with chatbots, creating new content, or summarising data. These lighter use cases have highlighted numerous challenges that must be addressed before widespread adoption can occur, especially in highly regulated industries like healthcare.

One of the primary obstacles identified in the region-wide survey is data quality. AI models, including GenAI, are only as effective as the data they are trained on. In healthcare, data often exists in silos, across incompatible systems, and in formats that are difficult to interpret. There are also important patient data privacy considerations to take into account. Despite the complexity involved in sorting and standardising this data, the potential benefits of tackling these issues and making it usable by AI are immense. For example, tools like Epic's 'Slicerdicer' allow healthcare professionals to query large datasets through conversational interfaces, uncovering important trends in patient outcomes and informing better care.
The depth of queries is particularly powerful, allowing healthcare providers to uncover trends amongst patients who share a condition or illness that may otherwise have remained hidden. Taking the technology in a different direction, AI-powered 'ambient digital scribes' are being trialled by NHS doctors in the UK. These systems listen to patient appointments and automatically generate clinical notes, saving hours of administrative work and helping to reduce burnout among medical staff.

However, healthcare providers must tread carefully. Public sentiment towards AI in healthcare remains cautious. Only 28% of people aged over 60 feel comfortable with AI technologies being used in their care, and 75% of consumers overall want to be informed if AI is being used in their healthcare communications. Transparency is critical. For AI to enhance patient experience without undermining it, trust must be maintained.

In countries like the UAE and Saudi Arabia, AI algorithms are enhancing radiology by assisting in the analysis of medical images, enabling quicker and more accurate detection of conditions such as lung cancer. Predictive analytics are being utilized for real-time patient monitoring, allowing healthcare providers to intervene early in critical situations. Generative AI is also making strides in drug discovery, particularly in Qatar, where researchers are modelling molecular interactions to accelerate the development of new therapies. Additionally, AI-powered chatbots and virtual health assistants are streamlining telemedicine services, providing preliminary diagnoses and scheduling appointments, thus improving access to care.

AI's Role in Medical Research

Beyond frontline care, GenAI is turbocharging work in the field of medical research. Earlier this year, a researcher at Imperial College London used an AI tool developed by Google to investigate why certain bacteria are resistant to antibiotics.
In just 48 hours, the tool proposed four viable hypotheses, whereas it had taken scientists over a decade to finalise just a single hypothesis manually. The result was so astonishing that the original researcher initially suspected the AI had accessed unpublished work on his personal computer, which was proven not to be the case.

These breakthroughs are being made possible by the high-performance computing systems behind increasingly powerful AI models. At Lenovo, we are proud to partner with the Broad Institute on genome analysis, helping researchers accelerate one of the most data-intensive tasks in science. The Lenovo Genomics Optimization and Scalability Tool (GOAST) reduces the time needed to analyse a whole human genome from over 100 hours to just 47 minutes.

Other organizations are also harnessing advanced AI and computing tools to push the boundaries of healthcare. Hungarian company 3DHISTECH, for example, uses Lenovo's AMD Threadripper-powered ThinkStation P620 workstations to build detailed 3D virtual models of human and non-human tissue. These models can zoom in to the level of individual chromosomes, enabling new frontiers in digital pathology. Their systems are used by institutions around the world, including Harvard Medical School, Novartis, and the Wuhan Institute of Virology. Notably, a 3DHISTECH system played a pivotal role in diagnosing the first COVID-19 patient in China, demonstrating how AI-enabled technology can impact global health crises.

The Human Element Remains Crucial

While AI offers compelling advantages, it's important to remember that technology alone isn't the answer. The human touch in healthcare remains indispensable. Healthcare providers must ensure transparency in their AI implementations and address data quality issues to fully reap the benefits of AI.
By doing so, they can enhance patient care, reduce burnout among medical staff, and drive groundbreaking research, all without sacrificing the trust and comfort of those they serve.

In conclusion, the healthcare industry stands on the brink of an AI-driven revolution. With thoughtful implementation and a focus on maintaining trust, AI has the potential to transform healthcare for the better. The journey won't be without its challenges, but the rewards promise to be well worth the effort.

By Ashley Woodbridge, CTO, Lenovo, META


Channel Post MEA
a day ago
Data Center Capacity is Shifting Rapidly to Hyperscale Operators


Channel Post MEA
a day ago
Rubrik Acquires Predibase To Accelerate Agentic AI Adoption
Cybersecurity company Rubrik has announced an agreement to acquire Predibase to accelerate agentic AI adoption from pilot to production at scale. Together, Predibase and Rubrik will deliver radical simplicity in models and data, resulting in improved accuracy, lower costs, better performance, and automated data governance. Venture firms Greylock and Felicis led Predibase's funding to date; terms of the transaction were not disclosed, and completion is subject to customary closing conditions.

Founded by AI technologists from Google and Uber, Predibase offers a fast way to fine-tune open source models into highly accurate, production-ready solutions. The Predibase platform combines a proprietary post-training stack for customizing models with a highly optimized inference engine. The platform includes a turbo serving engine for over 2x performance gains, along with LoRA eXchange, an open source system for deploying personalized models at scale. With Predibase, teams can support different users, use cases, and departments without ballooning infrastructure costs.

'We created Predibase to lift the barriers between an idea and production-ready AI. Today, many organizations still face challenges moving beyond the proof-of-concept stage,' said Devvret Rishi, Co-Founder and CEO of Predibase. 'Predibase removes the hardest part of that journey and accelerates production-ready AI by giving teams an easy-to-use platform to tune models to their own data and run on an optimized inference stack. This unlocks more accurate results and faster models, all at lower cost.'

Overcome the Proof of Concept Wall

Gartner found that, on average, more than half of AI projects never make it into production, and that it takes eight months to go from AI prototype to production. Common hurdles include the risks in accessing valuable data, limitations in model accuracy and quality, high infrastructure costs, and a lack of data governance.
These challenges extend the time needed to realize a clear return on investment. Predibase delivers better performance, up to 80% cost savings, and reduced AI infrastructure complexity compared with hosting foundation models.

A Powerful Combination for Secure, Scalable AI

'What the Predibase team has achieved with model training and serving infrastructure in the last few years is nothing short of remarkable. AI engineers and developers across the industry trust their expertise,' said Bipul Sinha, CEO, Chairman and Co-Founder of Rubrik. 'Together, Rubrik and Predibase will drive agentic AI adoption around the world and unlock immediate value for our customers.'

Integrating Predibase will expand the work Rubrik is doing today to secure and deploy GenAI applications with Amazon Bedrock, Azure OpenAI, and Google Agentspace. Organizations globally rely on Rubrik to tackle complex challenges including accessing the right data, managing security, and optimizing for cost and performance. The combination of Predibase and Rubrik will bring optimized, fine-tuned, cost-effective models with governed data to help customers securely deploy agentic AI. Bipul Sinha discussed the acquisition in a blog post.
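The "personalized models at scale" claim mentioned above rests on the structure of LoRA fine-tuning: each adapter stores only two small low-rank matrices on top of a shared frozen base model, so many per-customer adapters can share one deployment. A minimal back-of-the-envelope sketch, where the layer dimensions and rank are illustrative assumptions, not Predibase figures:

```python
# Why many per-customer LoRA adapters are cheap: each adapter adds a
# low-rank update B @ A to a frozen base weight W (d x k), storing only
# d*r + r*k parameters instead of d*k. Sizes below are hypothetical.
d, k, r = 4096, 4096, 16                # layer dimensions and LoRA rank

full_finetune_params = d * k            # one fully fine-tuned weight matrix
lora_adapter_params = d * r + r * k     # low-rank factors B (d x r) and A (r x k)

print(full_finetune_params)             # 16777216
print(lora_adapter_params)              # 131072
print(full_finetune_params // lora_adapter_params)  # 128x smaller per adapter
```

At these sizes, each personalized adapter is roughly 1/128th the size of a full fine-tune of the same layer, which is why serving many tenant-specific models over one base need not balloon infrastructure costs.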