Agile, flexible, responsible: next gen data centres
Data centre development is critical to the UK's digital future. Karen Fletcher looks at what it takes to deliver in this sector.
12 November 2025

For most members of the public, the term ‘data centre’ probably suggests a large, windowless building behind high gates, with few visitors. In fact, far from being an estate of homogeneous IT warehouses, the UK data centre sector is more nuanced in both design and operation, and is becoming more so as it develops.
This is a sector steeped in confidentiality, and while the need for discretion is understandable given the sensitive data stored in these facilities, it has led to some misconceptions about data centre buildings. This is particularly true for issues such as energy and water use, where the reality may be very different from what is assumed.
Speaking to professionals who are designing and delivering these buildings sheds new light on what the critical challenges in the industry really are, and how technology, clients and investors are shaping the future of UK data centres.
Behind the scenes, data centres operate as a finely-tuned balance of power, cooling and design. But each element must evolve, keeping pace with technology. Together, these factors are shaping the current and future generations of data centres.
Our interviewees
Danny Cross | Director - Head of Data Centres at Winvic Construction
Rennie Dalrymple | Partner, Project Management at Ridge & Partners
Steven Davidson | Senior Director at Robert Bird Group
Derek Main | Director, Data Centre, Mission Critical at Hoare Lea
Jonathon Stockdale | Director at Studio NWA
Power – a question of distribution not generation
Access to electrical power is the single most important factor for data centre location – it’s the lifeblood of the sector. However, a national shortage of power is not the problem.
As Rennie Dalrymple, Partner Project Management at Ridge & Partners notes: “The UK absolutely has enough generation capacity. The challenge is transmission and distribution; getting power from where it’s produced to where the data centres need it.”
As a result, power is the most significant constraint when selecting potential data centre sites. Projects can hinge on existing grid connections, or the ability to secure new ones quickly, which rarely happens.
One current trend is that, as land prices rise in the traditional London-centric data centre clusters, new sites are being sought and developers are naturally following the power.
Jonathon Stockdale, Director at Studio NWA, an architectural practice specialising in data centres, says: “Scotland, the North East, South Wales – places traditionally off the data centre map – now make sense because of power proximity. Power is constrained in the South, but more abundant in the North and Scotland, where there is also relatively easy access to renewably-generated power.”
But the story of energy in this sector does not end with the grid. Inside each facility, the way data is stored, used and transmitted is also driving energy use. Derek Main, Director Data Centre, Mission Critical for Hoare Lea, notes that power demand varies, depending on the type of data centre: “AI and high-performance computing are energy intensive,” he explains. “But they remain a small part of total activity. Most data centres are still focused on transaction processing, data storage and archival, each with its own distinct energy profile.”

This means that for sites where latency is not critical, locating data centres close to renewable generation is logical and sustainable. For example, under a process known as ‘wind curtailment’, UK wind farm operators are paid to stop generating to avoid overloading the grid – at a cost to consumers through their energy bills.
As Steven Davidson, Senior Director of Robert Bird Group notes, data centres could make a positive contribution to balancing the UK’s grid: “If data centres were positioned close to wind generation, they could use that energy, stabilise the grid and cut waste as well as ultimately lowering costs for consumers.”
Energy use – shaped by technology, shaping design
There is a misconception that the power demands of today’s data centres are entirely driven by the growth of AI, but this is not the case according to experts in the field.
“There has been a bit of a misunderstanding about AI data centres,” says Main. “They’re a small fraction of the overall data centre world; currently a tiny percentage of global capacity. Even with growth, they will probably only ever represent around 20% to 30% of total use.”
Instead, the energy use of a data centre is more nuanced and depends on how data is stored and used, giving a more varied energy profile than many might expect.
Derek Main points out that it is vital to understand the function of a data centre and what proportion AI will represent within it, as this is likely to be small. Although AI workloads increase rack density, and therefore cooling and power demand, other workloads within the data centre, such as enterprise, co-location and cloud computing, are also seeing increased rack densities.
This means that designers must bear in mind the energy demands of these other functions, as they lead to variations in energy demand across data centres.

“For example, you have transactional data for online shopping. Then you have storage provision, where sometimes there can be several levels such as instantaneous for fast data access,” says Main.
“A good example of this is working with 3D modelling where users are constantly transferring data back and forth. Then you might have archive storage, which could be anywhere in the world. But the demand for energy is all completely different for each of these functions.”
Subtle differences aside, the fact is that higher rack densities are changing the demand for supporting services, particularly cooling.
The balance of white space (where the IT is housed) and grey space (plant rooms supporting the building’s critical functions, such as cooling) within data centres is changing. Where the ratio of white-to-grey space was once 1:1, it could now reach 1:2 or 1:3.
This shift impacts the work of architects as well as engineers, as Stockdale explains: “Higher rack densities could shrink the data hall in theory, but getting enough cooling into small spaces is the constraint. Roof area for chillers and condensers often caps how much IT load we can host. So density targets, cooling surface area and plant layouts become an intricate jigsaw.”
Understanding what drives energy use inevitably leads to another critical challenge: keeping equipment cool. As rack densities rise and chip technology advances, the industry is re-thinking how it manages heat.
Cooling – separating fact from fiction
Cooling and energy use are two sides of the same coin in data centre design and delivery. Like energy consumption, cooling requirements have been affected by increased rack densities, because denser racks run at higher temperatures.
This in turn leads to increased cooling demand in the building. But as with energy use, there are important subtleties behind this general trend, which have led to some misunderstandings.
For example, there are concerns about the use of water in UK data centres, largely because of the term ‘liquid cooling’. As Main says: “In this case, liquid refers to specialist coolants within a closed loop, not water. Adiabatic or evaporative cooling methods, which use water, are used in the USA but are extremely rare in the UK because of Legionella concerns.”
In fact, a TechUK survey in 2025 found that 51% of UK data centre sites use waterless cooling systems, and 64% use less than 10,000m³ of water per year – less than a typical leisure centre.
There is a range of technologies available for cooling, and engineers focus on optimising efficiencies by applying these (see our Box Out). However, their choices can be narrowed, depending on the IT technologies being used in the data centre.

“The challenge is that server manufacturers aren’t aligned,” says Main. “It’s like the VHS versus Betamax problem. Competing approaches mean that there is no universal standard for cooling technology. We can design for any cooling technology, but we can’t dictate the technology path as the chip and server manufacturers set the pace.”
The narrowing of cooling technology choices adds to the challenge of meeting rising server temperatures. Twenty years ago, servers ran at 19°C to 21°C. Today, air temperatures coming off the racks can routinely reach 30°C.
This is another point where technology manufacturers are holding back more efficient cooling techniques. It is common for server manufacturers to require ambient temperatures of between 18°C and 27°C (within the ASHRAE 2016 recommended range). Exceeding this level can void their equipment warranties.
Here in the UK, the use of free cooling allows data centres to achieve good levels of efficiency, with annual average PUE values of 1.1 to 1.2. However, this could be improved.
“If chip manufacturers would allow that higher temperature to rise just a few more degrees, we could extend free cooling hours and save even more energy,” says Main. Free cooling is possible when the heat being rejected is at a higher temperature than the ambient air, so it can be discharged directly to the outside without resorting to less efficient mechanical refrigeration.
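As a rough illustration of Main’s point, the sketch below shows how extra free-cooling hours could shift an annual average PUE. All of the figures (hours, partial PUE values) are assumptions chosen for the example, not data from any real facility or from the interviewees.

```python
# Illustrative only: how extra free-cooling hours might shift annual average PUE.
# All figures below are assumptions for the sketch, not measured values.

HOURS_PER_YEAR = 8760

def annual_average_pue(free_cooling_hours: int,
                       pue_free: float = 1.08,        # assumed PUE while free cooling
                       pue_mechanical: float = 1.35   # assumed PUE on mechanical cooling
                       ) -> float:
    """Time-weighted average PUE across the year."""
    mech_hours = HOURS_PER_YEAR - free_cooling_hours
    return (free_cooling_hours * pue_free + mech_hours * pue_mechanical) / HOURS_PER_YEAR

# Baseline: free cooling assumed available for ~7,000 hours of the year.
baseline = annual_average_pue(7000)

# If higher allowable server temperatures added ~800 more free-cooling hours:
extended = annual_average_pue(7800)

print(f"Baseline annual PUE: {baseline:.3f}")   # ~1.13 with these assumptions
print(f"Extended annual PUE: {extended:.3f}")   # ~1.11 with these assumptions
```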
Technology is evolving quickly, but so are the design and construction methods bringing these buildings to life. With demand rising and build times under pressure, modular construction, flexibility and collaboration have become the watchwords of delivery.
Design and construction – building smarter and better
Like power and cooling, data centre design and construction are shaped by their function, but other factors are coming into play.
Steven Davidson says: “The hyperscalers tend to use a standard model for their builds. They have developed a template that fits their supply chain and they know it works. Once the site is ready, they can deploy that design in almost any location with a few regional or material changes.”
Davidson notes that his team’s job is to set up the site for rapid deployment: “From a geotechnical and civils perspective, where we can really help the project is by remediating the ground efficiently to make it suitable for the client’s design.”

Speed of build is critical for many data centre clients, and this requires experience and know-how from contractors. Earlier in 2025, Winvic Construction announced that it would be moving into the data centre sector, using its expertise in delivering large, technically-demanding projects to focus on the hyperscale market.
Danny Cross, Director - Head of Data Centres at Winvic, says: “The data centre sector needs experienced builders, which is why we’re entering this market. Our know-how in delivering large industrial and logistics projects at scale and at pace gives us a strong advantage.”
He highlights the importance of coordination in complex projects: “We understand fast-track construction, but in data centres, MEP integration drives the overall programme. So, it’s not just about building a shell quickly; it’s about coordinating complex mechanical and electrical installations that are critical to commissioning.”
Modular and prefabricated design are becoming more widely used to support faster builds. “Co-location providers often use fully-prefabricated, power-ready shells that can be rapidly deployed,” says Davidson.
“We’re seeing the supply chain move more in line with the hyperscalers in this respect.”
Although templates and prefabrication are being used in the data centre sector, flexibility is critically important, because the functionality of a data centre may change to suit an incoming client, or simply because technology evolves quickly.
Dalrymple has seen this impact the delivery of data centre projects: “It is a constant challenge. The building may take years to deliver, but IT loads, rack densities and MEP philosophies evolve much faster.”
Davidson agrees: “The design you start with at Stage 2 may not be the design you build,” he says. “Requirements change along the journey, even on site. So, we build in redundancy from the start with optimised grid layouts, piling options with enhanced capacity as well as a long list of other key design criteria assumptions at early stages. When those late design changes come, your ability to remain agile in design means you are prepared. I enjoy that, it’s when you get to do proper engineering.”

Stockdale says that a flexible approach is vital: “On the technical side, we design for future flexibility. We don’t size purely for today and we have to understand the trajectories for technology. We need to allow the building to adapt.”
But there is a challenge to flexibility: “Over-designing for flexibility can make a scheme unviable, but under-designing risks obsolescence. It’s always a balance,” says Dalrymple.
The level of flexibility and coordination needed for data centre projects requires a very specific approach from the whole design and construction team, as well as the application of IT.
“I can’t imagine delivering these projects without a 3D collaborative environment,” says Stockdale. “We have dedicated BIM staff who maintain model health, standards and automation so that we can keep models performing and consistent.”
Danny Cross agrees that teamwork is critical: “The key is trusted partnerships, and close collaboration between all parties – client, contractor and MEP specialists. That’s how you deliver these highly technical facilities on time.”
However, design is not only an aspect of function. Today’s data centres are increasingly being developed closer to urban centres, making good design a key element of the build, particularly when it comes to gaining planning permissions.
“Design quality expectations have risen – and rightly so,” says Stockdale. “We recently secured planning committee approval for a multi-storey scheme to the West of London on a major arterial route. The council wanted more than a ‘blank box’ design. We worked hard on massing articulation and frontage, splitting the data halls and putting offices between, creating a less dense design.”
The interface between data centres and the wider public realm is becoming more of a factor in design as projects move away from the ‘traditional’ locations near London. This means that construction teams may find themselves working with planning authorities that do not have experience of looking at data centres as infrastructure in their area.
“There’s an education curve,” says Stockdale. “In new areas we spend time explaining what data centres are and how they operate. I sit with another director on a group that engages with the Civil Service DC team. Overall, central and local governments are learning fast, and that’s pushing clients towards higher-quality architecture.”
Site constraints are another factor influencing the design of data centres. As sites close to London become more squeezed, engineers are challenged to re-think: “If a site has power, capacity and connectivity it often makes sense to build upwards,” says Davidson.
“The challenge is when the facility hosts multiple tenants, each with different cooling or rack density requirements. Structurally, we can handle almost anything, but there is a limit. You’re not going to build a fifty-storey data centre, but adaptability is key.”
Good design is increasingly a factor for success in this sector. As Rennie Dalrymple says: “These projects are large and have a visual impact, so taking planners on that design journey is crucial. Some are also adjacent to residential areas, so the design quality and communication process really matter.”
Speed and flexibility are not the only measures of success for this sector. As data centres grow in scale and visibility, the sector faces mounting scrutiny over its environmental and social footprint. This is prompting some deep thinking on carbon, community and what we do with legacy sites.
Data centres and ESG – designing for community and re-use
With the focus on energy and water use, it’s important not to lose sight of other aspects of data centres’ environmental and social impacts.
Embodied carbon is an increasingly important factor in the built environment, shaping new design and leading to greater consideration of re-using existing structures.
Stockdale says that as an architect, he can influence carbon outcomes from the early stages of design: “Our biggest lever is embodied carbon. These are vast buildings, so material choices matter. We have in-house sustainability specialists and internal carbon tools that let us run real-time carbon counts on our models. We can then test how small design tweaks affect totals.”
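To show how a ‘running carbon count’ of this kind can work in principle, here is a generic sketch: material quantities taken from a model multiplied by embodied carbon factors. This is not Studio NWA’s actual tool, and the carbon factors and quantities are assumptions chosen purely for illustration.

```python
# A minimal sketch of an embodied-carbon running count: model quantities
# multiplied by carbon factors. Factors and quantities below are illustrative
# assumptions, not values from any real project or proprietary tool.

carbon_factors_kgco2e_per_unit = {
    "concrete_m3": 300.0,      # assumed kgCO2e per m3 of concrete
    "steel_tonne": 1500.0,     # assumed kgCO2e per tonne of structural steel
    "aluminium_tonne": 9000.0, # assumed kgCO2e per tonne of aluminium
}

design_option = {
    "concrete_m3": 12_000,
    "steel_tonne": 2_500,
    "aluminium_tonne": 40,
}

def embodied_carbon_tonnes(quantities: dict) -> float:
    """Total embodied carbon (tCO2e) for a bill of material quantities."""
    kg = sum(quantities[m] * carbon_factors_kgco2e_per_unit[m] for m in quantities)
    return kg / 1000.0

print(f"Embodied carbon: {embodied_carbon_tonnes(design_option):,.0f} tCO2e")
```

Re-running the total after each design tweak – a thinner slab, a reused frame – is what allows the effect of small changes to be tested in near real time.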
Davidson agrees that embodied carbon needs careful consideration: “Unfortunately, embodied carbon is still poorly measured. People often compare the carbon of a new-build against another new-build, ignoring the carbon already ‘spent’ in the existing structure. We need better metrics, but the principle is the same – every tonne of concrete already cast has released its carbon. Reusing that frame is always a win.”
Re-using legacy data centres has many benefits – not least that they already have a power connection. The typical life-cycle of a data centre varies, and often the limiting factor is rack density and its associated cooling systems, which can make it challenging to re-use some existing sites.
“Most modern facilities use closed-loop systems to minimise water use,” says Dalrymple. “That will become the norm. Older facilities will eventually need major upgrades to remain viable.”
The rapid growth of data centres and government backing has given them a high profile, including in the national press. Coverage has not always been positive, and some of this has been the result of misunderstandings noted earlier. The government’s approach to overturning local planning decisions that rejected data centre applications has not helped their image.
But the industry recognises that it must acknowledge its societal impacts as it grows. Derek Main says: “Data centres are sometimes being done to communities, not with them. At Hoare Lea in 2024, we set out to examine our social responsibility and how it connects to our data centre work. We developed a framework called the Social Charter, which measures the social value of our data centre schemes.”
This Charter focuses on integrating facilities more positively into communities, not only delivering jobs and investment, but also in terms of design, ecology and education.
“Historically, industrial projects were community anchors,” says Main. “We’ve lost that connection. The Charter aims to reintroduce it by setting benchmarks for how operators engage locally, create green spaces and share opportunities.”
Stockdale agrees with this view: “We have to think about the impact of a data centre in that context and how we interface so that we’re not just windowless spaceships landing in these environments.”
Dalrymple echoes this, saying: “Planning authorities are beginning to grasp the balance. Data centres bring investment and jobs, but they must also be designed responsibly and with local engagement.”
The road ahead for UK data centres
The UK’s data centre sector is growing and attracting new investment and entrants to the market.
As someone at the forefront of delivering for this sector, Rennie Dalrymple is bringing all sides together: “My role bridges the world of finance and investment with property, real estate and construction, helping both sides understand each other. Data centres are complex, long-gestation, capital-heavy projects, so navigating the full cycle from planning to delivery is a significant part of what we do.”
While the investment that the industry now attracts is welcome, it has also created change in the market. Dalrymple notes that in the past, developers were more inclined to build space speculatively and prove their model by attracting tenants later. But new investors have brought a new approach.
“Institutional investors have entered the market, backed by pensions and sovereign funds,” says Dalrymple. “That money comes with strict governance and risk controls, meaning they won’t release funds until a pre-lease is signed. It’s understandable, but has slowed things down.”
The level of investment can bring major changes for areas that are attracting new data centre development: “Areas like Teesside, South Wales and parts of Scotland have strong power availability and affordable land. These areas often have histories of industrial decline, so inward investment from digital infrastructure can be transformative. Once a major project lands, others tend to cluster nearby, creating ‘availability zones’. It’s potentially a significant economic opportunity,” he adds.
Smaller data centres are also on the cards for the future, with urban ‘edge’ centres providing fast data communication hubs for developments such as driverless cars. In some cases, very small ‘data centres’ can be packaged into containers and placed on compact sites such as car parks – a plug-and-play approach.
Moving into urban areas to deliver closer connectivity provides the option of re-using other types of buildings to house a form of data centre.
Davidson says: “We talk about legacy data centres themselves, but there’s also the question of what happens to other under-used buildings, such as commercial or industrial properties that are no longer fully occupied. Some could host small-scale data functions to enhance their value.”
He suggests using a modular unit on the roof: “Structurally, it can be cheaper to place heavier kit higher up, since the loads distribute through the building frame as they travel downwards. You might repurpose lower floors for mixed use, perhaps community or residential, while integrating data or energy facilities above or below.”
Stockdale agrees that this could be a useful option: “Adaptive re-use of existing structures, especially robust urban frames such as old mills and industrial buildings, with character and presence, is an option. Not all will suit the weight and precision of data centre equipment, but some are excellent candidates and could be given a second life with the right risk appetite from users.”
As the industry progresses, new entrants such as Winvic are important to delivering growth. Having teams ready to go is vital. Danny Cross says: “We have a clear business commitment to the data centre sector and over the last six to twelve months we have focused on developing our supply chain, upskilling our teams, recruiting new expertise and partnering with MEP specialists.”
Conclusions
The UK government wants the data centre sector to succeed as part of its plans for national growth and an AI-driven future. If it could do one thing to accelerate that success, it would be to improve how electrical power reaches the places that need it most – encouraging investment in transmission alongside generation.
The industry itself is stepping up. Engineers and architects are designing with greater sensitivity to energy use, carbon impact and community context. They are also exploring how data centres can play a more active role in the wider power network, helping to balance the grid, anchor renewable energy supply and reduce the cost of wind curtailment to households.
The UK now has the expertise, the investment and the will to lead this next phase. What’s needed is alignment: between government and industry, between power and planning and between the nation’s digital ambitions and the infrastructure that will sustain them.
Useful technical terms for data centre infrastructure
CPU (Central Processing Unit) - a general-purpose processor which handles a variety of tasks such as running applications and system management.
GPU (Graphics Processing Unit) - a specialised processor originally developed for gaming. GPUs have thousands of cores designed to work in parallel on tasks that can be broken down into many smaller units, making them much faster than CPUs for such workloads. They are at the heart of AI, machine learning and 3D rendering.
Rack – a data centre rack is the physical frame or cabinet that houses servers, storage devices, cables, switches and other IT equipment. To increase ‘rack density’ means to put more power and computing performance into a rack. This is usually measured by the amount of power, in kilowatts, that the rack consumes. For example, the industry is moving from a typical 5kW rack to 20kW, 30kW or more.
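To give a sense of why this matters, the rough arithmetic below compares the total IT load of the same hall at legacy and higher densities; the rack count and densities are illustrative assumptions only.

```python
# Rough arithmetic on rack density: all figures are illustrative assumptions.
racks = 1000

legacy_kw_per_rack = 5    # assumed typical legacy rack
dense_kw_per_rack = 30    # assumed higher-density rack

legacy_it_load_mw = racks * legacy_kw_per_rack / 1000   # 5 MW
dense_it_load_mw = racks * dense_kw_per_rack / 1000     # 30 MW

print(f"Same hall, same rack count: {legacy_it_load_mw} MW vs "
      f"{dense_it_load_mw} MW of IT load to power and cool")
```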
Rear door cooling – a heat exchanger is attached to the back of a server rack. Heat is captured by liquid coolant in the heat exchanger and carried away from the equipment. This is categorised as a form of liquid cooling, but is used in conjunction with air cooling in the server room to extract heat from the space.
Direct-to-chip (D2C) cooling – cold plates with coolant tubes are attached to the chips in a server. These absorb heat, which is circulated via the coolant back to a coolant distribution unit (CDU) where the heat is removed. This is considered a form of liquid cooling because of the liquid coolant circulating at the chip. D2C is considered more energy efficient than rear door cooling because it removes heat at the source; however, air cooling is still required to manage heat from other components.
Immersion cooling – hardware such as CPUs and GPUs is entirely submerged in a dielectric coolant to dissipate heat. While more efficient than other cooling methods, it is still an emerging technology, though it is gaining traction in the era of AI and high-density racks.
Latency – the time delay when data travels from its source to its destination. This can be within a data centre or between a user and the data centre. Low latency means data travels quickly; high latency means it takes longer to reach its destination. The requirement for low latency depends on the main function of the data centre. For example, low latency is critical for cloud-based gaming and real-time trading.
PUE – Power Usage Effectiveness – the ratio of the total power used by a data centre to the power used by its IT equipment. A lower PUE indicates that a data centre is more efficient, since it is using less energy for non-IT infrastructure such as cooling. Theoretically, the best PUE would be 1.0, with all power focused on compute; however, there will always be some demand from other equipment. In the UK, data centres can achieve PUEs of around 1.1 to 1.25, which is considered highly energy efficient.
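A minimal worked example of the PUE calculation, using assumed figures purely for illustration:

```python
# PUE = total facility energy / IT equipment energy. Figures are illustrative.
it_energy_mwh = 10_000          # assumed annual IT energy
cooling_and_other_mwh = 1_500   # assumed annual non-IT energy (cooling, losses, lighting)

total_energy_mwh = it_energy_mwh + cooling_and_other_mwh
pue = total_energy_mwh / it_energy_mwh
print(f"PUE = {pue:.2f}")       # 1.15 with these assumptions
```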
WUE – Water Usage Effectiveness – the ratio of total annual water used to the total annual energy consumed by the IT equipment. It is expressed as litres per kilowatt hour (L/kWh). A lower WUE indicates greater water efficiency.
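And a similar worked example for WUE, again with assumed figures chosen only to show the arithmetic:

```python
# WUE = annual water use (litres) / annual IT energy (kWh). Figures are illustrative.
annual_water_m3 = 8_000               # assumed annual water use
annual_it_energy_kwh = 10_000_000     # assumed annual IT energy (10 GWh)

wue = (annual_water_m3 * 1000) / annual_it_energy_kwh   # convert m3 to litres
print(f"WUE = {wue:.2f} L/kWh")                          # 0.80 L/kWh with these assumptions
```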
