News
José Manuel Moya: "Federation is the next natural phase: the next step is to optimize at the aggregate level, taking advantage of the characteristics of each data center."
José Manuel Moya is a tenured professor in the Department of Electronic Engineering at the Polytechnic University of Madrid, with more than 25 years of experience in teaching and research in digital infrastructures and cyber-physical systems. His scientific career focuses on the energy efficiency of data centers, the operation of critical data-based infrastructures, and distributed and edge computing. He has supervised 12 doctoral theses, published more than 100 international papers, and led competitive and transfer projects in collaboration with leading technology companies. He was Director of CeSViMa, the UPM's high-performance computing center, and currently leads the area of energy efficiency in data centers at the Computational Simulation Center. He combines research, industrial transfer, and strategic vision to drive systemic evolution of European digital infrastructure.
In your LinkedIn presentation, you argue that Europe cannot meet its digital and climate commitments with the current operating model. What exactly is it about the infrastructure model we use today that cannot be scaled up?
That's a good question, because the phrase can sound provocative. The first thing I want to clarify is that I don't think the problem is technological. Europe has built technically extraordinary data centers. In fact, Europe has been a pioneer in incorporating energy efficiency as a structural criterion in the design and operation of data centers. The problem is more subtle. It's operational.
I have worked with infrastructures where the technical level is extremely high, but when you start to look closely at the actual behavior of the system, you realize that we optimize isolated variables, not the whole.
Optimizing PUE (Power Usage Effectiveness), for example, was revolutionary at the time. But today I have seen centers with excellent PUE that are operating at less than 30% of their IT capacity. That means you have a very efficient machine... that is mostly idling. From the outside, it looks perfect. From the inside, not so much.
Another very clear example: we measure electricity consumption with pinpoint accuracy, but we rarely know the thermal quality of the heat we expel. In some projects I have been involved in, when we modeled the residual heat, we discovered that it could have been used in urban networks if it had been designed with that logic in mind from the outset.
The current model is designed to optimize parts. And what is coming now, with AI, with mass electrification, with the integration of renewables, requires optimizing systemic behavior. And that is not a criticism. It is a natural evolution. Each stage of the sector has had its focus. Now the focus has to be broadened.
Spain is often presented as a country with great potential in digital infrastructure. What is needed to move beyond being a promise and become a truly mature market?
Spain is a very interesting case: it has extremely high renewable energy penetration, excellent submarine connectivity, physical space to grow, and a privileged geostrategic position between Europe, Africa, and America. Not everyone has that. But the challenge is not technical, it is structural.
Renewable energy generation has grown impressively in recent years. However, the grid, storage, and coordinated planning have not evolved at the same pace. This creates tensions and, at times, a sense of uncertainty.
I have had conversations with operators who are willing to invest hundreds of millions and whose biggest concern is not the market or demand, but the predictability of access to the power grid. It is not a problem of total capacity. It is a problem of synchronization.
And this is where I believe Spain has a historic opportunity. If we are able to jointly plan new digital nodes, network reinforcements, storage, and active participation by large consumers such as data centers, we can turn that apparent limitation into a real competitive advantage.
It would not be the first time that Spain has led the way in integrating renewable energy. We could also lead the way in digital-energy integration.
Is the main obstacle to the sector's development investment, regulation, or an operating model that no longer meets the new demands of AI and resilience?
Honestly, investment is not the problem. The capital is there. There is enormous interest in investing in digital infrastructure. Regulation is not the enemy either. In fact, it is often trying to bring order to very rapid growth. The real change lies in the nature of the load.
AI has completely changed the energy profile of data centers. We are talking about racks of 50, 80, or even 100 kW. Liquid cooling. Load peaks that change in seconds. Power requirements that are closer to an industrial plant than a traditional data center.
If we operate with tools and mental models designed for the cloud of ten years ago, we will fall short. There is a silent race already underway: whoever masters advanced operation, energy integration, and real efficiency will be the preferred partner for the new generation of AI loads. The interesting thing is that this is not a threat to the sector. It is an opportunity to evolve.
The PUE has been used as a benchmark for years. Is it still sufficient?
PUE was absolutely necessary. It was the common language that allowed the sector to take efficiency seriously. But today it is only the beginning of the conversation.
A center may have a PUE of 1.2 and be operating well below its actual capacity. Is it efficient? Yes, from a very specific perspective. Is it optimal from a systemic point of view? Not necessarily.
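The tension between a good PUE and low utilization can be made concrete with a small sketch. PUE is total facility power divided by IT power, so it says nothing about how much of the installed IT capacity is actually doing work. The numbers below are illustrative assumptions, not figures from the interview.

```python
# Hypothetical illustration: a center can report an excellent PUE
# while most of its installed IT capacity sits idle.

def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return total_facility_kw / it_kw

def utilization(it_kw: float, it_capacity_kw: float) -> float:
    """Fraction of installed IT capacity actually drawing power."""
    return it_kw / it_capacity_kw

# Example numbers (assumed for illustration):
it_load = 300.0        # kW actually consumed by IT equipment
it_capacity = 1000.0   # kW of installed IT capacity
facility = 360.0       # kW total, including cooling and power losses

print(f"PUE = {pue(facility, it_load):.2f}")                     # 1.20 — looks excellent
print(f"Utilization = {utilization(it_load, it_capacity):.0%}")  # 30% — the hidden problem
```

The same facility overhead ratio holds whether the center is full or nearly empty, which is exactly why PUE alone cannot answer the systemic question.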
Now we need to start talking about actual usage, digital efficiency per unit of service, hourly renewable energy (not just annual), effective heat reuse, and the ability to interact with the grid. The European framework itself, with the EED and the Delegated Regulation, is already pushing in that direction.
And I think that's positive. It means we're maturing.
You talk about infrastructure that functions as a system. What does that mean in practice?
When I talk about a system, I am not referring to something abstract. I am referring to something very specific that I have seen many times in practice.
In many data centers, information is fragmented. The infrastructure team has its metrics for power and its metrics for cooling. The IT team has different ones. Each optimizes its own area. And they all do it well. But the whole is not always optimized.
Operating as a system means starting to make cross-cutting decisions. For example, if actual workload behavior indicates that certain peaks are predictable, cooling can be anticipated rather than reactive. If we know that the power grid will be under strain at a specific time slot, the center can modulate certain non-critical loads or coordinate with energy storage. If the waste heat has sufficient thermal quality, integration with urban networks can be designed from the outset. This requires three things: comprehensive observability, dynamic modeling, and decision-making based on real data.
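The "anticipated rather than reactive" cooling idea can be sketched in a few lines: instead of waiting for temperatures to rise, the controller sizes cooling for the largest IT load expected within a short lead window. The forecast values, lead time, and the linear cooling-to-IT ratio are all simplifying assumptions for illustration.

```python
# Minimal sketch of predictive (anticipated) cooling, under assumed numbers.

def cooling_setpoint_kw(forecast_it_kw: list[float], hour: int,
                        lead_hours: int = 1,
                        cooling_per_it: float = 0.25) -> float:
    """Size cooling for the largest IT load expected within the lead window,
    so capacity is ramped before the peak arrives, not after."""
    window = forecast_it_kw[hour: hour + lead_hours + 1]
    return max(window) * cooling_per_it

# Predicted IT load per hour (kW), with a sharp AI-style peak at hour 2:
forecast = [300.0, 320.0, 900.0, 880.0, 400.0]

# At hour 1 the controller already prepares for the hour-2 peak:
print(cooling_setpoint_kw(forecast, hour=1))  # prints 225.0
```

A reactive controller would only see 320 kW at hour 1; the predictive one provisions for the 900 kW peak one hour ahead, which is the cross-cutting decision the paragraph describes.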
I have seen projects where simply integrating the electrical, thermal, and IT usage layers into a single model reveals inefficiencies that were invisible until then. When that happens, the center ceases to be a highly efficient building and becomes a smart energy platform. And that leap is enormous.
Are total observability and predictive control aspirational or already necessary?
They are necessary. And I will go one step further: they are not only necessary, they are strategic. The data center is, in reality, a data-generating machine: electrical data, thermal data, IT usage data, dynamic behavior data, network interaction data. And yet, we often treat this data as a technical byproduct, not as a strategic asset.
In the new era of AI, with renewable integration and structured reporting under the EED and Delegated Regulation, operational data is likely to become the most valuable asset in the data center. This is because it is the only data that allows for the optimization of actual utilization, the anticipation of risks, the provision of flexibility to the grid, the demonstration of verifiable sustainability, and the making of decisions based on actual behavior rather than design assumptions.
But here comes the most challenging part: this is not just technology. It's organization.
Leveraging this asset means breaking down silos. It means that energy, IT, and operations must work toward a common vision. It means incorporating real analytical capabilities. It means changing the internal culture.
I have seen centers that want to talk about systemic efficiency... And when you look at the actual telemetry, they have two sensors per cold aisle and none that relate thermal behavior to IT load in real time. With that level of visibility, you are not optimizing. You are estimating. And estimating can work when loads are stable. But with AI, high densities, and renewable integration, operating by estimation is taking a risk.
The interesting thing is that when you start to densify measurement intelligently, the model changes completely. Suddenly patterns emerge, hidden inefficiencies are identified, setpoints are optimized with real data, and underutilized capacity is discovered without the need to increase contracted power.
And that's when you realize that the real asset isn't the sensor itself. It's the ability to interpret the data. And this is only the first step; the possibilities that open up from there are enormous.
For years, we have designed data centers to withstand the worst-case scenario, but now we have to learn to operate them according to the real scenario, which is now more variable and increasingly unpredictable.
Is the sector prepared for European regulation, or is there a gap?
There is a gap, yes. And that's normal. Regulation is evolving faster than many legacy operating models. For years, the focus was on availability and cost. Now the focus includes energy traceability, water footprint, heat reuse, and grid integration.
But I see something positive in this: regulation is forcing the sector to structure its data and further professionalize operations. Those who invest now in consistent measurement systems and analytical capabilities will not only comply. They will have a competitive advantage.
I have spoken with operators who initially viewed reporting as a burden. After structuring their information, they discovered opportunities for optimization that they had not detected before. Well-managed transparency is not a threat. It is a strategic tool.
With the explosion of AI, is the Spanish electricity system ready?
This is one of the most complex questions. Spain has extraordinary renewable energy penetration. That is a huge strength. But it also means that the grid requires more flexibility, more storage, and better planning.
Royal Decree 1183/2020 regulates access and connection, but the key is not only in the regulation. It is in anticipation.
The electrical system can support digital growth, but it requires synchronization between energy planning and digital planning.
If the new digital demand is planned in conjunction with network and storage reinforcements, the result can be very powerful. If not, tensions and delays will arise.
This is how I see it: digitization and the energy transition are not separate processes. They are the same process from two different perspectives. If we understand that, Spain can become an exportable model of renewable-digital integration.
Which energy technologies are truly viable in the long term?
I believe we are entering a phased transition. And it is important to understand that not everything happens at the same time.
First phase: renewables, storage, and smart demand management. This is the foundation. Without it, there can be no discussion. Clean electrification, storage for stabilization, and the ability to shift load intelligently. This phase is already underway.
But the second phase is more interesting: collaboration, the pooling of resources. Today, each data center optimizes within its own perimeter. The next leap is to optimize at an aggregate level, taking advantage of the characteristics of each data center in terms of energy generation and storage capacities, computing capacities, and communication capacities, and even differences in electricity contracts.
First, within the same organization: federate centers, redistribute loads based on renewable availability, hourly price, latency, or criticality. This is already technically possible.
But if we take it a step further, why not enable effective collaboration between organizations? I'm not talking about sharing secrets. I'm talking about sharing capacity or flexibility under clear rules and mutual benefits. If several companies could coordinate part of their non-critical load, we could smooth out peaks, reduce redundant investment, and improve systemic efficiency.
Federation is the next natural phase. Moving from local optimization to global optimization.
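The federation idea above, shifting flexible load toward the site with the best hourly conditions, can be sketched as a toy scheduler. All site names, prices, and the scoring weights are hypothetical; a real system would also model latency, criticality, storage, and contracted-power limits.

```python
# Toy sketch of federated placement of non-critical load (assumed data).
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    price_eur_mwh: float    # hourly electricity price
    renewable_share: float  # 0..1, renewable share this hour
    headroom_kw: float      # spare IT capacity this hour

def score(site: Site) -> float:
    # Cheaper and greener is better; the weighting is an arbitrary assumption.
    return site.price_eur_mwh * (1.5 - site.renewable_share)

def place_flexible_load(sites: list[Site], load_kw: float) -> dict[str, float]:
    """Greedily fill the best-scoring sites up to their headroom."""
    placement: dict[str, float] = {}
    for site in sorted(sites, key=score):
        if load_kw <= 0:
            break
        taken = min(load_kw, site.headroom_kw)
        if taken > 0:
            placement[site.name] = taken
            load_kw -= taken
    return placement

sites = [
    Site("Madrid", 80.0, 0.6, 400.0),
    Site("Frankfurt", 110.0, 0.4, 800.0),
    Site("Oslo", 60.0, 0.95, 300.0),
]
print(place_flexible_load(sites, 600.0))
# prints {'Oslo': 300.0, 'Madrid': 300.0}
```

Even this greedy version shows the shift from local to aggregate optimization: no single center changes its hardware, yet the federated whole consumes cheaper, greener energy.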
And then comes the third phase, which is the most transformative: redesigning the data center model itself.
We are reaching the limits of traditional architecture. Direct-to-chip cooling is a transitional solution to support the growing density of AI. But technologies such as two-phase immersion open the door to something deeper: designing highly optimized blocks for specific applications. This allows us to move toward data centers built like a Lego system, with specialized, interchangeable blocks optimized for their specific function.
If we add structured leasing models to this, we can give IT infrastructure three useful lives: 1) highly demanding applications (AI-intensive, advanced simulation); 2) less critical environments (universities, applied research, public services); and 3) contexts with greater financial pressure or lower criticality (developing countries, basic digitization).
In a federated model, that transition is not a patch. It is part of the design. That improves economic returns, yes. But it also improves real circularity, reduces material pressure, and broadens social impact.
In summary:
- Clean and smart energy.
- Federation and systemic collaboration.
- Modular, specialized, and circular infrastructure.
Long-term sustainability isn't just a matter of green kilowatts. It's a matter of systemic architecture. And I think we're right at the moment when that conversation is starting to become possible.
If you had to identify one absolute priority for a strong and sovereign European digital infrastructure, what would it be?
The priority is very clear to me: integration. Integrating real operational data into decision-making. Integrating data centers with the power grid. Integrating digital planning and energy planning. Integrating economic efficiency and environmental efficiency.
Europe has technical talent, capital, and strategic needs. What is lacking, in many cases, is cross-cutting coordination. We don't need more talk. We need systems that function as systems. And, if I'm honest, I think we're at an exciting moment because, for the first time, digital infrastructure is at the center of the energy, economic, and geopolitical debate. That's not a burden. It's a historic responsibility, and it's also a huge opportunity for those who want to lead the next stage.
This integration is only possible if we understand that the most valuable asset of the data center is not the building or the installed power. It is the operational data. Without it, there is no real optimization, no effective federation, and no systemic architecture.
[1] Directive (EU) 2023/1791, http://data.europa.eu/eli/dir/2023/1791/oj
[2] Delegated Regulation (EU) 2024/1364, http://data.europa.eu/eli/reg_del/2024/1364/oj