An interview with Alina Rubina, project manager of Tellus
Eleven projects, funded by the German federal government to the tune of 117.4 million euros, are currently developing prototypes for Europe’s future Gaia-X data ecosystem. They are working on business models, cloud platforms, applications, and programming interfaces. But only one of the eleven projects deals with the lowest layer of infrastructure: the network itself. In an interview, project manager Alina Rubina explains how the Tellus project wants to give the Internet an upgrade.
Alina, you used to work in academia and have received several awards for your research publications. Today you work as the project manager for the Gaia-X funded project Tellus at DE-CIX, one of the world’s largest Internet Exchange operators. What is your professional background and what brought you to Gaia-X?
I have two Master’s degrees in Communication Technology from the Technical Universities in Riga and Ilmenau. In my master’s thesis at the TU Ilmenau, I investigated path calculations for drone-based localisation of mobile devices. After graduating in 2015, I continued my research in Ilmenau as a research assistant for five years. When I joined DE-CIX in May 2020, I was immediately put in charge of the Gaia-X project. At that time, there was no concrete funded project. My employer was initially interested in supporting the Gaia-X initiative from the infrastructure side.
Gaia-X is all about data sovereignty and cloud services. Why is DE-CIX, an operator of Internet exchanges, i.e. a network specialist, involved in Gaia-X?
I think the discussions about data sovereignty and data-based business models often forget the technical underpinnings: the network infrastructure that carries our data. For most people, networks are a black box. But the Internet as we know it is not powerful and reliable enough for many critical and demanding applications. We want to bring this perspective and our experience in this field to the development of Gaia-X.
Why does the Internet need an upgrade?
Broadband rollout in Germany is about bringing fast Internet access to all households and businesses. But the Internet is more than just a data socket at home or in the office. The bandwidth of my DSL or cable connection of 50, 100 or even 1,000 megabits per second only tells me how fast my connection to my local Internet provider is. But it does not guarantee that my experience of a digital service will be truly satisfying. Just as important is the path the data takes between my computer and, say, Netflix’s servers. For most of today’s digital applications, the Internet is good enough. So far, anyway. During the Covid-19 pandemic, however, we saw what happens when significantly more people around the world use data-intensive services. Streaming providers had to temporarily reduce the quality of their video transmissions to keep the Internet from crashing. This is even more true for advanced applications of the coming data age, such as Industry 4.0, robotics, AI, or telemedicine. That’s not what the Internet was built for.
Why, for example, does Industry 4.0 overwhelm the conventional Internet?
Increasingly, these systems must react in real time and handle data volumes of a different order of magnitude than previously. This requires networks to transmit data not only very quickly, but also extremely reliably. The flow of data must not be interrupted, delayed, or fragmented. However, the Internet only transmits data according to the “best effort principle”. This means that it forwards data as quickly as possible, using the best path currently available. It does its best, but there is no technical guarantee of error-free and reliably fast transmission.
Can you give us an example of this?
One of our project partners at Tellus is Mimetik. Their team is developing an intelligent data glove to digitally assist manual processes. The glove uses sensors to create a digital image of the movements of a human hand. This allows the user to control an industrial machine virtually and remotely. Or the glove can assist the user in the field while the connected system monitors the quality of processes or warns of hazards. What makes this application so tricky from a networking perspective is that the glove and the machine may not be in the same location. They can be thousands of kilometres apart. There is usually not enough computing power on site to process the data from the glove. It has to be uploaded to a cloud and transferred from there to the machine. The whole process has to happen in real time, so that the glove can be used intuitively in virtual space.
This places extreme demands on the transport network, particularly in terms of latency, i.e. reaction time. Mimetik’s system can partially compensate for weaknesses in the network by simulating real time: an artificial intelligence in the cloud anticipates the next movement of the hand and forwards the movement command to the machine before the complete sensor data has arrived from the glove. However, neither the performance nor the confidentiality and integrity of best-effort connections are sufficient for such applications.
How will you solve this problem with Tellus?
The solution in this case would be a software-based end-to-end connection at the network level between the data glove, the AI in the cloud and the machine that is controlled by the glove. Instead of sending the data via a random connection path on the Internet, our software calculates a network path based on the technical requirements and sends the data over the optimal connection. We can guarantee all of the important technical parameters. But that is only half the story.
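A requirements-driven path calculation like the one described can be pictured as a constrained shortest-path search: links that cannot meet the requested guarantees are excluded, and the lowest-latency route through the remaining links is chosen. The following is only a minimal sketch of that idea; the node names, link figures, and requirement parameters are invented for illustration and do not reflect the actual Tellus software.

```python
import heapq

# Illustrative network: each link is annotated with latency (ms) and
# guaranteed bandwidth (Mbit/s). All names and numbers are invented.
LINKS = {
    "glove-gateway": [("edge-cloud", 5, 1000), ("public-isp", 2, 100)],
    "public-isp":    [("edge-cloud", 40, 100)],
    "edge-cloud":    [("factory-machine", 8, 1000)],
}

def best_path(src, dst, min_bandwidth):
    """Dijkstra over latency, skipping links below the required bandwidth."""
    queue = [(0, src, [src])]
    seen = set()
    while queue:
        latency, node, path = heapq.heappop(queue)
        if node == dst:
            return latency, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, lat, bw in LINKS.get(node, []):
            if bw >= min_bandwidth and nxt not in seen:
                heapq.heappush(queue, (latency + lat, nxt, path + [nxt]))
    return None  # no path satisfies the bandwidth requirement

# A 500 Mbit/s requirement rules out the low-bandwidth public-ISP detour:
print(best_path("glove-gateway", "factory-machine", 500))
```

The point of the sketch is that the path is chosen from the application’s requirements rather than by the default best-effort routing of the Internet.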
What more do you need than an optimal network connection?
Three things: standards, market transparency, and automation. Setting up end-to-end network connections still requires a lot of manual effort. The switching and ordering process itself takes time. It also takes a lot of time to find the technically and economically optimal connection.
What makes the use of end-to-end connections at the network level so costly?
The problem lies in the number and complexity of the steps involved. First, a company must determine its requirements for a network connection in terms of performance, reliability, and security. Then it has to find a network operator whose infrastructure product matches those requirements and its budget. Getting an overview of the market is also time-consuming. As things stand today, it is almost impossible to automate this process: vendors work with proprietary systems, and there is a lack of standards, especially for processes and interfaces. As a result, network connectivity for distributed, mission-critical cloud applications remains largely fragmented, with no consistent service guarantees.
Why is there a need for action in the data economy on the topic of network infrastructure?
In the data-driven economy, complex multi-cloud applications and real-time exchange between and within data spaces will be commonplace. In such scenarios, companies need countless end-to-end network connections across multiple platforms and network providers, with data sovereignty and data protection compliance built in. Initiatives like Gaia-X are driving demand. But today, such high-performance connections are still very cumbersome to manage: companies cannot rely on uniform processes. Instead, they have to deal with multiple contracts and bureaucracy. With Tellus, we want to radically simplify this: one contract, one standard process, and little additional communication.
Where do you start in the search for better infrastructure solutions?
First, we looked at the technical requirements of sophisticated data-driven applications. What goes wrong in practice? What are the critical parameters? Based on our findings, we developed a concept for an optimal technical solution.
What solution strategy do you pursue for more interoperability and less complexity in the infrastructure sector?
We are creating a software superstructure, the Tellus overlay. With this, we are finally making the systems of network providers and cloud platforms interoperable. This will make it much easier and faster to set up end-to-end network connections. We also want to improve the speed and quality of service selection. To achieve this, providers describe their services in a precise and standardised way in a Gaia-X compatible service catalogue. This means that customers no longer have to rely on marketing promises, but can select services based on objective criteria. In our case, the Tellus software will take care of this and make appropriate suggestions to the user.
How does the Tellus overlay work?
Network operators and cloud providers install the Tellus software on their systems. Each of them becomes a virtual node in our network. At the heart of this hierarchical architecture is the Super Node. The Tellus software on the Super Node calculates the best route for each end-to-end network connection. The system pulls the necessary information from the service catalogue, where all providers store their services along with guaranteed performance and security requirements. The Super Node automatically matches customer requirements with suitable providers. What companies used to do painstakingly by hand will in future be triggered directly by a code command and thus automated, thanks to Tellus. We call this Network as a Code.
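The matching step described here can be sketched as a filter over catalogue entries whose guaranteed parameters satisfy a customer’s request. The field names, sample entries, and thresholds below are illustrative assumptions only, not the real Gaia-X service catalogue schema.

```python
# Invented sample catalogue: each entry lists a provider's guaranteed
# service parameters. Field names are hypothetical, not a Gaia-X schema.
CATALOGUE = [
    {"provider": "cloud-a", "max_latency_ms": 10, "bandwidth_mbps": 1000, "encrypted": True},
    {"provider": "cloud-b", "max_latency_ms": 50, "bandwidth_mbps": 500,  "encrypted": True},
    {"provider": "isp-c",   "max_latency_ms": 5,  "bandwidth_mbps": 200,  "encrypted": False},
]

def matching_services(requirements):
    """Return catalogue entries whose guarantees satisfy the customer request."""
    return [
        entry for entry in CATALOGUE
        if entry["max_latency_ms"] <= requirements["max_latency_ms"]
        and entry["bandwidth_mbps"] >= requirements["bandwidth_mbps"]
        and (entry["encrypted"] or not requirements["encrypted"])
    ]

# Only cloud-a guarantees low latency AND high bandwidth AND encryption:
request = {"max_latency_ms": 20, "bandwidth_mbps": 800, "encrypted": True}
for entry in matching_services(request):
    print(entry["provider"])
```

Because the providers’ descriptions are standardised and machine-readable, this selection can run automatically instead of being compiled by hand from marketing material.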
How does Tellus contribute to more data sovereignty within Gaia-X?
The vision of Gaia-X is a secure and trustworthy data infrastructure for self-managed data ecosystems. The Tellus project is Gaia-X compliant and will use the Gaia-X Federation Services (GXFS). Our contribution is to provide the appropriate network infrastructure for the future European data ecosystems. We automate access to high-performance data connections and integrate disparate network and cloud services into an interoperable architecture. This does not mean that conventional Internet connections will become obsolete. Tellus is the solution for demanding applications where the Internet cannot deliver in terms of latency, bandwidth, security, resilience, dynamics, and monitoring.
What other organisations and companies participate in Tellus and what areas do they focus on?
There are ten partners working on Tellus, with DE-CIX as the consortium leader. Mimetik, IONOS and Trumpf are researching specific use cases. Mimetik is working on the data glove application, IONOS on a digital twin for industrial applications, and Trumpf on a scenario for “Equipment-as-a-Service – Pay per Part”. Here, business customers use a laser cutter for sheet metal processing, for example, without having to buy or lease it. The partners Plusserver and Spacenet contribute their many years of experience as providers of cloud services, as does Wobcom as an Internet service provider. KAEMI strengthens the project with its expertise as an infrastructure specialist for Network & Security as a Service. CISPA focuses on security, and Cloud&Heat is a specialist for energy-efficient data centre infrastructure.
What is your project plan? How far has Tellus progressed? And how are you going to use the GXFS services?
Our project started in November 2021 and covers a period of three years. Our plan has five milestones. The requirements analysis I mentioned earlier has already been completed, and we have also defined the technical architecture, which is the second milestone. We are now halfway through and working on the concept for the network and cloud layer. In this phase, we will also decide how to implement the GXFS and which modules to use. We are currently evaluating this. We aim to complete the concept phase in the summer and start the technical implementation in the autumn. The plan is to present the Tellus prototype to the public at the end of 2024.
Alina, thank you very much for the interview!
Andreas Weiss & Thomas Sprenger
Every month on LinkedIn and www.gxfs.eu
Every month, we will guide you through the world of Gaia-X on LinkedIn and www.gxfs.eu. Our analyses and interviews give background and insights into how a European initiative and its collaborators want to create an ecosystem for value creation from data.
Heading this series of articles is Andreas Weiss. As Head of Digital Business Models at eco as well as Director of EuroCloud Deutschland_eco, Andreas Weiss is well connected and familiar with the Internet and cloud industry in Europe. He brings his experience to Gaia-X Federation Services (GXFS), whose project teams are responsible for the development of Gaia-X core technologies. Led by eco, the GXFS-DE project is also funded by the German Federal Ministry of Economic Affairs and Climate Action and is in close exchange with the Gaia-X Association for Data and Cloud (AISBL). Weiss is supported on this blog by Thomas Sprenger, an author and copywriter who has been writing about the digital transformation for twenty years.