The Shapers

Ängla Pändel on Balancing Regulation and Innovation, and How It Will Impact the Future of Legal Work

‘It's like putting together a puzzle where the pieces are constantly moving.’

We sit down weekly with our users—some of the most compelling voices in the legal profession—to explore what shapes the future of their work and the world of law itself. We call them The Shapers. We spoke with Ängla Pändel, a Senior Associate in the Corporate Compliance and Risk team at Mannheimer Swartling in Stockholm.

Ängla’s path to the legal profession, her focus on future-oriented law, and her unwavering commitment to fundamental rights exemplify the vision, adaptability, and dedication that make her a Shaper.

Ängla’s journey wasn’t linear. Before pursuing her legal career, she spent years in the hospitality and events industry. When she decided to study law, she found the traditional curriculum somewhat static—until she encountered the Swedish Institute of Law and the Internet (Institutet för Juridik och Internet). There, she discovered a realm in which legal expertise meets constant innovation, a space where emerging issues like data protection, privacy, and digital rights hold center stage. Ängla quickly realized that to keep pace with technological change, lawyers must navigate uncharted territory.

Now, having guided numerous clients through the complexities of GDPR and anticipating the arrival of the AI Act, Ängla remains passionate about legal work that is simultaneously forward-looking and grounded in fundamental human rights. In her view, regulations should never become so rigid that they stifle progress or undermine core freedoms, especially freedom of expression and the right to privacy.

Ängla believes that the rapid pace of technological change demands a holistic perspective—one that considers how overlapping regulations interact and how businesses, especially startups, can thrive within these frameworks. She draws on her experience to help clients harmonize their operations with an ever-shifting legal landscape. For her, it’s about more than just compliance. It’s about shaping the future of law so that it remains vital, dynamic, and deeply human. 

What attracted you to the Swedish Institute of Law and the Internet?

It seemed like those who knew the law had all the answers, and there wasn't much room for change. But at the Swedish Institute of Law and the Internet, we were dealing with emerging legal issues—pushing the boundaries and using law as a tool to address new problems that were constantly arising. The laws weren't written yet; we were essentially helping to write them. That was incredibly inspiring to me.

Is that how you got into legal informatics at Stockholm University?

Yes, exactly. Through the institute, I applied for a position as a teaching assistant in legal informatics at Stockholm University. I was fortunate to work with some of the most brilliant minds in Sweden and even globally in the field of future-oriented law, often referred to as IT law. I worked closely with Professor Cecilia Magnusson Sjöberg, who is incredibly knowledgeable in these areas.

What were the big legal questions in IT law at that time?

This was around 2014 or so. GDPR hadn't come into effect yet but was on the horizon. There was a lot of discussion about automated decision-making, especially within government agencies, and a strong focus on data protection and privacy from a personal data perspective. The shift toward data as a valuable asset hadn't fully happened yet. I found myself immersed in digital rights issues, particularly related to freedom of expression and privacy protection.

Being involved with GDPR from the early stages must have been fascinating. What was that experience like?

It was incredibly exciting to learn alongside professionals who had been working in data protection and privacy for a long time.

GDPR was a significant shift—a new way of focusing on data protection that hadn't existed before. Together with our clients, we have navigated a wide range of matters, from the large-scale initial GDPR compliance projects to the precise and targeted advising we provide today.

This includes comprehensive legal and risk assessments, compliance audits, advocacy in judicial proceedings, and representation in regulatory inquiries and court proceedings. It's been a thrilling journey to be part of.

Looking back, were there any misconceptions or surprises about GDPR implementation?

I think initially, we expected a faster pace and thought that sanctions and fines would be handed out left and right. There was also a bit of a misplaced focus in the early GDPR projects. Given what we know now, the emphasis wasn't always where it should have been. Over time, legal development has steered us more toward a risk-based approach. Risk assessments have become much more central to decision-making than the more categorical structuring we did earlier, like strictly focusing on getting the formal documentation in place.

You mentioned a need for a more dynamic approach to data protection. Could you elaborate on that?

Certainly. GDPR can be quite rigid, which makes it challenging to meet all the formal requirements, especially with new technologies. The rigidity doesn't necessarily equate to poor data protection practices; it's just that the regulations sometimes don't fit new technological contexts. For example, I firmly believe it's challenging to combine GDPR with new AI developments in a way that is practical and effective.

Do you think over-regulation in Europe is stifling innovation compared to places like the U.S.?

I completely agree that's a significant risk. I don't perceive any unwillingness among our clients to meet regulatory requirements and conduct sustainable, responsible operations.

However, compliance becomes extremely challenging when requirements conflict or lack alignment. Take AI models, for example—the uncertainty around whether data is considered personal at different stages of development and use can create significant roadblocks to innovation.

Meanwhile, the U.S., which we've often criticized for inadequate data protection, has a different approach to how data can be used in relation to AI. 

There are major risks in directing EU companies to rely exclusively on AI systems and models trained solely on data from non-EU/EEA citizens. Such practices result in outputs that fail to accurately reflect the diversity and realities of EU citizens, hinder linguistic development within the EU, and ultimately have detrimental effects on the advancement of AI technology within the region. However, the landscape is constantly changing, so we will probably see changes there too.

Is there a risk that companies become too risk-averse and shy away from opportunities due to the complexity of regulations?

Absolutely. We often talk about a "chilling effect" when over-regulation leads to hesitation or inaction. There's a risk that companies might relocate operations because the regulatory environment demands enormous resources to navigate, resources that startups, in particular, may not have. The regulations lack sufficient grace periods or exemptions for startups, imposing disproportionate compliance burdens. Conversely, there's also a risk that companies might choose to ignore compliance altogether because the hurdles are just too high.

How similar is the situation with the upcoming AI Act compared to the GDPR rollout?

While they aren't identical, there are similarities. Both involve companies having to grapple with new regulations without much guidance. With the AI Act, we see the same kind of uncertainty and the need for "high-quality guessing" to advise clients effectively. The challenge is compounded when regulations like the AI Act and GDPR don't align well with each other, making compliance even more complex.

How do you approach advising clients on these new, uncertain regulations? Is there a methodology you follow?

We rely heavily on a helicopter perspective—stepping back from the individual regulations to see the bigger picture. It's essential to understand how different regulations interact with each other. For example, how does the definition of something in one regulation affect another? We build extensive diagrams mapping out digital compliance, incorporating various factors like regulatory texts, case law, best practices, political positions, and policy initiatives.

It's like putting together a puzzle where the pieces are constantly moving.

Given your experience, what fundamental values do you think are crucial to maintain as the internet evolves over the next 10 to 15 years?

I strongly believe in holding onto fundamental rights, especially as enshrined in principles of sustainability—which today go far beyond environmental concerns. Many of the regulations we work with aim to protect fundamental human rights, like the right to privacy and freedom of expression. As we navigate this rapid technological shift, maintaining a focus on these core values will keep us on the right path.

Do you think sometimes we lose sight of these fundamental rights amid all the detailed regulations?

Yes, absolutely. Take GDPR, for example. Initially, everyone was focused on data processing agreements and privacy notices, without grasping the main purpose of the regulation. The same goes for cookie banners under the ePrivacy Directive. We've ended up with a user experience where people are just clicking through layers of options without truly enhancing privacy. In focusing on these details, we risk missing the bigger picture.

How can we prevent over-regulation from stifling innovation while still protecting fundamental rights?

We need better synchronization between regulations and a more dynamic, risk-based approach. Overly detailed regulations can quickly become outdated due to the fast pace of technological change. Emphasizing fundamental rights and finding the right balance is key. It's also crucial for regulators to engage in more dialogue with each other to ensure that new laws don't conflict and create impossible situations for businesses.

At the end of the day, do you find that within your firm or among colleagues, there's a lot of debate and differing opinions on how to interpret these new regulations?

Yes and no. Even within our firm, we don't always agree on interpretations at the start. That's part of the process of working in areas of law that are still developing, but it ultimately leads to better results by allowing us to consider and address diverse perspectives and positions. For some questions, there can be as many interpretations as there are lawyers at the start. We then work through it, refining our views until we reach a final position. Much of what we do involves "informed guessing" because we're often advising clients ahead of official guidelines or court decisions. It's challenging but also exciting, as we're playing a key role in shaping the future of legal compliance.

Team Leya
29/1/2025
