In this episode, we dive into the Product Liability Directive and the Cyber Resilience Act with Daniel Thompson, CEO of Crab Nebula. The EU's new legislative framework impacts manufacturers in ways we don't yet fully understand, but it is going to bring substantial changes to how companies use and develop open source. Daniel explains the broader implications for software security and the future of digital products in the European market.

This episode is also available as a podcast, search for “Open Source Security” on your favorite podcast player.

Episode Transcript

Josh Bressers (00:00) Today, Open Source Security is talking to Daniel Thompson, the CEO of Crab Nebula and an expert rapporteur at ETSI. Daniel, welcome to the show.

Daniel (00:09) Hey, thanks. You know, I've been a long time listener and I feel like Open Source Security kind of has a problem, because open source software has the same abbreviation, right? Like, is this software supposed to be secure? I kind of thought it was supposed to be secure from the beginning, right? That's what the quality is all about.

Josh Bressers (00:27) No, no.

I don’t know, no, nothing’s secure. Come on, you should know.

Daniel (00:34) It’s great to be here, yeah.

Josh Bressers (00:35) But so, no,

I’m really excited. So I brought Daniel in, because Daniel’s involved with some of the upcoming European regulations. There’s like the CRA, which we’ve all heard about and I’ve talked about on the show a couple of times. But there’s also a thing called the Product Liability Directive, which Daniel said before we hit record is much more pressing and urgent than the CRA is, which intrigues me because from my perspective being obviously very open source adjacent.

The CRA is the thing that I’ve had my gaze on for so long. So yeah, I’m really excited to have you here, Daniel. So I’ll just kind of let you take it from here and we’ll go where we go.

Daniel (01:14) All right. Well, I mean, we did talk about the PLD briefly and I kind of look at the Cyber Resilience Act and the Product Liability Directive as two different approaches for treating the dangers of software that is insecure. Whereas the Cyber Resilience Act takes the proactive approach like, hey, this is what you have to do to make secure by design software and products with digital elements. They need to have a reporting mechanism,

cybersecurity risk assessment, and you know, got all this technical documentation, you have to make your declaration of conformity and all that stuff. That’s all prescriptive. That’s what the European Commission expects a manufacturer to do in order to achieve the presumption of conformity. And the presumption of conformity is the notion that you are fulfilling the expectations of the regulator by applying certain standards that help you

do things in a way that’s appropriate. So it’s prescriptive. Whereas, on the other hand, the product liability directive is there in case you as a manufacturer make a mistake. And out of the mouth of babes, I’ve heard it said explicitly that if you are in conformance with the CRA, you probably won’t have to worry about the product liability directive. And the product liability directive takes the litigation approach. You’ve messed up.

this is what’s going to happen to you as a manufacturer. I think the important thing to remember, and I talk about this online and I’ll tell you right now, the PLD only has something to do with natural persons engaging in non-business activities. So if you’re doing business stuff, the PLD has nothing to do with you. B2B doesn’t matter. You’re not a party. You’re not a party to whatever’s going on there.

What does it mean to be party to a lawsuit brought on by the Product Liability Directive? Well, we have to remember that the Product Liability Directive and the Cyber Resilience Act didn’t just magically appear. They are part of what we know in Europe as the new legislative framework, the NLF. And the NLF has been designed over and over again to inform consumers and manufacturers how they have to

participate in the single market of the European Union, as it were. It introduces market surveillance, and there's an accreditation system for conformity assessments, and there are clear obligations for economic operators, you know, the manufacturer, the importer, the distributor, the authorized representative, and to some extent, the user. But, I mean, how do you tell a European what they're supposed to do? I mean,

go try that in Paris and see how far you get, right? And then you have also the baseline regulation for the CE marking, which is kind of like the FCC marking. It shows that the manufacturer has gone through all of the hurdles to certify, as it were, that their product is safe for use, right? And I think that…

Josh Bressers (04:10) Right?

Daniel (04:36) I would like to come back to safety in a little while. I’m not quite at the point of safety because we’re still talking about cybersecurity, but the product liability directive, for those of you who haven’t heard of it, has three important pieces. One of them is that it’s a no-fault liability law. That means that if a consumer is damaged, that they don’t have to prove what exactly broke when the software damaged them, just that it was the software.

Right? So that makes the association with the manufacturer a lot easier. Right? So the no-fault liability is definitely like one of those things where it kind of makes sense if you think about it. How do you know if it's a library that mismanaged floats or it's a transistor that was given too much electricity? Right? Like, how do you know? You don't know. You just know that that

product with digital elements was the big problem. The next one is that liability is joint and several. What this means in practice is that if a consumer sues a manufacturer for damage that's been done to their person or their property, they can sue the, I call it the downstream manufacturer, the company at the very end that made the final product that's probably composed of open source parts here and there.

And that manufacturer can then, obviously, sue the component manufacturers that made the mistake; after all of the dust has settled, they'll figure out who it was. And since this is the open source security podcast, you should know as a listener that open source components are not considered party. And what that means is the people, groups, organizations, entities,

Josh Bressers (06:14) Right. Right.

Daniel (06:34) responsible for producing this open source component or ultimate product, well, they can't be sued here.

in theory. So here's where reality is going to kind of catch up with us, because, as opposed to an act, a directive needs to be transposed into the respective national legislation, and only a limited number of European countries will take the product liability directive and transpose it one to one. Luxembourg, for example, is one country that will just

Josh Bressers (06:47) In theory, yes.

Daniel (07:14) take the legislation and say, okay, this is it, this is law, there we go. It saves a lot of time. I mean, why would you bother doing anything else? But, you know, Germany, Malta… And what this means is that some countries might interpret things a little bit differently. They might try to align the legislation with their existing legislation, which to be fair is legitimate. When you're talking about civil law, the

the sovereignty of a nation is not something that you can legally even give away. So while the template legislation of the product liability directive says open source components are not considered, it remains to be seen the extent to which that is actually transposed. What does open source mean, Josh?

Josh Bressers (08:09) I mean, the CRA, I feel like, is doing a fair job of defining that, where it's kind of a very volunteer-focused effort versus like, I'm a company. Because a good example is like the company I work for, Anchore. We have open source tools, Syft and Grype, that we release. Like, we can't throw our hands in the air and be like, sorry CRA, we're open source. We are a business releasing open source software versus, for example, when you have like,

a random person releasing open source out of their basement, right? That is not the sort of, we’ll say, organized, I don’t even know what to call it in the context, but you know what I mean? Like that is the open source they are trying to protect, right? That like volunteer focused. Now, the way you’re looking, I suspect there’s a bunch of holes in everything I just said, which is fine, but it, and I understand the difficulty in this.

Daniel (09:02) I'm a Grype user. I like Grype. It's open source. My company uses Grype for one of our products. And there's kind of like a litmus test to discover: can a company claim that one of its offerings is an open source project? And it's important to negotiate here with everybody that you're talking to the difference between a project and a product. Those two guys in their basement, it's a project.

Josh Bressers (09:04) Awesome.

Daniel (09:31) according to the European Commission, only until they start making more money with it than it costs them to create it and maintain it. It's this weird kind of like, well, you know, it's probably open source, but you turn it into a business, you're making money with it. You've got to pay your taxes. You've got to be involved in the single market. Right. That's how business works. And for your products, you know, Anchore for example,

Can your company exist without this open source project?

Josh Bressers (10:05) No, we definitely can’t. Like Syft and Grype are the foundation of our enterprise offering.

Daniel (10:10) then it is likely you will have to put a CE marking on those two products in order to make sure that you can continue to offer it on the market.

Josh Bressers (10:22) Yeah,

I 100% expect that.

Daniel (10:25) But I'm not sure of that. At what point, given the business interest in an open source library, does the nature of it change?

Josh Bressers (10:39) I mean, yeah, it’s tough. Like, I get it. These are tough questions and we don’t have all the answers to it. Now, the one thing I will say in all of my observations, because I like, you are actually involved with the various regulators in Europe. You know, you are in Europe, you’re in Malta, you are heavily involved in this space. I am not. I sit outside kind of as an observer and I feel like the European Union has done a nice job.

of actually taking some feedback and listening and making some good changes to everything going on versus I feel like quite often when we’re dealing with regulators, it’s like a shut up nerd, we’ve already solved this, go away.

Daniel (11:24) I think a lot of the openness to open source has to do with the early engagement of the NLnet Labs Foundation group, because their direct personal involvement made the regulators sit up and pay attention. And by regulator, what I'm talking about here are those people working for the European Commission that

had the charge of not only writing the Cyber Resilience Act, but also communicating it with these stakeholders and also explaining it to people who never cared about cybersecurity. Right? Where it was just like, well, it works on my iPhone, so it's got to be good, right? I can download it from the app store, so somebody else is worried about this. I don't have to. Right? And I mean,

I feel like there's a seismic shift about to happen. And it's not really like, the Brussels effect, or GDPR is going to create the CCPA and HIPAA, you know. No, this really fundamentally addresses the shortcomings that we've been talking about in the cybersecurity industry since forever.

I mean, since the first hackers got around, you know, we're just like, maybe you want to fix this because I just stole all your stuff.

Josh Bressers (13:01) Yes, yes. And so it's funny you say that, because Adam Shostack, he does a lot of threat modeling things in the security universe. And he has a very nice class for anyone looking to learn more about threat modeling. But he put a thing on Mastodon over the weekend that was talking about bridges. Like every bridge has a risk management effort, right? Because that's just the nature of public infrastructure like that. And I've been thinking about that, and it's like,

I understand what Adam is trying to say, but at the same time, like if you look at the world of, let’s say, you know, public infrastructure, when there is a mistake, when there is a problem, when an accident happens, there’s an enormous amount of investigation into that event. And then we make changes to how we build bridges or how we fix, whatever, right? In security, I feel like it’s, well, it happened again, everybody, you know, whatever. And like, I feel like we never learn a lesson.

Daniel (13:59) Nothing we can do about it.

Yeah, you know, geez, these big companies, they’re just so big, they’re just gonna keep on doing what they’re doing, you know?

Josh Bressers (14:04) Yeah, right. Poor

little Microsoft. How could they possibly help out?

Daniel (14:10) Well,

I mean, to be fair, I am kind of moonlighting. My company was awarded a tender by ETSI. That's ETSI, the European Telecommunications Standards Institute, not the Etsy one. And what we're doing is we're serving as the rapporteurs. That means kind of like the subject matter expert secretaries, if you will, for three

of the important standards listed in Annex 3 of the Cyber Resilience Act. So we’re working on the cyber standardization of the browser, of the password manager, and of the boot manager. And it’s not just Microsoft that’s in these meetings. It’s Apple, it’s Google, it’s Mozilla. And there’s a vested interest from industry to make sure that we get it right.

Josh Bressers (15:00) Yeah. Yeah.

Daniel (15:10) And yet I have a personal mission, and anyone who's listening to this who feels that they're addressed by any of these three verticals, or any of the important verticals really, can reach out to us. We have a mandate from ETSI to invite the open source community as experts to this standardization process and to be directly involved. It's a couple hours of meetings every couple of weeks.

We have a very tight deadline. We're aiming to get this all published in final form toward the end of the year. I mean, it's complicated. Like, writing a standard is not just, you know, let's go ChatGPT and figure out, you know, how browsers work. There's a lot of back and forth and negotiations and discussions, and it starts with scoping. Like, you would not believe how long the smartest people in the room

discussed what is browsing. It's very intriguing and I recommend it. It's very rewarding work. And yet, I have questions. We are accompanied by some very smart people from the commission. Philippe, if you're listening, hey, thank you. But in such a…

landmark piece of legislation, there's so much work that things are unclear. Like, I'll give you an example. Here's some unclarity for you. Originally, we thought that anything that was communicating across the network was in scope. It still is. Don't get me wrong. You know, if it's communicating across the network, it's in scope. But recently, with all of the work that's being done on the horizontal standards, as well as the vertical standards, it's kind of come to our attention that

Maybe it’s everything running on the operating system.

Josh Bressers (17:09) Ha ha ha.

Daniel (17:11) And then in the browser meeting, we discussed the classical flow of an exploit across multiple layers of the operating system. And a lot of them start not with the browser, but with the website. It starts with somebody not sanitizing their inputs, some kind of XSS. There’s a lot of ways you can do this. And then…

suddenly you’re in the browser and then there’s another chain to exploit in the browser. And suddenly you have root on the operating system of the user or please, maybe the servers, you never know where you’re going to traverse horizontally or even diagonally. It’s like, you know, so the questions that we’re trying to get addressed are also responding to the questions we’re getting from the open source community and from

the broad field of stakeholders, I would say. So in contrast to a lot of other standardization processes that I've kind of seen post-factum, I feel like we're at least trying our best to incorporate and include members of the open source community wherever we can. And so, I mean, I'm not here to do outreach for that, but I will tell everybody that, you know, it's important work. It's going to frame the internet for the next decades.
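The exploit chain Daniel describes usually opens with exactly the bug he names: a site echoing untrusted input without escaping it. A minimal sketch of that class of bug, using Python's standard-library `html.escape` (Python is used here only for brevity; the function names are hypothetical):

```python
import html

def render_comment_unsafe(comment: str) -> str:
    # Vulnerable: untrusted input is interpolated straight into markup,
    # so a "<script>...</script>" payload in a comment would execute
    # in the visitor's browser (reflected/stored XSS).
    return f"<div class='comment'>{comment}</div>"

def render_comment_safe(comment: str) -> str:
    # Escaping turns markup metacharacters into inert HTML entities,
    # so the payload is rendered as plain text instead of executed.
    return f"<div class='comment'>{html.escape(comment)}</div>"

payload = "<script>alert('xss')</script>"
print(render_comment_unsafe(payload))  # the script tag survives intact
print(render_comment_safe(payload))    # &lt;script&gt;… is just text
```

Real applications lean on templating engines that escape by default rather than hand-rolled helpers like these, but the principle is the same first link in the chain Daniel sketches.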

Josh Bressers (18:34) The annoying

Yeah, for sure. And you can do outreach. I’m cool with that. Like, yeah, anyone interested in this, I’ll have a link to some way to contact Daniel in the show. So use that. But no, I mean, this is why I wanted you to come on the show, though, right? mean, this is why no one knows this except Daniel and I. It has been like months in the making to get him on this because it seemed like one of us kept canceling at the last minute. so like, but now whatever, we both had issues.

Daniel (19:01) No, I was the one who was canceling at the last minute.

Josh Bressers (19:06) But like, this is

the thing, right? This is why I wanted you here, is because you're involved and you have this knowledge that I feel like is so important, and it is going to affect open source and closed source and the internet and everything we do for, yes, decades, decades and decades to come, a hundred percent. Because the other thing is, you mentioned a while back that every sovereign nation, you know, creates their own laws for this stuff. And that's the, that's the thing, right? And that's what makes it so

I guess scary in a way, is this isn't like the EU drafts a thing and then we're like, all right, that's the law. We're done, everybody. The EU is complicated. Right. And so if there are mistakes now, it will be 10 years before those mistakes end up fixed in all the jurisdictions, I suspect.

Daniel (19:52) I mean, do you remember DORA,

Digital Operational Resilience Act? It just entered into application in January. It was immediately challenged, immediately challenged, and two weeks later, several parts of it were reversed. So there are a lot of controls here. I mean, on the one hand, yes, you have the litigation process that can happen, but

Josh Bressers (20:00) Yeah, yeah.

that’s impressive then.

Daniel (20:20) I think what’s more likely to happen here is the regulators, and by regulator, what I really mean are the enforcement, the surveillance authorities, the people who are trying to make sure that this is all tracked and being done appropriately. I think what we’re going to see is a lot of warnings in the beginning, and then we’re going to see fines. And that’s going to take time. It’s just going to take time.

Josh Bressers (20:43) Yeah, yeah.

Well, and that’s an important thing as well, I think is I remember when GDPR happened and everyone was saying, my goodness, this is going to end the world. What businesses are going to go bankrupt from GDPR fines. And the reality is like, there haven’t been that many fines. Like you have to be pretty egregious to get a big GDPR.

Daniel (21:05) At the same time, if the scope does expand to include everything running within a, I’m not saying it will, I have a lot of questions still, we’re trying to figure this out, but if it does expand to everything running in the context of an operating system, there’s a lot of room for scope and maybe companies like yours, Josh, should really consider donating the open source code to a foundation so that it can be stewarded.

I think that’s the important part that got carved out for us as a ⁓ broad open source community. And I know there’s voices out there, I don’t agree with them, but there are voices out there that said, ⁓ the open source steward program was just built for Linux Foundation and Eclipse and Apache. And yes, while that does reflect a bit of truth, it doesn’t stop anybody else from militantly protecting the source code away from interests that can change the license.

example. doing that will then do a couple things. Stewarded open source cannot carry a CE market. It can’t because it’s not a product. It’s not on the market by definition. ⁓ Does that mean that you don’t have any obligations? No, you still are supposed to report to the regulators that you’re being attacked if you’re being attacked.

You’re supposed to have a contact address. And honestly, if you’re not shipping a entirely granular SBOM in 2025, are you really here? You know, like, come on. I’m not talking about your package lock. That may be enough, but we can, we can, and we should be doing better. And I, I feel that that’s the larger impulse here. Like,

Josh Bressers (23:01) I agree, but the product liability directive doesn't necessitate an SBOM, right?

Daniel (23:01) You

No, but the product liability directive is expected to be in application sooner. Maybe Germany will be first. The deadline is two years from December 11th, 2024. So December 11th, 2026, it’s going to be available in a number of European countries, which is why I see it as the larger threat. And let me explain the threat model.

Josh Bressers (23:16) Right, right.

Daniel (23:37) Let's say you have an open source project and it's being used by a big player, and that big player gets sued for, how about psychological damage, which is actually described. It's a novel type of damage that you can sue a manufacturer for. So that manufacturer gets sued for psychological damage, and their legal team goes through the entire SBOM and associates the risk with the different component

distributors isn’t the right word creators some of them are open source probably the majority of them are open source but I mean they’re already like out millions billions who knows might as well send a letter to every single open source manufacturer in your SBOM and be like hey can you prove to me that you are really open source

Josh Bressers (24:32) man. Right. That’s dicey, isn’t it?

Daniel (24:32) and not a manufacturer.

I don’t know. mean, it’s a play in the playbook from legal firms who got nothing else to lose except, you know, and then, you know, let’s imagine you’re these two guys in the basement, you’re just doing your little project about converting floats into integers or whatever. And you get a letter like that over or an issue filed on GitHub. I mean…

It’s chilling and it’s scary. And I have been talking about this with friends and I mean, maybe there’s a group of us who might be putting together a protection fund for small projects that get swept up into something stupid like that. We’re not there yet. Nothing is public and I can’t name it obviously. But we can’t let that happen. We can’t let it happen unaddressed. Unaddressed. mean, we can’t stop it.

Josh Bressers (25:31) for sure, for sure. Yeah, yeah.

Daniel (25:35) But I think that’s a risk that we’ll see and at least one legal firm is going to try it.

Josh Bressers (25:43) yes, guaranteed, for sure, for sure. man, that’s terrifying. I didn’t even realize that was a thing. Wow.

Daniel (25:52) Yeah,

but that’s how the joint and several liability is going to work, right? Like the final manufacturer gets sued, the one downstream at the bottom, that is the manufacturer. And as the manufacturer, you are responsible for every line of code in your product, including that of third parties to the extent that the law provides. you know, it’s not just psychological damage that’s novel.

it’s also data loss on a consumer device. This doesn’t fit into the realm of what cyber insurance is giving you these days. And then here’s the kind of weird part. Let’s say you’re a company and you’re servicing B2B and you have a product that you’re selling only to other ⁓ manufacturers that are companies, but then that manufacturer uses your product and their product and sells it to consumers.

Are you now liable under the PLD?

you would be, right? Yeah, you would. I mean, unless you're verifiably open source. And if you are protected by the amazing people at the Eclipse Foundation or the Linux Foundation or the Apache Foundation or NLnet's sister association, the Commons Conservancy, who have lawyers, maybe they're not prepared for this, but that lawyer's going to get a letter and be like, no, sorry, we're recognized by the European Commission as

Josh Bressers (26:57) you would, okay.

Daniel (27:25) being this type. So yeah, we are absolutely not manufacturers. We're stewards. So I think that we're going to see a couple things happening. We're going to see more associations forming where people collectively protect their code from such moronic, litigious idiocy. I'm sorry, I'm trying to use big Scrabble words here, you know.

But I think at the same time, what I’m personally afraid of and what I’ve already seen happening here and there are people shutting down their repositories, closing their open source organizations because they’re not sure if it’s going to impact them. And yanking isn’t going to be a political move. It’s going to be survival.

Josh Bressers (28:10) Yeah, that’s what’s gonna happen, yeah.

Daniel (28:24) I'm trying to convince people that it's going to be okay, but at the end of the day, this is not legal advice. If you're listening to me right now and you're scared, I can help you find a lawyer if you want one, but I am not a lawyer.

I might be your authorized representative. You know, like in Europe, as part of the CRA mandates, it's required that non-European manufacturers have a European authorized representative, who doesn't do much else except keep the records for 10 years and communicate as liaison between the manufacturer and the regulator in case something happens.

Josh Bressers (29:07) which

you know, you know what I'm going to do, Daniel, I'm just going to bring you back for the CRA, because I feel like I don't want to dive into that at this point, because we're nearly out of time here. But I also have one data point for you and for anyone listening. So you mentioned this whole, like, you know, foundation as a steward versus, you know, manufacturer versus project. Like, the thing to consider is, so yes, there is the Eclipse Foundation, there's the Linux Foundation, there's the Apache Foundation. There's all these foundations, and collectively

they probably have a couple of thousand open source projects that fall under their umbrella. In the world, there are, I think NPM alone has like 10 million packages. There are, I use the ecosyste.ms data, which is a project by Andrew Nesbitt. He was actually a guest not too long ago on the show. There's like 30 million open source projects that they're tracking, right? And there's almost certainly more than that. Like 30 million versus a couple of thousand. Like this is,

Daniel (29:54) Beautiful project.

Josh Bressers (30:07) a mind-boggling difference that we’re going to have to rectify at some point.

Daniel (30:16) I can't count to 30 million. I never tried. I think the most node modules I ever had in a project was a library project a while back. And that had something like 4,000 node modules in it. But I mean, this was back in the days of like Babel JS and stuff, right? You know, where you had to transpile it yourself.

Josh Bressers (30:20) And node still works like that, where when you install a node module, you get like hundreds of things beneath it. Like that's just normal there.

Daniel (30:45) Yeah, it’s

interesting. Like, if there was one takeaway that I had for all of your listeners right now, the investment of time versus value that you're going to get the most out of is to create an SBOM that includes all your transitive dependencies. Everything that is below top level. Because I'm kind of sure, or hopeful, let's call it hopeful.

I’m hopeful that the TC1 that’s working on the technical committee one that’s working on the broad horizontal standard agree with my concern that a top level SBOM is utterly useless. Like it doesn’t do anything for anybody except fill the check mark in your ISO 27001 that you got an SBOM. ⁓ It’s not going to help you. It’s actually going to cost you more time.

Josh Bressers (31:30) yeah, for sure.

Yeah. Yeah.

Daniel (31:43) to figure out which version of what library was in that product at that point in time four years ago.

Josh Bressers (31:50) Right, right. Yeah, cause no one keeps those artifacts around. Yup.
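Daniel's point about granular SBOMs can be sketched concretely: given a dependency graph like the one in a CycloneDX SBOM, a full walk surfaces every transitive component, where a "top-level" listing would stop at direct dependencies. (The document shape below is a simplified assumption modeled on CycloneDX's `dependencies` section with its `ref`/`dependsOn` fields; real SBOMs from tools like Syft carry far more detail per component.)

```python
from collections import deque

def all_components(sbom: dict, root: str) -> set:
    """Breadth-first walk of the SBOM dependency graph from the root,
    returning every transitive component ref, not just direct ones."""
    graph = {d["ref"]: d.get("dependsOn", []) for d in sbom.get("dependencies", [])}
    seen, queue = set(), deque([root])
    while queue:
        ref = queue.popleft()
        if ref in seen:
            continue  # guard against cycles and shared dependencies
        seen.add(ref)
        queue.extend(graph.get(ref, []))
    seen.discard(root)  # report dependencies only, not the product itself
    return seen

# Toy SBOM: the product depends on lib-a, which in turn depends on lib-b.
sbom = {
    "dependencies": [
        {"ref": "product", "dependsOn": ["lib-a"]},
        {"ref": "lib-a", "dependsOn": ["lib-b"]},
        {"ref": "lib-b"},
    ]
}
# A top-level SBOM would list only lib-a; the graph walk also surfaces lib-b.
print(sorted(all_components(sbom, "product")))  # ['lib-a', 'lib-b']
```

That second-level component is exactly the one a legal team, or a security team, would need to find four years later, which is why a top-level-only SBOM does so little.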

Daniel (31:54) Nah. So do your SBOM

homework. That’s coming from me and from Ola and from Anthony and from Josh and from everybody in the security community. Save yourselves the time and effort because no matter what happens with the SBOM, with Europe, ⁓ with your product, with somebody’s open source project, hackers are going to come and they’re going to win and you’re going to discover new vulnerabilities and like…

That’s as given as the weather. You’re not going to say, I solved rain, so you’re not going to buy umbrellas anymore. That doesn’t work on this planet, at least at all in the past decade.

Josh Bressers (32:27) Yeah, yeah.

Right, right. Yeah, I agree. So let's call it there, and I'll just have you come back for some more CRA stuff. My goodness, Daniel, we have been talking for more than half an hour now and I feel like it's been five minutes, and I feel like you've taught me more in 30 minutes than I've learned in the last two years of trying to figure this stuff out. So I love it.

Daniel (32:58) Well, thank you, and yeah, I try to do my best. I care about us. I care about our community. I care about open source. I care about good software.

Josh Bressers (33:08) Yeah, that’s awesome. And I mean, thank you for the work you’re doing. Like this is a big deal and I’m glad we have someone who knows what they’re doing helping out, which is huge.

All right, Daniel, thank you so much.

Daniel (33:23) You’re welcome