Voice Biometrics and Privacy Regulations with Douwe Korff
Learn from one of the world’s leading experts about the key privacy regulations that impact Voice Biometrics implementation in consumer-facing use cases. Covering the key regulations from North America and Europe (BIPA, GDPR, CCPA, UK DPA etc.), Douwe Korff explained how meeting these regulations shouldn’t be seen as a barrier, and how compliance can in fact help improve user acceptance and adoption.
Douwe’s presentation was followed by an open question-and-answer session hosted by Matt Smallman, where there were many interesting and relevant questions.
Matt is the author of “Unlock Your Call Centre: A proven way to upgrade security, efficiency and caller experience”, a book based on his more than a decade’s experience transforming the security processes of the world’s most customer-centric organisations.
Matt’s mission is to remove “Security Farce” from the call centre and all our lives. All organisations need to secure their call centre interactions, but very few do this effectively today. The processes and methods they use should deliver real security appropriate to the risk, with as little impact on the caller and agent experience as possible. Matt is an independent consultant engaged by end-users of the latest authentication and fraud prevention technologies. As a direct result of his guidance, his clients are some of the most innovative users of modern security technology and have the highest levels of customer adoption. He is currently leading the business design and implementation of modern security for multiple clients in the US and UK.
[00:00:00] Matt: Okay, I’m gonna make a start then, because you don’t want to hear much from me. You want to mostly hear from Douwe and this interesting subject we’ve got lined up for you this afternoon, or this morning, depending on where you’re joining from.
[00:00:09] Just a quick introduction from me and a bit of housekeeping, and then we’ll pass straight over to Douwe. So, my name is Matt Smallman. I’m the author of Unlock Your Call Center, and my work is helping organizations improve the usability, efficiency and security of their call center Identification, Authentication and Fraud Prevention processes.
[00:00:24] But above all that, my real mission is to help organizations get rid of frustrating, time-consuming and pointless security processes wherever I find them. And for that reason we’ve established what I call the modern customer security community. Votes and nominations are open for a catchier title.
[00:00:39] But whilst I think I do a great job for my clients, I’m just one person, and we’ve heard from many people that they want to take advantage of the experience of others to help them along this journey. And they don’t think existing forums allowed them to do this in the manner they thought was most appropriate, particularly given the sensitive nature of the topics being discussed.
[00:00:56] I use the term modern to contrast between what I see as the traditional knowledge-based Authentication processes, like mother’s maiden name or date of birth, and even transitional approaches like pins and SMS and passwords that address some of the security challenges of the call center, but ultimately don’t improve the usability and efficiency significantly.
[00:01:16] Really what we’re trying to do here is we’re talking about modern customer security technologies such as Voice Biometrics, Network Authentication, app-based calling, behavioral analytics, and how organizations might decide what the most appropriate is, implement those and do so effectively for their organization.
[00:01:33] And we’re trying to do that in a community-spirited way, so that we share best practice amongst ourselves. That creates engaging content that informs people about this field and about the opportunities that exist in it. That in turn enables them to deliver change for their organizations that ultimately has impact and improves customer conversations, whatever type of business or organization they might be. And again, hopefully they’ll come back and share their experience with us, which then continues to spin this flywheel, which you’ll see on the slide to the side.
[00:02:02] A few housekeeping rules before we start. First off, everyone’s video is off and their audio’s muted. We have a chat and a Q&A function for this afternoon, so if you have any questions or comments you want to raise, please do so using either of those functions. Some people prefer one, some prefer the other.
[00:02:18] I’ll be the moderator of those, and when I think it’s appropriate to drop those into Douwe’s presentation, I’ll do so. But otherwise we’ll have a more open discussion at the end. I also want to draw your attention to what we call the Chatham House Rule. Not that we call it that; there is a rule called the Chatham House Rule.
[00:02:33] And this is that we want you to be free to use the information you get from the session today, particularly Douwe’s prepared remarks, and you’re more than welcome to attribute those to him. But neither the identity nor the affiliation of anyone else who might end up speaking, other than me obviously, should be revealed, nor the nature of those conversations. You are, of course, free to use that information and what comes from it for your own advantage and for your organization’s benefit.
[00:02:54] This call is being recorded. There will be an edited version of it made available, removing many of those kinds of interesting questions that you get in these sessions and mostly consisting of Douwe’s prepared remarks. So you have to join live to get the benefit of the whole conversation.
[00:03:08] So just an incentive to, to keep doing that.
[00:03:10] So without further delay, I’m delighted to introduce Douwe. So, data privacy and privacy regulation concerns are often seen as barriers to the adoption of some of these modern Authentication technologies, particularly Voice Biometrics, or even as an unfair burden that has to be complied with, worked around or fixed.
[00:03:27] And I don’t for one moment believe that’s true, and thankfully Douwe doesn’t either. Because the truth is that high-quality, human-centered design delivers both engaged and active users of these services and, by its very design, compliance with these regulations.
[00:03:45] In many cases, I view these regulations as a health check, a sanity check to make sure we’ve done the right thing, rather than a set of constraints under which we need to design the business processes that make the most of this technology. Now, Douwe has prepared some slides to provide context for our discussion.
[00:03:59] And you will get a copy of those afterwards, so you don’t need to necessarily take notes. And they’re gonna appear where I am right now at the bottom half of the screen. Quick intro for Douwe, and I’m sure he’ll do a bit more in a second. So, I came across Douwe through some of my work with clients dealing with a particularly prickly problem which a couple of people on the call might be familiar with.
[00:04:17] And Douwe was able to provide the most reasoned discussion of the issue and its basis in multiple different jurisdictions and laws that I’d ever seen. Almost overnight, just out of thin air, out of his mind and his experience in this field. And then when I look back over his CV, I see that he’s actually been providing expert advice in the field of privacy, digital security and digital rights since before
[00:04:39] I left school, which is quite frightening. In fact, you might even consider some of his early work to have laid the foundations and groundwork for some of the privacy regimes that we currently have in place. And I see most recently he’s also been training information commissioners in some jurisdictions as well.
[00:04:54] So I don’t think I could possibly find anyone more qualified to speak on this subject, although Douwe is gonna give you a bit more of a flavor of him and his work before he kicks off. So, with that, Douwe, over to you.
[00:05:06] Douwe: Thanks very much. If you put the first of my slides on, the cover one: that’s not just to blow my own trumpet, but it gives you a few of my affiliations.
[00:05:14] I’ve got a few more, and it gives you some links to two fairly recent publications. One is a handbook on the EU General Data Protection Regulation, which I’m sure all of you have heard of. It’s free of charge, and we created it, that is, Marie Georges, one of the oldest and most active data protection experts in Europe, and I, to train data protection officers in five European countries.
[00:05:40] By the way, Marie was one of the authors of the 1995 Data Protection Directive, together with Wolfgang, so I know them very well. And Marie and I did a sort of extract of the first chapter of that book, which is about the origins and meanings of data protection, especially because people outside Europe
[00:06:01] sometimes don’t realize what exactly data protection is and where it comes from. So to the extent that you’re interested in that, click on those links and you can download those documents. The first is a rather long book which, by the way, I am now revising for Oxford University Press in a more expanded version.
[00:06:16] And the other one is a good one if you want to just know what these funny Europeans are all about. But let me get on with the topic of today. Next slide, please. This is the core of any development in the digital world: what you need is trust. Without trust, the digital environment won’t work for you, and it won’t work for your clients or for your customers.
[00:06:40] It requires three things more than anything: openness, honesty, and respect. Respect for the individuals whose personal information you are processing and collecting. And respect means you’ve got to give them choice; you’ve got to give them what the Germans call informational self-determination, best expounded in the German census judgment of 1983, for those who are particularly interested.
[00:07:09] It’s quoted in the data protection manual that I mentioned at the front. Next slide. Before I go into how it’s translated, let me go on to this one: how are these concepts of transparency, honesty and respect translated into data protection and privacy laws? As you will be aware, privacy or data protection laws are now in place in the vast majority of countries.
[00:07:38] Graham Greenleaf, a marvelous colleague of mine from Australia, does a very good map and summary of all the privacy laws in the world. I can recommend it. I haven’t put the link in there, but if you type in “Graham Greenleaf privacy laws in the world”, you’ll get all the links for it. The most important ones are the EU General Data Protection Regulation, which was adopted in 2016 and came into force in May 2018, though some elements are only just now filtering through.
[00:08:07] And in the USA, the California Consumer Privacy Act and its follow-up, the CPRA, which came into force on 1 January 2023. Also important in the present context is the special Illinois Biometric Information Privacy Act, BIPA. By the way, there are a number of other state laws in the United States, four or five, or even more now.
[00:08:34] And there are plans, though plans have been going on for a while, for a federal privacy law. The latest one, as you can see, is now also being put forward on a bipartisan basis, which may get further than the other ones. One of the problems, my apologies, is that it might preempt the state laws, given that the state laws are very strict at the moment.
[00:08:57] A federal law might undercut them, and that would be one of the big battles that might happen at the federal level. There are also some plans in the United States for a federal biometric law, but they have not advanced very far. Let’s go to the next slide, but I have a comment to make before that.
[00:09:12] Before I get to these common features: today’s topic is particularly about biometric data, and the European regulators in particular, but also, as you can see from BIPA, some of the American legislators, have been particularly concerned about biometric data, which is pervasive everywhere. I wrote a note on it which I will gladly send to you
if you ask for one. [00:09:30] One of Matt’s clients is also a company I advise, ValidSoft. The regulators are particularly concerned about certain features of biometrics. First of all, that it’s easy to capture biometric information on people without them being aware of it. You can capture my voice right now just by listening to me speaking and recording it.
[00:09:51] The second one is that if raw biometric data are used for a certain purpose, like Authentication, and you lose your raw data, then somebody else can impersonate you. I think it was the Chaos Computer Club in Germany that obtained the fingerprints of the Minister of the Interior from a glass that he’d been holding at a conference, and they said: we’ve now got your fingerprint, and no device that uses your fingerprint is secure anymore.
[00:10:20] It was not completely true, because devices also use liveness checks, but you can see the risk that the European regulators see in this context. The third concern they have is that biometric databases created in different contexts can easily be matched up. And that can create central databases that can be used for a multitude of purposes, which is directly contrary to the very concept of data protection, certainly in Europe, but also increasingly elsewhere, including the United States.
[00:10:54] And the fourth is function creep. Using my voice just to authenticate me is one thing, but if you start trying to read my emotions into what I say, then that is creep: a change in the purpose, a change in what’s happening, that I as the data subject, as the individual, will not normally be aware of.
[00:11:17] So, how is data protection implemented in law? There are interesting parallels there: similarities, in spite of all kinds of differences, between the approaches basically globally. And I’m giving you some examples here from, again, Europe and the United States. There is a duty to inform people.
[00:11:36] You can’t respect people if you don’t tell them what you’re going to do with their data. In the EU, Article 13 of the GDPR says you have to give people a long list of very specific information about what you’re going to do with their data when you collect it from them. I urge you to read that list, and then you might be worried, but later on I’m going to explain how you can do this compliantly and still practically, in a way that doesn’t hamper your operations.
[00:12:05] In the United States, the emphasis tends to be on privacy notices. Companies are required, under the CCPA, under BIPA, under all the American state laws, to put up privacy notices where people can easily find out what will happen to the data they provide to that company.
[00:12:27] Now, and I’ll come to it in the next slide, you have to abide by your privacy notices; otherwise, the FTC will take action. The second thing is choice. Unless there is a particularly special reason for wanting to use people’s biometric data, for instance because it is required by law for passports or at borders, it will be rare that it will be mandatory.
[00:12:54] The European data protection authorities give the example of using biometrics to gain access to a laboratory where dangerous viruses are being handled; it’s that kind of extreme situation. In almost all other circumstances, you will need the consent of the persons concerned, and crucially, you will need some proof that you have obtained that consent.
[00:13:18] The consent must be free, informed and expressed in some way. You can’t imply consent; that’s not valid consent under the CCPA or CPRA, under the GDPR, or under BIPA. Under BIPA specifically, you have to obtain what’s called a written release. There has to be proof that consent has been given.
[00:13:40] Let’s go to the next slide.
[00:13:45] Ah, I seem to have misnumbered things. There you are, never mind. The next common feature is honesty and respect. If you have put up a privacy notice, if you have informed people of what you’re going to do with their data, then you must stick to that and you must act in accordance with what you’ve said.
[00:14:00] You must of course also not use dark patterns. That is becoming an increasingly important issue, not just in the context of data protection legislation, but also in the context of the AI Act in Europe. The Federal Trade Commission in America, for instance, has taken an interest in
[00:14:17] tackling dark patterns. Dark patterns are manipulative ways of making people “consent”, in inverted commas, to things they wouldn’t consent to if they realized what was going on. And finally, the last of the introductory things I should mention is an increasing convergence between data protection law, privacy law, consumer protection law, and competition law.
[00:14:41] You may have noticed that in Europe, the competition authorities are taking an increasing interest in the massive platforms, like Google and Meta, that use so much of the data. So these things converge. They all tend to come down to the principles I mentioned earlier: honesty, openness, respect.
[00:15:02] Let’s go more specifically to the topics of the day. Next slide, please. I hope I’m not taking too much time. So how do you inform people? I have to put my glasses on, because I can’t read this very well. There you are. Especially if you’re using a call center, you are trying to obtain people’s consent in the context of a call.
[00:15:25] But it’s very difficult, in fact nearly impossible, to provide the information in Article 13, or in the privacy notice in the United States, in the context of a call, and especially also to obtain that consent and to obtain a record of the consent. The simple thing to do would be to send people either a leaflet in hard copy or a link by email to a webpage.
[00:15:50] And on the website, you provide all the information that the law requires and that is necessary to be honest and open with your client. The European data protection authorities have stressed over many years that the best way to do this is to provide layered information. You provide top-level information that should at least include who you are, what data you’re looking for, and what you’re going to use it for.
[00:16:18] And then beyond that, you go to the next page in the leaflet, or the next page on the website, and that explains in more detail, for people who want more detail, exactly what you are doing. Are you creating a voice print? How long do you retain the data? By the way, retention periods are a crucial issue that people must be informed of, especially under US law, but also under European law.
[00:16:39] Again, that leaflet, or the link, will say: next time you call us, we will ask you if you’ve read this information and if you want to be enrolled in the biometric scheme. Next slide, please.
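The layered approach described above can be sketched as a simple data structure: a short top layer with the minimum the regulators expect (who you are, what data, what purpose), and a deeper layer for detail such as retention. This is an illustrative sketch only; the field names, wording and retention figure are my own assumptions, not any regulator’s or vendor’s template.

```python
# Illustrative sketch of a layered privacy notice, as described above.
# All field names, wording and values are hypothetical, not legal text.
LAYERED_NOTICE = {
    "layer1_summary": {
        "controller": "Example Bank plc",                 # who you are
        "data_collected": "a voice print derived from your speech",
        "purpose": "authenticating you when you call us",
    },
    "layer2_detail": {
        "voice_print_created": True,
        "retention_period_days": 730,                     # retention must be stated
        "more_info_url": "https://example.com/privacy/voice",
    },
}

def render_summary(notice: dict) -> str:
    """Produce the short, top-layer text a caller or web visitor sees first."""
    s = notice["layer1_summary"]
    return (f"{s['controller']} will collect {s['data_collected']} "
            f"for {s['purpose']}. See our website for full details.")
```

The top layer is what goes in the leaflet or at the head of the webpage; the detail layer sits behind a “read more” link, mirroring the layered-information advice.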
[00:17:03] Yes, so the next slide gives the further information. The example I’m giving there, of ValidSoft, is a useful one because it builds on the normal information that you have. By the way, I’ve given a link to the European Data Protection Board’s guidelines in the previous slide.
[00:17:23] Slide six, maybe you can just go back to that, Matt, for a moment. That one, the link in blue. I strongly recommend that you read it. And by the way, also the Article 29 Working Party’s WP193 guidelines on biometric data. Again, I can provide you the link if you like. Look at those, because they give you the advice on how to inform people and how to do it in a layered form.
[00:17:50] Upfront information, more detailed information, yet more detailed information. What ValidSoft does is give very useful extra information. I’ll read it out to you: the voice print we use to authenticate you is irreversible; it cannot be used to recreate your voice. That’s a major security assurance, and it addresses that point that the European data protection authorities make.
[00:18:14] And it is doubly unique, in that it cannot be linked to anybody other than you, which is the purpose of Authentication, but also it cannot be linked to any other voice print in any other Authentication scheme, not even another scheme using ValidSoft software. So that guarantees, and is an assurance in data protection terms, specifically addressing the European data protection authorities’ concern about the matching of data.
[00:18:40] It was one of the main reasons why ValidSoft got a privacy seal for its Voice Biometrics solution.
[00:18:48] Let’s go to the next slide. It’s on consent. There are only a few to go, don’t worry. As you can see, I’m just taking this from the European Data Protection Board’s guidelines on consent; the link is given there. Consent can be obtained by ticking a box, an optional box, on a website, but of course you must keep a record of that.
[00:19:10] Consent can also be obtained through a reliable, secure record of a recorded oral statement. When and how you can do that is of course important. And, as the last quote there says, you can also do it in a conversation, in a call, but it is difficult. I’ve addressed that difficulty in the previous slide.
[00:19:33] In advance, send information on what you want to do and how you’re going to do it. Then, in the call, you can follow up to make sure that the person has understood it, and that therefore the consent is informed and free. That’s explained in the next slide.
[00:19:51] It shows two scenarios. In the first, you have had somebody click on the website: yes, I want to use this bank’s Voice Biometric Authentication; next time I call, I would like you to enroll me. Of course, the call handler should be informed of the fact that this person has ticked the box. So you’ve already got the record of the consent, but the sensible thing is to say: I can see that you’ve read our page and that you said you agree to be enrolled.
[00:20:19] Is that okay? Shall we do that now? And then of course you either record the whole conversation, or you keep another record, a log, of the fact that the person again confirms: yes, I want to be enrolled. You need that record both in terms of the written release under BIPA and in terms of the accountability principle under the GDPR.
[00:20:42] In the other scenario, you haven’t got a previous record. Somebody has been sent a leaflet. You don’t know if they’ve read it. You don’t know if they understood it. When that person calls, the call handler says, hello, did you get the leaflet? Did you read it? Do you understand what happened? Is there anything I need to clarify?
[00:20:59] And if they say, I want clarification, the call handler must of course be able to give that clarification. But if they say, oh yeah, I got the leaflet, I’m perfectly happy with it, then that should be recorded in the same way as in the previous scenario, and then you can continue with the enrollment. This is simply a practical demonstration of how you can meet the supposedly very difficult requirements of the law in a very practical way.
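Both enrollment scenarios come down to the same practical requirement: keep a dated, attributable record of every consent event, whether it was a web tick box or a verbal confirmation on the call. A minimal sketch of what such a log entry might hold, with hypothetical field names (neither BIPA nor the GDPR prescribes a format):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical consent-event record; the field names are illustrative, not
# prescribed by BIPA or the GDPR. The point is an auditable trail showing
# when, how, and against which notice version the person consented.
@dataclass
class ConsentEvent:
    customer_id: str
    method: str              # e.g. "web_tickbox" or "verbal_on_call"
    notice_version: str      # which version of the privacy notice they saw
    confirmed: bool
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

def enrollment_allowed(events: list) -> bool:
    """Enroll only if at least one positive, recorded consent event exists."""
    return any(e.confirmed for e in events)
```

In the first scenario you would end up with two events for the customer (the logged click, then the verbal confirmation on the call); in the second, only the verbal one.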
[00:21:23] That is not a hindrance to the customer, and that will not impede your legitimate activities. That’s it. Any questions? Next slide.
[00:21:41] Matt: I’m gonna come back in at this stage. Thanks very much, Douwe, for that whistle-stop tour of the regulations. It provoked an awful lot of thoughts in my mind. So I’m gonna start off the questions, and I hope we get some more questions in the chat or in the Q&A session.
[00:21:54] So, I’m sure many people on this call are already involved in different elements of Voice Biometrics and its implementation, and probably have questions at the forefront of their minds. So do feel free to put those in either the chat or the question area. If you don’t want to be attributed, then use the Q&A feature.
[00:22:12] And I think only I can see that, and I won’t necessarily name people; just say, please don’t mention me. So, Simon Davies already has a question: Douwe mentioned a written record required by BIPA. Does a tick box on a website adequately meet this requirement?
[00:22:27] Douwe: The basic answer is yes.
[00:22:28] And I’ve got my basic insights into American law from one of the leading American privacy lawyers. I’ll not mention him, but he’s a good colleague of mine. It’s clear in US law that a written record includes an electronically recorded written consent, and yes, you can get the consent from a log of the click.
[00:22:50] But only, obviously, provided that the information before the click was very clear and explicit. So you must not have a click box that allows people to say “I agree” before having even looked at the information. You must first send them to the information page and make them at least scroll through it, and it can be layered.
[00:23:14] It doesn’t have to be 20 pages of legalese, but at least they must have seen the full page, and they must have had the opportunity to look at more detail. If they have done that, and they click and you’ve logged it, then you’ve got valid electronically recorded consent. As I understand it; I’m not giving legal advice in this seminar.
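The point about the click, that it only counts if the person was first shown the information, can be enforced server-side: accept and log the consent click only when the same session has already recorded that the notice page was displayed. A toy sketch under that assumption; the function and field names are illustrative, not taken from any real framework.

```python
# Sketch of server-side gating for a consent tick box: the click is only
# logged as valid consent if this session first viewed the notice page.
# Function and field names are illustrative assumptions.
consent_log: list = []

def record_notice_viewed(session: dict) -> None:
    """Called when the information page has actually been shown (and scrolled)."""
    session["notice_viewed"] = True

def record_consent_click(session: dict, customer_id: str) -> bool:
    """Log the click only when the notice was shown first; otherwise reject."""
    if not session.get("notice_viewed"):
        return False   # click before seeing the information: not valid consent
    consent_log.append({"customer_id": customer_id,
                        "event": "consent_click",
                        "notice_viewed": True})
    return True
```

The logged entries are then the electronic record of consent that can be produced later, alongside whatever the call handler records at enrollment.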
[00:23:33] Matt: And I think the other interesting feature of BIPA is just who it’s applicable to, in that there are a bunch of exemptions under different US regulations. I don’t know if you are able to speak a bit
more about that. [00:23:44] Douwe: Yes. The first one to mention is that in the United States, banks have been subject to the Gramm-Leach-Bliley Act, the GLBA, for many years.
[00:23:54] They fought it in the beginning, but they are now content with it, let’s put it that way. It’s got a privacy section, Title V of the GLBA, and all the other privacy laws in the United States to date specifically exempt organizations that are subject to the GLBA from their more demanding requirements.
[00:24:16] So financial institutions in the United States are by and large subject to the privacy requirements of the GLBA, but not, or at least not fully, to the further-reaching state laws, including BIPA and the CCPA. Another big difference with Europe is that in Europe, the whole point of data protection law, and you’ll see it in that paper on the origins and meaning of data protection, is that it protects everybody.
[00:24:42] Whereas in the United States, privacy laws tend to only protect consumers, and there’s a specific definition of consumers. They only tend to apply to companies dealing with consumers, whereas in Europe, any entity that deals with individuals, whether charities, companies or public bodies, is covered by this generic data protection law.
[00:25:05] Those are big differences. So yes, you always have to look. There are also big exemptions for medical institutions that are covered by HIPAA in the United States. So for your particular organization, you must always carefully check how you are subject to the law. But even if you are not, go back to my very first slide.
[00:25:24] Trust is the core issue. If you want to get the trust of the people you’re dealing with, this advice that I’ve given you, which is not impossible to follow, would basically be a good idea to follow anyway, even if you’re not strictly legally required to do it. On top of that, at some stage there will be federal privacy laws, and I think they will also apply to the currently exempt organizations.
[00:25:51] Matt: I just remember, when I first started working in this field more than a decade ago and started to talk about implementations with clients, I used to put up this slide that had the range of offer and consent questions on it. And at one end there’s not asking somebody at all.
[00:26:04] And that is just really weird, because you go from authenticating people by their mother’s maiden name and date of birth to suddenly nothing. And that’s just really weird; it sets consumers and customers off going, well, what happened? What changed? Is it still secure? Is it not? So not even informing them is just gonna create you more problems than not.
[00:26:23] And then we used to go along this spectrum and say: do you inform them? Do you inform them in advance? Do you inform them afterwards? Do you ask them? Do you give them an option to opt out, or should it be an opt-in option? And this was in the absence of a lot of these regulations. I think we’ve now settled, in most jurisdictions, on this kind of opt-in model.
[00:26:40] And it’s the degree of opt-in-ness that is required that varies. BIPA has some more stringent requirements for what opting in actually looks like. But I think broadly, asking people, maybe with a positive spin on it, what they would like to do is the right thing, for personal privacy reasons, but also for the effectiveness of these services.
[00:26:58] Douwe: It’s perfectly legitimate to say this: if we’re using your voice biometric, or another biometric, it’s more secure and it’s more convenient. And people understand that. They will be worried about the security, and you can tell them the kind of things that ValidSoft says: that it is totally secure and it cannot be misused for another purpose.
[00:27:15] If there is a problem with it, we can re-enroll you: the old voiceprint goes out and we create a new one. One thing: you were talking about exemptions, and I was talking about the exemptions in US law; the reverse is the case with the EU GDPR. The EU GDPR has extensive application also to companies that are not based in the EU or the European Economic Area, the EEA.
[00:27:38] If you collect and use information on individuals in the EU or EEA, you are subject to the GDPR, either if you offer them goods or services, or if you collect the data by monitoring the behavior of these people. Typically, having trackers on your website constitutes monitoring of behavior. So be very aware that EU data protection law, which is still the strictest in the world, can have significant implications for you.
[00:28:08] If you don’t target people in the EU, the law might not apply, and if you only rarely get a visitor from the EU to your website, it might not be strictly enforced. But if you have a website that is specifically also aimed at people in the EU, for instance because you accept payment in euros or you have a call center with an EU telephone number, then you must expect that you are subject to the GDPR and will have to comply with it.
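The territorial-scope test described here reduces to a short checklist. As a toy illustration only, with criteria simplified from the remarks above (this is not a legal test):

```python
def gdpr_likely_applies(established_in_eea: bool,
                        offers_goods_services_to_eu: bool,
                        monitors_eu_behavior: bool) -> bool:
    """Simplified sketch of the GDPR territorial-scope criteria as described
    above: the regulation is likely to apply if you are established in the
    EU/EEA, or target people there with goods or services (e.g. euro pricing,
    an EU call-center number), or monitor their behavior (e.g. web trackers).
    """
    return (established_in_eea
            or offers_goods_services_to_eu
            or monitors_eu_behavior)
```

A US company with no EEA establishment but an EU-facing website with trackers would, on this simplified reading, still fall in scope.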
[00:28:39] And that’s being increasingly strongly enforced, with fines of up to 4% of annual turnover. That’s the stick. I don’t like to use the stick; I prefer the carrot.
[00:28:52] Matt: And just to follow up on that, certainly the best practice that I recommend to clients is always to do this as two steps: to do what I call the offer step first.
[00:28:59] And we’ll be talking about this in a fortnight’s time. That is where you position the service, its features and benefits, and give customers the opportunity to ask questions, before you then move on to the more formal consent disclosure, which is where lawyers and I often fall out over the specific wording. But we usually manage to find a happy medium where people are robustly informed.
[00:29:19] Yeah. There are some good guidelines saying not to just use words like “I agree” in the case of consent; you want to have an affirmative action. Absolutely, you’re right. It can take many forms, and it can be discussed with people who know how to interact with ordinary consumers in non-legal parlance. That’s quite possible.
[00:29:41] Matt: That's absolutely right. The use of plain English
[00:29:44] Yes.
[00:29:44] Or whatever language is appropriate (plain Dutch as well) is essential at this point, because otherwise it just confuses and overwhelms people.
[00:29:51] Absolutely.
[00:29:51] Matt: So a few other questions have come up. If we've got time, we'll come back to cases, because there are a couple of legal cases in different US jurisdictions around this issue at the moment.
[00:30:03] But we've got another interesting question in the chat from Keith. I think he's asking about alternative purposes. My understanding here, and Douwe can correct me, is that under the GDPR and the UK Data Protection Act, whilst consent is the preferred mechanism,

[00:30:18] there are cases when consent may not be appropriate to obtain. The case I think we're mostly thinking of here is fraud prevention: if we asked fraudsters whether they're happy for us to process their voice in order to check whether they are

[00:30:33] fraudsters, they're likely not to give us consent. So I don't know if you want to talk about that; I think it's the substantial public interest exemption.
[00:30:40] Douwe: That's important. There is a problem there with EU law, covered by Article 23 of the GDPR, which unfortunately says member states may depart from the requirements relating to data subject rights about informing and consent when necessary to fight crime.

[00:30:58] The problem with that is it leaves it to the member states exactly how they do it and to what extent, and the different member states have adopted different laws on these exemptions. So yes, by and large you will find in all these laws some exemption saying that you obviously don't have to inform somebody you reasonably suspect of being a fraudster that you suspect them.

[00:31:20] But the details depend on the national law, so in that particular respect you have to crawl through as many laws as you happen to be subject to, and that is really annoying. In other contexts, the data protection authorities in Europe generally say valid consent is difficult to obtain from your employees, because you're obviously in a hierarchical relationship with them.
[00:31:42] In some contexts that may not be a problem. For instance, at ValidSoft employees are asked to participate in trials of the software; they don't have to. We're saying: would you like to try it out, and we can improve the software if you do. I don't think there's a problem with that kind of consent.

[00:31:59] If it's consent about more tricky things, then there might be a problem. In many European countries, though not in the UK (well, the UK is not an EU country anymore anyway), it's resolved by seeking the advice of the workers' council; in many countries there's a workers' council.

[00:32:17] So if you want to introduce Biometrics as a means of recording who goes in and out of the building and when, or when people make deliveries, for delivery drivers and things like that, you discuss it with the workers' council and come up with a good set of rules.

[00:32:34] For instance, the driver must be able to switch it off when he's out of office hours and using the car for his own purposes; the boss doesn't need to know where the car is outside office hours, but in office hours it should be on. Those are the kinds of details you work out in discussion with your employees, and if you do that right, you'll be okay as far as data protection law is concerned, but you'll have to look at the national laws.

[00:32:57] The US laws tend to have much broader exemptions for fraud prevention.
[00:33:03] Matt: Yeah. So I think in UK law there's a specific basis of processing that is the prevention
[00:33:07] Yes.
[00:33:07] Prevention of fraud, yeah. So just a few more questions that keep popping up, both to me and to other people on the call.
I think one of the interesting challenges I often have when discussing this technology with lawyers and businesses for the first time is this: in call centers, we record calls all the time. We have tons of audio all over the place, and yes, that is personal information.

[00:33:26] It may contain personal information, but it's not biometric information, and generally the basis of processing is some kind of disclosure that's been made to the customer plus the business's interest in doing it. So when does a piece of audio become biometric data and therefore require consent?

[00:33:45] That's the distinction I often find people get confused by. I don't know if you want to say anything on that.
[00:33:53] Douwe: The European data protection authorities, again in that Article 29 Working Party document, WP 193 I think, say a pure recording is not yet biometric data. The moment you do something with it in order to uniquely identify a person, it does become biometric data.

[00:34:14] So take the audio recording of this event. If somebody then says, "Who is that speaker? I've got an awful lot of recordings; I'm going to see if I can identify him. Ah, that's Korff again," it becomes biometric data. So as long as it's purely a recording and you're not using it to identify a person, it's not biometric data.

[00:34:37] The definition of biometric data in the GDPR is, in effect, biometric information when used to uniquely identify a person.
[00:34:44] Matt: Perfect. That's a really great explanation; I will reuse that. We've got another question in the chat, I think related to the power imbalance you talked about with consent,

[00:34:52] particularly as it applies to government organisations. A tax authority, for example, has quite a lot of power and influence over you as an individual, so can you ever freely give them your consent?
[00:35:08] Douwe: The proper view is that, if you look at the GDPR, certain legal bases do not apply to public authorities.

[00:35:16] Public authorities should have a statutory basis for what they're doing, and the law and the regulations under it that explain what they can and cannot do should limit exactly what they do. Now, for instance, my local authority may put on cultural events and allow me to sign up to a newsletter about them.

[00:35:38] They can use my consent for that, because that's not in the exercise of a statutory function that has some kind of direct effect on me. But if it's tax arrangements or other public sector activities, then they will need the statutory basis. The lines are not always easy to draw.
[00:35:58] Matt: Yes, there was quite an interesting case in the UK with HMRC, who implemented Voice Biometrics and did quite a good job of it. But there were some weaknesses in their implementation; notably, they didn't actually do a privacy impact assessment before they launched the service.

[00:36:13] And they basically didn't allow users to withhold consent: effectively, you couldn't get past the part of their IVR that was asking for your permission unless you said yes; it just kept asking. I think that was subsequently found to be not appropriate, on the basis that customers couldn't give consent freely because they had no other option.
[00:36:31] Douwe: And be aware that the UK data protection authority, the Information Commissioner's Office, is much less strict than most of the European ones. If that had happened in France or Italy or Germany, it would've blown up and somebody high up would've been sacked. The UK ICO tends to be much weaker. And beware:

[00:36:49] the whole of UK data protection law is likely to start diverging from EU law, and the UK may well lose its adequacy determination in a year or so. It's a big problem that the government hasn't woken up to yet.
[00:37:02] Matt: And one final question, because we're coming up on time. Please, if you've got any more, throw them in the chat; I'm sure Douwe and I can stay on for a few more minutes after time if you want to hear the answers.

[00:37:12] An interesting scenario has come up with a couple of clients at the moment. As you probably know, Voice Biometrics has both a text-dependent mode, where you usually have some form of fixed passphrase, and a text-independent mode, where authentication generally happens in the background of the conversation.

[00:37:27] In a couple of cases we've got people who've really made the best of one of those forms of the technology, collected consent validly for it and all the rest of it, and now want to implement the other form. And they're left wondering:

[00:37:44] to what extent is this a different purpose? Or, under the principles, is this just something I need to tell people I'm doing? Or do I not even need to tell people I'm doing the same thing slightly differently?
[00:37:55] Douwe: If the purpose remains the same, that is, we want to uniquely identify you by your voice,

[00:38:01] then given that it's only a technical change from actively spoken phrases to the broader context in which you speak, I don't think you need new consent. It might be useful to inform people: we are improving our system, and we can now do it without having you speak specific words. That would be helpful, and it would also stop people getting suspicious:

[00:38:18] why am I no longer asked to read out certain numbers or words? So it's always good to be open. I don't think you need new consent because there's no change in the purpose. A change in purpose would be if you said: I'm using Voice Biometrics for Authentication, and I now also want to do emotion detection.

[00:38:37] That is an extremely tricky issue. I would warn anybody very strongly against it; you are entering very hot water. I'm not saying it can never be done, but I would be very careful.
[00:38:49] Matt: I think even the science is to be questioned in that space.
[00:38:54] Douwe: Much trickier, yeah.
[00:38:56] Matt: That’s been fantastic.
We are at time now and I know people will have to drop off to other sessions, so I want to thank you very much, Douwe. That was a really informative discussion, with a great question-and-answer session after it, and some good questions from our participants; I'm glad they were able to ask those.

[00:39:11] We may well see you again; I think this is a very hot topic. So thank you very much for your time. Just a quick bit of housekeeping then for everyone on the call: we have a couple more events coming up.
In two weeks' time we have best practices for the enrollment journey. Unfortunately Sam, who was going to speak on that session, is no longer available, so if anyone would like to join the discussion, I'd love to have you. Otherwise it's just me talking for a while,

[00:39:42] which won't be as interesting as a discussion like we've had today. We've also added two new sessions to the calendar that you might not have seen yet. On April the 20th we'll be doing our Voice Biometrics 101, which is basically a primer and introduction to Voice Biometrics and how it might apply to your call center.
[00:40:00] And then, really interestingly, given all the discussion around synthetic voices and the threats from those, we'll be holding a session on May the 4th looking at vulnerabilities in Voice Biometrics, what can be done to mitigate them, and how you should properly assess them.
[00:40:14] So thank you very much everyone for joining us. If you haven't already, you'll receive a feedback request by email. This is session three of our community; we are still finding our feet and would love your feedback on what you want to see more or less of.

[00:40:28] So thank you very much for joining us. Please give us your feedback, either on the feedback form, via email, or directly. And once again, Douwe, thank you very much for your time and contribution this afternoon. It's been really useful. Thanks.