
Identity is all about data – Steve Wilson – S1.12

My guest for this episode is Steve Wilson from Lockstep. He is my go-to human for all things relating to identity. As we were contemplating this episode we started talking about identity (which might not actually be the best word for it) and then Steve said “identity is all about data” and that is where our discussion started.

Stephen Wilson
Steve Wilson

“Steve is a researcher, innovator and analyst in data protection. He has been a lead digital identity adviser to the governments of Australia, Hong Kong, Indonesia, Kazakhstan, Macau, New Zealand, and Singapore, and has been awarded 10 patents. In 2018, he was described by digital ethnographer Tricia Wang as “one of the most original thinkers in digital identity in the world today”. Starting in public key infrastructure in 1995, Steve saw the potential for this technology in digital credentials. In 2004, he was awarded his first patent for anonymously verifiable attributes. In 2011, he discovered an ecological explanation for the diversity of digital identities as ensembles of attributes, which in turn convinced him that all digital identity boils down to data.”

https://lockstep.com.au/about/stephen-wilson/
Episode link
RSS Feed

Links to things we mentioned

Lockstep – Data Verification Platform

Transcript

Kate Carruthers [00:00:01]:

Hi, and welcome to this episode of the Data Revolution podcast. I’m Kate Carruthers, and my guest today is Steve Wilson.

Steve Wilson [00:00:09]:

Hello, Kate.

Kate Carruthers [00:00:10]:

Oh, hello. I'm going to introduce you, Steve. Steve is an original thinker whose research achievements include six US patents, three in public key security and three in biomedical technology. He is a researcher, an innovator, and an analyst in data protection. And he's been a lead digital identity advisor to governments around the world, and he has been awarded ten patents. There were six before, now there's ten. Anyway, he is one of my go-to people for things around data protection and security. Welcome to the show, Steve.

Steve Wilson [00:00:48]:

Well again. G’day, Kate. Good to see you. Thanks for having me.

Kate Carruthers [00:00:53]:

I was just pondering how we met, and I actually think my very dear friend Stilgherrian introduced us. Probably a long time ago.

Steve Wilson [00:01:03]:

Could be. I floated around UNSW for a long time. I started a postgrad course there some time ago. And I love UNSW. They love people like you. So yeah, there would have been lots of intersections, I'm sure.

Kate Carruthers [00:01:22]:

So what are we going to talk about today?

Steve Wilson [00:01:25]:

Well, I love this all power to data theme of yours and the way that you are digging up so many important lessons in your professional background in data, and it’s all being reinvented. So there’s many ways in I like to unpack how important data is in the economy. I nearly said digital economy, but we’re dropping the digital now, aren’t we?

Kate Carruthers [00:01:52]:

The digital economy and the economy are the same thing now.

Steve Wilson [00:01:56]:

So look, there’s a number of ways into the topic, but I’m working a lot at the moment on and thinking a lot about how do we protect data at a level that is commensurate with its value. So whether you think that data is like the new crude oil or not, like every metaphor, that one’s a bit radioactive. It’s got strengths and weaknesses. But look, data is important and we don’t protect it, and I think that we should.

Kate Carruthers [00:02:24]:

And that is a really good point. Because if you think about data as an asset, you've also got to think about it as a liability. They're two sides of the one coin now, because if you keep too much data and you don't protect it adequately, you can have a data breach. And I've been in every major data breach in Australia in the last twelve months.

Steve Wilson [00:02:46]:

And it’s not theoretical. These data breaches are not fantasy. In fact, they’ve become in a way, they’ve become a little bit mundane. And that’s so dangerous. Security geeks like to say data breaches are inevitable. And I tell you what, my head explodes. I don’t know why we think that it’s cool to just accept that sort of thing. It’s horrible to accept any level of data breaches, let alone to say we’re all going to be breached. Get over it.

Kate Carruthers [00:03:15]:

Well, I always feel a strange sort of kinship with the people who’ve been breached, because a lot of the time I’ve been in it, and I’ve been one of the people saying we need to invest and been told no. And I’m pretty sure that most of those organizations had people who wanted to invest in protecting the data and were told no. So I do have some sympathy for some of the people in those organizations.

Steve Wilson [00:03:41]:

But yeah, 100%, it's a wicked job, IT security and cybersecurity in general. But what is it that makes data valuable? I think that that's really important. We've got this vague idea of data protection, which in the rest of the world is synonymous with data privacy. The GDPR has data protection in its name, and while in Australia we do privacy impact assessments, in Europe they do data protection impact assessments. So privacy and data protection are literally synonymous in a lot of the literature. But I think that what we do in data protection... I mean, I love it. I'm actually one of the rare technologists who advocates for data privacy principles and data privacy law, but I think it's only a model. Like, it's only a start. Data privacy boils down to a set of principles that are about limiting the collection of personal information, limiting the use of personal information, limiting the disclosure (not banning collection, use and disclosure, but limiting it), being transparent, and getting rid of data when you've finished with it. And all of this stuff actually turns out to be a pretty bloody good idea in the light of data breaches. But I think that it's just a start. There's a default assumption that data is valuable and you should keep it close, but it's not a very artful way of looking at the value of data. We actually want to share data, not only as businesses and governments and researchers, but, I mean, think about it. Individuals absolutely need to share data, partly because social media and just being social requires you to talk about yourself, but also to get digital services. You need to tell people stuff. You need to share your shopping history.

Kate Carruthers [00:05:45]:

We want to take friction out of our processes, and the way we take friction out of our processes is data and automation. So we need to share our data to make that happen. I mean, who wants to go back to the bad old days? I can remember going to the Department of Motor Transport back in the day and being in the queue with pieces of paper and stuff, and now it's miraculous. With Service NSW, you can do it all on your phone. You don't have to talk to anybody. And that's powered by…

Steve Wilson [00:06:15]:

It's a modern miracle, some of that stuff. But data sharing and data disclosure have got a bad rap, and some of the data brokers have really poisoned the well, haven't they? They've made data sharing synonymous with data surveillance and surveillance capitalism. And that's kind of sad, because you can't live under a rock. I mean, civilized people want... I mean, to be really blunt, I want my doctor to know stuff about me. And if the doctor thinks it's in my interest for the nurse to know the same things about me, then I need to trust the doctor to be sharing data behind my back.

Kate Carruthers [00:06:56]:

COVID helped me get over my aversion to the electronic health record, because you were going to doctors and I was just like, they need to know. I need to just let them have it.

Steve Wilson [00:07:07]:

Right, but you must have in the back of your mind some comfort that there’s professional standards for medical providers, don’t you? It’s different telling your doctor your history.

Kate Carruthers [00:07:18]:

I know one of my mates does tech support for my local GP office – Terrifying.

Steve Wilson [00:07:28]:

Yeah, there's a lot of sausage making going on in general practice software.

Kate Carruthers [00:07:32]:

You know, they're essentially small businesses, and small businesses really have no idea about data protection and data security. So there are large swathes of the Australian economy that are unprotected. Working, as I do, in a big organization that's working on things like the security of critical infrastructure and privacy legislation and stuff, we can forget that there are organizations that don't even know that these things exist, or know how to even approach data security.

Steve Wilson [00:08:06]:

Yeah, we set people up to fail, especially in small business and in general practice. But another thing to talk about is the mental models that people have for data, how it works, and what it means to control data. Can we have a reasonable expectation that people can control data for themselves? So, can I talk about identity?

Kate Carruthers [00:08:37]:

I think it's an important part of the fabric of the data worldview. The bits that I think are important: there's the data, the storage, there's the integration of data sources, there's identity and access management, there's knowing who the individual is. Then there's master data management, so that you get a golden record and you know who you're talking to. So I see all of that as a big part of the data landscape that we need to master.

Steve Wilson [00:09:05]:

Yeah. And it boils down to context, doesn't it? Like, what do you need to know about people in different contexts? You talk about a master record or a golden record, but that's going to be a different record from context to context, isn't it?

Kate Carruthers [00:09:21]:

That’s one of the things we’re talking about at work, is how can we make contextually relevant information about individuals available in a context and do it safely and securely?

Steve Wilson [00:09:34]:

Yeah. Now, that's one of the cool things about some of the tech that's emerging, like verifiable credentials. The technology is wonderful.

Kate Carruthers [00:09:42]:

Explain what that means because not everybody will know what that means. And I’m really interested to talk about this.

Steve Wilson [00:09:48]:

Okay, so we all know what credentials are in real life, I think. Like a university credential, or a license is a credential to drive a car, or a passport is like a credential to cross borders. And the verifiable credentials movement is partly about digitizing those things in a reliable, high quality way. So not just taking photocopies or scans or copying down numbers, but actually capturing the metadata about who issued a credential, when was it issued, what are the rules and terms and conditions for a credential. If you're claiming to be an accountant, that's sort of interesting, but what's the metadata? What's your scope of practice? Where were you qualified? Where are you licensed to work? So you can wrap all of that stuff into a digital document, and you can format it to be machine readable, that's kind of straightforward, and it can be digitally signed by the issuer. So it's tamper resistant and it's got provenance. You know exactly that it's come from the Australian chartered accountants people rather than the American, for example. So I'm building up this picture that the verifiable credential is partly a digitization, partly a signature of the issuer, so you know where it's come from. The final twist in this is that if you issue a credential to the right sort of end user wallet, then when it's in the right hands, you can prove that it's in the right hands. You can prove that when somebody rocks up to a website and says, hey, I'm an accountant from Australia, you can actually also prove that the right person was in charge of the presentation. So, look, accountants, that's a bit airy theory, but what about proof of age? This is really important in Australia. We've got a whole lot of state and federal initiatives to require proof of age for when you're buying liquor online, for example. These are rules that are going to be legislated in the next few months. So it's a wicked problem.
How do I prove that I'm over 18 without proving everything else about myself? I don't want to necessarily talk about how old I am or where I live or whatever. I just want to prove one fact about myself, which is that I'm legitimate to buy grog. That's a really important use case.
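To make the shape of a verifiable credential concrete, here is a minimal sketch in Python. It is illustrative only: the field names and issuer name are invented, and an HMAC stands in for the asymmetric signature (for example Ed25519) that a real issuer would use. The point is just that the claims and issuer metadata travel together, and any tampering breaks the signature.

```python
import hashlib
import hmac
import json

# Stand-in for the issuer's signing key. A real issuer would hold a
# private key and publish the matching public key for verification.
ISSUER_KEY = b"demo-issuer-secret"

def issue_credential(claims: dict, issuer: str) -> dict:
    """Wrap claims with issuer metadata and a tamper-evident signature."""
    body = {
        "issuer": issuer,            # who issued it (provenance)
        "issued_at": "2023-09-01",   # when it was issued (fixed for the demo)
        "claims": claims,            # the facts being attested
    }
    payload = json.dumps(body, sort_keys=True).encode()
    body["signature"] = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return body

def verify_credential(cred: dict) -> bool:
    """Recompute the signature over everything except the signature itself."""
    body = {k: v for k, v in cred.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, cred["signature"])

cred = issue_credential({"over_18": True}, issuer="NSW Licensing Authority")
assert verify_credential(cred)        # intact credential checks out
cred["claims"]["over_18"] = False
assert not verify_credential(cred)    # any tampering breaks the signature
```

The credential carries one fact plus the metadata about who attested it and when, which is the "digitization plus issuer signature" picture Steve builds up above.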

Kate Carruthers [00:12:14]:

But it is in Australia. Yeah.

Steve Wilson [00:12:17]:

Now, is it the grog or the law that you’re talking about?

Kate Carruthers [00:12:20]:

I think the purchasing of booze in Australia is an important cultural thing.

Steve Wilson [00:12:26]:

It's critical infrastructure. The final part of this verifiable credential story is really important: the liquor store wants to know, as well as it can, that if somebody's claiming to be over 18, that credential is in the right hands. So that's called proof of presentation, or proof of ownership. Now, it all boils down to cryptography. So the credential is digitally signed by the, you know, government or New South Wales driver licensing bureau. They sign the certificate, but then when you present it, you sign it again using some sort of wallet technology. Now, all of that, again, might sound theoretical, but we've been using this technology for ten or 15 years in chip cards, and we've been using it for the last three years in mobile phone wallets. So under the covers, when you've got Click to Pay in your iPhone (and I'm going to talk about iPhone, because that's just me, but it's exactly the same for Google), if you click to pay in a mobile phone app, it reaches inside the secure element of your phone, it pulls out some data relating to your Visa or Mastercard or Amex card that has been loaded with your consent and with the consent of the bank.
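A rough sketch of the two layers Steve describes: the issuer signs the credential, and the holder's wallet countersigns each presentation of it, so a verifier can check both provenance and possession. HMAC again stands in for the real public-key signatures, and the key names, issuer, and nonce are all made up for illustration.

```python
import hashlib
import hmac
import json

ISSUER_KEY = b"licensing-authority-key"  # stand-in for the issuer's signing key
WALLET_KEY = b"holder-wallet-key"        # in practice, locked in the phone's secure element

def sign(key: bytes, data: dict) -> str:
    """Deterministic stand-in signature over a JSON-serializable dict."""
    payload = json.dumps(data, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

# Layer 1: the issuer attests the fact.
credential = {"claims": {"over_18": True}, "issuer": "NSW"}
credential["issuer_sig"] = sign(ISSUER_KEY, {"claims": credential["claims"]})

# Layer 2: the wallet countersigns at presentation time, binding the
# credential to this holder and this transaction (the nonce prevents replay).
presentation = {"credential": credential, "nonce": "liquor-store-7f3a"}
presentation["holder_sig"] = sign(
    WALLET_KEY, {"credential": credential, "nonce": presentation["nonce"]}
)

# The verifier checks both layers before trusting the claim:
# the fact came from the issuer, AND the right holder presented it.
assert credential["issuer_sig"] == sign(ISSUER_KEY, {"claims": credential["claims"]})
assert presentation["holder_sig"] == sign(
    WALLET_KEY, {"credential": credential, "nonce": presentation["nonce"]}
)
```

This is the same sign-then-countersign pattern as a chip card or phone wallet transaction: one signature for provenance, one for possession.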

Kate Carruthers [00:13:44]:

I verified one card this morning.

Steve Wilson [00:13:47]:

There you go. Now, when you click to present, or click to pay, your phone is doing some magic cryptography under the covers. It's also digitally signing on behalf of yourself and sending it off to the merchant. So the merchant gets a cryptographic parcel of information, data and metadata, and the merchant goes, okay, look, I've got the credit card number, I've got it from Steve Wilson. I've also actually got it from Steve Wilson's iPhone. They can actually tell what iPhone I'm using now. That's goodness, because they know that the phone's been unlocked by the person who owns the phone.

Kate Carruthers [00:14:21]:

Which is all great and good for us, but one of the things we have to do is trust the people to whom we give that data, don't we?

Steve Wilson [00:14:32]:

Oh, yeah. So the first thing is, give them as little as possible. Disclosure minimization is a really important rule. I don't want to tell the liquor store anything more than, well, ideally, my credit card number, my delivery address, and the fact that I'm over 18.
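One way disclosure minimization gets built in practice is selective disclosure, roughly in the spirit of SD-JWT: the issuer commits to each claim with its own salted hash, and the holder later reveals only the claims a verifier actually needs. The claim names and values below are invented for the sketch; a real credential would carry the commitments inside a signed envelope.

```python
import hashlib
import json
import secrets

def commit(claim: str, value, salt: str) -> str:
    """Salted hash commitment to a single (claim, value) pair."""
    return hashlib.sha256(json.dumps([salt, claim, value]).encode()).hexdigest()

# Issuer side: the full claim set, each hidden behind its own commitment.
claims = {"over_18": True, "date_of_birth": "1970-01-01", "address": "42 Example St"}
salts = {k: secrets.token_hex(8) for k in claims}
commitments = {k: commit(k, v, salts[k]) for k, v in claims.items()}
# (In a real system, `commitments` would live inside the signed credential.)

# Holder side: disclose just one fact to the liquor store.
disclosure = {"claim": "over_18", "value": True, "salt": salts["over_18"]}

# Verifier side: check the disclosed fact against its commitment,
# without ever seeing the date of birth or the address.
assert commit(disclosure["claim"], disclosure["value"], disclosure["salt"]) == commitments["over_18"]
```

The verifier learns one fact and nothing else, which is exactly the "credit card number, delivery address, over 18, and nothing more" posture described above.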

Kate Carruthers [00:14:49]:

So they don’t even need to know your year of birth or anything, they just need to know, yes, this person’s allowed to buy booze in Australia.

Steve Wilson [00:14:57]:

Yeah.

Kate Carruthers [00:15:00]:

This is one of the things I always feel really uncomfortable about when I go to an RSL club. I don't go very often, usually there's some kind of meeting there, and I have to hand over my driver's license and they scan it, and I just hate that.

Steve Wilson [00:15:16]:

Yeah, I was there last week. We had something local at the golf club, and I was that guy in the queue that held things up while I said, you're not scanning my driver's license, because on the front of your license is the license card number.

Kate Carruthers [00:15:29]:

Yeah.

Steve Wilson [00:15:30]:

So I said, I’d rather type in all of the information that you really need. And I did that. That’s so important. What sort of database is being run at a golf club? How hackable is that? It’s terrible.

Kate Carruthers [00:15:46]:

If the government is going to regulate anything, they should regulate this kind of stuff. They should regulate data minimization. They just need to know that I don't live within five Ks and that I'm over 18. They're the only two facts they need to know about me.

Steve Wilson [00:16:06]:

Exactly.

Kate Carruthers [00:16:07]:

So it goes back to your thing of context and relevance.

Steve Wilson [00:16:11]:

Yeah. And in New South Wales we're getting really close. The digital driver's license is leading to a digital identity. I wish they wouldn't call it that, because it's actually a lot less than identity. When this thing's up and running (it's been piloted and it's working fairly well), it's really a collection of factoids, and I think we've got to stop calling that identity. The fact that I'm over 18 and live ten Ks away, that's not my identity. It's just a really important attribute, and I can prove it using the New South Wales technology.

Kate Carruthers [00:16:46]:

People are going to call it an identity because…

Steve Wilson [00:16:52]:

The trouble is that there's two different sorts of identity. My identity is Steve Wilson, biological entity, and I feel that strongly, and it's analog and it's biological and it's social, and it's me. And everybody else can bugger off, and my identity is sovereign. But that's not what we're dealing with online. Online, we're dealing with a whole lot of little factoids that are relevant in different contexts, and it's so different from identity. I've been in identity for nearly 30 years, and it's the slowest corner of IT by miles. We have been having the same arguments about digital identity for 15 years, and it's slow because we make it too complicated, and it's slow because we call it identity. I mean, I'm just going to be quite blunt about that.

Kate Carruthers [00:17:39]:

It's not. So what should we call it?

Steve Wilson [00:17:41]:

We should call it attributes, or facts and figures, or credentials. That's great if I'm an accountant and I need to sign off an audit report. Or, I tell you what, if I'm a homeowner and somebody comes to the door to fix my pipes and they're a plumber, I don't want to know anything about that person other than the fact that they're a licensed plumber and that…

Kate Carruthers [00:18:03]:

They have no complaints against them.

Steve Wilson [00:18:06]:

Yeah, okay. That's good, too. That's good. Now, you've got to stitch all that stuff together accurately, so that you index the data properly and you know that it applies to the right person. But if a plumber came to your door and you said, show me your identity, they'd probably be insulted, because they just want to show you that they're a plumber. So when you say, what do we call it? I just think it's incredibly lazy that we keep calling this constellation of facts and figures "identity", because it's so not identity.

Kate Carruthers [00:18:38]:

I just remember back when we were talking about Web 2.0, which was the stupid and wrong name for that, and we got stuck with it. And then there was Web 3.0, which was allegedly crypto and digital currencies and stuff, which was also a wrong and stupid name. So there are so many wrong and stupid names out there.

Steve Wilson [00:18:59]:

Yes. And we don't seem to have the sort of temerity to fix that. What's in a name? A lot, especially when you're playing with identity. And it crosses between laypeople and deeply technical people, and it crosses over between professions and citizens and government regulators. I think it's really important that we call a spade a spade, and we know there's too much identity information out there. Look at the Optus breach. It is ridiculous that I am vulnerable because some facts and figures of mine have fallen onto the dark web. It is ridiculous that people can just replay my numbers behind my back and assume my identity. It's ridiculous. So there's so much identity sloshing around out there. Why don't we just try to minimize identity? And the first step is to call it what it is, and it's data. It's facts and figures.

Kate Carruthers [00:20:02]:

Like some of those recent data breaches, I can't remember which one now, one of them was…

Steve Wilson [00:20:06]:

I know what you did there.

Kate Carruthers [00:20:09]:

But one of them was, they were using production data in a test environment and hadn't secured their test environment adequately. And my blood runs cold when I think of how many organizations that would fit, because a lot of the time they don't apply all the same controls to non-production environments. So if I was a bad person, I'd be out there poking around at people's test environments for sure.

Steve Wilson [00:20:40]:

I started a lot of my career, before identity, in medical devices. And there was a horror story about medical devices called pacemaker programmers. They're special purpose modified PCs, modified laptops, that doctors use to interrogate, upload data from a pacemaker and reprogram its parameters. And a really famous pacemaker hacker got onto eBay, found one of these things for a thousand dollars, bought it, overrode the normal login, just got into DOS, and found that on the hard drive was the complete test environment and a whole lot of dev code, and the passwords for the dev environment, that had been copied onto the hard drive of this medical device. Unbelievable.

Kate Carruthers [00:21:28]:

Yeah. I think that there is genuinely so much bad practice, on a customary level, out there in IT land, where developers have done stuff like that without thinking about it, because there was not any perceived risk. But now you don't have to rob banks. You used to have to rob banks to make money, and now you can sit at home in your pajamas in your mother's basement and just get online and do the equivalent of robbing banks. So there's a real incentive for people to work out where weak spots like that are. So a lot of our practices in IT are really bad and need a severe… yeah, yeah.

Steve Wilson [00:22:14]:

We need to take a hard look at ourselves, don't we, Kate? The thing about sitting in your basement doing identity theft, it reminds me of another point about the quality of data. All of that stolen data gets replayed into know your customer processes. So to open a new online bank account, all I need is somebody's birth certificate and passport and driver's license. Now, everybody does that online. Nobody does it face to face anymore, as far as I know. Criminals don't go down to the back blocks and buy fake passports for $100 and fake driver's licenses for $50 and then rock up to the local suburban bank branch and open a bank account. I don't think anybody does that anymore. Instead, you can buy the same data online for about a tenth of the going rate. Now, what's interesting to me is that the KYC process is still the same. The bank still just wants to know four or five facts about you, in an algorithm that then says it's probably Steve Wilson, and that's good enough. So the problem with so-called identity theft is that it's actually data theft. People don't steal Steve Wilson's identity. They just steal enough facts and figures about me that they can pretend to be me online. And this is all about data. So when the government responds, I'm so divided in my opinion about this. I love that our government is responding with real muscle against the Optus breach and doing something about it. But it disturbs me that they seem to be moving towards a national ID as a response to this, because we don't need any new ID. What we need to do is to make our existing identity facts and figures better, so that they can't be replayed. So I like to say that we don't have an identity problem in the wake of the Optus breach, we've got a data problem.
And I wish that we were really focused on that, because if we could solve the provenance of data for identification purposes, then, oh my God, you could solve the provenance of data in the AI world that we're all worried about, where we're worried about what's training on the data, where's the data coming from. We're worried about algorithmic transparency. So when a credit rating is made or an insurance rating is made, and it's not in my favor, I'd like to know what the algorithm is. Well, we could know that. We could stamp all of these analytics processes with the algorithm, and we could stamp it with the governance of the data. So there's this pattern in my mind that all of these problems boil down to data and metadata. We live on data. We need to have better data quality, and we could actually measure data quality, and we could imprint the data quality like a hallmark on every piece of data that matters, in a fairly straightforward way. As they say, we have the technology.

Kate Carruthers [00:25:20]:

Just to pick up on something that you just mentioned. Traditionally, when we recommend people start to use multifactor authentication, we do it because there's another factor that is not inherent in the thing itself. So it's typically something you know, something you have, and something you are. One of the things we probably need is something that I have that I can say, this is really me, it's not some stranger. What we're probably going to have to be able to do is take that kind of multifactor authentication approach into the identity world, so that we can avoid this problem of someone pretending to be you or me online, opening up a new bank account with the stuff they got off the dark web. And it's the additional factors of, yes, some facts about yourself, but some other things that only I know.

Steve Wilson [00:26:30]:

So are you going towards the proof of humanness kind of issue, and the Worldcoin project, and what they're trying…

Kate Carruthers [00:26:38]:

No, I don't want to go that way, because I think that's just creating a very large honeypot for someone to steal. But increasingly we're going to have to solve this. And I think that a lot of the approaches don't allow for anything outside of facts, these things that you're referring to as facts. And facts are very easy to get hold of in our world, because they're data, and people steal data.

Steve Wilson [00:27:07]:

This is exactly my mission at Lockstep, or my passion, because data can be stolen. You need that extra layer of metadata that says, well, look, it's not just a fact that my driver's license is 1234 XYZ. When that string comes across the Internet and hits a website, it can also be signed by me, so the web server knows that it's come from a person in control of a private key that's certified and bound.

Kate Carruthers [00:27:36]:

Which is the other factor that I’m talking about. Do you want to unpack that for folks? Because I’m not sure everyone will understand what we’re talking about.

Steve Wilson [00:27:43]:

Well, it's back to this digital signature thing. A digital signature is an extra code. I actually call it metadata. In itself, it's meaningless. It's just literally like 256 ones and zeros. But it's a code that's generated from a cryptographic key in a secure element, which is your phone hardware or a chip card. The signature is processed on the core data. So if I want to prove my credit card number, I sign the credit card number in my iPhone. And that signature code goes across the network and it hits the server, the merchant server. The merchant server uses a public key, which is like a master key, to undo the signature. And it sees that it matches two things: the signature matches the hardware that it came from, and the signature matches that credit card number. So you get some programmatic logic. You get a rule that can be in the software at the merchant that says, if that signature code checks out, then I know that this credit card number has been presented by the person that controls the credit card. And that pattern has been with us for a very long time in payments. And like I say, it's been very popular in Apple Pay and Google Pay now for about three or four years, in wallets. It's exactly the same pattern that we need to present any important data. And if we had it, then nobody would be vulnerable after the Optus breach.

Kate Carruthers [00:29:17]:

Have you considered how you might mesh something like homomorphic encryption into that world?

Steve Wilson [00:29:24]:

Sure, it’s an extra layer. So I think the important layer is the bottom layer that says, this is the provenance of the data. We know where it’s come from.

Kate Carruthers [00:29:33]:

Yeah.

Steve Wilson [00:29:34]:

So you build up on top of that and say, well, let's make the data even more secure, like defense in depth. Homomorphic encryption is a clever way of scrambling the data so that…

Kate Carruthers [00:29:46]:

I might just explain that for people, because not everybody probably knows what homomorphic encryption is. It's something that Ian Oppermann and I are really obsessed with at the moment. So homomorphic encryption is a form of encryption that allows people to do computations on the encrypted underlying data without the need to decrypt it. So that's what it is. And I was just thinking that if you took what Steve said and put that together, it would be a really nice package, because we're putting a lot of trust in our phones to do this for us.
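A toy worked example of that property, using the Paillier scheme, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so you can add numbers you never get to see. The primes here are absurdly small so the demo runs instantly; real deployments use moduli of 2048 bits or more.

```python
from math import gcd, lcm

# Paillier key setup with demo-sized primes (wildly insecure on purpose).
p, q = 61, 53
n = p * q              # public modulus
n2 = n * n
g = n + 1              # standard generator choice
lam = lcm(p - 1, q - 1)                    # private: Carmichael's lambda
L = lambda u: (u - 1) // n                 # the L function from the scheme
mu = pow(L(pow(g, lam, n2)), -1, n)        # private: modular inverse

def enc(m: int, r: int) -> int:
    """Encrypt m with randomizer r (r must be coprime to n)."""
    assert gcd(r, n) == 1
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c: int) -> int:
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic property: multiplying ciphertexts adds the plaintexts.
c = (enc(12, r=23) * enc(30, r=42)) % n2
assert dec(c) == 42    # 12 + 30, computed entirely on ciphertexts
```

Whoever holds only the public values can add the encrypted numbers; only the holder of the private values can decrypt the result, which is why the data never needs to be unscrambled in between.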

Steve Wilson [00:30:23]:

We sure are. We put our lives into our phones now. It’s the sort of fulcrum for everything that we do.

Kate Carruthers [00:30:30]:

Yeah, I grab my phone rather than my wallet now because everything’s on my phone.

Steve Wilson [00:30:35]:

Yeah. I love that you're doing homomorphic encryption with Ian, because it is that extra layer. It's the absolute way of minimizing disclosure, because if homomorphic encryption works, then you don't ever need to unscramble the data. That's why I love it.

Kate Carruthers [00:30:51]:

I'm fascinated by it, and I think it's one of the really big future things. So one other question, because I need to let you go: how do you see this playing out in the real world? People are pursuing the national identity, the state identity and all of that stuff. How do you see the stuff that you're talking about playing into that space?

Steve Wilson [00:31:22]:

Well, we've got these patterns that people are used to now, like click to pay and tap to pay. The very same technology could present any facts and figures, using the same cryptographic wrapping and signatures, so that when you present some code online to the other side of the world, the server knows that it's come from Steve Wilson, with my consent. And it's different data for different contexts. So if I need to prove my age, then the relying party has a rule: I'll trust the age if it comes from a license authority. Or if I'm trying to claim that I have a particular health condition and I want to quote my health identifier, then that is a totally different fact, and it needs to come from a totally different context. But we've sorted this out for payments. A merchant anywhere in the world can accept my credit card without knowing me, or even my bank, because of these layers and layers of governance. So I think it's going to play out in a really mundane way. I can see my smartphone wallet having verifiable credentials for maybe 20 or 25 facts and figures that matter every month. And those facts and figures are issued from respective issuers. They're not issued by Apple. They're not issued by the bank. They're issued by different communities of interest. Now, what you need to do is to distribute the metadata that allows all of this to be unpacked and digested. Now, the payment system distributes metadata through merchant banks. Acquiring banks set merchants up to accept Amex or Diners or Visa or Mastercard.

Kate Carruthers [00:33:11]:

I know it all too well, right?

Steve Wilson [00:33:14]:

Well, what that involves is that the merchant has privileges into the network through their own bank, the so-called merchant bank. And the merchant bank signs them up to a set of terms and conditions and a standard contract. And the merchant bank also provides a gateway, and in that gateway is the metadata that allows a merchant to know the difference between a Mastercard and a Visa card and an Amex card. It's a really elegant technical model for distributing the metadata so that credit cards make sense.

Kate Carruthers [00:33:46]:

Who would have thought credit cards will save us in the future?

Steve Wilson [00:33:54]:

This is going to sound really sort of pathological, but they are my inspiration. I don't want Visa and Mastercard to run this network. I think it's got to be a new network. But look at what they have done: they've made these very important facts and figures absolutely digestible anywhere in the world. It's a technological marvel that I can go to Mongolia and buy a souvenir with an Australian bank issued Mastercard. Think about that. How the hell does that work?

Kate Carruthers [00:34:29]:

Wouldn't it be really good if our identity was as easy to use as that? So on that really great note, I really appreciate your time this evening.

Steve Wilson [00:34:42]:

Great pleasure, Kate. Thanks for digging in.