Welcome Esther to the stage. Before we begin, there's a... No, that's not quite working. Do it manually. So, before we begin, those are my details. I've been working in IT for 20 years, and I've been working in open source for over a decade. Don't worry, those will be up later. But before we begin, let's consider this little fella here. I don't know if any of you have heard the metaphor about putting a cute little froggy like that in some water and then gradually turning up the heat. But consider froggy here for a minute while I say a few statements to you. So, do we need privacy? I mean, we've all had these little discussions with our family about Facebook and Instagram and things like that. So, perhaps it's too late. And teenagers, they've been online for years. They've posted to MySpace, to Bebo. They've now got additional things like TikTok and VSCO. So, you know, they don't care about it anyway. Do they need privacy? So, it's dead. Throw it away. And as our governments keep telling us, we need to give up a little bit of it just to stop the bad people hurting us. So, yeah. How's that water feeling? So, we've got this huge problem with privacy. We've been gaslit into thinking that in order to get all the goodies, to get free email, we need to give up just a little bit of our data. It'll be fine. Nothing will be done with it, we promise. And people don't entirely get why they need privacy. So, we have an issue. We have a complicated problem space to explain. But this isn't the first time that humanity has faced a problem space like this. Back in ancient Greece, people used fables to explain how to interact with each other in society. Simple children's stories. So, with that in mind, I'd like to introduce you to this guy. He was a Roman poet called Ovid, who came after Virgil, and he wrote quite a bit about mythology. Unfortunately for him, he was writing during the time of Augustus, towards the end of Augustus' life, when Augustus was trying to get through certain reforms within the Roman Republic. He was on his way to hoodwinking the Roman Senate into thinking they still lived in a democracy, and part of that plan was a focus on traditional family values. And unfortunately for Ovid, he got exiled all the way to the Black Sea because of one of two possible things; no one's entirely sure which. There were rumours of an affair with the emperor's granddaughter, Julia. And he also made the huge mistake of publishing a poem called the Ars Amatoria, which pick-up artists love because it gives men a supposedly scientific way to go and pick up women, get their interest, and then get rid of them when you're bored of them. Classical scholars don't like that interpretation. They say it was more of a piss-take of the scientific treatises that other Roman writers were producing. But the thing is that Ovid was political even though he wrote about mythology, because mythology was a huge part of Roman life. They had political festivals that were entwined with religion, and Augustus was on his way to becoming a god. And in Ovid's particular piece, the Metamorphoses, he wrote a lot about transformation, and about mortals and demigods who had less power compared to the gods, and how the gods would just mess about with them for minor transgressions. So he was making a kind of political point and trying to be edgy.
And Augustus just went, screw that. Of course, nowadays you can't just go and exile people you don't like. So what you do instead is put them in jail. Carry on; I've covered that quite well. So I shall go into the next part, which is a lovely little story called Io and Argus. Io was a lovely, gorgeous nymph who unfortunately caught the attention of Zeus, a god who never did stop having MeToo moments. And he decided that his next squeeze was going to be Io. She said no. He ignored her. MeToo moment. Zeus did what he wanted anyway, and then decided to cover it up, because he didn't want his wife Hera finding out. So he covered the land in a big, huge cloud. And as happens with all cover-ups, someone notices. Hera went, that's suspicious, I'm going to go and see what's going on. So she went down to go, hey, what are you up to, Zeus? And what she saw was this. He was just standing there innocently with a cow. And she's like, hmm, can I have that cow? She's ever so pretty. And Zeus went, uh, yeah, yeah, yeah, have the cow, and sloped off quietly. So Hera, knowing full well what he'd been up to, and doing the classic thing of blaming the victim, decided to put Io under constant surveillance with her friendly surveillance tool: a giant shepherd, Argus, with a hundred eyes all over him. And the surveillance was constant, because he only ever closed a couple of eyes at a time to sleep. So if Io tried to wander off and get the word out to her family about what had happened to her, he'd follow along and lead her back. Eventually her family kicked up a ruckus, and Zeus felt a tiny, tiny little bit guilty and thought maybe he should do something about it. So he got Hermes to sort out a solution for him. And what Hermes did was a bit of social engineering on Argus: he told him a story, put Argus to sleep, and then killed Argus. Of course, Io is still a cow at this point. She managed to get away, and then Hera, being a tiny, tiny little bit pissed off, decided to pursue her with a gadfly, until eventually she got all the way to the Nile in Egypt, fell beside the river, and was finally turned back into herself. Yay! So, okay, that's the original Greek myth. What did Ovid add to it? Peacocks! Hera was very upset about her tool of surveillance being murdered, so she decided to immortalise him by putting his eyes on a peacock. Isn't that lovely? So what does this have to do with nowadays? Well, if you're a security firm and you want to show that you're all about the surveillance, you call yourself Argus, and you put an eye right in the middle of that logo or beside it. And Panoptes means, you know, all eyes, all-seeing, and that leads us on to a philosophical idea that was proposed by Jeremy Bentham in the 18th century. He was a philosopher trying to find ways to more efficiently manage and maintain prisons, offices, and hospitals. His brother, when he was in Russia, came up with the idea of a desk in the middle where he'd sit, with several workstations around him. So we've got Jeremy Bentham's brother to thank for the open-plan office. Awesome. But Bentham took this idea a little bit further.
So the idea of the panopticon is that you have a tower in the centre with cells all around it, a dome that goes all the way round, and the monitor sits in the centre and can shine a light into any cell, but the person in the cell cannot see who's observing him. He just knows he's being observed. And Foucault later developed Bentham's idea a lot further, explaining that when someone is being observed constantly like that, they start to modify their behaviour and become more compliant, because you don't want to step out of your box. But there's a little bit of a problem with that, because we've just gone and thought about the particular technology. We've not focused all that much on what it means to be that observed person. And maybe Greek mythology isn't really an appropriate thing for nowadays. It's not always possible to turn someone into a cow. So how can people relate to that? Okay, well, we'll try another thing. We'll try George Orwell with 1984. That's a full expansion of Bentham's philosophical idea, and you've got extra tools for it: you've got the modification of speech with Newspeak under Ingsoc, and you've got the rewriting of the press. But again, we're falling in love with the technical part rather than thinking about the main character. Nobody ever remembers Winston Smith. They always remember the Party mechanism. So other people were concerned about this. The Internet Architecture Board and the Internet Engineering Task Force were incredibly worried about this. So they wrote an RFC, RFC 1984. And there was a good reason, because the Internet was just kicking off when they wrote it. Well, to be exact, the Web was kind of kicking off. But some of you might not know what RFCs are. How many do? How many don't? Okay, well, we'll do a brief bit. Requests for Comments: when a new technology is proposed for use on the Web, or on the Internet in general, someone writes a Request for Comments and then puts it out for other people to examine and make suggestions on, or to say, no, we don't want that. And the nice thing is that anyone can submit an RFC. They mainly come from the IETF, but people outside the organisation can submit them too; those just go through an additional period of review. And the nice thing is you can even submit joke RFCs for April 1. But not everybody likes following the rules. Some people find it a bit of a bother, and some large corporations prefer maintaining their monopolistic advantage. So RFCs tend to be treated more as guidelines for how you should run things like your Internet service. I prefer it when people follow the RFCs, because it makes my job a lot easier. So back to the RFC. The US didn't like their companies exporting encryption. There was an anecdote from around about that time where an American company had to strip the encryption out of its product, ship it over to Europe, and then tell the people there how to put the encryption back in. And there were other such instances around the world. And in 2015, they went, let's make it Best Current Practice. The discussion around that said it had effectively been treated as best current practice for years anyway, so let's just make it official. The statement up there is quite clear. They feel that all Internet users are entitled to an adequate degree of privacy, and they were very upset about the idea of governments around the world trying to restrict that or weaken it. So that's the US example.
And this one in particular really resonates with me, with what the Australian government is trying to do, what the UK government is trying to do, and what Attorney General Barr in the US did just this week. And that's been another one out of the toolkit: occasionally, governments like suggesting, can we just have a register of keys? You can trust us with it. And of course, there are regimes around the world that don't allow encryption at all. So what is threatening our privacy? Well, we know this one. It's quite obvious. Although even having encrypted chats doesn't mean you're necessarily safe if you're backing your WhatsApp chats up to the cloud. And of course, we're just feeding in our data. We can't stop ourselves. We're just hooked on it. And then of course, we have our family and friends just going, I'm going to do a DNA test. Where do I come from? And yeah, I really object to members of my family uploading part of my DNA profile, because a relative's DNA is part of your own profile too. So am I being a little bit paranoid? Well, here we go. David Cameron, very much like Malcolm Turnbull, doesn't particularly like encryption. I'm not sure he understands mathematics, to be honest. And like Australia, we have a metadata law where the larger ISPs have to collect connection data, whatever NATed address the person had, their browsing history, just so that the government can go and search it whenever it wants to. And then GCHQ went, can we just get access to things? Can we just get access to people's private data? We're not going to misuse it, honestly. Which is a bit cheeky, given that the Snowden revelations showed that, no, they're doing it all the time. They don't bother with warrants. GCHQ, that's not their bag. And of course, once it's collected, well, even our healthcare services, which you'd think have a duty of care to protect your privacy, will just sell on your medical details. They'll claim it's anonymised, but no data is truly anonymous. It only takes linking it up with other data to build your profile. And unfortunately, my government doesn't really like following the rules. The Schengen data covers things like stolen cars and tracking criminals, but in the future it's actually going to expand out to border entries and hotels and things. And unfortunately, when the news reports on UK officials, what it means is third-party contractors like IBM, and we know that data has probably ended up in the US, which is in breach of EU law. And of course, in a very 1984 way, if you've got a population of people who you find inconvenient and you just want to get them out of your country, after you encouraged them to settle there in the 50s, you can just delete their landing cards so they have no proof. And this has killed people. And in general, oh my goodness, UK government officials: at the last check of the Wikipedia page for data loss incidents, there were over 30 entries at all levels of government: a folder put down somewhere, a USB key lost here and there, a laptop left in a taxi. So yeah, you should not trust your government to look after your data. They're not qualified. And the US? Well, that's even worse. Yet again, we're looking at the going dark strategy. And of course, they're very good pals with Palantir, who use all of the data that your family and friends put in there to help track people; even very casual relationships can be linked up and lead to someone being deported.
And of course, they're not afraid of spying on their own population, with the NSA providing access to that data. And what's even worse is that the FBI likes making illegal accesses into it. And the personal data involved isn't just things like names and addresses; it can cover things like your magazine subscriptions. So it gives organisations like Palantir quite an interesting profile to build up of how you think and what you believe in. Yeah, Australia. We love that quote, don't we, people? And of course, like the UK, you've got your own collection act, although that's updated from the 1979 Act, which is just catching up with the technology. Australia likes catching up with the technology. And of course, the Australian Federal Police love just not bothering with warrants, doing 160 metadata searches on journalists, just because they don't like the truth getting out. Although surely there's some hope: there's a privacy amendment that was introduced in February 2018. And you'd think, surely organisations will take this seriously. Surely, yeah? Yeah, no. My Health Record, which you have to consciously opt out of, had so many data breaches that people were asking the Australian government not to roll it out. Just don't. Please don't. And to be fair, it wasn't just Australian medical records that got left online on a server outside Australia's jurisdiction. It's the same with the US as well: things like medical tests and X-rays. More data to de-anonymise you. Then there's this one: Queensland Health leaving their records on the side of the road in Bowen Hills. I mean, okay, the data was marked for deletion, but that's a real dereliction of care. And even though it was a contractor, the buck has to stop with Queensland Health. And yeah, all those poor students having to resit their exams this November; they deserve better than this. And well, you all know about the Myki leak with Public Transport Victoria. So Australia doesn't really have a good duty of care with its data either. And here's one of the biggest threats to our privacy. In the UK, per head of population, we're the most surveilled country in the world. We just love putting these cameras up like candy. Although to be fair, it's not just governments. Everybody's getting into home surveillance devices, with the idea of safeguarding your family, keeping them safe, making sure your doggies are okay. And unfortunately, it also tends to be a rather creeping symptom of gentrification. I went to Seattle in November, to Capitol Hill, which is a very liberal, safe neighbourhood, and these devices are appearing everywhere, pointing right out into the street, really solving Amazon's delivery problem rather than providing security. And what's worse is the accompanying reporting app, where you can go and report someone turning up at your door who maybe just looks a bit suspicious or, you know, comes from a marginalised population. And this data gets shared with about 600 police forces in the US. So just have a think for a moment about that data: a private citizen has made a value judgement on a person, without due process. And of course, with all those cameras, this is what's going to happen next. It is already happening. Microsoft got a bit of flak in November because they'd invested in an Israeli surveillance firm that's actively doing facial recognition and video tracking of Palestinians in the occupied territories.
It goes from camera to camera, tracking these people, and it's military-grade technology. And of course, Amazon are very keen on facial recognition as something frictionless, where you can get entry into a supermarket and get served; your identity is there, so you can pay for your goods without getting your card out. Except there's a problem with this, because facial recognition isn't actually that good. There's bias built into it, unfortunately. And, you know, the companies think that's a problem that can be solved with more data: Google allegedly got third-party contractors to pay students with Starbucks vouchers, and homeless people as well, to have their faces scanned, with the idea that the homeless people were less likely to ask questions about what the technology was for. So, yeah, it tends not to be very good at identifying women or minorities, and yet it's gradually being introduced to some public areas, like Canary Wharf in London. So it's spreading. You've got to keep an eye on this stuff. And this even comes down to using facial recognition to access government services. In October, the French government decided to introduce the Alicem system. And you'll love this: with Alicem, you've got to take pictures of yourself with your Android phone (it's Android-only) because it's meant to make things accessible to all of the population, including poor people in rural areas. And this means they can access government services, and that includes going into your local government building to sort out planning permission and things like that. And in schools, facial recognition with the added bonus of artificial intelligence is being tested in China. At least three high schools have facial recognition and tracking, where they're actively tracking pupils sitting in class to see if they're being attentive. There was an article where they interviewed the students, and it's counterproductive to what the software is trying to do, because the students end up sitting there trying to stay awake and fool the camera rather than paying attention in class. And another Chinese project is trying to check how attentive toddlers in Japan are. That's not right. And as I said, frictionless: using things like the iPhone's face unlock just to open your phone, because that technology can and will be used elsewhere. So yeah, awesome. And then on to other bits of biometric data that you can't change. More and more tracking with things like your fingerprints, with your gait. These are coming. These are being tested. But yeah, DNA. That's not going to bite us in the butt, is it? In April 2019, there was a BuzzFeed investigation inspired by the solving of the Golden State Killer case in California. The perpetrator was identified through a relative's DNA, because that relative had gone and submitted their DNA to GEDmatch. So BuzzFeed got ten of their journalists to volunteer to sign up to GEDmatch and put their DNA profiles in, and another journalist would see if he could track them down using GEDmatch and publicly available information like Facebook and the Yellow Pages. For about six out of the ten, he managed at least a partial identification. Some he managed to track down straight away; some he didn't. It just depended on the distribution of that data and what was available. GEDmatch are a particular concern because they've always been quite open about sharing that DNA data with law enforcement.
And now they've also just been sold to a company that's got an even cosier relationship with law enforcement. Commercial DNA testing companies like Ancestry.com and 23andMe say they still need a warrant, but I wouldn't be quite so calm about that, to be honest, because 23andMe are selling their DNA data on to drug companies to develop drugs. And you have got to consider the semantic map that is you, because companies like Palantir, their secondary market is actually insurance. In Ireland, you have health insurance policies that depend on you wearing a fitness tracker in order to get a cheap premium. So imagine that DNA data being sold on to an insurance company and then being passed on to Palantir to make a risk assessment. And then imagine that data perhaps being used in the political machinery of a country to try and target people with ads. I mean, that would never happen, would it? So that's the modern panopticon right here. But instead of one person looking out, everybody else is looking in at you and making assessments based on the data that's fed in about you. So, yeah, how is that water feeling, folks? Feeling a bit toasty? So, yeah, do we need privacy? Your family, they're all on it. Well, unfortunately, your family are your primary weakness, sadly. And even if you move yourself off social media, or use other bits of social media, even if you've never been on Facebook, your shadow profile is right there. So we can't do much about that. We know there's data being held about you. This isn't just about us as individuals, though, because we've got to think about the next generation of environmental protesters out there. You have a generation of kids who have just been through one of the worst bushfire seasons ever, and it's going to get worse for them. And in the UK and in Australia, the government is actually prepared to pin a criminal record on those children. We have a duty to try and sort that. It's too late for us. Let's be the lifeboat. Because they are private. They're used to being surveilled by their parents; their parents check on them on Facebook. There was a girl in the care system in the US who knew she was being monitored through Facebook, so she'd delete her account and then rejoin again. Because it's their communications lifeline with their friends. It's how they communicate. And they tend to hide stuff from their parents by performing a form of steganography: they have shared references that they use. The government can try and ban encryption as much as it likes, but you can still encode a message in plain text. And do we need to give up a little privacy for our security? Well, you know what, there was a US general who later became president who didn't like that idea. The 34th one, Eisenhower. And he was very clear about what trading your freedom for security gets you. You want total security? Go to prison. You'll get looked after. You'll get fed. But you're not really going to have your dignity. You're not going to have your human rights considered. You have very little freedom in a prison. And that's what we're facing. So yeah, what does define our security? Because I don't want it to be that. I want free thought and development and free, open conversations. We need to be empathetic. We need to band together. We all have so many differences, because we're a diverse conference, but our governments want to drive in those divisions. They want to set us against each other. Humanity has a lot more in common than we realise.
So yeah, is there any hope? Well, yeah, there is hope. The Australian press has started waking up gradually, despite Murdoch. It's just a shame that it took the News Corp offices being raided for them to start waking up to it, because they should have been waking up earlier, with the metadata searches. So yeah, well done. And we have cities in California and certain townships in Massachusetts that are actively banning facial recognition. They can't wait for the state legislature to get round to it, so they're just doing it. Direct democracy in action. And of course, we're here. The very subject of the conference is who's watching us, and we've had so many talks around this subject, all talking about the virtual panopticon and what data we're putting out there. And we're not judging people. We're not blaming anyone for this. It's a fact of life. Our municipal authorities put data up on Facebook to inform the population. So yeah, what do we do? Well, you know, you shouldn't just take my word for it. These sources are all out there. There are various reports on the effect that surveillance can have on populations, and there are several organisations that you can follow on Twitter. And well, we're here at a very technical conference with a lot of developers. Build in that privacy. You know, don't be like Amazon Ring, not enforcing two-factor authentication and letting hackers get into your customers' Ring devices and breach the security of their own households by talking to their children through them. Maybe just don't buy the surveillance product. But yeah, build in that two-factor authentication. HTTPS, not HTTP. And yeah, follow people like Privacy International. They've got a very good campaign at the moment that privacy should not be a luxury, and they're trying to force, or rather ask, Google to consider enforcing their Play Protect a bit more with third-party manufacturers of cheap Android phones. But like anything, we kind of have to look after our own privacy first, just so we have an idea of how to help our families do it. We can't help our families if we've got no idea how to do that ourselves. And definitely start being the bridge over for your family and friends. The reason why Diaspora took so long to get traction, unlike Mastodon, is because at first it was a bit hard to set up and a bit hard to use. Mastodon has grown quite a bit because of its simplicity of use and because of the diversity of Mastodon instances. Your family are not going to use communities like this unless we help them get their family and friends over. So it's on us to build it. It's on us not to be judgemental. We have to take a leaf from the Ubuntu book. We've got to be community-focused. It's not just about the technology and the superiority of our solution. We have got to have empathy about this. Because if we continue haranguing our friends and our family about this, they're just not going to tell us when they're using these technologies. You know, they're going to go, oh, God, Esther's on a rant again. I wish she'd shut up about the DNA. And I'd rather we had an open conversation without judgement. So, yeah, you can use your own personal family histories. You can use the cow, just to get people to understand the dangers they're in, but also to tell them: it's all right, I can help you. Trust me. Focus on the cow. Don't go on about how scary technology is. They already know technology's scary.
That's the reason why they glaze over whenever you talk to them about it. You know, focus on Io, not Argus Panoptes, not the hundred-eyed giant. Because that's a pretty horrible future to be looking at. I don't want that. So, yeah, find simple ways just to talk about it, just little conversations at first. Build it up. Rebuild that trust you have with your family and friends. And as I said, you've got a golden opportunity just now, sadly. You've got a climate crisis. And I think a lot of people in this room have had family and friends affected by the bushfires, and they're being left behind by the government. And once you've been able to communicate with your family adequately about the dangers, you're going to be a bit better placed to talk to your political representatives as well, because they're not all that good with the technology either. And because you've been talking to your family and friends about the dangers, get them to contact their political representatives as well. One thing I found with the independence movement in the UK and with the Remain movement in the UK is, yeah, there are setbacks, but people have mobilised. And because of that mobilisation, they have contacted their representatives. That's the reason why Brexit's been delayed about three times: because people were engaged. And in the Scottish Parliament, people do hold their representatives to account. You can access those representatives. The only way you can stop politicians just running roughshod over you is to be engaged in politics. You can't step back from this. You've got to speak to your representatives. They'll try and deny you access. Keep going. And, you know, I have hope, with things like the cities in the US banning facial recognition and the general unease about privacy. I have hope that we're going to populate that peacock with the banning of tracking technologies, or at least the limitation of tracking technologies. I'll take that. Because we're in the dystopia right now. It's in places like China, but it's here with things like the AA (Assistance and Access) Bill as well. It's not a good time. And we've got to not normalise that semantic map, this new panopticon. And I am not precious about this hashtag. In order for us to engage and get our fight back for our privacy, this has to be a decentralised movement. I want people out there, when they see human rights violations or the government overstepping the mark with technology, to use that hashtag. Have a read of the RFC. It's really easy to read. Take the points from it. Quote the RFC. Just go mad with it. So, any questions? Thank you. Raise your hand if you want to ask a question. Awesome. I've answered everybody's questions. Well, I'll ask the first one; I'm sure some people will want to ask after that. So, you talked about how we can talk to family and friends, and that's great advice. But can we use any media resources? Yes, of course. Do you have any good examples? Well, off the top of my head, I can think of Captain America: The Winter Soldier, where you've got the helicarriers going up and they've collected lots of data points to go and target people who are a threat to the government. I mean, you do have things like Mr. Robot, but a really good example of a possible data profile is actually a sci-fi show that was a prequel to Battlestar Galactica, called Caprica. And it was cool because the main character actually got blown up in the first episode, but she'd created a digital avatar based on things like bus tickets and so on.
So, this show was very ahead of its time in asking whether you could build a virtual profile out of the bits of data people leave around, from public transport records to what they post online. All right. Thank you. Anyone else? There you go. So, people are suckers for convenience, and I kind of feel that, as much as you educate your family, even if they agree and are concerned, they're not going to change their behaviour very much. So, do you think the solution is more about regulation or more about alternative products that are convenient? I think it's a little bit of both. The trouble is, when you're trying to do regulation, you've got the same problem as you do with talking to your families: you have got to get the regulators, the politicians, the political body to agree with you on that, because we found this with the copyright directive in the EU. I had about five representatives for Scotland, and I wrote to them about my concerns with the EU copyright changes and the link tax. The SNP MEPs got the point, and one of the Labour MEPs did. Sadly, the Conservatives had been got at by the lobbyists, and that's what we're facing. We're facing lobbyists for publications, for big tech firms that want things to be a certain way. And, you know, it's on us to provide that convenience in community. And half the time, it isn't just convenience that has attracted people to products like Facebook and Twitter, because we're seeing people using Mastodon and finding it easier. So, it's a little bit of both. Thanks, Esther, great talk. You talked about the power of story. What advice do you have for us to really harness stories to help have those conversations? I think we have to do it gently. You've got to kind of gently lead up to the unpalatable pill. But, I mean, we all have family members with tales of bravery from things like the Second World War, where people faced adversity. So often a good place to start, if possible, is the family histories, or the people they admired when they were younger: people who had gone through the war and whose stories they grew up hearing. And you can adapt that: you know, what would that person have thought? What did they fight for? It's gentle little steps. This isn't a panacea. It isn't going to fix our society overnight. But, as I said, you've got mythology. I mean, that's a simple start. A lot of us start with mythology. Of course, different cultures and religions have their own. Ultimately, it's a personalised thing. This has to be one-to-one. You can't just post this talk up on Facebook and expect your family to understand it. I've tried it. They don't. So it has to be face-to-face. We have time for one more question. All right. Thank you, Esther. Thank you.