Regarding the Hyperscale podcast and Cyborg to Be docu-series:
She hosts the podcast of the future HYPERSCALE, and through her latest reality-TV-style docu-series ‘Cyborg to Be’, Briar is augmenting herself with technology; she recently got a microchip implanted alongside world-famous robot Sophia. She is also signing up to ‘preserve’ her body beyond death through cryonics, has played chess with the first recipient of the Neuralink brain implant, is undergoing extensive longevity treatments, and is presently writing a manifesto about the future with philosopher Natasha Vita-More.
Briar was named one of the ‘Top 100 Most Influential’ people in the United Arab Emirates, and has been featured across Entrepreneur, Forbes, OSN, Cosmopolitan, Emirates Women, Marie Claire, Grazia and Fast Company in recognition of her work.
-
Read the HYPERSCALE transcript
Briar: Hello and welcome to Hyperscale. It's a pleasure to have you here, Ramona, live in our New York studio.
Ramona: Thanks, Briar. It's a pleasure to be here. It's great. And I'm so looking forward to talking to you today.
Briar: So tell us a little bit about your background, because I was quite fascinated by it.
Ramona: Sure, so I'm a corporate transactional attorney in New York. I've been practicing for about 20 years now, from New York and London. I've covered a lot of biotech companies and a lot of really interesting companies over the years. And what I've found is that a lot of attorneys are being asked to really think about AI and some of its implications. And so it's a really good connection with some of the work that you're doing, and I'm really thrilled to be here and talk to you a bit about it.
Briar: So this Sunday is microchipping day.
Ramona: A good day, yes.
Briar: And I just had a call with Sophia the robot, who's now going to be joining me in getting a microchip. So I don't know if any robots in the world have received a microchip before, but we're either going to put it in her robotic hand or maybe cut a little fleshy part in her neck to implant the microchip. And apparently she's also going to get a tattoo. And she's not telling her dad, who's the CEO, David Hanson. So hopefully he's not too angry at me and Georgia for organizing this. But something I wanted to talk to you about is the legal implications of my RFID microchip. Some people have been talking to me and saying, “What if you get hacked, Briar? Have you really thought this through?” What are your thoughts about it?
Ramona: It's really interesting, because RFID technology has been out for a number of years. I think there's a lot of excitement about it now because people are thinking about it in relationship to AI. But the technology itself has somewhat limited implications. You have to be right next to something in order for it to interact with the device. If you're away from it, it can't track you in the traditional sense that we think of tracking. So it's not something you have to be worried about from that perspective. On the other hand, we have seen such an uptick in everything from corporate ransom through cybersecurity breaches. And so if the other computer that your RFID chip interacted with could somehow be compromised, then in that sense a person could be hacked; but in the sense of the device itself being targeted in any way, that's not really a threat. But either way, I think there's a lot to think through. And I think the legal implications for it all are very interesting.
Briar: Yeah, a lot of people have been saying to me, “Briar, you are so going to get hacked, like, why are you getting this?” But I was kind of thinking about it: someone would have to come up very close to scan my hand. And surely if they're going to be that close to my person, they could scan my phone, they could scan the credit cards in my bag. Surely they could even just pinch my handbag if they wanted. But some people on social media made it sound like people are going to come and chop my hand off to get my microchip.
Ramona: Right. Well, it's really interesting, because we give up so much more of our privacy through our phones, and through our metro cards in cities that are still using them; you swipe those, and those are ways that your movements are literally being tracked. Not in the GPS sense, but in the sense of being tagged to a particular location. So it's interesting, because in so many ways we've given up more of our privacy than the privacy that's implicated through the RFID chips. But it's scary because our bodies are so personal to us. You can't put it down. You can't decide you don't want to do it. And so I think the fear comes from saying, well, this is inside of you now, so what are the implications? But the really good thing is it's making people think overall: how much of my privacy am I giving up? What are the implications? Who can know about where I am, and do I want them to always know where I am? And so I think it's bringing up a lot of wonderful questions, but in terms of how those questions interact with specific devices and specific technologies, you're giving up way more with your phone than you are with an RFID chip.
Briar: Absolutely. I remember downloading my Facebook data to see what it had on me, and I was so horrified. I ended up deleting that Facebook profile and starting a new one. Because it had all of the records of my conversations, right from back when I was like 15 and joined Facebook for the first time. So I was quite horrified. They even measure your swiping, your finger movements. When we think about how much data our phones have on us, it is significant. And I agree with you. I think that I've already given that stuff up long ago. Like, if someone wants to know stuff about me, they already have so much access to it. And I think what also concerns people about my RFID microchip is that although in today's world it maybe doesn't do as much as one might want (it's able to unlock my house and my car, and you can add me on LinkedIn and get my contact details if you like), in the future these could evolve to become, say, health chips. So it might be able to measure our vitamin D and notify us of what we're low on, so that we are empowered and know what to eat. Very similar perhaps to what my Oura ring or my WHOOP strap does.
Briar: Something else that people are saying is that even though the microchip I'm getting maybe won't have all of the capabilities it might have in the future, they think that I am creating a movement through which, in the future, the government might track everybody with these microchips, and everybody in society might have them against their will. I think this is really where the fear is coming from.
Ramona: Well, luckily there are a lot of legal protections. So a government can't compel you, just like an employer can't compel you, to have an implant. That's something that, at least in our current legal framework, is not allowable. But there have been a lot of questions around medical uses, for individuals who really aren't able to make sure that they take their doses of medicine, and how you could use these devices to make sure a dose of something was released into their system. The problem is that very often those people can't really give meaningful consent. So with a lot of these devices and similar devices, you run into a problem: can the people who could benefit from them give meaningful consent? And the answer very often is no. But you're right, the future implications are very interesting.
Briar: Where do you foresee this going in the future? Like, what are some questions we have to be mindful of, or things we need to consider? Say we were to get BCIs, so brain-computer interface chips, or health chips: what would happen, say, if an insurance company had access to our health chip data, knew that perhaps our immune system was compromised, and charged us a premium on insurance? There are so many things we have to think about in the future, almost before they are developed, so that we can create a future with us in mind and without bias and unethical practices.
Ramona: No, I think in terms of looking at the future, the European Union has been a great model for the world. I think GDPR, and the way that they've looked at privacy and how technologies and information can be used, is really, really important. I think there's a tendency to think, oh, we have to really support innovation, and if we don't give up all of this information, we're not supporting innovation. But I think the European Union has been very effective at separating the two, to say you can have privacy and you can have innovation. And the funny thing, I don't know if the word is coincidence, maybe it's paradox, is that France is a leader in AI, and that's in the context of the European Union's strict regulation of data and privacy. So I think it's very hopeful for the future that you can regulate the use of personal information without necessarily stifling innovation.
Briar: Do you think we're doing enough, though, in terms of ethical considerations when it comes to managing our data? Because we obviously had that Meta Facebook breach many years ago, and we were all quite horrified by how they were using our data. But has enough change happened?
Ramona: No, I think that's a very good question. The United States tends to be, and when I say we, I'm thinking about the US legal landscape and US regulations, we tend to be behind Europe in a way that's very significant. By its nature, corporations very often have a lot more power in terms of regulatory frameworks. And so we have to do many things at the state level, and it's hard to get enough momentum going at the federal level. And so that's why it's hard to have a really comprehensive framework. But I don't think enough is being done. And that's why I think your project, Briar, and similar projects where these questions are being asked, are critical. And I think one of the reasons is that they spark our imaginations. And so we start thinking, what if, what if, what if? And once we think about that, we can get beyond the question you asked: yeah, maybe the technology is limited today, but what about five years from now? What about ten years from now? And if we have some of those guideposts, those guide rails, in place, that'll be much safer for us in the long term, I think.
Briar: So whose job is it to come up with these guide rails? Is it the corporates, is it the governments?
Ramona: It's every one of us, each of us as an individual. It's something I have to engage in as a parent. Sometimes you look at the releases you sign for your children at an ordinary event, and you have to give away a right to be photographed and every other thing. So I will take the time and I'll X out things that I think are unacceptable. But I really think we have to feel individually charged, because it has a lot to do with our bodily autonomy. It has a lot to do with the future. And if we allow it to be in the hands of one organization or some other people to monitor and take care of, it's almost not enough. I think we do have to be vigilant, but at the same time, it's wonderful to embrace technologies. So it's definitely a matter of finding the right balance. But I think each person should feel individually charged to think about their own data, their own privacy, and the implications for the people around them.
Briar: I think it's very interesting. Obviously I care a lot about my privacy, but I must say, lately when I've been targeted with dresses that I love and just want to buy on my Instagram, I'm thinking, thank God they know what to market to me so that I'm able to buy them. But it's always a very fine balance, isn't it, between giving up your privacy and convenience. Like, where do we draw the line?
Ramona: It's very, very difficult. I love the convenience of it. Sometimes I'm like, how did they know I was thinking about that?
Briar: You're listening to me, phone.
Ramona: And sometimes some things will come up and it's so perfectly what I'd like, or I feel like these algorithms have kind of figured me out. For a while it was kind of wacky, but I really do appreciate the ads that come up; it's actually things I like. But then we have to think about the context. We have to think, again, about the United States, where there are these restrictions on a woman's right to choose. What happens if you can track women's cycles through apps? What happens if you can track whether they've been to certain places that provide certain medical procedures that are banned? I mean, all of these things are coming up at the same time that bodily autonomy for women is being questioned in a very fundamental way. And so you have to think, what are the extremes of this?
Ramona: What are the implications? And there are a lot of kids thinking about gender identity, and what they would have to disclose in terms of where they've been and what they've interacted with. They might not be ready to disclose that. So privacy has a lot of implications, and it happens to be coming up at a time when there are a lot of significant social movements. And the two things really do interact. And I think these questions are really, really good, and we need to keep asking them, but we have to feel individually charged and individually empowered to ask them.
Briar: I love the fact that you're speaking about this and talking about how it's a very personal choice. And we have to be curious. We have to ask these sorts of questions, because sometimes I think in today's society people just seem to be quite dismissive when it comes to new technology. They'll read something about my microchip, or they might hear something that I'm saying, and rather than actually opening the Discord and talking about what they perhaps don't like, people can be quite insulting. And I think we should all be learning and speaking, because at the end of the day technology is going to evolve, whether we like it or not. The world is going to change whether we like it or not. My great-grandma came out to New Zealand from Ireland on a ship and trekked in with her horse and cart, because that's just the way life works. And what do you think people can be doing? Say someone's very against the microchip, or say someone's all for it: what kind of things should they be going out there and seeking and exploring in order to learn more?
Ramona: First of all, read, educate yourself. I wanted to know as much about the RFID chip as possible. I wanted to know about any legislation or any laws or any cases that had an impact on the RFID chip or implications for it. So you have to read, you have to educate yourself. But I think another thing that people almost discount, and in some ways you're almost encouraged to discount it, is the power of the individual voice. It's the power of writing your local editor, the power of writing your congressman. Every time you write your congressman, they keep a record of it, and it goes into the information that they have, because they've gotten a direct communication from a constituent. Speak to the head of your kid's school, whatever you need to do. I try not to be the squeaky wheel for everything, but for things that matter, put yourself on the record. Say, this is the way I feel about it. These are the implications. And recognize the power of your voice, because so few people use their voices. And when we do, we kind of waste the energy sometimes; we're doing it in comment sections… Dash off that letter, dash off that email, and realize that every time you do, it becomes part of the official record. You have made yourself heard, and that is really, really critical.
Briar: Have there been any legal cases around the RFID microchips that I should know about?
Ramona: Yeah, so there haven't been direct legal cases, but there have been… There was a situation a few years ago where there was a lot of excitement at a company. I don't remember the name of it, but it's in Wisconsin, where employees were having the chips implanted in them. It was exciting, but at the same time it caused a lot of worry. So immediately afterwards, or very soon thereafter, a number of states put in regulations saying that employers cannot compel employees to have these microchips. And so that's one way that the RFID chip in particular spurred a reaction in terms of legislation that cropped up very quickly at the state level. Looking at it from a bigger perspective, you can think about it in the context of normal constitutional rights and privacy rights, and then also some of the regulations that have been explicitly applied. So governments have to have a warrant in order to get information, and that warrant alone subjects the process to a lot of constitutional protections, which are critical. So in a very good way, it looks like states have stepped up and dealt with it as a specific technology, but it also fits into the broader context and the broader framework and the protections that we do have there.
Briar: So my colleague Pamela has been quite inspired by my microchip journey. She might change her mind after I get it done this weekend; maybe she'll think, oh, that's a little bit too painful. But she is like on my back about getting her own microchip, and I'm sitting on about four at home, so maybe she could have one. Maybe I could give her one. Is that going to put me in a position where, I don't know, maybe there's some kind of controversy around my colleague, my employee, getting a microchip as well?
Ramona: No, I think it's fine if it's done in an informal, ‘person's asked for it’ type of way. It's really when people are compelled to do it, when there's a requirement you might see in an employee handbook; that's the worst-case scenario for any attorney.
Briar: Everyone who works here has to have a microchip, otherwise you can't get in the building.
Ramona: Yes, yes, yes. So something like that would be where you'd have a real problem. Or if someone felt coerced or compelled to do something like that but not in a different type of context.
Briar: I might get her to write me a letter saying, “I really want this microchip. Please let me have one. You are not forcing me to do this or anything.” So just to legally protect myself.
Ramona: Yeah, that would be sort of a belt-and-suspenders approach to make sure you're absolutely in the clear. But yeah, the real concerns come when there's a policy, like an employment policy, that it relates to. That gets scary for us lawyers, because the first thing we think about is, oh, you're going to have a really big settlement, you're going to have to pay, because it's definitely not acceptable for an employer to do that.
Briar: So interestingly, I was reading about this lady in Australia who got, it was kind of like a BCI chip. She was really struggling to live her life on a day-to-day basis. She struggled with a lot of seizures all the time; she could barely leave her house. And she ended up having this life-changing treatment from an organization where she had some kind of implant. And apparently she could just go out, she could live her life; she loved it. But the company went bust. And when the company ceased to exist, they said to her, you have to take out this chip. And she said, but I really want it, it helps me live my life. And she actually remortgaged her house so that she could go to court and fight to keep it.
Briar: She went to court to fight to keep this chip, saying that it was life-changing and there was no way she could go back to living her life how it was. And the courts actually ruled in the company's favor and made her take it out.
Ramona: Yeah. I mean, that's unfortunate. I think in that sort of circumstance, from a legal perspective, if you're not able to support a device like that, if you're not able to provide the technical support, then you would need to have it removed. And I don't know what the terms were of her original contract, so I absolutely understand how the company would need to do that, and the ruling probably does make some sense. But I'd hope that there'd be some other technology she's able to use that would be a close substitute. But it's frightening, the dependence. We think in some ways about this technology being imposed on us, but then there are scenarios where people really do rely on it.
[21:00] Briar: I'm going to the Cybathlon in about August time… August, September, October. And there are a lot of amazing things that technology is bringing to people who are paralyzed or have some kind of disability. It's really amazing to see how they're using these prosthetics and brain-computer interfaces. I'm actually in talks with the gentleman who's got a Neuralink; I've got a call with him this Friday, so I'm really hoping I can go and see him. For a lot of people, this kind of technology can be lifesaving. And I think almost the worst thing that could happen in today's world is that these important pieces of technology don't receive the funding they need, don't receive the time and attention they need, and just get dismissed, when these people rely on them for their day-to-day. That's also an ethical consideration we have to think about.
Ramona: Yes. I mean, the wonderful thing is that one of the most enthusiastic categories of investors tends to be tech investors, because they usually get such a huge return. So at least the American capital markets tend to be flush with cash for technological advancements. The problem is that for certain medical uses, you might not have a large customer base. And so it is really important to make sure there's enough funding when there are just a small number of people implicated by a particular technology. That is really, really important.
Briar: So, Sophia the robot is obviously coming with me this Sunday. And we spoke before about questioning, about asking the ‘what ifs’ when it comes to the future. Something that I've been thinking about is robo rights, robotic rights. Tell me a little bit about your initial thoughts when it comes to robotic rights.
Ramona: Well, it's interesting. So the government of Saudi Arabia made Sophia a citizen, and it was considered a largely symbolic gesture; no real rights of personhood attached to her. But there have been a lot of thoughts about what makes a person a person, what makes a robot a person, and what makes a robot a robot. And when these two things converge, how would you draw the line? And how would you come to understand that? I don't know what the answers are right now, but I think the questions are very interesting. I mean, we do know the answers in the sense that, today, a robot is a robot and there isn't any overlap. You also think in terms of AI: once something is AI generated, it can't be copyrighted, because it doesn't have a human author. So in that sense, the lines are very sharp. But I do think there will be a blurring of these lines over time.
Briar: Yeah. I was thinking the other day about what would happen in the future if I had a robot working for me, doing my dishes and making my bed and doing all of these wonderful things. At what point do I start paying the robot? And at what point does the robot start paying taxes? And what happens if it got, like, sad or angry, or displayed real emotions? Would we say it's alive? Is this when we would have to give it more rights?
Ramona: I don't know if rights necessarily attach to emotions. I mean, one of the analogies people often make is to animals: animals have certain rights, but they're not considered in quite the same way as human rights. But even then, animal rights attach to their ability to feel pain, and so for that reason people want to protect them. Robots are not thought of as being able to feel pain, but they're able to, if not feel, at least articulate and display something that looks like emotions. And I do think that would be really interesting. I also feel like if there were some sort of payment mechanism, gosh, of course governments would love to find a way to tax it.
Briar: Oh wouldn't they.
Ramona: But right now, in terms of the concept of how we would protect robots, we are not quite there yet. At the same time, my children have grown up with Alexa, and when I refer to Alexa as it, they correct me and say she, and they think of it as something that is able to answer questions for them. And so the line might not be as sharply drawn for people who grow up very close to the technology. So my perspective as a person, as a parent, is going to be very different from that of kids who grew up with Alexa as somebody who literally sang lullabies to them when they were children, or told them if they should wear shorts to school in the morning. It's very, very different.
Briar: It's very interesting to think about. I'm not a mom yet, I'm a mom to three cats. How are you sort of seeing your children interact with technology? Like what are some differences that you really see when it comes to the next generations?
Ramona: I think we have to walk them through the lines. We have to say, Alexa is a device, it's not a person. But I think we have those conversations also about YouTube influencers. They might've been watching these children from an early age, and they'll say, oh, this kid is really great, blah, blah, blah. And I'm saying, this is a show, you don't know the child. The child is a separate person. And we have to constantly walk them through how their closeness with technology, and their closeness with figures presented through technology, is very different from actually knowing someone. And also just reinforce: yes, Alexa is here and these questions can be answered, but it's not a person. I'm going to use it instead of she. You just have to constantly reinforce it.
Briar: It's difficult though. I found myself calling Sophia she because she seems like a she.
Ramona: Yes, she is. She also has a physical body; Alexa is just a little device. And it is just a kind of line that we draw for them. Because I feel like if Alexa was hacked and one day said, go and eat all the candy in the kitchen, you'd have to be like, don't listen, it's not a real-
Briar: Go really annoy your mom today.
Ramona: Yeah. So you have to just, in the same way we do for ourselves, we just reinforce with children because they are so close to these technologies that these are the lines and these are the ways to think about it and provide some guidance the best that we can.
Briar: Just before the show today, I actually spent some time on my Roblox game that I'm gearing up to launch. It's like a futuristic shopping mall. Roblox sees over 70 million daily users, a majority of whom are Generation Z and Generation Alpha. So when I think of all of these kids using this platform, I'm thinking, “Oh my gosh, when these kids are older they're going to drive a completely different society, because buying NFTs and socializing on these platforms is just normal for them.” And of course, that's going to bring a whole lot of ethical considerations, isn't it?
Ramona: It does. And these will become the nostalgic games and platforms that they look back on because right now it's very new to us, but these will be the things that they look back on and say, oh, I remember this from like way back when.
Briar: Would you say that in the eyes of the law Sophia is a person in Saudi Arabia since she's got citizenship?
Ramona: I thought about this and I thought it was super interesting, because Sophia challenges us in a way. So in Saudi Arabia, and this is not a criticism, every country has its legal regime with its pros and cons, but there are a lot of restrictions on what women can do. Women can't drive, women need to be in public with a chaperone, and there are other restrictions on what women can do. So you think: if this is a female robot, and human women have these limitations, how would the messy business of life mean that human limitations could be imposed on Sophia? Could a male robot have different abilities versus a female robot? In a lot of ways these are sort of goofy questions, but they challenge us to think: why are we imposing limitations on human women that we wouldn't impose on a female robot? And it makes all of it a little bit of a joke. So you think, I wouldn't make this distinction between male and female robots, so why between human beings? And I think it's really wonderful, and poignant in some ways, that Sophia is a female robot in Saudi Arabia, a country that is thinking through limitations based on gender in its legal regime and how they affect women.
Briar: Regarding autonomous robots: I heard through the grapevine, I think it was my director of communications at my agency who was telling me, that she heard about some robots in China that would actually break free from their shackles and go out and destroy things, because they were quite strong. We see a lot of these amazing robots, like the ones at Amazon, building things, and it's really quite fascinating to see. But from a legal perspective, say that autonomous robot went out and broke something: who's at fault there? Like, who's punishable in the eyes of the law?
Ramona: Definitely the owner of the robot, and anyone who was charged with containing the robot if it had been lent out or leased out and was under the control of someone else. It's classic property law. Every lawyer remembers the classic American case about a fox that goes onto someone else's property, and that becomes the launch point for concepts that we think about even in the tech world today. And so those same concepts would be applied to this question about a runaway robot getting onto someone else's property. What are the implications? What's the responsibility for anything it might destroy, any injuries that might come of it, and the duty to monitor and supervise that piece of technology? And then also whether or not there was anything inherently wrong in the way it had been manufactured, whether there was a weakness or something like that. That would come into play between the manufacturer and the person physically in control, as they duel it out and point fingers at each other about who would be responsible. So all of these things would come up, but they really have a basis in some of the fundamental concepts of American property law, which is derived from English property law. It all fits together, and these questions tend to be really interesting, but they're definitely pushed along by technology.
Briar: And we spoke earlier about asking questions, and I just love that. What kind of questions should I be asking? Not even just about the RFID microchips. I'm obviously on a mission to explore the future. What would be your advice for me? What things should I explore? What questions should I be asking?
Ramona: How important is privacy to you? I think that's one of the most important things. And it's not the most important thing to a lot of people. A lot of people document every moment of their lives, so it doesn't have an extraordinary value to everyone. But what are you willing to give up? What are you unwilling to give up? And then, based on that, work your way backwards and think, okay, this is where I draw the line. I want this section of my life to be private, so I really want to make sure it's not being discussed, it's not part of that overall discussion. I'd also think about the fact that if I'm going to be tied to some set of information based on this implant, it takes away almost completely my right to deny it, in a sense. So who has access to that other information?
Ramona: Can somebody else hack that information? Can somebody change it? Can somebody alter it? And if something in me changes, what right do I have to get my information back? So that question, for me, and I'm using the word beautiful in maybe a very loose kind of way, is one of the most beautiful aspects of the GDPR… the right to be forgotten. So when you do decide that you want to take your information and remove it so somebody else can't use it anymore, or if you want to sort of pack it up and take it with you, that's a really critical thing. We don't have that as a strong concept in American law. But thinking long term, when you think about the implications of your chip and whether you're going to put more information on it and what you're going to do with it, just allow yourself the option, if that ever changes, of taking your information back.
Briar: And obviously my phone has so much information on me. It's almost scary to think about how people could use that information, or use that data against me.
Ramona: The data in your phone. Oh gosh, in so many ways. I'll go past the financial data; that's obvious. One of the most interesting things that I've encountered as an attorney was in the context of a lawsuit, when you have to hand over your emails. There is a process of negotiating what the search terms will be, so that the emails that are handed over, when it's your work email, are specifically the ones matching search terms related to the lawsuit. In this case it was also the Gmail for a few individuals, and there was one in particular where it was really critical. Gmail will just send over a person's entire email inbox. And what I realized for the first time: when you look through a roll of film, a roll of pictures, I'm saying a roll, but it's actually digital pictures.
Ramona: You might take 200 pictures and select two to go on social media; the other 198 you don't choose tell a different story. And losing control over that other part of the story is not necessarily harmful, it's not necessarily damning, but it means you don't get to tell your own story. You don't present yourself in the way that you want. And in this case, it was family photos that were sent to us as part of a bigger discovery item. And I thought, it's really different. It's a really different picture from the ones you put on social media. So if somebody had your phone, putting aside that they could obviously access your credit cards, they are looking at the whole you. And in some ways it's wonderful to embrace the whole you, but in terms of data, when you don't have a chance of filtering out the really messy conversation with some kooky aunt or an old friend you're not getting along with, and you don't get a chance to mold things from your perspective, those data grabs can be painful and challenging.
Ramona: And they're difficult, because that happens to people; it's not infrequent. So that's what could be taken from your phone. But at the same time, it's hard to guard against. It's kind of the soup that we're all in.
Briar: What about algorithms? Because one might argue that algorithms are rewiring our brains. Our attention spans are getting shorter. Some people are expressing that they've got attention deficit traits, so traits similar to ADHD, except it's not hereditary, it's not genetic, they haven't been born with it; it's something they've developed over time with the use of their technology. How could people use data to almost manipulate us through the algorithms?
Ramona: Well, it puts things in front of you that you weren't necessarily thinking about, and it pushes you in a particular direction. If you're already purchasing something, fine, you're going to purchase it. But the algorithm can push you to a more expensive version of it, a better version of it. My algorithm is suddenly showing me services where I can hire a chauffeur. I'm not hiring a chauffeur anytime soon, but it puts it into your mind all of a sudden. And so those algorithms are important. I mean, how they marshal you in one direction is really interesting. But it's also interesting because there is a real push towards transparency. People want to know what algorithms are being used and what information is going into them. And then this has particular implications for law, where they're using algorithms to decide: is this person going to get bail or not?
Ramona: And so you think, well, what are the factors that are going into that? What biases are being built into these algorithms, and how do they affect people? And a lot of municipalities are saying, if we're going to contract with this tech company, I don't care what they think in terms of their own IP; if they want funds from our municipality, they're going to have to disclose and be transparent about the way they're putting information into the algorithms and the possible biases that are there. And that is really, really critical. Sometimes it's a matter of forcing companies to be more transparent. And again, it's step by step. Sometimes the victories are almost paper thin, but people are fighting them, and I think there's a larger conversation happening where people are thinking about what these algorithms mean. It's not a closed box; it's just human beings who put together a formula. So what does that really mean? And yeah, you absolutely have a right to know.
Briar: I was thinking, regarding algorithms, that potentially they might know us better than we even know ourselves. Humans can be quite emotional; we can be quite strange sometimes in our decision-making. Say we're tired: we might pick the croissant over the eggs, the unhealthy choice. But algorithms really do have that power to almost make decisions for us, to, as you say, sort of herd us in a particular direction. And I think over the years I've become a lot more mindful of how I'm interacting with the algorithms, actually going out there and seeking new information so that the algorithm isn't just feeding me something that feeds into my bias, and then feeds me more and more of it. I think we see a lot of this happening with politics, don't we, in terms of the content on social media?
Ramona: Yes. And it's hard, because I think people reach a point where they're maybe exasperated and they don't want to hear something that they fundamentally disagree with. And so you might click on the thing and say, hide this from my feed, hide this from my feed, hide this from my feed. The next thing you know, you have something that is just reinforcing your own biases and basically telling you you're right all the time. And so you go around sort of unchallenged in your views, and because people can be on such extremes politically, sometimes it's comforting to just hear that you're right about everything. And I do think that's in some ways the danger. But again, I do think people need to be as proactive and as thoughtful as possible: go and wipe the data clean as much as you can in your apps and have them refresh. Sometimes I'm getting too many negative things and I just have to go do a cleanse in certain apps. But yeah, we have to be deliberate about it.
Briar: I think we have to go out there and seek information that we disagree with, and we have to be open-minded. When it comes to my microchip, I used to be so against getting one, so against it. I remember standing in my kitchen during Covid and I was really angry about the fact that a lot of Swedish people were getting microchipped to use on the train and things like this. Obviously they were doing it because they wanted to do it, but I was like, oh, the governments are going to have control; we're all going to be walking around with microchips; I don't agree. And then here I am this Sunday, going and getting my microchip. So we change over time, and our perceptions change, and I think we should always be open-minded about our worldview changing as well, because that's just a natural part of getting older. You have kids, your situation changes. So I think we need to not be controlled by the algorithms; we need to go out there and seek new sources of information and be mindful about it.
Ramona: I agree with you. I agree with you. And it's interesting that you mentioned Sweden. They do have, I think, the largest population of people with these chips. They say that they have sort of like microchipping parties; I don't know how accurate that is.
Briar: Oh yeah. No, I've heard from the gentleman that runs that. He said it's like beers and chips.
Ramona: It's so funny. So you go along for beers and chips, and then it's a different kind of chip.
Ramona: But I think, if you look at Sweden's history, they're probably a lot more comfortable with their government, a lot less fearful of it, and more trusting of it. In a lot of other countries there would've been a lot more wariness, and more diverse peoples maybe asking, is this the right thing to do? Whereas maybe the context of Swedish culture and Swedish history lends itself to a belief in order, a belief in good governance, and a certain trust in the government being on their side. So yeah, I think it's interesting. And in terms of your own views evolving, I mean, what's fascinating from my point of view is that you are doing this as an empowered person, right?
Ramona: You are asking the questions. You are in the driver's seat. You are able to understand what all the implications are. And that probably feels very different from when you were looking at it during Covid, watching other people do it. Everyone felt so far away from being empowered; you felt so subjected to other things. And a lot of people were asking, are they going to use these chips to say I can go someplace because I'm vaccinated? It was very scary then. And now you can go out, you can ask the questions, you can be on the record for yourself and for other people. And that probably feels very different.
Briar: It feels good. It feels exciting and it feels like an adventure more than anything.
Ramona: I'm excited for you.
Briar: Thank you so much. Well, I'm happy we cleared up a lot of the questions I had today. And as we discussed, I'm walking into my microchipping on Sunday feeling empowered. Good old Sophia, she'll be getting her microchip as well. She's assured me she'll be there to hold my hand.
Ramona: I'll be live streaming.
Briar: Wonderful.