
Kelly Peterson is the Chief Privacy and Compliance Officer for Yobi AI, a company dedicated to building models based on consented data to democratize access to data in an ethical and privacy-respecting manner. As CPO, Kelly establishes the strategy for the company’s compliance programs and advises on product development utilizing privacy by design. She collaborates cross-functionally with key internal stakeholders and external partners to explain Yobi’s unique approach to AI development.
Here’s a glimpse of what you’ll learn:
- Kelly Peterson’s career journey from English teacher to privacy leadership roles
- The role of trust centers in demonstrating transparency around a company’s privacy and security practices
- Strategies for building a trust center through internal advocacy and cross-functional collaboration
- The business case for investing in privacy and compliance early
- How transparency and showing consumers the benefits of using their behavioral data builds trust
- Why trust should be treated as a design choice when building AI products and features
- The challenge of navigating overlapping privacy laws, AI regulations, and other privacy-adjacent regulations
- Kelly’s personal privacy tip
In this episode…
Building trust around how companies collect and use consumer personal information has become a defining challenge. Companies need to be upfront with the types of personal information they collect from consumers, why they collect it, and how it is used. Making that information easy to access can help people better understand a company’s privacy and security practices. And one way to do that is through a trust center.
Trust centers do more than build credibility. They can also serve as an efficient sales and marketing tool that quickly answers questions about an organization’s privacy and security practices. Building one often starts with an internal advocate. That advocate can work with sales and marketing teams to demonstrate how having privacy and security information in one place enables more effective responses to requests from organizations evaluating potential business partnerships. When building AI tools or other new products and features, companies should treat trust as a design choice and be transparent about how behavioral data is used and the benefits consumers receive from it.
In this episode of She Said Privacy/He Said Security, Jodi and Justin Daniels talk with Kelly Peterson, Chief Privacy and Compliance Officer at Yobi AI, about building trust-centered approaches to privacy and security practices. Kelly explains the role trust centers play in demonstrating transparency to consumers and business partners. She shares how businesses benefit from building new products, features, and AI tools with trust in mind, and why demonstrating the benefits of using consumer behavioral data helps build trust. Kelly also discusses the challenges companies face when navigating overlapping privacy laws, AI regulations, and other privacy-adjacent regulations.
Resources Mentioned in this episode
- Jodi Daniels on LinkedIn
- Justin Daniels on LinkedIn
- Red Clover Advisors’ website
- Red Clover Advisors on LinkedIn
- Red Clover Advisors on Facebook
- Red Clover Advisors’ email: info@redcloveradvisors.com
- Data Reimagined: Building Trust One Byte at a Time by Jodi and Justin Daniels
- Kelly Peterson on LinkedIn
- Yobi AI
Sponsor for this episode…
This episode is brought to you by Red Clover Advisors.
Red Clover Advisors uses data privacy to transform the way that companies do business together and create a future where there is greater trust between companies and consumers.
Founded by Jodi Daniels, Red Clover Advisors helps companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. They work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media.
To learn more, and to check out their Wall Street Journal best-selling book, Data Reimagined: Building Trust One Byte At a Time, visit www.redcloveradvisors.com.
Powered by Rise25 Podcast Production Company
Intro 0:00
Welcome to the She Said Privacy/He Said Security podcast. Like any good marriage, we will debate, evaluate, and sometimes quarrel about how privacy and security impact business in the 21st century.
Jodi Daniels 0:21
Hi, Jodi Daniels here. I’m the founder and CEO of Red Clover Advisors, a certified women’s privacy consultancy. I’m a privacy consultant and certified information privacy professional, providing practical privacy advice
Justin Daniels 0:34
to overwhelmed companies. Hello. I am Justin Daniels, I am a shareholder and corporate M&A and tech transaction lawyer at the law firm, Baker Donelson, advising companies in the deployment and scaling of technology. Since data is critical to every transaction, I help clients make informed business decisions while managing data privacy and cyber security risk. And when needed, I lead the legal cyber data breach response brigade.
Jodi Daniels 0:59
And this episode is brought to you by, ding, for the people who can’t see you pulling on my hair, Red Clover Advisors. We help companies to comply with data privacy laws and establish customer trust so that they can grow and nurture integrity. We work with companies in a variety of fields, including technology, e-commerce, professional services, and digital media. In short, we use data privacy to transform the way companies do business together. We’re creating a future where there’s greater trust between companies and consumers. To learn more and to check out our best-selling book, Data Reimagined: Building Trust One Byte at a Time, visit redcloveradvisors.com. Hello, hello.
Justin Daniels 1:38
I want to take this opportunity to mention that I launched my own Substack page. Oh, how do people find your Substack page? They can go on to Substack and type in my name. And we know who my first subscriber was. Yes, whether she liked it or not,
Jodi Daniels 1:56
it was actually quite good. So anyone who wants to keep learning and reading in depth, do you want to just give like a snippet of what you’re talking about? Because it’s not security,
Justin Daniels 2:05
it is the intersection of sometimes politics, technology and accountability.
Jodi Daniels 2:11
There you go. You can go to Substack. But today we have Kelly Peterson, who is the Chief Privacy and Compliance Officer for Yobi AI, a company dedicated to building models based on consented data to democratize access to data in an ethical and privacy-respecting manner. As CPO, Kelly establishes the strategy for the company’s compliance program and advises on product development using privacy by design. She collaborates cross-functionally with key internal stakeholders and external partners to explain Yobi’s unique approach to AI development. And Kelly, we’re so glad that you are here today.
Kelly Peterson 2:47
Thank you for having me. I’m very happy to be here.
Justin Daniels 2:51
So Kelly, would you like to tell us a little bit about your career journey?
Kelly Peterson 2:55
Sure, it was definitely a non-linear path. I am not an attorney. I’m actually a former high school English teacher. Sometimes I like to say I’m a recovering teacher. But I was an English major, got started in business development, and after about two to three years of doing that, left and went into high school education, got burned out after nine years, went back to the company where I first started out in biz dev, and actually became a contract manager, dealing with compliance and contracts of that nature. GDPR was on the horizon. We had an office in the Netherlands, we had European clients, and no one else wanted to touch privacy. And I raised my hand and said, well, I’m kind of ethically aligned with this; I think it’s interesting. And so the GC went, you, you figure out how we’re going to comply with this law over there. And that’s how I got my feet wet. And from there, I actually was referred to Grindr, and joined Grindr as their first in-house data protection officer and global compliance manager. Became CISO while there, during a tumultuous pre-IPO time, left there and went to Amazon advertising, where I was over European regulatory compliance, and then actually went back to Grindr as Chief Privacy Officer, and now, most recently, Yobi. So lots more details could be given, but that’s the short version of my tenure in privacy so far.
Jodi Daniels 4:28
I love that, and I have to imagine that, you know, understanding sales is really helpful as you’re working with those teams. The ability to read really long, complex policies and regulations as well as how to communicate comes in handy quite often.
Kelly Peterson 4:43
Yeah, a lot of transferable skills.
Jodi Daniels 4:45
100%. I love that path. So one of the things that I actually think really pulls out that theme of communication is Yobi’s Trust Center. And I love trust centers. I’ve been talking about trust centers for a really long time, because when done well, I think it really highlights publicly the company’s commitment to privacy, security, compliance, AI, keep filling in the blank. So can you share a little bit, what was the motivation for building a Trust Center?
Kelly Peterson 5:16
Yeah, great question. I, too, am a huge fan of trust centers, so that’s why one of my first priorities in joining was getting that built and up off the ground. Because as a company whose kind of ethos is built on privacy-centric, you know, modeling behaviors, I wanted to not be that company that says, you know, when something bad happens, what’s the first line people always say? Privacy is very important to us, right? And then here’s a list of why you should believe us, even though we never had anything, you know, publicly listed on this until something went wrong. And I think trust centers are just kind of a no-brainer way to make a public, non-regulation-demanded commitment to saying, here’s what we do, here’s why we do it, here’s how we do it, and here’s what you can do about it, like, here’s the controls that you can have. And I think the point that you made about when they’re done well: it really helps establish trust with the consumers you’re working with, even regulatory bodies, because you’re kind of putting your money where your mouth is by saying, here’s the information you need to know a little bit more about us, without having to go dig around multiple links to find it, or having to sign an NDA to get access to all this information. A Trust Center is a way to basically walk the walk when it comes to saying, if privacy is important to you and security is important to you, well, then put it out there publicly for people to know. So that’s why, when that’s the ethos of the company I joined, it was kind of a no-brainer for everyone. When I said, we need to do this, it was like, yeah, that makes complete sense. Let’s do it. Let’s build it.
Jodi Daniels 6:54
I was looking yesterday at an interesting B2C IoT health-related device that came along, and the very first thing I went to look at is, what kind of data are they collecting, and how is this secure? And I was sort of half okay with the information. They really, it’s sensitive information, and they would probably have much higher sales if they started with a Trust Center. Maybe I’ll send them this episode.
Kelly Peterson 7:22
Please do. Yeah, I think that people, for a long time, thought that they had to hide something, you know, if they were using sensitive information, which makes sense in the healthcare space; the information you’re going to be collecting is most likely sensitive under some sort of regulation. Stop trying to hide the ball, like, just be up front. This is what we collect, here’s why we collect it, here’s how we’re collecting it, and the benefit to you, and how you can control it, what controls the company has in place for you to manage that data and to, you know, put limits on it or understand more. Bingo. What she said.
Justin Daniels 7:56
When did they start having trust centers? Because my engagement with trust centers, I think of it more in terms of when I’m looking at AI tools that my clients want to buy, but it sounds to me like, from your privacy pro’s perspective, these have been going on for a longer period of time.
Jodi Daniels 8:13
I’m not sure I could tell you the specific year of origin, but they have definitely been around, and some companies are using them. It started with privacy and security, and I am seeing greater adoption because so many software companies are trying to turn themselves into AI companies, or are just brand-new AI companies. And now people have woken up to, wait, I actually might care about privacy and security and how you’re using my data. There’s still a huge market, though, for trust centers. So there are a lot of companies, like Vanta has a Trust Center, Drata has a Trust Center, and they bought kind of other Trust Center companies. There’s one called SafeBase, there’s a couple others, and then some just make them themselves.
Kelly Peterson 8:59
Yeah, and we decided to go with TrustArc as a no-code solution. You know, we’re a startup. We’re small, and it can be difficult to get engineering resources to build a site and to host it and to maintain it. And so it was a no-brainer to say, oh, there’s a no-code solution that I can maintain, that the compliance team can maintain, without any engineering involvement, and it gets across the message we need to. Like, let’s invest in that. And so that’s the route that we took. Makes lots of sense.
Justin Daniels 9:28
Kind of dovetailing on that point, it sounds, at least when I’ve dealt with it, you know, building a Trust Center requires collaboration. You’ve got privacy, security, other folks. So how do you recommend that other companies begin this process with the Trust Center?
Kelly Peterson 9:45
I think you find an internal advocate. For me, it was also a tool that wasn’t just a demonstration of our commitment to privacy, but going to my sales and marketing team and saying, hey, when you guys are trying to have conversations, given the world, and the kind of heightened awareness, not just of consumers and regulators, with regards to how data is collected and how it’s being used, but even if we’re B2B, right, we’re not B2C. So how are we going to convince this company that we’re a trustworthy company to partner with? And one way to kind of put that information out there was a Trust Center. So going to my sales and marketing team and saying, wouldn’t it be great if we had all this information in one place, that you could just give as a link to the people you’re trying to get through to, and give that to their compliance team up front, and, like, you know, not hide the ball, so to speak. I know I’ve said that. But they were like, yeah. And actually, when we released it, they already had a use for it. On day one, it was like, oh, I just sent the link to this entity who was asking for this information, and everything’s there, and the response back was, this is fantastic. This is exactly what I need. Thank you. Instead of, oh, now we’re emailing back and forth, we have to hop on a call. Instead, you can have much more efficient conversations, direct and pointed conversations, instead of getting through the miasma of all the third-party risk management assessments that may have to be filled out, or follow-up questions, or things of that nature. So I think, internally, finding an advocate that will see the value in what the Trust Center brings, maybe not from a pure privacy or security perspective, but how it will make their life easier as well, and then beginning to build those relationships cross-functionally to internally champion that, which is, I think, how all privacy pros and security pros attack a lot of problems internally: how we get things done is to get that internal buy-in.
Justin Daniels 11:32
Well, you know what? You make a really interesting point there. Because now that you say it that way, I really think you can frame a Trust Center as a sales and marketing tool, because when I vet tools for my law firm, we’re an enterprise buyer. And to me, when there’s a trust center that has the SOC 2 and has all the information, one, it’s a huge credibility booster, because that tells me, on a certain level, you get it, and then it’s all in one place, so that, hopefully, when we have a conversation, it’s framed by we already know what you do. And I’m now thinking a Trust Center is kind of like a sales and marketing tool to help get through the due diligence a little faster. It is.
Jodi Daniels 12:10
It’s why I like Trust Centers so much, because it absolutely helps the sales team. It helps credibility. There are companies, there are buyers just like you, Justin. They’re looking at sites, and if they don’t see what they want, it doesn’t look like a real company, or it doesn’t look like they’re taking it seriously, and poof, off they go to the next one. But if I can quickly see all the things that they have, then your product might be horrible, but at least from a privacy and security standpoint, you get some good check marks, and that’s going to help. It also helps from a time frame perspective, because it removes, Kelly, I think what you were just describing, which is, well, let me find the document, let me ask, let me send it, and then you have to do that all day long. It just removes that. Now, there are certainly going to be some follow-on questions, there’s going to be contracts, there’s going to be extra special assessments, but sometimes a significant amount of the questions that people have are going to be addressed.
Justin Daniels 12:59
Yes. And I guess, Kelly, as a follow-up, because you worked at Grindr, you worked at Amazon, and this company is a startup. You know, a lot of companies, particularly in AI, where it’s such a race to market and the competitive pressure is immense, how do you get companies and management who are all focused on the features and getting it out there to say, hey, we kind of have to take privacy and security seriously? Because if you want people to buy this tool and there’s no Trust Center that’s populated, well, we’re not even getting to evaluating the tool, if you want to have meaningful customers.
Kelly Peterson 13:34
Yeah, that’s a great question. I think that time has evolved to help privacy professionals be able to make an argument to, you know, the CEO or the CFO or whoever’s holding the purse strings to decide whether we’re going to spend money to do this or not. Say, you can invest now and it’ll cost you this much, or you can wait until we’re forced to do something, and it’s going to cost you exponentially more, right? Do the compliance work up front. You know, there’s that tipping point, especially with startups, of, we need to invest in building the product, we need to invest in getting ourselves out into the market, but we can’t do that at the complete expense of compliance and, you know, regulatory management. And especially for a company like Yobi, where privacy is a part of the ethos of the company, that happened way earlier, I think, in the startup phase than normal. You know, I was a pretty early hire in the company’s history, which I think is unusual as far as what you’re going to add to the C-suite in the very early stages of a startup. And I think that there’s really very little excuse for more mature businesses to not have trust centers, because, especially if they’ve had some regulatory or compliance issues, that seems like a very easy way to solve for some of those headaches. But, you know, I think you guys know even better than I do. We can talk a lot about it; just do it now, do the work up front, because it’s going to be more painful when you’re ordered to do it, when there’s a consent decree forcing you to do something, than if you just took the initiative to spend that money up front. Don’t be penny wise, pound foolish, essentially, is the adage.
Justin Daniels 15:16
So I guess, Kelly, one of the other things it sounds like you’re saying is, the benefit for privacy and security pros is this new technology, AI, that cuts across privacy, security, IP, regulatory. At least one benefit seems to be it’s coming at a time where privacy and security are taken more seriously, so that these conversations get had earlier, and you’re not in the bucket of wait until we’re forced by the consent decree.
Kelly Peterson 15:44
Yes, exactly. Yeah. I think we’re at a good inflection point in time. You know, it’s not foolproof; like, we still get pushback. There’s still, like, do we really have to do these kinds of things? Do we really need to spend money on it? But I think that there’s just much more consumer awareness. It’s politically supported, at least in the US, on both sides of the aisle, to some degree, of, like, this is something we could probably come along with and find some common ground. And then businesses are just more careful of who they’re doing business with, because nobody wants to be the Cambridge Analytica of the past in the AI world now. Like, you know, we’re waiting to see who’s going to be that, and everyone, I think, is taking steps to try to avoid that.
Jodi Daniels 16:26
What I would add is, I think AI-related tools and services are helping privacy and security teams get their message across, because customers are evaluating, can I trust you, and what are you doing with my data, which are questions often answered by privacy and security teams. Absolutely. So switching gears just a little bit, but still kind of on the theme of trust. Behavioral data often feels, you know, really sensitive to customers. It’s tied to preferences, it’s tied to intent. How do you think companies can consider trust, which we’ve just been talking about, and consent in designing products that derive value from that data?
Kelly Peterson 17:06
Yeah, great question. I think there’s two key points. One is transparency, which we just talked about: be up front with what you’re doing. What is the data you’re collecting? Why do you need that data? How is it being used? But then, additionally to that, what’s the value to people? Because behavioral data can seem creepy, but yet, when we make it very clear what the value exchange is to a consumer, oftentimes it becomes one of the most beloved data exchanges. An example of that is Spotify Wrapped, right? Spotify is collecting a lot of data on its listeners, and then at the end of the year, they get this cool package that they put together, which is a personalization of, did you know that? You know, I loved it when I found out I was in, like, the top one percent of Beyoncé listeners the year her Renaissance album came out. You know, I felt like I really achieved something great. And I liked that they had that data to then report back to me. And also, wow, my kids take over my Spotify playlist a lot, because, I can’t even remember, I think it was a Brazilian kids’ artist, like Patati Patatá. I was like, I’m not listening to that all the time; my kids are breaking into my Spotify list all the time. So I think when companies can kind of think about the ways that they can use data to really demonstrate, through the core product, but also through these ancillary things that they can offer, this is why, this is how we’re using your data to make our product better and personalized for you specifically. I think that companies also need to think about: are our customers going to be surprised? Like, I didn’t know this company was using my data for this. And that goes back to, Jodi, what you were saying earlier, like, I read this in the Trust Center, I read this about what data is being collected, but it’s not quite clear to me all the way. So that’s probably going to lead to some consumer or business surprise, which is what you absolutely want to avoid. I think you need to be forthright, and put yourself in the consumer’s, the customer’s, shoes: what does a customer think is going to happen versus what is actually happening, and make sure that there’s very little space between those two worlds.
Jodi Daniels 19:12
I really like how you just said very little space between those two worlds. I think that’s a really clever way of saying it, and very important, because that’s the truth. Yeah, you’re pondering.
Justin Daniels 19:27
I guess, what Kelly was saying, because I was having a conversation with someone today about how we need to think about AI in our company when we have both European and US operations. So, you know, the regulation regimes are different. But to Kelly’s point, if you’re transparent, you do what you say you’re going to do, and you understand kind of what the unintended consequences are, those are kind of the principles you’re really broadly looking at across any of the different regulatory regimes. So it’s not like you can say, I’m not sure what the regulatory environment is. You kind of have a sense of what the principles underlying them are. It’s just, I guess where I struggle a little bit, Kelly, is I just see how many companies are racing for market share because this AI market is perceived to be so important. And I just struggle sometimes with what I expect some of the boardroom discussions are, where people bring up these good points, and they rationalize it away by saying, well, if we don’t deploy this AI feature that targets this behavioral data, Company X will, and now we’ll be an afterthought. And it’s really powerful when that gets said. I don’t know how you combat that. I really struggle with it.
Kelly Peterson 20:39
It is difficult, and I think that it’s an issue that we’re all grappling with, right, whether it’s your company adopting AI tools, and, like you said, if we don’t use these agentic AI tools, we’re going to be left behind, because, you know, other companies are going to reduce their head count by, you know, bringing on AI agents, or whatever the story may be. I think, though, if you take it back to first principles, for the people on the ground who are designing products, who are making these choices, it’s really choosing to have trust as a design choice. This feature we’re putting in, is it going to enhance trust in our company and our product, or is it going to decrease it? And if it decreases it, then you need to evaluate the risk. Are you doing so at the risk of potentially alienating people? Are you doing it at the risk of causing compliance issues that are going to cost you more money than any ground you’re going to cover in market share? And it’s really getting people to stop thinking about only the immediate future and to think about the long-term future as well, and that’s where trust is a design choice, because it’s very hard to earn trust, but it’s very easy to lose it, if that risk becomes too high and out of balance.
Jodi Daniels 21:47
Well, Justin, you were talking about privacy regulations, and anyone listening knows that privacy regulations are changing all day long and evolving. Kelly, I’m curious, what trends do you think, what does your crystal ball say, will be most significant for companies operating in AI and data science?
Kelly Peterson 22:08
Yeah, I think the trend is finding common ground between privacy regulations that have now been in effect for quite a while, you know, GDPR, even ePrivacy in Europe, CCPA and all the patchwork of state laws, but then also what I call privacy-adjacent regulations, like the EU Digital Services Act, and things that aren’t privacy-first, necessarily, but they converge and touch on things and sometimes are at odds with what privacy regulations say. And then when you add AI regulations on top of that, which are saying you need to combat bias, you need to combat, you know, hallucinations and all these things, but privacy law is saying, well, you can’t use sensitive data for this kind of purpose, it’s like, well, how can we make all these things work together? So I think a trend that we need to stay focused on is, how are policymakers going to even attempt to rectify some of those things, or are they going to say, you all figure it out on the operational side of things, we don’t care if the technology doesn’t support it right now, that’s your burden in doing so. And so I think that tension between all of that is something that’s going to come to a head fairly soon, especially in the European markets.
Jodi Daniels 23:26
Yeah, I would agree. I would just also add to the complexity that some of the privacy laws are adding elements that impact AI. So if a state can’t make a separate AI regulation, then they sometimes add something in around referencing automated decision-making, which kind of references AI. Yeah.
Justin Daniels 23:45
So Kelly, is there a best personal privacy or security tip you would like to share with our audience?
Kelly Peterson 23:53
Yes, I think that we need to go through what I call a default detox, like, every quarter, at least, as consumers, which is: go in on your phone, go in on your devices, and check the settings and see what is turned on by default or turned off by default, because those can change over time. And I think it’s the defaults that people don’t understand that can really be the floodgate that opens up the data flow transactions, between what’s taken off a device and what’s tracked on a device, and people don’t realize that they do have the controls there. They’re just buried in the settings, perhaps, somewhere, and they can change. So on a quarterly basis, go in and delete apps off your phone that you don’t actually use anymore, and for the ones that you do use, go in and see what their privacy settings are and what you can control, and make sure that you’re really comfortable with what’s on being on, and toggle it off if you’re not. So, yeah, I think we need a default detox.
Jodi Daniels 24:47
I love that, and such good advice, especially with so many companies defaulting to on in ways that people might not have expected. When you are not doing default detoxes and helping people on privacy and AI and compliance, what do you like to do for fun?
Kelly Peterson 25:06
I love to travel. Being based in the Midwest, in Kansas City, I’m centrally located. But really, lately, I’ve been into solo travel, so taking trips alone, not planning big, extravagant group trips, but just going out and exploring a destination on my own, and the freedom that comes with that, and being able to travel however I like, without having to please, you know, lots of different personalities.
Jodi Daniels 25:32
Is there a destination that is coming up that you’re excited about, or one that you most recently did?
Kelly Peterson 25:38
I did Scotland for New Year’s Eve, and it was amazing. I spent New Year’s Eve in a castle, because I’ve always wanted to do that, and I got to go sightseeing. And, I’m a big Harry Potter nerd, so I went and saw kind of J.K. Rowling’s, you know, writing spots and things like that. And I did it all on my own.
Jodi Daniels 25:56
Really fascinating. Well, Kelly, I’m so grateful that you joined us today. If people would like to connect with you, where should they go?
Kelly Peterson 26:03
Yeah, you can find me on LinkedIn. You can also find me at my company’s website, which is yobi.ai, and that’s about all the social media I have
Jodi Daniels 26:13
Amazing. Well, thank you again. We really appreciate it. Thank you.
Outro 26:21
Thanks for listening to the She Said Privacy/He Said Security podcast. If you haven’t already, be sure to click Subscribe to get future episodes and check us out on LinkedIn. See you next time.

The post Why Every Company Needs a Trust Center appeared first on Red Clover Advisors.






