Reema Vadoliya podcast interview – why data accessibility might be the solution to addressing bias in AI.


Reema Vadoliya kicks off S6. Join us as we explore how AI is reshaping businesses, highlighting both the upsides and the challenges. We discuss the importance of making data more accessible to combat bias in AI systems, and the hurdles companies face in this effort.

Reema describes herself as a passionate business founder, storyteller, and advocate for inclusion in data. She has extensive experience in data strategy, analytical exploration, data collection, and governance. She’s on a mission to challenge the perception of data as a dreary necessity and draw out the real human stories that organically empower intentional inclusion in data and beyond.

*As a reminder, you can now also get this podcast in video form, on both Spotify (video) and YouTube.*

Listen/watch now, right above the subscribe button, or pick your favourite listening platform from this list:

Spotify: Click here
Apple Podcasts: Click here
YouTube: Click here

Use a different listening platform? Choose it here.

https://youtu.be/2UAyiFkCulE
YouTube version of the podcast episode. Click above to watch.

As always, if you enjoyed this, and previous episodes, please like, rate, share, and subscribe to the podcast – it all helps!  

Useful Links

Podcast page: https://podcasters.spotify.com/pod/show/azeemdigitalasks

My Twitter page: https://twitter.com/AzeemDigital

My LinkedIn: https://www.linkedin.com/in/azeema1/

My website: https://www.iamazeemdigital.com/

Reema’s LinkedIn: https://www.linkedin.com/in/reemavadoliya/

Episode Transcript

Azeem (00:01.6)

Hello everyone and welcome back to the Azeem Digital Asks podcast. We’re back for another season. I cannot wait. It’s been a long time coming, but I will not bore you with what’s happened in between. That’s a story for another day. Please also, you know, as usual, like, rate, share and subscribe. I’ve got a brilliant guest for you today, Reema Vadoliya. I’m not gonna talk too much, because you are here to see and listen

to all of the goodness that she’s got to share with us. So without further ado, would you like to introduce yourself to the audience?

Reema Vadoliya (She/her) (00:34.922)

Yeah, absolutely. Thanks so much for having me, Azeem. So I’m Reema Vadoliya. I run People of Data. For the last seven or so years, I’ve been working in data and analytics, and I absolutely, truly love it so, so much, which isn’t how everyone thinks about data. But through all of the work that I’ve done, working at big companies and small companies and speaking on conference stages, I’ve not necessarily talked about my own personal experiences with data and…

how, as a person who ticks a lot of the protected characteristics boxes on those supposed diversity and inclusion forms, it just feels endlessly frustrating to be put into a teeny tiny box. So that’s why last year I quit my job to start People of Data. And it’s all about collecting data inclusively, in a way that we can represent the brilliant diversity and intersectionality that we all hold as individuals, so that we can actually use

people’s data to do good things and actually, you know, serve them well and build trust and make data maybe even like a tiny bit exciting for people. You don’t have to go as far as loving it. But if I can make people even just a tiny bit excited, then I think I’ve done a good job. So that’s me.

Azeem (01:47.424)

Amazing, and that right there is why I don’t intro the guests, because I cannot top that. So that’s really what we’re gonna be talking about today, and I’m really excited to get into this topic with you. I guess naturally the first place that I can start, then, as we’re talking about data and AI, is the obvious one. A lot of my audience are in businesses, or have their own businesses. How do you think that AI impacts businesses, either positively or negatively?

Reema Vadoliya (She/her) (02:16.298)

Yeah, absolutely. I think, you know, with AI, it depends how much you’re using it as to how you feel about it. From my perspective, I think it can be an incredibly time-saving tool. I think it can provide a bit of clarity and break some of the silos that you might have. I’m thinking more about the ChatGPT kind of things at the moment, but even in tools such as Canva or, you know, other tools where you’ve got AI

you know, quote unquote, working in the background there for you, it’s allowing you to interact with something other than just yourself. And I think that can be really, really useful to just break down some of those silos that occur when you’re working on your own. But I think some of the negative things are we sometimes trust it a little bit too much, or take things that it says verbatim, and allow ourselves to

sort of stretch the truth a little bit, or actually, worse still, compound the truth, and all the bias and the kind of exclusion that it can create, really. So I think that’s where we need to be cautious when we’re using it. As with everything, I guess, the same, and maybe this is too simplistic to compare it to, but if you’re putting your shoes on, you need to make sure that you don’t have a stone in your shoe and that you tie your shoelaces. You know, everything should be used with a little bit of caution and risk

assessment, but certain things feel like you should be considering that a little bit more.

Azeem (03:42.72)

No, absolutely. I love that analogy. I’m absolutely going to steal that. So I’m just letting you know in advance. Thank you very much. You talked about accessibility and making data more accessible. You also touched on bias as well. So how would you say that making data more accessible would help to combat bias in sort of AI systems and tools?

Reema Vadoliya (She/her) (03:48.33)

Yeah, cool.

Reema Vadoliya (She/her) (04:08.842)

Yeah, so I want to start off by talking a bit about what accessibility really is. So I’m going to read this definition here. The way that I see it, and kind of the definition that I think is appropriate here, is it’s the practice of making information, activities and/or environments sensible, meaningful and usable for as many people as possible. And I think there’s a few key things in there: sensible, meaningful, usable for as many people as possible. So when we think about that

from a data perspective, what I mean is the practice of making data sensible, meaningful, usable for as many people as possible. And I don’t believe that data is seen in that way at the moment. What I see and what I hear when I tell people that I work in data, and I do it with a smile on my face, is people are just like, I don’t like data. You know, I’m not good at maths, and

what does data mean really? And, like, you know, I don’t really trust any sort of data. And then we start talking about AI sometimes too. But for me, data accessibility is about making it not just fun, but yes, sensible, meaningful, usable. So when we think about how AI works, to again hugely oversimplify here, it takes information in, does some sort of processing based on the information that it knows and the context that it knows, and then it spits out an output.

And that output there is entirely dependent on two things: one, which data has gone into it, and two, how it’s actually processing that data. So the thing is, if we put in information that’s not sensible, meaningful, usable, then what we’re getting on the other side, on the output of the AI, is, you know, nonsensical, not meaningful, not really usable data and information, but we don’t understand how

not sensible, not meaningful, not usable that information is when it goes in. So for me, what I’m really talking about with data accessibility is how do we make sure that that data is actually representing all of people’s brilliant intersectional identities. And right now, when we use protected characteristics to define someone, we’re not really looking into who they truly are and, you know, what they represent and

Reema Vadoliya (She/her) (06:26.858)

you know, the brilliant fun things that those boxes just can never capture really.

Azeem (06:33.76)

Absolutely. You made me think of something there, because as you were speaking, I was thinking a lot of people probably don’t consider that. I think so. I mean, feel free to disagree. But I think a lot of people don’t consider what they’re putting in, quite easily, because they just want the easiest answer that comes out of it. I don’t think a lot of people consider that. Why do you think that’s the case? Why do you think people just don’t really think about those sorts of things?

Reema Vadoliya (She/her) (07:03.018)

I think it’s a little bit of the dopamine factor of wanting a quick win and having the opportunity to do all of those things I mentioned, you know, like get the creative output, get that feedback really, really fast, get the time that you save. But I think there’s just something around, if you’re saving hours, hopefully, from using some of these tools, then how do you just redirect five to ten minutes of that to do some of these checks and

just have a little pause there, and just think about what it is that’s actually going on in and around that interaction that you’ve had there. But I think the reason why is because, as humans, there’s so many things that compete for our priority and our time that doing a due diligence check with your data and your AI is just one item on a long list of things to do. You know, because then if we go back to that shoe analogy, would we go back and check that the shoe is definitely made in exactly the

right way? You have trust with a brand that once I put my shoe on, the sole is not just going to suddenly fall off, or that when it rains, my feet are not going to get entirely soaked. So I think there’s a trust that is just given automatically in some of these spaces. I think not enough people are looking to have their trust earned in this domain. We trust tech a lot, and I think that can be a good thing and a bad thing, right?

Azeem (08:24.416)

Yeah, absolutely. Absolutely. I couldn’t agree with you more. Sometimes probably too much, if I think about my own behaviours, but the episode’s not about me, so apologies for digressing. Right. We’ve talked a lot about data accessibility, a term that you’ve used quite a lot. From a business point of view, what challenges would you say that businesses face when they’re trying to improve this? And how do you think businesses can start

to tackle these issues?

Reema Vadoliya (She/her) (08:55.818)

Yeah, I think when I talk about accessibility, inclusion, diversity, it can be really easy to think that’s just HR’s role. But with all of this, and when we’re thinking about people’s information, we’re not just talking about your team and who’s in your company. We’re thinking about who are your customers and how do we understand who they are. And also that might be your external stakeholders, like your funders or…

know, it could be anyone. It could just be someone that has awareness of your brand. And I think that’s where businesses really need to care about data. I think that’s a point that’s been made plenty of times. And just to touch on it briefly, data is information. It gives us access to knowledge. It allows us to go kind of past just gut feeling. And so this is all really, really important, because it allows us to make really good and, like, you know, evidence-based business decisions. And so why people should…

care about this is because if we can understand truly who our stakeholders and our audiences are, then what we can do is start to serve them better. And what does serving them better mean? It means meeting our organizational goals. To put it most simply, what does that mean? It means that we’re getting the money that we’re seeking to earn from our audiences to keep the business running. So I think this truly isn’t just a nice-to-have exercise. This is really, how do we understand

what we’re actually doing and the impact that we’re having, in an inclusive way. Because there’s some really interesting statistics out there about, you know, Gen Z and the accountability that they hold brands to, to say, okay, if I’m going to work at your company, I really need you to have a really solid and actionable diversity and inclusion plan. Otherwise I don’t want to work at your company. It’s really easy for someone to go on TikTok or Instagram

and just write something online and completely trash a brand. So how do we build trust with these people in, you know, a digital world that’s so fragile? And I think what that comes from is building trust. How does that come? It’s in the interactions that we have, the one-to-one ones, where an individual is just handing over that information. We need to make sure that we’re actually being clear as to why do we need this information? What are we doing with it?

Reema Vadoliya (She/her) (11:13.386)

GDPR in theory does help with this, but I do see a lot of cases, unfortunately, where people are just saying, well, I need this data just for a board report, or I need this data just for information purposes. But that’s not actually a very clear reason. I think we need to be adding a lot more clarity on how and why we’re using data.

Azeem (11:32.928)

Yeah, definitely. I’ve done an episode in the past purely on data where we had a conversation about something very similar to what you just said. Somebody asked for data to make their theory work. So rather than being led by what the data actually says, this person in their request said, I want you to find the data that tells this story for me, which is stretching the truth. Which, excuse me, kind of leads me on to

the point about regulation, and you’re talking a lot about AI and stuff as well, which brings me quite nicely to the point, or the question I should say: we’re probably a bit too far down the road now, but, you know, better late than never, do you think that AI should be regulated? And if so, who do you think should be accountable for ensuring diversity in all of the models and training sets that are out there?

Reema Vadoliya (She/her) (12:30.058)

Yeah, the short answer is absolutely we need regulation on this. The more complex answer is that it’s really difficult to know who’s the right person, and who’s the right body that exists as an organization that can keep up with the pace of change in this. I’ve spoken to some people who work in, you know, cyber security, who on a literal daily to hourly basis sometimes are changing their slides before a conference presentation, because it’s just changing that fast. And

yeah, I wish there was some way we could kind of slow it down. And I think there have been a few signs of that happening, with some, you know, documents signed by big tech leaders kind of saying, maybe we should slow this down a little bit. But I think the challenge that we come to is curiosity is there to see what we can do with AI, and the curiosity sometimes wins over the “should we be doing this?” It’s the “can we be doing this?” that’s the question that normally gets answered.

I think yes, absolutely we need regulation. I think there’s some really interesting organizations that exist, like the Alan Turing Institute, which has a lot of brilliant research going on there. I know there’s obviously the European kind of organizations. I think it was just this morning or yesterday that I was reading about their EU AI something-or-other organization being built. I definitely should know what it’s called, but

I’m sure that will be available. But yeah, I think it’s about holding each other accountable and understanding what the risks are of not doing that. And I think the risks are talked about well enough. Yeah.

Azeem (14:10.88)

Yeah, absolutely. And if it does come to you after we record, we can always share it in the show notes. That’s not a problem at all. There’s so much there. My head is just literally going around, and I’m sure that even once we part ways and I edit this recording, I’m going to regret not asking specific questions. So I’m probably going to have to tap you up for a follow-up blog post. But naturally, we’ve talked a lot about sort of where we’ve come from,

where we are. Let’s move on to the future. So when it comes to this sort of area, the topics we’ll be talking about in terms of AI and data accessibility, where do you see things heading in the medium to long term future?

Reema Vadoliya (She/her) (14:54.346)

Where I hope that we get to is that we can move past protected characteristics as a way, in kind of commercial businesses, to understand who our audiences are, because I don’t think those questions tend to be specific enough to actually understand how, as an organization, we can serve individuals. I think in public sector spaces it’s a completely different conversation, and there’s still…

I’m still always forming my thoughts on what the right approach there is, but I think within arts and community spaces in particular we really need to question how we’re collecting this information. And the way that I would love to see that go in the future is we’re asking much broader questions that, sure, maybe aren’t uniform across all sectors, but it’s giving us an opportunity to really say, like, this is the impact that we’re having. This data collection method

of using people’s protected characteristics is decades old. It’s arguable as to whether it was fit for purpose a few decades ago, but I think now, especially in such an intersectional society, not just in the UK where you and I live, but across the world, there’s so much diversity, which is brilliant, across society. So we need to really create a space where we can understand what that looks like and how we can identify the challenges that exist.

Yeah, the short answer is basically a nice, inclusive, friendly way to collect information where individuals don’t question why they’re handing over that information because it’s just very clear to them. And there’s a bit more of a culture around saying, yeah, I’m happy to hand over this data because I trust that you’re going to do something good with it. Whereas right now, I think we’re quite far away from that.

Azeem (16:46.112)

Yeah, absolutely. The trust point is something I was just going to pick up with you on briefly. Do you think that with all this advancement and steps forward in terms of AI and everything else, do you think, I don’t want to put words in your mouth, where do you think that leaves people’s trust? Do you think that we are less trusting now, or are we more trusting?

Reema Vadoliya (She/her) (17:09.194)

It’s complicated. I think it depends on who you are. I was listening to a podcast recently talking about whether phones are bad, or whether it’s just the culture around the internet that’s bad, because those two things are really quite separate. And thinking about how the conversations that can happen on social media create challenges to our narratives, those things aren’t necessarily bad. But I think

Azeem (17:10.432)

Ugh.

Reema Vadoliya (She/her) (17:37.418)

around trust in data, where we’re at right now is that people just don’t have a clear understanding of why we’re using it. And they don’t feel like they have any choice, and choice and control are really big things in trust. So yeah, I think that’s where we’re at. And you can see it, ironically, to bring in the protected characteristics, across certain demographics around ages. You can see that some people are a lot more trusting, because websites are sometimes designed in a way

where it doesn’t feel like you have a choice but to hand over the information. I’m thinking about some of the flight booking websites. It feels like you’re not having a choice as to whether you can book a seat or not, but you have to and it will cost you 15 pounds or whatever it is. So yeah, I think it kind of comes to that trust and control and people’s experiences as to how that is. But it’s, yeah, as with any negative experience, the negative experience is normally shared a lot further than the positive. So people do hear about those.

times where people’s data is being used and unfortunately abused as well.

Azeem (18:41.856)

Yeah, definitely. My mute button didn’t want to unmute then. We’re rapidly coming towards the end of the episode, which I’m frustrated about because I could literally talk to you for hours about this stuff. Before we do that, I’ve just got a couple more questions for you. One I’m definitely going to put you on the spot for now. We’ve talked quite a lot in terms of AI, data accessibility, diversity.

But is there one thing that you wanted to talk about or discuss that I haven’t asked you about yet? And if so, what is that?

Reema Vadoliya (She/her) (19:15.722)

That’s a good question. I think for me, it’s trying to think about how we storytell data. Part of the reason why I love it so much is I truly just see it as like an adventure, and a place to explore, and a playground to take all of the information that’s there and think about, right, how do we build a story from this? How do we say, here’s what’s going on? So I would love for people to see data in a different way. And if you’re not comfortable with the word data, think about it as information.

So every single day, probably every single second (I imagine, I’m not a biological kind of scientist), we’re taking in so much information all of the time, and we’re processing so, so much in our brains. So I just want to reframe data as not something that just exists in spreadsheets or bar charts or pie charts or whichever chart you prefer, but something that you can create really bold, clear narratives from that allow you to

get more understanding. I think there’s a difference between data, that kind of information part of things over here, and the understanding that you can get from a really brilliant story. So yeah, if you want to get a little bit excited about data, then I think, hopefully, I can make people feel that way.

Azeem (20:32.608)

Amazing, I love it. Just before you go, and before you share all of your social details and whatnot, let’s distil it down into one thing. So, everything you’ve been speaking about over the past 20 minutes or so: what’s the one thing that you’d like people to take away from this episode, when they hit stop on this recording and start to reflect on what they’ve seen or heard?

Reema Vadoliya (She/her) (20:58.474)

That data doesn’t have to be scary. It can definitely seem overwhelming, and like we don’t have that control and trust and opportunity to have an influence in how we use data. But I really want to get to a place where data is not scary for people, and they can understand that they do have control, that it’s something that impacts all of us. And yeah, that it’s really not something to be nervous about, because

I think that will come from the fact that we will hold organisations to account when they’re asking for information.

Azeem (21:33.536)

Love it. Fantastic. Yeah, my mind is going 100 miles an hour now with lots of stuff that you’ve discussed, and I’m sure I’ll have more questions afterwards, so be ready for a couple of emails. But yeah, before I let you go, firstly, thank you so much, from me, and in advance from the people who are going to listen to and watch this. This has been incredible. Before you go, though, please do share where people can find you and follow you on social media, and also be sure to check the show notes, because I’ll drop all of these links in as well.

Reema Vadoliya (She/her) (22:04.426)

Perfect. Well, yeah, thank you so much for having me. I’ve had a fun time chatting. I am normally on LinkedIn doing lots of things, talking about data on there and, yeah, starting to share my excitement over there. So you can find me, Reema Vadoliya, on LinkedIn. Same for People of Data: People of Data on LinkedIn. We’ve also got Instagram, which I can’t remember where the underscores are, but People of Data on there, and soon to have a few other things coming up. But for now,

LinkedIn is probably the best place, and I would love to chat to you about your data, how you feel about data, and allow you to kind of enable some success and unlock the information that comes from collecting really good and meaningful data, which has come from a place of trust as well.

Azeem (22:51.68)

Amazing. Love that. Definitely do connect, because she’s an absolute legend. I’m very glad that our paths crossed earlier this year. So yeah, thank you very much. This has been absolutely incredible. Yeah. And I think that wraps up this episode. So a massive thank you. As always, the boring stuff: please do like, rate, share and subscribe. We are back. We are so back. You can expect much more content, and you’ll be seeing a lot of Reema all over

my social media soon, because I’m going to be promoting the hell out of this episode. So if you’re not subscribed, make sure you subscribe already, because we’ve got way more content coming, and I’m going to get a whole load of content out of this episode. Literally, I could do a whole month based on what you’ve just said. So thank you so much. And yeah, stay tuned, and we’ll see you for the next episode.

Reema Vadoliya (She/her) (23:43.946)

Thank you.