How can we optimise for a meaningful life?

Ali Abdaal
 
Taimur Abdaal
 
Neel
 
11.Jan.2021

Ali
[SPONSOR] Hey friends, if you wish you weren't hearing an ad right now, then straight after you listen to this episode, head over to watchnebula.com/not-overthinking, with a little hyphen thing in between the "not" and the "overthinking". So watchnebula.com/not-overthinking. Through Nebula, you'll firstly get access to all of our podcast episodes ad-free. Secondly, you'll see exclusive content from me and a load of other educational-ish creators. And thirdly, it directly supports this podcast, so you'll incentivize me and Taim to record more episodes. My name is Ali, I'm a doctor and YouTuber
Taimur
I'm Taimur, I'm a data scientist and writer.
Ali
And you're listening to Not Overthinking.
Taimur
The weekly podcast where we think about happiness, creativity and the human condition.
Ali
Hello, everyone. And welcome back to another episode of Not Overthinking. Taimur, how you doing today?
Taimur
All right, I actually have a new webcam set up as of this week. So I have a fancy camera, which is looking at my face. So hopefully when we upload these episodes to YouTube next week, which I think you said we'd do it next week. Right, Ali?
Ali
I think that's the plan. Yeah.
Taimur
Okay. You're handling that side of things. Let's see what happens.
Ali
Yeah, we will see. We'll see how that goes. But yeah, you'll look nice and good on the webcam, as they say.
Taimur
Although there's a bit of a problem because the camera is positioned like to the side of my monitor. So when I'm looking at your face on the screen, the camera will see me looking off to the side, which is a bit weird. So I don't know how to solve that. But I guess it's the same for you.
Ali
Yeah, it's always the problem with this sort of webcam setup, where I have my camera kind of 45 degrees from me. So when I actually look at the screen, it looks like I'm not looking at the camera. There's a YouTuber I know who uses one of these teleprompter setups on top of an extra monitor, and then HDMIs it in as like a third monitor. And so he can make eye contact while looking at someone on a Zoom call. And that works because teleprompters, for those of you who don't know, have one-way glass, so you can see something on the screen, but the lens won't pick up the thing on the screen, which is how you can read stuff on a teleprompter without it looking as if you're not looking at the lens. Anyway, all that aside, today is a very exciting episode because we are joined by a friend of ours called Neel. Neel, welcome to the show. How you doing?
Neel
I'm doing pretty well, thanks. Thanks for having me.
Ali
Quite right. [SPONSOR] Before we get into anything, we need to talk about this week's sponsor for the podcast. And this week it's actually Brilliant, who are very kindly sponsoring this podcast. They are sponsoring us again for 2021. So clearly, you guys were doing something right by going to brilliant.org/notoverthinking in 2020, because they wanted to continue to support our podcast. But Brilliant, I think, ties in nicely with Neel, because Neel probably doesn't want us to mention this, but Neel studied maths at Cambridge, and actually ranked first in his year group, which is like a huge deal. So Neel, do you want to tell us a little bit about intuition, and how Brilliant is the best thing ever?
Neel
Of course. So I'm a really big fan of, like, actually understanding things rather than just rote memorization or going through the motions. And I think if I were to make a list of the things that most maths students get wrong, it's focusing on details rather than the bigger picture. And I think what Brilliant brilliantly achieves is actually trying to give people an understanding of what the hell is going on. And the way I think about maths is, you figure out what the hell is going on, and then all of the details fall into place. But if you do it the wrong way around, you just spend three years really confused, and then never talk about maths again.
Ali
It sounds like this is music to Taimur's ears, because yeah, he has said this many times.
Taimur
I wish he'd told me that like five years ago.
Ali
Anyway, if you guys want to improve your own intuition, then as you know by now, Brilliant is a fantastic online platform for courses in maths, science and computer science. So head over to brilliant.org/notoverthinking. And the first 200 people to click that link this week, assuming 200 of you click that link, will get 20% off the annual premium subscription. And it helps support the podcast. So win-win all around. How was that as a sponsored plug, Taimur?
Taimur
I think that was solid. Maybe we should have Neel on every week to do the Brilliant plug.
Ali
Yeah, we can do like a pre-recorded segment.
Neel
If you want, I can just say, like, "I'm Neel and I endorse this message" (laughter) "and you can pay me half of the sponsor fees and I can give you my permission."
Ali
All right. That sounds like it would be pretty good overall; it depends on how my promotions go this year. So guys, if you want Neel to continue to be on the podcast, brilliant.org/notoverthinking. Anyway. So Neel, we connected through a mutual friend. And you have a blog where you've written lots and lots of blog posts, which are all very well thought out. Most of them. And I referenced your blog in an email newsletter several months ago, and loads of people replied to that email saying, "Oh my god, I love this guy. This blog has changed my life" and all this sort of thing. So I thought it'd be really interesting to have you on the podcast to talk about some of the stuff that we're all mutually interested in. But in particular, the whole sort of effective altruism vibe, and how we can do good in the world in a sort of rational setting. But before we get into that, it would be great to hear your story. How did we get here? Can you give us a little bit of background about you?
Neel
Sure. So I guess I'm the kind of person who loves optimizing things, just, like, everything in my life. Like, I just find it really intuitive to find goals, to strategize, to plan things; I find it really exciting to, like, 80-20 things, find the clever hacks, find the ways to make things better. And I mean, I think a lot of your listeners will vibe with this. But I think I take this pretty far. Like, one of my favorite projects ever was noticing that I didn't really have close friends, and then trying to optimize my ability to, like, form emotional bonds and actually vibe with people. And this somehow actually worked. I'm still so confused about this.
Taimur
Wait, we have to dig into this. Can you elaborate a bit? How long?
Neel
Okay, so like, back when I was in school, I had friends, but I never really had close friends. And I never really realized that this was, like, a thing that could happen.
Ali
How are you defining close friends?
Neel
Kind of, like, people who I knew really well. Like, I know how they think, I know what matters to them, I know their fears and insecurities. I feel comfortable just hanging out with them, being vulnerable, and, like, sharing the things on my mind. And, like, we kind of get each other. Is that a definition you can vibe with?
Ali
Yeah, that makes sense. Just before this podcast recording, I was actually doing one of these plan-your-life exercises, and they were talking about close friends. And the metric they used is, I think, people that you would be comfortable ringing up at three o'clock in the morning if you needed something. And I felt uncomfortable with that particular definition, because I feel like I've got close friends, but, you know, I would generally feel uncomfortable about ringing someone at three o'clock in the morning. But anyway..
Neel
I love my friends, but I also value my sleep, and I'm more fun to be around when I've had it. So really, if they don't ring me at 3AM, it's an investment in our friendship. But anyway. And then I ended up in this, like, year-long relationship that ended during my first year. And when I was, like, getting over my heart being broken, all that jazz, I was thinking about what I missed from it. I think one of the things I really missed was that we were kind of actually good friends, and that closeness was a thing that had really happened. And so, liking optimizing things, I was like, "Okay, how can I fix this as, like, efficiently as possible?" And so I was thinking back on, like, the people I had felt close with, and people I had bonded with, and the common themes. And I said, okay, I think vulnerability, honesty, sharing personal things, and kind of actually putting in effort and supporting each other seem like the common themes. So how can I get more of this in my life? And the brilliant idea that 19-year-old Neel came up with was... so, you guys know the 36 Questions to Fall in Love?
Ali
Oh, yes.
Neel
The New York Times thing? So, I was like, okay, sharing personal things and being honest seem to be the key. And there are loads of these lists of questions online, of kind of deep personal questions that aren't the kind of things that normally come up. I think, like, half of them are crap, but half of them are interesting. So I made a shortlist of my favorites. And then I went to a bunch of the people who I was kind of friends with and said: "Hey, I'm trying to become close friends with people. Are you down for just, like, going for a several-hour walk, going through this list, and just trying to actually get to know each other?" Somehow, a bunch of them said yes. And then I had about a 50% success rate for, like, still being pretty good friends with them today.
Taimur
Wow. That's a pretty solid success rate.
Neel
I know, I still can't really believe this worked. And, like, nowadays, I'm a lot less artificial than this, and, like, I try not to do the standard-list-of-questions thing. But I kind of think of it like, I had this default way of doing socializing and friendships, where you do small talk and talk about safe things. And going through this list of deep personal questions just, like, broke me out of that, and put me somewhere completely different in the social landscape. And I tried that for a while. And that made me realize that, like, you can just kind of be vulnerable and get close to people way, way faster than you normally do. And a lot of people just really vibe with this. And I think that I just leveled up a ton in my ability to become close to people from that project.
Taimur
Oh, wow, that sounds interesting. Wait, so before that, was there, like, a fear of vulnerability, like you were kind of scared of being vulnerable with people? Or did it just not really come to your mind that, oh, I can, like, open up about something, or I can ask them about something? What was going on before that?
Neel
Yeah, mostly the second one. It just wasn't a thing that I realized I could do.
Taimur
Okay, right. Yeah.
Neel
It's like you're in a role-playing game, and in a social interaction you have, like, a few actions: chat, talk about the weather, give them a compliment. And "ask a deep personal question" wasn't on my action list. (laughter)
Ali
So what's an example of one of the... what's an example of vulnerability in this particular context? Like, what's something that you'd be more open to sharing now that you may not have been earlier? And just to give a bit of context: the first time we spoke, I also felt, like, an instant connection to you. Because, you know, I feel like we were both sharing somewhat vulnerable things. And you said something like, "Oh, you know, it makes me anxious when I have phone calls like this because..." I was like, "Oh, wow, that's such a nice thing to say." Because I felt a little bit anxious about that phone call as well, because I'd been reading your blog and stuff. And the fact that you put that feeling into words made me feel instantly at ease.
Neel
Yeah, fun fact: my level of anxiety is directly proportional to the number of YouTube subscribers somebody has. So that conversation was just off the charts.
Ali
..isn't it?
Neel
Yeah, I think, like, people tend not to talk about their anxieties, or insecurities, or, like, things that bother them, or just things going on that grind at you in life. And I think you can take talking about this a bit too far. Like, I don't really like being around people who only ever talk about the ways that life sucks. But just being open about a thing going on in your mind, that it's not a big deal, but it's just there and you're comfortable sharing it, I think goes a long way. Especially, a strategy I like quite a lot is: I kind of think about conversations as giving the other person hooks to dig into, into whatever they find interesting. So, like, in the standard "how's your life going" conversation, you just bring up the things that you think are genuinely interesting and want to talk about. And don't push on them, just leave them there. And sometimes the other person picks up on one, sometimes they don't. And if you just give enough hooks, you'll eventually get into a conversation you both actually enjoy. And I think a lot of people are kind of closed off by default, and it's hard to get anything to happen with them. And just implicitly saying, "Oh, here's a wide range of things I'd be excited to discuss with you. Pick the ones that you think are fun" gets you way more awesome conversations.
Ali
And this is an idea that I came across in a book when I was, I think, 18 or 19, so a similar age to you when you discovered this stuff. It was called, like, Charisma On Command. And the way that they described it, they call it Velcro Theory, which is as if you're providing lots and lots of velcro straps in your opening spiel, for, like, "Hey, what do you do?" or "How's your day going?" or "What's up?", for someone to kind of latch onto with velcro. And so I was like, "Oh, this is really interesting." And that advice I also started using when I would do interviews and things, and when coaching people in, like, med school interviews: you know, when someone asks you "why medicine?" or anything like that, just give them a few different velcro straps to hook onto, and then they'll just ask you about whatever they're most interested in.
Neel
Yeah, I love that. I think this is, like, one of the most useful things I ever got for just having good conversations consistently. And be willing to do the same with what the other person says. Like, notice the hints of "Oh, that's a bit surprising", or "That's a novel thing you said, but I don't think I quite agree with it, or I don't quite understand it", and just say, "Oh, that was interesting. Can I hear more about that?" And if you do that a few times in the same conversation, you've probably gone down a random path that you haven't gone down in a conversation before. So, anyway.
Ali
Nice. We were talking about kind of your story. So this was when you were, like, 19, when you sort of revolutionized your own social interaction by realizing that you can systematically become closer with people.
Neel
Yeah, I think it's also worth stressing that, like, the idea isn't that in each conversation you're aggressively optimizing all the details, because that doesn't really work. People think you're a bit weird, or, like, a bit cold. What you do instead is come up with, like, a good high-level strategy for finding situations that go well, and then in the situation you just kind of go with the flow. Or maybe go with the flow, but every so often you check a list of questions and, like, pick the next one, and then you just try to let things happen. And the optimization is that you put yourself in situations where going with the flow works well, rather than leading to an hour discussing the weather. But anyway, so.
Ali
Yeah, I think that's an important caveat to make. Because one thing that we've noticed on a lot of our episodes recently is that, you know, for example, when you're explaining this, Taimur and I know that you're not going to be militantly aggressive with this tactic, but someone listening to the podcast might uncharitably interpret your thing and say, "Wow, this guy's weird. He's going to be militantly aggressive." And so just having that slight caveat of, like, guys, you know, this is not about being super aggressive, this is about kind of using it and going with the flow most of the time, I think that's quite helpful.
Neel
Yeah, I find I get this a lot when I talk about the idea of optimizing. Like, a lot of people come up with reasons why optimizing is bad.
Taimur
People don't like it. They don't like it.
Neel
Yeah, and, like, actually, that's awesome. Because if you try to optimize things and you make your conversations really militant and aggressive, that doesn't work. And so if you're good at optimizing, you need to understand all the common ways that, like, an 18-year-old's over-optimized life fails, and figure out what you're going to do about each of those. And if you can address all of the common complaints, then you're probably doing pretty well.
Taimur
Yeah, you need some regularization in there, am I right? I didn't get it.
Neel
So, as you can probably tell, I like optimizing things. And a couple of years ago, when I was reading around things online, I came across this movement called Effective Altruism, which basically had the cool pitch of: I care about doing good; some ways of doing good are a lot more effective than others; so we should figure out what those are and do those. And for someone like me, who likes optimizing things, this just clicked. Like, yeah, of course. But I didn't really do anything about it. I kind of had it on an intellectual level, but not really on an emotional level. And I'm the kind of person who cares a lot about consistency. And it's kind of sad: like, if you have a value, and you never act on the value, is it actually a value? And in hindsight, I basically just spent several years procrastinating. But then, about two years ago, I was doing this classic student thing of trying to figure out what the hell I was doing with my life. Like, what is my meaning, what is my purpose? And, like, I like optimizing things, but if you don't have a goal, then it's just kind of purposeless, or you're optimizing for the wrong thing. And a lot of the obvious goals, like money or status or prestige... I could go and just try to get those, but I don't actually care about them, and it would feel kind of empty. And I knew that I cared about things like happiness and fulfillment, but I could get those in loads of different ways, so they weren't really a drive. And I realized that one of the few things that actually did seem worth making a central purpose was: I want to make progress on the world's biggest problems, and make the world a better place. And I could imagine putting a bunch of effort towards this. But I still had this reluctance: I just didn't feel, like, a strong passionate drive to do this.
Like, you hear about these people who spend their lives working with NGOs in Africa, in terrible conditions, making the world a better place. And I'm just not the kind of person who would be able to do that day in and day out. But the thing that clicked is I realized that I don't need this to do a lot of good. Like, there are ways of doing a lot of good without that sacrifice: I can find important problems to work on that I think are interesting, I can find the work that matters, and try to figure out how to shape the work to fit my motivations. And, like, I think the effective altruism movement was a really big part in this click happening for me. Because it gave me a lot of important ideas about what doing good means. But I'm also the kind of person who gets pretty socially influenced. Like, the saying "you're the average of your ten closest friends" is very true for me; I get all my motivation from being around people. And in Effective Altruism, I met this community of, like, awesome people who actually cared about making the world better, and were doing something about it. And making a lot of friends who care about that really made it click on an emotional level, in a way that just thinking about it didn't, and kind of added it to my list of RPG actions that I could take in my life.
Ali
Okay, so that's interesting. One thing I wrote down as you were talking is that you said that one of the tenets of Effective Altruism, or rather one of the ways that you were thinking, was that: A, I care about doing good; B, there are ways of doing good in the world that are more "effective", inverted commas, than others; therefore I should sort of optimize for those effective ways of doing good. And the thing I wrote down was this idea of "I care about doing good". This is something that we discussed with our mutual friend Lucy as well, on the podcast a few months ago: this gulf between the intellectual "yes, doing good in the world is a good thing" versus "this is something I actually care about on an emotional level". And so you said that when you started hanging out with people who were into this sort of stuff, it helped it click emotionally. Can you identify what it was about hanging out with these people that helped you emotionally care about doing good, rather than just intellectually care about doing good?
Taimur
Does it add, like, a social sort of status element to it? Where, you know, you get the kind of long-term deep and meaningful benefits of doing good and living by your own values, but you also get some, like, nice short-term benefits of making friends and people thinking you're a nice and cool guy, in the sense of finding your tribe and kind of participating in your tribe. Is it that, or is that a bit uncharitable?
Neel
That's definitely part of it. Like, one thing that often comes up when thinking about altruism is some people think that it just has to be this purely noble, selfless thing where you're sacrificing everything you care about to make the world better. And I just don't vibe with that at all. Like, I think what matters is that there are people who could be suffering who are not; there are, like, lives saved; the world is a better place. If the people doing this have, like, really awesome, happy lives full of pleasure and hedonism, that's great. This is fine. And I think that people are motivated by these base things like status and prestige. And if you can make doing good be the thing that gets you loads of status and prestige, but make sure it's actually doing good, not just looking like you're doing good, like, I think that's awesome. Which is a long preamble to say: yes, totally. Part of it is there's a sense of status and prestige, where a lot of my friends think that doing good as effectively as you can is awesome. Part of it is hanging out with people who think a lot about motivation, but think about it from the perspective of: I know what I want to be doing, I know the kind of person I want to be, someone who is able to just do what's best for the world. I'm not that kind of person, so how can I shape my life so I get closer to being that kind of person? And realizing that was a thing that I could do. I think another thing that's pretty important to my motivation, at least personally, is concreteness. Like, for a while, I couldn't imagine a life that wasn't being a maths academic, because I was doing a maths degree, I liked pure maths, I was good at pure maths, and I'd never really done anything else. The real world seemed far off and scary. And I think, in hindsight, what was going on is that it felt concrete what doing maths was like; it didn't feel concrete what doing anything else was like. Just going out into the world and trying things makes them feel much more concrete.
And if it's concrete, I can imagine a life where I do this. I'm actually currently on a gap year, doing some internships and things that I think could be really important for the world. And a lot of my goal here is just, by actually doing this stuff, to make it concrete, and at the end of the year ask: is this something that I am now excited about? Because you just can't tell if you've never tried something.
Taimur
There's a book that we mention in almost every single podcast episode called Aspiration by Agnes Callard. The book's basically about, like, how do we change our values? You know, like, previously maybe you didn't care about doing good; now one of your values is doing good; but what does that process of actually changing your value system look like? And one interesting example that she gives in the book is: okay, suppose you're in school, you're sort of like a music student, and you don't currently appreciate, like, opera or something. But you feel like there's probably something to this whole opera thing, and you want to gain the value of appreciating opera. Now, there's, like, a gulf between "I don't appreciate opera right now" and "I want to appreciate opera in the future". And I think a lot of people get stuck at this gulf because it feels inauthentic. It feels like: I don't appreciate opera now, so it's a bit phony for me to go to these operas and stuff and pretend like I like it, or whatever. But one of the points that Agnes makes in the book is that these value changes are a process, where there's, like, a gradient from not having the value to having the value. And so, you know, in order to appreciate opera, you will go to the opera a bunch of times. The first few times, you know, you might not actually get it, but eventually, over time, it might actually sort of grow on you. And so I think that's kind of about making the thing concrete, where you can't truly appreciate or really hold the value until you've, you know, done something towards it. You can't really think that you understand or appreciate what it's like to be a parent until you have actually taken some steps towards it, had kids.
And so, that sort of reminded me of the thing you're saying about kind of making these things concrete that are otherwise just like nice abstract ideas that you might like or you might like to like.
Neel
Yeah, I like that. I think another point with concreteness is, when talking about altruism, and, like, extreme altruism, the "I want to actually make the world better in ways that matter" kind of altruism, it's kind of easy to get a bit cynical and think of this as naive. And I think one of the reasons that happens is somebody is imagining some starry-eyed teen who's looking at the world and saying, "I will make this better", but has no plans for how to do this, and is then just going to pick the easiest available thing. And one of the things I really like about the effective altruism community is it can give me actual concrete plans: okay, here's a specific problem that I think really matters, and here's a specific way I think the world could be different such that this problem is less of an issue; and now I'm going to back-chain from that goal, and figure out what I could do to make this happen. And obviously, the world is complicated; you can't perfectly strategize your way to fixing it. But I think having some plans is, like, a really good antidote to this cynicism, or the nihilism where problems just feel overwhelming and unfixable. But I'm being kind of vague. Maybe it would help if I just talk a bit about what effective altruism actually is.
Ali
Sure, yeah. Put it in more concrete terms.
Neel
Sure. So I think the movement kind of breaks down into two parts. There's the intellectual project: I want to make the world a better place, I want to do this effectively, I only have so many resources, so much time and effort to give, so how do I do this, and what does that actually mean? And this is a really hard problem. Like, I don't know the answers; it takes a lot of effort. We have a bunch of tools from, like, science, economics, philosophy, and so on, and we can apply them to this problem. And the second part is the practical project of actually doing something about it, and a community of people who try to put this into practice in different ways. One thing I can't emphasize enough is that EA isn't... sorry, I'm going to abbreviate Effective Altruism to EA a bunch in this episode, because it's a long phrase. It's not, like, a single group or organization. It's a philosophy, and a community of lots of people, and lots of different organizations trying to make the world better in their own way. And given this is kind of complex and messy, I was thinking it might work to break this podcast down into: one, talking about the intellectual project and what the ideas are; and two, talking about the practical project. Like, if the listeners vibe with this and care about it, what could they actually do about it?
Ali
Yeah, sounds like a plan.
Neel
Okay, so I think the main interesting and novel idea is this whole idea of effectiveness. Like, I hope people are pretty on board with altruism, but I think it's not natural for people to think hard about effectiveness and calculation when thinking about how to improve the world. And I think the first step isn't thinking about effectiveness; it's thinking about why I actually care about altruism and making the world better. And for me, it's that I think there's a lot of pain, death and suffering in the world. I think the world is very much not the way it could be. I want to live in a world where everyone is happy, healthy and thriving, where human civilization will flourish into the long-term future, and our children and our children's children will have lives far better than the lives we have today. And this is fundamentally a statement about the world. It's not a statement about feeling good, or feeling like I'm improving things. Like, when I save a life, what matters is that there is a person who has, like, hopes, dreams, a story, and they are able to live that out, rather than that being abruptly snuffed out. And, like, the warm feeling I get is cool, and it can resonate with me, but it's not the goal. And this means that you need to be results-focused. Because if you can find a way to save more lives which doesn't intuitively feel as good, that's way better. Because the goal is actually helping people and saving people. So you need to be grounded in the consequences. And I mean, I think the key insight of being results-focused is just, like, a pretty generally applicable thing in life. Like, Ali talks a lot about evidence-based study techniques.
If you pull several all-nighters obsessively highlighting your notes, that is a lot less good than spending 15 minutes every morning doing your daily Anki and doing some spaced repetition, even if you're, like, working much harder in the first case. Because what matters is that you actually learn things and actually get results. And I think that this is, like, a lot more important for doing good than a lot of people intuitively think. And the two key reasons here are that there's a big spread in how much good different things can do, and that you can't help everyone, so you need to choose and make trade-offs in how you do good. So an example I really like of the big spread is this idea called the 100x multiplier. So there's a pretty robust result from sociology that people's welfare improves by about the same amount when you double their income, no matter how wealthy they originally were, barring, like, really rich people. Like, if somebody goes from living on $1 a day to $2 a day, that's about as big a deal as somebody on $100 a day going to $200 a day. And the world's poorest people are about 100 times poorer than the average person in the West. Like, the average income in the US is about $30,000, and there are a bunch of people that big charities help who live on about $1 a day, or about $300 a year. And you can kind of think of most interventions as basically equivalent to giving people money. Like, if you run a soup kitchen, that's kind of equivalent to giving people the money to, like, buy more food for themselves. And so it would take about $30,000 to double the income of the average person in the US, but with that money, you could double the income of 100 people in Sub-Saharan Africa. And there's this awesome charity called GiveDirectly that just literally does that: it takes the world's poorest people and just gives them the money. And these are the people who live on about $1 a day.
And personally, I think we could do even better than this; GiveDirectly isn't the place that I actually give money to. But the key point here is that the spread is massive. And when discussing aid, there are often a lot of objections people have, like, "oh, you're not respecting people's autonomy, you're creating dependency." But this is just literally giving the world's neediest people money and telling them: make your lives better however you think is best. And it's 100 times better than the average way of helping people in the West. And it's kind of hard to get your head around how big a deal 100x is. Like, in London, I spend about 1,000 pounds a month on rent. If there were 100x differences in the rental market, then shopping around and putting in effort could get me an apartment for 10 pounds a month. I spent about 2,000 pounds on a fairly typical MacBook; if there were 100x differences, shopping around could get you one for like 20 pounds. The normal world doesn't have 100x differences in things as trivial as buying a laptop. But saving lives has these massive differences in it. And it's easy to just do good by following your intuitions and following this warm fuzzy feeling of helping people. But I don't find this 100x difference intuitive, so following my intuitions means I'll miss out on things like this. And the second key point is that there are trade-offs, and the trade-offs matter. Like, if you think about something like time management, there are loads of awesome things I want to be doing. I could be doing this podcast with you guys, I could be reading a machine learning textbook, I could be hanging out with my friends, I could be doing tutoring and earning some money. I don't have enough time to do everything I want to do. And if I just say yes to everything that comes up, I'm implicitly saying no to things; I'm just saying no to the things that come along later.
I need to learn how to prioritize, because I can't avoid making a choice, but I can avoid choosing by default and just picking the things that come up first. And improving the world has the same key insight. You only have so much time, money and energy to help people, so you're always making a choice. But I want to make a conscious choice. And because the spread is so big, I need to put a lot of effort and attention into effectiveness. Sorry, end of spiel.
Ali
Okay. Yeah, that makes a lot of sense. I guess one of the key insights of EA, and you can correct me if I'm misrepresenting EA here, but one of the insights that kind of sold me on it several years ago, was the idea that you can actually put a value, like a monetary value, on saving a life. Value is probably the wrong word; more like a cost, like it costs a certain amount of money to actually save a life. And I think at the time, the estimate that I heard was maybe about $2,500, or £2,000, to save a life through giving money to the Against Malaria Foundation. Is that roughly accurate? Is that how you think about the saving of lives, in terms of monetary cost, as well, or are there other factors that make this more nuanced?
Neel
So, I think a common punchline in this interview is that everything is more complicated than it usually seems. So, yes. To flesh that out a bit more, there's this organization called GiveWell, who are an amazingly good charity evaluator. They do a lot of research into global health and global development charities, look at the evidence base, and try to do cost-effectiveness calculations of how much good those charities do. And one of their top charities is the Against Malaria Foundation, who give bed nets to children in Sub-Saharan Africa to protect them from malaria. And malaria is a really big deal. If you look at the entire history of human civilization, malaria is one of the biggest killers ever, and even today it still kills about half a million people a year. And bed nets are really cheap. And GiveWell have this calculation that it costs, I think, £3,000 nowadays to save a life. Obviously, this is an average, and you can't take it too literally, but I think it's useful, and it's useful for giving context. Like, the average house in London costs about a million pounds. That's about 300 lives. It really gives context to how we spend money. But I also find it helpful to think of this as a lower bound. Because to have a clean cost-effectiveness calculation like this, you need a really robust evidence base, and you need to be incredibly confident that your intervention works. And one of the really interesting debates in effective altruism over the last few years is how much you should prioritize having really high confidence that something works, with really robust estimates, versus being open to more uncertainty if things could go way better. For example, if you're comparing a climate change charity to an anti-malaria charity, it's much harder to figure out how a ton of CO2 converts into lives lost.
But that doesn't mean it's worse. A common criticism of EA is, "ah, you're only doing cost-effectiveness calculations, and you're not open to things like systemic change." And I don't think that's true; it's just really hard to do. But yeah, maybe it'd be good now to talk about what EAs actually care about, and how much of it is stuff like this.
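As a quick back-of-envelope check of the cost-per-life framing above (both figures are just the round numbers quoted in the conversation, not precise estimates):

```python
# Sanity-checking the "a London house is about 300 lives" comparison.
COST_PER_LIFE_GBP = 3_000       # rough GiveWell-style AMF figure quoted above
LONDON_HOUSE_GBP = 1_000_000    # round average house price quoted above

lives_per_house = LONDON_HOUSE_GBP / COST_PER_LIFE_GBP
print(round(lives_per_house))  # -> 333, i.e. "about 300 lives"
```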
Taimur
So before we move on to that, one more sort of philosophical note about this. I'm very sympathetic towards EA. But I think one thing that some people struggle with, and I struggle with this as well, is that if you take this completely utilitarian approach of trying to value a life and then optimizing your resources in order to save lives, the foundation of all of this is that you should care equally about everyone in the world. Right? That's what this whole thing stands on. And it's a very nice idea: if you actually care about everyone in the world equally, you shouldn't care about helping your family more than helping someone more deserving, all this kind of stuff. But I think there is something lost in not having a sense of a local community, you know? A lot of people derive meaning from close-knit local communities. And even though it is a nice idea to care about everyone equally, it feels to me like something is lost if you don't have a sense of the local community, and you're only ever focusing on doing the most good globally. I think a lot of people would have trouble with this. And from what I've read, I don't know if EA has a good way of marrying these two things. Because the thing I'm talking about, local community, is completely unquantifiable. It's stuff to do with meaning and things like that.
Neel
I hate when you can't quantify important things.
Taimur
Yeah. So it makes it really tricky. I mean, the way I kind of justify it in my head is: okay, I have this need for local community and sort of narrow meaning. And so I will be unethical and give money to the homeless man in London rather than give that money to someone more deserving, or I'll be unethical and do this thing which only helps people in the place where I live, which is obviously much more well-off than a bunch of other places around the world. I see that as, "okay, I'm not perfect, I'm going to do this thing," which is definitely not doing the most good, but I'm sort of optimizing for my own personal meaning. How do you think about that? Like, how does EA kind of think about this?
Neel
Yeah, so that's a really tough question. I think it is just obviously true that if I can either save 100 lives far away, where I never meet the people, or one life of somebody in the UK, I should save the 100 lives far away. Like, there's a right answer and a wrong answer to that question.
Taimur
Sure.
Neel
But I don't think that's really engaging with the meat of what you're saying, which is that this feeling of local community and meaning is important, and something is lost if we forget about that. Is that the key point?
Taimur
Yeah. It feels like some fundamental aspect of the human condition is lost if you completely ignore that, you know?
Neel
Yeah. So the way I think about this: I like to think about my life in terms of having budgets. For example, I do not donate all of my money above exactly what I need to live. Most people in effective altruism do not do this. And in part, I think there's a good argument for not doing this: if you run a movement where you say you're only legit if you're wearing a hair shirt and nothing else, then you're not really going to get many people on board. And the point is to be something that people actually want to be part of. But the second point is that the feeling of meaning, and my life not sucking, is kind of important to doing good in the long term. I really like the quote, "doing good is a marathon, not a sprint." If you're the kind of person who thinks about how to aggressively optimize every last thing, and you're constantly asking whether this moment of meaning is worth that many dead kids in Africa, then you're probably going to stop trying to do good after a few months and just burn out. And that means that over the course of your life, you don't do much good. But also, if you only ever help people in your community, you're not doing that much good for the world. So I like to think of my life in terms of having budgets. I'll think about the amount of time and energy that I want to spend on things to do with altruism, and I'll say maybe 80% of that goes to the things that I think are most effective, and 20% of that I just want to spend in the ways that feel meaningful to me. And you make that split once, and you never think about it again. Because you really don't want the thing where, every time you're thinking about where you can spend a pound, you're like, man, I could give 20 pounds to that homeless guy, but then I couldn't give it to a kid dying of malaria. I really don't want that decision every day.
Taimur
That's pretty stressful.
Neel
It takes so much energy. So one thing I really like: there's this thing called the Giving What We Can pledge, which I know Ali has signed, and did an awesome video about. The idea is you pledge an amount of your income, say 10%, to give to effective charities. And then you just say: okay, that 10% of my income is gone; I'm going to figure out how to spend it effectively, and effectiveness is the only thing I care about there. And then what a lot of people do is say: okay, 10% is the bar, I am giving that, and with the other 90% I can do whatever I want. If I want to fund a new playground for a local park near me, I can do that, but it doesn't come out of the budget for making the world a better place. And that's the best way I've found of balancing between these two. Which is kind of key, because I'm a mathematician; I want things to be principled, I want there to be a perfect solution. But it takes a lot of energy to think about these questions, and just drawing a line in the sand and saying "that's my budget" is the best solution I've come up with.
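Neel's "draw the line once" idea amounts to a tiny decision rule. Here is a minimal sketch of it; the function name and the example salary are purely illustrative, and only the 10% pledge figure comes from the conversation.

```python
def split_income(income, pledge_fraction=0.10):
    """One-off budget split in the Giving What We Can style:
    the pledged part goes to effective giving, judged only on
    effectiveness; the rest is yours to spend however you like."""
    effective_giving = income * pledge_fraction
    everything_else = income - effective_giving
    return effective_giving, everything_else

# e.g. on a hypothetical £40,000 salary with a 10% pledge:
giving, rest = split_income(40_000)
print(giving, rest)  # -> 4000.0 36000.0
```

The design point is that the split is computed once, up front, so the "homeless guy vs. malaria net" comparison never has to be re-litigated on every individual spending decision.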
Taimur
Yeah, I think the budget thing is a good practical solution. I do think it's a massive cop-out though, because it doesn't address the actual philosophical problem of, you know, should we be completely utilitarian and ignore this important part of being human? I don't think it actually addresses the thing, but I think it's a good pragmatic way to live day to day, and it's similar to how I do it as well, actually.
Neel
Yeah, I think the main answer here is that this whole thing of having a feeling of meaning is important to my long-term motivation and happiness. If I want to improve the world as effectively as I can, I need to be happy in the long term.
Taimur
Okay, yeah.
Neel
It's just a way of implementing that.
Taimur
I get that. I'm gonna continue to push back. I think that is still a cop-out, because you're essentially still denying that there can be another justifiable motivation. You're saying: we both think the only justified motivation is to do the most good, and I'm just trying to hack my life so that I can live long enough and happily enough to keep doing the most good. But you're still denying what I think is a fundamental human thing of local community, some local tribe. You're denying that, and you're saying that actually it's only the utilitarian saving of lives that counts, and I'm doing this budgeting thing in service of that. And I think EA doesn't sit right with a lot of people, because I haven't seen anything that really addresses this. It would be great if there was a way of saying: yes, the utilitarian approach of saving lives is good, and you should do that, but also there is some kind of, I don't know, call of duty or something to local community. And it would be nice if the framework could holistically allow both of those two things to exist. Whereas the thing I've heard from EA is that you sort of have to subscribe to the fully utilitarian thing of saving lives, and that's the only justifiable, valuable way to live, or whatever. Do you get what I mean? I find that a little dissatisfying, and I think it's something that other people struggle with as well. But I don't expect you to have a good answer, by the way; I'm not trying to push you to give me some grand theory about this.
Neel
Yeah, I think one of the things that's kind of messy here is that my mind runs on biological hardware that evolved over hundreds of thousands of years of living in tribes of less than 100 people. And this means I'm hardwired to think about the people in my local community. That's just a thing that's hard-coded into my mind. But the world we live in is so different: there are 7 billion people, many of whom live in conditions far worse than the things I see around me, and my mind just can't cope with this. And that leads to weird solutions, like the budget we were talking about. Another thing that might be slightly satisfying is the idea that what matters isn't whether I personally feel the feeling of local community, but that it exists. And if I can make the lives of many people better, such that they can have their own local communities, then maybe that's more important than me personally feeling it.
Ali
Guys, I think I've got a solution. I feel like this whole debate is basically the marshmallow test on steroids. Our human brains are hardwired to want the marshmallow right now, because of various things, but if we can delay that human urge for the marshmallow, and delay the gratification, we will get more marshmallows in the future. Similarly, our imperfect human brains are hardwired to seek meaning in the local community right now. And knowing that, we have to at least counteract that feeling slightly, by recognizing that, okay, I need to not weight it so highly, and use my intellect to counteract tens of thousands of years of evolution. It's just the marshmallow test on steroids.
Taimur
I think that's pretty similar to what Neel was saying, and I think it's the same cop-out. It's saying that the only justifiable, valuable thing to do is all the lifesaving, and, look, this is almost an axiomatic thing. But again, you're basically saying, no, it's not really justifiable to care about the local community thing; we need to fight this, because the only thing that's justifiable is saving lots of lives. I think a lot of people would feel that some aspect of local community, or caring, for example, about your family, which is like the most constrained form of local community, is almost a starting point. I'm not making some, you know, propositional claim that, oh, you should have a family. It's that a lot of people have a starting point of caring about their local community; they just see that as a starting point of what counts in life. And what you're saying is that, no, that doesn't count in life; what counts in life is saving more lives. So I don't think that really solves the problem. My point is just that it would be nice if there was a principled way to allow for both of these things, but I think, in terms of philosophically where EA currently is, there's actually no way to do that. And I think a lot of people have trouble with EA for that reason.
Ali
I mean, I guess it's similar to this thing I've been talking to basically everyone about for months: this idea of self-improvement versus self-acceptance, and how there is in fact no firm, Western, philosophical, i.e. logically rational, way of squaring the two, of striking this balance between self-acceptance and self-improvement. And in fact, if we take a more, quote, "Eastern" philosophy approach to it, where there is this yin and yang, there is this balance, then we actually don't have to fully reconcile these things. We can care about self-improvement while at the same time accepting ourselves; we can think about improving the world as a whole, effectively, while at the same time being invested in our local community. I agree, it would be nice if there were a firm sort of equation to satisfy this.
Taimur
Yeah, you can definitely care about both things. But I don't think we're gonna make too much progress on this. And in caring about both things, there is a bit of cognitive dissonance, because I think the EA philosophy really does require you to buy into the very utilitarian way of looking at things, where that's the one thing you value. But obviously, practically, you can definitely live your life caring about both things, and being okay that it's slightly inconsistent, or being okay with whatever, right? And I imagine everyone in EA lives like that.
Neel
Yeah, I think one thing people should take from this part of the conversation is that you can be an EA, and you can also care about local community or family. A lot of people do both. And while it can feel like there's this tension, you don't have to give up on the larger project of improving the lives of everyone. That tension kind of annoys me as well. But anyway, should I return to saying what the hell EA actually is? So the key insight that I'm going to hammer on again and again is this idea of spread: there is a big difference in how much good different things can do. And one of the really important ways this manifests is that I think of effective altruism as being about 80/20-ing doing good. Like, 80% of the effort gets 20% of the good done, and 20% of the effort gets 80% of the good done, so you should always be looking for the big wins, doing those, and just forgetting about the rest. And one of the biggest ways that manifests is this idea called cause neutrality. So a lot of altruistic movements are focused around a cause, like gender equality, climate change, protesting against a war, things like that. But I think the problem you're working on is one of the biggest determinants of how much good you do, because there's a really big spread in how important different problems are. So what effective altruists do is say: I just want to do the most good; I don't know which causes matter the most, so I'm going to be neutral between them and try to figure out which ones matter most. This is a place where that tension with personal connection comes up, where people who, say, lost their grandparents to breast cancer really care about breast cancer charities, something like that. The way I think about this is that I have a personal connection to people living and flourishing in general. It doesn't matter what the thing is that stops them living and flourishing; I just want the world to be better.
And it doesn't matter whether somebody loses a parent to breast cancer or war or whatever; I just want to make sure as many people as possible don't lose anyone. And figuring out the right cause is kind of hard, but a heuristic I find useful is to look for the groups who don't have power and don't have a voice, the people who are disenfranchised by society. Because you can think of the world as having some optimal way we would try to help everyone, but the people who can argue for themselves get a lot more power and resources, so the people who can't argue for themselves are going to end up with less than they should have. And three big groups are: the world's poorest people, which leads to problems like work in global health and development; non-human animals, which leads to areas like working on factory farming and clean meat; and future people, people who don't exist yet, who have no political representation, but where we're doing things now that could really affect them, like climate change, or not thinking about catastrophic risks in our society, like risks from future pandemics. Personally, I think future people is the one on the list that people find most surprising, but also the one that, in some sense, matters most. It's hard to estimate just how many future people there will be, but it feels pretty safe to say there'll be a lot more than are alive today. And if you look at any problem that affects the future, like climate change, it's blindingly obvious that the world today neglects their interests. And this combines with another bias society has: we don't pay a lot of attention to small risks of really big things, like, say, pandemics. And I think there are a bunch of small risks that could cause massive catastrophes that would affect people for generations to come, or even cause human extinction. These will massively affect future people, but society isn't thinking about them.
This leads to a more general insight that I find helpful for thinking about altruism, which is what I call looking for big deals. If you look at history, for the most part, history is dominated by a few events or trends that really matter, like World War I and World War II, or new technologies like the Industrial Revolution, computers or electricity, or medical technology like vaccines and sanitation. And when you only have so much time and attention, you want to focus on the things that have loads of leverage. When I think about big deals that could happen in the future, the things that really matter are these catastrophic risks: climate change, nuclear war and natural pandemics are the obvious ones. More speculatively, I think future big technologies could massively affect how civilization goes, like AI or synthetic biology. And I think 2020 has made it pretty clear that society does not think about these risks. One thing that EA cares about a lot, and that I'm feeling a lot more invested in now, is preventing future pandemics. People have been talking for years about this, and about how the world is unprepared. And one thing that I find kind of exciting is that there are actual things we could do about it. Like, there's a company called Sherlock Biosciences who are working on broad-spectrum testing, which means you could have tests where you put a sample from a patient on them, and they show you all of the different pathogens, viruses and bacteria in there, and let you know if there's something novel in it. Just imagine the world we could have had back in December 2019 if these tests existed. People come in with some weird pneumonia, you run all these tests: oh, there's something new here, that's kind of concerning.
Maybe we should lock down this neighborhood of Wuhan. This weird novel pneumonia goes through the neighborhood, it doesn't spread, and people in the West barely pay attention to the news story. That's the kind of world I want to live in, and I think that's a world we could live in. But it's not the kind of thing society is investing the resources in. And as an altruist, I care about that a lot. Does that make sense so far?
Taimur
Yeah, I'm on board with that.
Neel
Cool. There are two other tools that EA finds helpful. The second tool is that finding the best ways to do good is really hard. The world is complex and difficult to understand, so you're going to be wrong about things. And this means that you need to be putting in constant effort to have true beliefs. That's why we call it cause neutrality, and not "you should only care about future people, and that's the right cause, full stop": because we could be wrong. You need to think about tools like: what are your biases; using cost-effectiveness calculations rather than intuitions; testing your beliefs. One key thing for me is, when somebody disagrees with me, actually trying to hear them out, take them seriously, and understand why we disagree. Because sometimes when I do this, I'm like, "oh, I was just wrong, thanks for letting me know." And it's easy to just get defensive and be like, "No, you're wrong. I'm awesome. Everything I'm saying is true." And a third, more speculative tool, which people in the community have been talking about a lot in the last few years, is this idea that you should be willing to be ambitious and pursue uncertain strategies that could be really important. The idea here is that if we look at ways people have tried to do good in the past, most of the impact came from a few things that were really big deals, and the average impact per person was dominated by a few people who just did things that went amazingly well. So you want to try to be that kind of person. This is the same kind of insight as a venture capitalist funding 1,000 startups, thinking one might be big and the rest will fail. As an example here, do you guys know the story of Stanislav Petrov?
Taimur
Nope.
Neel
So he was, I think, a lieutenant in the Russian military during the Cold War, back in the 80s, and he was monitoring their nuclear warning systems. And the system told him there were five missiles incoming. And it was his job to raise the alarm, say "the US is launching missiles at us, we need to launch back," and basically start a nuclear war. And he said, "Wait, why would they send us five missiles rather than thousands?" And he just didn't do anything about it. And this turned out to be sunlight glinting off clouds, and a bug in the system. If he had raised the alarm, there's a realistic chance it could have led to World War III, which could have killed hundreds of millions of people directly, and possibly billions more, from the ash from burnt cities blocking out the sun and causing what's called a nuclear winter, which could destroy global agriculture. And the thanks he got was that he was kind of demoted and informally punished, because it was really embarrassing, and it was all massively hushed up until after the Cold War. And I think he saved millions of lives.
Taimur
That's so crazy.
Neel
And the thing that's kind of terrifying about the story is that the nuclear weapons were on launch-on-warning, meaning: when we think something bad is happening, we're willing to launch back to retaliate. And the US's and Russia's missiles are still on launch-on-warning. China's are not. China says: if you actually land nuclear weapons on our soil, we will launch back at you; we have enough missiles that this will still fuck you up, but we're not going to risk causing World War III over a false alarm. And one person was able to do this much good. It's crazy.
Taimur
That's a great story. I never heard that before.
Ali
Yeah, so what are some practical ways in which one can go about trying to become the next Petrov?
Taimur
[UNCLEAR]
Neel
Yeah, I mean, becoming the next Petrov is hard. I'm pretty excited about more EA-minded people going into things like the military and being in important positions like that, but I'm not sure that's the best thing most people can be doing. So the main thing that I think about is, again, how can you 80/20 doing good when it comes to putting this into practice. And again, this means identifying the big things that really matter. The slogan I quite like is "Get the big things right," and don't sweat the small stuff. A lot of the good is in the big things, but it's easy to burn a lot of energy stressing about the small stuff, like, am I recycling right?
Ali
Am I saving money on this latte?
Neel
Yeah, exactly. Like, am I getting the cheapest groceries so I can save a bit more to donate? You'll put loads of energy in, and it doesn't really matter. So, a pretty common misconception is that EA is only about making donations, and only about evaluating charities and finding the best ones. The way I like to think about it is that my life is a collection of resources, of which money is one. And with the actions I take, I choose how I allocate those resources to change the world: money is one, but also time, energy, my social capital, my productive labor hours. And for most people, the most important resource by far is their career. The average career is 80,000 hours, which is just an obscenely long time. Like, have I even lived 80,000 waking hours? Let's see, I'm 22, so...
Ali
Quick maths coming up.
Neel
It's a ridiculously long time, which is the important part. And you can think of donations as just converting your career into money, and then figuring out where to allocate that money, which can be a good way to use your career, but isn't obviously the best way. Another thing that's kind of funky when you realize you have 80,000 hours in your career: if we think there's a more-than-100-times spread in how important different things can be, it makes a lot of sense to spend a lot of time prioritizing, thinking and planning. If you spend 10% of 80,000 hours planning, that's 8,000 hours, which is about 200 work weeks, or about four years of full-time work, just thinking and planning. Now, I don't intend to spend four years of my life thinking and planning about my career, but that would actually be a pretty reasonable thing to do; your life is really long. Obviously, spending that much time thinking about your career is kind of intense. I think there are some people who will hear this stuff and be like, "oh my god, effective altruism, the movement I didn't know I needed, I want this to be a big focal point of my career," and some people who won't. And there are lots of smaller things you can do, like thinking about ways to give money to effective charities; [UNCLEAR] is a great resource for this, to be clear. But I think that figuring out whether you can do a lot of good with your career is the first thing to think about. And one mistake that I made for a while is that I thought of my career as: I can do the things that will make me happy, or I can do things that are good for the world, as if it's this one-dimensional spectrum where these things trade off against each other. And, like, I'm not the kind of person who can drive themselves day in and day out with the feeling that I'm doing the most good.
Like, I'm just not that good a person — though I have some friends who are, and they are awesome people, and I respect them a lot. And one of the things that clicked is that motivation and personal fit are really important when choosing the right career. Like, you can't do high-impact work doing a thing that you hate, that you just think is good for the world, so you force yourself into doing it. I think this is a mistake I see a lot of people make, whether they care about doing good or not. They just think about their career in terms of: what do my parents think I should do, what do the people around me do — like first-year mathematicians at university who want to be maths academics, or 16-year-olds who want to be a doctor or a banker, or whatever. And I think this is crazy: the right thing for me is not the right thing for the person next to me. Like, say, Ali — you're clearly really good at this whole YouTuber, being-an-internet-personality thing; people listen to you. This means that the best way for you to improve the world is probably not the best way for the average person to help the world, because you have this platform and leverage and all these people who listen to you, and you can spread powerful ideas — while for somebody like me, I do not think this is my calling, and I should go and, like, solve some hard maths problems in ways that are important for the world. And another thing that kind of clicked is that you can shape the way you do things to be more or less motivating. Like, I've already talked about how I'm pretty socially influenced, and having a community around me is really helpful. I think this mindset of motivation hacking is just a really good thing to practice in general. Like, back when I was at uni, I was a pretty neurotic person, so when it was exam time, I knew I was going to spend a bunch of time preparing for exams. But that's kind of boring.
So I decided to set myself the goal of figuring out how to make it something that was actually fun. And then I pursued strategies like running a public revision lecture series, or writing up notes that I thought were actually good — figuring out some intuitions and publishing those for people — or giving informal tutoring to my friends. And I still achieved the goal of learning my course well and doing well on exams, but that term was also possibly the most fun time I had in my entire time at university. And I was just spending it learning maths as well as I could. I mean, these are just the things that I personally find motivating, because I like social things, I like teaching, I like people. They're going to be different things for each person. I wrote a blog post called Live a Life You're Excited About — hopefully you can link it in the show notes — that tries to give some thoughts on how different people can find the things that make them motivated and excited. But the key insight I want to convey is that you can look at the things you're doing, figure out what you find exciting and motivating, and shape the tasks to satisfy those. Like, I get really motivated by the feeling of making progress — I like ticking boxes in a checklist — so every day I would make a checklist of what to do that day, and that makes it more fun. And doing these things can make a massive difference, to the point that if I find a career that I think is really important for the world, that I could be good at, then I can make it into something that I would actually have a good time doing. I think anyone listening to this who thinks this might work for them should really spend a lot of time trying and experimenting. Because if you can get motivation hacking right, it's, like, insane. And a slogan I kind of like for thinking about this whole thing of altruism is "You don't need to be a saint to do good."
You don't need to be the kind of person who feels this deep wellspring of passionate altruism and warm, fuzzy feelings for the world. Like, it's really awesome if you feel this — a bunch of people genuinely do feel this, and this is great. But if you don't feel this, you can still figure out how to shape your life so that you do things that are important, and have a good time, even if the motivation just comes from, like, I really like my coworkers, or these problems are fascinating. What matters is that the world is a better place as a result of your actions. There's a quote I really like to kind of tie off the section: "Courage isn't about not being afraid. It's about being afraid and going ahead anyway. Similarly, caring isn't about being overwhelmed by emotion. It's about not feeling a strong emotional drive, and doing the right thing anyway."
Taimur
That's great. Love it.
Neel
Yeah, that's from an article I find really motivating, called On Caring.
Ali
Oh, yes.
Neel
Ali mentioned that in a previous newsletter.
Ali
Great article. Yeah, I used to think you had to feel, for example, the way [UNCLEAR] does — i.e. sort of genuinely feel the suffering of people — to, you know, feasibly do good in the world and give to charity, or I'd think, "Oh, I'm just not that kind of person." Then I read the article and I was like, "Damn, okay. I don't need to be that kind of person. I can just do it anyway, because it's the right thing to do."
Neel
Yeah, one thing I find kind of helpful, rather than thinking of motivation as this big lump, is separating it into drives and values. So a value is, like: if I look back at my life 10 years from now, what things will I actually care about having achieved and done? And a drive is, like: what keeps me motivated day to day, what do I get up in the morning to do? And for me, altruism — doing good — is a really, really cool value. And it's kind of a drive: like, I feel happy when I do good. But I have lots of other drives, like social motivation, status, solving fun problems, the feeling of excitement and novelty. And what matters is that I achieve my values; my drives don't have to be the fact that I'm achieving my values. Like, I think a lot of people listening to this probably care a lot about productivity and getting shit done. And one way you can hack your drives to do this is by listening to podcasts like this, or following Ali's YouTube channel — just, like, listening to people you respect talking about this, putting yourself in the social orbit of people who care about that. And that can achieve the value of doing lots of work without you really, deep down, having the drive to do lots of work.
Ali
This is really, really good stuff. I definitely need to chat to you about this offline as well, because a lot of this stuff is making up some of the chapters of my upcoming book — although I say upcoming, it's a long, long while away — about meaningful productivity. One of the main spiels in the book is going to be this idea of — we've actually got a whole section about hacking motivation — how can we align our wants with the things that we have decided in advance are goals worth pursuing, or things that are meaningful to us. On that note, one thing that I've been thinking a lot about is values, and you seem to be quite clear on what your own values are — at least, that's how it comes across. Do you have any ways that you find helpful of thinking about or identifying this list of values?
Neel
So, coming up with your values is, like, really hard. But one idea that I've been kind of toying with over the last few months is: when you have a hard problem, just be willing to spend a few hours working on it. Like, I get productivity coaching, and a few months ago I was talking to my coach about how my biggest problem is I just don't have goals. I don't know what I'm aiming at beyond this kind of high-level thing of making the world a little [UNCLEAR], but, like, what does that actually mean? And I realized, talking to her, that I'd never actually just tried solving this. So I blocked out two hours in my calendar, opened an empty document, and just started trying to map out what my values and goals were and solve the problem. And it felt like I made some decent progress. You get into this mindset where you're not trying to find your perfect, fully formed goals — you're just trying to make progress and be a bit less confused. And, like, I go back to the document from time to time and edit it: "I don't really agree with that anymore." But just actually trying to make explicit the fuzzy stuff in my head is really helpful. And I find quite often, when I'm chatting about things with friends, they mention some big problem, like "I don't know what I'm doing with my life", and I say "Have you ever just sat down for two hours and [UNCLEAR]", and lead them to doing that. So, like, it's been pretty useful.
Taimur
Yeah, I think being comfortable just sitting down and thinking, as an activity in itself — it's a bit weird. I think we're generally not used to that. Like, thinking is stuff that happens on the side while you're doing other things, but thinking as an activity in itself is, yeah, for some reason, weird.
Neel
Another hack I'm a really big fan of is five-minute timers. Like, the way I would start a two-hour block like this, which is really intimidating, is to set a five-minute timer, and spend those five minutes listing things I could do to make progress — like, get some momentum. And I find this — like, I was really skeptical when I first heard the hack of "just set a five-minute timer and try solving this big problem", because it seems like it can't do anything to help. But I tried doing it and saw: holy shit, this really works. I think there's this mindset called learned helplessness, where something feels difficult and off-putting, and you just don't think about it. You don't want to think about it, and you procrastinate. But if I set a five-minute timer, then I now have this sense of urgency: I just need to get something done in five minutes. And I don't have time to be a perfectionist or think it's impossible — I just have to actually try for a few minutes. And a reflex I'm trying to build is: any time I'm ever complaining about a problem or feeling stuck, just set a five-minute timer. If I can't solve it in five minutes, I've lost nothing; half the time, I have a good idea. One thing: I've now bought a physical timer, and it's on my desk, and it's great, because when I have this feeling, I can just [UNCLEAR] and twist the dial, and it's now set to five minutes. It's just completely effortless.
Taimur
Alright, that's a great idea. I'm gonna order one.
Ali
Five minute timer.
Neel
Yeah, it's like, you twist the dial, and then you have a ring of plastic that gets slightly smaller. They're called Time Timers on Amazon, for some reason.
Ali
Oh, Amazon has a series of hourglasses. [UNCLEAR] Five-minute hourglass. Now I need to find the perfect one that goes with my desk setup, obviously. Because you can't be having, like, a bright red one. Minimalist hourglass.
Neel
If you want, we could get one custom made that's, like, exactly the decor of your desk.
Ali
Mate, this is actually genius. And then it's also a tax-deductible business expense. Am I right?
Neel
I love it. But yeah, another thing that I think is kind of helpful is just talking to people. Like, I'm a pretty extroverted person, and I find that a lot of my thinking is done by talking to somebody — I just like bouncing ideas off of friends. Like, having the awkward silence and the pressure [UNCLEAR]. Awkward silence is, like, a really good way to be creative.
Taimur
Yeah.
Ali
Yeah, that's interesting. So, in my quest to write this book, I've been trying to write 2,000 words each day since the start of the year. And it worked well for the first six days, but for the last three I haven't made much progress. So I thought, you know what, this morning I'm going to do the thing where you go for a walk and record a voice note — where I sort of say my thoughts out loud, and then use Otter or Descript to transcribe it and stuff. And I found that, like, it's just not the same when you're alone with your thoughts. And I was thinking it would be nice if I could ring someone up, record the conversation, and be like, right, let's talk about meaningful productivity. And I'm sure we'd make some progress in, like, half an hour of work or something like that.
Neel
I mean, hit me up.
Ali
I will add you to my list of people.
Neel
I mean, I'm kind of bad at being available, like when I get randomly called. There's a chance.
Ali
I'll send you a Calendly link.
Neel
Excellent. If you want to schedule in a meaningful productivity chat.
Ali
Yeah. Fantastic. Speaking of, this week was the first time I used a Calendly link to schedule a date.
Neel
Oh, I love it.
Ali
Yeah. But I just thought I'd share that — it's like a win for the week.
Neel
I love it. Oh, yes — one of my recent experiments for trying to optimize my life, my dating life, is I now have an anonymous "go on a date with Neel" form on my website.
Taimur
Oh, wow.
Neel
So far, the few responses have all been jokes. So, anyone listening to this...
Taimur
We'll definitely link to that page of the website. Probably we can get you some inbound leads.
Ali
What are your qualifying questions on this form, if any?
Neel
Let's see. So I try to keep it reasonable. I'm trying to balance between [UNCLEAR] and complete trolls. Other than that, just, like, boring questions.
Ali
Oh, trolls?
Neel
What are you looking for in your romantic life right now? And I link to a blog post where I talk about how I think about good friendships — because I think that's also what makes good relationships. Why did you decide to fill this out, or, like, what about me resonates, why are you interested in me? And what kinds of things do you find exciting to think about, or talk about?
Taimur
I think a big benefit of having some kind of public self online, where you're broadly authentic — in line with the way you actually are in real life — is that, you know, if, for example, someone reads your blog and they really like the posts and the way you think, then chances are you'd probably get on with them. And so it's actually a great filtering mechanism for this kind of stuff.
Neel
Yeah, I think there's also this annoying thing with romance where, if two people are interested in each other, there's a big barrier to saying anything — because if the other person isn't interested, that's kind of awkward, and it can do damage to the friendship, right. And this is just a good way to publicly signal, like, it's chill.
Taimur
Yeah.
Neel
Hit me up — if things don't work out, I don't care, I still think you're cool. Yes. Anyway, other than advertising my love life, maybe it would be good to give a few more thoughts on exactly what people should do if they want to use their career to make the world better.
Ali
Oh, yeah. Great. What's our call to action here?
Neel
Yeah, well, so I suppose I just want to talk about careers generally, altruistic or not. Because I think careers are really important, and I see a lot of people making mistakes. I think the two biggest mistakes I see people making — especially students and young people — are: one, just never thinking about it; and two, getting really stressed and overwhelmed about it. So I think it's crazy to never think about it, because your career is such a big portion of your life — it's, like, 80,000 hours, it's insane. And, more importantly, I think you can productively spend time when you're young to make the rest of your career better. But I think it's also kind of understandable to feel stressed and overwhelmed about it. There are people who are like, "yes, I get the argument, it's a really big deal" — and I completely empathize with this, because it stressed me out too. But I think this is a classic case of having unrealistically high standards for yourself: people anchor themselves to thinking that they're failing if they haven't perfectly solved this problem. And I kind of slip into this, like, "I don't know what I'm doing with my life — I'm 22, I've graduated" — and then I sometimes chat with my friends who are 10 years older than me, and they say, "What the hell are you talking about? I don't know what I'm doing with my life. This is completely fine." But the solution also isn't to just give up on this. It's not a binary of confused or not confused — it's a spectrum. And the less confused you are, the more you can productively think about this. I find it helpful to think of it as an opportunity. Like, most things we ever do just don't matter in the long term.
Like, when I look back on what I was doing a year ago, most of it I just don't really care about. There are very few ways to trade time now to make the rest of my life better, and it's a pretty short list: forming close and meaningful friendships, learning things, working on myself, and figuring out what I'm doing with my life. So this is just an awesome opportunity to make things better. And I think you can do things productively. There are, like, two key ways I think about this, again engaging with this whole thing of confusion. So first, becoming less confused. I like to think about careers in terms of how can I gain as much information as I can — like, it's a puzzle. The world is big and overwhelming, there are better and worse things I could be doing, and I want to figure out what these are. So I need to be grounded in actual data about the world: run tests, try things, and gain data. I should just run lots of tests. Start with cheap things like talking to people and asking for advice, scale up to more expensive things like doing projects, even more expensive things like doing internships, even doing jobs. I think you should think of your first job as just "how can I gain as much information and data as I can". And note that by career, I don't necessarily mean getting a soulless corporate job, or a thing you study and apply for. Like, I think what Ali is doing with being an internet personality, or what Taim is doing with a startup — these are both careers in the sense of what you do with your life and your productive energy. And for some people this kind of lifestyle is a great fit, and for some people it isn't — you can run tests to figure this out. I also think people should be a lot more willing to ask for help. A recent blog post of mine was just about how I personally suck at asking for help, and [UNCLEAR] is more like — I've got so much value from asking mentors for advice.
And, like: What is your job like? What things do you think I'm currently missing? How should I test it? What should I be thinking about? Like, I'm confused to the point that I'm currently on a gap year, just doing a series of internships to get more data and see what different things are like. And the second thing is, if you're really confused, what you can do is be robust to the confusion, and do things that are just good no matter what your long-term goals are. I think one of the best ways of doing this is what I call becoming awesome: trying to gain skills, learn things, and become better. I think it's especially good to focus on the meta-skills, like productivity, social skills, communication, how to learn things better. Like, in my university experience, I probably put more effort into self-improvement and getting better at learning than I actually did into learning maths, and I think that paid off pretty well. Investing in things like this just pays off over the rest of your life. And if you actually spend time being strategic about this — what skills you're bottlenecked on, how you can get practice and get better at those — and then actually do it, it pays really good dividends. 80,000 Hours, who are an effective altruism organization specializing in career advice, and have loads of other awesome ideas, call this career capital. The idea is that one of the big variables explaining how you do over, like, a 40-year career is just how many skills you gain, especially early on, and that you should focus on this over anything else, like salary or prestige, because it pays off by putting you in a better position later. Cal Newport's book So Good They Can't Ignore You is also pretty great on careers and hammers this point home a lot. So that was careers in general — how people can do things better. Then there's if you specifically care a lot about doing good with your career — and again, your career is one of the biggest things you have for shaping the world.
So if you care about altruism, think about [UNCLEAR]. So firstly, there's an organization called 80,000 Hours — they're really, really awesome, and their career advice is way better than anything I can say in the next 10 minutes. They have an article series called Key Ideas, which we can link to in the show notes, and people should just go read that — it's, like, years of evidence-backed research. One of the annoying things about trying to give career advice on a podcast is that everyone's journey is different, and everyone's optimal career path is different, so everything I say ends up being a bit generic. So — yeah, I said this one already: motivation and good fit are really, really, really important. There's a lot of spread between different careers in what they do, but also between different people on the same path. Like, the best researcher does maybe 10 to 100 times as much research as the average one, and you're not going to be exceptional unless you really care about what you're doing and you're actually motivated by it. So that's really important to pay attention to. And, like, doing good is a marathon, not a sprint — you don't want to burn out. I think it's easy, when you start thinking about doing good, to only think about sacrifice, and I think that's really wrongheaded. I also think it's easy to have a limited view of altruistic careers. Like, when you ask most people what kinds of careers do good, people are gonna say things like being a doctor, volunteering with a charity, going and working for an NGO in a developing country. And I think that's all cool, but there are a lot of other things. A brain dump of some problems that I think really matter: biosecurity — thinking about future pandemics and how to prevent them or prepare for them, where you can do things like work on the biology and biotechnology, or try to influence policy to make it better.
Thinking about other emerging technologies, especially AI, which I think is probably one of the biggest deals that's going to happen in the next century. I think there are a lot of important technical problems here that can be solved, to get systems that do exactly what we want, rather than just things that kind of do what we want. There's a lot of important policy work too. Like, if half of all jobs are going to be automated in the next 50 years, how can we make sure that this is good for the world, rather than just creating an even more hyper-rich class? And how can the world adapt to these big social trends? Doing research into how to prioritize between problems, and which important problems we're missing — this could do with, like, talented economists, policy researchers, social scientists. Working on global health — just earning a lot of money and donating it can make a really big difference here. Working in policy, working for NGOs, working on global development — like, being an economist working with governments in poorer countries. Making things better for factory-farmed animals — like, going and campaigning and talking to companies, or working on technologies like clean meat. Working on problems like climate change — I think one of the big bottlenecks here is policy. Another thing I want people to work on: a lot of the risk is dominated by the really bad scenarios, where we have a lot more warming than we expect — like, rather than two degrees we get, like, six degrees — but very few people research that stuff, extreme climate change. A lot of these problems are kind of underpinned by having a messy political system, where politicians want to look good for the next five years rather than thinking long term, and people don't use best practices from forecasting. Actually getting political systems that make the world better would be awesome, and I don't know what needs to be done to make that happen.
But I think there's good work being done there. The other thing is, you certainly don't have to, like, directly do research on these problems. Earning money and donating it to organizations doing good work is awesome. Doing outreach and getting other people to care about these problems is awesome. Being a kind of more meta person — being a manager or doing operations work at an organization doing a lot of good. Or generally just spreading important ideas in the world — like being a journalist, or, say, being a YouTuber who talks about important things in his videos. Yeah, I could ramble about this for ages. One thing the listeners can act on: if you're feeling overwhelmed by all of the options, a good way of getting traction is to sit down and try to figure out which problems you think are most important. 80,000 Hours has a bunch of good write-ups on different things. There are also a lot of problems they haven't had time to think about, or to think hard about, that could be a big deal, like climate change. Find the ones that you care about the most — which is a pretty personal question — then think about the different ways you could work on each of these problems, and see which of them might fit you well. And then think about what you could do to get more information about this, and, like, explore and run tests. That was a long ramble, so to recap the key points: careers are really, really important — they're a big chunk of your total impact on the world. They affect both your happiness and fulfillment, and also just how the world is different. So this is one of the best things to think about, and I think it's useful to see it as an opportunity, not a duty or an obligation. You can orient towards getting information, to become less confused. And you can focus on gaining skills and career capital, even if you are confused, as a robustly good thing.
And if you care about altruism, I would recommend picking out the most important problems, and then thinking about which careers could help with those — ones that are high impact and might fit you well. And you should go read what 80,000 Hours says about this, because it's far better than anything I'm saying right now.
Ali
Amazing.
Taimur
We will definitely link to 80K in the show notes.
Ali
Yeah, I was taking notes throughout of all the other things that were mentioned. So we will link to as many of the things as I can decipher from my handwriting, along with Neel's blog, and 80,000 Hours, and all the other resources that we talked about in the podcast. So I feel like that's a good place to wrap things up.
Neel
Yeah. I think if there's one key message you take from this, it's that if you're like my past self — you think doing good is important, but you're doing nothing about it — you can actually do something about this. And you don't have to be an amazingly good person to make improving the world a big thing in your life. And you can do this while having an awesome life. And if you vibe with this, and you vibe with the way I think about things, come talk to effective altruism people. There are lots of little effective altruism groups across the world, especially in big cities and universities — I can link to a map of the different groups across the world — and there's a bunch of online events nowadays. So if you want to come hang out with people, and see if that's the kind of people you'd want to spend more time with, I think that'd be great. Also, check out 80,000 Hours, and that book for careers. Finally, I think EA is a kind of messy, complicated topic, and lots of people disagree about it. And I really hate it when I'm listening to somebody talking about something I know a lot about, and I'm like, "They got all of these things wrong, and this is terrible." And this happens all the time with people talking about effective altruism, and I really don't want to be that guy. So there was a great article recently about misconceptions people have about effective altruism from popular portrayals, and I was thinking we might link to that in the show notes, and people can check that out. And then if they left this podcast thinking something, and the article says it's a misconception, it's completely my screw-up.
Taimur
I think I read the article when it came out a few weeks ago. Yeah — I've seen lots of discussions about EA on Twitter, and it seems like there's, like, a cabal of EA haters, where, whenever I see a thread on Twitter, I feel like EA is being completely misrepresented, and people are sort of hating on a thing that doesn't actually exist. So, yeah, I think that article is really good.
Neel
Yeah, I think a lot of EA ideas went viral, like, about eight years ago — go to Wall Street and earn loads of money, and give it, like, [UNCLEAR] back to charities, with all the issues that global aid has, and never think about things like high-level change.
Taimur
Yeah.
Neel
Don't think about emotion, be a cold, calculating person — and [UNCLEAR] to not fall into any of those traps here.
Taimur
Yeah, for sure.
Neel
Yeah, please email me.
Taimur
Yeah, we'll link to all that. So where can people find you online, Neel? Your website is NeelNanda.IO — is that right?
Neel
Yep. That's my blog. I have about 100,000 words of random essays on there, mostly to do with rationality, productivity, motivation, and social skills. I think preparing for this podcast has inspired me to write some more things about EA, because it makes me actually work out my thoughts on things. And I have a top posts section on there, with, like, my favorite places to start. So if people want to hear me ramble even more, you should go check out my blog.
Taimur
Neelnanda.io and I will drop a link to that as well. Sweet.
Neel
Email me or fill out either of my contact [UNCLEAR]
Ali
Yeah, man. Thanks for coming on the podcast, Neel. Usually we end with reading a review. So Taim, do we have any?
Taimur
We had a really scathing review.
Ali
Acceptable to read out loud? I think it's acceptable to read out loud. It's scathing towards me, so I can't imagine anyone else objecting to this. Alright, so this is a one-star review from Maya_94 in Great Britain, entitled "Disappointing episode". Maya says: "I've been a listener of the podcast since its infancy. The latest podcast episode, Misogyny, Generalization and Controversial Topics, was hard for me to listen to. It was poorly conducted, and throughout the majority of the conversation there was a defensive undertone by the hosts, especially Taimur. It felt more like a Year 9 English class debate rather than a discussion of the topics, misogyny and generalization, in an open, non-judgmental environment. I expected an intellectual and well-researched episode, but what was presented to us was the unedited ramblings of both hosts. Both Ali and Taimur constantly interrupted the guest Sheen, dismissed her opinion, and told her on multiple occasions that she was uncomfortable with the topics of discussion, even though she had stated that she was happy discussing the topics — which I find extremely ironic given that the podcast is titled Misogyny. Once Sheen had stated to Ali why generalized statements cannot be made without context, instead of listening to why biases and generalizations are detrimental to society, Taimur interrupted Sheen to defend his brother, repeatedly stating that she was taking the conversation somewhere it didn't need to be taken. I could not finish this episode — I had to end it at one hour 32, when another dismissive statement was made by Taimur. Although a lot was discussed, nothing was of substance. I kept shouting, 'just listen, listen to the guest'. This could have been a really productive and useful episode, but it just continued to resemble a car wreck as the episode went on." Pretty scathing. That might be the worst review we've ever had of any episode.
Neel
[UNCLEAR]
Taimur
Yeah, so I think last week's episode is probably the one that's gotten the most responses, and probably the most polarized responses. On one side, we had a bunch of emails and DMs from people who also said this thing about "I could not finish this episode, it was too frustrating", but because they felt we weren't pushing back enough against some of the stuff Sheen was saying. A lot of people were, I guess, in favor of our point of view and where we were coming from, and they felt like we weren't properly trying to justify it. And then a few people were in the camp of this reviewer, where they were less sympathetic to our view, and they felt like we were too entrenched in our own views.
Ali
Yeah, it's very interesting just how polarized it was.
Taimur
Yeah. And even on specific points. So for example, the review says I was interrupting Sheen and dismissing her. I actually messaged Sheen after seeing the review and asked her if she felt like I was interrupting her and dismissing her and things like that, and she said she actually didn't feel that way at all. And it's interesting what this review brings up. There was one point where I asked Sheen if she felt attacked by something, and I think she said she felt a little bit attacked. The person who wrote this review highlighted that as a bad thing, as if I was trying to claim she was uncomfortable when she wasn't. And we actually had someone else, in an email or a tweet, I can't remember exactly where, who said it was good that you guys were actually a bit empathetic and could tell that at some points Sheen was even more uncomfortable talking about this than you guys were. So even on a higher level, it's very polarizing in terms of broadly how it was conducted, and then on very specific things, different people have had totally opposite takeaways and interpretations of particular moments in the episode.
Ali
So clearly, gender issues are something we need to discuss a little bit more on the podcast.
Neel
I thought you handled it well.
Taimur
No, I think it's a tricky topic to handle well. Yeah, I think we didn't really get to the bottom of anything. A lot of the episode was spent trying to get on the same page about things, and I think we eventually did get on the same page about a few of them. But yeah, there's definitely more discussion to be had about the topic.
Ali
Indeed. So watch this space. And if you're not subscribed to the podcast already, you should hit that subscribe button so you get new episodes downloaded to your phone or podcast player, wherever you get these episodes.
Taimur
Cool. And then the final thing is that this week we actually kicked off the Not Overthinking members community. It started off as a WhatsApp group, and then we very quickly switched to Slack, because it's very hard to have multiple conversations on WhatsApp and for a lot of people to contribute. So we now have the Slack group. There are currently 54 people in it. We'll be keeping it closed off to just these people, probably for the next few weeks, as we figure out what the group should actually look like, what benefits members should get, and things like that. And then hopefully, in a few weeks' time, once the group and us have figured out how things should work, we can start expanding it a little bit. So stay tuned for that.
Ali
Absolutely. Thank you everyone for joining, and we'll see you next week.
Taimur
Thanks for coming, Neel.
Ali
That's it for this week. Thank you for listening.
Taimur
If you liked this episode, please leave us a review on Apple Podcasts or on the Apple Podcasts website; if you're not using an iPhone, there's a link in the show notes.
Ali
If you've got any thoughts on this episode, or any ideas for new podcast topics, we'd love to get an audio message from you with your conundrum, question, or just anything that we could discuss.
Taimur
Yeah, if you're up for having your voice played on the podcast and your question being the springboard for our discussion, email us an audio file mp3 or voice notes to hi@notoverthinking.com.
Ali
If you've got thoughts but you'd rather not have your voice played publicly, that's fine as well; please tweet or DM us at @noverthinking on Twitter.